Mirror of https://github.com/paperclipai/paperclip (synced 2026-04-26 01:35:18 +02:00)

Compare commits: fix/canary...PAPA-45-up (670 commits)
.agents/skills/company-creator/SKILL.md (new file, 269 lines)

---
name: company-creator
description: >
  Create agent company packages conforming to the Agent Companies specification
  (agentcompanies/v1). Use when a user wants to create a new agent company from
  scratch, build a company around an existing git repo or skills collection, or
  scaffold a team/department of agents. Triggers on: "create a company", "make me
  a company", "build a company from this repo", "set up an agent company",
  "create a team of agents", "hire some agents", or when given a repo URL and
  asked to turn it into a company. Do NOT use for importing an existing company
  package (use the CLI import command instead) or for modifying a company that
  is already running in Paperclip.
---

# Company Creator

Create agent company packages that conform to the Agent Companies specification.

Spec references:

- Normative spec: `docs/companies/companies-spec.md` (read this before generating files)
- Web spec: https://agentcompanies.io/specification
- Protocol site: https://agentcompanies.io/

## Two Modes

### Mode 1: Company From Scratch

The user describes what they want. Interview them to flesh out the vision, then generate the package.

### Mode 2: Company From a Repo

The user provides a git repo URL, local path, or tweet. Analyze the repo, then create a company that wraps it.

See [references/from-repo-guide.md](references/from-repo-guide.md) for detailed repo analysis steps.

## Process

### Step 1: Gather Context

Determine which mode applies:

- **From scratch**: What kind of company or team? What domain? What should the agents do?
- **From repo**: Clone/read the repo. Scan for existing skills, agent configs, README, source structure.

### Step 2: Interview (Use AskUserQuestion)

Do not skip this step. Use AskUserQuestion to align with the user before writing any files.

**For from-scratch companies**, ask about:

- Company purpose and domain (1-2 sentences is fine)
- What agents they need - propose a hiring plan based on what they described
- Whether this is a full company (needs a CEO) or a team/department (no CEO required)
- Any specific skills the agents should have
- How work flows through the organization (see "Workflow" below)
- Whether they want projects and starter tasks

**For from-repo companies**, present your analysis and ask:

- Confirm the agents you plan to create and their roles
- Whether to reference or vendor any discovered skills (default: reference)
- Any additional agents or skills beyond what the repo provides
- Company name and any customization
- Confirm the workflow you inferred from the repo (see "Workflow" below)

**Workflow — how does work move through this company?**

A company is not just a list of agents with skills. It's an organization that takes ideas and turns them into work products. You need to understand the workflow so each agent knows:

- Who gives them work and in what form (a task, a branch, a question, a review request)
- What they do with it
- Who they hand off to when they're done, and what that handoff looks like
- What "done" means for their role

**Not every company is a pipeline.** Infer the right workflow pattern from context:

- **Pipeline** — sequential stages, each agent hands off to the next (see the sketch after this list). Use when the repo/domain has a clear linear process (e.g. plan → build → review → ship → QA, or content ideation → draft → edit → publish).
- **Hub-and-spoke** — a manager delegates to specialists who report back independently. Use when agents do different kinds of work that don't feed into each other (e.g. a CEO who dispatches to a researcher, a marketer, and an analyst).
- **Collaborative** — agents work together on the same things as peers. Use for small teams where everyone contributes to the same output (e.g. a design studio, a brainstorming team).
- **On-demand** — agents are summoned as needed with no fixed flow. Use when agents are more like a toolbox of specialists the user calls directly.

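A minimal sketch of what a pipeline flow can look like once the hiring plan is set. The agent slugs below are hypothetical, and the flow is not a spec field; it lives in each agent's AGENTS.md body as handoff instructions:

```yaml
# Hypothetical pipeline for a small dev shop (illustration only, not a spec field).
# Each entry names an agent slug and who receives the handoff when its stage is done.
pipeline:
  - agent: product-lead        # turns a feature idea into a locked plan
    hands_off_to: staff-engineer
  - agent: staff-engineer      # implements the plan on a branch
    hands_off_to: reviewer
  - agent: reviewer            # reviews and approves the branch
    hands_off_to: release-engineer
  - agent: release-engineer    # ships the approved branch
    hands_off_to: null         # end of the pipeline
```
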
For from-scratch companies, propose a workflow pattern based on what they described and ask if it fits.

For from-repo companies, infer the pattern from the repo's structure. If skills have a clear sequential dependency (like `plan-ceo-review → plan-eng-review → review → ship → qa`), that's a pipeline. If skills are independent capabilities, it's more likely hub-and-spoke or on-demand. State your inference in the interview so the user can confirm or adjust.

**Key interviewing principles:**

- Propose a concrete hiring plan. Don't ask open-ended "what agents do you want?" - suggest specific agents based on context and let the user adjust.
- Keep it lean. Most users are new to agent companies. A few agents (3-5) is typical for a startup. Don't suggest 10+ agents unless the scope demands it.
- From-scratch companies should start with a CEO who manages everyone. Teams/departments don't need one.
- Ask 2-3 focused questions per round, not 10.

### Step 3: Read the Spec

Before generating any files, read the normative spec:

```
docs/companies/companies-spec.md
```

Also read the quick reference: [references/companies-spec.md](references/companies-spec.md)

And the example: [references/example-company.md](references/example-company.md)

### Step 4: Generate the Package

Create the directory structure and all files. Follow the spec's conventions exactly.

**Directory structure:**

```
<company-slug>/
├── COMPANY.md
├── agents/
│   └── <slug>/AGENTS.md
├── teams/
│   └── <slug>/TEAM.md (if teams are needed)
├── projects/
│   └── <slug>/PROJECT.md (if projects are needed)
├── tasks/
│   └── <slug>/TASK.md (if tasks are needed)
├── skills/
│   └── <slug>/SKILL.md (if custom skills are needed)
└── .paperclip.yaml (Paperclip vendor extension)
```

**Rules:**

- Slugs must be URL-safe, lowercase, hyphenated
- COMPANY.md gets `schema: agentcompanies/v1` - other files inherit it
- Agent instructions go in the AGENTS.md body, not in .paperclip.yaml
- Skills referenced by shortname in AGENTS.md resolve to `skills/<shortname>/SKILL.md`
- For external skills, use `sources` with `usage: referenced` (see spec section 12)
- Do not export secrets, machine-local paths, or database IDs
- Omit empty/default fields
- For companies generated from a repo, add a references footer at the bottom of COMPANY.md body:
  `Generated from [repo-name](repo-url) with the company-creator skill from [Paperclip](https://github.com/paperclipai/paperclip)`

**Reporting structure:**

- Every agent except the CEO should have `reportsTo` set to their manager's slug (see the sketch below)
- The CEO has `reportsTo: null`
- For teams without a CEO, the top-level agent has `reportsTo: null`

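A minimal frontmatter sketch of the resulting chain, assuming a company whose CEO manages a CTO (hypothetical slugs; multi-document YAML is used here only to show both files side by side):

```yaml
# agents/ceo/AGENTS.md frontmatter: top of the org, so reportsTo is null
name: CEO
title: Chief Executive Officer
reportsTo: null
---
# agents/cto/AGENTS.md frontmatter: reports to the CEO by slug
name: CTO
title: Chief Technology Officer
reportsTo: ceo
```
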
**Writing workflow-aware agent instructions:**

Each AGENTS.md body should include not just what the agent does, but how they fit into the organization's workflow. Include:

1. **Where work comes from** — "You receive feature ideas from the user" or "You pick up tasks assigned to you by the CTO"
2. **What you produce** — "You produce a technical plan with architecture diagrams" or "You produce a reviewed, approved branch ready for shipping"
3. **Who you hand off to** — "When your plan is locked, hand off to the Staff Engineer for implementation" or "When review passes, hand off to the Release Engineer to ship"
4. **What triggers you** — "You are activated when a new feature idea needs product-level thinking" or "You are activated when a branch is ready for pre-landing review"

This turns a collection of agents into an organization that actually works together. Without workflow context, agents operate in isolation — they do their job but don't know what happens before or after them.

### Step 5: Confirm Output Location

Ask the user where to write the package. Common options:

- A subdirectory in the current repo
- A new directory the user specifies
- The current directory (if it's empty or they confirm)

### Step 6: Write README.md and LICENSE

**README.md** — every company package gets a README. It should be a nice, readable introduction that someone browsing GitHub would appreciate. Include:

- Company name and what it does
- The workflow / how the company operates
- Org chart as a markdown list or table showing agents, titles, reporting structure, and skills
- Brief description of each agent's role
- Citations and references: link to the source repo (if from-repo), link to the Agent Companies spec (https://agentcompanies.io/specification), and link to Paperclip (https://github.com/paperclipai/paperclip)
- A "Getting Started" section explaining how to import: `paperclipai company import --from <path>`

**LICENSE** — include a LICENSE file. The copyright holder is the user creating the company, not the upstream repo author (they made the skills, the user is making the company). Use the same license type as the source repo (if from-repo) or ask the user (if from-scratch). Default to MIT if unclear.

### Step 7: Write Files and Summarize

Write all files, then give a brief summary:

- Company name and what it does
- Agent roster with roles and reporting structure
- Skills (custom + referenced)
- Projects and tasks if any
- The output path

## .paperclip.yaml Guidelines

The `.paperclip.yaml` file is the Paperclip vendor extension. It configures adapters and env inputs per agent.

### Adapter Rules

**Do not specify an adapter unless the repo or user context warrants it.** If you don't know what adapter the user wants, omit the adapter block entirely — Paperclip will use its default. Specifying an unknown adapter type causes an import error.

Paperclip's supported adapter types (these are the ONLY valid values):
- `claude_local` — Claude Code CLI
- `codex_local` — Codex CLI
- `opencode_local` — OpenCode CLI
- `pi_local` — Pi CLI
- `cursor` — Cursor
- `gemini_local` — Gemini CLI
- `openclaw_gateway` — OpenClaw gateway

Only set an adapter when:
- The repo or its skills clearly target a specific runtime (e.g. gstack is built for Claude Code, so `claude_local` is appropriate)
- The user explicitly requests a specific adapter
- The agent's role requires a specific runtime capability

### Env Inputs Rules

**Do not add boilerplate env variables.** Only add env inputs that the agent actually needs based on its skills or role:
- `GH_TOKEN` for agents that push code, create PRs, or interact with GitHub
- API keys only when a skill explicitly requires them
- Never set `ANTHROPIC_API_KEY` as a default empty env variable — the runtime handles this

Example with adapter (only when warranted):
```yaml
schema: paperclip/v1
agents:
  release-engineer:
    adapter:
      type: claude_local
      config:
        model: claude-sonnet-4-6
    inputs:
      env:
        GH_TOKEN:
          kind: secret
          requirement: optional
```

Example — only agents with actual overrides appear:
```yaml
schema: paperclip/v1
agents:
  release-engineer:
    inputs:
      env:
        GH_TOKEN:
          kind: secret
          requirement: optional
```

In this example, only `release-engineer` appears because it needs `GH_TOKEN`. The other agents (ceo, cto, etc.) have no overrides, so they are omitted entirely from `.paperclip.yaml`.

## External Skill References

When referencing skills from a GitHub repo, always use the references pattern:

```yaml
metadata:
  sources:
    - kind: github-file
      repo: owner/repo
      path: path/to/SKILL.md
      commit: <full SHA from git ls-remote or the repo>
      attribution: Owner or Org Name
      license: <from the repo's LICENSE>
      usage: referenced
```

Get the commit SHA with:

```bash
git ls-remote https://github.com/owner/repo HEAD
```

Do NOT copy external skill content into the package unless the user explicitly asks.

.agents/skills/company-creator/references/companies-spec.md (new file, 144 lines)

# Agent Companies Specification Reference

The normative specification lives at:

- Web: https://agentcompanies.io/specification
- Local: docs/companies/companies-spec.md

Read the local spec file before generating any package files. The spec defines the canonical format and all frontmatter fields. Below is a quick-reference summary for common authoring tasks.

## Package Kinds

| File | Kind | Purpose |
| ---------- | ------- | ------------------------------------------------- |
| COMPANY.md | company | Root entrypoint, org boundary and defaults |
| TEAM.md | team | Reusable org subtree |
| AGENTS.md | agent | One role, instructions, and attached skills |
| PROJECT.md | project | Planned work grouping |
| TASK.md | task | Portable starter task |
| SKILL.md | skill | Agent Skills capability package (do not redefine) |

## Directory Layout

```
company-package/
├── COMPANY.md
├── agents/
│   └── <slug>/AGENTS.md
├── teams/
│   └── <slug>/TEAM.md
├── projects/
│   └── <slug>/
│       ├── PROJECT.md
│       └── tasks/
│           └── <slug>/TASK.md
├── tasks/
│   └── <slug>/TASK.md
├── skills/
│   └── <slug>/SKILL.md
├── assets/
├── scripts/
├── references/
└── .paperclip.yaml (optional vendor extension)
```

## Common Frontmatter Fields

```yaml
schema: agentcompanies/v1
kind: company | team | agent | project | task
slug: url-safe-stable-identity
name: Human Readable Name
description: Short description for discovery
version: 0.1.0
license: MIT
authors:
  - name: Jane Doe
tags: []
metadata: {}
sources: []
```

- `schema` usually appears only at package root
- `kind` is optional when filename makes it obvious
- `slug` must be URL-safe and stable
- exporters should omit empty or default-valued fields

## COMPANY.md Required Fields

```yaml
name: Company Name
description: What this company does
slug: company-slug
schema: agentcompanies/v1
```

Optional: `version`, `license`, `authors`, `goals`, `includes`, `requirements.secrets`

## AGENTS.md Key Fields

```yaml
name: Agent Name
title: Role Title
reportsTo: <agent-slug or null>
skills:
  - skill-shortname
```

- Body content is the agent's default instructions
- Skills resolve by shortname: `skills/<shortname>/SKILL.md`
- Do not export machine-specific paths or secrets

## TEAM.md Key Fields

```yaml
name: Team Name
description: What this team does
slug: team-slug
manager: ../agent-slug/AGENTS.md
includes:
  - ../agent-slug/AGENTS.md
  - ../../skills/skill-slug/SKILL.md
```

## PROJECT.md Key Fields

```yaml
name: Project Name
description: What this project delivers
owner: agent-slug
```

## TASK.md Key Fields

```yaml
name: Task Name
assignee: agent-slug
project: project-slug
schedule:
  timezone: America/Chicago
  startsAt: 2026-03-16T09:00:00-05:00
  recurrence:
    frequency: weekly
    interval: 1
    weekdays: [monday]
    time: { hour: 9, minute: 0 }
```

## Source References (for external skills/content)

```yaml
sources:
  - kind: github-file
    repo: owner/repo
    path: path/to/SKILL.md
    commit: <full-sha>
    sha256: <hash>
    attribution: Owner Name
    license: MIT
    usage: referenced
```

Usage modes: `vendored` (bytes included), `referenced` (pointer only), `mirrored` (cached locally)

Default to `referenced` for third-party content.

.agents/skills/company-creator/references/example-company.md (new file, 184 lines)

# Example Company Package

A minimal but complete example of an agent company package.

## Directory Structure

```
lean-dev-shop/
├── COMPANY.md
├── agents/
│   ├── ceo/AGENTS.md
│   ├── cto/AGENTS.md
│   └── engineer/AGENTS.md
├── teams/
│   └── engineering/TEAM.md
├── projects/
│   └── q2-launch/
│       ├── PROJECT.md
│       └── tasks/
│           └── monday-review/TASK.md
├── tasks/
│   └── weekly-standup/TASK.md
├── skills/
│   └── code-review/SKILL.md
└── .paperclip.yaml
```

## COMPANY.md

```markdown
---
name: Lean Dev Shop
description: Small engineering-focused AI company that builds and ships software products
slug: lean-dev-shop
schema: agentcompanies/v1
version: 1.0.0
license: MIT
authors:
  - name: Example Org
goals:
  - Build and ship software products
  - Maintain high code quality
---

Lean Dev Shop is a small, focused engineering company. The CEO oversees strategy and coordinates work. The CTO leads the engineering team. Engineers build and ship code.
```

## agents/ceo/AGENTS.md

```markdown
---
name: CEO
title: Chief Executive Officer
reportsTo: null
skills:
  - paperclip
---

You are the CEO of Lean Dev Shop. You oversee company strategy, coordinate work across the team, and ensure projects ship on time.

Your responsibilities:

- Review and prioritize work across projects
- Coordinate with the CTO on technical decisions
- Ensure the company goals are being met
```

## agents/cto/AGENTS.md

```markdown
---
name: CTO
title: Chief Technology Officer
reportsTo: ceo
skills:
  - code-review
  - paperclip
---

You are the CTO of Lean Dev Shop. You lead the engineering team and make technical decisions.

Your responsibilities:

- Set technical direction and architecture
- Review code and ensure quality standards
- Mentor engineers and unblock technical challenges
```

## agents/engineer/AGENTS.md

```markdown
---
name: Engineer
title: Software Engineer
reportsTo: cto
skills:
  - code-review
  - paperclip
---

You are a software engineer at Lean Dev Shop. You write code, fix bugs, and ship features.

Your responsibilities:

- Implement features and fix bugs
- Write tests and documentation
- Participate in code reviews
```

## teams/engineering/TEAM.md

```markdown
---
name: Engineering
description: Product and platform engineering team
slug: engineering
schema: agentcompanies/v1
manager: ../../agents/cto/AGENTS.md
includes:
  - ../../agents/engineer/AGENTS.md
  - ../../skills/code-review/SKILL.md
tags:
  - engineering
---

The engineering team builds and maintains all software products.
```

## projects/q2-launch/PROJECT.md

```markdown
---
name: Q2 Launch
description: Ship the Q2 product launch
slug: q2-launch
owner: cto
---

Deliver all features planned for the Q2 launch, including the new dashboard and API improvements.
```

## projects/q2-launch/tasks/monday-review/TASK.md

```markdown
---
name: Monday Review
assignee: ceo
project: q2-launch
schedule:
  timezone: America/Chicago
  startsAt: 2026-03-16T09:00:00-05:00
  recurrence:
    frequency: weekly
    interval: 1
    weekdays:
      - monday
    time:
      hour: 9
      minute: 0
---

Review the status of Q2 Launch project. Check progress on all open tasks, identify blockers, and update priorities for the week.
```

## skills/code-review/SKILL.md (with external reference)

```markdown
---
name: code-review
description: Thorough code review skill for pull requests and diffs
metadata:
  sources:
    - kind: github-file
      repo: anthropics/claude-code
      path: skills/code-review/SKILL.md
      commit: abc123def456
      sha256: 3b7e...9a
      attribution: Anthropic
      license: MIT
      usage: referenced
---

Review code changes for correctness, style, and potential issues.
```

.agents/skills/company-creator/references/from-repo-guide.md (new file, 79 lines)

# Creating a Company From an Existing Repository

When a user provides a git repo (URL, local path, or tweet linking to a repo), analyze it and create a company package that wraps its content.

## Analysis Steps

1. **Clone or read the repo** - Use `git clone` for URLs, read directly for local paths
2. **Scan for existing agent/skill files** - Look for SKILL.md, AGENTS.md, CLAUDE.md, .claude/ directories, or similar agent configuration
3. **Understand the repo's purpose** - Read README, package.json, main source files to understand what the project does
4. **Identify natural agent roles** - Based on the repo's structure and purpose, determine what agents would be useful

## Handling Existing Skills

Many repos already contain skills (SKILL.md files). When you find them:

**Default behavior: use references, not copies.**

Instead of copying skill content into your company package, create a source reference:

```yaml
metadata:
  sources:
    - kind: github-file
      repo: owner/repo
      path: path/to/SKILL.md
      commit: <get the current HEAD commit SHA>
      attribution: <repo owner or org name>
      license: <from repo's LICENSE file>
      usage: referenced
```

To get the commit SHA:
```bash
git ls-remote https://github.com/owner/repo HEAD
```

Only vendor (copy) skills when:
- The user explicitly asks to copy them
- The skill is very small and tightly coupled to the company
- The source repo is private or may become unavailable

## Handling Existing Agent Configurations

If the repo has agent configs (CLAUDE.md, .claude/ directories, codex configs, etc.):
- Use them as inspiration for AGENTS.md instructions
- Don't copy them verbatim - adapt them to the Agent Companies format
- Preserve the intent and key instructions

## Repo-Only Skills (No Agents)

When a repo contains only skills and no agents:
- Create agents that would naturally use those skills
- The agents should be minimal - just enough to give the skills a runtime context
- A single agent may use multiple skills from the repo
- Name agents based on the domain the skills cover

Example: A repo with `code-review`, `testing`, and `deployment` skills might become:
- A "Lead Engineer" agent with all three skills
- Or separate "Reviewer", "QA Engineer", and "DevOps" agents if the skills are distinct enough

## Common Repo Patterns

### Developer Tools / CLI repos
- Create agents for the tool's primary use cases
- Reference any existing skills
- Add a project maintainer or lead agent

### Library / Framework repos
- Create agents for development, testing, documentation
- Skills from the repo become agent capabilities

### Full Application repos
- Map to departments: engineering, product, QA
- Create a lean team structure appropriate to the project size

### Skills Collection repos (e.g. skills.sh repos)
- Each skill or skill group gets an agent
- Create a lightweight company or team wrapper
- Keep the agent count proportional to the skill diversity

@@ -1,7 +1,7 @@
---
name: release-changelog
description: >
  Generate the stable Paperclip release changelog at releases/v{version}.md by
  Generate the stable Paperclip release changelog at releases/vYYYY.MDD.P.md by
  reading commits, changesets, and merged PR context since the last stable tag.
---

@@ -9,20 +9,33 @@ description: >

Generate the user-facing changelog for the **stable** Paperclip release.

## Versioning Model

Paperclip uses **calendar versioning (calver)**:

- Stable releases: `YYYY.MDD.P` (e.g. `2026.318.0`)
- Canary releases: `YYYY.MDD.P-canary.N` (e.g. `2026.318.1-canary.0`)
- Git tags: `vYYYY.MDD.P` for stable, `canary/vYYYY.MDD.P-canary.N` for canary

There are no major/minor/patch bumps. The stable version is derived from the
intended release date (UTC) plus the next same-day stable patch slot.

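A couple of worked examples of the slot layout, using hypothetical UTC dates (illustrative values only, not a config file):

```yaml
# Worked calver examples (hypothetical dates, not a config file)
- utc_date: 2026-03-18
  stable: 2026.318.0                            # middle slot: month 3, day 18
  canary_toward_next_slot: 2026.318.1-canary.0
  second_stable_same_day: 2026.318.1
- utc_date: 2026-12-05
  stable: 2026.1205.0                           # month without leading zero, day zero-padded
```
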
Output:

- `releases/v{version}.md`
- `releases/vYYYY.MDD.P.md`

Important rule:
Important rules:

- even if there are canary releases such as `1.2.3-canary.0`, the changelog file stays `releases/v1.2.3.md`
- even if there are canary releases such as `2026.318.1-canary.0`, the changelog file stays `releases/v2026.318.1.md`
- do not derive versions from semver bump types
- do not create canary changelog files

## Step 0 — Idempotency Check

Before generating anything, check whether the file already exists:

```bash
ls releases/v{version}.md 2>/dev/null
ls releases/vYYYY.MDD.P.md 2>/dev/null
```

If it exists:

@@ -41,13 +54,14 @@ git tag --list 'v*' --sort=-version:refname | head -1
git log v{last}..HEAD --oneline --no-merges
```

The planned stable version comes from one of:
The stable version comes from one of:

- an explicit maintainer request
- the chosen bump type applied to the last stable tag
- `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
- the release plan already agreed in `doc/RELEASING.md`

Do not derive the changelog version from a canary tag or prerelease suffix.
Do not derive major/minor/patch bumps from API intent — calver uses the date and same-day stable slot.

## Step 2 — Gather the Raw Inputs

@@ -73,7 +87,6 @@ Look for:
- destructive migrations
- removed or changed API fields/endpoints
- renamed or removed config keys
- `major` changesets
- `BREAKING:` or `BREAKING CHANGE:` commit signals

Key commands:

@@ -85,7 +98,8 @@ git diff v{last}..HEAD -- server/src/routes/ server/src/api/
git log v{last}..HEAD --format="%s" | rg -n 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
```

If the requested bump is lower than the minimum required bump, flag that before the release proceeds.
If breaking changes are detected, flag them prominently — they must appear in the
Breaking Changes section with an upgrade path.

## Step 4 — Categorize for Users

@@ -130,9 +144,9 @@ Rules:
Template:

```markdown
# v{version}
# vYYYY.MDD.P

> Released: {YYYY-MM-DD}
> Released: YYYY-MM-DD

## Breaking Changes

@@ -2,23 +2,21 @@
name: release
description: >
  Coordinate a full Paperclip release across engineering verification, npm,
  GitHub, website publishing, and announcement follow-up. Use when leadership
  asks to ship a release, not merely to discuss version bumps.
  GitHub, smoke testing, and announcement follow-up. Use when leadership asks
  to ship a release, not merely to discuss versioning.
---

# Release Coordination Skill

Run the full Paperclip release as a maintainer workflow, not just an npm publish.
Run the full Paperclip maintainer release workflow, not just an npm publish.

This skill coordinates:

- stable changelog drafting via `release-changelog`
- release-train setup via `scripts/release-start.sh`
- prerelease canary publishing via `scripts/release.sh --canary`
- canary verification and publish status from `master`
- Docker smoke testing via `scripts/docker-onboard-smoke.sh`
- stable publishing via `scripts/release.sh`
- pushing the stable branch commit and tag
- GitHub Release creation via `scripts/create-github-release.sh`
- manual stable promotion from a chosen source ref
- GitHub Release creation
- website / announcement follow-up tasks

## Trigger

@@ -26,8 +24,9 @@ This skill coordinates:
Use this skill when leadership asks for:

- "do a release"
- "ship the next patch/minor/major"
- "release vX.Y.Z"
- "ship the release"
- "promote this canary to stable"
- "cut the stable release"

## Preconditions

@@ -35,10 +34,10 @@ Before proceeding, verify all of the following:

1. `.agents/skills/release-changelog/SKILL.md` exists and is usable.
2. The repo working tree is clean, including untracked files.
3. There are commits since the last stable tag.
4. The release SHA has passed the verification gate or is about to.
5. If package manifests changed, the CI-owned `pnpm-lock.yaml` refresh is already merged on `master` before the release branch is cut.
6. npm publish rights are available locally, or the GitHub release workflow is being used with trusted publishing.
3. There is at least one canary or candidate commit since the last stable tag.
4. The candidate SHA has passed the verification gate or is about to.
5. If manifests changed, the CI-owned `pnpm-lock.yaml` refresh is already merged on `master`.
6. npm publish rights are available through GitHub trusted publishing, or through local npm auth for emergency/manual use.
7. If running through Paperclip, you have issue context for status updates and follow-up task creation.

If any precondition fails, stop and report the blocker.

@@ -47,78 +46,67 @@ If any precondition fails, stop and report the blocker.

Collect these inputs up front:

- requested bump: `patch`, `minor`, or `major`
- whether this run is a dry run or live release
- whether the release is being run locally or from GitHub Actions
- whether the target is a canary check or a stable promotion
- the candidate `source_ref` for stable
- whether the stable run is dry-run or live
- release issue / company context for website and announcement follow-up

## Step 0 — Release Model

Paperclip now uses this release model:
Paperclip now uses a commit-driven release model:

1. Start or resume `release/X.Y.Z`
2. Draft the **stable** changelog as `releases/vX.Y.Z.md`
3. Publish one or more **prerelease canaries** such as `X.Y.Z-canary.0`
4. Smoke test the canary via Docker
5. Publish the stable version `X.Y.Z`
6. Push the stable branch commit and tag
7. Create the GitHub Release
8. Merge `release/X.Y.Z` back to `master` without squash or rebase
9. Complete website and announcement surfaces
1. every push to `master` publishes a canary automatically
2. canaries use `YYYY.MDD.P-canary.N`
3. stable releases use `YYYY.MDD.P`
4. the middle slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
5. the stable patch slot increments when more than one stable ships on the same UTC date
6. stable releases are manually promoted from a chosen tested commit or canary source commit
7. only stable releases get `releases/vYYYY.MDD.P.md`, git tag `vYYYY.MDD.P`, and a GitHub Release

Critical consequence:
Critical consequences:

- Canaries do **not** use promote-by-dist-tag anymore.
- The changelog remains stable-only. Do not create `releases/vX.Y.Z-canary.N.md`.
- do not use release branches as the default path
- do not derive major/minor/patch bumps
- do not create canary changelog files
- do not create canary GitHub Releases

## Step 1 — Decide the Stable Version
## Step 1 — Choose the Candidate

Start the release train first:
For canary validation:

- inspect the latest successful canary run on `master`
- record the canary version and source SHA

For stable promotion:

1. choose the tested source ref
2. confirm it is the exact SHA you want to promote
3. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`

Useful commands:

```bash
./scripts/release-start.sh {patch|minor|major}
git tag --list 'v*' --sort=-version:refname | head -1
git log --oneline --no-merges
npm view paperclipai@canary version
```

Then run release preflight:

```bash
./scripts/release-preflight.sh canary {patch|minor|major}
# or
./scripts/release-preflight.sh stable {patch|minor|major}
```

Then use the last stable tag as the base:

```bash
LAST_TAG=$(git tag --list 'v*' --sort=-version:refname | head -1)
git log "${LAST_TAG}..HEAD" --oneline --no-merges
git diff --name-only "${LAST_TAG}..HEAD" -- packages/db/src/migrations/
git diff "${LAST_TAG}..HEAD" -- packages/db/src/schema/
git log "${LAST_TAG}..HEAD" --format="%s" | rg -n 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
```

Bump policy:

- destructive migrations, removed APIs, breaking config changes -> `major`
- additive migrations or clearly user-visible features -> at least `minor`
- fixes only -> `patch`

If the requested bump is too low, escalate it and explain why.

## Step 2 — Draft the Stable Changelog

Invoke `release-changelog` and generate:
Stable changelog files live at:

- `releases/vX.Y.Z.md`
- `releases/vYYYY.MDD.P.md`

Invoke `release-changelog` and generate or update the stable notes only.

Rules:

- review the draft with a human before publish
- preserve manual edits if the file already exists
- keep the heading and filename stable-only, for example `v1.2.3`
- do not create a separate canary changelog file
- keep the filename stable-only
- do not create a canary changelog file

## Step 3 — Verify the Release SHA
## Step 3 — Verify the Candidate SHA

Run the standard gate:

@@ -128,41 +116,27 @@ pnpm test:run
pnpm build
```

If the release will be run through GitHub Actions, the workflow can rerun this gate. Still report whether the local tree currently passes.
If the GitHub release workflow will run the publish, it can rerun this gate. Still report local status if you checked it.

The GitHub Actions release workflow installs with `pnpm install --frozen-lockfile`. Treat that as a release invariant, not a nuisance: if manifests changed and the lockfile refresh PR has not landed yet, stop and wait for `master` to contain the committed lockfile before shipping.
For PRs that touch release logic, the repo also runs a canary release dry-run in CI. That is a release-specific guard, not a substitute for the standard gate.

## Step 4 — Publish a Canary
## Step 4 — Validate the Canary

Run from the `release/X.Y.Z` branch:
The normal canary path is automatic from `master` via:

```bash
./scripts/release.sh {patch|minor|major} --canary --dry-run
./scripts/release.sh {patch|minor|major} --canary
```
- `.github/workflows/release.yml`

What this means:
Confirm:

- npm receives `X.Y.Z-canary.N` under dist-tag `canary`
- `latest` remains unchanged
- no git tag is created
- the script cleans the working tree afterward
1. verification passed
2. npm canary publish succeeded
3. git tag `canary/vYYYY.MDD.P-canary.N` exists

Guard:

- if the current stable is `0.2.7`, the next patch canary is `0.2.8-canary.0`
- the tooling must never publish `0.2.7-canary.N` after `0.2.7` is already stable

After publish, verify:
Useful checks:

```bash
npm view paperclipai@canary version
```

The user install path is:

```bash
npx paperclipai@canary onboard
git tag --list 'canary/v*' --sort=-version:refname | head -5
```

## Step 5 — Smoke Test the Canary

@@ -173,60 +147,70 @@ Run:
PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
```

Useful isolated variant:

```bash
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
```

Confirm:

1. install succeeds
2. onboarding completes
3. server boots
4. UI loads
5. basic company/dashboard flow works
2. onboarding completes without crashes
3. the server boots
4. the UI loads
5. basic company creation and dashboard load work

If smoke testing fails:

- stop the stable release
- fix the issue
- publish another canary
- repeat the smoke test
- fix the issue on `master`
- wait for the next automatic canary
- rerun smoke testing

Each retry should create a higher canary ordinal, while the stable target version can stay the same.
## Step 6 — Preview or Publish Stable

## Step 6 — Publish Stable
The normal stable path is manual `workflow_dispatch` on:

Once the SHA is vetted, run:
- `.github/workflows/release.yml`

Inputs (sketched below):

- `source_ref`
- `stable_date`
- `dry_run`

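A sketch of how `workflow_dispatch` inputs of this shape are commonly declared in a GitHub Actions workflow; the real `.github/workflows/release.yml` may differ:

```yaml
# Hypothetical sketch only; check the actual workflow file for the real definitions.
on:
  workflow_dispatch:
    inputs:
      source_ref:
        description: Tested commit or canary source ref to promote
        required: true
      stable_date:
        description: Intended UTC release date (YYYY-MM-DD)
        required: false
      dry_run:
        description: Run the stable release without publishing
        type: boolean
        default: true
```
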
|
||||
Before live stable:
|
||||
|
||||
1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
|
||||
2. ensure `releases/vYYYY.MDD.P.md` exists on the source ref
|
||||
3. run the stable workflow in dry-run mode first when practical
|
||||
4. then run the real stable publish
|
||||
|
||||
The stable workflow:
|
||||
|
||||
- re-verifies the exact source ref
|
||||
- computes the next stable patch slot for the chosen UTC date
|
||||
- publishes `YYYY.MDD.P` under dist-tag `latest`
|
||||
- creates git tag `vYYYY.MDD.P`
|
||||
- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`
|
||||
|
||||
Local emergency/manual commands:
|
||||
|
||||
```bash
|
||||
./scripts/release.sh {patch|minor|major} --dry-run
|
||||
./scripts/release.sh {patch|minor|major}
|
||||
./scripts/release.sh stable --dry-run
|
||||
./scripts/release.sh stable
|
||||
git push public-gh refs/tags/vYYYY.MDD.P
|
||||
./scripts/create-github-release.sh YYYY.MDD.P
|
||||
```
|
||||
|
||||
Stable publish does this:
|
||||
|
||||
- publishes `X.Y.Z` to npm under `latest`
|
||||
- creates the local release commit
|
||||
- creates the local git tag `vX.Y.Z`
|
||||
|
||||
Stable publish does **not** push the release for you.
|
||||
|
||||
## Step 7 — Push and Create GitHub Release
|
||||
|
||||
After stable publish succeeds:
|
||||
|
||||
```bash
|
||||
git push public-gh HEAD --follow-tags
|
||||
./scripts/create-github-release.sh X.Y.Z
|
||||
```
|
||||
|
||||
Use the stable changelog file as the GitHub Release notes source.
|
||||
|
||||
Then open the PR from `release/X.Y.Z` back to `master` and merge without squash or rebase.
|
||||
|
||||
## Step 8 — Finish the Other Surfaces
|
||||
## Step 7 — Finish the Other Surfaces
|
||||
|
||||
Create or verify follow-up work for:
|
||||
|
||||
- website changelog publishing
|
||||
- launch post / social announcement
|
||||
- any release summary in Paperclip issue context
|
||||
- release summary in Paperclip issue context
|
||||
|
||||
These should reference the stable release, not the canary.
|
||||
|
||||
If the canary is bad:

- publish another canary, do not ship stable

If stable npm publish succeeds but tag push or GitHub release creation fails:

- fix the git/GitHub issue immediately from the same checkout
- do not republish the same version
If `latest` is bad after stable publish:

```bash
./scripts/rollback-latest.sh <last-good-version>
```

Then fix forward with a new stable release.
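After a rollback it is worth confirming where the dist-tags actually point, again assuming the package is published as `paperclipai`:

```bash
# `latest` should now point at the last-good version; the canary dist-tag is unaffected
npm dist-tag ls paperclipai
```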
## Output

When the skill completes, provide:

- candidate SHA and tested canary version, if relevant
- stable version, if promoted
- verification status
- npm status
- smoke-test status
- git tag / GitHub Release status
- website / announcement follow-up status
- rollback recommendation if anything is still partially complete
1 .claude/skills/company-creator (symbolic link)

@@ -0,0 +1 @@
../../.agents/skills/company-creator
7 .github/CODEOWNERS (vendored)

@@ -8,3 +8,10 @@ scripts/rollback-latest.sh @cryppadotta @devinfoley
doc/RELEASING.md @cryppadotta @devinfoley
doc/PUBLISHING.md @cryppadotta @devinfoley
doc/RELEASE-AUTOMATION-SETUP.md @cryppadotta @devinfoley

# Package files — dependency changes require review
# package.json matches recursively at all depths (covers root + all workspaces)
package.json @cryppadotta @devinfoley
pnpm-lock.yaml @cryppadotta @devinfoley
pnpm-workspace.yaml @cryppadotta @devinfoley
.npmrc @cryppadotta @devinfoley
65 .github/PULL_REQUEST_TEMPLATE.md (vendored, new file)
@@ -0,0 +1,65 @@
|
||||
## Thinking Path
|
||||
|
||||
<!--
|
||||
Required. Trace your reasoning from the top of the project down to this
|
||||
specific change. Start with what Paperclip is, then narrow through the
|
||||
subsystem, the problem, and why this PR exists. Use blockquote style.
|
||||
Aim for 5–8 steps. See CONTRIBUTING.md for full examples.
|
||||
-->
|
||||
|
||||
> - Paperclip orchestrates AI agents for zero-human companies
|
||||
> - [Which subsystem or capability is involved]
|
||||
> - [What problem or gap exists]
|
||||
> - [Why it needs to be addressed]
|
||||
> - This pull request ...
|
||||
> - The benefit is ...
|
||||
|
||||
## What Changed
|
||||
|
||||
<!-- Bullet list of concrete changes. One bullet per logical unit. -->
|
||||
|
||||
-
|
||||
|
||||
## Verification
|
||||
|
||||
<!--
|
||||
How can a reviewer confirm this works? Include test commands, manual
|
||||
steps, or both. For UI changes, include before/after screenshots.
|
||||
-->
|
||||
|
||||
-
|
||||
|
||||
## Risks
|
||||
|
||||
<!--
|
||||
What could go wrong? Mention migration safety, breaking changes,
|
||||
behavioral shifts, or "Low risk" if genuinely minor.
|
||||
-->
|
||||
|
||||
-
|
||||
|
||||
## Model Used
|
||||
|
||||
<!--
|
||||
Required. Specify which AI model was used to produce or assist with
|
||||
this change. Be as descriptive as possible — include:
|
||||
• Provider and model name (e.g., Claude, GPT, Gemini, Codex)
|
||||
• Exact model ID or version (e.g., claude-opus-4-6, gpt-4-turbo-2024-04-09)
|
||||
• Context window size if relevant (e.g., 1M context)
|
||||
• Reasoning/thinking mode if applicable (e.g., extended thinking, chain-of-thought)
|
||||
• Any other relevant capability details (e.g., tool use, code execution)
|
||||
If no AI model was used, write "None — human-authored".
|
||||
-->
|
||||
|
||||
-
|
||||
|
||||
## Checklist
|
||||
|
||||
- [ ] I have included a thinking path that traces from project context to this change
|
||||
- [ ] I have specified the model used (with version and capability details)
|
||||
- [ ] I have run tests locally and they pass
|
||||
- [ ] I have added or updated tests where applicable
|
||||
- [ ] If this change affects the UI, I have included before/after screenshots
|
||||
- [ ] I have updated relevant documentation to reflect my changes
|
||||
- [ ] I have considered and documented any risks above
|
||||
- [ ] I will address all Greptile and reviewer comments before requesting merge
|
||||
55 .github/workflows/docker.yml (vendored, new file)
@@ -0,0 +1,55 @@
|
||||
name: Docker
|
||||
|
||||
on:
|
||||
push:
|
||||
branches:
|
||||
- "master"
|
||||
tags:
|
||||
- "v*"
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
packages: write
|
||||
|
||||
jobs:
|
||||
build-and-push:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 30
|
||||
concurrency:
|
||||
group: docker-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Login to GitHub Container Registry
|
||||
uses: docker/login-action@v3
|
||||
with:
|
||||
registry: ghcr.io
|
||||
username: ${{ github.repository_owner }}
|
||||
password: ${{ secrets.GITHUB_TOKEN }}
|
||||
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
|
||||
- name: Docker meta
|
||||
id: meta
|
||||
uses: docker/metadata-action@v5
|
||||
with:
|
||||
images: ghcr.io/${{ github.repository }}
|
||||
tags: |
|
||||
type=raw,value=latest,enable={{is_default_branch}}
|
||||
type=semver,pattern={{version}}
|
||||
type=semver,pattern={{major}}.{{minor}}
|
||||
type=sha
|
||||
|
||||
- name: Build and push
|
||||
uses: docker/build-push-action@v6
|
||||
with:
|
||||
context: .
|
||||
platforms: linux/amd64,linux/arm64
|
||||
push: true
|
||||
cache-from: type=gha
|
||||
cache-to: type=gha,mode=max
|
||||
tags: ${{ steps.meta.outputs.tags }}
|
||||
labels: ${{ steps.meta.outputs.labels }}
|
||||
49 .github/workflows/pr-policy.yml (vendored, deleted file)
@@ -1,49 +0,0 @@
|
||||
name: PR Policy
|
||||
|
||||
on:
|
||||
pull_request:
|
||||
branches:
|
||||
- master
|
||||
|
||||
concurrency:
|
||||
group: pr-policy-${{ github.event.pull_request.number }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
policy:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 10
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
run_install: false
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 20
|
||||
|
||||
- name: Block manual lockfile edits
|
||||
if: github.head_ref != 'chore/refresh-lockfile'
|
||||
run: |
|
||||
changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
|
||||
if printf '%s\n' "$changed" | grep -qx 'pnpm-lock.yaml'; then
|
||||
echo "Do not commit pnpm-lock.yaml in pull requests. CI owns lockfile updates."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Validate dependency resolution when manifests change
|
||||
run: |
|
||||
changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
|
||||
manifest_pattern='(^|/)package\.json$|^pnpm-workspace\.yaml$|^\.npmrc$|^pnpmfile\.(cjs|js|mjs)$'
|
||||
if printf '%s\n' "$changed" | grep -Eq "$manifest_pattern"; then
|
||||
pnpm install --lockfile-only --ignore-scripts --no-frozen-lockfile
|
||||
fi
|
||||
48 .github/workflows/pr-verify.yml (vendored, deleted file)
@@ -1,48 +0,0 @@
|
||||
name: PR Verify
|
||||
|
||||
on:
|
||||
pull_request:
|
||||
branches:
|
||||
- master
|
||||
|
||||
concurrency:
|
||||
group: pr-verify-${{ github.event.pull_request.number }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
verify:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 20
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 24
|
||||
cache: pnpm
|
||||
|
||||
- name: Install dependencies
|
||||
run: pnpm install --no-frozen-lockfile
|
||||
|
||||
- name: Typecheck
|
||||
run: pnpm -r typecheck
|
||||
|
||||
- name: Run tests
|
||||
run: pnpm test:run
|
||||
|
||||
- name: Build
|
||||
run: pnpm build
|
||||
|
||||
- name: Release canary dry run
|
||||
run: |
|
||||
git checkout -B master HEAD
|
||||
git checkout -- pnpm-lock.yaml
|
||||
./scripts/release.sh canary --skip-verify --dry-run
|
||||
186 .github/workflows/pr.yml (vendored, new file)
@@ -0,0 +1,186 @@
|
||||
name: PR
|
||||
|
||||
on:
|
||||
pull_request:
|
||||
branches:
|
||||
- master
|
||||
|
||||
concurrency:
|
||||
group: pr-${{ github.event.pull_request.number }}
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
policy:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 5
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Block manual lockfile edits
|
||||
if: github.head_ref != 'chore/refresh-lockfile'
|
||||
run: |
|
||||
changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
|
||||
if printf '%s\n' "$changed" | grep -qx 'pnpm-lock.yaml'; then
|
||||
echo "Do not commit pnpm-lock.yaml in pull requests. CI owns lockfile updates."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
run_install: false
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 24
|
||||
|
||||
- name: Validate Dockerfile deps stage
|
||||
run: |
|
||||
missing=0
|
||||
|
||||
# Extract only the deps stage from the Dockerfile
|
||||
deps_stage="$(awk '/^FROM .* AS deps$/{found=1; next} found && /^FROM /{exit} found{print}' Dockerfile)"
|
||||
|
||||
if [ -z "$deps_stage" ]; then
|
||||
echo "::error::Could not extract deps stage from Dockerfile (expected 'FROM ... AS deps')"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Derive workspace search roots from pnpm-workspace.yaml (exclude dev-only packages)
|
||||
search_roots="$(grep '^ *- ' pnpm-workspace.yaml | sed 's/^ *- //' | sed 's/\*$//' | grep -v 'examples' | grep -v 'create-paperclip-plugin' | tr '\n' ' ')"
|
||||
|
||||
if [ -z "$search_roots" ]; then
|
||||
echo "::error::Could not derive workspace roots from pnpm-workspace.yaml"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check all workspace package.json files are copied in the deps stage
|
||||
for pkg in $(find $search_roots -maxdepth 2 -name package.json -not -path '*/examples/*' -not -path '*/create-paperclip-plugin/*' -not -path '*/node_modules/*' 2>/dev/null | sort -u); do
|
||||
dir="$(dirname "$pkg")"
|
||||
if ! echo "$deps_stage" | grep -q "^COPY ${dir}/package.json"; then
|
||||
echo "::error::Dockerfile deps stage missing: COPY ${pkg} ${dir}/"
|
||||
missing=1
|
||||
fi
|
||||
done
|
||||
|
||||
# Check patches directory is copied if it exists
|
||||
if [ -d patches ] && ! echo "$deps_stage" | grep -q '^COPY patches/'; then
|
||||
echo "::error::Dockerfile deps stage missing: COPY patches/ patches/"
|
||||
missing=1
|
||||
fi
|
||||
|
||||
if [ "$missing" -eq 1 ]; then
|
||||
echo "Dockerfile deps stage is out of sync. Update it to include the missing files."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Validate dependency resolution when manifests change
|
||||
run: |
|
||||
changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
|
||||
manifest_pattern='(^|/)package\.json$|^pnpm-workspace\.yaml$|^\.npmrc$|^pnpmfile\.(cjs|js|mjs)$'
|
||||
if printf '%s\n' "$changed" | grep -Eq "$manifest_pattern"; then
|
||||
pnpm install --lockfile-only --ignore-scripts --no-frozen-lockfile
|
||||
fi
|
||||
|
||||
verify:
|
||||
needs: [policy]
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 20
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 24
|
||||
cache: pnpm
|
||||
|
||||
- name: Install dependencies
|
||||
run: pnpm install --frozen-lockfile
|
||||
|
||||
- name: Typecheck
|
||||
run: pnpm -r typecheck
|
||||
|
||||
- name: Run tests
|
||||
run: pnpm test:run
|
||||
|
||||
- name: Build
|
||||
run: pnpm build
|
||||
|
||||
- name: Release canary dry run
|
||||
run: |
|
||||
git checkout -B master HEAD
|
||||
git checkout -- pnpm-lock.yaml
|
||||
./scripts/release.sh canary --skip-verify --dry-run
|
||||
|
||||
e2e:
|
||||
needs: [policy]
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 30
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 24
|
||||
cache: pnpm
|
||||
|
||||
- name: Install dependencies
|
||||
run: pnpm install --frozen-lockfile
|
||||
|
||||
- name: Build
|
||||
run: pnpm build
|
||||
|
||||
- name: Install Playwright
|
||||
run: npx playwright install --with-deps chromium
|
||||
|
||||
- name: Generate Paperclip config
|
||||
run: |
|
||||
mkdir -p ~/.paperclip/instances/default
|
||||
cat > ~/.paperclip/instances/default/config.json << 'CONF'
|
||||
{
|
||||
"$meta": { "version": 1, "updatedAt": "2026-01-01T00:00:00.000Z", "source": "onboard" },
|
||||
"database": { "mode": "embedded-postgres" },
|
||||
"logging": { "mode": "file" },
|
||||
"server": { "deploymentMode": "local_trusted", "host": "127.0.0.1", "port": 3100 },
|
||||
"auth": { "baseUrlMode": "auto" },
|
||||
"storage": { "provider": "local_disk" },
|
||||
"secrets": { "provider": "local_encrypted", "strictMode": false }
|
||||
}
|
||||
CONF
|
||||
|
||||
- name: Run e2e tests
|
||||
env:
|
||||
PAPERCLIP_E2E_SKIP_LLM: "true"
|
||||
run: pnpm run test:e2e
|
||||
|
||||
- name: Upload Playwright report
|
||||
uses: actions/upload-artifact@v4
|
||||
if: always()
|
||||
with:
|
||||
name: playwright-report
|
||||
path: |
|
||||
tests/e2e/playwright-report/
|
||||
tests/e2e/test-results/
|
||||
retention-days: 14
|
||||
4 .github/workflows/refresh-lockfile.yml (vendored)
@@ -51,11 +51,13 @@ jobs:
|
||||
fi
|
||||
|
||||
- name: Create or update pull request
|
||||
id: upsert-pr
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
run: |
|
||||
if git diff --quiet -- pnpm-lock.yaml; then
|
||||
echo "Lockfile unchanged, nothing to do."
|
||||
echo "pr_created=false" >> "$GITHUB_OUTPUT"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
@@ -79,8 +81,10 @@ jobs:
|
||||
else
|
||||
echo "PR #$existing already exists, branch updated via force push."
|
||||
fi
|
||||
echo "pr_created=true" >> "$GITHUB_OUTPUT"
|
||||
|
||||
- name: Enable auto-merge for lockfile PR
|
||||
if: steps.upsert-pr.outputs.pr_created == 'true'
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
run: |
|
||||
|
||||
118 .github/workflows/release-smoke.yml (vendored, new file)
@@ -0,0 +1,118 @@
|
||||
name: Release Smoke
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
inputs:
|
||||
paperclip_version:
|
||||
description: Published Paperclip dist-tag to test
|
||||
required: true
|
||||
default: canary
|
||||
type: choice
|
||||
options:
|
||||
- canary
|
||||
- latest
|
||||
host_port:
|
||||
description: Host port for the Docker smoke container
|
||||
required: false
|
||||
default: "3232"
|
||||
type: string
|
||||
artifact_name:
|
||||
description: Artifact name for uploaded diagnostics
|
||||
required: false
|
||||
default: release-smoke
|
||||
type: string
|
||||
workflow_call:
|
||||
inputs:
|
||||
paperclip_version:
|
||||
required: true
|
||||
type: string
|
||||
host_port:
|
||||
required: false
|
||||
default: "3232"
|
||||
type: string
|
||||
artifact_name:
|
||||
required: false
|
||||
default: release-smoke
|
||||
type: string
|
||||
|
||||
jobs:
|
||||
smoke:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 45
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Setup pnpm
|
||||
uses: pnpm/action-setup@v4
|
||||
with:
|
||||
version: 9.15.4
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 24
|
||||
cache: pnpm
|
||||
|
||||
- name: Install dependencies
|
||||
run: pnpm install --no-frozen-lockfile
|
||||
|
||||
- name: Install Playwright browser
|
||||
run: npx playwright install --with-deps chromium
|
||||
|
||||
- name: Launch Docker smoke harness
|
||||
run: |
|
||||
metadata_file="$RUNNER_TEMP/release-smoke.env"
|
||||
HOST_PORT="${{ inputs.host_port }}" \
|
||||
DATA_DIR="$RUNNER_TEMP/release-smoke-data" \
|
||||
PAPERCLIPAI_VERSION="${{ inputs.paperclip_version }}" \
|
||||
SMOKE_DETACH=true \
|
||||
SMOKE_METADATA_FILE="$metadata_file" \
|
||||
./scripts/docker-onboard-smoke.sh
|
||||
set -a
|
||||
source "$metadata_file"
|
||||
set +a
|
||||
{
|
||||
echo "SMOKE_BASE_URL=$SMOKE_BASE_URL"
|
||||
echo "SMOKE_ADMIN_EMAIL=$SMOKE_ADMIN_EMAIL"
|
||||
echo "SMOKE_ADMIN_PASSWORD=$SMOKE_ADMIN_PASSWORD"
|
||||
echo "SMOKE_CONTAINER_NAME=$SMOKE_CONTAINER_NAME"
|
||||
echo "SMOKE_DATA_DIR=$SMOKE_DATA_DIR"
|
||||
echo "SMOKE_IMAGE_NAME=$SMOKE_IMAGE_NAME"
|
||||
echo "SMOKE_PAPERCLIPAI_VERSION=$SMOKE_PAPERCLIPAI_VERSION"
|
||||
echo "SMOKE_METADATA_FILE=$metadata_file"
|
||||
} >> "$GITHUB_ENV"
|
||||
|
||||
- name: Run release smoke Playwright suite
|
||||
env:
|
||||
PAPERCLIP_RELEASE_SMOKE_BASE_URL: ${{ env.SMOKE_BASE_URL }}
|
||||
PAPERCLIP_RELEASE_SMOKE_EMAIL: ${{ env.SMOKE_ADMIN_EMAIL }}
|
||||
PAPERCLIP_RELEASE_SMOKE_PASSWORD: ${{ env.SMOKE_ADMIN_PASSWORD }}
|
||||
run: pnpm run test:release-smoke
|
||||
|
||||
- name: Capture Docker logs
|
||||
if: always()
|
||||
run: |
|
||||
if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
|
||||
docker logs "$SMOKE_CONTAINER_NAME" >"$RUNNER_TEMP/docker-onboard-smoke.log" 2>&1 || true
|
||||
fi
|
||||
|
||||
- name: Upload diagnostics
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: ${{ inputs.artifact_name }}
|
||||
path: |
|
||||
${{ runner.temp }}/docker-onboard-smoke.log
|
||||
${{ env.SMOKE_METADATA_FILE }}
|
||||
tests/release-smoke/playwright-report/
|
||||
tests/release-smoke/test-results/
|
||||
retention-days: 14
|
||||
|
||||
- name: Stop Docker smoke container
|
||||
if: always()
|
||||
run: |
|
||||
if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
|
||||
docker rm -f "$SMOKE_CONTAINER_NAME" >/dev/null 2>&1 || true
|
||||
fi
|
||||
3 .github/workflows/release.yml (vendored)
@@ -12,7 +12,7 @@ on:
|
||||
type: string
|
||||
default: master
|
||||
stable_date:
|
||||
description: Stable release date in UTC (YYYY-MM-DD). Defaults to today.
|
||||
description: Enter a UTC date in YYYY-MM-DD format, for example 2026-03-18. Do not enter a version string. The workflow will resolve that date to a stable version such as 2026.318.0, then 2026.318.1 for the next same-day stable.
|
||||
required: false
|
||||
type: string
|
||||
dry_run:
|
||||
@@ -251,6 +251,7 @@ jobs:
|
||||
- name: Create GitHub Release
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
PUBLISH_REMOTE: origin
|
||||
run: |
|
||||
version="$(git tag --points-at HEAD | grep '^v' | head -1 | sed 's/^v//')"
|
||||
if [ -z "$version" ]; then
|
||||
|
||||
3 .gitignore (vendored)
@@ -31,6 +31,7 @@ server/src/**/*.js.map
|
||||
server/src/**/*.d.ts
|
||||
server/src/**/*.d.ts.map
|
||||
tmp/
|
||||
feedback-export-*
|
||||
|
||||
# Editor / tool temp files
|
||||
*.tmp
|
||||
@@ -46,5 +47,7 @@ tmp/
|
||||
# Playwright
|
||||
tests/e2e/test-results/
|
||||
tests/e2e/playwright-report/
|
||||
tests/release-smoke/test-results/
|
||||
tests/release-smoke/playwright-report/
|
||||
.superset/
|
||||
.claude/worktrees/
|
||||
|
||||
@@ -26,6 +26,9 @@ Before making changes, read in this order:
|
||||
- `ui/`: React + Vite board UI
|
||||
- `packages/db/`: Drizzle schema, migrations, DB clients
|
||||
- `packages/shared/`: shared types, constants, validators, API path constants
|
||||
- `packages/adapters/`: agent adapter implementations (Claude, Codex, Cursor, etc.)
|
||||
- `packages/adapter-utils/`: shared adapter utilities
|
||||
- `packages/plugins/`: plugin system packages
|
||||
- `doc/`: operational and product docs
|
||||
|
||||
## 4. Dev Setup (Auto DB)
|
||||
|
||||
@@ -11,8 +11,9 @@ We really appreciate both small fixes and thoughtful larger changes.
|
||||
- Pick **one** clear thing to fix/improve
|
||||
- Touch the **smallest possible number of files**
|
||||
- Make sure the change is very targeted and easy to review
|
||||
- All automated checks pass (including Greptile comments)
|
||||
- No new lint/test failures
|
||||
- All tests pass and CI is green
|
||||
- Greptile score is 5/5 with all comments addressed
|
||||
- Use the [PR template](.github/PULL_REQUEST_TEMPLATE.md)
|
||||
|
||||
These almost always get merged quickly when they're clean.
|
||||
|
||||
@@ -26,11 +27,26 @@ These almost always get merged quickly when they're clean.
|
||||
- Before / After screenshots (or short video if UI/behavior change)
|
||||
- Clear description of what & why
|
||||
- Proof it works (manual testing notes)
|
||||
- All tests passing
|
||||
- All Greptile + other PR comments addressed
|
||||
- All tests passing and CI green
|
||||
- Greptile score 5/5 with all comments addressed
|
||||
- [PR template](.github/PULL_REQUEST_TEMPLATE.md) fully filled out
|
||||
|
||||
PRs that follow this path are **much** more likely to be accepted, even when they're large.
|
||||
|
||||
## PR Requirements (all PRs)
|
||||
|
||||
### Use the PR Template
|
||||
|
||||
Every pull request **must** follow the PR template at [`.github/PULL_REQUEST_TEMPLATE.md`](.github/PULL_REQUEST_TEMPLATE.md). If you create a PR via the GitHub API or other tooling that bypasses the template, copy its contents into your PR description manually. The template includes required sections: Thinking Path, What Changed, Verification, Risks, and a Checklist.
|
||||
|
||||
### Tests Must Pass
|
||||
|
||||
All tests must pass before a PR can be merged. Run them locally first and verify CI is green after pushing.
|
||||
|
||||
### Greptile Review
|
||||
|
||||
We use [Greptile](https://greptile.com) for automated code review. Your PR must achieve a **5/5 Greptile score** with **all Greptile comments addressed** before it can be merged. If Greptile leaves comments, fix or respond to each one and request a re-review.
|
||||
|
||||
## General Rules (both paths)
|
||||
|
||||
- Write clear commit messages
|
||||
@@ -41,7 +57,7 @@ PRs that follow this path are **much** more likely to be accepted, even when the
|
||||
|
||||
## Writing a Good PR message
|
||||
|
||||
Please include a "thinking path" at the top of your PR message that explains from the top of the project down to what you fixed. E.g.:
|
||||
Your PR description must follow the [PR template](.github/PULL_REQUEST_TEMPLATE.md). All sections are required. The "thinking path" at the top explains from the top of the project down to what you fixed. E.g.:
|
||||
|
||||
### Thinking Path Example 1:
|
||||
|
||||
|
||||
36 Dockerfile
@@ -1,8 +1,23 @@
|
||||
FROM node:lts-trixie-slim AS base
|
||||
ARG USER_UID=1000
|
||||
ARG USER_GID=1000
|
||||
RUN apt-get update \
|
||||
&& apt-get install -y --no-install-recommends ca-certificates curl git \
|
||||
&& rm -rf /var/lib/apt/lists/*
|
||||
RUN corepack enable
|
||||
&& apt-get install -y --no-install-recommends ca-certificates gosu curl git wget ripgrep python3 \
|
||||
&& mkdir -p -m 755 /etc/apt/keyrings \
|
||||
&& wget -nv -O/etc/apt/keyrings/githubcli-archive-keyring.gpg https://cli.github.com/packages/githubcli-archive-keyring.gpg \
|
||||
&& echo "20e0125d6f6e077a9ad46f03371bc26d90b04939fb95170f5a1905099cc6bcc0 /etc/apt/keyrings/githubcli-archive-keyring.gpg" | sha256sum -c - \
|
||||
&& chmod go+r /etc/apt/keyrings/githubcli-archive-keyring.gpg \
|
||||
&& mkdir -p -m 755 /etc/apt/sources.list.d \
|
||||
&& echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" > /etc/apt/sources.list.d/github-cli.list \
|
||||
&& apt-get update \
|
||||
&& apt-get install -y --no-install-recommends gh \
|
||||
&& rm -rf /var/lib/apt/lists/* \
|
||||
&& corepack enable
|
||||
|
||||
# Modify the existing node user/group to have the specified UID/GID to match host user
|
||||
RUN usermod -u $USER_UID --non-unique node \
|
||||
&& groupmod -g $USER_GID --non-unique node \
|
||||
&& usermod -g $USER_GID -d /paperclip node
|
||||
|
||||
FROM base AS deps
|
||||
WORKDIR /app
|
||||
@@ -20,6 +35,8 @@ COPY packages/adapters/gemini-local/package.json packages/adapters/gemini-local/
|
||||
COPY packages/adapters/openclaw-gateway/package.json packages/adapters/openclaw-gateway/
|
||||
COPY packages/adapters/opencode-local/package.json packages/adapters/opencode-local/
|
||||
COPY packages/adapters/pi-local/package.json packages/adapters/pi-local/
|
||||
COPY packages/plugins/sdk/package.json packages/plugins/sdk/
|
||||
COPY patches/ patches/
|
||||
|
||||
RUN pnpm install --frozen-lockfile
|
||||
|
||||
@@ -28,16 +45,22 @@ WORKDIR /app
|
||||
COPY --from=deps /app /app
|
||||
COPY . .
|
||||
RUN pnpm --filter @paperclipai/ui build
|
||||
RUN pnpm --filter @paperclipai/plugin-sdk build
|
||||
RUN pnpm --filter @paperclipai/server build
|
||||
RUN test -f server/dist/index.js || (echo "ERROR: server build output missing" && exit 1)
|
||||
|
||||
FROM base AS production
|
||||
ARG USER_UID=1000
|
||||
ARG USER_GID=1000
|
||||
WORKDIR /app
|
||||
COPY --chown=node:node --from=build /app /app
|
||||
RUN npm install --global --omit=dev @anthropic-ai/claude-code@latest @openai/codex@latest opencode-ai \
|
||||
&& mkdir -p /paperclip \
|
||||
&& chown node:node /paperclip
|
||||
|
||||
COPY scripts/docker-entrypoint.sh /usr/local/bin/
|
||||
RUN chmod +x /usr/local/bin/docker-entrypoint.sh
|
||||
|
||||
ENV NODE_ENV=production \
|
||||
HOME=/paperclip \
|
||||
HOST=0.0.0.0 \
|
||||
@@ -45,12 +68,15 @@ ENV NODE_ENV=production \
|
||||
SERVE_UI=true \
|
||||
PAPERCLIP_HOME=/paperclip \
|
||||
PAPERCLIP_INSTANCE_ID=default \
|
||||
USER_UID=${USER_UID} \
|
||||
USER_GID=${USER_GID} \
|
||||
PAPERCLIP_CONFIG=/paperclip/instances/default/config.json \
|
||||
PAPERCLIP_DEPLOYMENT_MODE=authenticated \
|
||||
PAPERCLIP_DEPLOYMENT_EXPOSURE=private
|
||||
PAPERCLIP_DEPLOYMENT_EXPOSURE=private \
|
||||
OPENCODE_ALLOW_ALL_MODELS=true
|
||||
|
||||
VOLUME ["/paperclip"]
|
||||
EXPOSE 3100
|
||||
|
||||
USER node
|
||||
ENTRYPOINT ["docker-entrypoint.sh"]
|
||||
CMD ["node", "--import", "./server/node_modules/tsx/dist/loader.mjs", "server/dist/index.js"]
|
||||
|
||||
40 README.md
@@ -177,6 +177,8 @@ Open source. Self-hosted. No Paperclip account required.
|
||||
npx paperclipai onboard --yes
|
||||
```
|
||||
|
||||
If you already have Paperclip configured, rerunning `onboard` keeps the existing config in place. Use `paperclipai configure` to edit settings.
|
||||
|
||||
Or manually:
|
||||
|
||||
```bash
|
||||
@@ -234,16 +236,40 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
|
||||
|
||||
## Roadmap
|
||||
|
||||
- ⚪ Get OpenClaw onboarding easier
|
||||
- ⚪ Get cloud agents working e.g. Cursor / e2b agents
|
||||
- ⚪ ClipMart - buy and sell entire agent companies
|
||||
- ⚪ Easy agent configurations / easier to understand
|
||||
- ⚪ Better support for harness engineering
|
||||
- 🟢 Plugin system (e.g. if you want to add a knowledgebase, custom tracing, queues, etc)
|
||||
- ⚪ Better docs
|
||||
- ✅ Plugin system (e.g. add a knowledge base, custom tracing, queues, etc)
|
||||
- ✅ Get OpenClaw / claw-style agent employees
|
||||
- ✅ companies.sh - import and export entire organizations
|
||||
- ✅ Easy AGENTS.md configurations
|
||||
- ✅ Skills Manager
|
||||
- ✅ Scheduled Routines
|
||||
- ✅ Better Budgeting
|
||||
- ⚪ Artifacts & Deployments
|
||||
- ⚪ CEO Chat
|
||||
- ⚪ MAXIMIZER MODE
|
||||
- ⚪ Multiple Human Users
|
||||
- ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)
|
||||
- ⚪ Cloud deployments
|
||||
- ⚪ Desktop App
|
||||
|
||||
<br/>
|
||||
|
||||
## Community & Plugins
|
||||
|
||||
Find Plugins and more at [awesome-paperclip](https://github.com/gsxdsm/awesome-paperclip)
|
||||
|
||||
## Telemetry
|
||||
|
||||
Paperclip collects anonymous usage telemetry to help us understand how the product is used and improve it. No personal information, issue content, prompts, file paths, or secrets are ever collected. Private repository references are hashed with a per-install salt before being sent.
|
||||
|
||||
Telemetry is **enabled by default** and can be disabled with any of the following:
|
||||
|
||||
| Method | How |
|
||||
|---|---|
|
||||
| Environment variable | `PAPERCLIP_TELEMETRY_DISABLED=1` |
|
||||
| Standard convention | `DO_NOT_TRACK=1` |
|
||||
| CI environments | Automatically disabled when `CI=true` |
|
||||
| Config file | Set `telemetry.enabled: false` in your Paperclip config |
|
||||
|
||||
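As a quick illustration of the table above (both pieces are documented in this README), a one-off run with telemetry disabled looks like:

```bash
PAPERCLIP_TELEMETRY_DISABLED=1 npx paperclipai onboard --yes
```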
## Contributing
|
||||
|
||||
We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for details.
|
||||
|
||||
292 cli/README.md (new file)
@@ -0,0 +1,292 @@
|
||||
<p align="center">
|
||||
<img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/header.png" alt="Paperclip — runs your business" width="720" />
|
||||
</p>
|
||||
|
||||
<p align="center">
|
||||
<a href="#quickstart"><strong>Quickstart</strong></a> ·
|
||||
<a href="https://paperclip.ing/docs"><strong>Docs</strong></a> ·
|
||||
<a href="https://github.com/paperclipai/paperclip"><strong>GitHub</strong></a> ·
|
||||
<a href="https://discord.gg/m4HZY7xNG3"><strong>Discord</strong></a>
|
||||
</p>
|
||||
|
||||
<p align="center">
|
||||
<a href="https://github.com/paperclipai/paperclip/blob/master/LICENSE"><img src="https://img.shields.io/badge/license-MIT-blue" alt="MIT License" /></a>
|
||||
<a href="https://github.com/paperclipai/paperclip/stargazers"><img src="https://img.shields.io/github/stars/paperclipai/paperclip?style=flat" alt="Stars" /></a>
|
||||
<a href="https://discord.gg/m4HZY7xNG3"><img src="https://img.shields.io/badge/discord-join%20chat-5865F2?logo=discord&logoColor=white" alt="Discord" /></a>
|
||||
</p>
|
||||
|
||||
<br/>
|
||||
|
||||
<div align="center">
|
||||
<video src="https://github.com/user-attachments/assets/773bdfb2-6d1e-4e30-8c5f-3487d5b70c8f" width="600" controls></video>
|
||||
</div>
|
||||
|
||||
<br/>
|
||||
|
||||
## What is Paperclip?
|
||||
|
||||
# Open-source orchestration for zero-human companies
|
||||
|
||||
**If OpenClaw is an _employee_, Paperclip is the _company_**
|
||||
|
||||
Paperclip is a Node.js server and React UI that orchestrates a team of AI agents to run a business. Bring your own agents, assign goals, and track your agents' work and costs from one dashboard.
|
||||
|
||||
It looks like a task manager — but under the hood it has org charts, budgets, governance, goal alignment, and agent coordination.
|
||||
|
||||
**Manage business goals, not pull requests.**
|
||||
|
||||
| | Step | Example |
|
||||
| ------ | --------------- | ------------------------------------------------------------------ |
|
||||
| **01** | Define the goal | _"Build the #1 AI note-taking app to $1M MRR."_ |
|
||||
| **02** | Hire the team | CEO, CTO, engineers, designers, marketers — any bot, any provider. |
|
||||
| **03** | Approve and run | Review strategy. Set budgets. Hit go. Monitor from the dashboard. |
|
||||
|
||||
<br/>
|
||||
|
||||
> **COMING SOON: Clipmart** — Download and run entire companies with one click. Browse pre-built company templates — full org structures, agent configs, and skills — and import them into your Paperclip instance in seconds.
|
||||
|
||||
<br/>
|
||||
|
||||
<div align="center">
|
||||
<table>
|
||||
<tr>
|
||||
<td align="center"><strong>Works<br/>with</strong></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/openclaw.svg" width="32" alt="OpenClaw" /><br/><sub>OpenClaw</sub></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/claude.svg" width="32" alt="Claude" /><br/><sub>Claude Code</sub></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/codex.svg" width="32" alt="Codex" /><br/><sub>Codex</sub></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/cursor.svg" width="32" alt="Cursor" /><br/><sub>Cursor</sub></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/bash.svg" width="32" alt="Bash" /><br/><sub>Bash</sub></td>
|
||||
<td align="center"><img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/logos/http.svg" width="32" alt="HTTP" /><br/><sub>HTTP</sub></td>
|
||||
</tr>
|
||||
</table>
|
||||
|
||||
<em>If it can receive a heartbeat, it's hired.</em>
|
||||
|
||||
</div>
|
||||
|
||||
<br/>
|
||||
|
||||
## Paperclip is right for you if
|
||||
|
||||
- ✅ You want to build **autonomous AI companies**
|
||||
- ✅ You **coordinate many different agents** (OpenClaw, Codex, Claude, Cursor) toward a common goal
|
||||
- ✅ You have **20 simultaneous Claude Code terminals** open and lose track of what everyone is doing
|
||||
- ✅ You want agents running **autonomously 24/7**, but still want to audit work and chime in when needed
|
||||
- ✅ You want to **monitor costs** and enforce budgets
|
||||
- ✅ You want a process for managing agents that **feels like using a task manager**
|
||||
- ✅ You want to manage your autonomous businesses **from your phone**
|
||||
|
||||
<br/>
|
||||
|
||||
## Features
|
||||
|
||||
<table>
|
||||
<tr>
|
||||
<td align="center" width="33%">
|
||||
<h3>🔌 Bring Your Own Agent</h3>
|
||||
Any agent, any runtime, one org chart. If it can receive a heartbeat, it's hired.
|
||||
</td>
|
||||
<td align="center" width="33%">
|
||||
<h3>🎯 Goal Alignment</h3>
|
||||
Every task traces back to the company mission. Agents know <em>what</em> to do and <em>why</em>.
|
||||
</td>
|
||||
<td align="center" width="33%">
|
||||
<h3>💓 Heartbeats</h3>
|
||||
Agents wake on a schedule, check work, and act. Delegation flows up and down the org chart.
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td align="center">
|
||||
<h3>💰 Cost Control</h3>
|
||||
Monthly budgets per agent. When they hit the limit, they stop. No runaway costs.
|
||||
</td>
|
||||
<td align="center">
|
||||
<h3>🏢 Multi-Company</h3>
|
||||
One deployment, many companies. Complete data isolation. One control plane for your portfolio.
|
||||
</td>
|
||||
<td align="center">
|
||||
<h3>🎫 Ticket System</h3>
|
||||
Every conversation traced. Every decision explained. Full tool-call tracing and immutable audit log.
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td align="center">
|
||||
<h3>🛡️ Governance</h3>
|
||||
You're the board. Approve hires, override strategy, pause or terminate any agent — at any time.
|
||||
</td>
|
||||
<td align="center">
|
||||
<h3>📊 Org Chart</h3>
|
||||
Hierarchies, roles, reporting lines. Your agents have a boss, a title, and a job description.
|
||||
</td>
|
||||
<td align="center">
|
||||
<h3>📱 Mobile Ready</h3>
|
||||
Monitor and manage your autonomous businesses from anywhere.
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
|
||||
<br/>
|
||||
|
||||
## Problems Paperclip solves
|
||||
|
||||
| Without Paperclip | With Paperclip |
|
||||
| ------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------- |
|
||||
| ❌ You have 20 Claude Code tabs open and can't track which one does what. On reboot you lose everything. | ✅ Tasks are ticket-based, conversations are threaded, sessions persist across reboots. |
|
||||
| ❌ You manually gather context from several places to remind your bot what you're actually doing. | ✅ Context flows from the task up through the project and company goals — your agent always knows what to do and why. |
|
||||
| ❌ Folders of agent configs are disorganized and you're re-inventing task management, communication, and coordination between agents. | ✅ Paperclip gives you org charts, ticketing, delegation, and governance out of the box — so you run a company, not a pile of scripts. |
|
||||
| ❌ Runaway loops waste hundreds of dollars of tokens and max your quota before you even know what happened. | ✅ Cost tracking surfaces token budgets and throttles agents when they're out. Management prioritizes with budgets. |
|
||||
| ❌ You have recurring jobs (customer support, social, reports) and have to remember to manually kick them off. | ✅ Heartbeats handle regular work on a schedule. Management supervises. |
|
||||
| ❌ You have an idea, you have to find your repo, fire up Claude Code, keep a tab open, and babysit it. | ✅ Add a task in Paperclip. Your coding agent works on it until it's done. Management reviews their work. |
|
||||
|
||||
<br/>
|
||||
|
||||
## Why Paperclip is special
|
||||
|
||||
Paperclip handles the hard orchestration details correctly.
|
||||
|
||||
| | |
|
||||
| --------------------------------- | ------------------------------------------------------------------------------------------------------------- |
|
||||
| **Atomic execution.** | Task checkout and budget enforcement are atomic, so no double-work and no runaway spend. |
|
||||
| **Persistent agent state.** | Agents resume the same task context across heartbeats instead of restarting from scratch. |
|
||||
| **Runtime skill injection.** | Agents can learn Paperclip workflows and project context at runtime, without retraining. |
|
||||
| **Governance with rollback.** | Approval gates are enforced, config changes are revisioned, and bad changes can be rolled back safely. |
|
||||
| **Goal-aware execution.** | Tasks carry full goal ancestry so agents consistently see the "why," not just a title. |
|
||||
| **Portable company templates.** | Export/import orgs, agents, and skills with secret scrubbing and collision handling. |
|
||||
| **True multi-company isolation.** | Every entity is company-scoped, so one deployment can run many companies with separate data and audit trails. |
|
||||
|
||||
<br/>
|
||||
|
||||
## What Paperclip is not
|
||||
|
||||
| | |
|
||||
| ---------------------------- | -------------------------------------------------------------------------------------------------------------------- |
|
||||
| **Not a chatbot.** | Agents have jobs, not chat windows. |
|
||||
| **Not an agent framework.** | We don't tell you how to build agents. We tell you how to run a company made of them. |
|
||||
| **Not a workflow builder.** | No drag-and-drop pipelines. Paperclip models companies — with org charts, goals, budgets, and governance. |
|
||||
| **Not a prompt manager.** | Agents bring their own prompts, models, and runtimes. Paperclip manages the organization they work in. |
|
||||
| **Not a single-agent tool.** | This is for teams. If you have one agent, you probably don't need Paperclip. If you have twenty — you definitely do. |
|
||||
| **Not a code review tool.** | Paperclip orchestrates work, not pull requests. Bring your own review process. |
|
||||
|
||||
<br/>
|
||||
|
||||
## Quickstart
|
||||
|
||||
Open source. Self-hosted. No Paperclip account required.
|
||||
|
||||
```bash
|
||||
npx paperclipai onboard --yes
|
||||
```
|
||||
|
||||
If you already have Paperclip configured, rerunning `onboard` keeps the existing config in place. Use `paperclipai configure` to edit settings.
|
||||
|
||||
Or manually:
|
||||
|
||||
```bash
|
||||
git clone https://github.com/paperclipai/paperclip.git
|
||||
cd paperclip
|
||||
pnpm install
|
||||
pnpm dev
|
||||
```
|
||||
|
||||
This starts the API server at `http://localhost:3100`. An embedded PostgreSQL database is created automatically — no setup required.
|
||||
|
||||
> **Requirements:** Node.js 20+, pnpm 9.15+
|
||||
|
||||
<br/>
|
||||
|
||||
## FAQ
|
||||
|
||||
**What does a typical setup look like?**
|
||||
Locally, a single Node.js process manages an embedded Postgres and local file storage. For production, point it at your own Postgres and deploy however you like. Configure projects, agents, and goals — the agents take care of the rest.
|
||||
|
||||
If you're a solo entrepreneur, you can use Tailscale to access Paperclip on the go, and later deploy to a host like Vercel when you need it.
|
||||
|
||||
**Can I run multiple companies?**
|
||||
Yes. A single deployment can run an unlimited number of companies with complete data isolation.
|
||||
|
||||
**How is Paperclip different from agents like OpenClaw or Claude Code?**
|
||||
Paperclip _uses_ those agents. It orchestrates them into a company — with org charts, budgets, goals, governance, and accountability.
|
||||
|
||||
**Why should I use Paperclip instead of just pointing my OpenClaw to Asana or Trello?**
|
||||
Agent orchestration has subtleties: coordinating who has work checked out, maintaining sessions, monitoring costs, and establishing governance. Paperclip handles this for you.
|
||||
|
||||
(Bring-your-own-ticket-system is on the Roadmap)
|
||||
|
||||
**Do agents run continuously?**
|
||||
By default, agents run on scheduled heartbeats and event-based triggers (task assignment, @-mentions). You can also hook in continuous agents like OpenClaw. You bring your agent and Paperclip coordinates.
|
||||
|
||||
<br/>
|
||||
|
||||
## Development
|
||||
|
||||
```bash
|
||||
pnpm dev # Full dev (API + UI, watch mode)
|
||||
pnpm dev:once # Full dev without file watching
|
||||
pnpm dev:server # Server only
|
||||
pnpm build # Build all
|
||||
pnpm typecheck # Type checking
|
||||
pnpm test:run # Run tests
|
||||
pnpm db:generate # Generate DB migration
|
||||
pnpm db:migrate # Apply migrations
|
||||
```
|
||||
|
||||
See [doc/DEVELOPING.md](https://github.com/paperclipai/paperclip/blob/master/doc/DEVELOPING.md) for the full development guide.
|
||||
|
||||
<br/>
|
||||
|
||||
## Roadmap
|
||||
|
||||
- ✅ Plugin system (e.g. add a knowledge base, custom tracing, queues, etc)
|
||||
- ✅ Get OpenClaw / claw-style agent employees
|
||||
- ✅ companies.sh - import and export entire organizations
|
||||
- ✅ Easy AGENTS.md configurations
|
||||
- ✅ Skills Manager
|
||||
- ✅ Scheduled Routines
|
||||
- ✅ Better Budgeting
|
||||
- ⚪ Artifacts & Deployments
|
||||
- ⚪ CEO Chat
|
||||
- ⚪ MAXIMIZER MODE
|
||||
- ⚪ Multiple Human Users
|
||||
- ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)
|
||||
- ⚪ Cloud deployments
|
||||
- ⚪ Desktop App
|
||||
|
||||
<br/>
|
||||
|
||||
## Community & Plugins
|
||||
|
||||
Find Plugins and more at [awesome-paperclip](https://github.com/gsxdsm/awesome-paperclip)
|
||||
|
||||
## Contributing
|
||||
|
||||
We welcome contributions. See the [contributing guide](https://github.com/paperclipai/paperclip/blob/master/CONTRIBUTING.md) for details.
|
||||
|
||||
<br/>
|
||||
|
||||
## Community
|
||||
|
||||
- [Discord](https://discord.gg/m4HZY7xNG3) — Join the community
|
||||
- [GitHub Issues](https://github.com/paperclipai/paperclip/issues) — bugs and feature requests
|
||||
- [GitHub Discussions](https://github.com/paperclipai/paperclip/discussions) — ideas and RFC
|
||||
|
||||
<br/>
|
||||
|
||||
## License
|
||||
|
||||
MIT © 2026 Paperclip
|
||||
|
||||
## Star History
|
||||
|
||||
[](https://www.star-history.com/?repos=paperclipai%2Fpaperclip&type=date&legend=top-left)
|
||||
|
||||
<br/>
|
||||
|
||||
---
|
||||
|
||||
<p align="center">
|
||||
<img src="https://raw.githubusercontent.com/paperclipai/paperclip/master/doc/assets/footer.jpg" alt="" width="720" />
|
||||
</p>
|
||||
|
||||
<p align="center">
|
||||
<sub>Open source under MIT. Built for people who want to run companies, not babysit agents.</sub>
|
||||
</p>
|
||||
@@ -44,6 +44,9 @@ function writeBaseConfig(configPath: string) {
|
||||
baseUrlMode: "auto",
|
||||
disableSignUp: false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled: true,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: { baseDir: "/tmp/paperclip-storage" },
|
||||
|
||||
16 cli/src/__tests__/auth-command-registration.test.ts (new file)
@@ -0,0 +1,16 @@
|
||||
import { Command } from "commander";
|
||||
import { describe, expect, it } from "vitest";
|
||||
import { registerClientAuthCommands } from "../commands/client/auth.js";
|
||||
|
||||
describe("registerClientAuthCommands", () => {
|
||||
it("registers auth commands without duplicate company-id flags", () => {
|
||||
const program = new Command();
|
||||
const auth = program.command("auth");
|
||||
|
||||
expect(() => registerClientAuthCommands(auth)).not.toThrow();
|
||||
|
||||
const login = auth.commands.find((command) => command.name() === "login");
|
||||
expect(login).toBeDefined();
|
||||
expect(login?.options.filter((option) => option.long === "--company-id")).toHaveLength(1);
|
||||
});
|
||||
});
|
||||
53 cli/src/__tests__/board-auth.test.ts (new file)
@@ -0,0 +1,53 @@
|
||||
import fs from "node:fs";
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { describe, expect, it } from "vitest";
|
||||
import {
|
||||
getStoredBoardCredential,
|
||||
readBoardAuthStore,
|
||||
removeStoredBoardCredential,
|
||||
setStoredBoardCredential,
|
||||
} from "../client/board-auth.js";
|
||||
|
||||
function createTempAuthPath(): string {
|
||||
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-auth-"));
|
||||
return path.join(dir, "auth.json");
|
||||
}
|
||||
|
||||
describe("board auth store", () => {
|
||||
it("returns an empty store when the file does not exist", () => {
|
||||
const authPath = createTempAuthPath();
|
||||
expect(readBoardAuthStore(authPath)).toEqual({
|
||||
version: 1,
|
||||
credentials: {},
|
||||
});
|
||||
});
|
||||
|
||||
it("stores and retrieves credentials by normalized api base", () => {
|
||||
const authPath = createTempAuthPath();
|
||||
setStoredBoardCredential({
|
||||
apiBase: "http://localhost:3100/",
|
||||
token: "token-123",
|
||||
userId: "user-1",
|
||||
storePath: authPath,
|
||||
});
|
||||
|
||||
expect(getStoredBoardCredential("http://localhost:3100", authPath)).toMatchObject({
|
||||
apiBase: "http://localhost:3100",
|
||||
token: "token-123",
|
||||
userId: "user-1",
|
||||
});
|
||||
});
|
||||
|
||||
it("removes stored credentials", () => {
|
||||
const authPath = createTempAuthPath();
|
||||
setStoredBoardCredential({
|
||||
apiBase: "http://localhost:3100",
|
||||
token: "token-123",
|
||||
storePath: authPath,
|
||||
});
|
||||
|
||||
expect(removeStoredBoardCredential("http://localhost:3100", authPath)).toBe(true);
|
||||
expect(getStoredBoardCredential("http://localhost:3100", authPath)).toBeNull();
|
||||
});
|
||||
});
|
||||
@@ -15,6 +15,10 @@ function makeCompany(overrides: Partial<Company>): Company {
|
||||
budgetMonthlyCents: 0,
|
||||
spentMonthlyCents: 0,
|
||||
requireBoardApprovalForNewAgents: false,
|
||||
feedbackDataSharingEnabled: false,
|
||||
feedbackDataSharingConsentAt: null,
|
||||
feedbackDataSharingConsentByUserId: null,
|
||||
feedbackDataSharingTermsVersion: null,
|
||||
brandColor: null,
|
||||
logoAssetId: null,
|
||||
logoUrl: null,
|
||||
|
||||
502 cli/src/__tests__/company-import-export-e2e.test.ts (new file)
@@ -0,0 +1,502 @@
|
||||
import { execFile, spawn } from "node:child_process";
|
||||
import { mkdirSync, mkdtempSync, readFileSync, readdirSync, rmSync, writeFileSync } from "node:fs";
|
||||
import net from "node:net";
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { fileURLToPath } from "node:url";
|
||||
import { promisify } from "node:util";
|
||||
import { afterAll, beforeAll, describe, expect, it } from "vitest";
|
||||
import {
|
||||
getEmbeddedPostgresTestSupport,
|
||||
startEmbeddedPostgresTestDatabase,
|
||||
} from "./helpers/embedded-postgres.js";
|
||||
import { createStoredZipArchive } from "./helpers/zip.js";
|
||||
|
||||
const execFileAsync = promisify(execFile);
|
||||
type ServerProcess = ReturnType<typeof spawn>;
|
||||
|
||||
async function getAvailablePort(): Promise<number> {
|
||||
return await new Promise((resolve, reject) => {
|
||||
const server = net.createServer();
|
||||
server.unref();
|
||||
server.on("error", reject);
|
||||
server.listen(0, "127.0.0.1", () => {
|
||||
const address = server.address();
|
||||
if (!address || typeof address === "string") {
|
||||
server.close(() => reject(new Error("Failed to allocate test port")));
|
||||
return;
|
||||
}
|
||||
const { port } = address;
|
||||
server.close((error) => {
|
||||
if (error) reject(error);
|
||||
else resolve(port);
|
||||
});
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
|
||||
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
|
||||
|
||||
if (!embeddedPostgresSupport.supported) {
|
||||
console.warn(
|
||||
`Skipping embedded Postgres company import/export e2e tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
|
||||
);
|
||||
}
|
||||
|
||||
function writeTestConfig(configPath: string, tempRoot: string, port: number, connectionString: string) {
|
||||
const config = {
|
||||
$meta: {
|
||||
version: 1,
|
||||
updatedAt: new Date().toISOString(),
|
||||
source: "doctor",
|
||||
},
|
||||
database: {
|
||||
mode: "postgres",
|
||||
connectionString,
|
||||
embeddedPostgresDataDir: path.join(tempRoot, "embedded-db"),
|
||||
embeddedPostgresPort: 54329,
|
||||
backup: {
|
||||
enabled: false,
|
||||
intervalMinutes: 60,
|
||||
retentionDays: 30,
|
||||
dir: path.join(tempRoot, "backups"),
|
||||
},
|
||||
},
|
||||
logging: {
|
||||
mode: "file",
|
||||
logDir: path.join(tempRoot, "logs"),
|
||||
},
|
||||
server: {
|
||||
deploymentMode: "local_trusted",
|
||||
exposure: "private",
|
||||
host: "127.0.0.1",
|
||||
port,
|
||||
allowedHostnames: [],
|
||||
serveUi: false,
|
||||
},
|
||||
auth: {
|
||||
baseUrlMode: "auto",
|
||||
disableSignUp: false,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: {
|
||||
baseDir: path.join(tempRoot, "storage"),
|
||||
},
|
||||
s3: {
|
||||
bucket: "paperclip",
|
||||
region: "us-east-1",
|
||||
prefix: "",
|
||||
forcePathStyle: false,
|
||||
},
|
||||
},
|
||||
secrets: {
|
||||
provider: "local_encrypted",
|
||||
strictMode: false,
|
||||
localEncrypted: {
|
||||
keyFilePath: path.join(tempRoot, "secrets", "master.key"),
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
mkdirSync(path.dirname(configPath), { recursive: true });
|
||||
writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
|
||||
}
|
||||
|
||||
function createServerEnv(configPath: string, port: number, connectionString: string) {
|
||||
const env = { ...process.env };
|
||||
for (const key of Object.keys(env)) {
|
||||
if (key.startsWith("PAPERCLIP_")) {
|
||||
delete env[key];
|
||||
}
|
||||
}
|
||||
delete env.DATABASE_URL;
|
||||
delete env.PORT;
|
||||
delete env.HOST;
|
||||
delete env.SERVE_UI;
|
||||
delete env.HEARTBEAT_SCHEDULER_ENABLED;
|
||||
|
||||
env.PAPERCLIP_CONFIG = configPath;
|
||||
env.DATABASE_URL = connectionString;
|
||||
env.HOST = "127.0.0.1";
|
||||
env.PORT = String(port);
|
||||
env.SERVE_UI = "false";
|
||||
env.PAPERCLIP_DB_BACKUP_ENABLED = "false";
|
||||
env.HEARTBEAT_SCHEDULER_ENABLED = "false";
|
||||
env.PAPERCLIP_MIGRATION_AUTO_APPLY = "true";
|
||||
env.PAPERCLIP_UI_DEV_MIDDLEWARE = "false";
|
||||
|
||||
return env;
|
||||
}
|
||||
|
||||
function createCliEnv() {
|
||||
const env = { ...process.env };
|
||||
for (const key of Object.keys(env)) {
|
||||
if (key.startsWith("PAPERCLIP_")) {
|
||||
delete env[key];
|
||||
}
|
||||
}
|
||||
delete env.DATABASE_URL;
|
||||
delete env.PORT;
|
||||
delete env.HOST;
|
||||
delete env.SERVE_UI;
|
||||
delete env.PAPERCLIP_DB_BACKUP_ENABLED;
|
||||
delete env.HEARTBEAT_SCHEDULER_ENABLED;
|
||||
delete env.PAPERCLIP_MIGRATION_AUTO_APPLY;
|
||||
delete env.PAPERCLIP_UI_DEV_MIDDLEWARE;
|
||||
return env;
|
||||
}
|
||||
|
||||
function collectTextFiles(root: string, current: string, files: Record<string, string>) {
|
||||
for (const entry of readdirSync(current, { withFileTypes: true })) {
|
||||
const absolutePath = path.join(current, entry.name);
|
||||
if (entry.isDirectory()) {
|
||||
collectTextFiles(root, absolutePath, files);
|
||||
continue;
|
||||
}
|
||||
if (!entry.isFile()) continue;
|
||||
const relativePath = path.relative(root, absolutePath).replace(/\\/g, "/");
|
||||
files[relativePath] = readFileSync(absolutePath, "utf8");
|
||||
}
|
||||
}
|
||||
|
||||
async function stopServerProcess(child: ServerProcess | null) {
|
||||
if (!child || child.exitCode !== null) return;
|
||||
child.kill("SIGTERM");
|
||||
await new Promise<void>((resolve) => {
|
||||
child.once("exit", () => resolve());
|
||||
setTimeout(() => {
|
||||
if (child.exitCode === null) {
|
||||
child.kill("SIGKILL");
|
||||
}
|
||||
}, 5_000);
|
||||
});
|
||||
}
|
||||
|
||||
async function api<T>(baseUrl: string, pathname: string, init?: RequestInit): Promise<T> {
|
||||
const res = await fetch(`${baseUrl}${pathname}`, init);
|
||||
const text = await res.text();
|
||||
if (!res.ok) {
|
||||
throw new Error(`Request failed ${res.status} ${pathname}: ${text}`);
|
||||
}
|
||||
return text ? JSON.parse(text) as T : (null as T);
|
||||
}
|
||||
|
||||
async function runCliJson<T>(args: string[], opts: { apiBase: string; configPath: string }) {
|
||||
const repoRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
|
||||
const result = await execFileAsync(
|
||||
"pnpm",
|
||||
["--silent", "paperclipai", ...args, "--api-base", opts.apiBase, "--config", opts.configPath, "--json"],
|
||||
{
|
||||
cwd: repoRoot,
|
||||
env: createCliEnv(),
|
||||
maxBuffer: 10 * 1024 * 1024,
|
||||
},
|
||||
);
|
||||
const stdout = result.stdout.trim();
|
||||
const jsonStart = stdout.search(/[\[{]/);
|
||||
if (jsonStart === -1) {
|
||||
throw new Error(`CLI did not emit JSON.\nstdout:\n${result.stdout}\nstderr:\n${result.stderr}`);
|
||||
}
|
||||
return JSON.parse(stdout.slice(jsonStart)) as T;
|
||||
}
|
||||
|
||||
async function waitForServer(
|
||||
apiBase: string,
|
||||
child: ServerProcess,
|
||||
output: { stdout: string[]; stderr: string[] },
|
||||
) {
|
||||
const startedAt = Date.now();
|
||||
while (Date.now() - startedAt < 30_000) {
|
||||
if (child.exitCode !== null) {
|
||||
throw new Error(
|
||||
`paperclipai run exited before healthcheck succeeded.\nstdout:\n${output.stdout.join("")}\nstderr:\n${output.stderr.join("")}`,
|
||||
);
|
||||
}
|
||||
|
||||
try {
|
||||
const res = await fetch(`${apiBase}/api/health`);
|
||||
if (res.ok) return;
|
||||
} catch {
|
||||
// Server is still starting.
|
||||
}
|
||||
|
||||
await new Promise((resolve) => setTimeout(resolve, 250));
|
||||
}
|
||||
|
||||
throw new Error(
|
||||
`Timed out waiting for ${apiBase}/api/health.\nstdout:\n${output.stdout.join("")}\nstderr:\n${output.stderr.join("")}`,
|
||||
);
|
||||
}
|
||||
|
||||
describeEmbeddedPostgres("paperclipai company import/export e2e", () => {
|
||||
let tempRoot = "";
|
||||
let configPath = "";
|
||||
let exportDir = "";
|
||||
let apiBase = "";
|
||||
let serverProcess: ServerProcess | null = null;
|
||||
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
|
||||
|
||||
beforeAll(async () => {
|
||||
tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-company-cli-e2e-"));
|
||||
configPath = path.join(tempRoot, "config", "config.json");
|
||||
exportDir = path.join(tempRoot, "exported-company");
|
||||
|
||||
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-company-cli-db-");
|
||||
|
||||
const port = await getAvailablePort();
|
||||
writeTestConfig(configPath, tempRoot, port, tempDb.connectionString);
|
||||
apiBase = `http://127.0.0.1:${port}`;
|
||||
|
||||
const repoRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
|
||||
const output = { stdout: [] as string[], stderr: [] as string[] };
|
||||
const child = spawn(
|
||||
"pnpm",
|
||||
["paperclipai", "run", "--config", configPath],
|
||||
{
|
||||
cwd: repoRoot,
|
||||
env: createServerEnv(configPath, port, tempDb.connectionString),
|
||||
stdio: ["ignore", "pipe", "pipe"],
|
||||
},
|
||||
);
|
||||
serverProcess = child;
|
||||
child.stdout?.on("data", (chunk) => {
|
||||
output.stdout.push(String(chunk));
|
||||
});
|
||||
child.stderr?.on("data", (chunk) => {
|
||||
output.stderr.push(String(chunk));
|
||||
});
|
||||
|
||||
await waitForServer(apiBase, child, output);
|
||||
}, 60_000);
|
||||
|
||||
afterAll(async () => {
|
||||
await stopServerProcess(serverProcess);
|
||||
await tempDb?.cleanup();
|
||||
if (tempRoot) {
|
||||
rmSync(tempRoot, { recursive: true, force: true });
|
||||
}
|
||||
});
|
||||
|
||||
it("exports a company package and imports it into new and existing companies", async () => {
|
||||
expect(serverProcess).not.toBeNull();
|
||||
|
||||
const sourceCompany = await api<{ id: string; name: string; issuePrefix: string }>(apiBase, "/api/companies", {
|
||||
method: "POST",
|
||||
headers: { "content-type": "application/json" },
|
||||
body: JSON.stringify({ name: `CLI Export Source ${Date.now()}` }),
|
||||
});
|
||||
|
||||
const sourceAgent = await api<{ id: string; name: string }>(
|
||||
apiBase,
|
||||
`/api/companies/${sourceCompany.id}/agents`,
|
||||
{
|
||||
method: "POST",
|
||||
headers: { "content-type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
name: "Export Engineer",
|
||||
role: "engineer",
|
||||
adapterType: "claude_local",
|
||||
adapterConfig: {
|
||||
promptTemplate: "You verify company portability.",
|
||||
},
|
||||
}),
|
||||
},
|
||||
);
|
||||
|
||||
const sourceProject = await api<{ id: string; name: string }>(
|
||||
apiBase,
|
||||
`/api/companies/${sourceCompany.id}/projects`,
|
||||
{
|
||||
method: "POST",
|
||||
headers: { "content-type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
name: "Portability Verification",
|
||||
status: "in_progress",
|
||||
}),
|
||||
},
|
||||
);
|
||||
|
||||
const largeIssueDescription = `Round-trip the company package through the CLI.\n\n${"portable-data ".repeat(12_000)}`;
|
||||
|
||||
const sourceIssue = await api<{ id: string; title: string; identifier: string }>(
|
||||
apiBase,
|
||||
`/api/companies/${sourceCompany.id}/issues`,
|
||||
{
|
||||
method: "POST",
|
||||
headers: { "content-type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
title: "Validate company import/export",
|
||||
description: largeIssueDescription,
|
||||
status: "todo",
|
||||
projectId: sourceProject.id,
|
||||
assigneeAgentId: sourceAgent.id,
|
||||
}),
|
||||
},
|
||||
);
|
||||
|
||||
const exportResult = await runCliJson<{
|
||||
ok: boolean;
|
||||
out: string;
|
||||
filesWritten: number;
|
||||
}>(
|
||||
[
|
||||
"company",
|
||||
"export",
|
||||
sourceCompany.id,
|
||||
"--out",
|
||||
exportDir,
|
||||
"--include",
|
||||
"company,agents,projects,issues",
|
||||
],
|
||||
{ apiBase, configPath },
|
||||
);
|
||||
|
||||
expect(exportResult.ok).toBe(true);
|
||||
expect(exportResult.filesWritten).toBeGreaterThan(0);
|
||||
expect(readFileSync(path.join(exportDir, "COMPANY.md"), "utf8")).toContain(sourceCompany.name);
|
||||
expect(readFileSync(path.join(exportDir, ".paperclip.yaml"), "utf8")).toContain('schema: "paperclip/v1"');
|
||||
|
||||
const importedNew = await runCliJson<{
|
||||
company: { id: string; name: string; action: string };
|
||||
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||
}>(
|
||||
[
|
||||
"company",
|
||||
"import",
|
||||
exportDir,
|
||||
"--target",
|
||||
"new",
|
||||
"--new-company-name",
|
||||
`Imported ${sourceCompany.name}`,
|
||||
"--include",
|
||||
"company,agents,projects,issues",
|
||||
"--yes",
|
||||
],
|
||||
{ apiBase, configPath },
|
||||
);
|
||||
|
||||
expect(importedNew.company.action).toBe("created");
|
||||
expect(importedNew.agents).toHaveLength(1);
|
||||
expect(importedNew.agents[0]?.action).toBe("created");
|
||||
|
||||
const importedAgents = await api<Array<{ id: string; name: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/agents`,
|
||||
);
|
||||
const importedProjects = await api<Array<{ id: string; name: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/projects`,
|
||||
);
|
||||
const importedIssues = await api<Array<{ id: string; title: string; identifier: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/issues`,
|
||||
);
|
||||
|
||||
expect(importedAgents.map((agent) => agent.name)).toContain(sourceAgent.name);
|
||||
expect(importedProjects.map((project) => project.name)).toContain(sourceProject.name);
|
||||
expect(importedIssues.map((issue) => issue.title)).toContain(sourceIssue.title);
|
||||
|
||||
const previewExisting = await runCliJson<{
|
||||
errors: string[];
|
||||
plan: {
|
||||
companyAction: string;
|
||||
agentPlans: Array<{ action: string }>;
|
||||
projectPlans: Array<{ action: string }>;
|
||||
issuePlans: Array<{ action: string }>;
|
||||
};
|
||||
}>(
|
||||
[
|
||||
"company",
|
||||
"import",
|
||||
exportDir,
|
||||
"--target",
|
||||
"existing",
|
||||
"--company-id",
|
||||
importedNew.company.id,
|
||||
"--include",
|
||||
"company,agents,projects,issues",
|
||||
"--collision",
|
||||
"rename",
|
||||
"--dry-run",
|
||||
],
|
||||
{ apiBase, configPath },
|
||||
);
|
||||
|
||||
expect(previewExisting.errors).toEqual([]);
|
||||
expect(previewExisting.plan.companyAction).toBe("none");
|
||||
expect(previewExisting.plan.agentPlans.some((plan) => plan.action === "create")).toBe(true);
|
||||
expect(previewExisting.plan.projectPlans.some((plan) => plan.action === "create")).toBe(true);
|
||||
expect(previewExisting.plan.issuePlans.some((plan) => plan.action === "create")).toBe(true);
|
||||
|
||||
const importedExisting = await runCliJson<{
|
||||
company: { id: string; action: string };
|
||||
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||
}>(
|
||||
[
|
||||
"company",
|
||||
"import",
|
||||
exportDir,
|
||||
"--target",
|
||||
"existing",
|
||||
"--company-id",
|
||||
importedNew.company.id,
|
||||
"--include",
|
||||
"company,agents,projects,issues",
|
||||
"--collision",
|
||||
"rename",
|
||||
"--yes",
|
||||
],
|
||||
{ apiBase, configPath },
|
||||
);
|
||||
|
||||
expect(importedExisting.company.action).toBe("unchanged");
|
||||
expect(importedExisting.agents.some((agent) => agent.action === "created")).toBe(true);
|
||||
|
||||
const twiceImportedAgents = await api<Array<{ id: string; name: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/agents`,
|
||||
);
|
||||
const twiceImportedProjects = await api<Array<{ id: string; name: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/projects`,
|
||||
);
|
||||
const twiceImportedIssues = await api<Array<{ id: string; title: string; identifier: string }>>(
|
||||
apiBase,
|
||||
`/api/companies/${importedNew.company.id}/issues`,
|
||||
);
|
||||
|
||||
expect(twiceImportedAgents).toHaveLength(2);
|
||||
expect(new Set(twiceImportedAgents.map((agent) => agent.name)).size).toBe(2);
|
||||
expect(twiceImportedProjects).toHaveLength(2);
|
||||
expect(twiceImportedIssues).toHaveLength(2);
|
||||
|
||||
const zipPath = path.join(tempRoot, "exported-company.zip");
|
||||
const portableFiles: Record<string, string> = {};
|
||||
collectTextFiles(exportDir, exportDir, portableFiles);
|
||||
writeFileSync(zipPath, createStoredZipArchive(portableFiles, "paperclip-demo"));
|
||||
|
||||
const importedFromZip = await runCliJson<{
|
||||
company: { id: string; name: string; action: string };
|
||||
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||
}>(
|
||||
[
|
||||
"company",
|
||||
"import",
|
||||
zipPath,
|
||||
"--target",
|
||||
"new",
|
||||
"--new-company-name",
|
||||
`Zip Imported ${sourceCompany.name}`,
|
||||
"--include",
|
||||
"company,agents,projects,issues",
|
||||
"--yes",
|
||||
],
|
||||
{ apiBase, configPath },
|
||||
);
|
||||
|
||||
expect(importedFromZip.company.action).toBe("created");
|
||||
expect(importedFromZip.agents.some((agent) => agent.action === "created")).toBe(true);
|
||||
}, 60_000);
|
||||
});
|
||||
cli/src/__tests__/company-import-url.test.ts (new file, 74 lines)
@@ -0,0 +1,74 @@
import { describe, expect, it } from "vitest";
import {
  isGithubShorthand,
  looksLikeRepoUrl,
  isHttpUrl,
  normalizeGithubImportSource,
} from "../commands/client/company.js";

describe("isHttpUrl", () => {
  it("matches http URLs", () => {
    expect(isHttpUrl("http://example.com/foo")).toBe(true);
  });

  it("matches https URLs", () => {
    expect(isHttpUrl("https://example.com/foo")).toBe(true);
  });

  it("rejects local paths", () => {
    expect(isHttpUrl("/tmp/my-company")).toBe(false);
    expect(isHttpUrl("./relative")).toBe(false);
  });
});

describe("looksLikeRepoUrl", () => {
  it("matches GitHub URLs", () => {
    expect(looksLikeRepoUrl("https://github.com/org/repo")).toBe(true);
  });

  it("rejects URLs without owner/repo path", () => {
    expect(looksLikeRepoUrl("https://example.com/foo")).toBe(false);
  });

  it("rejects local paths", () => {
    expect(looksLikeRepoUrl("/tmp/my-company")).toBe(false);
  });
});

describe("isGithubShorthand", () => {
  it("matches owner/repo/path shorthands", () => {
    expect(isGithubShorthand("paperclipai/companies/gstack")).toBe(true);
    expect(isGithubShorthand("paperclipai/companies")).toBe(true);
  });

  it("rejects local-looking paths", () => {
    expect(isGithubShorthand("./exports/acme")).toBe(false);
    expect(isGithubShorthand("/tmp/acme")).toBe(false);
    expect(isGithubShorthand("C:\\temp\\acme")).toBe(false);
  });
});

describe("normalizeGithubImportSource", () => {
  it("normalizes shorthand imports to canonical GitHub sources", () => {
    expect(normalizeGithubImportSource("paperclipai/companies/gstack")).toBe(
      "https://github.com/paperclipai/companies?ref=main&path=gstack",
    );
  });

  it("applies --ref to shorthand imports", () => {
    expect(normalizeGithubImportSource("paperclipai/companies/gstack", "feature/demo")).toBe(
      "https://github.com/paperclipai/companies?ref=feature%2Fdemo&path=gstack",
    );
  });

  it("applies --ref to existing GitHub tree URLs without losing the package path", () => {
    expect(
      normalizeGithubImportSource(
        "https://github.com/paperclipai/companies/tree/main/gstack",
        "release/2026-03-23",
      ),
    ).toBe(
      "https://github.com/paperclipai/companies?ref=release%2F2026-03-23&path=gstack",
    );
  });
});
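Note: these tests pin down the normalization behavior. A rough sketch of that behavior follows (illustrative only, with an assumed helper name; the real implementation lives in commands/client/company.ts and may differ):

    function normalizeGithubImportSourceSketch(source: string, ref?: string): string {
      // GitHub tree URLs keep their owner/repo and package path; an explicit --ref overrides the tree ref.
      const tree = source.match(/^https:\/\/github\.com\/([^/]+)\/([^/]+)\/tree\/([^/]+)\/(.+)$/);
      if (tree) {
        const [, owner, repo, treeRef, dir] = tree;
        return `https://github.com/${owner}/${repo}?ref=${encodeURIComponent(ref ?? treeRef)}&path=${dir}`;
      }
      // Shorthands like "owner/repo/path" default to the main branch.
      const [owner, repo, ...rest] = source.split("/");
      const dir = rest.join("/");
      const base = `https://github.com/${owner}/${repo}?ref=${encodeURIComponent(ref ?? "main")}`;
      return dir ? `${base}&path=${dir}` : base;
    }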
cli/src/__tests__/company-import-zip.test.ts (new file, 44 lines)
@@ -0,0 +1,44 @@
import { mkdtemp, rm, writeFile } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { resolveInlineSourceFromPath } from "../commands/client/company.js";
import { createStoredZipArchive } from "./helpers/zip.js";

const tempDirs: string[] = [];

afterEach(async () => {
  for (const dir of tempDirs.splice(0)) {
    await rm(dir, { recursive: true, force: true });
  }
});

describe("resolveInlineSourceFromPath", () => {
  it("imports portable files from a zip archive instead of scanning the parent directory", async () => {
    const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-company-import-zip-"));
    tempDirs.push(tempDir);

    const archivePath = path.join(tempDir, "paperclip-demo.zip");
    const archive = createStoredZipArchive(
      {
        "COMPANY.md": "# Company\n",
        ".paperclip.yaml": "schema: paperclip/v1\n",
        "agents/ceo/AGENT.md": "# CEO\n",
        "notes/todo.txt": "ignore me\n",
      },
      "paperclip-demo",
    );
    await writeFile(archivePath, archive);

    const resolved = await resolveInlineSourceFromPath(archivePath);

    expect(resolved).toEqual({
      rootPath: "paperclip-demo",
      files: {
        "COMPANY.md": "# Company\n",
        ".paperclip.yaml": "schema: paperclip/v1\n",
        "agents/ceo/AGENT.md": "# CEO\n",
      },
    });
  });
});
cli/src/__tests__/company.test.ts (new file, 595 lines)
@@ -0,0 +1,595 @@
|
||||
import { describe, expect, it } from "vitest";
|
||||
import type { CompanyPortabilityPreviewResult } from "@paperclipai/shared";
|
||||
import {
|
||||
buildCompanyDashboardUrl,
|
||||
buildDefaultImportAdapterOverrides,
|
||||
buildDefaultImportSelectionState,
|
||||
buildImportSelectionCatalog,
|
||||
buildSelectedFilesFromImportSelection,
|
||||
renderCompanyImportPreview,
|
||||
renderCompanyImportResult,
|
||||
resolveCompanyImportApplyConfirmationMode,
|
||||
resolveCompanyImportApiPath,
|
||||
} from "../commands/client/company.js";
|
||||
|
||||
describe("resolveCompanyImportApiPath", () => {
|
||||
it("uses company-scoped preview route for existing-company dry runs", () => {
|
||||
expect(
|
||||
resolveCompanyImportApiPath({
|
||||
dryRun: true,
|
||||
targetMode: "existing_company",
|
||||
companyId: "company-123",
|
||||
}),
|
||||
).toBe("/api/companies/company-123/imports/preview");
|
||||
});
|
||||
|
||||
it("uses company-scoped apply route for existing-company imports", () => {
|
||||
expect(
|
||||
resolveCompanyImportApiPath({
|
||||
dryRun: false,
|
||||
targetMode: "existing_company",
|
||||
companyId: "company-123",
|
||||
}),
|
||||
).toBe("/api/companies/company-123/imports/apply");
|
||||
});
|
||||
|
||||
it("keeps global routes for new-company imports", () => {
|
||||
expect(
|
||||
resolveCompanyImportApiPath({
|
||||
dryRun: true,
|
||||
targetMode: "new_company",
|
||||
}),
|
||||
).toBe("/api/companies/import/preview");
|
||||
|
||||
expect(
|
||||
resolveCompanyImportApiPath({
|
||||
dryRun: false,
|
||||
targetMode: "new_company",
|
||||
}),
|
||||
).toBe("/api/companies/import");
|
||||
});
|
||||
|
||||
it("throws when an existing-company import is missing a company id", () => {
|
||||
expect(() =>
|
||||
resolveCompanyImportApiPath({
|
||||
dryRun: true,
|
||||
targetMode: "existing_company",
|
||||
companyId: " ",
|
||||
})
|
||||
).toThrow(/require a companyId/i);
|
||||
});
|
||||
});
|
||||
|
||||
describe("resolveCompanyImportApplyConfirmationMode", () => {
|
||||
it("skips confirmation when --yes is set", () => {
|
||||
expect(
|
||||
resolveCompanyImportApplyConfirmationMode({
|
||||
yes: true,
|
||||
interactive: false,
|
||||
json: false,
|
||||
}),
|
||||
).toBe("skip");
|
||||
});
|
||||
|
||||
it("prompts in interactive text mode when --yes is not set", () => {
|
||||
expect(
|
||||
resolveCompanyImportApplyConfirmationMode({
|
||||
yes: false,
|
||||
interactive: true,
|
||||
json: false,
|
||||
}),
|
||||
).toBe("prompt");
|
||||
});
|
||||
|
||||
it("requires --yes for non-interactive apply", () => {
|
||||
expect(() =>
|
||||
resolveCompanyImportApplyConfirmationMode({
|
||||
yes: false,
|
||||
interactive: false,
|
||||
json: false,
|
||||
})
|
||||
).toThrow(/non-interactive terminal requires --yes/i);
|
||||
});
|
||||
|
||||
it("requires --yes for json apply", () => {
|
||||
expect(() =>
|
||||
resolveCompanyImportApplyConfirmationMode({
|
||||
yes: false,
|
||||
interactive: false,
|
||||
json: true,
|
||||
})
|
||||
).toThrow(/with --json requires --yes/i);
|
||||
});
|
||||
});
|
||||
|
||||
describe("buildCompanyDashboardUrl", () => {
|
||||
it("preserves the configured base path when building a dashboard URL", () => {
|
||||
expect(buildCompanyDashboardUrl("https://paperclip.example/app/", "PAP")).toBe(
|
||||
"https://paperclip.example/app/PAP/dashboard",
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
describe("renderCompanyImportPreview", () => {
|
||||
it("summarizes the preview with counts, selection info, and truncated examples", () => {
|
||||
const preview: CompanyPortabilityPreviewResult = {
|
||||
include: {
|
||||
company: true,
|
||||
agents: true,
|
||||
projects: true,
|
||||
issues: true,
|
||||
skills: true,
|
||||
},
|
||||
targetCompanyId: "company-123",
|
||||
targetCompanyName: "Imported Co",
|
||||
collisionStrategy: "rename",
|
||||
selectedAgentSlugs: ["ceo", "cto", "eng-1", "eng-2", "eng-3", "eng-4", "eng-5"],
|
||||
plan: {
|
||||
companyAction: "update",
|
||||
agentPlans: [
|
||||
{ slug: "ceo", action: "create", plannedName: "CEO", existingAgentId: null, reason: null },
|
||||
{ slug: "cto", action: "update", plannedName: "CTO", existingAgentId: "agent-2", reason: "replace strategy" },
|
||||
{ slug: "eng-1", action: "skip", plannedName: "Engineer 1", existingAgentId: "agent-3", reason: "skip strategy" },
|
||||
{ slug: "eng-2", action: "create", plannedName: "Engineer 2", existingAgentId: null, reason: null },
|
||||
{ slug: "eng-3", action: "create", plannedName: "Engineer 3", existingAgentId: null, reason: null },
|
||||
{ slug: "eng-4", action: "create", plannedName: "Engineer 4", existingAgentId: null, reason: null },
|
||||
{ slug: "eng-5", action: "create", plannedName: "Engineer 5", existingAgentId: null, reason: null },
|
||||
],
|
||||
projectPlans: [
|
||||
{ slug: "alpha", action: "create", plannedName: "Alpha", existingProjectId: null, reason: null },
|
||||
],
|
||||
issuePlans: [
|
||||
{ slug: "kickoff", action: "create", plannedTitle: "Kickoff", reason: null },
|
||||
],
|
||||
},
|
||||
manifest: {
|
||||
schemaVersion: 1,
|
||||
generatedAt: "2026-03-23T17:00:00.000Z",
|
||||
source: {
|
||||
companyId: "company-src",
|
||||
companyName: "Source Co",
|
||||
},
|
||||
includes: {
|
||||
company: true,
|
||||
agents: true,
|
||||
projects: true,
|
||||
issues: true,
|
||||
skills: true,
|
||||
},
|
||||
company: {
|
||||
path: "COMPANY.md",
|
||||
name: "Source Co",
|
||||
description: null,
|
||||
brandColor: null,
|
||||
logoPath: null,
|
||||
requireBoardApprovalForNewAgents: false,
|
||||
feedbackDataSharingEnabled: false,
|
||||
feedbackDataSharingConsentAt: null,
|
||||
feedbackDataSharingConsentByUserId: null,
|
||||
feedbackDataSharingTermsVersion: null,
|
||||
},
|
||||
sidebar: {
|
||||
agents: ["ceo"],
|
||||
projects: ["alpha"],
|
||||
},
|
||||
agents: [
|
||||
{
|
||||
slug: "ceo",
|
||||
name: "CEO",
|
||||
path: "agents/ceo/AGENT.md",
|
||||
skills: [],
|
||||
role: "ceo",
|
||||
title: null,
|
||||
icon: null,
|
||||
capabilities: null,
|
||||
reportsToSlug: null,
|
||||
adapterType: "codex_local",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
budgetMonthlyCents: 0,
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
skills: [
|
||||
{
|
||||
key: "skill-a",
|
||||
slug: "skill-a",
|
||||
name: "Skill A",
|
||||
path: "skills/skill-a/SKILL.md",
|
||||
description: null,
|
||||
sourceType: "inline",
|
||||
sourceLocator: null,
|
||||
sourceRef: null,
|
||||
trustLevel: null,
|
||||
compatibility: null,
|
||||
metadata: null,
|
||||
fileInventory: [],
|
||||
},
|
||||
],
|
||||
projects: [
|
||||
{
|
||||
slug: "alpha",
|
||||
name: "Alpha",
|
||||
path: "projects/alpha/PROJECT.md",
|
||||
description: null,
|
||||
ownerAgentSlug: null,
|
||||
leadAgentSlug: null,
|
||||
targetDate: null,
|
||||
color: null,
|
||||
status: null,
|
||||
executionWorkspacePolicy: null,
|
||||
workspaces: [],
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
issues: [
|
||||
{
|
||||
slug: "kickoff",
|
||||
identifier: null,
|
||||
title: "Kickoff",
|
||||
path: "projects/alpha/issues/kickoff/TASK.md",
|
||||
projectSlug: "alpha",
|
||||
projectWorkspaceKey: null,
|
||||
assigneeAgentSlug: "ceo",
|
||||
description: null,
|
||||
recurring: false,
|
||||
routine: null,
|
||||
legacyRecurrence: null,
|
||||
status: null,
|
||||
priority: null,
|
||||
labelIds: [],
|
||||
billingCode: null,
|
||||
executionWorkspaceSettings: null,
|
||||
assigneeAdapterOverrides: null,
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
envInputs: [
|
||||
{
|
||||
key: "OPENAI_API_KEY",
|
||||
description: null,
|
||||
agentSlug: "ceo",
|
||||
kind: "secret",
|
||||
requirement: "required",
|
||||
defaultValue: null,
|
||||
portability: "portable",
|
||||
},
|
||||
],
|
||||
},
|
||||
files: {
|
||||
"COMPANY.md": "# Source Co",
|
||||
},
|
||||
envInputs: [
|
||||
{
|
||||
key: "OPENAI_API_KEY",
|
||||
description: null,
|
||||
agentSlug: "ceo",
|
||||
kind: "secret",
|
||||
requirement: "required",
|
||||
defaultValue: null,
|
||||
portability: "portable",
|
||||
},
|
||||
],
|
||||
warnings: ["One warning"],
|
||||
errors: ["One error"],
|
||||
};
|
||||
|
||||
const rendered = renderCompanyImportPreview(preview, {
|
||||
sourceLabel: "GitHub: https://github.com/paperclipai/companies/demo",
|
||||
targetLabel: "Imported Co (company-123)",
|
||||
infoMessages: ["Using claude-local adapter"],
|
||||
});
|
||||
|
||||
expect(rendered).toContain("Include");
|
||||
expect(rendered).toContain("company, projects, tasks, agents, skills");
|
||||
expect(rendered).toContain("7 agents total");
|
||||
expect(rendered).toContain("1 project total");
|
||||
expect(rendered).toContain("1 task total");
|
||||
expect(rendered).toContain("skills: 1 skill packaged");
|
||||
expect(rendered).toContain("+1 more");
|
||||
expect(rendered).toContain("Using claude-local adapter");
|
||||
expect(rendered).toContain("Warnings");
|
||||
expect(rendered).toContain("Errors");
|
||||
});
|
||||
});
|
||||
|
||||
describe("renderCompanyImportResult", () => {
|
||||
it("summarizes import results with created, updated, and skipped counts", () => {
|
||||
const rendered = renderCompanyImportResult(
|
||||
{
|
||||
company: {
|
||||
id: "company-123",
|
||||
name: "Imported Co",
|
||||
action: "updated",
|
||||
},
|
||||
agents: [
|
||||
{ slug: "ceo", id: "agent-1", action: "created", name: "CEO", reason: null },
|
||||
{ slug: "cto", id: "agent-2", action: "updated", name: "CTO", reason: "replace strategy" },
|
||||
{ slug: "ops", id: null, action: "skipped", name: "Ops", reason: "skip strategy" },
|
||||
],
|
||||
projects: [
|
||||
{ slug: "app", id: "project-1", action: "created", name: "App", reason: null },
|
||||
{ slug: "ops", id: "project-2", action: "updated", name: "Operations", reason: "replace strategy" },
|
||||
{ slug: "archive", id: null, action: "skipped", name: "Archive", reason: "skip strategy" },
|
||||
],
|
||||
envInputs: [],
|
||||
warnings: ["Review API keys"],
|
||||
},
|
||||
{
|
||||
targetLabel: "Imported Co (company-123)",
|
||||
companyUrl: "https://paperclip.example/PAP/dashboard",
|
||||
infoMessages: ["Using claude-local adapter"],
|
||||
},
|
||||
);
|
||||
|
||||
expect(rendered).toContain("Company");
|
||||
expect(rendered).toContain("https://paperclip.example/PAP/dashboard");
|
||||
expect(rendered).toContain("3 agents total (1 created, 1 updated, 1 skipped)");
|
||||
expect(rendered).toContain("3 projects total (1 created, 1 updated, 1 skipped)");
|
||||
expect(rendered).toContain("Agent results");
|
||||
expect(rendered).toContain("Project results");
|
||||
expect(rendered).toContain("Using claude-local adapter");
|
||||
expect(rendered).toContain("Review API keys");
|
||||
});
|
||||
});
|
||||
|
||||
describe("import selection catalog", () => {
|
||||
it("defaults to everything and keeps project selection separate from task selection", () => {
|
||||
const preview: CompanyPortabilityPreviewResult = {
|
||||
include: {
|
||||
company: true,
|
||||
agents: true,
|
||||
projects: true,
|
||||
issues: true,
|
||||
skills: true,
|
||||
},
|
||||
targetCompanyId: "company-123",
|
||||
targetCompanyName: "Imported Co",
|
||||
collisionStrategy: "rename",
|
||||
selectedAgentSlugs: ["ceo"],
|
||||
plan: {
|
||||
companyAction: "create",
|
||||
agentPlans: [],
|
||||
projectPlans: [],
|
||||
issuePlans: [],
|
||||
},
|
||||
manifest: {
|
||||
schemaVersion: 1,
|
||||
generatedAt: "2026-03-23T18:00:00.000Z",
|
||||
source: {
|
||||
companyId: "company-src",
|
||||
companyName: "Source Co",
|
||||
},
|
||||
includes: {
|
||||
company: true,
|
||||
agents: true,
|
||||
projects: true,
|
||||
issues: true,
|
||||
skills: true,
|
||||
},
|
||||
company: {
|
||||
path: "COMPANY.md",
|
||||
name: "Source Co",
|
||||
description: null,
|
||||
brandColor: null,
|
||||
logoPath: "images/company-logo.png",
|
||||
requireBoardApprovalForNewAgents: false,
|
||||
feedbackDataSharingEnabled: false,
|
||||
feedbackDataSharingConsentAt: null,
|
||||
feedbackDataSharingConsentByUserId: null,
|
||||
feedbackDataSharingTermsVersion: null,
|
||||
},
|
||||
sidebar: {
|
||||
agents: ["ceo"],
|
||||
projects: ["alpha"],
|
||||
},
|
||||
agents: [
|
||||
{
|
||||
slug: "ceo",
|
||||
name: "CEO",
|
||||
path: "agents/ceo/AGENT.md",
|
||||
skills: [],
|
||||
role: "ceo",
|
||||
title: null,
|
||||
icon: null,
|
||||
capabilities: null,
|
||||
reportsToSlug: null,
|
||||
adapterType: "codex_local",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
budgetMonthlyCents: 0,
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
skills: [
|
||||
{
|
||||
key: "skill-a",
|
||||
slug: "skill-a",
|
||||
name: "Skill A",
|
||||
path: "skills/skill-a/SKILL.md",
|
||||
description: null,
|
||||
sourceType: "inline",
|
||||
sourceLocator: null,
|
||||
sourceRef: null,
|
||||
trustLevel: null,
|
||||
compatibility: null,
|
||||
metadata: null,
|
||||
fileInventory: [{ path: "skills/skill-a/helper.md", kind: "doc" }],
|
||||
},
|
||||
],
|
||||
projects: [
|
||||
{
|
||||
slug: "alpha",
|
||||
name: "Alpha",
|
||||
path: "projects/alpha/PROJECT.md",
|
||||
description: null,
|
||||
ownerAgentSlug: null,
|
||||
leadAgentSlug: null,
|
||||
targetDate: null,
|
||||
color: null,
|
||||
status: null,
|
||||
executionWorkspacePolicy: null,
|
||||
workspaces: [],
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
issues: [
|
||||
{
|
||||
slug: "kickoff",
|
||||
identifier: null,
|
||||
title: "Kickoff",
|
||||
path: "projects/alpha/issues/kickoff/TASK.md",
|
||||
projectSlug: "alpha",
|
||||
projectWorkspaceKey: null,
|
||||
assigneeAgentSlug: "ceo",
|
||||
description: null,
|
||||
recurring: false,
|
||||
routine: null,
|
||||
legacyRecurrence: null,
|
||||
status: null,
|
||||
priority: null,
|
||||
labelIds: [],
|
||||
billingCode: null,
|
||||
executionWorkspaceSettings: null,
|
||||
assigneeAdapterOverrides: null,
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
envInputs: [],
|
||||
},
|
||||
files: {
|
||||
"COMPANY.md": "# Source Co",
|
||||
"README.md": "# Readme",
|
||||
".paperclip.yaml": "schema: paperclip/v1\n",
|
||||
"images/company-logo.png": {
|
||||
encoding: "base64",
|
||||
data: "",
|
||||
contentType: "image/png",
|
||||
},
|
||||
"projects/alpha/PROJECT.md": "# Alpha",
|
||||
"projects/alpha/notes.md": "project notes",
|
||||
"projects/alpha/issues/kickoff/TASK.md": "# Kickoff",
|
||||
"projects/alpha/issues/kickoff/details.md": "task details",
|
||||
"agents/ceo/AGENT.md": "# CEO",
|
||||
"agents/ceo/prompt.md": "prompt",
|
||||
"skills/skill-a/SKILL.md": "# Skill A",
|
||||
"skills/skill-a/helper.md": "helper",
|
||||
},
|
||||
envInputs: [],
|
||||
warnings: [],
|
||||
errors: [],
|
||||
};
|
||||
|
||||
const catalog = buildImportSelectionCatalog(preview);
|
||||
const state = buildDefaultImportSelectionState(catalog);
|
||||
|
||||
expect(state.company).toBe(true);
|
||||
expect(state.projects.has("alpha")).toBe(true);
|
||||
expect(state.issues.has("kickoff")).toBe(true);
|
||||
expect(state.agents.has("ceo")).toBe(true);
|
||||
expect(state.skills.has("skill-a")).toBe(true);
|
||||
|
||||
state.company = false;
|
||||
state.issues.clear();
|
||||
state.agents.clear();
|
||||
state.skills.clear();
|
||||
|
||||
const selectedFiles = buildSelectedFilesFromImportSelection(catalog, state);
|
||||
|
||||
expect(selectedFiles).toContain(".paperclip.yaml");
|
||||
expect(selectedFiles).toContain("projects/alpha/PROJECT.md");
|
||||
expect(selectedFiles).toContain("projects/alpha/notes.md");
|
||||
expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/TASK.md");
|
||||
expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/details.md");
|
||||
});
|
||||
});
|
||||
|
||||
describe("default adapter overrides", () => {
|
||||
it("maps process-only imported agents to claude_local", () => {
|
||||
const preview: CompanyPortabilityPreviewResult = {
|
||||
include: {
|
||||
company: false,
|
||||
agents: true,
|
||||
projects: false,
|
||||
issues: false,
|
||||
skills: false,
|
||||
},
|
||||
targetCompanyId: null,
|
||||
targetCompanyName: null,
|
||||
collisionStrategy: "rename",
|
||||
selectedAgentSlugs: ["legacy-agent", "explicit-agent"],
|
||||
plan: {
|
||||
companyAction: "none",
|
||||
agentPlans: [],
|
||||
projectPlans: [],
|
||||
issuePlans: [],
|
||||
},
|
||||
manifest: {
|
||||
schemaVersion: 1,
|
||||
generatedAt: "2026-03-23T18:20:00.000Z",
|
||||
source: null,
|
||||
includes: {
|
||||
company: false,
|
||||
agents: true,
|
||||
projects: false,
|
||||
issues: false,
|
||||
skills: false,
|
||||
},
|
||||
company: null,
|
||||
sidebar: null,
|
||||
agents: [
|
||||
{
|
||||
slug: "legacy-agent",
|
||||
name: "Legacy Agent",
|
||||
path: "agents/legacy-agent/AGENT.md",
|
||||
skills: [],
|
||||
role: "agent",
|
||||
title: null,
|
||||
icon: null,
|
||||
capabilities: null,
|
||||
reportsToSlug: null,
|
||||
adapterType: "process",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
budgetMonthlyCents: 0,
|
||||
metadata: null,
|
||||
},
|
||||
{
|
||||
slug: "explicit-agent",
|
||||
name: "Explicit Agent",
|
||||
path: "agents/explicit-agent/AGENT.md",
|
||||
skills: [],
|
||||
role: "agent",
|
||||
title: null,
|
||||
icon: null,
|
||||
capabilities: null,
|
||||
reportsToSlug: null,
|
||||
adapterType: "codex_local",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
budgetMonthlyCents: 0,
|
||||
metadata: null,
|
||||
},
|
||||
],
|
||||
skills: [],
|
||||
projects: [],
|
||||
issues: [],
|
||||
envInputs: [],
|
||||
},
|
||||
files: {},
|
||||
envInputs: [],
|
||||
warnings: [],
|
||||
errors: [],
|
||||
};
|
||||
|
||||
expect(buildDefaultImportAdapterOverrides(preview)).toEqual({
|
||||
"legacy-agent": {
|
||||
adapterType: "claude_local",
|
||||
},
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -46,6 +46,9 @@ function createTempConfig(): string {
      baseUrlMode: "auto",
      disableSignUp: false,
    },
    telemetry: {
      enabled: true,
    },
    storage: {
      provider: "local_disk",
      localDisk: {
cli/src/__tests__/feedback.test.ts (new file, 177 lines)
@@ -0,0 +1,177 @@
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { mkdtemp, readFile } from "node:fs/promises";
|
||||
import { Command } from "commander";
|
||||
import { describe, expect, it } from "vitest";
|
||||
import type { FeedbackTrace } from "@paperclipai/shared";
|
||||
import { readZipArchive } from "../commands/client/zip.js";
|
||||
import {
|
||||
buildFeedbackTraceQuery,
|
||||
registerFeedbackCommands,
|
||||
renderFeedbackReport,
|
||||
summarizeFeedbackTraces,
|
||||
writeFeedbackExportBundle,
|
||||
} from "../commands/client/feedback.js";
|
||||
|
||||
function makeTrace(overrides: Partial<FeedbackTrace> = {}): FeedbackTrace {
|
||||
return {
|
||||
id: "trace-12345678",
|
||||
companyId: "company-123",
|
||||
feedbackVoteId: "vote-12345678",
|
||||
issueId: "issue-123",
|
||||
projectId: "project-123",
|
||||
issueIdentifier: "PAP-123",
|
||||
issueTitle: "Fix the feedback command",
|
||||
authorUserId: "user-123",
|
||||
targetType: "issue_comment",
|
||||
targetId: "comment-123",
|
||||
vote: "down",
|
||||
status: "pending",
|
||||
destination: "paperclip_labs_feedback_v1",
|
||||
exportId: null,
|
||||
consentVersion: "feedback-data-sharing-v1",
|
||||
schemaVersion: "1",
|
||||
bundleVersion: "1",
|
||||
payloadVersion: "1",
|
||||
payloadDigest: null,
|
||||
payloadSnapshot: {
|
||||
vote: {
|
||||
value: "down",
|
||||
reason: "Needed more detail",
|
||||
},
|
||||
},
|
||||
targetSummary: {
|
||||
label: "Comment",
|
||||
excerpt: "The first answer was too vague.",
|
||||
authorAgentId: "agent-123",
|
||||
authorUserId: null,
|
||||
createdAt: new Date("2026-03-31T12:00:00.000Z"),
|
||||
documentKey: null,
|
||||
documentTitle: null,
|
||||
revisionNumber: null,
|
||||
},
|
||||
redactionSummary: null,
|
||||
attemptCount: 0,
|
||||
lastAttemptedAt: null,
|
||||
exportedAt: null,
|
||||
failureReason: null,
|
||||
createdAt: new Date("2026-03-31T12:01:00.000Z"),
|
||||
updatedAt: new Date("2026-03-31T12:02:00.000Z"),
|
||||
...overrides,
|
||||
};
|
||||
}
|
||||
|
||||
describe("registerFeedbackCommands", () => {
|
||||
it("registers the top-level feedback commands", () => {
|
||||
const program = new Command();
|
||||
|
||||
expect(() => registerFeedbackCommands(program)).not.toThrow();
|
||||
|
||||
const feedback = program.commands.find((command) => command.name() === "feedback");
|
||||
expect(feedback).toBeDefined();
|
||||
expect(feedback?.commands.map((command) => command.name())).toEqual(["report", "export"]);
|
||||
expect(feedback?.commands[0]?.options.filter((option) => option.long === "--company-id")).toHaveLength(1);
|
||||
});
|
||||
});
|
||||
|
||||
describe("buildFeedbackTraceQuery", () => {
|
||||
it("encodes all supported filters", () => {
|
||||
expect(
|
||||
buildFeedbackTraceQuery({
|
||||
targetType: "issue_comment",
|
||||
vote: "down",
|
||||
status: "pending",
|
||||
projectId: "project-123",
|
||||
issueId: "issue-123",
|
||||
from: "2026-03-31T00:00:00.000Z",
|
||||
to: "2026-03-31T23:59:59.999Z",
|
||||
sharedOnly: true,
|
||||
}),
|
||||
).toBe(
|
||||
"?targetType=issue_comment&vote=down&status=pending&projectId=project-123&issueId=issue-123&from=2026-03-31T00%3A00%3A00.000Z&to=2026-03-31T23%3A59%3A59.999Z&sharedOnly=true&includePayload=true",
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
describe("renderFeedbackReport", () => {
|
||||
it("includes summary counts and the optional reason", () => {
|
||||
const traces = [
|
||||
makeTrace(),
|
||||
makeTrace({
|
||||
id: "trace-87654321",
|
||||
feedbackVoteId: "vote-87654321",
|
||||
vote: "up",
|
||||
status: "local_only",
|
||||
payloadSnapshot: {
|
||||
vote: {
|
||||
value: "up",
|
||||
reason: null,
|
||||
},
|
||||
},
|
||||
}),
|
||||
];
|
||||
|
||||
const report = renderFeedbackReport({
|
||||
apiBase: "http://127.0.0.1:3100",
|
||||
companyId: "company-123",
|
||||
traces,
|
||||
summary: summarizeFeedbackTraces(traces),
|
||||
includePayloads: false,
|
||||
});
|
||||
|
||||
expect(report).toContain("Paperclip Feedback Report");
|
||||
expect(report).toContain("thumbs up");
|
||||
expect(report).toContain("thumbs down");
|
||||
expect(report).toContain("Needed more detail");
|
||||
});
|
||||
});
|
||||
|
||||
describe("writeFeedbackExportBundle", () => {
|
||||
it("writes votes, traces, a manifest, and a zip archive", async () => {
|
||||
const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-feedback-export-"));
|
||||
const outputDir = path.join(tempDir, "feedback-export");
|
||||
const traces = [
|
||||
makeTrace(),
|
||||
makeTrace({
|
||||
id: "trace-abcdef12",
|
||||
feedbackVoteId: "vote-abcdef12",
|
||||
issueIdentifier: "PAP-124",
|
||||
issueId: "issue-124",
|
||||
vote: "up",
|
||||
status: "local_only",
|
||||
payloadSnapshot: {
|
||||
vote: {
|
||||
value: "up",
|
||||
reason: null,
|
||||
},
|
||||
},
|
||||
}),
|
||||
];
|
||||
|
||||
const exported = await writeFeedbackExportBundle({
|
||||
apiBase: "http://127.0.0.1:3100",
|
||||
companyId: "company-123",
|
||||
traces,
|
||||
outputDir,
|
||||
});
|
||||
|
||||
expect(exported.manifest.summary.total).toBe(2);
|
||||
expect(exported.manifest.summary.withReason).toBe(1);
|
||||
|
||||
const manifest = JSON.parse(await readFile(path.join(outputDir, "index.json"), "utf8")) as {
|
||||
files: { votes: string[]; traces: string[]; zip: string };
|
||||
};
|
||||
expect(manifest.files.votes).toHaveLength(2);
|
||||
expect(manifest.files.traces).toHaveLength(2);
|
||||
|
||||
const archive = await readFile(exported.zipPath);
|
||||
const zip = await readZipArchive(archive);
|
||||
expect(Object.keys(zip.files)).toEqual(
|
||||
expect.arrayContaining([
|
||||
"index.json",
|
||||
`votes/${manifest.files.votes[0]}`,
|
||||
`traces/${manifest.files.traces[0]}`,
|
||||
]),
|
||||
);
|
||||
});
|
||||
});
|
||||
cli/src/__tests__/helpers/embedded-postgres.ts (new file, 6 lines)
@@ -0,0 +1,6 @@
export {
  getEmbeddedPostgresTestSupport,
  startEmbeddedPostgresTestDatabase,
  type EmbeddedPostgresTestDatabase,
  type EmbeddedPostgresTestSupport,
} from "@paperclipai/db";
cli/src/__tests__/helpers/zip.ts (new file, 87 lines)
@@ -0,0 +1,87 @@
function writeUint16(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
}

function writeUint32(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
  target[offset + 2] = (value >>> 16) & 0xff;
  target[offset + 3] = (value >>> 24) & 0xff;
}

// Standard CRC-32 (reflected polynomial 0xEDB88320), as the ZIP format requires.
function crc32(bytes: Uint8Array) {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

// Builds a minimal ZIP archive whose entries are STORED (uncompressed); fields
// not written below stay at their zero defaults, which is enough for tests.
export function createStoredZipArchive(files: Record<string, string>, rootPath: string) {
  const encoder = new TextEncoder();
  const localChunks: Uint8Array[] = [];
  const centralChunks: Uint8Array[] = [];
  let localOffset = 0;
  let entryCount = 0;

  for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
    const fileName = encoder.encode(`${rootPath}/${relativePath}`);
    const body = encoder.encode(content);
    const checksum = crc32(body);

    // Local file header: signature, version needed (2.0), UTF-8 name flag, stored method.
    const localHeader = new Uint8Array(30 + fileName.length);
    writeUint32(localHeader, 0, 0x04034b50);
    writeUint16(localHeader, 4, 20);
    writeUint16(localHeader, 6, 0x0800);
    writeUint16(localHeader, 8, 0);
    writeUint32(localHeader, 14, checksum);
    writeUint32(localHeader, 18, body.length);
    writeUint32(localHeader, 22, body.length);
    writeUint16(localHeader, 26, fileName.length);
    localHeader.set(fileName, 30);

    // Central directory entry pointing back at the local header offset.
    const centralHeader = new Uint8Array(46 + fileName.length);
    writeUint32(centralHeader, 0, 0x02014b50);
    writeUint16(centralHeader, 4, 20);
    writeUint16(centralHeader, 6, 20);
    writeUint16(centralHeader, 8, 0x0800);
    writeUint16(centralHeader, 10, 0);
    writeUint32(centralHeader, 16, checksum);
    writeUint32(centralHeader, 20, body.length);
    writeUint32(centralHeader, 24, body.length);
    writeUint16(centralHeader, 28, fileName.length);
    writeUint32(centralHeader, 42, localOffset);
    centralHeader.set(fileName, 46);

    localChunks.push(localHeader, body);
    centralChunks.push(centralHeader);
    localOffset += localHeader.length + body.length;
    entryCount += 1;
  }

  const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
  const archive = new Uint8Array(
    localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
  );
  let offset = 0;
  for (const chunk of localChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  const centralDirectoryOffset = offset;
  for (const chunk of centralChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  // End of central directory record: entry counts, directory size, and directory offset.
  writeUint32(archive, offset, 0x06054b50);
  writeUint16(archive, offset + 8, entryCount);
  writeUint16(archive, offset + 10, entryCount);
  writeUint32(archive, offset + 12, centralDirectoryLength);
  writeUint32(archive, offset + 16, centralDirectoryOffset);

  return archive;
}
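Note: the company import/export e2e suite above uses this helper to round-trip an exported package through a zip file. A minimal usage sketch (the temp path and file contents are illustrative):

    import { writeFileSync } from "node:fs";
    import { createStoredZipArchive } from "./helpers/zip.js";

    // Pack exported text files under a single root folder inside the archive,
    // then point `paperclipai company import <zip>` at the resulting file.
    const files = { "COMPANY.md": "# Demo Co\n", ".paperclip.yaml": 'schema: "paperclip/v1"\n' };
    writeFileSync("/tmp/paperclip-demo.zip", createStoredZipArchive(files, "paperclip-demo"));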
@@ -1,5 +1,5 @@
 import { afterEach, describe, expect, it, vi } from "vitest";
-import { ApiRequestError, PaperclipApiClient } from "../client/http.js";
+import { ApiConnectionError, ApiRequestError, PaperclipApiClient } from "../client/http.js";

describe("PaperclipApiClient", () => {
  afterEach(() => {
@@ -58,4 +58,49 @@ describe("PaperclipApiClient", () => {
      details: { issueId: "1" },
    } satisfies Partial<ApiRequestError>);
  });

  it("throws ApiConnectionError with recovery guidance when fetch fails", async () => {
    const fetchMock = vi.fn().mockRejectedValue(new TypeError("fetch failed"));
    vi.stubGlobal("fetch", fetchMock);

    const client = new PaperclipApiClient({ apiBase: "http://localhost:3100" });

    await expect(client.post("/api/companies/import/preview", {})).rejects.toBeInstanceOf(ApiConnectionError);
    await expect(client.post("/api/companies/import/preview", {})).rejects.toMatchObject({
      url: "http://localhost:3100/api/companies/import/preview",
      method: "POST",
      causeMessage: "fetch failed",
    } satisfies Partial<ApiConnectionError>);
    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
      /Could not reach the Paperclip API\./,
    );
    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
      /curl http:\/\/localhost:3100\/api\/health/,
    );
    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
      /pnpm dev|pnpm paperclipai run/,
    );
  });

  it("retries once after interactive auth recovery", async () => {
    const fetchMock = vi
      .fn()
      .mockResolvedValueOnce(new Response(JSON.stringify({ error: "Board access required" }), { status: 403 }))
      .mockResolvedValueOnce(new Response(JSON.stringify({ ok: true }), { status: 200 }));
    vi.stubGlobal("fetch", fetchMock);

    const recoverAuth = vi.fn().mockResolvedValue("board-token-123");
    const client = new PaperclipApiClient({
      apiBase: "http://localhost:3100",
      recoverAuth,
    });

    const result = await client.post<{ ok: boolean }>("/api/test", { hello: "world" });

    expect(result).toEqual({ ok: true });
    expect(recoverAuth).toHaveBeenCalledOnce();
    expect(fetchMock).toHaveBeenCalledTimes(2);
    const retryHeaders = fetchMock.mock.calls[1]?.[1]?.headers as Record<string, string>;
    expect(retryHeaders.authorization).toBe("Bearer board-token-123");
  });
});
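Note: the assertions above imply the rough shape of ApiConnectionError. An assumed sketch follows (the real class lives in cli/src/client/http.ts and its message format may differ):

    class ApiConnectionErrorSketch extends Error {
      constructor(
        readonly url: string,
        readonly method: string,
        readonly causeMessage: string,
      ) {
        super(
          `Could not reach the Paperclip API. ${method} ${url} failed (${causeMessage}). ` +
            `Check the server with: curl ${new URL("/api/health", url)} ` +
            `or start it with pnpm dev / pnpm paperclipai run.`,
        );
      }
    }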
cli/src/__tests__/onboard.test.ts (new file, 108 lines)
@@ -0,0 +1,108 @@
|
||||
import fs from "node:fs";
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { afterEach, beforeEach, describe, expect, it } from "vitest";
|
||||
import { onboard } from "../commands/onboard.js";
|
||||
import type { PaperclipConfig } from "../config/schema.js";
|
||||
|
||||
const ORIGINAL_ENV = { ...process.env };
|
||||
|
||||
function createExistingConfigFixture() {
|
||||
const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-onboard-"));
|
||||
const runtimeRoot = path.join(root, "runtime");
|
||||
const configPath = path.join(root, ".paperclip", "config.json");
|
||||
const config: PaperclipConfig = {
|
||||
$meta: {
|
||||
version: 1,
|
||||
updatedAt: "2026-03-29T00:00:00.000Z",
|
||||
source: "configure",
|
||||
},
|
||||
database: {
|
||||
mode: "embedded-postgres",
|
||||
embeddedPostgresDataDir: path.join(runtimeRoot, "db"),
|
||||
embeddedPostgresPort: 54329,
|
||||
backup: {
|
||||
enabled: true,
|
||||
intervalMinutes: 60,
|
||||
retentionDays: 30,
|
||||
dir: path.join(runtimeRoot, "backups"),
|
||||
},
|
||||
},
|
||||
logging: {
|
||||
mode: "file",
|
||||
logDir: path.join(runtimeRoot, "logs"),
|
||||
},
|
||||
server: {
|
||||
deploymentMode: "local_trusted",
|
||||
exposure: "private",
|
||||
host: "127.0.0.1",
|
||||
port: 3100,
|
||||
allowedHostnames: [],
|
||||
serveUi: true,
|
||||
},
|
||||
auth: {
|
||||
baseUrlMode: "auto",
|
||||
disableSignUp: false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled: true,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: {
|
||||
baseDir: path.join(runtimeRoot, "storage"),
|
||||
},
|
||||
s3: {
|
||||
bucket: "paperclip",
|
||||
region: "us-east-1",
|
||||
prefix: "",
|
||||
forcePathStyle: false,
|
||||
},
|
||||
},
|
||||
secrets: {
|
||||
provider: "local_encrypted",
|
||||
strictMode: false,
|
||||
localEncrypted: {
|
||||
keyFilePath: path.join(runtimeRoot, "secrets", "master.key"),
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
fs.mkdirSync(path.dirname(configPath), { recursive: true });
|
||||
fs.writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, { mode: 0o600 });
|
||||
|
||||
return { configPath, configText: fs.readFileSync(configPath, "utf8") };
|
||||
}
|
||||
|
||||
describe("onboard", () => {
|
||||
beforeEach(() => {
|
||||
process.env = { ...ORIGINAL_ENV };
|
||||
delete process.env.PAPERCLIP_AGENT_JWT_SECRET;
|
||||
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY;
|
||||
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE;
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
process.env = { ...ORIGINAL_ENV };
|
||||
});
|
||||
|
||||
it("preserves an existing config when rerun without flags", async () => {
|
||||
const fixture = createExistingConfigFixture();
|
||||
|
||||
await onboard({ config: fixture.configPath });
|
||||
|
||||
expect(fs.readFileSync(fixture.configPath, "utf8")).toBe(fixture.configText);
|
||||
expect(fs.existsSync(`${fixture.configPath}.backup`)).toBe(false);
|
||||
expect(fs.existsSync(path.join(path.dirname(fixture.configPath), ".env"))).toBe(true);
|
||||
});
|
||||
|
||||
it("preserves an existing config when rerun with --yes", async () => {
|
||||
const fixture = createExistingConfigFixture();
|
||||
|
||||
await onboard({ config: fixture.configPath, yes: true, invokedByRun: true });
|
||||
|
||||
expect(fs.readFileSync(fixture.configPath, "utf8")).toBe(fixture.configText);
|
||||
expect(fs.existsSync(`${fixture.configPath}.backup`)).toBe(false);
|
||||
expect(fs.existsSync(path.join(path.dirname(fixture.configPath), ".env"))).toBe(true);
|
||||
});
|
||||
});
|
||||
cli/src/__tests__/routines.test.ts (new file, 249 lines)
@@ -0,0 +1,249 @@
|
||||
import { randomUUID } from "node:crypto";
|
||||
import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs";
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
|
||||
import { eq } from "drizzle-orm";
|
||||
import {
|
||||
agents,
|
||||
companies,
|
||||
createDb,
|
||||
projects,
|
||||
routines,
|
||||
} from "@paperclipai/db";
|
||||
import {
|
||||
getEmbeddedPostgresTestSupport,
|
||||
startEmbeddedPostgresTestDatabase,
|
||||
} from "./helpers/embedded-postgres.js";
|
||||
import { disableAllRoutinesInConfig } from "../commands/routines.js";
|
||||
|
||||
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
|
||||
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
|
||||
|
||||
if (!embeddedPostgresSupport.supported) {
|
||||
console.warn(
|
||||
`Skipping embedded Postgres routines CLI tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
|
||||
);
|
||||
}
|
||||
|
||||
function writeTestConfig(configPath: string, tempRoot: string, connectionString: string) {
|
||||
const config = {
|
||||
$meta: {
|
||||
version: 1,
|
||||
updatedAt: new Date().toISOString(),
|
||||
source: "doctor" as const,
|
||||
},
|
||||
database: {
|
||||
mode: "postgres" as const,
|
||||
connectionString,
|
||||
embeddedPostgresDataDir: path.join(tempRoot, "embedded-db"),
|
||||
embeddedPostgresPort: 54329,
|
||||
backup: {
|
||||
enabled: false,
|
||||
intervalMinutes: 60,
|
||||
retentionDays: 30,
|
||||
dir: path.join(tempRoot, "backups"),
|
||||
},
|
||||
},
|
||||
logging: {
|
||||
mode: "file" as const,
|
||||
logDir: path.join(tempRoot, "logs"),
|
||||
},
|
||||
server: {
|
||||
deploymentMode: "local_trusted" as const,
|
||||
exposure: "private" as const,
|
||||
host: "127.0.0.1",
|
||||
port: 3100,
|
||||
allowedHostnames: [],
|
||||
serveUi: false,
|
||||
},
|
||||
auth: {
|
||||
baseUrlMode: "auto" as const,
|
||||
disableSignUp: false,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk" as const,
|
||||
localDisk: {
|
||||
baseDir: path.join(tempRoot, "storage"),
|
||||
},
|
||||
s3: {
|
||||
bucket: "paperclip",
|
||||
region: "us-east-1",
|
||||
prefix: "",
|
||||
forcePathStyle: false,
|
||||
},
|
||||
},
|
||||
secrets: {
|
||||
provider: "local_encrypted" as const,
|
||||
strictMode: false,
|
||||
localEncrypted: {
|
||||
keyFilePath: path.join(tempRoot, "secrets", "master.key"),
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
mkdirSync(path.dirname(configPath), { recursive: true });
|
||||
writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
|
||||
}
|
||||
|
||||
describeEmbeddedPostgres("disableAllRoutinesInConfig", () => {
|
||||
let db!: ReturnType<typeof createDb>;
|
||||
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
|
||||
let tempRoot = "";
|
||||
let configPath = "";
|
||||
|
||||
beforeAll(async () => {
|
||||
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-routines-cli-db-");
|
||||
db = createDb(tempDb.connectionString);
|
||||
tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-routines-cli-config-"));
|
||||
configPath = path.join(tempRoot, "config.json");
|
||||
writeTestConfig(configPath, tempRoot, tempDb.connectionString);
|
||||
}, 20_000);
|
||||
|
||||
afterEach(async () => {
|
||||
await db.delete(routines);
|
||||
await db.delete(projects);
|
||||
await db.delete(agents);
|
||||
await db.delete(companies);
|
||||
});
|
||||
|
||||
afterAll(async () => {
|
||||
await tempDb?.cleanup();
|
||||
if (tempRoot) {
|
||||
rmSync(tempRoot, { recursive: true, force: true });
|
||||
}
|
||||
});
|
||||
|
||||
it("pauses only non-archived routines for the selected company", async () => {
|
||||
const companyId = randomUUID();
|
||||
const otherCompanyId = randomUUID();
|
||||
const projectId = randomUUID();
|
||||
const otherProjectId = randomUUID();
|
||||
const agentId = randomUUID();
|
||||
const otherAgentId = randomUUID();
|
||||
const activeRoutineId = randomUUID();
|
||||
const pausedRoutineId = randomUUID();
|
||||
const archivedRoutineId = randomUUID();
|
||||
const otherCompanyRoutineId = randomUUID();
|
||||
|
||||
await db.insert(companies).values([
|
||||
{
|
||||
id: companyId,
|
||||
name: "Paperclip",
|
||||
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
|
||||
requireBoardApprovalForNewAgents: false,
|
||||
},
|
||||
{
|
||||
id: otherCompanyId,
|
||||
name: "Other company",
|
||||
issuePrefix: `T${otherCompanyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
|
||||
requireBoardApprovalForNewAgents: false,
|
||||
},
|
||||
]);
|
||||
|
||||
await db.insert(agents).values([
|
||||
{
|
||||
id: agentId,
|
||||
companyId,
|
||||
name: "Coder",
|
||||
adapterType: "process",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
},
|
||||
{
|
||||
id: otherAgentId,
|
||||
companyId: otherCompanyId,
|
||||
name: "Other coder",
|
||||
adapterType: "process",
|
||||
adapterConfig: {},
|
||||
runtimeConfig: {},
|
||||
permissions: {},
|
||||
},
|
||||
]);
|
||||
|
||||
await db.insert(projects).values([
|
||||
{
|
||||
id: projectId,
|
||||
companyId,
|
||||
name: "Project",
|
||||
status: "in_progress",
|
||||
},
|
||||
{
|
||||
id: otherProjectId,
|
||||
companyId: otherCompanyId,
|
||||
name: "Other project",
|
||||
status: "in_progress",
|
||||
},
|
||||
]);
|
||||
|
||||
await db.insert(routines).values([
|
||||
{
|
||||
id: activeRoutineId,
|
||||
companyId,
|
||||
projectId,
|
||||
assigneeAgentId: agentId,
|
||||
title: "Active routine",
|
||||
status: "active",
|
||||
},
|
||||
{
|
||||
id: pausedRoutineId,
|
||||
companyId,
|
||||
projectId,
|
||||
assigneeAgentId: agentId,
|
||||
title: "Paused routine",
|
||||
status: "paused",
|
||||
},
|
||||
{
|
||||
id: archivedRoutineId,
|
||||
companyId,
|
||||
projectId,
|
||||
assigneeAgentId: agentId,
|
||||
title: "Archived routine",
|
||||
status: "archived",
|
||||
},
|
||||
{
|
||||
id: otherCompanyRoutineId,
|
||||
companyId: otherCompanyId,
|
||||
projectId: otherProjectId,
|
||||
assigneeAgentId: otherAgentId,
|
||||
title: "Other company routine",
|
||||
status: "active",
|
||||
},
|
||||
]);
|
||||
|
||||
const result = await disableAllRoutinesInConfig({
|
||||
config: configPath,
|
||||
companyId,
|
||||
});
|
||||
|
||||
expect(result).toMatchObject({
|
||||
companyId,
|
||||
totalRoutines: 3,
|
||||
pausedCount: 1,
|
||||
alreadyPausedCount: 1,
|
||||
archivedCount: 1,
|
||||
});
|
||||
|
||||
const companyRoutines = await db
|
||||
.select({
|
||||
id: routines.id,
|
||||
status: routines.status,
|
||||
})
|
||||
.from(routines)
|
||||
.where(eq(routines.companyId, companyId));
|
||||
const statusById = new Map(companyRoutines.map((routine) => [routine.id, routine.status]));
|
||||
|
||||
expect(statusById.get(activeRoutineId)).toBe("paused");
|
||||
expect(statusById.get(pausedRoutineId)).toBe("paused");
|
||||
expect(statusById.get(archivedRoutineId)).toBe("archived");
|
||||
|
||||
const otherCompanyRoutine = await db
|
||||
.select({
|
||||
status: routines.status,
|
||||
})
|
||||
.from(routines)
|
||||
.where(eq(routines.id, otherCompanyRoutineId));
|
||||
expect(otherCompanyRoutine[0]?.status).toBe("active");
|
||||
});
|
||||
});
|
||||
cli/src/__tests__/telemetry.test.ts (new file, 117 lines)
@@ -0,0 +1,117 @@
|
||||
import fs from "node:fs";
|
||||
import os from "node:os";
|
||||
import path from "node:path";
|
||||
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
|
||||
|
||||
const ORIGINAL_ENV = { ...process.env };
|
||||
const CI_ENV_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];
|
||||
|
||||
function makeConfigPath(root: string, enabled: boolean): string {
|
||||
const configPath = path.join(root, ".paperclip", "config.json");
|
||||
fs.mkdirSync(path.dirname(configPath), { recursive: true });
|
||||
fs.writeFileSync(configPath, JSON.stringify({
|
||||
$meta: {
|
||||
version: 1,
|
||||
updatedAt: "2026-03-31T00:00:00.000Z",
|
||||
source: "configure",
|
||||
},
|
||||
database: {
|
||||
mode: "embedded-postgres",
|
||||
embeddedPostgresDataDir: path.join(root, "runtime", "db"),
|
||||
embeddedPostgresPort: 54329,
|
||||
backup: {
|
||||
enabled: true,
|
||||
intervalMinutes: 60,
|
||||
retentionDays: 30,
|
||||
dir: path.join(root, "runtime", "backups"),
|
||||
},
|
||||
},
|
||||
logging: {
|
||||
mode: "file",
|
||||
logDir: path.join(root, "runtime", "logs"),
|
||||
},
|
||||
server: {
|
||||
deploymentMode: "local_trusted",
|
||||
exposure: "private",
|
||||
host: "127.0.0.1",
|
||||
port: 3100,
|
||||
allowedHostnames: [],
|
||||
serveUi: true,
|
||||
},
|
||||
auth: {
|
||||
baseUrlMode: "auto",
|
||||
disableSignUp: false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: {
|
||||
baseDir: path.join(root, "runtime", "storage"),
|
||||
},
|
||||
s3: {
|
||||
bucket: "paperclip",
|
||||
region: "us-east-1",
|
||||
prefix: "",
|
||||
forcePathStyle: false,
|
||||
},
|
||||
},
|
||||
secrets: {
|
||||
provider: "local_encrypted",
|
||||
strictMode: false,
|
||||
localEncrypted: {
|
||||
keyFilePath: path.join(root, "runtime", "secrets", "master.key"),
|
||||
},
|
||||
},
|
||||
}, null, 2));
|
||||
return configPath;
|
||||
}
|
||||
|
||||
describe("cli telemetry", () => {
|
||||
beforeEach(() => {
|
||||
process.env = { ...ORIGINAL_ENV };
|
||||
for (const key of CI_ENV_VARS) {
|
||||
delete process.env[key];
|
||||
}
|
||||
vi.stubGlobal("fetch", vi.fn(async () => ({ ok: true })));
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
process.env = { ...ORIGINAL_ENV };
|
||||
vi.unstubAllGlobals();
|
||||
vi.resetModules();
|
||||
});
|
||||
|
||||
it("respects telemetry.enabled=false from the config file", async () => {
|
||||
const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
|
||||
const configPath = makeConfigPath(root, false);
|
||||
process.env.PAPERCLIP_HOME = path.join(root, "home");
|
||||
process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";
|
||||
|
||||
const { initTelemetryFromConfigFile } = await import("../telemetry.js");
|
||||
const client = initTelemetryFromConfigFile(configPath);
|
||||
|
||||
expect(client).toBeNull();
|
||||
expect(fs.existsSync(path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json"))).toBe(false);
|
||||
});
|
||||
|
||||
it("creates telemetry state only after the first event is tracked", async () => {
|
||||
const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
|
||||
process.env.PAPERCLIP_HOME = path.join(root, "home");
|
||||
process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";
|
||||
|
||||
const { initTelemetry, flushTelemetry } = await import("../telemetry.js");
|
||||
const client = initTelemetry({ enabled: true });
|
||||
const statePath = path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json");
|
||||
|
||||
expect(client).not.toBeNull();
|
||||
expect(fs.existsSync(statePath)).toBe(false);
|
||||
|
||||
client!.track("install.started", { setupMode: "quickstart" });
|
||||
|
||||
expect(fs.existsSync(statePath)).toBe(true);
|
||||
|
||||
await flushTelemetry();
|
||||
});
|
||||
});
|
||||
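For orientation, a minimal sketch of how the telemetry helpers exercised by this test might be wired into CLI code. It uses only names imported in the test above (initTelemetryFromConfigFile, track, flushTelemetry); the import path and the event payload are illustrative assumptions, not taken from this diff.

// Hypothetical wiring based on the behavior the test asserts.
import { initTelemetryFromConfigFile, flushTelemetry } from "./telemetry.js";

async function reportInstallStarted(configPath: string): Promise<void> {
  // Returns null when telemetry.enabled is false in the config file.
  const client = initTelemetryFromConfigFile(configPath);
  if (!client) return;

  // Per the test, telemetry state.json is only written once the first event is tracked.
  client.track("install.started", { setupMode: "quickstart" });
  await flushTelemetry();
}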
492 cli/src/__tests__/worktree-merge-history.test.ts (new file)
@@ -0,0 +1,492 @@
|
||||
import { describe, expect, it } from "vitest";
|
||||
import { buildWorktreeMergePlan, parseWorktreeMergeScopes } from "../commands/worktree-merge-history-lib.js";
|
||||
|
||||
function makeIssue(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "issue-1",
|
||||
companyId: "company-1",
|
||||
projectId: null,
|
||||
projectWorkspaceId: null,
|
||||
goalId: "goal-1",
|
||||
parentId: null,
|
||||
title: "Issue",
|
||||
description: null,
|
||||
status: "todo",
|
||||
priority: "medium",
|
||||
assigneeAgentId: null,
|
||||
assigneeUserId: null,
|
||||
checkoutRunId: null,
|
||||
executionRunId: null,
|
||||
executionAgentNameKey: null,
|
||||
executionLockedAt: null,
|
||||
createdByAgentId: null,
|
||||
createdByUserId: "local-board",
|
||||
issueNumber: 1,
|
||||
identifier: "PAP-1",
|
||||
requestDepth: 0,
|
||||
billingCode: null,
|
||||
assigneeAdapterOverrides: null,
|
||||
executionWorkspaceId: null,
|
||||
executionWorkspacePreference: null,
|
||||
executionWorkspaceSettings: null,
|
||||
startedAt: null,
|
||||
completedAt: null,
|
||||
cancelledAt: null,
|
||||
hiddenAt: null,
|
||||
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeComment(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "comment-1",
|
||||
companyId: "company-1",
|
||||
issueId: "issue-1",
|
||||
authorAgentId: null,
|
||||
authorUserId: "local-board",
|
||||
body: "hello",
|
||||
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeIssueDocument(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "issue-document-1",
|
||||
companyId: "company-1",
|
||||
issueId: "issue-1",
|
||||
documentId: "document-1",
|
||||
key: "plan",
|
||||
linkCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
linkUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
title: "Plan",
|
||||
format: "markdown",
|
||||
latestBody: "# Plan",
|
||||
latestRevisionId: "revision-1",
|
||||
latestRevisionNumber: 1,
|
||||
createdByAgentId: null,
|
||||
createdByUserId: "local-board",
|
||||
updatedByAgentId: null,
|
||||
updatedByUserId: "local-board",
|
||||
documentCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
documentUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeDocumentRevision(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "revision-1",
|
||||
companyId: "company-1",
|
||||
documentId: "document-1",
|
||||
revisionNumber: 1,
|
||||
body: "# Plan",
|
||||
changeSummary: null,
|
||||
createdByAgentId: null,
|
||||
createdByUserId: "local-board",
|
||||
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeAttachment(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "attachment-1",
|
||||
companyId: "company-1",
|
||||
issueId: "issue-1",
|
||||
issueCommentId: null,
|
||||
assetId: "asset-1",
|
||||
provider: "local_disk",
|
||||
objectKey: "company-1/issues/issue-1/2026/03/20/asset.png",
|
||||
contentType: "image/png",
|
||||
byteSize: 12,
|
||||
sha256: "deadbeef",
|
||||
originalFilename: "asset.png",
|
||||
createdByAgentId: null,
|
||||
createdByUserId: "local-board",
|
||||
assetCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
assetUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
attachmentCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
attachmentUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeProject(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "project-1",
|
||||
companyId: "company-1",
|
||||
goalId: null,
|
||||
name: "Project",
|
||||
description: null,
|
||||
status: "in_progress",
|
||||
leadAgentId: null,
|
||||
targetDate: null,
|
||||
color: "#22c55e",
|
||||
pauseReason: null,
|
||||
pausedAt: null,
|
||||
executionWorkspacePolicy: null,
|
||||
archivedAt: null,
|
||||
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
function makeProjectWorkspace(overrides: Record<string, unknown> = {}) {
|
||||
return {
|
||||
id: "workspace-1",
|
||||
companyId: "company-1",
|
||||
projectId: "project-1",
|
||||
name: "Workspace",
|
||||
sourceType: "local_path",
|
||||
cwd: "/tmp/project",
|
||||
repoUrl: "https://github.com/example/project.git",
|
||||
repoRef: "main",
|
||||
defaultRef: "main",
|
||||
visibility: "default",
|
||||
setupCommand: null,
|
||||
cleanupCommand: null,
|
||||
remoteProvider: null,
|
||||
remoteWorkspaceRef: null,
|
||||
sharedWorkspaceKey: null,
|
||||
metadata: null,
|
||||
isPrimary: true,
|
||||
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||
...overrides,
|
||||
} as any;
|
||||
}
|
||||
|
||||
describe("worktree merge history planner", () => {
|
||||
it("parses default scopes", () => {
|
||||
expect(parseWorktreeMergeScopes(undefined)).toEqual(["issues", "comments"]);
|
||||
expect(parseWorktreeMergeScopes("issues")).toEqual(["issues"]);
|
||||
});
|
||||
|
||||
it("dedupes nested worktree issues by preserved source uuid", () => {
|
||||
const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10", title: "Shared" });
|
||||
const branchOneIssue = makeIssue({
|
||||
id: "issue-b",
|
||||
identifier: "PAP-22",
|
||||
title: "Branch one issue",
|
||||
createdAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||
});
|
||||
const branchTwoIssue = makeIssue({
|
||||
id: "issue-c",
|
||||
identifier: "PAP-23",
|
||||
title: "Branch two issue",
|
||||
createdAt: new Date("2026-03-20T02:00:00.000Z"),
|
||||
});
|
||||
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 500,
|
||||
scopes: ["issues", "comments"],
|
||||
sourceIssues: [sharedIssue, branchOneIssue, branchTwoIssue],
|
||||
targetIssues: [sharedIssue, branchOneIssue],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
});
|
||||
|
||||
expect(plan.counts.issuesToInsert).toBe(1);
|
||||
expect(plan.issuePlans.filter((item) => item.action === "insert").map((item) => item.source.id)).toEqual(["issue-c"]);
|
||||
expect(plan.issuePlans.find((item) => item.source.id === "issue-c" && item.action === "insert")).toMatchObject({
|
||||
previewIdentifier: "PAP-501",
|
||||
});
|
||||
});
|
||||
|
||||
it("clears missing references and coerces in_progress without an assignee", () => {
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues"],
|
||||
sourceIssues: [
|
||||
makeIssue({
|
||||
id: "issue-x",
|
||||
identifier: "PAP-99",
|
||||
status: "in_progress",
|
||||
assigneeAgentId: "agent-missing",
|
||||
projectId: "project-missing",
|
||||
projectWorkspaceId: "workspace-missing",
|
||||
goalId: "goal-missing",
|
||||
}),
|
||||
],
|
||||
targetIssues: [],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [],
|
||||
});
|
||||
|
||||
const insert = plan.issuePlans[0] as any;
|
||||
expect(insert.targetStatus).toBe("todo");
|
||||
expect(insert.targetAssigneeAgentId).toBeNull();
|
||||
expect(insert.targetProjectId).toBeNull();
|
||||
expect(insert.targetProjectWorkspaceId).toBeNull();
|
||||
expect(insert.targetGoalId).toBeNull();
|
||||
expect(insert.adjustments).toEqual([
|
||||
"clear_assignee_agent",
|
||||
"clear_project",
|
||||
"clear_project_workspace",
|
||||
"clear_goal",
|
||||
"coerce_in_progress_to_todo",
|
||||
]);
|
||||
});
|
||||
|
||||
it("applies an explicit project mapping override instead of clearing the project", () => {
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues"],
|
||||
sourceIssues: [
|
||||
makeIssue({
|
||||
id: "issue-project-map",
|
||||
identifier: "PAP-77",
|
||||
projectId: "source-project-1",
|
||||
projectWorkspaceId: "source-workspace-1",
|
||||
}),
|
||||
],
|
||||
targetIssues: [],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
targetAgents: [],
|
||||
targetProjects: [{ id: "target-project-1", name: "Mapped project", status: "in_progress" }] as any,
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
projectIdOverrides: {
|
||||
"source-project-1": "target-project-1",
|
||||
},
|
||||
});
|
||||
|
||||
const insert = plan.issuePlans[0] as any;
|
||||
expect(insert.targetProjectId).toBe("target-project-1");
|
||||
expect(insert.projectResolution).toBe("mapped");
|
||||
expect(insert.mappedProjectName).toBe("Mapped project");
|
||||
expect(insert.targetProjectWorkspaceId).toBeNull();
|
||||
expect(insert.adjustments).toEqual(["clear_project_workspace"]);
|
||||
});
|
||||
|
||||
it("plans selected project imports and preserves project workspace links", () => {
|
||||
const sourceProject = makeProject({
|
||||
id: "source-project-1",
|
||||
name: "Paperclip Evals",
|
||||
goalId: "goal-1",
|
||||
});
|
||||
const sourceWorkspace = makeProjectWorkspace({
|
||||
id: "source-workspace-1",
|
||||
projectId: "source-project-1",
|
||||
cwd: "/Users/dotta/paperclip-evals",
|
||||
repoUrl: "https://github.com/paperclipai/paperclip-evals.git",
|
||||
});
|
||||
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues"],
|
||||
sourceIssues: [
|
||||
makeIssue({
|
||||
id: "issue-project-import",
|
||||
identifier: "PAP-88",
|
||||
projectId: "source-project-1",
|
||||
projectWorkspaceId: "source-workspace-1",
|
||||
}),
|
||||
],
|
||||
targetIssues: [],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
sourceProjects: [sourceProject],
|
||||
sourceProjectWorkspaces: [sourceWorkspace],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
importProjectIds: ["source-project-1"],
|
||||
});
|
||||
|
||||
expect(plan.counts.projectsToImport).toBe(1);
|
||||
expect(plan.projectImports[0]).toMatchObject({
|
||||
source: { id: "source-project-1", name: "Paperclip Evals" },
|
||||
targetGoalId: "goal-1",
|
||||
workspaces: [{ id: "source-workspace-1" }],
|
||||
});
|
||||
|
||||
const insert = plan.issuePlans[0] as any;
|
||||
expect(insert.targetProjectId).toBe("source-project-1");
|
||||
expect(insert.targetProjectWorkspaceId).toBe("source-workspace-1");
|
||||
expect(insert.projectResolution).toBe("imported");
|
||||
expect(insert.mappedProjectName).toBe("Paperclip Evals");
|
||||
expect(insert.adjustments).toEqual([]);
|
||||
});
|
||||
|
||||
it("imports comments onto shared or newly imported issues while skipping existing comments", () => {
|
||||
const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
|
||||
const newIssue = makeIssue({
|
||||
id: "issue-b",
|
||||
identifier: "PAP-11",
|
||||
createdAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||
});
|
||||
const existingComment = makeComment({ id: "comment-existing", issueId: "issue-a" });
|
||||
const sharedIssueComment = makeComment({ id: "comment-shared", issueId: "issue-a" });
|
||||
const newIssueComment = makeComment({
|
||||
id: "comment-new-issue",
|
||||
issueId: "issue-b",
|
||||
authorAgentId: "missing-agent",
|
||||
createdAt: new Date("2026-03-20T01:05:00.000Z"),
|
||||
});
|
||||
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues", "comments"],
|
||||
sourceIssues: [sharedIssue, newIssue],
|
||||
targetIssues: [sharedIssue],
|
||||
sourceComments: [existingComment, sharedIssueComment, newIssueComment],
|
||||
targetComments: [existingComment],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
});
|
||||
|
||||
expect(plan.counts.commentsToInsert).toBe(2);
|
||||
expect(plan.counts.commentsExisting).toBe(1);
|
||||
expect(plan.commentPlans.filter((item) => item.action === "insert").map((item) => item.source.id)).toEqual([
|
||||
"comment-shared",
|
||||
"comment-new-issue",
|
||||
]);
|
||||
expect(plan.adjustments.clear_author_agent).toBe(1);
|
||||
});
|
||||
|
||||
it("merges document revisions onto an existing shared document and renumbers conflicts", () => {
|
||||
const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
|
||||
const sourceDocument = makeIssueDocument({
|
||||
issueId: "issue-a",
|
||||
documentId: "document-a",
|
||||
latestBody: "# Branch plan",
|
||||
latestRevisionId: "revision-branch-2",
|
||||
latestRevisionNumber: 2,
|
||||
documentUpdatedAt: new Date("2026-03-20T02:00:00.000Z"),
|
||||
linkUpdatedAt: new Date("2026-03-20T02:00:00.000Z"),
|
||||
});
|
||||
const targetDocument = makeIssueDocument({
|
||||
issueId: "issue-a",
|
||||
documentId: "document-a",
|
||||
latestBody: "# Main plan",
|
||||
latestRevisionId: "revision-main-2",
|
||||
latestRevisionNumber: 2,
|
||||
documentUpdatedAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||
linkUpdatedAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||
});
|
||||
const sourceRevisionOne = makeDocumentRevision({ documentId: "document-a", id: "revision-1" });
|
||||
const sourceRevisionTwo = makeDocumentRevision({
|
||||
documentId: "document-a",
|
||||
id: "revision-branch-2",
|
||||
revisionNumber: 2,
|
||||
body: "# Branch plan",
|
||||
createdAt: new Date("2026-03-20T02:00:00.000Z"),
|
||||
});
|
||||
const targetRevisionOne = makeDocumentRevision({ documentId: "document-a", id: "revision-1" });
|
||||
const targetRevisionTwo = makeDocumentRevision({
|
||||
documentId: "document-a",
|
||||
id: "revision-main-2",
|
||||
revisionNumber: 2,
|
||||
body: "# Main plan",
|
||||
createdAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||
});
|
||||
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues", "comments"],
|
||||
sourceIssues: [sharedIssue],
|
||||
targetIssues: [sharedIssue],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
sourceDocuments: [sourceDocument],
|
||||
targetDocuments: [targetDocument],
|
||||
sourceDocumentRevisions: [sourceRevisionOne, sourceRevisionTwo],
|
||||
targetDocumentRevisions: [targetRevisionOne, targetRevisionTwo],
|
||||
sourceAttachments: [],
|
||||
targetAttachments: [],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
});
|
||||
|
||||
expect(plan.counts.documentsToMerge).toBe(1);
|
||||
expect(plan.counts.documentRevisionsToInsert).toBe(1);
|
||||
expect(plan.documentPlans[0]).toMatchObject({
|
||||
action: "merge_existing",
|
||||
latestRevisionId: "revision-branch-2",
|
||||
latestRevisionNumber: 3,
|
||||
});
|
||||
const mergePlan = plan.documentPlans[0] as any;
|
||||
expect(mergePlan.revisionsToInsert).toHaveLength(1);
|
||||
expect(mergePlan.revisionsToInsert[0]).toMatchObject({
|
||||
source: { id: "revision-branch-2" },
|
||||
targetRevisionNumber: 3,
|
||||
});
|
||||
});
|
||||
|
||||
it("imports attachments while clearing missing comment and author references", () => {
|
||||
const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
|
||||
const attachment = makeAttachment({
|
||||
issueId: "issue-a",
|
||||
issueCommentId: "comment-missing",
|
||||
createdByAgentId: "agent-missing",
|
||||
});
|
||||
|
||||
const plan = buildWorktreeMergePlan({
|
||||
companyId: "company-1",
|
||||
companyName: "Paperclip",
|
||||
issuePrefix: "PAP",
|
||||
previewIssueCounterStart: 10,
|
||||
scopes: ["issues"],
|
||||
sourceIssues: [sharedIssue],
|
||||
targetIssues: [sharedIssue],
|
||||
sourceComments: [],
|
||||
targetComments: [],
|
||||
sourceDocuments: [],
|
||||
targetDocuments: [],
|
||||
sourceDocumentRevisions: [],
|
||||
targetDocumentRevisions: [],
|
||||
sourceAttachments: [attachment],
|
||||
targetAttachments: [],
|
||||
targetAgents: [],
|
||||
targetProjects: [],
|
||||
targetProjectWorkspaces: [],
|
||||
targetGoals: [{ id: "goal-1" }] as any,
|
||||
});
|
||||
|
||||
expect(plan.counts.attachmentsToInsert).toBe(1);
|
||||
expect(plan.adjustments.clear_attachment_agent).toBe(1);
|
||||
expect(plan.attachmentPlans[0]).toMatchObject({
|
||||
action: "insert",
|
||||
targetIssueCommentId: null,
|
||||
targetCreatedByAgentId: null,
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -6,6 +6,7 @@ import { afterEach, describe, expect, it, vi } from "vitest";
|
||||
import {
|
||||
copyGitHooksToWorktreeGitDir,
|
||||
copySeededSecretsKey,
|
||||
readSourceAttachmentBody,
|
||||
rebindWorkspaceCwd,
|
||||
resolveSourceConfigPath,
|
||||
resolveGitWorktreeAddArgs,
|
||||
@@ -74,6 +75,9 @@ function buildSourceConfig(): PaperclipConfig {
|
||||
publicBaseUrl: "http://127.0.0.1:3100",
|
||||
disableSignUp: false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled: true,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: {
|
||||
@@ -195,6 +199,43 @@ describe("worktree helpers", () => {
|
||||
expect(formatShellExports(env)).toContain("export PAPERCLIP_INSTANCE_ID='feature-worktree-support'");
|
||||
});
|
||||
|
||||
it("falls back across storage roots before skipping a missing attachment object", async () => {
|
||||
const missingErr = Object.assign(new Error("missing"), { code: "ENOENT" });
|
||||
const expected = Buffer.from("image-bytes");
|
||||
await expect(
|
||||
readSourceAttachmentBody(
|
||||
[
|
||||
{
|
||||
getObject: vi.fn().mockRejectedValue(missingErr),
|
||||
},
|
||||
{
|
||||
getObject: vi.fn().mockResolvedValue(expected),
|
||||
},
|
||||
],
|
||||
"company-1",
|
||||
"company-1/issues/issue-1/missing.png",
|
||||
),
|
||||
).resolves.toEqual(expected);
|
||||
});
|
||||
|
||||
it("returns null when an attachment object is missing from every lookup storage", async () => {
|
||||
const missingErr = Object.assign(new Error("missing"), { code: "ENOENT" });
|
||||
await expect(
|
||||
readSourceAttachmentBody(
|
||||
[
|
||||
{
|
||||
getObject: vi.fn().mockRejectedValue(missingErr),
|
||||
},
|
||||
{
|
||||
getObject: vi.fn().mockRejectedValue(Object.assign(new Error("missing"), { status: 404 })),
|
||||
},
|
||||
],
|
||||
"company-1",
|
||||
"company-1/issues/issue-1/missing.png",
|
||||
),
|
||||
).resolves.toBeNull();
|
||||
});
|
||||
|
||||
it("generates vivid worktree colors as hex", () => {
|
||||
expect(generateWorktreeColor()).toMatch(/^#[0-9a-f]{6}$/);
|
||||
});
|
||||
@@ -306,6 +347,87 @@ describe("worktree helpers", () => {
|
||||
}
|
||||
});
|
||||
|
||||
it("avoids ports already claimed by sibling worktree instance configs", async () => {
|
||||
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-claimed-ports-"));
|
||||
const repoRoot = path.join(tempRoot, "repo");
|
||||
const homeDir = path.join(tempRoot, ".paperclip-worktrees");
|
||||
const siblingInstanceRoot = path.join(homeDir, "instances", "existing-worktree");
|
||||
const originalCwd = process.cwd();
|
||||
|
||||
try {
|
||||
fs.mkdirSync(repoRoot, { recursive: true });
|
||||
fs.mkdirSync(siblingInstanceRoot, { recursive: true });
|
||||
fs.writeFileSync(
|
||||
path.join(siblingInstanceRoot, "config.json"),
|
||||
JSON.stringify(
|
||||
{
|
||||
...buildSourceConfig(),
|
||||
database: {
|
||||
mode: "embedded-postgres",
|
||||
embeddedPostgresDataDir: path.join(siblingInstanceRoot, "db"),
|
||||
embeddedPostgresPort: 54330,
|
||||
backup: {
|
||||
enabled: true,
|
||||
intervalMinutes: 60,
|
||||
retentionDays: 30,
|
||||
dir: path.join(siblingInstanceRoot, "backups"),
|
||||
},
|
||||
},
|
||||
logging: {
|
||||
mode: "file",
|
||||
logDir: path.join(siblingInstanceRoot, "logs"),
|
||||
},
|
||||
server: {
|
||||
deploymentMode: "authenticated",
|
||||
exposure: "private",
|
||||
host: "127.0.0.1",
|
||||
port: 3101,
|
||||
allowedHostnames: ["localhost"],
|
||||
serveUi: true,
|
||||
},
|
||||
storage: {
|
||||
provider: "local_disk",
|
||||
localDisk: {
|
||||
baseDir: path.join(siblingInstanceRoot, "storage"),
|
||||
},
|
||||
s3: {
|
||||
bucket: "paperclip",
|
||||
region: "us-east-1",
|
||||
prefix: "",
|
||||
forcePathStyle: false,
|
||||
},
|
||||
},
|
||||
secrets: {
|
||||
provider: "local_encrypted",
|
||||
strictMode: false,
|
||||
localEncrypted: {
|
||||
keyFilePath: path.join(siblingInstanceRoot, "secrets", "master.key"),
|
||||
},
|
||||
},
|
||||
},
|
||||
null,
|
||||
2,
|
||||
) + "\n",
|
||||
);
|
||||
|
||||
process.chdir(repoRoot);
|
||||
await worktreeInitCommand({
|
||||
seed: false,
|
||||
fromConfig: path.join(tempRoot, "missing", "config.json"),
|
||||
home: homeDir,
|
||||
});
|
||||
|
||||
const config = JSON.parse(fs.readFileSync(path.join(repoRoot, ".paperclip", "config.json"), "utf8"));
|
||||
expect(config.server.port).toBeGreaterThan(3101);
|
||||
expect(config.database.embeddedPostgresPort).not.toBe(54330);
|
||||
expect(config.database.embeddedPostgresPort).not.toBe(config.server.port);
|
||||
expect(config.database.embeddedPostgresPort).toBeGreaterThan(54330);
|
||||
} finally {
|
||||
process.chdir(originalCwd);
|
||||
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||
}
|
||||
});
|
||||
|
||||
it("defaults the seed source config to the current repo-local Paperclip config", () => {
|
||||
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-source-config-"));
|
||||
const repoRoot = path.join(tempRoot, "repo");
|
||||
|
||||
282 cli/src/client/board-auth.ts (new file)
@@ -0,0 +1,282 @@
|
||||
import { spawn } from "node:child_process";
|
||||
import fs from "node:fs";
|
||||
import path from "node:path";
|
||||
import pc from "picocolors";
|
||||
import { buildCliCommandLabel } from "./command-label.js";
|
||||
import { resolveDefaultCliAuthPath } from "../config/home.js";
|
||||
|
||||
type RequestedAccess = "board" | "instance_admin_required";
|
||||
|
||||
interface BoardAuthCredential {
|
||||
apiBase: string;
|
||||
token: string;
|
||||
createdAt: string;
|
||||
updatedAt: string;
|
||||
userId?: string | null;
|
||||
}
|
||||
|
||||
interface BoardAuthStore {
|
||||
version: 1;
|
||||
credentials: Record<string, BoardAuthCredential>;
|
||||
}
|
||||
|
||||
interface CreateChallengeResponse {
|
||||
id: string;
|
||||
token: string;
|
||||
boardApiToken: string;
|
||||
approvalPath: string;
|
||||
approvalUrl: string | null;
|
||||
pollPath: string;
|
||||
expiresAt: string;
|
||||
suggestedPollIntervalMs: number;
|
||||
}
|
||||
|
||||
interface ChallengeStatusResponse {
|
||||
id: string;
|
||||
status: "pending" | "approved" | "cancelled" | "expired";
|
||||
command: string;
|
||||
clientName: string | null;
|
||||
requestedAccess: RequestedAccess;
|
||||
requestedCompanyId: string | null;
|
||||
requestedCompanyName: string | null;
|
||||
approvedAt: string | null;
|
||||
cancelledAt: string | null;
|
||||
expiresAt: string;
|
||||
approvedByUser: { id: string; name: string; email: string } | null;
|
||||
}
|
||||
|
||||
function defaultBoardAuthStore(): BoardAuthStore {
|
||||
return {
|
||||
version: 1,
|
||||
credentials: {},
|
||||
};
|
||||
}
|
||||
|
||||
function toStringOrNull(value: unknown): string | null {
|
||||
return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
|
||||
}
|
||||
|
||||
function normalizeApiBase(apiBase: string): string {
|
||||
return apiBase.trim().replace(/\/+$/, "");
|
||||
}
|
||||
|
||||
export function resolveBoardAuthStorePath(overridePath?: string): string {
|
||||
if (overridePath?.trim()) return path.resolve(overridePath.trim());
|
||||
if (process.env.PAPERCLIP_AUTH_STORE?.trim()) return path.resolve(process.env.PAPERCLIP_AUTH_STORE.trim());
|
||||
return resolveDefaultCliAuthPath();
|
||||
}
|
||||
|
||||
export function readBoardAuthStore(storePath?: string): BoardAuthStore {
|
||||
const filePath = resolveBoardAuthStorePath(storePath);
|
||||
if (!fs.existsSync(filePath)) return defaultBoardAuthStore();
|
||||
|
||||
const raw = JSON.parse(fs.readFileSync(filePath, "utf8")) as Partial<BoardAuthStore> | null;
|
||||
const credentials = raw?.credentials && typeof raw.credentials === "object" ? raw.credentials : {};
|
||||
const normalized: Record<string, BoardAuthCredential> = {};
|
||||
|
||||
for (const [key, value] of Object.entries(credentials)) {
|
||||
if (typeof value !== "object" || value === null) continue;
|
||||
const record = value as unknown as Record<string, unknown>;
|
||||
const apiBase = toStringOrNull(record.apiBase);
|
||||
const token = toStringOrNull(record.token);
|
||||
const createdAt = toStringOrNull(record.createdAt);
|
||||
const updatedAt = toStringOrNull(record.updatedAt);
|
||||
if (!apiBase || !token || !createdAt || !updatedAt) continue;
|
||||
normalized[normalizeApiBase(key)] = {
|
||||
apiBase,
|
||||
token,
|
||||
createdAt,
|
||||
updatedAt,
|
||||
userId: toStringOrNull(record.userId),
|
||||
};
|
||||
}
|
||||
|
||||
return {
|
||||
version: 1,
|
||||
credentials: normalized,
|
||||
};
|
||||
}
|
||||
|
||||
export function writeBoardAuthStore(store: BoardAuthStore, storePath?: string): void {
|
||||
const filePath = resolveBoardAuthStorePath(storePath);
|
||||
fs.mkdirSync(path.dirname(filePath), { recursive: true });
|
||||
fs.writeFileSync(filePath, `${JSON.stringify(store, null, 2)}\n`, { mode: 0o600 });
|
||||
}
|
||||
|
||||
export function getStoredBoardCredential(apiBase: string, storePath?: string): BoardAuthCredential | null {
|
||||
const store = readBoardAuthStore(storePath);
|
||||
return store.credentials[normalizeApiBase(apiBase)] ?? null;
|
||||
}
|
||||
|
||||
export function setStoredBoardCredential(input: {
|
||||
apiBase: string;
|
||||
token: string;
|
||||
userId?: string | null;
|
||||
storePath?: string;
|
||||
}): BoardAuthCredential {
|
||||
const normalizedApiBase = normalizeApiBase(input.apiBase);
|
||||
const store = readBoardAuthStore(input.storePath);
|
||||
const now = new Date().toISOString();
|
||||
const existing = store.credentials[normalizedApiBase];
|
||||
const credential: BoardAuthCredential = {
|
||||
apiBase: normalizedApiBase,
|
||||
token: input.token.trim(),
|
||||
createdAt: existing?.createdAt ?? now,
|
||||
updatedAt: now,
|
||||
userId: input.userId ?? existing?.userId ?? null,
|
||||
};
|
||||
store.credentials[normalizedApiBase] = credential;
|
||||
writeBoardAuthStore(store, input.storePath);
|
||||
return credential;
|
||||
}
|
||||
|
||||
export function removeStoredBoardCredential(apiBase: string, storePath?: string): boolean {
|
||||
const normalizedApiBase = normalizeApiBase(apiBase);
|
||||
const store = readBoardAuthStore(storePath);
|
||||
if (!store.credentials[normalizedApiBase]) return false;
|
||||
delete store.credentials[normalizedApiBase];
|
||||
writeBoardAuthStore(store, storePath);
|
||||
return true;
|
||||
}
|
||||
|
||||
function sleep(ms: number) {
|
||||
return new Promise((resolve) => setTimeout(resolve, ms));
|
||||
}
|
||||
|
||||
async function requestJson<T>(url: string, init?: RequestInit): Promise<T> {
|
||||
const headers = new Headers(init?.headers ?? undefined);
|
||||
if (init?.body !== undefined && !headers.has("content-type")) {
|
||||
headers.set("content-type", "application/json");
|
||||
}
|
||||
if (!headers.has("accept")) {
|
||||
headers.set("accept", "application/json");
|
||||
}
|
||||
|
||||
const response = await fetch(url, {
|
||||
...init,
|
||||
headers,
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const body = await response.json().catch(() => null);
|
||||
const message =
|
||||
body && typeof body === "object" && typeof (body as { error?: unknown }).error === "string"
|
||||
? (body as { error: string }).error
|
||||
: `Request failed: ${response.status}`;
|
||||
throw new Error(message);
|
||||
}
|
||||
|
||||
return response.json() as Promise<T>;
|
||||
}
|
||||
|
||||
export function openUrl(url: string): boolean {
|
||||
const platform = process.platform;
|
||||
try {
|
||||
if (platform === "darwin") {
|
||||
const child = spawn("open", [url], { detached: true, stdio: "ignore" });
|
||||
child.unref();
|
||||
return true;
|
||||
}
|
||||
if (platform === "win32") {
|
||||
const child = spawn("cmd", ["/c", "start", "", url], { detached: true, stdio: "ignore" });
|
||||
child.unref();
|
||||
return true;
|
||||
}
|
||||
const child = spawn("xdg-open", [url], { detached: true, stdio: "ignore" });
|
||||
child.unref();
|
||||
return true;
|
||||
} catch {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
export async function loginBoardCli(params: {
|
||||
apiBase: string;
|
||||
requestedAccess: RequestedAccess;
|
||||
requestedCompanyId?: string | null;
|
||||
clientName?: string | null;
|
||||
command?: string;
|
||||
storePath?: string;
|
||||
print?: boolean;
|
||||
}): Promise<{ token: string; approvalUrl: string; userId?: string | null }> {
|
||||
const apiBase = normalizeApiBase(params.apiBase);
|
||||
const createUrl = `${apiBase}/api/cli-auth/challenges`;
|
||||
const command = params.command?.trim() || buildCliCommandLabel();
|
||||
|
||||
const challenge = await requestJson<CreateChallengeResponse>(createUrl, {
|
||||
method: "POST",
|
||||
body: JSON.stringify({
|
||||
command,
|
||||
clientName: params.clientName?.trim() || "paperclipai cli",
|
||||
requestedAccess: params.requestedAccess,
|
||||
requestedCompanyId: params.requestedCompanyId?.trim() || null,
|
||||
}),
|
||||
});
|
||||
|
||||
const approvalUrl = challenge.approvalUrl ?? `${apiBase}${challenge.approvalPath}`;
|
||||
if (params.print !== false) {
|
||||
console.error(pc.bold("Board authentication required"));
|
||||
console.error(`Open this URL in your browser to approve CLI access:\n${approvalUrl}`);
|
||||
}
|
||||
|
||||
const opened = openUrl(approvalUrl);
|
||||
if (params.print !== false && opened) {
|
||||
console.error(pc.dim("Opened the approval page in your browser."));
|
||||
}
|
||||
|
||||
const expiresAtMs = Date.parse(challenge.expiresAt);
|
||||
const pollMs = Math.max(500, challenge.suggestedPollIntervalMs || 1000);
|
||||
|
||||
while (Number.isFinite(expiresAtMs) ? Date.now() < expiresAtMs : true) {
|
||||
const status = await requestJson<ChallengeStatusResponse>(
|
||||
`${apiBase}/api${challenge.pollPath}?token=${encodeURIComponent(challenge.token)}`,
|
||||
);
|
||||
|
||||
if (status.status === "approved") {
|
||||
const me = await requestJson<{ userId: string; user?: { id: string } | null }>(
|
||||
`${apiBase}/api/cli-auth/me`,
|
||||
{
|
||||
headers: {
|
||||
authorization: `Bearer ${challenge.boardApiToken}`,
|
||||
},
|
||||
},
|
||||
);
|
||||
setStoredBoardCredential({
|
||||
apiBase,
|
||||
token: challenge.boardApiToken,
|
||||
userId: me.userId ?? me.user?.id ?? null,
|
||||
storePath: params.storePath,
|
||||
});
|
||||
return {
|
||||
token: challenge.boardApiToken,
|
||||
approvalUrl,
|
||||
userId: me.userId ?? me.user?.id ?? null,
|
||||
};
|
||||
}
|
||||
|
||||
if (status.status === "cancelled") {
|
||||
throw new Error("CLI auth challenge was cancelled.");
|
||||
}
|
||||
if (status.status === "expired") {
|
||||
throw new Error("CLI auth challenge expired before approval.");
|
||||
}
|
||||
|
||||
await sleep(pollMs);
|
||||
}
|
||||
|
||||
throw new Error("CLI auth challenge expired before approval.");
|
||||
}
|
||||
|
||||
export async function revokeStoredBoardCredential(params: {
|
||||
apiBase: string;
|
||||
token: string;
|
||||
}): Promise<void> {
|
||||
const apiBase = normalizeApiBase(params.apiBase);
|
||||
await requestJson<{ revoked: boolean }>(`${apiBase}/api/cli-auth/revoke-current`, {
|
||||
method: "POST",
|
||||
headers: {
|
||||
authorization: `Bearer ${params.token}`,
|
||||
},
|
||||
body: JSON.stringify({}),
|
||||
});
|
||||
}
|
||||
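A minimal sketch of how the helpers above might be combined by a command that needs a board token: reuse a stored credential when one exists, otherwise run the browser approval flow. Only exports defined in this file are used; the import path and the helper name are illustrative assumptions.

// Hypothetical helper built on getStoredBoardCredential / loginBoardCli above.
import { getStoredBoardCredential, loginBoardCli } from "./board-auth.js";

async function ensureBoardToken(apiBase: string): Promise<string> {
  const stored = getStoredBoardCredential(apiBase);
  if (stored) return stored.token;

  // Opens the approval URL in a browser and polls until approved, cancelled, or expired.
  const login = await loginBoardCli({ apiBase, requestedAccess: "board" });
  return login.token;
}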
4 cli/src/client/command-label.ts (new file)
@@ -0,0 +1,4 @@
export function buildCliCommandLabel(): string {
  const args = process.argv.slice(2);
  return args.length > 0 ? `paperclipai ${args.join(" ")}` : "paperclipai";
}
@@ -13,25 +13,54 @@ export class ApiRequestError extends Error {
|
||||
}
|
||||
}
|
||||
|
||||
export class ApiConnectionError extends Error {
|
||||
url: string;
|
||||
method: string;
|
||||
causeMessage?: string;
|
||||
|
||||
constructor(input: {
|
||||
apiBase: string;
|
||||
path: string;
|
||||
method: string;
|
||||
cause?: unknown;
|
||||
}) {
|
||||
const url = buildUrl(input.apiBase, input.path);
|
||||
const causeMessage = formatConnectionCause(input.cause);
|
||||
super(buildConnectionErrorMessage({ apiBase: input.apiBase, url, method: input.method, causeMessage }));
|
||||
this.url = url;
|
||||
this.method = input.method;
|
||||
this.causeMessage = causeMessage;
|
||||
}
|
||||
}
|
||||
|
||||
interface RequestOptions {
|
||||
ignoreNotFound?: boolean;
|
||||
}
|
||||
|
||||
interface RecoverAuthInput {
|
||||
path: string;
|
||||
method: string;
|
||||
error: ApiRequestError;
|
||||
}
|
||||
|
||||
interface ApiClientOptions {
|
||||
apiBase: string;
|
||||
apiKey?: string;
|
||||
runId?: string;
|
||||
recoverAuth?: (input: RecoverAuthInput) => Promise<string | null>;
|
||||
}
|
||||
|
||||
export class PaperclipApiClient {
|
||||
readonly apiBase: string;
|
||||
readonly apiKey?: string;
|
||||
apiKey?: string;
|
||||
readonly runId?: string;
|
||||
readonly recoverAuth?: (input: RecoverAuthInput) => Promise<string | null>;
|
||||
|
||||
constructor(opts: ApiClientOptions) {
|
||||
this.apiBase = opts.apiBase.replace(/\/+$/, "");
|
||||
this.apiKey = opts.apiKey?.trim() || undefined;
|
||||
this.runId = opts.runId?.trim() || undefined;
|
||||
this.recoverAuth = opts.recoverAuth;
|
||||
}
|
||||
|
||||
get<T>(path: string, opts?: RequestOptions): Promise<T | null> {
|
||||
@@ -56,8 +85,18 @@ export class PaperclipApiClient {
|
||||
return this.request<T>(path, { method: "DELETE" }, opts);
|
||||
}
|
||||
|
||||
private async request<T>(path: string, init: RequestInit, opts?: RequestOptions): Promise<T | null> {
|
||||
setApiKey(apiKey: string | undefined) {
|
||||
this.apiKey = apiKey?.trim() || undefined;
|
||||
}
|
||||
|
||||
private async request<T>(
|
||||
path: string,
|
||||
init: RequestInit,
|
||||
opts?: RequestOptions,
|
||||
hasRetriedAuth = false,
|
||||
): Promise<T | null> {
|
||||
const url = buildUrl(this.apiBase, path);
|
||||
const method = String(init.method ?? "GET").toUpperCase();
|
||||
|
||||
const headers: Record<string, string> = {
|
||||
accept: "application/json",
|
||||
@@ -76,17 +115,39 @@ export class PaperclipApiClient {
|
||||
headers["x-paperclip-run-id"] = this.runId;
|
||||
}
|
||||
|
||||
const response = await fetch(url, {
|
||||
...init,
|
||||
headers,
|
||||
});
|
||||
let response: Response;
|
||||
try {
|
||||
response = await fetch(url, {
|
||||
...init,
|
||||
headers,
|
||||
});
|
||||
} catch (error) {
|
||||
throw new ApiConnectionError({
|
||||
apiBase: this.apiBase,
|
||||
path,
|
||||
method,
|
||||
cause: error,
|
||||
});
|
||||
}
|
||||
|
||||
if (opts?.ignoreNotFound && response.status === 404) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (!response.ok) {
|
||||
throw await toApiError(response);
|
||||
const apiError = await toApiError(response);
|
||||
if (!hasRetriedAuth && this.recoverAuth) {
|
||||
const recoveredToken = await this.recoverAuth({
|
||||
path,
|
||||
method,
|
||||
error: apiError,
|
||||
});
|
||||
if (recoveredToken) {
|
||||
this.setApiKey(recoveredToken);
|
||||
return this.request<T>(path, init, opts, true);
|
||||
}
|
||||
}
|
||||
throw apiError;
|
||||
}
|
||||
|
||||
if (response.status === 204) {
|
||||
@@ -136,6 +197,50 @@ async function toApiError(response: Response): Promise<ApiRequestError> {
|
||||
return new ApiRequestError(response.status, `Request failed with status ${response.status}`, undefined, parsed);
|
||||
}
|
||||
|
||||
function buildConnectionErrorMessage(input: {
|
||||
apiBase: string;
|
||||
url: string;
|
||||
method: string;
|
||||
causeMessage?: string;
|
||||
}): string {
|
||||
const healthUrl = buildHealthCheckUrl(input.url);
|
||||
const lines = [
|
||||
"Could not reach the Paperclip API.",
|
||||
"",
|
||||
`Request: ${input.method} ${input.url}`,
|
||||
];
|
||||
if (input.causeMessage) {
|
||||
lines.push(`Cause: ${input.causeMessage}`);
|
||||
}
|
||||
lines.push(
|
||||
"",
|
||||
"This usually means the Paperclip server is not running, the configured URL is wrong, or the request is being blocked before it reaches Paperclip.",
|
||||
"",
|
||||
"Try:",
|
||||
"- Start Paperclip with `pnpm dev` or `pnpm paperclipai run`.",
|
||||
`- Verify the server is reachable with \`curl ${healthUrl}\`.`,
|
||||
`- If Paperclip is running elsewhere, pass \`--api-base ${input.apiBase.replace(/\/+$/, "")}\` or set \`PAPERCLIP_API_URL\`.`,
|
||||
);
|
||||
return lines.join("\n");
|
||||
}
|
||||
|
||||
function buildHealthCheckUrl(requestUrl: string): string {
|
||||
const url = new URL(requestUrl);
|
||||
url.pathname = `${url.pathname.replace(/\/+$/, "").replace(/\/api(?:\/.*)?$/, "")}/api/health`;
|
||||
url.search = "";
|
||||
url.hash = "";
|
||||
return url.toString();
|
||||
}
|
||||
|
||||
function formatConnectionCause(error: unknown): string | undefined {
|
||||
if (!error) return undefined;
|
||||
if (error instanceof Error) {
|
||||
return error.message.trim() || error.name;
|
||||
}
|
||||
const message = String(error).trim();
|
||||
return message || undefined;
|
||||
}
|
||||
|
||||
function toStringRecord(headers: HeadersInit | undefined): Record<string, string> {
|
||||
if (!headers) return {};
|
||||
if (Array.isArray(headers)) {
|
||||
|
||||
113 cli/src/commands/client/auth.ts (new file)
@@ -0,0 +1,113 @@
|
||||
import type { Command } from "commander";
|
||||
import {
|
||||
getStoredBoardCredential,
|
||||
loginBoardCli,
|
||||
removeStoredBoardCredential,
|
||||
revokeStoredBoardCredential,
|
||||
} from "../../client/board-auth.js";
|
||||
import {
|
||||
addCommonClientOptions,
|
||||
handleCommandError,
|
||||
printOutput,
|
||||
resolveCommandContext,
|
||||
type BaseClientOptions,
|
||||
} from "./common.js";
|
||||
|
||||
interface AuthLoginOptions extends BaseClientOptions {
|
||||
instanceAdmin?: boolean;
|
||||
}
|
||||
|
||||
interface AuthLogoutOptions extends BaseClientOptions {}
|
||||
interface AuthWhoamiOptions extends BaseClientOptions {}
|
||||
|
||||
export function registerClientAuthCommands(auth: Command): void {
|
||||
addCommonClientOptions(
|
||||
auth
|
||||
.command("login")
|
||||
.description("Authenticate the CLI for board-user access")
|
||||
.option("--instance-admin", "Request instance-admin approval instead of plain board access", false)
|
||||
.action(async (opts: AuthLoginOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const login = await loginBoardCli({
|
||||
apiBase: ctx.api.apiBase,
|
||||
requestedAccess: opts.instanceAdmin ? "instance_admin_required" : "board",
|
||||
requestedCompanyId: ctx.companyId ?? null,
|
||||
command: "paperclipai auth login",
|
||||
});
|
||||
printOutput(
|
||||
{
|
||||
ok: true,
|
||||
apiBase: ctx.api.apiBase,
|
||||
userId: login.userId ?? null,
|
||||
approvalUrl: login.approvalUrl,
|
||||
},
|
||||
{ json: ctx.json },
|
||||
);
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
{ includeCompany: true },
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
auth
|
||||
.command("logout")
|
||||
.description("Remove the stored board-user credential for this API base")
|
||||
.action(async (opts: AuthLogoutOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const credential = getStoredBoardCredential(ctx.api.apiBase);
|
||||
if (!credential) {
|
||||
printOutput({ ok: true, apiBase: ctx.api.apiBase, revoked: false, removedLocalCredential: false }, { json: ctx.json });
|
||||
return;
|
||||
}
|
||||
let revoked = false;
|
||||
try {
|
||||
await revokeStoredBoardCredential({
|
||||
apiBase: ctx.api.apiBase,
|
||||
token: credential.token,
|
||||
});
|
||||
revoked = true;
|
||||
} catch {
|
||||
// Remove the local credential even if the server-side revoke fails.
|
||||
}
|
||||
const removedLocalCredential = removeStoredBoardCredential(ctx.api.apiBase);
|
||||
printOutput(
|
||||
{
|
||||
ok: true,
|
||||
apiBase: ctx.api.apiBase,
|
||||
revoked,
|
||||
removedLocalCredential,
|
||||
},
|
||||
{ json: ctx.json },
|
||||
);
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
auth
|
||||
.command("whoami")
|
||||
.description("Show the current board-user identity for this API base")
|
||||
.action(async (opts: AuthWhoamiOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const me = await ctx.api.get<{
|
||||
user: { id: string; name: string; email: string } | null;
|
||||
userId: string;
|
||||
isInstanceAdmin: boolean;
|
||||
companyIds: string[];
|
||||
source: string;
|
||||
keyId: string | null;
|
||||
}>("/api/cli-auth/me");
|
||||
printOutput(me, { json: ctx.json });
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
);
|
||||
}
|
||||
@@ -1,5 +1,7 @@
|
||||
import pc from "picocolors";
|
||||
import type { Command } from "commander";
|
||||
import { getStoredBoardCredential, loginBoardCli } from "../../client/board-auth.js";
|
||||
import { buildCliCommandLabel } from "../../client/command-label.js";
|
||||
import { readConfig } from "../../config/store.js";
|
||||
import { readContext, resolveProfile, type ClientContextProfile } from "../../client/context.js";
|
||||
import { ApiRequestError, PaperclipApiClient } from "../../client/http.js";
|
||||
@@ -53,10 +55,12 @@ export function resolveCommandContext(
|
||||
profile.apiBase ||
|
||||
inferApiBaseFromConfig(options.config);
|
||||
|
||||
const apiKey =
|
||||
const explicitApiKey =
|
||||
options.apiKey?.trim() ||
|
||||
process.env.PAPERCLIP_API_KEY?.trim() ||
|
||||
readKeyFromProfileEnv(profile);
|
||||
const storedBoardCredential = explicitApiKey ? null : getStoredBoardCredential(apiBase);
|
||||
const apiKey = explicitApiKey || storedBoardCredential?.token;
|
||||
|
||||
const companyId =
|
||||
options.companyId?.trim() ||
|
||||
@@ -69,7 +73,27 @@ export function resolveCommandContext(
|
||||
);
|
||||
}
|
||||
|
||||
const api = new PaperclipApiClient({ apiBase, apiKey });
|
||||
const api = new PaperclipApiClient({
|
||||
apiBase,
|
||||
apiKey,
|
||||
recoverAuth: explicitApiKey || !canAttemptInteractiveBoardAuth()
|
||||
? undefined
|
||||
: async ({ error }) => {
|
||||
const requestedAccess = error.message.includes("Instance admin required")
|
||||
? "instance_admin_required"
|
||||
: "board";
|
||||
if (!shouldRecoverBoardAuth(error)) {
|
||||
return null;
|
||||
}
|
||||
const login = await loginBoardCli({
|
||||
apiBase,
|
||||
requestedAccess,
|
||||
requestedCompanyId: companyId ?? null,
|
||||
command: buildCliCommandLabel(),
|
||||
});
|
||||
return login.token;
|
||||
},
|
||||
});
|
||||
return {
|
||||
api,
|
||||
companyId,
|
||||
@@ -79,6 +103,16 @@ export function resolveCommandContext(
|
||||
};
|
||||
}
|
||||
|
||||
function shouldRecoverBoardAuth(error: ApiRequestError): boolean {
|
||||
if (error.status === 401) return true;
|
||||
if (error.status !== 403) return false;
|
||||
return error.message.includes("Board access required") || error.message.includes("Instance admin required");
|
||||
}
|
||||
|
||||
function canAttemptInteractiveBoardAuth(): boolean {
|
||||
return Boolean(process.stdin.isTTY && process.stdout.isTTY);
|
||||
}
|
||||
|
||||
export function printOutput(data: unknown, opts: { json?: boolean; label?: string } = {}): void {
|
||||
if (opts.json) {
|
||||
console.log(JSON.stringify(data, null, 2));
|
||||
|
||||
File diff suppressed because it is too large
645 cli/src/commands/client/feedback.ts (new file)
@@ -0,0 +1,645 @@
|
||||
import { mkdir, readdir, readFile, stat, writeFile } from "node:fs/promises";
|
||||
import path from "node:path";
|
||||
import pc from "picocolors";
|
||||
import { Command } from "commander";
|
||||
import type { Company, FeedbackTrace, FeedbackTraceBundle } from "@paperclipai/shared";
|
||||
import {
|
||||
addCommonClientOptions,
|
||||
handleCommandError,
|
||||
printOutput,
|
||||
resolveCommandContext,
|
||||
type BaseClientOptions,
|
||||
type ResolvedClientContext,
|
||||
} from "./common.js";
|
||||
|
||||
interface FeedbackFilterOptions extends BaseClientOptions {
|
||||
targetType?: string;
|
||||
vote?: string;
|
||||
status?: string;
|
||||
projectId?: string;
|
||||
issueId?: string;
|
||||
from?: string;
|
||||
to?: string;
|
||||
sharedOnly?: boolean;
|
||||
}
|
||||
|
||||
export interface FeedbackTraceQueryOptions {
|
||||
targetType?: string;
|
||||
vote?: string;
|
||||
status?: string;
|
||||
projectId?: string;
|
||||
issueId?: string;
|
||||
from?: string;
|
||||
to?: string;
|
||||
sharedOnly?: boolean;
|
||||
}
|
||||
|
||||
interface FeedbackReportOptions extends FeedbackFilterOptions {
|
||||
payloads?: boolean;
|
||||
}
|
||||
|
||||
interface FeedbackExportOptions extends FeedbackFilterOptions {
|
||||
out?: string;
|
||||
}
|
||||
|
||||
interface FeedbackSummary {
|
||||
total: number;
|
||||
thumbsUp: number;
|
||||
thumbsDown: number;
|
||||
withReason: number;
|
||||
statuses: Record<string, number>;
|
||||
}
|
||||
|
||||
interface FeedbackExportManifest {
|
||||
exportedAt: string;
|
||||
serverUrl: string;
|
||||
companyId: string;
|
||||
summary: FeedbackSummary & {
|
||||
uniqueIssues: number;
|
||||
issues: string[];
|
||||
};
|
||||
files: {
|
||||
votes: string[];
|
||||
traces: string[];
|
||||
fullTraces: string[];
|
||||
zip: string;
|
||||
};
|
||||
}
|
||||
|
||||
interface FeedbackExportResult {
|
||||
outputDir: string;
|
||||
zipPath: string;
|
||||
manifest: FeedbackExportManifest;
|
||||
}
|
||||
|
||||
export function registerFeedbackCommands(program: Command): void {
|
||||
const feedback = program.command("feedback").description("Inspect and export local feedback traces");
|
||||
|
||||
addCommonClientOptions(
|
||||
feedback
|
||||
.command("report")
|
||||
.description("Render a terminal report for company feedback traces")
|
||||
.option("-C, --company-id <id>", "Company ID (overrides context default)")
|
||||
.option("--target-type <type>", "Filter by target type")
|
||||
.option("--vote <vote>", "Filter by vote value")
|
||||
.option("--status <status>", "Filter by trace status")
|
||||
.option("--project-id <id>", "Filter by project ID")
|
||||
.option("--issue-id <id>", "Filter by issue ID")
|
||||
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||
.option("--payloads", "Include raw payload dumps in the terminal report", false)
|
||||
.action(async (opts: FeedbackReportOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
|
||||
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
|
||||
const summary = summarizeFeedbackTraces(traces);
|
||||
if (ctx.json) {
|
||||
printOutput(
|
||||
{
|
||||
apiBase: ctx.api.apiBase,
|
||||
companyId,
|
||||
summary,
|
||||
traces,
|
||||
},
|
||||
{ json: true },
|
||||
);
|
||||
return;
|
||||
}
|
||||
console.log(renderFeedbackReport({
|
||||
apiBase: ctx.api.apiBase,
|
||||
companyId,
|
||||
traces,
|
||||
summary,
|
||||
includePayloads: Boolean(opts.payloads),
|
||||
}));
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
{ includeCompany: false },
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
feedback
|
||||
.command("export")
|
||||
.description("Export feedback votes and raw trace bundles into a folder plus zip archive")
|
||||
.option("-C, --company-id <id>", "Company ID (overrides context default)")
|
||||
.option("--target-type <type>", "Filter by target type")
|
||||
.option("--vote <vote>", "Filter by vote value")
|
||||
.option("--status <status>", "Filter by trace status")
|
||||
.option("--project-id <id>", "Filter by project ID")
|
||||
.option("--issue-id <id>", "Filter by issue ID")
|
||||
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||
.option("--out <path>", "Output directory (default: ./feedback-export-<timestamp>)")
|
||||
.action(async (opts: FeedbackExportOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
|
||||
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
|
||||
const outputDir = path.resolve(opts.out?.trim() || defaultFeedbackExportDirName());
|
||||
const exported = await writeFeedbackExportBundle({
|
||||
apiBase: ctx.api.apiBase,
|
||||
companyId,
|
||||
traces,
|
||||
outputDir,
|
||||
traceBundleFetcher: (trace) => fetchFeedbackTraceBundle(ctx, trace.id),
|
||||
});
|
||||
if (ctx.json) {
|
||||
printOutput(
|
||||
{
|
||||
companyId,
|
||||
outputDir: exported.outputDir,
|
||||
zipPath: exported.zipPath,
|
||||
summary: exported.manifest.summary,
|
||||
},
|
||||
{ json: true },
|
||||
);
|
||||
return;
|
||||
}
|
||||
console.log(renderFeedbackExportSummary(exported));
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
{ includeCompany: false },
|
||||
);
|
||||
}
|
||||
|
||||
export async function resolveFeedbackCompanyId(
|
||||
ctx: ResolvedClientContext,
|
||||
explicitCompanyId?: string,
|
||||
): Promise<string> {
|
||||
const direct = explicitCompanyId?.trim() || ctx.companyId?.trim();
|
||||
if (direct) return direct;
|
||||
const companies = (await ctx.api.get<Company[]>("/api/companies")) ?? [];
|
||||
const companyId = companies[0]?.id?.trim();
|
||||
if (!companyId) {
|
||||
throw new Error(
|
||||
"Company ID is required. Pass --company-id, set PAPERCLIP_COMPANY_ID, or configure a CLI context default.",
|
||||
);
|
||||
}
|
||||
return companyId;
|
||||
}
|
||||
|
||||
export function buildFeedbackTraceQuery(opts: FeedbackTraceQueryOptions, includePayload = true): string {
|
||||
const params = new URLSearchParams();
|
||||
if (opts.targetType) params.set("targetType", opts.targetType);
|
||||
if (opts.vote) params.set("vote", opts.vote);
|
||||
if (opts.status) params.set("status", opts.status);
|
||||
if (opts.projectId) params.set("projectId", opts.projectId);
|
||||
if (opts.issueId) params.set("issueId", opts.issueId);
|
||||
if (opts.from) params.set("from", opts.from);
|
||||
if (opts.to) params.set("to", opts.to);
|
||||
if (opts.sharedOnly) params.set("sharedOnly", "true");
|
||||
if (includePayload) params.set("includePayload", "true");
|
||||
const query = params.toString();
|
||||
return query ? `?${query}` : "";
|
||||
}
|
||||
|
||||
export function normalizeFeedbackTraceExportFormat(value: string | undefined): "json" | "ndjson" {
|
||||
if (!value || value === "ndjson") return "ndjson";
|
||||
if (value === "json") return "json";
|
||||
throw new Error(`Unsupported export format: ${value}`);
|
||||
}
|
||||
|
||||
export function serializeFeedbackTraces(traces: FeedbackTrace[], format: string | undefined): string {
|
||||
if (normalizeFeedbackTraceExportFormat(format) === "json") {
|
||||
return JSON.stringify(traces, null, 2);
|
||||
}
|
||||
return traces.map((trace) => JSON.stringify(trace)).join("\n");
|
||||
}
|
||||
|
||||
export async function fetchCompanyFeedbackTraces(
  ctx: ResolvedClientContext,
  companyId: string,
  opts: FeedbackFilterOptions,
): Promise<FeedbackTrace[]> {
  return (
    (await ctx.api.get<FeedbackTrace[]>(
      `/api/companies/${companyId}/feedback-traces${buildFeedbackTraceQuery(opts, true)}`,
    )) ?? []
  );
}

export async function fetchFeedbackTraceBundle(
  ctx: ResolvedClientContext,
  traceId: string,
): Promise<FeedbackTraceBundle> {
  const bundle = await ctx.api.get<FeedbackTraceBundle>(`/api/feedback-traces/${traceId}/bundle`);
  if (!bundle) {
    throw new Error(`Feedback trace bundle ${traceId} not found`);
  }
  return bundle;
}
|
||||
|
||||
export function summarizeFeedbackTraces(traces: FeedbackTrace[]): FeedbackSummary {
  const statuses: Record<string, number> = {};
  let thumbsUp = 0;
  let thumbsDown = 0;
  let withReason = 0;

  for (const trace of traces) {
    if (trace.vote === "up") thumbsUp += 1;
    if (trace.vote === "down") thumbsDown += 1;
    if (readFeedbackReason(trace)) withReason += 1;
    statuses[trace.status] = (statuses[trace.status] ?? 0) + 1;
  }

  return {
    total: traces.length,
    thumbsUp,
    thumbsDown,
    withReason,
    statuses,
  };
}
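// Sketch of the resulting shape (hypothetical input): two up-votes and one down-vote with a
// reason, all in status "sent", would summarize as
//   { total: 3, thumbsUp: 2, thumbsDown: 1, withReason: 1, statuses: { sent: 3 } }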
|
||||
|
||||
export function renderFeedbackReport(input: {
|
||||
apiBase: string;
|
||||
companyId: string;
|
||||
traces: FeedbackTrace[];
|
||||
summary: FeedbackSummary;
|
||||
includePayloads: boolean;
|
||||
}): string {
|
||||
const lines: string[] = [];
|
||||
lines.push("");
|
||||
lines.push(pc.bold(pc.magenta("Paperclip Feedback Report")));
|
||||
lines.push(pc.dim(new Date().toISOString()));
|
||||
lines.push(horizontalRule());
|
||||
lines.push(`${pc.dim("Server:")} ${input.apiBase}`);
|
||||
lines.push(`${pc.dim("Company:")} ${input.companyId}`);
|
||||
lines.push("");
|
||||
|
||||
if (input.traces.length === 0) {
|
||||
lines.push(pc.yellow("[!!] No feedback traces found."));
|
||||
lines.push("");
|
||||
return lines.join("\n");
|
||||
}
|
||||
|
||||
lines.push(pc.bold(pc.cyan("Summary")));
|
||||
lines.push(horizontalRule());
|
||||
lines.push(` ${pc.green(pc.bold(String(input.summary.thumbsUp)))} thumbs up`);
|
||||
lines.push(` ${pc.red(pc.bold(String(input.summary.thumbsDown)))} thumbs down`);
|
||||
lines.push(` ${pc.yellow(pc.bold(String(input.summary.withReason)))} downvotes with a reason`);
|
||||
lines.push(` ${pc.bold(String(input.summary.total))} total traces`);
|
||||
lines.push("");
|
||||
lines.push(pc.dim("Export status:"));
|
||||
for (const status of ["pending", "sent", "local_only", "failed"]) {
|
||||
lines.push(` ${padRight(status, 10)} ${input.summary.statuses[status] ?? 0}`);
|
||||
}
|
||||
lines.push("");
|
||||
lines.push(pc.bold(pc.cyan("Trace Details")));
|
||||
lines.push(horizontalRule());
|
||||
|
||||
for (const trace of input.traces) {
|
||||
const voteColor = trace.vote === "up" ? pc.green : pc.red;
|
||||
const voteIcon = trace.vote === "up" ? "^" : "v";
|
||||
const issueRef = trace.issueIdentifier ?? trace.issueId;
|
||||
const label = trace.targetSummary.label?.trim() || trace.targetType;
|
||||
const excerpt = compactText(trace.targetSummary.excerpt);
|
||||
const reason = readFeedbackReason(trace);
|
||||
lines.push(
|
||||
` ${voteColor(voteIcon)} ${pc.bold(issueRef)} ${pc.dim(compactText(trace.issueTitle, 64))}`,
|
||||
);
|
||||
lines.push(
|
||||
` ${pc.dim("Trace:")} ${trace.id.slice(0, 8)} ${pc.dim("Status:")} ${trace.status} ${pc.dim("Date:")} ${formatTimestamp(trace.createdAt)}`,
|
||||
);
|
||||
lines.push(` ${pc.dim("Target:")} ${label}`);
|
||||
if (excerpt) {
|
||||
lines.push(` ${pc.dim("Excerpt:")} ${excerpt}`);
|
||||
}
|
||||
if (reason) {
|
||||
lines.push(` ${pc.yellow(pc.bold("Reason:"))} ${pc.yellow(reason)}`);
|
||||
}
|
||||
lines.push("");
|
||||
}
|
||||
|
||||
if (input.includePayloads) {
|
||||
lines.push(pc.bold(pc.cyan("Raw Payloads")));
|
||||
lines.push(horizontalRule());
|
||||
for (const trace of input.traces) {
|
||||
if (!trace.payloadSnapshot) continue;
|
||||
const issueRef = trace.issueIdentifier ?? trace.issueId;
|
||||
lines.push(` ${pc.bold(`${issueRef} (${trace.id.slice(0, 8)})`)}`);
|
||||
const body = JSON.stringify(trace.payloadSnapshot, null, 2)?.split("\n") ?? [];
|
||||
for (const line of body) {
|
||||
lines.push(` ${pc.dim(line)}`);
|
||||
}
|
||||
lines.push("");
|
||||
}
|
||||
}
|
||||
|
||||
lines.push(horizontalRule());
|
||||
lines.push(pc.dim(`Report complete. ${input.traces.length} trace(s) displayed.`));
|
||||
lines.push("");
|
||||
return lines.join("\n");
|
||||
}
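// Layout note: the report above is assembled as plain strings colored with picocolors (pc):
// a header block, a vote/status summary, a per-trace detail section, and, when
// includePayloads is set, pretty-printed payloadSnapshot JSON rendered dimmed.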
|
||||
|
||||
export async function writeFeedbackExportBundle(input: {
|
||||
apiBase: string;
|
||||
companyId: string;
|
||||
traces: FeedbackTrace[];
|
||||
outputDir: string;
|
||||
traceBundleFetcher?: (trace: FeedbackTrace) => Promise<FeedbackTraceBundle>;
|
||||
}): Promise<FeedbackExportResult> {
|
||||
await ensureEmptyOutputDirectory(input.outputDir);
|
||||
await mkdir(path.join(input.outputDir, "votes"), { recursive: true });
|
||||
await mkdir(path.join(input.outputDir, "traces"), { recursive: true });
|
||||
await mkdir(path.join(input.outputDir, "full-traces"), { recursive: true });
|
||||
|
||||
const summary = summarizeFeedbackTraces(input.traces);
|
||||
const voteFiles: string[] = [];
|
||||
const traceFiles: string[] = [];
|
||||
const fullTraceDirs: string[] = [];
|
||||
const fullTraceFiles: string[] = [];
|
||||
const issueSet = new Set<string>();
|
||||
|
||||
for (const trace of input.traces) {
|
||||
const issueRef = sanitizeFileSegment(trace.issueIdentifier ?? trace.issueId);
|
||||
const voteRecord = buildFeedbackVoteRecord(trace);
|
||||
const voteFileName = `${issueRef}-${trace.feedbackVoteId.slice(0, 8)}.json`;
|
||||
const traceFileName = `${issueRef}-${trace.id.slice(0, 8)}.json`;
|
||||
voteFiles.push(voteFileName);
|
||||
traceFiles.push(traceFileName);
|
||||
issueSet.add(trace.issueIdentifier ?? trace.issueId);
|
||||
await writeFile(
|
||||
path.join(input.outputDir, "votes", voteFileName),
|
||||
`${JSON.stringify(voteRecord, null, 2)}\n`,
|
||||
"utf8",
|
||||
);
|
||||
await writeFile(
|
||||
path.join(input.outputDir, "traces", traceFileName),
|
||||
`${JSON.stringify(trace, null, 2)}\n`,
|
||||
"utf8",
|
||||
);
|
||||
|
||||
if (input.traceBundleFetcher) {
|
||||
const bundle = await input.traceBundleFetcher(trace);
|
||||
const bundleDirName = `${issueRef}-${trace.id.slice(0, 8)}`;
|
||||
const bundleDir = path.join(input.outputDir, "full-traces", bundleDirName);
|
||||
await mkdir(bundleDir, { recursive: true });
|
||||
fullTraceDirs.push(bundleDirName);
|
||||
await writeFile(
|
||||
path.join(bundleDir, "bundle.json"),
|
||||
`${JSON.stringify(bundle, null, 2)}\n`,
|
||||
"utf8",
|
||||
);
|
||||
fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, "bundle.json"));
|
||||
for (const file of bundle.files) {
|
||||
const targetPath = path.join(bundleDir, file.path);
|
||||
await mkdir(path.dirname(targetPath), { recursive: true });
|
||||
await writeFile(targetPath, file.contents, "utf8");
|
||||
fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, file.path.replace(/\\/g, "/")));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const zipPath = `${input.outputDir}.zip`;
|
||||
const manifest: FeedbackExportManifest = {
|
||||
exportedAt: new Date().toISOString(),
|
||||
serverUrl: input.apiBase,
|
||||
companyId: input.companyId,
|
||||
summary: {
|
||||
...summary,
|
||||
uniqueIssues: issueSet.size,
|
||||
issues: Array.from(issueSet).sort((left, right) => left.localeCompare(right)),
|
||||
},
|
||||
files: {
|
||||
votes: voteFiles.slice().sort((left, right) => left.localeCompare(right)),
|
||||
traces: traceFiles.slice().sort((left, right) => left.localeCompare(right)),
|
||||
fullTraces: fullTraceDirs.slice().sort((left, right) => left.localeCompare(right)),
|
||||
zip: path.basename(zipPath),
|
||||
},
|
||||
};
|
||||
|
||||
await writeFile(
|
||||
path.join(input.outputDir, "index.json"),
|
||||
`${JSON.stringify(manifest, null, 2)}\n`,
|
||||
"utf8",
|
||||
);
|
||||
const archiveFiles = await collectJsonFilesForArchive(input.outputDir, [
|
||||
"index.json",
|
||||
...manifest.files.votes.map((file) => path.posix.join("votes", file)),
|
||||
...manifest.files.traces.map((file) => path.posix.join("traces", file)),
|
||||
...fullTraceFiles,
|
||||
]);
|
||||
await writeFile(zipPath, createStoredZipArchive(archiveFiles, path.basename(input.outputDir)));
|
||||
|
||||
return {
|
||||
outputDir: input.outputDir,
|
||||
zipPath,
|
||||
manifest,
|
||||
};
|
||||
}
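// On-disk result of the bundle writer above: <outputDir>/votes/*.json,
// <outputDir>/traces/*.json, <outputDir>/full-traces/<issue>-<traceId8>/bundle.json (plus any
// files returned by the optional traceBundleFetcher), an index.json manifest, and a stored
// (uncompressed) <outputDir>.zip built from the same JSON files.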
|
||||
|
||||
export function renderFeedbackExportSummary(exported: FeedbackExportResult): string {
|
||||
const lines: string[] = [];
|
||||
lines.push("");
|
||||
lines.push(pc.bold(pc.magenta("Paperclip Feedback Export")));
|
||||
lines.push(pc.dim(exported.manifest.exportedAt));
|
||||
lines.push(horizontalRule());
|
||||
lines.push(`${pc.dim("Company:")} ${exported.manifest.companyId}`);
|
||||
lines.push(`${pc.dim("Output:")} ${exported.outputDir}`);
|
||||
lines.push(`${pc.dim("Archive:")} ${exported.zipPath}`);
|
||||
lines.push("");
|
||||
lines.push(pc.bold("Export Summary"));
|
||||
lines.push(horizontalRule());
|
||||
lines.push(` ${pc.green(pc.bold(String(exported.manifest.summary.thumbsUp)))} thumbs up`);
|
||||
lines.push(` ${pc.red(pc.bold(String(exported.manifest.summary.thumbsDown)))} thumbs down`);
|
||||
lines.push(` ${pc.yellow(pc.bold(String(exported.manifest.summary.withReason)))} with reason`);
|
||||
lines.push(` ${pc.bold(String(exported.manifest.summary.uniqueIssues))} unique issues`);
|
||||
lines.push("");
|
||||
lines.push(pc.dim("Files:"));
|
||||
lines.push(` ${path.join(exported.outputDir, "index.json")}`);
|
||||
lines.push(` ${path.join(exported.outputDir, "votes")} (${exported.manifest.files.votes.length} files)`);
|
||||
lines.push(` ${path.join(exported.outputDir, "traces")} (${exported.manifest.files.traces.length} files)`);
|
||||
lines.push(` ${path.join(exported.outputDir, "full-traces")} (${exported.manifest.files.fullTraces.length} bundles)`);
|
||||
lines.push(` ${exported.zipPath}`);
|
||||
lines.push("");
|
||||
return lines.join("\n");
|
||||
}
|
||||
|
||||
function readFeedbackReason(trace: FeedbackTrace): string | null {
  const payload = asRecord(trace.payloadSnapshot);
  const vote = asRecord(payload?.vote);
  const reason = vote?.reason;
  return typeof reason === "string" && reason.trim() ? reason.trim() : null;
}

function buildFeedbackVoteRecord(trace: FeedbackTrace) {
  return {
    voteId: trace.feedbackVoteId,
    traceId: trace.id,
    issueId: trace.issueId,
    issueIdentifier: trace.issueIdentifier,
    issueTitle: trace.issueTitle,
    vote: trace.vote,
    targetType: trace.targetType,
    targetId: trace.targetId,
    targetSummary: trace.targetSummary,
    status: trace.status,
    consentVersion: trace.consentVersion,
    createdAt: trace.createdAt,
    updatedAt: trace.updatedAt,
    reason: readFeedbackReason(trace),
  };
}
|
||||
|
||||
function asRecord(value: unknown): Record<string, unknown> | null {
  if (!value || typeof value !== "object" || Array.isArray(value)) return null;
  return value as Record<string, unknown>;
}

function compactText(value: string | null | undefined, maxLength = 88): string | null {
  if (!value) return null;
  const compact = value.replace(/\s+/g, " ").trim();
  if (!compact) return null;
  if (compact.length <= maxLength) return compact;
  return `${compact.slice(0, maxLength - 3)}...`;
}

function formatTimestamp(value: unknown): string {
  if (value instanceof Date) return value.toISOString().slice(0, 19).replace("T", " ");
  if (typeof value === "string") return value.slice(0, 19).replace("T", " ");
  return "-";
}

function horizontalRule(): string {
  return pc.dim("-".repeat(72));
}

function padRight(value: string, width: number): string {
  return `${value}${" ".repeat(Math.max(0, width - value.length))}`;
}

function defaultFeedbackExportDirName(): string {
  const iso = new Date().toISOString().replace(/[-:]/g, "").replace(/\.\d{3}Z$/, "Z");
  return `feedback-export-${iso}`;
}
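// Example: an export started at 2026-02-03T04:05:06.789Z would default to the directory name
// "feedback-export-20260203T040506Z".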
|
||||
|
||||
async function ensureEmptyOutputDirectory(outputDir: string): Promise<void> {
  try {
    const info = await stat(outputDir);
    if (!info.isDirectory()) {
      throw new Error(`Output path already exists and is not a directory: ${outputDir}`);
    }
    const entries = await readdir(outputDir);
    if (entries.length > 0) {
      throw new Error(`Output directory already exists and is not empty: ${outputDir}`);
    }
  } catch (error) {
    const message = error instanceof Error ? error.message : "";
    if (/ENOENT/.test(message)) {
      await mkdir(outputDir, { recursive: true });
      return;
    }
    throw error;
  }
}
|
||||
|
||||
async function collectJsonFilesForArchive(
  outputDir: string,
  relativePaths: string[],
): Promise<Record<string, string>> {
  const files: Record<string, string> = {};
  for (const relativePath of relativePaths) {
    const normalized = relativePath.replace(/\\/g, "/");
    files[normalized] = await readFile(path.join(outputDir, normalized), "utf8");
  }
  return files;
}

function sanitizeFileSegment(value: string): string {
  return value.replace(/[^a-zA-Z0-9._-]+/g, "-").replace(/^-+|-+$/g, "") || "feedback";
}
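// Example: sanitizeFileSegment("PAPA-45: export bundle!") yields "PAPA-45-export-bundle"; an
// input containing no safe characters falls back to "feedback".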
|
||||
|
||||
function writeUint16(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
}

function writeUint32(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
  target[offset + 2] = (value >>> 16) & 0xff;
  target[offset + 3] = (value >>> 24) & 0xff;
}

function crc32(bytes: Uint8Array) {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}
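// This is the standard reflected CRC-32 (polynomial 0xEDB88320) computed bit by bit, as used
// for ZIP entry checksums; a table-driven variant would be faster, but for small JSON exports
// the simple loop is presumably sufficient.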
|
||||
|
||||
function createStoredZipArchive(files: Record<string, string>, rootPath: string): Uint8Array {
|
||||
const encoder = new TextEncoder();
|
||||
const localChunks: Uint8Array[] = [];
|
||||
const centralChunks: Uint8Array[] = [];
|
||||
let localOffset = 0;
|
||||
let entryCount = 0;
|
||||
|
||||
for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
|
||||
const fileName = encoder.encode(`${rootPath}/${relativePath}`);
|
||||
const body = encoder.encode(content);
|
||||
const checksum = crc32(body);
|
||||
|
||||
const localHeader = new Uint8Array(30 + fileName.length);
|
||||
writeUint32(localHeader, 0, 0x04034b50);
|
||||
writeUint16(localHeader, 4, 20);
|
||||
writeUint16(localHeader, 6, 0x0800);
|
||||
writeUint16(localHeader, 8, 0);
|
||||
writeUint32(localHeader, 14, checksum);
|
||||
writeUint32(localHeader, 18, body.length);
|
||||
writeUint32(localHeader, 22, body.length);
|
||||
writeUint16(localHeader, 26, fileName.length);
|
||||
localHeader.set(fileName, 30);
|
||||
|
||||
const centralHeader = new Uint8Array(46 + fileName.length);
|
||||
writeUint32(centralHeader, 0, 0x02014b50);
|
||||
writeUint16(centralHeader, 4, 20);
|
||||
writeUint16(centralHeader, 6, 20);
|
||||
writeUint16(centralHeader, 8, 0x0800);
|
||||
writeUint16(centralHeader, 10, 0);
|
||||
writeUint32(centralHeader, 16, checksum);
|
||||
writeUint32(centralHeader, 20, body.length);
|
||||
writeUint32(centralHeader, 24, body.length);
|
||||
writeUint16(centralHeader, 28, fileName.length);
|
||||
writeUint32(centralHeader, 42, localOffset);
|
||||
centralHeader.set(fileName, 46);
|
||||
|
||||
localChunks.push(localHeader, body);
|
||||
centralChunks.push(centralHeader);
|
||||
localOffset += localHeader.length + body.length;
|
||||
entryCount += 1;
|
||||
}
|
||||
|
||||
const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
|
||||
const archive = new Uint8Array(
|
||||
localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
|
||||
);
|
||||
let offset = 0;
|
||||
for (const chunk of localChunks) {
|
||||
archive.set(chunk, offset);
|
||||
offset += chunk.length;
|
||||
}
|
||||
const centralDirectoryOffset = offset;
|
||||
for (const chunk of centralChunks) {
|
||||
archive.set(chunk, offset);
|
||||
offset += chunk.length;
|
||||
}
|
||||
writeUint32(archive, offset, 0x06054b50);
|
||||
writeUint16(archive, offset + 8, entryCount);
|
||||
writeUint16(archive, offset + 10, entryCount);
|
||||
writeUint32(archive, offset + 12, centralDirectoryLength);
|
||||
writeUint32(archive, offset + 16, centralDirectoryOffset);
|
||||
return archive;
|
||||
}
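// Structure produced above, matching a minimal ZIP "stored" layout: per entry a local file
// header (signature 0x04034b50) followed by the uncompressed bytes (method 0 / STORE), then
// one central directory entry (0x02014b50) per file, and a single end-of-central-directory
// record (0x06054b50). General-purpose flag 0x0800 marks the file names as UTF-8, and the DOS
// timestamp fields are simply left zeroed.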
|
||||
@@ -1,8 +1,10 @@
|
||||
import { Command } from "commander";
|
||||
import { writeFile } from "node:fs/promises";
|
||||
import {
|
||||
addIssueCommentSchema,
|
||||
checkoutIssueSchema,
|
||||
createIssueSchema,
|
||||
type FeedbackTrace,
|
||||
updateIssueSchema,
|
||||
type Issue,
|
||||
type IssueComment,
|
||||
@@ -15,6 +17,11 @@ import {
|
||||
resolveCommandContext,
|
||||
type BaseClientOptions,
|
||||
} from "./common.js";
|
||||
import {
|
||||
buildFeedbackTraceQuery,
|
||||
normalizeFeedbackTraceExportFormat,
|
||||
serializeFeedbackTraces,
|
||||
} from "./feedback.js";
|
||||
|
||||
interface IssueBaseOptions extends BaseClientOptions {
|
||||
status?: string;
|
||||
@@ -61,6 +68,18 @@ interface IssueCheckoutOptions extends BaseClientOptions {
|
||||
expectedStatuses?: string;
|
||||
}
|
||||
|
||||
interface IssueFeedbackOptions extends BaseClientOptions {
|
||||
targetType?: string;
|
||||
vote?: string;
|
||||
status?: string;
|
||||
from?: string;
|
||||
to?: string;
|
||||
sharedOnly?: boolean;
|
||||
includePayload?: boolean;
|
||||
out?: string;
|
||||
format?: string;
|
||||
}
|
||||
|
||||
export function registerIssueCommands(program: Command): void {
|
||||
const issue = program.command("issue").description("Issue operations");
|
||||
|
||||
@@ -237,6 +256,85 @@ export function registerIssueCommands(program: Command): void {
|
||||
}),
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
issue
|
||||
.command("feedback:list")
|
||||
.description("List feedback traces for an issue")
|
||||
.argument("<issueId>", "Issue ID")
|
||||
.option("--target-type <type>", "Filter by target type")
|
||||
.option("--vote <vote>", "Filter by vote value")
|
||||
.option("--status <status>", "Filter by trace status")
|
||||
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||
.option("--include-payload", "Include stored payload snapshots in the response")
|
||||
.action(async (issueId: string, opts: IssueFeedbackOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const traces = (await ctx.api.get<FeedbackTrace[]>(
|
||||
`/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts)}`,
|
||||
)) ?? [];
|
||||
if (ctx.json) {
|
||||
printOutput(traces, { json: true });
|
||||
return;
|
||||
}
|
||||
printOutput(
|
||||
traces.map((trace) => ({
|
||||
id: trace.id,
|
||||
issue: trace.issueIdentifier ?? trace.issueId,
|
||||
vote: trace.vote,
|
||||
status: trace.status,
|
||||
targetType: trace.targetType,
|
||||
target: trace.targetSummary.label,
|
||||
})),
|
||||
{ json: false },
|
||||
);
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
issue
|
||||
.command("feedback:export")
|
||||
.description("Export feedback traces for an issue")
|
||||
.argument("<issueId>", "Issue ID")
|
||||
.option("--target-type <type>", "Filter by target type")
|
||||
.option("--vote <vote>", "Filter by vote value")
|
||||
.option("--status <status>", "Filter by trace status")
|
||||
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||
.option("--include-payload", "Include stored payload snapshots in the export")
|
||||
.option("--out <path>", "Write export to a file path instead of stdout")
|
||||
.option("--format <format>", "Export format: json or ndjson", "ndjson")
|
||||
.action(async (issueId: string, opts: IssueFeedbackOptions) => {
|
||||
try {
|
||||
const ctx = resolveCommandContext(opts);
|
||||
const traces = (await ctx.api.get<FeedbackTrace[]>(
|
||||
`/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts, opts.includePayload ?? true)}`,
|
||||
)) ?? [];
|
||||
const serialized = serializeFeedbackTraces(traces, opts.format);
|
||||
if (opts.out?.trim()) {
|
||||
await writeFile(opts.out, serialized, "utf8");
|
||||
if (ctx.json) {
|
||||
printOutput(
|
||||
{ out: opts.out, count: traces.length, format: normalizeFeedbackTraceExportFormat(opts.format) },
|
||||
{ json: true },
|
||||
);
|
||||
return;
|
||||
}
|
||||
console.log(`Wrote ${traces.length} feedback trace(s) to ${opts.out}`);
|
||||
return;
|
||||
}
|
||||
process.stdout.write(`${serialized}${serialized.endsWith("\n") ? "" : "\n"}`);
|
||||
} catch (err) {
|
||||
handleCommandError(err);
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
addCommonClientOptions(
|
||||
issue
|
||||
.command("checkout")
|
||||
|
||||
cli/src/commands/client/zip.ts (new file, 129 lines)
@@ -0,0 +1,129 @@
|
||||
import { inflateRawSync } from "node:zlib";
|
||||
import path from "node:path";
|
||||
import type { CompanyPortabilityFileEntry } from "@paperclipai/shared";
|
||||
|
||||
const textDecoder = new TextDecoder();
|
||||
|
||||
export const binaryContentTypeByExtension: Record<string, string> = {
|
||||
".gif": "image/gif",
|
||||
".jpeg": "image/jpeg",
|
||||
".jpg": "image/jpeg",
|
||||
".png": "image/png",
|
||||
".svg": "image/svg+xml",
|
||||
".webp": "image/webp",
|
||||
};
|
||||
|
||||
function normalizeArchivePath(pathValue: string) {
|
||||
return pathValue
|
||||
.replace(/\\/g, "/")
|
||||
.split("/")
|
||||
.filter(Boolean)
|
||||
.join("/");
|
||||
}
|
||||
|
||||
function readUint16(source: Uint8Array, offset: number) {
  return source[offset]! | (source[offset + 1]! << 8);
}

function readUint32(source: Uint8Array, offset: number) {
  return (
    source[offset]! |
    (source[offset + 1]! << 8) |
    (source[offset + 2]! << 16) |
    (source[offset + 3]! << 24)
  ) >>> 0;
}
|
||||
|
||||
function sharedArchiveRoot(paths: string[]) {
|
||||
if (paths.length === 0) return null;
|
||||
const firstSegments = paths
|
||||
.map((entry) => normalizeArchivePath(entry).split("/").filter(Boolean))
|
||||
.filter((parts) => parts.length > 0);
|
||||
if (firstSegments.length === 0) return null;
|
||||
const candidate = firstSegments[0]![0]!;
|
||||
return firstSegments.every((parts) => parts.length > 1 && parts[0] === candidate)
|
||||
? candidate
|
||||
: null;
|
||||
}
|
||||
|
||||
function bytesToPortableFileEntry(pathValue: string, bytes: Uint8Array): CompanyPortabilityFileEntry {
|
||||
const contentType = binaryContentTypeByExtension[path.extname(pathValue).toLowerCase()];
|
||||
if (!contentType) return textDecoder.decode(bytes);
|
||||
return {
|
||||
encoding: "base64",
|
||||
data: Buffer.from(bytes).toString("base64"),
|
||||
contentType,
|
||||
};
|
||||
}
|
||||
|
||||
async function inflateZipEntry(compressionMethod: number, bytes: Uint8Array) {
|
||||
if (compressionMethod === 0) return bytes;
|
||||
if (compressionMethod !== 8) {
|
||||
throw new Error("Unsupported zip archive: only STORE and DEFLATE entries are supported.");
|
||||
}
|
||||
return new Uint8Array(inflateRawSync(bytes));
|
||||
}
|
||||
|
||||
export async function readZipArchive(source: ArrayBuffer | Uint8Array): Promise<{
|
||||
rootPath: string | null;
|
||||
files: Record<string, CompanyPortabilityFileEntry>;
|
||||
}> {
|
||||
const bytes = source instanceof Uint8Array ? source : new Uint8Array(source);
|
||||
const entries: Array<{ path: string; body: CompanyPortabilityFileEntry }> = [];
|
||||
let offset = 0;
|
||||
|
||||
while (offset + 4 <= bytes.length) {
|
||||
const signature = readUint32(bytes, offset);
|
||||
if (signature === 0x02014b50 || signature === 0x06054b50) break;
|
||||
if (signature !== 0x04034b50) {
|
||||
throw new Error("Invalid zip archive: unsupported local file header.");
|
||||
}
|
||||
|
||||
if (offset + 30 > bytes.length) {
|
||||
throw new Error("Invalid zip archive: truncated local file header.");
|
||||
}
|
||||
|
||||
const generalPurposeFlag = readUint16(bytes, offset + 6);
|
||||
const compressionMethod = readUint16(bytes, offset + 8);
|
||||
const compressedSize = readUint32(bytes, offset + 18);
|
||||
const fileNameLength = readUint16(bytes, offset + 26);
|
||||
const extraFieldLength = readUint16(bytes, offset + 28);
|
||||
|
||||
if ((generalPurposeFlag & 0x0008) !== 0) {
|
||||
throw new Error("Unsupported zip archive: data descriptors are not supported.");
|
||||
}
|
||||
|
||||
const nameOffset = offset + 30;
|
||||
const bodyOffset = nameOffset + fileNameLength + extraFieldLength;
|
||||
const bodyEnd = bodyOffset + compressedSize;
|
||||
if (bodyEnd > bytes.length) {
|
||||
throw new Error("Invalid zip archive: truncated file contents.");
|
||||
}
|
||||
|
||||
const rawArchivePath = textDecoder.decode(bytes.slice(nameOffset, nameOffset + fileNameLength));
|
||||
const archivePath = normalizeArchivePath(rawArchivePath);
|
||||
const isDirectoryEntry = /\/$/.test(rawArchivePath.replace(/\\/g, "/"));
|
||||
if (archivePath && !isDirectoryEntry) {
|
||||
const entryBytes = await inflateZipEntry(compressionMethod, bytes.slice(bodyOffset, bodyEnd));
|
||||
entries.push({
|
||||
path: archivePath,
|
||||
body: bytesToPortableFileEntry(archivePath, entryBytes),
|
||||
});
|
||||
}
|
||||
|
||||
offset = bodyEnd;
|
||||
}
|
||||
|
||||
const rootPath = sharedArchiveRoot(entries.map((entry) => entry.path));
|
||||
const files: Record<string, CompanyPortabilityFileEntry> = {};
|
||||
for (const entry of entries) {
|
||||
const normalizedPath =
|
||||
rootPath && entry.path.startsWith(`${rootPath}/`)
|
||||
? entry.path.slice(rootPath.length + 1)
|
||||
: entry.path;
|
||||
if (!normalizedPath) continue;
|
||||
files[normalizedPath] = entry.body;
|
||||
}
|
||||
|
||||
return { rootPath, files };
|
||||
}
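// Usage sketch (illustrative, variable names assumed): unpacking an archive fetched from the
// API and listing its files after the shared top-level folder has been stripped.
//
//   const { rootPath, files } = await readZipArchive(await response.arrayBuffer());
//   console.log(rootPath, Object.keys(files));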
|
||||
@@ -63,6 +63,9 @@ function defaultConfig(): PaperclipConfig {
|
||||
baseUrlMode: "auto",
|
||||
disableSignUp: false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled: true,
|
||||
},
|
||||
storage: defaultStorageConfig(),
|
||||
secrets: defaultSecretsConfig(),
|
||||
};
|
||||
|
||||
@@ -33,6 +33,11 @@ import {
|
||||
} from "../config/home.js";
|
||||
import { bootstrapCeoInvite } from "./auth-bootstrap-ceo.js";
|
||||
import { printPaperclipCliBanner } from "../utils/banner.js";
|
||||
import {
|
||||
getTelemetryClient,
|
||||
trackInstallStarted,
|
||||
trackInstallCompleted,
|
||||
} from "../telemetry.js";
|
||||
|
||||
type SetupMode = "quickstart" | "advanced";
|
||||
|
||||
@@ -244,11 +249,12 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
|
||||
),
|
||||
);
|
||||
|
||||
let existingConfig: PaperclipConfig | null = null;
|
||||
if (configExists(opts.config)) {
|
||||
p.log.message(pc.dim(`${configPath} exists, updating config`));
|
||||
p.log.message(pc.dim(`${configPath} exists`));
|
||||
|
||||
try {
|
||||
readConfig(opts.config);
|
||||
existingConfig = readConfig(opts.config);
|
||||
} catch (err) {
|
||||
p.log.message(
|
||||
pc.yellow(
|
||||
@@ -258,6 +264,76 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
|
||||
}
|
||||
}
|
||||
|
||||
if (existingConfig) {
|
||||
p.log.message(
|
||||
pc.dim("Existing Paperclip install detected; keeping the current configuration unchanged."),
|
||||
);
|
||||
p.log.message(pc.dim(`Use ${pc.cyan("paperclipai configure")} if you want to change settings.`));
|
||||
|
||||
const jwtSecret = ensureAgentJwtSecret(configPath);
|
||||
const envFilePath = resolveAgentJwtEnvFile(configPath);
|
||||
if (jwtSecret.created) {
|
||||
p.log.success(`Created ${pc.cyan("PAPERCLIP_AGENT_JWT_SECRET")} in ${pc.dim(envFilePath)}`);
|
||||
} else if (process.env.PAPERCLIP_AGENT_JWT_SECRET?.trim()) {
|
||||
p.log.info(`Using existing ${pc.cyan("PAPERCLIP_AGENT_JWT_SECRET")} from environment`);
|
||||
} else {
|
||||
p.log.info(`Using existing ${pc.cyan("PAPERCLIP_AGENT_JWT_SECRET")} in ${pc.dim(envFilePath)}`);
|
||||
}
|
||||
|
||||
const keyResult = ensureLocalSecretsKeyFile(existingConfig, configPath);
|
||||
if (keyResult.status === "created") {
|
||||
p.log.success(`Created local secrets key file at ${pc.dim(keyResult.path)}`);
|
||||
} else if (keyResult.status === "existing") {
|
||||
p.log.message(pc.dim(`Using existing local secrets key file at ${keyResult.path}`));
|
||||
}
|
||||
|
||||
p.note(
|
||||
[
|
||||
"Existing config preserved",
|
||||
`Database: ${existingConfig.database.mode}`,
|
||||
existingConfig.llm ? `LLM: ${existingConfig.llm.provider}` : "LLM: not configured",
|
||||
`Logging: ${existingConfig.logging.mode} -> ${existingConfig.logging.logDir}`,
|
||||
`Server: ${existingConfig.server.deploymentMode}/${existingConfig.server.exposure} @ ${existingConfig.server.host}:${existingConfig.server.port}`,
|
||||
`Allowed hosts: ${existingConfig.server.allowedHostnames.length > 0 ? existingConfig.server.allowedHostnames.join(", ") : "(loopback only)"}`,
|
||||
`Auth URL mode: ${existingConfig.auth.baseUrlMode}${existingConfig.auth.publicBaseUrl ? ` (${existingConfig.auth.publicBaseUrl})` : ""}`,
|
||||
`Storage: ${existingConfig.storage.provider}`,
|
||||
`Secrets: ${existingConfig.secrets.provider} (strict mode ${existingConfig.secrets.strictMode ? "on" : "off"})`,
|
||||
"Agent auth: PAPERCLIP_AGENT_JWT_SECRET configured",
|
||||
].join("\n"),
|
||||
"Configuration ready",
|
||||
);
|
||||
|
||||
p.note(
|
||||
[
|
||||
`Run: ${pc.cyan("paperclipai run")}`,
|
||||
`Reconfigure later: ${pc.cyan("paperclipai configure")}`,
|
||||
`Diagnose setup: ${pc.cyan("paperclipai doctor")}`,
|
||||
].join("\n"),
|
||||
"Next commands",
|
||||
);
|
||||
|
||||
let shouldRunNow = opts.run === true || opts.yes === true;
|
||||
if (!shouldRunNow && !opts.invokedByRun && process.stdin.isTTY && process.stdout.isTTY) {
|
||||
const answer = await p.confirm({
|
||||
message: "Start Paperclip now?",
|
||||
initialValue: true,
|
||||
});
|
||||
if (!p.isCancel(answer)) {
|
||||
shouldRunNow = answer;
|
||||
}
|
||||
}
|
||||
|
||||
if (shouldRunNow && !opts.invokedByRun) {
|
||||
process.env.PAPERCLIP_OPEN_ON_LISTEN = "true";
|
||||
const { runCommand } = await import("./run.js");
|
||||
await runCommand({ config: configPath, repair: true, yes: true });
|
||||
return;
|
||||
}
|
||||
|
||||
p.outro("Existing Paperclip setup is ready.");
|
||||
return;
|
||||
}
|
||||
|
||||
let setupMode: SetupMode = "quickstart";
|
||||
if (opts.yes) {
|
||||
p.log.message(pc.dim("`--yes` enabled: using Quickstart defaults."));
|
||||
@@ -285,6 +361,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
|
||||
setupMode = setupModeChoice as SetupMode;
|
||||
}
|
||||
|
||||
const tc = getTelemetryClient();
|
||||
if (tc) trackInstallStarted(tc);
|
||||
|
||||
let llm: PaperclipConfig["llm"] | undefined;
|
||||
const { defaults: derivedDefaults, usedEnvKeys, ignoredEnvKeys } = quickstartDefaultsFromEnv();
|
||||
let {
|
||||
@@ -417,6 +496,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
|
||||
logging,
|
||||
server,
|
||||
auth,
|
||||
telemetry: {
|
||||
enabled: true,
|
||||
},
|
||||
storage,
|
||||
secrets,
|
||||
};
|
||||
@@ -430,6 +512,10 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
|
||||
|
||||
writeConfig(config, opts.config);
|
||||
|
||||
if (tc) trackInstallCompleted(tc, {
|
||||
adapterType: server.deploymentMode,
|
||||
});
|
||||
|
||||
p.note(
|
||||
[
|
||||
`Database: ${database.mode}`,
|
||||
|
||||
cli/src/commands/routines.ts (new file, 352 lines)
@@ -0,0 +1,352 @@
|
||||
import fs from "node:fs";
|
||||
import net from "node:net";
|
||||
import path from "node:path";
|
||||
import { Command } from "commander";
|
||||
import pc from "picocolors";
|
||||
import {
|
||||
applyPendingMigrations,
|
||||
createDb,
|
||||
createEmbeddedPostgresLogBuffer,
|
||||
ensurePostgresDatabase,
|
||||
formatEmbeddedPostgresError,
|
||||
routines,
|
||||
} from "@paperclipai/db";
|
||||
import { eq, inArray } from "drizzle-orm";
|
||||
import { loadPaperclipEnvFile } from "../config/env.js";
|
||||
import { readConfig, resolveConfigPath } from "../config/store.js";
|
||||
|
||||
type RoutinesDisableAllOptions = {
|
||||
config?: string;
|
||||
dataDir?: string;
|
||||
companyId?: string;
|
||||
json?: boolean;
|
||||
};
|
||||
|
||||
type DisableAllRoutinesResult = {
|
||||
companyId: string;
|
||||
totalRoutines: number;
|
||||
pausedCount: number;
|
||||
alreadyPausedCount: number;
|
||||
archivedCount: number;
|
||||
};
|
||||
|
||||
type EmbeddedPostgresInstance = {
|
||||
initialise(): Promise<void>;
|
||||
start(): Promise<void>;
|
||||
stop(): Promise<void>;
|
||||
};
|
||||
|
||||
type EmbeddedPostgresCtor = new (opts: {
|
||||
databaseDir: string;
|
||||
user: string;
|
||||
password: string;
|
||||
port: number;
|
||||
persistent: boolean;
|
||||
initdbFlags?: string[];
|
||||
onLog?: (message: unknown) => void;
|
||||
onError?: (message: unknown) => void;
|
||||
}) => EmbeddedPostgresInstance;
|
||||
|
||||
type EmbeddedPostgresHandle = {
|
||||
port: number;
|
||||
startedByThisProcess: boolean;
|
||||
stop: () => Promise<void>;
|
||||
};
|
||||
|
||||
type ClosableDb = ReturnType<typeof createDb> & {
|
||||
$client?: {
|
||||
end?: (options?: { timeout?: number }) => Promise<void>;
|
||||
};
|
||||
};
|
||||
|
||||
function nonEmpty(value: string | null | undefined): string | null {
  return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}

async function isPortAvailable(port: number): Promise<boolean> {
  return await new Promise<boolean>((resolve) => {
    const server = net.createServer();
    server.unref();
    server.once("error", () => resolve(false));
    server.listen(port, "127.0.0.1", () => {
      server.close(() => resolve(true));
    });
  });
}

async function findAvailablePort(preferredPort: number): Promise<number> {
  let port = Math.max(1, Math.trunc(preferredPort));
  while (!(await isPortAvailable(port))) {
    port += 1;
  }
  return port;
}
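// Port probing above binds a throwaway server on 127.0.0.1 and walks upward from the
// preferred port until a bind succeeds; there is an inherent race between the probe and the
// later Postgres start, which is presumably acceptable for a local CLI utility.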
|
||||
|
||||
function readPidFilePort(postmasterPidFile: string): number | null {
|
||||
if (!fs.existsSync(postmasterPidFile)) return null;
|
||||
try {
|
||||
const lines = fs.readFileSync(postmasterPidFile, "utf8").split("\n");
|
||||
const port = Number(lines[3]?.trim());
|
||||
return Number.isInteger(port) && port > 0 ? port : null;
|
||||
} catch {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
function readRunningPostmasterPid(postmasterPidFile: string): number | null {
|
||||
if (!fs.existsSync(postmasterPidFile)) return null;
|
||||
try {
|
||||
const pid = Number(fs.readFileSync(postmasterPidFile, "utf8").split("\n")[0]?.trim());
|
||||
if (!Number.isInteger(pid) || pid <= 0) return null;
|
||||
process.kill(pid, 0);
|
||||
return pid;
|
||||
} catch {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
async function ensureEmbeddedPostgres(dataDir: string, preferredPort: number): Promise<EmbeddedPostgresHandle> {
|
||||
const moduleName = "embedded-postgres";
|
||||
let EmbeddedPostgres: EmbeddedPostgresCtor;
|
||||
try {
|
||||
const mod = await import(moduleName);
|
||||
EmbeddedPostgres = mod.default as EmbeddedPostgresCtor;
|
||||
} catch {
|
||||
throw new Error(
|
||||
"Embedded PostgreSQL support requires dependency `embedded-postgres`. Reinstall dependencies and try again.",
|
||||
);
|
||||
}
|
||||
|
||||
const postmasterPidFile = path.resolve(dataDir, "postmaster.pid");
|
||||
const runningPid = readRunningPostmasterPid(postmasterPidFile);
|
||||
if (runningPid) {
|
||||
return {
|
||||
port: readPidFilePort(postmasterPidFile) ?? preferredPort,
|
||||
startedByThisProcess: false,
|
||||
stop: async () => {},
|
||||
};
|
||||
}
|
||||
|
||||
const port = await findAvailablePort(preferredPort);
|
||||
const logBuffer = createEmbeddedPostgresLogBuffer();
|
||||
const instance = new EmbeddedPostgres({
|
||||
databaseDir: dataDir,
|
||||
user: "paperclip",
|
||||
password: "paperclip",
|
||||
port,
|
||||
persistent: true,
|
||||
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
|
||||
onLog: logBuffer.append,
|
||||
onError: logBuffer.append,
|
||||
});
|
||||
|
||||
if (!fs.existsSync(path.resolve(dataDir, "PG_VERSION"))) {
|
||||
try {
|
||||
await instance.initialise();
|
||||
} catch (error) {
|
||||
throw formatEmbeddedPostgresError(error, {
|
||||
fallbackMessage: `Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${port}`,
|
||||
recentLogs: logBuffer.getRecentLogs(),
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
if (fs.existsSync(postmasterPidFile)) {
|
||||
fs.rmSync(postmasterPidFile, { force: true });
|
||||
}
|
||||
|
||||
try {
|
||||
await instance.start();
|
||||
} catch (error) {
|
||||
throw formatEmbeddedPostgresError(error, {
|
||||
fallbackMessage: `Failed to start embedded PostgreSQL on port ${port}`,
|
||||
recentLogs: logBuffer.getRecentLogs(),
|
||||
});
|
||||
}
|
||||
|
||||
return {
|
||||
port,
|
||||
startedByThisProcess: true,
|
||||
stop: async () => {
|
||||
await instance.stop();
|
||||
},
|
||||
};
|
||||
}
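// Behaviour summary for the helper above: if postmaster.pid names a live process the existing
// cluster is reused (its stop() becomes a no-op); otherwise the cluster is initialised on
// first use (no PG_VERSION yet), any stale postmaster.pid is removed, and Postgres is started
// on the first free port at or above the configured one, with recent embedded-postgres logs
// attached to any initialise/start failure.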
|
||||
|
||||
async function closeDb(db: ClosableDb): Promise<void> {
|
||||
await db.$client?.end?.({ timeout: 5 }).catch(() => undefined);
|
||||
}
|
||||
|
||||
async function openConfiguredDb(configPath: string): Promise<{
|
||||
db: ClosableDb;
|
||||
stop: () => Promise<void>;
|
||||
}> {
|
||||
const config = readConfig(configPath);
|
||||
if (!config) {
|
||||
throw new Error(`Config not found at ${configPath}.`);
|
||||
}
|
||||
|
||||
let embeddedHandle: EmbeddedPostgresHandle | null = null;
|
||||
try {
|
||||
if (config.database.mode === "embedded-postgres") {
|
||||
embeddedHandle = await ensureEmbeddedPostgres(
|
||||
config.database.embeddedPostgresDataDir,
|
||||
config.database.embeddedPostgresPort,
|
||||
);
|
||||
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
|
||||
await ensurePostgresDatabase(adminConnectionString, "paperclip");
|
||||
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
|
||||
await applyPendingMigrations(connectionString);
|
||||
const db = createDb(connectionString) as ClosableDb;
|
||||
return {
|
||||
db,
|
||||
stop: async () => {
|
||||
await closeDb(db);
|
||||
if (embeddedHandle?.startedByThisProcess) {
|
||||
await embeddedHandle.stop().catch(() => undefined);
|
||||
}
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
const connectionString = nonEmpty(config.database.connectionString);
|
||||
if (!connectionString) {
|
||||
throw new Error(`Config at ${configPath} does not define a database connection string.`);
|
||||
}
|
||||
|
||||
await applyPendingMigrations(connectionString);
|
||||
const db = createDb(connectionString) as ClosableDb;
|
||||
return {
|
||||
db,
|
||||
stop: async () => {
|
||||
await closeDb(db);
|
||||
},
|
||||
};
|
||||
} catch (error) {
|
||||
if (embeddedHandle?.startedByThisProcess) {
|
||||
await embeddedHandle.stop().catch(() => undefined);
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
export async function disableAllRoutinesInConfig(
|
||||
options: Pick<RoutinesDisableAllOptions, "config" | "companyId">,
|
||||
): Promise<DisableAllRoutinesResult> {
|
||||
const configPath = resolveConfigPath(options.config);
|
||||
loadPaperclipEnvFile(configPath);
|
||||
const companyId =
|
||||
nonEmpty(options.companyId)
|
||||
?? nonEmpty(process.env.PAPERCLIP_COMPANY_ID)
|
||||
?? null;
|
||||
if (!companyId) {
|
||||
throw new Error("Company ID is required. Pass --company-id or set PAPERCLIP_COMPANY_ID.");
|
||||
}
|
||||
|
||||
const config = readConfig(configPath);
|
||||
if (!config) {
|
||||
throw new Error(`Config not found at ${configPath}.`);
|
||||
}
|
||||
|
||||
let embeddedHandle: EmbeddedPostgresHandle | null = null;
|
||||
let db: ClosableDb | null = null;
|
||||
try {
|
||||
if (config.database.mode === "embedded-postgres") {
|
||||
embeddedHandle = await ensureEmbeddedPostgres(
|
||||
config.database.embeddedPostgresDataDir,
|
||||
config.database.embeddedPostgresPort,
|
||||
);
|
||||
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
|
||||
await ensurePostgresDatabase(adminConnectionString, "paperclip");
|
||||
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
|
||||
await applyPendingMigrations(connectionString);
|
||||
db = createDb(connectionString) as ClosableDb;
|
||||
} else {
|
||||
const connectionString = nonEmpty(config.database.connectionString);
|
||||
if (!connectionString) {
|
||||
throw new Error(`Config at ${configPath} does not define a database connection string.`);
|
||||
}
|
||||
await applyPendingMigrations(connectionString);
|
||||
db = createDb(connectionString) as ClosableDb;
|
||||
}
|
||||
|
||||
const existing = await db
|
||||
.select({
|
||||
id: routines.id,
|
||||
status: routines.status,
|
||||
})
|
||||
.from(routines)
|
||||
.where(eq(routines.companyId, companyId));
|
||||
|
||||
const alreadyPausedCount = existing.filter((routine) => routine.status === "paused").length;
|
||||
const archivedCount = existing.filter((routine) => routine.status === "archived").length;
|
||||
const idsToPause = existing
|
||||
.filter((routine) => routine.status !== "paused" && routine.status !== "archived")
|
||||
.map((routine) => routine.id);
|
||||
|
||||
if (idsToPause.length > 0) {
|
||||
await db
|
||||
.update(routines)
|
||||
.set({
|
||||
status: "paused",
|
||||
updatedAt: new Date(),
|
||||
})
|
||||
.where(inArray(routines.id, idsToPause));
|
||||
}
|
||||
|
||||
return {
|
||||
companyId,
|
||||
totalRoutines: existing.length,
|
||||
pausedCount: idsToPause.length,
|
||||
alreadyPausedCount,
|
||||
archivedCount,
|
||||
};
|
||||
} finally {
|
||||
if (db) {
|
||||
await closeDb(db);
|
||||
}
|
||||
if (embeddedHandle?.startedByThisProcess) {
|
||||
await embeddedHandle.stop().catch(() => undefined);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export async function disableAllRoutinesCommand(options: RoutinesDisableAllOptions): Promise<void> {
|
||||
const result = await disableAllRoutinesInConfig(options);
|
||||
|
||||
if (options.json) {
|
||||
console.log(JSON.stringify(result, null, 2));
|
||||
return;
|
||||
}
|
||||
|
||||
if (result.totalRoutines === 0) {
|
||||
console.log(pc.dim(`No routines found for company ${result.companyId}.`));
|
||||
return;
|
||||
}
|
||||
|
||||
console.log(
|
||||
`Paused ${result.pausedCount} routine(s) for company ${result.companyId} ` +
|
||||
`(${result.alreadyPausedCount} already paused, ${result.archivedCount} archived).`,
|
||||
);
|
||||
}
|
||||
|
||||
export function registerRoutineCommands(program: Command): void {
|
||||
const routinesCommand = program.command("routines").description("Local routine maintenance commands");
|
||||
|
||||
routinesCommand
|
||||
.command("disable-all")
|
||||
.description("Pause all non-archived routines in the configured local instance for one company")
|
||||
.option("-c, --config <path>", "Path to config file")
|
||||
.option("-d, --data-dir <path>", "Paperclip data directory root (isolates state from ~/.paperclip)")
|
||||
.option("-C, --company-id <id>", "Company ID")
|
||||
.option("--json", "Output raw JSON")
|
||||
.action(async (opts: RoutinesDisableAllOptions) => {
|
||||
try {
|
||||
await disableAllRoutinesCommand(opts);
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : String(error);
|
||||
console.error(pc.red(message));
|
||||
process.exit(1);
|
||||
}
|
||||
});
|
||||
}
|
||||
@@ -224,6 +224,9 @@ export function buildWorktreeConfig(input: {
|
||||
...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
|
||||
disableSignUp: source?.auth.disableSignUp ?? false,
|
||||
},
|
||||
telemetry: {
|
||||
enabled: source?.telemetry?.enabled ?? true,
|
||||
},
|
||||
storage: {
|
||||
provider: source?.storage.provider ?? "local_disk",
|
||||
localDisk: {
|
||||
|
||||
cli/src/commands/worktree-merge-history-lib.ts (new file, 764 lines)
@@ -0,0 +1,764 @@
|
||||
import {
|
||||
agents,
|
||||
assets,
|
||||
documentRevisions,
|
||||
goals,
|
||||
issueAttachments,
|
||||
issueComments,
|
||||
issueDocuments,
|
||||
issues,
|
||||
projects,
|
||||
projectWorkspaces,
|
||||
} from "@paperclipai/db";
|
||||
|
||||
type IssueRow = typeof issues.$inferSelect;
|
||||
type CommentRow = typeof issueComments.$inferSelect;
|
||||
type AgentRow = typeof agents.$inferSelect;
|
||||
type ProjectRow = typeof projects.$inferSelect;
|
||||
type ProjectWorkspaceRow = typeof projectWorkspaces.$inferSelect;
|
||||
type GoalRow = typeof goals.$inferSelect;
|
||||
type IssueDocumentLinkRow = typeof issueDocuments.$inferSelect;
|
||||
type DocumentRevisionTableRow = typeof documentRevisions.$inferSelect;
|
||||
type IssueAttachmentTableRow = typeof issueAttachments.$inferSelect;
|
||||
type AssetRow = typeof assets.$inferSelect;
|
||||
|
||||
export const WORKTREE_MERGE_SCOPES = ["issues", "comments"] as const;
|
||||
export type WorktreeMergeScope = (typeof WORKTREE_MERGE_SCOPES)[number];
|
||||
|
||||
export type ImportAdjustment =
|
||||
| "clear_assignee_agent"
|
||||
| "clear_project"
|
||||
| "clear_project_workspace"
|
||||
| "clear_goal"
|
||||
| "clear_author_agent"
|
||||
| "coerce_in_progress_to_todo"
|
||||
| "clear_document_agent"
|
||||
| "clear_document_revision_agent"
|
||||
| "clear_attachment_agent";
|
||||
|
||||
export type IssueMergeAction = "skip_existing" | "insert";
|
||||
export type CommentMergeAction = "skip_existing" | "skip_missing_parent" | "insert";
|
||||
|
||||
export type PlannedIssueInsert = {
|
||||
source: IssueRow;
|
||||
action: "insert";
|
||||
previewIssueNumber: number;
|
||||
previewIdentifier: string;
|
||||
targetStatus: string;
|
||||
targetAssigneeAgentId: string | null;
|
||||
targetCreatedByAgentId: string | null;
|
||||
targetProjectId: string | null;
|
||||
targetProjectWorkspaceId: string | null;
|
||||
targetGoalId: string | null;
|
||||
projectResolution: "preserved" | "cleared" | "mapped" | "imported";
|
||||
mappedProjectName: string | null;
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedIssueSkip = {
|
||||
source: IssueRow;
|
||||
action: "skip_existing";
|
||||
driftKeys: string[];
|
||||
};
|
||||
|
||||
export type PlannedCommentInsert = {
|
||||
source: CommentRow;
|
||||
action: "insert";
|
||||
targetAuthorAgentId: string | null;
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedCommentSkip = {
|
||||
source: CommentRow;
|
||||
action: "skip_existing" | "skip_missing_parent";
|
||||
};
|
||||
|
||||
export type IssueDocumentRow = {
|
||||
id: IssueDocumentLinkRow["id"];
|
||||
companyId: IssueDocumentLinkRow["companyId"];
|
||||
issueId: IssueDocumentLinkRow["issueId"];
|
||||
documentId: IssueDocumentLinkRow["documentId"];
|
||||
key: IssueDocumentLinkRow["key"];
|
||||
linkCreatedAt: IssueDocumentLinkRow["createdAt"];
|
||||
linkUpdatedAt: IssueDocumentLinkRow["updatedAt"];
|
||||
title: string | null;
|
||||
format: string;
|
||||
latestBody: string;
|
||||
latestRevisionId: string | null;
|
||||
latestRevisionNumber: number;
|
||||
createdByAgentId: string | null;
|
||||
createdByUserId: string | null;
|
||||
updatedByAgentId: string | null;
|
||||
updatedByUserId: string | null;
|
||||
documentCreatedAt: Date;
|
||||
documentUpdatedAt: Date;
|
||||
};
|
||||
|
||||
export type DocumentRevisionRow = {
|
||||
id: DocumentRevisionTableRow["id"];
|
||||
companyId: DocumentRevisionTableRow["companyId"];
|
||||
documentId: DocumentRevisionTableRow["documentId"];
|
||||
revisionNumber: DocumentRevisionTableRow["revisionNumber"];
|
||||
body: DocumentRevisionTableRow["body"];
|
||||
changeSummary: DocumentRevisionTableRow["changeSummary"];
|
||||
createdByAgentId: string | null;
|
||||
createdByUserId: string | null;
|
||||
createdAt: Date;
|
||||
};
|
||||
|
||||
export type IssueAttachmentRow = {
|
||||
id: IssueAttachmentTableRow["id"];
|
||||
companyId: IssueAttachmentTableRow["companyId"];
|
||||
issueId: IssueAttachmentTableRow["issueId"];
|
||||
issueCommentId: IssueAttachmentTableRow["issueCommentId"];
|
||||
assetId: IssueAttachmentTableRow["assetId"];
|
||||
provider: AssetRow["provider"];
|
||||
objectKey: AssetRow["objectKey"];
|
||||
contentType: AssetRow["contentType"];
|
||||
byteSize: AssetRow["byteSize"];
|
||||
sha256: AssetRow["sha256"];
|
||||
originalFilename: AssetRow["originalFilename"];
|
||||
createdByAgentId: string | null;
|
||||
createdByUserId: string | null;
|
||||
assetCreatedAt: Date;
|
||||
assetUpdatedAt: Date;
|
||||
attachmentCreatedAt: Date;
|
||||
attachmentUpdatedAt: Date;
|
||||
};
|
||||
|
||||
export type PlannedDocumentRevisionInsert = {
|
||||
source: DocumentRevisionRow;
|
||||
targetRevisionNumber: number;
|
||||
targetCreatedByAgentId: string | null;
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedIssueDocumentInsert = {
|
||||
source: IssueDocumentRow;
|
||||
action: "insert";
|
||||
targetCreatedByAgentId: string | null;
|
||||
targetUpdatedByAgentId: string | null;
|
||||
latestRevisionId: string | null;
|
||||
latestRevisionNumber: number;
|
||||
revisionsToInsert: PlannedDocumentRevisionInsert[];
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedIssueDocumentMerge = {
|
||||
source: IssueDocumentRow;
|
||||
action: "merge_existing";
|
||||
targetCreatedByAgentId: string | null;
|
||||
targetUpdatedByAgentId: string | null;
|
||||
latestRevisionId: string | null;
|
||||
latestRevisionNumber: number;
|
||||
revisionsToInsert: PlannedDocumentRevisionInsert[];
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedIssueDocumentSkip = {
|
||||
source: IssueDocumentRow;
|
||||
action: "skip_existing" | "skip_missing_parent" | "skip_conflicting_key";
|
||||
};
|
||||
|
||||
export type PlannedAttachmentInsert = {
|
||||
source: IssueAttachmentRow;
|
||||
action: "insert";
|
||||
targetIssueCommentId: string | null;
|
||||
targetCreatedByAgentId: string | null;
|
||||
adjustments: ImportAdjustment[];
|
||||
};
|
||||
|
||||
export type PlannedAttachmentSkip = {
|
||||
source: IssueAttachmentRow;
|
||||
action: "skip_existing" | "skip_missing_parent";
|
||||
};
|
||||
|
||||
export type PlannedProjectImport = {
|
||||
source: ProjectRow;
|
||||
targetLeadAgentId: string | null;
|
||||
targetGoalId: string | null;
|
||||
workspaces: ProjectWorkspaceRow[];
|
||||
};
|
||||
|
||||
export type WorktreeMergePlan = {
|
||||
companyId: string;
|
||||
companyName: string;
|
||||
issuePrefix: string;
|
||||
previewIssueCounterStart: number;
|
||||
scopes: WorktreeMergeScope[];
|
||||
projectImports: PlannedProjectImport[];
|
||||
issuePlans: Array<PlannedIssueInsert | PlannedIssueSkip>;
|
||||
commentPlans: Array<PlannedCommentInsert | PlannedCommentSkip>;
|
||||
documentPlans: Array<PlannedIssueDocumentInsert | PlannedIssueDocumentMerge | PlannedIssueDocumentSkip>;
|
||||
attachmentPlans: Array<PlannedAttachmentInsert | PlannedAttachmentSkip>;
|
||||
counts: {
|
||||
projectsToImport: number;
|
||||
issuesToInsert: number;
|
||||
issuesExisting: number;
|
||||
issueDrift: number;
|
||||
commentsToInsert: number;
|
||||
commentsExisting: number;
|
||||
commentsMissingParent: number;
|
||||
documentsToInsert: number;
|
||||
documentsToMerge: number;
|
||||
documentsExisting: number;
|
||||
documentsConflictingKey: number;
|
||||
documentsMissingParent: number;
|
||||
documentRevisionsToInsert: number;
|
||||
attachmentsToInsert: number;
|
||||
attachmentsExisting: number;
|
||||
attachmentsMissingParent: number;
|
||||
};
|
||||
adjustments: Record<ImportAdjustment, number>;
|
||||
};
|
||||
|
||||
function compareIssueCoreFields(source: IssueRow, target: IssueRow): string[] {
|
||||
const driftKeys: string[] = [];
|
||||
if (source.title !== target.title) driftKeys.push("title");
|
||||
if ((source.description ?? null) !== (target.description ?? null)) driftKeys.push("description");
|
||||
if (source.status !== target.status) driftKeys.push("status");
|
||||
  if (source.priority !== target.priority) driftKeys.push("priority");
  if ((source.parentId ?? null) !== (target.parentId ?? null)) driftKeys.push("parentId");
  if ((source.projectId ?? null) !== (target.projectId ?? null)) driftKeys.push("projectId");
  if ((source.projectWorkspaceId ?? null) !== (target.projectWorkspaceId ?? null)) driftKeys.push("projectWorkspaceId");
  if ((source.goalId ?? null) !== (target.goalId ?? null)) driftKeys.push("goalId");
  if ((source.assigneeAgentId ?? null) !== (target.assigneeAgentId ?? null)) driftKeys.push("assigneeAgentId");
  if ((source.assigneeUserId ?? null) !== (target.assigneeUserId ?? null)) driftKeys.push("assigneeUserId");
  return driftKeys;
}

function incrementAdjustment(
  counts: Record<ImportAdjustment, number>,
  adjustment: ImportAdjustment,
): void {
  counts[adjustment] += 1;
}

function groupBy<T>(rows: T[], keyFor: (row: T) => string): Map<string, T[]> {
  const out = new Map<string, T[]>();
  for (const row of rows) {
    const key = keyFor(row);
    const existing = out.get(key);
    if (existing) {
      existing.push(row);
    } else {
      out.set(key, [row]);
    }
  }
  return out;
}

function sameDate(left: Date, right: Date): boolean {
  return left.getTime() === right.getTime();
}

function sortDocumentRows(rows: IssueDocumentRow[]): IssueDocumentRow[] {
  return [...rows].sort((left, right) => {
    const createdDelta = left.documentCreatedAt.getTime() - right.documentCreatedAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    const linkDelta = left.linkCreatedAt.getTime() - right.linkCreatedAt.getTime();
    if (linkDelta !== 0) return linkDelta;
    return left.documentId.localeCompare(right.documentId);
  });
}

function sortDocumentRevisions(rows: DocumentRevisionRow[]): DocumentRevisionRow[] {
  return [...rows].sort((left, right) => {
    const revisionDelta = left.revisionNumber - right.revisionNumber;
    if (revisionDelta !== 0) return revisionDelta;
    const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

function sortAttachments(rows: IssueAttachmentRow[]): IssueAttachmentRow[] {
  return [...rows].sort((left, right) => {
    const createdDelta = left.attachmentCreatedAt.getTime() - right.attachmentCreatedAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

function sortIssuesForImport(sourceIssues: IssueRow[]): IssueRow[] {
  const byId = new Map(sourceIssues.map((issue) => [issue.id, issue]));
  const memoDepth = new Map<string, number>();

  const depthFor = (issue: IssueRow, stack = new Set<string>()): number => {
    const memoized = memoDepth.get(issue.id);
    if (memoized !== undefined) return memoized;
    if (!issue.parentId) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    if (stack.has(issue.id)) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    const parent = byId.get(issue.parentId);
    if (!parent) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    stack.add(issue.id);
    const depth = depthFor(parent, stack) + 1;
    stack.delete(issue.id);
    memoDepth.set(issue.id, depth);
    return depth;
  };

  return [...sourceIssues].sort((left, right) => {
    const depthDelta = depthFor(left) - depthFor(right);
    if (depthDelta !== 0) return depthDelta;
    const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

export function parseWorktreeMergeScopes(rawValue: string | undefined): WorktreeMergeScope[] {
  if (!rawValue || rawValue.trim().length === 0) {
    return ["issues", "comments"];
  }

  const parsed = rawValue
    .split(",")
    .map((value) => value.trim().toLowerCase())
    .filter((value): value is WorktreeMergeScope =>
      (WORKTREE_MERGE_SCOPES as readonly string[]).includes(value),
    );

  if (parsed.length === 0) {
    throw new Error(
      `Invalid scope "${rawValue}". Expected a comma-separated list of: ${WORKTREE_MERGE_SCOPES.join(", ")}.`,
    );
  }

  return [...new Set(parsed)];
}

export function buildWorktreeMergePlan(input: {
  companyId: string;
  companyName: string;
  issuePrefix: string;
  previewIssueCounterStart: number;
  scopes: WorktreeMergeScope[];
  sourceIssues: IssueRow[];
  targetIssues: IssueRow[];
  sourceComments: CommentRow[];
  targetComments: CommentRow[];
  sourceProjects?: ProjectRow[];
  sourceProjectWorkspaces?: ProjectWorkspaceRow[];
  sourceDocuments?: IssueDocumentRow[];
  targetDocuments?: IssueDocumentRow[];
  sourceDocumentRevisions?: DocumentRevisionRow[];
  targetDocumentRevisions?: DocumentRevisionRow[];
  sourceAttachments?: IssueAttachmentRow[];
  targetAttachments?: IssueAttachmentRow[];
  targetAgents: AgentRow[];
  targetProjects: ProjectRow[];
  targetProjectWorkspaces: ProjectWorkspaceRow[];
  targetGoals: GoalRow[];
  importProjectIds?: Iterable<string>;
  projectIdOverrides?: Record<string, string | null | undefined>;
}): WorktreeMergePlan {
  const targetIssuesById = new Map(input.targetIssues.map((issue) => [issue.id, issue]));
  const targetCommentIds = new Set(input.targetComments.map((comment) => comment.id));
  const targetAgentIds = new Set(input.targetAgents.map((agent) => agent.id));
  const targetProjectIds = new Set(input.targetProjects.map((project) => project.id));
  const targetProjectsById = new Map(input.targetProjects.map((project) => [project.id, project]));
  const targetProjectWorkspaceIds = new Set(input.targetProjectWorkspaces.map((workspace) => workspace.id));
  const targetGoalIds = new Set(input.targetGoals.map((goal) => goal.id));
  const sourceProjectsById = new Map((input.sourceProjects ?? []).map((project) => [project.id, project]));
  const sourceProjectWorkspaces = input.sourceProjectWorkspaces ?? [];
  const sourceProjectWorkspacesByProjectId = groupBy(sourceProjectWorkspaces, (workspace) => workspace.projectId);
  const importProjectIds = new Set(input.importProjectIds ?? []);
  const scopes = new Set(input.scopes);

  const adjustmentCounts: Record<ImportAdjustment, number> = {
    clear_assignee_agent: 0,
    clear_project: 0,
    clear_project_workspace: 0,
    clear_goal: 0,
    clear_author_agent: 0,
    coerce_in_progress_to_todo: 0,
    clear_document_agent: 0,
    clear_document_revision_agent: 0,
    clear_attachment_agent: 0,
  };

  const projectImports: PlannedProjectImport[] = [];
  for (const projectId of importProjectIds) {
    if (targetProjectIds.has(projectId)) continue;
    const sourceProject = sourceProjectsById.get(projectId);
    if (!sourceProject) continue;
    projectImports.push({
      source: sourceProject,
      targetLeadAgentId:
        sourceProject.leadAgentId && targetAgentIds.has(sourceProject.leadAgentId)
          ? sourceProject.leadAgentId
          : null,
      targetGoalId:
        sourceProject.goalId && targetGoalIds.has(sourceProject.goalId)
          ? sourceProject.goalId
          : null,
      workspaces: [...(sourceProjectWorkspacesByProjectId.get(projectId) ?? [])].sort((left, right) => {
        const primaryDelta = Number(right.isPrimary) - Number(left.isPrimary);
        if (primaryDelta !== 0) return primaryDelta;
        const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
        if (createdDelta !== 0) return createdDelta;
        return left.id.localeCompare(right.id);
      }),
    });
  }
  const importedProjectWorkspaceIds = new Set(
    projectImports.flatMap((project) => project.workspaces.map((workspace) => workspace.id)),
  );

  const issuePlans: Array<PlannedIssueInsert | PlannedIssueSkip> = [];
  let nextPreviewIssueNumber = input.previewIssueCounterStart;
  for (const issue of sortIssuesForImport(input.sourceIssues)) {
    const existing = targetIssuesById.get(issue.id);
    if (existing) {
      issuePlans.push({
        source: issue,
        action: "skip_existing",
        driftKeys: compareIssueCoreFields(issue, existing),
      });
      continue;
    }

    nextPreviewIssueNumber += 1;
    const adjustments: ImportAdjustment[] = [];
    const targetAssigneeAgentId =
      issue.assigneeAgentId && targetAgentIds.has(issue.assigneeAgentId) ? issue.assigneeAgentId : null;
    if (issue.assigneeAgentId && !targetAssigneeAgentId) {
      adjustments.push("clear_assignee_agent");
      incrementAdjustment(adjustmentCounts, "clear_assignee_agent");
    }

    const targetCreatedByAgentId =
      issue.createdByAgentId && targetAgentIds.has(issue.createdByAgentId) ? issue.createdByAgentId : null;

    let targetProjectId =
      issue.projectId && targetProjectIds.has(issue.projectId) ? issue.projectId : null;
    let projectResolution: PlannedIssueInsert["projectResolution"] = targetProjectId ? "preserved" : "cleared";
    let mappedProjectName: string | null = null;
    const overrideProjectId =
      issue.projectId && input.projectIdOverrides
        ? input.projectIdOverrides[issue.projectId] ?? null
        : null;
    if (!targetProjectId && overrideProjectId && targetProjectIds.has(overrideProjectId)) {
      targetProjectId = overrideProjectId;
      projectResolution = "mapped";
      mappedProjectName = targetProjectsById.get(overrideProjectId)?.name ?? null;
    }
    if (!targetProjectId && issue.projectId && importProjectIds.has(issue.projectId)) {
      const sourceProject = sourceProjectsById.get(issue.projectId);
      if (sourceProject) {
        targetProjectId = sourceProject.id;
        projectResolution = "imported";
        mappedProjectName = sourceProject.name;
      }
    }
    if (issue.projectId && !targetProjectId) {
      adjustments.push("clear_project");
      incrementAdjustment(adjustmentCounts, "clear_project");
    }

    const targetProjectWorkspaceId =
      targetProjectId
        && targetProjectId === issue.projectId
        && issue.projectWorkspaceId
        && (targetProjectWorkspaceIds.has(issue.projectWorkspaceId)
          || importedProjectWorkspaceIds.has(issue.projectWorkspaceId))
        ? issue.projectWorkspaceId
        : null;
    if (issue.projectWorkspaceId && !targetProjectWorkspaceId) {
      adjustments.push("clear_project_workspace");
      incrementAdjustment(adjustmentCounts, "clear_project_workspace");
    }

    const targetGoalId =
      issue.goalId && targetGoalIds.has(issue.goalId) ? issue.goalId : null;
    if (issue.goalId && !targetGoalId) {
      adjustments.push("clear_goal");
      incrementAdjustment(adjustmentCounts, "clear_goal");
    }

    let targetStatus = issue.status;
    if (
      targetStatus === "in_progress"
      && !targetAssigneeAgentId
      && !(issue.assigneeUserId && issue.assigneeUserId.trim().length > 0)
    ) {
      targetStatus = "todo";
      adjustments.push("coerce_in_progress_to_todo");
      incrementAdjustment(adjustmentCounts, "coerce_in_progress_to_todo");
    }

    issuePlans.push({
      source: issue,
      action: "insert",
      previewIssueNumber: nextPreviewIssueNumber,
      previewIdentifier: `${input.issuePrefix}-${nextPreviewIssueNumber}`,
      targetStatus,
      targetAssigneeAgentId,
      targetCreatedByAgentId,
      targetProjectId,
      targetProjectWorkspaceId,
      targetGoalId,
      projectResolution,
      mappedProjectName,
      adjustments,
    });
  }

  const issueIdsAvailableAfterImport = new Set<string>([
    ...input.targetIssues.map((issue) => issue.id),
    ...issuePlans.filter((plan): plan is PlannedIssueInsert => plan.action === "insert").map((plan) => plan.source.id),
  ]);

  const commentPlans: Array<PlannedCommentInsert | PlannedCommentSkip> = [];
  if (scopes.has("comments")) {
    const sortedComments = [...input.sourceComments].sort((left, right) => {
      const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
      if (createdDelta !== 0) return createdDelta;
      return left.id.localeCompare(right.id);
    });

    for (const comment of sortedComments) {
      if (targetCommentIds.has(comment.id)) {
        commentPlans.push({ source: comment, action: "skip_existing" });
        continue;
      }
      if (!issueIdsAvailableAfterImport.has(comment.issueId)) {
        commentPlans.push({ source: comment, action: "skip_missing_parent" });
        continue;
      }

      const adjustments: ImportAdjustment[] = [];
      const targetAuthorAgentId =
        comment.authorAgentId && targetAgentIds.has(comment.authorAgentId) ? comment.authorAgentId : null;
      if (comment.authorAgentId && !targetAuthorAgentId) {
        adjustments.push("clear_author_agent");
        incrementAdjustment(adjustmentCounts, "clear_author_agent");
      }

      commentPlans.push({
        source: comment,
        action: "insert",
        targetAuthorAgentId,
        adjustments,
      });
    }
  }

  const sourceDocuments = input.sourceDocuments ?? [];
  const targetDocuments = input.targetDocuments ?? [];
  const sourceDocumentRevisions = input.sourceDocumentRevisions ?? [];
  const targetDocumentRevisions = input.targetDocumentRevisions ?? [];

  const targetDocumentsById = new Map(targetDocuments.map((document) => [document.documentId, document]));
  const targetDocumentsByIssueKey = new Map(targetDocuments.map((document) => [`${document.issueId}:${document.key}`, document]));
  const sourceRevisionsByDocumentId = groupBy(sourceDocumentRevisions, (revision) => revision.documentId);
  const targetRevisionsByDocumentId = groupBy(targetDocumentRevisions, (revision) => revision.documentId);
  const commentIdsAvailableAfterImport = new Set<string>([
    ...input.targetComments.map((comment) => comment.id),
    ...commentPlans.filter((plan): plan is PlannedCommentInsert => plan.action === "insert").map((plan) => plan.source.id),
  ]);

  const documentPlans: Array<PlannedIssueDocumentInsert | PlannedIssueDocumentMerge | PlannedIssueDocumentSkip> = [];
  for (const document of sortDocumentRows(sourceDocuments)) {
    if (!issueIdsAvailableAfterImport.has(document.issueId)) {
      documentPlans.push({ source: document, action: "skip_missing_parent" });
      continue;
    }

    const existingDocument = targetDocumentsById.get(document.documentId);
    const conflictingIssueKeyDocument = targetDocumentsByIssueKey.get(`${document.issueId}:${document.key}`);
    if (!existingDocument && conflictingIssueKeyDocument && conflictingIssueKeyDocument.documentId !== document.documentId) {
      documentPlans.push({ source: document, action: "skip_conflicting_key" });
      continue;
    }

    const adjustments: ImportAdjustment[] = [];
    const targetCreatedByAgentId =
      document.createdByAgentId && targetAgentIds.has(document.createdByAgentId) ? document.createdByAgentId : null;
    const targetUpdatedByAgentId =
      document.updatedByAgentId && targetAgentIds.has(document.updatedByAgentId) ? document.updatedByAgentId : null;
    if (
      (document.createdByAgentId && !targetCreatedByAgentId)
      || (document.updatedByAgentId && !targetUpdatedByAgentId)
    ) {
      adjustments.push("clear_document_agent");
      incrementAdjustment(adjustmentCounts, "clear_document_agent");
    }

    const sourceRevisions = sortDocumentRevisions(sourceRevisionsByDocumentId.get(document.documentId) ?? []);
    const targetRevisions = sortDocumentRevisions(targetRevisionsByDocumentId.get(document.documentId) ?? []);
    const existingRevisionIds = new Set(targetRevisions.map((revision) => revision.id));
    const usedRevisionNumbers = new Set(targetRevisions.map((revision) => revision.revisionNumber));
    let nextRevisionNumber = targetRevisions.reduce(
      (maxValue, revision) => Math.max(maxValue, revision.revisionNumber),
      0,
    ) + 1;

    const targetRevisionNumberById = new Map<string, number>(
      targetRevisions.map((revision) => [revision.id, revision.revisionNumber]),
    );
    const revisionsToInsert: PlannedDocumentRevisionInsert[] = [];

    for (const revision of sourceRevisions) {
      if (existingRevisionIds.has(revision.id)) continue;
      let targetRevisionNumber = revision.revisionNumber;
      if (usedRevisionNumbers.has(targetRevisionNumber)) {
        while (usedRevisionNumbers.has(nextRevisionNumber)) {
          nextRevisionNumber += 1;
        }
        targetRevisionNumber = nextRevisionNumber;
        nextRevisionNumber += 1;
      }
      usedRevisionNumbers.add(targetRevisionNumber);
      targetRevisionNumberById.set(revision.id, targetRevisionNumber);

      const revisionAdjustments: ImportAdjustment[] = [];
      const targetCreatedByAgentId =
        revision.createdByAgentId && targetAgentIds.has(revision.createdByAgentId) ? revision.createdByAgentId : null;
      if (revision.createdByAgentId && !targetCreatedByAgentId) {
        revisionAdjustments.push("clear_document_revision_agent");
        incrementAdjustment(adjustmentCounts, "clear_document_revision_agent");
      }

      revisionsToInsert.push({
        source: revision,
        targetRevisionNumber,
        targetCreatedByAgentId,
        adjustments: revisionAdjustments,
      });
    }

    const latestRevisionId = document.latestRevisionId ?? existingDocument?.latestRevisionId ?? null;
    const latestRevisionNumber =
      (latestRevisionId ? targetRevisionNumberById.get(latestRevisionId) : undefined)
      ?? document.latestRevisionNumber
      ?? existingDocument?.latestRevisionNumber
      ?? 0;

    if (!existingDocument) {
      documentPlans.push({
        source: document,
        action: "insert",
        targetCreatedByAgentId,
        targetUpdatedByAgentId,
        latestRevisionId,
        latestRevisionNumber,
        revisionsToInsert,
        adjustments,
      });
      continue;
    }

    const documentAlreadyMatches =
      existingDocument.key === document.key
      && existingDocument.title === document.title
      && existingDocument.format === document.format
      && existingDocument.latestBody === document.latestBody
      && (existingDocument.latestRevisionId ?? null) === latestRevisionId
      && existingDocument.latestRevisionNumber === latestRevisionNumber
      && (existingDocument.updatedByAgentId ?? null) === targetUpdatedByAgentId
      && (existingDocument.updatedByUserId ?? null) === (document.updatedByUserId ?? null)
      && sameDate(existingDocument.documentUpdatedAt, document.documentUpdatedAt)
      && sameDate(existingDocument.linkUpdatedAt, document.linkUpdatedAt)
      && revisionsToInsert.length === 0;

    if (documentAlreadyMatches) {
      documentPlans.push({ source: document, action: "skip_existing" });
      continue;
    }

    documentPlans.push({
      source: document,
      action: "merge_existing",
      targetCreatedByAgentId,
      targetUpdatedByAgentId,
      latestRevisionId,
      latestRevisionNumber,
      revisionsToInsert,
      adjustments,
    });
  }

  const sourceAttachments = input.sourceAttachments ?? [];
  const targetAttachmentIds = new Set((input.targetAttachments ?? []).map((attachment) => attachment.id));
  const attachmentPlans: Array<PlannedAttachmentInsert | PlannedAttachmentSkip> = [];
  for (const attachment of sortAttachments(sourceAttachments)) {
    if (targetAttachmentIds.has(attachment.id)) {
      attachmentPlans.push({ source: attachment, action: "skip_existing" });
      continue;
    }
    if (!issueIdsAvailableAfterImport.has(attachment.issueId)) {
      attachmentPlans.push({ source: attachment, action: "skip_missing_parent" });
      continue;
    }

    const adjustments: ImportAdjustment[] = [];
    const targetCreatedByAgentId =
      attachment.createdByAgentId && targetAgentIds.has(attachment.createdByAgentId)
        ? attachment.createdByAgentId
        : null;
    if (attachment.createdByAgentId && !targetCreatedByAgentId) {
      adjustments.push("clear_attachment_agent");
      incrementAdjustment(adjustmentCounts, "clear_attachment_agent");
    }

    attachmentPlans.push({
      source: attachment,
      action: "insert",
      targetIssueCommentId:
        attachment.issueCommentId && commentIdsAvailableAfterImport.has(attachment.issueCommentId)
          ? attachment.issueCommentId
          : null,
      targetCreatedByAgentId,
      adjustments,
    });
  }

  const counts = {
    projectsToImport: projectImports.length,
    issuesToInsert: issuePlans.filter((plan) => plan.action === "insert").length,
    issuesExisting: issuePlans.filter((plan) => plan.action === "skip_existing").length,
    issueDrift: issuePlans.filter((plan) => plan.action === "skip_existing" && plan.driftKeys.length > 0).length,
    commentsToInsert: commentPlans.filter((plan) => plan.action === "insert").length,
    commentsExisting: commentPlans.filter((plan) => plan.action === "skip_existing").length,
    commentsMissingParent: commentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
    documentsToInsert: documentPlans.filter((plan) => plan.action === "insert").length,
    documentsToMerge: documentPlans.filter((plan) => plan.action === "merge_existing").length,
    documentsExisting: documentPlans.filter((plan) => plan.action === "skip_existing").length,
    documentsConflictingKey: documentPlans.filter((plan) => plan.action === "skip_conflicting_key").length,
    documentsMissingParent: documentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
    documentRevisionsToInsert: documentPlans.reduce(
      (sum, plan) =>
        sum + (plan.action === "insert" || plan.action === "merge_existing" ? plan.revisionsToInsert.length : 0),
      0,
    ),
    attachmentsToInsert: attachmentPlans.filter((plan) => plan.action === "insert").length,
    attachmentsExisting: attachmentPlans.filter((plan) => plan.action === "skip_existing").length,
    attachmentsMissingParent: attachmentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
  };

  return {
    companyId: input.companyId,
    companyName: input.companyName,
    issuePrefix: input.issuePrefix,
    previewIssueCounterStart: input.previewIssueCounterStart,
    scopes: input.scopes,
    projectImports,
    issuePlans,
    commentPlans,
    documentPlans,
    attachmentPlans,
    counts,
    adjustments: adjustmentCounts,
  };
}
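The planner above is a pure function over pre-loaded rows, which makes it easy to exercise in isolation. A minimal usage sketch follows; the module path, the `PAPERCLIP_WORKTREE_MERGE_SCOPES` environment variable name, and the pre-loaded `source*`/`target*` row variables are assumptions for illustration, not part of this diff:

```ts
// Hypothetical wiring: assumes the rows were already loaded from the source
// worktree database and the target database by code outside this file.
import { parseWorktreeMergeScopes, buildWorktreeMergePlan } from "./worktree-merge-plan.js"; // assumed path

const scopes = parseWorktreeMergeScopes(process.env.PAPERCLIP_WORKTREE_MERGE_SCOPES); // assumed env var

const plan = buildWorktreeMergePlan({
  companyId: targetCompany.id,
  companyName: targetCompany.name,
  issuePrefix: targetCompany.issuePrefix,
  previewIssueCounterStart: targetIssueCounter,
  scopes,
  sourceIssues,
  targetIssues,
  sourceComments,
  targetComments,
  targetAgents,
  targetProjects,
  targetProjectWorkspaces,
  targetGoals,
});

// The plan is a dry-run description: nothing is written until a caller applies it.
console.log(plan.counts.issuesToInsert, plan.adjustments.clear_project);
```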
File diff suppressed because it is too large
@@ -33,6 +33,10 @@ export function resolveDefaultContextPath(): string {
  return path.resolve(resolvePaperclipHomeDir(), "context.json");
}

export function resolveDefaultCliAuthPath(): string {
  return path.resolve(resolvePaperclipHomeDir(), "auth.json");
}

export function resolveDefaultEmbeddedPostgresDir(instanceId?: string): string {
  return path.resolve(resolvePaperclipInstanceRoot(instanceId), "db");
}
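The new `resolveDefaultCliAuthPath` helper gives CLI code one canonical location for persisted auth state. A hedged sketch of how a caller might read it (the JSON shape and the error handling around a missing file are assumptions):

```ts
import { readFileSync } from "node:fs";
import { resolveDefaultCliAuthPath } from "./home.js"; // assumed module path

// Returns the parsed auth file, or null when it has not been created yet.
function readCliAuthState(): unknown | null {
  try {
    return JSON.parse(readFileSync(resolveDefaultCliAuthPath(), "utf8"));
  } catch {
    return null;
  }
}
```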
@@ -7,6 +7,7 @@ export {
  loggingConfigSchema,
  serverConfigSchema,
  authConfigSchema,
  telemetryConfigSchema,
  storageConfigSchema,
  storageLocalDiskConfigSchema,
  storageS3ConfigSchema,
@@ -19,10 +20,11 @@ export {
  type LoggingConfig,
  type ServerConfig,
  type AuthConfig,
  type TelemetryConfig,
  type StorageConfig,
  type StorageLocalDiskConfig,
  type StorageS3Config,
  type SecretsConfig,
  type SecretsLocalEncryptedConfig,
  type ConfigMeta,
} from "@paperclipai/shared";
} from "../../../packages/shared/src/config-schema.js";
@@ -15,10 +15,15 @@ import { registerAgentCommands } from "./commands/client/agent.js";
import { registerApprovalCommands } from "./commands/client/approval.js";
import { registerActivityCommands } from "./commands/client/activity.js";
import { registerDashboardCommands } from "./commands/client/dashboard.js";
import { registerRoutineCommands } from "./commands/routines.js";
import { registerFeedbackCommands } from "./commands/client/feedback.js";
import { applyDataDirOverride, type DataDirOptionLike } from "./config/data-dir.js";
import { loadPaperclipEnvFile } from "./config/env.js";
import { initTelemetryFromConfigFile, flushTelemetry } from "./telemetry.js";
import { registerWorktreeCommands } from "./commands/worktree.js";
import { registerPluginCommands } from "./commands/client/plugin.js";
import { registerClientAuthCommands } from "./commands/client/auth.js";
import { cliVersion } from "./version.js";

const program = new Command();
const DATA_DIR_OPTION_HELP =
@@ -27,7 +32,7 @@ const DATA_DIR_OPTION_HELP =
program
  .name("paperclipai")
  .description("Paperclip CLI — setup, diagnose, and configure your instance")
  .version("0.2.7");
  .version(cliVersion);

program.hook("preAction", (_thisCommand, actionCommand) => {
  const options = actionCommand.optsWithGlobals() as DataDirOptionLike;
@@ -37,6 +42,7 @@ program.hook("preAction", (_thisCommand, actionCommand) => {
    hasContextOption: optionNames.has("context"),
  });
  loadPaperclipEnvFile(options.config);
  initTelemetryFromConfigFile(options.config);
});

program
@@ -136,6 +142,8 @@ registerAgentCommands(program);
registerApprovalCommands(program);
registerActivityCommands(program);
registerDashboardCommands(program);
registerRoutineCommands(program);
registerFeedbackCommands(program);
registerWorktreeCommands(program);
registerPluginCommands(program);

@@ -151,7 +159,22 @@ auth
  .option("--base-url <url>", "Public base URL used to print invite link")
  .action(bootstrapCeoInvite);

program.parseAsync().catch((err) => {
  console.error(err instanceof Error ? err.message : String(err));
  process.exit(1);
});
registerClientAuthCommands(auth);

async function main(): Promise<void> {
  let failed = false;
  try {
    await program.parseAsync();
  } catch (err) {
    failed = true;
    console.error(err instanceof Error ? err.message : String(err));
  } finally {
    await flushTelemetry();
  }

  if (failed) {
    process.exit(1);
  }
}

void main();
cli/src/telemetry.ts (new file, 49 lines)
@@ -0,0 +1,49 @@
import path from "node:path";
import {
  TelemetryClient,
  resolveTelemetryConfig,
  loadOrCreateState,
  trackInstallStarted,
  trackInstallCompleted,
  trackCompanyImported,
} from "../../packages/shared/src/telemetry/index.js";
import { resolvePaperclipInstanceRoot } from "./config/home.js";
import { readConfig } from "./config/store.js";
import { cliVersion } from "./version.js";

let client: TelemetryClient | null = null;

export function initTelemetry(fileConfig?: { enabled?: boolean }): TelemetryClient | null {
  if (client) return client;

  const config = resolveTelemetryConfig(fileConfig);
  if (!config.enabled) return null;

  const stateDir = path.join(resolvePaperclipInstanceRoot(), "telemetry");
  client = new TelemetryClient(config, () => loadOrCreateState(stateDir, cliVersion), cliVersion);
  return client;
}

export function initTelemetryFromConfigFile(configPath?: string): TelemetryClient | null {
  try {
    return initTelemetry(readConfig(configPath)?.telemetry);
  } catch {
    return initTelemetry();
  }
}

export function getTelemetryClient(): TelemetryClient | null {
  return client;
}

export async function flushTelemetry(): Promise<void> {
  if (client) {
    await client.flush();
  }
}

export {
  trackInstallStarted,
  trackInstallCompleted,
  trackCompanyImported,
};
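Everything in this module is lazy and failure-tolerant: `initTelemetry` returns `null` when telemetry is disabled, and `flushTelemetry` is a no-op until a client exists. A small sketch of the intended call pattern inside a command (the command body itself is hypothetical; only functions shown in the file above are used):

```ts
import { initTelemetryFromConfigFile, flushTelemetry } from "./telemetry.js";

async function runHypotheticalCommand(configPath?: string): Promise<void> {
  // Reads the config file if it exists; silently falls back to env-driven defaults otherwise.
  initTelemetryFromConfigFile(configPath);

  // ... actual command work would go here ...

  // Safe to call unconditionally: it does nothing when telemetry never initialised.
  await flushTelemetry();
}
```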
cli/src/version.ts (new file, 10 lines)
@@ -0,0 +1,10 @@
import { createRequire } from "node:module";

type PackageJson = {
  version?: string;
};

const require = createRequire(import.meta.url);
const pkg = require("../package.json") as PackageJson;

export const cliVersion = pkg.version ?? "0.0.0";
@@ -2,7 +2,7 @@
  "extends": "../tsconfig.base.json",
  "compilerOptions": {
    "outDir": "dist",
    "rootDir": "src"
    "rootDir": ".."
  },
  "include": ["src"]
  "include": ["src", "../packages/shared/src"]
}
doc/AGENTCOMPANIES_SPEC_INVENTORY.md (new file, 115 lines)
@@ -0,0 +1,115 @@
|
||||
# Agent Companies Spec Inventory
|
||||
|
||||
This document indexes every part of the Paperclip codebase that touches the [Agent Companies Specification](docs/companies/companies-spec.md) (`agentcompanies/v1-draft`).
|
||||
|
||||
Use it when you need to:
|
||||
|
||||
1. **Update the spec** — know which implementation code must change in lockstep.
|
||||
2. **Change code that involves the spec** — find all related files quickly.
|
||||
3. **Keep things aligned** — audit whether implementation matches the spec.
|
||||
|
||||
---
|
||||
|
||||
## 1. Specification & Design Documents
|
||||
|
||||
| File | Role |
|
||||
|---|---|
|
||||
| `docs/companies/companies-spec.md` | **Normative spec** — defines the markdown-first package format (COMPANY.md, TEAM.md, AGENTS.md, PROJECT.md, TASK.md, SKILL.md), reserved files, frontmatter schemas, and vendor extension conventions (`.paperclip.yaml`). |
|
||||
| `doc/plans/2026-03-13-company-import-export-v2.md` | Implementation plan for the markdown-first package model cutover — phases, API changes, UI plan, and rollout strategy. |
|
||||
| `doc/SPEC-implementation.md` | V1 implementation contract; references the portability system and `.paperclip.yaml` sidecar format. |
|
||||
| `docs/specs/cliphub-plan.md` | Earlier blueprint bundle plan; partially superseded by the markdown-first spec (noted in the v2 plan). |
|
||||
| `doc/plans/2026-02-16-module-system.md` | Module system plan; JSON-only company template sections superseded by the markdown-first model. |
|
||||
| `doc/plans/2026-03-14-skills-ui-product-plan.md` | Skills UI plan; references portable skill files and `.paperclip.yaml`. |
|
||||
| `doc/plans/2026-03-14-adapter-skill-sync-rollout.md` | Adapter skill sync rollout; companion to the v2 import/export plan. |
|
||||
|
||||
## 2. Shared Types & Validators
|
||||
|
||||
These define the contract between server, CLI, and UI.
|
||||
|
||||
| File | What it defines |
|
||||
|---|---|
|
||||
| `packages/shared/src/types/company-portability.ts` | TypeScript interfaces: `CompanyPortabilityManifest`, `CompanyPortabilityFileEntry`, `CompanyPortabilityEnvInput`, export/import/preview request and result types, manifest entry types for agents, skills, projects, issues, recurring routines, companies. |
|
||||
| `packages/shared/src/validators/company-portability.ts` | Zod schemas for all portability request/response shapes — used by both server routes and CLI. |
|
||||
| `packages/shared/src/types/index.ts` | Re-exports portability types. |
|
||||
| `packages/shared/src/validators/index.ts` | Re-exports portability validators. |
|
||||
|
||||
## 3. Server — Services
|
||||
|
||||
| File | Responsibility |
|
||||
|---|---|
|
||||
| `server/src/services/company-portability.ts` | **Core portability service.** Export (manifest generation, markdown file emission, `.paperclip.yaml` sidecars), import (graph resolution, collision handling, entity creation), preview (planned-action summary). Handles skill key derivation, recurring task <-> routine mapping, legacy recurrence migration, and package README generation. References `agentcompanies/v1` version string. |
|
||||
| `server/src/services/routines.ts` | Paperclip routine runtime service. Portability now exports routines as recurring `TASK.md` entries and imports recurring tasks back through this service. |
|
||||
| `server/src/services/company-export-readme.ts` | Generates `README.md` and Mermaid org-chart for exported company packages. |
|
||||
| `server/src/services/index.ts` | Re-exports `companyPortabilityService`. |
|
||||
|
||||
## 4. Server — Routes
|
||||
|
||||
| File | Endpoints |
|
||||
|---|---|
|
||||
| `server/src/routes/companies.ts` | `POST /api/companies/:companyId/export` — legacy export bundle<br>`POST /api/companies/:companyId/exports/preview` — export preview<br>`POST /api/companies/:companyId/exports` — export package<br>`POST /api/companies/import/preview` — import preview<br>`POST /api/companies/import` — perform import |
|
||||
|
||||
Route registration lives in `server/src/app.ts` via `companyRoutes(db, storage)`.
|
||||
|
||||
## 5. Server — Tests
|
||||
|
||||
| File | Coverage |
|
||||
|---|---|
|
||||
| `server/src/__tests__/company-portability.test.ts` | Unit tests for the portability service (export, import, preview, manifest shape, `agentcompanies/v1` version). |
|
||||
| `server/src/__tests__/company-portability-routes.test.ts` | Integration tests for the portability HTTP endpoints. |
|
||||
|
||||
## 6. CLI
|
||||
|
||||
| File | Commands |
|
||||
|---|---|
|
||||
| `cli/src/commands/client/company.ts` | `company export` — exports a company package to disk (flags: `--out`, `--include`, `--projects`, `--issues`, `--projectIssues`).<br>`company import <fromPathOrUrl>` — imports a company package from a file or folder (flags: positional source path/URL or GitHub shorthand, `--include`, `--target`, `--companyId`, `--newCompanyName`, `--agents`, `--collision`, `--ref`, `--dryRun`).<br>Reads/writes portable file entries and handles `.paperclip.yaml` filtering. |
|
||||
|
||||
## 7. UI — Pages
|
||||
|
||||
| File | Role |
|
||||
|---|---|
|
||||
| `ui/src/pages/CompanyExport.tsx` | Export UI: preview, manifest display, file tree visualization, ZIP archive creation and download. Filters `.paperclip.yaml` based on selection. Shows manifest and README in editor. |
|
||||
| `ui/src/pages/CompanyImport.tsx` | Import UI: source input (upload/folder/GitHub URL/generic URL), ZIP reading, preview pane with dependency tree, entity selection checkboxes, trust/licensing warnings, secrets requirements, collision strategy, adapter config. |
|
||||
|
||||
## 8. UI — Components
|
||||
|
||||
| File | Role |
|
||||
|---|---|
|
||||
| `ui/src/components/PackageFileTree.tsx` | Reusable file tree component for both import and export. Builds tree from `CompanyPortabilityFileEntry` items, parses frontmatter, shows action indicators (create/update/skip), and maps frontmatter field labels. |
|
||||
|
||||
## 9. UI — Libraries
|
||||
|
||||
| File | Role |
|
||||
|---|---|
|
||||
| `ui/src/lib/portable-files.ts` | Helpers for portable file entries: `getPortableFileText`, `getPortableFileDataUrl`, `getPortableFileContentType`, `isPortableImageFile`. |
|
||||
| `ui/src/lib/zip.ts` | ZIP archive creation (`createZipArchive`) and reading (`readZipArchive`) — implements ZIP format from scratch for company packages. CRC32, DOS date/time encoding. |
|
||||
| `ui/src/lib/zip.test.ts` | Tests for ZIP utilities; exercises round-trip with portability file entries and `.paperclip.yaml` content. |
|
||||
|
||||
## 10. UI — API Client
|
||||
|
||||
| File | Functions |
|
||||
|---|---|
|
||||
| `ui/src/api/companies.ts` | `companiesApi.exportBundle`, `companiesApi.exportPreview`, `companiesApi.exportPackage`, `companiesApi.importPreview`, `companiesApi.importBundle` — typed fetch wrappers for the portability endpoints. |
|
||||
|
||||
## 11. Skills & Agent Instructions
|
||||
|
||||
| File | Relevance |
|
||||
|---|---|
|
||||
| `skills/paperclip/references/company-skills.md` | Reference doc for company skill library workflow — install, inspect, update, assign. Skill packages are a subset of the agent companies spec. |
|
||||
| `server/src/services/company-skills.ts` | Company skill management service — handles SKILL.md-based imports and company-level skill library. |
|
||||
| `server/src/services/agent-instructions.ts` | Agent instructions service — resolves AGENTS.md paths for agent instruction loading. |
|
||||
|
||||
## 12. Quick Cross-Reference by Spec Concept
|
||||
|
||||
| Spec concept | Primary implementation files |
|
||||
|---|---|
|
||||
| `COMPANY.md` frontmatter & body | `company-portability.ts` (export emitter + import parser) |
|
||||
| `AGENTS.md` frontmatter & body | `company-portability.ts`, `agent-instructions.ts` |
|
||||
| `PROJECT.md` frontmatter & body | `company-portability.ts` |
|
||||
| `TASK.md` frontmatter & body | `company-portability.ts` |
|
||||
| `SKILL.md` packages | `company-portability.ts`, `company-skills.ts` |
|
||||
| `.paperclip.yaml` vendor sidecar | `company-portability.ts`, `routines.ts`, `CompanyExport.tsx`, `company.ts` (CLI) |
|
||||
| `manifest.json` | `company-portability.ts` (generation), shared types (schema) |
|
||||
| ZIP package format | `zip.ts` (UI), `company.ts` (CLI file I/O) |
|
||||
| Collision resolution | `company-portability.ts` (server), `CompanyImport.tsx` (UI) |
|
||||
| Env/secrets declarations | shared types (`CompanyPortabilityEnvInput`), `CompanyImport.tsx` (UI) |
|
||||
| README + org chart | `company-export-readme.ts` |
|
||||
@@ -39,6 +39,19 @@ This starts:
|
||||
|
||||
`pnpm dev` runs the server in watch mode and restarts on changes from workspace packages (including adapter packages). Use `pnpm dev:once` to run without file watching.
|
||||
|
||||
`pnpm dev:once` auto-applies pending local migrations by default before starting the dev server.
|
||||
|
||||
`pnpm dev` and `pnpm dev:once` are now idempotent for the current repo and instance: if the matching Paperclip dev runner is already alive, Paperclip reports the existing process instead of starting a duplicate.
|
||||
|
||||
Inspect or stop the current repo's managed dev runner:
|
||||
|
||||
```sh
|
||||
pnpm dev:list
|
||||
pnpm dev:stop
|
||||
```
|
||||
|
||||
`pnpm dev:once` now tracks backend-relevant file changes and pending migrations. When the current boot is stale, the board UI shows a `Restart required` banner. You can also enable guarded auto-restart in `Instance Settings > Experimental`, which waits for queued/running local agent runs to finish before restarting the dev server.
|
||||
|
||||
Tailscale/private-auth dev mode:
|
||||
|
||||
```sh
|
||||
@@ -84,7 +97,7 @@ docker run --name paperclip \
|
||||
Or use Compose:
|
||||
|
||||
```sh
|
||||
docker compose -f docker-compose.quickstart.yml up --build
|
||||
docker compose -f docker/docker-compose.quickstart.yml up --build
|
||||
```
|
||||
|
||||
See `doc/DOCKER.md` for API key wiring (`OPENAI_API_KEY` / `ANTHROPIC_API_KEY`) and persistence details.
|
||||
@@ -128,6 +141,12 @@ When a local agent run has no resolved project/session workspace, Paperclip fall
|
||||
|
||||
This path honors `PAPERCLIP_HOME` and `PAPERCLIP_INSTANCE_ID` in non-default setups.
|
||||
|
||||
For `codex_local`, Paperclip also manages a per-company Codex home under the instance root and seeds it from the shared Codex login/config home (`$CODEX_HOME` or `~/.codex`):
|
||||
|
||||
- `~/.paperclip/instances/default/companies/<company-id>/codex-home`
|
||||
|
||||
If the `codex` CLI is not installed or not on `PATH`, `codex_local` agent runs fail at execution time with a clear adapter error. Quota polling uses a short-lived `codex app-server` subprocess: when `codex` cannot be spawned, that provider reports `ok: false` in aggregated quota results and the API server keeps running (it must not exit on a missing binary).
|
||||
|
||||
## Worktree-local Instances
|
||||
|
||||
When developing from multiple git worktrees, do not point two Paperclip servers at the same embedded PostgreSQL data directory.
|
||||
@@ -156,6 +175,8 @@ Seed modes:
|
||||
|
||||
After `worktree init`, both the server and the CLI auto-load the repo-local `.paperclip/.env` when run inside that worktree, so normal commands like `pnpm dev`, `paperclipai doctor`, and `paperclipai db:backup` stay scoped to the worktree instance.
|
||||
|
||||
Provisioned git worktrees also pause all seeded routines in the isolated worktree database by default. This prevents copied daily/cron routines from firing unexpectedly inside the new workspace instance during development.
|
||||
|
||||
That repo-local env also sets:
|
||||
|
||||
- `PAPERCLIP_IN_WORKTREE=true`
|
||||
@@ -200,6 +221,17 @@ paperclipai worktree init --from-data-dir ~/.paperclip
|
||||
paperclipai worktree init --force
|
||||
```
|
||||
|
||||
Repair an already-created repo-managed worktree and reseed its isolated instance from the main default install:
|
||||
|
||||
```sh
|
||||
cd ~/.paperclip/worktrees/PAP-884-ai-commits-component
|
||||
pnpm paperclipai worktree init --force --seed-mode minimal \
|
||||
--name PAP-884-ai-commits-component \
|
||||
--from-config ~/.paperclip/instances/default/config.json
|
||||
```
|
||||
|
||||
That rewrites the worktree-local `.paperclip/config.json` + `.paperclip/.env`, recreates the isolated instance under `~/.paperclip-worktrees/instances/<worktree-id>/`, and preserves the git worktree contents themselves.
|
||||
|
||||
**`pnpm paperclipai worktree:make <name> [options]`** — Create `~/NAME` as a git worktree, then initialize an isolated Paperclip instance inside it. This combines `git worktree add` with `worktree init` in a single step.
|
||||
|
||||
| Option | Description |
|
||||
|
||||
doc/DOCKER.md (134 lines)
@@ -2,6 +2,28 @@
|
||||
|
||||
Run Paperclip in Docker without installing Node or pnpm locally.
|
||||
|
||||
All commands below assume you are in the **project root** (the directory containing `package.json`), not inside `docker/`.
|
||||
|
||||
## Building the image
|
||||
|
||||
```sh
|
||||
docker build -t paperclip-local .
|
||||
```
|
||||
|
||||
The Dockerfile installs common agent tools (`git`, `gh`, `curl`, `wget`, `ripgrep`, `python3`) and the Claude, Codex, and OpenCode CLIs.
|
||||
|
||||
Build arguments:
|
||||
|
||||
| Arg | Default | Purpose |
|
||||
|-----|---------|---------|
|
||||
| `USER_UID` | `1000` | UID for the container `node` user (match your host UID to avoid permission issues on bind mounts) |
|
||||
| `USER_GID` | `1000` | GID for the container `node` group |
|
||||
|
||||
```sh
|
||||
docker build -t paperclip-local \
|
||||
--build-arg USER_UID=$(id -u) --build-arg USER_GID=$(id -g) .
|
||||
```
|
||||
|
||||
## One-liner (build + run)
|
||||
|
||||
```sh
|
||||
@@ -10,6 +32,7 @@ docker run --name paperclip \
|
||||
-p 3100:3100 \
|
||||
-e HOST=0.0.0.0 \
|
||||
-e PAPERCLIP_HOME=/paperclip \
|
||||
-e BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
|
||||
-v "$(pwd)/data/docker-paperclip:/paperclip" \
|
||||
paperclip-local
|
||||
```
|
||||
@@ -25,10 +48,15 @@ Data persistence:
|
||||
|
||||
All persisted under your bind mount (`./data/docker-paperclip` in the example above).
|
||||
|
||||
## Compose Quickstart
|
||||
## Docker Compose
|
||||
|
||||
### Quickstart (embedded SQLite)
|
||||
|
||||
Single container, no external database. Data persists via a bind mount.
|
||||
|
||||
```sh
|
||||
docker compose -f docker-compose.quickstart.yml up --build
|
||||
BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
|
||||
docker compose -f docker/docker-compose.quickstart.yml up --build
|
||||
```
|
||||
|
||||
Defaults:
|
||||
@@ -39,11 +67,36 @@ Defaults:
|
||||
Optional overrides:
|
||||
|
||||
```sh
|
||||
PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=./data/pc docker compose -f docker-compose.quickstart.yml up --build
|
||||
PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=../data/pc \
|
||||
docker compose -f docker/docker-compose.quickstart.yml up --build
|
||||
```
|
||||
|
||||
**Note:** `PAPERCLIP_DATA_DIR` is resolved relative to the compose file (`docker/`), so `../data/pc` maps to `data/pc` in the project root.
|
||||
|
||||
If you change host port or use a non-local domain, set `PAPERCLIP_PUBLIC_URL` to the external URL you will use in browser/auth flows.
|
||||
|
||||
Pass `OPENAI_API_KEY` and/or `ANTHROPIC_API_KEY` to enable local adapter runs.
|
||||
|
||||
### Full stack (with PostgreSQL)
|
||||
|
||||
Paperclip server + PostgreSQL 17. The database is health-checked before the server starts.
|
||||
|
||||
```sh
|
||||
BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
|
||||
docker compose -f docker/docker-compose.yml up --build
|
||||
```
|
||||
|
||||
PostgreSQL data persists in a named Docker volume (`pgdata`). Paperclip data persists in `paperclip-data`.
|
||||
|
||||
### Untrusted PR review
|
||||
|
||||
Isolated container for reviewing untrusted pull requests with Codex or Claude, without exposing your host machine. See `doc/UNTRUSTED-PR-REVIEW.md` for the full workflow.
|
||||
|
||||
```sh
|
||||
docker compose -f docker/docker-compose.untrusted-review.yml build
|
||||
docker compose -f docker/docker-compose.untrusted-review.yml run --rm --service-ports review
|
||||
```
|
||||
|
||||
## Authenticated Compose (Single Public URL)
|
||||
|
||||
For authenticated deployments, set one canonical public URL and let Paperclip derive auth/callback defaults:
|
||||
@@ -93,11 +146,71 @@ Notes:
|
||||
- Without API keys, the app still runs normally.
|
||||
- Adapter environment checks in Paperclip will surface missing auth/CLI prerequisites.
|
||||
|
||||
## Untrusted PR Review Container
|
||||
## Podman Quadlet (systemd)
|
||||
|
||||
If you want a separate Docker environment for reviewing untrusted pull requests with `codex` or `claude`, use the dedicated review workflow in `doc/UNTRUSTED-PR-REVIEW.md`.
|
||||
The `docker/quadlet/` directory contains unit files to run Paperclip + PostgreSQL as systemd services via Podman Quadlet.
|
||||
|
||||
That setup keeps CLI auth state in Docker volumes instead of your host home directory and uses a separate scratch workspace for PR checkouts and preview runs.
|
||||
| File | Purpose |
|
||||
|------|---------|
|
||||
| `docker/quadlet/paperclip.pod` | Pod definition — groups containers into a shared network namespace |
|
||||
| `docker/quadlet/paperclip.container` | Paperclip server — joins the pod, connects to Postgres at `127.0.0.1` |
|
||||
| `docker/quadlet/paperclip-db.container` | PostgreSQL 17 — joins the pod, health-checked |
|
||||
|
||||
### Setup
|
||||
|
||||
1. Build the image (see above).
|
||||
|
||||
2. Copy quadlet files to your systemd directory:
|
||||
|
||||
```sh
|
||||
# Rootless (recommended)
|
||||
cp docker/quadlet/*.pod docker/quadlet/*.container \
|
||||
~/.config/containers/systemd/
|
||||
|
||||
# Or rootful
|
||||
sudo cp docker/quadlet/*.pod docker/quadlet/*.container \
|
||||
/etc/containers/systemd/
|
||||
```
|
||||
|
||||
3. Create a secrets env file (keep out of version control):
|
||||
|
||||
```sh
|
||||
cat > ~/.config/containers/systemd/paperclip.env <<EOL
|
||||
BETTER_AUTH_SECRET=$(openssl rand -hex 32)
|
||||
POSTGRES_USER=paperclip
|
||||
POSTGRES_PASSWORD=paperclip
|
||||
POSTGRES_DB=paperclip
|
||||
DATABASE_URL=postgres://paperclip:paperclip@127.0.0.1:5432/paperclip
|
||||
# OPENAI_API_KEY=sk-...
|
||||
# ANTHROPIC_API_KEY=sk-...
|
||||
EOL
|
||||
```
|
||||
|
||||
4. Create the data directory and start:
|
||||
|
||||
```sh
|
||||
mkdir -p ~/.local/share/paperclip
|
||||
systemctl --user daemon-reload
|
||||
systemctl --user start paperclip-pod
|
||||
```
|
||||
|
||||
### Quadlet management
|
||||
|
||||
```sh
|
||||
journalctl --user -u paperclip -f # App logs
|
||||
journalctl --user -u paperclip-db -f # DB logs
|
||||
systemctl --user status paperclip-pod # Pod status
|
||||
systemctl --user restart paperclip-pod # Restart all
|
||||
systemctl --user stop paperclip-pod # Stop all
|
||||
```
|
||||
|
||||
### Quadlet notes
|
||||
|
||||
- **First boot**: Unlike Docker Compose's `condition: service_healthy`, Quadlet's `After=` only waits for the DB unit to *start*, not for PostgreSQL to be ready. On a cold first boot you may see one or two restart attempts in `journalctl --user -u paperclip` while PostgreSQL initialises — this is expected and resolves automatically via `Restart=on-failure`.
|
||||
- Containers in a pod share `localhost`, so Paperclip reaches Postgres at `127.0.0.1:5432`.
|
||||
- PostgreSQL data persists in the `paperclip-pgdata` named volume.
|
||||
- Paperclip data persists at `~/.local/share/paperclip`.
|
||||
- For rootful quadlet deployment, remove `%h` prefixes and use absolute paths.
|
||||
|
||||
## Onboard Smoke Test (Ubuntu + npm only)
|
||||
|
||||
@@ -120,6 +233,7 @@ Useful overrides:
|
||||
```sh
|
||||
HOST_PORT=3200 PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
|
||||
PAPERCLIP_DEPLOYMENT_MODE=authenticated PAPERCLIP_DEPLOYMENT_EXPOSURE=private ./scripts/docker-onboard-smoke.sh
|
||||
SMOKE_DETACH=true SMOKE_METADATA_FILE=/tmp/paperclip-smoke.env PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
|
||||
```
|
||||
|
||||
Notes:
|
||||
@@ -131,4 +245,10 @@ Notes:
|
||||
- Smoke script also defaults `PAPERCLIP_PUBLIC_URL` to `http://localhost:<HOST_PORT>` so bootstrap invite URLs and auth callbacks use the reachable host port instead of the container's internal `3100`.
|
||||
- In authenticated mode, the smoke script defaults `SMOKE_AUTO_BOOTSTRAP=true` and drives the real bootstrap path automatically: it signs up a real user, runs `paperclipai auth bootstrap-ceo` inside the container to mint a real bootstrap invite, accepts that invite over HTTP, and verifies board session access.
|
||||
- Run the script in the foreground to watch the onboarding flow; stop with `Ctrl+C` after validation.
|
||||
- The image definition is in `Dockerfile.onboard-smoke`.
|
||||
- Set `SMOKE_DETACH=true` to leave the container running for automation and optionally write shell-ready metadata to `SMOKE_METADATA_FILE`.
|
||||
- The image definition is in `docker/Dockerfile.onboard-smoke`.
|
||||
|
||||
## General Notes
|
||||
|
||||
- The `docker-entrypoint.sh` adjusts the container `node` user UID/GID at startup to match the values passed via `USER_UID`/`USER_GID`, avoiding permission issues on bind-mounted volumes.
|
||||
- Paperclip data persists via Docker volumes/bind mounts (compose) or at `~/.local/share/paperclip` (quadlet).
|
||||
|
||||
@@ -51,10 +51,9 @@ Public packages are discovered from:
|
||||
|
||||
- `packages/`
|
||||
- `server/`
|
||||
- `ui/`
|
||||
- `cli/`
|
||||
|
||||
`ui/` is ignored because it is private.
|
||||
|
||||
The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs), which:
|
||||
|
||||
- finds all public packages
|
||||
@@ -65,17 +64,68 @@ The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts
|
||||
|
||||
Those rewrites are temporary. The working tree is restored after publish or dry-run.
|
||||
|
||||
## `@paperclipai/ui` packaging
|
||||
|
||||
The UI package publishes prebuilt static assets, not the source workspace.
|
||||
|
||||
The `ui` package uses [`scripts/generate-ui-package-json.mjs`](../scripts/generate-ui-package-json.mjs) during `prepack` to swap in a lean publish manifest that:
|
||||
|
||||
- keeps the release-managed `name` and `version`
|
||||
- publishes only `dist/`
|
||||
- omits the source-only dependency graph from downstream installs
|
||||
|
||||
After packing or publishing, `postpack` restores the development manifest automatically.
|
||||
|
||||
### Manual first publish for `@paperclipai/ui`
|
||||
|
||||
If you need to publish only the UI package once by hand, use the real package name:
|
||||
|
||||
- `@paperclipai/ui`
|
||||
|
||||
Recommended flow from the repo root:
|
||||
|
||||
```bash
|
||||
# optional sanity check: this 404s until the first publish exists
|
||||
npm view @paperclipai/ui version
|
||||
|
||||
# make sure the dist payload is fresh
|
||||
pnpm --filter @paperclipai/ui build
|
||||
|
||||
# confirm your local npm auth before the real publish
|
||||
npm whoami
|
||||
|
||||
# safe preview of the exact publish payload
|
||||
cd ui
|
||||
pnpm publish --dry-run --no-git-checks --access public
|
||||
|
||||
# real publish
|
||||
pnpm publish --no-git-checks --access public
|
||||
```
|
||||
|
||||
Notes:
|
||||
|
||||
- Publish from `ui/`, not the repo root.
|
||||
- `prepack` automatically rewrites `ui/package.json` to the lean publish manifest, and `postpack` restores the dev manifest after the command finishes.
|
||||
- If `npm view @paperclipai/ui version` already returns the same version that is in [`ui/package.json`](../ui/package.json), do not republish. Bump the version or use the normal repo-wide release flow in [`scripts/release.sh`](../scripts/release.sh).
|
||||
|
||||
If the first real publish returns npm `E404`, check npm-side prerequisites before retrying:
|
||||
|
||||
- `npm whoami` must succeed first. An expired or missing npm login will block the publish.
|
||||
- For an organization-scoped package like `@paperclipai/ui`, the `paperclipai` npm organization must exist and the publisher must be a member with permission to publish to that scope.
|
||||
- The initial publish must include `--access public` for a public scoped package.
|
||||
- npm also requires either account 2FA for publishing or a granular token that is allowed to bypass 2FA.
|
||||
|
||||
## Version formats
|
||||
|
||||
Paperclip uses calendar versions:
|
||||
|
||||
- stable: `YYYY.M.D`
|
||||
- canary: `YYYY.M.D-canary.N`
|
||||
- stable: `YYYY.MDD.P`
|
||||
- canary: `YYYY.MDD.P-canary.N`
|
||||
|
||||
Examples:
|
||||
|
||||
- stable: `2026.3.17`
|
||||
- canary: `2026.3.17-canary.2`
|
||||
- stable: `2026.318.0`
|
||||
- canary: `2026.318.1-canary.2`
|
||||
|
||||
## Publish model
|
||||
|
||||
@@ -85,7 +135,7 @@ Canaries publish under the npm dist-tag `canary`.
|
||||
|
||||
Example:
|
||||
|
||||
- `paperclipai@2026.3.17-canary.2`
|
||||
- `paperclipai@2026.318.1-canary.2`
|
||||
|
||||
This keeps the default install path unchanged while allowing explicit installs with:
|
||||
|
||||
@@ -99,13 +149,13 @@ Stable publishes use the npm dist-tag `latest`.
|
||||
|
||||
Example:
|
||||
|
||||
- `paperclipai@2026.3.17`
|
||||
- `paperclipai@2026.318.0`
|
||||
|
||||
Stable publishes do not create a release commit. Instead:
|
||||
|
||||
- package versions are rewritten temporarily
|
||||
- packages are published from the chosen source commit
|
||||
- git tag `vYYYY.M.D` points at that original commit
|
||||
- git tag `vYYYY.MDD.P` points at that original commit
|
||||
|
||||
## Trusted publishing
|
||||
|
||||
@@ -126,7 +176,7 @@ Rollback does not unpublish anything.
|
||||
It repoints the `latest` dist-tag to a prior stable version:
|
||||
|
||||
```bash
|
||||
./scripts/rollback-latest.sh 2026.3.16
|
||||
./scripts/rollback-latest.sh 2026.318.0
|
||||
```
|
||||
|
||||
This is the fastest way to restore the default install path if a stable release is bad.
|
||||
@@ -135,6 +185,7 @@ This is the fastest way to restore the default install path if a stable release
|
||||
|
||||
- [`scripts/build-npm.sh`](../scripts/build-npm.sh)
|
||||
- [`scripts/generate-npm-package-json.mjs`](../scripts/generate-npm-package-json.mjs)
|
||||
- [`scripts/generate-ui-package-json.mjs`](../scripts/generate-ui-package-json.mjs)
|
||||
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
|
||||
- [`cli/esbuild.config.mjs`](../cli/esbuild.config.mjs)
|
||||
- [`doc/RELEASING.md`](RELEASING.md)
|
||||
|
||||
@@ -35,6 +35,7 @@ At minimum that includes:
|
||||
|
||||
- `paperclipai`
|
||||
- `@paperclipai/server`
|
||||
- `@paperclipai/ui`
|
||||
- public packages under `packages/`
|
||||
|
||||
### 2.1. In npm, open each package settings page
|
||||
@@ -205,7 +206,7 @@ After setup:
|
||||
3. confirm it passes verification
|
||||
4. confirm publish succeeds under the `npm-canary` environment
|
||||
5. confirm npm now shows a new `canary` release
|
||||
6. confirm a git tag named `canary/vYYYY.M.D-canary.N` was pushed
|
||||
6. confirm a git tag named `canary/vYYYY.MDD.P-canary.N` was pushed
|
||||
|
||||
Install-path check:
|
||||
|
||||
@@ -217,18 +218,25 @@ npx paperclipai@canary onboard
|
||||
|
||||
After at least one good canary exists:
|
||||
|
||||
1. prepare `releases/vYYYY.M.D.md` on the source commit you want to promote
|
||||
2. open `Actions` -> `Release`
|
||||
3. run it with:
|
||||
1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
|
||||
2. prepare `releases/vYYYY.MDD.P.md` on the source commit you want to promote
|
||||
3. open `Actions` -> `Release`
|
||||
4. run it with:
|
||||
- `source_ref`: the tested commit SHA or canary tag source commit
|
||||
- `stable_date`: leave blank or set the intended UTC date
|
||||
- `stable_date`: leave blank or set the intended UTC date like `2026-03-18`
|
||||
do not enter a version like `2026.318.0`; the workflow computes that from the date
|
||||
- `dry_run`: `true`
|
||||
4. confirm the dry-run succeeds
|
||||
5. rerun with `dry_run: false`
|
||||
6. approve the `npm-stable` environment when prompted
|
||||
7. confirm npm `latest` points to the new stable version
|
||||
8. confirm git tag `vYYYY.M.D` exists
|
||||
9. confirm the GitHub Release was created
|
||||
5. confirm the dry-run succeeds
|
||||
6. rerun with `dry_run: false`
|
||||
7. approve the `npm-stable` environment when prompted
|
||||
8. confirm npm `latest` points to the new stable version
|
||||
9. confirm git tag `vYYYY.MDD.P` exists
|
||||
10. confirm the GitHub Release was created
|
||||
|
||||
Implementation note:
|
||||
|
||||
- the GitHub Actions stable workflow calls `create-github-release.sh` with `PUBLISH_REMOTE=origin`
|
||||
- local maintainer usage can still pass `PUBLISH_REMOTE=public-gh` explicitly when needed
|
||||
|
||||
## 13. Suggested Maintainer Policy
|
||||
|
||||
|
||||
@@ -6,26 +6,29 @@ The release model is now commit-driven:
|
||||
|
||||
1. Every push to `master` publishes a canary automatically.
|
||||
2. Stable releases are manually promoted from a chosen tested commit or canary tag.
|
||||
3. Stable release notes live in `releases/vYYYY.M.D.md`.
|
||||
3. Stable release notes live in `releases/vYYYY.MDD.P.md`.
|
||||
4. Only stable releases get GitHub Releases.
|
||||
|
||||
## Versioning Model
|
||||
|
||||
Paperclip uses calendar versions that still fit semver syntax:
|
||||
|
||||
- stable: `YYYY.M.D`
|
||||
- canary: `YYYY.M.D-canary.N`
|
||||
- stable: `YYYY.MDD.P`
|
||||
- canary: `YYYY.MDD.P-canary.N`
|
||||
|
||||
Examples:
|
||||
|
||||
- stable on March 17, 2026: `2026.3.17`
|
||||
- fourth canary on March 17, 2026: `2026.3.17-canary.3`
|
||||
- first stable on March 18, 2026: `2026.318.0`
|
||||
- second stable on March 18, 2026: `2026.318.1`
|
||||
- fourth canary for the `2026.318.1` line: `2026.318.1-canary.3`
|
||||
|
||||
Important constraints:
|
||||
|
||||
- do not use leading zeroes such as `2026.03.17`
|
||||
- do not use four numeric segments such as `2026.03.17.1`
|
||||
- the semver-safe canary form is `2026.3.17-canary.1`
|
||||
- the middle numeric slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
|
||||
- use `2026.303.0` for March 3, not `2026.33.0`
|
||||
- do not use leading zeroes such as `2026.0318.0`
|
||||
- do not use four numeric segments such as `2026.3.18.1`
|
||||
- the semver-safe canary form is `2026.318.0-canary.1`
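As a minimal sketch of the scheme above (the authoritative logic lives in `scripts/release.sh`), the stable version for a UTC date could be derived like this, assuming `existingStableVersions` is the list of stable versions already published:

```typescript
// Sketch only: derive the YYYY.MDD.P calendar version described above.
// `existingStableVersions` is an assumed input (e.g. from `npm view paperclipai versions`);
// the real computation lives in scripts/release.sh.
function stableVersionFor(dateUtc: Date, existingStableVersions: string[]): string {
  const year = dateUtc.getUTCFullYear();
  const month = dateUtc.getUTCMonth() + 1;                    // M: no leading zero
  const day = String(dateUtc.getUTCDate()).padStart(2, "0");  // DD: zero-padded
  const prefix = `${year}.${month}${day}.`;                   // e.g. "2026.318." or "2026.303."
  const patch = existingStableVersions.filter((v) => v.startsWith(prefix)).length;
  return `${prefix}${patch}`;                                 // e.g. 2026.318.0, then 2026.318.1
}
```

For example, `stableVersionFor(new Date(Date.UTC(2026, 2, 18)), [])` yields `2026.318.0`.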
|
||||
|
||||
## Release Surfaces
|
||||
|
||||
@@ -45,7 +48,7 @@ Canaries only cover the first two surfaces plus an internal traceability tag.
|
||||
- canaries publish from `master`
|
||||
- stables publish from an explicitly chosen source ref
|
||||
- tags point at the original source commit, not a generated release commit
|
||||
- stable notes are always `releases/vYYYY.M.D.md`
|
||||
- stable notes are always `releases/vYYYY.MDD.P.md`
|
||||
- canaries never create GitHub Releases
|
||||
- canaries never require changelog generation
|
||||
|
||||
@@ -60,39 +63,52 @@ It:
|
||||
- verifies the pushed commit
|
||||
- computes the canary version for the current UTC date
|
||||
- publishes under npm dist-tag `canary`
|
||||
- creates a git tag `canary/vYYYY.M.D-canary.N`
|
||||
- creates a git tag `canary/vYYYY.MDD.P-canary.N`
|
||||
|
||||
Users install canaries with:
|
||||
|
||||
```bash
|
||||
npx paperclipai@canary onboard
|
||||
# or
|
||||
npx paperclipai@canary onboard --data-dir "$(mktemp -d /tmp/paperclip-canary.XXXXXX)"
|
||||
```
|
||||
|
||||
### Stable
|
||||
|
||||
Use [`.github/workflows/release.yml`](../.github/workflows/release.yml) from the Actions tab with the manual `workflow_dispatch` inputs.
|
||||
|
||||
[Run the action here](https://github.com/paperclipai/paperclip/actions/workflows/release.yml)
|
||||
|
||||
Inputs:
|
||||
|
||||
- `source_ref`
|
||||
- commit SHA, branch, or tag
|
||||
- `stable_date`
|
||||
- optional UTC date override in `YYYY-MM-DD`
|
||||
- enter a date like `2026-03-18`, not a version like `2026.318.0`
|
||||
- `dry_run`
|
||||
- preview only when true
|
||||
|
||||
Before running stable:
|
||||
|
||||
1. pick the canary commit or tag you trust
|
||||
2. create or update `releases/vYYYY.M.D.md` on that source ref
|
||||
3. run the stable workflow from that source ref
|
||||
2. resolve the target stable version with `./scripts/release.sh stable --date "$(date +%F)" --print-version`
|
||||
3. create or update `releases/vYYYY.MDD.P.md` on that source ref
|
||||
4. run the stable workflow from that source ref
|
||||
|
||||
Example:
|
||||
|
||||
- `source_ref`: `master`
|
||||
- `stable_date`: `2026-03-18`
|
||||
- resulting stable version: `2026.318.0`
|
||||
|
||||
The workflow:
|
||||
|
||||
- re-verifies the exact source ref
|
||||
- publishes `YYYY.M.D` under npm dist-tag `latest`
|
||||
- creates git tag `vYYYY.M.D`
|
||||
- creates or updates the GitHub Release from `releases/vYYYY.M.D.md`
|
||||
- computes the next stable patch slot for the chosen UTC date
|
||||
- publishes `YYYY.MDD.P` under npm dist-tag `latest`
|
||||
- creates git tag `vYYYY.MDD.P`
|
||||
- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`
|
||||
|
||||
## Local Commands
|
||||
|
||||
@@ -114,22 +130,22 @@ This is mainly for emergency/manual use. The normal path is the GitHub workflow.
|
||||
|
||||
```bash
|
||||
./scripts/release.sh stable
|
||||
git push public-gh refs/tags/vYYYY.M.D
|
||||
./scripts/create-github-release.sh YYYY.M.D
|
||||
git push public-gh refs/tags/vYYYY.MDD.P
|
||||
PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P
|
||||
```
|
||||
|
||||
## Stable Changelog Workflow
|
||||
|
||||
Stable changelog files live at:
|
||||
|
||||
- `releases/vYYYY.M.D.md`
|
||||
- `releases/vYYYY.MDD.P.md`
|
||||
|
||||
Canaries do not get changelog files.
|
||||
|
||||
Recommended local generation flow:
|
||||
|
||||
```bash
|
||||
VERSION=2026.3.17
|
||||
VERSION="$(./scripts/release.sh stable --date 2026-03-18 --print-version)"
|
||||
claude --print --output-format stream-json --verbose --dangerously-skip-permissions --model claude-opus-4-6 "Use the release-changelog skill to draft or update releases/v${VERSION}.md for Paperclip. Read doc/RELEASING.md and .agents/skills/release-changelog/SKILL.md, then generate the stable changelog for v${VERSION} from commits since the last stable tag. Do not create a canary changelog."
|
||||
```
|
||||
|
||||
@@ -160,13 +176,22 @@ HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary .
|
||||
HOST_PORT=3233 DATA_DIR=./data/release-smoke-stable PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
|
||||
```
|
||||
|
||||
Automated browser smoke is also available:
|
||||
|
||||
```bash
|
||||
gh workflow run release-smoke.yml -f paperclip_version=canary
|
||||
gh workflow run release-smoke.yml -f paperclip_version=latest
|
||||
```
|
||||
|
||||
Minimum checks:
|
||||
|
||||
- `npx paperclipai@canary onboard` installs
|
||||
- onboarding completes without crashes
|
||||
- the server boots
|
||||
- the UI loads
|
||||
- basic company creation and dashboard load work
|
||||
- authenticated login works with the smoke credentials
|
||||
- the browser lands in onboarding on a fresh instance
|
||||
- company creation succeeds
|
||||
- the first CEO agent is created
|
||||
- the first CEO heartbeat run is triggered
|
||||
|
||||
## Rollback
|
||||
|
||||
@@ -175,11 +200,11 @@ Rollback does not unpublish versions.
|
||||
It only moves the `latest` dist-tag back to a previous stable:
|
||||
|
||||
```bash
|
||||
./scripts/rollback-latest.sh 2026.3.16 --dry-run
|
||||
./scripts/rollback-latest.sh 2026.3.16
|
||||
./scripts/rollback-latest.sh 2026.318.0 --dry-run
|
||||
./scripts/rollback-latest.sh 2026.318.0
|
||||
```
|
||||
|
||||
Then fix forward with a new stable release date.
|
||||
Then fix forward with a new stable patch slot or release date.
|
||||
|
||||
## Failure Playbooks
|
||||
|
||||
@@ -201,8 +226,8 @@ This is a partial release. npm is already live.
|
||||
Do this immediately:
|
||||
|
||||
1. push the missing tag
|
||||
2. rerun `./scripts/create-github-release.sh YYYY.M.D`
|
||||
3. verify the GitHub Release notes point at `releases/vYYYY.M.D.md`
|
||||
2. rerun `PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P`
|
||||
3. verify the GitHub Release notes point at `releases/vYYYY.MDD.P.md`
|
||||
|
||||
Do not republish the same version.
|
||||
|
||||
@@ -211,7 +236,7 @@ Do not republish the same version.
|
||||
Roll back the dist-tag:
|
||||
|
||||
```bash
|
||||
./scripts/rollback-latest.sh YYYY.M.D
|
||||
./scripts/rollback-latest.sh YYYY.MDD.P
|
||||
```
|
||||
|
||||
Then fix forward with a new stable release.
|
||||
|
||||
@@ -441,6 +441,7 @@ All endpoints are under `/api` and return JSON.
|
||||
- `POST /companies`
|
||||
- `GET /companies/:companyId`
|
||||
- `PATCH /companies/:companyId`
|
||||
- `PATCH /companies/:companyId/branding`
|
||||
- `POST /companies/:companyId/archive`
|
||||
|
||||
## 10.2 Goals
|
||||
@@ -843,20 +844,31 @@ V1 is complete only when all criteria are true:
|
||||
|
||||
V1 supports company import/export using a portable package contract:
|
||||
|
||||
- exactly one JSON entrypoint: `paperclip.manifest.json`
|
||||
- all other package files are markdown with frontmatter
|
||||
- agent convention:
|
||||
- `agents/<slug>/AGENTS.md` (required for V1 export/import)
|
||||
- `agents/<slug>/HEARTBEAT.md` (optional, import accepted)
|
||||
- `agents/<slug>/*.md` (optional, import accepted)
|
||||
- markdown-first package rooted at `COMPANY.md`
|
||||
- implicit folder discovery by convention
|
||||
- `.paperclip.yaml` sidecar for Paperclip-specific fidelity
|
||||
- canonical base package is vendor-neutral and aligned with `docs/companies/companies-spec.md`
|
||||
- common conventions:
|
||||
- `agents/<slug>/AGENTS.md`
|
||||
- `teams/<slug>/TEAM.md`
|
||||
- `projects/<slug>/PROJECT.md`
|
||||
- `projects/<slug>/tasks/<slug>/TASK.md`
|
||||
- `tasks/<slug>/TASK.md`
|
||||
- `skills/<slug>/SKILL.md`
|
||||
|
||||
Export/import behavior in V1:
|
||||
|
||||
- export includes company metadata and/or agents based on selection
|
||||
- export strips environment-specific paths (`cwd`, local instruction file paths)
|
||||
- export never includes secret values; secret requirements are reported
|
||||
- export emits a clean vendor-neutral markdown package plus `.paperclip.yaml`
|
||||
- projects and starter tasks are opt-in export content rather than default package content
|
||||
- recurring `TASK.md` entries use `recurring: true` in the base package and Paperclip routine fidelity in `.paperclip.yaml`
|
||||
- Paperclip imports recurring task packages as routines instead of downgrading them to one-time issues
|
||||
- export strips environment-specific paths (`cwd`, local instruction file paths, inline prompt duplication) while preserving portable project repo/workspace metadata such as `repoUrl`, refs, and workspace-policy references keyed in `.paperclip.yaml`
|
||||
- export never includes secret values; env inputs are reported as portable declarations instead
|
||||
- import supports target modes:
|
||||
- create a new company
|
||||
- import into an existing company
|
||||
- import recreates exported project workspaces and remaps portable workspace keys back to target-local workspace ids
|
||||
- import forces imported agent timer heartbeats off so packages never start scheduled runs implicitly
|
||||
- import supports collision strategies: `rename`, `skip`, `replace`
|
||||
- import supports preview (dry-run) before apply
|
||||
- GitHub imports warn on unpinned refs instead of blocking
|
||||
|
||||
doc/SPEC.md
@@ -186,17 +186,21 @@ The heartbeat is a protocol, not a runtime. Paperclip defines how to initiate an
|
||||
|
||||
### Execution Adapters
|
||||
|
||||
Agent configuration includes an **adapter** that defines how Paperclip invokes the agent. Initial adapters:
|
||||
Agent configuration includes an **adapter** that defines how Paperclip invokes the agent. Built-in adapters include:
|
||||
|
||||
| Adapter | Mechanism | Example |
|
||||
| -------------------- | ----------------------- | --------------------------------------------- |
|
||||
| `process` | Execute a child process | `python run_agent.py --agent-id {id}` |
|
||||
| `http` | Send an HTTP request | `POST https://openclaw.example.com/hook/{id}` |
|
||||
| `openclaw_gateway` | OpenClaw gateway API | Managed OpenClaw agent via gateway |
|
||||
| `gemini_local` | Gemini CLI process | Local Gemini CLI with sandbox and approval |
|
||||
| `hermes_local` | Hermes agent process | Local Hermes agent |
|
||||
| Adapter | Mechanism | Example |
|
||||
| ---------------- | -------------------------- | -------------------------------------------------- |
|
||||
| `process` | Execute a child process | `python run_agent.py --agent-id {id}` |
|
||||
| `http` | Send an HTTP request | `POST https://openclaw.example.com/hook/{id}` |
|
||||
| `claude_local` | Local Claude Code process | Claude Code heartbeat worker |
|
||||
| `codex_local` | Local Codex process | Codex CLI heartbeat worker |
|
||||
| `opencode_local` | Local OpenCode process | OpenCode heartbeat worker |
|
||||
| `pi_local` | Local Pi process | Pi CLI heartbeat worker |
|
||||
| `cursor` | Cursor API/CLI bridge | Cursor-integrated heartbeat worker |
|
||||
| `openclaw_gateway` | OpenClaw gateway API | Managed OpenClaw agent via gateway |
|
||||
| `hermes_local` | Local Hermes process | Hermes agent heartbeat worker |
|
||||
|
||||
The `process` and `http` adapters ship as defaults. Additional adapters have been added for specific agent runtimes (see list above), and new adapter types can be registered via the plugin system (see Plugin / Extension Architecture).
|
||||
The `process` and `http` adapters ship as generic defaults. Additional built-in adapters cover common local coding runtimes (see list above), and new adapter types can be registered via the plugin system (see Plugin / Extension Architecture).
|
||||
|
||||
### Adapter Interface
|
||||
|
||||
@@ -376,7 +380,7 @@ Flow:
|
||||
| Layer | Technology |
|
||||
| -------- | ------------------------------------------------------------ |
|
||||
| Frontend | React + Vite |
|
||||
| Backend | TypeScript + Hono (REST API, not tRPC — need non-TS clients) |
|
||||
| Backend | TypeScript + Express (REST API, not tRPC — need non-TS clients) |
|
||||
| Database | PostgreSQL (see [doc/DATABASE.md](./doc/DATABASE.md) for details — PGlite embedded for dev, Docker or hosted Supabase for production) |
|
||||
| Auth | [Better Auth](https://www.better-auth.com/) |
|
||||
|
||||
@@ -406,7 +410,7 @@ No separate "agent API" vs. "board API." Same endpoints, different authorization
|
||||
|
||||
### Work Artifacts
|
||||
|
||||
Paperclip does **not** manage work artifacts (code repos, file systems, deployments, documents). That's entirely the agent's domain. Paperclip tracks tasks and costs. Where and how work gets done is outside scope.
|
||||
Paperclip manages task-linked work artifacts: issue documents (rich-text plans, specs, notes attached to issues) and file attachments. Agents read and write these through the API as part of normal task execution. Full delivery infrastructure (code repos, deployments, production runtime) remains the agent's domain — Paperclip orchestrates the work, not the build pipeline.
|
||||
|
||||
### Open Questions
|
||||
|
||||
@@ -476,15 +480,14 @@ Each is a distinct page/route:
|
||||
- [ ] **Default agent** — basic Claude Code/Codex loop with Paperclip skill
|
||||
- [ ] **Default CEO** — strategic planning, delegation, board communication
|
||||
- [ ] **Paperclip skill (SKILL.md)** — teaches agents to interact with the API
|
||||
- [ ] **REST API** — full API for agent interaction (Hono)
|
||||
- [ ] **REST API** — full API for agent interaction (Express)
|
||||
- [ ] **Web UI** — React/Vite: org chart, task board, dashboard, cost views
|
||||
- [ ] **Agent auth** — connection string generation with URL + key + instructions
|
||||
- [ ] **One-command dev setup** — embedded PGlite, everything local
|
||||
- [ ] **Multiple Adapter types** (HTTP Adapter, OpenClaw Adapter)
|
||||
- [ ] **Multiple Adapter types** (HTTP, OpenClaw gateway, and local coding adapters)
|
||||
|
||||
### Not V1
|
||||
|
||||
- Template export/import
|
||||
- Knowledge base - a future plugin
|
||||
- Advanced governance models (hiring budgets, multi-member boards)
|
||||
- Revenue/expense tracking beyond token costs - a future plugin
|
||||
@@ -509,7 +512,7 @@ Things Paperclip explicitly does **not** do:
|
||||
- **Not a SaaS** — single-tenant, self-hosted
|
||||
- **Not opinionated about Agent implementation** — any language, any framework, any runtime
|
||||
- **Not automatically self-healing** — surfaces problems, doesn't silently fix them
|
||||
- **Does not manage work artifacts** — no repo management, no deployment, no file systems
|
||||
- **Does not manage delivery infrastructure** — no repo management, no deployment, no file systems (but does manage task-linked documents and attachments)
|
||||
- **Does not auto-reassign work** — stale tasks are surfaced, not silently redistributed
|
||||
- **Does not track external revenue/expenses** — that's a future plugin. Token/LLM cost budgeting is core.
|
||||
|
||||
|
||||
@@ -16,14 +16,14 @@ By default this workflow does **not** mount your host repo checkout, your host h
|
||||
## Files
|
||||
|
||||
- `docker/untrusted-review/Dockerfile`
|
||||
- `docker-compose.untrusted-review.yml`
|
||||
- `docker/docker-compose.untrusted-review.yml`
|
||||
- `review-checkout-pr` inside the container
|
||||
|
||||
## Build and start a shell
|
||||
|
||||
```sh
|
||||
docker compose -f docker-compose.untrusted-review.yml build
|
||||
docker compose -f docker-compose.untrusted-review.yml run --rm --service-ports review
|
||||
docker compose -f docker/docker-compose.untrusted-review.yml build
|
||||
docker compose -f docker/docker-compose.untrusted-review.yml run --rm --service-ports review
|
||||
```
|
||||
|
||||
That opens an interactive shell in the review container with:
|
||||
@@ -47,7 +47,7 @@ claude login
|
||||
If you prefer API-key auth instead of CLI login, pass keys through Compose env:
|
||||
|
||||
```sh
|
||||
OPENAI_API_KEY=... ANTHROPIC_API_KEY=... docker compose -f docker-compose.untrusted-review.yml run --rm review
|
||||
OPENAI_API_KEY=... ANTHROPIC_API_KEY=... docker compose -f docker/docker-compose.untrusted-review.yml run --rm review
|
||||
```
|
||||
|
||||
## Check out a PR safely
|
||||
@@ -117,7 +117,7 @@ Notes:
|
||||
Remove the review container volumes when you want a clean environment:
|
||||
|
||||
```sh
|
||||
docker compose -f docker-compose.untrusted-review.yml down -v
|
||||
docker compose -f docker/docker-compose.untrusted-review.yml down -v
|
||||
```
|
||||
|
||||
That deletes:
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
# Paperclip Module System
|
||||
|
||||
> Supersession note: the company-template/package-format direction in this document is no longer current. For the current markdown-first company import/export plan, see `doc/plans/2026-03-13-company-import-export-v2.md` and `docs/companies/companies-spec.md`.
|
||||
|
||||
## Overview
|
||||
|
||||
Paperclip's module system lets you extend the control plane with new capabilities — revenue tracking, observability, notifications, dashboards — without forking core. Modules are self-contained packages that register routes, UI pages, database tables, and lifecycle hooks.
|
||||
|
||||
doc/plans/2026-03-13-company-import-export-v2.md (new file)
@@ -0,0 +1,644 @@
|
||||
# 2026-03-13 Company Import / Export V2 Plan
|
||||
|
||||
Status: Proposed implementation plan
|
||||
Date: 2026-03-13
|
||||
Audience: Product and engineering
|
||||
Supersedes for package-format direction:
|
||||
- `doc/plans/2026-02-16-module-system.md` sections that describe company templates as JSON-only
|
||||
- `docs/specs/cliphub-plan.md` assumptions about blueprint bundle shape where they conflict with the markdown-first package model
|
||||
|
||||
## 1. Purpose
|
||||
|
||||
This document defines the next-stage plan for Paperclip company import/export.
|
||||
|
||||
The core shift is:
|
||||
|
||||
- move from a Paperclip-specific JSON-first portability package toward a markdown-first package format
|
||||
- make GitHub repositories first-class package sources
|
||||
- treat the company package model as an extension of the existing Agent Skills ecosystem instead of inventing a separate skill format
|
||||
- support company, team, agent, and skill reuse without requiring a central registry
|
||||
|
||||
The normative package format draft lives in:
|
||||
|
||||
- `docs/companies/companies-spec.md`
|
||||
|
||||
This plan is about implementation and rollout inside Paperclip.
|
||||
|
||||
Adapter-wide skill rollout details live in:
|
||||
|
||||
- `doc/plans/2026-03-14-adapter-skill-sync-rollout.md`
|
||||
|
||||
## 2. Executive Summary
|
||||
|
||||
Paperclip already has portability primitives in the repo:
|
||||
|
||||
- server import/export/preview APIs
|
||||
- CLI import/export commands
|
||||
- shared portability types and validators
|
||||
|
||||
Those primitives are being cut over to the new package model rather than extended for backward compatibility.
|
||||
|
||||
The new direction is:
|
||||
|
||||
1. markdown-first package authoring
|
||||
2. GitHub repo or local folder as the default source of truth
|
||||
3. a vendor-neutral base package spec for agent-company runtimes, not just Paperclip
|
||||
4. the company package model is explicitly an extension of Agent Skills
|
||||
5. no future dependency on `paperclip.manifest.json`
|
||||
6. implicit folder discovery by convention for the common case
|
||||
7. an always-emitted `.paperclip.yaml` sidecar for high-fidelity Paperclip-specific details
|
||||
8. package graph resolution at import time
|
||||
9. entity-level import UI with dependency-aware tree selection
|
||||
10. `skills.sh` compatibility is a V1 requirement for skill packages and skill installation flows
|
||||
11. adapter-aware skill sync surfaces so Paperclip can read, diff, enable, disable, and reconcile skills where the adapter supports it
|
||||
|
||||
## 3. Product Goals
|
||||
|
||||
### 3.1 Goals
|
||||
|
||||
- A user can point Paperclip at a local folder or GitHub repo and import a company package without any registry.
|
||||
- A package is readable and writable by humans with normal git workflows.
|
||||
- A package can contain:
|
||||
- company definition
|
||||
- org subtree / team definition
|
||||
- agent definitions
|
||||
- optional starter projects and tasks
|
||||
- reusable skills
|
||||
- V1 skill support is compatible with the existing `skills.sh` / Agent Skills ecosystem.
|
||||
- A user can import into:
|
||||
- a new company
|
||||
- an existing company
|
||||
- Import preview shows:
|
||||
- what will be created
|
||||
- what will be updated
|
||||
- what is skipped
|
||||
- what is referenced externally
|
||||
- what needs secrets or approvals
|
||||
- Export preserves attribution, licensing, and pinned upstream references.
|
||||
- Export produces a clean vendor-neutral package plus a Paperclip sidecar.
|
||||
- `companies.sh` can later act as a discovery/index layer over repos implementing this format.
|
||||
|
||||
### 3.2 Non-Goals
|
||||
|
||||
- No central registry is required for package validity.
|
||||
- This is not full database backup/restore.
|
||||
- This does not attempt to export runtime state like:
|
||||
- heartbeat runs
|
||||
- API keys
|
||||
- spend totals
|
||||
- run sessions
|
||||
- transient workspaces
|
||||
- This does not require a first-class runtime `teams` table before team portability ships.
|
||||
|
||||
## 4. Current State In Repo
|
||||
|
||||
Current implementation exists here:
|
||||
|
||||
- shared types: `packages/shared/src/types/company-portability.ts`
|
||||
- shared validators: `packages/shared/src/validators/company-portability.ts`
|
||||
- server routes: `server/src/routes/companies.ts`
|
||||
- server service: `server/src/services/company-portability.ts`
|
||||
- CLI commands: `cli/src/commands/client/company.ts`
|
||||
|
||||
Current product limitations:
|
||||
|
||||
1. Import/export UX still needs deeper tree-selection and skill/package management polish.
|
||||
2. Adapter-specific skill sync remains uneven across adapters and must degrade cleanly when unsupported.
|
||||
3. Projects and starter tasks should stay opt-in on export rather than default package content.
|
||||
4. Import/export still needs stronger coverage around attribution, pin verification, and executable-package warnings.
|
||||
5. The current markdown frontmatter parser is intentionally lightweight and should stay constrained to the documented shape.
|
||||
|
||||
## 5. Canonical Package Direction
|
||||
|
||||
### 5.1 Canonical Authoring Format
|
||||
|
||||
The canonical authoring format becomes a markdown-first package rooted in one of:
|
||||
|
||||
- `COMPANY.md`
|
||||
- `TEAM.md`
|
||||
- `AGENTS.md`
|
||||
- `PROJECT.md`
|
||||
- `TASK.md`
|
||||
- `SKILL.md`
|
||||
|
||||
The normative draft is:
|
||||
|
||||
- `docs/companies/companies-spec.md`
|
||||
|
||||
### 5.2 Relationship To Agent Skills
|
||||
|
||||
Paperclip must not redefine `SKILL.md`.
|
||||
|
||||
Rules:
|
||||
|
||||
- `SKILL.md` stays Agent Skills compatible
|
||||
- the company package model is an extension of Agent Skills
|
||||
- the base package is vendor-neutral and intended for any agent-company runtime
|
||||
- Paperclip-specific fidelity lives in `.paperclip.yaml`
|
||||
- Paperclip may resolve and install `SKILL.md` packages, but it must not require a Paperclip-only skill format
|
||||
- `skills.sh` compatibility is a V1 requirement, not a future nice-to-have
|
||||
|
||||
### 5.3 Agent-To-Skill Association
|
||||
|
||||
`AGENTS.md` should associate skills by skill shortname or slug, not by verbose path in the common case.
|
||||
|
||||
Preferred example:
|
||||
|
||||
- `skills: [review, react-best-practices]`
|
||||
|
||||
Resolution model:
|
||||
|
||||
- `review` resolves to `skills/review/SKILL.md` by package convention
|
||||
- if the skill is external or referenced, the skill package owns that complexity
|
||||
- exporters should prefer shortname-based associations in `AGENTS.md`
|
||||
- importers should resolve the shortname against local package skills first, then referenced or installed company skills
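A minimal sketch of that resolution order, assuming a Node-style filesystem layout; the function and the `installedCompanySkills` lookup are illustrative, not part of the current codebase:

```typescript
import { existsSync } from "node:fs";
import { join } from "node:path";

// Sketch only: resolve an AGENTS.md skill shortname to a SKILL.md path.
// `installedCompanySkills` is an assumed map of already-imported or referenced
// company skills keyed by slug.
function resolveSkill(
  packageRoot: string,
  shortname: string,
  installedCompanySkills: Map<string, string>,
): string | undefined {
  // 1. local package convention: skills/<slug>/SKILL.md
  const local = join(packageRoot, "skills", shortname, "SKILL.md");
  if (existsSync(local)) return local;
  // 2. fall back to referenced or installed company skills
  return installedCompanySkills.get(shortname);
}
```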
|
||||
### 5.4 Base Package Vs Paperclip Extension
|
||||
|
||||
The repo format should have two layers:
|
||||
|
||||
- base package:
|
||||
- minimal, readable, social, vendor-neutral
|
||||
- implicit folder discovery by convention
|
||||
- no Paperclip-only runtime fields by default
|
||||
- Paperclip extension:
|
||||
- `.paperclip.yaml`
|
||||
- adapter/runtime/permissions/budget/workspace fidelity
|
||||
- emitted by Paperclip tools as a sidecar while the base package stays readable
|
||||
|
||||
### 5.5 Relationship To Current V1 Manifest
|
||||
|
||||
`paperclip.manifest.json` is not part of the future package direction.
|
||||
|
||||
This should be treated as a hard cutover in product direction.
|
||||
|
||||
- markdown-first repo layout is the target
|
||||
- no new work should deepen investment in the old manifest model
|
||||
- future portability APIs and UI should target the markdown-first model only
|
||||
|
||||
## 6. Package Graph Model
|
||||
|
||||
### 6.1 Entity Kinds
|
||||
|
||||
Paperclip import/export should support these entity kinds:
|
||||
|
||||
- company
|
||||
- team
|
||||
- agent
|
||||
- project
|
||||
- task
|
||||
- skill
|
||||
|
||||
### 6.2 Team Semantics
|
||||
|
||||
`team` is a package concept first, not a database-table requirement.
|
||||
|
||||
In Paperclip V2 portability:
|
||||
|
||||
- a team is an importable org subtree
|
||||
- it is rooted at a manager agent
|
||||
- it can be attached under a target manager in an existing company
|
||||
|
||||
This avoids blocking portability on a future runtime `teams` model.
|
||||
|
||||
Imported-team tracking should initially be package/provenance-based:
|
||||
|
||||
- if a team package was imported, the imported agents should carry enough provenance to reconstruct that grouping
|
||||
- Paperclip can treat “this set of agents came from team package X” as the imported-team model
|
||||
- provenance grouping is the intended near- and medium-term team model for import/export
|
||||
- only add a first-class runtime `teams` table later if product needs move beyond what provenance grouping can express
|
||||
|
||||
### 6.3 Dependency Graph
|
||||
|
||||
Import should operate on an entity graph, not raw file selection.
|
||||
|
||||
Examples:
|
||||
|
||||
- selecting an agent auto-selects its required docs and skill refs
|
||||
- selecting a team auto-selects its subtree
|
||||
- selecting a company auto-selects all included entities by default
|
||||
- selecting a project auto-selects its starter tasks
|
||||
|
||||
The preview output should reflect graph resolution explicitly.
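A minimal sketch of dependency-aware selection, assuming each package entity carries a `dependsOn` list of entity ids; the shape is illustrative, not the actual preview payload:

```typescript
// Sketch only: expand an explicit selection to everything it depends on.
interface PackageEntity {
  id: string;          // e.g. "agent:cto", "skill:review", "project:launch"
  dependsOn: string[]; // required docs, skill refs, subtree members, starter tasks
}

function resolveSelection(entities: Map<string, PackageEntity>, selectedIds: string[]): Set<string> {
  const resolved = new Set<string>();
  const queue = [...selectedIds];
  while (queue.length > 0) {
    const id = queue.pop()!;
    if (resolved.has(id)) continue;
    resolved.add(id);
    for (const dep of entities.get(id)?.dependsOn ?? []) queue.push(dep);
  }
  return resolved;
}
```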
|
||||
|
||||
## 7. External References, Pinning, And Attribution
|
||||
|
||||
### 7.1 Why This Matters
|
||||
|
||||
Some packages will:
|
||||
|
||||
- reference upstream files we do not want to republish
|
||||
- include third-party work where attribution must remain visible
|
||||
- need protection from branch hot-swapping
|
||||
|
||||
### 7.2 Policy
|
||||
|
||||
Paperclip should support source references in package metadata with:
|
||||
|
||||
- repo
|
||||
- path
|
||||
- commit sha
|
||||
- optional blob sha
|
||||
- optional sha256
|
||||
- attribution
|
||||
- license
|
||||
- usage mode
|
||||
|
||||
Usage modes:
|
||||
|
||||
- `vendored`
|
||||
- `referenced`
|
||||
- `mirrored`
|
||||
|
||||
Default exporter behavior for third-party content should be:
|
||||
|
||||
- prefer `referenced`
|
||||
- preserve attribution
|
||||
- do not silently inline third-party content into exports
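A rough TypeScript shape for this source-reference metadata; field names are illustrative and would need to match the companies spec:

```typescript
// Sketch only: package metadata for upstream or third-party content.
type UsageMode = "vendored" | "referenced" | "mirrored";

interface SourceReference {
  repo: string;         // e.g. "github.com/example/skills" (hypothetical)
  path: string;         // path inside the upstream repo
  commitSha: string;    // pin against branch hot-swapping
  blobSha?: string;     // optional per-file pin
  sha256?: string;      // optional content hash
  attribution: string;  // who authored the upstream content
  license: string;      // license identifier or reference
  usage: UsageMode;     // prefer "referenced" for third-party content
}
```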
|
||||
|
||||
### 7.3 Trust Model
|
||||
|
||||
Imported package content should be classified by trust level:
|
||||
|
||||
- markdown-only
|
||||
- markdown + assets
|
||||
- markdown + scripts/executables
|
||||
|
||||
The UI and CLI should surface this clearly before apply.
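A minimal sketch of how that classification could be computed from the package file list; treating `.yaml` sidecars as non-executable content is an assumption of this sketch:

```typescript
// Sketch only: classify package contents into the trust levels above by file type.
type TrustLevel = "markdown-only" | "markdown+assets" | "markdown+scripts";

function classifyTrust(files: string[]): TrustLevel {
  const nonMarkdown = files.filter((f) => !f.endsWith(".md") && !f.endsWith(".yaml"));
  if (nonMarkdown.length === 0) return "markdown-only";
  const scriptLike = nonMarkdown.some((f) => /\.(sh|js|mjs|ts|py)$/.test(f));
  return scriptLike ? "markdown+scripts" : "markdown+assets";
}
```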
|
||||
|
||||
## 8. Import Behavior
|
||||
|
||||
### 8.1 Supported Sources
|
||||
|
||||
- local folder
|
||||
- local package root file
|
||||
- GitHub repo URL
|
||||
- GitHub subtree URL
|
||||
- direct URL to markdown/package root
|
||||
|
||||
Registry-based discovery may be added later, but must remain optional.
|
||||
|
||||
### 8.2 Import Targets
|
||||
|
||||
- new company
|
||||
- existing company
|
||||
|
||||
For existing company imports, the preview must support:
|
||||
|
||||
- collision handling
|
||||
- attach-point selection for team imports
|
||||
- selective entity import
|
||||
|
||||
### 8.3 Collision Strategy
|
||||
|
||||
Current `rename | skip | replace` support remains, but matching should improve over time.
|
||||
|
||||
Preferred matching order:
|
||||
|
||||
1. prior install provenance
|
||||
2. stable package entity identity
|
||||
3. slug
|
||||
4. human name as weak fallback
|
||||
|
||||
Slug-only matching is acceptable only as a transitional strategy.
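A minimal sketch of the preferred matching order; the entity shape and field names are illustrative:

```typescript
// Sketch only: match an incoming package entity against existing company entities
// using the order above (provenance, package identity, slug, then name).
interface ExistingEntity {
  id: string;
  slug: string;
  name: string;
  installProvenance?: string; // id of the package install that created it
  packageEntityId?: string;   // stable identity carried by the package
}

function findCollision(
  incoming: { provenance?: string; packageEntityId?: string; slug: string; name: string },
  existing: ExistingEntity[],
): ExistingEntity | undefined {
  return (
    existing.find((e) => incoming.provenance && e.installProvenance === incoming.provenance) ??
    existing.find((e) => incoming.packageEntityId && e.packageEntityId === incoming.packageEntityId) ??
    existing.find((e) => e.slug === incoming.slug) ??
    existing.find((e) => e.name === incoming.name) // weak fallback
  );
}
```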
|
||||
|
||||
### 8.4 Required Preview Output
|
||||
|
||||
Every import preview should surface:
|
||||
|
||||
- target company action
|
||||
- entity-level create/update/skip plan
|
||||
- referenced external content
|
||||
- missing files
|
||||
- hash mismatch or pinning issues
|
||||
- env inputs, including required vs optional and default values when present
|
||||
- unsupported content types
|
||||
- trust/licensing warnings
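One possible shape for that preview payload, sketched here only to make the required fields concrete; the real API contract is defined elsewhere:

```typescript
// Sketch only: import preview output covering the items listed above.
interface ImportPreview {
  targetCompanyAction: "create" | "update";
  entities: Array<{ kind: string; slug: string; action: "create" | "update" | "skip" }>;
  externalReferences: string[]; // referenced (not vendored) content
  missingFiles: string[];
  pinIssues: string[];          // hash mismatches, unpinned refs
  envInputs: Array<{ name: string; required: boolean; defaultValue?: string }>;
  unsupportedContent: string[];
  warnings: string[];           // trust and licensing warnings
}
```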
|
||||
|
||||
### 8.5 Adapter Skill Sync Surface
|
||||
|
||||
People want skill management in the UI, but skills are adapter-dependent.
|
||||
|
||||
That means portability and UI planning must include an adapter capability model for skills.
|
||||
|
||||
Paperclip should define a new adapter surface area around skills:
|
||||
|
||||
- list currently enabled skills for an agent
|
||||
- report how those skills are represented by the adapter
|
||||
- install or enable a skill
|
||||
- disable or remove a skill
|
||||
- report sync state between desired package config and actual adapter state
|
||||
|
||||
Examples:
|
||||
|
||||
- Claude Code / Codex style adapters may manage skills as local filesystem packages or adapter-owned skill directories
|
||||
- OpenClaw-style adapters may expose currently enabled skills through an API or a reflected config surface
|
||||
- some adapters may be read-only and only report what they have
|
||||
|
||||
Planned adapter capability shape:
|
||||
|
||||
- `supportsSkillRead`
|
||||
- `supportsSkillWrite`
|
||||
- `supportsSkillRemove`
|
||||
- `supportsSkillSync`
|
||||
- `skillStorageKind` such as `filesystem`, `remote_api`, `inline_config`, or `unknown`
|
||||
|
||||
Baseline adapter interface:
|
||||
|
||||
- `listSkills(agent)`
|
||||
- `applySkills(agent, desiredSkills)`
|
||||
- `removeSkill(agent, skillId)` optional
|
||||
- `getSkillSyncState(agent, desiredSkills)` optional
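A minimal TypeScript sketch of the capability shape plus the baseline interface above; names follow the plan text, and the placeholder types exist only to make the sketch self-contained:

```typescript
// Sketch only: planned adapter skill surface, not a shipped API.
type SkillStorageKind = "filesystem" | "remote_api" | "inline_config" | "unknown";

interface SkillCapabilities {
  supportsSkillRead: boolean;
  supportsSkillWrite: boolean;
  supportsSkillRemove: boolean;
  supportsSkillSync: boolean;
  skillStorageKind: SkillStorageKind;
}

interface SkillAdapter {
  capabilities: SkillCapabilities;
  listSkills(agent: AgentRef): Promise<InstalledSkill[]>;
  applySkills(agent: AgentRef, desiredSkills: DesiredSkill[]): Promise<void>;
  removeSkill?(agent: AgentRef, skillId: string): Promise<void>;
  getSkillSyncState?(agent: AgentRef, desiredSkills: DesiredSkill[]): Promise<SkillSyncState>;
}

// Placeholder types so the sketch stands alone.
interface AgentRef { id: string }
interface InstalledSkill { slug: string; managedByPaperclip: boolean }
interface DesiredSkill { slug: string }
interface SkillSyncState { missing: string[]; stale: string[]; external: string[] }
```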
|
||||
|
||||
Planned Paperclip behavior:
|
||||
|
||||
- if an adapter supports read, Paperclip should show current skills in the UI
|
||||
- if an adapter supports write, Paperclip should let the user enable/disable imported skills
|
||||
- if an adapter supports sync, Paperclip should compute desired vs actual state and offer reconcile actions
|
||||
- if an adapter does not support these capabilities, the UI should still show the package-level desired skills but mark them unmanaged
|
||||
|
||||
## 9. Export Behavior
|
||||
|
||||
### 9.1 Default Export Target
|
||||
|
||||
Default export target should become a markdown-first folder structure.
|
||||
|
||||
Example:
|
||||
|
||||
```text
|
||||
my-company/
|
||||
├── COMPANY.md
|
||||
├── agents/
|
||||
├── teams/
|
||||
└── skills/
|
||||
```
|
||||
|
||||
### 9.2 Export Rules
|
||||
|
||||
Exports should:
|
||||
|
||||
- omit machine-local ids
|
||||
- omit timestamps and counters unless explicitly needed
|
||||
- omit secret values
|
||||
- omit local absolute paths
|
||||
- omit duplicated inline prompt content from `.paperclip.yaml` when `AGENTS.md` already carries the instructions
|
||||
- preserve references and attribution
|
||||
- emit `.paperclip.yaml` alongside the base package
|
||||
- express adapter env/secrets as portable env input declarations rather than exported secret binding ids
|
||||
- preserve compatible `SKILL.md` content as-is
|
||||
|
||||
Projects and issues should not be exported by default.
|
||||
|
||||
They should be opt-in through selectors such as:
|
||||
|
||||
- `--projects project-shortname-1,project-shortname-2`
|
||||
- `--issues PAP-1,PAP-3`
|
||||
- `--project-issues project-shortname-1,project-shortname-2`
|
||||
|
||||
This supports “clean public company package” workflows where a maintainer exports a follower-facing company package without bundling active work items every time.
|
||||
|
||||
### 9.3 Export Units
|
||||
|
||||
Initial export units:
|
||||
|
||||
- company package
|
||||
- team package
|
||||
- single agent package
|
||||
|
||||
Later optional units:
|
||||
|
||||
- skill pack export
|
||||
- seed projects/tasks bundle
|
||||
|
||||
## 10. Storage Model Inside Paperclip
|
||||
|
||||
### 10.1 Short-Term
|
||||
|
||||
In the first phase, imported entities can continue mapping onto current runtime tables:
|
||||
|
||||
- company -> companies
|
||||
- agent -> agents
|
||||
- team -> imported agent subtree attachment plus package provenance grouping
|
||||
- skill -> company-scoped reusable package metadata plus agent-scoped desired-skill attachment state where supported
|
||||
|
||||
### 10.2 Medium-Term
|
||||
|
||||
Paperclip should add managed package/provenance records so imports are not anonymous one-off copies.
|
||||
|
||||
Needed capabilities:
|
||||
|
||||
- remember install origin
|
||||
- support re-import / upgrade
|
||||
- distinguish local edits from upstream package state
|
||||
- preserve external refs and package-level metadata
|
||||
- preserve imported team grouping without requiring a runtime `teams` table immediately
|
||||
- preserve desired-skill state separately from adapter runtime state
|
||||
- support both company-scoped reusable skills and agent-scoped skill attachments
|
||||
|
||||
Suggested future tables:
|
||||
|
||||
- package_installs
|
||||
- package_install_entities
|
||||
- package_sources
|
||||
- agent_skill_desires
|
||||
- adapter_skill_snapshots
|
||||
|
||||
This is not required for phase 1 UI, but it is required for a robust long-term system.
|
||||
|
||||
## 11. API Plan
|
||||
|
||||
### 11.1 Keep Existing Endpoints Initially
|
||||
|
||||
Retain:
|
||||
|
||||
- `POST /api/companies/:companyId/export`
|
||||
- `POST /api/companies/import/preview`
|
||||
- `POST /api/companies/import`
|
||||
|
||||
But evolve payloads toward the markdown-first graph model.
|
||||
|
||||
### 11.2 New API Capabilities
|
||||
|
||||
Add support for:
|
||||
|
||||
- package root resolution from local/GitHub inputs
|
||||
- graph resolution preview
|
||||
- source pin and hash verification results
|
||||
- entity-level selection
|
||||
- team attach target selection
|
||||
- provenance-aware collision planning
|
||||
|
||||
### 11.3 Parsing Changes
|
||||
|
||||
Replace the current ad hoc markdown frontmatter parser with a real parser that can handle:
|
||||
|
||||
- nested YAML
|
||||
- arrays/objects reliably
|
||||
- consistent round-tripping
|
||||
|
||||
This is a prerequisite for the new package model.
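A minimal sketch of what the replacement could look like, assuming a YAML library such as `js-yaml` (the actual library choice is still open):

```typescript
import { load } from "js-yaml";

// Sketch only: split frontmatter from the markdown body and parse it with a real
// YAML parser so nested objects and arrays round-trip reliably. Error handling elided.
function parseFrontmatter(markdown: string): { data: unknown; body: string } {
  const match = /^---\n([\s\S]*?)\n---\n?/.exec(markdown);
  if (!match) return { data: {}, body: markdown };
  return { data: load(match[1]) ?? {}, body: markdown.slice(match[0].length) };
}
```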
|
||||
|
||||
## 12. CLI Plan
|
||||
|
||||
The CLI should continue to support direct import/export without a registry.
|
||||
|
||||
Target commands:
|
||||
|
||||
- `paperclipai company export <company-id> --out <path>`
|
||||
- `paperclipai company import <path-or-url> --dry-run`
|
||||
- `paperclipai company import <path-or-url> --target existing -C <company-id>`
|
||||
|
||||
Planned additions:
|
||||
|
||||
- `--package-kind company|team|agent`
|
||||
- `--attach-under <agent-id-or-slug>` for team imports
|
||||
- `--strict-pins`
|
||||
- `--allow-unpinned`
|
||||
- `--materialize-references`
|
||||
- `--sync-skills`
|
||||
|
||||
## 13. UI Plan
|
||||
|
||||
### 13.1 Company Settings Import / Export
|
||||
|
||||
Add a real import/export section to Company Settings.
|
||||
|
||||
Export UI:
|
||||
|
||||
- export package kind selector
|
||||
- include options
|
||||
- local download/export destination guidance
|
||||
- attribution/reference summary
|
||||
|
||||
Import UI:
|
||||
|
||||
- source entry:
|
||||
- upload/folder where supported
|
||||
- GitHub URL
|
||||
- generic URL
|
||||
- preview pane with:
|
||||
- resolved package root
|
||||
- dependency tree
|
||||
- checkboxes by entity
|
||||
- trust/licensing warnings
|
||||
- secrets requirements
|
||||
- collision plan
|
||||
|
||||
### 13.2 Team Import UX
|
||||
|
||||
If importing a team into an existing company:
|
||||
|
||||
- show the subtree structure
|
||||
- require the user to choose where to attach it
|
||||
- preview manager/reporting updates before apply
|
||||
- preserve imported-team provenance so the UI can later say “these agents came from team package X”
|
||||
|
||||
### 13.3 Skills UX
|
||||
|
||||
See also:
|
||||
|
||||
- `doc/plans/2026-03-14-skills-ui-product-plan.md`
|
||||
|
||||
If importing skills:
|
||||
|
||||
- show whether each skill is local, vendored, or referenced
|
||||
- show whether it contains scripts/assets
|
||||
- preserve Agent Skills compatibility in presentation and export
|
||||
- preserve `skills.sh` compatibility in both import and install flows
|
||||
- show agent skill attachments by shortname/slug rather than noisy file paths
|
||||
- treat agent skills as a dedicated agent tab, not just another subsection of configuration
|
||||
- show current adapter-reported skills when supported
|
||||
- show desired package skills separately from actual adapter state
|
||||
- offer reconcile actions when the adapter supports sync
|
||||
|
||||
## 14. Rollout Phases
|
||||
|
||||
### Phase 1: Stabilize Current V1 Portability
|
||||
|
||||
- add tests for current portability flows
|
||||
- replace the frontmatter parser
|
||||
- add Company Settings UI for current import/export capabilities
|
||||
- start cutover work toward the markdown-first package reader
|
||||
|
||||
### Phase 2: Markdown-First Package Reader
|
||||
|
||||
- support `COMPANY.md` / `TEAM.md` / `AGENTS.md` root detection
|
||||
- build internal graph from markdown-first packages
|
||||
- support local folder and GitHub repo inputs natively
|
||||
- support agent skill references by shortname/slug
|
||||
- resolve local `skills/<slug>/SKILL.md` packages by convention
|
||||
- support `skills.sh`-compatible skill repos as V1 package sources
|
||||
|
||||
### Phase 3: Graph-Based Import UX And Skill Surfaces
|
||||
|
||||
- entity tree preview
|
||||
- checkbox selection
|
||||
- team subtree attach flow
|
||||
- licensing/trust/reference warnings
|
||||
- company skill library groundwork
|
||||
- dedicated agent `Skills` tab groundwork
|
||||
- adapter skill read/sync UI groundwork
|
||||
|
||||
### Phase 4: New Export Model
|
||||
|
||||
- export markdown-first folder structure by default
|
||||
|
||||
### Phase 5: Provenance And Upgrades
|
||||
|
||||
- persist install provenance
|
||||
- support package-aware re-import and upgrades
|
||||
- improve collision matching beyond slug-only
|
||||
- add imported-team provenance grouping
|
||||
- add desired-vs-actual skill sync state
|
||||
|
||||
### Phase 6: Optional Seed Content
|
||||
|
||||
- goals
|
||||
- projects
|
||||
- starter issues/tasks
|
||||
|
||||
This phase is intentionally after the structural model is stable.
|
||||
|
||||
## 15. Documentation Plan
|
||||
|
||||
Primary docs:
|
||||
|
||||
- `docs/companies/companies-spec.md` as the package-format draft
|
||||
- this implementation plan for rollout sequencing
|
||||
|
||||
Docs to update later as implementation lands:
|
||||
|
||||
- `doc/SPEC-implementation.md`
|
||||
- `docs/api/companies.md`
|
||||
- `docs/cli/control-plane-commands.md`
|
||||
- board operator docs for Company Settings import/export
|
||||
|
||||
## 16. Open Questions
|
||||
|
||||
1. Should imported skill packages be stored as managed package files in Paperclip storage, or only referenced at import time?
|
||||
Decision: store them as managed package files that support both company-scoped reuse and agent-scoped attachment.
|
||||
2. What is the minimum adapter skill interface needed to make the UI useful across Claude Code, Codex, OpenClaw, and future adapters?
|
||||
Decision: use the baseline interface in section 8.5.
|
||||
3. Should Paperclip support direct local folder selection in the web UI, or keep that CLI-only initially?
|
||||
4. Do we want optional generated lock files in phase 2, or defer them until provenance work?
|
||||
5. How strict should pinning be by default for GitHub references:
|
||||
- warn on unpinned
|
||||
- or block in normal mode
|
||||
6. Is package-provenance grouping enough for imported teams, or do we expect product requirements soon that would justify a first-class runtime `teams` table?
|
||||
Decision: provenance grouping is enough for the import/export product model for now.
|
||||
|
||||
## 17. Recommendation
|
||||
|
||||
Engineering should treat this as the current plan of record for company import/export beyond the existing V1 portability feature.
|
||||
|
||||
Immediate next steps:
|
||||
|
||||
1. accept `docs/companies/companies-spec.md` as the package-format draft
|
||||
2. implement phase 1 stabilization work
|
||||
3. build phase 2 markdown-first package reader before expanding ClipHub or `companies.sh`
|
||||
4. treat the old manifest-based format as deprecated and not part of the future surface
|
||||
|
||||
This keeps Paperclip aligned with:
|
||||
|
||||
- GitHub-native distribution
|
||||
- Agent Skills compatibility
|
||||
- a registry-optional ecosystem model
|
||||
doc/plans/2026-03-14-adapter-skill-sync-rollout.md (new file)
@@ -0,0 +1,399 @@
|
||||
# 2026-03-14 Adapter Skill Sync Rollout
|
||||
|
||||
Status: Proposed
|
||||
Date: 2026-03-14
|
||||
Audience: Product and engineering
|
||||
Related:
|
||||
- `doc/plans/2026-03-14-skills-ui-product-plan.md`
|
||||
- `doc/plans/2026-03-13-company-import-export-v2.md`
|
||||
- `docs/companies/companies-spec.md`
|
||||
|
||||
## 1. Purpose
|
||||
|
||||
This document defines the rollout plan for adapter-wide skill support in Paperclip.
|
||||
|
||||
The goal is not just “show a skills tab.” The goal is:
|
||||
|
||||
- every adapter has a deliberate skill-sync truth model
|
||||
- the UI tells the truth for that adapter
|
||||
- Paperclip stores desired skill state consistently even when the adapter cannot fully reconcile it
|
||||
- unsupported adapters degrade clearly and safely
|
||||
|
||||
## 2. Current Adapter Matrix
|
||||
|
||||
Paperclip currently has these adapters:
|
||||
|
||||
- `claude_local`
|
||||
- `codex_local`
|
||||
- `cursor_local`
|
||||
- `gemini_local`
|
||||
- `opencode_local`
|
||||
- `pi_local`
|
||||
- `openclaw_gateway`
|
||||
|
||||
The current skill API supports:
|
||||
|
||||
- `unsupported`
|
||||
- `persistent`
|
||||
- `ephemeral`
|
||||
|
||||
Current implementation state:
|
||||
|
||||
- `codex_local`: implemented, `persistent`
|
||||
- `claude_local`: implemented, `ephemeral`
|
||||
- `cursor_local`: not yet implemented, but technically suited to `persistent`
|
||||
- `gemini_local`: not yet implemented, but technically suited to `persistent`
|
||||
- `pi_local`: not yet implemented, but technically suited to `persistent`
|
||||
- `opencode_local`: not yet implemented; likely `persistent`, but with special handling because it currently injects into Claude’s shared skills home
|
||||
- `openclaw_gateway`: not yet implemented; blocked on gateway protocol support, so `unsupported` for now
|
||||
|
||||
## 3. Product Principles
|
||||
|
||||
1. Desired skills live in Paperclip for every adapter.
|
||||
2. Adapters may expose different truth models, and the UI must reflect that honestly.
|
||||
3. Persistent adapters should read and reconcile actual installed state.
|
||||
4. Ephemeral adapters should report effective runtime state, not pretend they own a persistent install.
|
||||
5. Shared-home adapters need stronger safeguards than isolated-home adapters.
|
||||
6. Gateway or cloud adapters must not fake local filesystem sync.
|
||||
|
||||
## 4. Adapter Classification
|
||||
|
||||
### 4.1 Persistent local-home adapters
|
||||
|
||||
These adapters have a stable local skills directory that Paperclip can read and manage.
|
||||
|
||||
Candidates:
|
||||
|
||||
- `codex_local`
|
||||
- `cursor_local`
|
||||
- `gemini_local`
|
||||
- `pi_local`
|
||||
- `opencode_local` with caveats
|
||||
|
||||
Expected UX:
|
||||
|
||||
- show actual installed skills
|
||||
- show managed vs external skills
|
||||
- support `sync`
|
||||
- support stale removal
|
||||
- preserve unknown external skills
|
||||
|
||||
### 4.2 Ephemeral mount adapters
|
||||
|
||||
These adapters do not have a meaningful Paperclip-owned persistent install state.
|
||||
|
||||
Current adapter:
|
||||
|
||||
- `claude_local`
|
||||
|
||||
Expected UX:
|
||||
|
||||
- show desired Paperclip skills
|
||||
- show any discoverable external dirs if available
|
||||
- say “mounted on next run” instead of “installed”
|
||||
- do not imply a persistent adapter-owned install state
|
||||
|
||||
### 4.3 Unsupported / remote adapters
|
||||
|
||||
These adapters cannot support skill sync without new external capabilities.
|
||||
|
||||
Current adapter:
|
||||
|
||||
- `openclaw_gateway`
|
||||
|
||||
Expected UX:
|
||||
|
||||
- company skill library still works
|
||||
- agent attachment UI still works at the desired-state level
|
||||
- actual adapter state is `unsupported`
|
||||
- sync button is disabled or replaced with explanatory text
|
||||
|
||||
## 5. Per-Adapter Plan
|
||||
|
||||
### 5.1 Codex Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `persistent`
|
||||
|
||||
Current state:
|
||||
|
||||
- already implemented
|
||||
|
||||
Requirements to finish:
|
||||
|
||||
- keep as reference implementation
|
||||
- tighten tests around external custom skills and stale removal
|
||||
- ensure imported company skills can be attached and synced without manual path work
|
||||
|
||||
Success criteria:
|
||||
|
||||
- list installed managed and external skills
|
||||
- sync desired skills into `CODEX_HOME/skills`
|
||||
- preserve external user-managed skills
|
||||
|
||||
### 5.2 Claude Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `ephemeral`
|
||||
|
||||
Current state:
|
||||
|
||||
- already implemented
|
||||
|
||||
Requirements to finish:
|
||||
|
||||
- polish status language in UI
|
||||
- clearly distinguish “desired” from “mounted on next run”
|
||||
- optionally surface configured external skill dirs if Claude exposes them
|
||||
|
||||
Success criteria:
|
||||
|
||||
- desired skills stored in Paperclip
|
||||
- selected skills mounted per run
|
||||
- no misleading “installed” language
|
||||
|
||||
### 5.3 Cursor Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `persistent`
|
||||
|
||||
Technical basis:
|
||||
|
||||
- runtime already injects Paperclip skills into `~/.cursor/skills`
|
||||
|
||||
Implementation work:
|
||||
|
||||
1. Add `listSkills` for Cursor.
|
||||
2. Add `syncSkills` for Cursor.
|
||||
3. Reuse the same managed-symlink pattern as Codex.
|
||||
4. Distinguish:
|
||||
- managed Paperclip skills
|
||||
- external skills already present
|
||||
- missing desired skills
|
||||
- stale managed skills
|
||||
|
||||
Testing:
|
||||
|
||||
- unit tests for discovery
|
||||
- unit tests for sync and stale removal
|
||||
- verify shared auth/session setup is not disturbed
|
||||
|
||||
Success criteria:
|
||||
|
||||
- Cursor agents show real installed state
|
||||
- syncing from the agent Skills tab works
|
||||
|
||||
### 5.4 Gemini Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `persistent`
|
||||
|
||||
Technical basis:
|
||||
|
||||
- runtime already injects Paperclip skills into `~/.gemini/skills`
|
||||
|
||||
Implementation work:
|
||||
|
||||
1. Add `listSkills` for Gemini.
|
||||
2. Add `syncSkills` for Gemini.
|
||||
3. Reuse managed-symlink conventions from Codex/Cursor.
|
||||
4. Verify auth remains untouched while skills are reconciled.
|
||||
|
||||
Potential caveat:
|
||||
|
||||
- if Gemini treats that skills directory as shared user state, the UI should warn before removing stale managed skills
|
||||
|
||||
Success criteria:
|
||||
|
||||
- Gemini agents can reconcile desired vs actual skill state
|
||||
|
||||
### 5.5 Pi Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `persistent`
|
||||
|
||||
Technical basis:
|
||||
|
||||
- runtime already injects Paperclip skills into `~/.pi/agent/skills`
|
||||
|
||||
Implementation work:
|
||||
|
||||
1. Add `listSkills` for Pi.
|
||||
2. Add `syncSkills` for Pi.
|
||||
3. Reuse managed-symlink helpers.
|
||||
4. Verify session-file behavior remains independent from skill sync.
|
||||
|
||||
Success criteria:
|
||||
|
||||
- Pi agents expose actual installed skill state
|
||||
- Paperclip can sync desired skills into Pi’s persistent home
|
||||
|
||||
### 5.6 OpenCode Local
|
||||
|
||||
Target mode:
|
||||
|
||||
- `persistent`
|
||||
|
||||
Special case:
|
||||
|
||||
- OpenCode currently injects Paperclip skills into `~/.claude/skills`
|
||||
|
||||
This is product-risky because:
|
||||
|
||||
- it shares state with Claude
|
||||
- Paperclip may accidentally imply the skills belong only to OpenCode when the home is shared
|
||||
|
||||
Plan:
|
||||
|
||||
Phase 1:
|
||||
|
||||
- implement `listSkills` and `syncSkills`
|
||||
- treat it as `persistent`
|
||||
- explicitly label the home as shared in UI copy
|
||||
- only remove stale managed Paperclip skills that are clearly marked as Paperclip-managed
|
||||
|
||||
Phase 2:
|
||||
|
||||
- investigate whether OpenCode supports its own isolated skills home
|
||||
- if yes, migrate to an adapter-specific home and remove the shared-home caveat
|
||||
|
||||
Success criteria:
|
||||
|
||||
- OpenCode agents show real state
|
||||
- shared-home risk is visible and bounded
|
||||
|
||||
### 5.7 OpenClaw Gateway
|
||||
|
||||
Target mode:
|
||||
|
||||
- `unsupported` until gateway protocol support exists
|
||||
|
||||
Required external work:
|
||||
|
||||
- gateway API to list installed/available skills
|
||||
- gateway API to install/remove or otherwise reconcile skills
|
||||
- gateway metadata for whether state is persistent or ephemeral
|
||||
|
||||
Until then:
|
||||
|
||||
- Paperclip stores desired skills only
|
||||
- UI shows unsupported actual state
|
||||
- no fake sync implementation
|
||||
|
||||
Future target:
|
||||
|
||||
- likely a fourth truth model eventually, such as remote-managed persistent state
|
||||
- for now, keep the current API and treat gateway as unsupported
|
||||
|
||||
## 6. API Plan
|
||||
|
||||
### 6.1 Keep the current minimal adapter API
|
||||
|
||||
Near-term adapter contract remains:
|
||||
|
||||
- `listSkills(ctx)`
|
||||
- `syncSkills(ctx, desiredSkills)`
|
||||
|
||||
This is enough for all local adapters.
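As a rough illustration of how a persistent local-home adapter might implement `syncSkills` with the managed-symlink pattern mentioned in section 5, under the assumption that Paperclip-managed entries carry a `paperclip-` prefix (the marker convention and paths here are assumptions, not the shipped implementation):

```typescript
import { promises as fs } from "node:fs";
import { join } from "node:path";

// Sketch only: reconcile desired skills into a persistent skills home
// (e.g. ~/.cursor/skills), removing only entries Paperclip created.
async function syncSkills(skillsHome: string, desired: Array<{ slug: string; sourceDir: string }>) {
  const existing = await fs.readdir(skillsHome);
  const managed = existing.filter((name) => name.startsWith("paperclip-"));
  const wanted = new Set(desired.map((s) => `paperclip-${s.slug}`));

  // Remove stale managed skills; never touch external user-managed entries.
  for (const name of managed) {
    if (!wanted.has(name)) await fs.rm(join(skillsHome, name), { recursive: true, force: true });
  }
  // Link missing desired skills into the skills home.
  for (const skill of desired) {
    const linkName = `paperclip-${skill.slug}`;
    if (!existing.includes(linkName)) await fs.symlink(skill.sourceDir, join(skillsHome, linkName));
  }
}
```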
|
||||
|
||||
### 6.2 Optional extension points
|
||||
|
||||
Add only if needed after the first broad rollout:
|
||||
|
||||
- `skillHomeLabel`
|
||||
- `sharedHome: boolean`
|
||||
- `supportsExternalDiscovery: boolean`
|
||||
- `supportsDestructiveSync: boolean`
|
||||
|
||||
These should be optional metadata additions to the snapshot, not required new adapter methods.
|
||||
|
||||
## 7. UI Plan
|
||||
|
||||
The company-level skill library can stay adapter-neutral.
|
||||
|
||||
The agent-level Skills tab must become adapter-aware by copy and status:
|
||||
|
||||
- `persistent`: installed / missing / stale / external
|
||||
- `ephemeral`: mounted on next run / external / desired only
|
||||
- `unsupported`: desired only, adapter cannot report actual state
|
||||
|
||||
Additional UI requirement for shared-home adapters:
|
||||
|
||||
- show a small warning that the adapter uses a shared user skills home
|
||||
- avoid destructive wording unless Paperclip can prove a skill is Paperclip-managed
|
||||
|
||||
## 8. Rollout Phases
|
||||
|
||||
### Phase 1: Finish the local filesystem family
|
||||
|
||||
Ship:
|
||||
|
||||
- `cursor_local`
|
||||
- `gemini_local`
|
||||
- `pi_local`
|
||||
|
||||
Rationale:
|
||||
|
||||
- these are the closest to Codex in architecture
|
||||
- they already inject into stable local skill homes
|
||||
|
||||
### Phase 2: OpenCode shared-home support
|
||||
|
||||
Ship:
|
||||
|
||||
- `opencode_local`
|
||||
|
||||
Rationale:
|
||||
|
||||
- technically feasible now
|
||||
- needs slightly more careful product language because of the shared Claude skills home
|
||||
|
||||
### Phase 3: Gateway support decision
|
||||
|
||||
Decide:
|
||||
|
||||
- keep `openclaw_gateway` unsupported for V1
|
||||
- or extend the gateway protocol for remote skill management
|
||||
|
||||
My recommendation:
|
||||
|
||||
- do not block V1 on gateway support
|
||||
- keep it explicitly unsupported until the remote protocol exists
|
||||
|
||||
## 9. Definition Of Done
|
||||
|
||||
Adapter-wide skill support is ready when all are true:
|
||||
|
||||
1. Every adapter has an explicit truth model:
|
||||
- `persistent`
|
||||
- `ephemeral`
|
||||
- `unsupported`
|
||||
2. The UI copy matches that truth model.
|
||||
3. All local persistent adapters implement:
|
||||
- `listSkills`
|
||||
- `syncSkills`
|
||||
4. Tests cover:
|
||||
- desired-state storage
|
||||
- actual-state discovery
|
||||
- managed vs external distinctions
|
||||
- stale managed-skill cleanup where supported
|
||||
5. `openclaw_gateway` is either:
|
||||
- explicitly unsupported with clean UX
|
||||
- or backed by a real remote skill API
|
||||
|
||||
## 10. Recommendation
|
||||
|
||||
The recommended immediate order is:
|
||||
|
||||
1. `cursor_local`
|
||||
2. `gemini_local`
|
||||
3. `pi_local`
|
||||
4. `opencode_local`
|
||||
5. defer `openclaw_gateway`
|
||||
|
||||
That gets Paperclip from “skills work for Codex and Claude” to “skills work for the whole local-adapter family,” which is the meaningful V1 milestone.
|
||||
doc/plans/2026-03-14-skills-ui-product-plan.md (new file)
@@ -0,0 +1,729 @@
|
||||
# 2026-03-14 Skills UI Product Plan
|
||||
|
||||
Status: Proposed
|
||||
Date: 2026-03-14
|
||||
Audience: Product and engineering
|
||||
Related:
|
||||
- `doc/plans/2026-03-13-company-import-export-v2.md`
|
||||
- `doc/plans/2026-03-14-adapter-skill-sync-rollout.md`
|
||||
- `docs/companies/companies-spec.md`
|
||||
- `ui/src/pages/AgentDetail.tsx`
|
||||
|
||||
## 1. Purpose
|
||||
|
||||
This document defines the product and UI plan for skill management in Paperclip.
|
||||
|
||||
The goal is to make skills understandable and manageable in the web UI without pretending that all adapters behave the same way.
|
||||
|
||||
This plan assumes:
|
||||
|
||||
- `SKILL.md` remains Agent Skills compatible
|
||||
- `skills.sh` compatibility is a V1 requirement
|
||||
- Paperclip company import/export can include skills as package content
|
||||
- adapters may support persistent skill sync, ephemeral skill mounting, read-only skill discovery, or no skill integration at all
|
||||
|
||||
## 2. Current State
|
||||
|
||||
There is already a first-pass agent-level skill sync UI on `AgentDetail`.
|
||||
|
||||
Today it supports:
|
||||
|
||||
- loading adapter skill sync state
|
||||
- showing unsupported adapters clearly
|
||||
- showing managed skills as checkboxes
|
||||
- showing external skills separately
|
||||
- syncing desired skills for adapters that implement the new API
|
||||
|
||||
Current limitations:
|
||||
|
||||
1. There is no company-level skill library UI.
|
||||
2. There is no package import flow for skills in the website.
|
||||
3. There is no distinction between skill package management and per-agent skill attachment.
|
||||
4. There is no multi-agent desired-vs-actual view.
|
||||
5. The current UI is adapter-sync-oriented, not package-oriented.
|
||||
6. Unsupported adapters degrade safely, but not elegantly.
|
||||
|
||||
### 2.1 V1 Decisions
|
||||
|
||||
For V1, this plan assumes the following product decisions are already made:
|
||||
|
||||
1. `skills.sh` compatibility is required.
|
||||
2. Agent-to-skill association in `AGENTS.md` is by shortname or slug.
|
||||
3. Company skills and agent skill attachments are separate concepts.
|
||||
4. Agent skills should move to their own tab rather than living inside configuration.
|
||||
5. Company import/export should eventually round-trip skill packages and agent skill attachments.
|
||||
|
||||
## 3. Product Principles
|
||||
|
||||
1. Skills are company assets first, agent attachments second.
|
||||
2. Package management and adapter sync are different concerns and should not be conflated in one screen.
|
||||
3. The UI must always tell the truth about what Paperclip knows:
|
||||
- desired state in Paperclip
|
||||
- actual state reported by the adapter
|
||||
- whether the adapter can reconcile the two
|
||||
4. Agent Skills compatibility must remain visible in the product model.
|
||||
5. Agent-to-skill associations should be human-readable and shortname-based wherever possible.
|
||||
6. Unsupported adapters should still have a useful UI, not just a dead end.
|
||||
|
||||
## 4. User Model
|
||||
|
||||
Paperclip should treat skills at two scopes:
|
||||
|
||||
### 4.1 Company skills
|
||||
|
||||
These are reusable skills known to the company.
|
||||
|
||||
Examples:
|
||||
|
||||
- imported from a GitHub repo
|
||||
- added from a local folder
|
||||
- installed from a `skills.sh`-compatible repo
|
||||
- created locally inside Paperclip later
|
||||
|
||||
These should have:
|
||||
|
||||
- name
|
||||
- description
|
||||
- slug or package identity
|
||||
- source/provenance
|
||||
- trust level
|
||||
- compatibility status
|
||||
|
||||
### 4.2 Agent skills
|
||||
|
||||
These are skill attachments for a specific agent.
|
||||
|
||||
Each attachment should have:
|
||||
|
||||
- shortname
|
||||
- desired state in Paperclip
|
||||
- actual state in the adapter when readable
|
||||
- sync status
|
||||
- origin
|
||||
|
||||
Agent attachments should normally reference skills by shortname or slug, for example:
|
||||
|
||||
- `review`
|
||||
- `react-best-practices`
|
||||
|
||||
not by noisy relative file path.
|
||||
|
||||
### 4.3 Primary user jobs
|
||||
|
||||
The UI should support these jobs cleanly:
|
||||
|
||||
1. “Show me what skills this company has.”
|
||||
2. “Import a skill from GitHub or a local folder.”
|
||||
3. “See whether a skill is safe, compatible, and who uses it.”
|
||||
4. “Attach skills to an agent.”
|
||||
5. “See whether the adapter actually has those skills.”
|
||||
6. “Reconcile desired vs actual skill state.”
|
||||
7. “Understand what Paperclip knows vs what the adapter knows.”
|
||||
|
||||
## 5. Core UI Surfaces
|
||||
|
||||
The product should have two primary skill surfaces.
|
||||
|
||||
### 5.1 Company Skills page
|
||||
|
||||
Add a company-level page, likely:
|
||||
|
||||
- `/companies/:companyId/skills`
|
||||
|
||||
Purpose:
|
||||
|
||||
- manage the company skill library
|
||||
- import and inspect skill packages
|
||||
- understand provenance and trust
|
||||
- see which agents use which skills
|
||||
|
||||
#### Route
|
||||
|
||||
- `/companies/:companyId/skills`
|
||||
|
||||
#### Primary actions
|
||||
|
||||
- import skill
|
||||
- inspect skill
|
||||
- attach to agents
|
||||
- detach from agents
|
||||
- export selected skills later
|
||||
|
||||
#### Empty state
|
||||
|
||||
When the company has no managed skills:
|
||||
|
||||
- explain what skills are
|
||||
- explain `skills.sh` / Agent Skills compatibility
|
||||
- offer `Import from GitHub` and `Import from folder`
|
||||
- optionally show adapter-discovered skills as a secondary “not managed yet” section
|
||||
|
||||
#### A. Skill library list
|
||||
|
||||
Each skill row should show:
|
||||
|
||||
- name
|
||||
- short description
|
||||
- source badge
|
||||
- trust badge
|
||||
- compatibility badge
|
||||
- number of attached agents
|
||||
|
||||
Suggested source states:
|
||||
|
||||
- local
|
||||
- github
|
||||
- imported package
|
||||
- external reference
|
||||
- adapter-discovered only
|
||||
|
||||
Suggested compatibility states:
|
||||
|
||||
- compatible
|
||||
- paperclip-extension
|
||||
- unknown
|
||||
- invalid
|
||||
|
||||
Suggested trust states:
|
||||
|
||||
- markdown-only
|
||||
- assets
|
||||
- scripts/executables
|
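As a sketch, these badge values could be modeled as plain unions in the UI layer; the type and field names below are assumptions taken directly from the lists above.

```ts
// Illustrative only: values come straight from the lists above; the names are assumed.
type SkillSource = "local" | "github" | "imported_package" | "external_reference" | "adapter_discovered";
type SkillCompatibility = "compatible" | "paperclip_extension" | "unknown" | "invalid";
type SkillTrust = "markdown_only" | "assets" | "scripts_executables";

interface SkillLibraryRow {
  name: string;
  description: string;
  source: SkillSource;
  trust: SkillTrust;
  compatibility: SkillCompatibility;
  attachedAgentCount: number;
}
```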
||||
|
||||
Suggested list affordances:
|
||||
|
||||
- search by name or slug
|
||||
- filter by source
|
||||
- filter by trust level
|
||||
- filter by usage
|
||||
- sort by name, recent import, usage count
|
||||
|
||||
#### B. Import actions
|
||||
|
||||
Allow:
|
||||
|
||||
- import from local folder
|
||||
- import from GitHub URL
|
||||
- import from direct URL
|
||||
|
||||
Future:
|
||||
|
||||
- install from `companies.sh`
|
||||
- install from `skills.sh`
|
||||
|
||||
V1 requirement:
|
||||
|
||||
- importing from a `skills.sh`-compatible source should work without requiring a Paperclip-specific package layout
|
||||
|
||||
#### C. Skill detail drawer or page
|
||||
|
||||
Each skill should have a detail view showing:
|
||||
|
||||
- rendered `SKILL.md`
|
||||
- package source and pinning
|
||||
- included files
|
||||
- trust and licensing warnings
|
||||
- who uses it
|
||||
- adapter compatibility notes
|
||||
|
||||
Recommended route:
|
||||
|
||||
- `/companies/:companyId/skills/:skillId`
|
||||
|
||||
Recommended sections:
|
||||
|
||||
- Overview
|
||||
- Contents
|
||||
- Usage
|
||||
- Source
|
||||
- Trust / licensing
|
||||
|
||||
#### D. Usage view
|
||||
|
||||
Each company skill should show which agents use it.
|
||||
|
||||
Suggested columns:
|
||||
|
||||
- agent
|
||||
- desired state
|
||||
- actual state
|
||||
- adapter
|
||||
- sync mode
|
||||
- last sync status
|
||||
|
||||
### 5.2 Agent Skills tab
|
||||
|
||||
Keep and evolve the existing `AgentDetail` skill sync UI, but move it out of configuration.
|
||||
|
||||
Purpose:
|
||||
|
||||
- attach/detach company skills to one agent
|
||||
- inspect adapter reality for that agent
|
||||
- reconcile desired vs actual state
|
||||
- keep the association format readable and aligned with `AGENTS.md`
|
||||
|
||||
#### Route
|
||||
|
||||
- `/agents/:agentId/skills`
|
||||
|
||||
#### Agent tabs
|
||||
|
||||
The intended agent-level tab model becomes:
|
||||
|
||||
- `dashboard`
|
||||
- `configuration`
|
||||
- `skills`
|
||||
- `runs`
|
||||
|
||||
This is preferable to hiding skills inside configuration because:
|
||||
|
||||
- skills are not just adapter config
|
||||
- skills need their own sync/status language
|
||||
- skills are a reusable company asset, not merely one agent field
|
||||
- the screen needs room for desired vs actual state, warnings, and external skill adoption
|
||||
|
||||
#### Tab layout
|
||||
|
||||
The `Skills` tab should have three stacked sections:
|
||||
|
||||
1. Summary
|
||||
2. Managed skills
|
||||
3. External / discovered skills
|
||||
|
||||
Summary should show:
|
||||
|
||||
- adapter sync support
|
||||
- sync mode
|
||||
- number of managed skills
|
||||
- number of external skills
|
||||
- drift or warning count
|
||||
|
||||
#### A. Desired skills
|
||||
|
||||
Show company-managed skills attached to the agent.
|
||||
|
||||
Each row should show:
|
||||
|
||||
- skill name
|
||||
- shortname
|
||||
- sync state
|
||||
- source
|
||||
- last adapter observation if available
|
||||
|
||||
Each row should support:
|
||||
|
||||
- enable / disable
|
||||
- open skill detail
|
||||
- see source badge
|
||||
- see sync badge
|
||||
|
||||
#### B. External or discovered skills
|
||||
|
||||
Show skills reported by the adapter that are not company-managed.
|
||||
|
||||
This matters because Codex and similar adapters may already have local skills that Paperclip did not install.
|
||||
|
||||
These should be clearly marked:
|
||||
|
||||
- external
|
||||
- not managed by Paperclip
|
||||
|
||||
Each external row should support:
|
||||
|
||||
- inspect
|
||||
- adopt into company library later
|
||||
- attach as managed skill later if appropriate
|
||||
|
||||
#### C. Sync controls
|
||||
|
||||
Support:
|
||||
|
||||
- sync
|
||||
- reset draft
|
||||
- detach
|
||||
|
||||
Future:
|
||||
|
||||
- import external skill into company library
|
||||
- promote ad hoc local skill into a managed company skill
|
||||
|
||||
Recommended footer actions:
|
||||
|
||||
- `Sync skills`
|
||||
- `Reset`
|
||||
- `Refresh adapter state`
|
||||
|
||||
## 6. Skill State Model In The UI
|
||||
|
||||
Each skill attachment should have a user-facing state.
|
||||
|
||||
Suggested states:
|
||||
|
||||
- `in_sync`
|
||||
- `desired_only`
|
||||
- `external`
|
||||
- `drifted`
|
||||
- `unmanaged`
|
||||
- `unknown`
|
||||
|
||||
Definitions:
|
||||
|
||||
- `in_sync`: desired and actual match
|
||||
- `desired_only`: Paperclip wants it, adapter does not show it yet
|
||||
- `external`: adapter has it but Paperclip does not manage it
|
||||
- `drifted`: adapter has a conflicting or unexpected version/location
|
||||
- `unmanaged`: adapter does not support sync, Paperclip only tracks desired state
|
||||
- `unknown`: adapter read failed or state cannot be trusted
|
||||
|
||||
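A minimal sketch of how the user-facing state could be derived from desired state, adapter-reported state, and sync capability; the input field names are assumptions, not the existing data model.

```ts
// Sketch only: input field names are assumptions, not the existing data model.
type SkillSyncState = "in_sync" | "desired_only" | "external" | "drifted" | "unmanaged" | "unknown";

interface SkillObservation {
  desired: boolean;             // Paperclip wants this skill attached
  reportedByAdapter: boolean;   // adapter lists the skill
  adapterSupportsSync: boolean; // adapter can reconcile desired vs actual
  adapterReadFailed: boolean;   // snapshot could not be read or trusted
  conflicting: boolean;         // unexpected version or location reported
}

function deriveSkillState(o: SkillObservation): SkillSyncState {
  if (o.adapterReadFailed) return "unknown";
  if (!o.adapterSupportsSync) return "unmanaged";
  if (o.conflicting) return "drifted";
  if (o.desired && o.reportedByAdapter) return "in_sync";
  if (o.desired) return "desired_only";
  if (o.reportedByAdapter) return "external";
  return "unknown"; // a row that is neither desired nor reported would not normally be rendered
}
```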
Suggested badge copy:
|
||||
|
||||
- `In sync`
|
||||
- `Needs sync`
|
||||
- `External`
|
||||
- `Drifted`
|
||||
- `Unmanaged`
|
||||
- `Unknown`
|
||||
|
||||
## 7. Adapter Presentation Rules
|
||||
|
||||
The UI should not describe all adapters the same way.
|
||||
|
||||
### 7.1 Persistent adapters
|
||||
|
||||
Example:
|
||||
|
||||
- Codex local
|
||||
|
||||
Language:
|
||||
|
||||
- installed
|
||||
- synced into adapter home
|
||||
- external skills detected
|
||||
|
||||
### 7.2 Ephemeral adapters
|
||||
|
||||
Example:
|
||||
|
||||
- Claude local
|
||||
|
||||
Language:
|
||||
|
||||
- will be mounted on next run
|
||||
- effective runtime skills
|
||||
- not globally installed
|
||||
|
||||
### 7.3 Unsupported adapters
|
||||
|
||||
Language:
|
||||
|
||||
- this adapter does not implement skill sync yet
|
||||
- Paperclip can still track desired skills
|
||||
- actual adapter state is unavailable
|
||||
|
||||
This state should still allow:
|
||||
|
||||
- attaching company skills to the agent as desired state
|
||||
- export/import of those desired attachments
|
||||
|
||||
### 7.4 Read-only adapters
|
||||
|
||||
Some adapters may be able to list skills but not mutate them.
|
||||
|
||||
Language:
|
||||
|
||||
- Paperclip can see adapter skills
|
||||
- this adapter does not support applying changes
|
||||
- desired state can be tracked, but reconciliation is manual
|
||||
|
||||
## 8. Information Architecture
|
||||
|
||||
Recommended navigation:
|
||||
|
||||
- company nav adds `Skills`
|
||||
- agent detail adds `Skills` as its own tab
|
||||
- company skill detail gets its own route when the company library ships
|
||||
|
||||
Recommended separation:
|
||||
|
||||
- Company Skills page answers: “What skills do we have?”
|
||||
- Agent Skills tab answers: “What does this agent use, and is it synced?”
|
||||
|
||||
### 8.1 Proposed route map
|
||||
|
||||
- `/companies/:companyId/skills`
|
||||
- `/companies/:companyId/skills/:skillId`
|
||||
- `/agents/:agentId/skills`
|
||||
|
||||
### 8.2 Nav and discovery
|
||||
|
||||
Recommended entry points:
|
||||
|
||||
- company sidebar: `Skills`
|
||||
- agent page tabs: `Skills`
|
||||
- company import preview: link imported skills to company skills page later
|
||||
- agent skills rows: link to company skill detail
|
||||
|
||||
## 9. Import / Export Integration
|
||||
|
||||
Skill UI and package portability should meet in the company skill library.
|
||||
|
||||
Import behavior:
|
||||
|
||||
- importing a company package with `SKILL.md` content should create or update company skills
|
||||
- agent attachments should primarily come from `AGENTS.md` shortname associations
|
||||
- `.paperclip.yaml` may add Paperclip-specific fidelity, but should not replace the base shortname association model
|
||||
- referenced third-party skills should keep provenance visible
|
||||
|
||||
Export behavior:
|
||||
|
||||
- exporting a company should include company-managed skills when selected
|
||||
- `AGENTS.md` should emit skill associations by shortname or slug
|
||||
- `.paperclip.yaml` may add Paperclip-specific skill fidelity later if needed, but should not be required for ordinary agent-to-skill association
|
||||
- adapter-only external skills should not be silently exported as managed company skills
|
||||
|
||||
### 9.1 Import workflows
|
||||
|
||||
V1 workflows should support:
|
||||
|
||||
1. import one or more skills from a local folder
|
||||
2. import one or more skills from a GitHub repo
|
||||
3. import a company package that contains skills
|
||||
4. attach imported skills to one or more agents
|
||||
|
||||
Import preview for skills should show:
|
||||
|
||||
- skills discovered
|
||||
- source and pinning
|
||||
- trust level
|
||||
- licensing warnings
|
||||
- whether an existing company skill will be created, updated, or skipped
|
||||
|
||||
### 9.2 Export workflows
|
||||
|
||||
V1 should support:
|
||||
|
||||
1. export a company with managed skills included when selected
|
||||
2. export an agent whose `AGENTS.md` contains shortname skill associations
|
||||
3. preserve Agent Skills compatibility for each `SKILL.md`
|
||||
|
||||
Out of scope for V1:
|
||||
|
||||
- exporting adapter-only external skills as managed packages automatically
|
||||
|
||||
## 10. Data And API Shape
|
||||
|
||||
This plan implies a clean split in backend concepts.
|
||||
|
||||
### 10.1 Company skill records
|
||||
|
||||
Paperclip should have a company-scoped skill model or managed package model representing:
|
||||
|
||||
- identity
|
||||
- source
|
||||
- files
|
||||
- provenance
|
||||
- trust and licensing metadata
|
||||
|
||||
### 10.2 Agent skill attachments
|
||||
|
||||
Paperclip should separately store:
|
||||
|
||||
- agent id
|
||||
- skill identity
|
||||
- desired enabled state
|
||||
- optional ordering or metadata later
|
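A minimal sketch of the two records described above, under the assumption that the final schema is still open; field names are illustrative only.

```ts
// Sketch only: field names are assumptions based on the lists above, not a final schema.
interface CompanySkillRecord {
  id: string;
  companyId: string;
  slug: string;                         // package identity / shortname, e.g. "react-best-practices"
  name: string;
  description: string;
  source: { kind: "local" | "github" | "url" | "package"; ref?: string };
  files: string[];                      // SKILL.md plus any bundled assets
  trust: "markdown_only" | "assets" | "scripts_executables";
  license?: string;
}

interface AgentSkillAttachment {
  agentId: string;
  skillSlug: string;                    // references the company skill by shortname/slug
  desiredEnabled: boolean;              // desired state tracked by Paperclip
  // optional ordering or metadata could be added later
}
```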
||||
|
||||
### 10.3 Adapter sync snapshot
|
||||
|
||||
Adapter reads should return:
|
||||
|
||||
- supported flag
|
||||
- sync mode
|
||||
- entries
|
||||
- warnings
|
||||
- desired skills
|
||||
|
||||
This already exists in rough form and should be the basis for the UI.
|
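A sketch of the snapshot shape implied by that list; the existing rough form may differ, so treat the names below as assumptions.

```ts
// Sketch only: mirrors the bullet list above; names are assumed.
interface AdapterSkillSnapshot {
  supported: boolean;                                   // supported flag
  syncMode: "persistent" | "ephemeral" | "read_only" | "unsupported";
  entries: Array<{
    shortname: string;
    managed: boolean;                                   // managed by Paperclip vs external
    state: "in_sync" | "desired_only" | "external" | "drifted" | "unknown";
  }>;
  warnings: string[];
  desiredSkills: string[];                              // shortnames Paperclip wants attached
}
```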
||||
|
||||
### 10.4 UI-facing API needs
|
||||
|
||||
The complete UI implies these API surfaces:
|
||||
|
||||
- list company-managed skills
|
||||
- import company skills from path/URL/GitHub
|
||||
- get one company skill detail
|
||||
- list agents using a given skill
|
||||
- attach/detach company skills for an agent
|
||||
- list adapter sync snapshot for an agent
|
||||
- apply desired skills for an agent
|
||||
|
||||
Existing agent-level skill sync APIs can remain the base for the agent tab.
|
||||
The company-level library APIs still need to be designed and implemented.
|
||||
|
||||
## 11. Page-by-page UX
|
||||
|
||||
### 11.1 Company Skills list page
|
||||
|
||||
Header:
|
||||
|
||||
- title
|
||||
- short explanation of compatibility with Agent Skills / `skills.sh`
|
||||
- import button
|
||||
|
||||
Body:
|
||||
|
||||
- filters
|
||||
- skill table or cards
|
||||
- empty state when none
|
||||
|
||||
Secondary content:
|
||||
|
||||
- warnings panel for untrusted or incompatible skills
|
||||
|
||||
### 11.2 Company Skill detail page
|
||||
|
||||
Header:
|
||||
|
||||
- skill name
|
||||
- shortname
|
||||
- source badge
|
||||
- trust badge
|
||||
- compatibility badge
|
||||
|
||||
Sections:
|
||||
|
||||
- rendered `SKILL.md`
|
||||
- files and references
|
||||
- usage by agents
|
||||
- source / provenance
|
||||
- trust and licensing warnings
|
||||
|
||||
Actions:
|
||||
|
||||
- attach to agent
|
||||
- remove from company library later
|
||||
- export later
|
||||
|
||||
### 11.3 Agent Skills tab
|
||||
|
||||
Header:
|
||||
|
||||
- adapter support summary
|
||||
- sync mode
|
||||
- refresh and sync actions
|
||||
|
||||
Body:
|
||||
|
||||
- managed skills list
|
||||
- external/discovered skills list
|
||||
- warnings / unsupported state block
|
||||
|
||||
## 12. States And Empty Cases
|
||||
|
||||
### 12.1 Company Skills page
|
||||
|
||||
States:
|
||||
|
||||
- empty
|
||||
- loading
|
||||
- loaded
|
||||
- import in progress
|
||||
- import failed
|
||||
|
||||
### 12.2 Company Skill detail
|
||||
|
||||
States:
|
||||
|
||||
- loading
|
||||
- not found
|
||||
- incompatible
|
||||
- loaded
|
||||
|
||||
### 12.3 Agent Skills tab
|
||||
|
||||
States:
|
||||
|
||||
- loading snapshot
|
||||
- unsupported adapter
|
||||
- read-only adapter
|
||||
- sync-capable adapter
|
||||
- sync failed
|
||||
- stale draft
|
||||
|
||||
## 13. Permissions And Governance
|
||||
|
||||
Suggested V1 policy:
|
||||
|
||||
- board users can manage company skills
|
||||
- board users can attach skills to agents
|
||||
- agents themselves do not mutate company skill library by default
|
||||
- later, certain agents may get scoped permissions for skill attachment or sync
|
||||
|
||||
## 14. UI Phases
|
||||
|
||||
### Phase A: Stabilize current agent skill sync UI
|
||||
|
||||
Goals:
|
||||
|
||||
- move skills to an `AgentDetail` tab
|
||||
- improve status language
|
||||
- support desired-only state even on unsupported adapters
|
||||
- polish copy for persistent vs ephemeral adapters
|
||||
|
||||
### Phase B: Add Company Skills page
|
||||
|
||||
Goals:
|
||||
|
||||
- company-level skill library
|
||||
- import from GitHub/local folder
|
||||
- basic detail view
|
||||
- usage counts by agent
|
||||
- `skills.sh`-compatible import path
|
||||
|
||||
### Phase C: Connect skills to portability
|
||||
|
||||
Goals:
|
||||
|
||||
- importing company packages creates company skills
|
||||
- exporting selected skills works cleanly
|
||||
- agent attachments round-trip primarily through `AGENTS.md` shortnames
|
||||
|
||||
### Phase D: External skill adoption flow
|
||||
|
||||
Goals:
|
||||
|
||||
- detect adapter external skills
|
||||
- allow importing them into company-managed state where possible
|
||||
- make provenance explicit
|
||||
|
||||
### Phase E: Advanced sync and drift UX
|
||||
|
||||
Goals:
|
||||
|
||||
- desired-vs-actual diffing
|
||||
- drift resolution actions
|
||||
- multi-agent skill usage and sync reporting
|
||||
|
||||
## 15. Design Risks
|
||||
|
||||
1. Overloading the agent page with package management will make the feature confusing.
|
||||
2. Treating unsupported adapters as broken rather than unmanaged will make the product feel inconsistent.
|
||||
3. Mixing external adapter-discovered skills with company-managed skills without clear labels will erode trust.
|
||||
4. If company skill records do not exist, import/export and UI will remain loosely coupled and round-trip fidelity will stay weak.
|
||||
5. If agent skill associations are path-based instead of shortname-based, the format will feel too technical and too Paperclip-specific.
|
||||
|
||||
## 16. Recommendation
|
||||
|
||||
The next product step should be:
|
||||
|
||||
1. move skills out of agent configuration and into a dedicated `Skills` tab
|
||||
2. add a dedicated company-level `Skills` page as the library and package-management surface
|
||||
3. make company import/export target that company skill library, not the agent page directly
|
||||
4. preserve adapter-aware truth in the UI by clearly separating:
|
||||
- desired
|
||||
- actual
|
||||
- external
|
||||
- unmanaged
|
||||
5. keep agent-to-skill associations shortname-based in `AGENTS.md`
|
||||
|
||||
That gives Paperclip one coherent skill story instead of forcing package management, adapter sync, and agent configuration into the same screen.
|
||||
424
doc/plans/2026-03-17-docker-release-browser-e2e.md
Normal file
@@ -0,0 +1,424 @@
|
||||
# Docker Release Browser E2E Plan
|
||||
|
||||
## Context
|
||||
|
||||
Today release smoke testing for published Paperclip packages is manual and shell-driven:
|
||||
|
||||
```sh
|
||||
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
|
||||
HOST_PORT=3233 DATA_DIR=./data/release-smoke-stable PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
|
||||
```
|
||||
|
||||
That is useful because it exercises the same public install surface users hit:
|
||||
|
||||
- Docker
|
||||
- `npx paperclipai@canary`
|
||||
- `npx paperclipai@latest`
|
||||
- authenticated bootstrap flow
|
||||
|
||||
But it still leaves the most important release questions to a human with a browser:
|
||||
|
||||
- can I sign in with the smoke credentials?
|
||||
- do I land in onboarding?
|
||||
- can I complete onboarding?
|
||||
- does the initial CEO agent actually get created and run?
|
||||
|
||||
The repo already has two adjacent pieces:
|
||||
|
||||
- `tests/e2e/onboarding.spec.ts` covers the onboarding wizard against the local source tree
|
||||
- `scripts/docker-onboard-smoke.sh` boots a published Docker install and auto-bootstraps authenticated mode, but only verifies the API/session layer
|
||||
|
||||
What is missing is one deterministic browser test that joins those two paths.
|
||||
|
||||
## Goal
|
||||
|
||||
Add a release-grade Docker-backed browser E2E that validates the published `canary` and `latest` installs end to end:
|
||||
|
||||
1. boot the published package in Docker
|
||||
2. sign in with known smoke credentials
|
||||
3. verify the user is routed into onboarding
|
||||
4. complete onboarding in the browser
|
||||
5. verify the first CEO agent exists
|
||||
6. verify the initial CEO run was triggered and reached a terminal or active state
|
||||
|
||||
Then wire that test into GitHub Actions so release validation is no longer manual-only.
|
||||
|
||||
## Recommendation In One Sentence
|
||||
|
||||
Turn the current Docker smoke script into a machine-friendly test harness, add a dedicated Playwright release-smoke spec that drives the authenticated browser flow against published Docker installs, and run it in GitHub Actions for both `canary` and `latest`.
|
||||
|
||||
## What We Have Today
|
||||
|
||||
### Existing local browser coverage
|
||||
|
||||
`tests/e2e/onboarding.spec.ts` already proves the onboarding wizard can:
|
||||
|
||||
- create a company
|
||||
- create a CEO agent
|
||||
- create an initial issue
|
||||
- optionally observe task progress
|
||||
|
||||
That is a good base, but it does not validate the public npm package, Docker path, authenticated login flow, or release dist-tags.
|
||||
|
||||
### Existing Docker smoke coverage
|
||||
|
||||
`scripts/docker-onboard-smoke.sh` already does useful setup work:
|
||||
|
||||
- builds `Dockerfile.onboard-smoke`
|
||||
- runs `paperclipai@${PAPERCLIPAI_VERSION}` inside Docker
|
||||
- waits for health
|
||||
- signs up or signs in a smoke admin user
|
||||
- generates and accepts the bootstrap CEO invite in authenticated mode
|
||||
- verifies a board session and `/api/companies`
|
||||
|
||||
That means the hard bootstrap problem is mostly solved already. The main gap is that the script is human-oriented and never hands control to a browser test.
|
||||
|
||||
### Existing CI shape
|
||||
|
||||
The repo already has:
|
||||
|
||||
- `.github/workflows/e2e.yml` for manual Playwright runs against local source
|
||||
- `.github/workflows/release.yml` for canary publish on `master` and manual stable promotion
|
||||
|
||||
So the right move is to extend the current test/release system, not create a parallel one.
|
||||
|
||||
## Product Decision
|
||||
|
||||
### 1. The release smoke should stay deterministic and token-free
|
||||
|
||||
The first version should not require OpenAI, Anthropic, or external agent credentials.
|
||||
|
||||
Use the onboarding flow with a deterministic adapter that can run on a stock GitHub runner and inside the published Docker install. The existing `process` adapter with a trivial command is the right base path for this release gate.
|
||||
|
||||
That keeps this test focused on:
|
||||
|
||||
- release packaging
|
||||
- auth/bootstrap
|
||||
- UI routing
|
||||
- onboarding contract
|
||||
- agent creation
|
||||
- heartbeat invocation plumbing
|
||||
|
||||
Later we can add a second credentialed smoke lane for real model-backed agents.
|
||||
|
||||
### 2. Smoke credentials become an explicit test contract
|
||||
|
||||
The current defaults in `scripts/docker-onboard-smoke.sh` should be treated as stable test fixtures:
|
||||
|
||||
- email: `smoke-admin@paperclip.local`
|
||||
- password: `paperclip-smoke-password`
|
||||
|
||||
The browser test should log in with those exact values unless overridden by env vars.
|
||||
|
||||
### 3. Published-package smoke and source-tree E2E stay separate
|
||||
|
||||
Keep two lanes:
|
||||
|
||||
- source-tree E2E for feature development
|
||||
- published Docker release smoke for release confidence
|
||||
|
||||
They overlap on onboarding assertions, but they guard different failure classes.
|
||||
|
||||
## Proposed Design
|
||||
|
||||
## 1. Add a CI-friendly Docker smoke harness
|
||||
|
||||
Refactor `scripts/docker-onboard-smoke.sh` so it can run in two modes:
|
||||
|
||||
- interactive mode
|
||||
- current behavior
|
||||
- streams logs and waits in foreground for manual inspection
|
||||
- CI mode
|
||||
- starts the container
|
||||
- waits for health and authenticated bootstrap
|
||||
- prints machine-readable metadata
|
||||
- exits while leaving the container running for Playwright
|
||||
|
||||
Recommended shape:
|
||||
|
||||
- keep `scripts/docker-onboard-smoke.sh` as the public entry point
|
||||
- add a `SMOKE_DETACH=true` or `--detach` mode
|
||||
- emit a JSON blob or `.env` file containing:
|
||||
- `SMOKE_BASE_URL`
|
||||
- `SMOKE_ADMIN_EMAIL`
|
||||
- `SMOKE_ADMIN_PASSWORD`
|
||||
- `SMOKE_CONTAINER_NAME`
|
||||
- `SMOKE_DATA_DIR`
|
||||
|
||||
The workflow and Playwright tests can then consume the emitted metadata instead of scraping logs.
|
||||
|
||||
### Why this matters
|
||||
|
||||
The current script always tails logs and then blocks on `wait "$LOG_PID"`. That is convenient for manual smoke testing, but it is the wrong shape for CI orchestration.
|
||||
|
||||
## 2. Add a dedicated Playwright release-smoke spec
|
||||
|
||||
Create a second Playwright entry point specifically for published Docker installs, for example:
|
||||
|
||||
- `tests/release-smoke/playwright.config.ts`
|
||||
- `tests/release-smoke/docker-auth-onboarding.spec.ts`
|
||||
|
||||
This suite should not use Playwright `webServer`, because the app server will already be running inside Docker.
|
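A minimal config sketch, assuming the detached Docker harness exports `SMOKE_BASE_URL` before the suite runs; everything else is standard Playwright configuration.

```ts
// tests/release-smoke/playwright.config.ts (sketch). Assumes the detached Docker harness
// has already exported SMOKE_BASE_URL; there is intentionally no `webServer` block.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  testDir: ".",
  use: {
    baseURL: process.env.SMOKE_BASE_URL ?? "http://localhost:3232",
    trace: "retain-on-failure",
    screenshot: "only-on-failure",
  },
  reporter: [["html", { open: "never" }]],
});
```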
||||
|
||||
### Browser scenario
|
||||
|
||||
The first release-smoke scenario should validate:
|
||||
|
||||
1. open `/`
|
||||
2. unauthenticated user is redirected to `/auth`
|
||||
3. sign in using the smoke credentials
|
||||
4. authenticated user lands on onboarding when no companies exist
|
||||
5. onboarding wizard appears with the expected step labels
|
||||
6. create a company
|
||||
7. create the first agent using `process`
|
||||
8. create the initial issue
|
||||
9. finish onboarding and open the created issue
|
||||
10. verify via API:
|
||||
- company exists
|
||||
- CEO agent exists
|
||||
- issue exists and is assigned to the CEO
|
||||
11. verify the first heartbeat run was triggered:
|
||||
- either by checking issue status changed from initial state, or
|
||||
- by checking agent/runs API shows a run for the CEO, or
|
||||
- both
|
||||
|
||||
The test should tolerate the run completing quickly. For this reason, the assertion should accept:
|
||||
|
||||
- `queued`
|
||||
- `running`
|
||||
- `succeeded`
|
||||
|
||||
and similarly for issue progression if the issue status changes before the assertion runs.
|
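A condensed sketch of what the spec body could look like, using the smoke credential fixtures; the selectors and the exact runs/issues routes are assumptions that would need to match the real UI and API.

```ts
// tests/release-smoke/docker-auth-onboarding.spec.ts (condensed sketch).
// Selectors and the runs/issues routes are assumptions; /api/companies is known from the harness.
import { test, expect } from "@playwright/test";

const EMAIL = process.env.SMOKE_ADMIN_EMAIL ?? "smoke-admin@paperclip.local";
const PASSWORD = process.env.SMOKE_ADMIN_PASSWORD ?? "paperclip-smoke-password";

test("published install: login, onboarding, first CEO run", async ({ page, request }) => {
  // Steps 1-2: unauthenticated users should land on the auth screen.
  await page.goto("/");
  await expect(page).toHaveURL(/\/auth/);

  // Step 3: sign in with the smoke credential fixtures.
  await page.getByLabel(/email/i).fill(EMAIL);
  await page.getByLabel(/password/i).fill(PASSWORD);
  await page.getByRole("button", { name: /sign in/i }).click();

  // Steps 4-9: a fresh instance should route into onboarding; the wizard steps
  // (company, CEO agent with the `process` adapter, initial issue) are driven here.

  // Step 10: verify created entities via the API rather than cosmetic UI text.
  const companies = await request.get("/api/companies");
  expect(companies.ok()).toBeTruthy();

  // Step 11: verify the first CEO heartbeat run exists, tolerating fast completion;
  // accept `queued`, `running`, or `succeeded` (the exact runs route is an assumption).
});
```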
||||
|
||||
### Why a separate spec instead of reusing `tests/e2e/onboarding.spec.ts`
|
||||
|
||||
The local-source test and release-smoke test have different assumptions:
|
||||
|
||||
- different server lifecycle
|
||||
- different auth path
|
||||
- different deployment mode
|
||||
- published npm package instead of local workspace code
|
||||
|
||||
Trying to force both through one spec will make both worse.
|
||||
|
||||
## 3. Add a release-smoke workflow in GitHub Actions
|
||||
|
||||
Add a workflow dedicated to this surface, ideally reusable:
|
||||
|
||||
- `.github/workflows/release-smoke.yml`
|
||||
|
||||
Recommended triggers:
|
||||
|
||||
- `workflow_dispatch`
|
||||
- `workflow_call`
|
||||
|
||||
Recommended inputs:
|
||||
|
||||
- `paperclip_version`
|
||||
- `canary` or `latest`
|
||||
- `host_port`
|
||||
- optional, default runner-safe port
|
||||
- `artifact_name`
|
||||
- optional for clearer uploads
|
||||
|
||||
### Job outline
|
||||
|
||||
1. checkout repo
|
||||
2. install Node/pnpm
|
||||
3. install Playwright browser dependencies
|
||||
4. launch Docker smoke harness in detached mode with the chosen dist-tag
|
||||
5. run the release-smoke Playwright suite against the returned base URL
|
||||
6. always collect diagnostics:
|
||||
- Playwright report
|
||||
- screenshots
|
||||
- trace
|
||||
- `docker logs`
|
||||
- harness metadata file
|
||||
7. stop and remove container
|
||||
|
||||
### Why a reusable workflow
|
||||
|
||||
This lets us:
|
||||
|
||||
- run the smoke manually on demand
|
||||
- call it from `release.yml`
|
||||
- reuse the same job for both `canary` and `latest`
|
||||
|
||||
## 4. Integrate it into release automation incrementally
|
||||
|
||||
### Phase A: Manual workflow only
|
||||
|
||||
First ship the workflow as manual-only so the harness and test can be stabilized without blocking releases.
|
||||
|
||||
### Phase B: Run automatically after canary publish
|
||||
|
||||
After `publish_canary` succeeds in `.github/workflows/release.yml`, call the reusable release-smoke workflow with:
|
||||
|
||||
- `paperclip_version=canary`
|
||||
|
||||
This proves the just-published public canary really boots and onboards.
|
||||
|
||||
### Phase C: Run automatically after stable publish
|
||||
|
||||
After `publish_stable` succeeds, call the same workflow with:
|
||||
|
||||
- `paperclip_version=latest`
|
||||
|
||||
This gives us post-publish confirmation that the stable dist-tag is healthy.
|
||||
|
||||
### Important nuance
|
||||
|
||||
Testing `latest` from npm cannot happen before stable publish, because the package under test does not exist under `latest` yet. So the `latest` smoke is a post-publish verification, not a pre-publish gate.
|
||||
|
||||
If we later want a true pre-publish stable gate, that should be a separate source-ref or locally built package smoke job.
|
||||
|
||||
## 5. Make diagnostics first-class
|
||||
|
||||
This workflow is only valuable if failures are fast to debug.
|
||||
|
||||
Always capture:
|
||||
|
||||
- Playwright HTML report
|
||||
- Playwright trace on failure
|
||||
- final screenshot on failure
|
||||
- full `docker logs` output
|
||||
- emitted smoke metadata
|
||||
- optional `curl /api/health` snapshot
|
||||
|
||||
Without that, the test will become a flaky black box and people will stop trusting it.
|
||||
|
||||
## Implementation Plan
|
||||
|
||||
## Phase 1: Harness refactor
|
||||
|
||||
Files:
|
||||
|
||||
- `scripts/docker-onboard-smoke.sh`
|
||||
- optionally `scripts/lib/docker-onboard-smoke.sh` or similar helper
|
||||
- `doc/DOCKER.md`
|
||||
- `doc/RELEASING.md`
|
||||
|
||||
Tasks:
|
||||
|
||||
1. Add detached/CI mode to the Docker smoke script.
|
||||
2. Make the script emit machine-readable connection metadata.
|
||||
3. Keep the current interactive manual mode intact.
|
||||
4. Add reliable cleanup commands for CI.
|
||||
|
||||
Acceptance:
|
||||
|
||||
- a script invocation can start the published Docker app, auto-bootstrap it, and return control to the caller with enough metadata for browser automation
|
||||
|
||||
## Phase 2: Browser release-smoke suite
|
||||
|
||||
Files:
|
||||
|
||||
- `tests/release-smoke/playwright.config.ts`
|
||||
- `tests/release-smoke/docker-auth-onboarding.spec.ts`
|
||||
- root `package.json`
|
||||
|
||||
Tasks:
|
||||
|
||||
1. Add a dedicated Playwright config for external server testing.
|
||||
2. Implement login + onboarding + CEO creation flow.
|
||||
3. Assert a CEO run was created or completed.
|
||||
4. Add a root script such as:
|
||||
- `test:release-smoke`
|
||||
|
||||
Acceptance:
|
||||
|
||||
- the suite passes locally against both:
|
||||
- `PAPERCLIPAI_VERSION=canary`
|
||||
- `PAPERCLIPAI_VERSION=latest`
|
||||
|
||||
## Phase 3: GitHub Actions workflow
|
||||
|
||||
Files:
|
||||
|
||||
- `.github/workflows/release-smoke.yml`
|
||||
|
||||
Tasks:
|
||||
|
||||
1. Add manual and reusable workflow entry points.
|
||||
2. Install Chromium and runner dependencies.
|
||||
3. Start Docker smoke in detached mode.
|
||||
4. Run the release-smoke Playwright suite.
|
||||
5. Upload diagnostics artifacts.
|
||||
|
||||
Acceptance:
|
||||
|
||||
- a maintainer can run the workflow manually for either `canary` or `latest`
|
||||
|
||||
## Phase 4: Release workflow integration
|
||||
|
||||
Files:
|
||||
|
||||
- `.github/workflows/release.yml`
|
||||
- `doc/RELEASING.md`
|
||||
|
||||
Tasks:
|
||||
|
||||
1. Trigger release smoke automatically after canary publish.
|
||||
2. Trigger release smoke automatically after stable publish.
|
||||
3. Document expected behavior and failure handling.
|
||||
|
||||
Acceptance:
|
||||
|
||||
- canary releases automatically produce a published-package browser smoke result
|
||||
- stable releases automatically produce a `latest` browser smoke result
|
||||
|
||||
## Phase 5: Future extension for real model-backed agent validation
|
||||
|
||||
Not part of the first implementation, but this should be the next layer after the deterministic lane is stable.
|
||||
|
||||
Possible additions:
|
||||
|
||||
- a second Playwright project gated on repo secrets
|
||||
- real `claude_local` or `codex_local` adapter validation in Docker-capable environments
|
||||
- assertion that the CEO posts a real task/comment artifact
|
||||
- stable release holdback until the credentialed lane passes
|
||||
|
||||
This should stay optional until the token-free lane is trustworthy.
|
||||
|
||||
## Acceptance Criteria
|
||||
|
||||
The plan is complete when the implemented system can demonstrate all of the following:
|
||||
|
||||
1. A published `paperclipai@canary` Docker install can be smoke-tested by Playwright in CI.
|
||||
2. A published `paperclipai@latest` Docker install can be smoke-tested by Playwright in CI.
|
||||
3. The test logs into authenticated mode with the smoke credentials.
|
||||
4. The test sees onboarding for a fresh instance.
|
||||
5. The test completes onboarding in the browser.
|
||||
6. The test verifies the initial CEO agent was created.
|
||||
7. The test verifies at least one CEO heartbeat run was triggered.
|
||||
8. Failures produce actionable artifacts rather than just a red job.
|
||||
|
||||
## Risks And Decisions To Make
|
||||
|
||||
### 1. Fast process runs may finish before the UI visibly updates
|
||||
|
||||
That is expected. The assertions should prefer API polling for run existence/status rather than only visual indicators.
|
||||
|
||||
### 2. `latest` smoke is post-publish, not preventive
|
||||
|
||||
This is a real limitation of testing the published dist-tag itself. It is still valuable, but it should not be confused with a pre-publish gate.
|
||||
|
||||
### 3. We should not overcouple the test to cosmetic onboarding text
|
||||
|
||||
The important contract is flow success, created entities, and run creation. Use visible labels sparingly and prefer stable semantic selectors where possible.
|
||||
|
||||
### 4. Keep the smoke adapter path boring
|
||||
|
||||
For release safety, the first test should use the most boring runnable adapter possible. This is not the place to validate every adapter.
|
||||
|
||||
## Recommended First Slice
|
||||
|
||||
If we want the fastest path to value, ship this in order:
|
||||
|
||||
1. add detached mode to `scripts/docker-onboard-smoke.sh`
|
||||
2. add one Playwright spec for authenticated login + onboarding + CEO run verification
|
||||
3. add manual `release-smoke.yml`
|
||||
4. once stable, wire canary into `release.yml`
|
||||
5. after that, wire stable `latest` smoke into `release.yml`
|
||||
|
||||
That gives release confidence quickly without turning the first version into a large CI redesign.
|
||||
@@ -49,13 +49,13 @@ The repo and npm tooling still assume semver-shaped version strings in many plac
|
||||
|
||||
Recommended format:
|
||||
|
||||
- stable: `YYYY.M.D`
|
||||
- canary: `YYYY.M.D-canary.N`
|
||||
- stable: `YYYY.MDD.P`
|
||||
- canary: `YYYY.MDD.P-canary.N`
|
||||
|
||||
Examples:
|
||||
|
||||
- stable on March 17, 2026: `2026.3.17`
|
||||
- third canary on March 17, 2026: `2026.3.17-canary.2`
|
||||
- first stable on March 17, 2026: `2026.317.0`
|
||||
- third canary on the `2026.317.0` line: `2026.317.0-canary.2`
|
||||
|
||||
Why this shape:
|
||||
|
||||
@@ -66,11 +66,12 @@ Why this shape:
|
||||
|
||||
Important constraints:
|
||||
|
||||
- the middle numeric slot should be `MDD`, where `M` is the month and `DD` is the zero-padded day
|
||||
- `2026.03.17` is not the format to use
|
||||
- numeric semver identifiers do not allow leading zeroes
|
||||
- `2026.03.16.8` is not the format to use
|
||||
- `2026.3.17.1` is not the format to use
|
||||
- semver has three numeric components, not four
|
||||
- the practical semver-safe equivalent of your example is `2026.3.16-canary.8`
|
||||
- the practical semver-safe equivalent is `2026.317.0-canary.8`
|
||||
|
||||
This is effectively CalVer on semver rails.
|
||||
|
||||
@@ -109,7 +110,7 @@ This is the most important mechanical constraint.
|
||||
npm can move dist-tags, but it does not let you rename an already-published version. That means:
|
||||
|
||||
- you can move `latest` to `paperclipai@1.2.3`
|
||||
- you cannot turn `paperclipai@2026.3.16-canary.8` into `paperclipai@2026.3.17`
|
||||
- you cannot turn `paperclipai@2026.317.0-canary.8` into `paperclipai@2026.317.0`
|
||||
|
||||
So "promote canary to stable" really means:
|
||||
|
||||
@@ -123,7 +124,7 @@ Recommended stable input:
|
||||
|
||||
- `source_ref`
|
||||
- commit SHA, or
|
||||
- a canary git tag such as `canary/v2026.3.16-canary.8`
|
||||
- a canary git tag such as `canary/v2026.317.1-canary.8`
|
||||
|
||||
### 5. Only stable releases get release notes, tags, and GitHub Releases
|
||||
|
||||
@@ -137,9 +138,9 @@ Canaries should stay lightweight:
|
||||
|
||||
Stable releases should remain the public narrative surface:
|
||||
|
||||
- git tag `v2026.3.17`
|
||||
- GitHub Release `v2026.3.17`
|
||||
- stable changelog file `releases/v2026.3.17.md`
|
||||
- git tag `v2026.317.0`
|
||||
- GitHub Release `v2026.317.0`
|
||||
- stable changelog file `releases/v2026.317.0.md`
|
||||
|
||||
## Security Model
|
||||
|
||||
@@ -233,14 +234,14 @@ Recommended stable path:
|
||||
|
||||
1. pick a canary commit or tag
|
||||
2. run changelog generation locally from a trusted machine
|
||||
3. commit `releases/vYYYY.M.D.md`
|
||||
3. commit `releases/vYYYY.MDD.P.md`
|
||||
4. run stable promotion
|
||||
|
||||
If the notes are not ready yet, a fallback is acceptable:
|
||||
|
||||
- publish stable
|
||||
- create a minimal GitHub Release
|
||||
- update `releases/vYYYY.M.D.md` immediately afterward
|
||||
- update `releases/vYYYY.MDD.P.md` immediately afterward
|
||||
|
||||
But the better steady-state is to have the stable notes committed before stable publish.
|
||||
|
||||
@@ -268,13 +269,13 @@ Steps:
|
||||
1. checkout the merged `master` commit
|
||||
2. run verification on that exact commit
|
||||
3. compute canary version for current UTC date
|
||||
4. version public packages to `YYYY.M.D-canary.N`
|
||||
4. version public packages to `YYYY.MDD.P-canary.N`
|
||||
5. publish to npm with dist-tag `canary`
|
||||
6. create a canary git tag for traceability
|
||||
|
||||
Recommended canary tag format:
|
||||
|
||||
- `canary/v2026.3.17-canary.4`
|
||||
- `canary/v2026.317.1-canary.4`
|
||||
|
||||
Outputs:
|
||||
|
||||
@@ -299,14 +300,14 @@ Steps:
|
||||
|
||||
1. checkout `source_ref`
|
||||
2. run verification on that exact commit
|
||||
3. compute stable version from UTC date or provided override
|
||||
4. fail if `vYYYY.M.D` already exists
|
||||
5. require `releases/vYYYY.M.D.md`
|
||||
6. version public packages to `YYYY.M.D`
|
||||
3. compute the next stable patch slot for the UTC date or provided override
|
||||
4. fail if `vYYYY.MDD.P` already exists
|
||||
5. require `releases/vYYYY.MDD.P.md`
|
||||
6. version public packages to `YYYY.MDD.P`
|
||||
7. publish to npm under `latest`
|
||||
8. create git tag `vYYYY.M.D`
|
||||
8. create git tag `vYYYY.MDD.P`
|
||||
9. push tag
|
||||
10. create GitHub Release from `releases/vYYYY.M.D.md`
|
||||
10. create GitHub Release from `releases/vYYYY.MDD.P.md`
|
||||
|
||||
Outputs:
|
||||
|
||||
@@ -332,8 +333,8 @@ That logic should be replaced with:
|
||||
|
||||
For example:
|
||||
|
||||
- `stable_version_for_utc_date(2026-03-17) -> 2026.3.17`
|
||||
- `next_canary_for_utc_date(2026-03-17) -> 2026.3.17-canary.0`
|
||||
- `next_stable_version(2026-03-17) -> 2026.317.0`
|
||||
- `next_canary_for_utc_date(2026-03-17) -> 2026.317.0-canary.0`
|
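As a sketch of that computation, assuming the patch slot is supplied by the caller rather than looked up from already-published versions:

```ts
// Sketch only: illustrates the YYYY.MDD.P shape, not the real release tooling.
function stableVersionForUtcDate(date: Date, patch = 0): string {
  const year = date.getUTCFullYear();
  const month = date.getUTCMonth() + 1;                   // unpadded month
  const day = String(date.getUTCDate()).padStart(2, "0"); // zero-padded day
  return `${year}.${month}${day}.${patch}`;
}

function canaryVersionForUtcDate(date: Date, patch = 0, n = 0): string {
  return `${stableVersionForUtcDate(date, patch)}-canary.${n}`;
}

// stableVersionForUtcDate(new Date(Date.UTC(2026, 2, 17)))  -> "2026.317.0"
// canaryVersionForUtcDate(new Date(Date.UTC(2026, 2, 17)))  -> "2026.317.0-canary.0"
```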
||||
|
||||
### 2. Stop requiring `release/X.Y.Z`
|
||||
|
||||
@@ -392,19 +393,15 @@ It should continue to:
|
||||
|
||||
## Tradeoffs and Risks
|
||||
|
||||
### 1. One stable per UTC day
|
||||
### 1. The stable patch slot is now part of the version contract
|
||||
|
||||
With plain `YYYY.M.D`, you get one stable release per UTC day.
|
||||
With `YYYY.MDD.P`, same-day hotfixes are supported, but the stable patch slot is now part of the visible version format.
|
||||
|
||||
That is probably fine, but it is a real product rule.
|
||||
That is the right tradeoff because:
|
||||
|
||||
If you need multiple same-day stables later, you have three options:
|
||||
|
||||
1. accept a less pretty stable format
|
||||
2. go back to a serial patch component
|
||||
3. keep daily stable cadence and use canaries for same-day fixes
|
||||
|
||||
My recommendation is to accept one stable per UTC day unless reality proves otherwise.
|
||||
1. npm still gets semver-valid versions
|
||||
2. same-day hotfixes stay possible
|
||||
3. chronological ordering still works as long as the day is zero-padded inside `MDD`
|
||||
|
||||
### 2. Public package consumers lose semver intent signaling
|
||||
|
||||
@@ -469,8 +466,8 @@ That is acceptable if canaries stay clearly separate:
|
||||
|
||||
Paperclip should adopt this model:
|
||||
|
||||
- stable versions: `YYYY.M.D`
|
||||
- canary versions: `YYYY.M.D-canary.N`
|
||||
- stable versions: `YYYY.MDD.P`
|
||||
- canary versions: `YYYY.MDD.P-canary.N`
|
||||
- canaries auto-published on every push to `master`
|
||||
- stables manually promoted from a chosen tested commit or canary tag
|
||||
- no release branches in the default path
|
||||
|
||||
@@ -249,7 +249,7 @@ Runs local `claude` CLI directly.
|
||||
"cwd": "/absolute/or/relative/path",
|
||||
"promptTemplate": "You are agent {{agent.id}} ...",
|
||||
"model": "optional-model-id",
|
||||
"maxTurnsPerRun": 300,
|
||||
"maxTurnsPerRun": 1000,
|
||||
"dangerouslySkipPermissions": true,
|
||||
"env": {"KEY": "VALUE"},
|
||||
"extraArgs": [],
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
services:
|
||||
paperclip:
|
||||
build:
|
||||
context: .
|
||||
context: ..
|
||||
dockerfile: Dockerfile
|
||||
ports:
|
||||
- "${PAPERCLIP_PORT:-3100}:3100"
|
||||
@@ -15,4 +15,4 @@ services:
|
||||
PAPERCLIP_PUBLIC_URL: "${PAPERCLIP_PUBLIC_URL:-http://localhost:3100}"
|
||||
BETTER_AUTH_SECRET: "${BETTER_AUTH_SECRET:?BETTER_AUTH_SECRET must be set}"
|
||||
volumes:
|
||||
- "${PAPERCLIP_DATA_DIR:-./data/docker-paperclip}:/paperclip"
|
||||
- "${PAPERCLIP_DATA_DIR:-../data/docker-paperclip}:/paperclip"
|
||||
@@ -1,7 +1,7 @@
|
||||
services:
|
||||
review:
|
||||
build:
|
||||
context: .
|
||||
context: ..
|
||||
dockerfile: docker/untrusted-review/Dockerfile
|
||||
init: true
|
||||
tty: true
|
||||
@@ -16,7 +16,9 @@ services:
|
||||
- pgdata:/var/lib/postgresql/data
|
||||
|
||||
server:
|
||||
build: .
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: Dockerfile
|
||||
ports:
|
||||
- "3100:3100"
|
||||
environment:
|
||||
20
docker/quadlet/paperclip-db.container
Normal file
@@ -0,0 +1,20 @@
|
||||
[Unit]
|
||||
Description=PostgreSQL for Paperclip
|
||||
|
||||
[Container]
|
||||
Image=docker.io/library/postgres:17-alpine
|
||||
ContainerName=paperclip-db
|
||||
Pod=paperclip.pod
|
||||
Volume=paperclip-pgdata:/var/lib/postgresql/data
|
||||
EnvironmentFile=%h/.config/containers/systemd/paperclip.env
|
||||
HealthCmd=pg_isready -U $POSTGRES_USER -d $POSTGRES_DB -h localhost || exit 1
|
||||
HealthInterval=15s
|
||||
HealthTimeout=5s
|
||||
HealthRetries=5
|
||||
|
||||
[Service]
|
||||
Restart=on-failure
|
||||
TimeoutStartSec=60
|
||||
|
||||
[Install]
|
||||
WantedBy=default.target
|
||||
23
docker/quadlet/paperclip.container
Normal file
@@ -0,0 +1,23 @@
|
||||
[Unit]
|
||||
Description=Paperclip AI Agent Orchestrator
|
||||
Requires=paperclip-db.service
|
||||
After=paperclip-db.service
|
||||
|
||||
[Container]
|
||||
Image=paperclip-local
|
||||
ContainerName=paperclip
|
||||
Pod=paperclip.pod
|
||||
Volume=%h/.local/share/paperclip:/paperclip:Z
|
||||
Environment=HOST=0.0.0.0
|
||||
Environment=PAPERCLIP_HOME=/paperclip
|
||||
Environment=PAPERCLIP_DEPLOYMENT_MODE=authenticated
|
||||
Environment=PAPERCLIP_DEPLOYMENT_EXPOSURE=private
|
||||
Environment=PAPERCLIP_PUBLIC_URL=http://localhost:3100
|
||||
EnvironmentFile=%h/.config/containers/systemd/paperclip.env
|
||||
|
||||
[Service]
|
||||
Restart=on-failure
|
||||
TimeoutStartSec=120
|
||||
|
||||
[Install]
|
||||
WantedBy=default.target
|
||||
3
docker/quadlet/paperclip.pod
Normal file
@@ -0,0 +1,3 @@
|
||||
[Pod]
|
||||
PodName=paperclip
|
||||
PublishPort=3100:3100
|
||||
@@ -20,7 +20,7 @@ The `claude_local` adapter runs Anthropic's Claude Code CLI locally. It supports
|
||||
| `env` | object | No | Environment variables (supports secret refs) |
|
||||
| `timeoutSec` | number | No | Process timeout (0 = no timeout) |
|
||||
| `graceSec` | number | No | Grace period before force-kill |
|
||||
| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `300`) |
|
||||
| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `1000`) |
|
||||
| `dangerouslySkipPermissions` | boolean | No | Skip permission prompts (dev only) |
|
||||
|
||||
## Prompt Templates
|
||||
|
||||
@@ -40,6 +40,12 @@ pnpm paperclipai agent local-cli codexcoder --company-id <company-id>
|
||||
|
||||
This installs any missing skills, creates an agent API key, and prints shell exports to run as that agent.
|
||||
|
||||
## Instructions Resolution
|
||||
|
||||
If `instructionsFilePath` is configured, Paperclip reads that file and prepends it to the stdin prompt sent to `codex exec` on every run.
|
||||
|
||||
This is separate from any workspace-level instruction discovery that Codex itself performs in the run `cwd`. Paperclip does not disable Codex-native repo instruction files, so a repo-local `AGENTS.md` may still be loaded by Codex in addition to the Paperclip-managed agent instructions.
|
||||
|
||||
## Environment Test
|
||||
|
||||
The environment test checks:
|
||||
|
||||
@@ -20,9 +20,12 @@ When a heartbeat fires, Paperclip:
|
||||
|---------|----------|-------------|
|
||||
| [Claude Local](/adapters/claude-local) | `claude_local` | Runs Claude Code CLI locally |
|
||||
| [Codex Local](/adapters/codex-local) | `codex_local` | Runs OpenAI Codex CLI locally |
|
||||
| [Gemini Local](/adapters/gemini-local) | `gemini_local` | Runs Gemini CLI locally |
|
||||
| [Gemini Local](/adapters/gemini-local) | `gemini_local` | Runs Gemini CLI locally (experimental — adapter package exists, not yet in stable type enum) |
|
||||
| OpenCode Local | `opencode_local` | Runs OpenCode CLI locally (multi-provider `provider/model`) |
|
||||
| OpenClaw | `openclaw` | Sends wake payloads to an OpenClaw webhook |
|
||||
| Hermes Local | `hermes_local` | Runs Hermes CLI locally |
|
||||
| Cursor | `cursor` | Runs Cursor in background mode |
|
||||
| Pi Local | `pi_local` | Runs an embedded Pi agent locally |
|
||||
| OpenClaw Gateway | `openclaw_gateway` | Connects to an OpenClaw gateway endpoint |
|
||||
| [Process](/adapters/process) | `process` | Executes arbitrary shell commands |
|
||||
| [HTTP](/adapters/http) | `http` | Sends webhooks to external agents |
|
||||
|
||||
@@ -55,7 +58,7 @@ Three registries consume these modules:
|
||||
|
||||
## Choosing an Adapter
|
||||
|
||||
- **Need a coding agent?** Use `claude_local`, `codex_local`, `gemini_local`, or `opencode_local`
|
||||
- **Need a coding agent?** Use `claude_local`, `codex_local`, `opencode_local`, or `hermes_local`
|
||||
- **Need to run a script or command?** Use `process`
|
||||
- **Need to call an external service?** Use `http`
|
||||
- **Need something custom?** [Create your own adapter](/adapters/creating-an-adapter)
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
# Agent Runtime Guide
|
||||
|
||||
Status: User-facing guide
|
||||
Last updated: 2026-02-17
|
||||
Status: User-facing guide
|
||||
Last updated: 2026-03-26
|
||||
Audience: Operators setting up and running agents in Paperclip
|
||||
|
||||
## 1. What this system does
|
||||
@@ -32,14 +32,19 @@ If an agent is already running, new wakeups are merged (coalesced) instead of la
|
||||
|
||||
## 3.1 Adapter choice
|
||||
|
||||
Common choices:
|
||||
Built-in adapters:
|
||||
|
||||
- `claude_local`: runs your local `claude` CLI
|
||||
- `codex_local`: runs your local `codex` CLI
|
||||
- `opencode_local`: runs your local `opencode` CLI
|
||||
- `hermes_local`: runs your local `hermes` CLI
|
||||
- `cursor`: runs Cursor in background mode
|
||||
- `pi_local`: runs an embedded Pi agent locally
|
||||
- `openclaw_gateway`: connects to an OpenClaw gateway endpoint
|
||||
- `process`: generic shell command adapter
|
||||
- `http`: calls an external HTTP endpoint
|
||||
|
||||
For `claude_local` and `codex_local`, Paperclip assumes the CLI is already installed and authenticated on the host machine.
|
||||
For local CLI adapters (`claude_local`, `codex_local`, `opencode_local`, `hermes_local`), Paperclip assumes the CLI is already installed and authenticated on the host machine.
|
||||
|
||||
## 3.2 Runtime behavior
|
||||
|
||||
@@ -69,6 +74,8 @@ You can set:
|
||||
|
||||
Templates support variables like `{{agent.id}}`, `{{agent.name}}`, and run context values.
|
||||
|
||||
> **Note:** `bootstrapPromptTemplate` is deprecated and should not be used for new agents. Existing configs that use it will continue to work but should be migrated to the managed instructions bundle system.
|
||||
|
||||
## 4. Session resume behavior
|
||||
|
||||
Paperclip stores session IDs for resumable adapters.
|
||||
@@ -133,7 +140,7 @@ If the connection drops, the UI reconnects automatically.
|
||||
|
||||
If runs fail repeatedly:
|
||||
|
||||
1. Check adapter command availability (`claude`/`codex` installed and logged in).
|
||||
1. Check adapter command availability (e.g. `claude`/`codex`/`opencode`/`hermes` installed and logged in).
|
||||
2. Verify `cwd` exists and is accessible.
|
||||
3. Inspect run error + stderr excerpt, then full log.
|
||||
4. Confirm timeout is not too low.
|
||||
@@ -166,9 +173,9 @@ Start with least privilege where possible, and avoid exposing secrets in broad r
|
||||
|
||||
## 10. Minimal setup checklist
|
||||
|
||||
1. Choose adapter (`claude_local` or `codex_local`).
|
||||
2. Set `cwd` to the target workspace.
|
||||
3. Add bootstrap + normal prompt templates.
|
||||
1. Choose adapter (e.g. `claude_local`, `codex_local`, `opencode_local`, `hermes_local`, `cursor`, or `openclaw_gateway`).
|
||||
2. Set `cwd` to the target workspace (for local adapters).
|
||||
3. Optionally add a prompt template (`promptTemplate`) or use the managed instructions bundle.
|
||||
4. Configure heartbeat policy (timer and/or assignment wakeups).
|
||||
5. Trigger a manual wakeup.
|
||||
6. Confirm run succeeds and session/token usage is recorded.
|
||||
|
||||
@@ -38,11 +38,13 @@ POST /api/companies/{companyId}/goals
|
||||
```
|
||||
PATCH /api/goals/{goalId}
|
||||
{
|
||||
"status": "completed",
|
||||
"status": "achieved",
|
||||
"description": "Updated description"
|
||||
}
|
||||
```
|
||||
|
||||
Valid status values: `planned`, `active`, `achieved`, `cancelled`.
|
||||
|
||||
## Projects
|
||||
|
||||
Projects group related issues toward a deliverable. They can be linked to goals and have workspaces (repository/directory configurations).
|
||||
|
||||
@@ -81,6 +81,19 @@ Atomically claims the task and transitions to `in_progress`. Returns `409 Confli
|
||||
|
||||
Idempotent if you already own the task.
|
||||
|
||||
**Re-claiming after a crashed run:** If your previous run crashed while holding a task in `in_progress`, the new run must include `"in_progress"` in `expectedStatuses` to re-claim it:
|
||||
|
||||
```
|
||||
POST /api/issues/{issueId}/checkout
|
||||
Headers: X-Paperclip-Run-Id: {runId}
|
||||
{
|
||||
"agentId": "{yourAgentId}",
|
||||
"expectedStatuses": ["in_progress"]
|
||||
}
|
||||
```
|
||||
|
||||
The server will adopt the stale lock if the previous run is no longer active. **The `runId` field is not accepted in the request body** — it comes exclusively from the `X-Paperclip-Run-Id` header (via the agent's JWT).
|
||||
|
||||
## Release Task
|
||||
|
||||
```
|
||||
|
||||
201
docs/api/routines.md
Normal file
@@ -0,0 +1,201 @@
|
||||
---
|
||||
title: Routines
|
||||
summary: Recurring task scheduling, triggers, and run history
|
||||
---
|
||||
|
||||
Routines are recurring tasks that fire on a schedule, webhook, or API call and create a heartbeat run for the assigned agent.
|
||||
|
||||
## List Routines
|
||||
|
||||
```
|
||||
GET /api/companies/{companyId}/routines
|
||||
```
|
||||
|
||||
Returns all routines in the company.
|
||||
|
||||
## Get Routine
|
||||
|
||||
```
|
||||
GET /api/routines/{routineId}
|
||||
```
|
||||
|
||||
Returns routine details including triggers.
|
||||
|
||||
## Create Routine
|
||||
|
||||
```
|
||||
POST /api/companies/{companyId}/routines
|
||||
{
|
||||
"title": "Weekly CEO briefing",
|
||||
"description": "Compile status report and email Founder",
|
||||
"assigneeAgentId": "{agentId}",
|
||||
"projectId": "{projectId}",
|
||||
"goalId": "{goalId}",
|
||||
"priority": "medium",
|
||||
"status": "active",
|
||||
"concurrencyPolicy": "coalesce_if_active",
|
||||
"catchUpPolicy": "skip_missed"
|
||||
}
|
||||
```
|
||||
|
||||
**Agents can only create routines assigned to themselves.** Board operators can assign to any agent.
|
||||
|
||||
Fields:
|
||||
|
||||
| Field | Required | Description |
|
||||
|-------|----------|-------------|
|
||||
| `title` | yes | Routine name |
|
||||
| `description` | no | Human-readable description of the routine |
|
||||
| `assigneeAgentId` | yes | Agent who receives each run |
|
||||
| `projectId` | yes | Project this routine belongs to |
|
||||
| `goalId` | no | Goal to link runs to |
|
||||
| `parentIssueId` | no | Parent issue for created run issues |
|
||||
| `priority` | no | `critical`, `high`, `medium` (default), `low` |
|
||||
| `status` | no | `active` (default), `paused`, `archived` |
|
||||
| `concurrencyPolicy` | no | Behaviour when a run fires while a previous one is still active |
|
||||
| `catchUpPolicy` | no | Behaviour for missed scheduled runs |
|
||||
|
||||
**Concurrency policies:**
|
||||
|
||||
| Value | Behaviour |
|
||||
|-------|-----------|
|
||||
| `coalesce_if_active` (default) | Incoming run is immediately finalised as `coalesced` and linked to the active run — no new issue is created |
|
||||
| `skip_if_active` | Incoming run is immediately finalised as `skipped` and linked to the active run — no new issue is created |
|
||||
| `always_enqueue` | Always create a new run regardless of active runs |
|
||||
|
||||
**Catch-up policies:**
|
||||
|
||||
| Value | Behaviour |
|
||||
|-------|-----------|
|
||||
| `skip_missed` (default) | Missed scheduled runs are dropped |
|
||||
| `enqueue_missed_with_cap` | Missed runs are enqueued up to an internal cap |
|
||||
|
||||
## Update Routine
|
||||
|
||||
```
|
||||
PATCH /api/routines/{routineId}
|
||||
{
|
||||
"status": "paused"
|
||||
}
|
||||
```
|
||||
|
||||
All fields from create are updatable. **Agents can only update routines assigned to themselves and cannot reassign a routine to another agent.**
|
||||
|
||||
## Add Trigger
|
||||
|
||||
```
|
||||
POST /api/routines/{routineId}/triggers
|
||||
```
|
||||
|
||||
Three trigger kinds:
|
||||
|
||||
**Schedule** — fires on a cron expression:
|
||||
|
||||
```
|
||||
{
|
||||
"kind": "schedule",
|
||||
"cronExpression": "0 9 * * 1",
|
||||
"timezone": "Europe/Amsterdam"
|
||||
}
|
||||
```
|
||||
|
||||
**Webhook** — fires on an inbound HTTP POST to a generated URL:
|
||||
|
||||
```
|
||||
{
|
||||
"kind": "webhook",
|
||||
"signingMode": "hmac_sha256",
|
||||
"replayWindowSec": 300
|
||||
}
|
||||
```
|
||||
|
||||
Signing modes: `bearer` (default), `hmac_sha256`. Replay window range: 30–86400 seconds (default 300).
|
||||
|
||||
**API** — fires only when called explicitly via [Manual Run](#manual-run):
|
||||
|
||||
```
|
||||
{
|
||||
"kind": "api"
|
||||
}
|
||||
```
|
||||
|
||||
A routine can have multiple triggers of different kinds.
|
||||
|
||||
## Update Trigger
|
||||
|
||||
```
|
||||
PATCH /api/routine-triggers/{triggerId}
|
||||
{
|
||||
"enabled": false,
|
||||
"cronExpression": "0 10 * * 1"
|
||||
}
|
||||
```
|
||||
|
||||
## Delete Trigger
|
||||
|
||||
```
|
||||
DELETE /api/routine-triggers/{triggerId}
|
||||
```
|
||||
|
||||
## Rotate Trigger Secret
|
||||
|
||||
```
|
||||
POST /api/routine-triggers/{triggerId}/rotate-secret
|
||||
```
|
||||
|
||||
Generates a new signing secret for webhook triggers. The previous secret is immediately invalidated.
|
||||
|
||||
## Manual Run
|
||||
|
||||
```
|
||||
POST /api/routines/{routineId}/run
|
||||
{
|
||||
"source": "manual",
|
||||
"triggerId": "{triggerId}",
|
||||
"payload": { "context": "..." },
|
||||
"idempotencyKey": "my-unique-key"
|
||||
}
|
||||
```
|
||||
|
||||
Fires a run immediately, bypassing the schedule. Concurrency policy still applies.
|
||||
|
||||
`triggerId` is optional. When supplied, the server validates the trigger belongs to this routine (`403`) and is enabled (`409`), then records the run against that trigger and updates its `lastFiredAt`. Omit it for a generic manual run with no trigger attribution.
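As a usage sketch (placeholder base URL and IDs, auth headers omitted, idempotency key value illustrative), the two variants look like this:

```bash
# Manual run attributed to a specific trigger (403 if the trigger belongs to
# another routine, 409 if it is disabled):
curl -X POST "http://127.0.0.1:3102/api/routines/<routineId>/run" \
  -H "Content-Type: application/json" \
  -d '{"source": "manual", "triggerId": "<triggerId>", "idempotencyKey": "deploy-2026-03-31"}'

# Generic manual run with no trigger attribution:
curl -X POST "http://127.0.0.1:3102/api/routines/<routineId>/run" \
  -H "Content-Type: application/json" \
  -d '{"source": "manual"}'
```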
|
||||
|
||||
## Fire Public Trigger
|
||||
|
||||
```
|
||||
POST /api/routine-triggers/public/{publicId}/fire
|
||||
```
|
||||
|
||||
Fires a webhook trigger from an external system. Requires a valid `Authorization` or `X-Paperclip-Signature` + `X-Paperclip-Timestamp` header pair matching the trigger's signing mode.
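A minimal shell sketch for the `hmac_sha256` mode. The exact signed-string format is not specified here, so the `"<timestamp>.<raw body>"` convention below is an assumption to verify against your deployment; the base URL and IDs are placeholders:

```bash
SECRET="<webhook signing secret>"   # current secret, e.g. after rotate-secret
PUBLIC_ID="<publicId>"
BODY='{"context": "external event"}'
TS=$(date +%s)

# Assumed signed string: "<timestamp>.<raw body>"; adjust to your setup.
SIG=$(printf '%s.%s' "$TS" "$BODY" | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')

curl -X POST "http://127.0.0.1:3102/api/routine-triggers/public/${PUBLIC_ID}/fire" \
  -H "Content-Type: application/json" \
  -H "X-Paperclip-Timestamp: $TS" \
  -H "X-Paperclip-Signature: $SIG" \
  -d "$BODY"
```

For `bearer` mode, the `Authorization` header carries the secret instead of the signature pair, and the timestamp must fall within the trigger's replay window.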
|
||||
|
||||
## List Runs
|
||||
|
||||
```
|
||||
GET /api/routines/{routineId}/runs?limit=50
|
||||
```
|
||||
|
||||
Returns recent run history for the routine. Defaults to the 50 most recent runs.
|
||||
|
||||
## Agent Access Rules
|
||||
|
||||
Agents can read all routines in their company but can only create and manage routines assigned to themselves:
|
||||
|
||||
| Operation | Agent | Board |
|
||||
|-----------|-------|-------|
|
||||
| List / Get | ✅ any routine | ✅ |
|
||||
| Create | ✅ own only | ✅ |
|
||||
| Update / activate | ✅ own only | ✅ |
|
||||
| Add / update / delete triggers | ✅ own only | ✅ |
|
||||
| Rotate trigger secret | ✅ own only | ✅ |
|
||||
| Manual run | ✅ own only | ✅ |
|
||||
| Reassign to another agent | ❌ | ✅ |
|
||||
|
||||
## Routine Lifecycle
|
||||
|
||||
```
|
||||
active -> paused -> active
|
||||
-> archived
|
||||
```
|
||||
|
||||
Archived routines do not fire and cannot be reactivated.
|
||||
@@ -41,15 +41,16 @@ pnpm paperclipai company export <company-id> --out ./exports/acme --include comp
|
||||
|
||||
# Preview import (no writes)
|
||||
pnpm paperclipai company import \
|
||||
--from https://github.com/<owner>/<repo>/tree/main/<path> \
|
||||
<owner>/<repo>/<path> \
|
||||
--target existing \
|
||||
--company-id <company-id> \
|
||||
--ref main \
|
||||
--collision rename \
|
||||
--dry-run
|
||||
|
||||
# Apply import
|
||||
pnpm paperclipai company import \
|
||||
--from ./exports/acme \
|
||||
./exports/acme \
|
||||
--target new \
|
||||
--new-company-name "Acme Imported" \
|
||||
--include company,agents
|
||||
|
||||
@@ -33,6 +33,8 @@ Interactive first-time setup:
|
||||
pnpm paperclipai onboard
|
||||
```
|
||||
|
||||
If Paperclip is already configured, rerunning `onboard` keeps the existing config in place. Use `paperclipai configure` to change settings on an existing install.
|
||||
|
||||
First prompt:
|
||||
|
||||
1. `Quickstart` (recommended): local defaults (embedded database, no LLM provider, local disk storage, default secrets)
|
||||
@@ -50,6 +52,8 @@ Non-interactive defaults + immediate start (opens browser on server listen):
|
||||
pnpm paperclipai onboard --yes
|
||||
```
|
||||
|
||||
On an existing install, `--yes` now preserves the current config and just starts Paperclip with that setup.
|
||||
|
||||
## `paperclipai doctor`
|
||||
|
||||
Health checks with optional auto-repair:
|
||||
|
||||
596
docs/companies/companies-spec.md
Normal file
@@ -0,0 +1,596 @@
|
||||
# Agent Companies Specification
|
||||
|
||||
Extension of the Agent Skills Specification
|
||||
|
||||
Version: `agentcompanies/v1-draft`
|
||||
|
||||
## 1. Purpose
|
||||
|
||||
An Agent Company package is a filesystem- and GitHub-native format for describing a company, team, agent, project, task, and associated skills using markdown files with YAML frontmatter.
|
||||
|
||||
This specification is an extension of the Agent Skills specification, not a replacement for it.
|
||||
|
||||
It defines how company-, team-, and agent-level package structure composes around the existing `SKILL.md` model.
|
||||
|
||||
This specification is vendor-neutral. It is intended to be usable by any agent-company runtime, not only Paperclip.
|
||||
|
||||
The format is designed to:
|
||||
|
||||
- be readable and writable by humans
|
||||
- work directly from a local folder or GitHub repository
|
||||
- require no central registry
|
||||
- support attribution and pinned references to upstream files
|
||||
- extend the existing Agent Skills ecosystem without redefining it
|
||||
- be useful outside Paperclip
|
||||
|
||||
## 2. Core Principles
|
||||
|
||||
1. Markdown is canonical.
|
||||
2. Git repositories are valid package containers.
|
||||
3. Registries are optional discovery layers, not authorities.
|
||||
4. `SKILL.md` remains owned by the Agent Skills specification.
|
||||
5. External references must be pinnable to immutable Git commits.
|
||||
6. Attribution and license metadata must survive import/export.
|
||||
7. Slugs and relative paths are the portable identity layer, not database ids.
|
||||
8. Conventional folder structure should work without verbose wiring.
|
||||
9. Vendor-specific fidelity belongs in optional extensions, not the base package.
|
||||
|
||||
## 3. Package Kinds
|
||||
|
||||
A package root is identified by one primary markdown file:
|
||||
|
||||
- `COMPANY.md` for a company package
|
||||
- `TEAM.md` for a team package
|
||||
- `AGENTS.md` for an agent package
|
||||
- `PROJECT.md` for a project package
|
||||
- `TASK.md` for a task package
|
||||
- `SKILL.md` for a skill package defined by the Agent Skills specification
|
||||
|
||||
A GitHub repo may contain one package at root or many packages in subdirectories.
|
||||
|
||||
## 4. Reserved Files And Directories
|
||||
|
||||
Common conventions:
|
||||
|
||||
```text
|
||||
COMPANY.md
|
||||
TEAM.md
|
||||
AGENTS.md
|
||||
PROJECT.md
|
||||
TASK.md
|
||||
SKILL.md
|
||||
|
||||
agents/<slug>/AGENTS.md
|
||||
teams/<slug>/TEAM.md
|
||||
projects/<slug>/PROJECT.md
|
||||
projects/<slug>/tasks/<slug>/TASK.md
|
||||
tasks/<slug>/TASK.md
|
||||
skills/<slug>/SKILL.md
|
||||
.paperclip.yaml
|
||||
|
||||
HEARTBEAT.md
|
||||
SOUL.md
|
||||
TOOLS.md
|
||||
README.md
|
||||
assets/
|
||||
scripts/
|
||||
references/
|
||||
```
|
||||
|
||||
Rules:
|
||||
|
||||
- only markdown files are canonical content docs
|
||||
- non-markdown directories like `assets/`, `scripts/`, and `references/` are allowed
|
||||
- package tools may generate optional lock files, but lock files are not required for authoring
|
||||
|
||||
## 5. Common Frontmatter
|
||||
|
||||
Package docs may support these fields:
|
||||
|
||||
```yaml
|
||||
schema: agentcompanies/v1
|
||||
kind: company | team | agent | project | task
|
||||
slug: my-slug
|
||||
name: Human Readable Name
|
||||
description: Short description
|
||||
version: 0.1.0
|
||||
license: MIT
|
||||
authors:
|
||||
- name: Jane Doe
|
||||
homepage: https://example.com
|
||||
tags:
|
||||
- startup
|
||||
- engineering
|
||||
metadata: {}
|
||||
sources: []
|
||||
```
|
||||
|
||||
Notes:
|
||||
|
||||
- `schema` is optional and should usually appear only at the package root
|
||||
- `kind` is optional when file path and file name already make the kind obvious
|
||||
- `slug` should be URL-safe and stable
|
||||
- `sources` is for provenance and external references
|
||||
- `metadata` is for tool-specific extensions
|
||||
- exporters should omit empty or default-valued fields
|
||||
|
||||
## 6. COMPANY.md
|
||||
|
||||
`COMPANY.md` is the root entrypoint for a whole company package.
|
||||
|
||||
### Required fields
|
||||
|
||||
```yaml
|
||||
name: Lean Dev Shop
|
||||
description: Small engineering-focused AI company
|
||||
slug: lean-dev-shop
|
||||
schema: agentcompanies/v1
|
||||
```
|
||||
|
||||
### Recommended fields
|
||||
|
||||
```yaml
|
||||
version: 1.0.0
|
||||
license: MIT
|
||||
authors:
|
||||
- name: Example Org
|
||||
goals:
|
||||
- Build and ship software products
|
||||
includes:
|
||||
- https://github.com/example/shared-company-parts/blob/0123456789abcdef0123456789abcdef01234567/teams/engineering/TEAM.md
|
||||
requirements:
|
||||
secrets:
|
||||
- OPENAI_API_KEY
|
||||
```
|
||||
|
||||
### Semantics
|
||||
|
||||
- `includes` defines the package graph
|
||||
- local package contents should be discovered implicitly by folder convention
|
||||
- `includes` is optional and should be used mainly for external refs or nonstandard locations
|
||||
- included items may be local or external references
|
||||
- `COMPANY.md` may include agents directly, teams, projects, tasks, or skills
|
||||
- a company importer may render `includes` as the tree/checkbox import UI
|
||||
|
||||
## 7. TEAM.md
|
||||
|
||||
`TEAM.md` defines an org subtree.
|
||||
|
||||
### Example
|
||||
|
||||
```yaml
|
||||
name: Engineering
|
||||
description: Product and platform engineering team
|
||||
schema: agentcompanies/v1
|
||||
slug: engineering
|
||||
manager: ../cto/AGENTS.md
|
||||
includes:
|
||||
- ../platform-lead/AGENTS.md
|
||||
- ../frontend-lead/AGENTS.md
|
||||
- ../../skills/review/SKILL.md
|
||||
tags:
|
||||
- team
|
||||
- engineering
|
||||
```
|
||||
|
||||
### Semantics
|
||||
|
||||
- a team package is a reusable subtree, not necessarily a runtime database table
|
||||
- `manager` identifies the root agent of the subtree
|
||||
- `includes` may contain child agents, child teams, or shared skills
|
||||
- a team package can be imported into an existing company and attached under a target manager
|
||||
|
||||
## 8. AGENTS.md
|
||||
|
||||
`AGENTS.md` defines an agent.
|
||||
|
||||
### Example
|
||||
|
||||
```yaml
|
||||
name: CEO
|
||||
title: Chief Executive Officer
|
||||
reportsTo: null
|
||||
skills:
|
||||
- plan-ceo-review
|
||||
- review
|
||||
```
|
||||
|
||||
### Semantics
|
||||
|
||||
- body content is the canonical default instruction content for the agent
|
||||
- `docs` points to sibling markdown docs when present
|
||||
- `skills` references reusable `SKILL.md` packages by skill shortname or slug
|
||||
- a bare skill entry like `review` should resolve to `skills/review/SKILL.md` by convention
|
||||
- if a package references external skills, the agent should still refer to the skill by shortname; the skill package itself owns any source refs, pinning, or attribution details
|
||||
- tools may allow path or URL entries as an escape hatch, but exporters should prefer shortname-based skill references in `AGENTS.md`
|
||||
- vendor-specific adapter/runtime config should not live in the base package
|
||||
- local absolute paths, machine-specific cwd values, and secret values must not be exported as canonical package data
|
||||
|
||||
### Skill Resolution
|
||||
|
||||
The preferred association standard between agents and skills is by skill shortname.
|
||||
|
||||
Suggested resolution order for an agent skill entry:
|
||||
|
||||
1. a local package skill at `skills/<shortname>/SKILL.md`
|
||||
2. a referenced or included skill package whose declared slug or shortname matches
|
||||
3. a tool-managed company skill library entry with the same shortname
|
||||
|
||||
Rules:
|
||||
|
||||
- exporters should emit shortnames in `AGENTS.md` whenever possible
|
||||
- importers should not require full file paths for ordinary skill references
|
||||
- the skill package itself should carry any complexity around external refs, vendoring, mirrors, or pinned upstream content
|
||||
- this keeps `AGENTS.md` readable and consistent with `skills.sh`-style sharing
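A minimal shell sketch of that resolution order for a single entry such as `review`. The helper is hypothetical and only implements step 1 (the conventional on-disk layout); steps 2 and 3 depend on the importing tool:

```bash
resolve_skill() {
  local shortname="$1" package_root="$2"

  # 1. local package skill at skills/<shortname>/SKILL.md
  if [ -f "${package_root}/skills/${shortname}/SKILL.md" ]; then
    printf '%s\n' "${package_root}/skills/${shortname}/SKILL.md"
    return 0
  fi

  # 2. included/referenced skill packages whose declared slug matches (tool-specific)
  # 3. a tool-managed company skill library entry with the same shortname (tool-specific)
  return 1
}

resolve_skill review ./lean-dev-shop || echo "review: not found locally, defer to the tool"
```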
|
||||
|
||||
## 9. PROJECT.md
|
||||
|
||||
`PROJECT.md` defines a lightweight project package.
|
||||
|
||||
### Example
|
||||
|
||||
```yaml
|
||||
name: Q2 Launch
|
||||
description: Ship the Q2 launch plan and supporting assets
|
||||
owner: cto
|
||||
```
|
||||
|
||||
### Semantics
|
||||
|
||||
- a project package groups related starter tasks and supporting markdown
|
||||
- `owner` should reference an agent slug when there is a clear project owner
|
||||
- a conventional `tasks/` subfolder should be discovered implicitly
|
||||
- `includes` may contain `TASK.md`, `SKILL.md`, or supporting docs when explicit wiring is needed
|
||||
- project packages are intended to seed planned work, not represent runtime task state
|
||||
|
||||
## 10. TASK.md
|
||||
|
||||
`TASK.md` defines a lightweight starter task.
|
||||
|
||||
### Example
|
||||
|
||||
```yaml
|
||||
name: Monday Review
|
||||
assignee: ceo
|
||||
project: q2-launch
|
||||
recurring: true
|
||||
```
|
||||
|
||||
### Semantics
|
||||
|
||||
- body content is the canonical markdown task description
|
||||
- `assignee` should reference an agent slug inside the package
|
||||
- `project` should reference a project slug when the task belongs to a `PROJECT.md`
|
||||
- `recurring: true` marks the task as ongoing recurring work instead of a one-time starter task
|
||||
- tasks are intentionally basic seed work: title, markdown body, assignee, project linkage, and optional `recurring: true`
|
||||
- tools may also support optional fields like `priority`, `labels`, or `metadata`, but they should not require them in the base package
|
||||
|
||||
### Recurring Tasks
|
||||
|
||||
- the base package only needs to say whether a task is recurring
|
||||
- vendors may attach the actual schedule / trigger / runtime fidelity in a vendor extension such as `.paperclip.yaml`
|
||||
- this keeps `TASK.md` portable while still allowing richer runtime systems to round-trip their own automation details
|
||||
- legacy packages may still use `schedule.recurrence` during transition, but exporters should prefer `recurring: true`
|
||||
|
||||
Example Paperclip extension:
|
||||
|
||||
```yaml
|
||||
routines:
|
||||
monday-review:
|
||||
triggers:
|
||||
- kind: schedule
|
||||
cronExpression: "0 9 * * 1"
|
||||
timezone: America/Chicago
|
||||
```
|
||||
|
||||
- vendors should ignore unknown recurring-task extensions they do not understand
|
||||
- vendors importing legacy `schedule.recurrence` data may translate it into their own runtime trigger model, but new exports should prefer the simpler `recurring: true` base field
|
||||
|
||||
## 11. SKILL.md Compatibility
|
||||
|
||||
A skill package must remain a valid Agent Skills package.
|
||||
|
||||
Rules:
|
||||
|
||||
- `SKILL.md` should follow the Agent Skills spec
|
||||
- Paperclip must not require extra top-level fields for skill validity
|
||||
- Paperclip-specific extensions must live under `metadata.paperclip` or `metadata.sources`
|
||||
- a skill directory may include `scripts/`, `references/`, and `assets/` exactly as the Agent Skills ecosystem expects
|
||||
- tools implementing this spec should treat `skills.sh` compatibility as a first-class goal rather than inventing a parallel skill format
|
||||
|
||||
In other words, this spec extends Agent Skills upward into company/team/agent composition. It does not redefine skill package semantics.
|
||||
|
||||
### Example compatible extension
|
||||
|
||||
```yaml
|
||||
---
|
||||
name: review
|
||||
description: Paranoid code review skill
|
||||
allowed-tools:
|
||||
- Read
|
||||
- Grep
|
||||
metadata:
|
||||
paperclip:
|
||||
tags:
|
||||
- engineering
|
||||
- review
|
||||
sources:
|
||||
- kind: github-file
|
||||
repo: vercel-labs/skills
|
||||
path: review/SKILL.md
|
||||
commit: 0123456789abcdef0123456789abcdef01234567
|
||||
sha256: 3b7e...9a
|
||||
attribution: Vercel Labs
|
||||
usage: referenced
|
||||
---
|
||||
```
|
||||
|
||||
## 12. Source References
|
||||
|
||||
A package may point to upstream content instead of vendoring it.
|
||||
|
||||
### Source object
|
||||
|
||||
```yaml
|
||||
sources:
|
||||
- kind: github-file
|
||||
repo: owner/repo
|
||||
path: path/to/file.md
|
||||
commit: 0123456789abcdef0123456789abcdef01234567
|
||||
blob: abcdef0123456789abcdef0123456789abcdef01
|
||||
sha256: 3b7e...9a
|
||||
url: https://github.com/owner/repo/blob/0123456789abcdef0123456789abcdef01234567/path/to/file.md
|
||||
rawUrl: https://raw.githubusercontent.com/owner/repo/0123456789abcdef0123456789abcdef01234567/path/to/file.md
|
||||
attribution: Owner Name
|
||||
license: MIT
|
||||
usage: referenced
|
||||
```
|
||||
|
||||
### Supported kinds
|
||||
|
||||
- `local-file`
|
||||
- `local-dir`
|
||||
- `github-file`
|
||||
- `github-dir`
|
||||
- `url`
|
||||
|
||||
### Usage modes
|
||||
|
||||
- `vendored`: bytes are included in the package
|
||||
- `referenced`: package points to upstream immutable content
|
||||
- `mirrored`: bytes are cached locally but upstream attribution remains canonical
|
||||
|
||||
### Rules
|
||||
|
||||
- `commit` is required for `github-file` and `github-dir` in strict mode
|
||||
- `sha256` is strongly recommended and should be verified on fetch
|
||||
- branch-only refs may be allowed in development mode but must warn
|
||||
- exporters should default to `referenced` for third-party content unless redistribution is clearly allowed
|
||||
|
||||
## 13. Resolution Rules
|
||||
|
||||
Given a package root, an importer resolves in this order:
|
||||
|
||||
1. local relative paths
|
||||
2. local absolute paths if explicitly allowed by the importing tool
|
||||
3. pinned GitHub refs
|
||||
4. generic URLs
|
||||
|
||||
For pinned GitHub refs:
|
||||
|
||||
1. resolve `repo + commit + path`
|
||||
2. fetch content
|
||||
3. verify `sha256` if present
|
||||
4. verify `blob` if present
|
||||
5. fail closed on mismatch
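A minimal shell sketch of the pinned-ref path, using the `rawUrl` pattern shown above and placeholder values; the point is that verification fails closed on any mismatch:

```bash
REPO="owner/repo"
COMMIT="0123456789abcdef0123456789abcdef01234567"
FILE_PATH="path/to/file.md"
EXPECTED_SHA256="<sha256 from the source entry>"

# 1-2. resolve repo + commit + path and fetch the pinned content
curl -fsSL "https://raw.githubusercontent.com/${REPO}/${COMMIT}/${FILE_PATH}" -o fetched.md

# 3. verify sha256 if present (use `shasum -a 256` on macOS)
ACTUAL_SHA256=$(sha256sum fetched.md | awk '{print $1}')
if [ "$ACTUAL_SHA256" != "$EXPECTED_SHA256" ]; then
  echo "sha256 mismatch for ${FILE_PATH}: failing closed" >&2
  exit 1
fi
# 4. a blob check would additionally hash the bytes the way git does (omitted here)
```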
|
||||
|
||||
An importer must surface:
|
||||
|
||||
- missing files
|
||||
- hash mismatches
|
||||
- missing licenses
|
||||
- referenced upstream content that requires network fetch
|
||||
- executable content in skills or scripts
|
||||
|
||||
## 14. Import Graph
|
||||
|
||||
A package importer should build a graph from:
|
||||
|
||||
- `COMPANY.md`
|
||||
- `TEAM.md`
|
||||
- `AGENTS.md`
|
||||
- `PROJECT.md`
|
||||
- `TASK.md`
|
||||
- `SKILL.md`
|
||||
- local and external refs
|
||||
|
||||
Suggested import UI behavior:
|
||||
|
||||
- render graph as a tree
|
||||
- checkbox at entity level, not raw file level
|
||||
- selecting an agent auto-selects required docs and referenced skills
|
||||
- selecting a team auto-selects its subtree
|
||||
- selecting a project auto-selects its included tasks
|
||||
- selecting a recurring task should make it clear that the import target is a routine / automation, not a one-time task
|
||||
- selecting referenced third-party content shows attribution, license, and fetch policy
|
||||
|
||||
## 15. Vendor Extensions
|
||||
|
||||
Vendor-specific data should live outside the base package shape.
|
||||
|
||||
For Paperclip, the preferred fidelity extension is:
|
||||
|
||||
```text
|
||||
.paperclip.yaml
|
||||
```
|
||||
|
||||
Example uses:
|
||||
|
||||
- adapter type and adapter config
|
||||
- adapter env inputs and defaults
|
||||
- runtime settings
|
||||
- permissions
|
||||
- budgets
|
||||
- approval policies
|
||||
- project execution workspace policies
|
||||
- issue/task Paperclip-only metadata
|
||||
|
||||
Rules:
|
||||
|
||||
- the base package must remain readable without the extension
|
||||
- tools that do not understand a vendor extension should ignore it
|
||||
- Paperclip tools may emit the vendor extension by default as a sidecar while keeping the base markdown clean
|
||||
|
||||
Suggested Paperclip shape:
|
||||
|
||||
```yaml
|
||||
schema: paperclip/v1
|
||||
agents:
|
||||
claudecoder:
|
||||
adapter:
|
||||
type: claude_local
|
||||
config:
|
||||
model: claude-opus-4-6
|
||||
inputs:
|
||||
env:
|
||||
ANTHROPIC_API_KEY:
|
||||
kind: secret
|
||||
requirement: optional
|
||||
default: ""
|
||||
GH_TOKEN:
|
||||
kind: secret
|
||||
requirement: optional
|
||||
CLAUDE_BIN:
|
||||
kind: plain
|
||||
requirement: optional
|
||||
default: claude
|
||||
routines:
|
||||
monday-review:
|
||||
triggers:
|
||||
- kind: schedule
|
||||
cronExpression: "0 9 * * 1"
|
||||
timezone: America/Chicago
|
||||
```
|
||||
|
||||
Additional rules for Paperclip exporters:
|
||||
|
||||
- do not duplicate `promptTemplate` when `AGENTS.md` already contains the agent instructions
|
||||
- do not export provider-specific secret bindings such as `secretId`, `version`, or `type: secret_ref`
|
||||
- export env inputs as portable declarations with `required` or `optional` semantics and optional defaults
|
||||
- warn on system-dependent values such as absolute commands and absolute `PATH` overrides
|
||||
- omit empty and default-valued Paperclip fields when possible
|
||||
|
||||
## 16. Export Rules
|
||||
|
||||
A compliant exporter should:
|
||||
|
||||
- emit markdown roots and relative folder layout
|
||||
- omit machine-local ids and timestamps
|
||||
- omit secret values
|
||||
- omit machine-specific paths
|
||||
- preserve task descriptions and recurring-task declarations when exporting tasks
|
||||
- omit empty/default fields
|
||||
- default to the vendor-neutral base package
|
||||
- Paperclip exporters should emit `.paperclip.yaml` as a sidecar by default
|
||||
- preserve attribution and source references
|
||||
- prefer `referenced` over silent vendoring for third-party content
|
||||
- preserve `SKILL.md` as-is when exporting compatible skills
|
||||
|
||||
## 17. Licensing And Attribution
|
||||
|
||||
A compliant tool must:
|
||||
|
||||
- preserve `license` and `attribution` metadata when importing and exporting
|
||||
- distinguish vendored vs referenced content
|
||||
- not silently inline referenced third-party content during export
|
||||
- surface missing license metadata as a warning
|
||||
- surface restrictive or unknown licenses before install/import if content is vendored or mirrored
|
||||
|
||||
## 18. Optional Lock File
|
||||
|
||||
Authoring does not require a lock file.
|
||||
|
||||
Tools may generate an optional lock file such as:
|
||||
|
||||
```text
|
||||
company-package.lock.json
|
||||
```
|
||||
|
||||
Purpose:
|
||||
|
||||
- cache resolved refs
|
||||
- record final hashes
|
||||
- support reproducible installs
|
||||
|
||||
Rules:
|
||||
|
||||
- lock files are optional
|
||||
- lock files are generated artifacts, not canonical authoring input
|
||||
- the markdown package remains the source of truth
|
||||
|
||||
## 19. Paperclip Mapping
|
||||
|
||||
Paperclip can map this spec to its runtime model like this:
|
||||
|
||||
- base package:
|
||||
- `COMPANY.md` -> company metadata
|
||||
- `TEAM.md` -> importable org subtree
|
||||
- `AGENTS.md` -> agent identity and instructions
|
||||
- `PROJECT.md` -> starter project definition
|
||||
- `TASK.md` -> starter issue/task definition, or recurring task template when `recurring: true`
|
||||
- `SKILL.md` -> imported skill package
|
||||
- `sources[]` -> provenance and pinned upstream refs
|
||||
- Paperclip extension:
|
||||
- `.paperclip.yaml` -> adapter config, runtime config, env input declarations, permissions, budgets, routine triggers, and other Paperclip-specific fidelity
|
||||
|
||||
Inline Paperclip-only metadata that must live inside a shared markdown file should use:
|
||||
|
||||
- `metadata.paperclip`
|
||||
|
||||
That keeps the base format broader than Paperclip.
|
||||
|
||||
This specification itself remains vendor-neutral and intended for any agent-company runtime, not only Paperclip.
|
||||
|
||||
## 20. Cutover
|
||||
|
||||
Paperclip should cut over to this markdown-first package model as the primary portability format.
|
||||
|
||||
`paperclip.manifest.json` does not need to be preserved as a compatibility requirement for the future package system.
|
||||
|
||||
For Paperclip, this should be treated as a hard cutover in product direction rather than a long-lived dual-format strategy.
|
||||
|
||||
## 21. Minimal Example
|
||||
|
||||
```text
|
||||
lean-dev-shop/
|
||||
├── COMPANY.md
|
||||
├── agents/
|
||||
│ ├── ceo/AGENTS.md
|
||||
│ └── cto/AGENTS.md
|
||||
├── projects/
|
||||
│ └── q2-launch/
|
||||
│ ├── PROJECT.md
|
||||
│ └── tasks/
|
||||
│ └── monday-review/
|
||||
│ └── TASK.md
|
||||
├── teams/
|
||||
│ └── engineering/TEAM.md
|
||||
├── tasks/
|
||||
│ └── weekly-review/TASK.md
|
||||
└── skills/
|
||||
    └── review/SKILL.md
```
|
||||
|
||||
Optional:
|
||||
|
||||
```text
|
||||
.paperclip.yaml
|
||||
```
|
||||
|
||||
|
||||
**Recommendation**
|
||||
This is the direction I would take:
|
||||
|
||||
- make this the human-facing spec
|
||||
- define `SKILL.md` compatibility as non-negotiable
|
||||
- treat this spec as an extension of Agent Skills, not a parallel format
|
||||
- make `companies.sh` a discovery layer for repos implementing this spec, not a publishing authority
|
||||
@@ -8,7 +8,7 @@ Run Paperclip in Docker without installing Node or pnpm locally.
|
||||
## Compose Quickstart (Recommended)
|
||||
|
||||
```sh
|
||||
docker compose -f docker-compose.quickstart.yml up --build
|
||||
docker compose -f docker/docker-compose.quickstart.yml up --build
|
||||
```
|
||||
|
||||
Open [http://localhost:3100](http://localhost:3100).
|
||||
@@ -21,10 +21,12 @@ Defaults:
|
||||
Override with environment variables:
|
||||
|
||||
```sh
|
||||
PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=./data/pc \
|
||||
docker compose -f docker-compose.quickstart.yml up --build
|
||||
PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=../data/pc \
|
||||
docker compose -f docker/docker-compose.quickstart.yml up --build
|
||||
```
|
||||
|
||||
**Note:** `PAPERCLIP_DATA_DIR` is resolved relative to the compose file (`docker/`), so `../data/pc` maps to `data/pc` in the project root.
|
||||
|
||||
## Manual Docker Build
|
||||
|
||||
```sh
|
||||
|
||||
@@ -46,9 +46,12 @@
|
||||
"guides/board-operator/managing-agents",
|
||||
"guides/board-operator/org-structure",
|
||||
"guides/board-operator/managing-tasks",
|
||||
"guides/board-operator/execution-workspaces-and-runtime-services",
|
||||
"guides/board-operator/delegation",
|
||||
"guides/board-operator/approvals",
|
||||
"guides/board-operator/costs-and-budgets",
|
||||
"guides/board-operator/activity-log"
|
||||
"guides/board-operator/activity-log",
|
||||
"guides/board-operator/importing-and-exporting"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
189
docs/feedback-voting.md
Normal file
@@ -0,0 +1,189 @@
|
||||
# Feedback Voting — Local Data Guide
|
||||
|
||||
When you rate an agent's response with **Helpful** (thumbs up) or **Needs work** (thumbs down), Paperclip saves your vote locally alongside your running instance. This guide covers what gets stored, how to access it, and how to export it.
|
||||
|
||||
## How voting works
|
||||
|
||||
1. Click **Helpful** or **Needs work** on any agent comment or document revision.
|
||||
2. If you click **Needs work**, an optional text prompt appears: _"What could have been better?"_ You can type a reason or dismiss it.
|
||||
3. A consent dialog asks whether to keep the vote local or share it. Your choice is remembered for future votes.
|
||||
|
||||
### What gets stored
|
||||
|
||||
Each vote creates two local records:
|
||||
|
||||
| Record | What it contains |
|
||||
|--------|-----------------|
|
||||
| **Vote** | Your vote (up/down), optional reason text, sharing preference, consent version, timestamp |
|
||||
| **Trace bundle** | Full context snapshot: the voted-on comment/revision text, issue title, agent info, your vote, and reason — everything needed to understand the feedback in isolation |
|
||||
|
||||
All data lives in your local Paperclip database. Nothing leaves your machine unless you explicitly choose to share.
|
||||
|
||||
When a vote is marked for sharing, Paperclip also queues the trace bundle for background export through the Telemetry Backend. The app server never uploads raw feedback trace bundles directly to object storage.
|
||||
|
||||
## Viewing your votes
|
||||
|
||||
### Quick report (terminal)
|
||||
|
||||
```bash
|
||||
pnpm paperclipai feedback report
|
||||
```
|
||||
|
||||
Shows a color-coded summary: vote counts, per-trace details with reasons, and export statuses.
|
||||
|
||||
```bash
|
||||
# Installed CLI
|
||||
paperclipai feedback report
|
||||
|
||||
# Point to a different server or company
|
||||
pnpm paperclipai feedback report --api-base http://127.0.0.1:3000 --company-id <company-id>
|
||||
|
||||
# Include raw payload dumps in the report
|
||||
pnpm paperclipai feedback report --payloads
|
||||
```
|
||||
|
||||
### API endpoints
|
||||
|
||||
All endpoints require board-user access (automatic in local dev).
|
||||
|
||||
**List votes for an issue:**
|
||||
```bash
|
||||
curl http://127.0.0.1:3102/api/issues/<issueId>/feedback-votes
|
||||
```
|
||||
|
||||
**List trace bundles for an issue (with full payloads):**
|
||||
```bash
|
||||
curl 'http://127.0.0.1:3102/api/issues/<issueId>/feedback-traces?includePayload=true'
|
||||
```
|
||||
|
||||
**List all traces company-wide:**
|
||||
```bash
|
||||
curl 'http://127.0.0.1:3102/api/companies/<companyId>/feedback-traces?includePayload=true'
|
||||
```
|
||||
|
||||
**Get a single trace envelope record:**
|
||||
```bash
|
||||
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>
|
||||
```
|
||||
|
||||
**Get the full export bundle for a trace:**
|
||||
```bash
|
||||
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>/bundle
|
||||
```
|
||||
|
||||
#### Filtering
|
||||
|
||||
The trace endpoints accept query parameters:
|
||||
|
||||
| Parameter | Values | Description |
|
||||
|-----------|--------|-------------|
|
||||
| `vote` | `up`, `down` | Filter by vote direction |
|
||||
| `status` | `local_only`, `pending`, `sent`, `failed` | Filter by export status |
|
||||
| `targetType` | `issue_comment`, `issue_document_revision` | Filter by what was voted on |
|
||||
| `sharedOnly` | `true` | Only show votes the user chose to share |
|
||||
| `includePayload` | `true` | Include the full context snapshot |
|
||||
| `from` / `to` | ISO date | Date range filter |
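For example, to pull only shared thumbs-down traces that are still waiting to be sent, with full payloads (company ID is a placeholder):

```bash
curl 'http://127.0.0.1:3102/api/companies/<companyId>/feedback-traces?vote=down&status=pending&sharedOnly=true&includePayload=true'
```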
|
||||
|
||||
## Exporting your data
|
||||
|
||||
### Export to files + zip
|
||||
|
||||
```bash
|
||||
pnpm paperclipai feedback export
|
||||
```
|
||||
|
||||
Creates a timestamped directory with:
|
||||
|
||||
```
|
||||
feedback-export-20260331T120000Z/
|
||||
index.json # manifest with summary stats
|
||||
votes/
|
||||
PAP-123-a1b2c3d4.json # vote metadata (one per vote)
|
||||
traces/
|
||||
PAP-123-e5f6g7h8.json # Paperclip feedback envelope (one per trace)
|
||||
full-traces/
|
||||
PAP-123-e5f6g7h8/
|
||||
bundle.json # full export manifest for the trace
|
||||
...raw adapter files # codex / claude / opencode session artifacts when available
|
||||
feedback-export-20260331T120000Z.zip
|
||||
```
|
||||
|
||||
Exports are full by default. `traces/` keeps the Paperclip envelope, while `full-traces/` contains the richer per-trace bundle plus any recoverable adapter-native files.
|
||||
|
||||
```bash
|
||||
# Custom server and output directory
|
||||
pnpm paperclipai feedback export --api-base http://127.0.0.1:3000 --company-id <company-id> --out ./my-export
|
||||
```
|
||||
|
||||
### Reading an exported trace
|
||||
|
||||
Open any file in `traces/` to see:
|
||||
|
||||
```json
|
||||
{
|
||||
"id": "trace-uuid",
|
||||
"vote": "down",
|
||||
"issueIdentifier": "PAP-123",
|
||||
"issueTitle": "Fix login timeout",
|
||||
"targetType": "issue_comment",
|
||||
"targetSummary": {
|
||||
"label": "Comment",
|
||||
"excerpt": "The first 80 chars of the comment that was voted on..."
|
||||
},
|
||||
"payloadSnapshot": {
|
||||
"vote": {
|
||||
"value": "down",
|
||||
"reason": "Did not address the root cause"
|
||||
},
|
||||
"target": {
|
||||
"body": "Full text of the agent comment..."
|
||||
},
|
||||
"issue": {
|
||||
"identifier": "PAP-123",
|
||||
"title": "Fix login timeout"
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Open `full-traces/<issue>-<trace>/bundle.json` to see the expanded export metadata, including capture notes, adapter type, integrity metadata, and the inventory of raw files written alongside it.
|
||||
|
||||
Built-in local adapters now export their native session artifacts more directly:
|
||||
|
||||
- `codex_local`: `adapter/codex/session.jsonl`
|
||||
- `claude_local`: `adapter/claude/session.jsonl`, plus any `adapter/claude/session/...` sidecar files and `adapter/claude/debug.txt` when present
|
||||
- `opencode_local`: `adapter/opencode/session.json`, `adapter/opencode/messages/*.json`, and `adapter/opencode/parts/<messageId>/*.json`, with optional `project.json`, `todo.json`, and `session-diff.json`
|
||||
|
||||
## Sharing preferences
|
||||
|
||||
The first time you vote, a consent dialog asks:
|
||||
|
||||
- **Keep local** — vote is stored locally only (`sharedWithLabs: false`)
|
||||
- **Share this vote** — vote is marked for sharing (`sharedWithLabs: true`)
|
||||
|
||||
Your preference is saved per-company. You can change it any time via the feedback settings. Votes marked "keep local" are never queued for export.
|
||||
|
||||
## Data lifecycle
|
||||
|
||||
| Status | Meaning |
|
||||
|--------|---------|
|
||||
| `local_only` | Vote stored locally, not marked for sharing |
|
||||
| `pending` | Marked for sharing, waiting to be sent |
|
||||
| `sent` | Successfully transmitted |
|
||||
| `failed` | Transmission attempted but failed (will retry) |
|
||||
|
||||
Your local database always retains the full vote and trace data regardless of sharing status.
|
||||
|
||||
## Remote sync
|
||||
|
||||
Votes you choose to share are queued as `pending` traces and flushed by the server's background worker to the Telemetry Backend. The Telemetry Backend validates the request, then persists the bundle into its configured object storage.
|
||||
|
||||
- App server responsibility: build the bundle, POST it to Telemetry Backend, update trace status
|
||||
- Telemetry Backend responsibility: authenticate the request, validate payload shape, compress/store the bundle, return the final object key
|
||||
- Retry behavior: failed uploads move to `failed` with an error message in `failureReason`, and the worker retries them on later ticks
|
||||
|
||||
Exported objects use a deterministic key pattern so they are easy to inspect:
|
||||
|
||||
```text
|
||||
feedback-traces/<companyId>/YYYY/MM/DD/<exportId-or-traceId>.json
|
||||
```
|
||||
122
docs/guides/board-operator/delegation.md
Normal file
@@ -0,0 +1,122 @@
|
||||
---
|
||||
title: How Delegation Works
|
||||
summary: How the CEO breaks down goals into tasks and assigns them to agents
|
||||
---
|
||||
|
||||
Delegation is one of Paperclip's most powerful features. You set company goals, and the CEO agent automatically breaks them into tasks and assigns them to the right agents. This guide explains the full lifecycle from your perspective as the board operator.
|
||||
|
||||
## The Delegation Lifecycle
|
||||
|
||||
When you create a company goal, the CEO doesn't just acknowledge it — it builds a plan and mobilizes the team:
|
||||
|
||||
```
|
||||
You set a company goal
|
||||
→ CEO wakes up on heartbeat
|
||||
→ CEO proposes a strategy (creates an approval for you)
|
||||
→ You approve the strategy
|
||||
→ CEO breaks goals into tasks and assigns them to reports
|
||||
→ Reports wake up (heartbeat triggered by assignment)
|
||||
→ Reports execute work and update task status
|
||||
→ CEO monitors progress, unblocks, and escalates
|
||||
→ You see results in the dashboard and activity log
|
||||
```
|
||||
|
||||
Each step is traceable. Every task links back to the goal through a parent hierarchy, so you can always see why work is happening.
|
||||
|
||||
## What You Need to Do
|
||||
|
||||
Your role is strategic oversight, not task management. Here's what the delegation model expects from you:
|
||||
|
||||
1. **Set clear company goals.** The CEO works from these. Specific, measurable goals produce better delegation. "Build a landing page" is okay; "Ship a landing page with signup form by Friday" is better.
|
||||
|
||||
2. **Approve the CEO's strategy.** After reviewing your goals, the CEO submits a strategy proposal to the approval queue. Review it, then approve, reject, or request revisions.
|
||||
|
||||
3. **Approve hire requests.** When the CEO needs more capacity (e.g., a frontend engineer to build the landing page), it submits a hire request. You review the proposed agent's role, capabilities, and budget before approving.
|
||||
|
||||
4. **Monitor progress.** Use the dashboard and activity log to track how work is flowing. Check task status, agent activity, and completion rates.
|
||||
|
||||
5. **Intervene only when things stall.** If progress stops, check these in order:
|
||||
- Is an approval pending in your queue?
|
||||
- Is an agent paused or in an error state?
|
||||
- Is the CEO's budget exhausted (above 80%, it focuses on critical tasks only)?
|
||||
|
||||
## What the CEO Does Automatically
|
||||
|
||||
You do **not** need to tell the CEO to engage specific agents. After you approve its strategy, the CEO:
|
||||
|
||||
- **Breaks goals into concrete tasks** with clear descriptions, priorities, and acceptance criteria
|
||||
- **Assigns tasks to the right agent** based on role and capabilities (e.g., engineering tasks go to the CTO or engineers, marketing tasks go to the CMO)
|
||||
- **Creates subtasks** when work needs to be decomposed further
|
||||
- **Hires new agents** when the team lacks capacity for a goal (subject to your approval)
|
||||
- **Monitors progress** on each heartbeat, checking task status and unblocking reports
|
||||
- **Escalates to you** when it encounters something it can't resolve — budget issues, blocked approvals, or strategic ambiguity
|
||||
|
||||
## Common Delegation Patterns
|
||||
|
||||
### Flat Hierarchy (Small Teams)
|
||||
|
||||
For small companies with 3-5 agents, the CEO delegates directly to each report:
|
||||
|
||||
```
|
||||
CEO
|
||||
├── CTO (engineering tasks)
|
||||
├── CMO (marketing tasks)
|
||||
└── Designer (design tasks)
|
||||
```
|
||||
|
||||
The CEO assigns tasks directly. Each agent works independently and reports status back.
|
||||
|
||||
### Three-Level Hierarchy (Larger Teams)
|
||||
|
||||
For larger organizations, managers delegate further down the chain:
|
||||
|
||||
```
|
||||
CEO
|
||||
├── CTO
|
||||
│ ├── Backend Engineer
|
||||
│ └── Frontend Engineer
|
||||
└── CMO
|
||||
└── Content Writer
|
||||
```
|
||||
|
||||
The CEO assigns high-level tasks to the CTO and CMO. They break those into subtasks and assign them to their own reports. You only interact with the CEO — the rest happens automatically.
|
||||
|
||||
### Hire-on-Demand
|
||||
|
||||
The CEO can start as the only agent and hire as work requires:
|
||||
|
||||
1. You set a goal that needs engineering work
|
||||
2. The CEO proposes a strategy that includes hiring a CTO
|
||||
3. You approve the hire
|
||||
4. The CEO assigns engineering tasks to the new CTO
|
||||
5. As scope grows, the CTO may request to hire engineers
|
||||
|
||||
This pattern lets you start small and scale the team based on actual work, not upfront planning.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### "Why isn't the CEO delegating?"
|
||||
|
||||
If you've set a goal but nothing is happening, check these common causes:
|
||||
|
||||
| Check | What to look for |
|
||||
|-------|-----------------|
|
||||
| **Approval queue** | The CEO may have submitted a strategy or hire request that's waiting for your approval. This is the most common reason. |
|
||||
| **Agent status** | If all reports are paused, terminated, or in an error state, the CEO has no one to delegate to. Check the Agents page. |
|
||||
| **Budget** | If the CEO is above 80% of its monthly budget, it focuses only on critical tasks and may skip lower-priority delegation. |
|
||||
| **Goals** | If no company goals are set, the CEO has nothing to work from. Create a goal first. |
|
||||
| **Heartbeat** | Is the CEO's heartbeat enabled and running? Check the agent detail page for recent heartbeat history. |
|
||||
| **Agent instructions** | The CEO's delegation behavior is driven by its `AGENTS.md` instructions file. Open the CEO agent's detail page and verify that its instructions path is set and that the file includes delegation directives (subtask creation, hiring, assignment). If AGENTS.md is missing or doesn't mention delegation, the CEO won't know to break down goals and assign work. |
|
||||
|
||||
### "Do I have to tell the CEO to engage engineering and marketing?"
|
||||
|
||||
**No.** The CEO will delegate automatically after you approve its strategy. It knows the org chart and assigns tasks based on each agent's role and capabilities. You set the goal and approve the plan — the CEO handles task breakdown and assignment.
|
||||
|
||||
### "A task seems stuck"
|
||||
|
||||
If a specific task isn't progressing:
|
||||
|
||||
1. Check the task's comment thread — the assigned agent may have posted a blocker
|
||||
2. Check if the task is in `blocked` status — read the blocker comment to understand why
|
||||
3. Check the assigned agent's status — it may be paused or over budget
|
||||
4. If the agent is stuck, you can reassign the task or add a comment with guidance
|
||||
@@ -0,0 +1,68 @@
|
||||
---
|
||||
title: Execution Workspaces And Runtime Services
|
||||
summary: How project runtime configuration, execution workspaces, and issue runs fit together
|
||||
---
|
||||
|
||||
This guide documents the intended runtime model for projects, execution workspaces, and issue runs in Paperclip.
|
||||
|
||||
## Project runtime configuration
|
||||
|
||||
You can define how to run a project on the project workspace itself.
|
||||
|
||||
- Project workspace runtime config describes how to run services for that project checkout.
|
||||
- This is the default runtime configuration that child execution workspaces may inherit.
|
||||
- Defining the config does not start anything by itself.
|
||||
|
||||
## Manual runtime control
|
||||
|
||||
Runtime services are manually controlled from the UI.
|
||||
|
||||
- Project workspace runtime services are started and stopped from the project workspace UI.
|
||||
- Execution workspace runtime services are started and stopped from the execution workspace UI.
|
||||
- Paperclip does not automatically start or stop these runtime services as part of issue execution.
|
||||
- Paperclip also does not automatically restart workspace runtime services on server boot.
|
||||
|
||||
## Execution workspace inheritance
|
||||
|
||||
Execution workspaces isolate code and runtime state from the project primary workspace.
|
||||
|
||||
- An isolated execution workspace has its own checkout path, branch, and local runtime instance.
|
||||
- The runtime configuration may inherit from the linked project workspace by default.
|
||||
- The execution workspace may override that runtime configuration with its own workspace-specific settings.
|
||||
- The inherited configuration answers "how to run the service", but the running process is still specific to that execution workspace.
|
||||
|
||||
## Issues and execution workspaces
|
||||
|
||||
Issues are attached to execution workspace behavior, not to automatic runtime management.
|
||||
|
||||
- An issue may create a new execution workspace when you choose an isolated workspace mode.
|
||||
- An issue may reuse an existing execution workspace when you choose reuse.
|
||||
- Multiple issues may intentionally share one execution workspace so they can work against the same branch and running runtime services.
|
||||
- Assigning or running an issue does not automatically start or stop runtime services for that workspace.
|
||||
|
||||
## Execution workspace lifecycle
|
||||
|
||||
Execution workspaces are durable until a human closes them.
|
||||
|
||||
- The UI can archive an execution workspace.
|
||||
- Closing an execution workspace stops its runtime services and cleans up its workspace artifacts when allowed.
|
||||
- Shared workspaces that point at the project primary checkout are treated more conservatively during cleanup than disposable isolated workspaces.
|
||||
|
||||
## Resolved workspace logic during heartbeat runs
|
||||
|
||||
Heartbeat still resolves a workspace for the run, but that is about code location and session continuity, not runtime-service control.
|
||||
|
||||
1. Heartbeat resolves a base workspace for the run.
|
||||
2. Paperclip realizes the effective execution workspace, including creating or reusing a worktree when needed.
|
||||
3. Paperclip persists execution-workspace metadata such as paths, refs, and provisioning settings.
|
||||
4. Heartbeat passes the resolved code workspace to the agent run.
|
||||
5. Workspace runtime services remain manual UI-managed controls rather than automatic heartbeat-managed services.
|
||||
|
||||
## Current implementation guarantees
|
||||
|
||||
With the current implementation:
|
||||
|
||||
- Project workspace runtime config is the fallback for execution workspace UI controls.
|
||||
- Execution workspace runtime overrides are stored on the execution workspace.
|
||||
- Heartbeat runs do not auto-start workspace runtime services.
|
||||
- Server startup does not auto-restart workspace runtime services.
|
||||
203
docs/guides/board-operator/importing-and-exporting.md
Normal file
@@ -0,0 +1,203 @@
|
||||
---
|
||||
title: Importing & Exporting Companies
|
||||
summary: Export companies to portable packages and import them from local paths or GitHub
|
||||
---
|
||||
|
||||
Paperclip companies can be exported to portable markdown packages and imported from local directories or GitHub repositories. This lets you share company configurations, duplicate setups, and version-control your agent teams.
|
||||
|
||||
## Package Format
|
||||
|
||||
Exported packages follow the [Agent Companies specification](/companies/companies-spec) and use a markdown-first structure:
|
||||
|
||||
```text
|
||||
my-company/
|
||||
├── COMPANY.md # Company metadata
|
||||
├── agents/
|
||||
│ ├── ceo/AGENT.md # Agent instructions + frontmatter
|
||||
│ └── cto/AGENT.md
|
||||
├── projects/
|
||||
│ └── main/PROJECT.md
|
||||
├── skills/
|
||||
│ └── review/SKILL.md
|
||||
├── tasks/
|
||||
│ └── onboarding/TASK.md
|
||||
└── .paperclip.yaml # Adapter config, env inputs, routines
|
||||
```
|
||||
|
||||
- **COMPANY.md** defines company name, description, and metadata.
|
||||
- **AGENT.md** files contain agent identity, role, and instructions.
|
||||
- **SKILL.md** files are compatible with the Agent Skills ecosystem.
|
||||
- **.paperclip.yaml** holds Paperclip-specific config (adapter types, env inputs, budgets) as an optional sidecar.
|
||||
|
||||
## Exporting a Company
|
||||
|
||||
Export a company into a portable folder:
|
||||
|
||||
```sh
|
||||
paperclipai company export <company-id> --out ./my-export
|
||||
```
|
||||
|
||||
### Options
|
||||
|
||||
| Option | Description | Default |
|
||||
|--------|-------------|---------|
|
||||
| `--out <path>` | Output directory (required) | — |
|
||||
| `--include <values>` | Comma-separated set: `company`, `agents`, `projects`, `issues`, `tasks`, `skills` | `company,agents` |
|
||||
| `--skills <values>` | Export only specific skill slugs | all |
|
||||
| `--projects <values>` | Export only specific project shortnames or IDs | all |
|
||||
| `--issues <values>` | Export specific issue identifiers or IDs | none |
|
||||
| `--project-issues <values>` | Export issues belonging to specific projects | none |
|
||||
| `--expand-referenced-skills` | Vendor skill file contents instead of keeping upstream references | `false` |
|
||||
|
||||
### Examples
|
||||
|
||||
```sh
|
||||
# Export company with agents and projects
|
||||
paperclipai company export abc123 --out ./backup --include company,agents,projects
|
||||
|
||||
# Export everything including tasks and skills
|
||||
paperclipai company export abc123 --out ./full-export --include company,agents,projects,tasks,skills
|
||||
|
||||
# Export only specific skills
|
||||
paperclipai company export abc123 --out ./skills-only --include skills --skills review,deploy
|
||||
```
|
||||
|
||||
### What Gets Exported
|
||||
|
||||
- Company name, description, and metadata
|
||||
- Agent names, roles, reporting structure, and instructions
|
||||
- Project definitions and workspace config
|
||||
- Task/issue descriptions (when included)
|
||||
- Skill packages (as references or vendored content)
|
||||
- Adapter type and env input declarations in `.paperclip.yaml`
|
||||
|
||||
Secret values, machine-local paths, and database IDs are **never** exported.
|
||||
|
||||
## Importing a Company
|
||||
|
||||
Import from a local directory, GitHub URL, or GitHub shorthand:
|
||||
|
||||
```sh
|
||||
# From a local folder
|
||||
paperclipai company import ./my-export
|
||||
|
||||
# From a GitHub URL
|
||||
paperclipai company import https://github.com/org/repo
|
||||
|
||||
# From a GitHub subfolder
|
||||
paperclipai company import https://github.com/org/repo/tree/main/companies/acme
|
||||
|
||||
# From GitHub shorthand
|
||||
paperclipai company import org/repo
|
||||
paperclipai company import org/repo/companies/acme
|
||||
```
|
||||
|
||||
### Options
|
||||
|
||||
| Option | Description | Default |
|
||||
|--------|-------------|---------|
|
||||
| `--target <mode>` | `new` (create a new company) or `existing` (merge into existing) | inferred from context |
|
||||
| `--company-id <id>` | Target company ID for `--target existing` | current context |
|
||||
| `--new-company-name <name>` | Override company name for `--target new` | from package |
|
||||
| `--include <values>` | Comma-separated set: `company`, `agents`, `projects`, `issues`, `tasks`, `skills` | auto-detected |
|
||||
| `--agents <list>` | Comma-separated agent slugs to import, or `all` | `all` |
|
||||
| `--collision <mode>` | How to handle name conflicts: `rename`, `skip`, or `replace` | `rename` |
|
||||
| `--ref <value>` | Git ref for GitHub imports (branch, tag, or commit) | default branch |
|
||||
| `--dry-run` | Preview what would be imported without applying | `false` |
|
||||
| `--yes` | Skip the interactive confirmation prompt | `false` |
|
||||
| `--json` | Output result as JSON | `false` |
|
||||
|
||||
### Target Modes
|
||||
|
||||
- **`new`** — Creates a fresh company from the package. Good for duplicating a company template.
|
||||
- **`existing`** — Merges the package into an existing company. Use `--company-id` to specify the target.
|
||||
|
||||
If `--target` is not specified, Paperclip infers it: if a `--company-id` is provided (or one exists in context), it defaults to `existing`; otherwise `new`.
|
||||
|
||||
### Collision Strategies
|
||||
|
||||
When importing into an existing company, agent or project names may conflict with existing ones:
|
||||
|
||||
- **`rename`** (default) — Appends a suffix to avoid conflicts (e.g., `ceo` becomes `ceo-2`).
|
||||
- **`skip`** — Skips entities that already exist.
|
||||
- **`replace`** — Overwrites existing entities. Only available for non-safe imports (not available through the CEO API).
|
||||
|
||||
### Interactive Selection
|
||||
|
||||
When running interactively (no `--yes` or `--json` flags), the import command shows a selection picker before applying. You can choose exactly which agents, projects, skills, and tasks to import using a checkbox interface.
|
||||
|
||||
### Preview Before Applying
|
||||
|
||||
Always preview first with `--dry-run`:
|
||||
|
||||
```sh
|
||||
paperclipai company import org/repo --target existing --company-id abc123 --dry-run
|
||||
```
|
||||
|
||||
The preview shows:
|
||||
- **Package contents** — How many agents, projects, tasks, and skills are in the source
|
||||
- **Import plan** — What will be created, renamed, skipped, or replaced
|
||||
- **Env inputs** — Environment variables that may need values after import
|
||||
- **Warnings** — Potential issues like missing skills or unresolved references
|
||||
|
||||
Imported agents always land with timer heartbeats disabled. Assignment/on-demand wake behavior from the package is preserved, but scheduled runs stay off until a board operator re-enables them.
|
||||
|
||||
### Common Workflows
|
||||
|
||||
**Clone a company template from GitHub:**
|
||||
|
||||
```sh
|
||||
paperclipai company import org/company-templates/engineering-team \
|
||||
--target new \
|
||||
--new-company-name "My Engineering Team"
|
||||
```
|
||||
|
||||
**Add agents from a package into your existing company:**
|
||||
|
||||
```sh
|
||||
paperclipai company import ./shared-agents \
|
||||
--target existing \
|
||||
--company-id abc123 \
|
||||
--include agents \
|
||||
--collision rename
|
||||
```
|
||||
|
||||
**Import a specific branch or tag:**
|
||||
|
||||
```sh
|
||||
paperclipai company import org/repo --ref v2.0.0 --dry-run
|
||||
```
|
||||
|
||||
**Non-interactive import (CI/scripts):**
|
||||
|
||||
```sh
|
||||
paperclipai company import ./package \
|
||||
--target new \
|
||||
--yes \
|
||||
--json
|
||||
```
|
||||
|
||||
## API Endpoints
|
||||
|
||||
The CLI commands use these API endpoints under the hood:
|
||||
|
||||
| Action | Endpoint |
|
||||
|--------|----------|
|
||||
| Export company | `POST /api/companies/{companyId}/export` |
|
||||
| Preview import (existing company) | `POST /api/companies/{companyId}/imports/preview` |
|
||||
| Apply import (existing company) | `POST /api/companies/{companyId}/imports/apply` |
|
||||
| Preview import (new company) | `POST /api/companies/import/preview` |
|
||||
| Apply import (new company) | `POST /api/companies/import` |
|
||||
|
||||
CEO agents can also use the safe import routes (`/imports/preview` and `/imports/apply`) which enforce non-destructive rules: `replace` is rejected, collisions resolve with `rename` or `skip`, and issues are always created as new.
|
||||
|
||||
## GitHub Sources
|
||||
|
||||
Paperclip supports several GitHub URL formats:
|
||||
|
||||
- Full URL: `https://github.com/org/repo`
|
||||
- Subfolder URL: `https://github.com/org/repo/tree/main/path/to/company`
|
||||
- Shorthand: `org/repo`
|
||||
- Shorthand with path: `org/repo/path/to/company`
|
||||
|
||||
Use `--ref` to pin to a specific branch, tag, or commit hash when importing from GitHub.
|
||||