Mirror of https://github.com/paperclipai/paperclip, synced 2026-04-26 01:35:18 +02:00

Compare commits: @paperclip ... pr/pap-817 (1197 commits)

.agents/skills/company-creator/SKILL.md (new file, 269 lines)
@@ -0,0 +1,269 @@
---
name: company-creator
description: >
  Create agent company packages conforming to the Agent Companies specification
  (agentcompanies/v1). Use when a user wants to create a new agent company from
  scratch, build a company around an existing git repo or skills collection, or
  scaffold a team/department of agents. Triggers on: "create a company", "make me
  a company", "build a company from this repo", "set up an agent company",
  "create a team of agents", "hire some agents", or when given a repo URL and
  asked to turn it into a company. Do NOT use for importing an existing company
  package (use the CLI import command instead) or for modifying a company that
  is already running in Paperclip.
---

# Company Creator

Create agent company packages that conform to the Agent Companies specification.

Spec references:

- Normative spec: `docs/companies/companies-spec.md` (read this before generating files)
- Web spec: https://agentcompanies.io/specification
- Protocol site: https://agentcompanies.io/

## Two Modes

### Mode 1: Company From Scratch

The user describes what they want. Interview them to flesh out the vision, then generate the package.

### Mode 2: Company From a Repo

The user provides a git repo URL, local path, or tweet. Analyze the repo, then create a company that wraps it.

See [references/from-repo-guide.md](references/from-repo-guide.md) for detailed repo analysis steps.

## Process

### Step 1: Gather Context

Determine which mode applies:

- **From scratch**: What kind of company or team? What domain? What should the agents do?
- **From repo**: Clone/read the repo. Scan for existing skills, agent configs, README, source structure.

### Step 2: Interview (Use AskUserQuestion)

Do not skip this step. Use AskUserQuestion to align with the user before writing any files.

**For from-scratch companies**, ask about:

- Company purpose and domain (1-2 sentences is fine)
- What agents they need - propose a hiring plan based on what they described
- Whether this is a full company (needs a CEO) or a team/department (no CEO required)
- Any specific skills the agents should have
- How work flows through the organization (see "Workflow" below)
- Whether they want projects and starter tasks

**For from-repo companies**, present your analysis and ask:

- Confirm the agents you plan to create and their roles
- Whether to reference or vendor any discovered skills (default: reference)
- Any additional agents or skills beyond what the repo provides
- Company name and any customization
- Confirm the workflow you inferred from the repo (see "Workflow" below)

**Workflow — how does work move through this company?**

A company is not just a list of agents with skills. It's an organization that takes ideas and turns them into work products. You need to understand the workflow so each agent knows:

- Who gives them work and in what form (a task, a branch, a question, a review request)
- What they do with it
- Who they hand off to when they're done, and what that handoff looks like
- What "done" means for their role

**Not every company is a pipeline.** Infer the right workflow pattern from context:

- **Pipeline** — sequential stages, each agent hands off to the next. Use when the repo/domain has a clear linear process (e.g. plan → build → review → ship → QA, or content ideation → draft → edit → publish).
- **Hub-and-spoke** — a manager delegates to specialists who report back independently. Use when agents do different kinds of work that don't feed into each other (e.g. a CEO who dispatches to a researcher, a marketer, and an analyst).
- **Collaborative** — agents work together on the same things as peers. Use for small teams where everyone contributes to the same output (e.g. a design studio, a brainstorming team).
- **On-demand** — agents are summoned as needed with no fixed flow. Use when agents are more like a toolbox of specialists the user calls directly.

For from-scratch companies, propose a workflow pattern based on what they described and ask if it fits.

For from-repo companies, infer the pattern from the repo's structure. If skills have a clear sequential dependency (like `plan-ceo-review → plan-eng-review → review → ship → qa`), that's a pipeline. If skills are independent capabilities, it's more likely hub-and-spoke or on-demand. State your inference in the interview so the user can confirm or adjust.

**Key interviewing principles:**

- Propose a concrete hiring plan. Don't ask open-ended "what agents do you want?" - suggest specific agents based on context and let the user adjust.
- Keep it lean. Most users are new to agent companies. A few agents (3-5) is typical for a startup. Don't suggest 10+ agents unless the scope demands it.
- From-scratch companies should start with a CEO who manages everyone. Teams/departments don't need one.
- Ask 2-3 focused questions per round, not 10.

### Step 3: Read the Spec

Before generating any files, read the normative spec:

```
docs/companies/companies-spec.md
```

Also read the quick reference: [references/companies-spec.md](references/companies-spec.md)

And the example: [references/example-company.md](references/example-company.md)

### Step 4: Generate the Package

Create the directory structure and all files. Follow the spec's conventions exactly.

**Directory structure:**

```
<company-slug>/
├── COMPANY.md
├── agents/
│   └── <slug>/AGENTS.md
├── teams/
│   └── <slug>/TEAM.md (if teams are needed)
├── projects/
│   └── <slug>/PROJECT.md (if projects are needed)
├── tasks/
│   └── <slug>/TASK.md (if tasks are needed)
├── skills/
│   └── <slug>/SKILL.md (if custom skills are needed)
└── .paperclip.yaml (Paperclip vendor extension)
```

**Rules:**

- Slugs must be URL-safe, lowercase, hyphenated
- COMPANY.md gets `schema: agentcompanies/v1` - other files inherit it
- Agent instructions go in the AGENTS.md body, not in .paperclip.yaml
- Skills referenced by shortname in AGENTS.md resolve to `skills/<shortname>/SKILL.md`
- For external skills, use `sources` with `usage: referenced` (see spec section 12)
- Do not export secrets, machine-local paths, or database IDs
- Omit empty/default fields
- For companies generated from a repo, add a references footer at the bottom of COMPANY.md body:
  `Generated from [repo-name](repo-url) with the company-creator skill from [Paperclip](https://github.com/paperclipai/paperclip)`

**Reporting structure** (a sketch follows this list):

- Every agent except the CEO should have `reportsTo` set to their manager's slug
- The CEO has `reportsTo: null`
- For teams without a CEO, the top-level agent has `reportsTo: null`
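
For example, using the `ceo` and `cto` roles from the reference example, the frontmatter for a top-level agent and a direct report might look like:

```markdown
<!-- agents/ceo/AGENTS.md (excerpt) -->
---
name: CEO
title: Chief Executive Officer
reportsTo: null
---

<!-- agents/cto/AGENTS.md (excerpt) -->
---
name: CTO
title: Chief Technology Officer
reportsTo: ceo
---
```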

**Writing workflow-aware agent instructions:**

Each AGENTS.md body should include not just what the agent does, but how they fit into the organization's workflow. Include:

1. **Where work comes from** — "You receive feature ideas from the user" or "You pick up tasks assigned to you by the CTO"
2. **What you produce** — "You produce a technical plan with architecture diagrams" or "You produce a reviewed, approved branch ready for shipping"
3. **Who you hand off to** — "When your plan is locked, hand off to the Staff Engineer for implementation" or "When review passes, hand off to the Release Engineer to ship"
4. **What triggers you** — "You are activated when a new feature idea needs product-level thinking" or "You are activated when a branch is ready for pre-landing review"

This turns a collection of agents into an organization that actually works together. Without workflow context, agents operate in isolation — they do their job but don't know what happens before or after them.
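
A rough sketch of what this looks like for a hypothetical Staff Engineer in a pipeline company (the slug, skills, and handoffs here are illustrative, not required by the spec):

```markdown
---
name: Staff Engineer
title: Staff Engineer
reportsTo: cto
skills:
  - code-review
---

You pick up implementation tasks assigned to you by the CTO (where work comes from).
You produce a reviewed, tested branch for each task (what you produce).
When the branch passes review, hand off to the Release Engineer to ship (who you hand off to).
You are activated when a technical plan is approved and ready to build (what triggers you).
```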

### Step 5: Confirm Output Location

Ask the user where to write the package. Common options:

- A subdirectory in the current repo
- A new directory the user specifies
- The current directory (if it's empty or they confirm)

### Step 6: Write README.md and LICENSE

**README.md** — every company package gets a README. It should be a nice, readable introduction that someone browsing GitHub would appreciate. Include the following (a sketch follows this list):

- Company name and what it does
- The workflow / how the company operates
- Org chart as a markdown list or table showing agents, titles, reporting structure, and skills
- Brief description of each agent's role
- Citations and references: link to the source repo (if from-repo), link to the Agent Companies spec (https://agentcompanies.io/specification), and link to Paperclip (https://github.com/paperclipai/paperclip)
- A "Getting Started" section explaining how to import: `paperclipai company import --from <path>`

**LICENSE** — include a LICENSE file. The copyright holder is the user creating the company, not the upstream repo author (they made the skills, the user is making the company). Use the same license type as the source repo (if from-repo) or ask the user (if from-scratch). Default to MIT if unclear.

### Step 7: Write Files and Summarize

Write all files, then give a brief summary:

- Company name and what it does
- Agent roster with roles and reporting structure
- Skills (custom + referenced)
- Projects and tasks if any
- The output path

## .paperclip.yaml Guidelines

The `.paperclip.yaml` file is the Paperclip vendor extension. It configures adapters and env inputs per agent.

### Adapter Rules

**Do not specify an adapter unless the repo or user context warrants it.** If you don't know what adapter the user wants, omit the adapter block entirely — Paperclip will use its default. Specifying an unknown adapter type causes an import error.

Paperclip's supported adapter types (these are the ONLY valid values):
- `claude_local` — Claude Code CLI
- `codex_local` — Codex CLI
- `opencode_local` — OpenCode CLI
- `pi_local` — Pi CLI
- `cursor` — Cursor
- `gemini_local` — Gemini CLI
- `openclaw_gateway` — OpenClaw gateway

Only set an adapter when:
- The repo or its skills clearly target a specific runtime (e.g. gstack is built for Claude Code, so `claude_local` is appropriate)
- The user explicitly requests a specific adapter
- The agent's role requires a specific runtime capability

### Env Inputs Rules

**Do not add boilerplate env variables.** Only add env inputs that the agent actually needs based on its skills or role:
- `GH_TOKEN` for agents that push code, create PRs, or interact with GitHub
- API keys only when a skill explicitly requires them
- Never set `ANTHROPIC_API_KEY` as a default empty env variable — the runtime handles this

Example with adapter (only when warranted):
```yaml
schema: paperclip/v1
agents:
  release-engineer:
    adapter:
      type: claude_local
      config:
        model: claude-sonnet-4-6
    inputs:
      env:
        GH_TOKEN:
          kind: secret
          requirement: optional
```

Example — only agents with actual overrides appear:
```yaml
schema: paperclip/v1
agents:
  release-engineer:
    inputs:
      env:
        GH_TOKEN:
          kind: secret
          requirement: optional
```

In this example, only `release-engineer` appears because it needs `GH_TOKEN`. The other agents (ceo, cto, etc.) have no overrides, so they are omitted entirely from `.paperclip.yaml`.

## External Skill References

When referencing skills from a GitHub repo, always use the references pattern:

```yaml
metadata:
  sources:
    - kind: github-file
      repo: owner/repo
      path: path/to/SKILL.md
      commit: <full SHA from git ls-remote or the repo>
      attribution: Owner or Org Name
      license: <from the repo's LICENSE>
      usage: referenced
```

Get the commit SHA with:

```bash
git ls-remote https://github.com/owner/repo HEAD
```

Do NOT copy external skill content into the package unless the user explicitly asks.

.agents/skills/company-creator/references/companies-spec.md (new file, 144 lines)
@@ -0,0 +1,144 @@
# Agent Companies Specification Reference

The normative specification lives at:

- Web: https://agentcompanies.io/specification
- Local: docs/companies/companies-spec.md

Read the local spec file before generating any package files. The spec defines the canonical format and all frontmatter fields. Below is a quick-reference summary for common authoring tasks.

## Package Kinds

| File | Kind | Purpose |
| ---------- | ------- | ------------------------------------------------- |
| COMPANY.md | company | Root entrypoint, org boundary and defaults |
| TEAM.md | team | Reusable org subtree |
| AGENTS.md | agent | One role, instructions, and attached skills |
| PROJECT.md | project | Planned work grouping |
| TASK.md | task | Portable starter task |
| SKILL.md | skill | Agent Skills capability package (do not redefine) |

## Directory Layout

```
company-package/
├── COMPANY.md
├── agents/
│   └── <slug>/AGENTS.md
├── teams/
│   └── <slug>/TEAM.md
├── projects/
│   └── <slug>/
│       ├── PROJECT.md
│       └── tasks/
│           └── <slug>/TASK.md
├── tasks/
│   └── <slug>/TASK.md
├── skills/
│   └── <slug>/SKILL.md
├── assets/
├── scripts/
├── references/
└── .paperclip.yaml (optional vendor extension)
```

## Common Frontmatter Fields

```yaml
schema: agentcompanies/v1
kind: company | team | agent | project | task
slug: url-safe-stable-identity
name: Human Readable Name
description: Short description for discovery
version: 0.1.0
license: MIT
authors:
  - name: Jane Doe
tags: []
metadata: {}
sources: []
```

- `schema` usually appears only at package root
- `kind` is optional when filename makes it obvious
- `slug` must be URL-safe and stable
- exporters should omit empty or default-valued fields

## COMPANY.md Required Fields

```yaml
name: Company Name
description: What this company does
slug: company-slug
schema: agentcompanies/v1
```

Optional: `version`, `license`, `authors`, `goals`, `includes`, `requirements.secrets`
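
As they might appear in a COMPANY.md (the nesting of `requirements.secrets` as a list of secret names is an assumption here; confirm against the normative spec):

```markdown
---
name: Company Name
description: What this company does
slug: company-slug
schema: agentcompanies/v1
version: 0.1.0
license: MIT
authors:
  - name: Jane Doe
goals:
  - Ship the product
requirements:
  secrets:
    - GH_TOKEN
---
```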

## AGENTS.md Key Fields

```yaml
name: Agent Name
title: Role Title
reportsTo: <agent-slug or null>
skills:
  - skill-shortname
```

- Body content is the agent's default instructions
- Skills resolve by shortname: `skills/<shortname>/SKILL.md`
- Do not export machine-specific paths or secrets

## TEAM.md Key Fields

```yaml
name: Team Name
description: What this team does
slug: team-slug
manager: ../agent-slug/AGENTS.md
includes:
  - ../agent-slug/AGENTS.md
  - ../../skills/skill-slug/SKILL.md
```

## PROJECT.md Key Fields

```yaml
name: Project Name
description: What this project delivers
owner: agent-slug
```

## TASK.md Key Fields

```yaml
name: Task Name
assignee: agent-slug
project: project-slug
schedule:
  timezone: America/Chicago
  startsAt: 2026-03-16T09:00:00-05:00
  recurrence:
    frequency: weekly
    interval: 1
    weekdays: [monday]
    time: { hour: 9, minute: 0 }
```

## Source References (for external skills/content)

```yaml
sources:
  - kind: github-file
    repo: owner/repo
    path: path/to/SKILL.md
    commit: <full-sha>
    sha256: <hash>
    attribution: Owner Name
    license: MIT
    usage: referenced
```

Usage modes: `vendored` (bytes included), `referenced` (pointer only), `mirrored` (cached locally)

Default to `referenced` for third-party content.

.agents/skills/company-creator/references/example-company.md (new file, 184 lines)
@@ -0,0 +1,184 @@
# Example Company Package

A minimal but complete example of an agent company package.

## Directory Structure

```
lean-dev-shop/
├── COMPANY.md
├── agents/
│   ├── ceo/AGENTS.md
│   ├── cto/AGENTS.md
│   └── engineer/AGENTS.md
├── teams/
│   └── engineering/TEAM.md
├── projects/
│   └── q2-launch/
│       ├── PROJECT.md
│       └── tasks/
│           └── monday-review/TASK.md
├── tasks/
│   └── weekly-standup/TASK.md
├── skills/
│   └── code-review/SKILL.md
└── .paperclip.yaml
```

## COMPANY.md

```markdown
---
name: Lean Dev Shop
description: Small engineering-focused AI company that builds and ships software products
slug: lean-dev-shop
schema: agentcompanies/v1
version: 1.0.0
license: MIT
authors:
  - name: Example Org
goals:
  - Build and ship software products
  - Maintain high code quality
---

Lean Dev Shop is a small, focused engineering company. The CEO oversees strategy and coordinates work. The CTO leads the engineering team. Engineers build and ship code.
```

## agents/ceo/AGENTS.md

```markdown
---
name: CEO
title: Chief Executive Officer
reportsTo: null
skills:
  - paperclip
---

You are the CEO of Lean Dev Shop. You oversee company strategy, coordinate work across the team, and ensure projects ship on time.

Your responsibilities:

- Review and prioritize work across projects
- Coordinate with the CTO on technical decisions
- Ensure the company goals are being met
```

## agents/cto/AGENTS.md

```markdown
---
name: CTO
title: Chief Technology Officer
reportsTo: ceo
skills:
  - code-review
  - paperclip
---

You are the CTO of Lean Dev Shop. You lead the engineering team and make technical decisions.

Your responsibilities:

- Set technical direction and architecture
- Review code and ensure quality standards
- Mentor engineers and unblock technical challenges
```

## agents/engineer/AGENTS.md

```markdown
---
name: Engineer
title: Software Engineer
reportsTo: cto
skills:
  - code-review
  - paperclip
---

You are a software engineer at Lean Dev Shop. You write code, fix bugs, and ship features.

Your responsibilities:

- Implement features and fix bugs
- Write tests and documentation
- Participate in code reviews
```

## teams/engineering/TEAM.md

```markdown
---
name: Engineering
description: Product and platform engineering team
slug: engineering
schema: agentcompanies/v1
manager: ../../agents/cto/AGENTS.md
includes:
  - ../../agents/engineer/AGENTS.md
  - ../../skills/code-review/SKILL.md
tags:
  - engineering
---

The engineering team builds and maintains all software products.
```

## projects/q2-launch/PROJECT.md

```markdown
---
name: Q2 Launch
description: Ship the Q2 product launch
slug: q2-launch
owner: cto
---

Deliver all features planned for the Q2 launch, including the new dashboard and API improvements.
```

## projects/q2-launch/tasks/monday-review/TASK.md

```markdown
---
name: Monday Review
assignee: ceo
project: q2-launch
schedule:
  timezone: America/Chicago
  startsAt: 2026-03-16T09:00:00-05:00
  recurrence:
    frequency: weekly
    interval: 1
    weekdays:
      - monday
    time:
      hour: 9
      minute: 0
---

Review the status of Q2 Launch project. Check progress on all open tasks, identify blockers, and update priorities for the week.
```

## skills/code-review/SKILL.md (with external reference)

```markdown
---
name: code-review
description: Thorough code review skill for pull requests and diffs
metadata:
  sources:
    - kind: github-file
      repo: anthropics/claude-code
      path: skills/code-review/SKILL.md
      commit: abc123def456
      sha256: 3b7e...9a
      attribution: Anthropic
      license: MIT
      usage: referenced
---

Review code changes for correctness, style, and potential issues.
```

.agents/skills/company-creator/references/from-repo-guide.md (new file, 79 lines)
@@ -0,0 +1,79 @@
# Creating a Company From an Existing Repository

When a user provides a git repo (URL, local path, or tweet linking to a repo), analyze it and create a company package that wraps its content.

## Analysis Steps

1. **Clone or read the repo** - Use `git clone` for URLs, read directly for local paths
2. **Scan for existing agent/skill files** - Look for SKILL.md, AGENTS.md, CLAUDE.md, .claude/ directories, or similar agent configuration
3. **Understand the repo's purpose** - Read README, package.json, main source files to understand what the project does
4. **Identify natural agent roles** - Based on the repo's structure and purpose, determine what agents would be useful

## Handling Existing Skills

Many repos already contain skills (SKILL.md files). When you find them:

**Default behavior: use references, not copies.**

Instead of copying skill content into your company package, create a source reference:

```yaml
metadata:
  sources:
    - kind: github-file
      repo: owner/repo
      path: path/to/SKILL.md
      commit: <get the current HEAD commit SHA>
      attribution: <repo owner or org name>
      license: <from repo's LICENSE file>
      usage: referenced
```

To get the commit SHA:
```bash
git ls-remote https://github.com/owner/repo HEAD
```

Only vendor (copy) skills when:
- The user explicitly asks to copy them
- The skill is very small and tightly coupled to the company
- The source repo is private or may become unavailable

## Handling Existing Agent Configurations

If the repo has agent configs (CLAUDE.md, .claude/ directories, codex configs, etc.):
- Use them as inspiration for AGENTS.md instructions
- Don't copy them verbatim - adapt them to the Agent Companies format
- Preserve the intent and key instructions

## Repo-Only Skills (No Agents)

When a repo contains only skills and no agents:
- Create agents that would naturally use those skills
- The agents should be minimal - just enough to give the skills a runtime context
- A single agent may use multiple skills from the repo
- Name agents based on the domain the skills cover

Example: A repo with `code-review`, `testing`, and `deployment` skills might become:
- A "Lead Engineer" agent with all three skills (sketched below)
- Or separate "Reviewer", "QA Engineer", and "DevOps" agents if the skills are distinct enough

## Common Repo Patterns

### Developer Tools / CLI repos
- Create agents for the tool's primary use cases
- Reference any existing skills
- Add a project maintainer or lead agent

### Library / Framework repos
- Create agents for development, testing, documentation
- Skills from the repo become agent capabilities

### Full Application repos
- Map to departments: engineering, product, QA
- Create a lean team structure appropriate to the project size

### Skills Collection repos (e.g. skills.sh repos)
- Each skill or skill group gets an agent
- Create a lightweight company or team wrapper
- Keep the agent count proportional to the skill diversity

.agents/skills/doc-maintenance/SKILL.md (new file, 201 lines)
@@ -0,0 +1,201 @@
|
|||||||
|
---
|
||||||
|
name: doc-maintenance
|
||||||
|
description: >
|
||||||
|
Audit top-level documentation (README, SPEC, PRODUCT) against recent git
|
||||||
|
history to find drift — shipped features missing from docs or features
|
||||||
|
listed as upcoming that already landed. Proposes minimal edits, creates
|
||||||
|
a branch, and opens a PR. Use when asked to review docs for accuracy,
|
||||||
|
after major feature merges, or on a periodic schedule.
|
||||||
|
---
|
||||||
|
|
||||||
|
# Doc Maintenance Skill
|
||||||
|
|
||||||
|
Detect documentation drift and fix it via PR — no rewrites, no churn.
|
||||||
|
|
||||||
|
## When to Use
|
||||||
|
|
||||||
|
- Periodic doc review (e.g. weekly or after releases)
|
||||||
|
- After major feature merges
|
||||||
|
- When asked "are our docs up to date?"
|
||||||
|
- When asked to audit README / SPEC / PRODUCT accuracy
|
||||||
|
|
||||||
|
## Target Documents
|
||||||
|
|
||||||
|
| Document | Path | What matters |
|
||||||
|
|----------|------|-------------|
|
||||||
|
| README | `README.md` | Features table, roadmap, quickstart, "what is" accuracy, "works with" table |
|
||||||
|
| SPEC | `doc/SPEC.md` | No false "not supported" claims, major model/schema accuracy |
|
||||||
|
| PRODUCT | `doc/PRODUCT.md` | Core concepts, feature list, principles accuracy |
|
||||||
|
|
||||||
|
Out of scope: DEVELOPING.md, DATABASE.md, CLI.md, doc/plans/, skill files,
|
||||||
|
release notes. These are dev-facing or ephemeral — lower risk of user-facing
|
||||||
|
confusion.
|
||||||
|
|
||||||
|
## Workflow
|
||||||
|
|
||||||
|
### Step 1 — Detect what changed
|
||||||
|
|
||||||
|
Find the last review cursor:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Read the last-reviewed commit SHA
|
||||||
|
CURSOR_FILE=".doc-review-cursor"
|
||||||
|
if [ -f "$CURSOR_FILE" ]; then
|
||||||
|
LAST_SHA=$(cat "$CURSOR_FILE" | head -1)
|
||||||
|
else
|
||||||
|
# First run: look back 60 days
|
||||||
|
LAST_SHA=$(git log --format="%H" --after="60 days ago" --reverse | head -1)
|
||||||
|
fi
|
||||||
|
```
|
||||||
|
|
||||||
|
Then gather commits since the cursor:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git log "$LAST_SHA"..HEAD --oneline --no-merges
|
||||||
|
```
|
||||||
|
|
||||||
|
### Step 2 — Classify changes
|
||||||
|
|
||||||
|
Scan commit messages and changed files. Categorize into:
|
||||||
|
|
||||||
|
- **Feature** — new capabilities (keywords: `feat`, `add`, `implement`, `support`)
|
||||||
|
- **Breaking** — removed/renamed things (keywords: `remove`, `breaking`, `drop`, `rename`)
|
||||||
|
- **Structural** — new directories, config changes, new adapters, new CLI commands
|
||||||
|
|
||||||
|
**Ignore:** refactors, test-only changes, CI config, dependency bumps, doc-only
|
||||||
|
changes, style/formatting commits. These don't affect doc accuracy.
|
||||||
|
|
||||||
|
For borderline cases, check the actual diff — a commit titled "refactor: X"
|
||||||
|
that adds a new public API is a feature.
|
||||||
|
|
||||||
|
### Step 3 — Build a change summary
|
||||||
|
|
||||||
|
Produce a concise list like:
|
||||||
|
|
||||||
|
```
|
||||||
|
Since last review (<sha>, <date>):
|
||||||
|
- FEATURE: Plugin system merged (runtime, SDK, CLI, slots, event bridge)
|
||||||
|
- FEATURE: Project archiving added
|
||||||
|
- BREAKING: Removed legacy webhook adapter
|
||||||
|
- STRUCTURAL: New .agents/skills/ directory convention
|
||||||
|
```
|
||||||
|
|
||||||
|
If there are no notable changes, skip to Step 7 (update cursor and exit).
|
||||||
|
|
||||||
|
### Step 4 — Audit each target doc
|
||||||
|
|
||||||
|
For each target document, read it fully and cross-reference against the change
|
||||||
|
summary. Check for:
|
||||||
|
|
||||||
|
1. **False negatives** — major shipped features not mentioned at all
|
||||||
|
2. **False positives** — features listed as "coming soon" / "roadmap" / "planned"
|
||||||
|
/ "not supported" / "TBD" that already shipped
|
||||||
|
3. **Quickstart accuracy** — install commands, prereqs, and startup instructions
|
||||||
|
still correct (README only)
|
||||||
|
4. **Feature table accuracy** — does the features section reflect current
|
||||||
|
capabilities? (README only)
|
||||||
|
5. **Works-with accuracy** — are supported adapters/integrations listed correctly?
|
||||||
|
|
||||||
|
Use `references/audit-checklist.md` as the structured checklist.
|
||||||
|
Use `references/section-map.md` to know where to look for each feature area.
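One quick cross-check for item 2 above is a plain text scan. A sketch (the phrase list is illustrative; every hit still needs a human read against the change summary):

```bash
rg -n -i 'coming soon|planned|roadmap|not supported|not in V1|TBD' README.md doc/SPEC.md doc/PRODUCT.md
```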
|
||||||
|
|
||||||
|
### Step 5 — Create branch and apply minimal edits
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Create a branch for the doc updates
|
||||||
|
BRANCH="docs/maintenance-$(date +%Y%m%d)"
|
||||||
|
git checkout -b "$BRANCH"
|
||||||
|
```
|
||||||
|
|
||||||
|
Apply **only** the edits needed to fix drift. Rules:
|
||||||
|
|
||||||
|
- **Minimal patches only.** Fix inaccuracies, don't rewrite sections.
|
||||||
|
- **Preserve voice and style.** Match the existing tone of each document.
|
||||||
|
- **No cosmetic changes.** Don't fix typos, reformat tables, or reorganize
|
||||||
|
sections unless they're part of a factual fix.
|
||||||
|
- **No new sections.** If a feature needs a whole new section, note it in the
|
||||||
|
PR description as a follow-up — don't add it in a maintenance pass.
|
||||||
|
- **Roadmap items:** Move shipped features out of Roadmap. Add a brief mention
|
||||||
|
in the appropriate existing section if there isn't one already. Don't add
|
||||||
|
long descriptions.
|
||||||
|
|
||||||
|
### Step 6 — Open a PR
|
||||||
|
|
||||||
|
Commit the changes and open a PR:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git add README.md doc/SPEC.md doc/PRODUCT.md .doc-review-cursor
|
||||||
|
git commit -m "docs: update documentation for accuracy
|
||||||
|
|
||||||
|
- [list each fix briefly]
|
||||||
|
|
||||||
|
Co-Authored-By: Paperclip <noreply@paperclip.ing>"
|
||||||
|
|
||||||
|
git push -u origin "$BRANCH"
|
||||||
|
|
||||||
|
gh pr create \
|
||||||
|
--title "docs: periodic documentation accuracy update" \
|
||||||
|
--body "$(cat <<'EOF'
|
||||||
|
## Summary
|
||||||
|
Automated doc maintenance pass. Fixes documentation drift detected since
|
||||||
|
last review.
|
||||||
|
|
||||||
|
### Changes
|
||||||
|
- [list each fix]
|
||||||
|
|
||||||
|
### Change summary (since last review)
|
||||||
|
- [list notable code changes that triggered doc updates]
|
||||||
|
|
||||||
|
## Review notes
|
||||||
|
- Only factual accuracy fixes — no style/cosmetic changes
|
||||||
|
- Preserves existing voice and structure
|
||||||
|
- Larger doc additions (new sections, tutorials) noted as follow-ups
|
||||||
|
|
||||||
|
🤖 Generated by doc-maintenance skill
|
||||||
|
EOF
|
||||||
|
)"
|
||||||
|
```
|
||||||
|
|
||||||
|
### Step 7 — Update the cursor
|
||||||
|
|
||||||
|
After a successful audit (whether or not edits were needed), update the cursor:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git rev-parse HEAD > .doc-review-cursor
|
||||||
|
```
|
||||||
|
|
||||||
|
If edits were made, this is already committed in the PR branch. If no edits
|
||||||
|
were needed, commit the cursor update to the current branch.
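For the no-edits case, the commit might look like this (the message wording is illustrative):

```bash
# The cursor file was already rewritten by the command above; commit it on the current branch.
git add .doc-review-cursor
git commit -m "chore: advance doc review cursor (no drift found)"
```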
|
||||||
|
|
||||||
|
## Change Classification Rules
|
||||||
|
|
||||||
|
| Signal | Category | Doc update needed? |
|
||||||
|
|--------|----------|-------------------|
|
||||||
|
| `feat:`, `add`, `implement`, `support` in message | Feature | Yes if user-facing |
|
||||||
|
| `remove`, `drop`, `breaking`, `!:` in message | Breaking | Yes |
|
||||||
|
| New top-level directory or config file | Structural | Maybe |
|
||||||
|
| `fix:`, `bugfix` | Fix | No (unless it changes behavior described in docs) |
|
||||||
|
| `refactor:`, `chore:`, `ci:`, `test:` | Maintenance | No |
|
||||||
|
| `docs:` | Doc change | No (already handled) |
|
||||||
|
| Dependency bumps only | Maintenance | No |
|
||||||
|
|
||||||
|
## Patch Style Guide
|
||||||
|
|
||||||
|
- Fix the fact, not the prose
|
||||||
|
- If removing a roadmap item, don't leave a gap — remove the bullet cleanly
|
||||||
|
- If adding a feature mention, match the format of surrounding entries
|
||||||
|
(e.g. if features are in a table, add a table row)
|
||||||
|
- Keep README changes especially minimal — it shouldn't churn often
|
||||||
|
- For SPEC/PRODUCT, prefer updating existing statements over adding new ones
|
||||||
|
(e.g. change "not supported in V1" to "supported via X" rather than adding
|
||||||
|
a new section)
|
||||||
|
|
||||||
|
## Output
|
||||||
|
|
||||||
|
When the skill completes, report:
|
||||||
|
|
||||||
|
- How many commits were scanned
|
||||||
|
- How many notable changes were found
|
||||||
|
- How many doc edits were made (and to which files)
|
||||||
|
- PR link (if edits were made)
|
||||||
|
- Any follow-up items that need larger doc work
|
||||||
85
.agents/skills/doc-maintenance/references/audit-checklist.md
Normal file
@@ -0,0 +1,85 @@
|
|||||||
|
# Doc Maintenance Audit Checklist
|
||||||
|
|
||||||
|
Use this checklist when auditing each target document. For each item, compare
|
||||||
|
against the change summary from git history.
|
||||||
|
|
||||||
|
## README.md
|
||||||
|
|
||||||
|
### Features table
|
||||||
|
- [ ] Each feature card reflects a shipped capability
|
||||||
|
- [ ] No feature cards for things that don't exist yet
|
||||||
|
- [ ] No major shipped features missing from the table
|
||||||
|
|
||||||
|
### Roadmap
|
||||||
|
- [ ] Nothing listed as "planned" or "coming soon" that already shipped
|
||||||
|
- [ ] No removed/cancelled items still listed
|
||||||
|
- [ ] Items reflect current priorities (cross-check with recent PRs)
|
||||||
|
|
||||||
|
### Quickstart
|
||||||
|
- [ ] `npx paperclipai onboard` command is correct
|
||||||
|
- [ ] Manual install steps are accurate (clone URL, commands)
|
||||||
|
- [ ] Prerequisites (Node version, pnpm version) are current
|
||||||
|
- [ ] Server URL and port are correct
|
||||||
|
|
||||||
|
### "What is Paperclip" section
|
||||||
|
- [ ] High-level description is accurate
|
||||||
|
- [ ] Step table (Define goal / Hire team / Approve and run) is correct
|
||||||
|
|
||||||
|
### "Works with" table
|
||||||
|
- [ ] All supported adapters/runtimes are listed
|
||||||
|
- [ ] No removed adapters still listed
|
||||||
|
- [ ] Logos and labels match current adapter names
|
||||||
|
|
||||||
|
### "Paperclip is right for you if"
|
||||||
|
- [ ] Use cases are still accurate
|
||||||
|
- [ ] No claims about capabilities that don't exist
|
||||||
|
|
||||||
|
### "Why Paperclip is special"
|
||||||
|
- [ ] Technical claims are accurate (atomic execution, governance, etc.)
|
||||||
|
- [ ] No features listed that were removed or significantly changed
|
||||||
|
|
||||||
|
### FAQ
|
||||||
|
- [ ] Answers are still correct
|
||||||
|
- [ ] No references to removed features or outdated behavior
|
||||||
|
|
||||||
|
### Development section
|
||||||
|
- [ ] Commands are accurate (`pnpm dev`, `pnpm build`, etc.)
|
||||||
|
- [ ] Link to DEVELOPING.md is correct
|
||||||
|
|
||||||
|
## doc/SPEC.md
|
||||||
|
|
||||||
|
### Company Model
|
||||||
|
- [ ] Fields match current schema
|
||||||
|
- [ ] Governance model description is accurate
|
||||||
|
|
||||||
|
### Agent Model
|
||||||
|
- [ ] Adapter types match what's actually supported
|
||||||
|
- [ ] Agent configuration description is accurate
|
||||||
|
- [ ] No features described as "not supported" or "not V1" that shipped
|
||||||
|
|
||||||
|
### Task Model
|
||||||
|
- [ ] Task hierarchy description is accurate
|
||||||
|
- [ ] Status values match current implementation
|
||||||
|
|
||||||
|
### Extensions / Plugins
|
||||||
|
- [ ] If plugins are shipped, no "not in V1" or "future" language
|
||||||
|
- [ ] Plugin model description matches implementation
|
||||||
|
|
||||||
|
### Open Questions
|
||||||
|
- [ ] Resolved questions removed or updated
|
||||||
|
- [ ] No "TBD" items that have been decided
|
||||||
|
|
||||||
|
## doc/PRODUCT.md
|
||||||
|
|
||||||
|
### Core Concepts
|
||||||
|
- [ ] Company, Employees, Task Management descriptions accurate
|
||||||
|
- [ ] Agent Execution modes described correctly
|
||||||
|
- [ ] No missing major concepts
|
||||||
|
|
||||||
|
### Principles
|
||||||
|
- [ ] Principles haven't been contradicted by shipped features
|
||||||
|
- [ ] No principles referencing removed capabilities
|
||||||
|
|
||||||
|
### User Flow
|
||||||
|
- [ ] Dream scenario still reflects actual onboarding
|
||||||
|
- [ ] Steps are achievable with current features
|
||||||
22
.agents/skills/doc-maintenance/references/section-map.md
Normal file
@@ -0,0 +1,22 @@
|
|||||||
|
# Section Map
|
||||||
|
|
||||||
|
Maps feature areas to specific document sections so the skill knows where to
|
||||||
|
look when a feature ships or changes.
|
||||||
|
|
||||||
|
| Feature Area | README Section | SPEC Section | PRODUCT Section |
|
||||||
|
|-------------|---------------|-------------|----------------|
|
||||||
|
| Plugins / Extensions | Features table, Roadmap | Extensions, Agent Model | Core Concepts |
|
||||||
|
| Adapters (new runtimes) | "Works with" table, FAQ | Agent Model, Agent Configuration | Employees & Agents, Agent Execution |
|
||||||
|
| Governance / Approvals | Features table, "Why special" | Board Governance, Board Approval Gates | Principles |
|
||||||
|
| Budget / Cost Control | Features table, "Why special" | Budget Delegation | Company (revenue & expenses) |
|
||||||
|
| Task Management | Features table | Task Model | Task Management |
|
||||||
|
| Org Chart / Hierarchy | Features table | Agent Model (reporting) | Employees & Agents |
|
||||||
|
| Multi-Company | Features table, FAQ | Company Model | Company |
|
||||||
|
| Heartbeats | Features table, FAQ | Agent Execution | Agent Execution |
|
||||||
|
| CLI Commands | Development section | — | — |
|
||||||
|
| Onboarding / Quickstart | Quickstart, FAQ | — | User Flow |
|
||||||
|
| Skills / Skill Injection | "Why special" | — | — |
|
||||||
|
| Company Templates | "Why special", Roadmap (ClipMart) | — | — |
|
||||||
|
| Mobile / UI | Features table | — | — |
|
||||||
|
| Project Archiving | — | — | — |
|
||||||
|
| OpenClaw Integration | "Works with" table, FAQ | Agent Model | Agent Execution |
|
||||||
202
.agents/skills/pr-report/SKILL.md
Normal file
@@ -0,0 +1,202 @@
|
|||||||
|
---
|
||||||
|
name: pr-report
|
||||||
|
description: >
|
||||||
|
Review a pull request or contribution deeply, explain it tutorial-style for a
|
||||||
|
maintainer, and produce a polished report artifact such as HTML or Markdown.
|
||||||
|
Use when asked to analyze a PR, explain a contributor's design decisions,
|
||||||
|
compare it with similar systems, or prepare a merge recommendation.
|
||||||
|
---
|
||||||
|
|
||||||
|
# PR Report Skill
|
||||||
|
|
||||||
|
Produce a maintainer-grade review of a PR, branch, or large contribution.
|
||||||
|
|
||||||
|
Default posture:
|
||||||
|
|
||||||
|
- understand the change before judging it
|
||||||
|
- explain the system as built, not just the diff
|
||||||
|
- separate architectural problems from product-scope objections
|
||||||
|
- make a concrete recommendation, not a vague impression
|
||||||
|
|
||||||
|
## When to Use
|
||||||
|
|
||||||
|
Use this skill when the user asks for things like:
|
||||||
|
|
||||||
|
- "review this PR deeply"
|
||||||
|
- "explain this contribution to me"
|
||||||
|
- "make me a report or webpage for this PR"
|
||||||
|
- "compare this design to similar systems"
|
||||||
|
- "should I merge this?"
|
||||||
|
|
||||||
|
## Outputs
|
||||||
|
|
||||||
|
Common outputs:
|
||||||
|
|
||||||
|
- standalone HTML report in `tmp/reports/...`
|
||||||
|
- Markdown report in `report/` or another requested folder
|
||||||
|
- short maintainer summary in chat
|
||||||
|
|
||||||
|
If the user asks for a webpage, build a polished standalone HTML artifact with
|
||||||
|
clear sections and readable visual hierarchy.
|
||||||
|
|
||||||
|
Resources bundled with this skill:
|
||||||
|
|
||||||
|
- `references/style-guide.md` for visual direction and report presentation rules
|
||||||
|
- `assets/html-report-starter.html` for a reusable standalone HTML/CSS starter
|
||||||
|
|
||||||
|
## Workflow
|
||||||
|
|
||||||
|
### 1. Acquire and frame the target
|
||||||
|
|
||||||
|
Work from local code when possible, not just the GitHub PR page.
|
||||||
|
|
||||||
|
Gather:
|
||||||
|
|
||||||
|
- target branch or worktree
|
||||||
|
- diff size and changed subsystems
|
||||||
|
- relevant repo docs, specs, and invariants
|
||||||
|
- contributor intent if it is documented in PR text or design docs
|
||||||
|
|
||||||
|
Start by answering: what is this change *trying* to become?
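A minimal local setup sketch; the PR number, branch name, and `master` base are placeholders, and the `gh` call is optional:

```bash
# Fetch the contribution into a local branch and survey its shape.
git fetch origin pull/123/head:pr-123    # 123 is a placeholder PR number
git checkout pr-123
git diff --stat master...HEAD            # diff size and touched subsystems
gh pr view 123 --json title,body,labels  # contributor intent, when gh is available
```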
|
||||||
|
|
||||||
|
### 2. Build a mental model of the system
|
||||||
|
|
||||||
|
Do not stop at file-by-file notes. Reconstruct the design:
|
||||||
|
|
||||||
|
- what new runtime or contract exists
|
||||||
|
- which layers changed: db, shared types, server, UI, CLI, docs
|
||||||
|
- lifecycle: install, startup, execution, UI, failure, disablement
|
||||||
|
- trust boundary: what code runs where, under what authority
|
||||||
|
|
||||||
|
For large contributions, include a tutorial-style section that teaches the
|
||||||
|
system from first principles.
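One way to see which layers a contribution touches, assuming the PR branch is checked out and `master` is the base (both assumptions):

```bash
# Count changed files per top-level area.
git diff --name-only master...HEAD | cut -d/ -f1-2 | sort | uniq -c | sort -rn
```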
|
||||||
|
|
||||||
|
### 3. Review like a maintainer
|
||||||
|
|
||||||
|
Findings come first. Order by severity.
|
||||||
|
|
||||||
|
Prioritize:
|
||||||
|
|
||||||
|
- behavioral regressions
|
||||||
|
- trust or security gaps
|
||||||
|
- misleading abstractions
|
||||||
|
- lifecycle and operational risks
|
||||||
|
- coupling that will be hard to unwind
|
||||||
|
- missing tests or unverifiable claims
|
||||||
|
|
||||||
|
Always cite concrete file references when possible.
|
||||||
|
|
||||||
|
### 4. Distinguish the objection type
|
||||||
|
|
||||||
|
Be explicit about whether a concern is:
|
||||||
|
|
||||||
|
- product direction
|
||||||
|
- architecture
|
||||||
|
- implementation quality
|
||||||
|
- rollout strategy
|
||||||
|
- documentation honesty
|
||||||
|
|
||||||
|
Do not hide an architectural objection inside a scope objection.
|
||||||
|
|
||||||
|
### 5. Compare to external precedents when needed
|
||||||
|
|
||||||
|
If the contribution introduces a framework or platform concept, compare it to
|
||||||
|
similar open-source systems.
|
||||||
|
|
||||||
|
When comparing:
|
||||||
|
|
||||||
|
- prefer official docs or source
|
||||||
|
- focus on extension boundaries, context passing, trust model, and UI ownership
|
||||||
|
- extract lessons, not just similarities
|
||||||
|
|
||||||
|
Good comparison questions:
|
||||||
|
|
||||||
|
- Who owns lifecycle?
|
||||||
|
- Who owns UI composition?
|
||||||
|
- Is context explicit or ambient?
|
||||||
|
- Are plugins trusted code or sandboxed code?
|
||||||
|
- Are extension points named and typed?
|
||||||
|
|
||||||
|
### 6. Make the recommendation actionable
|
||||||
|
|
||||||
|
Do not stop at "merge" or "do not merge."
|
||||||
|
|
||||||
|
Choose one:
|
||||||
|
|
||||||
|
- merge as-is
|
||||||
|
- merge after specific redesign
|
||||||
|
- salvage specific pieces
|
||||||
|
- keep as design research
|
||||||
|
|
||||||
|
If rejecting or narrowing, say what should be kept.
|
||||||
|
|
||||||
|
Useful recommendation buckets:
|
||||||
|
|
||||||
|
- keep the protocol/type model
|
||||||
|
- redesign the UI boundary
|
||||||
|
- narrow the initial surface area
|
||||||
|
- defer third-party execution
|
||||||
|
- ship a host-owned extension-point model first
|
||||||
|
|
||||||
|
### 7. Build the artifact
|
||||||
|
|
||||||
|
Suggested report structure:
|
||||||
|
|
||||||
|
1. Executive summary
|
||||||
|
2. What the PR actually adds
|
||||||
|
3. Tutorial: how the system works
|
||||||
|
4. Strengths
|
||||||
|
5. Main findings
|
||||||
|
6. Comparisons
|
||||||
|
7. Recommendation
|
||||||
|
|
||||||
|
For HTML reports:
|
||||||
|
|
||||||
|
- use intentional typography and color
|
||||||
|
- make navigation easy for long reports
|
||||||
|
- favor strong section headings and small reference labels
|
||||||
|
- avoid generic dashboard styling
|
||||||
|
|
||||||
|
Before building from scratch, read `references/style-guide.md`.
|
||||||
|
If a fast polished starter is helpful, begin from `assets/html-report-starter.html`
|
||||||
|
and replace the placeholder content with the actual report.
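For example (the output directory and file name are illustrative):

```bash
mkdir -p tmp/reports
cp .agents/skills/pr-report/assets/html-report-starter.html tmp/reports/pr-123-report.html
```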
|
||||||
|
|
||||||
|
### 8. Verify before handoff
|
||||||
|
|
||||||
|
Check:
|
||||||
|
|
||||||
|
- artifact path exists
|
||||||
|
- findings still match the actual code
|
||||||
|
- any requested forbidden strings are absent from generated output
|
||||||
|
- if tests were not run, say so explicitly
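The first and third checks can be scripted; a sketch (the report path and forbidden pattern are placeholders supplied by the request):

```bash
test -f tmp/reports/pr-123-report.html || echo "report artifact missing"
rg -n 'FORBIDDEN_STRING' tmp/reports/pr-123-report.html && echo "forbidden string present" || echo "clean"
```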
|
||||||
|
|
||||||
|
## Review Heuristics
|
||||||
|
|
||||||
|
### Plugin and platform work
|
||||||
|
|
||||||
|
Watch closely for:
|
||||||
|
|
||||||
|
- docs claiming sandboxing while runtime executes trusted host processes
|
||||||
|
- module-global state used to smuggle React context
|
||||||
|
- hidden dependence on render order
|
||||||
|
- plugins reaching into host internals instead of using explicit APIs
|
||||||
|
- "capabilities" that are really policy labels on top of fully trusted code
|
||||||
|
|
||||||
|
### Good signs
|
||||||
|
|
||||||
|
- typed contracts shared across layers
|
||||||
|
- explicit extension points
|
||||||
|
- host-owned lifecycle
|
||||||
|
- honest trust model
|
||||||
|
- narrow first rollout with room to grow
|
||||||
|
|
||||||
|
## Final Response
|
||||||
|
|
||||||
|
In chat, summarize:
|
||||||
|
|
||||||
|
- where the report is
|
||||||
|
- your overall call
|
||||||
|
- the top one or two reasons
|
||||||
|
- whether verification or tests were skipped
|
||||||
|
|
||||||
|
Keep the chat summary shorter than the report itself.
|
||||||
426
.agents/skills/pr-report/assets/html-report-starter.html
Normal file
@@ -0,0 +1,426 @@
|
|||||||
|
<!doctype html>
|
||||||
|
<html lang="en">
|
||||||
|
<head>
|
||||||
|
<meta charset="utf-8" />
|
||||||
|
<meta name="viewport" content="width=device-width, initial-scale=1" />
|
||||||
|
<title>PR Report Starter</title>
|
||||||
|
<link rel="preconnect" href="https://fonts.googleapis.com" />
|
||||||
|
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
|
||||||
|
<link
|
||||||
|
href="https://fonts.googleapis.com/css2?family=IBM+Plex+Sans:wght@400;500;600;700&family=Newsreader:opsz,wght@6..72,500;6..72,700&display=swap"
|
||||||
|
rel="stylesheet"
|
||||||
|
/>
|
||||||
|
<style>
|
||||||
|
:root {
|
||||||
|
--bg: #f4efe5;
|
||||||
|
--paper: rgba(255, 251, 244, 0.88);
|
||||||
|
--paper-strong: #fffaf1;
|
||||||
|
--ink: #1f1b17;
|
||||||
|
--muted: #6a6257;
|
||||||
|
--line: rgba(31, 27, 23, 0.12);
|
||||||
|
--accent: #9c4729;
|
||||||
|
--accent-soft: rgba(156, 71, 41, 0.1);
|
||||||
|
--good: #2f6a42;
|
||||||
|
--warn: #946200;
|
||||||
|
--bad: #8c2f25;
|
||||||
|
--shadow: 0 22px 60px rgba(52, 37, 19, 0.1);
|
||||||
|
--radius: 20px;
|
||||||
|
}
|
||||||
|
|
||||||
|
* {
|
||||||
|
box-sizing: border-box;
|
||||||
|
}
|
||||||
|
|
||||||
|
html {
|
||||||
|
scroll-behavior: smooth;
|
||||||
|
}
|
||||||
|
|
||||||
|
body {
|
||||||
|
margin: 0;
|
||||||
|
color: var(--ink);
|
||||||
|
font-family: "IBM Plex Sans", sans-serif;
|
||||||
|
background:
|
||||||
|
radial-gradient(circle at top left, rgba(156, 71, 41, 0.12), transparent 34rem),
|
||||||
|
radial-gradient(circle at top right, rgba(47, 106, 66, 0.08), transparent 28rem),
|
||||||
|
linear-gradient(180deg, #efe6d6 0%, var(--bg) 48%, #ece5d8 100%);
|
||||||
|
}
|
||||||
|
|
||||||
|
.shell {
|
||||||
|
width: min(1360px, calc(100vw - 32px));
|
||||||
|
margin: 24px auto;
|
||||||
|
display: grid;
|
||||||
|
grid-template-columns: 280px minmax(0, 1fr);
|
||||||
|
gap: 24px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.panel {
|
||||||
|
background: var(--paper);
|
||||||
|
backdrop-filter: blur(12px);
|
||||||
|
border: 1px solid var(--line);
|
||||||
|
border-radius: var(--radius);
|
||||||
|
box-shadow: var(--shadow);
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav {
|
||||||
|
position: sticky;
|
||||||
|
top: 20px;
|
||||||
|
align-self: start;
|
||||||
|
padding: 22px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.eyebrow {
|
||||||
|
letter-spacing: 0.12em;
|
||||||
|
text-transform: uppercase;
|
||||||
|
font-size: 11px;
|
||||||
|
font-weight: 700;
|
||||||
|
color: var(--accent);
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav h1,
|
||||||
|
.hero h1,
|
||||||
|
h2,
|
||||||
|
h3 {
|
||||||
|
font-family: "Newsreader", serif;
|
||||||
|
line-height: 0.96;
|
||||||
|
margin: 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav h1 {
|
||||||
|
font-size: 2rem;
|
||||||
|
margin-top: 10px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav p {
|
||||||
|
color: var(--muted);
|
||||||
|
font-size: 0.95rem;
|
||||||
|
line-height: 1.5;
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav ul {
|
||||||
|
list-style: none;
|
||||||
|
padding: 0;
|
||||||
|
margin: 18px 0 0;
|
||||||
|
display: grid;
|
||||||
|
gap: 10px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav a {
|
||||||
|
display: block;
|
||||||
|
color: var(--ink);
|
||||||
|
text-decoration: none;
|
||||||
|
padding: 10px 12px;
|
||||||
|
border-radius: 12px;
|
||||||
|
border: 1px solid transparent;
|
||||||
|
background: rgba(255, 255, 255, 0.35);
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav a:hover {
|
||||||
|
border-color: var(--line);
|
||||||
|
background: rgba(255, 255, 255, 0.75);
|
||||||
|
}
|
||||||
|
|
||||||
|
.meta-block {
|
||||||
|
margin-top: 20px;
|
||||||
|
padding-top: 18px;
|
||||||
|
border-top: 1px solid var(--line);
|
||||||
|
color: var(--muted);
|
||||||
|
font-size: 0.86rem;
|
||||||
|
line-height: 1.5;
|
||||||
|
}
|
||||||
|
|
||||||
|
main {
|
||||||
|
display: grid;
|
||||||
|
gap: 24px;
|
||||||
|
}
|
||||||
|
|
||||||
|
section {
|
||||||
|
padding: 26px 28px 28px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero {
|
||||||
|
padding: 28px;
|
||||||
|
overflow: hidden;
|
||||||
|
position: relative;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero::after {
|
||||||
|
content: "";
|
||||||
|
position: absolute;
|
||||||
|
inset: auto -3rem -6rem auto;
|
||||||
|
width: 18rem;
|
||||||
|
height: 18rem;
|
||||||
|
border-radius: 50%;
|
||||||
|
background: radial-gradient(circle, rgba(156, 71, 41, 0.14), transparent 68%);
|
||||||
|
pointer-events: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero h1 {
|
||||||
|
font-size: clamp(2.6rem, 5vw, 4.6rem);
|
||||||
|
max-width: 12ch;
|
||||||
|
margin-top: 12px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.lede {
|
||||||
|
margin-top: 16px;
|
||||||
|
max-width: 70ch;
|
||||||
|
font-size: 1.05rem;
|
||||||
|
line-height: 1.65;
|
||||||
|
color: #2b2723;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero-grid,
|
||||||
|
.card-grid,
|
||||||
|
.two-col {
|
||||||
|
display: grid;
|
||||||
|
gap: 14px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero-grid {
|
||||||
|
margin-top: 24px;
|
||||||
|
grid-template-columns: repeat(4, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
|
||||||
|
.card-grid {
|
||||||
|
grid-template-columns: repeat(2, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
|
||||||
|
.two-col {
|
||||||
|
grid-template-columns: repeat(2, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
|
||||||
|
.metric,
|
||||||
|
.card,
|
||||||
|
.finding {
|
||||||
|
padding: 18px;
|
||||||
|
background: rgba(255, 255, 255, 0.68);
|
||||||
|
border: 1px solid var(--line);
|
||||||
|
border-radius: 18px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.metric .label {
|
||||||
|
color: var(--muted);
|
||||||
|
font-size: 0.82rem;
|
||||||
|
text-transform: uppercase;
|
||||||
|
letter-spacing: 0.08em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.metric .value {
|
||||||
|
margin-top: 8px;
|
||||||
|
font-size: 1.45rem;
|
||||||
|
font-weight: 700;
|
||||||
|
}
|
||||||
|
|
||||||
|
h2 {
|
||||||
|
font-size: 2rem;
|
||||||
|
margin-bottom: 16px;
|
||||||
|
}
|
||||||
|
|
||||||
|
h3 {
|
||||||
|
font-size: 1.3rem;
|
||||||
|
margin-bottom: 10px;
|
||||||
|
}
|
||||||
|
|
||||||
|
p {
|
||||||
|
margin: 0 0 14px;
|
||||||
|
line-height: 1.65;
|
||||||
|
}
|
||||||
|
|
||||||
|
ul,
|
||||||
|
ol {
|
||||||
|
margin: 0;
|
||||||
|
padding-left: 20px;
|
||||||
|
line-height: 1.65;
|
||||||
|
}
|
||||||
|
|
||||||
|
li + li {
|
||||||
|
margin-top: 8px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.badge-row {
|
||||||
|
display: flex;
|
||||||
|
flex-wrap: wrap;
|
||||||
|
gap: 8px;
|
||||||
|
margin: 18px 0 8px;
|
||||||
|
}
|
||||||
|
|
||||||
|
.badge {
|
||||||
|
display: inline-flex;
|
||||||
|
align-items: center;
|
||||||
|
gap: 8px;
|
||||||
|
padding: 8px 10px;
|
||||||
|
border-radius: 999px;
|
||||||
|
font-size: 0.82rem;
|
||||||
|
font-weight: 700;
|
||||||
|
border: 1px solid var(--line);
|
||||||
|
background: rgba(255, 255, 255, 0.68);
|
||||||
|
}
|
||||||
|
|
||||||
|
.badge.good {
|
||||||
|
color: var(--good);
|
||||||
|
}
|
||||||
|
|
||||||
|
.badge.warn {
|
||||||
|
color: var(--warn);
|
||||||
|
}
|
||||||
|
|
||||||
|
.badge.bad {
|
||||||
|
color: var(--bad);
|
||||||
|
}
|
||||||
|
|
||||||
|
.quote {
|
||||||
|
margin-top: 18px;
|
||||||
|
padding: 18px;
|
||||||
|
border-left: 4px solid var(--accent);
|
||||||
|
border-radius: 14px;
|
||||||
|
background: var(--accent-soft);
|
||||||
|
}
|
||||||
|
|
||||||
|
.severity {
|
||||||
|
display: inline-flex;
|
||||||
|
margin-bottom: 12px;
|
||||||
|
padding: 6px 10px;
|
||||||
|
border-radius: 999px;
|
||||||
|
font-size: 0.78rem;
|
||||||
|
font-weight: 700;
|
||||||
|
text-transform: uppercase;
|
||||||
|
letter-spacing: 0.08em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.severity.high {
|
||||||
|
background: rgba(140, 47, 37, 0.12);
|
||||||
|
color: var(--bad);
|
||||||
|
}
|
||||||
|
|
||||||
|
.severity.medium {
|
||||||
|
background: rgba(148, 98, 0, 0.12);
|
||||||
|
color: var(--warn);
|
||||||
|
}
|
||||||
|
|
||||||
|
.severity.low {
|
||||||
|
background: rgba(47, 106, 66, 0.12);
|
||||||
|
color: var(--good);
|
||||||
|
}
|
||||||
|
|
||||||
|
.ref {
|
||||||
|
color: var(--muted);
|
||||||
|
font-size: 0.82rem;
|
||||||
|
line-height: 1.5;
|
||||||
|
}
|
||||||
|
|
||||||
|
@media (max-width: 980px) {
|
||||||
|
.shell {
|
||||||
|
grid-template-columns: 1fr;
|
||||||
|
}
|
||||||
|
|
||||||
|
.nav {
|
||||||
|
position: static;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero-grid,
|
||||||
|
.card-grid,
|
||||||
|
.two-col {
|
||||||
|
grid-template-columns: 1fr;
|
||||||
|
}
|
||||||
|
|
||||||
|
.hero h1 {
|
||||||
|
max-width: 100%;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
</style>
|
||||||
|
</head>
|
||||||
|
<body>
|
||||||
|
<div class="shell">
|
||||||
|
<aside class="panel nav">
|
||||||
|
<div class="eyebrow">Maintainer Report</div>
|
||||||
|
<h1>Report Title</h1>
|
||||||
|
<p>Replace this with a concise description of what the report covers.</p>
|
||||||
|
<ul>
|
||||||
|
<li><a href="#summary">Summary</a></li>
|
||||||
|
<li><a href="#tutorial">Tutorial</a></li>
|
||||||
|
<li><a href="#findings">Findings</a></li>
|
||||||
|
<li><a href="#recommendation">Recommendation</a></li>
|
||||||
|
</ul>
|
||||||
|
<div class="meta-block">
|
||||||
|
Replace with project metadata, review date, or scope notes.
|
||||||
|
</div>
|
||||||
|
</aside>
|
||||||
|
|
||||||
|
<main>
|
||||||
|
<section class="panel hero" id="summary">
|
||||||
|
<div class="eyebrow">Executive Summary</div>
|
||||||
|
<h1>Use the hero for the clearest one-line judgment.</h1>
|
||||||
|
<p class="lede">
|
||||||
|
Replace this with the short explanation of what the contribution does, why it matters,
|
||||||
|
and what the core maintainer question is.
|
||||||
|
</p>
|
||||||
|
<div class="badge-row">
|
||||||
|
<span class="badge good">Strength</span>
|
||||||
|
<span class="badge warn">Tradeoff</span>
|
||||||
|
<span class="badge bad">Risk</span>
|
||||||
|
</div>
|
||||||
|
<div class="hero-grid">
|
||||||
|
<div class="metric">
|
||||||
|
<div class="label">Overall Call</div>
|
||||||
|
<div class="value">Placeholder</div>
|
||||||
|
</div>
|
||||||
|
<div class="metric">
|
||||||
|
<div class="label">Main Concern</div>
|
||||||
|
<div class="value">Placeholder</div>
|
||||||
|
</div>
|
||||||
|
<div class="metric">
|
||||||
|
<div class="label">Best Part</div>
|
||||||
|
<div class="value">Placeholder</div>
|
||||||
|
</div>
|
||||||
|
<div class="metric">
|
||||||
|
<div class="label">Weakest Part</div>
|
||||||
|
<div class="value">Placeholder</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="quote">
|
||||||
|
Use this block for the thesis, a sharp takeaway, or a key cited point.
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<section class="panel" id="tutorial">
|
||||||
|
<h2>Tutorial Section</h2>
|
||||||
|
<div class="two-col">
|
||||||
|
<div class="card">
|
||||||
|
<h3>Concept Card</h3>
|
||||||
|
<p>Use cards for mental models, subsystems, or comparison slices.</p>
|
||||||
|
<div class="ref">path/to/file.ts:10</div>
|
||||||
|
</div>
|
||||||
|
<div class="card">
|
||||||
|
<h3>Second Card</h3>
|
||||||
|
<p>Keep cards fairly dense. This template is about style, not fixed structure.</p>
|
||||||
|
<div class="ref">path/to/file.ts:20</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<section class="panel" id="findings">
|
||||||
|
<h2>Findings</h2>
|
||||||
|
<article class="finding">
|
||||||
|
<div class="severity high">High</div>
|
||||||
|
<h3>Finding Title</h3>
|
||||||
|
<p>Use findings for the sharpest judgment calls and risks.</p>
|
||||||
|
<div class="ref">path/to/file.ts:30</div>
|
||||||
|
</article>
|
||||||
|
</section>
|
||||||
|
|
||||||
|
<section class="panel" id="recommendation">
|
||||||
|
<h2>Recommendation</h2>
|
||||||
|
<div class="card-grid">
|
||||||
|
<div class="card">
|
||||||
|
<h3>Path Forward</h3>
|
||||||
|
<p>Use this area for merge guidance, salvage plan, or rollout advice.</p>
|
||||||
|
</div>
|
||||||
|
<div class="card">
|
||||||
|
<h3>What To Keep</h3>
|
||||||
|
<p>Call out the parts worth preserving even if the whole proposal should not land.</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
</main>
|
||||||
|
</div>
|
||||||
|
</body>
|
||||||
|
</html>
|
||||||
149
.agents/skills/pr-report/references/style-guide.md
Normal file
@@ -0,0 +1,149 @@
|
|||||||
|
# PR Report Style Guide
|
||||||
|
|
||||||
|
Use this guide when the user wants a report artifact, especially a webpage.
|
||||||
|
|
||||||
|
## Goal
|
||||||
|
|
||||||
|
Make the report feel like an editorial review, not an internal admin dashboard.
|
||||||
|
The page should make a long technical argument easy to scan without looking
|
||||||
|
generic or overdesigned.
|
||||||
|
|
||||||
|
## Visual Direction
|
||||||
|
|
||||||
|
Preferred tone:
|
||||||
|
|
||||||
|
- editorial
|
||||||
|
- warm
|
||||||
|
- serious
|
||||||
|
- high-contrast
|
||||||
|
- handcrafted, not corporate SaaS
|
||||||
|
|
||||||
|
Avoid:
|
||||||
|
|
||||||
|
- default app-shell layouts
|
||||||
|
- purple gradients on white
|
||||||
|
- generic card dashboards
|
||||||
|
- cramped pages with weak hierarchy
|
||||||
|
- novelty fonts that hurt readability
|
||||||
|
|
||||||
|
## Typography
|
||||||
|
|
||||||
|
Recommended pattern:
|
||||||
|
|
||||||
|
- one expressive serif or display face for major headings
|
||||||
|
- one sturdy sans-serif for body copy and UI labels
|
||||||
|
|
||||||
|
Good combinations:
|
||||||
|
|
||||||
|
- Newsreader + IBM Plex Sans
|
||||||
|
- Source Serif 4 + Instrument Sans
|
||||||
|
- Fraunces + Public Sans
|
||||||
|
- Libre Baskerville + Work Sans
|
||||||
|
|
||||||
|
Rules:
|
||||||
|
|
||||||
|
- headings should feel deliberate and large
|
||||||
|
- body copy should stay comfortable for long reading
|
||||||
|
- reference labels and badges should use smaller dense sans text
|
||||||
|
|
||||||
|
## Layout
|
||||||
|
|
||||||
|
Recommended structure:
|
||||||
|
|
||||||
|
- a sticky side or top navigation for long reports
|
||||||
|
- one strong hero summary at the top
|
||||||
|
- panel or paper-like sections for each major topic
|
||||||
|
- multi-column card grids for comparisons and strengths
|
||||||
|
- single-column body text for findings and recommendations
|
||||||
|
|
||||||
|
Use generous spacing. Long-form technical reports need breathing room.
|
||||||
|
|
||||||
|
## Color
|
||||||
|
|
||||||
|
Prefer muted paper-like backgrounds with one warm accent and one cool counterweight.
|
||||||
|
|
||||||
|
Suggested token categories:
|
||||||
|
|
||||||
|
- `--bg`
|
||||||
|
- `--paper`
|
||||||
|
- `--ink`
|
||||||
|
- `--muted`
|
||||||
|
- `--line`
|
||||||
|
- `--accent`
|
||||||
|
- `--good`
|
||||||
|
- `--warn`
|
||||||
|
- `--bad`
|
||||||
|
|
||||||
|
The accent should highlight navigation, badges, and important labels. Do not
|
||||||
|
let accent colors dominate body text.
|
||||||
|
|
||||||
|
## Useful UI Elements
|
||||||
|
|
||||||
|
Include small reusable styles for:
|
||||||
|
|
||||||
|
- summary metrics
|
||||||
|
- badges
|
||||||
|
- quotes or callouts
|
||||||
|
- finding cards
|
||||||
|
- severity labels
|
||||||
|
- reference labels
|
||||||
|
- comparison cards
|
||||||
|
- responsive two-column sections
|
||||||
|
|
||||||
|
## Motion
|
||||||
|
|
||||||
|
Keep motion restrained.
|
||||||
|
|
||||||
|
Good:
|
||||||
|
|
||||||
|
- soft fade/slide-in on first load
|
||||||
|
- hover response on nav items or cards
|
||||||
|
|
||||||
|
Bad:
|
||||||
|
|
||||||
|
- constant animation
|
||||||
|
- floating blobs
|
||||||
|
- decorative motion with no reading benefit
|
||||||
|
|
||||||
|
## Content Presentation
|
||||||
|
|
||||||
|
Even when the user wants design polish, clarity stays primary.
|
||||||
|
|
||||||
|
Good structure for long reports:
|
||||||
|
|
||||||
|
1. executive summary
|
||||||
|
2. what changed
|
||||||
|
3. tutorial explanation
|
||||||
|
4. strengths
|
||||||
|
5. findings
|
||||||
|
6. comparisons
|
||||||
|
7. recommendation
|
||||||
|
|
||||||
|
The exact headings can change. The important thing is to separate explanation
|
||||||
|
from judgment.
|
||||||
|
|
||||||
|
## References
|
||||||
|
|
||||||
|
Reference labels should be visually quiet but easy to spot.
|
||||||
|
|
||||||
|
Good pattern:
|
||||||
|
|
||||||
|
- small muted text
|
||||||
|
- monospace or compact sans
|
||||||
|
- keep them close to the paragraph they support
|
||||||
|
|
||||||
|
## Starter Usage
|
||||||
|
|
||||||
|
If you need a fast polished base, start from:
|
||||||
|
|
||||||
|
- `assets/html-report-starter.html`
|
||||||
|
|
||||||
|
Customize:
|
||||||
|
|
||||||
|
- fonts
|
||||||
|
- color tokens
|
||||||
|
- hero copy
|
||||||
|
- section ordering
|
||||||
|
- card density
|
||||||
|
|
||||||
|
Do not preserve the placeholder sections if they do not fit the actual report.
|
||||||
192
.agents/skills/release-changelog/SKILL.md
Normal file
@@ -0,0 +1,192 @@
|
|||||||
|
---
|
||||||
|
name: release-changelog
|
||||||
|
description: >
|
||||||
|
Generate the stable Paperclip release changelog at releases/vYYYY.MDD.P.md by
|
||||||
|
reading commits, changesets, and merged PR context since the last stable tag.
|
||||||
|
---
|
||||||
|
|
||||||
|
# Release Changelog Skill
|
||||||
|
|
||||||
|
Generate the user-facing changelog for the **stable** Paperclip release.
|
||||||
|
|
||||||
|
## Versioning Model
|
||||||
|
|
||||||
|
Paperclip uses **calendar versioning (calver)**:
|
||||||
|
|
||||||
|
- Stable releases: `YYYY.MDD.P` (e.g. `2026.318.0`)
|
||||||
|
- Canary releases: `YYYY.MDD.P-canary.N` (e.g. `2026.318.1-canary.0`)
|
||||||
|
- Git tags: `vYYYY.MDD.P` for stable, `canary/vYYYY.MDD.P-canary.N` for canary
|
||||||
|
|
||||||
|
There are no major/minor/patch bumps. The stable version is derived from the
|
||||||
|
intended release date (UTC) plus the next same-day stable patch slot.
|
||||||
|
|
||||||
|
Output:
|
||||||
|
|
||||||
|
- `releases/vYYYY.MDD.P.md`
|
||||||
|
|
||||||
|
Important rules:
|
||||||
|
|
||||||
|
- even if there are canary releases such as `2026.318.1-canary.0`, the changelog file stays `releases/v2026.318.1.md`
|
||||||
|
- do not derive versions from semver bump types
|
||||||
|
- do not create canary changelog files
|
||||||
|
|
||||||
|
## Step 0 — Idempotency Check
|
||||||
|
|
||||||
|
Before generating anything, check whether the file already exists:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
ls releases/vYYYY.MDD.P.md 2>/dev/null
|
||||||
|
```
|
||||||
|
|
||||||
|
If it exists:
|
||||||
|
|
||||||
|
1. read it first
|
||||||
|
2. present it to the reviewer
|
||||||
|
3. ask whether to keep it, regenerate it, or update specific sections
|
||||||
|
4. never overwrite it silently
|
||||||
|
|
||||||
|
## Step 1 — Determine the Stable Range
|
||||||
|
|
||||||
|
Find the last stable tag:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git tag --list 'v*' --sort=-version:refname | head -1
|
||||||
|
git log v{last}..HEAD --oneline --no-merges
|
||||||
|
```
|
||||||
|
|
||||||
|
The stable version comes from one of:
|
||||||
|
|
||||||
|
- an explicit maintainer request
|
||||||
|
- `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
|
||||||
|
- the release plan already agreed in `doc/RELEASING.md`
|
||||||
|
|
||||||
|
Do not derive the changelog version from a canary tag or prerelease suffix.
|
||||||
|
Do not derive major/minor/patch bumps from API intent — calver uses the date and same-day stable slot.
|
||||||
|
|
||||||
|
## Step 2 — Gather the Raw Inputs
|
||||||
|
|
||||||
|
Collect release data from:
|
||||||
|
|
||||||
|
1. git commits since the last stable tag
|
||||||
|
2. `.changeset/*.md` files
|
||||||
|
3. merged PRs via `gh` when available
|
||||||
|
|
||||||
|
Useful commands:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git log v{last}..HEAD --oneline --no-merges
|
||||||
|
git log v{last}..HEAD --format="%H %s" --no-merges
|
||||||
|
ls .changeset/*.md | grep -v README.md
|
||||||
|
gh pr list --state merged --search "merged:>={last-tag-date}" --json number,title,body,labels
|
||||||
|
```
|
||||||
|
|
||||||
|
## Step 3 — Detect Breaking Changes
|
||||||
|
|
||||||
|
Look for:
|
||||||
|
|
||||||
|
- destructive migrations
|
||||||
|
- removed or changed API fields/endpoints
|
||||||
|
- renamed or removed config keys
|
||||||
|
- `BREAKING:` or `BREAKING CHANGE:` commit signals
|
||||||
|
|
||||||
|
Key commands:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git diff --name-only v{last}..HEAD -- packages/db/src/migrations/
|
||||||
|
git diff v{last}..HEAD -- packages/db/src/schema/
|
||||||
|
git diff v{last}..HEAD -- server/src/routes/ server/src/api/
|
||||||
|
git log v{last}..HEAD --format="%s" | rg -n 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
|
||||||
|
```
|
||||||
|
|
||||||
|
If breaking changes are detected, flag them prominently — they must appear in the
|
||||||
|
Breaking Changes section with an upgrade path.
|
||||||
|
|
||||||
|
## Step 4 — Categorize for Users
|
||||||
|
|
||||||
|
Use these stable changelog sections:
|
||||||
|
|
||||||
|
- `Breaking Changes`
|
||||||
|
- `Highlights`
|
||||||
|
- `Improvements`
|
||||||
|
- `Fixes`
|
||||||
|
- `Upgrade Guide` when needed
|
||||||
|
|
||||||
|
Exclude purely internal refactors, CI changes, and docs-only work unless they materially affect users.
|
||||||
|
|
||||||
|
Guidelines:
|
||||||
|
|
||||||
|
- group related commits into one user-facing entry
|
||||||
|
- write from the user perspective
|
||||||
|
- keep highlights short and concrete
|
||||||
|
- spell out upgrade actions for breaking changes
|
||||||
|
|
||||||
|
### Inline PR and contributor attribution
|
||||||
|
|
||||||
|
When a bullet item clearly maps to a merged pull request, add inline attribution at the
|
||||||
|
end of the entry in this format:
|
||||||
|
|
||||||
|
```
|
||||||
|
- **Feature name** — Description. ([#123](https://github.com/paperclipai/paperclip/pull/123), @contributor1, @contributor2)
|
||||||
|
```
|
||||||
|
|
||||||
|
Rules:
|
||||||
|
|
||||||
|
- Only add a PR link when you can confidently trace the bullet to a specific merged PR.
|
||||||
|
Use merge commit messages (`Merge pull request #N from user/branch`) to map PRs, as in the sketch after this list.
|
||||||
|
- List the contributor(s) who authored the PR. Use GitHub usernames, not real names or emails.
|
||||||
|
- If multiple PRs contributed to a single bullet, list them all: `([#10](url), [#12](url), @user1, @user2)`.
|
||||||
|
- If you cannot determine the PR number or contributor with confidence, omit the attribution
|
||||||
|
parenthetical — do not guess.
|
||||||
|
- Core maintainer commits that don't have an external PR can omit the parenthetical.
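A minimal mapping sketch for that rule (best-effort only; drop anything ambiguous rather than guessing):

```bash
# List merged PR numbers and the fork owner from merge commit subjects.
git log v{last}..HEAD --merges --format="%s" |
  sed -n 's/^Merge pull request \(#[0-9]*\) from \([^/]*\).*/\1 \2/p'
```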
|
||||||
|
|
||||||
|
## Step 5 — Write the File
|
||||||
|
|
||||||
|
Template:
|
||||||
|
|
||||||
|
```markdown
|
||||||
|
# vYYYY.MDD.P
|
||||||
|
|
||||||
|
> Released: YYYY-MM-DD
|
||||||
|
|
||||||
|
## Breaking Changes
|
||||||
|
|
||||||
|
## Highlights
|
||||||
|
|
||||||
|
## Improvements
|
||||||
|
|
||||||
|
## Fixes
|
||||||
|
|
||||||
|
## Upgrade Guide
|
||||||
|
|
||||||
|
## Contributors
|
||||||
|
|
||||||
|
Thank you to everyone who contributed to this release!
|
||||||
|
|
||||||
|
@username1, @username2, @username3
|
||||||
|
```
|
||||||
|
|
||||||
|
Omit empty sections except `Highlights`, `Improvements`, and `Fixes`, which should usually exist.
|
||||||
|
|
||||||
|
The `Contributors` section should always be included. List every person who authored
|
||||||
|
commits in the release range, @-mentioning them by their **GitHub username** (not their
|
||||||
|
real name or email). To find GitHub usernames:
|
||||||
|
|
||||||
|
1. Extract usernames from merge commit messages: `git log v{last}..HEAD --oneline --merges` — the branch prefix (e.g. `from username/branch`) gives the GitHub username.
|
||||||
|
2. For noreply emails like `user@users.noreply.github.com`, the username is the part before `@`; for the newer `12345+user@users.noreply.github.com` form, it is the part after the `+`.
|
||||||
|
3. For contributors whose username is ambiguous, check `gh api users/{guess}` or the PR page.
|
||||||
|
|
||||||
|
**Never expose contributor email addresses.** Use `@username` only.
|
||||||
|
|
||||||
|
Exclude bot accounts (e.g. `lockfile-bot`, `dependabot`) from the list. List contributors
|
||||||
|
in alphabetical order by GitHub username (case-insensitive).
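A best-effort starting point for that list (both commands are sketches; ambiguous results still go through the `gh api users/{guess}` check above):

```bash
# Usernames from merge commit subjects ("... from username/branch").
git log v{last}..HEAD --merges --format="%s" |
  sed -n 's/^Merge pull request #[0-9]* from \([^/]*\)\/.*/\1/p' | sort -fu
# Usernames from GitHub noreply author emails.
git log v{last}..HEAD --format="%ae" | grep 'users\.noreply\.github\.com' |
  sed -E 's/^([0-9]+\+)?([^@]+)@.*/\2/' | sort -fu
```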
|
||||||
|
|
||||||
|
## Step 6 — Review Before Release
|
||||||
|
|
||||||
|
Before handing it off:
|
||||||
|
|
||||||
|
1. confirm the heading is the stable version only
|
||||||
|
2. confirm there is no `-canary` language in the title or filename
|
||||||
|
3. confirm any breaking changes have an upgrade path
|
||||||
|
4. present the draft for human sign-off
|
||||||
|
|
||||||
|
This skill never publishes anything. It only prepares the stable changelog artifact.
|
||||||
247
.agents/skills/release/SKILL.md
Normal file
@@ -0,0 +1,247 @@
|
|||||||
|
---
|
||||||
|
name: release
|
||||||
|
description: >
|
||||||
|
Coordinate a full Paperclip release across engineering verification, npm,
|
||||||
|
GitHub, smoke testing, and announcement follow-up. Use when leadership asks
|
||||||
|
to ship a release, not merely to discuss versioning.
|
||||||
|
---
|
||||||
|
|
||||||
|
# Release Coordination Skill
|
||||||
|
|
||||||
|
Run the full Paperclip maintainer release workflow, not just an npm publish.
|
||||||
|
|
||||||
|
This skill coordinates:
|
||||||
|
|
||||||
|
- stable changelog drafting via `release-changelog`
|
||||||
|
- canary verification and publish status from `master`
|
||||||
|
- Docker smoke testing via `scripts/docker-onboard-smoke.sh`
|
||||||
|
- manual stable promotion from a chosen source ref
|
||||||
|
- GitHub Release creation
|
||||||
|
- website / announcement follow-up tasks
|
||||||
|
|
||||||
|
## Trigger
|
||||||
|
|
||||||
|
Use this skill when leadership asks for:
|
||||||
|
|
||||||
|
- "do a release"
|
||||||
|
- "ship the release"
|
||||||
|
- "promote this canary to stable"
|
||||||
|
- "cut the stable release"
|
||||||
|
|
||||||
|
## Preconditions
|
||||||
|
|
||||||
|
Before proceeding, verify all of the following:
|
||||||
|
|
||||||
|
1. `.agents/skills/release-changelog/SKILL.md` exists and is usable.
|
||||||
|
2. The repo working tree is clean, including untracked files.
|
||||||
|
3. There is at least one canary or candidate commit since the last stable tag.
|
||||||
|
4. The candidate SHA has passed the verification gate or is about to.
|
||||||
|
5. If manifests changed, the CI-owned `pnpm-lock.yaml` refresh is already merged on `master`.
|
||||||
|
6. npm publish rights are available through GitHub trusted publishing, or through local npm auth for emergency/manual use.
|
||||||
|
7. If running through Paperclip, you have issue context for status updates and follow-up task creation.
|
||||||
|
|
||||||
|
If any precondition fails, stop and report the blocker.
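A few of the preconditions can be spot-checked quickly (not a substitute for the full list):

```bash
test -f .agents/skills/release-changelog/SKILL.md || echo "missing release-changelog skill"
git status --porcelain     # must print nothing, including untracked files
git log "$(git tag --list 'v*' --sort=-version:refname | head -1)"..HEAD --oneline | head -5
```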
|
||||||
|
|
||||||
|
## Inputs
|
||||||
|
|
||||||
|
Collect these inputs up front:
|
||||||
|
|
||||||
|
- whether the target is a canary check or a stable promotion
|
||||||
|
- the candidate `source_ref` for stable
|
||||||
|
- whether the stable run is dry-run or live
|
||||||
|
- release issue / company context for website and announcement follow-up
|
||||||
|
|
||||||
|
## Step 0 — Release Model
|
||||||
|
|
||||||
|
Paperclip now uses a commit-driven release model:
|
||||||
|
|
||||||
|
1. every push to `master` publishes a canary automatically
|
||||||
|
2. canaries use `YYYY.MDD.P-canary.N`
|
||||||
|
3. stable releases use `YYYY.MDD.P`
|
||||||
|
4. the middle slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
|
||||||
|
5. the stable patch slot increments when more than one stable ships on the same UTC date
|
||||||
|
6. stable releases are manually promoted from a chosen tested commit or canary source commit
|
||||||
|
7. only stable releases get `releases/vYYYY.MDD.P.md`, git tag `vYYYY.MDD.P`, and a GitHub Release
|
||||||
|
|
||||||
|
Critical consequences:
|
||||||
|
|
||||||
|
- do not use release branches as the default path
|
||||||
|
- do not derive major/minor/patch bumps
|
||||||
|
- do not create canary changelog files
|
||||||
|
- do not create canary GitHub Releases
|
||||||
|
|
||||||
|
## Step 1 — Choose the Candidate
|
||||||
|
|
||||||
|
For canary validation:
|
||||||
|
|
||||||
|
- inspect the latest successful canary run on `master`
|
||||||
|
- record the canary version and source SHA
|
||||||
|
|
||||||
|
For stable promotion:
|
||||||
|
|
||||||
|
1. choose the tested source ref
|
||||||
|
2. confirm it is the exact SHA you want to promote
|
||||||
|
3. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
|
||||||
|
|
||||||
|
Useful commands:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
git tag --list 'v*' --sort=-version:refname | head -1
|
||||||
|
git log v{last}..HEAD --oneline --no-merges
|
||||||
|
npm view paperclipai@canary version
|
||||||
|
```
|
||||||
|
|
||||||
|
## Step 2 — Draft the Stable Changelog
|
||||||
|
|
||||||
|
Stable changelog files live at:
|
||||||
|
|
||||||
|
- `releases/vYYYY.MDD.P.md`
|
||||||
|
|
||||||
|
Invoke `release-changelog` and generate or update the stable notes only.
|
||||||
|
|
||||||
|
Rules:
|
||||||
|
|
||||||
|
- review the draft with a human before publish
|
||||||
|
- preserve manual edits if the file already exists
|
||||||
|
- keep the filename stable-only
|
||||||
|
- do not create a canary changelog file
|
||||||
|
|
||||||
|
## Step 3 — Verify the Candidate SHA
|
||||||
|
|
||||||
|
Run the standard gate:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
pnpm -r typecheck
|
||||||
|
pnpm test:run
|
||||||
|
pnpm build
|
||||||
|
```
|
||||||
|
|
||||||
|
If the GitHub release workflow will run the publish, it can rerun this gate. Still report local status if you checked it.
|
||||||
|
|
||||||
|
For PRs that touch release logic, the repo also runs a canary release dry-run in CI. That is a release-specific guard, not a substitute for the standard gate.
|
||||||
|
|
||||||
|
## Step 4 — Validate the Canary
|
||||||
|
|
||||||
|
The normal canary path is automatic from `master` via:
|
||||||
|
|
||||||
|
- `.github/workflows/release.yml`
|
||||||
|
|
||||||
|
Confirm:
|
||||||
|
|
||||||
|
1. verification passed
|
||||||
|
2. npm canary publish succeeded
|
||||||
|
3. git tag `canary/vYYYY.MDD.P-canary.N` exists
|
||||||
|
|
||||||
|
Useful checks:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
npm view paperclipai@canary version
|
||||||
|
git tag --list 'canary/v*' --sort=-version:refname | head -5
|
||||||
|
```
|
||||||
|
|
||||||
|
## Step 5 — Smoke Test the Canary
|
||||||
|
|
||||||
|
Run:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
|
||||||
|
```
|
||||||
|
|
||||||
|
Useful isolated variant:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
|
||||||
|
```
|
||||||
|
|
||||||
|
Confirm:
|
||||||
|
|
||||||
|
1. install succeeds
|
||||||
|
2. onboarding completes without crashes
|
||||||
|
3. the server boots
|
||||||
|
4. the UI loads
|
||||||
|
5. basic company creation and dashboard load work
|
||||||
|
|
||||||
|
If smoke testing fails:
|
||||||
|
|
||||||
|
- stop the stable release
|
||||||
|
- fix the issue on `master`
|
||||||
|
- wait for the next automatic canary
|
||||||
|
- rerun smoke testing
|
||||||
|
|
||||||
|
## Step 6 — Preview or Publish Stable
|
||||||
|
|
||||||
|
The normal stable path is manual `workflow_dispatch` on:
|
||||||
|
|
||||||
|
- `.github/workflows/release.yml`
|
||||||
|
|
||||||
|
Inputs:
|
||||||
|
|
||||||
|
- `source_ref`
|
||||||
|
- `stable_date`
|
||||||
|
- `dry_run`
|
||||||
|
|
||||||
|
Before live stable:
|
||||||
|
|
||||||
|
1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
|
||||||
|
2. ensure `releases/vYYYY.MDD.P.md` exists on the source ref
|
||||||
|
3. run the stable workflow in dry-run mode first when practical
|
||||||
|
4. then run the real stable publish
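If the stable workflow is dispatched from the CLI instead of the GitHub UI, the call might look like this (all input values are placeholders):

```bash
gh workflow run release.yml \
  --ref master \
  -f source_ref=<tested-sha> \
  -f stable_date=2026-03-18 \
  -f dry_run=true
```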
|
||||||
|
|
||||||
|
The stable workflow:
|
||||||
|
|
||||||
|
- re-verifies the exact source ref
|
||||||
|
- computes the next stable patch slot for the chosen UTC date
|
||||||
|
- publishes `YYYY.MDD.P` under dist-tag `latest`
|
||||||
|
- creates git tag `vYYYY.MDD.P`
|
||||||
|
- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`
|
||||||
|
|
||||||
|
Local emergency/manual commands:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
./scripts/release.sh stable --dry-run
|
||||||
|
./scripts/release.sh stable
|
||||||
|
git push public-gh refs/tags/vYYYY.MDD.P
|
||||||
|
./scripts/create-github-release.sh YYYY.MDD.P
|
||||||
|
```
|
||||||
|
|
||||||
|
## Step 7 — Finish the Other Surfaces
|
||||||
|
|
||||||
|
Create or verify follow-up work for:
|
||||||
|
|
||||||
|
- website changelog publishing
|
||||||
|
- launch post / social announcement
|
||||||
|
- release summary in Paperclip issue context
|
||||||
|
|
||||||
|
These should reference the stable release, not the canary.
|
||||||
|
|
||||||
|
## Failure Handling
|
||||||
|
|
||||||
|
If the canary is bad:
|
||||||
|
|
||||||
|
- publish another canary, do not ship stable
|
||||||
|
|
||||||
|
If stable npm publish succeeds but tag push or GitHub release creation fails:
|
||||||
|
|
||||||
|
- fix the git/GitHub issue immediately from the same release result
|
||||||
|
- do not republish the same version
|
||||||
|
|
||||||
|
If `latest` is bad after stable publish:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
./scripts/rollback-latest.sh <last-good-version>
|
||||||
|
```
|
||||||
|
|
||||||
|
Then fix forward with a new stable release.
|
||||||
|
|
||||||
|
## Output
|
||||||
|
|
||||||
|
When the skill completes, provide:
|
||||||
|
|
||||||
|
- candidate SHA and tested canary version, if relevant
|
||||||
|
- stable version, if promoted
|
||||||
|
- verification status
|
||||||
|
- npm status
|
||||||
|
- smoke-test status
|
||||||
|
- git tag / GitHub Release status
|
||||||
|
- website / announcement follow-up status
|
||||||
|
- rollback recommendation if anything is still partially complete
|
||||||
@@ -1,8 +0,0 @@
|
|||||||
# Changesets
|
|
||||||
|
|
||||||
Hello and welcome! This folder has been automatically generated by `@changesets/cli`, a build tool that works
|
|
||||||
with multi-package repos, or single-package repos to help you version and publish your code. You can
|
|
||||||
find the full documentation for it [in our repository](https://github.com/changesets/changesets).
|
|
||||||
|
|
||||||
We have a quick list of common questions to get you started engaging with this project in
|
|
||||||
[our documentation](https://github.com/changesets/changesets/blob/main/docs/common-questions.md).
|
|
||||||
@@ -1,11 +0,0 @@
|
|||||||
{
|
|
||||||
"$schema": "https://unpkg.com/@changesets/config@3.1.3/schema.json",
|
|
||||||
"changelog": "@changesets/cli/changelog",
|
|
||||||
"commit": false,
|
|
||||||
"fixed": [["@paperclipai/*", "paperclipai"]],
|
|
||||||
"linked": [],
|
|
||||||
"access": "public",
|
|
||||||
"baseBranch": "master",
|
|
||||||
"updateInternalDependencies": "patch",
|
|
||||||
"ignore": ["@paperclipai/ui"]
|
|
||||||
}
|
|
||||||
1
.claude/skills/company-creator
Symbolic link
@@ -0,0 +1 @@
|
|||||||
|
../../.agents/skills/company-creator
|
||||||
10
.github/CODEOWNERS
vendored
Normal file
@@ -0,0 +1,10 @@
|
|||||||
|
# Replace @cryppadotta if a different maintainer or team should own release infrastructure.
|
||||||
|
|
||||||
|
.github/** @cryppadotta @devinfoley
|
||||||
|
scripts/release*.sh @cryppadotta @devinfoley
|
||||||
|
scripts/release-*.mjs @cryppadotta @devinfoley
|
||||||
|
scripts/create-github-release.sh @cryppadotta @devinfoley
|
||||||
|
scripts/rollback-latest.sh @cryppadotta @devinfoley
|
||||||
|
doc/RELEASING.md @cryppadotta @devinfoley
|
||||||
|
doc/PUBLISHING.md @cryppadotta @devinfoley
|
||||||
|
doc/RELEASE-AUTOMATION-SETUP.md @cryppadotta @devinfoley
|
||||||
49
.github/PULL_REQUEST_TEMPLATE.md
vendored
Normal file
@@ -0,0 +1,49 @@
|
|||||||
|
## Thinking Path
|
||||||
|
|
||||||
|
<!--
|
||||||
|
Required. Trace your reasoning from the top of the project down to this
|
||||||
|
specific change. Start with what Paperclip is, then narrow through the
|
||||||
|
subsystem, the problem, and why this PR exists. Use blockquote style.
|
||||||
|
Aim for 5–8 steps. See CONTRIBUTING.md for full examples.
|
||||||
|
-->
|
||||||
|
|
||||||
|
> - Paperclip orchestrates AI agents for zero-human companies
|
||||||
|
> - [Which subsystem or capability is involved]
|
||||||
|
> - [What problem or gap exists]
|
||||||
|
> - [Why it needs to be addressed]
|
||||||
|
> - This pull request ...
|
||||||
|
> - The benefit is ...
|
||||||
|
|
||||||
|
## What Changed
|
||||||
|
|
||||||
|
<!-- Bullet list of concrete changes. One bullet per logical unit. -->
|
||||||
|
|
||||||
|
-
|
||||||
|
|
||||||
|
## Verification
|
||||||
|
|
||||||
|
<!--
|
||||||
|
How can a reviewer confirm this works? Include test commands, manual
|
||||||
|
steps, or both. For UI changes, include before/after screenshots.
|
||||||
|
-->
|
||||||
|
|
||||||
|
-
|
||||||
|
|
||||||
|
## Risks
|
||||||
|
|
||||||
|
<!--
|
||||||
|
What could go wrong? Mention migration safety, breaking changes,
|
||||||
|
behavioral shifts, or "Low risk" if genuinely minor.
|
||||||
|
-->
|
||||||
|
|
||||||
|
-
|
||||||
|
|
||||||
|
## Checklist
|
||||||
|
|
||||||
|
- [ ] I have included a thinking path that traces from project context to this change
|
||||||
|
- [ ] I have run tests locally and they pass
|
||||||
|
- [ ] I have added or updated tests where applicable
|
||||||
|
- [ ] If this change affects the UI, I have included before/after screenshots
|
||||||
|
- [ ] I have updated relevant documentation to reflect my changes
|
||||||
|
- [ ] I have considered and documented any risks above
|
||||||
|
- [ ] I will address all Greptile and reviewer comments before requesting merge
|
||||||
55 .github/workflows/docker.yml vendored Normal file
@@ -0,0 +1,55 @@
name: Docker

on:
  push:
    branches:
      - "master"
    tags:
      - "v*"

permissions:
  contents: read
  packages: write

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    concurrency:
      group: docker-${{ github.ref }}
      cancel-in-progress: true
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Login to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/${{ github.repository }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=sha

      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          context: .
          platforms: linux/amd64,linux/arm64
          push: true
          cache-from: type=gha
          cache-to: type=gha,mode=max
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
44 .github/workflows/e2e.yml vendored Normal file
@@ -0,0 +1,44 @@
name: E2E Tests

on:
  workflow_dispatch:
    inputs:
      skip_llm:
        description: "Skip LLM-dependent assertions (default: true)"
        type: boolean
        default: true

jobs:
  e2e:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    env:
      PAPERCLIP_E2E_SKIP_LLM: ${{ inputs.skip_llm && 'true' || 'false' }}
      ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
    steps:
      - uses: actions/checkout@v4

      - uses: pnpm/action-setup@v4
        with:
          version: 9

      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm

      - run: pnpm install --frozen-lockfile
      - run: pnpm build
      - run: npx playwright install --with-deps chromium

      - name: Run e2e tests
        run: pnpm run test:e2e

      - uses: actions/upload-artifact@v4
        if: always()
        with:
          name: playwright-report
          path: |
            tests/e2e/playwright-report/
            tests/e2e/test-results/
          retention-days: 14
146 .github/workflows/pr.yml vendored Normal file
@@ -0,0 +1,146 @@
name: PR

on:
  pull_request:
    branches:
      - master

concurrency:
  group: pr-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  policy:
    runs-on: ubuntu-latest
    timeout-minutes: 5

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Block manual lockfile edits
        if: github.head_ref != 'chore/refresh-lockfile'
        run: |
          changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
          if printf '%s\n' "$changed" | grep -qx 'pnpm-lock.yaml'; then
            echo "Do not commit pnpm-lock.yaml in pull requests. CI owns lockfile updates."
            exit 1
          fi

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4
          run_install: false

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24

      - name: Validate dependency resolution when manifests change
        run: |
          changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"
          manifest_pattern='(^|/)package\.json$|^pnpm-workspace\.yaml$|^\.npmrc$|^pnpmfile\.(cjs|js|mjs)$'
          if printf '%s\n' "$changed" | grep -Eq "$manifest_pattern"; then
            pnpm install --lockfile-only --ignore-scripts --no-frozen-lockfile
          fi

  verify:
    needs: [policy]
    runs-on: ubuntu-latest
    timeout-minutes: 20

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Typecheck
        run: pnpm -r typecheck

      - name: Run tests
        run: pnpm test:run

      - name: Build
        run: pnpm build

      - name: Release canary dry run
        run: |
          git checkout -B master HEAD
          git checkout -- pnpm-lock.yaml
          ./scripts/release.sh canary --skip-verify --dry-run

  e2e:
    needs: [policy]
    runs-on: ubuntu-latest
    timeout-minutes: 30

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Build
        run: pnpm build

      - name: Install Playwright
        run: npx playwright install --with-deps chromium

      - name: Generate Paperclip config
        run: |
          mkdir -p ~/.paperclip/instances/default
          cat > ~/.paperclip/instances/default/config.json << 'CONF'
          {
            "$meta": { "version": 1, "updatedAt": "2026-01-01T00:00:00.000Z", "source": "onboard" },
            "database": { "mode": "embedded-postgres" },
            "logging": { "mode": "file" },
            "server": { "deploymentMode": "local_trusted", "host": "127.0.0.1", "port": 3100 },
            "auth": { "baseUrlMode": "auto" },
            "storage": { "provider": "local_disk" },
            "secrets": { "provider": "local_encrypted", "strictMode": false }
          }
          CONF

      - name: Run e2e tests
        env:
          PAPERCLIP_E2E_SKIP_LLM: "true"
        run: pnpm run test:e2e

      - name: Upload Playwright report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: playwright-report
          path: |
            tests/e2e/playwright-report/
            tests/e2e/test-results/
          retention-days: 14
97 .github/workflows/refresh-lockfile.yml vendored Normal file
@@ -0,0 +1,97 @@
name: Refresh Lockfile

on:
  push:
    branches:
      - master
  workflow_dispatch:

concurrency:
  group: refresh-lockfile-master
  cancel-in-progress: false

jobs:
  refresh:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    permissions:
      contents: write
      pull-requests: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4
          run_install: false

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm

      - name: Refresh pnpm lockfile
        run: pnpm install --lockfile-only --ignore-scripts --no-frozen-lockfile

      - name: Fail on unexpected file changes
        run: |
          changed="$(git status --porcelain)"
          if [ -z "$changed" ]; then
            echo "Lockfile is already up to date."
            exit 0
          fi
          if printf '%s\n' "$changed" | grep -Fvq ' pnpm-lock.yaml'; then
            echo "Unexpected files changed during lockfile refresh:"
            echo "$changed"
            exit 1
          fi

      - name: Create or update pull request
        id: upsert-pr
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          if git diff --quiet -- pnpm-lock.yaml; then
            echo "Lockfile unchanged, nothing to do."
            echo "pr_created=false" >> "$GITHUB_OUTPUT"
            exit 0
          fi

          BRANCH="chore/refresh-lockfile"
          git config user.name "lockfile-bot"
          git config user.email "lockfile-bot@users.noreply.github.com"

          git checkout -B "$BRANCH"
          git add pnpm-lock.yaml
          git commit -m "chore(lockfile): refresh pnpm-lock.yaml"
          git push --force origin "$BRANCH"

          # Create PR if one doesn't already exist
          existing=$(gh pr list --head "$BRANCH" --json number --jq '.[0].number')
          if [ -z "$existing" ]; then
            gh pr create \
              --head "$BRANCH" \
              --title "chore(lockfile): refresh pnpm-lock.yaml" \
              --body "Auto-generated lockfile refresh after dependencies changed on master. This PR only updates pnpm-lock.yaml."
            echo "Created new PR."
          else
            echo "PR #$existing already exists, branch updated via force push."
          fi
          echo "pr_created=true" >> "$GITHUB_OUTPUT"

      - name: Enable auto-merge for lockfile PR
        if: steps.upsert-pr.outputs.pr_created == 'true'
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          pr_url="$(gh pr list --head chore/refresh-lockfile --json url --jq '.[0].url')"
          if [ -z "$pr_url" ]; then
            echo "Error: lockfile PR was not found." >&2
            exit 1
          fi

          gh pr merge --auto --squash --delete-branch "$pr_url"
118 .github/workflows/release-smoke.yml vendored Normal file
@@ -0,0 +1,118 @@
name: Release Smoke

on:
  workflow_dispatch:
    inputs:
      paperclip_version:
        description: Published Paperclip dist-tag to test
        required: true
        default: canary
        type: choice
        options:
          - canary
          - latest
      host_port:
        description: Host port for the Docker smoke container
        required: false
        default: "3232"
        type: string
      artifact_name:
        description: Artifact name for uploaded diagnostics
        required: false
        default: release-smoke
        type: string
  workflow_call:
    inputs:
      paperclip_version:
        required: true
        type: string
      host_port:
        required: false
        default: "3232"
        type: string
      artifact_name:
        required: false
        default: release-smoke
        type: string

jobs:
  smoke:
    runs-on: ubuntu-latest
    timeout-minutes: 45

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Install Playwright browser
        run: npx playwright install --with-deps chromium

      - name: Launch Docker smoke harness
        run: |
          metadata_file="$RUNNER_TEMP/release-smoke.env"
          HOST_PORT="${{ inputs.host_port }}" \
          DATA_DIR="$RUNNER_TEMP/release-smoke-data" \
          PAPERCLIPAI_VERSION="${{ inputs.paperclip_version }}" \
          SMOKE_DETACH=true \
          SMOKE_METADATA_FILE="$metadata_file" \
          ./scripts/docker-onboard-smoke.sh
          set -a
          source "$metadata_file"
          set +a
          {
            echo "SMOKE_BASE_URL=$SMOKE_BASE_URL"
            echo "SMOKE_ADMIN_EMAIL=$SMOKE_ADMIN_EMAIL"
            echo "SMOKE_ADMIN_PASSWORD=$SMOKE_ADMIN_PASSWORD"
            echo "SMOKE_CONTAINER_NAME=$SMOKE_CONTAINER_NAME"
            echo "SMOKE_DATA_DIR=$SMOKE_DATA_DIR"
            echo "SMOKE_IMAGE_NAME=$SMOKE_IMAGE_NAME"
            echo "SMOKE_PAPERCLIPAI_VERSION=$SMOKE_PAPERCLIPAI_VERSION"
            echo "SMOKE_METADATA_FILE=$metadata_file"
          } >> "$GITHUB_ENV"

      - name: Run release smoke Playwright suite
        env:
          PAPERCLIP_RELEASE_SMOKE_BASE_URL: ${{ env.SMOKE_BASE_URL }}
          PAPERCLIP_RELEASE_SMOKE_EMAIL: ${{ env.SMOKE_ADMIN_EMAIL }}
          PAPERCLIP_RELEASE_SMOKE_PASSWORD: ${{ env.SMOKE_ADMIN_PASSWORD }}
        run: pnpm run test:release-smoke

      - name: Capture Docker logs
        if: always()
        run: |
          if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
            docker logs "$SMOKE_CONTAINER_NAME" >"$RUNNER_TEMP/docker-onboard-smoke.log" 2>&1 || true
          fi

      - name: Upload diagnostics
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: ${{ inputs.artifact_name }}
          path: |
            ${{ runner.temp }}/docker-onboard-smoke.log
            ${{ env.SMOKE_METADATA_FILE }}
            tests/release-smoke/playwright-report/
            tests/release-smoke/test-results/
          retention-days: 14

      - name: Stop Docker smoke container
        if: always()
        run: |
          if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
            docker rm -f "$SMOKE_CONTAINER_NAME" >/dev/null 2>&1 || true
          fi
261 .github/workflows/release.yml vendored Normal file
@@ -0,0 +1,261 @@
name: Release

on:
  push:
    branches:
      - master
  workflow_dispatch:
    inputs:
      source_ref:
        description: Commit SHA, branch, or tag to publish as stable
        required: true
        type: string
        default: master
      stable_date:
        description: Enter a UTC date in YYYY-MM-DD format, for example 2026-03-18. Do not enter a version string. The workflow will resolve that date to a stable version such as 2026.318.0, then 2026.318.1 for the next same-day stable.
        required: false
        type: string
      dry_run:
        description: Preview the stable release without publishing
        required: true
        type: boolean
        default: false

concurrency:
  group: release-${{ github.event_name }}-${{ github.ref }}
  cancel-in-progress: false

jobs:
  verify_canary:
    if: github.event_name == 'push'
    runs-on: ubuntu-latest
    timeout-minutes: 30
    permissions:
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Typecheck
        run: pnpm -r typecheck

      - name: Run tests
        run: pnpm test:run

      - name: Build
        run: pnpm build

  publish_canary:
    if: github.event_name == 'push'
    needs: verify_canary
    runs-on: ubuntu-latest
    timeout-minutes: 45
    environment: npm-canary
    permissions:
      contents: write
      id-token: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Restore tracked install-time changes
        run: git checkout -- pnpm-lock.yaml

      - name: Configure git author
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

      - name: Publish canary
        env:
          GITHUB_ACTIONS: "true"
        run: ./scripts/release.sh canary --skip-verify

      - name: Push canary tag
        run: |
          tag="$(git tag --points-at HEAD | grep '^canary/v' | head -1)"
          if [ -z "$tag" ]; then
            echo "Error: no canary tag points at HEAD after release." >&2
            exit 1
          fi
          git push origin "refs/tags/${tag}"

  verify_stable:
    if: github.event_name == 'workflow_dispatch'
    runs-on: ubuntu-latest
    timeout-minutes: 30
    permissions:
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          ref: ${{ inputs.source_ref }}

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Typecheck
        run: pnpm -r typecheck

      - name: Run tests
        run: pnpm test:run

      - name: Build
        run: pnpm build

  preview_stable:
    if: github.event_name == 'workflow_dispatch' && inputs.dry_run
    needs: verify_stable
    runs-on: ubuntu-latest
    timeout-minutes: 45
    permissions:
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          ref: ${{ inputs.source_ref }}

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Dry-run stable release
        env:
          GITHUB_ACTIONS: "true"
        run: |
          args=(stable --skip-verify --dry-run)
          if [ -n "${{ inputs.stable_date }}" ]; then
            args+=(--date "${{ inputs.stable_date }}")
          fi
          ./scripts/release.sh "${args[@]}"

  publish_stable:
    if: github.event_name == 'workflow_dispatch' && !inputs.dry_run
    needs: verify_stable
    runs-on: ubuntu-latest
    timeout-minutes: 45
    environment: npm-stable
    permissions:
      contents: write
      id-token: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          ref: ${{ inputs.source_ref }}

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9.15.4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: 24
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --no-frozen-lockfile

      - name: Restore tracked install-time changes
        run: git checkout -- pnpm-lock.yaml

      - name: Configure git author
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

      - name: Publish stable
        env:
          GITHUB_ACTIONS: "true"
        run: |
          args=(stable --skip-verify)
          if [ -n "${{ inputs.stable_date }}" ]; then
            args+=(--date "${{ inputs.stable_date }}")
          fi
          ./scripts/release.sh "${args[@]}"

      - name: Push stable tag
        run: |
          tag="$(git tag --points-at HEAD | grep '^v' | head -1)"
          if [ -z "$tag" ]; then
            echo "Error: no stable tag points at HEAD after release." >&2
            exit 1
          fi
          git push origin "refs/tags/${tag}"

      - name: Create GitHub Release
        env:
          GH_TOKEN: ${{ github.token }}
          PUBLISH_REMOTE: origin
        run: |
          version="$(git tag --points-at HEAD | grep '^v' | head -1 | sed 's/^v//')"
          if [ -z "$version" ]; then
            echo "Error: no v* tag points at HEAD after stable release." >&2
            exit 1
          fi
          ./scripts/create-github-release.sh "$version"
13 .gitignore vendored
@@ -37,3 +37,16 @@ tmp/
 .vscode/
 .claude/settings.local.json
 .paperclip-local/
+/.idea/
+/.agents/
+
+# Doc maintenance cursor
+.doc-review-cursor
+
+# Playwright
+tests/e2e/test-results/
+tests/e2e/playwright-report/
+tests/release-smoke/test-results/
+tests/release-smoke/playwright-report/
+.superset/
+.claude/worktrees/
@@ -78,6 +78,9 @@ If you change schema/API behavior, update all impacted layers:
 4. Do not replace strategic docs wholesale unless asked.
    Prefer additive updates. Keep `doc/SPEC.md` and `doc/SPEC-implementation.md` aligned.
 
+5. Keep plan docs dated and centralized.
+   New plan documents belong in `doc/plans/` and should use `YYYY-MM-DD-slug.md` filenames.
+
 ## 6. Database Change Workflow
 
 When changing data model:
74 CONTRIBUTING.md Normal file
@@ -0,0 +1,74 @@
# Contributing Guide

Thanks for wanting to contribute!

We really appreciate both small fixes and thoughtful larger changes.

## Two Paths to Get Your Pull Request Accepted

### Path 1: Small, Focused Changes (Fastest way to get merged)

- Pick **one** clear thing to fix/improve
- Touch the **smallest possible number of files**
- Make sure the change is very targeted and easy to review
- All automated checks pass (including Greptile comments)
- No new lint/test failures

These almost always get merged quickly when they're clean.

### Path 2: Bigger or Impactful Changes

- **First** talk about it in Discord → #dev channel
  → Describe what you're trying to solve
  → Share rough ideas / approach
- Once there's rough agreement, build it
- In your PR include:
  - Before / After screenshots (or short video if UI/behavior change)
  - Clear description of what & why
  - Proof it works (manual testing notes)
  - All tests passing
  - All Greptile + other PR comments addressed

PRs that follow this path are **much** more likely to be accepted, even when they're large.

## General Rules (both paths)

- Write clear commit messages
- Keep PR title + description meaningful
- One PR = one logical change (unless it's a small related group)
- Run tests locally first
- Be kind in discussions 😄

## Writing a Good PR message

Please include a "thinking path" at the top of your PR message that explains from the top of the project down to what you fixed. E.g.:

### Thinking Path Example 1:

> - Paperclip orchestrates ai-agents for zero-human companies
> - There are many types of adapters for each LLM model provider
> - But LLM's have a context limit and not all agents can automatically compact their context
> - So we need to have an adapter-specific configuration for which adapters can and cannot automatically compact their context
> - This pull request adds per-adapter configuration of compaction, either auto or paperclip managed
> - That way we can get optimal performance from any adapter/provider in Paperclip

### Thinking Path Example 2:

> - Paperclip orchestrates ai-agents for zero-human companies
> - But humans want to watch the agents and oversee their work
> - Human users also operate in teams and so they need their own logins, profiles, views etc.
> - So we have a multi-user system for humans
> - But humans want to be able to update their own profile picture and avatar
> - But the avatar upload form wasn't saving the avatar to the file storage system
> - So this PR fixes the avatar upload form to use the file storage service
> - The benefit is we don't have a one-off file storage for just one aspect of the system, which would cause confusion and extra configuration

Then have the rest of your normal PR message after the Thinking Path.

This should include details about what you did, why you did it, why it matters & the benefits, how we can verify it works, and any risks.

Please include screenshots if possible if you have a visible change. (use something like the [agent-browser skill](https://github.com/vercel-labs/agent-browser/blob/main/skills/agent-browser/SKILL.md) or similar to take screenshots). Ideally, you include before and after screenshots.

Questions? Just ask in #dev — we're happy to help.

Happy hacking!
24 Dockerfile
@@ -1,4 +1,4 @@
-FROM node:20-bookworm-slim AS base
+FROM node:lts-trixie-slim AS base
 RUN apt-get update \
     && apt-get install -y --no-install-recommends ca-certificates curl git \
     && rm -rf /var/lib/apt/lists/*
@@ -15,19 +15,30 @@ COPY packages/db/package.json packages/db/
 COPY packages/adapter-utils/package.json packages/adapter-utils/
 COPY packages/adapters/claude-local/package.json packages/adapters/claude-local/
 COPY packages/adapters/codex-local/package.json packages/adapters/codex-local/
+COPY packages/adapters/cursor-local/package.json packages/adapters/cursor-local/
+COPY packages/adapters/gemini-local/package.json packages/adapters/gemini-local/
+COPY packages/adapters/openclaw-gateway/package.json packages/adapters/openclaw-gateway/
+COPY packages/adapters/opencode-local/package.json packages/adapters/opencode-local/
+COPY packages/adapters/pi-local/package.json packages/adapters/pi-local/
+COPY packages/plugins/sdk/package.json packages/plugins/sdk/
+
 RUN pnpm install --frozen-lockfile
 
 FROM base AS build
 WORKDIR /app
 COPY --from=deps /app /app
 COPY . .
-RUN pnpm --filter @paperclip/ui build
+RUN pnpm --filter @paperclipai/ui build
-RUN pnpm --filter @paperclip/server build
+RUN pnpm --filter @paperclipai/plugin-sdk build
+RUN pnpm --filter @paperclipai/server build
+RUN test -f server/dist/index.js || (echo "ERROR: server build output missing" && exit 1)
 
 FROM base AS production
 WORKDIR /app
-COPY --from=build /app /app
+COPY --chown=node:node --from=build /app /app
-RUN npm install --global --omit=dev @anthropic-ai/claude-code@latest @openai/codex@latest
+RUN npm install --global --omit=dev @anthropic-ai/claude-code@latest @openai/codex@latest opencode-ai \
+    && mkdir -p /paperclip \
+    && chown node:node /paperclip
 
 ENV NODE_ENV=production \
     HOME=/paperclip \
@@ -37,10 +48,11 @@ ENV NODE_ENV=production \
     PAPERCLIP_HOME=/paperclip \
     PAPERCLIP_INSTANCE_ID=default \
     PAPERCLIP_CONFIG=/paperclip/instances/default/config.json \
-    PAPERCLIP_DEPLOYMENT_MODE=local_trusted \
+    PAPERCLIP_DEPLOYMENT_MODE=authenticated \
     PAPERCLIP_DEPLOYMENT_EXPOSURE=private
 
 VOLUME ["/paperclip"]
 EXPOSE 3100
 
+USER node
 CMD ["node", "--import", "./server/node_modules/tsx/dist/loader.mjs", "server/dist/index.js"]
21 LICENSE Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 Paperclip AI

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
19 README.md
@@ -218,7 +218,8 @@ By default, agents run on scheduled heartbeats and event-based triggers (task as
 ## Development
 
 ```bash
-pnpm dev # Full dev (API + UI)
+pnpm dev # Full dev (API + UI, watch mode)
+pnpm dev:once # Full dev without file watching
 pnpm dev:server # Server only
 pnpm build # Build all
 pnpm typecheck # Type checking
@@ -233,9 +234,13 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
 
 ## Roadmap
 
-- 🛒 **Clipmart** — Download and share entire company architectures
-- 🧩 **Plugin System** — Embed custom plugins (e.g. Reporting, Knowledge Base) into Paperclip
-- ☁️ **Cloud Agent Adapters** — Add more adapters for cloud-hosted agents
+- ⚪ Get OpenClaw onboarding easier
+- ⚪ Get cloud agents working e.g. Cursor / e2b agents
+- ⚪ ClipMart - buy and sell entire agent companies
+- ⚪ Easy agent configurations / easier to understand
+- ⚪ Better support for harness engineering
+- 🟢 Plugin system (e.g. if you want to add a knowledgebase, custom tracing, queues, etc)
+- ⚪ Better docs
 
 <br/>
 
@@ -243,8 +248,6 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
 
 We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for details.
 
-<!-- TODO: add CONTRIBUTING.md -->
-
 <br/>
 
 ## Community
@@ -259,6 +262,10 @@ We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for deta
 
 MIT © 2026 Paperclip
 
+## Star History
+
+[](https://www.star-history.com/?repos=paperclipai%2Fpaperclip&type=date&legend=top-left)
+
 <br/>
 
 ---
@@ -1,5 +1,58 @@
 # paperclipai
 
+## 0.3.1
+
+### Patch Changes
+
+- Stable release preparation for 0.3.1
+- Updated dependencies
+  - @paperclipai/adapter-utils@0.3.1
+  - @paperclipai/adapter-claude-local@0.3.1
+  - @paperclipai/adapter-codex-local@0.3.1
+  - @paperclipai/adapter-cursor-local@0.3.1
+  - @paperclipai/adapter-gemini-local@0.3.1
+  - @paperclipai/adapter-openclaw-gateway@0.3.1
+  - @paperclipai/adapter-opencode-local@0.3.1
+  - @paperclipai/adapter-pi-local@0.3.1
+  - @paperclipai/db@0.3.1
+  - @paperclipai/shared@0.3.1
+  - @paperclipai/server@0.3.1
+
+## 0.3.0
+
+### Minor Changes
+
+- Stable release preparation for 0.3.0
+
+### Patch Changes
+
+- Updated dependencies [6077ae6]
+- Updated dependencies
+  - @paperclipai/shared@0.3.0
+  - @paperclipai/adapter-utils@0.3.0
+  - @paperclipai/adapter-claude-local@0.3.0
+  - @paperclipai/adapter-codex-local@0.3.0
+  - @paperclipai/adapter-cursor-local@0.3.0
+  - @paperclipai/adapter-openclaw-gateway@0.3.0
+  - @paperclipai/adapter-opencode-local@0.3.0
+  - @paperclipai/adapter-pi-local@0.3.0
+  - @paperclipai/db@0.3.0
+  - @paperclipai/server@0.3.0
+
+## 0.2.7
+
+### Patch Changes
+
+- Version bump (patch)
+- Updated dependencies
+  - @paperclipai/shared@0.2.7
+  - @paperclipai/adapter-utils@0.2.7
+  - @paperclipai/db@0.2.7
+  - @paperclipai/adapter-claude-local@0.2.7
+  - @paperclipai/adapter-codex-local@0.2.7
+  - @paperclipai/adapter-openclaw@0.2.7
+  - @paperclipai/server@0.2.7
+
 ## 0.2.6
 
 ### Patch Changes
@@ -21,7 +21,7 @@ const workspacePaths = [
   "packages/adapter-utils",
   "packages/adapters/claude-local",
   "packages/adapters/codex-local",
-  "packages/adapters/openclaw",
+  "packages/adapters/openclaw-gateway",
 ];
 
 // Workspace packages that should NOT be bundled — they'll be published
@@ -1,6 +1,6 @@
 {
   "name": "paperclipai",
-  "version": "0.2.6",
+  "version": "0.3.1",
   "description": "Paperclip CLI — orchestrate AI agent teams to run a business",
   "type": "module",
   "bin": {
@@ -16,10 +16,13 @@
   "license": "MIT",
   "repository": {
     "type": "git",
-    "url": "https://github.com/paperclipai/paperclip.git",
+    "url": "https://github.com/paperclipai/paperclip",
     "directory": "cli"
   },
   "homepage": "https://github.com/paperclipai/paperclip",
+  "bugs": {
+    "url": "https://github.com/paperclipai/paperclip/issues"
+  },
   "files": [
     "dist"
   ],
@@ -36,7 +39,11 @@
   "@clack/prompts": "^0.10.0",
   "@paperclipai/adapter-claude-local": "workspace:*",
   "@paperclipai/adapter-codex-local": "workspace:*",
-  "@paperclipai/adapter-openclaw": "workspace:*",
+  "@paperclipai/adapter-cursor-local": "workspace:*",
+  "@paperclipai/adapter-gemini-local": "workspace:*",
+  "@paperclipai/adapter-opencode-local": "workspace:*",
+  "@paperclipai/adapter-pi-local": "workspace:*",
+  "@paperclipai/adapter-openclaw-gateway": "workspace:*",
   "@paperclipai/adapter-utils": "workspace:*",
   "@paperclipai/db": "workspace:*",
   "@paperclipai/server": "workspace:*",
@@ -44,6 +51,7 @@
   "drizzle-orm": "0.38.4",
   "dotenv": "^17.0.1",
   "commander": "^13.1.0",
+  "embedded-postgres": "^18.1.0-beta.16",
   "picocolors": "^1.1.1"
   },
   "devDependencies": {
@@ -4,7 +4,9 @@ import path from "node:path";
 import { afterEach, beforeEach, describe, expect, it } from "vitest";
 import {
   ensureAgentJwtSecret,
+  mergePaperclipEnvEntries,
   readAgentJwtSecretFromEnv,
+  readPaperclipEnvEntries,
   resolveAgentJwtEnvFile,
 } from "../config/env.js";
 import { agentJwtSecretCheck } from "../checks/agent-jwt-secret-check.js";
@@ -58,4 +60,20 @@ describe("agent jwt env helpers", () => {
     const result = agentJwtSecretCheck(configPath);
     expect(result.status).toBe("pass");
   });
+
+  it("quotes hash-prefixed env values so dotenv round-trips them", () => {
+    const configPath = tempConfigPath();
+    const envPath = resolveAgentJwtEnvFile(configPath);
+
+    mergePaperclipEnvEntries(
+      {
+        PAPERCLIP_WORKTREE_COLOR: "#439edb",
+      },
+      envPath,
+    );
+
+    const contents = fs.readFileSync(envPath, "utf-8");
+    expect(contents).toContain('PAPERCLIP_WORKTREE_COLOR="#439edb"');
+    expect(readPaperclipEnvEntries(envPath).PAPERCLIP_WORKTREE_COLOR).toBe("#439edb");
+  });
 });
@@ -21,6 +21,12 @@ function writeBaseConfig(configPath: string) {
   mode: "embedded-postgres",
   embeddedPostgresDataDir: "/tmp/paperclip-db",
   embeddedPostgresPort: 54329,
+  backup: {
+    enabled: true,
+    intervalMinutes: 60,
+    retentionDays: 30,
+    dir: "/tmp/paperclip-backups",
+  },
 },
 logging: {
   mode: "file",
@@ -36,6 +42,7 @@ function writeBaseConfig(configPath: string) {
 },
 auth: {
   baseUrlMode: "auto",
+  disableSignUp: false,
 },
 storage: {
   provider: "local_disk",
@@ -68,4 +75,3 @@ describe("allowed-hostname command", () => {
   expect(raw.server.allowedHostnames).toEqual(["dotta-macbook-pro"]);
   });
 });
-
16 cli/src/__tests__/auth-command-registration.test.ts Normal file
@@ -0,0 +1,16 @@
import { Command } from "commander";
import { describe, expect, it } from "vitest";
import { registerClientAuthCommands } from "../commands/client/auth.js";

describe("registerClientAuthCommands", () => {
  it("registers auth commands without duplicate company-id flags", () => {
    const program = new Command();
    const auth = program.command("auth");

    expect(() => registerClientAuthCommands(auth)).not.toThrow();

    const login = auth.commands.find((command) => command.name() === "login");
    expect(login).toBeDefined();
    expect(login?.options.filter((option) => option.long === "--company-id")).toHaveLength(1);
  });
});
53 cli/src/__tests__/board-auth.test.ts Normal file
@@ -0,0 +1,53 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { describe, expect, it } from "vitest";
import {
  getStoredBoardCredential,
  readBoardAuthStore,
  removeStoredBoardCredential,
  setStoredBoardCredential,
} from "../client/board-auth.js";

function createTempAuthPath(): string {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-auth-"));
  return path.join(dir, "auth.json");
}

describe("board auth store", () => {
  it("returns an empty store when the file does not exist", () => {
    const authPath = createTempAuthPath();
    expect(readBoardAuthStore(authPath)).toEqual({
      version: 1,
      credentials: {},
    });
  });

  it("stores and retrieves credentials by normalized api base", () => {
    const authPath = createTempAuthPath();
    setStoredBoardCredential({
      apiBase: "http://localhost:3100/",
      token: "token-123",
      userId: "user-1",
      storePath: authPath,
    });

    expect(getStoredBoardCredential("http://localhost:3100", authPath)).toMatchObject({
      apiBase: "http://localhost:3100",
      token: "token-123",
      userId: "user-1",
    });
  });

  it("removes stored credentials", () => {
    const authPath = createTempAuthPath();
    setStoredBoardCredential({
      apiBase: "http://localhost:3100",
      token: "token-123",
      storePath: authPath,
    });

    expect(removeStoredBoardCredential("http://localhost:3100", authPath)).toBe(true);
    expect(getStoredBoardCredential("http://localhost:3100", authPath)).toBeNull();
  });
});
@@ -8,12 +8,16 @@ function makeCompany(overrides: Partial<Company>): Company {
   name: "Alpha",
   description: null,
   status: "active",
+  pauseReason: null,
+  pausedAt: null,
   issuePrefix: "ALP",
   issueCounter: 1,
   budgetMonthlyCents: 0,
   spentMonthlyCents: 0,
   requireBoardApprovalForNewAgents: false,
   brandColor: null,
+  logoAssetId: null,
+  logoUrl: null,
   createdAt: new Date(),
   updatedAt: new Date(),
   ...overrides,
543 cli/src/__tests__/company-import-export-e2e.test.ts Normal file
@@ -0,0 +1,543 @@
import { execFile, spawn } from "node:child_process";
import { mkdirSync, mkdtempSync, readFileSync, readdirSync, rmSync, writeFileSync } from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { promisify } from "node:util";
import { afterAll, beforeAll, describe, expect, it } from "vitest";
import { createStoredZipArchive } from "./helpers/zip.js";

type EmbeddedPostgresInstance = {
  initialise(): Promise<void>;
  start(): Promise<void>;
  stop(): Promise<void>;
};

type EmbeddedPostgresCtor = new (opts: {
  databaseDir: string;
  user: string;
  password: string;
  port: number;
  persistent: boolean;
  initdbFlags?: string[];
  onLog?: (message: unknown) => void;
  onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;

const execFileAsync = promisify(execFile);
type ServerProcess = ReturnType<typeof spawn>;

async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
  const mod = await import("embedded-postgres");
  return mod.default as EmbeddedPostgresCtor;
}

async function getAvailablePort(): Promise<number> {
  return await new Promise((resolve, reject) => {
    const server = net.createServer();
    server.unref();
    server.on("error", reject);
    server.listen(0, "127.0.0.1", () => {
      const address = server.address();
      if (!address || typeof address === "string") {
        server.close(() => reject(new Error("Failed to allocate test port")));
        return;
      }
      const { port } = address;
      server.close((error) => {
        if (error) reject(error);
        else resolve(port);
      });
    });
  });
}

async function startTempDatabase() {
  const dataDir = mkdtempSync(path.join(os.tmpdir(), "paperclip-company-cli-db-"));
  const port = await getAvailablePort();
  const EmbeddedPostgres = await getEmbeddedPostgresCtor();
  const instance = new EmbeddedPostgres({
    databaseDir: dataDir,
    user: "paperclip",
    password: "paperclip",
    port,
    persistent: true,
    initdbFlags: ["--encoding=UTF8", "--locale=C"],
    onLog: () => {},
    onError: () => {},
  });
  await instance.initialise();
  await instance.start();

  const { applyPendingMigrations, ensurePostgresDatabase } = await import("@paperclipai/db");
  const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
  await ensurePostgresDatabase(adminConnectionString, "paperclip");
  const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
  await applyPendingMigrations(connectionString);

  return { connectionString, dataDir, instance };
}

function writeTestConfig(configPath: string, tempRoot: string, port: number, connectionString: string) {
  const config = {
    $meta: {
      version: 1,
      updatedAt: new Date().toISOString(),
      source: "doctor",
    },
    database: {
      mode: "postgres",
      connectionString,
      embeddedPostgresDataDir: path.join(tempRoot, "embedded-db"),
      embeddedPostgresPort: 54329,
      backup: {
        enabled: false,
        intervalMinutes: 60,
        retentionDays: 30,
        dir: path.join(tempRoot, "backups"),
      },
    },
    logging: {
      mode: "file",
      logDir: path.join(tempRoot, "logs"),
    },
    server: {
      deploymentMode: "local_trusted",
      exposure: "private",
      host: "127.0.0.1",
      port,
      allowedHostnames: [],
      serveUi: false,
    },
    auth: {
      baseUrlMode: "auto",
      disableSignUp: false,
    },
    storage: {
      provider: "local_disk",
      localDisk: {
        baseDir: path.join(tempRoot, "storage"),
      },
      s3: {
        bucket: "paperclip",
        region: "us-east-1",
        prefix: "",
        forcePathStyle: false,
      },
    },
    secrets: {
      provider: "local_encrypted",
      strictMode: false,
      localEncrypted: {
        keyFilePath: path.join(tempRoot, "secrets", "master.key"),
      },
    },
  };

  mkdirSync(path.dirname(configPath), { recursive: true });
  writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
}

function createServerEnv(configPath: string, port: number, connectionString: string) {
  const env = { ...process.env };
  for (const key of Object.keys(env)) {
    if (key.startsWith("PAPERCLIP_")) {
      delete env[key];
    }
  }
  delete env.DATABASE_URL;
  delete env.PORT;
  delete env.HOST;
  delete env.SERVE_UI;
  delete env.HEARTBEAT_SCHEDULER_ENABLED;

  env.PAPERCLIP_CONFIG = configPath;
  env.DATABASE_URL = connectionString;
  env.HOST = "127.0.0.1";
  env.PORT = String(port);
  env.SERVE_UI = "false";
  env.PAPERCLIP_DB_BACKUP_ENABLED = "false";
  env.HEARTBEAT_SCHEDULER_ENABLED = "false";
  env.PAPERCLIP_MIGRATION_AUTO_APPLY = "true";
  env.PAPERCLIP_UI_DEV_MIDDLEWARE = "false";

  return env;
}

function createCliEnv() {
  const env = { ...process.env };
  for (const key of Object.keys(env)) {
    if (key.startsWith("PAPERCLIP_")) {
      delete env[key];
    }
  }
  delete env.DATABASE_URL;
  delete env.PORT;
  delete env.HOST;
  delete env.SERVE_UI;
  delete env.PAPERCLIP_DB_BACKUP_ENABLED;
  delete env.HEARTBEAT_SCHEDULER_ENABLED;
  delete env.PAPERCLIP_MIGRATION_AUTO_APPLY;
  delete env.PAPERCLIP_UI_DEV_MIDDLEWARE;
  return env;
}

function collectTextFiles(root: string, current: string, files: Record<string, string>) {
  for (const entry of readdirSync(current, { withFileTypes: true })) {
    const absolutePath = path.join(current, entry.name);
    if (entry.isDirectory()) {
      collectTextFiles(root, absolutePath, files);
      continue;
    }
    if (!entry.isFile()) continue;
    const relativePath = path.relative(root, absolutePath).replace(/\\/g, "/");
    files[relativePath] = readFileSync(absolutePath, "utf8");
  }
}

async function stopServerProcess(child: ServerProcess | null) {
  if (!child || child.exitCode !== null) return;
  child.kill("SIGTERM");
  await new Promise<void>((resolve) => {
    child.once("exit", () => resolve());
    setTimeout(() => {
      if (child.exitCode === null) {
        child.kill("SIGKILL");
      }
    }, 5_000);
  });
}

async function api<T>(baseUrl: string, pathname: string, init?: RequestInit): Promise<T> {
  const res = await fetch(`${baseUrl}${pathname}`, init);
  const text = await res.text();
  if (!res.ok) {
    throw new Error(`Request failed ${res.status} ${pathname}: ${text}`);
  }
  return text ? JSON.parse(text) as T : (null as T);
}

async function runCliJson<T>(args: string[], opts: { apiBase: string; configPath: string }) {
  const repoRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
  const result = await execFileAsync(
    "pnpm",
    ["--silent", "paperclipai", ...args, "--api-base", opts.apiBase, "--config", opts.configPath, "--json"],
    {
      cwd: repoRoot,
      env: createCliEnv(),
      maxBuffer: 10 * 1024 * 1024,
    },
  );
  const stdout = result.stdout.trim();
  const jsonStart = stdout.search(/[\[{]/);
  if (jsonStart === -1) {
    throw new Error(`CLI did not emit JSON.\nstdout:\n${result.stdout}\nstderr:\n${result.stderr}`);
  }
  return JSON.parse(stdout.slice(jsonStart)) as T;
}

async function waitForServer(
  apiBase: string,
  child: ServerProcess,
  output: { stdout: string[]; stderr: string[] },
) {
  const startedAt = Date.now();
  while (Date.now() - startedAt < 30_000) {
    if (child.exitCode !== null) {
      throw new Error(
        `paperclipai run exited before healthcheck succeeded.\nstdout:\n${output.stdout.join("")}\nstderr:\n${output.stderr.join("")}`,
      );
    }

    try {
      const res = await fetch(`${apiBase}/api/health`);
      if (res.ok) return;
    } catch {
      // Server is still starting.
    }

    await new Promise((resolve) => setTimeout(resolve, 250));
  }

  throw new Error(
    `Timed out waiting for ${apiBase}/api/health.\nstdout:\n${output.stdout.join("")}\nstderr:\n${output.stderr.join("")}`,
  );
}

describe("paperclipai company import/export e2e", () => {
  let tempRoot = "";
  let configPath = "";
  let exportDir = "";
  let apiBase = "";
  let serverProcess: ServerProcess | null = null;
  let dbDataDir = "";
  let dbInstance: EmbeddedPostgresInstance | null = null;

  beforeAll(async () => {
    tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-company-cli-e2e-"));
    configPath = path.join(tempRoot, "config", "config.json");
    exportDir = path.join(tempRoot, "exported-company");

    const db = await startTempDatabase();
    dbDataDir = db.dataDir;
    dbInstance = db.instance;

    const port = await getAvailablePort();
    writeTestConfig(configPath, tempRoot, port, db.connectionString);
    apiBase = `http://127.0.0.1:${port}`;

    const repoRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
    const output = { stdout: [] as string[], stderr: [] as string[] };
    const child = spawn(
      "pnpm",
      ["paperclipai", "run", "--config", configPath],
      {
        cwd: repoRoot,
        env: createServerEnv(configPath, port, db.connectionString),
        stdio: ["ignore", "pipe", "pipe"],
      },
    );
    serverProcess = child;
    child.stdout?.on("data", (chunk) => {
      output.stdout.push(String(chunk));
    });
    child.stderr?.on("data", (chunk) => {
      output.stderr.push(String(chunk));
    });

    await waitForServer(apiBase, child, output);
  }, 60_000);

  afterAll(async () => {
    await stopServerProcess(serverProcess);
    await dbInstance?.stop();
    if (dbDataDir) {
      rmSync(dbDataDir, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
if (tempRoot) {
|
||||||
|
rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("exports a company package and imports it into new and existing companies", async () => {
|
||||||
|
expect(serverProcess).not.toBeNull();
|
||||||
|
|
||||||
|
const sourceCompany = await api<{ id: string; name: string; issuePrefix: string }>(apiBase, "/api/companies", {
|
||||||
|
method: "POST",
|
||||||
|
headers: { "content-type": "application/json" },
|
||||||
|
body: JSON.stringify({ name: `CLI Export Source ${Date.now()}` }),
|
||||||
|
});
|
||||||
|
|
||||||
|
const sourceAgent = await api<{ id: string; name: string }>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${sourceCompany.id}/agents`,
|
||||||
|
{
|
||||||
|
method: "POST",
|
||||||
|
headers: { "content-type": "application/json" },
|
||||||
|
body: JSON.stringify({
|
||||||
|
name: "Export Engineer",
|
||||||
|
role: "engineer",
|
||||||
|
adapterType: "claude_local",
|
||||||
|
adapterConfig: {
|
||||||
|
promptTemplate: "You verify company portability.",
|
||||||
|
},
|
||||||
|
}),
|
||||||
|
},
|
||||||
|
);
|
||||||
|
|
||||||
|
const sourceProject = await api<{ id: string; name: string }>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${sourceCompany.id}/projects`,
|
||||||
|
{
|
||||||
|
method: "POST",
|
||||||
|
headers: { "content-type": "application/json" },
|
||||||
|
body: JSON.stringify({
|
||||||
|
name: "Portability Verification",
|
||||||
|
status: "in_progress",
|
||||||
|
}),
|
||||||
|
},
|
||||||
|
);
|
||||||
|
|
||||||
|
const largeIssueDescription = `Round-trip the company package through the CLI.\n\n${"portable-data ".repeat(12_000)}`;
|
||||||
|
|
||||||
|
const sourceIssue = await api<{ id: string; title: string; identifier: string }>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${sourceCompany.id}/issues`,
|
||||||
|
{
|
||||||
|
method: "POST",
|
||||||
|
headers: { "content-type": "application/json" },
|
||||||
|
body: JSON.stringify({
|
||||||
|
title: "Validate company import/export",
|
||||||
|
description: largeIssueDescription,
|
||||||
|
status: "todo",
|
||||||
|
projectId: sourceProject.id,
|
||||||
|
assigneeAgentId: sourceAgent.id,
|
||||||
|
}),
|
||||||
|
},
|
||||||
|
);
|
||||||
|
|
||||||
|
const exportResult = await runCliJson<{
|
||||||
|
ok: boolean;
|
||||||
|
out: string;
|
||||||
|
filesWritten: number;
|
||||||
|
}>(
|
||||||
|
[
|
||||||
|
"company",
|
||||||
|
"export",
|
||||||
|
sourceCompany.id,
|
||||||
|
"--out",
|
||||||
|
exportDir,
|
||||||
|
"--include",
|
||||||
|
"company,agents,projects,issues",
|
||||||
|
],
|
||||||
|
{ apiBase, configPath },
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(exportResult.ok).toBe(true);
|
||||||
|
expect(exportResult.filesWritten).toBeGreaterThan(0);
|
||||||
|
expect(readFileSync(path.join(exportDir, "COMPANY.md"), "utf8")).toContain(sourceCompany.name);
|
||||||
|
expect(readFileSync(path.join(exportDir, ".paperclip.yaml"), "utf8")).toContain('schema: "paperclip/v1"');
|
||||||
|
|
||||||
|
const importedNew = await runCliJson<{
|
||||||
|
company: { id: string; name: string; action: string };
|
||||||
|
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||||
|
}>(
|
||||||
|
[
|
||||||
|
"company",
|
||||||
|
"import",
|
||||||
|
exportDir,
|
||||||
|
"--target",
|
||||||
|
"new",
|
||||||
|
"--new-company-name",
|
||||||
|
`Imported ${sourceCompany.name}`,
|
||||||
|
"--include",
|
||||||
|
"company,agents,projects,issues",
|
||||||
|
"--yes",
|
||||||
|
],
|
||||||
|
{ apiBase, configPath },
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(importedNew.company.action).toBe("created");
|
||||||
|
expect(importedNew.agents).toHaveLength(1);
|
||||||
|
expect(importedNew.agents[0]?.action).toBe("created");
|
||||||
|
|
||||||
|
const importedAgents = await api<Array<{ id: string; name: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/agents`,
|
||||||
|
);
|
||||||
|
const importedProjects = await api<Array<{ id: string; name: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/projects`,
|
||||||
|
);
|
||||||
|
const importedIssues = await api<Array<{ id: string; title: string; identifier: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/issues`,
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(importedAgents.map((agent) => agent.name)).toContain(sourceAgent.name);
|
||||||
|
expect(importedProjects.map((project) => project.name)).toContain(sourceProject.name);
|
||||||
|
expect(importedIssues.map((issue) => issue.title)).toContain(sourceIssue.title);
|
||||||
|
|
||||||
|
const previewExisting = await runCliJson<{
|
||||||
|
errors: string[];
|
||||||
|
plan: {
|
||||||
|
companyAction: string;
|
||||||
|
agentPlans: Array<{ action: string }>;
|
||||||
|
projectPlans: Array<{ action: string }>;
|
||||||
|
issuePlans: Array<{ action: string }>;
|
||||||
|
};
|
||||||
|
}>(
|
||||||
|
[
|
||||||
|
"company",
|
||||||
|
"import",
|
||||||
|
exportDir,
|
||||||
|
"--target",
|
||||||
|
"existing",
|
||||||
|
"--company-id",
|
||||||
|
importedNew.company.id,
|
||||||
|
"--include",
|
||||||
|
"company,agents,projects,issues",
|
||||||
|
"--collision",
|
||||||
|
"rename",
|
||||||
|
"--dry-run",
|
||||||
|
],
|
||||||
|
{ apiBase, configPath },
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(previewExisting.errors).toEqual([]);
|
||||||
|
expect(previewExisting.plan.companyAction).toBe("none");
|
||||||
|
expect(previewExisting.plan.agentPlans.some((plan) => plan.action === "create")).toBe(true);
|
||||||
|
expect(previewExisting.plan.projectPlans.some((plan) => plan.action === "create")).toBe(true);
|
||||||
|
expect(previewExisting.plan.issuePlans.some((plan) => plan.action === "create")).toBe(true);
|
||||||
|
|
||||||
|
const importedExisting = await runCliJson<{
|
||||||
|
company: { id: string; action: string };
|
||||||
|
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||||
|
}>(
|
||||||
|
[
|
||||||
|
"company",
|
||||||
|
"import",
|
||||||
|
exportDir,
|
||||||
|
"--target",
|
||||||
|
"existing",
|
||||||
|
"--company-id",
|
||||||
|
importedNew.company.id,
|
||||||
|
"--include",
|
||||||
|
"company,agents,projects,issues",
|
||||||
|
"--collision",
|
||||||
|
"rename",
|
||||||
|
"--yes",
|
||||||
|
],
|
||||||
|
{ apiBase, configPath },
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(importedExisting.company.action).toBe("unchanged");
|
||||||
|
expect(importedExisting.agents.some((agent) => agent.action === "created")).toBe(true);
|
||||||
|
|
||||||
|
const twiceImportedAgents = await api<Array<{ id: string; name: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/agents`,
|
||||||
|
);
|
||||||
|
const twiceImportedProjects = await api<Array<{ id: string; name: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/projects`,
|
||||||
|
);
|
||||||
|
const twiceImportedIssues = await api<Array<{ id: string; title: string; identifier: string }>>(
|
||||||
|
apiBase,
|
||||||
|
`/api/companies/${importedNew.company.id}/issues`,
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(twiceImportedAgents).toHaveLength(2);
|
||||||
|
expect(new Set(twiceImportedAgents.map((agent) => agent.name)).size).toBe(2);
|
||||||
|
expect(twiceImportedProjects).toHaveLength(2);
|
||||||
|
expect(twiceImportedIssues).toHaveLength(2);
|
||||||
|
|
||||||
|
const zipPath = path.join(tempRoot, "exported-company.zip");
|
||||||
|
const portableFiles: Record<string, string> = {};
|
||||||
|
collectTextFiles(exportDir, exportDir, portableFiles);
|
||||||
|
writeFileSync(zipPath, createStoredZipArchive(portableFiles, "paperclip-demo"));
|
||||||
|
|
||||||
|
const importedFromZip = await runCliJson<{
|
||||||
|
company: { id: string; name: string; action: string };
|
||||||
|
agents: Array<{ id: string | null; action: string; name: string }>;
|
||||||
|
}>(
|
||||||
|
[
|
||||||
|
"company",
|
||||||
|
"import",
|
||||||
|
zipPath,
|
||||||
|
"--target",
|
||||||
|
"new",
|
||||||
|
"--new-company-name",
|
||||||
|
`Zip Imported ${sourceCompany.name}`,
|
||||||
|
"--include",
|
||||||
|
"company,agents,projects,issues",
|
||||||
|
"--yes",
|
||||||
|
],
|
||||||
|
{ apiBase, configPath },
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(importedFromZip.company.action).toBe("created");
|
||||||
|
expect(importedFromZip.agents.some((agent) => agent.action === "created")).toBe(true);
|
||||||
|
}, 60_000);
|
||||||
|
});
|
||||||
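// Annotation (not part of the diff): the spec above doubles as a usage sketch for the CLI
// round-trip it exercises. Assuming the same flags that runCliJson passes through pnpm, the
// manual equivalent would be roughly:
//
//   pnpm paperclipai company export <companyId> --out ./exported-company \
//     --include company,agents,projects,issues --json
//   pnpm paperclipai company import ./exported-company --target new \
//     --new-company-name "Imported Acme" --include company,agents,projects,issues --yes --json
//   pnpm paperclipai company import ./exported-company --target existing \
//     --company-id <companyId> --collision rename --dry-run --json
//
// Flag names and ordering are taken from the test arguments; the company name shown here is
// illustrative only.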
cli/src/__tests__/company-import-url.test.ts (new file, 74 lines)
@@ -0,0 +1,74 @@
import { describe, expect, it } from "vitest";
import {
  isGithubShorthand,
  isGithubUrl,
  isHttpUrl,
  normalizeGithubImportSource,
} from "../commands/client/company.js";

describe("isHttpUrl", () => {
  it("matches http URLs", () => {
    expect(isHttpUrl("http://example.com/foo")).toBe(true);
  });

  it("matches https URLs", () => {
    expect(isHttpUrl("https://example.com/foo")).toBe(true);
  });

  it("rejects local paths", () => {
    expect(isHttpUrl("/tmp/my-company")).toBe(false);
    expect(isHttpUrl("./relative")).toBe(false);
  });
});

describe("isGithubUrl", () => {
  it("matches GitHub URLs", () => {
    expect(isGithubUrl("https://github.com/org/repo")).toBe(true);
  });

  it("rejects non-GitHub HTTP URLs", () => {
    expect(isGithubUrl("https://example.com/foo")).toBe(false);
  });

  it("rejects local paths", () => {
    expect(isGithubUrl("/tmp/my-company")).toBe(false);
  });
});

describe("isGithubShorthand", () => {
  it("matches owner/repo/path shorthands", () => {
    expect(isGithubShorthand("paperclipai/companies/gstack")).toBe(true);
    expect(isGithubShorthand("paperclipai/companies")).toBe(true);
  });

  it("rejects local-looking paths", () => {
    expect(isGithubShorthand("./exports/acme")).toBe(false);
    expect(isGithubShorthand("/tmp/acme")).toBe(false);
    expect(isGithubShorthand("C:\\temp\\acme")).toBe(false);
  });
});

describe("normalizeGithubImportSource", () => {
  it("normalizes shorthand imports to canonical GitHub sources", () => {
    expect(normalizeGithubImportSource("paperclipai/companies/gstack")).toBe(
      "https://github.com/paperclipai/companies?ref=main&path=gstack",
    );
  });

  it("applies --ref to shorthand imports", () => {
    expect(normalizeGithubImportSource("paperclipai/companies/gstack", "feature/demo")).toBe(
      "https://github.com/paperclipai/companies?ref=feature%2Fdemo&path=gstack",
    );
  });

  it("applies --ref to existing GitHub tree URLs without losing the package path", () => {
    expect(
      normalizeGithubImportSource(
        "https://github.com/paperclipai/companies/tree/main/gstack",
        "release/2026-03-23",
      ),
    ).toBe(
      "https://github.com/paperclipai/companies?ref=release%2F2026-03-23&path=gstack",
    );
  });
});
cli/src/__tests__/company-import-zip.test.ts (new file, 44 lines)
@@ -0,0 +1,44 @@
import { mkdtemp, rm, writeFile } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { resolveInlineSourceFromPath } from "../commands/client/company.js";
import { createStoredZipArchive } from "./helpers/zip.js";

const tempDirs: string[] = [];

afterEach(async () => {
  for (const dir of tempDirs.splice(0)) {
    await rm(dir, { recursive: true, force: true });
  }
});

describe("resolveInlineSourceFromPath", () => {
  it("imports portable files from a zip archive instead of scanning the parent directory", async () => {
    const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-company-import-zip-"));
    tempDirs.push(tempDir);

    const archivePath = path.join(tempDir, "paperclip-demo.zip");
    const archive = createStoredZipArchive(
      {
        "COMPANY.md": "# Company\n",
        ".paperclip.yaml": "schema: paperclip/v1\n",
        "agents/ceo/AGENT.md": "# CEO\n",
        "notes/todo.txt": "ignore me\n",
      },
      "paperclip-demo",
    );
    await writeFile(archivePath, archive);

    const resolved = await resolveInlineSourceFromPath(archivePath);

    expect(resolved).toEqual({
      rootPath: "paperclip-demo",
      files: {
        "COMPANY.md": "# Company\n",
        ".paperclip.yaml": "schema: paperclip/v1\n",
        "agents/ceo/AGENT.md": "# CEO\n",
      },
    });
  });
});
cli/src/__tests__/company.test.ts (new file, 587 lines)
@@ -0,0 +1,587 @@
import { describe, expect, it } from "vitest";
import type { CompanyPortabilityPreviewResult } from "@paperclipai/shared";
import {
  buildCompanyDashboardUrl,
  buildDefaultImportAdapterOverrides,
  buildDefaultImportSelectionState,
  buildImportSelectionCatalog,
  buildSelectedFilesFromImportSelection,
  renderCompanyImportPreview,
  renderCompanyImportResult,
  resolveCompanyImportApplyConfirmationMode,
  resolveCompanyImportApiPath,
} from "../commands/client/company.js";

describe("resolveCompanyImportApiPath", () => {
  it("uses company-scoped preview route for existing-company dry runs", () => {
    expect(
      resolveCompanyImportApiPath({
        dryRun: true,
        targetMode: "existing_company",
        companyId: "company-123",
      }),
    ).toBe("/api/companies/company-123/imports/preview");
  });

  it("uses company-scoped apply route for existing-company imports", () => {
    expect(
      resolveCompanyImportApiPath({
        dryRun: false,
        targetMode: "existing_company",
        companyId: "company-123",
      }),
    ).toBe("/api/companies/company-123/imports/apply");
  });

  it("keeps global routes for new-company imports", () => {
    expect(
      resolveCompanyImportApiPath({
        dryRun: true,
        targetMode: "new_company",
      }),
    ).toBe("/api/companies/import/preview");

    expect(
      resolveCompanyImportApiPath({
        dryRun: false,
        targetMode: "new_company",
      }),
    ).toBe("/api/companies/import");
  });

  it("throws when an existing-company import is missing a company id", () => {
    expect(() =>
      resolveCompanyImportApiPath({
        dryRun: true,
        targetMode: "existing_company",
        companyId: " ",
      })
    ).toThrow(/require a companyId/i);
  });
});

describe("resolveCompanyImportApplyConfirmationMode", () => {
  it("skips confirmation when --yes is set", () => {
    expect(
      resolveCompanyImportApplyConfirmationMode({
        yes: true,
        interactive: false,
        json: false,
      }),
    ).toBe("skip");
  });

  it("prompts in interactive text mode when --yes is not set", () => {
    expect(
      resolveCompanyImportApplyConfirmationMode({
        yes: false,
        interactive: true,
        json: false,
      }),
    ).toBe("prompt");
  });

  it("requires --yes for non-interactive apply", () => {
    expect(() =>
      resolveCompanyImportApplyConfirmationMode({
        yes: false,
        interactive: false,
        json: false,
      })
    ).toThrow(/non-interactive terminal requires --yes/i);
  });

  it("requires --yes for json apply", () => {
    expect(() =>
      resolveCompanyImportApplyConfirmationMode({
        yes: false,
        interactive: false,
        json: true,
      })
    ).toThrow(/with --json requires --yes/i);
  });
});

describe("buildCompanyDashboardUrl", () => {
  it("preserves the configured base path when building a dashboard URL", () => {
    expect(buildCompanyDashboardUrl("https://paperclip.example/app/", "PAP")).toBe(
      "https://paperclip.example/app/PAP/dashboard",
    );
  });
});

describe("renderCompanyImportPreview", () => {
  it("summarizes the preview with counts, selection info, and truncated examples", () => {
    const preview: CompanyPortabilityPreviewResult = {
      include: {
        company: true,
        agents: true,
        projects: true,
        issues: true,
        skills: true,
      },
      targetCompanyId: "company-123",
      targetCompanyName: "Imported Co",
      collisionStrategy: "rename",
      selectedAgentSlugs: ["ceo", "cto", "eng-1", "eng-2", "eng-3", "eng-4", "eng-5"],
      plan: {
        companyAction: "update",
        agentPlans: [
          { slug: "ceo", action: "create", plannedName: "CEO", existingAgentId: null, reason: null },
          { slug: "cto", action: "update", plannedName: "CTO", existingAgentId: "agent-2", reason: "replace strategy" },
          { slug: "eng-1", action: "skip", plannedName: "Engineer 1", existingAgentId: "agent-3", reason: "skip strategy" },
          { slug: "eng-2", action: "create", plannedName: "Engineer 2", existingAgentId: null, reason: null },
          { slug: "eng-3", action: "create", plannedName: "Engineer 3", existingAgentId: null, reason: null },
          { slug: "eng-4", action: "create", plannedName: "Engineer 4", existingAgentId: null, reason: null },
          { slug: "eng-5", action: "create", plannedName: "Engineer 5", existingAgentId: null, reason: null },
        ],
        projectPlans: [
          { slug: "alpha", action: "create", plannedName: "Alpha", existingProjectId: null, reason: null },
        ],
        issuePlans: [
          { slug: "kickoff", action: "create", plannedTitle: "Kickoff", reason: null },
        ],
      },
      manifest: {
        schemaVersion: 1,
        generatedAt: "2026-03-23T17:00:00.000Z",
        source: {
          companyId: "company-src",
          companyName: "Source Co",
        },
        includes: {
          company: true,
          agents: true,
          projects: true,
          issues: true,
          skills: true,
        },
        company: {
          path: "COMPANY.md",
          name: "Source Co",
          description: null,
          brandColor: null,
          logoPath: null,
          requireBoardApprovalForNewAgents: false,
        },
        sidebar: {
          agents: ["ceo"],
          projects: ["alpha"],
        },
        agents: [
          {
            slug: "ceo",
            name: "CEO",
            path: "agents/ceo/AGENT.md",
            skills: [],
            role: "ceo",
            title: null,
            icon: null,
            capabilities: null,
            reportsToSlug: null,
            adapterType: "codex_local",
            adapterConfig: {},
            runtimeConfig: {},
            permissions: {},
            budgetMonthlyCents: 0,
            metadata: null,
          },
        ],
        skills: [
          {
            key: "skill-a",
            slug: "skill-a",
            name: "Skill A",
            path: "skills/skill-a/SKILL.md",
            description: null,
            sourceType: "inline",
            sourceLocator: null,
            sourceRef: null,
            trustLevel: null,
            compatibility: null,
            metadata: null,
            fileInventory: [],
          },
        ],
        projects: [
          {
            slug: "alpha",
            name: "Alpha",
            path: "projects/alpha/PROJECT.md",
            description: null,
            ownerAgentSlug: null,
            leadAgentSlug: null,
            targetDate: null,
            color: null,
            status: null,
            executionWorkspacePolicy: null,
            workspaces: [],
            metadata: null,
          },
        ],
        issues: [
          {
            slug: "kickoff",
            identifier: null,
            title: "Kickoff",
            path: "projects/alpha/issues/kickoff/TASK.md",
            projectSlug: "alpha",
            projectWorkspaceKey: null,
            assigneeAgentSlug: "ceo",
            description: null,
            recurring: false,
            routine: null,
            legacyRecurrence: null,
            status: null,
            priority: null,
            labelIds: [],
            billingCode: null,
            executionWorkspaceSettings: null,
            assigneeAdapterOverrides: null,
            metadata: null,
          },
        ],
        envInputs: [
          {
            key: "OPENAI_API_KEY",
            description: null,
            agentSlug: "ceo",
            kind: "secret",
            requirement: "required",
            defaultValue: null,
            portability: "portable",
          },
        ],
      },
      files: {
        "COMPANY.md": "# Source Co",
      },
      envInputs: [
        {
          key: "OPENAI_API_KEY",
          description: null,
          agentSlug: "ceo",
          kind: "secret",
          requirement: "required",
          defaultValue: null,
          portability: "portable",
        },
      ],
      warnings: ["One warning"],
      errors: ["One error"],
    };

    const rendered = renderCompanyImportPreview(preview, {
      sourceLabel: "GitHub: https://github.com/paperclipai/companies/demo",
      targetLabel: "Imported Co (company-123)",
      infoMessages: ["Using claude-local adapter"],
    });

    expect(rendered).toContain("Include");
    expect(rendered).toContain("company, projects, tasks, agents, skills");
    expect(rendered).toContain("7 agents total");
    expect(rendered).toContain("1 project total");
    expect(rendered).toContain("1 task total");
    expect(rendered).toContain("skills: 1 skill packaged");
    expect(rendered).toContain("+1 more");
    expect(rendered).toContain("Using claude-local adapter");
    expect(rendered).toContain("Warnings");
    expect(rendered).toContain("Errors");
  });
});

describe("renderCompanyImportResult", () => {
  it("summarizes import results with created, updated, and skipped counts", () => {
    const rendered = renderCompanyImportResult(
      {
        company: {
          id: "company-123",
          name: "Imported Co",
          action: "updated",
        },
        agents: [
          { slug: "ceo", id: "agent-1", action: "created", name: "CEO", reason: null },
          { slug: "cto", id: "agent-2", action: "updated", name: "CTO", reason: "replace strategy" },
          { slug: "ops", id: null, action: "skipped", name: "Ops", reason: "skip strategy" },
        ],
        projects: [
          { slug: "app", id: "project-1", action: "created", name: "App", reason: null },
          { slug: "ops", id: "project-2", action: "updated", name: "Operations", reason: "replace strategy" },
          { slug: "archive", id: null, action: "skipped", name: "Archive", reason: "skip strategy" },
        ],
        envInputs: [],
        warnings: ["Review API keys"],
      },
      {
        targetLabel: "Imported Co (company-123)",
        companyUrl: "https://paperclip.example/PAP/dashboard",
        infoMessages: ["Using claude-local adapter"],
      },
    );

    expect(rendered).toContain("Company");
    expect(rendered).toContain("https://paperclip.example/PAP/dashboard");
    expect(rendered).toContain("3 agents total (1 created, 1 updated, 1 skipped)");
    expect(rendered).toContain("3 projects total (1 created, 1 updated, 1 skipped)");
    expect(rendered).toContain("Agent results");
    expect(rendered).toContain("Project results");
    expect(rendered).toContain("Using claude-local adapter");
    expect(rendered).toContain("Review API keys");
  });
});

describe("import selection catalog", () => {
  it("defaults to everything and keeps project selection separate from task selection", () => {
    const preview: CompanyPortabilityPreviewResult = {
      include: {
        company: true,
        agents: true,
        projects: true,
        issues: true,
        skills: true,
      },
      targetCompanyId: "company-123",
      targetCompanyName: "Imported Co",
      collisionStrategy: "rename",
      selectedAgentSlugs: ["ceo"],
      plan: {
        companyAction: "create",
        agentPlans: [],
        projectPlans: [],
        issuePlans: [],
      },
      manifest: {
        schemaVersion: 1,
        generatedAt: "2026-03-23T18:00:00.000Z",
        source: {
          companyId: "company-src",
          companyName: "Source Co",
        },
        includes: {
          company: true,
          agents: true,
          projects: true,
          issues: true,
          skills: true,
        },
        company: {
          path: "COMPANY.md",
          name: "Source Co",
          description: null,
          brandColor: null,
          logoPath: "images/company-logo.png",
          requireBoardApprovalForNewAgents: false,
        },
        sidebar: {
          agents: ["ceo"],
          projects: ["alpha"],
        },
        agents: [
          {
            slug: "ceo",
            name: "CEO",
            path: "agents/ceo/AGENT.md",
            skills: [],
            role: "ceo",
            title: null,
            icon: null,
            capabilities: null,
            reportsToSlug: null,
            adapterType: "codex_local",
            adapterConfig: {},
            runtimeConfig: {},
            permissions: {},
            budgetMonthlyCents: 0,
            metadata: null,
          },
        ],
        skills: [
          {
            key: "skill-a",
            slug: "skill-a",
            name: "Skill A",
            path: "skills/skill-a/SKILL.md",
            description: null,
            sourceType: "inline",
            sourceLocator: null,
            sourceRef: null,
            trustLevel: null,
            compatibility: null,
            metadata: null,
            fileInventory: [{ path: "skills/skill-a/helper.md", kind: "doc" }],
          },
        ],
        projects: [
          {
            slug: "alpha",
            name: "Alpha",
            path: "projects/alpha/PROJECT.md",
            description: null,
            ownerAgentSlug: null,
            leadAgentSlug: null,
            targetDate: null,
            color: null,
            status: null,
            executionWorkspacePolicy: null,
            workspaces: [],
            metadata: null,
          },
        ],
        issues: [
          {
            slug: "kickoff",
            identifier: null,
            title: "Kickoff",
            path: "projects/alpha/issues/kickoff/TASK.md",
            projectSlug: "alpha",
            projectWorkspaceKey: null,
            assigneeAgentSlug: "ceo",
            description: null,
            recurring: false,
            routine: null,
            legacyRecurrence: null,
            status: null,
            priority: null,
            labelIds: [],
            billingCode: null,
            executionWorkspaceSettings: null,
            assigneeAdapterOverrides: null,
            metadata: null,
          },
        ],
        envInputs: [],
      },
      files: {
        "COMPANY.md": "# Source Co",
        "README.md": "# Readme",
        ".paperclip.yaml": "schema: paperclip/v1\n",
        "images/company-logo.png": {
          encoding: "base64",
          data: "",
          contentType: "image/png",
        },
        "projects/alpha/PROJECT.md": "# Alpha",
        "projects/alpha/notes.md": "project notes",
        "projects/alpha/issues/kickoff/TASK.md": "# Kickoff",
        "projects/alpha/issues/kickoff/details.md": "task details",
        "agents/ceo/AGENT.md": "# CEO",
        "agents/ceo/prompt.md": "prompt",
        "skills/skill-a/SKILL.md": "# Skill A",
        "skills/skill-a/helper.md": "helper",
      },
      envInputs: [],
      warnings: [],
      errors: [],
    };

    const catalog = buildImportSelectionCatalog(preview);
    const state = buildDefaultImportSelectionState(catalog);

    expect(state.company).toBe(true);
    expect(state.projects.has("alpha")).toBe(true);
    expect(state.issues.has("kickoff")).toBe(true);
    expect(state.agents.has("ceo")).toBe(true);
    expect(state.skills.has("skill-a")).toBe(true);

    state.company = false;
    state.issues.clear();
    state.agents.clear();
    state.skills.clear();

    const selectedFiles = buildSelectedFilesFromImportSelection(catalog, state);

    expect(selectedFiles).toContain(".paperclip.yaml");
    expect(selectedFiles).toContain("projects/alpha/PROJECT.md");
    expect(selectedFiles).toContain("projects/alpha/notes.md");
    expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/TASK.md");
    expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/details.md");
  });
});

describe("default adapter overrides", () => {
  it("maps process-only imported agents to claude_local", () => {
    const preview: CompanyPortabilityPreviewResult = {
      include: {
        company: false,
        agents: true,
        projects: false,
        issues: false,
        skills: false,
      },
      targetCompanyId: null,
      targetCompanyName: null,
      collisionStrategy: "rename",
      selectedAgentSlugs: ["legacy-agent", "explicit-agent"],
      plan: {
        companyAction: "none",
        agentPlans: [],
        projectPlans: [],
        issuePlans: [],
      },
      manifest: {
        schemaVersion: 1,
        generatedAt: "2026-03-23T18:20:00.000Z",
        source: null,
        includes: {
          company: false,
          agents: true,
          projects: false,
          issues: false,
          skills: false,
        },
        company: null,
        sidebar: null,
        agents: [
          {
            slug: "legacy-agent",
            name: "Legacy Agent",
            path: "agents/legacy-agent/AGENT.md",
            skills: [],
            role: "agent",
            title: null,
            icon: null,
            capabilities: null,
            reportsToSlug: null,
            adapterType: "process",
            adapterConfig: {},
            runtimeConfig: {},
            permissions: {},
            budgetMonthlyCents: 0,
            metadata: null,
          },
          {
            slug: "explicit-agent",
            name: "Explicit Agent",
            path: "agents/explicit-agent/AGENT.md",
            skills: [],
            role: "agent",
            title: null,
            icon: null,
            capabilities: null,
            reportsToSlug: null,
            adapterType: "codex_local",
            adapterConfig: {},
            runtimeConfig: {},
            permissions: {},
            budgetMonthlyCents: 0,
            metadata: null,
          },
        ],
        skills: [],
        projects: [],
        issues: [],
        envInputs: [],
      },
      files: {},
      envInputs: [],
      warnings: [],
      errors: [],
    };

    expect(buildDefaultImportAdapterOverrides(preview)).toEqual({
      "legacy-agent": {
        adapterType: "claude_local",
      },
    });
  });
});
cli/src/__tests__/doctor.test.ts (new file, 99 lines)
@@ -0,0 +1,99 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it } from "vitest";
import { doctor } from "../commands/doctor.js";
import { writeConfig } from "../config/store.js";
import type { PaperclipConfig } from "../config/schema.js";

const ORIGINAL_ENV = { ...process.env };

function createTempConfig(): string {
  const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-doctor-"));
  const configPath = path.join(root, ".paperclip", "config.json");
  const runtimeRoot = path.join(root, "runtime");

  const config: PaperclipConfig = {
    $meta: {
      version: 1,
      updatedAt: "2026-03-10T00:00:00.000Z",
      source: "configure",
    },
    database: {
      mode: "embedded-postgres",
      embeddedPostgresDataDir: path.join(runtimeRoot, "db"),
      embeddedPostgresPort: 55432,
      backup: {
        enabled: true,
        intervalMinutes: 60,
        retentionDays: 30,
        dir: path.join(runtimeRoot, "backups"),
      },
    },
    logging: {
      mode: "file",
      logDir: path.join(runtimeRoot, "logs"),
    },
    server: {
      deploymentMode: "local_trusted",
      exposure: "private",
      host: "127.0.0.1",
      port: 3199,
      allowedHostnames: [],
      serveUi: true,
    },
    auth: {
      baseUrlMode: "auto",
      disableSignUp: false,
    },
    storage: {
      provider: "local_disk",
      localDisk: {
        baseDir: path.join(runtimeRoot, "storage"),
      },
      s3: {
        bucket: "paperclip",
        region: "us-east-1",
        prefix: "",
        forcePathStyle: false,
      },
    },
    secrets: {
      provider: "local_encrypted",
      strictMode: false,
      localEncrypted: {
        keyFilePath: path.join(runtimeRoot, "secrets", "master.key"),
      },
    },
  };

  writeConfig(config, configPath);
  return configPath;
}

describe("doctor", () => {
  beforeEach(() => {
    process.env = { ...ORIGINAL_ENV };
    delete process.env.PAPERCLIP_AGENT_JWT_SECRET;
    delete process.env.PAPERCLIP_SECRETS_MASTER_KEY;
    delete process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE;
  });

  afterEach(() => {
    process.env = { ...ORIGINAL_ENV };
  });

  it("re-runs repairable checks so repaired failures do not remain blocking", async () => {
    const configPath = createTempConfig();

    const summary = await doctor({
      config: configPath,
      repair: true,
      yes: true,
    });

    expect(summary.failed).toBe(0);
    expect(summary.warned).toBe(0);
    expect(process.env.PAPERCLIP_AGENT_JWT_SECRET).toBeTruthy();
  });
});
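// Annotation (not part of the diff): a minimal sketch of driving the same repair flow from a
// script, assuming only the doctor() options and summary fields exercised in the test above.
//
//   const summary = await doctor({ config: configPath, repair: true, yes: true });
//   if (summary.failed > 0 || summary.warned > 0) process.exitCode = 1;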
cli/src/__tests__/helpers/zip.ts (new file, 87 lines)
@@ -0,0 +1,87 @@
function writeUint16(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
}

function writeUint32(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
  target[offset + 2] = (value >>> 16) & 0xff;
  target[offset + 3] = (value >>> 24) & 0xff;
}

function crc32(bytes: Uint8Array) {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

export function createStoredZipArchive(files: Record<string, string>, rootPath: string) {
  const encoder = new TextEncoder();
  const localChunks: Uint8Array[] = [];
  const centralChunks: Uint8Array[] = [];
  let localOffset = 0;
  let entryCount = 0;

  for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
    const fileName = encoder.encode(`${rootPath}/${relativePath}`);
    const body = encoder.encode(content);
    const checksum = crc32(body);

    const localHeader = new Uint8Array(30 + fileName.length);
    writeUint32(localHeader, 0, 0x04034b50);
    writeUint16(localHeader, 4, 20);
    writeUint16(localHeader, 6, 0x0800);
    writeUint16(localHeader, 8, 0);
    writeUint32(localHeader, 14, checksum);
    writeUint32(localHeader, 18, body.length);
    writeUint32(localHeader, 22, body.length);
    writeUint16(localHeader, 26, fileName.length);
    localHeader.set(fileName, 30);

    const centralHeader = new Uint8Array(46 + fileName.length);
    writeUint32(centralHeader, 0, 0x02014b50);
    writeUint16(centralHeader, 4, 20);
    writeUint16(centralHeader, 6, 20);
    writeUint16(centralHeader, 8, 0x0800);
    writeUint16(centralHeader, 10, 0);
    writeUint32(centralHeader, 16, checksum);
    writeUint32(centralHeader, 20, body.length);
    writeUint32(centralHeader, 24, body.length);
    writeUint16(centralHeader, 28, fileName.length);
    writeUint32(centralHeader, 42, localOffset);
    centralHeader.set(fileName, 46);

    localChunks.push(localHeader, body);
    centralChunks.push(centralHeader);
    localOffset += localHeader.length + body.length;
    entryCount += 1;
  }

  const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
  const archive = new Uint8Array(
    localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
  );
  let offset = 0;
  for (const chunk of localChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  const centralDirectoryOffset = offset;
  for (const chunk of centralChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  writeUint32(archive, offset, 0x06054b50);
  writeUint16(archive, offset + 8, entryCount);
  writeUint16(archive, offset + 10, entryCount);
  writeUint32(archive, offset + 12, centralDirectoryLength);
  writeUint32(archive, offset + 16, centralDirectoryOffset);

  return archive;
}
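// Annotation (not part of the diff): createStoredZipArchive emits a minimal "stored"
// (compression method 0, uncompressed) ZIP: a local file header (signature 0x04034b50)
// followed by the entry body for each file, then the central directory records (0x02014b50)
// and a single end-of-central-directory record (0x06054b50). A typical usage, mirroring
// company-import-zip.test.ts above, would be roughly:
//
//   const bytes = createStoredZipArchive({ "COMPANY.md": "# Acme\n" }, "acme-package");
//   await writeFile(path.join(tempDir, "acme-package.zip"), bytes);
//
// The entry names and target path here are illustrative only.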
@@ -1,5 +1,5 @@
 import { afterEach, describe, expect, it, vi } from "vitest";
-import { ApiRequestError, PaperclipApiClient } from "../client/http.js";
+import { ApiConnectionError, ApiRequestError, PaperclipApiClient } from "../client/http.js";
 
 describe("PaperclipApiClient", () => {
   afterEach(() => {
@@ -58,4 +58,49 @@ describe("PaperclipApiClient", () => {
       details: { issueId: "1" },
     } satisfies Partial<ApiRequestError>);
   });
+
+  it("throws ApiConnectionError with recovery guidance when fetch fails", async () => {
+    const fetchMock = vi.fn().mockRejectedValue(new TypeError("fetch failed"));
+    vi.stubGlobal("fetch", fetchMock);
+
+    const client = new PaperclipApiClient({ apiBase: "http://localhost:3100" });
+
+    await expect(client.post("/api/companies/import/preview", {})).rejects.toBeInstanceOf(ApiConnectionError);
+    await expect(client.post("/api/companies/import/preview", {})).rejects.toMatchObject({
+      url: "http://localhost:3100/api/companies/import/preview",
+      method: "POST",
+      causeMessage: "fetch failed",
+    } satisfies Partial<ApiConnectionError>);
+    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
+      /Could not reach the Paperclip API\./,
+    );
+    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
+      /curl http:\/\/localhost:3100\/api\/health/,
+    );
+    await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
+      /pnpm dev|pnpm paperclipai run/,
+    );
+  });
+
+  it("retries once after interactive auth recovery", async () => {
+    const fetchMock = vi
+      .fn()
+      .mockResolvedValueOnce(new Response(JSON.stringify({ error: "Board access required" }), { status: 403 }))
+      .mockResolvedValueOnce(new Response(JSON.stringify({ ok: true }), { status: 200 }));
+    vi.stubGlobal("fetch", fetchMock);
+
+    const recoverAuth = vi.fn().mockResolvedValue("board-token-123");
+    const client = new PaperclipApiClient({
+      apiBase: "http://localhost:3100",
+      recoverAuth,
+    });
+
+    const result = await client.post<{ ok: boolean }>("/api/test", { hello: "world" });
+
+    expect(result).toEqual({ ok: true });
+    expect(recoverAuth).toHaveBeenCalledOnce();
+    expect(fetchMock).toHaveBeenCalledTimes(2);
+    const retryHeaders = fetchMock.mock.calls[1]?.[1]?.headers as Record<string, string>;
+    expect(retryHeaders.authorization).toBe("Bearer board-token-123");
+  });
 });
492
cli/src/__tests__/worktree-merge-history.test.ts
Normal file
492
cli/src/__tests__/worktree-merge-history.test.ts
Normal file
@@ -0,0 +1,492 @@
|
|||||||
|
import { describe, expect, it } from "vitest";
|
||||||
|
import { buildWorktreeMergePlan, parseWorktreeMergeScopes } from "../commands/worktree-merge-history-lib.js";
|
||||||
|
|
||||||
|
function makeIssue(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "issue-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
projectId: null,
|
||||||
|
projectWorkspaceId: null,
|
||||||
|
goalId: "goal-1",
|
||||||
|
parentId: null,
|
||||||
|
title: "Issue",
|
||||||
|
description: null,
|
||||||
|
status: "todo",
|
||||||
|
priority: "medium",
|
||||||
|
assigneeAgentId: null,
|
||||||
|
assigneeUserId: null,
|
||||||
|
checkoutRunId: null,
|
||||||
|
executionRunId: null,
|
||||||
|
executionAgentNameKey: null,
|
||||||
|
executionLockedAt: null,
|
||||||
|
createdByAgentId: null,
|
||||||
|
createdByUserId: "local-board",
|
||||||
|
issueNumber: 1,
|
||||||
|
identifier: "PAP-1",
|
||||||
|
requestDepth: 0,
|
||||||
|
billingCode: null,
|
||||||
|
assigneeAdapterOverrides: null,
|
||||||
|
executionWorkspaceId: null,
|
||||||
|
executionWorkspacePreference: null,
|
||||||
|
executionWorkspaceSettings: null,
|
||||||
|
startedAt: null,
|
||||||
|
completedAt: null,
|
||||||
|
cancelledAt: null,
|
||||||
|
hiddenAt: null,
|
||||||
|
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeComment(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "comment-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
issueId: "issue-1",
|
||||||
|
authorAgentId: null,
|
||||||
|
authorUserId: "local-board",
|
||||||
|
body: "hello",
|
||||||
|
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeIssueDocument(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "issue-document-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
issueId: "issue-1",
|
||||||
|
documentId: "document-1",
|
||||||
|
key: "plan",
|
||||||
|
linkCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
linkUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
title: "Plan",
|
||||||
|
format: "markdown",
|
||||||
|
latestBody: "# Plan",
|
||||||
|
latestRevisionId: "revision-1",
|
||||||
|
latestRevisionNumber: 1,
|
||||||
|
createdByAgentId: null,
|
||||||
|
createdByUserId: "local-board",
|
||||||
|
updatedByAgentId: null,
|
||||||
|
updatedByUserId: "local-board",
|
||||||
|
documentCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
documentUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeDocumentRevision(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "revision-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
documentId: "document-1",
|
||||||
|
revisionNumber: 1,
|
||||||
|
body: "# Plan",
|
||||||
|
changeSummary: null,
|
||||||
|
createdByAgentId: null,
|
||||||
|
createdByUserId: "local-board",
|
||||||
|
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeAttachment(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "attachment-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
issueId: "issue-1",
|
||||||
|
issueCommentId: null,
|
||||||
|
assetId: "asset-1",
|
||||||
|
provider: "local_disk",
|
||||||
|
objectKey: "company-1/issues/issue-1/2026/03/20/asset.png",
|
||||||
|
contentType: "image/png",
|
||||||
|
byteSize: 12,
|
||||||
|
sha256: "deadbeef",
|
||||||
|
originalFilename: "asset.png",
|
||||||
|
createdByAgentId: null,
|
||||||
|
createdByUserId: "local-board",
|
||||||
|
assetCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
assetUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
attachmentCreatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
attachmentUpdatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeProject(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "project-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
goalId: null,
|
||||||
|
name: "Project",
|
||||||
|
description: null,
|
||||||
|
status: "in_progress",
|
||||||
|
leadAgentId: null,
|
||||||
|
targetDate: null,
|
||||||
|
color: "#22c55e",
|
||||||
|
pauseReason: null,
|
||||||
|
pausedAt: null,
|
||||||
|
executionWorkspacePolicy: null,
|
||||||
|
archivedAt: null,
|
||||||
|
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
function makeProjectWorkspace(overrides: Record<string, unknown> = {}) {
|
||||||
|
return {
|
||||||
|
id: "workspace-1",
|
||||||
|
companyId: "company-1",
|
||||||
|
projectId: "project-1",
|
||||||
|
name: "Workspace",
|
||||||
|
sourceType: "local_path",
|
||||||
|
cwd: "/tmp/project",
|
||||||
|
repoUrl: "https://github.com/example/project.git",
|
||||||
|
repoRef: "main",
|
||||||
|
defaultRef: "main",
|
||||||
|
visibility: "default",
|
||||||
|
setupCommand: null,
|
||||||
|
cleanupCommand: null,
|
||||||
|
remoteProvider: null,
|
||||||
|
remoteWorkspaceRef: null,
|
||||||
|
sharedWorkspaceKey: null,
|
||||||
|
metadata: null,
|
||||||
|
isPrimary: true,
|
||||||
|
createdAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
updatedAt: new Date("2026-03-20T00:00:00.000Z"),
|
||||||
|
...overrides,
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
describe("worktree merge history planner", () => {
|
||||||
|
it("parses default scopes", () => {
|
||||||
|
expect(parseWorktreeMergeScopes(undefined)).toEqual(["issues", "comments"]);
|
||||||
|
expect(parseWorktreeMergeScopes("issues")).toEqual(["issues"]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("dedupes nested worktree issues by preserved source uuid", () => {
|
||||||
|
const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10", title: "Shared" });
|
||||||
|
const branchOneIssue = makeIssue({
|
||||||
|
id: "issue-b",
|
||||||
|
identifier: "PAP-22",
|
||||||
|
title: "Branch one issue",
|
||||||
|
createdAt: new Date("2026-03-20T01:00:00.000Z"),
|
||||||
|
});
|
||||||
|
const branchTwoIssue = makeIssue({
|
||||||
|
id: "issue-c",
|
||||||
|
identifier: "PAP-23",
|
||||||
|
title: "Branch two issue",
|
||||||
|
createdAt: new Date("2026-03-20T02:00:00.000Z"),
|
||||||
|
});
|
||||||
|
|
||||||
|
const plan = buildWorktreeMergePlan({
|
||||||
|
companyId: "company-1",
|
||||||
|
companyName: "Paperclip",
|
||||||
|
issuePrefix: "PAP",
|
||||||
|
previewIssueCounterStart: 500,
|
||||||
|
scopes: ["issues", "comments"],
|
||||||
|
sourceIssues: [sharedIssue, branchOneIssue, branchTwoIssue],
|
||||||
|
targetIssues: [sharedIssue, branchOneIssue],
|
||||||
|
sourceComments: [],
|
||||||
|
targetComments: [],
|
||||||
|
targetAgents: [],
|
||||||
|
targetProjects: [],
|
||||||
|
targetProjectWorkspaces: [],
|
||||||
|
targetGoals: [{ id: "goal-1" }] as any,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(plan.counts.issuesToInsert).toBe(1);
|
||||||
|
expect(plan.issuePlans.filter((item) => item.action === "insert").map((item) => item.source.id)).toEqual(["issue-c"]);
|
||||||
|
expect(plan.issuePlans.find((item) => item.source.id === "issue-c" && item.action === "insert")).toMatchObject({
|
||||||
|
previewIdentifier: "PAP-501",
|
||||||
|
});
|
||||||
|
});

  it("clears missing references and coerces in_progress without an assignee", () => {
    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues"],
      sourceIssues: [
        makeIssue({
          id: "issue-x",
          identifier: "PAP-99",
          status: "in_progress",
          assigneeAgentId: "agent-missing",
          projectId: "project-missing",
          projectWorkspaceId: "workspace-missing",
          goalId: "goal-missing",
        }),
      ],
      targetIssues: [],
      sourceComments: [],
      targetComments: [],
      targetAgents: [],
      targetProjects: [],
      targetProjectWorkspaces: [],
      targetGoals: [],
    });

    const insert = plan.issuePlans[0] as any;
    expect(insert.targetStatus).toBe("todo");
    expect(insert.targetAssigneeAgentId).toBeNull();
    expect(insert.targetProjectId).toBeNull();
    expect(insert.targetProjectWorkspaceId).toBeNull();
    expect(insert.targetGoalId).toBeNull();
    expect(insert.adjustments).toEqual([
      "clear_assignee_agent",
      "clear_project",
      "clear_project_workspace",
      "clear_goal",
      "coerce_in_progress_to_todo",
    ]);
  });

  it("applies an explicit project mapping override instead of clearing the project", () => {
    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues"],
      sourceIssues: [
        makeIssue({
          id: "issue-project-map",
          identifier: "PAP-77",
          projectId: "source-project-1",
          projectWorkspaceId: "source-workspace-1",
        }),
      ],
      targetIssues: [],
      sourceComments: [],
      targetComments: [],
      targetAgents: [],
      targetProjects: [{ id: "target-project-1", name: "Mapped project", status: "in_progress" }] as any,
      targetProjectWorkspaces: [],
      targetGoals: [{ id: "goal-1" }] as any,
      projectIdOverrides: {
        "source-project-1": "target-project-1",
      },
    });

    const insert = plan.issuePlans[0] as any;
    expect(insert.targetProjectId).toBe("target-project-1");
    expect(insert.projectResolution).toBe("mapped");
    expect(insert.mappedProjectName).toBe("Mapped project");
    expect(insert.targetProjectWorkspaceId).toBeNull();
    expect(insert.adjustments).toEqual(["clear_project_workspace"]);
  });

  it("plans selected project imports and preserves project workspace links", () => {
    const sourceProject = makeProject({
      id: "source-project-1",
      name: "Paperclip Evals",
      goalId: "goal-1",
    });
    const sourceWorkspace = makeProjectWorkspace({
      id: "source-workspace-1",
      projectId: "source-project-1",
      cwd: "/Users/dotta/paperclip-evals",
      repoUrl: "https://github.com/paperclipai/paperclip-evals.git",
    });

    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues"],
      sourceIssues: [
        makeIssue({
          id: "issue-project-import",
          identifier: "PAP-88",
          projectId: "source-project-1",
          projectWorkspaceId: "source-workspace-1",
        }),
      ],
      targetIssues: [],
      sourceComments: [],
      targetComments: [],
      sourceProjects: [sourceProject],
      sourceProjectWorkspaces: [sourceWorkspace],
      targetAgents: [],
      targetProjects: [],
      targetProjectWorkspaces: [],
      targetGoals: [{ id: "goal-1" }] as any,
      importProjectIds: ["source-project-1"],
    });

    expect(plan.counts.projectsToImport).toBe(1);
    expect(plan.projectImports[0]).toMatchObject({
      source: { id: "source-project-1", name: "Paperclip Evals" },
      targetGoalId: "goal-1",
      workspaces: [{ id: "source-workspace-1" }],
    });

    const insert = plan.issuePlans[0] as any;
    expect(insert.targetProjectId).toBe("source-project-1");
    expect(insert.targetProjectWorkspaceId).toBe("source-workspace-1");
    expect(insert.projectResolution).toBe("imported");
    expect(insert.mappedProjectName).toBe("Paperclip Evals");
    expect(insert.adjustments).toEqual([]);
  });

  it("imports comments onto shared or newly imported issues while skipping existing comments", () => {
    const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
    const newIssue = makeIssue({
      id: "issue-b",
      identifier: "PAP-11",
      createdAt: new Date("2026-03-20T01:00:00.000Z"),
    });
    const existingComment = makeComment({ id: "comment-existing", issueId: "issue-a" });
    const sharedIssueComment = makeComment({ id: "comment-shared", issueId: "issue-a" });
    const newIssueComment = makeComment({
      id: "comment-new-issue",
      issueId: "issue-b",
      authorAgentId: "missing-agent",
      createdAt: new Date("2026-03-20T01:05:00.000Z"),
    });

    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues", "comments"],
      sourceIssues: [sharedIssue, newIssue],
      targetIssues: [sharedIssue],
      sourceComments: [existingComment, sharedIssueComment, newIssueComment],
      targetComments: [existingComment],
      targetAgents: [],
      targetProjects: [],
      targetProjectWorkspaces: [],
      targetGoals: [{ id: "goal-1" }] as any,
    });

    expect(plan.counts.commentsToInsert).toBe(2);
    expect(plan.counts.commentsExisting).toBe(1);
    expect(plan.commentPlans.filter((item) => item.action === "insert").map((item) => item.source.id)).toEqual([
      "comment-shared",
      "comment-new-issue",
    ]);
    expect(plan.adjustments.clear_author_agent).toBe(1);
  });

  it("merges document revisions onto an existing shared document and renumbers conflicts", () => {
    const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
    const sourceDocument = makeIssueDocument({
      issueId: "issue-a",
      documentId: "document-a",
      latestBody: "# Branch plan",
      latestRevisionId: "revision-branch-2",
      latestRevisionNumber: 2,
      documentUpdatedAt: new Date("2026-03-20T02:00:00.000Z"),
      linkUpdatedAt: new Date("2026-03-20T02:00:00.000Z"),
    });
    const targetDocument = makeIssueDocument({
      issueId: "issue-a",
      documentId: "document-a",
      latestBody: "# Main plan",
      latestRevisionId: "revision-main-2",
      latestRevisionNumber: 2,
      documentUpdatedAt: new Date("2026-03-20T01:00:00.000Z"),
      linkUpdatedAt: new Date("2026-03-20T01:00:00.000Z"),
    });
    const sourceRevisionOne = makeDocumentRevision({ documentId: "document-a", id: "revision-1" });
    const sourceRevisionTwo = makeDocumentRevision({
      documentId: "document-a",
      id: "revision-branch-2",
      revisionNumber: 2,
      body: "# Branch plan",
      createdAt: new Date("2026-03-20T02:00:00.000Z"),
    });
    const targetRevisionOne = makeDocumentRevision({ documentId: "document-a", id: "revision-1" });
    const targetRevisionTwo = makeDocumentRevision({
      documentId: "document-a",
      id: "revision-main-2",
      revisionNumber: 2,
      body: "# Main plan",
      createdAt: new Date("2026-03-20T01:00:00.000Z"),
    });

    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues", "comments"],
      sourceIssues: [sharedIssue],
      targetIssues: [sharedIssue],
      sourceComments: [],
      targetComments: [],
      sourceDocuments: [sourceDocument],
      targetDocuments: [targetDocument],
      sourceDocumentRevisions: [sourceRevisionOne, sourceRevisionTwo],
      targetDocumentRevisions: [targetRevisionOne, targetRevisionTwo],
      sourceAttachments: [],
      targetAttachments: [],
      targetAgents: [],
      targetProjects: [],
      targetProjectWorkspaces: [],
      targetGoals: [{ id: "goal-1" }] as any,
    });

    expect(plan.counts.documentsToMerge).toBe(1);
    expect(plan.counts.documentRevisionsToInsert).toBe(1);
    expect(plan.documentPlans[0]).toMatchObject({
      action: "merge_existing",
      latestRevisionId: "revision-branch-2",
      latestRevisionNumber: 3,
    });
    const mergePlan = plan.documentPlans[0] as any;
    expect(mergePlan.revisionsToInsert).toHaveLength(1);
    expect(mergePlan.revisionsToInsert[0]).toMatchObject({
      source: { id: "revision-branch-2" },
      targetRevisionNumber: 3,
    });
  });

  it("imports attachments while clearing missing comment and author references", () => {
    const sharedIssue = makeIssue({ id: "issue-a", identifier: "PAP-10" });
    const attachment = makeAttachment({
      issueId: "issue-a",
      issueCommentId: "comment-missing",
      createdByAgentId: "agent-missing",
    });

    const plan = buildWorktreeMergePlan({
      companyId: "company-1",
      companyName: "Paperclip",
      issuePrefix: "PAP",
      previewIssueCounterStart: 10,
      scopes: ["issues"],
      sourceIssues: [sharedIssue],
      targetIssues: [sharedIssue],
      sourceComments: [],
      targetComments: [],
      sourceDocuments: [],
      targetDocuments: [],
      sourceDocumentRevisions: [],
      targetDocumentRevisions: [],
      sourceAttachments: [attachment],
      targetAttachments: [],
      targetAgents: [],
      targetProjects: [],
      targetProjectWorkspaces: [],
      targetGoals: [{ id: "goal-1" }] as any,
    });

    expect(plan.counts.attachmentsToInsert).toBe(1);
    expect(plan.adjustments.clear_attachment_agent).toBe(1);
    expect(plan.attachmentPlans[0]).toMatchObject({
      action: "insert",
      targetIssueCommentId: null,
      targetCreatedByAgentId: null,
    });
  });
});

cli/src/__tests__/worktree.test.ts (new file, 510 lines)
@@ -0,0 +1,510 @@
|
|||||||
|
import fs from "node:fs";
|
||||||
|
import os from "node:os";
|
||||||
|
import path from "node:path";
|
||||||
|
import { execFileSync } from "node:child_process";
|
||||||
|
import { afterEach, describe, expect, it, vi } from "vitest";
|
||||||
|
import {
|
||||||
|
copyGitHooksToWorktreeGitDir,
|
||||||
|
copySeededSecretsKey,
|
||||||
|
readSourceAttachmentBody,
|
||||||
|
rebindWorkspaceCwd,
|
||||||
|
resolveSourceConfigPath,
|
||||||
|
resolveGitWorktreeAddArgs,
|
||||||
|
resolveWorktreeMakeTargetPath,
|
||||||
|
worktreeInitCommand,
|
||||||
|
worktreeMakeCommand,
|
||||||
|
} from "../commands/worktree.js";
|
||||||
|
import {
|
||||||
|
buildWorktreeConfig,
|
||||||
|
buildWorktreeEnvEntries,
|
||||||
|
formatShellExports,
|
||||||
|
generateWorktreeColor,
|
||||||
|
resolveWorktreeSeedPlan,
|
||||||
|
resolveWorktreeLocalPaths,
|
||||||
|
rewriteLocalUrlPort,
|
||||||
|
sanitizeWorktreeInstanceId,
|
||||||
|
} from "../commands/worktree-lib.js";
|
||||||
|
import type { PaperclipConfig } from "../config/schema.js";
|
||||||
|
|
||||||
|
const ORIGINAL_CWD = process.cwd();
|
||||||
|
const ORIGINAL_ENV = { ...process.env };
|
||||||
|
|
||||||
|
afterEach(() => {
|
||||||
|
process.chdir(ORIGINAL_CWD);
|
||||||
|
for (const key of Object.keys(process.env)) {
|
||||||
|
if (!(key in ORIGINAL_ENV)) delete process.env[key];
|
||||||
|
}
|
||||||
|
for (const [key, value] of Object.entries(ORIGINAL_ENV)) {
|
||||||
|
if (value === undefined) delete process.env[key];
|
||||||
|
else process.env[key] = value;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
function buildSourceConfig(): PaperclipConfig {
|
||||||
|
return {
|
||||||
|
$meta: {
|
||||||
|
version: 1,
|
||||||
|
updatedAt: "2026-03-09T00:00:00.000Z",
|
||||||
|
source: "configure",
|
||||||
|
},
|
||||||
|
database: {
|
||||||
|
mode: "embedded-postgres",
|
||||||
|
embeddedPostgresDataDir: "/tmp/main/db",
|
||||||
|
embeddedPostgresPort: 54329,
|
||||||
|
backup: {
|
||||||
|
enabled: true,
|
||||||
|
intervalMinutes: 60,
|
||||||
|
retentionDays: 30,
|
||||||
|
dir: "/tmp/main/backups",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
logging: {
|
||||||
|
mode: "file",
|
||||||
|
logDir: "/tmp/main/logs",
|
||||||
|
},
|
||||||
|
server: {
|
||||||
|
deploymentMode: "authenticated",
|
||||||
|
exposure: "private",
|
||||||
|
host: "127.0.0.1",
|
||||||
|
port: 3100,
|
||||||
|
allowedHostnames: ["localhost"],
|
||||||
|
serveUi: true,
|
||||||
|
},
|
||||||
|
auth: {
|
||||||
|
baseUrlMode: "explicit",
|
||||||
|
publicBaseUrl: "http://127.0.0.1:3100",
|
||||||
|
disableSignUp: false,
|
||||||
|
},
|
||||||
|
storage: {
|
||||||
|
provider: "local_disk",
|
||||||
|
localDisk: {
|
||||||
|
baseDir: "/tmp/main/storage",
|
||||||
|
},
|
||||||
|
s3: {
|
||||||
|
bucket: "paperclip",
|
||||||
|
region: "us-east-1",
|
||||||
|
prefix: "",
|
||||||
|
forcePathStyle: false,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
secrets: {
|
||||||
|
provider: "local_encrypted",
|
||||||
|
strictMode: false,
|
||||||
|
localEncrypted: {
|
||||||
|
keyFilePath: "/tmp/main/secrets/master.key",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
describe("worktree helpers", () => {
|
||||||
|
it("sanitizes instance ids", () => {
|
||||||
|
expect(sanitizeWorktreeInstanceId("feature/worktree-support")).toBe("feature-worktree-support");
|
||||||
|
expect(sanitizeWorktreeInstanceId(" ")).toBe("worktree");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("resolves worktree:make target paths under the user home directory", () => {
|
||||||
|
expect(resolveWorktreeMakeTargetPath("paperclip-pr-432")).toBe(
|
||||||
|
path.resolve(os.homedir(), "paperclip-pr-432"),
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("rejects worktree:make names that are not safe directory/branch names", () => {
|
||||||
|
expect(() => resolveWorktreeMakeTargetPath("paperclip/pr-432")).toThrow(
|
||||||
|
"Worktree name must contain only letters, numbers, dots, underscores, or dashes.",
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("builds git worktree add args for new and existing branches", () => {
|
||||||
|
expect(
|
||||||
|
resolveGitWorktreeAddArgs({
|
||||||
|
branchName: "feature-branch",
|
||||||
|
targetPath: "/tmp/feature-branch",
|
||||||
|
branchExists: false,
|
||||||
|
}),
|
||||||
|
).toEqual(["worktree", "add", "-b", "feature-branch", "/tmp/feature-branch", "HEAD"]);
|
||||||
|
|
||||||
|
expect(
|
||||||
|
resolveGitWorktreeAddArgs({
|
||||||
|
branchName: "feature-branch",
|
||||||
|
targetPath: "/tmp/feature-branch",
|
||||||
|
branchExists: true,
|
||||||
|
}),
|
||||||
|
).toEqual(["worktree", "add", "/tmp/feature-branch", "feature-branch"]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("builds git worktree add args with a start point", () => {
|
||||||
|
expect(
|
||||||
|
resolveGitWorktreeAddArgs({
|
||||||
|
branchName: "my-worktree",
|
||||||
|
targetPath: "/tmp/my-worktree",
|
||||||
|
branchExists: false,
|
||||||
|
startPoint: "public-gh/master",
|
||||||
|
}),
|
||||||
|
).toEqual(["worktree", "add", "-b", "my-worktree", "/tmp/my-worktree", "public-gh/master"]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("uses start point even when a local branch with the same name exists", () => {
|
||||||
|
expect(
|
||||||
|
resolveGitWorktreeAddArgs({
|
||||||
|
branchName: "my-worktree",
|
||||||
|
targetPath: "/tmp/my-worktree",
|
||||||
|
branchExists: true,
|
||||||
|
startPoint: "origin/main",
|
||||||
|
}),
|
||||||
|
).toEqual(["worktree", "add", "-b", "my-worktree", "/tmp/my-worktree", "origin/main"]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("rewrites loopback auth URLs to the new port only", () => {
|
||||||
|
expect(rewriteLocalUrlPort("http://127.0.0.1:3100", 3110)).toBe("http://127.0.0.1:3110/");
|
||||||
|
expect(rewriteLocalUrlPort("https://paperclip.example", 3110)).toBe("https://paperclip.example");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("builds isolated config and env paths for a worktree", () => {
|
||||||
|
const paths = resolveWorktreeLocalPaths({
|
||||||
|
cwd: "/tmp/paperclip-feature",
|
||||||
|
homeDir: "/tmp/paperclip-worktrees",
|
||||||
|
instanceId: "feature-worktree-support",
|
||||||
|
});
|
||||||
|
const config = buildWorktreeConfig({
|
||||||
|
sourceConfig: buildSourceConfig(),
|
||||||
|
paths,
|
||||||
|
serverPort: 3110,
|
||||||
|
databasePort: 54339,
|
||||||
|
now: new Date("2026-03-09T12:00:00.000Z"),
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(config.database.embeddedPostgresDataDir).toBe(
|
||||||
|
path.resolve("/tmp/paperclip-worktrees", "instances", "feature-worktree-support", "db"),
|
||||||
|
);
|
||||||
|
expect(config.database.embeddedPostgresPort).toBe(54339);
|
||||||
|
expect(config.server.port).toBe(3110);
|
||||||
|
expect(config.auth.publicBaseUrl).toBe("http://127.0.0.1:3110/");
|
||||||
|
expect(config.storage.localDisk.baseDir).toBe(
|
||||||
|
path.resolve("/tmp/paperclip-worktrees", "instances", "feature-worktree-support", "data", "storage"),
|
||||||
|
);
|
||||||
|
|
||||||
|
const env = buildWorktreeEnvEntries(paths, {
|
||||||
|
name: "feature-worktree-support",
|
||||||
|
color: "#3abf7a",
|
||||||
|
});
|
||||||
|
expect(env.PAPERCLIP_HOME).toBe(path.resolve("/tmp/paperclip-worktrees"));
|
||||||
|
expect(env.PAPERCLIP_INSTANCE_ID).toBe("feature-worktree-support");
|
||||||
|
expect(env.PAPERCLIP_IN_WORKTREE).toBe("true");
|
||||||
|
expect(env.PAPERCLIP_WORKTREE_NAME).toBe("feature-worktree-support");
|
||||||
|
expect(env.PAPERCLIP_WORKTREE_COLOR).toBe("#3abf7a");
|
||||||
|
expect(formatShellExports(env)).toContain("export PAPERCLIP_INSTANCE_ID='feature-worktree-support'");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("falls back across storage roots before skipping a missing attachment object", async () => {
|
||||||
|
const missingErr = Object.assign(new Error("missing"), { code: "ENOENT" });
|
||||||
|
const expected = Buffer.from("image-bytes");
|
||||||
|
await expect(
|
||||||
|
readSourceAttachmentBody(
|
||||||
|
[
|
||||||
|
{
|
||||||
|
getObject: vi.fn().mockRejectedValue(missingErr),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
getObject: vi.fn().mockResolvedValue(expected),
|
||||||
|
},
|
||||||
|
],
|
||||||
|
"company-1",
|
||||||
|
"company-1/issues/issue-1/missing.png",
|
||||||
|
),
|
||||||
|
).resolves.toEqual(expected);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("returns null when an attachment object is missing from every lookup storage", async () => {
|
||||||
|
const missingErr = Object.assign(new Error("missing"), { code: "ENOENT" });
|
||||||
|
await expect(
|
||||||
|
readSourceAttachmentBody(
|
||||||
|
[
|
||||||
|
{
|
||||||
|
getObject: vi.fn().mockRejectedValue(missingErr),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
getObject: vi.fn().mockRejectedValue(Object.assign(new Error("missing"), { status: 404 })),
|
||||||
|
},
|
||||||
|
],
|
||||||
|
"company-1",
|
||||||
|
"company-1/issues/issue-1/missing.png",
|
||||||
|
),
|
||||||
|
).resolves.toBeNull();
|
||||||
|
});
|
||||||
|
|
||||||
|
it("generates vivid worktree colors as hex", () => {
|
||||||
|
expect(generateWorktreeColor()).toMatch(/^#[0-9a-f]{6}$/);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("uses minimal seed mode to keep app state but drop heavy runtime history", () => {
|
||||||
|
const minimal = resolveWorktreeSeedPlan("minimal");
|
||||||
|
const full = resolveWorktreeSeedPlan("full");
|
||||||
|
|
||||||
|
expect(minimal.excludedTables).toContain("heartbeat_runs");
|
||||||
|
expect(minimal.excludedTables).toContain("heartbeat_run_events");
|
||||||
|
expect(minimal.excludedTables).toContain("workspace_runtime_services");
|
||||||
|
expect(minimal.excludedTables).toContain("agent_task_sessions");
|
||||||
|
expect(minimal.nullifyColumns.issues).toEqual(["checkout_run_id", "execution_run_id"]);
|
||||||
|
|
||||||
|
expect(full.excludedTables).toEqual([]);
|
||||||
|
expect(full.nullifyColumns).toEqual({});
|
||||||
|
});
|
||||||
|
|
||||||
|
it("copies the source local_encrypted secrets key into the seeded worktree instance", () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-secrets-"));
|
||||||
|
const originalInlineMasterKey = process.env.PAPERCLIP_SECRETS_MASTER_KEY;
|
||||||
|
const originalKeyFile = process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE;
|
||||||
|
try {
|
||||||
|
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY;
|
||||||
|
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE;
|
||||||
|
const sourceConfigPath = path.join(tempRoot, "source", "config.json");
|
||||||
|
const sourceKeyPath = path.join(tempRoot, "source", "secrets", "master.key");
|
||||||
|
const targetKeyPath = path.join(tempRoot, "target", "secrets", "master.key");
|
||||||
|
fs.mkdirSync(path.dirname(sourceKeyPath), { recursive: true });
|
||||||
|
fs.writeFileSync(sourceKeyPath, "source-master-key", "utf8");
|
||||||
|
|
||||||
|
const sourceConfig = buildSourceConfig();
|
||||||
|
sourceConfig.secrets.localEncrypted.keyFilePath = sourceKeyPath;
|
||||||
|
|
||||||
|
copySeededSecretsKey({
|
||||||
|
sourceConfigPath,
|
||||||
|
sourceConfig,
|
||||||
|
sourceEnvEntries: {},
|
||||||
|
targetKeyFilePath: targetKeyPath,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(fs.readFileSync(targetKeyPath, "utf8")).toBe("source-master-key");
|
||||||
|
} finally {
|
||||||
|
if (originalInlineMasterKey === undefined) {
|
||||||
|
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY;
|
||||||
|
} else {
|
||||||
|
process.env.PAPERCLIP_SECRETS_MASTER_KEY = originalInlineMasterKey;
|
||||||
|
}
|
||||||
|
if (originalKeyFile === undefined) {
|
||||||
|
delete process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE;
|
||||||
|
} else {
|
||||||
|
process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE = originalKeyFile;
|
||||||
|
}
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("writes the source inline secrets master key into the seeded worktree instance", () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-secrets-"));
|
||||||
|
try {
|
||||||
|
const sourceConfigPath = path.join(tempRoot, "source", "config.json");
|
||||||
|
const targetKeyPath = path.join(tempRoot, "target", "secrets", "master.key");
|
||||||
|
|
||||||
|
copySeededSecretsKey({
|
||||||
|
sourceConfigPath,
|
||||||
|
sourceConfig: buildSourceConfig(),
|
||||||
|
sourceEnvEntries: {
|
||||||
|
PAPERCLIP_SECRETS_MASTER_KEY: "inline-source-master-key",
|
||||||
|
},
|
||||||
|
targetKeyFilePath: targetKeyPath,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(fs.readFileSync(targetKeyPath, "utf8")).toBe("inline-source-master-key");
|
||||||
|
} finally {
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("persists the current agent jwt secret into the worktree env file", async () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-jwt-"));
|
||||||
|
const repoRoot = path.join(tempRoot, "repo");
|
||||||
|
const originalCwd = process.cwd();
|
||||||
|
const originalJwtSecret = process.env.PAPERCLIP_AGENT_JWT_SECRET;
|
||||||
|
|
||||||
|
try {
|
||||||
|
fs.mkdirSync(repoRoot, { recursive: true });
|
||||||
|
process.env.PAPERCLIP_AGENT_JWT_SECRET = "worktree-shared-secret";
|
||||||
|
process.chdir(repoRoot);
|
||||||
|
|
||||||
|
await worktreeInitCommand({
|
||||||
|
seed: false,
|
||||||
|
fromConfig: path.join(tempRoot, "missing", "config.json"),
|
||||||
|
home: path.join(tempRoot, ".paperclip-worktrees"),
|
||||||
|
});
|
||||||
|
|
||||||
|
const envPath = path.join(repoRoot, ".paperclip", ".env");
|
||||||
|
const envContents = fs.readFileSync(envPath, "utf8");
|
||||||
|
expect(envContents).toContain("PAPERCLIP_AGENT_JWT_SECRET=worktree-shared-secret");
|
||||||
|
expect(envContents).toContain("PAPERCLIP_WORKTREE_NAME=repo");
|
||||||
|
expect(envContents).toMatch(/PAPERCLIP_WORKTREE_COLOR=\"#[0-9a-f]{6}\"/);
|
||||||
|
} finally {
|
||||||
|
process.chdir(originalCwd);
|
||||||
|
if (originalJwtSecret === undefined) {
|
||||||
|
delete process.env.PAPERCLIP_AGENT_JWT_SECRET;
|
||||||
|
} else {
|
||||||
|
process.env.PAPERCLIP_AGENT_JWT_SECRET = originalJwtSecret;
|
||||||
|
}
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("defaults the seed source config to the current repo-local Paperclip config", () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-source-config-"));
|
||||||
|
const repoRoot = path.join(tempRoot, "repo");
|
||||||
|
const localConfigPath = path.join(repoRoot, ".paperclip", "config.json");
|
||||||
|
const originalCwd = process.cwd();
|
||||||
|
const originalPaperclipConfig = process.env.PAPERCLIP_CONFIG;
|
||||||
|
|
||||||
|
try {
|
||||||
|
fs.mkdirSync(path.dirname(localConfigPath), { recursive: true });
|
||||||
|
fs.writeFileSync(localConfigPath, JSON.stringify(buildSourceConfig()), "utf8");
|
||||||
|
delete process.env.PAPERCLIP_CONFIG;
|
||||||
|
process.chdir(repoRoot);
|
||||||
|
|
||||||
|
expect(fs.realpathSync(resolveSourceConfigPath({}))).toBe(fs.realpathSync(localConfigPath));
|
||||||
|
} finally {
|
||||||
|
process.chdir(originalCwd);
|
||||||
|
if (originalPaperclipConfig === undefined) {
|
||||||
|
delete process.env.PAPERCLIP_CONFIG;
|
||||||
|
} else {
|
||||||
|
process.env.PAPERCLIP_CONFIG = originalPaperclipConfig;
|
||||||
|
}
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("preserves the source config path across worktree:make cwd changes", () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-source-override-"));
|
||||||
|
const sourceConfigPath = path.join(tempRoot, "source", "config.json");
|
||||||
|
const targetRoot = path.join(tempRoot, "target");
|
||||||
|
const originalCwd = process.cwd();
|
||||||
|
const originalPaperclipConfig = process.env.PAPERCLIP_CONFIG;
|
||||||
|
|
||||||
|
try {
|
||||||
|
fs.mkdirSync(path.dirname(sourceConfigPath), { recursive: true });
|
||||||
|
fs.mkdirSync(targetRoot, { recursive: true });
|
||||||
|
fs.writeFileSync(sourceConfigPath, JSON.stringify(buildSourceConfig()), "utf8");
|
||||||
|
delete process.env.PAPERCLIP_CONFIG;
|
||||||
|
process.chdir(targetRoot);
|
||||||
|
|
||||||
|
expect(resolveSourceConfigPath({ sourceConfigPathOverride: sourceConfigPath })).toBe(
|
||||||
|
path.resolve(sourceConfigPath),
|
||||||
|
);
|
||||||
|
} finally {
|
||||||
|
process.chdir(originalCwd);
|
||||||
|
if (originalPaperclipConfig === undefined) {
|
||||||
|
delete process.env.PAPERCLIP_CONFIG;
|
||||||
|
} else {
|
||||||
|
process.env.PAPERCLIP_CONFIG = originalPaperclipConfig;
|
||||||
|
}
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("rebinds same-repo workspace paths onto the current worktree root", () => {
|
||||||
|
expect(
|
||||||
|
rebindWorkspaceCwd({
|
||||||
|
sourceRepoRoot: "/Users/example/paperclip",
|
||||||
|
targetRepoRoot: "/Users/example/paperclip-pr-432",
|
||||||
|
workspaceCwd: "/Users/example/paperclip",
|
||||||
|
}),
|
||||||
|
).toBe("/Users/example/paperclip-pr-432");
|
||||||
|
|
||||||
|
expect(
|
||||||
|
rebindWorkspaceCwd({
|
||||||
|
sourceRepoRoot: "/Users/example/paperclip",
|
||||||
|
targetRepoRoot: "/Users/example/paperclip-pr-432",
|
||||||
|
workspaceCwd: "/Users/example/paperclip/packages/db",
|
||||||
|
}),
|
||||||
|
).toBe("/Users/example/paperclip-pr-432/packages/db");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("does not rebind paths outside the source repo root", () => {
|
||||||
|
expect(
|
||||||
|
rebindWorkspaceCwd({
|
||||||
|
sourceRepoRoot: "/Users/example/paperclip",
|
||||||
|
targetRepoRoot: "/Users/example/paperclip-pr-432",
|
||||||
|
workspaceCwd: "/Users/example/other-project",
|
||||||
|
}),
|
||||||
|
).toBeNull();
|
||||||
|
});
|
||||||
|
|
||||||
|
it("copies shared git hooks into a linked worktree git dir", () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-hooks-"));
|
||||||
|
const repoRoot = path.join(tempRoot, "repo");
|
||||||
|
const worktreePath = path.join(tempRoot, "repo-feature");
|
||||||
|
|
||||||
|
try {
|
||||||
|
fs.mkdirSync(repoRoot, { recursive: true });
|
||||||
|
execFileSync("git", ["init"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["config", "user.email", "test@example.com"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["config", "user.name", "Test User"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
fs.writeFileSync(path.join(repoRoot, "README.md"), "# temp\n", "utf8");
|
||||||
|
execFileSync("git", ["add", "README.md"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["commit", "-m", "Initial commit"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
|
||||||
|
const sourceHooksDir = path.join(repoRoot, ".git", "hooks");
|
||||||
|
const sourceHookPath = path.join(sourceHooksDir, "pre-commit");
|
||||||
|
const sourceTokensPath = path.join(sourceHooksDir, "forbidden-tokens.txt");
|
||||||
|
fs.writeFileSync(sourceHookPath, "#!/usr/bin/env bash\nexit 0\n", { encoding: "utf8", mode: 0o755 });
|
||||||
|
fs.chmodSync(sourceHookPath, 0o755);
|
||||||
|
fs.writeFileSync(sourceTokensPath, "secret-token\n", "utf8");
|
||||||
|
|
||||||
|
execFileSync("git", ["worktree", "add", "--detach", worktreePath], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
|
||||||
|
const copied = copyGitHooksToWorktreeGitDir(worktreePath);
|
||||||
|
const worktreeGitDir = execFileSync("git", ["rev-parse", "--git-dir"], {
|
||||||
|
cwd: worktreePath,
|
||||||
|
encoding: "utf8",
|
||||||
|
stdio: ["ignore", "pipe", "ignore"],
|
||||||
|
}).trim();
|
||||||
|
const resolvedSourceHooksDir = fs.realpathSync(sourceHooksDir);
|
||||||
|
const resolvedTargetHooksDir = fs.realpathSync(path.resolve(worktreePath, worktreeGitDir, "hooks"));
|
||||||
|
const targetHookPath = path.join(resolvedTargetHooksDir, "pre-commit");
|
||||||
|
const targetTokensPath = path.join(resolvedTargetHooksDir, "forbidden-tokens.txt");
|
||||||
|
|
||||||
|
expect(copied).toMatchObject({
|
||||||
|
sourceHooksPath: resolvedSourceHooksDir,
|
||||||
|
targetHooksPath: resolvedTargetHooksDir,
|
||||||
|
copied: true,
|
||||||
|
});
|
||||||
|
expect(fs.readFileSync(targetHookPath, "utf8")).toBe("#!/usr/bin/env bash\nexit 0\n");
|
||||||
|
expect(fs.statSync(targetHookPath).mode & 0o111).not.toBe(0);
|
||||||
|
expect(fs.readFileSync(targetTokensPath, "utf8")).toBe("secret-token\n");
|
||||||
|
} finally {
|
||||||
|
execFileSync("git", ["worktree", "remove", "--force", worktreePath], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
it("creates and initializes a worktree from the top-level worktree:make command", async () => {
|
||||||
|
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-make-"));
|
||||||
|
const repoRoot = path.join(tempRoot, "repo");
|
||||||
|
const fakeHome = path.join(tempRoot, "home");
|
||||||
|
const worktreePath = path.join(fakeHome, "paperclip-make-test");
|
||||||
|
const originalCwd = process.cwd();
|
||||||
|
const homedirSpy = vi.spyOn(os, "homedir").mockReturnValue(fakeHome);
|
||||||
|
|
||||||
|
try {
|
||||||
|
fs.mkdirSync(repoRoot, { recursive: true });
|
||||||
|
fs.mkdirSync(fakeHome, { recursive: true });
|
||||||
|
execFileSync("git", ["init"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["config", "user.email", "test@example.com"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["config", "user.name", "Test User"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
fs.writeFileSync(path.join(repoRoot, "README.md"), "# temp\n", "utf8");
|
||||||
|
execFileSync("git", ["add", "README.md"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
execFileSync("git", ["commit", "-m", "Initial commit"], { cwd: repoRoot, stdio: "ignore" });
|
||||||
|
|
||||||
|
process.chdir(repoRoot);
|
||||||
|
|
||||||
|
await worktreeMakeCommand("paperclip-make-test", {
|
||||||
|
seed: false,
|
||||||
|
home: path.join(tempRoot, ".paperclip-worktrees"),
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(fs.existsSync(path.join(worktreePath, ".git"))).toBe(true);
|
||||||
|
expect(fs.existsSync(path.join(worktreePath, ".paperclip", "config.json"))).toBe(true);
|
||||||
|
expect(fs.existsSync(path.join(worktreePath, ".paperclip", ".env"))).toBe(true);
|
||||||
|
} finally {
|
||||||
|
process.chdir(originalCwd);
|
||||||
|
homedirSpy.mockRestore();
|
||||||
|
fs.rmSync(tempRoot, { recursive: true, force: true });
|
||||||
|
}
|
||||||
|
}, 20_000);
|
||||||
|
});

@@ -1,7 +1,11 @@
 import type { CLIAdapterModule } from "@paperclipai/adapter-utils";
 import { printClaudeStreamEvent } from "@paperclipai/adapter-claude-local/cli";
 import { printCodexStreamEvent } from "@paperclipai/adapter-codex-local/cli";
-import { printOpenClawStreamEvent } from "@paperclipai/adapter-openclaw/cli";
+import { printCursorStreamEvent } from "@paperclipai/adapter-cursor-local/cli";
+import { printGeminiStreamEvent } from "@paperclipai/adapter-gemini-local/cli";
+import { printOpenCodeStreamEvent } from "@paperclipai/adapter-opencode-local/cli";
+import { printPiStreamEvent } from "@paperclipai/adapter-pi-local/cli";
+import { printOpenClawGatewayStreamEvent } from "@paperclipai/adapter-openclaw-gateway/cli";
 import { processCLIAdapter } from "./process/index.js";
 import { httpCLIAdapter } from "./http/index.js";

@@ -15,13 +19,43 @@ const codexLocalCLIAdapter: CLIAdapterModule = {
   formatStdoutEvent: printCodexStreamEvent,
 };

-const openclawCLIAdapter: CLIAdapterModule = {
-  type: "openclaw",
-  formatStdoutEvent: printOpenClawStreamEvent,
+const openCodeLocalCLIAdapter: CLIAdapterModule = {
+  type: "opencode_local",
+  formatStdoutEvent: printOpenCodeStreamEvent,
+};
+
+const piLocalCLIAdapter: CLIAdapterModule = {
+  type: "pi_local",
+  formatStdoutEvent: printPiStreamEvent,
+};
+
+const cursorLocalCLIAdapter: CLIAdapterModule = {
+  type: "cursor",
+  formatStdoutEvent: printCursorStreamEvent,
+};
+
+const geminiLocalCLIAdapter: CLIAdapterModule = {
+  type: "gemini_local",
+  formatStdoutEvent: printGeminiStreamEvent,
+};
+
+const openclawGatewayCLIAdapter: CLIAdapterModule = {
+  type: "openclaw_gateway",
+  formatStdoutEvent: printOpenClawGatewayStreamEvent,
 };

 const adaptersByType = new Map<string, CLIAdapterModule>(
-  [claudeLocalCLIAdapter, codexLocalCLIAdapter, openclawCLIAdapter, processCLIAdapter, httpCLIAdapter].map((a) => [a.type, a]),
+  [
+    claudeLocalCLIAdapter,
+    codexLocalCLIAdapter,
+    openCodeLocalCLIAdapter,
+    piLocalCLIAdapter,
+    cursorLocalCLIAdapter,
+    geminiLocalCLIAdapter,
+    openclawGatewayCLIAdapter,
+    processCLIAdapter,
+    httpCLIAdapter,
+  ].map((a) => [a.type, a]),
 );

 export function getCLIAdapter(type: string): CLIAdapterModule {

cli/src/client/board-auth.ts (new file, 282 lines)
@@ -0,0 +1,282 @@
|
|||||||
|
import { spawn } from "node:child_process";
|
||||||
|
import fs from "node:fs";
|
||||||
|
import path from "node:path";
|
||||||
|
import pc from "picocolors";
|
||||||
|
import { buildCliCommandLabel } from "./command-label.js";
|
||||||
|
import { resolveDefaultCliAuthPath } from "../config/home.js";
|
||||||
|
|
||||||
|
type RequestedAccess = "board" | "instance_admin_required";
|
||||||
|
|
||||||
|
interface BoardAuthCredential {
|
||||||
|
apiBase: string;
|
||||||
|
token: string;
|
||||||
|
createdAt: string;
|
||||||
|
updatedAt: string;
|
||||||
|
userId?: string | null;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface BoardAuthStore {
|
||||||
|
version: 1;
|
||||||
|
credentials: Record<string, BoardAuthCredential>;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface CreateChallengeResponse {
|
||||||
|
id: string;
|
||||||
|
token: string;
|
||||||
|
boardApiToken: string;
|
||||||
|
approvalPath: string;
|
||||||
|
approvalUrl: string | null;
|
||||||
|
pollPath: string;
|
||||||
|
expiresAt: string;
|
||||||
|
suggestedPollIntervalMs: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ChallengeStatusResponse {
|
||||||
|
id: string;
|
||||||
|
status: "pending" | "approved" | "cancelled" | "expired";
|
||||||
|
command: string;
|
||||||
|
clientName: string | null;
|
||||||
|
requestedAccess: RequestedAccess;
|
||||||
|
requestedCompanyId: string | null;
|
||||||
|
requestedCompanyName: string | null;
|
||||||
|
approvedAt: string | null;
|
||||||
|
cancelledAt: string | null;
|
||||||
|
expiresAt: string;
|
||||||
|
approvedByUser: { id: string; name: string; email: string } | null;
|
||||||
|
}
|
||||||
|
|
||||||
|
function defaultBoardAuthStore(): BoardAuthStore {
|
||||||
|
return {
|
||||||
|
version: 1,
|
||||||
|
credentials: {},
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
function toStringOrNull(value: unknown): string | null {
|
||||||
|
return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
|
||||||
|
}
|
||||||
|
|
||||||
|
function normalizeApiBase(apiBase: string): string {
|
||||||
|
return apiBase.trim().replace(/\/+$/, "");
|
||||||
|
}
|
||||||
|
|
||||||
|
export function resolveBoardAuthStorePath(overridePath?: string): string {
|
||||||
|
if (overridePath?.trim()) return path.resolve(overridePath.trim());
|
||||||
|
if (process.env.PAPERCLIP_AUTH_STORE?.trim()) return path.resolve(process.env.PAPERCLIP_AUTH_STORE.trim());
|
||||||
|
return resolveDefaultCliAuthPath();
|
||||||
|
}
|
||||||
|
|
||||||
|
export function readBoardAuthStore(storePath?: string): BoardAuthStore {
|
||||||
|
const filePath = resolveBoardAuthStorePath(storePath);
|
||||||
|
if (!fs.existsSync(filePath)) return defaultBoardAuthStore();
|
||||||
|
|
||||||
|
const raw = JSON.parse(fs.readFileSync(filePath, "utf8")) as Partial<BoardAuthStore> | null;
|
||||||
|
const credentials = raw?.credentials && typeof raw.credentials === "object" ? raw.credentials : {};
|
||||||
|
const normalized: Record<string, BoardAuthCredential> = {};
|
||||||
|
|
||||||
|
for (const [key, value] of Object.entries(credentials)) {
|
||||||
|
if (typeof value !== "object" || value === null) continue;
|
||||||
|
const record = value as unknown as Record<string, unknown>;
|
||||||
|
const apiBase = toStringOrNull(record.apiBase);
|
||||||
|
const token = toStringOrNull(record.token);
|
||||||
|
const createdAt = toStringOrNull(record.createdAt);
|
||||||
|
const updatedAt = toStringOrNull(record.updatedAt);
|
||||||
|
if (!apiBase || !token || !createdAt || !updatedAt) continue;
|
||||||
|
normalized[normalizeApiBase(key)] = {
|
||||||
|
apiBase,
|
||||||
|
token,
|
||||||
|
createdAt,
|
||||||
|
updatedAt,
|
||||||
|
userId: toStringOrNull(record.userId),
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
version: 1,
|
||||||
|
credentials: normalized,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
export function writeBoardAuthStore(store: BoardAuthStore, storePath?: string): void {
|
||||||
|
const filePath = resolveBoardAuthStorePath(storePath);
|
||||||
|
fs.mkdirSync(path.dirname(filePath), { recursive: true });
|
||||||
|
fs.writeFileSync(filePath, `${JSON.stringify(store, null, 2)}\n`, { mode: 0o600 });
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getStoredBoardCredential(apiBase: string, storePath?: string): BoardAuthCredential | null {
|
||||||
|
const store = readBoardAuthStore(storePath);
|
||||||
|
return store.credentials[normalizeApiBase(apiBase)] ?? null;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function setStoredBoardCredential(input: {
|
||||||
|
apiBase: string;
|
||||||
|
token: string;
|
||||||
|
userId?: string | null;
|
||||||
|
storePath?: string;
|
||||||
|
}): BoardAuthCredential {
|
||||||
|
const normalizedApiBase = normalizeApiBase(input.apiBase);
|
||||||
|
const store = readBoardAuthStore(input.storePath);
|
||||||
|
const now = new Date().toISOString();
|
||||||
|
const existing = store.credentials[normalizedApiBase];
|
||||||
|
const credential: BoardAuthCredential = {
|
||||||
|
apiBase: normalizedApiBase,
|
||||||
|
token: input.token.trim(),
|
||||||
|
createdAt: existing?.createdAt ?? now,
|
||||||
|
updatedAt: now,
|
||||||
|
userId: input.userId ?? existing?.userId ?? null,
|
||||||
|
};
|
||||||
|
store.credentials[normalizedApiBase] = credential;
|
||||||
|
writeBoardAuthStore(store, input.storePath);
|
||||||
|
return credential;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function removeStoredBoardCredential(apiBase: string, storePath?: string): boolean {
|
||||||
|
const normalizedApiBase = normalizeApiBase(apiBase);
|
||||||
|
const store = readBoardAuthStore(storePath);
|
||||||
|
if (!store.credentials[normalizedApiBase]) return false;
|
||||||
|
delete store.credentials[normalizedApiBase];
|
||||||
|
writeBoardAuthStore(store, storePath);
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
|
||||||
|
function sleep(ms: number) {
|
||||||
|
return new Promise((resolve) => setTimeout(resolve, ms));
|
||||||
|
}
|
||||||
|
|
||||||
|
async function requestJson<T>(url: string, init?: RequestInit): Promise<T> {
|
||||||
|
const headers = new Headers(init?.headers ?? undefined);
|
||||||
|
if (init?.body !== undefined && !headers.has("content-type")) {
|
||||||
|
headers.set("content-type", "application/json");
|
||||||
|
}
|
||||||
|
if (!headers.has("accept")) {
|
||||||
|
headers.set("accept", "application/json");
|
||||||
|
}
|
||||||
|
|
||||||
|
const response = await fetch(url, {
|
||||||
|
...init,
|
||||||
|
headers,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const body = await response.json().catch(() => null);
|
||||||
|
const message =
|
||||||
|
body && typeof body === "object" && typeof (body as { error?: unknown }).error === "string"
|
||||||
|
? (body as { error: string }).error
|
||||||
|
: `Request failed: ${response.status}`;
|
||||||
|
throw new Error(message);
|
||||||
|
}
|
||||||
|
|
||||||
|
return response.json() as Promise<T>;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function openUrl(url: string): boolean {
|
||||||
|
const platform = process.platform;
|
||||||
|
try {
|
||||||
|
if (platform === "darwin") {
|
||||||
|
const child = spawn("open", [url], { detached: true, stdio: "ignore" });
|
||||||
|
child.unref();
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
if (platform === "win32") {
|
||||||
|
const child = spawn("cmd", ["/c", "start", "", url], { detached: true, stdio: "ignore" });
|
||||||
|
child.unref();
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
const child = spawn("xdg-open", [url], { detached: true, stdio: "ignore" });
|
||||||
|
child.unref();
|
||||||
|
return true;
|
||||||
|
} catch {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function loginBoardCli(params: {
|
||||||
|
apiBase: string;
|
||||||
|
requestedAccess: RequestedAccess;
|
||||||
|
requestedCompanyId?: string | null;
|
||||||
|
clientName?: string | null;
|
||||||
|
command?: string;
|
||||||
|
storePath?: string;
|
||||||
|
print?: boolean;
|
||||||
|
}): Promise<{ token: string; approvalUrl: string; userId?: string | null }> {
|
||||||
|
const apiBase = normalizeApiBase(params.apiBase);
|
||||||
|
const createUrl = `${apiBase}/api/cli-auth/challenges`;
|
||||||
|
const command = params.command?.trim() || buildCliCommandLabel();
|
||||||
|
|
||||||
|
const challenge = await requestJson<CreateChallengeResponse>(createUrl, {
|
||||||
|
method: "POST",
|
||||||
|
body: JSON.stringify({
|
||||||
|
command,
|
||||||
|
clientName: params.clientName?.trim() || "paperclipai cli",
|
||||||
|
requestedAccess: params.requestedAccess,
|
||||||
|
requestedCompanyId: params.requestedCompanyId?.trim() || null,
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
const approvalUrl = challenge.approvalUrl ?? `${apiBase}${challenge.approvalPath}`;
|
||||||
|
if (params.print !== false) {
|
||||||
|
console.error(pc.bold("Board authentication required"));
|
||||||
|
console.error(`Open this URL in your browser to approve CLI access:\n${approvalUrl}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const opened = openUrl(approvalUrl);
|
||||||
|
if (params.print !== false && opened) {
|
||||||
|
console.error(pc.dim("Opened the approval page in your browser."));
|
||||||
|
}
|
||||||
|
|
||||||
|
const expiresAtMs = Date.parse(challenge.expiresAt);
|
||||||
|
const pollMs = Math.max(500, challenge.suggestedPollIntervalMs || 1000);
|
||||||
|
|
||||||
|
while (Number.isFinite(expiresAtMs) ? Date.now() < expiresAtMs : true) {
|
||||||
|
const status = await requestJson<ChallengeStatusResponse>(
|
||||||
|
`${apiBase}/api${challenge.pollPath}?token=${encodeURIComponent(challenge.token)}`,
|
||||||
|
);
|
||||||
|
|
||||||
|
if (status.status === "approved") {
|
||||||
|
const me = await requestJson<{ userId: string; user?: { id: string } | null }>(
|
||||||
|
`${apiBase}/api/cli-auth/me`,
|
||||||
|
{
|
||||||
|
headers: {
|
||||||
|
authorization: `Bearer ${challenge.boardApiToken}`,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
);
|
||||||
|
setStoredBoardCredential({
|
||||||
|
apiBase,
|
||||||
|
token: challenge.boardApiToken,
|
||||||
|
userId: me.userId ?? me.user?.id ?? null,
|
||||||
|
storePath: params.storePath,
|
||||||
|
});
|
||||||
|
return {
|
||||||
|
token: challenge.boardApiToken,
|
||||||
|
approvalUrl,
|
||||||
|
userId: me.userId ?? me.user?.id ?? null,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
if (status.status === "cancelled") {
|
||||||
|
throw new Error("CLI auth challenge was cancelled.");
|
||||||
|
}
|
||||||
|
if (status.status === "expired") {
|
||||||
|
throw new Error("CLI auth challenge expired before approval.");
|
||||||
|
}
|
||||||
|
|
||||||
|
await sleep(pollMs);
|
||||||
|
}
|
||||||
|
|
||||||
|
throw new Error("CLI auth challenge expired before approval.");
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function revokeStoredBoardCredential(params: {
|
||||||
|
apiBase: string;
|
||||||
|
token: string;
|
||||||
|
}): Promise<void> {
|
||||||
|
const apiBase = normalizeApiBase(params.apiBase);
|
||||||
|
await requestJson<{ revoked: boolean }>(`${apiBase}/api/cli-auth/revoke-current`, {
|
||||||
|
method: "POST",
|
||||||
|
headers: {
|
||||||
|
authorization: `Bearer ${params.token}`,
|
||||||
|
},
|
||||||
|
body: JSON.stringify({}),
|
||||||
|
});
|
||||||
|
}
|

cli/src/client/command-label.ts (new file, 4 lines)
@@ -0,0 +1,4 @@
export function buildCliCommandLabel(): string {
  const args = process.argv.slice(2);
  return args.length > 0 ? `paperclipai ${args.join(" ")}` : "paperclipai";
}

@@ -13,25 +13,54 @@ export class ApiRequestError extends Error {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
export class ApiConnectionError extends Error {
|
||||||
|
url: string;
|
||||||
|
method: string;
|
||||||
|
causeMessage?: string;
|
||||||
|
|
||||||
|
constructor(input: {
|
||||||
|
apiBase: string;
|
||||||
|
path: string;
|
||||||
|
method: string;
|
||||||
|
cause?: unknown;
|
||||||
|
}) {
|
||||||
|
const url = buildUrl(input.apiBase, input.path);
|
||||||
|
const causeMessage = formatConnectionCause(input.cause);
|
||||||
|
super(buildConnectionErrorMessage({ apiBase: input.apiBase, url, method: input.method, causeMessage }));
|
||||||
|
this.url = url;
|
||||||
|
this.method = input.method;
|
||||||
|
this.causeMessage = causeMessage;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
interface RequestOptions {
|
interface RequestOptions {
|
||||||
ignoreNotFound?: boolean;
|
ignoreNotFound?: boolean;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
interface RecoverAuthInput {
|
||||||
|
path: string;
|
||||||
|
method: string;
|
||||||
|
error: ApiRequestError;
|
||||||
|
}
|
||||||
|
|
||||||
interface ApiClientOptions {
|
interface ApiClientOptions {
|
||||||
apiBase: string;
|
apiBase: string;
|
||||||
apiKey?: string;
|
apiKey?: string;
|
||||||
runId?: string;
|
runId?: string;
|
||||||
|
recoverAuth?: (input: RecoverAuthInput) => Promise<string | null>;
|
||||||
}
|
}
|
||||||
|
|
||||||
export class PaperclipApiClient {
|
export class PaperclipApiClient {
|
||||||
readonly apiBase: string;
|
readonly apiBase: string;
|
||||||
readonly apiKey?: string;
|
apiKey?: string;
|
||||||
readonly runId?: string;
|
readonly runId?: string;
|
||||||
|
readonly recoverAuth?: (input: RecoverAuthInput) => Promise<string | null>;
|
||||||
|
|
||||||
constructor(opts: ApiClientOptions) {
|
constructor(opts: ApiClientOptions) {
|
||||||
this.apiBase = opts.apiBase.replace(/\/+$/, "");
|
this.apiBase = opts.apiBase.replace(/\/+$/, "");
|
||||||
this.apiKey = opts.apiKey?.trim() || undefined;
|
this.apiKey = opts.apiKey?.trim() || undefined;
|
||||||
this.runId = opts.runId?.trim() || undefined;
|
this.runId = opts.runId?.trim() || undefined;
|
||||||
|
this.recoverAuth = opts.recoverAuth;
|
||||||
}
|
}
|
||||||
|
|
||||||
get<T>(path: string, opts?: RequestOptions): Promise<T | null> {
|
get<T>(path: string, opts?: RequestOptions): Promise<T | null> {
|
||||||
@@ -56,8 +85,18 @@ export class PaperclipApiClient {
|
|||||||
return this.request<T>(path, { method: "DELETE" }, opts);
|
return this.request<T>(path, { method: "DELETE" }, opts);
|
||||||
}
|
}
|
||||||
|
|
||||||
private async request<T>(path: string, init: RequestInit, opts?: RequestOptions): Promise<T | null> {
|
setApiKey(apiKey: string | undefined) {
|
||||||
|
this.apiKey = apiKey?.trim() || undefined;
|
||||||
|
}
|
||||||
|
|
||||||
|
private async request<T>(
|
||||||
|
path: string,
|
||||||
|
init: RequestInit,
|
||||||
|
opts?: RequestOptions,
|
||||||
|
hasRetriedAuth = false,
|
||||||
|
): Promise<T | null> {
|
||||||
const url = buildUrl(this.apiBase, path);
|
const url = buildUrl(this.apiBase, path);
|
||||||
|
const method = String(init.method ?? "GET").toUpperCase();
|
||||||
|
|
||||||
const headers: Record<string, string> = {
|
const headers: Record<string, string> = {
|
||||||
accept: "application/json",
|
accept: "application/json",
|
||||||
@@ -76,17 +115,39 @@ export class PaperclipApiClient {
|
|||||||
headers["x-paperclip-run-id"] = this.runId;
|
headers["x-paperclip-run-id"] = this.runId;
|
||||||
}
|
}
|
||||||
|
|
||||||
const response = await fetch(url, {
|
let response: Response;
|
||||||
|
try {
|
||||||
|
response = await fetch(url, {
|
||||||
...init,
|
...init,
|
||||||
headers,
|
headers,
|
||||||
});
|
});
|
||||||
|
} catch (error) {
|
||||||
|
throw new ApiConnectionError({
|
||||||
|
apiBase: this.apiBase,
|
||||||
|
path,
|
||||||
|
method,
|
||||||
|
cause: error,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
if (opts?.ignoreNotFound && response.status === 404) {
|
if (opts?.ignoreNotFound && response.status === 404) {
|
||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
throw await toApiError(response);
|
const apiError = await toApiError(response);
|
||||||
|
if (!hasRetriedAuth && this.recoverAuth) {
|
||||||
|
const recoveredToken = await this.recoverAuth({
|
||||||
|
path,
|
||||||
|
method,
|
||||||
|
error: apiError,
|
||||||
|
});
|
||||||
|
if (recoveredToken) {
|
||||||
|
this.setApiKey(recoveredToken);
|
||||||
|
return this.request<T>(path, init, opts, true);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
throw apiError;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (response.status === 204) {
|
if (response.status === 204) {
|
||||||
@@ -104,8 +165,10 @@ export class PaperclipApiClient {
|
|||||||
|
|
||||||
function buildUrl(apiBase: string, path: string): string {
|
function buildUrl(apiBase: string, path: string): string {
|
||||||
const normalizedPath = path.startsWith("/") ? path : `/${path}`;
|
const normalizedPath = path.startsWith("/") ? path : `/${path}`;
|
||||||
|
const [pathname, query] = normalizedPath.split("?");
|
||||||
const url = new URL(apiBase);
|
const url = new URL(apiBase);
|
||||||
url.pathname = `${url.pathname.replace(/\/+$/, "")}${normalizedPath}`;
|
url.pathname = `${url.pathname.replace(/\/+$/, "")}${pathname}`;
|
||||||
|
if (query) url.search = query;
|
||||||
return url.toString();
|
return url.toString();
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -134,6 +197,50 @@ async function toApiError(response: Response): Promise<ApiRequestError> {
|
|||||||
return new ApiRequestError(response.status, `Request failed with status ${response.status}`, undefined, parsed);
|
return new ApiRequestError(response.status, `Request failed with status ${response.status}`, undefined, parsed);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
function buildConnectionErrorMessage(input: {
|
||||||
|
apiBase: string;
|
||||||
|
url: string;
|
||||||
|
method: string;
|
||||||
|
causeMessage?: string;
|
||||||
|
}): string {
|
||||||
|
const healthUrl = buildHealthCheckUrl(input.url);
|
||||||
|
const lines = [
|
||||||
|
"Could not reach the Paperclip API.",
|
||||||
|
"",
|
||||||
|
`Request: ${input.method} ${input.url}`,
|
||||||
|
];
|
||||||
|
if (input.causeMessage) {
|
||||||
|
lines.push(`Cause: ${input.causeMessage}`);
|
||||||
|
}
|
||||||
|
lines.push(
|
||||||
|
"",
|
||||||
|
"This usually means the Paperclip server is not running, the configured URL is wrong, or the request is being blocked before it reaches Paperclip.",
|
||||||
|
"",
|
||||||
|
"Try:",
|
||||||
|
"- Start Paperclip with `pnpm dev` or `pnpm paperclipai run`.",
|
||||||
|
`- Verify the server is reachable with \`curl ${healthUrl}\`.`,
|
||||||
|
`- If Paperclip is running elsewhere, pass \`--api-base ${input.apiBase.replace(/\/+$/, "")}\` or set \`PAPERCLIP_API_URL\`.`,
|
||||||
|
);
|
||||||
|
return lines.join("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
function buildHealthCheckUrl(requestUrl: string): string {
|
||||||
|
const url = new URL(requestUrl);
|
||||||
|
url.pathname = `${url.pathname.replace(/\/+$/, "").replace(/\/api(?:\/.*)?$/, "")}/api/health`;
|
||||||
|
url.search = "";
|
||||||
|
url.hash = "";
|
||||||
|
return url.toString();
|
||||||
|
}
|
||||||
|
|
||||||
|
function formatConnectionCause(error: unknown): string | undefined {
|
||||||
|
if (!error) return undefined;
|
||||||
|
if (error instanceof Error) {
|
||||||
|
return error.message.trim() || error.name;
|
||||||
|
}
|
||||||
|
const message = String(error).trim();
|
||||||
|
return message || undefined;
|
||||||
|
}
|
||||||
|
|
||||||
function toStringRecord(headers: HeadersInit | undefined): Record<string, string> {
|
function toStringRecord(headers: HeadersInit | undefined): Record<string, string> {
|
||||||
if (!headers) return {};
|
if (!headers) return {};
|
||||||
if (Array.isArray(headers)) {
|
if (Array.isArray(headers)) {
|
||||||
@@ -26,6 +26,9 @@ export async function addAllowedHostname(host: string, opts: { config?: string }
     p.log.info(`Hostname ${pc.cyan(normalized)} is already allowed.`);
   } else {
     p.log.success(`Added allowed hostname: ${pc.cyan(normalized)}`);
+    p.log.message(
+      pc.dim("Restart the Paperclip server for this change to take effect."),
+    );
   }

   if (!(config.server.deploymentMode === "authenticated" && config.server.exposure === "private")) {
@@ -3,6 +3,7 @@ import * as p from "@clack/prompts";
 import pc from "picocolors";
 import { and, eq, gt, isNull } from "drizzle-orm";
 import { createDb, instanceUserRoles, invites } from "@paperclipai/db";
+import { loadPaperclipEnvFile } from "../config/env.js";
 import { readConfig, resolveConfigPath } from "../config/store.js";

 function hashToken(token: string) {
@@ -13,7 +14,8 @@ function createInviteToken() {
   return `pcp_bootstrap_${randomBytes(24).toString("hex")}`;
 }

-function resolveDbUrl(configPath?: string) {
+function resolveDbUrl(configPath?: string, explicitDbUrl?: string) {
+  if (explicitDbUrl) return explicitDbUrl;
   const config = readConfig(configPath);
   if (process.env.DATABASE_URL) return process.env.DATABASE_URL;
   if (config?.database.mode === "postgres" && config.database.connectionString) {
@@ -28,6 +30,12 @@ function resolveDbUrl(configPath?: string) {

 function resolveBaseUrl(configPath?: string, explicitBaseUrl?: string) {
   if (explicitBaseUrl) return explicitBaseUrl.replace(/\/+$/, "");
+  const fromEnv =
+    process.env.PAPERCLIP_PUBLIC_URL ??
+    process.env.PAPERCLIP_AUTH_PUBLIC_BASE_URL ??
+    process.env.BETTER_AUTH_URL ??
+    process.env.BETTER_AUTH_BASE_URL;
+  if (fromEnv?.trim()) return fromEnv.trim().replace(/\/+$/, "");
   const config = readConfig(configPath);
   if (config?.auth.baseUrlMode === "explicit" && config.auth.publicBaseUrl) {
     return config.auth.publicBaseUrl.replace(/\/+$/, "");
@@ -43,8 +51,10 @@ export async function bootstrapCeoInvite(opts: {
   force?: boolean;
   expiresHours?: number;
   baseUrl?: string;
+  dbUrl?: string;
 }) {
   const configPath = resolveConfigPath(opts.config);
+  loadPaperclipEnvFile(configPath);
   const config = readConfig(configPath);
   if (!config) {
     p.log.error(`No config found at ${configPath}. Run ${pc.cyan("paperclip onboard")} first.`);
@@ -56,7 +66,7 @@ export async function bootstrapCeoInvite(opts: {
     return;
   }

-  const dbUrl = resolveDbUrl(configPath);
+  const dbUrl = resolveDbUrl(configPath, opts.dbUrl);
   if (!dbUrl) {
     p.log.error(
       "Could not resolve database connection for bootstrap.",
@@ -65,6 +75,11 @@ export async function bootstrapCeoInvite(opts: {
   }

   const db = createDb(dbUrl);
+  const closableDb = db as typeof db & {
+    $client?: {
+      end?: (options?: { timeout?: number }) => Promise<void>;
+    };
+  };
   try {
     const existingAdminCount = await db
       .select()
@@ -112,5 +127,7 @@ export async function bootstrapCeoInvite(opts: {
   } catch (err) {
     p.log.error(`Could not create bootstrap invite: ${err instanceof Error ? err.message : String(err)}`);
     p.log.info("If using embedded-postgres, start the Paperclip server and run this command again.");
+  } finally {
+    await closableDb.$client?.end?.({ timeout: 5 }).catch(() => undefined);
   }
 }
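Note: a small sketch (assumed example values, hypothetical helper name) of the database-URL resolution order the bootstrap command now follows: an explicit --db-url value wins, then DATABASE_URL, then the config file.

// Hypothetical helper for illustration only; the real logic lives in resolveDbUrl above.
function pickDbUrl(explicitDbUrl?: string): string | undefined {
  if (explicitDbUrl) return explicitDbUrl;                  // value passed on the command line
  if (process.env.DATABASE_URL) return process.env.DATABASE_URL;
  return "postgres://user:pass@127.0.0.1:54329/paperclip";  // assumed config-file value
}
// pickDbUrl("postgres://explicit") -> "postgres://explicit"
// pickDbUrl()                      -> DATABASE_URL if set, otherwise the config value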
|
|||||||
@@ -1,5 +1,13 @@
|
|||||||
import { Command } from "commander";
|
import { Command } from "commander";
|
||||||
import type { Agent } from "@paperclipai/shared";
|
import type { Agent } from "@paperclipai/shared";
|
||||||
|
import {
|
||||||
|
removeMaintainerOnlySkillSymlinks,
|
||||||
|
resolvePaperclipSkillsDir,
|
||||||
|
} from "@paperclipai/adapter-utils/server-utils";
|
||||||
|
import fs from "node:fs/promises";
|
||||||
|
import os from "node:os";
|
||||||
|
import path from "node:path";
|
||||||
|
import { fileURLToPath } from "node:url";
|
||||||
import {
|
import {
|
||||||
addCommonClientOptions,
|
addCommonClientOptions,
|
||||||
formatInlineRecord,
|
formatInlineRecord,
|
||||||
@@ -13,6 +21,141 @@ interface AgentListOptions extends BaseClientOptions {
|
|||||||
companyId?: string;
|
companyId?: string;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
interface AgentLocalCliOptions extends BaseClientOptions {
|
||||||
|
companyId?: string;
|
||||||
|
keyName?: string;
|
||||||
|
installSkills?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface CreatedAgentKey {
|
||||||
|
id: string;
|
||||||
|
name: string;
|
||||||
|
token: string;
|
||||||
|
createdAt: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface SkillsInstallSummary {
|
||||||
|
tool: "codex" | "claude";
|
||||||
|
target: string;
|
||||||
|
linked: string[];
|
||||||
|
removed: string[];
|
||||||
|
skipped: string[];
|
||||||
|
failed: Array<{ name: string; error: string }>;
|
||||||
|
}
|
||||||
|
|
||||||
|
const __moduleDir = path.dirname(fileURLToPath(import.meta.url));
|
||||||
|
|
||||||
|
function codexSkillsHome(): string {
|
||||||
|
const fromEnv = process.env.CODEX_HOME?.trim();
|
||||||
|
const base = fromEnv && fromEnv.length > 0 ? fromEnv : path.join(os.homedir(), ".codex");
|
||||||
|
return path.join(base, "skills");
|
||||||
|
}
|
||||||
|
|
||||||
|
function claudeSkillsHome(): string {
|
||||||
|
const fromEnv = process.env.CLAUDE_HOME?.trim();
|
||||||
|
const base = fromEnv && fromEnv.length > 0 ? fromEnv : path.join(os.homedir(), ".claude");
|
||||||
|
return path.join(base, "skills");
|
||||||
|
}
|
||||||
|
|
||||||
|
async function installSkillsForTarget(
|
||||||
|
sourceSkillsDir: string,
|
||||||
|
targetSkillsDir: string,
|
||||||
|
tool: "codex" | "claude",
|
||||||
|
): Promise<SkillsInstallSummary> {
|
||||||
|
const summary: SkillsInstallSummary = {
|
||||||
|
tool,
|
||||||
|
target: targetSkillsDir,
|
||||||
|
linked: [],
|
||||||
|
removed: [],
|
||||||
|
skipped: [],
|
||||||
|
failed: [],
|
||||||
|
};
|
||||||
|
|
||||||
|
await fs.mkdir(targetSkillsDir, { recursive: true });
|
||||||
|
const entries = await fs.readdir(sourceSkillsDir, { withFileTypes: true });
|
||||||
|
summary.removed = await removeMaintainerOnlySkillSymlinks(
|
||||||
|
targetSkillsDir,
|
||||||
|
entries.filter((entry) => entry.isDirectory()).map((entry) => entry.name),
|
||||||
|
);
|
||||||
|
for (const entry of entries) {
|
||||||
|
if (!entry.isDirectory()) continue;
|
||||||
|
const source = path.join(sourceSkillsDir, entry.name);
|
||||||
|
const target = path.join(targetSkillsDir, entry.name);
|
||||||
|
const existing = await fs.lstat(target).catch(() => null);
|
||||||
|
if (existing) {
|
||||||
|
if (existing.isSymbolicLink()) {
|
||||||
|
let linkedPath: string | null = null;
|
||||||
|
try {
|
||||||
|
linkedPath = await fs.readlink(target);
|
||||||
|
} catch (err) {
|
||||||
|
await fs.unlink(target);
|
||||||
|
try {
|
||||||
|
await fs.symlink(source, target);
|
||||||
|
summary.linked.push(entry.name);
|
||||||
|
continue;
|
||||||
|
} catch (linkErr) {
|
||||||
|
summary.failed.push({
|
||||||
|
name: entry.name,
|
||||||
|
error:
|
||||||
|
err instanceof Error && linkErr instanceof Error
|
||||||
|
? `${err.message}; then ${linkErr.message}`
|
||||||
|
: err instanceof Error
|
||||||
|
? err.message
|
||||||
|
: `Failed to recover broken symlink: ${String(err)}`,
|
||||||
|
});
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const resolvedLinkedPath = path.isAbsolute(linkedPath)
|
||||||
|
? linkedPath
|
||||||
|
: path.resolve(path.dirname(target), linkedPath);
|
||||||
|
const linkedTargetExists = await fs
|
||||||
|
.stat(resolvedLinkedPath)
|
||||||
|
.then(() => true)
|
||||||
|
.catch(() => false);
|
||||||
|
|
||||||
|
if (!linkedTargetExists) {
|
||||||
|
await fs.unlink(target);
|
||||||
|
} else {
|
||||||
|
summary.skipped.push(entry.name);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
summary.skipped.push(entry.name);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
await fs.symlink(source, target);
|
||||||
|
summary.linked.push(entry.name);
|
||||||
|
} catch (err) {
|
||||||
|
summary.failed.push({
|
||||||
|
name: entry.name,
|
||||||
|
error: err instanceof Error ? err.message : String(err),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return summary;
|
||||||
|
}
|
||||||
|
|
||||||
|
function buildAgentEnvExports(input: {
|
||||||
|
apiBase: string;
|
||||||
|
companyId: string;
|
||||||
|
agentId: string;
|
||||||
|
apiKey: string;
|
||||||
|
}): string {
|
||||||
|
const escaped = (value: string) => value.replace(/'/g, "'\"'\"'");
|
||||||
|
return [
|
||||||
|
`export PAPERCLIP_API_URL='${escaped(input.apiBase)}'`,
|
||||||
|
`export PAPERCLIP_COMPANY_ID='${escaped(input.companyId)}'`,
|
||||||
|
`export PAPERCLIP_AGENT_ID='${escaped(input.agentId)}'`,
|
||||||
|
`export PAPERCLIP_API_KEY='${escaped(input.apiKey)}'`,
|
||||||
|
].join("\n");
|
||||||
|
}
|
||||||
|
|
||||||
export function registerAgentCommands(program: Command): void {
|
export function registerAgentCommands(program: Command): void {
|
||||||
const agent = program.command("agent").description("Agent operations");
|
const agent = program.command("agent").description("Agent operations");
|
||||||
|
|
||||||
@@ -71,4 +214,102 @@ export function registerAgentCommands(program: Command): void {
|
|||||||
}
|
}
|
||||||
}),
|
}),
|
||||||
);
|
);
|
||||||
|
|
||||||
|
addCommonClientOptions(
|
||||||
|
agent
|
||||||
|
.command("local-cli")
|
||||||
|
.description(
|
||||||
|
"Create an agent API key, install local Paperclip skills for Codex/Claude, and print shell exports",
|
||||||
|
)
|
||||||
|
.argument("<agentRef>", "Agent ID or shortname/url-key")
|
||||||
|
.requiredOption("-C, --company-id <id>", "Company ID")
|
||||||
|
.option("--key-name <name>", "API key label", "local-cli")
|
||||||
|
.option(
|
||||||
|
"--no-install-skills",
|
||||||
|
"Skip installing Paperclip skills into ~/.codex/skills and ~/.claude/skills",
|
||||||
|
)
|
||||||
|
.action(async (agentRef: string, opts: AgentLocalCliOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts, { requireCompany: true });
|
||||||
|
const query = new URLSearchParams({ companyId: ctx.companyId ?? "" });
|
||||||
|
const agentRow = await ctx.api.get<Agent>(
|
||||||
|
`/api/agents/${encodeURIComponent(agentRef)}?${query.toString()}`,
|
||||||
|
);
|
||||||
|
if (!agentRow) {
|
||||||
|
throw new Error(`Agent not found: ${agentRef}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const now = new Date().toISOString().replaceAll(":", "-");
|
||||||
|
const keyName = opts.keyName?.trim() ? opts.keyName.trim() : `local-cli-${now}`;
|
||||||
|
const key = await ctx.api.post<CreatedAgentKey>(`/api/agents/${agentRow.id}/keys`, { name: keyName });
|
||||||
|
if (!key) {
|
||||||
|
throw new Error("Failed to create API key");
|
||||||
|
}
|
||||||
|
|
||||||
|
const installSummaries: SkillsInstallSummary[] = [];
|
||||||
|
if (opts.installSkills !== false) {
|
||||||
|
const skillsDir = await resolvePaperclipSkillsDir(__moduleDir, [path.resolve(process.cwd(), "skills")]);
|
||||||
|
if (!skillsDir) {
|
||||||
|
throw new Error(
|
||||||
|
"Could not locate local Paperclip skills directory. Expected ./skills in the repo checkout.",
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
installSummaries.push(
|
||||||
|
await installSkillsForTarget(skillsDir, codexSkillsHome(), "codex"),
|
||||||
|
await installSkillsForTarget(skillsDir, claudeSkillsHome(), "claude"),
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const exportsText = buildAgentEnvExports({
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
companyId: agentRow.companyId,
|
||||||
|
agentId: agentRow.id,
|
||||||
|
apiKey: key.token,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(
|
||||||
|
{
|
||||||
|
agent: {
|
||||||
|
id: agentRow.id,
|
||||||
|
name: agentRow.name,
|
||||||
|
urlKey: agentRow.urlKey,
|
||||||
|
companyId: agentRow.companyId,
|
||||||
|
},
|
||||||
|
key: {
|
||||||
|
id: key.id,
|
||||||
|
name: key.name,
|
||||||
|
createdAt: key.createdAt,
|
||||||
|
token: key.token,
|
||||||
|
},
|
||||||
|
skills: installSummaries,
|
||||||
|
exports: exportsText,
|
||||||
|
},
|
||||||
|
{ json: true },
|
||||||
|
);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(`Agent: ${agentRow.name} (${agentRow.id})`);
|
||||||
|
console.log(`API key created: ${key.name} (${key.id})`);
|
||||||
|
if (installSummaries.length > 0) {
|
||||||
|
for (const summary of installSummaries) {
|
||||||
|
console.log(
|
||||||
|
`${summary.tool}: linked=${summary.linked.length} removed=${summary.removed.length} skipped=${summary.skipped.length} failed=${summary.failed.length} target=${summary.target}`,
|
||||||
|
);
|
||||||
|
for (const failed of summary.failed) {
|
||||||
|
console.log(` failed ${failed.name}: ${failed.error}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
console.log("");
|
||||||
|
console.log("# Run this in your shell before launching codex/claude:");
|
||||||
|
console.log(exportsText);
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
{ includeCompany: false },
|
||||||
|
);
|
||||||
}
|
}
|
||||||
|
|||||||
cli/src/commands/client/auth.ts (new file, 113 lines)
@@ -0,0 +1,113 @@
|
|||||||
|
import type { Command } from "commander";
|
||||||
|
import {
|
||||||
|
getStoredBoardCredential,
|
||||||
|
loginBoardCli,
|
||||||
|
removeStoredBoardCredential,
|
||||||
|
revokeStoredBoardCredential,
|
||||||
|
} from "../../client/board-auth.js";
|
||||||
|
import {
|
||||||
|
addCommonClientOptions,
|
||||||
|
handleCommandError,
|
||||||
|
printOutput,
|
||||||
|
resolveCommandContext,
|
||||||
|
type BaseClientOptions,
|
||||||
|
} from "./common.js";
|
||||||
|
|
||||||
|
interface AuthLoginOptions extends BaseClientOptions {
|
||||||
|
instanceAdmin?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface AuthLogoutOptions extends BaseClientOptions {}
|
||||||
|
interface AuthWhoamiOptions extends BaseClientOptions {}
|
||||||
|
|
||||||
|
export function registerClientAuthCommands(auth: Command): void {
|
||||||
|
addCommonClientOptions(
|
||||||
|
auth
|
||||||
|
.command("login")
|
||||||
|
.description("Authenticate the CLI for board-user access")
|
||||||
|
.option("--instance-admin", "Request instance-admin approval instead of plain board access", false)
|
||||||
|
.action(async (opts: AuthLoginOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const login = await loginBoardCli({
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
requestedAccess: opts.instanceAdmin ? "instance_admin_required" : "board",
|
||||||
|
requestedCompanyId: ctx.companyId ?? null,
|
||||||
|
command: "paperclipai auth login",
|
||||||
|
});
|
||||||
|
printOutput(
|
||||||
|
{
|
||||||
|
ok: true,
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
userId: login.userId ?? null,
|
||||||
|
approvalUrl: login.approvalUrl,
|
||||||
|
},
|
||||||
|
{ json: ctx.json },
|
||||||
|
);
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
{ includeCompany: true },
|
||||||
|
);
|
||||||
|
|
||||||
|
addCommonClientOptions(
|
||||||
|
auth
|
||||||
|
.command("logout")
|
||||||
|
.description("Remove the stored board-user credential for this API base")
|
||||||
|
.action(async (opts: AuthLogoutOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const credential = getStoredBoardCredential(ctx.api.apiBase);
|
||||||
|
if (!credential) {
|
||||||
|
printOutput({ ok: true, apiBase: ctx.api.apiBase, revoked: false, removedLocalCredential: false }, { json: ctx.json });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
let revoked = false;
|
||||||
|
try {
|
||||||
|
await revokeStoredBoardCredential({
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
token: credential.token,
|
||||||
|
});
|
||||||
|
revoked = true;
|
||||||
|
} catch {
|
||||||
|
// Remove the local credential even if the server-side revoke fails.
|
||||||
|
}
|
||||||
|
const removedLocalCredential = removeStoredBoardCredential(ctx.api.apiBase);
|
||||||
|
printOutput(
|
||||||
|
{
|
||||||
|
ok: true,
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
revoked,
|
||||||
|
removedLocalCredential,
|
||||||
|
},
|
||||||
|
{ json: ctx.json },
|
||||||
|
);
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
addCommonClientOptions(
|
||||||
|
auth
|
||||||
|
.command("whoami")
|
||||||
|
.description("Show the current board-user identity for this API base")
|
||||||
|
.action(async (opts: AuthWhoamiOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const me = await ctx.api.get<{
|
||||||
|
user: { id: string; name: string; email: string } | null;
|
||||||
|
userId: string;
|
||||||
|
isInstanceAdmin: boolean;
|
||||||
|
companyIds: string[];
|
||||||
|
source: string;
|
||||||
|
keyId: string | null;
|
||||||
|
}>("/api/cli-auth/me");
|
||||||
|
printOutput(me, { json: ctx.json });
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
}
|
||||||
@@ -1,5 +1,7 @@
 import pc from "picocolors";
 import type { Command } from "commander";
+import { getStoredBoardCredential, loginBoardCli } from "../../client/board-auth.js";
+import { buildCliCommandLabel } from "../../client/command-label.js";
 import { readConfig } from "../../config/store.js";
 import { readContext, resolveProfile, type ClientContextProfile } from "../../client/context.js";
 import { ApiRequestError, PaperclipApiClient } from "../../client/http.js";
@@ -53,10 +55,12 @@ export function resolveCommandContext(
     profile.apiBase ||
     inferApiBaseFromConfig(options.config);

-  const apiKey =
+  const explicitApiKey =
     options.apiKey?.trim() ||
     process.env.PAPERCLIP_API_KEY?.trim() ||
     readKeyFromProfileEnv(profile);
+  const storedBoardCredential = explicitApiKey ? null : getStoredBoardCredential(apiBase);
+  const apiKey = explicitApiKey || storedBoardCredential?.token;

   const companyId =
     options.companyId?.trim() ||
@@ -69,7 +73,27 @@ export function resolveCommandContext(
     );
   }

-  const api = new PaperclipApiClient({ apiBase, apiKey });
+  const api = new PaperclipApiClient({
+    apiBase,
+    apiKey,
+    recoverAuth: explicitApiKey || !canAttemptInteractiveBoardAuth()
+      ? undefined
+      : async ({ error }) => {
+          const requestedAccess = error.message.includes("Instance admin required")
+            ? "instance_admin_required"
+            : "board";
+          if (!shouldRecoverBoardAuth(error)) {
+            return null;
+          }
+          const login = await loginBoardCli({
+            apiBase,
+            requestedAccess,
+            requestedCompanyId: companyId ?? null,
+            command: buildCliCommandLabel(),
+          });
+          return login.token;
+        },
+  });
   return {
     api,
     companyId,
@@ -79,6 +103,16 @@ export function resolveCommandContext(
   };
 }

+function shouldRecoverBoardAuth(error: ApiRequestError): boolean {
+  if (error.status === 401) return true;
+  if (error.status !== 403) return false;
+  return error.message.includes("Board access required") || error.message.includes("Instance admin required");
+}
+
+function canAttemptInteractiveBoardAuth(): boolean {
+  return Boolean(process.stdin.isTTY && process.stdout.isTTY);
+}
+
 export function printOutput(data: unknown, opts: { json?: boolean; label?: string } = {}): void {
   if (opts.json) {
     console.log(JSON.stringify(data, null, 2));
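Note: a minimal sketch (assumed error shape, hypothetical function name) of when the new recoverAuth hook attempts an interactive board login, paralleling shouldRecoverBoardAuth above.

type BoardAuthError = { status: number; message: string }; // assumed shape for illustration
function wouldRecover(error: BoardAuthError): boolean {
  if (error.status === 401) return true;
  if (error.status !== 403) return false;
  return error.message.includes("Board access required") || error.message.includes("Instance admin required");
}
// wouldRecover({ status: 401, message: "Unauthorized" })          -> true
// wouldRecover({ status: 403, message: "Board access required" }) -> true
// wouldRecover({ status: 403, message: "Forbidden" })             -> false
// Recovery is also skipped when an explicit API key was supplied or the session is not interactive (no TTY).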
(File diff suppressed because it is too large.)

cli/src/commands/client/plugin.ts (new file, 374 lines)
@@ -0,0 +1,374 @@
|
|||||||
|
import path from "node:path";
|
||||||
|
import { Command } from "commander";
|
||||||
|
import pc from "picocolors";
|
||||||
|
import {
|
||||||
|
addCommonClientOptions,
|
||||||
|
handleCommandError,
|
||||||
|
printOutput,
|
||||||
|
resolveCommandContext,
|
||||||
|
type BaseClientOptions,
|
||||||
|
} from "./common.js";
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Types mirroring server-side shapes
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
interface PluginRecord {
|
||||||
|
id: string;
|
||||||
|
pluginKey: string;
|
||||||
|
packageName: string;
|
||||||
|
version: string;
|
||||||
|
status: string;
|
||||||
|
displayName?: string;
|
||||||
|
lastError?: string | null;
|
||||||
|
installedAt: string;
|
||||||
|
updatedAt: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Option types
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
interface PluginListOptions extends BaseClientOptions {
|
||||||
|
status?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface PluginInstallOptions extends BaseClientOptions {
|
||||||
|
local?: boolean;
|
||||||
|
version?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface PluginUninstallOptions extends BaseClientOptions {
|
||||||
|
force?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Helpers
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Resolve a local path argument to an absolute path so the server can find the
|
||||||
|
* plugin on disk regardless of where the user ran the CLI.
|
||||||
|
*/
|
||||||
|
function resolvePackageArg(packageArg: string, isLocal: boolean): string {
|
||||||
|
if (!isLocal) return packageArg;
|
||||||
|
// Already absolute
|
||||||
|
if (path.isAbsolute(packageArg)) return packageArg;
|
||||||
|
// Expand leading ~ to home directory
|
||||||
|
if (packageArg.startsWith("~")) {
|
||||||
|
const home = process.env.HOME ?? process.env.USERPROFILE ?? "";
|
||||||
|
return path.resolve(home, packageArg.slice(1).replace(/^[\\/]/, ""));
|
||||||
|
}
|
||||||
|
return path.resolve(process.cwd(), packageArg);
|
||||||
|
}
|
||||||
|
|
||||||
|
function formatPlugin(p: PluginRecord): string {
|
||||||
|
const statusColor =
|
||||||
|
p.status === "ready"
|
||||||
|
? pc.green(p.status)
|
||||||
|
: p.status === "error"
|
||||||
|
? pc.red(p.status)
|
||||||
|
: p.status === "disabled"
|
||||||
|
? pc.dim(p.status)
|
||||||
|
: pc.yellow(p.status);
|
||||||
|
|
||||||
|
const parts = [
|
||||||
|
`key=${pc.bold(p.pluginKey)}`,
|
||||||
|
`status=${statusColor}`,
|
||||||
|
`version=${p.version}`,
|
||||||
|
`id=${pc.dim(p.id)}`,
|
||||||
|
];
|
||||||
|
|
||||||
|
if (p.lastError) {
|
||||||
|
parts.push(`error=${pc.red(p.lastError.slice(0, 80))}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
return parts.join(" ");
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Command registration
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
export function registerPluginCommands(program: Command): void {
|
||||||
|
const plugin = program.command("plugin").description("Plugin lifecycle management");
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin list
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("list")
|
||||||
|
.description("List installed plugins")
|
||||||
|
.option("--status <status>", "Filter by status (ready, error, disabled, installed, upgrade_pending)")
|
||||||
|
.action(async (opts: PluginListOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const qs = opts.status ? `?status=${encodeURIComponent(opts.status)}` : "";
|
||||||
|
const plugins = await ctx.api.get<PluginRecord[]>(`/api/plugins${qs}`);
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(plugins, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const rows = plugins ?? [];
|
||||||
|
if (rows.length === 0) {
|
||||||
|
console.log(pc.dim("No plugins installed."));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const p of rows) {
|
||||||
|
console.log(formatPlugin(p));
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin install <package-or-path>
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("install <package>")
|
||||||
|
.description(
|
||||||
|
"Install a plugin from a local path or npm package.\n" +
|
||||||
|
" Examples:\n" +
|
||||||
|
" paperclipai plugin install ./my-plugin # local path\n" +
|
||||||
|
" paperclipai plugin install @acme/plugin-linear # npm package\n" +
|
||||||
|
" paperclipai plugin install @acme/plugin-linear@1.2 # pinned version",
|
||||||
|
)
|
||||||
|
.option("-l, --local", "Treat <package> as a local filesystem path", false)
|
||||||
|
.option("--version <version>", "Specific npm version to install (npm packages only)")
|
||||||
|
.action(async (packageArg: string, opts: PluginInstallOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
|
||||||
|
// Auto-detect local paths: starts with . or / or ~ or is an absolute path
|
||||||
|
const isLocal =
|
||||||
|
opts.local ||
|
||||||
|
packageArg.startsWith("./") ||
|
||||||
|
packageArg.startsWith("../") ||
|
||||||
|
packageArg.startsWith("/") ||
|
||||||
|
packageArg.startsWith("~");
|
||||||
|
|
||||||
|
const resolvedPackage = resolvePackageArg(packageArg, isLocal);
|
||||||
|
|
||||||
|
if (!ctx.json) {
|
||||||
|
console.log(
|
||||||
|
pc.dim(
|
||||||
|
isLocal
|
||||||
|
? `Installing plugin from local path: ${resolvedPackage}`
|
||||||
|
: `Installing plugin: ${resolvedPackage}${opts.version ? `@${opts.version}` : ""}`,
|
||||||
|
),
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const installedPlugin = await ctx.api.post<PluginRecord>("/api/plugins/install", {
|
||||||
|
packageName: resolvedPackage,
|
||||||
|
version: opts.version,
|
||||||
|
isLocalPath: isLocal,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(installedPlugin, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!installedPlugin) {
|
||||||
|
console.log(pc.dim("Install returned no plugin record."));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(
|
||||||
|
pc.green(
|
||||||
|
`✓ Installed ${pc.bold(installedPlugin.pluginKey)} v${installedPlugin.version} (${installedPlugin.status})`,
|
||||||
|
),
|
||||||
|
);
|
||||||
|
|
||||||
|
if (installedPlugin.lastError) {
|
||||||
|
console.log(pc.red(` Warning: ${installedPlugin.lastError}`));
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin uninstall <plugin-key-or-id>
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("uninstall <pluginKey>")
|
||||||
|
.description(
|
||||||
|
"Uninstall a plugin by its plugin key or database ID.\n" +
|
||||||
|
" Use --force to hard-purge all state and config.",
|
||||||
|
)
|
||||||
|
.option("--force", "Purge all plugin state and config (hard delete)", false)
|
||||||
|
.action(async (pluginKey: string, opts: PluginUninstallOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const purge = opts.force === true;
|
||||||
|
const qs = purge ? "?purge=true" : "";
|
||||||
|
|
||||||
|
if (!ctx.json) {
|
||||||
|
console.log(
|
||||||
|
pc.dim(
|
||||||
|
purge
|
||||||
|
? `Uninstalling and purging plugin: ${pluginKey}`
|
||||||
|
: `Uninstalling plugin: ${pluginKey}`,
|
||||||
|
),
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await ctx.api.delete<PluginRecord | null>(
|
||||||
|
`/api/plugins/${encodeURIComponent(pluginKey)}${qs}`,
|
||||||
|
);
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(result, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(pc.green(`✓ Uninstalled ${pc.bold(pluginKey)}${purge ? " (purged)" : ""}`));
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin enable <plugin-key-or-id>
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("enable <pluginKey>")
|
||||||
|
.description("Enable a disabled or errored plugin")
|
||||||
|
.action(async (pluginKey: string, opts: BaseClientOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const result = await ctx.api.post<PluginRecord>(
|
||||||
|
`/api/plugins/${encodeURIComponent(pluginKey)}/enable`,
|
||||||
|
);
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(result, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(pc.green(`✓ Enabled ${pc.bold(pluginKey)} — status: ${result?.status ?? "unknown"}`));
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin disable <plugin-key-or-id>
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("disable <pluginKey>")
|
||||||
|
.description("Disable a running plugin without uninstalling it")
|
||||||
|
.action(async (pluginKey: string, opts: BaseClientOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const result = await ctx.api.post<PluginRecord>(
|
||||||
|
`/api/plugins/${encodeURIComponent(pluginKey)}/disable`,
|
||||||
|
);
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(result, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(pc.dim(`Disabled ${pc.bold(pluginKey)} — status: ${result?.status ?? "unknown"}`));
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin inspect <plugin-key-or-id>
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("inspect <pluginKey>")
|
||||||
|
.description("Show full details for an installed plugin")
|
||||||
|
.action(async (pluginKey: string, opts: BaseClientOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const result = await ctx.api.get<PluginRecord>(
|
||||||
|
`/api/plugins/${encodeURIComponent(pluginKey)}`,
|
||||||
|
);
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(result, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!result) {
|
||||||
|
console.log(pc.red(`Plugin not found: ${pluginKey}`));
|
||||||
|
process.exit(1);
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(formatPlugin(result));
|
||||||
|
if (result.lastError) {
|
||||||
|
console.log(`\n${pc.red("Last error:")}\n${result.lastError}`);
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
// plugin examples
|
||||||
|
// -------------------------------------------------------------------------
|
||||||
|
addCommonClientOptions(
|
||||||
|
plugin
|
||||||
|
.command("examples")
|
||||||
|
.description("List bundled example plugins available for local install")
|
||||||
|
.action(async (opts: BaseClientOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const examples = await ctx.api.get<
|
||||||
|
Array<{
|
||||||
|
packageName: string;
|
||||||
|
pluginKey: string;
|
||||||
|
displayName: string;
|
||||||
|
description: string;
|
||||||
|
localPath: string;
|
||||||
|
tag: string;
|
||||||
|
}>
|
||||||
|
>("/api/plugins/examples");
|
||||||
|
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(examples, { json: true });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const rows = examples ?? [];
|
||||||
|
if (rows.length === 0) {
|
||||||
|
console.log(pc.dim("No bundled examples available."));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const ex of rows) {
|
||||||
|
console.log(
|
||||||
|
`${pc.bold(ex.displayName)} ${pc.dim(ex.pluginKey)}\n` +
|
||||||
|
` ${ex.description}\n` +
|
||||||
|
` ${pc.cyan(`paperclipai plugin install ${ex.localPath}`)}`,
|
||||||
|
);
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
}
|
||||||
cli/src/commands/client/zip.ts (new file, 129 lines)
@@ -0,0 +1,129 @@
|
|||||||
|
import { inflateRawSync } from "node:zlib";
|
||||||
|
import path from "node:path";
|
||||||
|
import type { CompanyPortabilityFileEntry } from "@paperclipai/shared";
|
||||||
|
|
||||||
|
const textDecoder = new TextDecoder();
|
||||||
|
|
||||||
|
export const binaryContentTypeByExtension: Record<string, string> = {
|
||||||
|
".gif": "image/gif",
|
||||||
|
".jpeg": "image/jpeg",
|
||||||
|
".jpg": "image/jpeg",
|
||||||
|
".png": "image/png",
|
||||||
|
".svg": "image/svg+xml",
|
||||||
|
".webp": "image/webp",
|
||||||
|
};
|
||||||
|
|
||||||
|
function normalizeArchivePath(pathValue: string) {
|
||||||
|
return pathValue
|
||||||
|
.replace(/\\/g, "/")
|
||||||
|
.split("/")
|
||||||
|
.filter(Boolean)
|
||||||
|
.join("/");
|
||||||
|
}
|
||||||
|
|
||||||
|
function readUint16(source: Uint8Array, offset: number) {
|
||||||
|
return source[offset]! | (source[offset + 1]! << 8);
|
||||||
|
}
|
||||||
|
|
||||||
|
function readUint32(source: Uint8Array, offset: number) {
|
||||||
|
return (
|
||||||
|
source[offset]! |
|
||||||
|
(source[offset + 1]! << 8) |
|
||||||
|
(source[offset + 2]! << 16) |
|
||||||
|
(source[offset + 3]! << 24)
|
||||||
|
) >>> 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
function sharedArchiveRoot(paths: string[]) {
|
||||||
|
if (paths.length === 0) return null;
|
||||||
|
const firstSegments = paths
|
||||||
|
.map((entry) => normalizeArchivePath(entry).split("/").filter(Boolean))
|
||||||
|
.filter((parts) => parts.length > 0);
|
||||||
|
if (firstSegments.length === 0) return null;
|
||||||
|
const candidate = firstSegments[0]![0]!;
|
||||||
|
return firstSegments.every((parts) => parts.length > 1 && parts[0] === candidate)
|
||||||
|
? candidate
|
||||||
|
: null;
|
||||||
|
}
|
||||||
|
|
||||||
|
function bytesToPortableFileEntry(pathValue: string, bytes: Uint8Array): CompanyPortabilityFileEntry {
|
||||||
|
const contentType = binaryContentTypeByExtension[path.extname(pathValue).toLowerCase()];
|
||||||
|
if (!contentType) return textDecoder.decode(bytes);
|
||||||
|
return {
|
||||||
|
encoding: "base64",
|
||||||
|
data: Buffer.from(bytes).toString("base64"),
|
||||||
|
contentType,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
async function inflateZipEntry(compressionMethod: number, bytes: Uint8Array) {
|
||||||
|
if (compressionMethod === 0) return bytes;
|
||||||
|
if (compressionMethod !== 8) {
|
||||||
|
throw new Error("Unsupported zip archive: only STORE and DEFLATE entries are supported.");
|
||||||
|
}
|
||||||
|
return new Uint8Array(inflateRawSync(bytes));
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function readZipArchive(source: ArrayBuffer | Uint8Array): Promise<{
|
||||||
|
rootPath: string | null;
|
||||||
|
files: Record<string, CompanyPortabilityFileEntry>;
|
||||||
|
}> {
|
||||||
|
const bytes = source instanceof Uint8Array ? source : new Uint8Array(source);
|
||||||
|
const entries: Array<{ path: string; body: CompanyPortabilityFileEntry }> = [];
|
||||||
|
let offset = 0;
|
||||||
|
|
||||||
|
while (offset + 4 <= bytes.length) {
|
||||||
|
const signature = readUint32(bytes, offset);
|
||||||
|
if (signature === 0x02014b50 || signature === 0x06054b50) break;
|
||||||
|
if (signature !== 0x04034b50) {
|
||||||
|
throw new Error("Invalid zip archive: unsupported local file header.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (offset + 30 > bytes.length) {
|
||||||
|
throw new Error("Invalid zip archive: truncated local file header.");
|
||||||
|
}
|
||||||
|
|
||||||
|
const generalPurposeFlag = readUint16(bytes, offset + 6);
|
||||||
|
const compressionMethod = readUint16(bytes, offset + 8);
|
||||||
|
const compressedSize = readUint32(bytes, offset + 18);
|
||||||
|
const fileNameLength = readUint16(bytes, offset + 26);
|
||||||
|
const extraFieldLength = readUint16(bytes, offset + 28);
|
||||||
|
|
||||||
|
if ((generalPurposeFlag & 0x0008) !== 0) {
|
||||||
|
throw new Error("Unsupported zip archive: data descriptors are not supported.");
|
||||||
|
}
|
||||||
|
|
||||||
|
const nameOffset = offset + 30;
|
||||||
|
const bodyOffset = nameOffset + fileNameLength + extraFieldLength;
|
||||||
|
const bodyEnd = bodyOffset + compressedSize;
|
||||||
|
if (bodyEnd > bytes.length) {
|
||||||
|
throw new Error("Invalid zip archive: truncated file contents.");
|
||||||
|
}
|
||||||
|
|
||||||
|
const rawArchivePath = textDecoder.decode(bytes.slice(nameOffset, nameOffset + fileNameLength));
|
||||||
|
const archivePath = normalizeArchivePath(rawArchivePath);
|
||||||
|
const isDirectoryEntry = /\/$/.test(rawArchivePath.replace(/\\/g, "/"));
|
||||||
|
if (archivePath && !isDirectoryEntry) {
|
||||||
|
const entryBytes = await inflateZipEntry(compressionMethod, bytes.slice(bodyOffset, bodyEnd));
|
||||||
|
entries.push({
|
||||||
|
path: archivePath,
|
||||||
|
body: bytesToPortableFileEntry(archivePath, entryBytes),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
offset = bodyEnd;
|
||||||
|
}
|
||||||
|
|
||||||
|
const rootPath = sharedArchiveRoot(entries.map((entry) => entry.path));
|
||||||
|
const files: Record<string, CompanyPortabilityFileEntry> = {};
|
||||||
|
for (const entry of entries) {
|
||||||
|
const normalizedPath =
|
||||||
|
rootPath && entry.path.startsWith(`${rootPath}/`)
|
||||||
|
? entry.path.slice(rootPath.length + 1)
|
||||||
|
: entry.path;
|
||||||
|
if (!normalizedPath) continue;
|
||||||
|
files[normalizedPath] = entry.body;
|
||||||
|
}
|
||||||
|
|
||||||
|
return { rootPath, files };
|
||||||
|
}
|
||||||
@@ -10,6 +10,7 @@ import { defaultSecretsConfig, promptSecrets } from "../prompts/secrets.js";
 import { defaultStorageConfig, promptStorage } from "../prompts/storage.js";
 import { promptServer } from "../prompts/server.js";
 import {
+  resolveDefaultBackupDir,
   resolveDefaultEmbeddedPostgresDir,
   resolveDefaultLogsDir,
   resolvePaperclipInstanceId,
@@ -39,6 +40,12 @@ function defaultConfig(): PaperclipConfig {
       mode: "embedded-postgres",
       embeddedPostgresDataDir: resolveDefaultEmbeddedPostgresDir(instanceId),
       embeddedPostgresPort: 54329,
+      backup: {
+        enabled: true,
+        intervalMinutes: 60,
+        retentionDays: 30,
+        dir: resolveDefaultBackupDir(instanceId),
+      },
     },
     logging: {
       mode: "file",
@@ -54,6 +61,7 @@ function defaultConfig(): PaperclipConfig {
     },
     auth: {
       baseUrlMode: "auto",
+      disableSignUp: false,
     },
     storage: defaultStorageConfig(),
     secrets: defaultSecretsConfig(),
@@ -118,7 +126,7 @@ export async function configure(opts: {

     switch (section) {
       case "database":
-        config.database = await promptDatabase();
+        config.database = await promptDatabase(config.database);
         break;
       case "llm": {
         const llm = await promptLlm();
cli/src/commands/db-backup.ts (new file, 102 lines)
@@ -0,0 +1,102 @@
|
|||||||
|
import path from "node:path";
|
||||||
|
import * as p from "@clack/prompts";
|
||||||
|
import pc from "picocolors";
|
||||||
|
import { formatDatabaseBackupResult, runDatabaseBackup } from "@paperclipai/db";
|
||||||
|
import {
|
||||||
|
expandHomePrefix,
|
||||||
|
resolveDefaultBackupDir,
|
||||||
|
resolvePaperclipInstanceId,
|
||||||
|
} from "../config/home.js";
|
||||||
|
import { readConfig, resolveConfigPath } from "../config/store.js";
|
||||||
|
import { printPaperclipCliBanner } from "../utils/banner.js";
|
||||||
|
|
||||||
|
type DbBackupOptions = {
|
||||||
|
config?: string;
|
||||||
|
dir?: string;
|
||||||
|
retentionDays?: number;
|
||||||
|
filenamePrefix?: string;
|
||||||
|
json?: boolean;
|
||||||
|
};
|
||||||
|
|
||||||
|
function resolveConnectionString(configPath?: string): { value: string; source: string } {
|
||||||
|
const envUrl = process.env.DATABASE_URL?.trim();
|
||||||
|
if (envUrl) return { value: envUrl, source: "DATABASE_URL" };
|
||||||
|
|
||||||
|
const config = readConfig(configPath);
|
||||||
|
if (config?.database.mode === "postgres" && config.database.connectionString?.trim()) {
|
||||||
|
return { value: config.database.connectionString.trim(), source: "config.database.connectionString" };
|
||||||
|
}
|
||||||
|
|
||||||
|
const port = config?.database.embeddedPostgresPort ?? 54329;
|
||||||
|
return {
|
||||||
|
value: `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`,
|
||||||
|
source: `embedded-postgres@${port}`,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
function normalizeRetentionDays(value: number | undefined, fallback: number): number {
|
||||||
|
const candidate = value ?? fallback;
|
||||||
|
if (!Number.isInteger(candidate) || candidate < 1) {
|
||||||
|
throw new Error(`Invalid retention days '${String(candidate)}'. Use a positive integer.`);
|
||||||
|
}
|
||||||
|
return candidate;
|
||||||
|
}
|
||||||
|
|
||||||
|
function resolveBackupDir(raw: string): string {
|
||||||
|
return path.resolve(expandHomePrefix(raw.trim()));
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function dbBackupCommand(opts: DbBackupOptions): Promise<void> {
|
||||||
|
printPaperclipCliBanner();
|
||||||
|
p.intro(pc.bgCyan(pc.black(" paperclip db:backup ")));
|
||||||
|
|
||||||
|
const configPath = resolveConfigPath(opts.config);
|
||||||
|
const config = readConfig(opts.config);
|
||||||
|
const connection = resolveConnectionString(opts.config);
|
||||||
|
const defaultDir = resolveDefaultBackupDir(resolvePaperclipInstanceId());
|
||||||
|
const configuredDir = opts.dir?.trim() || config?.database.backup.dir || defaultDir;
|
||||||
|
const backupDir = resolveBackupDir(configuredDir);
|
||||||
|
const retentionDays = normalizeRetentionDays(
|
||||||
|
opts.retentionDays,
|
||||||
|
config?.database.backup.retentionDays ?? 30,
|
||||||
|
);
|
||||||
|
const filenamePrefix = opts.filenamePrefix?.trim() || "paperclip";
|
||||||
|
|
||||||
|
p.log.message(pc.dim(`Config: ${configPath}`));
|
||||||
|
p.log.message(pc.dim(`Connection source: ${connection.source}`));
|
||||||
|
p.log.message(pc.dim(`Backup dir: ${backupDir}`));
|
||||||
|
p.log.message(pc.dim(`Retention: ${retentionDays} day(s)`));
|
||||||
|
|
||||||
|
const spinner = p.spinner();
|
||||||
|
spinner.start("Creating database backup...");
|
||||||
|
try {
|
||||||
|
const result = await runDatabaseBackup({
|
||||||
|
connectionString: connection.value,
|
||||||
|
backupDir,
|
||||||
|
retentionDays,
|
||||||
|
filenamePrefix,
|
||||||
|
});
|
||||||
|
spinner.stop(`Backup saved: ${formatDatabaseBackupResult(result)}`);
|
||||||
|
|
||||||
|
if (opts.json) {
|
||||||
|
console.log(
|
||||||
|
JSON.stringify(
|
||||||
|
{
|
||||||
|
backupFile: result.backupFile,
|
||||||
|
sizeBytes: result.sizeBytes,
|
||||||
|
prunedCount: result.prunedCount,
|
||||||
|
backupDir,
|
||||||
|
retentionDays,
|
||||||
|
connectionSource: connection.source,
|
||||||
|
},
|
||||||
|
null,
|
||||||
|
2,
|
||||||
|
),
|
||||||
|
);
|
||||||
|
}
|
||||||
|
p.outro(pc.green("Backup completed."));
|
||||||
|
} catch (err) {
|
||||||
|
spinner.stop(pc.red("Backup failed."));
|
||||||
|
throw err;
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -14,6 +14,7 @@ import {
   storageCheck,
   type CheckResult,
 } from "../checks/index.js";
+import { loadPaperclipEnvFile } from "../config/env.js";
 import { printPaperclipCliBanner } from "../utils/banner.js";

 const STATUS_ICON = {
@@ -31,6 +32,7 @@ export async function doctor(opts: {
   p.intro(pc.bgCyan(pc.black(" paperclip doctor ")));

   const configPath = resolveConfigPath(opts.config);
+  loadPaperclipEnvFile(configPath);
   const results: CheckResult[] = [];

   // 1. Config check (must pass before others)
@@ -64,28 +66,40 @@ export async function doctor(opts: {
   printResult(deploymentAuthResult);

   // 3. Agent JWT check
-  const jwtResult = agentJwtSecretCheck(opts.config);
-  results.push(jwtResult);
-  printResult(jwtResult);
-  await maybeRepair(jwtResult, opts);
+  results.push(
+    await runRepairableCheck({
+      run: () => agentJwtSecretCheck(opts.config),
+      configPath,
+      opts,
+    }),
+  );

   // 4. Secrets adapter check
-  const secretsResult = secretsCheck(config, configPath);
-  results.push(secretsResult);
-  printResult(secretsResult);
-  await maybeRepair(secretsResult, opts);
+  results.push(
+    await runRepairableCheck({
+      run: () => secretsCheck(config, configPath),
+      configPath,
+      opts,
+    }),
+  );

   // 5. Storage check
-  const storageResult = storageCheck(config, configPath);
-  results.push(storageResult);
-  printResult(storageResult);
-  await maybeRepair(storageResult, opts);
+  results.push(
+    await runRepairableCheck({
+      run: () => storageCheck(config, configPath),
+      configPath,
+      opts,
+    }),
+  );

   // 6. Database check
-  const dbResult = await databaseCheck(config, configPath);
-  results.push(dbResult);
-  printResult(dbResult);
-  await maybeRepair(dbResult, opts);
+  results.push(
+    await runRepairableCheck({
+      run: () => databaseCheck(config, configPath),
+      configPath,
+      opts,
+    }),
+  );

   // 7. LLM check
   const llmResult = await llmCheck(config);
@@ -93,10 +107,13 @@ export async function doctor(opts: {
   printResult(llmResult);

   // 8. Log directory check
-  const logResult = logCheck(config, configPath);
-  results.push(logResult);
-  printResult(logResult);
-  await maybeRepair(logResult, opts);
+  results.push(
+    await runRepairableCheck({
+      run: () => logCheck(config, configPath),
+      configPath,
+      opts,
+    }),
+  );

   // 9. Port check
   const portResult = await portCheck(config);
@@ -118,9 +135,9 @@ function printResult(result: CheckResult): void {
 async function maybeRepair(
   result: CheckResult,
   opts: { repair?: boolean; yes?: boolean },
-): Promise<void> {
-  if (result.status === "pass" || !result.canRepair || !result.repair) return;
-  if (!opts.repair) return;
+): Promise<boolean> {
+  if (result.status === "pass" || !result.canRepair || !result.repair) return false;
+  if (!opts.repair) return false;

   let shouldRepair = opts.yes;
   if (!shouldRepair) {
@@ -128,7 +145,7 @@ async function maybeRepair(
       message: `Repair "${result.name}"?`,
       initialValue: true,
     });
-    if (p.isCancel(answer)) return;
+    if (p.isCancel(answer)) return false;
     shouldRepair = answer;
   }

@@ -136,10 +153,30 @@ async function maybeRepair(
   try {
     await result.repair();
     p.log.success(`Repaired: ${result.name}`);
+    return true;
   } catch (err) {
     p.log.error(`Repair failed: ${err instanceof Error ? err.message : String(err)}`);
   }
+  return false;
 }

+async function runRepairableCheck(input: {
+  run: () => CheckResult | Promise<CheckResult>;
+  configPath: string;
+  opts: { repair?: boolean; yes?: boolean };
+}): Promise<CheckResult> {
+  let result = await input.run();
+  printResult(result);
+
+  const repaired = await maybeRepair(result, input.opts);
+  if (!repaired) return result;
+
+  // Repairs may create/update the adjacent .env file or other local resources.
+  loadPaperclipEnvFile(input.configPath);
+  result = await input.run();
+  printResult(result);
+  return result;
+}

 function printSummary(results: CheckResult[]): { passed: number; warned: number; failed: number } {
@@ -118,6 +118,29 @@ function collectDeploymentEnvRows(config: PaperclipConfig | null, configPath: st
   const dbUrl = process.env.DATABASE_URL ?? config?.database?.connectionString ?? "";
   const databaseMode = config?.database?.mode ?? "embedded-postgres";
   const dbUrlSource: EnvSource = process.env.DATABASE_URL ? "env" : config?.database?.connectionString ? "config" : "missing";
+  const publicUrl =
+    process.env.PAPERCLIP_PUBLIC_URL ??
+    process.env.PAPERCLIP_AUTH_PUBLIC_BASE_URL ??
+    process.env.BETTER_AUTH_URL ??
+    process.env.BETTER_AUTH_BASE_URL ??
+    config?.auth?.publicBaseUrl ??
+    "";
+  const publicUrlSource: EnvSource =
+    process.env.PAPERCLIP_PUBLIC_URL
+      ? "env"
+      : process.env.PAPERCLIP_AUTH_PUBLIC_BASE_URL || process.env.BETTER_AUTH_URL || process.env.BETTER_AUTH_BASE_URL
+        ? "env"
+        : config?.auth?.publicBaseUrl
+          ? "config"
+          : "missing";
+  let trustedOriginsDefault = "";
+  if (publicUrl) {
+    try {
+      trustedOriginsDefault = new URL(publicUrl).origin;
+    } catch {
+      trustedOriginsDefault = "";
+    }
+  }

   const heartbeatInterval = process.env.HEARTBEAT_SCHEDULER_INTERVAL_MS ?? DEFAULT_HEARTBEAT_SCHEDULER_INTERVAL_MS;
   const heartbeatEnabled = process.env.HEARTBEAT_SCHEDULER_ENABLED ?? "true";
@@ -192,6 +215,24 @@ function collectDeploymentEnvRows(config: PaperclipConfig | null, configPath: st
       required: false,
       note: "HTTP listen port",
     },
+    {
+      key: "PAPERCLIP_PUBLIC_URL",
+      value: publicUrl,
+      source: publicUrlSource,
+      required: false,
+      note: "Canonical public URL for auth/callback/invite origin wiring",
+    },
+    {
+      key: "BETTER_AUTH_TRUSTED_ORIGINS",
+      value: process.env.BETTER_AUTH_TRUSTED_ORIGINS ?? trustedOriginsDefault,
+      source: process.env.BETTER_AUTH_TRUSTED_ORIGINS
+        ? "env"
+        : trustedOriginsDefault
+          ? "default"
+          : "missing",
+      required: false,
+      note: "Comma-separated auth origin allowlist (auto-derived from PAPERCLIP_PUBLIC_URL when possible)",
+    },
     {
       key: "PAPERCLIP_AGENT_JWT_TTL_SECONDS",
       value: process.env.PAPERCLIP_AGENT_JWT_TTL_SECONDS ?? DEFAULT_AGENT_JWT_TTL_SECONDS,
@@ -1,5 +1,18 @@
 import * as p from "@clack/prompts";
+import path from "node:path";
 import pc from "picocolors";
+import {
+  AUTH_BASE_URL_MODES,
+  DEPLOYMENT_EXPOSURES,
+  DEPLOYMENT_MODES,
+  SECRET_PROVIDERS,
+  STORAGE_PROVIDERS,
+  type AuthBaseUrlMode,
+  type DeploymentExposure,
+  type DeploymentMode,
+  type SecretProvider,
+  type StorageProvider,
+} from "@paperclipai/shared";
 import { configExists, readConfig, resolveConfigPath, writeConfig } from "../config/store.js";
 import type { PaperclipConfig } from "../config/schema.js";
 import { ensureAgentJwtSecret, resolveAgentJwtEnvFile } from "../config/env.js";
@@ -12,6 +25,8 @@ import { defaultStorageConfig, promptStorage } from "../prompts/storage.js";
 import { promptServer } from "../prompts/server.js";
 import {
   describeLocalInstancePaths,
+  expandHomePrefix,
+  resolveDefaultBackupDir,
   resolveDefaultEmbeddedPostgresDir,
   resolveDefaultLogsDir,
   resolvePaperclipInstanceId,
@@ -28,32 +43,194 @@ type OnboardOptions = {
   invokedByRun?: boolean;
 };

-function quickstartDefaults(): Pick<PaperclipConfig, "database" | "logging" | "server" | "auth" | "storage" | "secrets"> {
+type OnboardDefaults = Pick<PaperclipConfig, "database" | "logging" | "server" | "auth" | "storage" | "secrets">;
+
+const ONBOARD_ENV_KEYS = [
+  "PAPERCLIP_PUBLIC_URL",
+  "DATABASE_URL",
+  "PAPERCLIP_DB_BACKUP_ENABLED",
+  "PAPERCLIP_DB_BACKUP_INTERVAL_MINUTES",
+  "PAPERCLIP_DB_BACKUP_RETENTION_DAYS",
+  "PAPERCLIP_DB_BACKUP_DIR",
+  "PAPERCLIP_DEPLOYMENT_MODE",
+  "PAPERCLIP_DEPLOYMENT_EXPOSURE",
+  "HOST",
+  "PORT",
+  "SERVE_UI",
+  "PAPERCLIP_ALLOWED_HOSTNAMES",
+  "PAPERCLIP_AUTH_BASE_URL_MODE",
+  "PAPERCLIP_AUTH_PUBLIC_BASE_URL",
+  "BETTER_AUTH_URL",
+  "BETTER_AUTH_BASE_URL",
+  "PAPERCLIP_STORAGE_PROVIDER",
+  "PAPERCLIP_STORAGE_LOCAL_DIR",
+  "PAPERCLIP_STORAGE_S3_BUCKET",
+  "PAPERCLIP_STORAGE_S3_REGION",
+  "PAPERCLIP_STORAGE_S3_ENDPOINT",
+  "PAPERCLIP_STORAGE_S3_PREFIX",
+  "PAPERCLIP_STORAGE_S3_FORCE_PATH_STYLE",
+  "PAPERCLIP_SECRETS_PROVIDER",
+  "PAPERCLIP_SECRETS_STRICT_MODE",
+  "PAPERCLIP_SECRETS_MASTER_KEY_FILE",
+] as const;
+
+function parseBooleanFromEnv(rawValue: string | undefined): boolean | null {
+  if (rawValue === undefined) return null;
+  const lower = rawValue.trim().toLowerCase();
+  if (lower === "true" || lower === "1" || lower === "yes") return true;
+  if (lower === "false" || lower === "0" || lower === "no") return false;
+  return null;
+}
+
+function parseNumberFromEnv(rawValue: string | undefined): number | null {
+  if (!rawValue) return null;
+  const parsed = Number(rawValue);
+  if (!Number.isFinite(parsed)) return null;
+  return parsed;
+}
+
+function parseEnumFromEnv<T extends string>(rawValue: string | undefined, allowedValues: readonly T[]): T | null {
+  if (!rawValue) return null;
+  return allowedValues.includes(rawValue as T) ? (rawValue as T) : null;
+}
+
+function resolvePathFromEnv(rawValue: string | undefined): string | null {
+  if (!rawValue || rawValue.trim().length === 0) return null;
+  return path.resolve(expandHomePrefix(rawValue.trim()));
+}
+
+function quickstartDefaultsFromEnv(): {
+  defaults: OnboardDefaults;
+  usedEnvKeys: string[];
+  ignoredEnvKeys: Array<{ key: string; reason: string }>;
+} {
   const instanceId = resolvePaperclipInstanceId();
-  return {
+  const defaultStorage = defaultStorageConfig();
+  const defaultSecrets = defaultSecretsConfig();
+  const databaseUrl = process.env.DATABASE_URL?.trim() || undefined;
+  const publicUrl =
+    process.env.PAPERCLIP_PUBLIC_URL?.trim() ||
+    process.env.PAPERCLIP_AUTH_PUBLIC_BASE_URL?.trim() ||
+    process.env.BETTER_AUTH_URL?.trim() ||
+    process.env.BETTER_AUTH_BASE_URL?.trim() ||
+    undefined;
+  const deploymentMode =
+    parseEnumFromEnv<DeploymentMode>(process.env.PAPERCLIP_DEPLOYMENT_MODE, DEPLOYMENT_MODES) ?? "local_trusted";
+  const deploymentExposureFromEnv = parseEnumFromEnv<DeploymentExposure>(
+    process.env.PAPERCLIP_DEPLOYMENT_EXPOSURE,
+    DEPLOYMENT_EXPOSURES,
+  );
+  const deploymentExposure =
+    deploymentMode === "local_trusted" ? "private" : (deploymentExposureFromEnv ?? "private");
+  const authPublicBaseUrl = publicUrl;
+  const authBaseUrlModeFromEnv = parseEnumFromEnv<AuthBaseUrlMode>(
+    process.env.PAPERCLIP_AUTH_BASE_URL_MODE,
+    AUTH_BASE_URL_MODES,
+  );
+  const authBaseUrlMode = authBaseUrlModeFromEnv ?? (authPublicBaseUrl ? "explicit" : "auto");
+  const allowedHostnamesFromEnv = process.env.PAPERCLIP_ALLOWED_HOSTNAMES
+    ? process.env.PAPERCLIP_ALLOWED_HOSTNAMES
+        .split(",")
+        .map((value) => value.trim().toLowerCase())
+        .filter((value) => value.length > 0)
+    : [];
+  const hostnameFromPublicUrl = publicUrl
+    ? (() => {
+        try {
+          return new URL(publicUrl).hostname.trim().toLowerCase();
+        } catch {
+          return null;
+        }
+      })()
+    : null;
+  const storageProvider =
+    parseEnumFromEnv<StorageProvider>(process.env.PAPERCLIP_STORAGE_PROVIDER, STORAGE_PROVIDERS) ??
+    defaultStorage.provider;
+  const secretsProvider =
+    parseEnumFromEnv<SecretProvider>(process.env.PAPERCLIP_SECRETS_PROVIDER, SECRET_PROVIDERS) ??
+    defaultSecrets.provider;
+  const databaseBackupEnabled = parseBooleanFromEnv(process.env.PAPERCLIP_DB_BACKUP_ENABLED) ?? true;
+  const databaseBackupIntervalMinutes = Math.max(
+    1,
+    parseNumberFromEnv(process.env.PAPERCLIP_DB_BACKUP_INTERVAL_MINUTES) ?? 60,
+  );
+  const databaseBackupRetentionDays = Math.max(
+    1,
+    parseNumberFromEnv(process.env.PAPERCLIP_DB_BACKUP_RETENTION_DAYS) ?? 30,
+  );
+  const defaults: OnboardDefaults = {
     database: {
-      mode: "embedded-postgres",
+      mode: databaseUrl ? "postgres" : "embedded-postgres",
+      ...(databaseUrl ? { connectionString: databaseUrl } : {}),
       embeddedPostgresDataDir: resolveDefaultEmbeddedPostgresDir(instanceId),
       embeddedPostgresPort: 54329,
+      backup: {
+        enabled: databaseBackupEnabled,
+        intervalMinutes: databaseBackupIntervalMinutes,
+        retentionDays: databaseBackupRetentionDays,
+        dir: resolvePathFromEnv(process.env.PAPERCLIP_DB_BACKUP_DIR) ?? resolveDefaultBackupDir(instanceId),
+      },
     },
     logging: {
       mode: "file",
       logDir: resolveDefaultLogsDir(instanceId),
     },
     server: {
-      deploymentMode: "local_trusted",
+      deploymentMode,
-      exposure: "private",
+      exposure: deploymentExposure,
-      host: "127.0.0.1",
+      host: process.env.HOST ?? "127.0.0.1",
-      port: 3100,
+      port: Number(process.env.PORT) || 3100,
-      allowedHostnames: [],
+      allowedHostnames: Array.from(new Set([...allowedHostnamesFromEnv, ...(hostnameFromPublicUrl ? [hostnameFromPublicUrl] : [])])),
-      serveUi: true,
+      serveUi: parseBooleanFromEnv(process.env.SERVE_UI) ?? true,
     },
     auth: {
-      baseUrlMode: "auto",
+      baseUrlMode: authBaseUrlMode,
+      disableSignUp: false,
+      ...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
+    },
+    storage: {
+      provider: storageProvider,
+      localDisk: {
+        baseDir:
+          resolvePathFromEnv(process.env.PAPERCLIP_STORAGE_LOCAL_DIR) ?? defaultStorage.localDisk.baseDir,
+      },
+      s3: {
+        bucket: process.env.PAPERCLIP_STORAGE_S3_BUCKET ?? defaultStorage.s3.bucket,
+        region: process.env.PAPERCLIP_STORAGE_S3_REGION ?? defaultStorage.s3.region,
+        endpoint: process.env.PAPERCLIP_STORAGE_S3_ENDPOINT ?? defaultStorage.s3.endpoint,
+        prefix: process.env.PAPERCLIP_STORAGE_S3_PREFIX ?? defaultStorage.s3.prefix,
+        forcePathStyle:
+          parseBooleanFromEnv(process.env.PAPERCLIP_STORAGE_S3_FORCE_PATH_STYLE) ??
+          defaultStorage.s3.forcePathStyle,
+      },
+    },
+    secrets: {
+      provider: secretsProvider,
+      strictMode: parseBooleanFromEnv(process.env.PAPERCLIP_SECRETS_STRICT_MODE) ?? defaultSecrets.strictMode,
+      localEncrypted: {
+        keyFilePath:
+          resolvePathFromEnv(process.env.PAPERCLIP_SECRETS_MASTER_KEY_FILE) ??
+          defaultSecrets.localEncrypted.keyFilePath,
+      },
     },
-    storage: defaultStorageConfig(),
-    secrets: defaultSecretsConfig(),
   };
+  const ignoredEnvKeys: Array<{ key: string; reason: string }> = [];
+  if (deploymentMode === "local_trusted" && process.env.PAPERCLIP_DEPLOYMENT_EXPOSURE !== undefined) {
+    ignoredEnvKeys.push({
+      key: "PAPERCLIP_DEPLOYMENT_EXPOSURE",
+      reason: "Ignored because deployment mode local_trusted always forces private exposure",
+    });
+  }
+
+  const ignoredKeySet = new Set(ignoredEnvKeys.map((entry) => entry.key));
+  const usedEnvKeys = ONBOARD_ENV_KEYS.filter(
+    (key) => process.env[key] !== undefined && !ignoredKeySet.has(key),
+  );
+  return { defaults, usedEnvKeys, ignoredEnvKeys };
+}
+
+function canCreateBootstrapInviteImmediately(config: Pick<PaperclipConfig, "database" | "server">): boolean {
+  return config.server.deploymentMode === "authenticated" && config.database.mode !== "embedded-postgres";
 }

 export async function onboard(opts: OnboardOptions): Promise<void> {
@@ -109,6 +286,7 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
   }

   let llm: PaperclipConfig["llm"] | undefined;
+  const { defaults: derivedDefaults, usedEnvKeys, ignoredEnvKeys } = quickstartDefaultsFromEnv();
   let {
     database,
     logging,
@@ -116,11 +294,11 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     auth,
     storage,
     secrets,
-  } = quickstartDefaults();
+  } = derivedDefaults;

   if (setupMode === "advanced") {
     p.log.step(pc.bold("Database"));
-    database = await promptDatabase();
+    database = await promptDatabase(database);

     if (database.mode === "postgres" && database.connectionString) {
       const s = p.spinner();
@@ -184,13 +362,20 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     logging = await promptLogging();

     p.log.step(pc.bold("Server"));
-    ({ server, auth } = await promptServer());
+    ({ server, auth } = await promptServer({ currentServer: server, currentAuth: auth }));

     p.log.step(pc.bold("Storage"));
-    storage = await promptStorage(defaultStorageConfig());
+    storage = await promptStorage(storage);

     p.log.step(pc.bold("Secrets"));
-    secrets = defaultSecretsConfig();
+    const secretsDefaults = defaultSecretsConfig();
+    secrets = {
+      provider: secrets.provider ?? secretsDefaults.provider,
+      strictMode: secrets.strictMode ?? secretsDefaults.strictMode,
+      localEncrypted: {
+        keyFilePath: secrets.localEncrypted?.keyFilePath ?? secretsDefaults.localEncrypted.keyFilePath,
+      },
+    };
     p.log.message(
       pc.dim(
         `Using defaults: provider=${secrets.provider}, strictMode=${secrets.strictMode}, keyFile=${secrets.localEncrypted.keyFilePath}`,
@@ -198,10 +383,18 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     );
   } else {
     p.log.step(pc.bold("Quickstart"));
+    p.log.message(pc.dim("Using quickstart defaults."));
+    if (usedEnvKeys.length > 0) {
+      p.log.message(pc.dim(`Environment-aware defaults active (${usedEnvKeys.length} env var(s) detected).`));
+    } else {
     p.log.message(
-      pc.dim("Using local defaults: embedded database, no LLM provider, file storage, and local encrypted secrets."),
+      pc.dim("No environment overrides detected: embedded database, file storage, local encrypted secrets."),
     );
     }
+    for (const ignored of ignoredEnvKeys) {
+      p.log.message(pc.dim(`Ignored ${ignored.key}: ${ignored.reason}`));
+    }
+  }

   const jwtSecret = ensureAgentJwtSecret(configPath);
   const envFilePath = resolveAgentJwtEnvFile(configPath);
@@ -261,7 +454,7 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     "Next commands",
   );

-  if (server.deploymentMode === "authenticated") {
+  if (canCreateBootstrapInviteImmediately({ database, server })) {
     p.log.step("Generating bootstrap CEO invite");
     await bootstrapCeoInvite({ config: configPath });
   }
@@ -284,5 +477,15 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     return;
   }

+  if (server.deploymentMode === "authenticated" && database.mode === "embedded-postgres") {
+    p.log.info(
+      [
+        "Bootstrap CEO invite will be created after the server starts.",
+        `Next: ${pc.cyan("paperclipai run")}`,
+        `Then: ${pc.cyan("paperclipai auth bootstrap-ceo")}`,
+      ].join("\n"),
+    );
+  }
+
   p.outro("You're all set!");
 }

@@ -3,9 +3,13 @@ import path from "node:path";
 import { fileURLToPath, pathToFileURL } from "node:url";
 import * as p from "@clack/prompts";
 import pc from "picocolors";
+import { bootstrapCeoInvite } from "./auth-bootstrap-ceo.js";
 import { onboard } from "./onboard.js";
 import { doctor } from "./doctor.js";
+import { loadPaperclipEnvFile } from "../config/env.js";
 import { configExists, resolveConfigPath } from "../config/store.js";
+import type { PaperclipConfig } from "../config/schema.js";
+import { readConfig } from "../config/store.js";
 import {
   describeLocalInstancePaths,
   resolvePaperclipHomeDir,
@@ -19,6 +23,13 @@ interface RunOptions {
   yes?: boolean;
 }

+interface StartedServer {
+  apiUrl: string;
+  databaseUrl: string;
+  host: string;
+  listenPort: number;
+}
+
 export async function runCommand(opts: RunOptions): Promise<void> {
   const instanceId = resolvePaperclipInstanceId(opts.instance);
   process.env.PAPERCLIP_INSTANCE_ID = instanceId;
@@ -31,6 +42,7 @@ export async function runCommand(opts: RunOptions): Promise<void> {

   const configPath = resolveConfigPath(opts.config);
   process.env.PAPERCLIP_CONFIG = configPath;
+  loadPaperclipEnvFile(configPath);

   p.intro(pc.bgCyan(pc.black(" paperclipai run ")));
   p.log.message(pc.dim(`Home: ${paths.homeDir}`));
@@ -60,8 +72,41 @@ export async function runCommand(opts: RunOptions): Promise<void> {
     process.exit(1);
   }

+  const config = readConfig(configPath);
+  if (!config) {
+    p.log.error(`No config found at ${configPath}.`);
+    process.exit(1);
+  }
+
   p.log.step("Starting Paperclip server...");
-  await importServerEntry();
+  const startedServer = await importServerEntry();
+
+  if (shouldGenerateBootstrapInviteAfterStart(config)) {
+    p.log.step("Generating bootstrap CEO invite");
+    await bootstrapCeoInvite({
+      config: configPath,
+      dbUrl: startedServer.databaseUrl,
+      baseUrl: resolveBootstrapInviteBaseUrl(config, startedServer),
+    });
+  }
+}
+
+function resolveBootstrapInviteBaseUrl(
+  config: PaperclipConfig,
+  startedServer: StartedServer,
+): string {
+  const explicitBaseUrl =
+    process.env.PAPERCLIP_PUBLIC_URL ??
+    process.env.PAPERCLIP_AUTH_PUBLIC_BASE_URL ??
+    process.env.BETTER_AUTH_URL ??
+    process.env.BETTER_AUTH_BASE_URL ??
+    (config.auth.baseUrlMode === "explicit" ? config.auth.publicBaseUrl : undefined);
+
+  if (typeof explicitBaseUrl === "string" && explicitBaseUrl.trim().length > 0) {
+    return explicitBaseUrl.trim().replace(/\/+$/, "");
+  }
+
+  return startedServer.apiUrl.replace(/\/api$/, "");
 }

 function formatError(err: unknown): string {
@@ -101,19 +146,20 @@ function maybeEnableUiDevMiddleware(entrypoint: string): void {
   }
 }

-async function importServerEntry(): Promise<void> {
+async function importServerEntry(): Promise<StartedServer> {
   // Dev mode: try local workspace path (monorepo with tsx)
   const projectRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
   const devEntry = path.resolve(projectRoot, "server/src/index.ts");
   if (fs.existsSync(devEntry)) {
     maybeEnableUiDevMiddleware(devEntry);
-    await import(pathToFileURL(devEntry).href);
-    return;
+    const mod = await import(pathToFileURL(devEntry).href);
+    return await startServerFromModule(mod, devEntry);
   }

   // Production mode: import the published @paperclipai/server package
   try {
-    await import("@paperclipai/server");
+    const mod = await import("@paperclipai/server");
+    return await startServerFromModule(mod, "@paperclipai/server");
   } catch (err) {
     const missingSpecifier = getMissingModuleSpecifier(err);
     const missingServerEntrypoint = !missingSpecifier || missingSpecifier === "@paperclipai/server";
@@ -130,3 +176,15 @@ async function importServerEntry(): Promise<void> {
     );
   }
 }
+
+function shouldGenerateBootstrapInviteAfterStart(config: PaperclipConfig): boolean {
+  return config.server.deploymentMode === "authenticated" && config.database.mode === "embedded-postgres";
+}
+
+async function startServerFromModule(mod: unknown, label: string): Promise<StartedServer> {
+  const startServer = (mod as { startServer?: () => Promise<StartedServer> }).startServer;
+  if (typeof startServer !== "function") {
+    throw new Error(`Paperclip server entrypoint did not export startServer(): ${label}`);
+  }
+  return await startServer();
+}

cli/src/commands/worktree-lib.ts (new file, 274 lines)
@@ -0,0 +1,274 @@
import { randomInt } from "node:crypto";
import path from "node:path";
import type { PaperclipConfig } from "../config/schema.js";
import { expandHomePrefix } from "../config/home.js";

export const DEFAULT_WORKTREE_HOME = "~/.paperclip-worktrees";
export const WORKTREE_SEED_MODES = ["minimal", "full"] as const;

export type WorktreeSeedMode = (typeof WORKTREE_SEED_MODES)[number];

export type WorktreeSeedPlan = {
  mode: WorktreeSeedMode;
  excludedTables: string[];
  nullifyColumns: Record<string, string[]>;
};

const MINIMAL_WORKTREE_EXCLUDED_TABLES = [
  "activity_log",
  "agent_runtime_state",
  "agent_task_sessions",
  "agent_wakeup_requests",
  "cost_events",
  "heartbeat_run_events",
  "heartbeat_runs",
  "workspace_runtime_services",
];

const MINIMAL_WORKTREE_NULLIFIED_COLUMNS: Record<string, string[]> = {
  issues: ["checkout_run_id", "execution_run_id"],
};

export type WorktreeLocalPaths = {
  cwd: string;
  repoConfigDir: string;
  configPath: string;
  envPath: string;
  homeDir: string;
  instanceId: string;
  instanceRoot: string;
  contextPath: string;
  embeddedPostgresDataDir: string;
  backupDir: string;
  logDir: string;
  secretsKeyFilePath: string;
  storageDir: string;
};

export type WorktreeUiBranding = {
  name: string;
  color: string;
};

export function isWorktreeSeedMode(value: string): value is WorktreeSeedMode {
  return (WORKTREE_SEED_MODES as readonly string[]).includes(value);
}

export function resolveWorktreeSeedPlan(mode: WorktreeSeedMode): WorktreeSeedPlan {
  if (mode === "full") {
    return {
      mode,
      excludedTables: [],
      nullifyColumns: {},
    };
  }
  return {
    mode,
    excludedTables: [...MINIMAL_WORKTREE_EXCLUDED_TABLES],
    nullifyColumns: {
      ...MINIMAL_WORKTREE_NULLIFIED_COLUMNS,
    },
  };
}

function nonEmpty(value: string | null | undefined): string | null {
  return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}

function isLoopbackHost(hostname: string): boolean {
  const value = hostname.trim().toLowerCase();
  return value === "127.0.0.1" || value === "localhost" || value === "::1";
}

export function sanitizeWorktreeInstanceId(rawValue: string): string {
  const trimmed = rawValue.trim().toLowerCase();
  const normalized = trimmed
    .replace(/[^a-z0-9_-]+/g, "-")
    .replace(/-+/g, "-")
    .replace(/^[-_]+|[-_]+$/g, "");
  return normalized || "worktree";
}

export function resolveSuggestedWorktreeName(cwd: string, explicitName?: string): string {
  return nonEmpty(explicitName) ?? path.basename(path.resolve(cwd));
}

function hslComponentToHex(n: number): string {
  return Math.round(Math.max(0, Math.min(255, n)))
    .toString(16)
    .padStart(2, "0");
}

function hslToHex(hue: number, saturation: number, lightness: number): string {
  const s = Math.max(0, Math.min(100, saturation)) / 100;
  const l = Math.max(0, Math.min(100, lightness)) / 100;
  const c = (1 - Math.abs((2 * l) - 1)) * s;
  const h = ((hue % 360) + 360) % 360;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = l - (c / 2);

  let r = 0;
  let g = 0;
  let b = 0;

  if (h < 60) {
    r = c;
    g = x;
  } else if (h < 120) {
    r = x;
    g = c;
  } else if (h < 180) {
    g = c;
    b = x;
  } else if (h < 240) {
    g = x;
    b = c;
  } else if (h < 300) {
    r = x;
    b = c;
  } else {
    r = c;
    b = x;
  }

  return `#${hslComponentToHex((r + m) * 255)}${hslComponentToHex((g + m) * 255)}${hslComponentToHex((b + m) * 255)}`;
}

export function generateWorktreeColor(): string {
  return hslToHex(randomInt(0, 360), 68, 56);
}

export function resolveWorktreeLocalPaths(opts: {
  cwd: string;
  homeDir?: string;
  instanceId: string;
}): WorktreeLocalPaths {
  const cwd = path.resolve(opts.cwd);
  const homeDir = path.resolve(expandHomePrefix(opts.homeDir ?? DEFAULT_WORKTREE_HOME));
  const instanceRoot = path.resolve(homeDir, "instances", opts.instanceId);
  const repoConfigDir = path.resolve(cwd, ".paperclip");
  return {
    cwd,
    repoConfigDir,
    configPath: path.resolve(repoConfigDir, "config.json"),
    envPath: path.resolve(repoConfigDir, ".env"),
    homeDir,
    instanceId: opts.instanceId,
    instanceRoot,
    contextPath: path.resolve(homeDir, "context.json"),
    embeddedPostgresDataDir: path.resolve(instanceRoot, "db"),
    backupDir: path.resolve(instanceRoot, "data", "backups"),
    logDir: path.resolve(instanceRoot, "logs"),
    secretsKeyFilePath: path.resolve(instanceRoot, "secrets", "master.key"),
    storageDir: path.resolve(instanceRoot, "data", "storage"),
  };
}

export function rewriteLocalUrlPort(rawUrl: string | undefined, port: number): string | undefined {
  if (!rawUrl) return undefined;
  try {
    const parsed = new URL(rawUrl);
    if (!isLoopbackHost(parsed.hostname)) return rawUrl;
    parsed.port = String(port);
    return parsed.toString();
  } catch {
    return rawUrl;
  }
}

export function buildWorktreeConfig(input: {
  sourceConfig: PaperclipConfig | null;
  paths: WorktreeLocalPaths;
  serverPort: number;
  databasePort: number;
  now?: Date;
}): PaperclipConfig {
  const { sourceConfig, paths, serverPort, databasePort } = input;
  const nowIso = (input.now ?? new Date()).toISOString();

  const source = sourceConfig;
  const authPublicBaseUrl = rewriteLocalUrlPort(source?.auth.publicBaseUrl, serverPort);

  return {
    $meta: {
      version: 1,
      updatedAt: nowIso,
      source: "configure",
    },
    ...(source?.llm ? { llm: source.llm } : {}),
    database: {
      mode: "embedded-postgres",
      embeddedPostgresDataDir: paths.embeddedPostgresDataDir,
      embeddedPostgresPort: databasePort,
      backup: {
        enabled: source?.database.backup.enabled ?? true,
        intervalMinutes: source?.database.backup.intervalMinutes ?? 60,
        retentionDays: source?.database.backup.retentionDays ?? 30,
        dir: paths.backupDir,
      },
    },
    logging: {
      mode: source?.logging.mode ?? "file",
      logDir: paths.logDir,
    },
    server: {
      deploymentMode: source?.server.deploymentMode ?? "local_trusted",
      exposure: source?.server.exposure ?? "private",
      host: source?.server.host ?? "127.0.0.1",
      port: serverPort,
      allowedHostnames: source?.server.allowedHostnames ?? [],
      serveUi: source?.server.serveUi ?? true,
    },
    auth: {
      baseUrlMode: source?.auth.baseUrlMode ?? "auto",
      ...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
      disableSignUp: source?.auth.disableSignUp ?? false,
    },
    storage: {
      provider: source?.storage.provider ?? "local_disk",
      localDisk: {
        baseDir: paths.storageDir,
      },
      s3: {
        bucket: source?.storage.s3.bucket ?? "paperclip",
        region: source?.storage.s3.region ?? "us-east-1",
        endpoint: source?.storage.s3.endpoint,
        prefix: source?.storage.s3.prefix ?? "",
        forcePathStyle: source?.storage.s3.forcePathStyle ?? false,
      },
    },
    secrets: {
      provider: source?.secrets.provider ?? "local_encrypted",
      strictMode: source?.secrets.strictMode ?? false,
      localEncrypted: {
        keyFilePath: paths.secretsKeyFilePath,
      },
    },
  };
}

export function buildWorktreeEnvEntries(
  paths: WorktreeLocalPaths,
  branding?: WorktreeUiBranding,
): Record<string, string> {
  return {
    PAPERCLIP_HOME: paths.homeDir,
    PAPERCLIP_INSTANCE_ID: paths.instanceId,
    PAPERCLIP_CONFIG: paths.configPath,
    PAPERCLIP_CONTEXT: paths.contextPath,
    PAPERCLIP_IN_WORKTREE: "true",
    ...(branding?.name ? { PAPERCLIP_WORKTREE_NAME: branding.name } : {}),
    ...(branding?.color ? { PAPERCLIP_WORKTREE_COLOR: branding.color } : {}),
  };
}

function shellEscape(value: string): string {
  return `'${value.replaceAll("'", `'\"'\"'`)}'`;
}

export function formatShellExports(entries: Record<string, string>): string {
  return Object.entries(entries)
    .filter(([, value]) => typeof value === "string" && value.trim().length > 0)
    .map(([key, value]) => `export ${key}=${shellEscape(value)}`)
    .join("\n");
}

cli/src/commands/worktree-merge-history-lib.ts (new file, 764 lines)
@@ -0,0 +1,764 @@
import {
  agents,
  assets,
  documentRevisions,
  goals,
  issueAttachments,
  issueComments,
  issueDocuments,
  issues,
  projects,
  projectWorkspaces,
} from "@paperclipai/db";

type IssueRow = typeof issues.$inferSelect;
type CommentRow = typeof issueComments.$inferSelect;
type AgentRow = typeof agents.$inferSelect;
type ProjectRow = typeof projects.$inferSelect;
type ProjectWorkspaceRow = typeof projectWorkspaces.$inferSelect;
type GoalRow = typeof goals.$inferSelect;
type IssueDocumentLinkRow = typeof issueDocuments.$inferSelect;
type DocumentRevisionTableRow = typeof documentRevisions.$inferSelect;
type IssueAttachmentTableRow = typeof issueAttachments.$inferSelect;
type AssetRow = typeof assets.$inferSelect;

export const WORKTREE_MERGE_SCOPES = ["issues", "comments"] as const;
export type WorktreeMergeScope = (typeof WORKTREE_MERGE_SCOPES)[number];

export type ImportAdjustment =
  | "clear_assignee_agent"
  | "clear_project"
  | "clear_project_workspace"
  | "clear_goal"
  | "clear_author_agent"
  | "coerce_in_progress_to_todo"
  | "clear_document_agent"
  | "clear_document_revision_agent"
  | "clear_attachment_agent";

export type IssueMergeAction = "skip_existing" | "insert";
export type CommentMergeAction = "skip_existing" | "skip_missing_parent" | "insert";

export type PlannedIssueInsert = {
  source: IssueRow;
  action: "insert";
  previewIssueNumber: number;
  previewIdentifier: string;
  targetStatus: string;
  targetAssigneeAgentId: string | null;
  targetCreatedByAgentId: string | null;
  targetProjectId: string | null;
  targetProjectWorkspaceId: string | null;
  targetGoalId: string | null;
  projectResolution: "preserved" | "cleared" | "mapped" | "imported";
  mappedProjectName: string | null;
  adjustments: ImportAdjustment[];
};

export type PlannedIssueSkip = {
  source: IssueRow;
  action: "skip_existing";
  driftKeys: string[];
};

export type PlannedCommentInsert = {
  source: CommentRow;
  action: "insert";
  targetAuthorAgentId: string | null;
  adjustments: ImportAdjustment[];
};

export type PlannedCommentSkip = {
  source: CommentRow;
  action: "skip_existing" | "skip_missing_parent";
};

export type IssueDocumentRow = {
  id: IssueDocumentLinkRow["id"];
  companyId: IssueDocumentLinkRow["companyId"];
  issueId: IssueDocumentLinkRow["issueId"];
  documentId: IssueDocumentLinkRow["documentId"];
  key: IssueDocumentLinkRow["key"];
  linkCreatedAt: IssueDocumentLinkRow["createdAt"];
  linkUpdatedAt: IssueDocumentLinkRow["updatedAt"];
  title: string | null;
  format: string;
  latestBody: string;
  latestRevisionId: string | null;
  latestRevisionNumber: number;
  createdByAgentId: string | null;
  createdByUserId: string | null;
  updatedByAgentId: string | null;
  updatedByUserId: string | null;
  documentCreatedAt: Date;
  documentUpdatedAt: Date;
};

export type DocumentRevisionRow = {
  id: DocumentRevisionTableRow["id"];
  companyId: DocumentRevisionTableRow["companyId"];
  documentId: DocumentRevisionTableRow["documentId"];
  revisionNumber: DocumentRevisionTableRow["revisionNumber"];
  body: DocumentRevisionTableRow["body"];
  changeSummary: DocumentRevisionTableRow["changeSummary"];
  createdByAgentId: string | null;
  createdByUserId: string | null;
  createdAt: Date;
};

export type IssueAttachmentRow = {
  id: IssueAttachmentTableRow["id"];
  companyId: IssueAttachmentTableRow["companyId"];
  issueId: IssueAttachmentTableRow["issueId"];
  issueCommentId: IssueAttachmentTableRow["issueCommentId"];
  assetId: IssueAttachmentTableRow["assetId"];
  provider: AssetRow["provider"];
  objectKey: AssetRow["objectKey"];
  contentType: AssetRow["contentType"];
  byteSize: AssetRow["byteSize"];
  sha256: AssetRow["sha256"];
  originalFilename: AssetRow["originalFilename"];
  createdByAgentId: string | null;
  createdByUserId: string | null;
  assetCreatedAt: Date;
  assetUpdatedAt: Date;
  attachmentCreatedAt: Date;
  attachmentUpdatedAt: Date;
};

export type PlannedDocumentRevisionInsert = {
  source: DocumentRevisionRow;
  targetRevisionNumber: number;
  targetCreatedByAgentId: string | null;
  adjustments: ImportAdjustment[];
};

export type PlannedIssueDocumentInsert = {
  source: IssueDocumentRow;
  action: "insert";
  targetCreatedByAgentId: string | null;
  targetUpdatedByAgentId: string | null;
  latestRevisionId: string | null;
  latestRevisionNumber: number;
  revisionsToInsert: PlannedDocumentRevisionInsert[];
  adjustments: ImportAdjustment[];
};

export type PlannedIssueDocumentMerge = {
  source: IssueDocumentRow;
  action: "merge_existing";
  targetCreatedByAgentId: string | null;
  targetUpdatedByAgentId: string | null;
  latestRevisionId: string | null;
  latestRevisionNumber: number;
  revisionsToInsert: PlannedDocumentRevisionInsert[];
  adjustments: ImportAdjustment[];
};

export type PlannedIssueDocumentSkip = {
  source: IssueDocumentRow;
  action: "skip_existing" | "skip_missing_parent" | "skip_conflicting_key";
};

export type PlannedAttachmentInsert = {
  source: IssueAttachmentRow;
  action: "insert";
  targetIssueCommentId: string | null;
  targetCreatedByAgentId: string | null;
  adjustments: ImportAdjustment[];
};

export type PlannedAttachmentSkip = {
  source: IssueAttachmentRow;
  action: "skip_existing" | "skip_missing_parent";
};

export type PlannedProjectImport = {
  source: ProjectRow;
  targetLeadAgentId: string | null;
  targetGoalId: string | null;
  workspaces: ProjectWorkspaceRow[];
};

export type WorktreeMergePlan = {
  companyId: string;
  companyName: string;
  issuePrefix: string;
  previewIssueCounterStart: number;
  scopes: WorktreeMergeScope[];
  projectImports: PlannedProjectImport[];
  issuePlans: Array<PlannedIssueInsert | PlannedIssueSkip>;
  commentPlans: Array<PlannedCommentInsert | PlannedCommentSkip>;
  documentPlans: Array<PlannedIssueDocumentInsert | PlannedIssueDocumentMerge | PlannedIssueDocumentSkip>;
  attachmentPlans: Array<PlannedAttachmentInsert | PlannedAttachmentSkip>;
  counts: {
    projectsToImport: number;
    issuesToInsert: number;
    issuesExisting: number;
    issueDrift: number;
    commentsToInsert: number;
    commentsExisting: number;
    commentsMissingParent: number;
    documentsToInsert: number;
    documentsToMerge: number;
    documentsExisting: number;
    documentsConflictingKey: number;
    documentsMissingParent: number;
    documentRevisionsToInsert: number;
    attachmentsToInsert: number;
    attachmentsExisting: number;
    attachmentsMissingParent: number;
  };
  adjustments: Record<ImportAdjustment, number>;
};

function compareIssueCoreFields(source: IssueRow, target: IssueRow): string[] {
  const driftKeys: string[] = [];
  if (source.title !== target.title) driftKeys.push("title");
  if ((source.description ?? null) !== (target.description ?? null)) driftKeys.push("description");
  if (source.status !== target.status) driftKeys.push("status");
  if (source.priority !== target.priority) driftKeys.push("priority");
  if ((source.parentId ?? null) !== (target.parentId ?? null)) driftKeys.push("parentId");
  if ((source.projectId ?? null) !== (target.projectId ?? null)) driftKeys.push("projectId");
  if ((source.projectWorkspaceId ?? null) !== (target.projectWorkspaceId ?? null)) driftKeys.push("projectWorkspaceId");
  if ((source.goalId ?? null) !== (target.goalId ?? null)) driftKeys.push("goalId");
  if ((source.assigneeAgentId ?? null) !== (target.assigneeAgentId ?? null)) driftKeys.push("assigneeAgentId");
  if ((source.assigneeUserId ?? null) !== (target.assigneeUserId ?? null)) driftKeys.push("assigneeUserId");
  return driftKeys;
}

function incrementAdjustment(
  counts: Record<ImportAdjustment, number>,
  adjustment: ImportAdjustment,
): void {
  counts[adjustment] += 1;
}

function groupBy<T>(rows: T[], keyFor: (row: T) => string): Map<string, T[]> {
  const out = new Map<string, T[]>();
  for (const row of rows) {
    const key = keyFor(row);
    const existing = out.get(key);
    if (existing) {
      existing.push(row);
    } else {
      out.set(key, [row]);
    }
  }
  return out;
}

function sameDate(left: Date, right: Date): boolean {
  return left.getTime() === right.getTime();
}

function sortDocumentRows(rows: IssueDocumentRow[]): IssueDocumentRow[] {
  return [...rows].sort((left, right) => {
    const createdDelta = left.documentCreatedAt.getTime() - right.documentCreatedAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    const linkDelta = left.linkCreatedAt.getTime() - right.linkCreatedAt.getTime();
    if (linkDelta !== 0) return linkDelta;
    return left.documentId.localeCompare(right.documentId);
  });
}

function sortDocumentRevisions(rows: DocumentRevisionRow[]): DocumentRevisionRow[] {
  return [...rows].sort((left, right) => {
    const revisionDelta = left.revisionNumber - right.revisionNumber;
    if (revisionDelta !== 0) return revisionDelta;
    const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

function sortAttachments(rows: IssueAttachmentRow[]): IssueAttachmentRow[] {
  return [...rows].sort((left, right) => {
    const createdDelta = left.attachmentCreatedAt.getTime() - right.attachmentCreatedAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

function sortIssuesForImport(sourceIssues: IssueRow[]): IssueRow[] {
  const byId = new Map(sourceIssues.map((issue) => [issue.id, issue]));
  const memoDepth = new Map<string, number>();

  const depthFor = (issue: IssueRow, stack = new Set<string>()): number => {
    const memoized = memoDepth.get(issue.id);
    if (memoized !== undefined) return memoized;
    if (!issue.parentId) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    if (stack.has(issue.id)) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    const parent = byId.get(issue.parentId);
    if (!parent) {
      memoDepth.set(issue.id, 0);
      return 0;
    }
    stack.add(issue.id);
    const depth = depthFor(parent, stack) + 1;
    stack.delete(issue.id);
    memoDepth.set(issue.id, depth);
    return depth;
  };

  return [...sourceIssues].sort((left, right) => {
    const depthDelta = depthFor(left) - depthFor(right);
    if (depthDelta !== 0) return depthDelta;
    const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
    if (createdDelta !== 0) return createdDelta;
    return left.id.localeCompare(right.id);
  });
}

export function parseWorktreeMergeScopes(rawValue: string | undefined): WorktreeMergeScope[] {
  if (!rawValue || rawValue.trim().length === 0) {
    return ["issues", "comments"];
  }

  const parsed = rawValue
    .split(",")
    .map((value) => value.trim().toLowerCase())
    .filter((value): value is WorktreeMergeScope =>
      (WORKTREE_MERGE_SCOPES as readonly string[]).includes(value),
    );

  if (parsed.length === 0) {
    throw new Error(
      `Invalid scope "${rawValue}". Expected a comma-separated list of: ${WORKTREE_MERGE_SCOPES.join(", ")}.`,
    );
  }

  return [...new Set(parsed)];
}

export function buildWorktreeMergePlan(input: {
  companyId: string;
  companyName: string;
  issuePrefix: string;
  previewIssueCounterStart: number;
  scopes: WorktreeMergeScope[];
  sourceIssues: IssueRow[];
  targetIssues: IssueRow[];
  sourceComments: CommentRow[];
  targetComments: CommentRow[];
  sourceProjects?: ProjectRow[];
  sourceProjectWorkspaces?: ProjectWorkspaceRow[];
  sourceDocuments?: IssueDocumentRow[];
  targetDocuments?: IssueDocumentRow[];
  sourceDocumentRevisions?: DocumentRevisionRow[];
  targetDocumentRevisions?: DocumentRevisionRow[];
  sourceAttachments?: IssueAttachmentRow[];
  targetAttachments?: IssueAttachmentRow[];
  targetAgents: AgentRow[];
  targetProjects: ProjectRow[];
  targetProjectWorkspaces: ProjectWorkspaceRow[];
  targetGoals: GoalRow[];
  importProjectIds?: Iterable<string>;
  projectIdOverrides?: Record<string, string | null | undefined>;
}): WorktreeMergePlan {
  const targetIssuesById = new Map(input.targetIssues.map((issue) => [issue.id, issue]));
  const targetCommentIds = new Set(input.targetComments.map((comment) => comment.id));
  const targetAgentIds = new Set(input.targetAgents.map((agent) => agent.id));
  const targetProjectIds = new Set(input.targetProjects.map((project) => project.id));
  const targetProjectsById = new Map(input.targetProjects.map((project) => [project.id, project]));
  const targetProjectWorkspaceIds = new Set(input.targetProjectWorkspaces.map((workspace) => workspace.id));
  const targetGoalIds = new Set(input.targetGoals.map((goal) => goal.id));
  const sourceProjectsById = new Map((input.sourceProjects ?? []).map((project) => [project.id, project]));
  const sourceProjectWorkspaces = input.sourceProjectWorkspaces ?? [];
  const sourceProjectWorkspacesByProjectId = groupBy(sourceProjectWorkspaces, (workspace) => workspace.projectId);
  const importProjectIds = new Set(input.importProjectIds ?? []);
  const scopes = new Set(input.scopes);

  const adjustmentCounts: Record<ImportAdjustment, number> = {
    clear_assignee_agent: 0,
    clear_project: 0,
    clear_project_workspace: 0,
    clear_goal: 0,
    clear_author_agent: 0,
    coerce_in_progress_to_todo: 0,
    clear_document_agent: 0,
    clear_document_revision_agent: 0,
    clear_attachment_agent: 0,
  };

  const projectImports: PlannedProjectImport[] = [];
  for (const projectId of importProjectIds) {
    if (targetProjectIds.has(projectId)) continue;
    const sourceProject = sourceProjectsById.get(projectId);
    if (!sourceProject) continue;
    projectImports.push({
      source: sourceProject,
      targetLeadAgentId:
        sourceProject.leadAgentId && targetAgentIds.has(sourceProject.leadAgentId)
          ? sourceProject.leadAgentId
          : null,
      targetGoalId:
        sourceProject.goalId && targetGoalIds.has(sourceProject.goalId)
          ? sourceProject.goalId
          : null,
      workspaces: [...(sourceProjectWorkspacesByProjectId.get(projectId) ?? [])].sort((left, right) => {
        const primaryDelta = Number(right.isPrimary) - Number(left.isPrimary);
        if (primaryDelta !== 0) return primaryDelta;
        const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
        if (createdDelta !== 0) return createdDelta;
        return left.id.localeCompare(right.id);
      }),
    });
  }
  const importedProjectWorkspaceIds = new Set(
    projectImports.flatMap((project) => project.workspaces.map((workspace) => workspace.id)),
  );

  const issuePlans: Array<PlannedIssueInsert | PlannedIssueSkip> = [];
  let nextPreviewIssueNumber = input.previewIssueCounterStart;
  for (const issue of sortIssuesForImport(input.sourceIssues)) {
    const existing = targetIssuesById.get(issue.id);
    if (existing) {
      issuePlans.push({
        source: issue,
        action: "skip_existing",
        driftKeys: compareIssueCoreFields(issue, existing),
      });
      continue;
    }

    nextPreviewIssueNumber += 1;
    const adjustments: ImportAdjustment[] = [];
    const targetAssigneeAgentId =
      issue.assigneeAgentId && targetAgentIds.has(issue.assigneeAgentId) ? issue.assigneeAgentId : null;
    if (issue.assigneeAgentId && !targetAssigneeAgentId) {
      adjustments.push("clear_assignee_agent");
      incrementAdjustment(adjustmentCounts, "clear_assignee_agent");
    }

    const targetCreatedByAgentId =
      issue.createdByAgentId && targetAgentIds.has(issue.createdByAgentId) ? issue.createdByAgentId : null;

    let targetProjectId =
      issue.projectId && targetProjectIds.has(issue.projectId) ? issue.projectId : null;
    let projectResolution: PlannedIssueInsert["projectResolution"] = targetProjectId ? "preserved" : "cleared";
    let mappedProjectName: string | null = null;
    const overrideProjectId =
      issue.projectId && input.projectIdOverrides
        ? input.projectIdOverrides[issue.projectId] ?? null
        : null;
    if (!targetProjectId && overrideProjectId && targetProjectIds.has(overrideProjectId)) {
      targetProjectId = overrideProjectId;
      projectResolution = "mapped";
      mappedProjectName = targetProjectsById.get(overrideProjectId)?.name ?? null;
    }
    if (!targetProjectId && issue.projectId && importProjectIds.has(issue.projectId)) {
      const sourceProject = sourceProjectsById.get(issue.projectId);
      if (sourceProject) {
        targetProjectId = sourceProject.id;
        projectResolution = "imported";
        mappedProjectName = sourceProject.name;
      }
    }
    if (issue.projectId && !targetProjectId) {
      adjustments.push("clear_project");
      incrementAdjustment(adjustmentCounts, "clear_project");
    }

    const targetProjectWorkspaceId =
      targetProjectId
      && targetProjectId === issue.projectId
      && issue.projectWorkspaceId
      && (targetProjectWorkspaceIds.has(issue.projectWorkspaceId)
        || importedProjectWorkspaceIds.has(issue.projectWorkspaceId))
        ? issue.projectWorkspaceId
        : null;
    if (issue.projectWorkspaceId && !targetProjectWorkspaceId) {
      adjustments.push("clear_project_workspace");
      incrementAdjustment(adjustmentCounts, "clear_project_workspace");
    }

    const targetGoalId =
      issue.goalId && targetGoalIds.has(issue.goalId) ? issue.goalId : null;
    if (issue.goalId && !targetGoalId) {
      adjustments.push("clear_goal");
      incrementAdjustment(adjustmentCounts, "clear_goal");
    }

    let targetStatus = issue.status;
    if (
      targetStatus === "in_progress"
      && !targetAssigneeAgentId
      && !(issue.assigneeUserId && issue.assigneeUserId.trim().length > 0)
    ) {
      targetStatus = "todo";
      adjustments.push("coerce_in_progress_to_todo");
      incrementAdjustment(adjustmentCounts, "coerce_in_progress_to_todo");
    }

    issuePlans.push({
      source: issue,
      action: "insert",
      previewIssueNumber: nextPreviewIssueNumber,
      previewIdentifier: `${input.issuePrefix}-${nextPreviewIssueNumber}`,
      targetStatus,
      targetAssigneeAgentId,
      targetCreatedByAgentId,
      targetProjectId,
      targetProjectWorkspaceId,
      targetGoalId,
      projectResolution,
      mappedProjectName,
      adjustments,
    });
  }

  const issueIdsAvailableAfterImport = new Set<string>([
    ...input.targetIssues.map((issue) => issue.id),
    ...issuePlans.filter((plan): plan is PlannedIssueInsert => plan.action === "insert").map((plan) => plan.source.id),
  ]);

  const commentPlans: Array<PlannedCommentInsert | PlannedCommentSkip> = [];
  if (scopes.has("comments")) {
    const sortedComments = [...input.sourceComments].sort((left, right) => {
      const createdDelta = left.createdAt.getTime() - right.createdAt.getTime();
      if (createdDelta !== 0) return createdDelta;
      return left.id.localeCompare(right.id);
    });

    for (const comment of sortedComments) {
      if (targetCommentIds.has(comment.id)) {
        commentPlans.push({ source: comment, action: "skip_existing" });
        continue;
      }
      if (!issueIdsAvailableAfterImport.has(comment.issueId)) {
        commentPlans.push({ source: comment, action: "skip_missing_parent" });
        continue;
      }

      const adjustments: ImportAdjustment[] = [];
      const targetAuthorAgentId =
        comment.authorAgentId && targetAgentIds.has(comment.authorAgentId) ? comment.authorAgentId : null;
      if (comment.authorAgentId && !targetAuthorAgentId) {
        adjustments.push("clear_author_agent");
        incrementAdjustment(adjustmentCounts, "clear_author_agent");
      }

      commentPlans.push({
        source: comment,
        action: "insert",
        targetAuthorAgentId,
        adjustments,
      });
    }
  }

  const sourceDocuments = input.sourceDocuments ?? [];
  const targetDocuments = input.targetDocuments ?? [];
  const sourceDocumentRevisions = input.sourceDocumentRevisions ?? [];
  const targetDocumentRevisions = input.targetDocumentRevisions ?? [];

  const targetDocumentsById = new Map(targetDocuments.map((document) => [document.documentId, document]));
  const targetDocumentsByIssueKey = new Map(targetDocuments.map((document) => [`${document.issueId}:${document.key}`, document]));
  const sourceRevisionsByDocumentId = groupBy(sourceDocumentRevisions, (revision) => revision.documentId);
  const targetRevisionsByDocumentId = groupBy(targetDocumentRevisions, (revision) => revision.documentId);
  const commentIdsAvailableAfterImport = new Set<string>([
    ...input.targetComments.map((comment) => comment.id),
    ...commentPlans.filter((plan): plan is PlannedCommentInsert => plan.action === "insert").map((plan) => plan.source.id),
  ]);

  const documentPlans: Array<PlannedIssueDocumentInsert | PlannedIssueDocumentMerge | PlannedIssueDocumentSkip> = [];
|
||||||
|
for (const document of sortDocumentRows(sourceDocuments)) {
|
||||||
|
if (!issueIdsAvailableAfterImport.has(document.issueId)) {
|
||||||
|
documentPlans.push({ source: document, action: "skip_missing_parent" });
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const existingDocument = targetDocumentsById.get(document.documentId);
|
||||||
|
const conflictingIssueKeyDocument = targetDocumentsByIssueKey.get(`${document.issueId}:${document.key}`);
|
||||||
|
if (!existingDocument && conflictingIssueKeyDocument && conflictingIssueKeyDocument.documentId !== document.documentId) {
|
||||||
|
documentPlans.push({ source: document, action: "skip_conflicting_key" });
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const adjustments: ImportAdjustment[] = [];
|
||||||
|
const targetCreatedByAgentId =
|
||||||
|
document.createdByAgentId && targetAgentIds.has(document.createdByAgentId) ? document.createdByAgentId : null;
|
||||||
|
const targetUpdatedByAgentId =
|
||||||
|
document.updatedByAgentId && targetAgentIds.has(document.updatedByAgentId) ? document.updatedByAgentId : null;
|
||||||
|
if (
|
||||||
|
(document.createdByAgentId && !targetCreatedByAgentId)
|
||||||
|
|| (document.updatedByAgentId && !targetUpdatedByAgentId)
|
||||||
|
) {
|
||||||
|
adjustments.push("clear_document_agent");
|
||||||
|
incrementAdjustment(adjustmentCounts, "clear_document_agent");
|
||||||
|
}
|
||||||
|
|
||||||
|
const sourceRevisions = sortDocumentRevisions(sourceRevisionsByDocumentId.get(document.documentId) ?? []);
|
||||||
|
const targetRevisions = sortDocumentRevisions(targetRevisionsByDocumentId.get(document.documentId) ?? []);
|
||||||
|
const existingRevisionIds = new Set(targetRevisions.map((revision) => revision.id));
|
||||||
|
const usedRevisionNumbers = new Set(targetRevisions.map((revision) => revision.revisionNumber));
|
||||||
|
let nextRevisionNumber = targetRevisions.reduce(
|
||||||
|
(maxValue, revision) => Math.max(maxValue, revision.revisionNumber),
|
||||||
|
0,
|
||||||
|
) + 1;
|
||||||
|
|
||||||
|
const targetRevisionNumberById = new Map<string, number>(
|
||||||
|
targetRevisions.map((revision) => [revision.id, revision.revisionNumber]),
|
||||||
|
);
|
||||||
|
const revisionsToInsert: PlannedDocumentRevisionInsert[] = [];
|
||||||
|
|
||||||
|
for (const revision of sourceRevisions) {
|
||||||
|
if (existingRevisionIds.has(revision.id)) continue;
|
||||||
|
let targetRevisionNumber = revision.revisionNumber;
|
||||||
|
if (usedRevisionNumbers.has(targetRevisionNumber)) {
|
||||||
|
while (usedRevisionNumbers.has(nextRevisionNumber)) {
|
||||||
|
nextRevisionNumber += 1;
|
||||||
|
}
|
||||||
|
targetRevisionNumber = nextRevisionNumber;
|
||||||
|
nextRevisionNumber += 1;
|
||||||
|
}
|
||||||
|
usedRevisionNumbers.add(targetRevisionNumber);
|
||||||
|
targetRevisionNumberById.set(revision.id, targetRevisionNumber);
|
||||||
|
|
||||||
|
const revisionAdjustments: ImportAdjustment[] = [];
|
||||||
|
const targetCreatedByAgentId =
|
||||||
|
revision.createdByAgentId && targetAgentIds.has(revision.createdByAgentId) ? revision.createdByAgentId : null;
|
||||||
|
if (revision.createdByAgentId && !targetCreatedByAgentId) {
|
||||||
|
revisionAdjustments.push("clear_document_revision_agent");
|
||||||
|
incrementAdjustment(adjustmentCounts, "clear_document_revision_agent");
|
||||||
|
}
|
||||||
|
|
||||||
|
revisionsToInsert.push({
|
||||||
|
source: revision,
|
||||||
|
targetRevisionNumber,
|
||||||
|
targetCreatedByAgentId,
|
||||||
|
adjustments: revisionAdjustments,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
const latestRevisionId = document.latestRevisionId ?? existingDocument?.latestRevisionId ?? null;
|
||||||
|
const latestRevisionNumber =
|
||||||
|
(latestRevisionId ? targetRevisionNumberById.get(latestRevisionId) : undefined)
|
||||||
|
?? document.latestRevisionNumber
|
||||||
|
?? existingDocument?.latestRevisionNumber
|
||||||
|
?? 0;
|
||||||
|
|
||||||
|
if (!existingDocument) {
|
||||||
|
documentPlans.push({
|
||||||
|
source: document,
|
||||||
|
action: "insert",
|
||||||
|
targetCreatedByAgentId,
|
||||||
|
targetUpdatedByAgentId,
|
||||||
|
latestRevisionId,
|
||||||
|
latestRevisionNumber,
|
||||||
|
revisionsToInsert,
|
||||||
|
adjustments,
|
||||||
|
});
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const documentAlreadyMatches =
|
||||||
|
existingDocument.key === document.key
|
||||||
|
&& existingDocument.title === document.title
|
||||||
|
&& existingDocument.format === document.format
|
||||||
|
&& existingDocument.latestBody === document.latestBody
|
||||||
|
&& (existingDocument.latestRevisionId ?? null) === latestRevisionId
|
||||||
|
&& existingDocument.latestRevisionNumber === latestRevisionNumber
|
||||||
|
&& (existingDocument.updatedByAgentId ?? null) === targetUpdatedByAgentId
|
||||||
|
&& (existingDocument.updatedByUserId ?? null) === (document.updatedByUserId ?? null)
|
||||||
|
&& sameDate(existingDocument.documentUpdatedAt, document.documentUpdatedAt)
|
||||||
|
&& sameDate(existingDocument.linkUpdatedAt, document.linkUpdatedAt)
|
||||||
|
&& revisionsToInsert.length === 0;
|
||||||
|
|
||||||
|
if (documentAlreadyMatches) {
|
||||||
|
documentPlans.push({ source: document, action: "skip_existing" });
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
documentPlans.push({
|
||||||
|
source: document,
|
||||||
|
action: "merge_existing",
|
||||||
|
targetCreatedByAgentId,
|
||||||
|
targetUpdatedByAgentId,
|
||||||
|
latestRevisionId,
|
||||||
|
latestRevisionNumber,
|
||||||
|
revisionsToInsert,
|
||||||
|
adjustments,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
const sourceAttachments = input.sourceAttachments ?? [];
|
||||||
|
const targetAttachmentIds = new Set((input.targetAttachments ?? []).map((attachment) => attachment.id));
|
||||||
|
const attachmentPlans: Array<PlannedAttachmentInsert | PlannedAttachmentSkip> = [];
|
||||||
|
for (const attachment of sortAttachments(sourceAttachments)) {
|
||||||
|
if (targetAttachmentIds.has(attachment.id)) {
|
||||||
|
attachmentPlans.push({ source: attachment, action: "skip_existing" });
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (!issueIdsAvailableAfterImport.has(attachment.issueId)) {
|
||||||
|
attachmentPlans.push({ source: attachment, action: "skip_missing_parent" });
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const adjustments: ImportAdjustment[] = [];
|
||||||
|
const targetCreatedByAgentId =
|
||||||
|
attachment.createdByAgentId && targetAgentIds.has(attachment.createdByAgentId)
|
||||||
|
? attachment.createdByAgentId
|
||||||
|
: null;
|
||||||
|
if (attachment.createdByAgentId && !targetCreatedByAgentId) {
|
||||||
|
adjustments.push("clear_attachment_agent");
|
||||||
|
incrementAdjustment(adjustmentCounts, "clear_attachment_agent");
|
||||||
|
}
|
||||||
|
|
||||||
|
attachmentPlans.push({
|
||||||
|
source: attachment,
|
||||||
|
action: "insert",
|
||||||
|
targetIssueCommentId:
|
||||||
|
attachment.issueCommentId && commentIdsAvailableAfterImport.has(attachment.issueCommentId)
|
||||||
|
? attachment.issueCommentId
|
||||||
|
: null,
|
||||||
|
targetCreatedByAgentId,
|
||||||
|
adjustments,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
const counts = {
|
||||||
|
projectsToImport: projectImports.length,
|
||||||
|
issuesToInsert: issuePlans.filter((plan) => plan.action === "insert").length,
|
||||||
|
issuesExisting: issuePlans.filter((plan) => plan.action === "skip_existing").length,
|
||||||
|
issueDrift: issuePlans.filter((plan) => plan.action === "skip_existing" && plan.driftKeys.length > 0).length,
|
||||||
|
commentsToInsert: commentPlans.filter((plan) => plan.action === "insert").length,
|
||||||
|
commentsExisting: commentPlans.filter((plan) => plan.action === "skip_existing").length,
|
||||||
|
commentsMissingParent: commentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
|
||||||
|
documentsToInsert: documentPlans.filter((plan) => plan.action === "insert").length,
|
||||||
|
documentsToMerge: documentPlans.filter((plan) => plan.action === "merge_existing").length,
|
||||||
|
documentsExisting: documentPlans.filter((plan) => plan.action === "skip_existing").length,
|
||||||
|
documentsConflictingKey: documentPlans.filter((plan) => plan.action === "skip_conflicting_key").length,
|
||||||
|
documentsMissingParent: documentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
|
||||||
|
documentRevisionsToInsert: documentPlans.reduce(
|
||||||
|
(sum, plan) =>
|
||||||
|
sum + (plan.action === "insert" || plan.action === "merge_existing" ? plan.revisionsToInsert.length : 0),
|
||||||
|
0,
|
||||||
|
),
|
||||||
|
attachmentsToInsert: attachmentPlans.filter((plan) => plan.action === "insert").length,
|
||||||
|
attachmentsExisting: attachmentPlans.filter((plan) => plan.action === "skip_existing").length,
|
||||||
|
attachmentsMissingParent: attachmentPlans.filter((plan) => plan.action === "skip_missing_parent").length,
|
||||||
|
};
|
||||||
|
|
||||||
|
return {
|
||||||
|
companyId: input.companyId,
|
||||||
|
companyName: input.companyName,
|
||||||
|
issuePrefix: input.issuePrefix,
|
||||||
|
previewIssueCounterStart: input.previewIssueCounterStart,
|
||||||
|
scopes: input.scopes,
|
||||||
|
projectImports,
|
||||||
|
issuePlans,
|
||||||
|
commentPlans,
|
||||||
|
documentPlans,
|
||||||
|
attachmentPlans,
|
||||||
|
counts,
|
||||||
|
adjustments: adjustmentCounts,
|
||||||
|
};
|
||||||
|
}
|
||||||
cli/src/commands/worktree.ts (new file, 2585 lines; diff suppressed because it is too large)

@@ -22,20 +22,35 @@ function parseEnvFile(contents: string) {
   }
 }
 
+function formatEnvValue(value: string): string {
+  if (/^[A-Za-z0-9_./:@-]+$/.test(value)) {
+    return value;
+  }
+  return JSON.stringify(value);
+}
+
 function renderEnvFile(entries: Record<string, string>) {
   const lines = [
     "# Paperclip environment variables",
-    "# Generated by `paperclipai onboard`",
-    ...Object.entries(entries).map(([key, value]) => `${key}=${value}`),
+    "# Generated by Paperclip CLI commands",
+    ...Object.entries(entries).map(([key, value]) => `${key}=${formatEnvValue(value)}`),
     "",
   ];
   return lines.join("\n");
 }
 
+export function resolvePaperclipEnvFile(configPath?: string): string {
+  return resolveEnvFilePath(configPath);
+}
+
 export function resolveAgentJwtEnvFile(configPath?: string): string {
   return resolveEnvFilePath(configPath);
 }
 
+export function loadPaperclipEnvFile(configPath?: string): void {
+  loadAgentJwtEnvFile(resolveEnvFilePath(configPath));
+}
+
 export function loadAgentJwtEnvFile(filePath = resolveEnvFilePath()): void {
   if (loadedEnvFiles.has(filePath)) return;
 
@@ -78,13 +93,33 @@ export function ensureAgentJwtSecret(configPath?: string): { secret: string; cre
 }
 
 export function writeAgentJwtEnv(secret: string, filePath = resolveEnvFilePath()): void {
+  mergePaperclipEnvEntries({ [JWT_SECRET_ENV_KEY]: secret }, filePath);
+}
+
+export function readPaperclipEnvEntries(filePath = resolveEnvFilePath()): Record<string, string> {
+  if (!fs.existsSync(filePath)) return {};
+  return parseEnvFile(fs.readFileSync(filePath, "utf-8"));
+}
+
+export function writePaperclipEnvEntries(entries: Record<string, string>, filePath = resolveEnvFilePath()): void {
   const dir = path.dirname(filePath);
   fs.mkdirSync(dir, { recursive: true });
-  const current = fs.existsSync(filePath) ? parseEnvFile(fs.readFileSync(filePath, "utf-8")) : {};
-  current[JWT_SECRET_ENV_KEY] = secret;
-
-  fs.writeFileSync(filePath, renderEnvFile(current), {
+  fs.writeFileSync(filePath, renderEnvFile(entries), {
     mode: 0o600,
   });
 }
+
+export function mergePaperclipEnvEntries(
+  entries: Record<string, string>,
+  filePath = resolveEnvFilePath(),
+): Record<string, string> {
+  const current = readPaperclipEnvEntries(filePath);
+  const next = {
+    ...current,
+    ...Object.fromEntries(
+      Object.entries(entries).filter(([, value]) => typeof value === "string" && value.trim().length > 0),
+    ),
+  };
+  writePaperclipEnvEntries(next, filePath);
+  return next;
+}

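For reference, the quoting rule introduced by `formatEnvValue` behaves like this (a standalone TypeScript sketch; the sample values are made up):

```ts
// Values made only of plain token characters pass through unquoted;
// anything else (spaces, quotes, '=') is JSON-stringified.
function formatEnvValue(value: string): string {
  if (/^[A-Za-z0-9_./:@-]+$/.test(value)) {
    return value;
  }
  return JSON.stringify(value);
}

formatEnvValue("postgres://localhost:5432/paperclip"); // => postgres://localhost:5432/paperclip
formatEnvValue("hello world");                         // => "hello world" (JSON-quoted)
```
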
@@ -33,6 +33,10 @@ export function resolveDefaultContextPath(): string {
   return path.resolve(resolvePaperclipHomeDir(), "context.json");
 }
 
+export function resolveDefaultCliAuthPath(): string {
+  return path.resolve(resolvePaperclipHomeDir(), "auth.json");
+}
+
 export function resolveDefaultEmbeddedPostgresDir(instanceId?: string): string {
   return path.resolve(resolvePaperclipInstanceRoot(instanceId), "db");
 }
@@ -49,6 +53,10 @@ export function resolveDefaultStorageDir(instanceId?: string): string {
   return path.resolve(resolvePaperclipInstanceRoot(instanceId), "data", "storage");
 }
 
+export function resolveDefaultBackupDir(instanceId?: string): string {
+  return path.resolve(resolvePaperclipInstanceRoot(instanceId), "data", "backups");
+}
+
 export function expandHomePrefix(value: string): string {
   if (value === "~") return os.homedir();
   if (value.startsWith("~/")) return path.resolve(os.homedir(), value.slice(2));
@@ -64,6 +72,7 @@ export function describeLocalInstancePaths(instanceId?: string) {
     instanceRoot,
     configPath: resolveDefaultConfigPath(resolvedInstanceId),
     embeddedPostgresDataDir: resolveDefaultEmbeddedPostgresDir(resolvedInstanceId),
+    backupDir: resolveDefaultBackupDir(resolvedInstanceId),
     logDir: resolveDefaultLogsDir(resolvedInstanceId),
     secretsKeyFilePath: resolveDefaultSecretsKeyFilePath(resolvedInstanceId),
     storageDir: resolveDefaultStorageDir(resolvedInstanceId),

@@ -2,6 +2,7 @@ export {
   paperclipConfigSchema,
   configMetaSchema,
   llmConfigSchema,
+  databaseBackupConfigSchema,
   databaseConfigSchema,
   loggingConfigSchema,
   serverConfigSchema,
@@ -13,6 +14,7 @@ export {
   secretsLocalEncryptedConfigSchema,
   type PaperclipConfig,
   type LlmConfig,
+  type DatabaseBackupConfig,
   type DatabaseConfig,
   type LoggingConfig,
   type ServerConfig,

@@ -7,6 +7,7 @@ import { addAllowedHostname } from "./commands/allowed-hostname.js";
 import { heartbeatRun } from "./commands/heartbeat-run.js";
 import { runCommand } from "./commands/run.js";
 import { bootstrapCeoInvite } from "./commands/auth-bootstrap-ceo.js";
+import { dbBackupCommand } from "./commands/db-backup.js";
 import { registerContextCommands } from "./commands/client/context.js";
 import { registerCompanyCommands } from "./commands/client/company.js";
 import { registerIssueCommands } from "./commands/client/issue.js";
@@ -15,6 +16,10 @@ import { registerApprovalCommands } from "./commands/client/approval.js";
 import { registerActivityCommands } from "./commands/client/activity.js";
 import { registerDashboardCommands } from "./commands/client/dashboard.js";
 import { applyDataDirOverride, type DataDirOptionLike } from "./config/data-dir.js";
+import { loadPaperclipEnvFile } from "./config/env.js";
+import { registerWorktreeCommands } from "./commands/worktree.js";
+import { registerPluginCommands } from "./commands/client/plugin.js";
+import { registerClientAuthCommands } from "./commands/client/auth.js";
 
 const program = new Command();
 const DATA_DIR_OPTION_HELP =
@@ -23,7 +28,7 @@ const DATA_DIR_OPTION_HELP =
 program
   .name("paperclipai")
   .description("Paperclip CLI — setup, diagnose, and configure your instance")
-  .version("0.2.6");
+  .version("0.2.7");
 
 program.hook("preAction", (_thisCommand, actionCommand) => {
   const options = actionCommand.optsWithGlobals() as DataDirOptionLike;
@@ -32,6 +37,7 @@ program.hook("preAction", (_thisCommand, actionCommand) => {
     hasConfigOption: optionNames.has("config"),
     hasContextOption: optionNames.has("context"),
   });
+  loadPaperclipEnvFile(options.config);
 });
 
 program
@@ -70,6 +76,19 @@ program
   .option("-s, --section <section>", "Section to configure (llm, database, logging, server, storage, secrets)")
   .action(configure);
 
+program
+  .command("db:backup")
+  .description("Create a one-off database backup using current config")
+  .option("-c, --config <path>", "Path to config file")
+  .option("-d, --data-dir <path>", DATA_DIR_OPTION_HELP)
+  .option("--dir <path>", "Backup output directory (overrides config)")
+  .option("--retention-days <days>", "Retention window used for pruning", (value) => Number(value))
+  .option("--filename-prefix <prefix>", "Backup filename prefix", "paperclip")
+  .option("--json", "Print backup metadata as JSON")
+  .action(async (opts) => {
+    await dbBackupCommand(opts);
+  });
+
 program
   .command("allowed-hostname")
   .description("Allow a hostname for authenticated/private mode access")
@@ -118,6 +137,8 @@ registerAgentCommands(program);
 registerApprovalCommands(program);
 registerActivityCommands(program);
 registerDashboardCommands(program);
+registerWorktreeCommands(program);
+registerPluginCommands(program);
 
 const auth = program.command("auth").description("Authentication and bootstrap utilities");
 
@@ -131,6 +152,8 @@ auth
   .option("--base-url <url>", "Public base URL used to print invite link")
   .action(bootstrapCeoInvite);
 
+registerClientAuthCommands(auth);
+
 program.parseAsync().catch((err) => {
   console.error(err instanceof Error ? err.message : String(err));
   process.exit(1);

@@ -1,9 +1,26 @@
 import * as p from "@clack/prompts";
 import type { DatabaseConfig } from "../config/schema.js";
-import { resolveDefaultEmbeddedPostgresDir, resolvePaperclipInstanceId } from "../config/home.js";
+import {
+  resolveDefaultBackupDir,
+  resolveDefaultEmbeddedPostgresDir,
+  resolvePaperclipInstanceId,
+} from "../config/home.js";
 
-export async function promptDatabase(): Promise<DatabaseConfig> {
-  const defaultEmbeddedDir = resolveDefaultEmbeddedPostgresDir(resolvePaperclipInstanceId());
+export async function promptDatabase(current?: DatabaseConfig): Promise<DatabaseConfig> {
+  const instanceId = resolvePaperclipInstanceId();
+  const defaultEmbeddedDir = resolveDefaultEmbeddedPostgresDir(instanceId);
+  const defaultBackupDir = resolveDefaultBackupDir(instanceId);
+  const base: DatabaseConfig = current ?? {
+    mode: "embedded-postgres",
+    embeddedPostgresDataDir: defaultEmbeddedDir,
+    embeddedPostgresPort: 54329,
+    backup: {
+      enabled: true,
+      intervalMinutes: 60,
+      retentionDays: 30,
+      dir: defaultBackupDir,
+    },
+  };
 
   const mode = await p.select({
     message: "Database mode",
@@ -11,6 +28,7 @@ export async function promptDatabase(): Promise<DatabaseConfig> {
       { value: "embedded-postgres" as const, label: "Embedded PostgreSQL (managed locally)", hint: "recommended" },
       { value: "postgres" as const, label: "PostgreSQL (external server)" },
     ],
+    initialValue: base.mode,
   });
 
   if (p.isCancel(mode)) {
@@ -18,9 +36,14 @@ export async function promptDatabase(): Promise<DatabaseConfig> {
     process.exit(0);
   }
 
+  let connectionString: string | undefined = base.connectionString;
+  let embeddedPostgresDataDir = base.embeddedPostgresDataDir || defaultEmbeddedDir;
+  let embeddedPostgresPort = base.embeddedPostgresPort || 54329;
+
   if (mode === "postgres") {
-    const connectionString = await p.text({
+    const value = await p.text({
       message: "PostgreSQL connection string",
+      defaultValue: base.connectionString ?? "",
       placeholder: "postgres://user:pass@localhost:5432/paperclip",
       validate: (val) => {
         if (!val) return "Connection string is required for PostgreSQL mode";
@@ -28,33 +51,29 @@ export async function promptDatabase(): Promise<DatabaseConfig> {
       },
     });
 
-    if (p.isCancel(connectionString)) {
+    if (p.isCancel(value)) {
       p.cancel("Setup cancelled.");
       process.exit(0);
     }
 
-    return {
-      mode: "postgres",
-      connectionString,
-      embeddedPostgresDataDir: defaultEmbeddedDir,
-      embeddedPostgresPort: 54329,
-    };
-  }
-
-  const embeddedPostgresDataDir = await p.text({
+    connectionString = value;
+  } else {
+    const dataDir = await p.text({
       message: "Embedded PostgreSQL data directory",
-      defaultValue: defaultEmbeddedDir,
+      defaultValue: base.embeddedPostgresDataDir || defaultEmbeddedDir,
       placeholder: defaultEmbeddedDir,
     });
 
-    if (p.isCancel(embeddedPostgresDataDir)) {
+    if (p.isCancel(dataDir)) {
       p.cancel("Setup cancelled.");
       process.exit(0);
     }
 
-    const embeddedPostgresPort = await p.text({
+    embeddedPostgresDataDir = dataDir || defaultEmbeddedDir;
+
+    const portValue = await p.text({
       message: "Embedded PostgreSQL port",
-      defaultValue: "54329",
+      defaultValue: String(base.embeddedPostgresPort || 54329),
       placeholder: "54329",
       validate: (val) => {
         const n = Number(val);
@@ -62,14 +81,77 @@ export async function promptDatabase(): Promise<DatabaseConfig> {
       },
     });
 
-    if (p.isCancel(embeddedPostgresPort)) {
+    if (p.isCancel(portValue)) {
+      p.cancel("Setup cancelled.");
+      process.exit(0);
+    }
+
+    embeddedPostgresPort = Number(portValue || "54329");
+    connectionString = undefined;
+  }
+
+  const backupEnabled = await p.confirm({
+    message: "Enable automatic database backups?",
+    initialValue: base.backup.enabled,
+  });
+  if (p.isCancel(backupEnabled)) {
+    p.cancel("Setup cancelled.");
+    process.exit(0);
+  }
+
+  const backupDirInput = await p.text({
+    message: "Backup directory",
+    defaultValue: base.backup.dir || defaultBackupDir,
+    placeholder: defaultBackupDir,
+    validate: (val) => (!val || val.trim().length === 0 ? "Backup directory is required" : undefined),
+  });
+  if (p.isCancel(backupDirInput)) {
+    p.cancel("Setup cancelled.");
+    process.exit(0);
+  }
+
+  const backupIntervalInput = await p.text({
+    message: "Backup interval (minutes)",
+    defaultValue: String(base.backup.intervalMinutes || 60),
+    placeholder: "60",
+    validate: (val) => {
+      const n = Number(val);
+      if (!Number.isInteger(n) || n < 1) return "Interval must be a positive integer";
+      if (n > 10080) return "Interval must be 10080 minutes (7 days) or less";
+      return undefined;
+    },
+  });
+  if (p.isCancel(backupIntervalInput)) {
+    p.cancel("Setup cancelled.");
+    process.exit(0);
+  }
+
+  const backupRetentionInput = await p.text({
+    message: "Backup retention (days)",
+    defaultValue: String(base.backup.retentionDays || 30),
+    placeholder: "30",
+    validate: (val) => {
+      const n = Number(val);
+      if (!Number.isInteger(n) || n < 1) return "Retention must be a positive integer";
+      if (n > 3650) return "Retention must be 3650 days or less";
+      return undefined;
+    },
+  });
+  if (p.isCancel(backupRetentionInput)) {
     p.cancel("Setup cancelled.");
     process.exit(0);
   }
 
   return {
-    mode: "embedded-postgres",
-    embeddedPostgresDataDir: embeddedPostgresDataDir || defaultEmbeddedDir,
-    embeddedPostgresPort: Number(embeddedPostgresPort || "54329"),
+    mode,
+    connectionString,
+    embeddedPostgresDataDir,
+    embeddedPostgresPort,
+    backup: {
+      enabled: backupEnabled,
+      intervalMinutes: Number(backupIntervalInput || "60"),
+      retentionDays: Number(backupRetentionInput || "30"),
+      dir: backupDirInput || defaultBackupDir,
+    },
   };
 }

@@ -113,7 +113,7 @@ export async function promptServer(opts?: {
   }
 
   const port = Number(portStr) || 3100;
-  let auth: AuthConfig = { baseUrlMode: "auto" };
+  let auth: AuthConfig = { baseUrlMode: "auto", disableSignUp: false };
   if (deploymentMode === "authenticated" && exposure === "public") {
     const urlInput = await p.text({
       message: "Public base URL",
@@ -139,18 +139,26 @@ export async function promptServer(opts?: {
     }
     auth = {
       baseUrlMode: "explicit",
+      disableSignUp: false,
       publicBaseUrl: urlInput.trim().replace(/\/+$/, ""),
     };
   } else if (currentAuth?.baseUrlMode === "explicit" && currentAuth.publicBaseUrl) {
     auth = {
       baseUrlMode: "explicit",
+      disableSignUp: false,
       publicBaseUrl: currentAuth.publicBaseUrl,
     };
   }
 
   return {
-    server: { deploymentMode, exposure, host: hostStr.trim(), port, allowedHostnames, serveUi: true },
+    server: {
+      deploymentMode,
+      exposure,
+      host: hostStr.trim(),
+      port,
+      allowedHostnames,
+      serveUi: currentServer?.serveUi ?? true,
+    },
     auth,
   };
 }

@@ -1,5 +1,5 @@
 {
-  "extends": "../tsconfig.json",
+  "extends": "../tsconfig.base.json",
   "compilerOptions": {
     "outDir": "dist",
     "rootDir": "src"

doc/AGENTCOMPANIES_SPEC_INVENTORY.md (new file, 115 lines)

@@ -0,0 +1,115 @@
# Agent Companies Spec Inventory

This document indexes every part of the Paperclip codebase that touches the [Agent Companies Specification](docs/companies/companies-spec.md) (`agentcompanies/v1-draft`).

Use it when you need to:

1. **Update the spec** — know which implementation code must change in lockstep.
2. **Change code that involves the spec** — find all related files quickly.
3. **Keep things aligned** — audit whether implementation matches the spec.

---

## 1. Specification & Design Documents

| File | Role |
|---|---|
| `docs/companies/companies-spec.md` | **Normative spec** — defines the markdown-first package format (COMPANY.md, TEAM.md, AGENTS.md, PROJECT.md, TASK.md, SKILL.md), reserved files, frontmatter schemas, and vendor extension conventions (`.paperclip.yaml`). |
| `doc/plans/2026-03-13-company-import-export-v2.md` | Implementation plan for the markdown-first package model cutover — phases, API changes, UI plan, and rollout strategy. |
| `doc/SPEC-implementation.md` | V1 implementation contract; references the portability system and `.paperclip.yaml` sidecar format. |
| `docs/specs/cliphub-plan.md` | Earlier blueprint bundle plan; partially superseded by the markdown-first spec (noted in the v2 plan). |
| `doc/plans/2026-02-16-module-system.md` | Module system plan; JSON-only company template sections superseded by the markdown-first model. |
| `doc/plans/2026-03-14-skills-ui-product-plan.md` | Skills UI plan; references portable skill files and `.paperclip.yaml`. |
| `doc/plans/2026-03-14-adapter-skill-sync-rollout.md` | Adapter skill sync rollout; companion to the v2 import/export plan. |

## 2. Shared Types & Validators

These define the contract between server, CLI, and UI.

| File | What it defines |
|---|---|
| `packages/shared/src/types/company-portability.ts` | TypeScript interfaces: `CompanyPortabilityManifest`, `CompanyPortabilityFileEntry`, `CompanyPortabilityEnvInput`, export/import/preview request and result types, manifest entry types for agents, skills, projects, issues, recurring routines, companies. |
| `packages/shared/src/validators/company-portability.ts` | Zod schemas for all portability request/response shapes — used by both server routes and CLI. |
| `packages/shared/src/types/index.ts` | Re-exports portability types. |
| `packages/shared/src/validators/index.ts` | Re-exports portability validators. |

## 3. Server — Services

| File | Responsibility |
|---|---|
| `server/src/services/company-portability.ts` | **Core portability service.** Export (manifest generation, markdown file emission, `.paperclip.yaml` sidecars), import (graph resolution, collision handling, entity creation), preview (planned-action summary). Handles skill key derivation, recurring task <-> routine mapping, legacy recurrence migration, and package README generation. References `agentcompanies/v1` version string. |
| `server/src/services/routines.ts` | Paperclip routine runtime service. Portability now exports routines as recurring `TASK.md` entries and imports recurring tasks back through this service. |
| `server/src/services/company-export-readme.ts` | Generates `README.md` and Mermaid org-chart for exported company packages. |
| `server/src/services/index.ts` | Re-exports `companyPortabilityService`. |

## 4. Server — Routes

| File | Endpoints |
|---|---|
| `server/src/routes/companies.ts` | `POST /api/companies/:companyId/export` — legacy export bundle<br>`POST /api/companies/:companyId/exports/preview` — export preview<br>`POST /api/companies/:companyId/exports` — export package<br>`POST /api/companies/import/preview` — import preview<br>`POST /api/companies/import` — perform import |

Route registration lives in `server/src/app.ts` via `companyRoutes(db, storage)`.

## 5. Server — Tests

| File | Coverage |
|---|---|
| `server/src/__tests__/company-portability.test.ts` | Unit tests for the portability service (export, import, preview, manifest shape, `agentcompanies/v1` version). |
| `server/src/__tests__/company-portability-routes.test.ts` | Integration tests for the portability HTTP endpoints. |

## 6. CLI

| File | Commands |
|---|---|
| `cli/src/commands/client/company.ts` | `company export` — exports a company package to disk (flags: `--out`, `--include`, `--projects`, `--issues`, `--projectIssues`).<br>`company import <fromPathOrUrl>` — imports a company package from a file or folder (flags: positional source path/URL or GitHub shorthand, `--include`, `--target`, `--companyId`, `--newCompanyName`, `--agents`, `--collision`, `--ref`, `--dryRun`).<br>Reads/writes portable file entries and handles `.paperclip.yaml` filtering. |

## 7. UI — Pages

| File | Role |
|---|---|
| `ui/src/pages/CompanyExport.tsx` | Export UI: preview, manifest display, file tree visualization, ZIP archive creation and download. Filters `.paperclip.yaml` based on selection. Shows manifest and README in editor. |
| `ui/src/pages/CompanyImport.tsx` | Import UI: source input (upload/folder/GitHub URL/generic URL), ZIP reading, preview pane with dependency tree, entity selection checkboxes, trust/licensing warnings, secrets requirements, collision strategy, adapter config. |

## 8. UI — Components

| File | Role |
|---|---|
| `ui/src/components/PackageFileTree.tsx` | Reusable file tree component for both import and export. Builds tree from `CompanyPortabilityFileEntry` items, parses frontmatter, shows action indicators (create/update/skip), and maps frontmatter field labels. |

## 9. UI — Libraries

| File | Role |
|---|---|
| `ui/src/lib/portable-files.ts` | Helpers for portable file entries: `getPortableFileText`, `getPortableFileDataUrl`, `getPortableFileContentType`, `isPortableImageFile`. |
| `ui/src/lib/zip.ts` | ZIP archive creation (`createZipArchive`) and reading (`readZipArchive`) — implements ZIP format from scratch for company packages. CRC32, DOS date/time encoding. |
| `ui/src/lib/zip.test.ts` | Tests for ZIP utilities; exercises round-trip with portability file entries and `.paperclip.yaml` content. |

## 10. UI — API Client

| File | Functions |
|---|---|
| `ui/src/api/companies.ts` | `companiesApi.exportBundle`, `companiesApi.exportPreview`, `companiesApi.exportPackage`, `companiesApi.importPreview`, `companiesApi.importBundle` — typed fetch wrappers for the portability endpoints. |

## 11. Skills & Agent Instructions

| File | Relevance |
|---|---|
| `skills/paperclip/references/company-skills.md` | Reference doc for company skill library workflow — install, inspect, update, assign. Skill packages are a subset of the agent companies spec. |
| `server/src/services/company-skills.ts` | Company skill management service — handles SKILL.md-based imports and company-level skill library. |
| `server/src/services/agent-instructions.ts` | Agent instructions service — resolves AGENTS.md paths for agent instruction loading. |

## 12. Quick Cross-Reference by Spec Concept

| Spec concept | Primary implementation files |
|---|---|
| `COMPANY.md` frontmatter & body | `company-portability.ts` (export emitter + import parser) |
| `AGENTS.md` frontmatter & body | `company-portability.ts`, `agent-instructions.ts` |
| `PROJECT.md` frontmatter & body | `company-portability.ts` |
| `TASK.md` frontmatter & body | `company-portability.ts` |
| `SKILL.md` packages | `company-portability.ts`, `company-skills.ts` |
| `.paperclip.yaml` vendor sidecar | `company-portability.ts`, `routines.ts`, `CompanyExport.tsx`, `company.ts` (CLI) |
| `manifest.json` | `company-portability.ts` (generation), shared types (schema) |
| ZIP package format | `zip.ts` (UI), `company.ts` (CLI file I/O) |
| Collision resolution | `company-portability.ts` (server), `CompanyImport.tsx` (UI) |
| Env/secrets declarations | shared types (`CompanyPortabilityEnvInput`), `CompanyImport.tsx` (UI) |
| README + org chart | `company-export-readme.ts` |

doc/CLI.md (14 lines)

@@ -116,6 +116,20 @@ pnpm paperclipai issue release <issue-id>

```sh
pnpm paperclipai agent list --company-id <company-id>
pnpm paperclipai agent get <agent-id>
pnpm paperclipai agent local-cli <agent-id-or-shortname> --company-id <company-id>
```

`agent local-cli` is the quickest way to run local Claude/Codex manually as a Paperclip agent:

- creates a new long-lived agent API key
- installs missing Paperclip skills into `~/.codex/skills` and `~/.claude/skills`
- prints `export ...` lines for `PAPERCLIP_API_URL`, `PAPERCLIP_COMPANY_ID`, `PAPERCLIP_AGENT_ID`, and `PAPERCLIP_API_KEY`

Example for shortname-based local setup:

```sh
pnpm paperclipai agent local-cli codexcoder --company-id <company-id>
pnpm paperclipai agent local-cli claudecoder --company-id <company-id>
```

## Approval Commands

@@ -19,6 +19,14 @@ That's it. On first start the server:

Data persists across restarts in `~/.paperclip/instances/default/db/`. To reset local dev data, delete that directory.

If you need to apply pending migrations manually, run:

```sh
pnpm db:migrate
```

When `DATABASE_URL` is unset, this command targets the current embedded PostgreSQL instance for your active Paperclip config/instance.

This mode is ideal for local development and one-command installs.

Docker note: the Docker quickstart image also uses embedded PostgreSQL by default. Persist `/paperclip` to keep DB state across container restarts (see `doc/DOCKER.md`).

@@ -15,6 +15,14 @@ Current implementation status:

- Node.js 20+
- pnpm 9+

## Dependency Lockfile Policy

GitHub Actions owns `pnpm-lock.yaml`.

- Do not commit `pnpm-lock.yaml` in pull requests.
- Pull request CI validates dependency resolution when manifests change.
- Pushes to `master` regenerate `pnpm-lock.yaml` with `pnpm install --lockfile-only --no-frozen-lockfile`, commit it back if needed, and then run verification with `--frozen-lockfile`.

## Start Dev

From repo root:

@@ -29,6 +37,10 @@ This starts:

- API server: `http://localhost:3100`
- UI: served by the API server in dev middleware mode (same origin as API)

`pnpm dev` runs the server in watch mode and restarts on changes from workspace packages (including adapter packages). Use `pnpm dev:once` to run without file watching.

`pnpm dev:once` now tracks backend-relevant file changes and pending migrations. When the current boot is stale, the board UI shows a `Restart required` banner. You can also enable guarded auto-restart in `Instance Settings > Experimental`, which waits for queued/running local agent runs to finish before restarting the dev server.

Tailscale/private-auth dev mode:

```sh

@@ -79,6 +91,10 @@ docker compose -f docker-compose.quickstart.yml up --build

See `doc/DOCKER.md` for API key wiring (`OPENAI_API_KEY` / `ANTHROPIC_API_KEY`) and persistence details.

## Docker For Untrusted PR Review

For a separate review-oriented container that keeps `codex`/`claude` login state in Docker volumes and checks out PRs into an isolated scratch workspace, see `doc/UNTRUSTED-PR-REVIEW.md`.

## Database in Dev (Auto-Handled)

For local development, leave `DATABASE_URL` unset.

@@ -114,6 +130,123 @@ When a local agent run has no resolved project/session workspace, Paperclip fall

This path honors `PAPERCLIP_HOME` and `PAPERCLIP_INSTANCE_ID` in non-default setups.

For `codex_local`, Paperclip also manages a per-company Codex home under the instance root and seeds it from the shared Codex login/config home (`$CODEX_HOME` or `~/.codex`):

- `~/.paperclip/instances/default/companies/<company-id>/codex-home`

## Worktree-local Instances

When developing from multiple git worktrees, do not point two Paperclip servers at the same embedded PostgreSQL data directory.

Instead, create a repo-local Paperclip config plus an isolated instance for the worktree:

```sh
paperclipai worktree init
# or create the git worktree and initialize it in one step:
pnpm paperclipai worktree:make paperclip-pr-432
```

This command:

- writes repo-local files at `.paperclip/config.json` and `.paperclip/.env`
- creates an isolated instance under `~/.paperclip-worktrees/instances/<worktree-id>/`
- when run inside a linked git worktree, mirrors the effective git hooks into that worktree's private git dir
- picks a free app port and embedded PostgreSQL port
- by default seeds the isolated DB in `minimal` mode from the current effective Paperclip instance/config (repo-local worktree config when present, otherwise the default instance) via a logical SQL snapshot

Seed modes:

- `minimal` keeps core app state like companies, projects, issues, comments, approvals, and auth state, preserves schema for all tables, but omits row data from heavy operational history such as heartbeat runs, wake requests, activity logs, runtime services, and agent session state
- `full` makes a full logical clone of the source instance
- `--no-seed` creates an empty isolated instance

After `worktree init`, both the server and the CLI auto-load the repo-local `.paperclip/.env` when run inside that worktree, so normal commands like `pnpm dev`, `paperclipai doctor`, and `paperclipai db:backup` stay scoped to the worktree instance.

That repo-local env also sets:

- `PAPERCLIP_IN_WORKTREE=true`
- `PAPERCLIP_WORKTREE_NAME=<worktree-name>`
- `PAPERCLIP_WORKTREE_COLOR=<hex-color>`

The server/UI use those values for worktree-specific branding such as the top banner and dynamically colored favicon.
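For example, a minimal TypeScript sketch of how code might read those variables (only the variable names above come from the docs; everything else is illustrative):

```ts
// Minimal sketch: derive worktree branding from the repo-local .env values.
const inWorktree = process.env.PAPERCLIP_IN_WORKTREE === "true";
const worktreeName = process.env.PAPERCLIP_WORKTREE_NAME ?? "worktree";
const worktreeColor = process.env.PAPERCLIP_WORKTREE_COLOR ?? "#888888";

// A banner label and favicon tint are only shown when running inside a worktree.
const banner = inWorktree ? `Worktree: ${worktreeName}` : null;
const faviconTint = inWorktree ? worktreeColor : undefined;
console.log({ banner, faviconTint });
```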

Print shell exports explicitly when needed:

```sh
paperclipai worktree env
# or:
eval "$(paperclipai worktree env)"
```

### Worktree CLI Reference

**`pnpm paperclipai worktree init [options]`** — Create repo-local config/env and an isolated instance for the current worktree.

| Option | Description |
|---|---|
| `--name <name>` | Display name used to derive the instance id |
| `--instance <id>` | Explicit isolated instance id |
| `--home <path>` | Home root for worktree instances (default: `~/.paperclip-worktrees`) |
| `--from-config <path>` | Source config.json to seed from |
| `--from-data-dir <path>` | Source PAPERCLIP_HOME used when deriving the source config |
| `--from-instance <id>` | Source instance id (default: `default`) |
| `--server-port <port>` | Preferred server port |
| `--db-port <port>` | Preferred embedded Postgres port |
| `--seed-mode <mode>` | Seed profile: `minimal` or `full` (default: `minimal`) |
| `--no-seed` | Skip database seeding from the source instance |
| `--force` | Replace existing repo-local config and isolated instance data |

Examples:

```sh
paperclipai worktree init --no-seed
paperclipai worktree init --seed-mode full
paperclipai worktree init --from-instance default
paperclipai worktree init --from-data-dir ~/.paperclip
paperclipai worktree init --force
```

**`pnpm paperclipai worktree:make <name> [options]`** — Create `~/NAME` as a git worktree, then initialize an isolated Paperclip instance inside it. This combines `git worktree add` with `worktree init` in a single step.

| Option | Description |
|---|---|
| `--start-point <ref>` | Remote ref to base the new branch on (e.g. `origin/main`) |
| `--instance <id>` | Explicit isolated instance id |
| `--home <path>` | Home root for worktree instances (default: `~/.paperclip-worktrees`) |
| `--from-config <path>` | Source config.json to seed from |
| `--from-data-dir <path>` | Source PAPERCLIP_HOME used when deriving the source config |
| `--from-instance <id>` | Source instance id (default: `default`) |
| `--server-port <port>` | Preferred server port |
| `--db-port <port>` | Preferred embedded Postgres port |
| `--seed-mode <mode>` | Seed profile: `minimal` or `full` (default: `minimal`) |
| `--no-seed` | Skip database seeding from the source instance |
| `--force` | Replace existing repo-local config and isolated instance data |

Examples:

```sh
pnpm paperclipai worktree:make paperclip-pr-432
pnpm paperclipai worktree:make my-feature --start-point origin/main
pnpm paperclipai worktree:make experiment --no-seed
```

**`pnpm paperclipai worktree env [options]`** — Print shell exports for the current worktree-local Paperclip instance.

| Option | Description |
|---|---|
| `-c, --config <path>` | Path to config file |
| `--json` | Print JSON instead of shell exports |

Examples:

```sh
pnpm paperclipai worktree env
pnpm paperclipai worktree env --json
eval "$(pnpm paperclipai worktree env)"
```

For project execution worktrees, Paperclip can also run a project-defined provision command after it creates or reuses an isolated git worktree. Configure this on the project's execution workspace policy (`workspaceStrategy.provisionCommand`). The command runs inside the derived worktree and receives `PAPERCLIP_WORKSPACE_*`, `PAPERCLIP_PROJECT_ID`, `PAPERCLIP_AGENT_ID`, and `PAPERCLIP_ISSUE_*` environment variables so each repo can bootstrap itself however it wants.
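As a rough, hypothetical illustration (the exact policy schema is not shown here), a provision command could point at a small TypeScript script that reads those variables:

```ts
// provision.ts: hypothetical script referenced by workspaceStrategy.provisionCommand.
// Only the PAPERCLIP_* variable names come from the documentation above.
import { execSync } from "node:child_process";

const projectId = process.env.PAPERCLIP_PROJECT_ID ?? "unknown-project";
const agentId = process.env.PAPERCLIP_AGENT_ID ?? "unknown-agent";

console.log(`Provisioning worktree for project ${projectId} (agent ${agentId})`);

// Typical bootstrap work: install dependencies inside the freshly created worktree.
execSync("pnpm install", { stdio: "inherit" });
```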

## Quick Health Checks

In another terminal:

@@ -141,6 +274,36 @@ pnpm dev

If you set `DATABASE_URL`, the server will use that instead of embedded PostgreSQL.

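A minimal sketch (connection string values are placeholders):

```sh
# Point the server at an existing Postgres instead of the embedded one.
DATABASE_URL=postgres://paperclip:secret@localhost:5432/paperclip pnpm dev
```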

## Automatic DB Backups

Paperclip can run automatic DB backups on a timer. Defaults:

- enabled
- every 60 minutes
- retain 30 days
- backup dir: `~/.paperclip/instances/default/data/backups`

Configure these in:

```sh
pnpm paperclipai configure --section database
```

Run a one-off backup manually:

```sh
pnpm paperclipai db:backup
# or:
pnpm db:backup
```

Environment overrides:

- `PAPERCLIP_DB_BACKUP_ENABLED=true|false`
- `PAPERCLIP_DB_BACKUP_INTERVAL_MINUTES=<minutes>`
- `PAPERCLIP_DB_BACKUP_RETENTION_DAYS=<days>`
- `PAPERCLIP_DB_BACKUP_DIR=/absolute/or/~/path`

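For example, to override the schedule and target directory for a single run (values are illustrative):

```sh
# Back up every 30 minutes, keep 7 days, store under a custom directory.
PAPERCLIP_DB_BACKUP_INTERVAL_MINUTES=30 \
PAPERCLIP_DB_BACKUP_RETENTION_DAYS=7 \
PAPERCLIP_DB_BACKUP_DIR=~/paperclip-backups \
pnpm dev
```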
## Secrets in Dev

Agent env vars now support secret references. By default, secret values are stored with local encryption and only secret refs are persisted in agent config.

@@ -216,5 +379,61 @@ Agent-oriented invite onboarding now exposes machine-readable API docs:

- `GET /api/invites/:token` returns invite summary plus onboarding and skills index links.
- `GET /api/invites/:token/onboarding` returns onboarding manifest details (registration endpoint, claim endpoint template, skill install hints).
- `GET /api/invites/:token/onboarding.txt` returns a plain-text onboarding doc intended for both human operators and agents (llm.txt-style handoff), including optional inviter message and suggested network host candidates.
- `GET /api/skills/index` lists available skill documents.
- `GET /api/skills/paperclip` returns the Paperclip heartbeat skill markdown.

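For example, against a server on the default local port (the token value is a placeholder):

```sh
# INVITE_TOKEN stands in for a real invite token.
INVITE_TOKEN="..."
curl -sS "http://127.0.0.1:3100/api/invites/$INVITE_TOKEN" | jq
curl -sS "http://127.0.0.1:3100/api/invites/$INVITE_TOKEN/onboarding.txt"
```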
## OpenClaw Join Smoke Test

Run the end-to-end OpenClaw join smoke harness:

```sh
pnpm smoke:openclaw-join
```

What it validates:

- invite creation for agent-only join
- agent join request using `adapterType=openclaw`
- board approval + one-time API key claim semantics
- callback delivery on wakeup to a dockerized OpenClaw-style webhook receiver

Required permissions:

- This script performs board-governed actions (create invite, approve join, wakeup another agent).
- In authenticated mode, run with board auth via `PAPERCLIP_AUTH_HEADER` or `PAPERCLIP_COOKIE`.

Optional auth flags (for authenticated mode):

- `PAPERCLIP_AUTH_HEADER` (for example `Bearer ...`)
- `PAPERCLIP_COOKIE` (session cookie header value)

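For example, an authenticated run might look like this (the header value is a placeholder):

```sh
PAPERCLIP_AUTH_HEADER="Bearer <board-api-token>" pnpm smoke:openclaw-join
```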
## OpenClaw Docker UI One-Command Script

To boot OpenClaw in Docker and print a host-browser dashboard URL in one command:

```sh
pnpm smoke:openclaw-docker-ui
```

This script lives at `scripts/smoke/openclaw-docker-ui.sh` and automates clone/build/config/start for Compose-based local OpenClaw UI testing.

Pairing behavior for this smoke script:

- default `OPENCLAW_DISABLE_DEVICE_AUTH=1` (no Control UI pairing prompt for local smoke; no extra pairing env vars required)
- set `OPENCLAW_DISABLE_DEVICE_AUTH=0` to require standard device pairing

Model behavior for this smoke script:

- defaults to OpenAI models (`openai/gpt-5.2` + OpenAI fallback) so it does not require Anthropic auth by default

State behavior for this smoke script:

- defaults to isolated config dir `~/.openclaw-paperclip-smoke`
- resets smoke agent state each run by default (`OPENCLAW_RESET_STATE=1`) to avoid stale provider/auth drift

Networking behavior for this smoke script:

- auto-detects and prints a Paperclip host URL reachable from inside OpenClaw Docker
- default container-side host alias is `host.docker.internal` (override with `PAPERCLIP_HOST_FROM_CONTAINER` / `PAPERCLIP_HOST_PORT`)
- if Paperclip rejects container hostnames in authenticated/private mode, allow `host.docker.internal` via `pnpm paperclipai allowed-hostname host.docker.internal` and restart Paperclip

@@ -42,6 +42,32 @@ Optional overrides:

PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=./data/pc docker compose -f docker-compose.quickstart.yml up --build
```

If you change host port or use a non-local domain, set `PAPERCLIP_PUBLIC_URL` to the external URL you will use in browser/auth flows.

## Authenticated Compose (Single Public URL)

For authenticated deployments, set one canonical public URL and let Paperclip derive auth/callback defaults:

```yaml
services:
  paperclip:
    environment:
      PAPERCLIP_DEPLOYMENT_MODE: authenticated
      PAPERCLIP_DEPLOYMENT_EXPOSURE: private
      PAPERCLIP_PUBLIC_URL: https://desk.koker.net
```

`PAPERCLIP_PUBLIC_URL` is used as the primary source for:

- auth public base URL
- Better Auth base URL defaults
- bootstrap invite URL defaults
- hostname allowlist defaults (hostname extracted from URL)

Granular overrides remain available if needed (`PAPERCLIP_AUTH_PUBLIC_BASE_URL`, `BETTER_AUTH_URL`, `BETTER_AUTH_TRUSTED_ORIGINS`, `PAPERCLIP_ALLOWED_HOSTNAMES`).

Set `PAPERCLIP_ALLOWED_HOSTNAMES` explicitly only when you need additional hostnames beyond the public URL host (for example Tailscale/LAN aliases or multiple private hostnames).

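Once the stack is up, a quick sanity check against the public URL (reusing the example domain above) is to hit the health endpoint:

```sh
curl -sS https://desk.koker.net/api/health | jq
```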
## Claude + Codex Local Adapters in Docker

The image pre-installs:
@@ -67,6 +93,12 @@ Notes:

- Without API keys, the app still runs normally.
- Adapter environment checks in Paperclip will surface missing auth/CLI prerequisites.

## Untrusted PR Review Container

If you want a separate Docker environment for reviewing untrusted pull requests with `codex` or `claude`, use the dedicated review workflow in `doc/UNTRUSTED-PR-REVIEW.md`.

That setup keeps CLI auth state in Docker volumes instead of your host home directory and uses a separate scratch workspace for PR checkouts and preview runs.

## Onboard Smoke Test (Ubuntu + npm only)

Use this when you want to mimic a fresh machine that only has Ubuntu + npm and verify:
@@ -88,6 +120,7 @@ Useful overrides:

```sh
HOST_PORT=3200 PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
PAPERCLIP_DEPLOYMENT_MODE=authenticated PAPERCLIP_DEPLOYMENT_EXPOSURE=private ./scripts/docker-onboard-smoke.sh
SMOKE_DETACH=true SMOKE_METADATA_FILE=/tmp/paperclip-smoke.env PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

Notes:
@@ -96,5 +129,8 @@ Notes:

- Container runtime user id defaults to your local `id -u` so the mounted data dir stays writable while avoiding root runtime.
- Smoke script defaults to `authenticated/private` mode so `HOST=0.0.0.0` can be exposed to the host.
- Smoke script defaults host port to `3131` to avoid conflicts with local Paperclip on `3100`.
- Smoke script also defaults `PAPERCLIP_PUBLIC_URL` to `http://localhost:<HOST_PORT>` so bootstrap invite URLs and auth callbacks use the reachable host port instead of the container's internal `3100`.
- In authenticated mode, the smoke script defaults `SMOKE_AUTO_BOOTSTRAP=true` and drives the real bootstrap path automatically: it signs up a real user, runs `paperclipai auth bootstrap-ceo` inside the container to mint a real bootstrap invite, accepts that invite over HTTP, and verifies board session access.
- Run the script in the foreground to watch the onboarding flow; stop with `Ctrl+C` after validation.
- Set `SMOKE_DETACH=true` to leave the container running for automation and optionally write shell-ready metadata to `SMOKE_METADATA_FILE`.
- The image definition is in `Dockerfile.onboard-smoke`.

doc/OPENCLAW_ONBOARDING.md (new file, 94 lines)
@@ -0,0 +1,94 @@

Use this exact checklist.

1. Start Paperclip in auth mode.

```bash
cd <paperclip-repo-root>
pnpm dev --tailscale-auth
```

Then verify:

```bash
curl -sS http://127.0.0.1:3100/api/health | jq
```

2. Start a clean/stock OpenClaw Docker.

```bash
OPENCLAW_RESET_STATE=1 OPENCLAW_BUILD=1 ./scripts/smoke/openclaw-docker-ui.sh
```

Open the printed `Dashboard URL` (includes `#token=...`) in your browser.

3. In Paperclip UI, go to `http://127.0.0.1:3100/CLA/company/settings`.

4. Use the OpenClaw invite prompt flow.

- In the Invites section, click `Generate OpenClaw Invite Prompt`.
- Copy the generated prompt from `OpenClaw Invite Prompt`.
- Paste it into OpenClaw main chat as one message.
- If it stalls, send one follow-up: `How is onboarding going? Continue setup now.`

Security/control note:

- The OpenClaw invite prompt is created from a controlled endpoint:
  - `POST /api/companies/{companyId}/openclaw/invite-prompt`
  - board users with invite permission can call it
  - agent callers are limited to the company CEO agent

5. Approve the join request in Paperclip UI, then confirm the OpenClaw agent appears in CLA agents.

6. Gateway preflight (required before task tests).

- Confirm the created agent uses `openclaw_gateway` (not `openclaw`).
- Confirm gateway URL is `ws://...` or `wss://...`.
- Confirm gateway token is non-trivial (not empty / not a 1-char placeholder).
- The OpenClaw Gateway adapter UI should not expose `disableDeviceAuth` for normal onboarding.
- Confirm pairing mode is explicit:
  - required default: device auth enabled (`adapterConfig.disableDeviceAuth` false/absent) with persisted `adapterConfig.devicePrivateKeyPem`
  - do not rely on `disableDeviceAuth` for normal onboarding
- If you can run API checks with board auth:

```bash
AGENT_ID="<newly-created-agent-id>"
curl -sS -H "Cookie: $PAPERCLIP_COOKIE" "http://127.0.0.1:3100/api/agents/$AGENT_ID" | jq '{adapterType,adapterConfig:{url:.adapterConfig.url,tokenLen:(.adapterConfig.headers["x-openclaw-token"] // .adapterConfig.headers["x-openclaw-auth"] // "" | length),disableDeviceAuth:(.adapterConfig.disableDeviceAuth // false),hasDeviceKey:(.adapterConfig.devicePrivateKeyPem // "" | length > 0)}}'
```

- Expected: `adapterType=openclaw_gateway`, `tokenLen >= 16`, `hasDeviceKey=true`, and `disableDeviceAuth=false`.

Pairing handshake note:

- Clean run expectation: the first task should succeed without manual pairing commands.
- The adapter attempts one automatic pairing approval + retry on the first `pairing required` (when the shared gateway auth token/password is valid).
- If auto-pair cannot complete (for example token mismatch or no pending request), the first gateway run may still return `pairing required`.
- This is a separate approval from Paperclip invite approval. You must approve the pending device in OpenClaw itself.
- Approve it in OpenClaw, then retry the task.
- For local docker smoke, you can approve from the host:

```bash
docker exec openclaw-docker-openclaw-gateway-1 sh -lc 'openclaw devices approve --latest --json --url "ws://127.0.0.1:18789" --token "$(node -p \"require(process.env.HOME+\\\"/.openclaw/openclaw.json\\\").gateway.auth.token\")"'
```

- You can inspect pending vs paired devices:

```bash
docker exec openclaw-docker-openclaw-gateway-1 sh -lc 'TOK="$(node -e \"const fs=require(\\\"fs\\\");const c=JSON.parse(fs.readFileSync(\\\"/home/node/.openclaw/openclaw.json\\\",\\\"utf8\\\"));process.stdout.write(c.gateway?.auth?.token||\\\"\\\");\")\"; openclaw devices list --json --url \"ws://127.0.0.1:18789\" --token \"$TOK\"'
```

7. Case A (manual issue test).

- Create an issue assigned to the OpenClaw agent.
- Put instructions: “post comment `OPENCLAW_CASE_A_OK_<timestamp>` and mark done.”
- Verify in UI: issue status becomes `done` and the comment exists.

8. Case B (message tool test).

- Create another issue assigned to OpenClaw.
- Instructions: “send `OPENCLAW_CASE_B_OK_<timestamp>` to main webchat via message tool, then comment the same marker on the issue, then mark done.”
- Verify both:
  - marker comment on the issue
  - marker text appears in OpenClaw main chat

9. Case C (new session memory/skills test).

- In OpenClaw, start a `/new` session.
- Ask it to create a new CLA issue in Paperclip with unique title `OPENCLAW_CASE_C_CREATED_<timestamp>`.
- Verify in Paperclip UI that the new issue exists.

10. Watch logs during the test (optional but helpful):

```bash
docker compose -f /tmp/openclaw-docker/docker-compose.yml -f /tmp/openclaw-docker/.paperclip-openclaw.override.yml logs -f openclaw-gateway
```

11. Expected pass criteria.

- Preflight: `openclaw_gateway` + non-placeholder token (`tokenLen >= 16`).
- Pairing mode: stable `devicePrivateKeyPem` configured with device auth enabled (default path).
- Case A: `done` + marker comment.
- Case B: `done` + marker comment + main-chat message visible.
- Case C: original task done and new issue created from the `/new` session.

If you want, I can also give you a single “observer mode” command that runs the stock smoke harness while you watch the same steps live in the UI.
@@ -94,3 +94,53 @@ Canonical mode design and command expectations live in `doc/DEPLOYMENT-MODES.md`

## Further Detail

See [SPEC.md](./SPEC.md) for the full technical specification and [TASKS.md](./TASKS.md) for the task management data model.

---

Paperclip’s core identity is a **control plane for autonomous AI companies**, centered on **companies, org charts, goals, issues/comments, heartbeats, budgets, approvals, and board governance**. The public docs are also explicit about the current boundaries: **tasks/comments are the built-in communication model**, Paperclip is **not a chatbot**, and it is **not a code review tool**. The roadmap already points toward **easier onboarding, cloud agents, easier agent configuration, plugins, better docs, and ClipMart/ClipHub-style reusable companies/templates**.

## What Paperclip should do vs. not do

**Do**

- Stay **board-level and company-level**. Users should manage goals, orgs, budgets, approvals, and outputs.
- Make the first five minutes feel magical: install, answer a few questions, see a CEO do something real.
- Keep work anchored to **issues/comments/projects/goals**, even if the surface feels conversational.
- Treat **agency / internal team / startup** as the same underlying abstraction with different templates and labels.
- Make outputs first-class: files, docs, reports, previews, links, screenshots.
- Provide **hooks into engineering workflows**: worktrees, preview servers, PR links, external review tools.
- Use **plugins** for edge cases like rich chat, knowledge bases, doc editors, custom tracing.

**Do not**

- Do not make the core product a general chat app. The current product definition is explicitly task/comment-centric and “not a chatbot,” and that boundary is valuable.
- Do not build a complete Jira/GitHub replacement. The repo/docs already position Paperclip as organization orchestration, not focused on pull-request review.
- Do not build enterprise-grade RBAC first. The current V1 spec still treats multi-board governance and fine-grained human permissions as out of scope, so the first multi-user version should be coarse and company-scoped.
- Do not lead with raw bash logs and transcripts. The default view should be human-readable intent/progress, with raw detail beneath.
- Do not force users to understand provider/API-key plumbing unless absolutely necessary. There are active onboarding/auth issues already; friction here is clearly real.

## Specific design goals

1. **Time-to-first-success under 5 minutes**

   A fresh user should go from install to “my CEO completed a first task” in one sitting.

2. **Board-level abstraction always wins**

   The default UI should answer: what is the company doing, who is doing it, why does it matter, what did it cost, and what needs my approval.

3. **Conversation stays attached to work objects**

   “Chat with CEO” should still resolve to strategy threads, decisions, tasks, or approvals.

4. **Progressive disclosure**

   Top layer: human-readable summary. Middle layer: checklist/steps/artifacts. Bottom layer: raw logs/tool calls/transcript.

5. **Output-first**

   Work is not done until the user can see the result: file, document, preview link, screenshot, plan, or PR.

6. **Local-first, cloud-ready**

   The mental model should not change between local solo use and shared/private or public/cloud deployment.

7. **Safe autonomy**

   Auto mode is allowed; hidden token burn is not.

8. **Thin core, rich edges**

   Put optional chat, knowledge, and special surfaces into plugins/extensions rather than bloating the control plane.
@@ -1,196 +1,140 @@

# Publishing to npm

Low-level reference for how Paperclip packages are prepared and published to npm.

For the maintainer workflow, use [doc/RELEASING.md](RELEASING.md). This document focuses on packaging internals.

## Current Release Entry Points

Use these scripts:

- [`scripts/release.sh`](../scripts/release.sh) for canary and stable publish flows
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh) after pushing a stable tag
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh) to repoint `latest`
- [`scripts/build-npm.sh`](../scripts/build-npm.sh) for the CLI packaging build

Paperclip no longer uses release branches or Changesets for publishing.

## Why the CLI needs special packaging

The CLI package, `paperclipai`, imports code from workspace packages such as:

- `@paperclipai/server`
- `@paperclipai/db`
- `@paperclipai/shared`
- adapter packages under `packages/adapters/`

Those workspace references are valid in development but not in a publishable npm package. The release flow rewrites versions temporarily, then builds a publishable CLI bundle.

## `build-npm.sh`

Run:

```bash
./scripts/build-npm.sh
```

This script:

1. runs the forbidden token check unless `--skip-checks` is supplied
2. runs `pnpm -r typecheck`
3. bundles the CLI entrypoint with esbuild into `cli/dist/index.js`
4. verifies the bundled entrypoint with `node --check`
5. rewrites `cli/package.json` into a publishable npm manifest and stores the dev copy as `cli/package.dev.json`
6. copies the repo `README.md` into `cli/README.md` for npm metadata

After the release script exits, the dev manifest and temporary files are restored automatically.

## Package discovery and versioning

Public packages are discovered from:

- `packages/`
- `server/`
- `cli/`

`ui/` is ignored because it is private.

The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs), which:

- finds all public packages
- sorts them topologically by internal dependencies
- rewrites each package version to the target release version
- rewrites internal `workspace:*` dependency references to the exact target version
- updates the CLI's displayed version string

Those rewrites are temporary. The working tree is restored after publish or dry-run.

## Version formats

Paperclip uses calendar versions:

- stable: `YYYY.MDD.P`
- canary: `YYYY.MDD.P-canary.N`

Examples:

- stable: `2026.318.0`
- canary: `2026.318.1-canary.2`

## Publish model

### Canary

Canaries publish under the npm dist-tag `canary`.

Example:

- `paperclipai@2026.318.1-canary.2`

This keeps the default install path unchanged while allowing explicit installs with:

```bash
npx paperclipai@canary onboard
```

### Stable

Stable publishes use the npm dist-tag `latest`.

Example:

- `paperclipai@2026.318.0`

Stable publishes do not create a release commit. Instead:

- package versions are rewritten temporarily
- packages are published from the chosen source commit
- git tag `vYYYY.MDD.P` points at that original commit

## Trusted publishing

The intended CI model is npm trusted publishing through GitHub OIDC.

That means:

- no long-lived `NPM_TOKEN` in repository secrets
- GitHub Actions obtains short-lived publish credentials
- trusted publisher rules are configured per workflow file

See [doc/RELEASE-AUTOMATION-SETUP.md](RELEASE-AUTOMATION-SETUP.md) for the GitHub/npm setup steps.

## Rollback model

Rollback does not unpublish anything.

It repoints the `latest` dist-tag to a prior stable version:

```bash
./scripts/rollback-latest.sh 2026.318.0
```

This is the fastest way to restore the default install path if a stable release is bad.

## Related Files

- [`scripts/build-npm.sh`](../scripts/build-npm.sh)
- [`scripts/generate-npm-package-json.mjs`](../scripts/generate-npm-package-json.mjs)
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
- [`cli/esbuild.config.mjs`](../cli/esbuild.config.mjs)
- [`doc/RELEASING.md`](RELEASING.md)

doc/RELEASE-AUTOMATION-SETUP.md (new file, 281 lines)
@@ -0,0 +1,281 @@

# Release Automation Setup

This document covers the GitHub and npm setup required for the current Paperclip release model:

- automatic canaries from `master`
- manual stable promotion from a chosen source ref
- npm trusted publishing via GitHub OIDC
- protected release infrastructure in a public repository

Repo-side files that depend on this setup:

- `.github/workflows/release.yml`
- `.github/CODEOWNERS`

Note:

- the release workflows intentionally use `pnpm install --no-frozen-lockfile`
- this matches the repo's current policy where `pnpm-lock.yaml` is refreshed by GitHub automation after manifest changes land on `master`
- the publish jobs then restore `pnpm-lock.yaml` before running `scripts/release.sh`, so the release script still sees a clean worktree

## 1. Merge the Repo Changes First

Before touching GitHub or npm settings, merge the release automation code so the referenced workflow filenames already exist on the default branch.

Required files:

- `.github/workflows/release.yml`
- `.github/CODEOWNERS`

## 2. Configure npm Trusted Publishing

Do this for every public package that Paperclip publishes.

At minimum that includes:

- `paperclipai`
- `@paperclipai/server`
- public packages under `packages/`

### 2.1. In npm, open each package settings page

For each package:

1. open npm as an owner of the package
2. go to the package settings / publishing access area
3. add a trusted publisher for the GitHub repository `paperclipai/paperclip`

### 2.2. Add one trusted publisher entry per package

npm currently allows one trusted publisher configuration per package.

Configure:

- workflow: `.github/workflows/release.yml`

Repository:

- `paperclipai/paperclip`

Environment name:

- leave the npm trusted-publisher environment field blank

Why:

- the single `release.yml` workflow handles both canary and stable publishing
- GitHub environments `npm-canary` and `npm-stable` still enforce different approval rules on the GitHub side

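As a rough sketch (job layout and steps are assumptions; the repo's actual `release.yml` is the source of truth), the publish job needs the OIDC token permission for trusted publishing and runs under one of the GitHub environments:

```yaml
# Hypothetical excerpt of a publish job; only the environment names, the
# id-token permission, and scripts/release.sh come from this document.
jobs:
  publish:
    runs-on: ubuntu-latest
    environment: npm-canary   # npm-stable for the stable path
    permissions:
      id-token: write         # required for npm trusted publishing via OIDC
      contents: read
    steps:
      - uses: actions/checkout@v4
      - run: ./scripts/release.sh canary
```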
### 2.3. Verify trusted publishing before removing old auth

After the workflows are live:

1. run a canary publish
2. confirm npm publish succeeds without any `NPM_TOKEN`
3. run a stable dry-run
4. run one real stable publish

Only after that should you remove old token-based access.

## 3. Remove Legacy npm Tokens

After trusted publishing works:

1. revoke any repository or organization `NPM_TOKEN` secrets used for publish
2. revoke any personal automation token that used to publish Paperclip
3. if npm offers a package-level setting to restrict publishing to trusted publishers, enable it

Goal:

- no long-lived npm publishing token should remain in GitHub Actions

## 4. Create GitHub Environments

Create two environments in the GitHub repository:

- `npm-canary`
- `npm-stable`

Path:

1. GitHub repository
2. `Settings`
3. `Environments`
4. `New environment`

## 5. Configure `npm-canary`

Recommended settings for `npm-canary`:

- environment name: `npm-canary`
- required reviewers: none
- wait timer: none
- deployment branches and tags:
  - selected branches only
  - allow `master`

Reasoning:

- every push to `master` should be able to publish a canary automatically
- no human approval should be required for canaries

## 6. Configure `npm-stable`

Recommended settings for `npm-stable`:

- environment name: `npm-stable`
- required reviewers: at least one maintainer other than the person triggering the workflow when possible
- prevent self-review: enabled
- admin bypass: disabled if your team can tolerate it
- wait timer: optional
- deployment branches and tags:
  - selected branches only
  - allow `master`

Reasoning:

- stable publishes should require an explicit human approval gate
- the workflow is manual, but the environment should still be the real control point

## 7. Protect `master`

Open the branch protection settings for `master`.

Recommended rules:

1. require pull requests before merging
2. require status checks to pass before merging
3. require review from code owners
4. dismiss stale approvals when new commits are pushed
5. restrict who can push directly to `master`

At minimum, make sure workflow and release script changes cannot land without review.

## 8. Enforce CODEOWNERS Review

This repo now includes `.github/CODEOWNERS`, but GitHub only enforces it if branch protection requires code owner reviews.

In branch protection for `master`, enable:

- `Require review from Code Owners`

Then verify the owner entries are correct for your actual maintainer set.

Current file:

- `.github/CODEOWNERS`

If `@cryppadotta` is not the right reviewer identity in the public repo, change it before enabling enforcement.

## 9. Protect Release Infrastructure Specifically

These files should always trigger code owner review:

- `.github/workflows/release.yml`
- `scripts/release.sh`
- `scripts/release-lib.sh`
- `scripts/release-package-map.mjs`
- `scripts/create-github-release.sh`
- `scripts/rollback-latest.sh`
- `doc/RELEASING.md`
- `doc/PUBLISHING.md`

If you want stronger controls, add a repository ruleset that explicitly blocks direct pushes to:

- `.github/workflows/**`
- `scripts/release*`

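As a minimal sketch of matching CODEOWNERS entries (paths come from the list above; the owner handle is a placeholder for your actual maintainer set):

```
# Hypothetical CODEOWNERS entries protecting the release surface.
/.github/workflows/                @cryppadotta
/scripts/release.sh                @cryppadotta
/scripts/release-lib.sh            @cryppadotta
/scripts/release-package-map.mjs   @cryppadotta
/doc/RELEASING.md                  @cryppadotta
/doc/PUBLISHING.md                 @cryppadotta
```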
## 10. Do Not Store a Claude Token in GitHub Actions

Do not add a personal Claude or Anthropic token for automatic changelog generation.

Recommended policy:

- stable changelog generation happens locally from a trusted maintainer machine
- canaries never generate changelogs

This keeps LLM spending intentional and avoids a high-value token sitting in Actions.

## 11. Verify the Canary Workflow

After setup:

1. merge a harmless commit to `master`
2. open the `Release` workflow run triggered by that push
3. confirm it passes verification
4. confirm publish succeeds under the `npm-canary` environment
5. confirm npm now shows a new `canary` release
6. confirm a git tag named `canary/vYYYY.MDD.P-canary.N` was pushed

Install-path check:

```bash
npx paperclipai@canary onboard
```

## 12. Verify the Stable Workflow

After at least one good canary exists:

1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
2. prepare `releases/vYYYY.MDD.P.md` on the source commit you want to promote
3. open `Actions` -> `Release`
4. run it with:
   - `source_ref`: the tested commit SHA or canary tag source commit
   - `stable_date`: leave blank or set the intended UTC date like `2026-03-18`
     (do not enter a version like `2026.318.0`; the workflow computes that from the date)
   - `dry_run`: `true`
5. confirm the dry-run succeeds
6. rerun with `dry_run: false`
7. approve the `npm-stable` environment when prompted
8. confirm npm `latest` points to the new stable version
9. confirm git tag `vYYYY.MDD.P` exists
10. confirm the GitHub Release was created

Implementation note:

- the GitHub Actions stable workflow calls `create-github-release.sh` with `PUBLISH_REMOTE=origin`
- local maintainer usage can still pass `PUBLISH_REMOTE=public-gh` explicitly when needed

## 13. Suggested Maintainer Policy

Use this policy going forward:

- canaries are automatic and cheap
- stables are manual and approved
- only stables get public notes and announcements
- release notes are committed before stable publish
- rollback uses `npm dist-tag`, not unpublish

## 14. Troubleshooting

### Trusted publishing fails with an auth error

Check:

1. the workflow filename on GitHub exactly matches the filename configured in npm
2. the package has the trusted publisher entry for the correct repository
3. the job has `id-token: write`
4. the job is running from the expected repository, not a fork

### Stable workflow runs but never asks for approval

Check:

1. the `publish` job uses environment `npm-stable`
2. the environment actually has required reviewers configured
3. the workflow is running in the canonical repository, not a fork

### CODEOWNERS does not trigger

Check:

1. `.github/CODEOWNERS` is on the default branch
2. branch protection on `master` requires code owner review
3. the owner identities in the file are valid reviewers with repository access

## Related Docs

- [doc/RELEASING.md](RELEASING.md)
- [doc/PUBLISHING.md](PUBLISHING.md)
- [doc/plans/2026-03-17-release-automation-and-versioning.md](plans/2026-03-17-release-automation-and-versioning.md)

doc/RELEASING.md (new file, 251 lines)
@@ -0,0 +1,251 @@

# Releasing Paperclip

Maintainer runbook for shipping Paperclip across npm, GitHub, and the website-facing changelog surface.

The release model is now commit-driven:

1. Every push to `master` publishes a canary automatically.
2. Stable releases are manually promoted from a chosen tested commit or canary tag.
3. Stable release notes live in `releases/vYYYY.MDD.P.md`.
4. Only stable releases get GitHub Releases.

## Versioning Model

Paperclip uses calendar versions that still fit semver syntax:

- stable: `YYYY.MDD.P`
- canary: `YYYY.MDD.P-canary.N`

Examples:

- first stable on March 18, 2026: `2026.318.0`
- second stable on March 18, 2026: `2026.318.1`
- fourth canary for the `2026.318.1` line: `2026.318.1-canary.3`

Important constraints:

- the middle numeric slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
- use `2026.303.0` for March 3, not `2026.33.0`
- do not use leading zeroes such as `2026.0318.0`
- do not use four numeric segments such as `2026.3.18.1`
- the semver-safe canary form is `2026.318.0-canary.1`

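As a quick illustration of the `MDD` slot only (GNU `date`; the release scripts compute this themselves, so this is not part of the release flow):

```bash
# Month without leading zero, day zero-padded.
date -u -d 2026-03-03 +%Y.%-m%d   # 2026.303
date -u -d 2026-03-18 +%Y.%-m%d   # 2026.318
```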
## Release Surfaces

Every stable release has four separate surfaces:

1. **Verification** — the exact git SHA passes typecheck, tests, and build
2. **npm** — `paperclipai` and public workspace packages are published
3. **GitHub** — the stable release gets a git tag and GitHub Release
4. **Website / announcements** — the stable changelog is published externally and announced

A stable release is done only when all four surfaces are handled.

Canaries only cover the first two surfaces plus an internal traceability tag.

## Core Invariants

- canaries publish from `master`
- stables publish from an explicitly chosen source ref
- tags point at the original source commit, not a generated release commit
- stable notes are always `releases/vYYYY.MDD.P.md`
- canaries never create GitHub Releases
- canaries never require changelog generation

## TL;DR

### Canary

Every push to `master` runs the canary path inside [`.github/workflows/release.yml`](../.github/workflows/release.yml).

It:

- verifies the pushed commit
- computes the canary version for the current UTC date
- publishes under npm dist-tag `canary`
- creates a git tag `canary/vYYYY.MDD.P-canary.N`

Users install canaries with:

```bash
npx paperclipai@canary onboard
# or
npx paperclipai@canary onboard --data-dir "$(mktemp -d /tmp/paperclip-canary.XXXXXX)"
```

### Stable

Use [`.github/workflows/release.yml`](../.github/workflows/release.yml) from the Actions tab with the manual `workflow_dispatch` inputs.

[Run the action here](https://github.com/paperclipai/paperclip/actions/workflows/release.yml)

Inputs:

- `source_ref`
  - commit SHA, branch, or tag
- `stable_date`
  - optional UTC date override in `YYYY-MM-DD`
  - enter a date like `2026-03-18`, not a version like `2026.318.0`
- `dry_run`
  - preview only when true

Before running stable:

1. pick the canary commit or tag you trust
2. resolve the target stable version with `./scripts/release.sh stable --date "$(date +%F)" --print-version`
3. create or update `releases/vYYYY.MDD.P.md` on that source ref
4. run the stable workflow from that source ref

Example:

- `source_ref`: `master`
- `stable_date`: `2026-03-18`
- resulting stable version: `2026.318.0`

The workflow:

- re-verifies the exact source ref
- computes the next stable patch slot for the chosen UTC date
- publishes `YYYY.MDD.P` under npm dist-tag `latest`
- creates git tag `vYYYY.MDD.P`
- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`

## Local Commands

### Preview a canary locally

```bash
./scripts/release.sh canary --dry-run
```

### Preview a stable locally

```bash
./scripts/release.sh stable --dry-run
```

### Publish a stable locally

This is mainly for emergency/manual use. The normal path is the GitHub workflow.

```bash
./scripts/release.sh stable
git push public-gh refs/tags/vYYYY.MDD.P
PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P
```

## Stable Changelog Workflow

Stable changelog files live at:

- `releases/vYYYY.MDD.P.md`

Canaries do not get changelog files.

Recommended local generation flow:

```bash
VERSION="$(./scripts/release.sh stable --date 2026-03-18 --print-version)"
claude --print --output-format stream-json --verbose --dangerously-skip-permissions --model claude-opus-4-6 "Use the release-changelog skill to draft or update releases/v${VERSION}.md for Paperclip. Read doc/RELEASING.md and .agents/skills/release-changelog/SKILL.md, then generate the stable changelog for v${VERSION} from commits since the last stable tag. Do not create a canary changelog."
```

The repo intentionally does not run this through GitHub Actions because:

- canaries are too frequent
- stable notes are the only public narrative surface that needs LLM help
- maintainer LLM tokens should not live in Actions

## Smoke Testing

For a canary:

```bash
PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
```

For the current stable:

```bash
PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

Useful isolated variants:

```bash
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
HOST_PORT=3233 DATA_DIR=./data/release-smoke-stable PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

Automated browser smoke is also available:

```bash
gh workflow run release-smoke.yml -f paperclip_version=canary
gh workflow run release-smoke.yml -f paperclip_version=latest
```

Minimum checks:

- `npx paperclipai@canary onboard` installs
- onboarding completes without crashes
- authenticated login works with the smoke credentials
- the browser lands in onboarding on a fresh instance
- company creation succeeds
- the first CEO agent is created
- the first CEO heartbeat run is triggered

## Rollback

Rollback does not unpublish versions.

It only moves the `latest` dist-tag back to a previous stable:

```bash
./scripts/rollback-latest.sh 2026.318.0 --dry-run
./scripts/rollback-latest.sh 2026.318.0
```

Then fix forward with a new stable patch slot or release date.

## Failure Playbooks

### If the canary publishes but smoke testing fails

Do not run stable.

Instead:

1. fix the issue on `master`
2. merge the fix
3. wait for the next automatic canary
4. rerun smoke testing

### If stable npm publish succeeds but tag push or GitHub release creation fails

This is a partial release. npm is already live.

Do this immediately:

1. push the missing tag
2. rerun `PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P`
3. verify the GitHub Release notes point at `releases/vYYYY.MDD.P.md`

Do not republish the same version.

### If `latest` is broken after stable publish

Roll back the dist-tag:

```bash
./scripts/rollback-latest.sh YYYY.MDD.P
```

Then fix forward with a new stable release.

## Related Files

- [`scripts/release.sh`](../scripts/release.sh)
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh)
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh)
- [`doc/PUBLISHING.md`](PUBLISHING.md)
- [`doc/RELEASE-AUTOMATION-SETUP.md`](RELEASE-AUTOMATION-SETUP.md)
@@ -37,7 +37,7 @@ These decisions close open questions from `SPEC.md` for V1.

| Visibility | Full visibility to board and all agents in same company |
| Communication | Tasks + comments only (no separate chat system) |
| Task ownership | Single assignee; atomic checkout required for `in_progress` transition |
| Recovery | No automatic reassignment; work recovery stays manual/explicit |
| Agent adapters | Built-in `process` and `http` adapters |
| Auth | Mode-dependent human auth (`local_trusted` implicit board in current code; authenticated mode uses sessions), API keys for agents |
| Budget period | Monthly UTC calendar window |

@@ -106,7 +106,6 @@ A lightweight scheduler/worker in the server process handles:
 - heartbeat trigger checks
 - stuck run detection
 - budget threshold checks
-- stale task reporting generation

 Separate queue infrastructure is not required for V1.

@@ -331,6 +330,34 @@ Operational policy:
 - `asset_id` uuid fk not null
 - `issue_comment_id` uuid fk null

+## 7.15 `documents` + `document_revisions` + `issue_documents`
+
+- `documents` stores editable text-first documents:
+  - `id` uuid pk
+  - `company_id` uuid fk not null
+  - `title` text null
+  - `format` text not null (`markdown`)
+  - `latest_body` text not null
+  - `latest_revision_id` uuid null
+  - `latest_revision_number` int not null
+  - `created_by_agent_id` uuid fk null
+  - `created_by_user_id` uuid/text fk null
+  - `updated_by_agent_id` uuid fk null
+  - `updated_by_user_id` uuid/text fk null
+- `document_revisions` stores append-only history:
+  - `id` uuid pk
+  - `company_id` uuid fk not null
+  - `document_id` uuid fk not null
+  - `revision_number` int not null
+  - `body` text not null
+  - `change_summary` text null
+- `issue_documents` links documents to issues with a stable workflow key:
+  - `id` uuid pk
+  - `company_id` uuid fk not null
+  - `issue_id` uuid fk not null
+  - `document_id` uuid fk not null
+  - `key` text not null (`plan`, `design`, `notes`, etc.)
+
 ## 8. State Machines

 ## 8.1 Agent Status
@@ -414,6 +441,7 @@ All endpoints are under `/api` and return JSON.
 - `POST /companies`
 - `GET /companies/:companyId`
 - `PATCH /companies/:companyId`
+- `PATCH /companies/:companyId/branding`
 - `POST /companies/:companyId/archive`

 ## 10.2 Goals
@@ -442,6 +470,11 @@ All endpoints are under `/api` and return JSON.
 - `POST /companies/:companyId/issues`
 - `GET /issues/:issueId`
 - `PATCH /issues/:issueId`
+- `GET /issues/:issueId/documents`
+- `GET /issues/:issueId/documents/:key`
+- `PUT /issues/:issueId/documents/:key`
+- `GET /issues/:issueId/documents/:key/revisions`
+- `DELETE /issues/:issueId/documents/:key`
 - `POST /issues/:issueId/checkout`
 - `POST /issues/:issueId/release`
 - `POST /issues/:issueId/comments`
@@ -502,7 +535,6 @@ Dashboard payload must include:
 - open/in-progress/blocked/done issue counts
 - month-to-date spend and budget utilization
 - pending approvals count
-- stale task count

 ## 10.9 Error Semantics

@@ -681,7 +713,6 @@ Required UX behaviors:
 - global company selector
 - quick actions: pause/resume agent, create task, approve/reject request
 - conflict toasts on atomic checkout failure
-- clear stale-task indicators
 - no silent background failures; every failed run visible in UI

 ## 15. Operational Requirements
@@ -780,7 +811,6 @@ A release candidate is blocked unless these pass:

 - add company selector and org chart view
 - add approvals and cost pages
-- add operational dashboard and stale-task surfacing

 ## Milestone 6: Hardening and Release

@@ -814,20 +844,31 @@ V1 is complete only when all criteria are true:

 V1 supports company import/export using a portable package contract:

-- exactly one JSON entrypoint: `paperclip.manifest.json`
-- all other package files are markdown with frontmatter
-- agent convention:
-  - `agents/<slug>/AGENTS.md` (required for V1 export/import)
-  - `agents/<slug>/HEARTBEAT.md` (optional, import accepted)
-  - `agents/<slug>/*.md` (optional, import accepted)
+- markdown-first package rooted at `COMPANY.md`
+- implicit folder discovery by convention
+- `.paperclip.yaml` sidecar for Paperclip-specific fidelity
+- canonical base package is vendor-neutral and aligned with `docs/companies/companies-spec.md`
+- common conventions:
+  - `agents/<slug>/AGENTS.md`
+  - `teams/<slug>/TEAM.md`
+  - `projects/<slug>/PROJECT.md`
+  - `projects/<slug>/tasks/<slug>/TASK.md`
+  - `tasks/<slug>/TASK.md`
+  - `skills/<slug>/SKILL.md`

 Export/import behavior in V1:

-- export includes company metadata and/or agents based on selection
-- export strips environment-specific paths (`cwd`, local instruction file paths)
-- export never includes secret values; secret requirements are reported
+- export emits a clean vendor-neutral markdown package plus `.paperclip.yaml`
+- projects and starter tasks are opt-in export content rather than default package content
+- recurring `TASK.md` entries use `recurring: true` in the base package and Paperclip routine fidelity in `.paperclip.yaml`
+- Paperclip imports recurring task packages as routines instead of downgrading them to one-time issues
+- export strips environment-specific paths (`cwd`, local instruction file paths, inline prompt duplication) while preserving portable project repo/workspace metadata such as `repoUrl`, refs, and workspace-policy references keyed in `.paperclip.yaml`
+- export never includes secret values; env inputs are reported as portable declarations instead
 - import supports target modes:
   - create a new company
   - import into an existing company
+- import recreates exported project workspaces and remaps portable workspace keys back to target-local workspace ids
+- import forces imported agent timer heartbeats off so packages never start scheduled runs implicitly
 - import supports collision strategies: `rename`, `skip`, `replace`
 - import supports preview (dry-run) before apply
+- GitHub imports warn on unpinned refs instead of blocking
@@ -189,11 +189,14 @@ The heartbeat is a protocol, not a runtime. Paperclip defines how to initiate an
 Agent configuration includes an **adapter** that defines how Paperclip invokes the agent. Initial adapters:

 | Adapter | Mechanism | Example |
-| --------- | ----------------------- | --------------------------------------------- |
+| -------------------- | ----------------------- | --------------------------------------------- |
 | `process` | Execute a child process | `python run_agent.py --agent-id {id}` |
 | `http` | Send an HTTP request | `POST https://openclaw.example.com/hook/{id}` |
+| `openclaw_gateway` | OpenClaw gateway API | Managed OpenClaw agent via gateway |
+| `gemini_local` | Gemini CLI process | Local Gemini CLI with sandbox and approval |
+| `hermes_local` | Hermes agent process | Local Hermes agent |

-The `process` and `http` adapters ship as defaults. Additional adapters can be added via the plugin system (see Plugin / Extension Architecture).
+The `process` and `http` adapters ship as defaults. Additional adapters have been added for specific agent runtimes (see list above), and new adapter types can be registered via the plugin system (see Plugin / Extension Architecture).

 ### Adapter Interface

@@ -429,7 +432,7 @@ The core Paperclip system must be extensible. Features like knowledge bases, ext
 - **Agent Adapter plugins** — new Adapter types can be registered via the plugin system
 - Plugin-registrable UI components (future)

-This isn't a V1 deliverable (we're not building a plugin framework upfront), but the architecture should not paint us into a corner. Keep boundaries clean so extensions are possible.
+The plugin framework has shipped. Plugins can register new adapter types, hook into lifecycle events, and contribute UI components (e.g. global toolbar buttons). A plugin SDK and CLI commands (`paperclipai plugin`) are available for authoring and installing plugins.

 ---


doc/UNTRUSTED-PR-REVIEW.md (new file, 135 lines)
@@ -0,0 +1,135 @@
# Untrusted PR Review In Docker

Use this workflow when you want Codex or Claude to inspect a pull request that you do not want touching your host machine directly.

This is intentionally separate from the normal Paperclip dev image.

## What this container isolates

- `codex` auth/session state in a Docker volume, not your host `~/.codex`
- `claude` auth/session state in a Docker volume, not your host `~/.claude`
- `gh` auth state in the same container-local home volume
- review clones, worktrees, dependency installs, and local databases in a writable scratch volume under `/work`

By default this workflow does **not** mount your host repo checkout, your host home directory, or your SSH agent.

## Files

- `docker/untrusted-review/Dockerfile`
- `docker-compose.untrusted-review.yml`
- `review-checkout-pr` inside the container

## Build and start a shell

```sh
docker compose -f docker-compose.untrusted-review.yml build
docker compose -f docker-compose.untrusted-review.yml run --rm --service-ports review
```

That opens an interactive shell in the review container with:

- Node + Corepack/pnpm
- `codex`
- `claude`
- `gh`
- `git`, `rg`, `fd`, `jq`

## First-time login inside the container

Run these once. The resulting login state persists in the `review-home` Docker volume.

```sh
gh auth login
codex login
claude login
```

If you prefer API-key auth instead of CLI login, pass keys through Compose env:

```sh
OPENAI_API_KEY=... ANTHROPIC_API_KEY=... docker compose -f docker-compose.untrusted-review.yml run --rm review
```

## Check out a PR safely

Inside the container:

```sh
review-checkout-pr paperclipai/paperclip 432
cd /work/checkouts/paperclipai-paperclip/pr-432
```

What this does:

1. Creates or reuses a repo clone under `/work/repos/...`
2. Fetches `pull/<pr>/head` from GitHub
3. Creates a detached git worktree under `/work/checkouts/...`

The checkout lives entirely inside the container volume.
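
Before handing the checkout to an agent, it can help to skim what the PR actually changed. Plain git works inside the worktree (a sketch; adjust `origin/master` if the PR targets a different base branch):

```sh
git log --oneline -5
git diff --stat origin/master...HEAD
```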

## Ask Codex or Claude to review it

Inside the PR checkout:

```sh
codex
```

Then give it a prompt like:

```text
Review this PR as hostile input. Focus on security issues, data exfiltration paths, sandbox escapes, dangerous install/runtime scripts, auth changes, and subtle behavioral regressions. Do not modify files. Produce findings ordered by severity with file references.
```

Or with Claude:

```sh
claude
```

## Preview the Paperclip app from the PR

Only do this when you intentionally want to execute the PR's code inside the container.

Inside the PR checkout:

```sh
pnpm install
HOST=0.0.0.0 pnpm dev
```

Open from the host:

- `http://localhost:3100`

The Compose file also exposes Vite's default port:

- `http://localhost:5173`

Notes:

- `pnpm install` can run untrusted lifecycle scripts from the PR. That is why this happens inside the isolated container instead of on your host.
- If you only want static inspection, do not run install/dev commands.
- Paperclip's embedded PostgreSQL and local storage stay inside the container home volume via `PAPERCLIP_HOME=/home/reviewer/.paperclip-review`.

## Reset state

Remove the review container volumes when you want a clean environment:

```sh
docker compose -f docker-compose.untrusted-review.yml down -v
```

That deletes:

- Codex/Claude/GitHub login state stored in `review-home`
- cloned repos, worktrees, installs, and scratch data stored in `review-work`
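
If you want to confirm the volumes are gone, listing them is enough (Compose usually prefixes the names with the project name):

```sh
docker volume ls --filter name=review
```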

## Security limits

This is a useful isolation boundary, but it is still Docker, not a full VM.

- A reviewed PR can still access the container's network unless you disable it.
- Any secrets you pass into the container are available to code you execute inside it.
- Do not mount your host repo, host home, `.ssh`, or Docker socket unless you are intentionally weakening the boundary.
- If you need a stronger boundary than this, use a disposable VM instead of Docker.
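
If you only need static inspection and want to take the network point above off the table entirely, one option is to start the image with networking disabled. This is a sketch: the image tag depends on how Compose named the build (check `docker images`), and the named volumes are not mounted here, so login state will not be available.

```sh
docker run --rm -it --network none paperclip-untrusted-review bash
```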

doc/experimental/issue-worktree-support.md (new file, 62 lines)
@@ -0,0 +1,62 @@
# Issue worktree support

Status: experimental, runtime-only, not shipping as a user-facing feature yet.

This branch contains the runtime and seeding work needed for issue-scoped worktrees:

- project execution workspace policy support
- issue-level execution workspace settings
- git worktree realization for isolated issue execution
- optional command-based worktree provisioning
- seeded worktree fixes for secrets key compatibility
- seeded project workspace rebinding to the current git worktree

We are intentionally not shipping the UI for this yet. The runtime code remains in place, but the main UI entrypoints are hard-gated off for now.

## What works today

- projects can carry execution workspace policy in the backend
- issues can carry execution workspace settings in the backend
- heartbeat execution can realize isolated git worktrees
- runtime can run a project-defined provision command inside the derived worktree
- seeded worktree instances can keep local-encrypted secrets working
- seeded worktree instances can rebind same-repo project workspace paths onto the current git worktree
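
For orientation, the isolation mechanism itself is standard `git worktree`. A rough manual equivalent is shown below; the branch name, path, and base ref are illustrative only, not the template the runtime uses:

```sh
# from a clone of the project repository
git worktree add -b pap-123-worktree ../pap-123-worktree origin/master
```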

## Hidden UI entrypoints

These are the current user-facing UI surfaces for the feature, now intentionally disabled:

- project settings:
  - `ui/src/components/ProjectProperties.tsx`
  - execution workspace policy controls
  - git worktree base ref / branch template / parent dir
  - provision / teardown command inputs

- issue creation:
  - `ui/src/components/NewIssueDialog.tsx`
  - isolated issue checkout toggle
  - defaulting issue execution workspace settings from project policy

- issue editing:
  - `ui/src/components/IssueProperties.tsx`
  - issue-level workspace mode toggle
  - defaulting issue execution workspace settings when project changes

- agent/runtime settings:
  - `ui/src/adapters/runtime-json-fields.tsx`
  - runtime services JSON field, which is part of the broader workspace-runtime support surface

## Why the UI is hidden

- the runtime behavior is still being validated
- the workflow and operator ergonomics are not final
- we do not want to expose a partially-baked user-facing feature in issues, projects, or settings

## Re-enable plan

When this is ready to ship:

- re-enable the gated UI sections in the files above
- review wording and defaults for project and issue controls
- decide which agent/runtime settings should remain advanced-only
- add end-to-end product-level verification for the full UI workflow

doc/memory-landscape.md (new file, 172 lines)
@@ -0,0 +1,172 @@
# Memory Landscape

Date: 2026-03-17

This document summarizes the memory systems referenced in task `PAP-530` and extracts the design patterns that matter for Paperclip.

## What Paperclip Needs From This Survey

Paperclip is not trying to become a single opinionated memory engine. The more useful target is a control-plane memory surface that:

- stays company-scoped
- lets each company choose a default memory provider
- lets specific agents override that default
- keeps provenance back to Paperclip runs, issues, comments, and documents
- records memory-related cost and latency the same way the rest of the control plane records work
- works with plugin-provided providers, not only built-ins

The question is not "which memory project wins?" The question is "what is the smallest Paperclip contract that can sit above several very different memory systems without flattening away the useful differences?"

## Quick Grouping

### Hosted memory APIs

- `mem0`
- `supermemory`
- `Memori`

These optimize for a simple application integration story: send conversation/content plus an identity, then query for relevant memory or user context later.

### Agent-centric memory frameworks / memory OSes

- `MemOS`
- `memU`
- `EverMemOS`
- `OpenViking`

These treat memory as an agent runtime subsystem, not only as a search index. They usually add task memory, profiles, filesystem-style organization, async ingestion, or skill/resource management.

### Local-first memory stores / indexes

- `nuggets`
- `memsearch`

These emphasize local persistence, inspectability, and low operational overhead. They are useful because Paperclip is local-first today and needs at least one zero-config path.

## Per-Project Notes

| Project | Shape | Notable API / model | Strong fit for Paperclip | Main mismatch |
|---|---|---|---|---|
| [nuggets](https://github.com/NeoVertex1/nuggets) | local memory engine + messaging gateway | topic-scoped HRR memory with `remember`, `recall`, `forget`, fact promotion into `MEMORY.md` | good example of lightweight local memory and automatic promotion | very specific architecture; not a general multi-tenant service |
| [mem0](https://github.com/mem0ai/mem0) | hosted + OSS SDK | `add`, `search`, `getAll`, `get`, `update`, `delete`, `deleteAll`; entity partitioning via `user_id`, `agent_id`, `run_id`, `app_id` | closest to a clean provider API with identities and metadata filters | provider owns extraction heavily; Paperclip should not assume every backend behaves like mem0 |
| [MemOS](https://github.com/MemTensor/MemOS) | memory OS / framework | unified add-retrieve-edit-delete, memory cubes, multimodal memory, tool memory, async scheduler, feedback/correction | strong source for optional capabilities beyond plain search | much broader than the minimal contract Paperclip should standardize first |
| [supermemory](https://github.com/supermemoryai/supermemory) | hosted memory + context API | `add`, `profile`, `search.memories`, `search.documents`, document upload, settings; automatic profile building and forgetting | strong example of "context bundle" rather than raw search results | heavily productized around its own ontology and hosted flow |
| [memU](https://github.com/NevaMind-AI/memU) | proactive agent memory framework | file-system metaphor, proactive loop, intent prediction, always-on companion model | good source for when memory should trigger agent behavior, not just retrieval | proactive assistant framing is broader than Paperclip's task-centric control plane |
| [Memori](https://github.com/MemoriLabs/Memori) | hosted memory fabric + SDK wrappers | registers against LLM SDKs, attribution via `entity_id` + `process_id`, sessions, cloud + BYODB | strong example of automatic capture around model clients | wrapper-centric design does not map 1:1 to Paperclip's run / issue / comment lifecycle |
| [EverMemOS](https://github.com/EverMind-AI/EverMemOS) | conversational long-term memory system | MemCell extraction, structured narratives, user profiles, hybrid retrieval / reranking | useful model for provenance-rich structured memories and evolving profiles | focused on conversational memory rather than generalized control-plane events |
| [memsearch](https://github.com/zilliztech/memsearch) | markdown-first local memory index | markdown as source of truth, `index`, `search`, `watch`, transcript parsing, plugin hooks | excellent baseline for a local built-in provider and inspectable provenance | intentionally simple; no hosted service semantics or rich correction workflow |
| [OpenViking](https://github.com/volcengine/OpenViking) | context database | filesystem-style organization of memories/resources/skills, tiered loading, visualized retrieval trajectories | strong source for browse/inspect UX and context provenance | treats "context database" as a larger product surface than Paperclip should own |

## Common Primitives Across The Landscape

Even though the systems disagree on architecture, they converge on a few primitives:

- `ingest`: add memory from text, messages, documents, or transcripts
- `query`: search or retrieve memory given a task, question, or scope
- `scope`: partition memory by user, agent, project, process, or session
- `provenance`: carry enough metadata to explain where a memory came from
- `maintenance`: update, forget, dedupe, compact, or correct memories over time
- `context assembly`: turn raw memories into a prompt-ready bundle for the agent

If Paperclip does not expose these, it will not adapt well to the systems above.

## Where The Systems Differ

These differences are exactly why Paperclip needs a layered contract instead of a single hard-coded engine.

### 1. Who owns extraction?

- `mem0`, `supermemory`, and `Memori` expect the provider to infer memories from conversations.
- `memsearch` expects the host to decide what markdown to write, then indexes it.
- `MemOS`, `memU`, `EverMemOS`, and `OpenViking` sit somewhere in between and often expose richer memory construction pipelines.

Paperclip should support both:

- provider-managed extraction
- Paperclip-managed extraction with provider-managed storage / retrieval

### 2. What is the source of truth?

- `memsearch` and `nuggets` make the source inspectable on disk.
- hosted APIs often make the provider store canonical.
- filesystem-style systems like `OpenViking` and `memU` treat hierarchy itself as part of the memory model.

Paperclip should not require a single storage shape. It should require normalized references back to Paperclip entities.

### 3. Is memory just search, or also profile and planning state?

- `mem0` and `memsearch` center search and CRUD.
- `supermemory` adds user profiles as a first-class output.
- `MemOS`, `memU`, `EverMemOS`, and `OpenViking` expand into tool traces, task memory, resources, and skills.

Paperclip should make plain search the minimum contract and richer outputs optional capabilities.

### 4. Is memory synchronous or asynchronous?

- local tools often work synchronously in-process.
- larger systems add schedulers, background indexing, compaction, or sync jobs.

Paperclip needs both direct request/response operations and background maintenance hooks.

## Paperclip-Specific Takeaways

### Paperclip should own these concerns

- binding a provider to a company and optionally overriding it per agent
- mapping Paperclip entities into provider scopes
- provenance back to issue comments, documents, runs, and activity
- cost / token / latency reporting for memory work
- browse and inspect surfaces in the Paperclip UI
- governance on destructive operations

### Providers should own these concerns

- extraction heuristics
- embedding / indexing strategy
- ranking and reranking
- profile synthesis
- contradiction resolution and forgetting logic
- storage engine details

### The control-plane contract should stay small

Paperclip does not need to standardize every feature from every provider. It needs:

- a required portable core
- optional capability flags for richer providers
- a way to record provider-native ids and metadata without pretending all providers are equivalent internally

## Recommended Direction

Paperclip should adopt a two-layer memory model:

1. `Memory binding + control plane layer`
   Paperclip decides which provider key is in effect for a company, agent, or project, and it logs every memory operation with provenance and usage.

2. `Provider adapter layer`
   A built-in or plugin-supplied adapter turns Paperclip memory requests into provider-specific calls.

The portable core should cover:

- ingest / write
- search / recall
- browse / inspect
- get by provider record handle
- forget / correction
- usage reporting

Optional capabilities can cover:

- profile synthesis
- async ingestion
- multimodal content
- tool / resource / skill memory
- provider-native graph browsing

That is enough to support:

- a local markdown-first baseline similar to `memsearch`
- hosted services similar to `mem0`, `supermemory`, or `Memori`
- richer agent-memory systems like `MemOS` or `OpenViking`

without forcing Paperclip itself to become a monolithic memory engine.
@@ -1,5 +1,7 @@
 # Paperclip Module System

+> Supersession note: the company-template/package-format direction in this document is no longer current. For the current markdown-first company import/export plan, see `doc/plans/2026-03-13-company-import-export-v2.md` and `docs/companies/companies-spec.md`.
+
 ## Overview

 Paperclip's module system lets you extend the control plane with new capabilities — revenue tracking, observability, notifications, dashboards — without forking core. Modules are self-contained packages that register routes, UI pages, database tables, and lifecycle hooks.

Some files were not shown because too many files have changed in this diff.