Compare commits

...

156 Commits

Author SHA1 Message Date
Devin Foley
49fa6c8b6c docs: fix stale concept count and frontmatter summary
Update "five key concepts" to "six" and add "delegation" to the
frontmatter summary field, addressing Greptile review comments.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 22:27:27 -07:00
Devin Foley
99bd62987b docs: add AGENTS.md troubleshooting note to delegation guide
Add a row to the troubleshooting table telling board operators to
verify the CEO's AGENTS.md instructions file contains delegation
directives. Without these instructions, the CEO won't delegate.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 22:16:45 -07:00
Devin Foley
face20430b docs: add board-operator delegation guide
Create docs/guides/board-operator/delegation.md explaining the full
CEO-led delegation lifecycle from the board operator's perspective.
Covers what the board needs to do, what the CEO automates, common
delegation patterns (flat, 3-level, hire-on-demand), and a
troubleshooting section that directly answers the #1 new-user
confusion point: "Do I have to tell the CEO to delegate?"

Also adds a Delegation section to core-concepts.md and wires the
new guide into docs.json navigation after Managing Tasks.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 20:11:38 -07:00
Dotta
af5b980362 Merge pull request #1857 from paperclipai/PAP-878-create-a-mine-tab-in-inbox
Add a Mine tab and archive flow to inbox
2026-03-26 16:21:47 -05:00
dotta
2e563ccd50 Move unread/archive column to the left for non-issue inbox items
Repositions the unread dot and archive X button to the leading
(left) side of approval, failed run, and join request rows,
matching the visual alignment of IssueRow where the unread slot
appears first due to CSS flex ordering.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 16:09:43 -05:00
dotta
2c406d3b8c Extend read/dismissed functionality to all inbox item types
Approvals, failed runs, and join requests now have the same
unread dot + archive X pattern as issues in the Mine tab:
- Click the blue dot to mark as read, then X appears on hover
- Desktop: animated dismiss with scale/slide transition
- Mobile: swipe-to-archive via SwipeToArchive wrapper
- Dismissed items are filtered out of Mine tab
- Badge count excludes dismissed approvals and join requests
- localStorage-backed read/dismiss state for non-issue items

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 16:09:43 -05:00
dotta
49c7fb7fbd Unify unread badge and archive X into single column on Mine tab
The unread dot and dismiss X now share the same rightmost column on
the Mine tab.  When an issue is unread the blue dot shows first;
clicking it marks the issue as read and reveals the X on hover for
archiving.  Read/unread state stays in sync across all inbox tabs.
Desktop dismiss animation polished with scale + slide.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-26 16:09:43 -05:00
dotta
995f5b0b66 Add the inbox mine tab and archive flow
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 16:09:43 -05:00
Dotta
b34fa3b273 Merge pull request #1834 from paperclipai/fix/project-description-mentions
fix: improve embedded Postgres bootstrap and worktree init
2026-03-26 12:43:22 -05:00
dotta
9ddf960312 Harden dev-watch excludes for nested UI outputs
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 12:35:19 -05:00
dotta
a8894799e4 Align worktree provision with worktree init
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 12:35:19 -05:00
dotta
76a692c260 Improve embedded Postgres bootstrap errors
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 12:35:19 -05:00
Dotta
5913706329 Merge pull request #1830 from paperclipai/pr/pap-891-ui-polish
ui: polish mentions, issue workspace details, and issue search
2026-03-26 12:02:28 -05:00
Dotta
b944293eda Merge pull request #1829 from paperclipai/pr/pap-891-worktree-reliability
fix(worktree): harden provisioned worktree isolation and test fallback behavior
2026-03-26 12:01:47 -05:00
dotta
3c1ebed539 test(worktree): address embedded postgres helper review feedback
- probe host support on every platform instead of special-casing darwin
- re-export the db package helper from server and cli tests

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:56:05 -05:00
dotta
ab0d04ff7a fix(ui): address workspace card review feedback
- restore pre-run workspace configuration visibility
- require explicit save/cancel for workspace edits
- stabilize debounced issue search callback

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:53:25 -05:00
Dotta
6073ac3145 Merge pull request #1827 from paperclipai/pr/pap-891-docs-refresh
docs: refresh adapter/runtime docs and deprecate bootstrapPromptTemplate
2026-03-26 11:46:44 -05:00
Dotta
3b329467eb Merge pull request #1828 from paperclipai/pr/pap-891-release-automation-followups
chore(release): publish @paperclipai/ui from release automation
2026-03-26 11:46:10 -05:00
Dotta
aa5b2be907 Merge pull request #1831 from paperclipai/pr/pap-891-opencode-headless-prompts
fix(opencode): support headless permission prompt configuration
2026-03-26 11:43:01 -05:00
Dotta
dcb66eeae7 Merge pull request #1812 from paperclipai/docs/maintenance-20260326-public
docs: documentation accuracy update 2026-03-26
2026-03-26 11:17:43 -05:00
dotta
874fe5ec7d Publish @paperclipai/ui from release automation
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:13:11 -05:00
dotta
c916626cef test: skip embedded postgres suites when initdb is unavailable
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:39 -05:00
dotta
555f026c24 Avoid sibling worktree port collisions
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:39 -05:00
dotta
e91da556ee Update README 2026-03-26 11:12:39 -05:00
dotta
ab82e3f022 Fix worktree runtime isolation recovery
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:39 -05:00
dotta
c74cda1851 Fix worktree provision isolation
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:39 -05:00
dotta
fcf3ba6974 Seed Paperclip env in provisioned worktrees
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:39 -05:00
dotta
ed62d58cb2 Fix headless OpenCode permission prompts
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:35 -05:00
dotta
dd8c1ca3b2 Speed up issues page search responsiveness
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:29 -05:00
dotta
5ee4cd98e8 feat: move workspace info from properties panel to issue main pane
Display workspace branch, path, and status in a card on the issue main pane
instead of in the properties sidebar. Only shown for non-default (isolated)
workspaces. Edit controls are hidden behind an Edit toggle button.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:29 -05:00
dotta
a6ca3a9418 fix: enable @-mention autocomplete in new project description editor
The MarkdownEditor in NewProjectDialog was not receiving mention options,
so typing @ in the description field did nothing. Added agents query and
mentionOptions prop to match how NewIssueDialog handles mentions.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:29 -05:00
dotta
0fd75aa579 fix: render mention autocomplete via portal to prevent overflow clipping
The mention suggestion dropdown was getting clipped when typing at the
end of a long description inside modals/dialogs because parent containers
had overflow-y-auto. Render it via createPortal to document.body with
fixed positioning and z-index 9999 so it always appears above all UI.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:29 -05:00
dotta
eaa765118f chore: mark bootstrapPromptTemplate as deprecated
Add @deprecated JSDoc and inline comments to bootstrapPromptTemplate
references in agent-instructions and company-portability services.
This field is superseded by the managed instructions bundle system.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:25 -05:00
dotta
ed73547fb6 docs: update SPEC work artifacts and deprecate bootstrapPromptTemplate
- SPEC: reflect that Paperclip now manages task-linked documents and
  attachments (issue documents, file attachments) instead of claiming
  it does not manage work artifacts
- agents-runtime: remove bootstrapPromptTemplate from recommended config,
  add deprecation notice, update minimal setup checklist

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:25 -05:00
dotta
692105e202 docs: update adapter list and repo map accuracy
- Add missing adapters (opencode_local, hermes_local, cursor, pi_local,
  openclaw_gateway) to agents-runtime.md
- Document bootstrapPromptTemplate in prompt templates section
- Update AGENTS.md repo map with packages/adapters, adapter-utils, plugins
- Fix troubleshooting section to reference all local CLI adapters

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:25 -05:00
dotta
01b550d61a docs: fix SPEC accuracy for adapters and backend
- align adapter list with current built-in adapters
- update backend framework references to Express
- remove outdated V1 not-supported template export claim
- clarify work artifact boundaries with issue documents

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 11:12:25 -05:00
Devin Foley
c6364149b1 Add delegation instructions to default CEO agent prompt (#1796)
New CEO agents created during onboarding now include explicit delegation
rules: triage tasks, route to CTO/CMO/UXDesigner, never do IC work, and
follow up on delegated work.
2026-03-26 08:11:22 -07:00
dotta
844b6dfd70 docs: update SPEC work artifacts and deprecate bootstrapPromptTemplate
- SPEC: reflect that Paperclip now manages task-linked documents and
  attachments (issue documents, file attachments) instead of claiming
  it does not manage work artifacts
- agents-runtime: remove bootstrapPromptTemplate from recommended config,
  add deprecation notice, update minimal setup checklist

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 07:23:09 -05:00
dotta
0a32e3838a fix: render mention autocomplete via portal to prevent overflow clipping
The mention suggestion dropdown was getting clipped when typing at the
end of a long description inside modals/dialogs because parent containers
had overflow-y-auto. Render it via createPortal to document.body with
fixed positioning and z-index 9999 so it always appears above all UI.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 07:22:24 -05:00
dotta
e186449f94 docs: update adapter list and repo map accuracy
- Add missing adapters (opencode_local, hermes_local, cursor, pi_local,
  openclaw_gateway) to agents-runtime.md
- Document bootstrapPromptTemplate in prompt templates section
- Update AGENTS.md repo map with packages/adapters, adapter-utils, plugins
- Fix troubleshooting section to reference all local CLI adapters

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 07:08:45 -05:00
dotta
4bb42005ea docs: fix SPEC accuracy for adapters and backend
- align adapter list with current built-in adapters
- update backend framework references to Express
- remove outdated V1 not-supported template export claim
- clarify work artifact boundaries with issue documents

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 07:08:45 -05:00
Dotta
66aa65f8f7 Merge pull request #1787 from HenkDz/fix/pi-adapter-models-stderr
fix(pi-local): parse models from stderr
2026-03-26 07:04:38 -05:00
HenkDz
15f6079c6b Fix Pi adapter execution and improve transcript parsing
- Changed from RPC mode to JSON print mode (--mode json -p)
- Added prompt as CLI argument instead of stdin RPC command
- Rewrote transcript parser to properly handle Pi's JSONL output
- Added toolUseId to tool_call entries for proper matching with tool_result
- Filter out RPC protocol messages from transcript
- Extract thinking blocks and usage statistics
2026-03-26 10:59:58 +01:00
Devin Foley
9e9eec9af6 ci: validate Dockerfile deps stage in PR policy (#1799)
* ci: add Dockerfile deps stage validation to PR policy

Checks that all workspace package.json files and the patches/
directory are copied into the Dockerfile deps stage. Prevents the
Docker build from breaking when new packages or patches are added
without updating the Dockerfile.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: scope Dockerfile check to deps stage and derive workspace roots

Address Greptile review feedback:
- Use awk to extract only the deps stage before grepping, preventing
  false positives from COPY lines in other stages
- Derive workspace search roots from pnpm-workspace.yaml instead of
  hardcoding them, so new top-level workspaces are automatically covered

* ci: guard against empty workspace roots in Dockerfile check

Fail early if pnpm-workspace.yaml parsing yields no search roots,
preventing a silent false-pass from find defaulting to cwd.

* ci: guard against empty deps stage extraction

Fail early with a clear error if awk cannot find the deps stage in the
Dockerfile, instead of producing misleading "missing COPY" errors.

* ci: deduplicate find results from overlapping workspace roots

Use sort -u instead of sort to prevent duplicate error messages when
nested workspace globs (e.g. packages/* and packages/adapters/*) cause
the same package.json to be found twice.

* ci: anchor grep to ^COPY to ignore commented-out Dockerfile lines

Prevents false negatives when a COPY directive is commented out
(e.g. # COPY packages/foo/package.json).

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-25 22:42:16 -07:00
Devin Foley
1a4ed8c953 Merge pull request #1794 from paperclipai/fix/cursor-native-auth-check
fix(cursor): check native auth before warning about missing API key
2026-03-25 22:00:37 -07:00
Devin Foley
bd60ea4909 refactor: use async fs.readFile in readCursorAuthInfo for consistency
Match the async pattern used by readCodexAuthInfo in the Codex adapter.
2026-03-25 21:52:38 -07:00
Devin Foley
6ebfc0ff3d Merge pull request #1782 from paperclipai/fix/codex-skill-injection-location
fix(codex): inject skills into ~/.codex/skills/ instead of workspace
2026-03-25 21:44:58 -07:00
Devin Foley
083d7c9ac4 fix(cursor): check native auth before warning about missing API key
When CURSOR_API_KEY is not set, check ~/.cursor/cli-config.json for
authInfo from `agent login` before emitting the missing key warning.
Users authenticated via native login no longer see a false warning.
2026-03-25 20:54:16 -07:00
Devin Foley
80766e589c Clarify docs: skills go to the effective CODEX_HOME, not ~/.codex
The previous documentation parenthetical "(defaulting to ~/.codex/skills/)"
was misleading because Paperclip almost always sets CODEX_HOME to a
per-company managed home.  Update index.ts docs, skills.ts detail string,
and execute.ts inline comment to make the runtime path unambiguous.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 20:46:05 -07:00
Devin Foley
c5c6c62bd7 Merge pull request #1786 from paperclipai/fix/opencode-disable-project-config
fix(opencode): prevent opencode.json pollution in workspace
2026-03-25 20:38:11 -07:00
Devin Foley
1549799c1e Move OPENCODE_DISABLE_PROJECT_CONFIG after envConfig loop
Setting the env var before the user-config loop meant adapter env
overrides could disable the guard.  Move it after the loop so it
always wins, matching the pattern already used in test.ts and
models.ts.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 20:29:48 -07:00
HenkDz
af1b08fdf4 fix(pi-local): parse models from stderr
Pi outputs the model list to stderr instead of stdout. This fix checks
stderr first and falls back to stdout for compatibility with older
versions.

Fixes model discovery returning empty arrays and environment tests
failing with 'Pi returned no models' error.
2026-03-26 01:30:54 +01:00
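The stderr-first fallback described in this commit can be sketched as below. This is an illustrative sketch only: `pickModelOutput` and `parseModels` are hypothetical helper names, not the pi-local adapter's actual API, and the one-model-per-line format is an assumption.

```typescript
// Hypothetical sketch of the stderr-first fallback: prefer stderr when
// it has content (Pi writes the model list there), else fall back to
// stdout for older Pi versions.
function pickModelOutput(stdout: string, stderr: string): string {
  return stderr.trim().length > 0 ? stderr : stdout;
}

// Assumed format: one model id per line; blank lines ignored.
function parseModels(raw: string): string[] {
  return raw
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}

const models = parseModels(pickModelOutput("", "model-a\nmodel-b\n"));
console.log(models); // → ["model-a", "model-b"]
```

Checking stderr first (rather than merging both streams) keeps behavior deterministic when both streams happen to contain text.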
Devin Foley
72bc4ab403 fix(opencode): prevent opencode.json config pollution in workspace
Set OPENCODE_DISABLE_PROJECT_CONFIG=true in all OpenCode invocations
(execute, model discovery, environment test) to stop the OpenCode CLI
from writing an opencode.json file into the project working directory.
Model selection is already passed via the --model CLI flag.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 17:22:49 -07:00
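The ordering concern behind this fix (and the follow-up that moves the env var after the envConfig loop) boils down to last-write-wins on the spawn environment. A minimal sketch, with illustrative names rather than the adapter's real code:

```typescript
// Illustrative sketch: apply user-provided adapter env overrides first,
// then force the guard afterward so an override can never disable it.
function buildSpawnEnv(
  base: Record<string, string>,
  userOverrides: Record<string, string>
): Record<string, string> {
  const env: Record<string, string> = { ...base };
  for (const [key, value] of Object.entries(userOverrides)) {
    env[key] = value; // the adapter's envConfig loop
  }
  // Set AFTER the loop: last write wins, so the guard always holds.
  env.OPENCODE_DISABLE_PROJECT_CONFIG = "true";
  return env;
}

const env = buildSpawnEnv({}, { OPENCODE_DISABLE_PROJECT_CONFIG: "false" });
console.log(env.OPENCODE_DISABLE_PROJECT_CONFIG); // → "true"
```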
Devin Foley
4c6b9c190b Fix stale reference to resolveCodexSkillsHome in fallback path
The default fallback in ensureCodexSkillsInjected still referenced the
old function name. Updated to use resolveCodexSkillsDir with shared
home as fallback.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 16:09:09 -07:00
Devin Foley
f6ac6e47c4 Clarify docs: skills go to $CODEX_HOME/skills/, defaulting to ~/.codex
Addresses Greptile P2 review comment.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 16:05:57 -07:00
Devin Foley
623ab1c3ea Fix skill injection to use effective CODEX_HOME, not shared home
The previous commit incorrectly used resolveSharedCodexHomeDir() (~/.codex)
but Codex runs with CODEX_HOME set to a per-company managed home under
~/.paperclip/instances/. Skills injected into ~/.codex/skills/ would not
be discoverable by Codex. Now uses effectiveCodexHome directly.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 16:04:53 -07:00
Devin Foley
eeec52ad74 Fix Codex skill injection to use ~/.codex/skills/ instead of cwd
The Codex adapter was the only one injecting skills into
<cwd>/.agents/skills/, polluting the project's git repo. All other
adapters (Gemini, Cursor, etc.) use a home-based directory. This
changes the Codex adapter to inject into ~/.codex/skills/ (resolved
via resolveSharedCodexHomeDir) to match the established pattern.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 15:55:51 -07:00
Dotta
db3883d2e7 Merge pull request #1748 from paperclipai/pr/pap-849-release-changelog
docs(release): add v2026.325.0 changelog
2026-03-25 07:53:38 -05:00
dotta
9637351880 docs(release): add v2026.325.0 changelog
Summarizes the v2026.325.0 release with highlights, fixes, upgrade notes, and contributor credits.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-25 07:42:38 -05:00
Devin Foley
cbca599625 Merge pull request #1363 from amit221/fix/issue-1255
fix(issues): normalize HTML entities in @mention tokens before agent lookup
2026-03-24 20:19:47 -07:00
Devin Foley
b1d12d2f37 Merge pull request #1730 from paperclipai/fix/docker-patches
fix(docker): copy patches directory into deps stage
2026-03-24 16:13:24 -07:00
Devin Foley
0a952dc93d fix(docker): copy patches directory into deps stage
pnpm install needs the patches/ directory to resolve patched
dependencies (embedded-postgres). Without it, --frozen-lockfile
fails with ENOENT on the patch file.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 15:59:36 -07:00
Devin Foley
ff8b839f42 Merge pull request #1658 from paperclipai/fix/migration-auto-apply-precedence
fix(server): check MIGRATION_AUTO_APPLY before MIGRATION_PROMPT
2026-03-24 15:56:00 -07:00
Devin Foley
fea892c8b3 Merge pull request #1702 from paperclipai/fix/codex-auth-check
fix(codex): check native auth before warning about missing API key
2026-03-24 15:40:20 -07:00
Devin Foley
1696ff0c3f fix(codex): use path.join for auth detail message path
Use path.join instead of string concatenation for the auth.json
fallback path in the detail message, ensuring correct path
separators on Windows.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 15:27:32 -07:00
Devin Foley
4eecd23ea3 fix(codex): use codexHomeDir() fallback for accurate auth detail path
When adapter config has no CODEX_HOME but process.env.CODEX_HOME is
set, readCodexAuthInfo reads from the process env path. The detail
message now uses codexHomeDir() instead of hardcoded "~/.codex" so
the displayed path always matches where credentials were read from.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 12:46:16 -07:00
Devin Foley
4da83296a9 test(codex): move OPENAI_API_KEY stub to beforeEach for all tests
Consolidate the env stub into beforeEach so the pre-existing cwd
test is also isolated from host OPENAI_API_KEY, avoiding
non-deterministic filesystem side effects.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 12:33:11 -07:00
Devin Foley
0ce4134ce1 fix(codex): use actual CODEX_HOME in auth detail message
Show the configured CODEX_HOME path instead of hardcoded ~/.codex
when the email fallback message is displayed.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 10:54:05 -07:00
Dotta
03f44d0089 Merge pull request #1708 from paperclipai/pr/pap-817-onboarding-goal-context
Seed onboarding project and issue goal context
2026-03-24 12:38:19 -05:00
Dotta
d38d5e1a7b Merge pull request #1710 from paperclipai/pr/pap-817-agent-mention-pill-alignment
Fix agent mention pill vertical misalignment with project mention pill
2026-03-24 12:33:51 -05:00
Dotta
add6ca5648 Merge pull request #1709 from paperclipai/pr/pap-817-join-request-task-assignment-grants
Preserve task assignment grants for joined agents
2026-03-24 12:33:40 -05:00
github-actions[bot]
04a07080af chore(lockfile): refresh pnpm-lock.yaml (#1712)
Co-authored-by: lockfile-bot <lockfile-bot@users.noreply.github.com>
2026-03-24 17:33:24 +00:00
Dotta
8bebc9599a Merge pull request #1707 from paperclipai/pr/pap-817-embedded-postgres-docker-initdb
Fix embedded Postgres initdb failure in Docker slim containers
2026-03-24 12:32:57 -05:00
Dotta
6250d536a0 Merge pull request #1706 from paperclipai/pr/pap-817-inline-join-requests-inbox
Render join requests inline in inbox like approvals and other work items
2026-03-24 12:30:43 -05:00
Dotta
de5985bb75 Merge pull request #1705 from paperclipai/pr/pap-817-remove-instructions-log
Remove noisy "Loaded agent instructions file" log from all adapters
2026-03-24 12:30:15 -05:00
Dotta
331e1f0d06 Merge pull request #1704 from paperclipai/pr/pap-817-cli-api-connection-errors
Improve CLI API connection errors
2026-03-24 12:30:06 -05:00
Devin Foley
58c511af9a test(codex): isolate auth tests from host OPENAI_API_KEY
Use vi.stubEnv to clear OPENAI_API_KEY in both new tests so they
don't silently pass the wrong branch when the key is set in the
test runner's environment.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 09:58:40 -07:00
dotta
4b668379bc Regenerate embedded-postgres vendor patch
Rebuild the patch file with valid unified-diff hunks so pnpm can
apply the locale and environment fixes during install.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:56:41 -05:00
dotta
f352f3f514 Force embedded-postgres messages locale to C
The vendor package still hardcoded LC_MESSAGES to en_US.UTF-8.
That locale is missing in slim containers, and initdb fails during
bootstrap even when --lc-messages=C is passed later.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:55:39 -05:00
dotta
4ff460de38 Fix embedded-postgres patch env lookup
Use globalThis.process.env in the vendor patch so the spawned child
process config does not trip over the local process binding inside
embedded-postgres.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:51:45 -05:00
Devin Foley
06b85d62b2 test(codex): add coverage for native auth detection in environment probe
Add tests for codex_native_auth_present and codex_openai_api_key_missing
code paths. Also pass adapter-configured CODEX_HOME through to
readCodexAuthInfo so the probe respects per-adapter home directories.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 09:51:44 -07:00
dotta
3447e2087a Fix agent mention pill vertical misalignment with project mention pill
Change vertical-align from baseline to middle on both editor and
read-only mention chip styles. The baseline alignment caused
inconsistent positioning because the agent ::before icon (0.75rem)
and project ::before dot (0.45rem) produced different synthesized
baselines in the inline-flex containers.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:49:00 -05:00
dotta
44fbf83106 Preserve task assignment grants for joined agents 2026-03-24 11:49:00 -05:00
dotta
eb73fc747a Seed onboarding project and issue goal context
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:48:59 -05:00
dotta
5602576ae1 Fix embedded Postgres initdb failure in Docker slim containers
The embedded-postgres library hardcodes --lc-messages=en_US.UTF-8 and
strips the parent process environment when spawning initdb/postgres.
In slim Docker images (e.g. node:20-bookworm-slim), the en_US.UTF-8
locale isn't installed, causing initdb to exit with code 1.

Two fixes applied:
1. Add --lc-messages=C to all initdbFlags arrays (overrides the
   library's hardcoded locale since our flags come after in the spread)
2. pnpm patch on embedded-postgres to preserve process.env in spawn
   calls, preventing loss of PATH, LD_LIBRARY_PATH, and other vars

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:48:59 -05:00
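Fix 1 above relies on getopt-style handling of repeated flags: when the same option appears twice in the argv, the later occurrence wins, so appending `--lc-messages=C` after the library's hardcoded flag overrides it. A small sketch of that ordering argument (flag arrays are illustrative, not the library's actual values):

```typescript
// The library's hardcoded flag comes first; our appended flag comes
// after it in the spread, so initdb sees the last occurrence.
const libraryFlags = ["--lc-messages=en_US.UTF-8"];
const ourInitdbFlags = ["--lc-messages=C"];
const argv = [...libraryFlags, ...ourInitdbFlags];

// Resolve which value a last-occurrence-wins parser would use.
function lastFlagValue(args: string[], flag: string): string | undefined {
  const hits = args.filter((a) => a.startsWith(flag + "="));
  return hits.length > 0 ? hits[hits.length - 1].split("=")[1] : undefined;
}

console.log(lastFlagValue(argv, "--lc-messages")); // → "C"
```

Fix 2 (preserving `process.env` in the spawn calls) is orthogonal: it keeps PATH and LD_LIBRARY_PATH intact rather than affecting flag resolution.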
dotta
c4838cca6e Render join requests inline in inbox like approvals and other work items
Join requests were displayed in a separate card-style section below the main
inbox list. This moves them into the unified work items feed so they sort
chronologically alongside issues, approvals, and failed runs—matching the
inline treatment hiring requests already receive.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:48:43 -05:00
dotta
67841a0c6d Remove noisy "Loaded agent instructions file" log from all adapters
Loading an instructions file is normal, expected behavior — not worth
logging to stdout/stderr on every run. Warning logs for failed reads
are preserved.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:48:43 -05:00
dotta
5561a9c17f Improve CLI API connection errors
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-24 11:48:43 -05:00
Devin Foley
a9dcea023b fix(codex): check native auth before warning about missing API key
The environment test warned about OPENAI_API_KEY being unset even
when Codex was authenticated via `codex auth`. Now checks
~/.codex/auth.json before emitting the warning.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-24 09:47:48 -07:00
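The check order this commit introduces can be sketched as a small async helper: only warn when both the API key and the native credentials file are absent. This is a hedged sketch under assumed names; the real adapter uses its own helpers (`readCodexAuthInfo`, `codexHomeDir`) and richer probe codes.

```typescript
import { promises as fs } from "node:fs";
import os from "node:os";
import path from "node:path";

// Sketch: warn about a missing OPENAI_API_KEY only when no native
// `codex auth` credentials (auth.json) exist in the Codex home either.
async function shouldWarnMissingOpenAiKey(
  apiKey: string | undefined,
  codexHome: string = path.join(os.homedir(), ".codex")
): Promise<boolean> {
  if (apiKey) return false; // key present: nothing to warn about
  try {
    await fs.readFile(path.join(codexHome, "auth.json"), "utf8");
    return false; // native login credentials found: suppress the warning
  } catch {
    return true; // no key and no auth.json: warn
  }
}
```

A later commit in this range refines the same idea by making the displayed path follow the effective CODEX_HOME rather than a hardcoded `~/.codex`.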
amit221
14ffbe30a0 test(issues): shorten mid-token entity test comment
Made-with: Cursor
2026-03-24 15:39:59 +02:00
amit221
98a5e287ef test(issues): document Greptile mid-token case vs old strip behavior
Made-with: Cursor
2026-03-24 15:29:28 +02:00
amit221
2735ef1f4a fix(issues): decode @mention entities without lockfile or new deps
- Drop entities package (CI blocks pnpm-lock.yaml on PRs; reset lockfile to master)
- Restore numeric + allowlisted named entity decoding in issues.ts
- Split Greptile mid-token &amp; case into its own test with review comment

Made-with: Cursor
2026-03-24 15:22:21 +02:00
amit221
53f0988006 Merge origin/master into fix/issue-1255
- findMentionedAgents: keep normalizeAgentMentionToken + extractAgentMentionIds
- decode @mention tokens with entities.decodeHTMLStrict (full HTML entities)
- Add entities dependency; expand unit tests for Greptile follow-ups

Made-with: Cursor
2026-03-24 10:03:15 +02:00
amit221
730a67bb20 fix(issues): decode HTML entities in @mention tokens instead of stripping
Addresses Greptile review on PR #1363: numeric entities decode via
code points; named entities use a small allowlist (amp, nbsp, etc.)
so M&amp;M resolves correctly; unknown named entities are preserved.

Adds mid-token tests for &amp; in agent names.

Made-with: Cursor
2026-03-24 09:40:55 +02:00
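The decoding strategy this commit describes (numeric entities via code points, a small named-entity allowlist, unknown named entities preserved) can be sketched as follows. This is not the actual `issues.ts` implementation; the allowlist contents and regex are illustrative.

```typescript
// Small named-entity allowlist; unknown named entities pass through.
const NAMED: Record<string, string> = {
  amp: "&", nbsp: "\u00a0", lt: "<", gt: ">", quot: '"',
};

function decodeMentionEntities(token: string): string {
  return token.replace(/&(#x?[0-9a-fA-F]+|[a-zA-Z]+);/g, (match, body: string) => {
    if (body.startsWith("#")) {
      // Numeric entity: decimal (&#64;) or hex (&#x40;) code point.
      const hex = body[1] === "x";
      const code = parseInt(body.slice(hex ? 2 : 1), hex ? 16 : 10);
      return Number.isNaN(code) ? match : String.fromCodePoint(code);
    }
    // Named entity: decode only if allowlisted, else preserve as-is.
    return body in NAMED ? NAMED[body] : match;
  });
}

console.log(decodeMentionEntities("M&amp;M"));  // → "M&M"
console.log(decodeMentionEntities("&#64;dev")); // → "@dev"
console.log(decodeMentionEntities("&bogus;"));  // → "&bogus;"
```

Preserving unknown named entities is what keeps agent names containing a literal `&something;` from being silently mangled, which was the risk with the earlier strip-based approach.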
Devin Foley
59e29afab5 Merge pull request #1672 from paperclipai/fix/docker-plugin-sdk
fix(docker): add plugin-sdk to Dockerfile build
2026-03-23 20:05:56 -07:00
Devin Foley
fd4df4db48 fix(docker): add plugin-sdk to Dockerfile build
The plugin framework landed without updating the Dockerfile. The
server now imports @paperclipai/plugin-sdk, so the deps stage needs
its package.json for install and the build stage needs to compile
it before building the server.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 19:58:59 -07:00
Devin Foley
8ae954bb8f Merge pull request #1666 from paperclipai/chore/pr-template
chore: add GitHub PR template
2026-03-23 19:53:45 -07:00
Dotta
32c76e0012 Merge pull request #1670 from paperclipai/pr/pap-803-mention-aware-link-node
Extract mention-aware link node helper and add tests
2026-03-23 21:33:49 -05:00
Dotta
70bd55a00f Merge pull request #1669 from paperclipai/pr/pap-803-agent-instructions-tab-reset
Fix instructions tab state on agent switch
2026-03-23 21:27:51 -05:00
Dotta
f92d2c3326 Merge pull request #1668 from paperclipai/pr/pap-803-imported-agent-frontmatter
Fix imported agent bundle frontmatter leakage
2026-03-23 21:27:42 -05:00
dotta
a3f4e6f56c Preserve prompts panel width on agent switch
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 20:58:19 -05:00
dotta
08bdc3d28e Handle nested imported AGENTS edge case
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 20:56:34 -05:00
dotta
7c54b6e9e3 Extract mention-aware link node helper and add tests
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 20:46:19 -05:00
dotta
a346ad2a73 Fix instructions tab state on agent switch
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 20:45:11 -05:00
dotta
e4e5b61596 Fix imported agent bundle frontmatter leakage
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 20:43:20 -05:00
Dotta
eeb7e1a91a Merge pull request #1655 from paperclipai/pr/pap-795-company-portability
feat(portability): improve company import and export flow
2026-03-23 19:45:05 -05:00
Dotta
f2637e6972 Merge pull request #1654 from paperclipai/pr/pap-795-agent-runtime
fix(runtime): improve agent recovery and heartbeat operations
2026-03-23 19:44:51 -05:00
dotta
87b3cacc8f Address valid Greptile portability follow-ups 2026-03-23 19:42:58 -05:00
github-actions[bot]
4096db8053 chore(lockfile): refresh pnpm-lock.yaml (#1667)
Co-authored-by: lockfile-bot <lockfile-bot@users.noreply.github.com>
2026-03-24 00:29:19 +00:00
Dotta
fa084e1a16 Merge pull request #1653 from paperclipai/pr/pap-795-ui-polish
fix(ui): polish issue and agent surfaces
2026-03-23 19:28:50 -05:00
dotta
22067c7d1d revert: drop PR workflow lockfile refresh 2026-03-23 19:26:33 -05:00
dotta
85d2c54d53 fix(ci): refresh lockfile in PR jobs 2026-03-23 19:23:10 -05:00
Devin Foley
5222a49cc3 chore: expand thinking path placeholder for depth
Address Greptile feedback — the sparse 3-line placeholder could
lead to shallow thinking paths. Expanded to 6 lines with guiding
brackets and added "Aim for 5–8 steps" hint in the comment.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 17:12:07 -07:00
Devin Foley
36574bd9c6 chore: add GitHub PR template
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 17:08:04 -07:00
dotta
2cc2d4420d Remove lockfile changes from UI polish PR
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 19:03:50 -05:00
Dotta
7576c5ecbc Update ui/src/pages/Auth.tsx
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-03-23 19:00:34 -05:00
Devin Foley
dd1d9bed80 fix(server): check MIGRATION_AUTO_APPLY before MIGRATION_PROMPT
PAPERCLIP_MIGRATION_PROMPT=never was checked before
PAPERCLIP_MIGRATION_AUTO_APPLY=true, causing auto-apply to never
trigger when both env vars are set (as in dev:watch). Swap the
check order so AUTO_APPLY takes precedence.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:14:14 -07:00
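The precedence swap in this fix reduces to check order in a small decision function. A sketch under assumed semantics (the real server code's return values and surrounding logic differ):

```typescript
type MigrationAction = "auto-apply" | "skip" | "prompt";

// Check AUTO_APPLY before PROMPT so that AUTO_APPLY=true still
// triggers when PROMPT=never is also set (the dev:watch combination).
function resolveMigrationAction(
  env: Record<string, string | undefined>
): MigrationAction {
  if (env.PAPERCLIP_MIGRATION_AUTO_APPLY === "true") return "auto-apply";
  if (env.PAPERCLIP_MIGRATION_PROMPT === "never") return "skip";
  return "prompt";
}

console.log(resolveMigrationAction({
  PAPERCLIP_MIGRATION_AUTO_APPLY: "true",
  PAPERCLIP_MIGRATION_PROMPT: "never",
})); // → "auto-apply"
```

With the pre-fix ordering (PROMPT checked first), the same input would return "skip", which is exactly the bug the commit describes.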
dotta
92c29f27c3 Address Greptile review on portability PR
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 17:23:59 -05:00
dotta
6960ab1106 Address Greptile review on UI polish PR
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 17:16:10 -05:00
dotta
c3f4e18a5e Keep sidebar ordering with portability branch
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 17:02:43 -05:00
dotta
a3f568dec7 Improve generated company org chart assets
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:58:07 -05:00
dotta
6f1ce3bd60 Document imported heartbeat defaults
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:58:07 -05:00
dotta
159c5b4360 Preserve sidebar order in company portability
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:59 -05:00
dotta
b5fde733b0 Open imported company after import
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:39 -05:00
dotta
f9927bdaaa Disable imported timer heartbeats
Prevent company imports from re-enabling scheduler heartbeats on imported agents and cover both new-company and existing-company import flows in portability tests.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:39 -05:00
dotta
dcead97650 Fix company zip imports
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:39 -05:00
dotta
9786ebb7ba Revert "Add companies.sh import wrapper"
This reverts commit 17876ec1dc65a9150488874d79fc2fcc087c13ae.
2026-03-23 16:57:39 -05:00
dotta
66d84ccfa3 Add companies.sh import wrapper
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
56a39fea3d Add importing & exporting company guide
Documents the `paperclipai company export` and `paperclipai company import`
CLI commands, covering package format, all options, target modes, collision
strategies, GitHub sources, interactive selection, and API endpoints.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:38 -05:00
dotta
2a6e1cf1fc Fix imported GitHub skill file paths
Normalize GitHub skill directories for blob/file imports and when reading legacy stored metadata so imported SKILL.md files resolve correctly.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
c02dc73d3c Confirm company imports after preview
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
06f5632d1a Polish import adapter defaults
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
1246ccf250 Add nested import picker
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
a339b488ae fix: dedupe company skill inventory refreshes
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
ac376d0e5e Add TUI import summaries
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
220946b2a1 Default recurring task exports to checked
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
c41dd2e393 Reduce portability warning fan-out
Infer portable repo metadata from local git workspaces when repoUrl is missing, and collapse repeated task workspace export warnings into a single summary per missing workspace.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
2e76a2a554 Add routine support to recurring task portability
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:38 -05:00
dotta
8fa4b6a5fb added a script to generate company assets 2026-03-23 16:57:38 -05:00
dotta
d8b408625e fix providers 2026-03-23 16:57:38 -05:00
dotta
d73c8df895 fix: improve pill contrast by using WCAG contrast ratios on composited backgrounds
Pills with semi-transparent backgrounds were using raw color luminance to pick
text color, ignoring the page background showing through. This caused unreadable
text on dark themes for mid-luminance colors like orange. Now composites the
rgba background over the actual page bg (dark/light) before computing WCAG
contrast ratios, and centralizes the logic in a shared color-contrast utility.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:27 -05:00
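The compositing approach this commit describes can be sketched as below. This is a hedged illustration of the technique, not the shared color-contrast utility itself; all names are hypothetical, and the formulas are the standard WCAG relative-luminance and contrast-ratio definitions.

```typescript
// Sketch: composite a semi-transparent pill background over the opaque
// page background first, then pick text color by WCAG contrast ratio.
type RGB = { r: number; g: number; b: number };

// Alpha-composite a foreground color over an opaque background.
function composite(fg: RGB, alpha: number, bg: RGB): RGB {
  const mix = (f: number, b: number) => Math.round(f * alpha + b * (1 - alpha));
  return { r: mix(fg.r, bg.r), g: mix(fg.g, bg.g), b: mix(fg.b, bg.b) };
}

// WCAG relative luminance of an sRGB color.
function luminance({ r, g, b }: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// WCAG contrast ratio between two opaque colors.
function contrast(a: RGB, b: RGB): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Choose text color against the *effective* (composited) background,
// not the raw rgba color — the fix this commit describes.
function pillTextColor(pillBg: RGB, pillAlpha: number, pageBg: RGB): "black" | "white" {
  const effective = composite(pillBg, pillAlpha, pageBg);
  const white: RGB = { r: 255, g: 255, b: 255 };
  const black: RGB = { r: 0, g: 0, b: 0 };
  return contrast(effective, white) >= contrast(effective, black) ? "white" : "black";
}
```

For a mid-luminance color like orange at low alpha, the composited result is dark on a dark theme and light on a light theme, so the chosen text color flips correctly instead of being driven by the raw rgba value.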
dotta
e73bc81a73 fix: prevent documents row from causing horizontal scroll on mobile
- Shorten button labels ("New document" → "New", "Upload attachment" → "Upload") on small screens
- Add min-w-0 and shrink-0 to flex containers and items to prevent overflow
- Truncate document revision text on narrow viewports
- Mark chevron, key badge, and action buttons as shrink-0

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
0b960b0739 Suppress same-page issue toasts
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
bdecb1bad2 Sort agents alphabetically by name in all views
Sort the flat list view, org tree view, and sidebar agents list
alphabetically by name using localeCompare.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
e61f00d4c1 Add missing data-slot="toggle" to Routines toggle buttons
The initial mobile toggle fix (afc3d7ec) missed 2 toggle switches in
RoutineDetail.tsx and Routines.tsx. Without the attribute, these toggles
would still inflate to 44px on touch devices via the @media (pointer: coarse)
rule.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
42c8d9b660 Fix oversized toggle switches on mobile
The global @media (pointer: coarse) rule was forcing min-height: 44px on
toggle button elements, inflating them from 20px to 44px. Added
data-slot="toggle" to all 10 toggle buttons and a CSS override to reset
their min-height, keeping the parent row as the touch target.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
bd0b76072b Fix atomic markdown mention deletion
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
db42adf1bf Make agent instructions tab responsive on mobile
On mobile, the two-panel file browser + editor layout now stacks
vertically with a toggleable file panel. The draggable separator is
hidden, and selecting a file auto-closes the panel to maximize
editor space.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:27 -05:00
dotta
0e8e162cd5 Fix mention pills by allowing custom URL schemes in Lexical LinkNode
The previous fix (validateUrl on linkPlugin) only affected the link dialog,
not the markdown-to-Lexical import path. Lexical's LinkNode.sanitizeUrl()
converts agent:// and project:// URLs to about:blank because they aren't
in its allowlist. Override the prototype method to preserve these schemes
so mention chips render correctly.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
49ace2faf9 Allow custom markdown mention links in editor
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
8232456ce8 Fix markdown mention chips
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
dotta
cd7c6ee751 Fix login form not being detected by 1Password
Add name, id, and htmlFor attributes to form inputs and a method/action
to the form element so password managers can properly identify the login
form fields.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:27 -05:00
dotta
f8dd4dcb30 Reduce monospace font size from 1.1em to 1em
1.1em was too large per feedback — settle on 1em for all markdown
monospace contexts.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:27 -05:00
dotta
0b9f00346b Increase monospace font size and add dark mode background for inline code
Bump monospace font-size from 0.78em to 1.1em across all markdown
contexts (editor, code blocks, inline code). Add subtle gray
background (#ffffff0f) for inline code in dark mode.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 16:57:27 -05:00
dotta
ef0846e723 Remove priority icon from issue rows across the app
Priority is still supported as a feature (editable in issue properties,
used in filters), but no longer shown prominently in every issue row.
Affects inbox, issues list, my issues, and dashboard pages.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-23 16:57:27 -05:00
Paperclip Dev
47449152ac fix(issues): normalize HTML entities in @mention tokens before agent lookup (#1255)
Rich-text comments store entities like &#x20; after @names; strip them before matching agents so issue_comment_mentioned and wake injection work.

Made-with: Cursor
2026-03-20 16:38:55 +00:00
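The entity normalization this commit describes can be sketched roughly as follows. The helper names and the mention regex are illustrative assumptions, not the actual matcher; the point is only that numeric character references are decoded before tokens are matched against agent names.

```typescript
// Sketch: decode numeric character references (hex and decimal forms,
// e.g. &#x20; and &#32;) that rich-text comments leave glued to @names.
function decodeNumericEntities(text: string): string {
  return text
    .replace(/&#x([0-9a-fA-F]+);/g, (_, hex) => String.fromCodePoint(parseInt(hex, 16)))
    .replace(/&#(\d+);/g, (_, dec) => String.fromCodePoint(parseInt(dec, 10)));
}

// Normalize first, then extract mention tokens for agent lookup.
function extractMentions(comment: string): string[] {
  const normalized = decodeNumericEntities(comment);
  return [...normalized.matchAll(/@([\w-]+)/g)].map((m) => m[1]);
}
```

After normalization, `&#x20;` becomes a plain space, so the mention token ends cleanly at the agent name and lookup succeeds.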
171 changed files with 24727 additions and 2185 deletions

.github/PULL_REQUEST_TEMPLATE.md vendored Normal file

@@ -0,0 +1,49 @@
## Thinking Path
<!--
Required. Trace your reasoning from the top of the project down to this
specific change. Start with what Paperclip is, then narrow through the
subsystem, the problem, and why this PR exists. Use blockquote style.
Aim for 5–8 steps. See CONTRIBUTING.md for full examples.
-->
> - Paperclip orchestrates AI agents for zero-human companies
> - [Which subsystem or capability is involved]
> - [What problem or gap exists]
> - [Why it needs to be addressed]
> - This pull request ...
> - The benefit is ...
## What Changed
<!-- Bullet list of concrete changes. One bullet per logical unit. -->
-
## Verification
<!--
How can a reviewer confirm this works? Include test commands, manual
steps, or both. For UI changes, include before/after screenshots.
-->
-
## Risks
<!--
What could go wrong? Mention migration safety, breaking changes,
behavioral shifts, or "Low risk" if genuinely minor.
-->
-
## Checklist
- [ ] I have included a thinking path that traces from project context to this change
- [ ] I have run tests locally and they pass
- [ ] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after screenshots
- [ ] I have updated relevant documentation to reflect my changes
- [ ] I have considered and documented any risks above
- [ ] I will address all Greptile and reviewer comments before requesting merge


@@ -40,6 +40,46 @@ jobs:
with:
node-version: 24
- name: Validate Dockerfile deps stage
run: |
missing=0
# Extract only the deps stage from the Dockerfile
deps_stage="$(awk '/^FROM .* AS deps$/{found=1; next} found && /^FROM /{exit} found{print}' Dockerfile)"
if [ -z "$deps_stage" ]; then
echo "::error::Could not extract deps stage from Dockerfile (expected 'FROM ... AS deps')"
exit 1
fi
# Derive workspace search roots from pnpm-workspace.yaml (exclude dev-only packages)
search_roots="$(grep '^ *- ' pnpm-workspace.yaml | sed 's/^ *- //' | sed 's/\*$//' | grep -v 'examples' | grep -v 'create-paperclip-plugin' | tr '\n' ' ')"
if [ -z "$search_roots" ]; then
echo "::error::Could not derive workspace roots from pnpm-workspace.yaml"
exit 1
fi
# Check all workspace package.json files are copied in the deps stage
for pkg in $(find $search_roots -maxdepth 2 -name package.json -not -path '*/examples/*' -not -path '*/create-paperclip-plugin/*' -not -path '*/node_modules/*' 2>/dev/null | sort -u); do
dir="$(dirname "$pkg")"
if ! echo "$deps_stage" | grep -q "^COPY ${dir}/package.json"; then
echo "::error::Dockerfile deps stage missing: COPY ${pkg} ${dir}/"
missing=1
fi
done
# Check patches directory is copied if it exists
if [ -d patches ] && ! echo "$deps_stage" | grep -q '^COPY patches/'; then
echo "::error::Dockerfile deps stage missing: COPY patches/ patches/"
missing=1
fi
if [ "$missing" -eq 1 ]; then
echo "Dockerfile deps stage is out of sync. Update it to include the missing files."
exit 1
fi
- name: Validate dependency resolution when manifests change
run: |
changed="$(git diff --name-only "${{ github.event.pull_request.base.sha }}" "${{ github.event.pull_request.head.sha }}")"


@@ -26,6 +26,9 @@ Before making changes, read in this order:
- `ui/`: React + Vite board UI
- `packages/db/`: Drizzle schema, migrations, DB clients
- `packages/shared/`: shared types, constants, validators, API path constants
- `packages/adapters/`: agent adapter implementations (Claude, Codex, Cursor, etc.)
- `packages/adapter-utils/`: shared adapter utilities
- `packages/plugins/`: plugin system packages
- `doc/`: operational and product docs
## 4. Dev Setup (Auto DB)


@@ -20,6 +20,8 @@ COPY packages/adapters/gemini-local/package.json packages/adapters/gemini-local/
COPY packages/adapters/openclaw-gateway/package.json packages/adapters/openclaw-gateway/
COPY packages/adapters/opencode-local/package.json packages/adapters/opencode-local/
COPY packages/adapters/pi-local/package.json packages/adapters/pi-local/
COPY packages/plugins/sdk/package.json packages/plugins/sdk/
COPY patches/ patches/
RUN pnpm install --frozen-lockfile
@@ -28,6 +30,7 @@ WORKDIR /app
COPY --from=deps /app /app
COPY . .
RUN pnpm --filter @paperclipai/ui build
RUN pnpm --filter @paperclipai/plugin-sdk build
RUN pnpm --filter @paperclipai/server build
RUN test -f server/dist/index.js || (echo "ERROR: server build output missing" && exit 1)


@@ -234,16 +234,27 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
## Roadmap
- ⚪ Get OpenClaw onboarding easier
- Get cloud agents working e.g. Cursor / e2b agents
- ⚪ ClipMart - buy and sell entire agent companies
- Easy agent configurations / easier to understand
- ⚪ Better support for harness engineering
- 🟢 Plugin system (e.g. if you want to add a knowledgebase, custom tracing, queues, etc)
- Better docs
- ✅ Plugin system (e.g. add a knowledge base, custom tracing, queues, etc)
- Get OpenClaw / claw-style agent employees
- ✅ companies.sh - import and export entire organizations
- Easy AGENTS.md configurations
- ✅ Skills Manager
- ✅ Scheduled Routines
- Better Budgeting
- ⚪ Artifacts & Deployments
- ⚪ CEO Chat
- ⚪ MAXIMIZER MODE
- ⚪ Multiple Human Users
- ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)
- ⚪ Cloud deployments
- ⚪ Desktop App
<br/>
## Community & Plugins
Find Plugins and more at [awesome-paperclip](https://github.com/gsxdsm/awesome-paperclip)
## Contributing
We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for details.


@@ -1,37 +1,20 @@
import { execFile, spawn } from "node:child_process";
import { mkdirSync, mkdtempSync, readFileSync, rmSync, writeFileSync } from "node:fs";
import { mkdirSync, mkdtempSync, readFileSync, readdirSync, rmSync, writeFileSync } from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { promisify } from "node:util";
import { afterAll, beforeAll, describe, expect, it } from "vitest";
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { createStoredZipArchive } from "./helpers/zip.js";
const execFileAsync = promisify(execFile);
type ServerProcess = ReturnType<typeof spawn>;
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
@@ -52,30 +35,13 @@ async function getAvailablePort(): Promise<number> {
});
}
async function startTempDatabase() {
const dataDir = mkdtempSync(path.join(os.tmpdir(), "paperclip-company-cli-db-"));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C"],
onLog: () => {},
onError: () => {},
});
await instance.initialise();
await instance.start();
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
const { applyPendingMigrations, ensurePostgresDatabase } = await import("@paperclipai/db");
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
await applyPendingMigrations(connectionString);
return { connectionString, dataDir, instance };
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres company import/export e2e tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
function writeTestConfig(configPath: string, tempRoot: string, port: number, connectionString: string) {
@@ -182,6 +148,19 @@ function createCliEnv() {
return env;
}
function collectTextFiles(root: string, current: string, files: Record<string, string>) {
for (const entry of readdirSync(current, { withFileTypes: true })) {
const absolutePath = path.join(current, entry.name);
if (entry.isDirectory()) {
collectTextFiles(root, absolutePath, files);
continue;
}
if (!entry.isFile()) continue;
const relativePath = path.relative(root, absolutePath).replace(/\\/g, "/");
files[relativePath] = readFileSync(absolutePath, "utf8");
}
}
async function stopServerProcess(child: ServerProcess | null) {
if (!child || child.exitCode !== null) return;
child.kill("SIGTERM");
@@ -251,26 +230,23 @@ async function waitForServer(
);
}
describe("paperclipai company import/export e2e", () => {
describeEmbeddedPostgres("paperclipai company import/export e2e", () => {
let tempRoot = "";
let configPath = "";
let exportDir = "";
let apiBase = "";
let serverProcess: ServerProcess | null = null;
let dbDataDir = "";
let dbInstance: EmbeddedPostgresInstance | null = null;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-company-cli-e2e-"));
configPath = path.join(tempRoot, "config", "config.json");
exportDir = path.join(tempRoot, "exported-company");
const db = await startTempDatabase();
dbDataDir = db.dataDir;
dbInstance = db.instance;
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-company-cli-db-");
const port = await getAvailablePort();
writeTestConfig(configPath, tempRoot, port, db.connectionString);
writeTestConfig(configPath, tempRoot, port, tempDb.connectionString);
apiBase = `http://127.0.0.1:${port}`;
const repoRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../../..");
@@ -280,7 +256,7 @@ describe("paperclipai company import/export e2e", () => {
["paperclipai", "run", "--config", configPath],
{
cwd: repoRoot,
env: createServerEnv(configPath, port, db.connectionString),
env: createServerEnv(configPath, port, tempDb.connectionString),
stdio: ["ignore", "pipe", "pipe"],
},
);
@@ -297,10 +273,7 @@ describe("paperclipai company import/export e2e", () => {
afterAll(async () => {
await stopServerProcess(serverProcess);
await dbInstance?.stop();
if (dbDataDir) {
rmSync(dbDataDir, { recursive: true, force: true });
}
await tempDb?.cleanup();
if (tempRoot) {
rmSync(tempRoot, { recursive: true, force: true });
}
@@ -345,6 +318,8 @@ describe("paperclipai company import/export e2e", () => {
},
);
const largeIssueDescription = `Round-trip the company package through the CLI.\n\n${"portable-data ".repeat(12_000)}`;
const sourceIssue = await api<{ id: string; title: string; identifier: string }>(
apiBase,
`/api/companies/${sourceCompany.id}/issues`,
@@ -353,7 +328,7 @@ describe("paperclipai company import/export e2e", () => {
headers: { "content-type": "application/json" },
body: JSON.stringify({
title: "Validate company import/export",
description: "Round-trip the company package through the CLI.",
description: largeIssueDescription,
status: "todo",
projectId: sourceProject.id,
assigneeAgentId: sourceAgent.id,
@@ -397,6 +372,7 @@ describe("paperclipai company import/export e2e", () => {
`Imported ${sourceCompany.name}`,
"--include",
"company,agents,projects,issues",
"--yes",
],
{ apiBase, configPath },
);
@@ -470,6 +446,7 @@ describe("paperclipai company import/export e2e", () => {
"company,agents,projects,issues",
"--collision",
"rename",
"--yes",
],
{ apiBase, configPath },
);
@@ -494,5 +471,32 @@ describe("paperclipai company import/export e2e", () => {
expect(new Set(twiceImportedAgents.map((agent) => agent.name)).size).toBe(2);
expect(twiceImportedProjects).toHaveLength(2);
expect(twiceImportedIssues).toHaveLength(2);
const zipPath = path.join(tempRoot, "exported-company.zip");
const portableFiles: Record<string, string> = {};
collectTextFiles(exportDir, exportDir, portableFiles);
writeFileSync(zipPath, createStoredZipArchive(portableFiles, "paperclip-demo"));
const importedFromZip = await runCliJson<{
company: { id: string; name: string; action: string };
agents: Array<{ id: string | null; action: string; name: string }>;
}>(
[
"company",
"import",
zipPath,
"--target",
"new",
"--new-company-name",
`Zip Imported ${sourceCompany.name}`,
"--include",
"company,agents,projects,issues",
"--yes",
],
{ apiBase, configPath },
);
expect(importedFromZip.company.action).toBe("created");
expect(importedFromZip.agents.some((agent) => agent.action === "created")).toBe(true);
}, 60_000);
});


@@ -0,0 +1,44 @@
import { mkdtemp, rm, writeFile } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { resolveInlineSourceFromPath } from "../commands/client/company.js";
import { createStoredZipArchive } from "./helpers/zip.js";
const tempDirs: string[] = [];
afterEach(async () => {
for (const dir of tempDirs.splice(0)) {
await rm(dir, { recursive: true, force: true });
}
});
describe("resolveInlineSourceFromPath", () => {
it("imports portable files from a zip archive instead of scanning the parent directory", async () => {
const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-company-import-zip-"));
tempDirs.push(tempDir);
const archivePath = path.join(tempDir, "paperclip-demo.zip");
const archive = createStoredZipArchive(
{
"COMPANY.md": "# Company\n",
".paperclip.yaml": "schema: paperclip/v1\n",
"agents/ceo/AGENT.md": "# CEO\n",
"notes/todo.txt": "ignore me\n",
},
"paperclip-demo",
);
await writeFile(archivePath, archive);
const resolved = await resolveInlineSourceFromPath(archivePath);
expect(resolved).toEqual({
rootPath: "paperclip-demo",
files: {
"COMPANY.md": "# Company\n",
".paperclip.yaml": "schema: paperclip/v1\n",
"agents/ceo/AGENT.md": "# CEO\n",
},
});
});
});


@@ -1,5 +1,16 @@
import { describe, expect, it } from "vitest";
import { resolveCompanyImportApiPath } from "../commands/client/company.js";
import type { CompanyPortabilityPreviewResult } from "@paperclipai/shared";
import {
buildCompanyDashboardUrl,
buildDefaultImportAdapterOverrides,
buildDefaultImportSelectionState,
buildImportSelectionCatalog,
buildSelectedFilesFromImportSelection,
renderCompanyImportPreview,
renderCompanyImportResult,
resolveCompanyImportApplyConfirmationMode,
resolveCompanyImportApiPath,
} from "../commands/client/company.js";
describe("resolveCompanyImportApiPath", () => {
it("uses company-scoped preview route for existing-company dry runs", () => {
@@ -48,3 +59,529 @@ describe("resolveCompanyImportApiPath", () => {
).toThrow(/require a companyId/i);
});
});
describe("resolveCompanyImportApplyConfirmationMode", () => {
it("skips confirmation when --yes is set", () => {
expect(
resolveCompanyImportApplyConfirmationMode({
yes: true,
interactive: false,
json: false,
}),
).toBe("skip");
});
it("prompts in interactive text mode when --yes is not set", () => {
expect(
resolveCompanyImportApplyConfirmationMode({
yes: false,
interactive: true,
json: false,
}),
).toBe("prompt");
});
it("requires --yes for non-interactive apply", () => {
expect(() =>
resolveCompanyImportApplyConfirmationMode({
yes: false,
interactive: false,
json: false,
})
).toThrow(/non-interactive terminal requires --yes/i);
});
it("requires --yes for json apply", () => {
expect(() =>
resolveCompanyImportApplyConfirmationMode({
yes: false,
interactive: false,
json: true,
})
).toThrow(/with --json requires --yes/i);
});
});
describe("buildCompanyDashboardUrl", () => {
it("preserves the configured base path when building a dashboard URL", () => {
expect(buildCompanyDashboardUrl("https://paperclip.example/app/", "PAP")).toBe(
"https://paperclip.example/app/PAP/dashboard",
);
});
});
describe("renderCompanyImportPreview", () => {
it("summarizes the preview with counts, selection info, and truncated examples", () => {
const preview: CompanyPortabilityPreviewResult = {
include: {
company: true,
agents: true,
projects: true,
issues: true,
skills: true,
},
targetCompanyId: "company-123",
targetCompanyName: "Imported Co",
collisionStrategy: "rename",
selectedAgentSlugs: ["ceo", "cto", "eng-1", "eng-2", "eng-3", "eng-4", "eng-5"],
plan: {
companyAction: "update",
agentPlans: [
{ slug: "ceo", action: "create", plannedName: "CEO", existingAgentId: null, reason: null },
{ slug: "cto", action: "update", plannedName: "CTO", existingAgentId: "agent-2", reason: "replace strategy" },
{ slug: "eng-1", action: "skip", plannedName: "Engineer 1", existingAgentId: "agent-3", reason: "skip strategy" },
{ slug: "eng-2", action: "create", plannedName: "Engineer 2", existingAgentId: null, reason: null },
{ slug: "eng-3", action: "create", plannedName: "Engineer 3", existingAgentId: null, reason: null },
{ slug: "eng-4", action: "create", plannedName: "Engineer 4", existingAgentId: null, reason: null },
{ slug: "eng-5", action: "create", plannedName: "Engineer 5", existingAgentId: null, reason: null },
],
projectPlans: [
{ slug: "alpha", action: "create", plannedName: "Alpha", existingProjectId: null, reason: null },
],
issuePlans: [
{ slug: "kickoff", action: "create", plannedTitle: "Kickoff", reason: null },
],
},
manifest: {
schemaVersion: 1,
generatedAt: "2026-03-23T17:00:00.000Z",
source: {
companyId: "company-src",
companyName: "Source Co",
},
includes: {
company: true,
agents: true,
projects: true,
issues: true,
skills: true,
},
company: {
path: "COMPANY.md",
name: "Source Co",
description: null,
brandColor: null,
logoPath: null,
requireBoardApprovalForNewAgents: false,
},
sidebar: {
agents: ["ceo"],
projects: ["alpha"],
},
agents: [
{
slug: "ceo",
name: "CEO",
path: "agents/ceo/AGENT.md",
skills: [],
role: "ceo",
title: null,
icon: null,
capabilities: null,
reportsToSlug: null,
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
budgetMonthlyCents: 0,
metadata: null,
},
],
skills: [
{
key: "skill-a",
slug: "skill-a",
name: "Skill A",
path: "skills/skill-a/SKILL.md",
description: null,
sourceType: "inline",
sourceLocator: null,
sourceRef: null,
trustLevel: null,
compatibility: null,
metadata: null,
fileInventory: [],
},
],
projects: [
{
slug: "alpha",
name: "Alpha",
path: "projects/alpha/PROJECT.md",
description: null,
ownerAgentSlug: null,
leadAgentSlug: null,
targetDate: null,
color: null,
status: null,
executionWorkspacePolicy: null,
workspaces: [],
metadata: null,
},
],
issues: [
{
slug: "kickoff",
identifier: null,
title: "Kickoff",
path: "projects/alpha/issues/kickoff/TASK.md",
projectSlug: "alpha",
projectWorkspaceKey: null,
assigneeAgentSlug: "ceo",
description: null,
recurring: false,
routine: null,
legacyRecurrence: null,
status: null,
priority: null,
labelIds: [],
billingCode: null,
executionWorkspaceSettings: null,
assigneeAdapterOverrides: null,
metadata: null,
},
],
envInputs: [
{
key: "OPENAI_API_KEY",
description: null,
agentSlug: "ceo",
kind: "secret",
requirement: "required",
defaultValue: null,
portability: "portable",
},
],
},
files: {
"COMPANY.md": "# Source Co",
},
envInputs: [
{
key: "OPENAI_API_KEY",
description: null,
agentSlug: "ceo",
kind: "secret",
requirement: "required",
defaultValue: null,
portability: "portable",
},
],
warnings: ["One warning"],
errors: ["One error"],
};
const rendered = renderCompanyImportPreview(preview, {
sourceLabel: "GitHub: https://github.com/paperclipai/companies/demo",
targetLabel: "Imported Co (company-123)",
infoMessages: ["Using claude-local adapter"],
});
expect(rendered).toContain("Include");
expect(rendered).toContain("company, projects, tasks, agents, skills");
expect(rendered).toContain("7 agents total");
expect(rendered).toContain("1 project total");
expect(rendered).toContain("1 task total");
expect(rendered).toContain("skills: 1 skill packaged");
expect(rendered).toContain("+1 more");
expect(rendered).toContain("Using claude-local adapter");
expect(rendered).toContain("Warnings");
expect(rendered).toContain("Errors");
});
});
describe("renderCompanyImportResult", () => {
it("summarizes import results with created, updated, and skipped counts", () => {
const rendered = renderCompanyImportResult(
{
company: {
id: "company-123",
name: "Imported Co",
action: "updated",
},
agents: [
{ slug: "ceo", id: "agent-1", action: "created", name: "CEO", reason: null },
{ slug: "cto", id: "agent-2", action: "updated", name: "CTO", reason: "replace strategy" },
{ slug: "ops", id: null, action: "skipped", name: "Ops", reason: "skip strategy" },
],
projects: [
{ slug: "app", id: "project-1", action: "created", name: "App", reason: null },
{ slug: "ops", id: "project-2", action: "updated", name: "Operations", reason: "replace strategy" },
{ slug: "archive", id: null, action: "skipped", name: "Archive", reason: "skip strategy" },
],
envInputs: [],
warnings: ["Review API keys"],
},
{
targetLabel: "Imported Co (company-123)",
companyUrl: "https://paperclip.example/PAP/dashboard",
infoMessages: ["Using claude-local adapter"],
},
);
expect(rendered).toContain("Company");
expect(rendered).toContain("https://paperclip.example/PAP/dashboard");
expect(rendered).toContain("3 agents total (1 created, 1 updated, 1 skipped)");
expect(rendered).toContain("3 projects total (1 created, 1 updated, 1 skipped)");
expect(rendered).toContain("Agent results");
expect(rendered).toContain("Project results");
expect(rendered).toContain("Using claude-local adapter");
expect(rendered).toContain("Review API keys");
});
});
describe("import selection catalog", () => {
it("defaults to everything and keeps project selection separate from task selection", () => {
const preview: CompanyPortabilityPreviewResult = {
include: {
company: true,
agents: true,
projects: true,
issues: true,
skills: true,
},
targetCompanyId: "company-123",
targetCompanyName: "Imported Co",
collisionStrategy: "rename",
selectedAgentSlugs: ["ceo"],
plan: {
companyAction: "create",
agentPlans: [],
projectPlans: [],
issuePlans: [],
},
manifest: {
schemaVersion: 1,
generatedAt: "2026-03-23T18:00:00.000Z",
source: {
companyId: "company-src",
companyName: "Source Co",
},
includes: {
company: true,
agents: true,
projects: true,
issues: true,
skills: true,
},
company: {
path: "COMPANY.md",
name: "Source Co",
description: null,
brandColor: null,
logoPath: "images/company-logo.png",
requireBoardApprovalForNewAgents: false,
},
sidebar: {
agents: ["ceo"],
projects: ["alpha"],
},
agents: [
{
slug: "ceo",
name: "CEO",
path: "agents/ceo/AGENT.md",
skills: [],
role: "ceo",
title: null,
icon: null,
capabilities: null,
reportsToSlug: null,
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
budgetMonthlyCents: 0,
metadata: null,
},
],
skills: [
{
key: "skill-a",
slug: "skill-a",
name: "Skill A",
path: "skills/skill-a/SKILL.md",
description: null,
sourceType: "inline",
sourceLocator: null,
sourceRef: null,
trustLevel: null,
compatibility: null,
metadata: null,
fileInventory: [{ path: "skills/skill-a/helper.md", kind: "doc" }],
},
],
projects: [
{
slug: "alpha",
name: "Alpha",
path: "projects/alpha/PROJECT.md",
description: null,
ownerAgentSlug: null,
leadAgentSlug: null,
targetDate: null,
color: null,
status: null,
executionWorkspacePolicy: null,
workspaces: [],
metadata: null,
},
],
issues: [
{
slug: "kickoff",
identifier: null,
title: "Kickoff",
path: "projects/alpha/issues/kickoff/TASK.md",
projectSlug: "alpha",
projectWorkspaceKey: null,
assigneeAgentSlug: "ceo",
description: null,
recurring: false,
routine: null,
legacyRecurrence: null,
status: null,
priority: null,
labelIds: [],
billingCode: null,
executionWorkspaceSettings: null,
assigneeAdapterOverrides: null,
metadata: null,
},
],
envInputs: [],
},
files: {
"COMPANY.md": "# Source Co",
"README.md": "# Readme",
".paperclip.yaml": "schema: paperclip/v1\n",
"images/company-logo.png": {
encoding: "base64",
data: "",
contentType: "image/png",
},
"projects/alpha/PROJECT.md": "# Alpha",
"projects/alpha/notes.md": "project notes",
"projects/alpha/issues/kickoff/TASK.md": "# Kickoff",
"projects/alpha/issues/kickoff/details.md": "task details",
"agents/ceo/AGENT.md": "# CEO",
"agents/ceo/prompt.md": "prompt",
"skills/skill-a/SKILL.md": "# Skill A",
"skills/skill-a/helper.md": "helper",
},
envInputs: [],
warnings: [],
errors: [],
};
const catalog = buildImportSelectionCatalog(preview);
const state = buildDefaultImportSelectionState(catalog);
expect(state.company).toBe(true);
expect(state.projects.has("alpha")).toBe(true);
expect(state.issues.has("kickoff")).toBe(true);
expect(state.agents.has("ceo")).toBe(true);
expect(state.skills.has("skill-a")).toBe(true);
state.company = false;
state.issues.clear();
state.agents.clear();
state.skills.clear();
const selectedFiles = buildSelectedFilesFromImportSelection(catalog, state);
expect(selectedFiles).toContain(".paperclip.yaml");
expect(selectedFiles).toContain("projects/alpha/PROJECT.md");
expect(selectedFiles).toContain("projects/alpha/notes.md");
expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/TASK.md");
expect(selectedFiles).not.toContain("projects/alpha/issues/kickoff/details.md");
});
});
describe("default adapter overrides", () => {
it("maps process-only imported agents to claude_local", () => {
const preview: CompanyPortabilityPreviewResult = {
include: {
company: false,
agents: true,
projects: false,
issues: false,
skills: false,
},
targetCompanyId: null,
targetCompanyName: null,
collisionStrategy: "rename",
selectedAgentSlugs: ["legacy-agent", "explicit-agent"],
plan: {
companyAction: "none",
agentPlans: [],
projectPlans: [],
issuePlans: [],
},
manifest: {
schemaVersion: 1,
generatedAt: "2026-03-23T18:20:00.000Z",
source: null,
includes: {
company: false,
agents: true,
projects: false,
issues: false,
skills: false,
},
company: null,
sidebar: null,
agents: [
{
slug: "legacy-agent",
name: "Legacy Agent",
path: "agents/legacy-agent/AGENT.md",
skills: [],
role: "agent",
title: null,
icon: null,
capabilities: null,
reportsToSlug: null,
adapterType: "process",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
budgetMonthlyCents: 0,
metadata: null,
},
{
slug: "explicit-agent",
name: "Explicit Agent",
path: "agents/explicit-agent/AGENT.md",
skills: [],
role: "agent",
title: null,
icon: null,
capabilities: null,
reportsToSlug: null,
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
budgetMonthlyCents: 0,
metadata: null,
},
],
skills: [],
projects: [],
issues: [],
envInputs: [],
},
files: {},
envInputs: [],
warnings: [],
errors: [],
};
expect(buildDefaultImportAdapterOverrides(preview)).toEqual({
"legacy-agent": {
adapterType: "claude_local",
},
});
});
});

View File

@@ -0,0 +1,6 @@
export {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestSupport,
} from "@paperclipai/db";

View File

@@ -0,0 +1,87 @@
function writeUint16(target: Uint8Array, offset: number, value: number) {
target[offset] = value & 0xff;
target[offset + 1] = (value >>> 8) & 0xff;
}
function writeUint32(target: Uint8Array, offset: number, value: number) {
target[offset] = value & 0xff;
target[offset + 1] = (value >>> 8) & 0xff;
target[offset + 2] = (value >>> 16) & 0xff;
target[offset + 3] = (value >>> 24) & 0xff;
}
function crc32(bytes: Uint8Array) {
let crc = 0xffffffff;
for (const byte of bytes) {
crc ^= byte;
for (let bit = 0; bit < 8; bit += 1) {
crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
}
}
return (crc ^ 0xffffffff) >>> 0;
}
export function createStoredZipArchive(files: Record<string, string>, rootPath: string) {
const encoder = new TextEncoder();
const localChunks: Uint8Array[] = [];
const centralChunks: Uint8Array[] = [];
let localOffset = 0;
let entryCount = 0;
for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
const fileName = encoder.encode(`${rootPath}/${relativePath}`);
const body = encoder.encode(content);
const checksum = crc32(body);
const localHeader = new Uint8Array(30 + fileName.length);
writeUint32(localHeader, 0, 0x04034b50);
writeUint16(localHeader, 4, 20);
writeUint16(localHeader, 6, 0x0800);
writeUint16(localHeader, 8, 0);
writeUint32(localHeader, 14, checksum);
writeUint32(localHeader, 18, body.length);
writeUint32(localHeader, 22, body.length);
writeUint16(localHeader, 26, fileName.length);
localHeader.set(fileName, 30);
const centralHeader = new Uint8Array(46 + fileName.length);
writeUint32(centralHeader, 0, 0x02014b50);
writeUint16(centralHeader, 4, 20);
writeUint16(centralHeader, 6, 20);
writeUint16(centralHeader, 8, 0x0800);
writeUint16(centralHeader, 10, 0);
writeUint32(centralHeader, 16, checksum);
writeUint32(centralHeader, 20, body.length);
writeUint32(centralHeader, 24, body.length);
writeUint16(centralHeader, 28, fileName.length);
writeUint32(centralHeader, 42, localOffset);
centralHeader.set(fileName, 46);
localChunks.push(localHeader, body);
centralChunks.push(centralHeader);
localOffset += localHeader.length + body.length;
entryCount += 1;
}
const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
const archive = new Uint8Array(
localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
);
let offset = 0;
for (const chunk of localChunks) {
archive.set(chunk, offset);
offset += chunk.length;
}
const centralDirectoryOffset = offset;
for (const chunk of centralChunks) {
archive.set(chunk, offset);
offset += chunk.length;
}
writeUint32(archive, offset, 0x06054b50);
writeUint16(archive, offset + 8, entryCount);
writeUint16(archive, offset + 10, entryCount);
writeUint32(archive, offset + 12, centralDirectoryLength);
writeUint32(archive, offset + 16, centralDirectoryOffset);
return archive;
}
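The `crc32` helper above is the standard reflected CRC-32 (polynomial `0xEDB88320`) that the ZIP format requires. As a sanity check, that bitwise loop should reproduce the well-known check value `0xCBF43926` for the ASCII input `"123456789"`; here is a self-contained copy (`crc32Check` is an illustrative name, not part of the diff):

```typescript
// Standalone copy of the reflected CRC-32 loop (ZIP's polynomial 0xEDB88320).
function crc32Check(bytes: Uint8Array): number {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

// "123456789" is the conventional CRC check input; reflected CRC-32 yields 0xCBF43926.
const value = crc32Check(new TextEncoder().encode("123456789"));
console.log(value.toString(16)); // cbf43926
```

Matching this check value is a quick way to confirm archives produced by `createStoredZipArchive` will pass extraction-time integrity checks.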

View File

@@ -1,5 +1,5 @@
import { afterEach, describe, expect, it, vi } from "vitest";
import { ApiRequestError, PaperclipApiClient } from "../client/http.js";
import { ApiConnectionError, ApiRequestError, PaperclipApiClient } from "../client/http.js";
describe("PaperclipApiClient", () => {
afterEach(() => {
@@ -59,6 +59,29 @@ describe("PaperclipApiClient", () => {
} satisfies Partial<ApiRequestError>);
});
it("throws ApiConnectionError with recovery guidance when fetch fails", async () => {
const fetchMock = vi.fn().mockRejectedValue(new TypeError("fetch failed"));
vi.stubGlobal("fetch", fetchMock);
const client = new PaperclipApiClient({ apiBase: "http://localhost:3100" });
await expect(client.post("/api/companies/import/preview", {})).rejects.toBeInstanceOf(ApiConnectionError);
await expect(client.post("/api/companies/import/preview", {})).rejects.toMatchObject({
url: "http://localhost:3100/api/companies/import/preview",
method: "POST",
causeMessage: "fetch failed",
} satisfies Partial<ApiConnectionError>);
await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
/Could not reach the Paperclip API\./,
);
await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
/curl http:\/\/localhost:3100\/api\/health/,
);
await expect(client.post("/api/companies/import/preview", {})).rejects.toThrow(
/pnpm dev|pnpm paperclipai run/,
);
});
it("retries once after interactive auth recovery", async () => {
const fetchMock = vi
.fn()

View File

@@ -344,6 +344,87 @@ describe("worktree helpers", () => {
}
});
it("avoids ports already claimed by sibling worktree instance configs", async () => {
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-claimed-ports-"));
const repoRoot = path.join(tempRoot, "repo");
const homeDir = path.join(tempRoot, ".paperclip-worktrees");
const siblingInstanceRoot = path.join(homeDir, "instances", "existing-worktree");
const originalCwd = process.cwd();
try {
fs.mkdirSync(repoRoot, { recursive: true });
fs.mkdirSync(siblingInstanceRoot, { recursive: true });
fs.writeFileSync(
path.join(siblingInstanceRoot, "config.json"),
JSON.stringify(
{
...buildSourceConfig(),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(siblingInstanceRoot, "db"),
embeddedPostgresPort: 54330,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(siblingInstanceRoot, "backups"),
},
},
logging: {
mode: "file",
logDir: path.join(siblingInstanceRoot, "logs"),
},
server: {
deploymentMode: "authenticated",
exposure: "private",
host: "127.0.0.1",
port: 3101,
allowedHostnames: ["localhost"],
serveUi: true,
},
storage: {
provider: "local_disk",
localDisk: {
baseDir: path.join(siblingInstanceRoot, "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted",
strictMode: false,
localEncrypted: {
keyFilePath: path.join(siblingInstanceRoot, "secrets", "master.key"),
},
},
},
null,
2,
) + "\n",
);
process.chdir(repoRoot);
await worktreeInitCommand({
seed: false,
fromConfig: path.join(tempRoot, "missing", "config.json"),
home: homeDir,
});
const config = JSON.parse(fs.readFileSync(path.join(repoRoot, ".paperclip", "config.json"), "utf8"));
expect(config.server.port).toBe(3102);
expect(config.database.embeddedPostgresPort).not.toBe(54330);
expect(config.database.embeddedPostgresPort).not.toBe(config.server.port);
expect(config.database.embeddedPostgresPort).toBeGreaterThan(54330);
} finally {
process.chdir(originalCwd);
fs.rmSync(tempRoot, { recursive: true, force: true });
}
});
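The expectations in the test above (server port bumped from the claimed 3101 to 3102, embedded-postgres port moved strictly above the claimed 54330) come down to scanning past ports already claimed by sibling instance configs. A minimal sketch of that idea, with `nextFreePort` as an illustrative name rather than the actual worktree helper:

```typescript
// Illustrative next-free-port scan: given ports claimed by sibling worktree
// instance configs, keep bumping until an unclaimed port is found.
function nextFreePort(preferred: number, claimed: Set<number>): number {
  let port = preferred;
  while (claimed.has(port)) port += 1;
  return port;
}

// Mirrors the test setup: a sibling claims server port 3101, so the new worktree gets 3102.
console.log(nextFreePort(3101, new Set([3101]))); // 3102
```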
it("defaults the seed source config to the current repo-local Paperclip config", () => {
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-source-config-"));
const repoRoot = path.join(tempRoot, "repo");

View File

@@ -169,7 +169,7 @@ async function requestJson<T>(url: string, init?: RequestInit): Promise<T> {
return response.json() as Promise<T>;
}
function openUrl(url: string): boolean {
export function openUrl(url: string): boolean {
const platform = process.platform;
try {
if (platform === "darwin") {

View File

@@ -13,6 +13,26 @@ export class ApiRequestError extends Error {
}
}
export class ApiConnectionError extends Error {
url: string;
method: string;
causeMessage?: string;
constructor(input: {
apiBase: string;
path: string;
method: string;
cause?: unknown;
}) {
const url = buildUrl(input.apiBase, input.path);
const causeMessage = formatConnectionCause(input.cause);
super(buildConnectionErrorMessage({ apiBase: input.apiBase, url, method: input.method, causeMessage }));
this.url = url;
this.method = input.method;
this.causeMessage = causeMessage;
}
}
interface RequestOptions {
ignoreNotFound?: boolean;
}
@@ -76,6 +96,7 @@ export class PaperclipApiClient {
hasRetriedAuth = false,
): Promise<T | null> {
const url = buildUrl(this.apiBase, path);
const method = String(init.method ?? "GET").toUpperCase();
const headers: Record<string, string> = {
accept: "application/json",
@@ -94,10 +115,20 @@ export class PaperclipApiClient {
headers["x-paperclip-run-id"] = this.runId;
}
const response = await fetch(url, {
...init,
headers,
});
let response: Response;
try {
response = await fetch(url, {
...init,
headers,
});
} catch (error) {
throw new ApiConnectionError({
apiBase: this.apiBase,
path,
method,
cause: error,
});
}
if (opts?.ignoreNotFound && response.status === 404) {
return null;
@@ -108,7 +139,7 @@ export class PaperclipApiClient {
if (!hasRetriedAuth && this.recoverAuth) {
const recoveredToken = await this.recoverAuth({
path,
method: String(init.method ?? "GET").toUpperCase(),
method,
error: apiError,
});
if (recoveredToken) {
@@ -166,6 +197,50 @@ async function toApiError(response: Response): Promise<ApiRequestError> {
return new ApiRequestError(response.status, `Request failed with status ${response.status}`, undefined, parsed);
}
function buildConnectionErrorMessage(input: {
apiBase: string;
url: string;
method: string;
causeMessage?: string;
}): string {
const healthUrl = buildHealthCheckUrl(input.url);
const lines = [
"Could not reach the Paperclip API.",
"",
`Request: ${input.method} ${input.url}`,
];
if (input.causeMessage) {
lines.push(`Cause: ${input.causeMessage}`);
}
lines.push(
"",
"This usually means the Paperclip server is not running, the configured URL is wrong, or the request is being blocked before it reaches Paperclip.",
"",
"Try:",
"- Start Paperclip with `pnpm dev` or `pnpm paperclipai run`.",
`- Verify the server is reachable with \`curl ${healthUrl}\`.`,
`- If Paperclip is running elsewhere, pass \`--api-base ${input.apiBase.replace(/\/+$/, "")}\` or set \`PAPERCLIP_API_URL\`.`,
);
return lines.join("\n");
}
function buildHealthCheckUrl(requestUrl: string): string {
const url = new URL(requestUrl);
url.pathname = `${url.pathname.replace(/\/+$/, "").replace(/\/api(?:\/.*)?$/, "")}/api/health`;
url.search = "";
url.hash = "";
return url.toString();
}
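The pathname rewrite in `buildHealthCheckUrl` drops trailing slashes and any `/api/...` suffix before appending `/api/health`, which keeps a path-prefixed deployment pointing at its own prefix. A standalone sketch of the same rewrite (`healthUrlFor` is an illustrative name):

```typescript
// Same pathname rewrite as buildHealthCheckUrl: strip trailing slashes and any
// trailing /api/... segment, then point at /api/health on the same origin.
function healthUrlFor(requestUrl: string): string {
  const url = new URL(requestUrl);
  url.pathname = `${url.pathname.replace(/\/+$/, "").replace(/\/api(?:\/.*)?$/, "")}/api/health`;
  url.search = "";
  url.hash = "";
  return url.toString();
}

console.log(healthUrlFor("http://localhost:3100/api/companies/import/preview?x=1"));
// http://localhost:3100/api/health
```

Because only a trailing `/api/...` segment is stripped, a server reached at, say, `http://host/paperclip/api/...` still yields `http://host/paperclip/api/health`, preserving the path prefix in the suggested `curl` command.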
function formatConnectionCause(error: unknown): string | undefined {
if (!error) return undefined;
if (error instanceof Error) {
return error.message.trim() || error.name;
}
const message = String(error).trim();
return message || undefined;
}
function toStringRecord(headers: HeadersInit | undefined): Record<string, string> {
if (!headers) return {};
if (Array.isArray(headers)) {

View File

@@ -2,6 +2,7 @@ import { Command } from "commander";
import { mkdir, readdir, readFile, stat, writeFile } from "node:fs/promises";
import path from "node:path";
import * as p from "@clack/prompts";
import pc from "picocolors";
import type {
Company,
CompanyPortabilityFileEntry,
@@ -11,6 +12,8 @@ import type {
CompanyPortabilityImportResult,
} from "@paperclipai/shared";
import { ApiRequestError } from "../../client/http.js";
import { openUrl } from "../../client/board-auth.js";
import { binaryContentTypeByExtension, readZipArchive } from "./zip.js";
import {
addCommonClientOptions,
formatInlineRecord,
@@ -49,16 +52,61 @@ interface CompanyImportOptions extends BaseClientOptions {
agents?: string;
collision?: CompanyCollisionMode;
ref?: string;
paperclipUrl?: string;
yes?: boolean;
dryRun?: boolean;
}
const binaryContentTypeByExtension: Record<string, string> = {
".gif": "image/gif",
".jpeg": "image/jpeg",
".jpg": "image/jpeg",
".png": "image/png",
".svg": "image/svg+xml",
".webp": "image/webp",
const DEFAULT_EXPORT_INCLUDE: CompanyPortabilityInclude = {
company: true,
agents: true,
projects: false,
issues: false,
skills: false,
};
const DEFAULT_IMPORT_INCLUDE: CompanyPortabilityInclude = {
company: true,
agents: true,
projects: true,
issues: true,
skills: true,
};
const IMPORT_INCLUDE_OPTIONS: Array<{
value: keyof CompanyPortabilityInclude;
label: string;
hint: string;
}> = [
{ value: "company", label: "Company", hint: "name, branding, and company settings" },
{ value: "projects", label: "Projects", hint: "projects and workspace metadata" },
{ value: "issues", label: "Tasks", hint: "tasks and recurring routines" },
{ value: "agents", label: "Agents", hint: "agent records and org structure" },
{ value: "skills", label: "Skills", hint: "company skill packages and references" },
];
const IMPORT_PREVIEW_SAMPLE_LIMIT = 6;
type ImportSelectableGroup = "projects" | "issues" | "agents" | "skills";
type ImportSelectionCatalog = {
company: {
includedByDefault: boolean;
files: string[];
};
projects: Array<{ key: string; label: string; hint?: string; files: string[] }>;
issues: Array<{ key: string; label: string; hint?: string; files: string[] }>;
agents: Array<{ key: string; label: string; hint?: string; files: string[] }>;
skills: Array<{ key: string; label: string; hint?: string; files: string[] }>;
extensionPath: string | null;
};
type ImportSelectionState = {
company: boolean;
projects: Set<string>;
issues: Set<string>;
agents: Set<string>;
skills: Set<string>;
};
function readPortableFileEntry(filePath: string, contents: Buffer): CompanyPortabilityFileEntry {
@@ -84,8 +132,11 @@ function normalizeSelector(input: string): string {
return input.trim();
}
function parseInclude(input: string | undefined): CompanyPortabilityInclude {
if (!input || !input.trim()) return { company: true, agents: true, projects: false, issues: false, skills: false };
function parseInclude(
input: string | undefined,
fallback: CompanyPortabilityInclude = DEFAULT_EXPORT_INCLUDE,
): CompanyPortabilityInclude {
if (!input || !input.trim()) return { ...fallback };
const values = input.split(",").map((part) => part.trim().toLowerCase()).filter(Boolean);
const include = {
company: values.includes("company"),
@@ -114,6 +165,554 @@ function parseCsvValues(input: string | undefined): string[] {
return Array.from(new Set(input.split(",").map((part) => part.trim()).filter(Boolean)));
}
function isInteractiveTerminal(): boolean {
return Boolean(process.stdin.isTTY && process.stdout.isTTY);
}
function resolveImportInclude(input: string | undefined): CompanyPortabilityInclude {
return parseInclude(input, DEFAULT_IMPORT_INCLUDE);
}
function normalizePortablePath(filePath: string): string {
return filePath.replace(/\\/g, "/");
}
function shouldIncludePortableFile(filePath: string): boolean {
const baseName = path.basename(filePath);
const isMarkdown = baseName.endsWith(".md");
const isPaperclipYaml = baseName === ".paperclip.yaml" || baseName === ".paperclip.yml";
const contentType = binaryContentTypeByExtension[path.extname(baseName).toLowerCase()];
return isMarkdown || isPaperclipYaml || Boolean(contentType);
}
function findPortableExtensionPath(files: Record<string, CompanyPortabilityFileEntry>): string | null {
if (files[".paperclip.yaml"] !== undefined) return ".paperclip.yaml";
if (files[".paperclip.yml"] !== undefined) return ".paperclip.yml";
return Object.keys(files).find((entry) => entry.endsWith("/.paperclip.yaml") || entry.endsWith("/.paperclip.yml")) ?? null;
}
function collectFilesUnderDirectory(
files: Record<string, CompanyPortabilityFileEntry>,
directory: string,
opts?: { excludePrefixes?: string[] },
): string[] {
const normalizedDirectory = normalizePortablePath(directory).replace(/\/+$/, "");
if (!normalizedDirectory) return [];
const prefix = `${normalizedDirectory}/`;
const excluded = (opts?.excludePrefixes ?? []).map((entry) => normalizePortablePath(entry).replace(/\/+$/, "")).filter(Boolean);
return Object.keys(files)
.map(normalizePortablePath)
.filter((filePath) => filePath.startsWith(prefix))
.filter((filePath) => !excluded.some((excludePrefix) => filePath.startsWith(`${excludePrefix}/`)))
.sort((left, right) => left.localeCompare(right));
}
function collectEntityFiles(
files: Record<string, CompanyPortabilityFileEntry>,
entryPath: string,
opts?: { excludePrefixes?: string[] },
): string[] {
const normalizedPath = normalizePortablePath(entryPath);
const directory = normalizedPath.includes("/") ? normalizedPath.slice(0, normalizedPath.lastIndexOf("/")) : "";
const selected = new Set<string>([normalizedPath]);
if (directory) {
for (const filePath of collectFilesUnderDirectory(files, directory, opts)) {
selected.add(filePath);
}
}
return Array.from(selected).sort((left, right) => left.localeCompare(right));
}
export function buildImportSelectionCatalog(preview: CompanyPortabilityPreviewResult): ImportSelectionCatalog {
const selectedAgentSlugs = new Set(preview.selectedAgentSlugs);
const companyFiles = new Set<string>();
const companyPath = preview.manifest.company?.path ? normalizePortablePath(preview.manifest.company.path) : null;
if (companyPath) {
companyFiles.add(companyPath);
}
const readmePath = Object.keys(preview.files).find((entry) => normalizePortablePath(entry) === "README.md");
if (readmePath) {
companyFiles.add(normalizePortablePath(readmePath));
}
const logoPath = preview.manifest.company?.logoPath ? normalizePortablePath(preview.manifest.company.logoPath) : null;
if (logoPath && preview.files[logoPath] !== undefined) {
companyFiles.add(logoPath);
}
return {
company: {
includedByDefault: preview.include.company && preview.manifest.company !== null,
files: Array.from(companyFiles).sort((left, right) => left.localeCompare(right)),
},
projects: preview.manifest.projects.map((project) => {
const projectPath = normalizePortablePath(project.path);
const projectDir = projectPath.includes("/") ? projectPath.slice(0, projectPath.lastIndexOf("/")) : "";
return {
key: project.slug,
label: project.name,
hint: project.slug,
files: collectEntityFiles(preview.files, projectPath, {
excludePrefixes: projectDir ? [`${projectDir}/issues`] : [],
}),
};
}),
issues: preview.manifest.issues.map((issue) => ({
key: issue.slug,
label: issue.title,
hint: issue.identifier ?? issue.slug,
files: collectEntityFiles(preview.files, normalizePortablePath(issue.path)),
})),
agents: preview.manifest.agents
.filter((agent) => selectedAgentSlugs.size === 0 || selectedAgentSlugs.has(agent.slug))
.map((agent) => ({
key: agent.slug,
label: agent.name,
hint: agent.slug,
files: collectEntityFiles(preview.files, normalizePortablePath(agent.path)),
})),
skills: preview.manifest.skills.map((skill) => ({
key: skill.slug,
label: skill.name,
hint: skill.slug,
files: collectEntityFiles(preview.files, normalizePortablePath(skill.path)),
})),
extensionPath: findPortableExtensionPath(preview.files),
};
}
function toKeySet(items: Array<{ key: string }>): Set<string> {
return new Set(items.map((item) => item.key));
}
export function buildDefaultImportSelectionState(catalog: ImportSelectionCatalog): ImportSelectionState {
return {
company: catalog.company.includedByDefault,
projects: toKeySet(catalog.projects),
issues: toKeySet(catalog.issues),
agents: toKeySet(catalog.agents),
skills: toKeySet(catalog.skills),
};
}
function countSelected(state: ImportSelectionState, group: ImportSelectableGroup): number {
return state[group].size;
}
function countTotal(catalog: ImportSelectionCatalog, group: ImportSelectableGroup): number {
return catalog[group].length;
}
function summarizeGroupSelection(catalog: ImportSelectionCatalog, state: ImportSelectionState, group: ImportSelectableGroup): string {
return `${countSelected(state, group)}/${countTotal(catalog, group)} selected`;
}
function getGroupLabel(group: ImportSelectableGroup): string {
switch (group) {
case "projects":
return "Projects";
case "issues":
return "Tasks";
case "agents":
return "Agents";
case "skills":
return "Skills";
}
}
export function buildSelectedFilesFromImportSelection(
catalog: ImportSelectionCatalog,
state: ImportSelectionState,
): string[] {
const selected = new Set<string>();
if (state.company) {
for (const filePath of catalog.company.files) {
selected.add(normalizePortablePath(filePath));
}
}
for (const group of ["projects", "issues", "agents", "skills"] as const) {
const selectedKeys = state[group];
for (const item of catalog[group]) {
if (!selectedKeys.has(item.key)) continue;
for (const filePath of item.files) {
selected.add(normalizePortablePath(filePath));
}
}
}
if (selected.size > 0 && catalog.extensionPath) {
selected.add(normalizePortablePath(catalog.extensionPath));
}
return Array.from(selected).sort((left, right) => left.localeCompare(right));
}
export function buildDefaultImportAdapterOverrides(
preview: Pick<CompanyPortabilityPreviewResult, "manifest" | "selectedAgentSlugs">,
): Record<string, { adapterType: string }> | undefined {
const selectedAgentSlugs = new Set(preview.selectedAgentSlugs);
const overrides = Object.fromEntries(
preview.manifest.agents
.filter((agent) => selectedAgentSlugs.size === 0 || selectedAgentSlugs.has(agent.slug))
.filter((agent) => agent.adapterType === "process")
.map((agent) => [
agent.slug,
{
// TODO: replace this temporary claude_local fallback with adapter selection in the import TUI.
adapterType: "claude_local",
},
]),
);
return Object.keys(overrides).length > 0 ? overrides : undefined;
}
function buildDefaultImportAdapterMessages(
overrides: Record<string, { adapterType: string }> | undefined,
): string[] {
if (!overrides) return [];
const adapterTypes = Array.from(new Set(Object.values(overrides).map((override) => override.adapterType)))
.map((adapterType) => adapterType.replace(/_/g, "-"));
const agentCount = Object.keys(overrides).length;
return [
`Using ${adapterTypes.join(", ")} adapter${adapterTypes.length === 1 ? "" : "s"} for ${agentCount} imported ${pluralize(agentCount, "agent")} without an explicit adapter.`,
];
}
async function promptForImportSelection(preview: CompanyPortabilityPreviewResult): Promise<string[]> {
const catalog = buildImportSelectionCatalog(preview);
const state = buildDefaultImportSelectionState(catalog);
while (true) {
const choice = await p.select<ImportSelectableGroup | "company" | "confirm">({
message: "Select what Paperclip should import",
options: [
{
value: "company",
label: state.company ? "Company: included" : "Company: skipped",
hint: catalog.company.files.length > 0 ? "toggle company metadata" : "no company metadata in package",
},
{
value: "projects",
label: "Select Projects",
hint: summarizeGroupSelection(catalog, state, "projects"),
},
{
value: "issues",
label: "Select Tasks",
hint: summarizeGroupSelection(catalog, state, "issues"),
},
{
value: "agents",
label: "Select Agents",
hint: summarizeGroupSelection(catalog, state, "agents"),
},
{
value: "skills",
label: "Select Skills",
hint: summarizeGroupSelection(catalog, state, "skills"),
},
{
value: "confirm",
label: "Confirm",
hint: `${buildSelectedFilesFromImportSelection(catalog, state).length} files selected`,
},
],
initialValue: "confirm",
});
if (p.isCancel(choice)) {
p.cancel("Import cancelled.");
process.exit(0);
}
if (choice === "confirm") {
const selectedFiles = buildSelectedFilesFromImportSelection(catalog, state);
if (selectedFiles.length === 0) {
p.note("Select at least one import target before confirming.", "Nothing selected");
continue;
}
return selectedFiles;
}
if (choice === "company") {
if (catalog.company.files.length === 0) {
p.note("This package does not include company metadata to toggle.", "No company metadata");
continue;
}
state.company = !state.company;
continue;
}
const group = choice;
const groupItems = catalog[group];
if (groupItems.length === 0) {
p.note(`This package does not include any ${getGroupLabel(group).toLowerCase()}.`, `No ${getGroupLabel(group)}`);
continue;
}
const selection = await p.multiselect<string>({
message: `${getGroupLabel(group)} to import. Space toggles, enter returns to the main menu.`,
options: groupItems.map((item) => ({
value: item.key,
label: item.label,
hint: item.hint,
})),
initialValues: Array.from(state[group]),
});
if (p.isCancel(selection)) {
p.cancel("Import cancelled.");
process.exit(0);
}
state[group] = new Set(selection);
}
}
function summarizeInclude(include: CompanyPortabilityInclude): string {
const labels = IMPORT_INCLUDE_OPTIONS
.filter((option) => include[option.value])
.map((option) => option.label.toLowerCase());
return labels.length > 0 ? labels.join(", ") : "nothing selected";
}
function formatSourceLabel(source: { type: "inline"; rootPath?: string | null } | { type: "github"; url: string }): string {
if (source.type === "github") {
return `GitHub: ${source.url}`;
}
return `Local package: ${source.rootPath?.trim() || "(current folder)"}`;
}
function formatTargetLabel(
target: { mode: "existing_company"; companyId?: string | null } | { mode: "new_company"; newCompanyName?: string | null },
preview?: CompanyPortabilityPreviewResult,
): string {
if (target.mode === "existing_company") {
const targetName = preview?.targetCompanyName?.trim();
const targetId = preview?.targetCompanyId?.trim() || target.companyId?.trim() || "unknown-company";
return targetName ? `${targetName} (${targetId})` : targetId;
}
return target.newCompanyName?.trim() || preview?.manifest.company?.name || "new company";
}
function pluralize(count: number, singular: string, plural = `${singular}s`): string {
return count === 1 ? singular : plural;
}
function summarizePlanCounts(
plans: Array<{ action: "create" | "update" | "skip" }>,
noun: string,
): string {
if (plans.length === 0) return `0 ${pluralize(0, noun)} selected`;
const createCount = plans.filter((plan) => plan.action === "create").length;
const updateCount = plans.filter((plan) => plan.action === "update").length;
const skipCount = plans.filter((plan) => plan.action === "skip").length;
const parts: string[] = [];
if (createCount > 0) parts.push(`${createCount} create`);
if (updateCount > 0) parts.push(`${updateCount} update`);
if (skipCount > 0) parts.push(`${skipCount} skip`);
return `${plans.length} ${pluralize(plans.length, noun)} total (${parts.join(", ")})`;
}
function summarizeImportAgentResults(agents: CompanyPortabilityImportResult["agents"]): string {
if (agents.length === 0) return "0 agents changed";
const created = agents.filter((agent) => agent.action === "created").length;
const updated = agents.filter((agent) => agent.action === "updated").length;
const skipped = agents.filter((agent) => agent.action === "skipped").length;
const parts: string[] = [];
if (created > 0) parts.push(`${created} created`);
if (updated > 0) parts.push(`${updated} updated`);
if (skipped > 0) parts.push(`${skipped} skipped`);
return `${agents.length} ${pluralize(agents.length, "agent")} total (${parts.join(", ")})`;
}
function summarizeImportProjectResults(projects: CompanyPortabilityImportResult["projects"]): string {
if (projects.length === 0) return "0 projects changed";
const created = projects.filter((project) => project.action === "created").length;
const updated = projects.filter((project) => project.action === "updated").length;
const skipped = projects.filter((project) => project.action === "skipped").length;
const parts: string[] = [];
if (created > 0) parts.push(`${created} created`);
if (updated > 0) parts.push(`${updated} updated`);
if (skipped > 0) parts.push(`${skipped} skipped`);
return `${projects.length} ${pluralize(projects.length, "project")} total (${parts.join(", ")})`;
}
function actionChip(action: string): string {
switch (action) {
case "create":
case "created":
return pc.green(action);
case "update":
case "updated":
return pc.yellow(action);
case "skip":
case "skipped":
case "none":
case "unchanged":
return pc.dim(action);
default:
return action;
}
}
function appendPreviewExamples(
lines: string[],
title: string,
entries: Array<{ action: string; label: string; reason?: string | null }>,
): void {
if (entries.length === 0) return;
lines.push("");
lines.push(pc.bold(title));
const shown = entries.slice(0, IMPORT_PREVIEW_SAMPLE_LIMIT);
for (const entry of shown) {
const reason = entry.reason?.trim() ? pc.dim(` (${entry.reason.trim()})`) : "";
lines.push(`- ${actionChip(entry.action)} ${entry.label}${reason}`);
}
if (entries.length > shown.length) {
lines.push(pc.dim(`- +${entries.length - shown.length} more`));
}
}
function appendMessageBlock(lines: string[], title: string, messages: string[]): void {
if (messages.length === 0) return;
lines.push("");
lines.push(pc.bold(title));
for (const message of messages) {
lines.push(`- ${message}`);
}
}
export function renderCompanyImportPreview(
preview: CompanyPortabilityPreviewResult,
meta: {
sourceLabel: string;
targetLabel: string;
infoMessages?: string[];
},
): string {
const lines: string[] = [
`${pc.bold("Source")} ${meta.sourceLabel}`,
`${pc.bold("Target")} ${meta.targetLabel}`,
`${pc.bold("Include")} ${summarizeInclude(preview.include)}`,
`${pc.bold("Mode")} ${preview.collisionStrategy} collisions`,
"",
pc.bold("Package"),
`- company: ${preview.manifest.company?.name ?? preview.manifest.source?.companyName ?? "not included"}`,
`- agents: ${preview.manifest.agents.length}`,
`- projects: ${preview.manifest.projects.length}`,
`- tasks: ${preview.manifest.issues.length}`,
`- skills: ${preview.manifest.skills.length}`,
];
if (preview.envInputs.length > 0) {
const requiredCount = preview.envInputs.filter((item) => item.requirement === "required").length;
lines.push(`- env inputs: ${preview.envInputs.length} (${requiredCount} required)`);
}
lines.push("");
lines.push(pc.bold("Plan"));
lines.push(`- company: ${actionChip(preview.plan.companyAction === "none" ? "unchanged" : preview.plan.companyAction)}`);
lines.push(`- agents: ${summarizePlanCounts(preview.plan.agentPlans, "agent")}`);
lines.push(`- projects: ${summarizePlanCounts(preview.plan.projectPlans, "project")}`);
lines.push(`- tasks: ${summarizePlanCounts(preview.plan.issuePlans, "task")}`);
if (preview.include.skills) {
lines.push(`- skills: ${preview.manifest.skills.length} ${pluralize(preview.manifest.skills.length, "skill")} packaged`);
}
appendPreviewExamples(
lines,
"Agent examples",
preview.plan.agentPlans.map((plan) => ({
action: plan.action,
label: `${plan.slug} -> ${plan.plannedName}`,
reason: plan.reason,
})),
);
appendPreviewExamples(
lines,
"Project examples",
preview.plan.projectPlans.map((plan) => ({
action: plan.action,
label: `${plan.slug} -> ${plan.plannedName}`,
reason: plan.reason,
})),
);
appendPreviewExamples(
lines,
"Task examples",
preview.plan.issuePlans.map((plan) => ({
action: plan.action,
label: `${plan.slug} -> ${plan.plannedTitle}`,
reason: plan.reason,
})),
);
appendMessageBlock(lines, pc.cyan("Info"), meta.infoMessages ?? []);
appendMessageBlock(lines, pc.yellow("Warnings"), preview.warnings);
appendMessageBlock(lines, pc.red("Errors"), preview.errors);
return lines.join("\n");
}
export function renderCompanyImportResult(
result: CompanyPortabilityImportResult,
meta: { targetLabel: string; companyUrl?: string; infoMessages?: string[] },
): string {
const lines: string[] = [
`${pc.bold("Target")} ${meta.targetLabel}`,
`${pc.bold("Company")} ${result.company.name} (${actionChip(result.company.action)})`,
`${pc.bold("Agents")} ${summarizeImportAgentResults(result.agents)}`,
`${pc.bold("Projects")} ${summarizeImportProjectResults(result.projects)}`,
];
if (meta.companyUrl) {
lines.splice(1, 0, `${pc.bold("URL")} ${meta.companyUrl}`);
}
appendPreviewExamples(
lines,
"Agent results",
result.agents.map((agent) => ({
action: agent.action,
label: `${agent.slug} -> ${agent.name}`,
reason: agent.reason,
})),
);
appendPreviewExamples(
lines,
"Project results",
result.projects.map((project) => ({
action: project.action,
label: `${project.slug} -> ${project.name}`,
reason: project.reason,
})),
);
if (result.envInputs.length > 0) {
lines.push("");
lines.push(pc.bold("Env inputs"));
lines.push(
`- ${result.envInputs.length} ${pluralize(result.envInputs.length, "input")} may need values after import`,
);
}
appendMessageBlock(lines, pc.cyan("Info"), meta.infoMessages ?? []);
appendMessageBlock(lines, pc.yellow("Warnings"), result.warnings);
return lines.join("\n");
}
function printCompanyImportView(title: string, body: string, opts?: { interactive?: boolean }): void {
if (opts?.interactive) {
p.note(body, title);
return;
}
console.log(pc.bold(title));
console.log(body);
}
export function resolveCompanyImportApiPath(input: {
dryRun: boolean;
targetMode: "new_company" | "existing_company";
@@ -132,6 +731,36 @@ export function resolveCompanyImportApiPath(input: {
return input.dryRun ? "/api/companies/import/preview" : "/api/companies/import";
}
export function buildCompanyDashboardUrl(apiBase: string, issuePrefix: string): string {
const url = new URL(apiBase);
const normalizedPrefix = issuePrefix.trim().replace(/^\/+|\/+$/g, "");
url.pathname = `${url.pathname.replace(/\/+$/, "")}/${normalizedPrefix}/dashboard`;
url.search = "";
url.hash = "";
return url.toString();
}
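A quick standalone check of the slash normalization above (the function body is copied from the implementation; the example URL and prefix are illustrative):

```typescript
// Copy of buildCompanyDashboardUrl above, for a self-contained check.
function buildCompanyDashboardUrl(apiBase: string, issuePrefix: string): string {
  const url = new URL(apiBase);
  // Strip leading/trailing slashes from the prefix, and trailing slashes from the base path.
  const normalizedPrefix = issuePrefix.trim().replace(/^\/+|\/+$/g, "");
  url.pathname = `${url.pathname.replace(/\/+$/, "")}/${normalizedPrefix}/dashboard`;
  url.search = "";
  url.hash = "";
  return url.toString();
}

console.log(buildCompanyDashboardUrl("http://localhost:3100/", " /PAP/ "));
// → "http://localhost:3100/PAP/dashboard"
```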
export function resolveCompanyImportApplyConfirmationMode(input: {
yes?: boolean;
interactive: boolean;
json: boolean;
}): "skip" | "prompt" {
if (input.yes) {
return "skip";
}
if (input.json) {
throw new Error(
"Applying a company import with --json requires --yes. Use --dry-run first to inspect the preview.",
);
}
if (!input.interactive) {
throw new Error(
"Applying a company import from a non-interactive terminal requires --yes. Use --dry-run first to inspect the preview.",
);
}
return "prompt";
}
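The confirmation rules above reduce to three outcomes; a condensed sketch (logic restated from the function above, error text shortened):

```typescript
// --yes always skips the prompt; otherwise --json and non-interactive runs fail fast.
function resolveApplyConfirmationMode(input: {
  yes?: boolean;
  interactive: boolean;
  json: boolean;
}): "skip" | "prompt" {
  if (input.yes) return "skip";
  if (input.json || !input.interactive) {
    throw new Error("Applying an import here requires --yes; use --dry-run to inspect first.");
  }
  return "prompt";
}

console.log(resolveApplyConfirmationMode({ yes: true, interactive: false, json: true })); // → "skip"
console.log(resolveApplyConfirmationMode({ interactive: true, json: false })); // → "prompt"
```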
export function isHttpUrl(input: string): boolean {
return /^https?:\/\//i.test(input.trim());
}
@@ -260,21 +889,29 @@ async function collectPackageFiles(
continue;
}
if (!entry.isFile()) continue;
const isMarkdown = entry.name.endsWith(".md");
const isPaperclipYaml = entry.name === ".paperclip.yaml" || entry.name === ".paperclip.yml";
const contentType = binaryContentTypeByExtension[path.extname(entry.name).toLowerCase()];
if (!isMarkdown && !isPaperclipYaml && !contentType) continue;
const relativePath = path.relative(root, absolutePath).replace(/\\/g, "/");
if (!shouldIncludePortableFile(relativePath)) continue;
files[relativePath] = readPortableFileEntry(relativePath, await readFile(absolutePath));
}
}
export async function resolveInlineSourceFromPath(inputPath: string): Promise<{
rootPath: string;
files: Record<string, CompanyPortabilityFileEntry>;
}> {
const resolved = path.resolve(inputPath);
const resolvedStat = await stat(resolved);
if (resolvedStat.isFile() && path.extname(resolved).toLowerCase() === ".zip") {
const archive = await readZipArchive(await readFile(resolved));
const filteredFiles = Object.fromEntries(
Object.entries(archive.files).filter(([relativePath]) => shouldIncludePortableFile(relativePath)),
);
return {
rootPath: archive.rootPath ?? path.basename(resolved, ".zip"),
files: filteredFiles,
};
}
const rootDir = resolvedStat.isDirectory() ? resolved : path.dirname(resolved);
const files: Record<string, CompanyPortabilityFileEntry> = {};
await collectPackageFiles(rootDir, rootDir, files);
@@ -515,23 +1152,29 @@ export function registerCompanyCommands(program: Command): void {
.command("import")
.description("Import a portable markdown company package from local path, URL, or GitHub")
.argument("<fromPathOrUrl>", "Source path or URL")
.option("--include <values>", "Comma-separated include set: company,agents,projects,issues,tasks,skills")
.option("--target <mode>", "Target mode: new | existing")
.option("-C, --company-id <id>", "Existing target company ID")
.option("--new-company-name <name>", "Name override for --target new")
.option("--agents <list>", "Comma-separated agent slugs to import, or all", "all")
.option("--collision <mode>", "Collision strategy: rename | skip | replace", "rename")
.option("--ref <value>", "Git ref to use for GitHub imports (branch, tag, or commit)")
.option("--paperclip-url <url>", "Alias for --api-base on this command")
.option("--yes", "Accept default selection and skip the pre-import confirmation prompt", false)
.option("--dry-run", "Run preview only without applying", false)
.action(async (fromPathOrUrl: string, opts: CompanyImportOptions) => {
try {
if (!opts.apiBase?.trim() && opts.paperclipUrl?.trim()) {
opts.apiBase = opts.paperclipUrl.trim();
}
const ctx = resolveCommandContext(opts);
const interactiveView = isInteractiveTerminal() && !ctx.json;
const from = fromPathOrUrl.trim();
if (!from) {
throw new Error("Source path or URL is required.");
}
const include = resolveImportInclude(opts.include);
const agents = parseAgents(opts.agents);
const collision = (opts.collision ?? "rename").toLowerCase() as CompanyCollisionMode;
if (!["rename", "skip", "replace"].includes(collision)) {
@@ -587,27 +1230,139 @@ export function registerCompanyCommands(program: Command): void {
};
}
const sourceLabel = formatSourceLabel(sourcePayload);
const targetLabel = formatTargetLabel(targetPayload);
const previewApiPath = resolveCompanyImportApiPath({
dryRun: true,
targetMode: targetPayload.mode,
companyId: targetPayload.mode === "existing_company" ? targetPayload.companyId : null,
});
let selectedFiles: string[] | undefined;
if (interactiveView && !opts.yes && !opts.include?.trim()) {
const initialPreview = await ctx.api.post<CompanyPortabilityPreviewResult>(previewApiPath, {
source: sourcePayload,
include,
target: targetPayload,
agents,
collisionStrategy: collision,
});
if (!initialPreview) {
throw new Error("Import preview returned no data.");
}
selectedFiles = await promptForImportSelection(initialPreview);
}
const previewPayload = {
source: sourcePayload,
include,
target: targetPayload,
agents,
collisionStrategy: collision,
selectedFiles,
};
const preview = await ctx.api.post<CompanyPortabilityPreviewResult>(previewApiPath, previewPayload);
if (!preview) {
throw new Error("Import preview returned no data.");
}
const adapterOverrides = buildDefaultImportAdapterOverrides(preview);
const adapterMessages = buildDefaultImportAdapterMessages(adapterOverrides);
if (opts.dryRun) {
if (ctx.json) {
printOutput(preview, { json: true });
} else {
printCompanyImportView(
"Import Preview",
renderCompanyImportPreview(preview, {
sourceLabel,
targetLabel: formatTargetLabel(targetPayload, preview),
infoMessages: adapterMessages,
}),
{ interactive: interactiveView },
);
}
return;
}
if (!ctx.json) {
printCompanyImportView(
"Import Preview",
renderCompanyImportPreview(preview, {
sourceLabel,
targetLabel: formatTargetLabel(targetPayload, preview),
infoMessages: adapterMessages,
}),
{ interactive: interactiveView },
);
}
const confirmationMode = resolveCompanyImportApplyConfirmationMode({
yes: opts.yes,
interactive: interactiveView,
json: ctx.json,
});
if (confirmationMode === "prompt") {
const confirmed = await p.confirm({
message: "Apply this import? (y/N)",
initialValue: false,
});
if (p.isCancel(confirmed) || !confirmed) {
p.log.warn("Import cancelled.");
return;
}
}
const importApiPath = resolveCompanyImportApiPath({
dryRun: false,
targetMode: targetPayload.mode,
companyId: targetPayload.mode === "existing_company" ? targetPayload.companyId : null,
});
const imported = await ctx.api.post<CompanyPortabilityImportResult>(importApiPath, {
...previewPayload,
adapterOverrides,
});
if (!imported) {
throw new Error("Import request returned no data.");
}
let companyUrl: string | undefined;
if (!ctx.json) {
try {
const importedCompany = await ctx.api.get<Company>(`/api/companies/${imported.company.id}`);
const issuePrefix = importedCompany?.issuePrefix?.trim();
if (issuePrefix) {
companyUrl = buildCompanyDashboardUrl(ctx.api.apiBase, issuePrefix);
}
} catch {
companyUrl = undefined;
}
}
if (ctx.json) {
printOutput(imported, { json: true });
} else {
printCompanyImportView(
"Import Result",
renderCompanyImportResult(imported, {
targetLabel,
companyUrl,
infoMessages: adapterMessages,
}),
{ interactive: interactiveView },
);
if (interactiveView && companyUrl) {
const openImportedCompany = await p.confirm({
message: "Open the imported company in your browser?",
initialValue: true,
});
if (!p.isCancel(openImportedCompany) && openImportedCompany) {
if (openUrl(companyUrl)) {
p.log.info(`Opened ${companyUrl}`);
} else {
p.log.warn(`Could not open your browser automatically. Open this URL manually:\n${companyUrl}`);
}
}
}
}
} catch (err) {
handleCommandError(err);
}


@@ -0,0 +1,129 @@
import { inflateRawSync } from "node:zlib";
import path from "node:path";
import type { CompanyPortabilityFileEntry } from "@paperclipai/shared";
const textDecoder = new TextDecoder();
export const binaryContentTypeByExtension: Record<string, string> = {
".gif": "image/gif",
".jpeg": "image/jpeg",
".jpg": "image/jpeg",
".png": "image/png",
".svg": "image/svg+xml",
".webp": "image/webp",
};
function normalizeArchivePath(pathValue: string) {
return pathValue
.replace(/\\/g, "/")
.split("/")
.filter(Boolean)
.join("/");
}
function readUint16(source: Uint8Array, offset: number) {
return source[offset]! | (source[offset + 1]! << 8);
}
function readUint32(source: Uint8Array, offset: number) {
return (
source[offset]! |
(source[offset + 1]! << 8) |
(source[offset + 2]! << 16) |
(source[offset + 3]! << 24)
) >>> 0;
}
function sharedArchiveRoot(paths: string[]) {
if (paths.length === 0) return null;
const firstSegments = paths
.map((entry) => normalizeArchivePath(entry).split("/").filter(Boolean))
.filter((parts) => parts.length > 0);
if (firstSegments.length === 0) return null;
const candidate = firstSegments[0]![0]!;
return firstSegments.every((parts) => parts.length > 1 && parts[0] === candidate)
? candidate
: null;
}
function bytesToPortableFileEntry(pathValue: string, bytes: Uint8Array): CompanyPortabilityFileEntry {
const contentType = binaryContentTypeByExtension[path.extname(pathValue).toLowerCase()];
if (!contentType) return textDecoder.decode(bytes);
return {
encoding: "base64",
data: Buffer.from(bytes).toString("base64"),
contentType,
};
}
async function inflateZipEntry(compressionMethod: number, bytes: Uint8Array) {
if (compressionMethod === 0) return bytes;
if (compressionMethod !== 8) {
throw new Error("Unsupported zip archive: only STORE and DEFLATE entries are supported.");
}
return new Uint8Array(inflateRawSync(bytes));
}
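DEFLATE entries (compression method 8) are raw deflate streams with no zlib header, which is why `inflateRawSync` decompresses them directly; a minimal roundtrip with Node's `zlib` (sample contents are illustrative):

```typescript
import { deflateRawSync, inflateRawSync } from "node:zlib";

// Compress with a raw deflate stream, the same framing zip entries use.
const original = Buffer.from("# Example markdown file contents\n");
const compressed = deflateRawSync(original);
const restored = inflateRawSync(compressed);

console.log(restored.equals(original)); // → true
```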
export async function readZipArchive(source: ArrayBuffer | Uint8Array): Promise<{
rootPath: string | null;
files: Record<string, CompanyPortabilityFileEntry>;
}> {
const bytes = source instanceof Uint8Array ? source : new Uint8Array(source);
const entries: Array<{ path: string; body: CompanyPortabilityFileEntry }> = [];
let offset = 0;
while (offset + 4 <= bytes.length) {
const signature = readUint32(bytes, offset);
if (signature === 0x02014b50 || signature === 0x06054b50) break;
if (signature !== 0x04034b50) {
throw new Error("Invalid zip archive: unsupported local file header.");
}
if (offset + 30 > bytes.length) {
throw new Error("Invalid zip archive: truncated local file header.");
}
const generalPurposeFlag = readUint16(bytes, offset + 6);
const compressionMethod = readUint16(bytes, offset + 8);
const compressedSize = readUint32(bytes, offset + 18);
const fileNameLength = readUint16(bytes, offset + 26);
const extraFieldLength = readUint16(bytes, offset + 28);
if ((generalPurposeFlag & 0x0008) !== 0) {
throw new Error("Unsupported zip archive: data descriptors are not supported.");
}
const nameOffset = offset + 30;
const bodyOffset = nameOffset + fileNameLength + extraFieldLength;
const bodyEnd = bodyOffset + compressedSize;
if (bodyEnd > bytes.length) {
throw new Error("Invalid zip archive: truncated file contents.");
}
const rawArchivePath = textDecoder.decode(bytes.slice(nameOffset, nameOffset + fileNameLength));
const archivePath = normalizeArchivePath(rawArchivePath);
const isDirectoryEntry = /\/$/.test(rawArchivePath.replace(/\\/g, "/"));
if (archivePath && !isDirectoryEntry) {
const entryBytes = await inflateZipEntry(compressionMethod, bytes.slice(bodyOffset, bodyEnd));
entries.push({
path: archivePath,
body: bytesToPortableFileEntry(archivePath, entryBytes),
});
}
offset = bodyEnd;
}
const rootPath = sharedArchiveRoot(entries.map((entry) => entry.path));
const files: Record<string, CompanyPortabilityFileEntry> = {};
for (const entry of entries) {
const normalizedPath =
rootPath && entry.path.startsWith(`${rootPath}/`)
? entry.path.slice(rootPath.length + 1)
: entry.path;
if (!normalizedPath) continue;
files[normalizedPath] = entry.body;
}
return { rootPath, files };
}
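The header fields above are little-endian, so the local-file-header magic bytes `PK\x03\x04` decode to `0x04034b50`; a standalone check of the readers (helper bodies copied from above, sample bytes illustrative):

```typescript
// Little-endian readers, as defined above.
const readUint16 = (s: Uint8Array, o: number) => s[o]! | (s[o + 1]! << 8);
const readUint32 = (s: Uint8Array, o: number) =>
  (s[o]! | (s[o + 1]! << 8) | (s[o + 2]! << 16) | (s[o + 3]! << 24)) >>> 0;

// "PK\x03\x04" followed by a version field of 20 (2.0).
const header = new Uint8Array([0x50, 0x4b, 0x03, 0x04, 0x14, 0x00]);
console.log(readUint32(header, 0) === 0x04034b50); // → true
console.log(readUint16(header, 4)); // → 20
```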


@@ -41,6 +41,8 @@ import {
projects,
runDatabaseBackup,
runDatabaseRestore,
createEmbeddedPostgresLogBuffer,
formatEmbeddedPostgresError,
} from "@paperclipai/db";
import type { Command } from "commander";
import { ensureAgentJwtSecret, loadPaperclipEnvFile, mergePaperclipEnvEntries, readPaperclipEnvEntries, resolvePaperclipEnvFile } from "../config/env.js";
@@ -465,6 +467,62 @@ async function findAvailablePort(preferredPort: number, reserved = new Set<numbe
return port;
}
function resolveRepoManagedWorktreesRoot(cwd: string): string | null {
const normalized = path.resolve(cwd);
const marker = `${path.sep}.paperclip${path.sep}worktrees${path.sep}`;
const index = normalized.indexOf(marker);
if (index === -1) return null;
const repoRoot = normalized.slice(0, index);
return path.resolve(repoRoot, ".paperclip", "worktrees");
}
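Given a cwd anywhere inside a repo-managed worktree, the marker lookup resolves back to the shared worktrees root; a standalone check (function body copied from above; POSIX-style paths assumed):

```typescript
import path from "node:path";

// Copy of resolveRepoManagedWorktreesRoot above.
function resolveRepoManagedWorktreesRoot(cwd: string): string | null {
  const normalized = path.resolve(cwd);
  const marker = `${path.sep}.paperclip${path.sep}worktrees${path.sep}`;
  const index = normalized.indexOf(marker);
  if (index === -1) return null;
  const repoRoot = normalized.slice(0, index);
  return path.resolve(repoRoot, ".paperclip", "worktrees");
}

console.log(resolveRepoManagedWorktreesRoot("/repo/.paperclip/worktrees/feat-x/src"));
// → "/repo/.paperclip/worktrees"
console.log(resolveRepoManagedWorktreesRoot("/repo/src")); // → null
```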
function collectClaimedWorktreePorts(homeDir: string, currentInstanceId: string, cwd: string): {
serverPorts: Set<number>;
databasePorts: Set<number>;
} {
const serverPorts = new Set<number>();
const databasePorts = new Set<number>();
const configPaths = new Set<string>();
const instancesDir = path.resolve(homeDir, "instances");
if (existsSync(instancesDir)) {
for (const entry of readdirSync(instancesDir, { withFileTypes: true })) {
if (!entry.isDirectory() || entry.name === currentInstanceId) continue;
const configPath = path.resolve(instancesDir, entry.name, "config.json");
if (existsSync(configPath)) {
configPaths.add(configPath);
}
}
}
const repoManagedWorktreesRoot = resolveRepoManagedWorktreesRoot(cwd);
if (repoManagedWorktreesRoot && existsSync(repoManagedWorktreesRoot)) {
for (const entry of readdirSync(repoManagedWorktreesRoot, { withFileTypes: true })) {
if (!entry.isDirectory()) continue;
const configPath = path.resolve(repoManagedWorktreesRoot, entry.name, ".paperclip", "config.json");
if (existsSync(configPath)) {
configPaths.add(configPath);
}
}
}
for (const configPath of configPaths) {
try {
const config = readConfig(configPath);
if (config?.server.port) {
serverPorts.add(config.server.port);
}
if (config?.database.mode === "embedded-postgres") {
databasePorts.add(config.database.embeddedPostgresPort);
}
} catch {
// Ignore malformed sibling configs.
}
}
return { serverPorts, databasePorts };
}
function detectGitBranchName(cwd: string): string | null {
try {
const value = execFileSync("git", ["branch", "--show-current"], {
@@ -750,24 +808,39 @@ async function ensureEmbeddedPostgres(dataDir: string, preferredPort: number): P
}
const port = await findAvailablePort(preferredPort);
const logBuffer = createEmbeddedPostgresLogBuffer();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: logBuffer.append,
onError: logBuffer.append,
});
if (!existsSync(path.resolve(dataDir, "PG_VERSION"))) {
try {
await instance.initialise();
} catch (error) {
throw formatEmbeddedPostgresError(error, {
fallbackMessage: `Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
}
if (existsSync(postmasterPidFile)) {
rmSync(postmasterPidFile, { force: true });
}
try {
await instance.start();
} catch (error) {
throw formatEmbeddedPostgresError(error, {
fallbackMessage: `Failed to start embedded PostgreSQL on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
return {
port,
@@ -886,10 +959,14 @@ async function runWorktreeInit(opts: WorktreeInitOptions): Promise<void> {
rmSync(paths.instanceRoot, { recursive: true, force: true });
}
const claimedPorts = collectClaimedWorktreePorts(paths.homeDir, paths.instanceId, paths.cwd);
const preferredServerPort = opts.serverPort ?? ((sourceConfig?.server.port ?? 3100) + 1);
const serverPort = await findAvailablePort(preferredServerPort, claimedPorts.serverPorts);
const preferredDbPort = opts.dbPort ?? ((sourceConfig?.database.embeddedPostgresPort ?? 54329) + 1);
const databasePort = await findAvailablePort(
preferredDbPort,
new Set([...claimedPorts.databasePorts, serverPort]),
);
const targetConfig = buildWorktreeConfig({
sourceConfig,
paths,


@@ -28,7 +28,7 @@ These define the contract between server, CLI, and UI.
| File | What it defines |
|---|---|
| `packages/shared/src/types/company-portability.ts` | TypeScript interfaces: `CompanyPortabilityManifest`, `CompanyPortabilityFileEntry`, `CompanyPortabilityEnvInput`, export/import/preview request and result types, manifest entry types for agents, skills, projects, issues, recurring routines, companies. |
| `packages/shared/src/validators/company-portability.ts` | Zod schemas for all portability request/response shapes — used by both server routes and CLI. |
| `packages/shared/src/types/index.ts` | Re-exports portability types. |
| `packages/shared/src/validators/index.ts` | Re-exports portability validators. |
@@ -37,7 +37,8 @@ These define the contract between server, CLI, and UI.
| File | Responsibility |
|---|---|
| `server/src/services/company-portability.ts` | **Core portability service.** Export (manifest generation, markdown file emission, `.paperclip.yaml` sidecars), import (graph resolution, collision handling, entity creation), preview (planned-action summary). Handles skill key derivation, recurring task <-> routine mapping, legacy recurrence migration, and package README generation. References `agentcompanies/v1` version string. |
| `server/src/services/routines.ts` | Paperclip routine runtime service. Portability now exports routines as recurring `TASK.md` entries and imports recurring tasks back through this service. |
| `server/src/services/company-export-readme.ts` | Generates `README.md` and Mermaid org-chart for exported company packages. |
| `server/src/services/index.ts` | Re-exports `companyPortabilityService`. |
@@ -106,7 +107,7 @@ Route registration lives in `server/src/app.ts` via `companyRoutes(db, storage)`
| `PROJECT.md` frontmatter & body | `company-portability.ts` |
| `TASK.md` frontmatter & body | `company-portability.ts` |
| `SKILL.md` packages | `company-portability.ts`, `company-skills.ts` |
| `.paperclip.yaml` vendor sidecar | `company-portability.ts`, `routines.ts`, `CompanyExport.tsx`, `company.ts` (CLI) |
| `manifest.json` | `company-portability.ts` (generation), shared types (schema) |
| ZIP package format | `zip.ts` (UI), `company.ts` (CLI file I/O) |
| Collision resolution | `company-portability.ts` (server), `CompanyImport.tsx` (UI) |


@@ -206,6 +206,17 @@ paperclipai worktree init --from-data-dir ~/.paperclip
paperclipai worktree init --force
```
Repair an already-created repo-managed worktree and reseed its isolated instance from the main default install:
```sh
cd ~/.paperclip/worktrees/PAP-884-ai-commits-component
pnpm paperclipai worktree init --force --seed-mode minimal \
--name PAP-884-ai-commits-component \
--from-config ~/.paperclip/instances/default/config.json
```
That rewrites the worktree-local `.paperclip/config.json` + `.paperclip/.env`, recreates the isolated instance under `~/.paperclip-worktrees/instances/<worktree-id>/`, and preserves the git worktree contents themselves.
**`pnpm paperclipai worktree:make <name> [options]`** — Create `~/NAME` as a git worktree, then initialize an isolated Paperclip instance inside it. This combines `git worktree add` with `worktree init` in a single step.
| Option | Description |


@@ -51,10 +51,9 @@ Public packages are discovered from:
- `packages/`
- `server/`
- `ui/`
- `cli/`
The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs), which:
- finds all public packages
@@ -65,6 +64,18 @@ The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts
Those rewrites are temporary. The working tree is restored after publish or dry-run.
## `@paperclipai/ui` packaging
The UI package publishes prebuilt static assets, not the source workspace.
The `ui` package uses [`scripts/generate-ui-package-json.mjs`](../scripts/generate-ui-package-json.mjs) during `prepack` to swap in a lean publish manifest that:
- keeps the release-managed `name` and `version`
- publishes only `dist/`
- omits the source-only dependency graph from downstream installs
After packing or publishing, `postpack` restores the development manifest automatically.
## Version formats
Paperclip uses calendar versions:
@@ -135,6 +146,7 @@ This is the fastest way to restore the default install path if a stable release
- [`scripts/build-npm.sh`](../scripts/build-npm.sh)
- [`scripts/generate-npm-package-json.mjs`](../scripts/generate-npm-package-json.mjs)
- [`scripts/generate-ui-package-json.mjs`](../scripts/generate-ui-package-json.mjs)
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
- [`cli/esbuild.config.mjs`](../cli/esbuild.config.mjs)
- [`doc/RELEASING.md`](RELEASING.md)


@@ -35,6 +35,7 @@ At minimum that includes:
- `paperclipai`
- `@paperclipai/server`
- `@paperclipai/ui`
- public packages under `packages/`
### 2.1. In npm, open each package settings page


@@ -860,11 +860,15 @@ Export/import behavior in V1:
- export emits a clean vendor-neutral markdown package plus `.paperclip.yaml`
- projects and starter tasks are opt-in export content rather than default package content
- recurring `TASK.md` entries use `recurring: true` in the base package and Paperclip routine fidelity in `.paperclip.yaml`
- Paperclip imports recurring task packages as routines instead of downgrading them to one-time issues
- export strips environment-specific paths (`cwd`, local instruction file paths, inline prompt duplication) while preserving portable project repo/workspace metadata such as `repoUrl`, refs, and workspace-policy references keyed in `.paperclip.yaml`
- export never includes secret values; env inputs are reported as portable declarations instead
- import supports target modes:
- create a new company
- import into an existing company
- import recreates exported project workspaces and remaps portable workspace keys back to target-local workspace ids
- import forces imported agent timer heartbeats off so packages never start scheduled runs implicitly
- import supports collision strategies: `rename`, `skip`, `replace`
- import supports preview (dry-run) before apply
- GitHub imports warn on unpinned refs instead of blocking


@@ -186,17 +186,21 @@ The heartbeat is a protocol, not a runtime. Paperclip defines how to initiate an
### Execution Adapters
Agent configuration includes an **adapter** that defines how Paperclip invokes the agent. Built-in adapters include:
| Adapter | Mechanism | Example |
| ---------------- | -------------------------- | -------------------------------------------------- |
| `process` | Execute a child process | `python run_agent.py --agent-id {id}` |
| `http` | Send an HTTP request | `POST https://openclaw.example.com/hook/{id}` |
| `claude_local` | Local Claude Code process | Claude Code heartbeat worker |
| `codex_local` | Local Codex process | Codex CLI heartbeat worker |
| `opencode_local` | Local OpenCode process | OpenCode heartbeat worker |
| `pi_local` | Local Pi process | Pi CLI heartbeat worker |
| `cursor` | Cursor API/CLI bridge | Cursor-integrated heartbeat worker |
| `openclaw_gateway` | OpenClaw gateway API | Managed OpenClaw agent via gateway |
| `hermes_local` | Local Hermes process | Hermes agent heartbeat worker |
The `process` and `http` adapters ship as generic defaults. Additional built-in adapters cover common local coding runtimes (see list above), and new adapter types can be registered via the plugin system (see Plugin / Extension Architecture).
### Adapter Interface
@@ -376,7 +380,7 @@ Flow:
| Layer | Technology |
| -------- | ------------------------------------------------------------ |
| Frontend | React + Vite |
| Backend | TypeScript + Express (REST API, not tRPC — need non-TS clients) |
| Database | PostgreSQL (see [doc/DATABASE.md](./doc/DATABASE.md) for details — PGlite embedded for dev, Docker or hosted Supabase for production) |
| Auth | [Better Auth](https://www.better-auth.com/) |
@@ -406,7 +410,7 @@ No separate "agent API" vs. "board API." Same endpoints, different authorization
### Work Artifacts
Paperclip manages task-linked work artifacts: issue documents (rich-text plans, specs, notes attached to issues) and file attachments. Agents read and write these through the API as part of normal task execution. Full delivery infrastructure (code repos, deployments, production runtime) remains the agent's domain; Paperclip orchestrates the work, not the build pipeline.
### Open Questions
@@ -476,15 +480,14 @@ Each is a distinct page/route:
- [ ] **Default agent** — basic Claude Code/Codex loop with Paperclip skill
- [ ] **Default CEO** — strategic planning, delegation, board communication
- [ ] **Paperclip skill (SKILL.md)** — teaches agents to interact with the API
- [ ] **REST API** — full API for agent interaction (Express)
- [ ] **Web UI** — React/Vite: org chart, task board, dashboard, cost views
- [ ] **Agent auth** — connection string generation with URL + key + instructions
- [ ] **One-command dev setup** — embedded PGlite, everything local
- [ ] **Multiple Adapter types** (HTTP Adapter, OpenClaw Adapter)
- [ ] **Multiple Adapter types** (HTTP, OpenClaw gateway, and local coding adapters)
### Not V1
- Template export/import
- Knowledge base - a future plugin
- Advanced governance models (hiring budgets, multi-member boards)
- Revenue/expense tracking beyond token costs - a future plugin
@@ -509,7 +512,7 @@ Things Paperclip explicitly does **not** do:
- **Not a SaaS** — single-tenant, self-hosted
- **Not opinionated about Agent implementation** — any language, any framework, any runtime
- **Not automatically self-healing** — surfaces problems, doesn't silently fix them
- **Does not manage work artifacts** — no repo management, no deployment, no file systems
- **Does not manage delivery infrastructure** — no repo management, no deployment, no file systems (but does manage task-linked documents and attachments)
- **Does not auto-reassign work** — stale tasks are surfaced, not silently redistributed
- **Does not track external revenue/expenses** — that's a future plugin. Token/LLM cost budgeting is core.


@@ -1,7 +1,7 @@
# Agent Runtime Guide
Status: User-facing guide
Last updated: 2026-02-17
Status: User-facing guide
Last updated: 2026-03-26
Audience: Operators setting up and running agents in Paperclip
## 1. What this system does
@@ -32,14 +32,19 @@ If an agent is already running, new wakeups are merged (coalesced) instead of la
## 3.1 Adapter choice
Common choices:
Built-in adapters:
- `claude_local`: runs your local `claude` CLI
- `codex_local`: runs your local `codex` CLI
- `opencode_local`: runs your local `opencode` CLI
- `hermes_local`: runs your local `hermes` CLI
- `cursor`: runs Cursor in background mode
- `pi_local`: runs an embedded Pi agent locally
- `openclaw_gateway`: connects to an OpenClaw gateway endpoint
- `process`: generic shell command adapter
- `http`: calls an external HTTP endpoint
For `claude_local` and `codex_local`, Paperclip assumes the CLI is already installed and authenticated on the host machine.
For local CLI adapters (`claude_local`, `codex_local`, `opencode_local`, `hermes_local`), Paperclip assumes the CLI is already installed and authenticated on the host machine.
## 3.2 Runtime behavior
@@ -69,6 +74,8 @@ You can set:
Templates support variables like `{{agent.id}}`, `{{agent.name}}`, and run context values.
> **Note:** `bootstrapPromptTemplate` is deprecated and should not be used for new agents. Existing configs that use it will continue to work but should be migrated to the managed instructions bundle system.
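As a rough sketch, template variables of this shape can be interpolated with a single replacement pass. The helper below is illustrative only — it is not Paperclip's actual template engine:

```typescript
// Hypothetical interpolation helper for {{dotted.path}} template variables.
// Paperclip's real implementation may differ; this only illustrates the idea.
function renderTemplate(template: string, context: Record<string, unknown>): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (match, key: string) => {
    // Walk the dotted path (e.g. "agent.name") through the context object.
    const value = key.split(".").reduce<unknown>(
      (acc, part) =>
        acc && typeof acc === "object" ? (acc as Record<string, unknown>)[part] : undefined,
      context,
    );
    // Unknown variables pass through untouched rather than becoming "undefined".
    return value === undefined || value === null ? match : String(value);
  });
}

const prompt = renderTemplate(
  "You are {{agent.name}} ({{agent.id}}). Check your Paperclip inbox.",
  { agent: { id: "agent-42", name: "CTO" } },
);
// prompt === "You are CTO (agent-42). Check your Paperclip inbox."
```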
## 4. Session resume behavior
Paperclip stores session IDs for resumable adapters.
@@ -133,7 +140,7 @@ If the connection drops, the UI reconnects automatically.
If runs fail repeatedly:
1. Check adapter command availability (`claude`/`codex` installed and logged in).
1. Check adapter command availability (e.g. `claude`/`codex`/`opencode`/`hermes` installed and logged in).
2. Verify `cwd` exists and is accessible.
3. Inspect run error + stderr excerpt, then full log.
4. Confirm timeout is not too low.
@@ -166,9 +173,9 @@ Start with least privilege where possible, and avoid exposing secrets in broad r
## 10. Minimal setup checklist
1. Choose adapter (`claude_local` or `codex_local`).
2. Set `cwd` to the target workspace.
3. Add bootstrap + normal prompt templates.
1. Choose adapter (e.g. `claude_local`, `codex_local`, `opencode_local`, `hermes_local`, `cursor`, or `openclaw_gateway`).
2. Set `cwd` to the target workspace (for local adapters).
3. Optionally add a prompt template (`promptTemplate`) or use the managed instructions bundle.
4. Configure heartbeat policy (timer and/or assignment wakeups).
5. Trigger a manual wakeup.
6. Confirm run succeeds and session/token usage is recorded.
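Putting the checklist together, a minimal local-adapter configuration might look like this. The field names follow the conventions above, but treat the exact shape as illustrative rather than authoritative:

```yaml
# Hypothetical minimal agent runtime config — field names are illustrative.
adapter: claude_local        # step 1: choose an adapter
cwd: /home/me/projects/app   # step 2: target workspace (local adapters only)
promptTemplate: |            # step 3: optional; or use the managed instructions bundle
  You are {{agent.name}}. Check Paperclip for assigned work.
heartbeat:                   # step 4: timer and/or assignment wakeups
  timer: 30m
  onAssignment: true
```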


@@ -253,17 +253,7 @@ owner: cto
name: Monday Review
assignee: ceo
project: q2-launch
schedule:
timezone: America/Chicago
startsAt: 2026-03-16T09:00:00-05:00
recurrence:
frequency: weekly
interval: 1
weekdays:
- monday
time:
hour: 9
minute: 0
recurring: true
```
### Semantics
@@ -271,58 +261,30 @@ schedule:
- body content is the canonical markdown task description
- `assignee` should reference an agent slug inside the package
- `project` should reference a project slug when the task belongs to a `PROJECT.md`
- tasks are intentionally basic seed work: title, markdown body, assignee, and optional recurrence
- `recurring: true` marks the task as ongoing recurring work instead of a one-time starter task
- tasks are intentionally basic seed work: title, markdown body, assignee, project linkage, and optional `recurring: true`
- tools may also support optional fields like `priority`, `labels`, or `metadata`, but they should not require them in the base package
### Scheduling
### Recurring Tasks
The scheduling model is intentionally lightweight. It should cover common recurring patterns such as:
- the base package only needs to say whether a task is recurring
- vendors may attach the actual schedule / trigger / runtime fidelity in a vendor extension such as `.paperclip.yaml`
- this keeps `TASK.md` portable while still allowing richer runtime systems to round-trip their own automation details
- legacy packages may still use `schedule.recurrence` during transition, but exporters should prefer `recurring: true`
- every 6 hours
- every weekday at 9:00
- every Monday morning
- every month on the 1st
- every first Monday of the month
- every year on January 1
Suggested shape:
Example Paperclip extension:
```yaml
schedule:
timezone: America/Chicago
startsAt: 2026-03-14T09:00:00-05:00
recurrence:
frequency: hourly | daily | weekly | monthly | yearly
interval: 1
weekdays:
- monday
- wednesday
monthDays:
- 1
- 15
ordinalWeekdays:
- weekday: monday
ordinal: 1
months:
- 1
- 6
time:
hour: 9
minute: 0
until: 2026-12-31T23:59:59-06:00
count: 10
routines:
monday-review:
triggers:
- kind: schedule
cronExpression: "0 9 * * 1"
timezone: America/Chicago
```
Rules:
- `timezone` should use an IANA timezone like `America/Chicago`
- `startsAt` anchors the first occurrence
- `frequency` and `interval` are the only required recurrence fields
- `weekdays`, `monthDays`, `ordinalWeekdays`, and `months` are optional narrowing rules
- `ordinalWeekdays` uses `ordinal` values like `1`, `2`, `3`, `4`, or `-1` for “last”
- `time.hour` and `time.minute` keep common “morning / 9:00 / end of day” scheduling human-readable
- `until` and `count` are optional recurrence end bounds
- tools may accept richer calendar syntaxes such as RFC5545 `RRULE`, but exporters should prefer the structured form above
- vendors should ignore unknown recurring-task extensions they do not understand
- vendors importing legacy `schedule.recurrence` data may translate it into their own runtime trigger model, but new exports should prefer the simpler `recurring: true` base field
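Putting these rules together, a portable recurring task plus its Paperclip sidecar might look like this (paths and slugs are illustrative):

```yaml
# tasks/monday-review/TASK.md frontmatter (vendor-neutral base)
name: Monday Review
assignee: ceo
recurring: true

# .paperclip.yaml sidecar (the vendor extension carries the actual trigger)
routines:
  monday-review:
    triggers:
      - kind: schedule
        cronExpression: "0 9 * * 1"   # every Monday at 09:00
        timezone: America/Chicago
```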
## 11. SKILL.md Compatibility
@@ -449,7 +411,7 @@ Suggested import UI behavior:
- selecting an agent auto-selects required docs and referenced skills
- selecting a team auto-selects its subtree
- selecting a project auto-selects its included tasks
- selecting a recurring task should surface its schedule before import
- selecting a recurring task should make it clear that the import target is a routine / automation, not a one-time task
- selecting referenced third-party content shows attribution, license, and fetch policy
## 15. Vendor Extensions
@@ -502,6 +464,12 @@ agents:
kind: plain
requirement: optional
default: claude
routines:
monday-review:
triggers:
- kind: schedule
cronExpression: "0 9 * * 1"
timezone: America/Chicago
```
Additional rules for Paperclip exporters:
@@ -520,7 +488,7 @@ A compliant exporter should:
- omit machine-local ids and timestamps
- omit secret values
- omit machine-specific paths
- preserve task descriptions and recurrence definitions when exporting tasks
- preserve task descriptions and recurring-task declarations when exporting tasks
- omit empty/default fields
- default to the vendor-neutral base package
- Paperclip exporters should emit `.paperclip.yaml` as a sidecar by default
@@ -569,11 +537,11 @@ Paperclip can map this spec to its runtime model like this:
- `TEAM.md` -> importable org subtree
- `AGENTS.md` -> agent identity and instructions
- `PROJECT.md` -> starter project definition
- `TASK.md` -> starter issue/task definition, or automation template when recurrence is present
- `TASK.md` -> starter issue/task definition, or recurring task template when `recurring: true`
- `SKILL.md` -> imported skill package
- `sources[]` -> provenance and pinned upstream refs
- Paperclip extension:
- `.paperclip.yaml` -> adapter config, runtime config, env input declarations, permissions, budgets, and other Paperclip-specific fidelity
- `.paperclip.yaml` -> adapter config, runtime config, env input declarations, permissions, budgets, routine triggers, and other Paperclip-specific fidelity
Inline Paperclip-only metadata that must live inside a shared markdown file should use:


@@ -46,9 +46,11 @@
"guides/board-operator/managing-agents",
"guides/board-operator/org-structure",
"guides/board-operator/managing-tasks",
"guides/board-operator/delegation",
"guides/board-operator/approvals",
"guides/board-operator/costs-and-budgets",
"guides/board-operator/activity-log"
"guides/board-operator/activity-log",
"guides/board-operator/importing-and-exporting"
]
},
{


@@ -0,0 +1,122 @@
---
title: How Delegation Works
summary: How the CEO breaks down goals into tasks and assigns them to agents
---
Delegation is one of Paperclip's most powerful features. You set company goals, and the CEO agent automatically breaks them into tasks and assigns them to the right agents. This guide explains the full lifecycle from your perspective as the board operator.
## The Delegation Lifecycle
When you create a company goal, the CEO doesn't just acknowledge it — it builds a plan and mobilizes the team:
```
You set a company goal
→ CEO wakes up on heartbeat
→ CEO proposes a strategy (creates an approval for you)
→ You approve the strategy
→ CEO breaks goals into tasks and assigns them to reports
→ Reports wake up (heartbeat triggered by assignment)
→ Reports execute work and update task status
→ CEO monitors progress, unblocks, and escalates
→ You see results in the dashboard and activity log
```
Each step is traceable. Every task links back to the goal through a parent hierarchy, so you can always see why work is happening.
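The traceability claim can be pictured as a walk up the parent chain. A hypothetical sketch — the task shape here is illustrative, not Paperclip's actual schema:

```typescript
// Illustrative task records with a parent pointer back toward the goal.
interface TaskNode {
  id: string;
  title: string;
  parentId: string | null; // null means this is the top-level goal
}

// Walk from any task up to its root goal, collecting the chain.
function traceToGoal(tasks: Map<string, TaskNode>, taskId: string): string[] {
  const chain: string[] = [];
  let current = tasks.get(taskId);
  while (current) {
    chain.push(current.title);
    current = current.parentId ? tasks.get(current.parentId) : undefined;
  }
  return chain; // leaf-first, goal last
}

const tasks = new Map<string, TaskNode>([
  ["g1", { id: "g1", title: "Ship landing page", parentId: null }],
  ["t1", { id: "t1", title: "Build signup form", parentId: "g1" }],
  ["t2", { id: "t2", title: "Wire form validation", parentId: "t1" }],
]);
// traceToGoal(tasks, "t2")
// → ["Wire form validation", "Build signup form", "Ship landing page"]
```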
## What You Need to Do
Your role is strategic oversight, not task management. Here's what the delegation model expects from you:
1. **Set clear company goals.** The CEO works from these. Specific, measurable goals produce better delegation. "Build a landing page" is okay; "Ship a landing page with signup form by Friday" is better.
2. **Approve the CEO's strategy.** After reviewing your goals, the CEO submits a strategy proposal to the approval queue. Review it, then approve, reject, or request revisions.
3. **Approve hire requests.** When the CEO needs more capacity (e.g., a frontend engineer to build the landing page), it submits a hire request. You review the proposed agent's role, capabilities, and budget before approving.
4. **Monitor progress.** Use the dashboard and activity log to track how work is flowing. Check task status, agent activity, and completion rates.
5. **Intervene only when things stall.** If progress stops, check these in order:
- Is an approval pending in your queue?
- Is an agent paused or in an error state?
- Is the CEO's budget exhausted (above 80%, it focuses on critical tasks only)?
## What the CEO Does Automatically
You do **not** need to tell the CEO to engage specific agents. After you approve its strategy, the CEO:
- **Breaks goals into concrete tasks** with clear descriptions, priorities, and acceptance criteria
- **Assigns tasks to the right agent** based on role and capabilities (e.g., engineering tasks go to the CTO or engineers, marketing tasks go to the CMO)
- **Creates subtasks** when work needs to be decomposed further
- **Hires new agents** when the team lacks capacity for a goal (subject to your approval)
- **Monitors progress** on each heartbeat, checking task status and unblocking reports
- **Escalates to you** when it encounters something it can't resolve — budget issues, blocked approvals, or strategic ambiguity
## Common Delegation Patterns
### Flat Hierarchy (Small Teams)
For small companies with 3-5 agents, the CEO delegates directly to each report:
```
CEO
├── CTO (engineering tasks)
├── CMO (marketing tasks)
└── Designer (design tasks)
```
The CEO assigns tasks directly. Each agent works independently and reports status back.
### Three-Level Hierarchy (Larger Teams)
For larger organizations, managers delegate further down the chain:
```
CEO
├── CTO
│ ├── Backend Engineer
│ └── Frontend Engineer
└── CMO
└── Content Writer
```
The CEO assigns high-level tasks to the CTO and CMO. They break those into subtasks and assign them to their own reports. You only interact with the CEO — the rest happens automatically.
### Hire-on-Demand
The CEO can start as the only agent and hire as work requires:
1. You set a goal that needs engineering work
2. The CEO proposes a strategy that includes hiring a CTO
3. You approve the hire
4. The CEO assigns engineering tasks to the new CTO
5. As scope grows, the CTO may request to hire engineers
This pattern lets you start small and scale the team based on actual work, not upfront planning.
## Troubleshooting
### "Why isn't the CEO delegating?"
If you've set a goal but nothing is happening, check these common causes:
| Check | What to look for |
|-------|-----------------|
| **Approval queue** | The CEO may have submitted a strategy or hire request that's waiting for your approval. This is the most common reason. |
| **Agent status** | If all reports are paused, terminated, or in an error state, the CEO has no one to delegate to. Check the Agents page. |
| **Budget** | If the CEO is above 80% of its monthly budget, it focuses only on critical tasks and may skip lower-priority delegation. |
| **Goals** | If no company goals are set, the CEO has nothing to work from. Create a goal first. |
| **Heartbeat** | Is the CEO's heartbeat enabled and running? Check the agent detail page for recent heartbeat history. |
| **Agent instructions** | The CEO's delegation behavior is driven by its `AGENTS.md` instructions file. Open the CEO agent's detail page and verify that its instructions path is set and that the file includes delegation directives (subtask creation, hiring, assignment). If AGENTS.md is missing or doesn't mention delegation, the CEO won't know to break down goals and assign work. |
### "Do I have to tell the CEO to engage engineering and marketing?"
**No.** The CEO will delegate automatically after you approve its strategy. It knows the org chart and assigns tasks based on each agent's role and capabilities. You set the goal and approve the plan — the CEO handles task breakdown and assignment.
### "A task seems stuck"
If a specific task isn't progressing:
1. Check the task's comment thread — the assigned agent may have posted a blocker
2. Check if the task is in `blocked` status — read the blocker comment to understand why
3. Check the assigned agent's status — it may be paused or over budget
4. If the agent is stuck, you can reassign the task or add a comment with guidance


@@ -0,0 +1,203 @@
---
title: Importing & Exporting Companies
summary: Export companies to portable packages and import them from local paths or GitHub
---
Paperclip companies can be exported to portable markdown packages and imported from local directories or GitHub repositories. This lets you share company configurations, duplicate setups, and version-control your agent teams.
## Package Format
Exported packages follow the [Agent Companies specification](/companies/companies-spec) and use a markdown-first structure:
```text
my-company/
├── COMPANY.md # Company metadata
├── agents/
│ ├── ceo/AGENT.md # Agent instructions + frontmatter
│ └── cto/AGENT.md
├── projects/
│ └── main/PROJECT.md
├── skills/
│ └── review/SKILL.md
├── tasks/
│ └── onboarding/TASK.md
└── .paperclip.yaml # Adapter config, env inputs, routines
```
- **COMPANY.md** defines company name, description, and metadata.
- **AGENT.md** files contain agent identity, role, and instructions.
- **SKILL.md** files are compatible with the Agent Skills ecosystem.
- **.paperclip.yaml** holds Paperclip-specific config (adapter types, env inputs, budgets) as an optional sidecar.
## Exporting a Company
Export a company into a portable folder:
```sh
paperclipai company export <company-id> --out ./my-export
```
### Options
| Option | Description | Default |
|--------|-------------|---------|
| `--out <path>` | Output directory (required) | — |
| `--include <values>` | Comma-separated set: `company`, `agents`, `projects`, `issues`, `tasks`, `skills` | `company,agents` |
| `--skills <values>` | Export only specific skill slugs | all |
| `--projects <values>` | Export only specific project shortnames or IDs | all |
| `--issues <values>` | Export specific issue identifiers or IDs | none |
| `--project-issues <values>` | Export issues belonging to specific projects | none |
| `--expand-referenced-skills` | Vendor skill file contents instead of keeping upstream references | `false` |
### Examples
```sh
# Export company with agents and projects
paperclipai company export abc123 --out ./backup --include company,agents,projects
# Export everything including tasks and skills
paperclipai company export abc123 --out ./full-export --include company,agents,projects,tasks,skills
# Export only specific skills
paperclipai company export abc123 --out ./skills-only --include skills --skills review,deploy
```
### What Gets Exported
- Company name, description, and metadata
- Agent names, roles, reporting structure, and instructions
- Project definitions and workspace config
- Task/issue descriptions (when included)
- Skill packages (as references or vendored content)
- Adapter type and env input declarations in `.paperclip.yaml`
Secret values, machine-local paths, and database IDs are **never** exported.
## Importing a Company
Import from a local directory, GitHub URL, or GitHub shorthand:
```sh
# From a local folder
paperclipai company import ./my-export
# From a GitHub URL
paperclipai company import https://github.com/org/repo
# From a GitHub subfolder
paperclipai company import https://github.com/org/repo/tree/main/companies/acme
# From GitHub shorthand
paperclipai company import org/repo
paperclipai company import org/repo/companies/acme
```
### Options
| Option | Description | Default |
|--------|-------------|---------|
| `--target <mode>` | `new` (create a new company) or `existing` (merge into existing) | inferred from context |
| `--company-id <id>` | Target company ID for `--target existing` | current context |
| `--new-company-name <name>` | Override company name for `--target new` | from package |
| `--include <values>` | Comma-separated set: `company`, `agents`, `projects`, `issues`, `tasks`, `skills` | auto-detected |
| `--agents <list>` | Comma-separated agent slugs to import, or `all` | `all` |
| `--collision <mode>` | How to handle name conflicts: `rename`, `skip`, or `replace` | `rename` |
| `--ref <value>` | Git ref for GitHub imports (branch, tag, or commit) | default branch |
| `--dry-run` | Preview what would be imported without applying | `false` |
| `--yes` | Skip the interactive confirmation prompt | `false` |
| `--json` | Output result as JSON | `false` |
### Target Modes
- **`new`** — Creates a fresh company from the package. Good for duplicating a company template.
- **`existing`** — Merges the package into an existing company. Use `--company-id` to specify the target.
If `--target` is not specified, Paperclip infers it: if a `--company-id` is provided (or one exists in context), it defaults to `existing`; otherwise `new`.
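That inference rule is simple enough to sketch. The helper below is illustrative, not the CLI's actual resolver:

```typescript
// Hypothetical sketch of the --target inference rule described above.
type TargetMode = "new" | "existing";

function inferTargetMode(opts: {
  target?: TargetMode;       // explicit --target flag
  companyId?: string;        // --company-id flag
  contextCompanyId?: string; // company already selected in CLI context
}): TargetMode {
  if (opts.target) return opts.target; // an explicit flag always wins
  if (opts.companyId || opts.contextCompanyId) return "existing";
  return "new";
}

// inferTargetMode({})                        → "new"
// inferTargetMode({ companyId: "abc123" })   → "existing"
```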
### Collision Strategies
When importing into an existing company, agent or project names may conflict with existing ones:
- **`rename`** (default) — Appends a suffix to avoid conflicts (e.g., `ceo` becomes `ceo-2`).
- **`skip`** — Skips entities that already exist.
- **`replace`** — Overwrites existing entities. Only available to trusted imports; the CEO API's safe import routes reject it.
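The `rename` strategy can be sketched as a suffix search. This is a hypothetical helper — Paperclip's actual suffixing may differ:

```typescript
// Find a free slug by appending -2, -3, ... until no collision remains.
function resolveRenameCollision(existing: Set<string>, slug: string): string {
  if (!existing.has(slug)) return slug; // no conflict: keep the original slug
  let n = 2;
  while (existing.has(`${slug}-${n}`)) n += 1;
  return `${slug}-${n}`;
}

// resolveRenameCollision(new Set(["ceo"]), "ceo") → "ceo-2"
```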
### Interactive Selection
When running interactively (no `--yes` or `--json` flags), the import command shows a selection picker before applying. You can choose exactly which agents, projects, skills, and tasks to import using a checkbox interface.
### Preview Before Applying
Always preview first with `--dry-run`:
```sh
paperclipai company import org/repo --target existing --company-id abc123 --dry-run
```
The preview shows:
- **Package contents** — How many agents, projects, tasks, and skills are in the source
- **Import plan** — What will be created, renamed, skipped, or replaced
- **Env inputs** — Environment variables that may need values after import
- **Warnings** — Potential issues like missing skills or unresolved references
Imported agents always land with timer heartbeats disabled. Assignment/on-demand wake behavior from the package is preserved, but scheduled runs stay off until a board operator re-enables them.
### Common Workflows
**Clone a company template from GitHub:**
```sh
paperclipai company import org/company-templates/engineering-team \
--target new \
--new-company-name "My Engineering Team"
```
**Add agents from a package into your existing company:**
```sh
paperclipai company import ./shared-agents \
--target existing \
--company-id abc123 \
--include agents \
--collision rename
```
**Import a specific branch or tag:**
```sh
paperclipai company import org/repo --ref v2.0.0 --dry-run
```
**Non-interactive import (CI/scripts):**
```sh
paperclipai company import ./package \
--target new \
--yes \
--json
```
## API Endpoints
The CLI commands use these API endpoints under the hood:
| Action | Endpoint |
|--------|----------|
| Export company | `POST /api/companies/{companyId}/export` |
| Preview import (existing company) | `POST /api/companies/{companyId}/imports/preview` |
| Apply import (existing company) | `POST /api/companies/{companyId}/imports/apply` |
| Preview import (new company) | `POST /api/companies/import/preview` |
| Apply import (new company) | `POST /api/companies/import` |
CEO agents can also use the safe import routes (`/imports/preview` and `/imports/apply`) which enforce non-destructive rules: `replace` is rejected, collisions resolve with `rename` or `skip`, and issues are always created as new.
## GitHub Sources
Paperclip supports several GitHub URL formats:
- Full URL: `https://github.com/org/repo`
- Subfolder URL: `https://github.com/org/repo/tree/main/path/to/company`
- Shorthand: `org/repo`
- Shorthand with path: `org/repo/path/to/company`
Use `--ref` to pin to a specific branch, tag, or commit hash when importing from GitHub.
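A sketch of how such source strings could be classified — the parsing below is illustrative, not the CLI's actual resolver:

```typescript
// Split a GitHub source into repo coordinates plus an optional subfolder path.
interface GitHubSource {
  owner: string;
  repo: string;
  subpath: string | null;
}

function parseGitHubSource(input: string): GitHubSource | null {
  // Normalize full URLs like https://github.com/org/repo/tree/main/path
  const stripped = input.replace(/^https:\/\/github\.com\//, "");
  const parts = stripped.split("/").filter(Boolean);
  if (parts.length < 2) return null; // need at least org/repo
  const [owner, repo, ...rest] = parts;
  // Drop the tree/<ref> segment that full subfolder URLs include.
  const sub = rest[0] === "tree" ? rest.slice(2) : rest;
  return { owner, repo, subpath: sub.length ? sub.join("/") : null };
}

// parseGitHubSource("org/repo/companies/acme")
// → { owner: "org", repo: "repo", subpath: "companies/acme" }
```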


@@ -1,9 +1,9 @@
---
title: Core Concepts
summary: Companies, agents, issues, heartbeats, and governance
summary: Companies, agents, issues, delegation, heartbeats, and governance
---
Paperclip organizes autonomous AI work around five key concepts.
Paperclip organizes autonomous AI work around six key concepts.
## Company
@@ -50,6 +50,17 @@ Terminal states: `done`, `cancelled`.
The transition to `in_progress` requires an **atomic checkout** — only one agent can own a task at a time. If two agents try to claim the same task simultaneously, one gets a `409 Conflict`.
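The atomic-checkout guarantee behaves like a compare-and-set: the claim only succeeds if the task is still unowned at the moment of the update. An illustrative in-memory sketch — the real system enforces this against the database, not a Map:

```typescript
// Simulate atomic task checkout with compare-and-set semantics.
interface Task {
  status: "todo" | "in_progress";
  ownerId: string | null;
}

function tryCheckout(tasks: Map<string, Task>, taskId: string, agentId: string): number {
  const task = tasks.get(taskId);
  if (!task) return 404;
  if (task.status !== "todo") return 409; // someone else already owns it
  task.status = "in_progress";
  task.ownerId = agentId;
  return 200;
}

const tasks = new Map<string, Task>([["t1", { status: "todo", ownerId: null }]]);
// tryCheckout(tasks, "t1", "agent-a") → 200
// tryCheckout(tasks, "t1", "agent-b") → 409 (the second claimer loses)
```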
## Delegation
The CEO is the primary delegator. When you set company goals, the CEO:
1. Creates a strategy and submits it for your approval
2. Breaks approved goals into tasks
3. Assigns tasks to agents based on their role and capabilities
4. Hires new agents when needed (subject to your approval)
You don't need to manually assign every task — set the goals and let the CEO organize the work. You approve key decisions (strategy, hiring) and monitor progress. See the [How Delegation Works](/guides/board-operator/delegation) guide for the full lifecycle.
## Heartbeats
Agents don't run continuously. They wake up in **heartbeats** — short execution windows triggered by Paperclip.


@@ -35,8 +35,8 @@
"test:release-smoke:headed": "npx playwright test --config tests/release-smoke/playwright.config.ts --headed"
},
"devDependencies": {
"cross-env": "^10.1.0",
"@playwright/test": "^1.58.2",
"cross-env": "^10.1.0",
"esbuild": "^0.27.3",
"typescript": "^5.7.3",
"vitest": "^3.0.5"
@@ -44,5 +44,10 @@
"engines": {
"node": ">=20"
},
"packageManager": "pnpm@9.15.4"
"packageManager": "pnpm@9.15.4",
"pnpm": {
"patchedDependencies": {
"embedded-postgres@18.1.0-beta.16": "patches/embedded-postgres@18.1.0-beta.16.patch"
}
}
}


@@ -352,7 +352,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
const combinedPath = path.join(skillsDir, "agent-instructions.md");
await fs.writeFile(combinedPath, instructionsContent + pathDirective, "utf-8");
effectiveInstructionsFilePath = combinedPath;
await onLog("stderr", `[paperclip] Loaded agent instructions file: ${instructionsFilePath}\n`);
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(


@@ -42,7 +42,7 @@ Notes:
- Prompts are piped via stdin (Codex receives "-" prompt argument).
- If instructionsFilePath is configured, Paperclip prepends that file's contents to the stdin prompt on every run.
- Codex exec automatically applies repo-scoped AGENTS.md instructions from the active workspace. Paperclip cannot suppress that discovery in exec mode, so repo AGENTS.md files may still apply even when you only configured an explicit instructionsFilePath.
- Paperclip injects desired local skills into the active workspace's ".agents/skills" directory at execution time so Codex can discover "$paperclip" and related skills without coupling them to the user's login home.
- Paperclip injects desired local skills into the effective CODEX_HOME/skills/ directory at execution time so Codex can discover "$paperclip" and related skills without polluting the project working directory. In managed-home mode (the default) this is ~/.paperclip/instances/<id>/companies/<companyId>/codex-home/skills/; when CODEX_HOME is explicitly overridden in adapter config, that override is used instead.
- Unless explicitly overridden in adapter config, Paperclip runs Codex with a per-company managed CODEX_HOME under the active Paperclip instance and seeds auth/config from the shared Codex home (the CODEX_HOME env var, when set, or ~/.codex).
- Some model/tool combinations reject certain effort levels (for example minimal with web search enabled).
- When Paperclip realizes a workspace/runtime for a run, it injects PAPERCLIP_WORKSPACE_* and PAPERCLIP_RUNTIME_* env vars for agent-side tooling.


@@ -21,7 +21,7 @@ import {
runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import { parseCodexJsonl, isCodexUnknownSessionError } from "./parse.js";
import { pathExists, prepareManagedCodexHome, resolveManagedCodexHomeDir } from "./codex-home.js";
import { pathExists, prepareManagedCodexHome, resolveManagedCodexHomeDir, resolveSharedCodexHomeDir } from "./codex-home.js";
import { resolveCodexDesiredSkillNames } from "./skills.js";
const __moduleDir = path.dirname(fileURLToPath(import.meta.url));
@@ -135,8 +135,8 @@ async function pruneBrokenUnavailablePaperclipSkillSymlinks(
}
}
function resolveCodexWorkspaceSkillsDir(cwd: string): string {
return path.join(cwd, ".agents", "skills");
function resolveCodexSkillsDir(codexHome: string): string {
return path.join(codexHome, "skills");
}
type EnsureCodexSkillsInjectedOptions = {
@@ -157,7 +157,7 @@ export async function ensureCodexSkillsInjected(
const skillsEntries = allSkillsEntries.filter((entry) => desiredSet.has(entry.key));
if (skillsEntries.length === 0) return;
const skillsHome = options.skillsHome ?? resolveCodexWorkspaceSkillsDir(process.cwd());
const skillsHome = options.skillsHome ?? resolveCodexSkillsDir(resolveSharedCodexHomeDir());
await fs.mkdir(skillsHome, { recursive: true });
const linkSkill = options.linkSkill;
for (const entry of skillsEntries) {
@@ -273,11 +273,13 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
const defaultCodexHome = resolveManagedCodexHomeDir(process.env, agent.companyId);
const effectiveCodexHome = configuredCodexHome ?? preparedManagedCodexHome ?? defaultCodexHome;
await fs.mkdir(effectiveCodexHome, { recursive: true });
const codexWorkspaceSkillsDir = resolveCodexWorkspaceSkillsDir(cwd);
// Inject skills into the same CODEX_HOME that Codex will actually run with
// (managed home in the default case, or an explicit override from adapter config).
const codexSkillsDir = resolveCodexSkillsDir(effectiveCodexHome);
await ensureCodexSkillsInjected(
onLog,
{
skillsHome: codexWorkspaceSkillsDir,
skillsHome: codexSkillsDir,
skillsEntries: codexSkillEntries,
desiredSkillNames,
},
@@ -415,10 +417,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
`The above agent instructions were loaded from ${instructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsDir}.\n\n`;
instructionsChars = instructionsPrefix.length;
await onLog(
"stdout",
`[paperclip] Loaded agent instructions file: ${instructionsFilePath}\n`,
);
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(


@@ -107,8 +107,8 @@ function parsePlanAndEmailFromToken(idToken: string | null, accessToken: string
return { email: null, planType: null };
}
export async function readCodexAuthInfo(): Promise<CodexAuthInfo | null> {
const authPath = path.join(codexHomeDir(), "auth.json");
export async function readCodexAuthInfo(codexHome?: string): Promise<CodexAuthInfo | null> {
const authPath = path.join(codexHome ?? codexHomeDir(), "auth.json");
let raw: string;
try {
raw = await fs.readFile(authPath, "utf8");


@@ -31,7 +31,7 @@ async function buildCodexSkillSnapshot(
sourcePath: entry.source,
targetPath: null,
detail: desiredSet.has(entry.key)
? "Will be linked into the workspace .agents/skills directory on the next run."
? "Will be linked into the effective CODEX_HOME/skills/ directory on the next run."
: null,
required: Boolean(entry.required),
requiredReason: entry.requiredReason ?? null,


@@ -15,6 +15,7 @@ import {
} from "@paperclipai/adapter-utils/server-utils";
import path from "node:path";
import { parseCodexJsonl } from "./parse.js";
import { codexHomeDir, readCodexAuthInfo } from "./quota.js";
function summarizeStatus(checks: AdapterEnvironmentCheck[]): AdapterEnvironmentTestResult["status"] {
if (checks.some((check) => check.level === "error")) return "fail";
@@ -108,12 +109,23 @@ export async function testEnvironment(
detail: `Detected in ${source}.`,
});
} else {
checks.push({
code: "codex_openai_api_key_missing",
level: "warn",
message: "OPENAI_API_KEY is not set. Codex runs may fail until authentication is configured.",
hint: "Set OPENAI_API_KEY in adapter env, shell environment, or Codex auth configuration.",
});
const codexHome = isNonEmpty(env.CODEX_HOME) ? env.CODEX_HOME : undefined;
const codexAuth = await readCodexAuthInfo(codexHome).catch(() => null);
if (codexAuth) {
checks.push({
code: "codex_native_auth_present",
level: "info",
message: "Codex is authenticated via its own auth configuration.",
detail: codexAuth.email ? `Logged in as ${codexAuth.email}.` : `Credentials found in ${path.join(codexHome ?? codexHomeDir(), "auth.json")}.`,
});
} else {
checks.push({
code: "codex_openai_api_key_missing",
level: "warn",
message: "OPENAI_API_KEY is not set. Codex runs may fail until authentication is configured.",
hint: "Set OPENAI_API_KEY in adapter env, shell environment, or run `codex auth` to log in.",
});
}
}
const canRunProbe =

View File

@@ -307,10 +307,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
`The above agent instructions were loaded from ${instructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsDir}.\n\n`;
instructionsChars = instructionsPrefix.length;
await onLog(
"stdout",
`[paperclip] Loaded agent instructions file: ${instructionsFilePath}\n`,
);
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(

View File

@@ -12,6 +12,8 @@ import {
ensurePathInEnv,
runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { DEFAULT_CURSOR_LOCAL_MODEL } from "../index.js";
import { parseCursorJsonl } from "./parse.js";
@@ -49,6 +51,41 @@ function summarizeProbeDetail(stdout: string, stderr: string, parsedError: strin
return clean.length > max ? `${clean.slice(0, max - 1)}…` : clean;
}
export interface CursorAuthInfo {
email: string | null;
displayName: string | null;
userId: number | null;
}
export function cursorConfigPath(cursorHome?: string): string {
return path.join(cursorHome ?? path.join(os.homedir(), ".cursor"), "cli-config.json");
}
export async function readCursorAuthInfo(cursorHome?: string): Promise<CursorAuthInfo | null> {
let raw: string;
try {
raw = await fs.readFile(cursorConfigPath(cursorHome), "utf8");
} catch {
return null;
}
let parsed: unknown;
try {
parsed = JSON.parse(raw);
} catch {
return null;
}
if (typeof parsed !== "object" || parsed === null) return null;
const obj = parsed as Record<string, unknown>;
const authInfo = obj.authInfo;
if (typeof authInfo !== "object" || authInfo === null) return null;
const info = authInfo as Record<string, unknown>;
const email = typeof info.email === "string" && info.email.trim().length > 0 ? info.email.trim() : null;
const displayName = typeof info.displayName === "string" && info.displayName.trim().length > 0 ? info.displayName.trim() : null;
const userId = typeof info.userId === "number" ? info.userId : null;
if (!email && !displayName && userId == null) return null;
return { email, displayName, userId };
}
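The extraction rules in `readCursorAuthInfo` above can be illustrated in isolation: whitespace-only strings count as missing, and an entry with no usable fields yields `null`. This is a standalone re-implementation of just those rules for demonstration (sample values are invented):

```typescript
// Standalone sketch of the auth-info extraction rules above.
// Mirrors the validation in readCursorAuthInfo; not the adapter's export.
function pickAuth(info: Record<string, unknown>): { email: string | null; userId: number | null } | null {
  // Trimmed, non-empty strings only; everything else is treated as absent.
  const email = typeof info.email === "string" && info.email.trim().length > 0 ? info.email.trim() : null;
  const userId = typeof info.userId === "number" ? info.userId : null;
  // No usable fields at all means "not authenticated".
  if (!email && userId == null) return null;
  return { email, userId };
}

console.log(pickAuth({ email: "  dev@example.com ", userId: 7 }));
console.log(pickAuth({ email: "   " }));
```

The second call returns `null` because a whitespace-only email is discarded and no other field is present.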
const CURSOR_AUTH_REQUIRED_RE =
/(?:authentication\s+required|not\s+authenticated|not\s+logged\s+in|unauthorized|invalid(?:\s+or\s+missing)?\s+api(?:[_\s-]?key)?|cursor[_\s-]?api[_\s-]?key|run\s+'?agent\s+login'?\s+first|api(?:[_\s-]?key)?(?:\s+is)?\s+required)/i;
@@ -109,12 +146,25 @@ export async function testEnvironment(
detail: `Detected in ${source}.`,
});
} else {
checks.push({
code: "cursor_api_key_missing",
level: "warn",
message: "CURSOR_API_KEY is not set. Cursor runs may fail until authentication is configured.",
hint: "Set CURSOR_API_KEY in adapter env or run `agent login`.",
});
const cursorHome = isNonEmpty(env.CURSOR_HOME) ? env.CURSOR_HOME : undefined;
const cursorAuth = await readCursorAuthInfo(cursorHome).catch(() => null);
if (cursorAuth) {
checks.push({
code: "cursor_native_auth_present",
level: "info",
message: "Cursor is authenticated via `agent login`.",
detail: cursorAuth.email
? `Logged in as ${cursorAuth.email}.`
: `Credentials found in ${cursorConfigPath(cursorHome)}.`,
});
} else {
checks.push({
code: "cursor_api_key_missing",
level: "warn",
message: "CURSOR_API_KEY is not set. Cursor runs may fail until authentication is configured.",
hint: "Set CURSOR_API_KEY in adapter env or run `agent login`.",
});
}
}
const canRunProbe =

View File

@@ -253,10 +253,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
`${instructionsContents}\n\n` +
`The above agent instructions were loaded from ${instructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsDir}.\n\n`;
await onLog(
"stdout",
`[paperclip] Loaded agent instructions file: ${instructionsFilePath}\n`,
);
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(

View File

@@ -22,6 +22,7 @@ Core fields:
- instructionsFilePath (string, optional): absolute path to a markdown instructions file prepended to the run prompt
- model (string, required): OpenCode model id in provider/model format (for example anthropic/claude-sonnet-4-5)
- variant (string, optional): provider-specific model variant (for example minimal|low|medium|high|max)
- dangerouslySkipPermissions (boolean, optional): inject a runtime OpenCode config that allows \`external_directory\` access without interactive prompts; defaults to true for unattended Paperclip runs
- promptTemplate (string, optional): run prompt template
- command (string, optional): defaults to "opencode"
- extraArgs (string[], optional): additional CLI args
@@ -37,4 +38,10 @@ Notes:
- Paperclip requires an explicit \`model\` value for \`opencode_local\` agents.
- Runs are executed with: opencode run --format json ...
- Sessions are resumed with --session when stored session cwd matches current cwd.
- The adapter sets OPENCODE_DISABLE_PROJECT_CONFIG=true to prevent OpenCode from \
writing an opencode.json config file into the project working directory. Model \
selection is passed via the --model CLI flag instead.
- When \`dangerouslySkipPermissions\` is enabled, Paperclip injects a temporary \
runtime config with \`permission.external_directory=allow\` so headless runs do \
not stall on approval prompts.
`;
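As a concrete illustration of the fields documented above, a hypothetical `opencode_local` agent config might look like the following. Field names come from the doc; every value (including the `extraArgs` flags) is invented for the example:

```typescript
// Hypothetical opencode_local agent config; field names mirror the
// doc above, values are illustrative only.
const config: Record<string, unknown> = {
  model: "anthropic/claude-sonnet-4-5", // required, provider/model format
  variant: "medium",                    // optional provider-specific variant
  dangerouslySkipPermissions: true,     // default: allow external_directory access
  command: "opencode",                  // defaults to "opencode"
  extraArgs: ["--print-logs"],          // hypothetical additional CLI args
};

console.log(JSON.stringify(config, null, 2));
```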

View File

@@ -23,6 +23,7 @@ import {
import { isOpenCodeUnknownSessionError, parseOpenCodeJsonl } from "./parse.js";
import { ensureOpenCodeModelConfiguredAndAvailable } from "./models.js";
import { removeMaintainerOnlySkillSymlinks } from "@paperclipai/adapter-utils/server-utils";
import { prepareOpenCodeRuntimeConfig } from "./runtime-config.js";
const __moduleDir = path.dirname(fileURLToPath(import.meta.url));
@@ -169,238 +170,247 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
for (const [key, value] of Object.entries(envConfig)) {
if (typeof value === "string") env[key] = value;
}
// Prevent OpenCode from writing an opencode.json config file into the
// project working directory (which would pollute the git repo). Model
// selection is already handled via the --model CLI flag. Set after the
// envConfig loop so user overrides cannot disable this guard.
env.OPENCODE_DISABLE_PROJECT_CONFIG = "true";
if (!hasExplicitApiKey && authToken) {
env.PAPERCLIP_API_KEY = authToken;
}
const runtimeEnv = Object.fromEntries(
Object.entries(ensurePathInEnv({ ...process.env, ...env })).filter(
(entry): entry is [string, string] => typeof entry[1] === "string",
),
);
await ensureCommandResolvable(command, cwd, runtimeEnv);
await ensureOpenCodeModelConfiguredAndAvailable({
model,
command,
cwd,
env: runtimeEnv,
});
const timeoutSec = asNumber(config.timeoutSec, 0);
const graceSec = asNumber(config.graceSec, 20);
const extraArgs = (() => {
const fromExtraArgs = asStringArray(config.extraArgs);
if (fromExtraArgs.length > 0) return fromExtraArgs;
return asStringArray(config.args);
})();
const runtimeSessionParams = parseObject(runtime.sessionParams);
const runtimeSessionId = asString(runtimeSessionParams.sessionId, runtime.sessionId ?? "");
const runtimeSessionCwd = asString(runtimeSessionParams.cwd, "");
const canResumeSession =
runtimeSessionId.length > 0 &&
(runtimeSessionCwd.length === 0 || path.resolve(runtimeSessionCwd) === path.resolve(cwd));
const sessionId = canResumeSession ? runtimeSessionId : null;
if (runtimeSessionId && !canResumeSession) {
await onLog(
"stdout",
`[paperclip] OpenCode session "${runtimeSessionId}" was saved for cwd "${runtimeSessionCwd}" and will not be resumed in "${cwd}".\n`,
const preparedRuntimeConfig = await prepareOpenCodeRuntimeConfig({ env, config });
try {
const runtimeEnv = Object.fromEntries(
Object.entries(ensurePathInEnv({ ...process.env, ...preparedRuntimeConfig.env })).filter(
(entry): entry is [string, string] => typeof entry[1] === "string",
),
);
}
await ensureCommandResolvable(command, cwd, runtimeEnv);
const instructionsFilePath = asString(config.instructionsFilePath, "").trim();
const resolvedInstructionsFilePath = instructionsFilePath
? path.resolve(cwd, instructionsFilePath)
: "";
const instructionsDir = resolvedInstructionsFilePath ? `${path.dirname(resolvedInstructionsFilePath)}/` : "";
let instructionsPrefix = "";
if (resolvedInstructionsFilePath) {
try {
const instructionsContents = await fs.readFile(resolvedInstructionsFilePath, "utf8");
instructionsPrefix =
`${instructionsContents}\n\n` +
`The above agent instructions were loaded from ${resolvedInstructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsDir}.\n\n`;
await onLog(
"stdout",
`[paperclip] Loaded agent instructions file: ${resolvedInstructionsFilePath}\n`,
);
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(
"stdout",
`[paperclip] Warning: could not read agent instructions file "${resolvedInstructionsFilePath}": ${reason}\n`,
);
}
}
const commandNotes = (() => {
if (!resolvedInstructionsFilePath) return [] as string[];
if (instructionsPrefix.length > 0) {
return [
`Loaded agent instructions from ${resolvedInstructionsFilePath}`,
`Prepended instructions + path directive to stdin prompt (relative references from ${instructionsDir}).`,
];
}
return [
`Configured instructionsFilePath ${resolvedInstructionsFilePath}, but file could not be read; continuing without injected instructions.`,
];
})();
const bootstrapPromptTemplate = asString(config.bootstrapPromptTemplate, "");
const templateData = {
agentId: agent.id,
companyId: agent.companyId,
runId,
company: { id: agent.companyId },
agent,
run: { id: runId, source: "on_demand" },
context,
};
const renderedPrompt = renderTemplate(promptTemplate, templateData);
const renderedBootstrapPrompt =
!sessionId && bootstrapPromptTemplate.trim().length > 0
? renderTemplate(bootstrapPromptTemplate, templateData).trim()
: "";
const sessionHandoffNote = asString(context.paperclipSessionHandoffMarkdown, "").trim();
const prompt = joinPromptSections([
instructionsPrefix,
renderedBootstrapPrompt,
sessionHandoffNote,
renderedPrompt,
]);
const promptMetrics = {
promptChars: prompt.length,
instructionsChars: instructionsPrefix.length,
bootstrapPromptChars: renderedBootstrapPrompt.length,
sessionHandoffChars: sessionHandoffNote.length,
heartbeatPromptChars: renderedPrompt.length,
};
const buildArgs = (resumeSessionId: string | null) => {
const args = ["run", "--format", "json"];
if (resumeSessionId) args.push("--session", resumeSessionId);
if (model) args.push("--model", model);
if (variant) args.push("--variant", variant);
if (extraArgs.length > 0) args.push(...extraArgs);
return args;
};
const runAttempt = async (resumeSessionId: string | null) => {
const args = buildArgs(resumeSessionId);
if (onMeta) {
await onMeta({
adapterType: "opencode_local",
command,
cwd,
commandNotes,
commandArgs: [...args, `<stdin prompt ${prompt.length} chars>`],
env: redactEnvForLogs(env),
prompt,
promptMetrics,
context,
});
}
const proc = await runChildProcess(runId, command, args, {
await ensureOpenCodeModelConfiguredAndAvailable({
model,
command,
cwd,
env: runtimeEnv,
stdin: prompt,
timeoutSec,
graceSec,
onSpawn,
onLog,
});
return {
proc,
rawStderr: proc.stderr,
parsed: parseOpenCodeJsonl(proc.stdout),
};
};
const toResult = (
attempt: {
proc: { exitCode: number | null; signal: string | null; timedOut: boolean; stdout: string; stderr: string };
rawStderr: string;
parsed: ReturnType<typeof parseOpenCodeJsonl>;
},
clearSessionOnMissingSession = false,
): AdapterExecutionResult => {
if (attempt.proc.timedOut) {
return {
exitCode: attempt.proc.exitCode,
signal: attempt.proc.signal,
timedOut: true,
errorMessage: `Timed out after ${timeoutSec}s`,
clearSession: clearSessionOnMissingSession,
};
const timeoutSec = asNumber(config.timeoutSec, 0);
const graceSec = asNumber(config.graceSec, 20);
const extraArgs = (() => {
const fromExtraArgs = asStringArray(config.extraArgs);
if (fromExtraArgs.length > 0) return fromExtraArgs;
return asStringArray(config.args);
})();
const runtimeSessionParams = parseObject(runtime.sessionParams);
const runtimeSessionId = asString(runtimeSessionParams.sessionId, runtime.sessionId ?? "");
const runtimeSessionCwd = asString(runtimeSessionParams.cwd, "");
const canResumeSession =
runtimeSessionId.length > 0 &&
(runtimeSessionCwd.length === 0 || path.resolve(runtimeSessionCwd) === path.resolve(cwd));
const sessionId = canResumeSession ? runtimeSessionId : null;
if (runtimeSessionId && !canResumeSession) {
await onLog(
"stdout",
`[paperclip] OpenCode session "${runtimeSessionId}" was saved for cwd "${runtimeSessionCwd}" and will not be resumed in "${cwd}".\n`,
);
}
const resolvedSessionId =
attempt.parsed.sessionId ??
(clearSessionOnMissingSession ? null : runtimeSessionId ?? runtime.sessionId ?? null);
const resolvedSessionParams = resolvedSessionId
? ({
sessionId: resolvedSessionId,
cwd,
...(workspaceId ? { workspaceId } : {}),
...(workspaceRepoUrl ? { repoUrl: workspaceRepoUrl } : {}),
...(workspaceRepoRef ? { repoRef: workspaceRepoRef } : {}),
} as Record<string, unknown>)
: null;
const instructionsFilePath = asString(config.instructionsFilePath, "").trim();
const resolvedInstructionsFilePath = instructionsFilePath
? path.resolve(cwd, instructionsFilePath)
: "";
const instructionsDir = resolvedInstructionsFilePath ? `${path.dirname(resolvedInstructionsFilePath)}/` : "";
let instructionsPrefix = "";
if (resolvedInstructionsFilePath) {
try {
const instructionsContents = await fs.readFile(resolvedInstructionsFilePath, "utf8");
instructionsPrefix =
`${instructionsContents}\n\n` +
`The above agent instructions were loaded from ${resolvedInstructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsDir}.\n\n`;
} catch (err) {
const reason = err instanceof Error ? err.message : String(err);
await onLog(
"stdout",
`[paperclip] Warning: could not read agent instructions file "${resolvedInstructionsFilePath}": ${reason}\n`,
);
}
}
const parsedError = typeof attempt.parsed.errorMessage === "string" ? attempt.parsed.errorMessage.trim() : "";
const stderrLine = firstNonEmptyLine(attempt.proc.stderr);
const rawExitCode = attempt.proc.exitCode;
const synthesizedExitCode = parsedError && (rawExitCode ?? 0) === 0 ? 1 : rawExitCode;
const fallbackErrorMessage =
parsedError ||
stderrLine ||
`OpenCode exited with code ${synthesizedExitCode ?? -1}`;
const modelId = model || null;
const commandNotes = (() => {
const notes = [...preparedRuntimeConfig.notes];
if (!resolvedInstructionsFilePath) return notes;
if (instructionsPrefix.length > 0) {
notes.push(`Loaded agent instructions from ${resolvedInstructionsFilePath}`);
notes.push(
`Prepended instructions + path directive to stdin prompt (relative references from ${instructionsDir}).`,
);
return notes;
}
notes.push(
`Configured instructionsFilePath ${resolvedInstructionsFilePath}, but file could not be read; continuing without injected instructions.`,
);
return notes;
})();
return {
exitCode: synthesizedExitCode,
signal: attempt.proc.signal,
timedOut: false,
errorMessage: (synthesizedExitCode ?? 0) === 0 ? null : fallbackErrorMessage,
usage: {
inputTokens: attempt.parsed.usage.inputTokens,
outputTokens: attempt.parsed.usage.outputTokens,
cachedInputTokens: attempt.parsed.usage.cachedInputTokens,
},
sessionId: resolvedSessionId,
sessionParams: resolvedSessionParams,
sessionDisplayId: resolvedSessionId,
provider: parseModelProvider(modelId),
biller: resolveOpenCodeBiller(runtimeEnv, parseModelProvider(modelId)),
model: modelId,
billingType: "unknown",
costUsd: attempt.parsed.costUsd,
resultJson: {
stdout: attempt.proc.stdout,
stderr: attempt.proc.stderr,
},
summary: attempt.parsed.summary,
clearSession: Boolean(clearSessionOnMissingSession && !attempt.parsed.sessionId),
const bootstrapPromptTemplate = asString(config.bootstrapPromptTemplate, "");
const templateData = {
agentId: agent.id,
companyId: agent.companyId,
runId,
company: { id: agent.companyId },
agent,
run: { id: runId, source: "on_demand" },
context,
};
const renderedPrompt = renderTemplate(promptTemplate, templateData);
const renderedBootstrapPrompt =
!sessionId && bootstrapPromptTemplate.trim().length > 0
? renderTemplate(bootstrapPromptTemplate, templateData).trim()
: "";
const sessionHandoffNote = asString(context.paperclipSessionHandoffMarkdown, "").trim();
const prompt = joinPromptSections([
instructionsPrefix,
renderedBootstrapPrompt,
sessionHandoffNote,
renderedPrompt,
]);
const promptMetrics = {
promptChars: prompt.length,
instructionsChars: instructionsPrefix.length,
bootstrapPromptChars: renderedBootstrapPrompt.length,
sessionHandoffChars: sessionHandoffNote.length,
heartbeatPromptChars: renderedPrompt.length,
};
};
const initial = await runAttempt(sessionId);
const initialFailed =
!initial.proc.timedOut && ((initial.proc.exitCode ?? 0) !== 0 || Boolean(initial.parsed.errorMessage));
if (
sessionId &&
initialFailed &&
isOpenCodeUnknownSessionError(initial.proc.stdout, initial.rawStderr)
) {
await onLog(
"stdout",
`[paperclip] OpenCode session "${sessionId}" is unavailable; retrying with a fresh session.\n`,
);
const retry = await runAttempt(null);
return toResult(retry, true);
const buildArgs = (resumeSessionId: string | null) => {
const args = ["run", "--format", "json"];
if (resumeSessionId) args.push("--session", resumeSessionId);
if (model) args.push("--model", model);
if (variant) args.push("--variant", variant);
if (extraArgs.length > 0) args.push(...extraArgs);
return args;
};
const runAttempt = async (resumeSessionId: string | null) => {
const args = buildArgs(resumeSessionId);
if (onMeta) {
await onMeta({
adapterType: "opencode_local",
command,
cwd,
commandNotes,
commandArgs: [...args, `<stdin prompt ${prompt.length} chars>`],
env: redactEnvForLogs(preparedRuntimeConfig.env),
prompt,
promptMetrics,
context,
});
}
const proc = await runChildProcess(runId, command, args, {
cwd,
env: runtimeEnv,
stdin: prompt,
timeoutSec,
graceSec,
onSpawn,
onLog,
});
return {
proc,
rawStderr: proc.stderr,
parsed: parseOpenCodeJsonl(proc.stdout),
};
};
const toResult = (
attempt: {
proc: { exitCode: number | null; signal: string | null; timedOut: boolean; stdout: string; stderr: string };
rawStderr: string;
parsed: ReturnType<typeof parseOpenCodeJsonl>;
},
clearSessionOnMissingSession = false,
): AdapterExecutionResult => {
if (attempt.proc.timedOut) {
return {
exitCode: attempt.proc.exitCode,
signal: attempt.proc.signal,
timedOut: true,
errorMessage: `Timed out after ${timeoutSec}s`,
clearSession: clearSessionOnMissingSession,
};
}
const resolvedSessionId =
attempt.parsed.sessionId ??
(clearSessionOnMissingSession ? null : runtimeSessionId ?? runtime.sessionId ?? null);
const resolvedSessionParams = resolvedSessionId
? ({
sessionId: resolvedSessionId,
cwd,
...(workspaceId ? { workspaceId } : {}),
...(workspaceRepoUrl ? { repoUrl: workspaceRepoUrl } : {}),
...(workspaceRepoRef ? { repoRef: workspaceRepoRef } : {}),
} as Record<string, unknown>)
: null;
const parsedError = typeof attempt.parsed.errorMessage === "string" ? attempt.parsed.errorMessage.trim() : "";
const stderrLine = firstNonEmptyLine(attempt.proc.stderr);
const rawExitCode = attempt.proc.exitCode;
const synthesizedExitCode = parsedError && (rawExitCode ?? 0) === 0 ? 1 : rawExitCode;
const fallbackErrorMessage =
parsedError ||
stderrLine ||
`OpenCode exited with code ${synthesizedExitCode ?? -1}`;
const modelId = model || null;
return {
exitCode: synthesizedExitCode,
signal: attempt.proc.signal,
timedOut: false,
errorMessage: (synthesizedExitCode ?? 0) === 0 ? null : fallbackErrorMessage,
usage: {
inputTokens: attempt.parsed.usage.inputTokens,
outputTokens: attempt.parsed.usage.outputTokens,
cachedInputTokens: attempt.parsed.usage.cachedInputTokens,
},
sessionId: resolvedSessionId,
sessionParams: resolvedSessionParams,
sessionDisplayId: resolvedSessionId,
provider: parseModelProvider(modelId),
biller: resolveOpenCodeBiller(runtimeEnv, parseModelProvider(modelId)),
model: modelId,
billingType: "unknown",
costUsd: attempt.parsed.costUsd,
resultJson: {
stdout: attempt.proc.stdout,
stderr: attempt.proc.stderr,
},
summary: attempt.parsed.summary,
clearSession: Boolean(clearSessionOnMissingSession && !attempt.parsed.sessionId),
};
};
const initial = await runAttempt(sessionId);
const initialFailed =
!initial.proc.timedOut && ((initial.proc.exitCode ?? 0) !== 0 || Boolean(initial.parsed.errorMessage));
if (
sessionId &&
initialFailed &&
isOpenCodeUnknownSessionError(initial.proc.stdout, initial.rawStderr)
) {
await onLog(
"stdout",
`[paperclip] OpenCode session "${sessionId}" is unavailable; retrying with a fresh session.\n`,
);
const retry = await runAttempt(null);
return toResult(retry, true);
}
return toResult(initial);
} finally {
await preparedRuntimeConfig.cleanup();
}
return toResult(initial);
}

View File

@@ -120,7 +120,8 @@ export async function discoverOpenCodeModels(input: {
// /etc/passwd entry (e.g. `docker run --user 1234` with a minimal
// image). Fall back to process.env.HOME.
}
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...env, ...(resolvedHome ? { HOME: resolvedHome } : {}) }));
// Prevent OpenCode from writing an opencode.json into the working directory.
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...env, ...(resolvedHome ? { HOME: resolvedHome } : {}), OPENCODE_DISABLE_PROJECT_CONFIG: "true" }));
const result = await runChildProcess(
`opencode-models-${Date.now()}-${Math.random().toString(16).slice(2)}`,

View File

@@ -0,0 +1,79 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { prepareOpenCodeRuntimeConfig } from "./runtime-config.js";
const cleanupPaths = new Set<string>();
afterEach(async () => {
await Promise.all(
[...cleanupPaths].map(async (filepath) => {
await fs.rm(filepath, { recursive: true, force: true });
cleanupPaths.delete(filepath);
}),
);
});
async function makeConfigHome(initialConfig?: Record<string, unknown>) {
const root = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-opencode-test-"));
cleanupPaths.add(root);
const configDir = path.join(root, "opencode");
await fs.mkdir(configDir, { recursive: true });
if (initialConfig) {
await fs.writeFile(
path.join(configDir, "opencode.json"),
`${JSON.stringify(initialConfig, null, 2)}\n`,
"utf8",
);
}
return root;
}
describe("prepareOpenCodeRuntimeConfig", () => {
it("injects an external_directory allow rule by default", async () => {
const configHome = await makeConfigHome({
permission: {
read: "allow",
},
theme: "system",
});
const prepared = await prepareOpenCodeRuntimeConfig({
env: { XDG_CONFIG_HOME: configHome },
config: {},
});
cleanupPaths.add(prepared.env.XDG_CONFIG_HOME);
expect(prepared.env.XDG_CONFIG_HOME).not.toBe(configHome);
const runtimeConfig = JSON.parse(
await fs.readFile(
path.join(prepared.env.XDG_CONFIG_HOME, "opencode", "opencode.json"),
"utf8",
),
) as Record<string, unknown>;
expect(runtimeConfig).toMatchObject({
theme: "system",
permission: {
read: "allow",
external_directory: "allow",
},
});
await prepared.cleanup();
cleanupPaths.delete(prepared.env.XDG_CONFIG_HOME);
await expect(fs.access(prepared.env.XDG_CONFIG_HOME)).rejects.toThrow();
});
it("respects explicit opt-out", async () => {
const configHome = await makeConfigHome();
const prepared = await prepareOpenCodeRuntimeConfig({
env: { XDG_CONFIG_HOME: configHome },
config: { dangerouslySkipPermissions: false },
});
expect(prepared.env).toEqual({ XDG_CONFIG_HOME: configHome });
expect(prepared.notes).toEqual([]);
await prepared.cleanup();
});
});

View File

@@ -0,0 +1,91 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { asBoolean } from "@paperclipai/adapter-utils/server-utils";
type PreparedOpenCodeRuntimeConfig = {
env: Record<string, string>;
notes: string[];
cleanup: () => Promise<void>;
};
function resolveXdgConfigHome(env: Record<string, string>): string {
return (
(typeof env.XDG_CONFIG_HOME === "string" && env.XDG_CONFIG_HOME.trim()) ||
(typeof process.env.XDG_CONFIG_HOME === "string" && process.env.XDG_CONFIG_HOME.trim()) ||
path.join(os.homedir(), ".config")
);
}
function isPlainObject(value: unknown): value is Record<string, unknown> {
return typeof value === "object" && value !== null && !Array.isArray(value);
}
async function readJsonObject(filepath: string): Promise<Record<string, unknown>> {
try {
const raw = await fs.readFile(filepath, "utf8");
const parsed = JSON.parse(raw);
return isPlainObject(parsed) ? parsed : {};
} catch {
return {};
}
}
export async function prepareOpenCodeRuntimeConfig(input: {
env: Record<string, string>;
config: Record<string, unknown>;
}): Promise<PreparedOpenCodeRuntimeConfig> {
const skipPermissions = asBoolean(input.config.dangerouslySkipPermissions, true);
if (!skipPermissions) {
return {
env: input.env,
notes: [],
cleanup: async () => {},
};
}
const sourceConfigDir = path.join(resolveXdgConfigHome(input.env), "opencode");
const runtimeConfigHome = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-opencode-config-"));
const runtimeConfigDir = path.join(runtimeConfigHome, "opencode");
const runtimeConfigPath = path.join(runtimeConfigDir, "opencode.json");
await fs.mkdir(runtimeConfigDir, { recursive: true });
try {
await fs.cp(sourceConfigDir, runtimeConfigDir, {
recursive: true,
force: true,
errorOnExist: false,
dereference: false,
});
} catch (err) {
if ((err as NodeJS.ErrnoException | null)?.code !== "ENOENT") {
throw err;
}
}
const existingConfig = await readJsonObject(runtimeConfigPath);
const existingPermission = isPlainObject(existingConfig.permission)
? existingConfig.permission
: {};
const nextConfig = {
...existingConfig,
permission: {
...existingPermission,
external_directory: "allow",
},
};
await fs.writeFile(runtimeConfigPath, `${JSON.stringify(nextConfig, null, 2)}\n`, "utf8");
return {
env: {
...input.env,
XDG_CONFIG_HOME: runtimeConfigHome,
},
notes: [
"Injected runtime OpenCode config with permission.external_directory=allow to avoid headless approval prompts.",
],
cleanup: async () => {
await fs.rm(runtimeConfigHome, { recursive: true, force: true });
},
};
}
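The core of `prepareOpenCodeRuntimeConfig` above is the permission-merge step: overlay `external_directory: "allow"` onto whatever `permission` block the copied config already has, leaving every other key untouched. A standalone sketch of just that merge (the helper name is invented for illustration):

```typescript
// Standalone sketch of the permission-merge step from
// prepareOpenCodeRuntimeConfig: add external_directory=allow while
// preserving any existing permission rules and top-level keys.
function mergeExternalDirectoryAllow(existing: Record<string, unknown>): Record<string, unknown> {
  const permission =
    typeof existing.permission === "object" && existing.permission !== null && !Array.isArray(existing.permission)
      ? (existing.permission as Record<string, unknown>)
      : {};
  return {
    ...existing,
    permission: { ...permission, external_directory: "allow" },
  };
}

const merged = mergeExternalDirectoryAllow({ theme: "system", permission: { read: "allow" } });
console.log(JSON.stringify(merged));
```

Because `external_directory` is spread last, it wins over any conflicting value in the source config, which matches the test expectation (`read: "allow"` preserved, `external_directory: "allow"` added) in the spec file above.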

View File

@@ -4,6 +4,7 @@ import type {
AdapterEnvironmentTestResult,
} from "@paperclipai/adapter-utils";
import {
asBoolean,
asString,
asStringArray,
parseObject,
@@ -14,6 +15,7 @@ import {
} from "@paperclipai/adapter-utils/server-utils";
import { discoverOpenCodeModels, ensureOpenCodeModelConfiguredAndAvailable } from "./models.js";
import { parseOpenCodeJsonl } from "./parse.js";
import { prepareOpenCodeRuntimeConfig } from "./runtime-config.js";
function summarizeStatus(checks: AdapterEnvironmentCheck[]): AdapterEnvironmentTestResult["status"] {
if (checks.some((check) => check.level === "error")) return "fail";
@@ -90,224 +92,238 @@ export async function testEnvironment(
});
}
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...env }));
const cwdInvalid = checks.some((check) => check.code === "opencode_cwd_invalid");
if (cwdInvalid) {
// Prevent OpenCode from writing an opencode.json into the working directory.
env.OPENCODE_DISABLE_PROJECT_CONFIG = "true";
const preparedRuntimeConfig = await prepareOpenCodeRuntimeConfig({ env, config });
if (asBoolean(config.dangerouslySkipPermissions, true)) {
checks.push({
code: "opencode_command_skipped",
level: "warn",
message: "Skipped command check because working directory validation failed.",
detail: command,
code: "opencode_headless_permissions_enabled",
level: "info",
message: "Headless OpenCode external-directory permissions are auto-approved for unattended runs.",
});
} else {
try {
await ensureCommandResolvable(command, cwd, runtimeEnv);
}
try {
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...preparedRuntimeConfig.env }));
const cwdInvalid = checks.some((check) => check.code === "opencode_cwd_invalid");
if (cwdInvalid) {
checks.push({
code: "opencode_command_resolvable",
level: "info",
message: `Command is executable: ${command}`,
});
} catch (err) {
checks.push({
code: "opencode_command_unresolvable",
level: "error",
message: err instanceof Error ? err.message : "Command is not executable",
code: "opencode_command_skipped",
level: "warn",
message: "Skipped command check because working directory validation failed.",
detail: command,
});
}
}
const canRunProbe =
checks.every((check) => check.code !== "opencode_cwd_invalid" && check.code !== "opencode_command_unresolvable");
let modelValidationPassed = false;
const configuredModel = asString(config.model, "").trim();
if (canRunProbe && configuredModel) {
try {
const discovered = await discoverOpenCodeModels({ command, cwd, env: runtimeEnv });
if (discovered.length > 0) {
} else {
try {
await ensureCommandResolvable(command, cwd, runtimeEnv);
checks.push({
code: "opencode_models_discovered",
code: "opencode_command_resolvable",
level: "info",
message: `Discovered ${discovered.length} model(s) from OpenCode providers.`,
message: `Command is executable: ${command}`,
});
} else {
} catch (err) {
checks.push({
code: "opencode_models_empty",
code: "opencode_command_unresolvable",
level: "error",
message: "OpenCode returned no models.",
hint: "Run `opencode models` and verify provider authentication.",
});
}
} catch (err) {
const errMsg = err instanceof Error ? err.message : String(err);
if (/ProviderModelNotFoundError/i.test(errMsg)) {
checks.push({
code: "opencode_hello_probe_model_unavailable",
level: "warn",
message: "The configured model was not found by the provider.",
detail: errMsg,
hint: "Run `opencode models` and choose an available provider/model ID.",
});
} else {
checks.push({
code: "opencode_models_discovery_failed",
level: "error",
message: errMsg || "OpenCode model discovery failed.",
hint: "Run `opencode models` manually to verify provider auth and config.",
message: err instanceof Error ? err.message : "Command is not executable",
detail: command,
});
}
}
const canRunProbe =
  checks.every((check) => check.code !== "opencode_cwd_invalid" && check.code !== "opencode_command_unresolvable");
let modelValidationPassed = false;
const configuredModel = asString(config.model, "").trim();
if (canRunProbe && configuredModel) {
  try {
    const discovered = await discoverOpenCodeModels({ command, cwd, env: runtimeEnv });
    if (discovered.length > 0) {
      checks.push({
        code: "opencode_models_discovered",
        level: "info",
        message: `Discovered ${discovered.length} model(s) from OpenCode providers.`,
      });
    } else {
      checks.push({
        code: "opencode_models_empty",
        level: "error",
        message: "OpenCode returned no models.",
        hint: "Run `opencode models` and verify provider authentication.",
      });
    }
  } catch (err) {
    const errMsg = err instanceof Error ? err.message : String(err);
    if (/ProviderModelNotFoundError/i.test(errMsg)) {
      checks.push({
        code: "opencode_hello_probe_model_unavailable",
        level: "warn",
        message: "The configured model was not found by the provider.",
        detail: errMsg,
        hint: "Run `opencode models` and choose an available provider/model ID.",
      });
    } else {
      checks.push({
        code: "opencode_models_discovery_failed",
        level: "error",
        message: errMsg || "OpenCode model discovery failed.",
        hint: "Run `opencode models` manually to verify provider auth and config.",
      });
    }
  }
} else if (canRunProbe && !configuredModel) {
  try {
    const discovered = await discoverOpenCodeModels({ command, cwd, env: runtimeEnv });
    if (discovered.length > 0) {
      checks.push({
        code: "opencode_models_discovered",
        level: "info",
        message: `Discovered ${discovered.length} model(s) from OpenCode providers.`,
      });
    }
  } catch (err) {
    const errMsg = err instanceof Error ? err.message : String(err);
    if (/ProviderModelNotFoundError/i.test(errMsg)) {
      checks.push({
        code: "opencode_hello_probe_model_unavailable",
        level: "warn",
        message: "The configured model was not found by the provider.",
        detail: errMsg,
        hint: "Run `opencode models` and choose an available provider/model ID.",
      });
    } else {
      checks.push({
        code: "opencode_models_discovery_failed",
        level: "warn",
        message: errMsg || "OpenCode model discovery failed (best-effort, no model configured).",
        hint: "Run `opencode models` manually to verify provider auth and config.",
      });
    }
  }
}
const modelUnavailable = checks.some((check) => check.code === "opencode_hello_probe_model_unavailable");
if (!configuredModel && !modelUnavailable) {
  // No model configured: skip the model requirement when no model-related checks exist.
} else if (configuredModel && canRunProbe) {
  try {
    await ensureOpenCodeModelConfiguredAndAvailable({
      model: configuredModel,
      command,
      cwd,
      env: runtimeEnv,
    });
    checks.push({
      code: "opencode_model_configured",
      level: "info",
      message: `Configured model: ${configuredModel}`,
    });
    modelValidationPassed = true;
  } catch (err) {
    checks.push({
      code: "opencode_model_invalid",
      level: "error",
      message: err instanceof Error ? err.message : "Configured model is unavailable.",
      hint: "Run `opencode models` and choose a currently available provider/model ID.",
    });
  }
}
if (canRunProbe && modelValidationPassed) {
  const extraArgs = (() => {
    const fromExtraArgs = asStringArray(config.extraArgs);
    if (fromExtraArgs.length > 0) return fromExtraArgs;
    return asStringArray(config.args);
  })();
  const variant = asString(config.variant, "").trim();
  const probeModel = configuredModel;
  const args = ["run", "--format", "json"];
  args.push("--model", probeModel);
  if (variant) args.push("--variant", variant);
  if (extraArgs.length > 0) args.push(...extraArgs);
  try {
    const probe = await runChildProcess(
      `opencode-envtest-${Date.now()}-${Math.random().toString(16).slice(2)}`,
      command,
      args,
      {
        cwd,
        env: runtimeEnv,
        timeoutSec: 60,
        graceSec: 5,
        stdin: "Respond with hello.",
        onLog: async () => {},
      },
    );
    const parsed = parseOpenCodeJsonl(probe.stdout);
    const detail = summarizeProbeDetail(probe.stdout, probe.stderr, parsed.errorMessage);
    const authEvidence = `${parsed.errorMessage ?? ""}\n${probe.stdout}\n${probe.stderr}`.trim();
    if (probe.timedOut) {
      checks.push({
        code: "opencode_hello_probe_timed_out",
        level: "warn",
        message: "OpenCode hello probe timed out.",
        hint: "Retry the probe. If this persists, run OpenCode manually in this working directory.",
      });
    } else if ((probe.exitCode ?? 1) === 0 && !parsed.errorMessage) {
      const summary = parsed.summary.trim();
      const hasHello = /\bhello\b/i.test(summary);
      checks.push({
        code: hasHello ? "opencode_hello_probe_passed" : "opencode_hello_probe_unexpected_output",
        level: hasHello ? "info" : "warn",
        message: hasHello
          ? "OpenCode hello probe succeeded."
          : "OpenCode probe ran but did not return `hello` as expected.",
        ...(summary ? { detail: summary.replace(/\s+/g, " ").trim().slice(0, 240) } : {}),
        ...(hasHello
          ? {}
          : {
              hint: "Run `opencode run --format json` manually and prompt `Respond with hello` to inspect output.",
            }),
      });
    } else if (/ProviderModelNotFoundError/i.test(authEvidence)) {
      checks.push({
        code: "opencode_hello_probe_model_unavailable",
        level: "warn",
        message: "The configured model was not found by the provider.",
        ...(detail ? { detail } : {}),
        hint: "Run `opencode models` and choose an available provider/model ID.",
      });
    } else if (OPENCODE_AUTH_REQUIRED_RE.test(authEvidence)) {
      checks.push({
        code: "opencode_hello_probe_auth_required",
        level: "warn",
        message: "OpenCode is installed, but provider authentication is not ready.",
        ...(detail ? { detail } : {}),
        hint: "Run `opencode auth login` or set provider credentials, then retry the probe.",
      });
    } else {
      checks.push({
        code: "opencode_hello_probe_failed",
        level: "error",
        message: "OpenCode hello probe failed.",
        ...(detail ? { detail } : {}),
        hint: "Run `opencode run --format json` manually in this working directory to debug.",
      });
    }
  } catch (err) {
    checks.push({
      code: "opencode_hello_probe_failed",
      level: "error",
      message: "OpenCode hello probe failed.",
      detail: err instanceof Error ? err.message : String(err),
      hint: "Run `opencode run --format json` manually in this working directory to debug.",
    });
  }
}
} finally {
await preparedRuntimeConfig.cleanup();
}
return {

View File

@@ -58,6 +58,7 @@ export function buildOpenCodeLocalConfig(v: CreateConfigValues): Record<string,
if (v.bootstrapPrompt) ac.bootstrapPromptTemplate = v.bootstrapPrompt;
if (v.model) ac.model = v.model;
if (v.thinkingEffort) ac.variant = v.thinkingEffort;
ac.dangerouslySkipPermissions = v.dangerouslySkipPermissions;
// OpenCode sessions can run until the CLI exits naturally; keep timeout disabled (0)
// and rely on graceSec for termination handling when a timeout is configured elsewhere.
ac.timeoutSec = 0;

View File

@@ -266,10 +266,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
`The above agent instructions were loaded from ${resolvedInstructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsFileDir}.\n\n` +
`You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.`;
await onLog(
"stdout",
`[paperclip] Loaded agent instructions file: ${resolvedInstructionsFilePath}\n`,
);
} catch (err) {
instructionsReadFailed = true;
const reason = err instanceof Error ? err.message : String(err);
@@ -330,8 +326,9 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
const buildArgs = (sessionFile: string): string[] => {
const args: string[] = [];
// Use RPC mode for proper lifecycle management (waits for agent completion)
args.push("--mode", "rpc");
// Use JSON mode for structured output with print mode (non-interactive)
args.push("--mode", "json");
args.push("-p"); // Non-interactive mode: process prompt and exit
// Use --append-system-prompt to extend Pi's default system prompt
args.push("--append-system-prompt", renderedSystemPromptExtension);
@@ -347,19 +344,13 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
args.push("--skill", PI_AGENT_SKILLS_DIR);
if (extraArgs.length > 0) args.push(...extraArgs);
// Add the user prompt as the last argument
args.push(userPrompt);
return args;
};
const buildRpcStdin = (): string => {
// Send the prompt as an RPC command
const promptCommand = {
type: "prompt",
message: userPrompt,
};
return JSON.stringify(promptCommand) + "\n";
};
const runAttempt = async (sessionFile: string) => {
const args = buildArgs(sessionFile);
if (onMeta) {
@@ -406,7 +397,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
graceSec,
onSpawn,
onLog: bufferedOnLog,
stdin: buildRpcStdin(),
});
// Flush any remaining buffer content

View File

@@ -131,7 +131,9 @@ export async function discoverPiModels(input: {
throw new Error(detail ? `\`pi --list-models\` failed: ${detail}` : "`pi --list-models` failed.");
}
return sortModels(dedupeModels(parseModelsOutput(result.stdout)));
// Pi outputs model list to stderr, but fall back to stdout for older versions
const output = result.stderr || result.stdout;
return sortModels(dedupeModels(parseModelsOutput(output)));
}
function normalizeEnv(input: unknown): Record<string, string> {

View File

@@ -17,19 +17,39 @@ function asString(value: unknown, fallback = ""): string {
return typeof value === "string" ? value : fallback;
}
function extractTextContent(content: string | Array<{ type: string; text?: string }>): string {
if (typeof content === "string") return content;
if (!Array.isArray(content)) return "";
return content
.filter((c) => c.type === "text" && c.text)
.map((c) => c.text!)
.join("");
function extractTextContent(content: string | Array<{ type: string; text?: string; thinking?: string }>): { text: string; thinking: string } {
if (typeof content === "string") return { text: content, thinking: "" };
if (!Array.isArray(content)) return { text: "", thinking: "" };
let text = "";
let thinking = "";
for (const c of content) {
if (c.type === "text" && c.text) {
text += c.text;
}
if (c.type === "thinking" && c.thinking) {
thinking += c.thinking;
}
}
return { text, thinking };
}
// Track pending tool calls for proper toolUseId matching
let pendingToolCalls = new Map<string, { toolName: string; args: unknown }>();
export function resetParserState(): void {
pendingToolCalls.clear();
}
export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
const parsed = asRecord(safeJsonParse(line));
if (!parsed) {
return [{ kind: "stdout", ts, text: line }];
// Non-JSON line, treat as raw stdout
const trimmed = line.trim();
if (!trimmed) return [];
return [{ kind: "stdout", ts, text: trimmed }];
}
const type = asString(parsed.type);
@@ -41,16 +61,64 @@ export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
// Agent lifecycle
if (type === "agent_start") {
return [{ kind: "system", ts, text: "Pi agent started" }];
return [{ kind: "system", ts, text: "🚀 Pi agent started" }];
}
if (type === "agent_end") {
return [{ kind: "system", ts, text: "Pi agent finished" }];
const entries: TranscriptEntry[] = [];
// Extract final message from messages array if available
const messages = parsed.messages as Array<Record<string, unknown>> | undefined;
if (messages && messages.length > 0) {
const lastMessage = messages[messages.length - 1];
if (lastMessage?.role === "assistant") {
const content = lastMessage.content as string | Array<{ type: string; text?: string; thinking?: string }>;
const { text, thinking } = extractTextContent(content);
if (thinking) {
entries.push({ kind: "thinking", ts, text: thinking });
}
if (text) {
entries.push({ kind: "assistant", ts, text });
}
// Extract usage
const usage = asRecord(lastMessage.usage);
if (usage) {
const inputTokens = (usage.inputTokens ?? usage.input ?? 0) as number;
const outputTokens = (usage.outputTokens ?? usage.output ?? 0) as number;
const cachedTokens = (usage.cacheRead ?? usage.cachedInputTokens ?? 0) as number;
const costRecord = asRecord(usage.cost);
const costUsd = (costRecord?.total ?? usage.costUsd ?? 0) as number;
if (inputTokens > 0 || outputTokens > 0) {
entries.push({
kind: "result",
ts,
text: "Run completed",
inputTokens,
outputTokens,
cachedTokens,
costUsd,
subtype: "end",
isError: false,
errors: [],
});
}
}
}
}
if (entries.length === 0) {
entries.push({ kind: "system", ts, text: "✅ Pi agent finished" });
}
return entries;
}
// Turn lifecycle
if (type === "turn_start") {
return [{ kind: "system", ts, text: "Turn started" }];
return []; // Skip noisy lifecycle events
}
if (type === "turn_end") {
@@ -60,16 +128,21 @@ export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
const entries: TranscriptEntry[] = [];
if (message) {
const content = message.content as string | Array<{ type: string; text?: string }>;
const text = extractTextContent(content);
const content = message.content as string | Array<{ type: string; text?: string; thinking?: string }>;
const { text, thinking } = extractTextContent(content);
if (thinking) {
entries.push({ kind: "thinking", ts, text: thinking });
}
if (text) {
entries.push({ kind: "assistant", ts, text });
}
}
// Process tool results
// Process tool results - match with pending tool calls
if (toolResults) {
for (const tr of toolResults) {
const toolCallId = asString(tr.toolCallId, `tool-${Date.now()}`);
const content = tr.content;
const isError = tr.isError === true;
@@ -78,23 +151,31 @@ export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
if (typeof content === "string") {
contentStr = content;
} else if (Array.isArray(content)) {
contentStr = extractTextContent(content as Array<{ type: string; text?: string }>);
const extracted = extractTextContent(content as Array<{ type: string; text?: string }>);
contentStr = extracted.text || JSON.stringify(content);
} else {
contentStr = JSON.stringify(content);
}
// Get tool name from pending calls if available
const pendingCall = pendingToolCalls.get(toolCallId);
const toolName = asString(tr.toolName, pendingCall?.toolName || "tool");
entries.push({
kind: "tool_result",
ts,
toolUseId: asString(tr.toolCallId, "unknown"),
toolName: asString(tr.toolName),
toolUseId: toolCallId,
toolName,
content: contentStr,
isError,
});
// Clean up pending call
pendingToolCalls.delete(toolCallId);
}
}
return entries.length > 0 ? entries : [{ kind: "system", ts, text: "Turn ended" }];
return entries;
}
// Message streaming
@@ -106,33 +187,81 @@ export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
const assistantEvent = asRecord(parsed.assistantMessageEvent);
if (assistantEvent) {
const msgType = asString(assistantEvent.type);
// Handle thinking deltas
if (msgType === "thinking_delta") {
const delta = asString(assistantEvent.delta);
if (delta) {
return [{ kind: "thinking", ts, text: delta, delta: true }];
}
}
// Handle text deltas
if (msgType === "text_delta") {
const delta = asString(assistantEvent.delta);
if (delta) {
return [{ kind: "assistant", ts, text: delta, delta: true }];
}
}
// Handle thinking end - emit full thinking block
if (msgType === "thinking_end") {
const content = asString(assistantEvent.content);
if (content) {
return [{ kind: "thinking", ts, text: content }];
}
}
// Handle text end - emit full text block
if (msgType === "text_end") {
const content = asString(assistantEvent.content);
if (content) {
return [{ kind: "assistant", ts, text: content }];
}
}
}
return [];
}
if (type === "message_end") {
const message = asRecord(parsed.message);
if (message) {
const content = message.content as string | Array<{ type: string; text?: string; thinking?: string }>;
const { text, thinking } = extractTextContent(content);
const entries: TranscriptEntry[] = [];
// Emit final thinking block if present
if (thinking) {
entries.push({ kind: "thinking", ts, text: thinking });
}
// Emit final text block if present
if (text) {
entries.push({ kind: "assistant", ts, text });
}
return entries;
}
return [];
}
// Tool execution
if (type === "tool_execution_start") {
const toolName = asString(parsed.toolName);
const toolCallId = asString(parsed.toolCallId, `tool-${Date.now()}`);
const toolName = asString(parsed.toolName, "tool");
const args = parsed.args;
if (toolName) {
return [{
kind: "tool_call",
ts,
name: toolName,
input: args,
}];
}
return [{ kind: "system", ts, text: `Tool started` }];
// Track this tool call for later matching
pendingToolCalls.set(toolCallId, { toolName, args });
return [{
kind: "tool_call",
ts,
name: toolName,
input: args,
toolUseId: toolCallId,
}];
}
if (type === "tool_execution_update") {
@@ -140,40 +269,43 @@ export function parsePiStdoutLine(line: string, ts: string): TranscriptEntry[] {
}
if (type === "tool_execution_end") {
const toolCallId = asString(parsed.toolCallId);
const toolName = asString(parsed.toolName);
const toolCallId = asString(parsed.toolCallId, `tool-${Date.now()}`);
const toolName = asString(parsed.toolName, "tool");
const result = parsed.result;
const isError = parsed.isError === true;
// Extract text from Pi's content array format
// Can be: {"content": [{"type": "text", "text": "..."}]} or [{"type": "text", "text": "..."}]
let contentStr: string;
if (typeof result === "string") {
contentStr = result;
} else if (Array.isArray(result)) {
// Direct array format: result is [{"type": "text", "text": "..."}]
contentStr = extractTextContent(result as Array<{ type: string; text?: string }>);
const extracted = extractTextContent(result as Array<{ type: string; text?: string }>);
contentStr = extracted.text || JSON.stringify(result);
} else if (result && typeof result === "object") {
const resultObj = result as Record<string, unknown>;
if (Array.isArray(resultObj.content)) {
// Wrapped format: result is {"content": [{"type": "text", "text": "..."}]}
contentStr = extractTextContent(resultObj.content as Array<{ type: string; text?: string }>);
const extracted = extractTextContent(resultObj.content as Array<{ type: string; text?: string }>);
contentStr = extracted.text || JSON.stringify(result);
} else {
contentStr = JSON.stringify(result);
}
} else {
contentStr = JSON.stringify(result);
contentStr = String(result);
}
// Clean up pending call
pendingToolCalls.delete(toolCallId);
return [{
kind: "tool_result",
ts,
toolUseId: toolCallId || "unknown",
toolUseId: toolCallId,
toolName,
content: contentStr,
isError,
}];
}
// Fallback for unknown event types
return [{ kind: "stdout", ts, text: line }];
}
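
The parser changes above pair each `tool_execution_start` event with its later result through a `pendingToolCalls` map keyed by `toolCallId`, so the result row can recover the tool name even when `tool_execution_end` omits it. A minimal, self-contained sketch of that matching (names follow the diff; the event shapes are simplified for illustration):

```typescript
// Pending tool calls, keyed by toolCallId, as in the parser above.
const pendingToolCalls = new Map<string, { toolName: string; args: unknown }>();

function onToolStart(toolCallId: string, toolName: string, args: unknown): void {
  // Remember the call until its matching result arrives.
  pendingToolCalls.set(toolCallId, { toolName, args });
}

function onToolEnd(toolCallId: string, reportedName?: string): string {
  const pending = pendingToolCalls.get(toolCallId);
  // Prefer the name on the end event, fall back to the pending call, then "tool".
  const toolName = reportedName || pending?.toolName || "tool";
  // Clean up so the map does not grow unbounded across turns.
  pendingToolCalls.delete(toolCallId);
  return toolName;
}

onToolStart("call-1", "read_file", { path: "a.txt" });
console.log(onToolEnd("call-1")); // "read_file"
```

The cleanup on every `tool_execution_end` (and on matched `toolResults` in `turn_end`) is what keeps the module-level map from leaking entries between runs, alongside the explicit `resetParserState()` hook.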

View File

@@ -1,83 +1,24 @@
import { createHash } from "node:crypto";
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import postgres from "postgres";
import {
applyPendingMigrations,
ensurePostgresDatabase,
inspectMigrations,
} from "./client.js";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./test-embedded-postgres.js";
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
const tempPaths: string[] = [];
const runningInstances: EmbeddedPostgresInstance[] = [];
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
});
});
});
}
const cleanups: Array<() => Promise<void>> = [];
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
async function createTempDatabase(): Promise<string> {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-db-client-"));
tempPaths.push(dataDir);
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C"],
onLog: () => {},
onError: () => {},
});
await instance.initialise();
await instance.start();
runningInstances.push(instance);
const adminUrl = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
await ensurePostgresDatabase(adminUrl, "paperclip");
return `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
const db = await startEmbeddedPostgresTestDatabase("paperclip-db-client-");
cleanups.push(db.cleanup);
return db.connectionString;
}
async function migrationHash(migrationFile: string): Promise<string> {
@@ -89,19 +30,19 @@ async function migrationHash(migrationFile: string): Promise<string> {
}
afterEach(async () => {
while (runningInstances.length > 0) {
const instance = runningInstances.pop();
if (!instance) continue;
await instance.stop();
}
while (tempPaths.length > 0) {
const tempPath = tempPaths.pop();
if (!tempPath) continue;
fs.rmSync(tempPath, { recursive: true, force: true });
while (cleanups.length > 0) {
const cleanup = cleanups.pop();
await cleanup?.();
}
});
describe("applyPendingMigrations", () => {
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres migration tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("applyPendingMigrations", () => {
it(
"applies an inserted earlier migration without replaying later legacy migrations",
async () => {

View File

@@ -0,0 +1,28 @@
import { describe, expect, it } from "vitest";
import { createEmbeddedPostgresLogBuffer, formatEmbeddedPostgresError } from "./embedded-postgres-error.js";
describe("formatEmbeddedPostgresError", () => {
it("adds a shared-memory hint when initdb logs expose the real cause", () => {
const error = formatEmbeddedPostgresError("Postgres init script exited with code 1.", {
fallbackMessage: "Failed to initialize embedded PostgreSQL cluster",
recentLogs: [
"running bootstrap script ...",
"FATAL: could not create shared memory segment: Cannot allocate memory",
"DETAIL: Failed system call was shmget(key=123, size=56, 03600).",
],
});
expect(error.message).toContain("could not allocate shared memory");
expect(error.message).toContain("kern.sysv.shm");
expect(error.message).toContain("could not create shared memory segment");
});
it("keeps only recent non-empty log lines in the collector", () => {
const buffer = createEmbeddedPostgresLogBuffer(2);
buffer.append("line one\n\n");
buffer.append("line two");
buffer.append("line three");
expect(buffer.getRecentLogs()).toEqual(["line two", "line three"]);
});
});

View File

@@ -0,0 +1,89 @@
const DEFAULT_RECENT_LOG_LIMIT = 40;
const RECENT_LOG_SUMMARY_LINES = 8;
function toError(error: unknown, fallbackMessage: string): Error {
if (error instanceof Error) return error;
if (error === undefined) return new Error(fallbackMessage);
if (typeof error === "string") return new Error(`${fallbackMessage}: ${error}`);
try {
return new Error(`${fallbackMessage}: ${JSON.stringify(error)}`);
} catch {
return new Error(`${fallbackMessage}: ${String(error)}`);
}
}
function summarizeRecentLogs(recentLogs: string[]): string | null {
if (recentLogs.length === 0) return null;
return recentLogs
.slice(-RECENT_LOG_SUMMARY_LINES)
.map((line) => line.trim())
.filter((line) => line.length > 0)
.join(" | ");
}
function detectEmbeddedPostgresHint(recentLogs: string[]): string | null {
const haystack = recentLogs.join("\n").toLowerCase();
if (!haystack.includes("could not create shared memory segment")) {
return null;
}
return (
"Embedded PostgreSQL bootstrap could not allocate shared memory. " +
"On macOS, this usually means the host's kern.sysv.shm* limits are too low for another local PostgreSQL cluster. " +
"Stop other local PostgreSQL servers or raise the shared-memory sysctls, then retry."
);
}
export function createEmbeddedPostgresLogBuffer(limit = DEFAULT_RECENT_LOG_LIMIT): {
append(message: unknown): void;
getRecentLogs(): string[];
} {
const recentLogs: string[] = [];
return {
append(message: unknown) {
const text =
typeof message === "string"
? message
: message instanceof Error
? message.message
: String(message ?? "");
for (const rawLine of text.split(/\r?\n/)) {
const line = rawLine.trim();
if (!line) continue;
recentLogs.push(line);
if (recentLogs.length > limit) {
recentLogs.splice(0, recentLogs.length - limit);
}
}
},
getRecentLogs() {
return [...recentLogs];
},
};
}
export function formatEmbeddedPostgresError(
error: unknown,
input: {
fallbackMessage: string;
recentLogs?: string[];
},
): Error {
const baseError = toError(error, input.fallbackMessage);
const recentLogs = input.recentLogs ?? [];
const parts = [baseError.message];
const hint = detectEmbeddedPostgresHint(recentLogs);
const recentSummary = summarizeRecentLogs(recentLogs);
if (hint) {
parts.push(hint);
}
if (recentSummary) {
parts.push(`Recent embedded Postgres logs: ${recentSummary}`);
}
return new Error(parts.join(" "));
}
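
The collector above keeps only the newest trimmed, non-empty log lines so a bootstrap failure message stays short while still carrying the real cause. A condensed, self-contained restatement of that ring-buffer behavior (simplified from `createEmbeddedPostgresLogBuffer` for illustration):

```typescript
// Keep at most `limit` trimmed, non-empty lines; oldest lines fall off first.
function createLogBuffer(limit = 3): { append(msg: string): void; recent(): string[] } {
  const lines: string[] = [];
  return {
    append(msg: string) {
      for (const raw of msg.split(/\r?\n/)) {
        const line = raw.trim();
        if (!line) continue; // drop blank lines entirely
        lines.push(line);
        if (lines.length > limit) lines.splice(0, lines.length - limit); // trim to the newest `limit`
      }
    },
    recent() {
      return [...lines]; // return a copy so callers cannot mutate the buffer
    },
  };
}

const buf = createLogBuffer(2);
buf.append("line one\n\n");
buf.append("line two");
buf.append("line three");
console.log(buf.recent()); // ["line two", "line three"]
```

This mirrors the expectation in the test file above: with a limit of 2, only `"line two"` and `"line three"` survive, which is what `formatEmbeddedPostgresError` then summarizes into the thrown message.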

View File

@@ -11,6 +11,12 @@ export {
type MigrationBootstrapResult,
type Db,
} from "./client.js";
export {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestSupport,
} from "./test-embedded-postgres.js";
export {
runDatabaseBackup,
runDatabaseRestore,
@@ -19,4 +25,8 @@ export {
type RunDatabaseBackupResult,
type RunDatabaseRestoreOptions,
} from "./backup-lib.js";
export {
createEmbeddedPostgresLogBuffer,
formatEmbeddedPostgresError,
} from "./embedded-postgres-error.js";
export * from "./schema/index.js";

View File

@@ -2,6 +2,7 @@ import { existsSync, readFileSync, rmSync } from "node:fs";
import { createServer } from "node:net";
import path from "node:path";
import { ensurePostgresDatabase, getPostgresDataDirectory } from "./client.js";
import { createEmbeddedPostgresLogBuffer, formatEmbeddedPostgresError } from "./embedded-postgres-error.js";
import { resolveDatabaseTarget } from "./runtime-config.js";
type EmbeddedPostgresInstance = {
@@ -27,18 +28,6 @@ export type MigrationConnection = {
stop: () => Promise<void>;
};
function toError(error: unknown, fallbackMessage: string): Error {
if (error instanceof Error) return error;
if (error === undefined) return new Error(fallbackMessage);
if (typeof error === "string") return new Error(`${fallbackMessage}: ${error}`);
try {
return new Error(`${fallbackMessage}: ${JSON.stringify(error)}`);
} catch {
return new Error(`${fallbackMessage}: ${String(error)}`);
}
}
function readRunningPostmasterPid(postmasterPidFile: string): number | null {
if (!existsSync(postmasterPidFile)) return null;
try {
@@ -109,6 +98,7 @@ async function ensureEmbeddedPostgresConnection(
const runningPid = readRunningPostmasterPid(postmasterPidFile);
const runningPort = readPidFilePort(postmasterPidFile);
const preferredAdminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${preferredPort}/postgres`;
const logBuffer = createEmbeddedPostgresLogBuffer();
if (!runningPid && existsSync(pgVersionFile)) {
try {
@@ -150,19 +140,20 @@ async function ensureEmbeddedPostgresConnection(
password: "paperclip",
port: selectedPort,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C"],
onLog: () => {},
onError: () => {},
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: logBuffer.append,
onError: logBuffer.append,
});
if (!existsSync(path.resolve(dataDir, "PG_VERSION"))) {
try {
await instance.initialise();
} catch (error) {
throw toError(
error,
`Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${selectedPort}`,
);
throw formatEmbeddedPostgresError(error, {
fallbackMessage:
`Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${selectedPort}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
}
if (existsSync(postmasterPidFile)) {
@@ -171,7 +162,10 @@ async function ensureEmbeddedPostgresConnection(
try {
await instance.start();
} catch (error) {
throw toError(error, `Failed to start embedded PostgreSQL on port ${selectedPort}`);
throw formatEmbeddedPostgresError(error, {
fallbackMessage: `Failed to start embedded PostgreSQL on port ${selectedPort}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${selectedPort}/postgres`;

View File

@@ -0,0 +1,17 @@
CREATE TABLE "issue_inbox_archives" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"company_id" uuid NOT NULL,
"issue_id" uuid NOT NULL,
"user_id" text NOT NULL,
"archived_at" timestamp with time zone DEFAULT now() NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
DROP INDEX "board_api_keys_key_hash_idx";--> statement-breakpoint
ALTER TABLE "issue_inbox_archives" ADD CONSTRAINT "issue_inbox_archives_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_inbox_archives" ADD CONSTRAINT "issue_inbox_archives_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "issue_inbox_archives_company_issue_idx" ON "issue_inbox_archives" USING btree ("company_id","issue_id");--> statement-breakpoint
CREATE INDEX "issue_inbox_archives_company_user_idx" ON "issue_inbox_archives" USING btree ("company_id","user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "issue_inbox_archives_company_issue_user_idx" ON "issue_inbox_archives" USING btree ("company_id","issue_id","user_id");--> statement-breakpoint
CREATE UNIQUE INDEX "board_api_keys_key_hash_idx" ON "board_api_keys" USING btree ("key_hash");

File diff suppressed because it is too large

View File

@@ -316,6 +316,13 @@
"when": 1774269579794,
"tag": "0044_illegal_toad",
"breakpoints": true
},
{
"idx": 45,
"version": "7",
"when": 1774530504348,
"tag": "0045_workable_shockwave",
"breakpoints": true
}
]
}

View File

@@ -31,6 +31,7 @@ export { labels } from "./labels.js";
export { issueLabels } from "./issue_labels.js";
export { issueApprovals } from "./issue_approvals.js";
export { issueComments } from "./issue_comments.js";
export { issueInboxArchives } from "./issue_inbox_archives.js";
export { issueReadStates } from "./issue_read_states.js";
export { assets } from "./assets.js";
export { issueAttachments } from "./issue_attachments.js";

View File

@@ -0,0 +1,25 @@
import { pgTable, uuid, text, timestamp, index, uniqueIndex } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { issues } from "./issues.js";
export const issueInboxArchives = pgTable(
"issue_inbox_archives",
{
id: uuid("id").primaryKey().defaultRandom(),
companyId: uuid("company_id").notNull().references(() => companies.id),
issueId: uuid("issue_id").notNull().references(() => issues.id),
userId: text("user_id").notNull(),
archivedAt: timestamp("archived_at", { withTimezone: true }).notNull().defaultNow(),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
},
(table) => ({
companyIssueIdx: index("issue_inbox_archives_company_issue_idx").on(table.companyId, table.issueId),
companyUserIdx: index("issue_inbox_archives_company_user_idx").on(table.companyId, table.userId),
companyIssueUserUnique: uniqueIndex("issue_inbox_archives_company_issue_user_idx").on(
table.companyId,
table.issueId,
table.userId,
),
}),
);

View File

@@ -0,0 +1,144 @@
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { applyPendingMigrations, ensurePostgresDatabase } from "./client.js";
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
export type EmbeddedPostgresTestSupport = {
supported: boolean;
reason?: string;
};
export type EmbeddedPostgresTestDatabase = {
connectionString: string;
cleanup(): Promise<void>;
};
let embeddedPostgresSupportPromise: Promise<EmbeddedPostgresTestSupport> | null = null;
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
});
});
});
}
function formatEmbeddedPostgresError(error: unknown): string {
if (error instanceof Error && error.message.length > 0) return error.message;
if (typeof error === "string" && error.length > 0) return error;
return "embedded Postgres startup failed";
}
async function probeEmbeddedPostgresSupport(): Promise<EmbeddedPostgresTestSupport> {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-embedded-postgres-probe-"));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: () => {},
onError: () => {},
});
try {
await instance.initialise();
await instance.start();
return { supported: true };
} catch (error) {
return {
supported: false,
reason: formatEmbeddedPostgresError(error),
};
} finally {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
}
}
export async function getEmbeddedPostgresTestSupport(): Promise<EmbeddedPostgresTestSupport> {
if (!embeddedPostgresSupportPromise) {
embeddedPostgresSupportPromise = probeEmbeddedPostgresSupport();
}
return await embeddedPostgresSupportPromise;
}
export async function startEmbeddedPostgresTestDatabase(
tempDirPrefix: string,
): Promise<EmbeddedPostgresTestDatabase> {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), tempDirPrefix));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: () => {},
onError: () => {},
});
try {
await instance.initialise();
await instance.start();
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
await applyPendingMigrations(connectionString);
return {
connectionString,
cleanup: async () => {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
},
};
} catch (error) {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
throw new Error(
`Failed to start embedded PostgreSQL test database: ${formatEmbeddedPostgresError(error)}`,
);
}
}
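`getEmbeddedPostgresTestSupport` above memoizes the in-flight promise rather than the resolved value, so concurrent callers share a single probe. A minimal standalone sketch of that pattern (the probe body is a stand-in, not the real cluster check):

```typescript
let probeCount = 0;

// Stand-in for probeEmbeddedPostgresSupport(): an expensive async check.
async function probe(): Promise<{ supported: boolean }> {
  probeCount += 1;
  return { supported: true };
}

let probePromise: Promise<{ supported: boolean }> | null = null;

// Memoize the promise, not the awaited result: callers that arrive while
// the first probe is still in flight reuse it instead of starting another.
function getSupport(): Promise<{ supported: boolean }> {
  if (!probePromise) probePromise = probe();
  return probePromise;
}

async function main() {
  const [a, b] = await Promise.all([getSupport(), getSupport()]);
  console.log(probeCount, a.supported, b.supported); // 1 true true
}
void main();
```

Caching only the resolved value would let two callers race past the null check and spin up two probes; caching the promise closes that window.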

View File

@@ -253,9 +253,13 @@ export type {
CompanyPortabilityEnvInput,
CompanyPortabilityFileEntry,
CompanyPortabilityCompanyManifestEntry,
CompanyPortabilitySidebarOrder,
CompanyPortabilityAgentManifestEntry,
CompanyPortabilitySkillManifestEntry,
CompanyPortabilityProjectManifestEntry,
CompanyPortabilityProjectWorkspaceManifestEntry,
CompanyPortabilityIssueRoutineTriggerManifestEntry,
CompanyPortabilityIssueRoutineManifestEntry,
CompanyPortabilityIssueManifestEntry,
CompanyPortabilityManifest,
CompanyPortabilityExportResult,
@@ -484,6 +488,7 @@ export {
portabilityIncludeSchema,
portabilityEnvInputSchema,
portabilityCompanyManifestEntrySchema,
portabilitySidebarOrderSchema,
portabilityAgentManifestEntrySchema,
portabilityManifestSchema,
portabilitySourceSchema,
@@ -535,10 +540,15 @@ export { API_PREFIX, API } from "./api.js";
export { normalizeAgentUrlKey, deriveAgentUrlKey, isUuidLike } from "./agent-url-key.js";
export { deriveProjectUrlKey, normalizeProjectUrlKey } from "./project-url-key.js";
export {
AGENT_MENTION_SCHEME,
PROJECT_MENTION_SCHEME,
buildAgentMentionHref,
buildProjectMentionHref,
extractAgentMentionIds,
parseAgentMentionHref,
parseProjectMentionHref,
extractProjectMentionIds,
type ParsedAgentMention,
type ParsedProjectMention,
} from "./project-mentions.js";

View File

@@ -0,0 +1,29 @@
import { describe, expect, it } from "vitest";
import {
buildAgentMentionHref,
buildProjectMentionHref,
extractAgentMentionIds,
extractProjectMentionIds,
parseAgentMentionHref,
parseProjectMentionHref,
} from "./project-mentions.js";
describe("project-mentions", () => {
it("round-trips project mentions with color metadata", () => {
const href = buildProjectMentionHref("project-123", "#336699");
expect(parseProjectMentionHref(href)).toEqual({
projectId: "project-123",
color: "#336699",
});
expect(extractProjectMentionIds(`[@Paperclip App](${href})`)).toEqual(["project-123"]);
});
it("round-trips agent mentions with icon metadata", () => {
const href = buildAgentMentionHref("agent-123", "code");
expect(parseAgentMentionHref(href)).toEqual({
agentId: "agent-123",
icon: "code",
});
expect(extractAgentMentionIds(`[@CodexCoder](${href})`)).toEqual(["agent-123"]);
});
});

View File

@@ -1,16 +1,24 @@
export const PROJECT_MENTION_SCHEME = "project://";
export const AGENT_MENTION_SCHEME = "agent://";
const HEX_COLOR_RE = /^[0-9a-f]{6}$/i;
const HEX_COLOR_SHORT_RE = /^[0-9a-f]{3}$/i;
const HEX_COLOR_WITH_HASH_RE = /^#[0-9a-f]{6}$/i;
const HEX_COLOR_SHORT_WITH_HASH_RE = /^#[0-9a-f]{3}$/i;
const PROJECT_MENTION_LINK_RE = /\[[^\]]*]\((project:\/\/[^)\s]+)\)/gi;
const AGENT_MENTION_LINK_RE = /\[[^\]]*]\((agent:\/\/[^)\s]+)\)/gi;
const AGENT_ICON_NAME_RE = /^[a-z0-9-]+$/i;
export interface ParsedProjectMention {
projectId: string;
color: string | null;
}
export interface ParsedAgentMention {
agentId: string;
icon: string | null;
}
function normalizeHexColor(input: string | null | undefined): string | null {
if (!input) return null;
const trimmed = input.trim();
@@ -65,6 +73,36 @@ export function parseProjectMentionHref(href: string): ParsedProjectMention | nu
};
}
export function buildAgentMentionHref(agentId: string, icon?: string | null): string {
const trimmedAgentId = agentId.trim();
const normalizedIcon = normalizeAgentIcon(icon ?? null);
if (!normalizedIcon) {
return `${AGENT_MENTION_SCHEME}${trimmedAgentId}`;
}
return `${AGENT_MENTION_SCHEME}${trimmedAgentId}?i=${encodeURIComponent(normalizedIcon)}`;
}
export function parseAgentMentionHref(href: string): ParsedAgentMention | null {
if (!href.startsWith(AGENT_MENTION_SCHEME)) return null;
let url: URL;
try {
url = new URL(href);
} catch {
return null;
}
if (url.protocol !== "agent:") return null;
const agentId = `${url.hostname}${url.pathname}`.replace(/^\/+/, "").trim();
if (!agentId) return null;
return {
agentId,
icon: normalizeAgentIcon(url.searchParams.get("i") ?? url.searchParams.get("icon")),
};
}
export function extractProjectMentionIds(markdown: string): string[] {
if (!markdown) return [];
const ids = new Set<string>();
@@ -76,3 +114,22 @@ export function extractProjectMentionIds(markdown: string): string[] {
}
return [...ids];
}
export function extractAgentMentionIds(markdown: string): string[] {
if (!markdown) return [];
const ids = new Set<string>();
const re = new RegExp(AGENT_MENTION_LINK_RE);
let match: RegExpExecArray | null;
while ((match = re.exec(markdown)) !== null) {
const parsed = parseAgentMentionHref(match[1]);
if (parsed) ids.add(parsed.agentId);
}
return [...ids];
}
function normalizeAgentIcon(input: string | null | undefined): string | null {
if (!input) return null;
const trimmed = input.trim().toLowerCase();
if (!trimmed || !AGENT_ICON_NAME_RE.test(trimmed)) return null;
return trimmed;
}
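`parseAgentMentionHref` above relies on WHATWG `URL` behavior for non-special schemes, where the ID after `agent://` lands in `hostname` (and any slash segments in `pathname`). A standalone sketch of what the parser sees, plus why `extractAgentMentionIds` clones the global regex (the IDs here are made up for illustration):

```typescript
// "agent:" is a non-special scheme, but the "//" still triggers host
// parsing, so the identifier ends up in url.hostname.
const url = new URL("agent://agent-123?i=code");
console.log(url.protocol);              // "agent:"
console.log(url.hostname);              // "agent-123"
console.log(url.searchParams.get("i")); // "code"

// A /g regex is stateful via lastIndex; constructing a fresh copy per
// scan (as extractAgentMentionIds does) avoids matches being skipped
// when the shared pattern object is reused across calls.
const AGENT_MENTION_LINK_RE = /\[[^\]]*]\((agent:\/\/[^)\s]+)\)/gi;
const re = new RegExp(AGENT_MENTION_LINK_RE);
const ids: string[] = [];
let match: RegExpExecArray | null;
while ((match = re.exec("[@Bot](agent://agent-123?i=code)")) !== null) {
  ids.push(match[1]);
}
console.log(ids); // ["agent://agent-123?i=code"]
```

Parsing with `URL` rather than string splitting also gets query handling (the `?i=` icon parameter) and percent-decoding for free.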

View File

@@ -33,6 +33,11 @@ export interface CompanyPortabilityCompanyManifestEntry {
requireBoardApprovalForNewAgents: boolean;
}
export interface CompanyPortabilitySidebarOrder {
agents: string[];
projects: string[];
}
export interface CompanyPortabilityProjectManifestEntry {
slug: string;
name: string;
@@ -44,18 +49,52 @@ export interface CompanyPortabilityProjectManifestEntry {
color: string | null;
status: string | null;
executionWorkspacePolicy: Record<string, unknown> | null;
workspaces: CompanyPortabilityProjectWorkspaceManifestEntry[];
metadata: Record<string, unknown> | null;
}
export interface CompanyPortabilityProjectWorkspaceManifestEntry {
key: string;
name: string;
sourceType: string | null;
repoUrl: string | null;
repoRef: string | null;
defaultRef: string | null;
visibility: string | null;
setupCommand: string | null;
cleanupCommand: string | null;
metadata: Record<string, unknown> | null;
isPrimary: boolean;
}
export interface CompanyPortabilityIssueRoutineTriggerManifestEntry {
kind: string;
label: string | null;
enabled: boolean;
cronExpression: string | null;
timezone: string | null;
signingMode: string | null;
replayWindowSec: number | null;
}
export interface CompanyPortabilityIssueRoutineManifestEntry {
concurrencyPolicy: string | null;
catchUpPolicy: string | null;
triggers: CompanyPortabilityIssueRoutineTriggerManifestEntry[];
}
export interface CompanyPortabilityIssueManifestEntry {
slug: string;
identifier: string | null;
title: string;
path: string;
projectSlug: string | null;
projectWorkspaceKey: string | null;
assigneeAgentSlug: string | null;
description: string | null;
recurrence: Record<string, unknown> | null;
recurring: boolean;
routine: CompanyPortabilityIssueRoutineManifestEntry | null;
legacyRecurrence: Record<string, unknown> | null;
status: string | null;
priority: string | null;
labelIds: string[];
@@ -110,6 +149,7 @@ export interface CompanyPortabilityManifest {
} | null;
includes: CompanyPortabilityInclude;
company: CompanyPortabilityCompanyManifestEntry | null;
sidebar: CompanyPortabilitySidebarOrder | null;
agents: CompanyPortabilityAgentManifestEntry[];
skills: CompanyPortabilitySkillManifestEntry[];
projects: CompanyPortabilityProjectManifestEntry[];
@@ -245,6 +285,13 @@ export interface CompanyPortabilityImportResult {
name: string;
reason: string | null;
}[];
projects: {
slug: string;
id: string | null;
action: "created" | "updated" | "skipped";
name: string;
reason: string | null;
}[];
envInputs: CompanyPortabilityEnvInput[];
warnings: string[];
}
@@ -258,4 +305,5 @@ export interface CompanyPortabilityExportRequest {
projectIssues?: string[];
selectedFiles?: string[];
expandReferencedSkills?: boolean;
sidebarOrder?: Partial<CompanyPortabilitySidebarOrder>;
}

View File

@@ -144,9 +144,13 @@ export type {
CompanyPortabilityEnvInput,
CompanyPortabilityFileEntry,
CompanyPortabilityCompanyManifestEntry,
CompanyPortabilitySidebarOrder,
CompanyPortabilityAgentManifestEntry,
CompanyPortabilitySkillManifestEntry,
CompanyPortabilityProjectManifestEntry,
CompanyPortabilityProjectWorkspaceManifestEntry,
CompanyPortabilityIssueRoutineTriggerManifestEntry,
CompanyPortabilityIssueRoutineManifestEntry,
CompanyPortabilityIssueManifestEntry,
CompanyPortabilityManifest,
CompanyPortabilityExportResult,

View File

@@ -38,6 +38,11 @@ export const portabilityCompanyManifestEntrySchema = z.object({
requireBoardApprovalForNewAgents: z.boolean(),
});
export const portabilitySidebarOrderSchema = z.object({
agents: z.array(z.string().min(1)).default([]),
projects: z.array(z.string().min(1)).default([]),
});
export const portabilityAgentManifestEntrySchema = z.object({
slug: z.string().min(1),
name: z.string().min(1),
@@ -85,18 +90,50 @@ export const portabilityProjectManifestEntrySchema = z.object({
color: z.string().nullable(),
status: z.string().nullable(),
executionWorkspacePolicy: z.record(z.unknown()).nullable(),
workspaces: z.array(z.object({
key: z.string().min(1),
name: z.string().min(1),
sourceType: z.string().nullable(),
repoUrl: z.string().nullable(),
repoRef: z.string().nullable(),
defaultRef: z.string().nullable(),
visibility: z.string().nullable(),
setupCommand: z.string().nullable(),
cleanupCommand: z.string().nullable(),
metadata: z.record(z.unknown()).nullable(),
isPrimary: z.boolean(),
})).default([]),
metadata: z.record(z.unknown()).nullable(),
});
export const portabilityIssueRoutineTriggerManifestEntrySchema = z.object({
kind: z.string().min(1),
label: z.string().nullable(),
enabled: z.boolean(),
cronExpression: z.string().nullable(),
timezone: z.string().nullable(),
signingMode: z.string().nullable(),
replayWindowSec: z.number().int().nullable(),
});
export const portabilityIssueRoutineManifestEntrySchema = z.object({
concurrencyPolicy: z.string().nullable(),
catchUpPolicy: z.string().nullable(),
triggers: z.array(portabilityIssueRoutineTriggerManifestEntrySchema).default([]),
});
export const portabilityIssueManifestEntrySchema = z.object({
slug: z.string().min(1),
identifier: z.string().min(1).nullable(),
title: z.string().min(1),
path: z.string().min(1),
projectSlug: z.string().min(1).nullable(),
projectWorkspaceKey: z.string().min(1).nullable(),
assigneeAgentSlug: z.string().min(1).nullable(),
description: z.string().nullable(),
recurrence: z.record(z.unknown()).nullable(),
recurring: z.boolean().default(false),
routine: portabilityIssueRoutineManifestEntrySchema.nullable(),
legacyRecurrence: z.record(z.unknown()).nullable(),
status: z.string().nullable(),
priority: z.string().nullable(),
labelIds: z.array(z.string().min(1)).default([]),
@@ -123,6 +160,7 @@ export const portabilityManifestSchema = z.object({
skills: z.boolean(),
}),
company: portabilityCompanyManifestEntrySchema.nullable(),
sidebar: portabilitySidebarOrderSchema.nullable(),
agents: z.array(portabilityAgentManifestEntrySchema),
skills: z.array(portabilitySkillManifestEntrySchema).default([]),
projects: z.array(portabilityProjectManifestEntrySchema).default([]),
@@ -169,6 +207,7 @@ export const companyPortabilityExportSchema = z.object({
projectIssues: z.array(z.string().min(1)).optional(),
selectedFiles: z.array(z.string().min(1)).optional(),
expandReferencedSkills: z.boolean().optional(),
sidebarOrder: portabilitySidebarOrderSchema.partial().optional(),
});
export type CompanyPortabilityExport = z.infer<typeof companyPortabilityExportSchema>;

View File

@@ -60,6 +60,7 @@ export {
portabilityIncludeSchema,
portabilityEnvInputSchema,
portabilityCompanyManifestEntrySchema,
portabilitySidebarOrderSchema,
portabilityAgentManifestEntrySchema,
portabilitySkillManifestEntrySchema,
portabilityManifestSchema,

View File

@@ -0,0 +1,30 @@
diff --git a/dist/index.js b/dist/index.js
--- a/dist/index.js
+++ b/dist/index.js
@@ -23,7 +23,7 @@
* for a particular string, we need to force that string into the right locale.
* @see https://github.com/leinelissen/embedded-postgres/issues/15
*/
-const LC_MESSAGES_LOCALE = 'en_US.UTF-8';
+const LC_MESSAGES_LOCALE = 'C';
// The default configuration options for the class
const defaults = {
databaseDir: path.join(process.cwd(), 'data', 'db'),
@@ -133,7 +133,7 @@
`--pwfile=${passwordFile}`,
`--lc-messages=${LC_MESSAGES_LOCALE}`,
...this.options.initdbFlags,
- ], Object.assign(Object.assign({}, permissionIds), { env: { LC_MESSAGES: LC_MESSAGES_LOCALE } }));
+ ], Object.assign(Object.assign({}, permissionIds), { env: Object.assign(Object.assign({}, globalThis.process.env), { LC_MESSAGES: LC_MESSAGES_LOCALE }) }));
// Connect to stderr, as that is where the messages get sent
(_a = process.stdout) === null || _a === void 0 ? void 0 : _a.on('data', (chunk) => {
// Parse the data as a string and log it
@@ -177,7 +177,7 @@
'-p',
this.options.port.toString(),
...this.options.postgresFlags,
- ], Object.assign(Object.assign({}, permissionIds), { env: { LC_MESSAGES: LC_MESSAGES_LOCALE } }));
+ ], Object.assign(Object.assign({}, permissionIds), { env: Object.assign(Object.assign({}, globalThis.process.env), { LC_MESSAGES: LC_MESSAGES_LOCALE }) }));
// Connect to stderr, as that is where the messages get sent
(_a = this.process.stderr) === null || _a === void 0 ? void 0 : _a.on('data', (chunk) => {
// Parse the data as a string and log it

pnpm-lock.yaml generated
View File

@@ -4,6 +4,11 @@ settings:
autoInstallPeers: true
excludeLinksFromLockfile: false
patchedDependencies:
embedded-postgres@18.1.0-beta.16:
hash: 55uhvnotpqyiy37rn3pqpukhei
path: patches/embedded-postgres@18.1.0-beta.16.patch
importers:
.:
@@ -73,7 +78,7 @@ importers:
version: 0.38.4(@electric-sql/pglite@0.3.15)(@types/react@19.2.14)(kysely@0.28.11)(pg@8.18.0)(postgres@3.4.8)(react@19.2.4)
embedded-postgres:
specifier: ^18.1.0-beta.16
version: 18.1.0-beta.16
version: 18.1.0-beta.16(patch_hash=55uhvnotpqyiy37rn3pqpukhei)
picocolors:
specifier: ^1.1.1
version: 1.1.1
@@ -225,7 +230,7 @@ importers:
version: 0.38.4(@electric-sql/pglite@0.3.15)(@types/react@19.2.14)(kysely@0.28.11)(pg@8.18.0)(postgres@3.4.8)(react@19.2.4)
embedded-postgres:
specifier: ^18.1.0-beta.16
version: 18.1.0-beta.16
version: 18.1.0-beta.16(patch_hash=55uhvnotpqyiy37rn3pqpukhei)
postgres:
specifier: ^3.4.5
version: 3.4.8
@@ -494,7 +499,7 @@ importers:
version: 0.38.4(@electric-sql/pglite@0.3.15)(@types/react@19.2.14)(kysely@0.28.11)(pg@8.18.0)(postgres@3.4.8)(react@19.2.4)
embedded-postgres:
specifier: ^18.1.0-beta.16
version: 18.1.0-beta.16
version: 18.1.0-beta.16(patch_hash=55uhvnotpqyiy37rn3pqpukhei)
express:
specifier: ^5.1.0
version: 5.2.1
@@ -583,6 +588,9 @@ importers:
'@dnd-kit/utilities':
specifier: ^3.2.2
version: 3.2.2(react@19.2.4)
'@lexical/link':
specifier: 0.35.0
version: 0.35.0
'@mdxeditor/editor':
specifier: ^3.52.4
version: 3.52.4(@codemirror/language@6.12.1)(@lezer/highlight@1.2.3)(@types/react-dom@19.2.3(@types/react@19.2.14))(@types/react@19.2.14)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(yjs@13.6.29)
@@ -631,6 +639,9 @@ importers:
cmdk:
specifier: ^1.1.1
version: 1.1.1(@types/react-dom@19.2.3(@types/react@19.2.14))(@types/react@19.2.14)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
lexical:
specifier: 0.35.0
version: 0.35.0
lucide-react:
specifier: ^0.574.0
version: 0.574.0(react@19.2.4)
@@ -9967,7 +9978,7 @@ snapshots:
electron-to-chromium@1.5.286: {}
embedded-postgres@18.1.0-beta.16:
embedded-postgres@18.1.0-beta.16(patch_hash=55uhvnotpqyiy37rn3pqpukhei):
dependencies:
async-exit-hook: 2.0.1
pg: 8.18.0

releases/v2026.325.0.md Normal file
View File

@@ -0,0 +1,77 @@
# v2026.325.0
> Released: 2026-03-25
## Highlights
- **Company import/export** — Full company portability with a file-browser UX for importing and exporting agent companies. Includes rich frontmatter preview, nested file picker, merge-history support, GitHub shorthand refs, and CLI `company import`/`company export` commands. Imported companies open automatically after import, and heartbeat timers are disabled for imported agents by default. ([#840](https://github.com/paperclipai/paperclip/pull/840), [#1631](https://github.com/paperclipai/paperclip/pull/1631), [#1632](https://github.com/paperclipai/paperclip/pull/1632), [#1655](https://github.com/paperclipai/paperclip/pull/1655))
- **Company skills library** — New company-scoped skills system with a skills UI, agent skill sync across all local adapters (Claude, Codex, Pi, Gemini), pinned GitHub skills with update checks, and built-in skill support. ([#1346](https://github.com/paperclipai/paperclip/pull/1346))
- **Routines and recurring tasks** — Full routines engine with triggers, routine runs, coalescing, and recurring task portability. Includes API documentation and routine export support. ([#1351](https://github.com/paperclipai/paperclip/pull/1351), [#1622](https://github.com/paperclipai/paperclip/pull/1622), @aronprins)
## Improvements
- **Inline join requests in inbox** — Join requests now render inline in the inbox alongside approvals and other work items.
- **Onboarding seeding** — New projects and issues are seeded with goal context during onboarding for a better first-run experience.
- **Agent instructions recovery** — Managed agent instructions are recovered from disk on startup; instructions are preserved across adapter switches.
- **Heartbeats settings page** — Shows all agents regardless of interval config; added a "Disable All" button for quick bulk control.
- **Agent history via participation** — Agent issue history now uses participation records instead of direct assignment lookups.
- **Alphabetical agent sorting** — Agents are sorted alphabetically by name across all views.
- **Company org chart assets** — Improved generated org chart visuals for companies.
- **Improved CLI API connection errors** — Better error messages when the CLI cannot reach the Paperclip API.
- **Markdown mention links** — Custom URL schemes are now allowed in Lexical LinkNode, enabling mention pills with proper linking behavior. Atomic deletion of mention pills works correctly.
- **Issue workspace reuse** — Workspaces are correctly reused after isolation runs.
- **Failed-run session resume** — Explicit failed-run sessions can now be resumed via honor flag.
- **Docker image CI** — Added Docker image build and deploy workflow. ([#542](https://github.com/paperclipai/paperclip/pull/542), @albttx)
- **Project filter on issues** — Issues list can now be filtered by project. ([#552](https://github.com/paperclipai/paperclip/pull/552), @mvanhorn)
- **Inline comment image attachments** — Uploaded images are now embedded inline in comments. ([#551](https://github.com/paperclipai/paperclip/pull/551), @mvanhorn)
- **AGENTS.md fallback** — Claude-local adapter gracefully falls back when AGENTS.md is missing. ([#550](https://github.com/paperclipai/paperclip/pull/550), @mvanhorn)
- **Company-creator skill** — New skill for scaffolding agent company packages from scratch.
- **Reports page rename** — Reports section renamed for clarity. ([#1380](https://github.com/paperclipai/paperclip/pull/1380), @DanielSousa)
- **Eval framework bootstrap** — Promptfoo-based evaluation framework with YAML test cases for systematic agent behavior testing. ([#832](https://github.com/paperclipai/paperclip/pull/832), @mvanhorn)
- **Board CLI authentication** — Browser-based auth flow for the CLI so board users can authenticate without manually copying API keys. ([#1635](https://github.com/paperclipai/paperclip/pull/1635))
## Fixes
- **Embedded Postgres initdb in Docker slim** — Fixed initdb failure in slim containers by adding proper initdbFlags types. ([#737](https://github.com/paperclipai/paperclip/pull/737), @alaa-alghazouli)
- **OpenClaw gateway crash** — Fixed unhandled rejection when challengePromise fails. ([#743](https://github.com/paperclipai/paperclip/pull/743), @Sigmabrogz)
- **Agent mention pill alignment** — Fixed vertical misalignment between agent mention pills and project mention pills.
- **Task assignment grants** — Preserved task assignment grants for agents that have already joined.
- **Instructions tab state** — Fixed tab state not updating correctly when switching between agents.
- **Imported agent bundle frontmatter** — Fixed frontmatter leakage in imported agent bundles.
- **Login form 1Password detection** — Fixed login form not being detected by password managers; Enter key now submits correctly. ([#1014](https://github.com/paperclipai/paperclip/pull/1014))
- **Pill contrast (WCAG)** — Improved mention pill contrast using WCAG contrast ratios on composited backgrounds.
- **Documents horizontal scroll** — Prevented documents row from causing horizontal scroll on mobile.
- **Toggle switch sizing** — Fixed oversized toggle switches on mobile; added missing `data-slot` attributes.
- **Agent instructions tab responsive** — Made agent instructions tab responsive on mobile.
- **Monospace font sizing** — Adjusted inline code font size and added dark mode background.
- **Priority icon removal** — Removed priority icon from issue rows for a cleaner list view.
- **Same-page issue toasts** — Suppressed redundant toasts when navigating to an issue already on screen.
- **Noisy adapter log** — Removed noisy "Loaded agent instructions file" log message from all adapters.
- **Pi local adapter** — Fixed Pi adapter missing from `isLocal` check. ([#1382](https://github.com/paperclipai/paperclip/pull/1382), @lucas-stellet)
- **CLI auth migration idempotency** — Made migration 0044 idempotent to avoid failures on re-run.
- **Dev restart tracking** — `.paperclip` and test-only paths are now ignored in dev restart detection.
- **Duplicate CLI auth flag** — Fixed duplicate `--company` flag on `auth login`.
- **Gemini local execution** — Fixed Gemini local adapter execution and diagnostics.
- **Sidebar ordering** — Preserved sidebar ordering during company portability operations.
- **Company skill deduplication** — Fixed duplicate skill inventory refreshes.
- **Worktree merge-history migrations** — Fixed migration handling in worktree contexts. ([#1385](https://github.com/paperclipai/paperclip/pull/1385))
## Upgrade Guide
Seven new database migrations (`0038`-`0044`) will run automatically on startup:
- **Migration 0038** adds process tracking columns to heartbeat runs (PID, started-at, retry tracking).
- **Migration 0039** adds the routines engine tables (routines, triggers, routine runs).
- **Migrations 0040-0042** extend company skills, recurring tasks, and portability metadata.
- **Migration 0043** adds the Codex managed-home and agent instructions recovery columns.
- **Migration 0044** adds board API keys and CLI auth challenge tables for browser-based CLI auth.
All migrations are additive (new tables and columns) — no existing data is modified. Standard `paperclipai` startup will apply them automatically.
If you use the company import/export feature, note that imported companies have heartbeat timers disabled by default. Re-enable them manually from the Heartbeats settings page after verifying adapter configuration.
## Contributors
Thank you to everyone who contributed to this release!
@alaa-alghazouli, @albttx, @AOrobator, @aronprins, @cryppadotta, @DanielSousa, @lucas-stellet, @mvanhorn, @richardanaya, @Sigmabrogz

View File

@@ -0,0 +1,364 @@
#!/usr/bin/env npx tsx
/**
* Generate org chart images and READMEs for agent company packages.
*
* Reads company packages from a directory, builds manifest-like data,
* then uses the existing server-side SVG renderer (sharp, no browser)
* and README generator.
*
* Usage:
* npx tsx scripts/generate-company-assets.ts /path/to/companies-repo
*
* Processes each subdirectory that contains a COMPANY.md file.
*/
import * as fs from "fs";
import * as path from "path";
import { renderOrgChartPng, type OrgNode, type OrgChartOverlay } from "../server/src/routes/org-chart-svg.js";
import { generateReadme } from "../server/src/services/company-export-readme.js";
import type { CompanyPortabilityManifest } from "@paperclipai/shared";
// ── YAML frontmatter parser (minimal, no deps) ──────────────────
function parseFrontmatter(content: string): { data: Record<string, unknown>; body: string } {
const match = content.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
if (!match) return { data: {}, body: content };
const yamlStr = match[1];
const body = match[2];
const data: Record<string, unknown> = {};
let currentKey: string | null = null;
let currentValue: string | string[] | null = null;
let inList = false;
for (const line of yamlStr.split("\n")) {
// List item
if (inList && /^\s+-\s+/.test(line)) {
const val = line.replace(/^\s+-\s+/, "").trim();
(currentValue as string[]).push(val);
continue;
}
// Save previous key
if (currentKey !== null && currentValue !== null) {
data[currentKey] = currentValue;
}
inList = false;
// Key: value line
const kvMatch = line.match(/^(\w[\w-]*)\s*:\s*(.*)$/);
if (kvMatch) {
currentKey = kvMatch[1];
let val = kvMatch[2].trim();
if (val === "" || val === ">") {
// Could be a multi-line value or list — peek ahead handled by next iterations
currentValue = "";
continue;
}
if (val === "null" || val === "~") {
currentValue = null;
data[currentKey] = null;
currentKey = null;
currentValue = null;
continue;
}
// Remove surrounding quotes
if ((val.startsWith('"') && val.endsWith('"')) || (val.startsWith("'") && val.endsWith("'"))) {
val = val.slice(1, -1);
}
currentValue = val;
} else if (currentKey !== null && line.match(/^\s+-\s+/)) {
// Start of list
inList = true;
currentValue = [];
const val = line.replace(/^\s+-\s+/, "").trim();
(currentValue as string[]).push(val);
} else if (currentKey !== null && line.match(/^\s+\S/)) {
// Continuation of multi-line scalar
const trimmed = line.trim();
if (typeof currentValue === "string") {
currentValue = currentValue ? `${currentValue} ${trimmed}` : trimmed;
}
}
}
// Save last key
if (currentKey !== null && currentValue !== null) {
data[currentKey] = currentValue;
}
return { data, body };
}
// ── Slug to role mapping ─────────────────────────────────────────
const SLUG_TO_ROLE: Record<string, string> = {
ceo: "ceo",
cto: "cto",
cmo: "cmo",
cfo: "cfo",
coo: "coo",
};
function inferRole(slug: string, title: string | null): string {
// Check direct slug match first
if (SLUG_TO_ROLE[slug]) return SLUG_TO_ROLE[slug];
// Check title for C-suite
const t = (title || "").toLowerCase();
if (t.includes("chief executive")) return "ceo";
if (t.includes("chief technology")) return "cto";
if (t.includes("chief marketing")) return "cmo";
if (t.includes("chief financial")) return "cfo";
if (t.includes("chief operating")) return "coo";
if (t.includes("vp") || t.includes("vice president")) return "vp";
if (t.includes("manager")) return "manager";
if (t.includes("qa") || t.includes("quality")) return "engineer";
// Default to engineer
return "engineer";
}
// ── Parse a company package directory ────────────────────────────
interface CompanyPackage {
dir: string;
name: string;
description: string | null;
slug: string;
agents: CompanyPortabilityManifest["agents"];
skills: CompanyPortabilityManifest["skills"];
}
function parseCompanyPackage(companyDir: string): CompanyPackage | null {
const companyMdPath = path.join(companyDir, "COMPANY.md");
if (!fs.existsSync(companyMdPath)) return null;
const companyMd = fs.readFileSync(companyMdPath, "utf-8");
const { data: companyData } = parseFrontmatter(companyMd);
const name = (companyData.name as string) || path.basename(companyDir);
const description = (companyData.description as string) || null;
const slug = (companyData.slug as string) || path.basename(companyDir);
// Parse agents
const agentsDir = path.join(companyDir, "agents");
const agents: CompanyPortabilityManifest["agents"] = [];
if (fs.existsSync(agentsDir)) {
for (const agentSlug of fs.readdirSync(agentsDir)) {
const agentMdName = fs.existsSync(path.join(agentsDir, agentSlug, "AGENT.md"))
? "AGENT.md"
: fs.existsSync(path.join(agentsDir, agentSlug, "AGENTS.md"))
? "AGENTS.md"
: null;
if (!agentMdName) continue;
const agentMdPath = path.join(agentsDir, agentSlug, agentMdName);
const agentMd = fs.readFileSync(agentMdPath, "utf-8");
const { data: agentData } = parseFrontmatter(agentMd);
const agentName = (agentData.name as string) || agentSlug;
const title = (agentData.title as string) || null;
const reportsTo = agentData.reportsTo as string | null;
const skills = (agentData.skills as string[]) || [];
const role = inferRole(agentSlug, title);
agents.push({
slug: agentSlug,
name: agentName,
path: `agents/${agentSlug}/${agentMdName}`,
skills,
role,
title,
icon: null,
capabilities: null,
reportsToSlug: reportsTo || null,
adapterType: "claude_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
budgetMonthlyCents: 0,
metadata: null,
});
}
}
// Parse skills
const skillsDir = path.join(companyDir, "skills");
const skills: CompanyPortabilityManifest["skills"] = [];
if (fs.existsSync(skillsDir)) {
for (const skillSlug of fs.readdirSync(skillsDir)) {
const skillMdPath = path.join(skillsDir, skillSlug, "SKILL.md");
if (!fs.existsSync(skillMdPath)) continue;
const skillMd = fs.readFileSync(skillMdPath, "utf-8");
const { data: skillData } = parseFrontmatter(skillMd);
const skillName = (skillData.name as string) || skillSlug;
const skillDesc = (skillData.description as string) || null;
// Extract source info from metadata
let sourceType = "local";
let sourceLocator: string | null = null;
const metadata = skillData.metadata as Record<string, unknown> | undefined;
if (metadata) {
// metadata.sources is parsed as a nested structure, but our simple parser
// doesn't handle it well. Check for github repo in the raw SKILL.md instead.
const repoMatch = skillMd.match(/repo:\s*(.+)/);
const pathMatch = skillMd.match(/path:\s*(.+)/);
if (repoMatch) {
sourceType = "github";
const repo = repoMatch[1].trim();
const filePath = pathMatch ? pathMatch[1].trim() : "";
sourceLocator = `https://github.com/${repo}/blob/main/${filePath}`;
}
}
skills.push({
key: skillSlug,
slug: skillSlug,
name: skillName,
path: `skills/${skillSlug}/SKILL.md`,
description: skillDesc,
sourceType,
sourceLocator,
sourceRef: null,
trustLevel: null,
compatibility: null,
metadata: null,
fileInventory: [{ path: `skills/${skillSlug}/SKILL.md`, kind: "skill" }],
});
}
}
return { dir: companyDir, name, description, slug, agents, skills };
}
// ── Build OrgNode tree from agents ───────────────────────────────
const ROLE_LABELS: Record<string, string> = {
ceo: "Chief Executive",
cto: "Technology",
cmo: "Marketing",
cfo: "Finance",
coo: "Operations",
vp: "VP",
manager: "Manager",
engineer: "Engineer",
agent: "Agent",
};
function buildOrgTree(agents: CompanyPortabilityManifest["agents"]): OrgNode[] {
const bySlug = new Map(agents.map((a) => [a.slug, a]));
const childrenOf = new Map<string | null, typeof agents>();
for (const a of agents) {
const parent = a.reportsToSlug ?? null;
const list = childrenOf.get(parent) ?? [];
list.push(a);
childrenOf.set(parent, list);
}
const build = (parentSlug: string | null): OrgNode[] => {
const members = childrenOf.get(parentSlug) ?? [];
return members.map((m) => ({
id: m.slug,
name: m.name,
role: ROLE_LABELS[m.role] ?? m.role,
status: "active",
reports: build(m.slug),
}));
};
const roots = agents.filter((a) => !a.reportsToSlug || !bySlug.has(a.reportsToSlug));
const tree = build(null);
for (const root of roots) {
if (root.reportsToSlug && !bySlug.has(root.reportsToSlug)) {
tree.push({
id: root.slug,
name: root.name,
role: ROLE_LABELS[root.role] ?? root.role,
status: "active",
reports: build(root.slug),
});
}
}
return tree;
}
// ── Main ─────────────────────────────────────────────────────────
async function main() {
const companiesDir = process.argv[2];
if (!companiesDir) {
console.error("Usage: npx tsx scripts/generate-company-assets.ts <companies-dir>");
process.exit(1);
}
const resolvedDir = path.resolve(companiesDir);
if (!fs.existsSync(resolvedDir)) {
console.error(`Directory not found: ${resolvedDir}`);
process.exit(1);
}
const entries = fs.readdirSync(resolvedDir, { withFileTypes: true });
let processed = 0;
for (const entry of entries) {
if (!entry.isDirectory()) continue;
const companyDir = path.join(resolvedDir, entry.name);
const pkg = parseCompanyPackage(companyDir);
if (!pkg) continue;
console.log(`\n── ${pkg.name} (${pkg.slug}) ──`);
console.log(` ${pkg.agents.length} agents, ${pkg.skills.length} skills`);
// Generate org chart PNG
if (pkg.agents.length > 0) {
const orgTree = buildOrgTree(pkg.agents);
console.log(` Org tree roots: ${orgTree.map((n) => n.name).join(", ")}`);
const overlay: OrgChartOverlay = {
companyName: pkg.name,
stats: `Agents: ${pkg.agents.length}, Skills: ${pkg.skills.length}`,
};
const pngBuffer = await renderOrgChartPng(orgTree, "warmth", overlay);
const imagesDir = path.join(companyDir, "images");
fs.mkdirSync(imagesDir, { recursive: true });
const pngPath = path.join(imagesDir, "org-chart.png");
fs.writeFileSync(pngPath, pngBuffer);
console.log(`${path.relative(resolvedDir, pngPath)} (${(pngBuffer.length / 1024).toFixed(1)}kb)`);
}
// Generate README
const manifest: CompanyPortabilityManifest = {
schemaVersion: 1,
generatedAt: new Date().toISOString(),
source: null,
includes: { company: true, agents: true, projects: false, issues: false, skills: true },
company: null,
agents: pkg.agents,
skills: pkg.skills,
projects: [],
issues: [],
envInputs: [],
};
const readme = generateReadme(manifest, {
companyName: pkg.name,
companyDescription: pkg.description,
});
const readmePath = path.join(companyDir, "README.md");
fs.writeFileSync(readmePath, readme);
console.log(`${path.relative(resolvedDir, readmePath)}`);
processed++;
}
console.log(`\n✓ Processed ${processed} companies.`);
}
main().catch((e) => {
console.error(e);
process.exit(1);
});

View File

@@ -0,0 +1,31 @@
#!/usr/bin/env node
import { readFileSync, writeFileSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const __dirname = dirname(fileURLToPath(import.meta.url));
const repoRoot = resolve(__dirname, "..");
const uiDir = join(repoRoot, "ui");
const packageJsonPath = join(uiDir, "package.json");
const packageJson = JSON.parse(readFileSync(packageJsonPath, "utf8"));
const publishPackageJson = {
name: packageJson.name,
version: packageJson.version,
description: packageJson.description,
license: packageJson.license,
homepage: packageJson.homepage,
bugs: packageJson.bugs,
repository: packageJson.repository,
type: packageJson.type,
files: ["dist"],
publishConfig: {
access: "public",
},
};
writeFileSync(packageJsonPath, `${JSON.stringify(publishPackageJson, null, 2)}\n`);
console.log(" ✓ Generated publishable UI package.json");

View File

@@ -3,6 +3,12 @@ set -euo pipefail
base_cwd="${PAPERCLIP_WORKSPACE_BASE_CWD:?PAPERCLIP_WORKSPACE_BASE_CWD is required}"
worktree_cwd="${PAPERCLIP_WORKSPACE_CWD:?PAPERCLIP_WORKSPACE_CWD is required}"
paperclip_home="${PAPERCLIP_HOME:-$HOME/.paperclip}"
paperclip_instance_id="${PAPERCLIP_INSTANCE_ID:-default}"
paperclip_dir="$worktree_cwd/.paperclip"
worktree_config_path="$paperclip_dir/config.json"
worktree_env_path="$paperclip_dir/.env"
worktree_name="${PAPERCLIP_WORKSPACE_BRANCH:-$(basename "$worktree_cwd")}"
if [[ ! -d "$base_cwd" ]]; then
echo "Base workspace does not exist: $base_cwd" >&2
@@ -14,6 +20,286 @@ if [[ ! -d "$worktree_cwd" ]]; then
exit 1
fi
source_config_path="${PAPERCLIP_CONFIG:-}"
if [[ -z "$source_config_path" && ( -e "$base_cwd/.paperclip/config.json" || -L "$base_cwd/.paperclip/config.json" ) ]]; then
source_config_path="$base_cwd/.paperclip/config.json"
fi
if [[ -z "$source_config_path" ]]; then
source_config_path="$paperclip_home/instances/$paperclip_instance_id/config.json"
fi
source_env_path="$(dirname "$source_config_path")/.env"
mkdir -p "$paperclip_dir"
run_isolated_worktree_init() {
if command -v pnpm >/dev/null 2>&1 && pnpm paperclipai --help >/dev/null 2>&1; then
pnpm paperclipai worktree init --force --seed-mode minimal --name "$worktree_name" --from-config "$source_config_path"
return 0
fi
if command -v paperclipai >/dev/null 2>&1; then
paperclipai worktree init --force --seed-mode minimal --name "$worktree_name" --from-config "$source_config_path"
return 0
fi
return 1
}
write_fallback_worktree_config() {
WORKTREE_NAME="$worktree_name" \
BASE_CWD="$base_cwd" \
WORKTREE_CWD="$worktree_cwd" \
PAPERCLIP_DIR="$paperclip_dir" \
SOURCE_CONFIG_PATH="$source_config_path" \
SOURCE_ENV_PATH="$source_env_path" \
PAPERCLIP_WORKTREES_DIR="${PAPERCLIP_WORKTREES_DIR:-}" \
node <<'EOF'
const fs = require("node:fs");
const os = require("node:os");
const path = require("node:path");
const net = require("node:net");
function expandHomePrefix(value) {
if (!value) return value;
if (value === "~") return os.homedir();
if (value.startsWith("~/")) return path.resolve(os.homedir(), value.slice(2));
return value;
}
function nonEmpty(value) {
return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}
function sanitizeInstanceId(value) {
const trimmed = String(value ?? "").trim().toLowerCase();
const normalized = trimmed
.replace(/[^a-z0-9_-]+/g, "-")
.replace(/-+/g, "-")
.replace(/^[-_]+|[-_]+$/g, "");
return normalized || "worktree";
}
function parseEnvFile(contents) {
const entries = {};
for (const rawLine of contents.split(/\r?\n/)) {
const line = rawLine.trim();
if (!line || line.startsWith("#")) continue;
const match = rawLine.match(/^\s*(?:export\s+)?([A-Za-z_][A-Za-z0-9_]*)\s*=\s*(.*)\s*$/);
if (!match) continue;
const [, key, rawValue] = match;
const value = rawValue.trim();
if (!value) {
entries[key] = "";
continue;
}
if (
(value.startsWith("\"") && value.endsWith("\"")) ||
(value.startsWith("'") && value.endsWith("'"))
) {
entries[key] = value.slice(1, -1);
continue;
}
entries[key] = value.replace(/\s+#.*$/, "").trim();
}
return entries;
}
async function findAvailablePort(preferredPort, reserved = new Set()) {
const startPort = Number.isFinite(preferredPort) && preferredPort > 0 ? Math.trunc(preferredPort) : 0;
if (startPort > 0) {
for (let port = startPort; port < startPort + 100; port += 1) {
if (reserved.has(port)) continue;
const available = await new Promise((resolve) => {
const server = net.createServer();
server.unref();
server.once("error", () => resolve(false));
server.listen(port, "127.0.0.1", () => {
server.close(() => resolve(true));
});
});
if (available) return port;
}
}
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.once("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate a port.")));
return;
}
const port = address.port;
server.close(() => resolve(port));
});
});
}
function isLoopbackHost(hostname) {
const value = hostname.trim().toLowerCase();
return value === "127.0.0.1" || value === "localhost" || value === "::1";
}
function rewriteLocalUrlPort(rawUrl, port) {
if (!rawUrl) return undefined;
try {
const parsed = new URL(rawUrl);
if (!isLoopbackHost(parsed.hostname)) return rawUrl;
parsed.port = String(port);
return parsed.toString();
} catch {
return rawUrl;
}
}
function resolveRuntimeLikePath(value, configPath) {
const expanded = expandHomePrefix(value);
if (path.isAbsolute(expanded)) return expanded;
return path.resolve(path.dirname(configPath), expanded);
}
async function main() {
const worktreeName = process.env.WORKTREE_NAME;
const paperclipDir = process.env.PAPERCLIP_DIR;
const sourceConfigPath = process.env.SOURCE_CONFIG_PATH;
const sourceEnvPath = process.env.SOURCE_ENV_PATH;
const worktreeHome = path.resolve(expandHomePrefix(nonEmpty(process.env.PAPERCLIP_WORKTREES_DIR) ?? "~/.paperclip-worktrees"));
const instanceId = sanitizeInstanceId(worktreeName);
const instanceRoot = path.resolve(worktreeHome, "instances", instanceId);
const configPath = path.resolve(paperclipDir, "config.json");
const envPath = path.resolve(paperclipDir, ".env");
let sourceConfig = null;
if (sourceConfigPath && fs.existsSync(sourceConfigPath)) {
sourceConfig = JSON.parse(fs.readFileSync(sourceConfigPath, "utf8"));
}
const sourceEnvEntries =
sourceEnvPath && fs.existsSync(sourceEnvPath)
? parseEnvFile(fs.readFileSync(sourceEnvPath, "utf8"))
: {};
const preferredServerPort = Number(sourceConfig?.server?.port ?? 3101) + 1;
const serverPort = await findAvailablePort(preferredServerPort);
const preferredDbPort = Number(sourceConfig?.database?.embeddedPostgresPort ?? 54329) + 1;
const databasePort = await findAvailablePort(preferredDbPort, new Set([serverPort]));
fs.rmSync(configPath, { force: true });
fs.mkdirSync(path.dirname(configPath), { recursive: true });
fs.mkdirSync(instanceRoot, { recursive: true });
const authPublicBaseUrl = rewriteLocalUrlPort(sourceConfig?.auth?.publicBaseUrl, serverPort);
const targetConfig = {
$meta: {
version: 1,
updatedAt: new Date().toISOString(),
source: "configure",
},
...(sourceConfig?.llm ? { llm: sourceConfig.llm } : {}),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.resolve(instanceRoot, "db"),
embeddedPostgresPort: databasePort,
backup: {
enabled: sourceConfig?.database?.backup?.enabled ?? true,
intervalMinutes: sourceConfig?.database?.backup?.intervalMinutes ?? 60,
retentionDays: sourceConfig?.database?.backup?.retentionDays ?? 30,
dir: path.resolve(instanceRoot, "data", "backups"),
},
},
logging: {
mode: sourceConfig?.logging?.mode ?? "file",
logDir: path.resolve(instanceRoot, "logs"),
},
server: {
deploymentMode: sourceConfig?.server?.deploymentMode ?? "local_trusted",
exposure: sourceConfig?.server?.exposure ?? "private",
host: sourceConfig?.server?.host ?? "127.0.0.1",
port: serverPort,
allowedHostnames: sourceConfig?.server?.allowedHostnames ?? [],
serveUi: sourceConfig?.server?.serveUi ?? true,
},
auth: {
baseUrlMode: sourceConfig?.auth?.baseUrlMode ?? "auto",
...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
disableSignUp: sourceConfig?.auth?.disableSignUp ?? false,
},
storage: {
provider: sourceConfig?.storage?.provider ?? "local_disk",
localDisk: {
baseDir: path.resolve(instanceRoot, "data", "storage"),
},
s3: {
bucket: sourceConfig?.storage?.s3?.bucket ?? "paperclip",
region: sourceConfig?.storage?.s3?.region ?? "us-east-1",
endpoint: sourceConfig?.storage?.s3?.endpoint,
prefix: sourceConfig?.storage?.s3?.prefix ?? "",
forcePathStyle: sourceConfig?.storage?.s3?.forcePathStyle ?? false,
},
},
secrets: {
provider: sourceConfig?.secrets?.provider ?? "local_encrypted",
strictMode: sourceConfig?.secrets?.strictMode ?? false,
localEncrypted: {
keyFilePath: path.resolve(instanceRoot, "secrets", "master.key"),
},
},
};
fs.writeFileSync(configPath, `${JSON.stringify(targetConfig, null, 2)}\n`, { mode: 0o600 });
const inlineMasterKey = nonEmpty(sourceEnvEntries.PAPERCLIP_SECRETS_MASTER_KEY);
if (inlineMasterKey) {
fs.mkdirSync(path.resolve(instanceRoot, "secrets"), { recursive: true });
fs.writeFileSync(targetConfig.secrets.localEncrypted.keyFilePath, inlineMasterKey, {
encoding: "utf8",
mode: 0o600,
});
} else {
const sourceKeyFilePath = nonEmpty(sourceEnvEntries.PAPERCLIP_SECRETS_MASTER_KEY_FILE)
? resolveRuntimeLikePath(sourceEnvEntries.PAPERCLIP_SECRETS_MASTER_KEY_FILE, sourceConfigPath)
: nonEmpty(sourceConfig?.secrets?.localEncrypted?.keyFilePath)
? resolveRuntimeLikePath(sourceConfig.secrets.localEncrypted.keyFilePath, sourceConfigPath)
: null;
if (sourceKeyFilePath && fs.existsSync(sourceKeyFilePath)) {
fs.mkdirSync(path.resolve(instanceRoot, "secrets"), { recursive: true });
fs.copyFileSync(sourceKeyFilePath, targetConfig.secrets.localEncrypted.keyFilePath);
fs.chmodSync(targetConfig.secrets.localEncrypted.keyFilePath, 0o600);
}
}
const envLines = [
"PAPERCLIP_HOME=" + JSON.stringify(worktreeHome),
"PAPERCLIP_INSTANCE_ID=" + JSON.stringify(instanceId),
"PAPERCLIP_CONFIG=" + JSON.stringify(configPath),
"PAPERCLIP_CONTEXT=" + JSON.stringify(path.resolve(worktreeHome, "context.json")),
"PAPERCLIP_IN_WORKTREE=true",
"PAPERCLIP_WORKTREE_NAME=" + JSON.stringify(worktreeName),
];
const agentJwtSecret = nonEmpty(sourceEnvEntries.PAPERCLIP_AGENT_JWT_SECRET);
if (agentJwtSecret) {
envLines.push("PAPERCLIP_AGENT_JWT_SECRET=" + JSON.stringify(agentJwtSecret));
}
fs.writeFileSync(envPath, `${envLines.join("\n")}\n`, { mode: 0o600 });
}
main().catch((error) => {
console.error(error instanceof Error ? error.message : String(error));
process.exit(1);
});
EOF
}
if ! run_isolated_worktree_init; then
echo "paperclipai CLI not available in this workspace; writing isolated fallback config without DB seeding." >&2
write_fallback_worktree_config
fi
while IFS= read -r relative_path; do
[[ -n "$relative_path" ]] || continue
source_path="$base_cwd/$relative_path"

View File

@@ -33,7 +33,7 @@
],
"scripts": {
"dev": "tsx src/index.ts",
"dev:watch": "cross-env PAPERCLIP_MIGRATION_PROMPT=never PAPERCLIP_MIGRATION_AUTO_APPLY=true tsx watch --ignore ../ui/node_modules --ignore ../ui/.vite --ignore ../ui/dist src/index.ts",
"dev:watch": "cross-env PAPERCLIP_MIGRATION_PROMPT=never PAPERCLIP_MIGRATION_AUTO_APPLY=true tsx ./scripts/dev-watch.ts",
"prepare:ui-dist": "bash ../scripts/prepare-server-ui-dist.sh",
"build": "tsc && mkdir -p dist/onboarding-assets && cp -R src/onboarding-assets/. dist/onboarding-assets/",
"prepack": "pnpm run prepare:ui-dist",

View File

@@ -0,0 +1,33 @@
import { spawn } from "node:child_process";
import { createRequire } from "node:module";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { resolveServerDevWatchIgnorePaths } from "../src/dev-watch-ignore.ts";
const require = createRequire(import.meta.url);
const tsxCliPath = require.resolve("tsx/dist/cli.mjs");
const serverRoot = path.resolve(path.dirname(fileURLToPath(import.meta.url)), "..");
const ignoreArgs = resolveServerDevWatchIgnorePaths(serverRoot).flatMap((ignorePath) => ["--exclude", ignorePath]);
const child = spawn(
process.execPath,
[tsxCliPath, "watch", ...ignoreArgs, "src/index.ts"],
{
cwd: serverRoot,
env: process.env,
stdio: "inherit",
},
);
child.on("exit", (code, signal) => {
if (signal) {
process.kill(process.pid, signal);
return;
}
process.exit(code ?? 0);
});
child.on("error", (error) => {
console.error(error);
process.exit(1);
});

View File

@@ -1,4 +1,4 @@
import { describe, expect, it } from "vitest";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
@@ -7,6 +7,12 @@ import { testEnvironment } from "@paperclipai/adapter-codex-local/server";
const itWindows = process.platform === "win32" ? it : it.skip;
describe("codex_local environment diagnostics", () => {
beforeEach(() => {
vi.stubEnv("OPENAI_API_KEY", "");
});
afterEach(() => {
vi.unstubAllEnvs();
});
it("creates a missing working directory when cwd is absolute", async () => {
const cwd = path.join(
os.tmpdir(),
@@ -32,6 +38,67 @@ describe("codex_local environment diagnostics", () => {
await fs.rm(path.dirname(cwd), { recursive: true, force: true });
});
it("emits codex_native_auth_present when ~/.codex/auth.json exists and OPENAI_API_KEY is unset", async () => {
const root = path.join(
os.tmpdir(),
`paperclip-codex-auth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
);
const codexHome = path.join(root, ".codex");
const cwd = path.join(root, "workspace");
try {
await fs.mkdir(codexHome, { recursive: true });
await fs.writeFile(
path.join(codexHome, "auth.json"),
JSON.stringify({ accessToken: "fake-token", accountId: "acct-1" }),
);
const result = await testEnvironment({
companyId: "company-1",
adapterType: "codex_local",
config: {
command: process.execPath,
cwd,
env: { CODEX_HOME: codexHome },
},
});
expect(result.checks.some((check) => check.code === "codex_native_auth_present")).toBe(true);
expect(result.checks.some((check) => check.code === "codex_openai_api_key_missing")).toBe(false);
} finally {
await fs.rm(root, { recursive: true, force: true });
}
});
it("emits codex_openai_api_key_missing when neither env var nor native auth exists", async () => {
const root = path.join(
os.tmpdir(),
`paperclip-codex-noauth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
);
const codexHome = path.join(root, ".codex");
const cwd = path.join(root, "workspace");
try {
await fs.mkdir(codexHome, { recursive: true });
// No auth.json written
const result = await testEnvironment({
companyId: "company-1",
adapterType: "codex_local",
config: {
command: process.execPath,
cwd,
env: { CODEX_HOME: codexHome },
},
});
expect(result.checks.some((check) => check.code === "codex_openai_api_key_missing")).toBe(true);
expect(result.checks.some((check) => check.code === "codex_native_auth_present")).toBe(false);
} finally {
await fs.rm(root, { recursive: true, force: true });
}
});
itWindows("runs the hello probe when Codex is available via a Windows .cmd wrapper", async () => {
const root = path.join(
os.tmpdir(),

View File

@@ -210,7 +210,7 @@ describe("codex execute", () => {
"company-1",
"codex-home",
);
const workspaceSkill = path.join(workspace, ".agents", "skills", "paperclip");
const homeSkill = path.join(isolatedCodexHome, "skills", "paperclip");
await fs.mkdir(workspace, { recursive: true });
await fs.mkdir(sharedCodexHome, { recursive: true });
await fs.writeFile(path.join(sharedCodexHome, "auth.json"), '{"token":"shared"}\n', "utf8");
@@ -284,7 +284,7 @@ describe("codex execute", () => {
expect(await fs.realpath(isolatedAuth)).toBe(await fs.realpath(path.join(sharedCodexHome, "auth.json")));
expect((await fs.lstat(isolatedConfig)).isFile()).toBe(true);
expect(await fs.readFile(isolatedConfig, "utf8")).toBe('model = "codex-mini-latest"\n');
expect((await fs.lstat(workspaceSkill)).isSymbolicLink()).toBe(true);
expect((await fs.lstat(homeSkill)).isSymbolicLink()).toBe(true);
expect(logs).toContainEqual(
expect.objectContaining({
stream: "stdout",
@@ -371,7 +371,7 @@ describe("codex execute", () => {
const capture = JSON.parse(await fs.readFile(capturePath, "utf8")) as CapturePayload;
expect(capture.codexHome).toBe(explicitCodexHome);
expect((await fs.lstat(path.join(workspace, ".agents", "skills", "paperclip"))).isSymbolicLink()).toBe(true);
expect((await fs.lstat(path.join(explicitCodexHome, "skills", "paperclip"))).isSymbolicLink()).toBe(true);
await expect(fs.lstat(path.join(paperclipHome, "instances", "worktree-1", "codex-home"))).rejects.toThrow();
} finally {
if (previousHome === undefined) delete process.env.HOME;

View File

@@ -43,7 +43,7 @@ describe("codex local skill sync", () => {
expect(before.desiredSkills).toContain(paperclipKey);
expect(before.entries.find((entry) => entry.key === paperclipKey)?.required).toBe(true);
expect(before.entries.find((entry) => entry.key === paperclipKey)?.state).toBe("configured");
expect(before.entries.find((entry) => entry.key === paperclipKey)?.detail).toContain(".agents/skills");
expect(before.entries.find((entry) => entry.key === paperclipKey)?.detail).toContain("CODEX_HOME/skills/");
});
it("does not persist Paperclip skills into CODEX_HOME during sync", async () => {

File diff suppressed because it is too large

View File

@@ -5,6 +5,7 @@ import { afterEach, describe, expect, it } from "vitest";
import {
discoverProjectWorkspaceSkillDirectories,
findMissingLocalSkillIds,
normalizeGitHubSkillDirectory,
parseSkillImportSourceInput,
readLocalSkillImportFromDirectory,
} from "../services/company-skills.js";
@@ -86,6 +87,13 @@ describe("company skill import source parsing", () => {
});
describe("project workspace skill discovery", () => {
it("normalizes GitHub skill directories for blob imports and legacy metadata", () => {
expect(normalizeGitHubSkillDirectory("retro/.", "retro")).toBe("retro");
expect(normalizeGitHubSkillDirectory("retro/SKILL.md", "retro")).toBe("retro");
expect(normalizeGitHubSkillDirectory("SKILL.md", "root-skill")).toBe("");
expect(normalizeGitHubSkillDirectory("", "fallback-skill")).toBe("fallback-skill");
});
it("finds bounded skill roots under supported workspace paths", async () => {
const workspace = await makeTempDir("paperclip-skill-workspace-");
await writeSkillDir(workspace, "Workspace Root");

View File

@@ -1,4 +1,4 @@
import { describe, expect, it } from "vitest";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
@@ -28,6 +28,13 @@ console.log(JSON.stringify({
}
describe("cursor environment diagnostics", () => {
beforeEach(() => {
vi.stubEnv("CURSOR_API_KEY", "");
});
afterEach(() => {
vi.unstubAllEnvs();
});
it("creates a missing working directory when cwd is absolute", async () => {
const cwd = path.join(
os.tmpdir(),
@@ -116,4 +123,73 @@ describe("cursor environment diagnostics", () => {
expect(args).not.toContain("--trust");
await fs.rm(root, { recursive: true, force: true });
});
it("emits cursor_native_auth_present when cli-config.json has authInfo and CURSOR_API_KEY is unset", async () => {
const root = path.join(
os.tmpdir(),
`paperclip-cursor-auth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
);
const cursorHome = path.join(root, ".cursor");
const cwd = path.join(root, "workspace");
try {
await fs.mkdir(cursorHome, { recursive: true });
await fs.writeFile(
path.join(cursorHome, "cli-config.json"),
JSON.stringify({
authInfo: {
email: "test@example.com",
displayName: "Test User",
userId: 12345,
},
}),
);
const result = await testEnvironment({
companyId: "company-1",
adapterType: "cursor",
config: {
command: process.execPath,
cwd,
env: { CURSOR_HOME: cursorHome },
},
});
expect(result.checks.some((check) => check.code === "cursor_native_auth_present")).toBe(true);
expect(result.checks.some((check) => check.code === "cursor_api_key_missing")).toBe(false);
const authCheck = result.checks.find((check) => check.code === "cursor_native_auth_present");
expect(authCheck?.detail).toContain("test@example.com");
} finally {
await fs.rm(root, { recursive: true, force: true });
}
});
it("emits cursor_api_key_missing when neither env var nor native auth exists", async () => {
const root = path.join(
os.tmpdir(),
`paperclip-cursor-noauth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
);
const cursorHome = path.join(root, ".cursor");
const cwd = path.join(root, "workspace");
try {
await fs.mkdir(cursorHome, { recursive: true });
// No cli-config.json written
const result = await testEnvironment({
companyId: "company-1",
adapterType: "cursor",
config: {
command: process.execPath,
cwd,
env: { CURSOR_HOME: cursorHome },
},
});
expect(result.checks.some((check) => check.code === "cursor_api_key_missing")).toBe(true);
expect(result.checks.some((check) => check.code === "cursor_native_auth_present")).toBe(false);
} finally {
await fs.rm(root, { recursive: true, force: true });
}
});
});

View File

@@ -0,0 +1,42 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { describe, expect, it } from "vitest";
import { resolveServerDevWatchIgnorePaths } from "../dev-watch-ignore.js";
describe("resolveServerDevWatchIgnorePaths", () => {
it("includes both the worktree UI paths and their real shared targets", () => {
const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-dev-watch-"));
const sharedUiRoot = path.join(tempRoot, "shared-ui");
const worktreeRoot = path.join(tempRoot, "repo", ".paperclip", "worktrees", "PAP-884");
const serverRoot = path.join(worktreeRoot, "server");
const worktreeUiRoot = path.join(worktreeRoot, "ui");
fs.mkdirSync(path.join(sharedUiRoot, "node_modules"), { recursive: true });
fs.mkdirSync(path.join(sharedUiRoot, ".vite"), { recursive: true });
fs.mkdirSync(path.join(sharedUiRoot, "dist"), { recursive: true });
fs.mkdirSync(serverRoot, { recursive: true });
fs.mkdirSync(worktreeUiRoot, { recursive: true });
fs.symlinkSync(path.join(sharedUiRoot, "node_modules"), path.join(worktreeUiRoot, "node_modules"));
fs.symlinkSync(path.join(sharedUiRoot, ".vite"), path.join(worktreeUiRoot, ".vite"));
fs.symlinkSync(path.join(sharedUiRoot, "dist"), path.join(worktreeUiRoot, "dist"));
const ignorePaths = resolveServerDevWatchIgnorePaths(serverRoot);
expect(ignorePaths).toContain(path.join(worktreeUiRoot, "node_modules"));
expect(ignorePaths).toContain(`${path.join(worktreeUiRoot, "node_modules").replaceAll(path.sep, "/")}/**`);
expect(ignorePaths).toContain(fs.realpathSync(path.join(sharedUiRoot, "node_modules")));
expect(ignorePaths).toContain(`${fs.realpathSync(path.join(sharedUiRoot, "node_modules")).replaceAll(path.sep, "/")}/**`);
expect(ignorePaths).toContain(path.join(worktreeUiRoot, "node_modules", ".vite-temp"));
expect(ignorePaths).toContain(
`${path.join(worktreeUiRoot, "node_modules", ".vite-temp").replaceAll(path.sep, "/")}/**`,
);
expect(ignorePaths).toContain(path.join(worktreeUiRoot, ".vite"));
expect(ignorePaths).toContain(fs.realpathSync(path.join(sharedUiRoot, ".vite")));
expect(ignorePaths).toContain(path.join(worktreeUiRoot, "dist"));
expect(ignorePaths).toContain(fs.realpathSync(path.join(sharedUiRoot, "dist")));
expect(ignorePaths).toContain("**/{node_modules,bower_components,vendor}/**");
expect(ignorePaths).toContain("**/.vite-temp/**");
});
});

View File

@@ -1,89 +1,29 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { spawn, type ChildProcess } from "node:child_process";
import { eq } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
applyPendingMigrations,
createDb,
ensurePostgresDatabase,
agents,
agentWakeupRequests,
companies,
createDb,
heartbeatRunEvents,
heartbeatRuns,
issues,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { runningProcesses } from "../adapters/index.ts";
import { heartbeatService } from "../services/heartbeat.ts";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres heartbeat recovery tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
function spawnAliveProcess() {
@@ -92,17 +32,14 @@ function spawnAliveProcess() {
});
}
describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
let db!: ReturnType<typeof createDb>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
const childProcesses = new Set<ChildProcess>();
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-heartbeat-recovery-");
db = createDb(tempDb.connectionString);
}, 20_000);
afterEach(async () => {
@@ -125,10 +62,7 @@ describe("heartbeat orphaned process recovery", () => {
}
childProcesses.clear();
runningProcesses.clear();
await tempDb?.cleanup();
});
async function seedRunFixture(input?: {

View File

@@ -0,0 +1,6 @@
export {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestDatabase,
type EmbeddedPostgresTestSupport,
} from "@paperclipai/db";

View File

@@ -0,0 +1,57 @@
import { describe, expect, it } from "vitest";
import { agentJoinGrantsFromDefaults } from "../routes/access.js";
describe("agentJoinGrantsFromDefaults", () => {
it("adds tasks:assign when invite defaults do not specify agent grants", () => {
expect(agentJoinGrantsFromDefaults(null)).toEqual([
{
permissionKey: "tasks:assign",
scope: null,
},
]);
});
it("preserves invite agent grants and appends tasks:assign", () => {
expect(
agentJoinGrantsFromDefaults({
agent: {
grants: [
{
permissionKey: "agents:create",
scope: null,
},
],
},
}),
).toEqual([
{
permissionKey: "agents:create",
scope: null,
},
{
permissionKey: "tasks:assign",
scope: null,
},
]);
});
it("does not duplicate tasks:assign when invite defaults already include it", () => {
expect(
agentJoinGrantsFromDefaults({
agent: {
grants: [
{
permissionKey: "tasks:assign",
scope: { projectId: "project-1" },
},
],
},
}),
).toEqual([
{
permissionKey: "tasks:assign",
scope: { projectId: "project-1" },
},
]);
});
});
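The behavior these tests pin down can be sketched as follows. This is a hypothetical implementation inferred from the assertions above, not the actual `routes/access.js` source; the `Grant` and `InviteDefaults` shapes are assumptions based on the fixtures:

```typescript
// Sketch (inferred from the tests): start from the invite's agent grants,
// then append a tasks:assign grant unless one is already present,
// regardless of the scope the existing grant carries.
type Grant = { permissionKey: string; scope: Record<string, string> | null };
type InviteDefaults = { agent?: { grants?: Grant[] } } | null;

function agentJoinGrantsFromDefaults(defaults: InviteDefaults): Grant[] {
  const grants = defaults?.agent?.grants ? [...defaults.agent.grants] : [];
  if (!grants.some((grant) => grant.permissionKey === "tasks:assign")) {
    grants.push({ permissionKey: "tasks:assign", scope: null });
  }
  return grants;
}
```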

View File

@@ -20,16 +20,29 @@ describe("issue goal fallback", () => {
resolveIssueGoalId({
projectId: null,
goalId: "goal-2",
projectGoalId: "goal-3",
defaultGoalId: "goal-1",
}),
).toBe("goal-2");
});
it("inherits the project goal when creating a project-linked issue", () => {
expect(
resolveIssueGoalId({
projectId: "project-1",
goalId: null,
projectGoalId: "goal-2",
defaultGoalId: "goal-1",
}),
).toBe("goal-2");
});
it("does not force a company goal when the project has no goal", () => {
expect(
resolveIssueGoalId({
projectId: "project-1",
goalId: null,
projectGoalId: null,
defaultGoalId: "goal-1",
}),
).toBeNull();
@@ -40,20 +53,47 @@ describe("issue goal fallback", () => {
resolveNextIssueGoalId({
currentProjectId: null,
currentGoalId: null,
currentProjectGoalId: null,
defaultGoalId: "goal-1",
}),
).toBe("goal-1");
});
it("switches from the company fallback to the project goal when a project is added later", () => {
expect(
resolveNextIssueGoalId({
currentProjectId: null,
currentGoalId: "goal-1",
currentProjectGoalId: null,
projectId: "project-1",
goalId: null,
projectGoalId: "goal-2",
defaultGoalId: "goal-1",
}),
).toBe("goal-2");
});
it("backfills the project goal for legacy project-linked issues on update", () => {
expect(
resolveNextIssueGoalId({
currentProjectId: "project-1",
currentGoalId: null,
currentProjectGoalId: "goal-2",
defaultGoalId: "goal-1",
}),
).toBe("goal-2");
});
it("preserves an explicit goal across project fallback changes", () => {
expect(
resolveNextIssueGoalId({
currentProjectId: "project-1",
currentGoalId: "goal-explicit",
currentProjectGoalId: "goal-2",
projectId: "project-2",
projectGoalId: "goal-3",
defaultGoalId: "goal-1",
}),
).toBe("goal-explicit");
});
});
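The fallback order these tests exercise can be sketched like this. This is a hypothetical reconstruction from the test cases, not the real implementation; the input shapes and the "current goal equals the company fallback" heuristic are assumptions:

```typescript
// Sketch (inferred from the tests): an explicit goal always wins; a
// project-linked issue otherwise inherits the project's goal (possibly none);
// only project-less issues fall back to the default company goal.
type GoalInput = {
  projectId: string | null;
  goalId: string | null;
  projectGoalId: string | null;
  defaultGoalId: string | null;
};

function resolveIssueGoalId(input: GoalInput): string | null {
  if (input.goalId) return input.goalId;
  if (input.projectId) return input.projectGoalId;
  return input.defaultGoalId;
}

function resolveNextIssueGoalId(input: {
  currentProjectId: string | null;
  currentGoalId: string | null;
  currentProjectGoalId: string | null;
  projectId?: string | null;
  goalId?: string | null;
  projectGoalId?: string | null;
  defaultGoalId: string | null;
}): string | null {
  // A goal other than the company fallback is treated as explicit and kept.
  if (input.currentGoalId && input.currentGoalId !== input.defaultGoalId) {
    return input.currentGoalId;
  }
  // Otherwise recompute from the next (or unchanged) project linkage.
  return resolveIssueGoalId({
    projectId: input.projectId ?? input.currentProjectId,
    goalId: input.goalId ?? null,
    projectGoalId: input.projectGoalId ?? input.currentProjectGoalId,
    defaultGoalId: input.defaultGoalId,
  });
}
```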

View File

@@ -0,0 +1,187 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { issueRoutes } from "../routes/issues.js";
import { errorHandler } from "../middleware/index.js";
const mockIssueService = vi.hoisted(() => ({
getById: vi.fn(),
getAncestors: vi.fn(),
findMentionedProjectIds: vi.fn(),
getCommentCursor: vi.fn(),
getComment: vi.fn(),
}));
const mockProjectService = vi.hoisted(() => ({
getById: vi.fn(),
listByIds: vi.fn(),
}));
const mockGoalService = vi.hoisted(() => ({
getById: vi.fn(),
getDefaultCompanyGoal: vi.fn(),
}));
vi.mock("../services/index.js", () => ({
accessService: () => ({
canUser: vi.fn(),
hasPermission: vi.fn(),
}),
agentService: () => ({
getById: vi.fn(),
}),
documentService: () => ({
getIssueDocumentPayload: vi.fn(async () => ({})),
}),
executionWorkspaceService: () => ({
getById: vi.fn(),
}),
goalService: () => mockGoalService,
heartbeatService: () => ({
wakeup: vi.fn(async () => undefined),
reportRunActivity: vi.fn(async () => undefined),
}),
issueApprovalService: () => ({}),
issueService: () => mockIssueService,
logActivity: vi.fn(async () => undefined),
projectService: () => mockProjectService,
routineService: () => ({
syncRunStatusForIssue: vi.fn(async () => undefined),
}),
workProductService: () => ({
listForIssue: vi.fn(async () => []),
}),
}));
function createApp() {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
(req as any).actor = {
type: "board",
userId: "local-board",
companyIds: ["company-1"],
source: "local_implicit",
isInstanceAdmin: false,
};
next();
});
app.use("/api", issueRoutes({} as any, {} as any));
app.use(errorHandler);
return app;
}
const legacyProjectLinkedIssue = {
id: "11111111-1111-4111-8111-111111111111",
companyId: "company-1",
identifier: "PAP-581",
title: "Legacy onboarding task",
description: "Seed the first CEO task",
status: "todo",
priority: "medium",
projectId: "22222222-2222-4222-8222-222222222222",
goalId: null,
parentId: null,
assigneeAgentId: "33333333-3333-4333-8333-333333333333",
assigneeUserId: null,
updatedAt: new Date("2026-03-24T12:00:00Z"),
executionWorkspaceId: null,
labels: [],
labelIds: [],
};
const projectGoal = {
id: "44444444-4444-4444-8444-444444444444",
companyId: "company-1",
title: "Launch the company",
description: null,
level: "company",
status: "active",
parentId: null,
ownerAgentId: null,
createdAt: new Date("2026-03-20T00:00:00Z"),
updatedAt: new Date("2026-03-20T00:00:00Z"),
};
describe("issue goal context routes", () => {
beforeEach(() => {
vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(legacyProjectLinkedIssue);
mockIssueService.getAncestors.mockResolvedValue([]);
mockIssueService.findMentionedProjectIds.mockResolvedValue([]);
mockIssueService.getCommentCursor.mockResolvedValue({
totalComments: 0,
latestCommentId: null,
latestCommentAt: null,
});
mockIssueService.getComment.mockResolvedValue(null);
mockProjectService.getById.mockResolvedValue({
id: legacyProjectLinkedIssue.projectId,
companyId: "company-1",
urlKey: "onboarding",
goalId: projectGoal.id,
goalIds: [projectGoal.id],
goals: [{ id: projectGoal.id, title: projectGoal.title }],
name: "Onboarding",
description: null,
status: "in_progress",
leadAgentId: null,
targetDate: null,
color: null,
pauseReason: null,
pausedAt: null,
executionWorkspacePolicy: null,
codebase: {
workspaceId: null,
repoUrl: null,
repoRef: null,
defaultRef: null,
repoName: null,
localFolder: null,
managedFolder: "/tmp/company-1/project-1",
effectiveLocalFolder: "/tmp/company-1/project-1",
origin: "managed_checkout",
},
workspaces: [],
primaryWorkspace: null,
archivedAt: null,
createdAt: new Date("2026-03-20T00:00:00Z"),
updatedAt: new Date("2026-03-20T00:00:00Z"),
});
mockProjectService.listByIds.mockResolvedValue([]);
mockGoalService.getById.mockImplementation(async (id: string) =>
id === projectGoal.id ? projectGoal : null,
);
mockGoalService.getDefaultCompanyGoal.mockResolvedValue(null);
});
it("surfaces the project goal from GET /issues/:id when the issue has no direct goal", async () => {
const res = await request(createApp()).get("/api/issues/11111111-1111-4111-8111-111111111111");
expect(res.status).toBe(200);
expect(res.body.goalId).toBe(projectGoal.id);
expect(res.body.goal).toEqual(
expect.objectContaining({
id: projectGoal.id,
title: projectGoal.title,
}),
);
expect(mockGoalService.getDefaultCompanyGoal).not.toHaveBeenCalled();
});
it("surfaces the project goal from GET /issues/:id/heartbeat-context", async () => {
const res = await request(createApp()).get(
"/api/issues/11111111-1111-4111-8111-111111111111/heartbeat-context",
);
expect(res.status).toBe(200);
expect(res.body.issue.goalId).toBe(projectGoal.id);
expect(res.body.goal).toEqual(
expect.objectContaining({
id: projectGoal.id,
title: projectGoal.title,
}),
);
expect(mockGoalService.getDefaultCompanyGoal).not.toHaveBeenCalled();
});
});

View File

@@ -1,103 +1,43 @@
import { randomUUID } from "node:crypto";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
activityLog,
agents,
companies,
createDb,
issueComments,
issueInboxArchives,
issues,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { issueService } from "../services/issues.ts";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres issue service tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("issueService.list participantAgentId", () => {
let db!: ReturnType<typeof createDb>;
let svc!: ReturnType<typeof issueService>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-issues-service-");
db = createDb(tempDb.connectionString);
svc = issueService(db);
}, 20_000);
afterEach(async () => {
await db.delete(issueComments);
await db.delete(issueInboxArchives);
await db.delete(activityLog);
await db.delete(issues);
await db.delete(agents);
@@ -105,10 +45,7 @@ describe("issueService.list participantAgentId", () => {
});
afterAll(async () => {
await tempDb?.cleanup();
});
it("returns issues an agent participated in across the supported signals", async () => {
@@ -281,4 +218,99 @@ describe("issueService.list participantAgentId", () => {
expect(result.map((issue) => issue.id)).toEqual([matchedIssueId]);
});
it("hides archived inbox issues until new external activity arrives", async () => {
const companyId = randomUUID();
const userId = "user-1";
const otherUserId = "user-2";
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
const visibleIssueId = randomUUID();
const archivedIssueId = randomUUID();
const resurfacedIssueId = randomUUID();
await db.insert(issues).values([
{
id: visibleIssueId,
companyId,
title: "Visible issue",
status: "todo",
priority: "medium",
createdByUserId: userId,
createdAt: new Date("2026-03-26T10:00:00.000Z"),
updatedAt: new Date("2026-03-26T10:00:00.000Z"),
},
{
id: archivedIssueId,
companyId,
title: "Archived issue",
status: "todo",
priority: "medium",
createdByUserId: userId,
createdAt: new Date("2026-03-26T11:00:00.000Z"),
updatedAt: new Date("2026-03-26T11:00:00.000Z"),
},
{
id: resurfacedIssueId,
companyId,
title: "Resurfaced issue",
status: "todo",
priority: "medium",
createdByUserId: userId,
createdAt: new Date("2026-03-26T12:00:00.000Z"),
updatedAt: new Date("2026-03-26T12:00:00.000Z"),
},
]);
await svc.archiveInbox(
companyId,
archivedIssueId,
userId,
new Date("2026-03-26T12:30:00.000Z"),
);
await svc.archiveInbox(
companyId,
resurfacedIssueId,
userId,
new Date("2026-03-26T13:00:00.000Z"),
);
await db.insert(issueComments).values({
companyId,
issueId: resurfacedIssueId,
authorUserId: otherUserId,
body: "This should bring the issue back into Mine.",
createdAt: new Date("2026-03-26T13:30:00.000Z"),
updatedAt: new Date("2026-03-26T13:30:00.000Z"),
});
const archivedFiltered = await svc.list(companyId, {
touchedByUserId: userId,
inboxArchivedByUserId: userId,
});
expect(archivedFiltered.map((issue) => issue.id)).toEqual([
resurfacedIssueId,
visibleIssueId,
]);
await svc.unarchiveInbox(companyId, archivedIssueId, userId);
const afterUnarchive = await svc.list(companyId, {
touchedByUserId: userId,
inboxArchivedByUserId: userId,
});
expect(new Set(afterUnarchive.map((issue) => issue.id))).toEqual(new Set([
visibleIssueId,
archivedIssueId,
resurfacedIssueId,
]));
});
});
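The archive/resurface semantics this test asserts reduce to a simple visibility predicate. This is a hypothetical sketch of that rule, not the query the service actually runs; the `isInboxVisible` name and entry shape are assumptions:

```typescript
// Sketch (inferred from the test): an archived inbox issue stays hidden
// until activity from someone else arrives after the archive timestamp;
// unarchiving (archivedAt cleared) restores it immediately.
function isInboxVisible(entry: {
  archivedAt: Date | null;
  lastExternalActivityAt: Date | null;
}): boolean {
  if (!entry.archivedAt) return true;
  return (
    entry.lastExternalActivityAt !== null &&
    entry.lastExternalActivityAt.getTime() > entry.archivedAt.getTime()
  );
}
```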

View File

@@ -0,0 +1,41 @@
import { describe, expect, it } from "vitest";
import { normalizeAgentMentionToken } from "../services/issues.ts";
describe("normalizeAgentMentionToken", () => {
it("decodes hex numeric entities such as space (&#x20;)", () => {
expect(normalizeAgentMentionToken("Baba&#x20;")).toBe("Baba");
});
it("decodes decimal numeric entities", () => {
expect(normalizeAgentMentionToken("Baba&#32;")).toBe("Baba");
});
it("decodes common named whitespace entities", () => {
expect(normalizeAgentMentionToken("Baba&nbsp;")).toBe("Baba");
});
// Mid-token entity (the shape the review asked for): we decode &amp; to &, rather than stripping the token to "Baba", which previously broke names like M&M.
it("decodes a named entity in the middle of the token", () => {
expect(normalizeAgentMentionToken("Ba&amp;ba")).toBe("Ba&ba");
});
it("decodes &amp; so agent names with ampersands still match", () => {
expect(normalizeAgentMentionToken("M&amp;M")).toBe("M&M");
});
it("decodes additional named entities used in rich text (e.g. &copy;)", () => {
expect(normalizeAgentMentionToken("Agent&copy;Name")).toBe("Agent©Name");
});
it("leaves unknown semicolon-terminated named references unchanged", () => {
expect(normalizeAgentMentionToken("Baba&notarealentity;")).toBe("Baba&notarealentity;");
});
it("returns plain names unchanged", () => {
expect(normalizeAgentMentionToken("Baba")).toBe("Baba");
});
it("trims after decoding entities", () => {
expect(normalizeAgentMentionToken("Baba&#x20;&#x20;")).toBe("Baba");
});
});
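A normalizer satisfying these tests can be sketched as below. This is a hypothetical implementation inferred from the assertions, not the real `normalizeAgentMentionToken` in `../services/issues.ts`; the named-entity table is a minimal assumed subset:

```typescript
// Sketch (inferred from the tests): decode numeric character references
// (hex and decimal) and a small set of named ones, leave unknown named
// references untouched, then trim surrounding whitespace (incl. U+00A0).
const NAMED_ENTITIES: Record<string, string> = {
  amp: "&",
  nbsp: "\u00a0",
  copy: "\u00a9",
};

function normalizeAgentMentionToken(token: string): string {
  const decoded = token.replace(
    /&(#x([0-9a-fA-F]+)|#(\d+)|([a-zA-Z][a-zA-Z0-9]*));/g,
    (match, _body, hex, dec, name) => {
      if (hex) return String.fromCodePoint(parseInt(hex, 16));
      if (dec) return String.fromCodePoint(parseInt(dec, 10));
      if (name && name in NAMED_ENTITIES) return NAMED_ENTITIES[name];
      return match; // unknown named reference: leave as-is
    },
  );
  return decoded.trim();
}
```

Trimming after decoding matters: `&nbsp;` decodes to U+00A0, which `String.prototype.trim` also removes, so `"Baba&nbsp;"` normalizes to `"Baba"`.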

View File

@@ -1,8 +1,4 @@
import { randomUUID } from "node:crypto";
import { eq } from "drizzle-orm";
import express from "express";
import request from "supertest";
@@ -11,11 +7,9 @@ import {
activityLog,
agentWakeupRequests,
agents,
companies,
companyMemberships,
createDb,
heartbeatRunEvents,
heartbeatRuns,
instanceSettings,
@@ -26,6 +20,10 @@ import {
routines,
routineTriggers,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { errorHandler } from "../middleware/index.js";
import { accessService } from "../services/access.js";
@@ -78,82 +76,22 @@ vi.mock("../services/index.js", async () => {
};
});
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres routine route tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("routine routes end-to-end", () => {
let db!: ReturnType<typeof createDb>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-routines-e2e-");
db = createDb(tempDb.connectionString);
}, 20_000);
afterEach(async () => {
@@ -174,10 +112,7 @@ describe("routine routes end-to-end", () => {
});
afterAll(async () => {
await tempDb?.cleanup();
});
async function createApp(actor: Record<string, unknown>) {

View File

@@ -1,19 +1,13 @@
import { createHmac, randomUUID } from "node:crypto";
import { eq } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
activityLog,
agents,
companies,
companySecrets,
companySecretVersions,
createDb,
heartbeatRuns,
issues,
projects,
@@ -21,85 +15,29 @@ import {
routines,
routineTriggers,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { issueService } from "../services/issues.ts";
import { routineService } from "../services/routines.ts";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres routines service tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("routine service live-execution coalescing", () => {
let db!: ReturnType<typeof createDb>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-routines-service-");
db = createDb(tempDb.connectionString);
}, 20_000);
afterEach(async () => {
@@ -117,10 +55,7 @@ describe("routine service live-execution coalescing", () => {
});
afterAll(async () => {
await tempDb?.cleanup();
});
async function seedFixture(opts?: {

View File

@@ -2,6 +2,7 @@ import { execFile } from "node:child_process";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { promisify } from "node:util";
import { afterEach, describe, expect, it } from "vitest";
import {
@@ -13,6 +14,7 @@ import {
stopRuntimeServicesForExecutionWorkspace,
type RealizedExecutionWorkspace,
} from "../services/workspace-runtime.ts";
import { resolvePaperclipConfigPath } from "../paths.ts";
import type { WorkspaceOperation } from "@paperclipai/shared";
import type { WorkspaceOperationRecorder } from "../services/workspace-operations.ts";
@@ -124,6 +126,7 @@ afterEach(async () => {
delete process.env.PAPERCLIP_CONFIG;
delete process.env.PAPERCLIP_HOME;
delete process.env.PAPERCLIP_INSTANCE_ID;
delete process.env.PAPERCLIP_WORKTREES_DIR;
delete process.env.DATABASE_URL;
});
@@ -282,6 +285,156 @@ describe("realizeExecutionWorkspace", () => {
await expect(fs.readFile(path.join(reused.cwd, ".paperclip-provision-created"), "utf8")).resolves.toBe("false\n");
});
it("writes an isolated repo-local Paperclip config and worktree branding when provisioning", async () => {
const repoRoot = await createTempRepo();
const previousCwd = process.cwd();
const paperclipHome = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktree-home-"));
const isolatedWorktreeHome = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktrees-"));
const instanceId = "worktree-base";
const sharedConfigDir = path.join(paperclipHome, "instances", instanceId);
const sharedConfigPath = path.join(sharedConfigDir, "config.json");
const sharedEnvPath = path.join(sharedConfigDir, ".env");
process.env.PAPERCLIP_HOME = paperclipHome;
process.env.PAPERCLIP_INSTANCE_ID = instanceId;
process.env.PAPERCLIP_WORKTREES_DIR = isolatedWorktreeHome;
await fs.mkdir(sharedConfigDir, { recursive: true });
await fs.writeFile(
sharedConfigPath,
JSON.stringify(
{
$meta: {
version: 1,
updatedAt: "2026-03-26T00:00:00.000Z",
source: "doctor",
},
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(sharedConfigDir, "db"),
embeddedPostgresPort: 54329,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(sharedConfigDir, "backups"),
},
},
logging: {
mode: "file",
logDir: path.join(sharedConfigDir, "logs"),
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3100,
allowedHostnames: [],
serveUi: true,
},
auth: {
baseUrlMode: "auto",
disableSignUp: false,
},
storage: {
provider: "local_disk",
localDisk: {
baseDir: path.join(sharedConfigDir, "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted",
strictMode: false,
localEncrypted: {
keyFilePath: path.join(sharedConfigDir, "master.key"),
},
},
},
null,
2,
) + "\n",
"utf8",
);
await fs.writeFile(sharedEnvPath, 'DATABASE_URL="postgres://worktree:test@db.example.com:6543/paperclip"\n', "utf8");
await fs.mkdir(path.join(repoRoot, "scripts"), { recursive: true });
await fs.copyFile(
fileURLToPath(new URL("../../../scripts/provision-worktree.sh", import.meta.url)),
path.join(repoRoot, "scripts", "provision-worktree.sh"),
);
await runGit(repoRoot, ["add", "scripts/provision-worktree.sh"]);
await runGit(repoRoot, ["commit", "-m", "Add worktree provision script"]);
try {
const workspace = await realizeExecutionWorkspace({
base: {
baseCwd: repoRoot,
source: "project_primary",
projectId: "project-1",
workspaceId: "workspace-1",
repoUrl: null,
repoRef: "HEAD",
},
config: {
workspaceStrategy: {
type: "git_worktree",
branchTemplate: "{{issue.identifier}}-{{slug}}",
provisionCommand: "bash ./scripts/provision-worktree.sh",
},
},
issue: {
id: "issue-1",
identifier: "PAP-885",
title: "Show worktree banner",
},
agent: {
id: "agent-1",
name: "Codex Coder",
companyId: "company-1",
},
});
const configPath = path.join(workspace.cwd, ".paperclip", "config.json");
const envPath = path.join(workspace.cwd, ".paperclip", ".env");
const envContents = await fs.readFile(envPath, "utf8");
const configContents = JSON.parse(await fs.readFile(configPath, "utf8"));
const configStats = await fs.lstat(configPath);
const expectedInstanceId = "pap-885-show-worktree-banner";
const expectedInstanceRoot = path.join(
isolatedWorktreeHome,
"instances",
expectedInstanceId,
);
expect(configStats.isSymbolicLink()).toBe(false);
expect(configContents.database.embeddedPostgresDataDir).toBe(path.join(expectedInstanceRoot, "db"));
expect(configContents.database.embeddedPostgresDataDir).not.toBe(path.join(sharedConfigDir, "db"));
expect(configContents.server.port).not.toBe(3100);
expect(configContents.secrets.localEncrypted.keyFilePath).toBe(
path.join(expectedInstanceRoot, "secrets", "master.key"),
);
expect(envContents).not.toContain("DATABASE_URL=");
expect(envContents).toContain(`PAPERCLIP_HOME=${JSON.stringify(isolatedWorktreeHome)}`);
expect(envContents).toContain(`PAPERCLIP_INSTANCE_ID=${JSON.stringify(expectedInstanceId)}`);
expect(envContents).toContain(`PAPERCLIP_CONFIG=${JSON.stringify(configPath)}`);
expect(envContents).toContain("PAPERCLIP_IN_WORKTREE=true");
expect(envContents).toContain(
`PAPERCLIP_WORKTREE_NAME=${JSON.stringify("PAP-885-show-worktree-banner")}`,
);
process.chdir(workspace.cwd);
expect(resolvePaperclipConfigPath()).toBe(configPath);
} finally {
process.chdir(previousCwd);
}
});
it("records worktree setup and provision operations when a recorder is provided", async () => {
const repoRoot = await createTempRepo();
const { recorder, operations } = createWorkspaceOperationRecorderDouble();


@@ -0,0 +1,426 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import {
applyRuntimePortSelectionToConfig,
maybePersistWorktreeRuntimePorts,
maybeRepairLegacyWorktreeConfigAndEnvFiles,
} from "../worktree-config.js";
const ORIGINAL_ENV = { ...process.env };
const ORIGINAL_CWD = process.cwd();
afterEach(() => {
process.chdir(ORIGINAL_CWD);
for (const key of Object.keys(process.env)) {
if (!(key in ORIGINAL_ENV)) {
delete process.env[key];
}
}
for (const [key, value] of Object.entries(ORIGINAL_ENV)) {
process.env[key] = value;
}
});
function buildLegacyConfig(sharedRoot: string) {
return {
$meta: {
version: 1,
updatedAt: "2026-03-26T00:00:00.000Z",
source: "configure",
},
database: {
mode: "embedded-postgres" as const,
embeddedPostgresDataDir: path.join(sharedRoot, "db"),
embeddedPostgresPort: 54329,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(sharedRoot, "data", "backups"),
},
},
logging: {
mode: "file" as const,
logDir: path.join(sharedRoot, "logs"),
},
server: {
deploymentMode: "local_trusted" as const,
exposure: "private" as const,
host: "127.0.0.1",
port: 3100,
allowedHostnames: [],
serveUi: true,
},
auth: {
baseUrlMode: "explicit" as const,
publicBaseUrl: "http://127.0.0.1:3100",
disableSignUp: false,
},
storage: {
provider: "local_disk" as const,
localDisk: {
baseDir: path.join(sharedRoot, "data", "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted" as const,
strictMode: false,
localEncrypted: {
keyFilePath: path.join(sharedRoot, "secrets", "master.key"),
},
},
};
}
describe("worktree config repair", () => {
it("repairs legacy repo-local worktree config and env files into an isolated instance", async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktree-repair-"));
const worktreeRoot = path.join(tempRoot, "PAP-884-ai-commits-component");
const paperclipDir = path.join(worktreeRoot, ".paperclip");
const configPath = path.join(paperclipDir, "config.json");
const envPath = path.join(paperclipDir, ".env");
const sharedRoot = path.join(tempRoot, ".paperclip", "instances", "default");
const isolatedHome = path.join(tempRoot, ".paperclip-worktrees");
await fs.mkdir(paperclipDir, { recursive: true });
await fs.writeFile(configPath, JSON.stringify(buildLegacyConfig(sharedRoot), null, 2) + "\n", "utf8");
await fs.writeFile(
envPath,
[
"# Paperclip environment variables",
"PAPERCLIP_IN_WORKTREE=true",
"PAPERCLIP_WORKTREE_NAME=PAP-884-ai-commits-component",
"PAPERCLIP_AGENT_JWT_SECRET=shared-secret",
"",
].join("\n"),
"utf8",
);
process.chdir(worktreeRoot);
process.env.PAPERCLIP_IN_WORKTREE = "true";
process.env.PAPERCLIP_WORKTREE_NAME = "PAP-884-ai-commits-component";
process.env.PAPERCLIP_WORKTREES_DIR = isolatedHome;
delete process.env.PAPERCLIP_HOME;
delete process.env.PAPERCLIP_INSTANCE_ID;
delete process.env.PAPERCLIP_CONFIG;
delete process.env.PAPERCLIP_CONTEXT;
const result = maybeRepairLegacyWorktreeConfigAndEnvFiles();
expect(result).toEqual({
repairedConfig: true,
repairedEnv: true,
});
const repairedConfig = JSON.parse(await fs.readFile(configPath, "utf8"));
const repairedEnv = await fs.readFile(envPath, "utf8");
const instanceRoot = path.join(isolatedHome, "instances", "pap-884-ai-commits-component");
expect(repairedConfig.database.embeddedPostgresDataDir).toBe(path.join(instanceRoot, "db"));
expect(repairedConfig.database.backup.dir).toBe(path.join(instanceRoot, "data", "backups"));
expect(repairedConfig.logging.logDir).toBe(path.join(instanceRoot, "logs"));
expect(repairedConfig.storage.localDisk.baseDir).toBe(path.join(instanceRoot, "data", "storage"));
expect(repairedConfig.secrets.localEncrypted.keyFilePath).toBe(path.join(instanceRoot, "secrets", "master.key"));
expect(repairedEnv).toContain(`PAPERCLIP_HOME=${JSON.stringify(isolatedHome)}`);
expect(repairedEnv).toContain('PAPERCLIP_INSTANCE_ID="pap-884-ai-commits-component"');
expect(repairedEnv).toContain(`PAPERCLIP_CONFIG=${JSON.stringify(await fs.realpath(configPath))}`);
expect(repairedEnv).toContain(`PAPERCLIP_CONTEXT=${JSON.stringify(path.join(isolatedHome, "context.json"))}`);
expect(repairedEnv).toContain('PAPERCLIP_AGENT_JWT_SECRET="shared-secret"');
expect(process.env.PAPERCLIP_HOME).toBe(isolatedHome);
expect(process.env.PAPERCLIP_INSTANCE_ID).toBe("pap-884-ai-commits-component");
});
it("avoids sibling worktree ports when repairing legacy configs", async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktree-repair-ports-"));
const worktreeRoot = path.join(tempRoot, "PAP-880-thumbs-capture-for-evals-feature");
const paperclipDir = path.join(worktreeRoot, ".paperclip");
const configPath = path.join(paperclipDir, "config.json");
const envPath = path.join(paperclipDir, ".env");
const sharedRoot = path.join(tempRoot, ".paperclip", "instances", "default");
const isolatedHome = path.join(tempRoot, ".paperclip-worktrees");
const siblingInstanceRoot = path.join(isolatedHome, "instances", "pap-878-create-a-mine-tab-in-inbox");
await fs.mkdir(paperclipDir, { recursive: true });
await fs.mkdir(siblingInstanceRoot, { recursive: true });
await fs.writeFile(configPath, JSON.stringify(buildLegacyConfig(sharedRoot), null, 2) + "\n", "utf8");
await fs.writeFile(
envPath,
[
"# Paperclip environment variables",
"PAPERCLIP_IN_WORKTREE=true",
"PAPERCLIP_WORKTREE_NAME=PAP-880-thumbs-capture-for-evals-feature",
"",
].join("\n"),
"utf8",
);
await fs.writeFile(
path.join(siblingInstanceRoot, "config.json"),
JSON.stringify(
{
...buildLegacyConfig(siblingInstanceRoot),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(siblingInstanceRoot, "db"),
embeddedPostgresPort: 54330,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(siblingInstanceRoot, "data", "backups"),
},
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3101,
allowedHostnames: [],
serveUi: true,
},
},
null,
2,
) + "\n",
"utf8",
);
process.chdir(worktreeRoot);
process.env.PAPERCLIP_IN_WORKTREE = "true";
process.env.PAPERCLIP_WORKTREE_NAME = "PAP-880-thumbs-capture-for-evals-feature";
process.env.PAPERCLIP_WORKTREES_DIR = isolatedHome;
const result = maybeRepairLegacyWorktreeConfigAndEnvFiles();
const repairedConfig = JSON.parse(await fs.readFile(configPath, "utf8"));
expect(result.repairedConfig).toBe(true);
expect(repairedConfig.server.port).toBe(3102);
expect(repairedConfig.database.embeddedPostgresPort).toBe(54331);
});
it("rebalances duplicate ports for already isolated worktree configs", async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktree-rebalance-"));
const isolatedHome = path.join(tempRoot, ".paperclip-worktrees");
const repoWorktreesRoot = path.join(tempRoot, "repo", ".paperclip", "worktrees");
const siblingWorktreeRoot = path.join(repoWorktreesRoot, "PAP-878-create-a-mine-tab-in-inbox");
const siblingInstanceRoot = path.join(isolatedHome, "instances", "pap-878-create-a-mine-tab-in-inbox");
const currentWorktreeRoot = path.join(repoWorktreesRoot, "PAP-884-ai-commits-component");
const paperclipDir = path.join(currentWorktreeRoot, ".paperclip");
const configPath = path.join(paperclipDir, "config.json");
const envPath = path.join(paperclipDir, ".env");
const currentInstanceRoot = path.join(isolatedHome, "instances", "pap-884-ai-commits-component");
const siblingConfigPath = path.join(siblingWorktreeRoot, ".paperclip", "config.json");
await fs.mkdir(paperclipDir, { recursive: true });
await fs.mkdir(path.dirname(siblingConfigPath), { recursive: true });
await fs.writeFile(
configPath,
JSON.stringify(
{
...buildLegacyConfig(currentInstanceRoot),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(currentInstanceRoot, "db"),
embeddedPostgresPort: 54330,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(currentInstanceRoot, "data", "backups"),
},
},
logging: {
mode: "file",
logDir: path.join(currentInstanceRoot, "logs"),
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3101,
allowedHostnames: [],
serveUi: true,
},
storage: {
provider: "local_disk",
localDisk: {
baseDir: path.join(currentInstanceRoot, "data", "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted",
strictMode: false,
localEncrypted: {
keyFilePath: path.join(currentInstanceRoot, "secrets", "master.key"),
},
},
},
null,
2,
) + "\n",
"utf8",
);
await fs.writeFile(
envPath,
[
"# Paperclip environment variables",
"PAPERCLIP_IN_WORKTREE=true",
"PAPERCLIP_WORKTREE_NAME=PAP-884-ai-commits-component",
"",
].join("\n"),
"utf8",
);
await fs.writeFile(
siblingConfigPath,
JSON.stringify(
{
...buildLegacyConfig(siblingInstanceRoot),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(siblingInstanceRoot, "db"),
embeddedPostgresPort: 54330,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(siblingInstanceRoot, "data", "backups"),
},
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3101,
allowedHostnames: [],
serveUi: true,
},
},
null,
2,
) + "\n",
"utf8",
);
process.chdir(currentWorktreeRoot);
process.env.PAPERCLIP_IN_WORKTREE = "true";
process.env.PAPERCLIP_WORKTREE_NAME = "PAP-884-ai-commits-component";
process.env.PAPERCLIP_WORKTREES_DIR = isolatedHome;
const result = maybeRepairLegacyWorktreeConfigAndEnvFiles();
const repairedConfig = JSON.parse(await fs.readFile(configPath, "utf8"));
expect(result.repairedConfig).toBe(true);
expect(repairedConfig.server.port).toBe(3102);
expect(repairedConfig.database.embeddedPostgresPort).toBe(54331);
});
it("persists runtime-selected worktree ports back into config", async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-worktree-ports-"));
const worktreeRoot = path.join(tempRoot, "PAP-878-create-a-mine-tab-in-inbox");
const paperclipDir = path.join(worktreeRoot, ".paperclip");
const configPath = path.join(paperclipDir, "config.json");
const isolatedHome = path.join(tempRoot, ".paperclip-worktrees");
const instanceRoot = path.join(isolatedHome, "instances", "pap-878-create-a-mine-tab-in-inbox");
await fs.mkdir(paperclipDir, { recursive: true });
await fs.writeFile(
configPath,
JSON.stringify(
{
...buildLegacyConfig(instanceRoot),
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(instanceRoot, "db"),
embeddedPostgresPort: 54331,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(instanceRoot, "data", "backups"),
},
},
logging: {
mode: "file",
logDir: path.join(instanceRoot, "logs"),
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3101,
allowedHostnames: [],
serveUi: true,
},
storage: {
provider: "local_disk",
localDisk: {
baseDir: path.join(instanceRoot, "data", "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted",
strictMode: false,
localEncrypted: {
keyFilePath: path.join(instanceRoot, "secrets", "master.key"),
},
},
},
null,
2,
) + "\n",
"utf8",
);
process.chdir(worktreeRoot);
process.env.PAPERCLIP_IN_WORKTREE = "true";
process.env.PAPERCLIP_WORKTREE_NAME = "PAP-878-create-a-mine-tab-in-inbox";
process.env.PAPERCLIP_HOME = isolatedHome;
process.env.PAPERCLIP_INSTANCE_ID = "pap-878-create-a-mine-tab-in-inbox";
process.env.PAPERCLIP_CONFIG = configPath;
maybePersistWorktreeRuntimePorts({
serverPort: 3103,
databasePort: 54335,
});
const writtenConfig = JSON.parse(await fs.readFile(configPath, "utf8"));
expect(writtenConfig.server.port).toBe(3103);
expect(writtenConfig.database.embeddedPostgresPort).toBe(54335);
expect(writtenConfig.auth.publicBaseUrl).toBe("http://127.0.0.1:3103/");
});
it("can update the in-memory config without rewriting env-driven ports", () => {
const { config, changed } = applyRuntimePortSelectionToConfig(buildLegacyConfig("/tmp/shared"), {
serverPort: 3104,
databasePort: 54340,
allowServerPortWrite: false,
allowDatabasePortWrite: true,
});
expect(changed).toBe(true);
expect(config.server.port).toBe(3100);
expect(config.database.embeddedPostgresPort).toBe(54340);
expect(config.auth.publicBaseUrl).toBe("http://127.0.0.1:3104/");
});
});
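The last test pins down the asymmetry in `applyRuntimePortSelectionToConfig`: with `allowServerPortWrite: false` the configured `server.port` is left alone, the database port is rewritten, and `auth.publicBaseUrl` still tracks the runtime server port. A minimal sketch of that behavior (field names follow `buildLegacyConfig`; this is not the real implementation):

```typescript
// Sketch of the port-selection behavior the test above asserts; hypothetical
// re-implementation, not the actual applyRuntimePortSelectionToConfig.
interface PortConfig {
  server: { port: number };
  database: { embeddedPostgresPort: number };
  auth: { publicBaseUrl: string };
}

function applyPortsSketch(
  config: PortConfig,
  opts: {
    serverPort: number;
    databasePort: number;
    allowServerPortWrite: boolean;
    allowDatabasePortWrite: boolean;
  },
): { config: PortConfig; changed: boolean } {
  let changed = false;
  // The server port is only persisted when the caller allows it (an
  // env-driven port must keep winning over the config file)...
  if (opts.allowServerPortWrite && config.server.port !== opts.serverPort) {
    config.server.port = opts.serverPort;
    changed = true;
  }
  if (
    opts.allowDatabasePortWrite &&
    config.database.embeddedPostgresPort !== opts.databasePort
  ) {
    config.database.embeddedPostgresPort = opts.databasePort;
    changed = true;
  }
  // ...but the public base URL always reflects the port actually listened on.
  const url = new URL(config.auth.publicBaseUrl);
  url.port = String(opts.serverPort);
  config.auth.publicBaseUrl = url.toString();
  return { config, changed };
}

const { config: updated, changed } = applyPortsSketch(
  {
    server: { port: 3100 },
    database: { embeddedPostgresPort: 54329 },
    auth: { publicBaseUrl: "http://127.0.0.1:3100" },
  },
  { serverPort: 3104, databasePort: 54340, allowServerPortWrite: false, allowDatabasePortWrite: true },
);
```

Note that WHATWG `URL` serialization appends a trailing slash, which is why the test expects `"http://127.0.0.1:3104/"` rather than a bare origin.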


@@ -79,6 +79,8 @@ export async function createApp(
const app = express();
app.use(express.json({
// Company import/export payloads can inline full portable packages.
limit: "10mb",
verify: (req, _res, buf) => {
(req as unknown as { rawBody: Buffer }).rawBody = buf;
},


@@ -3,6 +3,7 @@ import { existsSync, realpathSync } from "node:fs";
import { resolve } from "node:path";
import { config as loadDotenv } from "dotenv";
import { resolvePaperclipEnvPath } from "./paths.js";
import { maybeRepairLegacyWorktreeConfigAndEnvFiles } from "./worktree-config.js";
import {
AUTH_BASE_URL_MODES,
DEPLOYMENT_EXPOSURES,
@@ -36,6 +37,8 @@ if (!isSameFile && existsSync(CWD_ENV_PATH)) {
loadDotenv({ path: CWD_ENV_PATH, override: false, quiet: true });
}
maybeRepairLegacyWorktreeConfigAndEnvFiles();
type DatabaseMode = "embedded-postgres" | "postgres";
export interface Config {


@@ -0,0 +1,36 @@
import fs from "node:fs";
import path from "node:path";
function toGlobstarPath(candidate: string): string {
return `${candidate.replaceAll(path.sep, "/")}/**`;
}
function addIgnorePath(target: Set<string>, candidate: string): void {
target.add(candidate);
target.add(toGlobstarPath(candidate));
try {
const realPath = fs.realpathSync(candidate);
target.add(realPath);
target.add(toGlobstarPath(realPath));
} catch {
// Ignore paths that do not exist in the current checkout.
}
}
export function resolveServerDevWatchIgnorePaths(serverRoot: string): string[] {
const ignorePaths = new Set<string>([
"**/{node_modules,bower_components,vendor}/**",
"**/.vite-temp/**",
]);
for (const relativePath of [
"../ui/node_modules",
"../ui/node_modules/.vite-temp",
"../ui/.vite",
"../ui/dist",
]) {
addIgnorePath(ignorePaths, path.resolve(serverRoot, relativePath));
}
return [...ignorePaths];
}


@@ -10,9 +10,11 @@ import { and, eq } from "drizzle-orm";
import {
createDb,
ensurePostgresDatabase,
formatEmbeddedPostgresError,
getPostgresDataDirectory,
inspectMigrations,
applyPendingMigrations,
createEmbeddedPostgresLogBuffer,
reconcilePendingMigrationHistory,
formatDatabaseBackupResult,
runDatabaseBackup,
@@ -30,6 +32,7 @@ import { heartbeatService, reconcilePersistedRuntimeServicesOnStartup, routineSe
import { createStorageServiceFromConfig } from "./storage/index.js";
import { printStartupBanner } from "./startup-banner.js";
import { getBoardClaimWarningUrl, initializeBoardClaimChallenge } from "./board-claim.js";
import { maybePersistWorktreeRuntimePorts } from "./worktree-config.js";
type BetterAuthSessionUser = {
id: string;
@@ -69,7 +72,7 @@ export interface StartedServer {
}
export async function startServer(): Promise<StartedServer> {
const config = loadConfig();
let config = loadConfig();
if (process.env.PAPERCLIP_SECRETS_PROVIDER === undefined) {
process.env.PAPERCLIP_SECRETS_PROVIDER = config.secretsProvider;
}
@@ -94,8 +97,8 @@ export async function startServer(): Promise<StartedServer> {
}
async function promptApplyMigrations(migrations: string[]): Promise<boolean> {
if (process.env.PAPERCLIP_MIGRATION_PROMPT === "never") return false;
if (process.env.PAPERCLIP_MIGRATION_AUTO_APPLY === "true") return true;
if (process.env.PAPERCLIP_MIGRATION_PROMPT === "never") return false;
if (!stdin.isTTY || !stdout.isTTY) return true;
const prompt = createInterface({ input: stdin, output: stdout });
@@ -167,6 +170,18 @@ export async function startServer(): Promise<StartedServer> {
const normalized = host.trim().toLowerCase();
return normalized === "127.0.0.1" || normalized === "localhost" || normalized === "::1";
}
function rewriteLocalUrlPort(rawUrl: string | undefined, port: number): string | undefined {
if (!rawUrl) return undefined;
try {
const parsed = new URL(rawUrl);
if (!isLoopbackHost(parsed.hostname)) return rawUrl;
parsed.port = String(port);
return parsed.toString();
} catch {
return rawUrl;
}
}
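`rewriteLocalUrlPort` only rewrites loopback hosts, so an explicitly configured public base URL survives a runtime port change. A self-contained copy of the two helpers for illustration (the `isLoopbackHost` signature is reconstructed from the hunk, which starts mid-function):

```typescript
// Copied from the hunk above, made self-contained for illustration.
function isLoopbackHost(host: string): boolean {
  const normalized = host.trim().toLowerCase();
  return normalized === "127.0.0.1" || normalized === "localhost" || normalized === "::1";
}

function rewriteLocalUrlPort(rawUrl: string | undefined, port: number): string | undefined {
  if (!rawUrl) return undefined;
  try {
    const parsed = new URL(rawUrl);
    if (!isLoopbackHost(parsed.hostname)) return rawUrl; // never touch public hosts
    parsed.port = String(port);
    return parsed.toString();
  } catch {
    return rawUrl; // unparseable URLs pass through unchanged
  }
}
```

So `http://127.0.0.1:3100` becomes `http://127.0.0.1:3103/` when the server lands on port 3103, while `https://paperclip.example.com:3100/app` is returned untouched.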
const LOCAL_BOARD_USER_ID = "local-board";
const LOCAL_BOARD_USER_EMAIL = "local@paperclip.local";
@@ -233,6 +248,7 @@ export async function startServer(): Promise<StartedServer> {
let embeddedPostgresStartedByThisProcess = false;
let migrationSummary: MigrationSummary = "skipped";
let activeDatabaseConnectionString: string;
let resolvedEmbeddedPostgresPort: number | null = null;
let startupDbInfo:
| { mode: "external-postgres"; connectionString: string }
| { mode: "embedded-postgres"; dataDir: string; port: number };
@@ -258,29 +274,31 @@ export async function startServer(): Promise<StartedServer> {
const dataDir = resolve(config.embeddedPostgresDataDir);
const configuredPort = config.embeddedPostgresPort;
let port = configuredPort;
const embeddedPostgresLogBuffer: string[] = [];
const EMBEDDED_POSTGRES_LOG_BUFFER_LIMIT = 120;
const logBuffer = createEmbeddedPostgresLogBuffer(120);
const verboseEmbeddedPostgresLogs = process.env.PAPERCLIP_EMBEDDED_POSTGRES_VERBOSE === "true";
const appendEmbeddedPostgresLog = (message: unknown) => {
const text = typeof message === "string" ? message : message instanceof Error ? message.message : String(message ?? "");
for (const lineRaw of text.split(/\r?\n/)) {
logBuffer.append(message);
if (!verboseEmbeddedPostgresLogs) {
return;
}
const lines = typeof message === "string"
? message.split(/\r?\n/)
: message instanceof Error
? [message.message]
: [String(message ?? "")];
for (const lineRaw of lines) {
const line = lineRaw.trim();
if (!line) continue;
embeddedPostgresLogBuffer.push(line);
if (embeddedPostgresLogBuffer.length > EMBEDDED_POSTGRES_LOG_BUFFER_LIMIT) {
embeddedPostgresLogBuffer.splice(0, embeddedPostgresLogBuffer.length - EMBEDDED_POSTGRES_LOG_BUFFER_LIMIT);
}
if (verboseEmbeddedPostgresLogs) {
logger.info({ embeddedPostgresLog: line }, "embedded-postgres");
}
logger.info({ embeddedPostgresLog: line }, "embedded-postgres");
}
};
const logEmbeddedPostgresFailure = (phase: "initialise" | "start", err: unknown) => {
if (embeddedPostgresLogBuffer.length > 0) {
const recentLogs = logBuffer.getRecentLogs();
if (recentLogs.length > 0) {
logger.error(
{
phase,
recentLogs: embeddedPostgresLogBuffer,
recentLogs,
err,
},
"Embedded PostgreSQL failed; showing buffered startup logs",
@@ -347,7 +365,7 @@ export async function startServer(): Promise<StartedServer> {
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C"],
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: appendEmbeddedPostgresLog,
onError: appendEmbeddedPostgresLog,
});
@@ -357,7 +375,10 @@ export async function startServer(): Promise<StartedServer> {
await embeddedPostgres.initialise();
} catch (err) {
logEmbeddedPostgresFailure("initialise", err);
throw err;
throw formatEmbeddedPostgresError(err, {
fallbackMessage: `Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
} else {
logger.info(`Embedded PostgreSQL cluster already exists (${clusterVersionFile}); skipping init`);
@@ -371,7 +392,10 @@ export async function startServer(): Promise<StartedServer> {
await embeddedPostgres.start();
} catch (err) {
logEmbeddedPostgresFailure("start", err);
throw err;
throw formatEmbeddedPostgresError(err, {
fallbackMessage: `Failed to start embedded PostgreSQL on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
embeddedPostgresStartedByThisProcess = true;
}
@@ -395,6 +419,7 @@ export async function startServer(): Promise<StartedServer> {
db = createDb(embeddedConnectionString);
logger.info("Embedded PostgreSQL ready");
activeDatabaseConnectionString = embeddedConnectionString;
resolvedEmbeddedPostgresPort = port;
startupDbInfo = { mode: "embedded-postgres", dataDir, port };
}
@@ -476,6 +501,19 @@ export async function startServer(): Promise<StartedServer> {
}
const listenPort = await detectPort(config.port);
if (listenPort !== config.port) {
config.port = listenPort;
}
if (resolvedEmbeddedPostgresPort !== null && resolvedEmbeddedPostgresPort !== config.embeddedPostgresPort) {
config.embeddedPostgresPort = resolvedEmbeddedPostgresPort;
}
if (config.authBaseUrlMode === "explicit" && config.authPublicBaseUrl) {
config.authPublicBaseUrl = rewriteLocalUrlPort(config.authPublicBaseUrl, listenPort);
}
maybePersistWorktreeRuntimePorts({
serverPort: listenPort,
databasePort: resolvedEmbeddedPostgresPort,
});
const uiMode = config.uiDevMiddleware ? "vite-dev" : config.serveUi ? "static" : "none";
const storageService = createStorageServiceFromConfig(config);
const app = await createApp(db as any, {


@@ -1,9 +1,39 @@
You are the CEO.
You are the CEO. Your job is to lead the company, not to do individual contributor work. You own strategy, prioritization, and cross-functional coordination.
Your home directory is $AGENT_HOME. Everything personal to you -- life, memory, knowledge -- lives there. Other agents may have their own folders and you may update them when necessary.
Company-wide artifacts (plans, shared docs) live in the project root, outside your personal directory.
## Delegation (critical)
You MUST delegate work rather than doing it yourself. When a task is assigned to you:
1. **Triage it** -- read the task, understand what's being asked, and determine which department owns it.
2. **Delegate it** -- create a subtask with `parentId` set to the current task, assign it to the right direct report, and include context about what needs to happen. Use these routing rules:
- **Code, bugs, features, infra, devtools, technical tasks** → CTO
- **Marketing, content, social media, growth, devrel** → CMO
- **UX, design, user research, design-system** → UXDesigner
- **Cross-functional or unclear** → break into separate subtasks for each department, or assign to the CTO if it's primarily technical with a design component
- If the right report doesn't exist yet, use the `paperclip-create-agent` skill to hire one before delegating.
3. **Do NOT write code, implement features, or fix bugs yourself.** Your reports exist for this. Even if a task seems small or quick, delegate it.
4. **Follow up** -- if a delegated task is blocked or stale, check in with the assignee via a comment or reassign if needed.
## What you DO personally
- Set priorities and make product decisions
- Resolve cross-team conflicts or ambiguity
- Communicate with the board (human users)
- Approve or reject proposals from your reports
- Hire new agents when the team needs capacity
- Unblock your direct reports when they escalate to you
## Keeping work moving
- Don't let tasks sit idle. If you delegate something, check that it's progressing.
- If a report is blocked, help unblock them -- escalate to the board if needed.
- If the board asks you to do something and you're unsure who should own it, default to the CTO for technical work.
- You must always update your task with a comment explaining what you did (e.g., who you delegated to and why).
## Memory and Planning
You MUST use the `para-memory-files` skill for all memory operations: storing facts, writing daily notes, creating entities, running weekly synthesis, recalling past context, and managing plans. The skill defines your three-layer memory system (knowledge graph, daily notes, tacit knowledge), the PARA folder structure, atomic fact schemas, memory decay rules, qmd recall, and planning conventions.
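The routing rules in the prompt above are effectively a keyword-to-department table. A hypothetical TypeScript sketch of that table (the department names come from the prompt; the keyword matching is invented for illustration, since the real CEO agent reasons over the full task text rather than pattern-matching):

```typescript
type Department = "CTO" | "CMO" | "UXDesigner";

// Hypothetical router mirroring the prompt's routing rules.
const ROUTES: Array<[RegExp, Department]> = [
  [/\b(code|bug|feature|infra|devtools?|technical)\b/i, "CTO"],
  [/\b(marketing|content|social|growth|devrel)\b/i, "CMO"],
  [/\b(ux|design|research)\b/i, "UXDesigner"],
];

function routeTask(title: string): Department {
  for (const [pattern, department] of ROUTES) {
    if (pattern.test(title)) return department;
  }
  return "CTO"; // per the prompt: "default to the CTO for technical work"
}
```

Cross-functional tasks would instead be split into one subtask per department, as the prompt's fourth routing rule describes.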


@@ -1411,6 +1411,25 @@ function grantsFromDefaults(
return result;
}
export function agentJoinGrantsFromDefaults(
defaultsPayload: Record<string, unknown> | null | undefined
): Array<{
permissionKey: (typeof PERMISSION_KEYS)[number];
scope: Record<string, unknown> | null;
}> {
const grants = grantsFromDefaults(defaultsPayload, "agent");
if (grants.some((grant) => grant.permissionKey === "tasks:assign")) {
return grants;
}
return [
...grants,
{
permissionKey: "tasks:assign",
scope: null
}
];
}
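The merge step above guarantees every joining agent ends up with `tasks:assign` exactly once: an explicit grant from the invite defaults (possibly scoped) is kept as-is, and an unscoped grant is appended only when none exists. A simplified sketch of just that step:

```typescript
// Simplified sketch of agentJoinGrantsFromDefaults' merge step; the real
// function first derives grants from the invite's defaults payload.
interface Grant {
  permissionKey: string;
  scope: Record<string, unknown> | null;
}

function withTasksAssign(grants: Grant[]): Grant[] {
  // An explicit tasks:assign default wins (it may carry a narrower scope),
  // so never append a duplicate.
  if (grants.some((grant) => grant.permissionKey === "tasks:assign")) {
    return grants;
  }
  return [...grants, { permissionKey: "tasks:assign", scope: null }];
}
```

This is what lets the join route below drop its separate `setPrincipalPermission` call: the grant list itself now always carries `tasks:assign`.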
type JoinRequestManagerCandidate = {
id: string;
role: string;
@@ -2618,17 +2637,8 @@ export function accessRoutes(
"member",
"active"
);
await access.setPrincipalPermission(
companyId,
"agent",
created.id,
"tasks:assign",
true,
req.actor.userId ?? null
);
const grants = grantsFromDefaults(
invite.defaultsPayload as Record<string, unknown> | null,
"agent"
const grants = agentJoinGrantsFromDefaults(
invite.defaultsPayload as Record<string, unknown> | null
);
await access.setPrincipalGrants(
companyId,


@@ -171,6 +171,33 @@ export function issueRoutes(db: Db, storage: StorageService) {
return rawId;
}
async function resolveIssueProjectAndGoal(issue: {
companyId: string;
projectId: string | null;
goalId: string | null;
}) {
const projectPromise = issue.projectId ? projectsSvc.getById(issue.projectId) : Promise.resolve(null);
const directGoalPromise = issue.goalId ? goalsSvc.getById(issue.goalId) : Promise.resolve(null);
const [project, directGoal] = await Promise.all([projectPromise, directGoalPromise]);
if (directGoal) {
return { project, goal: directGoal };
}
const projectGoalId = project?.goalId ?? project?.goalIds[0] ?? null;
if (projectGoalId) {
const projectGoal = await goalsSvc.getById(projectGoalId);
return { project, goal: projectGoal };
}
if (!issue.projectId) {
const defaultGoal = await goalsSvc.getDefaultCompanyGoal(issue.companyId);
return { project, goal: defaultGoal };
}
return { project, goal: null };
}
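The precedence encoded above is: a goal set directly on the issue wins, then the owning project's goal (legacy `goalId` first, then `goalIds[0]`), and only issues outside any project fall back to the default company goal. A synchronous sketch of that precedence, with the service lookups replaced by plain records for illustration:

```typescript
// Synchronous sketch of resolveIssueProjectAndGoal's goal precedence;
// the records stand in for projectsSvc/goalsSvc lookups.
interface Project {
  goalId: string | null;
  goalIds: string[];
}

function resolveGoalId(
  issue: { projectId: string | null; goalId: string | null },
  projects: Record<string, Project>,
  defaultCompanyGoalId: string,
): string | null {
  // 1. A goal set directly on the issue always wins.
  if (issue.goalId) return issue.goalId;
  // 2. Otherwise inherit the project's goal (legacy goalId, then goalIds[0]).
  const project = issue.projectId ? projects[issue.projectId] : undefined;
  const projectGoalId = project?.goalId ?? project?.goalIds[0] ?? null;
  if (projectGoalId) return projectGoalId;
  // 3. Only issues outside any project fall back to the company default.
  return issue.projectId ? null : defaultCompanyGoalId;
}
```

This matches the two route rewrites below, which previously inlined the same ternary chain in both `Promise.all` blocks.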
// Resolve issue identifiers (e.g. "PAP-39") to UUIDs for all /issues/:id routes
router.param("id", async (req, res, next, rawId) => {
try {
@@ -203,6 +230,7 @@ export function issueRoutes(db: Db, storage: StorageService) {
assertCompanyAccess(req, companyId);
const assigneeUserFilterRaw = req.query.assigneeUserId as string | undefined;
const touchedByUserFilterRaw = req.query.touchedByUserId as string | undefined;
const inboxArchivedByUserFilterRaw = req.query.inboxArchivedByUserId as string | undefined;
const unreadForUserFilterRaw = req.query.unreadForUserId as string | undefined;
const assigneeUserId =
assigneeUserFilterRaw === "me" && req.actor.type === "board"
@@ -212,6 +240,10 @@ export function issueRoutes(db: Db, storage: StorageService) {
touchedByUserFilterRaw === "me" && req.actor.type === "board"
? req.actor.userId
: touchedByUserFilterRaw;
const inboxArchivedByUserId =
inboxArchivedByUserFilterRaw === "me" && req.actor.type === "board"
? req.actor.userId
: inboxArchivedByUserFilterRaw;
const unreadForUserId =
unreadForUserFilterRaw === "me" && req.actor.type === "board"
? req.actor.userId
@@ -225,6 +257,10 @@ export function issueRoutes(db: Db, storage: StorageService) {
res.status(403).json({ error: "touchedByUserId=me requires board authentication" });
return;
}
if (inboxArchivedByUserFilterRaw === "me" && (!inboxArchivedByUserId || req.actor.type !== "board")) {
res.status(403).json({ error: "inboxArchivedByUserId=me requires board authentication" });
return;
}
if (unreadForUserFilterRaw === "me" && (!unreadForUserId || req.actor.type !== "board")) {
res.status(403).json({ error: "unreadForUserId=me requires board authentication" });
return;
@@ -236,6 +272,7 @@ export function issueRoutes(db: Db, storage: StorageService) {
participantAgentId: req.query.participantAgentId as string | undefined,
assigneeUserId,
touchedByUserId,
inboxArchivedByUserId,
unreadForUserId,
projectId: req.query.projectId as string | undefined,
parentId: req.query.parentId as string | undefined,
@@ -311,14 +348,9 @@ export function issueRoutes(db: Db, storage: StorageService) {
return;
}
assertCompanyAccess(req, issue.companyId);
const [ancestors, project, goal, mentionedProjectIds, documentPayload] = await Promise.all([
const [{ project, goal }, ancestors, mentionedProjectIds, documentPayload] = await Promise.all([
resolveIssueProjectAndGoal(issue),
svc.getAncestors(issue.id),
issue.projectId ? projectsSvc.getById(issue.projectId) : null,
issue.goalId
? goalsSvc.getById(issue.goalId)
: !issue.projectId
? goalsSvc.getDefaultCompanyGoal(issue.companyId)
: null,
svc.findMentionedProjectIds(issue.id),
documentsSvc.getIssueDocumentPayload(issue),
]);
@@ -356,14 +388,9 @@ export function issueRoutes(db: Db, storage: StorageService) {
? req.query.wakeCommentId.trim()
: null;
const [ancestors, project, goal, commentCursor, wakeComment] = await Promise.all([
const [{ project, goal }, ancestors, commentCursor, wakeComment] = await Promise.all([
resolveIssueProjectAndGoal(issue),
svc.getAncestors(issue.id),
issue.projectId ? projectsSvc.getById(issue.projectId) : null,
issue.goalId
? goalsSvc.getById(issue.goalId)
: !issue.projectId
? goalsSvc.getDefaultCompanyGoal(issue.companyId)
: null,
svc.getCommentCursor(issue.id),
wakeCommentId ? svc.getComment(wakeCommentId) : null,
]);
@@ -686,6 +713,70 @@ export function issueRoutes(db: Db, storage: StorageService) {
res.json(readState);
});
router.post("/issues/:id/inbox-archive", async (req, res) => {
const id = req.params.id as string;
const issue = await svc.getById(id);
if (!issue) {
res.status(404).json({ error: "Issue not found" });
return;
}
assertCompanyAccess(req, issue.companyId);
if (req.actor.type !== "board") {
res.status(403).json({ error: "Board authentication required" });
return;
}
if (!req.actor.userId) {
res.status(403).json({ error: "Board user context required" });
return;
}
const archiveState = await svc.archiveInbox(issue.companyId, issue.id, req.actor.userId, new Date());
const actor = getActorInfo(req);
await logActivity(db, {
companyId: issue.companyId,
actorType: actor.actorType,
actorId: actor.actorId,
agentId: actor.agentId,
runId: actor.runId,
action: "issue.inbox_archived",
entityType: "issue",
entityId: issue.id,
details: { userId: req.actor.userId, archivedAt: archiveState.archivedAt },
});
res.json(archiveState);
});
router.delete("/issues/:id/inbox-archive", async (req, res) => {
const id = req.params.id as string;
const issue = await svc.getById(id);
if (!issue) {
res.status(404).json({ error: "Issue not found" });
return;
}
assertCompanyAccess(req, issue.companyId);
if (req.actor.type !== "board") {
res.status(403).json({ error: "Board authentication required" });
return;
}
if (!req.actor.userId) {
res.status(403).json({ error: "Board user context required" });
return;
}
const removed = await svc.unarchiveInbox(issue.companyId, issue.id, req.actor.userId);
const actor = getActorInfo(req);
await logActivity(db, {
companyId: issue.companyId,
actorType: actor.actorType,
actorId: actor.actorId,
agentId: actor.agentId,
runId: actor.runId,
action: "issue.inbox_unarchived",
entityType: "issue",
entityId: issue.id,
details: { userId: req.actor.userId },
});
res.json(removed ?? { ok: true });
});
router.get("/issues/:id/approvals", async (req, res) => {
const id = req.params.id as string;
const issue = await svc.getById(id);

Some files were not shown because too many files have changed in this diff.