Compare commits

...

93 Commits

Author SHA1 Message Date
Devin Foley
347f38019f docs: clarify all PR template sections are required, not just thinking path
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 23:47:08 -07:00
Devin Foley
25615407a4 docs: update CONTRIBUTING.md to require PR template, Greptile 5/5, and passing tests
Add explicit PR Requirements section referencing .github/PULL_REQUEST_TEMPLATE.md.
Clarify that all PRs must use the template, achieve a 5/5 Greptile score with
all comments addressed, and have passing tests/CI before merge.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 23:46:58 -07:00
Octasoft Ltd
f843a45a84 fix: use sh instead of /bin/sh as shell fallback on Windows (#891)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - Agents run shell commands during workspace provisioning (git worktree creation, runtime services)
> - When `process.env.SHELL` is unset, the code falls back to `/bin/sh`
> - But on Windows with Git Bash, `/bin/sh` doesn't exist as an absolute path — Git Bash provides `sh` on PATH instead
> - This causes `child_process.spawn` to throw `ENOENT`, crashing workspace provisioning on Windows
> - This PR extracts a `resolveShell()` helper that uses `$SHELL` when set, falls back to `sh` (bare) on Windows or `/bin/sh` on Unix
> - The benefit is that agents running on Windows via Git Bash can provision workspaces without shell resolution errors

## Summary
- `workspace-runtime.ts` falls back to `/bin/sh` when
`process.env.SHELL` is unset
- On Windows, `/bin/sh` doesn't exist → `spawn /bin/sh ENOENT`
- Fix: extract `resolveShell()` helper that uses `$SHELL` when set,
falls back to `sh` on Windows (Git Bash PATH lookup) or `/bin/sh` on
Unix

Three call sites updated to use the new helper.

Fixes #892

## Root cause

When Paperclip spawns shell commands in workspace operations (e.g., git
worktree creation), it uses `process.env.SHELL` if set, otherwise
defaults to `/bin/sh`. On Windows with Git Bash, `$SHELL` is typically
unset and `/bin/sh` is not a valid path — Git Bash provides `sh` on PATH
but not at the absolute `/bin/sh` location. This causes
`child_process.spawn` to throw `ENOENT`.

## Approach

Rather than hard-coding a Windows-specific absolute path (e.g.,
`C:\Program Files\Git\bin\sh.exe`), we use the bare `"sh"` command which
relies on PATH resolution. This works because:
1. Git Bash adds its `usr/bin` directory to PATH, making `sh` resolvable
2. On Unix/macOS, `/bin/sh` remains the correct default (it's the POSIX
standard location)
3. `process.env.SHELL` takes priority when set, so this only affects the
fallback
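The helper described above might look roughly like this — a sketch based only on this PR's description, not the actual `workspace-runtime.ts` source; the parameter shapes are assumptions made so the logic is self-contained:

```typescript
// Sketch of the resolveShell() fallback logic described in this PR.
// $SHELL wins when set and non-empty; otherwise the fallback depends on
// the platform: bare "sh" on Windows (resolved via PATH by Git Bash),
// the POSIX-standard /bin/sh everywhere else.
export function resolveShell(
  env: Record<string, string | undefined>,
  platform: string
): string {
  const shell = env.SHELL?.trim();
  if (shell) return shell;
  // On Windows there is no absolute /bin/sh; rely on PATH lookup instead.
  return platform === "win32" ? "sh" : "/bin/sh";
}
```

Call sites would pass `process.env` and `process.platform`, so the explicit `$SHELL` still takes priority and only the fallback changes.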

## Test plan

- [x] 7 unit tests for `resolveShell()`: SHELL set, trimmed, empty,
whitespace-only, linux/darwin/win32 fallbacks
- [x] Run a workspace provision command on Windows with `git_worktree`
strategy
- [x] Verify Unix/macOS is unaffected

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Paperclip <noreply@paperclip.ing>
Co-authored-by: Devin Foley <devin@devinfoley.com>
2026-04-02 17:34:26 -07:00
Dotta
36049beeea Merge pull request #2552 from paperclipai/PAPA-42-add-model-used-to-pr-template-and-checklist
feat: add Model Used section to PR template and checklist
2026-04-02 13:47:46 -05:00
Devin Foley
c041fee6fc feat: add Model Used section to PR template and checklist
Add a required "Model Used" section to the PR template so contributors
document which AI model (with version, context window, reasoning mode,
and other capability details) was used for each change. Also adds a
corresponding checklist item.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:32:22 -07:00
Dotta
82290451d4 Merge pull request #2541 from paperclipai/pap-1078-qol-fixes
fix(ui): polish issue detail timelines and attachments
2026-04-02 13:31:12 -05:00
dotta
fb3b57ab1f merge master into pap-1078-qol-fixes
Resolve the keyboard shortcut conflicts after [#2539](https://github.com/paperclipai/paperclip/pull/2539) and [#2540](https://github.com/paperclipai/paperclip/pull/2540), keep the release package rewrite working with cliVersion, and stabilize the provisioning timeout in the full suite.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 13:14:20 -05:00
Dotta
ca8d35fd99 Merge pull request #2540 from paperclipai/pap-1078-inbox-operator-polish
feat(inbox): add operator search and keyboard controls
2026-04-02 13:02:33 -05:00
Dotta
81a7f79dfd Merge pull request #2539 from paperclipai/pap-1078-workspaces-routines
feat(routines): add workspace-aware routine runs
2026-04-02 13:01:19 -05:00
dotta
ad1ef6a8c6 fix(ui): address final Greptile follow-up
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 12:21:35 -05:00
dotta
833842b391 fix(inbox): address Greptile review findings
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 12:16:34 -05:00
dotta
fd6cfc7149 fix(routines): address Greptile review findings
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 12:09:02 -05:00
dotta
50e9f69010 fix(ui): surface skipped wakeup messages in agent detail
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:52:43 -05:00
dotta
38a0cd275e test(ui): cover routine run variables dialog
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:52:13 -05:00
dotta
bd6d07d0b4 fix(ui): polish issue detail timelines and attachments
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:51:40 -05:00
dotta
3ab7d52f00 feat(inbox): add operator search and keyboard controls
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:45:15 -05:00
dotta
909e8cd4c8 feat(routines): add workspace-aware routine runs
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 11:38:57 -05:00
Dotta
36376968af Merge pull request #2527 from paperclipai/PAP-806-telemetry-implementation-in-paperclip-plan
Add app, server, and plugin telemetry plumbing
2026-04-02 11:10:16 -05:00
dotta
29d0e82dce fix: make feedback migration replay-safe after rebase
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:54:56 -05:00
dotta
1c1040e219 test: make cli telemetry test deterministic in CI
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:30 -05:00
dotta
0ec8257563 fix: include shared telemetry sources in cli typecheck
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:30 -05:00
dotta
38833304d4 fix: restore cli telemetry config handling in worktrees
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:30 -05:00
dotta
85e6371cb6 fix: use agent role for first heartbeat telemetry
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:30 -05:00
dotta
daea94a2ed test: align task-completed telemetry assertion with agent role
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:30 -05:00
dotta
c18b3cb414 fix: use agent role instead of adapter type in task_completed telemetry
The agent.task_completed event was sending adapterType (e.g. "claude_local")
as the agent_role dimension instead of the actual role (e.g. "engineer").

Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-02 10:47:30 -05:00
dotta
af844b778e Add plugin telemetry bridge capability
Expose telemetry.track through the plugin SDK and server host bridge, forward plugin-prefixed events into the shared telemetry client, and demonstrate the capability in the kitchen sink example.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:29 -05:00
dotta
53dbcd185e fix: align telemetry client payload and dimensions with backend schema
Restructure the TelemetryClient to send the correct backend envelope
format ({app, schemaVersion, installId, events: [{name, occurredAt, dimensions}]})
instead of the old per-event format. Update all event dimension names
to match the backend registry (agent_role, adapter_type, error_code, etc.).

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:29 -05:00
dotta
f16de6026d fix: add periodic flush and graceful shutdown for server-side telemetry
The TelemetryClient only flushed at 50 events, so the server silently
lost all queued telemetry on restart. Add startPeriodicFlush/stop methods
to TelemetryClient, wire up 60s periodic flush in server initTelemetry,
and flush on SIGTERM/SIGINT before exit.
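The shape of that change might look roughly like the following sketch. The real TelemetryClient's fields, transport, and batch threshold are assumptions taken from the commit messages in this range:

```typescript
// Minimal sketch of size-based flush plus the new periodic flush and
// graceful shutdown described above.
class TelemetryClientSketch {
  private queue: object[] = [];
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(private send: (events: object[]) => void) {}

  track(event: object): void {
    this.queue.push(event);
    if (this.queue.length >= 50) this.flush(); // pre-existing size-based flush
  }

  flush(): void {
    if (this.queue.length === 0) return;
    this.send(this.queue.splice(0)); // drain the queue into one batch
  }

  // New: flush on an interval so a restart can't drop a partial batch.
  startPeriodicFlush(intervalMs = 60_000): void {
    if (this.timer) return;
    this.timer = setInterval(() => this.flush(), intervalMs);
  }

  // New: called from SIGTERM/SIGINT handlers for a final drain before exit.
  stop(): void {
    if (this.timer) clearInterval(this.timer);
    this.timer = null;
    this.flush();
  }
}
```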

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:29 -05:00
dotta
34044cdfce feat: implement app-side telemetry sender
Add the shared telemetry sender, wire the CLI/server emit points,
and cover the config and completion behavior with tests.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:47:29 -05:00
Dotta
ca5659f734 Merge pull request #2529 from paperclipai/PAP-880-thumbs-capture-for-evals-feature-pr
Add feedback voting and thumbs capture flow
2026-04-02 10:44:50 -05:00
dotta
d12e3e3d1a Fix feedback review findings
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 10:03:07 -05:00
dotta
c0d0d03bce Add feedback voting and thumbs capture flow
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 09:11:49 -05:00
Dotta
3db6bdfc3c Merge pull request #2414 from aronprins/skill/routines
feat(skills): add paperclip-routines skill
2026-04-02 06:37:44 -05:00
dotta
6524dbe08f fix(skills): move routines docs into paperclip references
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-02 06:28:04 -05:00
Dotta
2c1883fc77 Merge pull request #2449 from statxc/feat/github-enterprise-url-support
feat: GitHub enterprise url support
2026-04-02 06:07:44 -05:00
Aron Prins
4abd53c089 fix(skills): tighten api-reference table descriptions to match existing style
Co-Authored-By: Paperclip <noreply@paperclip.ing>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-02 11:00:53 +02:00
Aron Prins
3c99ab8d01 chore: improve api documentation and implement routines properly. 2026-04-02 10:52:52 +02:00
Devin Foley
9d6d159209 chore: add package files to CODEOWNERS for dependency review (#2476)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The GitHub repository uses CODEOWNERS to enforce review requirements on critical files
> - Currently only release scripts and CI config are protected — package manifests are not
> - Dependency changes (package.json, lockfile) can introduce supply-chain risk if merged without review
> - This PR adds all package files to CODEOWNERS
> - The benefit is that any dependency change now requires explicit approval from maintainers

## What Changed

- Added root package manifest files (`package.json`, `pnpm-lock.yaml`, `pnpm-workspace.yaml`, `.npmrc`) to CODEOWNERS
- Added all 19 workspace `package.json` files (`cli/`, `server/`, `ui/`, `packages/*`) to CODEOWNERS
- All entries owned by `@cryppadotta` and `@devinfoley`, consistent with existing release infrastructure ownership

## Verification

- `gh api repos/paperclipai/paperclip/contents/.github/CODEOWNERS?ref=PAPA-41-add-package-files-to-codeowners` to inspect the file
- Open a test PR touching any `package.json` and confirm GitHub requests review from the listed owners

## Risks

- Low risk. CODEOWNERS only adds review requirements — does not block merges unless branch protection enforces it. New packages added in the future will need a corresponding CODEOWNERS entry.

## Checklist

- [x] I have included a thinking path that traces from project context to this change
- [x] I have run tests locally and they pass
- [ ] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before requesting merge

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-01 20:32:39 -07:00
Devin Foley
26069682ee fix: copy button fallback for non-secure contexts (#2472)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The UI serves agent management pages including an instructions editor with copy-to-clipboard buttons
> - The Clipboard API (`navigator.clipboard.writeText`) requires a secure context (HTTPS or localhost)
> - Users accessing the UI over HTTP on a LAN IP get "Copy failed" when clicking the copy icon
> - This pull request adds an `execCommand("copy")` fallback in `CopyText` for non-secure contexts
> - The benefit is that copy buttons work reliably regardless of whether the page is served over HTTPS or plain HTTP

## What Changed

- `ui/src/components/CopyText.tsx`: Added `window.isSecureContext` check before using `navigator.clipboard`. When unavailable, falls back to creating a temporary `<textarea>`, selecting its content, and using `document.execCommand("copy")`. The return value is checked and the DOM element is cleaned up via `try/finally`.
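The fallback half of that flow can be sketched as below. The document object is injected so the logic is visible outside a browser; the function and parameter names are illustrative, not the actual `CopyText.tsx` code:

```typescript
// Structural subset of the DOM this fallback touches.
interface MinimalDoc {
  createElement(tag: string): { value: string; select(): void };
  body: { appendChild(el: unknown): void; removeChild(el: unknown): void };
  execCommand(cmd: string): boolean;
}

// execCommand("copy") path used when the Clipboard API is unavailable
// (non-secure contexts). The return value signals whether the copy
// actually happened, and the temporary node is removed via try/finally.
function copyViaExecCommand(text: string, doc: MinimalDoc): boolean {
  const textarea = doc.createElement("textarea");
  textarea.value = text;
  doc.body.appendChild(textarea);
  try {
    textarea.select();
    return doc.execCommand("copy");
  } finally {
    doc.body.removeChild(textarea); // clean up even if execCommand throws
  }
}

// The component would try the modern API first, roughly:
//   if (window.isSecureContext && navigator.clipboard) {
//     await navigator.clipboard.writeText(text);
//   } else {
//     copyViaExecCommand(text, document);
//   }
```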

## Verification

- Access the UI over HTTP on a non-localhost IP (e.g. `http://[local-ip]:3100`)
- Navigate to any agent's instructions page → Advanced → click the copy icon next to Root path
- Should show "Copied!" tooltip and the path should be on the clipboard

## Risks

- Low risk. `execCommand("copy")` is deprecated in the spec but universally supported by all major browsers. The fallback only activates in non-secure contexts where the modern API is unavailable. If/when HTTPS is enabled, the modern `navigator.clipboard` path is used automatically.

## Checklist

- [x] I have included a thinking path that traces from project context to this change
- [ ] I have run tests locally and they pass
- [ ] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after screenshots
- [ ] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before requesting merge
2026-04-01 20:16:52 -07:00
Devin Foley
1e24e6e84c fix: auto-detect default branch for worktree creation when baseRef not configured (#2463)
* fix: auto-detect default branch for worktree creation when baseRef not configured

When creating git worktrees, if no explicit baseRef is configured in
the project workspace strategy and no repoRef is set, the system now
auto-detects the repository's default branch instead of blindly
falling back to "HEAD".

Detection strategy:
1. Check refs/remotes/origin/HEAD (set by git clone / remote set-head)
2. Fall back to probing refs/remotes/origin/main, then origin/master
3. Final fallback: HEAD (preserves existing behavior)

This prevents failures like "fatal: invalid reference: main" when a
project's workspace strategy has no baseRef and the repo uses a
non-standard default branch name.
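The three-step order above can be sketched as a pure function over probe results. The real code shells out to git (e.g. `git symbolic-ref refs/remotes/origin/HEAD`); the helper names and shapes here are illustrative only:

```typescript
// Sketch of the default-branch detection order described in this commit.
function pickDefaultBranch(opts: {
  // Target of refs/remotes/origin/HEAD, if the symbolic ref is set.
  originHeadTarget?: string;
  // Probe for the existence of a remote-tracking ref.
  remoteRefExists: (ref: string) => boolean;
}): string {
  // 1. refs/remotes/origin/HEAD is authoritative when present.
  if (opts.originHeadTarget) {
    return opts.originHeadTarget.replace(/^refs\/remotes\/origin\//, "");
  }
  // 2. Probe the common default branch names.
  for (const candidate of ["main", "master"]) {
    if (opts.remoteRefExists(`refs/remotes/origin/${candidate}`)) {
      return candidate;
    }
  }
  // 3. Final fallback preserves the existing behavior.
  return "HEAD";
}
```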

Co-Authored-By: Paperclip <noreply@paperclip.ing>

* fix: address Greptile review - fix misleading comment and add symbolic-ref test

- Corrected comment to clarify that the existing test exercises the
  heuristic fallback path (not symbolic-ref)
- Added new test case that explicitly sets refs/remotes/origin/HEAD
  via `git remote set-head` to exercise the symbolic-ref code path

Co-Authored-By: Paperclip <noreply@paperclip.ing>

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-01 18:00:49 -07:00
statxc
9d89d74d70 refactor: rename URL validators to looksLikeRepoUrl 2026-04-01 23:21:22 +00:00
statxc
056a5ee32a fix(ui): render agent capabilities field in org chart cards (#2349)
* fix(ui): render agent capabilities field in org chart cards

Closes #2209

* Update ui/src/pages/OrgChart.tsx

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>

---------

Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-04-01 15:46:44 -07:00
Devin Foley
dedd972e3d Fix inbox ordering: self-touched issues no longer sink to bottom (#2144)
issueLastActivityTimestamp() returned 0 for issues where the user was
the last to touch them (myLastTouchAt >= updatedAt) and no external
comment existed. This pushed those items to the bottom of the inbox
list regardless of how recently they were updated.

Now falls back to updatedAt instead, so recently updated items sort
to the top of the Recent tab as expected.
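The corrected rule can be sketched as follows; the field names mirror this description, but the real `issueLastActivityTimestamp()` has more inputs:

```typescript
interface IssueActivity {
  updatedAt: number;            // epoch millis
  myLastTouchAt?: number;       // when the current user last touched the issue
  lastExternalCommentAt?: number;
}

// Sketch of the fixed ordering key: external activity since my last touch
// counts as fresh; otherwise fall back to updatedAt instead of returning 0
// (which sank self-touched issues to the bottom of the inbox).
function issueLastActivityTimestamp(issue: IssueActivity): number {
  if (
    issue.lastExternalCommentAt !== undefined &&
    (issue.myLastTouchAt === undefined ||
      issue.lastExternalCommentAt > issue.myLastTouchAt)
  ) {
    return issue.lastExternalCommentAt;
  }
  return issue.updatedAt;
}
```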

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-01 14:52:53 -07:00
statxc
6a7830b07e fix: add HTTPS protocol check to server-side GitHub URL parsers 2026-04-01 21:27:10 +00:00
statxc
f9cebe9b73 fix: harden GHE URL detection and extract shared GitHub helpers 2026-04-01 21:05:48 +00:00
statxc
9e1ee925cd feat: support GitHub Enterprise URLs for skill and company imports 2026-04-01 20:53:41 +00:00
Dotta
6c2c63e0f1 Merge pull request #2328 from bittoby/fix/project-slug-collision
Fix: project slug collisions for non-English names (#2318)
2026-04-01 09:34:23 -05:00
Dotta
461779a960 Merge pull request #2430 from bittoby/fix/add-gemini-local-to-adapter-types
fix: add gemini_local to AGENT_ADAPTER_TYPES validation enum
2026-04-01 09:18:39 -05:00
bittoby
6aa3ead238 fix: add gemini_local to AGENT_ADAPTER_TYPES validation enum 2026-04-01 14:07:47 +00:00
Dotta
e0f64c04e7 Merge pull request #2407 from radiusred/chore/docker-improvements
chore(docker): improve base image and organize docker files
2026-04-01 08:14:55 -05:00
Aron Prins
e5b2e8b29b fix(skills): address greptile review on paperclip-routines skill
- Add missing `description` field to the Creating a Routine field table
- Document optional `label` field available on all trigger kinds
2026-04-01 13:56:10 +02:00
Aron Prins
62d8b39474 feat(skills): add paperclip-routines skill
Adds a new skill that documents how to create and manage Paperclip
routines — recurring tasks that fire on a schedule, webhook, or API
call and dispatch an execution issue to the assigned agent.
2026-04-01 13:49:11 +02:00
Cody (Radius Red)
420cd4fd8d chore(docker): improve base image and organize docker files
- Add wget, ripgrep, python3, and GitHub CLI (gh) to base image
- Add OPENCODE_ALLOW_ALL_MODELS=true to production ENV
- Move compose files, onboard-smoke Dockerfile to docker/
- Move entrypoint script to scripts/docker-entrypoint.sh
- Add Podman Quadlet unit files (pod, app, db containers)
- Add docker/README.md with build, compose, and quadlet docs
- Add scripts/docker-build-test.sh for local build validation
- Update all doc references for new file locations
- Keep main Dockerfile at project root (no .dockerignore changes needed)

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-01 11:36:27 +00:00
Dotta
5b479652f2 Merge pull request #2327 from radiusred/fix/env-var-plain-to-secret-data-loss
fix(ui): preserve env var when switching type from Plain to Secret
2026-03-31 11:37:07 -05:00
bittoby
99296f95db fix: append short UUID suffix to project slugs when non-ASCII characters are stripped to prevent slug collisions 2026-03-31 16:35:30 +00:00
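The idea in the commit above can be sketched roughly like this — the normalization steps and suffix length are assumptions, not the actual implementation:

```typescript
import { randomUUID } from "node:crypto";

// When slugifying strips non-ASCII characters (e.g. a fully Cyrillic or CJK
// name), two different names can collapse to the same (possibly empty) slug.
// Appending a short random suffix in that case keeps slugs unique.
function projectSlug(name: string): string {
  const normalized = name.normalize("NFKD");
  const ascii = normalized.replace(/[^\x20-\x7e]/g, "");
  const base = ascii
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
  const lostCharacters = ascii.length !== normalized.length;
  if (lostCharacters || base.length === 0) {
    const suffix = randomUUID().slice(0, 8); // short UUID suffix
    return base ? `${base}-${suffix}` : suffix;
  }
  return base;
}
```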
Cody (Radius Red)
92e03ac4e3 fix(ui): prevent dropdown snap-back when switching env var to Secret
Address Greptile review feedback: the plain-value fallback in emit()
caused the useEffect sync to re-run toRows(), which mapped the plain
binding back to source: "plain", snapping the dropdown back.

Fix: add an emittingRef that distinguishes local emit() calls from
external value changes (like overlay reset after save). When the
change originated from our own emit, skip the re-sync so the
transitioning row stays in "secret" mode while the user picks a secret.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-31 15:52:46 +00:00
Cody (Radius Red)
ce8d9eb323 fix(server): preserve adapter-agnostic keys when changing adapter type
When the adapter type changes via PATCH, the server only preserved
instruction bundle keys (instructionsBundleMode, etc.) from the
existing config. Adapter-agnostic keys like env, cwd, timeoutSec,
graceSec, promptTemplate, and bootstrapPromptTemplate were silently
dropped if the PATCH payload didn't explicitly include them.

This caused env var data loss when adapter type was changed via the
UI or API without sending the full existing adapterConfig.

The fix preserves these adapter-agnostic keys from the existing config
before applying the instruction bundle preservation, matching the
UI's behavior in AgentConfigForm.handleSave.
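The preservation rule reads roughly like the sketch below. The key list comes from this commit message; the merge shape itself is an assumption:

```typescript
// Adapter-agnostic config keys that survive an adapter-type change.
const ADAPTER_AGNOSTIC_KEYS = [
  "env", "cwd", "timeoutSec", "graceSec",
  "promptTemplate", "bootstrapPromptTemplate",
] as const;

// Carry adapter-agnostic keys forward from the existing config unless the
// PATCH payload explicitly sets them, so they are no longer silently dropped.
function mergeAdapterConfig(
  existing: Record<string, unknown>,
  patch: Record<string, unknown>
): Record<string, unknown> {
  const merged: Record<string, unknown> = { ...patch };
  for (const key of ADAPTER_AGNOSTIC_KEYS) {
    if (!(key in patch) && key in existing) {
      merged[key] = existing[key];
    }
  }
  return merged;
}
```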

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-31 15:42:03 +00:00
Cody (Radius Red)
06cf00129f fix(ui): preserve env var when switching type from Plain to Secret
When changing an env var's type from Plain to Secret in the agent
config form, the row was silently dropped because emit() skipped
secret rows without a secretId. This caused data loss — the variable
disappeared from both the UI and the saved config.

Fix: keep the row as a plain binding during the transition state
until the user selects an actual secret. This preserves the key and
value so nothing is lost.

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-31 15:09:54 +00:00
Dotta
ebc6888e7d Merge pull request #1923 from radiusred/fix/docker-volumes
fix(docker): remap container UID/GID at runtime to avoid volume mount permission errors
2026-03-31 08:46:27 -05:00
Dotta
9f1bb350fe Merge pull request #2065 from edimuj/fix/heartbeat-session-reuse
fix: preserve session continuity for timer/heartbeat wakes
2026-03-31 08:29:45 -05:00
Dotta
46ce546174 Merge pull request #2317 from paperclipai/PAP-881-document-revisions-bulid-it
Add issue document revision restore flow
2026-03-31 08:25:07 -05:00
dotta
90889c12d8 fix(db): make document revision migration replay-safe 2026-03-31 08:09:00 -05:00
dotta
761dce559d test(worktree): avoid assuming a specific free port 2026-03-31 07:44:19 -05:00
dotta
41f261eaf5 Merge public-gh/master into PAP-881-document-revisions-bulid-it 2026-03-31 07:31:17 -05:00
Dotta
8427043431 Merge pull request #112 from kevmok/add-gpt-5-4-xhigh-effort
Add gpt-5.4 fallback and xhigh effort options
2026-03-31 06:19:38 -05:00
Dotta
19aaa54ae4 Merge branch 'master' into add-gpt-5-4-xhigh-effort 2026-03-31 06:19:26 -05:00
Cody (Radius Red)
d134d5f3a1 fix: support host UID/GID mapping for volume mounts
- Add USER_UID/USER_GID build args to Dockerfile
- Install gosu and remap node user/group at build time
- Set node home directory to /paperclip so agent credentials resolve correctly
- Add docker-entrypoint.sh for runtime UID/GID remapping via gosu

Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 23:48:21 +00:00
Dotta
98337f5b03 Merge pull request #2203 from paperclipai/pap-1007-workspace-followups
fix: preserve workspace continuity across follow-up issues
2026-03-30 15:24:47 -05:00
dotta
477ef78fed Address Greptile feedback on workspace reuse
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:55:44 -05:00
Dotta
b0e0f8cd91 Merge pull request #2205 from paperclipai/pap-1007-publishing-docs
docs: add manual @paperclipai/ui publishing prerequisites
2026-03-30 14:48:52 -05:00
Dotta
ccb5cce4ac Merge pull request #2204 from paperclipai/pap-1007-operator-polish
fix: apply operator polish across comments, invites, routines, and health
2026-03-30 14:48:24 -05:00
Dotta
5575399af1 Merge pull request #2048 from remdev/fix/codex-rpc-client-spawn-error
fix(codex) rpc client spawn error
2026-03-30 14:24:33 -05:00
dotta
2c75c8a1ec docs: clarify npm prerequisites for first ui publish
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:15:30 -05:00
dotta
d8814e938c docs: add manual @paperclipai/ui publish steps
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:15:30 -05:00
dotta
a7cfbc98f3 Fix optimistic comment draft clearing 2026-03-30 14:14:36 -05:00
dotta
5e65bb2b92 Add company name to invite summaries
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:14:14 -05:00
dotta
d7d01e9819 test: add company settings selectors
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:14:14 -05:00
dotta
88e742a129 Fix health DB connectivity probe
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:14:14 -05:00
dotta
db4e146551 Fix routine modal scrolling
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:14:14 -05:00
dotta
9684e7bf30 Add dark mode inbox selection color
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:14:14 -05:00
dotta
a3e125f796 Clarify Claude transcript event categories
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:13:52 -05:00
dotta
2b18fc4007 Repair server workspace package links in worktrees
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:10:36 -05:00
dotta
ec1210caaa Preserve workspaces for follow-up issues
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:10:36 -05:00
dotta
3c66683169 Fix execution workspace reuse and slugify worktrees
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-30 14:10:36 -05:00
Dotta
c610192c53 Merge pull request #2074 from paperclipai/pap-979-runtime-workspaces
feat: expand execution workspace runtime controls
2026-03-30 08:35:50 -05:00
Edin Mujkanovic
70702ce74f fix: preserve session continuity for timer/heartbeat wakes
Timer wakes had no taskKey, so they couldn't use agentTaskSessions for
session resume. Adds a synthetic __heartbeat__ task key for timer wakes
so they participate in the full session system.

Includes 6 dedicated unit tests for deriveTaskKeyWithHeartbeatFallback.
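A sketch of `deriveTaskKeyWithHeartbeatFallback`, based only on this commit message — the wake parameter's shape is an assumption:

```typescript
// Synthetic task key so timer/heartbeat wakes participate in the same
// agentTaskSessions resume mechanism as real tasks.
const HEARTBEAT_TASK_KEY = "__heartbeat__";

function deriveTaskKeyWithHeartbeatFallback(wake: {
  taskKey?: string;
  reason: "task" | "timer" | "heartbeat";
}): string | undefined {
  if (wake.taskKey) return wake.taskKey;
  if (wake.reason === "timer" || wake.reason === "heartbeat") {
    return HEARTBEAT_TASK_KEY;
  }
  return undefined; // no key and not a timer wake: no session resume
}
```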

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-29 18:19:02 +02:00
Mikhail Batukhtin
dc3aa8f31f test(codex-local): isolate quota spawn test from host CODEX_HOME
After the mocked RPC spawn fails, getQuotaWindows() still calls
readCodexToken(). Use an empty mkdtemp directory for CODEX_HOME for the
duration of the test so we never read ~/.codex/auth.json or call WHAM.
2026-03-29 15:15:37 +03:00
Mikhail Batukhtin
c98af52590 test(codex-local): regression for CodexRpcClient spawn ENOENT
Add a Vitest case that mocks `node:child_process.spawn` so the child
emits `error` (ENOENT) after the constructor attaches listeners.
`getQuotaWindows()` must resolve with `ok: false` instead of leaving an
unhandled `error` event on the process.

Register `packages/adapters/codex-local` in the root Vitest workspace.

Document in DEVELOPING.md that a missing `codex` binary should not take
down the API server during quota polling.
2026-03-29 14:43:51 +03:00
Mikhail Batukhtin
01fb97e8da fix(codex-local): handle spawn error event in CodexRpcClient
When the `codex` binary is absent from PATH, Node.js emits an `error`
event on the ChildProcess. Because `CodexRpcClient` only subscribed to
`exit` and `data` events, the `error` event was unhandled — causing
Node to throw it as an uncaught exception and crash the server.

Add an `error` handler in the constructor that rejects all pending RPC
requests and clears the queue. This makes a missing `codex` binary a
recoverable condition: `fetchCodexRpcQuota()` rejects, `getQuotaWindows()`
catches the error and returns `{ ok: false }`, and the server stays up.

The fix mirrors the existing pattern in `runChildProcess`
(packages/adapter-utils/src/server-utils.ts) which already handles
`ENOENT` the same way for the main task execution path.
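The pattern can be sketched as below; the class internals are assumptions based on this commit message, not the actual `CodexRpcClient` source:

```typescript
import { spawn } from "node:child_process";

// A missing binary makes Node emit an "error" event on the ChildProcess.
// Without a listener, that event becomes an uncaught exception and crashes
// the server. Attaching one lets us reject pending RPC requests instead.
class RpcClientSketch {
  private pending: Array<{ reject: (err: Error) => void }> = [];

  constructor(command: string) {
    const child = spawn(command, ["--version"]);
    child.on("error", (err) => {
      // e.g. ENOENT when the binary is absent from PATH: fail all pending
      // requests and clear the queue so callers see a normal rejection.
      for (const req of this.pending.splice(0)) req.reject(err);
    });
  }

  request(): Promise<never> {
    return new Promise((_resolve, reject) => {
      this.pending.push({ reject });
    });
  }
}
```

A caller like `fetchCodexRpcQuota()` then sees an ordinary rejected promise and can return `{ ok: false }` rather than crashing the process.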
2026-03-29 14:20:55 +03:00
dotta
b0b9809732 Add issue document revision restore flow
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-03-26 08:24:57 -05:00
Kevin Mok
432d7e72fa Merge upstream/master into add-gpt-5-4-xhigh-effort 2026-03-08 12:10:59 -05:00
Kevin Mok
666ab53648 Remove redundant opencode model assertion 2026-03-05 19:55:15 -06:00
Kevin Mok
314288ff82 Add gpt-5.4 fallback and xhigh effort options 2026-03-05 18:59:42 -06:00
229 changed files with 54320 additions and 1089 deletions

.github/CODEOWNERS vendored

@@ -8,3 +8,10 @@ scripts/rollback-latest.sh @cryppadotta @devinfoley
doc/RELEASING.md @cryppadotta @devinfoley
doc/PUBLISHING.md @cryppadotta @devinfoley
doc/RELEASE-AUTOMATION-SETUP.md @cryppadotta @devinfoley
# Package files — dependency changes require review
# package.json matches recursively at all depths (covers root + all workspaces)
package.json @cryppadotta @devinfoley
pnpm-lock.yaml @cryppadotta @devinfoley
pnpm-workspace.yaml @cryppadotta @devinfoley
.npmrc @cryppadotta @devinfoley

.github/PULL_REQUEST_TEMPLATE.md

@@ -38,9 +38,25 @@
-
## Model Used
<!--
Required. Specify which AI model was used to produce or assist with
this change. Be as descriptive as possible — include:
• Provider and model name (e.g., Claude, GPT, Gemini, Codex)
• Exact model ID or version (e.g., claude-opus-4-6, gpt-4-turbo-2024-04-09)
• Context window size if relevant (e.g., 1M context)
• Reasoning/thinking mode if applicable (e.g., extended thinking, chain-of-thought)
• Any other relevant capability details (e.g., tool use, code execution)
If no AI model was used, write "None — human-authored".
-->
-
## Checklist

- [ ] I have included a thinking path that traces from project context to this change
- [ ] I have specified the model used (with version and capability details)
- [ ] I have run tests locally and they pass
- [ ] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after screenshots

.gitignore vendored

@@ -31,6 +31,7 @@ server/src/**/*.js.map
server/src/**/*.d.ts
server/src/**/*.d.ts.map
tmp/
feedback-export-*
# Editor / tool temp files
*.tmp

CONTRIBUTING.md

@@ -11,8 +11,9 @@ We really appreciate both small fixes and thoughtful larger changes.
- Pick **one** clear thing to fix/improve
- Touch the **smallest possible number of files**
- Make sure the change is very targeted and easy to review
- All tests pass and CI is green
- Greptile score is 5/5 with all comments addressed
- Use the [PR template](.github/PULL_REQUEST_TEMPLATE.md)

These almost always get merged quickly when they're clean.
@@ -26,11 +27,26 @@ These almost always get merged quickly when they're clean.
- Before / After screenshots (or short video if UI/behavior change)
- Clear description of what & why
- Proof it works (manual testing notes)
- All tests passing and CI green
- Greptile score 5/5 with all comments addressed
- [PR template](.github/PULL_REQUEST_TEMPLATE.md) fully filled out

PRs that follow this path are **much** more likely to be accepted, even when they're large.
## PR Requirements (all PRs)
### Use the PR Template
Every pull request **must** follow the PR template at [`.github/PULL_REQUEST_TEMPLATE.md`](.github/PULL_REQUEST_TEMPLATE.md). If you create a PR via the GitHub API or other tooling that bypasses the template, copy its contents into your PR description manually. The template includes required sections: Thinking Path, What Changed, Verification, Risks, and a Checklist.
### Tests Must Pass
All tests must pass before a PR can be merged. Run them locally first and verify CI is green after pushing.
### Greptile Review
We use [Greptile](https://greptile.com) for automated code review. Your PR must achieve a **5/5 Greptile score** with **all Greptile comments addressed** before it can be merged. If Greptile leaves comments, fix or respond to each one and request a re-review.
## General Rules (both paths)
- Write clear commit messages
@@ -41,7 +57,7 @@ PRs that follow this path are **much** more likely to be accepted, even when the
## Writing a Good PR message
-Please include a "thinking path" at the top of your PR message that explains from the top of the project down to what you fixed. E.g.:
+Your PR description must follow the [PR template](.github/PULL_REQUEST_TEMPLATE.md). All sections are required. The "thinking path" at the top explains from the top of the project down to what you fixed. E.g.:
### Thinking Path Example 1:
View File
@@ -1,8 +1,23 @@
FROM node:lts-trixie-slim AS base
ARG USER_UID=1000
ARG USER_GID=1000
RUN apt-get update \
-&& apt-get install -y --no-install-recommends ca-certificates curl git \
-&& rm -rf /var/lib/apt/lists/*
-RUN corepack enable
+&& apt-get install -y --no-install-recommends ca-certificates gosu curl git wget ripgrep python3 \
+&& mkdir -p -m 755 /etc/apt/keyrings \
+&& wget -nv -O/etc/apt/keyrings/githubcli-archive-keyring.gpg https://cli.github.com/packages/githubcli-archive-keyring.gpg \
+&& echo "20e0125d6f6e077a9ad46f03371bc26d90b04939fb95170f5a1905099cc6bcc0 /etc/apt/keyrings/githubcli-archive-keyring.gpg" | sha256sum -c - \
+&& chmod go+r /etc/apt/keyrings/githubcli-archive-keyring.gpg \
+&& mkdir -p -m 755 /etc/apt/sources.list.d \
+&& echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" > /etc/apt/sources.list.d/github-cli.list \
+&& apt-get update \
+&& apt-get install -y --no-install-recommends gh \
+&& rm -rf /var/lib/apt/lists/* \
+&& corepack enable
# Modify the existing node user/group to have the specified UID/GID to match host user
RUN usermod -u $USER_UID --non-unique node \
&& groupmod -g $USER_GID --non-unique node \
&& usermod -g $USER_GID -d /paperclip node
FROM base AS deps
WORKDIR /app
@@ -35,12 +50,17 @@ RUN pnpm --filter @paperclipai/server build
RUN test -f server/dist/index.js || (echo "ERROR: server build output missing" && exit 1)
FROM base AS production
ARG USER_UID=1000
ARG USER_GID=1000
WORKDIR /app
COPY --chown=node:node --from=build /app /app
RUN npm install --global --omit=dev @anthropic-ai/claude-code@latest @openai/codex@latest opencode-ai \
&& mkdir -p /paperclip \
&& chown node:node /paperclip
COPY scripts/docker-entrypoint.sh /usr/local/bin/
RUN chmod +x /usr/local/bin/docker-entrypoint.sh
ENV NODE_ENV=production \
HOME=/paperclip \
HOST=0.0.0.0 \
@@ -48,12 +68,15 @@ ENV NODE_ENV=production \
SERVE_UI=true \
PAPERCLIP_HOME=/paperclip \
PAPERCLIP_INSTANCE_ID=default \
USER_UID=${USER_UID} \
USER_GID=${USER_GID} \
PAPERCLIP_CONFIG=/paperclip/instances/default/config.json \
PAPERCLIP_DEPLOYMENT_MODE=authenticated \
-PAPERCLIP_DEPLOYMENT_EXPOSURE=private
+PAPERCLIP_DEPLOYMENT_EXPOSURE=private \
+OPENCODE_ALLOW_ALL_MODELS=true
VOLUME ["/paperclip"]
EXPOSE 3100
-USER node
+ENTRYPOINT ["docker-entrypoint.sh"]
CMD ["node", "--import", "./server/node_modules/tsx/dist/loader.mjs", "server/dist/index.js"]
View File
@@ -257,6 +257,19 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
Find Plugins and more at [awesome-paperclip](https://github.com/gsxdsm/awesome-paperclip)
## Telemetry
Paperclip collects anonymous usage telemetry to help us understand how the product is used and improve it. No personal information, issue content, prompts, file paths, or secrets are ever collected. Private repository references are hashed with a per-install salt before being sent.
Telemetry is **enabled by default** and can be disabled with any of the following:
| Method | How |
|---|---|
| Environment variable | `PAPERCLIP_TELEMETRY_DISABLED=1` |
| Standard convention | `DO_NOT_TRACK=1` |
| CI environments | Automatically disabled when `CI=true` |
| Config file | Set `telemetry.enabled: false` in your Paperclip config |
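For example, opting out via environment variables before launching Paperclip is a one-liner either way (a sketch using the two variables from the table above; any one method is sufficient):

```shell
# Paperclip-specific switch
export PAPERCLIP_TELEMETRY_DISABLED=1

# Or the cross-tool "Do Not Track" convention
export DO_NOT_TRACK=1
```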
## Contributing
We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for details.
View File
@@ -44,6 +44,9 @@ function writeBaseConfig(configPath: string) {
baseUrlMode: "auto",
disableSignUp: false,
},
telemetry: {
enabled: true,
},
storage: {
provider: "local_disk",
localDisk: { baseDir: "/tmp/paperclip-storage" },
View File
@@ -15,6 +15,10 @@ function makeCompany(overrides: Partial<Company>): Company {
budgetMonthlyCents: 0,
spentMonthlyCents: 0,
requireBoardApprovalForNewAgents: false,
feedbackDataSharingEnabled: false,
feedbackDataSharingConsentAt: null,
feedbackDataSharingConsentByUserId: null,
feedbackDataSharingTermsVersion: null,
brandColor: null,
logoAssetId: null,
logoUrl: null,
View File
@@ -1,7 +1,7 @@
import { describe, expect, it } from "vitest";
import {
isGithubShorthand,
-isGithubUrl,
+looksLikeRepoUrl,
isHttpUrl,
normalizeGithubImportSource,
} from "../commands/client/company.js";
@@ -21,17 +21,17 @@ describe("isHttpUrl", () => {
});
});
-describe("isGithubUrl", () => {
+describe("looksLikeRepoUrl", () => {
it("matches GitHub URLs", () => {
-expect(isGithubUrl("https://github.com/org/repo")).toBe(true);
+expect(looksLikeRepoUrl("https://github.com/org/repo")).toBe(true);
});
-it("rejects non-GitHub HTTP URLs", () => {
+it("rejects URLs without owner/repo path", () => {
-expect(isGithubUrl("https://example.com/foo")).toBe(false);
+expect(looksLikeRepoUrl("https://example.com/foo")).toBe(false);
});
it("rejects local paths", () => {
-expect(isGithubUrl("/tmp/my-company")).toBe(false);
+expect(looksLikeRepoUrl("/tmp/my-company")).toBe(false);
});
});
View File
@@ -163,6 +163,10 @@ describe("renderCompanyImportPreview", () => {
brandColor: null,
logoPath: null,
requireBoardApprovalForNewAgents: false,
feedbackDataSharingEnabled: false,
feedbackDataSharingConsentAt: null,
feedbackDataSharingConsentByUserId: null,
feedbackDataSharingTermsVersion: null,
},
sidebar: {
agents: ["ceo"],
@@ -371,6 +375,10 @@ describe("import selection catalog", () => {
brandColor: null,
logoPath: "images/company-logo.png",
requireBoardApprovalForNewAgents: false,
feedbackDataSharingEnabled: false,
feedbackDataSharingConsentAt: null,
feedbackDataSharingConsentByUserId: null,
feedbackDataSharingTermsVersion: null,
},
sidebar: {
agents: ["ceo"],
View File
@@ -46,6 +46,9 @@ function createTempConfig(): string {
baseUrlMode: "auto",
disableSignUp: false,
},
telemetry: {
enabled: true,
},
storage: {
provider: "local_disk",
localDisk: {
View File
@@ -0,0 +1,177 @@
import os from "node:os";
import path from "node:path";
import { mkdtemp, readFile } from "node:fs/promises";
import { Command } from "commander";
import { describe, expect, it } from "vitest";
import type { FeedbackTrace } from "@paperclipai/shared";
import { readZipArchive } from "../commands/client/zip.js";
import {
buildFeedbackTraceQuery,
registerFeedbackCommands,
renderFeedbackReport,
summarizeFeedbackTraces,
writeFeedbackExportBundle,
} from "../commands/client/feedback.js";
function makeTrace(overrides: Partial<FeedbackTrace> = {}): FeedbackTrace {
return {
id: "trace-12345678",
companyId: "company-123",
feedbackVoteId: "vote-12345678",
issueId: "issue-123",
projectId: "project-123",
issueIdentifier: "PAP-123",
issueTitle: "Fix the feedback command",
authorUserId: "user-123",
targetType: "issue_comment",
targetId: "comment-123",
vote: "down",
status: "pending",
destination: "paperclip_labs_feedback_v1",
exportId: null,
consentVersion: "feedback-data-sharing-v1",
schemaVersion: "1",
bundleVersion: "1",
payloadVersion: "1",
payloadDigest: null,
payloadSnapshot: {
vote: {
value: "down",
reason: "Needed more detail",
},
},
targetSummary: {
label: "Comment",
excerpt: "The first answer was too vague.",
authorAgentId: "agent-123",
authorUserId: null,
createdAt: new Date("2026-03-31T12:00:00.000Z"),
documentKey: null,
documentTitle: null,
revisionNumber: null,
},
redactionSummary: null,
attemptCount: 0,
lastAttemptedAt: null,
exportedAt: null,
failureReason: null,
createdAt: new Date("2026-03-31T12:01:00.000Z"),
updatedAt: new Date("2026-03-31T12:02:00.000Z"),
...overrides,
};
}
describe("registerFeedbackCommands", () => {
it("registers the top-level feedback commands", () => {
const program = new Command();
expect(() => registerFeedbackCommands(program)).not.toThrow();
const feedback = program.commands.find((command) => command.name() === "feedback");
expect(feedback).toBeDefined();
expect(feedback?.commands.map((command) => command.name())).toEqual(["report", "export"]);
expect(feedback?.commands[0]?.options.filter((option) => option.long === "--company-id")).toHaveLength(1);
});
});
describe("buildFeedbackTraceQuery", () => {
it("encodes all supported filters", () => {
expect(
buildFeedbackTraceQuery({
targetType: "issue_comment",
vote: "down",
status: "pending",
projectId: "project-123",
issueId: "issue-123",
from: "2026-03-31T00:00:00.000Z",
to: "2026-03-31T23:59:59.999Z",
sharedOnly: true,
}),
).toBe(
"?targetType=issue_comment&vote=down&status=pending&projectId=project-123&issueId=issue-123&from=2026-03-31T00%3A00%3A00.000Z&to=2026-03-31T23%3A59%3A59.999Z&sharedOnly=true&includePayload=true",
);
});
});
describe("renderFeedbackReport", () => {
it("includes summary counts and the optional reason", () => {
const traces = [
makeTrace(),
makeTrace({
id: "trace-87654321",
feedbackVoteId: "vote-87654321",
vote: "up",
status: "local_only",
payloadSnapshot: {
vote: {
value: "up",
reason: null,
},
},
}),
];
const report = renderFeedbackReport({
apiBase: "http://127.0.0.1:3100",
companyId: "company-123",
traces,
summary: summarizeFeedbackTraces(traces),
includePayloads: false,
});
expect(report).toContain("Paperclip Feedback Report");
expect(report).toContain("thumbs up");
expect(report).toContain("thumbs down");
expect(report).toContain("Needed more detail");
});
});
describe("writeFeedbackExportBundle", () => {
it("writes votes, traces, a manifest, and a zip archive", async () => {
const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-feedback-export-"));
const outputDir = path.join(tempDir, "feedback-export");
const traces = [
makeTrace(),
makeTrace({
id: "trace-abcdef12",
feedbackVoteId: "vote-abcdef12",
issueIdentifier: "PAP-124",
issueId: "issue-124",
vote: "up",
status: "local_only",
payloadSnapshot: {
vote: {
value: "up",
reason: null,
},
},
}),
];
const exported = await writeFeedbackExportBundle({
apiBase: "http://127.0.0.1:3100",
companyId: "company-123",
traces,
outputDir,
});
expect(exported.manifest.summary.total).toBe(2);
expect(exported.manifest.summary.withReason).toBe(1);
const manifest = JSON.parse(await readFile(path.join(outputDir, "index.json"), "utf8")) as {
files: { votes: string[]; traces: string[]; zip: string };
};
expect(manifest.files.votes).toHaveLength(2);
expect(manifest.files.traces).toHaveLength(2);
const archive = await readFile(exported.zipPath);
const zip = await readZipArchive(archive);
expect(Object.keys(zip.files)).toEqual(
expect.arrayContaining([
"index.json",
`votes/${manifest.files.votes[0]}`,
`traces/${manifest.files.traces[0]}`,
]),
);
});
});
View File
@@ -44,6 +44,9 @@ function createExistingConfigFixture() {
baseUrlMode: "auto",
disableSignUp: false,
},
telemetry: {
enabled: true,
},
storage: {
provider: "local_disk",
localDisk: {
View File
@@ -0,0 +1,249 @@
import { randomUUID } from "node:crypto";
import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import { eq } from "drizzle-orm";
import {
agents,
companies,
createDb,
projects,
routines,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { disableAllRoutinesInConfig } from "../commands/routines.js";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres routines CLI tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
function writeTestConfig(configPath: string, tempRoot: string, connectionString: string) {
const config = {
$meta: {
version: 1,
updatedAt: new Date().toISOString(),
source: "doctor" as const,
},
database: {
mode: "postgres" as const,
connectionString,
embeddedPostgresDataDir: path.join(tempRoot, "embedded-db"),
embeddedPostgresPort: 54329,
backup: {
enabled: false,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(tempRoot, "backups"),
},
},
logging: {
mode: "file" as const,
logDir: path.join(tempRoot, "logs"),
},
server: {
deploymentMode: "local_trusted" as const,
exposure: "private" as const,
host: "127.0.0.1",
port: 3100,
allowedHostnames: [],
serveUi: false,
},
auth: {
baseUrlMode: "auto" as const,
disableSignUp: false,
},
storage: {
provider: "local_disk" as const,
localDisk: {
baseDir: path.join(tempRoot, "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted" as const,
strictMode: false,
localEncrypted: {
keyFilePath: path.join(tempRoot, "secrets", "master.key"),
},
},
};
mkdirSync(path.dirname(configPath), { recursive: true });
writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
}
describeEmbeddedPostgres("disableAllRoutinesInConfig", () => {
let db!: ReturnType<typeof createDb>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
let tempRoot = "";
let configPath = "";
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-routines-cli-db-");
db = createDb(tempDb.connectionString);
tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-routines-cli-config-"));
configPath = path.join(tempRoot, "config.json");
writeTestConfig(configPath, tempRoot, tempDb.connectionString);
}, 20_000);
afterEach(async () => {
await db.delete(routines);
await db.delete(projects);
await db.delete(agents);
await db.delete(companies);
});
afterAll(async () => {
await tempDb?.cleanup();
if (tempRoot) {
rmSync(tempRoot, { recursive: true, force: true });
}
});
it("pauses only non-archived routines for the selected company", async () => {
const companyId = randomUUID();
const otherCompanyId = randomUUID();
const projectId = randomUUID();
const otherProjectId = randomUUID();
const agentId = randomUUID();
const otherAgentId = randomUUID();
const activeRoutineId = randomUUID();
const pausedRoutineId = randomUUID();
const archivedRoutineId = randomUUID();
const otherCompanyRoutineId = randomUUID();
await db.insert(companies).values([
{
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
},
{
id: otherCompanyId,
name: "Other company",
issuePrefix: `T${otherCompanyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
},
]);
await db.insert(agents).values([
{
id: agentId,
companyId,
name: "Coder",
adapterType: "process",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
},
{
id: otherAgentId,
companyId: otherCompanyId,
name: "Other coder",
adapterType: "process",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
},
]);
await db.insert(projects).values([
{
id: projectId,
companyId,
name: "Project",
status: "in_progress",
},
{
id: otherProjectId,
companyId: otherCompanyId,
name: "Other project",
status: "in_progress",
},
]);
await db.insert(routines).values([
{
id: activeRoutineId,
companyId,
projectId,
assigneeAgentId: agentId,
title: "Active routine",
status: "active",
},
{
id: pausedRoutineId,
companyId,
projectId,
assigneeAgentId: agentId,
title: "Paused routine",
status: "paused",
},
{
id: archivedRoutineId,
companyId,
projectId,
assigneeAgentId: agentId,
title: "Archived routine",
status: "archived",
},
{
id: otherCompanyRoutineId,
companyId: otherCompanyId,
projectId: otherProjectId,
assigneeAgentId: otherAgentId,
title: "Other company routine",
status: "active",
},
]);
const result = await disableAllRoutinesInConfig({
config: configPath,
companyId,
});
expect(result).toMatchObject({
companyId,
totalRoutines: 3,
pausedCount: 1,
alreadyPausedCount: 1,
archivedCount: 1,
});
const companyRoutines = await db
.select({
id: routines.id,
status: routines.status,
})
.from(routines)
.where(eq(routines.companyId, companyId));
const statusById = new Map(companyRoutines.map((routine) => [routine.id, routine.status]));
expect(statusById.get(activeRoutineId)).toBe("paused");
expect(statusById.get(pausedRoutineId)).toBe("paused");
expect(statusById.get(archivedRoutineId)).toBe("archived");
const otherCompanyRoutine = await db
.select({
status: routines.status,
})
.from(routines)
.where(eq(routines.id, otherCompanyRoutineId));
expect(otherCompanyRoutine[0]?.status).toBe("active");
});
});
View File
@@ -0,0 +1,117 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
const ORIGINAL_ENV = { ...process.env };
const CI_ENV_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];
function makeConfigPath(root: string, enabled: boolean): string {
const configPath = path.join(root, ".paperclip", "config.json");
fs.mkdirSync(path.dirname(configPath), { recursive: true });
fs.writeFileSync(configPath, JSON.stringify({
$meta: {
version: 1,
updatedAt: "2026-03-31T00:00:00.000Z",
source: "configure",
},
database: {
mode: "embedded-postgres",
embeddedPostgresDataDir: path.join(root, "runtime", "db"),
embeddedPostgresPort: 54329,
backup: {
enabled: true,
intervalMinutes: 60,
retentionDays: 30,
dir: path.join(root, "runtime", "backups"),
},
},
logging: {
mode: "file",
logDir: path.join(root, "runtime", "logs"),
},
server: {
deploymentMode: "local_trusted",
exposure: "private",
host: "127.0.0.1",
port: 3100,
allowedHostnames: [],
serveUi: true,
},
auth: {
baseUrlMode: "auto",
disableSignUp: false,
},
telemetry: {
enabled,
},
storage: {
provider: "local_disk",
localDisk: {
baseDir: path.join(root, "runtime", "storage"),
},
s3: {
bucket: "paperclip",
region: "us-east-1",
prefix: "",
forcePathStyle: false,
},
},
secrets: {
provider: "local_encrypted",
strictMode: false,
localEncrypted: {
keyFilePath: path.join(root, "runtime", "secrets", "master.key"),
},
},
}, null, 2));
return configPath;
}
describe("cli telemetry", () => {
beforeEach(() => {
process.env = { ...ORIGINAL_ENV };
for (const key of CI_ENV_VARS) {
delete process.env[key];
}
vi.stubGlobal("fetch", vi.fn(async () => ({ ok: true })));
});
afterEach(() => {
process.env = { ...ORIGINAL_ENV };
vi.unstubAllGlobals();
vi.resetModules();
});
it("respects telemetry.enabled=false from the config file", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
const configPath = makeConfigPath(root, false);
process.env.PAPERCLIP_HOME = path.join(root, "home");
process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";
const { initTelemetryFromConfigFile } = await import("../telemetry.js");
const client = initTelemetryFromConfigFile(configPath);
expect(client).toBeNull();
expect(fs.existsSync(path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json"))).toBe(false);
});
it("creates telemetry state only after the first event is tracked", async () => {
const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
process.env.PAPERCLIP_HOME = path.join(root, "home");
process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";
const { initTelemetry, flushTelemetry } = await import("../telemetry.js");
const client = initTelemetry({ enabled: true });
const statePath = path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json");
expect(client).not.toBeNull();
expect(fs.existsSync(statePath)).toBe(false);
client!.track("install.started", { setupMode: "quickstart" });
expect(fs.existsSync(statePath)).toBe(true);
await flushTelemetry();
});
});
View File
@@ -75,6 +75,9 @@ function buildSourceConfig(): PaperclipConfig {
publicBaseUrl: "http://127.0.0.1:3100",
disableSignUp: false,
},
telemetry: {
enabled: true,
},
storage: {
provider: "local_disk",
localDisk: {
@@ -415,7 +418,7 @@ describe("worktree helpers", () => {
});
const config = JSON.parse(fs.readFileSync(path.join(repoRoot, ".paperclip", "config.json"), "utf8"));
-expect(config.server.port).toBe(3102);
+expect(config.server.port).toBeGreaterThan(3101);
expect(config.database.embeddedPostgresPort).not.toBe(54330);
expect(config.database.embeddedPostgresPort).not.toBe(config.server.port);
expect(config.database.embeddedPostgresPort).toBeGreaterThan(54330);
View File
@@ -5,12 +5,14 @@ import * as p from "@clack/prompts";
import pc from "picocolors";
import type {
Company,
FeedbackTrace,
CompanyPortabilityFileEntry,
CompanyPortabilityExportResult,
CompanyPortabilityInclude,
CompanyPortabilityPreviewResult,
CompanyPortabilityImportResult,
} from "@paperclipai/shared";
import { getTelemetryClient, trackCompanyImported } from "../../telemetry.js";
import { ApiRequestError } from "../../client/http.js";
import { openUrl } from "../../client/board-auth.js";
import { binaryContentTypeByExtension, readZipArchive } from "./zip.js";
@@ -22,6 +24,11 @@ import {
resolveCommandContext,
type BaseClientOptions,
} from "./common.js";
import {
buildFeedbackTraceQuery,
normalizeFeedbackTraceExportFormat,
serializeFeedbackTraces,
} from "./feedback.js";
interface CompanyCommandOptions extends BaseClientOptions {}
type CompanyDeleteSelectorMode = "auto" | "id" | "prefix";
@@ -44,6 +51,20 @@ interface CompanyExportOptions extends BaseClientOptions {
expandReferencedSkills?: boolean;
}
interface CompanyFeedbackOptions extends BaseClientOptions {
targetType?: string;
vote?: string;
status?: string;
projectId?: string;
issueId?: string;
from?: string;
to?: string;
sharedOnly?: boolean;
includePayload?: boolean;
out?: string;
format?: string;
}
interface CompanyImportOptions extends BaseClientOptions {
include?: string;
target?: CompanyImportTargetMode;
@@ -765,8 +786,15 @@ export function isHttpUrl(input: string): boolean {
return /^https?:\/\//i.test(input.trim());
}
-export function isGithubUrl(input: string): boolean {
-return /^https?:\/\/github\.com\//i.test(input.trim());
+export function looksLikeRepoUrl(input: string): boolean {
+try {
+const url = new URL(input.trim());
+if (url.protocol !== "https:") return false;
+const segments = url.pathname.split("/").filter(Boolean);
+return segments.length >= 2;
+} catch {
+return false;
+}
}
function isGithubSegment(input: string): boolean {
@@ -797,13 +825,15 @@ function normalizeGithubImportPath(input: string | null | undefined): string | n
}
function buildGithubImportUrl(input: {
hostname?: string;
owner: string;
repo: string;
ref?: string | null;
path?: string | null;
companyPath?: string | null;
}): string {
-const url = new URL(`https://github.com/${input.owner}/${input.repo.replace(/\.git$/i, "")}`);
+const host = input.hostname || "github.com";
+const url = new URL(`https://${host}/${input.owner}/${input.repo.replace(/\.git$/i, "")}`);
const ref = input.ref?.trim();
if (ref) {
url.searchParams.set("ref", ref);
@@ -834,14 +864,15 @@ export function normalizeGithubImportSource(input: string, refOverride?: string)
});
}
-if (!isGithubUrl(trimmed)) {
+if (!looksLikeRepoUrl(trimmed)) {
-throw new Error("GitHub source must be a github.com URL or owner/repo[/path] shorthand.");
+throw new Error("GitHub source must be a GitHub or GitHub Enterprise URL, or owner/repo[/path] shorthand.");
}
if (!ref) {
return trimmed;
}
const url = new URL(trimmed);
const hostname = url.hostname;
const parts = url.pathname.split("/").filter(Boolean);
if (parts.length < 2) {
throw new Error("Invalid GitHub URL.");
@@ -852,18 +883,18 @@ export function normalizeGithubImportSource(input: string, refOverride?: string)
const existingPath = normalizeGithubImportPath(url.searchParams.get("path"));
const existingCompanyPath = normalizeGithubImportPath(url.searchParams.get("companyPath"));
if (existingCompanyPath) {
-return buildGithubImportUrl({ owner, repo, ref, companyPath: existingCompanyPath });
+return buildGithubImportUrl({ hostname, owner, repo, ref, companyPath: existingCompanyPath });
}
if (existingPath) {
-return buildGithubImportUrl({ owner, repo, ref, path: existingPath });
+return buildGithubImportUrl({ hostname, owner, repo, ref, path: existingPath });
}
if (parts[2] === "tree") {
-return buildGithubImportUrl({ owner, repo, ref, path: parts.slice(4).join("/") });
+return buildGithubImportUrl({ hostname, owner, repo, ref, path: parts.slice(4).join("/") });
}
if (parts[2] === "blob") {
-return buildGithubImportUrl({ owner, repo, ref, companyPath: parts.slice(4).join("/") });
+return buildGithubImportUrl({ hostname, owner, repo, ref, companyPath: parts.slice(4).join("/") });
}
-return buildGithubImportUrl({ owner, repo, ref });
+return buildGithubImportUrl({ hostname, owner, repo, ref });
}
async function pathExists(inputPath: string): Promise<boolean> {
@@ -1093,6 +1124,91 @@ export function registerCompanyCommands(program: Command): void {
}),
);
addCommonClientOptions(
company
.command("feedback:list")
.description("List feedback traces for a company")
.requiredOption("-C, --company-id <id>", "Company ID")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--project-id <id>", "Filter by project ID")
.option("--issue-id <id>", "Filter by issue ID")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--include-payload", "Include stored payload snapshots in the response")
.action(async (opts: CompanyFeedbackOptions) => {
try {
const ctx = resolveCommandContext(opts, { requireCompany: true });
const traces = (await ctx.api.get<FeedbackTrace[]>(
`/api/companies/${ctx.companyId}/feedback-traces${buildFeedbackTraceQuery(opts)}`,
)) ?? [];
if (ctx.json) {
printOutput(traces, { json: true });
return;
}
printOutput(
traces.map((trace) => ({
id: trace.id,
issue: trace.issueIdentifier ?? trace.issueId,
vote: trace.vote,
status: trace.status,
targetType: trace.targetType,
target: trace.targetSummary.label,
})),
{ json: false },
);
} catch (err) {
handleCommandError(err);
}
}),
{ includeCompany: false },
);
addCommonClientOptions(
company
.command("feedback:export")
.description("Export feedback traces for a company")
.requiredOption("-C, --company-id <id>", "Company ID")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--project-id <id>", "Filter by project ID")
.option("--issue-id <id>", "Filter by issue ID")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--include-payload", "Include stored payload snapshots in the export")
.option("--out <path>", "Write export to a file path instead of stdout")
.option("--format <format>", "Export format: json or ndjson", "ndjson")
.action(async (opts: CompanyFeedbackOptions) => {
try {
const ctx = resolveCommandContext(opts, { requireCompany: true });
const traces = (await ctx.api.get<FeedbackTrace[]>(
`/api/companies/${ctx.companyId}/feedback-traces${buildFeedbackTraceQuery(opts, opts.includePayload ?? true)}`,
)) ?? [];
const serialized = serializeFeedbackTraces(traces, opts.format);
if (opts.out?.trim()) {
await writeFile(opts.out, serialized, "utf8");
if (ctx.json) {
printOutput(
{ out: opts.out, count: traces.length, format: normalizeFeedbackTraceExportFormat(opts.format) },
{ json: true },
);
return;
}
console.log(`Wrote ${traces.length} feedback trace(s) to ${opts.out}`);
return;
}
process.stdout.write(`${serialized}${serialized.endsWith("\n") ? "" : "\n"}`);
} catch (err) {
handleCommandError(err);
}
}),
{ includeCompany: false },
);
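When the export is streamed to stdout rather than written with `--out`, the action guarantees the payload ends with exactly one newline. A tiny standalone sketch of that guard (the helper name is illustrative, not part of the CLI source):

```typescript
// Hypothetical helper (not in the CLI source) capturing the stdout guard used
// by the feedback:export actions: append a newline only when the serialized
// payload does not already end with one.
function withTrailingNewline(serialized: string): string {
  return serialized.endsWith("\n") ? serialized : `${serialized}\n`;
}

// NDJSON joins records with "\n" and so lacks a trailing newline by default.
const terminated = withTrailingNewline('{"id":"a"}\n{"id":"b"}');
```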
addCommonClientOptions(
company
.command("export")
@@ -1208,13 +1324,13 @@ export function registerCompanyCommands(program: Command): void {
| { type: "github"; url: string };
const treatAsLocalPath = !isHttpUrl(from) && await pathExists(from);
const isGithubSource = looksLikeRepoUrl(from) || (isGithubShorthand(from) && !treatAsLocalPath);
if (isHttpUrl(from) || isGithubSource) {
if (!looksLikeRepoUrl(from) && !isGithubShorthand(from)) {
throw new Error(
"Only GitHub URLs and local paths are supported for import. " +
"Generic HTTP URLs are not supported. Use a GitHub or GitHub Enterprise URL (https://github.com/... or https://ghe.example.com/...) or a local directory path.",
);
}
sourcePayload = { type: "github", url: normalizeGithubImportSource(from, opts.ref) };
@@ -1325,6 +1441,12 @@ export function registerCompanyCommands(program: Command): void {
if (!imported) {
throw new Error("Import request returned no data.");
}
const tc = getTelemetryClient();
if (tc) {
const isPrivate = sourcePayload.type !== "github";
const sourceRef = sourcePayload.type === "github" ? sourcePayload.url : from;
trackCompanyImported(tc, { sourceType: sourcePayload.type, sourceRef, isPrivate });
}
let companyUrl: string | undefined;
if (!ctx.json) {
try {


@@ -0,0 +1,645 @@
import { mkdir, readdir, readFile, stat, writeFile } from "node:fs/promises";
import path from "node:path";
import pc from "picocolors";
import { Command } from "commander";
import type { Company, FeedbackTrace, FeedbackTraceBundle } from "@paperclipai/shared";
import {
addCommonClientOptions,
handleCommandError,
printOutput,
resolveCommandContext,
type BaseClientOptions,
type ResolvedClientContext,
} from "./common.js";
interface FeedbackFilterOptions extends BaseClientOptions {
targetType?: string;
vote?: string;
status?: string;
projectId?: string;
issueId?: string;
from?: string;
to?: string;
sharedOnly?: boolean;
}
export interface FeedbackTraceQueryOptions {
targetType?: string;
vote?: string;
status?: string;
projectId?: string;
issueId?: string;
from?: string;
to?: string;
sharedOnly?: boolean;
}
interface FeedbackReportOptions extends FeedbackFilterOptions {
payloads?: boolean;
}
interface FeedbackExportOptions extends FeedbackFilterOptions {
out?: string;
}
interface FeedbackSummary {
total: number;
thumbsUp: number;
thumbsDown: number;
withReason: number;
statuses: Record<string, number>;
}
interface FeedbackExportManifest {
exportedAt: string;
serverUrl: string;
companyId: string;
summary: FeedbackSummary & {
uniqueIssues: number;
issues: string[];
};
files: {
votes: string[];
traces: string[];
fullTraces: string[];
zip: string;
};
}
interface FeedbackExportResult {
outputDir: string;
zipPath: string;
manifest: FeedbackExportManifest;
}
export function registerFeedbackCommands(program: Command): void {
const feedback = program.command("feedback").description("Inspect and export local feedback traces");
addCommonClientOptions(
feedback
.command("report")
.description("Render a terminal report for company feedback traces")
.option("-C, --company-id <id>", "Company ID (overrides context default)")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--project-id <id>", "Filter by project ID")
.option("--issue-id <id>", "Filter by issue ID")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--payloads", "Include raw payload dumps in the terminal report", false)
.action(async (opts: FeedbackReportOptions) => {
try {
const ctx = resolveCommandContext(opts);
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
const summary = summarizeFeedbackTraces(traces);
if (ctx.json) {
printOutput(
{
apiBase: ctx.api.apiBase,
companyId,
summary,
traces,
},
{ json: true },
);
return;
}
console.log(renderFeedbackReport({
apiBase: ctx.api.apiBase,
companyId,
traces,
summary,
includePayloads: Boolean(opts.payloads),
}));
} catch (err) {
handleCommandError(err);
}
}),
{ includeCompany: false },
);
addCommonClientOptions(
feedback
.command("export")
.description("Export feedback votes and raw trace bundles into a folder plus zip archive")
.option("-C, --company-id <id>", "Company ID (overrides context default)")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--project-id <id>", "Filter by project ID")
.option("--issue-id <id>", "Filter by issue ID")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--out <path>", "Output directory (default: ./feedback-export-<timestamp>)")
.action(async (opts: FeedbackExportOptions) => {
try {
const ctx = resolveCommandContext(opts);
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
const outputDir = path.resolve(opts.out?.trim() || defaultFeedbackExportDirName());
const exported = await writeFeedbackExportBundle({
apiBase: ctx.api.apiBase,
companyId,
traces,
outputDir,
traceBundleFetcher: (trace) => fetchFeedbackTraceBundle(ctx, trace.id),
});
if (ctx.json) {
printOutput(
{
companyId,
outputDir: exported.outputDir,
zipPath: exported.zipPath,
summary: exported.manifest.summary,
},
{ json: true },
);
return;
}
console.log(renderFeedbackExportSummary(exported));
} catch (err) {
handleCommandError(err);
}
}),
{ includeCompany: false },
);
}
export async function resolveFeedbackCompanyId(
ctx: ResolvedClientContext,
explicitCompanyId?: string,
): Promise<string> {
const direct = explicitCompanyId?.trim() || ctx.companyId?.trim();
if (direct) return direct;
const companies = (await ctx.api.get<Company[]>("/api/companies")) ?? [];
const companyId = companies[0]?.id?.trim();
if (!companyId) {
throw new Error(
"Company ID is required. Pass --company-id, set PAPERCLIP_COMPANY_ID, or configure a CLI context default.",
);
}
return companyId;
}
export function buildFeedbackTraceQuery(opts: FeedbackTraceQueryOptions, includePayload = true): string {
const params = new URLSearchParams();
if (opts.targetType) params.set("targetType", opts.targetType);
if (opts.vote) params.set("vote", opts.vote);
if (opts.status) params.set("status", opts.status);
if (opts.projectId) params.set("projectId", opts.projectId);
if (opts.issueId) params.set("issueId", opts.issueId);
if (opts.from) params.set("from", opts.from);
if (opts.to) params.set("to", opts.to);
if (opts.sharedOnly) params.set("sharedOnly", "true");
if (includePayload) params.set("includePayload", "true");
const query = params.toString();
return query ? `?${query}` : "";
}
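`buildFeedbackTraceQuery` composes an optional query string from whichever filters are set, returning `""` when there are none. A self-contained sketch of the same pattern, with a trimmed-down option set for illustration:

```typescript
// Standalone sketch mirroring buildFeedbackTraceQuery with a reduced option
// set: each set filter becomes a query-string parameter; no filters yields "".
interface TraceQuery {
  vote?: string;
  status?: string;
  sharedOnly?: boolean;
}

function buildTraceQuery(opts: TraceQuery, includePayload = true): string {
  const params = new URLSearchParams();
  if (opts.vote) params.set("vote", opts.vote);
  if (opts.status) params.set("status", opts.status);
  if (opts.sharedOnly) params.set("sharedOnly", "true");
  if (includePayload) params.set("includePayload", "true");
  const query = params.toString();
  return query ? `?${query}` : "";
}

const query = buildTraceQuery({ vote: "down", sharedOnly: true }, false);
```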
export function normalizeFeedbackTraceExportFormat(value: string | undefined): "json" | "ndjson" {
if (!value || value === "ndjson") return "ndjson";
if (value === "json") return "json";
throw new Error(`Unsupported export format: ${value}`);
}
export function serializeFeedbackTraces(traces: FeedbackTrace[], format: string | undefined): string {
if (normalizeFeedbackTraceExportFormat(format) === "json") {
return JSON.stringify(traces, null, 2);
}
return traces.map((trace) => JSON.stringify(trace)).join("\n");
}
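The two export formats differ only in serialization: `json` is a pretty-printed array, `ndjson` is one compact JSON object per line. A standalone sketch:

```typescript
// Standalone sketch of the two export serializations used by
// serializeFeedbackTraces: pretty-printed JSON array vs. NDJSON lines.
function serializeTraces(traces: unknown[], format: "json" | "ndjson"): string {
  if (format === "json") {
    return JSON.stringify(traces, null, 2);
  }
  return traces.map((trace) => JSON.stringify(trace)).join("\n");
}

const ndjson = serializeTraces([{ id: "a" }, { id: "b" }], "ndjson");
```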
export async function fetchCompanyFeedbackTraces(
ctx: ResolvedClientContext,
companyId: string,
opts: FeedbackFilterOptions,
): Promise<FeedbackTrace[]> {
return (
(await ctx.api.get<FeedbackTrace[]>(
`/api/companies/${companyId}/feedback-traces${buildFeedbackTraceQuery(opts, true)}`,
)) ?? []
);
}
export async function fetchFeedbackTraceBundle(
ctx: ResolvedClientContext,
traceId: string,
): Promise<FeedbackTraceBundle> {
const bundle = await ctx.api.get<FeedbackTraceBundle>(`/api/feedback-traces/${traceId}/bundle`);
if (!bundle) {
throw new Error(`Feedback trace bundle ${traceId} not found`);
}
return bundle;
}
export function summarizeFeedbackTraces(traces: FeedbackTrace[]): FeedbackSummary {
const statuses: Record<string, number> = {};
let thumbsUp = 0;
let thumbsDown = 0;
let withReason = 0;
for (const trace of traces) {
if (trace.vote === "up") thumbsUp += 1;
if (trace.vote === "down") thumbsDown += 1;
if (readFeedbackReason(trace)) withReason += 1;
statuses[trace.status] = (statuses[trace.status] ?? 0) + 1;
}
return {
total: traces.length,
thumbsUp,
thumbsDown,
withReason,
statuses,
};
}
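`summarizeFeedbackTraces` is a single pass that tallies votes and per-status counts. A simplified standalone version (reason detection from payload snapshots omitted):

```typescript
// Simplified standalone version of summarizeFeedbackTraces: one pass tallies
// up/down votes and counts traces per export status.
interface TraceLite {
  vote: "up" | "down";
  status: string;
}

function summarize(traces: TraceLite[]) {
  const statuses: Record<string, number> = {};
  let thumbsUp = 0;
  let thumbsDown = 0;
  for (const trace of traces) {
    if (trace.vote === "up") thumbsUp += 1;
    if (trace.vote === "down") thumbsDown += 1;
    statuses[trace.status] = (statuses[trace.status] ?? 0) + 1;
  }
  return { total: traces.length, thumbsUp, thumbsDown, statuses };
}

const summary = summarize([
  { vote: "up", status: "pending" },
  { vote: "down", status: "pending" },
  { vote: "down", status: "sent" },
]);
```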
export function renderFeedbackReport(input: {
apiBase: string;
companyId: string;
traces: FeedbackTrace[];
summary: FeedbackSummary;
includePayloads: boolean;
}): string {
const lines: string[] = [];
lines.push("");
lines.push(pc.bold(pc.magenta("Paperclip Feedback Report")));
lines.push(pc.dim(new Date().toISOString()));
lines.push(horizontalRule());
lines.push(`${pc.dim("Server:")} ${input.apiBase}`);
lines.push(`${pc.dim("Company:")} ${input.companyId}`);
lines.push("");
if (input.traces.length === 0) {
lines.push(pc.yellow("[!!] No feedback traces found."));
lines.push("");
return lines.join("\n");
}
lines.push(pc.bold(pc.cyan("Summary")));
lines.push(horizontalRule());
lines.push(` ${pc.green(pc.bold(String(input.summary.thumbsUp)))} thumbs up`);
lines.push(` ${pc.red(pc.bold(String(input.summary.thumbsDown)))} thumbs down`);
lines.push(` ${pc.yellow(pc.bold(String(input.summary.withReason)))} downvotes with a reason`);
lines.push(` ${pc.bold(String(input.summary.total))} total traces`);
lines.push("");
lines.push(pc.dim("Export status:"));
for (const status of ["pending", "sent", "local_only", "failed"]) {
lines.push(` ${padRight(status, 10)} ${input.summary.statuses[status] ?? 0}`);
}
lines.push("");
lines.push(pc.bold(pc.cyan("Trace Details")));
lines.push(horizontalRule());
for (const trace of input.traces) {
const voteColor = trace.vote === "up" ? pc.green : pc.red;
const voteIcon = trace.vote === "up" ? "^" : "v";
const issueRef = trace.issueIdentifier ?? trace.issueId;
const label = trace.targetSummary.label?.trim() || trace.targetType;
const excerpt = compactText(trace.targetSummary.excerpt);
const reason = readFeedbackReason(trace);
lines.push(
` ${voteColor(voteIcon)} ${pc.bold(issueRef)} ${pc.dim(compactText(trace.issueTitle, 64))}`,
);
lines.push(
` ${pc.dim("Trace:")} ${trace.id.slice(0, 8)} ${pc.dim("Status:")} ${trace.status} ${pc.dim("Date:")} ${formatTimestamp(trace.createdAt)}`,
);
lines.push(` ${pc.dim("Target:")} ${label}`);
if (excerpt) {
lines.push(` ${pc.dim("Excerpt:")} ${excerpt}`);
}
if (reason) {
lines.push(` ${pc.yellow(pc.bold("Reason:"))} ${pc.yellow(reason)}`);
}
lines.push("");
}
if (input.includePayloads) {
lines.push(pc.bold(pc.cyan("Raw Payloads")));
lines.push(horizontalRule());
for (const trace of input.traces) {
if (!trace.payloadSnapshot) continue;
const issueRef = trace.issueIdentifier ?? trace.issueId;
lines.push(` ${pc.bold(`${issueRef} (${trace.id.slice(0, 8)})`)}`);
const body = JSON.stringify(trace.payloadSnapshot, null, 2)?.split("\n") ?? [];
for (const line of body) {
lines.push(` ${pc.dim(line)}`);
}
lines.push("");
}
}
lines.push(horizontalRule());
lines.push(pc.dim(`Report complete. ${input.traces.length} trace(s) displayed.`));
lines.push("");
return lines.join("\n");
}
export async function writeFeedbackExportBundle(input: {
apiBase: string;
companyId: string;
traces: FeedbackTrace[];
outputDir: string;
traceBundleFetcher?: (trace: FeedbackTrace) => Promise<FeedbackTraceBundle>;
}): Promise<FeedbackExportResult> {
await ensureEmptyOutputDirectory(input.outputDir);
await mkdir(path.join(input.outputDir, "votes"), { recursive: true });
await mkdir(path.join(input.outputDir, "traces"), { recursive: true });
await mkdir(path.join(input.outputDir, "full-traces"), { recursive: true });
const summary = summarizeFeedbackTraces(input.traces);
const voteFiles: string[] = [];
const traceFiles: string[] = [];
const fullTraceDirs: string[] = [];
const fullTraceFiles: string[] = [];
const issueSet = new Set<string>();
for (const trace of input.traces) {
const issueRef = sanitizeFileSegment(trace.issueIdentifier ?? trace.issueId);
const voteRecord = buildFeedbackVoteRecord(trace);
const voteFileName = `${issueRef}-${trace.feedbackVoteId.slice(0, 8)}.json`;
const traceFileName = `${issueRef}-${trace.id.slice(0, 8)}.json`;
voteFiles.push(voteFileName);
traceFiles.push(traceFileName);
issueSet.add(trace.issueIdentifier ?? trace.issueId);
await writeFile(
path.join(input.outputDir, "votes", voteFileName),
`${JSON.stringify(voteRecord, null, 2)}\n`,
"utf8",
);
await writeFile(
path.join(input.outputDir, "traces", traceFileName),
`${JSON.stringify(trace, null, 2)}\n`,
"utf8",
);
if (input.traceBundleFetcher) {
const bundle = await input.traceBundleFetcher(trace);
const bundleDirName = `${issueRef}-${trace.id.slice(0, 8)}`;
const bundleDir = path.join(input.outputDir, "full-traces", bundleDirName);
await mkdir(bundleDir, { recursive: true });
fullTraceDirs.push(bundleDirName);
await writeFile(
path.join(bundleDir, "bundle.json"),
`${JSON.stringify(bundle, null, 2)}\n`,
"utf8",
);
fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, "bundle.json"));
for (const file of bundle.files) {
const targetPath = path.join(bundleDir, file.path);
await mkdir(path.dirname(targetPath), { recursive: true });
await writeFile(targetPath, file.contents, "utf8");
fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, file.path.replace(/\\/g, "/")));
}
}
}
const zipPath = `${input.outputDir}.zip`;
const manifest: FeedbackExportManifest = {
exportedAt: new Date().toISOString(),
serverUrl: input.apiBase,
companyId: input.companyId,
summary: {
...summary,
uniqueIssues: issueSet.size,
issues: Array.from(issueSet).sort((left, right) => left.localeCompare(right)),
},
files: {
votes: voteFiles.slice().sort((left, right) => left.localeCompare(right)),
traces: traceFiles.slice().sort((left, right) => left.localeCompare(right)),
fullTraces: fullTraceDirs.slice().sort((left, right) => left.localeCompare(right)),
zip: path.basename(zipPath),
},
};
await writeFile(
path.join(input.outputDir, "index.json"),
`${JSON.stringify(manifest, null, 2)}\n`,
"utf8",
);
const archiveFiles = await collectJsonFilesForArchive(input.outputDir, [
"index.json",
...manifest.files.votes.map((file) => path.posix.join("votes", file)),
...manifest.files.traces.map((file) => path.posix.join("traces", file)),
...fullTraceFiles,
]);
await writeFile(zipPath, createStoredZipArchive(archiveFiles, path.basename(input.outputDir)));
return {
outputDir: input.outputDir,
zipPath,
manifest,
};
}
export function renderFeedbackExportSummary(exported: FeedbackExportResult): string {
const lines: string[] = [];
lines.push("");
lines.push(pc.bold(pc.magenta("Paperclip Feedback Export")));
lines.push(pc.dim(exported.manifest.exportedAt));
lines.push(horizontalRule());
lines.push(`${pc.dim("Company:")} ${exported.manifest.companyId}`);
lines.push(`${pc.dim("Output:")} ${exported.outputDir}`);
lines.push(`${pc.dim("Archive:")} ${exported.zipPath}`);
lines.push("");
lines.push(pc.bold("Export Summary"));
lines.push(horizontalRule());
lines.push(` ${pc.green(pc.bold(String(exported.manifest.summary.thumbsUp)))} thumbs up`);
lines.push(` ${pc.red(pc.bold(String(exported.manifest.summary.thumbsDown)))} thumbs down`);
lines.push(` ${pc.yellow(pc.bold(String(exported.manifest.summary.withReason)))} with reason`);
lines.push(` ${pc.bold(String(exported.manifest.summary.uniqueIssues))} unique issues`);
lines.push("");
lines.push(pc.dim("Files:"));
lines.push(` ${path.join(exported.outputDir, "index.json")}`);
lines.push(` ${path.join(exported.outputDir, "votes")} (${exported.manifest.files.votes.length} files)`);
lines.push(` ${path.join(exported.outputDir, "traces")} (${exported.manifest.files.traces.length} files)`);
lines.push(` ${path.join(exported.outputDir, "full-traces")} (${exported.manifest.files.fullTraces.length} bundles)`);
lines.push(` ${exported.zipPath}`);
lines.push("");
return lines.join("\n");
}
function readFeedbackReason(trace: FeedbackTrace): string | null {
const payload = asRecord(trace.payloadSnapshot);
const vote = asRecord(payload?.vote);
const reason = vote?.reason;
return typeof reason === "string" && reason.trim() ? reason.trim() : null;
}
function buildFeedbackVoteRecord(trace: FeedbackTrace) {
return {
voteId: trace.feedbackVoteId,
traceId: trace.id,
issueId: trace.issueId,
issueIdentifier: trace.issueIdentifier,
issueTitle: trace.issueTitle,
vote: trace.vote,
targetType: trace.targetType,
targetId: trace.targetId,
targetSummary: trace.targetSummary,
status: trace.status,
consentVersion: trace.consentVersion,
createdAt: trace.createdAt,
updatedAt: trace.updatedAt,
reason: readFeedbackReason(trace),
};
}
function asRecord(value: unknown): Record<string, unknown> | null {
if (!value || typeof value !== "object" || Array.isArray(value)) return null;
return value as Record<string, unknown>;
}
function compactText(value: string | null | undefined, maxLength = 88): string | null {
if (!value) return null;
const compact = value.replace(/\s+/g, " ").trim();
if (!compact) return null;
if (compact.length <= maxLength) return compact;
return `${compact.slice(0, maxLength - 3)}...`;
}
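`compactText` collapses whitespace runs and truncates long strings with an ellipsis. A standalone copy of the behavior:

```typescript
// Standalone copy of the compactText behavior: collapse all whitespace runs
// to single spaces, trim, and truncate to maxLength with a "..." suffix.
function compact(value: string | null | undefined, maxLength = 88): string | null {
  if (!value) return null;
  const collapsed = value.replace(/\s+/g, " ").trim();
  if (!collapsed) return null;
  if (collapsed.length <= maxLength) return collapsed;
  return `${collapsed.slice(0, maxLength - 3)}...`;
}
```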
function formatTimestamp(value: unknown): string {
if (value instanceof Date) return value.toISOString().slice(0, 19).replace("T", " ");
if (typeof value === "string") return value.slice(0, 19).replace("T", " ");
return "-";
}
function horizontalRule(): string {
return pc.dim("-".repeat(72));
}
function padRight(value: string, width: number): string {
return `${value}${" ".repeat(Math.max(0, width - value.length))}`;
}
function defaultFeedbackExportDirName(): string {
const iso = new Date().toISOString().replace(/[-:]/g, "").replace(/\.\d{3}Z$/, "Z");
return `feedback-export-${iso}`;
}
async function ensureEmptyOutputDirectory(outputDir: string): Promise<void> {
try {
const info = await stat(outputDir);
if (!info.isDirectory()) {
throw new Error(`Output path already exists and is not a directory: ${outputDir}`);
}
const entries = await readdir(outputDir);
if (entries.length > 0) {
throw new Error(`Output directory already exists and is not empty: ${outputDir}`);
}
} catch (error) {
const message = error instanceof Error ? error.message : "";
if (/ENOENT/.test(message)) {
await mkdir(outputDir, { recursive: true });
return;
}
throw error;
}
}
async function collectJsonFilesForArchive(
outputDir: string,
relativePaths: string[],
): Promise<Record<string, string>> {
const files: Record<string, string> = {};
for (const relativePath of relativePaths) {
const normalized = relativePath.replace(/\\/g, "/");
files[normalized] = await readFile(path.join(outputDir, normalized), "utf8");
}
return files;
}
function sanitizeFileSegment(value: string): string {
return value.replace(/[^a-zA-Z0-9._-]+/g, "-").replace(/^-+|-+$/g, "") || "feedback";
}
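`sanitizeFileSegment` keeps the generated vote/trace filenames portable. A standalone copy of the rule:

```typescript
// Standalone copy of the sanitizeFileSegment rule: any run of characters
// outside [a-zA-Z0-9._-] becomes a single "-", leading/trailing dashes are
// stripped, and an empty result falls back to "feedback".
function sanitizeSegment(value: string): string {
  return value.replace(/[^a-zA-Z0-9._-]+/g, "-").replace(/^-+|-+$/g, "") || "feedback";
}
```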
function writeUint16(target: Uint8Array, offset: number, value: number) {
target[offset] = value & 0xff;
target[offset + 1] = (value >>> 8) & 0xff;
}
function writeUint32(target: Uint8Array, offset: number, value: number) {
target[offset] = value & 0xff;
target[offset + 1] = (value >>> 8) & 0xff;
target[offset + 2] = (value >>> 16) & 0xff;
target[offset + 3] = (value >>> 24) & 0xff;
}
function crc32(bytes: Uint8Array) {
let crc = 0xffffffff;
for (const byte of bytes) {
crc ^= byte;
for (let bit = 0; bit < 8; bit += 1) {
crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
}
}
return (crc ^ 0xffffffff) >>> 0;
}
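The CRC-32 above is the standard reflected, table-less variant (polynomial 0xEDB88320) that ZIP requires. A standalone copy can be verified against the well-known check value for the ASCII string "123456789":

```typescript
// Standalone copy of the table-less reflected CRC-32 (polynomial 0xEDB88320)
// used for the stored ZIP entries. The standard check value for the ASCII
// string "123456789" is 0xCBF43926.
function crc32(bytes: Uint8Array): number {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

const checkValue = crc32(new TextEncoder().encode("123456789"));
```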
function createStoredZipArchive(files: Record<string, string>, rootPath: string): Uint8Array {
const encoder = new TextEncoder();
const localChunks: Uint8Array[] = [];
const centralChunks: Uint8Array[] = [];
let localOffset = 0;
let entryCount = 0;
for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
const fileName = encoder.encode(`${rootPath}/${relativePath}`);
const body = encoder.encode(content);
const checksum = crc32(body);
const localHeader = new Uint8Array(30 + fileName.length);
writeUint32(localHeader, 0, 0x04034b50);
writeUint16(localHeader, 4, 20);
writeUint16(localHeader, 6, 0x0800);
writeUint16(localHeader, 8, 0);
writeUint32(localHeader, 14, checksum);
writeUint32(localHeader, 18, body.length);
writeUint32(localHeader, 22, body.length);
writeUint16(localHeader, 26, fileName.length);
localHeader.set(fileName, 30);
const centralHeader = new Uint8Array(46 + fileName.length);
writeUint32(centralHeader, 0, 0x02014b50);
writeUint16(centralHeader, 4, 20);
writeUint16(centralHeader, 6, 20);
writeUint16(centralHeader, 8, 0x0800);
writeUint16(centralHeader, 10, 0);
writeUint32(centralHeader, 16, checksum);
writeUint32(centralHeader, 20, body.length);
writeUint32(centralHeader, 24, body.length);
writeUint16(centralHeader, 28, fileName.length);
writeUint32(centralHeader, 42, localOffset);
centralHeader.set(fileName, 46);
localChunks.push(localHeader, body);
centralChunks.push(centralHeader);
localOffset += localHeader.length + body.length;
entryCount += 1;
}
const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
const archive = new Uint8Array(
localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
);
let offset = 0;
for (const chunk of localChunks) {
archive.set(chunk, offset);
offset += chunk.length;
}
const centralDirectoryOffset = offset;
for (const chunk of centralChunks) {
archive.set(chunk, offset);
offset += chunk.length;
}
writeUint32(archive, offset, 0x06054b50);
writeUint16(archive, offset + 8, entryCount);
writeUint16(archive, offset + 10, entryCount);
writeUint32(archive, offset + 12, centralDirectoryLength);
writeUint32(archive, offset + 16, centralDirectoryOffset);
return archive;
}
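The archive writer emits a STORE-only ZIP by hand: local file headers, a central directory, and a 22-byte end-of-central-directory (EOCD) record. The smallest valid archive is just that EOCD record; a standalone sketch of its layout:

```typescript
// Standalone sketch of the ZIP end-of-central-directory record appended by a
// writer like createStoredZipArchive: 22 bytes, signature 0x06054b50 stored
// little-endian, with entry counts and central-directory size/offset all zero
// for an empty archive.
function emptyStoredZip(): Uint8Array {
  const eocd = new Uint8Array(22); // zero-filled: 0 entries, empty central directory
  eocd[0] = 0x50; // 'P'
  eocd[1] = 0x4b; // 'K'
  eocd[2] = 0x05;
  eocd[3] = 0x06;
  return eocd;
}

const archive = emptyStoredZip();
```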


@@ -1,8 +1,10 @@
import { Command } from "commander";
import { writeFile } from "node:fs/promises";
import {
addIssueCommentSchema,
checkoutIssueSchema,
createIssueSchema,
type FeedbackTrace,
updateIssueSchema,
type Issue,
type IssueComment,
@@ -15,6 +17,11 @@ import {
resolveCommandContext,
type BaseClientOptions,
} from "./common.js";
import {
buildFeedbackTraceQuery,
normalizeFeedbackTraceExportFormat,
serializeFeedbackTraces,
} from "./feedback.js";
interface IssueBaseOptions extends BaseClientOptions {
status?: string;
@@ -61,6 +68,18 @@ interface IssueCheckoutOptions extends BaseClientOptions {
expectedStatuses?: string;
}
interface IssueFeedbackOptions extends BaseClientOptions {
targetType?: string;
vote?: string;
status?: string;
from?: string;
to?: string;
sharedOnly?: boolean;
includePayload?: boolean;
out?: string;
format?: string;
}
export function registerIssueCommands(program: Command): void {
const issue = program.command("issue").description("Issue operations");
@@ -237,6 +256,85 @@ export function registerIssueCommands(program: Command): void {
}),
);
addCommonClientOptions(
issue
.command("feedback:list")
.description("List feedback traces for an issue")
.argument("<issueId>", "Issue ID")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--include-payload", "Include stored payload snapshots in the response")
.action(async (issueId: string, opts: IssueFeedbackOptions) => {
try {
const ctx = resolveCommandContext(opts);
const traces = (await ctx.api.get<FeedbackTrace[]>(
`/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts)}`,
)) ?? [];
if (ctx.json) {
printOutput(traces, { json: true });
return;
}
printOutput(
traces.map((trace) => ({
id: trace.id,
issue: trace.issueIdentifier ?? trace.issueId,
vote: trace.vote,
status: trace.status,
targetType: trace.targetType,
target: trace.targetSummary.label,
})),
{ json: false },
);
} catch (err) {
handleCommandError(err);
}
}),
);
addCommonClientOptions(
issue
.command("feedback:export")
.description("Export feedback traces for an issue")
.argument("<issueId>", "Issue ID")
.option("--target-type <type>", "Filter by target type")
.option("--vote <vote>", "Filter by vote value")
.option("--status <status>", "Filter by trace status")
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
.option("--shared-only", "Only include traces eligible for sharing/export")
.option("--include-payload", "Include stored payload snapshots in the export")
.option("--out <path>", "Write export to a file path instead of stdout")
.option("--format <format>", "Export format: json or ndjson", "ndjson")
.action(async (issueId: string, opts: IssueFeedbackOptions) => {
try {
const ctx = resolveCommandContext(opts);
const traces = (await ctx.api.get<FeedbackTrace[]>(
`/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts, opts.includePayload ?? true)}`,
)) ?? [];
const serialized = serializeFeedbackTraces(traces, opts.format);
if (opts.out?.trim()) {
await writeFile(opts.out, serialized, "utf8");
if (ctx.json) {
printOutput(
{ out: opts.out, count: traces.length, format: normalizeFeedbackTraceExportFormat(opts.format) },
{ json: true },
);
return;
}
console.log(`Wrote ${traces.length} feedback trace(s) to ${opts.out}`);
return;
}
process.stdout.write(`${serialized}${serialized.endsWith("\n") ? "" : "\n"}`);
} catch (err) {
handleCommandError(err);
}
}),
);
addCommonClientOptions(
issue
.command("checkout")


@@ -63,6 +63,9 @@ function defaultConfig(): PaperclipConfig {
baseUrlMode: "auto",
disableSignUp: false,
},
telemetry: {
enabled: true,
},
storage: defaultStorageConfig(),
secrets: defaultSecretsConfig(),
};


@@ -33,6 +33,11 @@ import {
} from "../config/home.js";
import { bootstrapCeoInvite } from "./auth-bootstrap-ceo.js";
import { printPaperclipCliBanner } from "../utils/banner.js";
import {
getTelemetryClient,
trackInstallStarted,
trackInstallCompleted,
} from "../telemetry.js";
type SetupMode = "quickstart" | "advanced";
@@ -356,6 +361,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
setupMode = setupModeChoice as SetupMode;
}
const tc = getTelemetryClient();
if (tc) trackInstallStarted(tc);
let llm: PaperclipConfig["llm"] | undefined;
const { defaults: derivedDefaults, usedEnvKeys, ignoredEnvKeys } = quickstartDefaultsFromEnv();
let {
@@ -488,6 +496,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
logging,
server,
auth,
telemetry: {
enabled: true,
},
storage,
secrets,
};
@@ -501,6 +512,10 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
writeConfig(config, opts.config);
if (tc) trackInstallCompleted(tc, {
adapterType: server.deploymentMode,
});
p.note(
[
`Database: ${database.mode}`,


@@ -0,0 +1,352 @@
import fs from "node:fs";
import net from "node:net";
import path from "node:path";
import { Command } from "commander";
import pc from "picocolors";
import {
applyPendingMigrations,
createDb,
createEmbeddedPostgresLogBuffer,
ensurePostgresDatabase,
formatEmbeddedPostgresError,
routines,
} from "@paperclipai/db";
import { eq, inArray } from "drizzle-orm";
import { loadPaperclipEnvFile } from "../config/env.js";
import { readConfig, resolveConfigPath } from "../config/store.js";
type RoutinesDisableAllOptions = {
config?: string;
dataDir?: string;
companyId?: string;
json?: boolean;
};
type DisableAllRoutinesResult = {
companyId: string;
totalRoutines: number;
pausedCount: number;
alreadyPausedCount: number;
archivedCount: number;
};
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
type EmbeddedPostgresHandle = {
port: number;
startedByThisProcess: boolean;
stop: () => Promise<void>;
};
type ClosableDb = ReturnType<typeof createDb> & {
$client?: {
end?: (options?: { timeout?: number }) => Promise<void>;
};
};
function nonEmpty(value: string | null | undefined): string | null {
return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}
async function isPortAvailable(port: number): Promise<boolean> {
return await new Promise<boolean>((resolve) => {
const server = net.createServer();
server.unref();
server.once("error", () => resolve(false));
server.listen(port, "127.0.0.1", () => {
server.close(() => resolve(true));
});
});
}
async function findAvailablePort(preferredPort: number): Promise<number> {
let port = Math.max(1, Math.trunc(preferredPort));
while (!(await isPortAvailable(port))) {
port += 1;
}
return port;
}
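`findAvailablePort` probes increasing port numbers until one is free. A synchronous sketch with the availability probe injected, so the search logic can be exercised without opening sockets (the real helper probes 127.0.0.1 via `net.createServer`):

```typescript
// Synchronous sketch of the findAvailablePort loop with an injected
// availability check: clamp the preferred port to a positive integer,
// then walk upward until the probe reports a free port.
function findPort(preferred: number, isFree: (port: number) => boolean): number {
  let port = Math.max(1, Math.trunc(preferred));
  while (!isFree(port)) {
    port += 1;
  }
  return port;
}

const busy = new Set([5432, 5433]);
const chosen = findPort(5432, (port) => !busy.has(port));
```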
function readPidFilePort(postmasterPidFile: string): number | null {
if (!fs.existsSync(postmasterPidFile)) return null;
try {
const lines = fs.readFileSync(postmasterPidFile, "utf8").split("\n");
const port = Number(lines[3]?.trim());
return Number.isInteger(port) && port > 0 ? port : null;
} catch {
return null;
}
}
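`readPidFilePort` relies on the line-oriented layout of PostgreSQL's `postmaster.pid` file, where the listening port appears on the fourth line. A standalone sketch of just the parsing step (the sample file contents below are illustrative):

```typescript
// Standalone parser for the port recorded in a PostgreSQL postmaster.pid
// file: split into lines, read the fourth line, and accept only a positive
// integer; anything else yields null.
function parsePidFilePort(contents: string): number | null {
  const lines = contents.split("\n");
  const port = Number(lines[3]?.trim());
  return Number.isInteger(port) && port > 0 ? port : null;
}

const samplePidFile = "12345\n/var/lib/paperclip/pg\n1700000000\n5433\n";
const parsedPort = parsePidFilePort(samplePidFile);
```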
function readRunningPostmasterPid(postmasterPidFile: string): number | null {
if (!fs.existsSync(postmasterPidFile)) return null;
try {
const pid = Number(fs.readFileSync(postmasterPidFile, "utf8").split("\n")[0]?.trim());
if (!Number.isInteger(pid) || pid <= 0) return null;
process.kill(pid, 0);
return pid;
} catch {
return null;
}
}
async function ensureEmbeddedPostgres(dataDir: string, preferredPort: number): Promise<EmbeddedPostgresHandle> {
const moduleName = "embedded-postgres";
let EmbeddedPostgres: EmbeddedPostgresCtor;
try {
const mod = await import(moduleName);
EmbeddedPostgres = mod.default as EmbeddedPostgresCtor;
} catch {
throw new Error(
"Embedded PostgreSQL support requires dependency `embedded-postgres`. Reinstall dependencies and try again.",
);
}
const postmasterPidFile = path.resolve(dataDir, "postmaster.pid");
const runningPid = readRunningPostmasterPid(postmasterPidFile);
if (runningPid) {
    // A running server already owns this data directory; reuse it and let its owner stop it.
    return {
      port: readPidFilePort(postmasterPidFile) ?? preferredPort,
      startedByThisProcess: false,
      stop: async () => {},
    };
}
const port = await findAvailablePort(preferredPort);
const logBuffer = createEmbeddedPostgresLogBuffer();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: logBuffer.append,
onError: logBuffer.append,
});
if (!fs.existsSync(path.resolve(dataDir, "PG_VERSION"))) {
try {
await instance.initialise();
} catch (error) {
throw formatEmbeddedPostgresError(error, {
fallbackMessage: `Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
}
  // No live postmaster was found above, so any remaining pid file is stale and would block startup.
  if (fs.existsSync(postmasterPidFile)) {
    fs.rmSync(postmasterPidFile, { force: true });
  }
try {
await instance.start();
} catch (error) {
throw formatEmbeddedPostgresError(error, {
fallbackMessage: `Failed to start embedded PostgreSQL on port ${port}`,
recentLogs: logBuffer.getRecentLogs(),
});
}
return {
port,
startedByThisProcess: true,
stop: async () => {
await instance.stop();
},
};
}
async function closeDb(db: ClosableDb): Promise<void> {
await db.$client?.end?.({ timeout: 5 }).catch(() => undefined);
}
async function openConfiguredDb(configPath: string): Promise<{
db: ClosableDb;
stop: () => Promise<void>;
}> {
const config = readConfig(configPath);
if (!config) {
throw new Error(`Config not found at ${configPath}.`);
}
let embeddedHandle: EmbeddedPostgresHandle | null = null;
try {
if (config.database.mode === "embedded-postgres") {
embeddedHandle = await ensureEmbeddedPostgres(
config.database.embeddedPostgresDataDir,
config.database.embeddedPostgresPort,
);
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
await applyPendingMigrations(connectionString);
const db = createDb(connectionString) as ClosableDb;
return {
db,
stop: async () => {
await closeDb(db);
if (embeddedHandle?.startedByThisProcess) {
await embeddedHandle.stop().catch(() => undefined);
}
},
};
}
const connectionString = nonEmpty(config.database.connectionString);
if (!connectionString) {
throw new Error(`Config at ${configPath} does not define a database connection string.`);
}
await applyPendingMigrations(connectionString);
const db = createDb(connectionString) as ClosableDb;
return {
db,
stop: async () => {
await closeDb(db);
},
};
} catch (error) {
if (embeddedHandle?.startedByThisProcess) {
await embeddedHandle.stop().catch(() => undefined);
}
throw error;
}
}
export async function disableAllRoutinesInConfig(
options: Pick<RoutinesDisableAllOptions, "config" | "companyId">,
): Promise<DisableAllRoutinesResult> {
const configPath = resolveConfigPath(options.config);
loadPaperclipEnvFile(configPath);
const companyId =
nonEmpty(options.companyId)
?? nonEmpty(process.env.PAPERCLIP_COMPANY_ID)
?? null;
if (!companyId) {
throw new Error("Company ID is required. Pass --company-id or set PAPERCLIP_COMPANY_ID.");
}
const config = readConfig(configPath);
if (!config) {
throw new Error(`Config not found at ${configPath}.`);
}
let embeddedHandle: EmbeddedPostgresHandle | null = null;
let db: ClosableDb | null = null;
try {
if (config.database.mode === "embedded-postgres") {
embeddedHandle = await ensureEmbeddedPostgres(
config.database.embeddedPostgresDataDir,
config.database.embeddedPostgresPort,
);
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
await applyPendingMigrations(connectionString);
db = createDb(connectionString) as ClosableDb;
} else {
const connectionString = nonEmpty(config.database.connectionString);
if (!connectionString) {
throw new Error(`Config at ${configPath} does not define a database connection string.`);
}
await applyPendingMigrations(connectionString);
db = createDb(connectionString) as ClosableDb;
}
const existing = await db
.select({
id: routines.id,
status: routines.status,
})
.from(routines)
.where(eq(routines.companyId, companyId));
const alreadyPausedCount = existing.filter((routine) => routine.status === "paused").length;
const archivedCount = existing.filter((routine) => routine.status === "archived").length;
const idsToPause = existing
.filter((routine) => routine.status !== "paused" && routine.status !== "archived")
.map((routine) => routine.id);
if (idsToPause.length > 0) {
await db
.update(routines)
.set({
status: "paused",
updatedAt: new Date(),
})
.where(inArray(routines.id, idsToPause));
}
return {
companyId,
totalRoutines: existing.length,
pausedCount: idsToPause.length,
alreadyPausedCount,
archivedCount,
};
} finally {
if (db) {
await closeDb(db);
}
if (embeddedHandle?.startedByThisProcess) {
await embeddedHandle.stop().catch(() => undefined);
}
}
}
export async function disableAllRoutinesCommand(options: RoutinesDisableAllOptions): Promise<void> {
const result = await disableAllRoutinesInConfig(options);
if (options.json) {
console.log(JSON.stringify(result, null, 2));
return;
}
if (result.totalRoutines === 0) {
console.log(pc.dim(`No routines found for company ${result.companyId}.`));
return;
}
console.log(
`Paused ${result.pausedCount} routine(s) for company ${result.companyId} ` +
`(${result.alreadyPausedCount} already paused, ${result.archivedCount} archived).`,
);
}
export function registerRoutineCommands(program: Command): void {
const routinesCommand = program.command("routines").description("Local routine maintenance commands");
routinesCommand
.command("disable-all")
.description("Pause all non-archived routines in the configured local instance for one company")
.option("-c, --config <path>", "Path to config file")
.option("-d, --data-dir <path>", "Paperclip data directory root (isolates state from ~/.paperclip)")
.option("-C, --company-id <id>", "Company ID")
.option("--json", "Output raw JSON")
.action(async (opts: RoutinesDisableAllOptions) => {
try {
await disableAllRoutinesCommand(opts);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error(pc.red(message));
process.exit(1);
}
});
}

View File

@@ -224,6 +224,9 @@ export function buildWorktreeConfig(input: {
...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
disableSignUp: source?.auth.disableSignUp ?? false,
},
telemetry: {
enabled: source?.telemetry?.enabled ?? true,
},
storage: {
provider: source?.storage.provider ?? "local_disk",
localDisk: {

View File

@@ -7,6 +7,7 @@ export {
loggingConfigSchema,
serverConfigSchema,
authConfigSchema,
telemetryConfigSchema,
storageConfigSchema,
storageLocalDiskConfigSchema,
storageS3ConfigSchema,
@@ -19,10 +20,11 @@ export {
type LoggingConfig,
type ServerConfig,
type AuthConfig,
type TelemetryConfig,
 type StorageConfig,
 type StorageLocalDiskConfig,
 type StorageS3Config,
 type SecretsConfig,
 type SecretsLocalEncryptedConfig,
 type ConfigMeta,
-} from "@paperclipai/shared";
+} from "../../../packages/shared/src/config-schema.js";

View File

@@ -15,11 +15,15 @@ import { registerAgentCommands } from "./commands/client/agent.js";
 import { registerApprovalCommands } from "./commands/client/approval.js";
 import { registerActivityCommands } from "./commands/client/activity.js";
 import { registerDashboardCommands } from "./commands/client/dashboard.js";
+import { registerRoutineCommands } from "./commands/routines.js";
+import { registerFeedbackCommands } from "./commands/client/feedback.js";
 import { applyDataDirOverride, type DataDirOptionLike } from "./config/data-dir.js";
 import { loadPaperclipEnvFile } from "./config/env.js";
+import { initTelemetryFromConfigFile, flushTelemetry } from "./telemetry.js";
 import { registerWorktreeCommands } from "./commands/worktree.js";
 import { registerPluginCommands } from "./commands/client/plugin.js";
 import { registerClientAuthCommands } from "./commands/client/auth.js";
+import { cliVersion } from "./version.js";
 const program = new Command();
 const DATA_DIR_OPTION_HELP =
@@ -28,7 +32,7 @@ const DATA_DIR_OPTION_HELP =
 program
   .name("paperclipai")
   .description("Paperclip CLI — setup, diagnose, and configure your instance")
-  .version("0.2.7");
+  .version(cliVersion);
 program.hook("preAction", (_thisCommand, actionCommand) => {
   const options = actionCommand.optsWithGlobals() as DataDirOptionLike;
@@ -38,6 +42,7 @@ program.hook("preAction", (_thisCommand, actionCommand) => {
   hasContextOption: optionNames.has("context"),
 });
 loadPaperclipEnvFile(options.config);
+initTelemetryFromConfigFile(options.config);
 });
 program
@@ -137,6 +142,8 @@ registerAgentCommands(program);
 registerApprovalCommands(program);
 registerActivityCommands(program);
 registerDashboardCommands(program);
+registerRoutineCommands(program);
+registerFeedbackCommands(program);
 registerWorktreeCommands(program);
 registerPluginCommands(program);
@@ -154,7 +161,20 @@ auth
 registerClientAuthCommands(auth);
-program.parseAsync().catch((err) => {
-  console.error(err instanceof Error ? err.message : String(err));
-  process.exit(1);
-});
+async function main(): Promise<void> {
+  let failed = false;
+  try {
+    await program.parseAsync();
+  } catch (err) {
+    failed = true;
+    console.error(err instanceof Error ? err.message : String(err));
+  } finally {
+    await flushTelemetry();
+  }
+  if (failed) {
+    process.exit(1);
+  }
+}
+void main();

cli/src/telemetry.ts Normal file
View File

@@ -0,0 +1,49 @@
import path from "node:path";
import {
TelemetryClient,
resolveTelemetryConfig,
loadOrCreateState,
trackInstallStarted,
trackInstallCompleted,
trackCompanyImported,
} from "../../packages/shared/src/telemetry/index.js";
import { resolvePaperclipInstanceRoot } from "./config/home.js";
import { readConfig } from "./config/store.js";
import { cliVersion } from "./version.js";
let client: TelemetryClient | null = null;
export function initTelemetry(fileConfig?: { enabled?: boolean }): TelemetryClient | null {
if (client) return client;
const config = resolveTelemetryConfig(fileConfig);
if (!config.enabled) return null;
const stateDir = path.join(resolvePaperclipInstanceRoot(), "telemetry");
client = new TelemetryClient(config, () => loadOrCreateState(stateDir, cliVersion), cliVersion);
return client;
}
export function initTelemetryFromConfigFile(configPath?: string): TelemetryClient | null {
try {
return initTelemetry(readConfig(configPath)?.telemetry);
} catch {
return initTelemetry();
}
}
export function getTelemetryClient(): TelemetryClient | null {
return client;
}
export async function flushTelemetry(): Promise<void> {
if (client) {
await client.flush();
}
}
export {
trackInstallStarted,
trackInstallCompleted,
trackCompanyImported,
};

cli/src/version.ts Normal file
View File

@@ -0,0 +1,10 @@
import { createRequire } from "node:module";
type PackageJson = {
version?: string;
};
const require = createRequire(import.meta.url);
const pkg = require("../package.json") as PackageJson;
export const cliVersion = pkg.version ?? "0.0.0";

View File

@@ -2,7 +2,7 @@
 "extends": "../tsconfig.base.json",
 "compilerOptions": {
   "outDir": "dist",
-  "rootDir": "src"
+  "rootDir": ".."
 },
-"include": ["src"]
+"include": ["src", "../packages/shared/src"]
 }

View File

@@ -97,7 +97,7 @@ docker run --name paperclip \
 Or use Compose:
 ```sh
-docker compose -f docker-compose.quickstart.yml up --build
+docker compose -f docker/docker-compose.quickstart.yml up --build
 ```
 See `doc/DOCKER.md` for API key wiring (`OPENAI_API_KEY` / `ANTHROPIC_API_KEY`) and persistence details.
@@ -145,6 +145,8 @@ For `codex_local`, Paperclip also manages a per-company Codex home under the ins
 - `~/.paperclip/instances/default/companies/<company-id>/codex-home`
 If the `codex` CLI is not installed or not on `PATH`, `codex_local` agent runs fail at execution time with a clear adapter error.
+Quota polling uses a short-lived `codex app-server` subprocess: when `codex` cannot be spawned, that provider reports `ok: false` in aggregated quota results and the API server keeps running (it must not exit on a missing binary).
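The spawn-failure tolerance described above can be sketched with a probe of this shape (illustrative only — `probeBinary` is not the server's actual provider code):

```typescript
import { spawn } from "node:child_process";

// Resolve ok: false on spawn failure (e.g. ENOENT) instead of letting the
// error event crash the process; a clean zero exit counts as ok: true.
function probeBinary(cmd: string): Promise<{ ok: boolean; error?: string }> {
  return new Promise((resolve) => {
    const child = spawn(cmd, ["--version"], { stdio: "ignore" });
    child.once("error", (err) => resolve({ ok: false, error: String(err) }));
    child.once("exit", (code) => resolve({ ok: code === 0 }));
  });
}
```

Resolving instead of rejecting is what lets an aggregated result carry `ok: false` while the server stays up.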
## Worktree-local Instances
When developing from multiple git worktrees, do not point two Paperclip servers at the same embedded PostgreSQL data directory.
@@ -173,6 +175,8 @@ Seed modes:
After `worktree init`, both the server and the CLI auto-load the repo-local `.paperclip/.env` when run inside that worktree, so normal commands like `pnpm dev`, `paperclipai doctor`, and `paperclipai db:backup` stay scoped to the worktree instance.
Provisioned git worktrees also pause all seeded routines in the isolated worktree database by default. This prevents copied daily/cron routines from firing unexpectedly inside the new workspace instance during development.
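The same pause can be applied by hand through the `routines disable-all` command registered above (usage sketch; substitute your own company id):

```sh
# pause every non-archived routine for one company in the local instance
paperclipai routines disable-all --company-id <company-id> --json
```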
That repo-local env also sets:
- `PAPERCLIP_IN_WORKTREE=true`

View File

@@ -2,6 +2,28 @@
Run Paperclip in Docker without installing Node or pnpm locally.
All commands below assume you are in the **project root** (the directory containing `package.json`), not inside `docker/`.
## Building the image
```sh
docker build -t paperclip-local .
```
The Dockerfile installs common agent tools (`git`, `gh`, `curl`, `wget`, `ripgrep`, `python3`) and the Claude, Codex, and OpenCode CLIs.
Build arguments:
| Arg | Default | Purpose |
|-----|---------|---------|
| `USER_UID` | `1000` | UID for the container `node` user (match your host UID to avoid permission issues on bind mounts) |
| `USER_GID` | `1000` | GID for the container `node` group |
```sh
docker build -t paperclip-local \
--build-arg USER_UID=$(id -u) --build-arg USER_GID=$(id -g) .
```
## One-liner (build + run)
```sh
@@ -10,6 +32,7 @@ docker run --name paperclip \
-p 3100:3100 \
-e HOST=0.0.0.0 \
-e PAPERCLIP_HOME=/paperclip \
-e BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
-v "$(pwd)/data/docker-paperclip:/paperclip" \
paperclip-local
```
@@ -25,10 +48,15 @@ Data persistence:
 All persisted under your bind mount (`./data/docker-paperclip` in the example above).
-## Compose Quickstart
+## Docker Compose
### Quickstart (embedded SQLite)
Single container, no external database. Data persists via a bind mount.
 ```sh
-docker compose -f docker-compose.quickstart.yml up --build
+BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
+docker compose -f docker/docker-compose.quickstart.yml up --build
 ```
 Defaults:
@@ -39,11 +67,36 @@ Defaults:
 Optional overrides:
 ```sh
-PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=./data/pc docker compose -f docker-compose.quickstart.yml up --build
+PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=../data/pc \
+docker compose -f docker/docker-compose.quickstart.yml up --build
 ```
**Note:** `PAPERCLIP_DATA_DIR` is resolved relative to the compose file (`docker/`), so `../data/pc` maps to `data/pc` in the project root.
If you change host port or use a non-local domain, set `PAPERCLIP_PUBLIC_URL` to the external URL you will use in browser/auth flows.
Pass `OPENAI_API_KEY` and/or `ANTHROPIC_API_KEY` to enable local adapter runs.
### Full stack (with PostgreSQL)
Paperclip server + PostgreSQL 17. The database is health-checked before the server starts.
```sh
BETTER_AUTH_SECRET=$(openssl rand -hex 32) \
docker compose -f docker/docker-compose.yml up --build
```
PostgreSQL data persists in a named Docker volume (`pgdata`). Paperclip data persists in `paperclip-data`.
### Untrusted PR review
Isolated container for reviewing untrusted pull requests with Codex or Claude, without exposing your host machine. See `doc/UNTRUSTED-PR-REVIEW.md` for the full workflow.
```sh
docker compose -f docker/docker-compose.untrusted-review.yml build
docker compose -f docker/docker-compose.untrusted-review.yml run --rm --service-ports review
```
## Authenticated Compose (Single Public URL)
For authenticated deployments, set one canonical public URL and let Paperclip derive auth/callback defaults:
@@ -93,11 +146,71 @@ Notes:
 - Without API keys, the app still runs normally.
 - Adapter environment checks in Paperclip will surface missing auth/CLI prerequisites.
-## Untrusted PR Review Container
-If you want a separate Docker environment for reviewing untrusted pull requests with `codex` or `claude`, use the dedicated review workflow in `doc/UNTRUSTED-PR-REVIEW.md`.
-That setup keeps CLI auth state in Docker volumes instead of your host home directory and uses a separate scratch workspace for PR checkouts and preview runs.
+## Podman Quadlet (systemd)
+The `docker/quadlet/` directory contains unit files to run Paperclip + PostgreSQL as systemd services via Podman Quadlet.
+| File | Purpose |
+|------|---------|
| `docker/quadlet/paperclip.pod` | Pod definition — groups containers into a shared network namespace |
| `docker/quadlet/paperclip.container` | Paperclip server — joins the pod, connects to Postgres at `127.0.0.1` |
| `docker/quadlet/paperclip-db.container` | PostgreSQL 17 — joins the pod, health-checked |
### Setup
1. Build the image (see above).
2. Copy quadlet files to your systemd directory:
```sh
# Rootless (recommended)
cp docker/quadlet/*.pod docker/quadlet/*.container \
~/.config/containers/systemd/
# Or rootful
sudo cp docker/quadlet/*.pod docker/quadlet/*.container \
/etc/containers/systemd/
```
3. Create a secrets env file (keep out of version control):
```sh
cat > ~/.config/containers/systemd/paperclip.env <<EOL
BETTER_AUTH_SECRET=$(openssl rand -hex 32)
POSTGRES_USER=paperclip
POSTGRES_PASSWORD=paperclip
POSTGRES_DB=paperclip
DATABASE_URL=postgres://paperclip:paperclip@127.0.0.1:5432/paperclip
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-...
EOL
```
4. Create the data directory and start:
```sh
mkdir -p ~/.local/share/paperclip
systemctl --user daemon-reload
systemctl --user start paperclip-pod
```
### Quadlet management
```sh
journalctl --user -u paperclip -f # App logs
journalctl --user -u paperclip-db -f # DB logs
systemctl --user status paperclip-pod # Pod status
systemctl --user restart paperclip-pod # Restart all
systemctl --user stop paperclip-pod # Stop all
```
### Quadlet notes
- **First boot**: Unlike Docker Compose's `condition: service_healthy`, Quadlet's `After=` only waits for the DB unit to *start*, not for PostgreSQL to be ready. On a cold first boot you may see one or two restart attempts in `journalctl --user -u paperclip` while PostgreSQL initialises — this is expected and resolves automatically via `Restart=on-failure`.
- Containers in a pod share `localhost`, so Paperclip reaches Postgres at `127.0.0.1:5432`.
- PostgreSQL data persists in the `paperclip-pgdata` named volume.
- Paperclip data persists at `~/.local/share/paperclip`.
- For rootful quadlet deployment, remove `%h` prefixes and use absolute paths.
## Onboard Smoke Test (Ubuntu + npm only)
@@ -133,4 +246,9 @@ Notes:
 - In authenticated mode, the smoke script defaults `SMOKE_AUTO_BOOTSTRAP=true` and drives the real bootstrap path automatically: it signs up a real user, runs `paperclipai auth bootstrap-ceo` inside the container to mint a real bootstrap invite, accepts that invite over HTTP, and verifies board session access.
 - Run the script in the foreground to watch the onboarding flow; stop with `Ctrl+C` after validation.
 - Set `SMOKE_DETACH=true` to leave the container running for automation and optionally write shell-ready metadata to `SMOKE_METADATA_FILE`.
-- The image definition is in `Dockerfile.onboard-smoke`.
+- The image definition is in `docker/Dockerfile.onboard-smoke`.
## General Notes
- The `docker-entrypoint.sh` adjusts the container `node` user UID/GID at startup to match the values passed via `USER_UID`/`USER_GID`, avoiding permission issues on bind-mounted volumes.
- Paperclip data persists via Docker volumes/bind mounts (compose) or at `~/.local/share/paperclip` (quadlet).

View File

@@ -76,6 +76,45 @@ The `ui` package uses [`scripts/generate-ui-package-json.mjs`](../scripts/genera
After packing or publishing, `postpack` restores the development manifest automatically.
### Manual first publish for `@paperclipai/ui`
If you need to publish only the UI package once by hand, use the real package name:
- `@paperclipai/ui`
Recommended flow from the repo root:
```bash
# optional sanity check: this 404s until the first publish exists
npm view @paperclipai/ui version
# make sure the dist payload is fresh
pnpm --filter @paperclipai/ui build
# confirm your local npm auth before the real publish
npm whoami
# safe preview of the exact publish payload
cd ui
pnpm publish --dry-run --no-git-checks --access public
# real publish
pnpm publish --no-git-checks --access public
```
Notes:
- Publish from `ui/`, not the repo root.
- `prepack` automatically rewrites `ui/package.json` to the lean publish manifest, and `postpack` restores the dev manifest after the command finishes.
- If `npm view @paperclipai/ui version` already returns the same version that is in [`ui/package.json`](../ui/package.json), do not republish. Bump the version or use the normal repo-wide release flow in [`scripts/release.sh`](../scripts/release.sh).
If the first real publish returns npm `E404`, check npm-side prerequisites before retrying:
- `npm whoami` must succeed first. An expired or missing npm login will block the publish.
- For an organization-scoped package like `@paperclipai/ui`, the `paperclipai` npm organization must exist and the publisher must be a member with permission to publish to that scope.
- The initial publish must include `--access public` for a public scoped package.
- npm also requires either account 2FA for publishing or a granular token that is allowed to bypass 2FA.
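Those prerequisites can be checked from the shell before retrying (a sketch; both commands need a valid npm login):

```sh
npm whoami              # fails fast when the npm login is missing or expired
npm org ls paperclipai  # confirms the org exists and shows your role in it
```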
## Version formats
Paperclip uses calendar versions:

View File

@@ -16,14 +16,14 @@ By default this workflow does **not** mount your host repo checkout, your host h
 ## Files
 - `docker/untrusted-review/Dockerfile`
-- `docker-compose.untrusted-review.yml`
+- `docker/docker-compose.untrusted-review.yml`
 - `review-checkout-pr` inside the container
 ## Build and start a shell
 ```sh
-docker compose -f docker-compose.untrusted-review.yml build
-docker compose -f docker-compose.untrusted-review.yml run --rm --service-ports review
+docker compose -f docker/docker-compose.untrusted-review.yml build
+docker compose -f docker/docker-compose.untrusted-review.yml run --rm --service-ports review
 ```
 That opens an interactive shell in the review container with:
@@ -47,7 +47,7 @@ claude login
 If you prefer API-key auth instead of CLI login, pass keys through Compose env:
 ```sh
-OPENAI_API_KEY=... ANTHROPIC_API_KEY=... docker compose -f docker-compose.untrusted-review.yml run --rm review
+OPENAI_API_KEY=... ANTHROPIC_API_KEY=... docker compose -f docker/docker-compose.untrusted-review.yml run --rm review
 ```
 ## Check out a PR safely
@@ -117,7 +117,7 @@ Notes:
 Remove the review container volumes when you want a clean environment:
 ```sh
-docker compose -f docker-compose.untrusted-review.yml down -v
+docker compose -f docker/docker-compose.untrusted-review.yml down -v
 ```
 That deletes:

View File

@@ -249,7 +249,7 @@ Runs local `claude` CLI directly.
   "cwd": "/absolute/or/relative/path",
   "promptTemplate": "You are agent {{agent.id}} ...",
   "model": "optional-model-id",
-  "maxTurnsPerRun": 300,
+  "maxTurnsPerRun": 1000,
   "dangerouslySkipPermissions": true,
   "env": {"KEY": "VALUE"},
   "extraArgs": [],

View File

@@ -1,7 +1,7 @@
 services:
   paperclip:
     build:
-      context: .
+      context: ..
       dockerfile: Dockerfile
     ports:
       - "${PAPERCLIP_PORT:-3100}:3100"
@@ -15,4 +15,4 @@ services:
       PAPERCLIP_PUBLIC_URL: "${PAPERCLIP_PUBLIC_URL:-http://localhost:3100}"
       BETTER_AUTH_SECRET: "${BETTER_AUTH_SECRET:?BETTER_AUTH_SECRET must be set}"
     volumes:
-      - "${PAPERCLIP_DATA_DIR:-./data/docker-paperclip}:/paperclip"
+      - "${PAPERCLIP_DATA_DIR:-../data/docker-paperclip}:/paperclip"

View File

@@ -1,7 +1,7 @@
 services:
   review:
     build:
-      context: .
+      context: ..
       dockerfile: docker/untrusted-review/Dockerfile
     init: true
     tty: true

View File

@@ -16,7 +16,9 @@ services:
       - pgdata:/var/lib/postgresql/data
   server:
-    build: .
+    build:
+      context: ..
+      dockerfile: Dockerfile
     ports:
       - "3100:3100"
     environment:

View File

@@ -0,0 +1,20 @@
[Unit]
Description=PostgreSQL for Paperclip
[Container]
Image=docker.io/library/postgres:17-alpine
ContainerName=paperclip-db
Pod=paperclip.pod
Volume=paperclip-pgdata:/var/lib/postgresql/data
EnvironmentFile=%h/.config/containers/systemd/paperclip.env
HealthCmd=pg_isready -U $POSTGRES_USER -d $POSTGRES_DB -h localhost || exit 1
HealthInterval=15s
HealthTimeout=5s
HealthRetries=5
[Service]
Restart=on-failure
TimeoutStartSec=60
[Install]
WantedBy=default.target

View File

@@ -0,0 +1,23 @@
[Unit]
Description=Paperclip AI Agent Orchestrator
Requires=paperclip-db.service
After=paperclip-db.service
[Container]
Image=paperclip-local
ContainerName=paperclip
Pod=paperclip.pod
Volume=%h/.local/share/paperclip:/paperclip:Z
Environment=HOST=0.0.0.0
Environment=PAPERCLIP_HOME=/paperclip
Environment=PAPERCLIP_DEPLOYMENT_MODE=authenticated
Environment=PAPERCLIP_DEPLOYMENT_EXPOSURE=private
Environment=PAPERCLIP_PUBLIC_URL=http://localhost:3100
EnvironmentFile=%h/.config/containers/systemd/paperclip.env
[Service]
Restart=on-failure
TimeoutStartSec=120
[Install]
WantedBy=default.target

View File

@@ -0,0 +1,3 @@
[Pod]
PodName=paperclip
PublishPort=3100:3100

View File

@@ -20,7 +20,7 @@ The `claude_local` adapter runs Anthropic's Claude Code CLI locally. It supports
 | `env` | object | No | Environment variables (supports secret refs) |
 | `timeoutSec` | number | No | Process timeout (0 = no timeout) |
 | `graceSec` | number | No | Grace period before force-kill |
-| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `300`) |
+| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `1000`) |
 | `dangerouslySkipPermissions` | boolean | No | Skip permission prompts (dev only) |
 ## Prompt Templates


@@ -8,7 +8,7 @@ Run Paperclip in Docker without installing Node or pnpm locally.
## Compose Quickstart (Recommended)
```sh
docker compose -f docker/docker-compose.quickstart.yml up --build
```
Open [http://localhost:3100](http://localhost:3100).
@@ -21,10 +21,12 @@ Defaults:
Override with environment variables:
```sh
PAPERCLIP_PORT=3200 PAPERCLIP_DATA_DIR=../data/pc \
docker compose -f docker/docker-compose.quickstart.yml up --build
```
**Note:** `PAPERCLIP_DATA_DIR` is resolved relative to the compose file (`docker/`), so `../data/pc` maps to `data/pc` in the project root.
## Manual Docker Build
```sh

docs/feedback-voting.md (new file, 189 lines)

@@ -0,0 +1,189 @@
# Feedback Voting — Local Data Guide
When you rate an agent's response with **Helpful** (thumbs up) or **Needs work** (thumbs down), Paperclip saves your vote locally alongside your running instance. This guide covers what gets stored, how to access it, and how to export it.
## How voting works
1. Click **Helpful** or **Needs work** on any agent comment or document revision.
2. If you click **Needs work**, an optional text prompt appears: _"What could have been better?"_ You can type a reason or dismiss it.
3. A consent dialog asks whether to keep the vote local or share it. Your choice is remembered for future votes.
### What gets stored
Each vote creates two local records:
| Record | What it contains |
|--------|-----------------|
| **Vote** | Your vote (up/down), optional reason text, sharing preference, consent version, timestamp |
| **Trace bundle** | Full context snapshot: the voted-on comment/revision text, issue title, agent info, your vote, and reason — everything needed to understand the feedback in isolation |
All data lives in your local Paperclip database. Nothing leaves your machine unless you explicitly choose to share.
When a vote is marked for sharing, Paperclip also queues the trace bundle for background export through the Telemetry Backend. The app server never uploads raw feedback trace bundles directly to object storage.
## Viewing your votes
### Quick report (terminal)
```bash
pnpm paperclipai feedback report
```
Shows a color-coded summary: vote counts, per-trace details with reasons, and export statuses.
```bash
# Installed CLI
paperclipai feedback report
# Point to a different server or company
pnpm paperclipai feedback report --api-base http://127.0.0.1:3000 --company-id <company-id>
# Include raw payload dumps in the report
pnpm paperclipai feedback report --payloads
```
### API endpoints
All endpoints require board-user access (automatic in local dev).
**List votes for an issue:**
```bash
curl http://127.0.0.1:3102/api/issues/<issueId>/feedback-votes
```
**List trace bundles for an issue (with full payloads):**
```bash
curl 'http://127.0.0.1:3102/api/issues/<issueId>/feedback-traces?includePayload=true'
```
**List all traces company-wide:**
```bash
curl 'http://127.0.0.1:3102/api/companies/<companyId>/feedback-traces?includePayload=true'
```
**Get a single trace envelope record:**
```bash
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>
```
**Get the full export bundle for a trace:**
```bash
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>/bundle
```
#### Filtering
The trace endpoints accept query parameters:
| Parameter | Values | Description |
|-----------|--------|-------------|
| `vote` | `up`, `down` | Filter by vote direction |
| `status` | `local_only`, `pending`, `sent`, `failed` | Filter by export status |
| `targetType` | `issue_comment`, `issue_document_revision` | Filter by what was voted on |
| `sharedOnly` | `true` | Only show votes the user chose to share |
| `includePayload` | `true` | Include the full context snapshot |
| `from` / `to` | ISO date | Date range filter |
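
As a sketch, the filter parameters above compose into an ordinary query string. The `buildTraceQuery` helper below is hypothetical (it is not part of the Paperclip CLI or API); it only illustrates how the documented parameters combine:

```typescript
// Hypothetical helper: assemble a feedback-trace query string from the
// filter parameters in the table above. Not part of Paperclip's codebase.
type TraceFilter = {
  vote?: "up" | "down";
  status?: "local_only" | "pending" | "sent" | "failed";
  targetType?: "issue_comment" | "issue_document_revision";
  sharedOnly?: boolean;
  includePayload?: boolean;
  from?: string; // ISO date
  to?: string;   // ISO date
};

function buildTraceQuery(filter: TraceFilter): string {
  const params = new URLSearchParams();
  if (filter.vote) params.set("vote", filter.vote);
  if (filter.status) params.set("status", filter.status);
  if (filter.targetType) params.set("targetType", filter.targetType);
  if (filter.sharedOnly) params.set("sharedOnly", "true");
  if (filter.includePayload) params.set("includePayload", "true");
  if (filter.from) params.set("from", filter.from);
  if (filter.to) params.set("to", filter.to);
  const query = params.toString();
  return query ? `?${query}` : "";
}

// Example: all shared thumbs-down votes still waiting to be sent.
buildTraceQuery({ vote: "down", status: "pending", includePayload: true });
// → "?vote=down&status=pending&includePayload=true"
```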
## Exporting your data
### Export to files + zip
```bash
pnpm paperclipai feedback export
```
Creates a timestamped directory with:
```
feedback-export-20260331T120000Z/
index.json # manifest with summary stats
votes/
PAP-123-a1b2c3d4.json # vote metadata (one per vote)
traces/
PAP-123-e5f6g7h8.json # Paperclip feedback envelope (one per trace)
full-traces/
PAP-123-e5f6g7h8/
bundle.json # full export manifest for the trace
...raw adapter files # codex / claude / opencode session artifacts when available
feedback-export-20260331T120000Z.zip
```
Exports are full by default. `traces/` keeps the Paperclip envelope, while `full-traces/` contains the richer per-trace bundle plus any recoverable adapter-native files.
```bash
# Custom server and output directory
pnpm paperclipai feedback export --api-base http://127.0.0.1:3000 --company-id <company-id> --out ./my-export
```
### Reading an exported trace
Open any file in `traces/` to see:
```json
{
"id": "trace-uuid",
"vote": "down",
"issueIdentifier": "PAP-123",
"issueTitle": "Fix login timeout",
"targetType": "issue_comment",
"targetSummary": {
"label": "Comment",
"excerpt": "The first 80 chars of the comment that was voted on..."
},
"payloadSnapshot": {
"vote": {
"value": "down",
"reason": "Did not address the root cause"
},
"target": {
"body": "Full text of the agent comment..."
},
"issue": {
"identifier": "PAP-123",
"title": "Fix login timeout"
}
}
}
```
Open `full-traces/<issue>-<trace>/bundle.json` to see the expanded export metadata, including capture notes, adapter type, integrity metadata, and the inventory of raw files written alongside it.
Built-in local adapters now export their native session artifacts more directly:
- `codex_local`: `adapter/codex/session.jsonl`
- `claude_local`: `adapter/claude/session.jsonl`, plus any `adapter/claude/session/...` sidecar files and `adapter/claude/debug.txt` when present
- `opencode_local`: `adapter/opencode/session.json`, `adapter/opencode/messages/*.json`, and `adapter/opencode/parts/<messageId>/*.json`, with optional `project.json`, `todo.json`, and `session-diff.json`
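
The adapter-to-artifact mapping above can be summarized as a simple lookup. The sketch below is a simplification (real exports also include the sidecar files and message parts listed above, discovered at export time), shown only to make the primary artifact per adapter explicit:

```typescript
// Illustrative map of each adapter's primary session artifact, as listed
// above. Simplified: sidecar files and message parts are omitted.
const primaryAdapterArtifact: Record<string, string> = {
  codex_local: "adapter/codex/session.jsonl",
  claude_local: "adapter/claude/session.jsonl",
  opencode_local: "adapter/opencode/session.json",
};

function primaryArtifactFor(adapterType: string): string | undefined {
  return primaryAdapterArtifact[adapterType];
}
```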
## Sharing preferences
The first time you vote, a consent dialog asks:
- **Keep local** — vote is stored locally only (`sharedWithLabs: false`)
- **Share this vote** — vote is marked for sharing (`sharedWithLabs: true`)
Your preference is saved per-company. You can change it any time via the feedback settings. Votes marked "keep local" are never queued for export.
## Data lifecycle
| Status | Meaning |
|--------|---------|
| `local_only` | Vote stored locally, not marked for sharing |
| `pending` | Marked for sharing, waiting to be sent |
| `sent` | Successfully transmitted |
| `failed` | Transmission attempted but failed (will retry) |
Your local database always retains the full vote and trace data regardless of sharing status.
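
The lifecycle above can be sketched as a small state machine. The transition function below is illustrative only; names mirror the status table, but the logic is not Paperclip's implementation:

```typescript
// Hedged sketch of the export-status lifecycle from the table above.
type ExportStatus = "local_only" | "pending" | "sent" | "failed";

// Only shared votes ever enter the upload queue; failed uploads stay
// retryable, and local-only votes never leave that state.
function nextStatusAfterUpload(current: ExportStatus, uploadOk: boolean): ExportStatus {
  if (current === "local_only") return "local_only"; // never exported
  if (current === "sent") return "sent";             // terminal success
  return uploadOk ? "sent" : "failed";               // pending/failed retry path
}
```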
## Remote sync
Votes you choose to share are queued as `pending` traces and flushed by the server's background worker to the Telemetry Backend. The Telemetry Backend validates the request, then persists the bundle into its configured object storage.
- App server responsibility: build the bundle, POST it to Telemetry Backend, update trace status
- Telemetry Backend responsibility: authenticate the request, validate payload shape, compress/store the bundle, return the final object key
- Retry behavior: failed uploads move to `failed` with an error message in `failureReason`, and the worker retries them on later ticks
Exported objects use a deterministic key pattern so they are easy to inspect:
```text
feedback-traces/<companyId>/YYYY/MM/DD/<exportId-or-traceId>.json
```
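
A key following that pattern can be built deterministically from the trace's company, capture date, and id. The helper below is a hypothetical sketch (the real Telemetry Backend may compose the key differently, e.g. preferring an export id):

```typescript
// Hypothetical sketch of the deterministic object-key pattern shown above.
function feedbackTraceObjectKey(companyId: string, capturedAt: Date, id: string): string {
  const yyyy = String(capturedAt.getUTCFullYear());
  const mm = String(capturedAt.getUTCMonth() + 1).padStart(2, "0");
  const dd = String(capturedAt.getUTCDate()).padStart(2, "0");
  return `feedback-traces/${companyId}/${yyyy}/${mm}/${dd}/${id}.json`;
}
```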


@@ -17,6 +17,27 @@ function asErrorText(value: unknown): string {
}
}
function printToolResult(block: Record<string, unknown>): void {
const isError = block.is_error === true;
let text = "";
if (typeof block.content === "string") {
text = block.content;
} else if (Array.isArray(block.content)) {
const parts: string[] = [];
for (const part of block.content) {
if (typeof part !== "object" || part === null || Array.isArray(part)) continue;
const record = part as Record<string, unknown>;
if (typeof record.text === "string") parts.push(record.text);
}
text = parts.join("\n");
}
console.log((isError ? pc.red : pc.cyan)(`tool_result${isError ? " (error)" : ""}`));
if (text) {
console.log((isError ? pc.red : pc.gray)(text));
}
}
export function printClaudeStreamEvent(raw: string, debug: boolean): void {
const line = raw.trim();
if (!line) return;
@@ -51,6 +72,9 @@ export function printClaudeStreamEvent(raw: string, debug: boolean): void {
if (blockType === "text") {
const text = typeof block.text === "string" ? block.text : "";
if (text) console.log(pc.green(`assistant: ${text}`));
} else if (blockType === "thinking") {
const text = typeof block.thinking === "string" ? block.thinking : "";
if (text) console.log(pc.gray(`thinking: ${text}`));
} else if (blockType === "tool_use") {
const name = typeof block.name === "string" ? block.name : "unknown";
console.log(pc.yellow(`tool_call: ${name}`));
@@ -62,6 +86,22 @@ export function printClaudeStreamEvent(raw: string, debug: boolean): void {
return;
}
if (type === "user") {
const message =
typeof parsed.message === "object" && parsed.message !== null && !Array.isArray(parsed.message)
? (parsed.message as Record<string, unknown>)
: {};
const content = Array.isArray(message.content) ? message.content : [];
for (const blockRaw of content) {
if (typeof blockRaw !== "object" || blockRaw === null || Array.isArray(blockRaw)) continue;
const block = blockRaw as Record<string, unknown>;
if (typeof block.type === "string" && block.type === "tool_result") {
printToolResult(block);
}
}
return;
}
if (type === "result") {
const usage =
typeof parsed.usage === "object" && parsed.usage !== null && !Array.isArray(parsed.usage)


@@ -24,7 +24,7 @@ Core fields:
- cwd (string, optional): default absolute working directory fallback for the agent process (created if missing when possible)
- instructionsFilePath (string, optional): absolute path to a markdown instructions file prepended to stdin prompt at runtime
- model (string, optional): Codex model id
- modelReasoningEffort (string, optional): reasoning effort override (minimal|low|medium|high|xhigh) passed via -c model_reasoning_effort=...
- promptTemplate (string, optional): run prompt template
- search (boolean, optional): run codex with --search
- dangerouslyBypassApprovalsAndSandbox (boolean, optional): run with bypass flag


@@ -0,0 +1,85 @@
import { EventEmitter } from "node:events";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import type { ChildProcess } from "node:child_process";
import { describe, expect, it, vi, beforeEach, afterEach } from "vitest";
const { mockSpawn } = vi.hoisted(() => ({
mockSpawn: vi.fn(),
}));
vi.mock("node:child_process", async (importOriginal) => {
const cp = await importOriginal<typeof import("node:child_process")>();
return {
...cp,
spawn: (...args: Parameters<typeof cp.spawn>) => mockSpawn(...args) as ReturnType<typeof cp.spawn>,
};
});
import { getQuotaWindows } from "./quota.js";
function createChildThatErrorsOnMicrotask(err: Error): ChildProcess {
const child = new EventEmitter() as ChildProcess;
const stream = Object.assign(new EventEmitter(), {
setEncoding: () => {},
});
Object.assign(child, {
stdout: stream,
stderr: Object.assign(new EventEmitter(), { setEncoding: () => {} }),
stdin: { write: vi.fn(), end: vi.fn() },
kill: vi.fn(),
});
queueMicrotask(() => {
child.emit("error", err);
});
return child;
}
describe("CodexRpcClient spawn failures", () => {
let previousCodexHome: string | undefined;
let isolatedCodexHome: string | undefined;
beforeEach(() => {
mockSpawn.mockReset();
// After the RPC path fails, getQuotaWindows() calls readCodexToken() which
// reads $CODEX_HOME/auth.json (default ~/.codex). Point CODEX_HOME at an
// empty temp directory so we never hit real host auth or the WHAM network.
previousCodexHome = process.env.CODEX_HOME;
isolatedCodexHome = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-codex-spawn-test-"));
process.env.CODEX_HOME = isolatedCodexHome;
});
afterEach(() => {
if (isolatedCodexHome) {
try {
fs.rmSync(isolatedCodexHome, { recursive: true, force: true });
} catch {
/* ignore */
}
isolatedCodexHome = undefined;
}
if (previousCodexHome === undefined) {
delete process.env.CODEX_HOME;
} else {
process.env.CODEX_HOME = previousCodexHome;
}
});
it("does not crash the process when codex is missing; getQuotaWindows returns ok: false", async () => {
const enoent = Object.assign(new Error("spawn codex ENOENT"), {
code: "ENOENT",
errno: -2,
syscall: "spawn codex",
path: "codex",
});
mockSpawn.mockImplementation(() => createChildThatErrorsOnMicrotask(enoent));
const result = await getQuotaWindows();
expect(result.ok).toBe(false);
expect(result.windows).toEqual([]);
expect(result.error).toContain("Codex app-server");
expect(result.error).toContain("spawn codex ENOENT");
});
});


@@ -432,6 +432,13 @@ class CodexRpcClient {
}
this.pending.clear();
});
this.proc.on("error", (err: Error) => {
for (const request of this.pending.values()) {
clearTimeout(request.timer);
request.reject(err);
}
this.pending.clear();
});
}
private onStdout(chunk: string) {


@@ -0,0 +1,7 @@
import { defineConfig } from "vitest/config";
export default defineConfig({
test: {
environment: "node",
},
});


@@ -1,7 +1,15 @@
export const type = "opencode_local";
export const label = "OpenCode (local)";
export const DEFAULT_OPENCODE_LOCAL_MODEL = "openai/gpt-5.2-codex";
export const models: Array<{ id: string; label: string }> = [
{ id: DEFAULT_OPENCODE_LOCAL_MODEL, label: DEFAULT_OPENCODE_LOCAL_MODEL },
{ id: "openai/gpt-5.4", label: "openai/gpt-5.4" },
{ id: "openai/gpt-5.2", label: "openai/gpt-5.2" },
{ id: "openai/gpt-5.1-codex-max", label: "openai/gpt-5.1-codex-max" },
{ id: "openai/gpt-5.1-codex-mini", label: "openai/gpt-5.1-codex-mini" },
];
export const agentConfigurationDoc = `# opencode_local agent configuration
@@ -21,7 +29,7 @@ Core fields:
- cwd (string, optional): default absolute working directory fallback for the agent process (created if missing when possible)
- instructionsFilePath (string, optional): absolute path to a markdown instructions file prepended to the run prompt
- model (string, required): OpenCode model id in provider/model format (for example anthropic/claude-sonnet-4-5)
- variant (string, optional): provider-specific reasoning/profile variant passed as --variant (for example minimal|low|medium|high|xhigh|max)
- dangerouslySkipPermissions (boolean, optional): inject a runtime OpenCode config that allows \`external_directory\` access without interactive prompts; defaults to true for unattended Paperclip runs
- promptTemplate (string, optional): run prompt template
- command (string, optional): defaults to "opencode"


@@ -35,11 +35,12 @@
"dist"
],
"scripts": {
"check:migrations": "tsx src/check-migration-numbering.ts",
"build": "pnpm run check:migrations && tsc && cp -r src/migrations dist/migrations",
"clean": "rm -rf dist",
"typecheck": "pnpm run check:migrations && tsc --noEmit",
"generate": "pnpm run check:migrations && tsc -p tsconfig.json && drizzle-kit generate",
"migrate": "pnpm run check:migrations && tsx src/migrate.ts",
"seed": "tsx src/seed.ts"
},
"dependencies": {


@@ -0,0 +1,179 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import postgres from "postgres";
import { createBufferedTextFileWriter, runDatabaseBackup, runDatabaseRestore } from "./backup-lib.js";
import { ensurePostgresDatabase } from "./client.js";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./test-embedded-postgres.js";
const cleanups: Array<() => Promise<void> | void> = [];
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
function createTempDir(prefix: string): string {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), prefix));
cleanups.push(() => {
fs.rmSync(dir, { recursive: true, force: true });
});
return dir;
}
async function createTempDatabase(): Promise<string> {
const db = await startEmbeddedPostgresTestDatabase("paperclip-db-backup-");
cleanups.push(db.cleanup);
return db.connectionString;
}
async function createSiblingDatabase(connectionString: string, databaseName: string): Promise<string> {
const adminUrl = new URL(connectionString);
adminUrl.pathname = "/postgres";
await ensurePostgresDatabase(adminUrl.toString(), databaseName);
const targetUrl = new URL(connectionString);
targetUrl.pathname = `/${databaseName}`;
return targetUrl.toString();
}
afterEach(async () => {
while (cleanups.length > 0) {
const cleanup = cleanups.pop();
await cleanup?.();
}
});
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres backup tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describe("createBufferedTextFileWriter", () => {
it("preserves line boundaries across buffered flushes", async () => {
const tempDir = createTempDir("paperclip-buffered-writer-");
const outputPath = path.join(tempDir, "backup.sql");
const writer = createBufferedTextFileWriter(outputPath, 16);
const lines = [
"-- header",
"BEGIN;",
"",
"INSERT INTO test VALUES (1);",
"-- footer",
];
for (const line of lines) {
writer.emit(line);
}
await writer.close();
expect(fs.readFileSync(outputPath, "utf8")).toBe(lines.join("\n"));
});
});
describeEmbeddedPostgres("runDatabaseBackup", () => {
it(
"backs up and restores large table payloads without materializing one giant string",
async () => {
const sourceConnectionString = await createTempDatabase();
const restoreConnectionString = await createSiblingDatabase(
sourceConnectionString,
"paperclip_restore_target",
);
const backupDir = createTempDir("paperclip-db-backup-output-");
const sourceSql = postgres(sourceConnectionString, { max: 1, onnotice: () => {} });
const restoreSql = postgres(restoreConnectionString, { max: 1, onnotice: () => {} });
try {
await sourceSql.unsafe(`
CREATE TYPE "public"."backup_test_state" AS ENUM ('pending', 'done');
`);
await sourceSql.unsafe(`
CREATE TABLE "public"."backup_test_records" (
"id" serial PRIMARY KEY,
"title" text NOT NULL,
"payload" text NOT NULL,
"state" "public"."backup_test_state" NOT NULL,
"metadata" jsonb,
"created_at" timestamptz NOT NULL DEFAULT now()
);
`);
const payload = "x".repeat(8192);
for (let index = 0; index < 160; index += 1) {
const createdAt = new Date(Date.UTC(2026, 0, 1, 0, 0, index));
await sourceSql`
INSERT INTO "public"."backup_test_records" (
"title",
"payload",
"state",
"metadata",
"created_at"
)
VALUES (
${`row-${index}`},
${payload},
${index % 2 === 0 ? "pending" : "done"}::"public"."backup_test_state",
${JSON.stringify({ index, even: index % 2 === 0 })}::jsonb,
${createdAt}
)
`;
}
const result = await runDatabaseBackup({
connectionString: sourceConnectionString,
backupDir,
retentionDays: 7,
filenamePrefix: "paperclip-test",
});
expect(result.backupFile).toMatch(/paperclip-test-.*\.sql$/);
expect(result.sizeBytes).toBeGreaterThan(1024 * 1024);
expect(fs.existsSync(result.backupFile)).toBe(true);
await runDatabaseRestore({
connectionString: restoreConnectionString,
backupFile: result.backupFile,
});
const counts = await restoreSql.unsafe<{ count: number }[]>(`
SELECT count(*)::int AS count
FROM "public"."backup_test_records"
`);
expect(counts[0]?.count).toBe(160);
const sampleRows = await restoreSql.unsafe<{
title: string;
payload: string;
state: string;
metadata: { index: number; even: boolean };
}[]>(`
SELECT "title", "payload", "state"::text AS "state", "metadata"
FROM "public"."backup_test_records"
WHERE "title" IN ('row-0', 'row-159')
ORDER BY "title"
`);
expect(sampleRows).toEqual([
{
title: "row-0",
payload,
state: "pending",
metadata: { index: 0, even: true },
},
{
title: "row-159",
payload,
state: "done",
metadata: { index: 159, even: false },
},
]);
} finally {
await sourceSql.end();
await restoreSql.end();
}
},
60_000,
);
});


@@ -1,5 +1,5 @@
import { createWriteStream, existsSync, mkdirSync, readdirSync, statSync, unlinkSync } from "node:fs";
import { readFile } from "node:fs/promises";
import { basename, resolve } from "node:path";
import postgres from "postgres"; import postgres from "postgres";
@@ -47,6 +47,7 @@ type TableDefinition = {
const DRIZZLE_SCHEMA = "drizzle";
const DRIZZLE_MIGRATIONS_TABLE = "__drizzle_migrations";
const DEFAULT_BACKUP_WRITE_BUFFER_BYTES = 1024 * 1024;
const STATEMENT_BREAKPOINT = "-- paperclip statement breakpoint 69f6f3f1-42fd-46a6-bf17-d1d85f8f3900";
@@ -141,6 +142,102 @@ function tableKey(schemaName: string, tableName: string): string {
return `${schemaName}.${tableName}`;
}
export function createBufferedTextFileWriter(filePath: string, maxBufferedBytes = DEFAULT_BACKUP_WRITE_BUFFER_BYTES) {
const stream = createWriteStream(filePath, { encoding: "utf8" });
const flushThreshold = Math.max(1, Math.trunc(maxBufferedBytes));
let bufferedLines: string[] = [];
let bufferedBytes = 0;
let firstChunk = true;
let closed = false;
let streamError: Error | null = null;
let pendingWrite = Promise.resolve();
stream.on("error", (error) => {
streamError = error;
});
const writeChunk = async (chunk: string): Promise<void> => {
if (streamError) throw streamError;
const canContinue = stream.write(chunk);
if (!canContinue) {
await new Promise<void>((resolve, reject) => {
const handleDrain = () => {
cleanup();
resolve();
};
const handleError = (error: Error) => {
cleanup();
reject(error);
};
const cleanup = () => {
stream.off("drain", handleDrain);
stream.off("error", handleError);
};
stream.once("drain", handleDrain);
stream.once("error", handleError);
});
}
if (streamError) throw streamError;
};
const flushBufferedLines = () => {
if (bufferedLines.length === 0) return;
const linesToWrite = bufferedLines;
bufferedLines = [];
bufferedBytes = 0;
const chunkBody = linesToWrite.join("\n");
const chunk = firstChunk ? chunkBody : `\n${chunkBody}`;
firstChunk = false;
pendingWrite = pendingWrite.then(() => writeChunk(chunk));
};
return {
emit(line: string) {
if (closed) {
throw new Error(`Cannot write to closed backup file: ${filePath}`);
}
if (streamError) throw streamError;
bufferedLines.push(line);
bufferedBytes += Buffer.byteLength(line, "utf8") + 1;
if (bufferedBytes >= flushThreshold) {
flushBufferedLines();
}
},
async close() {
if (closed) return;
closed = true;
flushBufferedLines();
await pendingWrite;
await new Promise<void>((resolve, reject) => {
if (streamError) {
reject(streamError);
return;
}
stream.end((error?: Error | null) => {
if (error) reject(error);
else resolve();
});
});
if (streamError) throw streamError;
},
async abort() {
if (closed) return;
closed = true;
bufferedLines = [];
bufferedBytes = 0;
stream.destroy();
await pendingWrite.catch(() => {});
if (existsSync(filePath)) {
try {
unlinkSync(filePath);
} catch {
// Preserve the original backup failure if temporary file cleanup also fails.
}
}
},
};
}
export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise<RunDatabaseBackupResult> {
const filenamePrefix = opts.filenamePrefix ?? "paperclip";
const retentionDays = Math.max(1, Math.trunc(opts.retentionDays));
@@ -149,12 +246,14 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
const excludedTableNames = normalizeTableNameSet(opts.excludeTables);
const nullifiedColumnsByTable = normalizeNullifyColumnMap(opts.nullifyColumns);
const sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
mkdirSync(opts.backupDir, { recursive: true });
const backupFile = resolve(opts.backupDir, `${filenamePrefix}-${timestamp()}.sql`);
const writer = createBufferedTextFileWriter(backupFile);
try {
await sql`SELECT 1`;
const emit = (line: string) => writer.emit(line);
const emitStatement = (statement: string) => {
emit(statement);
emit(STATEMENT_BREAKPOINT);
@@ -503,10 +602,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
emitStatement("COMMIT;");
emit("");
await writer.close();
const sizeBytes = statSync(backupFile).size;
const prunedCount = pruneOldBackups(opts.backupDir, retentionDays, filenamePrefix);
@@ -516,6 +612,9 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
sizeBytes,
prunedCount,
};
} catch (error) {
await writer.abort();
throw error;
} finally {
await sql.end();
}


@@ -0,0 +1,89 @@
import { readdir, readFile } from "node:fs/promises";
import { fileURLToPath } from "node:url";
const migrationsDir = fileURLToPath(new URL("./migrations", import.meta.url));
const journalPath = fileURLToPath(new URL("./migrations/meta/_journal.json", import.meta.url));
type JournalFile = {
entries?: Array<{
idx?: number;
tag?: string;
}>;
};
function migrationNumber(value: string): string | null {
const match = value.match(/^(\d{4})_/);
return match ? match[1] : null;
}
function ensureNoDuplicates(values: string[], label: string) {
const seen = new Map<string, string>();
for (const value of values) {
const number = migrationNumber(value);
if (!number) {
throw new Error(`${label} entry does not start with a 4-digit migration number: ${value}`);
}
const existing = seen.get(number);
if (existing) {
throw new Error(`Duplicate migration number ${number} in ${label}: ${existing}, ${value}`);
}
seen.set(number, value);
}
}
function ensureStrictlyOrdered(values: string[], label: string) {
const sorted = [...values].sort();
for (let index = 0; index < values.length; index += 1) {
if (values[index] !== sorted[index]) {
throw new Error(
`${label} are out of order at position ${index}: expected ${sorted[index]}, found ${values[index]}`,
);
}
}
}
function ensureJournalMatchesFiles(migrationFiles: string[], journalTags: string[]) {
const journalFiles = journalTags.map((tag) => `${tag}.sql`);
if (journalFiles.length !== migrationFiles.length) {
throw new Error(
`Migration journal/file count mismatch: journal has ${journalFiles.length}, files have ${migrationFiles.length}`,
);
}
for (let index = 0; index < migrationFiles.length; index += 1) {
const migrationFile = migrationFiles[index];
const journalFile = journalFiles[index];
if (migrationFile !== journalFile) {
throw new Error(
`Migration journal/file order mismatch at position ${index}: journal has ${journalFile}, files have ${migrationFile}`,
);
}
}
}
async function main() {
const migrationFiles = (await readdir(migrationsDir))
.filter((entry) => entry.endsWith(".sql"))
.sort();
ensureNoDuplicates(migrationFiles, "migration files");
ensureStrictlyOrdered(migrationFiles, "migration files");
const rawJournal = await readFile(journalPath, "utf8");
const journal = JSON.parse(rawJournal) as JournalFile;
const journalTags = (journal.entries ?? [])
.map((entry, index) => {
if (typeof entry.tag !== "string" || entry.tag.length === 0) {
throw new Error(`Migration journal entry ${index} is missing a tag`);
}
return entry.tag;
});
ensureNoDuplicates(journalTags, "migration journal");
ensureStrictlyOrdered(journalTags, "migration journal");
ensureJournalMatchesFiles(migrationFiles, journalTags);
}
await main();


@@ -169,4 +169,236 @@ describeEmbeddedPostgres("applyPendingMigrations", () => {
},
20_000,
);
it(
"replays migration 0046 safely when document revision columns already exist",
async () => {
const connectionString = await createTempDatabase();
await applyPendingMigrations(connectionString);
const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const smoothSentinelsHash = await migrationHash("0046_smooth_sentinels.sql");
await sql.unsafe(
`DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${smoothSentinelsHash}'`,
);
const columns = await sql.unsafe<{ column_name: string; is_nullable: string; column_default: string | null }[]>(
`
SELECT column_name, is_nullable, column_default
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'document_revisions'
AND column_name IN ('title', 'format')
ORDER BY column_name
`,
);
expect(columns).toHaveLength(2);
} finally {
await sql.end();
}
const pendingState = await inspectMigrations(connectionString);
expect(pendingState).toMatchObject({
status: "needsMigrations",
pendingMigrations: ["0046_smooth_sentinels.sql"],
reason: "pending-migrations",
});
await applyPendingMigrations(connectionString);
const finalState = await inspectMigrations(connectionString);
expect(finalState.status).toBe("upToDate");
const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const columns = await verifySql.unsafe<{ column_name: string; is_nullable: string; column_default: string | null }[]>(
`
SELECT column_name, is_nullable, column_default
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'document_revisions'
AND column_name IN ('title', 'format')
ORDER BY column_name
`,
);
expect(columns).toEqual([
expect.objectContaining({
column_name: "format",
is_nullable: "NO",
}),
expect.objectContaining({
column_name: "title",
is_nullable: "YES",
}),
]);
expect(columns[0]?.column_default).toContain("'markdown'");
} finally {
await verifySql.end();
}
},
20_000,
);
it(
"replays migration 0047 safely when feedback tables and run columns already exist",
async () => {
const connectionString = await createTempDatabase();
await applyPendingMigrations(connectionString);
const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const overjoyedGrootHash = await migrationHash("0047_overjoyed_groot.sql");
await sql.unsafe(
`DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${overjoyedGrootHash}'`,
);
const tables = await sql.unsafe<{ table_name: string }[]>(
`
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name IN ('feedback_exports', 'feedback_votes')
ORDER BY table_name
`,
);
expect(tables.map((row) => row.table_name)).toEqual([
"feedback_exports",
"feedback_votes",
]);
const columns = await sql.unsafe<{ table_name: string; column_name: string }[]>(
`
SELECT table_name, column_name
FROM information_schema.columns
WHERE table_schema = 'public'
AND (
(table_name = 'companies' AND column_name IN (
'feedback_data_sharing_enabled',
'feedback_data_sharing_consent_at',
'feedback_data_sharing_consent_by_user_id',
'feedback_data_sharing_terms_version'
))
OR (table_name = 'document_revisions' AND column_name = 'created_by_run_id')
OR (table_name = 'issue_comments' AND column_name = 'created_by_run_id')
)
ORDER BY table_name, column_name
`,
);
expect(columns).toHaveLength(6);
} finally {
await sql.end();
}
const pendingState = await inspectMigrations(connectionString);
expect(pendingState).toMatchObject({
status: "needsMigrations",
pendingMigrations: ["0047_overjoyed_groot.sql"],
reason: "pending-migrations",
});
await applyPendingMigrations(connectionString);
const finalState = await inspectMigrations(connectionString);
expect(finalState.status).toBe("upToDate");
const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const constraints = await verifySql.unsafe<{ conname: string }[]>(
`
SELECT conname
FROM pg_constraint
WHERE conname IN (
'feedback_exports_company_id_companies_id_fk',
'feedback_exports_feedback_vote_id_feedback_votes_id_fk',
'feedback_exports_issue_id_issues_id_fk',
'feedback_votes_company_id_companies_id_fk',
'feedback_votes_issue_id_issues_id_fk'
)
ORDER BY conname
`,
);
expect(constraints.map((row) => row.conname)).toEqual([
"feedback_exports_company_id_companies_id_fk",
"feedback_exports_feedback_vote_id_feedback_votes_id_fk",
"feedback_exports_issue_id_issues_id_fk",
"feedback_votes_company_id_companies_id_fk",
"feedback_votes_issue_id_issues_id_fk",
]);
} finally {
await verifySql.end();
}
},
20_000,
);
it(
"replays migration 0048 safely when routines.variables already exists",
async () => {
const connectionString = await createTempDatabase();
await applyPendingMigrations(connectionString);
const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const flashyMarrowHash = await migrationHash("0048_flashy_marrow.sql");
await sql.unsafe(
`DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${flashyMarrowHash}'`,
);
const columns = await sql.unsafe<{ column_name: string }[]>(
`
SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'routines'
AND column_name = 'variables'
`,
);
expect(columns).toHaveLength(1);
} finally {
await sql.end();
}
const pendingState = await inspectMigrations(connectionString);
expect(pendingState).toMatchObject({
status: "needsMigrations",
pendingMigrations: ["0048_flashy_marrow.sql"],
reason: "pending-migrations",
});
await applyPendingMigrations(connectionString);
const finalState = await inspectMigrations(connectionString);
expect(finalState.status).toBe("upToDate");
const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const columns = await verifySql.unsafe<{ column_name: string; is_nullable: string; data_type: string }[]>(
`
SELECT column_name, is_nullable, data_type
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'routines'
AND column_name = 'variables'
`,
);
expect(columns).toEqual([
expect.objectContaining({
column_name: "variables",
is_nullable: "NO",
data_type: "jsonb",
}),
]);
} finally {
await verifySql.end();
}
},
20_000,
);
});


@@ -0,0 +1,11 @@
ALTER TABLE "document_revisions" ADD COLUMN IF NOT EXISTS "title" text;--> statement-breakpoint
ALTER TABLE "document_revisions" ADD COLUMN IF NOT EXISTS "format" text;--> statement-breakpoint
ALTER TABLE "document_revisions" ALTER COLUMN "format" SET DEFAULT 'markdown';
--> statement-breakpoint
UPDATE "document_revisions" AS "dr"
SET
"title" = COALESCE("dr"."title", "d"."title"),
"format" = COALESCE("dr"."format", "d"."format", 'markdown')
FROM "documents" AS "d"
WHERE "d"."id" = "dr"."document_id";--> statement-breakpoint
ALTER TABLE "document_revisions" ALTER COLUMN "format" SET NOT NULL;


@@ -0,0 +1,102 @@
CREATE TABLE IF NOT EXISTS "feedback_exports" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"company_id" uuid NOT NULL,
"feedback_vote_id" uuid NOT NULL,
"issue_id" uuid NOT NULL,
"project_id" uuid,
"author_user_id" text NOT NULL,
"target_type" text NOT NULL,
"target_id" text NOT NULL,
"vote" text NOT NULL,
"status" text DEFAULT 'local_only' NOT NULL,
"destination" text,
"export_id" text,
"consent_version" text,
"schema_version" text DEFAULT 'paperclip-feedback-envelope-v2' NOT NULL,
"bundle_version" text DEFAULT 'paperclip-feedback-bundle-v2' NOT NULL,
"payload_version" text DEFAULT 'paperclip-feedback-v1' NOT NULL,
"payload_digest" text,
"payload_snapshot" jsonb,
"target_summary" jsonb NOT NULL,
"redaction_summary" jsonb,
"attempt_count" integer DEFAULT 0 NOT NULL,
"last_attempted_at" timestamp with time zone,
"exported_at" timestamp with time zone,
"failure_reason" text,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE IF NOT EXISTS "feedback_votes" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"company_id" uuid NOT NULL,
"issue_id" uuid NOT NULL,
"target_type" text NOT NULL,
"target_id" text NOT NULL,
"author_user_id" text NOT NULL,
"vote" text NOT NULL,
"reason" text,
"shared_with_labs" boolean DEFAULT false NOT NULL,
"shared_at" timestamp with time zone,
"consent_version" text,
"redaction_summary" jsonb,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_enabled" boolean DEFAULT false NOT NULL;--> statement-breakpoint
ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_consent_at" timestamp with time zone;--> statement-breakpoint
ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_consent_by_user_id" text;--> statement-breakpoint
ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_terms_version" text;--> statement-breakpoint
ALTER TABLE "document_revisions" ADD COLUMN IF NOT EXISTS "created_by_run_id" uuid;--> statement-breakpoint
ALTER TABLE "issue_comments" ADD COLUMN IF NOT EXISTS "created_by_run_id" uuid;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_company_id_companies_id_fk') THEN
ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_feedback_vote_id_feedback_votes_id_fk') THEN
ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_feedback_vote_id_feedback_votes_id_fk" FOREIGN KEY ("feedback_vote_id") REFERENCES "public"."feedback_votes"("id") ON DELETE cascade ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_issue_id_issues_id_fk') THEN
ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE cascade ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_project_id_projects_id_fk') THEN
ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_project_id_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."projects"("id") ON DELETE set null ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_votes_company_id_companies_id_fk') THEN
ALTER TABLE "feedback_votes" ADD CONSTRAINT "feedback_votes_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_votes_issue_id_issues_id_fk') THEN
ALTER TABLE "feedback_votes" ADD CONSTRAINT "feedback_votes_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE no action ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "feedback_exports_feedback_vote_idx" ON "feedback_exports" USING btree ("feedback_vote_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_exports_company_created_idx" ON "feedback_exports" USING btree ("company_id","created_at");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_exports_company_status_idx" ON "feedback_exports" USING btree ("company_id","status","created_at");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_exports_company_issue_idx" ON "feedback_exports" USING btree ("company_id","issue_id","created_at");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_exports_company_project_idx" ON "feedback_exports" USING btree ("company_id","project_id","created_at");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_exports_company_author_idx" ON "feedback_exports" USING btree ("company_id","author_user_id","created_at");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_votes_company_issue_idx" ON "feedback_votes" USING btree ("company_id","issue_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_votes_issue_target_idx" ON "feedback_votes" USING btree ("issue_id","target_type","target_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "feedback_votes_author_idx" ON "feedback_votes" USING btree ("author_user_id","created_at");--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "feedback_votes_company_target_author_idx" ON "feedback_votes" USING btree ("company_id","target_type","target_id","author_user_id");--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'document_revisions_created_by_run_id_heartbeat_runs_id_fk') THEN
ALTER TABLE "document_revisions" ADD CONSTRAINT "document_revisions_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;
END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'issue_comments_created_by_run_id_heartbeat_runs_id_fk') THEN
ALTER TABLE "issue_comments" ADD CONSTRAINT "issue_comments_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;
END IF;
END $$;
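Each foreign key above is wrapped in the same `DO $$ ... IF NOT EXISTS (pg_constraint)` guard so the migration can replay against a database where the constraint already exists. A minimal sketch of that guard as a string template; `guardedAddConstraint` is illustrative, not a helper in this repo:

```typescript
// Sketch of the idempotent constraint guard used repeatedly above:
// only run ADD CONSTRAINT when no constraint with that name exists.
function guardedAddConstraint(table: string, conname: string, body: string): string {
  return [
    "DO $$ BEGIN",
    ` IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = '${conname}') THEN`,
    `  ALTER TABLE "${table}" ADD CONSTRAINT "${conname}" ${body};`,
    " END IF;",
    "END $$;",
  ].join("\n");
}

console.log(
  guardedAddConstraint(
    "feedback_votes",
    "feedback_votes_issue_id_issues_id_fk",
    'FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id")',
  ),
);
```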


@@ -0,0 +1 @@
ALTER TABLE "routines" ADD COLUMN IF NOT EXISTS "variables" jsonb DEFAULT '[]'::jsonb NOT NULL;

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -323,6 +323,27 @@
"when": 1774530504348, "when": 1774530504348,
"tag": "0045_workable_shockwave", "tag": "0045_workable_shockwave",
"breakpoints": true "breakpoints": true
},
{
"idx": 46,
"version": "7",
"when": 1774960197878,
"tag": "0046_smooth_sentinels",
"breakpoints": true
},
{
"idx": 47,
"version": "7",
"when": 1775137972687,
"tag": "0047_overjoyed_groot",
"breakpoints": true
},
{
"idx": 48,
"version": "7",
"when": 1775145655557,
"tag": "0048_flashy_marrow",
"breakpoints": true
}
]
}


@@ -16,6 +16,12 @@ export const companies = pgTable(
requireBoardApprovalForNewAgents: boolean("require_board_approval_for_new_agents")
.notNull()
.default(true),
feedbackDataSharingEnabled: boolean("feedback_data_sharing_enabled")
.notNull()
.default(false),
feedbackDataSharingConsentAt: timestamp("feedback_data_sharing_consent_at", { withTimezone: true }),
feedbackDataSharingConsentByUserId: text("feedback_data_sharing_consent_by_user_id"),
feedbackDataSharingTermsVersion: text("feedback_data_sharing_terms_version"),
brandColor: text("brand_color"),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),


@@ -2,6 +2,7 @@ import { pgTable, uuid, text, integer, timestamp, index, uniqueIndex } from "dri
import { companies } from "./companies.js";
import { agents } from "./agents.js";
import { documents } from "./documents.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
export const documentRevisions = pgTable(
"document_revisions",
@@ -10,10 +11,13 @@ export const documentRevisions = pgTable(
companyId: uuid("company_id").notNull().references(() => companies.id),
documentId: uuid("document_id").notNull().references(() => documents.id, { onDelete: "cascade" }),
revisionNumber: integer("revision_number").notNull(),
title: text("title"),
format: text("format").notNull().default("markdown"),
body: text("body").notNull(),
changeSummary: text("change_summary"),
createdByAgentId: uuid("created_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
createdByUserId: text("created_by_user_id"),
createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
},
(table) => ({


@@ -0,0 +1,45 @@
import { index, integer, jsonb, pgTable, text, timestamp, uniqueIndex, uuid } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { feedbackVotes } from "./feedback_votes.js";
import { issues } from "./issues.js";
import { projects } from "./projects.js";
export const feedbackExports = pgTable(
"feedback_exports",
{
id: uuid("id").primaryKey().defaultRandom(),
companyId: uuid("company_id").notNull().references(() => companies.id),
feedbackVoteId: uuid("feedback_vote_id").notNull().references(() => feedbackVotes.id, { onDelete: "cascade" }),
issueId: uuid("issue_id").notNull().references(() => issues.id, { onDelete: "cascade" }),
projectId: uuid("project_id").references(() => projects.id, { onDelete: "set null" }),
authorUserId: text("author_user_id").notNull(),
targetType: text("target_type").notNull(),
targetId: text("target_id").notNull(),
vote: text("vote").notNull(),
status: text("status").notNull().default("local_only"),
destination: text("destination"),
exportId: text("export_id"),
consentVersion: text("consent_version"),
schemaVersion: text("schema_version").notNull().default("paperclip-feedback-envelope-v2"),
bundleVersion: text("bundle_version").notNull().default("paperclip-feedback-bundle-v2"),
payloadVersion: text("payload_version").notNull().default("paperclip-feedback-v1"),
payloadDigest: text("payload_digest"),
payloadSnapshot: jsonb("payload_snapshot"),
targetSummary: jsonb("target_summary").notNull(),
redactionSummary: jsonb("redaction_summary"),
attemptCount: integer("attempt_count").notNull().default(0),
lastAttemptedAt: timestamp("last_attempted_at", { withTimezone: true }),
exportedAt: timestamp("exported_at", { withTimezone: true }),
failureReason: text("failure_reason"),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
},
(table) => ({
voteUniqueIdx: uniqueIndex("feedback_exports_feedback_vote_idx").on(table.feedbackVoteId),
companyCreatedIdx: index("feedback_exports_company_created_idx").on(table.companyId, table.createdAt),
companyStatusIdx: index("feedback_exports_company_status_idx").on(table.companyId, table.status, table.createdAt),
companyIssueIdx: index("feedback_exports_company_issue_idx").on(table.companyId, table.issueId, table.createdAt),
companyProjectIdx: index("feedback_exports_company_project_idx").on(table.companyId, table.projectId, table.createdAt),
companyAuthorIdx: index("feedback_exports_company_author_idx").on(table.companyId, table.authorUserId, table.createdAt),
}),
);


@@ -0,0 +1,34 @@
import { boolean, index, jsonb, pgTable, text, timestamp, uniqueIndex, uuid } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { issues } from "./issues.js";
export const feedbackVotes = pgTable(
"feedback_votes",
{
id: uuid("id").primaryKey().defaultRandom(),
companyId: uuid("company_id").notNull().references(() => companies.id),
issueId: uuid("issue_id").notNull().references(() => issues.id),
targetType: text("target_type").notNull(),
targetId: text("target_id").notNull(),
authorUserId: text("author_user_id").notNull(),
vote: text("vote").notNull(),
reason: text("reason"),
sharedWithLabs: boolean("shared_with_labs").notNull().default(false),
sharedAt: timestamp("shared_at", { withTimezone: true }),
consentVersion: text("consent_version"),
redactionSummary: jsonb("redaction_summary"),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
},
(table) => ({
companyIssueIdx: index("feedback_votes_company_issue_idx").on(table.companyId, table.issueId),
issueTargetIdx: index("feedback_votes_issue_target_idx").on(table.issueId, table.targetType, table.targetId),
authorIdx: index("feedback_votes_author_idx").on(table.authorUserId, table.createdAt),
companyTargetAuthorUniqueIdx: uniqueIndex("feedback_votes_company_target_author_idx").on(
table.companyId,
table.targetType,
table.targetId,
table.authorUserId,
),
}),
);


@@ -32,6 +32,8 @@ export { issueLabels } from "./issue_labels.js";
export { issueApprovals } from "./issue_approvals.js";
export { issueComments } from "./issue_comments.js";
export { issueInboxArchives } from "./issue_inbox_archives.js";
export { feedbackVotes } from "./feedback_votes.js";
export { feedbackExports } from "./feedback_exports.js";
export { issueReadStates } from "./issue_read_states.js";
export { assets } from "./assets.js";
export { issueAttachments } from "./issue_attachments.js";


@@ -2,6 +2,7 @@ import { pgTable, uuid, text, timestamp, index } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { issues } from "./issues.js";
import { agents } from "./agents.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
export const issueComments = pgTable(
"issue_comments",
@@ -11,6 +12,7 @@ export const issueComments = pgTable(
issueId: uuid("issue_id").notNull().references(() => issues.id),
authorAgentId: uuid("author_agent_id").references(() => agents.id),
authorUserId: text("author_user_id"),
createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
body: text("body").notNull(),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),


@@ -15,6 +15,7 @@ import { companySecrets } from "./company_secrets.js";
import { issues } from "./issues.js";
import { projects } from "./projects.js";
import { goals } from "./goals.js";
import type { RoutineVariable } from "@paperclipai/shared";
export const routines = pgTable(
"routines",
@@ -31,6 +32,7 @@ export const routines = pgTable(
status: text("status").notNull().default("active"),
concurrencyPolicy: text("concurrency_policy").notNull().default("coalesce_if_active"),
catchUpPolicy: text("catch_up_policy").notNull().default("skip_missed"),
variables: jsonb("variables").$type<RoutineVariable[]>().notNull().default([]),
createdByAgentId: uuid("created_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
createdByUserId: text("created_by_user_id"),
updatedByAgentId: uuid("updated_by_agent_id").references(() => agents.id, { onDelete: "set null" }),


@@ -41,6 +41,7 @@ const manifest: PaperclipPluginManifestV1 = {
"goals.update", "goals.update",
"activity.log.write", "activity.log.write",
"metrics.write", "metrics.write",
"telemetry.track",
"plugin.state.read", "plugin.state.read",
"plugin.state.write", "plugin.state.write",
"events.subscribe", "events.subscribe",


@@ -405,6 +405,16 @@ async function registerActionHandlers(ctx: PluginContext): Promise<void> {
data: { companyId },
});
await ctx.metrics.write("demo.events.emitted", 1, { source: "manual" });
await ctx.telemetry.track("demo_event", {
source: "manual",
has_company: Boolean(companyId),
});
pushRecord({
level: "info",
source: "telemetry",
message: "Tracked plugin telemetry event demo_event",
data: { companyId },
});
return { ok: true, message };
});


@@ -312,6 +312,7 @@ Declare in `manifest.capabilities`. Grouped by scope:
| | `issue.comments.create` |
| | `activity.log.write` |
| | `metrics.write` |
| | `telemetry.track` |
| **Instance** | `instance.settings.register` |
| | `plugin.state.read` |
| | `plugin.state.write` |


@@ -135,6 +135,11 @@ export interface HostServices {
write(params: WorkerToHostMethods["metrics.write"][0]): Promise<void>;
};
/** Provides `telemetry.track`. */
telemetry: {
track(params: WorkerToHostMethods["telemetry.track"][0]): Promise<void>;
};
/** Provides `log`. */
logger: {
log(params: WorkerToHostMethods["log"][0]): Promise<void>;
@@ -284,6 +289,9 @@ const METHOD_CAPABILITY_MAP: Record<WorkerToHostMethodName, PluginCapability | n
// Metrics
"metrics.write": "metrics.write",
// Telemetry
"telemetry.track": "telemetry.track",
// Logger — always allowed
"log": null,
@@ -447,6 +455,11 @@ export function createHostClientHandlers(
return services.metrics.write(params);
}),
// Telemetry
"telemetry.track": gated("telemetry.track", async (params) => {
return services.telemetry.track(params);
}),
// Logger
"log": gated("log", async (params) => {
return services.logger.log(params);


@@ -182,6 +182,7 @@ export type {
PluginStreamsClient,
PluginToolsClient,
PluginMetricsClient,
PluginTelemetryClient,
PluginLogger,
} from "./types.js";


@@ -519,6 +519,12 @@ export interface WorkerToHostMethods {
result: void,
];
// Telemetry
"telemetry.track": [
params: { eventName: string; dimensions?: Record<string, string | number | boolean> },
result: void,
];
// Logger
"log": [
params: { level: "info" | "warn" | "error" | "debug"; message: string; meta?: Record<string, unknown> },
@@ -579,6 +585,7 @@ export interface WorkerToHostMethods {
projectId?: string;
goalId?: string;
parentId?: string;
inheritExecutionWorkspaceFromIssueId?: string;
title: string;
description?: string;
priority?: string;


@@ -71,6 +71,7 @@ export interface TestHarness {
logs: TestHarnessLogEntry[];
activity: Array<{ message: string; entityType?: string; entityId?: string; metadata?: Record<string, unknown> }>;
metrics: Array<{ name: string; value: number; tags?: Record<string, string> }>;
telemetry: Array<{ eventName: string; dimensions?: Record<string, string | number | boolean> }>;
}
type EventRegistration = {
@@ -132,6 +133,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
const logs: TestHarnessLogEntry[] = [];
const activity: TestHarness["activity"] = [];
const metrics: TestHarness["metrics"] = [];
const telemetry: TestHarness["telemetry"] = [];
const state = new Map<string, unknown>();
const entities = new Map<string, PluginEntityRecord>();
@@ -631,6 +633,12 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
metrics.push({ name, value, tags });
},
},
telemetry: {
async track(eventName, dimensions) {
requireCapability(manifest, capabilitySet, "telemetry.track");
telemetry.push({ eventName, dimensions });
},
},
logger: {
info(message, meta) {
logs.push({ level: "info", message, meta });
@@ -729,6 +737,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
logs,
activity,
metrics,
telemetry,
};
return harness;


@@ -761,6 +761,28 @@ export interface PluginMetricsClient {
write(name: string, value: number, tags?: Record<string, string>): Promise<void>; write(name: string, value: number, tags?: Record<string, string>): Promise<void>;
} }
/**
* `ctx.telemetry` — emit plugin-scoped telemetry to the host's external
* telemetry pipeline.
*
* Requires `telemetry.track` capability.
*/
export interface PluginTelemetryClient {
/**
* Track a plugin telemetry event.
*
* The host prefixes the final event name as `plugin.<pluginId>.<eventName>`
* before forwarding it to the shared telemetry client.
*
* @param eventName - Bare plugin event slug (for example `"sync_completed"`)
* @param dimensions - Optional structured dimensions
*/
track(
eventName: string,
dimensions?: Record<string, string | number | boolean>,
): Promise<void>;
}
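The host-side prefixing described in the doc comment above can be sketched in a few lines. `prefixPluginEvent` is an illustrative helper for this example, not the actual host implementation:

```typescript
// Hedged sketch: how the host might derive the final event name from a
// plugin-scoped `telemetry.track` call, per the documented
// `plugin.<pluginId>.<eventName>` convention.
function prefixPluginEvent(pluginId: string, eventName: string): string {
  return `plugin.${pluginId}.${eventName}`;
}

console.log(prefixPluginEvent("demo", "sync_completed"));
// → plugin.demo.sync_completed
```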
/**
* `ctx.companies` — read company metadata.
*
@@ -872,6 +894,7 @@ export interface PluginIssuesClient {
projectId?: string;
goalId?: string;
parentId?: string;
inheritExecutionWorkspaceFromIssueId?: string;
title: string;
description?: string;
priority?: Issue["priority"];
@@ -1155,6 +1178,9 @@ export interface PluginContext {
/** Write plugin metrics. Requires `metrics.write`. */
metrics: PluginMetricsClient;
/** Emit plugin-scoped external telemetry. Requires `telemetry.track`. */
telemetry: PluginTelemetryClient;
/** Structured logger. Output is captured and surfaced in the plugin health dashboard. */ /** Structured logger. Output is captured and surfaced in the plugin health dashboard. */
logger: PluginLogger; logger: PluginLogger;
} }


@@ -590,6 +590,7 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
projectId: input.projectId,
goalId: input.goalId,
parentId: input.parentId,
inheritExecutionWorkspaceFromIssueId: input.inheritExecutionWorkspaceFromIssueId,
title: input.title,
description: input.description,
priority: input.priority,
@@ -792,6 +793,15 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
},
},
telemetry: {
async track(
eventName: string,
dimensions?: Record<string, string | number | boolean>,
): Promise<void> {
await callHost("telemetry.track", { eventName, dimensions });
},
},
logger: {
info(message: string, meta?: Record<string, unknown>): void {
notifyHost("log", { level: "info", message, meta });


@@ -14,6 +14,7 @@
"type": "module",
"exports": {
".": "./src/index.ts",
"./telemetry": "./src/telemetry/index.ts",
"./*": "./src/*.ts"
},
"publishConfig": {
@@ -23,6 +24,10 @@
"types": "./dist/index.d.ts",
"import": "./dist/index.js"
},
"./telemetry": {
"types": "./dist/telemetry/index.d.ts",
"import": "./dist/telemetry/index.js"
},
"./*": {
"types": "./dist/*.d.ts",
"import": "./dist/*.js"


@@ -95,6 +95,10 @@ export const secretsConfigSchema = z.object({
}),
});
export const telemetryConfigSchema = z.object({
enabled: z.boolean().default(true),
}).default({});
export const paperclipConfigSchema = z
.object({
$meta: configMetaSchema,
@@ -102,6 +106,7 @@ export const paperclipConfigSchema = z
database: databaseConfigSchema,
logging: loggingConfigSchema,
server: serverConfigSchema,
telemetry: telemetryConfigSchema,
auth: authConfigSchema.default({
baseUrlMode: "auto",
disableSignUp: false,
@@ -174,5 +179,6 @@ export type StorageS3Config = z.infer<typeof storageS3ConfigSchema>;
export type SecretsConfig = z.infer<typeof secretsConfigSchema>;
export type SecretsLocalEncryptedConfig = z.infer<typeof secretsLocalEncryptedConfigSchema>;
export type AuthConfig = z.infer<typeof authConfigSchema>;
export type TelemetryConfig = z.infer<typeof telemetryConfigSchema>;
export type ConfigMeta = z.infer<typeof configMetaSchema>;
export type DatabaseBackupConfig = z.infer<typeof databaseBackupConfigSchema>;


@@ -26,6 +26,7 @@ export const AGENT_ADAPTER_TYPES = [
"http",
"claude_local",
"codex_local",
"gemini_local",
"opencode_local",
"pi_local",
"cursor",
@@ -165,6 +166,9 @@ export type RoutineTriggerKind = (typeof ROUTINE_TRIGGER_KINDS)[number];
export const ROUTINE_TRIGGER_SIGNING_MODES = ["bearer", "hmac_sha256"] as const;
export type RoutineTriggerSigningMode = (typeof ROUTINE_TRIGGER_SIGNING_MODES)[number];
export const ROUTINE_VARIABLE_TYPES = ["text", "textarea", "number", "boolean", "select"] as const;
export type RoutineVariableType = (typeof ROUTINE_VARIABLE_TYPES)[number];
export const ROUTINE_RUN_STATUSES = [
"received",
"coalesced",
@@ -447,6 +451,7 @@ export const PLUGIN_CAPABILITIES = [
"agent.sessions.close",
"activity.log.write",
"metrics.write",
"telemetry.track",
// Plugin State
"plugin.state.read",
"plugin.state.write",


@@ -21,6 +21,7 @@ export {
ROUTINE_CATCH_UP_POLICIES,
ROUTINE_TRIGGER_KINDS,
ROUTINE_TRIGGER_SIGNING_MODES,
ROUTINE_VARIABLE_TYPES,
ROUTINE_RUN_STATUSES,
ROUTINE_RUN_SOURCES,
PAUSE_REASONS,
@@ -88,6 +89,7 @@ export {
type RoutineCatchUpPolicy,
type RoutineTriggerKind,
type RoutineTriggerSigningMode,
type RoutineVariableType,
type RoutineRunStatus,
type RoutineRunSource,
type PauseReason,
@@ -138,6 +140,16 @@ export {
export type {
Company,
FeedbackVote,
FeedbackDataSharingPreference,
FeedbackTargetType,
FeedbackVoteValue,
FeedbackTrace,
FeedbackTraceStatus,
FeedbackTraceTargetSummary,
FeedbackTraceBundleCaptureStatus,
FeedbackTraceBundleFile,
FeedbackTraceBundle,
CompanySkillSourceType,
CompanySkillTrustLevel,
CompanySkillCompatibility,
@@ -245,6 +257,8 @@ export type {
FinanceSummary,
FinanceByBiller,
FinanceByKind,
AgentWakeupResponse,
AgentWakeupSkipped,
HeartbeatRun,
HeartbeatRunEvent,
AgentRuntimeState,
@@ -294,6 +308,8 @@ export type {
CompanySecret,
SecretProviderDescriptor,
Routine,
RoutineVariable,
RoutineVariableDefaultValue,
RoutineTrigger,
RoutineRun,
RoutineTriggerSecretMaterial,
@@ -325,6 +341,15 @@ export type {
ProviderQuotaResult,
} from "./types/index.js";
export {
DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE,
FEEDBACK_TARGET_TYPES,
FEEDBACK_DATA_SHARING_PREFERENCES,
FEEDBACK_TRACE_STATUSES,
FEEDBACK_VOTE_VALUES,
DEFAULT_FEEDBACK_DATA_SHARING_TERMS_VERSION,
} from "./types/feedback.js";
export {
instanceGeneralSettingsSchema,
patchInstanceGeneralSettingsSchema,
@@ -338,9 +363,14 @@ export {
createCompanySchema,
updateCompanySchema,
updateCompanyBrandingSchema,
feedbackTargetTypeSchema,
feedbackTraceStatusSchema,
feedbackVoteValueSchema,
upsertIssueFeedbackVoteSchema,
type CreateCompany,
type UpdateCompany,
type UpdateCompanyBranding,
type UpsertIssueFeedbackVote,
agentSkillStateSchema,
agentSkillSyncModeSchema,
agentSkillEntrySchema,
@@ -406,6 +436,7 @@ export {
issueDocumentFormatSchema,
issueDocumentKeySchema,
upsertIssueDocumentSchema,
restoreIssueDocumentRevisionSchema,
type CreateIssue,
type CreateIssueLabel,
type UpdateIssue,
@@ -418,6 +449,7 @@ export {
type UpdateExecutionWorkspace,
type IssueDocumentFormat,
type UpsertIssueDocument,
type RestoreIssueDocumentRevision,
createGoalSchema,
updateGoalSchema,
type CreateGoal,
@@ -447,6 +479,7 @@ export {
updateRoutineSchema,
createRoutineTriggerSchema,
updateRoutineTriggerSchema,
routineVariableSchema,
runRoutineSchema,
rotateRoutineTriggerSecretSchema,
type CreateSecret,
@@ -557,7 +590,7 @@ export {
export { API_PREFIX, API } from "./api.js";
export { normalizeAgentUrlKey, deriveAgentUrlKey, isUuidLike } from "./agent-url-key.js";
export { deriveProjectUrlKey, normalizeProjectUrlKey, hasNonAsciiContent } from "./project-url-key.js";
export {
AGENT_MENTION_SCHEME,
PROJECT_MENTION_SCHEME,
@@ -571,6 +604,14 @@ export {
type ParsedProjectMention,
} from "./project-mentions.js";
export {
extractRoutineVariableNames,
interpolateRoutineTemplate,
isValidRoutineVariableName,
stringifyRoutineVariableValue,
syncRoutineVariablesWithTemplate,
} from "./routine-variables.js";
export {
paperclipConfigSchema,
configMetaSchema,
@@ -585,6 +626,8 @@ export {
storageLocalDiskConfigSchema,
storageS3ConfigSchema,
secretsLocalEncryptedConfigSchema,
telemetryConfigSchema,
type TelemetryConfig,
type PaperclipConfig,
type LlmConfig,
type DatabaseBackupConfig,


@@ -1,5 +1,7 @@
const PROJECT_URL_KEY_DELIM_RE = /[^a-z0-9]+/g;
const PROJECT_URL_KEY_TRIM_RE = /^-+|-+$/g;
const NON_ASCII_RE = /[^\x00-\x7F]/;
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
export function normalizeProjectUrlKey(value: string | null | undefined): string | null {
if (typeof value !== "string") return null;
@@ -11,6 +13,24 @@ export function normalizeProjectUrlKey(value: string | null | undefined): string
return normalized.length > 0 ? normalized : null;
}
/** Check whether a string contains non-ASCII characters that normalization would strip. */
export function hasNonAsciiContent(value: string | null | undefined): boolean {
if (typeof value !== "string") return false;
return NON_ASCII_RE.test(value);
}
/** Extract the first 8 hex chars from a valid UUID, or null. */
function shortIdFromUuid(value: string | null | undefined): string | null {
if (typeof value !== "string" || !UUID_RE.test(value.trim())) return null;
return value.trim().replace(/-/g, "").slice(0, 8).toLowerCase();
}
export function deriveProjectUrlKey(name: string | null | undefined, fallback?: string | null): string {
const base = normalizeProjectUrlKey(name);
if (base && !hasNonAsciiContent(name)) return base;
// Non-ASCII content was stripped — append short UUID suffix for uniqueness.
const shortId = shortIdFromUuid(fallback);
if (base && shortId) return `${base}-${shortId}`;
if (shortId) return shortId;
return base ?? normalizeProjectUrlKey(fallback) ?? "project";
}
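A standalone sketch of the new derivation behavior, re-implemented here for illustration (the normalization body is an assumption based on the regexes above, not the module's verbatim code): ASCII names slugify as before, while names whose non-ASCII content gets stripped pick up a short UUID suffix from the fallback.

```typescript
// Illustration only — re-implements the rules shown in the diff above.
const DELIM_RE = /[^a-z0-9]+/g;
const TRIM_RE = /^-+|-+$/g;
const NON_ASCII_RE = /[^\x00-\x7F]/;
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

function normalize(value: string | null | undefined): string | null {
  if (typeof value !== "string") return null;
  const n = value.toLowerCase().replace(DELIM_RE, "-").replace(TRIM_RE, "");
  return n.length > 0 ? n : null;
}

function derive(name: string | null | undefined, fallback?: string | null): string {
  const base = normalize(name);
  // ASCII-only names keep the plain slug.
  if (base && typeof name === "string" && !NON_ASCII_RE.test(name)) return base;
  // Non-ASCII content was stripped: disambiguate with the first 8 hex chars of the UUID.
  const trimmed = typeof fallback === "string" ? fallback.trim() : "";
  const shortId = UUID_RE.test(trimmed)
    ? trimmed.replace(/-/g, "").slice(0, 8).toLowerCase()
    : null;
  if (base && shortId) return `${base}-${shortId}`;
  if (shortId) return shortId;
  return base ?? normalize(fallback) ?? "project";
}

const ascii = derive("My Project"); // "my-project"
const mixed = derive("日本語 docs", "123e4567-e89b-12d3-a456-426614174000"); // "docs-123e4567"
```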


@@ -0,0 +1,34 @@
import { describe, expect, it } from "vitest";
import {
extractRoutineVariableNames,
interpolateRoutineTemplate,
syncRoutineVariablesWithTemplate,
} from "./routine-variables.js";
describe("routine variable helpers", () => {
it("extracts placeholder names in first-appearance order", () => {
expect(
extractRoutineVariableNames("Review {{repo}} and {{priority}} for {{repo}}"),
).toEqual(["repo", "priority"]);
});
it("preserves existing metadata when syncing variables from a template", () => {
expect(
syncRoutineVariablesWithTemplate("Review {{repo}} and {{priority}}", [
{ name: "repo", label: "Repository", type: "text", defaultValue: "paperclip", required: true, options: [] },
]),
).toEqual([
{ name: "repo", label: "Repository", type: "text", defaultValue: "paperclip", required: true, options: [] },
{ name: "priority", label: null, type: "text", defaultValue: null, required: true, options: [] },
]);
});
it("interpolates provided variable values into the routine template", () => {
expect(
interpolateRoutineTemplate("Review {{repo}} for {{priority}}", {
repo: "paperclip",
priority: "high",
}),
).toBe("Review paperclip for high");
});
});


@@ -0,0 +1,62 @@
import type { RoutineVariable } from "./types/routine.js";
const ROUTINE_VARIABLE_MATCHER = /\{\{\s*([A-Za-z][A-Za-z0-9_]*)\s*\}\}/g;
export function isValidRoutineVariableName(name: string): boolean {
return /^[A-Za-z][A-Za-z0-9_]*$/.test(name);
}
export function extractRoutineVariableNames(template: string | null | undefined): string[] {
if (!template) return [];
const found = new Set<string>();
for (const match of template.matchAll(ROUTINE_VARIABLE_MATCHER)) {
const name = match[1];
if (name && !found.has(name)) {
found.add(name);
}
}
return [...found];
}
function defaultRoutineVariable(name: string): RoutineVariable {
return {
name,
label: null,
type: "text",
defaultValue: null,
required: true,
options: [],
};
}
export function syncRoutineVariablesWithTemplate(
template: string | null | undefined,
existing: RoutineVariable[] | null | undefined,
): RoutineVariable[] {
const names = extractRoutineVariableNames(template);
const existingByName = new Map((existing ?? []).map((variable) => [variable.name, variable]));
return names.map((name) => existingByName.get(name) ?? defaultRoutineVariable(name));
}
export function stringifyRoutineVariableValue(value: unknown): string {
if (typeof value === "string") return value;
if (typeof value === "number" || typeof value === "boolean") return String(value);
if (value == null) return "";
try {
return JSON.stringify(value);
} catch {
return String(value);
}
}
export function interpolateRoutineTemplate(
template: string | null | undefined,
values: Record<string, unknown> | null | undefined,
): string | null {
if (template == null) return null;
if (!values || Object.keys(values).length === 0) return template;
return template.replace(ROUTINE_VARIABLE_MATCHER, (match, rawName: string) => {
if (!(rawName in values)) return match;
return stringifyRoutineVariableValue(values[rawName]);
});
}
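One behavior worth calling out in `interpolateRoutineTemplate` above: placeholders with no provided value are left verbatim rather than blanked. A standalone sketch (plain `String()` stands in for `stringifyRoutineVariableValue` to keep it short):

```typescript
// Standalone copy of the placeholder matcher from the diff above, for illustration.
const MATCHER = /\{\{\s*([A-Za-z][A-Za-z0-9_]*)\s*\}\}/g;

function interpolate(template: string, values: Record<string, unknown>): string {
  return template.replace(MATCHER, (match, rawName: string) =>
    // Unknown placeholder names survive untouched.
    rawName in values ? String(values[rawName]) : match,
  );
}

const out = interpolate("Deploy {{service}} to {{env}}", { service: "api" });
// → "Deploy api to {{env}}"
```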


@@ -0,0 +1,104 @@
import { createHash } from "node:crypto";
import type {
TelemetryConfig,
TelemetryEvent,
TelemetryEventName,
TelemetryState,
} from "./types.js";
const DEFAULT_ENDPOINT = "https://telemetry.paperclip.ing/ingest";
const BATCH_SIZE = 50;
const SEND_TIMEOUT_MS = 5_000;
export class TelemetryClient {
private queue: TelemetryEvent[] = [];
private readonly config: TelemetryConfig;
private readonly stateFactory: () => TelemetryState;
private readonly version: string;
private state: TelemetryState | null = null;
private flushInterval: ReturnType<typeof setInterval> | null = null;
constructor(config: TelemetryConfig, stateFactory: () => TelemetryState, version: string) {
this.config = config;
this.stateFactory = stateFactory;
this.version = version;
}
track(eventName: TelemetryEventName, dimensions?: Record<string, string | number | boolean>): void {
if (!this.config.enabled) return;
this.getState(); // ensure state is initialised (side-effect: creates state file on first call)
this.queue.push({
name: eventName,
occurredAt: new Date().toISOString(),
dimensions: dimensions ?? {},
});
if (this.queue.length >= BATCH_SIZE) {
void this.flush();
}
}
async flush(): Promise<void> {
if (!this.config.enabled || this.queue.length === 0) return;
const events = this.queue.splice(0);
const state = this.getState();
const endpoint = this.config.endpoint ?? DEFAULT_ENDPOINT;
const app = this.config.app ?? "paperclip";
const schemaVersion = this.config.schemaVersion ?? "1";
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), SEND_TIMEOUT_MS);
try {
await fetch(endpoint, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
app,
schemaVersion,
installId: state.installId,
events,
}),
signal: controller.signal,
});
} catch {
// Fire-and-forget: silent failure, no retries
} finally {
clearTimeout(timer);
}
}
startPeriodicFlush(intervalMs: number = 60_000): void {
if (this.flushInterval) return;
this.flushInterval = setInterval(() => {
void this.flush();
}, intervalMs);
// Allow the process to exit even if the interval is still active
if (typeof this.flushInterval === "object" && "unref" in this.flushInterval) {
this.flushInterval.unref();
}
}
stop(): void {
if (this.flushInterval) {
clearInterval(this.flushInterval);
this.flushInterval = null;
}
}
hashPrivateRef(value: string): string {
const state = this.getState();
return createHash("sha256")
.update(state.salt + value)
.digest("hex")
.slice(0, 16);
}
private getState(): TelemetryState {
if (!this.state) {
this.state = this.stateFactory();
}
return this.state;
}
}
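The `hashPrivateRef` scheme above is just sha256 over the per-install salt concatenated with the value, truncated to 16 hex characters — so the same private ref hashes identically within one install but cannot be correlated across installs. A minimal sketch:

```typescript
import { createHash } from "node:crypto";

// Same scheme as hashPrivateRef above: sha256(salt + value), first 16 hex chars.
function hashPrivateRef(salt: string, value: string): string {
  return createHash("sha256").update(salt + value).digest("hex").slice(0, 16);
}

const sameInstall = hashPrivateRef("salt-a", "acme/secret-repo");
const repeat = hashPrivateRef("salt-a", "acme/secret-repo");   // stable within an install
const otherInstall = hashPrivateRef("salt-b", "acme/secret-repo"); // differs across installs
```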


@@ -0,0 +1,25 @@
import type { TelemetryConfig } from "./types.js";
const CI_ENV_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];
function isCI(): boolean {
return CI_ENV_VARS.some((key) => process.env[key] === "true" || process.env[key] === "1");
}
export function resolveTelemetryConfig(fileConfig?: { enabled?: boolean }): TelemetryConfig {
if (process.env.PAPERCLIP_TELEMETRY_DISABLED === "1") {
return { enabled: false };
}
if (process.env.DO_NOT_TRACK === "1") {
return { enabled: false };
}
if (isCI()) {
return { enabled: false };
}
if (fileConfig?.enabled === false) {
return { enabled: false };
}
const endpoint = process.env.PAPERCLIP_TELEMETRY_ENDPOINT || undefined;
return { enabled: true, endpoint };
}
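The opt-out precedence in `resolveTelemetryConfig` can be exercised directly by passing the environment in explicitly. A minimal sketch of the same rules (the function and parameter names here are illustrative, not the module's API):

```typescript
// Illustrative re-statement of the opt-out precedence above, env injected for testability.
type Env = Record<string, string | undefined>;
const CI_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];

function telemetryEnabled(env: Env, fileEnabled?: boolean): boolean {
  if (env.PAPERCLIP_TELEMETRY_DISABLED === "1") return false; // explicit kill switch wins
  if (env.DO_NOT_TRACK === "1") return false;                 // honor the DNT convention
  if (CI_VARS.some((k) => env[k] === "true" || env[k] === "1")) return false; // never track CI
  if (fileEnabled === false) return false;                    // file config is the last gate
  return true;
}

const onLaptop = telemetryEnabled({});                     // true
const onCi = telemetryEnabled({ GITHUB_ACTIONS: "true" }); // false
const optedOut = telemetryEnabled({}, false);              // false
```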


@@ -0,0 +1,45 @@
import type { TelemetryClient } from "./client.js";
export function trackInstallStarted(client: TelemetryClient): void {
client.track("install.started");
}
export function trackInstallCompleted(
client: TelemetryClient,
dims: { adapterType: string },
): void {
client.track("install.completed", { adapter_type: dims.adapterType });
}
export function trackCompanyImported(
client: TelemetryClient,
dims: { sourceType: string; sourceRef: string; isPrivate: boolean },
): void {
const ref = dims.isPrivate ? client.hashPrivateRef(dims.sourceRef) : dims.sourceRef;
client.track("company.imported", {
source_type: dims.sourceType,
source_ref: ref,
source_ref_hashed: dims.isPrivate,
});
}
export function trackAgentFirstHeartbeat(
client: TelemetryClient,
dims: { agentRole: string },
): void {
client.track("agent.first_heartbeat", { agent_role: dims.agentRole });
}
export function trackAgentTaskCompleted(
client: TelemetryClient,
dims: { agentRole: string },
): void {
client.track("agent.task_completed", { agent_role: dims.agentRole });
}
export function trackErrorHandlerCrash(
client: TelemetryClient,
dims: { errorCode: string },
): void {
client.track("error.handler_crash", { error_code: dims.errorCode });
}


@@ -0,0 +1,18 @@
export { TelemetryClient } from "./client.js";
export { resolveTelemetryConfig } from "./config.js";
export { loadOrCreateState } from "./state.js";
export {
trackInstallStarted,
trackInstallCompleted,
trackCompanyImported,
trackAgentFirstHeartbeat,
trackAgentTaskCompleted,
trackErrorHandlerCrash,
} from "./events.js";
export type {
TelemetryConfig,
TelemetryState,
TelemetryEvent,
TelemetryEventEnvelope,
TelemetryEventName,
} from "./types.js";


@@ -0,0 +1,31 @@
import { randomUUID, randomBytes } from "node:crypto";
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import path from "node:path";
import type { TelemetryState } from "./types.js";
export function loadOrCreateState(stateDir: string, version: string): TelemetryState {
const filePath = path.join(stateDir, "state.json");
if (existsSync(filePath)) {
try {
const raw = readFileSync(filePath, "utf-8");
const parsed = JSON.parse(raw) as TelemetryState;
if (parsed.installId && parsed.salt) {
return parsed;
}
} catch {
// Corrupted state file — recreate
}
}
const state: TelemetryState = {
installId: randomUUID(),
salt: randomBytes(32).toString("hex"),
createdAt: new Date().toISOString(),
firstSeenVersion: version,
};
mkdirSync(stateDir, { recursive: true });
writeFileSync(filePath, JSON.stringify(state, null, 2) + "\n", "utf-8");
return state;
}
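The point of `loadOrCreateState` above is that the install identity is generated once and then reused: a second load returns the same `installId` and `salt` from disk. A round-trip sketch (the state shape is trimmed to the two fields that matter here):

```typescript
import { existsSync, mkdirSync, mkdtempSync, readFileSync, writeFileSync } from "node:fs";
import { randomBytes, randomUUID } from "node:crypto";
import { tmpdir } from "node:os";
import path from "node:path";

interface State { installId: string; salt: string; }

// Trimmed copy of loadOrCreateState above, for a standalone round-trip demo.
function loadOrCreate(stateDir: string): State {
  const filePath = path.join(stateDir, "state.json");
  if (existsSync(filePath)) {
    try {
      const parsed = JSON.parse(readFileSync(filePath, "utf-8")) as State;
      if (parsed.installId && parsed.salt) return parsed;
    } catch {
      // Corrupted state file — fall through and recreate
    }
  }
  const state: State = { installId: randomUUID(), salt: randomBytes(32).toString("hex") };
  mkdirSync(stateDir, { recursive: true });
  writeFileSync(filePath, JSON.stringify(state, null, 2) + "\n", "utf-8");
  return state;
}

const dir = mkdtempSync(path.join(tmpdir(), "telemetry-state-"));
const first = loadOrCreate(dir);  // generates and persists the identity
const second = loadOrCreate(dir); // reads the same identity back
```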


@@ -0,0 +1,37 @@
export interface TelemetryState {
installId: string;
salt: string;
createdAt: string;
firstSeenVersion: string;
}
export interface TelemetryConfig {
enabled: boolean;
endpoint?: string;
app?: string;
schemaVersion?: string;
}
/** Per-event object inside the backend envelope */
export interface TelemetryEvent {
name: string;
occurredAt: string;
dimensions: Record<string, string | number | boolean>;
}
/** Full payload sent to the backend ingest endpoint */
export interface TelemetryEventEnvelope {
app: string;
schemaVersion: string;
installId: string;
events: TelemetryEvent[];
}
export type TelemetryEventName =
| "install.started"
| "install.completed"
| "company.imported"
| "agent.first_heartbeat"
| "agent.task_completed"
| "error.handler_crash"
| `plugin.${string}`;


@@ -31,6 +31,10 @@ export interface CompanyPortabilityCompanyManifestEntry {
brandColor: string | null;
logoPath: string | null;
requireBoardApprovalForNewAgents: boolean;
feedbackDataSharingEnabled: boolean;
feedbackDataSharingConsentAt: string | null;
feedbackDataSharingConsentByUserId: string | null;
feedbackDataSharingTermsVersion: string | null;
}
export interface CompanyPortabilitySidebarOrder {
@@ -53,6 +57,8 @@ export interface CompanyPortabilityProjectManifestEntry {
metadata: Record<string, unknown> | null;
}
import type { RoutineVariable } from "./routine.js";
export interface CompanyPortabilityProjectWorkspaceManifestEntry {
key: string;
name: string;
@@ -80,6 +86,7 @@ export interface CompanyPortabilityIssueRoutineTriggerManifestEntry {
export interface CompanyPortabilityIssueRoutineManifestEntry {
concurrencyPolicy: string | null;
catchUpPolicy: string | null;
variables?: RoutineVariable[] | null;
triggers: CompanyPortabilityIssueRoutineTriggerManifestEntry[];
}


@@ -12,6 +12,10 @@ export interface Company {
budgetMonthlyCents: number;
spentMonthlyCents: number;
requireBoardApprovalForNewAgents: boolean;
feedbackDataSharingEnabled: boolean;
feedbackDataSharingConsentAt: Date | null;
feedbackDataSharingConsentByUserId: string | null;
feedbackDataSharingTermsVersion: string | null;
brandColor: string | null;
logoAssetId: string | null;
logoUrl: string | null;


@@ -0,0 +1,120 @@
export const FEEDBACK_TARGET_TYPES = ["issue_comment", "issue_document_revision"] as const;
export type FeedbackTargetType = (typeof FEEDBACK_TARGET_TYPES)[number];
export const FEEDBACK_VOTE_VALUES = ["up", "down"] as const;
export type FeedbackVoteValue = (typeof FEEDBACK_VOTE_VALUES)[number];
export const FEEDBACK_DATA_SHARING_PREFERENCES = ["allowed", "not_allowed", "prompt"] as const;
export type FeedbackDataSharingPreference = (typeof FEEDBACK_DATA_SHARING_PREFERENCES)[number];
export const DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE: FeedbackDataSharingPreference = "prompt";
export const FEEDBACK_TRACE_STATUSES = ["local_only", "pending", "sent", "failed"] as const;
export type FeedbackTraceStatus = (typeof FEEDBACK_TRACE_STATUSES)[number];
export const DEFAULT_FEEDBACK_DATA_SHARING_TERMS_VERSION = "feedback-data-sharing-v1";
export interface FeedbackVote {
id: string;
companyId: string;
issueId: string;
targetType: FeedbackTargetType;
targetId: string;
authorUserId: string;
vote: FeedbackVoteValue;
reason: string | null;
sharedWithLabs: boolean;
sharedAt: Date | null;
consentVersion: string | null;
redactionSummary: Record<string, unknown> | null;
createdAt: Date;
updatedAt: Date;
}
export interface FeedbackTraceTargetSummary {
label: string;
excerpt: string | null;
authorAgentId: string | null;
authorUserId: string | null;
createdAt: Date | null;
documentKey: string | null;
documentTitle: string | null;
revisionNumber: number | null;
}
export interface FeedbackTrace {
id: string;
companyId: string;
feedbackVoteId: string;
issueId: string;
projectId: string | null;
issueIdentifier: string | null;
issueTitle: string;
authorUserId: string;
targetType: FeedbackTargetType;
targetId: string;
vote: FeedbackVoteValue;
status: FeedbackTraceStatus;
destination: string | null;
exportId: string | null;
consentVersion: string | null;
schemaVersion: string;
bundleVersion: string;
payloadVersion: string;
payloadDigest: string | null;
payloadSnapshot: Record<string, unknown> | null;
targetSummary: FeedbackTraceTargetSummary;
redactionSummary: Record<string, unknown> | null;
attemptCount: number;
lastAttemptedAt: Date | null;
exportedAt: Date | null;
failureReason: string | null;
createdAt: Date;
updatedAt: Date;
}
export type FeedbackTraceBundleCaptureStatus = "full" | "partial" | "unavailable";
export interface FeedbackTraceBundleFile {
path: string;
contentType: string;
encoding: "utf8";
byteLength: number;
sha256: string;
source:
| "paperclip_run"
| "paperclip_run_events"
| "paperclip_run_log"
| "codex_session"
| "claude_stream_json"
| "claude_project_session"
| "claude_project_artifact"
| "claude_debug_log"
| "claude_task_metadata"
| "opencode_session"
| "opencode_session_diff"
| "opencode_message"
| "opencode_message_part"
| "opencode_project"
| "opencode_todo";
contents: string;
}
export interface FeedbackTraceBundle {
traceId: string;
exportId: string | null;
companyId: string;
issueId: string;
issueIdentifier: string | null;
adapterType: string | null;
captureStatus: FeedbackTraceBundleCaptureStatus;
notes: string[];
envelope: Record<string, unknown>;
surface: Record<string, unknown> | null;
paperclipRun: Record<string, unknown> | null;
rawAdapterTrace: Record<string, unknown> | null;
normalizedAdapterTrace: Record<string, unknown> | null;
privacy: Record<string, unknown> | null;
integrity: Record<string, unknown>;
files: FeedbackTraceBundleFile[];
}


@@ -42,6 +42,18 @@ export interface HeartbeatRun {
updatedAt: Date;
}
export interface AgentWakeupSkipped {
status: "skipped";
reason: string;
message: string | null;
issueId: string | null;
executionRunId: string | null;
executionAgentId: string | null;
executionAgentName: string | null;
}
export type AgentWakeupResponse = HeartbeatRun | AgentWakeupSkipped;
export interface HeartbeatRunEvent {
id: number;
companyId: string;


@@ -1,4 +1,16 @@
export type { Company } from "./company.js";
export type {
FeedbackVote,
FeedbackDataSharingPreference,
FeedbackTargetType,
FeedbackVoteValue,
FeedbackTrace,
FeedbackTraceStatus,
FeedbackTraceTargetSummary,
FeedbackTraceBundleCaptureStatus,
FeedbackTraceBundleFile,
FeedbackTraceBundle,
} from "./feedback.js";
export type { InstanceExperimentalSettings, InstanceGeneralSettings, InstanceSettings } from "./instance.js";
export type {
CompanySkillSourceType,
@@ -118,6 +130,8 @@ export type {
} from "./secrets.js";
export type {
Routine,
RoutineVariable,
RoutineVariableDefaultValue,
RoutineTrigger,
RoutineRun,
RoutineTriggerSecretMaterial,
@@ -129,6 +143,8 @@ export type {
export type { CostEvent, CostSummary, CostByAgent, CostByProviderModel, CostByBiller, CostByAgentModel, CostWindowSpendRow, CostByProject } from "./cost.js";
export type { FinanceEvent, FinanceSummary, FinanceByBiller, FinanceByKind } from "./finance.js";
export type {
AgentWakeupResponse,
AgentWakeupSkipped,
HeartbeatRun,
HeartbeatRunEvent,
AgentRuntimeState,


@@ -1,5 +1,9 @@
import type { FeedbackDataSharingPreference } from "./feedback.js";
export interface InstanceGeneralSettings {
censorUsernameInLogs: boolean;
keyboardShortcuts: boolean;
feedbackDataSharingPreference: FeedbackDataSharingPreference;
}
export interface InstanceExperimentalSettings {


@@ -81,6 +81,8 @@ export interface DocumentRevision {
issueId: string;
key: string;
revisionNumber: number;
title: string | null;
format: DocumentFormat;
body: string;
changeSummary: string | null;
createdByAgentId: string | null;


@@ -1,4 +1,4 @@
import type { IssueOriginKind, RoutineVariableType } from "../constants.js";
export interface RoutineProjectSummary {
id: string;
@@ -25,6 +25,17 @@ export interface RoutineIssueSummary {
updatedAt: Date;
}
export type RoutineVariableDefaultValue = string | number | boolean | null;
export interface RoutineVariable {
name: string;
label: string | null;
type: RoutineVariableType;
defaultValue: RoutineVariableDefaultValue;
required: boolean;
options: string[];
}
export interface Routine {
id: string;
companyId: string;
@@ -38,6 +49,7 @@ export interface Routine {
status: string;
concurrencyPolicy: string;
catchUpPolicy: string;
variables: RoutineVariable[];
createdByAgentId: string | null;
createdByUserId: string | null;
updatedByAgentId: string | null;


@@ -1,4 +1,5 @@
import { z } from "zod";
import { routineVariableSchema } from "./routine.js";
export const portabilityIncludeSchema = z
.object({
@@ -36,6 +37,10 @@ export const portabilityCompanyManifestEntrySchema = z.object({
brandColor: z.string().nullable(), brandColor: z.string().nullable(),
logoPath: z.string().nullable(), logoPath: z.string().nullable(),
requireBoardApprovalForNewAgents: z.boolean(), requireBoardApprovalForNewAgents: z.boolean(),
feedbackDataSharingEnabled: z.boolean().default(false),
feedbackDataSharingConsentAt: z.string().datetime().nullable().default(null),
feedbackDataSharingConsentByUserId: z.string().nullable().default(null),
feedbackDataSharingTermsVersion: z.string().nullable().default(null),
}); });
export const portabilitySidebarOrderSchema = z.object({ export const portabilitySidebarOrderSchema = z.object({
@@ -119,6 +124,7 @@ export const portabilityIssueRoutineTriggerManifestEntrySchema = z.object({
export const portabilityIssueRoutineManifestEntrySchema = z.object({ export const portabilityIssueRoutineManifestEntrySchema = z.object({
concurrencyPolicy: z.string().nullable(), concurrencyPolicy: z.string().nullable(),
catchUpPolicy: z.string().nullable(), catchUpPolicy: z.string().nullable(),
variables: z.array(routineVariableSchema).nullable().optional(),
triggers: z.array(portabilityIssueRoutineTriggerManifestEntrySchema).default([]), triggers: z.array(portabilityIssueRoutineTriggerManifestEntrySchema).default([]),
}); });


@@ -3,6 +3,7 @@ import { COMPANY_STATUSES } from "../constants.js";
 const logoAssetIdSchema = z.string().uuid().nullable().optional();
 const brandColorSchema = z.string().regex(/^#[0-9a-fA-F]{6}$/).nullable().optional();
+const feedbackDataSharingTermsVersionSchema = z.string().min(1).nullable().optional();

 export const createCompanySchema = z.object({
   name: z.string().min(1),
@@ -18,6 +19,10 @@ export const updateCompanySchema = createCompanySchema
   status: z.enum(COMPANY_STATUSES).optional(),
   spentMonthlyCents: z.number().int().nonnegative().optional(),
   requireBoardApprovalForNewAgents: z.boolean().optional(),
+  feedbackDataSharingEnabled: z.boolean().optional(),
+  feedbackDataSharingConsentAt: z.coerce.date().nullable().optional(),
+  feedbackDataSharingConsentByUserId: z.string().min(1).nullable().optional(),
+  feedbackDataSharingTermsVersion: feedbackDataSharingTermsVersionSchema,
   brandColor: brandColorSchema,
   logoAssetId: logoAssetIdSchema,
 });
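The update schema makes each feedback-data-sharing field independently optional, so it cannot by itself enforce that enabling sharing carries its consent metadata. A minimal sketch of a cross-field check a server handler might add on top (the `hasCompleteConsent` helper and the invariant itself are assumptions, not shown in this diff):

```typescript
// Field names taken from updateCompanySchema in the diff.
interface FeedbackDataSharingUpdate {
  feedbackDataSharingEnabled?: boolean;
  feedbackDataSharingConsentAt?: Date | null;
  feedbackDataSharingConsentByUserId?: string | null;
  feedbackDataSharingTermsVersion?: string | null;
}

// Hypothetical invariant: turning sharing on requires a consent timestamp,
// a consenting user, and a terms version; any other update passes.
function hasCompleteConsent(update: FeedbackDataSharingUpdate): boolean {
  if (update.feedbackDataSharingEnabled !== true) return true;
  return (
    update.feedbackDataSharingConsentAt != null &&
    update.feedbackDataSharingConsentByUserId != null &&
    update.feedbackDataSharingTermsVersion != null
  );
}

console.log(hasCompleteConsent({ feedbackDataSharingEnabled: false })); // true
console.log(hasCompleteConsent({ feedbackDataSharingEnabled: true })); // false
```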

Some files were not shown because too many files have changed in this diff.