Compare commits

...

8 Commits

Author SHA1 Message Date
Devin Foley
5bd0f578fd Generalize sandbox provider core for plugin-only providers (#4449)
## Thinking Path

> - Paperclip is a control plane, so optional execution providers should
sit at the plugin edge instead of hardcoding provider-specific behavior
into core shared/server/ui layers.
> - Sandbox environments are already first-class, and the fake provider
proves the built-in path; the remaining gap was that real providers
still leaked provider-specific config and runtime assumptions into core.
> - That coupling showed up in config normalization, secret persistence,
capabilities reporting, lease reconstruction, and the board UI form
fields.
> - As long as core knew about those provider-shaped details, shipping a
provider as a pure third-party plugin meant every new provider would
still require host changes.
> - This pull request generalizes the sandbox provider seam around
schema-driven plugin metadata and generic secret-ref handling.
> - The runtime and UI now consume provider metadata generically, so
core only special-cases the built-in fake provider while third-party
providers can live entirely in plugins.

## What Changed

- Added generic sandbox-provider capability metadata so plugin-backed
providers can expose `configSchema` through shared environment support
and the environments capabilities API.
- Reworked sandbox config normalization/persistence/runtime resolution
to handle schema-declared secret-ref fields generically, storing them as
Paperclip secrets and resolving them for probe/execute/release flows.
- Generalized plugin sandbox runtime handling so provider validation,
reusable-lease matching, lease reconstruction, and plugin worker calls
all operate on provider-agnostic config instead of provider-shaped
branches.
- Replaced hardcoded sandbox provider form fields in Company Settings
with schema-driven rendering and blocked agent environment selection
from the built-in fake provider.
- Added regression coverage for the generic seam across shared support
helpers plus environment config, probe, routes, runtime, and
sandbox-provider runtime tests.
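
The schema-driven seam described above can be sketched as follows. This is an illustrative shape only — the field and type names (`SandboxConfigField`, `secretRef`, and so on) are assumptions, since the actual Paperclip types are not visible in this compare view:

```typescript
// Illustrative sketch: plugin sandbox-provider capability metadata with a
// schema-declared secret-ref field. Field names here are assumed.
interface SandboxConfigField {
  key: string;
  required: boolean;
  // When true, core stores the value as a Paperclip secret and resolves it
  // for probe/execute/release flows instead of persisting plaintext config.
  secretRef: boolean;
}

interface SandboxProviderCapability {
  providerId: string;
  configSchema: SandboxConfigField[];
}

// Generic handling: core derives the secret-bearing keys from the schema,
// with no provider-shaped branches.
function secretRefKeys(capability: SandboxProviderCapability): string[] {
  return capability.configSchema
    .filter((field) => field.secretRef)
    .map((field) => field.key);
}
```

This is also why the Risks section below flags inaccurate `configSchema` declarations: under a scheme like this, a field missing its secret marker would be persisted as plaintext config.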

## Verification

- `pnpm vitest --run packages/shared/src/environment-support.test.ts
server/src/__tests__/environment-config.test.ts
server/src/__tests__/environment-probe.test.ts
server/src/__tests__/environment-routes.test.ts
server/src/__tests__/environment-runtime.test.ts
server/src/__tests__/sandbox-provider-runtime.test.ts`
- `pnpm -r typecheck`

## Risks

- Plugin sandbox providers now depend more heavily on accurate
`configSchema` declarations; incorrect schemas can misclassify
secret-bearing fields or omit required config.
- Reusable lease matching is now metadata-driven for plugin-backed
providers, so providers that fail to persist stable metadata may
reprovision instead of resuming an existing lease.
- The UI form is now fully schema-driven for plugin-backed sandbox
providers; provider manifests without good defaults or descriptions may
produce a rougher operator experience.

## Model Used

- OpenAI Codex via `codex_local`
- Model ID: `gpt-5.4`
- Reasoning effort: `high`
- Context window observed in runtime session metadata: `258400` tokens
- Capabilities used: terminal tool execution, git, and local code/test
inspection

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge
2026-04-24 18:03:41 -07:00
Dotta
deba60ebb2 Stabilize serialized server route tests (#4448)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The server route suite is a core confidence layer for auth, issue
context, and workspace runtime behavior
> - Some route tests were doing extra module/server isolation work that
made local runs slower and more fragile
> - The stable Vitest runner also needs to pass server-relative exclude
paths to avoid accidentally re-including serialized suites
> - This pull request tightens route test isolation and runner
serialization behavior
> - The benefit is more reliable targeted and stable-route test
execution without product behavior changes

## What Changed

- Updated `run-vitest-stable.mjs` to exclude serialized server tests
using server-relative paths.
- Forced the server Vitest config to use a single worker in addition to
isolated forks.
- Simplified agent permission route tests to create per-request test
servers without shared server lifecycle state.
- Stabilized issue goal context route mocks by using static mocked
services and a sequential suite.
- Re-registered workspace runtime route mocks before cache-busted route
imports.
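
The single-worker-plus-isolated-forks combination can be expressed with Vitest's pool options roughly as below. This is a minimal illustrative fragment, not the repo's actual config file:

```typescript
// Illustrative Vitest config fragment: one isolated fork serializes test
// files that would otherwise race on shared server/module state.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    pool: "forks",
    poolOptions: {
      forks: {
        isolate: true,    // fresh worker state per test file
        singleFork: true, // a single worker, so files run serially
      },
    },
  },
});
```

`isolate` alone prevents cross-file module-state leakage; `singleFork` additionally removes concurrency, which is what suites sharing a server lifecycle need.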

## Verification

- `pnpm exec vitest run --project @paperclipai/server
server/src/__tests__/agent-permissions-routes.test.ts
server/src/__tests__/issues-goal-context-routes.test.ts
server/src/__tests__/workspace-runtime-routes-authz.test.ts --pool=forks
--poolOptions.forks.isolate=true`
- `node --check scripts/run-vitest-stable.mjs`

## Risks

- Low risk. This is test infrastructure only.
- The stable runner path fix changes which tests are excluded from the
non-serialized server batch; the exclude paths now match the
server-relative project root that Vitest applies internally.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool-enabled with
shell/GitHub/Paperclip API access. Context window was not reported by
the runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 19:27:00 -05:00
Dotta
f68e9caa9a Polish markdown external link wrapping (#4447)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The board UI renders agent comments, PR links, issue links, and
operational markdown throughout issue threads
> - Long GitHub and external links can wrap awkwardly, leaving icons
orphaned from the text they describe
> - Small inbox visual polish also helps repeated board scanning without
changing behavior
> - This pull request glues markdown link icons to adjacent link
characters and removes a redundant inbox list border
> - The benefit is cleaner, more stable markdown and inbox rendering for
day-to-day operator review

## What Changed

- Added an external-link indicator for external markdown links.
- Kept the GitHub icon attached to the first link character so it does
not wrap onto a separate line.
- Kept the external-link icon attached to the final link character so it
does not wrap away from the URL/text.
- Added markdown rendering regressions for GitHub and external link icon
wrapping.
- Removed the extra border around the inbox list card.
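
The anti-orphan technique above amounts to gluing the icon to an adjacent run of link text. A hypothetical helper (not the component's actual code) for the trailing-icon case might split the text so the final non-space run can be wrapped in a `white-space: nowrap` span together with the icon:

```typescript
// Hypothetical helper: split link text so the final word (or the whole
// string when it contains no spaces, e.g. a bare URL) can be rendered in
// a non-breaking span together with the trailing external-link icon.
function splitForTrailingIcon(text: string): [prefix: string, glued: string] {
  const match = text.match(/^([\s\S]*?)(\S+)$/);
  return match ? [match[1], match[2]] : ["", text];
}
```

The GitHub icon case is symmetric: glue the icon to the first non-space run so it cannot wrap onto a line by itself above the link.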

## Verification

- `pnpm exec vitest run --project @paperclipai/ui
ui/src/components/MarkdownBody.test.tsx`
- `pnpm --filter @paperclipai/ui typecheck`

## Risks

- Low risk. The markdown change is limited to link child rendering and
preserves existing href/target/rel behavior.
- Visual-only inbox polish.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool-enabled with
shell/GitHub/Paperclip API access. Context window was not reported by
the runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 19:26:13 -05:00
Dotta
73fbdf36db Gate stale-run watchdog decisions by board access (#4446)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The run ledger surfaces stale-run watchdog evaluation issues and
recovery actions
> - Viewer-level board users should be able to inspect status without
getting controls that the server will reject
> - The UI also needs enough board-access context to know when to hide
those decision actions
> - This pull request exposes board memberships in the current board
access snapshot and gates watchdog action controls for known viewer
contexts
> - The benefit is clearer least-privilege UI behavior around recovery
controls

## What Changed

- Included memberships in `/api/cli-auth/me` so the board UI can
distinguish active viewer memberships from operator/admin access.
- Added the stale-run evaluation issue assignee to output silence
summaries.
- Hid stale-run watchdog decision buttons for known non-owner viewer
contexts.
- Surfaced watchdog decision failures through toast and inline error
text.
- Threaded `companyId` through the issue activity run ledger so access
checks are company-scoped.
- Added IssueRunLedger coverage for non-owner viewers.
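
The gating decision can be sketched as below. The snapshot shape and names are assumptions (the real `/api/cli-auth/me` payload is not shown here), and the server remains the actual authorization boundary — this check only hides controls the server would reject anyway:

```typescript
// Assumed membership shape from the board-access snapshot; illustrative.
interface BoardMembership {
  companyId: string;
  role: "viewer" | "operator" | "admin";
  active: boolean;
}

function canShowWatchdogControls(
  memberships: BoardMembership[],
  companyId: string,
): boolean {
  // Show decision controls only for an active non-viewer membership in
  // this company; server-side authorization still applies regardless.
  return memberships.some(
    (m) => m.active && m.companyId === companyId && m.role !== "viewer",
  );
}
```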

## Verification

- `pnpm exec vitest run --project @paperclipai/ui
ui/src/components/IssueRunLedger.test.tsx`
- `pnpm --filter @paperclipai/server typecheck`
- `pnpm --filter @paperclipai/ui typecheck`

## Risks

- Medium-low risk. This is a UI gating change backed by existing server
authorization.
- Local implicit and instance-admin board contexts continue to show
watchdog decision controls.
- No migrations.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool-enabled with
shell/GitHub/Paperclip API access. Context window was not reported by
the runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 19:25:23 -05:00
Dotta
6916e30f8e Cancel stale retries when issue ownership changes (#4445)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - Issue execution is guarded by run locks and bounded retry scheduling
> - A failed run can schedule a retry, but the issue may be reassigned
before that retry becomes due
> - The old assignee's scheduled retry should not continue to hold or
reclaim execution for the issue
> - This pull request cancels stale scheduled retries when ownership
changes and cancels live work when an issue is explicitly cancelled
> - The benefit is cleaner issue handoff semantics and fewer stranded or
incorrect execution locks

## What Changed

- Cancel scheduled retry runs when their issue has been reassigned
before the retry is promoted.
- Clear stale issue execution locks and cancel the associated wakeup
request when a stale retry is cancelled.
- Avoid deferring a new assignee behind a previous assignee's scheduled
retry.
- Cancel an active run when an issue status is explicitly changed to
`cancelled`, while leaving `done` transitions alone.
- Added route and heartbeat regressions for reassignment and
cancellation behavior.
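
The core staleness test reduces to an ownership comparison. The helper and field names below are illustrative, not the PR's actual code:

```typescript
// Illustrative decision helper: a scheduled retry is stale once the
// issue's current assignee no longer matches the agent that owned the
// failed run, so promoting it would let the old owner reclaim execution.
interface ScheduledRetry {
  issueId: string;
  agentId: string; // agent that owned the failed run
}

function isStaleRetry(
  retry: ScheduledRetry,
  currentAssigneeId: string | null,
): boolean {
  return retry.agentId !== currentAssigneeId;
}
```

Note the unassigned case: if the issue has no current assignee, the old agent's retry is also treated as stale rather than allowed to re-grab the lock.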

## Verification

- `pnpm exec vitest run --project @paperclipai/server
server/src/__tests__/heartbeat-retry-scheduling.test.ts
server/src/__tests__/issue-comment-reopen-routes.test.ts --pool=forks
--poolOptions.forks.isolate=true`
  - `issue-comment-reopen-routes.test.ts`: 28 passed.
  - `heartbeat-retry-scheduling.test.ts`: skipped by the existing embedded
Postgres host guard (`Postgres init script exited with code null`).
- `pnpm --filter @paperclipai/server typecheck`

## Risks

- Medium risk because this changes heartbeat retry lifecycle behavior.
- The cancellation path is scoped to scheduled retries whose issue
assignee no longer matches the retrying agent, and logs a lifecycle
event for auditability.
- No migrations.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool-enabled with
shell/GitHub/Paperclip API access. Context window was not reported by
the runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 19:24:13 -05:00
Dotta
0c6961a03e Normalize escaped multiline issue and approval text (#4444)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - The board and agent APIs accept multiline issue, approval,
interaction, and document text
> - Some callers accidentally send literal escaped newline sequences
like `\n` instead of JSON-decoded line breaks
> - That makes comments, descriptions, documents, and approval notes
render as flattened text instead of readable markdown
> - This pull request centralizes multiline text normalization in shared
validators
> - The benefit is newline-preserving API behavior across issue and
approval workflows without route-specific fixes

## What Changed

- Added a shared `multilineTextSchema` helper that normalizes escaped
`\n`, `\r\n`, and `\r` sequences to real line breaks.
- Applied the helper to issue descriptions, issue update comments, issue
comment bodies, suggested task descriptions, interaction summaries,
issue documents, approval comments, and approval decision notes.
- Added shared validator regressions for issue and approval multiline
inputs.
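
The normalization can be sketched as a plain function. The real helper is a shared Zod schema, so this standalone version (with an assumed name) only illustrates the replacement logic — in particular that the two-character sequence `\r\n` must be handled before the single-character sequences so it collapses to one line break instead of two:

```typescript
// Illustrative normalization: literal escaped sequences "\r\n", "\r",
// and "\n" (backslash-letter pairs in the input, not real control
// characters) become real line breaks. "\r\n" is replaced first so it
// yields a single newline rather than two.
function normalizeMultilineText(value: string): string {
  return value
    .replace(/\\r\\n/g, "\n")
    .replace(/\\r/g, "\n")
    .replace(/\\n/g, "\n");
}
```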

## Verification

- `pnpm exec vitest run --project @paperclipai/shared
packages/shared/src/validators/approval.test.ts
packages/shared/src/validators/issue.test.ts`
- `pnpm --filter @paperclipai/shared typecheck`

## Risks

- Low risk. This only changes text fields that are explicitly multiline
user/operator content.
- If a caller intentionally wanted literal backslash-n text in these
fields, it will now render as a real line break.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool-enabled with
shell/GitHub/Paperclip API access. Context window was not reported by
the runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 18:02:45 -05:00
Dotta
5a0c1979cf [codex] Add runtime lifecycle recovery and live issue visibility (#4419) 2026-04-24 15:50:32 -05:00
Dotta
9a8d219949 [codex] Stabilize tests and local maintenance assets (#4423)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - A fast-moving control plane needs stable local tests and repeatable
local maintenance tools so contributors can safely split and review work
> - Several route suites needed stronger isolation, Codex manual model
selection needed a faster-mode option, and local browser cleanup missed
Playwright's headless shell binary
> - Storybook static output also needed to be preserved as a generated
review artifact from the working branch
> - This pull request groups the test/local-dev maintenance pieces so
they can be reviewed separately from product runtime changes
> - The benefit is more predictable contributor verification and cleaner
local maintenance without mixing these changes into feature PRs

## What Changed

- Added stable Vitest runner support and serialized route/authz test
isolation.
- Fixed workspace runtime authz route mocks and stabilized
Claude/company-import related assertions.
- Allowed Codex fast mode for manually selected models.
- Broadened the agent browser cleanup script to detect
`chrome-headless-shell` as well as Chrome for Testing.
- Preserved generated Storybook static output from the source branch.

## Verification

- `pnpm exec vitest run
src/__tests__/workspace-runtime-routes-authz.test.ts
src/__tests__/claude-local-execute.test.ts --config vitest.config.ts`
from `server/` passed: 2 files, 19 tests.
- `pnpm exec vitest run src/server/codex-args.test.ts --config
vitest.config.ts` from `packages/adapters/codex-local/` passed: 1 file,
3 tests.
- `bash -n scripts/kill-agent-browsers.sh &&
scripts/kill-agent-browsers.sh --dry` passed; dry-run detected
`chrome-headless-shell` processes without killing them.
- `test -f ui/storybook-static/index.html && test -f
ui/storybook-static/assets/forms-editors.stories-Dry7qwx2.js` passed.
- `git diff --check public-gh/master..pap-2228-test-local-maintenance --
. ':(exclude)ui/storybook-static'` passed.
- `pnpm exec vitest run
cli/src/__tests__/company-import-export-e2e.test.ts --config
cli/vitest.config.ts` did not complete in the isolated split worktree
because `paperclipai run` exited during build prep with `TS2688: Cannot
find type definition file for 'react'`; this appears to be caused by the
worktree dependency symlink setup, not the code under test.
- Confirmed this PR does not include `pnpm-lock.yaml`.

## Risks

- Medium risk: the stable Vitest runner changes how route/authz tests
are scheduled.
- Generated `ui/storybook-static` files are large and contain minified
third-party output; `git diff --check` reports whitespace inside those
generated assets, so reviewers may choose to drop or regenerate that
artifact before merge.
- No database migrations.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may need to be redirected — check the roadmap
first. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex coding agent based on GPT-5, with shell, git, Paperclip
API, and GitHub CLI tool use in the local Paperclip workspace.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

Note: the screenshot checklist item is not applicable because no source
UI behavior changed; the included Storybook static output is preserved as
a generated artifact from the source branch.

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-24 15:11:42 -05:00
202 changed files with 13405 additions and 3626 deletions

.gitignore (vendored, 1 line changed)
View File

@@ -3,6 +3,7 @@ node_modules/
**/node_modules
**/node_modules/
dist/
ui/storybook-static/
.env
*.tsbuildinfo
drizzle/meta/

View File

@@ -398,10 +398,11 @@ describeEmbeddedPostgres("paperclipai company import/export e2e", () => {
apiBase,
`/api/companies/${importedNew.company.id}/issues`,
);
const importedMatchingIssues = importedIssues.filter((issue) => issue.title === sourceIssue.title);
expect(importedAgents.map((agent) => agent.name)).toContain(sourceAgent.name);
expect(importedProjects.map((project) => project.name)).toContain(sourceProject.name);
expect(importedIssues.map((issue) => issue.title)).toContain(sourceIssue.title);
expect(importedMatchingIssues).toHaveLength(1);
const previewExisting = await runCliJson<{
errors: string[];
@@ -471,11 +472,13 @@ describeEmbeddedPostgres("paperclipai company import/export e2e", () => {
apiBase,
`/api/companies/${importedNew.company.id}/issues`,
);
const twiceImportedMatchingIssues = twiceImportedIssues.filter((issue) => issue.title === sourceIssue.title);
expect(twiceImportedAgents).toHaveLength(2);
expect(new Set(twiceImportedAgents.map((agent) => agent.name)).size).toBe(2);
expect(twiceImportedProjects).toHaveLength(2);
expect(twiceImportedIssues).toHaveLength(2);
expect(twiceImportedMatchingIssues).toHaveLength(2);
expect(new Set(twiceImportedMatchingIssues.map((issue) => issue.identifier)).size).toBe(2);
const zipPath = path.join(tempRoot, "exported-company.zip");
const portableFiles: Record<string, string> = {};

View File

@@ -61,6 +61,7 @@ interface IssueUpdateOptions extends BaseClientOptions {
interface IssueCommentOptions extends BaseClientOptions {
body: string;
reopen?: boolean;
resume?: boolean;
}
interface IssueCheckoutOptions extends BaseClientOptions {
@@ -241,12 +242,14 @@ export function registerIssueCommands(program: Command): void {
.argument("<issueId>", "Issue ID")
.requiredOption("--body <text>", "Comment body")
.option("--reopen", "Reopen if issue is done/cancelled")
.option("--resume", "Request explicit follow-up and wake the assignee when resumable")
.action(async (issueId: string, opts: IssueCommentOptions) => {
try {
const ctx = resolveCommandContext(opts);
const payload = addIssueCommentSchema.parse({
body: opts.body,
reopen: opts.reopen,
resume: opts.resume,
});
const comment = await ctx.api.post<IssueComment>(`/api/issues/${issueId}/comments`, payload);
printOutput(comment, { json: ctx.json });

View File

@@ -37,7 +37,7 @@ These decisions close open questions from `SPEC.md` for V1.
| Visibility | Full visibility to board and all agents in same company |
| Communication | Tasks + comments only (no separate chat system) |
| Task ownership | Single assignee; atomic checkout required for `in_progress` transition |
| Recovery | No automatic reassignment; work recovery stays manual/explicit |
| Recovery | No automatic reassignment; control-plane recovery may retry lost execution continuity once, then uses explicit recovery issues or human escalation |
| Agent adapters | Built-in `process` and `http` adapters |
| Auth | Mode-dependent human auth (`local_trusted` implicit board in current code; authenticated mode uses sessions), API keys for agents |
| Budget period | Monthly UTC calendar window |
@@ -395,7 +395,7 @@ Side effects:
- entering `done` sets `completed_at`
- entering `cancelled` sets `cancelled_at`
Detailed ownership, execution, blocker, and crash-recovery semantics are documented in `doc/execution-semantics.md`.
Detailed ownership, execution, blocker, active-run watchdog, and crash-recovery semantics are documented in `doc/execution-semantics.md`.
## 8.3 Approval Status

View File

@@ -1,7 +1,7 @@
# Execution Semantics
Status: Current implementation guide
Date: 2026-04-13
Date: 2026-04-23
Audience: Product and engineering
This document explains how Paperclip interprets issue assignment, issue status, execution runs, wakeups, parent/sub-issue structure, and blocker relationships.
@@ -218,15 +218,81 @@ This is an active-work continuity recovery.
Startup recovery and periodic recovery are different from normal wakeup delivery.
On startup and on the periodic recovery loop, Paperclip now does three things in sequence:
On startup and on the periodic recovery loop, Paperclip now does four things in sequence:
1. reap orphaned `running` runs
2. resume persisted `queued` runs
3. reconcile stranded assigned work
4. scan silent active runs and create or update explicit watchdog review issues
That last step is what closes the gap where issue state survives a crash but the wake/run path does not.
The stranded-work pass closes the gap where issue state survives a crash but the wake/run path does not. The silent-run scan covers the separate case where a live process exists but has stopped producing observable output.
## 10. What This Does Not Mean
## 10. Silent Active-Run Watchdog
An active run can still be unhealthy even when its process is `running`. Paperclip treats prolonged output silence as a watchdog signal, not as proof that the run is failed.
The recovery service owns this contract:
- classify active-run output silence as `ok`, `suspicious`, `critical`, `snoozed`, or `not_applicable`
- collect bounded evidence from run logs, recent run events, child issues, and blockers
- preserve redaction and truncation before evidence is written to issue descriptions
- create at most one open `stale_active_run_evaluation` issue per run
- honor active snooze decisions before creating more review work
- build the `outputSilence` summary shown by live-run and active-run API responses
Suspicious silence creates a medium-priority review issue for the selected recovery owner. Critical silence raises that review issue to high priority and blocks the source issue on the explicit evaluation task without cancelling the active process.
Watchdog decisions are explicit operator/recovery-owner decisions:
- `snooze` records an operator-chosen future quiet-until time and suppresses scan-created review work during that window
- `continue` records that the current evidence is acceptable, does not cancel or mutate the active run, and sets a 30-minute default re-arm window before the watchdog evaluates the still-silent run again
- `dismissed_false_positive` records why the review was not actionable
Operators should prefer `snooze` for known time-bounded quiet periods. `continue` is only a short acknowledgement of the current evidence; if the run remains silent after the re-arm window, the periodic watchdog scan can create or update review work again.
The board can record watchdog decisions. The assigned owner of the watchdog evaluation issue can also record them. Other agents cannot.
## 11. Auto-Recover vs Explicit Recovery vs Human Escalation
Paperclip uses three different recovery outcomes, depending on how much it can safely infer.
### Auto-Recover
Auto-recovery is allowed when ownership is clear and the control plane only lost execution continuity.
Examples:
- requeue one dispatch wake for an assigned `todo` issue whose latest run failed, timed out, or was cancelled
- requeue one continuation wake for an assigned `in_progress` issue whose live execution path disappeared
- assign an orphan blocker back to its creator when that blocker is already preventing other work
Auto-recovery preserves the existing owner. It does not choose a replacement agent.
### Explicit Recovery Issue
Paperclip creates an explicit recovery issue when the system can identify a problem but cannot safely complete the work itself.
Examples:
- automatic stranded-work retry was already exhausted
- a dependency graph has an invalid/uninvokable owner, unassigned blocker, or invalid review participant
- an active run is silent past the watchdog threshold
The source issue remains visible and blocked on the recovery issue when blocking is necessary for correctness. The recovery owner must restore a live path, resolve the source issue manually, or record the reason it is a false positive.
### Human Escalation
Human escalation is required when the next safe action depends on board judgment, budget/approval policy, or information unavailable to the control plane.
Examples:
- all candidate recovery owners are paused, terminated, pending approval, or budget-blocked
- the issue is human-owned rather than agent-owned
- the run is intentionally quiet but needs an operator decision before cancellation or continuation
In these cases Paperclip should leave a visible issue/comment trail instead of silently retrying.
## 12. What This Does Not Mean
These semantics do not change V1 into an auto-reassignment system.
@@ -240,9 +306,10 @@ The recovery model is intentionally conservative:
- preserve ownership
- retry once when the control plane lost execution continuity
- create explicit recovery work when the system can identify a bounded recovery owner/action
- escalate visibly when the system cannot safely keep going
## 11. Practical Interpretation
## 13. Practical Interpretation
For a board operator, the intended meaning is:

View File

@@ -17,7 +17,7 @@
"typecheck": "pnpm run preflight:workspace-links && pnpm -r typecheck",
"test": "pnpm run test:run",
"test:watch": "pnpm run preflight:workspace-links && vitest",
"test:run": "pnpm run preflight:workspace-links && vitest run",
"test:run": "pnpm run preflight:workspace-links && node scripts/run-vitest-stable.mjs",
"db:generate": "pnpm --filter @paperclipai/db generate",
"db:migrate": "pnpm --filter @paperclipai/db migrate",
"issue-references:backfill": "pnpm run preflight:workspace-links && tsx scripts/backfill-issue-reference-mentions.ts",

View File

@@ -92,6 +92,7 @@ export const DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE = [
"- If woken by a human comment on a dependency-blocked issue, respond or triage the comment without treating the blocked deliverable work as unblocked.",
"- Create child issues directly when you know what needs to be done; use issue-thread interactions when the board/user must choose suggested tasks, answer structured questions, or confirm a proposal.",
"- To ask for that input, create an interaction on the current issue with POST /api/issues/{issueId}/interactions using kind suggest_tasks, ask_user_questions, or request_confirmation. Use continuationPolicy wake_assignee when you need to resume after a response; for request_confirmation this resumes only after acceptance.",
"- When you intentionally restart follow-up work on a completed assigned issue, include structured `resume: true` with the POST /api/issues/{issueId}/comments or PATCH /api/issues/{issueId} comment payload. Generic agent comments on closed issues are inert by default.",
"- For plan approval, update the plan document first, then create request_confirmation targeting the latest plan revision with idempotencyKey confirmation:{issueId}:plan:{revisionId}. Wait for acceptance before creating implementation subtasks, and create a fresh confirmation after superseding board/user comments if approval is still needed.",
"- If blocked, mark the issue blocked and name the unblock owner and action.",
"- Respect budget, pause/cancel, approval gates, and company boundaries.",
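The interaction flow the template describes can be sketched as a request builder plus an HTTP call. The endpoint path and idempotency-key format come from the template text above; the request-body interface, base URL handling, and headers are assumptions, not code from this diff.

```typescript
// Hypothetical sketch of creating a plan-approval confirmation via the
// issue-interactions endpoint named in the prompt template.
interface CreateInteractionBody {
  kind: "suggest_tasks" | "ask_user_questions" | "request_confirmation";
  continuationPolicy?: "wake_assignee";
  idempotencyKey?: string;
}

function buildPlanConfirmation(issueId: string, revisionId: string): CreateInteractionBody {
  return {
    kind: "request_confirmation",
    continuationPolicy: "wake_assignee",
    // Key format the template prescribes for plan approval.
    idempotencyKey: `confirmation:${issueId}:plan:${revisionId}`,
  };
}

// Assumed endpoint shape; real client wiring lives elsewhere in the repo.
async function createInteraction(baseUrl: string, issueId: string, body: CreateInteractionBody) {
  return fetch(`${baseUrl}/api/issues/${encodeURIComponent(issueId)}/interactions`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(body),
  });
}
```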

View File

@@ -4,7 +4,23 @@ export const DEFAULT_CODEX_LOCAL_MODEL = "gpt-5.3-codex";
export const DEFAULT_CODEX_LOCAL_BYPASS_APPROVALS_AND_SANDBOX = true;
export const CODEX_LOCAL_FAST_MODE_SUPPORTED_MODELS = ["gpt-5.4"] as const;
function normalizeModelId(model: string | null | undefined): string {
return typeof model === "string" ? model.trim() : "";
}
export function isCodexLocalKnownModel(model: string | null | undefined): boolean {
const normalizedModel = normalizeModelId(model);
if (!normalizedModel) return false;
return models.some((entry) => entry.id === normalizedModel);
}
export function isCodexLocalManualModel(model: string | null | undefined): boolean {
const normalizedModel = normalizeModelId(model);
return Boolean(normalizedModel) && !isCodexLocalKnownModel(normalizedModel);
}
export function isCodexLocalFastModeSupported(model: string | null | undefined): boolean {
if (isCodexLocalManualModel(model)) return true;
const normalizedModel = typeof model === "string" ? model.trim() : "";
return CODEX_LOCAL_FAST_MODE_SUPPORTED_MODELS.includes(
normalizedModel as (typeof CODEX_LOCAL_FAST_MODE_SUPPORTED_MODELS)[number],
@@ -35,7 +51,7 @@ Core fields:
- modelReasoningEffort (string, optional): reasoning effort override (minimal|low|medium|high|xhigh) passed via -c model_reasoning_effort=...
- promptTemplate (string, optional): run prompt template
- search (boolean, optional): run codex with --search
- fastMode (boolean, optional): enable Codex Fast mode; currently supported on GPT-5.4 only and consumes credits faster
- fastMode (boolean, optional): enable Codex Fast mode; supported on GPT-5.4 and passed through for manual model IDs
- dangerouslyBypassApprovalsAndSandbox (boolean, optional): run with bypass flag
- command (string, optional): defaults to "codex"
- extraArgs (string[], optional): additional CLI args
@@ -54,6 +70,6 @@ Notes:
- Paperclip injects desired local skills into the effective CODEX_HOME/skills/ directory at execution time so Codex can discover "$paperclip" and related skills without polluting the project working directory. In managed-home mode (the default) this is ~/.paperclip/instances/<id>/companies/<companyId>/codex-home/skills/; when CODEX_HOME is explicitly overridden in adapter config, that override is used instead.
- Unless explicitly overridden in adapter config, Paperclip runs Codex with a per-company managed CODEX_HOME under the active Paperclip instance and seeds auth/config from the shared Codex home (the CODEX_HOME env var, when set, or ~/.codex).
- Some model/tool combinations reject certain effort levels (for example minimal with web search enabled).
- Fast mode is currently supported on GPT-5.4 only. When enabled, Paperclip applies \`service_tier="fast"\` and \`features.fast_mode=true\`.
- Fast mode is supported on GPT-5.4 and manual model IDs. When enabled for those models, Paperclip applies \`service_tier="fast"\` and \`features.fast_mode=true\`.
- When Paperclip realizes a workspace/runtime for a run, it injects PAPERCLIP_WORKSPACE_* and PAPERCLIP_RUNTIME_* env vars for agent-side tooling.
`;
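The resulting fast-mode support matrix can be exercised directly. The known-models list below is a stand-in assumption, since the real `models` array is defined outside this hunk; only the predicate logic mirrors the diff.

```typescript
// Stand-in sketch of the fast-mode predicates above; the real `models`
// list is not shown in this hunk, so a minimal substitute is assumed.
const KNOWN_MODELS = ["gpt-5.3-codex", "gpt-5.4"];
const FAST_MODE_SUPPORTED = ["gpt-5.4"];

function isManualModel(model: string): boolean {
  const normalized = model.trim();
  // Manual means a non-empty ID that is not in the known-models list.
  return normalized.length > 0 && !KNOWN_MODELS.includes(normalized);
}

function isFastModeSupported(model: string): boolean {
  // Manual (unknown) model IDs pass fast mode through unchanged;
  // known models must appear in the supported list.
  if (isManualModel(model)) return true;
  return FAST_MODE_SUPPORTED.includes(model.trim());
}
```

With these assumptions, `"gpt-5.4"` and any manually configured ID report fast-mode support, while `"gpt-5.3-codex"` does not.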

View File

@@ -26,6 +26,28 @@ describe("buildCodexExecArgs", () => {
]);
});
it("enables Codex fast mode overrides for manual models", () => {
const result = buildCodexExecArgs({
model: "gpt-5.5",
fastMode: true,
});
expect(result.fastModeRequested).toBe(true);
expect(result.fastModeApplied).toBe(true);
expect(result.fastModeIgnoredReason).toBeNull();
expect(result.args).toEqual([
"exec",
"--json",
"--model",
"gpt-5.5",
"-c",
'service_tier="fast"',
"-c",
"features.fast_mode=true",
"-",
]);
});
it("ignores fast mode for unsupported models", () => {
const result = buildCodexExecArgs({
model: "gpt-5.3-codex",
@@ -34,7 +56,9 @@ describe("buildCodexExecArgs", () => {
expect(result.fastModeRequested).toBe(true);
expect(result.fastModeApplied).toBe(false);
expect(result.fastModeIgnoredReason).toContain("currently only supported on gpt-5.4");
expect(result.fastModeIgnoredReason).toContain(
"currently only supported on gpt-5.4 or manually configured model IDs",
);
expect(result.args).toEqual([
"exec",
"--json",

View File

@@ -25,7 +25,7 @@ function asRecord(value: unknown): Record<string, unknown> {
}
function formatFastModeSupportedModels(): string {
return CODEX_LOCAL_FAST_MODE_SUPPORTED_MODELS.join(", ");
return `${CODEX_LOCAL_FAST_MODE_SUPPORTED_MODELS.join(", ")} or manually configured model IDs`;
}
export function buildCodexExecArgs(

View File

@@ -146,7 +146,7 @@ export async function testEnvironment(
code: "codex_fast_mode_unsupported_model",
level: "warn",
message: execArgs.fastModeIgnoredReason,
hint: "Switch the agent model to GPT-5.4 to enable Codex Fast mode.",
hint: "Switch the agent model to GPT-5.4 or enter a manual model ID to enable Codex Fast mode.",
});
}

View File

@@ -0,0 +1,13 @@
CREATE UNIQUE INDEX IF NOT EXISTS "issues_active_liveness_recovery_incident_uq"
ON "issues" USING btree ("company_id","origin_kind","origin_id")
WHERE "origin_kind" = 'harness_liveness_escalation'
AND "origin_id" IS NOT NULL
AND "hidden_at" IS NULL
AND "status" NOT IN ('done', 'cancelled');
--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "issues_active_liveness_recovery_leaf_uq"
ON "issues" USING btree ("company_id","origin_kind","origin_fingerprint")
WHERE "origin_kind" = 'harness_liveness_escalation'
AND "origin_fingerprint" <> 'default'
AND "hidden_at" IS NULL
AND "status" NOT IN ('done', 'cancelled');

View File

@@ -0,0 +1,70 @@
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "last_output_at" timestamp with time zone;
--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "last_output_seq" integer DEFAULT 0 NOT NULL;
--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "last_output_stream" text;
--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "last_output_bytes" bigint;
--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "heartbeat_runs_company_status_last_output_idx"
ON "heartbeat_runs" USING btree ("company_id","status","last_output_at");
--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "heartbeat_runs_company_status_process_started_idx"
ON "heartbeat_runs" USING btree ("company_id","status","process_started_at");
--> statement-breakpoint
CREATE TABLE IF NOT EXISTS "heartbeat_run_watchdog_decisions" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"company_id" uuid NOT NULL,
"run_id" uuid NOT NULL,
"evaluation_issue_id" uuid,
"decision" text NOT NULL,
"snoozed_until" timestamp with time zone,
"reason" text,
"created_by_agent_id" uuid,
"created_by_user_id" text,
"created_by_run_id" uuid,
"created_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "heartbeat_run_watchdog_decisions" ADD CONSTRAINT "heartbeat_run_watchdog_decisions_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "heartbeat_run_watchdog_decisions" ADD CONSTRAINT "heartbeat_run_watchdog_decisions_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE cascade ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "heartbeat_run_watchdog_decisions" ADD CONSTRAINT "heartbeat_run_watchdog_decisions_evaluation_issue_id_issues_id_fk" FOREIGN KEY ("evaluation_issue_id") REFERENCES "public"."issues"("id") ON DELETE set null ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "heartbeat_run_watchdog_decisions" ADD CONSTRAINT "heartbeat_run_watchdog_decisions_created_by_agent_id_agents_id_fk" FOREIGN KEY ("created_by_agent_id") REFERENCES "public"."agents"("id") ON DELETE set null ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
DO $$ BEGIN
ALTER TABLE "heartbeat_run_watchdog_decisions" ADD CONSTRAINT "heartbeat_run_watchdog_decisions_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;
EXCEPTION
WHEN duplicate_object THEN null;
END $$;
--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "heartbeat_run_watchdog_decisions_company_run_created_idx"
ON "heartbeat_run_watchdog_decisions" USING btree ("company_id","run_id","created_at");
--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "heartbeat_run_watchdog_decisions_company_run_snooze_idx"
ON "heartbeat_run_watchdog_decisions" USING btree ("company_id","run_id","snoozed_until");
--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "issues_active_stale_run_evaluation_uq"
ON "issues" USING btree ("company_id","origin_kind","origin_id")
WHERE "origin_kind" = 'stale_active_run_evaluation'
AND "origin_id" IS NOT NULL
AND "hidden_at" IS NULL
AND "status" NOT IN ('done', 'cancelled');

View File

@@ -484,6 +484,20 @@
"when": 1776959400000,
"tag": "0068_environment_local_driver_unique",
"breakpoints": true
},
{
"idx": 69,
"version": "7",
"when": 1776780003000,
"tag": "0069_liveness_recovery_dedupe",
"breakpoints": true
},
{
"idx": 70,
"version": "7",
"when": 1776780004000,
"tag": "0070_active_run_output_watchdog",
"breakpoints": true
}
]
}

View File

@@ -0,0 +1,34 @@
import { index, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { agents } from "./agents.js";
import { companies } from "./companies.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
import { issues } from "./issues.js";
export const heartbeatRunWatchdogDecisions = pgTable(
"heartbeat_run_watchdog_decisions",
{
id: uuid("id").primaryKey().defaultRandom(),
companyId: uuid("company_id").notNull().references(() => companies.id),
runId: uuid("run_id").notNull().references(() => heartbeatRuns.id, { onDelete: "cascade" }),
evaluationIssueId: uuid("evaluation_issue_id").references(() => issues.id, { onDelete: "set null" }),
decision: text("decision").notNull(),
snoozedUntil: timestamp("snoozed_until", { withTimezone: true }),
reason: text("reason"),
createdByAgentId: uuid("created_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
createdByUserId: text("created_by_user_id"),
createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
},
(table) => ({
companyRunCreatedIdx: index("heartbeat_run_watchdog_decisions_company_run_created_idx").on(
table.companyId,
table.runId,
table.createdAt,
),
companyRunSnoozeIdx: index("heartbeat_run_watchdog_decisions_company_run_snooze_idx").on(
table.companyId,
table.runId,
table.snoozedUntil,
),
}),
);

View File

@@ -34,6 +34,10 @@ export const heartbeatRuns = pgTable(
processPid: integer("process_pid"),
processGroupId: integer("process_group_id"),
processStartedAt: timestamp("process_started_at", { withTimezone: true }),
lastOutputAt: timestamp("last_output_at", { withTimezone: true }),
lastOutputSeq: integer("last_output_seq").notNull().default(0),
lastOutputStream: text("last_output_stream"),
lastOutputBytes: bigint("last_output_bytes", { mode: "number" }),
retryOfRunId: uuid("retry_of_run_id").references((): AnyPgColumn => heartbeatRuns.id, {
onDelete: "set null",
}),
@@ -64,5 +68,15 @@ export const heartbeatRuns = pgTable(
table.livenessState,
table.createdAt,
),
companyStatusLastOutputIdx: index("heartbeat_runs_company_status_last_output_idx").on(
table.companyId,
table.status,
table.lastOutputAt,
),
companyStatusProcessStartedIdx: index("heartbeat_runs_company_status_process_started_idx").on(
table.companyId,
table.status,
table.processStartedAt,
),
}),
);

View File

@@ -53,6 +53,7 @@ export { documentRevisions } from "./document_revisions.js";
export { issueDocuments } from "./issue_documents.js";
export { heartbeatRuns } from "./heartbeat_runs.js";
export { heartbeatRunEvents } from "./heartbeat_run_events.js";
export { heartbeatRunWatchdogDecisions } from "./heartbeat_run_watchdog_decisions.js";
export { costEvents } from "./cost_events.js";
export { financeEvents } from "./finance_events.js";
export { approvals } from "./approvals.js";

View File

@@ -91,5 +91,29 @@ export const issues = pgTable(
and ${table.executionRunId} is not null
and ${table.status} in ('backlog', 'todo', 'in_progress', 'in_review', 'blocked')`,
),
activeLivenessRecoveryIncidentIdx: uniqueIndex("issues_active_liveness_recovery_incident_uq")
.on(table.companyId, table.originKind, table.originId)
.where(
sql`${table.originKind} = 'harness_liveness_escalation'
and ${table.originId} is not null
and ${table.hiddenAt} is null
and ${table.status} not in ('done', 'cancelled')`,
),
activeLivenessRecoveryLeafIdx: uniqueIndex("issues_active_liveness_recovery_leaf_uq")
.on(table.companyId, table.originKind, table.originFingerprint)
.where(
sql`${table.originKind} = 'harness_liveness_escalation'
and ${table.originFingerprint} <> 'default'
and ${table.hiddenAt} is null
and ${table.status} not in ('done', 'cancelled')`,
),
activeStaleRunEvaluationIdx: uniqueIndex("issues_active_stale_run_evaluation_uq")
.on(table.companyId, table.originKind, table.originId)
.where(
sql`${table.originKind} = 'stale_active_run_evaluation'
and ${table.originId} is not null
and ${table.hiddenAt} is null
and ${table.status} not in ('done', 'cancelled')`,
),
}),
);

View File

@@ -33,77 +33,56 @@ export type EmbeddedPostgresTestDatabase = {
let embeddedPostgresSupportPromise: Promise<EmbeddedPostgresTestSupport> | null = null;
const DEFAULT_PAPERCLIP_EMBEDDED_POSTGRES_PORT = 54329;
function getReservedTestPorts(): Set<number> {
const configuredPorts = [
DEFAULT_PAPERCLIP_EMBEDDED_POSTGRES_PORT,
Number.parseInt(process.env.PAPERCLIP_EMBEDDED_POSTGRES_PORT ?? "", 10),
...String(process.env.PAPERCLIP_TEST_POSTGRES_RESERVED_PORTS ?? "")
.split(",")
.map((value) => Number.parseInt(value.trim(), 10)),
];
return new Set(configuredPorts.filter((port) => Number.isInteger(port) && port > 0 && port <= 65535));
}
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
const reservedPorts = getReservedTestPorts();
for (let attempt = 0; attempt < 20; attempt += 1) {
const port = await new Promise<number>((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
});
});
});
});
}
function formatEmbeddedPostgresError(error: unknown): string {
if (error instanceof Error && error.message.length > 0) return error.message;
if (typeof error === "string" && error.length > 0) return error;
return "embedded Postgres startup failed";
}
async function probeEmbeddedPostgresSupport(): Promise<EmbeddedPostgresTestSupport> {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-embedded-postgres-probe-"));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: () => {},
onError: () => {},
});
try {
await instance.initialise();
await instance.start();
return { supported: true };
} catch (error) {
return {
supported: false,
reason: formatEmbeddedPostgresError(error),
};
} finally {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
if (!reservedPorts.has(port)) return port;
}
throw new Error(
`Failed to allocate embedded Postgres test port outside reserved Paperclip ports: ${[
...reservedPorts,
].join(", ")}`,
);
}
export async function getEmbeddedPostgresTestSupport(): Promise<EmbeddedPostgresTestSupport> {
if (!embeddedPostgresSupportPromise) {
embeddedPostgresSupportPromise = probeEmbeddedPostgresSupport();
}
return await embeddedPostgresSupportPromise;
}
export async function startEmbeddedPostgresTestDatabase(
tempDirPrefix: string,
): Promise<EmbeddedPostgresTestDatabase> {
async function createEmbeddedPostgresTestInstance(tempDirPrefix: string) {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), tempDirPrefix));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
@@ -118,6 +97,51 @@ export async function startEmbeddedPostgresTestDatabase(
onError: () => {},
});
return { dataDir, port, instance };
}
function cleanupEmbeddedPostgresTestDirs(dataDir: string) {
fs.rmSync(dataDir, { recursive: true, force: true });
}
function formatEmbeddedPostgresError(error: unknown): string {
if (error instanceof Error && error.message.length > 0) return error.message;
if (typeof error === "string" && error.length > 0) return error;
return "embedded Postgres startup failed";
}
async function probeEmbeddedPostgresSupport(): Promise<EmbeddedPostgresTestSupport> {
const { dataDir, instance } = await createEmbeddedPostgresTestInstance(
"paperclip-embedded-postgres-probe-",
);
try {
await instance.initialise();
await instance.start();
return { supported: true };
} catch (error) {
return {
supported: false,
reason: formatEmbeddedPostgresError(error),
};
} finally {
await instance.stop().catch(() => {});
cleanupEmbeddedPostgresTestDirs(dataDir);
}
}
export async function getEmbeddedPostgresTestSupport(): Promise<EmbeddedPostgresTestSupport> {
if (!embeddedPostgresSupportPromise) {
embeddedPostgresSupportPromise = probeEmbeddedPostgresSupport();
}
return await embeddedPostgresSupportPromise;
}
export async function startEmbeddedPostgresTestDatabase(
tempDirPrefix: string,
): Promise<EmbeddedPostgresTestDatabase> {
const { dataDir, port, instance } = await createEmbeddedPostgresTestInstance(tempDirPrefix);
try {
await instance.initialise();
await instance.start();
@@ -131,12 +155,12 @@ export async function startEmbeddedPostgresTestDatabase(
connectionString,
cleanup: async () => {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
cleanupEmbeddedPostgresTestDirs(dataDir);
},
};
} catch (error) {
await instance.stop().catch(() => {});
fs.rmSync(dataDir, { recursive: true, force: true });
cleanupEmbeddedPostgresTestDirs(dataDir);
throw new Error(
`Failed to start embedded PostgreSQL test database: ${formatEmbeddedPostgresError(error)}`,
);

View File

@@ -450,7 +450,7 @@ export function createToolDefinitions(client: PaperclipApiClient): ToolDefinitio
),
makeTool(
"paperclipUpdateIssue",
"Patch an issue, optionally including a comment",
"Patch an issue, optionally including a comment; include resume=true when intentionally requesting follow-up on resumable closed work",
updateIssueToolSchema,
async ({ issueId, ...body }) =>
client.requestJson("PATCH", `/issues/${encodeURIComponent(issueId)}`, { body }),
@@ -475,7 +475,7 @@ export function createToolDefinitions(client: PaperclipApiClient): ToolDefinitio
),
makeTool(
"paperclipAddComment",
"Add a comment to an issue",
"Add a comment to an issue; include resume=true when intentionally requesting follow-up on resumable closed work",
addCommentToolSchema,
async ({ issueId, ...body }) =>
client.requestJson("POST", `/issues/${encodeURIComponent(issueId)}/comments`, { body }),

View File

@@ -162,7 +162,7 @@ export const ISSUE_THREAD_INTERACTION_CONTINUATION_POLICIES = [
export type IssueThreadInteractionContinuationPolicy =
(typeof ISSUE_THREAD_INTERACTION_CONTINUATION_POLICIES)[number];
export const ISSUE_ORIGIN_KINDS = ["manual", "routine_execution"] as const;
export const ISSUE_ORIGIN_KINDS = ["manual", "routine_execution", "stale_active_run_evaluation"] as const;
export type BuiltInIssueOriginKind = (typeof ISSUE_ORIGIN_KINDS)[number];
export type PluginIssueOriginKind = `plugin:${string}`;
export type IssueOriginKind = BuiltInIssueOriginKind | PluginIssueOriginKind;

View File

@@ -1,5 +1,6 @@
import type { AgentAdapterType, EnvironmentDriver } from "./constants.js";
import type { SandboxEnvironmentProvider } from "./types/environment.js";
import type { JsonSchema } from "./types/plugin.js";
export type EnvironmentSupportStatus = "supported" | "unsupported";
@@ -20,6 +21,7 @@ export interface EnvironmentProviderCapability {
source?: "builtin" | "plugin";
pluginKey?: string;
pluginId?: string;
configSchema?: JsonSchema;
}
export interface EnvironmentCapabilities {
@@ -81,7 +83,7 @@ export function getAdapterEnvironmentSupport(
const supportedDrivers = new Set(supportedEnvironmentDriversForAdapter(adapterType));
const supportedProviders = new Set(supportedSandboxProvidersForAdapter(adapterType, additionalSandboxProviders));
const sandboxProviders: Record<SandboxEnvironmentProvider, EnvironmentSupportStatus> = {
fake: supportedProviders.has("fake") ? "supported" : "unsupported",
fake: "unsupported",
};
for (const provider of additionalSandboxProviders) {
sandboxProviders[provider as SandboxEnvironmentProvider] = supportedProviders.has(provider as SandboxEnvironmentProvider)
@@ -130,6 +132,7 @@ export function getEnvironmentCapabilities(
source: capability.source ?? "plugin",
pluginKey: capability.pluginKey,
pluginId: capability.pluginId,
configSchema: capability.configSchema,
};
}
return {

View File

@@ -324,6 +324,9 @@ export type {
IssueWorkProductReviewState,
Issue,
IssueAssigneeAdapterOverrides,
IssueBlockerAttention,
IssueBlockerAttentionReason,
IssueBlockerAttentionState,
IssueReferenceSource,
IssueRelatedWorkItem,
IssueRelatedWorkSummary,

View File

@@ -22,16 +22,6 @@ export interface SshEnvironmentConfig {
strictHostKeyChecking: boolean;
}
/**
* Known sandbox environment provider keys.
*
* `"fake"` is a built-in test-only provider.
*
* Additional providers can be added by installing sandbox provider plugins
* that declare matching `environmentDrivers` in their manifest. The type
* includes `string` to allow plugin-backed providers without requiring
* shared type changes.
*/
export type SandboxEnvironmentProvider = "fake" | (string & {});
export interface FakeSandboxEnvironmentConfig {

View File

@@ -37,6 +37,10 @@ export interface HeartbeatRun {
processPid: number | null;
processGroupId?: number | null;
processStartedAt: Date | null;
lastOutputAt: Date | null;
lastOutputSeq: number;
lastOutputStream: "stdout" | "stderr" | null;
lastOutputBytes: number | null;
retryOfRunId: string | null;
processLossRetryCount: number;
scheduledRetryAt?: Date | null;
@@ -51,6 +55,29 @@ export interface HeartbeatRun {
contextSnapshot: Record<string, unknown> | null;
createdAt: Date;
updatedAt: Date;
outputSilence?: HeartbeatRunOutputSilence;
}
export type HeartbeatRunOutputSilenceLevel =
| "not_applicable"
| "ok"
| "suspicious"
| "critical"
| "snoozed";
export interface HeartbeatRunOutputSilence {
lastOutputAt: Date | string | null;
lastOutputSeq: number;
lastOutputStream: "stdout" | "stderr" | null;
silenceStartedAt: Date | string | null;
silenceAgeMs: number | null;
level: HeartbeatRunOutputSilenceLevel;
suspicionThresholdMs: number;
criticalThresholdMs: number;
snoozedUntil: Date | string | null;
evaluationIssueId: string | null;
evaluationIssueIdentifier: string | null;
evaluationIssueAssigneeAgentId: string | null;
}
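The `HeartbeatRunOutputSilence` fields suggest a straightforward level classification. The precedence below (snooze first, then critical, then suspicious) is an assumption about how a watchdog might consume these fields, not code from this diff.

```typescript
// Hypothetical classifier over the HeartbeatRunOutputSilence fields;
// the check ordering (snoozed, critical, suspicious, ok) is assumed.
type SilenceLevel = "not_applicable" | "ok" | "suspicious" | "critical" | "snoozed";

function classifySilence(
  silenceAgeMs: number | null,
  suspicionThresholdMs: number,
  criticalThresholdMs: number,
  snoozedUntil: Date | null,
  now: Date,
): SilenceLevel {
  // No measurable silence (e.g. run never produced output tracking).
  if (silenceAgeMs === null) return "not_applicable";
  // An active snooze suppresses threshold evaluation.
  if (snoozedUntil && snoozedUntil.getTime() > now.getTime()) return "snoozed";
  if (silenceAgeMs >= criticalThresholdMs) return "critical";
  if (silenceAgeMs >= suspicionThresholdMs) return "suspicious";
  return "ok";
}
```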
export interface AgentWakeupSkipped {

View File

@@ -118,6 +118,9 @@ export type {
export type {
Issue,
IssueAssigneeAdapterOverrides,
IssueBlockerAttention,
IssueBlockerAttentionReason,
IssueBlockerAttentionState,
IssueReferenceSource,
IssueRelatedWorkItem,
IssueRelatedWorkSummary,

View File

@@ -27,6 +27,7 @@ export interface InstanceExperimentalSettings {
enableEnvironments: boolean;
enableIsolatedWorkspaces: boolean;
autoRestartDevServerWhenIdle: boolean;
enableIssueGraphLivenessAutoRecovery: boolean;
}
export interface InstanceSettings {

View File

@@ -116,6 +116,24 @@ export interface IssueRelationIssueSummary {
priority: IssuePriority;
assigneeAgentId: string | null;
assigneeUserId: string | null;
terminalBlockers?: IssueRelationIssueSummary[];
}
export type IssueBlockerAttentionState = "none" | "covered" | "needs_attention";
export type IssueBlockerAttentionReason =
| "active_child"
| "active_dependency"
| "attention_required"
| null;
export interface IssueBlockerAttention {
state: IssueBlockerAttentionState;
reason: IssueBlockerAttentionReason;
unresolvedBlockerCount: number;
coveredBlockerCount: number;
attentionBlockerCount: number;
sampleBlockerIdentifier: string | null;
}
export interface IssueRelation {
@@ -242,6 +260,7 @@ export interface Issue {
labels?: IssueLabel[];
blockedBy?: IssueRelationIssueSummary[];
blocks?: IssueRelationIssueSummary[];
blockerAttention?: IssueBlockerAttention;
relatedWork?: IssueRelatedWorkSummary;
referencedIssueIdentifiers?: string[];
planDocument?: IssueDocument | null;
@@ -267,6 +286,7 @@ export interface IssueComment {
authorAgentId: string | null;
authorUserId: string | null;
body: string;
followUpRequested?: boolean;
createdAt: Date;
updatedAt: Date;
}

View File

@@ -0,0 +1,31 @@
import { describe, expect, it } from "vitest";
import {
addApprovalCommentSchema,
requestApprovalRevisionSchema,
resolveApprovalSchema,
} from "./approval.js";
describe("approval validators", () => {
it("passes real line breaks through unchanged", () => {
expect(addApprovalCommentSchema.parse({ body: "Looks good\n\nApproved." }).body)
.toBe("Looks good\n\nApproved.");
expect(resolveApprovalSchema.parse({ decisionNote: "Decision\n\nApproved." }).decisionNote)
.toBe("Decision\n\nApproved.");
});
it("accepts null and omitted optional decision notes", () => {
expect(resolveApprovalSchema.parse({ decisionNote: null }).decisionNote).toBeNull();
expect(resolveApprovalSchema.parse({}).decisionNote).toBeUndefined();
expect(requestApprovalRevisionSchema.parse({ decisionNote: null }).decisionNote).toBeNull();
expect(requestApprovalRevisionSchema.parse({}).decisionNote).toBeUndefined();
});
it("normalizes escaped line breaks in approval comments and decision notes", () => {
expect(addApprovalCommentSchema.parse({ body: "Looks good\\n\\nApproved." }).body)
.toBe("Looks good\n\nApproved.");
expect(resolveApprovalSchema.parse({ decisionNote: "Decision\\n\\nApproved." }).decisionNote)
.toBe("Decision\n\nApproved.");
expect(requestApprovalRevisionSchema.parse({ decisionNote: "Decision\\r\\nRevise." }).decisionNote)
.toBe("Decision\nRevise.");
});
});
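The `multilineTextSchema` implementation in `./text.js` is not part of this diff; based on the expectations above, a minimal sketch of the normalization it performs might look like:

```typescript
// Hypothetical normalization behind multilineTextSchema (the real
// implementation in the validators' text module is not shown here).
// Literal JSON-escaped sequences ("\r\n", "\n") become real newlines;
// genuine line breaks already in the string pass through unchanged.
function normalizeEscapedLineBreaks(value: string): string {
  return value.replace(/\\r\\n/g, "\n").replace(/\\n/g, "\n");
}
```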

View File

@@ -1,5 +1,6 @@
import { z } from "zod";
import { APPROVAL_TYPES } from "../constants.js";
import { multilineTextSchema } from "./text.js";
export const createApprovalSchema = z.object({
type: z.enum(APPROVAL_TYPES),
@@ -11,13 +12,13 @@ export const createApprovalSchema = z.object({
export type CreateApproval = z.infer<typeof createApprovalSchema>;
export const resolveApprovalSchema = z.object({
decisionNote: z.string().optional().nullable(),
decisionNote: multilineTextSchema.optional().nullable(),
});
export type ResolveApproval = z.infer<typeof resolveApprovalSchema>;
export const requestApprovalRevisionSchema = z.object({
decisionNote: z.string().optional().nullable(),
decisionNote: multilineTextSchema.optional().nullable(),
});
export type RequestApprovalRevision = z.infer<typeof requestApprovalRevisionSchema>;
@@ -29,7 +30,7 @@ export const resubmitApprovalSchema = z.object({
export type ResubmitApproval = z.infer<typeof resubmitApprovalSchema>;
export const addApprovalCommentSchema = z.object({
body: z.string().min(1),
body: multilineTextSchema.pipe(z.string().min(1)),
});
export type AddApprovalComment = z.infer<typeof addApprovalCommentSchema>;

View File

@@ -36,6 +36,7 @@ export const instanceExperimentalSettingsSchema = z.object({
enableEnvironments: z.boolean().default(false),
enableIsolatedWorkspaces: z.boolean().default(false),
autoRestartDevServerWhenIdle: z.boolean().default(false),
enableIssueGraphLivenessAutoRecovery: z.boolean().default(false),
}).strict();
export const patchInstanceExperimentalSettingsSchema = instanceExperimentalSettingsSchema.partial();

View File

@@ -0,0 +1,78 @@
import { describe, expect, it } from "vitest";
import {
addIssueCommentSchema,
createIssueSchema,
respondIssueThreadInteractionSchema,
suggestedTaskDraftSchema,
updateIssueSchema,
upsertIssueDocumentSchema,
} from "./issue.js";
describe("issue validators", () => {
it("passes real line breaks through unchanged", () => {
const parsed = createIssueSchema.parse({
title: "Follow up PR",
description: "Line 1\n\nLine 2",
});
expect(parsed.description).toBe("Line 1\n\nLine 2");
});
it("accepts null and omitted optional multiline issue fields", () => {
expect(createIssueSchema.parse({ title: "Follow up PR", description: null }).description)
.toBeNull();
expect(createIssueSchema.parse({ title: "Follow up PR" }).description)
.toBeUndefined();
expect(updateIssueSchema.parse({ comment: undefined }).comment)
.toBeUndefined();
});
it("normalizes JSON-escaped line breaks in issue descriptions", () => {
const parsed = createIssueSchema.parse({
title: "Follow up PR",
description: "PR: https://example.com/pr/1\\n\\nShip the follow-up.",
});
expect(parsed.description).toBe("PR: https://example.com/pr/1\n\nShip the follow-up.");
});
it("normalizes escaped line breaks in issue update comments", () => {
const parsed = updateIssueSchema.parse({
comment: "Done\\n\\n- Verified the route",
});
expect(parsed.comment).toBe("Done\n\n- Verified the route");
});
it("normalizes escaped line breaks in issue comment bodies", () => {
const parsed = addIssueCommentSchema.parse({
body: "Progress update\\r\\n\\r\\nNext action.",
});
expect(parsed.body).toBe("Progress update\n\nNext action.");
});
it("normalizes escaped line breaks in generated task drafts", () => {
const parsed = suggestedTaskDraftSchema.parse({
clientKey: "task-1",
title: "Follow up",
description: "Line 1\\n\\nLine 2",
});
expect(parsed.description).toBe("Line 1\n\nLine 2");
});
it("normalizes escaped line breaks in thread summaries and documents", () => {
const response = respondIssueThreadInteractionSchema.parse({
answers: [],
summaryMarkdown: "Summary\\n\\nNext action",
});
const document = upsertIssueDocumentSchema.parse({
format: "markdown",
body: "# Plan\\n\\nShip it",
});
expect(response.summaryMarkdown).toBe("Summary\n\nNext action");
expect(document.body).toBe("# Plan\n\nShip it");
});
});


@@ -10,6 +10,7 @@ import {
ISSUE_THREAD_INTERACTION_KINDS,
ISSUE_THREAD_INTERACTION_STATUSES,
} from "../constants.js";
import { multilineTextSchema } from "./text.js";
export const ISSUE_EXECUTION_WORKSPACE_PREFERENCES = [
"inherit",
@@ -130,7 +131,7 @@ export const createIssueSchema = z.object({
blockedByIssueIds: z.array(z.string().uuid()).optional(),
inheritExecutionWorkspaceFromIssueId: z.string().uuid().optional().nullable(),
title: z.string().min(1),
-description: z.string().optional().nullable(),
+description: multilineTextSchema.optional().nullable(),
status: z.enum(ISSUE_STATUSES).optional().default("backlog"),
priority: z.enum(ISSUE_PRIORITIES).optional().default("medium"),
assigneeAgentId: z.string().uuid().optional().nullable(),
@@ -168,9 +169,10 @@ export type CreateIssueLabel = z.infer<typeof createIssueLabelSchema>;
export const updateIssueSchema = createIssueSchema.partial().extend({
assigneeAgentId: z.string().trim().min(1).optional().nullable(),
-comment: z.string().min(1).optional(),
+comment: multilineTextSchema.pipe(z.string().min(1)).optional(),
reviewRequest: issueReviewRequestSchema.optional().nullable(),
reopen: z.boolean().optional(),
resume: z.boolean().optional(),
interrupt: z.boolean().optional(),
hiddenAt: z.string().datetime().nullable().optional(),
});
@@ -186,8 +188,9 @@ export const checkoutIssueSchema = z.object({
export type CheckoutIssue = z.infer<typeof checkoutIssueSchema>;
export const addIssueCommentSchema = z.object({
-body: z.string().min(1),
+body: multilineTextSchema.pipe(z.string().min(1)),
reopen: z.boolean().optional(),
resume: z.boolean().optional(),
interrupt: z.boolean().optional(),
});
@@ -211,7 +214,7 @@ export const suggestedTaskDraftSchema = z.object({
parentClientKey: z.string().trim().min(1).max(120).nullable().optional(),
parentId: z.string().uuid().nullable().optional(),
title: z.string().trim().min(1).max(240),
-description: z.string().trim().max(20000).nullable().optional(),
+description: multilineTextSchema.pipe(z.string().trim().max(20000)).nullable().optional(),
priority: z.enum(ISSUE_PRIORITIES).nullable().optional(),
assigneeAgentId: z.string().uuid().nullable().optional(),
assigneeUserId: z.string().trim().min(1).nullable().optional(),
@@ -437,7 +440,7 @@ export type RejectIssueThreadInteraction = z.infer<typeof rejectIssueThreadInter
export const respondIssueThreadInteractionSchema = z.object({
answers: z.array(askUserQuestionsAnswerSchema).max(20),
-summaryMarkdown: z.string().max(20000).nullable().optional(),
+summaryMarkdown: multilineTextSchema.pipe(z.string().max(20000)).nullable().optional(),
});
export type RespondIssueThreadInteraction = z.infer<typeof respondIssueThreadInteractionSchema>;
@@ -460,7 +463,7 @@ export const issueDocumentFormatSchema = z.enum(ISSUE_DOCUMENT_FORMATS);
export const upsertIssueDocumentSchema = z.object({
title: z.string().trim().max(200).nullable().optional(),
format: issueDocumentFormatSchema,
-body: z.string().max(524288),
+body: multilineTextSchema.pipe(z.string().max(524288)),
changeSummary: z.string().trim().max(500).nullable().optional(),
baseRevisionId: z.string().uuid().nullable().optional(),
});


@@ -0,0 +1,10 @@
import { z } from "zod";
export function normalizeEscapedLineBreaks(value: string): string {
return value
.replace(/\\r\\n/g, "\n")
.replace(/\\n/g, "\n")
.replace(/\\r/g, "\n");
}
export const multilineTextSchema = z.string().transform(normalizeEscapedLineBreaks);
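To sanity-check the transform added above, here is a standalone copy of the function (without zod); the sample strings are illustrative, not from the test suite. The replacement order matters: the literal `\r\n` pair is collapsed first so it becomes one newline rather than two.

```typescript
// Standalone copy of normalizeEscapedLineBreaks from the diff above.
// Replaces JSON-escaped line breaks (literal backslash sequences) with
// real newlines; strings that already contain real newlines pass through.
function normalizeEscapedLineBreaks(value: string): string {
  return value
    .replace(/\\r\\n/g, "\n") // literal "\r\n" first, so it yields one newline, not two
    .replace(/\\n/g, "\n")
    .replace(/\\r/g, "\n");
}

const a = normalizeEscapedLineBreaks("Line 1\\n\\nLine 2"); // "Line 1\n\nLine 2"
const b = normalizeEscapedLineBreaks("CRLF\\r\\nnext");     // "CRLF\nnext"
const c = normalizeEscapedLineBreaks("Already\nfine");      // unchanged
console.log(a, b, c);
```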


@@ -1,6 +1,6 @@
#!/usr/bin/env bash
#
-# Kill all "Google Chrome for Testing" processes (agent headless browsers).
+# Kill all agent headless browser processes.
#
# Usage:
# scripts/kill-agent-browsers.sh # kill all
@@ -22,14 +22,14 @@ while IFS= read -r line; do
pid=$(echo "$line" | awk '{print $2}')
pids+=("$pid")
lines+=("$line")
-done < <(ps aux | grep 'Google Chrome for Testing' | grep -v grep || true)
+done < <(ps aux | grep -E 'Google Chrome for Testing|chrome-headless-shell' | grep -v grep || true)
if [[ ${#pids[@]} -eq 0 ]]; then
-echo "No Google Chrome for Testing processes found."
+echo "No agent headless browser processes found."
exit 0
fi
-echo "Found ${#pids[@]} Google Chrome for Testing process(es):"
+echo "Found ${#pids[@]} agent headless browser process(es):"
echo ""
for i in "${!pids[@]}"; do


@@ -0,0 +1,134 @@
#!/usr/bin/env node
import { spawnSync } from "node:child_process";
import { mkdirSync, mkdtempSync, readdirSync, statSync } from "node:fs";
import os from "node:os";
import path from "node:path";
const repoRoot = process.cwd();
const serverRoot = path.join(repoRoot, "server");
const serverTestsDir = path.join(repoRoot, "server", "src", "__tests__");
const nonServerProjects = [
"@paperclipai/shared",
"@paperclipai/db",
"@paperclipai/adapter-utils",
"@paperclipai/adapter-codex-local",
"@paperclipai/adapter-opencode-local",
"@paperclipai/ui",
"paperclipai",
];
const routeTestPattern = /[^/]*(?:route|routes|authz)[^/]*\.test\.ts$/;
const additionalSerializedServerTests = new Set([
"server/src/__tests__/approval-routes-idempotency.test.ts",
"server/src/__tests__/assets.test.ts",
"server/src/__tests__/authz-company-access.test.ts",
"server/src/__tests__/companies-route-path-guard.test.ts",
"server/src/__tests__/company-portability.test.ts",
"server/src/__tests__/costs-service.test.ts",
"server/src/__tests__/express5-auth-wildcard.test.ts",
"server/src/__tests__/health-dev-server-token.test.ts",
"server/src/__tests__/health.test.ts",
"server/src/__tests__/heartbeat-dependency-scheduling.test.ts",
"server/src/__tests__/heartbeat-issue-liveness-escalation.test.ts",
"server/src/__tests__/heartbeat-process-recovery.test.ts",
"server/src/__tests__/invite-accept-existing-member.test.ts",
"server/src/__tests__/invite-accept-gateway-defaults.test.ts",
"server/src/__tests__/invite-accept-replay.test.ts",
"server/src/__tests__/invite-expiry.test.ts",
"server/src/__tests__/invite-join-manager.test.ts",
"server/src/__tests__/invite-onboarding-text.test.ts",
"server/src/__tests__/issues-checkout-wakeup.test.ts",
"server/src/__tests__/issues-service.test.ts",
"server/src/__tests__/opencode-local-adapter-environment.test.ts",
"server/src/__tests__/project-routes-env.test.ts",
"server/src/__tests__/redaction.test.ts",
"server/src/__tests__/routines-e2e.test.ts",
]);
let invocationIndex = 0;
function walk(dir) {
const entries = readdirSync(dir);
const files = [];
for (const entry of entries) {
const absolute = path.join(dir, entry);
const stats = statSync(absolute);
if (stats.isDirectory()) {
files.push(...walk(absolute));
} else if (stats.isFile()) {
files.push(absolute);
}
}
return files;
}
function toRepoPath(file) {
return path.relative(repoRoot, file).split(path.sep).join("/");
}
function toServerPath(file) {
return path.relative(serverRoot, file).split(path.sep).join("/");
}
function isRouteOrAuthzTest(file) {
if (routeTestPattern.test(file)) {
return true;
}
return additionalSerializedServerTests.has(file);
}
function runVitest(args, label) {
console.log(`\n[test:run] ${label}`);
invocationIndex += 1;
const testRoot = mkdtempSync(path.join(os.tmpdir(), `paperclip-vitest-${process.pid}-${invocationIndex}-`));
const env = {
...process.env,
PAPERCLIP_HOME: path.join(testRoot, "home"),
PAPERCLIP_INSTANCE_ID: `vitest-${process.pid}-${invocationIndex}`,
TMPDIR: path.join(testRoot, "tmp"),
};
mkdirSync(env.PAPERCLIP_HOME, { recursive: true });
mkdirSync(env.TMPDIR, { recursive: true });
const result = spawnSync("pnpm", ["exec", "vitest", "run", ...args], {
cwd: repoRoot,
env,
stdio: "inherit",
});
if (result.error) {
console.error(`[test:run] Failed to start Vitest: ${result.error.message}`);
process.exit(1);
}
if (result.status !== 0) {
process.exit(result.status ?? 1);
}
}
const routeTests = walk(serverTestsDir)
.filter((file) => isRouteOrAuthzTest(toRepoPath(file)))
.map((file) => ({
repoPath: toRepoPath(file),
serverPath: toServerPath(file),
}))
.sort((a, b) => a.repoPath.localeCompare(b.repoPath));
const excludeRouteArgs = routeTests.flatMap((file) => ["--exclude", file.serverPath]);
for (const project of nonServerProjects) {
runVitest(["--project", project], `non-server project ${project}`);
}
runVitest(
["--project", "@paperclipai/server", ...excludeRouteArgs],
`server suites excluding ${routeTests.length} serialized suites`,
);
for (const routeTest of routeTests) {
runVitest(
[
"--project",
"@paperclipai/server",
routeTest.repoPath,
"--pool=forks",
"--poolOptions.forks.isolate=true",
],
routeTest.repoPath,
);
}


@@ -0,0 +1,10 @@
# Server Tests
Server tests that need a real PostgreSQL process must use
`./helpers/embedded-postgres.ts` instead of constructing `embedded-postgres`
directly.
The shared helper creates a throwaway data directory and reserves a safe
loopback port for each test database. This keeps server vitest runs from
touching the live Paperclip control-plane Postgres; see PAP-2033 for the
incident that introduced this guard.
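The helper's exact exports are not shown in this diff, so the following is a hypothetical sketch of the two ingredients the guard relies on (function names are illustrative, not the real API): a disposable data directory per test database, and a freshly reserved ephemeral port on 127.0.0.1.

```typescript
// Hypothetical sketch; the real helper in ./helpers/embedded-postgres.ts
// may differ. A throwaway data directory plus a freshly reserved loopback
// port mean an embedded Postgres can never collide with the live instance.
import { createServer } from "node:net";
import { mkdtempSync } from "node:fs";
import os from "node:os";
import path from "node:path";

function makeThrowawayDataDir(): string {
  // Unique disposable directory under the OS tmpdir, one per test database.
  return mkdtempSync(path.join(os.tmpdir(), "embedded-pg-"));
}

function reserveLoopbackPort(): Promise<number> {
  // Bind port 0 on 127.0.0.1 so the OS picks a free port, then release it
  // so the embedded Postgres can bind the same port right afterwards.
  return new Promise((resolve, reject) => {
    const server = createServer();
    server.once("error", reject);
    server.listen(0, "127.0.0.1", () => {
      const address = server.address();
      if (!address || typeof address === "string") {
        reject(new Error("Expected a TCP address"));
        return;
      }
      server.close(() => resolve(address.port));
    });
  });
}

reserveLoopbackPort().then((port) => {
  console.log(makeThrowawayDataDir(), port);
});
```

Releasing the port before Postgres binds it leaves a small race window, which is acceptable for test isolation but is why the real helper owns the whole lifecycle.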


@@ -1,7 +1,6 @@
-import type { Server } from "node:http";
import express from "express";
import request from "supertest";
-import { afterAll, beforeEach, describe, expect, it, vi } from "vitest";
+import { beforeEach, describe, expect, it, vi } from "vitest";
const mockActivityService = vi.hoisted(() => ({
list: vi.fn(),
@@ -33,8 +32,6 @@ vi.mock("../services/index.js", () => ({
heartbeatService: () => mockHeartbeatService,
}));
-let server: Server | null = null;
async function createApp(
actor: Record<string, unknown> = {
type: "board",
@@ -44,44 +41,64 @@ async function createApp(
isInstanceAdmin: false,
},
) {
vi.resetModules();
const [{ errorHandler }, { activityRoutes }] = await Promise.all([
-import("../middleware/index.js"),
-import("../routes/activity.js"),
+import("../middleware/index.js") as Promise<typeof import("../middleware/index.js")>,
+import("../routes/activity.js") as Promise<typeof import("../routes/activity.js")>,
]);
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
-(req as any).actor = actor;
+(req as any).actor = {
+...actor,
+companyIds: Array.isArray(actor.companyIds) ? [...actor.companyIds] : actor.companyIds,
+};
next();
});
app.use("/api", activityRoutes({} as any));
app.use(errorHandler);
-server = app.listen(0);
-return server;
+return app;
}
-describe("activity routes", () => {
-afterAll(async () => {
-if (!server) return;
-await new Promise<void>((resolve, reject) => {
-server?.close((err) => {
-if (err) reject(err);
-else resolve();
-});
+async function requestApp(
+app: express.Express,
+buildRequest: (baseUrl: string) => request.Test,
+) {
+const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
+const server = createServer(app);
+try {
+await new Promise<void>((resolve) => {
+server.listen(0, "127.0.0.1", resolve);
+});
-server = null;
-});
+const address = server.address();
+if (!address || typeof address === "string") {
+throw new Error("Expected HTTP server to listen on a TCP port");
+}
+return await buildRequest(`http://127.0.0.1:${address.port}`);
+} finally {
+if (server.listening) {
+await new Promise<void>((resolve, reject) => {
+server.close((error) => {
+if (error) reject(error);
+else resolve();
+});
+});
+}
+}
+}
+describe.sequential("activity routes", () => {
beforeEach(() => {
vi.resetModules();
vi.clearAllMocks();
for (const mock of Object.values(mockActivityService)) mock.mockReset();
for (const mock of Object.values(mockHeartbeatService)) mock.mockReset();
for (const mock of Object.values(mockIssueService)) mock.mockReset();
});
it("limits company activity lists by default", async () => {
mockActivityService.list.mockResolvedValue([]);
const app = await createApp();
-const res = await request(app).get("/api/companies/company-1/activity");
+const res = await requestApp(app, (baseUrl) => request(baseUrl).get("/api/companies/company-1/activity"));
expect(res.status).toBe(200);
expect(mockActivityService.list).toHaveBeenCalledWith({
@@ -97,7 +114,9 @@ describe("activity routes", () => {
mockActivityService.list.mockResolvedValue([]);
const app = await createApp();
-const res = await request(app).get("/api/companies/company-1/activity?limit=5000&entityType=issue");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl).get("/api/companies/company-1/activity?limit=5000&entityType=issue"),
+);
expect(res.status).toBe(200);
expect(mockActivityService.list).toHaveBeenCalledWith({
@@ -122,7 +141,7 @@ describe("activity routes", () => {
]);
const app = await createApp();
-const res = await request(app).get("/api/issues/PAP-475/runs");
+const res = await requestApp(app, (baseUrl) => request(baseUrl).get("/api/issues/PAP-475/runs"));
expect(res.status).toBe(200);
expect(mockIssueService.getByIdentifier).toHaveBeenCalledWith("PAP-475");
@@ -133,14 +152,14 @@ describe("activity routes", () => {
it("requires company access before creating activity events", async () => {
const app = await createApp();
-const res = await request(app)
+const res = await requestApp(app, (baseUrl) => request(baseUrl)
.post("/api/companies/company-2/activity")
.send({
actorId: "user-1",
action: "test.event",
entityType: "issue",
entityId: "issue-1",
-});
+}));
expect(res.status).toBe(403);
expect(mockActivityService.create).not.toHaveBeenCalled();
@@ -153,7 +172,7 @@ describe("activity routes", () => {
});
const app = await createApp();
-const res = await request(app).get("/api/heartbeat-runs/run-2/issues");
+const res = await requestApp(app, (baseUrl) => request(baseUrl).get("/api/heartbeat-runs/run-2/issues"));
expect(res.status).toBe(403);
expect(mockActivityService.issuesForRun).not.toHaveBeenCalled();
@@ -161,7 +180,7 @@ describe("activity routes", () => {
it("rejects anonymous heartbeat run issue lookups before run existence checks", async () => {
const app = await createApp({ type: "none", source: "none" });
-const res = await request(app).get("/api/heartbeat-runs/missing-run/issues");
+const res = await requestApp(app, (baseUrl) => request(baseUrl).get("/api/heartbeat-runs/missing-run/issues"));
expect(res.status).toBe(401);
expect(mockHeartbeatService.getRun).not.toHaveBeenCalled();


@@ -424,7 +424,7 @@ describeEmbeddedPostgres("activity service", () => {
expect(backfilledRun).toMatchObject({
runId,
livenessState: "plan_only",
-livenessReason: "Run described future work without concrete action evidence",
+livenessReason: "Run described runnable future work without concrete action evidence",
lastUsefulActionAt: null,
});
});
@@ -530,7 +530,7 @@ describeEmbeddedPostgres("activity service", () => {
expect(backfilledRun).toMatchObject({
runId,
livenessState: "plan_only",
-livenessReason: "Run described future work without concrete action evidence",
+livenessReason: "Run described runnable future work without concrete action evidence",
lastUsefulActionAt: null,
});
});


@@ -1,6 +1,6 @@
import express from "express";
import request from "supertest";
-import { afterEach, beforeAll, beforeEach, describe, expect, it, vi } from "vitest";
+import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import type { ServerAdapterModule } from "../adapters/index.js";
const mocks = vi.hoisted(() => {
@@ -121,7 +121,13 @@ function createApp(actor: Express.Request["actor"]) {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
-req.actor = actor;
+req.actor = {
+...actor,
+companyIds: Array.isArray(actor.companyIds) ? [...actor.companyIds] : actor.companyIds,
+memberships: Array.isArray(actor.memberships)
+? actor.memberships.map((membership) => ({ ...membership }))
+: actor.memberships,
+} as Express.Request["actor"];
next();
});
app.use("/api", adapterRoutes());
@@ -129,6 +135,33 @@ function createApp(actor: Express.Request["actor"]) {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
function boardMember(membershipRole: "admin" | "operator" | "viewer"): Express.Request["actor"] {
return {
type: "board",
@@ -162,23 +195,29 @@ const instanceAdmin: Express.Request["actor"] = {
function sendMutatingRequest(app: express.Express, name: string) {
switch (name) {
case "install":
-return request(app)
-.post("/api/adapters/install")
-.send({ packageName: EXTERNAL_PACKAGE_NAME });
+return requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/adapters/install")
+.send({ packageName: EXTERNAL_PACKAGE_NAME }),
+);
case "disable":
-return request(app)
-.patch(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}`)
-.send({ disabled: true });
+return requestApp(app, (baseUrl) =>
+request(baseUrl)
+.patch(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}`)
+.send({ disabled: true }),
+);
case "override":
-return request(app)
-.patch("/api/adapters/claude_local/override")
-.send({ paused: true });
+return requestApp(app, (baseUrl) =>
+request(baseUrl)
+.patch("/api/adapters/claude_local/override")
+.send({ paused: true }),
+);
case "delete":
-return request(app).delete(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}`);
+return requestApp(app, (baseUrl) => request(baseUrl).delete(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}`));
case "reload":
-return request(app).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reload`);
+return requestApp(app, (baseUrl) => request(baseUrl).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reload`));
case "reinstall":
-return request(app).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reinstall`);
+return requestApp(app, (baseUrl) => request(baseUrl).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reinstall`));
default:
throw new Error(`Unknown mutating adapter route: ${name}`);
}
@@ -190,7 +229,13 @@ function seedInstalledExternalAdapter() {
registerServerAdapter(createAdapter());
}
-describe("adapter management route authorization", () => {
+function resetInstalledExternalAdapterState() {
+mocks.externalRecords.clear();
+unregisterServerAdapter(EXTERNAL_ADAPTER_TYPE);
+setOverridePaused("claude_local", false);
+}
+describe.sequential("adapter management route authorization", () => {
beforeEach(async () => {
vi.resetModules();
vi.doUnmock("node:child_process");
@@ -232,50 +277,61 @@ describe("adapter management route authorization", () => {
setOverridePaused("claude_local", false);
});
-it.each([
-"install",
-"disable",
-"override",
-"delete",
-"reload",
-"reinstall",
-])("rejects %s for a non-instance-admin board user with company membership", async (routeName) => {
-seedInstalledExternalAdapter();
-const app = createApp(boardMember("admin"));
+it("rejects mutating adapter routes for a non-instance-admin board user with company membership", async () => {
+for (const routeName of [
+"install",
+"disable",
+"override",
+"delete",
+"reload",
+"reinstall",
+]) {
+resetInstalledExternalAdapterState();
+seedInstalledExternalAdapter();
+const app = createApp(boardMember("admin"));
-const res = await sendMutatingRequest(app, routeName);
+const res = await sendMutatingRequest(app, routeName);
-expect(res.status, JSON.stringify(res.body)).toBe(403);
+expect(res.status, `${routeName}: ${JSON.stringify(res.body)}`).toBe(403);
+}
});
-it.each([
-["install", 201],
-["disable", 200],
-["override", 200],
-["delete", 200],
-["reload", 200],
-["reinstall", 200],
-] as const)("allows instance admins to reach %s", async (routeName, expectedStatus) => {
-if (routeName !== "install") {
-seedInstalledExternalAdapter();
+it("allows instance admins to reach mutating adapter routes", async () => {
+for (const [routeName, expectedStatus] of [
+["install", 201],
+["disable", 200],
+["override", 200],
+["delete", 200],
+["reload", 200],
+["reinstall", 200],
+] as const) {
+resetInstalledExternalAdapterState();
+if (routeName !== "install") {
+seedInstalledExternalAdapter();
+}
+const app = createApp(instanceAdmin);
+const res = await sendMutatingRequest(app, routeName);
+expect(res.status, `${routeName}: ${JSON.stringify(res.body)}`).toBe(expectedStatus);
+}
}
-const app = createApp(instanceAdmin);
-const res = await sendMutatingRequest(app, routeName);
-expect(res.status, JSON.stringify(res.body)).toBe(expectedStatus);
});
it.each(["viewer", "operator"] as const)(
"does not let a company %s trigger adapter npm install or reload",
async (membershipRole) => {
seedInstalledExternalAdapter();
-const app = createApp(boardMember(membershipRole));
+const installApp = createApp(boardMember(membershipRole));
+const reloadApp = createApp(boardMember(membershipRole));
-const install = await request(app)
-.post("/api/adapters/install")
-.send({ packageName: EXTERNAL_PACKAGE_NAME });
-const reload = await request(app).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reload`);
+const install = await requestApp(installApp, (baseUrl) =>
+request(baseUrl)
+.post("/api/adapters/install")
+.send({ packageName: EXTERNAL_PACKAGE_NAME }),
+);
+const reload = await requestApp(reloadApp, (baseUrl) =>
+request(baseUrl).post(`/api/adapters/${EXTERNAL_ADAPTER_TYPE}/reload`),
+);
expect(install.status, JSON.stringify(install.body)).toBe(403);
expect(reload.status, JSON.stringify(reload.body)).toBe(403);


@@ -148,6 +148,33 @@ async function createApp() {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
async function unregisterTestAdapter(type: string) {
const { unregisterServerAdapter } = await import("../adapters/index.js");
unregisterServerAdapter(type);
@@ -161,7 +188,7 @@ describe("agent routes adapter validation", () => {
vi.doUnmock("../middleware/index.js");
vi.doUnmock("../routes/agents.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockCompanySkillService.listRuntimeSkillEntries.mockResolvedValue([]);
mockCompanySkillService.resolveRequestedSkillKeys.mockResolvedValue([]);
mockAccessService.canUser.mockResolvedValue(true);
@@ -207,12 +234,14 @@ describe("agent routes adapter validation", () => {
registerServerAdapter(externalAdapter);
const app = await createApp();
-const res = await request(app)
-.post("/api/companies/company-1/agents")
-.send({
-name: "External Agent",
-adapterType: "external_test",
-});
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/agents")
+.send({
+name: "External Agent",
+adapterType: "external_test",
+}),
+);
expect(res.status, JSON.stringify(res.body)).toBe(201);
expect(res.body.adapterType).toBe("external_test");
@@ -220,12 +249,14 @@ describe("agent routes adapter validation", () => {
it("rejects unknown adapter types even when schema accepts arbitrary strings", async () => {
const app = await createApp();
-const res = await request(app)
-.post("/api/companies/company-1/agents")
-.send({
-name: "Missing Adapter",
-adapterType: missingAdapterType,
-});
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/agents")
+.send({
+name: "Missing Adapter",
+adapterType: missingAdapterType,
+}),
+);
expect(res.status, JSON.stringify(res.body)).toBe(422);
expect(String(res.body.error ?? res.body.message ?? "")).toContain(`Unknown adapter type: ${missingAdapterType}`);


@@ -1,8 +1,9 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
-import { errorHandler } from "../middleware/index.js";
-import { agentRoutes } from "../routes/agents.js";
+vi.unmock("http");
+vi.unmock("node:http");
const agentId = "11111111-1111-4111-8111-111111111111";
const companyId = "22222222-2222-4222-8222-222222222222";
@@ -42,6 +43,9 @@ const baseKey = {
revokedAt: null,
};
let currentKeyAgentId = agentId;
let currentAccessCanUser = false;
const mockAgentService = vi.hoisted(() => ({
getById: vi.fn(),
pause: vi.fn(),
@@ -111,6 +115,66 @@ vi.mock("../telemetry.js", () => ({
getTelemetryClient: mockGetTelemetryClient,
}));
vi.mock("../routes/authz.js", async () => {
const { forbidden, unauthorized } = await vi.importActual<typeof import("../errors.js")>("../errors.js");
function assertAuthenticated(req: Express.Request) {
if (req.actor.type === "none") {
throw unauthorized();
}
}
function assertBoard(req: Express.Request) {
if (req.actor.type !== "board") {
throw forbidden("Board access required");
}
}
function assertCompanyAccess(req: Express.Request, expectedCompanyId: string) {
assertAuthenticated(req);
if (req.actor.type === "agent" && req.actor.companyId !== expectedCompanyId) {
throw forbidden("Agent key cannot access another company");
}
if (req.actor.type === "board" && req.actor.source !== "local_implicit") {
const allowedCompanies = req.actor.companyIds ?? [];
if (!allowedCompanies.includes(expectedCompanyId)) {
throw forbidden("User does not have access to this company");
}
}
}
function assertInstanceAdmin(req: Express.Request) {
assertBoard(req);
if (req.actor.source === "local_implicit" || req.actor.isInstanceAdmin) return;
throw forbidden("Instance admin access required");
}
function getActorInfo(req: Express.Request) {
assertAuthenticated(req);
if (req.actor.type === "agent") {
return {
actorType: "agent" as const,
actorId: req.actor.agentId ?? "unknown-agent",
agentId: req.actor.agentId ?? null,
runId: req.actor.runId ?? null,
};
}
return {
actorType: "user" as const,
actorId: req.actor.userId ?? "board",
agentId: null,
runId: req.actor.runId ?? null,
};
}
return {
assertAuthenticated,
assertBoard,
assertCompanyAccess,
assertInstanceAdmin,
getActorInfo,
};
});
vi.mock("../services/index.js", () => ({
agentService: () => mockAgentService,
agentInstructionsService: () => mockAgentInstructionsService,
@@ -133,11 +197,30 @@ vi.mock("../services/instance-settings.js", () => ({
}),
}));
-function createApp(actor: Record<string, unknown>) {
+let routeModules:
+| Promise<[
+typeof import("../middleware/index.js"),
+typeof import("../routes/agents.js"),
+]>
+| null = null;
+async function loadRouteModules() {
+routeModules ??= Promise.all([
+import("../middleware/index.js"),
+import("../routes/agents.js"),
+]);
+return routeModules;
+}
+async function createApp(actor: Record<string, unknown>) {
+const [{ errorHandler }, { agentRoutes }] = await loadRouteModules();
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
-(req as any).actor = actor;
+(req as any).actor = {
+...actor,
+companyIds: Array.isArray(actor.companyIds) ? [...actor.companyIds] : actor.companyIds,
+};
next();
});
app.use("/api", agentRoutes({} as any));
@@ -145,111 +228,138 @@ function createApp(actor: Record<string, unknown>) {
return app;
}
-describe("agent cross-tenant route authorization", () => {
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
function resetMockDefaults() {
vi.clearAllMocks();
for (const mock of Object.values(mockAgentService)) mock.mockReset();
for (const mock of Object.values(mockAccessService)) mock.mockReset();
for (const mock of Object.values(mockApprovalService)) mock.mockReset();
for (const mock of Object.values(mockBudgetService)) mock.mockReset();
for (const mock of Object.values(mockHeartbeatService)) mock.mockReset();
for (const mock of Object.values(mockIssueApprovalService)) mock.mockReset();
for (const mock of Object.values(mockIssueService)) mock.mockReset();
for (const mock of Object.values(mockSecretService)) mock.mockReset();
for (const mock of Object.values(mockAgentInstructionsService)) mock.mockReset();
for (const mock of Object.values(mockCompanySkillService)) mock.mockReset();
mockLogActivity.mockReset();
mockGetTelemetryClient.mockReset();
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
currentKeyAgentId = agentId;
currentAccessCanUser = false;
mockAgentService.getById.mockImplementation(async () => ({ ...baseAgent }));
mockAgentService.pause.mockImplementation(async () => ({ ...baseAgent }));
mockAgentService.resume.mockImplementation(async () => ({ ...baseAgent }));
mockAgentService.terminate.mockImplementation(async () => ({ ...baseAgent }));
mockAgentService.remove.mockImplementation(async () => ({ ...baseAgent }));
mockAgentService.listKeys.mockImplementation(async () => []);
mockAgentService.createApiKey.mockImplementation(async () => ({
id: keyId,
name: baseKey.name,
token: "pcp_test_token",
createdAt: baseKey.createdAt,
}));
mockAgentService.getKeyById.mockImplementation(async () => ({
...baseKey,
agentId: currentKeyAgentId,
}));
mockAgentService.revokeKey.mockImplementation(async () => ({
...baseKey,
revokedAt: new Date("2026-04-11T00:05:00.000Z"),
}));
mockAccessService.canUser.mockImplementation(async () => currentAccessCanUser);
mockAccessService.hasPermission.mockImplementation(async () => false);
mockAccessService.getMembership.mockImplementation(async () => null);
mockAccessService.listPrincipalGrants.mockImplementation(async () => []);
mockAccessService.ensureMembership.mockImplementation(async () => undefined);
mockAccessService.setPrincipalPermission.mockImplementation(async () => undefined);
mockHeartbeatService.cancelActiveForAgent.mockImplementation(async () => undefined);
mockLogActivity.mockImplementation(async () => undefined);
}
+describe.sequential("agent cross-tenant route authorization", () => {
beforeEach(() => {
-vi.clearAllMocks();
-mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
-mockAgentService.getById.mockResolvedValue(baseAgent);
-mockAgentService.pause.mockResolvedValue(baseAgent);
-mockAgentService.resume.mockResolvedValue(baseAgent);
-mockAgentService.terminate.mockResolvedValue(baseAgent);
-mockAgentService.remove.mockResolvedValue(baseAgent);
-mockAgentService.listKeys.mockResolvedValue([]);
-mockAgentService.createApiKey.mockResolvedValue({
-id: keyId,
-name: baseKey.name,
-token: "pcp_test_token",
-createdAt: baseKey.createdAt,
-});
-mockAgentService.getKeyById.mockResolvedValue(baseKey);
-mockAgentService.revokeKey.mockResolvedValue({
-...baseKey,
-revokedAt: new Date("2026-04-11T00:05:00.000Z"),
-});
-mockHeartbeatService.cancelActiveForAgent.mockResolvedValue(undefined);
-mockLogActivity.mockResolvedValue(undefined);
+resetMockDefaults();
});
-it("rejects cross-tenant board pause before mutating the agent", async () => {
-const app = createApp({
+it("enforces company boundaries before mutating or reading agent keys", async () => {
+const crossTenantActor = {
type: "board",
userId: "mallory",
companyIds: [],
source: "session",
isInstanceAdmin: false,
-});
+};
const deniedCases = [
{
label: "pause",
request: (app: express.Express) =>
requestApp(app, (baseUrl) => request(baseUrl).post(`/api/agents/${agentId}/pause`).send({})),
untouched: [mockAgentService.pause, mockHeartbeatService.cancelActiveForAgent],
},
{
label: "list keys",
request: (app: express.Express) =>
requestApp(app, (baseUrl) => request(baseUrl).get(`/api/agents/${agentId}/keys`)),
untouched: [mockAgentService.listKeys],
},
{
label: "create key",
request: (app: express.Express) =>
requestApp(app, (baseUrl) => request(baseUrl).post(`/api/agents/${agentId}/keys`).send({ name: "exploit" })),
untouched: [mockAgentService.createApiKey],
},
{
label: "revoke key",
request: (app: express.Express) =>
requestApp(app, (baseUrl) => request(baseUrl).delete(`/api/agents/${agentId}/keys/${keyId}`)),
untouched: [mockAgentService.getKeyById, mockAgentService.revokeKey],
},
];
const res = await request(app).post(`/api/agents/${agentId}/pause`).send({});
for (const deniedCase of deniedCases) {
resetMockDefaults();
const app = await createApp(crossTenantActor);
const res = await deniedCase.request(app);
expect(res.status).toBe(403);
expect(res.body.error).toContain("User does not have access to this company");
expect(mockAgentService.getById).toHaveBeenCalledWith(agentId);
expect(mockAgentService.pause).not.toHaveBeenCalled();
expect(mockHeartbeatService.cancelActiveForAgent).not.toHaveBeenCalled();
});
expect(res.status, `${deniedCase.label}: ${JSON.stringify(res.body)}`).toBe(403);
expect(res.body.error).toContain("User does not have access to this company");
expect(mockAgentService.getById).toHaveBeenCalledWith(agentId);
for (const mock of deniedCase.untouched) {
expect(mock).not.toHaveBeenCalled();
}
}
it("rejects cross-tenant board key listing before reading any keys", async () => {
const app = createApp({
type: "board",
userId: "mallory",
companyIds: [],
source: "session",
isInstanceAdmin: false,
});
resetMockDefaults();
currentKeyAgentId = "44444444-4444-4444-8444-444444444444";
currentAccessCanUser = true;
const res = await request(app).get(`/api/agents/${agentId}/keys`);
expect(res.status).toBe(403);
expect(res.body.error).toContain("User does not have access to this company");
expect(mockAgentService.getById).toHaveBeenCalledWith(agentId);
expect(mockAgentService.listKeys).not.toHaveBeenCalled();
});
it("rejects cross-tenant board key creation before minting a token", async () => {
const app = createApp({
type: "board",
userId: "mallory",
companyIds: [],
source: "session",
isInstanceAdmin: false,
});
const res = await request(app)
.post(`/api/agents/${agentId}/keys`)
.send({ name: "exploit" });
expect(res.status).toBe(403);
expect(res.body.error).toContain("User does not have access to this company");
expect(mockAgentService.getById).toHaveBeenCalledWith(agentId);
expect(mockAgentService.createApiKey).not.toHaveBeenCalled();
});
it("rejects cross-tenant board key revocation before touching the key", async () => {
const app = createApp({
type: "board",
userId: "mallory",
companyIds: [],
source: "session",
isInstanceAdmin: false,
});
const res = await request(app).delete(`/api/agents/${agentId}/keys/${keyId}`);
expect(res.status).toBe(403);
expect(res.body.error).toContain("User does not have access to this company");
expect(mockAgentService.getById).toHaveBeenCalledWith(agentId);
expect(mockAgentService.getKeyById).not.toHaveBeenCalled();
expect(mockAgentService.revokeKey).not.toHaveBeenCalled();
});
it("requires the key to belong to the route agent before revocation", async () => {
mockAgentService.getKeyById.mockResolvedValue({
...baseKey,
agentId: "44444444-4444-4444-8444-444444444444",
});
mockAccessService.canUser.mockResolvedValue(true);
const app = createApp({
const app = await createApp({
type: "board",
userId: "board-user",
companyIds: [companyId],
@@ -257,7 +367,7 @@ describe("agent cross-tenant route authorization", () => {
isInstanceAdmin: false,
});
-const res = await request(app).delete(`/api/agents/${agentId}/keys/${keyId}`);
+const res = await requestApp(app, (baseUrl) => request(baseUrl).delete(`/api/agents/${agentId}/keys/${keyId}`));
expect(res.status).toBe(404);
expect(res.body.error).toContain("Key not found");


@@ -103,6 +103,33 @@ async function createApp() {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
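The `requestApp` helper above boils down to one pattern: bind the app to an ephemeral port, run the request against a real TCP listener, and always close the server afterwards. A minimal standalone sketch of that pattern, assuming only Node's built-in `http` module and global `fetch` (the `withEphemeralServer` name and echo handler are illustrative, not part of this test suite):

```typescript
import { createServer, type IncomingMessage, type ServerResponse } from "node:http";
import type { AddressInfo } from "node:net";

async function withEphemeralServer(
  handler: (req: IncomingMessage, res: ServerResponse) => void,
  run: (baseUrl: string) => Promise<string>,
): Promise<string> {
  const server = createServer(handler);
  // Port 0 asks the OS for any free port; awaiting the listen callback avoids races.
  await new Promise<void>((resolve) => server.listen(0, "127.0.0.1", resolve));
  try {
    const { port } = server.address() as AddressInfo;
    return await run(`http://127.0.0.1:${port}`);
  } finally {
    // Always tear down, even if the request throws, so tests don't leak sockets;
    // drop any keep-alive connections so close() can complete promptly.
    server.closeAllConnections?.();
    await new Promise<void>((resolve, reject) =>
      server.close((err) => (err ? reject(err) : resolve())),
    );
  }
}

const body = await withEphemeralServer(
  (_req, res) => res.end("ok"),
  async (baseUrl) => (await fetch(baseUrl)).text(),
);
console.log(body);
```

Listening on port 0 is what makes helpers like this safe to run concurrently across test workers: each suite gets its own port instead of fighting over a fixed one.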
function makeAgent() {
return {
id: "11111111-1111-4111-8111-111111111111",
@@ -129,7 +156,7 @@ describe("agent instructions bundle routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockSyncInstructionsBundleConfigFromFilePath.mockImplementation((_agent, config) => config);
mockFindServerAdapter.mockImplementation((_type: string) => ({ type: _type }));
mockAgentService.getById.mockResolvedValue(makeAgent());
@@ -194,8 +221,11 @@ describe("agent instructions bundle routes", () => {
});
it("returns bundle metadata", async () => {
-const res = await request(await createApp())
-.get("/api/agents/11111111-1111-4111-8111-111111111111/instructions-bundle?companyId=company-1");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl)
+.get("/api/agents/11111111-1111-4111-8111-111111111111/instructions-bundle?companyId=company-1"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(res.body).toMatchObject({
@@ -208,13 +238,13 @@ describe("agent instructions bundle routes", () => {
});
it("writes a bundle file and persists compatibility config", async () => {
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.put("/api/agents/11111111-1111-4111-8111-111111111111/instructions-bundle/file?companyId=company-1")
.send({
path: "AGENTS.md",
content: "# Updated Agent\n",
clearLegacyPromptTemplate: true,
-});
+}));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAgentInstructionsService.writeFile).toHaveBeenCalledWith(
@@ -250,14 +280,14 @@ describe("agent instructions bundle routes", () => {
},
});
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.patch("/api/agents/11111111-1111-4111-8111-111111111111?companyId=company-1")
.send({
adapterType: "claude_local",
adapterConfig: {
model: "claude-sonnet-4",
},
-});
+}));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAgentService.update).toHaveBeenCalledWith(
@@ -289,13 +319,13 @@ describe("agent instructions bundle routes", () => {
},
});
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.patch("/api/agents/11111111-1111-4111-8111-111111111111?companyId=company-1")
.send({
adapterConfig: {
command: "codex --profile engineer",
},
-});
+}));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAgentService.update).toHaveBeenCalledWith(
@@ -327,14 +357,14 @@ describe("agent instructions bundle routes", () => {
},
});
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.patch("/api/agents/11111111-1111-4111-8111-111111111111?companyId=company-1")
.send({
replaceAdapterConfig: true,
adapterConfig: {
command: "codex --profile engineer",
},
-});
+}));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(res.body.adapterConfig).toMatchObject({


@@ -7,8 +7,10 @@ const mockAgentService = vi.hoisted(() => ({
}));
const mockHeartbeatService = vi.hoisted(() => ({
-buildRunOutputSilence: vi.fn(),
+getRunIssueSummary: vi.fn(),
+getActiveRunIssueSummaryForAgent: vi.fn(),
+buildRunOutputSilence: vi.fn(),
getRunLogAccess: vi.fn(),
readLog: vi.fn(),
}));
@@ -91,6 +93,33 @@ async function createApp() {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
describe("agent live run routes", () => {
beforeEach(() => {
vi.resetModules();
@@ -104,7 +133,7 @@ describe("agent live run routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockIssueService.getByIdentifier.mockResolvedValue({
id: "issue-1",
companyId: "company-1",
@@ -132,6 +161,7 @@ describe("agent live run routes", () => {
feedbackDataSharingPreference: "prompt",
});
mockInstanceSettingsService.listCompanyIds.mockResolvedValue(["company-1"]);
mockHeartbeatService.buildRunOutputSilence.mockResolvedValue(null);
mockHeartbeatService.getRunIssueSummary.mockResolvedValue({
id: "run-1",
status: "running",
@@ -144,6 +174,7 @@ describe("agent live run routes", () => {
issueId: "issue-1",
});
mockHeartbeatService.getActiveRunIssueSummaryForAgent.mockResolvedValue(null);
mockHeartbeatService.buildRunOutputSilence.mockResolvedValue(null);
mockHeartbeatService.getRunLogAccess.mockResolvedValue({
id: "run-1",
companyId: "company-1",
@@ -160,12 +191,15 @@ describe("agent live run routes", () => {
});
it("returns a compact active run payload for issue polling", async () => {
-const res = await request(await createApp()).get("/api/issues/PAP-1295/active-run");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl).get("/api/issues/PAP-1295/active-run"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockIssueService.getByIdentifier).toHaveBeenCalledWith("PAP-1295");
expect(mockHeartbeatService.getRunIssueSummary).toHaveBeenCalledWith("run-1");
-expect(res.body).toEqual({
+expect(res.body).toMatchObject({
id: "run-1",
status: "running",
invocationSource: "on_demand",
@@ -177,6 +211,7 @@ describe("agent live run routes", () => {
issueId: "issue-1",
agentName: "Builder",
adapterType: "codex_local",
outputSilence: null,
});
expect(res.body).not.toHaveProperty("resultJson");
expect(res.body).not.toHaveProperty("contextSnapshot");
@@ -207,7 +242,10 @@ describe("agent live run routes", () => {
issueId: "issue-1",
});
-const res = await request(await createApp()).get("/api/issues/PAP-1295/active-run");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl).get("/api/issues/PAP-1295/active-run"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockHeartbeatService.getRunIssueSummary).toHaveBeenCalledWith("run-1");
@@ -222,7 +260,10 @@ describe("agent live run routes", () => {
});
it("uses narrow run log metadata lookups for log polling", async () => {
-const res = await request(await createApp()).get("/api/heartbeat-runs/run-1/log?offset=12&limitBytes=64");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl).get("/api/heartbeat-runs/run-1/log?offset=12&limitBytes=64"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockHeartbeatService.getRunLogAccess).toHaveBeenCalledWith("run-1");

(File diff suppressed because it is too large.)


@@ -165,6 +165,33 @@ async function createApp(db: Record<string, unknown> = createDb()) {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
function makeAgent(adapterType: string) {
return {
id: "11111111-1111-4111-8111-111111111111",
@@ -184,14 +211,27 @@ function makeAgent(adapterType: string) {
};
}
describe("agent skill routes", () => {
describe.sequential("agent skill routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../routes/agents.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
+for (const mock of Object.values(mockAgentService)) mock.mockReset();
+for (const mock of Object.values(mockAccessService)) mock.mockReset();
+for (const mock of Object.values(mockApprovalService)) mock.mockReset();
+for (const mock of Object.values(mockIssueApprovalService)) mock.mockReset();
+for (const mock of Object.values(mockAgentInstructionsService)) mock.mockReset();
+for (const mock of Object.values(mockCompanySkillService)) mock.mockReset();
+for (const mock of Object.values(mockSecretService)) mock.mockReset();
+mockLogActivity.mockReset();
+mockTrackAgentCreated.mockReset();
+mockGetTelemetryClient.mockReset();
+mockSyncInstructionsBundleConfigFromFilePath.mockReset();
+mockAdapter.listSkills.mockReset();
+mockAdapter.syncSkills.mockReset();
mockSyncInstructionsBundleConfigFromFilePath.mockImplementation((_agent, config) => config);
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
mockAgentService.resolveByReference.mockResolvedValue({
@@ -276,8 +316,11 @@ describe("agent skill routes", () => {
it("skips runtime materialization when listing Claude skills", async () => {
mockAgentService.getById.mockResolvedValue(makeAgent("claude_local"));
-const res = await request(await createApp())
-.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl)
+.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAdapter.listSkills).toHaveBeenCalledWith(
@@ -301,8 +344,11 @@ describe("agent skill routes", () => {
warnings: [],
});
-const res = await request(await createApp())
-.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl)
+.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
});
@@ -318,8 +364,11 @@ describe("agent skill routes", () => {
warnings: [],
});
-const res = await request(await createApp())
-.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1");
+const res = await requestApp(
+await createApp(),
+(baseUrl) => request(baseUrl)
+.get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1"),
+);
expect(res.status, JSON.stringify(res.body)).toBe(200);
});
@@ -327,9 +376,9 @@ describe("agent skill routes", () => {
it("skips runtime materialization when syncing Claude skills", async () => {
mockAgentService.getById.mockResolvedValue(makeAgent("claude_local"));
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/agents/11111111-1111-4111-8111-111111111111/skills/sync?companyId=company-1")
-.send({ desiredSkills: ["paperclipai/paperclip/paperclip"] });
+.send({ desiredSkills: ["paperclipai/paperclip/paperclip"] }));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAdapter.syncSkills).toHaveBeenCalled();
@@ -338,9 +387,9 @@ describe("agent skill routes", () => {
it("canonicalizes desired skill references before syncing", async () => {
mockAgentService.getById.mockResolvedValue(makeAgent("claude_local"));
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/agents/11111111-1111-4111-8111-111111111111/skills/sync?companyId=company-1")
-.send({ desiredSkills: ["paperclip"] });
+.send({ desiredSkills: ["paperclip"] }));
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(mockAgentService.update).toHaveBeenCalledWith(
@@ -357,7 +406,7 @@ describe("agent skill routes", () => {
});
it("persists canonical desired skills when creating an agent directly", async () => {
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/companies/company-1/agents")
.send({
name: "QA Agent",
@@ -365,7 +414,7 @@
adapterType: "claude_local",
desiredSkills: ["paperclip"],
adapterConfig: {},
-});
+}));
expect([200, 201], JSON.stringify(res.body)).toContain(res.status);
expect(mockAgentService.create).toHaveBeenCalledWith(
@@ -388,7 +437,7 @@ describe("agent skill routes", () => {
});
it("materializes a managed AGENTS.md for directly created local agents", async () => {
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/companies/company-1/agents")
.send({
name: "QA Agent",
@@ -397,7 +446,7 @@
adapterConfig: {
promptTemplate: "You are QA.",
},
-});
+}));
expect([200, 201], JSON.stringify(res.body)).toContain(res.status);
expect(mockAgentService.update).toHaveBeenCalledWith(
@@ -418,14 +467,14 @@ describe("agent skill routes", () => {
});
it("materializes the bundled CEO instruction set for default CEO agents", async () => {
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/companies/company-1/agents")
.send({
name: "CEO",
role: "ceo",
adapterType: "claude_local",
adapterConfig: {},
-});
+}));
expect([200, 201], JSON.stringify(res.body)).toContain(res.status);
expect(mockAgentInstructionsService.materializeManagedBundle).toHaveBeenCalledWith(
@@ -445,14 +494,14 @@ describe("agent skill routes", () => {
});
it("materializes the bundled default instruction set for non-CEO agents with no prompt template", async () => {
-const res = await request(await createApp())
+const res = await requestApp(await createApp(), (baseUrl) => request(baseUrl)
.post("/api/companies/company-1/agents")
.send({
name: "Engineer",
role: "engineer",
adapterType: "claude_local",
adapterConfig: {},
-});
+}));
expect([200, 201], JSON.stringify(res.body)).toContain(res.status);
await vi.waitFor(() => {


@@ -92,7 +92,7 @@ describe("approval routes idempotent retries", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockApprovalService.list.mockReset();
mockApprovalService.getById.mockReset();
mockApprovalService.create.mockReset();

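The recurring `vi.resetAllMocks()` → `vi.clearAllMocks()` swap in these hunks is behavioral, not cosmetic: `resetAllMocks` discards both the recorded calls and any stubbed implementations, while `clearAllMocks` drops only the call history, so defaults installed once survive between tests. A hand-rolled sketch of that distinction with no vitest dependency (`MiniMock`, `makeMock`, and the `getById` stub are all illustrative):

```typescript
type AnyFn = (...args: unknown[]) => unknown;

// Minimal stand-in for a vi.fn()-style mock: records calls and can hold a
// stubbed implementation. Illustrative only; not the actual vitest API.
interface MiniMock {
  (...args: unknown[]): unknown;
  calls: unknown[][];
  impl?: AnyFn;
  clear(): void; // like vi.clearAllMocks(): drop recorded calls only
  reset(): void; // like vi.resetAllMocks(): drop calls AND the stub
}

function makeMock(): MiniMock {
  const fn = ((...args: unknown[]) => {
    fn.calls.push(args);
    return fn.impl?.(...args);
  }) as MiniMock;
  fn.calls = [];
  fn.clear = () => {
    fn.calls = [];
  };
  fn.reset = () => {
    fn.calls = [];
    fn.impl = undefined;
  };
  return fn;
}

const getById = makeMock();
getById.impl = () => ({ id: "agent-1" });

getById("agent-1");
getById.clear(); // history gone, stub survives
const afterClear = getById("agent-1") as { id: string } | undefined;

getById.reset(); // history AND stub gone
const afterReset = getById("agent-1") as { id: string } | undefined;

console.log(afterClear?.id, afterReset);
```

This is why the skill-route suite pairs `clearAllMocks` with explicit per-service `mockReset()` loops: clearing keeps shared defaults alive, and the loops reset only the mocks that tests restub themselves.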

@@ -106,6 +106,33 @@ async function createApp(storage: ReturnType<typeof createStorageService>) {
return app;
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
) {
const { createServer } = await vi.importActual<typeof import("node:http")>("node:http");
const server = createServer(app);
try {
await new Promise<void>((resolve) => {
server.listen(0, "127.0.0.1", resolve);
});
const address = server.address();
if (!address || typeof address === "string") {
throw new Error("Expected HTTP server to listen on a TCP port");
}
return await buildRequest(`http://127.0.0.1:${address.port}`);
} finally {
if (server.listening) {
await new Promise<void>((resolve, reject) => {
server.close((error) => {
if (error) reject(error);
else resolve();
});
});
}
}
}
describe("POST /api/companies/:companyId/assets/images", () => {
beforeEach(() => {
vi.resetModules();
@@ -116,7 +143,7 @@ describe("POST /api/companies/:companyId/assets/images", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
createAssetMock.mockReset();
getAssetByIdMock.mockReset();
logActivityMock.mockReset();
@@ -128,10 +155,12 @@ describe("POST /api/companies/:companyId/assets/images", () => {
createAssetMock.mockResolvedValue(createAsset());
-const res = await request(app)
-.post("/api/companies/company-1/assets/images")
-.field("namespace", "goals")
-.attach("file", Buffer.from("png"), "logo.png");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/assets/images")
+.field("namespace", "goals")
+.attach("file", Buffer.from("png"), "logo.png"),
+);
expect([200, 201], JSON.stringify(res.body)).toContain(res.status);
expect(res.body.contentPath).toBe("/api/assets/asset-1/content");
@@ -155,10 +184,12 @@ describe("POST /api/companies/:companyId/assets/images", () => {
originalFilename: "note.txt",
});
-const res = await request(app)
-.post("/api/companies/company-1/assets/images")
-.field("namespace", "issues/drafts")
-.attach("file", Buffer.from("hello"), { filename: "note.txt", contentType: "text/plain" });
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/assets/images")
+.field("namespace", "issues/drafts")
+.attach("file", Buffer.from("hello"), { filename: "note.txt", contentType: "text/plain" }),
+);
expect([200, 201]).toContain(res.status);
expect(res.body.contentPath).toBe("/api/assets/asset-1/content");
@@ -174,7 +205,7 @@ describe("POST /api/companies/:companyId/logo", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
createAssetMock.mockReset();
getAssetByIdMock.mockReset();
logActivityMock.mockReset();
@@ -186,11 +217,13 @@ describe("POST /api/companies/:companyId/logo", () => {
createAssetMock.mockResolvedValue(createAsset());
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach("file", Buffer.from("png"), "logo.png");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach("file", Buffer.from("png"), "logo.png"),
+);
-expect(res.status).toBe(201);
+expect(res.status, JSON.stringify({ body: res.body, text: res.text, createCalls: createAssetMock.mock.calls.length })).toBe(201);
expect(res.body.contentPath).toBe("/api/assets/asset-1/content");
expect(createAssetMock).toHaveBeenCalledTimes(1);
expect(png.__calls.putFileInputs[0]).toMatchObject({
@@ -212,17 +245,19 @@ describe("POST /api/companies/:companyId/logo", () => {
originalFilename: "logo.svg",
});
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach(
-"file",
-Buffer.from(
-"<svg xmlns='http://www.w3.org/2000/svg' onload='alert(1)'><script>alert(1)</script><a href='https://evil.example/'><circle cx='12' cy='12' r='10'/></a></svg>",
-),
-"logo.svg",
-);
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach(
+"file",
+Buffer.from(
+"<svg xmlns='http://www.w3.org/2000/svg' onload='alert(1)'><script>alert(1)</script><a href='https://evil.example/'><circle cx='12' cy='12' r='10'/></a></svg>",
+),
+"logo.svg",
+),
+);
-expect(res.status).toBe(201);
+expect(res.status, JSON.stringify({ body: res.body, text: res.text, createCalls: createAssetMock.mock.calls.length })).toBe(201);
expect(svg.__calls.putFileInputs).toHaveLength(1);
const stored = svg.__calls.putFileInputs[0];
expect(stored.contentType).toBe("image/svg+xml");
@@ -241,11 +276,13 @@ describe("POST /api/companies/:companyId/logo", () => {
createAssetMock.mockResolvedValue(createAsset());
const file = Buffer.alloc(150 * 1024, "a");
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach("file", file, "within-limit.png");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach("file", file, "within-limit.png"),
+);
-expect(res.status).toBe(201);
+expect(res.status, JSON.stringify({ body: res.body, text: res.text, createCalls: createAssetMock.mock.calls.length })).toBe(201);
});
it("rejects logo files larger than the general attachment limit", async () => {
@@ -253,9 +290,11 @@ describe("POST /api/companies/:companyId/logo", () => {
createAssetMock.mockResolvedValue(createAsset());
const file = Buffer.alloc(MAX_ATTACHMENT_BYTES + 1, "a");
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach("file", file, "too-large.png");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach("file", file, "too-large.png"),
+);
expect(res.status).toBe(422);
expect(res.body.error).toBe(`Image exceeds ${MAX_ATTACHMENT_BYTES} bytes`);
@@ -265,9 +304,11 @@ describe("POST /api/companies/:companyId/logo", () => {
const app = await createApp(createStorageService("text/plain"));
createAssetMock.mockResolvedValue(createAsset());
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach("file", Buffer.from("not an image"), "note.txt");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach("file", Buffer.from("not an image"), "note.txt"),
+);
expect(res.status).toBe(422);
expect(res.body.error).toBe("Unsupported image type: text/plain");
@@ -278,9 +319,11 @@ describe("POST /api/companies/:companyId/logo", () => {
const app = await createApp(createStorageService("image/svg+xml"));
createAssetMock.mockResolvedValue(createAsset());
-const res = await request(app)
-.post("/api/companies/company-1/logo")
-.attach("file", Buffer.from("not actually svg"), "logo.svg");
+const res = await requestApp(app, (baseUrl) =>
+request(baseUrl)
+.post("/api/companies/company-1/logo")
+.attach("file", Buffer.from("not actually svg"), "logo.svg"),
+);
expect(res.status).toBe(422);
expect(res.body.error).toBe("SVG could not be sanitized");


@@ -1,6 +1,8 @@
import express from "express";
import request from "supertest";
-import { beforeEach, describe, expect, it, vi } from "vitest";
+import { describe, expect, it } from "vitest";
+import { errorHandler } from "../middleware/index.js";
+import { authRoutes } from "../routes/auth.js";
function createSelectChain(rows: unknown[]) {
return {
@@ -32,16 +34,12 @@ function createUpdateChain(row: unknown) {
function createDb(row: Record<string, unknown>) {
return {
-select: vi.fn(() => createSelectChain([row])),
-update: vi.fn(() => createUpdateChain(row)),
+select: () => createSelectChain([row]),
+update: () => createUpdateChain(row),
} as any;
}
-async function createApp(actor: Express.Request["actor"], row: Record<string, unknown>) {
-const [{ authRoutes }, { errorHandler }] = await Promise.all([
-import("../routes/auth.js"),
-import("../middleware/index.js"),
-]);
+function createApp(actor: Express.Request["actor"], row: Record<string, unknown>) {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
@@ -53,7 +51,7 @@ async function createApp(actor: Express.Request["actor"], row: Record<string, un
return app;
}
describe("auth routes", () => {
describe.sequential("auth routes", () => {
const baseUser = {
id: "user-1",
name: "Jane Example",
@@ -61,10 +59,6 @@ describe("auth routes", () => {
image: "https://example.com/jane.png",
};
-beforeEach(() => {
-vi.resetModules();
-});
it("returns the persisted user profile in the session payload", async () => {
const app = await createApp(
{


@@ -415,7 +415,7 @@ describe("claude execute", () => {
const previousPaperclipInstanceId = process.env.PAPERCLIP_INSTANCE_ID;
process.env.HOME = root;
process.env.PAPERCLIP_HOME = paperclipHome;
process.env.PAPERCLIP_INSTANCE_ID = "default";
delete process.env.PAPERCLIP_INSTANCE_ID;
try {
const first = await execute({
@@ -574,7 +574,7 @@ describe("claude execute", () => {
const previousPaperclipInstanceId = process.env.PAPERCLIP_INSTANCE_ID;
process.env.HOME = root;
process.env.PAPERCLIP_HOME = paperclipHome;
process.env.PAPERCLIP_INSTANCE_ID = "default";
delete process.env.PAPERCLIP_INSTANCE_ID;
try {
const first = await execute({
@@ -711,8 +711,9 @@ describe("claude execute", () => {
expect(result.exitCode).toBe(1);
expect(result.errorCode).toBe("claude_transient_upstream");
expect(result.errorFamily).toBe("transient_upstream");
-expect(result.retryNotBefore).toBe("2026-04-22T21:00:00.000Z");
-expect(result.resultJson?.retryNotBefore).toBe("2026-04-22T21:00:00.000Z");
+const expectedRetryNotBefore = "2026-04-22T21:00:00.000Z";
+expect(result.retryNotBefore).toBe(expectedRetryNotBefore);
+expect(result.resultJson?.retryNotBefore).toBe(expectedRetryNotBefore);
expect(result.errorMessage ?? "").toContain("extra usage");
expect(new Date(String(result.resultJson?.transientRetryNotBefore)).getTime()).toBe(
new Date("2026-04-22T21:00:00.000Z").getTime(),


@@ -1,7 +1,6 @@
import type { Server } from "node:http";
import express from "express";
import request from "supertest";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { beforeEach, describe, expect, it, vi } from "vitest";
const mockAccessService = vi.hoisted(() => ({
isInstanceAdmin: vi.fn(),
@@ -35,20 +34,6 @@ vi.mock("../services/index.js", () => ({
deduplicateAgentName: vi.fn((name: string) => name),
}));
-let currentServer: Server | null = null;
-async function closeCurrentServer() {
-if (!currentServer) return;
-const server = currentServer;
-currentServer = null;
-await new Promise<void>((resolve, reject) => {
-server.close((err) => {
-if (err) reject(err);
-else resolve();
-});
-});
-}
function registerModuleMocks() {
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
@@ -62,16 +47,31 @@ function registerModuleMocks() {
}));
}
+let appImportCounter = 0;
async function createApp(actor: any, db: any = {} as any) {
-await closeCurrentServer();
+appImportCounter += 1;
+const routeModulePath = `../routes/access.js?cli-auth-routes-${appImportCounter}`;
+const middlewareModulePath = `../middleware/index.js?cli-auth-routes-${appImportCounter}`;
const [{ accessRoutes }, { errorHandler }] = await Promise.all([
-vi.importActual<typeof import("../routes/access.js")>("../routes/access.js"),
-vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
+import(routeModulePath) as Promise<typeof import("../routes/access.js")>,
+import(middlewareModulePath) as Promise<typeof import("../middleware/index.js")>,
]);
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
-req.actor = actor;
+req.actor = {
+...actor,
+companyIds: Array.isArray(actor.companyIds) ? [...actor.companyIds] : actor.companyIds,
+memberships: Array.isArray(actor.memberships)
+? actor.memberships.map((membership: unknown) =>
+typeof membership === "object" && membership !== null
+? { ...membership }
+: membership,
+)
+: actor.memberships,
+};
next();
});
app.use(
@@ -84,13 +84,10 @@ async function createApp(actor: any, db: any = {} as any) {
}),
);
app.use(errorHandler);
-currentServer = app.listen(0);
-return currentServer;
+return app;
}
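The `createApp` above busts the ESM module cache by appending a unique query string to each dynamic `import()`, since Node caches modules by their fully resolved URL. A standalone sketch of that trick, assuming only the Node standard library (the temp-file module and `stamp` export are illustrative):

```typescript
import { mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { pathToFileURL } from "node:url";

// Write a throwaway ES module whose top level runs once per instantiation,
// so a re-evaluation is observable as a distinct module namespace object.
const dir = mkdtempSync(join(tmpdir(), "fresh-import-"));
const file = join(dir, "stamp.mjs");
writeFileSync(file, "export const stamp = Math.random();\n");

const url = pathToFileURL(file).href;
const a = await import(url);
const b = await import(url); // same URL: served from the module cache
const c = await import(`${url}?v=2`); // distinct URL: module re-evaluated

console.log(a === b, a === c);
```

Each re-evaluated copy holds its own module-level state, which is why the test bumps a counter per `createApp` call: every app gets fresh route and middleware modules instead of sharing state cached by a previous test.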
describe("cli auth routes", () => {
afterEach(closeCurrentServer);
describe.sequential("cli auth routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/index.js");
@@ -101,7 +98,7 @@ describe("cli auth routes", () => {
vi.resetAllMocks();
});
it("creates a CLI auth challenge with approval metadata", async () => {
it.sequential("creates a CLI auth challenge with approval metadata", async () => {
mockBoardAuthService.createCliAuthChallenge.mockResolvedValue({
challenge: {
id: "challenge-1",
@@ -120,7 +117,7 @@ describe("cli auth routes", () => {
requestedAccess: "board",
});
expect(res.status).toBe(201);
expect(res.status, res.text || JSON.stringify(res.body)).toBe(201);
expect(res.body).toMatchObject({
id: "challenge-1",
token: "pcp_cli_auth_secret",
@@ -132,18 +129,18 @@ describe("cli auth routes", () => {
expect(res.body.approvalUrl).toContain("/cli-auth/challenge-1?token=pcp_cli_auth_secret");
});
it("rejects anonymous access to generic skill documents", async () => {
const app = await createApp({ type: "none", source: "none" });
const [indexRes, skillRes] = await Promise.all([
request(app).get("/api/skills/index"),
request(app).get("/api/skills/paperclip"),
]);
it.sequential("rejects anonymous access to generic skill documents", async () => {
const indexApp = await createApp({ type: "none", source: "none" });
const skillApp = await createApp({ type: "none", source: "none" });
expect(indexRes.status).toBe(401);
expect(skillRes.status).toBe(401);
const indexRes = await request(indexApp).get("/api/skills/index");
const skillRes = await request(skillApp).get("/api/skills/paperclip");
expect(indexRes.status, JSON.stringify(indexRes.body)).toBe(401);
expect(skillRes.status, skillRes.text || JSON.stringify(skillRes.body)).toBe(401);
});
it("serves the invite-scoped paperclip skill anonymously for active invites", async () => {
it.sequential("serves the invite-scoped paperclip skill anonymously for active invites", async () => {
const invite = {
id: "invite-1",
companyId: "company-1",
@@ -174,7 +171,7 @@ describe("cli auth routes", () => {
expect(res.text).toContain("# Paperclip Skill");
});
it("marks challenge status as requiring sign-in for anonymous viewers", async () => {
it.sequential("marks challenge status as requiring sign-in for anonymous viewers", async () => {
mockBoardAuthService.describeCliAuthChallenge.mockResolvedValue({
id: "challenge-1",
status: "pending",
@@ -197,7 +194,7 @@ describe("cli auth routes", () => {
expect(res.body.canApprove).toBe(false);
});
it("approves a CLI auth challenge for a signed-in board user", async () => {
it.sequential("approves a CLI auth challenge for a signed-in board user", async () => {
mockBoardAuthService.approveCliAuthChallenge.mockResolvedValue({
status: "approved",
challenge: {
@@ -242,7 +239,7 @@ describe("cli auth routes", () => {
);
});
it("logs approve activity for instance admins without company memberships", async () => {
it.sequential("logs approve activity for instance admins without company memberships", async () => {
mockBoardAuthService.approveCliAuthChallenge.mockResolvedValue({
status: "approved",
challenge: {
@@ -275,7 +272,7 @@ describe("cli auth routes", () => {
expect(mockLogActivity).toHaveBeenCalledTimes(2);
});
it("logs revoke activity with resolved audit company ids", async () => {
it.sequential("logs revoke activity with resolved audit company ids", async () => {
mockBoardAuthService.assertCurrentBoardKey.mockResolvedValue({
id: "board-key-3",
userId: "admin-2",


@@ -91,7 +91,7 @@ describe("PATCH /api/companies/:companyId/branding", () => {
vi.doUnmock("../routes/companies.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
vi.resetAllMocks();
vi.clearAllMocks();
});
it("rejects non-CEO agent callers", async () => {


@@ -39,37 +39,45 @@ const mockFeedbackService = vi.hoisted(() => ({
saveIssueVote: vi.fn(),
}));
function registerModuleMocks() {
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
vi.mock("../services/access.js", () => ({
accessService: () => mockAccessService,
}));
vi.doMock("../services/access.js", () => ({
accessService: () => mockAccessService,
}));
vi.mock("../services/activity-log.js", () => ({
logActivity: mockLogActivity,
}));
vi.doMock("../services/activity-log.js", () => ({
logActivity: mockLogActivity,
}));
vi.mock("../services/agents.js", () => ({
agentService: () => mockAgentService,
}));
vi.doMock("../services/agents.js", () => ({
agentService: () => mockAgentService,
}));
vi.mock("../services/budgets.js", () => ({
budgetService: () => mockBudgetService,
}));
vi.doMock("../services/budgets.js", () => ({
budgetService: () => mockBudgetService,
}));
vi.mock("../services/companies.js", () => ({
companyService: () => mockCompanyService,
}));
vi.doMock("../services/companies.js", () => ({
companyService: () => mockCompanyService,
}));
vi.mock("../services/company-portability.js", () => ({
companyPortabilityService: () => mockCompanyPortabilityService,
}));
vi.doMock("../services/company-portability.js", () => ({
companyPortabilityService: () => mockCompanyPortabilityService,
}));
vi.mock("../services/feedback.js", () => ({
feedbackService: () => mockFeedbackService,
}));
vi.doMock("../services/feedback.js", () => ({
feedbackService: () => mockFeedbackService,
}));
vi.mock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
budgetService: () => mockBudgetService,
companyPortabilityService: () => mockCompanyPortabilityService,
companyService: () => mockCompanyService,
feedbackService: () => mockFeedbackService,
logActivity: mockLogActivity,
}));
function registerCompanyRouteMocks() {
vi.doMock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
@@ -81,10 +89,16 @@ function registerModuleMocks() {
}));
}
let appImportCounter = 0;
async function createApp(actor: Record<string, unknown>) {
registerCompanyRouteMocks();
appImportCounter += 1;
const routeModulePath = `../routes/companies.js?company-portability-routes-${appImportCounter}`;
const middlewareModulePath = `../middleware/index.js?company-portability-routes-${appImportCounter}`;
const [{ companyRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/companies.js")>("../routes/companies.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
import(routeModulePath) as Promise<typeof import("../routes/companies.js")>,
import(middlewareModulePath) as Promise<typeof import("../middleware/index.js")>,
]);
const app = express();
app.use(express.json());
@@ -98,6 +112,8 @@ async function createApp(actor: Record<string, unknown>) {
}
const companyId = "11111111-1111-4111-8111-111111111111";
const ceoAgentId = "ceo-agent";
const engineerAgentId = "engineer-agent";
const exportRequest = {
include: { company: true, agents: true, projects: true },
@@ -123,33 +139,36 @@ function createExportResult() {
};
}
describe("company portability routes", () => {
describe.sequential("company portability routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/access.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/agents.js");
vi.doUnmock("../services/budgets.js");
vi.doUnmock("../services/companies.js");
vi.doUnmock("../services/company-portability.js");
vi.doUnmock("../services/feedback.js");
vi.doUnmock("../services/index.js");
vi.doUnmock("../routes/companies.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockAgentService.getById.mockImplementation(async (id: string) => ({
id,
companyId,
role: id === ceoAgentId ? "ceo" : "engineer",
}));
mockCompanyPortabilityService.exportBundle.mockResolvedValue(createExportResult());
mockCompanyPortabilityService.previewExport.mockResolvedValue({
rootPath: "paperclip",
manifest: { agents: [], skills: [], projects: [], issues: [], envInputs: [], includes: { company: true, agents: true, projects: true, issues: false, skills: false }, company: null, schemaVersion: 1, generatedAt: new Date().toISOString(), source: null },
files: {},
fileInventory: [],
counts: { files: 0, agents: 0, skills: 0, projects: 0, issues: 0 },
warnings: [],
paperclipExtensionPath: ".paperclip.yaml",
});
mockCompanyPortabilityService.previewImport.mockResolvedValue({ ok: true });
mockCompanyPortabilityService.importBundle.mockResolvedValue({
company: { id: companyId, action: "created" },
agents: [],
warnings: [],
});
});
it("rejects non-CEO agents from CEO-safe export preview routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId,
role: "engineer",
});
it.sequential("rejects non-CEO agents from CEO-safe export preview routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: engineerAgentId,
companyId,
source: "agent_key",
runId: "run-1",
@@ -164,15 +183,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.previewExport).not.toHaveBeenCalled();
});
it("rejects non-CEO agents from legacy and CEO-safe export bundle routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId,
role: "engineer",
});
it.sequential("rejects non-CEO agents from legacy and CEO-safe export bundle routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: engineerAgentId,
companyId,
source: "agent_key",
runId: "run-1",
@@ -187,12 +201,7 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.exportBundle).not.toHaveBeenCalled();
});
it("allows CEO agents to use company-scoped export preview routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId,
role: "ceo",
});
it.sequential("allows CEO agents to use company-scoped export preview routes", async () => {
mockCompanyPortabilityService.previewExport.mockResolvedValue({
rootPath: "paperclip",
manifest: { agents: [], skills: [], projects: [], issues: [], envInputs: [], includes: { company: true, agents: true, projects: true, issues: false, skills: false }, company: null, schemaVersion: 1, generatedAt: new Date().toISOString(), source: null },
@@ -204,7 +213,7 @@ describe("company portability routes", () => {
});
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: ceoAgentId,
companyId,
source: "agent_key",
runId: "run-1",
@@ -218,16 +227,11 @@ describe("company portability routes", () => {
expect(res.body.rootPath).toBe("paperclip");
});
it("allows CEO agents to export through legacy and CEO-safe bundle routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId,
role: "ceo",
});
it.sequential("allows CEO agents to export through legacy and CEO-safe bundle routes", async () => {
mockCompanyPortabilityService.exportBundle.mockResolvedValue(createExportResult());
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: ceoAgentId,
companyId,
source: "agent_key",
runId: "run-1",
@@ -244,7 +248,7 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.exportBundle).toHaveBeenNthCalledWith(2, companyId, exportRequest);
});
it("allows board users to export through legacy and CEO-safe bundle routes", async () => {
it.sequential("allows board users to export through legacy and CEO-safe bundle routes", async () => {
mockCompanyPortabilityService.exportBundle.mockResolvedValue(createExportResult());
const app = await createApp({
type: "board",
@@ -263,15 +267,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.exportBundle).toHaveBeenCalledTimes(2);
});
it("rejects replace collision strategy on CEO-safe import routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId: "11111111-1111-4111-8111-111111111111",
role: "ceo",
});
it.sequential("rejects replace collision strategy on CEO-safe import routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: ceoAgentId,
companyId: "11111111-1111-4111-8111-111111111111",
source: "agent_key",
runId: "run-1",
@@ -291,10 +290,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.previewImport).not.toHaveBeenCalled();
});
it("keeps global import preview routes board-only", async () => {
it.sequential("keeps global import preview routes board-only", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: engineerAgentId,
companyId: "11111111-1111-4111-8111-111111111111",
source: "agent_key",
runId: "run-1",
@@ -313,7 +312,7 @@ describe("company portability routes", () => {
expect(res.body.error).toContain("Board access required");
});
it("requires instance admin for new-company import preview", async () => {
it.sequential("requires instance admin for new-company import preview", async () => {
const app = await createApp({
type: "board",
userId: "user-1",
@@ -336,15 +335,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.previewImport).not.toHaveBeenCalled();
});
it("rejects replace collision strategy on CEO-safe import apply routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId: "11111111-1111-4111-8111-111111111111",
role: "ceo",
});
it.sequential("rejects replace collision strategy on CEO-safe import apply routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: ceoAgentId,
companyId: "11111111-1111-4111-8111-111111111111",
source: "agent_key",
runId: "run-1",
@@ -364,15 +358,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.importBundle).not.toHaveBeenCalled();
});
it("rejects non-CEO agents from CEO-safe import preview routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId: "11111111-1111-4111-8111-111111111111",
role: "engineer",
});
it.sequential("rejects non-CEO agents from CEO-safe import preview routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: engineerAgentId,
companyId: "11111111-1111-4111-8111-111111111111",
source: "agent_key",
runId: "run-1",
@@ -392,15 +381,10 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.previewImport).not.toHaveBeenCalled();
});
it("rejects non-CEO agents from CEO-safe import apply routes", async () => {
mockAgentService.getById.mockResolvedValue({
id: "agent-1",
companyId: "11111111-1111-4111-8111-111111111111",
role: "engineer",
});
it.sequential("rejects non-CEO agents from CEO-safe import apply routes", async () => {
const app = await createApp({
type: "agent",
agentId: "agent-1",
agentId: engineerAgentId,
companyId: "11111111-1111-4111-8111-111111111111",
source: "agent_key",
runId: "run-1",
@@ -420,7 +404,7 @@ describe("company portability routes", () => {
expect(mockCompanyPortabilityService.importBundle).not.toHaveBeenCalled();
});
it("requires instance admin for new-company import apply", async () => {
it.sequential("requires instance admin for new-company import apply", async () => {
const app = await createApp({
type: "board",
userId: "user-1",


@@ -86,7 +86,7 @@ describe("company skill mutation permissions", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
mockCompanySkillService.importFromSource.mockResolvedValue({
imported: [],


@@ -141,6 +141,26 @@ describe("environment config helpers", () => {
});
});
it("normalizes schema-driven sandbox config into the generic plugin-backed stored shape", () => {
const config = normalizeEnvironmentConfig({
driver: "sandbox",
config: {
provider: "secure-plugin",
template: " base ",
apiKey: "22222222-2222-2222-2222-222222222222",
timeoutMs: "450000",
},
});
expect(config).toEqual({
provider: "secure-plugin",
template: " base ",
apiKey: "22222222-2222-2222-2222-222222222222",
timeoutMs: 450000,
reuseLease: false,
});
});
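The normalization exercised above (string `"450000"` coerced to a number while `" base "` passes through untouched) can be driven by the provider's declared schema; this hypothetical `coerceBySchema` illustrates the idea and is not the actual `normalizeEnvironmentConfig`:

```typescript
interface JsonSchema {
  type: string;
  properties?: Record<string, JsonSchema & { format?: string }>;
}

// Coerce raw (possibly string-valued) config fields to the types a provider's
// JSON schema declares. Strings pass through untouched; secret-ref fields keep
// their string value (a secret id) for later resolution.
function coerceBySchema(schema: JsonSchema, raw: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = { ...raw };
  for (const [key, prop] of Object.entries(schema.properties ?? {})) {
    const value = out[key];
    if (value === undefined) continue;
    if (prop.type === "number" && typeof value === "string" && value.trim() !== "") {
      const n = Number(value);
      if (Number.isFinite(n)) out[key] = n;
    } else if (prop.type === "boolean" && typeof value === "string") {
      out[key] = value === "true";
    }
  }
  return out;
}
```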
it("normalizes plugin-backed sandbox provider config without server provider changes", () => {
const config = normalizeEnvironmentConfig({
driver: "sandbox",
@@ -162,6 +182,30 @@ describe("environment config helpers", () => {
});
});
it("parses a persisted schema-driven sandbox environment into a typed driver config", () => {
const parsed = parseEnvironmentDriverConfig({
driver: "sandbox",
config: {
provider: "secure-plugin",
template: "base",
apiKey: "22222222-2222-2222-2222-222222222222",
timeoutMs: 300000,
reuseLease: true,
},
});
expect(parsed).toEqual({
driver: "sandbox",
config: {
provider: "secure-plugin",
template: "base",
apiKey: "22222222-2222-2222-2222-222222222222",
timeoutMs: 300000,
reuseLease: true,
},
});
});
it("parses a persisted plugin-backed sandbox environment into a typed driver config", () => {
const parsed = parseEnvironmentDriverConfig({
driver: "sandbox",


@@ -3,6 +3,7 @@ import { beforeEach, describe, expect, it, vi } from "vitest";
const mockEnsureSshWorkspaceReady = vi.hoisted(() => vi.fn());
const mockProbePluginEnvironmentDriver = vi.hoisted(() => vi.fn());
const mockProbePluginSandboxProviderDriver = vi.hoisted(() => vi.fn());
const mockResolvePluginSandboxProviderDriverByKey = vi.hoisted(() => vi.fn());
vi.mock("@paperclipai/adapter-utils/ssh", () => ({
ensureSshWorkspaceReady: mockEnsureSshWorkspaceReady,
@@ -11,6 +12,7 @@ vi.mock("@paperclipai/adapter-utils/ssh", () => ({
vi.mock("../services/plugin-environment-driver.js", () => ({
probePluginEnvironmentDriver: mockProbePluginEnvironmentDriver,
probePluginSandboxProviderDriver: mockProbePluginSandboxProviderDriver,
resolvePluginSandboxProviderDriverByKey: mockResolvePluginSandboxProviderDriverByKey,
}));
import { probeEnvironment } from "../services/environment-probe.ts";
@@ -20,6 +22,8 @@ describe("probeEnvironment", () => {
mockEnsureSshWorkspaceReady.mockReset();
mockProbePluginEnvironmentDriver.mockReset();
mockProbePluginSandboxProviderDriver.mockReset();
mockResolvePluginSandboxProviderDriverByKey.mockReset();
mockResolvePluginSandboxProviderDriverByKey.mockResolvedValue(null);
});
it("reports local environments as immediately available", async () => {


@@ -38,6 +38,7 @@ const mockSecretService = vi.hoisted(() => ({
resolveSecretValue: vi.fn(),
}));
const mockValidatePluginEnvironmentDriverConfig = vi.hoisted(() => vi.fn());
const mockValidatePluginSandboxProviderConfig = vi.hoisted(() => vi.fn());
const mockListReadyPluginEnvironmentDrivers = vi.hoisted(() => vi.fn());
const mockExecutionWorkspaceService = vi.hoisted(() => ({}));
@@ -69,6 +70,7 @@ vi.mock("../services/execution-workspaces.js", () => ({
vi.mock("../services/plugin-environment-driver.js", () => ({
listReadyPluginEnvironmentDrivers: mockListReadyPluginEnvironmentDrivers,
validatePluginEnvironmentDriverConfig: mockValidatePluginEnvironmentDriverConfig,
validatePluginSandboxProviderConfig: mockValidatePluginSandboxProviderConfig,
}));
function createEnvironment() {
@@ -148,6 +150,18 @@ describe("environment routes", () => {
});
mockValidatePluginEnvironmentDriverConfig.mockReset();
mockValidatePluginEnvironmentDriverConfig.mockImplementation(async ({ config }) => config);
mockValidatePluginSandboxProviderConfig.mockReset();
mockValidatePluginSandboxProviderConfig.mockImplementation(async ({ provider, config }) => ({
normalizedConfig: config,
pluginId: `plugin-${provider}`,
pluginKey: `plugin.${provider}`,
driver: {
driverKey: provider,
kind: "sandbox_provider",
displayName: provider,
configSchema: { type: "object" },
},
}));
mockListReadyPluginEnvironmentDrivers.mockReset();
mockListReadyPluginEnvironmentDrivers.mockResolvedValue([]);
});
@@ -185,6 +199,52 @@ describe("environment routes", () => {
expect(res.body.sandboxProviders).not.toHaveProperty("fake-plugin");
});
it("returns installed plugin-backed sandbox capabilities for environment creation", async () => {
mockListReadyPluginEnvironmentDrivers.mockResolvedValue([
{
pluginId: "plugin-1",
pluginKey: "acme.secure-sandbox-provider",
driverKey: "secure-plugin",
displayName: "Secure Sandbox",
description: "Provisions schema-driven cloud sandboxes.",
configSchema: {
type: "object",
properties: {
template: { type: "string" },
apiKey: { type: "string", format: "secret-ref" },
},
},
},
]);
const app = createApp({
type: "board",
userId: "user-1",
source: "local_implicit",
});
const res = await request(app).get("/api/companies/company-1/environments/capabilities");
expect(res.status).toBe(200);
expect(res.body.sandboxProviders["secure-plugin"]).toMatchObject({
status: "supported",
supportsRunExecution: true,
supportsReusableLeases: true,
displayName: "Secure Sandbox",
source: "plugin",
pluginKey: "acme.secure-sandbox-provider",
pluginId: "plugin-1",
configSchema: {
type: "object",
properties: {
template: { type: "string" },
apiKey: { type: "string", format: "secret-ref" },
},
},
});
expect(res.body.adapters.find((row: any) => row.adapterType === "codex_local").sandboxProviders["secure-plugin"])
.toBe("supported");
});
it("redacts config and metadata for unprivileged agent list reads", async () => {
mockEnvironmentService.list.mockResolvedValue([createEnvironment()]);
mockAgentService.getById.mockResolvedValue({
@@ -453,11 +513,12 @@ describe("environment routes", () => {
},
};
mockEnvironmentService.create.mockResolvedValue(environment);
const pluginWorkerManager = {};
const app = createApp({
type: "board",
userId: "user-1",
source: "local_implicit",
});
}, { pluginWorkerManager });
const res = await request(app)
.post("/api/companies/company-1/environments")
@@ -531,11 +592,12 @@ describe("environment routes", () => {
},
};
mockEnvironmentService.create.mockResolvedValue(environment);
const pluginWorkerManager = {};
const app = createApp({
type: "board",
userId: "user-1",
source: "local_implicit",
});
}, { pluginWorkerManager });
const res = await request(app)
.post("/api/companies/company-1/environments")
@@ -551,6 +613,16 @@ describe("environment routes", () => {
});
expect(res.status).toBe(201);
expect(mockValidatePluginSandboxProviderConfig).toHaveBeenCalledWith({
db: expect.anything(),
workerManager: pluginWorkerManager,
provider: "fake-plugin",
config: {
image: "fake:test",
timeoutMs: 450000,
reuseLease: true,
},
});
expect(mockEnvironmentService.create).toHaveBeenCalledWith("company-1", {
name: "Fake plugin Sandbox",
driver: "sandbox",
@@ -565,6 +637,101 @@ describe("environment routes", () => {
expect(mockSecretService.create).not.toHaveBeenCalled();
});
it("creates a schema-driven sandbox environment with secret-ref fields persisted as secrets", async () => {
const environment = {
...createEnvironment(),
id: "env-sandbox-secure-plugin",
name: "Secure Sandbox",
driver: "sandbox" as const,
config: {
provider: "secure-plugin",
template: "base",
apiKey: "11111111-1111-1111-1111-111111111111",
timeoutMs: 450000,
reuseLease: true,
},
};
mockEnvironmentService.create.mockResolvedValue(environment);
mockValidatePluginSandboxProviderConfig.mockResolvedValue({
normalizedConfig: {
template: "base",
apiKey: "test-provider-key",
timeoutMs: 450000,
reuseLease: true,
},
pluginId: "plugin-secure",
pluginKey: "acme.secure-sandbox-provider",
driver: {
driverKey: "secure-plugin",
kind: "sandbox_provider",
displayName: "Secure Sandbox",
configSchema: {
type: "object",
properties: {
template: { type: "string" },
apiKey: { type: "string", format: "secret-ref" },
timeoutMs: { type: "number" },
reuseLease: { type: "boolean" },
},
},
},
});
const pluginWorkerManager = {};
const app = createApp({
type: "board",
userId: "user-1",
source: "local_implicit",
}, { pluginWorkerManager });
const res = await request(app)
.post("/api/companies/company-1/environments")
.send({
name: "Secure Sandbox",
driver: "sandbox",
config: {
provider: "secure-plugin",
template: " base ",
apiKey: " test-provider-key ",
timeoutMs: "450000",
reuseLease: true,
},
});
expect(res.status).toBe(201);
expect(mockValidatePluginSandboxProviderConfig).toHaveBeenCalledWith({
db: expect.anything(),
workerManager: pluginWorkerManager,
provider: "secure-plugin",
config: {
template: " base ",
apiKey: " test-provider-key ",
timeoutMs: 450000,
reuseLease: true,
},
});
expect(mockEnvironmentService.create).toHaveBeenCalledWith("company-1", {
name: "Secure Sandbox",
driver: "sandbox",
status: "active",
config: {
provider: "secure-plugin",
template: "base",
apiKey: "11111111-1111-1111-1111-111111111111",
timeoutMs: 450000,
reuseLease: true,
},
});
expect(JSON.stringify(mockEnvironmentService.create.mock.calls[0][1])).not.toContain("test-provider-key");
expect(mockSecretService.create).toHaveBeenCalledWith(
"company-1",
expect.objectContaining({
provider: "local_encrypted",
value: "test-provider-key",
}),
expect.any(Object),
);
});
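The secret-ref flow asserted above (the plaintext `apiKey` is stored as a Paperclip secret and the persisted config carries only the secret id) can be sketched generically; `findSecretRefFields` and `persistSecretRefs` are illustrative assumptions, not the repo's implementation:

```typescript
interface FieldSchema { type: string; format?: string; }
interface ObjectSchema { type: "object"; properties?: Record<string, FieldSchema>; }

// List config fields the provider schema marks as secret references.
function findSecretRefFields(schema: ObjectSchema): string[] {
  return Object.entries(schema.properties ?? {})
    .filter(([, prop]) => prop.type === "string" && prop.format === "secret-ref")
    .map(([key]) => key);
}

// Replace each secret-ref field's plaintext value with the id returned by a
// persistence callback, so the stored environment config never contains secrets.
async function persistSecretRefs(
  schema: ObjectSchema,
  config: Record<string, unknown>,
  storeSecret: (value: string) => Promise<string>,
): Promise<Record<string, unknown>> {
  const out = { ...config };
  for (const field of findSecretRefFields(schema)) {
    const value = out[field];
    if (typeof value === "string" && value.length > 0) {
      out[field] = await storeSecret(value);
    }
  }
  return out;
}
```

At execute/release time the runtime does the inverse lookup, resolving the stored id back to the secret value before calling the plugin worker.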
it("validates plugin environment config through the plugin driver host", async () => {
const environment = {
...createEnvironment(),
@@ -997,12 +1164,13 @@ describe("environment routes", () => {
summary: "Fake plugin sandbox provider is ready.",
details: { provider: "fake-plugin" },
});
const pluginWorkerManager = {};
const app = createApp({
type: "board",
userId: "user-1",
source: "local_implicit",
runId: "run-1",
});
}, { pluginWorkerManager });
const res = await request(app)
.post("/api/companies/company-1/environments/probe-config")
@@ -1031,7 +1199,7 @@ describe("environment routes", () => {
}),
}),
expect.objectContaining({
pluginWorkerManager: undefined,
pluginWorkerManager,
resolvedConfig: expect.objectContaining({
driver: "sandbox",
}),


@@ -56,6 +56,7 @@ describe("findReusableSandboxLeaseId", () => {
metadata: {
provider: "fake-plugin",
image: "template-a",
timeoutMs: 300000,
reuseLease: true,
},
},
@@ -64,13 +65,14 @@ describe("findReusableSandboxLeaseId", () => {
metadata: {
provider: "fake-plugin",
image: "template-b",
timeoutMs: 300000,
reuseLease: true,
},
},
],
});
expect(selected).toBe("sandbox-template-a");
expect(selected).toBe("sandbox-template-b");
});
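Reusable-lease selection as exercised above compares each lease's recorded metadata against the environment's current provider config; a generic sketch under assumed matching rules (field-by-field equality, most recent match wins — this is not the repo's `findReusableSandboxLeaseId`):

```typescript
interface Lease { id: string; metadata: Record<string, unknown>; }

// A lease is reusable only if every config field recorded in its metadata still
// matches the environment's current provider config; among matches, the most
// recently listed lease wins.
function findReusableLeaseId(
  config: Record<string, unknown>,
  leases: Lease[],
  fields: string[],
): string | null {
  for (let i = leases.length - 1; i >= 0; i -= 1) {
    const lease = leases[i];
    if (lease && fields.every((field) => lease.metadata[field] === config[field])) {
      return lease.id;
    }
  }
  return null;
}
```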
it("requires image identity for reusable fake sandbox leases", () => {
@@ -476,7 +478,12 @@ describeEmbeddedPostgres("environmentRuntimeService", () => {
const workerManager = {
isRunning: vi.fn((id: string) => id === pluginId),
call: vi.fn(async (_pluginId: string, method: string, params: any) => {
expect(params.config).toEqual(expect.objectContaining(fakePluginConfig));
expect(params.config).toEqual(expect.objectContaining({
image: "fake:test",
timeoutMs: 1234,
reuseLease: false,
}));
expect(params.config).not.toHaveProperty("provider");
if (method === "environmentAcquireLease") {
return {
providerLeaseId: "sandbox-1",
@@ -499,12 +506,17 @@ describeEmbeddedPostgres("environmentRuntimeService", () => {
};
}
if (method === "environmentReleaseLease") {
expect(params.config).toEqual(fakePluginConfig);
expect(params.config).toEqual({
image: "fake:test",
timeoutMs: 1234,
reuseLease: false,
});
expect(params.config).not.toHaveProperty("driver");
expect(params.config).not.toHaveProperty("executionWorkspaceMode");
expect(params.config).not.toHaveProperty("pluginId");
expect(params.config).not.toHaveProperty("pluginKey");
expect(params.config).not.toHaveProperty("providerMetadata");
expect(params.config).not.toHaveProperty("provider");
expect(params.config).not.toHaveProperty("sandboxProviderPlugin");
return undefined;
}
@@ -543,6 +555,270 @@ describeEmbeddedPostgres("environmentRuntimeService", () => {
expect(workerManager.call).toHaveBeenCalledWith(pluginId, "environmentReleaseLease", expect.anything());
});
it("uses resolved secret-ref config for plugin-backed sandbox execute and release", async () => {
const pluginId = randomUUID();
const { companyId, environment: baseEnvironment, runId } = await seedEnvironment();
const apiSecret = await secretService(db).create(companyId, {
name: `secure-plugin-api-key-${randomUUID()}`,
provider: "local_encrypted",
value: "resolved-provider-key",
});
const providerConfig = {
provider: "secure-plugin",
template: "base",
apiKey: apiSecret.id,
timeoutMs: 1234,
reuseLease: false,
};
const environment = {
...baseEnvironment,
name: "Secure Plugin Sandbox",
driver: "sandbox",
config: providerConfig,
};
await environmentService(db).update(environment.id, {
driver: "sandbox",
name: environment.name,
config: providerConfig,
});
await db.insert(plugins).values({
id: pluginId,
pluginKey: "acme.secure-sandbox-provider",
packageName: "@acme/secure-sandbox-provider",
version: "1.0.0",
apiVersion: 1,
categories: ["automation"],
manifestJson: {
id: "acme.secure-sandbox-provider",
apiVersion: 1,
version: "1.0.0",
displayName: "Secure Sandbox Provider",
description: "Test schema-driven provider",
author: "Paperclip",
categories: ["automation"],
capabilities: ["environment.drivers.register"],
entrypoints: { worker: "dist/worker.js" },
environmentDrivers: [
{
driverKey: "secure-plugin",
kind: "sandbox_provider",
displayName: "Secure Sandbox",
configSchema: {
type: "object",
properties: {
template: { type: "string" },
apiKey: { type: "string", format: "secret-ref" },
timeoutMs: { type: "number" },
reuseLease: { type: "boolean" },
},
},
},
],
},
status: "ready",
installOrder: 1,
updatedAt: new Date(),
} as any);
const workerManager = {
isRunning: vi.fn((id: string) => id === pluginId),
call: vi.fn(async (_pluginId: string, method: string, params: any) => {
expect(params.config.apiKey).toBe("resolved-provider-key");
expect(params.config).not.toHaveProperty("provider");
if (method === "environmentAcquireLease") {
return {
providerLeaseId: "sandbox-1",
metadata: {
provider: "secure-plugin",
template: "base",
apiKey: "resolved-provider-key",
timeoutMs: 1234,
reuseLease: false,
sandboxId: "sandbox-1",
remoteCwd: "/workspace",
},
};
}
if (method === "environmentExecute") {
return {
exitCode: 0,
signal: null,
timedOut: false,
stdout: "ok\n",
stderr: "",
};
}
if (method === "environmentReleaseLease") {
return undefined;
}
throw new Error(`Unexpected plugin method: ${method}`);
}),
} as unknown as PluginWorkerManager;
const runtimeWithPlugin = environmentRuntimeService(db, { pluginWorkerManager: workerManager });
const acquired = await runtimeWithPlugin.acquireRunLease({
companyId,
environment,
issueId: null,
heartbeatRunId: runId,
persistedExecutionWorkspace: null,
});
expect(acquired.lease.metadata).toMatchObject({
provider: "secure-plugin",
template: "base",
apiKey: apiSecret.id,
timeoutMs: 1234,
sandboxId: "sandbox-1",
});
const executed = await runtimeWithPlugin.execute({
environment,
lease: acquired.lease,
command: "printf",
args: ["ok"],
cwd: "/workspace",
env: {},
timeoutMs: 1000,
});
await environmentService(db).update(environment.id, {
driver: "local",
config: {},
});
const released = await runtimeWithPlugin.releaseRunLeases(runId);
expect(executed.stdout).toBe("ok\n");
expect(released).toHaveLength(1);
expect(released[0]?.lease.status).toBe("released");
expect(workerManager.call).toHaveBeenCalledWith(pluginId, "environmentExecute", expect.objectContaining({
config: expect.objectContaining({
apiKey: "resolved-provider-key",
}),
}));
expect(workerManager.call).toHaveBeenCalledWith(pluginId, "environmentReleaseLease", expect.objectContaining({
config: expect.objectContaining({
apiKey: "resolved-provider-key",
}),
}));
});
it("falls back to acquire when plugin-backed sandbox lease resume throws", async () => {
const pluginId = randomUUID();
const { companyId, environment: baseEnvironment, runId } = await seedEnvironment();
const providerConfig = {
provider: "fake-plugin",
image: "fake:test",
timeoutMs: 1234,
reuseLease: true,
};
const environment = {
...baseEnvironment,
name: "Reusable Plugin Sandbox",
driver: "sandbox",
config: providerConfig,
};
await environmentService(db).update(environment.id, {
driver: "sandbox",
name: environment.name,
config: providerConfig,
});
await db.insert(plugins).values({
id: pluginId,
pluginKey: "acme.fake-sandbox-provider",
packageName: "@acme/fake-sandbox-provider",
version: "1.0.0",
apiVersion: 1,
categories: ["automation"],
manifestJson: {
id: "acme.fake-sandbox-provider",
apiVersion: 1,
version: "1.0.0",
displayName: "Fake Sandbox Provider",
description: "Test schema-driven provider",
author: "Paperclip",
categories: ["automation"],
capabilities: ["environment.drivers.register"],
entrypoints: { worker: "dist/worker.js" },
environmentDrivers: [
{
driverKey: "fake-plugin",
kind: "sandbox_provider",
displayName: "Fake Plugin",
configSchema: {
type: "object",
properties: {
image: { type: "string" },
timeoutMs: { type: "number" },
reuseLease: { type: "boolean" },
},
},
},
],
},
status: "ready",
installOrder: 1,
updatedAt: new Date(),
} as any);
await environmentService(db).acquireLease({
companyId,
environmentId: environment.id,
heartbeatRunId: runId,
leasePolicy: "reuse_by_environment",
provider: "fake-plugin",
providerLeaseId: "stale-plugin-lease",
metadata: {
provider: "fake-plugin",
image: "fake:test",
timeoutMs: 1234,
reuseLease: true,
},
});
const workerManager = {
isRunning: vi.fn((id: string) => id === pluginId),
call: vi.fn(async (_pluginId: string, method: string) => {
if (method === "environmentResumeLease") {
throw new Error("stale sandbox");
}
if (method === "environmentAcquireLease") {
return {
providerLeaseId: "fresh-plugin-lease",
metadata: {
provider: "fake-plugin",
image: "fake:test",
timeoutMs: 1234,
reuseLease: true,
remoteCwd: "/workspace",
},
};
}
throw new Error(`Unexpected plugin method: ${method}`);
}),
} as unknown as PluginWorkerManager;
const runtimeWithPlugin = environmentRuntimeService(db, { pluginWorkerManager: workerManager });
const acquired = await runtimeWithPlugin.acquireRunLease({
companyId,
environment,
issueId: null,
heartbeatRunId: runId,
persistedExecutionWorkspace: null,
});
expect(acquired.lease.providerLeaseId).toBe("fresh-plugin-lease");
expect(workerManager.call).toHaveBeenNthCalledWith(1, pluginId, "environmentResumeLease", expect.objectContaining({
driverKey: "fake-plugin",
providerLeaseId: "stale-plugin-lease",
}));
expect(workerManager.call).toHaveBeenNthCalledWith(2, pluginId, "environmentAcquireLease", expect.objectContaining({
driverKey: "fake-plugin",
config: {
image: "fake:test",
timeoutMs: 1234,
reuseLease: true,
},
runId,
}));
});
it("releases a sandbox run lease from metadata after the environment config changes", async () => {
const { companyId, environment, runId } = await seedEnvironment({
driver: "sandbox",


@@ -1,6 +1,8 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { errorHandler } from "../middleware/index.js";
import { executionWorkspaceRoutes } from "../routes/execution-workspaces.js";
const mockExecutionWorkspaceService = vi.hoisted(() => ({
list: vi.fn(),
@@ -15,19 +17,15 @@ const mockWorkspaceOperationService = vi.hoisted(() => ({
createRecorder: vi.fn(),
}));
function registerServiceMocks() {
vi.doMock("../services/index.js", () => ({
executionWorkspaceService: () => mockExecutionWorkspaceService,
logActivity: vi.fn(async () => undefined),
workspaceOperationService: () => mockWorkspaceOperationService,
}));
}
const mockLogActivity = vi.hoisted(() => vi.fn(async () => undefined));
async function createApp() {
const [{ executionWorkspaceRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/execution-workspaces.js")>("../routes/execution-workspaces.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
]);
vi.mock("../services/index.js", () => ({
executionWorkspaceService: () => mockExecutionWorkspaceService,
logActivity: mockLogActivity,
workspaceOperationService: () => mockWorkspaceOperationService,
}));
function createApp() {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
@@ -45,15 +43,9 @@ async function createApp() {
return app;
}
describe("execution workspace routes", () => {
describe.sequential("execution workspace routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/index.js");
vi.doUnmock("../routes/execution-workspaces.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerServiceMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockExecutionWorkspaceService.list.mockResolvedValue([]);
mockExecutionWorkspaceService.listSummaries.mockResolvedValue([
{
@@ -66,7 +58,7 @@ describe("execution workspace routes", () => {
});
it("uses summary mode for lightweight workspace lookups", async () => {
const res = await request(await createApp())
const res = await request(createApp())
.get("/api/companies/company-1/execution-workspaces?summary=true&reuseEligible=true");
expect(res.status).toBe(200);


@@ -1,6 +1,6 @@
import express from "express";
import request from "supertest";
import { describe, expect, it, vi } from "vitest";
import { describe, expect, it } from "vitest";
/**
* Regression test for https://github.com/paperclipai/paperclip/issues/2898
@@ -29,33 +29,28 @@ describe("Express 5 /api/auth wildcard route", () => {
};
}
it("matches a shallow auth sub-path (sign-in/email)", async () => {
const { app } = buildApp();
const res = await request(app).post("/api/auth/sign-in/email");
expect(res.status).toBe(200);
});
it("matches a deep auth sub-path (callback/credentials/sign-in)", async () => {
const { app } = buildApp();
const res = await request(app).get(
"/api/auth/callback/credentials/sign-in"
);
expect(res.status).toBe(200);
});
it("does not match unrelated paths outside /api/auth", async () => {
// Confirm the route is not over-broad — requests to other API paths
// must fall through to 404 and not reach the better-auth handler.
it("matches auth sub-paths without matching unrelated API paths", async () => {
const { app, getCallCount } = buildApp();
const res = await request(app).get("/api/other/endpoint");
expect(res.status).toBe(404);
expect(getCallCount()).toBe(0);
});
it("invokes the handler for every matched sub-path", async () => {
const { app, getCallCount } = buildApp();
await request(app).post("/api/auth/sign-out");
await request(app).get("/api/auth/session");
await expect(request(app).post("/api/auth/sign-in/email")).resolves.toMatchObject({
status: 200,
});
await expect(request(app).get("/api/auth/callback/credentials/sign-in")).resolves.toMatchObject({
status: 200,
});
expect(getCallCount()).toBe(2);
await expect(request(app).get("/api/other/endpoint")).resolves.toMatchObject({
status: 404,
});
expect(getCallCount()).toBe(2);
await expect(request(app).post("/api/auth/sign-out")).resolves.toMatchObject({
status: 200,
});
await expect(request(app).get("/api/auth/session")).resolves.toMatchObject({
status: 200,
});
expect(getCallCount()).toBe(4);
});
});


@@ -1,6 +1,5 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { eq } from "drizzle-orm";
@@ -8,14 +7,12 @@ import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest
import { writePaperclipSkillSyncPreference } from "@paperclipai/adapter-utils/server-utils";
import {
agents,
applyPendingMigrations,
companies,
companySkills,
costEvents,
createDb,
documents,
documentRevisions,
ensurePostgresDatabase,
feedbackExports,
feedbackVotes,
heartbeatRuns,
@@ -25,72 +22,7 @@ import {
issues,
} from "@paperclipai/db";
import { feedbackService } from "../services/feedback.ts";
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
});
});
});
}
async function startTempDatabase() {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-feedback-service-"));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: () => {},
onError: () => {},
});
await instance.initialise();
await instance.start();
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
await applyPendingMigrations(connectionString);
return { connectionString, dataDir, instance };
}
import { startEmbeddedPostgresTestDatabase } from "./helpers/embedded-postgres.ts";
async function closeDbClient(db: ReturnType<typeof createDb> | undefined) {
await db?.$client?.end?.({ timeout: 0 });
@@ -99,17 +31,15 @@ async function closeDbClient(db: ReturnType<typeof createDb> | undefined) {
describe("feedbackService.saveIssueVote", () => {
let db!: ReturnType<typeof createDb>;
let svc!: ReturnType<typeof feedbackService>;
let instance: EmbeddedPostgresInstance | null = null;
let dataDir = "";
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
let tempDirs: string[] = [];
beforeAll(async () => {
const started = await startTempDatabase();
const started = await startEmbeddedPostgresTestDatabase("paperclip-feedback-service-");
db = createDb(started.connectionString);
svc = feedbackService(db);
instance = started.instance;
dataDir = started.dataDir;
}, 20_000);
tempDb = started;
}, 120_000);
afterEach(async () => {
await db.delete(feedbackExports);
@@ -134,10 +64,7 @@ describe("feedbackService.saveIssueVote", () => {
afterAll(async () => {
await closeDbClient(db);
await instance?.stop();
if (dataDir) {
fs.rmSync(dataDir, { recursive: true, force: true });
}
await tempDb?.cleanup();
});
async function seedIssueWithAgentComment() {


@@ -0,0 +1,549 @@
import { randomUUID } from "node:crypto";
import { and, eq, sql } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
import {
agents,
companies,
createDb,
heartbeatRunWatchdogDecisions,
heartbeatRuns,
issueRelations,
issues,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import {
ACTIVE_RUN_OUTPUT_CONTINUE_REARM_MS,
ACTIVE_RUN_OUTPUT_CRITICAL_THRESHOLD_MS,
ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS,
heartbeatService,
} from "../services/heartbeat.ts";
import { recoveryService } from "../services/recovery/service.ts";
import { getRunLogStore } from "../services/run-log-store.ts";
const mockAdapterExecute = vi.hoisted(() =>
vi.fn(async () => ({
exitCode: 0,
signal: null,
timedOut: false,
errorMessage: null,
summary: "Acknowledged stale-run evaluation.",
provider: "test",
model: "test-model",
})),
);
vi.mock("../telemetry.ts", () => ({
getTelemetryClient: () => ({ track: vi.fn() }),
}));
vi.mock("@paperclipai/shared/telemetry", async () => {
const actual = await vi.importActual<typeof import("@paperclipai/shared/telemetry")>(
"@paperclipai/shared/telemetry",
);
return {
...actual,
trackAgentFirstHeartbeat: vi.fn(),
};
});
vi.mock("../adapters/index.ts", async () => {
const actual = await vi.importActual<typeof import("../adapters/index.ts")>("../adapters/index.ts");
return {
...actual,
getServerAdapter: vi.fn(() => ({
supportsLocalAgentJwt: false,
execute: mockAdapterExecute,
})),
};
});
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres active-run output watchdog tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("active-run output watchdog", () => {
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
let db: ReturnType<typeof createDb>;
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-active-run-output-watchdog-");
db = createDb(tempDb.connectionString);
}, 30_000);
afterEach(async () => {
for (let attempt = 0; attempt < 100; attempt += 1) {
const activeRuns = await db
.select({ id: heartbeatRuns.id })
.from(heartbeatRuns)
.where(sql`${heartbeatRuns.status} in ('queued', 'running')`);
if (activeRuns.length === 0) break;
await new Promise((resolve) => setTimeout(resolve, 25));
}
await db.execute(sql.raw(`TRUNCATE TABLE "companies" CASCADE`));
});
afterAll(async () => {
await tempDb?.cleanup();
});
async function seedRunningRun(opts: { now: Date; ageMs: number; withOutput?: boolean; logChunk?: string }) {
const companyId = randomUUID();
const managerId = randomUUID();
const coderId = randomUUID();
const issueId = randomUUID();
const runId = randomUUID();
const issuePrefix = `W${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
const startedAt = new Date(opts.now.getTime() - opts.ageMs);
const lastOutputAt = opts.withOutput ? new Date(opts.now.getTime() - 5 * 60 * 1000) : null;
await db.insert(companies).values({
id: companyId,
name: "Watchdog Co",
issuePrefix,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values([
{
id: managerId,
companyId,
name: "CTO",
role: "cto",
status: "idle",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
},
{
id: coderId,
companyId,
name: "Coder",
role: "engineer",
status: "running",
reportsTo: managerId,
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {},
permissions: {},
},
]);
await db.insert(issues).values({
id: issueId,
companyId,
title: "Long running implementation",
status: "in_progress",
priority: "medium",
assigneeAgentId: coderId,
issueNumber: 1,
identifier: `${issuePrefix}-1`,
updatedAt: startedAt,
createdAt: startedAt,
});
await db.insert(heartbeatRuns).values({
id: runId,
companyId,
agentId: coderId,
status: "running",
invocationSource: "assignment",
triggerDetail: "system",
startedAt,
processStartedAt: startedAt,
lastOutputAt,
lastOutputSeq: opts.withOutput ? 3 : 0,
lastOutputStream: opts.withOutput ? "stdout" : null,
contextSnapshot: { issueId },
stdoutExcerpt: "OPENAI_API_KEY=sk-test-secret-value should not leak",
logBytes: 0,
});
if (opts.logChunk) {
const store = getRunLogStore();
const handle = await store.begin({ companyId, agentId: coderId, runId });
const logBytes = await store.append(handle, {
stream: "stdout",
chunk: opts.logChunk,
ts: startedAt.toISOString(),
});
await db
.update(heartbeatRuns)
.set({
logStore: handle.store,
logRef: handle.logRef,
logBytes,
})
.where(eq(heartbeatRuns.id, runId));
}
await db.update(issues).set({ executionRunId: runId }).where(eq(issues.id, issueId));
return { companyId, managerId, coderId, issueId, runId, issuePrefix };
}
it("creates one medium-priority evaluation issue for a suspicious silent run", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, managerId, runId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const first = await heartbeat.scanSilentActiveRuns({ now, companyId });
const second = await heartbeat.scanSilentActiveRuns({ now, companyId });
expect(first.created).toBe(1);
expect(second.created).toBe(0);
expect(second.existing).toBe(1);
const evaluations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stale_active_run_evaluation")));
expect(evaluations).toHaveLength(1);
expect(["todo", "in_progress"]).toContain(evaluations[0]?.status);
expect(evaluations[0]).toMatchObject({
priority: "medium",
assigneeAgentId: managerId,
originId: runId,
originFingerprint: `stale_active_run:${companyId}:${runId}`,
});
expect(evaluations[0]?.description).toContain("Decision Checklist");
expect(evaluations[0]?.description).not.toContain("sk-test-secret-value");
});
it("redacts sensitive values from actual run-log evidence", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const leakedJwt = "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjM0NTY3ODkwIn0.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c";
const leakedGithubToken = "ghp_1234567890abcdefghijklmnopqrstuvwxyz";
const { companyId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
logChunk: [
"Authorization: Bearer live-bearer-token-value",
`POST payload {"apiKey":"json-secret-value","token":"${leakedJwt}"}`,
`GITHUB_TOKEN=${leakedGithubToken}`,
].join("\n"),
});
const heartbeat = heartbeatService(db);
await heartbeat.scanSilentActiveRuns({ now, companyId });
const [evaluation] = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stale_active_run_evaluation")));
expect(evaluation?.description).toContain("***REDACTED***");
expect(evaluation?.description).not.toContain("live-bearer-token-value");
expect(evaluation?.description).not.toContain("json-secret-value");
expect(evaluation?.description).not.toContain(leakedJwt);
expect(evaluation?.description).not.toContain(leakedGithubToken);
});
it("raises critical stale-run evaluations and blocks the source issue", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, issueId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_CRITICAL_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const result = await heartbeat.scanSilentActiveRuns({ now, companyId });
expect(result.created).toBe(1);
const [evaluation] = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stale_active_run_evaluation")));
expect(evaluation?.priority).toBe("high");
const [blocker] = await db
.select()
.from(issueRelations)
.where(and(eq(issueRelations.companyId, companyId), eq(issueRelations.relatedIssueId, issueId)));
expect(blocker?.issueId).toBe(evaluation?.id);
const [source] = await db.select().from(issues).where(eq(issues.id, issueId));
expect(source?.status).toBe("blocked");
});
it("skips snoozed runs and healthy noisy runs", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const stale = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_CRITICAL_THRESHOLD_MS + 60_000,
});
const noisy = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_CRITICAL_THRESHOLD_MS + 60_000,
withOutput: true,
});
await db.insert(heartbeatRunWatchdogDecisions).values({
companyId: stale.companyId,
runId: stale.runId,
decision: "snooze",
snoozedUntil: new Date(now.getTime() + 60 * 60 * 1000),
reason: "Intentional quiet run",
});
const heartbeat = heartbeatService(db);
const staleResult = await heartbeat.scanSilentActiveRuns({ now, companyId: stale.companyId });
const noisyResult = await heartbeat.scanSilentActiveRuns({ now, companyId: noisy.companyId });
expect(staleResult).toMatchObject({ created: 0, snoozed: 1 });
expect(noisyResult).toMatchObject({ scanned: 0, created: 0 });
});
it("records watchdog decisions through recovery owner authorization", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, managerId, runId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const recovery = recoveryService(db, { enqueueWakeup: vi.fn() });
const scan = await heartbeat.scanSilentActiveRuns({ now, companyId });
const evaluationIssueId = scan.evaluationIssueIds[0];
expect(evaluationIssueId).toBeTruthy();
await expect(
recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: randomUUID() },
decision: "continue",
evaluationIssueId,
reason: "not my recovery issue",
}),
).rejects.toMatchObject({ status: 403 });
const snoozedUntil = new Date(now.getTime() + 60 * 60 * 1000);
const decision = await recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId },
decision: "snooze",
evaluationIssueId,
reason: "Long compile with no output",
snoozedUntil,
});
expect(decision).toMatchObject({
runId,
evaluationIssueId,
decision: "snooze",
createdByAgentId: managerId,
});
await expect(recovery.buildRunOutputSilence({
id: runId,
companyId,
status: "running",
lastOutputAt: null,
lastOutputSeq: 0,
lastOutputStream: null,
processStartedAt: new Date(now.getTime() - ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS - 60_000),
startedAt: new Date(now.getTime() - ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS - 60_000),
createdAt: new Date(now.getTime() - ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS - 60_000),
}, now)).resolves.toMatchObject({
level: "snoozed",
snoozedUntil,
evaluationIssueId,
});
});
it("re-arms continue decisions after the default quiet window", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, managerId, runId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const recovery = recoveryService(db, { enqueueWakeup: vi.fn() });
const scan = await heartbeat.scanSilentActiveRuns({ now, companyId });
const evaluationIssueId = scan.evaluationIssueIds[0];
expect(evaluationIssueId).toBeTruthy();
const decision = await recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId },
decision: "continue",
evaluationIssueId,
reason: "Current evidence is acceptable; keep watching.",
now,
});
const rearmAt = new Date(now.getTime() + ACTIVE_RUN_OUTPUT_CONTINUE_REARM_MS);
expect(decision).toMatchObject({
runId,
evaluationIssueId,
decision: "continue",
createdByAgentId: managerId,
});
expect(decision.snoozedUntil?.toISOString()).toBe(rearmAt.toISOString());
await db.update(issues).set({ status: "done" }).where(eq(issues.id, evaluationIssueId));
const beforeRearm = await heartbeat.scanSilentActiveRuns({
now: new Date(rearmAt.getTime() - 60_000),
companyId,
});
expect(beforeRearm).toMatchObject({ created: 0, snoozed: 1 });
const afterRearm = await heartbeat.scanSilentActiveRuns({
now: new Date(rearmAt.getTime() + 60_000),
companyId,
});
expect(afterRearm.created).toBe(1);
expect(afterRearm.evaluationIssueIds[0]).not.toBe(evaluationIssueId);
const evaluations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stale_active_run_evaluation")));
expect(evaluations.filter((issue) => !["done", "cancelled"].includes(issue.status))).toHaveLength(1);
});
it("rejects agent watchdog decisions using issues not bound to the target run", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, managerId, coderId, runId, issuePrefix } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const recovery = recoveryService(db, { enqueueWakeup: vi.fn() });
const scan = await heartbeat.scanSilentActiveRuns({ now, companyId });
const evaluationIssueId = scan.evaluationIssueIds[0];
expect(evaluationIssueId).toBeTruthy();
const unrelatedIssueId = randomUUID();
await db.insert(issues).values({
id: unrelatedIssueId,
companyId,
title: "Assigned but unrelated",
status: "todo",
priority: "medium",
assigneeAgentId: managerId,
issueNumber: 20,
identifier: `${issuePrefix}-20`,
});
const otherRunId = randomUUID();
const otherEvaluationIssueId = randomUUID();
await db.insert(heartbeatRuns).values({
id: otherRunId,
companyId,
agentId: coderId,
status: "running",
invocationSource: "assignment",
triggerDetail: "system",
startedAt: new Date(now.getTime() - ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS - 120_000),
processStartedAt: new Date(now.getTime() - ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS - 120_000),
lastOutputAt: null,
lastOutputSeq: 0,
lastOutputStream: null,
contextSnapshot: {},
logBytes: 0,
});
await db.insert(issues).values({
id: otherEvaluationIssueId,
companyId,
title: "Other run evaluation",
status: "todo",
priority: "medium",
assigneeAgentId: managerId,
issueNumber: 21,
identifier: `${issuePrefix}-21`,
originKind: "stale_active_run_evaluation",
originId: otherRunId,
originFingerprint: `stale_active_run:${companyId}:${otherRunId}`,
});
const attempts = [
{ decision: "continue" as const, evaluationIssueId: unrelatedIssueId },
{ decision: "dismissed_false_positive" as const, evaluationIssueId: unrelatedIssueId },
{
decision: "snooze" as const,
evaluationIssueId: unrelatedIssueId,
snoozedUntil: new Date(now.getTime() + 60 * 60 * 1000),
},
{ decision: "continue" as const, evaluationIssueId: otherEvaluationIssueId },
];
for (const attempt of attempts) {
await expect(
recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId },
reason: "malicious or stale binding",
...attempt,
}),
).rejects.toMatchObject({ status: 403 });
}
await db.update(issues).set({ status: "done" }).where(eq(issues.id, evaluationIssueId));
await expect(
recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId },
decision: "continue",
evaluationIssueId,
reason: "closed evaluation should not authorize",
}),
).rejects.toMatchObject({ status: 403 });
});
it("validates createdByRunId before storing watchdog decisions", async () => {
const now = new Date("2026-04-22T20:00:00.000Z");
const { companyId, managerId, runId } = await seedRunningRun({
now,
ageMs: ACTIVE_RUN_OUTPUT_SUSPICION_THRESHOLD_MS + 60_000,
});
const heartbeat = heartbeatService(db);
const recovery = recoveryService(db, { enqueueWakeup: vi.fn() });
const scan = await heartbeat.scanSilentActiveRuns({ now, companyId });
const evaluationIssueId = scan.evaluationIssueIds[0];
expect(evaluationIssueId).toBeTruthy();
await expect(
recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId },
decision: "continue",
evaluationIssueId,
reason: "client supplied another agent run",
createdByRunId: runId,
}),
).rejects.toMatchObject({ status: 403 });
const managerRunId = randomUUID();
await db.insert(heartbeatRuns).values({
id: managerRunId,
companyId,
agentId: managerId,
status: "running",
invocationSource: "assignment",
triggerDetail: "system",
startedAt: now,
processStartedAt: now,
lastOutputAt: now,
lastOutputSeq: 1,
lastOutputStream: "stdout",
contextSnapshot: {},
logBytes: 0,
});
const decision = await recovery.recordWatchdogDecision({
runId,
actor: { type: "agent", agentId: managerId, runId: managerRunId },
decision: "continue",
evaluationIssueId,
reason: "valid current actor run",
createdByRunId: randomUUID(),
});
expect(decision.createdByRunId).toBe(managerRunId);
});
});


@@ -1,8 +1,4 @@
import { randomUUID } from "node:crypto";
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { createServer } from "node:http";
import { and, asc, eq } from "drizzle-orm";
import { WebSocketServer } from "ws";
@@ -10,81 +6,14 @@ import { afterAll, beforeAll, describe, expect, it } from "vitest";
import {
agents,
agentWakeupRequests,
applyPendingMigrations,
companies,
createDb,
ensurePostgresDatabase,
heartbeatRuns,
issueComments,
issues,
} from "@paperclipai/db";
import { heartbeatService } from "../services/heartbeat.ts";
type EmbeddedPostgresInstance = {
initialise(): Promise<void>;
start(): Promise<void>;
stop(): Promise<void>;
};
type EmbeddedPostgresCtor = new (opts: {
databaseDir: string;
user: string;
password: string;
port: number;
persistent: boolean;
initdbFlags?: string[];
onLog?: (message: unknown) => void;
onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;
async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
const mod = await import("embedded-postgres");
return mod.default as EmbeddedPostgresCtor;
}
async function getAvailablePort(): Promise<number> {
return await new Promise((resolve, reject) => {
const server = net.createServer();
server.unref();
server.on("error", reject);
server.listen(0, "127.0.0.1", () => {
const address = server.address();
if (!address || typeof address === "string") {
server.close(() => reject(new Error("Failed to allocate test port")));
return;
}
const { port } = address;
server.close((error) => {
if (error) reject(error);
else resolve(port);
});
});
});
}
async function startTempDatabase() {
const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-heartbeat-comment-wake-"));
const port = await getAvailablePort();
const EmbeddedPostgres = await getEmbeddedPostgresCtor();
const instance = new EmbeddedPostgres({
databaseDir: dataDir,
user: "paperclip",
password: "paperclip",
port,
persistent: true,
initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
onLog: () => {},
onError: () => {},
});
await instance.initialise();
await instance.start();
const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
await ensurePostgresDatabase(adminConnectionString, "paperclip");
const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
await applyPendingMigrations(connectionString);
return { connectionString, instance, dataDir };
}
import { startEmbeddedPostgresTestDatabase } from "./helpers/embedded-postgres.ts";
async function waitFor(condition: () => boolean | Promise<boolean>, timeoutMs = 10_000, intervalMs = 50) {
const startedAt = Date.now();
@@ -218,22 +147,17 @@ async function createControlledGatewayServer() {
describe("heartbeat comment wake batching", () => {
let db!: ReturnType<typeof createDb>;
let instance: EmbeddedPostgresInstance | null = null;
let dataDir = "";
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
const started = await startTempDatabase();
const started = await startEmbeddedPostgresTestDatabase("paperclip-heartbeat-comment-wake-");
db = createDb(started.connectionString);
instance = started.instance;
dataDir = started.dataDir;
}, 45_000);
tempDb = started;
}, 120_000);
afterAll(async () => {
await closeDbClient(db);
await instance?.stop();
if (dataDir) {
fs.rmSync(dataDir, { recursive: true, force: true });
}
await tempDb?.cleanup();
});
it("defers approval-approved wakes for a running issue so the assignee resumes after the run", async () => {
@@ -862,6 +786,206 @@ describe("heartbeat comment wake batching", () => {
}
}, 120_000);
it("does not reopen a finished issue when the deferred comment wake came from another agent", async () => {
const gateway = await createControlledGatewayServer();
const companyId = randomUUID();
const assigneeAgentId = randomUUID();
const mentionedAgentId = randomUUID();
const issueId = randomUUID();
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
const heartbeat = heartbeatService(db);
try {
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values([
{
id: assigneeAgentId,
companyId,
name: "Primary Agent",
role: "engineer",
status: "idle",
adapterType: "openclaw_gateway",
adapterConfig: {
url: gateway.url,
headers: {
"x-openclaw-token": "gateway-token",
},
payloadTemplate: {
message: "wake now",
},
waitTimeoutMs: 2_000,
},
runtimeConfig: {},
permissions: {},
},
{
id: mentionedAgentId,
companyId,
name: "Mentioned Agent",
role: "engineer",
status: "idle",
adapterType: "openclaw_gateway",
adapterConfig: {
url: gateway.url,
headers: {
"x-openclaw-token": "gateway-token",
},
payloadTemplate: {
message: "wake now",
},
waitTimeoutMs: 2_000,
},
runtimeConfig: {},
permissions: {},
},
]);
await db.insert(issues).values({
id: issueId,
companyId,
title: "Do not reopen from agent mention",
status: "todo",
priority: "medium",
assigneeAgentId,
issueNumber: 1,
identifier: `${issuePrefix}-1`,
});
const firstRun = await heartbeat.wakeup(assigneeAgentId, {
source: "assignment",
triggerDetail: "system",
reason: "issue_assigned",
payload: { issueId },
contextSnapshot: {
issueId,
taskId: issueId,
wakeReason: "issue_assigned",
},
requestedByActorType: "system",
requestedByActorId: null,
});
expect(firstRun).not.toBeNull();
await waitFor(async () => {
const run = await db
.select({ status: heartbeatRuns.status })
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, firstRun!.id))
.then((rows) => rows[0] ?? null);
return run?.status === "running";
});
const comment = await db
.insert(issueComments)
.values({
companyId,
issueId,
authorAgentId: assigneeAgentId,
createdByRunId: firstRun?.id ?? null,
body: "@Mentioned Agent please review after I finish",
})
.returning()
.then((rows) => rows[0]);
const deferredRun = await heartbeat.wakeup(mentionedAgentId, {
source: "automation",
triggerDetail: "system",
reason: "issue_comment_mentioned",
payload: { issueId, commentId: comment.id },
contextSnapshot: {
issueId,
taskId: issueId,
commentId: comment.id,
wakeCommentId: comment.id,
wakeReason: "issue_comment_mentioned",
source: "comment.mention",
},
requestedByActorType: "agent",
requestedByActorId: assigneeAgentId,
});
expect(deferredRun).toBeNull();
await waitFor(async () => {
const deferred = await db
.select()
.from(agentWakeupRequests)
.where(
and(
eq(agentWakeupRequests.companyId, companyId),
eq(agentWakeupRequests.agentId, mentionedAgentId),
eq(agentWakeupRequests.status, "deferred_issue_execution"),
),
)
.then((rows) => rows[0] ?? null);
return Boolean(deferred);
});
await db
.update(issues)
.set({
status: "done",
completedAt: new Date(),
executionRunId: null,
executionAgentNameKey: null,
executionLockedAt: null,
updatedAt: new Date(),
})
.where(eq(issues.id, issueId));
gateway.releaseFirstWait();
await waitFor(() => gateway.getAgentPayloads().length === 2, 90_000);
await waitFor(async () => {
const runs = await db
.select()
.from(heartbeatRuns)
.where(eq(heartbeatRuns.companyId, companyId));
return runs.length === 2 && runs.every((run) => run.status === "succeeded");
}, 90_000);
const issueAfterPromotion = await db
.select({
status: issues.status,
completedAt: issues.completedAt,
})
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issueAfterPromotion).toMatchObject({
status: "done",
});
expect(issueAfterPromotion?.completedAt).not.toBeNull();
const secondPayload = gateway.getAgentPayloads()[1] ?? {};
expect(secondPayload.paperclip).toMatchObject({
wake: {
reason: "issue_comment_mentioned",
commentIds: [comment.id],
latestCommentId: comment.id,
issue: {
id: issueId,
identifier: `${issuePrefix}-1`,
title: "Do not reopen from agent mention",
status: "done",
priority: "medium",
},
},
});
expect(String(secondPayload.message ?? "")).toContain("please review after I finish");
} finally {
gateway.releaseFirstWait();
await gateway.close();
}
}, 120_000);
it("queues exactly one follow-up run when an issue-bound run exits without a comment", async () => {
const gateway = await createControlledGatewayServer();
const companyId = randomUUID();
@@ -1172,6 +1296,20 @@ describe("heartbeat comment wake batching", () => {
wakeReason: "issue_comment_mentioned",
});
const issueAfterMention = await db
.select({
assigneeAgentId: issues.assigneeAgentId,
executionRunId: issues.executionRunId,
executionAgentNameKey: issues.executionAgentNameKey,
})
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issueAfterMention?.assigneeAgentId).toBe(primaryAgentId);
expect(issueAfterMention?.executionRunId).not.toBe(mentionedRuns[0]?.id);
expect(issueAfterMention?.executionAgentNameKey).not.toBe("mentioned agent");
const primaryRuns = await db
.select()
.from(heartbeatRuns)
@@ -1198,6 +1336,155 @@ describe("heartbeat comment wake batching", () => {
await gateway.close();
}
}, 120_000);
it("does not mark a direct mentioned-agent run as the issue execution owner", async () => {
const gateway = await createControlledGatewayServer();
const companyId = randomUUID();
const primaryAgentId = randomUUID();
const mentionedAgentId = randomUUID();
const issueId = randomUUID();
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
const heartbeat = heartbeatService(db);
try {
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values([
{
id: primaryAgentId,
companyId,
name: "Primary Agent",
role: "engineer",
status: "idle",
adapterType: "openclaw_gateway",
adapterConfig: {
url: gateway.url,
headers: {
"x-openclaw-token": "gateway-token",
},
payloadTemplate: {
message: "wake now",
},
waitTimeoutMs: 2_000,
},
runtimeConfig: {},
permissions: {},
},
{
id: mentionedAgentId,
companyId,
name: "Mentioned Agent",
role: "engineer",
status: "idle",
adapterType: "openclaw_gateway",
adapterConfig: {
url: gateway.url,
headers: {
"x-openclaw-token": "gateway-token",
},
payloadTemplate: {
message: "wake now",
},
waitTimeoutMs: 2_000,
},
runtimeConfig: {},
permissions: {},
},
]);
await db.insert(issues).values({
id: issueId,
companyId,
title: "Mention should not steal execution ownership",
status: "todo",
priority: "medium",
assigneeAgentId: primaryAgentId,
issueNumber: 1,
identifier: `${issuePrefix}-1`,
});
const mentionComment = await db
.insert(issueComments)
.values({
companyId,
issueId,
authorUserId: "user-1",
body: "@Mentioned Agent please inspect this.",
})
.returning()
.then((rows) => rows[0]);
const mentionRun = await heartbeat.wakeup(mentionedAgentId, {
source: "automation",
triggerDetail: "system",
reason: "issue_comment_mentioned",
payload: { issueId, commentId: mentionComment.id },
contextSnapshot: {
issueId,
taskId: issueId,
commentId: mentionComment.id,
wakeCommentId: mentionComment.id,
wakeReason: "issue_comment_mentioned",
source: "comment.mention",
},
requestedByActorType: "user",
requestedByActorId: "user-1",
});
expect(mentionRun).not.toBeNull();
await waitFor(() => gateway.getAgentPayloads().length === 1);
const issueDuringMention = await db
.select({
assigneeAgentId: issues.assigneeAgentId,
executionRunId: issues.executionRunId,
executionAgentNameKey: issues.executionAgentNameKey,
})
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issueDuringMention).toMatchObject({
assigneeAgentId: primaryAgentId,
executionRunId: null,
executionAgentNameKey: null,
});
gateway.releaseFirstWait();
await waitFor(async () => {
const run = await db
.select({ status: heartbeatRuns.status })
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, mentionRun!.id))
.then((rows) => rows[0] ?? null);
return run?.status === "succeeded";
}, 90_000);
const issueAfterMention = await db
.select({
assigneeAgentId: issues.assigneeAgentId,
executionRunId: issues.executionRunId,
executionAgentNameKey: issues.executionAgentNameKey,
})
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issueAfterMention).toMatchObject({
assigneeAgentId: primaryAgentId,
executionRunId: null,
executionAgentNameKey: null,
});
} finally {
gateway.releaseFirstWait();
await gateway.close();
}
}, 120_000);
it("treats the automatic run summary as fallback-only when the run already posted a comment", async () => {
const gateway = await createControlledGatewayServer();
const companyId = randomUUID();


@@ -347,6 +347,198 @@ describeEmbeddedPostgres("heartbeat dependency-aware queued run selection", () =
expect(blockedWakeRequestCount).toBeGreaterThanOrEqual(2);
});
it("cancels stale queued runs when issue blockers are still unresolved", async () => {
const companyId = randomUUID();
const agentId = randomUUID();
const blockerId = randomUUID();
const blockedIssueId = randomUUID();
const readyIssueId = randomUUID();
const blockedWakeupRequestId = randomUUID();
const readyWakeupRequestId = randomUUID();
const blockedRunId = randomUUID();
const readyRunId = randomUUID();
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values({
id: agentId,
companyId,
name: "QAChecker",
role: "qa",
status: "active",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 2,
},
},
permissions: {},
});
await db.insert(issues).values([
{
id: blockerId,
companyId,
title: "Security review",
status: "blocked",
priority: "high",
},
{
id: blockedIssueId,
companyId,
title: "QA validation",
status: "blocked",
priority: "medium",
assigneeAgentId: agentId,
},
{
id: readyIssueId,
companyId,
title: "Ready QA task",
status: "todo",
priority: "low",
assigneeAgentId: agentId,
},
]);
await db.insert(issueRelations).values({
companyId,
issueId: blockerId,
relatedIssueId: blockedIssueId,
type: "blocks",
});
await db.insert(agentWakeupRequests).values([
{
id: blockedWakeupRequestId,
companyId,
agentId,
source: "automation",
triggerDetail: "system",
reason: "transient_failure_retry",
payload: { issueId: blockedIssueId },
status: "queued",
},
{
id: readyWakeupRequestId,
companyId,
agentId,
source: "assignment",
triggerDetail: "system",
reason: "issue_assigned",
payload: { issueId: readyIssueId },
status: "queued",
},
]);
await db.insert(heartbeatRuns).values([
{
id: blockedRunId,
companyId,
agentId,
invocationSource: "automation",
triggerDetail: "system",
status: "queued",
wakeupRequestId: blockedWakeupRequestId,
contextSnapshot: {
issueId: blockedIssueId,
wakeReason: "transient_failure_retry",
},
},
{
id: readyRunId,
companyId,
agentId,
invocationSource: "assignment",
triggerDetail: "system",
status: "queued",
wakeupRequestId: readyWakeupRequestId,
contextSnapshot: {
issueId: readyIssueId,
wakeReason: "issue_assigned",
},
},
]);
await db
.update(agentWakeupRequests)
.set({ runId: blockedRunId })
.where(eq(agentWakeupRequests.id, blockedWakeupRequestId));
await db
.update(agentWakeupRequests)
.set({ runId: readyRunId })
.where(eq(agentWakeupRequests.id, readyWakeupRequestId));
await db
.update(issues)
.set({
executionRunId: blockedRunId,
executionAgentNameKey: "qa-checker",
executionLockedAt: new Date(),
})
.where(eq(issues.id, blockedIssueId));
await heartbeat.resumeQueuedRuns();
await waitForCondition(async () => {
const run = await db
.select({ status: heartbeatRuns.status })
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, readyRunId))
.then((rows) => rows[0] ?? null);
return run?.status === "succeeded";
});
const [blockedRun, blockedWakeup, blockedIssue, readyRun] = await Promise.all([
db
.select({
status: heartbeatRuns.status,
errorCode: heartbeatRuns.errorCode,
finishedAt: heartbeatRuns.finishedAt,
resultJson: heartbeatRuns.resultJson,
})
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, blockedRunId))
.then((rows) => rows[0] ?? null),
db
.select({
status: agentWakeupRequests.status,
error: agentWakeupRequests.error,
})
.from(agentWakeupRequests)
.where(eq(agentWakeupRequests.id, blockedWakeupRequestId))
.then((rows) => rows[0] ?? null),
db
.select({
executionRunId: issues.executionRunId,
executionAgentNameKey: issues.executionAgentNameKey,
executionLockedAt: issues.executionLockedAt,
})
.from(issues)
.where(eq(issues.id, blockedIssueId))
.then((rows) => rows[0] ?? null),
db
.select({ status: heartbeatRuns.status })
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, readyRunId))
.then((rows) => rows[0] ?? null),
]);
expect(blockedRun?.status).toBe("cancelled");
expect(blockedRun?.errorCode).toBe("issue_dependencies_blocked");
expect(blockedRun?.finishedAt).toBeTruthy();
expect(blockedRun?.resultJson).toMatchObject({ stopReason: "issue_dependencies_blocked" });
expect(blockedWakeup?.status).toBe("skipped");
expect(blockedWakeup?.error).toContain("dependencies are still blocked");
expect(blockedIssue).toMatchObject({
executionRunId: null,
executionAgentNameKey: null,
executionLockedAt: null,
});
expect(readyRun?.status).toBe("succeeded");
expect(mockAdapterExecute).toHaveBeenCalledTimes(1);
});
it("suppresses normal wakeups while allowing comment interaction wakes under a pause hold", async () => {
const companyId = randomUUID();
const agentId = randomUUID();
@@ -425,12 +617,39 @@ describeEmbeddedPostgres("heartbeat dependency-aware queued run selection", () =
.then((rows) => rows[0] ?? null);
expect(skippedWake).toMatchObject({ status: "skipped", reason: "issue_tree_hold_active" });
const childCommentId = randomUUID();
await db.insert(issueComments).values({
id: childCommentId,
companyId,
issueId: childIssueId,
authorUserId: "board-user",
body: "Please respond while this hold is active.",
});
const forgedChildCommentWake = await heartbeat.wakeup(agentId, {
source: "on_demand",
triggerDetail: "manual",
reason: "issue_commented",
payload: { issueId: childIssueId, commentId: childCommentId },
requestedByActorType: "agent",
requestedByActorId: agentId,
});
expect(forgedChildCommentWake).toBeNull();
const childCommentWake = await heartbeat.wakeup(agentId, {
source: "automation",
triggerDetail: "system",
reason: "issue_commented",
payload: { issueId: childIssueId, commentId: childCommentId },
requestedByActorType: "user",
requestedByActorId: "board-user",
contextSnapshot: {
issueId: childIssueId,
commentId: childCommentId,
wakeCommentId: childCommentId,
wakeReason: "issue_commented",
source: "issue.comment",
},
});
expect(childCommentWake).not.toBeNull();
@@ -494,12 +713,29 @@ describeEmbeddedPostgres("heartbeat dependency-aware queued run selection", () =
releasePolicy: { strategy: "manual", note: "full_pause" },
});
const rootCommentId = randomUUID();
await db.insert(issueComments).values({
id: rootCommentId,
companyId,
issueId: rootIssueId,
authorUserId: "board-user",
body: "Please respond while this hold is active.",
});
const rootCommentWake = await heartbeat.wakeup(agentId, {
source: "automation",
triggerDetail: "system",
reason: "issue_commented",
payload: { issueId: rootIssueId, commentId: rootCommentId },
requestedByActorType: "user",
requestedByActorId: "board-user",
contextSnapshot: {
issueId: rootIssueId,
commentId: rootCommentId,
wakeCommentId: rootCommentId,
wakeReason: "issue_commented",
source: "issue.comment",
},
});
expect(rootCommentWake).not.toBeNull();


@@ -4,13 +4,16 @@ import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest
import {
activityLog,
agents,
agentWakeupRequests,
companies,
createDb,
executionWorkspaces,
heartbeatRuns,
issueComments,
issueRelations,
issueTreeHolds,
issues,
projects,
projectWorkspaces,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
@@ -55,6 +58,7 @@ vi.mock("../adapters/index.ts", async () => {
});
import { heartbeatService } from "../services/heartbeat.ts";
import { instanceSettingsService } from "../services/instance-settings.ts";
import { runningProcesses } from "../adapters/index.ts";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
@@ -94,13 +98,23 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
}
await new Promise((resolve) => setTimeout(resolve, 50));
await db.execute(sql.raw(`TRUNCATE TABLE "companies" CASCADE`));
await instanceSettingsService(db).updateExperimental({
enableIssueGraphLivenessAutoRecovery: false,
enableIsolatedWorkspaces: false,
});
});
afterAll(async () => {
await tempDb?.cleanup();
});
async function enableAutoRecovery() {
await instanceSettingsService(db).updateExperimental({
enableIssueGraphLivenessAutoRecovery: true,
});
}
async function seedBlockedChain(opts: { stale?: boolean } = {}) {
const companyId = randomUUID();
const managerId = randomUUID();
const coderId = randomUUID();
@@ -124,7 +138,7 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
status: "idle",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: { heartbeat: { wakeOnDemand: false } },
permissions: {},
},
{
@@ -136,11 +150,14 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
reportsTo: managerId,
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: { heartbeat: { wakeOnDemand: false } },
permissions: {},
},
]);
const issueTimestamp = opts.stale === false
? new Date()
: new Date(Date.now() - 25 * 60 * 60 * 1000);
await db.insert(issues).values([
{
id: blockedIssueId,
@@ -151,6 +168,8 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
assigneeAgentId: coderId,
issueNumber: 1,
identifier: `${issuePrefix}-1`,
createdAt: issueTimestamp,
updatedAt: issueTimestamp,
},
{
id: blockerIssueId,
@@ -160,6 +179,8 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
priority: "medium",
issueNumber: 2,
identifier: `${issuePrefix}-2`,
createdAt: issueTimestamp,
updatedAt: issueTimestamp,
},
]);
@@ -173,7 +194,91 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
return { companyId, managerId, blockedIssueId, blockerIssueId };
}
it("keeps liveness findings advisory when auto recovery is disabled", async () => {
const { companyId } = await seedBlockedChain();
const heartbeat = heartbeatService(db);
const result = await heartbeat.reconcileIssueGraphLiveness();
expect(result.findings).toBe(1);
expect(result.autoRecoveryEnabled).toBe(false);
expect(result.escalationsCreated).toBe(0);
expect(result.skippedAutoRecoveryDisabled).toBe(1);
const escalations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "harness_liveness_escalation")));
expect(escalations).toHaveLength(0);
});
it("does not create recovery issues until the dependency path is stale for 24 hours", async () => {
await enableAutoRecovery();
const { companyId } = await seedBlockedChain({ stale: false });
const heartbeat = heartbeatService(db);
const result = await heartbeat.reconcileIssueGraphLiveness();
expect(result.findings).toBe(1);
expect(result.escalationsCreated).toBe(0);
expect(result.skippedAutoRecoveryTooYoung).toBe(1);
const escalations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "harness_liveness_escalation")));
expect(escalations).toHaveLength(0);
});
it("suppresses liveness escalation when the source issue is under an active pause hold", async () => {
await enableAutoRecovery();
const { companyId, blockedIssueId } = await seedBlockedChain();
await db.insert(issueTreeHolds).values({
companyId,
rootIssueId: blockedIssueId,
mode: "pause",
status: "active",
reason: "pause liveness recovery subtree",
releasePolicy: { strategy: "manual" },
});
const result = await heartbeatService(db).reconcileIssueGraphLiveness();
expect(result.findings).toBe(1);
expect(result.escalationsCreated).toBe(0);
expect(result.existingEscalations).toBe(0);
expect(result.skipped).toBe(1);
const escalations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "harness_liveness_escalation")));
expect(escalations).toHaveLength(0);
});
it("treats an active executionRunId on the leaf blocker as a live execution path", async () => {
await enableAutoRecovery();
const { companyId, managerId, blockedIssueId, blockerIssueId } = await seedBlockedChain();
const runId = randomUUID();
await db.insert(heartbeatRuns).values({
id: runId,
companyId,
agentId: managerId,
status: "running",
contextSnapshot: { issueId: blockedIssueId },
});
await db.update(issues).set({ executionRunId: runId }).where(eq(issues.id, blockerIssueId));
const heartbeat = heartbeatService(db);
const result = await heartbeat.reconcileIssueGraphLiveness();
expect(result.findings).toBe(0);
expect(result.escalationsCreated).toBe(0);
});
it("creates one manager escalation, preserves blockers, and records owner selection", async () => {
await enableAutoRecovery();
const { companyId, managerId, blockedIssueId, blockerIssueId } = await seedBlockedChain();
const heartbeat = heartbeatService(db);
@@ -182,7 +287,6 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
expect(first.escalationsCreated).toBe(1);
expect(second.escalationsCreated).toBe(0);
expect(second.existingEscalations).toBe(1);
const escalations = await db
.select()
@@ -195,9 +299,15 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
);
expect(escalations).toHaveLength(1);
expect(escalations[0]).toMatchObject({
parentId: blockerIssueId,
assigneeAgentId: managerId,
status: expect.stringMatching(/^(todo|in_progress|done)$/),
originFingerprint: [
"harness_liveness_leaf",
companyId,
"blocked_by_unassigned_issue",
blockerIssueId,
].join(":"),
});
const blockers = await db
@@ -213,15 +323,217 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
expect(comments[0]?.body).toContain("harness-level liveness incident");
expect(comments[0]?.body).toContain(escalations[0]?.identifier ?? escalations[0]!.id);
const wakes = await db.select().from(agentWakeupRequests).where(eq(agentWakeupRequests.agentId, managerId));
expect(wakes.some((wake) => wake.reason === "issue_assigned")).toBe(true);
const events = await db.select().from(activityLog).where(eq(activityLog.companyId, companyId));
expect(events.some((event) => event.action === "issue.harness_liveness_escalation_created")).toBe(true);
const createdEvent = events.find((event) => event.action === "issue.harness_liveness_escalation_created");
expect(createdEvent).toBeTruthy();
expect(createdEvent?.details).toMatchObject({
recoveryIssueId: blockerIssueId,
ownerSelection: {
selectedAgentId: managerId,
selectedReason: "root_agent",
selectedSourceIssueId: blockerIssueId,
},
workspaceSelection: {
reuseRecoveryExecutionWorkspace: false,
inheritedExecutionWorkspaceFromIssueId: null,
projectWorkspaceSourceIssueId: blockerIssueId,
},
});
expect(events.some((event) => event.action === "issue.blockers.updated")).toBe(true);
});
it("parents recovery under the leaf blocker without inheriting dependent or blocker execution state for manager-owned recovery", async () => {
await enableAutoRecovery();
await instanceSettingsService(db).updateExperimental({ enableIsolatedWorkspaces: true });
const companyId = randomUUID();
const managerId = randomUUID();
const blockedIssueId = randomUUID();
const blockerIssueId = randomUUID();
const dependentProjectId = randomUUID();
const blockerProjectId = randomUUID();
const dependentProjectWorkspaceId = randomUUID();
const blockerProjectWorkspaceId = randomUUID();
const dependentExecutionWorkspaceId = randomUUID();
const blockerExecutionWorkspaceId = randomUUID();
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
const issueTimestamp = new Date(Date.now() - 25 * 60 * 60 * 1000);
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values({
id: managerId,
companyId,
name: "Root Operator",
role: "operator",
status: "idle",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: { heartbeat: { wakeOnDemand: false } },
permissions: {},
});
await db.insert(projects).values([
{
id: dependentProjectId,
companyId,
name: "Dependent workspace project",
status: "in_progress",
},
{
id: blockerProjectId,
companyId,
name: "Blocker workspace project",
status: "in_progress",
},
]);
await db.insert(projectWorkspaces).values([
{
id: dependentProjectWorkspaceId,
companyId,
projectId: dependentProjectId,
name: "Dependent primary",
},
{
id: blockerProjectWorkspaceId,
companyId,
projectId: blockerProjectId,
name: "Blocker primary",
},
]);
await db.insert(executionWorkspaces).values([
{
id: dependentExecutionWorkspaceId,
companyId,
projectId: dependentProjectId,
projectWorkspaceId: dependentProjectWorkspaceId,
mode: "operator_branch",
strategyType: "git_worktree",
name: "Dependent branch",
status: "active",
providerType: "git_worktree",
},
{
id: blockerExecutionWorkspaceId,
companyId,
projectId: blockerProjectId,
projectWorkspaceId: blockerProjectWorkspaceId,
mode: "operator_branch",
strategyType: "git_worktree",
name: "Blocker branch",
status: "active",
providerType: "git_worktree",
},
]);
await db.insert(issues).values([
{
id: blockedIssueId,
companyId,
projectId: dependentProjectId,
projectWorkspaceId: dependentProjectWorkspaceId,
executionWorkspaceId: dependentExecutionWorkspaceId,
executionWorkspacePreference: "reuse_existing",
executionWorkspaceSettings: { mode: "operator_branch" },
title: "Blocked dependent",
status: "blocked",
priority: "medium",
issueNumber: 1,
identifier: `${issuePrefix}-1`,
createdAt: issueTimestamp,
updatedAt: issueTimestamp,
},
{
id: blockerIssueId,
companyId,
projectId: blockerProjectId,
projectWorkspaceId: blockerProjectWorkspaceId,
executionWorkspaceId: blockerExecutionWorkspaceId,
executionWorkspacePreference: "reuse_existing",
executionWorkspaceSettings: { mode: "operator_branch" },
title: "Unassigned leaf blocker",
status: "todo",
priority: "medium",
issueNumber: 2,
identifier: `${issuePrefix}-2`,
createdAt: issueTimestamp,
updatedAt: issueTimestamp,
},
]);
await db.insert(issueRelations).values({
companyId,
issueId: blockerIssueId,
relatedIssueId: blockedIssueId,
type: "blocks",
});
const result = await heartbeatService(db).reconcileIssueGraphLiveness();
expect(result.escalationsCreated).toBe(1);
const escalations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "harness_liveness_escalation")));
expect(escalations).toHaveLength(1);
expect(escalations[0]).toMatchObject({
parentId: blockerIssueId,
projectId: blockerProjectId,
projectWorkspaceId: blockerProjectWorkspaceId,
executionWorkspaceId: null,
executionWorkspacePreference: null,
assigneeAgentId: managerId,
});
});
it("reuses one open recovery issue for multiple dependents with the same leaf blocker", async () => {
await enableAutoRecovery();
const { companyId, blockedIssueId, blockerIssueId } = await seedBlockedChain();
const secondBlockedIssueId = randomUUID();
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
const issueTimestamp = new Date(Date.now() - 25 * 60 * 60 * 1000);
await db.insert(issues).values({
id: secondBlockedIssueId,
companyId,
title: "Second blocked parent",
status: "blocked",
priority: "medium",
issueNumber: 3,
identifier: `${issuePrefix}-3`,
createdAt: issueTimestamp,
updatedAt: issueTimestamp,
});
await db.insert(issueRelations).values({
companyId,
issueId: blockerIssueId,
relatedIssueId: secondBlockedIssueId,
type: "blocks",
});
const heartbeat = heartbeatService(db);
const result = await heartbeat.reconcileIssueGraphLiveness();
expect(result.findings).toBe(2);
expect(result.escalationsCreated).toBe(1);
expect(result.existingEscalations).toBe(1);
const escalations = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "harness_liveness_escalation")));
expect(escalations).toHaveLength(1);
const blockers = await db
.select({ blockedIssueId: issueRelations.relatedIssueId })
.from(issueRelations)
.where(and(eq(issueRelations.companyId, companyId), eq(issueRelations.issueId, escalations[0]!.id)));
expect(blockers.map((row) => row.blockedIssueId).sort()).toEqual(
[blockedIssueId, secondBlockedIssueId].sort(),
);
});
it("creates a fresh escalation when the previous matching escalation is terminal", async () => {
await enableAutoRecovery();
const { companyId, managerId, blockedIssueId, blockerIssueId } = await seedBlockedChain();
const heartbeat = heartbeatService(db);
const incidentKey = [
@@ -265,7 +577,7 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
expect(openEscalations).toHaveLength(2);
const freshEscalation = openEscalations.find((issue) => issue.status !== "done");
expect(freshEscalation).toMatchObject({
parentId: blockerIssueId,
assigneeAgentId: managerId,
status: expect.stringMatching(/^(todo|in_progress|done)$/),
});


@@ -1,6 +1,6 @@
import { randomUUID } from "node:crypto";
import { spawn, type ChildProcess } from "node:child_process";
import { eq, or, inArray } from "drizzle-orm";
import { and, eq, or, inArray } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
import {
activityLog,
@@ -17,6 +17,8 @@ import {
issueComments,
issueDocuments,
issueRelations,
issueTreeHoldMembers,
issueTreeHolds,
issues,
} from "@paperclipai/db";
import {
@@ -309,6 +311,8 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
await db.delete(documentRevisions);
await db.delete(documents);
await db.delete(issueRelations);
await db.delete(issueTreeHoldMembers);
await db.delete(issueTreeHolds);
for (let attempt = 0; attempt < 5; attempt += 1) {
await db.delete(issueComments);
await db.delete(issueDocuments);
@@ -454,11 +458,13 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
runStatus: "failed" | "timed_out" | "cancelled" | "succeeded";
retryReason?: "assignment_recovery" | "issue_continuation_needed" | null;
assignToUser?: boolean;
activePauseHold?: boolean;
}) {
const companyId = randomUUID();
const agentId = randomUUID();
const runId = randomUUID();
const wakeupRequestId = randomUUID();
const rootIssueId = randomUUID();
const issueId = randomUUID();
const now = new Date("2026-03-19T00:00:00.000Z");
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
@@ -520,22 +526,128 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
error: input.runStatus === "succeeded" ? null : "run failed before issue advanced",
});
await db.insert(issues).values([
...(input.activePauseHold
? [{
id: rootIssueId,
companyId,
title: "Paused recovery root",
status: "todo",
priority: "medium",
issueNumber: 1,
identifier: `${issuePrefix}-1`,
}]
: []),
{
id: issueId,
companyId,
parentId: input.activePauseHold ? rootIssueId : null,
title: "Recover stranded assigned work",
status: input.status,
priority: "medium",
assigneeAgentId: input.assignToUser ? null : agentId,
assigneeUserId: input.assignToUser ? "user-1" : null,
checkoutRunId: input.status === "in_progress" ? runId : null,
executionRunId: null,
issueNumber: input.activePauseHold ? 2 : 1,
identifier: `${issuePrefix}-${input.activePauseHold ? 2 : 1}`,
startedAt: input.status === "in_progress" ? now : null,
},
]);
if (input.activePauseHold) {
await db.insert(issueTreeHolds).values({
companyId,
rootIssueId,
mode: "pause",
status: "active",
reason: "pause recovery subtree",
releasePolicy: { strategy: "manual" },
});
}
return { companyId, agentId, runId, wakeupRequestId, issueId, rootIssueId };
}
async function expectStrandedRecoveryArtifacts(input: {
companyId: string;
agentId: string;
issueId: string;
runId: string;
previousStatus: "todo" | "in_progress";
retryReason: "assignment_recovery" | "issue_continuation_needed";
}) {
const recovery = await waitForValue(async () =>
db.select().from(issues).where(
and(
eq(issues.companyId, input.companyId),
eq(issues.originKind, "stranded_issue_recovery"),
eq(issues.originId, input.issueId),
),
).then((rows) => rows[0] ?? null),
);
if (!recovery) throw new Error("Expected stranded issue recovery issue to be created");
expect(recovery).toMatchObject({
companyId: input.companyId,
parentId: input.issueId,
assigneeAgentId: input.agentId,
originKind: "stranded_issue_recovery",
originId: input.issueId,
originRunId: input.runId,
priority: "medium",
});
expect(recovery.title).toContain("Recover stalled issue");
expect(recovery.description).toContain(`Previous source status: \`${input.previousStatus}\``);
expect(recovery.description).toContain(`Retry reason: \`${input.retryReason}\``);
expect(recovery.description).toContain("Fix the runtime/adapter problem");
const relation = await db
.select()
.from(issueRelations)
.where(
and(
eq(issueRelations.companyId, input.companyId),
eq(issueRelations.issueId, recovery.id),
eq(issueRelations.relatedIssueId, input.issueId),
eq(issueRelations.type, "blocks"),
),
)
.then((rows) => rows[0] ?? null);
expect(relation).toBeTruthy();
const wakeups = await db
.select()
.from(agentWakeupRequests)
.where(eq(agentWakeupRequests.agentId, input.agentId));
const recoveryWakeup = wakeups.find((wakeup) => {
const payload = wakeup.payload as Record<string, unknown> | null;
return payload?.issueId === recovery.id &&
payload?.sourceIssueId === input.issueId &&
payload?.strandedRunId === input.runId;
});
expect(recoveryWakeup).toMatchObject({
companyId: input.companyId,
reason: "issue_assigned",
source: "assignment",
});
const recoveryRun = recoveryWakeup?.runId
? await db
.select()
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, recoveryWakeup.runId))
.then((rows) => rows[0] ?? null)
: null;
expect(recoveryRun?.contextSnapshot).toMatchObject({
issueId: recovery.id,
taskId: recovery.id,
source: "stranded_issue_recovery",
sourceIssueId: input.issueId,
strandedRunId: input.runId,
});
return recovery;
}
async function seedQueuedIssueRunFixture() {
@@ -728,11 +840,28 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
it("blocks the issue when process-loss retry is exhausted and the immediate continuation recovery also fails", async () => {
mockAdapterExecute.mockRejectedValueOnce(new Error("continuation recovery failed"));
const { companyId, agentId, runId, issueId } = await seedRunFixture({
agentStatus: "idle",
processPid: 999_999_999,
processLossRetryCount: 1,
});
const resolvedBlockerId = randomUUID();
const issuePrefix = `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`;
await db.insert(issues).values({
id: resolvedBlockerId,
companyId,
title: "Already completed prerequisite",
status: "done",
priority: "medium",
issueNumber: 2,
identifier: `${issuePrefix}-2`,
});
await db.insert(issueRelations).values({
companyId,
issueId: resolvedBlockerId,
relatedIssueId: issueId,
type: "blocks",
});
const heartbeat = heartbeatService(db);
const result = await heartbeat.reapOrphanedRuns();
@@ -759,7 +888,29 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
);
expect(blockedIssue?.status).toBe("blocked");
expect(blockedIssue?.executionRunId).toBeNull();
expect(blockedIssue?.checkoutRunId).toBeNull();
if (!continuationRun?.id) throw new Error("Expected continuation recovery run to exist");
const recovery = await expectStrandedRecoveryArtifacts({
companyId,
agentId,
issueId,
runId: continuationRun.id,
previousStatus: "in_progress",
retryReason: "issue_continuation_needed",
});
const blockerRelations = await db
.select()
.from(issueRelations)
.where(
and(
eq(issueRelations.companyId, companyId),
eq(issueRelations.relatedIssueId, issueId),
eq(issueRelations.type, "blocks"),
),
);
expect(blockerRelations.map((relation) => relation.issueId)).toEqual([recovery.id]);
const comments = await waitForValue(async () => {
const rows = await db.select().from(issueComments).where(eq(issueComments.issueId, issueId));
@@ -767,6 +918,49 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
});
expect(comments).toHaveLength(1);
expect(comments[0]?.body).toContain("retried continuation");
expect(comments[0]?.body).toContain(`Recovery issue: [${recovery.identifier}]`);
});
it("does not block paused-tree work when immediate continuation recovery is suppressed by the hold", async () => {
const { companyId, agentId, runId, issueId } = await seedRunFixture({
agentStatus: "idle",
processPid: 999_999_999,
processLossRetryCount: 1,
});
await db.insert(issueTreeHolds).values({
companyId,
rootIssueId: issueId,
mode: "pause",
status: "active",
reason: "pause immediate recovery subtree",
releasePolicy: { strategy: "manual" },
});
const heartbeat = heartbeatService(db);
const result = await heartbeat.reapOrphanedRuns();
expect(result.reaped).toBe(1);
expect(result.runIds).toEqual([runId]);
const runs = await db
.select()
.from(heartbeatRuns)
.where(eq(heartbeatRuns.agentId, agentId));
expect(runs).toHaveLength(1);
expect(runs[0]?.status).toBe("failed");
const issue = await db.select().from(issues).where(eq(issues.id, issueId)).then((rows) => rows[0] ?? null);
expect(issue?.status).toBe("in_progress");
expect(issue?.executionRunId).toBeNull();
expect(issue?.checkoutRunId).toBe(runId);
const recoveryIssues = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stranded_issue_recovery")));
expect(recoveryIssues).toHaveLength(0);
const comments = await db.select().from(issueComments).where(eq(issueComments.issueId, issueId));
expect(comments).toHaveLength(0);
});
it("schedules a bounded retry for codex transient upstream failures instead of blocking the issue immediately", async () => {
@@ -901,7 +1095,7 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
});
it("blocks assigned todo work after the one automatic dispatch recovery was already used", async () => {
const { companyId, agentId, issueId, runId } = await seedStrandedIssueFixture({
status: "todo",
runStatus: "failed",
retryReason: "assignment_recovery",
@@ -916,10 +1110,20 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
const issue = await db.select().from(issues).where(eq(issues.id, issueId)).then((rows) => rows[0] ?? null);
expect(issue?.status).toBe("blocked");
const recovery = await expectStrandedRecoveryArtifacts({
companyId,
agentId,
issueId,
runId,
previousStatus: "todo",
retryReason: "assignment_recovery",
});
const comments = await db.select().from(issueComments).where(eq(issueComments.issueId, issueId));
expect(comments).toHaveLength(1);
expect(comments[0]?.body).toContain("retried dispatch");
expect(comments[0]?.body).toContain("Latest retry failure: `process_lost` - run failed before issue advanced.");
expect(comments[0]?.body).toContain(`Recovery issue: [${recovery.identifier}]`);
});
it("assigns open unassigned blockers back to their creator agent", async () => {
@@ -1206,7 +1410,7 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
expect(wakes.some((row) => row.reason === "run_liveness_continuation")).toBe(false);
});
it("blocks stranded in-progress work after the continuation retry was already used", async () => {
const { companyId, agentId, issueId, runId } = await seedStrandedIssueFixture({
status: "in_progress",
runStatus: "failed",
retryReason: "issue_continuation_needed",
@@ -1221,10 +1425,65 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
const issue = await db.select().from(issues).where(eq(issues.id, issueId)).then((rows) => rows[0] ?? null);
expect(issue?.status).toBe("blocked");
const recovery = await expectStrandedRecoveryArtifacts({
companyId,
agentId,
issueId,
runId,
previousStatus: "in_progress",
retryReason: "issue_continuation_needed",
});
const comments = await db.select().from(issueComments).where(eq(issueComments.issueId, issueId));
expect(comments).toHaveLength(1);
expect(comments[0]?.body).toContain("retried continuation");
expect(comments[0]?.body).toContain("Latest retry failure: `process_lost` - run failed before issue advanced.");
expect(comments[0]?.body).toContain(`Recovery issue: [${recovery.identifier}]`);
});
it("does not escalate paused-tree recovery when the automatic continuation retry was cancelled by the hold", async () => {
const { companyId, agentId, issueId } = await seedStrandedIssueFixture({
status: "in_progress",
runStatus: "cancelled",
retryReason: "issue_continuation_needed",
activePauseHold: true,
});
const heartbeat = heartbeatService(db);
const result = await heartbeat.reconcileStrandedAssignedIssues();
expect(result.dispatchRequeued).toBe(0);
expect(result.continuationRequeued).toBe(0);
expect(result.escalated).toBe(0);
expect(result.skipped).toBe(1);
expect(result.issueIds).toEqual([]);
const issue = await db.select().from(issues).where(eq(issues.id, issueId)).then((rows) => rows[0] ?? null);
expect(issue?.status).toBe("in_progress");
expect(issue?.checkoutRunId).toBeTruthy();
const recoveryIssues = await db
.select()
.from(issues)
.where(and(eq(issues.companyId, companyId), eq(issues.originKind, "stranded_issue_recovery")));
expect(recoveryIssues).toHaveLength(0);
const blockerRelations = await db
.select()
.from(issueRelations)
.where(
and(
eq(issueRelations.companyId, companyId),
eq(issueRelations.relatedIssueId, issueId),
eq(issueRelations.type, "blocks"),
),
);
expect(blockerRelations).toHaveLength(0);
const comments = await db.select().from(issueComments).where(eq(issueComments.issueId, issueId));
expect(comments).toHaveLength(0);
const wakeups = await db.select().from(agentWakeupRequests).where(eq(agentWakeupRequests.agentId, agentId));
expect(wakeups).toHaveLength(1);
});
it("re-enqueues continuation when the latest automatic continuation succeeded without closing the issue", async () => {


@@ -3,11 +3,14 @@ import { eq, sql } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
agents,
agentRuntimeState,
agentWakeupRequests,
companies,
createDb,
environmentLeases,
heartbeatRunEvents,
heartbeatRuns,
issues,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
@@ -40,8 +43,11 @@ describeEmbeddedPostgres("heartbeat bounded retry scheduling", () => {
afterEach(async () => {
await db.delete(heartbeatRunEvents);
await db.delete(environmentLeases);
await db.delete(issues);
await db.delete(heartbeatRuns);
await db.delete(agentWakeupRequests);
await db.delete(agentRuntimeState);
await db.delete(agents);
await db.delete(companies);
});
@@ -212,6 +218,376 @@ describeEmbeddedPostgres("heartbeat bounded retry scheduling", () => {
expect(promotedRun?.status).toBe("queued");
});
it("does not defer a new assignee behind the previous assignee's scheduled retry", async () => {
const companyId = randomUUID();
const oldAgentId = randomUUID();
const newAgentId = randomUUID();
const issueId = randomUUID();
const sourceRunId = randomUUID();
const now = new Date("2026-04-20T13:00:00.000Z");
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values([
{
id: oldAgentId,
companyId,
name: "ClaudeCoder",
role: "engineer",
status: "active",
adapterType: "claude_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 1,
},
},
permissions: {},
},
{
id: newAgentId,
companyId,
name: "CodexCoder",
role: "engineer",
status: "active",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 1,
},
},
permissions: {},
},
]);
await db.insert(heartbeatRuns).values({
id: sourceRunId,
companyId,
agentId: oldAgentId,
invocationSource: "assignment",
triggerDetail: "system",
status: "failed",
error: "upstream overload",
errorCode: "adapter_failed",
finishedAt: now,
contextSnapshot: {
issueId,
wakeReason: "issue_assigned",
},
updatedAt: now,
createdAt: now,
});
await db.insert(issues).values({
id: issueId,
companyId,
title: "Retry reassignment",
status: "todo",
priority: "medium",
assigneeAgentId: oldAgentId,
executionRunId: sourceRunId,
executionAgentNameKey: "claudecoder",
executionLockedAt: now,
issueNumber: 1,
identifier: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}-1`,
});
const scheduled = await heartbeat.scheduleBoundedRetry(sourceRunId, {
now,
random: () => 0.5,
});
expect(scheduled.outcome).toBe("scheduled");
if (scheduled.outcome !== "scheduled") return;
await db.update(issues).set({
assigneeAgentId: newAgentId,
updatedAt: now,
}).where(eq(issues.id, issueId));
// Keep the new agent's queue from auto-claiming/executing during this unit test.
await db.insert(heartbeatRuns).values(
Array.from({ length: 5 }, () => ({
id: randomUUID(),
companyId,
agentId: newAgentId,
invocationSource: "automation",
triggerDetail: "system",
status: "running",
contextSnapshot: {
wakeReason: "test_busy_slot",
},
startedAt: now,
updatedAt: now,
createdAt: now,
})),
);
const newAssigneeRun = await heartbeat.wakeup(newAgentId, {
source: "assignment",
triggerDetail: "system",
reason: "issue_assigned",
payload: {
issueId,
mutation: "update",
},
contextSnapshot: {
issueId,
source: "issue.update",
},
requestedByActorType: "user",
requestedByActorId: "local-board",
});
expect(newAssigneeRun).not.toBeNull();
expect(newAssigneeRun?.agentId).toBe(newAgentId);
expect(newAssigneeRun?.status).toBe("queued");
const oldRetry = await db
.select({
status: heartbeatRuns.status,
errorCode: heartbeatRuns.errorCode,
})
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, scheduled.run.id))
.then((rows) => rows[0] ?? null);
expect(oldRetry).toEqual({
status: "cancelled",
errorCode: "issue_reassigned",
});
const deferredWakeups = await db
.select({ count: sql<number>`count(*)::int` })
.from(agentWakeupRequests)
.where(eq(agentWakeupRequests.status, "deferred_issue_execution"))
.then((rows) => rows[0]?.count ?? 0);
expect(deferredWakeups).toBe(0);
});
it("does not promote a scheduled retry after issue ownership changes", async () => {
const companyId = randomUUID();
const oldAgentId = randomUUID();
const newAgentId = randomUUID();
const issueId = randomUUID();
const sourceRunId = randomUUID();
const now = new Date("2026-04-20T14:00:00.000Z");
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values([
{
id: oldAgentId,
companyId,
name: "ClaudeCoder",
role: "engineer",
status: "active",
adapterType: "claude_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 1,
},
},
permissions: {},
},
{
id: newAgentId,
companyId,
name: "CodexCoder",
role: "engineer",
status: "active",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 1,
},
},
permissions: {},
},
]);
await db.insert(heartbeatRuns).values({
id: sourceRunId,
companyId,
agentId: oldAgentId,
invocationSource: "assignment",
triggerDetail: "system",
status: "failed",
error: "upstream overload",
errorCode: "adapter_failed",
finishedAt: now,
contextSnapshot: {
issueId,
wakeReason: "issue_assigned",
},
updatedAt: now,
createdAt: now,
});
await db.insert(issues).values({
id: issueId,
companyId,
title: "Retry promotion reassignment",
status: "todo",
priority: "medium",
assigneeAgentId: oldAgentId,
executionRunId: sourceRunId,
executionAgentNameKey: "claudecoder",
executionLockedAt: now,
issueNumber: 1,
identifier: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}-2`,
});
const scheduled = await heartbeat.scheduleBoundedRetry(sourceRunId, {
now,
random: () => 0.5,
});
expect(scheduled.outcome).toBe("scheduled");
if (scheduled.outcome !== "scheduled") return;
await db.update(issues).set({
assigneeAgentId: newAgentId,
updatedAt: now,
}).where(eq(issues.id, issueId));
const promotion = await heartbeat.promoteDueScheduledRetries(scheduled.dueAt);
expect(promotion).toEqual({ promoted: 0, runIds: [] });
const oldRetry = await db
.select({
status: heartbeatRuns.status,
errorCode: heartbeatRuns.errorCode,
})
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, scheduled.run.id))
.then((rows) => rows[0] ?? null);
expect(oldRetry).toEqual({
status: "cancelled",
errorCode: "issue_reassigned",
});
const issue = await db
.select({ executionRunId: issues.executionRunId })
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issue?.executionRunId).toBeNull();
});
it("does not promote a scheduled retry after the issue is cancelled", async () => {
const companyId = randomUUID();
const agentId = randomUUID();
const issueId = randomUUID();
const sourceRunId = randomUUID();
const now = new Date("2026-04-20T15:00:00.000Z");
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values({
id: agentId,
companyId,
name: "CodexCoder",
role: "engineer",
status: "active",
adapterType: "codex_local",
adapterConfig: {},
runtimeConfig: {
heartbeat: {
wakeOnDemand: true,
maxConcurrentRuns: 1,
},
},
permissions: {},
});
await db.insert(heartbeatRuns).values({
id: sourceRunId,
companyId,
agentId,
invocationSource: "assignment",
triggerDetail: "system",
status: "failed",
error: "upstream overload",
errorCode: "adapter_failed",
finishedAt: now,
contextSnapshot: {
issueId,
wakeReason: "issue_assigned",
},
updatedAt: now,
createdAt: now,
});
await db.insert(issues).values({
id: issueId,
companyId,
title: "Retry promotion cancellation",
status: "todo",
priority: "medium",
assigneeAgentId: agentId,
executionRunId: sourceRunId,
executionAgentNameKey: "codexcoder",
executionLockedAt: now,
issueNumber: 1,
identifier: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}-3`,
});
const scheduled = await heartbeat.scheduleBoundedRetry(sourceRunId, {
now,
random: () => 0.5,
});
expect(scheduled.outcome).toBe("scheduled");
if (scheduled.outcome !== "scheduled") return;
await db.update(issues).set({
status: "cancelled",
updatedAt: now,
}).where(eq(issues.id, issueId));
const promotion = await heartbeat.promoteDueScheduledRetries(scheduled.dueAt);
expect(promotion).toEqual({ promoted: 0, runIds: [] });
const oldRetry = await db
.select({
status: heartbeatRuns.status,
errorCode: heartbeatRuns.errorCode,
})
.from(heartbeatRuns)
.where(eq(heartbeatRuns.id, scheduled.run.id))
.then((rows) => rows[0] ?? null);
expect(oldRetry).toEqual({
status: "cancelled",
errorCode: "issue_cancelled",
});
const issue = await db
.select({ executionRunId: issues.executionRunId })
.from(issues)
.where(eq(issues.id, issueId))
.then((rows) => rows[0] ?? null);
expect(issue?.executionRunId).toBeNull();
});
it("exhausts bounded retries after the hard cap", async () => {
const companyId = randomUUID();
const agentId = randomUUID();


@@ -0,0 +1,30 @@
import { randomUUID } from "node:crypto";
import { afterEach, describe, expect, it, vi } from "vitest";
import { withAgentStartLock } from "../services/agent-start-lock.ts";
describe("heartbeat agent start lock", () => {
afterEach(() => {
vi.useRealTimers();
});
it("does not let a stale start lock freeze later queued-run starts", async () => {
vi.useFakeTimers();
const agentId = randomUUID();
const firstStart = vi.fn(() => new Promise<void>(() => undefined));
const secondStart = vi.fn(async () => "started");
void withAgentStartLock(agentId, firstStart);
await Promise.resolve();
expect(firstStart).toHaveBeenCalledTimes(1);
const secondStartResult = withAgentStartLock(agentId, secondStart);
await Promise.resolve();
expect(secondStart).not.toHaveBeenCalled();
await vi.advanceTimersByTimeAsync(30_000);
await expect(secondStartResult).resolves.toBe("started");
expect(secondStart).toHaveBeenCalledTimes(1);
});
});
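The test above pins down the stale-lock behavior: a second queued start must only wait while the current holder's lock is younger than a 30-second staleness window. A minimal sketch of a per-agent start lock with that property follows — this is a hypothetical illustration, not the actual `agent-start-lock.ts` implementation; the `staleMs` parameter and the `Map`-based bookkeeping are assumptions added so the sketch is self-contained and testable.

```typescript
// Hypothetical sketch of a per-agent start lock with stale-lock expiry.
// Each agent id maps to the in-flight start's promise plus its acquisition
// time; a later caller waits only while the lock is younger than staleMs,
// so a hung start cannot freeze later queued-run starts forever.
type Lock = { done: Promise<unknown>; acquiredAt: number };

const locks = new Map<string, Lock>();
const STALE_LOCK_MS = 30_000; // matches the 30s window the test advances past

export async function withAgentStartLock<T>(
  agentId: string,
  start: () => Promise<T>,
  staleMs: number = STALE_LOCK_MS, // illustrative knob, not in the real API
): Promise<T> {
  // Wait until the previous holder finishes or its lock goes stale.
  for (let lock = locks.get(agentId); lock; lock = locks.get(agentId)) {
    const remaining = lock.acquiredAt + staleMs - Date.now();
    if (remaining <= 0) break; // stale: steal the lock and proceed
    await Promise.race([
      lock.done,
      new Promise((resolve) => setTimeout(resolve, remaining)),
    ]);
  }
  const done = start();
  const entry: Lock = { done: done.catch(() => undefined), acquiredAt: Date.now() };
  locks.set(agentId, entry);
  try {
    return await done;
  } finally {
    // Release only if we still own the lock (a staler-than-us start may
    // have been stolen from us in the meantime).
    if (locks.get(agentId) === entry) locks.delete(agentId);
  }
}
```

Under this sketch the test's scenario works out: the first start never resolves, the second caller's `Promise.race` falls through on the staleness timeout, and the second start runs.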


@@ -42,7 +42,7 @@ describe("instance settings routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.clearAllMocks();
mockInstanceSettingsService.getGeneral.mockReset();
mockInstanceSettingsService.getExperimental.mockReset();
mockInstanceSettingsService.updateGeneral.mockReset();
@@ -58,6 +58,7 @@ describe("instance settings routes", () => {
enableEnvironments: false,
enableIsolatedWorkspaces: false,
autoRestartDevServerWhenIdle: false,
enableIssueGraphLivenessAutoRecovery: false,
});
mockInstanceSettingsService.updateGeneral.mockResolvedValue({
id: "instance-settings-1",
@@ -73,6 +74,7 @@ describe("instance settings routes", () => {
enableEnvironments: true,
enableIsolatedWorkspaces: true,
autoRestartDevServerWhenIdle: false,
enableIssueGraphLivenessAutoRecovery: false,
},
});
mockInstanceSettingsService.listCompanyIds.mockResolvedValue(["company-1", "company-2"]);
@@ -92,6 +94,7 @@ describe("instance settings routes", () => {
enableEnvironments: false,
enableIsolatedWorkspaces: false,
autoRestartDevServerWhenIdle: false,
enableIssueGraphLivenessAutoRecovery: false,
});
const patchRes = await request(app)
@@ -103,7 +106,7 @@ describe("instance settings routes", () => {
enableIsolatedWorkspaces: true,
});
expect(mockLogActivity).toHaveBeenCalledTimes(2);
}, 10_000);
it("allows local board users to update guarded dev-server auto-restart", async () => {
const app = await createApp({
@@ -118,8 +121,28 @@ describe("instance settings routes", () => {
.send({ autoRestartDevServerWhenIdle: true })
.expect(200);
expect(
mockInstanceSettingsService.updateExperimental.mock.calls.some(
([patch]) => patch?.autoRestartDevServerWhenIdle === true,
),
).toBe(true);
});
it("allows local board users to update issue graph liveness auto-recovery", async () => {
const app = await createApp({
type: "board",
userId: "local-board",
source: "local_implicit",
isInstanceAdmin: true,
});
await request(app)
.patch("/api/instance/settings/experimental")
.send({ enableIssueGraphLivenessAutoRecovery: true })
.expect(200);
expect(mockInstanceSettingsService.updateExperimental).toHaveBeenCalledWith({
autoRestartDevServerWhenIdle: true,
enableIssueGraphLivenessAutoRecovery: true,
});
});


@@ -113,7 +113,7 @@ describe("POST /companies/:companyId/invites", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.clearAllMocks();
logActivityMock.mockReset();
});


@@ -127,7 +127,7 @@ describe("GET /invites/:token", () => {
expect(res.body.companyBrandColor).toBe("#114488");
expect(res.body.companyLogoUrl).toBe("/api/invites/pcp_invite_test/logo");
expect(res.body.inviteType).toBe("company_join");
}, 10_000);
it("omits companyLogoUrl when the stored logo object is missing", async () => {
mockStorage.headObject.mockResolvedValue({ exists: false });
@@ -172,7 +172,7 @@ describe("GET /invites/:token", () => {
expect(res.status).toBe(200);
expect(res.body.companyLogoUrl).toBeNull();
}, 10_000);
it("returns pending join-request status for an already-accepted invite", async () => {
const invite = {
@@ -218,7 +218,7 @@ describe("GET /invites/:token", () => {
expect(res.body.joinRequestStatus).toBe("pending_approval");
expect(res.body.joinRequestType).toBe("human");
expect(res.body.companyName).toBe("Acme Robotics");
}, 10_000);
it("falls back to a reusable human join request when the accepted invite reused an existing queue entry", async () => {
const invite = {
@@ -274,5 +274,5 @@ describe("GET /invites/:token", () => {
expect(res.status).toBe(200);
expect(res.body.joinRequestStatus).toBe("pending_approval");
expect(res.body.joinRequestType).toBe("human");
}, 10_000);
});


@@ -1,6 +1,6 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
function createSelectChain(rows: unknown[]) {
const query = {
@@ -44,8 +44,6 @@ function createInvite(overrides: Record<string, unknown> = {}) {
};
}
async function createApp(
db: Record<string, unknown>,
network: {
@@ -54,11 +52,9 @@ async function createApp(
},
) {
const [access, middleware] = await Promise.all([
import("../routes/access.js"),
import("../middleware/index.js"),
]);
const app = express();
app.use((req, _res, next) => {
(req as any).actor = { type: "anon" };
@@ -71,6 +67,7 @@ async function createApp(
deploymentExposure: "private",
bindHost: "127.0.0.1",
allowedHostnames: [],
inviteResolutionNetwork: network,
}),
);
app.use(middleware.errorHandler);
@@ -79,43 +76,43 @@ async function createApp(
describe.sequential("GET /invites/:token/test-resolution", () => {
beforeEach(() => {
vi.clearAllMocks();
});
it.each([
["localhost", "http://localhost:3100/api/health", "127.0.0.1"],
["IPv4 loopback", "http://127.0.0.1:3100/api/health", "127.0.0.1"],
["IPv6 loopback", "http://[::1]:3100/api/health", "::1"],
["IPv4-mapped IPv6 loopback hex", "http://[::ffff:7f00:1]/api/health", "::ffff:7f00:1"],
["IPv4-mapped IPv6 RFC1918 hex", "http://[::ffff:c0a8:101]/api/health", "::ffff:c0a8:101"],
["RFC1918 10/8", "http://10.0.0.5/api/health", "10.0.0.5"],
["RFC1918 172.16/12", "http://172.16.10.5/api/health", "172.16.10.5"],
["RFC1918 192.168/16", "http://192.168.1.10/api/health", "192.168.1.10"],
["link-local metadata", "http://169.254.169.254/latest/meta-data", "169.254.169.254"],
["multicast", "http://224.0.0.1/probe", "224.0.0.1"],
["NAT64 well-known prefix", "https://gateway.example.test/health", "64:ff9b::0a00:0001"],
["NAT64 local-use prefix", "https://gateway.example.test/health", "64:ff9b:1::0a00:0001"],
])("rejects %s targets before probing", async (_label, url, address) => {
const lookup = vi.fn().mockResolvedValue([{ address, family: address.includes(":") ? 6 : 4 }]);
const requestHead = vi.fn();
const app = await createApp(createDbStub([createInvite()]), { lookup, requestHead });
const res = await request(app)
.get("/api/invites/pcp_invite_test/test-resolution")
.query({ url });
expect(res.status).toBe(400);
expect(res.body.error).toBe(
"url resolves to a private, local, multicast, or reserved address",
);
expect(requestHead).not.toHaveBeenCalled();
}, 20_000);
it.sequential("rejects hostnames that resolve to private addresses", async () => {
const lookup = vi.fn().mockResolvedValue([{ address: "10.1.2.3", family: 4 }]);
const requestHead = vi.fn();
const app = await createApp(createDbStub([createInvite()]), { lookup, requestHead });
@@ -132,7 +129,7 @@ describe.sequential("GET /invites/:token/test-resolution", () => {
expect(requestHead).not.toHaveBeenCalled();
});
it.sequential("rejects hostnames when any resolved address is private", async () => {
const lookup = vi.fn().mockResolvedValue([
{ address: "127.0.0.1", family: 4 },
{ address: "93.184.216.34", family: 4 },
@@ -148,7 +145,7 @@ describe.sequential("GET /invites/:token/test-resolution", () => {
expect(requestHead).not.toHaveBeenCalled();
});
it.sequential("allows public HTTPS targets through the resolved and pinned probe path", async () => {
const lookup = vi.fn().mockResolvedValue([{ address: "93.184.216.34", family: 4 }]);
const requestHead = vi.fn().mockResolvedValue({ httpStatus: 204 });
const app = await createApp(createDbStub([createInvite()]), { lookup, requestHead });
@@ -177,7 +174,7 @@ describe.sequential("GET /invites/:token/test-resolution", () => {
);
});
it.sequential.each([
["missing invite", []],
["revoked invite", [createInvite({ revokedAt: new Date("2026-03-07T00:05:00.000Z") })]],
["expired invite", [createInvite({ expiresAt: new Date("2020-03-07T00:10:00.000Z") })]],


@@ -158,7 +158,7 @@ describe("issue activity event routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.clearAllMocks();
mockIssueService.assertCheckoutOwner.mockResolvedValue({ adoptedFromRunId: null });
mockIssueService.findMentionedAgents.mockResolvedValue([]);
mockIssueService.getRelationSummaries.mockResolvedValue({ blockedBy: [], blocks: [] });


@@ -238,7 +238,7 @@ describe("agent issue mutation checkout ownership", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.clearAllMocks();
mockAccessService.canUser.mockReset();
mockAccessService.hasPermission.mockReset();
mockAgentService.getById.mockReset();


@@ -178,7 +178,7 @@ describe("issue attachment routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.clearAllMocks();
mockLogActivity.mockResolvedValue(undefined);
});


@@ -0,0 +1,280 @@
import { randomUUID } from "node:crypto";
import { eq } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
agents,
agentWakeupRequests,
companies,
createDb,
heartbeatRuns,
issueRelations,
issues,
} from "@paperclipai/db";
import {
getEmbeddedPostgresTestSupport,
startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { issueService } from "../services/issues.js";
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
`Skipping embedded Postgres issue blocker attention tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
);
}
describeEmbeddedPostgres("issue blocker attention", () => {
let db!: ReturnType<typeof createDb>;
let svc!: ReturnType<typeof issueService>;
let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
beforeAll(async () => {
tempDb = await startEmbeddedPostgresTestDatabase("paperclip-issue-blocker-attention-");
db = createDb(tempDb.connectionString);
svc = issueService(db);
}, 20_000);
afterEach(async () => {
await db.delete(heartbeatRuns);
await db.delete(agentWakeupRequests);
await db.delete(issueRelations);
await db.delete(issues);
await db.delete(agents);
await db.delete(companies);
});
afterAll(async () => {
await tempDb?.cleanup();
});
async function createCompany(prefix = "PBA") {
const companyId = randomUUID();
const agentId = randomUUID();
await db.insert(companies).values({
id: companyId,
name: `Company ${prefix}`,
issuePrefix: prefix,
requireBoardApprovalForNewAgents: false,
});
await db.insert(agents).values({
id: agentId,
companyId,
name: `${prefix} Agent`,
role: "engineer",
status: "idle",
});
return { companyId, agentId };
}
async function insertIssue(input: {
companyId: string;
id?: string;
identifier: string;
title: string;
status: string;
parentId?: string | null;
assigneeAgentId?: string | null;
}) {
const id = input.id ?? randomUUID();
await db.insert(issues).values({
id,
companyId: input.companyId,
identifier: input.identifier,
title: input.title,
status: input.status,
priority: "medium",
parentId: input.parentId ?? null,
assigneeAgentId: input.assigneeAgentId ?? null,
});
return id;
}
async function block(input: { companyId: string; blockerIssueId: string; blockedIssueId: string }) {
await db.insert(issueRelations).values({
companyId: input.companyId,
issueId: input.blockerIssueId,
relatedIssueId: input.blockedIssueId,
type: "blocks",
});
}
async function activeRun(input: { companyId: string; agentId: string; issueId: string; status?: string; current?: boolean }) {
const runId = randomUUID();
await db.insert(heartbeatRuns).values({
id: runId,
companyId: input.companyId,
agentId: input.agentId,
status: input.status ?? "running",
contextSnapshot: { issueId: input.issueId },
});
if (input.current !== false) {
await db.update(issues).set({ executionRunId: runId }).where(eq(issues.id, input.issueId));
}
return runId;
}
it("classifies a blocked parent as covered when its child has a running execution path", async () => {
const { companyId, agentId } = await createCompany("PBC");
const parentId = await insertIssue({ companyId, identifier: "PBC-1", title: "Parent", status: "blocked" });
const childId = await insertIssue({
companyId,
identifier: "PBC-2",
title: "Running child",
status: "todo",
parentId,
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: childId, blockedIssueId: parentId });
await activeRun({ companyId, agentId, issueId: childId });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "covered",
reason: "active_child",
unresolvedBlockerCount: 1,
coveredBlockerCount: 1,
attentionBlockerCount: 0,
sampleBlockerIdentifier: "PBC-2",
});
});
it("keeps mixed blockers attention-required when any path lacks active work", async () => {
const { companyId, agentId } = await createCompany("PBM");
const parentId = await insertIssue({ companyId, identifier: "PBM-1", title: "Parent", status: "blocked" });
const activeChildId = await insertIssue({
companyId,
identifier: "PBM-2",
title: "Running child",
status: "todo",
parentId,
assigneeAgentId: agentId,
});
const idleBlockerId = await insertIssue({
companyId,
identifier: "PBM-3",
title: "Idle blocker",
status: "todo",
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: activeChildId, blockedIssueId: parentId });
await block({ companyId, blockerIssueId: idleBlockerId, blockedIssueId: parentId });
await activeRun({ companyId, agentId, issueId: activeChildId });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "needs_attention",
reason: "attention_required",
unresolvedBlockerCount: 2,
coveredBlockerCount: 1,
attentionBlockerCount: 1,
sampleBlockerIdentifier: "PBM-3",
});
});
it("covers recursive blocker chains when the downstream leaf has active work", async () => {
const { companyId, agentId } = await createCompany("PBR");
const parentId = await insertIssue({ companyId, identifier: "PBR-1", title: "Parent", status: "blocked" });
const blockerId = await insertIssue({ companyId, identifier: "PBR-2", title: "Blocked dependency", status: "blocked" });
const leafId = await insertIssue({
companyId,
identifier: "PBR-3",
title: "Running leaf",
status: "todo",
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: blockerId, blockedIssueId: parentId });
await block({ companyId, blockerIssueId: leafId, blockedIssueId: blockerId });
await activeRun({ companyId, agentId, issueId: leafId });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "covered",
reason: "active_dependency",
unresolvedBlockerCount: 1,
coveredBlockerCount: 1,
attentionBlockerCount: 0,
sampleBlockerIdentifier: "PBR-3",
});
});
it("does not let another company's active run cover the blocker", async () => {
const { companyId, agentId } = await createCompany("PBS");
const other = await createCompany("PBT");
const parentId = await insertIssue({ companyId, identifier: "PBS-1", title: "Parent", status: "blocked" });
const blockerId = await insertIssue({
companyId,
identifier: "PBS-2",
title: "Same-company blocker",
status: "todo",
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: blockerId, blockedIssueId: parentId });
await activeRun({ companyId: other.companyId, agentId: other.agentId, issueId: blockerId });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "needs_attention",
reason: "attention_required",
unresolvedBlockerCount: 1,
coveredBlockerCount: 0,
attentionBlockerCount: 1,
sampleBlockerIdentifier: "PBS-2",
});
});
it("does not cover a blocker from a stale run the issue no longer owns", async () => {
const { companyId, agentId } = await createCompany("PBX");
const parentId = await insertIssue({ companyId, identifier: "PBX-1", title: "Parent", status: "blocked" });
const blockerId = await insertIssue({
companyId,
identifier: "PBX-2",
title: "Previously running blocker",
status: "blocked",
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: blockerId, blockedIssueId: parentId });
await activeRun({ companyId, agentId, issueId: blockerId, current: false });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "needs_attention",
reason: "attention_required",
unresolvedBlockerCount: 1,
coveredBlockerCount: 0,
attentionBlockerCount: 1,
sampleBlockerIdentifier: "PBX-2",
});
});
it("does not treat a scheduled retry as actively covered work", async () => {
const { companyId, agentId } = await createCompany("PBY");
const parentId = await insertIssue({ companyId, identifier: "PBY-1", title: "Parent", status: "blocked" });
const blockerId = await insertIssue({
companyId,
identifier: "PBY-2",
title: "Retrying blocker",
status: "blocked",
assigneeAgentId: agentId,
});
await block({ companyId, blockerIssueId: blockerId, blockedIssueId: parentId });
await activeRun({ companyId, agentId, issueId: blockerId, status: "scheduled_retry" });
const parent = (await svc.list(companyId, { status: "blocked" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "needs_attention",
reason: "attention_required",
unresolvedBlockerCount: 1,
coveredBlockerCount: 0,
attentionBlockerCount: 1,
sampleBlockerIdentifier: "PBY-2",
});
});
});
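The coverage rule this suite pins down can be inferred from its expectations (the service internals are not shown in this diff): a blocker counts as covered only when it, or some issue reachable through its own blocker chain, carries a *current* same-company run with status `running`; scheduled retries and stale runs the issue no longer owns do not count, and the parent needs attention unless every unresolved blocker is covered. A minimal in-memory sketch of that rule, with all names hypothetical and the child/`reason` bookkeeping omitted:

```typescript
// Inferred coverage rule over plain objects (the real service queries
// Postgres and also distinguishes active_child vs active_dependency).
interface Issue {
  id: string;
  companyId: string;
  executionRunId: string | null; // null once the issue no longer owns its run
  blockerIds: string[]; // issues that block this one
}
interface Run {
  id: string;
  companyId: string;
  status: string; // only "running" counts; "scheduled_retry" does not
}

function isCovered(
  issueId: string,
  issues: Map<string, Issue>,
  runs: Map<string, Run>,
  companyId: string,
  seen = new Set<string>(),
): boolean {
  if (seen.has(issueId)) return false; // guard against relation cycles
  seen.add(issueId);
  const issue = issues.get(issueId);
  if (!issue) return false;
  const run = issue.executionRunId ? runs.get(issue.executionRunId) : undefined;
  if (run && run.companyId === companyId && run.status === "running") return true;
  // Recurse into the blocker chain: a running leaf covers the whole path.
  return issue.blockerIds.some((b) => isCovered(b, issues, runs, companyId, seen));
}

function classify(parent: Issue, issues: Map<string, Issue>, runs: Map<string, Run>) {
  const covered = parent.blockerIds.filter((b) =>
    isCovered(b, issues, runs, parent.companyId),
  ).length;
  const attention = parent.blockerIds.length - covered;
  return {
    state: attention === 0 ? "covered" : "needs_attention",
    unresolvedBlockerCount: parent.blockerIds.length,
    coveredBlockerCount: covered,
    attentionBlockerCount: attention,
  };
}

// Mirrors the recursive-chain case above: parent <- blocker <- running leaf.
const issues = new Map<string, Issue>([
  ["parent", { id: "parent", companyId: "c1", executionRunId: null, blockerIds: ["blocker"] }],
  ["blocker", { id: "blocker", companyId: "c1", executionRunId: null, blockerIds: ["leaf"] }],
  ["leaf", { id: "leaf", companyId: "c1", executionRunId: "run-1", blockerIds: [] }],
]);
const runs = new Map<string, Run>([["run-1", { id: "run-1", companyId: "c1", status: "running" }]]);
console.log(classify(issues.get("parent")!, issues, runs).state); // "covered"
```

The same-company check and the `status === "running"` filter are exactly what the cross-company, stale-run, and scheduled-retry tests exercise.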

View File

@@ -125,8 +125,8 @@ function registerServiceMocks() {
async function createApp() {
const [{ issueRoutes }, { errorHandler }] = await Promise.all([
-vi.importActual<typeof import("../routes/issues.js")>("../routes/issues.js"),
-vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
+import("../routes/issues.js"),
+import("../middleware/index.js"),
]);
const app = express();
app.use(express.json());
@@ -173,7 +173,7 @@ function makeClosedWorkspace() {
};
}
-describe("closed isolated workspace issue routes", () => {
+describe.sequential("closed isolated workspace issue routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("@paperclipai/shared/telemetry");
@@ -189,7 +189,7 @@ describe("closed isolated workspace issue routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerServiceMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(makeIssue());
mockExecutionWorkspaceService.getById.mockResolvedValue(makeClosedWorkspace());
});

View File

@@ -113,8 +113,8 @@ function createApp() {
async function installActor(app: express.Express, actor?: Record<string, unknown>) {
const [{ issueRoutes }, { errorHandler }] = await Promise.all([
-vi.importActual<typeof import("../routes/issues.js")>("../routes/issues.js"),
-vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
+import("../routes/issues.js"),
+import("../middleware/index.js"),
]);
app.use((req, _res, next) => {
@@ -159,7 +159,7 @@ function makeComment(overrides: Record<string, unknown> = {}) {
};
}
-describe("issue comment cancel routes", () => {
+describe.sequential("issue comment cancel routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("@paperclipai/shared/telemetry");
@@ -175,7 +175,7 @@ describe("issue comment cancel routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(makeIssue());
mockIssueService.assertCheckoutOwner.mockResolvedValue({ adoptedFromRunId: null });
mockIssueService.getComment.mockResolvedValue(makeComment());

View File
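The hunks below repeatedly wrap synchronous wakeup assertions in a `waitForWakeup` helper, itself a thin wrapper over Vitest's `vi.waitFor`. The retry contract being relied on looks roughly like this (a from-scratch sketch, not Vitest's source):

```typescript
// Sketch of the vi.waitFor retry contract: re-run a throwing assertion
// until it passes or the timeout elapses, so a fire-and-forget side
// effect can be asserted without a fixed sleep.
async function waitFor(
  assertion: () => void,
  { timeoutMs = 1000, intervalMs = 20 } = {},
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    try {
      assertion();
      return; // assertion finally passed
    } catch (err) {
      if (Date.now() >= deadline) throw err; // surface the last failure
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}

// Usage: the wakeup lands asynchronously; poll until it is observed.
let wakeupCalls = 0;
setTimeout(() => {
  wakeupCalls += 1;
}, 30);

await waitFor(() => {
  if (wakeupCalls === 0) throw new Error("wakeup not observed yet");
});
console.log("wakeups:", wakeupCalls);
```

This is why the diff also switches the affected suites to `describe.sequential`: polling-based assertions against shared mocks are only reliable when the tests in a file do not interleave.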

@@ -28,6 +28,7 @@ const mockHeartbeatService = vi.hoisted(() => ({
const mockAgentService = vi.hoisted(() => ({
getById: vi.fn(),
list: vi.fn(),
resolveByReference: vi.fn(),
}));
@@ -61,80 +62,82 @@ const mockIssueThreadInteractionService = vi.hoisted(() => ({
expireRequestConfirmationsSupersededByComment: vi.fn(async () => []),
expireStaleRequestConfirmationsForIssueDocument: vi.fn(async () => []),
}));
const mockIssueTreeControlService = vi.hoisted(() => ({
getActivePauseHoldGate: vi.fn(async () => null),
}));
function registerModuleMocks() {
-vi.doMock("@paperclipai/shared/telemetry", () => ({
-trackAgentTaskCompleted: vi.fn(),
-trackErrorHandlerCrash: vi.fn(),
-}));
+vi.mock("@paperclipai/shared/telemetry", () => ({
+trackAgentTaskCompleted: vi.fn(),
+trackErrorHandlerCrash: vi.fn(),
+}));
-vi.doMock("../telemetry.js", () => ({
-getTelemetryClient: vi.fn(() => ({ track: vi.fn() })),
-}));
+vi.mock("../telemetry.js", () => ({
+getTelemetryClient: vi.fn(() => ({ track: vi.fn() })),
+}));
-vi.doMock("../services/access.js", () => ({
-accessService: () => mockAccessService,
-}));
+vi.mock("../services/access.js", () => ({
+accessService: () => mockAccessService,
+}));
-vi.doMock("../services/activity-log.js", () => ({
-logActivity: mockLogActivity,
-}));
+vi.mock("../services/activity-log.js", () => ({
+logActivity: mockLogActivity,
+}));
-vi.doMock("../services/agents.js", () => ({
-agentService: () => mockAgentService,
-}));
+vi.mock("../services/agents.js", () => ({
+agentService: () => mockAgentService,
+}));
-vi.doMock("../services/feedback.js", () => ({
-feedbackService: () => mockFeedbackService,
-}));
+vi.mock("../services/feedback.js", () => ({
+feedbackService: () => mockFeedbackService,
+}));
-vi.doMock("../services/heartbeat.js", () => ({
-heartbeatService: () => mockHeartbeatService,
-}));
+vi.mock("../services/heartbeat.js", () => ({
+heartbeatService: () => mockHeartbeatService,
+}));
-vi.doMock("../services/instance-settings.js", () => ({
-instanceSettingsService: () => mockInstanceSettingsService,
-}));
+vi.mock("../services/instance-settings.js", () => ({
+instanceSettingsService: () => mockInstanceSettingsService,
+}));
-vi.doMock("../services/issues.js", () => ({
-issueService: () => mockIssueService,
-}));
+vi.mock("../services/issues.js", () => ({
+issueService: () => mockIssueService,
+}));
-vi.doMock("../services/routines.js", () => ({
-routineService: () => mockRoutineService,
-}));
+vi.mock("../services/routines.js", () => ({
+routineService: () => mockRoutineService,
+}));
vi.doMock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
documentService: () => ({}),
executionWorkspaceService: () => ({}),
feedbackService: () => mockFeedbackService,
goalService: () => ({}),
heartbeatService: () => mockHeartbeatService,
instanceSettingsService: () => mockInstanceSettingsService,
issueApprovalService: () => ({}),
issueReferenceService: () => ({
deleteDocumentSource: async () => undefined,
diffIssueReferenceSummary: () => ({
addedReferencedIssues: [],
removedReferencedIssues: [],
currentReferencedIssues: [],
}),
emptySummary: () => ({ outbound: [], inbound: [] }),
listIssueReferenceSummary: async () => ({ outbound: [], inbound: [] }),
syncComment: async () => undefined,
syncDocument: async () => undefined,
syncIssue: async () => undefined,
vi.mock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
documentService: () => ({}),
executionWorkspaceService: () => ({}),
feedbackService: () => mockFeedbackService,
goalService: () => ({}),
heartbeatService: () => mockHeartbeatService,
instanceSettingsService: () => mockInstanceSettingsService,
issueApprovalService: () => ({}),
issueReferenceService: () => ({
deleteDocumentSource: async () => undefined,
diffIssueReferenceSummary: () => ({
addedReferencedIssues: [],
removedReferencedIssues: [],
currentReferencedIssues: [],
}),
issueService: () => mockIssueService,
issueThreadInteractionService: () => mockIssueThreadInteractionService,
logActivity: mockLogActivity,
projectService: () => ({}),
routineService: () => mockRoutineService,
workProductService: () => ({}),
}));
}
emptySummary: () => ({ outbound: [], inbound: [] }),
listIssueReferenceSummary: async () => ({ outbound: [], inbound: [] }),
syncComment: async () => undefined,
syncDocument: async () => undefined,
syncIssue: async () => undefined,
}),
issueService: () => mockIssueService,
issueThreadInteractionService: () => mockIssueThreadInteractionService,
issueTreeControlService: () => mockIssueTreeControlService,
logActivity: mockLogActivity,
projectService: () => ({}),
routineService: () => mockRoutineService,
workProductService: () => ({}),
}));
function createApp() {
const app = express();
@@ -144,8 +147,8 @@ function createApp() {
async function installActor(app: express.Express, actor?: Record<string, unknown>) {
const [{ issueRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/issues.js")>("../routes/issues.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
import("../routes/issues.js"),
import("../middleware/index.js"),
]);
app.use((req, _res, next) => {
(req as any).actor = actor ?? {
@@ -173,7 +176,7 @@ async function normalizePolicy(input: {
return normalizeIssueExecutionPolicy(input);
}
-function makeIssue(status: "todo" | "done" | "blocked") {
+function makeIssue(status: "todo" | "done" | "blocked" | "cancelled" | "in_progress") {
return {
id: "11111111-1111-4111-8111-111111111111",
companyId: "company-1",
@@ -186,25 +189,23 @@ function makeIssue(status: "todo" | "done" | "blocked") {
};
}
-describe("issue comment reopen routes", () => {
+function agentActor(agentId = "22222222-2222-4222-8222-222222222222") {
+return {
+type: "agent",
+agentId,
+companyId: "company-1",
+source: "agent_key",
+runId: "run-1",
+};
+}
+async function waitForWakeup(assertion: () => void) {
+await vi.waitFor(assertion);
+}
+describe.sequential("issue comment reopen routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("@paperclipai/shared/telemetry");
vi.doUnmock("../telemetry.js");
vi.doUnmock("../services/access.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/agents.js");
vi.doUnmock("../services/feedback.js");
vi.doUnmock("../services/heartbeat.js");
vi.doUnmock("../services/index.js");
vi.doUnmock("../services/instance-settings.js");
vi.doUnmock("../services/issues.js");
vi.doUnmock("../services/routines.js");
vi.doUnmock("../routes/issues.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
-vi.resetAllMocks();
+vi.clearAllMocks();
mockIssueService.getById.mockReset();
mockIssueService.assertCheckoutOwner.mockReset();
mockIssueService.update.mockReset();
@@ -221,6 +222,7 @@ describe("issue comment reopen routes", () => {
mockHeartbeatService.getActiveRunForAgent.mockReset();
mockHeartbeatService.cancelRun.mockReset();
mockAgentService.getById.mockReset();
mockAgentService.list.mockReset();
mockAgentService.resolveByReference.mockReset();
mockLogActivity.mockReset();
mockFeedbackService.listIssueVotesForUser.mockReset();
@@ -228,6 +230,7 @@ describe("issue comment reopen routes", () => {
mockInstanceSettingsService.get.mockReset();
mockInstanceSettingsService.listCompanyIds.mockReset();
mockRoutineService.syncRunStatusForIssue.mockReset();
mockIssueTreeControlService.getActivePauseHoldGate.mockReset();
mockTxInsertValues.mockReset();
mockTxInsert.mockReset();
mockDb.transaction.mockReset();
@@ -255,6 +258,7 @@ describe("issue comment reopen routes", () => {
});
mockInstanceSettingsService.listCompanyIds.mockResolvedValue(["company-1"]);
mockRoutineService.syncRunStatusForIssue.mockResolvedValue(undefined);
mockIssueTreeControlService.getActivePauseHoldGate.mockResolvedValue(null);
mockIssueService.addComment.mockResolvedValue({
id: "comment-1",
issueId: "11111111-1111-4111-8111-111111111111",
@@ -280,12 +284,36 @@ describe("issue comment reopen routes", () => {
mockAccessService.canUser.mockResolvedValue(false);
mockAccessService.hasPermission.mockResolvedValue(false);
mockAgentService.getById.mockResolvedValue(null);
-mockAgentService.resolveByReference.mockImplementation(async (_companyId: string, reference: string) => ({
-ambiguous: false,
-agent: {
-id: reference,
-},
-}));
+mockAgentService.list.mockResolvedValue([
+{
+id: "22222222-2222-4222-8222-222222222222",
+reportsTo: null,
+permissions: { canCreateAgents: false },
+},
+{
+id: "44444444-4444-4444-8444-444444444444",
+reportsTo: null,
+permissions: { canCreateAgents: false },
+},
+]);
+mockAgentService.resolveByReference.mockImplementation(async (_companyId: string, reference: string) => {
+if (reference === "ambiguous-codex") {
+return { ambiguous: true, agent: null };
+}
+if (reference === "missing-codex") {
+return { ambiguous: false, agent: null };
+}
+if (reference === "codexcoder") {
+return {
+ambiguous: false,
+agent: { id: "33333333-3333-4333-8333-333333333333" },
+};
+}
+return {
+ambiguous: false,
+agent: { id: reference },
+};
+});
});
it("treats reopen=true as a no-op when the issue is already open", async () => {
@@ -350,10 +378,6 @@ describe("issue comment reopen routes", () => {
...makeIssue("todo"),
...patch,
}));
-mockAgentService.resolveByReference.mockResolvedValue({
-ambiguous: false,
-agent: { id: "33333333-3333-4333-8333-333333333333" },
-});
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
@@ -371,14 +395,10 @@ describe("issue comment reopen routes", () => {
it("rejects ambiguous assignee shortnames", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("todo"));
-mockAgentService.resolveByReference.mockResolvedValue({
-ambiguous: true,
-agent: null,
-});
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
-.send({ assigneeAgentId: "codexcoder" });
+.send({ assigneeAgentId: "ambiguous-codex" });
expect(res.status).toBe(409);
expect(res.body.error).toContain("ambiguous");
@@ -387,14 +407,10 @@ describe("issue comment reopen routes", () => {
it("rejects missing assignee shortnames", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("todo"));
-mockAgentService.resolveByReference.mockResolvedValue({
-ambiguous: false,
-agent: null,
-});
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
-.send({ assigneeAgentId: "codexcoder" });
+.send({ assigneeAgentId: "missing-codex" });
expect(res.status).toBe(404);
expect(res.body.error).toBe("Agent not found");
@@ -450,7 +466,7 @@ describe("issue comment reopen routes", () => {
"11111111-1111-4111-8111-111111111111",
{ status: "todo" },
);
-expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
+await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_reopened_via_comment",
@@ -458,7 +474,38 @@ describe("issue comment reopen routes", () => {
reopenedFrom: "done",
}),
}),
));
});
it("does not implicitly reopen closed issues via POST comments for agent-authored comments", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
mockIssueService.addComment.mockResolvedValue({
id: "comment-1",
issueId: "11111111-1111-4111-8111-111111111111",
companyId: "company-1",
body: "hello",
createdAt: new Date(),
updatedAt: new Date(),
authorAgentId: "33333333-3333-4333-8333-333333333333",
authorUserId: null,
});
const res = await request(await installActor(createApp(), {
type: "agent",
agentId: "33333333-3333-4333-8333-333333333333",
companyId: "company-1",
source: "agent_key",
runId: "77777777-7777-4777-8777-777777777777",
}))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "hello" });
expect(res.status).toBe(201);
expect(mockIssueService.update).not.toHaveBeenCalledWith(
"11111111-1111-4111-8111-111111111111",
{ status: "todo" },
);
expect(mockHeartbeatService.wakeup).not.toHaveBeenCalled();
});
it("moves assigned blocked issues back to todo via POST comments", async () => {
@@ -477,7 +524,7 @@ describe("issue comment reopen routes", () => {
"11111111-1111-4111-8111-111111111111",
{ status: "todo" },
);
-expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
+await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_reopened_via_comment",
@@ -493,7 +540,7 @@ describe("issue comment reopen routes", () => {
reopenedFrom: "blocked",
}),
}),
-);
+));
});
it("does not move dependency-blocked issues to todo via POST comments", async () => {
@@ -513,7 +560,7 @@ describe("issue comment reopen routes", () => {
expect(res.status).toBe(201);
expect(mockIssueService.update).not.toHaveBeenCalled();
-expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
+await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_commented",
@@ -527,7 +574,7 @@ describe("issue comment reopen routes", () => {
wakeReason: "issue_commented",
}),
}),
-);
+));
});
it("does not implicitly reopen closed issues via POST comments when no agent is assigned", async () => {
@@ -565,7 +612,7 @@ describe("issue comment reopen routes", () => {
actorUserId: "local-board",
}),
);
-expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
+await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_reopened_via_comment",
@@ -575,7 +622,42 @@ describe("issue comment reopen routes", () => {
mutation: "comment",
}),
}),
));
});
it("does not implicitly reopen closed issues via the PATCH comment path for agent-authored comments", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
mockIssueService.addComment.mockResolvedValue({
id: "comment-1",
issueId: "11111111-1111-4111-8111-111111111111",
companyId: "company-1",
body: "hello",
createdAt: new Date(),
updatedAt: new Date(),
authorAgentId: "33333333-3333-4333-8333-333333333333",
authorUserId: null,
});
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...makeIssue("done"),
...patch,
}));
const res = await request(await installActor(createApp(), {
type: "agent",
agentId: "33333333-3333-4333-8333-333333333333",
companyId: "company-1",
source: "agent_key",
runId: "88888888-8888-4888-8888-888888888888",
}))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ comment: "hello" });
expect(res.status).toBe(200);
expect(mockIssueService.update).not.toHaveBeenCalledWith(
"11111111-1111-4111-8111-111111111111",
expect.objectContaining({ status: "todo" }),
);
expect(mockHeartbeatService.wakeup).not.toHaveBeenCalled();
});
it("does not move dependency-blocked issues to todo via the PATCH comment path", async () => {
@@ -609,7 +691,7 @@ describe("issue comment reopen routes", () => {
"11111111-1111-4111-8111-111111111111",
expect.objectContaining({ status: "todo" }),
);
-expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
+await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_commented",
@@ -618,7 +700,7 @@ describe("issue comment reopen routes", () => {
mutation: "comment",
}),
}),
-);
+));
});
it("wakes the assignee when an assigned blocked issue moves back to todo", async () => {
@@ -630,6 +712,34 @@ describe("issue comment reopen routes", () => {
updatedAt: new Date(),
}));
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ status: "todo" });
expect(res.status).toBe(200);
await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
source: "automation",
triggerDetail: "system",
reason: "issue_status_changed",
payload: expect.objectContaining({
issueId: "11111111-1111-4111-8111-111111111111",
mutation: "update",
}),
}),
));
});
it("wakes the assignee when an assigned done issue moves back to todo", async () => {
const issue = makeIssue("done");
mockIssueService.getById.mockResolvedValue(issue);
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...issue,
...patch,
updatedAt: new Date(),
}));
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ status: "todo" });
@@ -645,9 +755,166 @@ describe("issue comment reopen routes", () => {
issueId: "11111111-1111-4111-8111-111111111111",
mutation: "update",
}),
contextSnapshot: expect.objectContaining({
issueId: "11111111-1111-4111-8111-111111111111",
source: "issue.status_change",
}),
}),
);
});
it("explicit same-agent resume works through the PATCH comment path", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...makeIssue("done"),
...patch,
}));
const res = await request(await installActor(createApp(), agentActor()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ comment: "please validate the follow-up", resume: true });
expect(res.status).toBe(200);
expect(mockIssueService.update).toHaveBeenCalledWith(
"11111111-1111-4111-8111-111111111111",
expect.objectContaining({
status: "todo",
actorAgentId: "22222222-2222-4222-8222-222222222222",
actorUserId: null,
}),
);
expect(mockLogActivity).toHaveBeenCalledWith(
expect.anything(),
expect.objectContaining({
action: "issue.comment_added",
details: expect.objectContaining({
commentId: "comment-1",
resumeIntent: true,
followUpRequested: true,
}),
}),
);
expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_reopened_via_comment",
payload: expect.objectContaining({
commentId: "comment-1",
reopenedFrom: "done",
resumeIntent: true,
followUpRequested: true,
}),
}),
);
});
it("keeps generic same-agent comments on closed issues inert", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
const res = await request(await installActor(createApp(), agentActor()))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "follow-up note without intent" });
expect(res.status).toBe(201);
expect(mockIssueService.update).not.toHaveBeenCalled();
expect(mockHeartbeatService.wakeup).not.toHaveBeenCalled();
});
it("explicit same-agent resume comments reopen closed issues and mark the wake payload", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...makeIssue("done"),
...patch,
}));
const res = await request(await installActor(createApp(), agentActor()))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "please validate the follow-up", resume: true });
expect(res.status).toBe(201);
expect(mockIssueService.update).toHaveBeenCalledWith(
"11111111-1111-4111-8111-111111111111",
{ status: "todo" },
);
expect(mockLogActivity).toHaveBeenCalledWith(
expect.anything(),
expect.objectContaining({
action: "issue.comment_added",
details: expect.objectContaining({
commentId: "comment-1",
resumeIntent: true,
followUpRequested: true,
}),
}),
);
expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "issue_reopened_via_comment",
payload: expect.objectContaining({
commentId: "comment-1",
reopenedFrom: "done",
resumeIntent: true,
followUpRequested: true,
}),
contextSnapshot: expect.objectContaining({
wakeReason: "issue_reopened_via_comment",
resumeIntent: true,
followUpRequested: true,
}),
}),
);
});
it("rejects explicit agent resume intent from a non-assignee", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
const res = await request(await installActor(createApp(), agentActor("44444444-4444-4444-8444-444444444444")))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "restart someone else's work", resume: true });
expect(res.status).toBe(403);
expect(res.body.error).toBe("Agent cannot request follow-up for another agent's issue");
expect(mockIssueService.update).not.toHaveBeenCalled();
expect(mockIssueService.addComment).not.toHaveBeenCalled();
expect(mockHeartbeatService.wakeup).not.toHaveBeenCalled();
});
it("rejects explicit resume intent under an active pause hold", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("done"));
mockIssueTreeControlService.getActivePauseHoldGate.mockResolvedValue({
holdId: "hold-1",
rootIssueId: "root-1",
issueId: "11111111-1111-4111-8111-111111111111",
isRoot: false,
mode: "pause",
reason: "reviewing",
releasePolicy: null,
});
const res = await request(await installActor(createApp(), agentActor()))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "please resume", resume: true });
expect(res.status).toBe(409);
expect(res.body.error).toBe("Issue follow-up blocked by active subtree pause hold");
expect(mockIssueService.update).not.toHaveBeenCalled();
expect(mockIssueService.addComment).not.toHaveBeenCalled();
});
it("rejects explicit resume intent on cancelled issues", async () => {
mockIssueService.getById.mockResolvedValue(makeIssue("cancelled"));
const res = await request(await installActor(createApp(), agentActor()))
.post("/api/issues/11111111-1111-4111-8111-111111111111/comments")
.send({ body: "please resume", resume: true });
expect(res.status).toBe(409);
expect(res.body.error).toBe("Cancelled issues must be restored through the dedicated restore flow");
expect(mockIssueService.update).not.toHaveBeenCalled();
expect(mockIssueService.addComment).not.toHaveBeenCalled();
});
it("interrupts an active run before a combined comment update", async () => {
const issue = {
...makeIssue("todo"),
@@ -690,6 +957,73 @@ describe("issue comment reopen routes", () => {
);
});
it("cancels an active run when an issue is marked cancelled", async () => {
const issue = {
...makeIssue("in_progress"),
executionRunId: "run-1",
};
mockIssueService.getById.mockResolvedValue(issue);
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...issue,
...patch,
}));
mockHeartbeatService.getRun.mockResolvedValue({
id: "run-1",
companyId: "company-1",
agentId: "22222222-2222-4222-8222-222222222222",
status: "running",
});
mockHeartbeatService.cancelRun.mockResolvedValue({
id: "run-1",
companyId: "company-1",
agentId: "22222222-2222-4222-8222-222222222222",
status: "cancelled",
});
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ status: "cancelled" });
expect(res.status).toBe(200);
expect(mockHeartbeatService.getRun).toHaveBeenCalledWith("run-1");
expect(mockHeartbeatService.cancelRun).toHaveBeenCalledWith("run-1");
expect(mockLogActivity).toHaveBeenCalledWith(
expect.anything(),
expect.objectContaining({
action: "heartbeat.cancelled",
details: expect.objectContaining({
source: "issue_status_cancelled",
issueId: "11111111-1111-4111-8111-111111111111",
}),
}),
);
});
it("does not cancel active runs when an issue is marked done", async () => {
const issue = {
...makeIssue("in_progress"),
executionRunId: "run-1",
};
mockIssueService.getById.mockResolvedValue(issue);
mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
...issue,
...patch,
}));
mockHeartbeatService.getRun.mockResolvedValue({
id: "run-1",
companyId: "company-1",
agentId: "22222222-2222-4222-8222-222222222222",
status: "running",
});
const res = await request(await installActor(createApp()))
.patch("/api/issues/11111111-1111-4111-8111-111111111111")
.send({ status: "done" });
expect(res.status).toBe(200);
expect(mockHeartbeatService.cancelRun).not.toHaveBeenCalled();
});
it("writes decision ids into executionState and inserts the decision inside the transaction", async () => {
const policy = await normalizePolicy({
stages: [
@@ -818,7 +1152,7 @@ describe("issue comment reopen routes", () => {
instructions: "Please verify the fix against the reproduction steps and note any residual risk.",
},
});
expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"33333333-3333-4333-8333-333333333333",
expect.objectContaining({
reason: "execution_review_requested",
@@ -834,7 +1168,7 @@ describe("issue comment reopen routes", () => {
}),
}),
}),
);
));
});
it("wakes the return assignee with execution_changes_requested", async () => {
@@ -886,7 +1220,7 @@ describe("issue comment reopen routes", () => {
});
expect(res.status).toBe(200);
expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
await waitForWakeup(() => expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(
"22222222-2222-4222-8222-222222222222",
expect.objectContaining({
reason: "execution_changes_requested",
@@ -900,6 +1234,6 @@ describe("issue comment reopen routes", () => {
}),
}),
}),
);
));
});
});
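The hunks above swap direct `expect(mockHeartbeatService.wakeup).toHaveBeenCalledWith(...)` assertions for `await waitForWakeup(() => expect(...))`, polling until the asynchronous wakeup side effect lands instead of asserting immediately. A minimal sketch of such a retry helper (hypothetical signature and defaults; the suite's real `waitForWakeup` may differ):

```typescript
// Retry an assertion callback until it stops throwing or a deadline passes.
// Useful when the asserted side effect is scheduled on a later microtask/timer.
async function waitForWakeup(
  assertion: () => void,
  timeoutMs = 1000,
  intervalMs = 25,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    try {
      assertion(); // succeeded: the side effect has landed
      return;
    } catch (err) {
      if (Date.now() >= deadline) throw err; // surface the last failure
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}
```

Because the original assertion error is rethrown on timeout, a genuine failure still reports the usual matcher diff rather than a generic timeout message.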


@@ -100,7 +100,7 @@ describe("issue dependency wakeups in issue routes", () => {
vi.doUnmock("../routes/issues.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.getAncestors.mockResolvedValue([]);
mockIssueService.getComment.mockResolvedValue(null);
mockIssueService.getCommentCursor.mockResolvedValue({

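Several suites in this diff replace `vi.resetAllMocks()` with `vi.clearAllMocks()` in `beforeEach`. The distinction matters here: reset also discards mock implementations, so `mockResolvedValue` defaults wired up at module scope would vanish between tests, while clear only wipes call history. A hand-rolled stand-in (not vitest's real implementation) that illustrates the semantics:

```typescript
// Simplified model of a vitest mock: recorded calls plus an implementation.
type MockFn = { calls: unknown[][]; impl?: (...args: unknown[]) => unknown };

const makeMock = (impl?: MockFn["impl"]): MockFn => ({ calls: [], impl });

// vi.clearAllMocks(): call history is wiped, implementations survive.
const clear = (m: MockFn) => { m.calls = []; };

// vi.resetAllMocks(): history AND implementations are wiped, so defaults
// like mockResolvedValue(...) set outside beforeEach would be lost too.
const reset = (m: MockFn) => { m.calls = []; m.impl = undefined; };

const getById = makeMock(() => ({ id: "issue-1" }));
getById.calls.push(["issue-1"]);

clear(getById);
console.log(getById.calls.length, typeof getById.impl); // → 0 function

reset(getById);
console.log(typeof getById.impl); // → undefined
```

That is why the switch to top-level `vi.mock` factories in this PR pairs naturally with `clearAllMocks`: the hoisted mock wiring is set up once and must survive every `beforeEach`.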

@@ -177,7 +177,7 @@ describe("issue document revision routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue({
id: issueId,
companyId,


@@ -67,6 +67,7 @@ describe("issue graph liveness classifier", () => {
issueId: blockedId,
identifier: "PAP-1703",
state: "blocked_by_unassigned_issue",
recoveryIssueId: blockerId,
recommendedOwnerAgentId: managerId,
dependencyPath: [
expect.objectContaining({ issueId: blockedId }),
@@ -76,6 +77,57 @@ describe("issue graph liveness classifier", () => {
});
});
it("does not use free-form executive role or name matching for recovery ownership", () => {
const rootAgentId = "root-agent";
const spoofedExecutiveId = "spoofed-executive";
const findings = classifyIssueGraphLiveness({
issues: [
issue({
assigneeAgentId: null,
createdByAgentId: null,
}),
issue({
id: blockerId,
identifier: "PAP-1704",
title: "Missing unblock work",
status: "todo",
assigneeAgentId: null,
createdByAgentId: null,
}),
],
relations: blocks,
agents: [
agent({
id: spoofedExecutiveId,
name: "Chief Executive Recovery",
role: "cto",
title: "CEO",
reportsTo: rootAgentId,
}),
agent({
id: rootAgentId,
name: "Root Operator",
role: "operator",
title: null,
reportsTo: null,
}),
],
});
expect(findings).toHaveLength(1);
expect(findings[0]?.recommendedOwnerAgentId).toBe(rootAgentId);
expect(findings[0]?.recommendedOwnerCandidates[0]).toMatchObject({
agentId: rootAgentId,
reason: "root_agent",
sourceIssueId: blockerId,
});
expect(findings[0]?.recommendedOwnerCandidateAgentIds).toEqual([
rootAgentId,
spoofedExecutiveId,
]);
});
it("does not flag a live blocked chain with an active assignee and wake path", () => {
const findings = classifyIssueGraphLiveness({
issues: [


@@ -104,7 +104,7 @@ describe("issue telemetry routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
mockIssueService.getById.mockResolvedValue(makeIssue("todo"));
mockIssueService.getWakeableParentAfterChildCompletion.mockResolvedValue(null);


@@ -136,7 +136,7 @@ async function createApp(actor: Record<string, unknown> = {
return app;
}
describe("issue thread interaction routes", () => {
describe.sequential("issue thread interaction routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../routes/issues.js");
@@ -144,7 +144,7 @@ describe("issue thread interaction routes", () => {
vi.doUnmock("../middleware/index.js");
vi.doUnmock("../services/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(createIssue());
mockInteractionService.listForIssue.mockResolvedValue([]);
mockInteractionService.create.mockResolvedValue({


@@ -195,6 +195,61 @@ describe("issue tree control routes", () => {
);
});
it("still marks affected issues cancelled when run interruption fails", async () => {
const app = await createApp({
type: "board",
userId: "user-1",
companyIds: ["company-2"],
source: "session",
isInstanceAdmin: false,
});
mockTreeControlService.createHold.mockResolvedValue({
hold: {
id: "33333333-3333-4333-8333-333333333333",
mode: "cancel",
reason: "cancel subtree",
},
preview: {
mode: "cancel",
totals: { affectedIssues: 1 },
warnings: [],
activeRuns: [
{
id: "44444444-4444-4444-8444-444444444444",
issueId: "11111111-1111-4111-8111-111111111111",
},
],
},
});
mockTreeControlService.cancelIssueStatusesForHold.mockResolvedValue({
updatedIssueIds: ["11111111-1111-4111-8111-111111111111"],
updatedIssues: [],
});
mockHeartbeatService.cancelRun.mockRejectedValue(new Error("adapter process did not exit"));
const res = await request(app)
.post("/api/issues/11111111-1111-4111-8111-111111111111/tree-holds")
.send({ mode: "cancel", reason: "cancel subtree" });
expect(res.status).toBe(201);
expect(mockHeartbeatService.cancelRun).toHaveBeenCalledWith("44444444-4444-4444-8444-444444444444");
expect(mockTreeControlService.cancelIssueStatusesForHold).toHaveBeenCalledWith(
"company-2",
"11111111-1111-4111-8111-111111111111",
"33333333-3333-4333-8333-333333333333",
);
expect(mockLogActivity).toHaveBeenCalledWith(
expect.anything(),
expect.objectContaining({
action: "issue.tree_hold_run_interrupt_failed",
entityId: "44444444-4444-4444-8444-444444444444",
details: expect.objectContaining({
error: "adapter process did not exit",
}),
}),
);
});
it("restores affected issues and can request explicit wakeups", async () => {
const app = await createApp({
type: "board",


@@ -3,9 +3,11 @@ import { eq, inArray } from "drizzle-orm";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import {
agents,
agentWakeupRequests,
companies,
createDb,
heartbeatRuns,
issueComments,
issueTreeHoldMembers,
issueTreeHolds,
issues,
@@ -38,8 +40,10 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
afterEach(async () => {
await db.delete(issueTreeHoldMembers);
await db.delete(issueTreeHolds);
await db.delete(issueComments);
await db.delete(issues);
await db.delete(heartbeatRuns);
await db.delete(agentWakeupRequests);
await db.delete(agents);
await db.delete(companies);
});
@@ -340,6 +344,12 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
const childIssueId = randomUUID();
const rootRunId = randomUUID();
const childRunId = randomUUID();
const forgedRunId = randomUUID();
const rootWakeupRequestId = randomUUID();
const childWakeupRequestId = randomUUID();
const forgedWakeupRequestId = randomUUID();
const rootCommentId = randomUUID();
const childCommentId = randomUUID();
await db.insert(companies).values({
id: companyId,
@@ -377,6 +387,63 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
assigneeAgentId: agentId,
},
]);
await db.insert(issueComments).values([
{
id: rootCommentId,
companyId,
issueId: rootIssueId,
authorUserId: "board-user",
body: "Please answer this root issue question.",
},
{
id: childCommentId,
companyId,
issueId: childIssueId,
authorUserId: "board-user",
body: "Please answer this child issue question.",
},
]);
await db.insert(agentWakeupRequests).values([
{
id: rootWakeupRequestId,
companyId,
agentId,
source: "automation",
triggerDetail: "system",
reason: "issue_commented",
payload: { issueId: rootIssueId, commentId: rootCommentId },
status: "queued",
requestedByActorType: "user",
requestedByActorId: "board-user",
runId: rootRunId,
},
{
id: forgedWakeupRequestId,
companyId,
agentId,
source: "on_demand",
triggerDetail: "manual",
reason: "issue_commented",
payload: { issueId: childIssueId, commentId: childCommentId },
status: "queued",
requestedByActorType: "agent",
requestedByActorId: agentId,
runId: forgedRunId,
},
{
id: childWakeupRequestId,
companyId,
agentId,
source: "automation",
triggerDetail: "system",
reason: "issue_commented",
payload: { issueId: childIssueId, commentId: childCommentId },
status: "queued",
requestedByActorType: "user",
requestedByActorId: "board-user",
runId: childRunId,
},
]);
await db.insert(heartbeatRuns).values([
{
id: rootRunId,
@@ -385,7 +452,29 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
invocationSource: "automation",
triggerDetail: "system",
status: "queued",
contextSnapshot: { issueId: rootIssueId, wakeReason: "issue_commented", commentId: randomUUID() },
wakeupRequestId: rootWakeupRequestId,
contextSnapshot: {
issueId: rootIssueId,
wakeReason: "issue_commented",
commentId: rootCommentId,
wakeCommentId: rootCommentId,
source: "issue.comment",
},
},
{
id: forgedRunId,
companyId,
agentId,
invocationSource: "on_demand",
triggerDetail: "manual",
status: "queued",
wakeupRequestId: forgedWakeupRequestId,
contextSnapshot: {
issueId: childIssueId,
wakeReason: "issue_commented",
commentId: childCommentId,
wakeCommentId: childCommentId,
},
},
{
id: childRunId,
@@ -394,7 +483,14 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
invocationSource: "automation",
triggerDetail: "system",
status: "queued",
contextSnapshot: { issueId: childIssueId, wakeReason: "issue_commented", commentId: randomUUID() },
wakeupRequestId: childWakeupRequestId,
contextSnapshot: {
issueId: childIssueId,
wakeReason: "issue_commented",
commentId: childCommentId,
wakeCommentId: childCommentId,
source: "issue.comment",
},
},
]);
@@ -413,6 +509,13 @@ describeEmbeddedPostgres("issueTreeControlService", () => {
mode: "pause",
}),
});
await expect(issueSvc.checkout(childIssueId, agentId, ["todo"], forgedRunId)).rejects.toMatchObject({
status: 409,
details: expect.objectContaining({
rootIssueId,
mode: "pause",
}),
});
const checkedOutChild = await issueSvc.checkout(childIssueId, agentId, ["todo"], childRunId);
expect(checkedOutChild.status).toBe("in_progress");


@@ -186,7 +186,7 @@ describe("issue update comment wakeups", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.findMentionedAgents.mockResolvedValue([]);
mockIssueService.getRelationSummaries.mockResolvedValue({ blockedBy: [], blocks: [] });
mockIssueService.listWakeableBlockedDependents.mockResolvedValue([]);


@@ -175,7 +175,7 @@ describe("issue workspace command authorization", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.addComment.mockResolvedValue(null);
mockIssueService.create.mockResolvedValue(makeIssue());
mockIssueService.findMentionedAgents.mockResolvedValue([]);


@@ -1,6 +1,8 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { errorHandler } from "../middleware/index.js";
import { issueRoutes } from "../routes/issues.js";
const mockIssueService = vi.hoisted(() => ({
getById: vi.fn(),
@@ -9,6 +11,7 @@ const mockIssueService = vi.hoisted(() => ({
findMentionedProjectIds: vi.fn(),
getCommentCursor: vi.fn(),
getComment: vi.fn(),
listBlockerAttention: vi.fn(),
listAttachments: vi.fn(),
}));
@@ -31,72 +34,86 @@ const mockExecutionWorkspaceService = vi.hoisted(() => ({
getById: vi.fn(),
}));
function registerModuleMocks() {
vi.doMock("../services/index.js", () => ({
accessService: () => ({
canUser: vi.fn(),
hasPermission: vi.fn(),
}),
agentService: () => ({
getById: vi.fn(),
}),
documentService: () => mockDocumentsService,
environmentService: () => ({}),
executionWorkspaceService: () => mockExecutionWorkspaceService,
feedbackService: () => ({
listIssueVotesForUser: vi.fn(async () => []),
saveIssueVote: vi.fn(async () => ({ vote: null, consentEnabledNow: false, sharingEnabled: false })),
}),
goalService: () => mockGoalService,
heartbeatService: () => ({
wakeup: vi.fn(async () => undefined),
reportRunActivity: vi.fn(async () => undefined),
}),
instanceSettingsService: () => ({
get: vi.fn(async () => ({
id: "instance-settings-1",
general: {
censorUsernameInLogs: false,
feedbackDataSharingPreference: "prompt",
},
})),
listCompanyIds: vi.fn(async () => ["company-1"]),
}),
issueApprovalService: () => ({}),
issueReferenceService: () => ({
deleteDocumentSource: async () => undefined,
diffIssueReferenceSummary: () => ({
addedReferencedIssues: [],
removedReferencedIssues: [],
currentReferencedIssues: [],
}),
emptySummary: () => ({ outbound: [], inbound: [] }),
listIssueReferenceSummary: async () => ({ outbound: [], inbound: [] }),
syncComment: async () => undefined,
syncDocument: async () => undefined,
syncIssue: async () => undefined,
}),
issueService: () => mockIssueService,
logActivity: vi.fn(async () => undefined),
projectService: () => mockProjectService,
routineService: () => ({
syncRunStatusForIssue: vi.fn(async () => undefined),
}),
workProductService: () => ({
listForIssue: vi.fn(async () => []),
}),
}));
const mockAccessService = vi.hoisted(() => ({
canUser: vi.fn(),
hasPermission: vi.fn(),
}));
vi.doMock("../services/execution-workspaces.js", () => ({
executionWorkspaceService: () => mockExecutionWorkspaceService,
}));
}
const mockAgentService = vi.hoisted(() => ({
getById: vi.fn(),
}));
async function createApp() {
const [{ issueRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/issues.js")>("../routes/issues.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
]);
const mockFeedbackService = vi.hoisted(() => ({
listIssueVotesForUser: vi.fn(async () => []),
saveIssueVote: vi.fn(async () => ({ vote: null, consentEnabledNow: false, sharingEnabled: false })),
}));
const mockHeartbeatService = vi.hoisted(() => ({
wakeup: vi.fn(async () => undefined),
reportRunActivity: vi.fn(async () => undefined),
}));
const mockInstanceSettingsService = vi.hoisted(() => ({
get: vi.fn(async () => ({
id: "instance-settings-1",
general: {
censorUsernameInLogs: false,
feedbackDataSharingPreference: "prompt",
},
})),
listCompanyIds: vi.fn(async () => ["company-1"]),
}));
const mockIssueReferenceService = vi.hoisted(() => ({
deleteDocumentSource: vi.fn(async () => undefined),
diffIssueReferenceSummary: vi.fn(() => ({
addedReferencedIssues: [],
removedReferencedIssues: [],
currentReferencedIssues: [],
})),
emptySummary: vi.fn(() => ({ outbound: [], inbound: [] })),
listIssueReferenceSummary: vi.fn(async () => ({ outbound: [], inbound: [] })),
syncComment: vi.fn(async () => undefined),
syncDocument: vi.fn(async () => undefined),
syncIssue: vi.fn(async () => undefined),
}));
const mockLogActivity = vi.hoisted(() => vi.fn(async () => undefined));
const mockRoutineService = vi.hoisted(() => ({
syncRunStatusForIssue: vi.fn(async () => undefined),
}));
const mockWorkProductService = vi.hoisted(() => ({
listForIssue: vi.fn(async () => []),
}));
const mockEnvironmentService = vi.hoisted(() => ({}));
vi.mock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
documentService: () => mockDocumentsService,
environmentService: () => mockEnvironmentService,
executionWorkspaceService: () => mockExecutionWorkspaceService,
feedbackService: () => mockFeedbackService,
goalService: () => mockGoalService,
heartbeatService: () => mockHeartbeatService,
instanceSettingsService: () => mockInstanceSettingsService,
issueApprovalService: () => ({}),
issueReferenceService: () => mockIssueReferenceService,
issueService: () => mockIssueService,
logActivity: mockLogActivity,
projectService: () => mockProjectService,
routineService: () => mockRoutineService,
workProductService: () => mockWorkProductService,
}));
vi.mock("../services/execution-workspaces.js", () => ({
executionWorkspaceService: () => mockExecutionWorkspaceService,
}));
function createApp() {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
@@ -146,16 +163,9 @@ const projectGoal = {
updatedAt: new Date("2026-03-20T00:00:00Z"),
};
describe("issue goal context routes", () => {
describe.sequential("issue goal context routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/index.js");
vi.doUnmock("../services/execution-workspaces.js");
vi.doUnmock("../routes/issues.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(legacyProjectLinkedIssue);
mockIssueService.getAncestors.mockResolvedValue([]);
mockIssueService.getRelationSummaries.mockResolvedValue({ blockedBy: [], blocks: [] });
@@ -166,6 +176,7 @@ describe("issue goal context routes", () => {
latestCommentAt: null,
});
mockIssueService.getComment.mockResolvedValue(null);
mockIssueService.listBlockerAttention.mockResolvedValue(new Map());
mockIssueService.listAttachments.mockResolvedValue([]);
mockDocumentsService.getIssueDocumentPayload.mockResolvedValue({});
mockDocumentsService.getIssueDocumentByKey.mockResolvedValue(null);
@@ -211,7 +222,7 @@ describe("issue goal context routes", () => {
});
it("surfaces the project goal from GET /issues/:id when the issue has no direct goal", async () => {
const res = await request(await createApp()).get("/api/issues/11111111-1111-4111-8111-111111111111");
const res = await request(createApp()).get("/api/issues/11111111-1111-4111-8111-111111111111");
expect(res.status).toBe(200);
expect(res.body.goalId).toBe(projectGoal.id);
@@ -229,7 +240,7 @@ describe("issue goal context routes", () => {
});
it("surfaces the project goal from GET /issues/:id/heartbeat-context", async () => {
const res = await request(await createApp()).get(
const res = await request(createApp()).get(
"/api/issues/11111111-1111-4111-8111-111111111111/heartbeat-context",
);
@@ -255,7 +266,7 @@ describe("issue goal context routes", () => {
updatedAt: new Date("2026-04-19T12:00:00.000Z"),
});
const res = await request(await createApp()).get(
const res = await request(createApp()).get(
"/api/issues/11111111-1111-4111-8111-111111111111/heartbeat-context",
);
@@ -286,7 +297,7 @@ describe("issue goal context routes", () => {
blocks: [],
});
const res = await request(await createApp()).get(
const res = await request(createApp()).get(
"/api/issues/11111111-1111-4111-8111-111111111111/heartbeat-context",
);
@@ -321,7 +332,7 @@ describe("issue goal context routes", () => {
],
});
const res = await request(await createApp()).get(
const res = await request(createApp()).get(
"/api/issues/11111111-1111-4111-8111-111111111111/heartbeat-context",
);


@@ -1401,6 +1401,49 @@ describeEmbeddedPostgres("issueService blockers and dependency wake readiness",
expect(blockedRelations.blockedBy.map((relation) => relation.id)).toEqual([blockerId]);
});
it("adds terminal blockers to immediate blocked-by summaries", async () => {
const companyId = randomUUID();
await db.insert(companies).values({
id: companyId,
name: "Paperclip",
issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
requireBoardApprovalForNewAgents: false,
});
const issueA = randomUUID();
const issueB = randomUUID();
const issueC = randomUUID();
const issueD = randomUUID();
await db.insert(issues).values([
{ id: issueA, companyId, identifier: "PAP-1", title: "Issue A", status: "blocked", priority: "medium" },
{ id: issueB, companyId, identifier: "PAP-2", title: "Issue B", status: "blocked", priority: "medium" },
{ id: issueC, companyId, identifier: "PAP-3", title: "Issue C", status: "blocked", priority: "medium" },
{ id: issueD, companyId, identifier: "PAP-4", title: "Issue D", status: "todo", priority: "high" },
]);
await svc.update(issueC, { blockedByIssueIds: [issueD] });
await svc.update(issueB, { blockedByIssueIds: [issueC] });
await svc.update(issueA, { blockedByIssueIds: [issueB] });
const relations = await svc.getRelationSummaries(issueA);
expect(relations.blockedBy).toHaveLength(1);
expect(relations.blockedBy[0]).toMatchObject({
id: issueB,
identifier: "PAP-2",
title: "Issue B",
terminalBlockers: [
expect.objectContaining({
id: issueD,
identifier: "PAP-4",
title: "Issue D",
status: "todo",
priority: "high",
}),
],
});
});
it("rejects blocking cycles", async () => {
const companyId = randomUUID();
await db.insert(companies).values({


@@ -0,0 +1,44 @@
import { describe, expect, it } from "vitest";
import { collectSecretRefPaths } from "../services/json-schema-secret-refs.ts";
describe("collectSecretRefPaths", () => {
it("collects nested secret-ref paths from object properties", () => {
expect(Array.from(collectSecretRefPaths({
type: "object",
properties: {
credentials: {
type: "object",
properties: {
apiKey: { type: "string", format: "secret-ref" },
},
},
},
}))).toEqual(["credentials.apiKey"]);
});
it("collects secret-ref paths from JSON Schema composition keywords", () => {
expect(Array.from(collectSecretRefPaths({
type: "object",
allOf: [
{
properties: {
apiKey: { type: "string", format: "secret-ref" },
},
},
{
properties: {
nested: {
oneOf: [
{
properties: {
token: { type: "string", format: "secret-ref" },
},
},
],
},
},
},
],
})).sort()).toEqual(["apiKey", "nested.token"]);
});
});

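The new test file above exercises `collectSecretRefPaths`, which walks a JSON Schema and yields dotted paths to every `format: "secret-ref"` string, descending through `properties` and through the composition keywords. A hypothetical re-implementation consistent with those two test cases (the real service in `../services/json-schema-secret-refs.ts` may differ):

```typescript
// Minimal JSON Schema shape covering what this walker inspects.
type Schema = {
  type?: string;
  format?: string;
  properties?: Record<string, Schema>;
  allOf?: Schema[];
  anyOf?: Schema[];
  oneOf?: Schema[];
};

function* collectSecretRefPaths(schema: Schema, prefix = ""): Generator<string> {
  if (schema.format === "secret-ref") {
    yield prefix; // leaf: report the dotted path accumulated so far
    return;
  }
  // Object properties extend the path with the property key.
  for (const [key, child] of Object.entries(schema.properties ?? {})) {
    yield* collectSecretRefPaths(child, prefix ? `${prefix}.${key}` : key);
  }
  // Composition keywords merge into the current object, so the prefix is kept.
  const branches = [...(schema.allOf ?? []), ...(schema.anyOf ?? []), ...(schema.oneOf ?? [])];
  for (const branch of branches) {
    yield* collectSecretRefPaths(branch, prefix);
  }
}
```

Keeping the prefix unchanged across `allOf`/`anyOf`/`oneOf` is what makes the second test's `nested.token` path come out right: the `oneOf` branch contributes properties to the same `nested` object rather than a new path segment.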

@@ -48,7 +48,7 @@ describe("llm routes", () => {
vi.doUnmock("../routes/llms.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockListServerAdapters.mockReturnValue([
{ type: "codex_local", agentConfigurationDoc: "# codex_local agent configuration" },
]);


@@ -1,6 +1,8 @@
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { errorHandler } from "../middleware/index.js";
import { accessRoutes } from "../routes/access.js";
const mockAccessService = vi.hoisted(() => ({
hasPermission: vi.fn(),
@@ -36,40 +38,18 @@ const mockStorage = vi.hoisted(() => ({
headObject: vi.fn(),
}));
function registerModuleMocks() {
vi.doMock("../routes/access.js", async () => vi.importActual("../routes/access.js"));
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
vi.doMock("../middleware/index.js", async () => vi.importActual("../middleware/index.js"));
vi.mock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
boardAuthService: () => mockBoardAuthService,
deduplicateAgentName: vi.fn(),
logActivity: mockLogActivity,
notifyHireApproved: vi.fn(),
}));
vi.doMock("../services/access.js", () => ({
accessService: () => mockAccessService,
}));
vi.doMock("../services/activity-log.js", () => ({
logActivity: mockLogActivity,
}));
vi.doMock("../services/agents.js", () => ({
agentService: () => mockAgentService,
}));
vi.doMock("../services/board-auth.js", () => ({
boardAuthService: () => mockBoardAuthService,
}));
vi.doMock("../services/index.js", () => ({
accessService: () => mockAccessService,
agentService: () => mockAgentService,
boardAuthService: () => mockBoardAuthService,
deduplicateAgentName: vi.fn(),
logActivity: mockLogActivity,
notifyHireApproved: vi.fn(),
}));
vi.doMock("../storage/index.js", () => ({
getStorageService: () => mockStorage,
}));
}
vi.mock("../storage/index.js", () => ({
getStorageService: () => mockStorage,
}));
function createSelectChain(rows: unknown[]) {
const query = {
@@ -126,11 +106,7 @@ function createDbStub(...selectResponses: unknown[][]) {
};
}
async function createApp(actor: Record<string, unknown>, db: Record<string, unknown>) {
const [{ accessRoutes }, { errorHandler }] = await Promise.all([
import("../routes/access.js"),
import("../middleware/index.js"),
]);
function createApp(actor: Record<string, unknown>, db: Record<string, unknown>) {
const app = express();
app.use(express.json());
app.use((req, _res, next) => {
@@ -150,7 +126,7 @@ async function createApp(actor: Record<string, unknown>, db: Record<string, unkn
return app;
}
describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
describe.sequential("POST /companies/:companyId/openclaw/invite-prompt", () => {
const companyBranding = {
name: "Acme AI",
brandColor: "#225577",
@@ -165,18 +141,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
};
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/access.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/agents.js");
vi.doUnmock("../services/board-auth.js");
vi.doUnmock("../services/index.js");
vi.doUnmock("../storage/index.js");
vi.doUnmock("../routes/access.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockAccessService.canUser.mockResolvedValue(false);
mockAgentService.getById.mockReset();
mockLogActivity.mockResolvedValue(undefined);
@@ -190,7 +155,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
companyId: "company-1",
role: "engineer",
});
const app = await createApp(
const app = createApp(
{
type: "agent",
agentId: "agent-1",
@@ -215,7 +180,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
companyId: "company-1",
role: "ceo",
});
const app = await createApp(
const app = createApp(
{
type: "agent",
agentId: "agent-1",
@@ -243,7 +208,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
it("includes companyName in invite summary responses", async () => {
const db = createDbStub([companyBranding], [logoAsset]);
const app = await createApp(
const app = createApp(
{
type: "board",
userId: "user-1",
@@ -267,7 +232,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
it("allows board callers with invite permission", async () => {
const db = createDbStub([companyBranding], [logoAsset]);
mockAccessService.canUser.mockResolvedValue(true);
const app = await createApp(
const app = createApp(
{
type: "board",
userId: "user-1",
@@ -291,7 +256,7 @@ describe("POST /companies/:companyId/openclaw/invite-prompt", () => {
it("rejects board callers without invite permission", async () => {
const db = createDbStub();
mockAccessService.canUser.mockResolvedValue(false);
const app = await createApp(
const app = createApp(
{
type: "board",
userId: "user-1",


@@ -16,25 +16,21 @@ const mockLifecycle = vi.hoisted(() => ({
disable: vi.fn(),
}));
function registerRouteMocks() {
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
vi.mock("../services/plugin-registry.js", () => ({
pluginRegistryService: () => mockRegistry,
}));
vi.doMock("../services/plugin-registry.js", () => ({
pluginRegistryService: () => mockRegistry,
}));
vi.mock("../services/plugin-lifecycle.js", () => ({
pluginLifecycleManager: () => mockLifecycle,
}));
vi.doMock("../services/plugin-lifecycle.js", () => ({
pluginLifecycleManager: () => mockLifecycle,
}));
vi.mock("../services/activity-log.js", () => ({
logActivity: vi.fn(),
}));
vi.doMock("../services/activity-log.js", () => ({
logActivity: vi.fn(),
}));
vi.doMock("../services/live-events.js", () => ({
publishGlobalLiveEvent: vi.fn(),
}));
}
vi.mock("../services/live-events.js", () => ({
publishGlobalLiveEvent: vi.fn(),
}));
async function createApp(
actor: Record<string, unknown>,
@@ -47,8 +43,8 @@ async function createApp(
} = {},
) {
const [{ pluginRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/plugins.js")>("../routes/plugins.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
import("../routes/plugins.js"),
import("../middleware/index.js"),
]);
const loader = {
@@ -114,21 +110,9 @@ function readyPlugin() {
});
}
describe("plugin install and upgrade authz", () => {
describe.sequential("plugin install and upgrade authz", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/issues.js");
vi.doUnmock("../services/plugin-config-validator.js");
vi.doUnmock("../services/plugin-loader.js");
vi.doUnmock("../services/plugin-registry.js");
vi.doUnmock("../services/plugin-lifecycle.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/live-events.js");
vi.doUnmock("../routes/plugins.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.resetAllMocks();
vi.clearAllMocks();
});
it("rejects plugin installation for non-admin board users", async () => {
@@ -267,21 +251,9 @@ describe("plugin install and upgrade authz", () => {
}, 20_000);
});
describe("scoped plugin API routes", () => {
describe.sequential("scoped plugin API routes", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/issues.js");
vi.doUnmock("../services/plugin-config-validator.js");
vi.doUnmock("../services/plugin-loader.js");
vi.doUnmock("../services/plugin-registry.js");
vi.doUnmock("../services/plugin-lifecycle.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/live-events.js");
vi.doUnmock("../routes/plugins.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.resetAllMocks();
vi.clearAllMocks();
});
it("dispatches manifest-declared scoped routes after company access checks", async () => {
@@ -345,21 +317,9 @@ describe("scoped plugin API routes", () => {
}, 20_000);
});
describe("plugin tool and bridge authz", () => {
describe.sequential("plugin tool and bridge authz", () => {
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/issues.js");
vi.doUnmock("../services/plugin-config-validator.js");
vi.doUnmock("../services/plugin-loader.js");
vi.doUnmock("../services/plugin-registry.js");
vi.doUnmock("../services/plugin-lifecycle.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/live-events.js");
vi.doUnmock("../routes/plugins.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerRouteMocks();
vi.resetAllMocks();
vi.clearAllMocks();
});
it("rejects tool execution when the board user cannot access runContext.companyId", async () => {
@@ -393,63 +353,67 @@ describe("plugin tool and bridge authz", () => {
expect(executeTool).not.toHaveBeenCalled();
});
it.each([
[
"agentId",
it("rejects tool execution when any runContext reference is outside the company scope", async () => {
const cases: Array<[string, Array<Array<Record<string, unknown>>>]> = [
[
[{ companyId: companyB }],
"agentId",
[
[{ companyId: companyB }],
],
],
],
[
"runId company",
[
[{ companyId: companyA }],
[{ companyId: companyB, agentId: agentA }],
"runId company",
[
[{ companyId: companyA }],
[{ companyId: companyB, agentId: agentA }],
],
],
],
[
"runId agent",
[
[{ companyId: companyA }],
[{ companyId: companyA, agentId: "77777777-7777-4777-8777-777777777777" }],
"runId agent",
[
[{ companyId: companyA }],
[{ companyId: companyA, agentId: "77777777-7777-4777-8777-777777777777" }],
],
],
],
[
"projectId",
[
[{ companyId: companyA }],
[{ companyId: companyA, agentId: agentA }],
[{ companyId: companyB }],
"projectId",
[
[{ companyId: companyA }],
[{ companyId: companyA, agentId: agentA }],
[{ companyId: companyB }],
],
],
],
])("rejects tool execution when runContext.%s is outside the company scope", async (_case, rows) => {
const executeTool = vi.fn();
const { app } = await createApp(boardActor(), {}, {
db: createSelectQueueDb(rows),
toolDeps: {
toolDispatcher: {
listToolsForAgent: vi.fn(),
getTool: vi.fn(() => ({ name: "paperclip.example:search" })),
executeTool,
},
},
});
];
const res = await request(app)
.post("/api/plugins/tools/execute")
.send({
tool: "paperclip.example:search",
parameters: {},
runContext: {
agentId: agentA,
runId: runA,
companyId: companyA,
projectId: projectA,
for (const [label, rows] of cases) {
const executeTool = vi.fn();
const { app } = await createApp(boardActor(), {}, {
db: createSelectQueueDb(rows),
toolDeps: {
toolDispatcher: {
listToolsForAgent: vi.fn(),
getTool: vi.fn(() => ({ name: "paperclip.example:search" })),
executeTool,
},
},
});
expect(res.status).toBe(403);
expect(executeTool).not.toHaveBeenCalled();
const res = await request(app)
.post("/api/plugins/tools/execute")
.send({
tool: "paperclip.example:search",
parameters: {},
runContext: {
agentId: agentA,
runId: runA,
companyId: companyA,
projectId: projectA,
},
});
expect(res.status, label).toBe(403);
expect(executeTool).not.toHaveBeenCalled();
}
});
it("allows tool execution when agent, run, and project all belong to runContext.companyId", async () => {

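The hunk above folds the `it.each` table into one `describe.sequential` test that loops over labeled cases and threads the label into each assertion (`expect(res.status, label)`), so a failure still names the offending case. Stripped of vitest, supertest, and Express, the shape of that pattern is roughly the following; `checkCompanyScope` is a hypothetical stand-in for the route's actual authz check, not code from this repository:

```typescript
// Hypothetical stand-in for the authz check: every row that a
// runContext reference resolves to must belong to the caller's company.
interface Row {
  companyId: string;
}

function checkCompanyScope(companyId: string, rows: Row[]): boolean {
  return rows.every((row) => row.companyId === companyId);
}

const companyA = "aaaaaaaa-aaaa-4aaa-8aaa-aaaaaaaaaaaa";
const companyB = "bbbbbbbb-bbbb-4bbb-8bbb-bbbbbbbbbbbb";

// Labeled case table, mirroring the test's [label, rows] tuples.
const cases: Array<[string, Row[]]> = [
  ["agentId", [{ companyId: companyB }]],
  ["runId company", [{ companyId: companyA }, { companyId: companyB }]],
];

for (const [label, rows] of cases) {
  // Threading the label into the failure message keeps each case
  // attributable even though all cases now share one test body.
  if (checkCompanyScope(companyA, rows)) {
    throw new Error(`${label}: expected out-of-scope rows to be rejected`);
  }
}
```

The trade-off versus `it.each` is one test entry instead of one per case, which is what makes the suite safe to run under `describe.sequential` with shared module mocks.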

@@ -38,30 +38,6 @@ vi.mock("../services/live-events.js", () => ({
publishGlobalLiveEvent: vi.fn(),
}));
function registerModuleMocks() {
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
vi.doMock("../services/plugin-registry.js", () => ({
pluginRegistryService: () => mockRegistry,
}));
vi.doMock("../services/plugin-lifecycle.js", () => ({
pluginLifecycleManager: () => mockLifecycle,
}));
vi.doMock("../services/issues.js", () => ({
issueService: () => mockIssueService,
}));
vi.doMock("../services/activity-log.js", () => ({
logActivity: vi.fn(),
}));
vi.doMock("../services/live-events.js", () => ({
publishGlobalLiveEvent: vi.fn(),
}));
}
function manifest(apiRoutes: NonNullable<PaperclipPluginManifestV1["apiRoutes"]>): PaperclipPluginManifestV1 {
return {
id: "paperclip.scoped-api-test",
@@ -84,8 +60,8 @@ async function createApp(input: {
workerResult?: unknown;
}) {
const [{ pluginRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/plugins.js")>("../routes/plugins.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
import("../routes/plugins.js"),
import("../middleware/index.js"),
]);
const workerManager = {
@@ -118,7 +94,7 @@ async function createApp(input: {
return { app, workerManager };
}
describe("plugin scoped API routes", () => {
describe.sequential("plugin scoped API routes", () => {
const pluginId = "11111111-1111-4111-8111-111111111111";
const companyId = "22222222-2222-4222-8222-222222222222";
const agentId = "33333333-3333-4333-8333-333333333333";
@@ -126,17 +102,7 @@ describe("plugin scoped API routes", () => {
const issueId = "55555555-5555-4555-8555-555555555555";
beforeEach(() => {
vi.resetModules();
vi.doUnmock("../services/plugin-registry.js");
vi.doUnmock("../services/plugin-lifecycle.js");
vi.doUnmock("../services/issues.js");
vi.doUnmock("../services/activity-log.js");
vi.doUnmock("../services/live-events.js");
vi.doUnmock("../routes/plugins.js");
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockIssueService.getById.mockResolvedValue(null);
mockIssueService.assertCheckoutOwner.mockResolvedValue({
id: issueId,


@@ -109,7 +109,7 @@ describe("project and goal telemetry routes", () => {
vi.doUnmock("../routes/authz.js");
vi.doUnmock("../middleware/index.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockGetTelemetryClient.mockReturnValue({ track: mockTelemetryTrack });
mockProjectService.resolveByReference.mockResolvedValue({ ambiguous: false, project: null });
mockEnvironmentService.getById.mockReset();


@@ -145,7 +145,7 @@ describe("project env routes", () => {
vi.doUnmock("../services/environments.js");
vi.doUnmock("../services/secrets.js");
registerModuleMocks();
vi.resetAllMocks();
vi.clearAllMocks();
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
mockProjectService.resolveByReference.mockResolvedValue({ ambiguous: false, project: null });
mockProjectService.createWorkspace.mockResolvedValue(null);


@@ -0,0 +1,146 @@
import { describe, expect, it } from "vitest";
import { classifyIssueGraphLiveness as classifyIssueGraphLivenessCompat } from "../services/issue-liveness.ts";
import { decideRunLivenessContinuation as decideRunLivenessContinuationCompat } from "../services/run-continuations.ts";
import {
RECOVERY_KEY_PREFIXES,
RECOVERY_ORIGIN_KINDS,
RECOVERY_REASON_KINDS,
buildIssueGraphLivenessIncidentKey,
buildIssueGraphLivenessLeafKey,
buildRunLivenessContinuationIdempotencyKey,
classifyIssueGraphLiveness,
decideRunLivenessContinuation,
parseIssueGraphLivenessIncidentKey,
} from "../services/recovery/index.ts";
const companyId = "company-1";
const agentId = "agent-1";
const managerId = "manager-1";
const issueId = "issue-1";
const blockerId = "blocker-1";
const runId = "run-1";
describe("recovery classifier boundary", () => {
it("keeps issue graph liveness classifier parity with the compatibility export", () => {
const input = {
issues: [
{
id: issueId,
companyId,
identifier: "PAP-2073",
title: "Centralize recovery classifiers",
status: "blocked",
assigneeAgentId: agentId,
assigneeUserId: null,
createdByAgentId: null,
createdByUserId: null,
executionState: null,
},
{
id: blockerId,
companyId,
identifier: "PAP-2074",
title: "Move recovery side effects",
status: "todo",
assigneeAgentId: null,
assigneeUserId: null,
createdByAgentId: null,
createdByUserId: null,
executionState: null,
},
],
relations: [{ companyId, blockerIssueId: blockerId, blockedIssueId: issueId }],
agents: [
{
id: agentId,
companyId,
name: "Coder",
role: "engineer",
status: "idle",
reportsTo: managerId,
},
{
id: managerId,
companyId,
name: "CTO",
role: "cto",
status: "idle",
reportsTo: null,
},
],
};
expect(classifyIssueGraphLiveness(input)).toEqual(classifyIssueGraphLivenessCompat(input));
});
it("keeps run liveness continuation decision parity with the compatibility export", () => {
const input = {
run: {
id: runId,
companyId,
agentId,
continuationAttempt: 0,
} as never,
issue: {
id: issueId,
companyId,
identifier: "PAP-2073",
title: "Centralize recovery classifiers",
status: "in_progress",
assigneeAgentId: agentId,
executionState: null,
projectId: null,
} as never,
agent: {
id: agentId,
companyId,
status: "idle",
} as never,
livenessState: "plan_only" as const,
livenessReason: "Planned without acting",
nextAction: "Take the first concrete action.",
budgetBlocked: false,
idempotentWakeExists: false,
};
expect(decideRunLivenessContinuation(input)).toEqual(decideRunLivenessContinuationCompat(input));
});
it("keeps recovery origin and idempotency keys stable", () => {
expect(RECOVERY_ORIGIN_KINDS).toMatchObject({
issueGraphLivenessEscalation: "harness_liveness_escalation",
strandedIssueRecovery: "stranded_issue_recovery",
staleActiveRunEvaluation: "stale_active_run_evaluation",
});
expect(RECOVERY_REASON_KINDS.runLivenessContinuation).toBe("run_liveness_continuation");
expect(RECOVERY_KEY_PREFIXES.issueGraphLivenessIncident).toBe("harness_liveness");
expect(RECOVERY_KEY_PREFIXES.issueGraphLivenessLeaf).toBe("harness_liveness_leaf");
const incidentKey = buildIssueGraphLivenessIncidentKey({
companyId,
issueId,
state: "blocked_by_unassigned_issue",
blockerIssueId: blockerId,
});
expect(incidentKey).toBe(
"harness_liveness:company-1:issue-1:blocked_by_unassigned_issue:blocker-1",
);
expect(parseIssueGraphLivenessIncidentKey(incidentKey)).toEqual({
companyId,
issueId,
state: "blocked_by_unassigned_issue",
leafIssueId: blockerId,
});
expect(buildIssueGraphLivenessLeafKey({
companyId,
state: "blocked_by_unassigned_issue",
leafIssueId: blockerId,
})).toBe("harness_liveness_leaf:company-1:blocked_by_unassigned_issue:blocker-1");
expect(buildRunLivenessContinuationIdempotencyKey({
issueId,
sourceRunId: runId,
livenessState: "plan_only",
nextAttempt: 1,
})).toBe("run_liveness_continuation:issue-1:run-1:plan_only:1");
});
});
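The expected strings in this test fully determine the key layout: a fixed prefix plus colon-delimited fields. The following sketch is inferred from those fixture strings, not copied from the real builders in `services/recovery`, and it assumes IDs never contain `:`:

```typescript
// Colon-delimited recovery keys, with field order taken from the
// expected strings pinned down by the test above.
function buildIncidentKey(input: {
  companyId: string;
  issueId: string;
  state: string;
  blockerIssueId: string;
}): string {
  return [
    "harness_liveness",
    input.companyId,
    input.issueId,
    input.state,
    input.blockerIssueId,
  ].join(":");
}

// Round-tripping only works because individual IDs are assumed colon-free.
function parseIncidentKey(key: string) {
  const [prefix, companyId, issueId, state, leafIssueId] = key.split(":");
  if (prefix !== "harness_liveness") return null;
  return { companyId, issueId, state, leafIssueId };
}

function buildContinuationKey(input: {
  issueId: string;
  sourceRunId: string;
  livenessState: string;
  nextAttempt: number;
}): string {
  return [
    "run_liveness_continuation",
    input.issueId,
    input.sourceRunId,
    input.livenessState,
    String(input.nextAttempt),
  ].join(":");
}
```

Keeping the prefixes and field order stable is exactly what the test guards: any reordering would break idempotency of previously persisted incident and continuation keys.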


@@ -1,5 +1,5 @@
import { describe, expect, it } from "vitest";
import { REDACTED_EVENT_VALUE, redactEventPayload, sanitizeRecord } from "../redaction.js";
import { REDACTED_EVENT_VALUE, redactEventPayload, redactSensitiveText, sanitizeRecord } from "../redaction.js";
describe("redaction", () => {
it("redacts sensitive keys and nested secret values", () => {
@@ -63,4 +63,25 @@ describe("redaction", () => {
safe: "value",
});
});
it("redacts common secret shapes from unstructured text", () => {
const jwt = "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjM0NTY3ODkwIn0.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c";
const githubToken = "ghp_1234567890abcdefghijklmnopqrstuvwxyz";
const input = [
"Authorization: Bearer live-bearer-token-value",
`payload {"apiKey":"json-secret-value"}`,
`escaped {\\"apiKey\\":\\"escaped-json-secret\\"}`,
`GITHUB_TOKEN=${githubToken}`,
`session=${jwt}`,
].join("\n");
const result = redactSensitiveText(input);
expect(result).toContain(REDACTED_EVENT_VALUE);
expect(result).not.toContain("live-bearer-token-value");
expect(result).not.toContain("json-secret-value");
expect(result).not.toContain("escaped-json-secret");
expect(result).not.toContain(githubToken);
expect(result).not.toContain(jwt);
});
});
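The new test exercises `redactSensitiveText` against concrete secret shapes: bearer tokens, GitHub `ghp_` tokens, JWTs, and (escaped) JSON key/value pairs. A minimal pattern-based sketch of that kind of redactor follows; it is not the real implementation, the `[REDACTED]` placeholder is an assumption standing in for `REDACTED_EVENT_VALUE`, and each regex targets one shape from the test fixtures:

```typescript
// Assumed placeholder; the real value is REDACTED_EVENT_VALUE in redaction.js.
const REDACTED = "[REDACTED]";

const SECRET_PATTERNS: RegExp[] = [
  /Bearer\s+[A-Za-z0-9._~+\/-]+=*/g, // Authorization bearer tokens
  /ghp_[A-Za-z0-9]{36}/g, // GitHub personal access tokens
  /eyJ[\w-]+\.[\w-]+\.[\w-]+/g, // JWTs: three base64url segments
  // JSON-ish key/value, matching both plain and backslash-escaped quotes.
  /("|\\")(api[_-]?key|token|secret)\1\s*:\s*("|\\")[^"\\]+\3/gi,
];

function redactText(input: string): string {
  return SECRET_PATTERNS.reduce(
    (text, pattern) => text.replace(pattern, REDACTED),
    input,
  );
}
```

A redactor like this is inherently best-effort: the quoted-value pattern relies on backreferences so a plain `"` and an escaped `\"` must open and close the same value, and any secret shape without a dedicated regex passes through untouched.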


@@ -77,7 +77,7 @@ function registerRoutineServiceMock() {
}
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe.sequential : describe.skip;
if (!embeddedPostgresSupport.supported) {
console.warn(
@@ -136,13 +136,13 @@ describeEmbeddedPostgres("routine routes end-to-end", () => {
vi.doUnmock("../middleware/index.js");
registerRoutineServiceMock();
vi.doMock("../routes/authz.js", async () => vi.importActual("../routes/authz.js"));
vi.resetAllMocks();
vi.clearAllMocks();
});
async function createApp(actor: Record<string, unknown>) {
const [{ routineRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/routines.js")>("../routes/routines.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
import("../routes/routines.js"),
import("../middleware/index.js"),
]);
const app = express();
app.use(express.json());
