Mirror of https://github.com/paperclipai/paperclip
Synced 2026-05-08 16:12:20 +02:00

Compare commits: 3 commits, codex/pap-... → pap-1497-d

Commits: 66cbf5260f, 42189e1bf9, 9c231d925a
@@ -154,14 +154,6 @@ Each AGENTS.md body should include not just what the agent does, but how they fi

This turns a collection of agents into an organization that actually works together. Without workflow context, agents operate in isolation — they do their job but don't know what happens before or after them.

Add a concise execution contract to every generated working agent:

- Start actionable work in the same heartbeat and do not stop at a plan unless planning was requested.
- Leave durable progress in comments, documents, or work products with the next action.
- Use child issues for long or parallel delegated work instead of polling agents, sessions, or processes.
- Mark blocked work with the unblock owner and action.
- Respect budget, pause/cancel, approval gates, and company boundaries.

### Step 5: Confirm Output Location

Ask the user where to write the package. Common options:
@@ -105,13 +105,6 @@ Your responsibilities:

- Implement features and fix bugs
- Write tests and documentation
- Participate in code reviews

Execution contract:

- Start actionable implementation work in the same heartbeat; do not stop at a plan unless planning was requested.
- Leave durable progress with a clear next action.
- Use child issues for long or parallel delegated work instead of polling agents, sessions, or processes.
- Mark blocked work with the unblock owner and action.
```

## teams/engineering/TEAM.md
@@ -548,7 +548,7 @@ Import from `@paperclipai/adapter-utils/server-utils`:

### Prompt Templates

- Support `promptTemplate` for every run
- Use `renderTemplate()` with the standard variable set
- Default prompt should use `DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE` from `@paperclipai/adapter-utils/server-utils` so local adapters share Paperclip's execution contract: act in the same heartbeat, avoid planning-only exits unless requested, leave durable progress and a next action, use child issues instead of polling, mark blockers with owner/action, and respect governance boundaries.
- Default prompt: `"You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work."`
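The `{{agent.id}}`-style placeholders above suggest simple dotted-path variable substitution. As a minimal sketch only — the real `renderTemplate()` in `@paperclipai/adapter-utils/server-utils` is not shown here and may behave differently — such a renderer could look like:

```typescript
// Hypothetical stand-in for renderTemplate(); illustrates the
// {{path.to.value}} substitution style used by the default prompt.
type TemplateVars = Record<string, unknown>;

function renderTemplate(template: string, vars: TemplateVars): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (match: string, path: string) => {
    // Walk dotted paths like "agent.id" through the variable object.
    let value: unknown = vars;
    for (const key of path.split(".")) {
      if (value === null || typeof value !== "object") return match;
      value = (value as Record<string, unknown>)[key];
    }
    // Leave unresolved placeholders intact rather than emitting "undefined".
    return value === undefined ? match : String(value);
  });
}

const prompt = renderTemplate(
  "You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
  { agent: { id: "a-42", name: "CodexCoder" } },
);
console.log(prompt);
```

Leaving unknown placeholders untouched (instead of replacing them with `undefined`) keeps a partially rendered prompt debuggable.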

### Error Handling

- Differentiate timeout vs process error vs parse failure
@@ -2,6 +2,3 @@ DATABASE_URL=postgres://paperclip:paperclip@localhost:5432/paperclip
PORT=3100
SERVE_UI=false
BETTER_AUTH_SECRET=paperclip-dev-secret

# Discord webhook for daily merge digest (scripts/discord-daily-digest.sh)
# DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/...
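As a sanity check for an env file shaped like the one above, a small dotenv-style parser can confirm the required keys are present. This is an illustrative sketch, not Paperclip's actual config loading:

```typescript
// Sketch: parse dotenv-style text and verify required keys exist.
// Illustrative only; Paperclip's real config loading is not shown here.
function parseDotenv(source: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of source.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks and comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    vars[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return vars;
}

const envText = [
  "DATABASE_URL=postgres://paperclip:paperclip@localhost:5432/paperclip",
  "PORT=3100",
  "SERVE_UI=false",
  "BETTER_AUTH_SECRET=paperclip-dev-secret",
  "# DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/...",
].join("\n");

const env = parseDotenv(envText);
const missing = ["DATABASE_URL", "PORT", "BETTER_AUTH_SECRET"].filter((k) => !(k in env));
console.log(missing.length === 0 ? "env ok" : `missing: ${missing.join(", ")}`);
```

Note the commented-out `DISCORD_WEBHOOK_URL` line is deliberately skipped, matching its optional role in the file above.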
3 .github/PULL_REQUEST_TEMPLATE.md vendored

@@ -38,8 +38,6 @@

-

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and discuss it in `#dev` before opening the PR. Feature PRs that overlap with planned core work may need to be redirected — check the roadmap first. See `CONTRIBUTING.md`.

## Model Used

<!--

@@ -59,7 +57,6 @@

- [ ] I have included a thinking path that traces from project context to this change
- [ ] I have specified the model used (with version and capability details)
- [ ] I have checked ROADMAP.md and confirmed this PR does not duplicate planned core work
- [ ] I have run tests locally and they pass
- [ ] I have added or updated tests where applicable
- [ ] If this change affects the UI, I have included before/after screenshots
4 .gitignore vendored

@@ -1,7 +1,4 @@
node_modules
node_modules/
**/node_modules
**/node_modules/
dist/
.env
*.tsbuildinfo

@@ -35,7 +32,6 @@ server/src/**/*.d.ts
server/src/**/*.d.ts.map
tmp/
feedback-export-*
diagnostics/

# Editor / tool temp files
*.tmp
@@ -51,21 +51,6 @@ All tests must pass before a PR can be merged. Run them locally first and verify

We use [Greptile](https://greptile.com) for automated code review. Your PR must achieve a **5/5 Greptile score** with **all Greptile comments addressed** before it can be merged. If Greptile leaves comments, fix or respond to each one and request a re-review.

## Feature Contributions

We actively manage the core Paperclip feature roadmap.

Uncoordinated feature PRs against the core product may be closed, even when the implementation is thoughtful and high quality. That is about roadmap ownership, product coherence, and long-term maintenance commitment, not a judgment about the effort.

If you want to contribute a feature:

- Check [ROADMAP.md](ROADMAP.md) first
- Start the discussion in Discord -> `#dev` before writing code
- If the idea fits as an extension, prefer building it with the [plugin system](doc/plugins/PLUGIN_SPEC.md)
- If you want to show a possible direction, reference implementations are welcome as feedback, but they generally will not be merged directly into core

Bugs, docs improvements, and small targeted improvements are still the easiest path to getting merged, and we really do appreciate them.

## General Rules (both paths)

- Write clear commit messages
13 Dockerfile

@@ -2,7 +2,15 @@ FROM node:lts-trixie-slim AS base
ARG USER_UID=1000
ARG USER_GID=1000
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates gosu curl gh git wget ripgrep python3 \
    && apt-get install -y --no-install-recommends ca-certificates gosu curl git wget ripgrep python3 \
    && mkdir -p -m 755 /etc/apt/keyrings \
    && wget -nv -O/etc/apt/keyrings/githubcli-archive-keyring.gpg https://cli.github.com/packages/githubcli-archive-keyring.gpg \
    && echo "20e0125d6f6e077a9ad46f03371bc26d90b04939fb95170f5a1905099cc6bcc0 /etc/apt/keyrings/githubcli-archive-keyring.gpg" | sha256sum -c - \
    && chmod go+r /etc/apt/keyrings/githubcli-archive-keyring.gpg \
    && mkdir -p -m 755 /etc/apt/sources.list.d \
    && echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" > /etc/apt/sources.list.d/github-cli.list \
    && apt-get update \
    && apt-get install -y --no-install-recommends gh \
    && rm -rf /var/lib/apt/lists/* \
    && corepack enable

@@ -48,9 +56,6 @@ ARG USER_GID=1000
WORKDIR /app
COPY --chown=node:node --from=build /app /app
RUN npm install --global --omit=dev @anthropic-ai/claude-code@latest @openai/codex@latest opencode-ai \
    && apt-get update \
    && apt-get install -y --no-install-recommends openssh-client jq \
    && rm -rf /var/lib/apt/lists/* \
    && mkdir -p /paperclip \
    && chown node:node /paperclip
@@ -256,10 +256,10 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
- ✅ Scheduled Routines
- ✅ Better Budgeting
- ✅ Agent Reviews and Approvals
- ✅ Multiple Human Users
- ⚪ Multiple Human Users
- ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)
- ⚪ Artifacts & Work Products
- ⚪ Memory / Knowledge
- ⚪ Memory & Knowledge
- ⚪ Enforced Outcomes
- ⚪ MAXIMIZER MODE
- ⚪ Deep Planning

@@ -270,8 +270,6 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
- ⚪ Cloud deployments
- ⚪ Desktop App

This is the short roadmap preview. See the full roadmap in [ROADMAP.md](ROADMAP.md).

<br/>

## Community & Plugins
97 ROADMAP.md

@@ -1,97 +0,0 @@
# Roadmap

This document expands the roadmap preview in `README.md`.

Paperclip is still moving quickly. The list below is directional, not promised, and priorities may shift as we learn from users and from operating real AI companies with the product.

We value community involvement and want to make sure contributor energy goes toward areas where it can land.

We may accept contributions in the areas below, but if you want to work on roadmap-level core features, please coordinate with us first in Discord (`#dev`) before writing code. Bugs, docs, polish, and tightly scoped improvements are still the easiest contributions to merge.

If you want to extend Paperclip today, the best path is often the [plugin system](doc/plugins/PLUGIN_SPEC.md). Community reference implementations are also useful feedback even when they are not merged directly into core.

## Milestones

### ✅ Plugin system

Paperclip should keep a thin core and rich edges. Plugins are the path for optional capabilities like knowledge bases, custom tracing, queues, doc editors, and other product-specific surfaces that do not need to live in the control plane itself.

### ✅ Get OpenClaw / claw-style agent employees

Paperclip should be able to hire and manage real claw-style agent workers, not just a narrow built-in runtime. This is part of the larger "bring your own agent" story and keeps the control plane useful across different agent ecosystems.

### ✅ companies.sh - import and export entire organizations

Reusable companies matter. Import/export is the foundation for moving org structures, agent definitions, and reusable company setups between environments and eventually for broader company-template distribution.

### ✅ Easy AGENTS.md configurations

Agent setup should feel repo-native and legible. Simple `AGENTS.md`-style configuration lowers the barrier to getting an agent team running and makes it easier for contributors to understand how a company is wired together.

### ✅ Skills Manager

Agents need a practical way to discover, install, and use skills without every setup becoming bespoke. The skills layer is part of making Paperclip companies more reusable and easier to operate.

### ✅ Scheduled Routines

Recurring work should be native. Routine tasks like reports, reviews, and other periodic work need first-class scheduling so the company keeps operating even when no human is manually kicking work off.

### ✅ Better Budgeting

Budgets are a core control-plane feature, not an afterthought. Better budgeting means clearer spend visibility, safer hard stops, and better operator control over how autonomy turns into real cost.

### ✅ Agent Reviews and Approvals

Paperclip should support explicit review and approval stages as first-class workflow steps, not just ad hoc comments. That means reviewer routing, approval gates, change requests, and durable audit trails that fit the same task model as the rest of the control plane.

### ✅ Multiple Human Users

Paperclip needs a clearer path from solo operator to real human teams. That means shared board access, safer collaboration, and a better model for several humans supervising the same autonomous company.

### ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)

We want agents to run in more remote and sandboxed environments while preserving the same Paperclip control-plane model. This makes the system safer, more flexible, and more useful outside a single trusted local machine.

### ⚪ Artifacts & Work Products

Paperclip should make outputs first-class. That means generated artifacts, previews, deployable outputs, and the handoff from "agent did work" to "here is the result" should become more visible and easier to operate.

### ⚪ Memory / Knowledge

We want a stronger memory and knowledge surface for companies, agents, and projects. That includes durable memory, better recall of prior decisions and context, and a clearer path for knowledge-style capabilities without turning Paperclip into a generic chat app.

### ⚪ Enforced Outcomes

Paperclip should get stricter about what counts as finished work. Tasks, approvals, and execution flows should resolve to clear outcomes like merged code, published artifacts, shipped docs, or explicit decisions instead of stopping at vague status updates.

### ⚪ MAXIMIZER MODE

This is the direction for higher-autonomy execution: more aggressive delegation, deeper follow-through, and stronger operating loops with clear budgets, visibility, and governance. The point is not hidden autonomy; the point is more output per human supervisor.

### ⚪ Deep Planning

Some work needs more than a task description before execution starts. Deeper planning means stronger issue documents, revisionable plans, and clearer review loops for strategy-heavy work before agents begin execution.

### ⚪ Work Queues

Paperclip should support queue-style work streams for repeatable inputs like support, triage, review, and backlog intake. That would make it easier to route work continuously without turning every system into a one-off workflow.

### ⚪ Self-Organization

As companies grow, agents should be able to propose useful structural changes such as role adjustments, delegation changes, and new recurring routines. The goal is adaptive organizations that still stay within governance and approval boundaries.

### ⚪ Automatic Organizational Learning

Paperclip should get better at turning completed work into reusable organizational knowledge. That includes capturing playbooks, recurring fixes, and decision patterns so future work starts from what the company has already learned.

### ⚪ CEO Chat

We want a lighter-weight way to talk to leadership agents, but those conversations should still resolve to real work objects like plans, issues, approvals, or decisions. This should improve interaction without changing the core task-and-comments model.

### ⚪ Cloud deployments

Local-first remains important, but Paperclip also needs a cleaner shared deployment story. Teams should be able to run the same product in hosted or semi-hosted environments without changing the mental model.

### ⚪ Desktop App

A desktop app can make Paperclip feel more accessible and persistent for day-to-day operators. The goal is easier access, better local ergonomics, and a smoother default experience for users who want the control plane always close at hand.
@@ -12,7 +12,7 @@
<p align="center">
  <a href="https://github.com/paperclipai/paperclip/blob/master/LICENSE"><img src="https://img.shields.io/badge/license-MIT-blue" alt="MIT License" /></a>
  <a href="https://github.com/paperclipai/paperclip/stargazers"><img src="https://img.shields.io/github/stars/paperclipai/paperclip?style=flat" alt="Stars" /></a>
  <a href="https://discord.gg/m4HZY7xNG3"><img src="https://img.shields.io/discord/000000000?label=discord" alt="Discord" /></a>
  <a href="https://discord.gg/m4HZY7xNG3"><img src="https://img.shields.io/badge/discord-join%20chat-5865F2?logo=discord&logoColor=white" alt="Discord" /></a>
</p>

<br/>
@@ -258,7 +258,7 @@ See [doc/DEVELOPING.md](https://github.com/paperclipai/paperclip/blob/master/doc
- ⚪ Artifacts & Deployments
- ⚪ CEO Chat
- ⚪ MAXIMIZER MODE
- ✅ Multiple Human Users
- ⚪ Multiple Human Users
- ⚪ Cloud / Sandbox agents (e.g. Cursor / e2b agents)
- ⚪ Cloud deployments
- ⚪ Desktop App
@@ -287,11 +287,6 @@ describeEmbeddedPostgres("paperclipai company import/export e2e", () => {
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ name: `CLI Export Source ${Date.now()}` }),
    });
    await api(apiBase, `/api/companies/${sourceCompany.id}`, {
      method: "PATCH",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ requireBoardApprovalForNewAgents: false }),
    });

    const sourceAgent = await api<{ id: string; name: string }>(
      apiBase,
@@ -3,15 +3,11 @@ import os from "node:os";
import path from "node:path";
import { execFileSync } from "node:child_process";
import { randomUUID } from "node:crypto";
import { eq } from "drizzle-orm";
import { afterEach, describe, expect, it, vi } from "vitest";
import {
  agents,
  authUsers,
  companies,
  createDb,
  issueComments,
  issues,
  projects,
  routines,
  routineTriggers,

@@ -20,7 +16,6 @@ import {
  copyGitHooksToWorktreeGitDir,
  copySeededSecretsKey,
  pauseSeededScheduledRoutines,
  quarantineSeededWorktreeExecutionState,
  readSourceAttachmentBody,
  rebindWorkspaceCwd,
  resolveSourceConfigPath,

@@ -52,7 +47,6 @@ import {
const ORIGINAL_CWD = process.cwd();
const ORIGINAL_ENV = { ...process.env };
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const itEmbeddedPostgres = embeddedPostgresSupport.supported ? it : it.skip;
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;

if (!embeddedPostgresSupport.supported) {
@@ -286,138 +280,6 @@ describe("worktree helpers", () => {
    expect(full.nullifyColumns).toEqual({});
  });

  itEmbeddedPostgres("quarantines copied live execution state in seeded worktree databases", async () => {
    const tempDb = await startEmbeddedPostgresTestDatabase("paperclip-worktree-quarantine-");
    const db = createDb(tempDb.connectionString);
    const companyId = randomUUID();
    const agentId = randomUUID();
    const idleAgentId = randomUUID();
    const inProgressIssueId = randomUUID();
    const todoIssueId = randomUUID();
    const reviewIssueId = randomUUID();
    const userIssueId = randomUUID();

    try {
      await db.insert(companies).values({
        id: companyId,
        name: "Paperclip",
        issuePrefix: "WTQ",
        requireBoardApprovalForNewAgents: false,
      });
      await db.insert(agents).values([
        {
          id: agentId,
          companyId,
          name: "CodexCoder",
          role: "engineer",
          status: "running",
          adapterType: "codex_local",
          adapterConfig: {},
          runtimeConfig: {
            heartbeat: { enabled: true, intervalSec: 60 },
            wakeOnDemand: true,
          },
          permissions: {},
        },
        {
          id: idleAgentId,
          companyId,
          name: "Reviewer",
          role: "reviewer",
          status: "idle",
          adapterType: "codex_local",
          adapterConfig: {},
          runtimeConfig: { heartbeat: { enabled: false, intervalSec: 300 } },
          permissions: {},
        },
      ]);
      await db.insert(issues).values([
        {
          id: inProgressIssueId,
          companyId,
          title: "Copied in-flight issue",
          status: "in_progress",
          priority: "medium",
          assigneeAgentId: agentId,
          issueNumber: 1,
          identifier: "WTQ-1",
          executionAgentNameKey: "codexcoder",
          executionLockedAt: new Date("2026-04-18T00:00:00.000Z"),
        },
        {
          id: todoIssueId,
          companyId,
          title: "Copied assigned todo issue",
          status: "todo",
          priority: "medium",
          assigneeAgentId: agentId,
          issueNumber: 2,
          identifier: "WTQ-2",
        },
        {
          id: reviewIssueId,
          companyId,
          title: "Copied assigned review issue",
          status: "in_review",
          priority: "medium",
          assigneeAgentId: idleAgentId,
          issueNumber: 3,
          identifier: "WTQ-3",
        },
        {
          id: userIssueId,
          companyId,
          title: "Copied user issue",
          status: "todo",
          priority: "medium",
          assigneeUserId: "user-1",
          issueNumber: 4,
          identifier: "WTQ-4",
        },
      ]);

      await expect(quarantineSeededWorktreeExecutionState(tempDb.connectionString)).resolves.toEqual({
        disabledTimerHeartbeats: 1,
        resetRunningAgents: 1,
        quarantinedInProgressIssues: 1,
        unassignedTodoIssues: 1,
        unassignedReviewIssues: 1,
      });

      const [quarantinedAgent] = await db.select().from(agents).where(eq(agents.id, agentId));
      expect(quarantinedAgent?.status).toBe("idle");
      expect(quarantinedAgent?.runtimeConfig).toMatchObject({
        heartbeat: { enabled: false, intervalSec: 60 },
        wakeOnDemand: true,
      });

      const [inProgressIssue] = await db.select().from(issues).where(eq(issues.id, inProgressIssueId));
      expect(inProgressIssue?.status).toBe("blocked");
      expect(inProgressIssue?.assigneeAgentId).toBeNull();
      expect(inProgressIssue?.executionAgentNameKey).toBeNull();
      expect(inProgressIssue?.executionLockedAt).toBeNull();

      const [todoIssue] = await db.select().from(issues).where(eq(issues.id, todoIssueId));
      expect(todoIssue?.status).toBe("todo");
      expect(todoIssue?.assigneeAgentId).toBeNull();

      const [reviewIssue] = await db.select().from(issues).where(eq(issues.id, reviewIssueId));
      expect(reviewIssue?.status).toBe("in_review");
      expect(reviewIssue?.assigneeAgentId).toBeNull();

      const [userIssue] = await db.select().from(issues).where(eq(issues.id, userIssueId));
      expect(userIssue?.status).toBe("todo");
      expect(userIssue?.assigneeUserId).toBe("user-1");

      const comments = await db.select().from(issueComments).where(eq(issueComments.issueId, inProgressIssueId));
      expect(comments).toHaveLength(1);
      expect(comments[0]?.body).toContain("Quarantined during worktree seed");
    } finally {
      await db.$client?.end?.({ timeout: 5 }).catch(() => undefined);
      await tempDb.cleanup();
    }
  }, 20_000);

  it("copies the source local_encrypted secrets key into the seeded worktree instance", () => {
    const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-secrets-"));
    const originalInlineMasterKey = process.env.PAPERCLIP_SECRETS_MASTER_KEY;
@@ -511,97 +373,6 @@ describe("worktree helpers", () => {
    }
  });

  itEmbeddedPostgres(
    "seeds authenticated users into minimally cloned worktree instances",
    async () => {
      const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-auth-seed-"));
      const worktreeRoot = path.join(tempRoot, "PAP-999-auth-seed");
      const sourceHome = path.join(tempRoot, "source-home");
      const sourceConfigDir = path.join(sourceHome, "instances", "source");
      const sourceConfigPath = path.join(sourceConfigDir, "config.json");
      const sourceEnvPath = path.join(sourceConfigDir, ".env");
      const sourceKeyPath = path.join(sourceConfigDir, "secrets", "master.key");
      const worktreeHome = path.join(tempRoot, ".paperclip-worktrees");
      const originalCwd = process.cwd();
      const sourceDb = await startEmbeddedPostgresTestDatabase("paperclip-worktree-auth-source-");

      try {
        const sourceDbClient = createDb(sourceDb.connectionString);
        await sourceDbClient.insert(authUsers).values({
          id: "user-existing",
          email: "existing@paperclip.ing",
          name: "Existing User",
          emailVerified: true,
          createdAt: new Date(),
          updatedAt: new Date(),
        });

        fs.mkdirSync(path.dirname(sourceKeyPath), { recursive: true });
        fs.mkdirSync(worktreeRoot, { recursive: true });

        const sourceConfig = buildSourceConfig();
        sourceConfig.database = {
          mode: "postgres",
          embeddedPostgresDataDir: path.join(sourceConfigDir, "db"),
          embeddedPostgresPort: 54329,
          backup: {
            enabled: true,
            intervalMinutes: 60,
            retentionDays: 30,
            dir: path.join(sourceConfigDir, "backups"),
          },
          connectionString: sourceDb.connectionString,
        };
        sourceConfig.logging.logDir = path.join(sourceConfigDir, "logs");
        sourceConfig.storage.localDisk.baseDir = path.join(sourceConfigDir, "storage");
        sourceConfig.secrets.localEncrypted.keyFilePath = sourceKeyPath;

        fs.writeFileSync(sourceConfigPath, JSON.stringify(sourceConfig, null, 2) + "\n", "utf8");
        fs.writeFileSync(sourceEnvPath, "", "utf8");
        fs.writeFileSync(sourceKeyPath, "source-master-key", "utf8");

        process.chdir(worktreeRoot);
        await worktreeInitCommand({
          name: "PAP-999-auth-seed",
          home: worktreeHome,
          fromConfig: sourceConfigPath,
          force: true,
        });

        const targetConfig = JSON.parse(
          fs.readFileSync(path.join(worktreeRoot, ".paperclip", "config.json"), "utf8"),
        ) as PaperclipConfig;
        const { default: EmbeddedPostgres } = await import("embedded-postgres");
        const targetPg = new EmbeddedPostgres({
          databaseDir: targetConfig.database.embeddedPostgresDataDir,
          user: "paperclip",
          password: "paperclip",
          port: targetConfig.database.embeddedPostgresPort,
          persistent: true,
          initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
          onLog: () => {},
          onError: () => {},
        });

        await targetPg.start();
        try {
          const targetDb = createDb(
            `postgres://paperclip:paperclip@127.0.0.1:${targetConfig.database.embeddedPostgresPort}/paperclip`,
          );
          const seededUsers = await targetDb.select().from(authUsers);
          expect(seededUsers.some((row) => row.email === "existing@paperclip.ing")).toBe(true);
        } finally {
          await targetPg.stop();
        }
      } finally {
        process.chdir(originalCwd);
        await sourceDb.cleanup();
        fs.rmSync(tempRoot, { recursive: true, force: true });
      }
    },
    20000,
  );

  it("avoids ports already claimed by sibling worktree instance configs", async () => {
    const tempRoot = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-worktree-claimed-ports-"));
    const repoRoot = path.join(tempRoot, "repo");
@@ -93,7 +93,6 @@ type WorktreeInitOptions = {
  dbPort?: number;
  seed?: boolean;
  seedMode?: string;
  preserveLiveWork?: boolean;
  force?: boolean;
};

@@ -127,7 +126,6 @@ type WorktreeReseedOptions = {
  fromDataDir?: string;
  fromInstance?: string;
  seedMode?: string;
  preserveLiveWork?: boolean;
  yes?: boolean;
  allowLiveTarget?: boolean;
};

@@ -139,7 +137,6 @@ type WorktreeRepairOptions = {
  fromDataDir?: string;
  fromInstance?: string;
  seedMode?: string;
  preserveLiveWork?: boolean;
  noSeed?: boolean;
  allowLiveTarget?: boolean;
};

@@ -182,8 +179,6 @@ type CopiedGitHooksResult = {

type SeedWorktreeDatabaseResult = {
  backupSummary: string;
  pausedScheduledRoutines: number;
  executionQuarantine: SeededWorktreeExecutionQuarantineSummary;
  reboundWorkspaces: Array<{
    name: string;
    fromCwd: string;

@@ -191,14 +186,6 @@ type SeedWorktreeDatabaseResult = {
  }>;
};

export type SeededWorktreeExecutionQuarantineSummary = {
  disabledTimerHeartbeats: number;
  resetRunningAgents: number;
  quarantinedInProgressIssues: number;
  unassignedTodoIssues: number;
  unassignedReviewIssues: number;
};

function nonEmpty(value: string | null | undefined): string | null {
  return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}

@@ -211,18 +198,6 @@ function isCurrentSourceConfigPath(sourceConfigPath: string): boolean {
  return path.resolve(currentConfigPath) === path.resolve(sourceConfigPath);
}

function formatSeededWorktreeExecutionQuarantineSummary(
  summary: SeededWorktreeExecutionQuarantineSummary,
): string {
  return [
    `disabled timer heartbeats: ${summary.disabledTimerHeartbeats}`,
    `reset running agents: ${summary.resetRunningAgents}`,
    `quarantined in-progress issues: ${summary.quarantinedInProgressIssues}`,
    `unassigned todo issues: ${summary.unassignedTodoIssues}`,
    `unassigned review issues: ${summary.unassignedReviewIssues}`,
  ].join(", ");
}

const WORKTREE_NAME_PREFIX = "paperclip-";

function resolveWorktreeMakeName(name: string): string {

@@ -1144,133 +1119,6 @@ export async function pauseSeededScheduledRoutines(connectionString: string): Pr
  }
}

const EMPTY_SEEDED_WORKTREE_EXECUTION_QUARANTINE_SUMMARY: SeededWorktreeExecutionQuarantineSummary = {
  disabledTimerHeartbeats: 0,
  resetRunningAgents: 0,
  quarantinedInProgressIssues: 0,
  unassignedTodoIssues: 0,
  unassignedReviewIssues: 0,
};

function isRecord(value: unknown): value is Record<string, unknown> {
  return Boolean(value) && typeof value === "object" && !Array.isArray(value);
}

function isEnabledValue(value: unknown): boolean {
  return value === true || value === "true" || value === 1 || value === "1";
}

function normalizeWorktreeRuntimeConfig(runtimeConfig: unknown): {
  runtimeConfig: Record<string, unknown>;
  disabledTimerHeartbeat: boolean;
  changed: boolean;
} {
  const nextRuntimeConfig = isRecord(runtimeConfig) ? { ...runtimeConfig } : {};
  const heartbeat = isRecord(nextRuntimeConfig.heartbeat) ? { ...nextRuntimeConfig.heartbeat } : null;
  if (!heartbeat) {
    return { runtimeConfig: nextRuntimeConfig, disabledTimerHeartbeat: false, changed: false };
  }

  const disabledTimerHeartbeat = isEnabledValue(heartbeat.enabled);
  if (heartbeat.enabled !== false) {
    heartbeat.enabled = false;
    nextRuntimeConfig.heartbeat = heartbeat;
    return { runtimeConfig: nextRuntimeConfig, disabledTimerHeartbeat, changed: true };
  }

  return { runtimeConfig: nextRuntimeConfig, disabledTimerHeartbeat: false, changed: false };
}

export async function quarantineSeededWorktreeExecutionState(
  connectionString: string,
): Promise<SeededWorktreeExecutionQuarantineSummary> {
  const db = createDb(connectionString);
  const summary = { ...EMPTY_SEEDED_WORKTREE_EXECUTION_QUARANTINE_SUMMARY };
  try {
    await db.transaction(async (tx) => {
      const seededAgents = await tx
        .select({
          id: agents.id,
          status: agents.status,
          runtimeConfig: agents.runtimeConfig,
        })
        .from(agents);

      for (const agent of seededAgents) {
        const normalized = normalizeWorktreeRuntimeConfig(agent.runtimeConfig);
        const nextStatus = agent.status === "running" ? "idle" : agent.status;
        if (normalized.disabledTimerHeartbeat) {
          summary.disabledTimerHeartbeats += 1;
        }
        if (agent.status === "running") {
          summary.resetRunningAgents += 1;
        }
        if (normalized.changed || nextStatus !== agent.status) {
          await tx
            .update(agents)
            .set({
              runtimeConfig: normalized.runtimeConfig,
              status: nextStatus,
              updatedAt: new Date(),
            })
            .where(eq(agents.id, agent.id));
        }
      }

      const affectedIssues = await tx
        .select({
          id: issues.id,
          companyId: issues.companyId,
          status: issues.status,
        })
        .from(issues)
        .where(
          and(
            sql`${issues.assigneeAgentId} is not null`,
            sql`${issues.assigneeUserId} is null`,
            inArray(issues.status, ["todo", "in_progress", "in_review"]),
          ),
        );

      for (const issue of affectedIssues) {
        const nextStatus = issue.status === "in_progress" ? "blocked" : issue.status;
        await tx
          .update(issues)
          .set({
            status: nextStatus,
            assigneeAgentId: null,
            checkoutRunId: null,
            executionRunId: null,
            executionAgentNameKey: null,
            executionLockedAt: null,
            executionWorkspaceId: null,
            updatedAt: new Date(),
          })
          .where(eq(issues.id, issue.id));

        if (issue.status === "in_progress") {
          summary.quarantinedInProgressIssues += 1;
          await tx.insert(issueComments).values({
            companyId: issue.companyId,
            issueId: issue.id,
            body:
              "Quarantined during worktree seed so copied in-flight work does not auto-run in this isolated instance. " +
              "Reassign or unblock here only if you intentionally want the worktree instance to own this task.",
          });
        } else if (issue.status === "todo") {
          summary.unassignedTodoIssues += 1;
        } else if (issue.status === "in_review") {
          summary.unassignedReviewIssues += 1;
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
return summary;
|
||||
} finally {
|
||||
await db.$client?.end?.({ timeout: 5 }).catch(() => undefined);
|
||||
}
|
||||
}
|
||||
|
||||
async function seedWorktreeDatabase(input: {
|
||||
sourceConfigPath: string;
|
||||
sourceConfig: PaperclipConfig;
|
||||
@@ -1278,7 +1126,6 @@ async function seedWorktreeDatabase(input: {
|
||||
targetPaths: WorktreeLocalPaths;
|
||||
instanceId: string;
|
||||
seedMode: WorktreeSeedMode;
|
||||
preserveLiveWork?: boolean;
|
||||
}): Promise<SeedWorktreeDatabaseResult> {
|
||||
const seedPlan = resolveWorktreeSeedPlan(input.seedMode);
|
||||
const sourceEnvFile = resolvePaperclipEnvFile(input.sourceConfigPath);
|
||||
@@ -1329,10 +1176,7 @@ async function seedWorktreeDatabase(input: {
|
||||
backupFile: backup.backupFile,
|
||||
});
|
||||
await applyPendingMigrations(targetConnectionString);
|
||||
const executionQuarantine = input.preserveLiveWork
|
||||
? { ...EMPTY_SEEDED_WORKTREE_EXECUTION_QUARANTINE_SUMMARY }
|
||||
: await quarantineSeededWorktreeExecutionState(targetConnectionString);
|
||||
const pausedScheduledRoutines = await pauseSeededScheduledRoutines(targetConnectionString);
|
||||
await pauseSeededScheduledRoutines(targetConnectionString);
|
||||
const reboundWorkspaces = await rebindSeededProjectWorkspaces({
|
||||
targetConnectionString,
|
||||
currentCwd: input.targetPaths.cwd,
|
||||
@@ -1340,8 +1184,6 @@ async function seedWorktreeDatabase(input: {
|
||||
|
||||
return {
|
||||
backupSummary: formatDatabaseBackupResult(backup),
|
||||
pausedScheduledRoutines,
|
||||
executionQuarantine,
|
||||
reboundWorkspaces,
|
||||
};
|
||||
} finally {
|
||||
@@ -1420,8 +1262,6 @@ async function runWorktreeInit(opts: WorktreeInitOptions): Promise<void> {
|
||||
const copiedGitHooks = copyGitHooksToWorktreeGitDir(cwd);
|
||||
|
||||
let seedSummary: string | null = null;
|
||||
let seedExecutionQuarantineSummary: SeededWorktreeExecutionQuarantineSummary | null = null;
|
||||
let pausedScheduledRoutineCount: number | null = null;
|
||||
let reboundWorkspaceSummary: SeedWorktreeDatabaseResult["reboundWorkspaces"] = [];
|
||||
if (opts.seed !== false) {
|
||||
if (!sourceConfig) {
|
||||
@@ -1439,11 +1279,8 @@ async function runWorktreeInit(opts: WorktreeInitOptions): Promise<void> {
|
||||
targetPaths: paths,
|
||||
instanceId,
|
||||
seedMode,
|
||||
preserveLiveWork: opts.preserveLiveWork,
|
||||
});
|
||||
seedSummary = seeded.backupSummary;
|
||||
seedExecutionQuarantineSummary = seeded.executionQuarantine;
|
||||
pausedScheduledRoutineCount = seeded.pausedScheduledRoutines;
|
||||
reboundWorkspaceSummary = seeded.reboundWorkspaces;
|
||||
spinner.stop(`Seeded isolated worktree database (${seedMode}).`);
|
||||
} catch (error) {
|
||||
@@ -1466,16 +1303,6 @@ async function runWorktreeInit(opts: WorktreeInitOptions): Promise<void> {
|
||||
if (seedSummary) {
|
||||
p.log.message(pc.dim(`Seed mode: ${seedMode}`));
|
||||
p.log.message(pc.dim(`Seed snapshot: ${seedSummary}`));
|
||||
if (opts.preserveLiveWork) {
|
||||
p.log.warning("Preserved copied live work; this worktree instance may auto-run source-instance assignments.");
|
||||
} else if (seedExecutionQuarantineSummary) {
|
||||
p.log.message(
|
||||
pc.dim(`Seed execution quarantine: ${formatSeededWorktreeExecutionQuarantineSummary(seedExecutionQuarantineSummary)}`),
|
||||
);
|
||||
}
|
||||
if (pausedScheduledRoutineCount != null) {
|
||||
p.log.message(pc.dim(`Paused scheduled routines: ${pausedScheduledRoutineCount}`));
|
||||
}
|
||||
for (const rebound of reboundWorkspaceSummary) {
|
||||
p.log.message(
|
||||
pc.dim(`Rebound workspace ${rebound.name}: ${rebound.fromCwd} -> ${rebound.toCwd}`),
|
||||
@@ -3120,20 +2947,11 @@ async function runWorktreeReseed(opts: WorktreeReseedOptions): Promise<void> {
|
||||
targetPaths,
|
||||
instanceId: targetPaths.instanceId,
|
||||
seedMode,
|
||||
preserveLiveWork: opts.preserveLiveWork,
|
||||
});
|
||||
spinner.stop(`Reseeded ${targetEndpoint.label} (${seedMode}).`);
|
||||
p.log.message(pc.dim(`Source: ${source.configPath}`));
|
||||
p.log.message(pc.dim(`Target: ${targetEndpoint.configPath}`));
|
||||
p.log.message(pc.dim(`Seed snapshot: ${seeded.backupSummary}`));
|
||||
if (opts.preserveLiveWork) {
|
||||
p.log.warning("Preserved copied live work; this worktree instance may auto-run source-instance assignments.");
|
||||
} else {
|
||||
p.log.message(
|
||||
pc.dim(`Seed execution quarantine: ${formatSeededWorktreeExecutionQuarantineSummary(seeded.executionQuarantine)}`),
|
||||
);
|
||||
}
|
||||
p.log.message(pc.dim(`Paused scheduled routines: ${seeded.pausedScheduledRoutines}`));
|
||||
for (const rebound of seeded.reboundWorkspaces) {
|
||||
p.log.message(
|
||||
pc.dim(`Rebound workspace ${rebound.name}: ${rebound.fromCwd} -> ${rebound.toCwd}`),
|
||||
@@ -3197,7 +3015,6 @@ export async function worktreeRepairCommand(opts: WorktreeRepairOptions): Promis
|
||||
fromConfig: source.configPath,
|
||||
to: target.rootPath,
|
||||
seedMode,
|
||||
preserveLiveWork: opts.preserveLiveWork,
|
||||
yes: true,
|
||||
allowLiveTarget: opts.allowLiveTarget,
|
||||
});
|
||||
@@ -3230,7 +3047,6 @@ export async function worktreeRepairCommand(opts: WorktreeRepairOptions): Promis
|
||||
fromInstance: opts.fromInstance,
|
||||
seed: opts.noSeed ? false : true,
|
||||
seedMode,
|
||||
preserveLiveWork: opts.preserveLiveWork,
|
||||
force: true,
|
||||
});
|
||||
} finally {
|
||||
@@ -3254,7 +3070,6 @@ export function registerWorktreeCommands(program: Command): void {
|
||||
.option("--server-port <port>", "Preferred server port", (value) => Number(value))
|
||||
.option("--db-port <port>", "Preferred embedded Postgres port", (value) => Number(value))
|
||||
.option("--seed-mode <mode>", "Seed profile: minimal or full (default: minimal)", "minimal")
|
||||
.option("--preserve-live-work", "Do not quarantine copied agent timers or assigned open issues in the seeded worktree", false)
|
||||
.option("--no-seed", "Skip database seeding from the source instance")
|
||||
.option("--force", "Replace existing repo-local config and isolated instance data", false)
|
||||
.action(worktreeMakeCommand);
|
||||
@@ -3271,7 +3086,6 @@ export function registerWorktreeCommands(program: Command): void {
|
||||
.option("--server-port <port>", "Preferred server port", (value) => Number(value))
|
||||
.option("--db-port <port>", "Preferred embedded Postgres port", (value) => Number(value))
|
||||
.option("--seed-mode <mode>", "Seed profile: minimal or full (default: minimal)", "minimal")
|
||||
.option("--preserve-live-work", "Do not quarantine copied agent timers or assigned open issues in the seeded worktree", false)
|
||||
.option("--no-seed", "Skip database seeding from the source instance")
|
||||
.option("--force", "Replace existing repo-local config and isolated instance data", false)
|
||||
.action(worktreeInitCommand);
|
||||
@@ -3311,7 +3125,6 @@ export function registerWorktreeCommands(program: Command): void {
|
||||
.option("--from-data-dir <path>", "Source PAPERCLIP_HOME used when deriving the source config")
|
||||
.option("--from-instance <id>", "Source instance id when deriving the source config")
|
||||
.option("--seed-mode <mode>", "Seed profile: minimal or full (default: full)", "full")
|
||||
.option("--preserve-live-work", "Do not quarantine copied agent timers or assigned open issues in the seeded worktree", false)
|
||||
.option("--yes", "Skip the destructive confirmation prompt", false)
|
||||
.option("--allow-live-target", "Override the guard that requires the target worktree DB to be stopped first", false)
|
||||
.action(worktreeReseedCommand);
|
||||
@@ -3325,7 +3138,6 @@ export function registerWorktreeCommands(program: Command): void {
|
||||
.option("--from-data-dir <path>", "Source PAPERCLIP_HOME used when deriving the source config")
|
||||
.option("--from-instance <id>", "Source instance id when deriving the source config (default: default)")
|
||||
.option("--seed-mode <mode>", "Seed profile: minimal or full (default: minimal)", "minimal")
|
||||
.option("--preserve-live-work", "Do not quarantine copied agent timers or assigned open issues in the seeded worktree", false)
|
||||
.option("--no-seed", "Repair metadata only and skip reseeding when bootstrapping a missing worktree config", false)
|
||||
.option("--allow-live-target", "Override the guard that requires the target worktree DB to be stopped first", false)
|
||||
.action(worktreeRepairCommand);
|
||||
|
||||
@@ -27,18 +27,6 @@ pnpm db:migrate

When `DATABASE_URL` is unset, this command targets the current embedded PostgreSQL instance for your active Paperclip config/instance.

Issue reference mentions follow the normal migration path: the schema migration creates the tracking table, but it does not backfill historical issue titles, descriptions, comments, or documents automatically.

To backfill existing content manually after migrating, run:

```sh
pnpm issue-references:backfill
# optional: limit to one company
pnpm issue-references:backfill -- --company <company-id>
```

Future issue, comment, and document writes sync references automatically without running the backfill command.

This mode is ideal for local development and one-command installs.

Docker note: the Docker quickstart image also uses embedded PostgreSQL by default. Persist `/paperclip` to keep DB state across container restarts (see `doc/DOCKER.md`).
@@ -106,16 +94,6 @@ Set `DATABASE_URL` in your `.env`:
DATABASE_URL=postgres://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-[REGION].pooler.supabase.com:6543/postgres
```

For hosted deployments that use a pooled runtime URL, set
`DATABASE_MIGRATION_URL` to the direct connection URL. Paperclip uses it for
startup schema checks/migrations and plugin namespace migrations, while the app
continues to use `DATABASE_URL` for runtime queries:

```sh
DATABASE_URL=postgres://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-[REGION].pooler.supabase.com:6543/postgres
DATABASE_MIGRATION_URL=postgres://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-[REGION].pooler.supabase.com:5432/postgres
```

If using connection pooling (port 6543), the `postgres` client must disable prepared statements. Update `packages/db/src/client.ts`:

```ts

@@ -142,4 +142,3 @@ This prevents lockout when a user migrates from long-running local trusted usage
- implementation plan: `doc/plans/deployment-auth-mode-consolidation.md`
- V1 contract: `doc/SPEC-implementation.md`
- operator workflows: `doc/DEVELOPING.md` and `doc/CLI.md`
- invite/join state map: `doc/spec/invite-flow.md`

@@ -43,17 +43,6 @@ This starts:

`pnpm dev` and `pnpm dev:once` are now idempotent for the current repo and instance: if the matching Paperclip dev runner is already alive, Paperclip reports the existing process instead of starting a duplicate.

## Storybook

The board UI Storybook keeps stories and Storybook config under `ui/storybook/` so component review files stay out of the app source routes.

```sh
pnpm storybook
pnpm build-storybook
```

These run the `@paperclipai/ui` Storybook on port `6006` and build the static output to `ui/storybook-static/`.

Inspect or stop the current repo's managed dev runner:

```sh
@@ -220,8 +209,6 @@ Seed modes:
- `full` makes a full logical clone of the source instance
- `--no-seed` creates an empty isolated instance

Seeded worktree instances quarantine copied live execution by default for both `minimal` and `full` seeds. During restore, Paperclip disables copied agent timer heartbeats, resets copied `running` agents to `idle`, blocks and unassigns copied agent-owned `in_progress` issues, and unassigns copied agent-owned `todo`/`in_review` issues. This keeps a freshly booted worktree from starting agents for work already owned by the source instance. Pass `--preserve-live-work` only when you intentionally want the isolated worktree to resume copied assignments.

After `worktree init`, both the server and the CLI auto-load the repo-local `.paperclip/.env` when run inside that worktree, so normal commands like `pnpm dev`, `paperclipai doctor`, and `paperclipai db:backup` stay scoped to the worktree instance.

`pnpm dev` now fails fast in a linked git worktree when `.paperclip/.env` is missing, instead of silently booting against the default instance/port. If that happens, run `paperclipai worktree init` in the worktree first.
@@ -235,8 +222,6 @@ That repo-local env also sets:
- `PAPERCLIP_WORKTREE_COLOR=<hex-color>`

The server/UI use those values for worktree-specific branding such as the top banner and dynamically colored favicon.
Authenticated worktree servers also use the `PAPERCLIP_INSTANCE_ID` value to scope Better Auth cookie names.
Browser cookies are shared by host rather than port, so this prevents logging into one `127.0.0.1:<port>` worktree from replacing another worktree server's session cookie.
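
To make the cookie-scoping point concrete, here is a minimal sketch. The naming scheme below is an assumption for illustration only; the actual Better Auth cookie names Paperclip derives from `PAPERCLIP_INSTANCE_ID` are not documented here.

```typescript
// Assumed naming scheme for illustration only: suffix the base cookie name
// with the instance id so two worktree servers on the same host do not
// overwrite each other's session cookie.
function scopedCookieName(baseName: string, instanceId: string): string {
  return `${baseName}.${instanceId}`;
}

// Two worktrees on 127.0.0.1 now keep distinct session cookies.
console.log(scopedCookieName("session_token", "worktree-a"));
console.log(scopedCookieName("session_token", "worktree-b"));
```

Because browsers key cookies by host, not port, any per-instance suffix like this is enough to keep sessions from clobbering each other.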

Print shell exports explicitly when needed:

@@ -115,6 +115,38 @@ If the first real publish returns npm `E404`, check npm-side prerequisites befor
- The initial publish must include `--access public` for a public scoped package.
- npm also requires either account 2FA for publishing or a granular token that is allowed to bypass 2FA.

### Manual first publish for `@paperclipai/mcp-server`

If you need to publish only the MCP server package once by hand, use:

- `@paperclipai/mcp-server`

Recommended flow from the repo root:

```bash
# optional sanity check: this 404s until the first publish exists
npm view @paperclipai/mcp-server version

# make sure the build output is fresh
pnpm --filter @paperclipai/mcp-server build

# confirm your local npm auth before the real publish
npm whoami

# safe preview of the exact publish payload
cd packages/mcp-server
pnpm publish --dry-run --no-git-checks --access public

# real publish
pnpm publish --no-git-checks --access public
```

Notes:

- Publish from `packages/mcp-server/`, not the repo root.
- If `npm view @paperclipai/mcp-server version` already returns the same version that is in [`packages/mcp-server/package.json`](../packages/mcp-server/package.json), do not republish. Bump the version or use the normal repo-wide release flow in [`scripts/release.sh`](../scripts/release.sh).
- The same npm-side prerequisites apply as above: valid npm auth, permission to publish to the `@paperclipai` scope, `--access public`, and the required publish auth/2FA policy.

## Version formats

Paperclip uses calendar versions:

@@ -619,7 +619,7 @@ Per-agent schedule fields in `adapter_config`:

- `enabled` boolean
- `intervalSec` integer (minimum 30)
- `maxConcurrentRuns` integer; new agents default to `5`
- `maxConcurrentRuns` fixed at `1` for V1

Scheduler must skip invocation when:

@@ -146,8 +146,6 @@ Use it for:
- explicit waiting relationships
- automatic wakeups when all blockers resolve

Blocked issues should stay idle while blockers remain unresolved. Paperclip should not create a queued heartbeat run for that issue until the final blocker is done and the `issue_blockers_resolved` wake can start real work.

If a parent is truly waiting on a child, model that with blockers. Do not rely on the parent/child relationship alone.

## 7. Consistent Execution Path Rules

@@ -10,9 +10,6 @@ It is intentionally narrower than [PLUGIN_SPEC.md](./PLUGIN_SPEC.md). The spec i
- Plugin UI runs as same-origin JavaScript inside the main Paperclip app.
- Worker-side host APIs are capability-gated.
- Plugin UI is not sandboxed by manifest capabilities.
- Plugin database migrations are restricted to a host-derived plugin namespace.
- Plugin-owned JSON API routes must be declared in the manifest and are mounted
  only under `/api/plugins/:pluginId/api/*`.
- There is no host-provided shared React component kit for plugins yet.
- `ctx.assets` is not supported in the current runtime.

@@ -80,12 +77,10 @@ Worker:
- secrets
- activity
- state
- database namespace via `ctx.db`
- scoped JSON API routes declared with `apiRoutes`
- entities
  - projects and project workspaces
  - companies
  - issues, comments, namespaced `plugin:<pluginKey>` origins, blocker relations, checkout assertions, assignment wakeups, and orchestration summaries
  - issues and comments
  - agents and agent sessions
  - goals
- data/actions
@@ -94,55 +89,6 @@ Worker:
- metrics
- logger

### Plugin database declarations

First-party or otherwise trusted orchestration plugins can declare:

```ts
database: {
  migrationsDir: "migrations",
  coreReadTables: ["issues"],
}
```

Required capabilities are `database.namespace.migrate` and
`database.namespace.read`; add `database.namespace.write` for runtime mutations.
The host derives `ctx.db.namespace`, runs SQL files in filename order before the
worker starts, records checksums in `plugin_migrations`, and rejects changed
already-applied migrations.

Migration SQL may create or alter objects only inside `ctx.db.namespace`. It may
reference whitelisted `public` core tables for foreign keys or read-only views,
but may not mutate/alter/drop/truncate public tables, create extensions,
triggers, untrusted languages, or runtime multi-statement SQL. Runtime
`ctx.db.query()` is restricted to `SELECT`; runtime `ctx.db.execute()` is
restricted to namespace-local `INSERT`, `UPDATE`, and `DELETE`.
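
The runtime split above can be sketched as a small classifier. This is an illustrative sketch only, not the host's real validator; it just mimics the documented rule that `ctx.db.query()` accepts `SELECT` statements while `ctx.db.execute()` accepts namespace-local writes, and everything else is rejected at runtime.

```typescript
// Illustrative sketch of the documented runtime SQL split (assumption: the
// real host-side validator is stricter and also checks the target namespace).
function classifyRuntimeSql(sqlText: string): "query" | "execute" | "rejected" {
  const head = sqlText.trim().split(/\s+/)[0]?.toLowerCase() ?? "";
  if (head === "select") return "query";
  if (head === "insert" || head === "update" || head === "delete") return "execute";
  // DDL, TRUNCATE, extension creation, etc. are never allowed at runtime.
  return "rejected";
}

console.log(classifyRuntimeSql("SELECT * FROM myns.items"));          // -> "query"
console.log(classifyRuntimeSql("INSERT INTO myns.items VALUES (1)")); // -> "execute"
console.log(classifyRuntimeSql("DROP TABLE public.issues"));          // -> "rejected"
```

A real implementation would also have to reject multi-statement strings and verify that write targets live inside `ctx.db.namespace`.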

### Scoped plugin API routes

Plugins can expose JSON-only routes under their own namespace:

```ts
apiRoutes: [
  {
    routeKey: "initialize",
    method: "POST",
    path: "/issues/:issueId/smoke",
    auth: "board-or-agent",
    capability: "api.routes.register",
    checkoutPolicy: "required-for-agent-in-progress",
    companyResolution: { from: "issue", param: "issueId" },
  },
]
```

The host resolves the plugin, checks that it is ready, enforces
`api.routes.register`, matches the declared method/path, resolves company access,
and applies checkout policy before dispatching to the worker's `onApiRequest`
handler. The worker receives sanitized headers, route params, query, parsed JSON
body, actor context, and company id. Do not use plugin routes to claim core
paths; they always remain under `/api/plugins/:pluginId/api/*`.

UI:

- `usePluginData`

@@ -28,9 +28,6 @@ Current limitations to keep in mind:
- The repo example plugins under `packages/plugins/examples/` are development conveniences. They work from a source checkout and should not be assumed to exist in a generic published build unless they are explicitly shipped with that build.
- Dynamic plugin install is not yet cloud-ready for horizontally scaled or ephemeral deployments. There is no shared artifact store, install coordination, or cross-node distribution layer yet.
- The current runtime does not yet ship a real host-provided plugin UI component kit, and it does not support plugin asset uploads/reads. Treat those as future-scope ideas in this spec, not current implementation promises.
- Scoped plugin API routes are JSON-only and must be declared in `apiRoutes`.
  They mount under `/api/plugins/:pluginId/api/*`; plugins cannot shadow core
  API routes.

In practice, that means the current implementation is a good fit for local development and self-hosted persistent deployments, but not yet for multi-instance cloud plugin distribution.

@@ -627,46 +624,7 @@ Required SDK clients:

Plugins that need filesystem, git, terminal, or process operations handle those directly using standard Node APIs or libraries. The host provides project workspace metadata through `ctx.projects` so plugins can resolve workspace paths, but the host does not proxy low-level OS operations.

## 14.1 Issue Orchestration APIs

Trusted orchestration plugins can create and update Paperclip issues through `ctx.issues` instead of importing server internals. The public issue contract includes parent/project/goal links, board or agent assignees, blocker IDs, labels, billing code, request depth, execution workspace inheritance, and plugin origin metadata.

Origin rules:

- Built-in core issues keep built-in origins such as `manual` and `routine_execution`.
- Plugin-managed issues use `plugin:<pluginKey>` or a sub-kind such as `plugin:<pluginKey>:feature`.
- The host derives the default plugin origin from the installed plugin key and rejects attempts to set `plugin:<otherPluginKey>` origins.
- `originId` is plugin-defined and should be stable for idempotent generated work.

Relation and read helpers:

- `ctx.issues.relations.get(issueId, companyId)`
- `ctx.issues.relations.setBlockedBy(issueId, blockerIssueIds, companyId)`
- `ctx.issues.relations.addBlockers(issueId, blockerIssueIds, companyId)`
- `ctx.issues.relations.removeBlockers(issueId, blockerIssueIds, companyId)`
- `ctx.issues.getSubtree(issueId, companyId, options)`
- `ctx.issues.summaries.getOrchestration({ issueId, companyId, includeSubtree, billingCode })`

Governance helpers:

- `ctx.issues.assertCheckoutOwner({ issueId, companyId, actorAgentId, actorRunId })` lets plugin actions preserve agent-run checkout ownership.
- `ctx.issues.requestWakeup(issueId, companyId, options)` requests assignment wakeups through host heartbeat semantics, including terminal-status, blocker, assignee, and budget hard-stop checks.
- `ctx.issues.requestWakeups(issueIds, companyId, options)` applies the same host-owned wakeup semantics to a batch and may use an idempotency key prefix for stable coordinator retries.
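
The idempotency-key prefix mentioned for batch wakeups can be made concrete with a small sketch. The helper below is hypothetical (not an SDK export); it only illustrates deriving stable per-issue keys so a coordinator retry of the same batch maps to the same logical wakeup requests.

```typescript
// Hypothetical helper, not part of the real SDK: derive one stable key per
// issue from a coordinator-chosen prefix, so retries reuse identical keys.
function wakeupIdempotencyKeys(prefix: string, issueIds: string[]): string[] {
  // Sort first so the resulting key list is order-independent across retries.
  return [...issueIds].sort().map((issueId) => `${prefix}:${issueId}`);
}

console.log(wakeupIdempotencyKeys("coordinator-retry-7", ["iss_b", "iss_a"]));
// -> ["coordinator-retry-7:iss_a", "coordinator-retry-7:iss_b"]
```

Any scheme works as long as the same logical batch always produces the same keys; the host can then deduplicate repeated wakeup requests.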

Plugin-originated issue, relation, document, comment, and wakeup mutations must write activity entries with `actorType: "plugin"` and details fields for `sourcePluginId`, `sourcePluginKey`, `initiatingActorType`, `initiatingActorId`, and `initiatingRunId` when a user or agent run initiated the plugin work.

Scoped API routes:

- `apiRoutes[]` declares `routeKey`, `method`, plugin-local `path`, `auth`,
  `capability`, optional checkout policy, and company resolution.
- The host enforces auth, company access, `api.routes.register`, route matching,
  and checkout policy before worker dispatch.
- The worker implements `onApiRequest(input)` and returns a JSON response shape
  `{ status?, headers?, body? }`.
- Only safe request headers are forwarded; auth/cookie headers are never passed
  to the worker.

## 14.2 Example SDK Shape
## 14.1 Example SDK Shape

```ts
/** Top-level helper for defining a plugin with type checking */
@@ -738,24 +696,16 @@ The host enforces capabilities in the SDK layer and refuses calls outside the gr
- `project.workspaces.read`
- `issues.read`
- `issue.comments.read`
- `issue.documents.read`
- `issue.relations.read`
- `issue.subtree.read`
- `agents.read`
- `goals.read`
- `activity.read`
- `costs.read`
- `issues.orchestration.read`

### Data Write

- `issues.create`
- `issues.update`
- `issue.comments.create`
- `issue.documents.write`
- `issue.relations.write`
- `issues.checkout`
- `issues.wakeup`
- `assets.write`
- `assets.read`
- `activity.log.write`
@@ -822,13 +772,6 @@ Minimum event set:
- `issue.created`
- `issue.updated`
- `issue.comment.created`
- `issue.document.created`
- `issue.document.updated`
- `issue.document.deleted`
- `issue.relations.updated`
- `issue.checked_out`
- `issue.released`
- `issue.assignment_wakeup_requested`
- `agent.created`
- `agent.updated`
- `agent.status_changed`
@@ -838,8 +781,6 @@ Minimum event set:
- `agent.run.cancelled`
- `approval.created`
- `approval.decided`
- `budget.incident.opened`
- `budget.incident.resolved`
- `cost_event.created`
- `activity.logged`

@@ -1297,8 +1238,6 @@ Plugin-originated mutations should write:

- `actor_type = plugin`
- `actor_id = <plugin-id>`
- details include `sourcePluginId` and `sourcePluginKey`
- details include `initiatingActorType`, `initiatingActorId`, and `initiatingRunId` when a user or agent run triggered the plugin work

## 21.5 Plugin Migrations

@@ -114,14 +114,14 @@ If the connection drops, the UI reconnects automatically.

1. Enable timer wakeups (for example every 300s)
2. Keep assignment wakeups on
3. Use a focused prompt template that tells agents to act in the same heartbeat, leave durable progress, and mark blocked work with an owner/action
3. Use a focused prompt template
4. Watch run logs and adjust prompt/config over time

## 7.2 Event-driven loop (less constant polling)

1. Disable timer or set a long interval
2. Keep wake-on-assignment enabled
3. Use child issues, comments, and on-demand wakeups for handoffs instead of loops that poll agents, sessions, or processes
3. Use on-demand wakeups for manual nudges

## 7.3 Safety-first loop

@@ -1,299 +0,0 @@
|
||||
# Invite Flow State Map
|
||||
|
||||
Status: Current implementation map
|
||||
Date: 2026-04-13
|
||||
|
||||
This document maps the current invite creation and acceptance states implemented in:
|
||||
|
||||
- `ui/src/pages/CompanyInvites.tsx`
|
||||
- `ui/src/pages/CompanySettings.tsx`
|
||||
- `ui/src/pages/InviteLanding.tsx`
|
||||
- `server/src/routes/access.ts`
|
||||
- `server/src/lib/join-request-dedupe.ts`
|
||||
|
||||
## State Legend
|
||||
|
||||
- Invite state: `active`, `revoked`, `accepted`, `expired`
|
||||
- Join request status: `pending_approval`, `approved`, `rejected`
|
||||
- Claim secret state for agent joins: `available`, `consumed`, `expired`
|
||||
- Invite type: `company_join` or `bootstrap_ceo`
|
||||
- Join type: `human`, `agent`, or `both`
|
||||
|
||||
## Entity Lifecycle
|
||||
|
||||
```mermaid
|
||||
flowchart TD
|
||||
Board[Board user on invite screen]
|
||||
HumanInvite[Create human company invite]
|
||||
OpenClawInvite[Generate OpenClaw invite prompt]
|
||||
Active[Invite state: active]
|
||||
Revoked[Invite state: revoked]
|
||||
Expired[Invite state: expired]
|
||||
Accepted[Invite state: accepted]
|
||||
BootstrapDone[Bootstrap accepted<br/>no join request]
|
||||
HumanReuse{Matching human join request<br/>already exists for same user/email?}
|
||||
HumanPending[Join request<br/>pending_approval]
|
||||
HumanApproved[Join request<br/>approved]
|
||||
HumanRejected[Join request<br/>rejected]
|
||||
AgentPending[Agent join request<br/>pending_approval<br/>+ optional claim secret]
|
||||
AgentApproved[Agent join request<br/>approved]
|
||||
AgentRejected[Agent join request<br/>rejected]
|
||||
ClaimAvailable[Claim secret available]
|
||||
ClaimConsumed[Claim secret consumed]
|
||||
ClaimExpired[Claim secret expired]
|
||||
OpenClawReplay[Special replay path:<br/>accepted invite can be POSTed again<br/>for openclaw_gateway only]
|
||||
|
||||
Board --> HumanInvite --> Active
|
||||
Board --> OpenClawInvite --> Active
|
||||
Active --> Revoked: revoke
|
||||
Active --> Expired: expiresAt passes
|
||||
|
||||
Active --> BootstrapDone: bootstrap_ceo accept
|
||||
BootstrapDone --> Accepted
|
||||
|
||||
Active --> HumanReuse: human accept
|
||||
HumanReuse --> HumanPending: reuse existing pending request
|
||||
HumanReuse --> HumanApproved: reuse existing approved request
|
||||
HumanReuse --> HumanPending: no reusable request<br/>create new request
|
||||
HumanPending --> HumanApproved: board approves
|
||||
HumanPending --> HumanRejected: board rejects
|
||||
HumanPending --> Accepted
|
||||
HumanApproved --> Accepted
|
||||
|
||||
Active --> AgentPending: agent accept
|
||||
AgentPending --> Accepted
|
||||
AgentPending --> AgentApproved: board approves
|
||||
AgentPending --> AgentRejected: board rejects
|
||||
AgentApproved --> ClaimAvailable: createdAgentId + claimSecretHash
|
||||
ClaimAvailable --> ClaimConsumed: POST claim-api-key succeeds
|
||||
ClaimAvailable --> ClaimExpired: secret expires
|
||||
|
||||
Accepted --> OpenClawReplay
|
||||
OpenClawReplay --> AgentPending
|
||||
OpenClawReplay --> AgentApproved
|
||||
```

## Board-Side Screen States

```mermaid
stateDiagram-v2
    [*] --> CompanySelection

    CompanySelection --> NoCompany: no company selected
    CompanySelection --> LoadingHistory: selectedCompanyId present
    LoadingHistory --> HistoryError: listInvites failed
    LoadingHistory --> Ready: listInvites succeeded

    state Ready {
        [*] --> EmptyHistory
        EmptyHistory --> PopulatedHistory: invites exist
        PopulatedHistory --> LoadingMore: View more
        LoadingMore --> PopulatedHistory: next page loaded

        PopulatedHistory --> RevokePending: Revoke active invite
        RevokePending --> PopulatedHistory: revoke succeeded
        RevokePending --> PopulatedHistory: revoke failed

        EmptyHistory --> CreatePending: Create invite
        PopulatedHistory --> CreatePending: Create invite
        CreatePending --> LatestInviteVisible: create succeeded
        CreatePending --> Ready: create failed
        LatestInviteVisible --> CopiedToast: clipboard copy succeeded
        LatestInviteVisible --> Ready: navigate away or refresh
    }

    CompanySelection --> OpenClawPromptReady: Company settings prompt generator
    OpenClawPromptReady --> OpenClawPromptPending: Generate OpenClaw Invite Prompt
    OpenClawPromptPending --> OpenClawSnippetVisible: prompt generated
    OpenClawPromptPending --> OpenClawPromptReady: generation failed
```

## Invite Landing Screen States

```mermaid
stateDiagram-v2
    [*] --> TokenGate

    TokenGate --> InvalidToken: token missing
    TokenGate --> Loading: token present
    Loading --> InviteUnavailable: invite fetch failed or invite not returned
    Loading --> CheckingAccess: signed-in session and invite.companyId
    Loading --> InviteResolved: invite loaded without membership check
    Loading --> AcceptedInviteSummary: invite already consumed<br/>but linked join request still exists

    CheckingAccess --> RedirectToBoard: current user already belongs to company
    CheckingAccess --> InviteResolved: membership check finished and no join-request summary state is active
    CheckingAccess --> AcceptedInviteSummary: membership check finished and invite has joinRequestStatus

    state InviteResolved {
        [*] --> Branch
        Branch --> AgentForm: company_join + allowedJoinTypes=agent
        Branch --> InlineAuth: authenticated mode + no session + join is not agent-only
        Branch --> AcceptReady: bootstrap invite or human-ready session/local_trusted

        InlineAuth --> InlineAuth: toggle sign-up/sign-in
        InlineAuth --> InlineAuth: auth validation or auth error message
        InlineAuth --> RedirectToBoard: auth succeeded and company membership already exists
        InlineAuth --> AcceptPending: auth succeeded and invite still needs acceptance

        AgentForm --> AcceptPending: submit request
        AgentForm --> AgentForm: validation or accept error

        AcceptReady --> AcceptPending: Accept invite
        AcceptReady --> AcceptReady: accept error
    }

    AcceptPending --> BootstrapComplete: bootstrapAccepted=true
    AcceptPending --> RedirectToBoard: join status=approved
    AcceptPending --> PendingApprovalResult: join status=pending_approval
    AcceptPending --> RejectedResult: join status=rejected

    state AcceptedInviteSummary {
        [*] --> SummaryBranch
        SummaryBranch --> PendingApprovalReload: joinRequestStatus=pending_approval
        SummaryBranch --> OpeningCompany: joinRequestStatus=approved<br/>and human invite user is now a member
        SummaryBranch --> RejectedReload: joinRequestStatus=rejected
        SummaryBranch --> ConsumedReload: approved agent invite or other consumed state
    }

    PendingApprovalResult --> PendingApprovalReload: reload after submit
    RejectedResult --> RejectedReload: reload after board rejects
    RedirectToBoard --> OpeningCompany: brief pre-navigation render when approved membership is detected
    OpeningCompany --> RedirectToBoard: navigate to board
```

## Sequence Diagrams

### Human Invite Creation And First Acceptance

```mermaid
sequenceDiagram
    autonumber
    actor Board as Board user
    participant Settings as Company Invites UI
    participant API as Access routes
    participant Invites as invites table
    actor Invitee as Invite recipient
    participant Landing as Invite landing UI
    participant Auth as Auth session
    participant Join as join_requests table

    Board->>Settings: Choose role and click Create invite
    Settings->>API: POST /api/companies/:companyId/invites
    API->>Invites: Insert active invite
    API-->>Settings: inviteUrl + metadata

    Invitee->>Landing: Open invite URL
    Landing->>API: GET /api/invites/:token
    API->>Invites: Load active invite
    API-->>Landing: Invite summary

    alt Authenticated mode and no session
        Landing->>Auth: Sign up or sign in
        Auth-->>Landing: Session established
    end

    Landing->>API: POST /api/invites/:token/accept (requestType=human)
    API->>Join: Look for reusable human join request
    alt Reusable pending or approved request exists
        API->>Invites: Mark invite accepted
        API-->>Landing: Existing join request status
    else No reusable request exists
        API->>Invites: Mark invite accepted
        API->>Join: Insert pending_approval join request
        API-->>Landing: New pending_approval join request
    end
```

### Human Approval And Reload Path

```mermaid
sequenceDiagram
    autonumber
    actor Invitee as Invite recipient
    participant Landing as Invite landing UI
    participant API as Access routes
    participant Join as join_requests table
    actor Approver as Company admin
    participant Queue as Access queue UI
    participant Membership as company_memberships + grants

    Invitee->>Landing: Reload consumed invite URL
    Landing->>API: GET /api/invites/:token
    API->>Join: Load join request by inviteId
    API-->>Landing: joinRequestStatus + joinRequestType

    alt joinRequestStatus = pending_approval
        Landing-->>Invitee: Show waiting-for-approval panel
        Approver->>Queue: Review request in Company Settings -> Access
        Queue->>API: POST /companies/:companyId/join-requests/:requestId/approve
        API->>Membership: Ensure membership and grants
        API->>Join: Mark join request approved
        Invitee->>Landing: Refresh after approval
        Landing->>API: GET /api/invites/:token
        API->>Join: Reload approved join request
        API-->>Landing: approved status
        Landing-->>Invitee: Opening company and redirect
    else joinRequestStatus = rejected
        Landing-->>Invitee: Show rejected error panel
    else joinRequestStatus = approved but membership missing
        Landing-->>Invitee: Fall through to consumed/unavailable state
    end
```

### Agent Invite Approval, Claim, And Replay

```mermaid
sequenceDiagram
    autonumber
    actor Board as Board user
    participant Settings as Company Settings UI
    participant API as Access routes
    participant Invites as invites table
    actor Gateway as OpenClaw gateway agent
    participant Join as join_requests table
    actor Approver as Company admin
    participant Agents as agents table
    participant Keys as agent_api_keys table

    Board->>Settings: Generate OpenClaw invite prompt
    Settings->>API: POST /api/companies/:companyId/openclaw-invite-prompt
    API->>Invites: Insert active agent invite
    API-->>Settings: Prompt text + invite token

    Gateway->>API: POST /api/invites/:token/accept (agent, openclaw_gateway)
    API->>Invites: Mark invite accepted
    API->>Join: Insert pending_approval join request + claimSecretHash
    API-->>Gateway: requestId + claimSecret + claimApiKeyPath

    Approver->>API: POST /companies/:companyId/join-requests/:requestId/approve
    API->>Agents: Create agent + membership + grants
    API->>Join: Mark request approved and store createdAgentId

    Gateway->>API: POST /api/join-requests/:requestId/claim-api-key (claimSecret)
    API->>Keys: Create initial API key
    API->>Join: Mark claim secret consumed
    API-->>Gateway: Plaintext Paperclip API key

    opt Replay accepted invite for updated gateway defaults
        Gateway->>API: POST /api/invites/:token/accept again
        API->>Join: Reuse existing approved or pending request
        API->>Agents: Update approved agent adapter config when applicable
        API-->>Gateway: Updated join request payload
    end
```

## Notes

- `GET /api/invites/:token` treats `revoked` and `expired` invites as unavailable. Accepted invites remain resolvable when they already have a linked join request, and the summary now includes `joinRequestStatus` plus `joinRequestType`.
- Human acceptance consumes the invite immediately and then either creates a new join request or reuses an existing `pending_approval` or `approved` human join request for the same user/email.
- The landing page has two layers of post-accept UI:
  - immediate mutation-result UI from `POST /api/invites/:token/accept`
  - reload-time summary UI from `GET /api/invites/:token` once the invite has already been consumed
- Reload behavior for accepted company invites is now status-sensitive:
  - `pending_approval` re-renders the waiting-for-approval panel
  - `rejected` renders the "This join request was not approved." error panel
  - `approved` only becomes a success path for human invites after membership is visible to the current session; otherwise the page falls through to the generic consumed/unavailable state
- `GET /api/invites/:token/logo` still rejects accepted invites, so accepted-invite reload states may fall back to the generated company icon even though the summary payload still carries `companyLogoUrl`.
- The only accepted-invite replay path in the current implementation is `POST /api/invites/:token/accept` for `agent` requests with `adapterType=openclaw_gateway`, and only when the existing join request is still `pending_approval` or already `approved`.
- `bootstrap_ceo` invites are one-time and do not create join requests.
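
The human-accept reuse rule above can be sketched as a small decision function. This is an illustrative sketch only; the names `JoinRequest` and `chooseHumanJoinRequest` are assumptions and not the actual Paperclip implementation.

```typescript
// Hypothetical sketch of the reuse rule: reuse an existing pending_approval
// or approved human request for the same email; a rejected request is never
// reused, so a new pending_approval request must be created instead.
type JoinStatus = "pending_approval" | "approved" | "rejected";

interface JoinRequest {
  email: string;
  status: JoinStatus;
}

function chooseHumanJoinRequest(
  existing: JoinRequest[],
  email: string,
): JoinRequest | "create_new" {
  const reusable = existing.find(
    (r) =>
      r.email === email &&
      (r.status === "pending_approval" || r.status === "approved"),
  );
  return reusable ?? "create_new";
}
```

Either way, the invite itself is marked accepted before the branch is taken, which is why reloads rely on the join-request summary rather than invite state.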

@@ -124,14 +124,14 @@ If the connection drops, the UI reconnects automatically.

1. Enable timer wakeups (for example every 300s)
2. Keep assignment wakeups on
3. Use a focused prompt template that tells agents to act in the same heartbeat, leave durable progress, and mark blocked work with an owner/action
3. Use a focused prompt template
4. Watch run logs and adjust prompt/config over time

## 7.2 Event-driven loop (less constant polling)

1. Disable timer or set a long interval
2. Keep wake-on-assignment enabled
3. Use child issues, comments, and on-demand wakeups for handoffs instead of loops that poll agents, sessions, or processes
3. Use on-demand wakeups for manual nudges

## 7.3 Safety-first loop

@@ -48,8 +48,6 @@
"guides/board-operator/managing-tasks",
"guides/board-operator/execution-workspaces-and-runtime-services",
"guides/board-operator/delegation",
"guides/board-operator/execution-workspaces-and-runtime-services",
"guides/board-operator/delegation",
"guides/board-operator/approvals",
"guides/board-operator/costs-and-budgets",
"guides/board-operator/activity-log",

@@ -66,9 +66,7 @@ Read ancestors to understand why this task exists. If woken by a specific commen

### Step 7: Do the Work

Use your tools and capabilities to complete the task. If the issue is actionable, take a concrete action in the same heartbeat. Do not stop at a plan unless the issue asked for planning.

Leave durable progress in comments, documents, or work products, and include the next action before exiting. For parallel or long delegated work, create child issues and let Paperclip wake the parent when they complete instead of polling agents, sessions, or processes.
Use your tools and capabilities to complete the task.

### Step 8: Update Status

@@ -104,22 +102,6 @@ Always set `parentId` and `goalId` on subtasks.
- **Always checkout** before working — never PATCH to `in_progress` manually
- **Never retry a 409** — the task belongs to someone else
- **Always comment** on in-progress work before exiting a heartbeat
- **Start actionable work** in the same heartbeat; planning-only exits are for planning tasks
- **Leave a clear next action** in durable issue context
- **Use child issues instead of polling** for long or parallel delegated work
- **Always set parentId** on subtasks
- **Never cancel cross-team tasks** — reassign to your manager
- **Escalate when stuck** — use your chain of command

## Run Liveness

Paperclip records run liveness as metadata on heartbeat runs. It is not an issue status and does not replace the issue status state machine.

- Issue status remains authoritative for workflow: `todo`, `in_progress`, `blocked`, `in_review`, `done`, and related states.
- Run liveness describes the latest run outcome: for example `completed`, `advanced`, `plan_only`, `empty_response`, `blocked`, `failed`, or `needs_followup`.
- Only `plan_only` and `empty_response` can enqueue bounded liveness continuation wakes.
- Continuations re-wake the same assigned agent on the same issue when the issue is still active and budget/execution policy allow it.
- `continuationAttempt` counts semantic liveness continuations for a source run chain. It is separate from process recovery, queued wake delivery, adapter session resume, and other operational retries.
- Liveness continuation wake prompts include the attempt, source run, liveness state, liveness reason, and the instruction for the next heartbeat.
- Continuations do not mark the issue `blocked` or `done`. If automatic continuations are exhausted, Paperclip leaves an audit comment so a human or manager can clarify, block, or assign follow-up work.
- Workspace provisioning alone is not treated as concrete task progress. Durable progress should appear as tool/action events, issue comments, document or work-product revisions, activity log entries, commits, or tests.
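
The continuation gate described in the bullets above can be sketched as follows. The function name `shouldEnqueueContinuation` is a hypothetical illustration, not the real implementation; only the state list and the plan_only/empty_response rule come from this document.

```typescript
// Illustrative sketch: only plan_only and empty_response liveness states may
// enqueue a bounded continuation wake, and only while attempts remain.
type RunLiveness =
  | "completed"
  | "advanced"
  | "plan_only"
  | "empty_response"
  | "blocked"
  | "failed"
  | "needs_followup";

function shouldEnqueueContinuation(
  state: RunLiveness,
  continuationAttempt: number,
  maxAttempts: number,
): boolean {
  // Exhausted chains leave an audit comment instead of another wake.
  if (continuationAttempt >= maxAttempts) return false;
  return state === "plan_only" || state === "empty_response";
}
```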

@@ -20,13 +20,6 @@ The Heartbeat Procedure:
8. Update status: PATCH /api/issues/{issueId} with status and comment
9. Delegate if needed: POST /api/companies/{companyId}/issues

Execution Contract:
- If the issue is actionable, start concrete work in this heartbeat. Do not stop at a plan unless the issue asks for planning.
- Leave durable progress in comments, documents, or work products, with a clear next action.
- Use child issues for parallel or long delegated work instead of polling agents, sessions, or processes.
- If blocked, PATCH the issue to blocked and name the unblock owner and action.
- Respect budget, pause/cancel, approval gates, and company boundaries.

Critical Rules:
- Always checkout before working. Never PATCH to in_progress manually.
- Never retry a 409. The task belongs to someone else.

@@ -11,8 +11,6 @@
"dev:stop": "pnpm --filter @paperclipai/server exec tsx ../scripts/dev-service.ts stop",
"dev:server": "pnpm --filter @paperclipai/server dev",
"dev:ui": "pnpm --filter @paperclipai/ui dev",
"storybook": "pnpm --filter @paperclipai/ui storybook",
"build-storybook": "pnpm --filter @paperclipai/ui build-storybook",
"build": "pnpm run preflight:workspace-links && pnpm -r build",
"typecheck": "pnpm run preflight:workspace-links && pnpm -r typecheck",
"test": "pnpm run test:run",
@@ -20,7 +18,6 @@
"test:run": "pnpm run preflight:workspace-links && vitest run",
"db:generate": "pnpm --filter @paperclipai/db generate",
"db:migrate": "pnpm --filter @paperclipai/db migrate",
"issue-references:backfill": "pnpm run preflight:workspace-links && tsx scripts/backfill-issue-reference-mentions.ts",
"secrets:migrate-inline-env": "tsx scripts/migrate-inline-env-secrets.ts",
"db:backup": "./scripts/backup-db.sh",
"paperclipai": "node cli/node_modules/tsx/dist/cli.mjs cli/src/index.ts",
@@ -37,7 +34,6 @@
"smoke:openclaw-sse-standalone": "./scripts/smoke/openclaw-sse-standalone.sh",
"test:e2e": "npx playwright test --config tests/e2e/playwright.config.ts",
"test:e2e:headed": "npx playwright test --config tests/e2e/playwright.config.ts --headed",
"test:e2e:multiuser-authenticated": "npx playwright test --config tests/e2e/playwright-multiuser-authenticated.config.ts",
"evals:smoke": "cd evals/promptfoo && npx promptfoo@0.103.3 eval",
"test:release-smoke": "npx playwright test --config tests/release-smoke/playwright.config.ts",
"test:release-smoke:headed": "npx playwright test --config tests/release-smoke/playwright.config.ts --headed",

@@ -1,13 +1,6 @@
import { randomUUID } from "node:crypto";
import { describe, expect, it } from "vitest";
import {
  appendWithByteCap,
  DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
  renderPaperclipWakePrompt,
  runningProcesses,
  runChildProcess,
  stringifyPaperclipWakePayload,
} from "./server-utils.js";
import { runChildProcess } from "./server-utils.js";

function isPidAlive(pid: number) {
  try {
@@ -27,37 +20,7 @@ async function waitForPidExit(pid: number, timeoutMs = 2_000) {
  return !isPidAlive(pid);
}

async function waitForTextMatch(read: () => string, pattern: RegExp, timeoutMs = 1_000) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const value = read();
    const match = value.match(pattern);
    if (match) return match;
    await new Promise((resolve) => setTimeout(resolve, 25));
  }
  return read().match(pattern);
}

describe("runChildProcess", () => {
  it("does not arm a timeout when timeoutSec is 0", async () => {
    const result = await runChildProcess(
      randomUUID(),
      process.execPath,
      ["-e", "setTimeout(() => process.stdout.write('done'), 150);"],
      {
        cwd: process.cwd(),
        env: {},
        timeoutSec: 0,
        graceSec: 1,
        onLog: async () => {},
      },
    );

    expect(result.exitCode).toBe(0);
    expect(result.timedOut).toBe(false);
    expect(result.stdout).toBe("done");
  });

  it("waits for onSpawn before sending stdin to the child", async () => {
    const spawnDelayMs = 150;
    const startedAt = Date.now();
@@ -122,252 +85,4 @@ describe("runChildProcess", () => {

    expect(await waitForPidExit(descendantPid!, 2_000)).toBe(true);
  });

  it.skipIf(process.platform === "win32")("cleans up a lingering process group after terminal output and child exit", async () => {
    const result = await runChildProcess(
      randomUUID(),
      process.execPath,
      [
        "-e",
        [
          "const { spawn } = require('node:child_process');",
          "const child = spawn(process.execPath, ['-e', 'setInterval(() => {}, 1000)'], { stdio: ['ignore', 'inherit', 'ignore'] });",
          "process.stdout.write(`descendant:${child.pid}\\n`);",
          "process.stdout.write(`${JSON.stringify({ type: 'result', result: 'done' })}\\n`);",
          "setTimeout(() => process.exit(0), 25);",
        ].join(" "),
      ],
      {
        cwd: process.cwd(),
        env: {},
        timeoutSec: 0,
        graceSec: 1,
        onLog: async () => {},
        terminalResultCleanup: {
          graceMs: 100,
          hasTerminalResult: ({ stdout }) => stdout.includes('"type":"result"'),
        },
      },
    );

    const descendantPid = Number.parseInt(result.stdout.match(/descendant:(\d+)/)?.[1] ?? "", 10);
    expect(result.timedOut).toBe(false);
    expect(result.exitCode).toBe(0);
    expect(Number.isInteger(descendantPid) && descendantPid > 0).toBe(true);
    expect(await waitForPidExit(descendantPid, 2_000)).toBe(true);
  });

  it.skipIf(process.platform === "win32")("does not clean up noisy runs that have no terminal output", async () => {
    const runId = randomUUID();
    let observed = "";
    const resultPromise = runChildProcess(
      runId,
      process.execPath,
      [
        "-e",
        [
          "const { spawn } = require('node:child_process');",
          "const child = spawn(process.execPath, ['-e', \"setInterval(() => process.stdout.write('noise\\\\n'), 50)\"], { stdio: ['ignore', 'inherit', 'ignore'] });",
          "process.stdout.write(`descendant:${child.pid}\\n`);",
          "setTimeout(() => process.exit(0), 25);",
        ].join(" "),
      ],
      {
        cwd: process.cwd(),
        env: {},
        timeoutSec: 0,
        graceSec: 1,
        onLog: async (_stream, chunk) => {
          observed += chunk;
        },
        terminalResultCleanup: {
          graceMs: 50,
          hasTerminalResult: ({ stdout }) => stdout.includes('"type":"result"'),
        },
      },
    );

    const pidMatch = await waitForTextMatch(() => observed, /descendant:(\d+)/);
    const descendantPid = Number.parseInt(pidMatch?.[1] ?? "", 10);
    expect(Number.isInteger(descendantPid) && descendantPid > 0).toBe(true);

    const race = await Promise.race([
      resultPromise.then(() => "settled" as const),
      new Promise<"pending">((resolve) => setTimeout(() => resolve("pending"), 300)),
    ]);
    expect(race).toBe("pending");
    expect(isPidAlive(descendantPid)).toBe(true);

    const running = runningProcesses.get(runId) as
      | { child: { kill(signal: NodeJS.Signals): boolean }; processGroupId: number | null }
      | undefined;
    try {
      if (running?.processGroupId) {
        process.kill(-running.processGroupId, "SIGKILL");
      } else {
        running?.child.kill("SIGKILL");
      }
      await resultPromise;
    } finally {
      runningProcesses.delete(runId);
      if (isPidAlive(descendantPid)) {
        try {
          process.kill(descendantPid, "SIGKILL");
        } catch {
          // Ignore cleanup races.
        }
      }
    }
  });
});

describe("renderPaperclipWakePrompt", () => {
  it("keeps the default local-agent prompt action-oriented", () => {
    expect(DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE).toContain("Start actionable work in this heartbeat");
    expect(DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE).toContain("do not stop at a plan");
    expect(DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE).toContain("Use child issues");
    expect(DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE).toContain("instead of polling agents, sessions, or processes");
    expect(DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE).toContain(
      "Respect budget, pause/cancel, approval gates, and company boundaries",
    );
  });

  it("adds the execution contract to scoped wake prompts", () => {
    const prompt = renderPaperclipWakePrompt({
      reason: "issue_assigned",
      issue: {
        id: "issue-1",
        identifier: "PAP-1580",
        title: "Update prompts",
        status: "in_progress",
      },
      commentWindow: {
        requestedCount: 0,
        includedCount: 0,
        missingCount: 0,
      },
      comments: [],
      fallbackFetchNeeded: false,
    });

    expect(prompt).toContain("## Paperclip Wake Payload");
    expect(prompt).toContain("Execution contract: take concrete action in this heartbeat");
    expect(prompt).toContain("use child issues instead of polling");
    expect(prompt).toContain("mark blocked work with the unblock owner/action");
  });

  it("renders dependency-blocked interaction guidance", () => {
    const prompt = renderPaperclipWakePrompt({
      reason: "issue_commented",
      issue: {
        id: "issue-1",
        identifier: "PAP-1703",
        title: "Blocked parent",
        status: "todo",
      },
      dependencyBlockedInteraction: true,
      unresolvedBlockerIssueIds: ["blocker-1"],
      unresolvedBlockerSummaries: [
        {
          id: "blocker-1",
          identifier: "PAP-1723",
          title: "Finish blocker",
          status: "todo",
          priority: "medium",
        },
      ],
      commentWindow: {
        requestedCount: 1,
        includedCount: 1,
        missingCount: 0,
      },
      commentIds: ["comment-1"],
      latestCommentId: "comment-1",
      comments: [{ id: "comment-1", body: "hello" }],
      fallbackFetchNeeded: false,
    });

    expect(prompt).toContain("dependency-blocked interaction: yes");
    expect(prompt).toContain("respond or triage the human comment");
    expect(prompt).toContain("PAP-1723 Finish blocker (todo)");
  });

  it("includes continuation and child issue summaries in structured wake context", () => {
    const payload = {
      reason: "issue_children_completed",
      issue: {
        id: "parent-1",
        identifier: "PAP-100",
        title: "Integrate child work",
        status: "in_progress",
        priority: "medium",
      },
      continuationSummary: {
        key: "continuation-summary",
        title: "Continuation Summary",
        body: "# Continuation Summary\n\n## Next Action\n\n- Integrate child outputs.",
        updatedAt: "2026-04-18T12:00:00.000Z",
      },
      livenessContinuation: {
        attempt: 2,
        maxAttempts: 2,
        sourceRunId: "run-1",
        state: "plan_only",
        reason: "Run described future work without concrete action evidence",
        instruction: "Take the first concrete action now.",
      },
      childIssueSummaries: [
        {
          id: "child-1",
          identifier: "PAP-101",
          title: "Implement helper",
          status: "done",
          priority: "medium",
          summary: "Added the helper route and tests.",
        },
      ],
    };

    expect(JSON.parse(stringifyPaperclipWakePayload(payload) ?? "{}")).toMatchObject({
      continuationSummary: {
        body: expect.stringContaining("Continuation Summary"),
      },
      livenessContinuation: {
        attempt: 2,
        maxAttempts: 2,
        sourceRunId: "run-1",
        state: "plan_only",
        instruction: "Take the first concrete action now.",
      },
      childIssueSummaries: [
        {
          identifier: "PAP-101",
          summary: "Added the helper route and tests.",
        },
      ],
    });

    const prompt = renderPaperclipWakePrompt(payload);
    expect(prompt).toContain("Issue continuation summary:");
    expect(prompt).toContain("Integrate child outputs.");
    expect(prompt).toContain("Run liveness continuation:");
    expect(prompt).toContain("- attempt: 2/2");
    expect(prompt).toContain("- source run: run-1");
    expect(prompt).toContain("- liveness state: plan_only");
    expect(prompt).toContain("- reason: Run described future work without concrete action evidence");
    expect(prompt).toContain("- instruction: Take the first concrete action now.");
    expect(prompt).toContain("Direct child issue summaries:");
    expect(prompt).toContain("PAP-101 Implement helper (done)");
    expect(prompt).toContain("Added the helper route and tests.");
  });
});

describe("appendWithByteCap", () => {
  it("keeps valid UTF-8 when trimming through multibyte text", () => {
    const output = appendWithByteCap("prefix ", "hello — world", 7);

    expect(output).not.toContain("\uFFFD");
    expect(Buffer.from(output, "utf8").toString("utf8")).toBe(output);
    expect(Buffer.byteLength(output, "utf8")).toBeLessThanOrEqual(7);
  });
});

@@ -16,11 +16,6 @@ export interface RunProcessResult {
|
||||
startedAt: string | null;
|
||||
}
|
||||
|
||||
export interface TerminalResultCleanupOptions {
|
||||
hasTerminalResult: (output: { stdout: string; stderr: string }) => boolean;
|
||||
graceMs?: number;
|
||||
}
|
||||
|
||||
interface RunningProcess {
|
||||
child: ChildProcess;
|
||||
graceSec: number;
|
||||
@@ -34,10 +29,6 @@ interface SpawnTarget {
|
||||
|
||||
type ChildProcessWithEvents = ChildProcess & {
|
||||
on(event: "error", listener: (err: Error) => void): ChildProcess;
|
||||
on(
|
||||
event: "exit",
|
||||
listener: (code: number | null, signal: NodeJS.Signals | null) => void,
|
||||
): ChildProcess;
|
||||
on(
|
||||
event: "close",
|
||||
listener: (code: number | null, signal: NodeJS.Signals | null) => void,
|
||||
@@ -69,25 +60,12 @@ function signalRunningProcess(
|
||||
export const runningProcesses = new Map<string, RunningProcess>();
|
||||
export const MAX_CAPTURE_BYTES = 4 * 1024 * 1024;
|
||||
export const MAX_EXCERPT_BYTES = 32 * 1024;
|
||||
const TERMINAL_RESULT_SCAN_OVERLAP_CHARS = 64 * 1024;
|
||||
const SENSITIVE_ENV_KEY = /(key|token|secret|password|passwd|authorization|cookie)/i;
|
||||
const PAPERCLIP_SKILL_ROOT_RELATIVE_CANDIDATES = [
|
||||
"../../skills",
|
||||
"../../../../../skills",
|
||||
];
|
||||
|
||||
export const DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE = [
  "You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
  "",
  "Execution contract:",
  "- Start actionable work in this heartbeat; do not stop at a plan unless the issue asks for planning.",
  "- Leave durable progress in comments, documents, or work products with a clear next action.",
  "- Use child issues for parallel or long delegated work instead of polling agents, sessions, or processes.",
  "- If woken by a human comment on a dependency-blocked issue, respond or triage the comment without treating the blocked deliverable work as unblocked.",
  "- If blocked, mark the issue blocked and name the unblock owner and action.",
  "- Respect budget, pause/cancel, approval gates, and company boundaries.",
].join("\n");

export interface PaperclipSkillEntry {
  key: string;
  runtimeName: string;

@@ -202,22 +180,6 @@ export function appendWithCap(prev: string, chunk: string, cap = MAX_CAPTURE_BYT
  return combined.length > cap ? combined.slice(combined.length - cap) : combined;
}

export function appendWithByteCap(prev: string, chunk: string, cap = MAX_CAPTURE_BYTES) {
  const combined = prev + chunk;
  const bytes = Buffer.byteLength(combined, "utf8");
  if (bytes <= cap) return combined;

  const buffer = Buffer.from(combined, "utf8");
  let start = Math.max(0, bytes - cap);
  while (start < buffer.length && (buffer[start]! & 0xc0) === 0x80) start += 1;
  return buffer.subarray(start).toString("utf8");
}

function resumeReadable(readable: { resume: () => unknown; destroyed?: boolean } | null | undefined) {
  if (!readable || readable.destroyed) return;
  readable.resume();
}

export function resolvePathValue(obj: Record<string, unknown>, dottedPath: string) {
  const parts = dottedPath.split(".");
  let cursor: unknown = obj;

@@ -288,52 +250,11 @@ type PaperclipWakeComment = {
  authorId: string | null;
};

type PaperclipWakeContinuationSummary = {
  key: string | null;
  title: string | null;
  body: string;
  bodyTruncated: boolean;
  updatedAt: string | null;
};

type PaperclipWakeLivenessContinuation = {
  attempt: number | null;
  maxAttempts: number | null;
  sourceRunId: string | null;
  state: string | null;
  reason: string | null;
  instruction: string | null;
};

type PaperclipWakeChildIssueSummary = {
  id: string | null;
  identifier: string | null;
  title: string | null;
  status: string | null;
  priority: string | null;
  summary: string | null;
};

type PaperclipWakeBlockerSummary = {
  id: string | null;
  identifier: string | null;
  title: string | null;
  status: string | null;
  priority: string | null;
};

type PaperclipWakePayload = {
  reason: string | null;
  issue: PaperclipWakeIssue | null;
  checkedOutByHarness: boolean;
  dependencyBlockedInteraction: boolean;
  unresolvedBlockerIssueIds: string[];
  unresolvedBlockerSummaries: PaperclipWakeBlockerSummary[];
  executionStage: PaperclipWakeExecutionStage | null;
  continuationSummary: PaperclipWakeContinuationSummary | null;
  livenessContinuation: PaperclipWakeLivenessContinuation | null;
  childIssueSummaries: PaperclipWakeChildIssueSummary[];
  childIssueSummaryTruncated: boolean;
  commentIds: string[];
  latestCommentId: string | null;
  comments: PaperclipWakeComment[];
@@ -377,61 +298,6 @@ function normalizePaperclipWakeComment(value: unknown): PaperclipWakeComment | n
  };
}

function normalizePaperclipWakeContinuationSummary(value: unknown): PaperclipWakeContinuationSummary | null {
  const summary = parseObject(value);
  const body = asString(summary.body, "").trim();
  if (!body) return null;
  return {
    key: asString(summary.key, "").trim() || null,
    title: asString(summary.title, "").trim() || null,
    body,
    bodyTruncated: asBoolean(summary.bodyTruncated, false),
    updatedAt: asString(summary.updatedAt, "").trim() || null,
  };
}

function normalizePaperclipWakeLivenessContinuation(value: unknown): PaperclipWakeLivenessContinuation | null {
  const continuation = parseObject(value);
  const attempt = asNumber(continuation.attempt, 0);
  const maxAttempts = asNumber(continuation.maxAttempts, 0);
  const sourceRunId = asString(continuation.sourceRunId, "").trim() || null;
  const state = asString(continuation.state, "").trim() || null;
  const reason = asString(continuation.reason, "").trim() || null;
  const instruction = asString(continuation.instruction, "").trim() || null;
  if (!attempt && !maxAttempts && !sourceRunId && !state && !reason && !instruction) return null;
  return {
    attempt: attempt > 0 ? attempt : null,
    maxAttempts: maxAttempts > 0 ? maxAttempts : null,
    sourceRunId,
    state,
    reason,
    instruction,
  };
}

function normalizePaperclipWakeChildIssueSummary(value: unknown): PaperclipWakeChildIssueSummary | null {
  const child = parseObject(value);
  const id = asString(child.id, "").trim() || null;
  const identifier = asString(child.identifier, "").trim() || null;
  const title = asString(child.title, "").trim() || null;
  const status = asString(child.status, "").trim() || null;
  const priority = asString(child.priority, "").trim() || null;
  const summary = asString(child.summary, "").trim() || null;
  if (!id && !identifier && !title && !status && !summary) return null;
  return { id, identifier, title, status, priority, summary };
}

function normalizePaperclipWakeBlockerSummary(value: unknown): PaperclipWakeBlockerSummary | null {
  const blocker = parseObject(value);
  const id = asString(blocker.id, "").trim() || null;
  const identifier = asString(blocker.identifier, "").trim() || null;
  const title = asString(blocker.title, "").trim() || null;
  const status = asString(blocker.status, "").trim() || null;
  const priority = asString(blocker.priority, "").trim() || null;
  if (!id && !identifier && !title && !status) return null;
  return { id, identifier, title, status, priority };
}

function normalizePaperclipWakeExecutionPrincipal(value: unknown): PaperclipWakeExecutionPrincipal | null {
  const principal = parseObject(value);
  const typeRaw = asString(principal.type, "").trim().toLowerCase();

@@ -490,25 +356,8 @@ export function normalizePaperclipWakePayload(value: unknown): PaperclipWakePayl
        .map((entry) => entry.trim())
    : [];
  const executionStage = normalizePaperclipWakeExecutionStage(payload.executionStage);
  const continuationSummary = normalizePaperclipWakeContinuationSummary(payload.continuationSummary);
  const livenessContinuation = normalizePaperclipWakeLivenessContinuation(payload.livenessContinuation);
  const childIssueSummaries = Array.isArray(payload.childIssueSummaries)
    ? payload.childIssueSummaries
        .map((entry) => normalizePaperclipWakeChildIssueSummary(entry))
        .filter((entry): entry is PaperclipWakeChildIssueSummary => Boolean(entry))
    : [];
  const unresolvedBlockerIssueIds = Array.isArray(payload.unresolvedBlockerIssueIds)
    ? payload.unresolvedBlockerIssueIds
        .map((entry) => asString(entry, "").trim())
        .filter(Boolean)
    : [];
  const unresolvedBlockerSummaries = Array.isArray(payload.unresolvedBlockerSummaries)
    ? payload.unresolvedBlockerSummaries
        .map((entry) => normalizePaperclipWakeBlockerSummary(entry))
        .filter((entry): entry is PaperclipWakeBlockerSummary => Boolean(entry))
    : [];

  if (comments.length === 0 && commentIds.length === 0 && childIssueSummaries.length === 0 && unresolvedBlockerIssueIds.length === 0 && unresolvedBlockerSummaries.length === 0 && !executionStage && !continuationSummary && !livenessContinuation && !normalizePaperclipWakeIssue(payload.issue)) {
  if (comments.length === 0 && commentIds.length === 0 && !executionStage && !normalizePaperclipWakeIssue(payload.issue)) {
    return null;
  }

@@ -516,14 +365,7 @@ export function normalizePaperclipWakePayload(value: unknown): PaperclipWakePayl
    reason: asString(payload.reason, "").trim() || null,
    issue: normalizePaperclipWakeIssue(payload.issue),
    checkedOutByHarness: asBoolean(payload.checkedOutByHarness, false),
    dependencyBlockedInteraction: asBoolean(payload.dependencyBlockedInteraction, false),
    unresolvedBlockerIssueIds,
    unresolvedBlockerSummaries,
    executionStage,
    continuationSummary,
    livenessContinuation,
    childIssueSummaries,
    childIssueSummaryTruncated: asBoolean(payload.childIssueSummaryTruncated, false),
    commentIds,
    latestCommentId: asString(payload.latestCommentId, "").trim() || null,
    comments,
@@ -564,8 +406,6 @@ export function renderPaperclipWakePrompt(
    "Focus on the new wake delta below and continue the current task without restating the full heartbeat boilerplate.",
    "Fetch the API thread only when `fallbackFetchNeeded` is true or you need broader history than this batch.",
    "",
    "Execution contract: take concrete action in this heartbeat when the issue is actionable; do not stop at a plan unless planning was requested. Leave durable progress with a clear next action, use child issues instead of polling for long or parallel work, and mark blocked work with the unblock owner/action.",
    "",
    `- reason: ${normalized.reason ?? "unknown"}`,
    `- issue: ${normalized.issue?.identifier ?? normalized.issue?.id ?? "unknown"}${normalized.issue?.title ? ` ${normalized.issue.title}` : ""}`,
    `- pending comments: ${normalized.includedCount}/${normalized.requestedCount}`,

@@ -581,8 +421,6 @@ export function renderPaperclipWakePrompt(
    "Use this inline wake data first before refetching the issue thread.",
    "Only fetch the API thread when `fallbackFetchNeeded` is true or you need broader history than this batch.",
    "",
    "Execution contract: take concrete action in this heartbeat when the issue is actionable; do not stop at a plan unless planning was requested. Leave durable progress with a clear next action, use child issues instead of polling for long or parallel work, and mark blocked work with the unblock owner/action.",
    "",
    `- reason: ${normalized.reason ?? "unknown"}`,
    `- issue: ${normalized.issue?.identifier ?? normalized.issue?.id ?? "unknown"}${normalized.issue?.title ? ` ${normalized.issue.title}` : ""}`,
    `- pending comments: ${normalized.includedCount}/${normalized.requestedCount}`,

@@ -599,18 +437,6 @@ export function renderPaperclipWakePrompt(
  if (normalized.checkedOutByHarness) {
    lines.push("- checkout: already claimed by the harness for this run");
  }
  if (normalized.dependencyBlockedInteraction) {
    lines.push("- dependency-blocked interaction: yes");
    lines.push("- execution scope: respond or triage the human comment; do not treat blocker-dependent deliverable work as unblocked");
    if (normalized.unresolvedBlockerSummaries.length > 0) {
      const blockers = normalized.unresolvedBlockerSummaries
        .map((blocker) => `${blocker.identifier ?? blocker.id ?? "unknown"}${blocker.title ? ` ${blocker.title}` : ""}${blocker.status ? ` (${blocker.status})` : ""}`)
        .join("; ");
      lines.push(`- unresolved blockers: ${blockers}`);
    } else if (normalized.unresolvedBlockerIssueIds.length > 0) {
      lines.push(`- unresolved blocker issue ids: ${normalized.unresolvedBlockerIssueIds.join(", ")}`);
    }
  }
  if (normalized.missingCount > 0) {
    lines.push(`- omitted comments: ${normalized.missingCount}`);
  }

@@ -644,55 +470,6 @@ export function renderPaperclipWakePrompt(
    }
  }

  if (normalized.continuationSummary) {
    lines.push(
      "",
      "Issue continuation summary:",
      normalized.continuationSummary.body,
    );
    if (normalized.continuationSummary.bodyTruncated) {
      lines.push("[continuation summary truncated]");
    }
  }

  if (normalized.livenessContinuation) {
    const continuation = normalized.livenessContinuation;
    lines.push("", "Run liveness continuation:");
    if (continuation.attempt) {
      lines.push(
        `- attempt: ${continuation.attempt}${continuation.maxAttempts ? `/${continuation.maxAttempts}` : ""}`,
      );
    }
    if (continuation.sourceRunId) {
      lines.push(`- source run: ${continuation.sourceRunId}`);
    }
    if (continuation.state) {
      lines.push(`- liveness state: ${continuation.state}`);
    }
    if (continuation.reason) {
      lines.push(`- reason: ${continuation.reason}`);
    }
    if (continuation.instruction) {
      lines.push(`- instruction: ${continuation.instruction}`);
    }
  }

  if (normalized.childIssueSummaries.length > 0) {
    lines.push("", "Direct child issue summaries:");
    for (const child of normalized.childIssueSummaries) {
      const label = child.identifier ?? child.id ?? "unknown";
      lines.push(
        `- ${label}${child.title ? ` ${child.title}` : ""}${child.status ? ` (${child.status})` : ""}`,
      );
      if (child.summary) {
        lines.push(` ${child.summary}`);
      }
    }
    if (normalized.childIssueSummaryTruncated) {
      lines.push("[child issue summaries truncated]");
    }
  }

  if (normalized.checkedOutByHarness) {
    lines.push(
      "",
@@ -1295,7 +1072,6 @@ export async function runChildProcess(
    onLog: (stream: "stdout" | "stderr", chunk: string) => Promise<void>;
    onLogError?: (err: unknown, runId: string, message: string) => void;
    onSpawn?: (meta: { pid: number; processGroupId: number | null; startedAt: string }) => Promise<void>;
    terminalResultCleanup?: TerminalResultCleanupOptions;
    stdin?: string;
  },
): Promise<RunProcessResult> {

@@ -1345,61 +1121,11 @@ export async function runChildProcess(
  let stdout = "";
  let stderr = "";
  let logChain: Promise<void> = Promise.resolve();
  let childExited = false;
  let terminalResultSeen = false;
  let terminalCleanupStarted = false;
  let terminalCleanupTimer: NodeJS.Timeout | null = null;
  let terminalCleanupKillTimer: NodeJS.Timeout | null = null;
  let terminalResultStdoutScanOffset = 0;
  let terminalResultStderrScanOffset = 0;

  const clearTerminalCleanupTimers = () => {
    if (terminalCleanupTimer) clearTimeout(terminalCleanupTimer);
    if (terminalCleanupKillTimer) clearTimeout(terminalCleanupKillTimer);
    terminalCleanupTimer = null;
    terminalCleanupKillTimer = null;
  };

  const maybeArmTerminalResultCleanup = () => {
    const terminalCleanup = opts.terminalResultCleanup;
    if (!terminalCleanup || terminalCleanupStarted || timedOut) return;
    if (!terminalResultSeen) {
      const stdoutStart = Math.max(0, terminalResultStdoutScanOffset - TERMINAL_RESULT_SCAN_OVERLAP_CHARS);
      const stderrStart = Math.max(0, terminalResultStderrScanOffset - TERMINAL_RESULT_SCAN_OVERLAP_CHARS);
      const scanOutput = {
        stdout: stdout.slice(stdoutStart),
        stderr: stderr.slice(stderrStart),
      };
      terminalResultStdoutScanOffset = stdout.length;
      terminalResultStderrScanOffset = stderr.length;
      if (scanOutput.stdout.length === 0 && scanOutput.stderr.length === 0) return;
      try {
        terminalResultSeen = terminalCleanup.hasTerminalResult(scanOutput);
      } catch (err) {
        onLogError(err, runId, "failed to inspect terminal adapter output");
      }
    }
    if (!terminalResultSeen || !childExited) return;

    if (terminalCleanupTimer) return;
    const graceMs = Math.max(0, terminalCleanup.graceMs ?? 5_000);
    terminalCleanupTimer = setTimeout(() => {
      terminalCleanupTimer = null;
      if (terminalCleanupStarted || timedOut) return;
      terminalCleanupStarted = true;
      signalRunningProcess({ child, processGroupId }, "SIGTERM");
      terminalCleanupKillTimer = setTimeout(() => {
        terminalCleanupKillTimer = null;
        signalRunningProcess({ child, processGroupId }, "SIGKILL");
      }, Math.max(1, opts.graceSec) * 1000);
    }, graceMs);
  };

  const timeout =
    opts.timeoutSec > 0
      ? setTimeout(() => {
          timedOut = true;
          clearTerminalCleanupTimers();
          signalRunningProcess({ child, processGroupId }, "SIGTERM");
          setTimeout(() => {
            signalRunningProcess({ child, processGroupId }, "SIGKILL");

@@ -1408,35 +1134,19 @@ export async function runChildProcess(
      : null;

  child.stdout?.on("data", (chunk: unknown) => {
    const readable = child.stdout;
    if (!readable) return;
    readable.pause();
    const text = String(chunk);
    stdout = appendWithCap(stdout, text);
    maybeArmTerminalResultCleanup();
    logChain = logChain
      .then(() => opts.onLog("stdout", text))
      .catch((err) => onLogError(err, runId, "failed to append stdout log chunk"))
      .finally(() => {
        maybeArmTerminalResultCleanup();
        resumeReadable(readable);
      });
      .catch((err) => onLogError(err, runId, "failed to append stdout log chunk"));
  });

  child.stderr?.on("data", (chunk: unknown) => {
    const readable = child.stderr;
    if (!readable) return;
    readable.pause();
    const text = String(chunk);
    stderr = appendWithCap(stderr, text);
    maybeArmTerminalResultCleanup();
    logChain = logChain
      .then(() => opts.onLog("stderr", text))
      .catch((err) => onLogError(err, runId, "failed to append stderr log chunk"))
      .finally(() => {
        maybeArmTerminalResultCleanup();
        resumeReadable(readable);
      });
      .catch((err) => onLogError(err, runId, "failed to append stderr log chunk"));
  });

  const stdin = child.stdin;

@@ -1450,7 +1160,6 @@ export async function runChildProcess(

  child.on("error", (err: Error) => {
    if (timeout) clearTimeout(timeout);
    clearTerminalCleanupTimers();
    runningProcesses.delete(runId);
    const errno = (err as NodeJS.ErrnoException).code;
    const pathValue = mergedEnv.PATH ?? mergedEnv.Path ?? "";

@@ -1461,14 +1170,8 @@ export async function runChildProcess(
    reject(new Error(msg));
  });

  child.on("exit", () => {
    childExited = true;
    maybeArmTerminalResultCleanup();
  });

  child.on("close", (code: number | null, signal: NodeJS.Signals | null) => {
    if (timeout) clearTimeout(timeout);
    clearTerminalCleanupTimers();
    runningProcesses.delete(runId);
    void logChain.finally(() => {
      resolve({
@@ -2,7 +2,6 @@ export const type = "claude_local";
export const label = "Claude Code (local)";

export const models = [
  { id: "claude-opus-4-7", label: "Claude Opus 4.7" },
  { id: "claude-opus-4-6", label: "Claude Opus 4.6" },
  { id: "claude-sonnet-4-6", label: "Claude Sonnet 4.6" },
  { id: "claude-haiku-4-6", label: "Claude Haiku 4.6" },

@@ -21,7 +21,6 @@ import {
  renderTemplate,
  renderPaperclipWakePrompt,
  stringifyPaperclipWakePayload,
  DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
  runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import {

@@ -301,7 +300,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

  const promptTemplate = asString(
    config.promptTemplate,
    DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
    "You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
  );
  const model = asString(config.model, "");
  const effort = asString(config.effort, "");

@@ -330,10 +329,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
    graceSec,
    extraArgs,
  } = runtimeConfig;
  const terminalResultCleanupGraceMs = Math.max(
    0,
    asNumber(config.terminalResultCleanupGraceMs, 5_000),
  );
  const effectiveEnv = Object.fromEntries(
    Object.entries({ ...process.env, ...env }).filter(
      (entry): entry is [string, string] => typeof entry[1] === "string",

@@ -507,10 +502,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
    graceSec,
    onSpawn,
    onLog,
    terminalResultCleanup: {
      graceMs: terminalResultCleanupGraceMs,
      hasTerminalResult: ({ stdout }) => parseClaudeStreamJson(stdout).resultJson !== null,
    },
  });

  const parsedStream = parseClaudeStreamJson(proc.stdout);
@@ -18,15 +18,10 @@ import {
  renderTemplate,
  renderPaperclipWakePrompt,
  stringifyPaperclipWakePayload,
  DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
  joinPromptSections,
  runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import {
  parseCodexJsonl,
  isCodexTransientUpstreamError,
  isCodexUnknownSessionError,
} from "./parse.js";
import { parseCodexJsonl, isCodexUnknownSessionError } from "./parse.js";
import { pathExists, prepareManagedCodexHome, resolveManagedCodexHomeDir, resolveSharedCodexHomeDir } from "./codex-home.js";
import { resolveCodexDesiredSkillNames } from "./skills.js";
import { buildCodexExecArgs } from "./codex-args.js";

@@ -153,52 +148,6 @@ type EnsureCodexSkillsInjectedOptions = {
  linkSkill?: (source: string, target: string) => Promise<void>;
};

type CodexTransientFallbackMode =
  | "same_session"
  | "safer_invocation"
  | "fresh_session"
  | "fresh_session_safer_invocation";

function readCodexTransientFallbackMode(context: Record<string, unknown>): CodexTransientFallbackMode | null {
  const value = asString(context.codexTransientFallbackMode, "").trim();
  switch (value) {
    case "same_session":
    case "safer_invocation":
    case "fresh_session":
    case "fresh_session_safer_invocation":
      return value;
    default:
      return null;
  }
}

function fallbackModeUsesSaferInvocation(mode: CodexTransientFallbackMode | null): boolean {
  return mode === "safer_invocation" || mode === "fresh_session_safer_invocation";
}

function fallbackModeUsesFreshSession(mode: CodexTransientFallbackMode | null): boolean {
  return mode === "fresh_session" || mode === "fresh_session_safer_invocation";
}

function buildCodexTransientHandoffNote(input: {
  previousSessionId: string | null;
  fallbackMode: CodexTransientFallbackMode;
  continuationSummaryBody: string | null;
}): string {
  return [
    "Paperclip session handoff:",
    input.previousSessionId ? `- Previous session: ${input.previousSessionId}` : "",
    "- Rotation reason: repeated Codex transient remote-compaction failures",
    `- Fallback mode: ${input.fallbackMode}`,
    input.continuationSummaryBody
      ? `- Issue continuation summary: ${input.continuationSummaryBody.slice(0, 1_500)}`
      : "",
    "Continue from the current task state. Rebuild only the minimum context you need.",
  ]
    .filter(Boolean)
    .join("\n");
}

export async function ensureCodexSkillsInjected(
  onLog: AdapterExecutionContext["onLog"],
  options: EnsureCodexSkillsInjectedOptions = {},

@@ -269,7 +218,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

  const promptTemplate = asString(
    config.promptTemplate,
    DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
    "You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
  );
  const command = asString(config.command, "codex");
  const model = asString(config.model, "");

@@ -447,10 +396,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
  const canResumeSession =
    runtimeSessionId.length > 0 &&
    (runtimeSessionCwd.length === 0 || path.resolve(runtimeSessionCwd) === path.resolve(cwd));
  const codexTransientFallbackMode = readCodexTransientFallbackMode(context);
  const forceSaferInvocation = fallbackModeUsesSaferInvocation(codexTransientFallbackMode);
  const forceFreshSession = fallbackModeUsesFreshSession(codexTransientFallbackMode);
  const sessionId = canResumeSession && !forceFreshSession ? runtimeSessionId : null;
  const sessionId = canResumeSession ? runtimeSessionId : null;
  if (runtimeSessionId && !canResumeSession) {
    await onLog(
      "stdout",
@@ -497,66 +443,28 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
  const shouldUseResumeDeltaPrompt = Boolean(sessionId) && wakePrompt.length > 0;
  const promptInstructionsPrefix = shouldUseResumeDeltaPrompt ? "" : instructionsPrefix;
  instructionsChars = promptInstructionsPrefix.length;
  const continuationSummary = parseObject(context.paperclipContinuationSummary);
  const continuationSummaryBody = asString(continuationSummary.body, "").trim() || null;
  const codexFallbackHandoffNote =
    forceFreshSession
      ? buildCodexTransientHandoffNote({
          previousSessionId: runtimeSessionId || runtime.sessionId || null,
          fallbackMode: codexTransientFallbackMode ?? "fresh_session",
          continuationSummaryBody,
        })
      : "";
  const commandNotes = (() => {
    if (!instructionsFilePath) {
      const notes = [repoAgentsNote];
      if (forceSaferInvocation) {
        notes.push("Codex transient fallback requested safer invocation settings for this retry.");
      }
      if (forceFreshSession) {
        notes.push("Codex transient fallback forced a fresh session with a continuation handoff.");
      }
      return notes;
      return [repoAgentsNote];
    }
    if (instructionsPrefix.length > 0) {
      if (shouldUseResumeDeltaPrompt) {
        const notes = [
        return [
          `Loaded agent instructions from ${instructionsFilePath}`,
          "Skipped stdin instruction reinjection because an existing Codex session is being resumed with a wake delta.",
          repoAgentsNote,
        ];
        if (forceSaferInvocation) {
          notes.push("Codex transient fallback requested safer invocation settings for this retry.");
        }
        if (forceFreshSession) {
          notes.push("Codex transient fallback forced a fresh session with a continuation handoff.");
        }
        return notes;
      }
      const notes = [
      return [
        `Loaded agent instructions from ${instructionsFilePath}`,
        `Prepended instructions + path directive to stdin prompt (relative references from ${instructionsDir}).`,
        repoAgentsNote,
      ];
      if (forceSaferInvocation) {
        notes.push("Codex transient fallback requested safer invocation settings for this retry.");
      }
      if (forceFreshSession) {
        notes.push("Codex transient fallback forced a fresh session with a continuation handoff.");
      }
      return notes;
    }
    const notes = [
    return [
      `Configured instructionsFilePath ${instructionsFilePath}, but file could not be read; continuing without injected instructions.`,
      repoAgentsNote,
    ];
    if (forceSaferInvocation) {
      notes.push("Codex transient fallback requested safer invocation settings for this retry.");
    }
    if (forceFreshSession) {
      notes.push("Codex transient fallback forced a fresh session with a continuation handoff.");
    }
    return notes;
  })();
  const renderedPrompt = shouldUseResumeDeltaPrompt ? "" : renderTemplate(promptTemplate, templateData);
  const sessionHandoffNote = asString(context.paperclipSessionHandoffMarkdown, "").trim();

@@ -564,7 +472,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
    promptInstructionsPrefix,
    renderedBootstrapPrompt,
    wakePrompt,
    codexFallbackHandoffNote,
    sessionHandoffNote,
    renderedPrompt,
  ]);

@@ -578,10 +485,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
  };

  const runAttempt = async (resumeSessionId: string | null) => {
    const execArgs = buildCodexExecArgs(
      forceSaferInvocation ? { ...config, fastMode: false } : config,
      { resumeSessionId },
    );
    const execArgs = buildCodexExecArgs(config, { resumeSessionId });
    const args = execArgs.args;
    const commandNotesWithFastMode =
      execArgs.fastModeIgnoredReason == null

@@ -635,7 +539,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
  const toResult = (
    attempt: { proc: { exitCode: number | null; signal: string | null; timedOut: boolean; stdout: string; stderr: string }; rawStderr: string; parsed: ReturnType<typeof parseCodexJsonl> },
    clearSessionOnMissingSession = false,
    isRetry = false,
  ): AdapterExecutionResult => {
    if (attempt.proc.timedOut) {
      return {

@@ -647,10 +550,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
      };
    }

    const canFallbackToRuntimeSession = !isRetry && !forceFreshSession;
    const resolvedSessionId =
      attempt.parsed.sessionId ??
      (canFallbackToRuntimeSession ? (runtimeSessionId ?? runtime.sessionId ?? null) : null);
    const resolvedSessionId = attempt.parsed.sessionId ?? runtimeSessionId ?? runtime.sessionId ?? null;
    const resolvedSessionParams = resolvedSessionId
      ? ({
          sessionId: resolvedSessionId,

@@ -675,15 +575,6 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
        (attempt.proc.exitCode ?? 0) === 0
          ? null
          : fallbackErrorMessage,
      errorCode:
        (attempt.proc.exitCode ?? 0) !== 0 &&
        isCodexTransientUpstreamError({
          stdout: attempt.proc.stdout,
          stderr: attempt.proc.stderr,
          errorMessage: fallbackErrorMessage,
        })
          ? "codex_transient_upstream"
          : null,
      usage: attempt.parsed.usage,
      sessionId: resolvedSessionId,
      sessionParams: resolvedSessionParams,

@@ -698,7 +589,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
      stderr: attempt.proc.stderr,
    },
    summary: attempt.parsed.summary,
    clearSession: Boolean((clearSessionOnMissingSession || forceFreshSession) && !resolvedSessionId),
    clearSession: Boolean(clearSessionOnMissingSession && !resolvedSessionId),
  };
};

@@ -714,8 +605,8 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
      `[paperclip] Codex resume session "${sessionId}" is unavailable; retrying with a fresh session.\n`,
    );
    const retry = await runAttempt(null);
    return toResult(retry, true, true);
    return toResult(retry, true);
  }

  return toResult(initial, false, false);
  return toResult(initial);
}

@@ -1,7 +1,7 @@
|
||||
export { execute, ensureCodexSkillsInjected } from "./execute.js";
|
||||
export { listCodexSkills, syncCodexSkills } from "./skills.js";
|
||||
export { testEnvironment } from "./test.js";
|
||||
export { parseCodexJsonl, isCodexTransientUpstreamError, isCodexUnknownSessionError } from "./parse.js";
|
||||
export { parseCodexJsonl, isCodexUnknownSessionError } from "./parse.js";
|
||||
export {
|
||||
getQuotaWindows,
|
||||
readCodexAuthInfo,
|
||||
|
||||
@@ -1,9 +1,5 @@
import { describe, expect, it } from "vitest";
import {
isCodexTransientUpstreamError,
isCodexUnknownSessionError,
parseCodexJsonl,
} from "./parse.js";
import { isCodexUnknownSessionError, parseCodexJsonl } from "./parse.js";

describe("parseCodexJsonl", () => {
it("captures session id, assistant summary, usage, and error message", () => {
@@ -85,36 +81,3 @@ describe("isCodexUnknownSessionError", () => {
expect(isCodexUnknownSessionError("", "model overloaded")).toBe(false);
});
});

describe("isCodexTransientUpstreamError", () => {
it("classifies the remote-compaction high-demand failure as transient upstream", () => {
expect(
isCodexTransientUpstreamError({
errorMessage:
"Error running remote compact task: We're currently experiencing high demand, which may cause temporary errors.",
}),
).toBe(true);
expect(
isCodexTransientUpstreamError({
stderr: "We're currently experiencing high demand, which may cause temporary errors.",
}),
).toBe(true);
});

it("does not classify deterministic compaction errors as transient", () => {
expect(
isCodexTransientUpstreamError({
errorMessage: [
"Error running remote compact task: {",
' "error": {',
' "message": "Unknown parameter: \'prompt_cache_retention\'.",',
' "type": "invalid_request_error",',
' "param": "prompt_cache_retention",',
' "code": "unknown_parameter"',
" }",
"}",
].join("\n"),
}),
).toBe(false);
});
});

@@ -1,9 +1,5 @@
import { asString, asNumber, parseObject, parseJson } from "@paperclipai/adapter-utils/server-utils";

const CODEX_TRANSIENT_UPSTREAM_RE =
/(?:we(?:'|’)re\s+currently\s+experiencing\s+high\s+demand|temporary\s+errors|rate[-\s]?limit(?:ed)?|too\s+many\s+requests|\b429\b|server\s+overloaded|service\s+unavailable|try\s+again\s+later)/i;
const CODEX_REMOTE_COMPACTION_RE = /remote\s+compact\s+task/i;

export function parseCodexJsonl(stdout: string) {
let sessionId: string | null = null;
let finalMessage: string | null = null;
@@ -75,25 +71,3 @@ export function isCodexUnknownSessionError(stdout: string, stderr: string): bool
haystack,
);
}

export function isCodexTransientUpstreamError(input: {
stdout?: string | null;
stderr?: string | null;
errorMessage?: string | null;
}): boolean {
const haystack = [
input.errorMessage ?? "",
input.stdout ?? "",
input.stderr ?? "",
]
.join("\n")
.split(/\r?\n/)
.map((line) => line.trim())
.filter(Boolean)
.join("\n");

if (!CODEX_TRANSIENT_UPSTREAM_RE.test(haystack)) return false;
// Keep automatic retries scoped to the observed remote-compaction/high-demand
// failure shape; broader 429s may be caused by user or account limits.
return CODEX_REMOTE_COMPACTION_RE.test(haystack) || /high\s+demand|temporary\s+errors/i.test(haystack);
}

@@ -21,7 +21,6 @@ import {
renderTemplate,
renderPaperclipWakePrompt,
stringifyPaperclipWakePayload,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
joinPromptSections,
runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
@@ -165,7 +164,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

const promptTemplate = asString(
config.promptTemplate,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
"You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
);
const command = asString(config.command, "agent");
const model = asString(config.model, DEFAULT_CURSOR_LOCAL_MODEL).trim();

@@ -24,7 +24,6 @@ import {
renderTemplate,
renderPaperclipWakePrompt,
stringifyPaperclipWakePayload,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import { DEFAULT_GEMINI_LOCAL_MODEL } from "../index.js";
@@ -141,7 +140,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

const promptTemplate = asString(
config.promptTemplate,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
"You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
);
const command = asString(config.command, "gemini");
const model = asString(config.model, DEFAULT_GEMINI_LOCAL_MODEL).trim();

@@ -420,9 +420,7 @@ function buildWakeText(
" - POST /api/issues/{issueId}/checkout with {\"agentId\":\"$PAPERCLIP_AGENT_ID\",\"expectedStatuses\":[\"todo\",\"backlog\",\"blocked\",\"in_review\"]}",
" - GET /api/issues/{issueId}",
" - GET /api/issues/{issueId}/comments",
" - Execute the issue instructions exactly. If the issue is actionable, take concrete action in this run; do not stop at a plan unless planning was requested.",
" - Leave durable progress with a clear next action. Use child issues for long or parallel delegated work instead of polling agents, sessions, or processes.",
" - If blocked, PATCH /api/issues/{issueId} with {\"status\":\"blocked\",\"comment\":\"what is blocked, who owns the unblock, and the next action\"}.",
" - Execute the issue instructions exactly.",
" - If instructions require a comment, POST /api/issues/{issueId}/comments with {\"body\":\"...\"}.",
" - PATCH /api/issues/{issueId} with {\"status\":\"done\",\"comment\":\"what changed and why\"}.",
"4) If issueId does not exist:",

@@ -19,7 +19,6 @@ import {
renderTemplate,
renderPaperclipWakePrompt,
stringifyPaperclipWakePayload,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
runChildProcess,
readPaperclipRuntimeSkillEntries,
resolvePaperclipDesiredSkillNames,
@@ -98,7 +97,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

const promptTemplate = asString(
config.promptTemplate,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
"You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
);
const command = asString(config.command, "opencode");
const model = asString(config.model, "").trim();

@@ -22,7 +22,6 @@ import {
renderTemplate,
renderPaperclipWakePrompt,
stringifyPaperclipWakePayload,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
runChildProcess,
} from "@paperclipai/adapter-utils/server-utils";
import { isPiUnknownSessionError, parsePiJsonl } from "./parse.js";
@@ -114,7 +113,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec

const promptTemplate = asString(
config.promptTemplate,
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE,
"You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.",
);
const command = asString(config.command, "pi");
const model = asString(config.model, "").trim();
@@ -277,7 +276,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
`${instructionsContents}\n\n` +
`The above agent instructions were loaded from ${resolvedInstructionsFilePath}. ` +
`Resolve any relative file references from ${instructionsFileDir}.\n\n` +
DEFAULT_PAPERCLIP_AGENT_PROMPT_TEMPLATE;
`You are agent {{agent.id}} ({{agent.name}}). Continue your Paperclip work.`;
} catch (err) {
instructionsReadFailed = true;
const reason = err instanceof Error ? err.message : String(err);

@@ -1,84 +0,0 @@
import { createHash, randomBytes } from "node:crypto";
import { readFileSync } from "node:fs";
import path from "node:path";
import { and, eq, gt, isNull } from "drizzle-orm";
import { createDb } from "../src/client.js";
import { invites } from "../src/schema/index.js";

function hashToken(token: string) {
return createHash("sha256").update(token).digest("hex");
}

function createInviteToken() {
return `pcp_bootstrap_${randomBytes(24).toString("hex")}`;
}

function readArg(flag: string) {
const index = process.argv.indexOf(flag);
if (index === -1) return null;
return process.argv[index + 1] ?? null;
}

async function main() {
const configPath = readArg("--config");
const baseUrl = readArg("--base-url");

if (!configPath || !baseUrl) {
throw new Error("Usage: tsx create-auth-bootstrap-invite.ts --config <path> --base-url <url>");
}

const config = JSON.parse(readFileSync(path.resolve(configPath), "utf8")) as {
database?: {
mode?: string;
embeddedPostgresPort?: number;
connectionString?: string;
};
};
const dbUrl =
config.database?.mode === "postgres"
? config.database.connectionString
: `postgres://paperclip:paperclip@127.0.0.1:${config.database?.embeddedPostgresPort ?? 54329}/paperclip`;
if (!dbUrl) {
throw new Error(`Could not resolve database connection from ${configPath}`);
}

const db = createDb(dbUrl);
const closableDb = db as typeof db & {
$client?: {
end?: (options?: { timeout?: number }) => Promise<void>;
};
};

try {
const now = new Date();
await db
.update(invites)
.set({ revokedAt: now, updatedAt: now })
.where(
and(
eq(invites.inviteType, "bootstrap_ceo"),
isNull(invites.revokedAt),
isNull(invites.acceptedAt),
gt(invites.expiresAt, now)
)
);

const token = createInviteToken();
await db.insert(invites).values({
inviteType: "bootstrap_ceo",
tokenHash: hashToken(token),
allowedJoinTypes: "human",
expiresAt: new Date(Date.now() + 72 * 60 * 60 * 1000),
invitedByUserId: "system",
});

process.stdout.write(`${baseUrl.replace(/\/+$/, "")}/invite/${token}\n`);
} finally {
await closableDb.$client?.end?.({ timeout: 5 }).catch(() => undefined);
}
}

main().catch((error) => {
process.stderr.write(`${error instanceof Error ? error.stack ?? error.message : String(error)}\n`);
process.exit(1);
});
@@ -127,7 +127,6 @@ describeEmbeddedPostgres("runDatabaseBackup", () => {
backupDir,
retention: { dailyDays: 7, weeklyWeeks: 4, monthlyMonths: 1 },
filenamePrefix: "paperclip-test",
backupEngine: "javascript",
});

expect(result.backupFile).toMatch(/paperclip-test-.*\.sql\.gz$/);
@@ -149,17 +148,14 @@ describeEmbeddedPostgres("runDatabaseBackup", () => {
title: string;
payload: string;
state: string;
metadata: { index: number; even: boolean } | string;
metadata: { index: number; even: boolean };
}[]>(`
SELECT "title", "payload", "state"::text AS "state", "metadata"
FROM "public"."backup_test_records"
WHERE "title" IN ('row-0', 'row-159')
ORDER BY "title"
`);
expect(sampleRows.map((row) => ({
...row,
metadata: typeof row.metadata === "string" ? JSON.parse(row.metadata) : row.metadata,
}))).toEqual([
expect(sampleRows).toEqual([
{
title: "row-0",
payload,

@@ -1,8 +1,6 @@
import { createReadStream, createWriteStream, existsSync, mkdirSync, readdirSync, statSync, unlinkSync } from "node:fs";
import { basename, resolve } from "node:path";
import { createInterface } from "node:readline";
import { spawn } from "node:child_process";
import { open as openFile } from "node:fs/promises";
import { pipeline } from "node:stream/promises";
import { createGunzip, createGzip } from "node:zlib";
import postgres from "postgres";
@@ -22,7 +20,6 @@ export type RunDatabaseBackupOptions = {
includeMigrationJournal?: boolean;
excludeTables?: string[];
nullifyColumns?: Record<string, string[]>;
backupEngine?: "auto" | "pg_dump" | "javascript";
};

export type RunDatabaseBackupResult = {
@@ -64,9 +61,6 @@ type ExtensionDefinition = {
const DRIZZLE_SCHEMA = "drizzle";
const DRIZZLE_MIGRATIONS_TABLE = "__drizzle_migrations";
const DEFAULT_BACKUP_WRITE_BUFFER_BYTES = 1024 * 1024;
const BACKUP_DATA_CURSOR_ROWS = 100;
const BACKUP_CLI_STDERR_BYTES = 64 * 1024;
const BACKUP_BREAKPOINT_DETECT_BYTES = 64 * 1024;

const STATEMENT_BREAKPOINT = "-- paperclip statement breakpoint 69f6f3f1-42fd-46a6-bf17-d1d85f8f3900";

@@ -229,134 +223,6 @@ function tableKey(schemaName: string, tableName: string): string {
return `${schemaName}.${tableName}`;
}

function hasBackupTransforms(opts: RunDatabaseBackupOptions): boolean {
return opts.includeMigrationJournal === true ||
(opts.excludeTables?.length ?? 0) > 0 ||
Object.keys(opts.nullifyColumns ?? {}).length > 0;
}

function formatSqlValue(rawValue: unknown, columnName: string | undefined, nullifiedColumns: Set<string>): string {
const val = columnName && nullifiedColumns.has(columnName) ? null : rawValue;
if (val === null || val === undefined) return "NULL";
if (typeof val === "boolean") return val ? "true" : "false";
if (typeof val === "number") return String(val);
if (val instanceof Date) return formatSqlLiteral(val.toISOString());
if (typeof val === "object") return formatSqlLiteral(JSON.stringify(val));
return formatSqlLiteral(String(val));
}

function appendCapturedStderr(previous: string, chunk: Buffer | string): string {
const next = previous + (Buffer.isBuffer(chunk) ? chunk.toString("utf8") : chunk);
if (Buffer.byteLength(next, "utf8") <= BACKUP_CLI_STDERR_BYTES) return next;
return Buffer.from(next, "utf8").subarray(-BACKUP_CLI_STDERR_BYTES).toString("utf8");
}

async function waitForChildExit(child: ReturnType<typeof spawn>, label: string): Promise<void> {
let stderr = "";
child.stderr?.on("data", (chunk) => {
stderr = appendCapturedStderr(stderr, chunk);
});

const result = await new Promise<{ code: number | null; signal: NodeJS.Signals | null }>((resolve, reject) => {
child.once("error", reject);
child.once("exit", (code, signal) => resolve({ code, signal }));
});

if (result.signal) {
throw new Error(`${label} exited via ${result.signal}${stderr.trim() ? `: ${stderr.trim()}` : ""}`);
}
if (result.code !== 0) {
throw new Error(`${label} failed with exit code ${result.code ?? "unknown"}${stderr.trim() ? `: ${stderr.trim()}` : ""}`);
}
}

async function runPgDumpBackup(opts: {
connectionString: string;
backupFile: string;
connectTimeout: number;
}): Promise<void> {
const pgDumpBin = process.env.PAPERCLIP_PG_DUMP_PATH || "pg_dump";
const child = spawn(
pgDumpBin,
[
`--dbname=${opts.connectionString}`,
"--format=plain",
"--clean",
"--if-exists",
"--no-owner",
"--no-privileges",
"--schema=public",
],
{
stdio: ["ignore", "pipe", "pipe"],
env: {
...process.env,
PGCONNECT_TIMEOUT: String(opts.connectTimeout),
},
},
);

if (!child.stdout) {
throw new Error("pg_dump did not expose stdout");
}

await Promise.all([
pipeline(child.stdout, createGzip(), createWriteStream(opts.backupFile)),
waitForChildExit(child, pgDumpBin),
]);
}

async function restoreWithPsql(opts: RunDatabaseRestoreOptions, connectTimeout: number): Promise<void> {
const psqlBin = process.env.PAPERCLIP_PSQL_PATH || "psql";
const child = spawn(
psqlBin,
[
`--dbname=${opts.connectionString}`,
"--set=ON_ERROR_STOP=1",
"--quiet",
"--no-psqlrc",
],
{
stdio: ["pipe", "ignore", "pipe"],
env: {
...process.env,
PGCONNECT_TIMEOUT: String(connectTimeout),
},
},
);

if (!child.stdin) {
throw new Error("psql did not expose stdin");
}

const input = opts.backupFile.endsWith(".gz")
? createReadStream(opts.backupFile).pipe(createGunzip())
: createReadStream(opts.backupFile);

await Promise.all([
pipeline(input, child.stdin),
waitForChildExit(child, psqlBin),
]);
}

async function hasStatementBreakpoints(backupFile: string): Promise<boolean> {
const raw = createReadStream(backupFile);
const stream = backupFile.endsWith(".gz") ? raw.pipe(createGunzip()) : raw;
let text = "";

try {
for await (const chunk of stream) {
text += Buffer.isBuffer(chunk) ? chunk.toString("utf8") : String(chunk);
if (text.includes(STATEMENT_BREAKPOINT)) return true;
if (Buffer.byteLength(text, "utf8") >= BACKUP_BREAKPOINT_DETECT_BYTES) return false;
}
return text.includes(STATEMENT_BREAKPOINT);
} finally {
stream.destroy();
raw.destroy();
}
}

async function* readRestoreStatements(backupFile: string): AsyncGenerator<string> {
const raw = createReadStream(backupFile);
const stream = backupFile.endsWith(".gz") ? raw.pipe(createGunzip()) : raw;
@@ -397,21 +263,41 @@ async function* readRestoreStatements(backupFile: string): AsyncGenerator<string
}

export function createBufferedTextFileWriter(filePath: string, maxBufferedBytes = DEFAULT_BACKUP_WRITE_BUFFER_BYTES) {
const filePromise = openFile(filePath, "w");
const stream = createWriteStream(filePath, { encoding: "utf8" });
const flushThreshold = Math.max(1, Math.trunc(maxBufferedBytes));
let bufferedLines: string[] = [];
let bufferedBytes = 0;
let firstChunk = true;
let closed = false;
let streamError: Error | null = null;
let pendingWrite = Promise.resolve();

const writeChunk = async (chunk: string | Buffer): Promise<void> => {
const file = await filePromise;
if (typeof chunk === "string") {
await file.write(chunk, null, "utf8");
} else {
await file.write(chunk);
stream.on("error", (error) => {
streamError = error;
});

const writeChunk = async (chunk: string): Promise<void> => {
if (streamError) throw streamError;
const canContinue = stream.write(chunk);
if (!canContinue) {
await new Promise<void>((resolve, reject) => {
const handleDrain = () => {
cleanup();
resolve();
};
const handleError = (error: Error) => {
cleanup();
reject(error);
};
const cleanup = () => {
stream.off("drain", handleDrain);
stream.off("error", handleError);
};
stream.once("drain", handleDrain);
stream.once("error", handleError);
});
}
if (streamError) throw streamError;
};

const flushBufferedLines = () => {
@@ -430,43 +316,37 @@ export function createBufferedTextFileWriter(filePath: string, maxBufferedBytes
if (closed) {
throw new Error(`Cannot write to closed backup file: ${filePath}`);
}
if (streamError) throw streamError;
bufferedLines.push(line);
bufferedBytes += Buffer.byteLength(line, "utf8") + 1;
if (bufferedBytes >= flushThreshold) {
flushBufferedLines();
}
},
async drain() {
if (closed) {
throw new Error(`Cannot drain closed backup file: ${filePath}`);
}
flushBufferedLines();
await pendingWrite;
},
async writeRaw(chunk: string | Buffer) {
if (closed) {
throw new Error(`Cannot write to closed backup file: ${filePath}`);
}
flushBufferedLines();
firstChunk = false;
pendingWrite = pendingWrite.then(() => writeChunk(chunk));
await pendingWrite;
},
async close() {
if (closed) return;
closed = true;
flushBufferedLines();
await pendingWrite;
const file = await filePromise;
await file.close();
await new Promise<void>((resolve, reject) => {
if (streamError) {
reject(streamError);
return;
}
stream.end((error?: Error | null) => {
if (error) reject(error);
else resolve();
});
});
if (streamError) throw streamError;
},
async abort() {
if (closed) return;
closed = true;
bufferedLines = [];
bufferedBytes = 0;
stream.destroy();
await pendingWrite.catch(() => {});
await filePromise.then((file) => file.close()).catch(() => {});
if (existsSync(filePath)) {
try {
unlinkSync(filePath);
@@ -482,53 +362,16 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
const filenamePrefix = opts.filenamePrefix ?? "paperclip";
const retention = opts.retention;
const connectTimeout = Math.max(1, Math.trunc(opts.connectTimeoutSeconds ?? 5));
const backupEngine = opts.backupEngine ?? "auto";
const canUsePgDump = !hasBackupTransforms(opts);
const includeMigrationJournal = opts.includeMigrationJournal === true;
const excludedTableNames = normalizeTableNameSet(opts.excludeTables);
const nullifiedColumnsByTable = normalizeNullifyColumnMap(opts.nullifyColumns);
let sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
let sqlClosed = false;
const closeSql = async () => {
if (sqlClosed) return;
sqlClosed = true;
await sql.end();
};
const sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
mkdirSync(opts.backupDir, { recursive: true });
const sqlFile = resolve(opts.backupDir, `${filenamePrefix}-${timestamp()}.sql`);
const backupFile = `${sqlFile}.gz`;
const writer = createBufferedTextFileWriter(sqlFile);

try {
if (backupEngine === "pg_dump" || (backupEngine === "auto" && canUsePgDump)) {
await sql`SELECT 1`;
try {
await closeSql();
await runPgDumpBackup({
connectionString: opts.connectionString,
backupFile,
connectTimeout,
});
await writer.abort();
const sizeBytes = statSync(backupFile).size;
const prunedCount = pruneOldBackups(opts.backupDir, retention, filenamePrefix);
return {
backupFile,
sizeBytes,
prunedCount,
};
} catch (error) {
if (existsSync(backupFile)) {
try { unlinkSync(backupFile); } catch { /* ignore */ }
}
if (backupEngine === "pg_dump") {
throw error;
}
sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
sqlClosed = false;
}
}

await sql`SELECT 1`;

const emit = (line: string) => writer.emit(line);
@@ -860,39 +703,20 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise

emit(`-- Data for: ${schema_name}.${tablename} (${count[0]!.n} rows)`);

const rows = await sql.unsafe(`SELECT * FROM ${qualifiedTableName}`).values();
const nullifiedColumns = nullifiedColumnsByTable.get(tablename) ?? new Set<string>();
if (backupEngine !== "javascript" && nullifiedColumns.size === 0) {
emit(`COPY ${qualifiedTableName} (${colNames}) FROM stdin;`);
await writer.writeRaw("\n");
const copySql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
try {
const copyStream = await copySql
.unsafe(`COPY ${qualifiedTableName} (${colNames}) TO STDOUT`)
.readable();
for await (const chunk of copyStream) {
await writer.writeRaw(Buffer.isBuffer(chunk) ? chunk : Buffer.from(String(chunk)));
}
} finally {
await copySql.end();
}
await writer.writeRaw("\\.\n");
emitStatementBoundary();
emit("");
continue;
}

const rowCursor = sql
.unsafe(`SELECT * FROM ${qualifiedTableName}`)
.values()
.cursor(BACKUP_DATA_CURSOR_ROWS) as AsyncIterable<unknown[][]>;
for await (const rows of rowCursor) {
for (const row of rows) {
const values = row.map((rawValue, index) =>
formatSqlValue(rawValue, cols[index]?.column_name, nullifiedColumns),
);
emitStatement(`INSERT INTO ${qualifiedTableName} (${colNames}) VALUES (${values.join(", ")});`);
}
await writer.drain();
for (const row of rows) {
const values = row.map((rawValue: unknown, index) => {
const columnName = cols[index]?.column_name;
const val = columnName && nullifiedColumns.has(columnName) ? null : rawValue;
if (val === null || val === undefined) return "NULL";
if (typeof val === "boolean") return val ? "true" : "false";
if (typeof val === "number") return String(val);
if (val instanceof Date) return formatSqlLiteral(val.toISOString());
if (typeof val === "object") return formatSqlLiteral(JSON.stringify(val));
return formatSqlLiteral(String(val));
});
emitStatement(`INSERT INTO ${qualifiedTableName} (${colNames}) VALUES (${values.join(", ")});`);
}
emit("");
}
@@ -944,23 +768,12 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
}
throw error;
} finally {
await closeSql();
await sql.end();
}
}

export async function runDatabaseRestore(opts: RunDatabaseRestoreOptions): Promise<void> {
const connectTimeout = Math.max(1, Math.trunc(opts.connectTimeoutSeconds ?? 5));
try {
await restoreWithPsql(opts, connectTimeout);
return;
} catch (error) {
if (!(await hasStatementBreakpoints(opts.backupFile))) {
throw new Error(
`Failed to restore ${basename(opts.backupFile)} with psql: ${sanitizeRestoreErrorMessage(error)}`,
);
}
}

const sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });

try {

@@ -467,78 +467,4 @@ describeEmbeddedPostgres("applyPendingMigrations", () => {
},
20_000,
);

it(
"replays migration 0059 safely when plugin_database_namespaces already exists",
async () => {
const connectionString = await createTempDatabase();

await applyPendingMigrations(connectionString);

const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const pluginNamespacesHash = await migrationHash(
"0059_plugin_database_namespaces.sql",
);

await sql.unsafe(
`DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${pluginNamespacesHash}'`,
);

const tables = await sql.unsafe<{ table_name: string }[]>(
`
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name IN ('plugin_database_namespaces', 'plugin_migrations')
ORDER BY table_name
`,
);
expect(tables.map((row) => row.table_name)).toEqual([
"plugin_database_namespaces",
"plugin_migrations",
]);
} finally {
await sql.end();
}

const pendingState = await inspectMigrations(connectionString);
expect(pendingState).toMatchObject({
status: "needsMigrations",
pendingMigrations: ["0059_plugin_database_namespaces.sql"],
reason: "pending-migrations",
});

await applyPendingMigrations(connectionString);

const finalState = await inspectMigrations(connectionString);
expect(finalState.status).toBe("upToDate");

const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
try {
const indexes = await verifySql.unsafe<{ indexname: string }[]>(
`
SELECT indexname
FROM pg_indexes
WHERE schemaname = 'public'
AND tablename IN ('plugin_database_namespaces', 'plugin_migrations')
ORDER BY indexname
`,
);
expect(indexes.map((row) => row.indexname)).toEqual(
expect.arrayContaining([
"plugin_database_namespaces_namespace_idx",
"plugin_database_namespaces_plugin_idx",
"plugin_database_namespaces_status_idx",
"plugin_migrations_plugin_idx",
"plugin_migrations_plugin_key_idx",
"plugin_migrations_status_idx",
]),
);
} finally {
await verifySql.end();
}
},
20_000,
);
});

@@ -31,5 +31,4 @@ export {
formatEmbeddedPostgresError,
} from "./embedded-postgres-error.js";
export { issueRelations } from "./schema/issue_relations.js";
export { issueReferenceMentions } from "./schema/issue_reference_mentions.js";
export * from "./schema/index.js";

@@ -1,57 +0,0 @@
WITH ranked_user_requests AS (
SELECT
id,
row_number() OVER (
PARTITION BY company_id, requesting_user_id
ORDER BY created_at ASC, id ASC
) AS rank
FROM join_requests
WHERE request_type = 'human'
AND status = 'pending_approval'
AND requesting_user_id IS NOT NULL
)
UPDATE join_requests
SET
status = 'rejected',
rejected_at = COALESCE(rejected_at, now()),
updated_at = now()
WHERE id IN (
SELECT id
FROM ranked_user_requests
WHERE rank > 1
);
--> statement-breakpoint
WITH ranked_email_requests AS (
SELECT
id,
row_number() OVER (
PARTITION BY company_id, lower(request_email_snapshot)
ORDER BY created_at ASC, id ASC
) AS rank
FROM join_requests
WHERE request_type = 'human'
AND status = 'pending_approval'
AND request_email_snapshot IS NOT NULL
)
UPDATE join_requests
SET
status = 'rejected',
rejected_at = COALESCE(rejected_at, now()),
updated_at = now()
WHERE id IN (
SELECT id
FROM ranked_email_requests
WHERE rank > 1
);
--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "join_requests_pending_human_user_uq"
ON "join_requests" USING btree ("company_id", "requesting_user_id")
WHERE "request_type" = 'human'
AND "status" = 'pending_approval'
AND "requesting_user_id" IS NOT NULL;
--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "join_requests_pending_human_email_uq"
ON "join_requests" USING btree ("company_id", lower("request_email_snapshot"))
WHERE "request_type" = 'human'
AND "status" = 'pending_approval'
AND "request_email_snapshot" IS NOT NULL;
@@ -1,6 +0,0 @@
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "liveness_state" text;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "liveness_reason" text;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "continuation_attempt" integer DEFAULT 0 NOT NULL;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "last_useful_action_at" timestamp with time zone;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "next_action" text;--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "heartbeat_runs_company_liveness_idx" ON "heartbeat_runs" USING btree ("company_id","liveness_state","created_at");
@@ -1,41 +0,0 @@
CREATE TABLE IF NOT EXISTS "plugin_database_namespaces" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "plugin_id" uuid NOT NULL,
  "plugin_key" text NOT NULL,
  "namespace_name" text NOT NULL,
  "namespace_mode" text DEFAULT 'schema' NOT NULL,
  "status" text DEFAULT 'active' NOT NULL,
  "created_at" timestamp with time zone DEFAULT now() NOT NULL,
  "updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE IF NOT EXISTS "plugin_migrations" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "plugin_id" uuid NOT NULL,
  "plugin_key" text NOT NULL,
  "namespace_name" text NOT NULL,
  "migration_key" text NOT NULL,
  "checksum" text NOT NULL,
  "plugin_version" text NOT NULL,
  "status" text NOT NULL,
  "started_at" timestamp with time zone DEFAULT now() NOT NULL,
  "applied_at" timestamp with time zone,
  "error_message" text
);
--> statement-breakpoint
DO $$ BEGIN
  IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'plugin_database_namespaces_plugin_id_plugins_id_fk') THEN
    ALTER TABLE "plugin_database_namespaces" ADD CONSTRAINT "plugin_database_namespaces_plugin_id_plugins_id_fk" FOREIGN KEY ("plugin_id") REFERENCES "public"."plugins"("id") ON DELETE cascade ON UPDATE no action;
  END IF;
END $$;--> statement-breakpoint
DO $$ BEGIN
  IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'plugin_migrations_plugin_id_plugins_id_fk') THEN
    ALTER TABLE "plugin_migrations" ADD CONSTRAINT "plugin_migrations_plugin_id_plugins_id_fk" FOREIGN KEY ("plugin_id") REFERENCES "public"."plugins"("id") ON DELETE cascade ON UPDATE no action;
  END IF;
END $$;--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "plugin_database_namespaces_plugin_idx" ON "plugin_database_namespaces" USING btree ("plugin_id");--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "plugin_database_namespaces_namespace_idx" ON "plugin_database_namespaces" USING btree ("namespace_name");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "plugin_database_namespaces_status_idx" ON "plugin_database_namespaces" USING btree ("status");--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "plugin_migrations_plugin_key_idx" ON "plugin_migrations" USING btree ("plugin_id","migration_key");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "plugin_migrations_plugin_idx" ON "plugin_migrations" USING btree ("plugin_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "plugin_migrations_status_idx" ON "plugin_migrations" USING btree ("status");
@@ -1,50 +0,0 @@
CREATE TABLE IF NOT EXISTS "issue_reference_mentions" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "company_id" uuid NOT NULL,
  "source_issue_id" uuid NOT NULL,
  "target_issue_id" uuid NOT NULL,
  "source_kind" text NOT NULL,
  "source_record_id" uuid,
  "document_key" text,
  "matched_text" text,
  "created_at" timestamp with time zone DEFAULT now() NOT NULL,
  "updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
DO $$ BEGIN
  IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'issue_reference_mentions_company_id_companies_id_fk') THEN
    ALTER TABLE "issue_reference_mentions" ADD CONSTRAINT "issue_reference_mentions_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
  END IF;
END $$;
--> statement-breakpoint
DO $$ BEGIN
  IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'issue_reference_mentions_source_issue_id_issues_id_fk') THEN
    ALTER TABLE "issue_reference_mentions" ADD CONSTRAINT "issue_reference_mentions_source_issue_id_issues_id_fk" FOREIGN KEY ("source_issue_id") REFERENCES "public"."issues"("id") ON DELETE cascade ON UPDATE no action;
  END IF;
END $$;
--> statement-breakpoint
DO $$ BEGIN
  IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'issue_reference_mentions_target_issue_id_issues_id_fk') THEN
    ALTER TABLE "issue_reference_mentions" ADD CONSTRAINT "issue_reference_mentions_target_issue_id_issues_id_fk" FOREIGN KEY ("target_issue_id") REFERENCES "public"."issues"("id") ON DELETE cascade ON UPDATE no action;
  END IF;
END $$;--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "issue_reference_mentions_company_source_issue_idx" ON "issue_reference_mentions" USING btree ("company_id","source_issue_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "issue_reference_mentions_company_target_issue_idx" ON "issue_reference_mentions" USING btree ("company_id","target_issue_id");--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "issue_reference_mentions_company_issue_pair_idx" ON "issue_reference_mentions" USING btree ("company_id","source_issue_id","target_issue_id");--> statement-breakpoint
DELETE FROM "issue_reference_mentions"
WHERE "id" IN (
  SELECT "id"
  FROM (
    SELECT
      "id",
      row_number() OVER (
        PARTITION BY "company_id", "source_issue_id", "target_issue_id", "source_kind", "source_record_id"
        ORDER BY "created_at", "id"
      ) AS "row_number"
    FROM "issue_reference_mentions"
  ) AS "duplicates"
  WHERE "duplicates"."row_number" > 1
);--> statement-breakpoint
DROP INDEX IF EXISTS "issue_reference_mentions_company_source_mention_uq";--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "issue_reference_mentions_company_source_mention_record_uq" ON "issue_reference_mentions" USING btree ("company_id","source_issue_id","target_issue_id","source_kind","source_record_id") WHERE "source_record_id" IS NOT NULL;--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "issue_reference_mentions_company_source_mention_null_record_uq" ON "issue_reference_mentions" USING btree ("company_id","source_issue_id","target_issue_id","source_kind") WHERE "source_record_id" IS NULL;
@@ -1,3 +0,0 @@
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "scheduled_retry_at" timestamp with time zone;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "scheduled_retry_attempt" integer DEFAULT 0 NOT NULL;--> statement-breakpoint
ALTER TABLE "heartbeat_runs" ADD COLUMN IF NOT EXISTS "scheduled_retry_reason" text;
@@ -1,9 +0,0 @@
ALTER TABLE "routine_runs" ADD COLUMN IF NOT EXISTS "dispatch_fingerprint" text;--> statement-breakpoint
ALTER TABLE "issues" ADD COLUMN IF NOT EXISTS "origin_fingerprint" text DEFAULT 'default' NOT NULL;--> statement-breakpoint
DROP INDEX IF EXISTS "issues_open_routine_execution_uq";--> statement-breakpoint
CREATE UNIQUE INDEX IF NOT EXISTS "issues_open_routine_execution_uq" ON "issues" USING btree ("company_id","origin_kind","origin_id","origin_fingerprint") WHERE "issues"."origin_kind" = 'routine_execution'
  and "issues"."origin_id" is not null
  and "issues"."hidden_at" is null
  and "issues"."execution_run_id" is not null
  and "issues"."status" in ('backlog', 'todo', 'in_progress', 'in_review', 'blocked');--> statement-breakpoint
CREATE INDEX IF NOT EXISTS "routine_runs_dispatch_fingerprint_idx" ON "routine_runs" USING btree ("routine_id","dispatch_fingerprint");
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -400,48 +400,6 @@
      "when": 1776084034244,
      "tag": "0056_spooky_ultragirl",
      "breakpoints": true
    },
    {
      "idx": 57,
      "version": "7",
      "when": 1776309613598,
      "tag": "0057_tidy_join_requests",
      "breakpoints": true
    },
    {
      "idx": 58,
      "version": "7",
      "when": 1776542245004,
      "tag": "0058_wealthy_starbolt",
      "breakpoints": true
    },
    {
      "idx": 59,
      "version": "7",
      "when": 1776542246000,
      "tag": "0059_plugin_database_namespaces",
      "breakpoints": true
    },
    {
      "idx": 60,
      "version": "7",
      "when": 1776717606743,
      "tag": "0060_orange_annihilus",
      "breakpoints": true
    },
    {
      "idx": 61,
      "version": "7",
      "when": 1776785165389,
      "tag": "0061_lively_thor_girl",
      "breakpoints": true
    },
    {
      "idx": 62,
      "version": "7",
      "when": 1776780000000,
      "tag": "0062_routine_run_dispatch_fingerprint",
      "breakpoints": true
    }
  ]
}
}
@@ -38,17 +38,9 @@ export const heartbeatRuns = pgTable(
      onDelete: "set null",
    }),
    processLossRetryCount: integer("process_loss_retry_count").notNull().default(0),
    scheduledRetryAt: timestamp("scheduled_retry_at", { withTimezone: true }),
    scheduledRetryAttempt: integer("scheduled_retry_attempt").notNull().default(0),
    scheduledRetryReason: text("scheduled_retry_reason"),
    issueCommentStatus: text("issue_comment_status").notNull().default("not_applicable"),
    issueCommentSatisfiedByCommentId: uuid("issue_comment_satisfied_by_comment_id"),
    issueCommentRetryQueuedAt: timestamp("issue_comment_retry_queued_at", { withTimezone: true }),
    livenessState: text("liveness_state"),
    livenessReason: text("liveness_reason"),
    continuationAttempt: integer("continuation_attempt").notNull().default(0),
    lastUsefulActionAt: timestamp("last_useful_action_at", { withTimezone: true }),
    nextAction: text("next_action"),
    contextSnapshot: jsonb("context_snapshot").$type<Record<string, unknown>>(),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
@@ -59,10 +51,5 @@ export const heartbeatRuns = pgTable(
      table.agentId,
      table.startedAt,
    ),
    companyLivenessIdx: index("heartbeat_runs_company_liveness_idx").on(
      table.companyId,
      table.livenessState,
      table.createdAt,
    ),
  }),
);

@@ -27,7 +27,6 @@ export { workspaceRuntimeServices } from "./workspace_runtime_services.js";
export { projectGoals } from "./project_goals.js";
export { goals } from "./goals.js";
export { issues } from "./issues.js";
export { issueReferenceMentions } from "./issue_reference_mentions.js";
export { issueRelations } from "./issue_relations.js";
export { routines, routineTriggers, routineRuns } from "./routines.js";
export { issueWorkProducts } from "./issue_work_products.js";
@@ -61,7 +60,6 @@ export { pluginConfig } from "./plugin_config.js";
export { pluginCompanySettings } from "./plugin_company_settings.js";
export { pluginState } from "./plugin_state.js";
export { pluginEntities } from "./plugin_entities.js";
export { pluginDatabaseNamespaces, pluginMigrations } from "./plugin_database.js";
export { pluginJobs, pluginJobRuns } from "./plugin_jobs.js";
export { pluginWebhookDeliveries } from "./plugin_webhooks.js";
export { pluginLogs } from "./plugin_logs.js";

@@ -1,48 +0,0 @@
import { sql } from "drizzle-orm";
import { index, pgTable, text, timestamp, uniqueIndex, uuid } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { issues } from "./issues.js";

export const issueReferenceMentions = pgTable(
  "issue_reference_mentions",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    companyId: uuid("company_id").notNull().references(() => companies.id),
    sourceIssueId: uuid("source_issue_id").notNull().references(() => issues.id, { onDelete: "cascade" }),
    targetIssueId: uuid("target_issue_id").notNull().references(() => issues.id, { onDelete: "cascade" }),
    sourceKind: text("source_kind").$type<"title" | "description" | "comment" | "document">().notNull(),
    sourceRecordId: uuid("source_record_id"),
    documentKey: text("document_key"),
    matchedText: text("matched_text"),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    companySourceIssueIdx: index("issue_reference_mentions_company_source_issue_idx").on(
      table.companyId,
      table.sourceIssueId,
    ),
    companyTargetIssueIdx: index("issue_reference_mentions_company_target_issue_idx").on(
      table.companyId,
      table.targetIssueId,
    ),
    companyIssuePairIdx: index("issue_reference_mentions_company_issue_pair_idx").on(
      table.companyId,
      table.sourceIssueId,
      table.targetIssueId,
    ),
    companySourceMentionWithRecordUq: uniqueIndex("issue_reference_mentions_company_source_mention_record_uq").on(
      table.companyId,
      table.sourceIssueId,
      table.targetIssueId,
      table.sourceKind,
      table.sourceRecordId,
    ).where(sql`${table.sourceRecordId} is not null`),
    companySourceMentionWithoutRecordUq: uniqueIndex("issue_reference_mentions_company_source_mention_null_record_uq").on(
      table.companyId,
      table.sourceIssueId,
      table.targetIssueId,
      table.sourceKind,
    ).where(sql`${table.sourceRecordId} is null`),
  }),
);
@@ -44,7 +44,6 @@ export const issues = pgTable(
    originKind: text("origin_kind").notNull().default("manual"),
    originId: text("origin_id"),
    originRunId: text("origin_run_id"),
    originFingerprint: text("origin_fingerprint").notNull().default("default"),
    requestDepth: integer("request_depth").notNull().default(0),
    billingCode: text("billing_code"),
    assigneeAdapterOverrides: jsonb("assignee_adapter_overrides").$type<Record<string, unknown>>(),
@@ -83,7 +82,7 @@ export const issues = pgTable(
    identifierSearchIdx: index("issues_identifier_search_idx").using("gin", table.identifier.op("gin_trgm_ops")),
    descriptionSearchIdx: index("issues_description_search_idx").using("gin", table.description.op("gin_trgm_ops")),
    openRoutineExecutionIdx: uniqueIndex("issues_open_routine_execution_uq")
      .on(table.companyId, table.originKind, table.originId, table.originFingerprint)
      .on(table.companyId, table.originKind, table.originId)
      .where(
        sql`${table.originKind} = 'routine_execution'
        and ${table.originId} is not null

@@ -1,4 +1,3 @@
import { sql } from "drizzle-orm";
import { pgTable, uuid, text, timestamp, jsonb, index, uniqueIndex } from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { invites } from "./invites.js";
@@ -38,11 +37,5 @@ export const joinRequests = pgTable(
      table.requestType,
      table.createdAt,
    ),
    pendingHumanUserUniqueIdx: uniqueIndex("join_requests_pending_human_user_uq")
      .on(table.companyId, table.requestingUserId)
      .where(sql`${table.requestType} = 'human' AND ${table.status} = 'pending_approval' AND ${table.requestingUserId} IS NOT NULL`),
    pendingHumanEmailUniqueIdx: uniqueIndex("join_requests_pending_human_email_uq")
      .on(table.companyId, sql`lower(${table.requestEmailSnapshot})`)
      .where(sql`${table.requestType} = 'human' AND ${table.status} = 'pending_approval' AND ${table.requestEmailSnapshot} IS NOT NULL`),
  }),
);

@@ -1,75 +0,0 @@
import {
  pgTable,
  uuid,
  text,
  timestamp,
  index,
  uniqueIndex,
} from "drizzle-orm/pg-core";
import type {
  PluginDatabaseMigrationStatus,
  PluginDatabaseNamespaceMode,
  PluginDatabaseNamespaceStatus,
} from "@paperclipai/shared";
import { plugins } from "./plugins.js";

/**
 * Database namespace allocated to an installed plugin.
 *
 * Namespaces are deterministic and owned by the host. Plugin SQL may create
 * objects only inside its namespace, while selected public core tables remain
 * read-only join targets through runtime checks.
 */
export const pluginDatabaseNamespaces = pgTable(
  "plugin_database_namespaces",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    pluginId: uuid("plugin_id")
      .notNull()
      .references(() => plugins.id, { onDelete: "cascade" }),
    pluginKey: text("plugin_key").notNull(),
    namespaceName: text("namespace_name").notNull(),
    namespaceMode: text("namespace_mode").$type<PluginDatabaseNamespaceMode>().notNull().default("schema"),
    status: text("status").$type<PluginDatabaseNamespaceStatus>().notNull().default("active"),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    pluginIdx: uniqueIndex("plugin_database_namespaces_plugin_idx").on(table.pluginId),
    namespaceIdx: uniqueIndex("plugin_database_namespaces_namespace_idx").on(table.namespaceName),
    statusIdx: index("plugin_database_namespaces_status_idx").on(table.status),
  }),
);

/**
 * Per-plugin migration ledger.
 *
 * Every migration file is recorded with a checksum. A previously applied
 * migration whose checksum changes is rejected during later activation.
 */
export const pluginMigrations = pgTable(
  "plugin_migrations",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    pluginId: uuid("plugin_id")
      .notNull()
      .references(() => plugins.id, { onDelete: "cascade" }),
    pluginKey: text("plugin_key").notNull(),
    namespaceName: text("namespace_name").notNull(),
    migrationKey: text("migration_key").notNull(),
    checksum: text("checksum").notNull(),
    pluginVersion: text("plugin_version").notNull(),
    status: text("status").$type<PluginDatabaseMigrationStatus>().notNull(),
    startedAt: timestamp("started_at", { withTimezone: true }).notNull().defaultNow(),
    appliedAt: timestamp("applied_at", { withTimezone: true }),
    errorMessage: text("error_message"),
  },
  (table) => ({
    pluginMigrationIdx: uniqueIndex("plugin_migrations_plugin_key_idx").on(
      table.pluginId,
      table.migrationKey,
    ),
    pluginIdx: index("plugin_migrations_plugin_idx").on(table.pluginId),
    statusIdx: index("plugin_migrations_status_idx").on(table.status),
  }),
);
@@ -96,7 +96,6 @@ export const routineRuns = pgTable(
    triggeredAt: timestamp("triggered_at", { withTimezone: true }).notNull().defaultNow(),
    idempotencyKey: text("idempotency_key"),
    triggerPayload: jsonb("trigger_payload").$type<Record<string, unknown>>(),
    dispatchFingerprint: text("dispatch_fingerprint"),
    linkedIssueId: uuid("linked_issue_id").references(() => issues.id, { onDelete: "set null" }),
    coalescedIntoRunId: uuid("coalesced_into_run_id"),
    failureReason: text("failure_reason"),
@@ -107,7 +106,6 @@ export const routineRuns = pgTable(
  (table) => ({
    companyRoutineIdx: index("routine_runs_company_routine_idx").on(table.companyId, table.routineId, table.createdAt),
    triggerIdx: index("routine_runs_trigger_idx").on(table.triggerId, table.createdAt),
    dispatchFingerprintIdx: index("routine_runs_dispatch_fingerprint_idx").on(table.routineId, table.dispatchFingerprint),
    linkedIssueIdx: index("routine_runs_linked_issue_idx").on(table.linkedIssueId),
    idempotencyIdx: index("routine_runs_trigger_idempotency_idx").on(table.triggerId, table.idempotencyKey),
  }),

@@ -47,8 +47,6 @@ Read tools:
- `paperclipListDocumentRevisions`
- `paperclipListProjects`
- `paperclipGetProject`
- `paperclipGetIssueWorkspaceRuntime`
- `paperclipWaitForIssueWorkspaceService`
- `paperclipListGoals`
- `paperclipGetGoal`
- `paperclipListApprovals`
@@ -65,7 +63,6 @@ Write tools:
- `paperclipAddComment`
- `paperclipUpsertIssueDocument`
- `paperclipRestoreIssueDocumentRevision`
- `paperclipControlIssueWorkspaceServices`
- `paperclipCreateApproval`
- `paperclipLinkIssueApproval`
- `paperclipUnlinkIssueApproval`

@@ -107,81 +107,6 @@ describe("paperclip MCP tools", () => {
    });
  });

  it("controls issue workspace services through the current execution workspace", async () => {
    const fetchMock = vi.fn()
      .mockResolvedValueOnce(mockJsonResponse({
        currentExecutionWorkspace: {
          id: "44444444-4444-4444-8444-444444444444",
          runtimeServices: [],
        },
      }))
      .mockResolvedValueOnce(mockJsonResponse({
        operation: { id: "operation-1" },
        workspace: {
          id: "44444444-4444-4444-8444-444444444444",
          runtimeServices: [
            {
              id: "55555555-5555-4555-8555-555555555555",
              serviceName: "web",
              status: "running",
              url: "http://127.0.0.1:5173",
            },
          ],
        },
      }));
    vi.stubGlobal("fetch", fetchMock);

    const tool = getTool("paperclipControlIssueWorkspaceServices");
    await tool.execute({
      issueId: "PAP-1135",
      action: "restart",
      workspaceCommandId: "web",
    });

    expect(fetchMock).toHaveBeenCalledTimes(2);
    const [lookupUrl, lookupInit] = fetchMock.mock.calls[0] as [string, RequestInit];
    expect(String(lookupUrl)).toBe("http://localhost:3100/api/issues/PAP-1135/heartbeat-context");
    expect(lookupInit.method).toBe("GET");

    const [controlUrl, controlInit] = fetchMock.mock.calls[1] as [string, RequestInit];
    expect(String(controlUrl)).toBe(
      "http://localhost:3100/api/execution-workspaces/44444444-4444-4444-8444-444444444444/runtime-services/restart",
    );
    expect(controlInit.method).toBe("POST");
    expect(JSON.parse(String(controlInit.body))).toEqual({
      workspaceCommandId: "web",
    });
  });

  it("waits for an issue workspace runtime service URL", async () => {
    const fetchMock = vi.fn()
      .mockResolvedValueOnce(mockJsonResponse({
        currentExecutionWorkspace: {
          id: "44444444-4444-4444-8444-444444444444",
          runtimeServices: [
            {
              id: "55555555-5555-4555-8555-555555555555",
              serviceName: "web",
              status: "running",
              healthStatus: "healthy",
              url: "http://127.0.0.1:5173",
            },
          ],
        },
      }));
    vi.stubGlobal("fetch", fetchMock);

    const tool = getTool("paperclipWaitForIssueWorkspaceService");
    const response = await tool.execute({
      issueId: "PAP-1135",
      serviceName: "web",
      timeoutSeconds: 1,
    });

    expect(fetchMock).toHaveBeenCalledTimes(1);
    expect(response.content[0]?.text).toContain("http://127.0.0.1:5173");
  });

  it("creates approvals with the expected company-scoped payload", async () => {
    const fetchMock = vi.fn().mockResolvedValue(
      mockJsonResponse({ id: "approval-1" }),

@@ -124,66 +124,6 @@ const apiRequestSchema = z.object({
  jsonBody: z.string().optional(),
});

const workspaceRuntimeControlTargetSchema = z.object({
  workspaceCommandId: z.string().min(1).optional().nullable(),
  runtimeServiceId: z.string().uuid().optional().nullable(),
  serviceIndex: z.number().int().nonnegative().optional().nullable(),
});

const issueWorkspaceRuntimeControlSchema = z.object({
  issueId: issueIdSchema,
  action: z.enum(["start", "stop", "restart"]),
}).merge(workspaceRuntimeControlTargetSchema);

const waitForIssueWorkspaceServiceSchema = z.object({
  issueId: issueIdSchema,
  runtimeServiceId: z.string().uuid().optional().nullable(),
  serviceName: z.string().min(1).optional().nullable(),
  timeoutSeconds: z.number().int().positive().max(300).optional(),
});

function sleep(ms: number) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

function readCurrentExecutionWorkspace(context: unknown): Record<string, unknown> | null {
  if (!context || typeof context !== "object") return null;
  const workspace = (context as { currentExecutionWorkspace?: unknown }).currentExecutionWorkspace;
  return workspace && typeof workspace === "object" ? workspace as Record<string, unknown> : null;
}

function readWorkspaceRuntimeServices(workspace: Record<string, unknown> | null): Array<Record<string, unknown>> {
  const raw = workspace?.runtimeServices;
  return Array.isArray(raw)
    ? raw.filter((entry): entry is Record<string, unknown> => Boolean(entry) && typeof entry === "object")
    : [];
}

function selectRuntimeService(
  services: Array<Record<string, unknown>>,
  input: { runtimeServiceId?: string | null; serviceName?: string | null },
) {
  if (input.runtimeServiceId) {
    return services.find((service) => service.id === input.runtimeServiceId) ?? null;
  }
  if (input.serviceName) {
    return services.find((service) => service.serviceName === input.serviceName) ?? null;
  }
  return services.find((service) => service.status === "running" || service.status === "starting")
    ?? services[0]
    ?? null;
}

async function getIssueWorkspaceRuntime(client: PaperclipApiClient, issueId: string) {
  const context = await client.requestJson("GET", `/issues/${encodeURIComponent(issueId)}/heartbeat-context`);
  const workspace = readCurrentExecutionWorkspace(context);
  return {
    context,
    workspace,
    runtimeServices: readWorkspaceRuntimeServices(workspace),
  };
}

export function createToolDefinitions(client: PaperclipApiClient): ToolDefinition[] {
  return [
    makeTool(
@@ -307,55 +247,6 @@ export function createToolDefinitions(client: PaperclipApiClient): ToolDefinitio
        return client.requestJson("GET", `/projects/${encodeURIComponent(projectId)}${qs}`);
      },
    ),
    makeTool(
      "paperclipGetIssueWorkspaceRuntime",
      "Get the current execution workspace and runtime services for an issue, including service URLs",
      z.object({ issueId: issueIdSchema }),
      async ({ issueId }) => getIssueWorkspaceRuntime(client, issueId),
    ),
    makeTool(
      "paperclipControlIssueWorkspaceServices",
      "Start, stop, or restart the current issue execution workspace runtime services",
      issueWorkspaceRuntimeControlSchema,
      async ({ issueId, action, ...target }) => {
        const runtime = await getIssueWorkspaceRuntime(client, issueId);
        const workspaceId = typeof runtime.workspace?.id === "string" ? runtime.workspace.id : null;
        if (!workspaceId) {
          throw new Error("Issue has no current execution workspace");
        }
        return client.requestJson(
          "POST",
          `/execution-workspaces/${encodeURIComponent(workspaceId)}/runtime-services/${action}`,
          { body: target },
        );
      },
    ),
    makeTool(
      "paperclipWaitForIssueWorkspaceService",
      "Wait until an issue execution workspace runtime service is running and has a URL when one is exposed",
      waitForIssueWorkspaceServiceSchema,
      async ({ issueId, runtimeServiceId, serviceName, timeoutSeconds }) => {
        const deadline = Date.now() + (timeoutSeconds ?? 60) * 1000;
        let latest: Awaited<ReturnType<typeof getIssueWorkspaceRuntime>> | null = null;
        while (Date.now() <= deadline) {
          latest = await getIssueWorkspaceRuntime(client, issueId);
          const service = selectRuntimeService(latest.runtimeServices, { runtimeServiceId, serviceName });
          if (service?.status === "running" && service.healthStatus !== "unhealthy") {
            return {
              workspace: latest.workspace,
              service,
            };
          }
          await sleep(1000);
        }

        return {
          timedOut: true,
          latestWorkspace: latest?.workspace ?? null,
          latestRuntimeServices: latest?.runtimeServices ?? [],
        };
      },
    ),
    makeTool(
      "paperclipListGoals",
      "List goals in a company",

@@ -1,3 +0,0 @@
dist
node_modules
.paperclip-sdk
@@ -1,48 +0,0 @@
# Plugin Orchestration Smoke Example

This first-party example validates the orchestration-grade plugin host surface.
It is intentionally small and exists as an acceptance fixture rather than a
product plugin.

## What it exercises

- `apiRoutes` under `/api/plugins/:pluginId/api/*`
- restricted database migrations and runtime `ctx.db`
- plugin-owned rows joined to `public.issues`
- plugin-created child issues with namespaced origin metadata
- billing codes, workspace inheritance, blocker relations, documents, wakeups,
  and orchestration summaries
- issue detail and settings UI slots that surface route, capability, namespace,
  and smoke status

## Development

```bash
pnpm install
pnpm typecheck
pnpm test
pnpm build
```

## Install Into Paperclip

Use an absolute local path during development:

```bash
curl -X POST http://127.0.0.1:3100/api/plugins/install \
  -H "Content-Type: application/json" \
  -d '{"packageName":"/absolute/path/to/paperclip/packages/plugins/examples/plugin-orchestration-smoke-example","isLocalPath":true}'
```

## Scoped Route Smoke

After the plugin is ready, run the scoped route against an existing issue:

```bash
curl -X POST http://127.0.0.1:3100/api/plugins/paperclipai.plugin-orchestration-smoke-example/api/issues/<issue-id>/smoke \
  -H "Content-Type: application/json" \
  -d '{"assigneeAgentId":"<agent-id>"}'
```

The route returns the generated child issue, resolved blocker, billing code,
subtree ids, and wakeup result.
@@ -1,17 +0,0 @@
import esbuild from "esbuild";
import { createPluginBundlerPresets } from "@paperclipai/plugin-sdk/bundlers";

const presets = createPluginBundlerPresets({ uiEntry: "src/ui/index.tsx" });
const watch = process.argv.includes("--watch");

const workerCtx = await esbuild.context(presets.esbuild.worker);
const manifestCtx = await esbuild.context(presets.esbuild.manifest);
const uiCtx = await esbuild.context(presets.esbuild.ui);

if (watch) {
  await Promise.all([workerCtx.watch(), manifestCtx.watch(), uiCtx.watch()]);
  console.log("esbuild watch mode enabled for worker, manifest, and ui");
} else {
  await Promise.all([workerCtx.rebuild(), manifestCtx.rebuild(), uiCtx.rebuild()]);
  await Promise.all([workerCtx.dispose(), manifestCtx.dispose(), uiCtx.dispose()]);
}
@@ -1,10 +0,0 @@
CREATE TABLE plugin_orchestration_smoke_1e8c264c64.smoke_runs (
  id uuid PRIMARY KEY,
  root_issue_id uuid NOT NULL REFERENCES public.issues(id) ON DELETE CASCADE,
  child_issue_id uuid REFERENCES public.issues(id) ON DELETE SET NULL,
  blocker_issue_id uuid REFERENCES public.issues(id) ON DELETE SET NULL,
  billing_code text NOT NULL,
  last_summary jsonb NOT NULL DEFAULT '{}'::jsonb,
  created_at timestamptz NOT NULL DEFAULT now(),
  updated_at timestamptz NOT NULL DEFAULT now()
);
@@ -1,46 +0,0 @@
{
  "name": "@paperclipai/plugin-orchestration-smoke-example",
  "version": "0.1.0",
  "type": "module",
  "private": true,
  "description": "First-party smoke plugin for orchestration-grade Paperclip plugin APIs",
  "scripts": {
    "prebuild": "node ../../../../scripts/ensure-plugin-build-deps.mjs",
    "build": "node ./esbuild.config.mjs",
    "build:rollup": "rollup -c",
    "dev": "node ./esbuild.config.mjs --watch",
    "dev:ui": "paperclip-plugin-dev-server --root . --ui-dir dist/ui --port 4177",
    "test": "vitest run --config ./vitest.config.ts",
    "typecheck": "pnpm --filter @paperclipai/plugin-sdk build && tsc --noEmit"
  },
  "paperclipPlugin": {
    "manifest": "./dist/manifest.js",
    "worker": "./dist/worker.js",
    "ui": "./dist/ui/"
  },
  "keywords": [
    "paperclip",
    "plugin",
    "connector"
  ],
  "author": "Paperclip",
  "license": "MIT",
  "dependencies": {
    "@paperclipai/plugin-sdk": "workspace:*"
  },
  "devDependencies": {
    "@paperclipai/shared": "workspace:*",
    "@rollup/plugin-node-resolve": "^16.0.1",
    "@rollup/plugin-typescript": "^12.1.2",
    "@types/node": "^24.6.0",
    "@types/react": "^19.0.8",
    "esbuild": "^0.27.3",
    "rollup": "^4.38.0",
    "tslib": "^2.8.1",
    "typescript": "^5.7.3",
    "vitest": "^3.0.5"
  },
  "peerDependencies": {
    "react": ">=18"
  }
}
@@ -1,28 +0,0 @@
import { nodeResolve } from "@rollup/plugin-node-resolve";
import typescript from "@rollup/plugin-typescript";
import { createPluginBundlerPresets } from "@paperclipai/plugin-sdk/bundlers";

const presets = createPluginBundlerPresets({ uiEntry: "src/ui/index.tsx" });

function withPlugins(config) {
  if (!config) return null;
  return {
    ...config,
    plugins: [
      nodeResolve({
        extensions: [".ts", ".tsx", ".js", ".jsx", ".mjs"],
      }),
      typescript({
        tsconfig: "./tsconfig.json",
        declaration: false,
        declarationMap: false,
      }),
    ],
  };
}

export default [
  withPlugins(presets.rollup.manifest),
  withPlugins(presets.rollup.worker),
  withPlugins(presets.rollup.ui),
].filter(Boolean);
@@ -1,82 +0,0 @@
import type { PaperclipPluginManifestV1 } from "@paperclipai/plugin-sdk";

const manifest: PaperclipPluginManifestV1 = {
  id: "paperclipai.plugin-orchestration-smoke-example",
  apiVersion: 1,
  version: "0.1.0",
  displayName: "Plugin Orchestration Smoke Example",
  description: "First-party smoke plugin that exercises Paperclip orchestration-grade plugin APIs.",
  author: "Paperclip",
  categories: ["automation", "ui"],
  capabilities: [
    "api.routes.register",
    "database.namespace.migrate",
    "database.namespace.read",
    "database.namespace.write",
    "issues.read",
    "issues.create",
    "issues.wakeup",
    "issue.relations.read",
    "issue.relations.write",
    "issue.documents.read",
    "issue.documents.write",
    "issue.subtree.read",
    "issues.orchestration.read",
    "ui.dashboardWidget.register",
    "ui.detailTab.register",
    "instance.settings.register"
  ],
  entrypoints: {
    worker: "./dist/worker.js",
    ui: "./dist/ui"
  },
  database: {
    namespaceSlug: "orchestration_smoke",
    migrationsDir: "migrations",
    coreReadTables: ["issues"]
  },
  apiRoutes: [
    {
      routeKey: "initialize",
      method: "POST",
      path: "/issues/:issueId/smoke",
      auth: "board-or-agent",
      capability: "api.routes.register",
      checkoutPolicy: "required-for-agent-in-progress",
      companyResolution: { from: "issue", param: "issueId" }
    },
    {
      routeKey: "summary",
      method: "GET",
      path: "/issues/:issueId/smoke",
      auth: "board-or-agent",
      capability: "api.routes.register",
      companyResolution: { from: "issue", param: "issueId" }
    }
  ],
  ui: {
    slots: [
      {
        type: "dashboardWidget",
        id: "health-widget",
        displayName: "Orchestration Smoke Health",
        exportName: "DashboardWidget"
      },
      {
        type: "taskDetailView",
        id: "issue-panel",
        displayName: "Orchestration Smoke",
        exportName: "IssuePanel",
        entityTypes: ["issue"]
      },
      {
        type: "settingsPage",
        id: "settings",
        displayName: "Orchestration Smoke",
        exportName: "SettingsPage"
      }
    ]
  }
};

export default manifest;
@@ -1,134 +0,0 @@
import {
  usePluginAction,
  usePluginData,
  type PluginDetailTabProps,
  type PluginSettingsPageProps,
  type PluginWidgetProps,
} from "@paperclipai/plugin-sdk/ui";
import type React from "react";

type SurfaceStatus = {
  status: "ok" | "degraded" | "error";
  checkedAt: string;
  databaseNamespace: string;
  routeKeys: string[];
  capabilities: string[];
  summary: null | {
    rootIssueId: string;
    childIssueId: string | null;
    blockerIssueId: string | null;
    billingCode: string;
    subtreeIssueIds: string[];
    wakeupQueued: boolean;
  };
};

const panelStyle = {
  display: "grid",
  gap: 10,
  fontSize: 13,
  lineHeight: 1.45,
} satisfies React.CSSProperties;

const rowStyle = {
  display: "flex",
  justifyContent: "space-between",
  gap: 12,
} satisfies React.CSSProperties;

const buttonStyle = {
  border: "1px solid #1f2937",
  background: "#111827",
  color: "#fff",
  borderRadius: 6,
  padding: "6px 10px",
  font: "inherit",
  cursor: "pointer",
} satisfies React.CSSProperties;

function SurfaceRows({ data }: { data: SurfaceStatus }) {
  return (
    <div style={{ display: "grid", gap: 6 }}>
      <div style={rowStyle}><span>Status</span><strong>{data.status}</strong></div>
      <div style={rowStyle}><span>Namespace</span><code>{data.databaseNamespace}</code></div>
      <div style={rowStyle}><span>Routes</span><code>{data.routeKeys.join(", ")}</code></div>
      <div style={rowStyle}><span>Capabilities</span><strong>{data.capabilities.length}</strong></div>
    </div>
  );
}

export function DashboardWidget({ context }: PluginWidgetProps) {
  const { data, loading, error } = usePluginData<SurfaceStatus>("surface-status", {
    companyId: context.companyId,
  });

  if (loading) return <div>Loading orchestration smoke status...</div>;
  if (error) return <div>Orchestration smoke error: {error.message}</div>;
  if (!data) return null;

  return (
    <div style={panelStyle}>
      <strong>Orchestration Smoke</strong>
      <SurfaceRows data={data} />
      <div>Checked {data.checkedAt}</div>
    </div>
  );
}

export function IssuePanel({ context }: PluginDetailTabProps) {
  const { data, loading, error, refresh } = usePluginData<SurfaceStatus>("surface-status", {
    companyId: context.companyId,
    issueId: context.entityId,
  });
  const initialize = usePluginAction("initialize-smoke");

  if (loading) return <div>Loading orchestration smoke...</div>;
  if (error) return <div>Orchestration smoke error: {error.message}</div>;
  if (!data) return null;

  return (
    <div style={panelStyle}>
      <div style={rowStyle}>
        <strong>Orchestration Smoke</strong>
        <button
          style={buttonStyle}
          onClick={async () => {
            await initialize({ companyId: context.companyId, issueId: context.entityId });
            refresh();
          }}
        >
          Run Smoke
        </button>
      </div>
      <SurfaceRows data={data} />
      {data.summary ? (
        <div style={{ display: "grid", gap: 4 }}>
          <div style={rowStyle}><span>Child</span><code>{data.summary.childIssueId ?? "none"}</code></div>
          <div style={rowStyle}><span>Blocker</span><code>{data.summary.blockerIssueId ?? "none"}</code></div>
          <div style={rowStyle}><span>Billing</span><code>{data.summary.billingCode}</code></div>
          <div style={rowStyle}><span>Subtree</span><strong>{data.summary.subtreeIssueIds.length}</strong></div>
          <div style={rowStyle}><span>Wakeup</span><strong>{data.summary.wakeupQueued ? "queued" : "not queued"}</strong></div>
        </div>
      ) : (
        <div>No smoke run recorded for this issue.</div>
      )}
    </div>
  );
}

export function SettingsPage({ context }: PluginSettingsPageProps) {
  const { data, loading, error } = usePluginData<SurfaceStatus>("surface-status", {
    companyId: context.companyId,
  });

  if (loading) return <div>Loading orchestration smoke settings...</div>;
  if (error) return <div>Orchestration smoke settings error: {error.message}</div>;
  if (!data) return null;

  return (
    <div style={panelStyle}>
      <strong>Orchestration Smoke Surface</strong>
      <SurfaceRows data={data} />
    </div>
  );
}
@@ -1,253 +0,0 @@
import { randomUUID } from "node:crypto";
import { definePlugin, runWorker, type PluginApiRequestInput } from "@paperclipai/plugin-sdk";

type SmokeInput = {
  companyId: string;
  issueId: string;
  assigneeAgentId?: string | null;
  actorAgentId?: string | null;
  actorUserId?: string | null;
  actorRunId?: string | null;
};

type SmokeSummary = {
  rootIssueId: string;
  childIssueId: string | null;
  blockerIssueId: string | null;
  billingCode: string;
  joinedRows: unknown[];
  subtreeIssueIds: string[];
  wakeupQueued: boolean;
};

let readSmokeSummary: ((companyId: string, issueId: string) => Promise<SmokeSummary | null>) | null = null;
let initializeSmoke: ((input: SmokeInput) => Promise<SmokeSummary>) | null = null;

function tableName(namespace: string) {
  return `${namespace}.smoke_runs`;
}

function stringField(value: unknown): string | null {
  return typeof value === "string" && value.trim().length > 0 ? value : null;
}

const plugin = definePlugin({
  async setup(ctx) {
    readSmokeSummary = async function readSummary(companyId: string, issueId: string): Promise<SmokeSummary | null> {
      const rows = await ctx.db.query<{
        root_issue_id: string;
        child_issue_id: string | null;
        blocker_issue_id: string | null;
        billing_code: string;
        issue_title: string;
        last_summary: unknown;
      }>(
        `SELECT s.root_issue_id, s.child_issue_id, s.blocker_issue_id, s.billing_code, i.title AS issue_title, s.last_summary
           FROM ${tableName(ctx.db.namespace)} s
           JOIN public.issues i ON i.id = s.root_issue_id
          WHERE s.root_issue_id = $1`,
        [issueId],
      );
      const row = rows[0];
      if (!row) return null;
      const orchestration = await ctx.issues.summaries.getOrchestration({
        issueId,
        companyId,
        includeSubtree: true,
        billingCode: row.billing_code,
      });
      return {
        rootIssueId: row.root_issue_id,
        childIssueId: row.child_issue_id,
        blockerIssueId: row.blocker_issue_id,
        billingCode: row.billing_code,
        joinedRows: rows,
        subtreeIssueIds: orchestration.subtreeIssueIds,
        wakeupQueued: Boolean((row.last_summary as { wakeupQueued?: unknown } | null)?.wakeupQueued),
      };
    };

    initializeSmoke = async function runSmoke(input: SmokeInput): Promise<SmokeSummary> {
      const root = await ctx.issues.get(input.issueId, input.companyId);
      if (!root) throw new Error(`Issue not found: ${input.issueId}`);

      const billingCode = `plugin-smoke:${input.issueId}`;
      const actor = {
        actorAgentId: input.actorAgentId ?? null,
        actorUserId: input.actorUserId ?? null,
        actorRunId: input.actorRunId ?? null,
      };
      const blocker = await ctx.issues.create({
        companyId: input.companyId,
        parentId: input.issueId,
        inheritExecutionWorkspaceFromIssueId: input.issueId,
        title: "Orchestration smoke blocker",
        description: "Resolved blocker used to verify plugin relation writes without preventing the smoke wakeup.",
        status: "done",
        priority: "low",
        billingCode,
        originKind: `plugin:${ctx.manifest.id}:blocker`,
        originId: `${input.issueId}:blocker`,
        actor,
      });

      const child = await ctx.issues.create({
        companyId: input.companyId,
        parentId: input.issueId,
        inheritExecutionWorkspaceFromIssueId: input.issueId,
        title: "Orchestration smoke child",
        description: "Generated by the orchestration smoke plugin to verify issue, document, relation, wakeup, and summary APIs.",
        status: "todo",
        priority: "medium",
        assigneeAgentId: input.assigneeAgentId ?? root.assigneeAgentId ?? undefined,
        billingCode,
        originKind: `plugin:${ctx.manifest.id}:child`,
        originId: `${input.issueId}:child`,
        blockedByIssueIds: [blocker.id],
        actor,
      });

      await ctx.issues.relations.setBlockedBy(child.id, [blocker.id], input.companyId, actor);
      await ctx.issues.documents.upsert({
        issueId: child.id,
        companyId: input.companyId,
        key: "orchestration-smoke",
        title: "Orchestration Smoke",
        format: "markdown",
        body: [
          "# Orchestration Smoke",
          "",
          `- Root issue: ${input.issueId}`,
          `- Child issue: ${child.id}`,
          `- Billing code: ${billingCode}`,
        ].join("\n"),
        changeSummary: "Recorded orchestration smoke output",
      });

      const wakeup = await ctx.issues.requestWakeup(child.id, input.companyId, {
        reason: "plugin:orchestration_smoke",
        contextSource: "plugin-orchestration-smoke",
        idempotencyKey: `${input.issueId}:child`,
        ...actor,
      });
      const orchestration = await ctx.issues.summaries.getOrchestration({
        issueId: input.issueId,
        companyId: input.companyId,
        includeSubtree: true,
        billingCode,
      });
      const summarySnapshot = {
        childIssueId: child.id,
        blockerIssueId: blocker.id,
        wakeupQueued: wakeup.queued,
        subtreeIssueIds: orchestration.subtreeIssueIds,
      };

      await ctx.db.execute(
        `INSERT INTO ${tableName(ctx.db.namespace)} (id, root_issue_id, child_issue_id, blocker_issue_id, billing_code, last_summary)
         VALUES ($1, $2, $3, $4, $5, $6::jsonb)
         ON CONFLICT (id) DO UPDATE SET
           child_issue_id = EXCLUDED.child_issue_id,
           blocker_issue_id = EXCLUDED.blocker_issue_id,
           billing_code = EXCLUDED.billing_code,
           last_summary = EXCLUDED.last_summary,
           updated_at = now()`,
        [
          randomUUID(),
          input.issueId,
          child.id,
          blocker.id,
          billingCode,
          JSON.stringify(summarySnapshot),
        ],
      );

      return {
        rootIssueId: input.issueId,
        childIssueId: child.id,
        blockerIssueId: blocker.id,
        billingCode,
        joinedRows: await ctx.db.query(
          `SELECT s.id, s.billing_code, i.title AS root_title
             FROM ${tableName(ctx.db.namespace)} s
             JOIN public.issues i ON i.id = s.root_issue_id
            WHERE s.root_issue_id = $1`,
          [input.issueId],
        ),
        subtreeIssueIds: orchestration.subtreeIssueIds,
        wakeupQueued: wakeup.queued,
      };
    };

    ctx.data.register("surface-status", async (params) => {
      const companyId = stringField(params.companyId);
      const issueId = stringField(params.issueId);
      return {
        status: "ok",
        checkedAt: new Date().toISOString(),
        databaseNamespace: ctx.db.namespace,
        routeKeys: (ctx.manifest.apiRoutes ?? []).map((route) => route.routeKey),
        capabilities: ctx.manifest.capabilities,
        summary: companyId && issueId ? await readSmokeSummary?.(companyId, issueId) ?? null : null,
      };
    });

    ctx.actions.register("initialize-smoke", async (params) => {
      const companyId = stringField(params.companyId);
      const issueId = stringField(params.issueId);
      if (!companyId || !issueId) throw new Error("companyId and issueId are required");
      if (!initializeSmoke) throw new Error("Smoke initializer is not ready");
      return initializeSmoke({
        companyId,
        issueId,
        assigneeAgentId: stringField(params.assigneeAgentId),
        actorAgentId: stringField(params.actorAgentId),
        actorUserId: stringField(params.actorUserId),
        actorRunId: stringField(params.actorRunId),
      });
    });
  },

  async onApiRequest(input: PluginApiRequestInput) {
    if (input.routeKey === "summary") {
      const issueId = input.params.issueId;
      return {
        body: await readSmokeSummary?.(input.companyId, issueId) ?? null,
      };
    }

    if (input.routeKey === "initialize") {
      if (!initializeSmoke) throw new Error("Smoke initializer is not ready");
      const body = input.body as Record<string, unknown> | null;
      return {
        status: 201,
        body: await initializeSmoke({
          companyId: input.companyId,
          issueId: input.params.issueId,
          assigneeAgentId: stringField(body?.assigneeAgentId),
          actorAgentId: input.actor.agentId ?? null,
          actorUserId: input.actor.userId ?? null,
          actorRunId: input.actor.runId ?? null,
        }),
      };
    }

    return {
      status: 404,
      body: { error: `Unknown orchestration smoke route: ${input.routeKey}` },
    };
  },

  async onHealth() {
    return {
      status: "ok",
      message: "Orchestration smoke plugin worker is running",
      details: {
        surfaces: ["database", "scoped-api-route", "issue-panel", "orchestration-apis"],
      },
    };
  },
});

export default plugin;
runWorker(plugin, import.meta.url);
@@ -1,162 +0,0 @@
import { randomUUID } from "node:crypto";
import { describe, expect, it } from "vitest";
import { pluginManifestV1Schema, type Issue } from "@paperclipai/shared";
import { createTestHarness } from "@paperclipai/plugin-sdk/testing";
import manifest from "../src/manifest.js";
import plugin from "../src/worker.js";

function issue(input: Partial<Issue> & Pick<Issue, "id" | "companyId" | "title">): Issue {
  const now = new Date();
  const { id, companyId, title, ...rest } = input;
  return {
    id,
    companyId,
    projectId: null,
    projectWorkspaceId: null,
    goalId: null,
    parentId: null,
    title,
    description: null,
    status: "todo",
    priority: "medium",
    assigneeAgentId: null,
    assigneeUserId: null,
    checkoutRunId: null,
    executionRunId: null,
    executionAgentNameKey: null,
    executionLockedAt: null,
    createdByAgentId: null,
    createdByUserId: null,
    issueNumber: null,
    identifier: null,
    originKind: "manual",
    originId: null,
    originRunId: null,
    requestDepth: 0,
    billingCode: null,
    assigneeAdapterOverrides: null,
    executionWorkspaceId: null,
    executionWorkspacePreference: null,
    executionWorkspaceSettings: null,
    startedAt: null,
    completedAt: null,
    cancelledAt: null,
    hiddenAt: null,
    createdAt: now,
    updatedAt: now,
    ...rest,
  };
}

describe("orchestration smoke plugin", () => {
  it("declares the Phase 1 orchestration surfaces", () => {
    expect(pluginManifestV1Schema.parse(manifest)).toMatchObject({
      id: "paperclipai.plugin-orchestration-smoke-example",
      database: {
        migrationsDir: "migrations",
        coreReadTables: ["issues"],
      },
      apiRoutes: [
        expect.objectContaining({ routeKey: "initialize" }),
        expect.objectContaining({ routeKey: "summary" }),
      ],
    });
  });

  it("creates plugin-owned orchestration rows, issue tree, document, wakeup, and summary reads", async () => {
    const companyId = randomUUID();
    const rootIssueId = randomUUID();
    const agentId = randomUUID();
    const harness = createTestHarness({ manifest });
    harness.seed({
      issues: [
        issue({
          id: rootIssueId,
          companyId,
          title: "Root orchestration issue",
          assigneeAgentId: agentId,
        }),
      ],
    });
    await plugin.definition.setup(harness.ctx);

    const result = await harness.performAction<{
      rootIssueId: string;
      childIssueId: string;
      blockerIssueId: string;
      billingCode: string;
      subtreeIssueIds: string[];
      wakeupQueued: boolean;
    }>("initialize-smoke", {
      companyId,
      issueId: rootIssueId,
      assigneeAgentId: agentId,
    });

    expect(result.rootIssueId).toBe(rootIssueId);
    expect(result.childIssueId).toEqual(expect.any(String));
    expect(result.blockerIssueId).toEqual(expect.any(String));
    expect(result.billingCode).toBe(`plugin-smoke:${rootIssueId}`);
    expect(result.wakeupQueued).toBe(true);
    expect(result.subtreeIssueIds).toEqual(expect.arrayContaining([rootIssueId, result.childIssueId]));
    expect(harness.dbExecutes[0]?.sql).toContain(".smoke_runs");
    expect(harness.dbQueries.some((entry) => entry.sql.includes("JOIN public.issues"))).toBe(true);

    const relations = await harness.ctx.issues.relations.get(result.childIssueId, companyId);
    expect(relations.blockedBy).toEqual([
      expect.objectContaining({
        id: result.blockerIssueId,
        status: "done",
      }),
    ]);
    const docs = await harness.ctx.issues.documents.list(result.childIssueId, companyId);
    expect(docs).toEqual([
      expect.objectContaining({
        key: "orchestration-smoke",
        title: "Orchestration Smoke",
      }),
    ]);
  });

  it("dispatches the scoped API route through the same smoke path", async () => {
    const companyId = randomUUID();
    const rootIssueId = randomUUID();
    const agentId = randomUUID();
    const harness = createTestHarness({ manifest });
    harness.seed({
      issues: [
        issue({
          id: rootIssueId,
          companyId,
          title: "Scoped API root",
          assigneeAgentId: agentId,
        }),
      ],
    });
    await plugin.definition.setup(harness.ctx);

    await expect(plugin.definition.onApiRequest?.({
      routeKey: "initialize",
      method: "POST",
      path: `/issues/${rootIssueId}/smoke`,
      params: { issueId: rootIssueId },
      query: {},
      body: { assigneeAgentId: agentId },
      actor: {
        actorType: "user",
        actorId: "board",
        userId: "board",
        agentId: null,
        runId: null,
      },
      companyId,
      headers: {},
    })).resolves.toMatchObject({
      status: 201,
      body: expect.objectContaining({
        rootIssueId,
        wakeupQueued: true,
      }),
    });
  });
});
@@ -1,27 +0,0 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "lib": [
      "ES2022",
      "DOM"
    ],
    "jsx": "react-jsx",
    "strict": true,
    "skipLibCheck": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "outDir": "dist",
    "rootDir": "."
  },
  "include": [
    "src",
    "tests"
  ],
  "exclude": [
    "dist",
    "node_modules"
  ]
}
@@ -1,8 +0,0 @@
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    include: ["tests/**/*.spec.ts"],
    environment: "node",
  },
});
@@ -118,13 +118,10 @@ Subscribe in `setup` with `ctx.events.on(name, handler)` or `ctx.events.on(name,
| `project.created`, `project.updated` | project |
| `project.workspace_created`, `project.workspace_updated`, `project.workspace_deleted` | project_workspace |
| `issue.created`, `issue.updated`, `issue.comment.created` | issue |
| `issue.document.created`, `issue.document.updated`, `issue.document.deleted` | issue |
| `issue.relations.updated`, `issue.checked_out`, `issue.released`, `issue.assignment_wakeup_requested` | issue |
| `agent.created`, `agent.updated`, `agent.status_changed` | agent |
| `agent.run.started`, `agent.run.finished`, `agent.run.failed`, `agent.run.cancelled` | run |
| `goal.created`, `goal.updated` | goal |
| `approval.created`, `approval.decided` | approval |
| `budget.incident.opened`, `budget.incident.resolved` | budget_incident |
| `cost_event.created` | cost |
| `activity.logged` | activity |

@@ -304,29 +301,18 @@ Declare in `manifest.capabilities`. Grouped by scope:
| | `project.workspaces.read` |
| | `issues.read` |
| | `issue.comments.read` |
| | `issue.documents.read` |
| | `issue.relations.read` |
| | `issue.subtree.read` |
| | `agents.read` |
| | `goals.read` |
| | `goals.create` |
| | `goals.update` |
| | `activity.read` |
| | `costs.read` |
| | `issues.orchestration.read` |
| | `database.namespace.read` |
| | `issues.create` |
| | `issues.update` |
| | `issues.checkout` |
| | `issues.wakeup` |
| | `issue.comments.create` |
| | `issue.documents.write` |
| | `issue.relations.write` |
| | `activity.log.write` |
| | `metrics.write` |
| | `telemetry.track` |
| | `database.namespace.migrate` |
| | `database.namespace.write` |
| **Instance** | `instance.settings.register` |
| | `plugin.state.read` |
| | `plugin.state.write` |
@@ -334,7 +320,6 @@ Declare in `manifest.capabilities`. Grouped by scope:
| | `events.emit` |
| | `jobs.schedule` |
| | `webhooks.receive` |
| | `api.routes.register` |
| | `http.outbound` |
| | `secrets.read-ref` |
| **Agent** | `agent.tools.register` |
@@ -352,144 +337,6 @@ Declare in `manifest.capabilities`. Grouped by scope:

Full list in code: import `PLUGIN_CAPABILITIES` from `@paperclipai/plugin-sdk`.

### Restricted Database Namespace

Trusted orchestration plugins can declare a host-owned PostgreSQL namespace:

```ts
database: {
  migrationsDir: "migrations",
  coreReadTables: ["issues"],
}
```

Declare `database.namespace.migrate` and `database.namespace.read`; add `database.namespace.write` when the worker needs runtime writes. Migrations run before worker startup, are checksum-recorded, and may create or alter objects only inside the plugin namespace. Runtime `ctx.db.query()` allows `SELECT` from `ctx.db.namespace` plus manifest-whitelisted `public` core tables. Runtime `ctx.db.execute()` allows `INSERT`, `UPDATE`, and `DELETE` only against the plugin namespace.

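The access rules above can be approximated as a small predicate pair. This is an illustrative sketch only — the host's real enforcement parses SQL, while this toy check merely compares table-name prefixes, and the namespace value is an example:

```ts
// Illustrative approximation of the runtime rules described above; the host's
// real enforcement parses SQL, this toy check only compares table prefixes.
const namespace = "plugin_orchestration_smoke_1e8c264c64"; // example namespace
const coreReadTables = ["issues"]; // from manifest `database.coreReadTables`

function canRead(table: string): boolean {
  // SELECT is allowed from the plugin namespace plus whitelisted core tables.
  return (
    table.startsWith(`${namespace}.`) ||
    coreReadTables.some((t) => table === `public.${t}`)
  );
}

function canWrite(table: string): boolean {
  // INSERT/UPDATE/DELETE are allowed only inside the plugin namespace.
  return table.startsWith(`${namespace}.`);
}

console.log(canRead(`${namespace}.smoke_runs`)); // true
console.log(canRead("public.issues")); // true
console.log(canWrite("public.issues")); // false
```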
### Scoped API Routes

Manifest-declared `apiRoutes` expose JSON routes under `/api/plugins/:pluginId/api/*` without letting a plugin claim core paths:

```ts
apiRoutes: [
  {
    routeKey: "initialize",
    method: "POST",
    path: "/issues/:issueId/smoke",
    auth: "board-or-agent",
    capability: "api.routes.register",
    checkoutPolicy: "required-for-agent-in-progress",
    companyResolution: { from: "issue", param: "issueId" },
  },
]
```

Implement `onApiRequest(input)` in the worker to handle the route. The host enforces auth, company access, capability checks, route matching, and checkout policy before dispatch. The worker receives route params, query, the parsed JSON body, sanitized headers, actor context, and `companyId`; responses are JSON objects of the form `{ status?, headers?, body? }`.

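A minimal, self-contained handler along these lines can be sketched as follows. The input type is abbreviated from `PluginApiRequestInput`, the route logic is hypothetical, and the assumption that an omitted `status` defaults to 200 is marked in the comments:

```ts
// Minimal, self-contained sketch of a worker-side handler; the input type is
// abbreviated from PluginApiRequestInput and the route bodies are hypothetical.
type ApiRequest = {
  routeKey: string;
  params: Record<string, string>;
  body: unknown;
};

type ApiResponse = {
  status?: number;
  headers?: Record<string, string>;
  body?: unknown;
};

async function onApiRequest(input: ApiRequest): Promise<ApiResponse> {
  if (input.routeKey === "summary") {
    // Omitting `status` assumes the host defaults to 200 (assumption).
    return { body: { issueId: input.params.issueId } };
  }
  if (input.routeKey === "initialize") {
    return { status: 201, body: { issueId: input.params.issueId, created: true } };
  }
  return { status: 404, body: { error: `Unknown route: ${input.routeKey}` } };
}
```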
## Issue Orchestration APIs

Workflow plugins can use `ctx.issues` for orchestration-grade issue operations without importing host server internals.

Expanded create/update fields include blockers, billing code, board or agent assignees, labels, namespaced plugin origins, request depth, and safe execution workspace fields:

```ts
const child = await ctx.issues.create({
  companyId,
  parentId: missionIssueId,
  inheritExecutionWorkspaceFromIssueId: missionIssueId,
  title: "Implement feature slice",
  status: "todo",
  assigneeAgentId: workerAgentId,
  billingCode: "mission:alpha",
  originKind: "plugin:paperclip.missions:feature",
  originId: "mission-alpha:feature-1",
  blockedByIssueIds: [planningIssueId],
});
```

If `originKind` is omitted, the host stores `plugin:<pluginKey>`. Plugins may use sub-kinds such as `plugin:<pluginKey>:feature`, but the host rejects attempts to set another plugin's namespace.

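The namespace rule above amounts to a simple prefix check. The following is a hypothetical sketch of that rule, not the host's actual validation code:

```ts
// Hypothetical sketch of the origin-namespace rule described above: a plugin
// may only set `originKind` values under its own `plugin:<pluginKey>` prefix.
function isAllowedOriginKind(pluginKey: string, originKind: string | undefined): boolean {
  if (originKind === undefined) return true; // host fills in `plugin:<pluginKey>`
  const prefix = `plugin:${pluginKey}`;
  return originKind === prefix || originKind.startsWith(`${prefix}:`);
}

console.log(isAllowedOriginKind("paperclip.missions", "plugin:paperclip.missions:feature")); // true
console.log(isAllowedOriginKind("paperclip.missions", "plugin:other.plugin:feature")); // false
```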
Blocker relationships are also exposed as first-class helpers:
|
||||
|
||||
```ts
|
||||
const relations = await ctx.issues.relations.get(child.id, companyId);
|
||||
await ctx.issues.relations.setBlockedBy(child.id, [planningIssueId], companyId);
|
||||
await ctx.issues.relations.addBlockers(child.id, [validationIssueId], companyId);
|
||||
await ctx.issues.relations.removeBlockers(child.id, [planningIssueId], companyId);
|
||||
```
|
||||
|
||||
Subtree reads can include just the issue tree, or compact related data for orchestration dashboards:
|
||||
|
||||
```ts
|
||||
const subtree = await ctx.issues.getSubtree(missionIssueId, companyId, {
|
||||
includeRoot: true,
|
||||
includeRelations: true,
|
||||
includeDocuments: true,
|
||||
includeActiveRuns: true,
|
||||
includeAssignees: true,
|
||||
});
|
||||
```
|
||||
|
||||
Agent-run actions can assert checkout ownership before mutating in-progress work:
|
||||
|
||||
```ts
|
||||
await ctx.issues.assertCheckoutOwner({
|
||||
issueId,
|
||||
companyId,
|
||||
actorAgentId: runCtx.agentId,
|
||||
actorRunId: runCtx.runId,
|
||||
});
|
||||
```
|
||||
|
||||
Plugins can request assignment wakeups through the host so budget stops, execution locks, blocker checks, and heartbeat policy still apply:
|
||||
|
||||
```ts
|
||||
await ctx.issues.requestWakeup(child.id, companyId, {
|
||||
reason: "mission_advance",
|
||||
contextSource: "missions.advance",
|
||||
});
|
||||
|
||||
await ctx.issues.requestWakeups([featureIssueId, validationIssueId], companyId, {
|
||||
reason: "mission_advance",
|
||||
contextSource: "missions.advance",
|
||||
idempotencyKeyPrefix: `mission:${missionIssueId}:advance`,
|
||||
});
|
||||
```
|
||||
|
||||
Use `ctx.issues.summaries.getOrchestration()` when a workflow needs compact reads across a root issue or subtree:
|
||||
|
||||
```ts
|
||||
const summary = await ctx.issues.summaries.getOrchestration({
|
||||
issueId: missionIssueId,
|
||||
companyId,
|
||||
includeSubtree: true,
|
||||
billingCode: "mission:alpha",
|
||||
});
|
||||
```
|
||||
|
||||
Required capabilities:
|
||||
|
||||
| API | Capability |
|
||||
|-----|------------|
|
||||
| `ctx.issues.relations.get` | `issue.relations.read` |
|
||||
| `ctx.issues.relations.setBlockedBy` / `addBlockers` / `removeBlockers` | `issue.relations.write` |
|
||||
| `ctx.issues.getSubtree` | `issue.subtree.read` |
|
||||
| `ctx.issues.assertCheckoutOwner` | `issues.checkout` |
|
||||
| `ctx.issues.requestWakeup` / `requestWakeups` | `issues.wakeup` |
|
||||
| `ctx.issues.summaries.getOrchestration` | `issues.orchestration.read` |
|
||||
|
||||
Plugin-originated mutations are logged with `actorType: "plugin"`, plus the detail fields `sourcePluginId`, `sourcePluginKey`, `initiatingActorType`, `initiatingActorId`, and `initiatingRunId` when a user or agent run initiated the plugin work.
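The per-method gate behind the capability table can be sketched as follows. This is a minimal sketch assuming a capability set granted by the manifest; `METHOD_CAPABILITIES` below lists only the issue-orchestration entries, and the helper name `gated` echoes the host client's wrapper rather than reproducing its real signature:

```typescript
// Sketch of capability gating for bridge methods (assumption: mirrors the
// METHOD_CAPABILITY_MAP / gated() pattern in the host client handlers).
type Capability = string;

const METHOD_CAPABILITIES: Record<string, Capability | null> = {
  "issues.relations.get": "issue.relations.read",
  "issues.relations.setBlockedBy": "issue.relations.write",
  "issues.getSubtree": "issue.subtree.read",
  "issues.assertCheckoutOwner": "issues.checkout",
  "issues.requestWakeup": "issues.wakeup",
  "issues.summaries.getOrchestration": "issues.orchestration.read",
};

function gated<T>(method: string, granted: Set<Capability>, run: () => T): T {
  const needed = METHOD_CAPABILITIES[method];
  // A null entry means the method needs no specific capability.
  if (needed !== null && needed !== undefined && !granted.has(needed)) {
    throw new Error(`Missing capability ${needed} for ${method}`);
  }
  return run();
}
```
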
## UI quick start

```tsx
@@ -107,30 +107,6 @@ export interface PluginWebhookInput {
  requestId: string;
}

export interface PluginApiRequestInput {
  routeKey: string;
  method: string;
  path: string;
  params: Record<string, string>;
  query: Record<string, string | string[]>;
  body: unknown;
  actor: {
    actorType: "user" | "agent";
    actorId: string;
    agentId?: string | null;
    userId?: string | null;
    runId?: string | null;
  };
  companyId: string;
  headers: Record<string, string>;
}

export interface PluginApiResponse {
  status?: number;
  headers?: Record<string, string>;
  body?: unknown;
}

// ---------------------------------------------------------------------------
// Plugin definition
// ---------------------------------------------------------------------------
@@ -221,13 +197,6 @@ export interface PluginDefinition {
   * @see PLUGIN_SPEC.md §13.7 — `handleWebhook`
   */
  onWebhook?(input: PluginWebhookInput): Promise<void>;

  /**
   * Called for manifest-declared scoped JSON API routes under
   * `/api/plugins/:pluginId/api/*` after the host has enforced auth, company
   * access, capabilities, and checkout policy.
   */
  onApiRequest?(input: PluginApiRequestInput): Promise<PluginApiResponse>;
}

// ---------------------------------------------------------------------------
@@ -97,13 +97,6 @@ export interface HostServices {
    delete(params: WorkerToHostMethods["state.delete"][0]): Promise<void>;
  };

  /** Provides restricted plugin database namespace methods. */
  db: {
    namespace(params: WorkerToHostMethods["db.namespace"][0]): Promise<WorkerToHostMethods["db.namespace"][1]>;
    query(params: WorkerToHostMethods["db.query"][0]): Promise<WorkerToHostMethods["db.query"][1]>;
    execute(params: WorkerToHostMethods["db.execute"][0]): Promise<WorkerToHostMethods["db.execute"][1]>;
  };

  /** Provides `entities.upsert`, `entities.list`. */
  entities: {
    upsert(params: WorkerToHostMethods["entities.upsert"][0]): Promise<WorkerToHostMethods["entities.upsert"][1]>;
@@ -167,21 +160,12 @@ export interface HostServices {
    getWorkspaceForIssue(params: WorkerToHostMethods["projects.getWorkspaceForIssue"][0]): Promise<WorkerToHostMethods["projects.getWorkspaceForIssue"][1]>;
  };

  /** Provides issue read/write, relation, checkout, wakeup, summary, comment methods. */
  /** Provides `issues.list`, `issues.get`, `issues.create`, `issues.update`, `issues.listComments`, `issues.createComment`. */
  issues: {
    list(params: WorkerToHostMethods["issues.list"][0]): Promise<WorkerToHostMethods["issues.list"][1]>;
    get(params: WorkerToHostMethods["issues.get"][0]): Promise<WorkerToHostMethods["issues.get"][1]>;
    create(params: WorkerToHostMethods["issues.create"][0]): Promise<WorkerToHostMethods["issues.create"][1]>;
    update(params: WorkerToHostMethods["issues.update"][0]): Promise<WorkerToHostMethods["issues.update"][1]>;
    getRelations(params: WorkerToHostMethods["issues.relations.get"][0]): Promise<WorkerToHostMethods["issues.relations.get"][1]>;
    setBlockedBy(params: WorkerToHostMethods["issues.relations.setBlockedBy"][0]): Promise<WorkerToHostMethods["issues.relations.setBlockedBy"][1]>;
    addBlockers(params: WorkerToHostMethods["issues.relations.addBlockers"][0]): Promise<WorkerToHostMethods["issues.relations.addBlockers"][1]>;
    removeBlockers(params: WorkerToHostMethods["issues.relations.removeBlockers"][0]): Promise<WorkerToHostMethods["issues.relations.removeBlockers"][1]>;
    assertCheckoutOwner(params: WorkerToHostMethods["issues.assertCheckoutOwner"][0]): Promise<WorkerToHostMethods["issues.assertCheckoutOwner"][1]>;
    getSubtree(params: WorkerToHostMethods["issues.getSubtree"][0]): Promise<WorkerToHostMethods["issues.getSubtree"][1]>;
    requestWakeup(params: WorkerToHostMethods["issues.requestWakeup"][0]): Promise<WorkerToHostMethods["issues.requestWakeup"][1]>;
    requestWakeups(params: WorkerToHostMethods["issues.requestWakeups"][0]): Promise<WorkerToHostMethods["issues.requestWakeups"][1]>;
    getOrchestrationSummary(params: WorkerToHostMethods["issues.summaries.getOrchestration"][0]): Promise<WorkerToHostMethods["issues.summaries.getOrchestration"][1]>;
    listComments(params: WorkerToHostMethods["issues.listComments"][0]): Promise<WorkerToHostMethods["issues.listComments"][1]>;
    createComment(params: WorkerToHostMethods["issues.createComment"][0]): Promise<WorkerToHostMethods["issues.createComment"][1]>;
  };
@@ -285,10 +269,6 @@ const METHOD_CAPABILITY_MAP: Record<WorkerToHostMethodName, PluginCapability | n
  "state.set": "plugin.state.write",
  "state.delete": "plugin.state.write",

  "db.namespace": "database.namespace.read",
  "db.query": "database.namespace.read",
  "db.execute": "database.namespace.write",

  // Entities — no specific capability required (plugin-scoped by design)
  "entities.upsert": null,
  "entities.list": null,
@@ -331,15 +311,6 @@ const METHOD_CAPABILITY_MAP: Record<WorkerToHostMethodName, PluginCapability | n
  "issues.get": "issues.read",
  "issues.create": "issues.create",
  "issues.update": "issues.update",
  "issues.relations.get": "issue.relations.read",
  "issues.relations.setBlockedBy": "issue.relations.write",
  "issues.relations.addBlockers": "issue.relations.write",
  "issues.relations.removeBlockers": "issue.relations.write",
  "issues.assertCheckoutOwner": "issues.checkout",
  "issues.getSubtree": "issue.subtree.read",
  "issues.requestWakeup": "issues.wakeup",
  "issues.requestWakeups": "issues.wakeup",
  "issues.summaries.getOrchestration": "issues.orchestration.read",
  "issues.listComments": "issue.comments.read",
  "issues.createComment": "issue.comments.create",
@@ -448,16 +419,6 @@ export function createHostClientHandlers(
      return services.state.delete(params);
    }),

    "db.namespace": gated("db.namespace", async (params) => {
      return services.db.namespace(params);
    }),
    "db.query": gated("db.query", async (params) => {
      return services.db.query(params);
    }),
    "db.execute": gated("db.execute", async (params) => {
      return services.db.execute(params);
    }),

    // Entities
    "entities.upsert": gated("entities.upsert", async (params) => {
      return services.entities.upsert(params);
@@ -542,33 +503,6 @@ export function createHostClientHandlers(
    "issues.update": gated("issues.update", async (params) => {
      return services.issues.update(params);
    }),
    "issues.relations.get": gated("issues.relations.get", async (params) => {
      return services.issues.getRelations(params);
    }),
    "issues.relations.setBlockedBy": gated("issues.relations.setBlockedBy", async (params) => {
      return services.issues.setBlockedBy(params);
    }),
    "issues.relations.addBlockers": gated("issues.relations.addBlockers", async (params) => {
      return services.issues.addBlockers(params);
    }),
    "issues.relations.removeBlockers": gated("issues.relations.removeBlockers", async (params) => {
      return services.issues.removeBlockers(params);
    }),
    "issues.assertCheckoutOwner": gated("issues.assertCheckoutOwner", async (params) => {
      return services.issues.assertCheckoutOwner(params);
    }),
    "issues.getSubtree": gated("issues.getSubtree", async (params) => {
      return services.issues.getSubtree(params);
    }),
    "issues.requestWakeup": gated("issues.requestWakeup", async (params) => {
      return services.issues.requestWakeup(params);
    }),
    "issues.requestWakeups": gated("issues.requestWakeups", async (params) => {
      return services.issues.requestWakeups(params);
    }),
    "issues.summaries.getOrchestration": gated("issues.summaries.getOrchestration", async (params) => {
      return services.issues.getOrchestrationSummary(params);
    }),
    "issues.listComments": gated("issues.listComments", async (params) => {
      return services.issues.listComments(params);
    }),
@@ -95,8 +95,6 @@ export type {
  PluginHealthDiagnostics,
  PluginConfigValidationResult,
  PluginWebhookInput,
  PluginApiRequestInput,
  PluginApiResponse,
} from "./define-plugin.js";
export type {
  TestHarness,
@@ -173,22 +171,6 @@ export type {
  PluginProjectsClient,
  PluginCompaniesClient,
  PluginIssuesClient,
  PluginIssueMutationActor,
  PluginIssueRelationsClient,
  PluginIssueRelationSummary,
  PluginIssueCheckoutOwnership,
  PluginIssueWakeupResult,
  PluginIssueWakeupBatchResult,
  PluginIssueRunSummary,
  PluginIssueApprovalSummary,
  PluginIssueCostSummary,
  PluginBudgetIncidentSummary,
  PluginIssueInvocationBlockSummary,
  PluginIssueOrchestrationSummary,
  PluginIssueSubtreeOptions,
  PluginIssueAssigneeSummary,
  PluginIssueSubtree,
  PluginIssueSummariesClient,
  PluginAgentsClient,
  PluginAgentSessionsClient,
  AgentSession,
@@ -221,10 +203,8 @@ export type {
  Project,
  Issue,
  IssueComment,
  IssueDocumentSummary,
  Agent,
  Goal,
  PluginDatabaseClient,
} from "./types.js";

// Manifest and constant types re-exported from @paperclipai/shared
@@ -241,12 +221,7 @@ export type {
  PluginLauncherRenderDeclaration,
  PluginLauncherDeclaration,
  PluginMinimumHostVersion,
  PluginDatabaseDeclaration,
  PluginApiRouteCompanyResolution,
  PluginApiRouteDeclaration,
  PluginRecord,
  PluginDatabaseNamespaceRecord,
  PluginMigrationRecord,
  PluginConfig,
  JsonSchema,
  PluginStatus,
@@ -263,13 +238,6 @@ export type {
  PluginJobRunStatus,
  PluginJobRunTrigger,
  PluginWebhookDeliveryStatus,
  PluginDatabaseCoreReadTable,
  PluginDatabaseMigrationStatus,
  PluginDatabaseNamespaceMode,
  PluginDatabaseNamespaceStatus,
  PluginApiRouteAuthMode,
  PluginApiRouteCheckoutPolicy,
  PluginApiRouteMethod,
  PluginEventType,
  PluginBridgeErrorCode,
} from "./types.js";
@@ -34,12 +34,6 @@ export type { PluginLauncherRenderContextSnapshot } from "@paperclipai/shared";

import type {
  PluginEvent,
  PluginIssueCheckoutOwnership,
  PluginIssueOrchestrationSummary,
  PluginIssueRelationSummary,
  PluginIssueSubtree,
  PluginIssueWakeupBatchResult,
  PluginIssueWakeupResult,
  PluginJobContext,
  PluginWorkspace,
  ToolRunContext,
@@ -47,8 +41,6 @@ import type {
} from "./types.js";
import type {
  PluginHealthDiagnostics,
  PluginApiRequestInput,
  PluginApiResponse,
  PluginConfigValidationResult,
  PluginWebhookInput,
} from "./define-plugin.js";
@@ -227,8 +219,6 @@ export interface InitializeParams {
  };
  /** Host API version. */
  apiVersion: number;
  /** Host-derived plugin database namespace, when the manifest declares database access. */
  databaseNamespace?: string | null;
}

/**
@@ -384,8 +374,6 @@ export interface HostToWorkerMethods {
  runJob: [params: RunJobParams, result: void];
  /** @see PLUGIN_SPEC.md §13.7 */
  handleWebhook: [params: PluginWebhookInput, result: void];
  /** Scoped plugin API route dispatch. */
  handleApiRequest: [params: PluginApiRequestInput, result: PluginApiResponse];
  /** @see PLUGIN_SPEC.md §13.8 */
  getData: [params: GetDataParams, result: unknown];
  /** @see PLUGIN_SPEC.md §13.9 */
@@ -411,7 +399,6 @@ export const HOST_TO_WORKER_OPTIONAL_METHODS: readonly HostToWorkerMethodName[]
  "onEvent",
  "runJob",
  "handleWebhook",
  "handleApiRequest",
  "getData",
  "performAction",
  "executeTool",
@@ -445,20 +432,6 @@ export interface WorkerToHostMethods {
    result: void,
  ];

  // Restricted plugin database namespace
  "db.namespace": [
    params: Record<string, never>,
    result: string,
  ];
  "db.query": [
    params: { sql: string; params?: unknown[] },
    result: unknown[],
  ];
  "db.execute": [
    params: { sql: string; params?: unknown[] },
    result: { rowCount: number },
  ];

  // Entities
  "entities.upsert": [
    params: {
@@ -596,8 +569,6 @@ export interface WorkerToHostMethods {
      companyId: string;
      projectId?: string;
      assigneeAgentId?: string;
      originKind?: string;
      originId?: string;
      status?: string;
      limit?: number;
      offset?: number;
@@ -617,23 +588,8 @@ export interface WorkerToHostMethods {
      inheritExecutionWorkspaceFromIssueId?: string;
      title: string;
      description?: string;
      status?: string;
      priority?: string;
      assigneeAgentId?: string;
      assigneeUserId?: string | null;
      requestDepth?: number;
      billingCode?: string | null;
      originKind?: string | null;
      originId?: string | null;
      originRunId?: string | null;
      blockedByIssueIds?: string[];
      labelIds?: string[];
      executionWorkspaceId?: string | null;
      executionWorkspacePreference?: string | null;
      executionWorkspaceSettings?: Record<string, unknown> | null;
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: Issue,
  ];
@@ -645,99 +601,6 @@ export interface WorkerToHostMethods {
    },
    result: Issue,
  ];
  "issues.relations.get": [
    params: { issueId: string; companyId: string },
    result: PluginIssueRelationSummary,
  ];
  "issues.relations.setBlockedBy": [
    params: {
      issueId: string;
      companyId: string;
      blockedByIssueIds: string[];
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: PluginIssueRelationSummary,
  ];
  "issues.relations.addBlockers": [
    params: {
      issueId: string;
      companyId: string;
      blockerIssueIds: string[];
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: PluginIssueRelationSummary,
  ];
  "issues.relations.removeBlockers": [
    params: {
      issueId: string;
      companyId: string;
      blockerIssueIds: string[];
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: PluginIssueRelationSummary,
  ];
  "issues.assertCheckoutOwner": [
    params: {
      issueId: string;
      companyId: string;
      actorAgentId: string;
      actorRunId: string;
    },
    result: PluginIssueCheckoutOwnership,
  ];
  "issues.getSubtree": [
    params: {
      issueId: string;
      companyId: string;
      includeRoot?: boolean;
      includeRelations?: boolean;
      includeDocuments?: boolean;
      includeActiveRuns?: boolean;
      includeAssignees?: boolean;
    },
    result: PluginIssueSubtree,
  ];
  "issues.requestWakeup": [
    params: {
      issueId: string;
      companyId: string;
      reason?: string;
      contextSource?: string;
      idempotencyKey?: string | null;
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: PluginIssueWakeupResult,
  ];
  "issues.requestWakeups": [
    params: {
      issueIds: string[];
      companyId: string;
      reason?: string;
      contextSource?: string;
      idempotencyKeyPrefix?: string | null;
      actorAgentId?: string | null;
      actorUserId?: string | null;
      actorRunId?: string | null;
    },
    result: PluginIssueWakeupBatchResult[],
  ];
  "issues.summaries.getOrchestration": [
    params: {
      issueId: string;
      companyId: string;
      includeSubtree?: boolean;
      billingCode?: string | null;
    },
    result: PluginIssueOrchestrationSummary,
  ];
  "issues.listComments": [
    params: { issueId: string; companyId: string },
    result: IssueComment[],
@@ -3,12 +3,10 @@ import type {
  PaperclipPluginManifestV1,
  PluginCapability,
  PluginEventType,
  PluginIssueOriginKind,
  Company,
  Project,
  Issue,
  IssueComment,
  IssueDocument,
  Agent,
  Goal,
} from "@paperclipai/shared";
@@ -74,8 +72,6 @@ export interface TestHarness {
  activity: Array<{ message: string; entityType?: string; entityId?: string; metadata?: Record<string, unknown> }>;
  metrics: Array<{ name: string; value: number; tags?: Record<string, string> }>;
  telemetry: Array<{ eventName: string; dimensions?: Record<string, string | number | boolean> }>;
  dbQueries: Array<{ sql: string; params?: unknown[] }>;
  dbExecutes: Array<{ sql: string; params?: unknown[] }>;
}

type EventRegistration = {
@@ -138,8 +134,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
  const activity: TestHarness["activity"] = [];
  const metrics: TestHarness["metrics"] = [];
  const telemetry: TestHarness["telemetry"] = [];
  const dbQueries: TestHarness["dbQueries"] = [];
  const dbExecutes: TestHarness["dbExecutes"] = [];

  const state = new Map<string, unknown>();
  const entities = new Map<string, PluginEntityRecord>();
@@ -147,9 +141,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
  const companies = new Map<string, Company>();
  const projects = new Map<string, Project>();
  const issues = new Map<string, Issue>();
  const blockedByIssueIds = new Map<string, string[]>();
  const issueComments = new Map<string, IssueComment[]>();
  const issueDocuments = new Map<string, IssueDocument>();
  const agents = new Map<string, Agent>();
  const goals = new Map<string, Goal>();
  const projectWorkspaces = new Map<string, PluginWorkspace[]>();
@@ -164,42 +156,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
  const actionHandlers = new Map<string, (params: Record<string, unknown>) => Promise<unknown>>();
  const toolHandlers = new Map<string, (params: unknown, runCtx: ToolRunContext) => Promise<ToolResult>>();

  function issueRelationSummary(issueId: string) {
    const issue = issues.get(issueId);
    if (!issue) throw new Error(`Issue not found: ${issueId}`);
    const summarize = (candidateId: string) => {
      const related = issues.get(candidateId);
      if (!related || related.companyId !== issue.companyId) return null;
      return {
        id: related.id,
        identifier: related.identifier,
        title: related.title,
        status: related.status,
        priority: related.priority,
        assigneeAgentId: related.assigneeAgentId,
        assigneeUserId: related.assigneeUserId,
      };
    };
    const blockedBy = (blockedByIssueIds.get(issueId) ?? [])
      .map(summarize)
      .filter((value): value is NonNullable<typeof value> => value !== null);
    const blocks = [...blockedByIssueIds.entries()]
      .filter(([, blockers]) => blockers.includes(issueId))
      .map(([blockedIssueId]) => summarize(blockedIssueId))
      .filter((value): value is NonNullable<typeof value> => value !== null);
    return { blockedBy, blocks };
  }

  const defaultPluginOriginKind: PluginIssueOriginKind = `plugin:${manifest.id}`;
  function normalizePluginOriginKind(originKind: unknown = defaultPluginOriginKind): PluginIssueOriginKind {
    if (originKind == null || originKind === "") return defaultPluginOriginKind;
    if (typeof originKind !== "string") throw new Error("Plugin issue originKind must be a string");
    if (originKind === defaultPluginOriginKind || originKind.startsWith(`${defaultPluginOriginKind}:`)) {
      return originKind as PluginIssueOriginKind;
    }
    throw new Error(`Plugin may only use originKind values under ${defaultPluginOriginKind}`);
  }

  const ctx: PluginContext = {
    manifest,
    config: {
@@ -239,19 +195,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
        launchers.set(launcher.id, launcher);
      },
    },
    db: {
      namespace: manifest.database ? `test_${manifest.id.replace(/[^a-z0-9_]+/g, "_")}` : "",
      async query(sql, params) {
        requireCapability(manifest, capabilitySet, "database.namespace.read");
        dbQueries.push({ sql, params });
        return [];
      },
      async execute(sql, params) {
        requireCapability(manifest, capabilitySet, "database.namespace.write");
        dbExecutes.push({ sql, params });
        return { rowCount: 0 };
      },
    },
    http: {
      async fetch(url, init) {
        requireCapability(manifest, capabilitySet, "http.outbound");
@@ -395,11 +338,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
        out = out.filter((issue) => issue.companyId === companyId);
        if (input?.projectId) out = out.filter((issue) => issue.projectId === input.projectId);
        if (input?.assigneeAgentId) out = out.filter((issue) => issue.assigneeAgentId === input.assigneeAgentId);
        if (input?.originKind) {
          if (input.originKind.startsWith("plugin:")) normalizePluginOriginKind(input.originKind);
          out = out.filter((issue) => issue.originKind === input.originKind);
        }
        if (input?.originId) out = out.filter((issue) => issue.originId === input.originId);
        if (input?.status) out = out.filter((issue) => issue.status === input.status);
        if (input?.offset) out = out.slice(input.offset);
        if (input?.limit) out = out.slice(0, input.limit);
@@ -422,10 +360,10 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
        parentId: input.parentId ?? null,
        title: input.title,
        description: input.description ?? null,
        status: input.status ?? "todo",
        status: "todo",
        priority: input.priority ?? "medium",
        assigneeAgentId: input.assigneeAgentId ?? null,
        assigneeUserId: input.assigneeUserId ?? null,
        assigneeUserId: null,
        checkoutRunId: null,
        executionRunId: null,
        executionAgentNameKey: null,
@@ -434,15 +372,12 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
        createdByUserId: null,
        issueNumber: null,
        identifier: null,
        originKind: normalizePluginOriginKind(input.originKind),
        originId: input.originId ?? null,
        originRunId: input.originRunId ?? null,
        requestDepth: input.requestDepth ?? 0,
        billingCode: input.billingCode ?? null,
        requestDepth: 0,
        billingCode: null,
        assigneeAdapterOverrides: null,
        executionWorkspaceId: input.executionWorkspaceId ?? null,
        executionWorkspacePreference: input.executionWorkspacePreference ?? null,
        executionWorkspaceSettings: input.executionWorkspaceSettings ?? null,
        executionWorkspaceId: null,
        executionWorkspacePreference: null,
        executionWorkspaceSettings: null,
        startedAt: null,
        completedAt: null,
        cancelledAt: null,
@@ -451,75 +386,20 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
        updatedAt: now,
      };
      issues.set(record.id, record);
      if (input.blockedByIssueIds) blockedByIssueIds.set(record.id, [...new Set(input.blockedByIssueIds)]);
      return record;
    },
    async update(issueId, patch, companyId) {
      requireCapability(manifest, capabilitySet, "issues.update");
      const record = issues.get(issueId);
      if (!isInCompany(record, companyId)) throw new Error(`Issue not found: ${issueId}`);
      const { blockedByIssueIds: nextBlockedByIssueIds, ...issuePatch } = patch;
      if (issuePatch.originKind !== undefined) {
        issuePatch.originKind = normalizePluginOriginKind(issuePatch.originKind);
      }
      const updated: Issue = {
        ...record,
        ...issuePatch,
        ...patch,
        updatedAt: new Date(),
      };
      issues.set(issueId, updated);
      if (nextBlockedByIssueIds !== undefined) {
        blockedByIssueIds.set(issueId, [...new Set(nextBlockedByIssueIds)]);
      }
      return updated;
    },
    async assertCheckoutOwner(input) {
      requireCapability(manifest, capabilitySet, "issues.checkout");
      const record = issues.get(input.issueId);
      if (!isInCompany(record, input.companyId)) throw new Error(`Issue not found: ${input.issueId}`);
      if (
        record.status !== "in_progress" ||
        record.assigneeAgentId !== input.actorAgentId ||
        (record.checkoutRunId !== null && record.checkoutRunId !== input.actorRunId)
      ) {
        throw new Error("Issue run ownership conflict");
      }
      return {
        issueId: record.id,
        status: record.status,
        assigneeAgentId: record.assigneeAgentId,
        checkoutRunId: record.checkoutRunId,
        adoptedFromRunId: null,
      };
    },
    async requestWakeup(issueId, companyId) {
      requireCapability(manifest, capabilitySet, "issues.wakeup");
      const record = issues.get(issueId);
      if (!isInCompany(record, companyId)) throw new Error(`Issue not found: ${issueId}`);
      if (!record.assigneeAgentId) throw new Error("Issue has no assigned agent to wake");
      if (["backlog", "done", "cancelled"].includes(record.status)) {
        throw new Error(`Issue is not wakeable in status: ${record.status}`);
      }
      const unresolved = issueRelationSummary(issueId).blockedBy.filter((blocker) => blocker.status !== "done");
      if (unresolved.length > 0) throw new Error("Issue is blocked by unresolved blockers");
      return { queued: true, runId: randomUUID() };
    },
    async requestWakeups(issueIds, companyId) {
      requireCapability(manifest, capabilitySet, "issues.wakeup");
      const results = [];
      for (const issueId of issueIds) {
        const record = issues.get(issueId);
        if (!isInCompany(record, companyId)) throw new Error(`Issue not found: ${issueId}`);
        if (!record.assigneeAgentId) throw new Error("Issue has no assigned agent to wake");
        if (["backlog", "done", "cancelled"].includes(record.status)) {
          throw new Error(`Issue is not wakeable in status: ${record.status}`);
        }
        const unresolved = issueRelationSummary(issueId).blockedBy.filter((blocker) => blocker.status !== "done");
        if (unresolved.length > 0) throw new Error("Issue is blocked by unresolved blockers");
        results.push({ issueId, queued: true, runId: randomUUID() });
      }
      return results;
    },
    async listComments(issueId, companyId) {
      requireCapability(manifest, capabilitySet, "issue.comments.read");
      if (!isInCompany(issues.get(issueId), companyId)) return [];
@@ -551,14 +431,12 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
|
||||
async list(issueId, companyId) {
|
||||
requireCapability(manifest, capabilitySet, "issue.documents.read");
|
||||
if (!isInCompany(issues.get(issueId), companyId)) return [];
|
||||
return [...issueDocuments.values()]
|
||||
.filter((document) => document.issueId === issueId && document.companyId === companyId)
.map(({ body: _body, ...summary }) => summary);
return [];
},
async get(issueId, key, companyId) {
async get(issueId, _key, companyId) {
requireCapability(manifest, capabilitySet, "issue.documents.read");
if (!isInCompany(issues.get(issueId), companyId)) return null;
return issueDocuments.get(`${issueId}|${key}`) ?? null;
return null;
},
async upsert(input) {
requireCapability(manifest, capabilitySet, "issue.documents.write");
@@ -566,27 +444,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
if (!isInCompany(parentIssue, input.companyId)) {
throw new Error(`Issue not found: ${input.issueId}`);
}
const now = new Date();
const existing = issueDocuments.get(`${input.issueId}|${input.key}`);
const document: IssueDocument = {
id: existing?.id ?? randomUUID(),
companyId: input.companyId,
issueId: input.issueId,
key: input.key,
title: input.title ?? existing?.title ?? null,
format: "markdown",
latestRevisionId: randomUUID(),
latestRevisionNumber: (existing?.latestRevisionNumber ?? 0) + 1,
createdByAgentId: existing?.createdByAgentId ?? null,
createdByUserId: existing?.createdByUserId ?? null,
updatedByAgentId: null,
updatedByUserId: null,
createdAt: existing?.createdAt ?? now,
updatedAt: now,
body: input.body,
};
issueDocuments.set(`${input.issueId}|${input.key}`, document);
return document;
throw new Error("documents.upsert is not implemented in test context");
},
async delete(issueId, _key, companyId) {
requireCapability(manifest, capabilitySet, "issue.documents.write");
@@ -594,104 +452,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
if (!isInCompany(parentIssue, companyId)) {
throw new Error(`Issue not found: ${issueId}`);
}
issueDocuments.delete(`${issueId}|${_key}`);
},
},
relations: {
async get(issueId, companyId) {
requireCapability(manifest, capabilitySet, "issue.relations.read");
if (!isInCompany(issues.get(issueId), companyId)) throw new Error(`Issue not found: ${issueId}`);
return issueRelationSummary(issueId);
},
async setBlockedBy(issueId, nextBlockedByIssueIds, companyId) {
requireCapability(manifest, capabilitySet, "issue.relations.write");
if (!isInCompany(issues.get(issueId), companyId)) throw new Error(`Issue not found: ${issueId}`);
blockedByIssueIds.set(issueId, [...new Set(nextBlockedByIssueIds)]);
return issueRelationSummary(issueId);
},
async addBlockers(issueId, blockerIssueIds, companyId) {
requireCapability(manifest, capabilitySet, "issue.relations.write");
if (!isInCompany(issues.get(issueId), companyId)) throw new Error(`Issue not found: ${issueId}`);
const next = new Set(blockedByIssueIds.get(issueId) ?? []);
for (const blockerIssueId of blockerIssueIds) next.add(blockerIssueId);
blockedByIssueIds.set(issueId, [...next]);
return issueRelationSummary(issueId);
},
async removeBlockers(issueId, blockerIssueIds, companyId) {
requireCapability(manifest, capabilitySet, "issue.relations.write");
if (!isInCompany(issues.get(issueId), companyId)) throw new Error(`Issue not found: ${issueId}`);
const removals = new Set(blockerIssueIds);
blockedByIssueIds.set(
issueId,
(blockedByIssueIds.get(issueId) ?? []).filter((blockerIssueId) => !removals.has(blockerIssueId)),
);
return issueRelationSummary(issueId);
},
},
async getSubtree(issueId, companyId, options) {
requireCapability(manifest, capabilitySet, "issue.subtree.read");
const root = issues.get(issueId);
if (!isInCompany(root, companyId)) throw new Error(`Issue not found: ${issueId}`);
const includeRoot = options?.includeRoot !== false;
const allIds = [root.id];
let frontier = [root.id];
while (frontier.length > 0) {
const children = [...issues.values()]
.filter((issue) => issue.companyId === companyId && frontier.includes(issue.parentId ?? ""))
.map((issue) => issue.id)
.filter((id) => !allIds.includes(id));
allIds.push(...children);
frontier = children;
}
const issueIds = includeRoot ? allIds : allIds.filter((id) => id !== root.id);
const subtreeIssues = issueIds.map((id) => issues.get(id)).filter((candidate): candidate is Issue => Boolean(candidate));
return {
rootIssueId: root.id,
companyId,
issueIds,
issues: subtreeIssues,
...(options?.includeRelations
? { relations: Object.fromEntries(issueIds.map((id) => [id, issueRelationSummary(id)])) }
: {}),
...(options?.includeDocuments ? { documents: Object.fromEntries(issueIds.map((id) => [id, []])) } : {}),
...(options?.includeActiveRuns ? { activeRuns: Object.fromEntries(issueIds.map((id) => [id, []])) } : {}),
...(options?.includeAssignees ? { assignees: {} } : {}),
};
},
summaries: {
async getOrchestration(input) {
requireCapability(manifest, capabilitySet, "issues.orchestration.read");
const root = issues.get(input.issueId);
if (!isInCompany(root, input.companyId)) throw new Error(`Issue not found: ${input.issueId}`);
const subtreeIssueIds = [root.id];
if (input.includeSubtree) {
let frontier = [root.id];
while (frontier.length > 0) {
const children = [...issues.values()]
.filter((issue) => issue.companyId === input.companyId && frontier.includes(issue.parentId ?? ""))
.map((issue) => issue.id)
.filter((id) => !subtreeIssueIds.includes(id));
subtreeIssueIds.push(...children);
frontier = children;
}
}
return {
issueId: root.id,
companyId: input.companyId,
subtreeIssueIds,
relations: Object.fromEntries(subtreeIssueIds.map((id) => [id, issueRelationSummary(id)])),
approvals: [],
runs: [],
costs: {
costCents: 0,
inputTokens: 0,
cachedInputTokens: 0,
outputTokens: 0,
billingCode: input.billingCode ?? null,
},
openBudgetIncidents: [],
invocationBlocks: [],
};
},
},
},
@@ -900,12 +660,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
seed(input) {
for (const row of input.companies ?? []) companies.set(row.id, row);
for (const row of input.projects ?? []) projects.set(row.id, row);
for (const row of input.issues ?? []) {
issues.set(row.id, row);
if (row.blockedBy) {
blockedByIssueIds.set(row.id, row.blockedBy.map((blocker) => blocker.id));
}
}
for (const row of input.issues ?? []) issues.set(row.id, row);
for (const row of input.issueComments ?? []) {
const list = issueComments.get(row.issueId) ?? [];
list.push(row);
@@ -983,8 +738,6 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
activity,
metrics,
telemetry,
dbQueries,
dbExecutes,
};

return harness;
@@ -21,8 +21,6 @@ import type {
IssueComment,
IssueDocument,
IssueDocumentSummary,
IssueRelationIssueSummary,
PluginIssueOriginKind,
Agent,
Goal,
} from "@paperclipai/shared";
@@ -42,12 +40,7 @@ export type {
PluginLauncherRenderDeclaration,
PluginLauncherDeclaration,
PluginMinimumHostVersion,
PluginDatabaseDeclaration,
PluginApiRouteDeclaration,
PluginApiRouteCompanyResolution,
PluginRecord,
PluginDatabaseNamespaceRecord,
PluginMigrationRecord,
PluginConfig,
JsonSchema,
PluginStatus,
@@ -64,13 +57,6 @@ export type {
PluginJobRunStatus,
PluginJobRunTrigger,
PluginWebhookDeliveryStatus,
PluginDatabaseCoreReadTable,
PluginDatabaseMigrationStatus,
PluginDatabaseNamespaceMode,
PluginDatabaseNamespaceStatus,
PluginApiRouteAuthMode,
PluginApiRouteCheckoutPolicy,
PluginApiRouteMethod,
PluginEventType,
PluginBridgeErrorCode,
Company,
@@ -79,8 +65,6 @@ export type {
IssueComment,
IssueDocument,
IssueDocumentSummary,
IssueRelationIssueSummary,
PluginIssueOriginKind,
Agent,
Goal,
} from "@paperclipai/shared";
@@ -423,17 +407,6 @@ export interface PluginLaunchersClient {
register(launcher: PluginLauncherRegistration): void;
}
export interface PluginDatabaseClient {
/** Host-derived PostgreSQL schema name for this plugin's namespace. */
namespace: string;

/** Run a restricted SELECT against the plugin namespace and whitelisted core tables. */
query<T = Record<string, unknown>>(sql: string, params?: unknown[]): Promise<T[]>;

/** Run a restricted INSERT, UPDATE, or DELETE against the plugin namespace. */
execute(sql: string, params?: unknown[]): Promise<{ rowCount: number }>;
}

/**
* `ctx.http` — make outbound HTTP requests.
*
@@ -894,178 +867,6 @@ export interface PluginIssueDocumentsClient {
delete(issueId: string, key: string, companyId: string): Promise<void>;
}
export interface PluginIssueMutationActor {
/** Agent that initiated the plugin operation, when the plugin is acting from an agent run. */
actorAgentId?: string | null;
/** Board/user that initiated the plugin operation, when known. */
actorUserId?: string | null;
/** Heartbeat run that initiated the operation. Required for checkout-aware agent actions. */
actorRunId?: string | null;
}

export interface PluginIssueRelationSummary {
blockedBy: IssueRelationIssueSummary[];
blocks: IssueRelationIssueSummary[];
}

export interface PluginIssueRelationsClient {
/** Read blocker relationships for an issue. Requires `issue.relations.read`. */
get(issueId: string, companyId: string): Promise<PluginIssueRelationSummary>;
/** Replace the issue's blocked-by relation set. Requires `issue.relations.write`. */
setBlockedBy(
issueId: string,
blockedByIssueIds: string[],
companyId: string,
actor?: PluginIssueMutationActor,
): Promise<PluginIssueRelationSummary>;
/** Add one or more blockers while preserving existing blockers. Requires `issue.relations.write`. */
addBlockers(
issueId: string,
blockerIssueIds: string[],
companyId: string,
actor?: PluginIssueMutationActor,
): Promise<PluginIssueRelationSummary>;
/** Remove one or more blockers while preserving all other blockers. Requires `issue.relations.write`. */
removeBlockers(
issueId: string,
blockerIssueIds: string[],
companyId: string,
actor?: PluginIssueMutationActor,
): Promise<PluginIssueRelationSummary>;
}

export interface PluginIssueCheckoutOwnership {
issueId: string;
status: Issue["status"];
assigneeAgentId: string | null;
checkoutRunId: string | null;
adoptedFromRunId: string | null;
}

export interface PluginIssueWakeupResult {
queued: boolean;
runId: string | null;
}

export interface PluginIssueWakeupBatchResult {
issueId: string;
queued: boolean;
runId: string | null;
}

export interface PluginIssueRunSummary {
id: string;
issueId: string | null;
agentId: string;
status: string;
invocationSource: string;
triggerDetail: string | null;
startedAt: string | null;
finishedAt: string | null;
error: string | null;
createdAt: string;
}

export interface PluginIssueApprovalSummary {
issueId: string;
id: string;
type: string;
status: string;
requestedByAgentId: string | null;
requestedByUserId: string | null;
decidedByUserId: string | null;
decidedAt: string | null;
createdAt: string;
}

export interface PluginIssueCostSummary {
costCents: number;
inputTokens: number;
cachedInputTokens: number;
outputTokens: number;
billingCode: string | null;
}

export interface PluginBudgetIncidentSummary {
id: string;
scopeType: string;
scopeId: string;
metric: string;
windowKind: string;
thresholdType: string;
amountLimit: number;
amountObserved: number;
status: string;
approvalId: string | null;
createdAt: string;
}

export interface PluginIssueInvocationBlockSummary {
issueId: string;
agentId: string;
scopeType: "company" | "agent" | "project";
scopeId: string;
scopeName: string;
reason: string;
}

export interface PluginIssueOrchestrationSummary {
issueId: string;
companyId: string;
subtreeIssueIds: string[];
relations: Record<string, PluginIssueRelationSummary>;
approvals: PluginIssueApprovalSummary[];
runs: PluginIssueRunSummary[];
costs: PluginIssueCostSummary;
openBudgetIncidents: PluginBudgetIncidentSummary[];
invocationBlocks: PluginIssueInvocationBlockSummary[];
}

export interface PluginIssueSubtreeOptions {
/** Include the root issue in the result. Defaults to true. */
includeRoot?: boolean;
/** Include blocker relationship summaries keyed by issue ID. */
includeRelations?: boolean;
/** Include issue document summaries keyed by issue ID. */
includeDocuments?: boolean;
/** Include queued/running heartbeat runs keyed by issue ID. */
includeActiveRuns?: boolean;
/** Include assignee summaries keyed by agent ID. */
includeAssignees?: boolean;
}

export interface PluginIssueAssigneeSummary {
id: string;
name: string;
role: string;
title: string | null;
status: Agent["status"];
}

export interface PluginIssueSubtree {
rootIssueId: string;
companyId: string;
issueIds: string[];
issues: Issue[];
relations?: Record<string, PluginIssueRelationSummary>;
documents?: Record<string, IssueDocumentSummary[]>;
activeRuns?: Record<string, PluginIssueRunSummary[]>;
assignees?: Record<string, PluginIssueAssigneeSummary>;
}

export interface PluginIssueSummariesClient {
/**
* Read the compact orchestration inputs a workflow plugin needs for an
* issue or issue subtree. Requires `issues.orchestration.read`.
*/
getOrchestration(input: {
issueId: string;
companyId: string;
includeSubtree?: boolean;
billingCode?: string | null;
}): Promise<PluginIssueOrchestrationSummary>;
}
/**
* `ctx.issues` — read and mutate issues plus comments.
*
@@ -1073,9 +874,6 @@ export interface PluginIssueSummariesClient {
* - `issues.read` for read operations
* - `issues.create` for create
* - `issues.update` for update
* - `issues.checkout` for checkout ownership assertions
* - `issues.wakeup` for assignment wakeup requests
* - `issues.orchestration.read` for orchestration summaries
* - `issue.comments.read` for `listComments`
* - `issue.comments.create` for `createComment`
* - `issue.documents.read` for `documents.list` and `documents.get`
@@ -1086,8 +884,6 @@ export interface PluginIssuesClient {
companyId: string;
projectId?: string;
assigneeAgentId?: string;
originKind?: PluginIssueOriginKind;
originId?: string;
status?: Issue["status"];
limit?: number;
offset?: number;
@@ -1101,80 +897,17 @@ export interface PluginIssuesClient {
inheritExecutionWorkspaceFromIssueId?: string;
title: string;
description?: string;
status?: Issue["status"];
priority?: Issue["priority"];
assigneeAgentId?: string;
assigneeUserId?: string | null;
requestDepth?: number;
billingCode?: string | null;
originKind?: PluginIssueOriginKind;
originId?: string | null;
originRunId?: string | null;
blockedByIssueIds?: string[];
labelIds?: string[];
executionWorkspaceId?: string | null;
executionWorkspacePreference?: string | null;
executionWorkspaceSettings?: Record<string, unknown> | null;
actor?: PluginIssueMutationActor;
}): Promise<Issue>;
update(
issueId: string,
patch: Partial<Pick<
Issue,
| "title"
| "description"
| "status"
| "priority"
| "assigneeAgentId"
| "assigneeUserId"
| "billingCode"
| "originKind"
| "originId"
| "originRunId"
| "requestDepth"
| "executionWorkspaceId"
| "executionWorkspacePreference"
>> & {
blockedByIssueIds?: string[];
labelIds?: string[];
executionWorkspaceSettings?: Record<string, unknown> | null;
},
"title" | "description" | "status" | "priority" | "assigneeAgentId"
>>,
companyId: string,
actor?: PluginIssueMutationActor,
): Promise<Issue>;
assertCheckoutOwner(input: {
issueId: string;
companyId: string;
actorAgentId: string;
actorRunId: string;
}): Promise<PluginIssueCheckoutOwnership>;
/**
* Read a root issue's descendants with optional relation/document/run/assignee
* summaries. Requires `issue.subtree.read`.
*/
getSubtree(
issueId: string,
companyId: string,
options?: PluginIssueSubtreeOptions,
): Promise<PluginIssueSubtree>;
requestWakeup(
issueId: string,
companyId: string,
options?: {
reason?: string;
contextSource?: string;
idempotencyKey?: string | null;
} & PluginIssueMutationActor,
): Promise<PluginIssueWakeupResult>;
requestWakeups(
issueIds: string[],
companyId: string,
options?: {
reason?: string;
contextSource?: string;
idempotencyKeyPrefix?: string | null;
} & PluginIssueMutationActor,
): Promise<PluginIssueWakeupBatchResult[]>;
listComments(issueId: string, companyId: string): Promise<IssueComment[]>;
createComment(
issueId: string,
@@ -1184,10 +917,6 @@ export interface PluginIssuesClient {
): Promise<IssueComment>;
/** Read and write issue documents. Requires `issue.documents.read` / `issue.documents.write`. */
documents: PluginIssueDocumentsClient;
/** Read and write blocker relationships. */
relations: PluginIssueRelationsClient;
/** Read compact orchestration summaries. */
summaries: PluginIssueSummariesClient;
}

/**
@@ -1409,9 +1138,6 @@ export interface PluginContext {
/** Register launcher metadata that the host can surface in plugin UI entry points. */
launchers: PluginLaunchersClient;

/** Restricted plugin-owned database namespace. Requires database namespace capabilities. */
db: PluginDatabaseClient;

/** Make outbound HTTP requests. Requires `http.outbound`. */
http: PluginHttpClient;
@@ -42,7 +42,6 @@ import type { PaperclipPluginManifestV1 } from "@paperclipai/shared";

import type { PaperclipPlugin } from "./define-plugin.js";
import type {
PluginApiRequestInput,
PluginHealthDiagnostics,
PluginConfigValidationResult,
PluginWebhookInput,
@@ -251,7 +250,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
let initialized = false;
let manifest: PaperclipPluginManifestV1 | null = null;
let currentConfig: Record<string, unknown> = {};
let databaseNamespace: string | null = null;

// Plugin handler registrations (populated during setup())
const eventHandlers: EventRegistration[] = [];
@@ -418,18 +416,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
},
},
db: {
get namespace() {
return databaseNamespace ?? "";
},
async query<T = Record<string, unknown>>(sql: string, params?: unknown[]): Promise<T[]> {
return callHost("db.query", { sql, params }) as Promise<T[]>;
},
async execute(sql: string, params?: unknown[]) {
return callHost("db.execute", { sql, params });
},
},

http: {
async fetch(url: string, init?: RequestInit): Promise<Response> {
const serializedInit: Record<string, unknown> = {};
@@ -588,8 +574,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
companyId: input.companyId,
projectId: input.projectId,
assigneeAgentId: input.assigneeAgentId,
originKind: input.originKind,
originId: input.originId,
status: input.status,
limit: input.limit,
offset: input.offset,
@@ -609,81 +593,19 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
inheritExecutionWorkspaceFromIssueId: input.inheritExecutionWorkspaceFromIssueId,
title: input.title,
description: input.description,
status: input.status,
priority: input.priority,
assigneeAgentId: input.assigneeAgentId,
assigneeUserId: input.assigneeUserId,
requestDepth: input.requestDepth,
billingCode: input.billingCode,
originKind: input.originKind,
originId: input.originId,
originRunId: input.originRunId,
blockedByIssueIds: input.blockedByIssueIds,
labelIds: input.labelIds,
executionWorkspaceId: input.executionWorkspaceId,
executionWorkspacePreference: input.executionWorkspacePreference,
executionWorkspaceSettings: input.executionWorkspaceSettings,
actorAgentId: input.actor?.actorAgentId,
actorUserId: input.actor?.actorUserId,
actorRunId: input.actor?.actorRunId,
});
},

async update(issueId: string, patch, companyId: string, actor) {
async update(issueId: string, patch, companyId: string) {
return callHost("issues.update", {
issueId,
patch: {
...(patch as Record<string, unknown>),
actorAgentId: actor?.actorAgentId,
actorUserId: actor?.actorUserId,
actorRunId: actor?.actorRunId,
},
patch: patch as Record<string, unknown>,
companyId,
});
},

async assertCheckoutOwner(input) {
return callHost("issues.assertCheckoutOwner", input);
},

async getSubtree(issueId: string, companyId: string, options) {
return callHost("issues.getSubtree", {
issueId,
companyId,
includeRoot: options?.includeRoot,
includeRelations: options?.includeRelations,
includeDocuments: options?.includeDocuments,
includeActiveRuns: options?.includeActiveRuns,
includeAssignees: options?.includeAssignees,
});
},

async requestWakeup(issueId: string, companyId: string, options) {
return callHost("issues.requestWakeup", {
issueId,
companyId,
reason: options?.reason,
contextSource: options?.contextSource,
idempotencyKey: options?.idempotencyKey,
actorAgentId: options?.actorAgentId,
actorUserId: options?.actorUserId,
actorRunId: options?.actorRunId,
});
},

async requestWakeups(issueIds: string[], companyId: string, options) {
return callHost("issues.requestWakeups", {
issueIds,
companyId,
reason: options?.reason,
contextSource: options?.contextSource,
idempotencyKeyPrefix: options?.idempotencyKeyPrefix,
actorAgentId: options?.actorAgentId,
actorUserId: options?.actorUserId,
actorRunId: options?.actorRunId,
});
},

async listComments(issueId: string, companyId: string) {
return callHost("issues.listComments", { issueId, companyId });
},
@@ -717,51 +639,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
return callHost("issues.documents.delete", { issueId, key, companyId });
},
},

relations: {
async get(issueId: string, companyId: string) {
return callHost("issues.relations.get", { issueId, companyId });
},

async setBlockedBy(issueId: string, blockedByIssueIds: string[], companyId: string, actor) {
return callHost("issues.relations.setBlockedBy", {
issueId,
companyId,
blockedByIssueIds,
actorAgentId: actor?.actorAgentId,
actorUserId: actor?.actorUserId,
actorRunId: actor?.actorRunId,
});
},

async addBlockers(issueId: string, blockerIssueIds: string[], companyId: string, actor) {
return callHost("issues.relations.addBlockers", {
issueId,
companyId,
blockerIssueIds,
actorAgentId: actor?.actorAgentId,
actorUserId: actor?.actorUserId,
actorRunId: actor?.actorRunId,
});
},

async removeBlockers(issueId: string, blockerIssueIds: string[], companyId: string, actor) {
return callHost("issues.relations.removeBlockers", {
issueId,
companyId,
blockerIssueIds,
actorAgentId: actor?.actorAgentId,
actorUserId: actor?.actorUserId,
actorRunId: actor?.actorRunId,
});
},
},

summaries: {
async getOrchestration(input) {
return callHost("issues.summaries.getOrchestration", input);
},
},
},

agents: {
@@ -1002,9 +879,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
case "handleWebhook":
return handleWebhook(params as PluginWebhookInput);

case "handleApiRequest":
return handleApiRequest(params as PluginApiRequestInput);

case "getData":
return handleGetData(params as GetDataParams);

@@ -1033,7 +907,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost

manifest = params.manifest;
currentConfig = params.config;
databaseNamespace = params.databaseNamespace ?? null;

// Call the plugin's setup function
await plugin.definition.setup(ctx);
@@ -1046,7 +919,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
if (plugin.definition.onConfigChanged) supportedMethods.push("configChanged");
if (plugin.definition.onHealth) supportedMethods.push("health");
if (plugin.definition.onShutdown) supportedMethods.push("shutdown");
if (plugin.definition.onApiRequest) supportedMethods.push("handleApiRequest");

return { ok: true, supportedMethods };
}
@@ -1148,16 +1020,6 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
await plugin.definition.onWebhook(params);
}

async function handleApiRequest(params: PluginApiRequestInput): Promise<unknown> {
if (!plugin.definition.onApiRequest) {
throw Object.assign(
new Error("handleApiRequest is not implemented by this plugin"),
{ code: PLUGIN_RPC_ERROR_CODES.METHOD_NOT_IMPLEMENTED },
);
}
return plugin.definition.onApiRequest(params);
}

async function handleGetData(params: GetDataParams): Promise<unknown> {
const handler = dataHandlers.get(params.key);
if (!handler) {
@@ -66,8 +66,6 @@ export const AGENT_ROLE_LABELS: Record<AgentRole, string> = {
general: "General",
};

export const AGENT_DEFAULT_MAX_CONCURRENT_RUNS = 5;
export const WORKSPACE_BRANCH_ROUTINE_VARIABLE = "workspaceBranch";
export const AGENT_ICON_NAMES = [
"bot",
"cpu",
@@ -138,25 +136,11 @@ export const ISSUE_PRIORITIES = ["critical", "high", "medium", "low"] as const;
export type IssuePriority = (typeof ISSUE_PRIORITIES)[number];

export const ISSUE_ORIGIN_KINDS = ["manual", "routine_execution"] as const;
export type BuiltInIssueOriginKind = (typeof ISSUE_ORIGIN_KINDS)[number];
export type PluginIssueOriginKind = `plugin:${string}`;
export type IssueOriginKind = BuiltInIssueOriginKind | PluginIssueOriginKind;
export type IssueOriginKind = (typeof ISSUE_ORIGIN_KINDS)[number];

export const ISSUE_RELATION_TYPES = ["blocks"] as const;
export type IssueRelationType = (typeof ISSUE_RELATION_TYPES)[number];

export const ISSUE_CONTINUATION_SUMMARY_DOCUMENT_KEY = "continuation-summary" as const;
export const SYSTEM_ISSUE_DOCUMENT_KEYS = [ISSUE_CONTINUATION_SUMMARY_DOCUMENT_KEY] as const;
export type SystemIssueDocumentKey = (typeof SYSTEM_ISSUE_DOCUMENT_KEYS)[number];

const SYSTEM_ISSUE_DOCUMENT_KEY_SET = new Set<string>(SYSTEM_ISSUE_DOCUMENT_KEYS);

export function isSystemIssueDocumentKey(key: string): key is SystemIssueDocumentKey {
return SYSTEM_ISSUE_DOCUMENT_KEY_SET.has(key);
}
export const ISSUE_REFERENCE_SOURCE_KINDS = ["title", "description", "comment", "document"] as const;
export type IssueReferenceSourceKind = (typeof ISSUE_REFERENCE_SOURCE_KINDS)[number];

export const ISSUE_EXECUTION_POLICY_MODES = ["normal", "auto"] as const;
export type IssueExecutionPolicyMode = (typeof ISSUE_EXECUTION_POLICY_MODES)[number];

@@ -351,7 +335,6 @@ export type WakeupRequestStatus = (typeof WAKEUP_REQUEST_STATUSES)[number];

export const HEARTBEAT_RUN_STATUSES = [
"queued",
"scheduled_retry",
"running",
"succeeded",
"failed",
@@ -360,17 +343,6 @@
] as const;
export type HeartbeatRunStatus = (typeof HEARTBEAT_RUN_STATUSES)[number];

export const RUN_LIVENESS_STATES = [
"completed",
"advanced",
"plan_only",
"empty_response",
"blocked",
"failed",
"needs_followup",
] as const;
export type RunLivenessState = (typeof RUN_LIVENESS_STATES)[number];

export const LIVE_EVENT_TYPES = [
"heartbeat.run.queued",
"heartbeat.run.status",
@@ -387,33 +359,9 @@ export type LiveEventType = (typeof LIVE_EVENT_TYPES)[number];
export const PRINCIPAL_TYPES = ["user", "agent"] as const;
export type PrincipalType = (typeof PRINCIPAL_TYPES)[number];

export const MEMBERSHIP_STATUSES = ["pending", "active", "suspended", "archived"] as const;
export const MEMBERSHIP_STATUSES = ["pending", "active", "suspended"] as const;
export type MembershipStatus = (typeof MEMBERSHIP_STATUSES)[number];

export const COMPANY_MEMBERSHIP_ROLES = [
"owner",
"admin",
"operator",
"viewer",
"member",
] as const;
export type CompanyMembershipRole = (typeof COMPANY_MEMBERSHIP_ROLES)[number];

export const HUMAN_COMPANY_MEMBERSHIP_ROLES = [
"owner",
"admin",
"operator",
"viewer",
] as const;
export type HumanCompanyMembershipRole = (typeof HUMAN_COMPANY_MEMBERSHIP_ROLES)[number];

export const HUMAN_COMPANY_MEMBERSHIP_ROLE_LABELS: Record<HumanCompanyMembershipRole, string> = {
owner: "Owner",
admin: "Admin",
operator: "Operator",
viewer: "Viewer",
};

export const INSTANCE_USER_ROLES = ["instance_admin"] as const;
export type InstanceUserRole = (typeof INSTANCE_USER_ROLES)[number];

@@ -435,7 +383,6 @@ export const PERMISSION_KEYS = [
"users:manage_permissions",
"tasks:assign",
"tasks:assign_scope",
"tasks:manage_active_checkouts",
"joins:approve",
] as const;
export type PermissionKey = (typeof PERMISSION_KEYS)[number];
@@ -504,8 +451,6 @@ export const PLUGIN_CAPABILITIES = [
"projects.read",
"project.workspaces.read",
"issues.read",
"issue.relations.read",
"issue.subtree.read",
"issue.comments.read",
"issue.documents.read",
"agents.read",
@@ -514,14 +459,9 @@
"goals.update",
"activity.read",
"costs.read",
"issues.orchestration.read",
"database.namespace.read",
// Data Write
"issues.create",
"issues.update",
"issue.relations.write",
"issues.checkout",
"issues.wakeup",
"issue.comments.create",
"issue.documents.write",
"agents.pause",
@@ -534,8 +474,6 @@
"activity.log.write",
"metrics.write",
"telemetry.track",
"database.namespace.migrate",
"database.namespace.write",
// Plugin State
"plugin.state.read",
"plugin.state.write",
@@ -544,7 +482,6 @@
"events.emit",
"jobs.schedule",
"webhooks.receive",
"api.routes.register",
"http.outbound",
"secrets.read-ref",
// Agent Tools
@@ -560,51 +497,6 @@
] as const;
export type PluginCapability = (typeof PLUGIN_CAPABILITIES)[number];

export const PLUGIN_DATABASE_NAMESPACE_MODES = ["schema"] as const;
export type PluginDatabaseNamespaceMode = (typeof PLUGIN_DATABASE_NAMESPACE_MODES)[number];

export const PLUGIN_DATABASE_NAMESPACE_STATUSES = [
"active",
"migration_failed",
] as const;
export type PluginDatabaseNamespaceStatus = (typeof PLUGIN_DATABASE_NAMESPACE_STATUSES)[number];

export const PLUGIN_DATABASE_MIGRATION_STATUSES = [
"applied",
"failed",
] as const;
export type PluginDatabaseMigrationStatus = (typeof PLUGIN_DATABASE_MIGRATION_STATUSES)[number];

export const PLUGIN_DATABASE_CORE_READ_TABLES = [
"companies",
"projects",
"goals",
"agents",
"issues",
"issue_documents",
"issue_relations",
"issue_comments",
"heartbeat_runs",
"cost_events",
"approvals",
"issue_approvals",
"budget_incidents",
] as const;
export type PluginDatabaseCoreReadTable = (typeof PLUGIN_DATABASE_CORE_READ_TABLES)[number];

export const PLUGIN_API_ROUTE_METHODS = ["GET", "POST", "PATCH", "DELETE"] as const;
export type PluginApiRouteMethod = (typeof PLUGIN_API_ROUTE_METHODS)[number];
|
||||
|
||||
export const PLUGIN_API_ROUTE_AUTH_MODES = ["board", "agent", "board-or-agent", "webhook"] as const;
|
||||
export type PluginApiRouteAuthMode = (typeof PLUGIN_API_ROUTE_AUTH_MODES)[number];
|
||||
|
||||
export const PLUGIN_API_ROUTE_CHECKOUT_POLICIES = [
|
||||
"none",
|
||||
"required-for-agent-in-progress",
|
||||
"always-for-agent",
|
||||
] as const;
|
||||
export type PluginApiRouteCheckoutPolicy = (typeof PLUGIN_API_ROUTE_CHECKOUT_POLICIES)[number];
|
||||
|
||||
/**
|
||||
* UI extension slot types. Each slot type corresponds to a mount point in the
|
||||
* Paperclip UI where plugin components can be rendered.
|
||||
@@ -803,13 +695,6 @@ export const PLUGIN_EVENT_TYPES = [
|
||||
"issue.created",
|
||||
"issue.updated",
|
||||
"issue.comment.created",
|
||||
"issue.document.created",
|
||||
"issue.document.updated",
|
||||
"issue.document.deleted",
|
||||
"issue.relations.updated",
|
||||
"issue.checked_out",
|
||||
"issue.released",
|
||||
"issue.assignment_wakeup_requested",
|
||||
"agent.created",
|
||||
"agent.updated",
|
||||
"agent.status_changed",
|
||||
@@ -821,8 +706,6 @@ export const PLUGIN_EVENT_TYPES = [
|
||||
"goal.updated",
|
||||
"approval.created",
|
||||
"approval.decided",
|
||||
"budget.incident.opened",
|
||||
"budget.incident.resolved",
|
||||
"cost_event.created",
|
||||
"activity.logged",
|
||||
] as const;
|
||||
|
||||
@@ -9,8 +9,6 @@ export {
  AGENT_ADAPTER_TYPES,
  AGENT_ROLES,
  AGENT_ROLE_LABELS,
  AGENT_DEFAULT_MAX_CONCURRENT_RUNS,
  WORKSPACE_BRANCH_ROUTINE_VARIABLE,
  AGENT_ICON_NAMES,
  ISSUE_STATUSES,
  INBOX_MINE_ISSUE_STATUSES,
@@ -18,10 +16,6 @@ export {
  ISSUE_PRIORITIES,
  ISSUE_ORIGIN_KINDS,
  ISSUE_RELATION_TYPES,
  ISSUE_CONTINUATION_SUMMARY_DOCUMENT_KEY,
  SYSTEM_ISSUE_DOCUMENT_KEYS,
  isSystemIssueDocumentKey,
  ISSUE_REFERENCE_SOURCE_KINDS,
  ISSUE_EXECUTION_POLICY_MODES,
  ISSUE_EXECUTION_STAGE_TYPES,
  ISSUE_EXECUTION_STATE_STATUSES,
@@ -55,15 +49,11 @@ export {
  BUDGET_INCIDENT_RESOLUTION_ACTIONS,
  HEARTBEAT_INVOCATION_SOURCES,
  HEARTBEAT_RUN_STATUSES,
  RUN_LIVENESS_STATES,
  WAKEUP_TRIGGER_DETAILS,
  WAKEUP_REQUEST_STATUSES,
  LIVE_EVENT_TYPES,
  PRINCIPAL_TYPES,
  MEMBERSHIP_STATUSES,
  COMPANY_MEMBERSHIP_ROLES,
  HUMAN_COMPANY_MEMBERSHIP_ROLES,
  HUMAN_COMPANY_MEMBERSHIP_ROLE_LABELS,
  INSTANCE_USER_ROLES,
  INVITE_TYPES,
  INVITE_JOIN_TYPES,
@@ -85,13 +75,6 @@ export {
  PLUGIN_JOB_RUN_STATUSES,
  PLUGIN_JOB_RUN_TRIGGERS,
  PLUGIN_WEBHOOK_DELIVERY_STATUSES,
  PLUGIN_DATABASE_NAMESPACE_MODES,
  PLUGIN_DATABASE_NAMESPACE_STATUSES,
  PLUGIN_DATABASE_MIGRATION_STATUSES,
  PLUGIN_DATABASE_CORE_READ_TABLES,
  PLUGIN_API_ROUTE_METHODS,
  PLUGIN_API_ROUTE_AUTH_MODES,
  PLUGIN_API_ROUTE_CHECKOUT_POLICIES,
  PLUGIN_EVENT_TYPES,
  PLUGIN_BRIDGE_ERROR_CODES,
  type CompanyStatus,
@@ -105,12 +88,8 @@ export {
  type AgentIconName,
  type IssueStatus,
  type IssuePriority,
  type BuiltInIssueOriginKind,
  type PluginIssueOriginKind,
  type IssueOriginKind,
  type IssueRelationType,
  type SystemIssueDocumentKey,
  type IssueReferenceSourceKind,
  type IssueExecutionPolicyMode,
  type IssueExecutionStageType,
  type IssueExecutionStateStatus,
@@ -143,14 +122,11 @@ export {
  type BudgetIncidentResolutionAction,
  type HeartbeatInvocationSource,
  type HeartbeatRunStatus,
  type RunLivenessState,
  type WakeupTriggerDetail,
  type WakeupRequestStatus,
  type LiveEventType,
  type PrincipalType,
  type MembershipStatus,
  type CompanyMembershipRole,
  type HumanCompanyMembershipRole,
  type InstanceUserRole,
  type InviteType,
  type InviteJoinType,
@@ -171,13 +147,6 @@ export {
  type PluginJobRunStatus,
  type PluginJobRunTrigger,
  type PluginWebhookDeliveryStatus,
  type PluginDatabaseNamespaceMode,
  type PluginDatabaseNamespaceStatus,
  type PluginDatabaseMigrationStatus,
  type PluginDatabaseCoreReadTable,
  type PluginApiRouteMethod,
  type PluginApiRouteAuthMode,
  type PluginApiRouteCheckoutPolicy,
  type PluginEventType,
  type PluginBridgeErrorCode,
} from "./constants.js";
@@ -255,7 +224,6 @@ export type {
  ProjectGoalRef,
  ProjectWorkspace,
  ExecutionWorkspace,
  ExecutionWorkspaceSummary,
  ExecutionWorkspaceConfig,
  ExecutionWorkspaceCloseAction,
  ExecutionWorkspaceCloseActionKind,
@@ -288,9 +256,6 @@ export type {
  IssueWorkProductReviewState,
  Issue,
  IssueAssigneeAdapterOverrides,
  IssueReferenceSource,
  IssueRelatedWorkItem,
  IssueRelatedWorkSummary,
  IssueRelation,
  IssueRelationIssueSummary,
  IssueExecutionPolicy,
@@ -337,35 +302,16 @@ export type {
  AgentWakeupRequest,
  InstanceSchedulerHeartbeatAgent,
  LiveEvent,
  DashboardRunActivityDay,
  DashboardSummary,
  ActivityEvent,
  UserProfileActivitySummary,
  UserProfileAgentUsage,
  UserProfileDailyPoint,
  UserProfileIdentity,
  UserProfileIssueSummary,
  UserProfileProviderUsage,
  UserProfileResponse,
  UserProfileWindowStats,
  SidebarBadges,
  SidebarOrderPreference,
  InboxDismissal,
  AccessUserProfile,
  CompanyMemberRecord,
  CompanyMembersResponse,
  CompanyMembership,
  CompanyInviteListResponse,
  CompanyInviteRecord,
  PrincipalPermissionGrant,
  Invite,
  JoinRequest,
  JoinRequestInviteSummary,
  JoinRequestRecord,
  InstanceUserRoleGrant,
  AdminUserDirectoryEntry,
  UserCompanyAccessEntry,
  UserCompanyAccessResponse,
  CompanyPortabilityInclude,
  CompanyPortabilityEnvInput,
  CompanyPortabilityFileEntry,
@@ -420,13 +366,8 @@ export type {
  PluginLauncherDeclaration,
  PluginMinimumHostVersion,
  PluginUiDeclaration,
  PluginDatabaseDeclaration,
  PluginApiRouteCompanyResolution,
  PluginApiRouteDeclaration,
  PaperclipPluginManifestV1,
  PluginRecord,
  PluginDatabaseNamespaceRecord,
  PluginMigrationRecord,
  PluginStateRecord,
  PluginConfig,
  PluginEntityRecord,
@@ -437,16 +378,6 @@ export type {
  QuotaWindow,
  ProviderQuotaResult,
} from "./types/index.js";
export {
  ISSUE_REFERENCE_IDENTIFIER_RE,
  buildIssueReferenceHref,
  extractIssueReferenceIdentifiers,
  extractIssueReferenceMatches,
  findIssueReferenceMatches,
  normalizeIssueIdentifier,
  parseIssueReferenceHref,
  type IssueReferenceMatch,
} from "./issue-references.js";

export {
  sidebarOrderPreferenceSchema,
@@ -547,7 +478,6 @@ export {
  type UpdateProjectWorkspace,
  projectExecutionWorkspacePolicySchema,
  createIssueSchema,
  createChildIssueSchema,
  createIssueLabelSchema,
  updateIssueSchema,
  issueExecutionPolicySchema,
@@ -575,7 +505,6 @@ export {
  upsertIssueDocumentSchema,
  restoreIssueDocumentRevisionSchema,
  type CreateIssue,
  type CreateChildIssue,
  type CreateIssueLabel,
  type UpdateIssue,
  type CheckoutIssue,
@@ -636,20 +565,12 @@ export {
  createCompanyInviteSchema,
  createOpenClawInvitePromptSchema,
  acceptInviteSchema,
  listCompanyInvitesQuerySchema,
  listJoinRequestsQuerySchema,
  claimJoinRequestApiKeySchema,
  boardCliAuthAccessLevelSchema,
  createCliAuthChallengeSchema,
  resolveCliAuthChallengeSchema,
  currentUserProfileSchema,
  authSessionSchema,
  updateCurrentUserProfileSchema,
  updateCompanyMemberSchema,
  updateCompanyMemberWithPermissionsSchema,
  archiveCompanyMemberSchema,
  updateMemberPermissionsSchema,
  searchAdminUsersQuerySchema,
  updateUserCompanyAccessSchema,
  type CreateCostEvent,
  type CreateFinanceEvent,
@@ -658,20 +579,12 @@ export {
  type CreateCompanyInvite,
  type CreateOpenClawInvitePrompt,
  type AcceptInvite,
  type ListCompanyInvitesQuery,
  type ListJoinRequestsQuery,
  type ClaimJoinRequestApiKey,
  type BoardCliAuthAccessLevel,
  type CreateCliAuthChallenge,
  type ResolveCliAuthChallenge,
  type CurrentUserProfile,
  type AuthSession,
  type UpdateCurrentUserProfile,
  type UpdateCompanyMember,
  type UpdateCompanyMemberWithPermissions,
  type ArchiveCompanyMember,
  type UpdateMemberPermissions,
  type SearchAdminUsersQuery,
  type UpdateUserCompanyAccess,
  companySkillSourceTypeSchema,
  companySkillTrustLevelSchema,
@@ -715,8 +628,6 @@ export {
  pluginLauncherActionDeclarationSchema,
  pluginLauncherRenderDeclarationSchema,
  pluginLauncherDeclarationSchema,
  pluginDatabaseDeclarationSchema,
  pluginApiRouteDeclarationSchema,
  pluginManifestV1Schema,
  installPluginSchema,
  upsertPluginConfigSchema,
@@ -733,8 +644,6 @@ export {
  type PluginLauncherActionDeclarationInput,
  type PluginLauncherRenderDeclarationInput,
  type PluginLauncherDeclarationInput,
  type PluginDatabaseDeclarationInput,
  type PluginApiRouteDeclarationInput,
  type PluginManifestV1Input,
  type InstallPlugin,
  type UpsertPluginConfig,
@@ -753,23 +662,18 @@ export {
  AGENT_MENTION_SCHEME,
  PROJECT_MENTION_SCHEME,
  SKILL_MENTION_SCHEME,
  USER_MENTION_SCHEME,
  buildAgentMentionHref,
  buildProjectMentionHref,
  buildSkillMentionHref,
  buildUserMentionHref,
  extractAgentMentionIds,
  extractProjectMentionIds,
  extractSkillMentionIds,
  extractUserMentionIds,
  parseAgentMentionHref,
  parseProjectMentionHref,
  parseSkillMentionHref,
  parseUserMentionHref,
  extractProjectMentionIds,
  type ParsedAgentMention,
  type ParsedProjectMention,
  type ParsedSkillMention,
  type ParsedUserMention,
} from "./project-mentions.js";

export {
@@ -1,68 +0,0 @@
import { describe, expect, it } from "vitest";
import {
  buildIssueReferenceHref,
  extractIssueReferenceIdentifiers,
  findIssueReferenceMatches,
  normalizeIssueIdentifier,
  parseIssueReferenceHref,
} from "./issue-references.js";

describe("issue references", () => {
  it("normalizes identifiers to uppercase", () => {
    expect(normalizeIssueIdentifier("pap-123")).toBe("PAP-123");
    expect(normalizeIssueIdentifier("not-an-issue")).toBeNull();
  });

  it("parses relative and absolute issue hrefs", () => {
    expect(parseIssueReferenceHref("/issues/PAP-123")).toEqual({ identifier: "PAP-123" });
    expect(parseIssueReferenceHref("/PAP/issues/pap-456")).toEqual({ identifier: "PAP-456" });
    expect(parseIssueReferenceHref("https://paperclip.ing/PAP/issues/pap-789#comment-1")).toEqual({
      identifier: "PAP-789",
    });
    expect(parseIssueReferenceHref("https://paperclip.ing/projects/PAP-789")).toBeNull();
  });

  it("builds canonical issue hrefs", () => {
    expect(buildIssueReferenceHref("pap-123")).toBe("/issues/PAP-123");
  });

  it("finds identifiers and issue paths in plain text", () => {
    expect(findIssueReferenceMatches("See PAP-1, /issues/PAP-2, and https://x.test/PAP/issues/pap-3.")).toEqual([
      { index: 4, length: 5, identifier: "PAP-1", matchedText: "PAP-1" },
      { index: 11, length: 13, identifier: "PAP-2", matchedText: "/issues/PAP-2" },
      {
        index: 30,
        length: 31,
        identifier: "PAP-3",
        matchedText: "https://x.test/PAP/issues/pap-3",
      },
    ]);
  });

  it("trims unmatched square brackets from issue path tokens", () => {
    expect(findIssueReferenceMatches("See /issues/PAP-123] for context.")).toEqual([
      { index: 4, length: 15, identifier: "PAP-123", matchedText: "/issues/PAP-123" },
    ]);
  });

  it("extracts and dedupes references from markdown", () => {
    expect(extractIssueReferenceIdentifiers("PAP-1 [again](/issues/pap-1) PAP-2")).toEqual(["PAP-1", "PAP-2"]);
  });

  it("ignores inline code and fenced code blocks", () => {
    const markdown = [
      "Use PAP-1 here.",
      "",
      "`PAP-2` should not count.",
      "",
      "```md",
      "PAP-3",
      "/issues/PAP-4",
      "```",
      "",
      "Final /issues/PAP-5 mention.",
    ].join("\n");

    expect(extractIssueReferenceIdentifiers(markdown)).toEqual(["PAP-1", "PAP-5"]);
  });
});
@@ -1,188 +0,0 @@
export const ISSUE_REFERENCE_IDENTIFIER_RE = /^[A-Z]+-\d+$/;

export interface IssueReferenceMatch {
  index: number;
  length: number;
  identifier: string;
  matchedText: string;
}

const ISSUE_REFERENCE_TOKEN_RE = /https?:\/\/[^\s<>()]+|\/[^\s<>()]+|[A-Z]+-\d+/gi;

function preserveNewlinesAsWhitespace(value: string) {
  return value.replace(/[^\n]/g, " ");
}

function stripMarkdownCode(markdown: string): string {
  if (!markdown) return "";

  let output = "";
  let index = 0;

  while (index < markdown.length) {
    const remaining = markdown.slice(index);
    const fenceMatch = /^(?:```+|~~~+)/.exec(remaining);
    const atLineStart = index === 0 || markdown[index - 1] === "\n";

    if (atLineStart && fenceMatch) {
      const fence = fenceMatch[0]!;
      const blockStart = index;
      index += fence.length;
      while (index < markdown.length && markdown[index] !== "\n") index += 1;
      if (index < markdown.length) index += 1;

      while (index < markdown.length) {
        const lineStart = index === 0 || markdown[index - 1] === "\n";
        if (lineStart && markdown.startsWith(fence, index)) {
          index += fence.length;
          while (index < markdown.length && markdown[index] !== "\n") index += 1;
          if (index < markdown.length) index += 1;
          break;
        }
        index += 1;
      }

      output += preserveNewlinesAsWhitespace(markdown.slice(blockStart, index));
      continue;
    }

    if (markdown[index] === "`") {
      let tickCount = 1;
      while (index + tickCount < markdown.length && markdown[index + tickCount] === "`") {
        tickCount += 1;
      }
      const fence = "`".repeat(tickCount);
      const inlineStart = index;
      index += tickCount;
      const closeIndex = markdown.indexOf(fence, index);
      if (closeIndex === -1) {
        output += markdown.slice(inlineStart, inlineStart + tickCount);
        index = inlineStart + tickCount;
        continue;
      }
      index = closeIndex + tickCount;
      output += preserveNewlinesAsWhitespace(markdown.slice(inlineStart, index));
      continue;
    }

    output += markdown[index]!;
    index += 1;
  }

  return output;
}

function trimTrailingPunctuation(token: string): string {
  let trimmed = token;
  while (trimmed.length > 0) {
    const last = trimmed[trimmed.length - 1]!;
    if (!".,!?;:".includes(last) && last !== ")" && last !== "]") break;

    if (
      (last === ")" && (trimmed.match(/\(/g)?.length ?? 0) >= (trimmed.match(/\)/g)?.length ?? 0))
      || (last === "]" && (trimmed.match(/\[/g)?.length ?? 0) >= (trimmed.match(/\]/g)?.length ?? 0))
    ) {
      break;
    }
    trimmed = trimmed.slice(0, -1);
  }
  return trimmed;
}

export function normalizeIssueIdentifier(value: string): string | null {
  const trimmed = value.trim().toUpperCase();
  return ISSUE_REFERENCE_IDENTIFIER_RE.test(trimmed) ? trimmed : null;
}

export function buildIssueReferenceHref(identifier: string): string {
  const normalized = normalizeIssueIdentifier(identifier);
  return `/issues/${normalized ?? identifier.trim()}`;
}

export function parseIssueReferenceHref(href: string): { identifier: string } | null {
  const raw = href.trim();
  if (!raw) return null;

  let url: URL;
  try {
    url = raw.startsWith("/")
      ? new URL(raw, "https://paperclip.invalid")
      : new URL(raw);
  } catch {
    return null;
  }

  const segments = url.pathname
    .split("/")
    .map((segment) => segment.trim())
    .filter(Boolean);

  for (let index = 0; index < segments.length - 1; index += 1) {
    if (segments[index]?.toLowerCase() !== "issues") continue;
    const identifier = normalizeIssueIdentifier(segments[index + 1] ?? "");
    if (identifier) {
      return { identifier };
    }
  }

  return null;
}

export function findIssueReferenceMatches(text: string): IssueReferenceMatch[] {
  if (!text) return [];

  const matches: IssueReferenceMatch[] = [];
  let match: RegExpExecArray | null;
  const re = new RegExp(ISSUE_REFERENCE_TOKEN_RE);

  while ((match = re.exec(text)) !== null) {
    const rawToken = match[0];
    const cleanedToken = trimTrailingPunctuation(rawToken);
    if (!cleanedToken) continue;

    const identifier =
      normalizeIssueIdentifier(cleanedToken)
      ?? parseIssueReferenceHref(cleanedToken)?.identifier
      ?? null;

    if (!identifier) continue;

    const cleanedIndex = match.index;
    matches.push({
      index: cleanedIndex,
      length: cleanedToken.length,
      identifier,
      matchedText: cleanedToken,
    });
  }

  return matches;
}

export function extractIssueReferenceIdentifiers(markdown: string): string[] {
  const scrubbed = stripMarkdownCode(markdown);
  const seen = new Set<string>();
  const ordered: string[] = [];

  for (const match of findIssueReferenceMatches(scrubbed)) {
    if (seen.has(match.identifier)) continue;
    seen.add(match.identifier);
    ordered.push(match.identifier);
  }

  return ordered;
}

export function extractIssueReferenceMatches(markdown: string): IssueReferenceMatch[] {
  const scrubbed = stripMarkdownCode(markdown);
  const seen = new Set<string>();
  const ordered: IssueReferenceMatch[] = [];

  for (const match of findIssueReferenceMatches(scrubbed)) {
    if (seen.has(match.identifier)) continue;
    seen.add(match.identifier);
    ordered.push(match);
  }

  return ordered;
}
@@ -3,15 +3,12 @@ import {
  buildAgentMentionHref,
  buildProjectMentionHref,
  buildSkillMentionHref,
  buildUserMentionHref,
  extractAgentMentionIds,
  extractProjectMentionIds,
  extractSkillMentionIds,
  extractUserMentionIds,
  parseAgentMentionHref,
  parseProjectMentionHref,
  parseSkillMentionHref,
  parseUserMentionHref,
} from "./project-mentions.js";

describe("project-mentions", () => {
@@ -33,14 +30,6 @@ describe("project-mentions", () => {
    expect(extractAgentMentionIds(`[@CodexCoder](${href})`)).toEqual(["agent-123"]);
  });

  it("round-trips user mentions", () => {
    const href = buildUserMentionHref("user-123");
    expect(parseUserMentionHref(href)).toEqual({
      userId: "user-123",
    });
    expect(extractUserMentionIds(`[@Taylor](${href})`)).toEqual(["user-123"]);
  });

  it("round-trips skill mentions with slug metadata", () => {
    const href = buildSkillMentionHref("skill-123", "release-changelog");
    expect(parseSkillMentionHref(href)).toEqual({
@@ -1,6 +1,5 @@
export const PROJECT_MENTION_SCHEME = "project://";
export const AGENT_MENTION_SCHEME = "agent://";
export const USER_MENTION_SCHEME = "user://";
export const SKILL_MENTION_SCHEME = "skill://";

const HEX_COLOR_RE = /^[0-9a-f]{6}$/i;
@@ -9,7 +8,6 @@ const HEX_COLOR_WITH_HASH_RE = /^#[0-9a-f]{6}$/i;
const HEX_COLOR_SHORT_WITH_HASH_RE = /^#[0-9a-f]{3}$/i;
const PROJECT_MENTION_LINK_RE = /\[[^\]]*]\((project:\/\/[^)\s]+)\)/gi;
const AGENT_MENTION_LINK_RE = /\[[^\]]*]\((agent:\/\/[^)\s]+)\)/gi;
const USER_MENTION_LINK_RE = /\[[^\]]*]\((user:\/\/[^)\s]+)\)/gi;
const SKILL_MENTION_LINK_RE = /\[[^\]]*]\((skill:\/\/[^)\s]+)\)/gi;
const AGENT_ICON_NAME_RE = /^[a-z0-9-]+$/i;
const SKILL_SLUG_RE = /^[a-z0-9][a-z0-9-]*$/i;
@@ -24,10 +22,6 @@ export interface ParsedAgentMention {
  icon: string | null;
}

export interface ParsedUserMention {
  userId: string;
}

export interface ParsedSkillMention {
  skillId: string;
  slug: string | null;
@@ -117,28 +111,6 @@ export function parseAgentMentionHref(href: string): ParsedAgentMention | null {
  };
}

export function buildUserMentionHref(userId: string): string {
  return `${USER_MENTION_SCHEME}${userId.trim()}`;
}

export function parseUserMentionHref(href: string): ParsedUserMention | null {
  if (!href.startsWith(USER_MENTION_SCHEME)) return null;

  let url: URL;
  try {
    url = new URL(href);
  } catch {
    return null;
  }

  if (url.protocol !== "user:") return null;

  const userId = `${url.hostname}${url.pathname}`.replace(/^\/+/, "").trim();
  if (!userId) return null;

  return { userId };
}

export function buildSkillMentionHref(skillId: string, slug?: string | null): string {
  const trimmedSkillId = skillId.trim();
  const normalizedSlug = normalizeSkillSlug(slug ?? null);
@@ -193,18 +165,6 @@ export function extractAgentMentionIds(markdown: string): string[] {
  return [...ids];
}

export function extractUserMentionIds(markdown: string): string[] {
  if (!markdown) return [];
  const ids = new Set<string>();
  const re = new RegExp(USER_MENTION_LINK_RE);
  let match: RegExpExecArray | null;
  while ((match = re.exec(markdown)) !== null) {
    const parsed = parseUserMentionHref(match[1]);
    if (parsed) ids.add(parsed.userId);
  }
  return [...ids];
}

export function extractSkillMentionIds(markdown: string): string[] {
  if (!markdown) return [];
  const ids = new Set<string>();
@@ -1,7 +1,5 @@
import type {
  AgentAdapterType,
  CompanyStatus,
  HumanCompanyMembershipRole,
  InstanceUserRole,
  InviteJoinType,
  InviteType,
@@ -35,39 +33,6 @@ export interface PrincipalPermissionGrant {
  updatedAt: Date;
}

export interface AccessUserProfile {
  id: string;
  email: string | null;
  name: string | null;
  image: string | null;
}

export interface CompanyMemberRecord extends CompanyMembership {
  principalType: "user";
  membershipRole: HumanCompanyMembershipRole | null;
  user: AccessUserProfile | null;
  grants: PrincipalPermissionGrant[];
  removal?: {
    canArchive: boolean;
    reason: string | null;
  };
}

export interface CompanyMembersResponse {
  members: CompanyMemberRecord[];
  access: {
    currentUserRole: HumanCompanyMembershipRole | null;
    canManageMembers: boolean;
    canInviteUsers: boolean;
    canApproveJoinRequests: boolean;
  };
}

export interface ArchiveCompanyMemberResponse {
  member: CompanyMemberRecord;
  reassignedIssueCount: number;
}

export interface Invite {
  id: string;
  companyId: string | null;
@@ -83,22 +48,6 @@ export interface Invite {
  updatedAt: Date;
}

export type InviteState = "active" | "revoked" | "accepted" | "expired";

export interface CompanyInviteRecord extends Invite {
  companyName: string | null;
  humanRole: HumanCompanyMembershipRole | null;
  inviteMessage: string | null;
  state: InviteState;
  invitedByUser: AccessUserProfile | null;
  relatedJoinRequestId: string | null;
}

export interface CompanyInviteListResponse {
  invites: CompanyInviteRecord[];
  nextOffset: number | null;
}

export interface JoinRequest {
  id: string;
  inviteId: string;
@@ -123,26 +72,6 @@ export interface JoinRequest {
  updatedAt: Date;
}

export interface JoinRequestInviteSummary {
  id: string;
  inviteType: InviteType;
  allowedJoinTypes: InviteJoinType;
  humanRole: HumanCompanyMembershipRole | null;
  inviteMessage: string | null;
  createdAt: Date;
  expiresAt: Date;
  revokedAt: Date | null;
  acceptedAt: Date | null;
  invitedByUser: AccessUserProfile | null;
}

export interface JoinRequestRecord extends JoinRequest {
  requesterUser: AccessUserProfile | null;
  approvedByUser: AccessUserProfile | null;
  rejectedByUser: AccessUserProfile | null;
  invite: JoinRequestInviteSummary | null;
}

export interface InstanceUserRoleGrant {
  id: string;
  userId: string;
@@ -150,21 +79,3 @@ export interface InstanceUserRoleGrant {
  createdAt: Date;
  updatedAt: Date;
}

export interface AdminUserDirectoryEntry extends AccessUserProfile {
  isInstanceAdmin: boolean;
  activeCompanyMembershipCount: number;
}

export interface UserCompanyAccessEntry extends CompanyMembership {
  principalType: "user";
  companyName: string | null;
  companyStatus: CompanyStatus | null;
}

export interface UserCompanyAccessResponse {
  user: (AccessUserProfile & {
    isInstanceAdmin: boolean;
  }) | null;
  companyAccess: UserCompanyAccessEntry[];
}
@@ -1,7 +1,7 @@
export interface ActivityEvent {
  id: string;
  companyId: string;
  actorType: "agent" | "user" | "system" | "plugin";
  actorType: "agent" | "user" | "system";
  actorId: string;
  action: string;
  entityType: string;
@@ -1,11 +1,3 @@
export interface DashboardRunActivityDay {
  date: string;
  succeeded: number;
  failed: number;
  other: number;
  total: number;
}

export interface DashboardSummary {
  companyId: string;
  agents: {
@@ -32,5 +24,4 @@ export interface DashboardSummary {
    pausedAgents: number;
    pausedProjects: number;
  };
  runActivity: DashboardRunActivityDay[];
}
@@ -3,7 +3,6 @@ import type {
  AgentStatus,
  HeartbeatInvocationSource,
  HeartbeatRunStatus,
  RunLivenessState,
  WakeupTriggerDetail,
  WakeupRequestStatus,
} from "../constants.js";
@@ -39,15 +38,6 @@ export interface HeartbeatRun {
  processStartedAt: Date | null;
  retryOfRunId: string | null;
  processLossRetryCount: number;
  scheduledRetryAt?: Date | null;
  scheduledRetryAttempt?: number;
  scheduledRetryReason?: string | null;
  retryExhaustedReason?: string | null;
  livenessState: RunLivenessState | null;
  livenessReason: string | null;
  continuationAttempt: number;
  lastUsefulActionAt: Date | null;
  nextAction: string | null;
  contextSnapshot: Record<string, unknown> | null;
  createdAt: Date;
  updatedAt: Date;
@@ -63,7 +63,6 @@ export type { AssetImage } from "./asset.js";
export type { Project, ProjectCodebase, ProjectCodebaseOrigin, ProjectGoalRef, ProjectWorkspace } from "./project.js";
export type {
  ExecutionWorkspace,
  ExecutionWorkspaceSummary,
  ExecutionWorkspaceConfig,
  ExecutionWorkspaceCloseAction,
  ExecutionWorkspaceCloseActionKind,
@@ -102,9 +101,6 @@ export type {
export type {
  Issue,
  IssueAssigneeAdapterOverrides,
  IssueReferenceSource,
  IssueRelatedWorkItem,
  IssueRelatedWorkSummary,
  IssueRelation,
  IssueRelationIssueSummary,
  IssueExecutionPolicy,
@@ -170,38 +166,17 @@ export type {
  InstanceSchedulerHeartbeatAgent,
} from "./heartbeat.js";
export type { LiveEvent } from "./live.js";
export type { DashboardRunActivityDay, DashboardSummary } from "./dashboard.js";
export type { DashboardSummary } from "./dashboard.js";
export type { ActivityEvent } from "./activity.js";
export type {
  UserProfileActivitySummary,
  UserProfileAgentUsage,
  UserProfileDailyPoint,
  UserProfileIdentity,
  UserProfileIssueSummary,
  UserProfileProviderUsage,
  UserProfileResponse,
  UserProfileWindowStats,
} from "./user-profile.js";
export type { SidebarBadges } from "./sidebar-badges.js";
export type { SidebarOrderPreference } from "./sidebar-preferences.js";
export type { InboxDismissal } from "./inbox-dismissal.js";
export type {
  AccessUserProfile,
  CompanyMemberRecord,
  CompanyMembersResponse,
  ArchiveCompanyMemberResponse,
  CompanyMembership,
  CompanyInviteListResponse,
  CompanyInviteRecord,
  PrincipalPermissionGrant,
  Invite,
  JoinRequest,
  JoinRequestInviteSummary,
  JoinRequestRecord,
  InstanceUserRoleGrant,
  AdminUserDirectoryEntry,
  UserCompanyAccessEntry,
  UserCompanyAccessResponse,
} from "./access.js";
export type { QuotaWindow, ProviderQuotaResult } from "./quota.js";
export type {
@@ -247,13 +222,8 @@ export type {
  PluginLauncherDeclaration,
  PluginMinimumHostVersion,
  PluginUiDeclaration,
  PluginDatabaseDeclaration,
  PluginApiRouteCompanyResolution,
  PluginApiRouteDeclaration,
  PaperclipPluginManifestV1,
  PluginRecord,
  PluginDatabaseNamespaceRecord,
  PluginMigrationRecord,
  PluginStateRecord,
  PluginConfig,
  PluginEntityRecord,
@@ -261,8 +231,4 @@ export type {
  PluginJobRecord,
  PluginJobRunRecord,
  PluginWebhookDeliveryRecord,
  PluginDatabaseCoreReadTable,
  PluginDatabaseMigrationStatus,
  PluginDatabaseNamespaceMode,
  PluginDatabaseNamespaceStatus,
} from "./plugin.js";

@@ -1,7 +1,6 @@
import type {
  IssueExecutionDecisionOutcome,
  IssueExecutionPolicyMode,
  IssueReferenceSourceKind,
  IssueExecutionStageType,
  IssueExecutionStateStatus,
  IssueOriginKind,
@@ -124,24 +123,6 @@ export interface IssueRelation {
  relatedIssue: IssueRelationIssueSummary;
}

export interface IssueReferenceSource {
  kind: IssueReferenceSourceKind;
|
||||
sourceRecordId: string | null;
|
||||
label: string;
|
||||
matchedText: string | null;
|
||||
}
|
||||
|
||||
export interface IssueRelatedWorkItem {
|
||||
issue: IssueRelationIssueSummary;
|
||||
mentionCount: number;
|
||||
sources: IssueReferenceSource[];
|
||||
}
|
||||
|
||||
export interface IssueRelatedWorkSummary {
|
||||
outbound: IssueRelatedWorkItem[];
|
||||
inbound: IssueRelatedWorkItem[];
|
||||
}
|
||||
|
||||
export interface IssueExecutionStagePrincipal {
|
||||
type: "agent" | "user";
|
||||
agentId?: string | null;
|
||||
@@ -217,7 +198,6 @@ export interface Issue {
|
||||
originKind?: IssueOriginKind;
|
||||
originId?: string | null;
|
||||
originRunId?: string | null;
|
||||
originFingerprint?: string | null;
|
||||
requestDepth: number;
|
||||
billingCode: string | null;
|
||||
assigneeAdapterOverrides: IssueAssigneeAdapterOverrides | null;
|
||||
@@ -234,8 +214,6 @@ export interface Issue {
|
||||
labels?: IssueLabel[];
|
||||
blockedBy?: IssueRelationIssueSummary[];
|
||||
blocks?: IssueRelationIssueSummary[];
|
||||
relatedWork?: IssueRelatedWorkSummary;
|
||||
referencedIssueIdentifiers?: string[];
|
||||
planDocument?: IssueDocument | null;
|
||||
documentSummaries?: IssueDocumentSummary[];
|
||||
legacyPlanDocument?: LegacyPlanDocument | null;
|
||||
|
||||
Some files were not shown because too many files have changed in this diff.