Mirror of https://github.com/paperclipai/paperclip (synced 2026-04-26 01:35:18 +02:00)

Compare commits: 48 commits, PAPA-41-ad ... PAPA-45-up
| Author | SHA1 | Date |
|---|---|---|
| | 347f38019f | |
| | 25615407a4 | |
| | f843a45a84 | |
| | 36049beeea | |
| | c041fee6fc | |
| | 82290451d4 | |
| | fb3b57ab1f | |
| | ca8d35fd99 | |
| | 81a7f79dfd | |
| | ad1ef6a8c6 | |
| | 833842b391 | |
| | fd6cfc7149 | |
| | 50e9f69010 | |
| | 38a0cd275e | |
| | bd6d07d0b4 | |
| | 3ab7d52f00 | |
| | 909e8cd4c8 | |
| | 36376968af | |
| | 29d0e82dce | |
| | 1c1040e219 | |
| | 0ec8257563 | |
| | 38833304d4 | |
| | 85e6371cb6 | |
| | daea94a2ed | |
| | c18b3cb414 | |
| | af844b778e | |
| | 53dbcd185e | |
| | f16de6026d | |
| | 34044cdfce | |
| | ca5659f734 | |
| | d12e3e3d1a | |
| | c0d0d03bce | |
| | 3db6bdfc3c | |
| | 6524dbe08f | |
| | 2c1883fc77 | |
| | 4abd53c089 | |
| | 3c99ab8d01 | |
| | 9d6d159209 | |
| | 26069682ee | |
| | 1e24e6e84c | |
| | 9d89d74d70 | |
| | 056a5ee32a | |
| | dedd972e3d | |
| | 6a7830b07e | |
| | f9cebe9b73 | |
| | 9e1ee925cd | |
| | e5b2e8b29b | |
| | 62d8b39474 | |
.github/CODEOWNERS (vendored): 7 changes

@@ -8,3 +8,10 @@ scripts/rollback-latest.sh @cryppadotta @devinfoley
 doc/RELEASING.md @cryppadotta @devinfoley
 doc/PUBLISHING.md @cryppadotta @devinfoley
 doc/RELEASE-AUTOMATION-SETUP.md @cryppadotta @devinfoley
+
+# Package files — dependency changes require review
+# package.json matches recursively at all depths (covers root + all workspaces)
+package.json @cryppadotta @devinfoley
+pnpm-lock.yaml @cryppadotta @devinfoley
+pnpm-workspace.yaml @cryppadotta @devinfoley
+.npmrc @cryppadotta @devinfoley
.github/PULL_REQUEST_TEMPLATE.md (vendored): 16 changes

@@ -38,9 +38,25 @@
 
-
-
-
+## Model Used
+
+<!--
+Required. Specify which AI model was used to produce or assist with
+this change. Be as descriptive as possible — include:
+• Provider and model name (e.g., Claude, GPT, Gemini, Codex)
+• Exact model ID or version (e.g., claude-opus-4-6, gpt-4-turbo-2024-04-09)
+• Context window size if relevant (e.g., 1M context)
+• Reasoning/thinking mode if applicable (e.g., extended thinking, chain-of-thought)
+• Any other relevant capability details (e.g., tool use, code execution)
+If no AI model was used, write "None — human-authored".
+-->
+
 ## Checklist
 
 - [ ] I have included a thinking path that traces from project context to this change
+- [ ] I have specified the model used (with version and capability details)
 - [ ] I have run tests locally and they pass
 - [ ] I have added or updated tests where applicable
 - [ ] If this change affects the UI, I have included before/after screenshots
.gitignore (vendored): 1 change

@@ -31,6 +31,7 @@ server/src/**/*.js.map
 server/src/**/*.d.ts
 server/src/**/*.d.ts.map
 tmp/
+feedback-export-*
 
 # Editor / tool temp files
 *.tmp
@@ -11,8 +11,9 @@ We really appreciate both small fixes and thoughtful larger changes.
 - Pick **one** clear thing to fix/improve
 - Touch the **smallest possible number of files**
 - Make sure the change is very targeted and easy to review
-- All automated checks pass (including Greptile comments)
-- No new lint/test failures
+- All tests pass and CI is green
+- Greptile score is 5/5 with all comments addressed
+- Use the [PR template](.github/PULL_REQUEST_TEMPLATE.md)
 
 These almost always get merged quickly when they're clean.

@@ -26,11 +27,26 @@ These almost always get merged quickly when they're clean.
 - Before / After screenshots (or short video if UI/behavior change)
 - Clear description of what & why
 - Proof it works (manual testing notes)
-- All tests passing
-- All Greptile + other PR comments addressed
+- All tests passing and CI green
+- Greptile score 5/5 with all comments addressed
+- [PR template](.github/PULL_REQUEST_TEMPLATE.md) fully filled out
 
 PRs that follow this path are **much** more likely to be accepted, even when they're large.
 
+## PR Requirements (all PRs)
+
+### Use the PR Template
+
+Every pull request **must** follow the PR template at [`.github/PULL_REQUEST_TEMPLATE.md`](.github/PULL_REQUEST_TEMPLATE.md). If you create a PR via the GitHub API or other tooling that bypasses the template, copy its contents into your PR description manually. The template includes required sections: Thinking Path, What Changed, Verification, Risks, and a Checklist.
+
+### Tests Must Pass
+
+All tests must pass before a PR can be merged. Run them locally first and verify CI is green after pushing.
+
+### Greptile Review
+
+We use [Greptile](https://greptile.com) for automated code review. Your PR must achieve a **5/5 Greptile score** with **all Greptile comments addressed** before it can be merged. If Greptile leaves comments, fix or respond to each one and request a re-review.
+
 ## General Rules (both paths)
 
 - Write clear commit messages

@@ -41,7 +57,7 @@ PRs that follow this path are **much** more likely to be accepted, even when they're large.
 
 ## Writing a Good PR message
 
-Please include a "thinking path" at the top of your PR message that explains from the top of the project down to what you fixed. E.g.:
+Your PR description must follow the [PR template](.github/PULL_REQUEST_TEMPLATE.md). All sections are required. The "thinking path" at the top explains from the top of the project down to what you fixed. E.g.:
 
 ### Thinking Path Example 1:
README.md: 13 changes

@@ -257,6 +257,19 @@ See [doc/DEVELOPING.md](doc/DEVELOPING.md) for the full development guide.
 
 Find Plugins and more at [awesome-paperclip](https://github.com/gsxdsm/awesome-paperclip)
 
+## Telemetry
+
+Paperclip collects anonymous usage telemetry to help us understand how the product is used and improve it. No personal information, issue content, prompts, file paths, or secrets are ever collected. Private repository references are hashed with a per-install salt before being sent.
+
+Telemetry is **enabled by default** and can be disabled with any of the following:
+
+| Method | How |
+|---|---|
+| Environment variable | `PAPERCLIP_TELEMETRY_DISABLED=1` |
+| Standard convention | `DO_NOT_TRACK=1` |
+| CI environments | Automatically disabled when `CI=true` |
+| Config file | Set `telemetry.enabled: false` in your Paperclip config |
+
 ## Contributing
 
 We welcome contributions. See the [contributing guide](CONTRIBUTING.md) for details.
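The environment-variable opt-outs described in the README diff above can be exercised directly from a shell. A minimal sketch; the variable names are the ones documented in the diff, and no Paperclip command is actually invoked here:

```shell
# Either documented variable disables Paperclip telemetry for this shell session.
export PAPERCLIP_TELEMETRY_DISABLED=1
# The cross-tool convention is honored as well:
export DO_NOT_TRACK=1

# Confirm both are set before launching anything.
echo "telemetry-disabled=${PAPERCLIP_TELEMETRY_DISABLED} do-not-track=${DO_NOT_TRACK}"
```

Either variable alone is sufficient per the table above; setting both is just belt and suspenders.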
@@ -44,6 +44,9 @@ function writeBaseConfig(configPath: string) {
     baseUrlMode: "auto",
     disableSignUp: false,
   },
+  telemetry: {
+    enabled: true,
+  },
   storage: {
     provider: "local_disk",
     localDisk: { baseDir: "/tmp/paperclip-storage" },
@@ -15,6 +15,10 @@ function makeCompany(overrides: Partial<Company>): Company {
     budgetMonthlyCents: 0,
     spentMonthlyCents: 0,
     requireBoardApprovalForNewAgents: false,
+    feedbackDataSharingEnabled: false,
+    feedbackDataSharingConsentAt: null,
+    feedbackDataSharingConsentByUserId: null,
+    feedbackDataSharingTermsVersion: null,
     brandColor: null,
     logoAssetId: null,
     logoUrl: null,
@@ -1,7 +1,7 @@
 import { describe, expect, it } from "vitest";
 import {
   isGithubShorthand,
-  isGithubUrl,
+  looksLikeRepoUrl,
   isHttpUrl,
   normalizeGithubImportSource,
 } from "../commands/client/company.js";
@@ -21,17 +21,17 @@ describe("isHttpUrl", () => {
   });
 });
 
-describe("isGithubUrl", () => {
+describe("looksLikeRepoUrl", () => {
   it("matches GitHub URLs", () => {
-    expect(isGithubUrl("https://github.com/org/repo")).toBe(true);
+    expect(looksLikeRepoUrl("https://github.com/org/repo")).toBe(true);
   });
 
-  it("rejects non-GitHub HTTP URLs", () => {
-    expect(isGithubUrl("https://example.com/foo")).toBe(false);
+  it("rejects URLs without owner/repo path", () => {
+    expect(looksLikeRepoUrl("https://example.com/foo")).toBe(false);
   });
 
   it("rejects local paths", () => {
-    expect(isGithubUrl("/tmp/my-company")).toBe(false);
+    expect(looksLikeRepoUrl("/tmp/my-company")).toBe(false);
   });
 });
@@ -163,6 +163,10 @@ describe("renderCompanyImportPreview", () => {
       brandColor: null,
       logoPath: null,
       requireBoardApprovalForNewAgents: false,
+      feedbackDataSharingEnabled: false,
+      feedbackDataSharingConsentAt: null,
+      feedbackDataSharingConsentByUserId: null,
+      feedbackDataSharingTermsVersion: null,
     },
     sidebar: {
       agents: ["ceo"],
@@ -371,6 +375,10 @@ describe("import selection catalog", () => {
       brandColor: null,
       logoPath: "images/company-logo.png",
       requireBoardApprovalForNewAgents: false,
+      feedbackDataSharingEnabled: false,
+      feedbackDataSharingConsentAt: null,
+      feedbackDataSharingConsentByUserId: null,
+      feedbackDataSharingTermsVersion: null,
     },
     sidebar: {
       agents: ["ceo"],
@@ -46,6 +46,9 @@ function createTempConfig(): string {
     baseUrlMode: "auto",
     disableSignUp: false,
   },
+  telemetry: {
+    enabled: true,
+  },
   storage: {
     provider: "local_disk",
     localDisk: {
cli/src/__tests__/feedback.test.ts (new file): 177 lines

@@ -0,0 +1,177 @@
import os from "node:os";
import path from "node:path";
import { mkdtemp, readFile } from "node:fs/promises";
import { Command } from "commander";
import { describe, expect, it } from "vitest";
import type { FeedbackTrace } from "@paperclipai/shared";
import { readZipArchive } from "../commands/client/zip.js";
import {
  buildFeedbackTraceQuery,
  registerFeedbackCommands,
  renderFeedbackReport,
  summarizeFeedbackTraces,
  writeFeedbackExportBundle,
} from "../commands/client/feedback.js";

function makeTrace(overrides: Partial<FeedbackTrace> = {}): FeedbackTrace {
  return {
    id: "trace-12345678",
    companyId: "company-123",
    feedbackVoteId: "vote-12345678",
    issueId: "issue-123",
    projectId: "project-123",
    issueIdentifier: "PAP-123",
    issueTitle: "Fix the feedback command",
    authorUserId: "user-123",
    targetType: "issue_comment",
    targetId: "comment-123",
    vote: "down",
    status: "pending",
    destination: "paperclip_labs_feedback_v1",
    exportId: null,
    consentVersion: "feedback-data-sharing-v1",
    schemaVersion: "1",
    bundleVersion: "1",
    payloadVersion: "1",
    payloadDigest: null,
    payloadSnapshot: {
      vote: {
        value: "down",
        reason: "Needed more detail",
      },
    },
    targetSummary: {
      label: "Comment",
      excerpt: "The first answer was too vague.",
      authorAgentId: "agent-123",
      authorUserId: null,
      createdAt: new Date("2026-03-31T12:00:00.000Z"),
      documentKey: null,
      documentTitle: null,
      revisionNumber: null,
    },
    redactionSummary: null,
    attemptCount: 0,
    lastAttemptedAt: null,
    exportedAt: null,
    failureReason: null,
    createdAt: new Date("2026-03-31T12:01:00.000Z"),
    updatedAt: new Date("2026-03-31T12:02:00.000Z"),
    ...overrides,
  };
}

describe("registerFeedbackCommands", () => {
  it("registers the top-level feedback commands", () => {
    const program = new Command();

    expect(() => registerFeedbackCommands(program)).not.toThrow();

    const feedback = program.commands.find((command) => command.name() === "feedback");
    expect(feedback).toBeDefined();
    expect(feedback?.commands.map((command) => command.name())).toEqual(["report", "export"]);
    expect(feedback?.commands[0]?.options.filter((option) => option.long === "--company-id")).toHaveLength(1);
  });
});

describe("buildFeedbackTraceQuery", () => {
  it("encodes all supported filters", () => {
    expect(
      buildFeedbackTraceQuery({
        targetType: "issue_comment",
        vote: "down",
        status: "pending",
        projectId: "project-123",
        issueId: "issue-123",
        from: "2026-03-31T00:00:00.000Z",
        to: "2026-03-31T23:59:59.999Z",
        sharedOnly: true,
      }),
    ).toBe(
      "?targetType=issue_comment&vote=down&status=pending&projectId=project-123&issueId=issue-123&from=2026-03-31T00%3A00%3A00.000Z&to=2026-03-31T23%3A59%3A59.999Z&sharedOnly=true&includePayload=true",
    );
  });
});

describe("renderFeedbackReport", () => {
  it("includes summary counts and the optional reason", () => {
    const traces = [
      makeTrace(),
      makeTrace({
        id: "trace-87654321",
        feedbackVoteId: "vote-87654321",
        vote: "up",
        status: "local_only",
        payloadSnapshot: {
          vote: {
            value: "up",
            reason: null,
          },
        },
      }),
    ];

    const report = renderFeedbackReport({
      apiBase: "http://127.0.0.1:3100",
      companyId: "company-123",
      traces,
      summary: summarizeFeedbackTraces(traces),
      includePayloads: false,
    });

    expect(report).toContain("Paperclip Feedback Report");
    expect(report).toContain("thumbs up");
    expect(report).toContain("thumbs down");
    expect(report).toContain("Needed more detail");
  });
});

describe("writeFeedbackExportBundle", () => {
  it("writes votes, traces, a manifest, and a zip archive", async () => {
    const tempDir = await mkdtemp(path.join(os.tmpdir(), "paperclip-feedback-export-"));
    const outputDir = path.join(tempDir, "feedback-export");
    const traces = [
      makeTrace(),
      makeTrace({
        id: "trace-abcdef12",
        feedbackVoteId: "vote-abcdef12",
        issueIdentifier: "PAP-124",
        issueId: "issue-124",
        vote: "up",
        status: "local_only",
        payloadSnapshot: {
          vote: {
            value: "up",
            reason: null,
          },
        },
      }),
    ];

    const exported = await writeFeedbackExportBundle({
      apiBase: "http://127.0.0.1:3100",
      companyId: "company-123",
      traces,
      outputDir,
    });

    expect(exported.manifest.summary.total).toBe(2);
    expect(exported.manifest.summary.withReason).toBe(1);

    const manifest = JSON.parse(await readFile(path.join(outputDir, "index.json"), "utf8")) as {
      files: { votes: string[]; traces: string[]; zip: string };
    };
    expect(manifest.files.votes).toHaveLength(2);
    expect(manifest.files.traces).toHaveLength(2);

    const archive = await readFile(exported.zipPath);
    const zip = await readZipArchive(archive);
    expect(Object.keys(zip.files)).toEqual(
      expect.arrayContaining([
        "index.json",
        `votes/${manifest.files.votes[0]}`,
        `traces/${manifest.files.traces[0]}`,
      ]),
    );
  });
});
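The `buildFeedbackTraceQuery` expectation in the new test file relies on standard query-string percent-encoding: the colons inside ISO timestamps become the `%3A` sequences visible in the expected string. A small sketch of that behavior using only the standard `URLSearchParams` API (`buildFeedbackTraceQuery` itself is project code; this only illustrates the encoding it plausibly builds on):

```typescript
// URLSearchParams serializes values as application/x-www-form-urlencoded,
// which percent-encodes ":" as %3A while leaving "-" and "." untouched.
const params = new URLSearchParams({
  targetType: "issue_comment",
  vote: "down",
  from: "2026-03-31T00:00:00.000Z",
});
const query = `?${params.toString()}`;
console.log(query);
// → ?targetType=issue_comment&vote=down&from=2026-03-31T00%3A00%3A00.000Z
```

This is why the test's expected string asserts `%3A` in the `from` and `to` filters but leaves UUID-like IDs unescaped.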
@@ -44,6 +44,9 @@ function createExistingConfigFixture() {
     baseUrlMode: "auto",
     disableSignUp: false,
   },
+  telemetry: {
+    enabled: true,
+  },
   storage: {
     provider: "local_disk",
     localDisk: {
cli/src/__tests__/routines.test.ts (new file): 249 lines

@@ -0,0 +1,249 @@
import { randomUUID } from "node:crypto";
import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
import { eq } from "drizzle-orm";
import {
  agents,
  companies,
  createDb,
  projects,
  routines,
} from "@paperclipai/db";
import {
  getEmbeddedPostgresTestSupport,
  startEmbeddedPostgresTestDatabase,
} from "./helpers/embedded-postgres.js";
import { disableAllRoutinesInConfig } from "../commands/routines.js";

const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;

if (!embeddedPostgresSupport.supported) {
  console.warn(
    `Skipping embedded Postgres routines CLI tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
  );
}

function writeTestConfig(configPath: string, tempRoot: string, connectionString: string) {
  const config = {
    $meta: {
      version: 1,
      updatedAt: new Date().toISOString(),
      source: "doctor" as const,
    },
    database: {
      mode: "postgres" as const,
      connectionString,
      embeddedPostgresDataDir: path.join(tempRoot, "embedded-db"),
      embeddedPostgresPort: 54329,
      backup: {
        enabled: false,
        intervalMinutes: 60,
        retentionDays: 30,
        dir: path.join(tempRoot, "backups"),
      },
    },
    logging: {
      mode: "file" as const,
      logDir: path.join(tempRoot, "logs"),
    },
    server: {
      deploymentMode: "local_trusted" as const,
      exposure: "private" as const,
      host: "127.0.0.1",
      port: 3100,
      allowedHostnames: [],
      serveUi: false,
    },
    auth: {
      baseUrlMode: "auto" as const,
      disableSignUp: false,
    },
    storage: {
      provider: "local_disk" as const,
      localDisk: {
        baseDir: path.join(tempRoot, "storage"),
      },
      s3: {
        bucket: "paperclip",
        region: "us-east-1",
        prefix: "",
        forcePathStyle: false,
      },
    },
    secrets: {
      provider: "local_encrypted" as const,
      strictMode: false,
      localEncrypted: {
        keyFilePath: path.join(tempRoot, "secrets", "master.key"),
      },
    },
  };

  mkdirSync(path.dirname(configPath), { recursive: true });
  writeFileSync(configPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
}

describeEmbeddedPostgres("disableAllRoutinesInConfig", () => {
  let db!: ReturnType<typeof createDb>;
  let tempDb: Awaited<ReturnType<typeof startEmbeddedPostgresTestDatabase>> | null = null;
  let tempRoot = "";
  let configPath = "";

  beforeAll(async () => {
    tempDb = await startEmbeddedPostgresTestDatabase("paperclip-routines-cli-db-");
    db = createDb(tempDb.connectionString);
    tempRoot = mkdtempSync(path.join(os.tmpdir(), "paperclip-routines-cli-config-"));
    configPath = path.join(tempRoot, "config.json");
    writeTestConfig(configPath, tempRoot, tempDb.connectionString);
  }, 20_000);

  afterEach(async () => {
    await db.delete(routines);
    await db.delete(projects);
    await db.delete(agents);
    await db.delete(companies);
  });

  afterAll(async () => {
    await tempDb?.cleanup();
    if (tempRoot) {
      rmSync(tempRoot, { recursive: true, force: true });
    }
  });

  it("pauses only non-archived routines for the selected company", async () => {
    const companyId = randomUUID();
    const otherCompanyId = randomUUID();
    const projectId = randomUUID();
    const otherProjectId = randomUUID();
    const agentId = randomUUID();
    const otherAgentId = randomUUID();
    const activeRoutineId = randomUUID();
    const pausedRoutineId = randomUUID();
    const archivedRoutineId = randomUUID();
    const otherCompanyRoutineId = randomUUID();

    await db.insert(companies).values([
      {
        id: companyId,
        name: "Paperclip",
        issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
        requireBoardApprovalForNewAgents: false,
      },
      {
        id: otherCompanyId,
        name: "Other company",
        issuePrefix: `T${otherCompanyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
        requireBoardApprovalForNewAgents: false,
      },
    ]);

    await db.insert(agents).values([
      {
        id: agentId,
        companyId,
        name: "Coder",
        adapterType: "process",
        adapterConfig: {},
        runtimeConfig: {},
        permissions: {},
      },
      {
        id: otherAgentId,
        companyId: otherCompanyId,
        name: "Other coder",
        adapterType: "process",
        adapterConfig: {},
        runtimeConfig: {},
        permissions: {},
      },
    ]);

    await db.insert(projects).values([
      {
        id: projectId,
        companyId,
        name: "Project",
        status: "in_progress",
      },
      {
        id: otherProjectId,
        companyId: otherCompanyId,
        name: "Other project",
        status: "in_progress",
      },
    ]);

    await db.insert(routines).values([
      {
        id: activeRoutineId,
        companyId,
        projectId,
        assigneeAgentId: agentId,
        title: "Active routine",
        status: "active",
      },
      {
        id: pausedRoutineId,
        companyId,
        projectId,
        assigneeAgentId: agentId,
        title: "Paused routine",
        status: "paused",
      },
      {
        id: archivedRoutineId,
        companyId,
        projectId,
        assigneeAgentId: agentId,
        title: "Archived routine",
        status: "archived",
      },
      {
        id: otherCompanyRoutineId,
        companyId: otherCompanyId,
        projectId: otherProjectId,
        assigneeAgentId: otherAgentId,
        title: "Other company routine",
        status: "active",
      },
    ]);

    const result = await disableAllRoutinesInConfig({
      config: configPath,
      companyId,
    });

    expect(result).toMatchObject({
      companyId,
      totalRoutines: 3,
      pausedCount: 1,
      alreadyPausedCount: 1,
      archivedCount: 1,
    });

    const companyRoutines = await db
      .select({
        id: routines.id,
        status: routines.status,
      })
      .from(routines)
      .where(eq(routines.companyId, companyId));
    const statusById = new Map(companyRoutines.map((routine) => [routine.id, routine.status]));

    expect(statusById.get(activeRoutineId)).toBe("paused");
    expect(statusById.get(pausedRoutineId)).toBe("paused");
    expect(statusById.get(archivedRoutineId)).toBe("archived");

    const otherCompanyRoutine = await db
      .select({
        status: routines.status,
      })
      .from(routines)
      .where(eq(routines.id, otherCompanyRoutineId));
    expect(otherCompanyRoutine[0]?.status).toBe("active");
  });
});
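The routines test above verifies a selection rule: pause every routine of the target company that is not archived, and leave other companies alone. Stripped of the database plumbing, that rule can be sketched as pure logic (`RoutineRow` and `selectRoutinesToPause` are illustrative names, not project API):

```typescript
// Pure-logic sketch of the rule the embedded-Postgres test exercises:
// a routine qualifies for pausing when it belongs to the selected company
// and its status is anything other than "archived". Already-paused rows
// still match, which is why the test reports them as alreadyPausedCount.
type RoutineRow = { id: string; companyId: string; status: "active" | "paused" | "archived" };

function selectRoutinesToPause(rows: RoutineRow[], companyId: string): RoutineRow[] {
  return rows.filter((row) => row.companyId === companyId && row.status !== "archived");
}

const rows: RoutineRow[] = [
  { id: "active", companyId: "a", status: "active" },
  { id: "paused", companyId: "a", status: "paused" },
  { id: "archived", companyId: "a", status: "archived" },
  { id: "other", companyId: "b", status: "active" },
];

console.log(selectRoutinesToPause(rows, "a").map((row) => row.id));
// → [ 'active', 'paused' ]
```

This mirrors the test's expectations: the archived routine keeps its status and the other company's active routine is never touched.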
cli/src/__tests__/telemetry.test.ts (new file): 117 lines

@@ -0,0 +1,117 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";

const ORIGINAL_ENV = { ...process.env };
const CI_ENV_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];

function makeConfigPath(root: string, enabled: boolean): string {
  const configPath = path.join(root, ".paperclip", "config.json");
  fs.mkdirSync(path.dirname(configPath), { recursive: true });
  fs.writeFileSync(configPath, JSON.stringify({
    $meta: {
      version: 1,
      updatedAt: "2026-03-31T00:00:00.000Z",
      source: "configure",
    },
    database: {
      mode: "embedded-postgres",
      embeddedPostgresDataDir: path.join(root, "runtime", "db"),
      embeddedPostgresPort: 54329,
      backup: {
        enabled: true,
        intervalMinutes: 60,
        retentionDays: 30,
        dir: path.join(root, "runtime", "backups"),
      },
    },
    logging: {
      mode: "file",
      logDir: path.join(root, "runtime", "logs"),
    },
    server: {
      deploymentMode: "local_trusted",
      exposure: "private",
      host: "127.0.0.1",
      port: 3100,
      allowedHostnames: [],
      serveUi: true,
    },
    auth: {
      baseUrlMode: "auto",
      disableSignUp: false,
    },
    telemetry: {
      enabled,
    },
    storage: {
      provider: "local_disk",
      localDisk: {
        baseDir: path.join(root, "runtime", "storage"),
      },
      s3: {
        bucket: "paperclip",
        region: "us-east-1",
        prefix: "",
        forcePathStyle: false,
      },
    },
    secrets: {
      provider: "local_encrypted",
      strictMode: false,
      localEncrypted: {
        keyFilePath: path.join(root, "runtime", "secrets", "master.key"),
      },
    },
  }, null, 2));
  return configPath;
}

describe("cli telemetry", () => {
  beforeEach(() => {
    process.env = { ...ORIGINAL_ENV };
    for (const key of CI_ENV_VARS) {
      delete process.env[key];
    }
    vi.stubGlobal("fetch", vi.fn(async () => ({ ok: true })));
  });

  afterEach(() => {
    process.env = { ...ORIGINAL_ENV };
    vi.unstubAllGlobals();
    vi.resetModules();
  });

  it("respects telemetry.enabled=false from the config file", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
    const configPath = makeConfigPath(root, false);
    process.env.PAPERCLIP_HOME = path.join(root, "home");
    process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";

    const { initTelemetryFromConfigFile } = await import("../telemetry.js");
    const client = initTelemetryFromConfigFile(configPath);

    expect(client).toBeNull();
    expect(fs.existsSync(path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json"))).toBe(false);
  });

  it("creates telemetry state only after the first event is tracked", async () => {
    const root = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-cli-telemetry-"));
    process.env.PAPERCLIP_HOME = path.join(root, "home");
    process.env.PAPERCLIP_INSTANCE_ID = "telemetry-test";

    const { initTelemetry, flushTelemetry } = await import("../telemetry.js");
    const client = initTelemetry({ enabled: true });
    const statePath = path.join(root, "home", "instances", "telemetry-test", "telemetry", "state.json");

    expect(client).not.toBeNull();
    expect(fs.existsSync(statePath)).toBe(false);

    client!.track("install.started", { setupMode: "quickstart" });

    expect(fs.existsSync(statePath)).toBe(true);

    await flushTelemetry();
  });
});
@@ -75,6 +75,9 @@ function buildSourceConfig(): PaperclipConfig {
     publicBaseUrl: "http://127.0.0.1:3100",
     disableSignUp: false,
   },
+  telemetry: {
+    enabled: true,
+  },
   storage: {
     provider: "local_disk",
     localDisk: {
@@ -5,12 +5,14 @@ import * as p from "@clack/prompts";
 import pc from "picocolors";
 import type {
   Company,
+  FeedbackTrace,
   CompanyPortabilityFileEntry,
   CompanyPortabilityExportResult,
   CompanyPortabilityInclude,
   CompanyPortabilityPreviewResult,
   CompanyPortabilityImportResult,
 } from "@paperclipai/shared";
+import { getTelemetryClient, trackCompanyImported } from "../../telemetry.js";
 import { ApiRequestError } from "../../client/http.js";
 import { openUrl } from "../../client/board-auth.js";
 import { binaryContentTypeByExtension, readZipArchive } from "./zip.js";
@@ -22,6 +24,11 @@ import {
   resolveCommandContext,
   type BaseClientOptions,
 } from "./common.js";
+import {
+  buildFeedbackTraceQuery,
+  normalizeFeedbackTraceExportFormat,
+  serializeFeedbackTraces,
+} from "./feedback.js";
 
 interface CompanyCommandOptions extends BaseClientOptions {}
 type CompanyDeleteSelectorMode = "auto" | "id" | "prefix";
@@ -44,6 +51,20 @@ interface CompanyExportOptions extends BaseClientOptions {
   expandReferencedSkills?: boolean;
 }
 
+interface CompanyFeedbackOptions extends BaseClientOptions {
+  targetType?: string;
+  vote?: string;
+  status?: string;
+  projectId?: string;
+  issueId?: string;
+  from?: string;
+  to?: string;
+  sharedOnly?: boolean;
+  includePayload?: boolean;
+  out?: string;
+  format?: string;
+}
+
 interface CompanyImportOptions extends BaseClientOptions {
   include?: string;
   target?: CompanyImportTargetMode;
@@ -765,8 +786,15 @@ export function isHttpUrl(input: string): boolean {
   return /^https?:\/\//i.test(input.trim());
 }
 
-export function isGithubUrl(input: string): boolean {
-  return /^https?:\/\/github\.com\//i.test(input.trim());
+export function looksLikeRepoUrl(input: string): boolean {
+  try {
+    const url = new URL(input.trim());
+    if (url.protocol !== "https:") return false;
+    const segments = url.pathname.split("/").filter(Boolean);
+    return segments.length >= 2;
+  } catch {
+    return false;
+  }
 }
 
 function isGithubSegment(input: string): boolean {
@@ -797,13 +825,15 @@ function normalizeGithubImportPath(input: string | null | undefined): string | n
 }
 
 function buildGithubImportUrl(input: {
+  hostname?: string;
   owner: string;
   repo: string;
   ref?: string | null;
   path?: string | null;
   companyPath?: string | null;
 }): string {
-  const url = new URL(`https://github.com/${input.owner}/${input.repo.replace(/\.git$/i, "")}`);
+  const host = input.hostname || "github.com";
+  const url = new URL(`https://${host}/${input.owner}/${input.repo.replace(/\.git$/i, "")}`);
   const ref = input.ref?.trim();
   if (ref) {
     url.searchParams.set("ref", ref);
@@ -834,14 +864,15 @@ export function normalizeGithubImportSource(input: string, refOverride?: string)
     });
   }
 
-  if (!isGithubUrl(trimmed)) {
-    throw new Error("GitHub source must be a github.com URL or owner/repo[/path] shorthand.");
+  if (!looksLikeRepoUrl(trimmed)) {
+    throw new Error("GitHub source must be a GitHub or GitHub Enterprise URL, or owner/repo[/path] shorthand.");
   }
   if (!ref) {
     return trimmed;
   }
 
   const url = new URL(trimmed);
+  const hostname = url.hostname;
   const parts = url.pathname.split("/").filter(Boolean);
   if (parts.length < 2) {
     throw new Error("Invalid GitHub URL.");
@@ -852,18 +883,18 @@ export function normalizeGithubImportSource(input: string, refOverride?: string)
   const existingPath = normalizeGithubImportPath(url.searchParams.get("path"));
   const existingCompanyPath = normalizeGithubImportPath(url.searchParams.get("companyPath"));
   if (existingCompanyPath) {
-    return buildGithubImportUrl({ owner, repo, ref, companyPath: existingCompanyPath });
+    return buildGithubImportUrl({ hostname, owner, repo, ref, companyPath: existingCompanyPath });
   }
   if (existingPath) {
-    return buildGithubImportUrl({ owner, repo, ref, path: existingPath });
+    return buildGithubImportUrl({ hostname, owner, repo, ref, path: existingPath });
   }
   if (parts[2] === "tree") {
-    return buildGithubImportUrl({ owner, repo, ref, path: parts.slice(4).join("/") });
+    return buildGithubImportUrl({ hostname, owner, repo, ref, path: parts.slice(4).join("/") });
   }
   if (parts[2] === "blob") {
-    return buildGithubImportUrl({ owner, repo, ref, companyPath: parts.slice(4).join("/") });
+    return buildGithubImportUrl({ hostname, owner, repo, ref, companyPath: parts.slice(4).join("/") });
   }
-  return buildGithubImportUrl({ owner, repo, ref });
+  return buildGithubImportUrl({ hostname, owner, repo, ref });
 }
 
 async function pathExists(inputPath: string): Promise<boolean> {
@@ -1093,6 +1124,91 @@ export function registerCompanyCommands(program: Command): void {
     }),
   );
 
+  addCommonClientOptions(
+    company
+      .command("feedback:list")
+      .description("List feedback traces for a company")
+      .requiredOption("-C, --company-id <id>", "Company ID")
+      .option("--target-type <type>", "Filter by target type")
+      .option("--vote <vote>", "Filter by vote value")
+      .option("--status <status>", "Filter by trace status")
+      .option("--project-id <id>", "Filter by project ID")
+      .option("--issue-id <id>", "Filter by issue ID")
+      .option("--from <iso8601>", "Only include traces created at or after this timestamp")
+      .option("--to <iso8601>", "Only include traces created at or before this timestamp")
+      .option("--shared-only", "Only include traces eligible for sharing/export")
+      .option("--include-payload", "Include stored payload snapshots in the response")
+      .action(async (opts: CompanyFeedbackOptions) => {
+        try {
+          const ctx = resolveCommandContext(opts, { requireCompany: true });
+          const traces = (await ctx.api.get<FeedbackTrace[]>(
+            `/api/companies/${ctx.companyId}/feedback-traces${buildFeedbackTraceQuery(opts)}`,
+          )) ?? [];
+          if (ctx.json) {
+            printOutput(traces, { json: true });
+            return;
+          }
+          printOutput(
+            traces.map((trace) => ({
+              id: trace.id,
+              issue: trace.issueIdentifier ?? trace.issueId,
+              vote: trace.vote,
+              status: trace.status,
+              targetType: trace.targetType,
+              target: trace.targetSummary.label,
+            })),
+            { json: false },
+          );
+        } catch (err) {
+          handleCommandError(err);
+        }
+      }),
+    { includeCompany: false },
+  );
+
+  addCommonClientOptions(
+    company
+      .command("feedback:export")
+      .description("Export feedback traces for a company")
+      .requiredOption("-C, --company-id <id>", "Company ID")
+      .option("--target-type <type>", "Filter by target type")
+      .option("--vote <vote>", "Filter by vote value")
+      .option("--status <status>", "Filter by trace status")
+      .option("--project-id <id>", "Filter by project ID")
+      .option("--issue-id <id>", "Filter by issue ID")
+      .option("--from <iso8601>", "Only include traces created at or after this timestamp")
+      .option("--to <iso8601>", "Only include traces created at or before this timestamp")
+      .option("--shared-only", "Only include traces eligible for sharing/export")
+      .option("--include-payload", "Include stored payload snapshots in the export")
+      .option("--out <path>", "Write export to a file path instead of stdout")
+      .option("--format <format>", "Export format: json or ndjson", "ndjson")
+      .action(async (opts: CompanyFeedbackOptions) => {
+        try {
+          const ctx = resolveCommandContext(opts, { requireCompany: true });
+          const traces = (await ctx.api.get<FeedbackTrace[]>(
+            `/api/companies/${ctx.companyId}/feedback-traces${buildFeedbackTraceQuery(opts, opts.includePayload ?? true)}`,
+          )) ?? [];
+          const serialized = serializeFeedbackTraces(traces, opts.format);
+          if (opts.out?.trim()) {
+            await writeFile(opts.out, serialized, "utf8");
+            if (ctx.json) {
+              printOutput(
+                { out: opts.out, count: traces.length, format: normalizeFeedbackTraceExportFormat(opts.format) },
+                { json: true },
+              );
+              return;
+            }
+            console.log(`Wrote ${traces.length} feedback trace(s) to ${opts.out}`);
+            return;
+          }
+          process.stdout.write(`${serialized}${serialized.endsWith("\n") ? "" : "\n"}`);
+        } catch (err) {
+          handleCommandError(err);
+        }
+      }),
+    { includeCompany: false },
+  );
+
   addCommonClientOptions(
     company
       .command("export")
@@ -1208,13 +1324,13 @@ export function registerCompanyCommands(program: Command): void {
         | { type: "github"; url: string };
 
       const treatAsLocalPath = !isHttpUrl(from) && await pathExists(from);
-      const isGithubSource = isGithubUrl(from) || (isGithubShorthand(from) && !treatAsLocalPath);
+      const isGithubSource = looksLikeRepoUrl(from) || (isGithubShorthand(from) && !treatAsLocalPath);
 
       if (isHttpUrl(from) || isGithubSource) {
-        if (!isGithubUrl(from) && !isGithubShorthand(from)) {
+        if (!looksLikeRepoUrl(from) && !isGithubShorthand(from)) {
           throw new Error(
             "Only GitHub URLs and local paths are supported for import. " +
-              "Generic HTTP URLs are not supported. Use a GitHub URL (https://github.com/...) or a local directory path.",
+              "Generic HTTP URLs are not supported. Use a GitHub or GitHub Enterprise URL (https://github.com/... or https://ghe.example.com/...) or a local directory path.",
           );
         }
         sourcePayload = { type: "github", url: normalizeGithubImportSource(from, opts.ref) };
@@ -1325,6 +1441,12 @@ export function registerCompanyCommands(program: Command): void {
       if (!imported) {
        throw new Error("Import request returned no data.");
       }
+      const tc = getTelemetryClient();
+      if (tc) {
+        const isPrivate = sourcePayload.type !== "github";
+        const sourceRef = sourcePayload.type === "github" ? sourcePayload.url : from;
+        trackCompanyImported(tc, { sourceType: sourcePayload.type, sourceRef, isPrivate });
+      }
       let companyUrl: string | undefined;
       if (!ctx.json) {
         try {
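The diff above swaps the github.com-only regex check for a host-agnostic predicate, so GitHub Enterprise hosts are accepted. A minimal standalone sketch of that predicate (the function body mirrors the diff; this is an illustration, not the project's actual module):

```typescript
// Accept any https URL whose path has at least two segments (owner/repo).
// This lets GitHub Enterprise hosts pass as well as github.com, while
// rejecting plain http, single-segment paths, and non-URL strings.
export function looksLikeRepoUrl(input: string): boolean {
  try {
    const url = new URL(input.trim());
    if (url.protocol !== "https:") return false;
    const segments = url.pathname.split("/").filter(Boolean);
    return segments.length >= 2;
  } catch {
    // new URL() throws on strings that are not parseable URLs.
    return false;
  }
}
```

Note the trade-off: the old check encoded the host, while the new one only validates shape, so the accompanying error messages now mention "GitHub or GitHub Enterprise" rather than github.com alone.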
645
cli/src/commands/client/feedback.ts
Normal file
645
cli/src/commands/client/feedback.ts
Normal file
@@ -0,0 +1,645 @@
|
|||||||
|
import { mkdir, readdir, readFile, stat, writeFile } from "node:fs/promises";
|
||||||
|
import path from "node:path";
|
||||||
|
import pc from "picocolors";
|
||||||
|
import { Command } from "commander";
|
||||||
|
import type { Company, FeedbackTrace, FeedbackTraceBundle } from "@paperclipai/shared";
|
||||||
|
import {
|
||||||
|
addCommonClientOptions,
|
||||||
|
handleCommandError,
|
||||||
|
printOutput,
|
||||||
|
resolveCommandContext,
|
||||||
|
type BaseClientOptions,
|
||||||
|
type ResolvedClientContext,
|
||||||
|
} from "./common.js";
|
||||||
|
|
||||||
|
interface FeedbackFilterOptions extends BaseClientOptions {
|
||||||
|
targetType?: string;
|
||||||
|
vote?: string;
|
||||||
|
status?: string;
|
||||||
|
projectId?: string;
|
||||||
|
issueId?: string;
|
||||||
|
from?: string;
|
||||||
|
to?: string;
|
||||||
|
sharedOnly?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface FeedbackTraceQueryOptions {
|
||||||
|
targetType?: string;
|
||||||
|
vote?: string;
|
||||||
|
status?: string;
|
||||||
|
projectId?: string;
|
||||||
|
issueId?: string;
|
||||||
|
from?: string;
|
||||||
|
to?: string;
|
||||||
|
sharedOnly?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FeedbackReportOptions extends FeedbackFilterOptions {
|
||||||
|
payloads?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FeedbackExportOptions extends FeedbackFilterOptions {
|
||||||
|
out?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FeedbackSummary {
|
||||||
|
total: number;
|
||||||
|
thumbsUp: number;
|
||||||
|
thumbsDown: number;
|
||||||
|
withReason: number;
|
||||||
|
statuses: Record<string, number>;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FeedbackExportManifest {
|
||||||
|
exportedAt: string;
|
||||||
|
serverUrl: string;
|
||||||
|
companyId: string;
|
||||||
|
summary: FeedbackSummary & {
|
||||||
|
uniqueIssues: number;
|
||||||
|
issues: string[];
|
||||||
|
};
|
||||||
|
files: {
|
||||||
|
votes: string[];
|
||||||
|
traces: string[];
|
||||||
|
fullTraces: string[];
|
||||||
|
zip: string;
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FeedbackExportResult {
|
||||||
|
outputDir: string;
|
||||||
|
zipPath: string;
|
||||||
|
manifest: FeedbackExportManifest;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function registerFeedbackCommands(program: Command): void {
|
||||||
|
const feedback = program.command("feedback").description("Inspect and export local feedback traces");
|
||||||
|
|
||||||
|
addCommonClientOptions(
|
||||||
|
feedback
|
||||||
|
.command("report")
|
||||||
|
.description("Render a terminal report for company feedback traces")
|
||||||
|
.option("-C, --company-id <id>", "Company ID (overrides context default)")
|
||||||
|
.option("--target-type <type>", "Filter by target type")
|
||||||
|
.option("--vote <vote>", "Filter by vote value")
|
||||||
|
.option("--status <status>", "Filter by trace status")
|
||||||
|
.option("--project-id <id>", "Filter by project ID")
|
||||||
|
.option("--issue-id <id>", "Filter by issue ID")
|
||||||
|
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||||
|
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||||
|
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||||
|
.option("--payloads", "Include raw payload dumps in the terminal report", false)
|
||||||
|
.action(async (opts: FeedbackReportOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
|
||||||
|
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
|
||||||
|
const summary = summarizeFeedbackTraces(traces);
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(
|
||||||
|
{
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
companyId,
|
||||||
|
summary,
|
||||||
|
traces,
|
||||||
|
},
|
||||||
|
{ json: true },
|
||||||
|
);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
console.log(renderFeedbackReport({
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
companyId,
|
||||||
|
traces,
|
||||||
|
summary,
|
||||||
|
includePayloads: Boolean(opts.payloads),
|
||||||
|
}));
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
{ includeCompany: false },
|
||||||
|
);
|
||||||
|
|
||||||
|
addCommonClientOptions(
|
||||||
|
feedback
|
||||||
|
.command("export")
|
||||||
|
.description("Export feedback votes and raw trace bundles into a folder plus zip archive")
|
||||||
|
.option("-C, --company-id <id>", "Company ID (overrides context default)")
|
||||||
|
.option("--target-type <type>", "Filter by target type")
|
||||||
|
.option("--vote <vote>", "Filter by vote value")
|
||||||
|
.option("--status <status>", "Filter by trace status")
|
||||||
|
.option("--project-id <id>", "Filter by project ID")
|
||||||
|
.option("--issue-id <id>", "Filter by issue ID")
|
||||||
|
.option("--from <iso8601>", "Only include traces created at or after this timestamp")
|
||||||
|
.option("--to <iso8601>", "Only include traces created at or before this timestamp")
|
||||||
|
.option("--shared-only", "Only include traces eligible for sharing/export")
|
||||||
|
.option("--out <path>", "Output directory (default: ./feedback-export-<timestamp>)")
|
||||||
|
.action(async (opts: FeedbackExportOptions) => {
|
||||||
|
try {
|
||||||
|
const ctx = resolveCommandContext(opts);
|
||||||
|
const companyId = await resolveFeedbackCompanyId(ctx, opts.companyId);
|
||||||
|
const traces = await fetchCompanyFeedbackTraces(ctx, companyId, opts);
|
||||||
|
const outputDir = path.resolve(opts.out?.trim() || defaultFeedbackExportDirName());
|
||||||
|
const exported = await writeFeedbackExportBundle({
|
||||||
|
apiBase: ctx.api.apiBase,
|
||||||
|
companyId,
|
||||||
|
traces,
|
||||||
|
outputDir,
|
||||||
|
traceBundleFetcher: (trace) => fetchFeedbackTraceBundle(ctx, trace.id),
|
||||||
|
});
|
||||||
|
if (ctx.json) {
|
||||||
|
printOutput(
|
||||||
|
{
|
||||||
|
companyId,
|
||||||
|
outputDir: exported.outputDir,
|
||||||
|
zipPath: exported.zipPath,
|
||||||
|
summary: exported.manifest.summary,
|
||||||
|
},
|
||||||
|
{ json: true },
|
||||||
|
);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
console.log(renderFeedbackExportSummary(exported));
|
||||||
|
} catch (err) {
|
||||||
|
handleCommandError(err);
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
{ includeCompany: false },
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function resolveFeedbackCompanyId(
|
||||||
|
ctx: ResolvedClientContext,
|
||||||
|
explicitCompanyId?: string,
|
||||||
|
): Promise<string> {
|
||||||
|
const direct = explicitCompanyId?.trim() || ctx.companyId?.trim();
|
||||||
|
if (direct) return direct;
|
||||||
|
const companies = (await ctx.api.get<Company[]>("/api/companies")) ?? [];
|
||||||
|
const companyId = companies[0]?.id?.trim();
|
||||||
|
if (!companyId) {
|
||||||
|
throw new Error(
|
||||||
|
"Company ID is required. Pass --company-id, set PAPERCLIP_COMPANY_ID, or configure a CLI context default.",
|
||||||
|
);
|
||||||
|
}
|
||||||
|
return companyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function buildFeedbackTraceQuery(opts: FeedbackTraceQueryOptions, includePayload = true): string {
|
||||||
|
const params = new URLSearchParams();
|
||||||
|
if (opts.targetType) params.set("targetType", opts.targetType);
|
||||||
|
if (opts.vote) params.set("vote", opts.vote);
|
||||||
|
if (opts.status) params.set("status", opts.status);
|
||||||
|
if (opts.projectId) params.set("projectId", opts.projectId);
|
||||||
|
if (opts.issueId) params.set("issueId", opts.issueId);
|
||||||
|
if (opts.from) params.set("from", opts.from);
|
||||||
|
if (opts.to) params.set("to", opts.to);
|
||||||
|
if (opts.sharedOnly) params.set("sharedOnly", "true");
|
||||||
|
if (includePayload) params.set("includePayload", "true");
|
||||||
|
const query = params.toString();
|
||||||
|
return query ? `?${query}` : "";
|
||||||
|
}
|
||||||
|
|
||||||
|
export function normalizeFeedbackTraceExportFormat(value: string | undefined): "json" | "ndjson" {
|
||||||
|
if (!value || value === "ndjson") return "ndjson";
|
||||||
|
if (value === "json") return "json";
|
||||||
|
throw new Error(`Unsupported export format: ${value}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
export function serializeFeedbackTraces(traces: FeedbackTrace[], format: string | undefined): string {
|
||||||
|
if (normalizeFeedbackTraceExportFormat(format) === "json") {
|
||||||
|
return JSON.stringify(traces, null, 2);
|
||||||
|
}
|
||||||
|
return traces.map((trace) => JSON.stringify(trace)).join("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function fetchCompanyFeedbackTraces(
|
||||||
|
ctx: ResolvedClientContext,
|
||||||
|
companyId: string,
|
||||||
|
opts: FeedbackFilterOptions,
|
||||||
|
): Promise<FeedbackTrace[]> {
|
||||||
|
return (
|
||||||
|
(await ctx.api.get<FeedbackTrace[]>(
|
||||||
|
`/api/companies/${companyId}/feedback-traces${buildFeedbackTraceQuery(opts, true)}`,
|
||||||
|
)) ?? []
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function fetchFeedbackTraceBundle(
|
||||||
|
ctx: ResolvedClientContext,
|
||||||
|
traceId: string,
|
||||||
|
): Promise<FeedbackTraceBundle> {
|
||||||
|
const bundle = await ctx.api.get<FeedbackTraceBundle>(`/api/feedback-traces/${traceId}/bundle`);
|
||||||
|
if (!bundle) {
|
||||||
|
throw new Error(`Feedback trace bundle ${traceId} not found`);
|
||||||
|
}
|
||||||
|
return bundle;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function summarizeFeedbackTraces(traces: FeedbackTrace[]): FeedbackSummary {
|
||||||
|
const statuses: Record<string, number> = {};
|
||||||
|
let thumbsUp = 0;
|
||||||
|
let thumbsDown = 0;
|
||||||
|
let withReason = 0;
|
||||||
|
|
||||||
|
for (const trace of traces) {
|
||||||
|
if (trace.vote === "up") thumbsUp += 1;
|
||||||
|
if (trace.vote === "down") thumbsDown += 1;
|
||||||
|
if (readFeedbackReason(trace)) withReason += 1;
|
||||||
|
statuses[trace.status] = (statuses[trace.status] ?? 0) + 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
total: traces.length,
|
||||||
|
thumbsUp,
|
||||||
|
thumbsDown,
|
||||||
|
withReason,
|
||||||
|
statuses,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
export function renderFeedbackReport(input: {
|
||||||
|
apiBase: string;
|
||||||
|
companyId: string;
|
||||||
|
traces: FeedbackTrace[];
|
||||||
|
summary: FeedbackSummary;
|
||||||
|
includePayloads: boolean;
|
||||||
|
}): string {
|
||||||
|
const lines: string[] = [];
|
||||||
|
lines.push("");
|
||||||
|
lines.push(pc.bold(pc.magenta("Paperclip Feedback Report")));
|
||||||
|
lines.push(pc.dim(new Date().toISOString()));
|
||||||
|
lines.push(horizontalRule());
|
||||||
|
lines.push(`${pc.dim("Server:")} ${input.apiBase}`);
|
||||||
|
lines.push(`${pc.dim("Company:")} ${input.companyId}`);
|
||||||
|
lines.push("");
|
||||||
|
|
||||||
|
if (input.traces.length === 0) {
|
||||||
|
lines.push(pc.yellow("[!!] No feedback traces found."));
|
||||||
|
lines.push("");
|
||||||
|
return lines.join("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
lines.push(pc.bold(pc.cyan("Summary")));
|
||||||
|
lines.push(horizontalRule());
|
||||||
|
lines.push(` ${pc.green(pc.bold(String(input.summary.thumbsUp)))} thumbs up`);
|
||||||
|
lines.push(` ${pc.red(pc.bold(String(input.summary.thumbsDown)))} thumbs down`);
|
||||||
|
lines.push(` ${pc.yellow(pc.bold(String(input.summary.withReason)))} downvotes with a reason`);
|
||||||
|
lines.push(` ${pc.bold(String(input.summary.total))} total traces`);
|
||||||
|
lines.push("");
|
||||||
|
lines.push(pc.dim("Export status:"));
|
||||||
|
for (const status of ["pending", "sent", "local_only", "failed"]) {
|
||||||
|
lines.push(` ${padRight(status, 10)} ${input.summary.statuses[status] ?? 0}`);
|
||||||
|
}
|
||||||
|
lines.push("");
|
||||||
|
lines.push(pc.bold(pc.cyan("Trace Details")));
|
||||||
|
lines.push(horizontalRule());
|
||||||
|
|
||||||
|
for (const trace of input.traces) {
|
||||||
|
const voteColor = trace.vote === "up" ? pc.green : pc.red;
|
||||||
|
const voteIcon = trace.vote === "up" ? "^" : "v";
|
||||||
|
const issueRef = trace.issueIdentifier ?? trace.issueId;
|
||||||
|
const label = trace.targetSummary.label?.trim() || trace.targetType;
|
||||||
|
const excerpt = compactText(trace.targetSummary.excerpt);
|
||||||
|
const reason = readFeedbackReason(trace);
|
||||||
|
lines.push(
|
||||||
|
` ${voteColor(voteIcon)} ${pc.bold(issueRef)} ${pc.dim(compactText(trace.issueTitle, 64))}`,
|
||||||
|
);
|
||||||
|
lines.push(
|
||||||
|
` ${pc.dim("Trace:")} ${trace.id.slice(0, 8)} ${pc.dim("Status:")} ${trace.status} ${pc.dim("Date:")} ${formatTimestamp(trace.createdAt)}`,
|
||||||
|
);
|
||||||
|
lines.push(` ${pc.dim("Target:")} ${label}`);
|
||||||
|
if (excerpt) {
|
||||||
|
lines.push(` ${pc.dim("Excerpt:")} ${excerpt}`);
|
||||||
|
}
|
||||||
|
if (reason) {
|
||||||
|
lines.push(` ${pc.yellow(pc.bold("Reason:"))} ${pc.yellow(reason)}`);
|
||||||
|
}
|
||||||
|
lines.push("");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (input.includePayloads) {
|
||||||
|
lines.push(pc.bold(pc.cyan("Raw Payloads")));
|
||||||
|
lines.push(horizontalRule());
|
||||||
|
for (const trace of input.traces) {
|
||||||
|
if (!trace.payloadSnapshot) continue;
|
||||||
|
const issueRef = trace.issueIdentifier ?? trace.issueId;
|
||||||
|
lines.push(` ${pc.bold(`${issueRef} (${trace.id.slice(0, 8)})`)}`);
|
||||||
|
const body = JSON.stringify(trace.payloadSnapshot, null, 2)?.split("\n") ?? [];
|
||||||
|
for (const line of body) {
|
||||||
|
lines.push(` ${pc.dim(line)}`);
|
||||||
|
}
|
||||||
|
lines.push("");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
lines.push(horizontalRule());
|
||||||
|
lines.push(pc.dim(`Report complete. ${input.traces.length} trace(s) displayed.`));
|
||||||
|
lines.push("");
|
||||||
|
return lines.join("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function writeFeedbackExportBundle(input: {
|
||||||
|
apiBase: string;
|
||||||
|
companyId: string;
|
||||||
|
traces: FeedbackTrace[];
|
||||||
|
outputDir: string;
|
||||||
|
traceBundleFetcher?: (trace: FeedbackTrace) => Promise<FeedbackTraceBundle>;
|
||||||
|
}): Promise<FeedbackExportResult> {
|
||||||
|
await ensureEmptyOutputDirectory(input.outputDir);
|
||||||
|
await mkdir(path.join(input.outputDir, "votes"), { recursive: true });
|
||||||
|
await mkdir(path.join(input.outputDir, "traces"), { recursive: true });
|
||||||
|
await mkdir(path.join(input.outputDir, "full-traces"), { recursive: true });
|
||||||
|
|
||||||
|
  const summary = summarizeFeedbackTraces(input.traces);
  const voteFiles: string[] = [];
  const traceFiles: string[] = [];
  const fullTraceDirs: string[] = [];
  const fullTraceFiles: string[] = [];
  const issueSet = new Set<string>();

  for (const trace of input.traces) {
    const issueRef = sanitizeFileSegment(trace.issueIdentifier ?? trace.issueId);
    const voteRecord = buildFeedbackVoteRecord(trace);
    const voteFileName = `${issueRef}-${trace.feedbackVoteId.slice(0, 8)}.json`;
    const traceFileName = `${issueRef}-${trace.id.slice(0, 8)}.json`;
    voteFiles.push(voteFileName);
    traceFiles.push(traceFileName);
    issueSet.add(trace.issueIdentifier ?? trace.issueId);
    await writeFile(
      path.join(input.outputDir, "votes", voteFileName),
      `${JSON.stringify(voteRecord, null, 2)}\n`,
      "utf8",
    );
    await writeFile(
      path.join(input.outputDir, "traces", traceFileName),
      `${JSON.stringify(trace, null, 2)}\n`,
      "utf8",
    );

    if (input.traceBundleFetcher) {
      const bundle = await input.traceBundleFetcher(trace);
      const bundleDirName = `${issueRef}-${trace.id.slice(0, 8)}`;
      const bundleDir = path.join(input.outputDir, "full-traces", bundleDirName);
      await mkdir(bundleDir, { recursive: true });
      fullTraceDirs.push(bundleDirName);
      await writeFile(
        path.join(bundleDir, "bundle.json"),
        `${JSON.stringify(bundle, null, 2)}\n`,
        "utf8",
      );
      fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, "bundle.json"));
      for (const file of bundle.files) {
        const targetPath = path.join(bundleDir, file.path);
        await mkdir(path.dirname(targetPath), { recursive: true });
        await writeFile(targetPath, file.contents, "utf8");
        fullTraceFiles.push(path.posix.join("full-traces", bundleDirName, file.path.replace(/\\/g, "/")));
      }
    }
  }

  const zipPath = `${input.outputDir}.zip`;
  const manifest: FeedbackExportManifest = {
    exportedAt: new Date().toISOString(),
    serverUrl: input.apiBase,
    companyId: input.companyId,
    summary: {
      ...summary,
      uniqueIssues: issueSet.size,
      issues: Array.from(issueSet).sort((left, right) => left.localeCompare(right)),
    },
    files: {
      votes: voteFiles.slice().sort((left, right) => left.localeCompare(right)),
      traces: traceFiles.slice().sort((left, right) => left.localeCompare(right)),
      fullTraces: fullTraceDirs.slice().sort((left, right) => left.localeCompare(right)),
      zip: path.basename(zipPath),
    },
  };

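For reference, the `index.json` manifest assembled above serializes to a document of this shape. The values below are illustrative, and the `thumbsUp`/`thumbsDown`/`withReason` counters come from `summarizeFeedbackTraces`, whose definition is outside this excerpt:

```json
{
  "exportedAt": "2025-01-15T12:00:00.000Z",
  "serverUrl": "https://paperclip.example.com",
  "companyId": "company-123",
  "summary": {
    "thumbsUp": 4,
    "thumbsDown": 1,
    "withReason": 2,
    "uniqueIssues": 3,
    "issues": ["PAPA-41", "PAPA-44", "PAPA-45"]
  },
  "files": {
    "votes": ["PAPA-41-0a1b2c3d.json"],
    "traces": ["PAPA-41-9f8e7d6c.json"],
    "fullTraces": ["PAPA-41-9f8e7d6c"],
    "zip": "feedback-export-20250115T120000Z.zip"
  }
}
```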
  await writeFile(
    path.join(input.outputDir, "index.json"),
    `${JSON.stringify(manifest, null, 2)}\n`,
    "utf8",
  );
  const archiveFiles = await collectJsonFilesForArchive(input.outputDir, [
    "index.json",
    ...manifest.files.votes.map((file) => path.posix.join("votes", file)),
    ...manifest.files.traces.map((file) => path.posix.join("traces", file)),
    ...fullTraceFiles,
  ]);
  await writeFile(zipPath, createStoredZipArchive(archiveFiles, path.basename(input.outputDir)));

  return {
    outputDir: input.outputDir,
    zipPath,
    manifest,
  };
}

export function renderFeedbackExportSummary(exported: FeedbackExportResult): string {
  const lines: string[] = [];
  lines.push("");
  lines.push(pc.bold(pc.magenta("Paperclip Feedback Export")));
  lines.push(pc.dim(exported.manifest.exportedAt));
  lines.push(horizontalRule());
  lines.push(`${pc.dim("Company:")} ${exported.manifest.companyId}`);
  lines.push(`${pc.dim("Output:")} ${exported.outputDir}`);
  lines.push(`${pc.dim("Archive:")} ${exported.zipPath}`);
  lines.push("");
  lines.push(pc.bold("Export Summary"));
  lines.push(horizontalRule());
  lines.push(` ${pc.green(pc.bold(String(exported.manifest.summary.thumbsUp)))} thumbs up`);
  lines.push(` ${pc.red(pc.bold(String(exported.manifest.summary.thumbsDown)))} thumbs down`);
  lines.push(` ${pc.yellow(pc.bold(String(exported.manifest.summary.withReason)))} with reason`);
  lines.push(` ${pc.bold(String(exported.manifest.summary.uniqueIssues))} unique issues`);
  lines.push("");
  lines.push(pc.dim("Files:"));
  lines.push(` ${path.join(exported.outputDir, "index.json")}`);
  lines.push(` ${path.join(exported.outputDir, "votes")} (${exported.manifest.files.votes.length} files)`);
  lines.push(` ${path.join(exported.outputDir, "traces")} (${exported.manifest.files.traces.length} files)`);
  lines.push(` ${path.join(exported.outputDir, "full-traces")} (${exported.manifest.files.fullTraces.length} bundles)`);
  lines.push(` ${exported.zipPath}`);
  lines.push("");
  return lines.join("\n");
}

function readFeedbackReason(trace: FeedbackTrace): string | null {
  const payload = asRecord(trace.payloadSnapshot);
  const vote = asRecord(payload?.vote);
  const reason = vote?.reason;
  return typeof reason === "string" && reason.trim() ? reason.trim() : null;
}

function buildFeedbackVoteRecord(trace: FeedbackTrace) {
  return {
    voteId: trace.feedbackVoteId,
    traceId: trace.id,
    issueId: trace.issueId,
    issueIdentifier: trace.issueIdentifier,
    issueTitle: trace.issueTitle,
    vote: trace.vote,
    targetType: trace.targetType,
    targetId: trace.targetId,
    targetSummary: trace.targetSummary,
    status: trace.status,
    consentVersion: trace.consentVersion,
    createdAt: trace.createdAt,
    updatedAt: trace.updatedAt,
    reason: readFeedbackReason(trace),
  };
}

function asRecord(value: unknown): Record<string, unknown> | null {
  if (!value || typeof value !== "object" || Array.isArray(value)) return null;
  return value as Record<string, unknown>;
}

function compactText(value: string | null | undefined, maxLength = 88): string | null {
  if (!value) return null;
  const compact = value.replace(/\s+/g, " ").trim();
  if (!compact) return null;
  if (compact.length <= maxLength) return compact;
  return `${compact.slice(0, maxLength - 3)}...`;
}

function formatTimestamp(value: unknown): string {
  if (value instanceof Date) return value.toISOString().slice(0, 19).replace("T", " ");
  if (typeof value === "string") return value.slice(0, 19).replace("T", " ");
  return "-";
}

function horizontalRule(): string {
  return pc.dim("-".repeat(72));
}

function padRight(value: string, width: number): string {
  return `${value}${" ".repeat(Math.max(0, width - value.length))}`;
}

function defaultFeedbackExportDirName(): string {
  const iso = new Date().toISOString().replace(/[-:]/g, "").replace(/\.\d{3}Z$/, "Z");
  return `feedback-export-${iso}`;
}

async function ensureEmptyOutputDirectory(outputDir: string): Promise<void> {
  try {
    const info = await stat(outputDir);
    if (!info.isDirectory()) {
      throw new Error(`Output path already exists and is not a directory: ${outputDir}`);
    }
    const entries = await readdir(outputDir);
    if (entries.length > 0) {
      throw new Error(`Output directory already exists and is not empty: ${outputDir}`);
    }
  } catch (error) {
    const message = error instanceof Error ? error.message : "";
    if (/ENOENT/.test(message)) {
      await mkdir(outputDir, { recursive: true });
      return;
    }
    throw error;
  }
}

async function collectJsonFilesForArchive(
  outputDir: string,
  relativePaths: string[],
): Promise<Record<string, string>> {
  const files: Record<string, string> = {};
  for (const relativePath of relativePaths) {
    const normalized = relativePath.replace(/\\/g, "/");
    files[normalized] = await readFile(path.join(outputDir, normalized), "utf8");
  }
  return files;
}

function sanitizeFileSegment(value: string): string {
  return value.replace(/[^a-zA-Z0-9._-]+/g, "-").replace(/^-+|-+$/g, "") || "feedback";
}

function writeUint16(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
}

function writeUint32(target: Uint8Array, offset: number, value: number) {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
  target[offset + 2] = (value >>> 16) & 0xff;
  target[offset + 3] = (value >>> 24) & 0xff;
}

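ZIP metadata is little-endian throughout, which is exactly what `writeUint16`/`writeUint32` encode. A standalone sanity check of the byte order, restating the 32-bit helper so the snippet runs on its own:

```typescript
// Little-endian 32-bit write — a standalone copy of the helper above.
function writeUint32(target: Uint8Array, offset: number, value: number): void {
  target[offset] = value & 0xff;
  target[offset + 1] = (value >>> 8) & 0xff;
  target[offset + 2] = (value >>> 16) & 0xff;
  target[offset + 3] = (value >>> 24) & 0xff;
}

const buf = new Uint8Array(4);
writeUint32(buf, 0, 0x04034b50); // ZIP local file header signature
// Serialized low byte first, the bytes read "PK\x03\x04": 0x50, 0x4b, 0x03, 0x04.
```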
function crc32(bytes: Uint8Array) {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

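The `crc32` helper is the standard reflected CRC-32 (polynomial `0xEDB88320`) computed bit by bit. Restated standalone, it should reproduce the well-known check value for the ASCII string `"123456789"`:

```typescript
// Bitwise reflected CRC-32, behaviorally identical to the helper above.
function crc32(bytes: Uint8Array): number {
  let crc = 0xffffffff;
  for (const byte of bytes) {
    crc ^= byte;
    for (let bit = 0; bit < 8; bit += 1) {
      crc = (crc & 1) === 1 ? (crc >>> 1) ^ 0xedb88320 : crc >>> 1;
    }
  }
  return (crc ^ 0xffffffff) >>> 0;
}

// Canonical CRC-32 test vector: "123456789" hashes to 0xcbf43926.
const check = crc32(new TextEncoder().encode("123456789"));
```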
function createStoredZipArchive(files: Record<string, string>, rootPath: string): Uint8Array {
  const encoder = new TextEncoder();
  const localChunks: Uint8Array[] = [];
  const centralChunks: Uint8Array[] = [];
  let localOffset = 0;
  let entryCount = 0;

  for (const [relativePath, content] of Object.entries(files).sort(([left], [right]) => left.localeCompare(right))) {
    const fileName = encoder.encode(`${rootPath}/${relativePath}`);
    const body = encoder.encode(content);
    const checksum = crc32(body);

    const localHeader = new Uint8Array(30 + fileName.length);
    writeUint32(localHeader, 0, 0x04034b50);
    writeUint16(localHeader, 4, 20);
    writeUint16(localHeader, 6, 0x0800);
    writeUint16(localHeader, 8, 0);
    writeUint32(localHeader, 14, checksum);
    writeUint32(localHeader, 18, body.length);
    writeUint32(localHeader, 22, body.length);
    writeUint16(localHeader, 26, fileName.length);
    localHeader.set(fileName, 30);

    const centralHeader = new Uint8Array(46 + fileName.length);
    writeUint32(centralHeader, 0, 0x02014b50);
    writeUint16(centralHeader, 4, 20);
    writeUint16(centralHeader, 6, 20);
    writeUint16(centralHeader, 8, 0x0800);
    writeUint16(centralHeader, 10, 0);
    writeUint32(centralHeader, 16, checksum);
    writeUint32(centralHeader, 20, body.length);
    writeUint32(centralHeader, 24, body.length);
    writeUint16(centralHeader, 28, fileName.length);
    writeUint32(centralHeader, 42, localOffset);
    centralHeader.set(fileName, 46);

    localChunks.push(localHeader, body);
    centralChunks.push(centralHeader);
    localOffset += localHeader.length + body.length;
    entryCount += 1;
  }

  const centralDirectoryLength = centralChunks.reduce((sum, chunk) => sum + chunk.length, 0);
  const archive = new Uint8Array(
    localChunks.reduce((sum, chunk) => sum + chunk.length, 0) + centralDirectoryLength + 22,
  );
  let offset = 0;
  for (const chunk of localChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  const centralDirectoryOffset = offset;
  for (const chunk of centralChunks) {
    archive.set(chunk, offset);
    offset += chunk.length;
  }
  writeUint32(archive, offset, 0x06054b50);
  writeUint16(archive, offset + 8, entryCount);
  writeUint16(archive, offset + 10, entryCount);
  writeUint32(archive, offset + 12, centralDirectoryLength);
  writeUint32(archive, offset + 16, centralDirectoryOffset);
  return archive;
}
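`createStoredZipArchive` writes method-0 ("stored") entries, so sizes and CRCs are known up front and no compression library is needed; the archive is local file headers, then the central directory, then a 22-byte end-of-central-directory (EOCD) trailer. A standalone sketch of that trailer for an archive with zero entries (which is, in fact, a complete valid empty ZIP):

```typescript
// EOCD record for an empty archive — the same 22-byte trailer layout
// createStoredZipArchive appends last, with all counts and offsets zero.
function writeUint16(t: Uint8Array, o: number, v: number): void {
  t[o] = v & 0xff;
  t[o + 1] = (v >>> 8) & 0xff;
}
function writeUint32(t: Uint8Array, o: number, v: number): void {
  writeUint16(t, o, v & 0xffff);
  writeUint16(t, o + 2, (v >>> 16) & 0xffff);
}

const eocd = new Uint8Array(22);
writeUint32(eocd, 0, 0x06054b50); // EOCD signature ("PK\x05\x06")
writeUint16(eocd, 8, 0);          // entries on this disk
writeUint16(eocd, 10, 0);         // total entries
writeUint32(eocd, 12, 0);         // central directory length
writeUint32(eocd, 16, 0);         // central directory offset
// bytes 20-21: archive comment length, left at 0
```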
@@ -1,8 +1,10 @@
 import { Command } from "commander";
+import { writeFile } from "node:fs/promises";
 import {
   addIssueCommentSchema,
   checkoutIssueSchema,
   createIssueSchema,
+  type FeedbackTrace,
   updateIssueSchema,
   type Issue,
   type IssueComment,
@@ -15,6 +17,11 @@ import {
   resolveCommandContext,
   type BaseClientOptions,
 } from "./common.js";
+import {
+  buildFeedbackTraceQuery,
+  normalizeFeedbackTraceExportFormat,
+  serializeFeedbackTraces,
+} from "./feedback.js";
 
 interface IssueBaseOptions extends BaseClientOptions {
   status?: string;
@@ -61,6 +68,18 @@ interface IssueCheckoutOptions extends BaseClientOptions {
   expectedStatuses?: string;
 }
 
+interface IssueFeedbackOptions extends BaseClientOptions {
+  targetType?: string;
+  vote?: string;
+  status?: string;
+  from?: string;
+  to?: string;
+  sharedOnly?: boolean;
+  includePayload?: boolean;
+  out?: string;
+  format?: string;
+}
+
 export function registerIssueCommands(program: Command): void {
   const issue = program.command("issue").description("Issue operations");
 
@@ -237,6 +256,85 @@ export function registerIssueCommands(program: Command): void {
     }),
   );
 
+  addCommonClientOptions(
+    issue
+      .command("feedback:list")
+      .description("List feedback traces for an issue")
+      .argument("<issueId>", "Issue ID")
+      .option("--target-type <type>", "Filter by target type")
+      .option("--vote <vote>", "Filter by vote value")
+      .option("--status <status>", "Filter by trace status")
+      .option("--from <iso8601>", "Only include traces created at or after this timestamp")
+      .option("--to <iso8601>", "Only include traces created at or before this timestamp")
+      .option("--shared-only", "Only include traces eligible for sharing/export")
+      .option("--include-payload", "Include stored payload snapshots in the response")
+      .action(async (issueId: string, opts: IssueFeedbackOptions) => {
+        try {
+          const ctx = resolveCommandContext(opts);
+          const traces = (await ctx.api.get<FeedbackTrace[]>(
+            `/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts)}`,
+          )) ?? [];
+          if (ctx.json) {
+            printOutput(traces, { json: true });
+            return;
+          }
+          printOutput(
+            traces.map((trace) => ({
+              id: trace.id,
+              issue: trace.issueIdentifier ?? trace.issueId,
+              vote: trace.vote,
+              status: trace.status,
+              targetType: trace.targetType,
+              target: trace.targetSummary.label,
+            })),
+            { json: false },
+          );
+        } catch (err) {
+          handleCommandError(err);
+        }
+      }),
+  );
+
+  addCommonClientOptions(
+    issue
+      .command("feedback:export")
+      .description("Export feedback traces for an issue")
+      .argument("<issueId>", "Issue ID")
+      .option("--target-type <type>", "Filter by target type")
+      .option("--vote <vote>", "Filter by vote value")
+      .option("--status <status>", "Filter by trace status")
+      .option("--from <iso8601>", "Only include traces created at or after this timestamp")
+      .option("--to <iso8601>", "Only include traces created at or before this timestamp")
+      .option("--shared-only", "Only include traces eligible for sharing/export")
+      .option("--include-payload", "Include stored payload snapshots in the export")
+      .option("--out <path>", "Write export to a file path instead of stdout")
+      .option("--format <format>", "Export format: json or ndjson", "ndjson")
+      .action(async (issueId: string, opts: IssueFeedbackOptions) => {
+        try {
+          const ctx = resolveCommandContext(opts);
+          const traces = (await ctx.api.get<FeedbackTrace[]>(
+            `/api/issues/${issueId}/feedback-traces${buildFeedbackTraceQuery(opts, opts.includePayload ?? true)}`,
+          )) ?? [];
+          const serialized = serializeFeedbackTraces(traces, opts.format);
+          if (opts.out?.trim()) {
+            await writeFile(opts.out, serialized, "utf8");
+            if (ctx.json) {
+              printOutput(
+                { out: opts.out, count: traces.length, format: normalizeFeedbackTraceExportFormat(opts.format) },
+                { json: true },
+              );
+              return;
+            }
+            console.log(`Wrote ${traces.length} feedback trace(s) to ${opts.out}`);
+            return;
+          }
+          process.stdout.write(`${serialized}${serialized.endsWith("\n") ? "" : "\n"}`);
+        } catch (err) {
+          handleCommandError(err);
+        }
+      }),
+  );
+
   addCommonClientOptions(
     issue
       .command("checkout")
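`feedback:export` defaults to `ndjson`. The exact behavior of `serializeFeedbackTraces` is not shown in this diff, but NDJSON framing in general is one JSON document per line, which lets consumers stream traces without parsing the whole export; a generic sketch of that framing (not the module's implementation):

```typescript
// Generic NDJSON framing: one JSON object per line, newline-terminated.
const traces = [
  { id: "t1", vote: "up" },
  { id: "t2", vote: "down" },
];
const ndjson = traces.map((t) => JSON.stringify(t)).join("\n") + "\n";

// Consumers read it back line by line.
const parsed = ndjson.trim().split("\n").map((line) => JSON.parse(line));
```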
@@ -63,6 +63,9 @@ function defaultConfig(): PaperclipConfig {
       baseUrlMode: "auto",
       disableSignUp: false,
     },
+    telemetry: {
+      enabled: true,
+    },
     storage: defaultStorageConfig(),
     secrets: defaultSecretsConfig(),
   };
@@ -33,6 +33,11 @@ import {
 } from "../config/home.js";
 import { bootstrapCeoInvite } from "./auth-bootstrap-ceo.js";
 import { printPaperclipCliBanner } from "../utils/banner.js";
+import {
+  getTelemetryClient,
+  trackInstallStarted,
+  trackInstallCompleted,
+} from "../telemetry.js";
 
 type SetupMode = "quickstart" | "advanced";
 
@@ -356,6 +361,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     setupMode = setupModeChoice as SetupMode;
   }
 
+  const tc = getTelemetryClient();
+  if (tc) trackInstallStarted(tc);
+
   let llm: PaperclipConfig["llm"] | undefined;
   const { defaults: derivedDefaults, usedEnvKeys, ignoredEnvKeys } = quickstartDefaultsFromEnv();
   let {
@@ -488,6 +496,9 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
     logging,
     server,
     auth,
+    telemetry: {
+      enabled: true,
+    },
     storage,
     secrets,
   };
@@ -501,6 +512,10 @@ export async function onboard(opts: OnboardOptions): Promise<void> {
 
   writeConfig(config, opts.config);
 
+  if (tc) trackInstallCompleted(tc, {
+    adapterType: server.deploymentMode,
+  });
+
   p.note(
     [
       `Database: ${database.mode}`,
cli/src/commands/routines.ts (new file, 352 lines)
@@ -0,0 +1,352 @@
import fs from "node:fs";
import net from "node:net";
import path from "node:path";
import { Command } from "commander";
import pc from "picocolors";
import {
  applyPendingMigrations,
  createDb,
  createEmbeddedPostgresLogBuffer,
  ensurePostgresDatabase,
  formatEmbeddedPostgresError,
  routines,
} from "@paperclipai/db";
import { eq, inArray } from "drizzle-orm";
import { loadPaperclipEnvFile } from "../config/env.js";
import { readConfig, resolveConfigPath } from "../config/store.js";

type RoutinesDisableAllOptions = {
  config?: string;
  dataDir?: string;
  companyId?: string;
  json?: boolean;
};

type DisableAllRoutinesResult = {
  companyId: string;
  totalRoutines: number;
  pausedCount: number;
  alreadyPausedCount: number;
  archivedCount: number;
};

type EmbeddedPostgresInstance = {
  initialise(): Promise<void>;
  start(): Promise<void>;
  stop(): Promise<void>;
};

type EmbeddedPostgresCtor = new (opts: {
  databaseDir: string;
  user: string;
  password: string;
  port: number;
  persistent: boolean;
  initdbFlags?: string[];
  onLog?: (message: unknown) => void;
  onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;

type EmbeddedPostgresHandle = {
  port: number;
  startedByThisProcess: boolean;
  stop: () => Promise<void>;
};

type ClosableDb = ReturnType<typeof createDb> & {
  $client?: {
    end?: (options?: { timeout?: number }) => Promise<void>;
  };
};

function nonEmpty(value: string | null | undefined): string | null {
  return typeof value === "string" && value.trim().length > 0 ? value.trim() : null;
}

async function isPortAvailable(port: number): Promise<boolean> {
  return await new Promise<boolean>((resolve) => {
    const server = net.createServer();
    server.unref();
    server.once("error", () => resolve(false));
    server.listen(port, "127.0.0.1", () => {
      server.close(() => resolve(true));
    });
  });
}

async function findAvailablePort(preferredPort: number): Promise<number> {
  let port = Math.max(1, Math.trunc(preferredPort));
  while (!(await isPortAvailable(port))) {
    port += 1;
  }
  return port;
}

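`isPortAvailable` decides availability by actually binding: a successful listen (immediately closed again) means free, an error (typically `EADDRINUSE`) means busy. A standalone sketch of the same probe; port `0` asks the OS for any free port, so it should always succeed:

```typescript
import net from "node:net";

// Bind-and-release probe on the loopback interface, mirroring isPortAvailable.
function probe(port: number): Promise<boolean> {
  return new Promise((resolve) => {
    const server = net.createServer();
    server.unref();
    server.once("error", () => resolve(false)); // e.g. EADDRINUSE
    server.listen(port, "127.0.0.1", () => {
      server.close(() => resolve(true));
    });
  });
}
```

Note that a port found this way can still be claimed by another process before it is used (a classic check-then-act race), which is an accepted trade-off for this kind of probe.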
function readPidFilePort(postmasterPidFile: string): number | null {
  if (!fs.existsSync(postmasterPidFile)) return null;
  try {
    const lines = fs.readFileSync(postmasterPidFile, "utf8").split("\n");
    const port = Number(lines[3]?.trim());
    return Number.isInteger(port) && port > 0 ? port : null;
  } catch {
    return null;
  }
}

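`readPidFilePort` relies on PostgreSQL's documented `postmaster.pid` layout: the postmaster PID on line 1 and the listen port on line 4 (hence `lines[3]`). A standalone sketch against a synthetic pid file — the temp path and sample values here are illustrative:

```typescript
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

// postmaster.pid layout: pid, data dir, start time, port, ...
const dir = fs.mkdtempSync(path.join(os.tmpdir(), "pg-"));
const pidFile = path.join(dir, "postmaster.pid");
fs.writeFileSync(pidFile, ["12345", "/var/lib/pg", "1700000000", "5433", ""].join("\n"), "utf8");

const lines = fs.readFileSync(pidFile, "utf8").split("\n");
const pid = Number(lines[0]?.trim());  // line 1: postmaster PID
const port = Number(lines[3]?.trim()); // line 4: listen port
```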
function readRunningPostmasterPid(postmasterPidFile: string): number | null {
  if (!fs.existsSync(postmasterPidFile)) return null;
  try {
    const pid = Number(fs.readFileSync(postmasterPidFile, "utf8").split("\n")[0]?.trim());
    if (!Number.isInteger(pid) || pid <= 0) return null;
    process.kill(pid, 0);
    return pid;
  } catch {
    return null;
  }
}

async function ensureEmbeddedPostgres(dataDir: string, preferredPort: number): Promise<EmbeddedPostgresHandle> {
  const moduleName = "embedded-postgres";
  let EmbeddedPostgres: EmbeddedPostgresCtor;
  try {
    const mod = await import(moduleName);
    EmbeddedPostgres = mod.default as EmbeddedPostgresCtor;
  } catch {
    throw new Error(
      "Embedded PostgreSQL support requires dependency `embedded-postgres`. Reinstall dependencies and try again.",
    );
  }

  const postmasterPidFile = path.resolve(dataDir, "postmaster.pid");
  const runningPid = readRunningPostmasterPid(postmasterPidFile);
  if (runningPid) {
    return {
      port: readPidFilePort(postmasterPidFile) ?? preferredPort,
      startedByThisProcess: false,
      stop: async () => {},
    };
  }

  const port = await findAvailablePort(preferredPort);
  const logBuffer = createEmbeddedPostgresLogBuffer();
  const instance = new EmbeddedPostgres({
    databaseDir: dataDir,
    user: "paperclip",
    password: "paperclip",
    port,
    persistent: true,
    initdbFlags: ["--encoding=UTF8", "--locale=C", "--lc-messages=C"],
    onLog: logBuffer.append,
    onError: logBuffer.append,
  });

  if (!fs.existsSync(path.resolve(dataDir, "PG_VERSION"))) {
    try {
      await instance.initialise();
    } catch (error) {
      throw formatEmbeddedPostgresError(error, {
        fallbackMessage: `Failed to initialize embedded PostgreSQL cluster in ${dataDir} on port ${port}`,
        recentLogs: logBuffer.getRecentLogs(),
      });
    }
  }

  if (fs.existsSync(postmasterPidFile)) {
    fs.rmSync(postmasterPidFile, { force: true });
  }

  try {
    await instance.start();
  } catch (error) {
    throw formatEmbeddedPostgresError(error, {
      fallbackMessage: `Failed to start embedded PostgreSQL on port ${port}`,
      recentLogs: logBuffer.getRecentLogs(),
    });
  }

  return {
    port,
    startedByThisProcess: true,
    stop: async () => {
      await instance.stop();
    },
  };
}

async function closeDb(db: ClosableDb): Promise<void> {
  await db.$client?.end?.({ timeout: 5 }).catch(() => undefined);
}

async function openConfiguredDb(configPath: string): Promise<{
  db: ClosableDb;
  stop: () => Promise<void>;
}> {
  const config = readConfig(configPath);
  if (!config) {
    throw new Error(`Config not found at ${configPath}.`);
  }

  let embeddedHandle: EmbeddedPostgresHandle | null = null;
  try {
    if (config.database.mode === "embedded-postgres") {
      embeddedHandle = await ensureEmbeddedPostgres(
        config.database.embeddedPostgresDataDir,
        config.database.embeddedPostgresPort,
      );
      const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
      await ensurePostgresDatabase(adminConnectionString, "paperclip");
      const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
      await applyPendingMigrations(connectionString);
      const db = createDb(connectionString) as ClosableDb;
      return {
        db,
        stop: async () => {
          await closeDb(db);
          if (embeddedHandle?.startedByThisProcess) {
            await embeddedHandle.stop().catch(() => undefined);
          }
        },
      };
    }

    const connectionString = nonEmpty(config.database.connectionString);
    if (!connectionString) {
      throw new Error(`Config at ${configPath} does not define a database connection string.`);
    }

    await applyPendingMigrations(connectionString);
    const db = createDb(connectionString) as ClosableDb;
    return {
      db,
      stop: async () => {
        await closeDb(db);
      },
    };
  } catch (error) {
    if (embeddedHandle?.startedByThisProcess) {
      await embeddedHandle.stop().catch(() => undefined);
    }
    throw error;
  }
}

export async function disableAllRoutinesInConfig(
  options: Pick<RoutinesDisableAllOptions, "config" | "companyId">,
): Promise<DisableAllRoutinesResult> {
  const configPath = resolveConfigPath(options.config);
  loadPaperclipEnvFile(configPath);
  const companyId =
    nonEmpty(options.companyId)
    ?? nonEmpty(process.env.PAPERCLIP_COMPANY_ID)
    ?? null;
  if (!companyId) {
    throw new Error("Company ID is required. Pass --company-id or set PAPERCLIP_COMPANY_ID.");
  }

  const config = readConfig(configPath);
  if (!config) {
    throw new Error(`Config not found at ${configPath}.`);
  }

  let embeddedHandle: EmbeddedPostgresHandle | null = null;
  let db: ClosableDb | null = null;
  try {
    if (config.database.mode === "embedded-postgres") {
      embeddedHandle = await ensureEmbeddedPostgres(
        config.database.embeddedPostgresDataDir,
        config.database.embeddedPostgresPort,
      );
      const adminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/postgres`;
      await ensurePostgresDatabase(adminConnectionString, "paperclip");
      const connectionString = `postgres://paperclip:paperclip@127.0.0.1:${embeddedHandle.port}/paperclip`;
      await applyPendingMigrations(connectionString);
      db = createDb(connectionString) as ClosableDb;
    } else {
      const connectionString = nonEmpty(config.database.connectionString);
      if (!connectionString) {
        throw new Error(`Config at ${configPath} does not define a database connection string.`);
      }
      await applyPendingMigrations(connectionString);
      db = createDb(connectionString) as ClosableDb;
    }

    const existing = await db
      .select({
        id: routines.id,
        status: routines.status,
      })
      .from(routines)
      .where(eq(routines.companyId, companyId));

    const alreadyPausedCount = existing.filter((routine) => routine.status === "paused").length;
    const archivedCount = existing.filter((routine) => routine.status === "archived").length;
    const idsToPause = existing
      .filter((routine) => routine.status !== "paused" && routine.status !== "archived")
      .map((routine) => routine.id);

    if (idsToPause.length > 0) {
      await db
        .update(routines)
        .set({
          status: "paused",
          updatedAt: new Date(),
        })
        .where(inArray(routines.id, idsToPause));
    }

    return {
      companyId,
      totalRoutines: existing.length,
      pausedCount: idsToPause.length,
      alreadyPausedCount,
      archivedCount,
    };
  } finally {
    if (db) {
      await closeDb(db);
    }
    if (embeddedHandle?.startedByThisProcess) {
      await embeddedHandle.stop().catch(() => undefined);
    }
  }
}

export async function disableAllRoutinesCommand(options: RoutinesDisableAllOptions): Promise<void> {
  const result = await disableAllRoutinesInConfig(options);

  if (options.json) {
    console.log(JSON.stringify(result, null, 2));
    return;
  }

  if (result.totalRoutines === 0) {
    console.log(pc.dim(`No routines found for company ${result.companyId}.`));
    return;
  }

  console.log(
    `Paused ${result.pausedCount} routine(s) for company ${result.companyId} ` +
      `(${result.alreadyPausedCount} already paused, ${result.archivedCount} archived).`,
  );
}

export function registerRoutineCommands(program: Command): void {
  const routinesCommand = program.command("routines").description("Local routine maintenance commands");

  routinesCommand
    .command("disable-all")
    .description("Pause all non-archived routines in the configured local instance for one company")
    .option("-c, --config <path>", "Path to config file")
    .option("-d, --data-dir <path>", "Paperclip data directory root (isolates state from ~/.paperclip)")
    .option("-C, --company-id <id>", "Company ID")
    .option("--json", "Output raw JSON")
    .action(async (opts: RoutinesDisableAllOptions) => {
      try {
        await disableAllRoutinesCommand(opts);
      } catch (error) {
        const message = error instanceof Error ? error.message : String(error);
        console.error(pc.red(message));
        process.exit(1);
      }
    });
}
@@ -224,6 +224,9 @@ export function buildWorktreeConfig(input: {
       ...(authPublicBaseUrl ? { publicBaseUrl: authPublicBaseUrl } : {}),
       disableSignUp: source?.auth.disableSignUp ?? false,
     },
+    telemetry: {
+      enabled: source?.telemetry?.enabled ?? true,
+    },
     storage: {
       provider: source?.storage.provider ?? "local_disk",
       localDisk: {
@@ -7,6 +7,7 @@ export {
   loggingConfigSchema,
   serverConfigSchema,
   authConfigSchema,
+  telemetryConfigSchema,
   storageConfigSchema,
   storageLocalDiskConfigSchema,
   storageS3ConfigSchema,
@@ -19,10 +20,11 @@ export {
   type LoggingConfig,
   type ServerConfig,
   type AuthConfig,
+  type TelemetryConfig,
   type StorageConfig,
   type StorageLocalDiskConfig,
   type StorageS3Config,
   type SecretsConfig,
   type SecretsLocalEncryptedConfig,
   type ConfigMeta,
-} from "@paperclipai/shared";
+} from "../../../packages/shared/src/config-schema.js";
@@ -15,11 +15,15 @@ import { registerAgentCommands } from "./commands/client/agent.js";
 import { registerApprovalCommands } from "./commands/client/approval.js";
 import { registerActivityCommands } from "./commands/client/activity.js";
 import { registerDashboardCommands } from "./commands/client/dashboard.js";
+import { registerRoutineCommands } from "./commands/routines.js";
+import { registerFeedbackCommands } from "./commands/client/feedback.js";
 import { applyDataDirOverride, type DataDirOptionLike } from "./config/data-dir.js";
 import { loadPaperclipEnvFile } from "./config/env.js";
+import { initTelemetryFromConfigFile, flushTelemetry } from "./telemetry.js";
 import { registerWorktreeCommands } from "./commands/worktree.js";
 import { registerPluginCommands } from "./commands/client/plugin.js";
 import { registerClientAuthCommands } from "./commands/client/auth.js";
+import { cliVersion } from "./version.js";

 const program = new Command();
 const DATA_DIR_OPTION_HELP =
@@ -28,7 +32,7 @@ const DATA_DIR_OPTION_HELP =
 program
   .name("paperclipai")
   .description("Paperclip CLI — setup, diagnose, and configure your instance")
-  .version("0.2.7");
+  .version(cliVersion);

 program.hook("preAction", (_thisCommand, actionCommand) => {
   const options = actionCommand.optsWithGlobals() as DataDirOptionLike;
@@ -38,6 +42,7 @@ program.hook("preAction", (_thisCommand, actionCommand) => {
     hasContextOption: optionNames.has("context"),
   });
   loadPaperclipEnvFile(options.config);
+  initTelemetryFromConfigFile(options.config);
 });

 program
@@ -137,6 +142,8 @@ registerAgentCommands(program);
 registerApprovalCommands(program);
 registerActivityCommands(program);
 registerDashboardCommands(program);
+registerRoutineCommands(program);
+registerFeedbackCommands(program);
 registerWorktreeCommands(program);
 registerPluginCommands(program);

@@ -154,7 +161,20 @@ auth

 registerClientAuthCommands(auth);

-program.parseAsync().catch((err) => {
-  console.error(err instanceof Error ? err.message : String(err));
-  process.exit(1);
-});
+async function main(): Promise<void> {
+  let failed = false;
+  try {
+    await program.parseAsync();
+  } catch (err) {
+    failed = true;
+    console.error(err instanceof Error ? err.message : String(err));
+  } finally {
+    await flushTelemetry();
+  }
+
+  if (failed) {
+    process.exit(1);
+  }
+}
+
+void main();
49 cli/src/telemetry.ts Normal file
@@ -0,0 +1,49 @@
import path from "node:path";
import {
  TelemetryClient,
  resolveTelemetryConfig,
  loadOrCreateState,
  trackInstallStarted,
  trackInstallCompleted,
  trackCompanyImported,
} from "../../packages/shared/src/telemetry/index.js";
import { resolvePaperclipInstanceRoot } from "./config/home.js";
import { readConfig } from "./config/store.js";
import { cliVersion } from "./version.js";

let client: TelemetryClient | null = null;

export function initTelemetry(fileConfig?: { enabled?: boolean }): TelemetryClient | null {
  if (client) return client;

  const config = resolveTelemetryConfig(fileConfig);
  if (!config.enabled) return null;

  const stateDir = path.join(resolvePaperclipInstanceRoot(), "telemetry");
  client = new TelemetryClient(config, () => loadOrCreateState(stateDir, cliVersion), cliVersion);
  return client;
}

export function initTelemetryFromConfigFile(configPath?: string): TelemetryClient | null {
  try {
    return initTelemetry(readConfig(configPath)?.telemetry);
  } catch {
    return initTelemetry();
  }
}

export function getTelemetryClient(): TelemetryClient | null {
  return client;
}

export async function flushTelemetry(): Promise<void> {
  if (client) {
    await client.flush();
  }
}

export {
  trackInstallStarted,
  trackInstallCompleted,
  trackCompanyImported,
};
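The `initTelemetry` function above follows a module-level lazy-singleton shape: the first enabled call constructs and caches the client, later calls reuse it, and disabled calls never construct one. A standalone sketch of that pattern (`FakeClient` and `initClient` are illustrative names, not Paperclip APIs):

```typescript
// Standalone sketch of the lazy-singleton pattern used by initTelemetry.
type FakeClient = { flushes: number; flush(): void };

let cached: FakeClient | null = null;

function initClient(enabled: boolean): FakeClient | null {
  if (cached) return cached;   // already initialized: reuse
  if (!enabled) return null;   // disabled: construct nothing
  cached = {
    flushes: 0,
    flush() {
      this.flushes += 1;
    },
  };
  return cached;
}

function flushClient(): void {
  if (cached) cached.flush();  // safe no-op before init
}

console.log(initClient(false));      // null — disabled, nothing cached yet
const a = initClient(true);
const b = initClient(false);         // cached client wins over "disabled"
console.log(a !== null && a === b);  // true
flushClient();
console.log(a?.flushes);             // 1
```

The same caching is why `flushTelemetry` can be called unconditionally in the CLI's `finally` block: it is a no-op when telemetry was never enabled.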
10 cli/src/version.ts Normal file
@@ -0,0 +1,10 @@
import { createRequire } from "node:module";

type PackageJson = {
  version?: string;
};

const require = createRequire(import.meta.url);
const pkg = require("../package.json") as PackageJson;

export const cliVersion = pkg.version ?? "0.0.0";
@@ -2,7 +2,7 @@
   "extends": "../tsconfig.base.json",
   "compilerOptions": {
     "outDir": "dist",
-    "rootDir": "src"
+    "rootDir": ".."
   },
-  "include": ["src"]
+  "include": ["src", "../packages/shared/src"]
 }
@@ -175,6 +175,8 @@ Seed modes:

 After `worktree init`, both the server and the CLI auto-load the repo-local `.paperclip/.env` when run inside that worktree, so normal commands like `pnpm dev`, `paperclipai doctor`, and `paperclipai db:backup` stay scoped to the worktree instance.

+Provisioned git worktrees also pause all seeded routines in the isolated worktree database by default. This prevents copied daily/cron routines from firing unexpectedly inside the new workspace instance during development.
+
 That repo-local env also sets:

 - `PAPERCLIP_IN_WORKTREE=true`
@@ -249,7 +249,7 @@ Runs local `claude` CLI directly.
   "cwd": "/absolute/or/relative/path",
   "promptTemplate": "You are agent {{agent.id}} ...",
   "model": "optional-model-id",
-  "maxTurnsPerRun": 300,
+  "maxTurnsPerRun": 1000,
   "dangerouslySkipPermissions": true,
   "env": {"KEY": "VALUE"},
   "extraArgs": [],
@@ -20,7 +20,7 @@ The `claude_local` adapter runs Anthropic's Claude Code CLI locally. It supports
 | `env` | object | No | Environment variables (supports secret refs) |
 | `timeoutSec` | number | No | Process timeout (0 = no timeout) |
 | `graceSec` | number | No | Grace period before force-kill |
-| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `300`) |
+| `maxTurnsPerRun` | number | No | Max agentic turns per heartbeat (defaults to `1000`) |
 | `dangerouslySkipPermissions` | boolean | No | Skip permission prompts (dev only) |

 ## Prompt Templates
189 docs/feedback-voting.md Normal file
@@ -0,0 +1,189 @@
# Feedback Voting — Local Data Guide

When you rate an agent's response with **Helpful** (thumbs up) or **Needs work** (thumbs down), Paperclip saves your vote locally alongside your running instance. This guide covers what gets stored, how to access it, and how to export it.

## How voting works

1. Click **Helpful** or **Needs work** on any agent comment or document revision.
2. If you click **Needs work**, an optional text prompt appears: _"What could have been better?"_ You can type a reason or dismiss it.
3. A consent dialog asks whether to keep the vote local or share it. Your choice is remembered for future votes.

### What gets stored

Each vote creates two local records:

| Record | What it contains |
|--------|-----------------|
| **Vote** | Your vote (up/down), optional reason text, sharing preference, consent version, timestamp |
| **Trace bundle** | Full context snapshot: the voted-on comment/revision text, issue title, agent info, your vote, and reason — everything needed to understand the feedback in isolation |

All data lives in your local Paperclip database. Nothing leaves your machine unless you explicitly choose to share.

When a vote is marked for sharing, Paperclip also queues the trace bundle for background export through the Telemetry Backend. The app server never uploads raw feedback trace bundles directly to object storage.

## Viewing your votes

### Quick report (terminal)

```bash
pnpm paperclipai feedback report
```

Shows a color-coded summary: vote counts, per-trace details with reasons, and export statuses.

```bash
# Installed CLI
paperclipai feedback report

# Point to a different server or company
pnpm paperclipai feedback report --api-base http://127.0.0.1:3000 --company-id <company-id>

# Include raw payload dumps in the report
pnpm paperclipai feedback report --payloads
```

### API endpoints

All endpoints require board-user access (automatic in local dev).

**List votes for an issue:**
```bash
curl http://127.0.0.1:3102/api/issues/<issueId>/feedback-votes
```

**List trace bundles for an issue (with full payloads):**
```bash
curl 'http://127.0.0.1:3102/api/issues/<issueId>/feedback-traces?includePayload=true'
```

**List all traces company-wide:**
```bash
curl 'http://127.0.0.1:3102/api/companies/<companyId>/feedback-traces?includePayload=true'
```

**Get a single trace envelope record:**
```bash
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>
```

**Get the full export bundle for a trace:**
```bash
curl http://127.0.0.1:3102/api/feedback-traces/<traceId>/bundle
```

#### Filtering

The trace endpoints accept query parameters:

| Parameter | Values | Description |
|-----------|--------|-------------|
| `vote` | `up`, `down` | Filter by vote direction |
| `status` | `local_only`, `pending`, `sent`, `failed` | Filter by export status |
| `targetType` | `issue_comment`, `issue_document_revision` | Filter by what was voted on |
| `sharedOnly` | `true` | Only show votes the user chose to share |
| `includePayload` | `true` | Include the full context snapshot |
| `from` / `to` | ISO date | Date range filter |

## Exporting your data

### Export to files + zip

```bash
pnpm paperclipai feedback export
```

Creates a timestamped directory with:

```
feedback-export-20260331T120000Z/
  index.json                # manifest with summary stats
  votes/
    PAP-123-a1b2c3d4.json   # vote metadata (one per vote)
  traces/
    PAP-123-e5f6g7h8.json   # Paperclip feedback envelope (one per trace)
  full-traces/
    PAP-123-e5f6g7h8/
      bundle.json           # full export manifest for the trace
      ...raw adapter files  # codex / claude / opencode session artifacts when available
feedback-export-20260331T120000Z.zip
```

Exports are full by default. `traces/` keeps the Paperclip envelope, while `full-traces/` contains the richer per-trace bundle plus any recoverable adapter-native files.

```bash
# Custom server and output directory
pnpm paperclipai feedback export --api-base http://127.0.0.1:3000 --company-id <company-id> --out ./my-export
```

### Reading an exported trace

Open any file in `traces/` to see:

```json
{
  "id": "trace-uuid",
  "vote": "down",
  "issueIdentifier": "PAP-123",
  "issueTitle": "Fix login timeout",
  "targetType": "issue_comment",
  "targetSummary": {
    "label": "Comment",
    "excerpt": "The first 80 chars of the comment that was voted on..."
  },
  "payloadSnapshot": {
    "vote": {
      "value": "down",
      "reason": "Did not address the root cause"
    },
    "target": {
      "body": "Full text of the agent comment..."
    },
    "issue": {
      "identifier": "PAP-123",
      "title": "Fix login timeout"
    }
  }
}
```

Open `full-traces/<issue>-<trace>/bundle.json` to see the expanded export metadata, including capture notes, adapter type, integrity metadata, and the inventory of raw files written alongside it.

Built-in local adapters now export their native session artifacts more directly:

- `codex_local`: `adapter/codex/session.jsonl`
- `claude_local`: `adapter/claude/session.jsonl`, plus any `adapter/claude/session/...` sidecar files and `adapter/claude/debug.txt` when present
- `opencode_local`: `adapter/opencode/session.json`, `adapter/opencode/messages/*.json`, and `adapter/opencode/parts/<messageId>/*.json`, with optional `project.json`, `todo.json`, and `session-diff.json`

## Sharing preferences

The first time you vote, a consent dialog asks:

- **Keep local** — vote is stored locally only (`sharedWithLabs: false`)
- **Share this vote** — vote is marked for sharing (`sharedWithLabs: true`)

Your preference is saved per-company. You can change it any time via the feedback settings. Votes marked "keep local" are never queued for export.

## Data lifecycle

| Status | Meaning |
|--------|---------|
| `local_only` | Vote stored locally, not marked for sharing |
| `pending` | Marked for sharing, waiting to be sent |
| `sent` | Successfully transmitted |
| `failed` | Transmission attempted but failed (will retry) |

Your local database always retains the full vote and trace data regardless of sharing status.

## Remote sync

Votes you choose to share are queued as `pending` traces and flushed by the server's background worker to the Telemetry Backend. The Telemetry Backend validates the request, then persists the bundle into its configured object storage.

- App server responsibility: build the bundle, POST it to Telemetry Backend, update trace status
- Telemetry Backend responsibility: authenticate the request, validate payload shape, compress/store the bundle, return the final object key
- Retry behavior: failed uploads move to `failed` with an error message in `failureReason`, and the worker retries them on later ticks

Exported objects use a deterministic key pattern so they are easy to inspect:

```text
feedback-traces/<companyId>/YYYY/MM/DD/<exportId-or-traceId>.json
```
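The deterministic key pattern documented in `docs/feedback-voting.md` can be derived from a UTC export timestamp. A sketch of that derivation — `buildTraceKey` is a hypothetical helper for illustration, not part of the Paperclip codebase:

```typescript
// Hypothetical helper (not a Paperclip API) showing how the documented
// feedback-traces/<companyId>/YYYY/MM/DD/<id>.json key shape can be
// produced deterministically from a UTC timestamp.
function buildTraceKey(companyId: string, exportedAt: Date, id: string): string {
  const yyyy = String(exportedAt.getUTCFullYear());
  const mm = String(exportedAt.getUTCMonth() + 1).padStart(2, "0"); // getUTCMonth is 0-based
  const dd = String(exportedAt.getUTCDate()).padStart(2, "0");
  return `feedback-traces/${companyId}/${yyyy}/${mm}/${dd}/${id}.json`;
}

console.log(buildTraceKey("co_123", new Date(Date.UTC(2026, 2, 31, 12, 0, 0)), "trace-uuid"));
// → feedback-traces/co_123/2026/03/31/trace-uuid.json
```

Date-partitioned keys like this keep object listings browsable by day without any extra index.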
@@ -35,11 +35,12 @@
     "dist"
   ],
   "scripts": {
-    "build": "tsc && cp -r src/migrations dist/migrations",
+    "check:migrations": "tsx src/check-migration-numbering.ts",
+    "build": "pnpm run check:migrations && tsc && cp -r src/migrations dist/migrations",
     "clean": "rm -rf dist",
-    "typecheck": "tsc --noEmit",
-    "generate": "tsc -p tsconfig.json && drizzle-kit generate",
-    "migrate": "tsx src/migrate.ts",
+    "typecheck": "pnpm run check:migrations && tsc --noEmit",
+    "generate": "pnpm run check:migrations && tsc -p tsconfig.json && drizzle-kit generate",
+    "migrate": "pnpm run check:migrations && tsx src/migrate.ts",
     "seed": "tsx src/seed.ts"
   },
   "dependencies": {
179 packages/db/src/backup-lib.test.ts Normal file
@@ -0,0 +1,179 @@
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import postgres from "postgres";
import { createBufferedTextFileWriter, runDatabaseBackup, runDatabaseRestore } from "./backup-lib.js";
import { ensurePostgresDatabase } from "./client.js";
import {
  getEmbeddedPostgresTestSupport,
  startEmbeddedPostgresTestDatabase,
} from "./test-embedded-postgres.js";

const cleanups: Array<() => Promise<void> | void> = [];
const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;

function createTempDir(prefix: string): string {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), prefix));
  cleanups.push(() => {
    fs.rmSync(dir, { recursive: true, force: true });
  });
  return dir;
}

async function createTempDatabase(): Promise<string> {
  const db = await startEmbeddedPostgresTestDatabase("paperclip-db-backup-");
  cleanups.push(db.cleanup);
  return db.connectionString;
}

async function createSiblingDatabase(connectionString: string, databaseName: string): Promise<string> {
  const adminUrl = new URL(connectionString);
  adminUrl.pathname = "/postgres";
  await ensurePostgresDatabase(adminUrl.toString(), databaseName);
  const targetUrl = new URL(connectionString);
  targetUrl.pathname = `/${databaseName}`;
  return targetUrl.toString();
}

afterEach(async () => {
  while (cleanups.length > 0) {
    const cleanup = cleanups.pop();
    await cleanup?.();
  }
});

if (!embeddedPostgresSupport.supported) {
  console.warn(
    `Skipping embedded Postgres backup tests on this host: ${embeddedPostgresSupport.reason ?? "unsupported environment"}`,
  );
}

describe("createBufferedTextFileWriter", () => {
  it("preserves line boundaries across buffered flushes", async () => {
    const tempDir = createTempDir("paperclip-buffered-writer-");
    const outputPath = path.join(tempDir, "backup.sql");
    const writer = createBufferedTextFileWriter(outputPath, 16);
    const lines = [
      "-- header",
      "BEGIN;",
      "",
      "INSERT INTO test VALUES (1);",
      "-- footer",
    ];

    for (const line of lines) {
      writer.emit(line);
    }

    await writer.close();

    expect(fs.readFileSync(outputPath, "utf8")).toBe(lines.join("\n"));
  });
});

describeEmbeddedPostgres("runDatabaseBackup", () => {
  it(
    "backs up and restores large table payloads without materializing one giant string",
    async () => {
      const sourceConnectionString = await createTempDatabase();
      const restoreConnectionString = await createSiblingDatabase(
        sourceConnectionString,
        "paperclip_restore_target",
      );
      const backupDir = createTempDir("paperclip-db-backup-output-");
      const sourceSql = postgres(sourceConnectionString, { max: 1, onnotice: () => {} });
      const restoreSql = postgres(restoreConnectionString, { max: 1, onnotice: () => {} });

      try {
        await sourceSql.unsafe(`
          CREATE TYPE "public"."backup_test_state" AS ENUM ('pending', 'done');
        `);
        await sourceSql.unsafe(`
          CREATE TABLE "public"."backup_test_records" (
            "id" serial PRIMARY KEY,
            "title" text NOT NULL,
            "payload" text NOT NULL,
            "state" "public"."backup_test_state" NOT NULL,
            "metadata" jsonb,
            "created_at" timestamptz NOT NULL DEFAULT now()
          );
        `);

        const payload = "x".repeat(8192);
        for (let index = 0; index < 160; index += 1) {
          const createdAt = new Date(Date.UTC(2026, 0, 1, 0, 0, index));
          await sourceSql`
            INSERT INTO "public"."backup_test_records" (
              "title",
              "payload",
              "state",
              "metadata",
              "created_at"
            )
            VALUES (
              ${`row-${index}`},
              ${payload},
              ${index % 2 === 0 ? "pending" : "done"}::"public"."backup_test_state",
              ${JSON.stringify({ index, even: index % 2 === 0 })}::jsonb,
              ${createdAt}
            )
          `;
        }

        const result = await runDatabaseBackup({
          connectionString: sourceConnectionString,
          backupDir,
          retentionDays: 7,
          filenamePrefix: "paperclip-test",
        });

        expect(result.backupFile).toMatch(/paperclip-test-.*\.sql$/);
        expect(result.sizeBytes).toBeGreaterThan(1024 * 1024);
        expect(fs.existsSync(result.backupFile)).toBe(true);

        await runDatabaseRestore({
          connectionString: restoreConnectionString,
          backupFile: result.backupFile,
        });

        const counts = await restoreSql.unsafe<{ count: number }[]>(`
          SELECT count(*)::int AS count
          FROM "public"."backup_test_records"
        `);
        expect(counts[0]?.count).toBe(160);

        const sampleRows = await restoreSql.unsafe<{
          title: string;
          payload: string;
          state: string;
          metadata: { index: number; even: boolean };
        }[]>(`
          SELECT "title", "payload", "state"::text AS "state", "metadata"
          FROM "public"."backup_test_records"
          WHERE "title" IN ('row-0', 'row-159')
          ORDER BY "title"
        `);
        expect(sampleRows).toEqual([
          {
            title: "row-0",
            payload,
            state: "pending",
            metadata: { index: 0, even: true },
          },
          {
            title: "row-159",
            payload,
            state: "done",
            metadata: { index: 159, even: false },
          },
        ]);
      } finally {
        await sourceSql.end();
        await restoreSql.end();
      }
    },
    60_000,
  );
});
@@ -1,5 +1,5 @@
-import { existsSync, mkdirSync, readdirSync, statSync, unlinkSync } from "node:fs";
-import { readFile, writeFile } from "node:fs/promises";
+import { createWriteStream, existsSync, mkdirSync, readdirSync, statSync, unlinkSync } from "node:fs";
+import { readFile } from "node:fs/promises";
 import { basename, resolve } from "node:path";
 import postgres from "postgres";

@@ -47,6 +47,7 @@ type TableDefinition = {

 const DRIZZLE_SCHEMA = "drizzle";
 const DRIZZLE_MIGRATIONS_TABLE = "__drizzle_migrations";
+const DEFAULT_BACKUP_WRITE_BUFFER_BYTES = 1024 * 1024;

 const STATEMENT_BREAKPOINT = "-- paperclip statement breakpoint 69f6f3f1-42fd-46a6-bf17-d1d85f8f3900";

@@ -141,6 +142,102 @@ function tableKey(schemaName: string, tableName: string): string {
   return `${schemaName}.${tableName}`;
 }
+
+export function createBufferedTextFileWriter(filePath: string, maxBufferedBytes = DEFAULT_BACKUP_WRITE_BUFFER_BYTES) {
+  const stream = createWriteStream(filePath, { encoding: "utf8" });
+  const flushThreshold = Math.max(1, Math.trunc(maxBufferedBytes));
+  let bufferedLines: string[] = [];
+  let bufferedBytes = 0;
+  let firstChunk = true;
+  let closed = false;
+  let streamError: Error | null = null;
+  let pendingWrite = Promise.resolve();
+
+  stream.on("error", (error) => {
+    streamError = error;
+  });
+
+  const writeChunk = async (chunk: string): Promise<void> => {
+    if (streamError) throw streamError;
+    const canContinue = stream.write(chunk);
+    if (!canContinue) {
+      await new Promise<void>((resolve, reject) => {
+        const handleDrain = () => {
+          cleanup();
+          resolve();
+        };
+        const handleError = (error: Error) => {
+          cleanup();
+          reject(error);
+        };
+        const cleanup = () => {
+          stream.off("drain", handleDrain);
+          stream.off("error", handleError);
+        };
+        stream.once("drain", handleDrain);
+        stream.once("error", handleError);
+      });
+    }
+    if (streamError) throw streamError;
+  };
+
+  const flushBufferedLines = () => {
+    if (bufferedLines.length === 0) return;
+    const linesToWrite = bufferedLines;
+    bufferedLines = [];
+    bufferedBytes = 0;
+    const chunkBody = linesToWrite.join("\n");
+    const chunk = firstChunk ? chunkBody : `\n${chunkBody}`;
+    firstChunk = false;
+    pendingWrite = pendingWrite.then(() => writeChunk(chunk));
+  };
+
+  return {
+    emit(line: string) {
+      if (closed) {
+        throw new Error(`Cannot write to closed backup file: ${filePath}`);
+      }
+      if (streamError) throw streamError;
+      bufferedLines.push(line);
+      bufferedBytes += Buffer.byteLength(line, "utf8") + 1;
+      if (bufferedBytes >= flushThreshold) {
+        flushBufferedLines();
+      }
+    },
+    async close() {
+      if (closed) return;
+      closed = true;
+      flushBufferedLines();
+      await pendingWrite;
+      await new Promise<void>((resolve, reject) => {
+        if (streamError) {
+          reject(streamError);
+          return;
+        }
+        stream.end((error?: Error | null) => {
+          if (error) reject(error);
+          else resolve();
+        });
+      });
+      if (streamError) throw streamError;
+    },
+    async abort() {
+      if (closed) return;
+      closed = true;
+      bufferedLines = [];
+      bufferedBytes = 0;
+      stream.destroy();
+      await pendingWrite.catch(() => {});
+      if (existsSync(filePath)) {
+        try {
+          unlinkSync(filePath);
+        } catch {
+          // Preserve the original backup failure if temporary file cleanup also fails.
+        }
+      }
+    },
+  };
+}
+
 export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise<RunDatabaseBackupResult> {
   const filenamePrefix = opts.filenamePrefix ?? "paperclip";
const filenamePrefix = opts.filenamePrefix ?? "paperclip";
|
||||||
const retentionDays = Math.max(1, Math.trunc(opts.retentionDays));
|
const retentionDays = Math.max(1, Math.trunc(opts.retentionDays));
|
||||||
@@ -149,12 +246,14 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
|
|||||||
const excludedTableNames = normalizeTableNameSet(opts.excludeTables);
|
const excludedTableNames = normalizeTableNameSet(opts.excludeTables);
|
||||||
const nullifiedColumnsByTable = normalizeNullifyColumnMap(opts.nullifyColumns);
|
const nullifiedColumnsByTable = normalizeNullifyColumnMap(opts.nullifyColumns);
|
||||||
const sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
|
const sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
|
||||||
|
mkdirSync(opts.backupDir, { recursive: true });
|
||||||
|
const backupFile = resolve(opts.backupDir, `${filenamePrefix}-${timestamp()}.sql`);
|
||||||
|
const writer = createBufferedTextFileWriter(backupFile);
|
||||||
|
|
||||||
try {
|
try {
|
||||||
await sql`SELECT 1`;
|
await sql`SELECT 1`;
|
||||||
|
|
||||||
const lines: string[] = [];
|
const emit = (line: string) => writer.emit(line);
|
||||||
const emit = (line: string) => lines.push(line);
|
|
||||||
const emitStatement = (statement: string) => {
|
const emitStatement = (statement: string) => {
|
||||||
emit(statement);
|
emit(statement);
|
||||||
emit(STATEMENT_BREAKPOINT);
|
emit(STATEMENT_BREAKPOINT);
|
||||||
@@ -503,10 +602,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
|
|||||||
emitStatement("COMMIT;");
|
emitStatement("COMMIT;");
|
||||||
emit("");
|
emit("");
|
||||||
|
|
||||||
// Write the backup file
|
await writer.close();
|
||||||
mkdirSync(opts.backupDir, { recursive: true });
|
|
||||||
const backupFile = resolve(opts.backupDir, `${filenamePrefix}-${timestamp()}.sql`);
|
|
||||||
await writeFile(backupFile, lines.join("\n"), "utf8");
|
|
||||||
|
|
||||||
const sizeBytes = statSync(backupFile).size;
|
const sizeBytes = statSync(backupFile).size;
|
||||||
const prunedCount = pruneOldBackups(opts.backupDir, retentionDays, filenamePrefix);
|
const prunedCount = pruneOldBackups(opts.backupDir, retentionDays, filenamePrefix);
|
||||||
@@ -516,6 +612,9 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
|
|||||||
sizeBytes,
|
sizeBytes,
|
||||||
prunedCount,
|
prunedCount,
|
||||||
};
|
};
|
||||||
|
} catch (error) {
|
||||||
|
await writer.abort();
|
||||||
|
throw error;
|
||||||
} finally {
|
} finally {
|
||||||
await sql.end();
|
await sql.end();
|
||||||
}
|
}
|
||||||
|
|||||||
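The writer above batches emitted lines and flushes them as chunks once their UTF-8 size crosses a threshold, prefixing every chunk after the first with a newline so chunk boundaries never duplicate or drop line separators. A minimal in-memory sketch of that bookkeeping (names here are illustrative, not from the diff):

```typescript
// In-memory sketch of the flush-threshold buffering: lines accumulate
// until their UTF-8 size (plus one newline byte each) reaches the
// threshold, then they are flushed as one chunk. Chunks after the first
// get a leading "\n", so joining all chunks reproduces lines.join("\n").
function makeLineBuffer(flushThreshold: number) {
  const chunks: string[] = [];
  let buffered: string[] = [];
  let bufferedBytes = 0;
  let firstChunk = true;

  const flush = () => {
    if (buffered.length === 0) return;
    const body = buffered.join("\n");
    chunks.push(firstChunk ? body : `\n${body}`);
    firstChunk = false;
    buffered = [];
    bufferedBytes = 0;
  };

  return {
    emit(line: string) {
      buffered.push(line);
      bufferedBytes += Buffer.byteLength(line, "utf8") + 1;
      if (bufferedBytes >= flushThreshold) flush();
    },
    close(): string {
      flush();
      return chunks.join("");
    },
  };
}

const buf = makeLineBuffer(8); // tiny threshold to force multiple flushes
for (const line of ["alpha", "beta", "gamma"]) buf.emit(line);
const assembled = buf.close();
// assembled === "alpha\nbeta\ngamma" — identical to joining all lines with "\n"
```

The real implementation additionally chains each chunk onto `pendingWrite` and awaits the stream's `drain` event when `write()` returns `false`, which is what keeps memory bounded for large backups.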
 89  packages/db/src/check-migration-numbering.ts  Normal file
@@ -0,0 +1,89 @@
+import { readdir, readFile } from "node:fs/promises";
+import { fileURLToPath } from "node:url";
+
+const migrationsDir = fileURLToPath(new URL("./migrations", import.meta.url));
+const journalPath = fileURLToPath(new URL("./migrations/meta/_journal.json", import.meta.url));
+
+type JournalFile = {
+  entries?: Array<{
+    idx?: number;
+    tag?: string;
+  }>;
+};
+
+function migrationNumber(value: string): string | null {
+  const match = value.match(/^(\d{4})_/);
+  return match ? match[1] : null;
+}
+
+function ensureNoDuplicates(values: string[], label: string) {
+  const seen = new Map<string, string>();
+
+  for (const value of values) {
+    const number = migrationNumber(value);
+    if (!number) {
+      throw new Error(`${label} entry does not start with a 4-digit migration number: ${value}`);
+    }
+    const existing = seen.get(number);
+    if (existing) {
+      throw new Error(`Duplicate migration number ${number} in ${label}: ${existing}, ${value}`);
+    }
+    seen.set(number, value);
+  }
+}
+
+function ensureStrictlyOrdered(values: string[], label: string) {
+  const sorted = [...values].sort();
+  for (let index = 0; index < values.length; index += 1) {
+    if (values[index] !== sorted[index]) {
+      throw new Error(
+        `${label} are out of order at position ${index}: expected ${sorted[index]}, found ${values[index]}`,
+      );
+    }
+  }
+}
+
+function ensureJournalMatchesFiles(migrationFiles: string[], journalTags: string[]) {
+  const journalFiles = journalTags.map((tag) => `${tag}.sql`);
+
+  if (journalFiles.length !== migrationFiles.length) {
+    throw new Error(
+      `Migration journal/file count mismatch: journal has ${journalFiles.length}, files have ${migrationFiles.length}`,
+    );
+  }
+
+  for (let index = 0; index < migrationFiles.length; index += 1) {
+    const migrationFile = migrationFiles[index];
+    const journalFile = journalFiles[index];
+    if (migrationFile !== journalFile) {
+      throw new Error(
+        `Migration journal/file order mismatch at position ${index}: journal has ${journalFile}, files have ${migrationFile}`,
+      );
+    }
+  }
+}
+
+async function main() {
+  const migrationFiles = (await readdir(migrationsDir))
+    .filter((entry) => entry.endsWith(".sql"))
+    .sort();
+
+  ensureNoDuplicates(migrationFiles, "migration files");
+  ensureStrictlyOrdered(migrationFiles, "migration files");
+
+  const rawJournal = await readFile(journalPath, "utf8");
+  const journal = JSON.parse(rawJournal) as JournalFile;
+  const journalTags = (journal.entries ?? []).map((entry, index) => {
+    if (typeof entry.tag !== "string" || entry.tag.length === 0) {
+      throw new Error(`Migration journal entry ${index} is missing a tag`);
+    }
+    return entry.tag;
+  });
+
+  ensureNoDuplicates(journalTags, "migration journal");
+  ensureStrictlyOrdered(journalTags, "migration journal");
+  ensureJournalMatchesFiles(migrationFiles, journalTags);
+}
+
+await main();
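The numbering check rejects any pair of migration files that share the same 4-digit prefix, which is how PAPA-45 guards against two branches both claiming, say, number 0047. A small self-contained sketch of that guard (the `findDuplicateNumber` helper is illustrative; the real script throws instead of returning):

```typescript
// Mirrors the duplicate-number guard from check-migration-numbering.ts:
// a migration name must start with a 4-digit number, and no two names
// may share that number.
function migrationNumber(value: string): string | null {
  const match = value.match(/^(\d{4})_/);
  return match ? match[1] : null;
}

// Returns the first duplicated 4-digit prefix, or null if all are unique.
function findDuplicateNumber(files: string[]): string | null {
  const seen = new Set<string>();
  for (const file of files) {
    const number = migrationNumber(file);
    if (number === null) continue; // the real script throws on malformed names
    if (seen.has(number)) return number;
    seen.add(number);
  }
  return null;
}

// Two branches both generating an "0047_" migration would be caught:
findDuplicateNumber(["0046_smooth_sentinels.sql", "0047_overjoyed_groot.sql", "0047_other_branch.sql"]);
// → "0047"
```

Running the script in CI (alongside the journal/file cross-check) turns a silent merge hazard into a hard build failure.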
@@ -241,4 +241,164 @@ describeEmbeddedPostgres("applyPendingMigrations", () => {
     },
     20_000,
   );
 
+  it(
+    "replays migration 0047 safely when feedback tables and run columns already exist",
+    async () => {
+      const connectionString = await createTempDatabase();
+
+      await applyPendingMigrations(connectionString);
+
+      const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
+      try {
+        const overjoyedGrootHash = await migrationHash("0047_overjoyed_groot.sql");
+
+        await sql.unsafe(
+          `DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${overjoyedGrootHash}'`,
+        );
+
+        const tables = await sql.unsafe<{ table_name: string }[]>(
+          `
+            SELECT table_name
+            FROM information_schema.tables
+            WHERE table_schema = 'public'
+              AND table_name IN ('feedback_exports', 'feedback_votes')
+            ORDER BY table_name
+          `,
+        );
+        expect(tables.map((row) => row.table_name)).toEqual([
+          "feedback_exports",
+          "feedback_votes",
+        ]);
+
+        const columns = await sql.unsafe<{ table_name: string; column_name: string }[]>(
+          `
+            SELECT table_name, column_name
+            FROM information_schema.columns
+            WHERE table_schema = 'public'
+              AND (
+                (table_name = 'companies' AND column_name IN (
+                  'feedback_data_sharing_enabled',
+                  'feedback_data_sharing_consent_at',
+                  'feedback_data_sharing_consent_by_user_id',
+                  'feedback_data_sharing_terms_version'
+                ))
+                OR (table_name = 'document_revisions' AND column_name = 'created_by_run_id')
+                OR (table_name = 'issue_comments' AND column_name = 'created_by_run_id')
+              )
+            ORDER BY table_name, column_name
+          `,
+        );
+        expect(columns).toHaveLength(6);
+      } finally {
+        await sql.end();
+      }
+
+      const pendingState = await inspectMigrations(connectionString);
+      expect(pendingState).toMatchObject({
+        status: "needsMigrations",
+        pendingMigrations: ["0047_overjoyed_groot.sql"],
+        reason: "pending-migrations",
+      });
+
+      await applyPendingMigrations(connectionString);
+
+      const finalState = await inspectMigrations(connectionString);
+      expect(finalState.status).toBe("upToDate");
+
+      const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
+      try {
+        const constraints = await verifySql.unsafe<{ conname: string }[]>(
+          `
+            SELECT conname
+            FROM pg_constraint
+            WHERE conname IN (
+              'feedback_exports_company_id_companies_id_fk',
+              'feedback_exports_feedback_vote_id_feedback_votes_id_fk',
+              'feedback_exports_issue_id_issues_id_fk',
+              'feedback_votes_company_id_companies_id_fk',
+              'feedback_votes_issue_id_issues_id_fk'
+            )
+            ORDER BY conname
+          `,
+        );
+        expect(constraints.map((row) => row.conname)).toEqual([
+          "feedback_exports_company_id_companies_id_fk",
+          "feedback_exports_feedback_vote_id_feedback_votes_id_fk",
+          "feedback_exports_issue_id_issues_id_fk",
+          "feedback_votes_company_id_companies_id_fk",
+          "feedback_votes_issue_id_issues_id_fk",
+        ]);
+      } finally {
+        await verifySql.end();
+      }
+    },
+    20_000,
+  );
+
+  it(
+    "replays migration 0048 safely when routines.variables already exists",
+    async () => {
+      const connectionString = await createTempDatabase();
+
+      await applyPendingMigrations(connectionString);
+
+      const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
+      try {
+        const flashyMarrowHash = await migrationHash("0048_flashy_marrow.sql");
+
+        await sql.unsafe(
+          `DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${flashyMarrowHash}'`,
+        );
+
+        const columns = await sql.unsafe<{ column_name: string }[]>(
+          `
+            SELECT column_name
+            FROM information_schema.columns
+            WHERE table_schema = 'public'
+              AND table_name = 'routines'
+              AND column_name = 'variables'
+          `,
+        );
+        expect(columns).toHaveLength(1);
+      } finally {
+        await sql.end();
+      }
+
+      const pendingState = await inspectMigrations(connectionString);
+      expect(pendingState).toMatchObject({
+        status: "needsMigrations",
+        pendingMigrations: ["0048_flashy_marrow.sql"],
+        reason: "pending-migrations",
+      });
+
+      await applyPendingMigrations(connectionString);
+
+      const finalState = await inspectMigrations(connectionString);
+      expect(finalState.status).toBe("upToDate");
+
+      const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
+      try {
+        const columns = await verifySql.unsafe<{ column_name: string; is_nullable: string; data_type: string }[]>(
+          `
+            SELECT column_name, is_nullable, data_type
+            FROM information_schema.columns
+            WHERE table_schema = 'public'
+              AND table_name = 'routines'
+              AND column_name = 'variables'
+          `,
+        );
+        expect(columns).toEqual([
+          expect.objectContaining({
+            column_name: "variables",
+            is_nullable: "NO",
+            data_type: "jsonb",
+          }),
+        ]);
+      } finally {
+        await verifySql.end();
+      }
+    },
+    20_000,
+  );
 });
 102  packages/db/src/migrations/0047_overjoyed_groot.sql  Normal file
@@ -0,0 +1,102 @@
+CREATE TABLE IF NOT EXISTS "feedback_exports" (
+    "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
+    "company_id" uuid NOT NULL,
+    "feedback_vote_id" uuid NOT NULL,
+    "issue_id" uuid NOT NULL,
+    "project_id" uuid,
+    "author_user_id" text NOT NULL,
+    "target_type" text NOT NULL,
+    "target_id" text NOT NULL,
+    "vote" text NOT NULL,
+    "status" text DEFAULT 'local_only' NOT NULL,
+    "destination" text,
+    "export_id" text,
+    "consent_version" text,
+    "schema_version" text DEFAULT 'paperclip-feedback-envelope-v2' NOT NULL,
+    "bundle_version" text DEFAULT 'paperclip-feedback-bundle-v2' NOT NULL,
+    "payload_version" text DEFAULT 'paperclip-feedback-v1' NOT NULL,
+    "payload_digest" text,
+    "payload_snapshot" jsonb,
+    "target_summary" jsonb NOT NULL,
+    "redaction_summary" jsonb,
+    "attempt_count" integer DEFAULT 0 NOT NULL,
+    "last_attempted_at" timestamp with time zone,
+    "exported_at" timestamp with time zone,
+    "failure_reason" text,
+    "created_at" timestamp with time zone DEFAULT now() NOT NULL,
+    "updated_at" timestamp with time zone DEFAULT now() NOT NULL
+);
+--> statement-breakpoint
+CREATE TABLE IF NOT EXISTS "feedback_votes" (
+    "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
+    "company_id" uuid NOT NULL,
+    "issue_id" uuid NOT NULL,
+    "target_type" text NOT NULL,
+    "target_id" text NOT NULL,
+    "author_user_id" text NOT NULL,
+    "vote" text NOT NULL,
+    "reason" text,
+    "shared_with_labs" boolean DEFAULT false NOT NULL,
+    "shared_at" timestamp with time zone,
+    "consent_version" text,
+    "redaction_summary" jsonb,
+    "created_at" timestamp with time zone DEFAULT now() NOT NULL,
+    "updated_at" timestamp with time zone DEFAULT now() NOT NULL
+);
+--> statement-breakpoint
+ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_enabled" boolean DEFAULT false NOT NULL;--> statement-breakpoint
+ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_consent_at" timestamp with time zone;--> statement-breakpoint
+ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_consent_by_user_id" text;--> statement-breakpoint
+ALTER TABLE "companies" ADD COLUMN IF NOT EXISTS "feedback_data_sharing_terms_version" text;--> statement-breakpoint
+ALTER TABLE "document_revisions" ADD COLUMN IF NOT EXISTS "created_by_run_id" uuid;--> statement-breakpoint
+ALTER TABLE "issue_comments" ADD COLUMN IF NOT EXISTS "created_by_run_id" uuid;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_company_id_companies_id_fk') THEN
+        ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_feedback_vote_id_feedback_votes_id_fk') THEN
+        ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_feedback_vote_id_feedback_votes_id_fk" FOREIGN KEY ("feedback_vote_id") REFERENCES "public"."feedback_votes"("id") ON DELETE cascade ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_issue_id_issues_id_fk') THEN
+        ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE cascade ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_exports_project_id_projects_id_fk') THEN
+        ALTER TABLE "feedback_exports" ADD CONSTRAINT "feedback_exports_project_id_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."projects"("id") ON DELETE set null ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_votes_company_id_companies_id_fk') THEN
+        ALTER TABLE "feedback_votes" ADD CONSTRAINT "feedback_votes_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'feedback_votes_issue_id_issues_id_fk') THEN
+        ALTER TABLE "feedback_votes" ADD CONSTRAINT "feedback_votes_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE no action ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+CREATE UNIQUE INDEX IF NOT EXISTS "feedback_exports_feedback_vote_idx" ON "feedback_exports" USING btree ("feedback_vote_id");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_exports_company_created_idx" ON "feedback_exports" USING btree ("company_id","created_at");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_exports_company_status_idx" ON "feedback_exports" USING btree ("company_id","status","created_at");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_exports_company_issue_idx" ON "feedback_exports" USING btree ("company_id","issue_id","created_at");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_exports_company_project_idx" ON "feedback_exports" USING btree ("company_id","project_id","created_at");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_exports_company_author_idx" ON "feedback_exports" USING btree ("company_id","author_user_id","created_at");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_votes_company_issue_idx" ON "feedback_votes" USING btree ("company_id","issue_id");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_votes_issue_target_idx" ON "feedback_votes" USING btree ("issue_id","target_type","target_id");--> statement-breakpoint
+CREATE INDEX IF NOT EXISTS "feedback_votes_author_idx" ON "feedback_votes" USING btree ("author_user_id","created_at");--> statement-breakpoint
+CREATE UNIQUE INDEX IF NOT EXISTS "feedback_votes_company_target_author_idx" ON "feedback_votes" USING btree ("company_id","target_type","target_id","author_user_id");--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'document_revisions_created_by_run_id_heartbeat_runs_id_fk') THEN
+        ALTER TABLE "document_revisions" ADD CONSTRAINT "document_revisions_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;
+    END IF;
+END $$;--> statement-breakpoint
+DO $$ BEGIN
+    IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'issue_comments_created_by_run_id_heartbeat_runs_id_fk') THEN
+        ALTER TABLE "issue_comments" ADD CONSTRAINT "issue_comments_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;
+    END IF;
+END $$;
 1  packages/db/src/migrations/0048_flashy_marrow.sql  Normal file
@@ -0,0 +1 @@
+ALTER TABLE "routines" ADD COLUMN IF NOT EXISTS "variables" jsonb DEFAULT '[]'::jsonb NOT NULL;
 12539  packages/db/src/migrations/meta/0047_snapshot.json  Normal file
File diff suppressed because it is too large
 12546  packages/db/src/migrations/meta/0048_snapshot.json  Normal file
File diff suppressed because it is too large
@@ -330,6 +330,20 @@
       "when": 1774960197878,
       "tag": "0046_smooth_sentinels",
       "breakpoints": true
+    },
+    {
+      "idx": 47,
+      "version": "7",
+      "when": 1775137972687,
+      "tag": "0047_overjoyed_groot",
+      "breakpoints": true
+    },
+    {
+      "idx": 48,
+      "version": "7",
+      "when": 1775145655557,
+      "tag": "0048_flashy_marrow",
+      "breakpoints": true
     }
   ]
 }
@@ -16,6 +16,12 @@ export const companies = pgTable(
     requireBoardApprovalForNewAgents: boolean("require_board_approval_for_new_agents")
      .notNull()
      .default(true),
+    feedbackDataSharingEnabled: boolean("feedback_data_sharing_enabled")
+      .notNull()
+      .default(false),
+    feedbackDataSharingConsentAt: timestamp("feedback_data_sharing_consent_at", { withTimezone: true }),
+    feedbackDataSharingConsentByUserId: text("feedback_data_sharing_consent_by_user_id"),
+    feedbackDataSharingTermsVersion: text("feedback_data_sharing_terms_version"),
     brandColor: text("brand_color"),
     createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
     updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
@@ -2,6 +2,7 @@ import { pgTable, uuid, text, integer, timestamp, index, uniqueIndex } from "dri
 import { companies } from "./companies.js";
 import { agents } from "./agents.js";
 import { documents } from "./documents.js";
+import { heartbeatRuns } from "./heartbeat_runs.js";
 
 export const documentRevisions = pgTable(
   "document_revisions",
@@ -16,6 +17,7 @@ export const documentRevisions = pgTable(
     changeSummary: text("change_summary"),
     createdByAgentId: uuid("created_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
     createdByUserId: text("created_by_user_id"),
+    createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
     createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
   },
   (table) => ({
 45  packages/db/src/schema/feedback_exports.ts  Normal file
@@ -0,0 +1,45 @@
+import { index, integer, jsonb, pgTable, text, timestamp, uniqueIndex, uuid } from "drizzle-orm/pg-core";
+import { companies } from "./companies.js";
+import { feedbackVotes } from "./feedback_votes.js";
+import { issues } from "./issues.js";
+import { projects } from "./projects.js";
+
+export const feedbackExports = pgTable(
+  "feedback_exports",
+  {
+    id: uuid("id").primaryKey().defaultRandom(),
+    companyId: uuid("company_id").notNull().references(() => companies.id),
+    feedbackVoteId: uuid("feedback_vote_id").notNull().references(() => feedbackVotes.id, { onDelete: "cascade" }),
+    issueId: uuid("issue_id").notNull().references(() => issues.id, { onDelete: "cascade" }),
+    projectId: uuid("project_id").references(() => projects.id, { onDelete: "set null" }),
+    authorUserId: text("author_user_id").notNull(),
+    targetType: text("target_type").notNull(),
+    targetId: text("target_id").notNull(),
+    vote: text("vote").notNull(),
+    status: text("status").notNull().default("local_only"),
+    destination: text("destination"),
+    exportId: text("export_id"),
+    consentVersion: text("consent_version"),
+    schemaVersion: text("schema_version").notNull().default("paperclip-feedback-envelope-v2"),
+    bundleVersion: text("bundle_version").notNull().default("paperclip-feedback-bundle-v2"),
+    payloadVersion: text("payload_version").notNull().default("paperclip-feedback-v1"),
+    payloadDigest: text("payload_digest"),
+    payloadSnapshot: jsonb("payload_snapshot"),
+    targetSummary: jsonb("target_summary").notNull(),
+    redactionSummary: jsonb("redaction_summary"),
+    attemptCount: integer("attempt_count").notNull().default(0),
+    lastAttemptedAt: timestamp("last_attempted_at", { withTimezone: true }),
+    exportedAt: timestamp("exported_at", { withTimezone: true }),
+    failureReason: text("failure_reason"),
+    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
+    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
+  },
+  (table) => ({
+    voteUniqueIdx: uniqueIndex("feedback_exports_feedback_vote_idx").on(table.feedbackVoteId),
+    companyCreatedIdx: index("feedback_exports_company_created_idx").on(table.companyId, table.createdAt),
+    companyStatusIdx: index("feedback_exports_company_status_idx").on(table.companyId, table.status, table.createdAt),
+    companyIssueIdx: index("feedback_exports_company_issue_idx").on(table.companyId, table.issueId, table.createdAt),
+    companyProjectIdx: index("feedback_exports_company_project_idx").on(table.companyId, table.projectId, table.createdAt),
+    companyAuthorIdx: index("feedback_exports_company_author_idx").on(table.companyId, table.authorUserId, table.createdAt),
+  }),
+);
 34  packages/db/src/schema/feedback_votes.ts  Normal file
@@ -0,0 +1,34 @@
+import { boolean, index, jsonb, pgTable, text, timestamp, uniqueIndex, uuid } from "drizzle-orm/pg-core";
+import { companies } from "./companies.js";
+import { issues } from "./issues.js";
+
+export const feedbackVotes = pgTable(
+  "feedback_votes",
+  {
+    id: uuid("id").primaryKey().defaultRandom(),
+    companyId: uuid("company_id").notNull().references(() => companies.id),
+    issueId: uuid("issue_id").notNull().references(() => issues.id),
+    targetType: text("target_type").notNull(),
+    targetId: text("target_id").notNull(),
+    authorUserId: text("author_user_id").notNull(),
+    vote: text("vote").notNull(),
+    reason: text("reason"),
+    sharedWithLabs: boolean("shared_with_labs").notNull().default(false),
+    sharedAt: timestamp("shared_at", { withTimezone: true }),
+    consentVersion: text("consent_version"),
+    redactionSummary: jsonb("redaction_summary"),
+    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
+    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
+  },
+  (table) => ({
+    companyIssueIdx: index("feedback_votes_company_issue_idx").on(table.companyId, table.issueId),
+    issueTargetIdx: index("feedback_votes_issue_target_idx").on(table.issueId, table.targetType, table.targetId),
+    authorIdx: index("feedback_votes_author_idx").on(table.authorUserId, table.createdAt),
+    companyTargetAuthorUniqueIdx: uniqueIndex("feedback_votes_company_target_author_idx").on(
+      table.companyId,
+      table.targetType,
+      table.targetId,
+      table.authorUserId,
+    ),
+  }),
+);
@@ -32,6 +32,8 @@ export { issueLabels } from "./issue_labels.js";
 export { issueApprovals } from "./issue_approvals.js";
 export { issueComments } from "./issue_comments.js";
 export { issueInboxArchives } from "./issue_inbox_archives.js";
+export { feedbackVotes } from "./feedback_votes.js";
+export { feedbackExports } from "./feedback_exports.js";
 export { issueReadStates } from "./issue_read_states.js";
 export { assets } from "./assets.js";
 export { issueAttachments } from "./issue_attachments.js";
@@ -2,6 +2,7 @@ import { pgTable, uuid, text, timestamp, index } from "drizzle-orm/pg-core";
 import { companies } from "./companies.js";
 import { issues } from "./issues.js";
 import { agents } from "./agents.js";
+import { heartbeatRuns } from "./heartbeat_runs.js";
 
 export const issueComments = pgTable(
   "issue_comments",
@@ -11,6 +12,7 @@ export const issueComments = pgTable(
     issueId: uuid("issue_id").notNull().references(() => issues.id),
     authorAgentId: uuid("author_agent_id").references(() => agents.id),
     authorUserId: text("author_user_id"),
+    createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
     body: text("body").notNull(),
     createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
     updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
@@ -15,6 +15,7 @@ import { companySecrets } from "./company_secrets.js";
 import { issues } from "./issues.js";
 import { projects } from "./projects.js";
 import { goals } from "./goals.js";
+import type { RoutineVariable } from "@paperclipai/shared";
 
 export const routines = pgTable(
   "routines",
@@ -31,6 +32,7 @@ export const routines = pgTable(
     status: text("status").notNull().default("active"),
     concurrencyPolicy: text("concurrency_policy").notNull().default("coalesce_if_active"),
     catchUpPolicy: text("catch_up_policy").notNull().default("skip_missed"),
+    variables: jsonb("variables").$type<RoutineVariable[]>().notNull().default([]),
     createdByAgentId: uuid("created_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
     createdByUserId: text("created_by_user_id"),
     updatedByAgentId: uuid("updated_by_agent_id").references(() => agents.id, { onDelete: "set null" }),
@@ -41,6 +41,7 @@ const manifest: PaperclipPluginManifestV1 = {
     "goals.update",
     "activity.log.write",
     "metrics.write",
+    "telemetry.track",
    "plugin.state.read",
     "plugin.state.write",
     "events.subscribe",
@@ -405,6 +405,16 @@ async function registerActionHandlers(ctx: PluginContext): Promise<void> {
       data: { companyId },
     });
     await ctx.metrics.write("demo.events.emitted", 1, { source: "manual" });
+    await ctx.telemetry.track("demo_event", {
+      source: "manual",
+      has_company: Boolean(companyId),
+    });
+    pushRecord({
+      level: "info",
+      source: "telemetry",
+      message: "Tracked plugin telemetry event demo_event",
+      data: { companyId },
+    });
     return { ok: true, message };
   });
 
@@ -312,6 +312,7 @@ Declare in `manifest.capabilities`. Grouped by scope:
 | | `issue.comments.create` |
 | | `activity.log.write` |
 | | `metrics.write` |
+| | `telemetry.track` |
 | **Instance** | `instance.settings.register` |
 | | `plugin.state.read` |
 | | `plugin.state.write` |
@@ -135,6 +135,11 @@ export interface HostServices {
     write(params: WorkerToHostMethods["metrics.write"][0]): Promise<void>;
   };
 
+  /** Provides `telemetry.track`. */
+  telemetry: {
+    track(params: WorkerToHostMethods["telemetry.track"][0]): Promise<void>;
+  };
+
   /** Provides `log`. */
   logger: {
     log(params: WorkerToHostMethods["log"][0]): Promise<void>;
@@ -284,6 +289,9 @@ const METHOD_CAPABILITY_MAP: Record<WorkerToHostMethodName, PluginCapability | null> = {
   // Metrics
   "metrics.write": "metrics.write",
 
+  // Telemetry
+  "telemetry.track": "telemetry.track",
+
   // Logger — always allowed
   "log": null,
 
@@ -447,6 +455,11 @@ export function createHostClientHandlers(
     return services.metrics.write(params);
   }),
 
+  // Telemetry
+  "telemetry.track": gated("telemetry.track", async (params) => {
+    return services.telemetry.track(params);
+  }),
+
   // Logger
   "log": gated("log", async (params) => {
     return services.logger.log(params);
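The pattern above pairs every worker-to-host RPC method with a capability in `METHOD_CAPABILITY_MAP` (or `null` for always-allowed methods like `log`), and `gated(...)` rejects calls the plugin's manifest never declared. A hedged standalone sketch of that dispatch shape (names below are illustrative, not the real API):

```typescript
// Capability-gated dispatch sketch: a method maps to a required capability,
// or to null when it is always allowed.
type Handler = (params: unknown) => unknown;

const methodCapabilityMap: Record<string, string | null> = {
  "metrics.write": "metrics.write",
  "telemetry.track": "telemetry.track",
  "log": null, // logger is always allowed
};

function gated(granted: Set<string>, method: string, handler: Handler): Handler {
  return (params) => {
    const required = methodCapabilityMap[method];
    if (required !== null && required !== undefined && !granted.has(required)) {
      throw new Error(`capability "${required}" not granted for ${method}`);
    }
    return handler(params);
  };
}

// Manifest declared telemetry.track but not metrics.write.
const granted = new Set(["telemetry.track"]);
const calls: string[] = [];
gated(granted, "telemetry.track", () => calls.push("track"))({});
let denied = "";
try {
  gated(granted, "metrics.write", () => calls.push("write"))({});
} catch (err) {
  denied = (err as Error).message;
}
```

Gating at the RPC boundary means a plugin that omits a capability from its manifest fails fast at call time instead of silently writing host data.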
@@ -182,6 +182,7 @@ export type {
   PluginStreamsClient,
   PluginToolsClient,
   PluginMetricsClient,
+  PluginTelemetryClient,
   PluginLogger,
 } from "./types.js";
 
@@ -519,6 +519,12 @@ export interface WorkerToHostMethods {
     result: void,
   ];
 
+  // Telemetry
+  "telemetry.track": [
+    params: { eventName: string; dimensions?: Record<string, string | number | boolean> },
+    result: void,
+  ];
+
   // Logger
   "log": [
     params: { level: "info" | "warn" | "error" | "debug"; message: string; meta?: Record<string, unknown> },
@@ -71,6 +71,7 @@ export interface TestHarness {
   logs: TestHarnessLogEntry[];
   activity: Array<{ message: string; entityType?: string; entityId?: string; metadata?: Record<string, unknown> }>;
   metrics: Array<{ name: string; value: number; tags?: Record<string, string> }>;
+  telemetry: Array<{ eventName: string; dimensions?: Record<string, string | number | boolean> }>;
 }
 
 type EventRegistration = {
@@ -132,6 +133,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
   const logs: TestHarnessLogEntry[] = [];
   const activity: TestHarness["activity"] = [];
   const metrics: TestHarness["metrics"] = [];
+  const telemetry: TestHarness["telemetry"] = [];
 
   const state = new Map<string, unknown>();
   const entities = new Map<string, PluginEntityRecord>();
@@ -631,6 +633,12 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
       metrics.push({ name, value, tags });
       },
     },
+    telemetry: {
+      async track(eventName, dimensions) {
+        requireCapability(manifest, capabilitySet, "telemetry.track");
+        telemetry.push({ eventName, dimensions });
+      },
+    },
     logger: {
       info(message, meta) {
         logs.push({ level: "info", message, meta });
@@ -729,6 +737,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
     logs,
     activity,
     metrics,
+    telemetry,
   };
 
   return harness;
@@ -761,6 +761,28 @@ export interface PluginMetricsClient {
   write(name: string, value: number, tags?: Record<string, string>): Promise<void>;
 }
 
+/**
+ * `ctx.telemetry` — emit plugin-scoped telemetry to the host's external
+ * telemetry pipeline.
+ *
+ * Requires `telemetry.track` capability.
+ */
+export interface PluginTelemetryClient {
+  /**
+   * Track a plugin telemetry event.
+   *
+   * The host prefixes the final event name as `plugin.<pluginId>.<eventName>`
+   * before forwarding it to the shared telemetry client.
+   *
+   * @param eventName - Bare plugin event slug (for example `"sync_completed"`)
+   * @param dimensions - Optional structured dimensions
+   */
+  track(
+    eventName: string,
+    dimensions?: Record<string, string | number | boolean>,
+  ): Promise<void>;
+}
+
 /**
  * `ctx.companies` — read company metadata.
  *
@@ -1156,6 +1178,9 @@ export interface PluginContext {
   /** Write plugin metrics. Requires `metrics.write`. */
   metrics: PluginMetricsClient;
 
+  /** Emit plugin-scoped external telemetry. Requires `telemetry.track`. */
+  telemetry: PluginTelemetryClient;
+
   /** Structured logger. Output is captured and surfaced in the plugin health dashboard. */
   logger: PluginLogger;
 }
@@ -793,6 +793,15 @@ export function startWorkerRpcHost(options: WorkerRpcHostOptions): WorkerRpcHost
       },
     },
 
+    telemetry: {
+      async track(
+        eventName: string,
+        dimensions?: Record<string, string | number | boolean>,
+      ): Promise<void> {
+        await callHost("telemetry.track", { eventName, dimensions });
+      },
+    },
+
     logger: {
       info(message: string, meta?: Record<string, unknown>): void {
         notifyHost("log", { level: "info", message, meta });
@@ -14,6 +14,7 @@
   "type": "module",
   "exports": {
     ".": "./src/index.ts",
+    "./telemetry": "./src/telemetry/index.ts",
     "./*": "./src/*.ts"
   },
   "publishConfig": {
@@ -23,6 +24,10 @@
     "types": "./dist/index.d.ts",
     "import": "./dist/index.js"
   },
+  "./telemetry": {
+    "types": "./dist/telemetry/index.d.ts",
+    "import": "./dist/telemetry/index.js"
+  },
   "./*": {
     "types": "./dist/*.d.ts",
     "import": "./dist/*.js"
@@ -95,6 +95,10 @@ export const secretsConfigSchema = z.object({
   }),
 });
 
+export const telemetryConfigSchema = z.object({
+  enabled: z.boolean().default(true),
+}).default({});
+
 export const paperclipConfigSchema = z
   .object({
     $meta: configMetaSchema,
@@ -102,6 +106,7 @@ export const paperclipConfigSchema = z
     database: databaseConfigSchema,
     logging: loggingConfigSchema,
     server: serverConfigSchema,
+    telemetry: telemetryConfigSchema,
     auth: authConfigSchema.default({
       baseUrlMode: "auto",
       disableSignUp: false,
@@ -174,5 +179,6 @@ export type StorageS3Config = z.infer<typeof storageS3ConfigSchema>;
 export type SecretsConfig = z.infer<typeof secretsConfigSchema>;
 export type SecretsLocalEncryptedConfig = z.infer<typeof secretsLocalEncryptedConfigSchema>;
 export type AuthConfig = z.infer<typeof authConfigSchema>;
+export type TelemetryConfig = z.infer<typeof telemetryConfigSchema>;
 export type ConfigMeta = z.infer<typeof configMetaSchema>;
 export type DatabaseBackupConfig = z.infer<typeof databaseBackupConfigSchema>;
@@ -166,6 +166,9 @@ export type RoutineTriggerKind = (typeof ROUTINE_TRIGGER_KINDS)[number];
 export const ROUTINE_TRIGGER_SIGNING_MODES = ["bearer", "hmac_sha256"] as const;
 export type RoutineTriggerSigningMode = (typeof ROUTINE_TRIGGER_SIGNING_MODES)[number];
 
+export const ROUTINE_VARIABLE_TYPES = ["text", "textarea", "number", "boolean", "select"] as const;
+export type RoutineVariableType = (typeof ROUTINE_VARIABLE_TYPES)[number];
+
 export const ROUTINE_RUN_STATUSES = [
   "received",
   "coalesced",
@@ -448,6 +451,7 @@ export const PLUGIN_CAPABILITIES = [
   "agent.sessions.close",
   "activity.log.write",
   "metrics.write",
+  "telemetry.track",
   // Plugin State
   "plugin.state.read",
   "plugin.state.write",
@@ -21,6 +21,7 @@ export {
   ROUTINE_CATCH_UP_POLICIES,
   ROUTINE_TRIGGER_KINDS,
   ROUTINE_TRIGGER_SIGNING_MODES,
+  ROUTINE_VARIABLE_TYPES,
   ROUTINE_RUN_STATUSES,
   ROUTINE_RUN_SOURCES,
   PAUSE_REASONS,
@@ -88,6 +89,7 @@ export {
   type RoutineCatchUpPolicy,
   type RoutineTriggerKind,
   type RoutineTriggerSigningMode,
+  type RoutineVariableType,
   type RoutineRunStatus,
   type RoutineRunSource,
   type PauseReason,
@@ -138,6 +140,16 @@ export {
 
 export type {
   Company,
+  FeedbackVote,
+  FeedbackDataSharingPreference,
+  FeedbackTargetType,
+  FeedbackVoteValue,
+  FeedbackTrace,
+  FeedbackTraceStatus,
+  FeedbackTraceTargetSummary,
+  FeedbackTraceBundleCaptureStatus,
+  FeedbackTraceBundleFile,
+  FeedbackTraceBundle,
   CompanySkillSourceType,
   CompanySkillTrustLevel,
   CompanySkillCompatibility,
@@ -245,6 +257,8 @@ export type {
   FinanceSummary,
   FinanceByBiller,
   FinanceByKind,
+  AgentWakeupResponse,
+  AgentWakeupSkipped,
   HeartbeatRun,
   HeartbeatRunEvent,
   AgentRuntimeState,
@@ -294,6 +308,8 @@ export type {
   CompanySecret,
   SecretProviderDescriptor,
   Routine,
+  RoutineVariable,
+  RoutineVariableDefaultValue,
   RoutineTrigger,
   RoutineRun,
   RoutineTriggerSecretMaterial,
@@ -325,6 +341,15 @@ export type {
   ProviderQuotaResult,
 } from "./types/index.js";
 
+export {
+  DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE,
+  FEEDBACK_TARGET_TYPES,
+  FEEDBACK_DATA_SHARING_PREFERENCES,
+  FEEDBACK_TRACE_STATUSES,
+  FEEDBACK_VOTE_VALUES,
+  DEFAULT_FEEDBACK_DATA_SHARING_TERMS_VERSION,
+} from "./types/feedback.js";
+
 export {
   instanceGeneralSettingsSchema,
   patchInstanceGeneralSettingsSchema,
@@ -338,9 +363,14 @@ export {
   createCompanySchema,
   updateCompanySchema,
   updateCompanyBrandingSchema,
+  feedbackTargetTypeSchema,
+  feedbackTraceStatusSchema,
+  feedbackVoteValueSchema,
+  upsertIssueFeedbackVoteSchema,
   type CreateCompany,
   type UpdateCompany,
   type UpdateCompanyBranding,
+  type UpsertIssueFeedbackVote,
   agentSkillStateSchema,
   agentSkillSyncModeSchema,
   agentSkillEntrySchema,
@@ -449,6 +479,7 @@ export {
   updateRoutineSchema,
   createRoutineTriggerSchema,
   updateRoutineTriggerSchema,
+  routineVariableSchema,
   runRoutineSchema,
   rotateRoutineTriggerSecretSchema,
   type CreateSecret,
@@ -573,6 +604,14 @@ export {
   type ParsedProjectMention,
 } from "./project-mentions.js";
 
+export {
+  extractRoutineVariableNames,
+  interpolateRoutineTemplate,
+  isValidRoutineVariableName,
+  stringifyRoutineVariableValue,
+  syncRoutineVariablesWithTemplate,
+} from "./routine-variables.js";
+
 export {
   paperclipConfigSchema,
   configMetaSchema,
@@ -587,6 +626,8 @@ export {
   storageLocalDiskConfigSchema,
   storageS3ConfigSchema,
   secretsLocalEncryptedConfigSchema,
+  telemetryConfigSchema,
+  type TelemetryConfig,
   type PaperclipConfig,
   type LlmConfig,
   type DatabaseBackupConfig,
packages/shared/src/routine-variables.test.ts (new file, +34)
@@ -0,0 +1,34 @@
+import { describe, expect, it } from "vitest";
+import {
+  extractRoutineVariableNames,
+  interpolateRoutineTemplate,
+  syncRoutineVariablesWithTemplate,
+} from "./routine-variables.js";
+
+describe("routine variable helpers", () => {
+  it("extracts placeholder names in first-appearance order", () => {
+    expect(
+      extractRoutineVariableNames("Review {{repo}} and {{priority}} for {{repo}}"),
+    ).toEqual(["repo", "priority"]);
+  });
+
+  it("preserves existing metadata when syncing variables from a template", () => {
+    expect(
+      syncRoutineVariablesWithTemplate("Review {{repo}} and {{priority}}", [
+        { name: "repo", label: "Repository", type: "text", defaultValue: "paperclip", required: true, options: [] },
+      ]),
+    ).toEqual([
+      { name: "repo", label: "Repository", type: "text", defaultValue: "paperclip", required: true, options: [] },
+      { name: "priority", label: null, type: "text", defaultValue: null, required: true, options: [] },
+    ]);
+  });
+
+  it("interpolates provided variable values into the routine template", () => {
+    expect(
+      interpolateRoutineTemplate("Review {{repo}} for {{priority}}", {
+        repo: "paperclip",
+        priority: "high",
+      }),
+    ).toBe("Review paperclip for high");
+  });
+});
packages/shared/src/routine-variables.ts (new file, +62)
@@ -0,0 +1,62 @@
+import type { RoutineVariable } from "./types/routine.js";
+
+const ROUTINE_VARIABLE_MATCHER = /\{\{\s*([A-Za-z][A-Za-z0-9_]*)\s*\}\}/g;
+
+export function isValidRoutineVariableName(name: string): boolean {
+  return /^[A-Za-z][A-Za-z0-9_]*$/.test(name);
+}
+
+export function extractRoutineVariableNames(template: string | null | undefined): string[] {
+  if (!template) return [];
+  const found = new Set<string>();
+  for (const match of template.matchAll(ROUTINE_VARIABLE_MATCHER)) {
+    const name = match[1];
+    if (name && !found.has(name)) {
+      found.add(name);
+    }
+  }
+  return [...found];
+}
+
+function defaultRoutineVariable(name: string): RoutineVariable {
+  return {
+    name,
+    label: null,
+    type: "text",
+    defaultValue: null,
+    required: true,
+    options: [],
+  };
+}
+
+export function syncRoutineVariablesWithTemplate(
+  template: string | null | undefined,
+  existing: RoutineVariable[] | null | undefined,
+): RoutineVariable[] {
+  const names = extractRoutineVariableNames(template);
+  const existingByName = new Map((existing ?? []).map((variable) => [variable.name, variable]));
+  return names.map((name) => existingByName.get(name) ?? defaultRoutineVariable(name));
+}
+
+export function stringifyRoutineVariableValue(value: unknown): string {
+  if (typeof value === "string") return value;
+  if (typeof value === "number" || typeof value === "boolean") return String(value);
+  if (value == null) return "";
+  try {
+    return JSON.stringify(value);
+  } catch {
+    return String(value);
+  }
+}
+
+export function interpolateRoutineTemplate(
+  template: string | null | undefined,
+  values: Record<string, unknown> | null | undefined,
+): string | null {
+  if (template == null) return null;
+  if (!values || Object.keys(values).length === 0) return template;
+  return template.replace(ROUTINE_VARIABLE_MATCHER, (match, rawName: string) => {
+    if (!(rawName in values)) return match;
+    return stringifyRoutineVariableValue(values[rawName]);
+  });
+}
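The routine-variable behavior defined above can be demonstrated in isolation. The sketch below inlines the same `{{name}}` matcher and a simplified interpolate (plain `String()` in place of `stringifyRoutineVariableValue`, and no `RoutineVariable` type import) to show first-appearance extraction and that unknown placeholders are left verbatim:

```typescript
// Self-contained demo of the routine-variable placeholder behavior.
const MATCHER = /\{\{\s*([A-Za-z][A-Za-z0-9_]*)\s*\}\}/g;

function extractNames(template: string | null | undefined): string[] {
  if (!template) return [];
  const found = new Set<string>();
  for (const m of template.matchAll(MATCHER)) {
    if (m[1]) found.add(m[1]); // Set preserves first-appearance order
  }
  return [...found];
}

function interpolate(template: string, values: Record<string, unknown>): string {
  // Placeholders without a supplied value stay verbatim instead of becoming "".
  return template.replace(MATCHER, (match, name: string) =>
    name in values ? String(values[name]) : match,
  );
}

const names = extractNames("Review {{repo}} and {{priority}} for {{repo}}");
const rendered = interpolate("Review {{repo}} for {{priority}}", { repo: "paperclip", priority: "high" });
const partial = interpolate("Ship {{repo}} by {{deadline}}", { repo: "paperclip" });
```

Leaving unmatched placeholders intact (rather than substituting an empty string) keeps a partially-filled template visibly incomplete, which is what the test file above asserts indirectly via `syncRoutineVariablesWithTemplate`.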
packages/shared/src/telemetry/client.ts (new file, +104)
@@ -0,0 +1,104 @@
+import { createHash } from "node:crypto";
+import type {
+  TelemetryConfig,
+  TelemetryEvent,
+  TelemetryEventName,
+  TelemetryState,
+} from "./types.js";
+
+const DEFAULT_ENDPOINT = "https://telemetry.paperclip.ing/ingest";
+const BATCH_SIZE = 50;
+const SEND_TIMEOUT_MS = 5_000;
+
+export class TelemetryClient {
+  private queue: TelemetryEvent[] = [];
+  private readonly config: TelemetryConfig;
+  private readonly stateFactory: () => TelemetryState;
+  private readonly version: string;
+  private state: TelemetryState | null = null;
+  private flushInterval: ReturnType<typeof setInterval> | null = null;
+
+  constructor(config: TelemetryConfig, stateFactory: () => TelemetryState, version: string) {
+    this.config = config;
+    this.stateFactory = stateFactory;
+    this.version = version;
+  }
+
+  track(eventName: TelemetryEventName, dimensions?: Record<string, string | number | boolean>): void {
+    if (!this.config.enabled) return;
+    this.getState(); // ensure state is initialised (side-effect: creates state file on first call)
+
+    this.queue.push({
+      name: eventName,
+      occurredAt: new Date().toISOString(),
+      dimensions: dimensions ?? {},
+    });
+
+    if (this.queue.length >= BATCH_SIZE) {
+      void this.flush();
+    }
+  }
+
+  async flush(): Promise<void> {
+    if (!this.config.enabled || this.queue.length === 0) return;
+
+    const events = this.queue.splice(0);
+    const state = this.getState();
+    const endpoint = this.config.endpoint ?? DEFAULT_ENDPOINT;
+    const app = this.config.app ?? "paperclip";
+    const schemaVersion = this.config.schemaVersion ?? "1";
+
+    const controller = new AbortController();
+    const timer = setTimeout(() => controller.abort(), SEND_TIMEOUT_MS);
+    try {
+      await fetch(endpoint, {
+        method: "POST",
+        headers: { "Content-Type": "application/json" },
+        body: JSON.stringify({
+          app,
+          schemaVersion,
+          installId: state.installId,
+          events,
+        }),
+        signal: controller.signal,
+      });
+    } catch {
+      // Fire-and-forget: silent failure, no retries
+    } finally {
+      clearTimeout(timer);
+    }
+  }
+
+  startPeriodicFlush(intervalMs: number = 60_000): void {
+    if (this.flushInterval) return;
+    this.flushInterval = setInterval(() => {
+      void this.flush();
+    }, intervalMs);
+    // Allow the process to exit even if the interval is still active
+    if (typeof this.flushInterval === "object" && "unref" in this.flushInterval) {
+      this.flushInterval.unref();
+    }
+  }
+
+  stop(): void {
+    if (this.flushInterval) {
+      clearInterval(this.flushInterval);
+      this.flushInterval = null;
+    }
+  }
+
+  hashPrivateRef(value: string): string {
+    const state = this.getState();
+    return createHash("sha256")
+      .update(state.salt + value)
+      .digest("hex")
+      .slice(0, 16);
+  }
+
+  private getState(): TelemetryState {
+    if (!this.state) {
+      this.state = this.stateFactory();
+    }
+    return this.state;
+  }
+}
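`hashPrivateRef` above salts a private identifier with per-install state and truncates the SHA-256 digest to 16 hex characters, so the same ref hashes differently on different installs and the raw value never leaves the machine. A self-contained sketch of just that hashing step:

```typescript
import { createHash } from "node:crypto";

// Salted, truncated hashing as used by TelemetryClient.hashPrivateRef.
// The per-install salt makes hashes incomparable across installs; the
// 16-hex-char truncation keeps telemetry dimensions compact.
function hashPrivateRef(salt: string, value: string): string {
  return createHash("sha256").update(salt + value).digest("hex").slice(0, 16);
}

const a = hashPrivateRef("salt-install-a", "acme/private-repo");
const b = hashPrivateRef("salt-install-b", "acme/private-repo");
const hashLength = a.length; // 16 hex characters
```

In the real client the salt comes from `TelemetryState` (created lazily by `getState()`), not from a function argument.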
25  packages/shared/src/telemetry/config.ts  Normal file
@@ -0,0 +1,25 @@
import type { TelemetryConfig } from "./types.js";

const CI_ENV_VARS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];

function isCI(): boolean {
  return CI_ENV_VARS.some((key) => process.env[key] === "true" || process.env[key] === "1");
}

export function resolveTelemetryConfig(fileConfig?: { enabled?: boolean }): TelemetryConfig {
  if (process.env.PAPERCLIP_TELEMETRY_DISABLED === "1") {
    return { enabled: false };
  }
  if (process.env.DO_NOT_TRACK === "1") {
    return { enabled: false };
  }
  if (isCI()) {
    return { enabled: false };
  }
  if (fileConfig?.enabled === false) {
    return { enabled: false };
  }

  const endpoint = process.env.PAPERCLIP_TELEMETRY_ENDPOINT || undefined;
  return { enabled: true, endpoint };
}
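The opt-out checks in `resolveTelemetryConfig` short-circuit in a fixed order: explicit env kill switch, then `DO_NOT_TRACK`, then CI detection, then the config file. A minimal standalone sketch of that precedence (re-implemented here for illustration with the environment passed in, so it is testable; it is not the exported function):

```typescript
// Illustrative re-implementation of the opt-out precedence used by
// resolveTelemetryConfig; `env` is injected instead of reading process.env.
type Env = Record<string, string | undefined>;

const CI_KEYS = ["CI", "CONTINUOUS_INTEGRATION", "BUILD_NUMBER", "GITHUB_ACTIONS", "GITLAB_CI"];

function telemetryEnabled(env: Env, fileEnabled?: boolean): boolean {
  if (env.PAPERCLIP_TELEMETRY_DISABLED === "1") return false; // hard kill switch
  if (env.DO_NOT_TRACK === "1") return false;                 // user-wide opt-out
  if (CI_KEYS.some((k) => env[k] === "true" || env[k] === "1")) return false; // never in CI
  if (fileEnabled === false) return false;                    // config-file opt-out
  return true;
}

console.log(telemetryEnabled({}));                         // true
console.log(telemetryEnabled({ DO_NOT_TRACK: "1" }));      // false
console.log(telemetryEnabled({ GITHUB_ACTIONS: "true" })); // false
```

Note that any single check disabling telemetry wins; there is no way for a later check to re-enable it.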
45  packages/shared/src/telemetry/events.ts  Normal file
@@ -0,0 +1,45 @@
import type { TelemetryClient } from "./client.js";

export function trackInstallStarted(client: TelemetryClient): void {
  client.track("install.started");
}

export function trackInstallCompleted(
  client: TelemetryClient,
  dims: { adapterType: string },
): void {
  client.track("install.completed", { adapter_type: dims.adapterType });
}

export function trackCompanyImported(
  client: TelemetryClient,
  dims: { sourceType: string; sourceRef: string; isPrivate: boolean },
): void {
  const ref = dims.isPrivate ? client.hashPrivateRef(dims.sourceRef) : dims.sourceRef;
  client.track("company.imported", {
    source_type: dims.sourceType,
    source_ref: ref,
    source_ref_hashed: dims.isPrivate,
  });
}

export function trackAgentFirstHeartbeat(
  client: TelemetryClient,
  dims: { agentRole: string },
): void {
  client.track("agent.first_heartbeat", { agent_role: dims.agentRole });
}

export function trackAgentTaskCompleted(
  client: TelemetryClient,
  dims: { agentRole: string },
): void {
  client.track("agent.task_completed", { agent_role: dims.agentRole });
}

export function trackErrorHandlerCrash(
  client: TelemetryClient,
  dims: { errorCode: string },
): void {
  client.track("error.handler_crash", { error_code: dims.errorCode });
}
18  packages/shared/src/telemetry/index.ts  Normal file
@@ -0,0 +1,18 @@
export { TelemetryClient } from "./client.js";
export { resolveTelemetryConfig } from "./config.js";
export { loadOrCreateState } from "./state.js";
export {
  trackInstallStarted,
  trackInstallCompleted,
  trackCompanyImported,
  trackAgentFirstHeartbeat,
  trackAgentTaskCompleted,
  trackErrorHandlerCrash,
} from "./events.js";
export type {
  TelemetryConfig,
  TelemetryState,
  TelemetryEvent,
  TelemetryEventEnvelope,
  TelemetryEventName,
} from "./types.js";
31  packages/shared/src/telemetry/state.ts  Normal file
@@ -0,0 +1,31 @@
import { randomUUID, randomBytes } from "node:crypto";
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import path from "node:path";
import type { TelemetryState } from "./types.js";

export function loadOrCreateState(stateDir: string, version: string): TelemetryState {
  const filePath = path.join(stateDir, "state.json");

  if (existsSync(filePath)) {
    try {
      const raw = readFileSync(filePath, "utf-8");
      const parsed = JSON.parse(raw) as TelemetryState;
      if (parsed.installId && parsed.salt) {
        return parsed;
      }
    } catch {
      // Corrupted state file — recreate
    }
  }

  const state: TelemetryState = {
    installId: randomUUID(),
    salt: randomBytes(32).toString("hex"),
    createdAt: new Date().toISOString(),
    firstSeenVersion: version,
  };

  mkdirSync(stateDir, { recursive: true });
  writeFileSync(filePath, JSON.stringify(state, null, 2) + "\n", "utf-8");
  return state;
}
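`hashPrivateRef` combines the per-install salt created by `loadOrCreateState` with the value before hashing, so the same private ref yields a stable token within one install but unlinkable tokens across installs. A standalone sketch of the same scheme (a hypothetical helper, not the client method itself):

```typescript
import { createHash, randomBytes } from "node:crypto";

// Same scheme as TelemetryClient.hashPrivateRef: salted SHA-256,
// truncated to 16 hex chars (64 bits), keyed by a per-install salt.
function hashPrivateRef(salt: string, value: string): string {
  return createHash("sha256").update(salt + value).digest("hex").slice(0, 16);
}

const saltA = randomBytes(32).toString("hex");
const saltB = randomBytes(32).toString("hex");

// Stable within one install...
console.log(hashPrivateRef(saltA, "git@host:org/repo.git") === hashPrivateRef(saltA, "git@host:org/repo.git")); // true
// ...but not linkable across installs (equal only with negligible probability).
console.log(hashPrivateRef(saltA, "git@host:org/repo.git") === hashPrivateRef(saltB, "git@host:org/repo.git"));
```

The 16-hex-char truncation keeps the dimension compact while leaving 64 bits, which is ample for distinguishing refs without shipping the raw value.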
37  packages/shared/src/telemetry/types.ts  Normal file
@@ -0,0 +1,37 @@
export interface TelemetryState {
  installId: string;
  salt: string;
  createdAt: string;
  firstSeenVersion: string;
}

export interface TelemetryConfig {
  enabled: boolean;
  endpoint?: string;
  app?: string;
  schemaVersion?: string;
}

/** Per-event object inside the backend envelope */
export interface TelemetryEvent {
  name: string;
  occurredAt: string;
  dimensions: Record<string, string | number | boolean>;
}

/** Full payload sent to the backend ingest endpoint */
export interface TelemetryEventEnvelope {
  app: string;
  schemaVersion: string;
  installId: string;
  events: TelemetryEvent[];
}

export type TelemetryEventName =
  | "install.started"
  | "install.completed"
  | "company.imported"
  | "agent.first_heartbeat"
  | "agent.task_completed"
  | "error.handler_crash"
  | `plugin.${string}`;
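`TelemetryEventName` mixes fixed literals with a `plugin.${string}` template-literal member, so plugin-defined events type-check without the shared package enumerating them. A small sketch of what that union admits (types re-declared locally for illustration; the runtime guard is hypothetical and not part of the package):

```typescript
// Local copy of the union for illustration.
type TelemetryEventName =
  | "install.started"
  | "install.completed"
  | "company.imported"
  | "agent.first_heartbeat"
  | "agent.task_completed"
  | "error.handler_crash"
  | `plugin.${string}`;

// Hypothetical runtime counterpart of the type: fixed names plus the
// open-ended "plugin." namespace.
function isKnownEventName(name: string): name is TelemetryEventName {
  const fixed = new Set<string>([
    "install.started",
    "install.completed",
    "company.imported",
    "agent.first_heartbeat",
    "agent.task_completed",
    "error.handler_crash",
  ]);
  return fixed.has(name) || name.startsWith("plugin.");
}

console.log(isKnownEventName("plugin.github-sync")); // true
console.log(isKnownEventName("unknown.event"));      // false
```

The trade-off of the template-literal arm is that `plugin.` events are unvalidated beyond the prefix, so plugins own their own naming discipline.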
@@ -31,6 +31,10 @@ export interface CompanyPortabilityCompanyManifestEntry {
   brandColor: string | null;
   logoPath: string | null;
   requireBoardApprovalForNewAgents: boolean;
+  feedbackDataSharingEnabled: boolean;
+  feedbackDataSharingConsentAt: string | null;
+  feedbackDataSharingConsentByUserId: string | null;
+  feedbackDataSharingTermsVersion: string | null;
 }

 export interface CompanyPortabilitySidebarOrder {
@@ -53,6 +57,8 @@ export interface CompanyPortabilityProjectManifestEntry {
   metadata: Record<string, unknown> | null;
 }

+import type { RoutineVariable } from "./routine.js";
+
 export interface CompanyPortabilityProjectWorkspaceManifestEntry {
   key: string;
   name: string;
@@ -80,6 +86,7 @@ export interface CompanyPortabilityIssueRoutineTriggerManifestEntry {
 export interface CompanyPortabilityIssueRoutineManifestEntry {
   concurrencyPolicy: string | null;
   catchUpPolicy: string | null;
+  variables?: RoutineVariable[] | null;
   triggers: CompanyPortabilityIssueRoutineTriggerManifestEntry[];
 }
@@ -12,6 +12,10 @@ export interface Company {
   budgetMonthlyCents: number;
   spentMonthlyCents: number;
   requireBoardApprovalForNewAgents: boolean;
+  feedbackDataSharingEnabled: boolean;
+  feedbackDataSharingConsentAt: Date | null;
+  feedbackDataSharingConsentByUserId: string | null;
+  feedbackDataSharingTermsVersion: string | null;
   brandColor: string | null;
   logoAssetId: string | null;
   logoUrl: string | null;
120  packages/shared/src/types/feedback.ts  Normal file
@@ -0,0 +1,120 @@
export const FEEDBACK_TARGET_TYPES = ["issue_comment", "issue_document_revision"] as const;
export type FeedbackTargetType = (typeof FEEDBACK_TARGET_TYPES)[number];

export const FEEDBACK_VOTE_VALUES = ["up", "down"] as const;
export type FeedbackVoteValue = (typeof FEEDBACK_VOTE_VALUES)[number];

export const FEEDBACK_DATA_SHARING_PREFERENCES = ["allowed", "not_allowed", "prompt"] as const;
export type FeedbackDataSharingPreference = (typeof FEEDBACK_DATA_SHARING_PREFERENCES)[number];

export const DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE: FeedbackDataSharingPreference = "prompt";

export const FEEDBACK_TRACE_STATUSES = ["local_only", "pending", "sent", "failed"] as const;
export type FeedbackTraceStatus = (typeof FEEDBACK_TRACE_STATUSES)[number];

export const DEFAULT_FEEDBACK_DATA_SHARING_TERMS_VERSION = "feedback-data-sharing-v1";

export interface FeedbackVote {
  id: string;
  companyId: string;
  issueId: string;
  targetType: FeedbackTargetType;
  targetId: string;
  authorUserId: string;
  vote: FeedbackVoteValue;
  reason: string | null;
  sharedWithLabs: boolean;
  sharedAt: Date | null;
  consentVersion: string | null;
  redactionSummary: Record<string, unknown> | null;
  createdAt: Date;
  updatedAt: Date;
}

export interface FeedbackTraceTargetSummary {
  label: string;
  excerpt: string | null;
  authorAgentId: string | null;
  authorUserId: string | null;
  createdAt: Date | null;
  documentKey: string | null;
  documentTitle: string | null;
  revisionNumber: number | null;
}

export interface FeedbackTrace {
  id: string;
  companyId: string;
  feedbackVoteId: string;
  issueId: string;
  projectId: string | null;
  issueIdentifier: string | null;
  issueTitle: string;
  authorUserId: string;
  targetType: FeedbackTargetType;
  targetId: string;
  vote: FeedbackVoteValue;
  status: FeedbackTraceStatus;
  destination: string | null;
  exportId: string | null;
  consentVersion: string | null;
  schemaVersion: string;
  bundleVersion: string;
  payloadVersion: string;
  payloadDigest: string | null;
  payloadSnapshot: Record<string, unknown> | null;
  targetSummary: FeedbackTraceTargetSummary;
  redactionSummary: Record<string, unknown> | null;
  attemptCount: number;
  lastAttemptedAt: Date | null;
  exportedAt: Date | null;
  failureReason: string | null;
  createdAt: Date;
  updatedAt: Date;
}

export type FeedbackTraceBundleCaptureStatus = "full" | "partial" | "unavailable";

export interface FeedbackTraceBundleFile {
  path: string;
  contentType: string;
  encoding: "utf8";
  byteLength: number;
  sha256: string;
  source:
    | "paperclip_run"
    | "paperclip_run_events"
    | "paperclip_run_log"
    | "codex_session"
    | "claude_stream_json"
    | "claude_project_session"
    | "claude_project_artifact"
    | "claude_debug_log"
    | "claude_task_metadata"
    | "opencode_session"
    | "opencode_session_diff"
    | "opencode_message"
    | "opencode_message_part"
    | "opencode_project"
    | "opencode_todo";
  contents: string;
}

export interface FeedbackTraceBundle {
  traceId: string;
  exportId: string | null;
  companyId: string;
  issueId: string;
  issueIdentifier: string | null;
  adapterType: string | null;
  captureStatus: FeedbackTraceBundleCaptureStatus;
  notes: string[];
  envelope: Record<string, unknown>;
  surface: Record<string, unknown> | null;
  paperclipRun: Record<string, unknown> | null;
  rawAdapterTrace: Record<string, unknown> | null;
  normalizedAdapterTrace: Record<string, unknown> | null;
  privacy: Record<string, unknown> | null;
  integrity: Record<string, unknown>;
  files: FeedbackTraceBundleFile[];
}
@@ -42,6 +42,18 @@ export interface HeartbeatRun {
   updatedAt: Date;
 }

+export interface AgentWakeupSkipped {
+  status: "skipped";
+  reason: string;
+  message: string | null;
+  issueId: string | null;
+  executionRunId: string | null;
+  executionAgentId: string | null;
+  executionAgentName: string | null;
+}
+
+export type AgentWakeupResponse = HeartbeatRun | AgentWakeupSkipped;
+
 export interface HeartbeatRunEvent {
   id: number;
   companyId: string;
@@ -1,4 +1,16 @@
|
|||||||
export type { Company } from "./company.js";
|
export type { Company } from "./company.js";
|
||||||
|
export type {
|
||||||
|
FeedbackVote,
|
||||||
|
FeedbackDataSharingPreference,
|
||||||
|
FeedbackTargetType,
|
||||||
|
FeedbackVoteValue,
|
||||||
|
FeedbackTrace,
|
||||||
|
FeedbackTraceStatus,
|
||||||
|
FeedbackTraceTargetSummary,
|
||||||
|
FeedbackTraceBundleCaptureStatus,
|
||||||
|
FeedbackTraceBundleFile,
|
||||||
|
FeedbackTraceBundle,
|
||||||
|
} from "./feedback.js";
|
||||||
export type { InstanceExperimentalSettings, InstanceGeneralSettings, InstanceSettings } from "./instance.js";
|
export type { InstanceExperimentalSettings, InstanceGeneralSettings, InstanceSettings } from "./instance.js";
|
||||||
export type {
|
export type {
|
||||||
CompanySkillSourceType,
|
CompanySkillSourceType,
|
||||||
@@ -118,6 +130,8 @@ export type {
|
|||||||
} from "./secrets.js";
|
} from "./secrets.js";
|
||||||
export type {
|
export type {
|
||||||
Routine,
|
Routine,
|
||||||
|
RoutineVariable,
|
||||||
|
RoutineVariableDefaultValue,
|
||||||
RoutineTrigger,
|
RoutineTrigger,
|
||||||
RoutineRun,
|
RoutineRun,
|
||||||
RoutineTriggerSecretMaterial,
|
RoutineTriggerSecretMaterial,
|
||||||
@@ -129,6 +143,8 @@ export type {
|
|||||||
export type { CostEvent, CostSummary, CostByAgent, CostByProviderModel, CostByBiller, CostByAgentModel, CostWindowSpendRow, CostByProject } from "./cost.js";
|
export type { CostEvent, CostSummary, CostByAgent, CostByProviderModel, CostByBiller, CostByAgentModel, CostWindowSpendRow, CostByProject } from "./cost.js";
|
||||||
export type { FinanceEvent, FinanceSummary, FinanceByBiller, FinanceByKind } from "./finance.js";
|
export type { FinanceEvent, FinanceSummary, FinanceByBiller, FinanceByKind } from "./finance.js";
|
||||||
export type {
|
export type {
|
||||||
|
AgentWakeupResponse,
|
||||||
|
AgentWakeupSkipped,
|
||||||
HeartbeatRun,
|
HeartbeatRun,
|
||||||
HeartbeatRunEvent,
|
HeartbeatRunEvent,
|
||||||
AgentRuntimeState,
|
AgentRuntimeState,
|
||||||
|
|||||||
@@ -1,5 +1,9 @@
|
|||||||
|
import type { FeedbackDataSharingPreference } from "./feedback.js";
|
||||||
|
|
||||||
export interface InstanceGeneralSettings {
|
export interface InstanceGeneralSettings {
|
||||||
censorUsernameInLogs: boolean;
|
censorUsernameInLogs: boolean;
|
||||||
|
keyboardShortcuts: boolean;
|
||||||
|
feedbackDataSharingPreference: FeedbackDataSharingPreference;
|
||||||
}
|
}
|
||||||
|
|
||||||
export interface InstanceExperimentalSettings {
|
export interface InstanceExperimentalSettings {
|
||||||
|
|||||||
@@ -1,4 +1,4 @@
|
|||||||
import type { IssueOriginKind } from "../constants.js";
|
import type { IssueOriginKind, RoutineVariableType } from "../constants.js";
|
||||||
|
|
||||||
export interface RoutineProjectSummary {
|
export interface RoutineProjectSummary {
|
||||||
id: string;
|
id: string;
|
||||||
@@ -25,6 +25,17 @@ export interface RoutineIssueSummary {
|
|||||||
updatedAt: Date;
|
updatedAt: Date;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
export type RoutineVariableDefaultValue = string | number | boolean | null;
|
||||||
|
|
||||||
|
export interface RoutineVariable {
|
||||||
|
name: string;
|
||||||
|
label: string | null;
|
||||||
|
type: RoutineVariableType;
|
||||||
|
defaultValue: RoutineVariableDefaultValue;
|
||||||
|
required: boolean;
|
||||||
|
options: string[];
|
||||||
|
}
|
||||||
|
|
||||||
export interface Routine {
|
export interface Routine {
|
||||||
id: string;
|
id: string;
|
||||||
companyId: string;
|
companyId: string;
|
||||||
@@ -38,6 +49,7 @@ export interface Routine {
|
|||||||
status: string;
|
status: string;
|
||||||
concurrencyPolicy: string;
|
concurrencyPolicy: string;
|
||||||
catchUpPolicy: string;
|
catchUpPolicy: string;
|
||||||
|
variables: RoutineVariable[];
|
||||||
createdByAgentId: string | null;
|
createdByAgentId: string | null;
|
||||||
createdByUserId: string | null;
|
createdByUserId: string | null;
|
||||||
updatedByAgentId: string | null;
|
updatedByAgentId: string | null;
|
||||||
|
|||||||
@@ -1,4 +1,5 @@
|
|||||||
import { z } from "zod";
|
import { z } from "zod";
|
||||||
|
import { routineVariableSchema } from "./routine.js";
|
||||||
|
|
||||||
export const portabilityIncludeSchema = z
|
export const portabilityIncludeSchema = z
|
||||||
.object({
|
.object({
|
||||||
@@ -36,6 +37,10 @@ export const portabilityCompanyManifestEntrySchema = z.object({
|
|||||||
brandColor: z.string().nullable(),
|
brandColor: z.string().nullable(),
|
||||||
logoPath: z.string().nullable(),
|
logoPath: z.string().nullable(),
|
||||||
requireBoardApprovalForNewAgents: z.boolean(),
|
requireBoardApprovalForNewAgents: z.boolean(),
|
||||||
|
feedbackDataSharingEnabled: z.boolean().default(false),
|
||||||
|
feedbackDataSharingConsentAt: z.string().datetime().nullable().default(null),
|
||||||
|
feedbackDataSharingConsentByUserId: z.string().nullable().default(null),
|
||||||
|
feedbackDataSharingTermsVersion: z.string().nullable().default(null),
|
||||||
});
|
});
|
||||||
|
|
||||||
export const portabilitySidebarOrderSchema = z.object({
|
export const portabilitySidebarOrderSchema = z.object({
|
||||||
@@ -119,6 +124,7 @@ export const portabilityIssueRoutineTriggerManifestEntrySchema = z.object({
|
|||||||
export const portabilityIssueRoutineManifestEntrySchema = z.object({
|
export const portabilityIssueRoutineManifestEntrySchema = z.object({
|
||||||
concurrencyPolicy: z.string().nullable(),
|
concurrencyPolicy: z.string().nullable(),
|
||||||
catchUpPolicy: z.string().nullable(),
|
catchUpPolicy: z.string().nullable(),
|
||||||
|
variables: z.array(routineVariableSchema).nullable().optional(),
|
||||||
triggers: z.array(portabilityIssueRoutineTriggerManifestEntrySchema).default([]),
|
triggers: z.array(portabilityIssueRoutineTriggerManifestEntrySchema).default([]),
|
||||||
});
|
});
|
||||||
|
|
||||||
|
|||||||
@@ -3,6 +3,7 @@ import { COMPANY_STATUSES } from "../constants.js";

 const logoAssetIdSchema = z.string().uuid().nullable().optional();
 const brandColorSchema = z.string().regex(/^#[0-9a-fA-F]{6}$/).nullable().optional();
+const feedbackDataSharingTermsVersionSchema = z.string().min(1).nullable().optional();

 export const createCompanySchema = z.object({
   name: z.string().min(1),
@@ -18,6 +19,10 @@ export const updateCompanySchema = createCompanySchema
   status: z.enum(COMPANY_STATUSES).optional(),
   spentMonthlyCents: z.number().int().nonnegative().optional(),
   requireBoardApprovalForNewAgents: z.boolean().optional(),
+  feedbackDataSharingEnabled: z.boolean().optional(),
+  feedbackDataSharingConsentAt: z.coerce.date().nullable().optional(),
+  feedbackDataSharingConsentByUserId: z.string().min(1).nullable().optional(),
+  feedbackDataSharingTermsVersion: feedbackDataSharingTermsVersionSchema,
   brandColor: brandColorSchema,
   logoAssetId: logoAssetIdSchema,
 });
22  packages/shared/src/validators/feedback.ts  Normal file
@@ -0,0 +1,22 @@
import { z } from "zod";
import {
  FEEDBACK_DATA_SHARING_PREFERENCES,
  FEEDBACK_TARGET_TYPES,
  FEEDBACK_TRACE_STATUSES,
  FEEDBACK_VOTE_VALUES,
} from "../types/feedback.js";

export const feedbackTargetTypeSchema = z.enum(FEEDBACK_TARGET_TYPES);
export const feedbackTraceStatusSchema = z.enum(FEEDBACK_TRACE_STATUSES);
export const feedbackVoteValueSchema = z.enum(FEEDBACK_VOTE_VALUES);
export const feedbackDataSharingPreferenceSchema = z.enum(FEEDBACK_DATA_SHARING_PREFERENCES);

export const upsertIssueFeedbackVoteSchema = z.object({
  targetType: feedbackTargetTypeSchema,
  targetId: z.string().uuid(),
  vote: feedbackVoteValueSchema,
  reason: z.string().trim().max(1000).optional(),
  allowSharing: z.boolean().optional(),
});

export type UpsertIssueFeedbackVote = z.infer<typeof upsertIssueFeedbackVoteSchema>;
@@ -24,6 +24,14 @@ export {
   type UpdateCompany,
   type UpdateCompanyBranding,
 } from "./company.js";
+export {
+  feedbackDataSharingPreferenceSchema,
+  feedbackTargetTypeSchema,
+  feedbackTraceStatusSchema,
+  feedbackVoteValueSchema,
+  upsertIssueFeedbackVoteSchema,
+  type UpsertIssueFeedbackVote,
+} from "./feedback.js";
 export {
   companySkillSourceTypeSchema,
   companySkillTrustLevelSchema,
@@ -206,6 +214,7 @@ export {
   updateRoutineSchema,
   createRoutineTriggerSchema,
   updateRoutineTriggerSchema,
+  routineVariableSchema,
   runRoutineSchema,
   rotateRoutineTriggerSecretSchema,
   type CreateRoutine,
@@ -1,7 +1,13 @@
|
|||||||
import { z } from "zod";
|
import { z } from "zod";
|
||||||
|
import { DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE } from "../types/feedback.js";
|
||||||
|
import { feedbackDataSharingPreferenceSchema } from "./feedback.js";
|
||||||
|
|
||||||
export const instanceGeneralSettingsSchema = z.object({
|
export const instanceGeneralSettingsSchema = z.object({
|
||||||
censorUsernameInLogs: z.boolean().default(false),
|
censorUsernameInLogs: z.boolean().default(false),
|
||||||
|
keyboardShortcuts: z.boolean().default(false),
|
||||||
|
feedbackDataSharingPreference: feedbackDataSharingPreferenceSchema.default(
|
||||||
|
DEFAULT_FEEDBACK_DATA_SHARING_PREFERENCE,
|
||||||
|
),
|
||||||
}).strict();
|
}).strict();
|
||||||
|
|
||||||
export const patchInstanceGeneralSettingsSchema = instanceGeneralSettingsSchema.partial();
|
export const patchInstanceGeneralSettingsSchema = instanceGeneralSettingsSchema.partial();
|
||||||
|
|||||||
@@ -1,6 +1,15 @@
|
|||||||
import { z } from "zod";
|
import { z } from "zod";
|
||||||
import { ISSUE_PRIORITIES, ISSUE_STATUSES } from "../constants.js";
|
import { ISSUE_PRIORITIES, ISSUE_STATUSES } from "../constants.js";
|
||||||
|
|
||||||
|
export const ISSUE_EXECUTION_WORKSPACE_PREFERENCES = [
|
||||||
|
"inherit",
|
||||||
|
"shared_workspace",
|
||||||
|
"isolated_workspace",
|
||||||
|
"operator_branch",
|
||||||
|
"reuse_existing",
|
||||||
|
"agent_default",
|
||||||
|
] as const;
|
||||||
|
|
||||||
const executionWorkspaceStrategySchema = z
|
const executionWorkspaceStrategySchema = z
|
||||||
.object({
|
.object({
|
||||||
type: z.enum(["project_primary", "git_worktree", "adapter_managed", "cloud_sandbox"]).optional(),
|
type: z.enum(["project_primary", "git_worktree", "adapter_managed", "cloud_sandbox"]).optional(),
|
||||||
@@ -14,7 +23,7 @@ const executionWorkspaceStrategySchema = z
|
|||||||
|
|
||||||
export const issueExecutionWorkspaceSettingsSchema = z
|
export const issueExecutionWorkspaceSettingsSchema = z
|
||||||
.object({
|
.object({
|
||||||
mode: z.enum(["inherit", "shared_workspace", "isolated_workspace", "operator_branch", "reuse_existing", "agent_default"]).optional(),
|
mode: z.enum(ISSUE_EXECUTION_WORKSPACE_PREFERENCES).optional(),
|
||||||
workspaceStrategy: executionWorkspaceStrategySchema.optional().nullable(),
|
workspaceStrategy: executionWorkspaceStrategySchema.optional().nullable(),
|
||||||
workspaceRuntime: z.record(z.unknown()).optional().nullable(),
|
workspaceRuntime: z.record(z.unknown()).optional().nullable(),
|
||||||
})
|
})
|
||||||
@@ -43,14 +52,7 @@ export const createIssueSchema = z.object({
|
|||||||
billingCode: z.string().optional().nullable(),
|
billingCode: z.string().optional().nullable(),
|
||||||
assigneeAdapterOverrides: issueAssigneeAdapterOverridesSchema.optional().nullable(),
|
assigneeAdapterOverrides: issueAssigneeAdapterOverridesSchema.optional().nullable(),
|
||||||
executionWorkspaceId: z.string().uuid().optional().nullable(),
|
executionWorkspaceId: z.string().uuid().optional().nullable(),
|
||||||
executionWorkspacePreference: z.enum([
|
executionWorkspacePreference: z.enum(ISSUE_EXECUTION_WORKSPACE_PREFERENCES).optional().nullable(),
|
||||||
"inherit",
|
|
||||||
"shared_workspace",
|
|
||||||
"isolated_workspace",
|
|
||||||
"operator_branch",
|
|
||||||
"reuse_existing",
|
|
||||||
"agent_default",
|
|
||||||
]).optional().nullable(),
|
|
||||||
executionWorkspaceSettings: issueExecutionWorkspaceSettingsSchema.optional().nullable(),
|
executionWorkspaceSettings: issueExecutionWorkspaceSettingsSchema.optional().nullable(),
|
||||||
labelIds: z.array(z.string().uuid()).optional(),
|
labelIds: z.array(z.string().uuid()).optional(),
|
||||||
});
|
});
|
||||||
|
|||||||
@@ -5,7 +5,47 @@ import {
   ROUTINE_CONCURRENCY_POLICIES,
   ROUTINE_STATUSES,
   ROUTINE_TRIGGER_SIGNING_MODES,
+  ROUTINE_VARIABLE_TYPES,
 } from "../constants.js";
+import {
+  ISSUE_EXECUTION_WORKSPACE_PREFERENCES,
+  issueExecutionWorkspaceSettingsSchema,
+} from "./issue.js";
+
+const routineVariableValueSchema = z.union([z.string(), z.number().finite(), z.boolean()]);
+
+export const routineVariableSchema = z.object({
+  name: z.string().trim().regex(/^[A-Za-z][A-Za-z0-9_]*$/),
+  label: z.string().trim().max(120).optional().nullable(),
+  type: z.enum(ROUTINE_VARIABLE_TYPES).optional().default("text"),
+  defaultValue: routineVariableValueSchema.optional().nullable(),
+  required: z.boolean().optional().default(true),
+  options: z.array(z.string().trim().min(1).max(120)).max(50).optional().default([]),
+}).superRefine((value, ctx) => {
+  if (value.type === "select" && value.options.length === 0) {
+    ctx.addIssue({
+      code: z.ZodIssueCode.custom,
+      path: ["options"],
+      message: "Select variables require at least one option",
+    });
+  }
+  if (value.type !== "select" && value.options.length > 0) {
+    ctx.addIssue({
+      code: z.ZodIssueCode.custom,
+      path: ["options"],
+      message: "Only select variables can define options",
+    });
+  }
+  if (value.type === "select" && value.defaultValue != null) {
+    if (typeof value.defaultValue !== "string" || !value.options.includes(value.defaultValue)) {
+      ctx.addIssue({
+        code: z.ZodIssueCode.custom,
+        path: ["defaultValue"],
+        message: "Select variable defaults must match one of the allowed options",
+      });
+    }
+  }
+});
+
 export const createRoutineSchema = z.object({
   projectId: z.string().uuid(),
@@ -18,6 +58,7 @@ export const createRoutineSchema = z.object({
   status: z.enum(ROUTINE_STATUSES).optional().default("active"),
   concurrencyPolicy: z.enum(ROUTINE_CONCURRENCY_POLICIES).optional().default("coalesce_if_active"),
   catchUpPolicy: z.enum(ROUTINE_CATCH_UP_POLICIES).optional().default("skip_missed"),
+  variables: z.array(routineVariableSchema).optional().default([]),
 });

 export type CreateRoutine = z.infer<typeof createRoutineSchema>;
@@ -62,8 +103,12 @@ export type UpdateRoutineTrigger = z.infer<typeof updateRoutineTriggerSchema>;
 export const runRoutineSchema = z.object({
   triggerId: z.string().uuid().optional().nullable(),
   payload: z.record(z.unknown()).optional().nullable(),
+  variables: z.record(routineVariableValueSchema).optional().nullable(),
   idempotencyKey: z.string().trim().max(255).optional().nullable(),
   source: z.enum(["manual", "api"]).optional().default("manual"),
+  executionWorkspaceId: z.string().uuid().optional().nullable(),
+  executionWorkspacePreference: z.enum(ISSUE_EXECUTION_WORKSPACE_PREFERENCES).optional().nullable(),
+  executionWorkspaceSettings: issueExecutionWorkspaceSettingsSchema.optional().nullable(),
 });

 export type RunRoutine = z.infer<typeof runRoutineSchema>;
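The `superRefine` on `routineVariableSchema` enforces three invariants: select variables need at least one option, only select variables may define options, and a select default must be one of its options. A plain-TypeScript sketch of the same checks (a hypothetical helper for illustration; the full `ROUTINE_VARIABLE_TYPES` union is assumed here, and the real validation is the zod schema above):

```typescript
// Assumed shape; only "text" and "select" are confirmed by the diff,
// the other members of ROUTINE_VARIABLE_TYPES are guesses.
interface VariableDraft {
  type: "text" | "number" | "boolean" | "select";
  defaultValue?: string | number | boolean | null;
  options: string[];
}

// Mirrors the three superRefine branches, returning messages instead of
// calling ctx.addIssue.
function variableErrors(v: VariableDraft): string[] {
  const errors: string[] = [];
  if (v.type === "select" && v.options.length === 0) {
    errors.push("Select variables require at least one option");
  }
  if (v.type !== "select" && v.options.length > 0) {
    errors.push("Only select variables can define options");
  }
  if (v.type === "select" && v.defaultValue != null) {
    if (typeof v.defaultValue !== "string" || !v.options.includes(v.defaultValue)) {
      errors.push("Select variable defaults must match one of the allowed options");
    }
  }
  return errors;
}

console.log(variableErrors({ type: "select", options: ["dev", "prod"], defaultValue: "dev" })); // []
console.log(variableErrors({ type: "select", options: [] })); // ["Select variables require at least one option"]
```

Keeping these rules in one `superRefine` means a single parse rejects an inconsistent variable definition before it is persisted or exported through the portability manifest.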
@@ -31,20 +31,41 @@ source_env_path="$(dirname "$source_config_path")/.env"

 mkdir -p "$paperclip_dir"

-run_isolated_worktree_init() {
+run_paperclipai_command() {
+  local command_args=("$@")
   if command -v pnpm >/dev/null 2>&1 && pnpm paperclipai --help >/dev/null 2>&1; then
-    pnpm paperclipai worktree init --force --seed-mode minimal --name "$worktree_name" --from-config "$source_config_path"
+    pnpm paperclipai "${command_args[@]}"
+    return 0
+  fi
+
+  local base_cli_tsx_path="$base_cwd/cli/node_modules/tsx/dist/cli.mjs"
+  local base_cli_entry_path="$base_cwd/cli/src/index.ts"
+  if command -v node >/dev/null 2>&1 && [[ -f "$base_cli_tsx_path" ]] && [[ -f "$base_cli_entry_path" ]]; then
+    node "$base_cli_tsx_path" "$base_cli_entry_path" "${command_args[@]}"
     return 0
   fi

   if command -v paperclipai >/dev/null 2>&1; then
-    paperclipai worktree init --force --seed-mode minimal --name "$worktree_name" --from-config "$source_config_path"
+    paperclipai "${command_args[@]}"
     return 0
   fi

   return 1
 }

+run_isolated_worktree_init() {
+  run_paperclipai_command \
+    worktree \
+    init \
+    --force \
+    --seed-mode \
+    minimal \
+    --name \
+    "$worktree_name" \
+    --from-config \
+    "$source_config_path"
+}
+
 write_fallback_worktree_config() {
   WORKTREE_NAME="$worktree_name" \
   BASE_CWD="$base_cwd" \
@@ -300,6 +321,20 @@ if ! run_isolated_worktree_init; then
   write_fallback_worktree_config
 fi

+disable_seeded_routines() {
+  local company_id="${PAPERCLIP_COMPANY_ID:-}"
+  if [[ -z "$company_id" ]]; then
+    echo "PAPERCLIP_COMPANY_ID not set; skipping routine disable post-step." >&2
+    return 0
+  fi
+
+  if ! run_paperclipai_command routines disable-all --config "$worktree_config_path" --company-id "$company_id"; then
+    echo "paperclipai CLI not available in this workspace; skipping routine disable post-step." >&2
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
disable_seeded_routines
|
||||||
|
|
||||||
while IFS= read -r relative_path; do
|
while IFS= read -r relative_path; do
|
||||||
[[ -n "$relative_path" ]] || continue
|
[[ -n "$relative_path" ]] || continue
|
||||||
source_path="$base_cwd/$relative_path"
|
source_path="$base_cwd/$relative_path"
|
||||||
|
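The shell function in the hunk above tries three launchers in order (pnpm, a node+tsx entry point, a global binary) and stops at the first one that works. A minimal TypeScript sketch of that first-success fallback chain, with invented names (`Runner`, `runWithFallback` are illustrative, not part of the script):

```typescript
// A runner attempts to execute the command and reports whether it succeeded.
type Runner = (args: string[]) => boolean;

// Try each runner in order; the first success short-circuits the chain.
// Returning false mirrors the shell function's final `return 1`.
function runWithFallback(runners: Runner[], args: string[]): boolean {
  for (const runner of runners) {
    if (runner(args)) {
      return true;
    }
  }
  return false;
}
```

The real script adds one more wrinkle: the middle tier only runs when both the tsx binary and the CLI entry file exist on disk, so a partially installed workspace falls through to the global binary.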
@@ -123,11 +123,14 @@ function setVersion(version) {
     `.version("${version}")`,
   );
 
-  if (cliEntry === nextCliEntry) {
+  if (cliEntry !== nextCliEntry) {
-    throw new Error("failed to rewrite CLI version string in cli/src/index.ts");
+    writeFileSync(cliEntryPath, nextCliEntry);
+    return;
   }
 
-  writeFileSync(cliEntryPath, nextCliEntry);
+  if (!cliEntry.includes(".version(cliVersion)")) {
+    throw new Error("failed to rewrite CLI version string in cli/src/index.ts");
+  }
 }
 
 function listPackages() {
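The change above inverts the guard: instead of always writing and throwing on a no-op, the new version writes only when the replacement changed the file, and tolerates a no-op when the entry already uses a dynamic `.version(cliVersion)` call. A self-contained sketch of that guard as a pure function (`applyVersionRewrite` is an invented name; the real script operates on `cli/src/index.ts` via `writeFileSync`):

```typescript
// Rewrite a hard-coded `.version("x.y.z")` call to the target version.
// If nothing changed, the file must already use `.version(cliVersion)`;
// anything else means the rewrite silently failed, so throw.
function applyVersionRewrite(cliEntry: string, version: string): string {
  const nextCliEntry = cliEntry.replace(
    /\.version\("[^"]*"\)/,
    `.version("${version}")`,
  );
  if (cliEntry !== nextCliEntry) {
    return nextCliEntry; // rewrite succeeded; caller persists the result
  }
  if (!cliEntry.includes(".version(cliVersion)")) {
    throw new Error("failed to rewrite CLI version string in cli/src/index.ts");
  }
  return cliEntry; // dynamic version call already in place; nothing to do
}
```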
@@ -131,7 +131,7 @@ function makeAgent(adapterType: string) {
 
 describe("agent skill routes", () => {
   beforeEach(() => {
-    vi.clearAllMocks();
+    vi.resetAllMocks();
     mockAgentService.resolveByReference.mockResolvedValue({
       ambiguous: false,
       agent: makeAgent("claude_local"),
@@ -231,11 +231,31 @@ describe("agent skill routes", () => {
     );
   });
 
-  it("keeps runtime materialization for persistent skill adapters", async () => {
+  it("skips runtime materialization when listing Codex skills", async () => {
     mockAgentService.getById.mockResolvedValue(makeAgent("codex_local"));
     mockAdapter.listSkills.mockResolvedValue({
       adapterType: "codex_local",
       supported: true,
+      mode: "ephemeral",
+      desiredSkills: ["paperclipai/paperclip/paperclip"],
+      entries: [],
+      warnings: [],
+    });
+
+    const res = await request(createApp())
+      .get("/api/agents/11111111-1111-4111-8111-111111111111/skills?companyId=company-1");
+
+    expect(res.status, JSON.stringify(res.body)).toBe(200);
+    expect(mockCompanySkillService.listRuntimeSkillEntries).toHaveBeenCalledWith("company-1", {
+      materializeMissing: false,
+    });
+  });
+
+  it("keeps runtime materialization for persistent skill adapters", async () => {
+    mockAgentService.getById.mockResolvedValue(makeAgent("cursor"));
+    mockAdapter.listSkills.mockResolvedValue({
+      adapterType: "cursor",
+      supported: true,
       mode: "persistent",
       desiredSkills: ["paperclipai/paperclip/paperclip"],
       entries: [],
@@ -29,6 +29,12 @@ vi.mock("../services/index.js", () => ({
   agentService: () => ({
     getById: vi.fn(),
   }),
+  feedbackService: () => ({
+    listIssueVotesForUser: vi.fn(),
+    listFeedbackTraces: vi.fn(),
+    getFeedbackTraceById: vi.fn(),
+    saveIssueVote: vi.fn(),
+  }),
   logActivity: vi.fn(),
 }));
 
@@ -34,6 +34,12 @@ const mockCompanyPortabilityService = vi.hoisted(() => ({
 }));
 
 const mockLogActivity = vi.hoisted(() => vi.fn());
+const mockFeedbackService = vi.hoisted(() => ({
+  listIssueVotesForUser: vi.fn(),
+  listFeedbackTraces: vi.fn(),
+  getFeedbackTraceById: vi.fn(),
+  saveIssueVote: vi.fn(),
+}));
 
 vi.mock("../services/index.js", () => ({
   accessService: () => mockAccessService,
@@ -41,6 +47,7 @@ vi.mock("../services/index.js", () => ({
   budgetService: () => mockBudgetService,
   companyPortabilityService: () => mockCompanyPortabilityService,
   companyService: () => mockCompanyService,
+  feedbackService: () => mockFeedbackService,
   logActivity: mockLogActivity,
 }));
 
@@ -78,9 +85,7 @@ function createApp(actor: Record<string, unknown>) {
 
 describe("PATCH /api/companies/:companyId/branding", () => {
   beforeEach(() => {
-    mockCompanyService.update.mockReset();
+    vi.resetAllMocks();
-    mockAgentService.getById.mockReset();
-    mockLogActivity.mockReset();
   });
 
   it("rejects non-CEO agent callers", async () => {
@@ -32,6 +32,12 @@ const mockCompanyPortabilityService = vi.hoisted(() => ({
 }));
 
 const mockLogActivity = vi.hoisted(() => vi.fn());
+const mockFeedbackService = vi.hoisted(() => ({
+  listIssueVotesForUser: vi.fn(),
+  listFeedbackTraces: vi.fn(),
+  getFeedbackTraceById: vi.fn(),
+  saveIssueVote: vi.fn(),
+}));
 
 vi.mock("../services/index.js", () => ({
   accessService: () => mockAccessService,
@@ -39,6 +45,7 @@ vi.mock("../services/index.js", () => ({
   budgetService: () => mockBudgetService,
   companyPortabilityService: () => mockCompanyPortabilityService,
   companyService: () => mockCompanyService,
+  feedbackService: () => mockFeedbackService,
   logActivity: mockLogActivity,
 }));
 
@@ -375,6 +375,7 @@ describe("company portability", () => {
     expect(
       parseGitHubSourceUrl("https://github.com/paperclipai/companies?ref=feature%2Fdemo&path=gstack"),
     ).toEqual({
+      hostname: "github.com",
       owner: "paperclipai",
       repo: "companies",
       ref: "feature/demo",
@@ -389,6 +390,7 @@ describe("company portability", () => {
       "https://github.com/paperclipai/companies?ref=abc123&companyPath=gstack%2FCOMPANY.md",
     ),
   ).toEqual({
+    hostname: "github.com",
     owner: "paperclipai",
     repo: "companies",
     ref: "abc123",
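The tests above now expect `parseGitHubSourceUrl` to carry a `hostname` field through alongside owner, repo, and ref. An illustrative standalone sketch of such a parser (this is not the project's implementation; it ignores the `path`/`companyPath` query parameters the real function also handles):

```typescript
// Parse a GitHub-style source URL into its addressable parts.
// WHATWG URL decodes percent-encoded query values, so ref=feature%2Fdemo
// comes back as "feature/demo".
function parseGitHubSourceUrl(raw: string): {
  hostname: string;
  owner: string;
  repo: string;
  ref: string | null;
} {
  const url = new URL(raw);
  const [owner, repo] = url.pathname.replace(/^\//, "").split("/");
  return {
    hostname: url.hostname,
    owner,
    repo,
    ref: url.searchParams.get("ref"),
  };
}
```

Keeping the hostname in the parsed result lets callers distinguish github.com from self-hosted forges without re-parsing the URL.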
 1105  server/src/__tests__/feedback-service.test.ts  (new file)
 File diff suppressed because it is too large
@@ -1,7 +1,7 @@
 import { randomUUID } from "node:crypto";
 import { spawn, type ChildProcess } from "node:child_process";
 import { eq } from "drizzle-orm";
-import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
+import { afterAll, afterEach, beforeAll, describe, expect, it, vi } from "vitest";
 import {
   agents,
   agentWakeupRequests,
@@ -16,6 +16,23 @@ import {
   startEmbeddedPostgresTestDatabase,
 } from "./helpers/embedded-postgres.js";
 import { runningProcesses } from "../adapters/index.ts";
+const mockTelemetryClient = vi.hoisted(() => ({ track: vi.fn() }));
+const mockTrackAgentFirstHeartbeat = vi.hoisted(() => vi.fn());
+
+vi.mock("../telemetry.ts", () => ({
+  getTelemetryClient: () => mockTelemetryClient,
+}));
+
+vi.mock("@paperclipai/shared/telemetry", async () => {
+  const actual = await vi.importActual<typeof import("@paperclipai/shared/telemetry")>(
+    "@paperclipai/shared/telemetry",
+  );
+  return {
+    ...actual,
+    trackAgentFirstHeartbeat: mockTrackAgentFirstHeartbeat,
+  };
+});
+
 import { heartbeatService } from "../services/heartbeat.ts";
 const embeddedPostgresSupport = await getEmbeddedPostgresTestSupport();
 const describeEmbeddedPostgres = embeddedPostgresSupport.supported ? describe : describe.skip;
@@ -43,6 +60,7 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
   }, 20_000);
 
   afterEach(async () => {
+    vi.clearAllMocks();
     runningProcesses.clear();
     for (const child of childProcesses) {
       child.kill("SIGKILL");
@@ -67,6 +85,7 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
 
   async function seedRunFixture(input?: {
     adapterType?: string;
+    agentStatus?: "paused" | "idle" | "running";
     runStatus?: "running" | "queued" | "failed";
     processPid?: number | null;
     processLossRetryCount?: number;
@@ -94,7 +113,7 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
       companyId,
       name: "CodexCoder",
       role: "engineer",
-      status: "paused",
+      status: input?.agentStatus ?? "paused",
       adapterType: input?.adapterType ?? "codex_local",
       adapterConfig: {},
       runtimeConfig: {},
@@ -252,4 +271,18 @@ describeEmbeddedPostgres("heartbeat orphaned process recovery", () => {
     expect(run?.errorCode).toBeNull();
     expect(run?.error).toBeNull();
   });
+
+  it("tracks the first heartbeat with the agent role instead of adapter type", async () => {
+    const { runId } = await seedRunFixture({
+      agentStatus: "running",
+      includeIssue: false,
+    });
+    const heartbeat = heartbeatService(db);
+
+    await heartbeat.cancelRun(runId);
+
+    expect(mockTrackAgentFirstHeartbeat).toHaveBeenCalledWith(mockTelemetryClient, {
+      agentRole: "engineer",
+    });
+  });
 });
@@ -35,6 +35,8 @@ describe("instance settings routes", () => {
     vi.clearAllMocks();
     mockInstanceSettingsService.getGeneral.mockResolvedValue({
       censorUsernameInLogs: false,
+      keyboardShortcuts: false,
+      feedbackDataSharingPreference: "prompt",
     });
     mockInstanceSettingsService.getExperimental.mockResolvedValue({
       enableIsolatedWorkspaces: false,
@@ -44,6 +46,8 @@ describe("instance settings routes", () => {
       id: "instance-settings-1",
       general: {
         censorUsernameInLogs: true,
+        keyboardShortcuts: true,
+        feedbackDataSharingPreference: "allowed",
       },
     });
     mockInstanceSettingsService.updateExperimental.mockResolvedValue({
@@ -110,20 +114,30 @@ describe("instance settings routes", () => {
 
     const getRes = await request(app).get("/api/instance/settings/general");
     expect(getRes.status).toBe(200);
-    expect(getRes.body).toEqual({ censorUsernameInLogs: false });
+    expect(getRes.body).toEqual({
+      censorUsernameInLogs: false,
+      keyboardShortcuts: false,
+      feedbackDataSharingPreference: "prompt",
+    });
 
     const patchRes = await request(app)
       .patch("/api/instance/settings/general")
-      .send({ censorUsernameInLogs: true });
+      .send({
+        censorUsernameInLogs: true,
+        keyboardShortcuts: true,
+        feedbackDataSharingPreference: "allowed",
+      });
 
     expect(patchRes.status).toBe(200);
     expect(mockInstanceSettingsService.updateGeneral).toHaveBeenCalledWith({
       censorUsernameInLogs: true,
+      keyboardShortcuts: true,
+      feedbackDataSharingPreference: "allowed",
    });
     expect(mockLogActivity).toHaveBeenCalledTimes(2);
   });
 
-  it("rejects non-admin board users", async () => {
+  it("allows non-admin board users to read general settings", async () => {
     const app = createApp({
       type: "board",
       userId: "user-1",
@@ -134,8 +148,25 @@ describe("instance settings routes", () => {
 
     const res = await request(app).get("/api/instance/settings/general");
+
+    expect(res.status).toBe(200);
+    expect(mockInstanceSettingsService.getGeneral).toHaveBeenCalled();
+  });
+
+  it("rejects non-admin board users from updating general settings", async () => {
+    const app = createApp({
+      type: "board",
+      userId: "user-1",
+      source: "session",
+      isInstanceAdmin: false,
+      companyIds: ["company-1"],
+    });
+
+    const res = await request(app)
+      .patch("/api/instance/settings/general")
+      .send({ censorUsernameInLogs: true, keyboardShortcuts: true });
 
     expect(res.status).toBe(403);
-    expect(mockInstanceSettingsService.getGeneral).not.toHaveBeenCalled();
+    expect(mockInstanceSettingsService.updateGeneral).not.toHaveBeenCalled();
   });
 
   it("rejects agent callers", async () => {
@@ -148,7 +179,7 @@ describe("instance settings routes", () => {
 
     const res = await request(app)
       .patch("/api/instance/settings/general")
-      .send({ censorUsernameInLogs: true });
+      .send({ feedbackDataSharingPreference: "not_allowed" });
 
     expect(res.status).toBe(403);
     expect(mockInstanceSettingsService.updateGeneral).not.toHaveBeenCalled();
@@ -35,8 +35,22 @@ vi.mock("../services/index.js", () => ({
   agentService: () => mockAgentService,
   documentService: () => ({}),
   executionWorkspaceService: () => ({}),
+  feedbackService: () => ({
+    listIssueVotesForUser: vi.fn(async () => []),
+    saveIssueVote: vi.fn(async () => ({ vote: null, consentEnabledNow: false, sharingEnabled: false })),
+  }),
   goalService: () => ({}),
   heartbeatService: () => mockHeartbeatService,
+  instanceSettingsService: () => ({
+    get: vi.fn(async () => ({
+      id: "instance-settings-1",
+      general: {
+        censorUsernameInLogs: false,
+        feedbackDataSharingPreference: "prompt",
+      },
+    })),
+    listCompanyIds: vi.fn(async () => ["company-1"]),
+  }),
   issueApprovalService: () => ({}),
   issueService: () => mockIssueService,
   logActivity: mockLogActivity,
@@ -32,11 +32,16 @@ vi.mock("../services/index.js", () => ({
   agentService: () => mockAgentService,
   documentService: () => mockDocumentsService,
   executionWorkspaceService: () => ({}),
+  feedbackService: () => ({}),
   goalService: () => ({}),
   heartbeatService: () => ({
     wakeup: vi.fn(async () => undefined),
     reportRunActivity: vi.fn(async () => undefined),
   }),
+  instanceSettingsService: () => ({
+    getExperimental: vi.fn(async () => ({})),
+    getGeneral: vi.fn(async () => ({ feedbackDataSharingPreference: "prompt" })),
+  }),
   issueApprovalService: () => ({}),
   issueService: () => mockIssueService,
   logActivity: mockLogActivity,
 128  server/src/__tests__/issue-feedback-routes.test.ts  (new file)
@@ -0,0 +1,128 @@
+import express from "express";
+import request from "supertest";
+import { beforeEach, describe, expect, it, vi } from "vitest";
+import { errorHandler } from "../middleware/index.js";
+import { issueRoutes } from "../routes/issues.js";
+
+const mockFeedbackService = vi.hoisted(() => ({
+  getFeedbackTraceById: vi.fn(),
+  getFeedbackTraceBundle: vi.fn(),
+  listIssueVotesForUser: vi.fn(),
+  listFeedbackTraces: vi.fn(),
+  saveIssueVote: vi.fn(),
+}));
+
+vi.mock("../services/index.js", () => ({
+  accessService: () => ({
+    canUser: vi.fn(),
+    hasPermission: vi.fn(),
+  }),
+  agentService: () => ({
+    getById: vi.fn(),
+  }),
+  documentService: () => ({}),
+  executionWorkspaceService: () => ({}),
+  feedbackService: () => mockFeedbackService,
+  goalService: () => ({}),
+  heartbeatService: () => ({
+    wakeup: vi.fn(async () => undefined),
+    reportRunActivity: vi.fn(async () => undefined),
+    getRun: vi.fn(async () => null),
+    getActiveRunForAgent: vi.fn(async () => null),
+    cancelRun: vi.fn(async () => null),
+  }),
+  instanceSettingsService: () => ({
+    get: vi.fn(async () => ({
+      id: "instance-settings-1",
+      general: {
+        censorUsernameInLogs: false,
+        feedbackDataSharingPreference: "prompt",
+      },
+    })),
+    listCompanyIds: vi.fn(async () => ["company-1"]),
+  }),
+  issueApprovalService: () => ({}),
+  issueService: () => ({
+    getById: vi.fn(),
+    update: vi.fn(),
+    addComment: vi.fn(),
+    findMentionedAgents: vi.fn(),
+  }),
+  logActivity: vi.fn(async () => undefined),
+  projectService: () => ({}),
+  routineService: () => ({
+    syncRunStatusForIssue: vi.fn(async () => undefined),
+  }),
+  workProductService: () => ({}),
+}));
+
+function createApp(actor: Record<string, unknown>) {
+  const app = express();
+  app.use(express.json());
+  app.use((req, _res, next) => {
+    (req as any).actor = actor;
+    next();
+  });
+  app.use("/api", issueRoutes({} as any, {} as any));
+  app.use(errorHandler);
+  return app;
+}
+
+describe("issue feedback trace routes", () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  it("rejects non-board callers before fetching a feedback trace", async () => {
+    const app = createApp({
+      type: "agent",
+      agentId: "agent-1",
+      companyId: "company-1",
+      source: "agent_key",
+      runId: "run-1",
+    });
+
+    const res = await request(app).get("/api/feedback-traces/trace-1");
+
+    expect(res.status).toBe(403);
+    expect(mockFeedbackService.getFeedbackTraceById).not.toHaveBeenCalled();
+  });
+
+  it("returns 404 when a board user lacks access to the trace company", async () => {
+    mockFeedbackService.getFeedbackTraceById.mockResolvedValue({
+      id: "trace-1",
+      companyId: "company-2",
+    });
+    const app = createApp({
+      type: "board",
+      userId: "user-1",
+      source: "session",
+      isInstanceAdmin: false,
+      companyIds: ["company-1"],
+    });
+
+    const res = await request(app).get("/api/feedback-traces/trace-1");
+
+    expect(res.status).toBe(404);
+  });
+
+  it("returns 404 for bundle fetches when a board user lacks access to the trace company", async () => {
+    mockFeedbackService.getFeedbackTraceBundle.mockResolvedValue({
+      id: "trace-1",
+      companyId: "company-2",
+      issueId: "issue-1",
+      files: [],
+    });
+    const app = createApp({
+      type: "board",
+      userId: "user-1",
+      source: "session",
+      isInstanceAdmin: false,
+      companyIds: ["company-1"],
+    });
+
+    const res = await request(app).get("/api/feedback-traces/trace-1/bundle");
+
+    expect(res.status).toBe(404);
+  });
+});
 125  server/src/__tests__/issue-telemetry-routes.test.ts  (new file)
@@ -0,0 +1,125 @@
+import express from "express";
+import request from "supertest";
+import { beforeEach, describe, expect, it, vi } from "vitest";
+import { issueRoutes } from "../routes/issues.js";
+import { errorHandler } from "../middleware/index.js";
+
+const mockIssueService = vi.hoisted(() => ({
+  getById: vi.fn(),
+  update: vi.fn(),
+}));
+
+const mockAgentService = vi.hoisted(() => ({
+  getById: vi.fn(),
+}));
+
+const mockTrackAgentTaskCompleted = vi.hoisted(() => vi.fn());
+const mockGetTelemetryClient = vi.hoisted(() => vi.fn());
+
+vi.mock("@paperclipai/shared/telemetry", () => ({
+  trackAgentTaskCompleted: mockTrackAgentTaskCompleted,
+}));
+
+vi.mock("../telemetry.js", () => ({
+  getTelemetryClient: mockGetTelemetryClient,
+}));
+
+vi.mock("../services/index.js", () => ({
+  accessService: () => ({
+    canUser: vi.fn(),
+    hasPermission: vi.fn(),
+  }),
+  agentService: () => mockAgentService,
+  documentService: () => ({}),
+  executionWorkspaceService: () => ({}),
+  feedbackService: () => ({}),
+  goalService: () => ({}),
+  heartbeatService: () => ({
+    reportRunActivity: vi.fn(async () => undefined),
+  }),
+  instanceSettingsService: () => ({}),
+  issueApprovalService: () => ({}),
+  issueService: () => mockIssueService,
+  logActivity: vi.fn(async () => undefined),
+  projectService: () => ({}),
+  routineService: () => ({
+    syncRunStatusForIssue: vi.fn(async () => undefined),
+  }),
+  workProductService: () => ({}),
+}));
+
+function makeIssue(status: "todo" | "done") {
+  return {
+    id: "11111111-1111-4111-8111-111111111111",
+    companyId: "company-1",
+    status,
+    assigneeAgentId: "22222222-2222-4222-8222-222222222222",
+    assigneeUserId: null,
+    createdByUserId: "local-board",
+    identifier: "PAP-1018",
+    title: "Telemetry test",
+  };
+}
+
+function createApp(actor: Record<string, unknown>) {
+  const app = express();
+  app.use(express.json());
+  app.use((req, _res, next) => {
+    (req as any).actor = actor;
+    next();
+  });
+  app.use("/api", issueRoutes({} as any, {} as any));
+  app.use(errorHandler);
+  return app;
+}
+
+describe("issue telemetry routes", () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+    mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
+    mockIssueService.getById.mockResolvedValue(makeIssue("todo"));
+    mockIssueService.update.mockImplementation(async (_id: string, patch: Record<string, unknown>) => ({
+      ...makeIssue("todo"),
+      ...patch,
+    }));
+  });
+
+  it("emits task-completed telemetry with the agent role", async () => {
+    mockAgentService.getById.mockResolvedValue({
+      id: "agent-1",
+      companyId: "company-1",
+      role: "engineer",
+      adapterType: "codex_local",
+    });
+
+    const res = await request(createApp({
+      type: "agent",
+      agentId: "agent-1",
+      companyId: "company-1",
+      runId: null,
+    }))
+      .patch("/api/issues/11111111-1111-4111-8111-111111111111")
+      .send({ status: "done" });
+
+    expect(res.status).toBe(200);
+    expect(mockTrackAgentTaskCompleted).toHaveBeenCalledWith(expect.anything(), {
+      agentRole: "engineer",
+    });
+  });
+
+  it("does not emit agent task-completed telemetry for board-driven completions", async () => {
+    const res = await request(createApp({
+      type: "board",
+      userId: "local-board",
+      companyIds: ["company-1"],
+      source: "local_implicit",
+      isInstanceAdmin: false,
+    }))
+      .patch("/api/issues/11111111-1111-4111-8111-111111111111")
+      .send({ status: "done" });
+
+    expect(res.status).toBe(200);
+    expect(mockTrackAgentTaskCompleted).not.toHaveBeenCalled();
+    expect(mockAgentService.getById).not.toHaveBeenCalled();
+  });
+});
@@ -36,11 +36,25 @@ vi.mock("../services/index.js", () => ({
   executionWorkspaceService: () => ({
     getById: vi.fn(),
   }),
+  feedbackService: () => ({
+    listIssueVotesForUser: vi.fn(async () => []),
+    saveIssueVote: vi.fn(async () => ({ vote: null, consentEnabledNow: false, sharingEnabled: false })),
+  }),
   goalService: () => mockGoalService,
   heartbeatService: () => ({
     wakeup: vi.fn(async () => undefined),
     reportRunActivity: vi.fn(async () => undefined),
   }),
+  instanceSettingsService: () => ({
+    get: vi.fn(async () => ({
+      id: "instance-settings-1",
+      general: {
+        censorUsernameInLogs: false,
+        feedbackDataSharingPreference: "prompt",
+      },
+    })),
+    listCompanyIds: vi.fn(async () => ["company-1"]),
+  }),
   issueApprovalService: () => ({}),
   issueService: () => mockIssueService,
   logActivity: vi.fn(async () => undefined),
|||||||
@@ -1,4 +1,5 @@
 import { randomUUID } from "node:crypto";
+import { eq } from "drizzle-orm";
 import { afterAll, afterEach, beforeAll, describe, expect, it } from "vitest";
 import {
   activityLog,
@@ -228,6 +229,42 @@ describeEmbeddedPostgres("issueService.list participantAgentId", () => {
     expect(result.map((issue) => issue.id)).toEqual([matchedIssueId]);
   });
+
+  it("accepts issue identifiers through getById", async () => {
+    const companyId = randomUUID();
+    const issueId = randomUUID();
+
+    await db.insert(companies).values({
+      id: companyId,
+      name: "Paperclip",
+      issuePrefix: "PAP",
+      requireBoardApprovalForNewAgents: false,
+    });
+
+    await db.insert(issues).values({
+      id: issueId,
+      companyId,
+      issueNumber: 1064,
+      identifier: "PAP-1064",
+      title: "Feedback votes error",
+      status: "todo",
+      priority: "medium",
+      createdByUserId: "user-1",
+    });
+
+    const issue = await svc.getById("PAP-1064");
+
+    expect(issue).toEqual(
+      expect.objectContaining({
+        id: issueId,
+        identifier: "PAP-1064",
+      }),
+    );
+  });
+
+  it("returns null instead of throwing for malformed non-uuid issue refs", async () => {
+    await expect(svc.getById("not-a-uuid")).resolves.toBeNull();
+  });
+
   it("filters issues by execution workspace id", async () => {
     const companyId = randomUUID();
     const projectId = randomUUID();
@@ -357,18 +394,8 @@ describeEmbeddedPostgres("issueService.list participantAgentId", () => {
       },
     ]);
 
-    await svc.archiveInbox(
-      companyId,
-      archivedIssueId,
-      userId,
-      new Date("2026-03-26T12:30:00.000Z"),
-    );
-    await svc.archiveInbox(
-      companyId,
-      resurfacedIssueId,
-      userId,
-      new Date("2026-03-26T13:00:00.000Z"),
-    );
+    await svc.archiveInbox(companyId, archivedIssueId, userId, new Date("2026-03-26T12:30:00.000Z"));
+    await svc.archiveInbox(companyId, resurfacedIssueId, userId, new Date("2026-03-26T13:00:00.000Z"));
 
     await db.insert(issueComments).values({
       companyId,
@@ -402,6 +429,160 @@ describeEmbeddedPostgres("issueService.list participantAgentId", () => {
       resurfacedIssueId,
     ]));
   });
+
+  it("resurfaces archived issue when status/updatedAt changes after archiving", async () => {
+    const companyId = randomUUID();
+    const userId = "user-1";
+    const otherUserId = "user-2";
+
+    await db.insert(companies).values({
+      id: companyId,
+      name: "Paperclip",
+      issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
+      requireBoardApprovalForNewAgents: false,
+    });
+
+    const issueId = randomUUID();
+
+    await db.insert(issues).values({
+      id: issueId,
+      companyId,
+      title: "Issue with old comment then status change",
+      status: "todo",
+      priority: "medium",
+      createdByUserId: userId,
+      createdAt: new Date("2026-03-26T10:00:00.000Z"),
+      updatedAt: new Date("2026-03-26T10:00:00.000Z"),
+    });
+
+    // Old external comment before archiving
+    await db.insert(issueComments).values({
+      companyId,
+      issueId,
+      authorUserId: otherUserId,
+      body: "Old comment before archive",
+      createdAt: new Date("2026-03-26T11:00:00.000Z"),
+      updatedAt: new Date("2026-03-26T11:00:00.000Z"),
+    });
+
+    // Archive after seeing the comment
+    await svc.archiveInbox(
+      companyId,
+      issueId,
+      userId,
+      new Date("2026-03-26T12:00:00.000Z"),
+    );
+
+    // Verify it's archived
+    const afterArchive = await svc.list(companyId, {
+      touchedByUserId: userId,
+      inboxArchivedByUserId: userId,
+    });
+    expect(afterArchive.map((i) => i.id)).not.toContain(issueId);
+
+    // Status/work update changes updatedAt (no new comment)
+    await db
+      .update(issues)
+      .set({
+        status: "in_progress",
+        updatedAt: new Date("2026-03-26T13:00:00.000Z"),
+      })
+      .where(eq(issues.id, issueId));
+
+    // Should resurface because updatedAt > archivedAt
+    const afterUpdate = await svc.list(companyId, {
+      touchedByUserId: userId,
+      inboxArchivedByUserId: userId,
+    });
+    expect(afterUpdate.map((i) => i.id)).toContain(issueId);
+  });
+
+  it("sorts and exposes last activity from comments and non-local issue activity logs", async () => {
+    const companyId = randomUUID();
+    const olderIssueId = randomUUID();
+    const commentIssueId = randomUUID();
+    const activityIssueId = randomUUID();
+
+    await db.insert(companies).values({
+      id: companyId,
+      name: "Paperclip",
+      issuePrefix: `T${companyId.replace(/-/g, "").slice(0, 6).toUpperCase()}`,
+      requireBoardApprovalForNewAgents: false,
+    });
+
+    await db.insert(issues).values([
+      {
+        id: olderIssueId,
+        companyId,
+        title: "Older issue",
+        status: "todo",
+        priority: "medium",
+        updatedAt: new Date("2026-03-26T10:00:00.000Z"),
+      },
+      {
+        id: commentIssueId,
+        companyId,
+        title: "Comment activity issue",
+        status: "todo",
+        priority: "medium",
+        updatedAt: new Date("2026-03-26T10:00:00.000Z"),
+      },
+      {
+        id: activityIssueId,
+        companyId,
+        title: "Logged activity issue",
+        status: "todo",
+        priority: "medium",
+        updatedAt: new Date("2026-03-26T10:00:00.000Z"),
+      },
+    ]);
+
+    await db.insert(issueComments).values({
+      companyId,
+      issueId: commentIssueId,
+      body: "New comment without touching issue.updatedAt",
+      createdAt: new Date("2026-03-26T11:00:00.000Z"),
+      updatedAt: new Date("2026-03-26T11:00:00.000Z"),
+    });
+
+    await db.insert(activityLog).values([
+      {
+        companyId,
+        actorType: "system",
+        actorId: "system",
+        action: "issue.document_updated",
+        entityType: "issue",
+        entityId: activityIssueId,
+        createdAt: new Date("2026-03-26T12:00:00.000Z"),
+      },
+      {
+        companyId,
+        actorType: "user",
+        actorId: "user-1",
+        action: "issue.read_marked",
+        entityType: "issue",
+        entityId: olderIssueId,
+        createdAt: new Date("2026-03-26T13:00:00.000Z"),
+      },
+    ]);
+
+    const result = await svc.list(companyId, {});
+
+    expect(result.map((issue) => issue.id)).toEqual([
+      activityIssueId,
+      commentIssueId,
+      olderIssueId,
+    ]);
+    expect(result.find((issue) => issue.id === activityIssueId)?.lastActivityAt?.toISOString()).toBe(
+      "2026-03-26T12:00:00.000Z",
+    );
+    expect(result.find((issue) => issue.id === commentIssueId)?.lastActivityAt?.toISOString()).toBe(
+      "2026-03-26T11:00:00.000Z",
+    );
+    expect(result.find((issue) => issue.id === olderIssueId)?.lastActivityAt?.toISOString()).toBe(
+      "2026-03-26T10:00:00.000Z",
+    );
+  });
 });
 
 describeEmbeddedPostgres("issueService.create workspace inheritance", () => {
|
|||||||
114
server/src/__tests__/plugin-telemetry-bridge.test.ts
Normal file
114
server/src/__tests__/plugin-telemetry-bridge.test.ts
Normal file
@@ -0,0 +1,114 @@
|
|||||||
|
import { beforeEach, describe, expect, it, vi } from "vitest";
|
||||||
|
import { createHostClientHandlers } from "../../../packages/plugins/sdk/src/host-client-factory.js";
|
||||||
|
import { PLUGIN_RPC_ERROR_CODES } from "../../../packages/plugins/sdk/src/protocol.js";
|
||||||
|
import { buildHostServices } from "../services/plugin-host-services.js";
|
||||||
|
|
||||||
|
const mockGetTelemetryClient = vi.hoisted(() => vi.fn());
|
||||||
|
|
||||||
|
vi.mock("../telemetry.js", () => ({
|
||||||
|
getTelemetryClient: mockGetTelemetryClient,
|
||||||
|
}));
|
||||||
|
|
||||||
|
function createEventBusStub() {
|
||||||
|
return {
|
||||||
|
forPlugin() {
|
||||||
|
return {
|
||||||
|
emit: vi.fn(),
|
||||||
|
subscribe: vi.fn(),
|
||||||
|
};
|
||||||
|
},
|
||||||
|
} as any;
|
||||||
|
}
|
||||||
|
|
||||||
|
describe("plugin telemetry bridge", () => {
|
||||||
|
beforeEach(() => {
|
||||||
|
mockGetTelemetryClient.mockReset();
|
||||||
|
});
|
||||||
|
|
||||||
|
it("prefixes plugin telemetry events before forwarding them to the telemetry client", async () => {
|
||||||
|
const track = vi.fn();
|
||||||
|
mockGetTelemetryClient.mockReturnValue({ track });
|
||||||
|
|
||||||
|
const services = buildHostServices(
|
||||||
|
{} as never,
|
||||||
|
"plugin-record-id",
|
||||||
|
"linear",
|
||||||
|
createEventBusStub(),
|
||||||
|
);
|
||||||
|
const handlers = createHostClientHandlers({
|
||||||
|
pluginId: "linear",
|
||||||
|
capabilities: ["telemetry.track"],
|
||||||
|
services,
|
||||||
|
});
|
||||||
|
|
||||||
|
await handlers["telemetry.track"]({
|
||||||
|
eventName: "sync_completed",
|
||||||
|
dimensions: { attempts: 2, success: true },
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(track).toHaveBeenCalledWith("plugin.linear.sync_completed", {
|
||||||
|
attempts: 2,
|
||||||
|
success: true,
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
it("rejects invalid bare telemetry event names before prefixing", async () => {
|
||||||
|
mockGetTelemetryClient.mockReturnValue({ track: vi.fn() });
|
||||||
|
|
||||||
|
const services = buildHostServices(
|
||||||
|
{} as never,
|
||||||
|
"plugin-record-id",
|
||||||
|
"linear",
|
||||||
|
createEventBusStub(),
|
||||||
|
);
|
||||||
|
|
||||||
|
await expect(
|
||||||
|
services.telemetry.track({ eventName: "sync.completed" }),
|
||||||
|
).rejects.toThrow(
|
||||||
|
'Plugin telemetry event names must be lowercase slugs using letters, numbers, "_" or "-".',
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("rejects telemetry tracking when the plugin lacks the capability", async () => {
|
||||||
|
const services = buildHostServices(
|
||||||
|
{} as never,
|
||||||
|
"plugin-record-id",
|
||||||
|
"linear",
|
||||||
|
createEventBusStub(),
|
||||||
|
);
|
||||||
|
const handlers = createHostClientHandlers({
|
||||||
|
pluginId: "linear",
|
||||||
|
capabilities: [],
|
||||||
|
services,
|
||||||
|
});
|
||||||
|
|
||||||
|
await expect(
|
||||||
|
handlers["telemetry.track"]({ eventName: "sync_completed" }),
|
||||||
|
).rejects.toMatchObject({
|
||||||
|
code: PLUGIN_RPC_ERROR_CODES.CAPABILITY_DENIED,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(mockGetTelemetryClient).not.toHaveBeenCalled();
|
||||||
|
});
|
||||||
|
|
||||||
|
it("passes telemetry requests through when the plugin declares the capability", async () => {
|
||||||
|
const services = buildHostServices(
|
||||||
|
{} as never,
|
||||||
|
"plugin-record-id",
|
||||||
|
"linear",
|
||||||
|
createEventBusStub(),
|
||||||
|
);
|
||||||
|
const handlers = createHostClientHandlers({
|
||||||
|
pluginId: "linear",
|
||||||
|
capabilities: ["telemetry.track"],
|
||||||
|
services,
|
||||||
|
});
|
||||||
|
|
||||||
|
await handlers["telemetry.track"]({
|
||||||
|
eventName: "sync_completed",
|
||||||
|
dimensions: { source: "manual" },
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(mockGetTelemetryClient).toHaveBeenCalledTimes(1);
|
||||||
|
});
|
||||||
|
});
|
||||||
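The prefixing and validation behavior that plugin-telemetry-bridge.test.ts pins down can be sketched as follows. This is a hypothetical sketch, not the actual implementation: the regex, constant, and function name are assumptions; only the error message and the `plugin.<pluginId>.<eventName>` shape come from the test expectations.

```typescript
// Hypothetical sketch of the event-name validation exercised by the tests.
// Assumed: only lowercase letters, digits, "_" and "-" make a valid slug.
const PLUGIN_EVENT_NAME = /^[a-z0-9_-]+$/;

function toPrefixedEventName(pluginId: string, eventName: string): string {
  if (!PLUGIN_EVENT_NAME.test(eventName)) {
    // Error message taken verbatim from the test's rejects.toThrow assertion.
    throw new Error(
      'Plugin telemetry event names must be lowercase slugs using letters, numbers, "_" or "-".',
    );
  }
  return `plugin.${pluginId}.${eventName}`;
}

console.log(toPrefixedEventName("linear", "sync_completed")); // plugin.linear.sync_completed
```

Under this sketch, `"sync.completed"` fails validation because the dot is not in the slug alphabet, which is why the test rejects bare dotted names before any prefix is applied.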
Some files were not shown because too many files have changed in this diff.