Compare commits

..

8 Commits

Author SHA1 Message Date
Dotta
1cdbf9e500 Simplify capped live run polling
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-30 15:55:19 -05:00
Dotta
b192440642 Document liveness recovery coverage scope
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-30 15:37:35 -05:00
Dotta
c66bcd38f8 Split advanced liveness behavior from reliability PR
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-30 15:20:19 -05:00
Dotta
5684e17b79 Address reliability review findings
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-30 13:47:42 -05:00
Dotta
1686cd47ee Harden issue recovery reliability
Co-Authored-By: Paperclip <noreply@paperclip.ing>
2026-04-30 13:38:46 -05:00
Dotta
87f19cd9a6 Improve issue thread scale and markdown polish (#4861)
## Thinking Path

> - Paperclip's board UI is the operator surface for supervising
AI-agent companies.
> - Issue threads are where operators read progress, respond to agents,
inspect markdown, and jump through long histories.
> - Large threads and rich markdown had become difficult to navigate and
expensive to render.
> - The previous rollup mixed these UI scale fixes with unrelated
backend recovery, costs, backups, and settings changes.
> - This pull request isolates the issue-thread scale and markdown
polish work.
> - The benefit is a reviewable UI slice that can merge independently of
the backend reliability, database backup, workflow, and board QoL PRs.

## What Changed

- Virtualized long issue chat threads and stabilized
anchor/jump-to-latest behavior for large histories.
- Added incremental issue-list row loading and tests for
scroll-triggered pagination behavior.
- Hardened markdown body rendering and markdown editor behavior around
HTML tags, image drops, code-copy UI, and escaped newline handling.
- Added a long-thread measurement harness at
`scripts/measure-issue-chat-long-thread.mjs` plus
`perf:issue-chat-long-thread`.
- Added focused UI/lib regression coverage for thread rendering,
markdown, optimistic comments, and message building.
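The virtualization change above can be illustrated with a minimal windowing calculation. This is a hypothetical sketch assuming fixed-height rows and an overscan buffer, not the PR's actual implementation (which handles variable heights and anchor stability):

```typescript
// Hypothetical fixed-height windowing sketch; the real thread virtualizer
// also handles variable row heights and jump-to-latest anchoring.
type RowWindow = { start: number; end: number };

function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  rowCount: number,
  overscan = 5,
): RowWindow {
  // First and last rows intersecting the viewport.
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / rowHeight);
  // Pad by `overscan` rows on each side so fast scrolling has content ready.
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(rowCount, last + overscan),
  };
}
```

Only the rows in `[start, end)` are mounted, which is why a 450-row fixture can render with far fewer DOM rows than its virtual count.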

## Verification

- `pnpm install --frozen-lockfile`
- `pnpm exec vitest run ui/src/components/IssueChatThread.test.tsx
ui/src/components/IssuesList.test.tsx
ui/src/components/MarkdownBody.test.tsx
ui/src/components/MarkdownEditor.test.tsx
ui/src/lib/issue-chat-messages.test.ts
ui/src/lib/optimistic-issue-comments.test.ts`
- Result: 6 test files passed, 170 tests passed.
- UI screenshots not included because this PR is covered by targeted
component tests and does not introduce a new page layout.

## Risks

- Virtualization changes can affect scroll anchoring in edge cases on
very long threads.
- Markdown/editor hardening changes are intentionally defensive, but
malformed content may render differently than before.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may be redirected. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5.5, code execution and GitHub CLI tool use, medium
reasoning effort.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-30 13:18:01 -05:00
Dotta
cd606563f6 Expand database backups to non-system schemas (#4859)
## Thinking Path

> - Paperclip is the control plane for autonomous AI companies.
> - Reliable backups are part of operating that control plane safely.
> - The previous backup path was public-schema oriented and did not
clearly cover plugin-owned schemas or migration history.
> - Paperclip now has plugin database namespaces and Drizzle migration
state that must survive backup/restore.
> - This pull request expands logical database backups to non-system
schemas and documents the backup boundary.
> - The benefit is safer restore behavior for core and plugin-owned
database state without implying full filesystem disaster recovery.

## What Changed

- Include non-system database schemas in JavaScript and pg_dump backup
paths.
- Preserve enum, table, sequence, index, constraint, migration, and
plugin-schema objects across backup/restore.
- Add restore coverage for plugin-owned schemas and Drizzle migration
history.
- Clarify docs that DB backups are logical database backups, not full
instance filesystem backups.
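The schema-scope rule described above can be sketched as a simple predicate: back up every schema except PostgreSQL's system namespaces. This mirrors the SQL predicate used by the backup queries, translated to TypeScript for illustration:

```typescript
// Illustrative TypeScript version of the non-system-schema rule used by the
// backup queries; the production code expresses this as a SQL predicate.
function isBackedUpSchema(schemaName: string): boolean {
  if (schemaName === "pg_catalog" || schemaName === "information_schema") return false;
  if (schemaName.startsWith("pg_toast")) return false; // TOAST storage schemas
  if (schemaName.startsWith("pg_temp_")) return false; // per-session temp schemas
  return true; // public, drizzle, plugin-owned schemas, etc.
}
```

Under this rule `public`, the `drizzle` migration journal, and plugin schemas like `plugin_backup_scope` are all in scope, while catalog and temp schemas stay out.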

## Verification

- `pnpm install --frozen-lockfile`
- `pnpm exec vitest run packages/db/src/backup-lib.test.ts`
- Result: 1 test file passed, 4 tests passed.
- Confirmed this PR does not include `pnpm-lock.yaml` or
`.github/workflows/*` changes.

## Risks

- Medium: backup generation touches schema discovery and restore
ordering, so unusual database objects may need additional coverage
later.
- No migrations are included.

> For core feature work, check [`ROADMAP.md`](ROADMAP.md) first and
discuss it in `#dev` before opening the PR. Feature PRs that overlap
with planned core work may be redirected. See `CONTRIBUTING.md`.

## Model Used

- OpenAI Codex, GPT-5 coding agent, tool use enabled, medium reasoning
effort. Exact hosted context-window details are not exposed in this
runtime.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge

Note: no UI changes are included in this PR, so screenshots are not
applicable.

---------

Co-authored-by: Paperclip <noreply@paperclip.ing>
2026-04-30 12:54:35 -05:00
Devin Foley
c0ce35d1fb Improve E2B plugin configuration UX and fix execution timeouts (#4802)
## Thinking Path

> - Paperclip orchestrates AI agents for zero-human companies
> - E2B is a sandbox provider plugin that runs agent code in isolated
cloud environments
> - Operators configure E2B through the plugin settings page
> - But the E2B API key configuration was unclear — the settings field
description didn't explain that pasted keys are auto-saved as company
secrets, and the fallback to the host `E2B_API_KEY` variable wasn't
documented
> - Additionally, long-running E2B sandbox commands were timing out
because the plugin environment RPC driver used a fixed timeout, and
environment commands competed for the single foreground command slot
> - This PR clarifies the E2B configuration UX, fixes RPC timeouts for
plugin environment execution, and runs E2B environment commands in
background mode to avoid blocking the foreground slot
> - The benefit is clearer E2B setup for operators and more reliable
sandbox command execution

## What Changed

- Updated E2B plugin manifest and settings UI to clarify API key
configuration — field description now explains that pasted keys are
saved as company secrets and documents the `E2B_API_KEY` host fallback
- Added test coverage for the plugin settings page rendering
- Fixed `plugin-environment-driver.ts` to pass the configured timeout
through to RPC calls instead of using a hardcoded default
- Updated `environment-runtime.ts` to propagate timeout from the
environment lease to the plugin driver
- Changed E2B sandbox command execution to use background handles so
long-running agent commands don't block the foreground slot needed by
the callback bridge
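The timeout fix above amounts to passing a configured value through instead of dropping it. A minimal sketch, with illustrative names rather than the plugin driver's real API:

```typescript
// Hedged sketch of the timeout pass-through; names are illustrative and do
// not match the actual plugin-environment-driver API.
const DEFAULT_RPC_TIMEOUT_MS = 30_000;

function resolveRpcTimeout(leaseTimeoutMs?: number): number {
  // Before the fix: the driver always used the fixed default, so long-running
  // sandbox commands timed out. After: the lease-configured timeout wins.
  return leaseTimeoutMs && leaseTimeoutMs > 0 ? leaseTimeoutMs : DEFAULT_RPC_TIMEOUT_MS;
}
```

The background-handle change is a separate concern: by not occupying the single foreground command slot, long agent commands leave it free for the callback bridge's health checks.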

## Verification

- `pnpm test` — all existing and new tests pass
- `pnpm typecheck` — clean
- Manual: navigate to plugin settings, verify E2B API key field shows
the updated description text
- Manual: run an E2B-backed agent task with a long-running command,
verify it completes without RPC timeout

## Risks

- Low risk. Configuration UX change is cosmetic. The timeout fix passes
an existing value through instead of dropping it. Background command
execution is a behavioral change but only affects E2B sandbox commands —
the foreground slot is still available for bridge health checks.

## Model Used

OpenAI Codex, GPT-5.4, high reasoning effort, via Paperclip.

## Checklist

- [x] I have included a thinking path that traces from project context
to this change
- [x] I have specified the model used (with version and capability
details)
- [x] I have checked ROADMAP.md and confirmed this PR does not duplicate
planned core work
- [x] I have run tests locally and they pass
- [x] I have added or updated tests where applicable
- [x] If this change affects the UI, I have included before/after
screenshots
- [x] I have updated relevant documentation to reflect my changes
- [x] I have considered and documented any risks above
- [x] I will address all Greptile and reviewer comments before
requesting merge
2026-04-29 17:12:30 -07:00
28 changed files with 1579 additions and 199 deletions

View File

@@ -149,7 +149,15 @@ The plugin runtime tracks plugin-owned database namespaces and migrations in `pl
## Backups
Paperclip supports automatic and manual database backups. See `doc/DEVELOPING.md` for the current `paperclipai db:backup` / `pnpm db:backup` commands and backup retention configuration.
Paperclip supports automatic and manual logical database backups. These dumps include
non-system database schemas such as `public`, the Drizzle migration journal, and
plugin-owned database schemas. See `doc/DEVELOPING.md` for the current
`paperclipai db:backup` / `pnpm db:backup` commands and backup retention
configuration.
Database backups do not include non-database instance files such as local-disk
uploads, workspace files, or the local encrypted secrets master key. Back those paths
up separately when you need full instance disaster recovery.
## Secret storage

View File

@@ -421,7 +421,9 @@ If you set `DATABASE_URL`, the server will use that instead of embedded PostgreS
## Automatic DB Backups
Paperclip can run automatic DB backups on a timer. Defaults:
Paperclip can run automatic logical database backups on a timer. These backups cover
non-system database schemas, including migration history and plugin-owned database
schemas. Defaults:
- enabled
- every 60 minutes
@@ -449,6 +451,10 @@ Environment overrides:
- `PAPERCLIP_DB_BACKUP_RETENTION_DAYS=<days>`
- `PAPERCLIP_DB_BACKUP_DIR=/absolute/or/~/path`
DB backups are not full instance filesystem backups. For full local disaster
recovery, also back up local storage files and the local encrypted secrets key if
those providers are enabled.
## Secrets in Dev
Agent env vars now support secret references. By default, secret values are stored with local encryption and only secret refs are persisted in agent config.

View File

@@ -42,7 +42,8 @@
"evals:smoke": "cd evals/promptfoo && npx promptfoo@0.103.3 eval",
"test:release-smoke": "npx playwright test --config tests/release-smoke/playwright.config.ts",
"test:release-smoke:headed": "npx playwright test --config tests/release-smoke/playwright.config.ts --headed",
"metrics:paperclip-commits": "tsx scripts/paperclip-commit-metrics.ts"
"metrics:paperclip-commits": "tsx scripts/paperclip-commit-metrics.ts",
"perf:issue-chat-long-thread": "node scripts/measure-issue-chat-long-thread.mjs"
},
"devDependencies": {
"@playwright/test": "^1.58.2",

View File

@@ -182,7 +182,135 @@ describeEmbeddedPostgres("runDatabaseBackup", () => {
);
it(
"restores statements incrementally when backup comments precede the first breakpoint",
"backs up and restores non-public database schemas and migration history",
async () => {
const sourceConnectionString = await createTempDatabase();
const restoreConnectionString = await createSiblingDatabase(
sourceConnectionString,
"paperclip_full_logical_restore_target",
);
const backupDir = createTempDir("paperclip-db-full-logical-backup-");
const sourceSql = postgres(sourceConnectionString, { max: 1, onnotice: () => {} });
const restoreSql = postgres(restoreConnectionString, { max: 1, onnotice: () => {} });
try {
await sourceSql.unsafe(`
CREATE SCHEMA IF NOT EXISTS "drizzle";
CREATE TABLE IF NOT EXISTS "drizzle"."__drizzle_migrations" (
"id" serial PRIMARY KEY,
"hash" text NOT NULL,
"created_at" bigint
);
INSERT INTO "drizzle"."__drizzle_migrations" ("hash", "created_at")
VALUES ('paperclip-migration-history', 1770000000000);
`);
await sourceSql.unsafe(`
CREATE TABLE "public"."backup_parent_records" (
"id" uuid PRIMARY KEY,
"name" text NOT NULL
);
INSERT INTO "public"."backup_parent_records" ("id", "name")
VALUES ('11111111-1111-4111-8111-111111111111', 'parent');
`);
await sourceSql.unsafe(`
CREATE TABLE "public"."plugin_rows" (
"id" serial PRIMARY KEY,
"note" text NOT NULL
);
CREATE TABLE "public"."audit_rows" (
"id" serial PRIMARY KEY,
"secret_note" text
);
INSERT INTO "public"."plugin_rows" ("note")
VALUES ('public-collision');
INSERT INTO "public"."audit_rows" ("secret_note")
VALUES ('public-secret');
`);
await sourceSql.unsafe(`
CREATE SCHEMA "plugin_backup_scope";
CREATE TYPE "plugin_backup_scope"."plugin_status" AS ENUM ('ready', 'done');
CREATE TABLE "plugin_backup_scope"."plugin_rows" (
"id" serial PRIMARY KEY,
"parent_id" uuid NOT NULL REFERENCES "public"."backup_parent_records"("id") ON DELETE CASCADE,
"status" "plugin_backup_scope"."plugin_status" NOT NULL,
"note" text NOT NULL
);
CREATE TABLE "plugin_backup_scope"."audit_rows" (
"id" serial PRIMARY KEY,
"secret_note" text
);
CREATE UNIQUE INDEX "plugin_rows_note_uq" ON "plugin_backup_scope"."plugin_rows" ("note");
INSERT INTO "plugin_backup_scope"."plugin_rows" ("parent_id", "status", "note")
VALUES ('11111111-1111-4111-8111-111111111111', 'ready', 'first');
INSERT INTO "plugin_backup_scope"."audit_rows" ("secret_note")
VALUES ('plugin-secret');
`);
const result = await runDatabaseBackup({
connectionString: sourceConnectionString,
backupDir,
retention: { dailyDays: 7, weeklyWeeks: 4, monthlyMonths: 1 },
filenamePrefix: "paperclip-full-logical-test",
backupEngine: "javascript",
excludeTables: ["plugin_rows"],
nullifyColumns: {
audit_rows: ["secret_note"],
},
});
await runDatabaseRestore({
connectionString: restoreConnectionString,
backupFile: result.backupFile,
});
const migrationRows = await restoreSql.unsafe<{ hash: string }[]>(`
SELECT "hash"
FROM "drizzle"."__drizzle_migrations"
WHERE "hash" = 'paperclip-migration-history'
`);
expect(migrationRows).toEqual([{ hash: "paperclip-migration-history" }]);
const pluginRows = await restoreSql.unsafe<{ note: string; status: string; parent_name: string }[]>(`
SELECT r."note", r."status"::text AS "status", p."name" AS "parent_name"
FROM "plugin_backup_scope"."plugin_rows" r
JOIN "public"."backup_parent_records" p ON p."id" = r."parent_id"
`);
expect(pluginRows).toEqual([{ note: "first", status: "ready", parent_name: "parent" }]);
const publicCollisionRows = await restoreSql.unsafe<{ count: number }[]>(`
SELECT count(*)::int AS count
FROM "public"."plugin_rows"
`);
expect(publicCollisionRows[0]?.count).toBe(0);
const publicAuditRows = await restoreSql.unsafe<{ secret_note: string | null }[]>(`
SELECT "secret_note"
FROM "public"."audit_rows"
`);
expect(publicAuditRows).toEqual([{ secret_note: null }]);
const pluginAuditRows = await restoreSql.unsafe<{ secret_note: string | null }[]>(`
SELECT "secret_note"
FROM "plugin_backup_scope"."audit_rows"
`);
expect(pluginAuditRows).toEqual([{ secret_note: "plugin-secret" }]);
await expect(
restoreSql.unsafe(`
INSERT INTO "plugin_backup_scope"."plugin_rows" ("parent_id", "status", "note")
VALUES ('11111111-1111-4111-8111-111111111111', 'done', 'first')
`),
).rejects.toThrow();
} finally {
await sourceSql.end();
await restoreSql.end();
}
},
60_000,
);
it(
"restores legacy public-only backups without migration history",
async () => {
const restoreConnectionString = await createTempDatabase();
const restoreSql = postgres(restoreConnectionString, { max: 1, onnotice: () => {} });

View File

@@ -19,6 +19,11 @@ export type RunDatabaseBackupOptions = {
retention: BackupRetentionPolicy;
filenamePrefix?: string;
connectTimeoutSeconds?: number;
/**
* @deprecated Migration-journal schemas are included with the normal backup
* scope. This option is kept for compatibility and no longer changes backup
* engine selection.
*/
includeMigrationJournal?: boolean;
excludeTables?: string[];
nullifyColumns?: Record<string, string[]>;
@@ -61,8 +66,6 @@ type ExtensionDefinition = {
schema_name: string;
};
const DRIZZLE_SCHEMA = "drizzle";
const DRIZZLE_MIGRATIONS_TABLE = "__drizzle_migrations";
const DEFAULT_BACKUP_WRITE_BUFFER_BYTES = 1024 * 1024;
const BACKUP_DATA_CURSOR_ROWS = 100;
const BACKUP_CLI_STDERR_BYTES = 64 * 1024;
@@ -194,16 +197,22 @@ function formatSqlLiteral(value: string): string {
function normalizeTableNameSet(values: string[] | undefined): Set<string> {
return new Set(
(values ?? [])
.map((value) => value.trim())
.map(normalizeTableSelector)
.filter((value) => value.length > 0),
);
}
function normalizeTableSelector(value: string): string {
const trimmed = value.trim();
if (trimmed.length === 0) return "";
return trimmed.includes(".") ? trimmed : tableKey("public", trimmed);
}
function normalizeNullifyColumnMap(values: Record<string, string[]> | undefined): Map<string, Set<string>> {
const out = new Map<string, Set<string>>();
if (!values) return out;
for (const [tableName, columns] of Object.entries(values)) {
const normalizedTable = tableName.trim();
const normalizedTable = normalizeTableSelector(tableName);
if (normalizedTable.length === 0) continue;
const normalizedColumns = new Set(
columns
@@ -229,9 +238,14 @@ function tableKey(schemaName: string, tableName: string): string {
return `${schemaName}.${tableName}`;
}
function nonSystemSchemaPredicate(identifier: string): string {
return `${identifier} NOT IN ('pg_catalog', 'information_schema')
AND ${identifier} NOT LIKE 'pg_toast%'
AND ${identifier} NOT LIKE 'pg_temp_%'`;
}
function hasBackupTransforms(opts: RunDatabaseBackupOptions): boolean {
return opts.includeMigrationJournal === true ||
(opts.excludeTables?.length ?? 0) > 0 ||
return (opts.excludeTables?.length ?? 0) > 0 ||
Object.keys(opts.nullifyColumns ?? {}).length > 0;
}
@@ -285,7 +299,6 @@ async function runPgDumpBackup(opts: {
"--if-exists",
"--no-owner",
"--no-privileges",
"--schema=public",
],
{
stdio: ["ignore", "pipe", "pipe"],
@@ -484,7 +497,6 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
const connectTimeout = Math.max(1, Math.trunc(opts.connectTimeoutSeconds ?? 5));
const backupEngine = opts.backupEngine ?? "auto";
const canUsePgDump = !hasBackupTransforms(opts);
const includeMigrationJournal = opts.includeMigrationJournal === true;
const excludedTableNames = normalizeTableNameSet(opts.excludeTables);
const nullifiedColumnsByTable = normalizeNullifyColumnMap(opts.nullifyColumns);
let sql = postgres(opts.connectionString, { max: 1, connect_timeout: connectTimeout });
@@ -552,31 +564,24 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
SELECT table_schema AS schema_name, table_name AS tablename
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
AND (
table_schema = 'public'
OR (${includeMigrationJournal}::boolean AND table_schema = ${DRIZZLE_SCHEMA} AND table_name = ${DRIZZLE_MIGRATIONS_TABLE})
)
AND ${sql.unsafe(nonSystemSchemaPredicate("table_schema"))}
ORDER BY table_schema, table_name
`;
const tables = allTables;
const includedTableNames = new Set(tables.map(({ schema_name, tablename }) => tableKey(schema_name, tablename)));
const includedSchemas = new Set(tables.map(({ schema_name }) => schema_name));
// Get all enums
const enums = await sql<{ typname: string; labels: string[] }[]>`
SELECT t.typname, array_agg(e.enumlabel ORDER BY e.enumsortorder) AS labels
const enums = await sql<{ schema_name: string; typname: string; labels: string[] }[]>`
SELECT n.nspname AS schema_name, t.typname, array_agg(e.enumlabel ORDER BY e.enumsortorder) AS labels
FROM pg_type t
JOIN pg_enum e ON t.oid = e.enumtypid
JOIN pg_namespace n ON t.typnamespace = n.oid
WHERE n.nspname = 'public'
GROUP BY t.typname
ORDER BY t.typname
WHERE ${sql.unsafe(nonSystemSchemaPredicate("n.nspname"))}
GROUP BY n.nspname, t.typname
ORDER BY n.nspname, t.typname
`;
for (const e of enums) {
const labels = e.labels.map((l) => `'${l.replace(/'/g, "''")}'`).join(", ");
emitStatement(`CREATE TYPE "public"."${e.typname}" AS ENUM (${labels});`);
}
if (enums.length > 0) emit("");
for (const e of enums) includedSchemas.add(e.schema_name);
const allSequences = await sql<SequenceDefinition[]>`
SELECT
@@ -598,16 +603,14 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
LEFT JOIN pg_class tbl ON tbl.oid = dep.refobjid
LEFT JOIN pg_namespace tblns ON tblns.oid = tbl.relnamespace
LEFT JOIN pg_attribute attr ON attr.attrelid = tbl.oid AND attr.attnum = dep.refobjsubid
WHERE s.sequence_schema = 'public'
OR (${includeMigrationJournal}::boolean AND s.sequence_schema = ${DRIZZLE_SCHEMA})
WHERE ${sql.unsafe(nonSystemSchemaPredicate("s.sequence_schema"))}
ORDER BY s.sequence_schema, s.sequence_name
`;
const sequences = allSequences.filter(
(seq) => !seq.owner_table || includedTableNames.has(tableKey(seq.owner_schema ?? "public", seq.owner_table)),
);
const schemas = new Set<string>();
for (const table of tables) schemas.add(table.schema_name);
const schemas = new Set<string>(includedSchemas);
for (const seq of sequences) schemas.add(seq.sequence_schema);
const extraSchemas = [...schemas].filter((schemaName) => schemaName !== "public");
if (extraSchemas.length > 0) {
@@ -618,6 +621,12 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
emit("");
}
for (const e of enums) {
const labels = e.labels.map((l) => `'${l.replace(/'/g, "''")}'`).join(", ");
emitStatement(`CREATE TYPE ${quoteQualifiedName(e.schema_name, e.typname)} AS ENUM (${labels});`);
}
if (enums.length > 0) emit("");
const extensions = await sql<ExtensionDefinition[]>`
SELECT
e.extname AS extension_name,
@@ -655,6 +664,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
const columns = await sql<{
column_name: string;
data_type: string;
udt_schema: string;
udt_name: string;
is_nullable: string;
column_default: string | null;
@@ -662,7 +672,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
numeric_precision: number | null;
numeric_scale: number | null;
}[]>`
SELECT column_name, data_type, udt_name, is_nullable, column_default,
SELECT column_name, data_type, udt_schema, udt_name, is_nullable, column_default,
character_maximum_length, numeric_precision, numeric_scale
FROM information_schema.columns
WHERE table_schema = ${schema_name} AND table_name = ${tablename}
@@ -676,9 +686,12 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
for (const col of columns) {
let typeStr: string;
if (col.data_type === "USER-DEFINED") {
typeStr = `"${col.udt_name}"`;
typeStr = quoteQualifiedName(col.udt_schema, col.udt_name);
} else if (col.data_type === "ARRAY") {
typeStr = `${col.udt_name.replace(/^_/, "")}[]`;
const elementType = col.udt_name.replace(/^_/, "");
typeStr = col.udt_schema === "pg_catalog"
? `${elementType}[]`
: `${quoteQualifiedName(col.udt_schema, elementType)}[]`;
} else if (col.data_type === "character varying") {
typeStr = col.character_maximum_length
? `varchar(${col.character_maximum_length})`
@@ -761,10 +774,8 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
JOIN pg_namespace tgtn ON tgtn.oid = tgt.relnamespace
JOIN pg_attribute sa ON sa.attrelid = src.oid AND sa.attnum = ANY(c.conkey)
JOIN pg_attribute ta ON ta.attrelid = tgt.oid AND ta.attnum = ANY(c.confkey)
WHERE c.contype = 'f' AND (
srcn.nspname = 'public'
OR (${includeMigrationJournal}::boolean AND srcn.nspname = ${DRIZZLE_SCHEMA})
)
WHERE c.contype = 'f'
AND ${sql.unsafe(nonSystemSchemaPredicate("srcn.nspname"))}
GROUP BY c.conname, srcn.nspname, src.relname, tgtn.nspname, tgt.relname, c.confupdtype, c.confdeltype
ORDER BY srcn.nspname, src.relname, c.conname
`;
@@ -800,10 +811,8 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
JOIN pg_class t ON t.oid = c.conrelid
JOIN pg_namespace n ON n.oid = t.relnamespace
JOIN pg_attribute a ON a.attrelid = t.oid AND a.attnum = ANY(c.conkey)
WHERE c.contype = 'u' AND (
n.nspname = 'public'
OR (${includeMigrationJournal}::boolean AND n.nspname = ${DRIZZLE_SCHEMA})
)
WHERE c.contype = 'u'
AND ${sql.unsafe(nonSystemSchemaPredicate("n.nspname"))}
GROUP BY c.conname, n.nspname, t.relname
ORDER BY n.nspname, t.relname, c.conname
`;
@@ -822,10 +831,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
const allIndexes = await sql<{ schema_name: string; tablename: string; indexdef: string }[]>`
SELECT schemaname AS schema_name, tablename, indexdef
FROM pg_indexes
WHERE (
schemaname = 'public'
OR (${includeMigrationJournal}::boolean AND schemaname = ${DRIZZLE_SCHEMA})
)
WHERE ${sql.unsafe(nonSystemSchemaPredicate("schemaname"))}
AND indexname NOT IN (
SELECT conname FROM pg_constraint c
JOIN pg_namespace n ON n.oid = c.connamespace
@@ -845,9 +851,10 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
// Dump data for each table
for (const { schema_name, tablename } of tables) {
const currentTableKey = tableKey(schema_name, tablename);
const qualifiedTableName = quoteQualifiedName(schema_name, tablename);
const count = await sql.unsafe<{ n: number }[]>(`SELECT count(*)::int AS n FROM ${qualifiedTableName}`);
if (excludedTableNames.has(tablename) || (count[0]?.n ?? 0) === 0) continue;
if (excludedTableNames.has(currentTableKey) || (count[0]?.n ?? 0) === 0) continue;
// Get column info for this table
const cols = await sql<{ column_name: string; data_type: string }[]>`
@@ -860,7 +867,7 @@ export async function runDatabaseBackup(opts: RunDatabaseBackupOptions): Promise
emit(`-- Data for: ${schema_name}.${tablename} (${count[0]!.n} rows)`);
const nullifiedColumns = nullifiedColumnsByTable.get(tablename) ?? new Set<string>();
const nullifiedColumns = nullifiedColumnsByTable.get(currentTableKey) ?? new Set<string>();
if (backupEngine !== "javascript" && nullifiedColumns.size === 0) {
emit(`COPY ${qualifiedTableName} (${colNames}) FROM stdin;`);
await writer.writeRaw("\n");

View File

@@ -0,0 +1,108 @@
#!/usr/bin/env node
import { chromium } from "@playwright/test";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";
const baseUrl = (process.env.PAPERCLIP_PERF_BASE_URL || "http://localhost:3100").replace(/\/$/, "");
const companyPrefix = process.env.PAPERCLIP_PERF_COMPANY_PREFIX;
const url = companyPrefix
? `${baseUrl}/${companyPrefix}/tests/perf/long-thread`
: `${baseUrl}/tests/perf/long-thread`;
const origin = new URL(url).origin;
function loadBoardToken() {
const authPath = path.resolve(os.homedir(), ".paperclip/auth.json");
try {
const auth = JSON.parse(fs.readFileSync(authPath, "utf-8"));
const credentials = auth.credentials || {};
const matching = Object.values(credentials).find((entry) => {
if (!entry || !entry.token || !entry.apiBase) return false;
return new URL(entry.apiBase).origin === origin;
});
if (matching?.token) return matching.token;
const fallback = Object.values(credentials).find((entry) => entry?.token);
return fallback?.token ?? null;
} catch {
return null;
}
}
const browser = await chromium.launch({ headless: true });
const page = await browser.newPage({ viewport: { width: 1440, height: 1000 } });
const boardToken = process.env.PAPERCLIP_PERF_BEARER_TOKEN || loadBoardToken();
if (boardToken) {
await page.route(`${origin}/**`, async (route) => {
await route.continue({
headers: { ...route.request().headers(), Authorization: `Bearer ${boardToken}` },
});
});
}
try {
const startedAt = Date.now();
await page.goto(url, { waitUntil: "networkidle" });
await page.waitForSelector('[data-testid="issue-chat-long-thread-perf"]', { timeout: 30_000 });
await page.waitForFunction(() => {
const target = Number(document.querySelector('[data-testid="perf-fixture-row-target"]')?.textContent ?? "450");
const renderedRows = document.querySelectorAll('[data-testid="issue-chat-message-row"]').length;
const virtualizer = document.querySelector('[data-testid="issue-chat-thread-virtualizer"]');
if (!virtualizer) return renderedRows >= target;
const virtualCount = Number(virtualizer.getAttribute("data-virtual-count") ?? "0");
return virtualCount >= target && renderedRows > 0 && renderedRows < target;
}, null, { timeout: 60_000 });
const rowReadyMs = Date.now() - startedAt;
const metrics = await page.evaluate(async () => {
const text = (testId) => document.querySelector(`[data-testid="${testId}"]`)?.textContent?.trim() ?? "";
const numericMs = (testId) => {
const value = text(testId).replace(/\s*ms$/, "");
const parsed = Number(value);
return Number.isFinite(parsed) ? parsed : null;
};
const rowCount = document.querySelectorAll('[data-testid="issue-chat-message-row"]').length;
const virtualizer = document.querySelector('[data-testid="issue-chat-thread-virtualizer"]');
const virtualCount = Number(virtualizer?.getAttribute("data-virtual-count") ?? "0");
const assistantRowCount = document.querySelectorAll('[data-testid="issue-chat-message-row"][data-message-role="assistant"]').length;
const systemRowCount = document.querySelectorAll('[data-testid="issue-chat-message-row"][data-message-role="system"]').length;
const userRowCount = document.querySelectorAll('[data-testid="issue-chat-message-row"][data-message-role="user"]').length;
const markdownRows = Number(text("perf-fixture-markdown-rows"));
const commitCount = Number(text("perf-commit-count"));
const scrollStartY = window.scrollY;
const scrollTarget = Math.max(0, document.documentElement.scrollHeight - window.innerHeight);
const scrollStartedAt = performance.now();
window.scrollTo({ top: scrollTarget, behavior: "instant" });
await new Promise((resolve) => requestAnimationFrame(() => resolve()));
await new Promise((resolve) => requestAnimationFrame(() => resolve()));
const scrollResponsiveMs = performance.now() - scrollStartedAt;
return {
url: window.location.href,
fixtureRowTarget: Number(text("perf-fixture-row-target")),
virtualized: Boolean(virtualizer),
virtualCount,
rowCount,
assistantRowCount,
userRowCount,
systemRowCount,
markdownRows,
commitCount,
mountActualDurationMs: numericMs("perf-mount-duration"),
latestActualDurationMs: numericMs("perf-latest-duration"),
maxActualDurationMs: numericMs("perf-max-duration"),
totalActualDurationMs: numericMs("perf-total-duration"),
reactProfilerAvailable: commitCount > 0,
scrollResponsiveMs: Number(scrollResponsiveMs.toFixed(1)),
scrollDeltaPx: Math.round(Math.abs(window.scrollY - scrollStartY)),
documentHeightPx: Math.round(document.documentElement.scrollHeight),
};
});
const elapsedMs = Date.now() - startedAt;
console.log(JSON.stringify({ ...metrics, renderReadyMs: rowReadyMs, elapsedMs }, null, 2));
} finally {
await browser.close();
}

View File

@@ -10,7 +10,6 @@ const mockHeartbeatService = vi.hoisted(() => ({
buildRunOutputSilence: vi.fn(),
getRunIssueSummary: vi.fn(),
getActiveRunIssueSummaryForAgent: vi.fn(),
buildRunOutputSilence: vi.fn(),
getRunLogAccess: vi.fn(),
readLog: vi.fn(),
}));
@@ -71,7 +70,7 @@ function registerModuleMocks() {
}));
}
async function createApp() {
async function createApp(db: Record<string, unknown> = {}) {
const [{ agentRoutes }, { errorHandler }] = await Promise.all([
vi.importActual<typeof import("../routes/agents.js")>("../routes/agents.js"),
vi.importActual<typeof import("../middleware/index.js")>("../middleware/index.js"),
@@ -88,11 +87,32 @@ async function createApp() {
};
next();
});
app.use("/api", agentRoutes({} as any));
app.use("/api", agentRoutes(db as any));
app.use(errorHandler);
return app;
}
function createLiveRunsDbStub(rows: Array<Record<string, unknown>>) {
const limit = vi.fn(async (value: number) => rows.slice(0, value));
const orderedQuery = {
limit,
then: (resolve: (value: Array<Record<string, unknown>>) => unknown) => Promise.resolve(rows).then(resolve),
};
const query = {
from: vi.fn().mockReturnThis(),
innerJoin: vi.fn().mockReturnThis(),
where: vi.fn().mockReturnThis(),
orderBy: vi.fn().mockReturnValue(orderedQuery),
};
return {
db: {
select: vi.fn().mockReturnValue(query),
},
limit,
};
}
async function requestApp(
app: express.Express,
buildRequest: (baseUrl: string) => request.Test,
@@ -284,4 +304,81 @@ describe("agent live run routes", () => {
nextOffset: 5,
});
});
it("caps company live run polling by default", async () => {
const rows = Array.from({ length: 75 }, (_, index) => ({
id: `run-${index}`,
companyId: "company-1",
status: "running",
invocationSource: "on_demand",
triggerDetail: "manual",
startedAt: new Date("2026-04-10T09:30:00.000Z"),
finishedAt: null,
createdAt: new Date(`2026-04-10T09:${String(index % 60).padStart(2, "0")}:00.000Z`),
agentId: "agent-1",
agentName: "Builder",
adapterType: "codex_local",
logBytes: 0,
livenessState: "healthy",
livenessReason: null,
continuationAttempt: 0,
lastUsefulActionAt: null,
nextAction: null,
lastOutputAt: null,
lastOutputSeq: null,
lastOutputStream: null,
lastOutputBytes: 0,
processStartedAt: null,
issueId: "issue-1",
}));
const { db, limit } = createLiveRunsDbStub(rows);
const res = await requestApp(
await createApp(db),
(baseUrl) => request(baseUrl).get("/api/companies/company-1/live-runs"),
);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(limit).toHaveBeenCalledWith(50);
expect(res.body).toHaveLength(50);
expect(mockHeartbeatService.buildRunOutputSilence).toHaveBeenCalledTimes(50);
});
it("treats explicit zero live run limits as the capped default", async () => {
const rows = Array.from({ length: 75 }, (_, index) => ({
id: `run-${index}`,
companyId: "company-1",
status: "running",
invocationSource: "on_demand",
triggerDetail: "manual",
startedAt: new Date("2026-04-10T09:30:00.000Z"),
finishedAt: null,
createdAt: new Date(`2026-04-10T09:${String(index % 60).padStart(2, "0")}:00.000Z`),
agentId: "agent-1",
agentName: "Builder",
adapterType: "codex_local",
logBytes: 0,
livenessState: "healthy",
livenessReason: null,
continuationAttempt: 0,
lastUsefulActionAt: null,
nextAction: null,
lastOutputAt: null,
lastOutputSeq: null,
lastOutputStream: null,
lastOutputBytes: 0,
processStartedAt: null,
issueId: "issue-1",
}));
const { db, limit } = createLiveRunsDbStub(rows);
const res = await requestApp(
await createApp(db),
(baseUrl) => request(baseUrl).get("/api/companies/company-1/live-runs?limit=0&minCount=0"),
);
expect(res.status, JSON.stringify(res.body)).toBe(200);
expect(limit).toHaveBeenCalledWith(50);
expect(res.body).toHaveLength(50);
});
});
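
The stub above fakes Drizzle's fluent builder so the route's `.limit()` call can be asserted on. A vitest-free sketch of the same chainable shape (`FluentQueryStub`, `Row`, and `limitCalls` are illustrative names, not the production schema): every builder step returns the query object, while `orderBy()` hands back a thenable so both `await query` and `await query.limit(n)` resolve.

```typescript
type Row = Record<string, unknown>;

class FluentQueryStub {
  constructor(private rows: Row[], private limitCalls: number[]) {}
  from(..._args: unknown[]) { return this; }
  innerJoin(..._args: unknown[]) { return this; }
  where(..._args: unknown[]) { return this; }
  orderBy(..._args: unknown[]) {
    const { rows, limitCalls } = this;
    return {
      // Capped path: route code calls .limit(n); record the cap it applied.
      limit: async (value: number): Promise<Row[]> => {
        limitCalls.push(value);
        return rows.slice(0, value);
      },
      // Uncapped path: awaiting the query itself resolves with every row.
      then: (resolve: (value: Row[]) => unknown, _reject?: (error: unknown) => unknown) =>
        Promise.resolve(rows).then(resolve),
    };
  }
}

function createLiveRunsDbStub(rows: Row[]) {
  const limitCalls: number[] = [];
  const query = new FluentQueryStub(rows, limitCalls);
  return { db: { select: (..._args: unknown[]) => query }, limitCalls };
}
```

`expect(limit).toHaveBeenCalledWith(50)` in the tests above works the same way: the stub records whatever cap the route applies, so the assertion pins the default without touching a real database.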

View File

@@ -289,10 +289,23 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
const heartbeat = heartbeatService(db);
const first = await heartbeat.reconcileIssueGraphLiveness();
const second = await heartbeat.reconcileIssueGraphLiveness();
expect(first.escalationsCreated).toBe(1);
const [sourceAfterFirst] = await db
.select({ updatedAt: issues.updatedAt })
.from(issues)
.where(eq(issues.id, blockedIssueId));
const eventsAfterFirst = await db.select().from(activityLog).where(eq(activityLog.companyId, companyId));
expect(eventsAfterFirst.filter((event) => event.action === "issue.blockers.updated")).toHaveLength(1);
const second = await heartbeat.reconcileIssueGraphLiveness();
expect(second.escalationsCreated).toBe(0);
const [sourceAfterSecond] = await db
.select({ updatedAt: issues.updatedAt })
.from(issues)
.where(eq(issues.id, blockedIssueId));
expect(sourceAfterSecond?.updatedAt.getTime()).toBe(sourceAfterFirst?.updatedAt.getTime());
const escalations = await db
.select()
@@ -345,7 +358,7 @@ describeEmbeddedPostgres("heartbeat issue graph liveness escalation", () => {
projectWorkspaceSourceIssueId: blockerIssueId,
},
});
expect(events.some((event) => event.action === "issue.blockers.updated")).toBe(true);
expect(events.filter((event) => event.action === "issue.blockers.updated")).toHaveLength(1);
});
it("skips budget-blocked direct owners and assigns recovery to the manager fallback", async () => {

View File

@@ -76,6 +76,9 @@ describeEmbeddedPostgres("issue blocker attention", () => {
status: string;
parentId?: string | null;
assigneeAgentId?: string | null;
originKind?: string | null;
originId?: string | null;
originFingerprint?: string | null;
}) {
const id = input.id ?? randomUUID();
await db.insert(issues).values({
@@ -87,6 +90,9 @@ describeEmbeddedPostgres("issue blocker attention", () => {
priority: "medium",
parentId: input.parentId ?? null,
assigneeAgentId: input.assigneeAgentId ?? null,
originKind: input.originKind ?? "manual",
originId: input.originId ?? null,
originFingerprint: input.originFingerprint ?? "default",
});
return id;
}
@@ -356,6 +362,52 @@ describeEmbeddedPostgres("issue blocker attention", () => {
});
});
it("treats open liveness escalation blockers as covered waiting paths", async () => {
const { companyId, agentId } = await createCompany("PBL");
const parentId = await insertIssue({ companyId, identifier: "PBL-1", title: "Parent", status: "blocked" });
const cancelledLeafId = await insertIssue({
companyId,
identifier: "PBL-2",
title: "Cancelled blocker",
status: "cancelled",
assigneeAgentId: agentId,
});
const incidentKey = [
"harness_liveness",
companyId,
parentId,
"blocked_by_cancelled_issue",
cancelledLeafId,
].join(":");
const escalationId = await insertIssue({
companyId,
identifier: "PBL-3",
title: "Liveness escalation",
status: "todo",
assigneeAgentId: agentId,
originKind: "harness_liveness_escalation",
originId: incidentKey,
originFingerprint: [
"harness_liveness_leaf",
companyId,
"blocked_by_cancelled_issue",
cancelledLeafId,
].join(":"),
});
await block({ companyId, blockerIssueId: cancelledLeafId, blockedIssueId: parentId });
await block({ companyId, blockerIssueId: escalationId, blockedIssueId: parentId });
const parent = (await svc.list(companyId, { status: "blocked,todo" })).find((issue) => issue.id === parentId);
expect(parent?.blockerAttention).toMatchObject({
state: "covered",
reason: "active_dependency",
unresolvedBlockerCount: 2,
coveredBlockerCount: 2,
attentionBlockerCount: 0,
});
});
it("does not treat a scheduled retry as actively covered work", async () => {
const { companyId, agentId } = await createCompany("PBY");
const parentId = await insertIssue({ companyId, identifier: "PBY-1", title: "Parent", status: "blocked" });
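
The fixture above composes the incident key as a colon-joined tuple. A hypothetical round-trip sketch of the format the test assumes — `buildIncidentKey`/`parseIncidentKey` and the parts shape are invented here for illustration; the real parser is `parseIssueGraphLivenessIncidentKey` in `recovery/origins.ts` and may differ:

```typescript
interface IncidentKeyParts {
  companyId: string;
  issueId: string;     // the blocked parent issue
  reason: string;      // e.g. "blocked_by_cancelled_issue"
  leafIssueId: string; // the leaf blocker that triggered the escalation
}

// Compose the key exactly as the fixture does: prefix + four colon-joined parts.
// Colon is a safe separator because the ids are UUIDs (hyphens, no colons).
function buildIncidentKey(parts: IncidentKeyParts): string {
  return ["harness_liveness", parts.companyId, parts.issueId, parts.reason, parts.leafIssueId].join(":");
}

// Reject anything that is not a five-segment key with the expected prefix.
function parseIncidentKey(key: string | null): IncidentKeyParts | null {
  if (!key) return null;
  const segments = key.split(":");
  if (segments.length !== 5 || segments[0] !== "harness_liveness") return null;
  const [, companyId, issueId, reason, leafIssueId] = segments;
  return { companyId, issueId, reason, leafIssueId };
}
```

Being able to recover the parent and leaf issue ids from `originId` alone is what lets the blocker-attention query treat a company-wide escalation as covering the same leaf reached through another blocked graph.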

View File

@@ -98,7 +98,8 @@ function readRunLogLimitBytes(value: unknown) {
function readLiveRunsQueryInt(value: unknown, max: number, fallback = 0) {
const parsed = Number(value);
if (!Number.isFinite(parsed)) return fallback;
return Math.max(0, Math.min(max, Math.trunc(parsed)));
if (parsed <= 0) return fallback;
return Math.min(max, Math.trunc(parsed));
}
export function agentRoutes(
@@ -2703,8 +2704,8 @@ export function agentRoutes(
const companyId = req.params.companyId as string;
assertCompanyAccess(req, companyId);
const minCount = readLiveRunsQueryInt(req.query.minCount, 50);
const limit = readLiveRunsQueryInt(req.query.limit, 50);
const minCount = readLiveRunsQueryInt(req.query.minCount, 50, 50);
const limit = readLiveRunsQueryInt(req.query.limit, 50, 50);
const columns = {
id: heartbeatRuns.id,
@@ -2744,8 +2745,8 @@ export function agentRoutes(
)
.orderBy(desc(heartbeatRuns.createdAt));
const liveRuns = limit > 0 ? await liveRunsQuery.limit(limit) : await liveRunsQuery;
const targetRunCount = limit > 0 ? Math.min(minCount, limit) : minCount;
const liveRuns = await liveRunsQuery.limit(limit);
const targetRunCount = Math.min(minCount, limit);
if (targetRunCount > 0 && liveRuns.length < targetRunCount) {
const activeIds = liveRuns.map((r) => r.id);
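
Restated outside Express, the clamping rule the tests above pin down looks like this — the function body mirrors the new `readLiveRunsQueryInt`, and the standalone calls are illustrative:

```typescript
// With fallback = 50, a missing, non-numeric, zero, or negative limit all
// collapse to the capped default, and anything above `max` is clamped.
function readLiveRunsQueryInt(value: unknown, max: number, fallback = 0): number {
  const parsed = Number(value);
  if (!Number.isFinite(parsed)) return fallback; // "abc", undefined, NaN
  if (parsed <= 0) return fallback;              // ?limit=0 means "use the cap"
  return Math.min(max, Math.trunc(parsed));      // clamp and drop fractions
}

readLiveRunsQueryInt("0", 50, 50);   // 50 — explicit zero falls back to the cap
readLiveRunsQueryInt("25", 50, 50);  // 25 — in-range values pass through
readLiveRunsQueryInt("500", 50, 50); // 50 — clamped to max
readLiveRunsQueryInt("abc", 50, 50); // 50 — non-numeric falls back
```

The behavioral change from the old version is the `parsed <= 0` branch: previously `Math.max(0, …)` let an explicit `?limit=0` select an unbounded query, which is exactly the unlimited-polling path the capped default now closes.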

View File

@@ -52,6 +52,7 @@ import {
issueTreeControlService,
type ActiveIssueTreePauseHoldGate,
} from "./issue-tree-control.js";
import { parseIssueGraphLivenessIncidentKey } from "./recovery/origins.js";
const ALL_ISSUE_STATUSES = ["backlog", "todo", "in_progress", "in_review", "blocked", "done", "cancelled"];
const MAX_ISSUE_COMMENT_PAGE_LIMIT = 500;
@@ -1174,12 +1175,12 @@ async function listIssueBlockerAttentionMap(
}
}
const reviewNodeIds = [...nodesById.values()]
.filter((node) => node.status === "in_review")
const explicitWaitCandidateIds = [...nodesById.values()]
.filter((node) => node.status !== "done")
.map((node) => node.id);
const explicitWaitingIssueIds = new Set<string>();
if (reviewNodeIds.length > 0) {
for (const chunk of chunkList(reviewNodeIds, ISSUE_LIST_RELATED_QUERY_CHUNK_SIZE)) {
if (explicitWaitCandidateIds.length > 0) {
for (const chunk of chunkList(explicitWaitCandidateIds, ISSUE_LIST_RELATED_QUERY_CHUNK_SIZE)) {
const interactionRows: Array<{ issueId: string }> = await dbOrTx
.select({ issueId: issueThreadInteractions.issueId })
.from(issueThreadInteractions)
@@ -1204,22 +1205,28 @@ async function listIssueBlockerAttentionMap(
),
);
for (const row of approvalRows) explicitWaitingIssueIds.add(row.issueId);
}
const recoveryRows: Array<{ originId: string | null }> = await dbOrTx
.select({ originId: issues.originId })
.from(issues)
.where(
and(
eq(issues.companyId, companyId),
eq(issues.originKind, BLOCKER_ATTENTION_OPEN_RECOVERY_ORIGIN_KIND),
isNull(issues.hiddenAt),
inArray(issues.originId, chunk),
notInArray(issues.status, BLOCKER_ATTENTION_OPEN_RECOVERY_TERMINAL_STATUSES),
),
);
for (const row of recoveryRows) {
if (row.originId) explicitWaitingIssueIds.add(row.originId);
}
// Recovery rows are intentionally company-wide: a liveness escalation for
// the same leaf blocker represents an active waiting path even when that
// blocker is reached through another blocked graph.
const recoveryRows: Array<{ id: string; originId: string | null }> = await dbOrTx
.select({ id: issues.id, originId: issues.originId })
.from(issues)
.where(
and(
eq(issues.companyId, companyId),
eq(issues.originKind, BLOCKER_ATTENTION_OPEN_RECOVERY_ORIGIN_KIND),
isNull(issues.hiddenAt),
notInArray(issues.status, BLOCKER_ATTENTION_OPEN_RECOVERY_TERMINAL_STATUSES),
),
);
for (const row of recoveryRows) {
const parsed = parseIssueGraphLivenessIncidentKey(row.originId);
if (!parsed || parsed.companyId !== companyId) continue;
explicitWaitingIssueIds.add(row.id);
explicitWaitingIssueIds.add(parsed.issueId);
explicitWaitingIssueIds.add(parsed.leafIssueId);
}
}
@@ -1257,8 +1264,11 @@ async function listIssueBlockerAttentionMap(
if (node.status === "done") {
return { covered: true, stalled: false, sampleBlockerIdentifier: nodeSample, sampleStalledBlockerIdentifier: null };
}
if (explicitWaitingIssueIds.has(node.id)) {
return { covered: true, stalled: false, sampleBlockerIdentifier: nodeSample, sampleStalledBlockerIdentifier: null };
}
if (node.status === "in_review") {
const hasWaitingPath = activeIssueIds.has(node.id) || Boolean(node.assigneeUserId) || explicitWaitingIssueIds.has(node.id);
const hasWaitingPath = activeIssueIds.has(node.id) || Boolean(node.assigneeUserId);
if (hasWaitingPath) {
return { covered: true, stalled: false, sampleBlockerIdentifier: nodeSample, sampleStalledBlockerIdentifier: null };
}

View File

@@ -127,7 +127,6 @@ export function decideRunLivenessContinuation(input: {
if (budgetBlocked) {
return { kind: "skip", reason: "budget hard stop blocks continuation" };
}
const currentAttempt = readContinuationAttempt(run.continuationAttempt);
if (currentAttempt >= maxAttempts) {
return {

View File

@@ -2250,10 +2250,16 @@ export function recoveryService(db: Db, deps: { enqueueWakeup: RecoveryWakeup })
}) {
const blockerIds = await existingBlockerIssueIds(input.issue.companyId, input.issue.id);
const nextBlockerIds = [...new Set([...blockerIds, input.escalationIssueId])];
const isAlreadyBlockedByEscalation = blockerIds.includes(input.escalationIssueId);
const isAlreadyBlocked = input.issue.status === "blocked";
if (isAlreadyBlockedByEscalation && isAlreadyBlocked) {
return input.issue;
}
const update: Partial<typeof issues.$inferInsert> & { blockedByIssueIds: string[] } = {
blockedByIssueIds: nextBlockerIds,
};
if (input.issue.status !== "blocked") {
if (!isAlreadyBlocked) {
update.status = "blocked";
}
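
The early return added above makes the escalation link idempotent: re-running recovery against an issue that is already blocked by the escalation is a no-op rather than a fresh write. A plain-object sketch of that decision, with simplified shapes (`IssueRow` and `applyEscalationBlock` are illustrative; the real code issues a Drizzle partial update):

```typescript
interface IssueRow {
  id: string;
  status: string;
  blockedByIssueIds: string[];
}

function applyEscalationBlock(issue: IssueRow, escalationIssueId: string): IssueRow {
  const alreadyLinked = issue.blockedByIssueIds.includes(escalationIssueId);
  const alreadyBlocked = issue.status === "blocked";
  // Nothing to write: return the same object so callers can detect the no-op.
  if (alreadyLinked && alreadyBlocked) return issue;
  return {
    ...issue,
    status: "blocked",
    // Set-union keeps the blocker list duplicate-free across repeated runs.
    blockedByIssueIds: [...new Set([...issue.blockedByIssueIds, escalationIssueId])],
  };
}
```

Skipping the write when nothing changes also avoids bumping `updatedAt` on every reconcile pass, which is what the heartbeat test's `sourceAfterSecond?.updatedAt` comparison above is guarding.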

View File

@@ -1,6 +1,6 @@
// @vitest-environment jsdom
import { act, createRef, forwardRef, useImperativeHandle } from "react";
import { act, createRef, forwardRef, useImperativeHandle, useState } from "react";
import type { ReactNode } from "react";
import { createRoot } from "react-dom/client";
import { MemoryRouter } from "react-router-dom";
@@ -601,6 +601,71 @@ describe("IssueChatThread", () => {
scrollHost.remove();
});
it("cancels jump-to-latest settling when the user scrolls manually", () => {
vi.useFakeTimers();
container.remove();
const scrollHost = document.createElement("main");
scrollHost.id = "main-content";
scrollHost.style.overflowY = "auto";
scrollHost.style.overflow = "auto";
scrollHost.style.height = "640px";
document.body.appendChild(scrollHost);
container = document.createElement("div");
scrollHost.appendChild(container);
const elementScrollToMock = vi.fn();
scrollHost.scrollTo = elementScrollToMock as unknown as typeof scrollHost.scrollTo;
const originalScrollIntoView = Element.prototype.scrollIntoView;
const scrollIntoViewMock = vi.fn();
Element.prototype.scrollIntoView = scrollIntoViewMock as unknown as typeof Element.prototype.scrollIntoView;
const root = createRoot(container);
act(() => {
root.render(
<MemoryRouter>
<IssueChatThread
comments={issueChatLongThreadComments}
linkedRuns={issueChatLongThreadLinkedRuns}
timelineEvents={issueChatLongThreadEvents}
liveRuns={[]}
agentMap={issueChatLongThreadAgentMap}
currentUserId="user-board"
onAdd={async () => {}}
enableLiveTranscriptPolling={false}
transcriptsByRunId={issueChatLongThreadTranscriptsByRunId}
hasOutputForRun={(runId) => issueChatLongThreadTranscriptsByRunId.has(runId)}
/>
</MemoryRouter>,
);
});
const jump = Array.from(container.querySelectorAll("button")).find(
(button) => button.textContent === "Jump to latest",
) as HTMLButtonElement | undefined;
expect(jump).toBeDefined();
act(() => {
jump?.click();
});
expect(elementScrollToMock.mock.calls.some(([arg]) => hasSmoothScrollBehavior(arg))).toBe(true);
const scrollCallsAfterClick = elementScrollToMock.mock.calls.length;
act(() => {
scrollHost.dispatchEvent(new WheelEvent("wheel", { bubbles: true }));
vi.advanceTimersByTime(500);
});
expect(elementScrollToMock).toHaveBeenCalledTimes(scrollCallsAfterClick);
expect(scrollIntoViewMock).not.toHaveBeenCalled();
Element.prototype.scrollIntoView = originalScrollIntoView;
act(() => {
root.unmount();
});
scrollHost.remove();
});
// Regression for PAP-2672: when the merged feed ends with a non-comment row
// (run/timeline/embedded output) we still want Jump to latest to land on the
// last comment, not whichever activity row sorts last.
@@ -757,6 +822,78 @@ describe("IssueChatThread", () => {
});
});
it("uses comments rendered by onRefreshLatestComments before resolving latest", async () => {
const scrolledIds: string[] = [];
const originalScrollIntoView = Element.prototype.scrollIntoView;
Element.prototype.scrollIntoView = vi.fn(function scrollIntoView(this: Element) {
scrolledIds.push(this.id);
}) as unknown as typeof Element.prototype.scrollIntoView;
const olderComment = {
id: "comment-before-refresh",
companyId: "company-1",
issueId: "issue-1",
authorAgentId: "agent-perf-codex",
authorUserId: null,
body: "Older loaded comment",
createdAt: new Date("2026-04-06T12:00:00.000Z"),
updatedAt: new Date("2026-04-06T12:00:00.000Z"),
};
const latestComment = {
...olderComment,
id: "comment-after-refresh",
body: "Latest fetched comment",
createdAt: new Date("2026-04-06T12:01:00.000Z"),
updatedAt: new Date("2026-04-06T12:01:00.000Z"),
};
function RefreshingThread() {
const [comments, setComments] = useState([olderComment]);
return (
<IssueChatThread
comments={comments}
linkedRuns={[]}
timelineEvents={[]}
liveRuns={[]}
agentMap={issueChatLongThreadAgentMap}
currentUserId="user-board"
onAdd={async () => {}}
enableLiveTranscriptPolling={false}
onRefreshLatestComments={async () => {
setComments([olderComment, latestComment]);
await new Promise((resolve) => window.requestAnimationFrame(resolve));
}}
/>
);
}
const root = createRoot(container);
await act(async () => {
root.render(
<MemoryRouter>
<RefreshingThread />
</MemoryRouter>,
);
});
const jump = Array.from(container.querySelectorAll("button")).find(
(button) => button.textContent === "Jump to latest",
) as HTMLButtonElement | undefined;
expect(jump).toBeDefined();
await act(async () => {
jump?.click();
await new Promise((resolve) => window.requestAnimationFrame(resolve));
});
expect(scrolledIds).toContain("comment-comment-after-refresh");
Element.prototype.scrollIntoView = originalScrollIntoView;
act(() => {
root.unmount();
});
});
it("findLatestCommentMessageIndex prefers the last comment-anchored row (PAP-2672)", () => {
const messages = [
{ metadata: { custom: { anchorId: "comment-a" } } },

View File

@@ -1534,7 +1534,7 @@ function IssueChatAssistantMessage({
}}
>
<Square className="mr-2 h-3.5 w-3.5 fill-current" />
{isStoppingRun ? "Stopping" : "Stop run"}
{isStoppingRun ? "Stopping..." : "Stop run"}
</DropdownMenuItem>
) : null}
{runHref ? (
@@ -3101,6 +3101,8 @@ export function IssueChatThread({
const lastUserMessageIdRef = useRef<string | null>(null);
const spacerBaselineAnchorRef = useRef<string | null>(null);
const spacerInitialReserveRef = useRef(0);
const latestSettleTimeoutsRef = useRef<number[]>([]);
const latestSettleCleanupRef = useRef<(() => void) | null>(null);
const [bottomSpacerHeight, setBottomSpacerHeight] = useState(0);
const displayLiveRuns = useMemo(() => {
const deduped = new Map<string, LiveRunForIssue>();
@@ -3141,6 +3143,17 @@ export function IssueChatThread({
}
return ids;
}, [displayLiveRuns]);
const clearLatestSettleTimeouts = useCallback(() => {
for (const timeout of latestSettleTimeoutsRef.current) {
window.clearTimeout(timeout);
}
latestSettleTimeoutsRef.current = [];
latestSettleCleanupRef.current?.();
latestSettleCleanupRef.current = null;
}, []);
useEffect(() => clearLatestSettleTimeouts, [clearLatestSettleTimeouts]);
const { transcriptByRun, hasOutputForRun } = useLiveRunTranscripts({
runs: enableLiveTranscriptPolling ? transcriptRuns : [],
companyId,
@@ -3194,6 +3207,8 @@ export function IssueChatThread({
stableMessageCacheRef.current = stabilized.cache;
return stabilized.messages;
}, [rawMessages]);
const latestMessagesRef = useRef<readonly ThreadMessage[]>(messages);
latestMessagesRef.current = messages;
const isRunning = displayLiveRuns.some((run) => run.status === "queued" || run.status === "running");
const unresolvedBlockers = useMemo(
@@ -3226,9 +3241,14 @@ export function IssueChatThread({
function scrollToThreadAnchor(
anchorId: string,
options?: { align?: "start" | "center" | "end" | "auto"; behavior?: ScrollBehavior },
messageSnapshot: readonly ThreadMessage[] = messages,
) {
const virtualIndex = messageAnchorIndex.get(anchorId);
if (useVirtualizedThread && virtualIndex !== undefined) {
const snapshotUsesVirtualizer = messageSnapshot.length >= VIRTUALIZED_THREAD_ROW_THRESHOLD;
const virtualIndex =
messageSnapshot === messages
? messageAnchorIndex.get(anchorId)
: findMessageAnchorIndex(messageSnapshot, anchorId);
if (snapshotUsesVirtualizer && virtualIndex !== undefined && virtualIndex >= 0) {
if (!virtualizedThreadRef.current) return false;
virtualizedThreadRef.current.scrollToIndex(virtualIndex, {
align: options?.align ?? "center",
@@ -3356,26 +3376,35 @@ export function IssueChatThread({
bottomAnchorRef.current?.scrollIntoView({ behavior: "smooth", block: "end" });
}
// Walks the thread by anchor and lands on the latest `comment-*` row, with
// a short series of settle passes. The virtualizer estimates row sizes for
// unmeasured rows, and that estimate undershoots tall markdown comments —
// so the first scroll often lands above the actual bottom and the user
// ends up clicking Jump to latest repeatedly to converge. Re-issuing the
// scroll after measurements catch up lets one click reach the actual
// latest comment (PAP-2672 follow-up).
function scrollToLatestCommentWithSettle() {
const latestCommentIndex = findLatestCommentMessageIndex(messages);
// Lands on the latest `comment-*` row and then drives the scroll the rest
// of the way home as the virtualizer's per-row measurements arrive.
//
// The virtualizer estimates 220px for unmeasured rows. On long threads
// with tall markdown comments (PAP-2536 et al.), totalSize is hugely
// underestimated until rows render and get measured. A single scroll
// lands above the actual bottom; rendered rows then expand, the layout
// grows, and the user has to keep clicking Jump-to-latest to walk closer
// to the real bottom. The convergence loop below issues `scrollIntoView`
// on the latest comment element on every tick until the DOM bottom of
// that element is at the scroll container's bottom (or scroll position
// and content height stop changing).
function scrollToLatestCommentWithSettle(messageSnapshot: readonly ThreadMessage[] = latestMessagesRef.current) {
const latestCommentIndex = findLatestCommentMessageIndex(messageSnapshot);
if (latestCommentIndex < 0) {
jumpToLatestFallback();
return;
}
const latestCommentAnchor = issueChatMessageAnchorId(messages[latestCommentIndex]);
const latestCommentAnchor = issueChatMessageAnchorId(messageSnapshot[latestCommentIndex]);
if (!latestCommentAnchor) {
jumpToLatestFallback();
return;
}
const initial = scrollToThreadAnchor(latestCommentAnchor, { align: "end", behavior: "smooth" });
const initial = scrollToThreadAnchor(
latestCommentAnchor,
{ align: "end", behavior: "smooth" },
messageSnapshot,
);
if (!initial) {
jumpToLatestFallback();
return;
@@ -3383,41 +3412,123 @@ export function IssueChatThread({
if (typeof window === "undefined") return;
const settleDelays = [380, 760, 1140];
settleDelays.forEach((delay) => {
window.setTimeout(() => {
const el = document.getElementById(latestCommentAnchor);
if (el) {
el.scrollIntoView({ behavior: "smooth", block: "end" });
return;
}
// The row may still be outside the virtualizer's render buffer; nudge
// the offset so it gets mounted, then the next pass can align with
// real DOM measurements.
const startedAt = (typeof performance !== "undefined" ? performance.now() : Date.now());
const MAX_DURATION_MS = 4000;
const TICK_MS = 80;
const TOLERANCE_PX = 4;
clearLatestSettleTimeouts();
const resolveScrollContainer = (): HTMLElement | null =>
(document.getElementById("main-content") as HTMLElement | null);
const cancelTarget = resolveScrollContainer() ?? window;
let lastScrollTop = -1;
let lastScrollHeight = -1;
let stableTicks = 0;
let cancelled = false;
const cancel = () => {
cancelled = true;
};
const cleanup = () => {
cancelTarget.removeEventListener("wheel", cancel);
cancelTarget.removeEventListener("touchstart", cancel);
};
cancelTarget.addEventListener("wheel", cancel, { once: true, passive: true });
cancelTarget.addEventListener("touchstart", cancel, { once: true, passive: true });
latestSettleCleanupRef.current = cleanup;
const finish = () => {
cleanup();
latestSettleCleanupRef.current = null;
for (const timeout of latestSettleTimeoutsRef.current) {
window.clearTimeout(timeout);
}
latestSettleTimeoutsRef.current = [];
};
const scheduleTick = (delay: number) => {
const timeout = window.setTimeout(() => {
latestSettleTimeoutsRef.current = latestSettleTimeoutsRef.current.filter((entry) => entry !== timeout);
tick();
}, delay);
latestSettleTimeoutsRef.current.push(timeout);
};
const tick = () => {
const now = (typeof performance !== "undefined" ? performance.now() : Date.now());
if (cancelled || now - startedAt > MAX_DURATION_MS) {
finish();
return;
}
const el = document.getElementById(latestCommentAnchor);
if (!el) {
// Row hasn't been rendered into the virtualizer's buffer yet — nudge
// the offset (instant) so it gets mounted, then keep settling.
virtualizedThreadRef.current?.scrollToIndex(latestCommentIndex, {
align: "end",
behavior: "auto",
});
}, delay);
});
scheduleTick(TICK_MS);
return;
}
const container = resolveScrollContainer();
const containerBottom = container
? container.getBoundingClientRect().bottom
: window.innerHeight;
const elBottom = el.getBoundingClientRect().bottom;
const offBottom = elBottom - containerBottom;
if (Math.abs(offBottom) > TOLERANCE_PX) {
el.scrollIntoView({ behavior: "smooth", block: "end" });
}
const currentScrollTop = container?.scrollTop ?? window.scrollY;
const currentScrollHeight = container?.scrollHeight ?? document.documentElement.scrollHeight;
const scrollStable = Math.abs(currentScrollTop - lastScrollTop) < 1;
const heightStable = currentScrollHeight === lastScrollHeight;
const atBottom = Math.abs(offBottom) <= TOLERANCE_PX;
if (scrollStable && heightStable && atBottom) {
stableTicks += 1;
if (stableTicks >= 3) {
finish();
return;
}
} else {
stableTicks = 0;
}
lastScrollTop = currentScrollTop;
lastScrollHeight = currentScrollHeight;
scheduleTick(TICK_MS);
};
// Hold the first iteration off for one frame so the initial smooth
// scroll has begun (and the virtualizer has rendered the buffer around
// the target) before we start settling.
scheduleTick(120);
}
function handleJumpToLatest() {
if (onRefreshLatestComments) {
// Refetching from page 0 (newest first) brings any comments that
// arrived after the initial load into the cache before we scroll —
// otherwise we'd land on the latest *loaded* row rather than the
// absolute newest, which is what PAP-2672 reopened on.
// Refetching the comments query (page 0 first) brings any comment that
// arrived after the initial load — including ones live updates may
// have missed during reconnects — into the loaded set before we
// resolve the latest target. Otherwise we'd land on the latest
// *loaded* comment but not the absolute newest. (PAP-2672 follow-up.)
const refreshed = onRefreshLatestComments();
if (refreshed && typeof (refreshed as Promise<unknown>).then === "function") {
(refreshed as Promise<unknown>).then(
() => scrollToLatestCommentWithSettle(),
() => scrollToLatestCommentWithSettle(),
() => scrollToLatestCommentWithSettle(latestMessagesRef.current),
() => scrollToLatestCommentWithSettle(latestMessagesRef.current),
);
return;
}
}
scrollToLatestCommentWithSettle();
scrollToLatestCommentWithSettle(latestMessagesRef.current);
}
const stableOnVote = useStableEvent(onVote);

View File

@@ -523,6 +523,153 @@ describe("IssuesList", () => {
});
});
it("hides the workflow blocker chip when a sub-issue is blocked only by its previous sibling", async () => {
const firstChild = createIssue({
id: "issue-first-child",
identifier: "PAP-1",
parentId: "issue-parent",
title: "First child",
status: "todo",
createdAt: new Date("2026-04-01T00:00:00.000Z"),
});
const secondChild = createIssue({
id: "issue-second-child",
identifier: "PAP-2",
parentId: "issue-parent",
title: "Second child",
status: "blocked",
blockedBy: [
{
id: "issue-first-child",
identifier: "PAP-1",
title: "First child",
status: "todo",
priority: "medium",
assigneeAgentId: null,
assigneeUserId: null,
},
],
createdAt: new Date("2026-04-02T00:00:00.000Z"),
});
const { root } = renderWithQueryClient(
<IssuesList
issues={[secondChild, firstChild]}
agents={[]}
projects={[]}
viewStateKey="paperclip:test-issues"
defaultSortField="workflow"
onUpdateIssue={() => undefined}
/>,
container,
);
await waitForAssertion(() => {
const rows = Array.from(container.querySelectorAll('[data-testid="issue-row"]'));
expect(rows).toHaveLength(2);
expect(rows.map((row) => row.getAttribute("data-step"))).toEqual(["1", "2"]);
expect(container.textContent).not.toContain("blocked by PAP-1");
});
act(() => {
root.unmount();
});
});
it("collapses multiple workflow blocker chips to the first blocker and a count", async () => {
const issueDone = createIssue({
id: "issue-done",
identifier: "PAP-1",
title: "Done first",
status: "done",
createdAt: new Date("2026-04-01T00:00:00.000Z"),
});
const firstBlocker = createIssue({
id: "issue-first-blocker",
identifier: "PAP-2",
title: "First blocker",
status: "todo",
createdAt: new Date("2026-04-02T00:00:00.000Z"),
});
const secondBlocker = createIssue({
id: "issue-second-blocker",
identifier: "PAP-3",
title: "Second blocker",
status: "todo",
createdAt: new Date("2026-04-03T00:00:00.000Z"),
});
const thirdBlocker = createIssue({
id: "issue-third-blocker",
identifier: "PAP-4",
title: "Third blocker",
status: "todo",
createdAt: new Date("2026-04-04T00:00:00.000Z"),
});
const issueBlocked = createIssue({
id: "issue-blocked",
identifier: "PAP-5",
title: "Blocked issue",
status: "blocked",
blockedBy: [
{
id: "issue-first-blocker",
identifier: "PAP-2",
title: "First blocker",
status: "todo",
priority: "medium",
assigneeAgentId: null,
assigneeUserId: null,
},
{
id: "issue-second-blocker",
identifier: "PAP-3",
title: "Second blocker",
status: "todo",
priority: "medium",
assigneeAgentId: null,
assigneeUserId: null,
},
{
id: "issue-third-blocker",
identifier: "PAP-4",
title: "Third blocker",
status: "todo",
priority: "medium",
assigneeAgentId: null,
assigneeUserId: null,
},
],
createdAt: new Date("2026-04-05T00:00:00.000Z"),
});
const { root } = renderWithQueryClient(
<IssuesList
issues={[issueBlocked, thirdBlocker, secondBlocker, firstBlocker, issueDone]}
agents={[]}
projects={[]}
viewStateKey="paperclip:test-issues"
defaultSortField="workflow"
onUpdateIssue={() => undefined}
/>,
container,
);
await waitForAssertion(() => {
expect(container.textContent).toContain("blocked by PAP-2");
expect(container.textContent).toContain("... and 2 more");
expect(container.textContent).not.toContain("blocked by PAP-3");
expect(container.textContent).not.toContain("blocked by PAP-4");
const blockerButtons = Array.from(container.querySelectorAll("button"))
.filter((button) => button.textContent?.includes("blocked by"));
expect(blockerButtons).toHaveLength(1);
expect(blockerButtons[0]?.textContent).toBe("blocked by PAP-2 · step 2 ... and 2 more");
});
act(() => {
root.unmount();
});
});
it("uses hierarchical checklist step numbers when nested rows render inline", async () => {
const firstRoot = createIssue({
id: "issue-first-root",
@@ -909,7 +1056,7 @@ describe("IssuesList", () => {
});
it("waits for the desktop main scroll container before rendering more local rows", async () => {
const manyIssues = Array.from({ length: 420 }, (_, index) =>
const manyIssues = Array.from({ length: 120 }, (_, index) =>
createIssue({
id: `issue-${index + 1}`,
identifier: `PAP-${index + 1}`,

View File

@@ -282,6 +282,51 @@ function buildChecklistStepNumberMap(issues: Issue[], nestingEnabled: boolean):
return stepNumberByIssueId;
}
function buildPreviousSiblingIssueIdMap(issues: Issue[], nestingEnabled: boolean): Map<string, string> {
const previousSiblingByIssueId = new Map<string, string>();
if (!nestingEnabled) {
const previousByParentId = new Map<string, Issue>();
for (const issue of issues) {
if (!issue.parentId) continue;
const previousSibling = previousByParentId.get(issue.parentId);
if (previousSibling) {
previousSiblingByIssueId.set(issue.id, previousSibling.id);
}
previousByParentId.set(issue.parentId, issue);
}
return previousSiblingByIssueId;
}
const { roots, childMap } = buildIssueTree(issues);
const visit = (siblings: Issue[]) => {
siblings.forEach((issue, index) => {
const previousSibling = index > 0 ? siblings[index - 1] : null;
if (issue.parentId && previousSibling?.parentId === issue.parentId) {
previousSiblingByIssueId.set(issue.id, previousSibling.id);
}
visit(childMap.get(issue.id) ?? []);
});
};
visit(roots);
return previousSiblingByIssueId;
}
function shouldSuppressSinglePreviousSiblingBlockerChip(
issue: Issue,
unresolvedVisibleBlockerIds: string[],
previousSiblingIssueId: string | undefined,
): boolean {
return Boolean(
issue.parentId
&& previousSiblingIssueId
&& (issue.blockedBy ?? []).length === 1
&& unresolvedVisibleBlockerIds.length === 1
&& unresolvedVisibleBlockerIds[0] === previousSiblingIssueId,
);
}
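A standalone restatement of the suppression rule above, using a pared-down issue shape (the real `Issue` type in `IssuesList` carries many more fields). A chip is hidden only when a checklist row's single unresolved, visible blocker is exactly the sibling rendered directly above it, since that dependency is already implied by checklist order:

```typescript
// Minimal sketch of shouldSuppressSinglePreviousSiblingBlockerChip.
// MiniIssue is a stand-in for the full Issue type.
interface MiniIssue {
  id: string;
  parentId?: string;
  blockedBy?: string[];
}

function shouldSuppressChip(
  issue: MiniIssue,
  unresolvedVisibleBlockerIds: string[],
  previousSiblingIssueId: string | undefined,
): boolean {
  return Boolean(
    issue.parentId
    && previousSiblingIssueId
    && (issue.blockedBy ?? []).length === 1
    && unresolvedVisibleBlockerIds.length === 1
    && unresolvedVisibleBlockerIds[0] === previousSiblingIssueId,
  );
}

const step2: MiniIssue = { id: "issue-2", parentId: "root", blockedBy: ["issue-1"] };
// Blocked only by the row directly above: suppressed.
console.log(shouldSuppressChip(step2, ["issue-1"], "issue-1")); // true
// Blocked by a non-adjacent step: the chip stays.
console.log(shouldSuppressChip(step2, ["issue-1"], "issue-0")); // false
```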
/* ── Component ── */
interface Agent {
@@ -878,6 +923,7 @@ export function IssuesList({
const visibleIssueIds = new Set(filtered.map((issue) => issue.id));
const stepNumberByIssueId = buildChecklistStepNumberMap(filtered, viewState.nestingEnabled);
const previousSiblingIssueIdByIssueId = buildPreviousSiblingIssueIdMap(filtered, viewState.nestingEnabled);
const unresolvedVisibleBlockersByIssueId = new Map<string, string[]>();
filtered.forEach((issue) => {
@@ -889,7 +935,12 @@ export function IssuesList({
if (!blockerIssue) return false;
return blockerIssue.status !== "done" && blockerIssue.status !== "cancelled";
});
unresolvedVisibleBlockersByIssueId.set(issue.id, unresolvedVisible);
const shouldSuppressChip = shouldSuppressSinglePreviousSiblingBlockerChip(
issue,
unresolvedVisible,
previousSiblingIssueIdByIssueId.get(issue.id),
);
unresolvedVisibleBlockersByIssueId.set(issue.id, shouldSuppressChip ? [] : unresolvedVisible);
});
const firstActionable = filtered.find((issue) => isActionableWorkflowStatus(issue.status)) ?? null;
@@ -1388,36 +1439,49 @@ export function IssuesList({
const doneRowTitleClass = checklistMeta && issue.status === "done"
? "text-muted-foreground"
: undefined;
const checklistDependencyChips = checklistMeta && unresolvedVisibleBlockers.length > 0 ? (
<>
{unresolvedVisibleBlockers.map((blockerId) => {
const blockerIssue = issueById.get(blockerId);
if (!blockerIssue) return null;
const label = blockerIssue.identifier ?? blockerIssue.id.slice(0, 8);
const blockerStep = checklistMeta.stepNumberByIssueId.get(blockerId);
const blockerStepSuffix = blockerStep ? ` \u00b7 step ${blockerStep}` : "";
const chipLabel = `blocked by ${label}${blockerStepSuffix}`;
return (
<button
key={blockerId}
type="button"
onClick={(event) => {
event.preventDefault();
event.stopPropagation();
const target = document.getElementById(`issue-workflow-row-${blockerId}`);
if (!target) return;
target.scrollIntoView({ behavior: "smooth", block: "nearest" });
target.focus?.();
}}
className="inline-flex items-center rounded-full border border-amber-400/45 bg-amber-50/60 px-1.5 py-0.5 text-[10px] font-medium text-amber-700 hover:bg-amber-100/80 dark:border-amber-300/35 dark:bg-amber-400/10 dark:text-amber-300"
title={chipLabel}
aria-label={chipLabel}
>
{chipLabel}
</button>
);
})}
</>
const visibleBlockerChips = unresolvedVisibleBlockers
.map((blockerId) => {
const blockerIssue = issueById.get(blockerId);
if (!blockerIssue) return null;
const label = blockerIssue.identifier ?? blockerIssue.id.slice(0, 8);
const blockerStep = checklistMeta?.stepNumberByIssueId.get(blockerId);
const blockerStepSuffix = blockerStep ? ` \u00b7 step ${blockerStep}` : "";
return { blockerId, chipLabel: `blocked by ${label}${blockerStepSuffix}` };
})
.filter((chip): chip is { blockerId: string; chipLabel: string } => chip !== null);
const firstVisibleBlockerChip = visibleBlockerChips[0] ?? null;
const additionalVisibleBlockerCount = Math.max(visibleBlockerChips.length - 1, 0);
const additionalVisibleBlockerLabel = additionalVisibleBlockerCount > 0
? ` ... and ${additionalVisibleBlockerCount} more`
: "";
const firstVisibleBlockerDisplayLabel = firstVisibleBlockerChip
? `${firstVisibleBlockerChip.chipLabel}${additionalVisibleBlockerLabel}`
: "";
const hiddenVisibleBlockerLabels = visibleBlockerChips
.slice(1)
.map((chip) => chip.chipLabel)
.join(", ");
const firstVisibleBlockerTitle = additionalVisibleBlockerCount > 0
? `${firstVisibleBlockerDisplayLabel}: ${hiddenVisibleBlockerLabels}`
: firstVisibleBlockerDisplayLabel;
const checklistDependencyChips = checklistMeta && firstVisibleBlockerChip ? (
<button
key={firstVisibleBlockerChip.blockerId}
type="button"
onClick={(event) => {
event.preventDefault();
event.stopPropagation();
const target = document.getElementById(`issue-workflow-row-${firstVisibleBlockerChip.blockerId}`);
if (!target) return;
target.scrollIntoView({ behavior: "smooth", block: "nearest" });
target.focus?.();
}}
className="inline-flex items-center rounded-full border border-amber-400/45 bg-amber-50/60 px-1.5 py-0.5 text-[10px] font-medium text-amber-700 hover:bg-amber-100/80 dark:border-amber-300/35 dark:bg-amber-400/10 dark:text-amber-300"
title={firstVisibleBlockerTitle}
aria-label={firstVisibleBlockerTitle}
>
{firstVisibleBlockerDisplayLabel}
</button>
) : null;
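The chip-collapsing logic above can be summarized as a pure function: render only the first blocker label, fold the rest into "... and N more", and keep the full list in the tooltip/aria text. A sketch, matching the label the test at the top of this diff asserts:

```typescript
// Collapse a list of blocker chips into one display label plus a full tooltip.
interface BlockerChip {
  blockerId: string;
  chipLabel: string;
}

function collapseBlockerChips(chips: BlockerChip[]): { display: string; title: string } | null {
  const first = chips[0];
  if (!first) return null;
  const extra = chips.length - 1;
  const display = extra > 0 ? `${first.chipLabel} ... and ${extra} more` : first.chipLabel;
  const hidden = chips.slice(1).map((chip) => chip.chipLabel).join(", ");
  const title = extra > 0 ? `${display}: ${hidden}` : display;
  return { display, title };
}

const collapsed = collapseBlockerChips([
  { blockerId: "b2", chipLabel: "blocked by PAP-2 · step 2" },
  { blockerId: "b3", chipLabel: "blocked by PAP-3 · step 3" },
  { blockerId: "b4", chipLabel: "blocked by PAP-4" },
]);
console.log(collapsed?.display); // "blocked by PAP-2 · step 2 ... and 2 more"
console.log(collapsed?.title);
// "blocked by PAP-2 · step 2 ... and 2 more: blocked by PAP-3 · step 3, blocked by PAP-4"
```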
return (

View File

@@ -366,6 +366,21 @@ describe("MarkdownBody", () => {
expect(html).toContain('style="max-width:100%;overflow-x:auto"');
});
it("renders a copy button alongside fenced code blocks", () => {
const html = renderMarkdown("```ts\nconst a = 1;\n```");
expect(html).toContain("paperclip-markdown-codeblock");
expect(html).toContain("paperclip-markdown-codeblock-copy");
expect(html).toContain('aria-label="Copy code"');
expect(html).toContain("lucide-copy");
});
it("does not render a copy button on inline code", () => {
const html = renderMarkdown("Reference `inline-code` here.");
expect(html).not.toContain("paperclip-markdown-codeblock-copy");
});
it("renders internal issue links and bare identifiers as inline issue refs", () => {
const html = renderMarkdown(`See PAP-42 and [linked task](${buildIssueReferenceHref("PAP-77")}) for follow-up.`, [
{ identifier: "PAP-42", status: "done" },

View File

@@ -1,6 +1,6 @@
import { isValidElement, useEffect, useId, useState, type ReactNode } from "react";
import { isValidElement, useCallback, useEffect, useId, useRef, useState, type ReactNode } from "react";
import { useQuery } from "@tanstack/react-query";
import { ExternalLink, Github } from "lucide-react";
import { Check, Copy, ExternalLink, Github } from "lucide-react";
import Markdown, { defaultUrlTransform, type Components, type Options } from "react-markdown";
import remarkGfm from "remark-gfm";
import { cn } from "../lib/utils";
@@ -183,6 +183,83 @@ function renderLinkBody(
);
}
function CodeBlock({
children,
preProps,
}: {
children: ReactNode;
preProps: React.HTMLAttributes<HTMLPreElement>;
}) {
const [copied, setCopied] = useState(false);
const [failed, setFailed] = useState(false);
const preRef = useRef<HTMLPreElement>(null);
const timerRef = useRef<ReturnType<typeof setTimeout>>(undefined);
useEffect(() => () => clearTimeout(timerRef.current), []);
const handleCopy = useCallback(async () => {
const text = preRef.current?.innerText ?? flattenText(children);
try {
if (navigator.clipboard && window.isSecureContext) {
await navigator.clipboard.writeText(text);
} else {
const textarea = document.createElement("textarea");
textarea.value = text;
textarea.style.position = "fixed";
textarea.style.left = "-9999px";
document.body.appendChild(textarea);
try {
textarea.select();
const success = document.execCommand("copy");
if (!success) throw new Error("execCommand copy failed");
} finally {
document.body.removeChild(textarea);
}
}
setFailed(false);
setCopied(true);
} catch {
setFailed(true);
// Still flip `copied` so the button stays revealed while "Copy failed" shows.
setCopied(true);
}
clearTimeout(timerRef.current);
timerRef.current = setTimeout(() => {
setCopied(false);
setFailed(false);
}, 1500);
}, [children]);
const label = failed ? "Copy failed" : copied ? "Copied!" : "Copy";
return (
<div className="paperclip-markdown-codeblock">
<pre
{...preProps}
ref={preRef}
style={mergeScrollableBlockStyle(preProps.style as React.CSSProperties | undefined)}
>
{children}
</pre>
<button
type="button"
onClick={handleCopy}
aria-label="Copy code"
title={label}
className="paperclip-markdown-codeblock-copy"
data-copied={copied || undefined}
data-failed={failed || undefined}
>
{copied && !failed ? (
<Check aria-hidden="true" className="h-3.5 w-3.5" />
) : (
<Copy aria-hidden="true" className="h-3.5 w-3.5" />
)}
<span className="paperclip-markdown-codeblock-copy-label">{label}</span>
</button>
</div>
);
}
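`flattenText(children)` is used as the copy fallback above but is not shown in this diff. A plausible sketch (an assumption, not the actual helper) is a recursive walk that concatenates the string and number leaves of a ReactNode-like tree; `NodeLike` here stands in for `ReactNode`:

```typescript
// Hypothetical flattenText: concatenate text leaves of a ReactNode-like tree.
type NodeLike =
  | string
  | number
  | boolean
  | null
  | undefined
  | NodeLike[]
  | { props?: { children?: NodeLike } };

function flattenText(node: NodeLike): string {
  if (node === null || node === undefined || typeof node === "boolean") return "";
  if (typeof node === "string") return node;
  if (typeof node === "number") return String(node);
  if (Array.isArray(node)) return node.map(flattenText).join("");
  // React elements keep nested content under props.children.
  return flattenText(node.props?.children);
}

console.log(flattenText(["const a", { props: { children: [" = ", 1] } }, ";"]));
// "const a = 1;"
```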
function MermaidDiagramBlock({ source, darkMode }: { source: string; darkMode: boolean }) {
const renderId = useId().replace(/[^a-zA-Z0-9_-]/g, "");
const [svg, setSvg] = useState<string | null>(null);
@@ -286,7 +363,7 @@ export function MarkdownBody({
if (mermaidSource) {
return <MermaidDiagramBlock source={mermaidSource} darkMode={theme === "dark"} />;
}
return <pre {...preProps} style={mergeScrollableBlockStyle(preProps.style as React.CSSProperties | undefined)}>{preChildren}</pre>;
return <CodeBlock preProps={preProps}>{preChildren}</CodeBlock>;
},
code: ({ node: _node, style: codeStyle, children: codeChildren, ...codeProps }) => (
<code {...codeProps} style={mergeWrapStyle(codeStyle as React.CSSProperties | undefined)}>

View File

@@ -478,22 +478,34 @@ describe("MarkdownEditor", () => {
});
});
it("anchors the mention menu inside the visual viewport when mobile offsets are present", () => {
it("places the menu top on the caret line and offsets the left a space-width past the caret", () => {
expect(
computeMentionMenuPosition(
{ viewportTop: 180, viewportLeft: 120 },
{ viewportTop: 100, viewportBottom: 118, viewportLeft: 240 },
{ offsetLeft: 0, offsetTop: 0, width: 800, height: 600 },
),
).toEqual({
top: 100,
left: 250,
});
});
it("applies visual viewport offsets when present", () => {
expect(
computeMentionMenuPosition(
{ viewportTop: 20, viewportBottom: 38, viewportLeft: 120 },
{ offsetLeft: 24, offsetTop: 320, width: 320, height: 260 },
),
).toEqual({
top: 372,
left: 144,
top: 340,
left: 154,
});
});
it("clamps the mention menu back into view near the viewport edges", () => {
expect(
computeMentionMenuPosition(
{ viewportTop: 260, viewportLeft: 240 },
{ viewportTop: 260, viewportBottom: 278, viewportLeft: 240 },
{ offsetLeft: 0, offsetTop: 0, width: 280, height: 220 },
),
).toEqual({
@@ -502,16 +514,28 @@ describe("MarkdownEditor", () => {
});
});
it("flips the menu above the caret line when it would overflow below", () => {
expect(
computeMentionMenuPosition(
{ viewportTop: 560, viewportBottom: 580, viewportLeft: 200 },
{ offsetLeft: 0, offsetTop: 0, width: 800, height: 600 },
),
).toEqual({
top: 372,
left: 210,
});
});
it("keeps a short mention menu on the same line when it fits below the caret", () => {
expect(
computeMentionMenuPosition(
{ viewportTop: 160, viewportLeft: 120 },
{ viewportTop: 160, viewportBottom: 178, viewportLeft: 120 },
{ offsetLeft: 0, offsetTop: 0, width: 320, height: 220 },
{ width: 188, height: 42 },
),
).toEqual({
top: 164,
left: 120,
top: 160,
left: 130,
});
});
@@ -619,8 +643,20 @@ describe("MarkdownEditor", () => {
editable.remove();
});
it("accepts mention selection from touchstart taps", async () => {
const handleChange = vi.fn();
function createTouchEvent(
type: "touchstart" | "touchmove" | "touchend",
touches: Array<{ clientX: number; clientY: number }>,
) {
const event = new Event(type, { bubbles: true, cancelable: true });
const list = touches as unknown as TouchList;
Object.defineProperty(event, "touches", { value: type === "touchend" ? [] : list });
Object.defineProperty(event, "changedTouches", { value: list });
return event;
}
async function openMentionMenuFor(
handleChange: ReturnType<typeof vi.fn>,
): Promise<{ option: HTMLButtonElement; root: ReturnType<typeof createRoot> }> {
const root = createRoot(container);
await act(async () => {
@@ -645,7 +681,6 @@ describe("MarkdownEditor", () => {
const editable = container.querySelector('[contenteditable="true"]');
expect(editable).not.toBeNull();
const textNode = editable?.firstChild;
expect(textNode?.nodeType).toBe(Node.TEXT_NODE);
@@ -659,15 +694,24 @@ describe("MarkdownEditor", () => {
act(() => {
document.dispatchEvent(new Event("selectionchange"));
});
await flush();
const option = Array.from(document.body.querySelectorAll('button[type="button"]'))
.find((node) => node.textContent?.includes("Paperclip App"));
.find((node) => node.textContent?.includes("Paperclip App")) as HTMLButtonElement | undefined;
expect(option).toBeTruthy();
return { option: option!, root };
}
it("accepts mention selection from a touch tap", async () => {
const handleChange = vi.fn();
const { option, root } = await openMentionMenuFor(handleChange);
const point = { clientX: 100, clientY: 50 };
act(() => {
option?.dispatchEvent(new Event("touchstart", { bubbles: true, cancelable: true }));
option.dispatchEvent(createTouchEvent("touchstart", [point]));
});
act(() => {
option.dispatchEvent(createTouchEvent("touchend", [point]));
});
expect(handleChange).toHaveBeenCalledWith(
@@ -678,4 +722,44 @@ describe("MarkdownEditor", () => {
root.unmount();
});
});
it("does not preventDefault on touchstart so the mention menu can scroll on mobile", async () => {
const handleChange = vi.fn();
const { option, root } = await openMentionMenuFor(handleChange);
const touchstart = createTouchEvent("touchstart", [{ clientX: 100, clientY: 50 }]);
act(() => {
option.dispatchEvent(touchstart);
});
expect(touchstart.defaultPrevented).toBe(false);
expect(handleChange).not.toHaveBeenCalled();
await act(async () => {
root.unmount();
});
});
it("does not select when the touch moves like a scroll", async () => {
const handleChange = vi.fn();
const { option, root } = await openMentionMenuFor(handleChange);
const start = { clientX: 100, clientY: 50 };
const moved = { clientX: 100, clientY: 90 };
act(() => {
option.dispatchEvent(createTouchEvent("touchstart", [start]));
});
act(() => {
option.dispatchEvent(createTouchEvent("touchmove", [moved]));
});
act(() => {
option.dispatchEvent(createTouchEvent("touchend", [moved]));
});
expect(handleChange).not.toHaveBeenCalled();
await act(async () => {
root.unmount();
});
});
});

View File

@@ -174,8 +174,14 @@ interface MentionState {
query: string;
top: number;
left: number;
/** Viewport-relative coords for portal positioning */
/**
* Caret-aligned viewport coords for portal positioning. `viewportTop` /
* `viewportBottom` describe the active text line, and `viewportLeft` is the
* caret X (right edge of the last typed character) so the menu can sit on
* the same line, just to the right of the cursor.
*/
viewportTop: number;
viewportBottom: number;
viewportLeft: number;
textNode: Text;
atPos: number;
@@ -201,6 +207,8 @@ const MENTION_MENU_HEIGHT = 208;
const MENTION_MENU_PADDING = 8;
const MENTION_MENU_ROW_HEIGHT = 34;
const MENTION_MENU_CHROME_HEIGHT = 8;
/** Roughly one space-width of breathing room between the caret and the menu. */
const MENTION_MENU_CARET_GAP = 10;
const CODE_BLOCK_LANGUAGES: Record<string, string> = {
txt: "Text",
@@ -263,6 +271,36 @@ export function findMentionMatch(
};
}
interface CaretRect {
top: number;
bottom: number;
/** Caret X — the right edge of the last typed character (or left edge of the next). */
x: number;
}
function measureCaretRect(textNode: Text, offset: number, atPos: number): CaretRect {
const length = textNode.textContent?.length ?? 0;
const rectFromRange = (start: number, end: number, side: "right" | "left"): CaretRect | null => {
if (start < 0 || end > length || end <= start) return null;
const range = document.createRange();
range.setStart(textNode, start);
range.setEnd(textNode, end);
const rect = range.getBoundingClientRect();
if (rect.width === 0 && rect.height === 0) return null;
return { top: rect.top, bottom: rect.bottom, x: side === "right" ? rect.right : rect.left };
};
// Prefer the character immediately before the caret — its right edge IS the caret X
// and its top/bottom describe the active line. Falls back to the char after the caret
// and finally the @ marker if nothing else gives us a valid rect.
return (
rectFromRange(Math.max(0, offset - 1), offset, "right")
?? rectFromRange(offset, Math.min(length, offset + 1), "left")
?? rectFromRange(atPos, atPos + 1, "right")
?? { top: 0, bottom: 0, x: 0 }
);
}
function detectMention(container: HTMLElement): MentionState | null {
const sel = window.getSelection();
if (!sel || sel.rangeCount === 0 || !sel.isCollapsed) return null;
@@ -277,21 +315,20 @@ function detectMention(container: HTMLElement): MentionState | null {
const match = findMentionMatch(text, offset);
if (!match) return null;
// Get position relative to container
const tempRange = document.createRange();
tempRange.setStart(textNode, match.atPos);
tempRange.setEnd(textNode, match.atPos + 1);
const rect = tempRange.getBoundingClientRect();
// Anchor the menu to the live caret so it tracks each typed character instead of
// staying glued to the @ marker.
const caret = measureCaretRect(textNode as Text, offset, match.atPos);
const containerRect = container.getBoundingClientRect();
return {
trigger: match.trigger,
marker: match.marker,
query: match.query,
top: rect.bottom - containerRect.top,
left: rect.left - containerRect.left,
viewportTop: rect.bottom,
viewportLeft: rect.left,
top: caret.top - containerRect.top,
left: caret.x - containerRect.left,
viewportTop: caret.top,
viewportBottom: caret.bottom,
viewportLeft: caret.x,
textNode: textNode as Text,
atPos: match.atPos,
endPos: match.endPos,
@@ -318,7 +355,7 @@ function getMentionMenuViewport(): MentionMenuViewport {
}
export function computeMentionMenuPosition(
anchor: Pick<MentionState, "viewportTop" | "viewportLeft">,
anchor: Pick<MentionState, "viewportTop" | "viewportBottom" | "viewportLeft">,
viewport: MentionMenuViewport,
menuSize: MentionMenuSize = { width: MENTION_MENU_WIDTH, height: MENTION_MENU_HEIGHT },
) {
@@ -327,10 +364,23 @@ export function computeMentionMenuPosition(
const minTop = viewport.offsetTop + MENTION_MENU_PADDING;
const maxTop = viewport.offsetTop + viewport.height - menuSize.height;
return {
top: Math.max(minTop, Math.min(viewport.offsetTop + anchor.viewportTop + 4, maxTop)),
left: Math.max(minLeft, Math.min(viewport.offsetLeft + anchor.viewportLeft, maxLeft)),
};
// Place the menu's top edge on the current line so it sits next to the caret.
// If it would overflow below, flip above so the menu's bottom hugs the line.
const desiredTop = viewport.offsetTop + anchor.viewportTop;
let top: number;
if (desiredTop > maxTop) {
const flipped = viewport.offsetTop + anchor.viewportBottom - menuSize.height;
top = Math.max(minTop, Math.min(flipped, maxTop));
} else {
top = Math.max(minTop, desiredTop);
}
// Place the menu's left edge a small gap to the right of the caret X so
// there's roughly a space-width of breathing room between cursor and menu.
const desiredLeft = viewport.offsetLeft + anchor.viewportLeft + MENTION_MENU_CARET_GAP;
const left = Math.max(minLeft, Math.min(desiredLeft, maxLeft));
return { top, left };
}
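The placement logic above can be exercised standalone. This sketch re-derives it from the constants visible in the diff (padding 8, caret gap 10) and assumes the min/max clamp formulas for the left edge, which are outside the shown hunk; it reproduces the flip, offset, and short-menu cases from the tests earlier in this diff:

```typescript
// Caret-anchored menu placement: top sits on the caret line, flips above on
// overflow; left sits one caret gap past the caret X, clamped into view.
const MENU_PADDING = 8;
const CARET_GAP = 10;

interface Anchor { viewportTop: number; viewportBottom: number; viewportLeft: number }
interface Viewport { offsetLeft: number; offsetTop: number; width: number; height: number }
interface MenuSize { width: number; height: number }

function placeMentionMenu(anchor: Anchor, viewport: Viewport, menu: MenuSize) {
  const minLeft = viewport.offsetLeft + MENU_PADDING;
  const maxLeft = viewport.offsetLeft + viewport.width - menu.width;
  const minTop = viewport.offsetTop + MENU_PADDING;
  const maxTop = viewport.offsetTop + viewport.height - menu.height;
  const desiredTop = viewport.offsetTop + anchor.viewportTop;
  const top = desiredTop > maxTop
    // Flip: hang the menu's bottom edge from the caret line's bottom.
    ? Math.max(minTop, Math.min(viewport.offsetTop + anchor.viewportBottom - menu.height, maxTop))
    : Math.max(minTop, desiredTop);
  const left = Math.max(minLeft, Math.min(viewport.offsetLeft + anchor.viewportLeft + CARET_GAP, maxLeft));
  return { top, left };
}

// Mirrors the "flips above the caret line" test case from this diff:
console.log(placeMentionMenu(
  { viewportTop: 560, viewportBottom: 580, viewportLeft: 200 },
  { offsetLeft: 0, offsetTop: 0, width: 800, height: 600 },
  { width: 188, height: 208 },
)); // { top: 372, left: 210 }
```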
function getMentionMenuSize(optionCount: number): MentionMenuSize {
@@ -903,6 +953,44 @@ export const MarkdownEditor = forwardRef<MarkdownEditorRef, MarkdownEditorProps>
}
}, [selectMention]);
// Touch handling for the mention menu. We deliberately do NOT preventDefault
// on touchstart so the browser can still scroll the menu vertically; instead
// we record the start point and only treat the gesture as a selection if the
// finger lifted with negligible movement (i.e., a tap, not a scroll).
const touchStartPointRef = useRef<{ x: number; y: number } | null>(null);
const TOUCH_TAP_THRESHOLD_PX = 8;
const handleAutocompleteTouchStart = useCallback((event: ReactTouchEvent<HTMLButtonElement>) => {
const touch = event.touches[0];
if (!touch) return;
touchStartPointRef.current = { x: touch.clientX, y: touch.clientY };
}, []);
const handleAutocompleteTouchMove = useCallback((event: ReactTouchEvent<HTMLButtonElement>) => {
const start = touchStartPointRef.current;
if (!start) return;
const touch = event.touches[0];
if (!touch) return;
if (Math.hypot(touch.clientX - start.x, touch.clientY - start.y) > TOUCH_TAP_THRESHOLD_PX) {
touchStartPointRef.current = null;
}
}, []);
const handleAutocompleteTouchEnd = useCallback((
event: ReactTouchEvent<HTMLButtonElement>,
option: AutocompleteOption,
) => {
const start = touchStartPointRef.current;
touchStartPointRef.current = null;
if (!start) return;
const touch = event.changedTouches[0];
if (!touch) return;
if (Math.hypot(touch.clientX - start.x, touch.clientY - start.y) > TOUCH_TAP_THRESHOLD_PX) {
return;
}
handleAutocompletePress(event, option);
}, [handleAutocompletePress]);
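The three touch handlers above amount to a small tap-vs-scroll state machine: record the start point, disarm once total travel exceeds the threshold, and select only if the finger lifts while still armed. A framework-free sketch of the same logic, using the 8px threshold from the diff:

```typescript
// Classify a touch gesture as a tap (select) or a scroll (ignore).
const TAP_THRESHOLD_PX = 8;

type Point = { x: number; y: number };

class TapTracker {
  private start: Point | null = null;

  touchStart(point: Point): void {
    this.start = point;
  }

  touchMove(point: Point): void {
    // Once the finger drifts past the threshold, the gesture is a scroll
    // and can never become a selection.
    if (this.start && Math.hypot(point.x - this.start.x, point.y - this.start.y) > TAP_THRESHOLD_PX) {
      this.start = null;
    }
  }

  /** Returns true when the lift completes a tap (i.e., select the option). */
  touchEnd(point: Point): boolean {
    const start = this.start;
    this.start = null;
    if (!start) return false;
    return Math.hypot(point.x - start.x, point.y - start.y) <= TAP_THRESHOLD_PX;
  }
}

const tracker = new TapTracker();
tracker.touchStart({ x: 100, y: 50 });
console.log(tracker.touchEnd({ x: 101, y: 51 })); // true — a tap selects

tracker.touchStart({ x: 100, y: 50 });
tracker.touchMove({ x: 100, y: 90 });
console.log(tracker.touchEnd({ x: 100, y: 90 })); // false — a scroll does not
```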
function hasFilePayload(evt: DragEvent<HTMLDivElement>) {
return Array.from(evt.dataTransfer?.types ?? []).includes("Files");
}
@@ -1131,26 +1219,36 @@ export const MarkdownEditor = forwardRef<MarkdownEditorRef, MarkdownEditorProps>
/>
{/* Mention dropdown — rendered via portal so it isn't clipped by overflow containers */}
{mentionActive && filteredMentions.length > 0 &&
{mentionActive && filteredMentions.length > 0 && mentionMenuPosition &&
createPortal(
<div
className="fixed z-[9999] min-w-[180px] max-w-[calc(100vw-16px)] max-h-[200px] overflow-y-auto rounded-md border border-border bg-popover shadow-md"
className="fixed z-[9999] min-w-[180px] max-w-[calc(100vw-16px)] max-h-[208px] overflow-y-auto rounded-md border border-border bg-popover shadow-md"
style={{
top: Math.min(mentionState.viewportTop + 4, window.innerHeight - 208),
left: Math.max(8, Math.min(mentionState.viewportLeft, window.innerWidth - 188)),
top: mentionMenuPosition.top,
left: mentionMenuPosition.left,
touchAction: "pan-y",
WebkitOverflowScrolling: "touch",
}}
>
{filteredMentions.map((option, i) => (
<button
key={option.id}
type="button"
tabIndex={-1}
className={cn(
"flex items-center gap-2 w-full px-3 py-1.5 text-sm text-left hover:bg-accent/50 transition-colors",
i === mentionIndex && "bg-accent",
)}
onPointerDown={(e) => handleAutocompletePress(e, option)}
onPointerDown={(e) => {
// Touch is handled via onTouchStart/onTouchEnd so vertical scrolling
// isn't swallowed; only handle mouse/pen here.
if (e.pointerType === "touch") return;
handleAutocompletePress(e, option);
}}
onMouseDown={(e) => handleAutocompletePress(e, option)}
onTouchStart={(e) => handleAutocompletePress(e, option)}
onTouchStart={handleAutocompleteTouchStart}
onTouchMove={handleAutocompleteTouchMove}
onTouchEnd={(e) => handleAutocompleteTouchEnd(e, option)}
onMouseEnter={() => {
if (mentionStateRef.current?.trigger === "skill") {
skillEnterArmedRef.current = true;

View File

@@ -222,6 +222,22 @@ async function flush() {
});
}
async function waitForAssertion(assertion: () => void, attempts = 20) {
let lastError: unknown;
for (let attempt = 0; attempt < attempts; attempt += 1) {
try {
assertion();
return;
} catch (error) {
lastError = error;
await flush();
}
}
throw lastError;
}
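The `waitForAssertion` helper above retries a throwing assertion with a `flush()` between attempts and rethrows the last failure. The same pattern generalizes if the settle step is injected; a sketch:

```typescript
// Poll an assertion, settling between attempts; rethrow the final failure so
// the caller sees the real assertion error rather than a generic timeout.
async function waitFor(
  assertion: () => void,
  settle: () => Promise<void>,
  attempts = 20,
): Promise<void> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt += 1) {
    try {
      assertion();
      return;
    } catch (error) {
      lastError = error;
      await settle();
    }
  }
  throw lastError;
}

// Usage: a value that becomes ready after three settles.
void (async () => {
  let ready = 0;
  await waitFor(
    () => { if (ready < 3) throw new Error("not ready"); },
    async () => { ready += 1; },
  );
  console.log(ready); // 3
})();
```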
function renderDialog(container: HTMLDivElement) {
const queryClient = new QueryClient({
defaultOptions: {
@@ -268,6 +284,7 @@ describe("NewIssueDialog", () => {
mockAuthApi.getSession.mockResolvedValue({ user: { id: "user-1" } });
mockAssetsApi.uploadImage.mockResolvedValue({ contentPath: "/uploads/asset.png" });
mockInstanceSettingsApi.getExperimental.mockResolvedValue({ enableIsolatedWorkspaces: false });
localStorage.clear();
mockIssuesApi.create.mockResolvedValue({
id: "issue-2",
companyId: "company-1",
@@ -351,7 +368,9 @@ describe("NewIssueDialog", () => {
const submitButton = Array.from(container.querySelectorAll("button"))
.find((button) => button.textContent?.includes("Create Sub-Issue"));
expect(submitButton).not.toBeUndefined();
expect(submitButton?.hasAttribute("disabled")).toBe(false);
await waitForAssertion(() => {
expect(submitButton?.hasAttribute("disabled")).toBe(false);
});
await act(async () => {
submitButton!.dispatchEvent(new MouseEvent("click", { bubbles: true }));
@@ -373,6 +392,17 @@ describe("NewIssueDialog", () => {
});
it("submits the latest locally typed title and description", async () => {
let resolveProjects: (projects: Array<{
id: string;
name: string;
description: string | null;
archivedAt: string | null;
color: string;
}>) => void = () => undefined;
mockProjectsApi.list.mockReturnValue(new Promise((resolve) => {
resolveProjects = resolve;
}));
const { root } = renderDialog(container);
await flush();
@@ -401,10 +431,26 @@ describe("NewIssueDialog", () => {
});
await flush();
await act(async () => {
resolveProjects([
{
id: "project-1",
name: "Alpha",
description: null,
archivedAt: null,
color: "#445566",
},
]);
await Promise.resolve();
});
await flush();
const submitButton = Array.from(container.querySelectorAll("button"))
.find((button) => button.textContent?.includes("Create Issue"));
expect(submitButton).not.toBeUndefined();
expect(submitButton?.hasAttribute("disabled")).toBe(false);
await waitForAssertion(() => {
expect(submitButton?.hasAttribute("disabled")).toBe(false);
});
await act(async () => {
submitButton!.dispatchEvent(new MouseEvent("click", { bubbles: true }));

View File

@@ -417,6 +417,7 @@ export function NewIssueDialog() {
const [isFileDragOver, setIsFileDragOver] = useState(false);
const draftTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
const executionWorkspaceDefaultProjectId = useRef<string | null>(null);
const initializationKeyRef = useRef<string | null>(null);
const effectiveCompanyId = dialogCompanyId ?? selectedCompanyId;
const dialogCompany = companies.find((c) => c.id === effectiveCompanyId) ?? selectedCompany;
@@ -673,7 +674,13 @@ export function NewIssueDialog() {
// Restore draft or apply defaults when dialog opens
useEffect(() => {
if (!newIssueOpen) return;
if (!newIssueOpen) {
initializationKeyRef.current = null;
return;
}
const initializationKey = `${selectedCompanyId ?? ""}:${JSON.stringify(newIssueDefaults)}`;
if (initializationKeyRef.current === initializationKey) return;
initializationKeyRef.current = initializationKey;
setDialogCompanyId(selectedCompanyId);
executionWorkspaceDefaultProjectId.current = null;
@@ -681,6 +688,7 @@ export function NewIssueDialog() {
if (newIssueDefaults.parentId) {
const defaultProjectId = newIssueDefaults.projectId ?? "";
const defaultProject = orderedProjects.find((project) => project.id === defaultProjectId);
const hasExplicitProjectWorkspaceId = newIssueDefaults.projectWorkspaceId !== undefined;
const defaultProjectWorkspaceId = newIssueDefaults.projectWorkspaceId
?? defaultProjectWorkspaceIdForProject(defaultProject);
const defaultExecutionWorkspaceMode = newIssueDefaults.executionWorkspaceId
@@ -697,7 +705,9 @@ export function NewIssueDialog() {
setAssigneeChrome(false);
setExecutionWorkspaceMode(defaultExecutionWorkspaceMode);
setSelectedExecutionWorkspaceId(newIssueDefaults.executionWorkspaceId ?? "");
executionWorkspaceDefaultProjectId.current = defaultProjectId || null;
executionWorkspaceDefaultProjectId.current = hasExplicitProjectWorkspaceId || defaultProject
? defaultProjectId || null
: null;
} else if (newIssueDefaults.title) {
setIssueText(newIssueDefaults.title, newIssueDefaults.description ?? "");
setStatus(newIssueDefaults.status ?? "todo");
@@ -716,7 +726,7 @@ export function NewIssueDialog() {
setAssigneeChrome(false);
setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(defaultProject));
setSelectedExecutionWorkspaceId("");
executionWorkspaceDefaultProjectId.current = defaultProjectId || null;
executionWorkspaceDefaultProjectId.current = defaultProject ? defaultProjectId || null : null;
} else if (draft && draft.title.trim()) {
const restoredProjectId = newIssueDefaults.projectId ?? draft.projectId;
const restoredProject = orderedProjects.find((project) => project.id === restoredProjectId);
@@ -742,7 +752,9 @@ export function NewIssueDialog() {
?? (draft.useIsolatedExecutionWorkspace ? "isolated_workspace" : defaultExecutionWorkspaceModeForProject(restoredProject)),
);
setSelectedExecutionWorkspaceId(draft.selectedExecutionWorkspaceId ?? "");
executionWorkspaceDefaultProjectId.current = restoredProjectId || null;
executionWorkspaceDefaultProjectId.current = draft.projectWorkspaceId || restoredProject
? restoredProjectId || null
: null;
} else {
const defaultProjectId = newIssueDefaults.projectId ?? "";
const defaultProject = orderedProjects.find((project) => project.id === defaultProjectId);
@@ -761,9 +773,9 @@ export function NewIssueDialog() {
setAssigneeChrome(false);
setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(defaultProject));
setSelectedExecutionWorkspaceId("");
executionWorkspaceDefaultProjectId.current = defaultProjectId || null;
executionWorkspaceDefaultProjectId.current = defaultProject ? defaultProjectId || null : null;
}
}, [newIssueOpen, newIssueDefaults, orderedProjects, setIssueText]);
}, [newIssueOpen, newIssueDefaults, orderedProjects, selectedCompanyId, setIssueText]);
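The effect above guards re-initialization behind a key derived from the selected company and the dialog defaults, and clears that key when the dialog closes so reopening re-initializes. A framework-free sketch of the same idempotence guard:

```typescript
// Initialize-at-most-once-per-key guard, mirroring initializationKeyRef.
function makeInitializationGuard() {
  let currentKey: string | null = null;
  return {
    /** Returns true exactly once per (open, companyId, defaults) combination. */
    shouldInitialize(open: boolean, companyId: string | null, defaults: unknown): boolean {
      if (!open) {
        currentKey = null; // closing resets, so reopening re-initializes
        return false;
      }
      const key = `${companyId ?? ""}:${JSON.stringify(defaults)}`;
      if (currentKey === key) return false;
      currentKey = key;
      return true;
    },
  };
}

const guard = makeInitializationGuard();
console.log(guard.shouldInitialize(true, "company-1", { parentId: "p" }));  // true — first open
console.log(guard.shouldInitialize(true, "company-1", { parentId: "p" }));  // false — same key, skip
console.log(guard.shouldInitialize(false, "company-1", { parentId: "p" })); // false — closed, reset
console.log(guard.shouldInitialize(true, "company-1", { parentId: "p" }));  // true — reopened
```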
useEffect(() => {
if (!supportsAssigneeOverrides) {
@@ -815,6 +827,7 @@ export function NewIssueDialog() {
setIsFileDragOver(false);
setCompanyOpen(false);
executionWorkspaceDefaultProjectId.current = null;
initializationKeyRef.current = null;
}
function handleCompanyChange(companyId: string) {
@@ -1060,7 +1073,12 @@ export function NewIssueDialog() {
}, [orderedProjects]);
useEffect(() => {
if (!newIssueOpen || !projectId || executionWorkspaceDefaultProjectId.current === projectId) {
if (
!newIssueOpen ||
!projectId ||
selectedExecutionWorkspaceId ||
executionWorkspaceDefaultProjectId.current === projectId
) {
return;
}
const project = orderedProjects.find((entry) => entry.id === projectId);
@@ -1069,7 +1087,7 @@ export function NewIssueDialog() {
setProjectWorkspaceId(defaultProjectWorkspaceIdForProject(project));
setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(project));
setSelectedExecutionWorkspaceId("");
}, [newIssueOpen, orderedProjects, projectId]);
}, [newIssueOpen, orderedProjects, projectId, selectedExecutionWorkspaceId]);
const modelOverrideOptions = useMemo<InlineEntityOption[]>(
() => {
return [...(assigneeAdapterModels ?? [])]

View File

@@ -442,7 +442,7 @@
align-items: center;
gap: 0.25rem;
margin: 0 0.1rem;
padding: 0 0.45rem;
padding: 0 0.625rem;
border: 1px solid var(--border);
border-radius: 999px;
font-size: 0.75rem;
@@ -457,9 +457,7 @@
/* Strip the MDXEditor's default inline-code styling from the text inside chips
(the link label otherwise picks up a monospace font + gray tint). */
.paperclip-mdxeditor-content a.paperclip-mention-chip,
.paperclip-mdxeditor-content a.paperclip-mention-chip code,
.paperclip-mdxeditor-content a.paperclip-project-mention-chip,
.paperclip-mdxeditor-content a.paperclip-project-mention-chip code {
font-family: inherit;
background: none;
@@ -670,6 +668,53 @@ a.paperclip-mention-chip[data-mention-kind="agent"]::before {
background: none;
}
/* Copy-to-clipboard button on fenced code blocks */
.paperclip-markdown-codeblock {
position: relative;
}
.paperclip-markdown-codeblock-copy {
position: absolute;
top: 0.4rem;
right: 0.4rem;
display: inline-flex;
align-items: center;
gap: 0.25rem;
padding: 0.2rem 0.4rem;
border-radius: calc(var(--radius) - 4px);
border: 1px solid color-mix(in oklab, var(--foreground) 14%, transparent);
background-color: color-mix(in oklab, var(--muted) 92%, var(--background) 8%);
color: var(--muted-foreground);
font-size: 0.7rem;
line-height: 1;
cursor: pointer;
opacity: 0;
transition: opacity 0.12s ease, background-color 0.12s ease, color 0.12s ease;
}
.paperclip-markdown-codeblock:hover .paperclip-markdown-codeblock-copy,
.paperclip-markdown-codeblock-copy:focus-visible,
.paperclip-markdown-codeblock-copy[data-copied] {
opacity: 1;
}
.paperclip-markdown-codeblock-copy:hover {
background-color: var(--accent);
color: var(--accent-foreground);
}
.paperclip-markdown-codeblock-copy[data-copied] {
color: var(--primary);
}
.paperclip-markdown-codeblock-copy[data-failed] {
color: var(--destructive);
}
.paperclip-markdown-codeblock-copy-label {
font-weight: 500;
}
/* Remove backtick pseudo-elements from inline code (prose default adds them) */
.prose code::before,
.prose code::after {
@@ -862,7 +907,7 @@ span.paperclip-project-mention-chip {
align-items: center;
gap: 0.25rem;
margin: 0 0.1rem;
-padding: 0 0.45rem;
+padding: 0 0.625rem;
border: 1px solid var(--border);
border-radius: 999px;
font-size: 0.75rem;

View File

@@ -605,6 +605,37 @@ describe("buildIssueChatMessages", () => {
});
});
it("labels pause-caused cancelled runs as paused by board", () => {
const messages = buildIssueChatMessages({
comments: [],
timelineEvents: [],
linkedRuns: [
{
runId: "run-paused",
status: "cancelled",
agentId: "agent-1",
agentName: "CodexCoder",
createdAt: new Date("2026-04-06T12:01:00.000Z"),
startedAt: new Date("2026-04-06T12:01:00.000Z"),
finishedAt: new Date("2026-04-06T12:02:00.000Z"),
resultJson: { stopReason: "paused" },
},
],
liveRuns: [],
transcriptsByRunId: new Map([
["run-paused", [{ kind: "assistant", ts: "2026-04-06T12:01:05.000Z", text: "Working on it." }]],
]),
hasOutputForRun: (runId) => runId === "run-paused",
currentUserId: "user-1",
});
expect(messages).toHaveLength(1);
expect(messages[0]?.metadata.custom).toMatchObject({
chainOfThoughtLabel: "Paused by board after 1 minute",
runStatus: "cancelled",
});
});
it("can keep succeeded runs without transcript output for embedded run feeds", () => {
const messages = buildIssueChatMessages({
comments: [],

View File

@@ -45,6 +45,7 @@ export interface IssueChatLinkedRun {
finishedAt?: Date | string | null;
hasStoredOutput?: boolean;
logBytes?: number | null;
resultJson?: Record<string, unknown> | null;
}
export interface IssueChatTranscriptEntry {
@@ -484,11 +485,13 @@ function runDurationLabel(run: {
createdAt: Date | string;
startedAt: Date | string | null;
finishedAt?: Date | string | null;
resultJson?: Record<string, unknown> | null;
}) {
const start = run.startedAt ?? run.createdAt;
const end = run.finishedAt ?? null;
const durationMs = end ? Math.max(0, toTimestamp(end) - toTimestamp(start)) : null;
const durationText = formatDurationWords(durationMs);
const stopReason = typeof run.resultJson?.stopReason === "string" ? run.resultJson.stopReason : null;
switch (run.status) {
case "succeeded":
return durationText ? `Worked for ${durationText}` : "Finished work";
@@ -498,6 +501,9 @@ function runDurationLabel(run: {
case "timed_out":
return durationText ? `Timed out after ${durationText}` : "Run timed out";
case "cancelled":
if (stopReason === "paused") {
return durationText ? `Paused by board after ${durationText}` : "Paused by board";
}
return durationText ? `Cancelled after ${durationText}` : "Run cancelled";
case "queued":
return "Queued";

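In isolation, the new `stopReason` branch behaves like the sketch below, with a minute-only formatter standing in for the app's `formatDurationWords`; the helper names `minutesLabel` and `cancelledRunLabel` are illustrative, not part of the codebase:

```typescript
// Standalone sketch of the stopReason-aware cancelled-run labeling.
// minutesLabel is a simplified stand-in for formatDurationWords.
type CancelledRun = {
  startedAt: string;
  finishedAt: string | null;
  resultJson?: Record<string, unknown> | null;
};

function minutesLabel(start: string, end: string): string {
  const mins = Math.max(0, Math.round((Date.parse(end) - Date.parse(start)) / 60_000));
  return `${mins} minute${mins === 1 ? "" : "s"}`;
}

function cancelledRunLabel(run: CancelledRun): string {
  const stopReason =
    typeof run.resultJson?.stopReason === "string" ? run.resultJson.stopReason : null;
  const duration = run.finishedAt ? minutesLabel(run.startedAt, run.finishedAt) : null;
  if (stopReason === "paused") {
    // Pause-caused cancellations read as a board pause, not a hard cancel.
    return duration ? `Paused by board after ${duration}` : "Paused by board";
  }
  return duration ? `Cancelled after ${duration}` : "Run cancelled";
}
```

A run cancelled at 12:02 that started at 12:01 with `resultJson: { stopReason: "paused" }` yields "Paused by board after 1 minute", which is the exact label the new unit test asserts.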
View File

@@ -9,6 +9,7 @@ import {
flattenIssueCommentPages,
getNextIssueCommentPageParam,
isQueuedIssueComment,
loadRemainingIssueCommentPages,
matchesIssueRef,
mergeIssueComments,
removeIssueCommentFromPages,
@@ -234,6 +235,31 @@ describe("optimistic issue comments", () => {
).toBe("comment-1");
});
it("loads remaining comment pages until the terminal partial page", async () => {
const fetchPage = vi.fn(async (afterCommentId: string) => {
if (afterCommentId === "comment-3") return [{ id: "comment-2" }, { id: "comment-1" }];
if (afterCommentId === "comment-1") return [{ id: "comment-0" }];
return [];
});
const loaded = await loadRemainingIssueCommentPages({
pages: [[{ id: "comment-4" }, { id: "comment-3" }]],
pageParams: [null],
pageSize: 2,
fetchPage,
});
expect(fetchPage).toHaveBeenCalledTimes(2);
expect(fetchPage).toHaveBeenNthCalledWith(1, "comment-3");
expect(fetchPage).toHaveBeenNthCalledWith(2, "comment-1");
expect(loaded.pages.map((page) => page.map((comment) => comment.id))).toEqual([
["comment-4", "comment-3"],
["comment-2", "comment-1"],
["comment-0"],
]);
expect(loaded.pageParams).toEqual([null, "comment-3", "comment-1"]);
});
it("autoloads older chat comments while the initial thread is still under the threshold", () => {
expect(
shouldAutoloadOlderIssueComments({

View File

@@ -150,6 +150,45 @@ export function getNextIssueCommentPageParam(
return lastPage[lastPage.length - 1]?.id;
}
function getNextPageCursor<T extends { id: string }>(
lastPage: ReadonlyArray<T> | undefined,
pageSize: number,
): string | undefined {
if (!lastPage || lastPage.length < pageSize) return undefined;
return lastPage[lastPage.length - 1]?.id;
}
export async function loadRemainingIssueCommentPages<T extends { id: string }>(params: {
pages: ReadonlyArray<ReadonlyArray<T>> | undefined;
pageParams: ReadonlyArray<string | null> | undefined;
pageSize: number;
fetchPage: (afterCommentId: string) => Promise<ReadonlyArray<T>>;
}): Promise<{ pages: T[][]; pageParams: Array<string | null> }> {
const pages = (params.pages ?? []).map((page) => [...page]);
const pageParams = params.pageParams
? [...params.pageParams].slice(0, pages.length)
: pages.map(() => null);
while (pageParams.length < pages.length) {
pageParams.push(null);
}
if (params.pageSize <= 0) return { pages, pageParams };
let cursor = getNextPageCursor(pages[pages.length - 1], params.pageSize);
const seenCursors = new Set<string>();
while (cursor && !seenCursors.has(cursor)) {
seenCursors.add(cursor);
const nextPage = [...await params.fetchPage(cursor)];
pages.push(nextPage);
pageParams.push(cursor);
cursor = getNextPageCursor(nextPage, params.pageSize);
}
return { pages, pageParams };
}
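A `fetchPage` callback for this helper just has to honor the cursor contract: return up to `pageSize` comments strictly after the given id, with a short or empty page ending the loop. A synchronous sketch of that contract over an in-memory list (the real callback returns a `Promise`, and `makeFetchPage` is an illustrative name, not an app export):

```typescript
// Illustrative, synchronous version of the cursor contract that
// loadRemainingIssueCommentPages expects from fetchPage: up to pageSize
// comments strictly after afterCommentId; a short page is terminal.
type CommentRow = { id: string };

function makeFetchPage(all: CommentRow[], pageSize: number) {
  return (afterCommentId: string): CommentRow[] => {
    const idx = all.findIndex((comment) => comment.id === afterCommentId);
    // Unknown cursor: return an empty (terminal) page rather than looping.
    if (idx === -1) return [];
    return all.slice(idx + 1, idx + 1 + pageSize);
  };
}
```

Because the loop also tracks a `seenCursors` set, even a `fetchPage` that violates this contract and echoes an earlier cursor cannot spin forever.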
export function shouldAutoloadOlderIssueComments(params: {
activeDetailTab: string;
hasOlderComments: boolean;