mirror of
https://github.com/glittercowboy/get-shit-done
synced 2026-05-13 18:46:38 +02:00
5
.changeset/3312-sdk-first-architecture-seams.md
Normal file
@@ -0,0 +1,5 @@
---
type: Changed
pr: 3312
---
Tighten SDK-first architecture seams across planning path projection, workstream inventory, STATE.md transforms, and CJS command routing. Shared CJS/SDK helpers now reduce drift, and STATE.md progress projection preserves curated wider aggregates without hiding real disk-derived progress.

179
CONTEXT.md
@@ -25,6 +25,9 @@ Adapter Module that satisfies native query dispatch at the Dispatch Policy seam,
### Query CLI Output Module
Module owning projection from dispatch results/errors to CLI `{ exitCode, stdoutChunks, stderrLines }` output contract.

### STATE.md Document Module
Shared CJS/SDK pure transform Module owning STATE.md parse, field extraction, field replacement, status normalization, and frontmatter reconstruction. It does not scan `.planning/phases` and does not own persistence or locking; phase/plan/summary counts arrive from inventory/progress Modules as inputs, and CJS/SDK read-modify-write paths remain Adapters.

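The pure-transform seam described above can be sketched as a pair of string functions with no disk or lock access. This is a minimal illustration only: the function names and the `**Field:** value` line shape are assumptions, not the module's real API.

```javascript
// Illustrative sketch of pure STATE.md field transforms (names are hypothetical).
// Assumes fields are rendered as "**Field:** value" lines; the real shape may differ.
function extractField(stateMd, field) {
  const re = new RegExp(`^\\*\\*${field}:\\*\\*\\s*(.*)$`, 'm');
  const m = stateMd.match(re);
  return m ? m[1].trim() : null;
}

function replaceField(stateMd, field, value) {
  // Keep the label prefix, swap only the value; no I/O, no locking.
  const re = new RegExp(`^(\\*\\*${field}:\\*\\*\\s*).*$`, 'm');
  return stateMd.replace(re, `$1${value}`);
}

const doc = '**Status:** in-progress\n**Phase:** 3';
console.log(extractField(doc, 'Status')); // → in-progress
console.log(extractField(replaceField(doc, 'Status', 'complete'), 'Status')); // → complete
```

Because both functions are pure, the CJS and SDK read-modify-write Adapters can share them without either side owning persistence.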
### Query Execution Policy Module
Module owning query transport routing policy projection (`preferNative`, fallback policy, workstream subprocess forcing) at execution seam.

@@ -37,12 +40,21 @@ Canonical command normalization and resolution Interface (`query-command-resolut
### Command Topology Module
Module owning command resolution, policy projection (`mutation`, `output_mode`), unknown-command diagnosis, and handler Adapter binding at one seam for query dispatch.

### CJS Command Router Adapter Module
Compatibility Adapter Module for `gsd-tools.cjs` command families. Uses generated command metadata plus small argument shapers to route to CJS handlers, rather than calling SDK Command Topology directly. Preserves CJS compatibility startup while reducing hand-written router drift.

### Query Pre-Project Config Policy Module
Module policy that defines query-time behavior when `.planning/config.json` is absent: use built-in defaults for parity-sensitive query Interfaces, and emit parity-aligned empty model ids for pre-project model resolution surfaces.

### Planning Workspace Module
Module owning `.planning` path resolution, active workstream pointer policy (`session-scoped > shared`), pointer self-heal behavior, and planning lock semantics for workstream-aware execution.

### Workstream Inventory Module
Shared CJS/SDK Module owning workstream directory discovery, per-workstream state projection, phase/plan/summary counting, roadmap-declared phase count, active marker projection, and active-workstream collision inputs. Command handlers render list/status/progress outputs from this inventory instead of rescanning `.planning/workstreams/*` directly.

### Planning Path Projection Module
SDK query Module owning projection from project/workstream context to concrete `.planning` paths. Policy precedence is `explicit workstream > env workstream > env project > root`. Invalid workspace context is a validation error at this seam rather than a silent fallback.
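The precedence rule above can be sketched as one pure resolver. This is a hedged illustration: `GSD_WORKSTREAM` is named elsewhere in this document, but `GSD_PROJECT`, the function name, the path layout, and the name-validation regex are assumptions, not the SDK's actual API.

```javascript
// Illustrative sketch of `explicit workstream > env workstream > env project > root`.
// GSD_PROJECT and the name rule are hypothetical; only GSD_WORKSTREAM appears in the source.
function resolvePlanningDir({ explicitWorkstream, env = {}, root = '.' }) {
  const ws = explicitWorkstream || env.GSD_WORKSTREAM || null;
  if (ws) {
    // Invalid workspace context is a validation error, not a silent fallback.
    if (!/^[a-z0-9][a-z0-9-]*$/.test(ws)) throw new Error(`invalid workstream: ${ws}`);
    return `${root}/.planning/workstreams/${ws}`;
  }
  if (env.GSD_PROJECT) return `${env.GSD_PROJECT}/.planning`;
  return `${root}/.planning`;
}

console.log(resolvePlanningDir({ explicitWorkstream: 'auth', env: { GSD_WORKSTREAM: 'other' } }));
// → ./.planning/workstreams/auth  (explicit beats env workstream)
```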

### Worktree Root Resolution Adapter Module
Adapter Module owning linked-worktree root mapping and metadata-prune policy (`git worktree prune` non-destructive default) for planning/workstream callers.

@@ -306,3 +318,170 @@ After stripping prose @-refs, some command `<process>` blocks retained bolded "*
`RELEASE-NOTES.RELEASE-STREAM.dev-branch=canary dist-tag (only); install via @canary`
`RELEASE-NOTES.RELEASE-STREAM.main-branch=next (RCs) + latest (stable); install via @next or @latest`
`RELEASE-NOTES.RELEASE-STREAM.rule=streams do not mix; do not document @canary install in RC notes or @next in canary notes`

---

## Repo-Rule Reinforcement (2026-05-09, machine-oriented)

`META.RULE.canonical-source-precedence=CONTRIBUTING.md > docs/adr/* > CONTEXT.md > agent memory`
`META.RULE.read-contributing-first=read CONTRIBUTING.md sections "Pull Request Guidelines" + "CHANGELOG Entries" before EVERY agent dispatch`
`META.RULE.brief-must-cite-doc=agent prompts MUST quote the canonical doc line being applied; paraphrasing from predicate memory drifts and produces violations`
`META.RULE.brief-no-paraphrase=writing "k040 — never leave changelog box unchecked" caused 5 of 8 agents to edit CHANGELOG.md in violation of CONTRIBUTING.md L110`

`PRED.k320.signal=changelog-direct-edit-forbidden`
`PRED.k320.canonical-source=CONTRIBUTING.md L110-123`
`PRED.k320.rule=do not edit CHANGELOG.md in feature/fix/enhancement PRs`
`PRED.k320.cure=drop .changeset/<adj>-<noun>-<noun>.md fragment ONLY`
`PRED.k320.tool=npm run changeset -- --type <T> --pr <NNN> --body "..."`
`PRED.k320.types=Added|Changed|Deprecated|Removed|Fixed|Security`
`PRED.k320.opt-out-label=no-changelog`
`PRED.k320.ci-enforcement=scripts/changeset/lint.cjs`
`PRED.k320.ci-paths-monitored=bin/ get-shit-done/ agents/ commands/ hooks/ sdk/src/`
`PRED.k320.recovery=open Removed-typed cleanup PR deleting only the redundant row`
`PRED.k320.evidence=PR #3302 merge-conflict against #3308 CHANGELOG.md row 2026-05-09`

`PRED.k321.signal=cr-outside-diff-range-finding`
`PRED.k321.shape=CR posts "[!CAUTION] outside the diff" findings in review BODY, not in reviewThreads`
`PRED.k321.poll-shape=parse pulls/<n>/reviews body AND graphql reviewThreads`
`PRED.k321.resolution=address in code; no GraphQL resolveReviewThread needed for body-only findings`
`PRED.k321.evidence=PRs #3304/#3305 (2026-05-09): real Minor/Major findings in body, 0 threads`

`PRED.k322.signal=cr-sustained-throttle`
`PRED.k322.distinct-from=k080`
`PRED.k322.shape=ack posted, real review never lands within [5s, 410s] cooldown after burst of N PRs <15min`
`PRED.k322.cure-1=2nd retrigger ~10min after first ack`
`PRED.k322.cure-2=if silent at 50min, treat as silent-pass with maintainer flag in merge-commit body`
`PRED.k322.merge-gate-impact=k070 real_coderabbit_review_present unsatisfied; requires maintainer judgment`
`PRED.k322.evidence=PR #3306 (2026-05-09): 0 reviews after 50min + 2 retriggers`

`PRED.k323.signal=sibling-audit-cross-pr-overlap`
`PRED.k323.shape=2+ open issues touch same canonical bug site; each fix's sibling-audit produces overlapping diff`
`PRED.k323.cure-pre-dispatch=brief one agent canonical-owner; brief others to EXCLUDE shared site`
`PRED.k323.cure-alt=consolidate into single PR when 2+ issues share root cause`
`PRED.k323.recovery=close smaller PR as "subsumed by #N" or rebase second to drop overlap hunk`
`PRED.k323.evidence=#3300 (#3297) overlapped #3306 (#3298) on add-backlog.md hunks 2026-05-09`

`PRED.k324.signal=agent-terminates-mid-monitor`
`PRED.k324.k095-restatement=k095 confirmed shape: agent reports "waiting for monitor" / "tests still running" then terminates`
`PRED.k324.cure=verify via gh api on every agent-completion notification; never trust narrative`
`PRED.k324.poll-shape=gh pr view <n> --json mergeStateStatus,statusCheckRollup + pulls/<n>/reviews + graphql reviewThreads + issues/<n>/comments tail`
`PRED.k324.evidence=2026-05-09 session: 5+ mid-monitor terminations across PRs #3232/#3271/#3251/#3255/#3262`

`PRED.k325.signal=worktree-branch-lock-on-force-push`
`PRED.k325.shape=git checkout <branch> errors "already used by worktree at <agent-worktree>"`
`PRED.k325.cure=detached-HEAD: git checkout --detach $(git ls-remote origin <branch>); modify; commit; git push --force-with-lease=<branch>:<remote-sha> origin HEAD:refs/heads/<branch>`
`PRED.k325.cleanup=git worktree remove --force <path> for aged agent worktrees`
`PRED.k325.evidence=2026-05-09 CHANGELOG.md strip on PRs #3300/#3302/#3304/#3305 required detached-HEAD`

`PRED.k326.signal=brief-contradicts-canonical-doc`
`PRED.k326.shape=N parallel agents amplify a single brief-vs-doc contradiction into N violations`
`PRED.k326.cure=quote canonical doc verbatim in brief; mentally simulate "if all N agents follow this brief literally, do they violate any rule?"`
`PRED.k326.evidence=2026-05-09 brief "k040 — update CHANGELOG.md" → 5 of 8 agents violated CONTRIBUTING.md L110`

`PRED.k327.signal=cr-ack-vs-real-review`
`PRED.k327.ack-shape=body "✅ Actions performed - Full review triggered"`
`PRED.k327.real-review-shape=body starts "Actionable comments posted: N" OR "[!CAUTION] Some comments are outside the diff"`
`PRED.k327.distinguish-key=len(pulls/<n>/reviews) — ack=0, real=≥1`
`PRED.k327.cooldown-normal=[5s, 410s]`
`PRED.k327.cooldown-throttled=k322`

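The k327 distinguish-key can be expressed as a small pure classifier. A sketch only: the ack and review body shapes are quoted from the predicate lines above, but the function itself and its return labels are illustrative.

```javascript
// Illustrative classifier for the k327 ack-vs-real-review distinction.
// reviews: the array from pulls/<n>/reviews; body: the latest CR comment body.
function classifyCodeRabbitResponse(reviews, body) {
  // distinguish-key: an ack leaves pulls/<n>/reviews empty, a real review adds >=1 entry
  if (reviews.length >= 1) return 'real-review';
  if (/Full review triggered/.test(body)) return 'ack';
  return 'none';
}

console.log(classifyCodeRabbitResponse([], '✅ Actions performed - Full review triggered')); // → ack
console.log(classifyCodeRabbitResponse([{ id: 1 }], 'Actionable comments posted: 2')); // → real-review
```

Length of the reviews array is the reliable signal; body text alone cannot distinguish a throttled ack (k322) from a review still in cooldown.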
`PRED.k328.signal=pr-template-typed-heading-required`
`PRED.k328.canonical-source=CONTRIBUTING.md L101`
`PRED.k328.k100-restatement=heading must match issue class: bug→## Fix PR, enhancement→## Enhancement PR, feature→## Feature PR`
`PRED.k328.audit-list=[heading-matches-class, closing-keyword-present, changeset-fragment-or-no-changelog-label]`

`PRED.k329.signal=changeset-fragment-canonical-shape`
`PRED.k329.canonical-source=CONTRIBUTING.md L112-117 + .changeset/README.md`
`PRED.k329.filename=.changeset/<adj>-<noun>-<noun>.md`
`PRED.k329.frontmatter=---\\ntype: <Added|Changed|Deprecated|Removed|Fixed|Security>\\npr: <NNN>\\n---`
`PRED.k329.body=**<Bold user-visible change>** — <symptom-led explanation>. (#<NNN>)`
`PRED.k329.observed-clean=#3299 sunny-ibex-wave, #3301 sturdy-rams-caper, #3306 3298-phase-dir-prefix-drift-workflows`

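The k329 fragment shape lends itself to a quick structural check. The real CI gate is `scripts/changeset/lint.cjs` (per k320); this sketch is not that script, it merely mirrors the frontmatter shape and type list documented above.

```javascript
// Illustrative check of the k329 frontmatter shape; the real gate is scripts/changeset/lint.cjs.
const VALID_TYPES = ['Added', 'Changed', 'Deprecated', 'Removed', 'Fixed', 'Security'];

function validateChangesetFragment(text) {
  const m = text.match(/^---\ntype: (.+)\npr: (\d+)\n---\n/);
  if (!m) return 'missing or malformed frontmatter';
  if (!VALID_TYPES.includes(m[1])) return `invalid type: ${m[1]}`;
  return 'ok';
}

// Frontmatter taken from the .changeset/3312-sdk-first-architecture-seams.md fragment above.
const fragment = '---\ntype: Changed\npr: 3312\n---\nTighten SDK-first architecture seams.\n';
console.log(validateChangesetFragment(fragment)); // → ok
```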
`PRED.k330.signal=mempalace-diary-not-callable-by-ai`
`PRED.k330.shape=mempalace MCP tools require explicit user call; AI cannot trigger`
`PRED.k330.fallback=append predicate-format findings directly to CONTEXT.md`

`PRED.k331.signal=close-with-no-comment-is-literal`
`PRED.k331.shape=instruction "close with no comment (rationale)" — parenthetical is rationale, NOT comment body`
`PRED.k331.k101-restatement=k101 includes close-time --comment flag; rationale belongs in subsuming PR's squash-merge body`
`PRED.k331.cure=gh pr close <n> with NO --comment flag`
`PRED.k331.recovery=if violation lands, gh api -X DELETE repos/<o>/<r>/issues/comments/<id>`
`PRED.k331.evidence=2026-05-09 wave-3: violation on #3300 close, deleted within 30s`

`PROC.AGENT-DISPATCH.preflight=[read-CONTRIBUTING.md-fresh, read-relevant-ADRs, cite-specific-line-in-brief, require-closing-keyword, require-changeset-fragment, forbid-CHANGELOG.md-edit, require-isolation-worktree, forbid-self-PR-comment, mandate-trust-but-verify]`
`PROC.AGENT-DISPATCH.parallel-overlap-audit=before dispatching N sibling-audit fixers, compute file-set union and assign canonical owners`
`PROC.AGENT-DISPATCH.completion-verify=run k324.poll-shape on every agent-completion notification`

`PROC.MERGE-WAVE.ordering=[wave1: isolated-files, wave2: CHANGELOG-only-overlap (better: strip per k320), wave3: same-file-overlap with explicit decision]`
`PROC.MERGE-WAVE.preflight=gh pr view <n> --json files for every PR; identify overlap pairs; surface to maintainer`
`PROC.MERGE-WAVE.changelog-strip-pattern=detached-HEAD per k325 + git checkout main -- CHANGELOG.md + commit + force-with-lease`
`PROC.MERGE-WAVE.merge-tool=gh pr merge <n> --squash --delete-branch`
`PROC.MERGE-WAVE.merge-tool-warning=delete-branch may fail with "used by worktree at" — harmless; remote branch still deleted`

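The parallel-overlap-audit step can be sketched as a pairwise file-set intersection over the per-PR file lists that `gh pr view <n> --json files` returns. The function and the sample paths are illustrative, not the actual audit tooling.

```javascript
// Illustrative pre-dispatch overlap audit: report PR pairs that touch shared files,
// so one agent can be briefed as canonical owner of each shared site.
function findOverlaps(prFiles) {
  const overlaps = [];
  const prs = Object.keys(prFiles);
  for (let i = 0; i < prs.length; i++) {
    for (let j = i + 1; j < prs.length; j++) {
      const shared = prFiles[prs[i]].filter(f => prFiles[prs[j]].includes(f));
      if (shared.length) overlaps.push([prs[i], prs[j], shared]);
    }
  }
  return overlaps;
}

// Sample data echoing the k323 evidence; real paths come from `gh pr view <n> --json files`.
const overlaps = findOverlaps({
  '3300': ['commands/add-backlog.md'],
  '3306': ['commands/add-backlog.md', 'commands/import.md'],
  '3299': ['agents/gsd-intel-updater.md'],
});
console.log(JSON.stringify(overlaps));
// → [["3300","3306",["commands/add-backlog.md"]]]
```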
## Triage+Merge Wave Outcome (2026-05-09T15:47Z, machine-oriented)

`WAVE.2026-05-09.scope=trek-e-authored issues, classes=[bug, enhancement, feature]`
`WAVE.2026-05-09.dispatched=8`
`WAVE.2026-05-09.merged=7`
`WAVE.2026-05-09.closed-as-subsumed=1`
`WAVE.2026-05-09.skipped-mvp-epic=[#2826, #2885, #2882, #2879, #2877, #2875]`

`WAVE.PR.3299.issue=3290`
`WAVE.PR.3299.class=bug`
`WAVE.PR.3299.fix=agents/gsd-intel-updater.md layout-detection block gated on framework-repo check`
`WAVE.PR.3299.cr-state=clean (No actionable comments)`
`WAVE.PR.3299.merged=2026-05-09T15:39:16Z`

`WAVE.PR.3301.issue=3232`
`WAVE.PR.3301.class=enhancement`
`WAVE.PR.3301.fix=docs/contributor-standards.md first-cut + CONTRIBUTING.md cross-link + 1 CR thread resolved (MD040)`
`WAVE.PR.3301.cr-state=clean post-fix`
`WAVE.PR.3301.merged=2026-05-09T15:39:24Z`

`WAVE.PR.3308.issue=3262`
`WAVE.PR.3308.class=enhancement`
`WAVE.PR.3308.fix=extract get-shit-done/bin/lib/plan-scan.cjs scanPhasePlans; port 4 call sites in init/state/roadmap/phase`
`WAVE.PR.3308.cr-state=2 reviews real, 1 thread resolved`
`WAVE.PR.3308.merged=2026-05-09T15:39:32Z`
`WAVE.PR.3308.violation=carried redundant CHANGELOG.md row in violation of k320; cleanup task spawned`

`WAVE.PR.3302.issue=3271`
`WAVE.PR.3302.class=enhancement`
`WAVE.PR.3302.fix=docs/adr/0005 + 0006 + README index + tests/enh-3271-sdk-adr-structure.test.cjs`
`WAVE.PR.3302.cr-state=1 review, 1 thread resolved (ADR self-ref test)`
`WAVE.PR.3302.changelog-strip=force-pushed 2026-05-09T15:35Z`
`WAVE.PR.3302.merged=2026-05-09T15:46:28Z`

`WAVE.PR.3304.issue=3255`
`WAVE.PR.3304.class=enhancement`
`WAVE.PR.3304.fix=get-shit-done/bin/gsd-tools.cjs --json-errors flag + GSD_JSON_ERRORS env + docs/json-errors.md taxonomy + usage-string disclosure (CR k321 finding addressed)`
`WAVE.PR.3304.cr-state=1 review (k321 outside-diff finding fixed in code)`
`WAVE.PR.3304.changelog-strip=force-pushed 2026-05-09T15:35Z`
`WAVE.PR.3304.merged=2026-05-09T15:46:35Z`

`WAVE.PR.3305.issue=3251`
`WAVE.PR.3305.class=enhancement`
`WAVE.PR.3305.fix=command-aliases.generated.cjs NON_FAMILY entries (40) + sdk gen-command-aliases.ts typed-export preservation (CR k321 Major finding addressed)`
`WAVE.PR.3305.cr-state=1 review (k321 outside-diff finding fixed in code)`
`WAVE.PR.3305.changelog-strip=force-pushed 2026-05-09T15:35Z`
`WAVE.PR.3305.merged=2026-05-09T15:46:41Z`

`WAVE.PR.3306.issue=3298`
`WAVE.PR.3306.class=bug`
`WAVE.PR.3306.fix=phase-dir prefix drift fixed in 3 sites (add-backlog.md + import.md + plan-milestone-gaps.md) per k015 sibling-audit`
`WAVE.PR.3306.cr-state=k322 sustained-throttle silent pass — 0 reviews after 50min + 2 retriggers, CI green`
`WAVE.PR.3306.subsumes=PR #3300 (#3297 add-backlog dedicated fix)`
`WAVE.PR.3306.merged=2026-05-09T15:47:16Z`

`WAVE.PR.3300.issue=3297`
`WAVE.PR.3300.class=bug`
`WAVE.PR.3300.fix=add-backlog.md project_code prefix (focused #3297 fix)`
`WAVE.PR.3300.outcome=closed-as-subsumed by #3306; issue #3297 manually closed`
`WAVE.PR.3300.k323-evidence=overlapped #3306 add-backlog.md hunks with different prefix idiom`
`WAVE.PR.3300.k331-violation=close-with-comment violation, comment deleted within 30s`

`WAVE.LESSON.changelog-policy-violation-multiplier=brief contradicting CONTRIBUTING.md L110 produced violations on 5 of 8 PRs (#3300, #3302, #3304, #3305, #3308); k326 + k320 capture`
`WAVE.LESSON.cr-throttle-burst-correlation=8 PRs in <15min triggered k322 sustained-throttle on multiple PRs (#3306 worst case)`
`WAVE.LESSON.sibling-audit-overlap=k015-family parallel dispatch on #3297 + #3298 produced k323 add-backlog.md cross-PR overlap`
`WAVE.LESSON.agent-narrative-unreliable=k095/k324 confirmed at scale: 5 of 8 agents terminated mid-monitor with stale claims requiring direct verification`
`WAVE.LESSON.k101-still-trips=even after CONTEXT.md k101 reinforcement, agent of record posted self-PR comment on close; k331 adds explicit close-time literal-instruction guard`

@@ -259,6 +259,7 @@
"active-workstream-store.cjs",
"artifacts.cjs",
"audit.cjs",
"cjs-command-router-adapter.cjs",
"command-aliases.generated.cjs",
"commands.cjs",
"config-schema.cjs",
@@ -294,12 +295,14 @@
"secrets.cjs",
"security.cjs",
"state-command-router.cjs",
"state-document.cjs",
"state.cjs",
"template.cjs",
"uat.cjs",
"validate-command-router.cjs",
"verify-command-router.cjs",
"verify.cjs",
"workstream-inventory.cjs",
"workstream-name-policy.cjs",
"workstream.cjs",
"worktree-safety.cjs"

@@ -358,7 +358,7 @@ The `gsd-planner` agent is decomposed into a core agent plus reference modules t

---

## CLI Modules (47 shipped)
## CLI Modules (50 shipped)

Full listing: `get-shit-done/bin/lib/*.cjs`.

@@ -367,6 +367,7 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
| `active-workstream-store.cjs` | Workstream source precedence and selection (CLI `--ws` > `GSD_WORKSTREAM` env > stored pointer); name validation and environment propagation |
| `artifacts.cjs` | Canonical artifact registry — known `.planning/` root file names; used by `gsd-health` W019 lint |
| `audit.cjs` | Audit dispatch, audit open sessions, audit storage helpers |
| `cjs-command-router-adapter.cjs` | Shared compatibility adapter for manifest-backed CJS command-family routers |
| `command-aliases.generated.cjs` | Generated CJS alias/subcommand metadata for manifest-backed family routers |
| `commands.cjs` | Misc CLI commands (slug, timestamp, todos, scaffolding, stats) |
| `config-schema.cjs` | Single source of truth for `VALID_CONFIG_KEYS` and dynamic key patterns; imported by both the validator and the config-schema-docs parity test |
@@ -391,7 +392,7 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
| `phase-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools phase` |
| `phase.cjs` | Phase directory operations, decimal numbering, plan indexing |
| `phases-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools phases` |
| `plan-scan.cjs` | Canonical phase-plan scanner — shared helper for detecting plan and summary files in flat and nested layouts (k014); consumed by `state.cjs`, `roadmap.cjs`, and `init.cjs` |
| `plan-scan.cjs` | Canonical phase-plan scanner — shared helper for detecting plan and summary files in flat and nested layouts (k014); consumed by state, roadmap, init, and workstream inventory paths |
| `planning-workspace.cjs` | Planning path/workstream seam (`planningDir`, `planningPaths`, active-workstream routing, `.planning/.lock` orchestration) |
| `profile-output.cjs` | Profile rendering, USER-PROFILE.md and dev-preferences.md generation |
| `profile-pipeline.cjs` | User behavioral profiling data pipeline, session file scanning |
@@ -403,11 +404,13 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
| `security.cjs` | Path traversal prevention, prompt injection detection, safe JSON/shell helpers |
| `state-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools state` |
| `state.cjs` | STATE.md parsing, updating, progression, metrics |
| `state-document.cjs` | Pure STATE.md field extraction, replacement, status normalization, and progress calculation transforms |
| `template.cjs` | Template selection and filling with variable substitution |
| `uat.cjs` | UAT file parsing, verification debt tracking, audit-uat support |
| `validate-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools validate` |
| `verify-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools verify` |
| `verify.cjs` | Plan structure, phase completeness, reference, commit validation |
| `workstream-inventory.cjs` | Shared workstream inventory projection: state fields, phase/plan/summary counts, roadmap phase count, and active marker |
| `workstream-name-policy.cjs` | Canonical workstream name validation (`isValidActiveWorkstreamName`) and slug normalization (`toWorkstreamSlug`); shared by all workstream callers |
| `workstream.cjs` | Workstream CRUD, migration, session-scoped active pointer |
| `worktree-safety.cjs` | Worktree-root resolution and non-destructive prune policy decisions; owns W017 health-check logic |

39
get-shit-done/bin/lib/cjs-command-router-adapter.cjs
Normal file
@@ -0,0 +1,39 @@
'use strict';

/**
 * CJS Command Router Adapter Module
 *
 * Compatibility routing for gsd-tools.cjs command families. Uses generated
 * command metadata for availability and small family-local argument shapers for
 * CJS handler calls.
 */

function routeCjsCommandFamily({
  args,
  subcommands,
  handlers,
  defaultSubcommand,
  unsupported = {},
  unknownMessage,
  error,
}) {
  const subcommand = args[1] || defaultSubcommand;

  if (subcommand && unsupported[subcommand]) {
    error(unsupported[subcommand]);
    return;
  }

  const handler = subcommand ? handlers[subcommand] : null;
  if (handler) {
    handler();
    return;
  }

  const available = subcommands.filter(s => !unsupported[s]);
  error(unknownMessage(subcommand, available));
}

module.exports = {
  routeCjsCommandFamily,
};

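A hedged sketch of how a family router might drive the adapter above, using an invented `demo` family: the option names mirror `routeCjsCommandFamily` as shipped, but the handlers, subcommands, and messages are made up. The adapter body is inlined so the sketch runs standalone.

```javascript
// Standalone copy of the adapter above, so this usage sketch is self-contained.
function routeCjsCommandFamily({ args, subcommands, handlers, defaultSubcommand, unsupported = {}, unknownMessage, error }) {
  const subcommand = args[1] || defaultSubcommand;
  if (subcommand && unsupported[subcommand]) { error(unsupported[subcommand]); return; }
  const handler = subcommand ? handlers[subcommand] : null;
  if (handler) { handler(); return; }
  const available = subcommands.filter(s => !unsupported[s]);
  error(unknownMessage(subcommand, available));
}

// Hypothetical "demo" command family; calls are recorded so routing is observable.
const calls = [];
routeCjsCommandFamily({
  args: ['demo', 'list'],
  subcommands: ['list', 'clear', 'archive'],
  defaultSubcommand: 'list',
  unsupported: { archive: 'demo archive is SDK-only.' },
  error: (msg) => calls.push(['error', msg]),
  unknownMessage: (sub, available) => `Unknown demo subcommand. Available: ${available.join(', ')}`,
  handlers: {
    list: () => calls.push(['list']),
    clear: () => calls.push(['clear']),
  },
});

console.log(JSON.stringify(calls)); // → [["list"]]
```

Unsupported subcommands short-circuit to `error` before handler lookup, which is how the shipped routers redirect SDK-only subcommands.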
@@ -1,50 +1,58 @@
'use strict';

const { PHASE_SUBCOMMANDS } = require('./command-aliases.generated.cjs');
const { routeCjsCommandFamily } = require('./cjs-command-router-adapter.cjs');

function routePhaseCommand({ phase, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'next-decimal') {
    phase.cmdPhaseNextDecimal(cwd, args[2], raw);
  } else if (subcommand === 'add') {
    let customId = null;
    const descArgs = [];
    for (let i = 2; i < args.length; i++) {
      if (args[i] === '--id' && i + 1 < args.length) {
        customId = args[i + 1];
        i++;
      } else {
        descArgs.push(args[i]);
      }
    }
    phase.cmdPhaseAdd(cwd, descArgs.join(' '), raw, customId);
  } else if (subcommand === 'add-batch') {
    const descFlagIdx = args.indexOf('--descriptions');
    let descriptions;
    if (descFlagIdx !== -1 && args[descFlagIdx + 1]) {
      try {
        descriptions = JSON.parse(args[descFlagIdx + 1]);
      } catch {
        error('--descriptions must be a JSON array');
      }
    } else {
      descriptions = args.slice(2).filter(a => a !== '--raw');
    }
    phase.cmdPhaseAddBatch(cwd, descriptions, raw);
  } else if (subcommand === 'insert') {
    if (args.includes('--dry-run')) {
      error('phase insert does not support --dry-run');
    }
    phase.cmdPhaseInsert(cwd, args[2], args.slice(3).join(' '), raw);
  } else if (subcommand === 'remove') {
    const forceFlag = args.includes('--force');
    phase.cmdPhaseRemove(cwd, args[2], { force: forceFlag }, raw);
  } else if (subcommand === 'complete') {
    phase.cmdPhaseComplete(cwd, args[2], raw);
  } else {
    error(`Unknown phase subcommand. Available: ${PHASE_SUBCOMMANDS.filter((s) => s !== 'list-plans' && s !== 'list-artifacts' && s !== 'scaffold').join(', ')}`);
  }
  routeCjsCommandFamily({
    args,
    subcommands: PHASE_SUBCOMMANDS,
    unsupported: {
      'list-plans': 'phase list-plans is SDK-only. Use: gsd-sdk query phase.list-plans ...',
      'list-artifacts': 'phase list-artifacts is SDK-only. Use: gsd-sdk query phase.list-artifacts ...',
      scaffold: 'phase scaffold is routed through the top-level scaffold command.',
    },
    error,
    unknownMessage: (_subcommand, available) => `Unknown phase subcommand. Available: ${available.join(', ')}`,
    handlers: {
      'next-decimal': () => phase.cmdPhaseNextDecimal(cwd, args[2], raw),
      add: () => {
        let customId = null;
        const descArgs = [];
        for (let i = 2; i < args.length; i++) {
          if (args[i] === '--id' && i + 1 < args.length) {
            customId = args[i + 1];
            i++;
          } else {
            descArgs.push(args[i]);
          }
        }
        phase.cmdPhaseAdd(cwd, descArgs.join(' '), raw, customId);
      },
      'add-batch': () => {
        const descFlagIdx = args.indexOf('--descriptions');
        let descriptions;
        if (descFlagIdx !== -1 && args[descFlagIdx + 1]) {
          try {
            descriptions = JSON.parse(args[descFlagIdx + 1]);
          } catch {
            error('--descriptions must be a JSON array');
          }
        } else {
          descriptions = args.slice(2).filter(a => a !== '--raw');
        }
        phase.cmdPhaseAddBatch(cwd, descriptions, raw);
      },
      insert: () => {
        if (args.includes('--dry-run')) {
          error('phase insert does not support --dry-run');
        }
        phase.cmdPhaseInsert(cwd, args[2], args.slice(3).join(' '), raw);
      },
      remove: () => phase.cmdPhaseRemove(cwd, args[2], { force: args.includes('--force') }, raw),
      complete: () => phase.cmdPhaseComplete(cwd, args[2], raw),
    },
  });
}

module.exports = {

@@ -1,6 +1,7 @@
'use strict';

const { PHASES_SUBCOMMANDS } = require('./command-aliases.generated.cjs');
const { routeCjsCommandFamily } = require('./cjs-command-router-adapter.cjs');

/**
 * Manifest-backed phases subcommand router.
@@ -12,23 +13,25 @@ const { PHASES_SUBCOMMANDS } = require('./command-aliases.generated.cjs');
 * registry). CJS `gsd-tools phases` intentionally supports list/clear only.
 */
function routePhasesCommand({ phase, milestone, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'list') {
    const typeIndex = args.indexOf('--type');
    const phaseIndex = args.indexOf('--phase');
    const options = {
      type: typeIndex !== -1 ? args[typeIndex + 1] : null,
      phase: phaseIndex !== -1 ? args[phaseIndex + 1] : null,
      includeArchived: args.includes('--include-archived'),
    };
    phase.cmdPhasesList(cwd, options, raw);
  } else if (subcommand === 'clear') {
    milestone.cmdPhasesClear(cwd, raw, args.slice(2));
  } else {
    const cjsSupported = PHASES_SUBCOMMANDS.filter((s) => s !== 'archive');
    error(`Unknown phases subcommand. Available: ${cjsSupported.join(', ')}`);
  }
  routeCjsCommandFamily({
    args,
    subcommands: PHASES_SUBCOMMANDS.filter((s) => s !== 'archive'),
    error,
    unknownMessage: (_subcommand, available) => `Unknown phases subcommand. Available: ${available.join(', ')}`,
    handlers: {
      list: () => {
        const typeIndex = args.indexOf('--type');
        const phaseIndex = args.indexOf('--phase');
        const options = {
          type: typeIndex !== -1 ? args[typeIndex + 1] : null,
          phase: phaseIndex !== -1 ? args[phaseIndex + 1] : null,
          includeArchived: args.includes('--include-archived'),
        };
        phase.cmdPhasesList(cwd, options, raw);
      },
      clear: () => milestone.cmdPhasesClear(cwd, raw, args.slice(2)),
    },
  });
}

module.exports = {

@@ -1,89 +1,98 @@
|
||||
'use strict';
|
||||
|
||||
const { STATE_SUBCOMMANDS } = require('./command-aliases.generated.cjs');
|
||||
const { routeCjsCommandFamily } = require('./cjs-command-router-adapter.cjs');
|
||||
|
||||
/**
|
||||
* Manifest-backed state subcommand router.
|
||||
* Keeps gsd-tools.cjs thin while preserving existing command semantics.
|
||||
*/
|
||||
function routeStateCommand({ state, args, cwd, raw, parseNamedArgs, error }) {
|
||||
const subcommand = args[1];
|
||||
const parsePlans = (plans) => {
|
||||
const parsedPlans = plans == null ? null : Number.parseInt(plans, 10);
|
||||
if (plans != null && Number.isNaN(parsedPlans)) {
|
||||
error('Invalid --plans value. Expected an integer.');
|
||||
return null;
|
||||
}
|
||||
return parsedPlans;
|
||||
};
|
||||
|
||||
if (subcommand === 'json') {
|
||||
state.cmdStateJson(cwd, raw);
|
||||
} else if (subcommand === 'update') {
|
||||
state.cmdStateUpdate(cwd, args[2], args[3]);
|
||||
} else if (subcommand === 'get') {
|
||||
state.cmdStateGet(cwd, args[2], raw);
|
||||
} else if (subcommand === 'patch') {
|
||||
const patches = {};
|
||||
for (let i = 2; i < args.length; i += 2) {
|
||||
const key = args[i].replace(/^--/, '');
|
||||
const value = args[i + 1];
|
||||
if (key && value !== undefined) {
|
||||
patches[key] = value;
|
||||
}
|
||||
}
|
||||
state.cmdStatePatch(cwd, patches, raw);
|
||||
} else if (subcommand === 'advance-plan') {
|
||||
state.cmdStateAdvancePlan(cwd, raw);
|
||||
} else if (subcommand === 'record-metric') {
|
||||
const { phase: p, plan, duration, tasks, files } = parseNamedArgs(args, ['phase', 'plan', 'duration', 'tasks', 'files']);
|
||||
    state.cmdStateRecordMetric(cwd, { phase: p, plan, duration, tasks, files }, raw);
  } else if (subcommand === 'update-progress') {
    state.cmdStateUpdateProgress(cwd, raw);
  } else if (subcommand === 'add-decision') {
    const { phase: p, summary, 'summary-file': summary_file, rationale, 'rationale-file': rationale_file } = parseNamedArgs(args, ['phase', 'summary', 'summary-file', 'rationale', 'rationale-file']);
    state.cmdStateAddDecision(cwd, { phase: p, summary, summary_file, rationale: rationale || '', rationale_file }, raw);
  } else if (subcommand === 'add-blocker') {
    const { text, 'text-file': text_file } = parseNamedArgs(args, ['text', 'text-file']);
    state.cmdStateAddBlocker(cwd, { text, text_file }, raw);
  } else if (subcommand === 'resolve-blocker') {
    state.cmdStateResolveBlocker(cwd, parseNamedArgs(args, ['text']).text, raw);
  } else if (subcommand === 'record-session') {
    const { 'stopped-at': stopped_at, 'resume-file': resume_file } = parseNamedArgs(args, ['stopped-at', 'resume-file']);
    state.cmdStateRecordSession(cwd, { stopped_at, resume_file: resume_file || 'None' }, raw);
  } else if (subcommand === 'begin-phase') {
    const { phase: p, name, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
    const parsedPlans = plans == null ? null : Number.parseInt(plans, 10);
    if (plans != null && Number.isNaN(parsedPlans)) {
      return error('Invalid --plans value. Expected an integer.');
    }
    state.cmdStateBeginPhase(cwd, p, name, parsedPlans, raw);
  } else if (subcommand === 'signal-waiting') {
    const { type, question, options, phase: p } = parseNamedArgs(args, ['type', 'question', 'options', 'phase']);
    state.cmdSignalWaiting(cwd, type, question, options, p, raw);
  } else if (subcommand === 'signal-resume') {
    state.cmdSignalResume(cwd, raw);
  } else if (subcommand === 'planned-phase') {
    const { phase: p, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
    const parsedPlans = plans == null ? null : Number.parseInt(plans, 10);
    if (plans != null && Number.isNaN(parsedPlans)) {
      return error('Invalid --plans value. Expected an integer.');
    }
    state.cmdStatePlannedPhase(cwd, p, parsedPlans, raw);
  } else if (subcommand === 'validate') {
    state.cmdStateValidate(cwd, raw);
  } else if (subcommand === 'sync') {
    const { verify } = parseNamedArgs(args, [], ['verify']);
    state.cmdStateSync(cwd, { verify }, raw);
  } else if (subcommand === 'prune') {
    const { 'keep-recent': keepRecent, 'dry-run': dryRun } = parseNamedArgs(args, ['keep-recent'], ['dry-run']);
    state.cmdStatePrune(cwd, { keepRecent: keepRecent || '3', dryRun: !!dryRun }, raw);
  } else if (subcommand === 'complete-phase') {
    const { phase: p } = parseNamedArgs(args, ['phase']);
    state.cmdStateCompletePhase(cwd, raw, p || args[2]);
  } else if (subcommand === 'milestone-switch') {
    const { milestone, name } = parseNamedArgs(args, ['milestone', 'name']);
    state.cmdStateMilestoneSwitch(cwd, milestone, name, raw);
  } else if (subcommand === 'add-roadmap-evolution') {
    error('state add-roadmap-evolution is SDK-only. Use: gsd-sdk query state.add-roadmap-evolution ...');
  } else if (subcommand === undefined || subcommand === 'load') {
    state.cmdStateLoad(cwd, raw);
  } else {
    const available = ['load', 'complete-phase', ...STATE_SUBCOMMANDS.filter((s) => s !== 'load')];
    error(`Unknown state subcommand: "${subcommand}". Available: ${available.join(', ')}`);
  }
  routeCjsCommandFamily({
    args,
    subcommands: ['load', 'complete-phase', ...STATE_SUBCOMMANDS.filter((s) => s !== 'load')],
    defaultSubcommand: 'load',
    unsupported: {
      'add-roadmap-evolution': 'state add-roadmap-evolution is SDK-only. Use: gsd-sdk query state.add-roadmap-evolution ...',
    },
    error,
    unknownMessage: (subcommand, available) => `Unknown state subcommand: "${subcommand}". Available: ${available.join(', ')}`,
    handlers: {
      load: () => state.cmdStateLoad(cwd, raw),
      json: () => state.cmdStateJson(cwd, raw),
      update: () => state.cmdStateUpdate(cwd, args[2], args[3]),
      get: () => state.cmdStateGet(cwd, args[2], raw),
      patch: () => {
        const patches = {};
        for (let i = 2; i < args.length; i += 2) {
          const key = args[i].replace(/^--/, '');
          const value = args[i + 1];
          if (key && value !== undefined) {
            patches[key] = value;
          }
        }
        state.cmdStatePatch(cwd, patches, raw);
      },
      'advance-plan': () => state.cmdStateAdvancePlan(cwd, raw),
      'record-metric': () => {
        const { phase: p, plan, duration, tasks, files } = parseNamedArgs(args, ['phase', 'plan', 'duration', 'tasks', 'files']);
        state.cmdStateRecordMetric(cwd, { phase: p, plan, duration, tasks, files }, raw);
      },
      'update-progress': () => state.cmdStateUpdateProgress(cwd, raw),
      'add-decision': () => {
        const { phase: p, summary, 'summary-file': summary_file, rationale, 'rationale-file': rationale_file } = parseNamedArgs(args, ['phase', 'summary', 'summary-file', 'rationale', 'rationale-file']);
        state.cmdStateAddDecision(cwd, { phase: p, summary, summary_file, rationale: rationale || '', rationale_file }, raw);
      },
      'add-blocker': () => {
        const { text, 'text-file': text_file } = parseNamedArgs(args, ['text', 'text-file']);
        state.cmdStateAddBlocker(cwd, { text, text_file }, raw);
      },
      'resolve-blocker': () => state.cmdStateResolveBlocker(cwd, parseNamedArgs(args, ['text']).text, raw),
      'record-session': () => {
        const { 'stopped-at': stopped_at, 'resume-file': resume_file } = parseNamedArgs(args, ['stopped-at', 'resume-file']);
        state.cmdStateRecordSession(cwd, { stopped_at, resume_file: resume_file || 'None' }, raw);
      },
      'begin-phase': () => {
        const { phase: p, name, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
        state.cmdStateBeginPhase(cwd, p, name, parsePlans(plans), raw);
      },
      'signal-waiting': () => {
        const { type, question, options, phase: p } = parseNamedArgs(args, ['type', 'question', 'options', 'phase']);
        state.cmdSignalWaiting(cwd, type, question, options, p, raw);
      },
      'signal-resume': () => state.cmdSignalResume(cwd, raw),
      'planned-phase': () => {
        const { phase: p, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
        state.cmdStatePlannedPhase(cwd, p, parsePlans(plans), raw);
      },
      validate: () => state.cmdStateValidate(cwd, raw),
      sync: () => {
        const { verify } = parseNamedArgs(args, [], ['verify']);
        state.cmdStateSync(cwd, { verify }, raw);
      },
      prune: () => {
        const { 'keep-recent': keepRecent, 'dry-run': dryRun } = parseNamedArgs(args, ['keep-recent'], ['dry-run']);
        state.cmdStatePrune(cwd, { keepRecent: keepRecent || '3', dryRun: !!dryRun }, raw);
      },
      'complete-phase': () => {
        const { phase: p } = parseNamedArgs(args, ['phase']);
        state.cmdStateCompletePhase(cwd, raw, p || args[2]);
      },
      'milestone-switch': () => {
        const { milestone, name } = parseNamedArgs(args, ['milestone', 'name']);
        state.cmdStateMilestoneSwitch(cwd, milestone, name, raw);
      },
    },
  });
}
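The table-driven router above replaces the hand-written else-if chain with a shared `routeCjsCommandFamily` helper. The helper's actual implementation lives elsewhere in this PR; a minimal sketch of the dispatch contract it implies (default subcommand, unsupported-command table, unknown-command diagnostic) might look like the following — `routeFamily` and its exact behavior here are illustrative assumptions, not the real helper:

```javascript
// Hypothetical sketch of a routeCjsCommandFamily-style dispatcher.
// Resolves args[1] against a handler map, falling back to a default
// subcommand, short-circuiting on unsupported commands, and emitting
// a diagnostic listing available subcommands on an unknown one.
function routeFamily({ args, handlers, defaultSubcommand, unsupported, unknownMessage, error }) {
  const subcommand = args[1] === undefined ? defaultSubcommand : args[1];
  if (unsupported && unsupported[subcommand]) {
    return error(unsupported[subcommand]);
  }
  const handler = handlers[subcommand];
  if (!handler) {
    return error(unknownMessage(subcommand, Object.keys(handlers)));
  }
  return handler();
}

// Usage with a toy 'state' family: collect outcomes instead of printing.
const calls = [];
const family = {
  handlers: { load: () => calls.push('load') },
  defaultSubcommand: 'load',
  unsupported: { 'add-roadmap-evolution': 'SDK-only' },
  unknownMessage: (s, avail) => `Unknown state subcommand: "${s}". Available: ${avail.join(', ')}`,
  error: (msg) => calls.push(msg),
};

routeFamily({ args: ['state'], ...family });          // falls back to 'load'
routeFamily({ args: ['state', 'bogus'], ...family }); // unknown-command diagnostic
```

The design benefit over the else-if chain: the subcommand list, the unsupported table, and the unknown-command message all come from the same data, so generated command metadata can drive them without drift.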

module.exports = {
122
get-shit-done/bin/lib/state-document.cjs
Normal file
@@ -0,0 +1,122 @@
'use strict';

/**
 * STATE.md Document Module
 *
 * Pure transforms for STATE.md text. This module does not read the filesystem
 * and does not own persistence or locking.
 */

function escapeRegex(str) {
  return String(str).replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

function stateExtractField(content, fieldName) {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
  const boldMatch = content.match(boldPattern);
  if (boldMatch) return boldMatch[1].trim();
  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
  const plainMatch = content.match(plainPattern);
  return plainMatch ? plainMatch[1].trim() : null;
}

function stateReplaceField(content, fieldName, newValue) {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i');
  if (boldPattern.test(content)) {
    return content.replace(boldPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  const plainPattern = new RegExp(`(^${escaped}:\\s*)(.*)`, 'im');
  if (plainPattern.test(content)) {
    return content.replace(plainPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  return null;
}

function stateReplaceFieldWithFallback(content, primary, fallback, value) {
  let result = stateReplaceField(content, primary, value);
  if (result) return result;
  if (fallback) {
    result = stateReplaceField(content, fallback, value);
    if (result) return result;
  }
  return content;
}

function normalizeStateStatus(status, pausedAt) {
  let normalizedStatus = status || 'unknown';
  const statusLower = (status || '').toLowerCase();
  if (statusLower.includes('paused') || statusLower.includes('stopped') || pausedAt) {
    normalizedStatus = 'paused';
  } else if (statusLower.includes('executing') || statusLower.includes('in progress')) {
    normalizedStatus = 'executing';
  } else if (statusLower.includes('planning') || statusLower.includes('ready to plan')) {
    normalizedStatus = 'planning';
  } else if (statusLower.includes('discussing')) {
    normalizedStatus = 'discussing';
  } else if (statusLower.includes('verif')) {
    normalizedStatus = 'verifying';
  } else if (statusLower.includes('complete') || statusLower.includes('done')) {
    normalizedStatus = 'completed';
  } else if (statusLower.includes('ready to execute')) {
    normalizedStatus = 'executing';
  }
  return normalizedStatus;
}

function computeProgressPercent(completedPlans, totalPlans, completedPhases, totalPhases) {
  const hasPlanData = totalPlans !== null && totalPlans > 0 && completedPlans !== null;
  const hasPhaseData = totalPhases !== null && totalPhases > 0 && completedPhases !== null;

  if (!hasPlanData && !hasPhaseData) return null;

  const planFraction = hasPlanData ? completedPlans / totalPlans : 1;
  const phaseFraction = hasPhaseData ? completedPhases / totalPhases : 1;

  return Math.min(100, Math.round(Math.min(planFraction, phaseFraction) * 100));
}

function toFiniteNumber(value) {
  const number = Number(value);
  return Number.isFinite(number) ? number : null;
}

function existingProgressExceedsDerived(existingProgress, derivedProgress, key) {
  const existing = toFiniteNumber(existingProgress[key]);
  const derived = toFiniteNumber(derivedProgress[key]);
  return existing !== null && derived !== null && existing > derived;
}

function shouldPreserveExistingProgress(existingProgress, derivedProgress) {
  if (!existingProgress || typeof existingProgress !== 'object') return false;
  if (!derivedProgress || typeof derivedProgress !== 'object') return false;

  return (
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'total_phases') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'completed_phases') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'total_plans') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'completed_plans')
  );
}

function normalizeProgressNumbers(progress) {
  if (!progress || typeof progress !== 'object') return progress;

  const normalized = { ...progress };
  for (const key of ['total_phases', 'completed_phases', 'total_plans', 'completed_plans', 'percent']) {
    const number = toFiniteNumber(normalized[key]);
    if (number !== null) normalized[key] = number;
  }
  return normalized;
}

module.exports = {
  computeProgressPercent,
  normalizeProgressNumbers,
  normalizeStateStatus,
  shouldPreserveExistingProgress,
  stateExtractField,
  stateReplaceField,
  stateReplaceFieldWithFallback,
};
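A quick illustration of the field transforms in this module, with the functions reproduced verbatim from the file above. The sample STATE.md snippet is made up; note how the `[ \t]`-only gap after `:` keeps the bare YAML key `progress:` from matching as a `Progress` field:

```javascript
function escapeRegex(str) {
  return String(str).replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// Reproduced from state-document.cjs: bold **Field:** form wins over plain Field:.
function stateExtractField(content, fieldName) {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
  const boldMatch = content.match(boldPattern);
  if (boldMatch) return boldMatch[1].trim();
  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
  const plainMatch = content.match(plainPattern);
  return plainMatch ? plainMatch[1].trim() : null;
}

// Reproduced from state-document.cjs: returns the rewritten document, or null
// when the field is absent so callers can distinguish "updated" from "missed".
function stateReplaceField(content, fieldName, newValue) {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i');
  if (boldPattern.test(content)) {
    return content.replace(boldPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  const plainPattern = new RegExp(`(^${escaped}:\\s*)(.*)`, 'im');
  if (plainPattern.test(content)) {
    return content.replace(plainPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  return null;
}

const doc = '**Status:** Executing Phase 2\nprogress:\n  percent: 40';

const status = stateExtractField(doc, 'Status');     // 'Executing Phase 2'
const progress = stateExtractField(doc, 'Progress'); // null: YAML key ignored

const paused = stateReplaceField(doc, 'Status', 'Paused');
// paused now starts with '**Status:** Paused'
```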
@@ -8,38 +8,20 @@ const { escapeRegex, loadConfig, getMilestoneInfo, getMilestonePhaseFilter, norm
const { planningDir, planningPaths } = require('./planning-workspace.cjs');
const { extractFrontmatter, reconstructFrontmatter } = require('./frontmatter.cjs');
const scanPhasePlans = require('./plan-scan.cjs');
const {
  computeProgressPercent,
  normalizeProgressNumbers,
  normalizeStateStatus,
  shouldPreserveExistingProgress,
  stateExtractField,
  stateReplaceField,
} = require('./state-document.cjs');

// Cache disk scan results from buildStateFrontmatter per cwd per process (#1967).
// Avoids re-reading N+1 directories on every state write when the phase structure
// hasn't changed within the same gsd-tools invocation.
const _diskScanCache = new Map();

/**
 * Compute the canonical progress percent for STATE.md frontmatter and body.
 *
 * Uses min(plan_fraction, phase_fraction) when both denominators are > 0.
 * This prevents a false "100%" when ROADMAP declares future phases that have no
 * disk dirs yet — all plans summarised only means 100% of *realized* work done,
 * not 100% of the declared milestone (#3242 Bug B).
 *
 * @param {number|null} completedPlans
 * @param {number|null} totalPlans
 * @param {number|null} completedPhases
 * @param {number|null} totalPhases - ROADMAP-declared count (>= realised dirs)
 * @returns {number|null} 0–100, or null when there is no data
 */
function computeProgressPercent(completedPlans, totalPlans, completedPhases, totalPhases) {
  const hasPlanData = totalPlans !== null && totalPlans > 0 && completedPlans !== null;
  const hasPhaseData = totalPhases !== null && totalPhases > 0 && completedPhases !== null;

  if (!hasPlanData && !hasPhaseData) return null;

  const planFraction = hasPlanData ? completedPlans / totalPlans : 1;
  const phaseFraction = hasPhaseData ? completedPhases / totalPhases : 1;

  return Math.min(100, Math.round(Math.min(planFraction, phaseFraction) * 100));
}

/** Shorthand — every state command needs this path */
function getStatePath(cwd) {
  return planningPaths(cwd).state;
@@ -55,19 +37,6 @@ process.on('exit', () => {
  }
});

// Shared helper: extract a field value from STATE.md content.
// Supports both **Field:** bold and plain Field: format.
// Horizontal whitespace only after ':' so YAML keys like `progress:` do not match as `Progress:` (parity with sdk/helpers stateExtractField).
function stateExtractField(content, fieldName) {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
  const boldMatch = content.match(boldPattern);
  if (boldMatch) return boldMatch[1].trim();
  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
  const plainMatch = content.match(plainPattern);
  return plainMatch ? plainMatch[1].trim() : null;
}

function cmdStateLoad(cwd, raw) {
  const config = loadConfig(cwd);
  const planDir = planningPaths(cwd).planning;
@@ -190,16 +159,9 @@ function cmdStatePatch(cwd, patches, raw) {
  // Use atomic read-modify-write to prevent lost updates from concurrent agents
  readModifyWriteStateMd(statePath, (content) => {
    for (const [field, value] of Object.entries(patches)) {
      const fieldEscaped = escapeRegex(field);
      // Try **Field:** bold format first, then plain Field: format
      const boldPattern = new RegExp(`(\\*\\*${fieldEscaped}:\\*\\*\\s*)(.*)`, 'i');
      const plainPattern = new RegExp(`(^${fieldEscaped}:\\s*)(.*)`, 'im');

      if (boldPattern.test(content)) {
        content = content.replace(boldPattern, (_match, prefix) => `${prefix}${value}`);
        results.updated.push(field);
      } else if (plainPattern.test(content)) {
        content = content.replace(plainPattern, (_match, prefix) => `${prefix}${value}`);
      const result = stateReplaceField(content, field, value);
      if (result) {
        content = result;
        results.updated.push(field);
      } else {
        results.failed.push(field);
@@ -233,16 +195,10 @@ function cmdStateUpdate(cwd, field, value) {
  // Triggering syncStateFrontmatter would rebuild progress.* from disk, trampling
  // manually-curated cross-milestone counters stored in the frontmatter (#3242 Bug A).
  readModifyWriteStateMd(statePath, (content) => {
    const fieldEscaped = escapeRegex(field);
    // Try **Field:** bold format first, then plain Field: format
    const boldPattern = new RegExp(`(\\*\\*${fieldEscaped}:\\*\\*\\s*)(.*)`, 'i');
    const plainPattern = new RegExp(`(^${fieldEscaped}:\\s*)(.*)`, 'im');
    if (boldPattern.test(content)) {
    const result = stateReplaceField(content, field, value);
    if (result) {
      updated = true;
      return content.replace(boldPattern, (_match, prefix) => `${prefix}${value}`);
    } else if (plainPattern.test(content)) {
      updated = true;
      return content.replace(plainPattern, (_match, prefix) => `${prefix}${value}`);
      return result;
    }
    return content;
  }, cwd, { resync: false });
@@ -257,21 +213,6 @@ function cmdStateUpdate(cwd, field, value) {
}

// ─── State Progression Engine ────────────────────────────────────────────────
// stateExtractField is defined above (shared helper) — do not duplicate.

function stateReplaceField(content, fieldName, newValue) {
  const escaped = escapeRegex(fieldName);
  // Try **Field:** bold format first, then plain Field: format
  const boldPattern = new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i');
  if (boldPattern.test(content)) {
    return content.replace(boldPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  const plainPattern = new RegExp(`(^${escaped}:\\s*)(.*)`, 'im');
  if (plainPattern.test(content)) {
    return content.replace(plainPattern, (_match, prefix) => `${prefix}${newValue}`);
  }
  return null;
}

/**
 * Replace a STATE.md field with fallback field name support.
@@ -909,24 +850,7 @@ function buildStateFrontmatter(bodyContent, cwd) {
    if (pctMatch) progressPercent = parseInt(pctMatch[1], 10);
  }

  // Normalize status to one of: planning, discussing, executing, verifying, paused, completed, unknown
  let normalizedStatus = status || 'unknown';
  const statusLower = (status || '').toLowerCase();
  if (statusLower.includes('paused') || statusLower.includes('stopped') || pausedAt) {
    normalizedStatus = 'paused';
  } else if (statusLower.includes('executing') || statusLower.includes('in progress')) {
    normalizedStatus = 'executing';
  } else if (statusLower.includes('planning') || statusLower.includes('ready to plan')) {
    normalizedStatus = 'planning';
  } else if (statusLower.includes('discussing')) {
    normalizedStatus = 'discussing';
  } else if (statusLower.includes('verif')) {
    normalizedStatus = 'verifying';
  } else if (statusLower.includes('complete') || statusLower.includes('done')) {
    normalizedStatus = 'completed';
  } else if (statusLower.includes('ready to execute')) {
    normalizedStatus = 'executing';
  }
  const normalizedStatus = normalizeStateStatus(status, pausedAt);

  const fm = { gsd_state_version: '1.0' };

@@ -1126,6 +1050,12 @@ function cmdStateJson(cwd, raw) {
  if (built.status === 'unknown' && existingFm && existingFm.status && existingFm.status !== 'unknown') {
    built.status = existingFm.status;
  }
  // Preserve curated cross-milestone aggregates when local disk scanning sees
  // only a narrower realized subset (#3242 Bug A). Stale lower counters still
  // rebuild from disk because they do not exceed the derived scan.
  if (existingFm && shouldPreserveExistingProgress(existingFm.progress, built.progress)) {
    built.progress = normalizeProgressNumbers(existingFm.progress);
  }

  output(built, raw, JSON.stringify(built, null, 2));
}

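The two progress policies above can be exercised together. The functions are reproduced verbatim from state-document.cjs in this PR; the scenario numbers are illustrative:

```javascript
// min(plan_fraction, phase_fraction): a false 100% is impossible while
// ROADMAP-declared phases are still unrealized on disk (#3242 Bug B).
function computeProgressPercent(completedPlans, totalPlans, completedPhases, totalPhases) {
  const hasPlanData = totalPlans !== null && totalPlans > 0 && completedPlans !== null;
  const hasPhaseData = totalPhases !== null && totalPhases > 0 && completedPhases !== null;
  if (!hasPlanData && !hasPhaseData) return null;
  const planFraction = hasPlanData ? completedPlans / totalPlans : 1;
  const phaseFraction = hasPhaseData ? completedPhases / totalPhases : 1;
  return Math.min(100, Math.round(Math.min(planFraction, phaseFraction) * 100));
}

function toFiniteNumber(value) {
  const number = Number(value);
  return Number.isFinite(number) ? number : null;
}

function existingProgressExceedsDerived(existingProgress, derivedProgress, key) {
  const existing = toFiniteNumber(existingProgress[key]);
  const derived = toFiniteNumber(derivedProgress[key]);
  return existing !== null && derived !== null && existing > derived;
}

// Curated frontmatter survives only when it is *wider* than the disk scan.
function shouldPreserveExistingProgress(existingProgress, derivedProgress) {
  if (!existingProgress || typeof existingProgress !== 'object') return false;
  if (!derivedProgress || typeof derivedProgress !== 'object') return false;
  return (
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'total_phases') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'completed_phases') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'total_plans') ||
    existingProgressExceedsDerived(existingProgress, derivedProgress, 'completed_plans')
  );
}

// All 4 realized plans summarized, but only 2 of 5 declared phases exist on
// disk: progress is held at 40%, not a false 100%.
const pct = computeProgressPercent(4, 4, 2, 5); // 40

// Curated cross-milestone aggregate (12 phases) beats a narrower scan (5)...
const keepCurated = shouldPreserveExistingProgress({ total_phases: 12 }, { total_phases: 5 }); // true
// ...while a stale *lower* counter rebuilds from the derived scan.
const keepStale = shouldPreserveExistingProgress({ total_phases: 3 }, { total_phases: 5 }); // false
```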
159
get-shit-done/bin/lib/workstream-inventory.cjs
Normal file
@@ -0,0 +1,159 @@
'use strict';

/**
 * Workstream Inventory Module
 *
 * Owns discovery and read-only projection of .planning/workstreams/* state.
 * Command handlers should render outputs from this inventory instead of
 * rescanning workstream directories directly.
 */

const fs = require('fs');
const path = require('path');
const { toPosixPath, readSubdirectories } = require('./core.cjs');
const scanPhasePlans = require('./plan-scan.cjs');
const { planningPaths, planningRoot, getActiveWorkstream } = require('./planning-workspace.cjs');
const { stateExtractField } = require('./state-document.cjs');

function workstreamsRoot(cwd) {
  return path.join(planningRoot(cwd), 'workstreams');
}

function countRoadmapPhases(roadmapPath, fallbackCount) {
  try {
    const roadmapContent = fs.readFileSync(roadmapPath, 'utf-8');
    const matches = roadmapContent.match(/^#{2,4}\s+Phase\s+[\w][\w.-]*/gm);
    return matches ? matches.length : fallbackCount;
  } catch {
    return fallbackCount;
  }
}

function countPhaseFiles(phaseDir) {
  const scan = scanPhasePlans(phaseDir);
  return { planCount: scan.planCount, summaryCount: scan.summaryCount };
}

function readStateProjection(statePath) {
  try {
    const stateContent = fs.readFileSync(statePath, 'utf-8');
    return {
      status: stateExtractField(stateContent, 'Status') || 'unknown',
      current_phase: stateExtractField(stateContent, 'Current Phase'),
      last_activity: stateExtractField(stateContent, 'Last Activity'),
    };
  } catch {
    return {
      status: 'unknown',
      current_phase: null,
      last_activity: null,
    };
  }
}

function inspectWorkstream(cwd, name, options = {}) {
  const wsDir = path.join(workstreamsRoot(cwd), name);
  if (!fs.existsSync(wsDir)) return null;

  const active = options.active === undefined ? getActiveWorkstream(cwd) : options.active;
  const p = planningPaths(cwd, name);
  const phaseDirs = readSubdirectories(p.phases);
  const phases = [];
  let completedPhases = 0;
  let totalPlans = 0;
  let completedPlans = 0;

  for (const dir of phaseDirs.sort()) {
    const counts = countPhaseFiles(path.join(p.phases, dir));
    const status = counts.summaryCount >= counts.planCount && counts.planCount > 0
      ? 'complete'
      : counts.planCount > 0
        ? 'in_progress'
        : 'pending';

    totalPlans += counts.planCount;
    completedPlans += Math.min(counts.summaryCount, counts.planCount);
    if (status === 'complete') completedPhases++;

    phases.push({
      directory: dir,
      status,
      plan_count: counts.planCount,
      summary_count: counts.summaryCount,
    });
  }

  const roadmapPhaseCount = countRoadmapPhases(p.roadmap, phaseDirs.length);
  const state = readStateProjection(p.state);

  return {
    name,
    path: toPosixPath(path.relative(cwd, wsDir)),
    active: name === active,
    files: {
      roadmap: fs.existsSync(p.roadmap),
      state: fs.existsSync(p.state),
      requirements: fs.existsSync(p.requirements),
    },
    status: state.status,
    current_phase: state.current_phase,
    last_activity: state.last_activity,
    phases,
    phase_count: phases.length,
    completed_phases: completedPhases,
    roadmap_phase_count: roadmapPhaseCount,
    total_plans: totalPlans,
    completed_plans: completedPlans,
    progress_percent: roadmapPhaseCount > 0 ? Math.round((completedPhases / roadmapPhaseCount) * 100) : 0,
  };
}

function listWorkstreamInventories(cwd) {
  const wsRoot = workstreamsRoot(cwd);
  if (!fs.existsSync(wsRoot)) {
    return {
      mode: 'flat',
      active: null,
      workstreams: [],
      count: 0,
      message: 'No workstreams — operating in flat mode',
    };
  }

  const active = getActiveWorkstream(cwd);
  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
  const workstreams = [];
  for (const entry of entries) {
    if (!entry.isDirectory()) continue;
    const inventory = inspectWorkstream(cwd, entry.name, { active });
    if (inventory) workstreams.push(inventory);
  }

  return {
    mode: 'workstream',
    active,
    workstreams,
    count: workstreams.length,
  };
}

function isCompletedInventory(inventory) {
  const status = String(inventory && inventory.status ? inventory.status : '').toLowerCase();
  return status.includes('milestone complete') || status.includes('archived');
}

function getOtherActiveWorkstreamInventories(cwd, excludeWs) {
  return listWorkstreamInventories(cwd).workstreams
    .filter(inventory => inventory.name !== excludeWs)
    .filter(inventory => !isCompletedInventory(inventory));
}

module.exports = {
  countPhaseFiles,
  countRoadmapPhases,
  getOtherActiveWorkstreamInventories,
  inspectWorkstream,
  isCompletedInventory,
  listWorkstreamInventories,
  workstreamsRoot,
};
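The heading pattern in `countRoadmapPhases` is worth a quick check against an in-memory ROADMAP string. It accepts `##` through `####` headings and word-character phase ids with dots or hyphens, noticeably broader than the `/^###?\s+Phase\s+\d/gm` pattern it replaces in workstream.cjs below. The sample roadmap here is made up:

```javascript
// Same regex as countRoadmapPhases above, applied to a sample string.
const PHASE_HEADING = /^#{2,4}\s+Phase\s+[\w][\w.-]*/gm;

const roadmap = [
  '# Milestone 1',                                      // top-level heading: not a phase
  '## Phase 1: Setup',                                  // counts
  '### Phase 2.1: Core parsing',                        // dotted id counts
  'Phase 3 is mentioned inline, so it does not count.', // no heading marker
  '#### Phase 4-b: Cleanup',                            // 4-deep heading, hyphenated id
].join('\n');

const matches = roadmap.match(PHASE_HEADING) || [];
console.log(matches.length); // 3
```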
@@ -10,10 +10,14 @@
|
||||
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const { output, error, toPosixPath, getMilestoneInfo, generateSlugInternal, filterPlanFiles, filterSummaryFiles, readSubdirectories } = require('./core.cjs');
|
||||
const { planningPaths, planningRoot, setActiveWorkstream, getActiveWorkstream } = require('./planning-workspace.cjs');
|
||||
const { stateExtractField } = require('./state.cjs');
|
||||
const { output, error, toPosixPath, getMilestoneInfo, generateSlugInternal } = require('./core.cjs');
|
||||
const { planningRoot, setActiveWorkstream, getActiveWorkstream } = require('./planning-workspace.cjs');
|
||||
const { toWorkstreamSlug, hasInvalidPathSegment, isValidActiveWorkstreamName } = require('./workstream-name-policy.cjs');
|
||||
const {
|
||||
getOtherActiveWorkstreamInventories,
|
||||
inspectWorkstream,
|
||||
listWorkstreamInventories,
|
||||
} = require('./workstream-inventory.cjs');
|
||||
|
||||
// ─── Migration ──────────────────────────────────────────────────────────────
|
||||
|
||||
@@ -179,52 +183,22 @@ function cmdWorkstreamCreate(cwd, name, options, raw) {
|
||||
}
|
||||
|
||||
function cmdWorkstreamList(cwd, raw) {
|
||||
const wsRoot = path.join(planningRoot(cwd), 'workstreams');
|
||||
|
||||
if (!fs.existsSync(wsRoot)) {
|
||||
output({ mode: 'flat', workstreams: [], message: 'No workstreams — operating in flat mode' }, raw);
|
||||
const inventory = listWorkstreamInventories(cwd);
|
||||
if (inventory.mode === 'flat') {
|
||||
output({ mode: 'flat', workstreams: [], message: inventory.message }, raw);
|
||||
return;
|
||||
}
|
||||
|
||||
const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
|
||||
const workstreams = [];
|
||||
|
||||
for (const entry of entries) {
|
||||
if (!entry.isDirectory()) continue;
|
||||
|
||||
const wsDir = path.join(wsRoot, entry.name);
|
||||
const phasesDir = path.join(wsDir, 'phases');
|
||||
|
||||
const phaseDirs = readSubdirectories(phasesDir);
|
||||
const phaseCount = phaseDirs.length;
|
||||
let completedCount = 0;
|
||||
for (const d of phaseDirs) {
|
||||
try {
|
||||
const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
|
||||
const plans = filterPlanFiles(phaseFiles);
|
||||
const summaries = filterSummaryFiles(phaseFiles);
|
||||
if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
|
||||
} catch {}
|
||||
}
|
||||
|
||||
let status = 'unknown', currentPhase = null;
|
||||
try {
|
||||
const stateContent = fs.readFileSync(path.join(wsDir, 'STATE.md'), 'utf-8');
|
||||
status = stateExtractField(stateContent, 'Status') || 'unknown';
|
||||
currentPhase = stateExtractField(stateContent, 'Current Phase');
|
||||
} catch {}
|
||||
|
||||
workstreams.push({
|
||||
name: entry.name,
|
||||
path: toPosixPath(path.relative(cwd, wsDir)),
|
||||
has_roadmap: fs.existsSync(path.join(wsDir, 'ROADMAP.md')),
|
||||
has_state: fs.existsSync(path.join(wsDir, 'STATE.md')),
|
||||
status,
|
||||
current_phase: currentPhase,
|
||||
phase_count: phaseCount,
|
||||
completed_phases: completedCount,
|
||||
});
|
||||
}
|
||||
const workstreams = inventory.workstreams.map(ws => ({
|
||||
name: ws.name,
|
||||
path: ws.path,
|
||||
has_roadmap: ws.files.roadmap,
|
||||
has_state: ws.files.state,
|
||||
status: ws.status,
|
||||
current_phase: ws.current_phase,
|
||||
phase_count: ws.phase_count,
|
||||
completed_phases: ws.completed_phases,
|
||||
}));
|
||||
|
||||
output({ mode: 'workstream', workstreams, count: workstreams.length }, raw);
|
||||
}
|
||||
@@ -239,50 +213,19 @@ function cmdWorkstreamStatus(cwd, name, raw) {
|
||||
return;
|
||||
}
|
||||
|
||||
const p = planningPaths(cwd, name);
|
||||
const relPath = toPosixPath(path.relative(cwd, wsDir));
|
||||
|
||||
const files = {
|
||||
roadmap: fs.existsSync(p.roadmap),
|
||||
state: fs.existsSync(p.state),
|
||||
requirements: fs.existsSync(p.requirements),
|
||||
};
|
||||
|
||||
const phases = [];
|
||||
for (const dir of readSubdirectories(p.phases).sort()) {
|
||||
try {
|
||||
const phaseFiles = fs.readdirSync(path.join(p.phases, dir));
|
||||
const plans = filterPlanFiles(phaseFiles);
|
||||
const summaries = filterSummaryFiles(phaseFiles);
|
||||
phases.push({
|
||||
directory: dir,
|
||||
status: summaries.length >= plans.length && plans.length > 0 ? 'complete' :
|
||||
plans.length > 0 ? 'in_progress' : 'pending',
|
||||
plan_count: plans.length,
|
||||
summary_count: summaries.length,
|
||||
});
|
||||
} catch {}
|
||||
}
|
||||
|
||||
let stateInfo = {};
|
||||
try {
|
||||
const stateContent = fs.readFileSync(p.state, 'utf-8');
|
||||
stateInfo = {
|
||||
status: stateExtractField(stateContent, 'Status') || 'unknown',
|
||||
current_phase: stateExtractField(stateContent, 'Current Phase'),
|
||||
last_activity: stateExtractField(stateContent, 'Last Activity'),
|
||||
};
|
||||
} catch {}
|
||||
const inventory = inspectWorkstream(cwd, name);
|
||||
|
||||
output({
|
||||
found: true,
|
||||
workstream: name,
|
||||
path: relPath,
|
||||
files,
|
||||
phases,
|
||||
phase_count: phases.length,
|
||||
completed_phases: phases.filter(ph => ph.status === 'complete').length,
|
||||
...stateInfo,
|
||||
path: inventory.path,
|
||||
files: inventory.files,
|
||||
phases: inventory.phases,
|
||||
phase_count: inventory.phase_count,
|
||||
completed_phases: inventory.completed_phases,
|
||||
status: inventory.status,
|
||||
current_phase: inventory.current_phase,
|
||||
last_activity: inventory.last_activity,
|
||||
}, raw);
|
||||
}
|
||||
|
||||
@@ -381,64 +324,23 @@ function cmdWorkstreamGet(cwd, raw) {
|
||||
}
|
||||
|
||||
 function cmdWorkstreamProgress(cwd, raw) {
-  const root = planningRoot(cwd);
-  const wsRoot = path.join(root, 'workstreams');
-
-  if (!fs.existsSync(wsRoot)) {
-    output({ mode: 'flat', workstreams: [], message: 'No workstreams — operating in flat mode' }, raw);
+  const inventory = listWorkstreamInventories(cwd);
+  if (inventory.mode === 'flat') {
+    output({ mode: 'flat', workstreams: [], message: inventory.message }, raw);
     return;
   }

-  const active = getActiveWorkstream(cwd);
-  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
-  const workstreams = [];
-
-  for (const entry of entries) {
-    if (!entry.isDirectory()) continue;
-
-    const wsDir = path.join(wsRoot, entry.name);
-    const phasesDir = path.join(wsDir, 'phases');
-
-    const phaseDirsProgress = readSubdirectories(phasesDir);
-    const phaseCount = phaseDirsProgress.length;
-    let completedCount = 0, totalPlans = 0, completedPlans = 0;
-    for (const d of phaseDirsProgress) {
-      try {
-        const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
-        const plans = filterPlanFiles(phaseFiles);
-        const summaries = filterSummaryFiles(phaseFiles);
-        totalPlans += plans.length;
-        completedPlans += Math.min(summaries.length, plans.length);
-        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
-      } catch {}
-    }
-
-    let roadmapPhaseCount = phaseCount;
-    try {
-      const roadmapContent = fs.readFileSync(path.join(wsDir, 'ROADMAP.md'), 'utf-8');
-      const phaseMatches = roadmapContent.match(/^###?\s+Phase\s+\d/gm);
-      if (phaseMatches) roadmapPhaseCount = phaseMatches.length;
-    } catch {}
-
-    let status = 'unknown', currentPhase = null;
-    try {
-      const stateContent = fs.readFileSync(path.join(wsDir, 'STATE.md'), 'utf-8');
-      status = stateExtractField(stateContent, 'Status') || 'unknown';
-      currentPhase = stateExtractField(stateContent, 'Current Phase');
-    } catch {}
-
-    workstreams.push({
-      name: entry.name,
-      active: entry.name === active,
-      status,
-      current_phase: currentPhase,
-      phases: `${completedCount}/${roadmapPhaseCount}`,
-      plans: `${completedPlans}/${totalPlans}`,
-      progress_percent: roadmapPhaseCount > 0 ? Math.round((completedCount / roadmapPhaseCount) * 100) : 0,
-    });
-  }
-
-  output({ mode: 'workstream', active, workstreams, count: workstreams.length }, raw);
+  const workstreams = inventory.workstreams.map(ws => ({
+    name: ws.name,
+    active: ws.active,
+    status: ws.status,
+    current_phase: ws.current_phase,
+    phases: `${ws.completed_phases}/${ws.roadmap_phase_count}`,
+    plans: `${ws.completed_plans}/${ws.total_plans}`,
+    progress_percent: ws.progress_percent,
+  }));
+
+  output({ mode: 'workstream', active: inventory.active, workstreams, count: workstreams.length }, raw);
 }
 // ─── Collision Detection ────────────────────────────────────────────────────

@@ -449,47 +351,12 @@ function cmdWorkstreamProgress(cwd, raw) {
  * when a workstream finishes its last phase.
  */
 function getOtherActiveWorkstreams(cwd, excludeWs) {
-  const wsRoot = path.join(planningRoot(cwd), 'workstreams');
-  if (!fs.existsSync(wsRoot)) return [];
-
-  const entries = fs.readdirSync(wsRoot, { withFileTypes: true });
-  const others = [];
-
-  for (const entry of entries) {
-    if (!entry.isDirectory() || entry.name === excludeWs) continue;
-
-    const wsDir = path.join(wsRoot, entry.name);
-    const statePath = path.join(wsDir, 'STATE.md');
-
-    let status = 'unknown', currentPhase = null;
-    try {
-      const content = fs.readFileSync(statePath, 'utf-8');
-      status = stateExtractField(content, 'Status') || 'unknown';
-      currentPhase = stateExtractField(content, 'Current Phase');
-    } catch {}
-
-    if (status.toLowerCase().includes('milestone complete') ||
-        status.toLowerCase().includes('archived')) {
-      continue;
-    }
-
-    const phasesDir = path.join(wsDir, 'phases');
-    const phaseDirsOther = readSubdirectories(phasesDir);
-    const phaseCount = phaseDirsOther.length;
-    let completedCount = 0;
-    for (const d of phaseDirsOther) {
-      try {
-        const phaseFiles = fs.readdirSync(path.join(phasesDir, d));
-        const plans = filterPlanFiles(phaseFiles);
-        const summaries = filterSummaryFiles(phaseFiles);
-        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
-      } catch {}
-    }
-
-    others.push({ name: entry.name, status, current_phase: currentPhase, phases: `${completedCount}/${phaseCount}` });
-  }
-
-  return others;
+  return getOtherActiveWorkstreamInventories(cwd, excludeWs).map(ws => ({
+    name: ws.name,
+    status: ws.status,
+    current_phase: ws.current_phase,
+    phases: `${ws.completed_phases}/${ws.phase_count}`,
+  }));
 }

 module.exports = {
1991 package-lock.json generated — file diff suppressed because it is too large
@@ -25,6 +25,7 @@ import { GSDError, ErrorClassification } from '../errors.js';
 export { SUPPORTED_RUNTIMES, type Runtime } from '../model-catalog.js';
 import { SUPPORTED_RUNTIMES, type Runtime } from '../model-catalog.js';
 import { workspacePlanningPaths, resolveWorkspaceContext, type PlanningPaths } from './workspace.js';
+export { stateExtractField } from './state-document.js';
 import { relPlanningPath, validateWorkstreamName } from '../workstream-utils.js';

 // ─── Runtime-aware agents directory resolution ─────────────────────────────

@@ -317,29 +318,6 @@ export function toPosixPath(p: string): string {
   return p.split('\\').join('/');
 }

-// ─── stateExtractField ──────────────────────────────────────────────────────
-
-/**
- * Extract a field value from STATE.md content.
- *
- * Supports both **bold:** and plain: formats, case-insensitive.
- *
- * @param content - STATE.md content string
- * @param fieldName - Field name to extract
- * @returns The field value, or null if not found
- */
-export function stateExtractField(content: string, fieldName: string): string | null {
-  const escaped = escapeRegex(fieldName);
-  // Horizontal whitespace only after ':' so YAML blocks like `progress:\n total:` do not
-  // match as `Progress:` with a multi-line "value" (parity with STATE.md body fields).
-  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
-  const boldMatch = content.match(boldPattern);
-  if (boldMatch) return boldMatch[1].trim();
-  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
-  const plainMatch = content.match(plainPattern);
-  return plainMatch ? plainMatch[1].trim() : null;
-}
-
 // ─── normalizeMd ───────────────────────────────────────────────────────────

 /**
@@ -30,7 +30,6 @@ import {
   phaseTokenMatches,
   toPosixPath,
   planningPaths,
-  stateExtractField,
 } from './helpers.js';
 import { extractFrontmatter } from './frontmatter.js';
 import { extractCurrentMilestone } from './roadmap.js';

@@ -41,6 +40,7 @@ import {
   releaseStateLock,
   stateReplaceField,
 } from './state-mutation.js';
+import { stateExtractField, stateReplaceFieldWithFallback } from './state-document.js';
 import type { QueryHandler } from './utils.js';
 import {
   assertNoNullBytes,

@@ -959,29 +959,6 @@ export const phaseRemove: QueryHandler = async (args, projectDir, workstream) =>
   };
 };

-// ─── stateReplaceFieldWithFallback (inline) ────────────────────────────────
-
-/**
- * Replace a field with fallback field name support.
- *
- * Tries primary first, then fallback. Returns content unchanged if neither matches.
- * Reimplemented here because state-mutation.ts keeps it module-private.
- */
-function stateReplaceFieldWithFallback(
-  content: string,
-  primary: string,
-  fallback: string | null,
-  value: string,
-): string {
-  let result = stateReplaceField(content, primary, value);
-  if (result) return result;
-  if (fallback) {
-    result = stateReplaceField(content, fallback, value);
-    if (result) return result;
-  }
-  return content;
-}
-
 // ─── updatePerformanceMetricsSection ───────────────────────────────────────

 /**
35 sdk/src/query/plan-scan.test.ts Normal file
@@ -0,0 +1,35 @@
import { describe, expect, it } from 'vitest';
import { mkdtemp, mkdir, rm, writeFile } from 'node:fs/promises';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

import { scanPhasePlans } from './plan-scan.js';

describe('scanPhasePlans', () => {
  it('counts flat and nested plan files while excluding derivative files', async () => {
    const tmpDir = await mkdtemp(join(tmpdir(), 'gsd-plan-scan-'));
    try {
      const phaseDir = join(tmpDir, 'phases', '1');
      const nestedDir = join(phaseDir, 'plans');
      await mkdir(nestedDir, { recursive: true });
      await writeFile(join(phaseDir, '01-01-PLAN.md'), '# Plan');
      await writeFile(join(phaseDir, '01-01-SUMMARY.md'), '# Summary');
      await writeFile(join(phaseDir, '01-01-PLAN-OUTLINE.md'), '# Outline');
      await writeFile(join(nestedDir, 'PLAN-02-next.md'), '# Plan');
      await writeFile(join(nestedDir, 'SUMMARY-02-next.md'), '# Summary');
      await writeFile(join(nestedDir, 'PLAN-03-draft.pre-bounce.md'), '# Draft');
      await writeFile(join(nestedDir, 'PLAN-04-OUTLINE.md'), '# Outline');

      expect(scanPhasePlans(phaseDir)).toMatchObject({
        planCount: 2,
        summaryCount: 2,
        completed: true,
        hasNestedPlans: true,
        planFiles: ['01-01-PLAN.md', 'PLAN-02-next.md'],
        summaryFiles: ['01-01-SUMMARY.md', 'SUMMARY-02-next.md'],
      });
    } finally {
      await rm(tmpDir, { recursive: true, force: true });
    }
  });
});
82 sdk/src/query/plan-scan.ts Normal file
@@ -0,0 +1,82 @@
import { existsSync, readdirSync } from 'node:fs';
import { join } from 'node:path';

const PLAN_OUTLINE_RE = /-OUTLINE\.md$/i;
const PLAN_PRE_BOUNCE_RE = /\.pre-bounce\.md$/i;

export interface PhasePlanScan {
  planCount: number;
  summaryCount: number;
  completed: boolean;
  hasNestedPlans: boolean;
  planFiles: string[];
  summaryFiles: string[];
}

export function isRootPlanFile(fileName: string): boolean {
  if (PLAN_OUTLINE_RE.test(fileName)) return false;
  if (PLAN_PRE_BOUNCE_RE.test(fileName)) return false;
  if (fileName.endsWith('-PLAN.md') || fileName === 'PLAN.md') return true;
  return /\.md$/i.test(fileName) && /PLAN/i.test(fileName);
}

export function isNestedPlanFile(fileName: string): boolean {
  if (PLAN_OUTLINE_RE.test(fileName)) return false;
  if (PLAN_PRE_BOUNCE_RE.test(fileName)) return false;
  return /^PLAN-\d+.*\.md$/i.test(fileName) || /-PLAN-\d+.*\.md$/i.test(fileName);
}

export function isRootSummaryFile(fileName: string): boolean {
  return fileName.endsWith('-SUMMARY.md') || fileName === 'SUMMARY.md';
}

export function isNestedSummaryFile(fileName: string): boolean {
  return /^SUMMARY-\d+.*\.md$/i.test(fileName) || /-SUMMARY-\d+.*\.md$/i.test(fileName);
}

export function scanPhasePlans(phaseDir: string): PhasePlanScan {
  let rootFiles: string[];
  try {
    rootFiles = readdirSync(phaseDir);
  } catch {
    return {
      planCount: 0,
      summaryCount: 0,
      completed: false,
      hasNestedPlans: false,
      planFiles: [],
      summaryFiles: [],
    };
  }

  const rootPlanFiles = rootFiles.filter(isRootPlanFile);
  const rootSummaryFiles = rootFiles.filter(isRootSummaryFile);

  let nestedPlanFiles: string[] = [];
  let nestedSummaryFiles: string[] = [];
  let hasNestedPlans = false;

  const nestedDir = join(phaseDir, 'plans');
  if (existsSync(nestedDir)) {
    try {
      const nestedFiles = readdirSync(nestedDir);
      nestedPlanFiles = nestedFiles.filter(isNestedPlanFile);
      nestedSummaryFiles = nestedFiles.filter(isNestedSummaryFile);
      hasNestedPlans = nestedPlanFiles.length > 0;
    } catch { /* ignore unreadable nested layout */ }
  }

  const planFiles = rootPlanFiles.concat(nestedPlanFiles);
  const summaryFiles = rootSummaryFiles.concat(nestedSummaryFiles);
  const planCount = planFiles.length;
  const summaryCount = summaryFiles.length;

  return {
    planCount,
    summaryCount,
    completed: planCount > 0 && summaryCount >= planCount,
    hasNestedPlans,
    planFiles,
    summaryFiles,
  };
}
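The plan/summary classifiers above are what keep outlines and `.pre-bounce` drafts out of the progress counts. A standalone restatement (copied from the `plan-scan.ts` definitions above, for illustration only) with the same filenames the test exercises:

```typescript
// Restated from plan-scan.ts above: derivative files (outlines, pre-bounce
// drafts) are excluded before any positive plan-name match is attempted.
const PLAN_OUTLINE_RE = /-OUTLINE\.md$/i;
const PLAN_PRE_BOUNCE_RE = /\.pre-bounce\.md$/i;

function isRootPlanFile(fileName: string): boolean {
  if (PLAN_OUTLINE_RE.test(fileName)) return false;
  if (PLAN_PRE_BOUNCE_RE.test(fileName)) return false;
  if (fileName.endsWith('-PLAN.md') || fileName === 'PLAN.md') return true;
  return /\.md$/i.test(fileName) && /PLAN/i.test(fileName);
}

function isNestedPlanFile(fileName: string): boolean {
  if (PLAN_OUTLINE_RE.test(fileName)) return false;
  if (PLAN_PRE_BOUNCE_RE.test(fileName)) return false;
  return /^PLAN-\d+.*\.md$/i.test(fileName) || /-PLAN-\d+.*\.md$/i.test(fileName);
}

console.log(isRootPlanFile('01-01-PLAN.md'));                 // true — flat plan
console.log(isRootPlanFile('01-01-PLAN-OUTLINE.md'));         // false — outline excluded
console.log(isNestedPlanFile('PLAN-02-next.md'));             // true — nested plan
console.log(isNestedPlanFile('PLAN-03-draft.pre-bounce.md')); // false — draft excluded
```

This matches the `planCount: 2` expectation in `plan-scan.test.ts`: only `01-01-PLAN.md` and `PLAN-02-next.md` survive the filters.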
129 sdk/src/query/state-document.ts Normal file
@@ -0,0 +1,129 @@
/**
 * STATE.md Document Module.
 *
 * Pure transforms for STATE.md text. This module does not read the filesystem
 * and does not own persistence or locking.
 */

function escapeRegex(str: string): string {
  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

export function stateExtractField(content: string, fieldName: string): string | null {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
  const boldMatch = content.match(boldPattern);
  if (boldMatch) return boldMatch[1].trim();
  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
  const plainMatch = content.match(plainPattern);
  return plainMatch ? plainMatch[1].trim() : null;
}

export function stateReplaceField(content: string, fieldName: string, newValue: string): string | null {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i');
  if (boldPattern.test(content)) {
    return content.replace(boldPattern, (_match, prefix: string) => `${prefix}${newValue}`);
  }
  const plainPattern = new RegExp(`(^${escaped}:\\s*)(.*)`, 'im');
  if (plainPattern.test(content)) {
    return content.replace(plainPattern, (_match, prefix: string) => `${prefix}${newValue}`);
  }
  return null;
}

export function stateReplaceFieldWithFallback(
  content: string,
  primary: string,
  fallback: string | null,
  value: string,
): string {
  let result = stateReplaceField(content, primary, value);
  if (result) return result;
  if (fallback) {
    result = stateReplaceField(content, fallback, value);
    if (result) return result;
  }
  return content;
}

export function normalizeStateStatus(status: string | null | undefined, pausedAt?: string | null): string {
  let normalizedStatus = status || 'unknown';
  const statusLower = (status || '').toLowerCase();
  if (statusLower.includes('paused') || statusLower.includes('stopped') || pausedAt) {
    normalizedStatus = 'paused';
  } else if (statusLower.includes('executing') || statusLower.includes('in progress')) {
    normalizedStatus = 'executing';
  } else if (statusLower.includes('planning') || statusLower.includes('ready to plan')) {
    normalizedStatus = 'planning';
  } else if (statusLower.includes('discussing')) {
    normalizedStatus = 'discussing';
  } else if (statusLower.includes('verif')) {
    normalizedStatus = 'verifying';
  } else if (statusLower.includes('complete') || statusLower.includes('done')) {
    normalizedStatus = 'completed';
  } else if (statusLower.includes('ready to execute')) {
    normalizedStatus = 'executing';
  }
  return normalizedStatus;
}

export function computeProgressPercent(
  completedPlans: number | null,
  totalPlans: number | null,
  completedPhases: number | null,
  totalPhases: number | null,
): number | null {
  const hasPlanData = totalPlans !== null && totalPlans > 0 && completedPlans !== null;
  const hasPhaseData = totalPhases !== null && totalPhases > 0 && completedPhases !== null;

  if (!hasPlanData && !hasPhaseData) return null;

  const planFraction = hasPlanData ? completedPlans / totalPlans : 1;
  const phaseFraction = hasPhaseData ? completedPhases / totalPhases : 1;

  return Math.min(100, Math.round(Math.min(planFraction, phaseFraction) * 100));
}

function toFiniteNumber(value: unknown): number | null {
  const number = Number(value);
  return Number.isFinite(number) ? number : null;
}

function existingProgressExceedsDerived(
  existingProgress: Record<string, unknown>,
  derivedProgress: Record<string, unknown>,
  key: string,
): boolean {
  const existing = toFiniteNumber(existingProgress[key]);
  const derived = toFiniteNumber(derivedProgress[key]);
  return existing !== null && derived !== null && existing > derived;
}

export function shouldPreserveExistingProgress(
  existingProgress: unknown,
  derivedProgress: unknown,
): existingProgress is Record<string, unknown> {
  if (!existingProgress || typeof existingProgress !== 'object') return false;
  if (!derivedProgress || typeof derivedProgress !== 'object') return false;

  const existing = existingProgress as Record<string, unknown>;
  const derived = derivedProgress as Record<string, unknown>;
  return (
    existingProgressExceedsDerived(existing, derived, 'total_phases') ||
    existingProgressExceedsDerived(existing, derived, 'completed_phases') ||
    existingProgressExceedsDerived(existing, derived, 'total_plans') ||
    existingProgressExceedsDerived(existing, derived, 'completed_plans')
  );
}

export function normalizeProgressNumbers(progress: unknown): unknown {
  if (!progress || typeof progress !== 'object') return progress;

  const normalized: Record<string, unknown> = { ...(progress as Record<string, unknown>) };
  for (const key of ['total_phases', 'completed_phases', 'total_plans', 'completed_plans', 'percent']) {
    const number = toFiniteNumber(normalized[key]);
    if (number !== null) normalized[key] = number;
  }
  return normalized;
}
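The new percent rule takes the minimum of the plan-completion fraction and the phase-completion fraction, which is why the `stateJson` tests further down move from 57% to 33%. A standalone restatement (copied from `computeProgressPercent` above, with explicit `as number` casts added for narrowing in a strict standalone compile) exercising the cases the tests use:

```typescript
// Restated from state-document.ts above: percent is min(plan fraction, phase
// fraction), so plan-only progress cannot overstate overall milestone progress.
function computeProgressPercent(
  completedPlans: number | null,
  totalPlans: number | null,
  completedPhases: number | null,
  totalPhases: number | null,
): number | null {
  const hasPlanData = totalPlans !== null && totalPlans > 0 && completedPlans !== null;
  const hasPhaseData = totalPhases !== null && totalPhases > 0 && completedPhases !== null;
  if (!hasPlanData && !hasPhaseData) return null;
  const planFraction = hasPlanData ? (completedPlans as number) / (totalPlans as number) : 1;
  const phaseFraction = hasPhaseData ? (completedPhases as number) / (totalPhases as number) : 1;
  return Math.min(100, Math.round(Math.min(planFraction, phaseFraction) * 100));
}

console.log(computeProgressPercent(4, 7, 1, 3));             // 33 — plans at 57%, but phases cap at 1/3
console.log(computeProgressPercent(22, 22, 6, 12));          // 50 — all plans done, 6/12 phases
console.log(computeProgressPercent(null, null, null, null)); // null — no data either way
```

The second case mirrors the curated `total_phases: 12 / completed_plans: 22` frontmatter in the tests: 22/22 plans alone would read 100%, but the phase fraction holds it at 50%.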
@@ -244,6 +244,48 @@
     expect(data.updated).toBe(false);
   });

+  it('preserves curated progress frontmatter during body-only updates', async () => {
+    const stateContent = `---
+gsd_state_version: 1.0
+milestone: v3.0
+milestone_name: SDK-First Migration
+status: executing
+progress:
+  total_phases: 12
+  completed_phases: 6
+  total_plans: 22
+  completed_plans: 22
+  percent: 50
+---
+
+# Project State
+
+## Current Position
+
+Status: Executing
+Last Activity: 2026-01-01
+Progress: [█████░░░░░] 50%
+`;
+    await setupTestProject(tmpDir, stateContent);
+
+    const { stateUpdate } = await import('./state-mutation.js');
+    const { stateJson } = await import('./state.js');
+
+    const result = await stateUpdate(['Last Activity', '2026-05-07'], tmpDir);
+    expect((result.data as Record<string, unknown>).updated).toBe(true);
+
+    const loaded = await stateJson([], tmpDir);
+    const progress = (loaded.data as Record<string, unknown>).progress as Record<string, unknown>;
+    expect(progress.total_phases).toBe(12);
+    expect(progress.completed_phases).toBe(6);
+    expect(progress.total_plans).toBe(22);
+    expect(progress.completed_plans).toBe(22);
+    expect(progress.percent).toBe(50);
+
+    const after = await readFile(join(tmpDir, '.planning', 'STATE.md'), 'utf-8');
+    expect(after).toContain('Last Activity: 2026-05-07');
+  });
+
   it('throws on missing args', async () => {
     const { stateUpdate } = await import('./state-mutation.js');
@@ -28,14 +28,13 @@ import { extractFrontmatter, stripFrontmatter } from './frontmatter.js';
 import { reconstructFrontmatter, spliceFrontmatter } from './frontmatter-mutation.js';
 import {
   comparePhaseNum,
-  escapeRegex,
   normalizePhaseName,
   phaseTokenMatches,
   planningPaths,
   normalizeMd,
-  stateExtractField,
 } from './helpers.js';
 import { buildStateFrontmatter, getMilestonePhaseFilter } from './state.js';
+import { stateExtractField, stateReplaceField, stateReplaceFieldWithFallback } from './state-document.js';
 import type { QueryHandler } from './utils.js';

 // ─── Process exit lock cleanup (D2 — match CJS state.cjs:16-23) ─────────

@@ -52,48 +51,7 @@ process.on('exit', () => {
   }
 });

-// ─── stateReplaceField ────────────────────────────────────────────────────
-
-/**
- * Replace a field value in STATE.md content.
- *
- * Uses separate regex instances (no g flag) to avoid lastIndex persistence.
- * Supports both **bold:** and plain: formats.
- *
- * @param content - STATE.md content
- * @param fieldName - Field name to replace
- * @param newValue - New value to set
- * @returns Updated content, or null if field not found
- */
-export function stateReplaceField(content: string, fieldName: string, newValue: string): string | null {
-  const escaped = escapeRegex(fieldName);
-  // Try **Field:** bold format first
-  const boldPattern = new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i');
-  if (boldPattern.test(content)) {
-    return content.replace(new RegExp(`(\\*\\*${escaped}:\\*\\*\\s*)(.*)`, 'i'), (_match, prefix: string) => `${prefix}${newValue}`);
-  }
-  // Try plain Field: format
-  const plainPattern = new RegExp(`(^${escaped}:\\s*)(.*)`, 'im');
-  if (plainPattern.test(content)) {
-    return content.replace(new RegExp(`(^${escaped}:\\s*)(.*)`, 'im'), (_match, prefix: string) => `${prefix}${newValue}`);
-  }
-  return null;
-}
-
-/**
- * Replace a field with fallback field name support.
- *
- * Tries primary first, then fallback. Returns content unchanged if neither matches.
- */
-function stateReplaceFieldWithFallback(content: string, primary: string, fallback: string | null, value: string): string {
-  let result = stateReplaceField(content, primary, value);
-  if (result) return result;
-  if (fallback) {
-    result = stateReplaceField(content, fallback, value);
-    if (result) return result;
-  }
-  return content;
-}
+export { stateReplaceField };

 /**
  * Update fields within the ## Current Position section.

@@ -234,10 +192,10 @@ export async function releaseStateLock(lockPath: string): Promise<void> {
  * Strips existing frontmatter, rebuilds from body + disk, and splices back.
  * Preserves existing status when body-derived status is 'unknown'.
  */
-async function syncStateFrontmatter(content: string, projectDir: string): Promise<string> {
+async function syncStateFrontmatter(content: string, projectDir: string, workstream?: string): Promise<string> {
   const existingFm = extractFrontmatter(content);
   const body = stripFrontmatter(content);
-  const derivedFm = await buildStateFrontmatter(body, projectDir);
+  const derivedFm = await buildStateFrontmatter(body, projectDir, workstream);

   // Preserve existing status when body-derived is 'unknown'
   if (derivedFm.status === 'unknown' && existingFm.status && existingFm.status !== 'unknown') {

@@ -261,8 +219,10 @@ async function readModifyWriteStateMd(
   projectDir: string,
   modifier: (content: string) => string | Promise<string>,
   workstream?: string,
+  options: { resync?: boolean } = {},
 ): Promise<string> {
   const statePath = planningPaths(projectDir, workstream).state;
+  const resync = options.resync !== false;
   const lockPath = await acquireStateLock(statePath);
   try {
     let content: string;

@@ -274,9 +234,16 @@ async function readModifyWriteStateMd(
     // Strip frontmatter before passing to modifier so that regex replacements
     // operate on body fields only (not on YAML frontmatter keys like 'status:').
     // syncStateFrontmatter rebuilds frontmatter from the modified body + disk.
+    const preFm = extractFrontmatter(content);
     const body = stripFrontmatter(content);
     const modified = await modifier(body);
-    const synced = await syncStateFrontmatter(modified, projectDir);
+    let synced = await syncStateFrontmatter(modified, projectDir, workstream);
+    if (!resync && preFm && preFm.progress) {
+      const postFm = extractFrontmatter(synced);
+      postFm.progress = preFm.progress;
+      const yamlStr = reconstructFrontmatter(postFm);
+      synced = `---\n${yamlStr}\n---\n\n${stripFrontmatter(synced)}`;
+    }
     const normalized = normalizeMd(synced);
     await writeFile(statePath, normalized, 'utf-8');
     return normalized;

@@ -339,7 +306,7 @@ export const stateUpdate: QueryHandler = async (args, projectDir, workstream) =>
       return result;
     }
     return content;
-  }, workstream);
+  }, workstream, { resync: false });

   return { data: { updated } };
 };

@@ -1707,4 +1674,4 @@ export const statePrune: QueryHandler = async (args, projectDir, workstream) =>
       archive_file: totalPruned > 0 ? 'STATE-ARCHIVE.md' : null,
     },
   };
-};
+};
@@ -155,8 +155,36 @@ describe('stateJson', () => {
     expect(progress.completed_plans).toBe(4);
     // Phase 09 complete (3/3), phase 10 incomplete (1/3), phase 11 incomplete (0/1)
     expect(progress.completed_phases).toBe(1);
-    // 4/7 = 57%
-    expect(progress.percent).toBe(57);
+    // min(plan fraction 4/7, phase fraction 1/3) = 33%
+    expect(progress.percent).toBe(33);
   });

+  it('preserves wider curated progress when disk scan only sees a realized subset', async () => {
+    const stateContent = `---
+gsd_state_version: 1.0
+milestone: v3.0
+milestone_name: SDK-First Migration
+status: executing
+progress:
+  total_phases: 12
+  completed_phases: 6
+  total_plans: 22
+  completed_plans: 22
+  percent: 50
+---
+
+${STATE_BODY}`;
+    await writeFile(join(tmpDir, '.planning', 'STATE.md'), stateContent);
+
+    const result = await stateJson([], tmpDir);
+    const data = result.data as Record<string, unknown>;
+    const progress = data.progress as Record<string, unknown>;
+
+    expect(progress.total_phases).toBe(12);
+    expect(progress.completed_phases).toBe(6);
+    expect(progress.total_plans).toBe(22);
+    expect(progress.completed_plans).toBe(22);
+    expect(progress.percent).toBe(50);
+  });
+
   it('preserves stopped_at from existing frontmatter', async () => {

@@ -232,8 +260,8 @@ Progress: [░░░░░░░░░░] 0%
   const data = result.data as Record<string, unknown>;
   const progress = data.progress as Record<string, unknown>;

-  // Disk should override the body's 0%
-  expect(progress.percent).toBe(57);
+  // Disk should override the body's 0%; phase fraction caps plan-only progress.
+  expect(progress.percent).toBe(33);
 });
});
@@ -23,7 +23,14 @@
 import { readFile, readdir } from 'node:fs/promises';
 import { join } from 'node:path';
 import { extractFrontmatter, stripFrontmatter } from './frontmatter.js';
-import { stateExtractField, planningPaths, escapeRegex } from './helpers.js';
+import { planningPaths, escapeRegex } from './helpers.js';
+import {
+  computeProgressPercent,
+  normalizeProgressNumbers,
+  normalizeStateStatus,
+  shouldPreserveExistingProgress,
+  stateExtractField,
+} from './state-document.js';
 import { getMilestoneInfo, extractCurrentMilestone } from './roadmap.js';
 import type { QueryHandler } from './utils.js';

@@ -146,32 +153,14 @@ export async function buildStateFrontmatter(bodyContent: string, projectDir: str
   } catch { /* intentionally empty */ }

   // Derive percent from disk counts (ground truth)
-  let progressPercent: number | null = null;
-  if (totalPlans !== null && totalPlans > 0 && completedPlans !== null) {
-    progressPercent = Math.min(100, Math.round(completedPlans / totalPlans * 100));
-  } else if (progressRaw) {
+  let progressPercent = computeProgressPercent(completedPlans, totalPlans, completedPhases, totalPhases);
+  if (progressPercent === null && progressRaw) {
     const pctMatch = progressRaw.match(/(\d+)%/);
     if (pctMatch) progressPercent = parseInt(pctMatch[1], 10);
   }

   // Normalize status
-  let normalizedStatus = status || 'unknown';
-  const statusLower = (status || '').toLowerCase();
-  if (statusLower.includes('paused') || statusLower.includes('stopped') || pausedAt) {
-    normalizedStatus = 'paused';
-  } else if (statusLower.includes('executing') || statusLower.includes('in progress')) {
-    normalizedStatus = 'executing';
-  } else if (statusLower.includes('planning') || statusLower.includes('ready to plan')) {
-    normalizedStatus = 'planning';
-  } else if (statusLower.includes('discussing')) {
-    normalizedStatus = 'discussing';
-  } else if (statusLower.includes('verif')) {
-    normalizedStatus = 'verifying';
-  } else if (statusLower.includes('complete') || statusLower.includes('done')) {
-    normalizedStatus = 'completed';
-  } else if (statusLower.includes('ready to execute')) {
-    normalizedStatus = 'executing';
-  }
+  let normalizedStatus = normalizeStateStatus(status, pausedAt);

   // Bug #2613: status preservation — if body has no Status field and existing
   // frontmatter has a non-unknown status, prefer existing.

@@ -210,7 +199,7 @@ export async function buildStateFrontmatter(bodyContent: string, projectDir: str
   const derivedCompletedPlans = Number(progress.completed_plans ?? 0);
   const existingTotalPlans = Number(existingProgress.total_plans ?? 0);
   if (derivedTotalPlans === 0 && derivedCompletedPlans === 0 && existingTotalPlans > 0) {
-    fm.progress = existingProgress;
+    fm.progress = normalizeProgressNumbers(existingProgress);
   }
 }

@@ -259,6 +248,12 @@ export const stateJson: QueryHandler = async (_args, projectDir, workstream) =>
   if (built.status === 'unknown' && existingFm && existingFm.status && existingFm.status !== 'unknown') {
     built.status = existingFm.status;
   }
+  // Read-side projection: preserve curated cross-milestone aggregates when the
+  // disk scan sees only a narrower realized subset (#3242 Bug A). Mutation sync
+  // remains disk-authoritative when it sees non-zero counts.
+  if (existingFm && shouldPreserveExistingProgress(existingFm.progress, built.progress)) {
+    built.progress = normalizeProgressNumbers(existingFm.progress);
+  }

   return { data: built };
 };
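Both the CJS and SDK paths now route STATE.md field reads through `stateExtractField`, which accepts `**bold:**` and plain `Field:` lines case-insensitively but allows only horizontal whitespace after the colon, so a YAML block key such as `progress:` with its value on following lines is never misread as a body field. A standalone restatement (copied from the `state-document.ts` definitions above, for illustration):

```typescript
// Restated from state-document.ts above: bold and plain formats both match,
// case-insensitively; `[ \t]*` after ':' keeps YAML block keys from matching.
function escapeRegex(str: string): string {
  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

function stateExtractField(content: string, fieldName: string): string | null {
  const escaped = escapeRegex(fieldName);
  const boldPattern = new RegExp(`\\*\\*${escaped}:\\*\\*[ \\t]*(.+)`, 'i');
  const boldMatch = content.match(boldPattern);
  if (boldMatch) return boldMatch[1].trim();
  const plainPattern = new RegExp(`^${escaped}:[ \\t]*(.+)`, 'im');
  const plainMatch = content.match(plainPattern);
  return plainMatch ? plainMatch[1].trim() : null;
}

console.log(stateExtractField('**Status:** Executing', 'Status'));            // Executing
console.log(stateExtractField('Current Phase: 3', 'current phase'));          // 3 (case-insensitive)
console.log(stateExtractField('progress:\n  total_plans: 22', 'Progress'));   // null (YAML block key skipped)
```

The third case is exactly the parity guard the removed `helpers.ts` copy documented: `progress:` followed by a newline has no same-line value, so it returns null instead of swallowing the indented YAML as a multi-line value.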
195 sdk/src/query/workstream-inventory.ts Normal file
@@ -0,0 +1,195 @@
|
||||
/**
 * Workstream Inventory Module.
 *
 * Owns discovery and read-only projection of .planning/workstreams/* state.
 * Query handlers should render outputs from this inventory instead of
 * rescanning workstream directories directly.
 */

import { existsSync, readdirSync, readFileSync } from 'node:fs';
import { join, relative } from 'node:path';

import { toPosixPath } from './helpers.js';
import { scanPhasePlans } from './plan-scan.js';
import { stateExtractField } from './state-document.js';
import { readActiveWorkstream } from './active-workstream-store.js';

export interface WorkstreamPhaseInventory {
  directory: string;
  status: 'complete' | 'in_progress' | 'pending';
  plan_count: number;
  summary_count: number;
}

export interface WorkstreamInventory {
  name: string;
  path: string;
  active: boolean;
  files: {
    roadmap: boolean;
    state: boolean;
    requirements: boolean;
  };
  status: string;
  current_phase: string | null;
  last_activity: string | null;
  phases: WorkstreamPhaseInventory[];
  phase_count: number;
  completed_phases: number;
  roadmap_phase_count: number;
  total_plans: number;
  completed_plans: number;
  progress_percent: number;
}

export interface WorkstreamInventoryList {
  mode: 'flat' | 'workstream';
  active: string | null;
  workstreams: WorkstreamInventory[];
  count: number;
  message?: string;
}

export const planningRoot = (projectDir: string): string =>
  join(projectDir, '.planning');

export const workstreamsRoot = (projectDir: string): string =>
  join(planningRoot(projectDir), 'workstreams');

function wsPlanningPaths(projectDir: string, name: string) {
  const base = join(planningRoot(projectDir), 'workstreams', name);
  return {
    state: join(base, 'STATE.md'),
    roadmap: join(base, 'ROADMAP.md'),
    phases: join(base, 'phases'),
    requirements: join(base, 'REQUIREMENTS.md'),
  };
}

function readSubdirectories(dir: string): string[] {
  if (!existsSync(dir)) return [];
  return readdirSync(dir, { withFileTypes: true }).filter(e => e.isDirectory()).map(e => e.name);
}

export function countRoadmapPhases(roadmapPath: string, fallbackCount: number): number {
  try {
    const roadmapContent = readFileSync(roadmapPath, 'utf-8');
    const matches = roadmapContent.match(/^#{2,4}\s+Phase\s+[\w][\w.-]*/gm);
    return matches ? matches.length : fallbackCount;
  } catch {
    return fallbackCount;
  }
}

export function countPhaseFiles(phaseDir: string): { planCount: number; summaryCount: number } {
  const scan = scanPhasePlans(phaseDir);
  return { planCount: scan.planCount, summaryCount: scan.summaryCount };
}

function readStateProjection(statePath: string): Pick<WorkstreamInventory, 'status' | 'current_phase' | 'last_activity'> {
  try {
    const stateContent = readFileSync(statePath, 'utf-8');
    return {
      status: stateExtractField(stateContent, 'Status') || 'unknown',
      current_phase: stateExtractField(stateContent, 'Current Phase'),
      last_activity: stateExtractField(stateContent, 'Last Activity'),
    };
  } catch {
    return {
      status: 'unknown',
      current_phase: null,
      last_activity: null,
    };
  }
}

export function inspectWorkstream(
  projectDir: string,
  name: string,
  options: { active?: string | null } = {},
): WorkstreamInventory | null {
  const wsDir = join(workstreamsRoot(projectDir), name);
  if (!existsSync(wsDir)) return null;

  const active = options.active === undefined ? readActiveWorkstream(projectDir) : options.active;
  const p = wsPlanningPaths(projectDir, name);
  const phaseDirs = readSubdirectories(p.phases);
  const phases: WorkstreamPhaseInventory[] = [];
  let completedPhases = 0;
  let totalPlans = 0;
  let completedPlans = 0;

  for (const dir of [...phaseDirs].sort()) {
    const counts = countPhaseFiles(join(p.phases, dir));
    const status: WorkstreamPhaseInventory['status'] =
      counts.summaryCount >= counts.planCount && counts.planCount > 0
        ? 'complete'
        : counts.planCount > 0
          ? 'in_progress'
          : 'pending';

    totalPlans += counts.planCount;
    completedPlans += Math.min(counts.summaryCount, counts.planCount);
    if (status === 'complete') completedPhases++;

    phases.push({
      directory: dir,
      status,
      plan_count: counts.planCount,
      summary_count: counts.summaryCount,
    });
  }

  const roadmapPhaseCount = countRoadmapPhases(p.roadmap, phaseDirs.length);
  const state = readStateProjection(p.state);

  return {
    name,
    path: toPosixPath(relative(projectDir, wsDir)),
    active: name === active,
    files: {
      roadmap: existsSync(p.roadmap),
      state: existsSync(p.state),
      requirements: existsSync(p.requirements),
    },
    status: state.status,
    current_phase: state.current_phase,
    last_activity: state.last_activity,
    phases,
    phase_count: phases.length,
    completed_phases: completedPhases,
    roadmap_phase_count: roadmapPhaseCount,
    total_plans: totalPlans,
    completed_plans: completedPlans,
    progress_percent: roadmapPhaseCount > 0 ? Math.round((completedPhases / roadmapPhaseCount) * 100) : 0,
  };
}

export function listWorkstreamInventories(projectDir: string): WorkstreamInventoryList {
  const wsRoot = workstreamsRoot(projectDir);
  if (!existsSync(wsRoot)) {
    return {
      mode: 'flat',
      active: null,
      workstreams: [],
      count: 0,
      message: 'No workstreams — operating in flat mode',
    };
  }

  const active = readActiveWorkstream(projectDir);
  const entries = readdirSync(wsRoot, { withFileTypes: true });
  const workstreams: WorkstreamInventory[] = [];
  for (const entry of entries) {
    if (!entry.isDirectory()) continue;
    const inventory = inspectWorkstream(projectDir, entry.name, { active });
    if (inventory) workstreams.push(inventory);
  }

  return {
    mode: 'workstream',
    active,
    workstreams,
    count: workstreams.length,
  };
}
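The per-phase status rule in `inspectWorkstream` (summaries vs. plans) is easy to check in isolation. A small standalone sketch of the same ternary, using the counts shape the inventory derives from `countPhaseFiles`:

```typescript
type PhaseStatus = 'complete' | 'in_progress' | 'pending';

// Mirrors the inventory's derivation: a phase is complete once every plan
// has a summary, in progress while any plans exist, pending otherwise.
// A phase with summaries but zero plans stays pending, not complete.
function phaseStatus(planCount: number, summaryCount: number): PhaseStatus {
  return summaryCount >= planCount && planCount > 0
    ? 'complete'
    : planCount > 0
      ? 'in_progress'
      : 'pending';
}
```

Note the `planCount > 0` guard in the first branch: without it, an empty phase directory (0 plans, 0 summaries) would report as complete.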
@@ -22,44 +22,20 @@ import {
 } from 'node:fs';
 import { join, relative } from 'node:path';

-import { toPosixPath, stateExtractField } from './helpers.js';
+import { toPosixPath } from './helpers.js';
 import { GSDError, ErrorClassification } from '../errors.js';
 import { validateWorkstreamName, toWorkstreamSlug } from '../workstream-name-policy.js';
 import { readActiveWorkstream, writeActiveWorkstream } from './active-workstream-store.js';
+import {
+  inspectWorkstream,
+  listWorkstreamInventories,
+  planningRoot,
+  workstreamsRoot,
+} from './workstream-inventory.js';
 import type { QueryHandler } from './utils.js';

-// ─── Internal helpers ─────────────────────────────────────────────────────
-
-const planningRoot = (projectDir: string) =>
-  join(projectDir, '.planning');
-
-const workstreamsDir = (projectDir: string) =>
-  join(planningRoot(projectDir), 'workstreams');
-
-function wsPlanningPaths(projectDir: string, name: string) {
-  const base = join(planningRoot(projectDir), 'workstreams', name);
-  return {
-    planning: base,
-    state: join(base, 'STATE.md'),
-    roadmap: join(base, 'ROADMAP.md'),
-    phases: join(base, 'phases'),
-    requirements: join(base, 'REQUIREMENTS.md'),
-  };
-}
-
-function readSubdirectories(dir: string): string[] {
-  if (!existsSync(dir)) return [];
-  return readdirSync(dir, { withFileTypes: true }).filter(e => e.isDirectory()).map(e => e.name);
-}
-
-function filterPlanFiles(files: string[]): string[] {
-  return files.filter(f => f.endsWith('-PLAN.md') || f === 'PLAN.md');
-}
-
-function filterSummaryFiles(files: string[]): string[] {
-  return files.filter(f => f.endsWith('-SUMMARY.md') || f === 'SUMMARY.md');
-}
-
 // ─── Handlers ─────────────────────────────────────────────────────────────

 /**
@@ -69,7 +45,7 @@ function filterSummaryFiles(files: string[]): string[] {
  */
 export const workstreamGet: QueryHandler = async (_args, projectDir) => {
   const active = readActiveWorkstream(projectDir);
-  const wsRoot = workstreamsDir(projectDir);
+  const wsRoot = workstreamsRoot(projectDir);
   return {
     data: {
       active,
@@ -79,15 +55,26 @@ export const workstreamGet: QueryHandler = async (_args, projectDir) => {
 };

 export const workstreamList: QueryHandler = async (_args, projectDir) => {
-  const dir = workstreamsDir(projectDir);
-  if (!existsSync(dir)) return { data: { mode: 'flat', workstreams: [], message: 'No workstreams — operating in flat mode' } };
-  try {
-    const entries = readdirSync(dir, { withFileTypes: true });
-    const workstreams = entries.filter(e => e.isDirectory()).map(e => e.name);
-    return { data: { mode: 'workstream', workstreams, count: workstreams.length } };
-  } catch {
-    return { data: { mode: 'flat', workstreams: [], count: 0 } };
+  const inventory = listWorkstreamInventories(projectDir);
+  if (inventory.mode === 'flat') {
+    return { data: { mode: 'flat', workstreams: [], message: inventory.message } };
   }
+  return {
+    data: {
+      mode: 'workstream',
+      workstreams: inventory.workstreams.map(ws => ({
+        name: ws.name,
+        path: ws.path,
+        has_roadmap: ws.files.roadmap,
+        has_state: ws.files.state,
+        status: ws.status,
+        current_phase: ws.current_phase,
+        phase_count: ws.phase_count,
+        completed_phases: ws.completed_phases,
+      })),
+      count: inventory.count,
+    },
+  };
 };

 export const workstreamCreate: QueryHandler = async (args, projectDir) => {
@@ -105,7 +92,7 @@ export const workstreamCreate: QueryHandler = async (args, projectDir) => {
     return { data: { created: false, reason: '.planning/ directory not found — run /gsd-new-project first' } };
   }

-  const wsRoot = workstreamsDir(projectDir);
+  const wsRoot = workstreamsRoot(projectDir);
   const wsDir = join(wsRoot, slug);

   if (existsSync(wsDir) && existsSync(join(wsDir, 'STATE.md'))) {
@@ -170,7 +157,7 @@ export const workstreamCreate: QueryHandler = async (args, projectDir) => {
  * so frontmatter fields and body stay in lockstep with the source.
  */
 function syncRootStateMirror(projectDir: string, name: string): void {
-  const wsStatePath = join(workstreamsDir(projectDir), name, 'STATE.md');
+  const wsStatePath = join(workstreamsRoot(projectDir), name, 'STATE.md');
   const rootStatePath = join(planningRoot(projectDir), 'STATE.md');
   if (!existsSync(wsStatePath)) return;
   try {
@@ -195,7 +182,7 @@ export const workstreamSet: QueryHandler = async (args, projectDir) => {
     return { data: { active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, underscores, or dots only' } };
   }

-  const wsDir = join(workstreamsDir(projectDir), name);
+  const wsDir = join(workstreamsRoot(projectDir), name);
   if (!existsSync(wsDir)) {
     return { data: { active: null, error: 'not_found', workstream: name } };
   }
@@ -214,60 +201,26 @@ export const workstreamStatus: QueryHandler = async (args, projectDir) => {
     throw new GSDError('Invalid workstream name', ErrorClassification.Validation);
   }

-  const wsDir = join(workstreamsDir(projectDir), name);
+  const wsDir = join(workstreamsRoot(projectDir), name);
   if (!existsSync(wsDir)) {
     return { data: { found: false, workstream: name } };
   }

-  const p = wsPlanningPaths(projectDir, name);
-  const relPath = toPosixPath(relative(projectDir, wsDir));
-
-  const files = {
-    roadmap: existsSync(p.roadmap),
-    state: existsSync(p.state),
-    requirements: existsSync(p.requirements),
-  };
-
-  const phases: Array<{ directory: string; status: string; plan_count: number; summary_count: number }> = [];
-  for (const dir of readSubdirectories(p.phases).sort()) {
-    try {
-      const phaseFiles = readdirSync(join(p.phases, dir));
-      const plans = filterPlanFiles(phaseFiles);
-      const summaries = filterSummaryFiles(phaseFiles);
-      phases.push({
-        directory: dir,
-        status:
-          summaries.length >= plans.length && plans.length > 0
-            ? 'complete'
-            : plans.length > 0
-              ? 'in_progress'
-              : 'pending',
-        plan_count: plans.length,
-        summary_count: summaries.length,
-      });
-    } catch { /* skip */ }
-  }
-
-  let stateInfo: Record<string, string | null> = {};
-  try {
-    const stateContent = readFileSync(p.state, 'utf-8');
-    stateInfo = {
-      status: stateExtractField(stateContent, 'Status') || 'unknown',
-      current_phase: stateExtractField(stateContent, 'Current Phase'),
-      last_activity: stateExtractField(stateContent, 'Last Activity'),
-    };
-  } catch { /* skip */ }
+  const inventory = inspectWorkstream(projectDir, name);
+  if (!inventory) return { data: { found: false, workstream: name } };

   return {
     data: {
       found: true,
       workstream: name,
-      path: relPath,
-      files,
-      phases,
-      phase_count: phases.length,
-      completed_phases: phases.filter(ph => ph.status === 'complete').length,
-      ...stateInfo,
+      path: inventory.path,
+      files: inventory.files,
+      phases: inventory.phases,
+      phase_count: inventory.phase_count,
+      completed_phases: inventory.completed_phases,
+      status: inventory.status,
+      current_phase: inventory.current_phase,
+      last_activity: inventory.last_activity,
     },
   };
 };
@@ -280,7 +233,7 @@ export const workstreamComplete: QueryHandler = async (args, projectDir) => {
   }

   const root = planningRoot(projectDir);
-  const wsRoot = workstreamsDir(projectDir);
+  const wsRoot = workstreamsRoot(projectDir);
   const wsDir = join(wsRoot, name);

   if (!existsSync(wsDir)) {
@@ -341,85 +294,31 @@ export const workstreamComplete: QueryHandler = async (args, projectDir) => {
  * (Not the same as roadmap `progress` / `progressBar`.)
  */
 export const workstreamProgress: QueryHandler = async (_args, projectDir) => {
-  const wsRoot = workstreamsDir(projectDir);
-
-  if (!existsSync(wsRoot)) {
+  const inventory = listWorkstreamInventories(projectDir);
+  if (inventory.mode === 'flat') {
     return {
       data: {
         mode: 'flat',
         workstreams: [],
-        message: 'No workstreams — operating in flat mode',
+        message: inventory.message,
       },
     };
   }

-  const active = readActiveWorkstream(projectDir);
-  const entries = readdirSync(wsRoot, { withFileTypes: true });
-  const workstreams: Array<{
-    name: string;
-    active: boolean;
-    status: string;
-    current_phase: string | null;
-    phases: string;
-    plans: string;
-    progress_percent: number;
-  }> = [];
-
-  for (const entry of entries) {
-    if (!entry.isDirectory()) continue;
-
-    const wsDir = join(wsRoot, entry.name);
-    const phasesDir = join(wsDir, 'phases');
-
-    const phaseDirsProgress = readSubdirectories(phasesDir);
-    const phaseCount = phaseDirsProgress.length;
-    let completedCount = 0;
-    let totalPlans = 0;
-    let completedPlans = 0;
-    for (const d of phaseDirsProgress) {
-      try {
-        const phaseFiles = readdirSync(join(phasesDir, d));
-        const plans = filterPlanFiles(phaseFiles);
-        const summaries = filterSummaryFiles(phaseFiles);
-        totalPlans += plans.length;
-        completedPlans += Math.min(summaries.length, plans.length);
-        if (plans.length > 0 && summaries.length >= plans.length) completedCount++;
-      } catch { /* skip */ }
-    }
-
-    let roadmapPhaseCount = phaseCount;
-    try {
-      const roadmapContent = readFileSync(join(wsDir, 'ROADMAP.md'), 'utf-8');
-      const phaseMatches = roadmapContent.match(/^###?\s+Phase\s+\d/gm);
-      if (phaseMatches) roadmapPhaseCount = phaseMatches.length;
-    } catch { /* no roadmap */ }
-
-    let status = 'unknown';
-    let currentPhase: string | null = null;
-    try {
-      const stateContent = readFileSync(join(wsDir, 'STATE.md'), 'utf-8');
-      status = stateExtractField(stateContent, 'Status') || 'unknown';
-      currentPhase = stateExtractField(stateContent, 'Current Phase');
-    } catch { /* skip */ }
-
-    workstreams.push({
-      name: entry.name,
-      active: entry.name === active,
-      status,
-      current_phase: currentPhase,
-      phases: `${completedCount}/${roadmapPhaseCount}`,
-      plans: `${completedPlans}/${totalPlans}`,
-      progress_percent:
-        roadmapPhaseCount > 0 ? Math.round((completedCount / roadmapPhaseCount) * 100) : 0,
-    });
-  }
-
   return {
     data: {
       mode: 'workstream',
-      active,
-      workstreams,
-      count: workstreams.length,
+      active: inventory.active,
+      workstreams: inventory.workstreams.map(ws => ({
+        name: ws.name,
+        active: ws.active,
+        status: ws.status,
+        current_phase: ws.current_phase,
+        phases: `${ws.completed_phases}/${ws.roadmap_phase_count}`,
+        plans: `${ws.completed_plans}/${ws.total_plans}`,
+        progress_percent: ws.progress_percent,
+      })),
+      count: inventory.count,
     },
   };
 };
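The progress projection above renders percent from completed phases against the roadmap's declared phase count, not the realized phase directories. A minimal sketch of that formula in isolation (same expression as the inventory's `progress_percent`):

```typescript
// Percent is phase-based against the roadmap's declared count, so a project
// with 6 of 12 declared phases done reads 50% even if all 6 realized phases
// have every plan summarized. Zero declared phases renders as 0, not NaN.
function progressPercent(completedPhases: number, roadmapPhaseCount: number): number {
  return roadmapPhaseCount > 0
    ? Math.round((completedPhases / roadmapPhaseCount) * 100)
    : 0;
}
```

This is the same capping behavior the #3242 Bug B tests below assert against.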
@@ -28,6 +28,7 @@ const { describe, test } = require('node:test');
 const assert = require('node:assert/strict');
 const fs = require('fs');
 const path = require('path');
+const os = require('os');
 const { execFileSync } = require('child_process');

 const REPO_ROOT = path.join(__dirname, '..');
@@ -88,6 +89,8 @@ describe('bug #2647: outer tarball ships sdk/dist so gsd-sdk query works', () =>
 });

 test('npm pack dry-run includes sdk/dist/cli.js after build:sdk', { timeout: 180_000 }, () => {
+  const npmCache = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-npm-cache-'));
+  const npmEnv = { ...process.env, npm_config_cache: npmCache };
   // Ensure the sdk is built so the pack reflects what publish would ship.
   // The outer prepublishOnly chains through build:sdk, which does `npm ci && npm run build`
   // inside sdk/. We emulate that here without full ci to keep the test fast:
@@ -98,24 +101,28 @@ describe('bug #2647: outer tarball ships sdk/dist so gsd-sdk query works', () =>
     // Build requires node_modules; install if missing, then build.
     const sdkNodeModules = path.join(sdkDir, 'node_modules');
     if (!fs.existsSync(sdkNodeModules)) {
-      execFileSync('npm', ['ci', '--silent'], { cwd: sdkDir, stdio: 'pipe' });
+      execFileSync('npm', ['ci', '--silent'], { cwd: sdkDir, stdio: 'pipe', env: npmEnv });
     }
-    execFileSync('npm', ['run', 'build'], { cwd: sdkDir, stdio: 'pipe' });
+    execFileSync('npm', ['run', 'build'], { cwd: sdkDir, stdio: 'pipe', env: npmEnv });
   }
   assert.ok(fs.existsSync(cliJs), 'sdk build must produce sdk/dist/cli.js');

-  const out = execFileSync(
-    'npm',
-    ['pack', '--dry-run', '--json', '--ignore-scripts'],
-    { cwd: REPO_ROOT, stdio: ['ignore', 'pipe', 'pipe'] },
-  ).toString('utf-8');
-  const manifest = JSON.parse(out);
-  const files = manifest[0].files.map((f) => f.path);
-  const cliPresent = files.includes('sdk/dist/cli.js');
-  assert.ok(
-    cliPresent,
-    `npm pack must include sdk/dist/cli.js in the tarball (so "gsd-sdk query" resolves after install). sdk/dist entries found: ${files.filter((p) => p.startsWith('sdk/dist')).length}`,
-  );
+  try {
+    const out = execFileSync(
+      'npm',
+      ['pack', '--dry-run', '--json', '--ignore-scripts'],
+      { cwd: REPO_ROOT, stdio: ['ignore', 'pipe', 'pipe'], env: npmEnv },
+    ).toString('utf-8');
+    const manifest = JSON.parse(out);
+    const files = manifest[0].files.map((f) => f.path);
+    const cliPresent = files.includes('sdk/dist/cli.js');
+    assert.ok(
+      cliPresent,
+      `npm pack must include sdk/dist/cli.js in the tarball (so "gsd-sdk query" resolves after install). sdk/dist entries found: ${files.filter((p) => p.startsWith('sdk/dist')).length}`,
+    );
+  } finally {
+    fs.rmSync(npmCache, { recursive: true, force: true });
+  }
 });

 test('built sdk CLI exposes the `query` subcommand', { timeout: 60_000 }, () => {
@@ -118,18 +118,19 @@ describe('Bug #3017: buildCodexHookBlock emits absolute node runner', () => {
       'must return null on missing runner so caller can warn-and-skip instead of writing a broken hook');
   });

-  test('integrates with resolveNodeRunner() in the live process — runner equals process.execPath (#3022 CR)', () => {
+  test('integrates with resolveNodeRunner() in the live process — runner equals resolved node runner (#3022 CR)', () => {
     const runner = resolveNodeRunner();
     assert.ok(runner, 'resolveNodeRunner returns a usable value in this test env');
     const block = buildCodexHookBlock('/tmp/x/.codex', { absoluteRunner: runner });
     const parsed = parseCodexHookBlock(block);
     assert.equal(parsed.ok, true);
-    // Strict canonical-runner equality: the parsed runner (after
-    // stripping toml + JSON escape layers) must be exactly process.execPath
-    // (forward-slashed, since resolveNodeRunner normalizes that way).
-    const expected = process.execPath.replace(/\\/g, '/');
+    // Strict canonical-runner equality: the parsed runner (after stripping
+    // toml + JSON escape layers) must be exactly the normalized runner that
+    // resolveNodeRunner selected. Homebrew Cellar execPath values intentionally
+    // normalize to the stable Homebrew symlink (#3181).
+    const expected = JSON.parse(runner);
     assert.equal(unescapeRunner(parsed.runner), expected,
-      `parsed runner must equal process.execPath, got: ${parsed.runner}, want: ${expected}`);
+      `parsed runner must equal resolveNodeRunner(), got: ${parsed.runner}, want: ${expected}`);
   });
 });
@@ -104,7 +104,7 @@ describe('#3242 Bug A: body-only state.update preserves curated progress frontma
     cleanup(tmpDir);
   });

-  test('state.update "Last Activity" does not overwrite progress.completed_plans', { todo: 'fix pending: #3242 Bug A not yet implemented' }, (t) => {
+  test('state.update "Last Activity" does not overwrite progress.completed_plans', (t) => {
     const statePath = path.join(tmpDir, '.planning', 'STATE.md');
     fs.writeFileSync(statePath, buildStateWithCuratedProgress({
       completedPlans: 22,
@@ -198,7 +198,7 @@ describe('#3242 Bug B: progress.percent reflects phase fraction when ROADMAP dec
     cleanup(tmpDir);
   });

-  test('12 declared phases / 6 realized / 6/6 plans done → percent is 50, not 100', { todo: 'fix pending: #3242 Bug B not yet implemented' }, (t) => {
+  test('12 declared phases / 6 realized / 6/6 plans done → percent is 50, not 100', (t) => {
     const statePath = path.join(tmpDir, '.planning', 'STATE.md');

     // Body: 6 realized phases visible to disk scan.
@@ -292,7 +292,7 @@ describe('#3242 Bug B: progress.percent reflects phase fraction when ROADMAP dec
     );
   });

-  test('state sync also reflects phase-fraction-capped percent in body Progress field', { todo: 'fix pending: #3242 Bug B not yet implemented' }, () => {
+  test('state sync also reflects phase-fraction-capped percent in body Progress field', () => {
     // state sync updates the body's Progress: field — it must use the same capped formula
     const statePath = path.join(tmpDir, '.planning', 'STATE.md');
@@ -51,6 +51,7 @@ const {
   GSD_CODEX_MARKER,
   CODEX_AGENT_SANDBOX,
   parseTomlToObject,
+  resolveNodeRunner,
 } = require('../bin/install.js');

 function runCodexInstall(codexHome, cwd = path.join(__dirname, '..')) {
@@ -1437,10 +1438,10 @@ describe('Codex install hook configuration (e2e)', () => {
     // #3017: handler command now uses the absolute Node binary path so
     // GUI/minimal-PATH runtimes can resolve it. The shape is
     //   "<absolute-node-path>" "<hook-path>"
-    // where <absolute-node-path> is process.execPath (forward-slashed)
-    // and the hook path is also quoted. Same Node process runs the test
-    // and the installer, so process.execPath matches at both ends.
-    const expectedRunner = process.execPath.replace(/\\/g, '/');
+    // where <absolute-node-path> is the normalized runner selected by
+    // resolveNodeRunner() and the hook path is also quoted. Homebrew Cellar
+    // execPath values intentionally normalize to stable Homebrew symlinks.
+    const expectedRunner = JSON.parse(resolveNodeRunner());
     const expectedHookPath = path.join(codexHome, 'hooks', 'gsd-check-update.js').replace(/\\/g, '/');
     const expectedCommand = `"${expectedRunner}" "${expectedHookPath}"`;
     assert.strictEqual(
@@ -110,7 +110,7 @@ describe('write-profile command', () => {
     const analysisPath = path.join(tmpDir, 'analysis.json');
     fs.writeFileSync(analysisPath, JSON.stringify(analysis));

-    const result = runGsdTools(['write-profile', '--input', analysisPath, '--raw'], tmpDir);
+    const result = runGsdTools(['write-profile', '--input', analysisPath, '--raw'], tmpDir, { HOME: tmpDir });
     assert.ok(result.success, `Failed: ${result.error}`);
     const out = JSON.parse(result.output);
     assert.ok(out.profile_path, 'should return profile_path');
@@ -206,7 +206,11 @@ describe('generate-dev-preferences command', () => {
     const analysisPath = path.join(tmpDir, 'analysis.json');
     fs.writeFileSync(analysisPath, JSON.stringify(analysis));

-    const result = runGsdTools(['generate-dev-preferences', '--analysis', analysisPath, '--raw'], tmpDir);
+    const result = runGsdTools(
+      ['generate-dev-preferences', '--analysis', analysisPath, '--raw'],
+      tmpDir,
+      { HOME: tmpDir }
+    );
     assert.ok(result.success, `Failed: ${result.error}`);
     const out = JSON.parse(result.output);
     assert.ok(out.command_path || out.command_name, 'should return command output');