mirror of
https://github.com/glittercowboy/get-shit-done
synced 2026-05-14 02:56:38 +02:00
fix(workstream): normalize migration workstream names (#3269)
* fix(workstream): normalize migrate-name to valid slug

* docs(context): record workstream migrate-name slug invariant

* fix(catalog-cjs): balanced fallback for unknown profile (CR finding A)

  profiles[profile] could return undefined for any profile key absent from the catalog entry, causing downstream callers like formatAgentToModelMapAsTable to crash on .length. Add ?? profiles.balanced fallback to match the SDK adapter.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* test(sdk): anchor path resolution on import.meta.url not cwd (CR finding B)

  resolve(process.cwd(), '..') breaks when Vitest is invoked from the repo root because cwd is already the repo root and '..' goes one level above. Replace with a file-relative path using fileURLToPath(new URL('../../../', import.meta.url)) anchored at the test file's location (sdk/src/query/).

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* test: derive Group B runtime list from catalog (CR finding C)

  Hardcoded ['kilo', 'cline', ...] throws TypeError if a runtime name is removed from the catalog. Derive group B dynamically via Object.keys(catalog.runtimeTierDefaults).filter(r => !r.opus) so the test never goes stale and auto-covers future Group B additions.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* docs(workflow): add hermes to Step B runtime options (CR finding D)

  hermes appears in the Group A built-in defaults table but was missing from the AskUserQuestion options in Step B, forcing users to manually type it via 'Other (Group B or custom)'. Add explicit hermes entry for UI consistency.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* docs(config): refresh dynamic_routing tier table; fix stale L671 (findings E+F)

  Finding E: tier table was missing 6 heavy-tier agents and 15 standard/light agents added by this PR. Updated all three rows to match catalog routingTier assignments (33 agents total).

  Finding F: removed stale '18 of 31' claim and agent enumeration; replaced with accurate note that all 33 agents have explicit catalog entries. Updated authoritative source pointers to model-catalog.cjs / model-catalog.ts.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* test(core): add profile-fallback unit tests for quality and budget (CR nitpick G)

  The PR introduced quality→opus and budget→haiku unknown-agent fallbacks but only balanced→sonnet and inherit→inherit were tested. Add two tests covering the remaining two branches to complete coverage.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* adr: define planning workspace and worktree seam
* refactor(worktree): extract worktree safety policy module
* refactor(workstream): extract active workstream pointer store seam
* test(worktree): cover policy branch paths and persist seam guardrails
* refactor(worktree): centralize health inventory seam for W017
* fix(workspace): align SDK project path policy with CJS planningDir
* refactor(query): unify SDK planning path projection seam
* refactor(init): route workspace projection through planningPaths seam
* docs(adr): add SDK architecture and planning path ADRs
* refactor(worktree): deepen name, pointer, inventory, and config seams
* docs(config): harmonize claude-opus-4-6 to 4-7 in resolve_model_ids example (CR finding 2)
* fix(sdk): return undefined for model_profile='inherit' sentinel (CR finding 3)
* docs(adr): renumber conflicting 0003-sdk-package-seam-module to 0007, update seam-map reference (CR finding 4)
* fix(workstream): align CJS and SDK name validation to accept dots, guard path traversal via includes('..') (CR finding 5)
* fix(sdk): guard writeActiveWorkstream against non-existent workstream directory, k014/k031 parity (CR finding 6)
* chore(changeset): add #3269 changeset (CR finding 1 — proper changeset for this PR)
* docs(inventory): register 3 new CLI modules in INVENTORY.md/MANIFEST (active-workstream-store, workstream-name-policy, worktree-safety)
* fix(sdk): use relPlanningPath(workstream) in planningPaths, fix setActiveWorkstream/getActiveWorkstream name errors in workstream.ts

* fix(sdk): validate GSD_WORKSTREAM in planningPaths before use (#3269 regression)

  planningPaths() called resolveWorkspaceContext(), which returned GSD_WORKSTREAM raw (no validation). An invalid value like '../evil' was used as effectiveWorkstream, constructing a bad path; roadmapAnalyze() caught the ENOENT and returned a no-phase_count error object instead of the root ROADMAP result.

  Fix: validate envCtx.workstream with validateWorkstreamName() in planningPaths() before accepting it as effectiveWorkstream. Invalid env → null → root .planning/ fallback, preserving the bug-2791 contract: invalid GSD_WORKSTREAM is silently ignored and falls back to the root context (phase_count: 0 for empty root ROADMAP). The bug-2791 regression test now passes.

  No other call sites read GSD_WORKSTREAM without validation: query-runtime-context.ts already validates; cli.ts already validates; context-engine.ts takes a caller-validated workstream parameter.

  Closes #3268 (regression introduced by #3269 workstream-name-policy work).

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
5
.changeset/nimble-lynx-tumble.md
Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 3269
---
**Workstream name normalization** — workstream names are now consistently validated across CJS and SDK layers, accepting alphanumeric, hyphens, underscores, and dots (e.g. `v1.0`); path traversal via `..` sequences is blocked in both layers. The `model_profile: 'inherit'` sentinel no longer leaks as a literal model ID in session-runner. SDK `writeActiveWorkstream` now validates that the target workstream directory exists before writing the pointer.
55
CONTEXT.md
@@ -40,6 +40,12 @@ Module owning command resolution, policy projection (`mutation`, `output_mode`),

### Query Pre-Project Config Policy Module

Module policy that defines query-time behavior when `.planning/config.json` is absent: use built-in defaults for parity-sensitive query Interfaces, and emit parity-aligned empty model ids for pre-project model resolution surfaces.

### Planning Workspace Module

Module owning `.planning` path resolution, active workstream pointer policy (`session-scoped > shared`), pointer self-heal behavior, and planning lock semantics for workstream-aware execution.

### Worktree Root Resolution Adapter Module

Adapter Module owning linked-worktree root mapping and metadata-prune policy (`git worktree prune` non-destructive default) for planning/workstream callers.

### SDK Package Seam Module

Module owning SDK-to-`get-shit-done-cc` compatibility policy: legacy asset discovery, install-layout probing, transition-only error messaging, and thin Adapter access for CJS-era assets that native SDK Modules have not replaced yet.

@@ -97,6 +103,12 @@ Five-axis story decomposition discipline (**S**pike, **P**aths, **I**nterfaces,

### Shell guards — absolute-path containment must use `root/` prefix, not glob

- `[[ "$PATH" != "$ROOT"* ]]` matches sibling prefixes (`/repo-extra` passes when `ROOT=/repo`). Use `[[ "$P" != "$ROOT" && "$P" != "$ROOT/"* ]]`. Also: check `[ -z "$ROOT" ]` and exit 1 before the containment test. Warn → fail-closed for security-relevant path checks.

### Workstream migration names — enforce one canonical slug contract

- **Invariant**: every directory under `.planning/workstreams/*` must be addressable by `workstream status/set/complete`, so creation and migration must share the same name contract.
- **Failure class**: accepting raw `--migrate-name` values created directories that later commands reject (e.g. a `Bad Name` directory exists but the CLI rejects it as invalid).
- **Rule**: normalize `--migrate-name` through the same slug transform as `workstream create` (`[a-z0-9-]`), and fail fast if normalization yields empty.
- **TDD sentinel**: keep a regression asserting `workstream create ... --migrate-name 'Bad Name'` migrates to `bad-name` and does not leave `Bad Name` on disk.

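A sketch of what that slug transform implies, assuming only the `[a-z0-9-]` contract stated above; the real `toWorkstreamSlug` lives in `workstream-name-policy.cjs` and may differ in detail:

```javascript
// Illustrative slug normalization: lowercase, collapse runs of characters
// outside [a-z0-9] into single hyphens, trim edge hyphens, fail fast on empty.
function toSlug(raw) {
  const slug = String(raw)
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
  if (!slug) throw new Error(`--migrate-name ${JSON.stringify(raw)} normalizes to empty`);
  return slug;
}
```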
### Docs — keep internal reference counts consistent

- When a heading says `(N shipped)` and a footnote says `N-1 top-level references`, update the footnote. CodeRabbit catches this every time.

@@ -206,3 +218,46 @@ After stripping prose @-refs, some command `<process>` blocks retained bolded "*
- Do not keep duplicate/inert exported types; expose one canonical union Interface.
- Do not emit decision events before guard checks that may reject the path.
- Do not leave architectural docs with ambiguous seam ownership between CLI and SDK paths.

---

## AI Ops Memory (2026-05-09, machine-oriented)

`RULESET.CONTRIB.GATE.ORDER=issue-first -> approval-label -> code -> PR-link -> changeset/no-changelog`
`RULESET.CONTRIB.CLASSIFY.fix=requires confirmed/confirmed-bug before implementation`
`RULESET.CONTRIB.CLASSIFY.enhancement=requires approved-enhancement before implementation`
`RULESET.CONTRIB.CLASSIFY.feature=requires approved-feature before implementation`

`CI.GATE.issue-link-required=hard-fail if PR body lacks closes/fixes/resolves #<issue>`
`CI.GATE.changeset-lint=hard-fail for user-facing code diffs unless .changeset/* or PR has no-changelog label`
`CI.GATE.repair-sequence(PR)=create issue -> apply approval label -> edit PR body w/ closing keyword -> apply no-changelog if appropriate -> re-run checks`

`PR.3267.POSTMORTEM.root-cause=[missing issue link, missing changeset/no-changelog]`
`PR.3267.POSTMORTEM.recovery=[issue#3270 created, label approved-enhancement applied, PR reopened, body includes "Closes #3270", label no-changelog applied]`

`WORKTREE.SEAM.current=Worktree Safety Policy Module`
`WORKTREE.SEAM.files=[get-shit-done/bin/lib/worktree-safety.cjs, get-shit-done/bin/lib/core.cjs]`
`WORKTREE.SEAM.interface=[resolveWorktreeContext, parseWorktreePorcelain, planWorktreePrune, executeWorktreePrunePlan]`
`WORKTREE.SEAM.default-prune-policy=metadata_prune_only (non-destructive)`
`WORKTREE.SEAM.decision-1=retain non-destructive default; destructive path only as explicit future opt-in scaffold`

`WORKSTREAM.INVARIANT.migrate-name=must normalize through canonical slug policy`
`WORKSTREAM.INVARIANT.slug-contract=all .planning/workstreams/<name> must be addressable by set/get/status/complete`
`WORKSTREAM.REGRESSION.test-anchor=tests/workstream.test.cjs::normalizes --migrate-name to a valid workstream slug`

`ARCH.SKILL.improve-codebase.next-candidates=[Workstream Name Policy Module, Workstream Progress Projection Module, Active Workstream Pointer Store Module]`

`WORKTREE.SEAM.test-policy=cover all decision branches in policy module before changing prune behavior`
`WORKTREE.SEAM.test-anchors=[resolveWorktreeContext:has_local_planning|linked_worktree|not_git_repo|main_worktree, planWorktreePrune:git_list_failed|worktrees_present|no_worktrees|parser_throw_fallback, executeWorktreePrunePlan:missing_plan|skip_passthrough|unsupported_action|metadata_prune_only]`
`WORKTREE.SEAM.invariant=parser failure must degrade to metadata_prune_only and never escalate to destructive removal`
`WORKTREE.SEAM.execution-rule=prefer node --test tests/worktree-safety-policy.test.cjs for fast seam validation; avoid full npm test loop for seam-only changes`
`WORKTREE.SEAM.inventory-interface=[listLinkedWorktreePaths, inspectWorktreeHealth]`
`WORKTREE.SEAM.caller-rule=verify.cjs must consume inspectWorktreeHealth for W017 classification; no ad-hoc porcelain parsing in callers`
`WORKTREE.SEAM.test-anchor-w017=tests/orphan-worktree-detection.test.cjs + tests/worktree-safety-policy.test.cjs`
`WORKTREE.SEAM.inventory-snapshot=snapshotWorktreeInventory(repoRoot,{staleAfterMs,nowMs}) is canonical linked-worktree health snapshot for callers`
`PLANNING.PATH.PARITY.sdk-project-scope=.planning/<project> (never .planning/projects/<project>); mirror planning-workspace.cjs planningDir()`
`PLANNING.PATH.SEAM.sdk=helpers.planningPaths delegates to workspacePlanningPaths + resolveWorkspaceContext; precedence explicit-ws > env-ws > env-project > root`
`PLANNING.PATH.SEAM.init-handlers=[initExecutePhase, initPlanPhase, initPhaseOp, initMilestoneOp] consume helpers.planningPaths().planning (no direct relPlanningPath join)`
`WORKSTREAM.NAME.POLICY.cjs-module=get-shit-done/bin/lib/workstream-name-policy.cjs owns toWorkstreamSlug + active-name/path-segment validation`
`WORKSTREAM.POINTER.SEAM.sdk-module=sdk/src/query/active-workstream-store.ts owns read/write self-heal for .planning/active-workstream`
`CONFIG.SEAM.loadConfig-context=loadConfig(cwd,{workstream}) replaces env-mutation fallback; no temporary process.env GSD_WORKSTREAM rewrites`

@@ -668,7 +668,7 @@ Invalid flag tokens are sanitized and logged as warnings. Only recognized GSD fl
| gsd-doc-writer | Opus | Sonnet | Haiku | Inherit |
| gsd-doc-verifier | Sonnet | Sonnet | Haiku | Inherit |

> **Fallback semantics for unlisted agents.** The profiles table above covers 18 of 31 shipped agents. Agents without an explicit profile row (`gsd-advisor-researcher`, `gsd-assumptions-analyzer`, `gsd-security-auditor`, `gsd-user-profiler`, and the nine advanced agents — `gsd-ai-researcher`, `gsd-domain-researcher`, `gsd-eval-planner`, `gsd-eval-auditor`, `gsd-framework-selector`, `gsd-code-reviewer`, `gsd-code-fixer`, `gsd-debug-session-manager`, `gsd-intel-updater`) inherit the runtime default model for the selected profile. To pin a specific model for any of these agents, use `model_overrides` (next section) — `model_overrides` accepts any shipped agent name regardless of whether it has a profile row here. The authoritative profile table lives in `get-shit-done/bin/lib/model-profiles.cjs`; the authoritative 31-agent roster lives in [`docs/INVENTORY.md`](INVENTORY.md).
> **All 33 shipped agents have explicit per-profile tier assignments** in the catalog (`sdk/shared/model-catalog.json`). The table above shows a representative subset of the most-used agents. For agents not listed here, `model_overrides` accepts any shipped agent name. The authoritative profile data is derived from `sdk/shared/model-catalog.json` via `get-shit-done/bin/lib/model-catalog.cjs` and `sdk/src/model-catalog.ts`.

### Per-Agent Overrides

@@ -808,9 +808,9 @@ Each agent in `MODEL_PROFILES` declares one of three default tiers. The resolver

| Tier | Agents | Use case |
|---|---|---|
| `light` | gsd-codebase-mapper, gsd-pattern-mapper, gsd-research-synthesizer, gsd-plan-checker, gsd-integration-checker, gsd-nyquist-auditor, gsd-ui-checker, gsd-ui-auditor, gsd-doc-verifier | Cheap/fast — pure mappers, scanners, low-stakes audits |
| `standard` | gsd-executor, gsd-phase-researcher, gsd-project-researcher, gsd-verifier, gsd-doc-writer, gsd-ui-researcher | Default workhorse — research, writing, primary verification |
| `heavy` | gsd-planner, gsd-roadmapper, gsd-debugger | Deep reasoning — already at top, can't escalate further |
| `light` | gsd-codebase-mapper, gsd-doc-classifier, gsd-doc-verifier, gsd-integration-checker, gsd-intel-updater, gsd-nyquist-auditor, gsd-pattern-mapper, gsd-plan-checker, gsd-research-synthesizer, gsd-ui-auditor, gsd-ui-checker | Cheap/fast — pure mappers, scanners, low-stakes audits |
| `standard` | gsd-advisor-researcher, gsd-ai-researcher, gsd-code-fixer, gsd-code-reviewer, gsd-doc-synthesizer, gsd-doc-writer, gsd-domain-researcher, gsd-eval-auditor, gsd-executor, gsd-phase-researcher, gsd-project-researcher, gsd-ui-researcher, gsd-verifier | Default workhorse — research, writing, primary verification |
| `heavy` | gsd-assumptions-analyzer, gsd-debug-session-manager, gsd-debugger, gsd-eval-planner, gsd-framework-selector, gsd-planner, gsd-roadmapper, gsd-security-auditor, gsd-user-profiler | Deep reasoning — already at top, can't escalate further |

#### Escalation flow

@@ -894,7 +894,7 @@ The intent is the same as the Claude profile tiers -- use a stronger model for p

| Value | Behavior | Use When |
|-------|----------|----------|
| `false` (default) | Returns Claude aliases (`opus`, `sonnet`, `haiku`) | Claude Code with native Anthropic API |
| `true` | Maps aliases to full Claude model IDs (`claude-opus-4-6`) | Claude Code with API that requires full IDs |
| `true` | Maps aliases to full Claude model IDs (`claude-opus-4-7`) | Claude Code with API that requires full IDs |
| `"omit"` | Returns empty string (runtime picks its default) | Non-Claude runtimes (Codex, OpenCode, Gemini CLI, Kilo) |

### Runtime-Aware Profiles (#2517)

@@ -1,5 +1,5 @@
{
  "generated": "2026-05-07",
  "generated": "2026-05-09",
  "families": {
  "agents": [
    "gsd-advisor-researcher",
@@ -256,6 +256,7 @@
    "worktree-path-safety.md"
  ],
  "cli_modules": [
    "active-workstream-store.cjs",
    "artifacts.cjs",
    "audit.cjs",
    "command-aliases.generated.cjs",
@@ -298,7 +299,9 @@
    "validate-command-router.cjs",
    "verify-command-router.cjs",
    "verify.cjs",
    "workstream.cjs"
    "workstream-name-policy.cjs",
    "workstream.cjs",
    "worktree-safety.cjs"
  ],
  "hooks": [
    "gsd-check-update-worker.js",

@@ -358,12 +358,13 @@ The `gsd-planner` agent is decomposed into a core agent plus reference modules t

---

## CLI Modules (43 shipped)
## CLI Modules (46 shipped)

Full listing: `get-shit-done/bin/lib/*.cjs`.

| Module | Responsibility |
|--------|----------------|
| `active-workstream-store.cjs` | Workstream source precedence and selection (CLI `--ws` > `GSD_WORKSTREAM` env > stored pointer); name validation and environment propagation |
| `artifacts.cjs` | Canonical artifact registry — known `.planning/` root file names; used by `gsd-health` W019 lint |
| `audit.cjs` | Audit dispatch, audit open sessions, audit storage helpers |
| `command-aliases.generated.cjs` | Generated CJS alias/subcommand metadata for manifest-backed family routers |
@@ -406,7 +407,9 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
| `validate-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools validate` |
| `verify-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools verify` |
| `verify.cjs` | Plan structure, phase completeness, reference, commit validation |
| `workstream-name-policy.cjs` | Canonical workstream name validation (`isValidActiveWorkstreamName`) and slug normalization (`toWorkstreamSlug`); shared by all workstream callers |
| `workstream.cjs` | Workstream CRUD, migration, session-scoped active pointer |
| `worktree-safety.cjs` | Worktree-root resolution and non-destructive prune policy decisions; owns W017 health-check logic |

[`docs/CLI-TOOLS.md`](CLI-TOOLS.md) may describe a subset of these modules; when it disagrees with the filesystem, this table and the directory listing are authoritative.

23
docs/adr/0004-worktree-workstream-seam-module.md
Normal file
@@ -0,0 +1,23 @@
# Planning Workspace Module as single seam for worktree and workstream state

- **Status:** Accepted
- **Date:** 2026-05-08

We decided to treat planning/worktree behavior as one explicit Planning Workspace Module Interface rather than spread policy across ad-hoc call sites. The Module owns `.planning` path resolution, active workstream pointer policy, workstream-name invariants, and lock semantics, while a focused Worktree Root Resolution Adapter owns linked-worktree root mapping and metadata prune behavior. This raises depth at the seam, increases leverage for callers, and improves locality for bug fixes in the worktree/workstream loop.

## Decision

- The Planning Workspace Module Interface is authoritative for:
  - `planningDir` / `planningRoot` / `planningPaths`
  - active workstream pointer policy (`session-scoped > shared`)
  - pointer self-heal behavior (invalid/stale pointers clear to null)
  - planning lock semantics (`withPlanningLock`)
- Worktree root detection stays behind one Worktree Root Resolution Adapter (`resolveWorktreeRoot`), so callers do not re-derive git-dir/common-dir logic.
- Worktree metadata cleanup remains non-destructive by default: `pruneOrphanedWorktrees` runs `git worktree prune` only and does not remove linked worktree directories.
- Workstream naming is one invariant across create/migrate/set/get/env-pointer paths: values must be canonical slugs that remain addressable by all workstream commands.

## Consequences

- Tests can pin behavior through one Interface instead of source-grep fragments, improving regression quality for worktree/workstream bugs.
- Bug classes caused by contract drift (for example migration names accepted in one path but rejected in another) are fixed once in the Module and propagate to all callers.
- Callers become thin Adapters over a deeper seam; future policy changes (session identity strategy, lock recovery, worktree prune behavior) stay localized.
21
docs/adr/0005-sdk-architecture-seam-map.md
Normal file
@@ -0,0 +1,21 @@
# SDK Architecture seam map for query/runtime surfaces

- **Status:** Accepted
- **Date:** 2026-05-09

We decided to keep SDK architecture explicitly module-seamed rather than allow feature logic to spread across query handlers, runtime adapters, and compatibility shims. This ADR is the top-level map for SDK seams and their ownership boundaries.

## Decision

- Treat the SDK as a composition of explicit seam Modules with thin call-site Adapters.
- Keep compatibility policy isolated behind the **SDK Package Seam Module** (see `0007-sdk-package-seam-module.md`).
- Keep dispatch transport/outcome policy behind the **Dispatch Policy Module** and **SDK Runtime Bridge Module** (see `0001-dispatch-policy-module.md` amendment).
- Keep model/runtime profile resolution behind the **Model Catalog Module** (see `0003-model-catalog-module.md`).
- Keep planning/worktree/workstream path-state policy behind the **Planning Workspace Module** (see `0004-worktree-workstream-seam-module.md`).
- Keep planning path projection policy explicit and centralized (detailed in `0006-planning-path-projection-module.md`).

## Consequences

- SDK callers (`init*`, query handlers, runtime entry points) remain thin Adapters over stable interfaces.
- Changes to package layout compatibility, dispatch transport, model policy, and planning path policy are localized to owning Modules.
- Architecture reviews can classify drift quickly: if behavior changes outside the owning seam Module, it is a design violation.
20
docs/adr/0006-planning-path-projection-module.md
Normal file
@@ -0,0 +1,20 @@
# Planning Path Projection Module for SDK query handlers

- **Status:** Accepted
- **Date:** 2026-05-09

We decided to centralize SDK planning-path projection behind one Module interface instead of reconstructing `.planning` paths in each handler with ad-hoc joins. This deepens the planning seam and prevents path-policy drift between helper and caller layers.

## Decision

- `helpers.planningPaths(projectDir, workstream?)` is the canonical SDK projection interface for planning paths.
- `helpers.planningPaths` delegates to `workspacePlanningPaths` + `resolveWorkspaceContext` for policy, not duplicate local path composition.
- Policy precedence is explicit and stable: `explicit workstream > env workstream > env project > root`.
- Query/init handlers (`initExecutePhase`, `initPlanPhase`, `initPhaseOp`, `initMilestoneOp`) must consume `planningPaths(...).planning` rather than direct `relPlanningPath` joins.
- SDK project scope for planning is `.planning/<project>` (never `.planning/projects/<project>`), aligned with CJS planning workspace behavior.
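The stated precedence can be sketched as a pure function; `resolvePlanningScope` and `GSD_PROJECT` are hypothetical names for illustration (only `GSD_WORKSTREAM` is confirmed by this diff), and the workstream directory layout is assumed:

```javascript
// Illustrative precedence: explicit workstream > env workstream > env project > root.
// Project scope is .planning/<project>, never .planning/projects/<project>.
function resolvePlanningScope({ explicitWs = null, env = {} } = {}) {
  if (explicitWs) return `.planning/workstreams/${explicitWs}`;
  if (env.GSD_WORKSTREAM) return `.planning/workstreams/${env.GSD_WORKSTREAM}`;
  if (env.GSD_PROJECT) return `.planning/${env.GSD_PROJECT}`;
  return '.planning';
}
```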

## Consequences

- One fix in planning path policy updates all handlers and reduces regression surface.
- Tests can target seam behavior (`workspace.test.ts`, `helpers.test.ts`, init handler tests) instead of source-grep heuristics.
- Cross-package parity bugs between SDK and CJS planning path resolution become easier to detect and correct.
@@ -174,6 +174,7 @@ const path = require('path');
const core = require('./lib/core.cjs');
const { error, findProjectRoot } = core;
const { getActiveWorkstream } = require('./lib/planning-workspace.cjs');
const { resolveActiveWorkstream, applyResolvedWorkstreamEnv } = require('./lib/active-workstream-store.cjs');
const state = require('./lib/state.cjs');
const phase = require('./lib/phase.cjs');
const roadmap = require('./lib/roadmap.cjs');
@@ -275,30 +276,18 @@ async function main() {
  }

  // Optional workstream override for parallel milestone work.
  // Priority: --ws flag > GSD_WORKSTREAM env var > session-scoped pointer > shared legacy pointer > null
  const wsEqArg = args.find(arg => arg.startsWith('--ws='));
  const wsIdx = args.indexOf('--ws');
  // Priority: --ws flag > GSD_WORKSTREAM env var > session/shared pointer > null.
  let ws = null;
  if (wsEqArg) {
    ws = wsEqArg.slice('--ws='.length).trim();
    if (!ws) error('Missing value for --ws');
    args.splice(args.indexOf(wsEqArg), 1);
  } else if (wsIdx !== -1) {
    ws = args[wsIdx + 1];
    if (!ws || ws.startsWith('--')) error('Missing value for --ws');
    args.splice(wsIdx, 2);
  } else if (process.env.GSD_WORKSTREAM) {
    ws = process.env.GSD_WORKSTREAM.trim();
  } else {
    ws = getActiveWorkstream(cwd);
  }
  // Validate workstream name to prevent path traversal attacks.
  if (ws && !/^[a-zA-Z0-9_-]+$/.test(ws)) {
    error('Invalid workstream name: must be alphanumeric, hyphens, and underscores only');
  }
  // Set env var so all modules (planningDir, planningPaths) auto-resolve workstream paths
  if (ws) {
    process.env.GSD_WORKSTREAM = ws;
  try {
    const wsResolution = resolveActiveWorkstream(cwd, args, process.env, {
      getStored: getActiveWorkstream,
    });
    ws = wsResolution.ws;
    args = wsResolution.args;
    // Set env var so all modules (planningDir, planningPaths) auto-resolve workstream paths.
    applyResolvedWorkstreamEnv(wsResolution, process.env);
  } catch (err) {
    error(err.message || String(err));
  }

  const rawIndex = args.indexOf('--raw');

85
get-shit-done/bin/lib/active-workstream-store.cjs
Normal file
@@ -0,0 +1,85 @@
/**
 * Active Workstream Pointer Store Module
 *
 * Owns workstream source precedence and selection:
 * CLI --ws > GSD_WORKSTREAM env > stored active workstream pointer.
 */

const { getActiveWorkstream } = require('./planning-workspace.cjs');
const { isValidActiveWorkstreamName } = require('./workstream-name-policy.cjs');

function validateWorkstreamName(name) {
  return isValidActiveWorkstreamName(name);
}

function parseCliWorkstream(args) {
  const wsEqArg = args.find(arg => arg.startsWith('--ws='));
  const wsIdx = args.indexOf('--ws');

  if (wsEqArg) {
    const value = wsEqArg.slice('--ws='.length).trim();
    if (!value) throw new Error('Missing value for --ws');
    return {
      value,
      source: 'cli',
      args: args.filter(arg => arg !== wsEqArg),
    };
  }

  if (wsIdx !== -1) {
    const value = args[wsIdx + 1];
    if (!value || value.startsWith('--')) throw new Error('Missing value for --ws');
    return {
      value,
      source: 'cli',
      args: args.filter((_, idx) => idx !== wsIdx && idx !== wsIdx + 1),
    };
  }

  return {
    value: null,
    source: null,
    args: args.slice(),
  };
}

function resolveActiveWorkstream(cwd, args, env = process.env, deps = {}) {
  const parsed = parseCliWorkstream(args);
  const getStored = deps.getStored || getActiveWorkstream;

  let ws = null;
  let source = 'none';

  if (parsed.value) {
    ws = parsed.value;
    source = parsed.source;
  } else if (env && typeof env.GSD_WORKSTREAM === 'string' && env.GSD_WORKSTREAM.trim()) {
    ws = env.GSD_WORKSTREAM.trim();
    source = 'env';
  } else {
    ws = getStored(cwd) || null;
    source = ws ? 'store' : 'none';
  }

  if (ws && !validateWorkstreamName(ws)) {
    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, underscores, or dots');
  }

  return {
    ws,
    source,
    args: parsed.args,
  };
}

function applyResolvedWorkstreamEnv(resolution, env = process.env) {
  if (!resolution || !resolution.ws) return;
  env.GSD_WORKSTREAM = resolution.ws;
}

module.exports = {
  validateWorkstreamName,
  parseCliWorkstream,
  resolveActiveWorkstream,
  applyResolvedWorkstreamEnv,
};
@@ -8,6 +8,13 @@ const path = require('path');
const { execSync, execFileSync, spawnSync } = require('child_process');
const { MODEL_PROFILES, AGENT_TO_PHASE_TYPE, VALID_PHASE_TYPES, AGENT_DEFAULT_TIERS, VALID_AGENT_TIERS, nextTier } = require('./model-profiles.cjs');
const { MODEL_ALIAS_MAP, RUNTIME_PROFILE_MAP, KNOWN_RUNTIMES, RUNTIMES_WITH_REASONING_EFFORT } = require('./model-catalog.cjs');
const {
  resolveWorktreeContext,
  parseWorktreePorcelain: parseWorktreePorcelainPolicy,
  planWorktreePrune,
  executeWorktreePrunePlan,
  inspectWorktreeHealth,
} = require('./worktree-safety.cjs');
// Compatibility shim: new imports should use planning-workspace.cjs directly.
const {
  planningDir,
@@ -332,11 +339,14 @@ function _deepMergeConfig(base, overlay) {
  return result;
}

function loadConfig(cwd) {
function loadConfig(cwd, options = {}) {
  const activeWorkstream = Object.prototype.hasOwnProperty.call(options, 'workstream')
    ? options.workstream
    : (process.env.GSD_WORKSTREAM || null);
  // When GSD_WORKSTREAM is set, load root config first so workstream config
  // can inherit from it. This prevents users from duplicating model_overrides,
  // workflow.*, etc. across every workstream config (#2714).
  const ws = process.env.GSD_WORKSTREAM || null;
  const ws = activeWorkstream;
  let rootParsed = null;
  if (ws) {
    const rootConfigPath = path.join(planningRoot(cwd), 'config.json');
@@ -348,7 +358,7 @@ function loadConfig(cwd) {
    }
  }

  const configPath = path.join(planningDir(cwd), 'config.json');
  const configPath = path.join(planningDir(cwd, ws), 'config.json');
  const defaults = CONFIG_DEFAULTS;

  try {
@@ -535,18 +545,11 @@ function loadConfig(cwd) {
  // If .planning/ exists, the project is initialized — just missing config.json.
  // When GSD_WORKSTREAM is set and root config was loaded, the workstream config
  // doesn't exist — treat root config as the effective config for this workstream.
  if (fs.existsSync(planningDir(cwd))) {
  if (fs.existsSync(planningDir(cwd, ws))) {
    if (rootParsed) {
      // Workstream has no config.json: re-parse using root config as the sole source.
      // Temporarily clear GSD_WORKSTREAM so planningDir() returns root .planning/,
      // then reload. This is safe: rootParsed is already the root config object.
      const savedWs = process.env.GSD_WORKSTREAM;
      delete process.env.GSD_WORKSTREAM;
      try {
        return loadConfig(cwd);
      } finally {
        process.env.GSD_WORKSTREAM = savedWs;
      }
      // Keep env immutable by explicitly reloading with workstream context cleared.
      return loadConfig(cwd, { workstream: null });
    }
    return defaults;
  }

@@ -740,30 +743,11 @@ function execGit(cwd, args) {
  * Returns the main worktree path, or cwd if not in a worktree.
  */
 function resolveWorktreeRoot(cwd) {
-  // If the current directory already has its own .planning/, respect it.
-  // This handles linked worktrees with independent planning state (e.g., Conductor workspaces).
-  if (fs.existsSync(path.join(cwd, '.planning'))) {
-    return cwd;
-  }
-
-  // Check if we're in a linked worktree
-  const gitDir = execGit(cwd, ['rev-parse', '--git-dir']);
-  const commonDir = execGit(cwd, ['rev-parse', '--git-common-dir']);
-
-  if (gitDir.exitCode !== 0 || commonDir.exitCode !== 0) return cwd;
-
-  // In a linked worktree, .git is a file pointing to .git/worktrees/<name>
-  // and git-common-dir points to the main repo's .git directory
-  const gitDirResolved = path.resolve(cwd, gitDir.stdout);
-  const commonDirResolved = path.resolve(cwd, commonDir.stdout);
-
-  if (gitDirResolved !== commonDirResolved) {
-    // We're in a linked worktree — resolve main worktree root
-    // The common dir is the main repo's .git, so its parent is the main worktree root
-    return path.dirname(commonDirResolved);
-  }
-
-  return cwd;
+  const context = resolveWorktreeContext(cwd, {
+    execGit,
+    existsSync: fs.existsSync,
+  });
+  return context.effectiveRoot;
 }
 
 /**
@@ -775,21 +759,7 @@ function resolveWorktreeRoot(cwd) {
  * @returns {{ path: string, branch: string }[]}
  */
 function parseWorktreePorcelain(porcelain) {
-  const entries = [];
-  let current = null;
-  for (const line of porcelain.split('\n')) {
-    if (line.startsWith('worktree ')) {
-      current = { path: line.slice('worktree '.length).trim(), branch: null };
-    } else if (line.startsWith('branch refs/heads/') && current) {
-      current.branch = line.slice('branch refs/heads/'.length).trim();
-    } else if (line === '' && current) {
-      if (current.branch) entries.push(current);
-      current = null;
-    }
-  }
-  // flush last entry if file doesn't end with blank line
-  if (current && current.branch) entries.push(current);
-  return entries;
+  return parseWorktreePorcelainPolicy(porcelain);
 }
 
 /**
@@ -802,31 +772,15 @@ function parseWorktreePorcelain(porcelain) {
  * @returns {string[]} list of worktree paths that were removed (always empty)
  */
 function pruneOrphanedWorktrees(repoRoot) {
-  const pruned = [];
-  const cwd = process.cwd();
-
   try {
-    // 1. Get all worktrees in porcelain format
-    const listResult = execGit(repoRoot, ['worktree', 'list', '--porcelain']);
-    if (listResult.exitCode !== 0) return pruned;
-
-    const worktrees = parseWorktreePorcelain(listResult.stdout);
-    if (worktrees.length === 0) {
-      execGit(repoRoot, ['worktree', 'prune']);
-      return pruned;
-    }
-
-    // Destructive removal of linked worktrees is intentionally disabled.
-    // Keep metadata cleanup only (git worktree prune), which clears stale refs
-    // for manually-deleted directories without removing active sibling worktrees.
-    void cwd;
-    void worktrees;
+    const plan = planWorktreePrune(
+      repoRoot,
+      { allowDestructive: false },
+      { execGit, parseWorktreePorcelain }
+    );
+    executeWorktreePrunePlan(plan, { execGit });
   } catch { /* never crash the caller */ }
 
-  // Always run prune to clear stale references (e.g. manually-deleted dirs)
-  execGit(repoRoot, ['worktree', 'prune']);
-
-  return pruned;
+  return [];
 }
 
 // ─── Planning workspace (pathing + active workstream + lock) moved to planning-workspace.cjs ───
@@ -2047,4 +2001,5 @@ module.exports = {
   atomicWriteFileSync,
   timeAgo,
   pruneOrphanedWorktrees,
+  inspectWorktreeHealth,
 };
@@ -69,7 +69,7 @@ function getAgentToModelMapForProfile(normalizedProfile) {
   const profile = VALID_PROFILES.includes(normalizedProfile) ? normalizedProfile : 'balanced';
   const out = {};
   for (const [agent, profiles] of Object.entries(MODEL_PROFILES)) {
-    out[agent] = profile === 'inherit' ? 'inherit' : profiles[profile];
+    out[agent] = profile === 'inherit' ? 'inherit' : (profiles[profile] ?? profiles.balanced);
   }
   return out;
 }
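The one-line fallback (CR finding A) is easiest to see in a standalone sketch. The `MODEL_PROFILES` entries below are hypothetical, for illustration only; the real table lives in `model-profiles.cjs`:

```javascript
// Hypothetical catalog entries for illustration; the real table is in model-profiles.cjs.
const MODEL_PROFILES = {
  'gsd-executor': { quality: 'opus', balanced: 'sonnet', budget: 'haiku' },
  'gsd-verifier': { balanced: 'sonnet' }, // no 'quality' key: the case that crashed
};

function getAgentToModelMapForProfile(profile) {
  const out = {};
  for (const [agent, profiles] of Object.entries(MODEL_PROFILES)) {
    // profiles[profile] is undefined for 'gsd-verifier' under 'quality';
    // the `?? profiles.balanced` fallback keeps downstream `.length` callers alive.
    out[agent] = profiles[profile] ?? profiles.balanced;
  }
  return out;
}

console.log(getAgentToModelMapForProfile('quality'));
// { 'gsd-executor': 'opus', 'gsd-verifier': 'sonnet' }
```

Without the fallback, `getAgentToModelMapForProfile('quality')['gsd-verifier']` would be `undefined`, which is what `formatAgentToModelMapAsTable` tripped over.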
@@ -12,6 +12,7 @@ const os = require('os');
 const path = require('path');
 const crypto = require('crypto');
 const { execFileSync } = require('child_process');
+const { isValidActiveWorkstreamName } = require('./workstream-name-policy.cjs');
 
 const WORKSTREAM_SESSION_ENV_KEYS = [
   'GSD_SESSION_KEY',
@@ -235,7 +236,7 @@ function pickActiveWorkstreamAdapter(cwd, opts = {}) {
 }
 
 function validateWorkstreamName(name) {
-  return /^[a-zA-Z0-9_-]+$/.test(name);
+  return isValidActiveWorkstreamName(name);
 }
 
 function withPlanningLock(cwd, fn) {
@@ -333,7 +334,7 @@ function createPlanningWorkspace(cwd, opts = {}) {
     return;
   }
   if (!validateWorkstreamName(name)) {
-    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, and underscores only');
+    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, underscores, or dots');
   }
 
   const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
@@ -5,7 +5,7 @@
 const fs = require('fs');
 const path = require('path');
 const os = require('os');
-const { safeReadFile, loadConfig, normalizePhaseName, escapeRegex, execGit, findPhaseInternal, getMilestoneInfo, stripShippedMilestones, extractCurrentMilestone, output, error, checkAgentsInstalled, CONFIG_DEFAULTS } = require('./core.cjs');
+const { safeReadFile, loadConfig, normalizePhaseName, escapeRegex, execGit, findPhaseInternal, getMilestoneInfo, stripShippedMilestones, extractCurrentMilestone, output, error, checkAgentsInstalled, CONFIG_DEFAULTS, inspectWorktreeHealth } = require('./core.cjs');
 const { planningDir } = require('./planning-workspace.cjs');
 const { extractFrontmatter, parseMustHavesBlock } = require('./frontmatter.cjs');
 const { writeStateMd } = require('./state.cjs');
@@ -908,33 +908,24 @@ function cmdValidateHealth(cwd, options, raw) {
 
   // ─── Check 11: Stale / orphan git worktrees (#2167) ────────────────────────
   try {
-    const worktreeResult = execGit(cwd, ['worktree', 'list', '--porcelain']);
-    if (worktreeResult.exitCode === 0 && worktreeResult.stdout) {
-      const blocks = worktreeResult.stdout.split('\n\n').filter(Boolean);
-      // Skip the first block — it is always the main worktree
-      for (let i = 1; i < blocks.length; i++) {
-        const lines = blocks[i].split('\n');
-        const wtLine = lines.find(l => l.startsWith('worktree '));
-        if (!wtLine) continue;
-        const wtPath = wtLine.slice('worktree '.length);
-
-        if (!fs.existsSync(wtPath)) {
-          // Orphan: path no longer exists on disk
+    const worktreeHealth = inspectWorktreeHealth(
+      cwd,
+      { staleAfterMs: 60 * 60 * 1000 },
+      { execGit, existsSync: fs.existsSync, statSync: fs.statSync }
+    );
+    if (worktreeHealth.ok) {
+      for (const finding of worktreeHealth.findings) {
+        if (finding.kind === 'orphan') {
           addIssue('warning', 'W017',
-            `Orphan git worktree: ${wtPath} (path no longer exists on disk)`,
+            `Orphan git worktree: ${finding.path} (path no longer exists on disk)`,
             'Run: git worktree prune');
-        } else {
-          // Check if stale (older than 1 hour)
-          try {
-            const stat = fs.statSync(wtPath);
-            const ageMs = Date.now() - stat.mtimeMs;
-            const ONE_HOUR = 60 * 60 * 1000;
-            if (ageMs > ONE_HOUR) {
-              addIssue('warning', 'W017',
-                `Stale git worktree: ${wtPath} (last modified ${Math.round(ageMs / 60000)} minutes ago)`,
-                `Run: git worktree remove ${wtPath} --force`);
-            }
-          } catch { /* stat failed — skip */ }
+          continue;
         }
+
+        if (finding.kind === 'stale') {
+          addIssue('warning', 'W017',
+            `Stale git worktree: ${finding.path} (last modified ${finding.ageMinutes} minutes ago)`,
+            `Run: git worktree remove ${finding.path} --force`);
+        }
       }
     }
33 get-shit-done/bin/lib/workstream-name-policy.cjs (new file)
@@ -0,0 +1,33 @@
/**
 * Workstream Name Policy Module
 *
 * Owns canonical name validation and slug normalization used by workstream and
 * active-pointer callers.
 */

const ACTIVE_WORKSTREAM_RE = /^[a-zA-Z0-9][a-zA-Z0-9._-]*$/;

function toWorkstreamSlug(name) {
  return String(name || '')
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')
    .replace(/^-+|-+$/g, '');
}

function hasInvalidPathSegment(name) {
  const value = String(name || '');
  return /[/\\]/.test(value) || value === '.' || value === '..' || value.includes('..');
}

function isValidActiveWorkstreamName(name) {
  const value = String(name || '');
  if (value === '..' || value.startsWith('../') || value.includes('..')) return false;
  return ACTIVE_WORKSTREAM_RE.test(value);
}

module.exports = {
  toWorkstreamSlug,
  hasInvalidPathSegment,
  isValidActiveWorkstreamName,
};

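A quick sketch of how the new policy module behaves. The two functions below are copied verbatim from `workstream-name-policy.cjs`, standalone so they can be run directly:

```javascript
// Copied from workstream-name-policy.cjs for a runnable illustration.
const ACTIVE_WORKSTREAM_RE = /^[a-zA-Z0-9][a-zA-Z0-9._-]*$/;

function toWorkstreamSlug(name) {
  return String(name || '')
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')   // collapse every non-alphanumeric run to one hyphen
    .replace(/^-+|-+$/g, '');      // trim leading/trailing hyphens
}

function isValidActiveWorkstreamName(name) {
  const value = String(name || '');
  if (value === '..' || value.startsWith('../') || value.includes('..')) return false;
  return ACTIVE_WORKSTREAM_RE.test(value);
}

console.log(toWorkstreamSlug('My Feature!! v2'));      // my-feature-v2
console.log(isValidActiveWorkstreamName('ws.a-1'));    // true
console.log(isValidActiveWorkstreamName('../escape')); // false
console.log(isValidActiveWorkstreamName('-leading'));  // false (must start alphanumeric)
```

This is why `--migrate-name` now round-trips through `toWorkstreamSlug`: any non-empty free-form input lands on a name that `isValidActiveWorkstreamName` will later accept.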
@@ -13,6 +13,7 @@ const path = require('path');
 const { output, error, toPosixPath, getMilestoneInfo, generateSlugInternal, filterPlanFiles, filterSummaryFiles, readSubdirectories } = require('./core.cjs');
 const { planningPaths, planningRoot, setActiveWorkstream, getActiveWorkstream } = require('./planning-workspace.cjs');
 const { stateExtractField } = require('./state.cjs');
+const { toWorkstreamSlug, hasInvalidPathSegment, isValidActiveWorkstreamName } = require('./workstream-name-policy.cjs');
 
 // ─── Migration ──────────────────────────────────────────────────────────────
 
@@ -23,7 +24,7 @@ const { stateExtractField } = require('./state.cjs');
  * milestones/, research/, codebase/, todos/) stay in place.
  */
 function migrateToWorkstreams(cwd, workstreamName) {
-  if (!workstreamName || /[/\\]/.test(workstreamName) || workstreamName === '.' || workstreamName === '..') {
+  if (!workstreamName || hasInvalidPathSegment(workstreamName)) {
     throw new Error('Invalid workstream name for migration');
   }
 
@@ -72,7 +73,7 @@ function cmdWorkstreamCreate(cwd, name, options, raw) {
     error('workstream name required. Usage: workstream create <name>');
   }
 
-  const slug = name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-+|-+$/g, '');
+  const slug = toWorkstreamSlug(name);
   if (!slug) {
     error('Invalid workstream name — must contain at least one alphanumeric character');
   }
@@ -101,7 +102,15 @@ function cmdWorkstreamCreate(cwd, name, options, raw) {
   const migrateName = options.migrateName || null;
   let existingWsName;
   if (migrateName) {
-    existingWsName = migrateName;
+    existingWsName = toWorkstreamSlug(migrateName);
+    if (!existingWsName) {
+      output({
+        created: false,
+        error: 'migration_failed',
+        message: 'Invalid migrate-name — must contain at least one alphanumeric character',
+      }, raw);
+      return;
+    }
   } else {
     try {
       const milestone = getMilestoneInfo(cwd);
@@ -222,7 +231,7 @@ function cmdWorkstreamList(cwd, raw) {
 
 function cmdWorkstreamStatus(cwd, name, raw) {
   if (!name) error('workstream name required. Usage: workstream status <name>');
-  if (/[/\\]/.test(name) || name === '.' || name === '..') error('Invalid workstream name');
+  if (hasInvalidPathSegment(name)) error('Invalid workstream name');
 
   const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
   if (!fs.existsSync(wsDir)) {
@@ -279,7 +288,7 @@ function cmdWorkstreamStatus(cwd, name, raw) {
 
 function cmdWorkstreamComplete(cwd, name, options, raw) {
   if (!name) error('workstream name required. Usage: workstream complete <name>');
-  if (/[/\\]/.test(name) || name === '.' || name === '..') error('Invalid workstream name');
+  if (hasInvalidPathSegment(name)) error('Invalid workstream name');
 
   const root = planningRoot(cwd);
   const wsRoot = path.join(root, 'workstreams');
@@ -350,8 +359,8 @@ function cmdWorkstreamSet(cwd, name, raw) {
     return;
   }
 
-  if (!/^[a-zA-Z0-9_-]+$/.test(name)) {
-    output({ active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, and underscores only' }, raw);
+  if (!isValidActiveWorkstreamName(name)) {
+    output({ active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, underscores, or dots' }, raw);
     return;
   }
287 get-shit-done/bin/lib/worktree-safety.cjs (new file)
@@ -0,0 +1,287 @@
/**
 * Worktree Safety Policy Module
 *
 * Owns worktree-root resolution and non-destructive prune policy decisions.
 */

const fs = require('fs');
const path = require('path');
const { spawnSync } = require('child_process');

function execGitDefault(cwd, args) {
  const result = spawnSync('git', args, {
    cwd,
    stdio: 'pipe',
    encoding: 'utf-8',
  });
  return {
    exitCode: result.status ?? 1,
    stdout: (result.stdout ?? '').toString().trim(),
    stderr: (result.stderr ?? '').toString().trim(),
  };
}

function parseWorktreePorcelain(porcelain) {
  return parseWorktreeEntries(porcelain).filter((entry) => entry.branch).map((entry) => ({
    path: entry.path,
    branch: entry.branch,
  }));
}

function parseWorktreeEntries(porcelain) {
  const entries = [];
  const blocks = String(porcelain || '').split('\n\n').filter(Boolean);
  for (const block of blocks) {
    const lines = block.split('\n');
    const worktreeLine = lines.find((l) => l.startsWith('worktree '));
    if (!worktreeLine) continue;
    const worktreePath = worktreeLine.slice('worktree '.length).trim();
    if (!worktreePath) continue;
    const branchLine = lines.find((l) => l.startsWith('branch refs/heads/'));
    const branch = branchLine ? branchLine.slice('branch refs/heads/'.length).trim() : null;
    entries.push({ path: worktreePath, branch });
  }
  return entries;
}

function parseWorktreeListPaths(porcelain) {
  return parseWorktreeEntries(porcelain).map((entry) => entry.path);
}

function readWorktreeList(repoRoot, deps = {}) {
  const execGit = deps.execGit || execGitDefault;
  const listResult = execGit(repoRoot, ['worktree', 'list', '--porcelain']);
  if (listResult.exitCode !== 0) {
    return {
      ok: false,
      reason: 'git_list_failed',
      porcelain: '',
      entries: [],
    };
  }

  return {
    ok: true,
    reason: 'ok',
    porcelain: listResult.stdout,
    entries: parseWorktreeEntries(listResult.stdout),
  };
}

function resolveWorktreeContext(cwd, deps = {}) {
  const execGit = deps.execGit || execGitDefault;
  const existsSync = deps.existsSync || fs.existsSync;

  // Local .planning takes precedence over linked-worktree remapping.
  if (existsSync(path.join(cwd, '.planning'))) {
    return {
      effectiveRoot: cwd,
      mode: 'current_directory',
      reason: 'has_local_planning',
    };
  }

  const gitDir = execGit(cwd, ['rev-parse', '--git-dir']);
  const commonDir = execGit(cwd, ['rev-parse', '--git-common-dir']);
  if (gitDir.exitCode !== 0 || commonDir.exitCode !== 0) {
    return {
      effectiveRoot: cwd,
      mode: 'current_directory',
      reason: 'not_git_repo',
    };
  }

  const gitDirResolved = path.resolve(cwd, gitDir.stdout);
  const commonDirResolved = path.resolve(cwd, commonDir.stdout);
  if (gitDirResolved !== commonDirResolved) {
    return {
      effectiveRoot: path.dirname(commonDirResolved),
      mode: 'linked_worktree_root',
      reason: 'linked_worktree',
    };
  }

  return {
    effectiveRoot: cwd,
    mode: 'current_directory',
    reason: 'main_worktree',
  };
}

function planWorktreePrune(repoRoot, options = {}, deps = {}) {
  const parsePorcelain = deps.parseWorktreePorcelain || parseWorktreePorcelain;
  const destructiveModeRequested = Boolean(options.allowDestructive);
  const listed = readWorktreeList(repoRoot, deps);
  if (!listed.ok) {
    return {
      repoRoot,
      action: 'skip',
      reason: listed.reason,
      destructiveModeRequested,
    };
  }

  let worktrees = [];
  try {
    worktrees = parsePorcelain(listed.porcelain);
  } catch {
    // Keep historical behavior: still run metadata prune when parsing fails.
    worktrees = [];
  }

  return {
    repoRoot,
    action: 'metadata_prune_only',
    reason: worktrees.length === 0 ? 'no_worktrees' : 'worktrees_present',
    destructiveModeRequested,
  };
}

function executeWorktreePrunePlan(plan, deps = {}) {
  const execGit = deps.execGit || execGitDefault;
  if (!plan || plan.action === 'skip') {
    return {
      ok: false,
      action: plan ? plan.action : 'skip',
      reason: plan ? plan.reason : 'missing_plan',
      pruned: [],
    };
  }

  if (plan.action !== 'metadata_prune_only') {
    return {
      ok: false,
      action: plan.action,
      reason: 'unsupported_action',
      pruned: [],
    };
  }

  const result = execGit(plan.repoRoot, ['worktree', 'prune']);
  return {
    ok: result.exitCode === 0,
    action: plan.action,
    reason: plan.reason,
    pruned: [],
  };
}

function listLinkedWorktreePaths(repoRoot, deps = {}) {
  const listed = readWorktreeList(repoRoot, deps);
  if (!listed.ok) {
    return {
      ok: false,
      reason: listed.reason,
      paths: [],
    };
  }

  const allPaths = listed.entries.map((entry) => entry.path);
  // git worktree list always includes the current/main worktree first.
  return {
    ok: true,
    reason: 'ok',
    paths: allPaths.slice(1),
  };
}

function inspectWorktreeHealth(repoRoot, options = {}, deps = {}) {
  const inventory = snapshotWorktreeInventory(repoRoot, options, deps);
  if (!inventory.ok) {
    return {
      ok: false,
      reason: inventory.reason,
      findings: [],
    };
  }

  const findings = [];
  for (const entry of inventory.entries) {
    if (!entry.exists) {
      findings.push({
        kind: 'orphan',
        path: entry.path,
      });
      continue;
    }
    if (entry.isStale) {
      findings.push({
        kind: 'stale',
        path: entry.path,
        ageMinutes: entry.ageMinutes,
      });
    }
  }

  return {
    ok: true,
    reason: 'ok',
    findings,
  };
}

function snapshotWorktreeInventory(repoRoot, options = {}, deps = {}) {
  const existsSync = deps.existsSync || fs.existsSync;
  const statSync = deps.statSync || fs.statSync;
  const staleAfterMs = options.staleAfterMs ?? (60 * 60 * 1000);
  const nowMs = options.nowMs ?? Date.now();
  const listed = listLinkedWorktreePaths(repoRoot, { execGit: deps.execGit || execGitDefault });
  if (!listed.ok) {
    return {
      ok: false,
      reason: listed.reason,
      entries: [],
    };
  }

  const entries = [];
  for (const worktreePath of listed.paths) {
    let exists = false;
    let isStale = false;
    let ageMinutes = null;

    if (!existsSync(worktreePath)) {
      entries.push({
        path: worktreePath,
        exists,
        isStale,
        ageMinutes,
      });
      continue;
    }

    exists = true;
    try {
      const stat = statSync(worktreePath);
      const ageMs = nowMs - stat.mtimeMs;
      ageMinutes = Math.round(ageMs / 60000);
      if (ageMs > staleAfterMs) {
        isStale = true;
      }
    } catch {
      // Keep historical behavior: stat failures are ignored.
    }
    entries.push({
      path: worktreePath,
      exists,
      isStale,
      ageMinutes,
    });
  }

  return {
    ok: true,
    reason: 'ok',
    entries,
  };
}

module.exports = {
  resolveWorktreeContext,
  parseWorktreePorcelain,
  planWorktreePrune,
  executeWorktreePrunePlan,
  listLinkedWorktreePaths,
  inspectWorktreeHealth,
  snapshotWorktreeInventory,
};

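The porcelain parsing at the heart of this module can be exercised standalone. This sketch copies `parseWorktreeEntries` verbatim and feeds it a fabricated `git worktree list --porcelain` transcript (the paths are made up); note that a detached-HEAD worktree keeps its entry with `branch: null`, which is why `parseWorktreePorcelain` filters on `entry.branch` while the health checks use the unfiltered entries:

```javascript
// Copied from worktree-safety.cjs for a runnable illustration.
function parseWorktreeEntries(porcelain) {
  const entries = [];
  const blocks = String(porcelain || '').split('\n\n').filter(Boolean);
  for (const block of blocks) {
    const lines = block.split('\n');
    const worktreeLine = lines.find((l) => l.startsWith('worktree '));
    if (!worktreeLine) continue;
    const worktreePath = worktreeLine.slice('worktree '.length).trim();
    if (!worktreePath) continue;
    const branchLine = lines.find((l) => l.startsWith('branch refs/heads/'));
    const branch = branchLine ? branchLine.slice('branch refs/heads/'.length).trim() : null;
    entries.push({ path: worktreePath, branch });
  }
  return entries;
}

// Fabricated porcelain output: one main worktree, one detached linked worktree.
const porcelain = [
  'worktree /repo',
  'HEAD abc123',
  'branch refs/heads/main',
  '',
  'worktree /repo-wt',
  'HEAD def456',
  'detached',
  '',
].join('\n');

console.log(parseWorktreeEntries(porcelain));
// [ { path: '/repo', branch: 'main' }, { path: '/repo-wt', branch: null } ]
```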
@@ -382,6 +382,7 @@ AskUserQuestion([
   { label: "qwen", description: "Qwen CLI." },
   { label: "opencode", description: "OpenCode (uses anthropic/ prefix)." },
   { label: "copilot", description: "GitHub Copilot." },
+  { label: "hermes", description: "Hermes (uses anthropic/ prefix)." },
   { label: "Other (Group B or custom)", description: "kilo, cline, cursor, windsurf, augment, trae, codebuddy, antigravity, or a custom runtime string. Overrides are honored even though no built-in map exists." }
 ]
}
50 sdk/src/query/active-workstream-store.ts (new file)
@@ -0,0 +1,50 @@
import { readFileSync, writeFileSync, unlinkSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { validateWorkstreamName } from '../workstream-utils.js';

function pointerPath(projectDir: string): string {
  return join(projectDir, '.planning', 'active-workstream');
}

function workstreamDir(projectDir: string, name: string): string {
  return join(projectDir, '.planning', 'workstreams', name);
}

/**
 * Read active workstream pointer from `.planning/active-workstream`.
 * Invalid or stale pointers are self-healed by clearing the file.
 */
export function readActiveWorkstream(projectDir: string): string | null {
  const filePath = pointerPath(projectDir);
  try {
    const name = readFileSync(filePath, 'utf-8').trim();
    if (!name || !validateWorkstreamName(name)) {
      try { unlinkSync(filePath); } catch { /* already gone */ }
      return null;
    }
    if (!existsSync(workstreamDir(projectDir, name))) {
      try { unlinkSync(filePath); } catch { /* already gone */ }
      return null;
    }
    return name;
  } catch {
    return null;
  }
}

export function writeActiveWorkstream(projectDir: string, name: string | null): void {
  const filePath = pointerPath(projectDir);
  if (!name) {
    try { unlinkSync(filePath); } catch { /* already gone */ }
    return;
  }
  if (!validateWorkstreamName(name)) {
    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, underscores, or dots');
  }
  const wsDir = workstreamDir(projectDir, name);
  if (!existsSync(wsDir)) {
    throw new Error(`Workstream directory does not exist: ${name}`);
  }
  writeFileSync(filePath, name + '\n', 'utf-8');
}

@@ -5,6 +5,7 @@
 import { describe, it, expect, beforeEach, afterEach } from 'vitest';
 import { mkdtemp, writeFile, mkdir, rm, readdir } from 'node:fs/promises';
 import { join, resolve } from 'node:path';
+import { fileURLToPath } from 'node:url';
 import { tmpdir } from 'node:os';
 import { GSDError, ErrorClassification, exitCodeFor } from '../errors.js';
 
@@ -217,7 +218,8 @@ describe('resolveModel', () => {
 describe('MODEL_PROFILES', () => {
   it('contains every shipped gsd agent file on disk (#3229)', async () => {
     const { MODEL_PROFILES } = await import('./config-query.js');
-    const repoRoot = resolve(process.cwd(), '..');
+    // config-query.test.ts lives at sdk/src/query/ — three levels from repo root
+    const repoRoot = resolve(fileURLToPath(new URL('../../../', import.meta.url)));
     const agentFiles = (await readdir(join(repoRoot, 'agents')))
       .filter((f) => /^gsd-.*\.md$/.test(f))
       .map((f) => f.replace(/\.md$/, ''))
@@ -193,6 +193,11 @@ describe('stateExtractField', () => {
 // ─── planningPaths ──────────────────────────────────────────────────────────
 
 describe('planningPaths', () => {
+  afterEach(() => {
+    delete process.env['GSD_WORKSTREAM'];
+    delete process.env['GSD_PROJECT'];
+  });
+
   it('returns all expected keys', () => {
     const paths = planningPaths('/proj');
     expect(paths).toHaveProperty('planning');
@@ -209,6 +214,19 @@ describe('planningPaths', () => {
     expect(paths.state).toContain('.planning/STATE.md');
     expect(paths.config).toContain('.planning/config.json');
   });
+
+  it('uses GSD_PROJECT env when no explicit workstream is provided', () => {
+    process.env['GSD_PROJECT'] = 'proj-scope';
+    const paths = planningPaths('/proj');
+    expect(paths.planning).toContain('/proj/.planning/proj-scope');
+  });
+
+  it('explicit workstream overrides GSD_PROJECT env', () => {
+    process.env['GSD_PROJECT'] = 'proj-scope';
+    const paths = planningPaths('/proj', 'ws-a');
+    expect(paths.planning).toContain('/proj/.planning/workstreams/ws-a');
+    expect(paths.planning).not.toContain('proj-scope');
+  });
 });
 
 // ─── normalizeMd ───────────────────────────────────────────────────────────
@@ -22,9 +22,10 @@ import { realpath } from 'node:fs/promises';
 import { existsSync, statSync, readFileSync } from 'node:fs';
 import { homedir } from 'node:os';
 import { GSDError, ErrorClassification } from '../errors.js';
-import { relPlanningPath } from '../workstream-utils.js';
-export { SUPPORTED_RUNTIMES, type Runtime } from '../model-catalog.js';
+import { SUPPORTED_RUNTIMES, type Runtime } from '../model-catalog.js';
+import { workspacePlanningPaths, resolveWorkspaceContext, type PlanningPaths } from './workspace.js';
+import { relPlanningPath, validateWorkstreamName } from '../workstream-utils.js';
 
 // ─── Runtime-aware agents directory resolution ─────────────────────────────
 
@@ -173,15 +174,7 @@ export function renderGlobalSkillDisplayPath(runtime: Runtime, skillName: string
 // ─── Types ──────────────────────────────────────────────────────────────────
 
 /** Paths to common .planning files. */
-export interface PlanningPaths {
-  planning: string;
-  state: string;
-  roadmap: string;
-  project: string;
-  config: string;
-  phases: string;
-  requirements: string;
-}
+export type { PlanningPaths } from './workspace.js';
 
 // ─── escapeRegex ────────────────────────────────────────────────────────────
@@ -462,11 +455,23 @@ export function normalizeMd(content: string): string {
  * All paths returned in POSIX format.
  *
  * @param projectDir - Root project directory
- * @param workstream - Optional workstream name (see relPlanningPath)
+ * @param workstream - Optional workstream name
  * @returns Object with paths to common .planning files
  */
 export function planningPaths(projectDir: string, workstream?: string): PlanningPaths {
-  const base = join(projectDir, relPlanningPath(workstream));
+  const envCtx = resolveWorkspaceContext();
+  // Validate env workstream before use: invalid GSD_WORKSTREAM falls back to
+  // root .planning/ (bug-2791 contract — invalid env must not crash or route
+  // to a bad path; silent fallback to root preserves pre-#3269 behaviour).
+  const validEnvWorkstream =
+    envCtx.workstream && validateWorkstreamName(envCtx.workstream) ? envCtx.workstream : null;
+  const effectiveWorkstream = workstream ?? validEnvWorkstream;
+  // Use relPlanningPath(workstream) to scope the base path per workstream policy.
+  const base = join(projectDir, relPlanningPath(effectiveWorkstream ?? undefined));
+  // For env-sourced project scoping (no explicit workstream), delegate to workspace.
+  if (!effectiveWorkstream && envCtx.project) {
+    return workspacePlanningPaths(projectDir, { workstream: null, project: envCtx.project });
+  }
   return {
     planning: toPosixPath(base),
     state: toPosixPath(join(base, 'STATE.md')),
@@ -29,7 +29,6 @@ import { maskIfSecret } from './secrets.js';
 import { findPhase } from './phase.js';
 import { roadmapGetPhase, getMilestoneInfo, extractCurrentMilestone, extractPhasesFromSection } from './roadmap.js';
 import { planningPaths, normalizePhaseName, toPosixPath, resolveAgentsDir, detectRuntime } from './helpers.js';
-import { relPlanningPath } from '../workstream-utils.js';
 import type { QueryHandler } from './utils.js';

 // ─── Internal helpers ──────────────────────────────────────────────────────

@@ -279,7 +278,8 @@ export const initExecutePhase: QueryHandler = async (args, projectDir, workstrea
   }

   const config = await loadConfig(projectDir);
-  const planningDir = join(projectDir, relPlanningPath(workstream));
+  const paths = planningPaths(projectDir, workstream);
+  const planningDir = paths.planning;

   const { phaseInfo, roadmapPhase } = await getPhaseInfoWithFallback(phase, projectDir, workstream);
   const phase_req_ids = extractReqIds(roadmapPhase);

@@ -361,7 +361,8 @@ export const initPlanPhase: QueryHandler = async (args, projectDir, workstream)
   }

   const config = await loadConfig(projectDir);
-  const planningDir = join(projectDir, relPlanningPath(workstream));
+  const paths = planningPaths(projectDir, workstream);
+  const planningDir = paths.planning;

   const { phaseInfo, roadmapPhase } = await getPhaseInfoWithFallback(phase, projectDir, workstream);
   const phase_req_ids = extractReqIds(roadmapPhase);

@@ -630,7 +631,8 @@ export const initPhaseOp: QueryHandler = async (args, projectDir, workstream) =>
   }

   const config = await loadConfig(projectDir);
-  const planningDir = join(projectDir, relPlanningPath(workstream));
+  const paths = planningPaths(projectDir, workstream);
+  const planningDir = paths.planning;

   // findPhase with archived override: if only match is archived, prefer ROADMAP
   const phaseResult = await findPhase([phase], projectDir, workstream);

@@ -796,7 +798,8 @@ export const initTodos: QueryHandler = async (args, projectDir) => {
  */
 export const initMilestoneOp: QueryHandler = async (_args, projectDir, workstream) => {
   const config = await loadConfig(projectDir);
-  const planningDir = join(projectDir, relPlanningPath(workstream));
+  const paths = planningPaths(projectDir, workstream);
+  const planningDir = paths.planning;
   const milestone = await getMilestoneInfo(projectDir, workstream);

   const phasesDir = join(planningDir, 'phases');
@@ -1,7 +1,6 @@
 import { join } from 'node:path';
-import { readFileSync, existsSync } from 'node:fs';
 import { findProjectRoot } from './helpers.js';
-import { validateWorkstreamName } from '../workstream-utils.js';
+import { readActiveWorkstream } from './active-workstream-store.js';

 export interface QueryRuntimeContextInput {
   projectDir: string;

@@ -13,26 +12,6 @@ export interface QueryRuntimeContext {
   ws?: string;
 }

-/**
- * Read the active workstream from `.planning/active-workstream` file.
- *
- * Mirrors the logic in workstream.ts:getActiveWorkstream — returns null
- * when the file is missing, empty, contains invalid characters, or names
- * a workstream directory that doesn't exist on disk.
- */
-function readActiveWorkstreamFile(projectDir: string): string | null {
-  const filePath = join(projectDir, '.planning', 'active-workstream');
-  try {
-    const name = readFileSync(filePath, 'utf-8').trim();
-    if (!name || !validateWorkstreamName(name)) return null;
-    const wsDir = join(projectDir, '.planning', 'workstreams', name);
-    if (!existsSync(wsDir)) return null;
-    return name;
-  } catch {
-    return null;
-  }
-}
-
 /**
  * Resolve the runtime context for a query invocation.
  *

@@ -57,7 +36,7 @@ export function resolveQueryRuntimeContext(input: QueryRuntimeContextInput): Que
     return { projectDir, ws: envWs };
   }

-  const fileWs = readActiveWorkstreamFile(projectDir);
+  const fileWs = readActiveWorkstream(projectDir);
   return {
     projectDir,
     ws: fileWs ?? undefined,
@@ -69,10 +69,11 @@ describe('workspacePlanningPaths', () => {
     expect(paths.phases).toContain('workstreams/backend/phases');
   });

-  it('scopes to .planning/projects/<project> when project set', () => {
+  it('scopes to .planning/<project> when project set (CJS parity)', () => {
     const paths = workspacePlanningPaths(projectDir, { workstream: null, project: 'api-server' });
-    expect(paths.planning).toContain('projects/api-server');
-    expect(paths.state).toContain('projects/api-server/STATE.md');
+    expect(paths.planning).toContain('.planning/api-server');
+    expect(paths.planning).not.toContain('projects/');
+    expect(paths.state).toContain('.planning/api-server/STATE.md');
   });

   it('workstream takes precedence over project when both set', () => {
@@ -21,8 +21,20 @@

 import { join } from 'node:path';
 import { GSDError, ErrorClassification } from '../errors.js';
-import { toPosixPath } from './helpers.js';
-import type { PlanningPaths } from './helpers.js';

+export interface PlanningPaths {
+  planning: string;
+  state: string;
+  roadmap: string;
+  project: string;
+  config: string;
+  phases: string;
+  requirements: string;
+}
+
+function toPosixPath(p: string): string {
+  return p.split('\\').join('/');
+}

 // ─── Types ─────────────────────────────────────────────────────────────────

@@ -93,7 +105,7 @@ export function resolveWorkspaceContext(): WorkspaceContext {
 * Return PlanningPaths scoped to the active workspace or project.
 *
 * When context has a workstream set: base = .planning/workstreams/<ws>/
-* When context has a project set: base = .planning/projects/<project>/
+* When context has a project set: base = .planning/<project>/
 * When context is null or empty: base = .planning/ (default)
 *
 * Workspace and project names are validated before path construction.

@@ -114,7 +126,9 @@ export function workspacePlanningPaths(
     base = join(projectDir, '.planning', 'workstreams', context.workstream);
   } else if (context?.project != null) {
     validateWorkspaceName(context.project, 'project');
-    base = join(projectDir, '.planning', 'projects', context.project);
+    // Match CJS planningDir() policy: project scopes under `.planning/<project>/`
+    // (not `.planning/projects/<project>/`).
+    base = join(projectDir, '.planning', context.project);
   } else {
     base = join(projectDir, '.planning');
   }
||||
@@ -24,6 +24,8 @@ import { join, relative } from 'node:path';

 import { toPosixPath, stateExtractField } from './helpers.js';
 import { GSDError, ErrorClassification } from '../errors.js';
+import { validateWorkstreamName, toWorkstreamSlug } from '../workstream-name-policy.js';
+import { readActiveWorkstream, writeActiveWorkstream } from './active-workstream-store.js';
 import type { QueryHandler } from './utils.js';

 // ─── Internal helpers ─────────────────────────────────────────────────────

@@ -58,37 +60,6 @@ function filterSummaryFiles(files: string[]): string[] {
   return files.filter(f => f.endsWith('-SUMMARY.md') || f === 'SUMMARY.md');
 }

-function getActiveWorkstream(projectDir: string): string | null {
-  const filePath = join(planningRoot(projectDir), 'active-workstream');
-  try {
-    const name = readFileSync(filePath, 'utf-8').trim();
-    if (!name || !/^[a-zA-Z0-9_-]+$/.test(name)) {
-      try { unlinkSync(filePath); } catch { /* already gone */ }
-      return null;
-    }
-    const wsDir = join(workstreamsDir(projectDir), name);
-    if (!existsSync(wsDir)) {
-      try { unlinkSync(filePath); } catch { /* already gone */ }
-      return null;
-    }
-    return name;
-  } catch {
-    return null;
-  }
-}
-
-function setActiveWorkstream(projectDir: string, name: string | null): void {
-  const filePath = join(planningRoot(projectDir), 'active-workstream');
-  if (!name) {
-    try { unlinkSync(filePath); } catch { /* already gone */ }
-    return;
-  }
-  if (!/^[a-zA-Z0-9_-]+$/.test(name)) {
-    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, and underscores only');
-  }
-  writeFileSync(filePath, name + '\n', 'utf-8');
-}
-
 // ─── Handlers ─────────────────────────────────────────────────────────────

 /**
@@ -97,7 +68,7 @@ function setActiveWorkstream(projectDir: string, name: string | null): void {
  * Port of `cmdWorkstreamGet` from `workstream.cjs` lines 367–371.
  */
 export const workstreamGet: QueryHandler = async (_args, projectDir) => {
-  const active = getActiveWorkstream(projectDir);
+  const active = readActiveWorkstream(projectDir);
   const wsRoot = workstreamsDir(projectDir);
   return {
     data: {

@@ -126,7 +97,7 @@ export const workstreamCreate: QueryHandler = async (args, projectDir) => {
     return { data: { created: false, reason: 'invalid workstream name — path separators not allowed' } };
   }

-  const slug = rawName.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-+|-+$/g, '');
+  const slug = toWorkstreamSlug(rawName);
   if (!slug) return { data: { created: false, reason: 'invalid workstream name — must contain at least one alphanumeric character' } };

   const baseDir = planningRoot(projectDir);

@@ -174,7 +145,7 @@ export const workstreamCreate: QueryHandler = async (args, projectDir) => {
     writeFileSync(statePath, stateContent, 'utf-8');
   }

-  setActiveWorkstream(projectDir, slug);
+  writeActiveWorkstream(projectDir, slug);

   const relPath = toPosixPath(relative(projectDir, wsDir));
   return {

@@ -215,13 +186,13 @@ export const workstreamSet: QueryHandler = async (args, projectDir) => {
     if (name !== '--clear') {
       return { data: { set: false, reason: 'name required. Usage: workstream set <name> (or workstream set --clear to unset)' } };
     }
-    const previous = getActiveWorkstream(projectDir);
-    setActiveWorkstream(projectDir, null);
+    const previous = readActiveWorkstream(projectDir);
+    writeActiveWorkstream(projectDir, null);
     return { data: { active: null, cleared: true, previous: previous || null } };
   }

-  if (!/^[a-zA-Z0-9_-]+$/.test(name)) {
-    return { data: { active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, and underscores only' } };
+  if (!validateWorkstreamName(name)) {
+    return { data: { active: null, error: 'invalid_name', message: 'Workstream name must be alphanumeric, hyphens, underscores, or dots only' } };
   }

   const wsDir = join(workstreamsDir(projectDir), name);

@@ -229,7 +200,7 @@ export const workstreamSet: QueryHandler = async (args, projectDir) => {
     return { data: { active: null, error: 'not_found', workstream: name } };
   }

-  setActiveWorkstream(projectDir, name);
+  writeActiveWorkstream(projectDir, name);
   syncRootStateMirror(projectDir, name);
   return { data: { active: name, set: true, mirror_synced: existsSync(join(wsDir, 'STATE.md')) } };
 };

@@ -316,8 +287,8 @@ export const workstreamComplete: QueryHandler = async (args, projectDir) => {
     return { data: { completed: false, error: 'not_found', workstream: name } };
   }

-  const active = getActiveWorkstream(projectDir);
-  if (active === name) setActiveWorkstream(projectDir, null);
+  const active = readActiveWorkstream(projectDir);
+  if (active === name) writeActiveWorkstream(projectDir, null);

   const archiveDir = join(root, 'milestones');
   const today = new Date().toISOString().split('T')[0];

@@ -341,7 +312,7 @@ export const workstreamComplete: QueryHandler = async (args, projectDir) => {
       try { renameSync(join(archivePath, fname), join(wsDir, fname)); } catch { /* rollback */ }
     }
     try { rmdirSync(archivePath); } catch { /* cleanup */ }
-    if (active === name) setActiveWorkstream(projectDir, name);
+    if (active === name) writeActiveWorkstream(projectDir, name);
     return { data: { completed: false, error: 'archive_failed', message: String(err), workstream: name } };
   }

@@ -382,7 +353,7 @@ export const workstreamProgress: QueryHandler = async (_args, projectDir) => {
     };
   }

-  const active = getActiveWorkstream(projectDir);
+  const active = readActiveWorkstream(projectDir);
   const entries = readdirSync(wsRoot, { withFileTypes: true });
   const workstreams: Array<{
     name: string;
@@ -52,6 +52,7 @@ function resolveModel(options?: SessionOptions, config?: GSDConfig): string | un

   if (config?.model_profile) {
     const profile = String(config.model_profile).toLowerCase();
+    if (profile === 'inherit') return undefined;
     const tier = profile === 'quality' ? 'opus'
       : (profile === 'budget' || profile === 'speed') ? 'haiku'
       : (profile === 'balanced' || profile === 'adaptive') ? 'sonnet'
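The profile-to-tier mapping this hunk extends can be read as a small lookup. This sketch is hypothetical: `profileTier` is not the SDK's function name, and the fallback tier for unrecognized profiles is an assumption (the hunk is truncated before the final branch).

```javascript
// 'inherit' short-circuits to undefined so the runtime's own default applies,
// before any tier mapping happens.
function profileTier(profile) {
  const p = String(profile).toLowerCase();
  if (p === 'inherit') return undefined;
  if (p === 'quality') return 'opus';
  if (p === 'budget' || p === 'speed') return 'haiku';
  if (p === 'balanced' || p === 'adaptive') return 'sonnet';
  return 'sonnet'; // assumed fallback for unknown profiles
}
```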
sdk/src/workstream-name-policy.ts (new file, 24 lines)
@@ -0,0 +1,24 @@
+/**
+ * Workstream Name Policy Module
+ *
+ * Owns SDK-side workstream validation and slug normalization.
+ */
+
+/**
+ * Validate a workstream name.
+ * Allowed: alphanumeric, hyphens, underscores, dots.
+ * Disallowed: empty, spaces, slashes, special chars, path traversal.
+ */
+export function validateWorkstreamName(name: string): boolean {
+  if (!name || name.length === 0) return false;
+  if (name.includes('..')) return false;
+  return /^[a-zA-Z0-9][a-zA-Z0-9._-]*$/.test(name);
+}
+
+export function toWorkstreamSlug(name: string): string {
+  return String(name || '')
+    .toLowerCase()
+    .replace(/[^a-z0-9]+/g, '-')
+    .replace(/^-+|-+$/g, '');
+}
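A CommonJS mirror of the two policy functions above, runnable standalone, illustrates the slug invariant this PR records: any `--migrate-name` input normalizes to a name that passes validation (unless it contains no alphanumerics at all, which yields an empty slug).

```javascript
// Mirrors sdk/src/workstream-name-policy.ts from the new file above.
function validateWorkstreamName(name) {
  if (!name || name.length === 0) return false;
  if (name.includes('..')) return false;
  return /^[a-zA-Z0-9][a-zA-Z0-9._-]*$/.test(name);
}

function toWorkstreamSlug(name) {
  return String(name || '')
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')   // collapse non-alphanumeric runs to '-'
    .replace(/^-+|-+$/g, '');      // trim leading/trailing hyphens
}

toWorkstreamSlug('Bad Name'); // -> 'bad-name', which validates
toWorkstreamSlug('!!!');      // -> '' (rejected downstream)
```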
@@ -6,19 +6,7 @@
  */

 import { posix } from 'node:path';

-/**
- * Validate a workstream name.
- * Allowed: alphanumeric, hyphens, underscores, dots.
- * Disallowed: empty, spaces, slashes, special chars, path traversal.
- */
-export function validateWorkstreamName(name: string): boolean {
-  if (!name || name.length === 0) return false;
-  // Only allow alphanumeric, hyphens, underscores, dots
-  // Must not be ".." or start with ".." (path traversal)
-  if (name === '..' || name.startsWith('../')) return false;
-  return /^[a-zA-Z0-9][a-zA-Z0-9._-]*$/.test(name);
-}
-
+export { validateWorkstreamName, toWorkstreamSlug } from './workstream-name-policy.js';

 /**
  * Return the relative planning directory path.
tests/active-workstream-store.test.cjs (new file, 97 lines)
@@ -0,0 +1,97 @@
+const { describe, test } = require('node:test');
+const assert = require('node:assert/strict');
+
+const {
+  validateWorkstreamName,
+  parseCliWorkstream,
+  resolveActiveWorkstream,
+  applyResolvedWorkstreamEnv,
+} = require('../get-shit-done/bin/lib/active-workstream-store.cjs');
+
+describe('active-workstream-store', () => {
+  test('validateWorkstreamName accepts canonical names', () => {
+    assert.equal(validateWorkstreamName('alpha'), true);
+    assert.equal(validateWorkstreamName('alpha_2'), true);
+    assert.equal(validateWorkstreamName('alpha-2'), true);
+  });
+
+  test('validateWorkstreamName rejects invalid names', () => {
+    assert.equal(validateWorkstreamName('alpha beta'), false);
+    assert.equal(validateWorkstreamName('../alpha'), false);
+    assert.equal(validateWorkstreamName('alpha/beta'), false);
+  });
+
+  test('parseCliWorkstream parses --ws=<name>', () => {
+    const parsed = parseCliWorkstream(['state', 'json', '--ws=alpha', '--raw']);
+    assert.equal(parsed.value, 'alpha');
+    assert.equal(parsed.source, 'cli');
+    assert.deepEqual(parsed.args, ['state', 'json', '--raw']);
+  });
+
+  test('parseCliWorkstream parses --ws <name>', () => {
+    const parsed = parseCliWorkstream(['state', 'json', '--ws', 'alpha', '--raw']);
+    assert.equal(parsed.value, 'alpha');
+    assert.equal(parsed.source, 'cli');
+    assert.deepEqual(parsed.args, ['state', 'json', '--raw']);
+  });
+
+  test('parseCliWorkstream throws on missing value', () => {
+    assert.throws(
+      () => parseCliWorkstream(['state', 'json', '--ws']),
+      /Missing value for --ws/
+    );
+  });
+
+  test('resolveActiveWorkstream precedence: cli > env > store', () => {
+    const cli = resolveActiveWorkstream('/repo', ['state', 'json', '--ws', 'cli-ws'], {
+      GSD_WORKSTREAM: 'env-ws',
+    }, {
+      getStored: () => 'store-ws',
+    });
+    assert.equal(cli.ws, 'cli-ws');
+    assert.equal(cli.source, 'cli');
+
+    const env = resolveActiveWorkstream('/repo', ['state', 'json'], {
+      GSD_WORKSTREAM: 'env-ws',
+    }, {
+      getStored: () => 'store-ws',
+    });
+    assert.equal(env.ws, 'env-ws');
+    assert.equal(env.source, 'env');
+
+    const store = resolveActiveWorkstream('/repo', ['state', 'json'], {
+      GSD_WORKSTREAM: '',
+    }, {
+      getStored: () => 'store-ws',
+    });
+    assert.equal(store.ws, 'store-ws');
+    assert.equal(store.source, 'store');
+  });
+
+  test('resolveActiveWorkstream returns none when no source provides a workstream', () => {
+    const resolved = resolveActiveWorkstream('/repo', ['state', 'json'], {
+      GSD_WORKSTREAM: '',
+    }, {
+      getStored: () => null,
+    });
+    assert.equal(resolved.ws, null);
+    assert.equal(resolved.source, 'none');
+  });
+
+  test('resolveActiveWorkstream rejects invalid selected name', () => {
+    assert.throws(
+      () => resolveActiveWorkstream('/repo', ['state', 'json', '--ws', 'bad/name']),
+      /Invalid workstream name/
+    );
+  });
+
+  test('applyResolvedWorkstreamEnv sets env only when ws exists', () => {
+    const env = { GSD_WORKSTREAM: 'old' };
+    applyResolvedWorkstreamEnv({ ws: null }, env);
+    assert.equal(env.GSD_WORKSTREAM, 'old');
+
+    applyResolvedWorkstreamEnv({ ws: 'new-ws' }, env);
+    assert.equal(env.GSD_WORKSTREAM, 'new-ws');
+  });
+});
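The cli > env > store precedence these tests pin down can be sketched standalone. `resolvePrecedence` is an illustrative stand-in, not the real `resolveActiveWorkstream` (which also parses `--ws` from argv and validates names).

```javascript
// First non-empty source wins: CLI flag, then GSD_WORKSTREAM, then the
// stored .planning/active-workstream value; otherwise 'none'.
function resolvePrecedence(cliWs, env, getStored) {
  if (cliWs) return { ws: cliWs, source: 'cli' };
  if (env.GSD_WORKSTREAM) return { ws: env.GSD_WORKSTREAM, source: 'env' };
  const stored = getStored();
  if (stored) return { ws: stored, source: 'store' };
  return { ws: null, source: 'none' };
}
```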
@@ -289,6 +289,17 @@ describe('loadConfig workstream config inheritance (#2714)', () => {
     assert.strictEqual(config.model_profile, 'quality');
     assert.deepStrictEqual(config.model_overrides, { 'gsd-executor': 'opus' });
   });
+
+  test('loadConfig does not mutate GSD_WORKSTREAM when workstream config is missing', () => {
+    writeRootConfig({ model_profile: 'quality' });
+    fs.mkdirSync(path.join(tmpDir, '.planning', 'workstreams', 'feature-f'), { recursive: true });
+    process.env.GSD_WORKSTREAM = 'feature-f';
+
+    const config = loadConfig(tmpDir);
+
+    assert.strictEqual(config.model_profile, 'quality');
+    assert.strictEqual(process.env.GSD_WORKSTREAM, 'feature-f');
+  });
 });

 // ─── loadConfig commit_docs gitignore auto-detection (#1250) ──────────────────

@@ -434,6 +445,16 @@ describe('resolveModelInternal', () => {
     assert.strictEqual(resolveModelInternal(tmpDir, 'gsd-nonexistent'), 'sonnet');
   });
+
+  test('returns opus for unknown agent type with quality profile', () => {
+    writeConfig({ model_profile: 'quality' });
+    assert.strictEqual(resolveModelInternal(tmpDir, 'gsd-nonexistent'), 'opus');
+  });
+
+  test('returns haiku for unknown agent type with budget profile', () => {
+    writeConfig({ model_profile: 'budget' });
+    assert.strictEqual(resolveModelInternal(tmpDir, 'gsd-nonexistent'), 'haiku');
+  });

   test('returns inherit for unknown agent type with inherit profile', () => {
     writeConfig({ model_profile: 'inherit' });
     assert.strictEqual(resolveModelInternal(tmpDir, 'gsd-nonexistent'), 'inherit');
@@ -50,7 +50,9 @@ describe('model catalog runtime defaults parity (#3229)', () => {
   });

   test('Group B runtimes remain documented as having no built-in defaults', () => {
-    const groupB = ['kilo', 'cline', 'cursor', 'windsurf', 'augment', 'trae', 'codebuddy', 'antigravity'];
+    const groupB = Object.keys(catalog.runtimeTierDefaults)
+      .filter(runtime => !catalog.runtimeTierDefaults[runtime].opus);
+    assert.ok(groupB.length > 0, 'expected at least one Group B runtime in catalog');
     for (const runtime of groupB) {
       const tiers = catalog.runtimeTierDefaults[runtime];
       assert.equal(tiers.opus, null);
@@ -1,13 +1,12 @@
 // allow-test-rule: architectural-invariant
-// verify.cjs must contain the W017 warning code and the worktree list invocation.
-// These checks guard the existence of the detection feature, not its text output.
-// Behavioral tests cover the detection flow; structural tests guard the implementation contract.
+// Structural checks verify the health seam exports worktree inspection capability.
+// Behavioral tests cover detection flow via validate health output.

 /**
  * GSD Tools Tests - Orphan/Stale Worktree Detection (W017)
  *
  * Tests for feat/worktree-health-w017-2167:
- * - W017 code exists in verify.cjs (structural)
+ * - Worktree Safety Policy Module exports health inspection interface (structural)
  * - No false positives on projects without linked worktrees
  * - Adding the check does not regress baseline health status
  */

@@ -63,23 +62,20 @@ function setupHealthyProject(tmpDir) {
 }

 // ─────────────────────────────────────────────────────────────────────────────
-// 1. Structural: W017 code exists in verify.cjs
+// 1. Structural: Worktree Safety Policy Module exposes inspection interface
 // ─────────────────────────────────────────────────────────────────────────────

 describe('W017: structural presence', () => {
-  test('verify.cjs contains W017 warning code', () => {
-    const verifyPath = path.join(__dirname, '..', 'get-shit-done', 'bin', 'lib', 'verify.cjs');
-    const source = fs.readFileSync(verifyPath, 'utf-8');
-    assert.ok(source.includes("'W017'"), 'verify.cjs should contain W017 warning code');
+  test('worktree-safety module exports inspectWorktreeHealth', () => {
+    const modulePath = path.join(__dirname, '..', 'get-shit-done', 'bin', 'lib', 'worktree-safety.cjs');
+    const seam = require(modulePath);
+    assert.strictEqual(typeof seam.inspectWorktreeHealth, 'function');
   });

-  test('verify.cjs contains worktree list --porcelain invocation', () => {
-    const verifyPath = path.join(__dirname, '..', 'get-shit-done', 'bin', 'lib', 'verify.cjs');
-    const source = fs.readFileSync(verifyPath, 'utf-8');
-    assert.ok(
-      source.includes('worktree') && source.includes('--porcelain'),
-      'verify.cjs should invoke git worktree list --porcelain'
-    );
+  test('worktree-safety module exports linked worktree listing interface', () => {
+    const modulePath = path.join(__dirname, '..', 'get-shit-done', 'bin', 'lib', 'worktree-safety.cjs');
+    const seam = require(modulePath);
+    assert.strictEqual(typeof seam.listLinkedWorktreePaths, 'function');
   });
 });

@@ -9,6 +9,7 @@ const fs = require('fs');
 const os = require('os');
 const path = require('path');
 const { runGsdTools, createTempProject, cleanup } = require('./helpers.cjs');
+const { migrateToWorkstreams, getOtherActiveWorkstreams } = require('../get-shit-done/bin/lib/workstream.cjs');

 // ─── Helper ──────────────────────────────────────────────────────────────────

@@ -386,6 +387,56 @@ describe('workstream create with migration', () => {
     // Shared files stay
     assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'PROJECT.md')));
   });
+
+  test('normalizes --migrate-name to a valid workstream slug', () => {
+    const isolatedDir = createTempProject();
+    try {
+      fs.writeFileSync(path.join(isolatedDir, '.planning', 'PROJECT.md'), '# Project\n');
+      fs.writeFileSync(path.join(isolatedDir, '.planning', 'ROADMAP.md'), '## Roadmap v1.0: Existing\n### Phase 1: A\n');
+      fs.writeFileSync(path.join(isolatedDir, '.planning', 'STATE.md'), '# State\n**Status:** In progress\n');
+
+      const result = runGsdTools(
+        ['workstream', 'create', 'new-feature', '--migrate-name', 'Bad Name', '--raw'],
+        isolatedDir
+      );
+      assert.ok(result.success, `create with migrate-name normalization failed: ${result.error}`);
+
+      const data = JSON.parse(result.output);
+      assert.strictEqual(data.created, true);
+      assert.strictEqual(data.migration.workstream, 'bad-name');
+      assert.ok(fs.existsSync(path.join(isolatedDir, '.planning', 'workstreams', 'bad-name', 'ROADMAP.md')));
+      assert.ok(!fs.existsSync(path.join(isolatedDir, '.planning', 'workstreams', 'Bad Name')));
+    } finally {
+      cleanup(isolatedDir);
+    }
+  });
 });

+describe('migrateToWorkstreams', () => {
+  test('rejects invalid workstream names for migration', () => {
+    const tmpDir = createTempProject();
+    try {
+      assert.throws(
+        () => migrateToWorkstreams(tmpDir, 'bad/name'),
+        /Invalid workstream name for migration/
+      );
+    } finally {
+      cleanup(tmpDir);
+    }
+  });
+
+  test('fails when already in workstream mode', () => {
+    const tmpDir = createTempProject();
+    try {
+      fs.mkdirSync(path.join(tmpDir, '.planning', 'workstreams', 'existing'), { recursive: true });
+      assert.throws(
+        () => migrateToWorkstreams(tmpDir, 'new-stream'),
+        /Already in workstream mode/
+      );
+    } finally {
+      cleanup(tmpDir);
+    }
+  });
+});
+
 describe('workstream list', () => {

@@ -570,6 +621,22 @@ describe('getOtherActiveWorkstreams', () => {
       !w.status.toLowerCase().includes('milestone complete'));
     assert.strictEqual(activeWs.length, 2); // alpha and beta active
   });
+
+  test('returns only non-complete siblings with phase progress metadata', () => {
+    const alphaPlan = path.join(tmpDir, '.planning', 'workstreams', 'alpha', 'phases', '01-alpha', 'PLAN.md');
+    const betaPlan = path.join(tmpDir, '.planning', 'workstreams', 'beta', 'phases', '01-beta', 'PLAN.md');
+    const betaSummary = path.join(tmpDir, '.planning', 'workstreams', 'beta', 'phases', '01-beta', 'SUMMARY.md');
+    fs.mkdirSync(path.dirname(alphaPlan), { recursive: true });
+    fs.mkdirSync(path.dirname(betaPlan), { recursive: true });
+    fs.writeFileSync(alphaPlan, '# Plan\n');
+    fs.writeFileSync(betaPlan, '# Plan\n');
+    fs.writeFileSync(betaSummary, '# Summary\n');
+
+    const others = getOtherActiveWorkstreams(tmpDir, 'alpha');
+    assert.strictEqual(others.length, 1);
+    assert.strictEqual(others[0].name, 'beta');
+    assert.strictEqual(others[0].phases, '1/1');
+  });
 });

 describe('workstream progress', () => {

@@ -598,6 +665,18 @@ describe('workstream progress', () => {
     assert.strictEqual(data.workstreams[0].active, true);
     assert.strictEqual(data.workstreams[0].progress_percent, 50);
   });
+
+  test('returns flat mode when no workstreams exist', () => {
+    const emptyDir = createTempProject();
+    try {
+      const result = runGsdTools(['workstream', 'progress', '--raw'], emptyDir);
+      assert.ok(result.success, `progress in flat mode failed: ${result.error}`);
+      const data = JSON.parse(result.output);
+      assert.strictEqual(data.mode, 'flat');
+    } finally {
+      cleanup(emptyDir);
+    }
+  });
 });

 // ─── Integration: gsd-tools --ws flag ────────────────────────────────────────
tests/worktree-safety-policy.test.cjs (new file, 272 lines)
@@ -0,0 +1,272 @@
+'use strict';
+
+const { describe, test } = require('node:test');
+const assert = require('node:assert/strict');
+
+const {
+  resolveWorktreeContext,
+  parseWorktreePorcelain,
+  planWorktreePrune,
+  executeWorktreePrunePlan,
+  listLinkedWorktreePaths,
+  inspectWorktreeHealth,
+  snapshotWorktreeInventory,
+} = require('../get-shit-done/bin/lib/worktree-safety.cjs');
+
+describe('worktree-safety policy module', () => {
+  test('resolveWorktreeContext prefers current directory when .planning exists', () => {
+    const context = resolveWorktreeContext('/repo/wt', {
+      existsSync: () => true,
+      execGit: () => ({ exitCode: 1, stdout: '', stderr: '' }),
+    });
+    assert.strictEqual(context.effectiveRoot, '/repo/wt');
+    assert.strictEqual(context.reason, 'has_local_planning');
+    assert.strictEqual(context.mode, 'current_directory');
+  });
+
+  test('resolveWorktreeContext maps linked worktree to common-dir parent', () => {
+    const context = resolveWorktreeContext('/repo/wt', {
+      existsSync: () => false,
+      execGit: (_, args) => {
+        if (args[1] === '--git-dir') return { exitCode: 0, stdout: '.git/worktrees/wt', stderr: '' };
+        if (args[1] === '--git-common-dir') return { exitCode: 0, stdout: '../.git', stderr: '' };
+        return { exitCode: 1, stdout: '', stderr: '' };
+      },
+    });
+    assert.strictEqual(context.effectiveRoot, '/repo');
+    assert.strictEqual(context.reason, 'linked_worktree');
+    assert.strictEqual(context.mode, 'linked_worktree_root');
+  });
+
+  test('resolveWorktreeContext falls back when git metadata is unavailable', () => {
+    const context = resolveWorktreeContext('/repo/wt', {
+      existsSync: () => false,
+      execGit: () => ({ exitCode: 1, stdout: '', stderr: '' }),
+    });
+    assert.strictEqual(context.effectiveRoot, '/repo/wt');
+    assert.strictEqual(context.reason, 'not_git_repo');
+  });
+
+  test('resolveWorktreeContext keeps cwd for main worktree checkout', () => {
+    const context = resolveWorktreeContext('/repo/main', {
+      existsSync: () => false,
+      execGit: (_, args) => {
+        if (args[1] === '--git-dir') return { exitCode: 0, stdout: '.git', stderr: '' };
+        if (args[1] === '--git-common-dir') return { exitCode: 0, stdout: '.git', stderr: '' };
+        return { exitCode: 1, stdout: '', stderr: '' };
+      },
+    });
+    assert.strictEqual(context.effectiveRoot, '/repo/main');
+    assert.strictEqual(context.reason, 'main_worktree');
+    assert.strictEqual(context.mode, 'current_directory');
+  });
+
+  test('parseWorktreePorcelain skips detached HEAD entries', () => {
+    const porcelain = [
+      'worktree /repo/main',
+      'HEAD deadbeef',
+      'branch refs/heads/main',
+      '',
+      'worktree /repo/wt-detached',
+      'HEAD cafe1234',
+      'detached',
+      '',
+      'worktree /repo/wt-feature',
+      'HEAD f00dbabe',
+      'branch refs/heads/feature-x',
+      '',
+    ].join('\n');
+    const parsed = parseWorktreePorcelain(porcelain);
+    assert.deepStrictEqual(parsed, [
+      { path: '/repo/main', branch: 'main' },
+      { path: '/repo/wt-feature', branch: 'feature-x' },
+    ]);
+  });
|
||||
test('planWorktreePrune is non-destructive by default', () => {
|
||||
const plan = planWorktreePrune('/repo/main', {}, {
|
||||
execGit: () => ({ exitCode: 0, stdout: 'worktree /repo/main\nbranch refs/heads/main\n', stderr: '' }),
|
||||
parseWorktreePorcelain: () => [{ path: '/repo/main', branch: 'main' }],
|
||||
});
|
||||
assert.strictEqual(plan.action, 'metadata_prune_only');
|
||||
assert.strictEqual(plan.reason, 'worktrees_present');
|
||||
assert.strictEqual(plan.destructiveModeRequested, false);
|
||||
});
|
||||
|
||||
test('planWorktreePrune keeps metadata-prune action when destructive mode is requested (scaffold)', () => {
|
||||
const plan = planWorktreePrune('/repo/main', { allowDestructive: true }, {
|
||||
execGit: () => ({ exitCode: 0, stdout: '', stderr: '' }),
|
||||
parseWorktreePorcelain: () => [],
|
||||
});
|
||||
assert.strictEqual(plan.action, 'metadata_prune_only');
|
||||
assert.strictEqual(plan.reason, 'no_worktrees');
|
||||
assert.strictEqual(plan.destructiveModeRequested, true);
|
||||
});
|
||||
|
||||
test('planWorktreePrune skips when git worktree list fails', () => {
|
||||
const plan = planWorktreePrune('/repo/main', {}, {
|
||||
execGit: () => ({ exitCode: 2, stdout: '', stderr: 'fatal' }),
|
||||
});
|
||||
assert.strictEqual(plan.action, 'skip');
|
||||
assert.strictEqual(plan.reason, 'git_list_failed');
|
||||
});
|
||||
|
||||
test('planWorktreePrune still metadata-prunes when porcelain parser throws', () => {
|
||||
const plan = planWorktreePrune('/repo/main', {}, {
|
||||
execGit: () => ({ exitCode: 0, stdout: 'not-porcelain', stderr: '' }),
|
||||
parseWorktreePorcelain: () => {
|
||||
throw new Error('parse failed');
|
||||
},
|
||||
});
|
||||
assert.strictEqual(plan.action, 'metadata_prune_only');
|
||||
assert.strictEqual(plan.reason, 'no_worktrees');
|
||||
});
|
||||
|
||||
test('executeWorktreePrunePlan runs git worktree prune for metadata plan', () => {
|
||||
const calls = [];
|
||||
const result = executeWorktreePrunePlan(
|
||||
{ repoRoot: '/repo/main', action: 'metadata_prune_only', reason: 'worktrees_present' },
|
||||
{
|
||||
execGit: (cwd, args) => {
|
||||
calls.push({ cwd, args });
|
||||
return { exitCode: 0, stdout: '', stderr: '' };
|
||||
},
|
||||
}
|
||||
);
|
||||
assert.strictEqual(result.ok, true);
|
||||
assert.deepStrictEqual(calls, [{ cwd: '/repo/main', args: ['worktree', 'prune'] }]);
|
||||
});
|
||||
|
||||
test('executeWorktreePrunePlan returns skip for missing plan', () => {
|
||||
const result = executeWorktreePrunePlan(null, {
|
||||
execGit: () => ({ exitCode: 0, stdout: '', stderr: '' }),
|
||||
});
|
||||
assert.strictEqual(result.ok, false);
|
||||
assert.strictEqual(result.action, 'skip');
|
||||
assert.strictEqual(result.reason, 'missing_plan');
|
||||
});
|
||||
|
||||
test('executeWorktreePrunePlan returns skip plan unchanged without git call', () => {
|
||||
let called = false;
|
||||
const result = executeWorktreePrunePlan(
|
||||
{ repoRoot: '/repo/main', action: 'skip', reason: 'git_list_failed' },
|
||||
{
|
||||
execGit: () => {
|
||||
called = true;
|
||||
return { exitCode: 0, stdout: '', stderr: '' };
|
||||
},
|
||||
}
|
||||
);
|
||||
assert.strictEqual(result.ok, false);
|
||||
assert.strictEqual(result.action, 'skip');
|
||||
assert.strictEqual(result.reason, 'git_list_failed');
|
||||
assert.strictEqual(called, false);
|
||||
});
|
||||
|
||||
test('executeWorktreePrunePlan rejects unsupported actions', () => {
|
||||
const result = executeWorktreePrunePlan(
|
||||
{ repoRoot: '/repo/main', action: 'remove_missing_paths', reason: 'explicit' },
|
||||
{
|
||||
execGit: () => ({ exitCode: 0, stdout: '', stderr: '' }),
|
||||
}
|
||||
);
|
||||
assert.strictEqual(result.ok, false);
|
||||
assert.strictEqual(result.action, 'remove_missing_paths');
|
||||
assert.strictEqual(result.reason, 'unsupported_action');
|
||||
});
|
||||
|
||||
test('listLinkedWorktreePaths parses porcelain and skips first/main path', () => {
|
||||
const listed = listLinkedWorktreePaths('/repo/main', {
|
||||
execGit: () => ({
|
||||
exitCode: 0,
|
||||
stdout: [
|
||||
'worktree /repo/main',
|
||||
'HEAD aaa',
|
||||
'branch refs/heads/main',
|
||||
'',
|
||||
'worktree /repo/wt-a',
|
||||
'HEAD bbb',
|
||||
'branch refs/heads/feat-a',
|
||||
'',
|
||||
'worktree /repo/wt-b',
|
||||
'HEAD ccc',
|
||||
'detached',
|
||||
'',
|
||||
].join('\n'),
|
||||
stderr: '',
|
||||
}),
|
||||
});
|
||||
assert.strictEqual(listed.ok, true);
|
||||
assert.deepStrictEqual(listed.paths, ['/repo/wt-a', '/repo/wt-b']);
|
||||
});
|
||||
|
||||
test('inspectWorktreeHealth reports orphan and stale findings', () => {
|
||||
const health = inspectWorktreeHealth(
|
||||
'/repo/main',
|
||||
{ staleAfterMs: 60 * 60 * 1000, nowMs: 2 * 60 * 60 * 1000 },
|
||||
{
|
||||
execGit: () => ({
|
||||
exitCode: 0,
|
||||
stdout: [
|
||||
'worktree /repo/main',
|
||||
'HEAD aaa',
|
||||
'branch refs/heads/main',
|
||||
'',
|
||||
'worktree /repo/wt-orphan',
|
||||
'HEAD bbb',
|
||||
'branch refs/heads/feat-a',
|
||||
'',
|
||||
'worktree /repo/wt-stale',
|
||||
'HEAD ccc',
|
||||
'branch refs/heads/feat-b',
|
||||
'',
|
||||
].join('\n'),
|
||||
stderr: '',
|
||||
}),
|
||||
existsSync: p => p !== '/repo/wt-orphan',
|
||||
statSync: () => ({ mtimeMs: 0 }),
|
||||
}
|
||||
);
|
||||
|
||||
assert.strictEqual(health.ok, true);
|
||||
assert.deepStrictEqual(health.findings, [
|
||||
{ kind: 'orphan', path: '/repo/wt-orphan' },
|
||||
{ kind: 'stale', path: '/repo/wt-stale', ageMinutes: 120 },
|
||||
]);
|
||||
});
|
||||
|
||||
test('snapshotWorktreeInventory returns typed linked-worktree entries', () => {
|
||||
const inventory = snapshotWorktreeInventory(
|
||||
'/repo/main',
|
||||
{ staleAfterMs: 60 * 60 * 1000, nowMs: 2 * 60 * 60 * 1000 },
|
||||
{
|
||||
execGit: () => ({
|
||||
exitCode: 0,
|
||||
stdout: [
|
||||
'worktree /repo/main',
|
||||
'HEAD aaa',
|
||||
'branch refs/heads/main',
|
||||
'',
|
||||
'worktree /repo/wt-a',
|
||||
'HEAD bbb',
|
||||
'branch refs/heads/feat-a',
|
||||
'',
|
||||
'worktree /repo/wt-b',
|
||||
'HEAD ccc',
|
||||
'branch refs/heads/feat-b',
|
||||
'',
|
||||
].join('\n'),
|
||||
stderr: '',
|
||||
}),
|
||||
existsSync: p => p !== '/repo/wt-b',
|
||||
statSync: () => ({ mtimeMs: 0 }),
|
||||
}
|
||||
);
|
||||
|
||||
assert.strictEqual(inventory.ok, true);
|
||||
assert.deepStrictEqual(inventory.entries, [
|
||||
{ path: '/repo/wt-a', exists: true, isStale: true, ageMinutes: 120 },
|
||||
{ path: '/repo/wt-b', exists: false, isStale: false, ageMinutes: null },
|
||||
]);
|
||||
});
|
||||
});