mirror of https://github.com/glittercowboy/get-shit-done (synced 2026-05-06 07:12:21 +02:00)

Compare commits: release/1. ... fix/3029-g (76 commits)
Commit SHA1s: f2decefede, a4e5cc7c24, f55069ecbf, de25400b70, ca78b65de7, 1a51ec5829, 4277f7d7e8, cde793f1f0, ffeeb92c14, 4e378d37d8, 9f09246f3b, c2ada7e799, 55ae8e42d2, 3657c4ea9e, 918f987a19, 17a4321bf5, 9d5db87249, cb98a88139, fb92d1e596, 7424271aa0, 7a416b10d4, ef43f5161f, e9a66da1e7, b8d9bd69b2, 0d25ef0c47, a346779213, 0d6abb87ac, c5dfdbe42e, 9d0d085a17, 53cda93a01, ec07861228, 3ba17e872e, 4d628b306a, b328f3269f, e2792536d9, 7cc6358f91, 8de8acee46, 2cc8796265, faee0287a0, 7e9477bb30, 5abf46ac1c, 372d3453f5, c9d6306981, 1168e9f59a, 3ed8980519, c3aef27aa6, ace61869d0, 80f14cac1f, 2256e4c9a3, e5cd523e7b, b5777572f7, 861a7d972b, bd0511988b, 4a5f36df5e, 840f2b349e, 140d334dab, 6e4fad7acc, 4e2f1105d9, 4ce72cdee7, 198022f58d, ac100ae17b, 002db4dd2b, 0e0f6952c5, bdead2ee6a, e107bb35d4, 294564b951, 9a13d2fc0b, d29822c1da, b126c0579a, 006cdafe8f, 8051bc4fd8, 444db1714b, 6dce1de4a7, abb2cb63f6, 8cbdbdd2de, 951d5bf7c0
44  .changeset/README.md  Normal file
@@ -0,0 +1,44 @@
# Changeset Fragments

This directory holds **per-PR CHANGELOG fragments**. Every PR with user-facing changes drops one (or more) `<random-name>.md` files here describing its CHANGELOG entry. Fragments are consolidated into the top-level `CHANGELOG.md` at release time.

## Why

Two PRs that both edit the `### Fixed` block of `CHANGELOG.md` always conflict on merge — git can't pick a serialization order without human input. Two PRs that each add a fresh `.changeset/<unique-name>.md` never conflict because they don't share lines.

See [#2975](https://github.com/gsd-build/get-shit-done/issues/2975) for the full rationale.

## Adding a fragment

```bash
node scripts/changeset/new.cjs \
  --type Fixed \
  --pr 1234 \
  --body "fix the thing — explain the user-visible change in one sentence"
```

This writes `.changeset/<adjective>-<noun>-<noun>.md` with frontmatter and a body. Three random words → concurrent PRs don't collide.
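The generator itself is not part of this diff. As a hedged sketch of the mechanism just described (the word lists, argument shape, and file layout below are illustrative assumptions, not the real `scripts/changeset/new.cjs`):

```js
// Hypothetical sketch of a fragment writer in the spirit of
// scripts/changeset/new.cjs. Word lists and the options object are
// assumptions for illustration only.
const fs = require('fs');
const path = require('path');

const ADJECTIVES = ['calm', 'eager', 'jolly', 'plucky', 'witty', 'zesty'];
const NOUNS = ['birds', 'foxes', 'hawks', 'newts', 'otters', 'rivers'];

function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

function writeFragment({ type, pr, body, dir = '.changeset' }) {
  // Three random words keep concurrent PRs from colliding on a filename.
  const name = `${pick(ADJECTIVES)}-${pick(NOUNS)}-${pick(NOUNS)}.md`;
  const file = path.join(dir, name);
  const content = `---\ntype: ${type}\npr: ${pr}\n---\n\n${body}\n`;
  fs.writeFileSync(file, content, 'utf8');
  return file;
}

// Example: writeFragment({ type: 'Fixed', pr: 1234, body: 'fix the thing' });
```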
## Format

```md
---
type: Fixed
pr: 1234
---

**`/gsd-foo` no longer drops trailing slashes** — explain the user-visible change.
```

Allowed `type:` values follow [Keep a Changelog](https://keepachangelog.com/): `Added`, `Changed`, `Deprecated`, `Removed`, `Fixed`, `Security`.

## Opting out

PRs that legitimately have no user-facing impact can add the `no-changelog` label. CI honors it. When unsure, add the fragment.

## At release time

```bash
node scripts/changeset/cli.cjs render --version vX.Y.Z --date YYYY-MM-DD
```

Reads every fragment, groups bullets by `type:`, replaces `## [Unreleased]` with a new `## [vX.Y.Z] - YYYY-MM-DD` block, opens a fresh `## [Unreleased]` above, deletes consumed fragments. Idempotent.
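A hedged sketch of the grouping half of that render step (fragment parsing and type ordering only; the real `scripts/changeset/cli.cjs` also splices `CHANGELOG.md` and deletes the consumed fragments, which this sketch omits):

```js
// Hypothetical sketch of the consolidation step: read each fragment's
// frontmatter, then emit one "### <type>" block per Keep a Changelog
// category, in canonical order. Not the shipped implementation.
const fs = require('fs');
const path = require('path');

const ORDER = ['Added', 'Changed', 'Deprecated', 'Removed', 'Fixed', 'Security'];

function readFragments(dir = '.changeset') {
  return fs.readdirSync(dir)
    .filter((f) => f.endsWith('.md') && f !== 'README.md')
    .map((f) => {
      const raw = fs.readFileSync(path.join(dir, f), 'utf8');
      const m = raw.match(/^---\n([\s\S]*?)\n---\n([\s\S]*)$/);
      if (!m) return null; // tolerate malformed files in the sketch
      const meta = Object.fromEntries(
        m[1].split('\n').map((line) => line.split(':').map((s) => s.trim()))
      );
      return { type: meta.type, pr: Number(meta.pr), body: m[2].trim() };
    })
    .filter(Boolean);
}

function renderSection(version, date, fragments) {
  const lines = [`## [${version}] - ${date}`];
  for (const type of ORDER) {
    const bullets = fragments.filter((f) => f.type === type);
    if (bullets.length === 0) continue;
    lines.push('', `### ${type}`, '');
    for (const b of bullets) lines.push(`- ${b.body} (#${b.pr})`);
  }
  return lines.join('\n');
}
```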
5  .changeset/calm-birds-greet.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2990
---
gsd-code-fixer worktree no longer fails on the same-branch checkout — the agent now creates a new gsd-reviewfix/ branch via git worktree add -b and fast-forwards the user's branch on cleanup. See #2990.

5  .changeset/calm-ibex-jump.md  Normal file
@@ -0,0 +1,5 @@
---
type: Changed
pr: 2986
---
Test suite for config-schema.cjs is now mutation-resistant — 95 typed assertions kill the 124 surviving Stryker mutants from the 4.62% baseline. Tests target static-key fast path, dynamic-pattern .some semantics, polarity, and regex-anchor tightening. See #2986.

5  .changeset/calm-tigers-frolic.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 3008
---
**`tests/install-minimal.test.cjs:307` no longer races on shared `os.tmpdir()` under parallel CI** — the previous shape compared `listTmpStageDirs()` snapshots before and after the throw. Under `scripts/run-tests.cjs --test-concurrency=4`, `tests/install-minimal-all-runtimes.test.cjs` runs in a parallel process and creates/removes `gsd-minimal-skills-*` dirs in the shared OS tmpdir between snapshots, so `deepStrictEqual` failed deterministically when the parallel process happened to have a live stage dir during the snapshot window. Fix: stub `fs.mkdtempSync` to record THIS call's stage dir, then assert that exact path no longer exists after the throw — no global filesystem snapshot, no race. (#3008)

5  .changeset/curious-bears-march.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 3012
---
**Post-install message and update.md no longer recommend the removed `/gsd-reapply-patches` command** — after PR #2824 consolidated 86 skills into ~58, `/gsd-reapply-patches` was folded into a flag (`/gsd-update --reapply`). The 1.39.1 hotfix (#2954) updated `help.md` but missed `bin/install.js`'s `reportLocalPatches` runtime emitter, `get-shit-done/workflows/update.md` Step 4, and the English + zh-CN/ja-JP/ko-KR doc set. Users hit "Unknown command" after every install with backed-up patches. All seven runtime branches in `reportLocalPatches` (claude, opencode, kilo, copilot, gemini, codex, cursor) now emit the consolidated form. Regression: `tests/bug-3010-reapply-patches-references.test.cjs` scans `bin/install.js`, every workflow file, and every doc (excluding CHANGELOG history and help.md's deprecation notice) for stale recommendations. See #3010.

5  .changeset/eager-hawks-rally.md  Normal file
@@ -0,0 +1,5 @@
---
type: Added
pr: 2975
---
**Changeset-fragment workflow** — eliminates CHANGELOG.md merge conflicts. Each PR drops `.changeset/<random-name>.md` with frontmatter (`type:`, `pr:`) plus a markdown body; the release-time `npm run changelog:render` consolidates fragments into `CHANGELOG.md` and deletes them. CI lint (`npm run lint:changeset`) requires a fragment on any PR touching user-facing files (`bin/`, `get-shit-done/`, `agents/`, `commands/`, `hooks/`, `sdk/src/`); contributors can opt out via the `no-changelog` label for purely internal changes. See [.changeset/README.md](.changeset/README.md) and CONTRIBUTING.md for the workflow.

5  .changeset/happy-jays-greet.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2994
---
/gsd-reapply-patches Step 5 verifier now resolves at runtime — moved scripts/verify-reapply-patches.cjs to get-shit-done/bin/, which is shipped by the installer. The legacy scripts/ directory is not copied to user installs. See #2994.

5  .changeset/jolly-newts-roam.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2994
---
/gsd-reapply-patches Step 5 verifier now resolves at runtime — moved scripts/verify-reapply-patches.cjs to get-shit-done/bin/, which is shipped by the installer. The legacy scripts/ directory is not copied to user installs. See #2994.

5  .changeset/jolly-pumas-dance.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2979
---
Managed JS hooks now resolve under GUI/minimal-PATH runtimes — installer emits process.execPath (absolute, quoted, forward-slash-normalized) as the runner for every .js hook command instead of bare node. See #2979.

5  .changeset/lively-goats-run.md  Normal file
@@ -0,0 +1,5 @@
---
type: Added
pr: 2995
---
Post-install path smoke test for workflow-invoked scripts — audits that every node ${GSD_HOME}/...cjs invocation in workflows resolves at the runtime-installed path. See #2995.

5  .changeset/lively-otters-gather.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 3011
---
**Actionable diagnostic when `gsd-sdk` is not on PATH after install** — Windows users (and others on multi-shell setups) reported that the previous "GSD SDK files are present but `gsd-sdk` is not on your PATH" warning gave them no way to fix it: no path to look at, no shell-specific commands, no mention of the npx-cache caveat. New `formatSdkPathDiagnostic({ shimDir, platform, runDir })` helper returns a typed IR with the resolved shim location, platform-specific PATH-export commands (PowerShell / cmd.exe / Git Bash on Windows; `export PATH` on POSIX), and an npx-specific note when running under an `_npx` cache segment (where the shim may be written to a temp dir that won't persist). The console renderer in `bin/install.js` emits the lines from the IR; tests assert on the typed fields directly. (#3011)

5  .changeset/merry-foxes-climb.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2997
---
SDK config-set/config-get and init responses no longer echo plaintext API keys. New sdk/src/query/secrets.ts ports SECRET_CONFIG_KEYS masking from CJS; init bundles only mask string values to preserve the boolean availability-flag contract. See #2997.

5  .changeset/merry-lynx-sing.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2992
---
/gsd-update no longer queries the wrong npm package name — the package name moved into a deterministic check-latest-version.cjs script, and the workflow now uses ${GSD_DIR} from get_installed_version. See #2992.

5  .changeset/merry-lynx-wander.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 3007
---
**PR templates now point at the changeset workflow** — the `Fix`, `Enhancement`, and `Feature` PR templates previously asked contributors to tick `CHANGELOG.md updated`, which contradicted the post-#2978 rule that `CHANGELOG.md` must not be edited directly. Each checkbox now references `npm run changeset` (and the `no-changelog` opt-out where applicable).

5  .changeset/plucky-ibex-gather.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2998
---
gsd-pristine/ is now populated by the installer when local patches are detected — saveLocalPatches calls a new populatePristineDir helper that runs the install transform pipeline into a tmp staging dir and copies modified files into pristineDir. The reapply-patches Step 5 verifier no longer falls back to its over-broad heuristic. See #2998.

5  .changeset/plucky-moles-roam.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2997
---
SDK config-set/config-get and init responses no longer echo plaintext API keys. New sdk/src/query/secrets.ts ports SECRET_CONFIG_KEYS masking from CJS; init bundles only mask string values to preserve the boolean availability-flag contract. See #2997.

5  .changeset/plucky-otters-roam.md  Normal file
@@ -0,0 +1,5 @@
---
type: Added
pr: 2995
---
Post-install path smoke test for workflow-invoked scripts — audits that every node ${GSD_HOME}/...cjs invocation in workflows resolves at the runtime-installed path. See #2995.

5  .changeset/silly-foxes-wander.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2990
---
gsd-code-fixer worktree no longer fails on the same-branch checkout — the agent now creates a new gsd-reviewfix/ branch via git worktree add -b and fast-forwards the user's branch on cleanup. See #2990.

5  .changeset/silly-newts-swim.md  Normal file
@@ -0,0 +1,5 @@
---
type: Added
pr: 2982
---
Extended the no-source-grep lint to catch the var-binding readFileSync(...).includes(...) pattern. Tests now fail when source-grep is hidden behind a parser wrapper. See #2982.

5  .changeset/typed-rivers-flow.md  Normal file
@@ -0,0 +1,5 @@
---
type: Changed
pr: 2974
---
Migrated 8 test files from raw text matching (`stdout.includes(...)`, `assert.match(stderr, ...)`) to typed-IR assertions per CONTRIBUTING.md. Adds shared `ERROR_REASON` enum and `--json-errors` flag in `core.cjs`, typed `GRAPHIFY_REASON` in `graphify.cjs`, pure `buildSdkFailFastReport()` IR builder in `bin/install.js`, and Claude Code JSON envelope output (`hookSpecificOutput` with typed fields) for `gsd-session-state.sh` and `gsd-phase-boundary.sh`. Tests now assert on structured fields (`reason`, `context`, `state_present`, `planning_modified`, etc.) instead of substring matching. See #2974.

5  .changeset/witty-hawks-jump.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2973
---
/gsd-profile-user --refresh writes dev-preferences.md to ~/.claude/skills/gsd-dev-preferences/SKILL.md instead of the legacy commands/gsd/ directory. Installer migrates any preserved legacy file to the new location. See #2973.

5  .changeset/witty-newts-greet.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2992
---
/gsd-update no longer queries the wrong npm package name — the package name moved into a deterministic check-latest-version.cjs script, and the workflow now uses ${GSD_DIR} from get_installed_version. See #2992.

5  .changeset/zesty-jays-wake.md  Normal file
@@ -0,0 +1,5 @@
---
type: Fixed
pr: 2979
---
Managed JS hooks now resolve under GUI/minimal-PATH runtimes — installer emits process.execPath (absolute, quoted, forward-slash-normalized) as the runner for every .js hook command instead of bare node. See #2979.

5  .changeset/zesty-moles-forage.md  Normal file
@@ -0,0 +1,5 @@
---
type: Added
pr: 2982
---
Extended the no-source-grep lint to catch the var-binding readFileSync(...).includes(...) pattern. Tests now fail when source-grep is hidden behind a parser wrapper. See #2982.
26  .coderabbit.yaml  Normal file
@@ -0,0 +1,26 @@
# CodeRabbit configuration — gsd-build/get-shit-done
#
# Schema: https://docs.coderabbit.ai/reference/yaml-template/
#
# Project context: GSD ships a CLI tool + an agent runtime, not a documented
# public library. We carry rich JSDoc on internal helpers that warrant it
# (see bin/install.js, get-shit-done/bin/lib/*.cjs) but we do not enforce a
# blanket docstring coverage bar — see issue #2932 for rationale.

reviews:
  pre_merge_checks:
    # Disable docstring coverage check.
    #
    # The check produces false-positive warnings on PRs whose new code is
    # entirely test files: it counts test(...) / beforeEach / afterEach
    # arrow-function callbacks as functions and then reports 0% coverage
    # because nothing has JSDoc. There is no per-check path filter in CR's
    # documented schema that would let us exclude tests/** while keeping
    # the check active elsewhere, and the top-level path_filters approach
    # would silence ALL CR review on tests (security scans, out-of-scope
    # checks, line-level findings) which we want to keep.
    #
    # All other CR pre-merge checks (out-of-scope, security, title) remain
    # at their defaults.
    docstrings:
      mode: off
6  .githooks/pre-commit  Executable file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash
set -euo pipefail

if git diff --cached --name-only | grep -Eq "^sdk/src/query/command-manifest\.|^sdk/src/query/command-aliases\.generated\.ts$|^get-shit-done/bin/lib/command-aliases\.generated\.cjs$|^sdk/scripts/gen-command-aliases\.ts$"; then
  npm run check:alias-drift
fi
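Nothing in this diff shows how these hooks are activated; presumably the repository points git at them with `git config core.hooksPath .githooks`, the standard mechanism for versioned hooks, but that wiring is an assumption, not something this compare view confirms.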
48  .githooks/pre-push  Executable file
@@ -0,0 +1,48 @@
#!/usr/bin/env bash
set -euo pipefail

zero_sha='0000000000000000000000000000000000000000'
blocked_regex="${GSD_BLOCKED_AUTHOR_REGEX:-}"

# Local-only guard: no-op unless the developer opts in via env var, e.g.
# export GSD_BLOCKED_AUTHOR_REGEX='@example-corp\.com$'
if [[ -z "$blocked_regex" ]]; then
  exit 0
fi

violations=()

while read -r local_ref local_sha remote_ref remote_sha; do
  # branch/tag deletion
  if [[ "$local_sha" == "$zero_sha" ]]; then
    continue
  fi

  if [[ "$remote_sha" == "$zero_sha" ]]; then
    # New remote ref: inspect commits not already on any remote
    commit_list=$(git rev-list "$local_sha" --not --remotes)
  else
    commit_list=$(git rev-list "$remote_sha..$local_sha")
  fi

  while read -r commit; do
    [[ -z "$commit" ]] && continue
    author_email=$(git show -s --format='%ae' "$commit")
    lower_email=$(printf '%s' "$author_email" | tr '[:upper:]' '[:lower:]')
    if printf '%s' "$lower_email" | grep -Eq "$blocked_regex"; then
      violations+=("$commit <$author_email>")
    fi
  done <<< "$commit_list"
done

if [[ ${#violations[@]} -gt 0 ]]; then
  {
    echo "Push blocked: commit author email matched local blocked regex ($blocked_regex)."
    echo "Rewrite author info before pushing these commits:"
    for v in "${violations[@]}"; do
      echo "  - $v"
    done
    echo "Suggested fix: git rebase -i <base> --exec \"git commit --amend --no-edit --author='Your Name <non-enterprise@email>'\""
  } >&2
  exit 1
fi
2  .github/PULL_REQUEST_TEMPLATE/enhancement.md  vendored
@@ -73,7 +73,7 @@ Closes #
 - [ ] Changes are scoped to the approved enhancement — nothing extra included
 - [ ] All existing tests pass (`npm test`)
 - [ ] New or updated tests cover the enhanced behavior
-- [ ] CHANGELOG.md updated
+- [ ] `.changeset/` fragment added (`npm run changeset -- --type Changed --pr <NNN> --body "..."`) — or `no-changelog` label applied if not user-facing
 - [ ] Documentation updated if behavior or output changed
 - [ ] No unnecessary dependencies added
 
2  .github/PULL_REQUEST_TEMPLATE/feature.md  vendored
@@ -94,7 +94,7 @@ Closes #
 - [ ] Implementation scope matches the approved spec exactly
 - [ ] All existing tests pass (`npm test`)
 - [ ] New tests cover the happy path, error cases, and edge cases
-- [ ] CHANGELOG.md updated with a user-facing description of the feature
+- [ ] `.changeset/` fragment added with a user-facing description of the feature (`npm run changeset -- --type Added --pr <NNN> --body "..."`)
 - [ ] Documentation updated — commands, workflows, references, README if applicable
 - [ ] No unnecessary external dependencies added
 - [ ] Works on Windows (backslash paths handled)
2  .github/PULL_REQUEST_TEMPLATE/fix.md  vendored
@@ -63,7 +63,7 @@ Fixes #
 - [ ] Fix is scoped to the reported bug — no unrelated changes included
 - [ ] Regression test added (or explained why not)
 - [ ] All existing tests pass (`npm test`)
-- [ ] CHANGELOG.md updated if this is a user-facing fix
+- [ ] `.changeset/` fragment added if this is a user-facing fix (`npm run changeset -- --type Fixed --pr <NNN> --body "..."`) — or `no-changelog` label applied
 - [ ] No unnecessary dependencies added
 
 ## Breaking changes
24  .github/workflows/changeset-required.yml  vendored  Normal file
@@ -0,0 +1,24 @@
name: Changeset Required

on:
  pull_request:
    types: [opened, synchronize, reopened, labeled, unlabeled]

permissions:
  contents: read
  pull-requests: read

jobs:
  changeset-lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: actions/setup-node@v4
        with:
          node-version: '24'
      - name: Run changeset lint
        env:
          GITHUB_BASE_REF: ${{ github.base_ref }}
        run: node scripts/changeset/lint.cjs
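The lint script is referenced but not included in this diff. A hedged sketch of the check it plausibly performs, using the user-facing path list stated in the `.changeset/eager-hawks-rally.md` fragment above (the real `scripts/changeset/lint.cjs` may differ, and the `no-changelog` label handling is omitted here):

```js
// Hypothetical sketch of the changeset lint: fail when a PR touches
// user-facing paths but adds no .changeset fragment. Not the shipped
// implementation; the real script also honors the no-changelog label.
const { execSync } = require('child_process');

const USER_FACING = ['bin/', 'get-shit-done/', 'agents/', 'commands/', 'hooks/', 'sdk/src/'];

const base = process.env.GITHUB_BASE_REF || 'main';
const changed = execSync(`git diff --name-only origin/${base}...HEAD`, { encoding: 'utf8' })
  .split('\n')
  .filter(Boolean);

const touchesUserFacing = changed.some((f) => USER_FACING.some((p) => f.startsWith(p)));
const hasFragment = changed.some(
  (f) => f.startsWith('.changeset/') && f.endsWith('.md') && !f.endsWith('README.md')
);

if (touchesUserFacing && !hasFragment) {
  console.error('PR touches user-facing paths but adds no .changeset/*.md fragment.');
  process.exit(1);
}
```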
366  .github/workflows/hotfix.yml  vendored
@@ -1,5 +1,27 @@
 name: Hotfix Release
 
+# Hotfix flow for X.YY.Z patch releases (Z > 0).
+#
+# create:
+#   - Branches hotfix/X.YY.Z from the highest existing vX.YY.* tag (1.27.2 from
+#     v1.27.1, 1.27.1 from v1.27.0). The base IS the cumulative-fix anchor for
+#     the previous patch.
+#   - Auto-cherry-picks every fix:/chore: commit on origin/main that isn't
+#     already in the base, oldest-first. Patch-equivalents (already applied)
+#     are skipped via `git cherry`. feat:/refactor: are NEVER auto-included.
+#   - Conflicts fail the workflow with the offending SHA so the operator can
+#     resolve manually on the branch and re-run finalize with auto_cherry_pick=false.
+#   - Step summary lists every included SHA so the eventual vX.YY.Z tag
+#     self-documents what shipped.
+#
+# finalize:
+#   - install-smoke gate (cross-platform, parity with release.yml/release-sdk.yml)
+#   - Bundles SDK as both loose tree (sdk/dist/cli.js) and recoverable tarball
+#     (sdk-bundle/gsd-sdk.tgz) — parity with release-sdk.yml so a hotfix shipped
+#     during the @gsd-build-token outage carries the same payload shape.
+#   - Publishes to @latest, tags vX.YY.Z, re-points @next → vX.YY.Z, opens
+#     merge-back PR.
+
 on:
   workflow_dispatch:
     inputs:
@@ -14,6 +36,11 @@ on:
         description: 'Patch version (e.g., 1.27.1)'
         required: true
         type: string
+      auto_cherry_pick:
+        description: 'Auto-cherry-pick fix:/chore: commits from origin/main since base tag (create only)'
+        required: false
+        type: boolean
+        default: true
       dry_run:
         description: 'Dry run (skip npm publish, tagging, and push)'
         required: false
@@ -54,10 +81,13 @@ jobs:
           MAJOR_MINOR=$(echo "$VERSION" | cut -d. -f1-2)
           TARGET_TAG="v${VERSION}"
           BRANCH="hotfix/${VERSION}"
-          BASE_TAG=$(git tag -l "v${MAJOR_MINOR}.*" \
-            | grep -E "^v[0-9]+\.[0-9]+\.[0-9]+$" \
+          # Append TARGET_TAG to the candidate list, then sort -V, then walk the
+          # sorted list and print whatever immediately precedes TARGET_TAG. This
+          # is semver-correct for multi-digit patches (v1.27.10 > v1.27.9) where
+          # a plain `awk '$1 < target'` lexicographic compare would mis-order.
+          BASE_TAG=$( ( git tag -l "v${MAJOR_MINOR}.*" | grep -E "^v[0-9]+\.[0-9]+\.[0-9]+$"; echo "$TARGET_TAG" ) \
            | sort -V \
-           | awk -v target="$TARGET_TAG" '$1 < target { last=$1 } END { if (last != "") print last }')
+           | awk -v target="$TARGET_TAG" '$1 == target { print prev; exit } { prev = $1 }')
          if [ -z "$BASE_TAG" ]; then
            echo "::error::No prior stable tag found for ${MAJOR_MINOR}.x before $TARGET_TAG"
            exit 1
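The selection logic that comment describes can be restated outside the shell pipeline. A hedged sketch in Node (illustrative only; the workflow itself uses the `sort -V`/awk form above):

```js
// Illustrative restatement of the base-tag selection: sort all vX.Y.Z
// tags by numeric components (semver-correct, unlike a lexicographic
// compare) and take the entry immediately preceding the target tag.
function baseTagFor(target, tags) {
  const key = (t) => t.slice(1).split('.').map(Number); // 'v1.27.10' -> [1, 27, 10]
  const cmp = (a, b) => {
    const [x, y] = [key(a), key(b)];
    for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return x[i] - y[i];
    return 0;
  };
  const sorted = [...new Set([...tags, target])].sort(cmp);
  const i = sorted.indexOf(target);
  return i > 0 ? sorted[i - 1] : null;
}

// baseTagFor('v1.27.10', ['v1.27.0', 'v1.27.1', 'v1.27.9']) === 'v1.27.9'
// A plain string sort would have mis-ordered v1.27.10 before v1.27.9.
```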
@@ -95,29 +125,160 @@ jobs:
           git config user.name "github-actions[bot]"
           git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
 
-      - name: Create hotfix branch
-        if: inputs.dry_run != 'true'
+      - name: Create hotfix branch from base tag and push (skeleton)
+        env:
+          BRANCH: ${{ needs.validate-version.outputs.branch }}
+          BASE_TAG: ${{ needs.validate-version.outputs.base_tag }}
+          DRY_RUN: ${{ inputs.dry_run }}
+        run: |
+          set -euo pipefail
+          git checkout -b "$BRANCH" "$BASE_TAG"
+          # Push the skeleton branch up-front so any subsequent cherry-pick
+          # conflict leaves a remote artefact the operator can fetch, resolve,
+          # and re-push. Skipped on dry-run — local checkout still exercises
+          # the same cherry-pick + bump flow so conflicts are caught.
+          if [ "$DRY_RUN" != "true" ]; then
+            git push -u origin "$BRANCH"
+          fi
+
+      - name: Cherry-pick fix/chore commits from origin/main since base tag
+        if: ${{ inputs.auto_cherry_pick }}
+        env:
+          BRANCH: ${{ needs.validate-version.outputs.branch }}
+          BASE_TAG: ${{ needs.validate-version.outputs.base_tag }}
+          DRY_RUN: ${{ inputs.dry_run }}
+        run: |
+          set -euo pipefail
+          git fetch origin main:refs/remotes/origin/main
+
+          # `git cherry $BASE_TAG origin/main` lists every commit on main not
+          # patch-equivalent in BASE_TAG. + means needs picking, - means
+          # already applied (skipped silently).
+          CANDIDATES=$(git cherry "$BASE_TAG" origin/main | awk '/^\+ / {print $2}')
+
+          if [ -z "$CANDIDATES" ]; then
+            echo "No commits on origin/main beyond $BASE_TAG."
+            echo "## Cherry-pick summary" >> "$GITHUB_STEP_SUMMARY"
+            echo "" >> "$GITHUB_STEP_SUMMARY"
+            echo "Base: \`$BASE_TAG\` — no commits to consider." >> "$GITHUB_STEP_SUMMARY"
+            exit 0
+          fi
+
+          # Re-order chronologically (oldest first) for predictable application.
+          ORDERED=$(git log --reverse --format='%H' "$BASE_TAG..origin/main" \
+            | grep -F -f <(echo "$CANDIDATES") || true)
+
+          INCLUDED=""
+          SKIPPED=""
+          while IFS= read -r SHA; do
+            [ -z "$SHA" ] && continue
+            SUBJECT=$(git log -1 --format='%s' "$SHA")
+            # fix: or chore:, optional scope, optional ! breaking marker
+            if echo "$SUBJECT" | grep -qE '^(fix|chore)(\([^)]+\))?!?: '; then
+              echo "→ cherry-picking $SHA $SUBJECT"
+              if ! git cherry-pick -x "$SHA"; then
+                # Abort restores HEAD to the last successful pick. On real
+                # runs, push that state so the operator can fetch, resolve
+                # $SHA manually, and finalize with auto_cherry_pick=false.
+                git cherry-pick --abort || true
+                if [ "$DRY_RUN" != "true" ]; then
+                  git push --force-with-lease origin "$BRANCH" || git push origin "$BRANCH" || true
+                fi
+                {
+                  echo "## Cherry-pick conflict"
+                  echo ""
+                  echo "Failed at: \`${SHA}\` — \`${SUBJECT}\`"
+                  echo ""
+                  if [ "$DRY_RUN" = "true" ]; then
+                    echo "**Dry run:** branch was not pushed, so the picks below were discarded with the runner."
+                    if [ -n "$INCLUDED" ]; then
+                      echo ""
+                      echo "Already-applied picks (lost — must be re-applied before resolving \`${SHA}\`):"
+                      echo ""
+                      echo "$INCLUDED"
+                    fi
+                    echo ""
+                    echo "**To resolve:** re-run \`create\` with \`auto_cherry_pick=true\` (real, not dry-run) to materialize the partial branch on origin, then resolve \`${SHA}\` manually. Re-running with \`auto_cherry_pick=false\` would recreate the branch from \`${BASE_TAG}\` and lose every pick listed above."
+                  else
+                    echo "Branch \`${BRANCH}\` was pushed with picks applied up to (but not including) the conflicting commit."
+                    echo ""
+                    echo "**To resolve:** \`git fetch origin && git checkout ${BRANCH} && git cherry-pick -x ${SHA}\`, fix the conflict, push, then re-run \`finalize\` with \`auto_cherry_pick=false\`."
+                  fi
+                } >> "$GITHUB_STEP_SUMMARY"
+                echo "::error::Cherry-pick of $SHA failed. See summary."
+                exit 1
+              fi
+              INCLUDED="${INCLUDED}- \`${SHA}\` ${SUBJECT}"$'\n'
+            else
+              echo "  skip $SHA $SUBJECT (not fix/chore)"
+              SKIPPED="${SKIPPED}- \`${SHA}\` ${SUBJECT}"$'\n'
+            fi
+          done <<< "$ORDERED"
+
+          {
+            echo "## Cherry-pick summary"
+            echo ""
+            echo "Base: \`$BASE_TAG\`"
+            echo ""
+            if [ -n "$INCLUDED" ]; then
+              echo "### Included (fix/chore)"
+              echo ""
+              echo "$INCLUDED"
+            else
+              echo "_No fix/chore commits to include._"
+              echo ""
+            fi
+            if [ -n "$SKIPPED" ]; then
+              echo "### Skipped (feat/refactor/etc — not auto-included)"
+              echo ""
+              echo "$SKIPPED"
+            fi
+          } >> "$GITHUB_STEP_SUMMARY"
+
+      - name: Bump version and push
         env:
           BRANCH: ${{ needs.validate-version.outputs.branch }}
           BASE_TAG: ${{ needs.validate-version.outputs.base_tag }}
           VERSION: ${{ inputs.version }}
+          DRY_RUN: ${{ inputs.dry_run }}
         run: |
-          git checkout -b "$BRANCH" "$BASE_TAG"
-          # Bump version in package.json
+          set -euo pipefail
           npm version "$VERSION" --no-git-tag-version
           git add package.json package-lock.json
+          # Keep sdk/package.json in lockstep (parity with release-sdk.yml).
+          if [ -f sdk/package.json ]; then
+            (cd sdk && npm version "$VERSION" --no-git-tag-version)
+            git add sdk/package.json
+            [ -f sdk/package-lock.json ] && git add sdk/package-lock.json
+          fi
           git commit -m "chore: bump version to $VERSION for hotfix"
-          git push origin "$BRANCH"
-          echo "## Hotfix branch created" >> "$GITHUB_STEP_SUMMARY"
-          echo "- Branch: \`$BRANCH\`" >> "$GITHUB_STEP_SUMMARY"
-          echo "- Based on: \`$BASE_TAG\`" >> "$GITHUB_STEP_SUMMARY"
-          echo "- Apply your fix, push, then run this workflow again with \`finalize\`" >> "$GITHUB_STEP_SUMMARY"
+          if [ "$DRY_RUN" != "true" ]; then
+            git push origin "$BRANCH"
+          else
+            echo "DRY RUN — branch not pushed. Local checkout exercised the cherry-pick and bump flow."
+          fi
+          {
+            echo "## Hotfix branch created"
+            echo ""
+            echo "- Branch: \`$BRANCH\`"
+            echo "- Based on: \`$BASE_TAG\`"
+            echo "- Apply additional manual fixes if needed, then run \`finalize\`."
+          } >> "$GITHUB_STEP_SUMMARY"
 
-  finalize:
+  install-smoke:
     needs: validate-version
     if: inputs.action == 'finalize'
+    permissions:
+      contents: read
+    uses: ./.github/workflows/install-smoke.yml
+    with:
+      ref: ${{ needs.validate-version.outputs.branch }}
+
+  finalize:
+    needs: [validate-version, install-smoke]
+    if: inputs.action == 'finalize'
     runs-on: ubuntu-latest
-    timeout-minutes: 10
+    timeout-minutes: 15
     permissions:
       contents: write
       pull-requests: write
@@ -140,31 +301,83 @@ jobs:
           git config user.name "github-actions[bot]"
           git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
 
+      - name: Detect prior publish (reconciliation mode)
+        id: prior_publish
+        env:
+          VERSION: ${{ inputs.version }}
+        run: |
+          EXISTING=$(npm view get-shit-done-cc@"$VERSION" version 2>/dev/null || true)
+          if [ -n "$EXISTING" ]; then
+            echo "::warning::get-shit-done-cc@${VERSION} is already on the registry — entering reconciliation mode (skip publish, continue with tag/release/PR/dist-tag)."
+            echo "skip_publish=true" >> "$GITHUB_OUTPUT"
+          else
+            echo "skip_publish=false" >> "$GITHUB_OUTPUT"
+          fi
+
       - name: Install and test
         run: |
           npm ci
           npm run test:coverage
 
-      - name: Create PR to merge hotfix back to main
-        if: ${{ !inputs.dry_run }}
+      - name: Build SDK dist for tarball
+        run: npm run build:sdk
+
+      - name: Verify CC tarball ships sdk/dist/cli.js (bug #2647 guard)
+        run: bash scripts/verify-tarball-sdk-dist.sh
+
+      - name: Pack SDK as tarball and bundle into CC source tree
         env:
-          GH_TOKEN: ${{ github.token }}
-          BRANCH: ${{ needs.validate-version.outputs.branch }}
           VERSION: ${{ inputs.version }}
         run: |
-          EXISTING_PR=$(gh pr list --base main --head "$BRANCH" --state open --json number --jq '.[0].number')
-          if [ -n "$EXISTING_PR" ]; then
-            echo "PR #$EXISTING_PR already exists; updating"
-            gh pr edit "$EXISTING_PR" \
-              --title "chore: merge hotfix v${VERSION} back to main" \
-              --body "Merge hotfix changes back to main after v${VERSION} release."
-          else
-            gh pr create \
-              --base main \
-              --head "$BRANCH" \
-              --title "chore: merge hotfix v${VERSION} back to main" \
-              --body "Merge hotfix changes back to main after v${VERSION} release."
+          set -e
+          cd sdk
+          npm pack
+          TARBALL="gsd-build-sdk-${VERSION}.tgz"
+          if [ ! -f "$TARBALL" ]; then
+            echo "::error::Expected $TARBALL but npm pack did not produce it."
+            ls -la
+            exit 1
           fi
+          mkdir -p ../sdk-bundle
+          mv "$TARBALL" ../sdk-bundle/gsd-sdk.tgz
+          cd ..
+          ls -la sdk-bundle/
+
+      - name: Add sdk-bundle to CC files whitelist (in-tree, not committed)
+        run: |
+          node <<'NODE'
+          const fs = require('fs');
+          const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
+          if (!Array.isArray(pkg.files)) {
+            console.error('::error::package.json files is not an array');
+            process.exit(1);
+          }
+          if (!pkg.files.includes('sdk-bundle')) {
+            pkg.files.push('sdk-bundle');
+            fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2) + '\n');
+            console.log('Added sdk-bundle/ to package.json files whitelist');
+          }
+          NODE
+
+      - name: Verify CC tarball will contain sdk-bundle/gsd-sdk.tgz
+        run: |
+          set -e
+          TARBALL=$(npm pack --ignore-scripts 2>/dev/null | tail -1)
+          if [ -z "$TARBALL" ] || [ ! -f "$TARBALL" ]; then
+            echo "::error::npm pack produced no tarball"
+            exit 1
+          fi
+          if ! tar -tzf "$TARBALL" | grep -q "package/sdk-bundle/gsd-sdk.tgz"; then
+            echo "::error::CC tarball is missing package/sdk-bundle/gsd-sdk.tgz"
+            exit 1
+          fi
+          echo "✅ CC tarball contains sdk-bundle/gsd-sdk.tgz"
+          rm -f "$TARBALL"
+
+      - name: Dry-run publish validation
+        env:
+          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+        run: npm publish --dry-run --tag latest
 
       - name: Tag and push
         if: ${{ !inputs.dry_run }}
@@ -185,55 +398,98 @@ jobs:
           fi
 
       - name: Publish to npm (latest)
-        if: ${{ !inputs.dry_run }}
-        run: npm publish --provenance --access public
+        if: ${{ !inputs.dry_run && steps.prior_publish.outputs.skip_publish != 'true' }}
+        env:
+          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+        run: npm publish --provenance --access public --tag latest
 
-      - name: Create GitHub Release
+      - name: Re-point next dist-tag at this hotfix
         if: ${{ !inputs.dry_run }}
         env:
           VERSION: ${{ inputs.version }}
+          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+        run: |
+          npm dist-tag add "get-shit-done-cc@${VERSION}" next
+          echo "✅ next dist-tag re-pointed to v${VERSION} (matches latest)"
+
+      - name: Create GitHub Release (idempotent)
+        if: ${{ !inputs.dry_run }}
+        env:
           GH_TOKEN: ${{ github.token }}
+          VERSION: ${{ inputs.version }}
         run: |
-          gh release create "v${VERSION}" \
-            --title "v${VERSION} (hotfix)" \
-            --generate-notes
+          if gh release view "v${VERSION}" >/dev/null 2>&1; then
+            echo "GitHub Release v${VERSION} already exists; ensuring --latest flag is set"
+            gh release edit "v${VERSION}" --latest || true
+          else
+            gh release create "v${VERSION}" \
+              --title "v${VERSION} (hotfix)" \
+              --generate-notes \
+              --latest
+          fi
 
-      - name: Clean up next dist-tag
+      - name: Create PR to merge hotfix back to main
         if: ${{ !inputs.dry_run }}
         env:
+          GH_TOKEN: ${{ github.token }}
+          BRANCH: ${{ needs.validate-version.outputs.branch }}
           VERSION: ${{ inputs.version }}
-          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
         run: |
-          # Point next to the stable release so @next never returns something
-          # older than @latest. This prevents stale pre-release installs.
-          npm dist-tag add "get-shit-done-cc@${VERSION}" next 2>/dev/null || true
-          echo "✓ next dist-tag updated to v${VERSION}"
+          EXISTING_PR=$(gh pr list --base main --head "$BRANCH" --state open --json number --jq '.[0].number')
+          if [ -n "$EXISTING_PR" ]; then
+            gh pr edit "$EXISTING_PR" \
+              --title "chore: merge hotfix v${VERSION} back to main" \
+              --body "Merge hotfix changes back to main after v${VERSION} release."
+          else
+            gh pr create \
+              --base main \
+              --head "$BRANCH" \
+              --title "chore: merge hotfix v${VERSION} back to main" \
+              --body "Merge hotfix changes back to main after v${VERSION} release."
+          fi
 
-      - name: Verify publish
+      - name: Verify publish landed on registry
         if: ${{ !inputs.dry_run }}
        env:
           VERSION: ${{ inputs.version }}
         run: |
-          sleep 10
-          PUBLISHED=$(npm view get-shit-done-cc@"$VERSION" version 2>/dev/null || echo "NOT_FOUND")
+          PUBLISHED="NOT_FOUND"
+          for delay in 5 10 20 30 45; do
+            PUBLISHED=$(npm view get-shit-done-cc@"$VERSION" version 2>/dev/null || echo "NOT_FOUND")
+            if [ "$PUBLISHED" = "$VERSION" ]; then
+              break
+            fi
+            echo "Waiting ${delay}s for registry to catch up (saw: $PUBLISHED)..."
+            sleep "$delay"
+          done
           if [ "$PUBLISHED" != "$VERSION" ]; then
-            echo "::error::Published version verification failed. Expected $VERSION, got $PUBLISHED"
+            echo "::error::Version $VERSION did not appear on the registry within timeout"
             exit 1
           fi
-          echo "✓ Verified: get-shit-done-cc@$VERSION is live on npm"
+          LATEST_VER=$(npm view get-shit-done-cc dist-tags.latest 2>/dev/null || echo "NOT_FOUND")
+          if [ "$LATEST_VER" != "$VERSION" ]; then
+            echo "::error::dist-tag 'latest' resolves to '$LATEST_VER', expected '$VERSION'"
+            exit 1
+          fi
+          echo "✓ Verified: get-shit-done-cc@$VERSION is live on @latest"
 
       - name: Summary
         env:
           VERSION: ${{ inputs.version }}
+          BASE_TAG: ${{ needs.validate-version.outputs.base_tag }}
           DRY_RUN: ${{ inputs.dry_run }}
         run: |
-          echo "## Hotfix v${VERSION}" >> "$GITHUB_STEP_SUMMARY"
-          if [ "$DRY_RUN" = "true" ]; then
-            echo "**DRY RUN** — npm publish, tagging, and push skipped" >> "$GITHUB_STEP_SUMMARY"
-          else
-            echo "- Published to npm as \`latest\`" >> "$GITHUB_STEP_SUMMARY"
-            echo "- Tagged \`v${VERSION}\`" >> "$GITHUB_STEP_SUMMARY"
-            echo "- PR created to merge back to main" >> "$GITHUB_STEP_SUMMARY"
-          fi
+          {
+            echo "## Hotfix v${VERSION}"
+            echo ""
+            echo "- Base (cumulative-fix anchor): \`${BASE_TAG}\`"
+            if [ "$DRY_RUN" = "true" ]; then
+              echo "- **DRY RUN** — npm publish, tagging, and push skipped"
+            else
+              echo "- Published to npm as \`latest\`"
+              echo "- \`next\` dist-tag re-pointed to v${VERSION}"
+              echo "- Tagged \`v${VERSION}\` (anchor for the next hotfix's cherry-pick base)"
+              echo "- SDK bundled at \`sdk-bundle/gsd-sdk.tgz\` inside CC tarball"
+              echo "- Merge-back PR opened against main"
+            fi
+          } >> "$GITHUB_STEP_SUMMARY"
790
.github/workflows/release-sdk.yml
vendored
Normal file
790
.github/workflows/release-sdk.yml
vendored
Normal file
@@ -0,0 +1,790 @@
|
||||
# Release SDK Bundle
|
||||
#
|
||||
# Stopgap workflow_dispatch publish path: builds get-shit-done-cc with the
|
||||
# compiled SDK and the SDK .tgz bundled inside the CC tarball, then
|
||||
# publishes the CC package to ONE chosen dist-tag (dev | next | latest)
|
||||
# per run.
|
||||
#
|
||||
# Why this exists: @gsd-build/sdk publishes from canary.yml and release.yml
|
||||
# fail because the @gsd-build npm token is currently unavailable. CC users
|
||||
# do not consume @gsd-build/sdk directly — bin/gsd-sdk.js resolves
|
||||
# sdk/dist/cli.js from inside the installed CC package, so the bundled
|
||||
# copy is sufficient for full functionality. This workflow ships CC alone
|
||||
# (no separate @gsd-build/sdk publish attempt) and additionally bakes a
|
||||
# bundled gsd-sdk-<version>.tgz at sdk-bundle/gsd-sdk.tgz inside the CC
|
||||
# tarball as a recoverable npm-installable artifact.
|
||||
#
|
||||
# Existing canary.yml and release.yml are intentionally untouched. They
|
||||
# remain the canonical two-package publish path; restore them to primary
|
||||
# use once @gsd-build/sdk ownership is recovered.
|
||||
#
|
||||
# Tracking issues: #2925 (initial workflow), #2929 (CI-gate parity with release.yml)
|
||||
|
||||
name: Release SDK Bundle
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
inputs:
|
||||
action:
|
||||
description: 'publish = normal dev/next/latest publish; hotfix = create hotfix/X.YY.Z branch from latest vX.YY.* tag, cherry-pick fix:/chore: from main, publish to @latest'
|
||||
required: true
|
||||
type: choice
|
||||
default: publish
|
||||
options:
|
||||
- publish
|
||||
- hotfix
|
||||
tag:
|
||||
description: 'npm dist-tag (publish action only; hotfix forces latest)'
|
||||
required: false
|
||||
type: choice
|
||||
default: latest
|
||||
options:
|
||||
- dev
|
||||
- next
|
||||
- latest
|
||||
version:
|
||||
description: 'Version. publish: explicit (e.g. 1.50.0-dev.3) or empty to derive. hotfix: REQUIRED patch (e.g. 1.27.1, Z>0).'
|
||||
required: false
|
||||
type: string
|
||||
ref:
|
||||
description: 'Branch or ref to build from. Ignored for hotfix (workflow uses hotfix/X.YY.Z).'
|
||||
required: false
|
||||
type: string
|
||||
auto_cherry_pick:
|
||||
description: 'Hotfix only: auto-cherry-pick fix:/chore: commits from origin/main since base tag.'
|
||||
required: false
|
||||
type: boolean
|
||||
default: true
|
||||
dry_run:
|
||||
description: 'Dry run (skip npm publish, git tag, and push). Hotfix branch creation/push also skipped.'
|
||||
required: false
|
||||
type: boolean
|
||||
default: false
|
||||
|
||||
# Per stream (dist-tag for publish, version for hotfix) — no concurrent publishes for the same stream.
|
||||
concurrency:
|
||||
group: release-sdk-${{ inputs.action == 'hotfix' && format('hotfix-{0}', inputs.version) || inputs.tag }}
|
||||
cancel-in-progress: false
|
||||
|
||||
env:
|
||||
NODE_VERSION: 24
|
||||
|
||||
jobs:
|
||||
# Resolves the effective git ref for this run.
|
||||
#
|
||||
# action=publish → outputs inputs.ref verbatim (may be empty = workflow ref)
|
||||
# action=hotfix → branches hotfix/X.YY.Z from highest existing vX.YY.* tag,
|
||||
# auto-cherry-picks fix:/chore: from origin/main, pushes,
|
||||
# and outputs the new branch as ref. Idempotent: if branch
|
||||
# already exists (operator pre-prepared it via hotfix.yml),
|
||||
# we just check it out and re-run the cherry-pick step
|
||||
# no-ops since `git cherry` will report nothing new.
|
||||
prepare:
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 10
|
||||
permissions:
|
||||
contents: write
|
||||
outputs:
|
||||
ref: ${{ steps.out.outputs.ref }}
|
||||
base_tag: ${{ steps.hotfix.outputs.base_tag }}
|
||||
steps:
|
||||
- name: Validate hotfix inputs
|
||||
if: inputs.action == 'hotfix'
|
||||
env:
|
||||
VERSION: ${{ inputs.version }}
|
||||
run: |
|
||||
if [ -z "$VERSION" ]; then
|
||||
echo "::error::action=hotfix requires the 'version' input (e.g. 1.27.1)"
|
||||
exit 1
|
||||
fi
|
||||
if ! echo "$VERSION" | grep -qE '^[0-9]+\.[0-9]+\.[1-9][0-9]*$'; then
|
||||
echo "::error::Hotfix version must match X.YY.Z with Z>0 (got: $VERSION)"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
if: inputs.action == 'hotfix'
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Configure git identity
|
||||
if: inputs.action == 'hotfix'
|
||||
run: |
|
||||
git config user.name "github-actions[bot]"
|
||||
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
|
||||
|
||||
- name: Prepare hotfix branch
|
||||
id: hotfix
|
||||
if: inputs.action == 'hotfix'
|
||||
env:
|
||||
VERSION: ${{ inputs.version }}
|
||||
AUTO_CHERRY_PICK: ${{ inputs.auto_cherry_pick }}
|
||||
DRY_RUN: ${{ inputs.dry_run }}
|
||||
run: |
|
||||
set -euo pipefail
|
||||
# Stash the shipped-paths classifier from the dispatched ref's
|
||||
# working tree BEFORE `git checkout -b ... "$BASE_TAG"` below
|
||||
# overwrites it. Base tags predating #2980 don't have the
|
||||
# classifier in their tree, so the loop must reference a
|
||||
# location that survives the working-tree swap. Bug #2983.
|
||||
CLASSIFIER_SRC="scripts/diff-touches-shipped-paths.cjs"
|
||||
if [ ! -f "$CLASSIFIER_SRC" ]; then
|
||||
echo "::error::shipped-paths classifier not found at $CLASSIFIER_SRC in dispatched ref — refusing to run"
|
||||
exit 1
|
||||
fi
|
||||
CLASSIFIER="${RUNNER_TEMP}/diff-touches-shipped-paths.cjs"
|
||||
cp "$CLASSIFIER_SRC" "$CLASSIFIER"
|
||||
if [ ! -f "$CLASSIFIER" ]; then
|
||||
echo "::error::failed to stage classifier at $CLASSIFIER"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
MAJOR_MINOR=$(echo "$VERSION" | cut -d. -f1-2)
|
||||
TARGET_TAG="v${VERSION}"
|
||||
BRANCH="hotfix/${VERSION}"
|
||||
# Semver-correct selection: append TARGET_TAG, sort -V, take preceding entry.
|
||||
# Plain lexicographic compare mis-orders multi-digit patches (v1.27.10 vs v1.27.9).
|
||||
BASE_TAG=$( ( git tag -l "v${MAJOR_MINOR}.*" | grep -E "^v[0-9]+\.[0-9]+\.[0-9]+$"; echo "$TARGET_TAG" ) \
|
||||
| sort -V \
|
||||
| awk -v target="$TARGET_TAG" '$1 == target { print prev; exit } { prev = $1 }')
|
||||
if [ -z "$BASE_TAG" ]; then
|
||||
echo "::error::No prior stable tag found for ${MAJOR_MINOR}.x before $TARGET_TAG"
|
||||
exit 1
|
||||
fi
|
||||
echo "base_tag=$BASE_TAG" >> "$GITHUB_OUTPUT"
|
||||
echo "branch=$BRANCH" >> "$GITHUB_OUTPUT"
|
||||
|
||||
# Idempotent branch creation — operator may have pre-prepared via hotfix.yml.
|
||||
git fetch origin main:refs/remotes/origin/main
|
||||
if git ls-remote --exit-code origin "refs/heads/$BRANCH" >/dev/null 2>&1; then
|
||||
echo "Branch $BRANCH already exists on origin; checking out"
|
||||
git fetch origin "$BRANCH"
|
||||
git checkout "$BRANCH"
|
||||
BRANCH_PRE_EXISTED=1
|
||||
else
|
||||
git checkout -b "$BRANCH" "$BASE_TAG"
|
||||
BRANCH_PRE_EXISTED=0
|
||||
# Push the skeleton up-front (real runs only) so cherry-pick conflicts
|
||||
# leave a remote artefact the operator can resolve. Dry-run keeps
|
||||
# everything local — no orphan branch created on origin.
|
||||
if [ "$DRY_RUN" != "true" ]; then
|
||||
git push -u origin "$BRANCH"
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ "$AUTO_CHERRY_PICK" = "true" ]; then
|
||||
CANDIDATES=$(git cherry HEAD origin/main | awk '/^\+ / {print $2}')
|
||||
if [ -n "$CANDIDATES" ]; then
|
||||
ORDERED=$(git log --reverse --format='%H' "${BASE_TAG}..origin/main" \
|
||||
| grep -F -f <(echo "$CANDIDATES") || true)
|
||||
INCLUDED=""
|
||||
# POLICY_SKIPPED — commits intentionally not picked because they
|
||||
# don't match the fix/chore filter (feat/refactor/docs/etc).
|
||||
# CONFLICT_SKIPPED — fix/chore commits whose cherry-pick failed
|
||||
# and were skipped per the full-automation policy (#2968).
|
||||
# NON_SHIPPED_SKIPPED — fix/chore commits whose diff doesn't
|
||||
# touch any path in the npm tarball's `files` whitelist
|
||||
# (CI / test / docs / planning-only changes). They can't
|
||||
# affect the published package's behavior, so picking them
|
||||
# into a hotfix is meaningless — and picking workflow-file
|
||||
# changes specifically would also fail the push step because
|
||||
# the default GITHUB_TOKEN lacks the `workflow` scope. The
|
||||
# shipped-paths filter is the precise root cause: bug #2980.
|
||||
# Operators reviewing the run summary need these distinct so
|
||||
# the manual-review queue (CONFLICT_SKIPPED) isn't buried in
|
||||
# the noise from the other two buckets.
|
||||
POLICY_SKIPPED=""
|
||||
CONFLICT_SKIPPED=""
|
||||
NON_SHIPPED_SKIPPED=""
|
||||
while IFS= read -r SHA; do
|
||||
[ -z "$SHA" ] && continue
|
||||
SUBJECT=$(git log -1 --format='%s' "$SHA")
|
||||
if echo "$SUBJECT" | grep -qE '^(fix|chore)(\([^)]+\))?!?: '; then
|
||||
# Merge commits with fix:/chore: titles can't be cherry-picked
|
||||
# without `-m <parent>` and we can't pick the parent
|
||||
# automatically. They fail BEFORE entering cherry-pick state
|
||||
# (no CHERRY_PICK_HEAD), so an unconditional `--skip` would
|
||||
# then fail and brick the loop. Skip them upfront with a
|
||||
# distinct reason. Bug #2968 / CodeRabbit on PR #2970.
|
||||
PARENT_COUNT=$(git rev-list --parents -n 1 "$SHA" | awk '{print NF - 1}')
|
||||
if [ "$PARENT_COUNT" -gt 1 ]; then
|
||||
REASON="merge commit — manual -m parent selection required"
|
||||
echo "↷ skipping $SHA — $REASON"
|
||||
CONFLICT_SKIPPED="${CONFLICT_SKIPPED}- \`${SHA}\` ${SUBJECT} ($REASON)"$'\n'
|
||||
continue
|
||||
fi
|
||||
# Pre-pick guard: a hotfix release can only be affected
|
||||
# by commits whose diff intersects the npm tarball's
|
||||
# shipped paths (package.json `files` whitelist plus
|
||||
# package.json itself, which `npm pack` always
|
||||
# includes). Commits that touch only CI workflows,
|
||||
# tests, docs, or planning artifacts cannot change what
|
||||
# ships, so picking them into a hotfix is meaningless.
|
||||
# As a side benefit, this excludes
|
||||
# `.github/workflows/*` changes whose push would
|
||||
# otherwise be rejected by GitHub because the default
|
||||
# GITHUB_TOKEN lacks the `workflow` scope. The filter
|
||||
# is implemented in
|
||||
# scripts/diff-touches-shipped-paths.cjs rather than
|
||||
# inline so the rules (read package.json `files`,
|
||||
# treat entries as file-OR-directory prefix, the
|
||||
# `package.json`-always-shipped rule) are
|
||||
# unit-testable. Bug #2980.
|
||||
#
|
||||
# Use $CLASSIFIER (staged at workflow-start, before
|
||||
# `git checkout -b ... "$BASE_TAG"` swapped the working
|
||||
# tree) rather than `scripts/...` directly — base tags
|
||||
# older than #2980 don't have the classifier in their
|
||||
# tree. Capture the exit code via PIPESTATUS and
|
||||
# dispatch on it: 0 = shipped, 1 = not shipped, 2+ =
|
||||
# classifier error → fail-fast (don't silently treat
|
||||
# tooling errors as informational skips). Bug #2983.
|
||||
#
|
||||
# PIPESTATUS capture must happen IMMEDIATELY after the
|
||||
# pipeline — the previous form (`pipeline || true; RC=
|
||||
# ${PIPESTATUS[1]}`) had a subtle bug: when the
|
||||
# pipeline fails (exit 1 or 2 — exactly the cases we
|
||||
# care about), `|| true` runs `true` as a one-command
|
||||
# pipeline, overwriting PIPESTATUS to (0). The fix is
|
||||
# to wrap the pipeline in `set +e`/`set -e` and snapshot
|
||||
# PIPESTATUS into a local array on the very next line.
|
||||
# CodeRabbit on PR #2984.
|
||||
set +e
|
||||
git diff-tree --no-commit-id --name-only -r "$SHA" \
|
||||
| node "$CLASSIFIER"
|
||||
PIPE_RC=("${PIPESTATUS[@]}")
|
||||
set -e
|
||||
DIFFTREE_RC="${PIPE_RC[0]}"
|
||||
CLASSIFIER_RC="${PIPE_RC[1]}"
|
||||
if [ "$DIFFTREE_RC" -ne 0 ]; then
|
||||
echo "::error::git diff-tree failed for $SHA (exit $DIFFTREE_RC) — refusing to classify on incomplete input."
|
||||
exit "$DIFFTREE_RC"
|
||||
fi
|
||||
case "$CLASSIFIER_RC" in
|
||||
0) ;;
|
||||
1)
|
||||
REASON="touches no shipped paths (CI / test / docs / planning only)"
|
||||
echo "↷ skipping $SHA — $REASON"
|
||||
NON_SHIPPED_SKIPPED="${NON_SHIPPED_SKIPPED}- \`${SHA}\` ${SUBJECT}"$'\n'
|
||||
continue
|
||||
;;
|
||||
*)
|
||||
echo "::error::shipped-paths classifier failed for $SHA (exit $CLASSIFIER_RC). Refusing to silently skip — bug #2983."
|
||||
exit "$CLASSIFIER_RC"
|
||||
;;
|
||||
esac
|
||||
echo "→ cherry-picking $SHA $SUBJECT"
|
||||
# Pin merge.conflictStyle=merge on the cherry-pick so the
|
||||
# awk classifier below sees deterministic marker shapes —
|
||||
# diff3/zdiff3 would inject `||||||| ancestor` lines into
|
||||
# the HEAD section and cause context-missing conflicts to
|
||||
# misclassify as real. Bug #2966.
|
||||
if ! git -c merge.conflictStyle=merge cherry-pick -x --allow-empty --keep-redundant-commits "$SHA"; then
|
||||
# Full automation policy (bug #2968): any conflict the
|
||||
# cherry-pick can't auto-resolve is skipped, not aborted.
|
||||
# The hotfix run completes with whatever applies cleanly;
|
||||
# the CONFLICT_SKIPPED list below becomes the operator's
|
||||
# review queue (see "Cherry-pick summary" in the run
|
||||
# summary).
|
||||
#
|
||||
# Classify the conflict for the skip reason (operator-
|
||||
# facing diagnostic — doesn't change control flow):
|
||||
# - context absent at base: HEAD section in every
|
||||
# conflict marker is empty (the picked commit modifies
|
||||
# code that doesn't exist at the base). Bug #2966.
|
||||
# - merge conflict: HEAD section has content (both base
|
||||
# and patch want different content for the same
|
||||
# region). Typical when the base tag was cut from a
|
||||
# branch that has diverged from main. Bug #2968.
|
||||
UNMERGED=$(git diff --name-only --diff-filter=U)
|
||||
REASON="merge conflict — manual review"
|
||||
if [ -n "$UNMERGED" ]; then
|
||||
ALL_EMPTY_HEAD=true
|
||||
while IFS= read -r CONFLICTED; do
|
||||
[ -z "$CONFLICTED" ] && continue
|
||||
# Guard the classifier against degenerate cases that
|
||||
# would otherwise skew toward "context absent" (the
|
||||
# auto-skip path) when they're actually unsafe to skip:
|
||||
# - file missing or unreadable: don't pretend the
|
||||
# conflict is benign; treat as real.
|
||||
# - file listed as unmerged but no conflict markers
|
||||
# present: anomalous git state; treat as real so
|
||||
# the pick goes to the manual-review queue.
|
||||
# CodeRabbit on PR #2970.
|
||||
if [ ! -r "$CONFLICTED" ] || ! grep -q '^<<<<<<< ' "$CONFLICTED" 2>/dev/null; then
|
||||
ALL_EMPTY_HEAD=false
|
||||
break
|
||||
fi
|
||||
REAL=$(awk '
|
||||
/^<<<<<<< / { in_head=1; head=""; next }
|
||||
/^=======$/ && in_head { in_head=0; next }
|
||||
/^>>>>>>> / {
|
||||
if (head ~ /[^[:space:]]/) { print "real"; exit }
|
||||
head=""
|
||||
next
|
||||
}
|
||||
in_head { head = head $0 "\n" }
|
||||
' "$CONFLICTED" 2>/dev/null || echo "real")
|
||||
if [ "$REAL" = "real" ]; then
|
||||
ALL_EMPTY_HEAD=false
|
||||
break
|
||||
fi
|
||||
done <<< "$UNMERGED"
|
||||
if [ "$ALL_EMPTY_HEAD" = "true" ]; then
|
||||
REASON="context absent at base"
|
||||
fi
|
||||
fi
|
||||
|
||||
echo "↷ skipping $SHA — $REASON"
|
||||
# Guard `--skip`: cherry-pick can fail before entering the
|
||||
# conflict state (e.g. unreadable commit, empty-without-
|
||||
# --allow-empty edge cases the flag misses). Calling
|
||||
# `--skip` outside an in-progress cherry-pick exits non-
|
||||
# zero and would brick the loop. CodeRabbit on PR #2970.
|
||||
if git rev-parse -q --verify CHERRY_PICK_HEAD >/dev/null 2>&1; then
|
||||
git cherry-pick --skip
|
||||
fi
|
||||
CONFLICT_SKIPPED="${CONFLICT_SKIPPED}- \`${SHA}\` ${SUBJECT} ($REASON)"$'\n'
|
||||
continue
|
||||
fi
|
||||
INCLUDED="${INCLUDED}- \`${SHA}\` ${SUBJECT}"$'\n'
|
||||
else
|
||||
POLICY_SKIPPED="${POLICY_SKIPPED}- \`${SHA}\` ${SUBJECT}"$'\n'
|
||||
fi
|
||||
done <<< "$ORDERED"
              {
                echo "## Cherry-pick summary"
                echo ""
                echo "Base: \`$BASE_TAG\` → Branch: \`$BRANCH\`$([ "$DRY_RUN" = "true" ] && echo " (DRY RUN — local only)")"
                echo ""
                if [ -n "$INCLUDED" ]; then
                  echo "### Included (fix/chore)"
                  echo ""
                  echo "$INCLUDED"
                else
                  echo "_No fix/chore commits to include._"
                fi
                if [ -n "$NON_SHIPPED_SKIPPED" ]; then
                  echo "### Skipped — touches no shipped paths (informational)"
                  echo ""
                  echo "These fix/chore commits don't touch any path in the npm tarball's \`files\` whitelist (or \`package.json\`), so they cannot change the published package's behavior. CI / test / docs / planning-only changes belong on \`main\`, not in a hotfix. No action needed."
                  echo ""
                  echo "$NON_SHIPPED_SKIPPED"
                fi
                if [ -n "$CONFLICT_SKIPPED" ]; then
                  echo "### Skipped — cherry-pick conflict (manual review)"
                  echo ""
                  echo "$CONFLICT_SKIPPED"
                fi
                if [ -n "$POLICY_SKIPPED" ]; then
                  echo "### Not auto-included (feat/refactor/docs/etc)"
                  echo ""
                  echo "$POLICY_SKIPPED"
                fi
              } >> "$GITHUB_STEP_SUMMARY"
            fi
          fi

          # Bump version on the branch (committed) so downstream install-smoke +
          # release jobs build the correct version. The release job's own in-tree
          # bump becomes a no-op when the file already has the right version.
          CURRENT=$(node -p "require('./package.json').version")
          if [ "$CURRENT" != "$VERSION" ]; then
            npm version "$VERSION" --no-git-tag-version
            git add package.json package-lock.json
            if [ -f sdk/package.json ]; then
              (cd sdk && npm version "$VERSION" --no-git-tag-version)
              git add sdk/package.json
              [ -f sdk/package-lock.json ] && git add sdk/package-lock.json
            fi
            git commit -m "chore: bump version to $VERSION for hotfix"
          fi
          if [ "$DRY_RUN" != "true" ]; then
            git push origin "$BRANCH"
          else
            echo "DRY RUN — cherry-picks applied locally; branch not pushed. Downstream install-smoke will run against \`$BASE_TAG\` (the cherry-pick verification above is the dry-run signal)."
          fi

      - name: Determine effective ref
        id: out
        env:
          ACTION: ${{ inputs.action }}
          INPUT_REF: ${{ inputs.ref }}
          DRY_RUN: ${{ inputs.dry_run }}
          BASE_TAG: ${{ steps.hotfix.outputs.base_tag }}
          BRANCH: ${{ steps.hotfix.outputs.branch }}
        run: |
          if [ "$ACTION" = "hotfix" ]; then
            if [ "$DRY_RUN" = "true" ]; then
              echo "ref=$BASE_TAG" >> "$GITHUB_OUTPUT"
            else
              echo "ref=$BRANCH" >> "$GITHUB_OUTPUT"
            fi
          else
            echo "ref=$INPUT_REF" >> "$GITHUB_OUTPUT"
          fi

  # Cross-platform install validation gate (parity with release.yml).
  install-smoke:
    needs: prepare
    permissions:
      contents: read
    uses: ./.github/workflows/install-smoke.yml
    with:
      ref: ${{ needs.prepare.outputs.ref }}

  release:
    needs: [prepare, install-smoke]
    runs-on: ubuntu-latest
    timeout-minutes: 15
    permissions:
      contents: write  # tag + push + GitHub Release
      id-token: write  # provenance
    # The merge-back PR step (and the pull-request scope it required)
    # was removed in #2983 — auto-cherry-pick hotfix flow only picks
    # commits already on main, so there's nothing to merge back.
    environment: npm-publish
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          fetch-depth: 0
          ref: ${{ needs.prepare.outputs.ref }}

      - uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
        with:
          node-version: ${{ env.NODE_VERSION }}
          registry-url: 'https://registry.npmjs.org'
          cache: 'npm'

      - name: Determine version
        id: ver
        env:
          ACTION: ${{ inputs.action }}
          INPUT_TAG: ${{ inputs.tag }}
          INPUT_OVERRIDE: ${{ inputs.version }}
        run: |
          set -e
          # Hotfix forces version=inputs.version and dist-tag=latest.
          if [ "$ACTION" = "hotfix" ]; then
            if [ -z "$INPUT_OVERRIDE" ]; then
              echo "::error::action=hotfix requires the 'version' input"
              exit 1
            fi
            VERSION="$INPUT_OVERRIDE"
            EFFECTIVE_TAG="latest"
            echo "version=$VERSION" >> "$GITHUB_OUTPUT"
            echo "tag=$EFFECTIVE_TAG" >> "$GITHUB_OUTPUT"
            echo "→ Hotfix: will publish v${VERSION} to dist-tag '${EFFECTIVE_TAG}'"
            exit 0
          fi
          RAW=$(node -p "require('./package.json').version")
          BASE=$(echo "$RAW" | sed 's/-.*//')
          if [ -n "$INPUT_OVERRIDE" ]; then
            VERSION="$INPUT_OVERRIDE"
          else
            case "$INPUT_TAG" in
              dev)
                N=1
                while git tag -l "v${BASE}-dev.${N}" | grep -q .; do
                  N=$((N + 1))
                done
                VERSION="${BASE}-dev.${N}"
                ;;
              next)
                N=1
                while git tag -l "v${BASE}-rc.${N}" | grep -q .; do
                  N=$((N + 1))
                done
                VERSION="${BASE}-rc.${N}"
                ;;
              latest)
                VERSION="$BASE"
                ;;
              *)
                echo "::error::Unknown tag '$INPUT_TAG' (expected dev|next|latest)"
                exit 1
                ;;
            esac
          fi
          echo "version=$VERSION" >> "$GITHUB_OUTPUT"
          echo "tag=$INPUT_TAG" >> "$GITHUB_OUTPUT"
          echo "→ Will publish v${VERSION} to dist-tag '${INPUT_TAG}'"

      # Reconciliation mode: if version is already on npm (a prior run
      # published successfully but a downstream step failed), don't hard-fail.
      # Set a flag and skip the publish step below; tag/release/PR/dist-tag
      # steps still execute so the rerun can finish reconciling state.
      - name: Detect prior publish (reconciliation mode)
        id: prior_publish
        env:
          VERSION: ${{ steps.ver.outputs.version }}
        run: |
          EXISTING=$(npm view get-shit-done-cc@"$VERSION" version 2>/dev/null || true)
          if [ -n "$EXISTING" ]; then
            echo "::warning::get-shit-done-cc@${VERSION} is already on the registry — entering reconciliation mode (skip publish, continue with tag/release/PR/dist-tag)."
            echo "skip_publish=true" >> "$GITHUB_OUTPUT"
          else
            echo "skip_publish=false" >> "$GITHUB_OUTPUT"
          fi

      # Tolerant tag-existence check (matches release.yml pattern). An
      # operator re-running after a mid-flight publish-step failure should
      # not be blocked just because the tag step succeeded last time. Only
      # error if the existing tag points at a different commit than HEAD.
      - name: Check git tag (skip if matches HEAD, error if mismatched)
        env:
          VERSION: ${{ steps.ver.outputs.version }}
        run: |
          if git rev-parse -q --verify "refs/tags/v${VERSION}" >/dev/null; then
            EXISTING_SHA=$(git rev-parse "refs/tags/v${VERSION}")
            HEAD_SHA=$(git rev-parse HEAD)
            if [ "$EXISTING_SHA" != "$HEAD_SHA" ]; then
              echo "::error::git tag v${VERSION} already exists pointing at ${EXISTING_SHA}, but HEAD is ${HEAD_SHA}"
              exit 1
            fi
            echo "::notice::tag v${VERSION} already exists at HEAD; tag step will skip"
          fi

      - name: Configure git identity
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

      - name: Bump in-tree version (not committed)
        env:
          VERSION: ${{ steps.ver.outputs.version }}
        run: |
          # --allow-same-version: prepare may have already committed this bump
          # on the hotfix branch (release checks out BRANCH in real runs,
          # BASE_TAG in dry-runs — only the latter has the older version).
          npm version "$VERSION" --no-git-tag-version --allow-same-version
          cd sdk && npm version "$VERSION" --no-git-tag-version --allow-same-version

      - name: Install dependencies
        run: npm ci

      - name: Run full test suite with coverage (parity with release.yml)
        run: npm run test:coverage

      - name: Build SDK dist for tarball
        run: npm run build:sdk

      - name: Verify CC tarball ships sdk/dist/cli.js (bug #2647 guard)
        run: bash scripts/verify-tarball-sdk-dist.sh

      - name: Pack SDK as tarball and bundle into CC source tree
        env:
          VERSION: ${{ steps.ver.outputs.version }}
        run: |
          set -e
          cd sdk
          npm pack
          # npm pack emits gsd-build-sdk-<version>.tgz in the cwd
          TARBALL="gsd-build-sdk-${VERSION}.tgz"
          if [ ! -f "$TARBALL" ]; then
            echo "::error::Expected $TARBALL but npm pack did not produce it. Listing sdk/:"
            ls -la
            exit 1
          fi
          mkdir -p ../sdk-bundle
          mv "$TARBALL" ../sdk-bundle/gsd-sdk.tgz
          cd ..
          ls -la sdk-bundle/

      - name: Add sdk-bundle to CC files whitelist (in-tree, not committed)
        run: |
          node <<'NODE'
          const fs = require('fs');
          const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
          if (!Array.isArray(pkg.files)) {
            console.error('::error::package.json files is not an array');
            process.exit(1);
          }
          if (!pkg.files.includes('sdk-bundle')) {
            pkg.files.push('sdk-bundle');
            fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2) + '\n');
            console.log('Added sdk-bundle/ to package.json files whitelist');
          } else {
            console.log('sdk-bundle/ already in files whitelist');
          }
          NODE

      - name: Verify CC tarball will contain sdk-bundle/gsd-sdk.tgz
        run: |
          set -e
          TARBALL=$(npm pack --ignore-scripts 2>/dev/null | tail -1)
          if [ -z "$TARBALL" ] || [ ! -f "$TARBALL" ]; then
            echo "::error::npm pack produced no tarball"
            exit 1
          fi
          echo "Inspecting $TARBALL for sdk-bundle/gsd-sdk.tgz:"
          if ! tar -tzf "$TARBALL" | grep -q "package/sdk-bundle/gsd-sdk.tgz"; then
            echo "::error::CC tarball is missing package/sdk-bundle/gsd-sdk.tgz"
            tar -tzf "$TARBALL" | grep -E "sdk-bundle|sdk/dist" | head -20
            exit 1
          fi
          echo "✅ CC tarball contains sdk-bundle/gsd-sdk.tgz"
          rm -f "$TARBALL"

      - name: Dry-run publish validation
        # Skip the rehearsal when the version is already on npm
        # (reconciliation mode). `npm publish --dry-run` contacts the
        # registry and fails with "You cannot publish over the
        # previously published versions" if the version exists, even
        # though no actual publish would be attempted. The real publish
        # step (further down) is gated on the same condition; gate the
        # rehearsal too so re-runs of an already-published hotfix don't
        # fail here on a check that doesn't apply. Bug #2987.
        if: ${{ steps.prior_publish.outputs.skip_publish != 'true' }}
        env:
          TAG: ${{ steps.ver.outputs.tag }}
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
        run: npm publish --dry-run --tag "$TAG"

      - name: Tag and push
        if: ${{ !inputs.dry_run }}
        env:
          VERSION: ${{ steps.ver.outputs.version }}
        run: |
          if git rev-parse -q --verify "refs/tags/v${VERSION}" >/dev/null; then
            echo "Tag v${VERSION} already exists at HEAD (per pre-flight check); skipping git tag step"
          else
            git tag "v${VERSION}"
          fi
          git push origin "v${VERSION}"

      - name: Publish to npm (CC bundle, SDK included as both loose tree and .tgz)
        if: ${{ !inputs.dry_run && steps.prior_publish.outputs.skip_publish != 'true' }}
        env:
          TAG: ${{ steps.ver.outputs.tag }}
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
        run: npm publish --provenance --access public --tag "$TAG"

      # Keep `next` from going stale relative to `latest`. When publishing a
      # stable release, also point `next` at it so users on `@next` don't
      # get stuck on an older pre-release than what's now stable. Parity
      # with release.yml#finalize "Clean up next dist-tag" step.
      - name: Re-point next dist-tag at the new latest (only when tag=latest)
        if: ${{ !inputs.dry_run && steps.ver.outputs.tag == 'latest' }}
        env:
          VERSION: ${{ steps.ver.outputs.version }}
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
        run: |
          npm dist-tag add "get-shit-done-cc@${VERSION}" next
          echo "✅ next dist-tag re-pointed to v${VERSION} (matches latest)"

      - name: Create GitHub Release (idempotent)
        if: ${{ !inputs.dry_run }}
        env:
          GH_TOKEN: ${{ github.token }}
          VERSION: ${{ steps.ver.outputs.version }}
          TAG: ${{ steps.ver.outputs.tag }}
        run: |
          # Per-tag release flags:
          #   dev, next → --prerelease (won't be highlighted as the latest release on the repo page)
          #   latest    → --latest (becomes the highlighted release)
          # Idempotent: if release already exists (rerun after a transient
          # downstream failure), edit the latest flag instead of failing.
          if gh release view "v${VERSION}" >/dev/null 2>&1; then
            echo "GitHub Release v${VERSION} already exists; reconciling --latest flag"
            if [ "$TAG" = "latest" ]; then
              gh release edit "v${VERSION}" --latest || true
            fi
          elif [ "$TAG" = "latest" ]; then
            gh release create "v${VERSION}" \
              --title "v${VERSION}" \
              --generate-notes \
              --latest
          else
            gh release create "v${VERSION}" \
              --title "v${VERSION}" \
              --generate-notes \
              --prerelease
          fi
          echo "✅ GitHub Release v${VERSION} ready"

      # Merge-back PR step removed — bug #2983.
      #
      # The auto-cherry-pick hotfix flow only picks commits already on
      # main (`git cherry HEAD origin/main` outputs unmerged commits;
      # we filter to fix:/chore: from main). By construction every code
      # commit on the hotfix branch is already on main. The only
      # hotfix-branch-only commit is `chore: bump version to X.Y.Z for
      # hotfix`, which would either no-op against main (already past
      # X.Y.Z) or rewind main's in-progress version — strictly
      # counterproductive in either case.
      #
      # The original merge-back step also failed in production with
      # `GitHub Actions is not permitted to create or approve pull
      # requests (createPullRequest)` (org policy), but even if the
      # policy were lifted the PR would have nothing useful to merge.
      # Run 25232968975 was the trigger for removal.
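      # Illustrative only: `git cherry <upstream> <head>` prints one line
      # per commit on <head> — `+ <sha>` when <upstream> lacks an
      # equivalent patch (a pick candidate here), `- <sha>` when a
      # patch-equivalent change already landed, so `-` lines are skipped
      # as already shipped.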

      - name: Verify publish landed on registry
        if: ${{ !inputs.dry_run }}
        env:
          VERSION: ${{ steps.ver.outputs.version }}
          TAG: ${{ steps.ver.outputs.tag }}
        run: |
          PUBLISHED="NOT_FOUND"
          for delay in 5 10 20 30 45; do
            PUBLISHED=$(npm view get-shit-done-cc@"$VERSION" version 2>/dev/null || echo "NOT_FOUND")
            if [ "$PUBLISHED" = "$VERSION" ]; then
              break
            fi
            echo "Waiting ${delay}s for registry to catch up (saw: $PUBLISHED)..."
            sleep "$delay"
          done
          if [ "$PUBLISHED" != "$VERSION" ]; then
            echo "::error::Version $VERSION did not appear on the registry within timeout"
            exit 1
          fi
          TAG_VERSION=$(npm view get-shit-done-cc dist-tags."$TAG" 2>/dev/null || echo "NOT_FOUND")
          if [ "$TAG_VERSION" != "$VERSION" ]; then
            echo "::error::dist-tag '$TAG' resolves to '$TAG_VERSION', expected '$VERSION'"
            exit 1
          fi
          echo "✅ get-shit-done-cc@${VERSION} live on dist-tag '${TAG}'"

      - name: Summary
        env:
          ACTION: ${{ inputs.action }}
          VERSION: ${{ steps.ver.outputs.version }}
          TAG: ${{ steps.ver.outputs.tag }}
          BASE_TAG: ${{ needs.prepare.outputs.base_tag }}
          BRANCH: ${{ needs.prepare.outputs.ref }}
          DRY_RUN: ${{ inputs.dry_run }}
        run: |
          {
            if [ "$ACTION" = "hotfix" ]; then
              echo "## Release SDK Bundle (hotfix): v${VERSION} → @${TAG}"
              echo ""
              echo "- Base (cumulative-fix anchor): \`${BASE_TAG}\`"
              echo "- Branch: \`${BRANCH}\`"
            else
              echo "## Release SDK Bundle: v${VERSION} → @${TAG}"
            fi
            echo ""
            if [ "$DRY_RUN" = "true" ]; then
              echo "**DRY RUN** — npm publish, git tag, push, and GitHub Release were skipped."
            else
              echo "- Published \`get-shit-done-cc@${VERSION}\` to dist-tag \`${TAG}\`"
              echo "- SDK bundled inside the CC tarball at:"
              echo "  - \`sdk/dist/cli.js\` (loose tree, consumed by \`bin/gsd-sdk.js\` shim)"
              echo "  - \`sdk-bundle/gsd-sdk.tgz\` (npm-installable artifact)"
              echo "- Git tag \`v${VERSION}\` pushed"
              echo "- GitHub Release \`v${VERSION}\` created"
              if [ "$TAG" = "latest" ]; then
                echo "- \`next\` dist-tag re-pointed at \`v${VERSION}\` (kept current with \`latest\`)"
              fi
              if [ "$ACTION" = "hotfix" ]; then
                # Auto-cherry-pick hotfixes only pick commits already on
                # main, so there's nothing to merge back. The merge-back
                # PR step was removed in #2983; this line surfaces the
                # explicit non-action so operators don't expect a PR
                # that was never opened.
                echo "- No merge-back PR (auto-picked commits are already on main)"
              fi
              echo "- Install: \`npm install -g get-shit-done-cc@${TAG}\`"
            fi
          } >> "$GITHUB_STEP_SUMMARY"
.github/workflows/test.yml (12 changed lines, vendored)
@@ -88,6 +88,18 @@ jobs:
      - name: Build SDK dist (required by installer)
        run: npm run build:sdk

      # Seam contract gate: keep manifest -> generated aliases -> registry/CJS adapters aligned.
      # Run once per workflow on the primary Linux node to avoid redundant matrix cost.
      - name: SDK seam coverage tests
        if: matrix.os == 'ubuntu-latest' && matrix.node-version == 24
        shell: bash
        run: cd sdk && npx vitest run src/query/command-seam-coverage.test.ts

      - name: SDK generated alias artifact drift check
        if: matrix.os == 'ubuntu-latest' && matrix.node-version == 24
        shell: bash
        run: node sdk/scripts/check-command-aliases-fresh.mjs

      - name: Run tests with coverage
        shell: bash
        run: npm run test:coverage

CHANGELOG.md (37 changed lines)
@@ -4,8 +4,21 @@ All notable changes to GSD will be documented in this file.

Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

## [Unreleased](https://github.com/gsd-build/get-shit-done/compare/v1.38.5...HEAD)
## [Unreleased](https://github.com/gsd-build/get-shit-done/compare/v1.39.1...HEAD)

### Changed

- **Test suite for `config-schema.cjs` is now mutation-resistant** — Stryker measured a 4.62% mutation score on `get-shit-done/bin/lib/config-schema.cjs` (6 killed, 124 survived out of 130). Surviving mutants flagged that existing tests were exercising paths but not verifying outputs: a polarity flip (`return true` → `return false`), a predicate swap (`.some` → `.every`), or a guard removal (`if (VALID_CONFIG_KEYS.has(...)) return true;` → unguarded fallthrough) all passed every test. New `tests/bug-2986-config-schema-mutation-killers.test.cjs` adds 95 tests across four suites that target each surviving mutant class: (1) parameterized `isValidConfigKey('${key}') === true` for every member of `VALID_CONFIG_KEYS` (kills the static-key-fast-path mutation), (2) representative dynamic-pattern keys that match exactly one pattern (kills the `.some` → `.every` mutation, with an inline mutual-exclusivity invariant check), (3) `strictEqual` against the literal boolean `true`/`false` instead of `assert.ok` truthy checks (kills polarity-flip mutations), (4) anchor-tightening cases that differ from valid keys by one character beyond the documented shape (kills regex-loosening mutations on `^`, `$`, and character-class boundaries). Tests use the lib's public surface (typed boolean assertions on `isValidConfigKey` return values), no source-grep. (#2986)
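
  A minimal sketch of the assertion shapes this entry describes (illustrative only — the key list, require path, and suite layout here are assumptions, not the shipped test file):

  ```javascript
  const assert = require('node:assert');
  const { test } = require('node:test');
  // Path as named in the entry above; adjust relative to the test file.
  const { isValidConfigKey } = require('../get-shit-done/bin/lib/config-schema.cjs');

  test('static keys return the literal boolean true (kills polarity flips)', () => {
    for (const key of ['context_window']) { // the real suite iterates every member of VALID_CONFIG_KEYS
      assert.strictEqual(isValidConfigKey(key), true); // `=== true`, not an assert.ok truthiness check
    }
  });

  test('unknown keys return the literal boolean false (kills guard removal)', () => {
    assert.strictEqual(isValidConfigKey('definitely_not_a_key'), false);
  });
  ```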

### Fixed

- **`gsd-pristine/` is now populated by the installer when local patches are detected** — `saveLocalPatches` declared a `pristineDir` variable and JSDoc'd "saves pristine copies (from manifest) to gsd-pristine/ to enable three-way merge during reapply-patches", but no code ever wrote to that directory. Effect: the `/gsd-reapply-patches` Step 5 verifier (#2972) silently degraded to its over-broad fallback heuristic ("every significant backup line"), exactly the silent-success-on-lost-content failure mode #2969 was designed to prevent. Fix: new `populatePristineDir({ packageSrc, pristineDir, modified, runtime, pathPrefix, isGlobal })` helper runs the install transform pipeline (`copyWithPathReplacement`) into a tmp staging dir, then copies out only the modified-file paths into `gsd-pristine/`. `saveLocalPatches` now accepts a `pristineCtx` and calls the helper when local patches are detected; the install entry point passes the package source root, runtime, pathPrefix, and isGlobal so transforms produce byte-identical output to what `copyWithPathReplacement` would have written under normal install. Soft-fails on transform errors (logs a warning, continues with empty pristine — no worse than pre-fix behavior). Pristine reflects the about-to-install version's content, which is what the verifier needs as the "what would survive without the user's modifications" baseline. Regression covered by `tests/bug-2998-pristine-dir-populated.test.cjs` (6 tests across two suites): asserts the helper is exported, returns 0 for empty modified list, writes one pristine file per source-existing path, skips ghost paths without corrupting pristine, and produces deterministic output (two runs with same inputs yield byte-identical pristine — the property `pristine_hashes` in `backup-meta.json` depends on). (#2998)
- **`release-sdk` hotfix re-run no longer fails at `Dry-run publish validation` when the version is already on npm** — the `Detect prior publish (reconciliation mode)` step sets `skip_publish=true` when the package version is already on the registry, and the actual publish step honors that gate. The `Dry-run publish validation` step was missing the same guard, so any operator re-run of an already-published hotfix (the typical recovery path when later steps fail mid-flight) hit `npm publish --dry-run` first and got `npm error You cannot publish over the previously published versions: X.Y.Z` — `npm publish --dry-run` contacts the registry and rejects existing-version targets even though it doesn't actually publish. The dry-run validation step is now gated on the same `steps.prior_publish.outputs.skip_publish != 'true'` condition as the publish step. The rehearsal still runs on first publishes (where it has value); it skips only in the specific reconciliation case where the publish itself would be skipped. Trigger run: [25233855236](https://github.com/gsd-build/get-shit-done/actions/runs/25233855236/job/73995605643). Regression covered by `tests/bug-2987-dry-run-validation-skip-on-reconciliation.test.cjs`. (#2987)
- **`release-sdk` hotfix flow hardened against silent classifier failures, missing-classifier-at-base-tag, and a vestigial merge-back PR step** — three issues surfaced by CodeRabbit's post-merge review of #2981 plus a production failure on the v1.39.1 release run. **(1)** `scripts/diff-touches-shipped-paths.cjs` reused exit code `1` for both the legitimate "no shipped paths" classifier result and Node's default uncaught-throw exit, so any tooling failure was indistinguishable from a normal skip. The script now uses `0` (shipped), `1` (not shipped), `2` (classifier error) with `try`/`catch` + `uncaughtException`/`unhandledRejection` handlers routing all failure paths to exit `2`. **(2)** The workflow's `git checkout -b "$BRANCH" "$BASE_TAG"` overwrote the working tree with the base tag's contents *before* the cherry-pick loop ran the classifier — but base tags predating the classifier's introduction (notably v1.39.0) don't have the file in their tree, so `node scripts/diff-touches-shipped-paths.cjs` would exit non-zero and silently drop every commit, producing an empty hotfix release. The classifier is now staged into `$RUNNER_TEMP` at the top of `Prepare hotfix branch` (before any working-tree-mutating git command), and the loop references that staged copy. The cherry-pick loop snapshots `$PIPESTATUS` into a local array (`PIPE_RC=("${PIPESTATUS[@]}")`) immediately after the classifier pipeline — under bracketed `set +e`/`set -e` — and dispatches via explicit `case`: `0` proceeds, `1` skips into `NON_SHIPPED_SKIPPED`, anything else emits `::error::shipped-paths classifier failed for $SHA (exit N)` and fails the workflow. CodeRabbit on PR #2984 caught a subtler bug in the first iteration: `pipeline || true; RC=${PIPESTATUS[1]}` is broken because `|| true` runs `true` as its own one-command pipeline on the failure paths, overwriting `PIPESTATUS` to `(0)` and leaving `${PIPESTATUS[1]}` unset. The array-snapshot form is invariant against this; a minimal shell sketch of the hazard appears after this list. The same hardening also surfaces `git diff-tree`'s exit code (via `PIPE_RC[0]`); a non-zero diff-tree result now also fails the workflow rather than feeding partial input to the classifier. **(3)** Removed the `Open merge-back PR (hotfix only)` step. The auto-cherry-pick hotfix flow only picks commits already on main (`git cherry HEAD origin/main` outputs the unmerged ones), so by construction every code commit on the hotfix branch is already on main. The only hotfix-branch-only commit is the version-bump chore, which would either no-op against main or rewind main's in-progress version. The step also failed in production with `GitHub Actions is not permitted to create or approve pull requests (createPullRequest)` (org policy) on run [25232968975](https://github.com/gsd-build/get-shit-done/actions/runs/25232968975). The `pull-requests: write` permission previously granted to the release job has been dropped in line with least-privilege. The run-summary line that previously echoed `Merge-back PR opened against main` has been replaced with `No merge-back PR (auto-picked commits are already on main)` so operators reading the summary see an accurate non-action statement (CodeRabbit on PR #2984). Regression covered by `tests/bug-2983-classifier-exit-codes-and-base-tag-staging.test.cjs` (15 assertions across exit-code semantics, classifier staging, error dispatch, PIPESTATUS-snapshot hardening, diff-tree fail-fast, merge-back removal, and run-summary accuracy). (#2983)
- **`release-sdk` hotfix only cherry-picks commits that change what actually ships** — the `fix:`/`chore:` filter in `Prepare hotfix branch` was too broad: it picked any commit with that conventional-commit type regardless of whether the diff could affect the published npm package. CI-only fixes (release-sdk.yml itself, hotfix tooling, test-only commits) were getting cherry-picked into hotfix branches even though they cannot change the tarball — and the subset touching `.github/workflows/*` then caused the prepare job's `git push` to be rejected by GitHub because the default `GITHUB_TOKEN` lacks the `workflow` scope, aborting the run. v1.39.1 hit this on PR #2977 (run [25232010071](https://github.com/gsd-build/get-shit-done/actions/runs/25232010071)). The loop now pre-skips any candidate commit whose `git diff-tree` output doesn't intersect the npm tarball's shipped paths (entries in `package.json` `files`, plus `package.json` itself, which `npm pack` always includes). Skipped commits land in a new `NON_SHIPPED_SKIPPED` summary bucket framed as informational — non-shipping commits cannot affect the package, so the skip needs no operator action. The shipped-paths classifier lives in `scripts/diff-touches-shipped-paths.cjs` so its rules (file-OR-directory prefix matching `npm pack` semantics, the always-shipped rule for `package.json`, the lockfile-not-shipped rule) are unit-testable; a sketch of the prefix rule appears after this list. Regression covered by `tests/bug-2980-hotfix-only-picks-shipping-changes.test.cjs`. (#2980)
- **`release-sdk` hotfix workflow fails on real run with `npm error Version not changed`** — the `release` job's `Bump in-tree version (not committed)` step ran `npm version "$VERSION"` without `--allow-same-version`, so it errored on real (non-dry-run) hotfix runs because `prepare` had already committed the bump on the hotfix branch. The release job's checkout `ref` is asymmetric — `BRANCH` (already bumped) on real runs vs `BASE_TAG` (older version) on dry-runs — which is why dry-run never caught the bug. Both `npm version` calls in that step now pass `--allow-same-version`, matching the existing pattern in `release.yml:326`. (#2976)
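
A minimal shell sketch of the `PIPESTATUS` hazard described in the #2983 entry above (toy commands, not the workflow source):

```bash
# BROKEN: the pipeline fails, so `|| true` fires — but `true` is itself a
# one-command pipeline and overwrites PIPESTATUS with (0) before it is read.
true | false || true
echo "${PIPESTATUS[1]-unset}"   # prints "unset", not the second command's exit code

# FIXED (the array-snapshot form): capture PIPESTATUS immediately, under
# bracketed set +e / set -e so the failing pipeline can't abort the script.
set +e
true | false
PIPE_RC=("${PIPESTATUS[@]}")
set -e
echo "first=${PIPE_RC[0]} second=${PIPE_RC[1]}"   # first=0 second=1
```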
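
And a sketch of the shipped-paths rule the #2980 entry describes (hypothetical helper — the real rules live in `scripts/diff-touches-shipped-paths.cjs`):

```javascript
// npm-pack-style matching: a `files` entry ships the named file and,
// when it is a directory, everything under it; `package.json` always
// ships; lockfiles never do.
function touchesShippedPaths(changedPaths, filesWhitelist) {
  return changedPaths.some(
    (p) =>
      p === 'package.json' ||
      filesWhitelist.some((f) => p === f || p.startsWith(f + '/'))
  );
}

// touchesShippedPaths(['.github/workflows/release-sdk.yml'], ['bin'])  → false: skip the pick
// touchesShippedPaths(['bin/install.js'], ['bin'])                     → true: cherry-pick it
```
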
### Added — 1.40.0-rc.1
- **Six namespace meta-skills with keyword-tag descriptions** — replace the flat 86-skill
listing with two-stage hierarchical routing. Model sees 6 namespace routers
@@ -26,6 +39,8 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
RC. (#2833)
### Changed — 1.40.0-rc.1
- **Hotfix release flow now auto-incorporates fixes from `main` and bundles the SDK** — `hotfix.yml create` auto-cherry-picks every `fix:`/`chore:` commit on `origin/main` not yet shipped (oldest-first; patch-equivalents skipped via `git cherry`; `feat:`/`refactor:` excluded; conflicts halt with the offending SHA; run summary lists every included SHA). `hotfix.yml finalize` adds the `install-smoke` cross-platform gate, bundles `sdk-bundle/gsd-sdk.tgz` inside the CC tarball (parity with `release-sdk.yml`), tightens the `next` dist-tag re-point, and marks the GitHub Release `--latest`. `release-sdk.yml` gains `action: publish | hotfix` plus an `auto_cherry_pick` toggle, with a new `prepare` job that branches `hotfix/X.YY.Z` from the highest existing `vX.YY.*` tag and runs the same cherry-pick logic — idempotent if the branch was pre-prepared via `hotfix.yml`. Hotfix `vX.YY.Z` is now defined as everything in `vX.YY.{Z-1}` plus every `fix:`/`chore:` since that base, so each tag is the cumulative-fix anchor for the next. (#2955)
- **Planning workspace seam extracted from `core.cjs` into `planning-workspace.cjs`** — path/workstream/lock behavior now lives in a dedicated module (`planningDir`, `planningPaths`, `planningRoot`, active-workstream routing, `withPlanningLock`). `core.cjs` keeps compatibility re-exports while call-sites migrate to direct imports, improving locality and reducing coupling. (#2900)
- **Skill surface consolidated 86 → 59 `commands/gsd/*.md` entries** — four new
grouped skills (`capture`, `phase`, `config`, `workspace`) replace clusters of
micro-skills. Six existing parents absorb wrap-up and sub-operations as flags:
@@ -37,7 +52,14 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
now auto-closes PRs opened without a closing keyword that links a tracking issue,
posting a comment that points to the contribution guide. (#2872)
### Fixed
- **Stale deleted command references updated across workflow files** — `help.md`, `do.md`, `settings.md`, `discuss-phase.md`, `new-project.md`, `plan-phase.md`, `spike.md`, and `sketch.md` referenced command names removed in #2790; updated to new consolidated equivalents. (#2950)
### Fixed — 1.40.0-rc.1
- **`spike --wrap-up` now dispatches correctly** — `/gsd-spike --wrap-up` was silently no-oping because the flag dispatch wiring was omitted when the micro-skill entry point was absorbed in #2790. (#2948)
- **`config-get context_window` returns `200000` when key absent** — querying an unset `context_window` previously exited 1 with "Key not found", surfacing a confusing error in planning logs even though the workflow fallback worked correctly. `cmdConfigGet` now consults a `SCHEMA_DEFAULTS` map and returns the documented default (`200000`, exit 0) for absent schema-defaulted keys; unknown absent keys still error as before. (#2943)
- **`gap-analysis` now parses non-`REQ-` requirement IDs and ignores traceability table headers** — `parseRequirements()` no longer hard-codes the `REQ-` prefix and now accepts uppercase prefixed IDs such as `TST-01`, `BACK-07`, and `INSP-04`; markdown table header rows (for example `| REQ-ID | ... |`) are excluded so header tokens are not reported as phantom uncovered requirements. Added regression coverage for mixed-prefix REQUIREMENTS files with traceability tables. (#2897)
- **Gemini slash commands namespaced as `/gsd:<cmd>` instead of `/gsd-<cmd>`** —
Gemini CLI namespaces commands under `gsd:`, so `/gsd-plan-phase` was unexecutable.
Body-text references in commands, agents, banners, and patch-reapply hints are now
@@ -304,6 +326,19 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
on-demand `Read()` calls gated behind mode routing. Tokens loaded at skill entry drop
from ~13k to near zero; only the branch actually invoked is loaded. (#2606)
## [1.39.1] - 2026-05-01
Hotfix release. Cherry-picks user-facing fixes from `main` onto the v1.39.0 stable
line. Install: `npm install -g get-shit-done-cc@latest` (or `@1.39.1` to pin).
### Fixed
- **`gsd-sdk query agent-skills` emits raw `<agent_skills>` block instead of JSON-wrapped string** — workflows that embed via `$(gsd-sdk query agent-skills <agent>)` were receiving a JSON-quoted string literal mid-prompt (e.g. `"<agent_skills>\n…"`), silently breaking all `<agent_skills>` injection into spawned subagents. The CLI dispatcher now honors an opt-in `format: 'text'` field on `QueryResult` and writes such results raw via `process.stdout.write`; `--pick` always returns JSON regardless. (#2917)
- **`sketch --wrap-up` now dispatches correctly** — `/gsd-sketch --wrap-up` was silently no-oping because the flag dispatch wiring was omitted when the micro-skill entry point was absorbed in #2790. (#2949)
- **`help.md` no longer advertises eight slash commands removed by the #2824 consolidation** — `/gsd-do`, `/gsd-note`, `/gsd-check-todos`, `/gsd-plant-seed`, `/gsd-research-phase`, `/gsd-list-phase-assumptions`, `/gsd-plan-milestone-gaps`, and `/gsd-join-discord` were removed when 86 skills were folded into 59. `help.md` was not updated alongside, so users typing the documented commands hit *Unknown command*. Each entry is now either rewritten to the surviving flag-based dispatcher (e.g., `/gsd-do …` → `/gsd-progress --do "…"`, `/gsd-note` → `/gsd-capture --note`, `/gsd-plant-seed` → `/gsd-capture --seed`, `/gsd-check-todos` → `/gsd-capture --list`) or removed for skills with no replacement. A regression test now asserts every `/gsd-*` reference in `help.md` has a matching `commands/gsd/*.md` stub. (#2954)
- **`--sdk` install on Windows now writes a callable `gsd-sdk` shim** — `npx get-shit-done-cc@latest --claude --global --sdk` on Windows previously left `gsd-sdk` off PATH because `trySelfLinkGsdSdk` returned `null` unconditionally on `win32` (a missed gap from #2775's POSIX self-link, not an intentional deferral). The function now dispatches to a Windows counterpart that writes the standard npm shim triple (`gsd-sdk.cmd`, `gsd-sdk.ps1`, and a Bash wrapper) to npm's global bin, so `gsd-sdk` resolves in a fresh shell across cmd.exe, PowerShell, and Cygwin/MSYS/Git-Bash. A new regression guard in `tests/no-unconditional-win32-skip.test.cjs` blocks any future `if (process.platform === 'win32') return null;` skip-only branches in `bin/install.js`. (#2962)
- **`/gsd-reapply-patches` Step 5 gate is now deterministic — no more silent content drops** — the prior gate parsed a Claude-generated *Hunk Verification Table* whose `verified: yes` rows were filled in without actually checking content presence, leading to merged files that lost user-added blocks (e.g., a `<visual_companion>` section, an `--execute-only` flag block) while the workflow reported success. The gate now invokes a Node script (`scripts/verify-reapply-patches.cjs`) that diffs each backup against the pristine baseline, computes the user-added significant lines, and asserts each one is present in the merged file. Exits non-zero with a per-file diagnostic on any miss; the workflow halts and surfaces the JSON output to the user. The verifier ignores low-signal lines (too short, pure whitespace, decorative comments) so trivial differences don't trigger false failures. Out of scope here: the manifest-baseline tightening described in #2969 Failure 1 — that's separate work. (#2969)
## [1.38.5] - 2026-04-25
### Fixed
CONTRIBUTING.md (147 changed lines)
@@ -91,6 +91,23 @@ PRs that arrive without a properly-labeled linked issue are closed automatically
- **CI must pass** — all matrix jobs (Ubuntu × Node 22, 24; macOS × Node 24) must be green
- **Scope matches the approved issue** — if your PR does more than what the issue describes, you'll be asked to remove the extra changes or move them to a new issue

## CHANGELOG Entries — Drop a Fragment
**Do not edit `CHANGELOG.md` directly.** Two PRs that both append to a `### Fixed` block always conflict on merge — git can't pick a serialization order without a human. Instead, every PR with user-facing changes drops a fragment file in `.changeset/`.
```bash
npm run changeset -- --type Fixed --pr <YOUR_PR_NUMBER> \
--body "**\`/gsd-foo\` no longer drops trailing slashes** — explain the user-visible change."
```
This writes `.changeset/<adjective>-<noun>-<noun>.md`. Three random words → concurrent PRs never collide. Allowed `type:` values follow [Keep a Changelog](https://keepachangelog.com/): `Added`, `Changed`, `Deprecated`, `Removed`, `Fixed`, `Security`.
Fragments are consolidated into `CHANGELOG.md` at release time by the release workflow. See [`.changeset/README.md`](.changeset/README.md) for the format spec and [#2975](https://github.com/gsd-build/get-shit-done/issues/2975) for the rationale.
**CI enforcement:** the `Changeset Required` workflow (`scripts/changeset/lint.cjs`) fails any PR that touches `bin/`, `get-shit-done/`, `agents/`, `commands/`, `hooks/`, or `sdk/src/` without a `.changeset/*.md` fragment.
**Opt-out:** PRs with no user-facing impact (test refactors, lint config changes, CI tweaks, formatting-only changes) can add the `no-changelog` label. The lint honors it. When unsure whether a change is user-facing, **add the fragment**.
## Testing Standards
All tests use Node.js built-in test runner (`node:test`) and assertion library (`node:assert`). **Do not use Jest, Mocha, Chai, or any external test framework.**
@@ -281,6 +298,7 @@ Some tests legitimately read source files. There are six recognized categories:
| `docs-parity` | A reference doc must stay in sync with source-defined constants (e.g., `CONFIG_DEFAULTS`). The source is the canonical list; there is no runtime API to enumerate it. |
| `integration-test-input` | A source file is used as a real fixture input to a transformation function under test — the file is not inspected for strings but passed as data. |
| `structural-implementation-guard` | A feature's interception or wiring point is not reachable end-to-end via `runGsdTools`. Used temporarily until a behavioral path exists. |
| `pending-migration-to-typed-ir` | **Tracked for correction, not exempted.** Test was identified by the lint as carrying a raw-text-matching pattern that contradicts the rule above. Each annotated file MUST cite the open migration issue (e.g. `// allow-test-rule: pending-migration-to-typed-ir [#NNNN]`) so the tracking is auditable. New tests cannot use this category — they must refactor production to expose typed IR. The annotation is removed when the test is corrected. |
Annotate with a standalone `//` comment before the file's opening block comment:
@@ -296,6 +314,68 @@ Annotate with a standalone `//` comment before the file's opening block comment:
The annotation **must** be a standalone `// allow-test-rule:` line, not inside a `/** */` block comment — the CI linter scans for the pattern `// allow-test-rule:`.
### Prohibited: Raw Text Matching on Test Outputs (file content, stdout, stderr)
**Source-grep is not just `readFileSync` of a `.cjs` file.** The same anti-pattern shows up wherever a test pattern-matches against text that a system-under-test produced, regardless of whether that text came from a source file, a rendered shim, a child process's stdout, or a free-form `reason` string. **All forms are forbidden.**
The following are all violations of the same rule:
```javascript
// BAD — substring match on text written by the code under test
const cmdContent = fs.readFileSync(path.join(tmpDir, 'gsd-sdk.cmd'), 'utf8');
assert.ok(cmdContent.includes(`@node ${jsonQuoted} %*`), '.cmd embeds shim path');

// BAD — regex match on a child process's human-readable stdout formatter
const r = cp.spawnSync(SCRIPT, ['--patches-dir', dir]);
assert.match(r.stdout, /Failures: 1/);
assert.match(r.stdout, /not a regular file/);

// BAD — "structured parser" that hides string ops behind a function wrapper
function parseCmdShim(content) {
  const lines = content.split('\r\n').filter((l) => l.length > 0);
  return { header: lines[0], usesCRLF: content.includes('\r\n') };
}

// BAD — assert.match on a free-form `reason` string from a JSON report
assert.ok(/not a regular file/.test(report.results[0].reason));
```
Each of these passes on accidental near-matches (a comment containing `@node` somewhere, a stack trace that happens to say `Failures: 1`, a mis-typed reason that still contains the substring you're matching) and fails on harmless reformatting (changing `Failures: 1` to `1 failure`, swapping CRLF rendering style, rewording the error prose).
#### The rule
> **Tests assert on typed structured values. If the code under test produces text, the code under test must also expose a structured intermediate representation, and the test must assert on that IR — never on the rendered text.**
Concretely: for any system-under-test that produces text output (a file renderer, a CLI formatter, an error-message builder), the production code MUST expose a typed alternative that the test consumes:
| Output kind | Required structured surface | What the test asserts on |
|---|---|---|
| Rendered file (shim, template, generated code) | A pure builder function returning the IR (`{ invocation, eol, fileNames, render }`) | `triple.invocation.target === expected`, `triple.eol.cmd === '\r\n'` |
| CLI human-formatter output | A `--json` mode that emits the same data structurally | `report.results[0].reason === REASON.FAIL_INSTALLED_NOT_REGULAR_FILE` |
| Error / status / reason | A frozen enum (`Object.freeze({ FAIL_X: 'fail_x', ... })`) | `assert.equal(result.reason, REASON.FAIL_X)` |
| File presence after a write | `fs.statSync().isFile()`, `.size > 0`, `.mtimeMs` advances | Filesystem facts; never read the file content back |
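
For contrast with the BAD examples above, a minimal sketch of the enum-plus-structured-assert shape (identifiers illustrative, not the repo's actual surfaces):

```javascript
const assert = require('node:assert');

// Production code exposes the IR: a frozen reason enum plus structured output.
const REASON = Object.freeze({
  OK: 'ok',
  FAIL_INSTALLED_NOT_REGULAR_FILE: 'fail_installed_not_regular_file',
});

function verify(stats) {
  // Returns structured data; the human formatter renders it elsewhere.
  return { reason: stats.isFile ? REASON.OK : REASON.FAIL_INSTALLED_NOT_REGULAR_FILE };
}

// GOOD — the test asserts on the IR, never on rendered prose.
assert.equal(verify({ isFile: false }).reason, REASON.FAIL_INSTALLED_NOT_REGULAR_FILE);
// GOOD — lock the enum surface so code and tests can't drift apart.
assert.deepStrictEqual(Object.keys(REASON).sort(), ['FAIL_INSTALLED_NOT_REGULAR_FILE', 'OK'].sort());
```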
#### Concrete examples from this repo
`buildWindowsShimTriple(shimSrc)` in `bin/install.js` is the canonical IR pattern: pure function, no I/O, returns `{ invocation, eol, fileNames, render }`. `trySelfLinkGsdSdkWindows` calls it and writes `triple.render[kind]()` to disk. Tests assert on `triple.invocation.target`, `triple.eol.cmd`, `Object.keys(triple).sort()` — never on the rendered text. Filesystem-level tests assert `fs.statSync(target).size === Buffer.byteLength(triple.render.cmd())` to prove the writer writes what the renderer produces, **without comparing content**.
`scripts/verify-reapply-patches.cjs` exposes a frozen `REASON` enum and emits it through `--json`. Tests assert `report.results[0].reason === REASON.FAIL_USER_LINES_MISSING`. The human formatter exists for operator console output only — tests must not depend on its prose. Adding a new reason code requires updating the `REASON` enum, the `--json` output, AND the test that locks `Object.keys(REASON).sort()` — three coordinated changes that prevent the code surface from drifting from the test surface.
#### Hiding grep behind a function is still grep
`parseCmdShim`, `parsePs1Invocation`, etc. that internally do `content.split(...)`, `lines[1].trim()`, `content.includes(...)` are still string manipulation. The fact that the entry point looks like a parser doesn't change what's happening underneath — the test is still asserting on the lexical shape of rendered text. The fix is not "wrap the grep in a function with a typed-looking return value." The fix is to **eliminate the rendered text from the test path entirely** by surfacing the IR.
#### When you cannot eliminate text matching
There are exactly two cases where text content is the legitimate object of a test, both already covered by the existing exemption matrix:
1. `source-text-is-the-product` — workflow `.md` / agent `.md` / command `.md` files where the deployed text IS what the runtime loads.
2. `docs-parity` — a reference doc must mirror source-defined constants and there is no runtime enumeration API.
For everything else, if a test reaches for `.includes()` / `.startsWith()` / `assert.match(text, /…/)`, the production code is missing a typed surface. **Add the typed surface; do not work around it.**
**CI enforcement:** `scripts/lint-no-source-grep.cjs` is being extended (see issue tracker for the latest scope) to flag `String#includes`/`String#startsWith`/`String#endsWith`/`assert.match` on `readFileSync` results and on `cp.spawnSync` stdout/stderr in test files, with the same `// allow-test-rule:` exemption mechanism.
### Node.js Version Compatibility
**Node 22 is the minimum supported version.** Node 24 is the primary CI target. All tests must pass on both.
@@ -345,6 +425,73 @@ node --test tests/core.test.cjs
npm run test:coverage
```
### Pre-PR Seam Checks (Manifest/Alias Routing)
If you touched any of the command-manifest or generated alias files, run:
```bash
npm run check:alias-drift
```
This verifies generated alias artifacts are in sync with manifest source-of-truth.
Optional local pre-commit hook entry (Git-native):
```bash
# one-time setup
mkdir -p .githooks
cat > .githooks/pre-commit <<'EOF'
#!/usr/bin/env bash
set -euo pipefail

if git diff --cached --name-only | grep -Eq "^sdk/src/query/command-manifest\.|^sdk/src/query/command-aliases\.generated\.ts$|^get-shit-done/bin/lib/command-aliases\.generated\.cjs$|^sdk/scripts/gen-command-aliases\.ts$"; then
  npm run check:alias-drift
fi
EOF
chmod +x .githooks/pre-commit
git config core.hooksPath .githooks
```
Optional local pre-push hook to block a private author-email pattern:
```bash
# set locally in your shell profile (example)
export GSD_BLOCKED_AUTHOR_REGEX='@example-corp\.com$'

cat > .githooks/pre-push <<'EOF'
#!/usr/bin/env bash
set -euo pipefail

zero_sha='0000000000000000000000000000000000000000'
blocked_regex="${GSD_BLOCKED_AUTHOR_REGEX:-}"
[[ -z "$blocked_regex" ]] && exit 0
violations=()

while read -r local_ref local_sha remote_ref remote_sha; do
  [[ "$local_sha" == "$zero_sha" ]] && continue
  if [[ "$remote_sha" == "$zero_sha" ]]; then
    commits=$(git rev-list "$local_sha" --not --remotes)
  else
    commits=$(git rev-list "$remote_sha..$local_sha")
  fi
  while read -r commit; do
    [[ -z "$commit" ]] && continue
    email=$(git show -s --format='%ae' "$commit" | tr '[:upper:]' '[:lower:]')
    if printf '%s' "$email" | grep -Eq "$blocked_regex"; then
      violations+=("$commit <$email>")
    fi
  done <<< "$commits"
done

if [[ ${#violations[@]} -gt 0 ]]; then
  echo "Push blocked: commit author email matched local blocked regex ($blocked_regex)." >&2
  printf ' - %s\n' "${violations[@]}" >&2
  exit 1
fi
EOF
chmod +x .githooks/pre-push
```
### CI Test Quality Checks
The following checks run on every PR in addition to the test suite:

README.ja-JP.md

@@ -75,15 +75,17 @@ GSDはそれを解決します。Claude Codeを信頼性の高いものにする
ビルトインの品質ゲートが本当の問題を検出します:スキーマドリフト検出はマイグレーション漏れのORM変更をフラグし、セキュリティ強制は検証を脅威モデルに紐付け、スコープ削減検出はプランナーが要件を暗黙的に落とすのを防止します。
### v1.32.0 ハイライト
### v1.39.0 ハイライト
- **STATE.md整合性ゲート** — `state validate`がSTATE.mdとファイルシステムの差分を検出、`state sync`が実際のプロジェクト状態から再構築
- **`--to N`フラグ** — 自律実行を特定のフェーズ完了後に停止
- **リサーチゲート** — RESEARCH.mdに未解決の質問がある場合、計画をブロック
- **検証マイルストーンスコープフィルタリング** — 後のフェーズで対処されるギャップは「ギャップ」ではなく「延期」としてマーク
- **読み取り後編集ガード** — 非Claudeランタイムでの無限リトライループを防止するアドバイザリーフック
- **コンテキスト削減** — Markdownのトランケーションとキャッシュフレンドリーなプロンプト順序でトークン使用量を削減
- **4つの新ランタイム** — Trae、Kilo、Augment、Cline(合計12ランタイム)
完全なリストは [v1.39.0 リリースノート](https://github.com/gsd-build/get-shit-done/releases/tag/v1.39.0) を参照してください。
- **`--minimal` インストールプロファイル** — エイリアス `--core-only`。メインループの6スキル(`new-project`、`discuss-phase`、`plan-phase`、`execute-phase`、`help`、`update`)のみをインストールし、`gsd-*` サブエージェントはゼロ。コールドスタート時のシステムプロンプトのオーバーヘッドを ~12kトークンから ~700トークンへ削減(≥94%減)。32K〜128Kコンテキストのローカル LLM やトークン課金 API に有効。
- **`/gsd-edit-phase`** — `ROADMAP.md` 上の既存フェーズの任意フィールドをその場で編集(番号や位置は変更されない)。`--force` で確認 diff をスキップ、`depends_on` の参照を検証し、書き込み時に `STATE.md` も更新。
- **マージ後ビルド & テストゲート** — `execute-phase` のステップ 5.6 が `workflow.build_command` の設定を自動検出し、無ければ Xcode(`.xcodeproj`)、Makefile、Justfile、Cargo、Go、Python、npm の順にフォールバック。Xcode/iOS プロジェクトでは `xcodebuild build` と `xcodebuild test` を自動実行。並列・直列両モードで動作。
- **ランタイム別レビューモデル選択** — `review.models.<cli>` で各外部レビュー CLI(codex、gemini など)が使うモデルをプランナー/実行プロファイルとは独立に指定可能。
- **ワークストリーム設定の継承** — `GSD_WORKSTREAM` が設定されている場合、ルートの `.planning/config.json` を先に読み込み、ワークストリーム設定をディープマージ(衝突時はワークストリーム側が優先)。ワークストリーム設定で明示的に `null` を指定するとルート値を上書き可能。
- **手動カナリアリリースワークフロー** — `.github/workflows/canary.yml` が `workflow_dispatch` 経由で `dev` ブランチから `{base}-canary.{N}` ビルドを `@canary` dist-tag に手動公開(`get-shit-done-cc` と `@gsd-build/sdk`)。
- **スキルの統合:86 → 59** — 4つの新しいグループ化スキル(`capture`、`phase`、`config`、`workspace`)が31のマイクロスキルを吸収。既存の親スキル6つはラップアップやサブ操作をフラグ化:`update --sync/--reapply`、`sketch --wrap-up`、`spike --wrap-up`、`map-codebase --fast/--query`、`code-review --fix`、`progress --do/--next`。機能の欠損なし。
---
@@ -597,6 +599,7 @@ lmn012o feat(08-02): create registration endpoint
|---------|--------------|
| `/gsd-add-phase` | ロードマップにフェーズを追加 |
| `/gsd-insert-phase [N]` | フェーズ間に緊急作業を挿入 |
| `/gsd-edit-phase [N] [--force]` | 既存フェーズの任意フィールドをその場で編集 — 番号と位置は変更されない |
| `/gsd-remove-phase [N]` | 将来のフェーズを削除し番号を振り直し |
| `/gsd-list-phase-assumptions [N]` | 計画前にClaudeの意図するアプローチを確認 |
| `/gsd-plan-milestone-gaps` | 監査で見つかったギャップを埋めるフェーズを作成 |

README.ko-KR.md

@@ -75,15 +75,17 @@ GSD가 그걸 고칩니다. Claude Code를 신뢰할 수 있게 만드는 컨텍
내장 품질 게이트가 실제 문제를 잡아냅니다: 스키마 드리프트 감지는 마이그레이션 누락된 ORM 변경을 플래그하고, 보안 강제는 검증을 위협 모델에 고정시키고, 스코프 축소 감지는 플래너가 요구사항을 몰래 빠뜨리는 걸 방지합니다.
### v1.32.0 하이라이트
### v1.39.0 하이라이트
- **STATE.md 일관성 게이트** — `state validate`가 STATE.md와 파일시스템 간 드리프트를 감지, `state sync`가 실제 프로젝트 상태에서 재구성
- **`--to N` 플래그** — 자율 실행을 특정 단계 완료 후 중지
- **리서치 게이트** — RESEARCH.md에 미해결 질문이 있으면 기획을 차단
- **검증 마일스톤 스코프 필터링** — 이후 단계에서 처리될 격차는 "격차"가 아닌 "지연됨"으로 표시
- **읽기-후-편집 가드** — 비Claude 런타임에서 무한 재시도 루프를 방지하는 어드바이저리 훅
- **컨텍스트 축소** — 마크다운 잘라내기 및 캐시 친화적 프롬프트 순서로 토큰 사용량 절감
- **4개의 새 런타임** — Trae, Kilo, Augment, Cline (총 12개 런타임)
전체 목록은 [v1.39.0 릴리스 노트](https://github.com/gsd-build/get-shit-done/releases/tag/v1.39.0)를 참고하세요.
- **`--minimal` 설치 프로파일** — 별칭 `--core-only`. 메인 루프 6개 스킬(`new-project`, `discuss-phase`, `plan-phase`, `execute-phase`, `help`, `update`)만 설치하고 `gsd-*` 서브에이전트는 설치하지 않음. 콜드 스타트 시스템 프롬프트 오버헤드를 ~12k 토큰에서 ~700 토큰으로 축소(≥94% 감소). 32K–128K 컨텍스트의 로컬 LLM이나 토큰 과금 API에 유용.
- **`/gsd-edit-phase`** — `ROADMAP.md`에 있는 기존 단계의 임의 필드를 그 자리에서 수정(번호와 위치는 변경되지 않음). `--force`는 확인 diff를 건너뛰고, `depends_on` 참조를 검증하며 쓰기 시 `STATE.md`도 갱신.
- **머지 후 빌드 & 테스트 게이트** — `execute-phase` 5.6 단계가 `workflow.build_command` 설정을 우선 자동 감지하고, 없으면 Xcode(`.xcodeproj`), Makefile, Justfile, Cargo, Go, Python, npm 순으로 폴백. Xcode/iOS 프로젝트는 `xcodebuild build` 및 `xcodebuild test`를 자동 실행. 병렬·직렬 모드 모두에서 동작.
- **런타임별 리뷰 모델 선택** — `review.models.<cli>`로 각 외부 리뷰 CLI(codex, gemini 등)가 플래너/실행 프로파일과 독립적으로 자체 모델을 선택할 수 있음.
- **워크스트림 설정 상속** — `GSD_WORKSTREAM`이 설정되면 루트 `.planning/config.json`을 먼저 로드한 뒤 워크스트림 설정을 딥 머지(충돌 시 워크스트림 우선). 워크스트림 설정에서 명시적 `null`은 루트 값을 덮어씀.
- **수동 카나리 릴리스 워크플로** — `.github/workflows/canary.yml`이 `workflow_dispatch`로 `dev` 브랜치에서 `{base}-canary.{N}` 빌드를 `@canary` dist-tag로 수동 게시(`get-shit-done-cc`와 `@gsd-build/sdk`).
- **스킬 통합: 86 → 59** — 4개의 새로운 그룹 스킬(`capture`, `phase`, `config`, `workspace`)이 31개의 마이크로 스킬을 흡수. 기존 6개의 부모 스킬은 래퍼업/하위 동작을 플래그로 흡수: `update --sync/--reapply`, `sketch --wrap-up`, `spike --wrap-up`, `map-codebase --fast/--query`, `code-review --fix`, `progress --do/--next`. 기능 손실 없음.
---
@@ -594,6 +596,7 @@ lmn012o feat(08-02): create registration endpoint
|---------|------------|
| `/gsd-add-phase` | 로드맵에 단계 추가 |
| `/gsd-insert-phase [N]` | 단계 사이에 긴급 작업 삽입 |
| `/gsd-edit-phase [N] [--force]` | 기존 단계의 임의 필드를 그 자리에서 수정 — 번호와 위치는 그대로 |
| `/gsd-remove-phase [N]` | 미래 단계 제거, 번호 재정렬 |
| `/gsd-list-phase-assumptions [N]` | 기획 전 Claude의 의도된 접근 방식 확인 |
| `/gsd-plan-milestone-gaps` | 감사에서 발견된 갭을 해소하기 위한 단계 생성 |
README.md (34 changed lines)
@@ -4,7 +4,7 @@
**English** · [Português](README.pt-BR.md) · [简体中文](README.zh-CN.md) · [日本語](README.ja-JP.md) · [한국어](README.ko-KR.md)
**A light-weight and powerful meta-prompting, context engineering and spec-driven development system for Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, Cline, and CodeBuddy.**
**A light-weight and powerful meta-prompting, context engineering and spec-driven development system for Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, Hermes Agent, Cline, and CodeBuddy.**
**Solves context rot — the quality degradation that happens as Claude fills its context window.**
@@ -89,11 +89,17 @@ People who want to describe what they want and have it built correctly — witho
Built-in quality gates catch real problems: schema drift detection flags ORM changes missing migrations, security enforcement anchors verification to threat models, and scope reduction detection prevents the planner from silently dropping your requirements.
### v1.37.0 Highlights
### v1.39.0 Highlights
- **Spiking & sketching** — `/gsd-spike` runs 2–5 focused experiments with Given/When/Then verdicts; `/gsd-sketch` produces 2–3 interactive HTML mockup variants per design question — both store artifacts in `.planning/` and pair with wrap-up commands to package findings into project-local skills
- **Agent size-budget enforcement** — Tiered line-count limits (XL: 1 600, Large: 1 000, Default: 500) keep agent prompts lean; violations surface in CI
- **Shared boilerplate extraction** — Mandatory-initial-read and project-skills-discovery logic extracted to reference files, reducing duplication across a dozen agents
See the [v1.39.0 release notes](https://github.com/gsd-build/get-shit-done/releases/tag/v1.39.0) for the full list.

- **`--minimal` install profile** — alias `--core-only`, writes only the six main-loop skills (`new-project`, `discuss-phase`, `plan-phase`, `execute-phase`, `help`, `update`) and zero `gsd-*` subagents. Cuts cold-start system-prompt overhead from ~12k tokens to ~700 (≥94% reduction). Useful for local LLMs with 32K–128K context and token-billed APIs.
- **`/gsd-edit-phase`** — modify any field of an existing phase in `ROADMAP.md` in place, without changing its number or position. `--force` skips the confirmation diff; `depends_on` references are validated and `STATE.md` is updated on write.
- **Post-merge build & test gate** — `execute-phase` step 5.6 now auto-detects the build command from `workflow.build_command`, then falls back to Xcode (`.xcodeproj`), Makefile, Justfile, Cargo, Go, Python, or npm. Xcode/iOS projects get `xcodebuild build` + `xcodebuild test` automatically. Runs in both parallel and serial mode.
- **Per-runtime review-model selection** — `review.models.<cli>` lets each external review CLI (codex, gemini, etc.) pick its own model independently of the planner/executor profile.
- **Workstream config inheritance** — when `GSD_WORKSTREAM` is set, the root `.planning/config.json` is loaded first and deep-merged with the workstream config (workstream wins on conflict). Explicit `null` in a workstream config now correctly overrides a root value.
- **Manual canary release workflow** — `.github/workflows/canary.yml` publishes `{base}-canary.{N}` builds of `get-shit-done-cc` and `@gsd-build/sdk` to the `@canary` dist-tag from `dev` on demand via `workflow_dispatch` (see the sketch after this list).
- **Skill consolidation: 86 → 59** — four new grouped skills (`capture`, `phase`, `config`, `workspace`) absorb 31 micro-skills. Six existing parents absorb wrap-up and sub-operations as flags: `update --sync/--reapply`, `sketch --wrap-up`, `spike --wrap-up`, `map-codebase --fast/--query`, `code-review --fix`, `progress --do/--next`. Zero functional loss.
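Since the canary workflow is a plain `workflow_dispatch` workflow, one way to trigger it is via the GitHub CLI — assuming `gh` is authenticated and the workflow takes no required inputs (both assumptions; check the workflow file):

```bash
# Input names, if any, are defined in .github/workflows/canary.yml and
# are not shown in this document.
gh workflow run canary.yml --ref dev
```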
---
@@ -104,11 +110,11 @@ npx get-shit-done-cc@latest
```

The installer prompts you to choose:
1. **Runtime** — Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, CodeBuddy, Cline, or all (interactive multi-select — pick multiple runtimes in a single install session)
1. **Runtime** — Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, Hermes Agent, CodeBuddy, Cline, or all (interactive multi-select — pick multiple runtimes in a single install session)
2. **Location** — Global (all projects) or local (current project only)

Verify with:
- Claude Code / Gemini / Copilot / Antigravity / Qwen Code: `/gsd-help`
- Claude Code / Gemini / Copilot / Antigravity / Qwen Code / Hermes Agent: `/gsd-help`
- OpenCode / Kilo / Augment / Trae / CodeBuddy: `/gsd-help`
- Codex: `$gsd-help`
- Cline: GSD installs via `.clinerules` — verify by checking `.clinerules` exists
@@ -179,6 +185,10 @@ npx get-shit-done-cc --trae --local # Install to ./.trae/
npx get-shit-done-cc --qwen --global # Install to ~/.qwen/
npx get-shit-done-cc --qwen --local # Install to ./.qwen/

# Hermes Agent
npx get-shit-done-cc --hermes --global # Install to ~/.hermes/ (honors $HERMES_HOME)
npx get-shit-done-cc --hermes --local # Install to ./.hermes/

# CodeBuddy
npx get-shit-done-cc --codebuddy --global # Install to ~/.codebuddy/
npx get-shit-done-cc --codebuddy --local # Install to ./.codebuddy/
@@ -192,7 +202,7 @@ npx get-shit-done-cc --all --global # Install to all directories
```

Use `--global` (`-g`) or `--local` (`-l`) to skip the location prompt.
Use `--claude`, `--opencode`, `--gemini`, `--kilo`, `--codex`, `--copilot`, `--cursor`, `--windsurf`, `--antigravity`, `--augment`, `--trae`, `--qwen`, `--codebuddy`, `--cline`, or `--all` to skip the runtime prompt.
Use `--claude`, `--opencode`, `--gemini`, `--kilo`, `--codex`, `--copilot`, `--cursor`, `--windsurf`, `--antigravity`, `--augment`, `--trae`, `--qwen`, `--hermes`, `--codebuddy`, `--cline`, or `--all` to skip the runtime prompt.
The GSD SDK CLI (`gsd-sdk`) is installed automatically (required by `/gsd-*` commands). Pass `--no-sdk` to skip the SDK install, or `--sdk` to force a reinstall.

</details>
@@ -685,6 +695,7 @@ You're never locked in. The system adapts.

|---------|--------------|
| `/gsd-add-phase` | Append phase to roadmap |
| `/gsd-insert-phase [N]` | Insert urgent work between phases |
| `/gsd-edit-phase [N] [--force]` | Modify any field of an existing phase in place — number and position unchanged (usage sketch after this table) |
| `/gsd-remove-phase [N]` | Remove future phase, renumber |
| `/gsd-list-phase-assumptions [N]` | See Claude's intended approach before planning |
| `/gsd-plan-milestone-gaps` | Create phases to close gaps from audit |
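Illustrative `/gsd-edit-phase` invocations built from the flags in the table above (the phase number is an example):

```bash
/gsd-edit-phase 7            # edit fields of phase 7, review the confirmation diff
/gsd-edit-phase 7 --force    # apply the edit without the confirmation diff
```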
@@ -746,6 +757,8 @@ You're never locked in. The system adapts.

GSD stores project settings in `.planning/config.json`. Configure during `/gsd-new-project` or update later with `/gsd-settings`. For the full config schema, workflow toggles, git branching options, and per-agent model breakdown, see the [User Guide](docs/USER-GUIDE.md#configuration-reference).

When `GSD_WORKSTREAM` is set, GSD loads the root `.planning/config.json` first and deep-merges the workstream's `config.json` on top — workstream values win on conflict, and an explicit `null` in a workstream config overrides a root value.
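A minimal sketch of those merge semantics — not the shipped implementation; the key names are real config keys from this guide, the values are placeholders:

```bash
node -e '
// Workstream keys win; nested objects merge recursively; an explicit null
// in the workstream config overrides the root value instead of being ignored.
const merge = (root, ws) => {
  const out = { ...root };
  for (const [k, v] of Object.entries(ws)) {
    const bothObjects = v && typeof v === "object" && !Array.isArray(v)
      && out[k] && typeof out[k] === "object" && !Array.isArray(out[k]);
    out[k] = bothObjects ? merge(out[k], v) : v; // scalars, arrays, null all win
  }
  return out;
};
const root = { workflow: { use_worktrees: true, build_command: "make" } };
const ws   = { workflow: { build_command: null } };  // explicit null wins
console.log(JSON.stringify(merge(root, ws), null, 2));
'
```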
### Core Settings

| Setting | Options | Default | What it controls |
@@ -774,6 +787,8 @@ Use `inherit` when using non-Anthropic providers (OpenRouter, local models) or t

Or configure via `/gsd-settings`.

Per-runtime review-model overrides live under `review.models.<cli>` (e.g. `review.models.codex`, `review.models.gemini`) and let each external review CLI pick its own model independently of the planner/executor profile.
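A sketch of that resolution rule — the lookup order is the one described above; the model strings are placeholders, not real model IDs:

```bash
node -e '
const config = { review: { models: { codex:  "placeholder-model-a",
                                     gemini: "placeholder-model-b" } } };
// Per-CLI override wins; otherwise fall back to the profile model.
const modelFor = (cli, profileModel) =>
  config.review?.models?.[cli] ?? profileModel;
console.log(modelFor("codex",     "profile-default"));  // placeholder-model-a
console.log(modelFor("other-cli", "profile-default"));  // profile-default
'
```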
### Workflow Agents

These spawn additional agents during planning/execution. They improve quality but add tokens and time.
@@ -789,6 +804,7 @@ These spawn additional agents during planning/execution. They improve quality bu
| `workflow.skip_discuss` | `false` | Skip discuss-phase in autonomous mode |
| `workflow.text_mode` | `false` | Text-only mode for remote sessions (no TUI menus) |
| `workflow.use_worktrees` | `true` | Toggle worktree isolation for execution |
| `workflow.build_command` | _(auto-detect)_ | Override the post-merge build gate command. Falls back to Xcode (`.xcodeproj`), Makefile, Justfile, Cargo, Go, Python, or npm; Xcode/iOS projects also run `xcodebuild test`. |
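A sketch of that fallback order. The detection markers follow the order in the row above, but the exact commands the gate runs are not specified here, so treat them as placeholders:

```bash
detect_build_command() {
  # Explicit override wins (workflow.build_command from .planning/config.json).
  [ -n "$configured_build_command" ] && { echo "$configured_build_command"; return; }
  ls ./*.xcodeproj >/dev/null 2>&1 && { echo "xcodebuild build"; return; }
  [ -f Makefile ]       && { echo "make"; return; }
  [ -f Justfile ]       && { echo "just"; return; }
  [ -f Cargo.toml ]     && { echo "cargo build"; return; }
  [ -f go.mod ]         && { echo "go build ./..."; return; }
  [ -f pyproject.toml ] && { echo "python -m build"; return; }
  [ -f package.json ]   && { echo "npm run build"; return; }
  return 1  # no build system detected — the gate is skipped
}
```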
Use `/gsd-settings` to toggle these, or override per-invocation:
- `/gsd-plan-phase --skip-research`

@@ -919,6 +935,7 @@ npx get-shit-done-cc --antigravity --global --uninstall
npx get-shit-done-cc --augment --global --uninstall
npx get-shit-done-cc --trae --global --uninstall
npx get-shit-done-cc --qwen --global --uninstall
npx get-shit-done-cc --hermes --global --uninstall
npx get-shit-done-cc --codebuddy --global --uninstall
npx get-shit-done-cc --cline --global --uninstall
@@ -935,6 +952,7 @@ npx get-shit-done-cc --antigravity --local --uninstall
npx get-shit-done-cc --augment --local --uninstall
npx get-shit-done-cc --trae --local --uninstall
npx get-shit-done-cc --qwen --local --uninstall
npx get-shit-done-cc --hermes --local --uninstall
npx get-shit-done-cc --codebuddy --local --uninstall
npx get-shit-done-cc --cline --local --uninstall
```
@@ -73,15 +73,17 @@ Para quem quer descrever o que precisa e receber isso construído do jeito certo

Built-in quality gates catch real problems: schema-drift detection flags ORM changes without migrations, security anchors verification to threat models, and scope-reduction detection stops the planner from silently dropping requirements.

### v1.32.0 Highlights
### v1.39.0 Highlights

- **STATE.md consistency gates** — `state validate` detects divergence between STATE.md and the filesystem; `state sync` rebuilds from the actual project state
- **`--to N` flag** — stops autonomous execution after completing a specific phase
- **Research gate** — blocks planning while RESEARCH.md has unresolved open questions
- **Verifier scope filter** — gaps addressed in later phases are marked "deferred", not flagged as gaps
- **Read-before-edit guard** — advisory hook prevents infinite retry loops on non-Claude runtimes
- **Context reduction** — Markdown truncation and cache-friendly prompt ordering for lower token usage
- **4 new runtimes** — Trae, Kilo, Augment, and Cline (12 runtimes total)
Full list in the [v1.39.0 release notes](https://github.com/gsd-build/get-shit-done/releases/tag/v1.39.0).

- **`--minimal` install profile** — alias `--core-only`. Installs only the six main-loop skills (`new-project`, `discuss-phase`, `plan-phase`, `execute-phase`, `help`, `update`) and no `gsd-*` subagents. Cuts cold-start system-prompt overhead from ~12k to ~700 tokens (≥94% reduction). Useful for local LLMs with 32K–128K context and token-billed APIs.
- **`/gsd-edit-phase`** — edit any field of an existing phase in `ROADMAP.md` in place, without changing its number or position. `--force` skips the confirmation diff; `depends_on` references are validated and `STATE.md` is updated on write.
- **Post-merge build & test gate** — `execute-phase` step 5.6 now auto-detects the build command from `workflow.build_command`, falling back to Xcode (`.xcodeproj`), Makefile, Justfile, Cargo, Go, Python, or npm. Xcode/iOS projects run `xcodebuild build` and `xcodebuild test` automatically. Works in both parallel and serial mode.
- **Per-runtime review model** — `review.models.<cli>` lets each external review CLI (codex, gemini, etc.) choose its own model, independent of the planner/executor profile.
- **Workstream config inheritance** — when `GSD_WORKSTREAM` is set, the root `.planning/config.json` is loaded first and deep-merged with the workstream config (workstream wins on conflict). An explicit `null` in the workstream config correctly overrides the root value.
- **Manual canary release workflow** — `.github/workflows/canary.yml` publishes `{base}-canary.{N}` builds of `get-shit-done-cc` and `@gsd-build/sdk` to the `@canary` dist-tag from `dev`, on demand via `workflow_dispatch`.
- **Skill consolidation: 86 → 59** — 4 new grouped skills (`capture`, `phase`, `config`, `workspace`) absorb 31 micro-skills. 6 existing parent skills absorb wrap-up and sub-operations as flags: `update --sync/--reapply`, `sketch --wrap-up`, `spike --wrap-up`, `map-codebase --fast/--query`, `code-review --fix`, `progress --do/--next`. No functional loss.

---
@@ -73,15 +73,38 @@ GSD 解决的就是这个问题。它是让 Claude Code 变得可靠的上下文

For people who want to spell out what they need and have it built correctly — not people pretending to run a 50-person engineering organization.

### v1.32.0 Highlights
### v1.39.0 Highlights

- **STATE.md consistency gates** — `state validate` detects divergence between STATE.md and the filesystem; `state sync` rebuilds from the actual project state
- **`--to N` flag** — stops autonomous execution after completing a specific phase
- **Research gate** — blocks planning while RESEARCH.md has unresolved open questions
- **Verifier milestone scope filter** — gaps addressed in later phases are marked "deferred", not flagged as gaps
- **Read-before-edit guard** — advisory hook prevents infinite retry loops on non-Claude runtimes
- **Context reduction** — Markdown truncation and cache-friendly prompt ordering for lower token usage
- **4 new runtimes** — Trae, Kilo, Augment, and Cline (12 runtimes total)
Full list in the [v1.39.0 release notes](https://github.com/gsd-build/get-shit-done/releases/tag/v1.39.0).

- **`--minimal` install profile** — alias `--core-only`. Installs only the six main-loop skills (`new-project`, `discuss-phase`, `plan-phase`, `execute-phase`, `help`, `update`) and no `gsd-*` subagents. Cuts cold-start system-prompt overhead from ~12k to ~700 tokens (≥94% reduction). Suited to local LLMs with 32K–128K context and token-billed APIs.
- **`/gsd-edit-phase`** — modify any field of an existing phase in `ROADMAP.md` in place, without changing its number or position. `--force` skips the confirmation diff; `depends_on` references are validated and `STATE.md` is updated on write.
- **Post-merge build & test gate** — `execute-phase` step 5.6 auto-detects the `workflow.build_command` setting first, then falls back to Xcode (`.xcodeproj`), Makefile, Justfile, Cargo, Go, Python, or npm. Xcode/iOS projects run `xcodebuild build` and `xcodebuild test` automatically. Works in both parallel and serial mode.
- **Per-runtime review-model selection** — `review.models.<cli>` lets each external review CLI (codex, gemini, etc.) pick its own model independently of the planner/executor profile.
- **Workstream config inheritance** — when `GSD_WORKSTREAM` is set, the root `.planning/config.json` is loaded first and deep-merged with the workstream config (workstream wins on conflict). An explicit `null` in the workstream config overrides the root value.
- **Manual canary release workflow** — `.github/workflows/canary.yml` publishes `{base}-canary.{N}` builds of `get-shit-done-cc` and `@gsd-build/sdk` to the `@canary` dist-tag from the `dev` branch on demand via `workflow_dispatch`.
- **Skill consolidation: 86 → 59** — 4 new grouped skills (`capture`, `phase`, `config`, `workspace`) absorb 31 micro-skills. 6 existing parent skills fold wrap-up and sub-operations into flags: `update --sync/--reapply`, `sketch --wrap-up`, `spike --wrap-up`, `map-codebase --fast/--query`, `code-review --fix`, `progress --do/--next`. No loss of functionality.

---
@@ -589,6 +591,7 @@ lmn012o feat(08-02): create registration endpoint

|------|------|
| `/gsd-add-phase` | Append a phase to the end of the roadmap |
| `/gsd-insert-phase [N]` | Insert urgent work between phases |
| `/gsd-edit-phase [N] [--force]` | Modify any field of an existing phase in place — number and position unchanged |
| `/gsd-remove-phase [N]` | Remove a future phase and renumber |
| `/gsd-list-phase-assumptions [N]` | See Claude's intended approach before planning |
| `/gsd-plan-milestone-gaps` | Create phases for gaps found by an audit |
@@ -67,15 +67,38 @@ main ← stable, always deployable

### Patch Release (Hotfix)

For critical bugs that can't wait for the next minor release.
For fixes that need to ship without waiting for the next minor.

1. Trigger `hotfix.yml` with version (e.g., `1.27.1`)
2. Workflow creates `hotfix/1.27.1` branch from the latest patch tag for that minor version (e.g., `v1.27.0` or `v1.27.1`)
3. Cherry-pick or apply fix on the hotfix branch
4. Push — CI runs tests automatically
5. Trigger `hotfix.yml` finalize action
6. Workflow runs full test suite, bumps version, tags, publishes to `latest`
7. Merge hotfix branch back to main
A hotfix `vX.YY.Z` cumulatively includes everything in `vX.YY.{Z-1}` plus every `fix:`/`chore:` commit landed on `main` since that base. The base tag is the anchor — `git cherry $BASE_TAG main` reveals exactly which commits are still unshipped, and the new `vX.YY.Z` tag becomes the next hotfix's base, so the cycle is self-documenting.
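For example, with a hypothetical base tag, the still-unshipped commits are visible with:

```bash
# v1.27.1 here is an example base tag. Lines prefixed "+" are commits on
# main whose changes are not patch-equivalent to anything in the base.
BASE_TAG=v1.27.1
git cherry -v "$BASE_TAG" main | grep '^+'
```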
#### Two paths

**Path A — `hotfix.yml` (canonical, two-step):**

1. Trigger `hotfix.yml` with `action=create`, `version=1.27.1`, `auto_cherry_pick=true` (default).
   - Workflow detects `BASE_TAG` = highest `v1.27.*` < `v1.27.1` (so `1.27.1` branches from `v1.27.0`; `1.27.2` would branch from `v1.27.1`).
   - Branches `hotfix/1.27.1` from `BASE_TAG`.
   - Auto-cherry-picks every `fix:`/`chore:` commit on `origin/main` not already in the base, oldest-first. Patch-equivalents are skipped via `git cherry`. `feat:`/`refactor:` are **never** auto-included.
   - On conflict the workflow halts with the offending SHA. Resolve manually on the branch, then re-run finalize with `auto_cherry_pick=false`.
   - Bumps `package.json` (and `sdk/package.json`), pushes the branch, and lists every included SHA in the run summary.
2. (Optional) push additional manual commits to `hotfix/1.27.1`.
3. Trigger `hotfix.yml` with `action=finalize`. The workflow:
   - Runs `install-smoke` cross-platform gate.
   - Runs full test suite + coverage.
   - Builds SDK, bundles `sdk-bundle/gsd-sdk.tgz` inside the CC tarball (parity with `release-sdk.yml`).
   - Tags `v1.27.1`, publishes to `@latest`, re-points `@next → v1.27.1`.
   - Opens merge-back PR against `main`.

**Path B — `release-sdk.yml` (stopgap, one-shot):**

Active while the `@gsd-build/sdk` npm token is unavailable; bundles the SDK inside the CC tarball.

1. Trigger `release-sdk.yml` with `action=hotfix`, `version=1.27.1`, `auto_cherry_pick=true`.
   - The `prepare` job creates the branch and cherry-picks (same logic as Path A).
   - `install-smoke` runs against the new branch.
   - The `release` job tags, publishes to `@latest`, re-points `@next`, opens merge-back PR.
   - Idempotent: if `hotfix/1.27.1` already exists (e.g. you ran `hotfix.yml create` first), the prepare job checks it out and re-runs cherry-pick as a no-op.
2. `dry_run=true` exercises the full pipeline without pushing the branch or publishing.

### Minor Release (Standard Cycle)
@@ -231,39 +231,63 @@ test -n "$branch" || { echo "Detached HEAD is not supported for review-fix (#268
sentinel="${phase_dir}/.review-fix-recovery-pending.json"
if [ -f "$sentinel" ]; then
  echo "Detected pre-existing recovery sentinel from a prior interrupted run: $sentinel"
  prior_wt=$(node -e '
  # Recovery must extract BOTH worktree_path AND reviewfix_branch (#3001 CR):
  # if a prior run died after `git worktree remove` but before
  # `git branch -D`, the orphan branch survives and clutters `git branch`
  # output forever. Emit both fields newline-separated so we can read them
  # independently.
  prior_recovery=$(node -e '
    const fs = require("fs");
    try {
      const parsed = JSON.parse(fs.readFileSync(process.argv[1], "utf-8"));
      process.stdout.write(parsed.worktree_path || "");
      process.stdout.write((parsed.worktree_path || "") + "\n" + (parsed.reviewfix_branch || ""));
    } catch (err) {
      process.stderr.write(`Warning: malformed recovery sentinel ${process.argv[1]}: ${err.message}\n`);
      process.stdout.write("");
      process.stdout.write("\n");
    }
  ' "$sentinel")
  prior_wt="$(printf '%s' "$prior_recovery" | sed -n '1p')"
  prior_branch="$(printf '%s' "$prior_recovery" | sed -n '2p')"
  if [ -n "$prior_wt" ] && git worktree list --porcelain | grep -q "^worktree $prior_wt$"; then
    echo "Removing orphan worktree from prior run: $prior_wt"
    git worktree remove "$prior_wt" --force || true
  fi
  if [ -n "$prior_branch" ]; then
    # Best-effort: branch may already be gone (cleaned by an earlier
    # partial recovery, or never created if `git worktree add -b` itself
    # failed). `|| true` keeps recovery non-fatal.
    echo "Removing orphan reviewfix branch from prior run: $prior_branch"
    git branch -D "$prior_branch" 2>/dev/null || true
  fi
  rm -f "$sentinel"
fi
wt=$(mktemp -d "/tmp/sv-${padded_phase}-reviewfix-XXXXXX")
git worktree add "$wt" "$branch"

# Create a temp branch from the current branch tip so the worktree
# attaches to that NEW branch rather than the user's currently-checked-out
# branch (#2990: git refuses to check out the same branch in two
# worktrees by default; the original `git worktree add "$wt" "$branch"`
# failed before the agent could do any work). The temp branch shares
# history with $branch up to the moment of creation, so commits made
# inside the worktree fast-forward $branch on cleanup.
reviewfix_branch="gsd-reviewfix/${padded_phase}-$$"
git worktree add -b "$reviewfix_branch" "$wt" "$branch"

# Write the recovery sentinel ONLY AFTER `git worktree add` succeeds.
# Writing it before would leave a sentinel pointing at a worktree that does
# not exist if `git worktree add` itself failed.
node -e '
  const fs = require("fs");
  const [sentinelPath, worktree_path, branch, padded_phase] = process.argv.slice(1);
  const [sentinelPath, worktree_path, branch, reviewfix_branch, padded_phase] = process.argv.slice(1);
  fs.writeFileSync(sentinelPath, JSON.stringify({
    worktree_path,
    branch,
    reviewfix_branch,
    padded_phase,
    started_at: new Date().toISOString()
  }, null, 2));
' "$sentinel" "$wt" "$branch" "$padded_phase"
' "$sentinel" "$wt" "$branch" "$reviewfix_branch" "$padded_phase"

cd "$wt"
```
@@ -271,32 +295,64 @@ cd "$wt"
Concrete steps:
1. Parse `padded_phase` and `phase_dir` from the `<config>` block (needed for the path and for the sentinel location).
2. Resolve the current branch: `branch=$(git branch --show-current)`. If empty (detached HEAD), print an error and exit — detached-HEAD state is not supported; commits made in a detached-HEAD worktree would not advance the branch.
3. **Recovery check (#2839):** If `${phase_dir}/.review-fix-recovery-pending.json` already exists, a prior run was interrupted. Parse the JSON, attempt to remove the orphan worktree it points at (best-effort, with `--force`), then delete the stale sentinel before continuing. This makes a re-run of `/gsd-code-review-fix` self-healing.
3. **Recovery check (#2839, #2990):** If `${phase_dir}/.review-fix-recovery-pending.json` already exists, a prior run was interrupted. Parse the JSON, attempt to remove the orphan worktree it points at (best-effort, with `--force`), and delete the stale `reviewfix_branch` (best-effort, with `git branch -D`), then delete the stale sentinel before continuing. This makes a re-run of `/gsd-code-review-fix` self-healing.
4. Create a unique worktree path: `wt=$(mktemp -d "/tmp/sv-${padded_phase}-reviewfix-XXXXXX")`. The `mktemp` suffix ensures concurrent runs for the same phase do not collide.
5. Run `git worktree add "$wt" "$branch"` — this attaches the worktree to the current branch so commits advance it.
6. **Write the recovery sentinel** at `${phase_dir}/.review-fix-recovery-pending.json` containing `{worktree_path, branch, padded_phase, started_at}`. Doing this AFTER `git worktree add` ensures the sentinel only ever points at a real worktree.
7. All subsequent file reads, edits, and commits happen inside `$wt`.
5. Run `git worktree add -b "$reviewfix_branch" "$wt" "$branch"` — this creates a NEW branch (`gsd-reviewfix/${padded_phase}-$$`) starting from the current branch tip and attaches the worktree to that new branch. Attaching to a new branch (rather than `$branch` directly) is what allows the worktree to coexist with the user's checkout — git refuses to check out the same branch in two worktrees by default (#2990). Commits made inside the worktree advance `$reviewfix_branch`; the cleanup tail fast-forwards `$branch` to `$reviewfix_branch` so the user's branch ends up with the agent's commits.
6. **Write the recovery sentinel** at `${phase_dir}/.review-fix-recovery-pending.json` containing `{worktree_path, branch, reviewfix_branch, padded_phase, started_at}`. Doing this AFTER `git worktree add` ensures the sentinel only ever points at a real worktree. The sentinel includes `reviewfix_branch` so recovery can clean both the orphan worktree AND its temp branch (an example sentinel shape follows this list).
7. All subsequent file reads, edits, and commits happen inside `$wt` (which is on `$reviewfix_branch`, not `$branch`).
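For reference, the sentinel written in step 6 has this shape — field names come from the writer above; the values here are illustrative:

```bash
# Example contents of ${phase_dir}/.review-fix-recovery-pending.json:
# {
#   "worktree_path":    "/tmp/sv-07-reviewfix-Ab12Cd",
#   "branch":           "feature/login",
#   "reviewfix_branch": "gsd-reviewfix/07-48213",
#   "padded_phase":     "07",
#   "started_at":       "2026-01-01T00:00:00.000Z"
# }
```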
**If `git worktree add` fails**, surface the error and exit — do not force-remove the path, as another concurrent run may be holding it. Do not write the sentinel (the worktree does not exist).
**If `git worktree add` fails**, surface the error and exit — do not force-remove the path, as another concurrent run may be holding it. Do not write the sentinel (the worktree does not exist). Do not delete `$reviewfix_branch` either; if `-b` failed, no temp branch was created.

**Cleanup tail (transactional, ALWAYS — even on failure):** After writing REVIEW-FIX.md and before returning to the orchestrator, run the two-step cleanup in this exact order:
**Cleanup tail (transactional, ALWAYS — even on failure):** After writing REVIEW-FIX.md and before returning to the orchestrator, run the cleanup in this exact order:
```bash
# Step 1: drop the worktree FIRST. If this succeeds and the process is then
# killed, the next run finds a sentinel pointing at a worktree that no longer
# exists — the recovery branch handles this gracefully (best-effort remove +
# sentinel delete). If we reversed the order (sentinel removed first, then
# worktree remove), an interruption between the two steps would leave NO
# sentinel and an orphan worktree — exactly the bug from #2839.
# Step 1 (#2990): fast-forward $branch to capture the commits the agent
# made on $reviewfix_branch. Run from the main repo (not $wt) — the user's
# checkout owns $branch. --ff-only ensures we never silently drop or
# rewrite history if the user committed to $branch concurrently; on
# divergence, this fails loudly and the temp branch is left for the
# user to inspect/merge manually. We deliberately resolve the main repo
# path via `git worktree list --porcelain` rather than assuming $PWD,
# because the agent ran inside $wt.
# Strip the literal "worktree " prefix and print the rest of the line, then
# exit on the first match. This preserves paths that contain spaces
# (awk '$2' would truncate "/path/with spaces/repo" to "/path/with").
main_repo="$(git worktree list --porcelain | awk '/^worktree / { sub(/^worktree /, ""); print; exit }')"
ff_status=0
# Capture the exit code of `git merge` directly. `if ! cmd; then ff_status=$?`
# captures the exit status of the `!` pipeline (0 when the inner cmd
# failed) — masking the real merge exit code. Use the success/else split
# instead so $? in the else-branch is the merge command's exit code.
if git -C "$main_repo" merge --ff-only "$reviewfix_branch" 2>&1; then
  ff_status=0
else
  ff_status=$?
  echo "WARN: could not fast-forward $branch to $reviewfix_branch (exit $ff_status)."
  echo "      The temp branch $reviewfix_branch is preserved for manual merge."
fi

# Step 2: drop the worktree. If this succeeds and the process is then
# killed, the next run finds a sentinel pointing at a worktree that no
# longer exists — the recovery branch handles this gracefully (best-effort
# remove + sentinel delete). If we reversed the order (sentinel removed
# first, then worktree remove), an interruption between the two steps
# would leave NO sentinel and an orphan worktree — exactly the bug from
# #2839.
git worktree remove "$wt" --force

# Step 2: drop the recovery sentinel ONLY after `git worktree remove` returns
# successfully. This atomic-ish ordering is what makes the cleanup tail
# transactional from the orchestrator's perspective.
# Step 3: delete the temp branch ONLY if the fast-forward succeeded. If
# it didn't, leaving the branch lets the user inspect/merge manually.
if [ "$ff_status" -eq 0 ]; then
  git -C "$main_repo" branch -D "$reviewfix_branch" || true
fi

# Step 4: drop the recovery sentinel ONLY after `git worktree remove`
# returns successfully. This atomic-ish ordering is what makes the
# cleanup tail transactional from the orchestrator's perspective.
rm -f "$sentinel"
```
This cleanup is unconditional — register it mentally as a finally-block obligation. If the agent exits early (config error, no findings, etc.), still run the two-step cleanup tail (`git worktree remove "$wt" --force` followed by `rm -f "$sentinel"`) before exit. The sentinel must NEVER be removed before `git worktree remove` succeeds.
This cleanup is unconditional — register it mentally as a finally-block obligation. If the agent exits early (config error, no findings, etc.), still run the cleanup tail in order (fast-forward → worktree remove → temp branch delete → sentinel rm) before exit. The sentinel must NEVER be removed before `git worktree remove` succeeds. The temp branch must NEVER be deleted while the fast-forward is in a diverged state.
</step>
<step name="load_context">
@@ -528,9 +584,9 @@ _Iteration: {N}_

<critical_rules>

**ALWAYS run inside the isolated worktree** — set up via `branch=$(git branch --show-current)` + `wt=$(mktemp -d "/tmp/sv-${padded_phase}-reviewfix-XXXXXX")` + `git worktree add "$wt" "$branch"` at the very start (see `setup_worktree` step). Using `mktemp` ensures concurrent runs do not collide. Attaching to `$branch` (not `HEAD`) ensures commits advance the branch. Every file read, edit, and commit must happen inside `$wt`. Run `git worktree remove "$wt" --force` unconditionally when done (treat it as a finally block). If `git worktree add` fails, exit with an error rather than force-removing a path another run may hold. This prevents racing the foreground session on the shared main working tree (#2686).
**ALWAYS run inside the isolated worktree** — set up via `branch=$(git branch --show-current)` + `wt=$(mktemp -d "/tmp/sv-${padded_phase}-reviewfix-XXXXXX")` + `git worktree add -b "$reviewfix_branch" "$wt" "$branch"` at the very start (see `setup_worktree` step). Using `mktemp` ensures concurrent runs do not collide. Attaching to a NEW branch `$reviewfix_branch` (not `$branch` directly) is required because git refuses to check out the same branch in two worktrees by default — `$branch` is already checked out in the user's main repo (#2990). Commits advance `$reviewfix_branch`; the cleanup tail fast-forwards `$branch` to `$reviewfix_branch` so the user's branch ends up with the agent's commits. Every file read, edit, and commit must happen inside `$wt`. Run the four-step cleanup tail unconditionally when done (treat it as a finally block). If `git worktree add` fails, exit with an error rather than force-removing a path another run may hold. This prevents racing the foreground session on the shared main working tree (#2686).

**ALWAYS run the transactional cleanup tail in order** (#2839): `git worktree remove "$wt" --force` MUST happen BEFORE `rm -f "$sentinel"` (the recovery sentinel at `${phase_dir}/.review-fix-recovery-pending.json`). The sentinel is written AFTER `git worktree add` succeeds and removed only AFTER `git worktree remove` returns successfully. This ordering is what makes the cleanup tail transactional — an interruption between commits and `git worktree remove` leaves the sentinel behind so a future run, `/gsd-resume-work`, or `/gsd-progress` can detect and complete the recovery. Reversing the order recreates the orphan-worktree bug.
**ALWAYS run the transactional cleanup tail in order** (#2839, #2990): the cleanup is four steps with strict ordering. (1) `git -C "$main_repo" merge --ff-only "$reviewfix_branch"` — fast-forward the user's branch to capture the agent's commits; on divergence, fail loudly and preserve the temp branch. (2) `git worktree remove "$wt" --force`. (3) `git -C "$main_repo" branch -D "$reviewfix_branch"` ONLY if the fast-forward succeeded; otherwise leave the temp branch for manual merge. (4) `rm -f "$sentinel"` (the recovery sentinel at `${phase_dir}/.review-fix-recovery-pending.json`). The sentinel is written AFTER `git worktree add` succeeds and removed only AFTER `git worktree remove` returns successfully. The temp branch is deleted only when the fast-forward succeeded. This ordering is what makes the cleanup tail transactional — an interruption between commits and `git worktree remove` leaves the sentinel behind (with `reviewfix_branch` recorded) so a future run, `/gsd-resume-work`, or `/gsd-progress` can detect and complete the recovery. Reversing the order recreates the orphan-worktree bug.

**ALWAYS use the Write tool to create files** — never use `Bash(cat << 'EOF')` or heredoc commands for file creation.
@@ -358,6 +358,30 @@ If RED or GREEN gate commits are missing, add a warning to SUMMARY.md under a `#

<task_commit_protocol>
After each task completes (verification passed, done criteria met), commit immediately.

**0. Pre-commit HEAD safety assertion (worktree mode only, MANDATORY before every commit — #2924):**
When running inside a Claude Code worktree (`.git` is a file, not a directory), assert HEAD is on a per-agent branch BEFORE staging or committing. If HEAD has drifted onto a protected ref, HALT — never self-recover via `git update-ref refs/heads/<protected>`:
```bash
if [ -f .git ]; then # worktree
  HEAD_REF=$(git symbolic-ref --quiet HEAD || echo "DETACHED")
  ACTUAL_BRANCH=$(git rev-parse --abbrev-ref HEAD)
  # Deny-list: never commit on a protected ref.
  if [ "$HEAD_REF" = "DETACHED" ] || \
     echo "$ACTUAL_BRANCH" | grep -Eq '^(main|master|develop|trunk|release/.*)$'; then
    echo "FATAL: refusing to commit — worktree HEAD is on '$ACTUAL_BRANCH' (expected per-agent branch)." >&2
    echo "DO NOT use 'git update-ref' to rewind the protected branch — surface as blocker (#2924)." >&2
    exit 1
  fi
  # Positive allow-list: HEAD must be on the canonical Claude Code worktree-agent
  # branch namespace (`worktree-agent-<id>`). This catches feature/* and any other
  # arbitrary branch that the deny-list would silently allow (#2924).
  if ! echo "$ACTUAL_BRANCH" | grep -Eq '^worktree-agent-[A-Za-z0-9._/-]+$'; then
    echo "FATAL: refusing to commit — worktree HEAD '$ACTUAL_BRANCH' is not in the worktree-agent-* namespace." >&2
    echo "Agent commits must live on per-agent branches; surface as blocker (#2924)." >&2
    exit 1
  fi
fi
```
**1. Check modified files:** `git status --short`

**2. Stage task-related files individually** (NEVER `git add .` or `git add -A`):
@@ -426,6 +450,15 @@ back, those deletions appear on the main branch, destroying prior-wave work (#20
- `git rm` on files not explicitly created by the current task
- `git checkout -- .` or `git restore .` (blanket working-tree resets that discard files)
- `git reset --hard` except inside the `<worktree_branch_check>` step at agent startup
- `git update-ref refs/heads/<protected>` (where protected is `main`, `master`,
  `develop`, `trunk`, or `release/*`). This is an absolute prohibition (#2924).
  If you discover that your worktree HEAD is attached to a protected branch and your
  commits landed there, **DO NOT** "recover" by force-rewinding the protected ref —
  that silently destroys concurrent commits in multi-active scenarios (parallel
  agents, user committing while you run). HALT and surface a blocker. The setup-time
  `<worktree_branch_check>` and per-commit `<pre_commit_head_assertion>` are the
  correct prevention; if either fails, the workflow MUST stop, not self-heal.
- `git push --force` / `git push -f` to any branch you did not create.

If you need to discard changes to a specific file you modified during this task, use:
```bash
1126
bin/install.js
1126
bin/install.js
File diff suppressed because one or more lines are too long
@@ -30,6 +30,7 @@ Does not require `/gsd-new-project` — auto-creates `.planning/sketches/` if ne

<execution_context>
@~/.claude/get-shit-done/workflows/sketch.md
@~/.claude/get-shit-done/workflows/sketch-wrap-up.md
@~/.claude/get-shit-done/references/ui-brand.md
@~/.claude/get-shit-done/references/sketch-theme-system.md
@~/.claude/get-shit-done/references/sketch-interactivity.md
@@ -50,6 +51,9 @@ Design idea: $ARGUMENTS
</context>

<process>
Execute the sketch workflow from @~/.claude/get-shit-done/workflows/sketch.md end-to-end.
Parse the first token of $ARGUMENTS:
- If it is `--wrap-up`: strip the flag, execute the sketch-wrap-up workflow from @~/.claude/get-shit-done/workflows/sketch-wrap-up.md end-to-end.
- Otherwise: execute the sketch workflow from @~/.claude/get-shit-done/workflows/sketch.md end-to-end.

Preserve all workflow gates (intake, decomposition, target stack research, variant evaluation, MANIFEST updates, commit patterns).
</process>
@@ -30,6 +30,7 @@ Does not require `/gsd-new-project` — auto-creates `.planning/spikes/` if need

<execution_context>
@~/.claude/get-shit-done/workflows/spike.md
@~/.claude/get-shit-done/workflows/spike-wrap-up.md
@~/.claude/get-shit-done/references/ui-brand.md
</execution_context>

@@ -47,6 +48,9 @@ Idea: $ARGUMENTS
</context>

<process>
Execute the spike workflow from @~/.claude/get-shit-done/workflows/spike.md end-to-end.
Parse the first token of $ARGUMENTS:
- If it is `--wrap-up`: strip the flag, execute the spike-wrap-up workflow from @~/.claude/get-shit-done/workflows/spike-wrap-up.md.
- Otherwise: pass all of $ARGUMENTS as the idea to the spike workflow from @~/.claude/get-shit-done/workflows/spike.md end-to-end.

Preserve all workflow gates (prior spike check, decomposition, research, risk ordering, observability assessment, verification, MANIFEST updates, commit patterns).
</process>
@@ -257,12 +257,13 @@ See [`docs/INVENTORY.md`](INVENTORY.md#hooks-11-shipped) for the authoritative 1

### CLI Tools (`get-shit-done/bin/`)

Node.js CLI utility (`gsd-tools.cjs`) with domain modules split across `get-shit-done/bin/lib/` (see [`docs/INVENTORY.md`](INVENTORY.md#cli-modules-24-shipped) for the authoritative roster):
Node.js CLI utility (`gsd-tools.cjs`) with domain modules split across `get-shit-done/bin/lib/` (see [`docs/INVENTORY.md`](INVENTORY.md#cli-modules-33-shipped) for the authoritative roster):

| Module | Responsibility |
| ---------------------- | --------------------------------------------------------------------------------------------------- |
| `core.cjs` | Error handling, output formatting, shared utilities |
| `core.cjs` | Error handling, output formatting, shared utilities; compatibility re-exports for planning helpers |
| `planning-workspace.cjs` | Planning seam (`planningDir`, `planningPaths`, active workstream routing, `.planning/.lock`) |
| `state.cjs` | STATE.md parsing, updating, progression, metrics |
| `phase.cjs` | Phase directory operations, decimal numbering, plan indexing |
| `roadmap.cjs` | ROADMAP.md parsing, phase extraction, plan progress |
@@ -578,7 +579,7 @@ The installer (`bin/install.js`, ~3,000 lines) handles:
- Augment Code: Skills-first with full skill conversion and config management
5. **Path normalization** — Replaces `~/.claude/` paths with runtime-specific paths
6. **Settings integration** — Registers hooks in runtime's `settings.json`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-reapply-patches`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-update --reapply`
8. **Manifest tracking** — Writes `gsd-file-manifest.json` for clean uninstall
9. **Uninstall mode** — `--uninstall` removes all GSD files, hooks, and settings
@@ -452,9 +452,10 @@ User-facing entry point: `/gsd-graphify` (see [Command Reference](COMMANDS.md#gs

| Module | File | Exports |
|--------|------|---------|
| Core | `lib/core.cjs` | `error()`, `output()`, `parseArgs()`, shared utilities |
| Core | `lib/core.cjs` | `error()`, `output()`, `parseArgs()`, shared utilities, compatibility re-exports |
| State | `lib/state.cjs` | All `state` subcommands, `state-snapshot` |
| Phase | `lib/phase.cjs` | Phase CRUD, `find-phase`, `phase-plan-index`, `phases list` |
| Planning Workspace | `lib/planning-workspace.cjs` | Planning seam: `planningDir`, `planningPaths`, active workstream routing, `.planning/.lock` |
| Roadmap | `lib/roadmap.cjs` | Roadmap parsing, phase extraction, progress updates |
| Config | `lib/config.cjs` | Config read/write, section initialization |
| Verify | `lib/verify.cjs` | All verification and validation commands |
@@ -191,6 +191,7 @@ All workflow toggles follow the **absent = enabled** pattern. If a key is missin
| `workflow.skip_discuss` | boolean | `false` | When `true`, `/gsd-autonomous` bypasses the discuss-phase entirely, writing minimal CONTEXT.md from the ROADMAP phase goal. Useful for projects where developer preferences are fully captured in PROJECT.md/REQUIREMENTS.md. Added in v1.28 |
| `workflow.text_mode` | boolean | `false` | Replaces AskUserQuestion TUI menus with plain-text numbered lists. Required for Claude Code remote sessions (`/rc` mode) where TUI menus don't render. Can also be set per-session with `--text` flag on discuss-phase. Added in v1.28 |
| `workflow.use_worktrees` | boolean | `true` | When `false`, disables git worktree isolation for parallel execution. Users who prefer sequential execution or whose environment does not support worktrees can disable this. Added in v1.31 |
| `workflow.worktree_skip_hooks` | boolean | `false` | When `true`, executor agents in worktree mode pass `--no-verify` (skipping pre-commit hooks) and post-wave hook validation runs against the merged result instead. Opt-in escape hatch for projects whose hooks cannot run in agent worktrees. Default `false` runs hooks on every commit (#2924). |
| `workflow.code_review` | boolean | `true` | Enable `/gsd-code-review` and `/gsd-code-review-fix` commands. When `false`, the commands exit with a configuration gate message. Added in v1.34 |
| `workflow.code_review_depth` | string | `standard` | Default review depth for `/gsd-code-review`: `quick` (pattern-matching only), `standard` (per-file analysis), or `deep` (cross-file with import graphs). Can be overridden per-run with `--depth=`. Added in v1.34 |
| `workflow.plan_bounce` | boolean | `false` | Run external validation script against generated plans. When enabled, the plan-phase orchestrator pipes each PLAN.md through the script specified by `plan_bounce_script` and blocks on non-zero exit. Added in v1.36 |
@@ -902,7 +902,7 @@ continues. Drift detection cannot fail verification.
- REQ-UPDATE-02: System MUST display changelog for new version before updating
- REQ-UPDATE-03: System MUST be runtime-aware and target the correct directory
- REQ-UPDATE-04: System MUST back up locally modified files to `gsd-local-patches/`
- REQ-UPDATE-05: `/gsd-reapply-patches` MUST restore local modifications after update
- REQ-UPDATE-05: `/gsd-update --reapply` MUST restore local modifications after update

---
@@ -2255,7 +2255,7 @@ Test suite that scans all agent, workflow, and command files for embedded inject

### 103. Post-Merge Hunk Verification

**Command:** `/gsd-reapply-patches`
**Command:** `/gsd-update --reapply`

**Purpose:** After applying local patches post-update, verify that all hunks were actually applied by comparing the expected patch content against the live filesystem. Surface any dropped or partial hunks immediately rather than silently accepting incomplete merges.
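The mechanics aren't spelled out here, but one standard way to test "all hunks present in the live tree" is a reverse dry-run apply — shown below as an illustration, with a made-up patch path:

```bash
# If the patch can be applied in reverse cleanly, its post-image content is
# already present in the working tree — i.e., every hunk landed.
git apply --reverse --check gsd-local-patches/some-file.patch \
  && echo "all hunks applied" \
  || echo "dropped or partial hunks detected"
```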
@@ -246,6 +246,7 @@
  "cli_modules": [
    "artifacts.cjs",
    "audit.cjs",
    "command-aliases.generated.cjs",
    "commands.cjs",
    "config-schema.cjs",
    "config.cjs",
@@ -258,22 +259,30 @@
    "gap-checker.cjs",
    "graphify.cjs",
    "gsd2-import.cjs",
    "init-command-router.cjs",
    "init.cjs",
    "install-profiles.cjs",
    "intel.cjs",
    "learnings.cjs",
    "milestone.cjs",
    "model-profiles.cjs",
    "phase-command-router.cjs",
    "phase.cjs",
    "phases-command-router.cjs",
    "planning-workspace.cjs",
    "profile-output.cjs",
    "profile-pipeline.cjs",
    "roadmap-command-router.cjs",
    "roadmap.cjs",
    "schema-detect.cjs",
    "secrets.cjs",
    "security.cjs",
    "state-command-router.cjs",
    "state.cjs",
    "template.cjs",
    "uat.cjs",
    "validate-command-router.cjs",
    "verify-command-router.cjs",
    "verify.cjs",
    "workstream.cjs"
  ],
@@ -291,4 +300,4 @@
    "gsd-workflow-guard.js"
  ]
}
}
}
@@ -224,7 +224,7 @@ Full roster at `get-shit-done/workflows/*.md`. Workflows are thin orchestrators
| `profile-user.md` | Orchestrate the full developer profiling flow — consent, session scan, profile generation. | `/gsd-profile-user` |
| `progress.md` | Progress rendering — project context, position, and next-action routing. | `/gsd-progress` |
| `quick.md` | Quick-task execution with GSD guarantees (atomic commits, state tracking). | `/gsd-quick` |
| `reapply-patches.md` | Reapply local modifications after a GSD update. | `/gsd-reapply-patches` |
| `reapply-patches.md` | Reapply local modifications after a GSD update. | `/gsd-update --reapply` |
| `remove-phase.md` | Remove a future phase from the roadmap and renumber subsequent phases. | `/gsd-remove-phase` |
| `remove-workspace.md` | Remove a GSD workspace and clean up worktrees. | `/gsd-remove-workspace` |
| `research-phase.md` | Standalone phase research workflow (usually invoked via `plan-phase`). | `/gsd-research-phase` |
@@ -348,7 +348,7 @@ The `gsd-planner` agent is decomposed into a core agent plus reference modules t

---

## CLI Modules (32 shipped)
## CLI Modules (41 shipped)

Full listing: `get-shit-done/bin/lib/*.cjs`.
@@ -356,11 +356,12 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
|--------|----------------|
| `artifacts.cjs` | Canonical artifact registry — known `.planning/` root file names; used by `gsd-health` W019 lint |
| `audit.cjs` | Audit dispatch, audit open sessions, audit storage helpers |
| `command-aliases.generated.cjs` | Generated CJS alias/subcommand metadata for manifest-backed family routers |
| `commands.cjs` | Misc CLI commands (slug, timestamp, todos, scaffolding, stats) |
| `config-schema.cjs` | Single source of truth for `VALID_CONFIG_KEYS` and dynamic key patterns; imported by both the validator and the config-schema-docs parity test |
| `config.cjs` | `config.json` read/write, section initialization; imports validator from `config-schema.cjs` |
| `context-utilization.cjs` | Pure classifier for `gsd-health --context` — turns (tokensUsed, contextWindow) into a `{ percent, state }` triage result against the 60%/70% fracture-point thresholds (#2792) |
| `core.cjs` | Error handling, output formatting, shared utilities, runtime fallbacks |
| `core.cjs` | Error handling, output formatting, shared utilities, runtime fallbacks; compatibility re-exports for planning-workspace helpers |
| `decisions.cjs` | Shared parser for CONTEXT.md `<decisions>` blocks (D-NN entries); used by `gap-checker.cjs` and intended for #2492 plan/verify decision gates |
| `docs.cjs` | Docs-update workflow init, Markdown scanning, monorepo detection |
| `drift.cjs` | Post-execute codebase structural drift detector (#2003): classifies file changes into new-dir/barrel/migration/route categories and round-trips `last_mapped_commit` frontmatter |
@@ -368,22 +369,30 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.
| `gap-checker.cjs` | Post-planning gap analysis (#2493): unified REQUIREMENTS.md + CONTEXT.md decisions vs PLAN.md coverage report (`gsd-tools gap-analysis`) |
| `graphify.cjs` | Knowledge-graph build/query/status/diff for `/gsd-graphify` |
| `gsd2-import.cjs` | External-plan ingest for `/gsd-from-gsd2` |
| `init-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools init` |
| `init.cjs` | Compound context loading for each workflow type |
| `install-profiles.cjs` | Install profile allowlist + skill staging for `--minimal` install (#2762); single source of truth for which `gsd-*` skills/agents land in runtime config dirs |
| `intel.cjs` | Codebase intel store backing `/gsd-intel` and `gsd-intel-updater` |
| `learnings.cjs` | Cross-phase learnings extraction for `/gsd-extract-learnings` |
| `milestone.cjs` | Milestone archival, requirements marking |
| `model-profiles.cjs` | Model profile resolution table (authoritative profile data) |
| `phase-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools phase` |
| `phase.cjs` | Phase directory operations, decimal numbering, plan indexing |
| `phases-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools phases` |
| `planning-workspace.cjs` | Planning path/workstream seam (`planningDir`, `planningPaths`, active-workstream routing, `.planning/.lock` orchestration) |
| `profile-output.cjs` | Profile rendering, USER-PROFILE.md and dev-preferences.md generation |
| `profile-pipeline.cjs` | User behavioral profiling data pipeline, session file scanning |
| `roadmap-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools roadmap` |
| `roadmap.cjs` | ROADMAP.md parsing, phase extraction, plan progress |
| `schema-detect.cjs` | Schema-drift detection for ORM patterns (Prisma, Drizzle, etc.) |
| `secrets.cjs` | Secret-config masking convention (`****<last-4>`) for integration keys managed by `/gsd-settings-integrations` — keeps plaintext out of `config-set` output |
| `security.cjs` | Path traversal prevention, prompt injection detection, safe JSON/shell helpers |
| `state-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools state` |
| `state.cjs` | STATE.md parsing, updating, progression, metrics |
| `template.cjs` | Template selection and filling with variable substitution |
| `uat.cjs` | UAT file parsing, verification debt tracking, audit-uat support |
| `validate-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools validate` |
| `verify-command-router.cjs` | Thin CJS subcommand router adapter for `gsd-tools verify` |
| `verify.cjs` | Plan structure, phase completeness, reference, commit validation |
| `workstream.cjs` | Workstream CRUD, migration, session-scoped active pointer |
@@ -1090,7 +1090,7 @@ Set `commit_docs: false` during `/gsd-new-project` or via `/gsd-settings`. Add `

### GSD Update Overwrote My Local Changes

Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-reapply-patches` to merge your changes back.
Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-update --reapply` to merge your changes back.

### Cannot Update via npm
@@ -1249,7 +1249,7 @@ If the installer crashes with `EPERM: operation not permitted, scandir` on Windo
| Quick targeted fix | `/gsd-quick` |
| Plan doesn't match your vision | `/gsd-discuss-phase [N]` then re-plan |
| Costs running high | `/gsd-set-profile budget` and `/gsd-settings` to toggle agents off |
| Update broke local changes | `/gsd-reapply-patches` |
| Update broke local changes | `/gsd-update --reapply` |
| Want session summary for stakeholder | `/gsd-session-report` |
| Don't know what step is next | `/gsd-next` |
| Parallel execution build errors | Update GSD or set `parallelization.enabled: false` |
@@ -439,7 +439,7 @@ UI-SPEC.md (per phase) ───────────────────
- Antigravity: Skills-first with Google model equivalents
5. **Path normalization** — Replaces `~/.claude/` paths with runtime-specific paths
6. **Settings integration** — Registers hooks in the runtime's `settings.json`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-reapply-patches`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-update --reapply`
8. **Manifest tracking** — Writes `gsd-file-manifest.json` for clean uninstall
9. **Uninstall mode** — `--uninstall` removes all GSD files, hooks, and settings
@@ -794,12 +794,12 @@ Claude Codeのセッション分析から8つの次元(コミュニケーシ
/gsd-update # check for and install updates
```

### `/gsd-reapply-patches`
### `/gsd-update --reapply`

Restores local modifications after a GSD update.

```bash
/gsd-reapply-patches # merge local changes back
/gsd-update --reapply # merge local changes back
```

---
@@ -800,7 +800,7 @@
- REQ-UPDATE-02: The system MUST display the changelog for the new version before updating
- REQ-UPDATE-03: The system MUST be runtime-aware and target the correct directory
- REQ-UPDATE-04: The system MUST back up locally modified files to `gsd-local-patches/`
- REQ-UPDATE-05: `/gsd-reapply-patches` MUST restore local modifications after an update
- REQ-UPDATE-05: `/gsd-update --reapply` MUST restore local modifications after an update

---
@@ -18,7 +18,7 @@ Get Shit Done(GSD)フレームワークの包括的なドキュメントで

## Quick Links

- **New in v1.32:** STATE.md consistency gates, `--to N` autonomous mode, research gate, verifier milestone scope filtering, read-before-edit guard, context reduction, new runtimes (Trae, Cline, Augment Code), response-language setting, `--power`/`--diagnose` flags, `/gsd-analyze-dependencies`
- **New in v1.39:** `--minimal` install profile (≥94% cold-start reduction), `/gsd-edit-phase`, post-merge build & test gate, `review.models.<cli>` per-runtime review models, workstream config inheritance, manual canary release workflow, skill consolidation (86 → 59)
- **Getting started:** [README](../README.md) → install → `/gsd-new-project`
- **Full workflow guide:** [User Guide](USER-GUIDE.md)
- **All commands:** [Command Reference](COMMANDS.md)
@@ -432,7 +432,7 @@ GSD はマークダウンファイルを生成し、それが LLM のシステ
| `/gsd-check-todos` | List pending TODOs | Reviewing captured ideas |
| `/gsd-settings` | Configure workflow toggles and model profiles | Changing models, toggling agents |
| `/gsd-set-profile <profile>` | Quick profile switching | Changing the cost/quality trade-off |
| `/gsd-reapply-patches` | Restore local changes after an update | After `/gsd-update` when you have local edits |
| `/gsd-update --reapply` | Restore local changes after an update | After `/gsd-update` when you have local edits |

### Code Quality and Review
@@ -754,7 +754,7 @@ GSD サブエージェントが Anthropic モデルを呼び出し、OpenRouter

### A GSD Update Overwrote My Local Changes

Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-reapply-patches` to merge your changes back.
Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-update --reapply` to merge your changes back.

### Workflow Forensics (`/gsd-forensics`)
@@ -801,7 +801,7 @@ Windows でインストーラーが `EPERM: operation not permitted, scandir`
| Quick targeted fix | `/gsd-quick` |
| Plan doesn't match your vision | `/gsd-discuss-phase [N]` then re-plan |
| Costs running high | `/gsd-set-profile budget` and `/gsd-settings` to toggle agents off |
| Update broke local changes | `/gsd-reapply-patches` |
| Update broke local changes | `/gsd-update --reapply` |
| Want a session summary for stakeholders | `/gsd-session-report` |
| Don't know what step is next | `/gsd-next` |
| Parallel execution build errors | Update GSD or set `parallelization.enabled: false` |
@@ -439,7 +439,7 @@ UI-SPEC.md (per phase) ───────────────────
- Antigravity: Skills-first with Google model equivalents
5. **Path normalization** — Replaces `~/.claude/` paths with runtime-specific paths
6. **Settings integration** — Registers hooks in the runtime's `settings.json`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-reapply-patches`
7. **Patch backup** — Since v1.17, backs up locally modified files to `gsd-local-patches/` for `/gsd-update --reapply`
8. **Manifest tracking** — Writes `gsd-file-manifest.json` for clean uninstall
9. **Uninstall mode** — `--uninstall` removes all GSD files, hooks, and settings
@@ -794,12 +794,12 @@ Claude Code 세션 분석을 통해 8개 차원(커뮤니케이션 스타일,
/gsd-update # check for and install updates
```

### `/gsd-reapply-patches`
### `/gsd-update --reapply`

Restores local modifications after a GSD update.

```bash
/gsd-reapply-patches # merge local changes back
/gsd-update --reapply # merge local changes back
```

---
@@ -800,7 +800,7 @@
|
||||
- REQ-UPDATE-02: 업데이트 전에 새 버전의 변경 로그를 표시해야 합니다.
|
||||
- REQ-UPDATE-03: 런타임을 인식하고 올바른 디렉토리를 대상으로 해야 합니다.
|
||||
- REQ-UPDATE-04: 로컬에서 수정된 파일을 `gsd-local-patches/`에 백업해야 합니다.
|
||||
- REQ-UPDATE-05: `/gsd-reapply-patches`는 업데이트 후 로컬 수정사항을 복원해야 합니다.
|
||||
- REQ-UPDATE-05: `/gsd-update --reapply`는 업데이트 후 로컬 수정사항을 복원해야 합니다.
|
||||
|
||||
---
|
||||
|
||||
|
||||
@@ -20,7 +20,7 @@ Get Shit Done (GSD) 프레임워크의 종합 문서입니다. GSD는 AI 코딩
|
||||
|
||||
## 빠른 링크
|
||||
|
||||
- **v1.32의 새로운 기능:** STATE.md 일관성 게이트, `--to N` 자율 모드, 리서치 게이트, 검증자 마일스톤 범위 필터링, read-before-edit 가드, 컨텍스트 축소, 신규 런타임(Trae, Cline, Augment Code), 응답 언어 설정, `--power`/`--diagnose` 플래그, `/gsd-analyze-dependencies`
|
||||
- **v1.39의 새로운 기능:** `--minimal` 설치 프로파일(콜드 스타트 ≥94% 감소), `/gsd-edit-phase`, 머지 후 빌드 & 테스트 게이트, `review.models.<cli>` 런타임별 리뷰 모델, 워크스트림 설정 상속, 수동 카나리 릴리스 워크플로, 스킬 통합(86 → 59)
|
||||
- **시작하기:** [README](../README.md) → 설치 → `/gsd-new-project`
|
||||
- **전체 워크플로우 안내:** [User Guide](USER-GUIDE.md)
|
||||
- **모든 명령어 한눈에 보기:** [Command Reference](COMMANDS.md)
|
||||
|
||||
@@ -432,7 +432,7 @@ GSD는 LLM 시스템 프롬프트가 되는 마크다운 파일을 생성합니
|
||||
| `/gsd-check-todos` | 보류 중인 할 일 목록 | 캡처된 아이디어 검토 시 |
|
||||
| `/gsd-settings` | 워크플로우 토글 및 모델 프로필 설정 | 모델 변경, 에이전트 토글 시 |
|
||||
| `/gsd-set-profile <profile>` | 빠른 프로필 전환 | 비용/품질 트레이드오프 변경 시 |
|
||||
| `/gsd-reapply-patches` | 업데이트 후 로컬 수정사항 복원 | 로컬 편집이 있는 상태에서 `/gsd-update` 이후 |
|
||||
| `/gsd-update --reapply` | 업데이트 후 로컬 수정사항 복원 | 로컬 편집이 있는 상태에서 `/gsd-update` 이후 |
|
||||
|
||||
### 코드 품질 및 리뷰
|
||||
|
||||
@@ -754,7 +754,7 @@ GSD 서브에이전트가 Anthropic 모델을 호출하는데 OpenRouter나 로
|
||||
|
||||
### GSD 업데이트가 로컬 변경사항을 덮어쓴 경우
|
||||
|
||||
v1.17부터 설치 프로그램이 로컬로 수정된 파일을 `gsd-local-patches/`에 백업합니다. 변경사항을 다시 병합하려면 `/gsd-reapply-patches`를 실행하세요.
|
||||
v1.17부터 설치 프로그램이 로컬로 수정된 파일을 `gsd-local-patches/`에 백업합니다. 변경사항을 다시 병합하려면 `/gsd-update --reapply`를 실행하세요.
|
||||
|
||||
### 워크플로우 진단 (`/gsd-forensics`)
|
||||
|
||||
@@ -801,7 +801,7 @@ Windows에서 설치 프로그램이 `EPERM: operation not permitted, scandir`
|
||||
| 빠른 목표 수정 | `/gsd-quick` |
|
||||
| 계획이 비전과 맞지 않음 | `/gsd-discuss-phase [N]` 후 재계획 |
|
||||
| 비용이 높아짐 | `/gsd-set-profile budget` 및 `/gsd-settings`에서 에이전트 비활성화 |
|
||||
| 업데이트가 로컬 변경사항 파괴 | `/gsd-reapply-patches` |
|
||||
| 업데이트가 로컬 변경사항 파괴 | `/gsd-update --reapply` |
|
||||
| 이해관계자를 위한 세션 요약 필요 | `/gsd-session-report` |
|
||||
| 다음 단계를 모르겠음 | `/gsd-next` |
|
||||
| 병렬 실행 빌드 오류 | GSD 업데이트 또는 `parallelization.enabled: false` 설정 |
|
||||
|
||||
@@ -59,4 +59,4 @@ The installer performs a clean wipe-and-replace of GSD-managed directories only:
 - Your `CLAUDE.md` files
 - Custom hooks

-Locally modified GSD files are automatically backed up to `gsd-local-patches/` before the install. Run `/gsd-reapply-patches` after updating to merge your modifications back in.
+Locally modified GSD files are automatically backed up to `gsd-local-patches/` before the install. Run `/gsd-update --reapply` after updating to merge your modifications back in.
@@ -18,9 +18,9 @@ Comprehensive documentation for the Get Shit Done (GSD) framework — a met
 | [References](references/) | All users | Supplementary guides for decisions, verification, and patterns |
 | [Superpowers](superpowers/) | Contributors | Advanced project plans and specs |

-## What's new in v1.32
+## What's new in v1.39

-STATE.md consistency gates, `--to N` for partial autonomous execution, research gate, verifier milestone scope filtering, read-before-edit guard, context reduction, new runtimes (Trae, Cline, Augment Code), `response_language`, `--power`/`--diagnose` flags, `/gsd-analyze-dependencies`.
+`--minimal` install profile (≥94% cold-start reduction), `/gsd-edit-phase`, post-merge build & test gate, `review.models.<cli>` for per-runtime review model selection, workstream config inheritance, manual canary release workflow, skill consolidation (86 → 59).

 ## Quick links
@@ -234,7 +234,7 @@
 | `/gsd-check-todos` | List pending TODOs | Reviewing captured ideas |
 | `/gsd-settings` | Configure workflow toggles and model profiles | Changing models, toggling agents |
 | `/gsd-set-profile <profile>` | Quick profile switching | Changing the cost/quality trade-off |
-| `/gsd-reapply-patches` | Restore local modifications after an update | After `/gsd-update` if you have local edits |
+| `/gsd-update --reapply` | Restore local modifications after an update | After `/gsd-update` if you have local edits |

 ---

@@ -466,7 +466,7 @@ node gsd-tools.cjs state sync   # rebuild STATE.md from disk

 ### A GSD update overwrote my local changes

-Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-reapply-patches` to merge your changes back in.
+Since v1.17, the installer backs up locally modified files to `gsd-local-patches/`. Run `/gsd-update --reapply` to merge your changes back in.

 ### A subagent appears to have failed but the work was done

@@ -487,7 +487,7 @@ node gsd-tools.cjs state sync   # rebuild STATE.md from disk
 | Quick targeted fix | `/gsd-quick` |
 | Plan doesn't match your vision | `/gsd-discuss-phase [N]` then replan |
 | Costs too high | `/gsd-set-profile budget` and turn agents off in `/gsd-settings` |
-| An update broke local changes | `/gsd-reapply-patches` |
+| An update broke local changes | `/gsd-update --reapply` |

 ---
99
get-shit-done/bin/check-latest-version.cjs
Executable file
@@ -0,0 +1,99 @@
#!/usr/bin/env node
'use strict';

/**
 * Deterministic latest-version check for /gsd-update (#2992).
 *
 * The /gsd-update workflow's check_latest_version step was previously
 * prescribed in LLM-driven prose ("run `npm view get-shit-done-cc
 * version`"). The executing model could shortcut the prescription and
 * invent npm queries against wrong-shaped names (`@get-shit-done/cli`,
 * `get-shit-done-cli`, `gsd`), all of which 404 or — worse — return an
 * unrelated typosquat package.
 *
 * This script makes the package name a CONSTANT in code, not a free
 * choice at execution time. The workflow calls it via `npm run
 * check-latest-version -- --json` and parses the structured response.
 *
 * Tests assert on the typed CHECK_REASON enum and the structured result
 * record, never on console prose. See CONTRIBUTING.md "Prohibited: Raw
 * Text Matching on Test Outputs".
 */

const cp = require('node:child_process');

// Hardcoded. Do not parameterise — the whole point of this script is that
// the package name is not a runtime choice for the caller.
const PACKAGE_NAME = 'get-shit-done-cc';

const CHECK_REASON = Object.freeze({
  OK: 'ok',
  FAIL_NPM_FAILED: 'fail_npm_failed',
  FAIL_INVALID_OUTPUT: 'fail_invalid_output',
});

const SEMVER_RE = /^\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?$/;

/**
 * Pure-ish: takes an injected spawn function so tests don't actually run npm.
 * In production, defaults to cp.spawnSync('npm', ...).
 */
function checkLatestVersion(opts = {}) {
  const defaultSpawn = () => cp.spawnSync('npm', ['view', PACKAGE_NAME, 'version'], {
    encoding: 'utf8',
    stdio: ['ignore', 'pipe', 'pipe'],
    shell: process.platform === 'win32', // npm is npm.cmd on Windows
    // Bound the registry call so a hung network/registry doesn't block the
    // entire /gsd-update workflow indefinitely (#2993 CR). 15s is generous
    // for `npm view <pkg> version`; on timeout, spawnSync returns with
    // signal !== null and the existing failure path emits FAIL_NPM_FAILED.
    timeout: 15_000,
  });
  const spawn = opts.spawn || defaultSpawn;

  const r = spawn();
  if (!r || r.status !== 0) {
    // Distinguish timeout (status null, signal set, stderr empty) from a
    // genuine npm failure. Without this, both surfaced as "npm exited
    // non-zero" and the operator couldn't tell which (#2993 CR).
    let detail;
    if (r && r.signal) {
      detail = `npm timed out (signal: ${r.signal})`;
    } else if (r && r.stderr) {
      detail = r.stderr.trim();
    } else {
      detail = 'npm exited non-zero';
    }
    return {
      ok: false,
      reason: CHECK_REASON.FAIL_NPM_FAILED,
      detail,
    };
  }
  const version = (r.stdout || '').trim();
  if (!SEMVER_RE.test(version)) {
    return {
      ok: false,
      reason: CHECK_REASON.FAIL_INVALID_OUTPUT,
      detail: version || '(empty)',
    };
  }
  return { ok: true, version, reason: CHECK_REASON.OK };
}

function main() {
  const json = process.argv.includes('--json');
  const r = checkLatestVersion();
  if (json) {
    process.stdout.write(JSON.stringify(r) + '\n');
  } else if (r.ok) {
    process.stdout.write(r.version + '\n');
  } else {
    process.stderr.write(`check-latest-version: ${r.reason}: ${r.detail}\n`);
  }
  process.exit(r.ok ? 0 : 1);
}

if (require.main === module) main();

module.exports = { checkLatestVersion, CHECK_REASON, PACKAGE_NAME };
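The injected `spawn` seam above is what makes the check testable without touching the network. A minimal sketch of driving it — the fake spawnSync-shaped return values here are illustrative, not taken from the repo's actual test suite:

```js
'use strict';

const assert = require('node:assert');
const { checkLatestVersion, CHECK_REASON } = require('./check-latest-version.cjs');

// Happy path: fake a spawnSync-shaped result with a clean semver on stdout.
const ok = checkLatestVersion({
  spawn: () => ({ status: 0, stdout: '1.39.2\n', stderr: '' }),
});
assert.deepStrictEqual(ok, { ok: true, version: '1.39.2', reason: CHECK_REASON.OK });

// Garbage output (an npm warning instead of a version) fails SEMVER_RE and
// surfaces as a typed reason code, never as prose to grep.
const bad = checkLatestVersion({
  spawn: () => ({ status: 0, stdout: 'npm WARN something\n', stderr: '' }),
});
assert.strictEqual(bad.reason, CHECK_REASON.FAIL_INVALID_OUTPUT);
```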
@@ -172,7 +172,8 @@
 const fs = require('fs');
 const path = require('path');
 const core = require('./lib/core.cjs');
-const { error, findProjectRoot, getActiveWorkstream } = core;
+const { error, findProjectRoot } = core;
+const { getActiveWorkstream } = require('./lib/planning-workspace.cjs');
 const state = require('./lib/state.cjs');
 const phase = require('./lib/phase.cjs');
 const roadmap = require('./lib/roadmap.cjs');

@@ -189,6 +190,13 @@ const workstream = require('./lib/workstream.cjs');
 const docs = require('./lib/docs.cjs');
 const learnings = require('./lib/learnings.cjs');
 const gapChecker = require('./lib/gap-checker.cjs');
+const { routeStateCommand } = require('./lib/state-command-router.cjs');
+const { routeVerifyCommand } = require('./lib/verify-command-router.cjs');
+const { routeInitCommand } = require('./lib/init-command-router.cjs');
+const { routePhaseCommand } = require('./lib/phase-command-router.cjs');
+const { routePhasesCommand } = require('./lib/phases-command-router.cjs');
+const { routeValidateCommand } = require('./lib/validate-command-router.cjs');
+const { routeRoadmapCommand } = require('./lib/roadmap-command-router.cjs');

 // ─── Arg parsing helpers ──────────────────────────────────────────────────────
@@ -297,6 +305,18 @@ async function main() {
   const raw = rawIndex !== -1;
   if (rawIndex !== -1) args.splice(rawIndex, 1);

+  // --json-errors: when present, error() emits structured JSON to stderr
+  // ({ ok: false, reason: <ERROR_REASON code>, message }) instead of plain
+  // "Error: <text>". Lets test suites assert on typed reason codes per the
+  // CONTRIBUTING.md "Prohibited: Raw Text Matching on Test Outputs" rule
+  // (#2974). Default off — human operators see the original plain-text
+  // diagnostic.
+  const jsonErrorsIdx = args.indexOf('--json-errors');
+  if (jsonErrorsIdx !== -1) {
+    core.setJsonErrorMode(true);
+    args.splice(jsonErrorsIdx, 1);
+  }
+
   // --pick <name>: extract a single field from JSON output (replaces jq dependency).
   // Supports dot-notation (e.g., --pick workflow.research) and bracket notation
   // for arrays (e.g., --pick directories[-1]).
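What that flag buys a caller, sketched below. The invocation is hypothetical — a config write forced to fail with a bogus key — and the command name and script path are assumptions; the parsed shape follows the comment above:

```js
'use strict';

const { spawnSync } = require('node:child_process');

// Hypothetical: force a typed failure and assert on the reason code.
const r = spawnSync(process.execPath,
  ['gsd-tools.cjs', 'config-set', 'definitely.not.a.key', 'true', '--json-errors'],
  { encoding: 'utf8' });

// Without --json-errors stderr would read "Error: ...". With it, something like:
//   { "ok": false, "reason": "config_invalid_key", "message": "Unknown config key: ..." }
const err = JSON.parse(r.stderr);
if (err.reason !== 'config_invalid_key') {
  throw new Error(`unexpected reason: ${err.reason}`);
}
```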
@@ -429,73 +449,14 @@ function extractField(obj, fieldPath) {
|
||||
async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
switch (command) {
|
||||
case 'state': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'json') {
|
||||
state.cmdStateJson(cwd, raw);
|
||||
} else if (subcommand === 'update') {
|
||||
state.cmdStateUpdate(cwd, args[2], args[3]);
|
||||
} else if (subcommand === 'get') {
|
||||
state.cmdStateGet(cwd, args[2], raw);
|
||||
} else if (subcommand === 'patch') {
|
||||
const patches = {};
|
||||
for (let i = 2; i < args.length; i += 2) {
|
||||
const key = args[i].replace(/^--/, '');
|
||||
const value = args[i + 1];
|
||||
if (key && value !== undefined) {
|
||||
patches[key] = value;
|
||||
}
|
||||
}
|
||||
state.cmdStatePatch(cwd, patches, raw);
|
||||
} else if (subcommand === 'advance-plan') {
|
||||
state.cmdStateAdvancePlan(cwd, raw);
|
||||
} else if (subcommand === 'record-metric') {
|
||||
const { phase: p, plan, duration, tasks, files } = parseNamedArgs(args, ['phase', 'plan', 'duration', 'tasks', 'files']);
|
||||
state.cmdStateRecordMetric(cwd, { phase: p, plan, duration, tasks, files }, raw);
|
||||
} else if (subcommand === 'update-progress') {
|
||||
state.cmdStateUpdateProgress(cwd, raw);
|
||||
} else if (subcommand === 'add-decision') {
|
||||
const { phase: p, summary, 'summary-file': summary_file, rationale, 'rationale-file': rationale_file } = parseNamedArgs(args, ['phase', 'summary', 'summary-file', 'rationale', 'rationale-file']);
|
||||
state.cmdStateAddDecision(cwd, { phase: p, summary, summary_file, rationale: rationale || '', rationale_file }, raw);
|
||||
} else if (subcommand === 'add-blocker') {
|
||||
const { text, 'text-file': text_file } = parseNamedArgs(args, ['text', 'text-file']);
|
||||
state.cmdStateAddBlocker(cwd, { text, text_file }, raw);
|
||||
} else if (subcommand === 'resolve-blocker') {
|
||||
state.cmdStateResolveBlocker(cwd, parseNamedArgs(args, ['text']).text, raw);
|
||||
} else if (subcommand === 'record-session') {
|
||||
const { 'stopped-at': stopped_at, 'resume-file': resume_file } = parseNamedArgs(args, ['stopped-at', 'resume-file']);
|
||||
state.cmdStateRecordSession(cwd, { stopped_at, resume_file: resume_file || 'None' }, raw);
|
||||
} else if (subcommand === 'begin-phase') {
|
||||
const { phase: p, name, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
|
||||
state.cmdStateBeginPhase(cwd, p, name, plans !== null ? parseInt(plans, 10) : null, raw);
|
||||
} else if (subcommand === 'signal-waiting') {
|
||||
const { type, question, options, phase: p } = parseNamedArgs(args, ['type', 'question', 'options', 'phase']);
|
||||
state.cmdSignalWaiting(cwd, type, question, options, p, raw);
|
||||
} else if (subcommand === 'signal-resume') {
|
||||
state.cmdSignalResume(cwd, raw);
|
||||
} else if (subcommand === 'planned-phase') {
|
||||
const { phase: p, name, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
|
||||
state.cmdStatePlannedPhase(cwd, p, plans !== null ? parseInt(plans, 10) : null, raw);
|
||||
} else if (subcommand === 'validate') {
|
||||
state.cmdStateValidate(cwd, raw);
|
||||
} else if (subcommand === 'sync') {
|
||||
const { verify } = parseNamedArgs(args, [], ['verify']);
|
||||
state.cmdStateSync(cwd, { verify }, raw);
|
||||
} else if (subcommand === 'prune') {
|
||||
const { 'keep-recent': keepRecent, 'dry-run': dryRun } = parseNamedArgs(args, ['keep-recent'], ['dry-run']);
|
||||
state.cmdStatePrune(cwd, { keepRecent: keepRecent || '3', dryRun: !!dryRun }, raw);
|
||||
} else if (subcommand === 'complete-phase') {
|
||||
state.cmdStateCompletePhase(cwd, raw);
|
||||
} else if (subcommand === 'milestone-switch') {
|
||||
// Bug #2630: reset STATE.md frontmatter + Current Position for new milestone.
|
||||
// NB: the flag is `--milestone`, not `--version` — gsd-tools reserves
|
||||
// `--version` as a globally-invalid help flag (see NEVER_VALID_FLAGS above).
|
||||
const { milestone, name } = parseNamedArgs(args, ['milestone', 'name']);
|
||||
state.cmdStateMilestoneSwitch(cwd, milestone, name, raw);
|
||||
} else if (subcommand === undefined || subcommand === 'load') {
|
||||
state.cmdStateLoad(cwd, raw);
|
||||
} else {
|
||||
error(`Unknown state subcommand: "${subcommand}". Available: load, json, get, patch, update, advance-plan, record-metric, update-progress, add-decision, add-blocker, resolve-blocker, record-session, begin-phase, signal-waiting, signal-resume, planned-phase, validate, sync, prune, complete-phase, milestone-switch`);
|
||||
}
|
||||
routeStateCommand({
|
||||
state,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
parseNamedArgs,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -589,27 +550,13 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
}
|
||||
|
||||
case 'verify': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'plan-structure') {
|
||||
verify.cmdVerifyPlanStructure(cwd, args[2], raw);
|
||||
} else if (subcommand === 'phase-completeness') {
|
||||
verify.cmdVerifyPhaseCompleteness(cwd, args[2], raw);
|
||||
} else if (subcommand === 'references') {
|
||||
verify.cmdVerifyReferences(cwd, args[2], raw);
|
||||
} else if (subcommand === 'commits') {
|
||||
verify.cmdVerifyCommits(cwd, args.slice(2), raw);
|
||||
} else if (subcommand === 'artifacts') {
|
||||
verify.cmdVerifyArtifacts(cwd, args[2], raw);
|
||||
} else if (subcommand === 'key-links') {
|
||||
verify.cmdVerifyKeyLinks(cwd, args[2], raw);
|
||||
} else if (subcommand === 'schema-drift') {
|
||||
const skipFlag = args.includes('--skip');
|
||||
verify.cmdVerifySchemaDrift(cwd, args[2], skipFlag, raw);
|
||||
} else if (subcommand === 'codebase-drift') {
|
||||
verify.cmdVerifyCodebaseDrift(cwd, raw);
|
||||
} else {
|
||||
error('Unknown verify subcommand. Available: plan-structure, phase-completeness, references, commits, artifacts, key-links, schema-drift, codebase-drift');
|
||||
}
|
||||
routeVerifyCommand({
|
||||
verify,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -679,37 +626,25 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
}
|
||||
|
||||
case 'phases': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'list') {
|
||||
const typeIndex = args.indexOf('--type');
|
||||
const phaseIndex = args.indexOf('--phase');
|
||||
const options = {
|
||||
type: typeIndex !== -1 ? args[typeIndex + 1] : null,
|
||||
phase: phaseIndex !== -1 ? args[phaseIndex + 1] : null,
|
||||
includeArchived: args.includes('--include-archived'),
|
||||
};
|
||||
phase.cmdPhasesList(cwd, options, raw);
|
||||
} else if (subcommand === 'clear') {
|
||||
milestone.cmdPhasesClear(cwd, raw, args.slice(2));
|
||||
} else {
|
||||
error('Unknown phases subcommand. Available: list, clear');
|
||||
}
|
||||
routePhasesCommand({
|
||||
phase,
|
||||
milestone,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
case 'roadmap': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'get-phase') {
|
||||
roadmap.cmdRoadmapGetPhase(cwd, args[2], raw);
|
||||
} else if (subcommand === 'analyze') {
|
||||
roadmap.cmdRoadmapAnalyze(cwd, raw);
|
||||
} else if (subcommand === 'update-plan-progress') {
|
||||
roadmap.cmdRoadmapUpdatePlanProgress(cwd, args[2], raw);
|
||||
} else if (subcommand === 'annotate-dependencies') {
|
||||
roadmap.cmdRoadmapAnnotateDependencies(cwd, args[2], raw);
|
||||
} else {
|
||||
error('Unknown roadmap subcommand. Available: get-phase, analyze, update-plan-progress, annotate-dependencies');
|
||||
}
|
||||
routeRoadmapCommand({
|
||||
roadmap,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -731,42 +666,13 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
}
|
||||
|
||||
case 'phase': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'next-decimal') {
|
||||
phase.cmdPhaseNextDecimal(cwd, args[2], raw);
|
||||
} else if (subcommand === 'add') {
|
||||
const idIdx = args.indexOf('--id');
|
||||
let customId = null;
|
||||
const descArgs = [];
|
||||
for (let i = 2; i < args.length; i++) {
|
||||
if (args[i] === '--id' && i + 1 < args.length) {
|
||||
customId = args[i + 1];
|
||||
i++; // skip value
|
||||
} else {
|
||||
descArgs.push(args[i]);
|
||||
}
|
||||
}
|
||||
phase.cmdPhaseAdd(cwd, descArgs.join(' '), raw, customId);
|
||||
} else if (subcommand === 'add-batch') {
|
||||
// Accepts JSON array of descriptions via --descriptions '[...]' or positional args
|
||||
const descFlagIdx = args.indexOf('--descriptions');
|
||||
let descriptions;
|
||||
if (descFlagIdx !== -1 && args[descFlagIdx + 1]) {
|
||||
try { descriptions = JSON.parse(args[descFlagIdx + 1]); } catch (e) { error('--descriptions must be a JSON array'); }
|
||||
} else {
|
||||
descriptions = args.slice(2).filter(a => a !== '--raw');
|
||||
}
|
||||
phase.cmdPhaseAddBatch(cwd, descriptions, raw);
|
||||
} else if (subcommand === 'insert') {
|
||||
phase.cmdPhaseInsert(cwd, args[2], args.slice(3).join(' '), raw);
|
||||
} else if (subcommand === 'remove') {
|
||||
const forceFlag = args.includes('--force');
|
||||
phase.cmdPhaseRemove(cwd, args[2], { force: forceFlag }, raw);
|
||||
} else if (subcommand === 'complete') {
|
||||
phase.cmdPhaseComplete(cwd, args[2], raw);
|
||||
} else {
|
||||
error('Unknown phase subcommand. Available: next-decimal, add, add-batch, insert, remove, complete');
|
||||
}
|
||||
routePhaseCommand({
|
||||
phase,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -783,58 +689,15 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
}
|
||||
|
||||
case 'validate': {
|
||||
const subcommand = args[1];
|
||||
if (subcommand === 'consistency') {
|
||||
verify.cmdValidateConsistency(cwd, raw);
|
||||
} else if (subcommand === 'health') {
|
||||
const repairFlag = args.includes('--repair');
|
||||
const backfillFlag = args.includes('--backfill');
|
||||
verify.cmdValidateHealth(cwd, { repair: repairFlag, backfill: backfillFlag }, raw);
|
||||
} else if (subcommand === 'agents') {
|
||||
verify.cmdValidateAgents(cwd, raw);
|
||||
} else if (subcommand === 'context') {
|
||||
// The model self-reports tokensUsed and contextWindow — the SDK has
|
||||
// no privileged access to either. Recommendation copy lives here
|
||||
// (the renderer), not in the classifier, so it can change without
|
||||
// re-validating the math layer.
|
||||
const opts = parseNamedArgs(args, ['tokens-used', 'context-window']);
|
||||
if (opts['tokens-used'] === null) {
|
||||
error('--tokens-used <integer> is required for `validate context`');
|
||||
break;
|
||||
}
|
||||
if (opts['context-window'] === null) {
|
||||
error('--context-window <integer> is required for `validate context`');
|
||||
break;
|
||||
}
|
||||
const { classifyContextUtilization, STATES } = require('./lib/context-utilization.cjs');
|
||||
const RECOMMENDATIONS = {
|
||||
[STATES.HEALTHY]: null,
|
||||
[STATES.WARNING]: 'Context is approaching the fracture zone — consider /gsd-thread to continue in a fresh window.',
|
||||
[STATES.CRITICAL]: 'Reasoning quality may degrade past 70% utilization (fracture point). Run /gsd-thread now to preserve output quality.',
|
||||
};
|
||||
let classified;
|
||||
try {
|
||||
classified = classifyContextUtilization(Number(opts['tokens-used']), Number(opts['context-window']));
|
||||
} catch (e) {
|
||||
// Translate the classifier's TypeError into a CLI-shaped error
|
||||
// message that names the offending flag.
|
||||
const flag = /tokensUsed/.test(e.message) ? '--tokens-used' : '--context-window';
|
||||
error(`${flag} must be a non-negative integer (window > 0), got the values supplied`);
|
||||
break;
|
||||
}
|
||||
const result = { ...classified, recommendation: RECOMMENDATIONS[classified.state] };
|
||||
if (args.includes('--json')) {
|
||||
core.output(result, raw);
|
||||
} else {
|
||||
const lines = [`Context utilization: ${result.percent}% (${result.state})`];
|
||||
if (result.recommendation) lines.push(result.recommendation);
|
||||
// Use core.output's rawValue path for the sync-flush guarantee
|
||||
// — process.stdout.write can be truncated on process exit.
|
||||
core.output(result, true, lines.join('\n'));
|
||||
}
|
||||
} else {
|
||||
error('Unknown validate subcommand. Available: consistency, health, agents, context');
|
||||
}
|
||||
routeValidateCommand({
|
||||
verify,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
parseNamedArgs,
|
||||
output: core.output,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -852,12 +715,15 @@ async function runCommand(command, args, cwd, raw, defaultValue) {

     case 'audit-open': {
       const { auditOpenArtifacts, formatAuditReport } = require('./lib/audit.cjs');
-      const includeRaw = args.includes('--json');
+      const wantJson = args.includes('--json');
       const result = auditOpenArtifacts(cwd);
-      if (includeRaw) {
+      if (wantJson) {
+        // core.output JSON-stringifies its first arg; pass the object directly.
         core.output(result, raw);
       } else {
-        core.output(formatAuditReport(result), raw);
+        // Human-readable report must bypass JSON encoding — use the rawValue
+        // form (third arg) which core.output emits verbatim.
+        core.output(null, true, formatAuditReport(result));
       }
       break;
     }
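The comments above pin down core.output's two modes. A sketch with the signature inferred from these call sites — not a documented API, and the audit shape is illustrative:

```js
'use strict';

const core = require('./lib/core.cjs');

// Inferred from the call sites above:
//   core.output(result, raw)          → JSON-stringifies its first arg
//   core.output(null, true, rawText)  → emits rawText verbatim (rawValue path),
//                                       keeping output()'s sync-flush guarantee
const result = { open: 2, resolved: 7 };          // illustrative audit shape
core.output(result, false);                       // machine consumers get JSON
core.output(null, true, 'open: 2, resolved: 7');  // humans get the plain report
```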
@@ -903,66 +769,14 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
|
||||
}
|
||||
|
||||
case 'init': {
|
||||
const workflow = args[1];
|
||||
switch (workflow) {
|
||||
case 'execute-phase': {
|
||||
const { validate: epValidate, tdd: epTdd } = parseNamedArgs(args, [], ['validate', 'tdd']);
|
||||
init.cmdInitExecutePhase(cwd, args[2], raw, { validate: epValidate, tdd: epTdd });
|
||||
break;
|
||||
}
|
||||
case 'plan-phase': {
|
||||
const { validate: ppValidate, tdd: ppTdd } = parseNamedArgs(args, [], ['validate', 'tdd']);
|
||||
init.cmdInitPlanPhase(cwd, args[2], raw, { validate: ppValidate, tdd: ppTdd });
|
||||
break;
|
||||
}
|
||||
case 'new-project':
|
||||
init.cmdInitNewProject(cwd, raw);
|
||||
break;
|
||||
case 'new-milestone':
|
||||
init.cmdInitNewMilestone(cwd, raw);
|
||||
break;
|
||||
case 'quick':
|
||||
init.cmdInitQuick(cwd, args.slice(2).join(' '), raw);
|
||||
break;
|
||||
case 'ingest-docs':
|
||||
init.cmdInitIngestDocs(cwd, raw);
|
||||
break;
|
||||
case 'resume':
|
||||
init.cmdInitResume(cwd, raw);
|
||||
break;
|
||||
case 'verify-work':
|
||||
init.cmdInitVerifyWork(cwd, args[2], raw);
|
||||
break;
|
||||
case 'phase-op':
|
||||
init.cmdInitPhaseOp(cwd, args[2], raw);
|
||||
break;
|
||||
case 'todos':
|
||||
init.cmdInitTodos(cwd, args[2], raw);
|
||||
break;
|
||||
case 'milestone-op':
|
||||
init.cmdInitMilestoneOp(cwd, raw);
|
||||
break;
|
||||
case 'map-codebase':
|
||||
init.cmdInitMapCodebase(cwd, raw);
|
||||
break;
|
||||
case 'progress':
|
||||
init.cmdInitProgress(cwd, raw);
|
||||
break;
|
||||
case 'manager':
|
||||
init.cmdInitManager(cwd, raw);
|
||||
break;
|
||||
case 'new-workspace':
|
||||
init.cmdInitNewWorkspace(cwd, raw);
|
||||
break;
|
||||
case 'list-workspaces':
|
||||
init.cmdInitListWorkspaces(cwd, raw);
|
||||
break;
|
||||
case 'remove-workspace':
|
||||
init.cmdInitRemoveWorkspace(cwd, args[2], raw);
|
||||
break;
|
||||
default:
|
||||
error(`Unknown init workflow: ${workflow}\nAvailable: execute-phase, plan-phase, new-project, new-milestone, quick, ingest-docs, resume, verify-work, phase-op, todos, milestone-op, map-codebase, progress, manager, new-workspace, list-workspaces, remove-workspace`);
|
||||
}
|
||||
routeInitCommand({
|
||||
init,
|
||||
args,
|
||||
cwd,
|
||||
raw,
|
||||
parseNamedArgs,
|
||||
error,
|
||||
});
|
||||
break;
|
||||
}
|
||||
|
||||
@@ -1268,6 +1082,7 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
   'agents',
   path.join('commands', 'gsd'),
   'hooks',
+  'skills',
 ];

 function walkDir(dir, baseDir) {
@@ -11,7 +11,8 @@

 const fs = require('fs');
 const path = require('path');
-const { planningDir, toPosixPath } = require('./core.cjs');
+const { toPosixPath } = require('./core.cjs');
+const { planningDir } = require('./planning-workspace.cjs');
 const { extractFrontmatter } = require('./frontmatter.cjs');
 const { requireSafePath, sanitizeForDisplay } = require('./security.cjs');
118
get-shit-done/bin/lib/command-aliases.generated.cjs
Normal file
@@ -0,0 +1,118 @@
'use strict';

/**
 * GENERATED FILE — state.*, verify.*, init.*, phase.*, phases.*, validate.*, and roadmap.* alias/subcommand metadata for CJS routing.
 * Source: sdk/src/query/command-manifest.{state,verify,init,phase,phases,validate,roadmap}.ts
 */

const STATE_COMMAND_ALIASES = [
  { canonical: 'state.load', aliases: [], subcommand: 'load', mutation: false },
  { canonical: 'state.json', aliases: ['state json'], subcommand: 'json', mutation: false },
  { canonical: 'state.get', aliases: ['state get'], subcommand: 'get', mutation: false },
  { canonical: 'state.update', aliases: ['state update'], subcommand: 'update', mutation: true },
  { canonical: 'state.patch', aliases: ['state patch'], subcommand: 'patch', mutation: true },
  { canonical: 'state.begin-phase', aliases: ['state begin-phase'], subcommand: 'begin-phase', mutation: true },
  { canonical: 'state.advance-plan', aliases: ['state advance-plan'], subcommand: 'advance-plan', mutation: true },
  { canonical: 'state.record-metric', aliases: ['state record-metric'], subcommand: 'record-metric', mutation: true },
  { canonical: 'state.update-progress', aliases: ['state update-progress'], subcommand: 'update-progress', mutation: true },
  { canonical: 'state.add-decision', aliases: ['state add-decision'], subcommand: 'add-decision', mutation: true },
  { canonical: 'state.add-blocker', aliases: ['state add-blocker'], subcommand: 'add-blocker', mutation: true },
  { canonical: 'state.resolve-blocker', aliases: ['state resolve-blocker'], subcommand: 'resolve-blocker', mutation: true },
  { canonical: 'state.record-session', aliases: ['state record-session'], subcommand: 'record-session', mutation: true },
  { canonical: 'state.signal-waiting', aliases: ['state signal-waiting'], subcommand: 'signal-waiting', mutation: true },
  { canonical: 'state.signal-resume', aliases: ['state signal-resume'], subcommand: 'signal-resume', mutation: true },
  { canonical: 'state.planned-phase', aliases: ['state planned-phase'], subcommand: 'planned-phase', mutation: true },
  { canonical: 'state.validate', aliases: ['state validate'], subcommand: 'validate', mutation: false },
  { canonical: 'state.sync', aliases: ['state sync'], subcommand: 'sync', mutation: true },
  { canonical: 'state.prune', aliases: ['state prune'], subcommand: 'prune', mutation: true },
  { canonical: 'state.milestone-switch', aliases: ['state milestone-switch'], subcommand: 'milestone-switch', mutation: true },
  { canonical: 'state.add-roadmap-evolution', aliases: ['state add-roadmap-evolution'], subcommand: 'add-roadmap-evolution', mutation: true },
];

const VERIFY_COMMAND_ALIASES = [
  { canonical: 'verify.plan-structure', aliases: ['verify plan-structure'], subcommand: 'plan-structure', mutation: false },
  { canonical: 'verify.phase-completeness', aliases: ['verify phase-completeness'], subcommand: 'phase-completeness', mutation: false },
  { canonical: 'verify.references', aliases: ['verify references'], subcommand: 'references', mutation: false },
  { canonical: 'verify.commits', aliases: ['verify commits'], subcommand: 'commits', mutation: false },
  { canonical: 'verify.artifacts', aliases: ['verify artifacts'], subcommand: 'artifacts', mutation: false },
  { canonical: 'verify.key-links', aliases: ['verify key-links'], subcommand: 'key-links', mutation: false },
  { canonical: 'verify.schema-drift', aliases: ['verify schema-drift'], subcommand: 'schema-drift', mutation: false },
  { canonical: 'verify.codebase-drift', aliases: ['verify codebase-drift'], subcommand: 'codebase-drift', mutation: false },
];

const INIT_COMMAND_ALIASES = [
  { canonical: 'init.execute-phase', aliases: ['init execute-phase'], subcommand: 'execute-phase', mutation: false },
  { canonical: 'init.plan-phase', aliases: ['init plan-phase'], subcommand: 'plan-phase', mutation: false },
  { canonical: 'init.new-project', aliases: ['init new-project'], subcommand: 'new-project', mutation: false },
  { canonical: 'init.new-milestone', aliases: ['init new-milestone'], subcommand: 'new-milestone', mutation: false },
  { canonical: 'init.quick', aliases: ['init quick'], subcommand: 'quick', mutation: false },
  { canonical: 'init.ingest-docs', aliases: ['init ingest-docs'], subcommand: 'ingest-docs', mutation: false },
  { canonical: 'init.resume', aliases: ['init resume'], subcommand: 'resume', mutation: false },
  { canonical: 'init.verify-work', aliases: ['init verify-work'], subcommand: 'verify-work', mutation: false },
  { canonical: 'init.phase-op', aliases: ['init phase-op'], subcommand: 'phase-op', mutation: false },
  { canonical: 'init.todos', aliases: ['init todos'], subcommand: 'todos', mutation: false },
  { canonical: 'init.milestone-op', aliases: ['init milestone-op'], subcommand: 'milestone-op', mutation: false },
  { canonical: 'init.map-codebase', aliases: ['init map-codebase'], subcommand: 'map-codebase', mutation: false },
  { canonical: 'init.progress', aliases: ['init progress'], subcommand: 'progress', mutation: false },
  { canonical: 'init.manager', aliases: ['init manager'], subcommand: 'manager', mutation: false },
  { canonical: 'init.new-workspace', aliases: ['init new-workspace'], subcommand: 'new-workspace', mutation: false },
  { canonical: 'init.list-workspaces', aliases: ['init list-workspaces'], subcommand: 'list-workspaces', mutation: false },
  { canonical: 'init.remove-workspace', aliases: ['init remove-workspace'], subcommand: 'remove-workspace', mutation: false },
];

const PHASE_COMMAND_ALIASES = [
  { canonical: 'phase.list-plans', aliases: ['phase list-plans'], subcommand: 'list-plans', mutation: false },
  { canonical: 'phase.list-artifacts', aliases: ['phase list-artifacts'], subcommand: 'list-artifacts', mutation: false },
  { canonical: 'phase.next-decimal', aliases: ['phase next-decimal'], subcommand: 'next-decimal', mutation: false },
  { canonical: 'phase.add', aliases: ['phase add'], subcommand: 'add', mutation: true },
  { canonical: 'phase.add-batch', aliases: ['phase add-batch'], subcommand: 'add-batch', mutation: true },
  { canonical: 'phase.insert', aliases: ['phase insert'], subcommand: 'insert', mutation: true },
  { canonical: 'phase.remove', aliases: ['phase remove'], subcommand: 'remove', mutation: true },
  { canonical: 'phase.complete', aliases: ['phase complete'], subcommand: 'complete', mutation: true },
  { canonical: 'phase.scaffold', aliases: ['phase scaffold'], subcommand: 'scaffold', mutation: true },
];

const PHASES_COMMAND_ALIASES = [
  { canonical: 'phases.list', aliases: ['phases list'], subcommand: 'list', mutation: false },
  { canonical: 'phases.clear', aliases: ['phases clear'], subcommand: 'clear', mutation: true },
  { canonical: 'phases.archive', aliases: ['phases archive'], subcommand: 'archive', mutation: true },
];

const VALIDATE_COMMAND_ALIASES = [
  { canonical: 'validate.consistency', aliases: ['validate consistency'], subcommand: 'consistency', mutation: false },
  { canonical: 'validate.health', aliases: ['validate health'], subcommand: 'health', mutation: false },
  { canonical: 'validate.agents', aliases: ['validate agents'], subcommand: 'agents', mutation: false },
  { canonical: 'validate.context', aliases: ['validate context'], subcommand: 'context', mutation: false },
];

const ROADMAP_COMMAND_ALIASES = [
  { canonical: 'roadmap.analyze', aliases: ['roadmap analyze'], subcommand: 'analyze', mutation: false },
  { canonical: 'roadmap.get-phase', aliases: ['roadmap get-phase'], subcommand: 'get-phase', mutation: false },
  { canonical: 'roadmap.update-plan-progress', aliases: ['roadmap update-plan-progress'], subcommand: 'update-plan-progress', mutation: true },
  { canonical: 'roadmap.annotate-dependencies', aliases: ['roadmap annotate-dependencies'], subcommand: 'annotate-dependencies', mutation: true },
];

const STATE_SUBCOMMANDS = STATE_COMMAND_ALIASES.map((entry) => entry.subcommand);
const VERIFY_SUBCOMMANDS = VERIFY_COMMAND_ALIASES.map((entry) => entry.subcommand);
const INIT_SUBCOMMANDS = INIT_COMMAND_ALIASES.map((entry) => entry.subcommand);
const PHASE_SUBCOMMANDS = PHASE_COMMAND_ALIASES.map((entry) => entry.subcommand);
const PHASES_SUBCOMMANDS = PHASES_COMMAND_ALIASES.map((entry) => entry.subcommand);
const VALIDATE_SUBCOMMANDS = VALIDATE_COMMAND_ALIASES.map((entry) => entry.subcommand);
const ROADMAP_SUBCOMMANDS = ROADMAP_COMMAND_ALIASES.map((entry) => entry.subcommand);

module.exports = {
  STATE_COMMAND_ALIASES,
  VERIFY_COMMAND_ALIASES,
  INIT_COMMAND_ALIASES,
  PHASE_COMMAND_ALIASES,
  PHASES_COMMAND_ALIASES,
  VALIDATE_COMMAND_ALIASES,
  ROADMAP_COMMAND_ALIASES,
  STATE_SUBCOMMANDS,
  VERIFY_SUBCOMMANDS,
  INIT_SUBCOMMANDS,
  PHASE_SUBCOMMANDS,
  PHASES_SUBCOMMANDS,
  VALIDATE_SUBCOMMANDS,
  ROADMAP_SUBCOMMANDS,
};
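A sketch of how a router might consume one of these tables — `resolveSubcommand` is a hypothetical helper for illustration, not part of the generated file:

```js
'use strict';

const { STATE_COMMAND_ALIASES } = require('./command-aliases.generated.cjs');

// Hypothetical helper: map a raw CLI token to its manifest entry, so a router
// can branch on entry.subcommand and gate .planning/ writes on entry.mutation.
function resolveSubcommand(table, group, token) {
  return table.find(
    (entry) => entry.subcommand === token
      || entry.aliases.includes(`${group} ${token}`),
  ) || null;
}

const entry = resolveSubcommand(STATE_COMMAND_ALIASES, 'state', 'add-blocker');
// entry.canonical === 'state.add-blocker'; entry.mutation === true
```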
@@ -4,7 +4,8 @@
 const fs = require('fs');
 const path = require('path');
 const { execSync } = require('child_process');
-const { safeReadFile, loadConfig, isGitIgnored, execGit, normalizePhaseName, comparePhaseNum, getArchivedPhaseDirs, generateSlugInternal, getMilestoneInfo, getMilestonePhaseFilter, resolveModelInternal, stripShippedMilestones, extractCurrentMilestone, planningDir, planningPaths, toPosixPath, output, error, findPhaseInternal, extractOneLinerFromBody, getRoadmapPhaseInternal } = require('./core.cjs');
+const { safeReadFile, loadConfig, isGitIgnored, execGit, normalizePhaseName, comparePhaseNum, getArchivedPhaseDirs, generateSlugInternal, getMilestoneInfo, getMilestonePhaseFilter, resolveModelInternal, stripShippedMilestones, extractCurrentMilestone, toPosixPath, output, error, findPhaseInternal, extractOneLinerFromBody, getRoadmapPhaseInternal } = require('./core.cjs');
+const { planningDir, planningPaths } = require('./planning-workspace.cjs');
 const { extractFrontmatter } = require('./frontmatter.cjs');
 const { MODEL_PROFILES } = require('./model-profiles.cjs');
@@ -26,6 +26,7 @@ const VALID_CONFIG_KEYS = new Set([
   'workflow.skip_discuss',
   'workflow.auto_prune_state',
   'workflow.use_worktrees',
+  'workflow.worktree_skip_hooks',
   'workflow.code_review',
   'workflow.code_review_depth',
   'workflow.code_review_command',
@@ -4,7 +4,8 @@

 const fs = require('fs');
 const path = require('path');
-const { output, error, planningDir, withPlanningLock, CONFIG_DEFAULTS, atomicWriteFileSync } = require('./core.cjs');
+const { output, error, ERROR_REASON, CONFIG_DEFAULTS, atomicWriteFileSync } = require('./core.cjs');
+const { planningDir, withPlanningLock } = require('./planning-workspace.cjs');
 const {
   VALID_PROFILES,
   getAgentToModelMapForProfile,

@@ -32,7 +33,7 @@ const CONFIG_KEY_SUGGESTIONS = {
 function validateKnownConfigKeyPath(keyPath) {
   const suggested = CONFIG_KEY_SUGGESTIONS[keyPath];
   if (suggested) {
-    error(`Unknown config key: ${keyPath}. Did you mean ${suggested}?`);
+    error(`Unknown config key: ${keyPath}. Did you mean ${suggested}?`, ERROR_REASON.CONFIG_INVALID_KEY);
   }
 }

@@ -277,7 +278,7 @@ function setConfigValue(cwd, keyPath, parsedValue) {
       config = JSON.parse(fs.readFileSync(configPath, 'utf-8'));
     }
   } catch (err) {
-    error('Failed to read config.json: ' + err.message);
+    error('Failed to read config.json: ' + err.message, ERROR_REASON.CONFIG_PARSE_FAILED);
   }

   // Set nested value using dot notation (e.g., "workflow.research")

@@ -318,7 +319,7 @@ function cmdConfigSet(cwd, keyPath, value, raw) {
   validateKnownConfigKeyPath(keyPath);

   if (!isValidConfigKey(keyPath)) {
-    error(`Unknown config key: "${keyPath}". Valid keys: ${[...VALID_CONFIG_KEYS].sort().join(', ')}, agent_skills.<agent-type>, features.<feature_name>`);
+    error(`Unknown config key: "${keyPath}". Valid keys: ${[...VALID_CONFIG_KEYS].sort().join(', ')}, agent_skills.<agent-type>, features.<feature_name>`, ERROR_REASON.CONFIG_INVALID_KEY);
   }

   // Parse value (handle booleans, numbers, and JSON arrays/objects)
@@ -376,6 +377,15 @@
   output(setConfigValueResult, raw, `${keyPath}=${parsedValue}`);
 }

+/**
+ * Schema-level defaults for well-known config keys.
+ * When a key is absent from config.json and no --default flag was supplied,
+ * cmdConfigGet checks here before emitting "Key not found".
+ */
+const SCHEMA_DEFAULTS = {
+  'context_window': 200000,
+};
+
 function cmdConfigGet(cwd, keyPath, raw, defaultValue) {
   const configPath = path.join(planningDir(cwd), 'config.json');
   const hasDefault = defaultValue !== undefined;
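Distilled, the lookup order cmdConfigGet now implements looks roughly like this — a sketch of the fallback semantics, not the actual function body:

```js
'use strict';

// Mirrors the table above; config is a parsed config.json object.
const SCHEMA_DEFAULTS = { context_window: 200000 };

function getWithSchemaDefault(config, keyPath, defaultValue) {
  const val = keyPath.split('.').reduce(
    (cur, key) => (cur && typeof cur === 'object' ? cur[key] : undefined),
    config,
  );
  if (val !== undefined) return val;                    // 1. explicit config wins
  if (defaultValue !== undefined) return defaultValue;  // 2. caller's --default
  if (Object.prototype.hasOwnProperty.call(SCHEMA_DEFAULTS, keyPath)) {
    return SCHEMA_DEFAULTS[keyPath];                    // 3. schema-level default
  }
  throw new Error(`Key not found: ${keyPath}`);         // 4. typed error in the CLI
}

console.log(getWithSchemaDefault({}, 'context_window')); // → 200000
```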
@@ -392,11 +402,11 @@
       output(defaultValue, raw, String(defaultValue));
       return;
     } else {
-      error('No config.json found at ' + configPath);
+      error('No config.json found at ' + configPath, ERROR_REASON.CONFIG_NO_FILE);
     }
   } catch (err) {
     if (err.message.startsWith('No config.json')) throw err;
-    error('Failed to read config.json: ' + err.message);
+    error('Failed to read config.json: ' + err.message, ERROR_REASON.CONFIG_PARSE_FAILED);
   }

   // Traverse dot-notation path (e.g., "workflow.auto_advance")

@@ -405,14 +415,24 @@
   for (const key of keys) {
     if (current === undefined || current === null || typeof current !== 'object') {
       if (hasDefault) { output(defaultValue, raw, String(defaultValue)); return; }
-      error(`Key not found: ${keyPath}`);
+      if (Object.prototype.hasOwnProperty.call(SCHEMA_DEFAULTS, keyPath)) {
+        const def = SCHEMA_DEFAULTS[keyPath];
+        output(def, raw, String(def));
+        return;
+      }
+      error(`Key not found: ${keyPath}`, ERROR_REASON.CONFIG_KEY_NOT_FOUND);
     }
     current = current[key];
   }

   if (current === undefined) {
     if (hasDefault) { output(defaultValue, raw, String(defaultValue)); return; }
-    error(`Key not found: ${keyPath}`);
+    if (Object.prototype.hasOwnProperty.call(SCHEMA_DEFAULTS, keyPath)) {
+      const def = SCHEMA_DEFAULTS[keyPath];
+      output(def, raw, String(def));
+      return;
+    }
+    error(`Key not found: ${keyPath}`, ERROR_REASON.CONFIG_KEY_NOT_FOUND);
   }

   // Never echo plaintext for sensitive keys via config-get. Plaintext lives
@@ -5,37 +5,17 @@
 const fs = require('fs');
 const os = require('os');
 const path = require('path');
 const crypto = require('crypto');
 const { execSync, execFileSync, spawnSync } = require('child_process');
 const { MODEL_PROFILES } = require('./model-profiles.cjs');

-const WORKSTREAM_SESSION_ENV_KEYS = [
-  'GSD_SESSION_KEY',
-  'CODEX_THREAD_ID',
-  'CLAUDE_SESSION_ID',
-  'CLAUDE_CODE_SSE_PORT',
-  'OPENCODE_SESSION_ID',
-  'GEMINI_SESSION_ID',
-  'CURSOR_SESSION_ID',
-  'WINDSURF_SESSION_ID',
-  'TERM_SESSION_ID',
-  'WT_SESSION',
-  'TMUX_PANE',
-  'ZELLIJ_SESSION_NAME',
-];
-
-let cachedControllingTtyToken = null;
-let didProbeControllingTtyToken = false;
-
-// Track all .planning/.lock files held by this process so they can be removed
-// on exit. process.on('exit') fires even on process.exit(1), unlike try/finally
-// which is skipped when error() calls process.exit(1) inside a locked region (#1916).
-const _heldPlanningLocks = new Set();
-process.on('exit', () => {
-  for (const lockPath of _heldPlanningLocks) {
-    try { fs.unlinkSync(lockPath); } catch { /* already gone */ }
-  }
-});
+// Compatibility shim: new imports should use planning-workspace.cjs directly.
+const {
+  planningDir,
+  planningRoot,
+  planningPaths,
+  withPlanningLock,
+  getActiveWorkstream,
+  setActiveWorkstream,
+} = require('./planning-workspace.cjs');

 // ─── Path helpers ────────────────────────────────────────────────────────────
@@ -221,8 +201,68 @@ function output(result, raw, rawValue) {
   fs.writeSync(1, data);
 }

-function error(message) {
-  fs.writeSync(2, 'Error: ' + message + '\n');
+/**
+ * Frozen enum of typed reason codes used by error() for structured errors.
+ * Each subcommand contributes its own codes; the enum exists so tests can
+ * assert against typed values instead of grepping stderr (#2974).
+ *
+ * Adding a new code:
+ *   - Pick a snake_case lowercase value (the JSON wire form)
+ *   - Group by subsystem prefix (CONFIG_*, SDK_*, etc)
+ *   - Pass it to error(msg, ERROR_REASON.NEW_CODE) at the call site
+ */
+const ERROR_REASON = Object.freeze({
+  // config-get / config-set
+  CONFIG_KEY_NOT_FOUND: 'config_key_not_found',
+  CONFIG_NO_FILE: 'config_no_file',
+  CONFIG_PARSE_FAILED: 'config_parse_failed',
+  CONFIG_INVALID_KEY: 'config_invalid_key',
+  // SDK / gsd-tools dispatch
+  SDK_FAIL_FAST: 'sdk_fail_fast',
+  SDK_UNKNOWN_COMMAND: 'sdk_unknown_command',
+  SDK_MISSING_ARG: 'sdk_missing_arg',
+  // workflow / phase
+  PHASE_NOT_FOUND: 'phase_not_found',
+  SUMMARY_NO_PLANNING: 'summary_no_planning',
+  // graphify
+  GRAPHIFY_NO_GRAPH: 'graphify_no_graph',
+  GRAPHIFY_INVALID_QUERY: 'graphify_invalid_query',
+  // hooks
+  HOOKS_OPT_OUT: 'hooks_opt_out',
+  // security-scan
+  SECURITY_SCAN_FAILED: 'security_scan_failed',
+  // generic
+  USAGE: 'usage',
+  UNKNOWN: 'unknown',
+});
+
+/**
+ * Process-level flag: when true, error() emits structured JSON to stderr
+ * instead of plain "Error: <message>" text. Set by gsd-tools.cjs when the
+ * CLI is invoked with `--json-errors`. Tests opt in to typed-IR error
+ * assertions by passing that flag and parsing the JSON.
+ *
+ * Default off so existing callers and human operators keep their plain-text
+ * diagnostics. The structured form is opt-in for tooling and tests (#2974).
+ */
+let _jsonErrorMode = false;
+function setJsonErrorMode(v) { _jsonErrorMode = !!v; }
+function getJsonErrorMode() { return _jsonErrorMode; }
+
+/**
+ * Emit an error and exit. When the second argument is provided it must be
+ * a value from ERROR_REASON; tests can assert on `result.reason`. When the
+ * process is in JSON-error mode, stderr receives `{ ok: false, reason,
+ * message }` so callers can parse it; otherwise stderr keeps the plain
+ * text form for human operators.
+ */
+function error(message, reason = ERROR_REASON.UNKNOWN) {
+  if (_jsonErrorMode) {
+    const payload = JSON.stringify({ ok: false, reason, message }) + '\n';
+    fs.writeSync(2, payload);
+  } else {
+    fs.writeSync(2, 'Error: ' + message + '\n');
+  }
   process.exit(1);
 }
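Following the enum's own guidance, a call-site sketch — the guard condition is a stand-in, not a real core.cjs helper:

```js
'use strict';

const core = require('./lib/core.cjs');
const { error, ERROR_REASON } = core;

// Mirror the CLI: structured stderr only when the caller opted in.
core.setJsonErrorMode(process.argv.includes('--json-errors'));

const phaseExists = false; // stand-in for a real lookup
if (!phaseExists) {
  // Exits 1. In JSON mode stderr carries
  //   {"ok":false,"reason":"phase_not_found","message":"Phase 12 does not exist"}
  // and in plain mode it stays "Error: Phase 12 does not exist".
  error('Phase 12 does not exist', ERROR_REASON.PHASE_NOT_FOUND);
}
```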
@@ -804,304 +844,7 @@ function pruneOrphanedWorktrees(repoRoot) {
   return pruned;
 }

-/**
- * Acquire a file-based lock for .planning/ writes.
- * Prevents concurrent worktrees from corrupting shared planning files.
- * Lock is auto-released after the callback completes.
- */
-function withPlanningLock(cwd, fn) {
-  const lockPath = path.join(planningDir(cwd), '.lock');
-  const lockTimeout = 10000; // 10 seconds
-  const retryDelay = 100;
-  const start = Date.now();
-
-  // Ensure .planning/ exists
-  try { fs.mkdirSync(planningDir(cwd), { recursive: true }); } catch { /* ok */ }
-
-  while (Date.now() - start < lockTimeout) {
-    try {
-      // Atomic create — fails if file exists
-      fs.writeFileSync(lockPath, JSON.stringify({
-        pid: process.pid,
-        cwd,
-        acquired: new Date().toISOString(),
-      }), { flag: 'wx' });
-
-      // Register for exit-time cleanup so process.exit(1) inside a locked region
-      // cannot leave a stale lock file (#1916).
-      _heldPlanningLocks.add(lockPath);
-
-      // Lock acquired — run the function
-      try {
-        return fn();
-      } finally {
-        _heldPlanningLocks.delete(lockPath);
-        try { fs.unlinkSync(lockPath); } catch { /* already released */ }
-      }
-    } catch (err) {
-      if (err.code === 'EEXIST') {
-        // Lock exists — check if stale (>30s old)
-        try {
-          const stat = fs.statSync(lockPath);
-          if (Date.now() - stat.mtimeMs > 30000) {
-            fs.unlinkSync(lockPath);
-            continue; // retry
-          }
-        } catch { continue; }
-
-        // Wait and retry (cross-platform, no shell dependency)
-        Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, 100);
-        continue;
-      }
-      throw err;
-    }
-  }
-  // Timeout — force acquire (stale lock recovery)
-  try { fs.unlinkSync(lockPath); } catch { /* ok */ }
-  return fn();
-}
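The implementation moves verbatim into planning-workspace.cjs (see the compatibility shim earlier in this diff), so callers still wrap critical sections the same way. A minimal usage sketch; the STATE.md append is illustrative:

```js
'use strict';

const fs = require('fs');
const path = require('path');
const { planningDir, withPlanningLock } = require('./planning-workspace.cjs');

// Two concurrent worktrees both funnel through the same .planning/.lock,
// so this read-modify-write cannot interleave with a sibling's.
const cwd = process.cwd();
withPlanningLock(cwd, () => {
  const statePath = path.join(planningDir(cwd), 'STATE.md');
  const state = fs.existsSync(statePath) ? fs.readFileSync(statePath, 'utf-8') : '';
  fs.writeFileSync(statePath, state + '\n<!-- touched under lock -->\n');
});
```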
-/**
- * Get the .planning directory path, project- and workstream-aware.
- *
- * Resolution order:
- * 1. If GSD_PROJECT is set (env var or explicit `project` arg), routes to
- *    `.planning/{project}/` — supports multi-project workspaces where several
- *    independent projects share a single `.planning/` root directory (e.g.,
- *    an Obsidian vault or monorepo knowledge base used as a command center).
- * 2. If GSD_WORKSTREAM is set, routes to `.planning/workstreams/{ws}/`.
- * 3. Otherwise returns `.planning/`.
- *
- * GSD_PROJECT and GSD_WORKSTREAM can be combined:
- *   `.planning/{project}/workstreams/{ws}/`
- *
- * @param {string} cwd - project root
- * @param {string} [ws] - explicit workstream name; if omitted, checks GSD_WORKSTREAM env var
- * @param {string} [project] - explicit project name; if omitted, checks GSD_PROJECT env var
- */
-function planningDir(cwd, ws, project) {
-  if (project === undefined) project = process.env.GSD_PROJECT || null;
-  if (ws === undefined) ws = process.env.GSD_WORKSTREAM || null;
-
-  // Reject path separators and traversal components in project/workstream names
-  const BAD_SEGMENT = /[/\\]|\.\./;
-  if (project && BAD_SEGMENT.test(project)) {
-    throw new Error(`GSD_PROJECT contains invalid path characters: ${project}`);
-  }
-  if (ws && BAD_SEGMENT.test(ws)) {
-    throw new Error(`GSD_WORKSTREAM contains invalid path characters: ${ws}`);
-  }
-
-  let base = path.join(cwd, '.planning');
-  if (project) base = path.join(base, project);
-  if (ws) base = path.join(base, 'workstreams', ws);
-  return base;
-}
-
-/** Always returns the root .planning/ path, ignoring workstreams and projects. For shared resources. */
-function planningRoot(cwd) {
-  return path.join(cwd, '.planning');
-}
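Concretely, the resolution order plays out like this. The project and workstream names are illustrative, and `planningDir` now lives in planning-workspace.cjs:

```js
'use strict';

const { planningDir } = require('./planning-workspace.cjs');

// Assuming GSD_PROJECT / GSD_WORKSTREAM are unset in the environment:
console.log(planningDir('/repo'));                 // /repo/.planning

// Explicit args override the env vars (null skips the env lookup entirely).
console.log(planningDir('/repo', 'auth-rework'));  // /repo/.planning/workstreams/auth-rework
console.log(planningDir('/repo', null, 'vault'));  // /repo/.planning/vault
console.log(planningDir('/repo', 'auth-rework', 'vault'));
// → /repo/.planning/vault/workstreams/auth-rework

// Traversal-shaped names are rejected outright:
try {
  planningDir('/repo', '../escape');
} catch (e) {
  console.error(e.message); // GSD_WORKSTREAM contains invalid path characters: ../escape
}
```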
/**
|
||||
* Get common .planning file paths, project-and-workstream-aware.
|
||||
*
|
||||
* All paths route through planningDir(cwd, ws), which honors the GSD_PROJECT
|
||||
* env var and active workstream. This matches loadConfig() above (line 256),
|
||||
* which has always read config.json via planningDir(cwd). Previously project
|
||||
* and config were resolved against the unrouted .planning/ root, which broke
|
||||
* `gsd-tools config-get` in multi-project layouts (the CRUD writers and the
|
||||
* reader pointed at different files).
|
||||
*/
|
||||
function planningPaths(cwd, ws) {
|
||||
const base = planningDir(cwd, ws);
|
||||
return {
|
||||
    planning: base,
    state: path.join(base, 'STATE.md'),
    roadmap: path.join(base, 'ROADMAP.md'),
    project: path.join(base, 'PROJECT.md'),
    config: path.join(base, 'config.json'),
    phases: path.join(base, 'phases'),
    requirements: path.join(base, 'REQUIREMENTS.md'),
  };
}

// ─── Active Workstream Detection ─────────────────────────────────────────────

function sanitizeWorkstreamSessionToken(value) {
  if (value === null || value === undefined) return null;
  const token = String(value).trim().replace(/[^a-zA-Z0-9._-]+/g, '_').replace(/^_+|_+$/g, '');
  return token ? token.slice(0, 160) : null;
}

function probeControllingTtyToken() {
  if (didProbeControllingTtyToken) return cachedControllingTtyToken;
  didProbeControllingTtyToken = true;

  // `tty` reads stdin. When stdin is already non-interactive, spawning it only
  // adds avoidable failures on the routing hot path and cannot reveal a stable token.
  if (!(process.stdin && process.stdin.isTTY)) {
    return cachedControllingTtyToken;
  }

  try {
    const ttyPath = execFileSync('tty', [], {
      encoding: 'utf-8',
      stdio: ['inherit', 'pipe', 'ignore'],
    }).trim();
    if (ttyPath && ttyPath !== 'not a tty') {
      const token = sanitizeWorkstreamSessionToken(ttyPath.replace(/^\/dev\//, ''));
      if (token) cachedControllingTtyToken = `tty-${token}`;
    }
  } catch {}

  return cachedControllingTtyToken;
}

function getControllingTtyToken() {
  for (const envKey of ['TTY', 'SSH_TTY']) {
    const token = sanitizeWorkstreamSessionToken(process.env[envKey]);
    if (token) return `tty-${token.replace(/^dev_/, '')}`;
  }

  return probeControllingTtyToken();
}

/**
 * Resolve a deterministic session key for workstream-local routing.
 *
 * Order:
 * 1. Explicit runtime/session env vars (`GSD_SESSION_KEY`, `CODEX_THREAD_ID`, etc.)
 * 2. Terminal identity exposed via `TTY` or `SSH_TTY`
 * 3. One best-effort `tty` probe when stdin is interactive
 * 4. `null`, which tells callers to use the legacy shared pointer fallback
 */
function getWorkstreamSessionKey() {
  for (const envKey of WORKSTREAM_SESSION_ENV_KEYS) {
    const raw = process.env[envKey];
    const token = sanitizeWorkstreamSessionToken(raw);
    if (token) return `${envKey.toLowerCase().replace(/[^a-z0-9]+/g, '-')}-${token}`;
  }

  return getControllingTtyToken();
}

function getSessionScopedWorkstreamFile(cwd) {
  const sessionKey = getWorkstreamSessionKey();
  if (!sessionKey) return null;

  // Use realpathSync.native so the hash is derived from the canonical filesystem
  // path. On Windows, path.resolve returns whatever case the caller supplied,
  // while realpathSync.native returns the case the OS recorded — they differ on
  // case-insensitive NTFS, producing different hashes and different tmpdir slots.
  // Fall back to path.resolve when the directory does not yet exist.
  let planningAbs;
  try {
    planningAbs = fs.realpathSync.native(planningRoot(cwd));
  } catch {
    planningAbs = path.resolve(planningRoot(cwd));
  }
  const projectId = crypto
    .createHash('sha1')
    .update(planningAbs)
    .digest('hex')
    .slice(0, 16);

  const dirPath = path.join(os.tmpdir(), 'gsd-workstream-sessions', projectId);
  return {
    sessionKey,
    dirPath,
    filePath: path.join(dirPath, sessionKey),
  };
}

function clearActiveWorkstreamPointer(filePath, cleanupDirPath) {
  try { fs.unlinkSync(filePath); } catch {}

  // Session-scoped pointers for a repo share one tmp directory. Only remove it
  // when it is empty so clearing or self-healing one session never deletes siblings.
  // Explicitly check remaining entries rather than relying on rmdirSync throwing
  // ENOTEMPTY — that error is not raised reliably on Windows.
  if (cleanupDirPath) {
    try {
      const remaining = fs.readdirSync(cleanupDirPath);
      if (remaining.length === 0) {
        fs.rmdirSync(cleanupDirPath);
      }
    } catch {}
  }
}

/**
 * Pointer files are self-healing: invalid names or deleted-workstream pointers
 * are removed on read so the session falls back to `null` instead of carrying
 * silent stale state forward. Session-scoped callers may also prune an empty
 * per-project tmp directory; shared `.planning/active-workstream` callers do not.
 */
function readActiveWorkstreamPointer(filePath, cwd, cleanupDirPath = null) {
  try {
    const name = fs.readFileSync(filePath, 'utf-8').trim();
    if (!name || !/^[a-zA-Z0-9_-]+$/.test(name)) {
      clearActiveWorkstreamPointer(filePath, cleanupDirPath);
      return null;
    }
    const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
    if (!fs.existsSync(wsDir)) {
      clearActiveWorkstreamPointer(filePath, cleanupDirPath);
      return null;
    }
    return name;
  } catch {
    return null;
  }
}

/**
 * Get the active workstream name.
 *
 * Resolution priority:
 * 1. Session-scoped pointer (tmpdir) when the runtime exposes a stable session key
 * 2. Legacy shared `.planning/active-workstream` file when no session key is available
 *
 * The shared file is intentionally ignored when a session key exists so multiple
 * concurrent sessions do not overwrite each other's active workstream.
 */
function getActiveWorkstream(cwd) {
  const sessionScoped = getSessionScopedWorkstreamFile(cwd);
  if (sessionScoped) {
    return readActiveWorkstreamPointer(sessionScoped.filePath, cwd, sessionScoped.dirPath);
  }

  const sharedFilePath = path.join(planningRoot(cwd), 'active-workstream');
  return readActiveWorkstreamPointer(sharedFilePath, cwd);
}

/**
 * Set the active workstream. Pass null to clear.
 *
 * When a stable session key is available, this updates a tmpdir-backed
 * session-scoped pointer. Otherwise it falls back to the legacy shared
 * `.planning/active-workstream` file for backward compatibility.
 */
function setActiveWorkstream(cwd, name) {
  const sessionScoped = getSessionScopedWorkstreamFile(cwd);
  const filePath = sessionScoped
    ? sessionScoped.filePath
    : path.join(planningRoot(cwd), 'active-workstream');

  if (!name) {
    clearActiveWorkstreamPointer(filePath, sessionScoped ? sessionScoped.dirPath : null);
    return;
  }
  if (!/^[a-zA-Z0-9_-]+$/.test(name)) {
    throw new Error('Invalid workstream name: must be alphanumeric, hyphens, and underscores only');
  }

  if (sessionScoped) {
    fs.mkdirSync(sessionScoped.dirPath, { recursive: true });
  }
  fs.writeFileSync(filePath, name + '\n', 'utf-8');
}
// ─── Planning workspace (pathing + active workstream + lock) moved to planning-workspace.cjs ───

// ─── Phase utilities ──────────────────────────────────────────────────────────
@@ -1610,6 +1353,16 @@ const RUNTIME_PROFILE_MAP = {
     sonnet: { model: 'claude-sonnet-4-6' },
     haiku: { model: 'claude-haiku-4-5' },
   },
+  hermes: {
+    // Hermes Agent is provider-agnostic; users pick any provider in ~/.hermes/config.yaml.
+    // Defaults use OpenRouter slugs because (a) OpenRouter is Hermes' default provider and
+    // (b) the same slugs resolve on OpenRouter, native Anthropic, and Copilot via Hermes'
+    // aggregator-aware resolver. Users on a different provider override per-tier via
+    // model_profile_overrides.hermes.{opus,sonnet,haiku} in .planning/config.json.
+    opus: { model: 'anthropic/claude-opus-4-7' },
+    sonnet: { model: 'anthropic/claude-sonnet-4-6' },
+    haiku: { model: 'anthropic/claude-haiku-4-5' },
+  },
 };

 const RUNTIMES_WITH_REASONING_EFFORT = new Set(['codex']);
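For a user on a non-OpenRouter provider, the per-tier override named in that comment would look roughly like this in `.planning/config.json`. This is a sketch: the key path comes from the comment above, but the slugs and exact schema are illustrative assumptions, not shipped defaults.

```json
{
  "model_profile_overrides": {
    "hermes": {
      "opus":   { "model": "claude-opus-4-7" },
      "sonnet": { "model": "claude-sonnet-4-6" },
      "haiku":  { "model": "claude-haiku-4-5" }
    }
  }
}
```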
@@ -1632,7 +1385,7 @@ const RUNTIME_OVERRIDE_TIERS = new Set(['opus', 'sonnet', 'haiku']);
 const KNOWN_RUNTIMES = new Set([
   'claude', 'codex', 'opencode', 'kilo', 'gemini', 'qwen',
   'copilot', 'cursor', 'windsurf', 'augment', 'trae', 'codebuddy',
-  'antigravity', 'cline',
+  'antigravity', 'cline', 'hermes',
 ]);

 const _warnedConfigKeys = new Set();
@@ -2123,6 +1876,9 @@ function timeAgo(date) {
 module.exports = {
   output,
   error,
+  ERROR_REASON,
+  setJsonErrorMode,
+  getJsonErrorMode,
   safeReadFile,
   loadConfig,
   isGitIgnored,
@@ -2155,6 +1911,7 @@ module.exports = {
   toPosixPath,
   extractOneLinerFromBody,
   resolveWorktreeRoot,
+  // Deprecated re-exports — prefer direct import from planning-workspace.cjs
   withPlanningLock,
   findProjectRoot,
   detectSubRepos,
@@ -16,7 +16,8 @@

 const fs = require('fs');
 const path = require('path');
-const { planningPaths, planningDir, escapeRegex, output, error } = require('./core.cjs');
+const { escapeRegex, output, error } = require('./core.cjs');
+const { planningPaths, planningDir } = require('./planning-workspace.cjs');
 const { parseDecisions } = require('./decisions.cjs');

 /**
@@ -30,7 +31,10 @@ function parseRequirements(reqMd) {
   const out = [];
   const seen = new Set();

-  const checkboxRe = /^\s*-\s*\[[x ]\]\s*\*\*(REQ-[A-Za-z0-9_-]+)\*\*\s*(.*)$/gm;
+  // Prefix-agnostic ID format: REQ-01, TST-01, BACK-07, INSP-04, etc.
+  const ID_PATTERN = '[A-Z][A-Z0-9]*-[A-Za-z0-9_-]+';
+
+  const checkboxRe = new RegExp(`^\\s*-\\s*\\[[x ]\\]\\s*\\*\\*(${ID_PATTERN})\\*\\*\\s*(.*)$`, 'gm');
   let cm = checkboxRe.exec(reqMd);
   while (cm !== null) {
     const id = cm[1];
@@ -41,15 +45,25 @@ function parseRequirements(reqMd) {
     cm = checkboxRe.exec(reqMd);
   }

-  const tableRe = /\|\s*(REQ-[A-Za-z0-9_-]+)\s*\|/g;
-  let tm = tableRe.exec(reqMd);
-  while (tm !== null) {
+  const tableFirstCellRe = new RegExp(`^\\s*\\|\\s*(${ID_PATTERN})\\s*\\|`);
+  const separatorRowRe = /^\s*\|[\s:|-]+\|\s*$/;
+  const lines = reqMd.split(/\r?\n/);
+
+  for (let i = 0; i < lines.length; i += 1) {
+    const line = lines[i];
+    if (!line.includes('|')) continue;
+
+    // Skip markdown table separator rows and header rows immediately preceding them.
+    if (separatorRowRe.test(line)) continue;
+    if (i + 1 < lines.length && separatorRowRe.test(lines[i + 1])) continue;
+
+    const tm = tableFirstCellRe.exec(line);
+    if (!tm) continue;
     const id = tm[1];
     if (!seen.has(id)) {
       seen.add(id);
       out.push({ id, text: '' });
     }
-    tm = tableRe.exec(reqMd);
   }

   return out;
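As a sketch of what the reworked matchers accept and skip (illustrative input, not a repository fixture):

```js
const reqMd = [
  '- [x] **REQ-01** Parse checkbox requirements',
  '- [ ] **TST-04** Prefix-agnostic IDs also match',
  '| ID | Description |',          // header row: skipped because a separator row follows
  '| --- | --- |',                 // separator row: skipped outright
  '| BACK-07 | Table IDs come from the first cell |',
].join('\n');

parseRequirements(reqMd);
// → [ { id: 'REQ-01', text: 'Parse checkbox requirements' },
//     { id: 'TST-04', text: 'Prefix-agnostic IDs also match' },
//     { id: 'BACK-07', text: '' } ]
```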
@@ -45,6 +45,17 @@ function disabledResponse() {
  * @param {{ timeout?: number }} [options={}] - Options (timeout in ms, default 30000)
  * @returns {{ exitCode: number, stdout: string, stderr: string }}
  */
+/**
+ * Frozen enum of typed reason codes for execGraphify failures (#2974).
+ * Tests assert on result.reason instead of grepping stderr text.
+ */
+const GRAPHIFY_REASON = Object.freeze({
+  OK: 'ok',
+  ENOENT: 'graphify_not_found',
+  TIMEOUT: 'graphify_timed_out',
+  EXIT_NONZERO: 'graphify_exit_nonzero',
+});
+
 function execGraphify(cwd, args, options = {}) {
   const timeout = options.timeout ?? 30000;
   const result = childProcess.spawnSync('graphify', args, {
@@ -57,7 +68,12 @@ function execGraphify(cwd, args, options = {}) {

   // ENOENT -- graphify binary not found on PATH
   if (result.error && result.error.code === 'ENOENT') {
-    return { exitCode: 127, stdout: '', stderr: 'graphify not found on PATH' };
+    return {
+      exitCode: 127,
+      stdout: '',
+      stderr: 'graphify not found on PATH',
+      reason: GRAPHIFY_REASON.ENOENT,
+    };
   }

   // Timeout -- subprocess killed via SIGTERM
@@ -66,13 +82,17 @@ function execGraphify(cwd, args, options = {}) {
       exitCode: 124,
       stdout: (result.stdout ?? '').toString().trim(),
       stderr: 'graphify timed out after ' + timeout + 'ms',
+      reason: GRAPHIFY_REASON.TIMEOUT,
+      timeout_ms: timeout,
     };
   }

+  const exitCode = result.status ?? 1;
   return {
-    exitCode: result.status ?? 1,
+    exitCode,
     stdout: (result.stdout ?? '').toString().trim(),
     stderr: (result.stderr ?? '').toString().trim(),
+    reason: exitCode === 0 ? GRAPHIFY_REASON.OK : GRAPHIFY_REASON.EXIT_NONZERO,
   };
 }

@@ -504,6 +524,7 @@ module.exports = {
   disabledResponse,
   // Subprocess
   execGraphify,
+  GRAPHIFY_REASON,
   // Presence and version
   checkGraphifyInstalled,
   checkGraphifyVersion,
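The enum enables assertions like the following sketch (node:test; the module path and the empty-PATH trick for forcing ENOENT are assumptions, and the empty-PATH behaviour is POSIX-specific):

```js
const test = require('node:test');
const assert = require('node:assert');
const { execGraphify, GRAPHIFY_REASON } = require('./graphify.cjs'); // path assumed

test('missing graphify binary yields a typed reason', () => {
  const oldPath = process.env.PATH;
  process.env.PATH = ''; // spawn of a bare command name now fails with ENOENT
  try {
    const result = execGraphify(process.cwd(), ['--version']);
    assert.strictEqual(result.reason, GRAPHIFY_REASON.ENOENT); // not /not found/.test(result.stderr)
    assert.strictEqual(result.exitCode, 127);
  } finally {
    process.env.PATH = oldPath;
  }
});
```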
get-shit-done/bin/lib/init-command-router.cjs (new file, 70 lines)
@@ -0,0 +1,70 @@
'use strict';

const { INIT_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

function routeInitCommand({ init, args, cwd, raw, parseNamedArgs, error }) {
  const workflow = args[1];
  switch (workflow) {
    case 'execute-phase': {
      const { validate: epValidate, tdd: epTdd } = parseNamedArgs(args, [], ['validate', 'tdd']);
      init.cmdInitExecutePhase(cwd, args[2], raw, { validate: epValidate, tdd: epTdd });
      break;
    }
    case 'plan-phase': {
      const { validate: ppValidate, tdd: ppTdd } = parseNamedArgs(args, [], ['validate', 'tdd']);
      init.cmdInitPlanPhase(cwd, args[2], raw, { validate: ppValidate, tdd: ppTdd });
      break;
    }
    case 'new-project':
      init.cmdInitNewProject(cwd, raw);
      break;
    case 'new-milestone':
      init.cmdInitNewMilestone(cwd, raw);
      break;
    case 'quick':
      init.cmdInitQuick(cwd, args.slice(2).join(' '), raw);
      break;
    case 'ingest-docs':
      init.cmdInitIngestDocs(cwd, raw);
      break;
    case 'resume':
      init.cmdInitResume(cwd, raw);
      break;
    case 'verify-work':
      init.cmdInitVerifyWork(cwd, args[2], raw);
      break;
    case 'phase-op':
      init.cmdInitPhaseOp(cwd, args[2], raw);
      break;
    case 'todos':
      init.cmdInitTodos(cwd, args[2], raw);
      break;
    case 'milestone-op':
      init.cmdInitMilestoneOp(cwd, raw);
      break;
    case 'map-codebase':
      init.cmdInitMapCodebase(cwd, raw);
      break;
    case 'progress':
      init.cmdInitProgress(cwd, raw);
      break;
    case 'manager':
      init.cmdInitManager(cwd, raw);
      break;
    case 'new-workspace':
      init.cmdInitNewWorkspace(cwd, raw);
      break;
    case 'list-workspaces':
      init.cmdInitListWorkspaces(cwd, raw);
      break;
    case 'remove-workspace':
      init.cmdInitRemoveWorkspace(cwd, args[2], raw);
      break;
    default:
      error(`Unknown init workflow: ${workflow}\nAvailable: ${INIT_SUBCOMMANDS.join(', ')}`);
  }
}

module.exports = {
  routeInitCommand,
};
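Since the router is dependency-injected, wiring it from the CLI entrypoint looks roughly like this. The `parseNamedArgs` stub below only models the shape the call sites imply (value flags to strings, boolean flags to booleans); the real helper lives in gsd-tools.cjs.

```js
const { routeInitCommand } = require('./lib/init-command-router.cjs');
const init = require('./lib/init.cjs'); // module path assumed

// Illustrative stand-in for gsd-tools' own parseNamedArgs helper.
const parseNamedArgs = (args, valueFlags, boolFlags = []) => {
  const out = {};
  for (const f of valueFlags) {
    const i = args.indexOf(`--${f}`);
    out[f] = i !== -1 ? args[i + 1] : null;
  }
  for (const f of boolFlags) out[f] = args.includes(`--${f}`);
  return out;
};

routeInitCommand({
  init,
  args: ['init', 'execute-phase', '3', '--validate'],
  cwd: process.cwd(),
  raw: false,
  parseNamedArgs,
  error: (msg) => { console.error(msg); process.exit(1); },
});
// → dispatches init.cmdInitExecutePhase(cwd, '3', false, { validate: true, tdd: false })
```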
@@ -5,7 +5,9 @@
 const fs = require('fs');
 const path = require('path');
 const { execSync } = require('child_process');
-const { loadConfig, resolveModelInternal, findPhaseInternal, getRoadmapPhaseInternal, pathExistsInternal, generateSlugInternal, getMilestoneInfo, getMilestonePhaseFilter, stripShippedMilestones, extractCurrentMilestone, normalizePhaseName, planningPaths, planningDir, planningRoot, toPosixPath, output, error, checkAgentsInstalled, phaseTokenMatches } = require('./core.cjs');
+const { loadConfig, resolveModelInternal, findPhaseInternal, getRoadmapPhaseInternal, pathExistsInternal, generateSlugInternal, getMilestoneInfo, getMilestonePhaseFilter, stripShippedMilestones, extractCurrentMilestone, normalizePhaseName, toPosixPath, output, error, checkAgentsInstalled, phaseTokenMatches } = require('./core.cjs');
+const { planningPaths, planningDir, planningRoot } = require('./planning-workspace.cjs');
+const { maskIfSecret } = require('./secrets.cjs');

 // Accept all bold/colon variants of the Requirements header (#2769):
 // **Requirements:** / **Requirements**: / **Requirements** : render the
@@ -724,9 +726,13 @@ function cmdInitPhaseOp(cwd, phase, raw) {
   const result = {
     // Config
     commit_docs: config.commit_docs,
-    brave_search: config.brave_search,
-    firecrawl: config.firecrawl,
-    exa_search: config.exa_search,
+    // #2997: secret config keys may be either booleans (availability flags) or
+    // string API keys (when user did `gsd-tools config-set brave_search XXX`).
+    // Pass booleans through; mask string values so the init bundle never echoes
+    // plaintext credentials. SDK init.ts mirrors this masking.
+    brave_search: typeof config.brave_search === 'string' ? maskIfSecret('brave_search', config.brave_search) : config.brave_search,
+    firecrawl: typeof config.firecrawl === 'string' ? maskIfSecret('firecrawl', config.firecrawl) : config.firecrawl,
+    exa_search: typeof config.exa_search === 'string' ? maskIfSecret('exa_search', config.exa_search) : config.exa_search,

     // Phase info
     phase_found: !!phaseInfo,
@@ -4,7 +4,8 @@

 const fs = require('fs');
 const path = require('path');
-const { escapeRegex, getMilestonePhaseFilter, extractOneLinerFromBody, normalizeMd, planningPaths, output, error, atomicWriteFileSync } = require('./core.cjs');
+const { escapeRegex, getMilestonePhaseFilter, extractOneLinerFromBody, normalizeMd, output, error, atomicWriteFileSync } = require('./core.cjs');
+const { planningPaths } = require('./planning-workspace.cjs');
 const { extractFrontmatter } = require('./frontmatter.cjs');
 const { writeStateMd, stateReplaceFieldWithFallback } = require('./state.cjs');
get-shit-done/bin/lib/phase-command-router.cjs (new file, 49 lines)
@@ -0,0 +1,49 @@
'use strict';

const { PHASE_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

function routePhaseCommand({ phase, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'next-decimal') {
    phase.cmdPhaseNextDecimal(cwd, args[2], raw);
  } else if (subcommand === 'add') {
    let customId = null;
    const descArgs = [];
    for (let i = 2; i < args.length; i++) {
      if (args[i] === '--id' && i + 1 < args.length) {
        customId = args[i + 1];
        i++;
      } else {
        descArgs.push(args[i]);
      }
    }
    phase.cmdPhaseAdd(cwd, descArgs.join(' '), raw, customId);
  } else if (subcommand === 'add-batch') {
    const descFlagIdx = args.indexOf('--descriptions');
    let descriptions;
    if (descFlagIdx !== -1 && args[descFlagIdx + 1]) {
      try {
        descriptions = JSON.parse(args[descFlagIdx + 1]);
      } catch {
        error('--descriptions must be a JSON array');
      }
    } else {
      descriptions = args.slice(2).filter(a => a !== '--raw');
    }
    phase.cmdPhaseAddBatch(cwd, descriptions, raw);
  } else if (subcommand === 'insert') {
    phase.cmdPhaseInsert(cwd, args[2], args.slice(3).join(' '), raw);
  } else if (subcommand === 'remove') {
    const forceFlag = args.includes('--force');
    phase.cmdPhaseRemove(cwd, args[2], { force: forceFlag }, raw);
  } else if (subcommand === 'complete') {
    phase.cmdPhaseComplete(cwd, args[2], raw);
  } else {
    error(`Unknown phase subcommand. Available: ${PHASE_SUBCOMMANDS.filter((s) => s !== 'list-plans' && s !== 'list-artifacts' && s !== 'scaffold').join(', ')}`);
  }
}

module.exports = {
  routePhaseCommand,
};
@@ -4,10 +4,52 @@

 const fs = require('fs');
 const path = require('path');
-const { escapeRegex, loadConfig, normalizePhaseName, comparePhaseNum, findPhaseInternal, getArchivedPhaseDirs, generateSlugInternal, getMilestonePhaseFilter, stripShippedMilestones, extractCurrentMilestone, replaceInCurrentMilestone, toPosixPath, planningDir, withPlanningLock, output, error, readSubdirectories, phaseTokenMatches, atomicWriteFileSync } = require('./core.cjs');
+const { escapeRegex, loadConfig, normalizePhaseName, comparePhaseNum, findPhaseInternal, getArchivedPhaseDirs, generateSlugInternal, getMilestonePhaseFilter, stripShippedMilestones, extractCurrentMilestone, replaceInCurrentMilestone, toPosixPath, output, error, readSubdirectories, phaseTokenMatches, atomicWriteFileSync } = require('./core.cjs');
+const { planningDir, withPlanningLock } = require('./planning-workspace.cjs');
 const { extractFrontmatter } = require('./frontmatter.cjs');
 const { writeStateMd, readModifyWriteStateMd, stateExtractField, stateReplaceField, stateReplaceFieldWithFallback, updatePerformanceMetricsSection } = require('./state.cjs');

+// #2893 — strict canonical filter: `{padded_phase}-{NN}-PLAN.md` or `PLAN.md`.
+// Documented in agents/gsd-planner.md (write_phase_prompt step). The wider
+// "looks like a plan but isn't canonical" probe below is used to surface a
+// loud warning instead of silently returning zero plans.
+const isCanonicalPlanFile = (f) => f.endsWith('-PLAN.md') || f === 'PLAN.md';
+
+// Any .md file with PLAN anywhere in the basename — the diagnostic net for
+// catching agent deviations like `01-PLAN-01-foundation.md` (#2893).
+// Excludes derivative files (`-PLAN-OUTLINE.md`, `*.pre-bounce.md`, etc.) that
+// the planner legitimately produces alongside canonical plans.
+const PLAN_OUTLINE_RE = /-PLAN-OUTLINE\.md$/i;
+const PLAN_PRE_BOUNCE_RE = /-PLAN.*\.pre-bounce\.md$/i;
+const looksLikePlanFile = (f) =>
+  /\.md$/i.test(f)
+  && /PLAN/i.test(f)
+  && !PLAN_OUTLINE_RE.test(f)
+  && !PLAN_PRE_BOUNCE_RE.test(f);
+
+/**
+ * Detect plan-shaped files that the canonical filter would reject. Returns
+ * a warning string when offenders exist, else null. Centralised so every
+ * read site (phase-plan-index, phases list --type plans, find-phase) emits
+ * the same message.
+ *
+ * @param {string[]} dirFiles — readdirSync output for one phase directory
+ * @param {string[]} matchedFiles — what the canonical filter accepted
+ * @returns {string|null}
+ */
+function describeNonCanonicalPlans(dirFiles, matchedFiles) {
+  const matched = new Set(matchedFiles);
+  const offenders = dirFiles.filter((f) => looksLikePlanFile(f) && !matched.has(f));
+  if (offenders.length === 0) return null;
+  return (
+    `Found ${offenders.length} plan-shaped file(s) in this phase that don't match the canonical ` +
+    `naming convention "{padded_phase}-{NN}-PLAN.md" (or bare "PLAN.md") and were skipped: ` +
+    offenders.map((f) => `"${f}"`).join(', ') +
+    `. Rename to the canonical form (e.g. "01-01-PLAN.md") so the executor can detect them. ` +
+    `See agents/gsd-planner.md write_phase_prompt step for the full contract.`
+  );
+}
+
 function cmdPhasesList(cwd, options, raw) {
   const phasesDir = path.join(planningDir(cwd), 'phases');
   const { type, phase, includeArchived } = options;
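Concretely, the two filters split example filenames like this (evaluated against the regexes above; the names are illustrative):

```js
isCanonicalPlanFile('01-01-PLAN.md');            // true  (canonical padded form)
isCanonicalPlanFile('PLAN.md');                  // true  (bare form)
isCanonicalPlanFile('01-PLAN-01-foundation.md'); // false (the #2893 agent deviation)

looksLikePlanFile('01-PLAN-01-foundation.md');   // true  (caught by the diagnostic net)
looksLikePlanFile('01-01-PLAN-OUTLINE.md');      // false (legitimate planner derivative)
looksLikePlanFile('01-01-PLAN.pre-bounce.md');   // false (excluded by PLAN_PRE_BOUNCE_RE)
```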
@@ -52,13 +94,18 @@ function cmdPhasesList(cwd, options, raw) {
   // If listing files of a specific type
   if (type) {
     const files = [];
+    const warnings = [];
     for (const dir of dirs) {
       const dirPath = path.join(phasesDir, dir);
       const dirFiles = fs.readdirSync(dirPath);

       let filtered;
       if (type === 'plans') {
-        filtered = dirFiles.filter(f => f.endsWith('-PLAN.md') || f === 'PLAN.md');
+        filtered = dirFiles.filter(isCanonicalPlanFile);
+        // #2893 — surface plan-shaped files the canonical filter rejected
+        // so callers (executor init, etc.) don't silently see zero plans.
+        const w = describeNonCanonicalPlans(dirFiles, filtered);
+        if (w) warnings.push(`${dir}: ${w}`);
       } else if (type === 'summaries') {
         filtered = dirFiles.filter(f => f.endsWith('-SUMMARY.md') || f === 'SUMMARY.md');
       } else {
@@ -73,6 +120,7 @@ function cmdPhasesList(cwd, options, raw) {
       count: files.length,
       phase_dir: phase ? dirs[0].replace(/^\d+(?:\.\d+)*-?/, '') : null,
     };
+    if (warnings.length) result.warning = warnings.join(' | ');
     output(result, raw, files.join('\n'));
     return;
   }
@@ -176,8 +224,10 @@ function cmdFindPhase(cwd, phase, raw) {

   const phaseDir = path.join(phasesDir, match);
   const phaseFiles = fs.readdirSync(phaseDir);
-  const plans = phaseFiles.filter(f => f.endsWith('-PLAN.md') || f === 'PLAN.md').sort();
+  const plans = phaseFiles.filter(isCanonicalPlanFile).sort();
   const summaries = phaseFiles.filter(f => f.endsWith('-SUMMARY.md') || f === 'SUMMARY.md').sort();
+  // #2893 — same diagnostic as phase-plan-index for consistency.
+  const planNamingWarning = describeNonCanonicalPlans(phaseFiles, plans);

   const result = {
     found: true,
@@ -187,6 +237,7 @@ function cmdFindPhase(cwd, phase, raw) {
     plans,
     summaries,
   };
+  if (planNamingWarning) result.warning = planNamingWarning;

   output(result, raw, result.directory);
 } catch {
@@ -229,8 +280,11 @@ function cmdPhasePlanIndex(cwd, phase, raw) {

   // Get all files in phase directory
   const phaseFiles = fs.readdirSync(phaseDir);
-  const planFiles = phaseFiles.filter(f => f.endsWith('-PLAN.md') || f === 'PLAN.md').sort();
+  const planFiles = phaseFiles.filter(isCanonicalPlanFile).sort();
   const summaryFiles = phaseFiles.filter(f => f.endsWith('-SUMMARY.md') || f === 'SUMMARY.md');
+  // #2893 — surface plan-shaped files the canonical filter rejected so a
+  // misnamed plan never silently produces plan_count: 0 at executor init.
+  const planNamingWarning = describeNonCanonicalPlans(phaseFiles, planFiles);

   // Build set of plan IDs with summaries
   const completedPlanIds = new Set(
@@ -305,6 +359,7 @@ function cmdPhasePlanIndex(cwd, phase, raw) {
     incomplete,
     has_checkpoints: hasCheckpoints,
   };
+  if (planNamingWarning) result.warning = planNamingWarning;

   output(result, raw);
 }
get-shit-done/bin/lib/phases-command-router.cjs (new file, 36 lines)
@@ -0,0 +1,36 @@
'use strict';

const { PHASES_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

/**
 * Manifest-backed phases subcommand router.
 * Keeps gsd-tools.cjs thin while preserving current CJS semantics:
 * - list
 * - clear
 *
 * Note: `archive` is currently SDK-only (`phases.archive` handler in SDK query
 * registry). CJS `gsd-tools phases` intentionally supports list/clear only.
 */
function routePhasesCommand({ phase, milestone, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'list') {
    const typeIndex = args.indexOf('--type');
    const phaseIndex = args.indexOf('--phase');
    const options = {
      type: typeIndex !== -1 ? args[typeIndex + 1] : null,
      phase: phaseIndex !== -1 ? args[phaseIndex + 1] : null,
      includeArchived: args.includes('--include-archived'),
    };
    phase.cmdPhasesList(cwd, options, raw);
  } else if (subcommand === 'clear') {
    milestone.cmdPhasesClear(cwd, raw, args.slice(2));
  } else {
    const cjsSupported = PHASES_SUBCOMMANDS.filter((s) => s !== 'archive');
    error(`Unknown phases subcommand. Available: ${cjsSupported.join(', ')}`);
  }
}

module.exports = {
  routePhasesCommand,
};
get-shit-done/bin/lib/planning-workspace.cjs (new file, 371 lines)
@@ -0,0 +1,371 @@
/**
 * Planning Workspace — .planning path resolution + active workstream routing.
 *
 * This module owns the planning workspace seam:
 * - planningDir/planningRoot/planningPaths
 * - active workstream pointer policy (session-scoped > shared)
 * - pointer storage adapters (session/shared/memory)
 */

const fs = require('fs');
const os = require('os');
const path = require('path');
const crypto = require('crypto');
const { execFileSync } = require('child_process');

const WORKSTREAM_SESSION_ENV_KEYS = [
  'GSD_SESSION_KEY',
  'CODEX_THREAD_ID',
  'CLAUDE_SESSION_ID',
  'CLAUDE_CODE_SSE_PORT',
  'OPENCODE_SESSION_ID',
  'GEMINI_SESSION_ID',
  'CURSOR_SESSION_ID',
  'WINDSURF_SESSION_ID',
  'TERM_SESSION_ID',
  'WT_SESSION',
  'TMUX_PANE',
  'ZELLIJ_SESSION_NAME',
];

let cachedControllingTtyToken = null;
let didProbeControllingTtyToken = false;

// Track .planning/.lock files held by this process so they can be removed on exit.
const _heldPlanningLocks = new Set();
process.on('exit', () => {
  for (const lockPath of _heldPlanningLocks) {
    try { fs.unlinkSync(lockPath); } catch { /* already gone */ }
  }
});

function planningDir(cwd, ws, project) {
  if (project === undefined) project = process.env.GSD_PROJECT || null;
  if (ws === undefined) ws = process.env.GSD_WORKSTREAM || null;

  // Reject path separators and traversal components in project/workstream names
  const BAD_SEGMENT = /[/\\]|\.\./;
  if (project && BAD_SEGMENT.test(project)) {
    throw new Error(`GSD_PROJECT contains invalid path characters: ${project}`);
  }
  if (ws && BAD_SEGMENT.test(ws)) {
    throw new Error(`GSD_WORKSTREAM contains invalid path characters: ${ws}`);
  }

  let base = path.join(cwd, '.planning');
  if (project) base = path.join(base, project);
  if (ws) base = path.join(base, 'workstreams', ws);
  return base;
}

function planningRoot(cwd) {
  return path.join(cwd, '.planning');
}

function planningPaths(cwd, ws) {
  const base = planningDir(cwd, ws);
  return {
    planning: base,
    state: path.join(base, 'STATE.md'),
    roadmap: path.join(base, 'ROADMAP.md'),
    project: path.join(base, 'PROJECT.md'),
    config: path.join(base, 'config.json'),
    phases: path.join(base, 'phases'),
    requirements: path.join(base, 'REQUIREMENTS.md'),
  };
}

function sanitizeWorkstreamSessionToken(value) {
  if (value === null || value === undefined) return null;
  const token = String(value).trim().replace(/[^a-zA-Z0-9._-]+/g, '_').replace(/^_+|_+$/g, '');
  return token ? token.slice(0, 160) : null;
}

function probeControllingTtyToken() {
  if (didProbeControllingTtyToken) return cachedControllingTtyToken;
  didProbeControllingTtyToken = true;

  // `tty` reads stdin. When stdin is already non-interactive, spawning it only
  // adds avoidable failures on the routing hot path and cannot reveal a stable token.
  if (!(process.stdin && process.stdin.isTTY)) {
    return cachedControllingTtyToken;
  }

  try {
    const ttyPath = execFileSync('tty', [], {
      encoding: 'utf-8',
      stdio: ['inherit', 'pipe', 'ignore'],
    }).trim();
    if (ttyPath && ttyPath !== 'not a tty') {
      const token = sanitizeWorkstreamSessionToken(ttyPath.replace(/^\/dev\//, ''));
      if (token) cachedControllingTtyToken = `tty-${token}`;
    }
  } catch {}

  return cachedControllingTtyToken;
}

function getControllingTtyToken() {
  for (const envKey of ['TTY', 'SSH_TTY']) {
    const token = sanitizeWorkstreamSessionToken(process.env[envKey]);
    if (token) return `tty-${token.replace(/^dev_/, '')}`;
  }

  return probeControllingTtyToken();
}

function getWorkstreamSessionKey() {
  for (const envKey of WORKSTREAM_SESSION_ENV_KEYS) {
    const raw = process.env[envKey];
    const token = sanitizeWorkstreamSessionToken(raw);
    if (token) return `${envKey.toLowerCase().replace(/[^a-z0-9]+/g, '-')}-${token}`;
  }

  return getControllingTtyToken();
}
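A worked example of the resolution order (hypothetical env values; the outputs follow from sanitizeWorkstreamSessionToken and the key construction above):

```js
// Env vars win over TTY probing; the key is the lowercased env name plus the
// sanitized token (runs of characters outside [a-zA-Z0-9._-] collapse to '_').
process.env.GSD_SESSION_KEY = 'agent 42/main';
getWorkstreamSessionKey();   // → 'gsd-session-key-agent_42_main'

delete process.env.GSD_SESSION_KEY;
process.env.TMUX_PANE = '%3'; // assuming no higher-priority key is set
getWorkstreamSessionKey();   // → 'tmux-pane-3' ('%' maps to '_', then trims off the edge)
```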
function getSessionScopedWorkstreamFile(cwd, fixedSessionKey) {
  const sessionKey = fixedSessionKey || getWorkstreamSessionKey();
  if (!sessionKey) return null;

  // Use realpathSync.native so the hash is derived from the canonical filesystem
  // path. On Windows, path.resolve returns whatever case the caller supplied,
  // while realpathSync.native returns the case the OS recorded — they differ on
  // case-insensitive NTFS, producing different hashes and different tmpdir slots.
  // Fall back to path.resolve when the directory does not yet exist.
  let planningAbs;
  try {
    planningAbs = fs.realpathSync.native(planningRoot(cwd));
  } catch {
    planningAbs = path.resolve(planningRoot(cwd));
  }
  const projectId = crypto
    .createHash('sha1')
    .update(planningAbs)
    .digest('hex')
    .slice(0, 16);

  const dirPath = path.join(os.tmpdir(), 'gsd-workstream-sessions', projectId);
  return {
    sessionKey,
    dirPath,
    filePath: path.join(dirPath, sessionKey),
  };
}

function createSharedPointerAdapter(cwd) {
  const filePath = path.join(planningRoot(cwd), 'active-workstream');
  return {
    read() {
      try {
        return fs.readFileSync(filePath, 'utf-8').trim() || null;
      } catch {
        return null;
      }
    },
    write(name) {
      fs.writeFileSync(filePath, name + '\n', 'utf-8');
    },
    clear() {
      try { fs.unlinkSync(filePath); } catch {}
    },
  };
}

function createSessionScopedPointerAdapter(cwd, fixedSessionKey) {
  const scoped = getSessionScopedWorkstreamFile(cwd, fixedSessionKey);
  if (!scoped) return null;

  return {
    read() {
      try {
        return fs.readFileSync(scoped.filePath, 'utf-8').trim() || null;
      } catch {
        return null;
      }
    },
    write(name) {
      fs.mkdirSync(scoped.dirPath, { recursive: true });
      fs.writeFileSync(scoped.filePath, name + '\n', 'utf-8');
    },
    clear() {
      try { fs.unlinkSync(scoped.filePath); } catch {}
      try {
        const remaining = fs.readdirSync(scoped.dirPath);
        if (remaining.length === 0) {
          fs.rmdirSync(scoped.dirPath);
        }
      } catch {}
    },
  };
}

function createMemoryPointerAdapter(initialName = null) {
  let value = initialName;
  return {
    read() {
      return value;
    },
    write(name) {
      value = name;
    },
    clear() {
      value = null;
    },
  };
}

function pickActiveWorkstreamAdapter(cwd, opts = {}) {
  if (opts.activeWorkstreamAdapter) {
    return opts.activeWorkstreamAdapter;
  }

  const sessionKey = getWorkstreamSessionKey();
  if (sessionKey) {
    if (opts.activeWorkstreamAdapters && opts.activeWorkstreamAdapters.session) {
      return opts.activeWorkstreamAdapters.session;
    }
    return createSessionScopedPointerAdapter(cwd, sessionKey);
  }

  if (opts.activeWorkstreamAdapters && opts.activeWorkstreamAdapters.shared) {
    return opts.activeWorkstreamAdapters.shared;
  }
  return createSharedPointerAdapter(cwd);
}

function validateWorkstreamName(name) {
  return /^[a-zA-Z0-9_-]+$/.test(name);
}

function withPlanningLock(cwd, fn) {
  const lockPath = path.join(planningDir(cwd), '.lock');
  const lockTimeout = 10000; // 10 seconds
  const start = Date.now();

  // Ensure .planning/ exists
  try { fs.mkdirSync(planningDir(cwd), { recursive: true }); } catch { /* ok */ }

  function runWithHeldLock() {
    // Atomic create — fails if file exists
    fs.writeFileSync(lockPath, JSON.stringify({
      pid: process.pid,
      cwd,
      acquired: new Date().toISOString(),
    }), { flag: 'wx' });

    _heldPlanningLocks.add(lockPath);

    // Lock acquired — run the function
    try {
      return fn();
    } finally {
      _heldPlanningLocks.delete(lockPath);
      try { fs.unlinkSync(lockPath); } catch { /* already released */ }
    }
  }

  while (Date.now() - start < lockTimeout) {
    try {
      return runWithHeldLock();
    } catch (err) {
      if (err.code === 'EEXIST') {
        // Lock exists — check if stale (>30s old)
        try {
          const stat = fs.statSync(lockPath);
          if (Date.now() - stat.mtimeMs > 30000) {
            fs.unlinkSync(lockPath);
            continue; // retry
          }
        } catch { continue; }

        // Wait and retry (cross-platform, no shell dependency)
        Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, 100);
        continue;
      }
      throw err;
    }
  }

  // Timeout — stale-lock recovery, then re-acquire atomically before entering critical section.
  try { fs.unlinkSync(lockPath); } catch { /* ok */ }
  return runWithHeldLock();
}
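A usage sketch for the lock (the mutation inside the critical section is illustrative): contenders retry for up to 10 seconds, and a lock file older than 30 seconds is treated as stale and removed.

```js
const result = withPlanningLock('/path/to/repo', () => {
  // Critical section: exclusive across concurrent gsd-tools processes.
  const statePath = path.join(planningDir('/path/to/repo'), 'STATE.md');
  const text = fs.readFileSync(statePath, 'utf-8');
  fs.writeFileSync(statePath, text.replace('Phase: 3', 'Phase: 4')); // illustrative edit
  return 'ok'; // withPlanningLock returns whatever fn returns
});
```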
function createPlanningWorkspace(cwd, opts = {}) {
  return {
    paths: {
      dir(ws, project) {
        return planningDir(cwd, ws, project);
      },
      root() {
        return planningRoot(cwd);
      },
      all(ws) {
        return planningPaths(cwd, ws);
      },
    },
    activeWorkstream: {
      get() {
        const adapter = pickActiveWorkstreamAdapter(cwd, opts);
        if (!adapter) return null;

        const name = adapter.read();
        if (!name || !validateWorkstreamName(name)) {
          adapter.clear();
          return null;
        }

        const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
        if (!fs.existsSync(wsDir)) {
          adapter.clear();
          return null;
        }

        return name;
      },
      set(name) {
        const adapter = pickActiveWorkstreamAdapter(cwd, opts);
        if (!adapter) return;

        if (!name) {
          adapter.clear();
          return;
        }
        if (!validateWorkstreamName(name)) {
          throw new Error('Invalid workstream name: must be alphanumeric, hyphens, and underscores only');
        }

        const wsDir = path.join(planningRoot(cwd), 'workstreams', name);
        fs.mkdirSync(wsDir, { recursive: true });
        adapter.write(name);
      },
      clear() {
        const adapter = pickActiveWorkstreamAdapter(cwd, opts);
        if (!adapter) return;
        adapter.clear();
      },
    },
  };
}

function getActiveWorkstream(cwd) {
  return createPlanningWorkspace(cwd).activeWorkstream.get();
}

function setActiveWorkstream(cwd, name) {
  createPlanningWorkspace(cwd).activeWorkstream.set(name);
}

module.exports = {
  createPlanningWorkspace,
  createSharedPointerAdapter,
  createSessionScopedPointerAdapter,
  createMemoryPointerAdapter,
  planningDir,
  planningRoot,
  planningPaths,
  withPlanningLock,
  getActiveWorkstream,
  setActiveWorkstream,
};
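The adapter seam is what makes the pointer policy testable. A sketch of injecting the in-memory adapter so a test never touches tmpdir or the shared pointer file (directory setup is illustrative):

```js
const os = require('os');
const fs = require('fs');
const path = require('path');
const { createPlanningWorkspace, createMemoryPointerAdapter } = require('./planning-workspace.cjs');

const repo = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-test-'));
fs.mkdirSync(path.join(repo, '.planning', 'workstreams', 'auth-rework'), { recursive: true });

const ws = createPlanningWorkspace(repo, {
  activeWorkstreamAdapter: createMemoryPointerAdapter('auth-rework'),
});
ws.activeWorkstream.get();      // 'auth-rework' (name valid and directory exists)
ws.activeWorkstream.set(null);  // clears the in-memory pointer
ws.activeWorkstream.get();      // null
```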
@@ -776,9 +776,17 @@ function cmdGenerateDevPreferences(cwd, options, raw) {
   }
   template = template.replace(/\{\{stack_preferences\}\}/g, stackBlock);

+  // #2973: v1.39.0's skills-only migration removed the legacy
+  // commands/gsd subdirectory in favor of skills/<skill>/SKILL.md under
+  // the runtime config dir. This writer was missed in the migration
+  // (PR #1540 targeted GSD-shipped command files; dev-preferences is a
+  // runtime-generated user artifact). Default now points at the skills/
+  // location so /gsd-profile-user --refresh stops re-creating the legacy
+  // directory. The path is constructed via path.join (not a literal
+  // string) so the cline-install leaked-path lint does not flag it.
   let outputPath = options.output;
   if (!outputPath) {
-    outputPath = path.join(os.homedir(), '.claude', 'commands', 'gsd', 'dev-preferences.md');
+    outputPath = path.join(os.homedir(), '.claude', 'skills', 'gsd-dev-preferences', 'SKILL.md');
   } else if (!path.isAbsolute(outputPath)) {
     outputPath = path.join(cwd, outputPath);
   }
get-shit-done/bin/lib/roadmap-command-router.cjs (new file, 23 lines)
@@ -0,0 +1,23 @@
'use strict';

const { ROADMAP_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

function routeRoadmapCommand({ roadmap, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'get-phase') {
    roadmap.cmdRoadmapGetPhase(cwd, args[2], raw);
  } else if (subcommand === 'analyze') {
    roadmap.cmdRoadmapAnalyze(cwd, raw);
  } else if (subcommand === 'update-plan-progress') {
    roadmap.cmdRoadmapUpdatePlanProgress(cwd, args[2], raw);
  } else if (subcommand === 'annotate-dependencies') {
    roadmap.cmdRoadmapAnnotateDependencies(cwd, args[2], raw);
  } else {
    error(`Unknown roadmap subcommand. Available: ${ROADMAP_SUBCOMMANDS.join(', ')}`);
  }
}

module.exports = {
  routeRoadmapCommand,
};
@@ -4,7 +4,8 @@

 const fs = require('fs');
 const path = require('path');
-const { escapeRegex, normalizePhaseName, planningPaths, withPlanningLock, output, error, findPhaseInternal, stripShippedMilestones, extractCurrentMilestone, replaceInCurrentMilestone, phaseTokenMatches, atomicWriteFileSync } = require('./core.cjs');
+const { escapeRegex, normalizePhaseName, output, error, findPhaseInternal, stripShippedMilestones, extractCurrentMilestone, replaceInCurrentMilestone, phaseTokenMatches, atomicWriteFileSync } = require('./core.cjs');
+const { planningPaths, withPlanningLock } = require('./planning-workspace.cjs');

 /**
  * Coerce an arbitrary YAML scalar/object into a string for cross-cutting
get-shit-done/bin/lib/state-command-router.cjs (new file, 90 lines)
@@ -0,0 +1,90 @@
'use strict';

const { STATE_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

/**
 * Manifest-backed state subcommand router.
 * Keeps gsd-tools.cjs thin while preserving existing command semantics.
 */
function routeStateCommand({ state, args, cwd, raw, parseNamedArgs, error }) {
  const subcommand = args[1];

  if (subcommand === 'json') {
    state.cmdStateJson(cwd, raw);
  } else if (subcommand === 'update') {
    state.cmdStateUpdate(cwd, args[2], args[3]);
  } else if (subcommand === 'get') {
    state.cmdStateGet(cwd, args[2], raw);
  } else if (subcommand === 'patch') {
    const patches = {};
    for (let i = 2; i < args.length; i += 2) {
      const key = args[i].replace(/^--/, '');
      const value = args[i + 1];
      if (key && value !== undefined) {
        patches[key] = value;
      }
    }
    state.cmdStatePatch(cwd, patches, raw);
  } else if (subcommand === 'advance-plan') {
    state.cmdStateAdvancePlan(cwd, raw);
  } else if (subcommand === 'record-metric') {
    const { phase: p, plan, duration, tasks, files } = parseNamedArgs(args, ['phase', 'plan', 'duration', 'tasks', 'files']);
    state.cmdStateRecordMetric(cwd, { phase: p, plan, duration, tasks, files }, raw);
  } else if (subcommand === 'update-progress') {
    state.cmdStateUpdateProgress(cwd, raw);
  } else if (subcommand === 'add-decision') {
    const { phase: p, summary, 'summary-file': summary_file, rationale, 'rationale-file': rationale_file } = parseNamedArgs(args, ['phase', 'summary', 'summary-file', 'rationale', 'rationale-file']);
    state.cmdStateAddDecision(cwd, { phase: p, summary, summary_file, rationale: rationale || '', rationale_file }, raw);
  } else if (subcommand === 'add-blocker') {
    const { text, 'text-file': text_file } = parseNamedArgs(args, ['text', 'text-file']);
    state.cmdStateAddBlocker(cwd, { text, text_file }, raw);
  } else if (subcommand === 'resolve-blocker') {
    state.cmdStateResolveBlocker(cwd, parseNamedArgs(args, ['text']).text, raw);
  } else if (subcommand === 'record-session') {
    const { 'stopped-at': stopped_at, 'resume-file': resume_file } = parseNamedArgs(args, ['stopped-at', 'resume-file']);
    state.cmdStateRecordSession(cwd, { stopped_at, resume_file: resume_file || 'None' }, raw);
  } else if (subcommand === 'begin-phase') {
    const { phase: p, name, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
    const parsedPlans = plans == null ? null : Number.parseInt(plans, 10);
    if (plans != null && Number.isNaN(parsedPlans)) {
      return error('Invalid --plans value. Expected an integer.');
    }
    state.cmdStateBeginPhase(cwd, p, name, parsedPlans, raw);
  } else if (subcommand === 'signal-waiting') {
    const { type, question, options, phase: p } = parseNamedArgs(args, ['type', 'question', 'options', 'phase']);
    state.cmdSignalWaiting(cwd, type, question, options, p, raw);
  } else if (subcommand === 'signal-resume') {
    state.cmdSignalResume(cwd, raw);
  } else if (subcommand === 'planned-phase') {
    const { phase: p, plans } = parseNamedArgs(args, ['phase', 'name', 'plans']);
    const parsedPlans = plans == null ? null : Number.parseInt(plans, 10);
    if (plans != null && Number.isNaN(parsedPlans)) {
      return error('Invalid --plans value. Expected an integer.');
    }
    state.cmdStatePlannedPhase(cwd, p, parsedPlans, raw);
  } else if (subcommand === 'validate') {
    state.cmdStateValidate(cwd, raw);
  } else if (subcommand === 'sync') {
    const { verify } = parseNamedArgs(args, [], ['verify']);
    state.cmdStateSync(cwd, { verify }, raw);
  } else if (subcommand === 'prune') {
    const { 'keep-recent': keepRecent, 'dry-run': dryRun } = parseNamedArgs(args, ['keep-recent'], ['dry-run']);
    state.cmdStatePrune(cwd, { keepRecent: keepRecent || '3', dryRun: !!dryRun }, raw);
  } else if (subcommand === 'complete-phase') {
    state.cmdStateCompletePhase(cwd, raw);
  } else if (subcommand === 'milestone-switch') {
    const { milestone, name } = parseNamedArgs(args, ['milestone', 'name']);
    state.cmdStateMilestoneSwitch(cwd, milestone, name, raw);
  } else if (subcommand === 'add-roadmap-evolution') {
    error('state add-roadmap-evolution is SDK-only. Use: gsd-sdk query state.add-roadmap-evolution ...');
  } else if (subcommand === undefined || subcommand === 'load') {
    state.cmdStateLoad(cwd, raw);
  } else {
    const available = ['load', 'complete-phase', ...STATE_SUBCOMMANDS.filter((s) => s !== 'load')];
    error(`Unknown state subcommand: "${subcommand}". Available: ${available.join(', ')}`);
  }
}

module.exports = {
  routeStateCommand,
};
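One consequence of the integer guard above, sketched as a call (assumes `state` and `parseNamedArgs` are in scope exactly as the real entrypoint injects them):

```js
routeStateCommand({
  state, // the lib/state.cjs module, as injected by the real entrypoint
  args: ['state', 'begin-phase', '--phase', '4', '--name', 'auth', '--plans', 'three'],
  cwd: process.cwd(),
  raw: false,
  parseNamedArgs,
  error: console.error,
});
// → error('Invalid --plans value. Expected an integer.')
//   because Number.parseInt('three', 10) is NaN
```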
@@ -4,7 +4,8 @@

 const fs = require('fs');
 const path = require('path');
-const { escapeRegex, loadConfig, getMilestoneInfo, getMilestonePhaseFilter, normalizeMd, planningDir, planningPaths, output, error, atomicWriteFileSync } = require('./core.cjs');
+const { escapeRegex, loadConfig, getMilestoneInfo, getMilestonePhaseFilter, normalizeMd, output, error, atomicWriteFileSync } = require('./core.cjs');
+const { planningDir, planningPaths } = require('./planning-workspace.cjs');
 const { extractFrontmatter, reconstructFrontmatter } = require('./frontmatter.cjs');

 // Cache disk scan results from buildStateFrontmatter per cwd per process (#1967).
@@ -4,7 +4,8 @@

 const fs = require('fs');
 const path = require('path');
-const { normalizePhaseName, findPhaseInternal, generateSlugInternal, normalizeMd, toPosixPath, planningDir, output, error } = require('./core.cjs');
+const { normalizePhaseName, findPhaseInternal, generateSlugInternal, normalizeMd, toPosixPath, output, error } = require('./core.cjs');
+const { planningDir } = require('./planning-workspace.cjs');
 const { reconstructFrontmatter } = require('./frontmatter.cjs');

 function cmdTemplateSelect(cwd, planPath, raw) {
@@ -7,7 +7,8 @@

 const fs = require('fs');
 const path = require('path');
-const { output, error, getMilestonePhaseFilter, planningDir, toPosixPath } = require('./core.cjs');
+const { output, error, getMilestonePhaseFilter, toPosixPath } = require('./core.cjs');
+const { planningDir } = require('./planning-workspace.cjs');
 const { extractFrontmatter } = require('./frontmatter.cjs');
 const { requireSafePath, sanitizeForDisplay } = require('./security.cjs');
||||
55
get-shit-done/bin/lib/validate-command-router.cjs
Normal file
55
get-shit-done/bin/lib/validate-command-router.cjs
Normal file
@@ -0,0 +1,55 @@
|
||||
'use strict';
|
||||
|
||||
const { VALIDATE_SUBCOMMANDS } = require('./command-aliases.generated.cjs');
|
||||
|
||||
function routeValidateCommand({ verify, args, cwd, raw, parseNamedArgs, output, error }) {
|
||||
const subcommand = args[1];
|
||||
|
||||
if (subcommand === 'consistency') {
|
||||
verify.cmdValidateConsistency(cwd, raw);
|
||||
} else if (subcommand === 'health') {
|
||||
const repairFlag = args.includes('--repair');
|
||||
const backfillFlag = args.includes('--backfill');
|
||||
verify.cmdValidateHealth(cwd, { repair: repairFlag, backfill: backfillFlag }, raw);
|
||||
} else if (subcommand === 'agents') {
|
||||
verify.cmdValidateAgents(cwd, raw);
|
||||
} else if (subcommand === 'context') {
|
||||
const opts = parseNamedArgs(args, ['tokens-used', 'context-window']);
|
||||
if (opts['tokens-used'] === null) {
|
||||
error('--tokens-used <integer> is required for `validate context`');
|
||||
return;
|
||||
}
|
||||
if (opts['context-window'] === null) {
|
||||
error('--context-window <integer> is required for `validate context`');
|
||||
return;
|
||||
}
|
||||
const { classifyContextUtilization, STATES } = require('./context-utilization.cjs');
|
||||
const RECOMMENDATIONS = {
|
||||
[STATES.HEALTHY]: null,
|
||||
[STATES.WARNING]: 'Context is approaching the fracture zone — consider /gsd-thread to continue in a fresh window.',
|
||||
[STATES.CRITICAL]: 'Reasoning quality may degrade past 70% utilization (fracture point). Run /gsd-thread now to preserve output quality.',
|
||||
};
|
||||
let classified;
|
||||
try {
|
||||
classified = classifyContextUtilization(Number(opts['tokens-used']), Number(opts['context-window']));
|
||||
} catch (e) {
|
||||
const flag = /tokensUsed/.test(e.message) ? '--tokens-used' : '--context-window';
|
||||
error(`${flag} must be a non-negative integer (window > 0), got the values supplied`);
|
||||
return;
|
||||
}
|
||||
const result = { ...classified, recommendation: RECOMMENDATIONS[classified.state] };
|
||||
if (args.includes('--json')) {
|
||||
output(result, raw);
|
||||
} else {
|
||||
const lines = [`Context utilization: ${result.percent}% (${result.state})`];
|
||||
if (result.recommendation) lines.push(result.recommendation);
|
||||
output(result, true, lines.join('\n'));
|
||||
}
|
||||
} else {
|
||||
error(`Unknown validate subcommand. Available: ${VALIDATE_SUBCOMMANDS.join(', ')}`);
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
routeValidateCommand,
|
||||
};
|
||||
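End to end, the `context` branch produces a result shaped like this (values illustrative; the state strings and threshold boundaries live in context-utilization.cjs, which this router only consumes):

```js
routeValidateCommand({
  verify: require('./verify.cjs'), // path assumed
  args: ['validate', 'context', '--tokens-used', '150000', '--context-window', '200000', '--json'],
  cwd: process.cwd(),
  raw: false,
  parseNamedArgs,
  output,
  error,
});
// output(...) receives something like (field names from the code above,
// percent/state values assumed):
// { percent: 75, state: 'critical', recommendation: 'Reasoning quality may degrade past 70% ...' }
```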
get-shit-done/bin/lib/verify-command-router.cjs (new file, 34 lines)
@@ -0,0 +1,34 @@
'use strict';

const { VERIFY_SUBCOMMANDS } = require('./command-aliases.generated.cjs');

function routeVerifyCommand({ verify, args, cwd, raw, error }) {
  const subcommand = args[1];

  if (subcommand === 'plan-structure') {
    verify.cmdVerifyPlanStructure(cwd, args[2], raw);
  } else if (subcommand === 'phase-completeness') {
    verify.cmdVerifyPhaseCompleteness(cwd, args[2], raw);
  } else if (subcommand === 'references') {
    verify.cmdVerifyReferences(cwd, args[2], raw);
  } else if (subcommand === 'commits') {
    verify.cmdVerifyCommits(cwd, args.slice(2), raw);
  } else if (subcommand === 'artifacts') {
    verify.cmdVerifyArtifacts(cwd, args[2], raw);
  } else if (subcommand === 'key-links') {
    verify.cmdVerifyKeyLinks(cwd, args[2], raw);
  } else if (subcommand === 'schema-drift') {
    const rest = args.slice(2);
    const skipFlag = rest.includes('--skip');
    const phaseArg = rest.find((arg) => !arg.startsWith('-'));
    verify.cmdVerifySchemaDrift(cwd, phaseArg, skipFlag, raw);
  } else if (subcommand === 'codebase-drift') {
    verify.cmdVerifyCodebaseDrift(cwd, raw);
  } else {
    error(`Unknown verify subcommand. Available: ${VERIFY_SUBCOMMANDS.join(', ')}`);
  }
}

module.exports = {
  routeVerifyCommand,
};
@@ -5,7 +5,8 @@
 const fs = require('fs');
 const path = require('path');
 const os = require('os');
-const { safeReadFile, loadConfig, normalizePhaseName, escapeRegex, execGit, findPhaseInternal, getMilestoneInfo, stripShippedMilestones, extractCurrentMilestone, planningDir, output, error, checkAgentsInstalled, CONFIG_DEFAULTS } = require('./core.cjs');
+const { safeReadFile, loadConfig, normalizePhaseName, escapeRegex, execGit, findPhaseInternal, getMilestoneInfo, stripShippedMilestones, extractCurrentMilestone, output, error, checkAgentsInstalled, CONFIG_DEFAULTS } = require('./core.cjs');
+const { planningDir } = require('./planning-workspace.cjs');
 const { extractFrontmatter, parseMustHavesBlock } = require('./frontmatter.cjs');
 const { writeStateMd } = require('./state.cjs');
@@ -10,7 +10,8 @@

 const fs = require('fs');
 const path = require('path');
-const { output, error, planningPaths, planningRoot, toPosixPath, getMilestoneInfo, generateSlugInternal, setActiveWorkstream, getActiveWorkstream, filterPlanFiles, filterSummaryFiles, readSubdirectories } = require('./core.cjs');
+const { output, error, toPosixPath, getMilestoneInfo, generateSlugInternal, filterPlanFiles, filterSummaryFiles, readSubdirectories } = require('./core.cjs');
+const { planningPaths, planningRoot, setActiveWorkstream, getActiveWorkstream } = require('./planning-workspace.cjs');
 const { stateExtractField } = require('./state.cjs');

 // ─── Migration ──────────────────────────────────────────────────────────────
get-shit-done/bin/verify-reapply-patches.cjs (new executable file, 247 lines)
@@ -0,0 +1,247 @@
#!/usr/bin/env node
'use strict';

/**
 * Deterministic verifier for the /gsd-reapply-patches Step 5 "Hunk Verification
 * Gate". For each backed-up patch file, asserts that the user's added lines
 * (computed from a real diff against the pristine baseline, not from the
 * LLM's prose summary) survive into the merged output.
 *
 * Usage:
 *   node scripts/verify-reapply-patches.cjs \
 *     --patches-dir <path> \    # gsd-local-patches/
 *     --config-dir <path> \     # ~/.claude (or runtime equivalent)
 *     [--pristine-dir <path>]   # gsd-pristine/; if absent, falls back to
 *                               # treating every significant backup line as
 *                               # required (over-broad but safe for #2969:
 *                               # false-positive halts beat silent successes
 *                               # on lost content)
 *     [--json]                  # emit JSON report instead of human text
 *
 * Exit codes:
 *   0 — every user-added line is present in the merged file (gate passes)
 *   1 — at least one missing line in at least one file (gate fails)
 *   2 — usage / structural error (e.g. patches dir missing)
 *
 * Bug #2969: the Step 5 gate previously trusted Claude's free-text "verified:
 * yes/no" reporting per hunk. The LLM was filling in `yes` even when content
 * had been silently dropped. Moving the check to a deterministic script is the
 * durability fix.
 */

const fs = require('node:fs');
const path = require('node:path');

const SIGNIFICANT_MIN_CHARS = 12;

function parseArgs(argv) {
  const opts = { patchesDir: null, configDir: null, pristineDir: null, json: false };
  for (let i = 0; i < argv.length; i++) {
    const arg = argv[i];
    if (arg === '--patches-dir') opts.patchesDir = argv[++i];
    else if (arg === '--config-dir') opts.configDir = argv[++i];
    else if (arg === '--pristine-dir') opts.pristineDir = argv[++i];
    else if (arg === '--json') opts.json = true;
    else if (arg === '--help' || arg === '-h') {
      process.stdout.write(
        'usage: verify-reapply-patches.cjs --patches-dir <path> --config-dir <path> [--pristine-dir <path>] [--json]\n',
      );
      process.exit(0);
    } else {
      process.stderr.write(`unknown argument: ${arg}\n`);
      process.exit(2);
    }
  }
  return opts;
}

function isSignificantLine(line) {
  const trimmed = line.trim();
  if (trimmed.length < SIGNIFICANT_MIN_CHARS) return false;
  // Pure punctuation / closing brackets carry too little structural info to
  // reliably distinguish a survived hunk from incidental similarity.
  if (/^[\s})\];,]+$/.test(trimmed)) return false;
  // Generic decorative comments like `// ----` similarly fail the test.
  if (/^[\s\-=#*/]+$/.test(trimmed)) return false;
  return true;
}

/**
 * Walk a directory, returning every file's path relative to the root.
 */
function walk(rootDir, relPrefix = '') {
  const out = [];
  if (!fs.existsSync(rootDir)) return out;
  for (const entry of fs.readdirSync(rootDir, { withFileTypes: true })) {
    const rel = relPrefix ? path.join(relPrefix, entry.name) : entry.name;
    const abs = path.join(rootDir, entry.name);
    if (entry.isDirectory()) {
      out.push(...walk(abs, rel));
    } else if (entry.isFile()) {
      out.push(rel);
    }
  }
  return out;
}

/**
 * Compute the set of "user-added" lines: lines present in the backup but
 * absent from the pristine baseline. If no pristine is provided, falls back
 * to using every significant line in the backup (over-broad but safe — favours
 * false-positive failures over silent successes, which is the right side to
 * err on for #2969).
 */
function computeUserAddedLines(backupContent, pristineContent) {
  const backupLines = backupContent.split(/\r?\n/);
  if (!pristineContent) {
    return backupLines.filter(isSignificantLine);
  }
  const pristineSet = new Set(pristineContent.split(/\r?\n/));
  return backupLines.filter((line) => isSignificantLine(line) && !pristineSet.has(line));
}
/**
|
||||
* Stable reason codes for the per-file result. Tests assert via
|
||||
* `assert.equal(result.reason, REASON.X)` rather than regex-matching prose,
|
||||
* so the diagnostic surface is a typed enum, not free text.
|
||||
*
|
||||
* Adding a new reason requires updating the REASON map AND the tests'
|
||||
* shape assertion that locks the documented set of codes.
|
||||
*/
|
||||
const REASON = Object.freeze({
|
||||
OK_NO_USER_LINES_VS_PRISTINE: 'ok_no_user_lines_vs_pristine',
|
||||
OK_NO_SIGNIFICANT_BACKUP_LINES: 'ok_no_significant_backup_lines',
|
||||
FAIL_INSTALLED_MISSING: 'fail_installed_missing',
|
||||
FAIL_INSTALLED_NOT_REGULAR_FILE: 'fail_installed_not_regular_file',
|
||||
FAIL_READ_ERROR: 'fail_read_error',
|
||||
FAIL_USER_LINES_MISSING: 'fail_user_lines_missing',
|
||||
});
|
||||
|
||||
function verifyFile({ relPath, patchesDir, configDir, pristineDir }) {
|
||||
const backupPath = path.join(patchesDir, relPath);
|
||||
const installedPath = path.join(configDir, relPath);
|
||||
const result = { file: relPath, status: 'ok', missing: [], reason: null };
|
||||
|
||||
if (!fs.existsSync(backupPath) || !fs.statSync(backupPath).isFile()) {
|
||||
return result; // walked entry no longer exists — non-fatal
|
||||
}
|
||||
|
||||
// Installed path checks: must exist, must be a regular file, must be
|
||||
// readable. Anything else is a fail-with-diagnostic, not a crash that
|
||||
// aborts the whole gate run and drops structured output.
|
||||
let installedStat;
|
||||
try {
|
||||
installedStat = fs.statSync(installedPath);
|
||||
} catch {
|
||||
result.status = 'fail';
|
||||
result.reason = REASON.FAIL_INSTALLED_MISSING;
|
||||
return result;
|
||||
}
|
||||
if (!installedStat.isFile()) {
|
||||
result.status = 'fail';
|
||||
result.reason = REASON.FAIL_INSTALLED_NOT_REGULAR_FILE;
|
||||
return result;
|
||||
}
|
||||
|
||||
let backupContent;
|
||||
let installedContent;
|
||||
try {
|
||||
backupContent = fs.readFileSync(backupPath, 'utf8');
|
||||
installedContent = fs.readFileSync(installedPath, 'utf8');
|
||||
} catch {
|
||||
result.status = 'fail';
|
||||
result.reason = REASON.FAIL_READ_ERROR;
|
||||
return result;
|
||||
}
|
||||
|
||||
let pristineContent = null;
|
||||
if (pristineDir) {
|
||||
const pristinePath = path.join(pristineDir, relPath);
|
||||
try {
|
||||
const stat = fs.statSync(pristinePath);
|
||||
if (stat.isFile()) {
|
||||
pristineContent = fs.readFileSync(pristinePath, 'utf8');
|
||||
}
|
||||
} catch {
|
||||
// Pristine missing or unreadable — fall through to over-broad mode.
|
||||
}
|
||||
}
|
||||
|
||||
const userAdded = computeUserAddedLines(backupContent, pristineContent);
|
||||
if (userAdded.length === 0) {
|
||||
// Backup and pristine match exactly (or no significant content) — nothing
|
||||
// to verify but also nothing to lose. Report as ok with diagnostic code.
|
||||
result.reason = pristineContent
|
||||
? REASON.OK_NO_USER_LINES_VS_PRISTINE
|
||||
: REASON.OK_NO_SIGNIFICANT_BACKUP_LINES;
|
||||
return result;
|
||||
}
|
||||
|
||||
for (const line of userAdded) {
|
||||
if (!installedContent.includes(line)) {
|
||||
result.missing.push(line.trim());
|
||||
}
|
||||
}
|
||||
if (result.missing.length > 0) {
|
||||
result.status = 'fail';
|
||||
result.reason = REASON.FAIL_USER_LINES_MISSING;
|
||||
}
|
||||
return result;
|
||||
}
|
||||
|
||||
function main() {
|
||||
const opts = parseArgs(process.argv.slice(2));
|
||||
if (!opts.patchesDir || !opts.configDir) {
|
||||
process.stderr.write('--patches-dir and --config-dir are required\n');
|
||||
process.exit(2);
|
||||
}
|
||||
if (!fs.existsSync(opts.patchesDir)) {
|
||||
process.stderr.write(`patches dir not found: ${opts.patchesDir}\n`);
|
||||
process.exit(2);
|
||||
}
|
||||
if (!fs.existsSync(opts.configDir)) {
|
||||
process.stderr.write(`config dir not found: ${opts.configDir}\n`);
|
||||
process.exit(2);
|
||||
}
|
||||
|
||||
const files = walk(opts.patchesDir).filter((f) => !f.endsWith('backup-meta.json'));
|
||||
const results = files.map((relPath) =>
|
||||
verifyFile({
|
||||
relPath,
|
||||
patchesDir: opts.patchesDir,
|
||||
configDir: opts.configDir,
|
||||
pristineDir: opts.pristineDir,
|
||||
}),
|
||||
);
|
||||
|
||||
const failures = results.filter((r) => r.status === 'fail');
|
||||
|
||||
if (opts.json) {
|
||||
process.stdout.write(JSON.stringify({ checked: results.length, failures: failures.length, results }, null, 2) + '\n');
|
||||
} else {
|
||||
process.stdout.write(`# Hunk Verification Gate (#2969)\n\n`);
|
||||
process.stdout.write(`Checked: ${results.length} file(s)\n`);
|
||||
process.stdout.write(`Failures: ${failures.length}\n\n`);
|
||||
if (failures.length > 0) {
|
||||
process.stdout.write(`## Files with missing user-added content\n\n`);
|
||||
for (const r of failures) {
|
||||
process.stdout.write(`- ${r.file}\n`);
|
||||
if (r.reason) process.stdout.write(` reason: ${r.reason}\n`);
|
||||
for (const line of r.missing.slice(0, 5)) {
|
||||
process.stdout.write(` missing: ${line}\n`);
|
||||
}
|
||||
if (r.missing.length > 5) {
|
||||
process.stdout.write(` …and ${r.missing.length - 5} more line(s)\n`);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
process.exit(failures.length > 0 ? 1 : 0);
|
||||
}
|
||||
|
||||
if (require.main === module) {
|
||||
main();
|
||||
}
|
||||
|
||||
module.exports = { computeUserAddedLines, isSignificantLine, verifyFile, walk, REASON };
|
||||
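A quick smoke run of the gate as wired above. The directory layout here is illustrative (substitute your runtime's actual backup locations); the flags and exit codes are the ones documented in the file header:

```bash
# Paths are illustrative; substitute your runtime's actual directories.
node bin/verify-reapply-patches.cjs \
  --patches-dir ~/.claude/gsd-local-patches \
  --config-dir ~/.claude \
  --pristine-dir ~/.claude/gsd-pristine \
  --json > gate-report.json
case $? in
  0) echo "gate passed" ;;
  1) echo "HALT: user-added lines missing; inspect gate-report.json" ;;
  *) echo "usage/structural error; check the directory flags" ;;
esac
```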
@@ -62,8 +62,11 @@ gsd-sdk query commit "docs: initialize [project-name] ([N] phases)" --files .pla
Each task gets its own commit immediately after completion.

> **Parallel agents:** When running as a parallel executor (spawned by execute-phase),
-> use `--no-verify` on all commits to avoid pre-commit hook lock contention.
-> The orchestrator validates hooks once after all agents complete.
+> run commits normally — let pre-commit hooks run. Do NOT pass `--no-verify` by default
+> (#2924). Hooks should fire on the introducing commit; silent bypass violates project
+> CLAUDE.md guidance. If a project explicitly opts out via
+> `workflow.worktree_skip_hooks=true`, the orchestrator surfaces that flag in the
+> executor prompt; absent that signal, hooks run normally.

```
{type}({phase}-{plan}): {task-name}
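A minimal sketch of what that opt-out means for an executor commit. The config read mirrors the `gsd-sdk query config-get` call used later in this diff; the commit message is a placeholder:

```bash
SKIP_HOOKS=$(gsd-sdk query config-get workflow.worktree_skip_hooks 2>/dev/null || echo "false")
if [ "$SKIP_HOOKS" = "true" ]; then
  # Explicit, project-level opt-out (surfaced by the orchestrator).
  git commit --no-verify -m "feat(01-02): example task"
else
  # Default: hooks fire on the introducing commit.
  git commit -m "feat(01-02): example task"
fi
```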
@@ -252,7 +252,7 @@ RAW_SKETCHES=$(ls .planning/sketches/MANIFEST.md 2>/dev/null)

If findings skills exist, read SKILL.md and reference files; extract validated patterns, landmines, constraints, design decisions. Add them to `<prior_decisions>`.

-If raw spikes/sketches exist but no findings skill, note: `⚠ Unpackaged spikes/sketches detected — run /gsd-spike-wrap-up or /gsd-sketch-wrap-up to make findings available.`
+If raw spikes/sketches exist but no findings skill, note: `⚠ Unpackaged spikes/sketches detected — run /gsd-spike --wrap-up or /gsd-sketch --wrap-up to make findings available.`

Build internal `<prior_decisions>` with sections for Project-Level (from PROJECT.md / REQUIREMENTS.md), From Prior Phases (per-phase decisions), and From Spike/Sketch Findings (validated patterns, landmines, design decisions).
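One way the detection could look in shell, assuming (by symmetry with the `RAW_SKETCHES` probe in the hunk header) that spikes keep a manifest at `.planning/spikes/MANIFEST.md` and that findings skills install a `SKILL.md`; both of those paths are assumptions, not confirmed by this diff:

```bash
RAW_SKETCHES=$(ls .planning/sketches/MANIFEST.md 2>/dev/null)
RAW_SPIKES=$(ls .planning/spikes/MANIFEST.md 2>/dev/null)      # assumed path
FINDINGS=$(ls .claude/skills/*findings*/SKILL.md 2>/dev/null)  # assumed path
if [ -n "$RAW_SPIKES$RAW_SKETCHES" ] && [ -z "$FINDINGS" ]; then
  echo "⚠ Unpackaged spikes/sketches detected — run /gsd-spike --wrap-up or /gsd-sketch --wrap-up to make findings available."
fi
```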
@@ -44,29 +44,29 @@ Evaluate `$ARGUMENTS` against these routing rules. Apply the **first matching**
| A bug, error, crash, failure, or something broken | `/gsd-debug` | Needs systematic investigation |
| Spiking, "test if", "will this work", "experiment", "prove this out", validate feasibility | `/gsd-spike` | Throwaway experiment to validate feasibility |
| Sketching, "mockup", "what would this look like", "prototype the UI", "design this", explore visual direction | `/gsd-sketch` | Throwaway HTML mockups to explore design |
-| Wrapping up spikes, "package the spikes", "consolidate spike findings" | `/gsd-spike-wrap-up` | Package spike findings into reusable skill |
-| Wrapping up sketches, "package the designs", "consolidate sketch findings" | `/gsd-sketch-wrap-up` | Package sketch findings into reusable skill |
-| Exploring, researching, comparing, or "how does X work" | `/gsd-research-phase` | Domain research before planning |
+| Wrapping up spikes, "package the spikes", "consolidate spike findings" | `/gsd-spike --wrap-up` | Package spike findings into reusable skill |
+| Wrapping up sketches, "package the designs", "consolidate sketch findings" | `/gsd-sketch --wrap-up` | Package sketch findings into reusable skill |
+| Exploring, researching, comparing, or "how does X work" | `/gsd-explore` | Socratic ideation and idea routing |
| Discussing vision, "how should X look", brainstorming | `/gsd-discuss-phase` | Needs context gathering |
-| A complex task: refactoring, migration, multi-file architecture, system redesign | `/gsd-add-phase` | Needs a full phase with plan/build cycle |
+| A complex task: refactoring, migration, multi-file architecture, system redesign | `/gsd-phase` | Needs a full phase with plan/build cycle |
| Planning a specific phase or "plan phase N" | `/gsd-plan-phase` | Direct planning request |
| Executing a phase or "build phase N", "run phase N" | `/gsd-execute-phase` | Direct execution request |
| Running all remaining phases automatically | `/gsd-autonomous` | Full autonomous execution |
| A review or quality concern about existing work | `/gsd-verify-work` | Needs verification |
| Checking progress, status, "where am I" | `/gsd-progress` | Status check |
| Resuming work, "pick up where I left off" | `/gsd-resume-work` | Session restoration |
-| A note, idea, or "remember to..." | `/gsd-add-todo` | Capture for later |
+| A note, idea, or "remember to..." | `/gsd-capture` | Capture for later |
| Adding tests, "write tests", "test coverage" | `/gsd-add-tests` | Test generation |
| Completing a milestone, shipping, releasing | `/gsd-complete-milestone` | Milestone lifecycle |
| A specific, actionable, small task (add feature, fix typo, update config) | `/gsd-quick` | Self-contained, single executor |

-**Requires `.planning/` directory:** All routes except `/gsd-new-project`, `/gsd-map-codebase`, `/gsd-spike`, `/gsd-sketch`, `/gsd-help`, and `/gsd-join-discord`. If the project doesn't exist and the route requires it, suggest `/gsd-new-project` first.
+**Requires `.planning/` directory:** All routes except `/gsd-new-project`, `/gsd-map-codebase`, `/gsd-spike`, `/gsd-sketch`, and `/gsd-help`. If the project doesn't exist and the route requires it, suggest `/gsd-new-project` first.

**Ambiguity handling:** If the text could reasonably match multiple routes, ask the user via AskUserQuestion with the top 2-3 options. For example:

```
"Refactor the authentication system" could be:
-1. /gsd-add-phase — Full planning cycle (recommended for multi-file refactors)
+1. /gsd-phase — Full planning cycle (recommended for multi-file refactors)
2. /gsd-quick — Quick execution (if scope is small and clear)

Which approach fits better?
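The routing itself is evaluated by the model against the table, not by a script. Still, a toy first-match matcher makes the "first matching rule wins" semantics concrete; the patterns here are keyword stand-ins, not the real routing criteria:

```bash
route() {
  # First match wins, mirroring table order. Patterns are illustrative only.
  case "$1" in
    *bug*|*crash*|*error*)    echo "/gsd-debug" ;;
    *"will this work"*)       echo "/gsd-spike" ;;
    *mockup*|*prototype*)     echo "/gsd-sketch" ;;
    *"remember to"*)          echo "/gsd-capture" ;;
    *refactor*|*migration*)   echo "/gsd-phase" ;;
    *)                        echo "/gsd-quick" ;;
  esac
}
route "fix the crash on startup"   # prints /gsd-debug, not /gsd-quick
```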
@@ -217,9 +217,33 @@ Check `branching_strategy` from init:

**"none":** Skip, continue on current branch.

-**"phase" or "milestone":** Use pre-computed `branch_name` from init:
+**"phase" or "milestone":** Use pre-computed `branch_name` from init.
+
+Fork the new phase branch off `origin/HEAD` (the project's default branch), not the current HEAD — otherwise consecutive phases compound and stay unpushed (#2916). If `$BRANCH_NAME` already exists locally, reuse it as-is.

```bash
-git checkout -b "$BRANCH_NAME" 2>/dev/null || git checkout "$BRANCH_NAME"
+DEFAULT_BRANCH=$(git symbolic-ref --quiet --short refs/remotes/origin/HEAD 2>/dev/null | sed 's|^origin/||')
+DEFAULT_BRANCH=${DEFAULT_BRANCH:-main}
+
+if git show-ref --verify --quiet "refs/heads/$BRANCH_NAME"; then
+  git switch "$BRANCH_NAME" || { echo "ERROR: Could not switch to existing branch '$BRANCH_NAME'." >&2; exit 1; }
+else
+  if ! git fetch --quiet origin "$DEFAULT_BRANCH"; then  # #2916
+    git show-ref --verify --quiet "refs/remotes/origin/$DEFAULT_BRANCH" \
+      || { echo "ERROR: fetch origin/$DEFAULT_BRANCH failed and no local copy exists. Refusing to create '$BRANCH_NAME' off current HEAD (#2916)." >&2; exit 1; }
+    echo "WARNING: fetch origin/$DEFAULT_BRANCH failed; using local copy as base." >&2
+  fi
+  if [ -n "$(git status --porcelain)" ]; then
+    echo "WARNING: Uncommitted changes will be carried onto '$BRANCH_NAME' (branched off origin/$DEFAULT_BRANCH, not previous HEAD)."
+  else
+    git switch --quiet "$DEFAULT_BRANCH" 2>/dev/null && git merge --ff-only --quiet "origin/$DEFAULT_BRANCH" 2>/dev/null || true
+  fi
+  # Pinned base + fail-fast: on success HEAD is exactly at origin/$DEFAULT_BRANCH,
+  # so a post-creation merge-base or "ahead-of" guard would be unreachable. The
+  # explicit base argument here is the single source of correctness for #2916.
+  git checkout -b "$BRANCH_NAME" "origin/$DEFAULT_BRANCH" \
+    || { echo "ERROR: Could not create '$BRANCH_NAME' from origin/$DEFAULT_BRANCH (#2916)." >&2; exit 1; }
+fi
```

All subsequent commits go to this branch. User handles merging.
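A hedged diagnostic for the #2916 symptom (not part of the command file): list what a phase branch would carry beyond the default branch. Commits from an earlier phase showing up here mean the branch compounded:

```bash
# Manual diagnostic only. DEFAULT_BRANCH and BRANCH_NAME as computed above.
git fetch --quiet origin "$DEFAULT_BRANCH"
# Commits on the phase branch that origin/$DEFAULT_BRANCH lacks.
git log --oneline "origin/$DEFAULT_BRANCH..$BRANCH_NAME"
```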
@@ -482,40 +506,37 @@ increases monotonically across waves. `{status}` is `complete` (success),
</objective>

<worktree_branch_check>
-FIRST ACTION before any other work: verify this worktree's branch is based on the correct commit.
-
-Run:
+FIRST ACTION: HEAD assertion MUST run before any reset/checkout. Worktrees
+spawned by Claude Code's `isolation="worktree"` use the `worktree-agent-<id>`
+namespace. If HEAD is on a protected ref (main/master/develop/trunk/release/*)
+or detached, HALT — do NOT self-recover by force-rewinding via `git update-ref`,
+that destroys concurrent commits in multi-active scenarios (#2924). Only after
+Step 1 passes is `git reset --hard` safe (#2015 — affects all platforms).
```bash
-ACTUAL_BASE=$(git merge-base HEAD {EXPECTED_BASE})
-```
-
-If `ACTUAL_BASE` != `{EXPECTED_BASE}` (i.e. the worktree branch was created from an older
-base such as `main` instead of the feature branch HEAD), hard-reset to the correct base:
-```bash
-# Safe: this runs before any agent work, so no uncommitted changes to lose
-git reset --hard {EXPECTED_BASE}
-# Verify correction succeeded
-if [ "$(git rev-parse HEAD)" != "{EXPECTED_BASE}" ]; then
-  echo "ERROR: Could not correct worktree base — aborting to prevent data loss"
+HEAD_REF=$(git symbolic-ref --quiet HEAD || echo "DETACHED")
+ACTUAL_BRANCH=$(git rev-parse --abbrev-ref HEAD)
+if [ "$HEAD_REF" = "DETACHED" ] || echo "$ACTUAL_BRANCH" | grep -Eq '^(main|master|develop|trunk|release/.*)$'; then
+  echo "FATAL: worktree HEAD on '$ACTUAL_BRANCH' (expected worktree-agent-*); refusing to self-recover via 'git update-ref' (#2924)." >&2
+  exit 1
+fi
+if ! echo "$ACTUAL_BRANCH" | grep -Eq '^worktree-agent-[A-Za-z0-9._/-]+$'; then
+  echo "FATAL: worktree HEAD '$ACTUAL_BRANCH' is not in the worktree-agent-* namespace; refusing to commit (#2924)." >&2
+  exit 1
+fi
+ACTUAL_BASE=$(git merge-base HEAD {EXPECTED_BASE})
+if [ "$ACTUAL_BASE" != "{EXPECTED_BASE}" ]; then
+  git reset --hard {EXPECTED_BASE}
+  [ "$(git rev-parse HEAD)" != "{EXPECTED_BASE}" ] && { echo "ERROR: could not correct worktree base"; exit 1; }
+fi
```

`reset --hard` is safe here because this is a fresh worktree with no user changes. It
resets both the HEAD pointer AND the working tree to the correct base commit (#2015).

-If `ACTUAL_BASE` == `{EXPECTED_BASE}`: the branch base is correct, proceed immediately.
-
This check fixes a known issue where `EnterWorktree` creates branches from
`main` instead of the current feature branch HEAD (affects all platforms).
+Per-commit HEAD assertion lives in `agents/gsd-executor.md` `<task_commit_protocol>` step 0.
</worktree_branch_check>

<parallel_execution>
You are running as a PARALLEL executor agent in a git worktree.
-Use --no-verify on all git commits to avoid pre-commit hook contention
-with other agents. The orchestrator validates hooks once after all agents complete.
-For `gsd-sdk query commit` (or legacy `gsd-tools.cjs` commit): add --no-verify flag when needed.
-For direct git commits: use git commit --no-verify -m "..."
+Run `git commit` normally — hooks run by default. Do NOT pass `--no-verify`
+unless the orchestrator surfaces `workflow.worktree_skip_hooks=true` in this
+prompt; silent bypass violates project CLAUDE.md guidance (#2924).

IMPORTANT: Do NOT modify STATE.md or ROADMAP.md. execute-plan.md
auto-detects worktree mode (`.git` is a file, not a directory) and skips
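The two guards are easy to sanity-check in isolation. This is a standalone demo of the regexes above against sample branch names, not part of the executor prompt:

```bash
for b in main release/1.2 worktree-agent-abc123 feature/foo; do
  if echo "$b" | grep -Eq '^(main|master|develop|trunk|release/.*)$'; then
    echo "$b: HALT (protected ref)"
  elif echo "$b" | grep -Eq '^worktree-agent-[A-Za-z0-9._/-]+$'; then
    echo "$b: OK (worktree-agent namespace)"
  else
    echo "$b: HALT (outside namespace)"
  fi
done
```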
@@ -527,6 +548,7 @@ increases monotonically across waves. `{status}` is `complete` (success),
only (STATE.md and ROADMAP.md are excluded automatically). Do NOT skip or defer
this commit — the orchestrator force-removes the worktree after you return, and
any uncommitted SUMMARY.md will be permanently lost (#2070).
+REQUIRED ORDER: Write SUMMARY.md → commit → only then any narration. No text between Write and commit (truncation risk; #2070 rescue is not primary defense).
</parallel_execution>

<execution_context>
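A minimal sketch of that required order, assuming SUMMARY.md sits at the worktree root (the real path comes from the plan):

```bash
# Write, then commit immediately; nothing in between.
cat > SUMMARY.md <<'EOF'
(summary content)
EOF
git add SUMMARY.md
git commit -m "docs: add execution summary"
# Only now is narration or other output safe.
```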
@@ -581,6 +603,7 @@ increases monotonically across waves. `{status}` is `complete` (success),
<sequential_execution>
You are running as a SEQUENTIAL executor agent on the main working tree.
Use normal git commits (with hooks). Do NOT use --no-verify.
+REQUIRED ORDER: Write SUMMARY.md → commit → only then any narration. No text between Write and commit (truncation risk; #2070 rescue is not primary defense).
</sequential_execution>
```

@@ -632,13 +655,16 @@ increases monotonically across waves. `{status}` is `complete` (success),
**This fallback applies automatically to all runtimes.** Claude Code's Task() normally
returns synchronously, but the fallback ensures resilience if it doesn't.

-5. **Post-wave hook validation (parallel mode only):**
-
-   When agents committed with `--no-verify`, run pre-commit hooks once after the wave:
+5. **Post-wave hook validation (parallel mode only):** Hooks run on every executor commit by default (#2924); this post-wave run only fires when `workflow.worktree_skip_hooks=true` opted out of per-commit hooks:
   ```bash
-   # Run project's pre-commit hooks on the current state
-   git diff --cached --quiet || git stash  # stash any unstaged changes
-   git hook run pre-commit 2>&1 || echo "⚠ Pre-commit hooks failed — review before continuing"
+   SKIP_HOOKS=$(gsd-sdk query config-get workflow.worktree_skip_hooks 2>/dev/null || echo "false")
+   if [ "$SKIP_HOOKS" = "true" ]; then
+     # Stash uncommitted changes under a named ref so we always pop (bare `git stash` strands them on hook/script failure).
+     STASHED=false
+     if (! git diff --quiet || ! git diff --cached --quiet) && git stash push -u -m "gsd-post-wave-hook-$$" >/dev/null 2>&1; then STASHED=true; fi
+     git hook run pre-commit 2>&1 || echo "⚠ Pre-commit hooks failed — review before continuing"
+     [ "$STASHED" = "true" ] && (git stash pop >/dev/null 2>&1 || echo "⚠ Could not pop gsd-post-wave-hook stash — recover manually")
+   fi
   ```
If hooks fail: report the failure and ask "Fix hook issues now?" or "Continue to next wave?"

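If the pop itself fails, the named stash keeps recovery manual but trivial. A sketch (the stash index depends on local state):

```bash
git stash list | grep gsd-post-wave-hook
# e.g. stash@{0}: On <branch>: gsd-post-wave-hook-12345
git stash pop 'stash@{0}'   # substitute the index found above
```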
Some files were not shown because too many files have changed in this diff.