Mirror of https://github.com/glittercowboy/get-shit-done, synced 2026-04-26 01:35:29 +02:00

Compare commits: `fix/2192-c` ... `fix/2256-c` (7 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 8b94f0370d | |
| | 4a34745950 | |
| | c051e71851 | |
| | 62b5278040 | |
| | 50f61bfd9a | |
| | 201b8f1a05 | |
| | 73c7281a36 | |
CHANGELOG.md (80 changed lines)
@@ -6,18 +6,80 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

## [Unreleased]

### Added

- **`@gsd-build/sdk` — Phase 1 typed query foundation** — Registry-based `gsd-sdk query` command, classified errors (`GSDQueryError`), and unit-tested handlers under `sdk/src/query/` (state, roadmap, phase lifecycle, init, config, validation, and related domains). Implements the incremental SDK-first migration scope approved in #2083; builds on validated work from #2007 / `feat/sdk-foundation` without migrating workflows or removing `gsd-tools.cjs` in this phase.
- **Flow diagram directive for phase researcher** — `gsd-phase-researcher` now enforces data-flow architecture diagrams instead of file-listing diagrams. Language-agnostic directive added to agent prompt and research template. (#2139)

### Fixed

- **Shell hooks falsely flagged as stale on every session** — `gsd-phase-boundary.sh`, `gsd-session-state.sh`, and `gsd-validate-commit.sh` now ship with a `# gsd-hook-version: {{GSD_VERSION}}` header; the installer substitutes `{{GSD_VERSION}}` in `.sh` hooks the same way it does for `.js` hooks; and the stale-hook detector in `gsd-check-update.js` now matches bash `#` comment syntax in addition to JS `//` syntax. All three changes are required together; no single fix resolves the false positive on its own (#2136, #2206, #2209, #2210, #2212)
- **SDK query layer (PR review hardening)** — `commit-to-subrepo` uses realpath-aware path containment and sanitized commit messages; `state.planned-phase` uses the STATE.md lockfile; `verifyKeyLinks` mitigates ReDoS on frontmatter patterns; frontmatter handlers resolve paths under the real project root; phase directory names reject `..` and separators; `gsd-sdk` restores strict CLI parsing by stripping `--pick` before `parseArgs`; `QueryRegistry.commands()` enables enumeration; `todoComplete` uses static error imports.
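The shell-hooks fix above rests on one mechanical detail: the staleness detector must recognize the `gsd-hook-version` header in both JS (`//`) and bash (`#`) comment syntax. A minimal sketch of that matching logic (function names are illustrative, not the actual `gsd-check-update.js` source):

```javascript
// Accept the version header in either comment syntax:
//   JS hooks:   // gsd-hook-version: 1.36.0
//   bash hooks: # gsd-hook-version: 1.36.0
const HOOK_VERSION_RE = /^(?:\/\/|#)\s*gsd-hook-version:\s*(\S+)/m;

// Returns the stamped version, or null for an unversioned hook.
function hookVersion(content) {
  const m = content.match(HOOK_VERSION_RE);
  return m ? m[1] : null;
}

function isStale(hookContent, installedVersion) {
  return hookVersion(hookContent) !== installedVersion;
}
```

With only the JS-style pattern, every `.sh` hook reads as unversioned and is flagged stale on each session, which is exactly the false positive described above.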

## [1.36.0] - 2026-04-14

### Added

- **`/gsd-graphify` integration** — Knowledge graph for planning agents, enabling richer context connections between project artifacts (#2164)
- **`gsd-pattern-mapper` agent** — Codebase pattern analysis agent for identifying recurring patterns and conventions (#1861)
- **`@gsd-build/sdk` — Phase 1 typed query foundation** — Registry-based `gsd-sdk query` command with classified errors and unit-tested handlers for state, roadmap, phase lifecycle, init, config, and validation (#2118)
- **Opt-in TDD pipeline mode** — `tdd_mode` exposed in init JSON with `--tdd` flag override for test-driven development workflows (#2119, #2124)
- **Stale/orphan worktree detection (W017)** — `validate-health` now detects stale and orphan worktrees (#2175)
- **Seed scanning in new-milestone** — Planted seeds are scanned during milestone step 2.5 for automatic surfacing (#2177)
- **Artifact audit gate** — Open artifact auditing for milestone close and phase verify (#2157, #2158, #2160)
- **`/gsd-quick` and `/gsd-thread` subcommands** — Added list/status/resume/close subcommands (#2159)
- **Debug skill dispatch and session manager** — Sub-orchestrator for `/gsd-debug` sessions (#2154)
- **Project skills awareness** — 9 GSD agents now discover and use project-scoped skills (#2152)
- **`/gsd-debug` session management** — TDD gate, reasoning checkpoint, and security hardening (#2146)
- **Context-window-aware prompt thinning** — Automatic prompt size reduction for sub-200K models (#1978)
- **SDK `--ws` flag** — Workstream-aware execution support (#1884)
- **`/gsd-extract-learnings` command** — Phase knowledge capture workflow (#1873)
- **Cross-AI execution hook** — Step 2.5 in execute-phase for external AI integration (#1875)
- **Ship workflow external review hook** — External code review command hook in ship workflow
- **Plan bounce hook** — Optional external refinement step (12.5) in plan-phase workflow
- **Cursor CLI self-detection** — Cursor detection and REVIEWS.md template for `/gsd-review` (#1960)
- **Architectural Responsibility Mapping** — Added to phase-researcher pipeline (#1988, #2103)
- **Configurable `claude_md_path`** — Custom CLAUDE.md path setting (#2010, #2102)
- **`/gsd-skill-manifest` command** — Pre-compute skill discovery for faster session starts (#2101)
- **`--dry-run` mode and resolved blocker pruning** — State management improvements (#1970)
- **State prune command** — Prune unbounded section growth in STATE.md (#1970)
- **Global skills support** — Support `~/.claude/skills/` in `agent_skills` config (#1992)
- **Context exhaustion auto-recording** — Hooks auto-record session state on context exhaustion (#1974)
- **Metrics table pruning** — Auto-prune on phase complete for STATE.md metrics (#2087, #2120)
- **Flow diagram directive for phase researcher** — Data-flow architecture diagrams enforced (#2139, #2147)
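For the opt-in TDD mode above, the precedence is worth spelling out: the `--tdd` CLI flag overrides the `tdd_mode` value from the init JSON. A hedged sketch of that precedence (the helper name and any config shape beyond `tdd_mode`/`--tdd` are assumptions, not the real implementation):

```javascript
// Sketch only: resolve TDD mode from init JSON plus CLI argv.
function resolveTddMode(initConfig, argv) {
  if (argv.includes('--tdd')) return true; // explicit flag wins
  return initConfig.tdd_mode === true;     // otherwise the JSON opt-in decides
}
```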

### Changed

- **Planner context-cost sizing** — Replaced time-based reasoning with context-cost sizing and multi-source coverage audit (#2091, #2092, #2114)
- **`/gsd-next` prior-phase completeness scan** — Replaced consecutive-call counter with completeness scan (#2097)
- **Inline execution for small plans** — Default to inline execution, skip subagent overhead for small plans (#1979)
- **Prior-phase context optimization** — Limited to 3 most recent phases and includes `Depends on` phases (#1969)
- **Non-technical owner adaptation** — `discuss-phase` adapts gray area language for non-technical owners via USER-PROFILE.md (#2125, #2173)
- **Agent specs standardization** — Standardized `required_reading` patterns across agent specs (#2176)
- **CI upgrades** — GitHub Actions upgraded to Node 22+ runtimes; release pipeline fixes (#2128, #1956)
- **Branch cleanup workflow** — Auto-delete on merge + weekly sweep (#2051)
- **SDK query follow-up (tests, docs, registry)** — Expanded `QUERY_MUTATION_COMMANDS` for event emission; stale lock cleanup uses PID liveness (`process.kill(pid, 0)`) when a lock file exists; `searchJsonEntries` is depth-bounded (`MAX_JSON_SEARCH_DEPTH`); removed unnecessary `readdirSync`/`Dirent` casts across query handlers; added `sdk/src/query/QUERY-HANDLERS.md` (error vs `{ data.error }`, mutations, locks, intel limits); unit tests for intel, profile, uat, skills, summary, websearch, workstream, registry vs `QUERY_MUTATION_COMMANDS`, and frontmatter extract/splice round-trip.

### Fixed

- **Init ignores archived phases** — Archived phases from prior milestones sharing a phase number no longer interfere (#2186)
- **UAT file listing** — Removed `head -5` truncation from verify-work (#2172)
- **Intel status relative time** — Display relative time correctly (#2132)
- **Codex hook install** — Copy hook files to Codex install target (#2153, #2166)
- **Phase add-batch duplicate prevention** — Prevents duplicate phase numbers on parallel invocations (#2165, #2170)
- **Stale hooks warning** — Show contextual warning for dev installs with stale hooks (#2162)
- **Worktree submodule skip** — Skip worktree isolation when `.gitmodules` detected (#2144)
- **Worktree STATE.md backup** — Use `cp` instead of `git-show` (#2143)
- **Bash hooks staleness check** — Add missing bash hooks to `MANAGED_HOOKS` (#2141)
- **Code-review parser** — Fix SUMMARY.md parser section-reset for top-level keys (#2142)
- **Backlog phase exclusion** — Exclude 999.x backlog phases from next-phase and all_complete (#2135)
- **Frontmatter regex anchor** — Anchor `extractFrontmatter` regex to file start (#2133)
- **Qwen Code install paths** — Eliminate Claude reference leaks (#2112)
- **Plan bounce default** — Correct `plan_bounce_passes` default from 1 to 2
- **GSD temp directory** — Use dedicated temp subdirectory for GSD temp files (#1975, #2100)
- **Workspace path quoting** — Quote path variables in workspace next-step examples (#2096)
- **Answer validation loop** — Carve out Other+empty exception from retry loop (#2093)
- **Test race condition** — Add `before()` hook to bug-1736 test (#2099)
- **Qwen Code path replacement** — Dedicated path replacement branches and finishInstall labels (#2082)
- **Global skill symlink guard** — Tests and empty-name handling for config (#1992)
- **Context exhaustion hook defects** — Three blocking defects fixed (#1974)
- **State disk scan cache** — Invalidate disk scan cache in writeStateMd (#1967)
- **State frontmatter caching** — Cache buildStateFrontmatter disk scan per process (#1967)
- **Grep anchor and threshold guard** — Correct grep anchor and add threshold=0 guard (#1979)
- **Atomic write coverage** — Extend atomicWriteFileSync to milestone, phase, and frontmatter (#1972)
- **Health check optimization** — Merge four readdirSync passes into one (#1973)
- **SDK query layer hardening** — Realpath-aware path containment, ReDoS mitigation, strict CLI parsing, phase directory sanitization (#2118)
- **Prompt injection scan** — Allowlist plan-phase.md
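The frontmatter anchoring fix (#2133) is easy to picture: without anchoring the match to the start of the file, a `---` block later in the body can be mistaken for frontmatter. A sketch of the anchored behavior (illustrative, not the real `frontmatter.cjs` implementation):

```javascript
// Without the /m flag, ^ anchors to the very start of the string, so a
// '---' fence appearing later in the document can never match.
const FRONTMATTER_RE = /^---\n([\s\S]*?)\n---/;

function extractFrontmatter(content) {
  const m = content.match(FRONTMATTER_RE);
  return m ? m[1] : null; // null when the file does not open with frontmatter
}
```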

## [1.35.0] - 2026-04-10

@@ -1907,7 +1969,9 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

- YOLO mode for autonomous execution
- Interactive mode with checkpoints

[Unreleased]: https://github.com/gsd-build/get-shit-done/compare/v1.36.0...HEAD
[1.36.0]: https://github.com/gsd-build/get-shit-done/releases/tag/v1.36.0
[1.35.0]: https://github.com/gsd-build/get-shit-done/releases/tag/v1.35.0
[1.34.2]: https://github.com/gsd-build/get-shit-done/releases/tag/v1.34.2
[1.34.1]: https://github.com/gsd-build/get-shit-done/releases/tag/v1.34.1
[1.34.0]: https://github.com/gsd-build/get-shit-done/releases/tag/v1.34.0
README.md (22 changed lines)
@@ -89,13 +89,14 @@ People who want to describe what they want and have it built correctly — witho

Built-in quality gates catch real problems: schema drift detection flags ORM changes missing migrations, security enforcement anchors verification to threat models, and scope reduction detection prevents the planner from silently dropping your requirements.

### v1.36.0 Highlights

- **Knowledge graph integration** — `/gsd-graphify` brings knowledge graphs to planning agents for richer context connections
- **SDK typed query foundation** — Registry-based `gsd-sdk query` command with classified errors and handlers for state, roadmap, phase lifecycle, and config
- **TDD pipeline mode** — Opt-in test-driven development workflow with `--tdd` flag
- **Context-window-aware prompt thinning** — Automatic prompt size reduction for sub-200K models
- **Project skills awareness** — 9 GSD agents now discover and use project-scoped skills
- **30+ bug fixes** — Worktree safety, state management, installer paths, and health check optimizations

---

@@ -116,7 +117,9 @@ Verify with:

- Cline: GSD installs via `.clinerules` — verify by checking `.clinerules` exists

> [!NOTE]
> Claude Code 2.1.88+, Qwen Code, and Codex install as skills (`.claude/skills/`, `./.codex/skills/`, or the matching global `~/.claude/skills/` / `~/.codex/skills/` roots). Older Claude Code versions use `commands/gsd/`. `~/.claude/get-shit-done/skills/` is import-only for legacy migration. The installer handles all formats automatically.

The canonical discovery contract is documented in [docs/skills/discovery-contract.md](docs/skills/discovery-contract.md).

> [!TIP]
> For source-based installs or environments where npm is unavailable, see **[docs/manual-update.md](docs/manual-update.md)**.

@@ -817,8 +820,9 @@ This prevents Claude from reading these files entirely, regardless of what comma

**Commands not found after install?**
- Restart your runtime to reload commands/skills
- Verify files exist in `~/.claude/skills/gsd-*/SKILL.md` or `~/.codex/skills/gsd-*/SKILL.md` for managed global installs
- For local installs, verify `.claude/skills/gsd-*/SKILL.md` or `./.codex/skills/gsd-*/SKILL.md`
- Legacy Claude Code installs still use `~/.claude/commands/gsd/`

**Commands not working as expected?**
- Run `/gsd-help` to verify installation

@@ -5761,10 +5761,15 @@ function install(isGlobal, runtime = 'claude') {
        // Ensure hook files are executable (fixes #1162 — missing +x permission)
        try { fs.chmodSync(destFile, 0o755); } catch (e) { /* Windows doesn't support chmod */ }
      } else {
        // Ensure .sh hook files are executable (mirrors chmod in build-hooks.js)
        // .sh hooks carry a gsd-hook-version header so gsd-check-update.js can
        // detect staleness after updates — stamp the version just like .js hooks.
        if (entry.endsWith('.sh')) {
          let content = fs.readFileSync(srcFile, 'utf8');
          content = content.replace(/\{\{GSD_VERSION\}\}/g, pkg.version);
          fs.writeFileSync(destFile, content);
          try { fs.chmodSync(destFile, 0o755); } catch (e) { /* Windows doesn't support chmod */ }
        } else {
          fs.copyFileSync(srcFile, destFile);
        }
      }
    }

@@ -5876,9 +5881,13 @@ function install(isGlobal, runtime = 'claude') {
        fs.writeFileSync(destFile, content);
        try { fs.chmodSync(destFile, 0o755); } catch (e) { /* Windows */ }
      } else {
        if (entry.endsWith('.sh')) {
          let content = fs.readFileSync(srcFile, 'utf8');
          content = content.replace(/\{\{GSD_VERSION\}\}/g, pkg.version);
          fs.writeFileSync(destFile, content);
          try { fs.chmodSync(destFile, 0o755); } catch (e) { /* Windows */ }
        } else {
          fs.copyFileSync(srcFile, destFile);
        }
      }
    }

@@ -113,7 +113,7 @@ User-facing entry points. Each file contains YAML frontmatter (name, description

- **Copilot:** Slash commands (`/gsd-command-name`)
- **Antigravity:** Skills

**Total commands:** 73

### Workflows (`get-shit-done/workflows/*.md`)

@@ -124,7 +124,7 @@ Orchestration logic that commands reference. Contains the step-by-step process i

- State update patterns
- Error handling and recovery

**Total workflows:** 71

### Agents (`agents/*.md`)

@@ -134,7 +134,7 @@ Specialized agent definitions with frontmatter specifying:

- `tools` — Allowed tool access (Read, Write, Edit, Bash, Grep, Glob, WebSearch, etc.)
- `color` — Terminal output color for visual distinction

**Total agents:** 31

### References (`get-shit-done/references/*.md`)

@@ -409,14 +409,14 @@ UI-SPEC.md (per phase) ───────────────────

```
~/.claude/                      # Claude Code (global install)
├── commands/gsd/*.md           # 73 slash commands
├── get-shit-done/
│   ├── bin/gsd-tools.cjs       # CLI utility
│   ├── bin/lib/*.cjs           # 19 domain modules
│   ├── workflows/*.md          # 71 workflow definitions
│   ├── references/*.md         # 35 shared reference docs
│   └── templates/              # Planning artifact templates
├── agents/*.md                 # 31 agent definitions
├── hooks/
│   ├── gsd-statusline.js       # Statusline hook
│   ├── gsd-context-monitor.js  # Context warning hook
```
docs/skills/discovery-contract.md (new file, 92 lines)
@@ -0,0 +1,92 @@

# Skill Discovery Contract

> Canonical rules for scanning, inventorying, and rendering GSD skills.

## Root Categories

### Project Roots

Scan these roots relative to the project root:

- `.claude/skills/`
- `.agents/skills/`
- `.cursor/skills/`
- `.github/skills/`
- `./.codex/skills/`

These roots are used for project-specific skills and for the project `CLAUDE.md` skills section.

### Managed Global Roots

Scan these roots relative to the user home directory:

- `~/.claude/skills/`
- `~/.codex/skills/`

These roots are used for managed runtime installs and inventory reporting.

### Deprecated Import-Only Root

- `~/.claude/get-shit-done/skills/`

This root is kept for legacy migration only. Inventory code may report it, but new installs should not write here.

### Legacy Claude Commands

- `~/.claude/commands/gsd/`

This is not a skills root. Discovery code only checks whether it exists so inventory can report legacy Claude installs.

## Normalization Rules

- Scan only subdirectories that contain `SKILL.md`.
- Read `name` and `description` from YAML frontmatter.
- Use the directory name when `name` is missing.
- Extract trigger hints from body lines that match `TRIGGER when: ...`.
- Treat `gsd-*` directories as installed framework skills.
- Treat `~/.claude/get-shit-done/skills/` entries as deprecated/import-only.
- Treat `~/.claude/commands/gsd/` as legacy command installation metadata, not skills.

## Scanner Behavior

### `sdk/src/query/skills.ts`

- Returns a de-duplicated list of discovered skill names.
- Scans project roots plus managed global roots.
- Does not scan the deprecated import-only root.

### `get-shit-done/bin/lib/profile-output.cjs`

- Builds the project `CLAUDE.md` skills section.
- Scans project roots only.
- Skips `gsd-*` directories so the project section stays focused on user/project skills.
- Adds `.codex/skills/` to the project discovery set.

### `get-shit-done/bin/lib/init.cjs`

- Generates the skill inventory object for `skill-manifest`.
- Reports `skills`, `roots`, `installation`, and `counts`.
- Marks `gsd_skills_installed` when any discovered skill name starts with `gsd-`.
- Marks `legacy_claude_commands_installed` when `~/.claude/commands/gsd/` contains `.md` command files.

## Inventory Shape

`skill-manifest` returns a JSON object with:

- `skills`: normalized skill entries
- `roots`: the canonical roots that were checked
- `installation`: summary booleans for installed GSD skills and legacy Claude commands
- `counts`: small inventory counts for downstream consumers

Each skill entry includes:

- `name`
- `description`
- `triggers`
- `path`
- `file_path`
- `root`
- `scope`
- `installed`
- `deprecated`
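Put together, a `skill-manifest` result might look like the following. Every value is an illustrative placeholder, not real output:

```javascript
// Illustrative inventory shape only; values are hypothetical.
const exampleManifest = {
  skills: [
    {
      name: 'gsd-help',
      description: 'Show GSD command help',
      triggers: ['user asks for help'],
      path: 'gsd-help',
      file_path: 'gsd-help/SKILL.md',
      root: '~/.claude/skills',
      scope: 'global',
      installed: true,
      deprecated: false,
    },
  ],
  roots: [
    { root: '~/.claude/skills', path: '/home/user/.claude/skills', scope: 'global', present: true, skill_count: 1 },
  ],
  installation: { gsd_skills_installed: true, legacy_claude_commands_installed: false },
  counts: { skills: 1, roots: 1 },
};
```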
@@ -333,7 +333,7 @@ async function main() {
  // filesystem traversal on every invocation.
  const SKIP_ROOT_RESOLUTION = new Set([
    'generate-slug', 'current-timestamp', 'verify-path-exists',
    'verify-summary', 'template', 'frontmatter', 'detect-custom-files',
  ]);
  if (!SKIP_ROOT_RESOLUTION.has(command)) {
    cwd = findProjectRoot(cwd);

@@ -1142,6 +1142,98 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
      break;
    }

    // ─── detect-custom-files ───────────────────────────────────────────────
    // Detect user-added files inside GSD-managed directories that are not
    // tracked in gsd-file-manifest.json. Used by the update workflow to back
    // up custom files before the installer wipes those directories.
    //
    // This replaces the fragile bash pattern:
    //   MANIFEST_FILES=$(node -e "require('$RUNTIME_DIR/...')" 2>/dev/null)
    //   ${filepath#$RUNTIME_DIR/}   # unreliable path stripping
    // which silently returns CUSTOM_COUNT=0 when $RUNTIME_DIR is unset or
    // when the stripped path does not match the manifest key format (#1997).

    case 'detect-custom-files': {
      const configDirIdx = args.indexOf('--config-dir');
      const configDir = configDirIdx !== -1 ? args[configDirIdx + 1] : null;
      if (!configDir) {
        error('Usage: gsd-tools detect-custom-files --config-dir <path>');
      }
      const resolvedConfigDir = path.resolve(configDir);
      if (!fs.existsSync(resolvedConfigDir)) {
        error(`Config directory not found: ${resolvedConfigDir}`);
      }

      const manifestPath = path.join(resolvedConfigDir, 'gsd-file-manifest.json');
      if (!fs.existsSync(manifestPath)) {
        // No manifest — cannot determine what is custom. Return empty list
        // (same behaviour as saveLocalPatches in install.js when no manifest).
        const out = { custom_files: [], custom_count: 0, manifest_found: false };
        process.stdout.write(JSON.stringify(out, null, 2));
        break;
      }

      let manifest;
      try {
        manifest = JSON.parse(fs.readFileSync(manifestPath, 'utf8'));
      } catch {
        const out = { custom_files: [], custom_count: 0, manifest_found: false, error: 'manifest parse error' };
        process.stdout.write(JSON.stringify(out, null, 2));
        break;
      }

      const manifestKeys = new Set(Object.keys(manifest.files || {}));

      // GSD-managed directories to scan for user-added files.
      // These are the directories the installer wipes on update.
      const GSD_MANAGED_DIRS = [
        'get-shit-done',
        'agents',
        path.join('commands', 'gsd'),
        'hooks',
        // OpenCode/Kilo flat command dir
        'command',
        // Codex/Copilot skills dir
        'skills',
      ];

      function walkDir(dir, baseDir) {
        const results = [];
        if (!fs.existsSync(dir)) return results;
        for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
          const fullPath = path.join(dir, entry.name);
          if (entry.isDirectory()) {
            results.push(...walkDir(fullPath, baseDir));
          } else {
            // Use forward slashes for cross-platform manifest key compatibility
            const relPath = path.relative(baseDir, fullPath).replace(/\\/g, '/');
            results.push(relPath);
          }
        }
        return results;
      }

      const customFiles = [];
      for (const managedDir of GSD_MANAGED_DIRS) {
        const absDir = path.join(resolvedConfigDir, managedDir);
        if (!fs.existsSync(absDir)) continue;
        for (const relPath of walkDir(absDir, resolvedConfigDir)) {
          if (!manifestKeys.has(relPath)) {
            customFiles.push(relPath);
          }
        }
      }

      const out = {
        custom_files: customFiles,
        custom_count: customFiles.length,
        manifest_found: true,
        manifest_version: manifest.version || null,
      };
      process.stdout.write(JSON.stringify(out, null, 2));
      break;
    }

    // ─── GSD-2 Reverse Migration ───────────────────────────────────────────

    case 'from-gsd2': {

@@ -1590,30 +1590,136 @@ function cmdAgentSkills(cwd, agentType, raw) {

/**
 * Generate a skill manifest from a skills directory.
 *
 * Scans the canonical skill discovery roots and returns a normalized
 * inventory object with discovered skills, root metadata, and installation
 * summary flags. A legacy `skillsDir` override is still accepted for focused
 * scans, but the default mode is multi-root discovery.
 *
 * @param {string} cwd - Project root directory
 * @param {string|null} [skillsDir] - Optional absolute path to a specific skills directory
 * @returns {{
 *   skills: Array<{name: string, description: string, triggers: string[], path: string, file_path: string, root: string, scope: string, installed: boolean, deprecated: boolean}>,
 *   roots: Array<{root: string, path: string, scope: string, present: boolean, skill_count?: number, command_count?: number, deprecated?: boolean}>,
 *   installation: { gsd_skills_installed: boolean, legacy_claude_commands_installed: boolean },
 *   counts: { skills: number, roots: number }
 * }}
 */
function buildSkillManifest(cwd, skillsDir = null) {
  const { extractFrontmatter } = require('./frontmatter.cjs');
  const os = require('os');

  const canonicalRoots = skillsDir ? [{
    root: path.resolve(skillsDir),
    path: path.resolve(skillsDir),
    scope: 'custom',
    present: fs.existsSync(skillsDir),
    kind: 'skills',
  }] : [
    {
      root: '.claude/skills',
      path: path.join(cwd, '.claude', 'skills'),
      scope: 'project',
      kind: 'skills',
    },
    {
      root: '.agents/skills',
      path: path.join(cwd, '.agents', 'skills'),
      scope: 'project',
      kind: 'skills',
    },
    {
      root: '.cursor/skills',
      path: path.join(cwd, '.cursor', 'skills'),
      scope: 'project',
      kind: 'skills',
    },
    {
      root: '.github/skills',
      path: path.join(cwd, '.github', 'skills'),
      scope: 'project',
      kind: 'skills',
    },
    {
      root: '.codex/skills',
      path: path.join(cwd, '.codex', 'skills'),
      scope: 'project',
      kind: 'skills',
    },
    {
      root: '~/.claude/skills',
      path: path.join(os.homedir(), '.claude', 'skills'),
      scope: 'global',
      kind: 'skills',
    },
    {
      root: '~/.codex/skills',
      path: path.join(os.homedir(), '.codex', 'skills'),
      scope: 'global',
      kind: 'skills',
    },
    {
      root: '.claude/get-shit-done/skills',
      path: path.join(os.homedir(), '.claude', 'get-shit-done', 'skills'),
      scope: 'import-only',
      kind: 'skills',
      deprecated: true,
    },
    {
      root: '.claude/commands/gsd',
      path: path.join(os.homedir(), '.claude', 'commands', 'gsd'),
      scope: 'legacy-commands',
      kind: 'commands',
      deprecated: true,
    },
  ];

  const skills = [];
  const roots = [];
  let legacyClaudeCommandsInstalled = false;
  for (const rootInfo of canonicalRoots) {
    const rootPath = rootInfo.path;
    const rootSummary = {
      root: rootInfo.root,
      path: rootPath,
      scope: rootInfo.scope,
      present: fs.existsSync(rootPath),
      deprecated: !!rootInfo.deprecated,
    };

    if (!rootSummary.present) {
      roots.push(rootSummary);
      continue;
    }

    if (rootInfo.kind === 'commands') {
      let entries = [];
      try {
        entries = fs.readdirSync(rootPath, { withFileTypes: true });
      } catch {
        roots.push(rootSummary);
        continue;
      }

      const commandFiles = entries.filter(entry => entry.isFile() && entry.name.endsWith('.md'));
      rootSummary.command_count = commandFiles.length;
      if (rootSummary.command_count > 0) legacyClaudeCommandsInstalled = true;
      roots.push(rootSummary);
      continue;
    }

    let entries;
    try {
      entries = fs.readdirSync(rootPath, { withFileTypes: true });
    } catch {
      roots.push(rootSummary);
      continue;
    }

    let skillCount = 0;
    for (const entry of entries) {
      if (!entry.isDirectory()) continue;

      const skillMdPath = path.join(rootPath, entry.name, 'SKILL.md');
      if (!fs.existsSync(skillMdPath)) continue;

      let content;

@@ -1641,24 +1747,50 @@ function buildSkillManifest(skillsDir) {
        }
      }

      skills.push({
        name,
        description,
        triggers,
        path: entry.name,
        file_path: `${entry.name}/SKILL.md`,
        root: rootInfo.root,
        scope: rootInfo.scope,
        installed: rootInfo.scope !== 'import-only',
        deprecated: !!rootInfo.deprecated,
      });
      skillCount++;
    }

    rootSummary.skill_count = skillCount;
    roots.push(rootSummary);
  }

  skills.sort((a, b) => {
    const rootCmp = a.root.localeCompare(b.root);
    return rootCmp !== 0 ? rootCmp : a.name.localeCompare(b.name);
  });

  const gsdSkillsInstalled = skills.some(skill => skill.name.startsWith('gsd-'));

  return {
    skills,
    roots,
    installation: {
      gsd_skills_installed: gsdSkillsInstalled,
      legacy_claude_commands_installed: legacyClaudeCommandsInstalled,
    },
    counts: {
      skills: skills.length,
      roots: roots.length,
    },
  };
}
|
||||
|
||||
/**
|
||||
* Command: generate skill manifest JSON.
|
||||
*
|
||||
* Options:
|
||||
* --skills-dir <path> Path to skills directory (required)
|
||||
* --skills-dir <path> Optional absolute path to a single skills directory
|
||||
* --write Also write to .planning/skill-manifest.json
|
||||
*/
|
||||
function cmdSkillManifest(cwd, args, raw) {
|
||||
@@ -1667,12 +1799,7 @@ function cmdSkillManifest(cwd, args, raw) {
|
||||
? args[skillsDirIdx + 1]
|
||||
: null;
|
||||
|
||||
if (!skillsDir) {
|
||||
output([], raw);
|
||||
return;
|
||||
}
|
||||
|
||||
const manifest = buildSkillManifest(skillsDir);
|
||||
const manifest = buildSkillManifest(cwd, skillsDir);
|
||||
|
||||
// Optionally write to .planning/skill-manifest.json
|
||||
if (args.includes('--write')) {
|
||||
|
||||
@@ -177,11 +177,11 @@ const CLAUDE_MD_FALLBACKS = {
  stack: 'Technology stack not yet documented. Will populate after codebase mapping or first phase.',
  conventions: 'Conventions not yet established. Will populate as patterns emerge during development.',
  architecture: 'Architecture not yet mapped. Follow existing patterns found in the codebase.',
  skills: 'No project skills found. Add skills to any of: `.claude/skills/`, `.agents/skills/`, `.cursor/skills/`, or `.github/skills/` with a `SKILL.md` index file.',
  skills: 'No project skills found. Add skills to any of: `.claude/skills/`, `.agents/skills/`, `.cursor/skills/`, `.github/skills/`, or `.codex/skills/` with a `SKILL.md` index file.',
};

// Directories where project skills may live (checked in order)
const SKILL_SEARCH_DIRS = ['.claude/skills', '.agents/skills', '.cursor/skills', '.github/skills'];
const SKILL_SEARCH_DIRS = ['.claude/skills', '.agents/skills', '.cursor/skills', '.github/skills', '.codex/skills'];

const CLAUDE_MD_WORKFLOW_ENFORCEMENT = [
  'Before using Edit, Write, or other file-changing tools, start work through a GSD command so planning artifacts and execution context stay in sync.',
@@ -361,6 +361,88 @@ Use AskUserQuestion:
**If user cancels:** Exit.
</step>

<step name="backup_custom_files">
Before running the installer, detect and back up any user-added files inside
GSD-managed directories. These are files that exist on disk but are NOT listed
in `gsd-file-manifest.json` — i.e., files the user added themselves that the
installer does not know about and will delete during the wipe.

**Do not use bash path-stripping (`${filepath#$RUNTIME_DIR/}`) or `node -e require()`
inline** — those patterns fail when `$RUNTIME_DIR` is unset and the stripped
relative path may not match manifest key format, which causes CUSTOM_COUNT=0
even when custom files exist (bug #1997). Use `gsd-tools detect-custom-files`
instead, which resolves paths reliably with Node.js `path.relative()`.

First, resolve the config directory (`RUNTIME_DIR`) from the install scope
detected in `get_installed_version`:

```bash
# RUNTIME_DIR is the resolved config directory (e.g. ~/.claude, ~/.config/opencode)
# It should already be set from get_installed_version as GLOBAL_DIR or LOCAL_DIR.
# Use the appropriate variable based on INSTALL_SCOPE.
if [ "$INSTALL_SCOPE" = "LOCAL" ]; then
  RUNTIME_DIR="$LOCAL_DIR"
elif [ "$INSTALL_SCOPE" = "GLOBAL" ]; then
  RUNTIME_DIR="$GLOBAL_DIR"
else
  RUNTIME_DIR=""
fi
```

If `RUNTIME_DIR` is empty or does not exist, skip this step (no config dir to
inspect).

Otherwise, resolve the path to `gsd-tools.cjs` and run:

```bash
GSD_TOOLS="$RUNTIME_DIR/get-shit-done/bin/gsd-tools.cjs"
if [ -f "$GSD_TOOLS" ] && [ -n "$RUNTIME_DIR" ]; then
  CUSTOM_JSON=$(node "$GSD_TOOLS" detect-custom-files --config-dir "$RUNTIME_DIR" 2>/dev/null)
  CUSTOM_COUNT=$(echo "$CUSTOM_JSON" | node -e "process.stdin.resume();let d='';process.stdin.on('data',c=>d+=c);process.stdin.on('end',()=>{try{console.log(JSON.parse(d).custom_count);}catch{console.log(0);}})" 2>/dev/null || echo "0")
else
  CUSTOM_COUNT=0
  CUSTOM_JSON='{"custom_files":[],"custom_count":0}'
fi
```

**If `CUSTOM_COUNT` > 0:**

Back up each custom file to `$RUNTIME_DIR/gsd-user-files-backup/` before the
installer wipes the directories:

```bash
BACKUP_DIR="$RUNTIME_DIR/gsd-user-files-backup"
mkdir -p "$BACKUP_DIR"

# Parse custom_files array from CUSTOM_JSON and copy each file
node - "$RUNTIME_DIR" "$BACKUP_DIR" "$CUSTOM_JSON" <<'JSEOF'
const [,, runtimeDir, backupDir, customJson] = process.argv;
const { custom_files } = JSON.parse(customJson);
const fs = require('fs');
const path = require('path');
for (const relPath of custom_files) {
  const src = path.join(runtimeDir, relPath);
  const dst = path.join(backupDir, relPath);
  if (fs.existsSync(src)) {
    fs.mkdirSync(path.dirname(dst), { recursive: true });
    fs.copyFileSync(src, dst);
    console.log('  Backed up: ' + relPath);
  }
}
JSEOF
```

Then inform the user:

```
⚠️ Found N custom file(s) inside GSD-managed directories.
These have been backed up to gsd-user-files-backup/ before the update.
Restore them after the update if needed.
```

**If `CUSTOM_COUNT` == 0:** No user-added files detected. Continue to install.
</step>

<step name="run_update">
Run the update using the install type detected in step 1:

hooks/gsd-check-update-worker.js (new file, 107 lines)
@@ -0,0 +1,107 @@
#!/usr/bin/env node
// gsd-hook-version: {{GSD_VERSION}}
// Background worker spawned by gsd-check-update.js (SessionStart hook).
// Checks for GSD updates and stale hooks, writes result to cache file.
// Receives paths via environment variables set by the parent hook.
//
// Using a separate file (rather than node -e '<inline code>') avoids the
// template-literal regex-escaping problem: regex source is plain JS here.

'use strict';

const fs = require('fs');
const path = require('path');
const { execFileSync } = require('child_process');

const cacheFile = process.env.GSD_CACHE_FILE;
const projectVersionFile = process.env.GSD_PROJECT_VERSION_FILE;
const globalVersionFile = process.env.GSD_GLOBAL_VERSION_FILE;

// Compare semver: true if a > b (a is strictly newer than b)
// Strips pre-release suffixes (e.g. '3-beta.1' → '3') to avoid NaN from Number()
function isNewer(a, b) {
  const pa = (a || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
  const pb = (b || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
  for (let i = 0; i < 3; i++) {
    if (pa[i] > pb[i]) return true;
    if (pa[i] < pb[i]) return false;
  }
  return false;
}
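The comparison can be exercised standalone. This sketch repeats the worker's `isNewer()` logic with sample version strings (the versions are illustrative, not taken from the repo):

```javascript
// Same comparison as the worker's isNewer(): strictly-newer semver, with
// pre-release suffixes stripped per component before Number() conversion.
function isNewer(a, b) {
  const pa = (a || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
  const pb = (b || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
  for (let i = 0; i < 3; i++) {
    if (pa[i] > pb[i]) return true;
    if (pa[i] < pb[i]) return false;
  }
  return false;
}

console.log(isNewer('1.36.0', '1.35.9')); // true  — minor bump is newer
console.log(isNewer('1.36.0', '1.36.0')); // false — equal is not "newer"
console.log(isNewer('3-beta.1', '2'));    // true  — '-beta.1' stripped, 3 > 2
```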

// Check project directory first (local install), then global
let installed = '0.0.0';
let configDir = '';
try {
  if (fs.existsSync(projectVersionFile)) {
    installed = fs.readFileSync(projectVersionFile, 'utf8').trim();
    configDir = path.dirname(path.dirname(projectVersionFile));
  } else if (fs.existsSync(globalVersionFile)) {
    installed = fs.readFileSync(globalVersionFile, 'utf8').trim();
    configDir = path.dirname(path.dirname(globalVersionFile));
  }
} catch (e) {}

// Check for stale hooks — compare hook version headers against installed VERSION
// Hooks are installed at configDir/hooks/ (e.g. ~/.claude/hooks/) (#1421)
// Only check hooks that GSD currently ships — orphaned files from removed features
// (e.g., gsd-intel-*.js) must be ignored to avoid permanent stale warnings (#1750)
const MANAGED_HOOKS = [
  'gsd-check-update-worker.js',
  'gsd-check-update.js',
  'gsd-context-monitor.js',
  'gsd-phase-boundary.sh',
  'gsd-prompt-guard.js',
  'gsd-read-guard.js',
  'gsd-session-state.sh',
  'gsd-statusline.js',
  'gsd-validate-commit.sh',
  'gsd-workflow-guard.js',
];

let staleHooks = [];
if (configDir) {
  const hooksDir = path.join(configDir, 'hooks');
  try {
    if (fs.existsSync(hooksDir)) {
      const hookFiles = fs.readdirSync(hooksDir).filter(f => MANAGED_HOOKS.includes(f));
      for (const hookFile of hookFiles) {
        try {
          const content = fs.readFileSync(path.join(hooksDir, hookFile), 'utf8');
          // Match both JS (//) and bash (#) comment styles
          const versionMatch = content.match(/(?:\/\/|#) gsd-hook-version:\s*(.+)/);
          if (versionMatch) {
            const hookVersion = versionMatch[1].trim();
            if (isNewer(installed, hookVersion) && !hookVersion.includes('{{')) {
              staleHooks.push({ file: hookFile, hookVersion, installedVersion: installed });
            }
          } else {
            // No version header at all — definitely stale (pre-version-tracking)
            staleHooks.push({ file: hookFile, hookVersion: 'unknown', installedVersion: installed });
          }
        } catch (e) {}
      }
    }
  } catch (e) {}
}

let latest = null;
try {
  latest = execFileSync('npm', ['view', 'get-shit-done-cc', 'version'], {
    encoding: 'utf8',
    timeout: 10000,
    windowsHide: true,
  }).trim();
} catch (e) {}

const result = {
  update_available: latest && isNewer(latest, installed),
  installed,
  latest: latest || 'unknown',
  checked: Math.floor(Date.now() / 1000),
  stale_hooks: staleHooks.length > 0 ? staleHooks : undefined,
};

if (cacheFile) {
  try { fs.writeFileSync(cacheFile, JSON.stringify(result)); } catch (e) {}
}
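The stale-hook check hinges on that comment-style-agnostic header regex. A quick standalone check of the same pattern (version strings here are sample data):

```javascript
// The fixed header regex: accepts both JS "//" and bash "#" comment prefixes.
const versionRe = /(?:\/\/|#) gsd-hook-version:\s*(.+)/;

console.log('// gsd-hook-version: 1.36.0'.match(versionRe)[1]); // '1.36.0'
console.log('# gsd-hook-version: 1.36.0'.match(versionRe)[1]);  // '1.36.0'
console.log('gsd-hook-version: 1.36.0'.match(versionRe));       // null (no comment prefix)

// An unsubstituted placeholder still matches, which is why the worker also
// guards with hookVersion.includes('{{') before flagging a hook as stale.
console.log('# gsd-hook-version: {{GSD_VERSION}}'.match(versionRe)[1]); // '{{GSD_VERSION}}'
```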
@@ -44,99 +44,21 @@ if (!fs.existsSync(cacheDir)) {
fs.mkdirSync(cacheDir, { recursive: true });
}

// Run check in background (spawn background process, windowsHide prevents console flash)
const child = spawn(process.execPath, ['-e', `
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

// Compare semver: true if a > b (a is strictly newer than b)
// Strips pre-release suffixes (e.g. '3-beta.1' → '3') to avoid NaN from Number()
function isNewer(a, b) {
const pa = (a || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
const pb = (b || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
for (let i = 0; i < 3; i++) {
if (pa[i] > pb[i]) return true;
if (pa[i] < pb[i]) return false;
}
return false;
}

const cacheFile = ${JSON.stringify(cacheFile)};
const projectVersionFile = ${JSON.stringify(projectVersionFile)};
const globalVersionFile = ${JSON.stringify(globalVersionFile)};

// Check project directory first (local install), then global
let installed = '0.0.0';
let configDir = '';
try {
if (fs.existsSync(projectVersionFile)) {
installed = fs.readFileSync(projectVersionFile, 'utf8').trim();
configDir = path.dirname(path.dirname(projectVersionFile));
} else if (fs.existsSync(globalVersionFile)) {
installed = fs.readFileSync(globalVersionFile, 'utf8').trim();
configDir = path.dirname(path.dirname(globalVersionFile));
}
} catch (e) {}

// Check for stale hooks — compare hook version headers against installed VERSION
// Hooks are installed at configDir/hooks/ (e.g. ~/.claude/hooks/) (#1421)
// Only check hooks that GSD currently ships — orphaned files from removed features
// (e.g., gsd-intel-*.js) must be ignored to avoid permanent stale warnings (#1750)
const MANAGED_HOOKS = [
'gsd-check-update.js',
'gsd-context-monitor.js',
'gsd-phase-boundary.sh',
'gsd-prompt-guard.js',
'gsd-read-guard.js',
'gsd-session-state.sh',
'gsd-statusline.js',
'gsd-validate-commit.sh',
'gsd-workflow-guard.js',
];
let staleHooks = [];
if (configDir) {
const hooksDir = path.join(configDir, 'hooks');
try {
if (fs.existsSync(hooksDir)) {
const hookFiles = fs.readdirSync(hooksDir).filter(f => MANAGED_HOOKS.includes(f));
for (const hookFile of hookFiles) {
try {
const content = fs.readFileSync(path.join(hooksDir, hookFile), 'utf8');
const versionMatch = content.match(/\\/\\/ gsd-hook-version:\\s*(.+)/);
if (versionMatch) {
const hookVersion = versionMatch[1].trim();
if (isNewer(installed, hookVersion) && !hookVersion.includes('{{')) {
staleHooks.push({ file: hookFile, hookVersion, installedVersion: installed });
}
} else {
// No version header at all — definitely stale (pre-version-tracking)
staleHooks.push({ file: hookFile, hookVersion: 'unknown', installedVersion: installed });
}
} catch (e) {}
}
}
} catch (e) {}
}

let latest = null;
try {
latest = execSync('npm view get-shit-done-cc version', { encoding: 'utf8', timeout: 10000, windowsHide: true }).trim();
} catch (e) {}

const result = {
update_available: latest && isNewer(latest, installed),
installed,
latest: latest || 'unknown',
checked: Math.floor(Date.now() / 1000),
stale_hooks: staleHooks.length > 0 ? staleHooks : undefined
};

fs.writeFileSync(cacheFile, JSON.stringify(result));
`], {
// Run check in background via a dedicated worker script.
// Spawning a file (rather than node -e '<inline code>') keeps the worker logic
// in plain JS with no template-literal regex-escaping concerns, and makes the
// worker independently testable.
const workerPath = path.join(__dirname, 'gsd-check-update-worker.js');
const child = spawn(process.execPath, [workerPath], {
  stdio: 'ignore',
  windowsHide: true,
  detached: true // Required on Windows for proper process detachment
  detached: true, // Required on Windows for proper process detachment
  env: {
    ...process.env,
    GSD_CACHE_FILE: cacheFile,
    GSD_PROJECT_VERSION_FILE: projectVersionFile,
    GSD_GLOBAL_VERSION_FILE: globalVersionFile,
  },
});

child.unref();
@@ -1,4 +1,5 @@
#!/bin/bash
# gsd-hook-version: {{GSD_VERSION}}
# gsd-phase-boundary.sh — PostToolUse hook: detect .planning/ file writes
# Outputs a reminder when planning files are modified outside normal workflow.
# Uses Node.js for JSON parsing (always available in GSD projects, no jq dependency).

@@ -1,4 +1,5 @@
#!/bin/bash
# gsd-hook-version: {{GSD_VERSION}}
# gsd-session-state.sh — SessionStart hook: inject project state reminder
# Outputs STATE.md head on every session start for orientation.
#

@@ -1,4 +1,5 @@
#!/bin/bash
# gsd-hook-version: {{GSD_VERSION}}
# gsd-validate-commit.sh — PreToolUse hook: enforce Conventional Commits format
# Blocks git commit commands with non-conforming messages (exit 2).
# Allows conforming messages and all non-commit commands (exit 0).
package-lock.json (generated, 4 lines changed)
@@ -1,12 +1,12 @@
{
  "name": "get-shit-done-cc",
  "version": "1.35.0",
  "version": "1.36.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "get-shit-done-cc",
      "version": "1.35.0",
      "version": "1.36.0",
      "license": "MIT",
      "bin": {
        "get-shit-done-cc": "bin/install.js"
@@ -1,6 +1,6 @@
{
  "name": "get-shit-done-cc",
  "version": "1.35.0",
  "version": "1.36.0",
  "description": "A meta-prompting, context engineering and spec-driven development system for Claude Code, OpenCode, Gemini and Codex by TÂCHES.",
  "bin": {
    "get-shit-done-cc": "bin/install.js"
@@ -15,6 +15,7 @@ const DIST_DIR = path.join(HOOKS_DIR, 'dist');

// Hooks to copy (pure Node.js, no bundling needed)
const HOOKS_TO_COPY = [
  'gsd-check-update-worker.js',
  'gsd-check-update.js',
  'gsd-context-monitor.js',
  'gsd-prompt-guard.js',
@@ -2,29 +2,72 @@
 * Tests for agent skills query handler.
 */

import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import { mkdtemp, mkdir, rm } from 'node:fs/promises';
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
import { mkdtemp, mkdir, rm, writeFile } from 'node:fs/promises';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

import { agentSkills } from './skills.js';

function writeSkill(rootDir: string, name: string, description = 'Skill under test') {
  const skillDir = join(rootDir, name);
  return mkdir(skillDir, { recursive: true }).then(() => writeFile(join(skillDir, 'SKILL.md'), [
    '---',
    `name: ${name}`,
    `description: ${description}`,
    '---',
    '',
    `# ${name}`,
  ].join('\n')));
}

describe('agentSkills', () => {
  let tmpDir: string;
  let homeDir: string;

  beforeEach(async () => {
    tmpDir = await mkdtemp(join(tmpdir(), 'gsd-skills-'));
    await mkdir(join(tmpDir, '.cursor', 'skills', 'my-skill'), { recursive: true });
    homeDir = await mkdtemp(join(tmpdir(), 'gsd-skills-home-'));
    await writeSkill(join(tmpDir, '.cursor', 'skills'), 'my-skill');
    await writeSkill(join(tmpDir, '.codex', 'skills'), 'project-codex');
    await mkdir(join(tmpDir, '.claude', 'skills', 'orphaned-dir'), { recursive: true });
    await writeSkill(join(homeDir, '.claude', 'skills'), 'global-claude');
    await writeSkill(join(homeDir, '.codex', 'skills'), 'global-codex');
    await writeSkill(join(homeDir, '.claude', 'get-shit-done', 'skills'), 'legacy-import');
    vi.stubEnv('HOME', homeDir);
  });

  afterEach(async () => {
    vi.unstubAllEnvs();
    await rm(tmpDir, { recursive: true, force: true });
    await rm(homeDir, { recursive: true, force: true });
  });

  it('returns deduped skill names from project skill dirs', async () => {
  it('returns deduped skill names from project and managed global skill dirs', async () => {
    const r = await agentSkills(['gsd-executor'], tmpDir);
    const data = r.data as Record<string, unknown>;
    expect(data.skill_count).toBeGreaterThan(0);
    expect((data.skills as string[]).length).toBeGreaterThan(0);
    const skills = data.skills as string[];

    expect(skills).toEqual(expect.arrayContaining([
      'my-skill',
      'project-codex',
      'global-claude',
      'global-codex',
    ]));
    expect(skills).not.toContain('orphaned-dir');
    expect(skills).not.toContain('legacy-import');
    expect(data.skill_count).toBe(skills.length);
  });

  it('counts deduped skill names when the same skill exists in multiple roots', async () => {
    await writeSkill(join(tmpDir, '.claude', 'skills'), 'shared-skill');
    await writeSkill(join(tmpDir, '.agents', 'skills'), 'shared-skill');

    const r = await agentSkills(['gsd-executor'], tmpDir);
    const data = r.data as Record<string, unknown>;
    const skills = data.skills as string[];

    expect(skills.filter((skill) => skill === 'shared-skill')).toHaveLength(1);
    expect(data.skill_count).toBe(skills.length);
  });
});
@@ -1,8 +1,9 @@
/**
 * Agent skills query handler — scan installed skill directories.
 *
 * Reads from .claude/skills/, .agents/skills/, .cursor/skills/, .github/skills/,
 * and the global ~/.claude/get-shit-done/skills/ directory.
 * Reads from project `.claude/skills/`, `.agents/skills/`, `.cursor/skills/`,
 * `.github/skills/`, `.codex/skills/`, plus managed global `~/.claude/skills/`
 * and `~/.codex/skills/` roots.
 *
 * @example
 * ```typescript
@@ -26,7 +27,9 @@ export const agentSkills: QueryHandler = async (args, projectDir) => {
    join(projectDir, '.agents', 'skills'),
    join(projectDir, '.cursor', 'skills'),
    join(projectDir, '.github', 'skills'),
    join(homedir(), '.claude', 'get-shit-done', 'skills'),
    join(projectDir, '.codex', 'skills'),
    join(homedir(), '.claude', 'skills'),
    join(homedir(), '.codex', 'skills'),
  ];

  const skills: string[] = [];
@@ -35,16 +38,19 @@ export const agentSkills: QueryHandler = async (args, projectDir) => {
    try {
      const entries = readdirSync(dir, { withFileTypes: true });
      for (const entry of entries) {
        if (entry.isDirectory()) skills.push(entry.name);
        if (!entry.isDirectory()) continue;
        if (!existsSync(join(dir, entry.name, 'SKILL.md'))) continue;
        skills.push(entry.name);
      }
    } catch { /* skip */ }
  }

  const dedupedSkills = [...new Set(skills)];
  return {
    data: {
      agent_type: agentType,
      skills: [...new Set(skills)],
      skill_count: skills.length,
      skills: dedupedSkills,
      skill_count: dedupedSkills.length,
    },
  };
};
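The count bug this hunk fixes (`skill_count` computed from the pre-dedup array while `skills` is deduped) can be seen in isolation. A minimal sketch with made-up sample data:

```javascript
// When the same skill exists in two roots, the raw scan contains duplicates.
const raw = ['shared-skill', 'shared-skill', 'my-skill'];

// Before the fix: names deduped, count taken from the raw array.
const buggy = { skills: [...new Set(raw)], skill_count: raw.length };

// After the fix: dedup once, derive both fields from the same array.
const dedupedSkills = [...new Set(raw)];
const fixed = { skills: dedupedSkills, skill_count: dedupedSkills.length };

console.log(buggy.skill_count, buggy.skills.length); // 3 2 — out of sync
console.log(fixed.skill_count, fixed.skills.length); // 2 2 — consistent
```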
tests/architecture-counts.test.cjs (new file, 59 lines)
@@ -0,0 +1,59 @@
'use strict';

/**
 * Guards ARCHITECTURE.md component counts against drift.
 *
 * Both sides are computed at test runtime — no hardcoded numbers.
 * Parsing ARCHITECTURE.md: regex extracts the documented count.
 * Filesystem count: readdirSync filters to *.md files.
 *
 * To add a new component: append a row to COMPONENTS below and update
 * docs/ARCHITECTURE.md with a matching "**Total <label>:** N" line.
 */

const { describe, test } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('fs');
const path = require('path');

const ROOT = path.join(__dirname, '..');
const ARCH_MD = path.join(ROOT, 'docs', 'ARCHITECTURE.md');
const ARCH_CONTENT = fs.readFileSync(ARCH_MD, 'utf-8');

/** Components whose counts must stay in sync with ARCHITECTURE.md. */
const COMPONENTS = [
  { label: 'commands', dir: 'commands/gsd' },
  { label: 'workflows', dir: 'get-shit-done/workflows' },
  { label: 'agents', dir: 'agents' },
];

/**
 * Parse "**Total <label>:** N" from ARCHITECTURE.md.
 * Returns the integer N, or throws if the pattern is missing.
 */
function parseDocCount(label) {
  const match = ARCH_CONTENT.match(new RegExp(`\\*\\*Total ${label}:\\*\\*\\s+(\\d+)`));
  assert.ok(match, `ARCHITECTURE.md is missing "**Total ${label}:** N" — add it`);
  return parseInt(match[1], 10);
}

/**
 * Count *.md files in a directory (non-recursive).
 */
function countMdFiles(relDir) {
  return fs.readdirSync(path.join(ROOT, relDir)).filter((f) => f.endsWith('.md')).length;
}

describe('ARCHITECTURE.md component counts', () => {
  for (const { label, dir } of COMPONENTS) {
    test(`Total ${label} matches ${dir}/*.md file count`, () => {
      const documented = parseDocCount(label);
      const actual = countMdFiles(dir);
      assert.strictEqual(
        documented,
        actual,
        `docs/ARCHITECTURE.md says "Total ${label}: ${documented}" but ${dir}/ has ${actual} .md files — update ARCHITECTURE.md`
      );
    });
  }
});
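The `"**Total <label>:** N"` extraction pattern from the test above, checked against a sample line (the label and count here are made-up sample data, not real repo counts):

```javascript
// Same pattern shape as parseDocCount(): escaped bold markers around
// "Total <label>:", then whitespace and a captured integer.
const line = '**Total commands:** 12';
const match = line.match(new RegExp('\\*\\*Total commands:\\*\\*\\s+(\\d+)'));
console.log(parseInt(match[1], 10)); // 12
```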
tests/bug-2136-sh-hook-version.test.cjs (new file, 359 lines)
@@ -0,0 +1,359 @@
/**
 * Regression tests for bug #2136 / #2206
 *
 * Root cause: three bash hooks (gsd-phase-boundary.sh, gsd-session-state.sh,
 * gsd-validate-commit.sh) shipped without a gsd-hook-version header, and the
 * stale-hook detector in gsd-check-update.js only matched JavaScript comment
 * syntax (//) — not bash comment syntax (#).
 *
 * Result: every session showed "⚠ stale hooks — run /gsd-update" immediately
 * after a fresh install, because the detector saw hookVersion: 'unknown' for
 * all three bash hooks.
 *
 * This fix requires THREE parts working in concert:
 *   1. Bash hooks ship with "# gsd-hook-version: {{GSD_VERSION}}"
 *   2. install.js substitutes {{GSD_VERSION}} in .sh files at install time
 *   3. gsd-check-update.js regex matches both "//" and "#" comment styles
 *
 * Neither fix alone is sufficient:
 *   - Headers + regex fix only (no install.js fix): installed hooks contain
 *     literal "{{GSD_VERSION}}" — the {{-guard silently skips them, making
 *     bash hook staleness permanently undetectable after future updates.
 *   - Headers + install.js fix only (no regex fix): installed hooks are
 *     stamped correctly but the detector still can't read bash "#" comments,
 *     so they still land in the "unknown / stale" branch on every session.
 */

'use strict';

// NOTE: Do NOT set GSD_TEST_MODE here — the E2E install tests spawn the
// real installer subprocess, which skips all install logic when GSD_TEST_MODE=1.

const { describe, test, before, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('fs');
const path = require('path');
const os = require('os');
const { execFileSync } = require('child_process');

const HOOKS_DIR = path.join(__dirname, '..', 'hooks');
const CHECK_UPDATE_FILE = path.join(HOOKS_DIR, 'gsd-check-update.js');
const WORKER_FILE = path.join(HOOKS_DIR, 'gsd-check-update-worker.js');
const INSTALL_SCRIPT = path.join(__dirname, '..', 'bin', 'install.js');
const BUILD_SCRIPT = path.join(__dirname, '..', 'scripts', 'build-hooks.js');

const SH_HOOKS = [
  'gsd-phase-boundary.sh',
  'gsd-session-state.sh',
  'gsd-validate-commit.sh',
];

// ─── Ensure hooks/dist/ is populated before install tests ────────────────────

before(() => {
  execFileSync(process.execPath, [BUILD_SCRIPT], {
    encoding: 'utf-8',
    stdio: 'pipe',
  });
});

// ─── Helpers ─────────────────────────────────────────────────────────────────

function createTempDir(prefix) {
  return fs.mkdtempSync(path.join(os.tmpdir(), prefix));
}

function cleanup(dir) {
  try { fs.rmSync(dir, { recursive: true, force: true }); } catch { /* ignore */ }
}

function runInstaller(configDir) {
  execFileSync(process.execPath, [INSTALL_SCRIPT, '--claude', '--global', '--yes'], {
    encoding: 'utf-8',
    stdio: 'pipe',
    env: { ...process.env, CLAUDE_CONFIG_DIR: configDir },
  });
  return path.join(configDir, 'hooks');
}

// ─────────────────────────────────────────────────────────────────────────────
// Part 1: Bash hook sources carry the version header placeholder
// ─────────────────────────────────────────────────────────────────────────────

describe('bug #2136 part 1: bash hook sources carry gsd-hook-version placeholder', () => {
  for (const sh of SH_HOOKS) {
    test(`${sh} contains "# gsd-hook-version: {{GSD_VERSION}}"`, () => {
      const content = fs.readFileSync(path.join(HOOKS_DIR, sh), 'utf8');
      assert.ok(
        content.includes('# gsd-hook-version: {{GSD_VERSION}}'),
        `${sh} must include "# gsd-hook-version: {{GSD_VERSION}}" so the ` +
        `installer can stamp it and gsd-check-update.js can detect staleness`
      );
    });
  }

  test('version header is on line 2 (immediately after shebang)', () => {
    // Placing the header immediately after #!/bin/bash ensures it is always
    // found regardless of how much of the file is read.
    for (const sh of SH_HOOKS) {
      const lines = fs.readFileSync(path.join(HOOKS_DIR, sh), 'utf8').split('\n');
      assert.strictEqual(lines[0], '#!/bin/bash', `${sh} line 1 must be #!/bin/bash`);
      assert.ok(
        lines[1].startsWith('# gsd-hook-version:'),
        `${sh} line 2 must be the gsd-hook-version header (got: "${lines[1]}")`
      );
    }
  });
});

// ─────────────────────────────────────────────────────────────────────────────
// Part 2: gsd-check-update-worker.js regex handles bash "#" comment syntax
// (Logic moved from inline -e template literal to dedicated worker file)
// ─────────────────────────────────────────────────────────────────────────────

describe('bug #2136 part 2: stale-hook detector handles bash comment syntax', () => {
  let src;

  before(() => {
    src = fs.readFileSync(WORKER_FILE, 'utf8');
  });

  test('version regex in source matches "#" comment syntax in addition to "//"', () => {
    // The regex string in the source must contain the alternation for "#".
    // The worker uses plain JS (no template-literal escaping), so the form is
    // "(?:\/\/|#)" directly in source.
    const hasBashAlternative =
      src.includes('(?:\\/\\/|#)') || // escaped form (old template-literal style)
      src.includes('(?:\/\/|#)');     // direct form in plain JS worker
    assert.ok(
      hasBashAlternative,
      'gsd-check-update-worker.js version regex must include an alternative for bash "#" comments. ' +
      'Expected to find (?:\\/\\/|#) or (?:\/\/|#) in the source. ' +
      'The original "//" only regex causes bash hooks to always report hookVersion: "unknown"'
    );
  });

  test('version regex does not use the old JS-only form as the sole pattern', () => {
    // The old regex inside the template literal was the string:
    //   /\\/\\/ gsd-hook-version:\\s*(.+)/
    // which, when evaluated in the subprocess, produced: /\/\/ gsd-hook-version:\s*(.+)/
    // That only matched JS "//" comments — never bash "#".
    // We verify that the old exact string no longer appears.
    assert.ok(
      !src.includes('\\/\\/ gsd-hook-version'),
      'gsd-check-update-worker.js must not use the old JS-only (\\/\\/ gsd-hook-version) ' +
      'escape form as the sole version matcher — it cannot match bash "#" comments'
    );
  });

  test('version regex correctly matches both bash and JS hook version headers', () => {
    // Verify that the versionMatch line in the source uses a regex that matches
    // both bash "#" and JS "//" comment styles. We check the source contains the
    // expected alternation, then directly test the known required pattern.
    //
    // We do NOT try to extract and evaluate the regex from source (it contains ")"
    // which breaks simple extraction), so instead we confirm the source matches
    // our expectation and run the regex itself.
    assert.ok(
      src.includes('gsd-hook-version'),
      'gsd-check-update-worker.js must contain a gsd-hook-version version check'
    );

    // The fixed regex that must be present: matches both comment styles
    const fixedRegex = /(?:\/\/|#) gsd-hook-version:\s*(.+)/;

    assert.ok(
      fixedRegex.test('# gsd-hook-version: 1.36.0'),
      'bash-style "# gsd-hook-version: X" must be matchable by the required regex'
    );
    assert.ok(
      fixedRegex.test('// gsd-hook-version: 1.36.0'),
      'JS-style "// gsd-hook-version: X" must still match (no regression)'
    );
    assert.ok(
      !fixedRegex.test('gsd-hook-version: 1.36.0'),
      'line without a comment prefix must not match (prevents false positives)'
    );
  });
});

// ─────────────────────────────────────────────────────────────────────────────
// Part 3a: install.js bundled path substitutes {{GSD_VERSION}} in .sh hooks
// ─────────────────────────────────────────────────────────────────────────────

describe('bug #2136 part 3a: install.js bundled path substitutes {{GSD_VERSION}} in .sh hooks', () => {
  let src;

  before(() => {
    src = fs.readFileSync(INSTALL_SCRIPT, 'utf8');
  });

  test('.sh branch in bundled hook copy loop reads file and substitutes GSD_VERSION', () => {
    // Anchor on configDirReplacement — unique to the bundled-hooks path.
    const anchorIdx = src.indexOf('configDirReplacement');
    assert.ok(anchorIdx !== -1, 'bundled hook copy loop anchor (configDirReplacement) not found');

    // Window large enough for the if/else block
|
||||
const region = src.slice(anchorIdx, anchorIdx + 2000);
|
||||
|
||||
assert.ok(
|
||||
region.includes("entry.endsWith('.sh')"),
|
||||
"bundled hook copy loop must check entry.endsWith('.sh')"
|
||||
);
|
||||
assert.ok(
|
||||
region.includes('GSD_VERSION'),
|
||||
'bundled .sh branch must reference GSD_VERSION substitution. Without this, ' +
|
||||
'installed .sh hooks contain the literal "{{GSD_VERSION}}" placeholder and ' +
|
||||
'bash hook staleness becomes permanently undetectable after future updates'
|
||||
);
|
||||
// copyFileSync on a .sh file would skip substitution — ensure we read+write instead
|
||||
const shBranchIdx = region.indexOf("entry.endsWith('.sh')");
|
||||
const shBranchRegion = region.slice(shBranchIdx, shBranchIdx + 400);
|
||||
assert.ok(
|
||||
shBranchRegion.includes('readFileSync') || shBranchRegion.includes('writeFileSync'),
|
||||
'bundled .sh branch must read the file (readFileSync) to perform substitution, ' +
|
||||
'not copyFileSync directly (which skips template expansion)'
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
// ─────────────────────────────────────────────────────────────────────────────
|
||||
// Part 3b: install.js Codex path also substitutes {{GSD_VERSION}} in .sh hooks
|
||||
// ─────────────────────────────────────────────────────────────────────────────
|
||||
|
||||
describe('bug #2136 part 3b: install.js Codex path substitutes {{GSD_VERSION}} in .sh hooks', () => {
|
||||
let src;
|
||||
|
||||
before(() => {
|
||||
src = fs.readFileSync(INSTALL_SCRIPT, 'utf8');
|
||||
});
|
||||
|
||||
test('.sh branch in Codex hook copy block substitutes GSD_VERSION', () => {
|
||||
// Anchor on codexHooksSrc — unique to the Codex path.
|
||||
const anchorIdx = src.indexOf('codexHooksSrc');
|
||||
assert.ok(anchorIdx !== -1, 'Codex hook copy block anchor (codexHooksSrc) not found');
|
||||
|
||||
const region = src.slice(anchorIdx, anchorIdx + 2000);
|
||||
|
||||
assert.ok(
|
||||
region.includes("entry.endsWith('.sh')"),
|
||||
"Codex hook copy block must check entry.endsWith('.sh')"
|
||||
);
|
||||
assert.ok(
|
||||
region.includes('GSD_VERSION'),
|
||||
'Codex .sh branch must substitute {{GSD_VERSION}}. The bundled path was fixed ' +
|
||||
'but Codex installs a separate copy of the hooks from hooks/dist that also needs stamping'
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
// ─────────────────────────────────────────────────────────────────────────────
|
||||
// Part 4: End-to-end — installed .sh hooks have stamped version, not placeholder
|
||||
// ─────────────────────────────────────────────────────────────────────────────
|
||||
|
||||
describe('bug #2136 part 4: installed .sh hooks contain stamped concrete version', () => {
|
||||
let tmpDir;
|
||||
|
||||
beforeEach(() => {
|
||||
tmpDir = createTempDir('gsd-2136-install-');
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
cleanup(tmpDir);
|
||||
});
|
||||
|
||||
test('installed .sh hooks contain a concrete version string, not the template placeholder', () => {
|
||||
const hooksDir = runInstaller(tmpDir);
|
||||
|
||||
for (const sh of SH_HOOKS) {
|
||||
const hookPath = path.join(hooksDir, sh);
|
||||
assert.ok(fs.existsSync(hookPath), `${sh} must be installed`);
|
||||
|
||||
const content = fs.readFileSync(hookPath, 'utf8');
|
||||
|
||||
assert.ok(
|
||||
content.includes('# gsd-hook-version:'),
|
||||
`installed ${sh} must contain a "# gsd-hook-version:" header`
|
||||
);
|
||||
assert.ok(
|
||||
!content.includes('{{GSD_VERSION}}'),
|
||||
`installed ${sh} must not contain literal "{{GSD_VERSION}}" — ` +
|
||||
`install.js must substitute it with the concrete package version`
|
||||
);
|
||||
|
||||
const versionMatch = content.match(/# gsd-hook-version:\s*(\S+)/);
|
||||
assert.ok(versionMatch, `installed ${sh} version header must have a version value`);
|
||||
assert.match(
|
||||
versionMatch[1],
|
||||
/^\d+\.\d+\.\d+/,
|
||||
`installed ${sh} version "${versionMatch[1]}" must be a semver-like string`
|
||||
);
|
||||
}
|
||||
});
|
||||
|
||||
test('stale-hook detector reports zero stale bash hooks immediately after fresh install', () => {
|
||||
// This is the definitive end-to-end proof: after install, run the actual
|
||||
// version-check logic (extracted from gsd-check-update.js) against the
|
||||
// installed hooks and verify none are flagged stale.
|
||||
const hooksDir = runInstaller(tmpDir);
|
||||
const pkg = require(path.join(__dirname, '..', 'package.json'));
|
||||
const installedVersion = pkg.version;
|
||||
|
||||
// Build a subprocess that runs the staleness check logic in isolation.
|
||||
// We pass the installed version, hooks dir, and hook filenames as JSON
|
||||
// to avoid any injection risk.
|
||||
const checkScript = `
|
||||
'use strict';
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
|
||||
function isNewer(a, b) {
|
||||
const pa = (a || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
|
||||
const pb = (b || '').split('.').map(s => Number(s.replace(/-.*/, '')) || 0);
|
||||
for (let i = 0; i < 3; i++) {
|
||||
if (pa[i] > pb[i]) return true;
|
||||
if (pa[i] < pb[i]) return false;
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
const hooksDir = ${JSON.stringify(hooksDir)};
|
||||
const installed = ${JSON.stringify(installedVersion)};
|
||||
const shHooks = ${JSON.stringify(SH_HOOKS)};
|
||||
// Use the same regex that the fixed gsd-check-update.js uses
|
||||
const versionRe = /(?:\\/\\/|#) gsd-hook-version:\\s*(.+)/;
|
||||
|
||||
const staleHooks = [];
|
||||
for (const hookFile of shHooks) {
|
||||
const hookPath = path.join(hooksDir, hookFile);
|
||||
if (!fs.existsSync(hookPath)) {
|
||||
staleHooks.push({ file: hookFile, hookVersion: 'missing' });
|
||||
continue;
|
||||
}
|
||||
const content = fs.readFileSync(hookPath, 'utf8');
|
||||
const m = content.match(versionRe);
|
||||
if (m) {
|
||||
const hookVersion = m[1].trim();
|
||||
if (isNewer(installed, hookVersion) && !hookVersion.includes('{{')) {
|
||||
staleHooks.push({ file: hookFile, hookVersion, installedVersion: installed });
|
||||
}
|
||||
} else {
|
||||
staleHooks.push({ file: hookFile, hookVersion: 'unknown', installedVersion: installed });
|
||||
}
|
||||
}
|
||||
process.stdout.write(JSON.stringify(staleHooks));
|
||||
`;
|
||||
|
||||
const result = execFileSync(process.execPath, ['-e', checkScript], { encoding: 'utf8' });
|
||||
const staleHooks = JSON.parse(result);
|
||||
|
||||
assert.deepStrictEqual(
|
||||
staleHooks,
|
||||
[],
|
||||
`Fresh install must produce zero stale bash hooks.\n` +
|
||||
`Got: ${JSON.stringify(staleHooks, null, 2)}\n` +
|
||||
`This indicates either the version header was not stamped by install.js, ` +
|
||||
`or the detector regex cannot match bash "#" comment syntax.`
|
||||
);
|
||||
});
|
||||
});
|
||||
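As a standalone sanity check of the alternation these tests pin down — the regex is taken directly from the tests above; the sample header lines are illustrative only:

```javascript
// The comment-prefix alternation required by the part 2 tests, run against
// sample hook headers (the version numbers here are just examples).
const versionRe = /(?:\/\/|#) gsd-hook-version:\s*(.+)/;

const samples = [
  '# gsd-hook-version: 1.36.0',  // bash-style header
  '// gsd-hook-version: 1.36.0', // JS-style header
  'gsd-hook-version: 1.36.0',    // no comment prefix — must not match
];

const results = samples.map((line) => {
  const m = line.match(versionRe);
  return m ? m[1] : null;
});

console.log(JSON.stringify(results)); // → ["1.36.0","1.36.0",null]
```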
@@ -148,6 +148,38 @@ describe('generate-claude-md skills section', () => {
     assert.ok(content.includes('ERP synchronization flows'));
   });
 
+  test('discovers skills from .codex/skills/ directory and ignores deprecated import-only roots', () => {
+    const codexSkillDir = path.join(tmpDir, '.codex', 'skills', 'automation');
+    fs.mkdirSync(codexSkillDir, { recursive: true });
+    fs.writeFileSync(
+      path.join(codexSkillDir, 'SKILL.md'),
+      '---\nname: automation\ndescription: Project Codex skill.\n---\n\n# Automation\n'
+    );
+
+    const homeDir = fs.mkdtempSync(path.join(require('os').tmpdir(), 'gsd-claude-skills-home-'));
+    fs.mkdirSync(path.join(homeDir, '.claude', 'get-shit-done', 'skills', 'import-only'), { recursive: true });
+    fs.writeFileSync(
+      path.join(homeDir, '.claude', 'get-shit-done', 'skills', 'import-only', 'SKILL.md'),
+      '---\nname: import-only\ndescription: Deprecated import-only skill.\n---\n'
+    );
+
+    const originalHome = process.env.HOME;
+    process.env.HOME = homeDir;
+
+    try {
+      const result = runGsdTools('generate-claude-md', tmpDir);
+      assert.ok(result.success, `Command failed: ${result.error}`);
+
+      const content = fs.readFileSync(path.join(tmpDir, 'CLAUDE.md'), 'utf-8');
+      assert.ok(content.includes('automation'));
+      assert.ok(content.includes('Project Codex skill'));
+      assert.ok(!content.includes('import-only'));
+    } finally {
+      process.env.HOME = originalHome;
+      cleanup(homeDir);
+    }
+  });
+
   test('skips gsd- prefixed skill directories', () => {
     const gsdSkillDir = path.join(tmpDir, '.claude', 'skills', 'gsd-plan-phase');
     const userSkillDir = path.join(tmpDir, '.claude', 'skills', 'my-feature');
tests/command-count-sync.test.cjs (new file, 93 lines)
@@ -0,0 +1,93 @@
/**
 * Regression test: command count in docs/ARCHITECTURE.md must match
 * the actual number of .md files in commands/gsd/.
 *
 * Counts are extracted from the doc programmatically — never hardcoded
 * in this test — so any future drift (adding a command without updating
 * the doc, or vice-versa) is caught immediately.
 *
 * Related: issue #2257
 */
'use strict';

const { describe, test } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('node:fs');
const path = require('node:path');

const ROOT = path.resolve(__dirname, '..');
const COMMANDS_DIR = path.join(ROOT, 'commands', 'gsd');
const ARCH_MD = path.join(ROOT, 'docs', 'ARCHITECTURE.md');

/**
 * Count .md files that actually live in commands/gsd/.
 * Does not recurse into subdirectories.
 */
function actualCommandCount() {
  return fs
    .readdirSync(COMMANDS_DIR)
    .filter((f) => f.endsWith('.md'))
    .length;
}

/**
 * Extract the integer from the "**Total commands:** N" prose line in
 * ARCHITECTURE.md. Returns null if the pattern is not found.
 */
function docProseCount(content) {
  const m = content.match(/\*\*Total commands:\*\*\s+(\d+)/);
  return m ? parseInt(m[1], 10) : null;
}

/**
 * Extract the integer from the directory-tree comment line:
 *   ├── commands/gsd/*.md   # N slash commands
 * Returns null if the pattern is not found.
 */
function docTreeCount(content) {
  const m = content.match(/commands\/gsd\/\*\.md[^\n]*#\s*(\d+)\s+slash commands/);
  return m ? parseInt(m[1], 10) : null;
}

describe('ARCHITECTURE.md command count sync', () => {
  const archContent = fs.readFileSync(ARCH_MD, 'utf8');
  const actual = actualCommandCount();

  test('docs/ARCHITECTURE.md contains a "Total commands:" prose count', () => {
    const count = docProseCount(archContent);
    assert.notEqual(count, null, 'Expected "**Total commands:** N" line not found in ARCHITECTURE.md');
  });

  test('docs/ARCHITECTURE.md contains a directory-tree slash-command count', () => {
    const count = docTreeCount(archContent);
    assert.notEqual(count, null, 'Expected "# N slash commands" tree comment not found in ARCHITECTURE.md');
  });

  test('"Total commands:" prose count matches actual commands/gsd/ file count', () => {
    const prose = docProseCount(archContent);
    assert.equal(
      prose,
      actual,
      `ARCHITECTURE.md "Total commands:" says ${prose} but commands/gsd/ has ${actual} .md files — update the doc`,
    );
  });

  test('directory-tree slash-command count matches actual commands/gsd/ file count', () => {
    const tree = docTreeCount(archContent);
    assert.equal(
      tree,
      actual,
      `ARCHITECTURE.md directory tree says ${tree} slash commands but commands/gsd/ has ${actual} .md files — update the doc`,
    );
  });

  test('"Total commands:" prose count and directory-tree count agree with each other', () => {
    const prose = docProseCount(archContent);
    const tree = docTreeCount(archContent);
    assert.equal(
      prose,
      tree,
      `ARCHITECTURE.md has two mismatched counts: "Total commands: ${prose}" vs tree "# ${tree} slash commands"`,
    );
  });
});
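A quick illustration of the two extraction patterns used above, applied to a made-up ARCHITECTURE.md fragment (the fragment text and the count 26 are invented for the example; the regexes are the ones from the test):

```javascript
// Both counts come from the doc via regex; here they agree at 26.
const sample = [
  '**Total commands:** 26',
  '├── commands/gsd/*.md        # 26 slash commands',
].join('\n');

const prose = sample.match(/\*\*Total commands:\*\*\s+(\d+)/);
const tree = sample.match(/commands\/gsd\/\*\.md[^\n]*#\s*(\d+)\s+slash commands/);

console.log(parseInt(prose[1], 10), parseInt(tree[1], 10)); // → 26 26
```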
@@ -1071,8 +1071,10 @@ describe('stale hook filter', () => {
 
 describe('stale hook path', () => {
   test('gsd-check-update.js checks configDir/hooks/ where hooks are actually installed (#1421)', () => {
+    // The stale-hook scan logic lives in the worker (moved from inline -e template literal).
+    // The worker receives configDir via env and constructs the hooksDir path.
     const content = fs.readFileSync(
-      path.join(__dirname, '..', 'hooks', 'gsd-check-update.js'), 'utf-8'
+      path.join(__dirname, '..', 'hooks', 'gsd-check-update-worker.js'), 'utf-8'
     );
     // Hooks are installed at configDir/hooks/ (e.g. ~/.claude/hooks/),
     // not configDir/get-shit-done/hooks/ which doesn't exist (#1421)
@@ -18,7 +18,9 @@ const fs = require('fs');
 const path = require('path');
 
 const HOOKS_DIR = path.join(__dirname, '..', 'hooks');
-const CHECK_UPDATE_FILE = path.join(HOOKS_DIR, 'gsd-check-update.js');
+// MANAGED_HOOKS now lives in the worker script (extracted from inline -e code
+// to avoid template-literal regex-escaping concerns). The test reads the worker.
+const MANAGED_HOOKS_FILE = path.join(HOOKS_DIR, 'gsd-check-update-worker.js');
 
 describe('bug #2136: MANAGED_HOOKS must include all shipped hook files', () => {
   let src;
@@ -26,12 +28,12 @@ describe('bug #2136: MANAGED_HOOKS must include all shipped hook files', () => {
   let shippedHooks;
 
     // Read once — all tests share the same source snapshot
-    src = fs.readFileSync(CHECK_UPDATE_FILE, 'utf-8');
+    src = fs.readFileSync(MANAGED_HOOKS_FILE, 'utf-8');
 
     // Extract the MANAGED_HOOKS array entries from the source
     // The array is defined as a multi-line array literal of quoted strings
     const match = src.match(/const MANAGED_HOOKS\s*=\s*\[([\s\S]*?)\]/);
-    assert.ok(match, 'MANAGED_HOOKS array not found in gsd-check-update.js');
+    assert.ok(match, 'MANAGED_HOOKS array not found in gsd-check-update-worker.js');
 
     managedHooks = match[1]
       .split('\n')
@@ -47,7 +49,7 @@ describe('bug #2136: MANAGED_HOOKS must include all shipped hook files', () => {
     for (const hookFile of jsHooks) {
       assert.ok(
         managedHooks.includes(hookFile),
-        `${hookFile} is shipped in hooks/ but missing from MANAGED_HOOKS in gsd-check-update.js`
+        `${hookFile} is shipped in hooks/ but missing from MANAGED_HOOKS in gsd-check-update-worker.js`
       );
     }
   });
@@ -57,7 +59,7 @@ describe('bug #2136: MANAGED_HOOKS must include all shipped hook files', () => {
     for (const hookFile of shHooks) {
       assert.ok(
         managedHooks.includes(hookFile),
-        `${hookFile} is shipped in hooks/ but missing from MANAGED_HOOKS in gsd-check-update.js`
+        `${hookFile} is shipped in hooks/ but missing from MANAGED_HOOKS in gsd-check-update-worker.js`
       );
     }
   });
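The MANAGED_HOOKS extraction these tests perform can be sketched in isolation. The extraction regex is the one from the test; the toy source string and the trim/strip parsing step are assumptions, since the diff truncates the real parsing right after `.split('\n')`:

```javascript
// Toy worker source containing a MANAGED_HOOKS array literal (hook names are
// real shipped hooks; the surrounding source is invented for the example).
const src = `
const MANAGED_HOOKS = [
  'gsd-check-update.js',
  'gsd-validate-commit.sh',
];
`;

// Same extraction regex as the test above.
const match = src.match(/const MANAGED_HOOKS\s*=\s*\[([\s\S]*?)\]/);

// Assumed parsing: trim each line and strip quotes/commas (the actual
// implementation in the worker tests may differ past .split('\n')).
const managedHooks = match[1]
  .split('\n')
  .map((l) => l.trim().replace(/['",]/g, ''))
  .filter(Boolean);

console.log(JSON.stringify(managedHooks)); // → ["gsd-check-update.js","gsd-validate-commit.sh"]
```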
@@ -11,31 +11,41 @@ const assert = require('node:assert/strict');
 const fs = require('fs');
 const path = require('path');
 
+// MANAGED_HOOKS lives in the worker file (extracted from inline -e code to eliminate
+// template-literal regex-escaping concerns). Tests read the worker directly.
 const CHECK_UPDATE_PATH = path.join(__dirname, '..', 'hooks', 'gsd-check-update.js');
+const WORKER_PATH = path.join(__dirname, '..', 'hooks', 'gsd-check-update-worker.js');
 const BUILD_HOOKS_PATH = path.join(__dirname, '..', 'scripts', 'build-hooks.js');
 
 describe('orphaned hooks stale detection (#1750)', () => {
   test('stale hook scanner uses an allowlist of managed hooks, not a wildcard', () => {
-    const content = fs.readFileSync(CHECK_UPDATE_PATH, 'utf8');
+    const content = fs.readFileSync(WORKER_PATH, 'utf8');
 
     // The scanner MUST NOT use a broad `startsWith('gsd-')` filter that catches
     // orphaned files from removed features (gsd-intel-index.js, gsd-intel-prune.js, etc.)
     // Instead, it should reference a known set of managed hook filenames.
 
-    // Extract the spawned child script (everything between the template literal backticks)
-    const childScriptMatch = content.match(/spawn\(process\.execPath,\s*\['-e',\s*`([\s\S]*?)`\]/);
-    assert.ok(childScriptMatch, 'should find the spawned child script');
-    const childScript = childScriptMatch[1];
-
     // The child script must NOT have a broad gsd-*.js wildcard filter
-    const hasBroadFilter = /readdirSync\([^)]+\)\.filter\([^)]*startsWith\('gsd-'\)\s*&&[^)]*endsWith\('\.js'\)/s.test(childScript);
+    const hasBroadFilter = /readdirSync\([^)]+\)\.filter\([^)]*startsWith\('gsd-'\)\s*&&[^)]*endsWith\('\.js'\)/s.test(content);
     assert.ok(!hasBroadFilter,
       'scanner must NOT use broad startsWith("gsd-") && endsWith(".js") filter — ' +
       'this catches orphaned hooks from removed features (e.g., gsd-intel-index.js). ' +
       'Use a MANAGED_HOOKS allowlist instead.');
   });
 
-  test('managed hooks list in check-update matches build-hooks HOOKS_TO_COPY JS entries', () => {
+  test('gsd-check-update.js spawns the worker by file path (not inline -e code)', () => {
+    // After the worker extraction, the main hook must spawn the worker file
+    // rather than embedding all logic in a template literal.
+    const content = fs.readFileSync(CHECK_UPDATE_PATH, 'utf8');
+    assert.ok(
+      content.includes('gsd-check-update-worker.js'),
+      'gsd-check-update.js must reference gsd-check-update-worker.js as the spawn target'
+    );
+    assert.ok(
+      !content.includes("'-e'"),
+      'gsd-check-update.js must not use node -e inline code (logic moved to worker file)'
+    );
+  });
+
+  test('managed hooks list in worker matches build-hooks HOOKS_TO_COPY JS entries', () => {
     // Extract JS hooks from build-hooks.js HOOKS_TO_COPY
     const buildContent = fs.readFileSync(BUILD_HOOKS_PATH, 'utf8');
     const hooksArrayMatch = buildContent.match(/HOOKS_TO_COPY\s*=\s*\[([\s\S]*?)\]/);
@@ -48,25 +58,18 @@ describe('orphaned hooks stale detection (#1750)', () => {
     }
     assert.ok(jsHooks.length >= 5, `expected at least 5 JS hooks in HOOKS_TO_COPY, got ${jsHooks.length}`);
 
-    // The check-update hook should define its own managed hooks list
-    // that matches the JS entries from HOOKS_TO_COPY
-    const checkContent = fs.readFileSync(CHECK_UPDATE_PATH, 'utf8');
-    const childScriptMatch = checkContent.match(/spawn\(process\.execPath,\s*\['-e',\s*`([\s\S]*?)`\]/);
-    const childScript = childScriptMatch[1];
-
-    // Verify each JS hook from HOOKS_TO_COPY is referenced in the managed list
+    // MANAGED_HOOKS in the worker must include each JS hook from HOOKS_TO_COPY
+    const workerContent = fs.readFileSync(WORKER_PATH, 'utf8');
     for (const hook of jsHooks) {
       assert.ok(
-        childScript.includes(hook),
-        `managed hooks in check-update should include '${hook}' from HOOKS_TO_COPY`
+        workerContent.includes(hook),
+        `MANAGED_HOOKS in worker should include '${hook}' from HOOKS_TO_COPY`
       );
     }
   });
 
-  test('orphaned hook filenames would NOT match the managed hooks list', () => {
-    const checkContent = fs.readFileSync(CHECK_UPDATE_PATH, 'utf8');
-    const childScriptMatch = checkContent.match(/spawn\(process\.execPath,\s*\['-e',\s*`([\s\S]*?)`\]/);
-    const childScript = childScriptMatch[1];
+  test('orphaned hook filenames are NOT in the MANAGED_HOOKS list', () => {
+    const workerContent = fs.readFileSync(WORKER_PATH, 'utf8');
 
     // These are real orphaned hooks from the removed intel feature
     const orphanedHooks = [
@@ -77,8 +80,8 @@ describe('orphaned hooks stale detection (#1750)', () => {
 
     for (const orphan of orphanedHooks) {
       assert.ok(
-        !childScript.includes(orphan),
-        `orphaned hook '${orphan}' must NOT be in the managed hooks list`
+        !workerContent.includes(orphan),
+        `orphaned hook '${orphan}' must NOT be in the MANAGED_HOOKS list`
       );
     }
   });
@@ -157,6 +157,19 @@ describe('generate-claude-md command', () => {
     const content = fs.readFileSync(outputPath, 'utf-8');
     assert.ok(content.length > 0, 'should still have content');
   });
 
+  test('skills fallback mentions the normalized project roots', () => {
+    const result = runGsdTools('generate-claude-md', tmpDir);
+    assert.ok(result.success, `Failed: ${result.error}`);
+
+    const content = fs.readFileSync(path.join(tmpDir, 'CLAUDE.md'), 'utf-8');
+    assert.ok(content.includes('.claude/skills/'));
+    assert.ok(content.includes('.agents/skills/'));
+    assert.ok(content.includes('.cursor/skills/'));
+    assert.ok(content.includes('.github/skills/'));
+    assert.ok(content.includes('.codex/skills/'));
+    assert.ok(!content.includes('get-shit-done/skills'));
+  });
 });
 
 // ─── generate-dev-preferences ─────────────────────────────────────────────────
@@ -1,6 +1,5 @@
|
||||
/**
|
||||
* Tests for skill-manifest command
|
||||
* TDD: RED phase — tests written before implementation
|
||||
*/
|
||||
|
||||
const { describe, test, beforeEach, afterEach } = require('node:test');
|
||||
@@ -9,211 +8,123 @@ const fs = require('fs');
|
||||
const path = require('path');
|
||||
const { runGsdTools, createTempProject, cleanup } = require('./helpers.cjs');
|
||||
|
||||
function writeSkill(rootDir, name, description, body = '') {
|
||||
const skillDir = path.join(rootDir, name);
|
||||
fs.mkdirSync(skillDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
|
||||
'---',
|
||||
`name: ${name}`,
|
||||
`description: ${description}`,
|
||||
'---',
|
||||
'',
|
||||
body || `# ${name}`,
|
||||
].join('\n'));
|
||||
}
|
||||
|
||||
describe('skill-manifest', () => {
|
||||
let tmpDir;
|
||||
let homeDir;
|
||||
|
||||
beforeEach(() => {
|
||||
tmpDir = createTempProject();
|
||||
homeDir = fs.mkdtempSync(path.join(require('os').tmpdir(), 'gsd-skill-manifest-home-'));
|
||||
|
||||
writeSkill(path.join(tmpDir, '.claude', 'skills'), 'project-claude', 'Project Claude skill');
|
||||
writeSkill(path.join(tmpDir, '.claude', 'skills'), 'gsd-help', 'Installed GSD skill');
|
||||
writeSkill(path.join(tmpDir, '.agents', 'skills'), 'project-agents', 'Project agent skill');
|
||||
writeSkill(path.join(tmpDir, '.codex', 'skills'), 'project-codex', 'Project Codex skill');
|
||||
|
||||
writeSkill(path.join(homeDir, '.claude', 'skills'), 'global-claude', 'Global Claude skill');
|
||||
writeSkill(path.join(homeDir, '.codex', 'skills'), 'global-codex', 'Global Codex skill');
|
||||
writeSkill(
|
||||
path.join(homeDir, '.claude', 'get-shit-done', 'skills'),
|
||||
'legacy-import',
|
||||
'Deprecated import-only skill'
|
||||
);
|
||||
|
||||
fs.mkdirSync(path.join(homeDir, '.claude', 'commands', 'gsd'), { recursive: true });
|
||||
fs.writeFileSync(path.join(homeDir, '.claude', 'commands', 'gsd', 'help.md'), '# legacy');
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
cleanup(tmpDir);
|
||||
cleanup(homeDir);
|
||||
});
|
||||
|
||||
test('skill-manifest command exists and returns JSON', () => {
|
||||
// Create a skills directory with one skill
|
||||
const skillDir = path.join(tmpDir, '.claude', 'skills', 'test-skill');
|
||||
fs.mkdirSync(skillDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
|
||||
'---',
|
||||
'name: test-skill',
|
||||
'description: A test skill',
|
||||
'---',
|
||||
'',
|
||||
'# Test Skill',
|
||||
].join('\n'));
|
||||
|
||||
const result = runGsdTools(['skill-manifest', '--skills-dir', path.join(tmpDir, '.claude', 'skills')], tmpDir);
|
||||
test('returns normalized inventory across canonical roots', () => {
|
||||
const result = runGsdTools(['skill-manifest'], tmpDir, { HOME: homeDir });
|
||||
assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);
|
||||
|
||||
const manifest = JSON.parse(result.output);
|
||||
assert.ok(Array.isArray(manifest), 'Manifest should be an array');
|
||||
});
|
||||
assert.ok(Array.isArray(manifest.skills), 'skills should be an array');
|
||||
assert.ok(Array.isArray(manifest.roots), 'roots should be an array');
|
||||
assert.ok(manifest.installation && typeof manifest.installation === 'object', 'installation summary present');
|
||||
assert.ok(manifest.counts && typeof manifest.counts === 'object', 'counts summary present');
|
||||
|
||||
test('generates manifest with correct structure from SKILL.md frontmatter', () => {
|
||||
const skillDir = path.join(tmpDir, '.claude', 'skills', 'my-skill');
|
||||
fs.mkdirSync(skillDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
|
||||
'---',
|
||||
'name: my-skill',
|
||||
'description: Does something useful',
|
||||
'---',
|
||||
'',
|
||||
'# My Skill',
|
||||
'',
|
||||
'TRIGGER when: user asks about widgets',
|
||||
].join('\n'));
|
||||
const skillNames = manifest.skills.map((skill) => skill.name).sort();
|
||||
assert.deepStrictEqual(skillNames, [
|
||||
'global-claude',
|
||||
'global-codex',
|
||||
'gsd-help',
|
||||
'legacy-import',
|
||||
'project-agents',
|
||||
'project-claude',
|
||||
'project-codex',
|
||||
]);
|
||||
|
||||
const result = runGsdTools(['skill-manifest', '--skills-dir', path.join(tmpDir, '.claude', 'skills')], tmpDir);
|
||||
assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);
|
||||
|
||||
const manifest = JSON.parse(result.output);
|
||||
assert.strictEqual(manifest.length, 1);
|
||||
assert.strictEqual(manifest[0].name, 'my-skill');
|
||||
assert.strictEqual(manifest[0].description, 'Does something useful');
|
||||
assert.strictEqual(manifest[0].path, 'my-skill');
|
||||
});
|
||||
|
||||
test('empty skills directory produces empty manifest', () => {
|
||||
const skillsDir = path.join(tmpDir, '.claude', 'skills');
|
||||
fs.mkdirSync(skillsDir, { recursive: true });
|
||||
|
||||
const result = runGsdTools(['skill-manifest', '--skills-dir', skillsDir], tmpDir);
|
||||
assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);
|
||||
|
||||
const manifest = JSON.parse(result.output);
|
||||
assert.ok(Array.isArray(manifest), 'Manifest should be an array');
|
||||
assert.strictEqual(manifest.length, 0);
|
||||
});
|
||||
|
||||
test('skills without SKILL.md are skipped', () => {
|
||||
const skillsDir = path.join(tmpDir, '.claude', 'skills');
|
||||
// Skill with SKILL.md
|
||||
const goodDir = path.join(skillsDir, 'good-skill');
|
||||
fs.mkdirSync(goodDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(goodDir, 'SKILL.md'), [
|
||||
'---',
|
||||
'name: good-skill',
|
||||
'description: Has a SKILL.md',
|
||||
'---',
|
||||
'',
|
||||
'# Good Skill',
|
||||
].join('\n'));
|
||||
|
||||
// Skill without SKILL.md (just a directory)
|
||||
const badDir = path.join(skillsDir, 'bad-skill');
|
||||
fs.mkdirSync(badDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(badDir, 'README.md'), '# No SKILL.md here');
|
||||
|
||||
const result = runGsdTools(['skill-manifest', '--skills-dir', skillsDir], tmpDir);
|
||||
assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);
|
||||
|
||||
const manifest = JSON.parse(result.output);
|
||||
assert.strictEqual(manifest.length, 1);
|
||||
assert.strictEqual(manifest[0].name, 'good-skill');
|
||||
});
|
||||
|
||||
test('manifest includes frontmatter fields from SKILL.md', () => {
|
||||
const skillDir = path.join(tmpDir, '.claude', 'skills', 'rich-skill');
|
||||
fs.mkdirSync(skillDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
|
||||
'---',
|
||||
'name: rich-skill',
|
||||
'description: A richly documented skill',
|
||||
'---',
|
||||
'',
|
||||
'# Rich Skill',
|
||||
'',
|
||||
'TRIGGER when: user mentions databases',
|
||||
'DO NOT TRIGGER when: user asks about frontend',
|
||||
].join('\n'));
|
||||
|
||||
const result = runGsdTools(['skill-manifest', '--skills-dir', path.join(tmpDir, '.claude', 'skills')], tmpDir);
|
||||
assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);
|
||||
|
||||
const manifest = JSON.parse(result.output);
|
||||
assert.strictEqual(manifest.length, 1);
|
||||
|
||||
const skill = manifest[0];
|
||||
assert.strictEqual(skill.name, 'rich-skill');
|
||||
assert.strictEqual(skill.description, 'A richly documented skill');
|
||||
assert.strictEqual(skill.path, 'rich-skill');
|
||||
// triggers extracted from body text
|
||||
assert.ok(Array.isArray(skill.triggers), 'triggers should be an array');
|
||||
assert.ok(skill.triggers.length > 0, 'triggers should have at least one entry');
|
||||
assert.ok(skill.triggers.some(t => t.includes('databases')), 'triggers should mention databases');
|
||||
});
|
||||
|
||||
  test('multiple skills are all included in manifest', () => {
    const skillsDir = path.join(tmpDir, '.claude', 'skills');

    for (const name of ['alpha', 'beta', 'gamma']) {
      const dir = path.join(skillsDir, name);
      fs.mkdirSync(dir, { recursive: true });
      fs.writeFileSync(path.join(dir, 'SKILL.md'), [
        '---',
        `name: ${name}`,
        `description: The ${name} skill`,
        '---',
        '',
        `# ${name}`,
      ].join('\n'));
    }

    const result = runGsdTools(['skill-manifest', '--skills-dir', skillsDir], tmpDir);
    assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);

    const manifest = JSON.parse(result.output);
    assert.strictEqual(manifest.length, 3);
    const names = manifest.map(s => s.name).sort();
    assert.deepStrictEqual(names, ['alpha', 'beta', 'gamma']);
  });

  test('writes manifest to .planning/skill-manifest.json when --write flag is used', () => {
    const skillDir = path.join(tmpDir, '.claude', 'skills', 'write-test');
    fs.mkdirSync(skillDir, { recursive: true });
    fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
      '---',
      'name: write-test',
      'description: Tests write mode',
      '---',
      '',
      '# Write Test',
    ].join('\n'));

    const result = runGsdTools(['skill-manifest', '--skills-dir', path.join(tmpDir, '.claude', 'skills'), '--write'], tmpDir);
    assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);

    const manifestPath = path.join(tmpDir, '.planning', 'skill-manifest.json');
    assert.ok(fs.existsSync(manifestPath), 'skill-manifest.json should be written to .planning/');

    const manifest = JSON.parse(fs.readFileSync(manifestPath, 'utf-8'));
    assert.strictEqual(manifest.length, 1);
    assert.strictEqual(manifest[0].name, 'write-test');
  });

  test('nonexistent skills directory returns empty manifest', () => {
    const result = runGsdTools(['skill-manifest', '--skills-dir', path.join(tmpDir, 'nonexistent')], tmpDir);
    assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);

    const manifest = JSON.parse(result.output);
    assert.ok(Array.isArray(manifest), 'Manifest should be an array');
    assert.strictEqual(manifest.length, 0);
  });

  test('files in skills directory are ignored (only subdirectories scanned)', () => {
    const skillsDir = path.join(tmpDir, '.claude', 'skills');
    fs.mkdirSync(skillsDir, { recursive: true });
    // A file, not a directory
    fs.writeFileSync(path.join(skillsDir, 'not-a-skill.md'), '# Not a skill');

    // A valid skill directory
    const skillDir = path.join(skillsDir, 'real-skill');
    fs.mkdirSync(skillDir, { recursive: true });
    fs.writeFileSync(path.join(skillDir, 'SKILL.md'), [
      '---',
      'name: real-skill',
      'description: A real skill',
      '---',
      '',
      '# Real Skill',
    ].join('\n'));

    const result = runGsdTools(['skill-manifest', '--skills-dir', skillsDir], tmpDir);
    assert.ok(result.success, `Command should succeed: ${result.error || result.output}`);

    const manifest = JSON.parse(result.output);
    assert.strictEqual(manifest.length, 1);
    assert.strictEqual(manifest[0].name, 'real-skill');
  });
});
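The tests above assume `skill-manifest` parses YAML frontmatter out of each SKILL.md and pulls `TRIGGER when:` lines from the body into a `triggers` array. A minimal sketch of that parsing, under stated assumptions — `parseSkillMd` is an illustrative name, not the actual gsd-tools.cjs implementation:

```javascript
// Illustrative sketch of the SKILL.md parsing the tests assume.
// parseSkillMd is a hypothetical helper, not gsd-tools.cjs code.
function parseSkillMd(content) {
  const result = { name: null, description: null, triggers: [] };

  // Frontmatter: the block between the opening and closing '---' lines.
  const fm = content.match(/^---\n([\s\S]*?)\n---/);
  if (fm) {
    for (const line of fm[1].split('\n')) {
      const kv = line.match(/^(\w+):\s*(.*)$/);
      if (kv && kv[1] === 'name') result.name = kv[2];
      if (kv && kv[1] === 'description') result.description = kv[2];
    }
  }

  // Trigger lines live in the body, one per line; 'DO NOT TRIGGER when:'
  // lines do not match because the line must start with 'TRIGGER'.
  const body = fm ? content.slice(fm[0].length) : content;
  for (const line of body.split('\n')) {
    if (/^TRIGGER when:/i.test(line)) {
      result.triggers.push(line.replace(/^TRIGGER when:\s*/i, ''));
    }
  }
  return result;
}
```

Feeding it the `rich-skill` fixture from the frontmatter test would yield `name: 'rich-skill'` and a single trigger mentioning databases, which is exactly what those assertions check.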
228 tests/update-custom-backup.test.cjs Normal file
@@ -0,0 +1,228 @@
/**
 * GSD Tools Tests — update workflow custom file backup detection (#1997)
 *
 * The update workflow must detect user-added files inside GSD-managed
 * directories (get-shit-done/, agents/, commands/gsd/, hooks/) before the
 * installer wipes those directories.
 *
 * This tests the `detect-custom-files` subcommand of gsd-tools.cjs, which is
 * the correct fix for the bash path-stripping failure described in #1997.
 *
 * The bash pattern `${filepath#$RUNTIME_DIR/}` is unreliable because
 * $RUNTIME_DIR may not be set and the stripped relative path may not match
 * the manifest key format. Moving the logic into gsd-tools.cjs eliminates the
 * shell variable expansion failure entirely.
 *
 * Closes: #1997
 */

const { describe, test, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('fs');
const path = require('path');
const crypto = require('crypto');
const { runGsdTools, createTempDir, cleanup } = require('./helpers.cjs');

function sha256(content) {
  return crypto.createHash('sha256').update(content).digest('hex');
}

/**
 * Write a fake gsd-file-manifest.json into configDir with the given file entries.
 */
function writeManifest(configDir, files) {
  const manifest = {
    version: '1.32.0',
    timestamp: new Date().toISOString(),
    files: {}
  };
  for (const [relPath, content] of Object.entries(files)) {
    const fullPath = path.join(configDir, relPath);
    fs.mkdirSync(path.dirname(fullPath), { recursive: true });
    fs.writeFileSync(fullPath, content);
    manifest.files[relPath] = sha256(content);
  }
  fs.writeFileSync(
    path.join(configDir, 'gsd-file-manifest.json'),
    JSON.stringify(manifest, null, 2)
  );
}

describe('detect-custom-files — update workflow backup detection (#1997)', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempDir('gsd-custom-detect-');
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('detects a custom file added inside get-shit-done/workflows/', () => {
    writeManifest(tmpDir, {
      'get-shit-done/workflows/execute-phase.md': '# Execute Phase\n',
      'get-shit-done/workflows/plan-phase.md': '# Plan Phase\n',
    });

    // Add a custom file NOT in the manifest
    const customFile = path.join(tmpDir, 'get-shit-done/workflows/my-custom-workflow.md');
    fs.writeFileSync(customFile, '# My Custom Workflow\n');

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    assert.ok(Array.isArray(json.custom_files), 'should return custom_files array');
    assert.ok(json.custom_files.length > 0, 'should detect at least one custom file');
    assert.ok(
      json.custom_files.includes('get-shit-done/workflows/my-custom-workflow.md'),
      `custom file should be listed; got: ${JSON.stringify(json.custom_files)}`
    );
  });

  test('detects custom files added inside agents/', () => {
    writeManifest(tmpDir, {
      'agents/gsd-executor.md': '# GSD Executor\n',
    });

    // Add a user's custom agent (not prefixed with gsd-)
    const customAgent = path.join(tmpDir, 'agents/my-custom-agent.md');
    fs.mkdirSync(path.dirname(customAgent), { recursive: true });
    fs.writeFileSync(customAgent, '# My Custom Agent\n');

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    assert.ok(json.custom_files.includes('agents/my-custom-agent.md'),
      `custom agent should be detected; got: ${JSON.stringify(json.custom_files)}`);
  });

  test('reports zero custom files when all files are in manifest', () => {
    writeManifest(tmpDir, {
      'get-shit-done/workflows/execute-phase.md': '# Execute Phase\n',
      'get-shit-done/references/gates.md': '# Gates\n',
      'agents/gsd-executor.md': '# Executor\n',
    });
    // No extra files added

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    assert.ok(Array.isArray(json.custom_files), 'should return custom_files array');
    assert.strictEqual(json.custom_files.length, 0, 'no custom files should be detected');
    assert.strictEqual(json.custom_count, 0, 'custom_count should be 0');
  });

  test('returns custom_count equal to custom_files length', () => {
    writeManifest(tmpDir, {
      'get-shit-done/workflows/execute-phase.md': '# Execute Phase\n',
    });

    // Add two custom files
    fs.writeFileSync(
      path.join(tmpDir, 'get-shit-done/workflows/custom-a.md'),
      '# Custom A\n'
    );
    fs.writeFileSync(
      path.join(tmpDir, 'get-shit-done/workflows/custom-b.md'),
      '# Custom B\n'
    );

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    assert.strictEqual(json.custom_count, json.custom_files.length,
      'custom_count should equal custom_files.length');
    assert.strictEqual(json.custom_count, 2, 'should detect exactly 2 custom files');
  });

  test('does not flag manifest files as custom even if content was modified', () => {
    writeManifest(tmpDir, {
      'get-shit-done/workflows/execute-phase.md': '# Execute Phase\nOriginal\n',
    });

    // Modify the content of an existing manifest file
    fs.writeFileSync(
      path.join(tmpDir, 'get-shit-done/workflows/execute-phase.md'),
      '# Execute Phase\nModified by user\n'
    );

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    // Modified manifest files are handled by saveLocalPatches (in install.js).
    // detect-custom-files only finds files NOT in the manifest at all.
    assert.ok(
      !json.custom_files.includes('get-shit-done/workflows/execute-phase.md'),
      'modified manifest files should NOT be listed as custom (that is saveLocalPatches territory)'
    );
  });

  test('handles missing manifest gracefully', () => {
    // No manifest. Add a file in a GSD-managed dir.
    const workflowDir = path.join(tmpDir, 'get-shit-done/workflows');
    fs.mkdirSync(workflowDir, { recursive: true });
    fs.writeFileSync(path.join(workflowDir, 'my-workflow.md'), '# My Workflow\n');

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    // Without a manifest, we cannot determine what is custom vs GSD-owned.
    // The command should return an empty list (no manifest = skip detection,
    // which is safe since saveLocalPatches also does nothing without a manifest).
    assert.ok(Array.isArray(json.custom_files), 'should return custom_files array');
    assert.ok(typeof json.custom_count === 'number', 'should return numeric custom_count');
  });

  test('detects custom files inside get-shit-done/references/', () => {
    writeManifest(tmpDir, {
      'get-shit-done/references/gates.md': '# Gates\n',
    });

    const customRef = path.join(tmpDir, 'get-shit-done/references/my-domain-probes.md');
    fs.writeFileSync(customRef, '# My Domain Probes\n');

    const result = runGsdTools(
      ['detect-custom-files', '--config-dir', tmpDir],
      tmpDir
    );

    assert.ok(result.success, `Command failed: ${result.error}`);

    const json = JSON.parse(result.output);
    assert.ok(
      json.custom_files.includes('get-shit-done/references/my-domain-probes.md'),
      `should detect custom reference; got: ${JSON.stringify(json.custom_files)}`
    );
  });
});