Mirror of https://github.com/glittercowboy/get-shit-done (synced 2026-04-25 17:25:23 +02:00)

Compare commits: feat/2071-...fix/2086-g (6 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 6c2795598a | |
| | 7a674c81b7 | |
| | 5c0e801322 | |
| | 96eef85c40 | |
| | 2b4b48401c | |
| | f8cf54bd01 | |
CHANGELOG.md (26)
@@ -6,9 +6,35 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

## [Unreleased]

## [1.35.0] - 2026-04-10

### Added

- **Cline runtime support** — First-class Cline runtime via rules-based integration. Installs to `~/.cline/` or `./.cline/` as `.clinerules`. No custom slash commands — uses rules. `--cline` flag. (#1605 follow-up)
- **CodeBuddy runtime support** — Skills-based install to `~/.codebuddy/skills/gsd-*/SKILL.md`. `--codebuddy` flag.
- **Qwen Code runtime support** — Skills-based install to `~/.qwen/skills/gsd-*/SKILL.md`, the same open standard as Claude Code 2.1.88+. `QWEN_CONFIG_DIR` env var for custom paths. `--qwen` flag.
- **`/gsd-from-gsd2` command** (`gsd:from-gsd2`) — Reverse migration from GSD-2 format (`.gsd/` with Milestone→Slice→Task hierarchy) back to v1 `.planning/` format. Flags: `--dry-run` (preview only), `--force` (overwrite existing `.planning/`), `--path <dir>` (specify GSD-2 root). Produces `PROJECT.md`, `REQUIREMENTS.md`, `ROADMAP.md`, `STATE.md`, and sequential phase dirs. Flattens the Milestone→Slice hierarchy to sequential phase numbers (M001/S01→phase 01, M001/S02→phase 02, M002/S01→phase 03, etc.).
- **`/gsd-ai-integration-phase` command** (`gsd:ai-integration-phase`) — AI framework selection wizard for integrating AI/LLM capabilities into a project phase. Interactive decision matrix with domain-specific failure modes and eval criteria. Produces `AI-SPEC.md` with framework recommendation, implementation guidance, and evaluation strategy. Runs parallel specialist agents: domain-researcher, framework-selector, ai-researcher, and eval-planner.
- **`/gsd-eval-review` command** (`gsd:eval-review`) — Retroactive audit of an implemented AI phase's evaluation coverage. Checks the implementation against the `AI-SPEC.md` evaluation plan, scores each eval dimension as COVERED/PARTIAL/MISSING, and produces `EVAL-REVIEW.md` with findings, gaps, and remediation guidance.
- **Review model configuration** — Per-CLI model selection for `/gsd-review` via `review.models.<cli>` config keys. Falls back to CLI defaults when not set. (#1849)
- **Statusline now surfaces GSD milestone/phase/status** — When no `in_progress` todo is active, `gsd-statusline.js` reads `.planning/STATE.md` (walking up from the workspace dir) and fills the middle slot with `<milestone> · <status> · <phase> (N/total)`. Degrades gracefully when fields are missing; behavior is identical to before when there is no STATE.md or an active todo wins the slot. Uses the YAML frontmatter added for #628.
- **Qwen Code and Cursor CLI peer reviewers** — Added as reviewers in `/gsd-review` with `--qwen` and `--cursor` flags. (#1966)

### Changed

- **Worktree safety: `git clean` prohibition** — `gsd-executor` now prohibits `git clean` in worktree context to prevent deletion of prior-wave output. (#2075)
- **Executor deletion verification** — Pre-merge deletion checks catch missing artifacts before the executor commits. (#2070)
- **Hard reset in worktree branch check** — The `--hard` flag in `worktree_branch_check` now correctly resets the file tree, not just HEAD. (#2073)

### Fixed

- **Context7 MCP CLI fallback** — Handles the `tools: []` response that previously broke Context7 availability detection. (#1885)
- **`Agent` tool in gsd-autonomous** — Added `Agent` to `allowed-tools` to unblock subagent spawning. (#2043)
- **`intel.enabled` in config-set whitelist** — The config key is now accepted by `config-set` without a validation error. (#2021)
- **`writeSettings` null guard** — Guards against a null `settingsPath` for the Cline runtime to prevent a crash on install. (#2046)
- **Shell hook absolute paths** — `.sh` hooks now receive absolute quoted paths in `buildHookCommand`, fixing path resolution in non-standard working directories. (#2045)
- **`processAttribution` runtime-aware** — Was hardcoded to `'claude'`; now reads the actual runtime from the environment.
- **`AskUserQuestion` plain-text fallback** — Non-Claude runtimes now receive plain-text numbered lists instead of broken TUI menus.
- **iOS app scaffold uses XcodeGen** — Prevents SPM execution errors in generated iOS scaffolds. (#2023)
- **`acceptance_criteria` hard gate** — Plans missing acceptance criteria are now rejected by the executor before execution begins. (#1958)
- **`normalizePhaseName` preserves letter-suffix case** — Phase names with letter suffixes (e.g., `1a`, `2B`) now keep their original case. (#1963)

## [1.34.2] - 2026-04-06
README.md (16)
@@ -4,7 +4,7 @@

**English** · [Português](README.pt-BR.md) · [简体中文](README.zh-CN.md) · [日本語](README.ja-JP.md) · [한국어](README.ko-KR.md)

**A light-weight and powerful meta-prompting, context engineering and spec-driven development system for Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, CodeBuddy, and Cline.**

**A light-weight and powerful meta-prompting, context engineering and spec-driven development system for Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, Cline, and CodeBuddy.**

**Solves context rot — the quality degradation that happens as Claude fills its context window.**

@@ -106,17 +106,17 @@ npx get-shit-done-cc@latest
```

The installer prompts you to choose:

1. **Runtime** — Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, CodeBuddy, Cline, or all (interactive multi-select — pick multiple runtimes in a single install session)

1. **Runtime** — Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Cursor, Windsurf, Antigravity, Augment, Trae, Qwen Code, CodeBuddy, Cline, or all (interactive multi-select — pick multiple runtimes in a single install session)

2. **Location** — Global (all projects) or local (current project only)

Verify with:

- Claude Code / Gemini / Copilot / Antigravity: `/gsd-help`

- Claude Code / Gemini / Copilot / Antigravity / Qwen Code: `/gsd-help`

- OpenCode / Kilo / Augment / Trae / CodeBuddy: `/gsd-help`
- Codex: `$gsd-help`
- Cline: GSD installs via `.clinerules` — verify by checking that `.clinerules` exists
> [!NOTE]
> Claude Code 2.1.88+ and Codex install as skills (`skills/gsd-*/SKILL.md`). Older Claude Code versions use `commands/gsd/`. Cline uses `.clinerules` for configuration. The installer handles all formats automatically.

> [!NOTE]
> Claude Code 2.1.88+, Qwen Code, and Codex install as skills (`skills/gsd-*/SKILL.md`). Older Claude Code versions use `commands/gsd/`. Cline uses `.clinerules` for configuration. The installer handles all formats automatically.

> [!TIP]
> For source-based installs or environments where npm is unavailable, see **[docs/manual-update.md](docs/manual-update.md)**.

@@ -175,6 +175,10 @@ npx get-shit-done-cc --augment --local # Install to ./.augment/
npx get-shit-done-cc --trae --global # Install to ~/.trae/
npx get-shit-done-cc --trae --local # Install to ./.trae/

# Qwen Code
npx get-shit-done-cc --qwen --global # Install to ~/.qwen/
npx get-shit-done-cc --qwen --local # Install to ./.qwen/

# CodeBuddy
npx get-shit-done-cc --codebuddy --global # Install to ~/.codebuddy/
npx get-shit-done-cc --codebuddy --local # Install to ./.codebuddy/

@@ -188,7 +192,7 @@ npx get-shit-done-cc --all --global # Install to all directories
```

Use `--global` (`-g`) or `--local` (`-l`) to skip the location prompt.

Use `--claude`, `--opencode`, `--gemini`, `--kilo`, `--codex`, `--copilot`, `--cursor`, `--windsurf`, `--antigravity`, `--augment`, `--trae`, `--codebuddy`, `--cline`, or `--all` to skip the runtime prompt.

Use `--claude`, `--opencode`, `--gemini`, `--kilo`, `--codex`, `--copilot`, `--cursor`, `--windsurf`, `--antigravity`, `--augment`, `--trae`, `--qwen`, `--codebuddy`, `--cline`, or `--all` to skip the runtime prompt.

Use `--sdk` to also install the GSD SDK CLI (`gsd-sdk`) for headless autonomous execution.

</details>

@@ -850,6 +854,7 @@ npx get-shit-done-cc --windsurf --global --uninstall
npx get-shit-done-cc --antigravity --global --uninstall
npx get-shit-done-cc --augment --global --uninstall
npx get-shit-done-cc --trae --global --uninstall
npx get-shit-done-cc --qwen --global --uninstall
npx get-shit-done-cc --codebuddy --global --uninstall
npx get-shit-done-cc --cline --global --uninstall

@@ -865,6 +870,7 @@ npx get-shit-done-cc --windsurf --local --uninstall
npx get-shit-done-cc --antigravity --local --uninstall
npx get-shit-done-cc --augment --local --uninstall
npx get-shit-done-cc --trae --local --uninstall
npx get-shit-done-cc --qwen --local --uninstall
npx get-shit-done-cc --codebuddy --local --uninstall
npx get-shit-done-cc --cline --local --uninstall
```
@@ -17,6 +17,29 @@ Spawned by `discuss-phase` via `Task()`. You do NOT present output directly to t
- Return structured markdown output for the main agent to synthesize
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>
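The ordered check above reduces to a simple availability test; a hypothetical JavaScript sketch (the function name and the `availableTools` list are illustrative, not part of GSD — only the `mcp__context7__` tool prefix and the `ctx7` CLI commands come from the text above):

```javascript
// Hypothetical sketch of the lookup order: prefer Context7 MCP tools when
// present in the agent's tool list, otherwise fall back to the Bash CLI.
function pickDocsLookup(availableTools) {
  const hasContext7 = availableTools.some((t) => t.startsWith('mcp__context7__'));
  if (hasContext7) {
    // Preferred path: resolve-library-id, then get-library-docs
    return 'mcp';
  }
  // Fallback path: `npx --yes ctx7@latest library ...`, then `... docs ...`
  return 'cli';
}

console.log(pickDocsLookup(['Bash', 'Read']));                      // cli
console.log(pickDocsLookup(['mcp__context7__resolve-library-id'])); // mcp
```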
<input>
Agent receives via prompt:
@@ -16,6 +16,29 @@ You are a GSD AI researcher. Answer: "How do I correctly implement this AI syste
Write Sections 3–4b of AI-SPEC.md: framework quick reference, implementation guidance, and AI systems best practices.
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<required_reading>
Read `~/.claude/get-shit-done/references/ai-frameworks.md` for framework profiles and known pitfalls before fetching docs.
</required_reading>
@@ -16,6 +16,29 @@ You are a GSD domain researcher. Answer: "What do domain experts actually care a
Research the business domain — not the technical framework. Write Section 1b of AI-SPEC.md.
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<required_reading>
Read `~/.claude/get-shit-done/references/ai-evals.md` — specifically the rubric design and domain expert sections.
</required_reading>
@@ -22,12 +22,32 @@ Your job: Execute the plan completely, commit each task, create SUMMARY.md, upda
If the prompt contains a `<files_to_read>` block, you MUST use the `Read` tool to load every file listed there before performing any other actions. This is your primary context.
</role>

<mcp_tool_usage>
Use all tools available in your environment, including MCP servers. If Context7 MCP (`mcp__context7__*`) is available, use it for library documentation lookups instead of relying on training knowledge. Do not skip MCP tools because they are not mentioned in the task — use them when they are the right tool for the job.
</mcp_tool_usage>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Example: `npx --yes ctx7@latest library react "useEffect hook"`

   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```
   Example: `npx --yes ctx7@latest docs /facebook/react "useEffect hook"`

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output. Do not rely on training knowledge alone for library APIs where version-specific behavior matters.
</documentation_lookup>

<project_context>
Before executing, discover project context:

@@ -380,6 +400,31 @@ Intentional deletions (e.g., removing a deprecated file as part of the task) are
**7. Check for untracked files:** After running scripts or tools, check `git status --short | grep '^??'`. For any new untracked files: commit if intentional, add to `.gitignore` if generated/runtime output. Never leave generated files untracked.
</task_commit_protocol>

<destructive_git_prohibition>
**NEVER run `git clean` inside a worktree. This is an absolute rule with no exceptions.**

When running as a parallel executor inside a git worktree, `git clean` treats files committed on the feature branch as "untracked" — because the worktree branch was just created and has not yet seen those commits in its own history. Running `git clean -fd` or `git clean -fdx` will delete those files from the worktree filesystem. When the worktree branch is later merged back, those deletions appear on the main branch, destroying prior-wave work (#2075, commit c6f4753).

**Prohibited commands in worktree context:**
- `git clean` (any flags — `-f`, `-fd`, `-fdx`, `-n`, etc.)
- `git rm` on files not explicitly created by the current task
- `git checkout -- .` or `git restore .` (blanket working-tree resets that discard files)
- `git reset --hard` except inside the `<worktree_branch_check>` step at agent startup

If you need to discard changes to a specific file you modified during this task, use:
```bash
git checkout -- path/to/specific/file
```
Never use blanket reset or clean operations that affect the entire working tree.

To inspect what is untracked vs. genuinely new, use `git status --short` and evaluate each file individually. If a file appears untracked but is not part of your task, leave it alone.
</destructive_git_prohibition>
<summary_creation>
After all tasks complete, create `{phase}-{plan}-SUMMARY.md` at `.planning/phases/XX-name/`.
@@ -34,6 +34,29 @@ If the prompt contains a `<files_to_read>` block, you MUST use the `Read` tool t
Claims tagged `[ASSUMED]` signal to the planner and discuss-phase that the information needs user confirmation before becoming a locked decision. Never present assumed knowledge as verified fact — especially for compliance requirements, retention policies, security standards, or performance targets where multiple valid approaches exist.
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<project_context>
Before researching, discover project context:
@@ -35,12 +35,15 @@ If the prompt contains a `<files_to_read>` block, you MUST use the `Read` tool t
- Return structured results to orchestrator
</role>

<mcp_tool_usage>
Use all tools available in your environment, including MCP servers. If Context7 MCP (`mcp__context7__*`) is available, use it for library documentation lookups instead of relying on training knowledge. Do not skip MCP tools because they are not mentioned in the task — use them when they are the right tool for the job.
</mcp_tool_usage>

<documentation_lookup>
For library docs: use Context7 MCP (`mcp__context7__*`) if available. If not (upstream bug #13898 strips MCP from `tools:`-restricted agents), use the Bash CLI fallback:
```bash
npx --yes ctx7@latest library <name> "<query>"    # resolve library ID
npx --yes ctx7@latest docs <libraryId> "<query>"  # fetch docs
```
Do not skip — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<project_context>
Before planning, discover project context:
@@ -32,6 +32,29 @@ Your files feed the roadmap:
**Be comprehensive but opinionated.** "Use X because Y" not "Options are X, Y, Z."
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<philosophy>

## Training Data = Hypothesis
@@ -27,6 +27,29 @@ If the prompt contains a `<files_to_read>` block, you MUST use the `Read` tool t
- Return structured result to orchestrator
</role>

<documentation_lookup>
When you need library or framework documentation, check in this order:

1. If Context7 MCP tools (`mcp__context7__*`) are available in your environment, use them:
   - Resolve library ID: `mcp__context7__resolve-library-id` with `libraryName`
   - Fetch docs: `mcp__context7__get-library-docs` with `context7CompatibleLibraryId` and `topic`

2. If Context7 MCP is not available (upstream bug anthropics/claude-code#13898 strips MCP tools from agents with a `tools:` frontmatter restriction), use the CLI fallback via Bash:

   Step 1 — Resolve library ID:
   ```bash
   npx --yes ctx7@latest library <name> "<query>"
   ```
   Step 2 — Fetch documentation:
   ```bash
   npx --yes ctx7@latest docs <libraryId> "<query>"
   ```

Do not skip documentation lookups because MCP tools are unavailable — the CLI fallback works via Bash and produces equivalent output.
</documentation_lookup>

<project_context>
Before researching, discover project context:
@@ -70,6 +70,7 @@ const hasCursor = args.includes('--cursor');
const hasWindsurf = args.includes('--windsurf');
const hasAugment = args.includes('--augment');
const hasTrae = args.includes('--trae');
const hasQwen = args.includes('--qwen');
const hasCodebuddy = args.includes('--codebuddy');
const hasCline = args.includes('--cline');
const hasBoth = args.includes('--both'); // Legacy flag, keeps working

@@ -79,7 +80,7 @@ const hasUninstall = args.includes('--uninstall') || args.includes('-u');
// Runtime selection - can be set by flags or interactive prompt
let selectedRuntimes = [];
if (hasAll) {
  selectedRuntimes = ['claude', 'kilo', 'opencode', 'gemini', 'codex', 'copilot', 'antigravity', 'cursor', 'windsurf', 'augment', 'trae', 'codebuddy', 'cline'];
  selectedRuntimes = ['claude', 'kilo', 'opencode', 'gemini', 'codex', 'copilot', 'antigravity', 'cursor', 'windsurf', 'augment', 'trae', 'qwen', 'codebuddy', 'cline'];
} else if (hasBoth) {
  selectedRuntimes = ['claude', 'opencode'];
} else {

@@ -94,6 +95,7 @@ if (hasAll) {
  if (hasWindsurf) selectedRuntimes.push('windsurf');
  if (hasAugment) selectedRuntimes.push('augment');
  if (hasTrae) selectedRuntimes.push('trae');
  if (hasQwen) selectedRuntimes.push('qwen');
  if (hasCodebuddy) selectedRuntimes.push('codebuddy');
  if (hasCline) selectedRuntimes.push('cline');
}

@@ -144,6 +146,7 @@ function getDirName(runtime) {
  if (runtime === 'windsurf') return '.windsurf';
  if (runtime === 'augment') return '.augment';
  if (runtime === 'trae') return '.trae';
  if (runtime === 'qwen') return '.qwen';
  if (runtime === 'codebuddy') return '.codebuddy';
  if (runtime === 'cline') return '.cline';
  return '.claude';

@@ -178,6 +181,7 @@ function getConfigDirFromHome(runtime, isGlobal) {
  if (runtime === 'windsurf') return "'.windsurf'";
  if (runtime === 'augment') return "'.augment'";
  if (runtime === 'trae') return "'.trae'";
  if (runtime === 'qwen') return "'.qwen'";
  if (runtime === 'codebuddy') return "'.codebuddy'";
  if (runtime === 'cline') return "'.cline'";
  return "'.claude'";

@@ -342,6 +346,16 @@ function getGlobalDir(runtime, explicitDir = null) {
    return path.join(os.homedir(), '.trae');
  }

  if (runtime === 'qwen') {
    if (explicitDir) {
      return expandTilde(explicitDir);
    }
    if (process.env.QWEN_CONFIG_DIR) {
      return expandTilde(process.env.QWEN_CONFIG_DIR);
    }
    return path.join(os.homedir(), '.qwen');
  }
  if (runtime === 'codebuddy') {
    // CodeBuddy: --config-dir > CODEBUDDY_CONFIG_DIR > ~/.codebuddy
    if (explicitDir) {

@@ -384,7 +398,7 @@ const banner = '\n' +
  '\n' +
  ' Get Shit Done ' + dim + 'v' + pkg.version + reset + '\n' +
  ' A meta-prompting, context engineering and spec-driven\n' +
  ' development system for Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Antigravity, Cursor, Windsurf, Augment, Trae, Cline and CodeBuddy by TÂCHES.\n';
  ' development system for Claude Code, OpenCode, Gemini, Kilo, Codex, Copilot, Antigravity, Cursor, Windsurf, Augment, Trae, Qwen Code, Cline and CodeBuddy by TÂCHES.\n';

// Parse --config-dir argument
function parseConfigDirArg() {

@@ -422,7 +436,7 @@ if (hasUninstall) {

// Show help if requested
if (hasHelp) {
  console.log(` ${yellow}Usage:${reset} npx get-shit-done-cc [options]\n\n ${yellow}Options:${reset}\n ${cyan}-g, --global${reset} Install globally (to config directory)\n ${cyan}-l, --local${reset} Install locally (to current directory)\n ${cyan}--claude${reset} Install for Claude Code only\n ${cyan}--opencode${reset} Install for OpenCode only\n ${cyan}--gemini${reset} Install for Gemini only\n ${cyan}--kilo${reset} Install for Kilo only\n ${cyan}--codex${reset} Install for Codex only\n ${cyan}--copilot${reset} Install for Copilot only\n ${cyan}--antigravity${reset} Install for Antigravity only\n ${cyan}--cursor${reset} Install for Cursor only\n ${cyan}--windsurf${reset} Install for Windsurf only\n ${cyan}--augment${reset} Install for Augment only\n ${cyan}--trae${reset} Install for Trae only\n ${cyan}--cline${reset} Install for Cline only\n ${cyan}--codebuddy${reset} Install for CodeBuddy only\n ${cyan}--all${reset} Install for all runtimes\n ${cyan}-u, --uninstall${reset} Uninstall GSD (remove all GSD files)\n ${cyan}-c, --config-dir <path>${reset} Specify custom config directory\n ${cyan}-h, --help${reset} Show this help message\n ${cyan}--force-statusline${reset} Replace existing statusline config\n\n ${yellow}Examples:${reset}\n ${dim}# Interactive install (prompts for runtime and location)${reset}\n npx get-shit-done-cc\n\n ${dim}# Install for Claude Code globally${reset}\n npx get-shit-done-cc --claude --global\n\n ${dim}# Install for Gemini globally${reset}\n npx get-shit-done-cc --gemini --global\n\n ${dim}# Install for Kilo globally${reset}\n npx get-shit-done-cc --kilo --global\n\n ${dim}# Install for Codex globally${reset}\n npx get-shit-done-cc --codex --global\n\n ${dim}# Install for Copilot globally${reset}\n npx get-shit-done-cc --copilot --global\n\n ${dim}# Install for Copilot locally${reset}\n npx get-shit-done-cc --copilot --local\n\n ${dim}# Install for Antigravity globally${reset}\n npx get-shit-done-cc --antigravity --global\n\n ${dim}# Install for Antigravity locally${reset}\n npx get-shit-done-cc --antigravity --local\n\n ${dim}# Install for Cursor globally${reset}\n npx get-shit-done-cc --cursor --global\n\n ${dim}# Install for Cursor locally${reset}\n npx get-shit-done-cc --cursor --local\n\n ${dim}# Install for Windsurf globally${reset}\n npx get-shit-done-cc --windsurf --global\n\n ${dim}# Install for Windsurf locally${reset}\n npx get-shit-done-cc --windsurf --local\n\n ${dim}# Install for Augment globally${reset}\n npx get-shit-done-cc --augment --global\n\n ${dim}# Install for Augment locally${reset}\n npx get-shit-done-cc --augment --local\n\n ${dim}# Install for Trae globally${reset}\n npx get-shit-done-cc --trae --global\n\n ${dim}# Install for Trae locally${reset}\n npx get-shit-done-cc --trae --local\n\n ${dim}# Install for Cline locally${reset}\n npx get-shit-done-cc --cline --local\n\n ${dim}# Install for CodeBuddy globally${reset}\n npx get-shit-done-cc --codebuddy --global\n\n ${dim}# Install for CodeBuddy locally${reset}\n npx get-shit-done-cc --codebuddy --local\n\n ${dim}# Install for all runtimes globally${reset}\n npx get-shit-done-cc --all --global\n\n ${dim}# Install to custom config directory${reset}\n npx get-shit-done-cc --kilo --global --config-dir ~/.kilo-work\n\n ${dim}# Install to current project only${reset}\n npx get-shit-done-cc --claude --local\n\n ${dim}# Uninstall GSD from Cursor globally${reset}\n npx get-shit-done-cc --cursor --global --uninstall\n\n ${yellow}Notes:${reset}\n The --config-dir option is useful when you have multiple configurations.\n It takes priority over CLAUDE_CONFIG_DIR / OPENCODE_CONFIG_DIR / GEMINI_CONFIG_DIR / KILO_CONFIG_DIR / CODEX_HOME / COPILOT_CONFIG_DIR / ANTIGRAVITY_CONFIG_DIR / CURSOR_CONFIG_DIR / WINDSURF_CONFIG_DIR / AUGMENT_CONFIG_DIR / TRAE_CONFIG_DIR / CLINE_CONFIG_DIR / CODEBUDDY_CONFIG_DIR environment variables.\n`);
  console.log(` ${yellow}Usage:${reset} npx get-shit-done-cc [options]\n\n ${yellow}Options:${reset}\n ${cyan}-g, --global${reset} Install globally (to config directory)\n ${cyan}-l, --local${reset} Install locally (to current directory)\n ${cyan}--claude${reset} Install for Claude Code only\n ${cyan}--opencode${reset} Install for OpenCode only\n ${cyan}--gemini${reset} Install for Gemini only\n ${cyan}--kilo${reset} Install for Kilo only\n ${cyan}--codex${reset} Install for Codex only\n ${cyan}--copilot${reset} Install for Copilot only\n ${cyan}--antigravity${reset} Install for Antigravity only\n ${cyan}--cursor${reset} Install for Cursor only\n ${cyan}--windsurf${reset} Install for Windsurf only\n ${cyan}--augment${reset} Install for Augment only\n ${cyan}--trae${reset} Install for Trae only\n ${cyan}--qwen${reset} Install for Qwen Code only\n ${cyan}--cline${reset} Install for Cline only\n ${cyan}--codebuddy${reset} Install for CodeBuddy only\n ${cyan}--all${reset} Install for all runtimes\n ${cyan}-u, --uninstall${reset} Uninstall GSD (remove all GSD files)\n ${cyan}-c, --config-dir <path>${reset} Specify custom config directory\n ${cyan}-h, --help${reset} Show this help message\n ${cyan}--force-statusline${reset} Replace existing statusline config\n\n ${yellow}Examples:${reset}\n ${dim}# Interactive install (prompts for runtime and location)${reset}\n npx get-shit-done-cc\n\n ${dim}# Install for Claude Code globally${reset}\n npx get-shit-done-cc --claude --global\n\n ${dim}# Install for Gemini globally${reset}\n npx get-shit-done-cc --gemini --global\n\n ${dim}# Install for Kilo globally${reset}\n npx get-shit-done-cc --kilo --global\n\n ${dim}# Install for Codex globally${reset}\n npx get-shit-done-cc --codex --global\n\n ${dim}# Install for Copilot globally${reset}\n npx get-shit-done-cc --copilot --global\n\n ${dim}# Install for Copilot locally${reset}\n npx get-shit-done-cc --copilot --local\n\n ${dim}# Install for Antigravity globally${reset}\n npx get-shit-done-cc --antigravity --global\n\n ${dim}# Install for Antigravity locally${reset}\n npx get-shit-done-cc --antigravity --local\n\n ${dim}# Install for Cursor globally${reset}\n npx get-shit-done-cc --cursor --global\n\n ${dim}# Install for Cursor locally${reset}\n npx get-shit-done-cc --cursor --local\n\n ${dim}# Install for Windsurf globally${reset}\n npx get-shit-done-cc --windsurf --global\n\n ${dim}# Install for Windsurf locally${reset}\n npx get-shit-done-cc --windsurf --local\n\n ${dim}# Install for Augment globally${reset}\n npx get-shit-done-cc --augment --global\n\n ${dim}# Install for Augment locally${reset}\n npx get-shit-done-cc --augment --local\n\n ${dim}# Install for Trae globally${reset}\n npx get-shit-done-cc --trae --global\n\n ${dim}# Install for Trae locally${reset}\n npx get-shit-done-cc --trae --local\n\n ${dim}# Install for Cline locally${reset}\n npx get-shit-done-cc --cline --local\n\n ${dim}# Install for CodeBuddy globally${reset}\n npx get-shit-done-cc --codebuddy --global\n\n ${dim}# Install for CodeBuddy locally${reset}\n npx get-shit-done-cc --codebuddy --local\n\n ${dim}# Install for all runtimes globally${reset}\n npx get-shit-done-cc --all --global\n\n ${dim}# Install to custom config directory${reset}\n npx get-shit-done-cc --kilo --global --config-dir ~/.kilo-work\n\n ${dim}# Install to current project only${reset}\n npx get-shit-done-cc --claude --local\n\n ${dim}# Uninstall GSD from Cursor globally${reset}\n npx get-shit-done-cc --cursor --global --uninstall\n\n ${yellow}Notes:${reset}\n The --config-dir option is useful when you have multiple configurations.\n It takes priority over CLAUDE_CONFIG_DIR / OPENCODE_CONFIG_DIR / GEMINI_CONFIG_DIR / KILO_CONFIG_DIR / CODEX_HOME / COPILOT_CONFIG_DIR / ANTIGRAVITY_CONFIG_DIR / CURSOR_CONFIG_DIR / WINDSURF_CONFIG_DIR / AUGMENT_CONFIG_DIR / TRAE_CONFIG_DIR / QWEN_CONFIG_DIR / CLINE_CONFIG_DIR / CODEBUDDY_CONFIG_DIR environment variables.\n`);
|
||||
process.exit(0);
|
||||
}
|
||||
|
||||
@@ -3939,7 +3953,10 @@ function copyCommandsAsClaudeSkills(srcDir, skillsDir, prefix, pathPrefix, runti
content = content.replace(/~\/\.claude\//g, pathPrefix);
content = content.replace(/\$HOME\/\.claude\//g, pathPrefix);
content = content.replace(/\.\/\.claude\//g, `./${getDirName(runtime)}/`);
content = processAttribution(content, getCommitAttribution('claude'));
content = content.replace(/~\/\.qwen\//g, pathPrefix);
content = content.replace(/\$HOME\/\.qwen\//g, pathPrefix);
content = content.replace(/\.\/\.qwen\//g, `./${getDirName(runtime)}/`);
content = processAttribution(content, getCommitAttribution(runtime));
content = convertClaudeCommandToClaudeSkill(content, skillName);

fs.writeFileSync(path.join(skillDir, 'SKILL.md'), content);

@@ -4057,6 +4074,7 @@ function copyWithPathReplacement(srcDir, destDir, pathPrefix, runtime, isCommand
const isWindsurf = runtime === 'windsurf';
const isAugment = runtime === 'augment';
const isTrae = runtime === 'trae';
const isQwen = runtime === 'qwen';
const isCline = runtime === 'cline';
const dirName = getDirName(runtime);

@@ -4085,6 +4103,9 @@ function copyWithPathReplacement(srcDir, destDir, pathPrefix, runtime, isCommand
content = content.replace(globalClaudeRegex, pathPrefix);
content = content.replace(globalClaudeHomeRegex, pathPrefix);
content = content.replace(localClaudeRegex, `./${dirName}/`);
content = content.replace(/~\/\.qwen\//g, pathPrefix);
content = content.replace(/\$HOME\/\.qwen\//g, pathPrefix);
content = content.replace(/\.\/\.qwen\//g, `./${dirName}/`);
}
content = processAttribution(content, getCommitAttribution(runtime));

@@ -4349,6 +4370,7 @@ function uninstall(isGlobal, runtime = 'claude') {
const isWindsurf = runtime === 'windsurf';
const isAugment = runtime === 'augment';
const isTrae = runtime === 'trae';
const isQwen = runtime === 'qwen';
const isCodebuddy = runtime === 'codebuddy';
const dirName = getDirName(runtime);

@@ -4372,6 +4394,7 @@ function uninstall(isGlobal, runtime = 'claude') {
if (runtime === 'windsurf') runtimeLabel = 'Windsurf';
if (runtime === 'augment') runtimeLabel = 'Augment';
if (runtime === 'trae') runtimeLabel = 'Trae';
if (runtime === 'qwen') runtimeLabel = 'Qwen Code';
if (runtime === 'codebuddy') runtimeLabel = 'CodeBuddy';

console.log(` Uninstalling GSD from ${cyan}${runtimeLabel}${reset} at ${cyan}${locationLabel}${reset}\n`);

@@ -4502,6 +4525,31 @@ function uninstall(isGlobal, runtime = 'claude') {
      console.log(` ${green}✓${reset} Removed ${skillCount} Antigravity skills`);
    }
  }
} else if (isQwen) {
  const skillsDir = path.join(targetDir, 'skills');
  if (fs.existsSync(skillsDir)) {
    let skillCount = 0;
    const entries = fs.readdirSync(skillsDir, { withFileTypes: true });
    for (const entry of entries) {
      if (entry.isDirectory() && entry.name.startsWith('gsd-')) {
        fs.rmSync(path.join(skillsDir, entry.name), { recursive: true });
        skillCount++;
      }
    }
    if (skillCount > 0) {
      removedCount++;
      console.log(` ${green}✓${reset} Removed ${skillCount} Qwen Code skills`);
    }
  }

  const legacyCommandsDir = path.join(targetDir, 'commands', 'gsd');
  if (fs.existsSync(legacyCommandsDir)) {
    const savedLegacyArtifacts = preserveUserArtifacts(legacyCommandsDir, ['dev-preferences.md']);
    fs.rmSync(legacyCommandsDir, { recursive: true });
    removedCount++;
    console.log(` ${green}✓${reset} Removed legacy commands/gsd/`);
    restoreUserArtifacts(legacyCommandsDir, savedLegacyArtifacts);
  }
} else if (isGemini) {
  // Gemini: still uses commands/gsd/
  const gsdCommandsDir = path.join(targetDir, 'commands', 'gsd');

@@ -5298,6 +5346,7 @@ function install(isGlobal, runtime = 'claude') {
const isWindsurf = runtime === 'windsurf';
const isAugment = runtime === 'augment';
const isTrae = runtime === 'trae';
const isQwen = runtime === 'qwen';
const isCodebuddy = runtime === 'codebuddy';
const isCline = runtime === 'cline';
const dirName = getDirName(runtime);

@@ -5338,6 +5387,7 @@ function install(isGlobal, runtime = 'claude') {
if (isWindsurf) runtimeLabel = 'Windsurf';
if (isAugment) runtimeLabel = 'Augment';
if (isTrae) runtimeLabel = 'Trae';
if (isQwen) runtimeLabel = 'Qwen Code';
if (isCodebuddy) runtimeLabel = 'CodeBuddy';
if (isCline) runtimeLabel = 'Cline';

@@ -5447,6 +5497,29 @@ function install(isGlobal, runtime = 'claude') {
  } else {
    failures.push('skills/gsd-*');
  }
} else if (isQwen) {
  const skillsDir = path.join(targetDir, 'skills');
  const gsdSrc = path.join(src, 'commands', 'gsd');
  copyCommandsAsClaudeSkills(gsdSrc, skillsDir, 'gsd', pathPrefix, runtime, isGlobal);
  if (fs.existsSync(skillsDir)) {
    const count = fs.readdirSync(skillsDir, { withFileTypes: true })
      .filter(e => e.isDirectory() && e.name.startsWith('gsd-')).length;
    if (count > 0) {
      console.log(` ${green}✓${reset} Installed ${count} skills to skills/`);
    } else {
      failures.push('skills/gsd-*');
    }
  } else {
    failures.push('skills/gsd-*');
  }

  const legacyCommandsDir = path.join(targetDir, 'commands', 'gsd');
  if (fs.existsSync(legacyCommandsDir)) {
    const savedLegacyArtifacts = preserveUserArtifacts(legacyCommandsDir, ['dev-preferences.md']);
    fs.rmSync(legacyCommandsDir, { recursive: true });
    console.log(` ${green}✓${reset} Removed legacy commands/gsd/ directory`);
    restoreUserArtifacts(legacyCommandsDir, savedLegacyArtifacts);
  }
} else if (isCodebuddy) {
  const skillsDir = path.join(targetDir, 'skills');
  const gsdSrc = path.join(src, 'commands', 'gsd');

@@ -6289,10 +6362,11 @@ function promptRuntime(callback) {
'9': 'gemini',
'10': 'kilo',
'11': 'opencode',
'12': 'trae',
'13': 'windsurf'
'12': 'qwen',
'13': 'trae',
'14': 'windsurf'
};
const allRuntimes = ['claude', 'antigravity', 'augment', 'cline', 'codebuddy', 'codex', 'copilot', 'cursor', 'gemini', 'kilo', 'opencode', 'trae', 'windsurf'];
const allRuntimes = ['claude', 'antigravity', 'augment', 'cline', 'codebuddy', 'codex', 'copilot', 'cursor', 'gemini', 'kilo', 'opencode', 'qwen', 'trae', 'windsurf'];

console.log(` ${yellow}Which runtime(s) would you like to install for?${reset}\n\n ${cyan}1${reset}) Claude Code ${dim}(~/.claude)${reset}
${cyan}2${reset}) Antigravity ${dim}(~/.gemini/antigravity)${reset}

@@ -6305,9 +6379,10 @@ function promptRuntime(callback) {
${cyan}9${reset}) Gemini ${dim}(~/.gemini)${reset}
${cyan}10${reset}) Kilo ${dim}(~/.config/kilo)${reset}
${cyan}11${reset}) OpenCode ${dim}(~/.config/opencode)${reset}
${cyan}12${reset}) Trae ${dim}(~/.trae)${reset}
${cyan}13${reset}) Windsurf ${dim}(~/.codeium/windsurf)${reset}
${cyan}14${reset}) All
${cyan}12${reset}) Qwen Code ${dim}(~/.qwen)${reset}
${cyan}13${reset}) Trae ${dim}(~/.trae)${reset}
${cyan}14${reset}) Windsurf ${dim}(~/.codeium/windsurf)${reset}
${cyan}15${reset}) All

${dim}Select multiple: 1,2,6 or 1 2 6${reset}
`);

@@ -6318,7 +6393,7 @@ function promptRuntime(callback) {
const input = answer.trim() || '1';

// "All" shortcut
if (input === '14') {
if (input === '15') {
  callback(allRuntimes);
  return;
}

45 commands/gsd/from-gsd2.md Normal file
@@ -0,0 +1,45 @@
---
name: gsd:from-gsd2
description: Import a GSD-2 (.gsd/) project back to GSD v1 (.planning/) format
argument-hint: "[--path <dir>] [--force]"
allowed-tools:
  - Read
  - Write
  - Bash
type: prompt
---

<objective>
Reverse-migrate a GSD-2 project (`.gsd/` directory) back to GSD v1 (`.planning/`) format.

Maps the GSD-2 hierarchy (Milestone → Slice → Task) to the GSD v1 hierarchy (Milestone sections in ROADMAP.md → Phase → Plan), preserving completion state, research files, and summaries.
</objective>

<process>

1. **Locate the .gsd/ directory** — check the current working directory (or `--path` argument):

   ```bash
   node "$HOME/.claude/get-shit-done/bin/gsd-tools.cjs" from-gsd2 --dry-run
   ```

   If no `.gsd/` is found, report the error and stop.

2. **Show the dry-run preview** — present the full file list and migration statistics to the user. Ask for confirmation before writing anything.

3. **Run the migration** after confirmation:

   ```bash
   node "$HOME/.claude/get-shit-done/bin/gsd-tools.cjs" from-gsd2
   ```

   Use `--force` if `.planning/` already exists and the user has confirmed overwrite.

4. **Report the result** — show the `filesWritten` count, `planningDir` path, and the preview summary.

</process>

<notes>
- The migration is non-destructive: `.gsd/` is never modified or removed.
- Pass `--path <dir>` to migrate a project at a different path than the current directory.
- Slices are numbered sequentially across all milestones (M001/S01 → phase 01, M001/S02 → phase 02, M002/S01 → phase 03, etc.).
- Tasks within each slice become plans (T01 → plan 01, T02 → plan 02, etc.).
- Completed slices and tasks carry their done state into ROADMAP.md checkboxes and SUMMARY.md files.
- GSD-2 cost/token ledger, database state, and VS Code extension state cannot be migrated.
</notes>
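The sequential numbering rule in the notes above can be sketched as follows. This is a minimal illustration of the mapping only; `flattenToPhases` is a hypothetical name, not the actual importer code.

```javascript
// Sketch: flatten Milestone/Slice pairs into sequential v1 phase numbers.
// Hypothetical helper for illustration; not part of gsd-tools.
function flattenToPhases(milestones) {
  // milestones: [{ id: 'M001', slices: ['S01', 'S02'] }, ...]
  const phases = [];
  let n = 0;
  for (const m of milestones) {
    for (const s of m.slices) {
      n += 1;
      // Phase numbers are zero-padded to two digits, continuing across milestones.
      phases.push({ phase: String(n).padStart(2, '0'), from: `${m.id}/${s}` });
    }
  }
  return phases;
}

const phases = flattenToPhases([
  { id: 'M001', slices: ['S01', 'S02'] },
  { id: 'M002', slices: ['S01'] },
]);
// [{ phase: '01', from: 'M001/S01' },
//  { phase: '02', from: 'M001/S02' },
//  { phase: '03', from: 'M002/S01' }]
```

The key point is that the counter never resets at a milestone boundary, which is what makes M002/S01 become phase 03 rather than a second phase 01.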
@@ -593,6 +593,31 @@ Ingest an external plan file into the GSD planning system with conflict detectio

---

### `/gsd-from-gsd2`

Reverse migration from GSD-2 format (`.gsd/` with Milestone→Slice→Task hierarchy) back to v1 `.planning/` format.

| Flag | Required | Description |
|------|----------|-------------|
| `--dry-run` | No | Preview what would be migrated without writing anything |
| `--force` | No | Overwrite existing `.planning/` directory |
| `--path <dir>` | No | Specify GSD-2 root directory (defaults to current directory) |

**Flattening:** Milestone→Slice hierarchy is flattened to sequential phase numbers (M001/S01→phase 01, M001/S02→phase 02, M002/S01→phase 03, etc.).

**Produces:** `PROJECT.md`, `REQUIREMENTS.md`, `ROADMAP.md`, `STATE.md`, and sequential phase directories in `.planning/`.

**Safety:** Guards against overwriting an existing `.planning/` directory without `--force`.

```bash
/gsd-from-gsd2                               # Migrate .gsd/ in current directory
/gsd-from-gsd2 --dry-run                     # Preview migration without writing
/gsd-from-gsd2 --force                       # Overwrite existing .planning/
/gsd-from-gsd2 --path /path/to/gsd2-project  # Specify GSD-2 root
```

---

### `/gsd-quick`

Execute ad-hoc task with GSD guarantees.

@@ -900,6 +925,37 @@ Query, inspect, or refresh queryable codebase intelligence files stored in `.pla

---

## AI Integration Commands

### `/gsd-ai-integration-phase`

AI framework selection wizard for integrating AI/LLM capabilities into a project phase. Presents an interactive decision matrix, surfaces domain-specific failure modes and eval criteria, and produces `AI-SPEC.md` with a framework recommendation, implementation guidance, and evaluation strategy.

**Produces:** `{phase}-AI-SPEC.md` in the phase directory

**Spawns:** 4 parallel specialist agents: domain-researcher, framework-selector, ai-researcher, and eval-planner

```bash
/gsd-ai-integration-phase    # Wizard for the current phase
/gsd-ai-integration-phase 3  # Wizard for a specific phase
```

---

### `/gsd-eval-review`

Retroactive audit of an implemented AI phase's evaluation coverage. Checks the implementation against the `AI-SPEC.md` evaluation plan produced by `/gsd-ai-integration-phase`. Scores each eval dimension as COVERED/PARTIAL/MISSING.

**Prerequisites:** Phase has been executed and has an `AI-SPEC.md`
**Produces:** `{phase}-EVAL-REVIEW.md` with findings, gaps, and remediation guidance

```bash
/gsd-eval-review    # Audit current phase
/gsd-eval-review 3  # Audit a specific phase
```

---

## Update Commands

### `/gsd-update`

@@ -360,6 +360,36 @@ Settings for the security enforcement feature (v1.31). All follow the **absent =

---

## Review Settings

Configure per-CLI model selection for `/gsd-review`. When set, overrides the CLI's default model for that reviewer.

| Setting | Type | Default | Description |
|---------|------|---------|-------------|
| `review.models.gemini` | string | (CLI default) | Model used when `--gemini` reviewer is invoked |
| `review.models.claude` | string | (CLI default) | Model used when `--claude` reviewer is invoked |
| `review.models.codex` | string | (CLI default) | Model used when `--codex` reviewer is invoked |
| `review.models.opencode` | string | (CLI default) | Model used when `--opencode` reviewer is invoked |
| `review.models.qwen` | string | (CLI default) | Model used when `--qwen` reviewer is invoked |
| `review.models.cursor` | string | (CLI default) | Model used when `--cursor` reviewer is invoked |

### Example

```json
{
  "review": {
    "models": {
      "gemini": "gemini-2.5-pro",
      "qwen": "qwen-max"
    }
  }
}
```

Falls back to each CLI's configured default when a key is absent. Added in v1.35.0 (#1849).
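The lookup-with-fallback behavior can be pictured roughly like this. It is an illustrative sketch only; `resolveReviewModel` is a hypothetical name and the settings shape is assumed from the example above, not taken from the actual GSD source.

```javascript
// Sketch: resolve the review model for a CLI, falling back to its default.
// Hypothetical helper for illustration; not the real GSD implementation.
function resolveReviewModel(settings, cli, cliDefault) {
  // Absent keys at any level fall through to the CLI's own default.
  const models = (settings.review && settings.review.models) || {};
  return models[cli] || cliDefault;
}

const settings = { review: { models: { gemini: 'gemini-2.5-pro', qwen: 'qwen-max' } } };
resolveReviewModel(settings, 'gemini', 'default-gemini'); // 'gemini-2.5-pro'
resolveReviewModel(settings, 'codex', 'default-codex');   // 'default-codex' (key absent)
```

This matches the "absent = default" convention the settings document uses elsewhere: setting a key overrides, deleting it restores the CLI default.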

---

## Manager Passthrough Flags

Configure per-step flags that `/gsd-manager` appends to each dispatched command. This allows customizing how the manager runs discuss, plan, and execute steps without manual flag entry.

104 docs/FEATURES.md
@@ -102,6 +102,11 @@
- [Hard Stop Safety Gates in /gsd-next](#101-hard-stop-safety-gates-in-gsd-next)
- [Adaptive Model Preset](#102-adaptive-model-preset)
- [Post-Merge Hunk Verification](#103-post-merge-hunk-verification)
- [v1.35.0 Features](#v1350-features)
  - [New Runtime Support (Cline, CodeBuddy, Qwen Code)](#104-new-runtime-support-cline-codebuddy-qwen-code)
  - [GSD-2 Reverse Migration](#105-gsd-2-reverse-migration)
  - [AI Integration Phase Wizard](#106-ai-integration-phase-wizard)
  - [AI Eval Review](#107-ai-eval-review)
- [v1.32 Features](#v132-features)
  - [STATE.md Consistency Gates](#69-statemd-consistency-gates)
  - [Autonomous `--to N` Flag](#70-autonomous---to-n-flag)

@@ -917,7 +922,7 @@ fix(03-01): correct auth token expiry
**Purpose:** Run GSD across multiple AI coding agent runtimes.

**Requirements:**
- REQ-RUNTIME-01: System MUST support Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Antigravity, Trae, Cline, Augment Code
- REQ-RUNTIME-01: System MUST support Claude Code, OpenCode, Gemini CLI, Kilo, Codex, Copilot, Antigravity, Trae, Cline, Augment Code, CodeBuddy, Qwen Code
- REQ-RUNTIME-02: Installer MUST transform content per runtime (tool names, paths, frontmatter)
- REQ-RUNTIME-03: Installer MUST support interactive and non-interactive (`--claude --global`) modes
- REQ-RUNTIME-04: Installer MUST support both global and local installation

@@ -926,12 +931,12 @@ fix(03-01): correct auth token expiry

**Runtime Transformations:**

| Aspect | Claude Code | OpenCode | Gemini | Kilo | Codex | Copilot | Antigravity | Trae | Cline | Augment |
|--------|------------|----------|--------|-------|-------|---------|-------------|------|-------|---------|
| Commands | Slash commands | Slash commands | Slash commands | Slash commands | Skills (TOML) | Slash commands | Skills | Skills | Rules | Skills |
| Agent format | Claude native | `mode: subagent` | Claude native | `mode: subagent` | Skills | Tool mapping | Skills | Skills | Rules | Skills |
| Hook events | `PostToolUse` | N/A | `AfterTool` | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| Config | `settings.json` | `opencode.json(c)` | `settings.json` | `kilo.json(c)` | TOML | Instructions | Config | Config | Config | Config |
| Aspect | Claude Code | OpenCode | Gemini | Kilo | Codex | Copilot | Antigravity | Trae | Cline | Augment | CodeBuddy | Qwen Code |
|--------|------------|----------|--------|-------|-------|---------|-------------|------|-------|---------|-----------|-----------|
| Commands | Slash commands | Slash commands | Slash commands | Slash commands | Skills (TOML) | Slash commands | Skills | Skills | Rules | Skills | Skills | Skills |
| Agent format | Claude native | `mode: subagent` | Claude native | `mode: subagent` | Skills | Tool mapping | Skills | Skills | Rules | Skills | Skills | Skills |
| Hook events | `PostToolUse` | N/A | `AfterTool` | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| Config | `settings.json` | `opencode.json(c)` | `settings.json` | `kilo.json(c)` | TOML | Instructions | Config | Config | `.clinerules` | Config | Config | Config |

---

@@ -2179,3 +2184,88 @@ Test suite that scans all agent, workflow, and command files for embedded inject
- REQ-PATCH-VERIFY-01: Reapply-patches MUST verify each hunk was applied after the merge
- REQ-PATCH-VERIFY-02: Dropped or partial hunks MUST be reported to the user with file and line context
- REQ-PATCH-VERIFY-03: Verification MUST run after all patches are applied, not per-patch

---

## v1.35.0 Features

- [New Runtime Support (Cline, CodeBuddy, Qwen Code)](#104-new-runtime-support-cline-codebuddy-qwen-code)
- [GSD-2 Reverse Migration](#105-gsd-2-reverse-migration)
- [AI Integration Phase Wizard](#106-ai-integration-phase-wizard)
- [AI Eval Review](#107-ai-eval-review)

---

### 104. New Runtime Support (Cline, CodeBuddy, Qwen Code)

**Part of:** `npx get-shit-done-cc`

**Purpose:** Extend GSD installation to the Cline, CodeBuddy, and Qwen Code runtimes.

**Requirements:**
- REQ-CLINE-02: Cline install MUST write `.clinerules` to `~/.cline/` (global) or `./.cline/` (local). No custom slash commands — rules-based integration only. Flag: `--cline`.
- REQ-CODEBUDDY-01: CodeBuddy install MUST deploy skills to `~/.codebuddy/skills/gsd-*/SKILL.md`. Flag: `--codebuddy`.
- REQ-QWEN-01: Qwen Code install MUST deploy skills to `~/.qwen/skills/gsd-*/SKILL.md`, following the open standard used by Claude Code 2.1.88+. The `QWEN_CONFIG_DIR` env var overrides the default path. Flag: `--qwen`.

**Runtime summary:**

| Runtime | Install Format | Config Path | Flag |
|---------|---------------|-------------|------|
| Cline | `.clinerules` | `~/.cline/` or `./.cline/` | `--cline` |
| CodeBuddy | Skills (`SKILL.md`) | `~/.codebuddy/skills/` | `--codebuddy` |
| Qwen Code | Skills (`SKILL.md`) | `~/.qwen/skills/` | `--qwen` |
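The per-runtime directory names in the table above can be expressed as a simple lookup. This is a hypothetical sketch for illustration; the installer's actual `getDirName` helper may differ in shape and coverage.

```javascript
// Sketch: map a runtime name to its dot-directory, per the table above.
// Hypothetical mapping for illustration; not the installer's real getDirName.
const RUNTIME_DIRS = {
  claude: '.claude',
  cline: '.cline',
  codebuddy: '.codebuddy',
  qwen: '.qwen',
};

function dirNameFor(runtime) {
  // Fall back to ".<runtime>" for runtimes not listed here (an assumption).
  return RUNTIME_DIRS[runtime] || `.${runtime}`;
}

dirNameFor('qwen');  // '.qwen'
dirNameFor('cline'); // '.cline'
```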

---

### 105. GSD-2 Reverse Migration

**Command:** `/gsd-from-gsd2 [--dry-run] [--force] [--path <dir>]`

**Purpose:** Migrate a project from GSD-2 format (`.gsd/` directory with Milestone→Slice→Task hierarchy) back to the v1 `.planning/` format, restoring full compatibility with all GSD v1 commands.

**Requirements:**
- REQ-FROM-GSD2-01: Importer MUST read `.gsd/` from the specified or current directory
- REQ-FROM-GSD2-02: Milestone→Slice hierarchy MUST be flattened to sequential phase numbers (M001/S01→phase 01, M001/S02→phase 02, M002/S01→phase 03, etc.)
- REQ-FROM-GSD2-03: System MUST guard against overwriting an existing `.planning/` directory without `--force`
- REQ-FROM-GSD2-04: `--dry-run` MUST preview all changes without writing any files
- REQ-FROM-GSD2-05: Migration MUST produce `PROJECT.md`, `REQUIREMENTS.md`, `ROADMAP.md`, `STATE.md`, and sequential phase directories

**Flags:**

| Flag | Description |
|------|-------------|
| `--dry-run` | Preview migration output without writing files |
| `--force` | Overwrite an existing `.planning/` directory |
| `--path <dir>` | Specify the GSD-2 root directory |

---

### 106. AI Integration Phase Wizard

**Command:** `/gsd-ai-integration-phase [N]`

**Purpose:** Guide developers through selecting, integrating, and planning evaluation for AI/LLM capabilities in a project phase. Produces a structured `AI-SPEC.md` that feeds into planning and verification.

**Requirements:**
- REQ-AISPEC-01: Wizard MUST present an interactive decision matrix covering framework selection, model choice, and integration approach
- REQ-AISPEC-02: System MUST surface domain-specific failure modes and eval criteria relevant to the project type
- REQ-AISPEC-03: System MUST spawn 3 parallel specialist agents: domain-researcher, framework-selector, and eval-planner
- REQ-AISPEC-04: Output MUST produce `{phase}-AI-SPEC.md` with framework recommendation, implementation guidance, and evaluation strategy

**Produces:** `{phase}-AI-SPEC.md` in the phase directory

---

### 107. AI Eval Review

**Command:** `/gsd-eval-review [N]`

**Purpose:** Retroactively audit an executed AI phase's evaluation coverage against the `AI-SPEC.md` plan. Identifies gaps between planned and implemented evaluation before the phase is closed.

**Requirements:**
- REQ-EVALREVIEW-01: Review MUST read `AI-SPEC.md` from the specified phase
- REQ-EVALREVIEW-02: Each eval dimension MUST be scored as COVERED, PARTIAL, or MISSING
- REQ-EVALREVIEW-03: Output MUST include findings, gap descriptions, and remediation guidance
- REQ-EVALREVIEW-04: `EVAL-REVIEW.md` MUST be written to the phase directory

**Produces:** `{phase}-EVAL-REVIEW.md` with scored eval dimensions, gap analysis, and remediation steps

@@ -868,6 +868,40 @@ The installer auto-configures `resolve_model_ids: "omit"` for Gemini CLI, OpenCo

See the [Configuration Reference](CONFIGURATION.md#non-claude-runtimes-codex-opencode-gemini-cli-kilo) for the full explanation.

### Installing for Cline

Cline uses a rules-based integration — GSD installs as `.clinerules` rather than slash commands.

```bash
# Global install (applies to all projects)
npx get-shit-done-cc --cline --global

# Local install (this project only)
npx get-shit-done-cc --cline --local
```

Global installs write to `~/.cline/`. Local installs write to `./.cline/`. No custom slash commands are registered — GSD rules are loaded automatically by Cline from the rules file.

### Installing for CodeBuddy

CodeBuddy uses a skills-based integration.

```bash
npx get-shit-done-cc --codebuddy --global
```

Skills are installed to `~/.codebuddy/skills/gsd-*/SKILL.md`.

### Installing for Qwen Code

Qwen Code uses the same open skills standard as Claude Code 2.1.88+.

```bash
npx get-shit-done-cc --qwen --global
```

Skills are installed to `~/.qwen/skills/gsd-*/SKILL.md`. Use the `QWEN_CONFIG_DIR` environment variable to override the default install path.

### Using Claude Code with Non-Anthropic Providers (OpenRouter, Local)

If GSD subagents call Anthropic models and you're paying through OpenRouter or a local provider, switch to the `inherit` profile: `/gsd-set-profile inherit`. This makes all agents use your current session model instead of specific Anthropic models. See also `/gsd-settings` → Model Profile → Inherit.

@@ -154,6 +154,10 @@
 * learnings copy                       Copy from current project's LEARNINGS.md
 * learnings prune --older-than <dur>   Remove entries older than duration (e.g. 90d)
 * learnings delete <id>                Delete a learning by ID
 *
 * GSD-2 Migration:
 * from-gsd2 [--path <dir>] [--force] [--dry-run]
 *     Import a GSD-2 (.gsd/) project back to GSD v1 (.planning/) format
 */

const fs = require('fs');

@@ -1070,6 +1074,14 @@ async function runCommand(command, args, cwd, raw, defaultValue) {
      break;
    }

    // ─── GSD-2 Reverse Migration ───────────────────────────────────────────

    case 'from-gsd2': {
      const gsd2Import = require('./lib/gsd2-import.cjs');
      gsd2Import.cmdFromGsd2(args.slice(1), cwd, raw);
      break;
    }

    default:
      error(`Unknown command: ${command}`);
  }

511 get-shit-done/bin/lib/gsd2-import.cjs Normal file
@@ -0,0 +1,511 @@
'use strict';

/**
 * gsd2-import — Reverse migration from GSD-2 (.gsd/) to GSD v1 (.planning/)
 *
 * Reads a GSD-2 project directory structure and produces a complete
 * .planning/ artifact tree in GSD v1 format.
 *
 * GSD-2 hierarchy: Milestone → Slice → Task
 * GSD v1 hierarchy: Milestone (in ROADMAP.md) → Phase → Plan
 *
 * Mapping rules:
 * - Slices are numbered sequentially across all milestones (01, 02, …)
 * - Tasks within a slice become plans (01-01, 01-02, …)
 * - Completed slices ([x] in ROADMAP) → [x] phases in ROADMAP.md
 * - Tasks with a SUMMARY file → SUMMARY.md written
 * - Slice RESEARCH.md → phase XX-RESEARCH.md
 */

const fs = require('node:fs');
const path = require('node:path');

// ─── Utilities ──────────────────────────────────────────────────────────────

function readOptional(filePath) {
  try { return fs.readFileSync(filePath, 'utf8'); } catch { return null; }
}

function zeroPad(n, width = 2) {
  return String(n).padStart(width, '0');
}

function slugify(title) {
  return title.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-|-$/g, '');
}

// ─── GSD-2 Parser ───────────────────────────────────────────────────────────

/**
 * Find the .gsd/ directory starting from a project root.
 * Returns the absolute path or null if not found.
 */
function findGsd2Root(startPath) {
  if (path.basename(startPath) === '.gsd' && fs.existsSync(startPath)) {
    return startPath;
  }
  const candidate = path.join(startPath, '.gsd');
  if (fs.existsSync(candidate) && fs.statSync(candidate).isDirectory()) {
    return candidate;
  }
  return null;
}

/**
 * Parse the ## Slices section from a GSD-2 milestone ROADMAP.md.
 * Each slice entry looks like:
 *   - [x] **S01: Title** `risk:medium` `depends:[S00]`
 */
function parseSlicesFromRoadmap(content) {
  const slices = [];
  const sectionMatch = content.match(/## Slices\n([\s\S]*?)(?:\n## |\n# |$)/);
  if (!sectionMatch) return slices;

  for (const line of sectionMatch[1].split('\n')) {
    const m = line.match(/^- \[([x ])\]\s+\*\*(\w+):\s*([^*]+)\*\*/);
    if (!m) continue;
    slices.push({ done: m[1] === 'x', id: m[2].trim(), title: m[3].trim() });
  }
  return slices;
}

/**
 * Parse the milestone title from the first heading in a GSD-2 ROADMAP.md.
 * Format: # M001: Title
 */
function parseMilestoneTitle(content) {
  const m = content.match(/^# \w+:\s*(.+)/m);
  return m ? m[1].trim() : null;
}

/**
 * Parse a task title from a GSD-2 T##-PLAN.md.
 * Format: # T01: Title
 */
function parseTaskTitle(content, fallback) {
  const m = content.match(/^# \w+:\s*(.+)/m);
  return m ? m[1].trim() : fallback;
}

/**
 * Parse the ## Description body from a GSD-2 task plan.
 */
function parseTaskDescription(content) {
  const m = content.match(/## Description\n+([\s\S]+?)(?:\n## |\n# |$)/);
  return m ? m[1].trim() : '';
}

/**
 * Parse ## Must-Haves items from a GSD-2 task plan.
 */
function parseTaskMustHaves(content) {
  const m = content.match(/## Must-Haves\n+([\s\S]+?)(?:\n## |\n# |$)/);
  if (!m) return [];
  return m[1].split('\n')
    .map(l => l.match(/^- \[[ x]\]\s*(.+)/))
    .filter(Boolean)
    .map(match => match[1].trim());
}

/**
 * Read all task plan files from a GSD-2 tasks/ directory.
 */
function readTasksDir(tasksDir) {
  if (!fs.existsSync(tasksDir)) return [];

  return fs.readdirSync(tasksDir)
    .filter(f => f.endsWith('-PLAN.md'))
    .sort()
    .map(tf => {
      const tid = tf.replace('-PLAN.md', '');
      const plan = readOptional(path.join(tasksDir, tf));
      const summary = readOptional(path.join(tasksDir, `${tid}-SUMMARY.md`));
      return {
        id: tid,
        title: plan ? parseTaskTitle(plan, tid) : tid,
        description: plan ? parseTaskDescription(plan) : '',
        mustHaves: plan ? parseTaskMustHaves(plan) : [],
        plan,
        summary,
        done: !!summary,
      };
    });
}

/**
 * Parse a complete GSD-2 .gsd/ directory into a structured representation.
 */
function parseGsd2(gsdDir) {
  const data = {
    projectContent: readOptional(path.join(gsdDir, 'PROJECT.md')),
    requirements: readOptional(path.join(gsdDir, 'REQUIREMENTS.md')),
    milestones: [],
  };

  const milestonesBase = path.join(gsdDir, 'milestones');
  if (!fs.existsSync(milestonesBase)) return data;

  const milestoneIds = fs.readdirSync(milestonesBase)
    .filter(d => fs.statSync(path.join(milestonesBase, d)).isDirectory())
    .sort();

  for (const mid of milestoneIds) {
    const mDir = path.join(milestonesBase, mid);
    const roadmapContent = readOptional(path.join(mDir, `${mid}-ROADMAP.md`));
    const slicesDir = path.join(mDir, 'slices');

    const sliceInfos = roadmapContent ? parseSlicesFromRoadmap(roadmapContent) : [];

    const slices = sliceInfos.map(info => {
      const sDir = path.join(slicesDir, info.id);
      const hasSDir = fs.existsSync(sDir);
      return {
        id: info.id,
        title: info.title,
        done: info.done,
        plan: hasSDir ? readOptional(path.join(sDir, `${info.id}-PLAN.md`)) : null,
|
||||
summary: hasSDir ? readOptional(path.join(sDir, `${info.id}-SUMMARY.md`)) : null,
|
||||
research: hasSDir ? readOptional(path.join(sDir, `${info.id}-RESEARCH.md`)) : null,
|
||||
context: hasSDir ? readOptional(path.join(sDir, `${info.id}-CONTEXT.md`)) : null,
|
||||
tasks: hasSDir ? readTasksDir(path.join(sDir, 'tasks')) : [],
|
||||
};
|
||||
});
|
||||
|
||||
data.milestones.push({
|
||||
id: mid,
|
||||
title: roadmapContent ? (parseMilestoneTitle(roadmapContent) ?? mid) : mid,
|
||||
research: readOptional(path.join(mDir, `${mid}-RESEARCH.md`)),
|
||||
slices,
|
||||
});
|
||||
}
|
||||
|
||||
return data;
|
||||
}
|
||||
|
||||
// ─── Artifact Builders ──────────────────────────────────────────────────────

/**
 * Build a GSD v1 PLAN.md from a GSD-2 task.
 */
function buildPlanMd(task, phasePrefix, planPrefix, phaseSlug, milestoneTitle) {
  const lines = [
    '---',
    `phase: "${phasePrefix}"`,
    `plan: "${planPrefix}"`,
    'type: "implementation"',
    '---',
    '',
    '<objective>',
    task.title,
    '</objective>',
    '',
    '<context>',
    `Phase: ${phasePrefix} (${phaseSlug}) — Milestone: ${milestoneTitle}`,
  ];

  if (task.description) {
    lines.push('', task.description);
  }

  lines.push('</context>');

  if (task.mustHaves.length > 0) {
    lines.push('', '<must_haves>');
    for (const mh of task.mustHaves) {
      lines.push(`- ${mh}`);
    }
    lines.push('</must_haves>');
  }

  return lines.join('\n') + '\n';
}

/**
 * Build a GSD v1 SUMMARY.md from a GSD-2 task summary.
 * Strips the GSD-2 frontmatter and preserves the body.
 */
function buildSummaryMd(task, phasePrefix, planPrefix) {
  const raw = task.summary || '';
  // Strip GSD-2 frontmatter block (--- ... ---) if present
  const bodyMatch = raw.match(/^---[\s\S]*?---\n+([\s\S]*)$/);
  const body = bodyMatch ? bodyMatch[1].trim() : raw.trim();

  return [
    '---',
    `phase: "${phasePrefix}"`,
    `plan: "${planPrefix}"`,
    '---',
    '',
    body || 'Task completed (migrated from GSD-2).',
    '',
  ].join('\n');
}

/**
 * Build a GSD v1 XX-CONTEXT.md from a GSD-2 slice.
 */
function buildContextMd(slice, phasePrefix) {
  const lines = [
    `# Phase ${phasePrefix} Context`,
    '',
    `Migrated from GSD-2 slice ${slice.id}: ${slice.title}`,
  ];

  const extra = slice.context || '';
  if (extra.trim()) {
    lines.push('', extra.trim());
  }

  return lines.join('\n') + '\n';
}

/**
 * Build the GSD v1 ROADMAP.md with milestone-sectioned format.
 */
function buildRoadmapMd(milestones, phaseMap) {
  const lines = ['# Roadmap', ''];

  for (const milestone of milestones) {
    lines.push(`## ${milestone.id}: ${milestone.title}`, '');
    const mPhases = phaseMap.filter(p => p.milestoneId === milestone.id);
    for (const { slice, phaseNum } of mPhases) {
      const prefix = zeroPad(phaseNum);
      const slug = slugify(slice.title);
      const check = slice.done ? 'x' : ' ';
      lines.push(`- [${check}] **Phase ${prefix}: ${slug}** — ${slice.title}`);
    }
    lines.push('');
  }

  return lines.join('\n');
}

/**
 * Build the GSD v1 STATE.md reflecting the current position in the project.
 */
function buildStateMd(phaseMap) {
  const currentEntry = phaseMap.find(p => !p.slice.done);
  const totalPhases = phaseMap.length;
  const donePhases = phaseMap.filter(p => p.slice.done).length;
  const pct = totalPhases > 0 ? Math.round((donePhases / totalPhases) * 100) : 0;

  const currentPhaseNum = currentEntry ? zeroPad(currentEntry.phaseNum) : zeroPad(totalPhases);
  const currentSlug = currentEntry ? slugify(currentEntry.slice.title) : 'complete';
  const status = currentEntry ? 'Ready to plan' : 'All phases complete';

  const filled = Math.round(pct / 10);
  const bar = `[${'█'.repeat(filled)}${'░'.repeat(10 - filled)}]`;
  const today = new Date().toISOString().split('T')[0];

  return [
    '# Project State',
    '',
    '## Project Reference',
    '',
    'See: .planning/PROJECT.md',
    '',
    `**Current focus:** Phase ${currentPhaseNum} (${currentSlug})`,
    '',
    '## Current Position',
    '',
    `Phase: ${currentPhaseNum} of ${zeroPad(totalPhases)} (${currentSlug})`,
    `Status: ${status}`,
    `Last activity: ${today} — Migrated from GSD-2`,
    '',
    `Progress: ${bar} ${pct}%`,
    '',
    '## Accumulated Context',
    '',
    '### Decisions',
    '',
    'Migrated from GSD-2. Review PROJECT.md for key decisions.',
    '',
    '### Blockers/Concerns',
    '',
    'None.',
    '',
    '## Session Continuity',
    '',
    `Last session: ${today}`,
    'Stopped at: Migration from GSD-2 completed',
    'Resume file: None',
    '',
  ].join('\n');
}

// ─── Transformer ─────────────────────────────────────────────────────────────

/**
 * Convert parsed GSD-2 data into a map of relative path → file content.
 * All paths are relative to the .planning/ root.
 */
function buildPlanningArtifacts(gsd2Data) {
  const artifacts = new Map();

  // Passthrough files
  artifacts.set('PROJECT.md', gsd2Data.projectContent || '# Project\n\n(Migrated from GSD-2)\n');
  if (gsd2Data.requirements) {
    artifacts.set('REQUIREMENTS.md', gsd2Data.requirements);
  }

  // Minimal valid v1 config
  artifacts.set('config.json', JSON.stringify({ version: 1 }, null, 2) + '\n');

  // Build sequential phase map: flatten Milestones → Slices into numbered phases
  const phaseMap = [];
  let phaseNum = 1;
  for (const milestone of gsd2Data.milestones) {
    for (const slice of milestone.slices) {
      phaseMap.push({ milestoneId: milestone.id, milestoneTitle: milestone.title, slice, phaseNum });
      phaseNum++;
    }
  }

  artifacts.set('ROADMAP.md', buildRoadmapMd(gsd2Data.milestones, phaseMap));
  artifacts.set('STATE.md', buildStateMd(phaseMap));

  for (const { slice, phaseNum, milestoneTitle } of phaseMap) {
    const prefix = zeroPad(phaseNum);
    const slug = slugify(slice.title);
    const dir = `phases/${prefix}-${slug}`;

    artifacts.set(`${dir}/${prefix}-CONTEXT.md`, buildContextMd(slice, prefix));

    if (slice.research) {
      artifacts.set(`${dir}/${prefix}-RESEARCH.md`, slice.research);
    }

    for (let i = 0; i < slice.tasks.length; i++) {
      const task = slice.tasks[i];
      const planPrefix = zeroPad(i + 1);

      artifacts.set(
        `${dir}/${prefix}-${planPrefix}-PLAN.md`,
        buildPlanMd(task, prefix, planPrefix, slug, milestoneTitle)
      );

      if (task.done && task.summary) {
        artifacts.set(
          `${dir}/${prefix}-${planPrefix}-SUMMARY.md`,
          buildSummaryMd(task, prefix, planPrefix)
        );
      }
    }
  }

  return artifacts;
}

// ─── Preview ─────────────────────────────────────────────────────────────────

/**
 * Format a dry-run preview string for display before writing.
 */
function buildPreview(gsd2Data, artifacts) {
  const lines = ['Preview — files that will be created in .planning/:'];

  for (const rel of artifacts.keys()) {
    lines.push(`  ${rel}`);
  }

  const totalSlices = gsd2Data.milestones.reduce((s, m) => s + m.slices.length, 0);
  const doneSlices = gsd2Data.milestones.reduce((s, m) => s + m.slices.filter(sl => sl.done).length, 0);
  const allTasks = gsd2Data.milestones.flatMap(m => m.slices.flatMap(sl => sl.tasks));
  const doneTasks = allTasks.filter(t => t.done).length;

  lines.push('');
  lines.push(`Milestones: ${gsd2Data.milestones.length}`);
  lines.push(`Phases (slices): ${totalSlices} (${doneSlices} completed)`);
  lines.push(`Plans (tasks): ${allTasks.length} (${doneTasks} completed)`);
  lines.push('');
  lines.push('Cannot migrate automatically:');
  lines.push('  - GSD-2 cost/token ledger (no v1 equivalent)');
  lines.push('  - GSD-2 database state (rebuilt from files on first /gsd-health)');
  lines.push('  - VS Code extension state');

  return lines.join('\n');
}

// ─── Writer ───────────────────────────────────────────────────────────────────

/**
 * Write all artifacts to the .planning/ directory.
 */
function writePlanningDir(artifacts, planningRoot) {
  for (const [rel, content] of artifacts) {
    const absPath = path.join(planningRoot, rel);
    fs.mkdirSync(path.dirname(absPath), { recursive: true });
    fs.writeFileSync(absPath, content, 'utf8');
  }
}

// ─── Command Handler ──────────────────────────────────────────────────────────

/**
 * Entry point called from gsd-tools.cjs.
 * Supports: --force, --dry-run, --path <dir>
 */
function cmdFromGsd2(args, cwd, raw) {
  const { output, error } = require('./core.cjs');

  const force = args.includes('--force');
  const dryRun = args.includes('--dry-run');

  const pathIdx = args.indexOf('--path');
  const projectDir = pathIdx >= 0 && args[pathIdx + 1]
    ? path.resolve(cwd, args[pathIdx + 1])
    : cwd;

  const gsdDir = findGsd2Root(projectDir);
  if (!gsdDir) {
    return output({ success: false, error: `No .gsd/ directory found in ${projectDir}` }, raw);
  }

  const planningRoot = path.join(path.dirname(gsdDir), '.planning');
  if (fs.existsSync(planningRoot) && !force) {
    return output({
      success: false,
      error: `.planning/ already exists at ${planningRoot}. Pass --force to overwrite.`,
    }, raw);
  }

  const gsd2Data = parseGsd2(gsdDir);
  const artifacts = buildPlanningArtifacts(gsd2Data);
  const preview = buildPreview(gsd2Data, artifacts);

  if (dryRun) {
    return output({ success: true, dryRun: true, preview }, raw);
  }

  writePlanningDir(artifacts, planningRoot);

  return output({
    success: true,
    planningDir: planningRoot,
    filesWritten: artifacts.size,
    milestones: gsd2Data.milestones.length,
    preview,
  }, raw);
}

module.exports = {
  findGsd2Root,
  parseGsd2,
  buildPlanningArtifacts,
  buildPreview,
  writePlanningDir,
  cmdFromGsd2,
  // Exported for unit tests
  parseSlicesFromRoadmap,
  parseMilestoneTitle,
  parseTaskTitle,
  parseTaskDescription,
  parseTaskMustHaves,
  buildPlanMd,
  buildSummaryMd,
  buildContextMd,
  buildRoadmapMd,
  buildStateMd,
  slugify,
  zeroPad,
};
@@ -382,6 +382,12 @@ Execute each selected wave in sequence. Within a wave: parallel if `PARALLELIZAT
auto-detects worktree mode (`.git` is a file, not a directory) and skips
shared file updates automatically. The orchestrator updates them centrally
after merge.

REQUIRED: SUMMARY.md MUST be committed before you return. In worktree mode the
git_commit_metadata step in execute-plan.md commits SUMMARY.md and REQUIREMENTS.md
only (STATE.md and ROADMAP.md are excluded automatically). Do NOT skip or defer
this commit — the orchestrator force-removes the worktree after you return, and
any uncommitted SUMMARY.md will be permanently lost (#2070).
</parallel_execution>

<execution_context>
@@ -556,6 +562,17 @@ Execute each selected wave in sequence. Within a wave: parallel if `PARALLELIZAT
  fi
fi

# Safety net: commit any uncommitted SUMMARY.md before force-removing the worktree.
# This guards against executors that skipped the git_commit_metadata step (#2070).
UNCOMMITTED_SUMMARY=$(git -C "$WT" ls-files --modified --others --exclude-standard -- "*SUMMARY.md" 2>/dev/null || true)
if [ -n "$UNCOMMITTED_SUMMARY" ]; then
  echo "⚠ SUMMARY.md was not committed by executor — committing now to prevent data loss"
  git -C "$WT" add -- "*SUMMARY.md" 2>/dev/null || true
  git -C "$WT" commit --no-verify -m "docs(recovery): rescue uncommitted SUMMARY.md before worktree removal (#2070)" 2>/dev/null || true
  # Re-merge the recovery commit
  git merge "$WT_BRANCH" --no-edit -m "chore: merge rescued SUMMARY.md from executor worktree ($WT_BRANCH)" 2>/dev/null || true
fi

# Remove the worktree
git worktree remove "$WT" --force 2>/dev/null || true

219
tests/bug-2075-worktree-deletion-safeguards.test.cjs
Normal file
@@ -0,0 +1,219 @@
/**
 * Regression tests for #2075: gsd-executor worktree merge systematically
 * deletes prior-wave committed files.
 *
 * Three failure modes documented in issue #2075:
 *
 * Failure Mode B (PRIMARY — unaddressed before this fix):
 *   Executor agent runs `git clean` inside the worktree, removing files
 *   committed on the feature branch. git clean treats them as "untracked"
 *   from the worktree's perspective and deletes them. The executor then
 *   commits only its own deliverables; the subsequent merge brings the
 *   deletions onto the main branch.
 *
 * Failure Mode A (partially addressed in PR #1982):
 *   Worktree created from wrong branch base. Audit all worktree-spawning
 *   workflows for worktree_branch_check presence.
 *
 * Failure Mode C:
 *   Stale content from wrong base overwrites shared files. Covered by
 *   the --hard reset in the worktree_branch_check.
 *
 * Defense-in-depth (from #1977):
 *   Post-commit deletion check: already in gsd-executor.md (--diff-filter=D).
 *   Pre-merge deletion check: already in execute-phase.md (--diff-filter=D).
 */

'use strict';

const { describe, test } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('fs');
const path = require('path');

const EXECUTOR_AGENT_PATH = path.join(__dirname, '..', 'agents', 'gsd-executor.md');
const EXECUTE_PHASE_PATH = path.join(__dirname, '..', 'get-shit-done', 'workflows', 'execute-phase.md');
const QUICK_PATH = path.join(__dirname, '..', 'get-shit-done', 'workflows', 'quick.md');
const DIAGNOSE_PATH = path.join(__dirname, '..', 'get-shit-done', 'workflows', 'diagnose-issues.md');

describe('bug-2075: worktree deletion safeguards', () => {

  describe('Failure Mode B: git clean prohibition in executor agent', () => {
    test('gsd-executor.md explicitly prohibits git clean in worktree context', () => {
      const content = fs.readFileSync(EXECUTOR_AGENT_PATH, 'utf-8');

      // Must have an explicit prohibition section mentioning git clean
      const prohibitsGitClean = (
        content.includes('git clean') &&
        (
          /NEVER.*git clean/i.test(content) ||
          /git clean.*NEVER/i.test(content) ||
          /do not.*git clean/i.test(content) ||
          /git clean.*prohibited/i.test(content) ||
          /prohibited.*git clean/i.test(content) ||
          /forbidden.*git clean/i.test(content) ||
          /git clean.*forbidden/i.test(content) ||
          /must not.*git clean/i.test(content) ||
          /git clean.*must not/i.test(content)
        )
      );

      assert.ok(
        prohibitsGitClean,
        'gsd-executor.md must explicitly prohibit git clean — running it inside a worktree deletes files committed on the feature branch (#2075 Failure Mode B)'
      );
    });

    test('gsd-executor.md git clean prohibition explains the worktree data-loss risk', () => {
      const content = fs.readFileSync(EXECUTOR_AGENT_PATH, 'utf-8');

      // The prohibition must be accompanied by a reason — not just a bare rule
      // Look for the word "worktree" near the git clean prohibition
      const gitCleanIdx = content.indexOf('git clean');
      assert.ok(gitCleanIdx > -1, 'gsd-executor.md must mention git clean (to prohibit it)');

      // Extract context around the git clean mention (500 chars either side)
      const contextStart = Math.max(0, gitCleanIdx - 500);
      const contextEnd = Math.min(content.length, gitCleanIdx + 500);
      const context = content.slice(contextStart, contextEnd);

      const hasWorktreeRationale = (
        /worktree/i.test(context) ||
        /delete/i.test(context) ||
        /untracked/i.test(context)
      );

      assert.ok(
        hasWorktreeRationale,
        'The git clean prohibition in gsd-executor.md must explain why: git clean in a worktree deletes files that appear untracked but are committed on the feature branch'
      );
    });
  });

  describe('Failure Mode A: worktree_branch_check audit across all worktree-spawning workflows', () => {
    test('execute-phase.md has worktree_branch_check block with --hard reset', () => {
      const content = fs.readFileSync(EXECUTE_PHASE_PATH, 'utf-8');

      const blockMatch = content.match(/<worktree_branch_check>([\s\S]*?)<\/worktree_branch_check>/);
      assert.ok(
        blockMatch,
        'execute-phase.md must contain a <worktree_branch_check> block'
      );

      const block = blockMatch[1];
      assert.ok(
        block.includes('reset --hard'),
        'execute-phase.md worktree_branch_check must use git reset --hard (not --soft)'
      );
      assert.ok(
        !block.includes('reset --soft'),
        'execute-phase.md worktree_branch_check must not use git reset --soft'
      );
    });

    test('quick.md has worktree_branch_check block with --hard reset', () => {
      const content = fs.readFileSync(QUICK_PATH, 'utf-8');

      const blockMatch = content.match(/<worktree_branch_check>([\s\S]*?)<\/worktree_branch_check>/);
      assert.ok(
        blockMatch,
        'quick.md must contain a <worktree_branch_check> block'
      );

      const block = blockMatch[1];
      assert.ok(
        block.includes('reset --hard'),
        'quick.md worktree_branch_check must use git reset --hard (not --soft)'
      );
      assert.ok(
        !block.includes('reset --soft'),
        'quick.md worktree_branch_check must not use git reset --soft'
      );
    });

    test('diagnose-issues.md has worktree_branch_check instruction for spawned agents', () => {
      const content = fs.readFileSync(DIAGNOSE_PATH, 'utf-8');

      assert.ok(
        content.includes('worktree_branch_check'),
        'diagnose-issues.md must include worktree_branch_check instruction for spawned debug agents'
      );

      assert.ok(
        content.includes('reset --hard'),
        'diagnose-issues.md worktree_branch_check must instruct agents to use git reset --hard'
      );
    });
  });

  describe('Defense-in-depth: post-commit deletion check (from #1977)', () => {
    test('gsd-executor.md task_commit_protocol has post-commit deletion verification', () => {
      const content = fs.readFileSync(EXECUTOR_AGENT_PATH, 'utf-8');

      assert.ok(
        content.includes('--diff-filter=D'),
        'gsd-executor.md must include --diff-filter=D to detect accidental file deletions after each commit'
      );

      // Must have a warning about unexpected deletions
      assert.ok(
        content.includes('DELETIONS') || content.includes('WARNING'),
        'gsd-executor.md must emit a warning when a commit includes unexpected file deletions'
      );
    });
  });

  describe('Defense-in-depth: pre-merge deletion check (from #1977)', () => {
    test('execute-phase.md worktree merge section has pre-merge deletion check', () => {
      const content = fs.readFileSync(EXECUTE_PHASE_PATH, 'utf-8');

      const worktreeCleanupStart = content.indexOf('Worktree cleanup');
      assert.ok(
        worktreeCleanupStart > -1,
        'execute-phase.md must have a worktree cleanup section'
      );

      const cleanupSection = content.slice(worktreeCleanupStart);

      assert.ok(
        cleanupSection.includes('--diff-filter=D'),
        'execute-phase.md worktree cleanup must use --diff-filter=D to block deletion-introducing merges'
      );

      // Deletion check must appear before git merge
      const deletionCheckIdx = cleanupSection.indexOf('--diff-filter=D');
      const gitMergeIdx = cleanupSection.indexOf('git merge');
      assert.ok(
        deletionCheckIdx < gitMergeIdx,
        '--diff-filter=D deletion check must appear before git merge in the worktree cleanup section'
      );

      assert.ok(
        cleanupSection.includes('BLOCKED') || cleanupSection.includes('deletion'),
        'execute-phase.md must block or warn when the worktree branch contains file deletions'
      );
    });

    test('quick.md worktree merge section has pre-merge deletion check', () => {
      const content = fs.readFileSync(QUICK_PATH, 'utf-8');

      const mergeIdx = content.indexOf('git merge');
      assert.ok(mergeIdx > -1, 'quick.md must contain a git merge operation');

      // Find the worktree cleanup block (starts after "Worktree cleanup")
      const worktreeCleanupStart = content.indexOf('Worktree cleanup');
      assert.ok(
        worktreeCleanupStart > -1,
        'quick.md must have a worktree cleanup section'
      );

      const cleanupSection = content.slice(worktreeCleanupStart);

      assert.ok(
        cleanupSection.includes('--diff-filter=D') || cleanupSection.includes('diff-filter'),
        'quick.md worktree cleanup must check for file deletions before merging'
      );
    });
  });

});
550
tests/gsd2-import.test.cjs
Normal file
@@ -0,0 +1,550 @@
'use strict';

const { describe, it, test, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('node:fs');
const path = require('node:path');
const { createTempDir, cleanup, runGsdTools } = require('./helpers.cjs');

const {
  findGsd2Root,
  parseSlicesFromRoadmap,
  parseMilestoneTitle,
  parseTaskTitle,
  parseTaskDescription,
  parseTaskMustHaves,
  parseGsd2,
  buildPlanningArtifacts,
  buildRoadmapMd,
  buildStateMd,
  slugify,
  zeroPad,
} = require('../get-shit-done/bin/lib/gsd2-import.cjs');

// ─── Fixture Builders ──────────────────────────────────────────────────────

/** Build a minimal but complete GSD-2 .gsd/ directory in tmpDir. */
function makeGsd2Project(tmpDir, opts = {}) {
  const gsdDir = path.join(tmpDir, '.gsd');
  const m001Dir = path.join(gsdDir, 'milestones', 'M001');
  const s01Dir = path.join(m001Dir, 'slices', 'S01');
  const s02Dir = path.join(m001Dir, 'slices', 'S02');
  const s01TasksDir = path.join(s01Dir, 'tasks');

  fs.mkdirSync(s01TasksDir, { recursive: true });

  fs.writeFileSync(path.join(gsdDir, 'PROJECT.md'), '# My Project\n\nA test project.\n');
  fs.writeFileSync(path.join(gsdDir, 'REQUIREMENTS.md'), [
    '# Requirements',
    '',
    '## Active',
    '',
    '### R001 — Do the thing',
    '',
    '- Status: active',
    '- Description: The core requirement.',
    '',
  ].join('\n'));

  const roadmap = [
    '# M001: Foundation',
    '',
    '**Vision:** Build the foundation.',
    '',
    '## Success Criteria',
    '',
    '- It works.',
    '',
    '## Slices',
    '',
    '- [x] **S01: Setup** `risk:low` `depends:[]`',
    '  > After this: setup complete',
    '- [ ] **S02: Auth System** `risk:medium` `depends:[S01]`',
    '  > After this: auth works',
  ].join('\n');
  fs.writeFileSync(path.join(m001Dir, 'M001-ROADMAP.md'), roadmap);

  // S01 — completed slice with research and a done task
  fs.writeFileSync(path.join(s01Dir, 'S01-PLAN.md'), [
    '# S01: Setup',
    '',
    '**Goal:** Set up the project.',
    '',
    '## Tasks',
    '- [x] **T01: Init**',
  ].join('\n'));
  fs.writeFileSync(path.join(s01Dir, 'S01-RESEARCH.md'), '# Research\n\nSome research.\n');
  fs.writeFileSync(path.join(s01Dir, 'S01-SUMMARY.md'), '---\nstatus: done\n---\n\nSlice done.\n');

  fs.writeFileSync(path.join(s01TasksDir, 'T01-PLAN.md'), [
    '# T01: Init Project',
    '',
    '**Slice:** S01 — **Milestone:** M001',
    '',
    '## Description',
    'Initialize the project structure.',
    '',
    '## Must-Haves',
    '- [x] package.json exists',
    '- [x] tsconfig.json exists',
    '',
    '## Files',
    '- `package.json`',
    '- `tsconfig.json`',
  ].join('\n'));
  fs.writeFileSync(path.join(s01TasksDir, 'T01-SUMMARY.md'), [
    '---',
    'status: done',
    'completed_at: 2025-01-15',
    '---',
    '',
    '# T01: Init Project',
    '',
    'Set up package.json and tsconfig.json.',
  ].join('\n'));

  // S02 — not started: slice appears in roadmap but no slice directory
  if (opts.withS02Dir) {
    fs.mkdirSync(path.join(s02Dir, 'tasks'), { recursive: true });
    fs.writeFileSync(path.join(s02Dir, 'S02-PLAN.md'), [
      '# S02: Auth System',
      '',
      '**Goal:** Add authentication.',
      '',
      '## Tasks',
      '- [ ] **T01: JWT middleware**',
    ].join('\n'));
    fs.writeFileSync(path.join(s02Dir, 'tasks', 'T01-PLAN.md'), [
      '# T01: JWT Middleware',
      '',
      '**Slice:** S02 — **Milestone:** M001',
      '',
      '## Description',
      'Implement JWT token validation middleware.',
      '',
      '## Must-Haves',
      '- [ ] validateToken() returns 401 on invalid JWT',
    ].join('\n'));
  }

  return gsdDir;
}

/** Build a two-milestone GSD-2 project. */
function makeTwoMilestoneProject(tmpDir) {
  const gsdDir = path.join(tmpDir, '.gsd');
  const m001Dir = path.join(gsdDir, 'milestones', 'M001');
  const m002Dir = path.join(gsdDir, 'milestones', 'M002');

  fs.mkdirSync(path.join(m001Dir, 'slices', 'S01', 'tasks'), { recursive: true });
  fs.mkdirSync(path.join(m002Dir, 'slices', 'S01', 'tasks'), { recursive: true });

  fs.writeFileSync(path.join(gsdDir, 'PROJECT.md'), '# Multi-milestone Project\n');

  fs.writeFileSync(path.join(m001Dir, 'M001-ROADMAP.md'), [
    '# M001: Alpha',
    '',
    '## Slices',
    '',
    '- [x] **S01: Core** `risk:low` `depends:[]`',
    '- [x] **S02: API** `risk:low` `depends:[S01]`',
  ].join('\n'));

  fs.writeFileSync(path.join(m002Dir, 'M002-ROADMAP.md'), [
    '# M002: Beta',
    '',
    '## Slices',
    '',
    '- [ ] **S01: Dashboard** `risk:medium` `depends:[]`',
  ].join('\n'));

  return gsdDir;
}

// ─── Unit Tests ────────────────────────────────────────────────────────────

describe('parseSlicesFromRoadmap', () => {
  test('parses done and pending slices', () => {
    const content = [
      '## Slices',
      '',
      '- [x] **S01: Setup** `risk:low` `depends:[]`',
      '- [ ] **S02: Auth System** `risk:medium` `depends:[S01]`',
    ].join('\n');
    const slices = parseSlicesFromRoadmap(content);
    assert.strictEqual(slices.length, 2);
    assert.deepStrictEqual(slices[0], { done: true, id: 'S01', title: 'Setup' });
    assert.deepStrictEqual(slices[1], { done: false, id: 'S02', title: 'Auth System' });
  });

  test('returns empty array when no Slices section', () => {
    const slices = parseSlicesFromRoadmap('# M001: Title\n\n## Success Criteria\n\n- Works.');
    assert.strictEqual(slices.length, 0);
  });

  test('ignores non-slice lines in the section', () => {
    const content = [
      '## Slices',
      '',
      'Some intro text.',
      '- [x] **S01: Core** `risk:low` `depends:[]`',
      '  > After this: done',
    ].join('\n');
    const slices = parseSlicesFromRoadmap(content);
    assert.strictEqual(slices.length, 1);
    assert.strictEqual(slices[0].id, 'S01');
  });
});

describe('parseMilestoneTitle', () => {
  test('extracts title from first heading', () => {
    assert.strictEqual(parseMilestoneTitle('# M001: Foundation\n\nBody.'), 'Foundation');
  });

  test('returns null when heading absent', () => {
    assert.strictEqual(parseMilestoneTitle('No heading here.'), null);
  });
});

describe('parseTaskTitle', () => {
  test('extracts title from task plan', () => {
    assert.strictEqual(parseTaskTitle('# T01: Init Project\n\nBody.', 'T01'), 'Init Project');
  });

  test('falls back to provided default', () => {
    assert.strictEqual(parseTaskTitle('No heading.', 'T01'), 'T01');
  });
});

describe('parseTaskDescription', () => {
  test('extracts description body', () => {
    const content = [
      '# T01: Title',
      '',
      '## Description',
      'Do the thing.',
      '',
      '## Must-Haves',
    ].join('\n');
    assert.strictEqual(parseTaskDescription(content), 'Do the thing.');
  });

  test('returns empty string when section absent', () => {
    assert.strictEqual(parseTaskDescription('# T01: Title\n\nNo sections.'), '');
  });
});

describe('parseTaskMustHaves', () => {
  test('parses checked and unchecked items', () => {
    const content = [
      '## Must-Haves',
      '- [x] File exists',
      '- [ ] Tests pass',
    ].join('\n');
    const mh = parseTaskMustHaves(content);
    assert.deepStrictEqual(mh, ['File exists', 'Tests pass']);
  });

  test('returns empty array when section absent', () => {
    assert.deepStrictEqual(parseTaskMustHaves('# T01: Title\n\nNo sections.'), []);
  });
});

describe('slugify', () => {
  test('lowercases and replaces non-alphanumeric with hyphens', () => {
    assert.strictEqual(slugify('Auth System'), 'auth-system');
    assert.strictEqual(slugify('My Feature (v2)'), 'my-feature-v2');
  });

  test('strips leading/trailing hyphens', () => {
    assert.strictEqual(slugify('  spaces  '), 'spaces');
  });
});

describe('zeroPad', () => {
  test('pads to 2 digits by default', () => {
    assert.strictEqual(zeroPad(1), '01');
    assert.strictEqual(zeroPad(12), '12');
  });
});

// ─── Integration Tests ─────────────────────────────────────────────────────

describe('parseGsd2', () => {
  let tmpDir;
  beforeEach(() => { tmpDir = createTempDir('gsd2-parse-'); });
  afterEach(() => { cleanup(tmpDir); });

  test('reads project and requirements passthroughs', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    assert.ok(data.projectContent.includes('My Project'));
    assert.ok(data.requirements.includes('R001'));
  });

  test('parses milestone with slices', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    assert.strictEqual(data.milestones.length, 1);
    assert.strictEqual(data.milestones[0].id, 'M001');
    assert.strictEqual(data.milestones[0].title, 'Foundation');
    assert.strictEqual(data.milestones[0].slices.length, 2);
  });

  test('marks S01 as done, S02 as not done', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const [s01, s02] = data.milestones[0].slices;
    assert.strictEqual(s01.done, true);
    assert.strictEqual(s02.done, false);
  });

  test('reads research for completed slice', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    assert.ok(data.milestones[0].slices[0].research.includes('Some research'));
  });

  test('reads tasks from tasks/ directory', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const tasks = data.milestones[0].slices[0].tasks;
    assert.strictEqual(tasks.length, 1);
    assert.strictEqual(tasks[0].id, 'T01');
    assert.strictEqual(tasks[0].title, 'Init Project');
    assert.strictEqual(tasks[0].done, true);
  });

  test('parses task must-haves', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const mh = data.milestones[0].slices[0].tasks[0].mustHaves;
    assert.deepStrictEqual(mh, ['package.json exists', 'tsconfig.json exists']);
  });

  test('handles missing .gsd/milestones/ gracefully', () => {
    const gsdDir = path.join(tmpDir, '.gsd');
    fs.mkdirSync(gsdDir, { recursive: true });
    fs.writeFileSync(path.join(gsdDir, 'PROJECT.md'), '# Empty\n');
    const data = parseGsd2(gsdDir);
    assert.strictEqual(data.milestones.length, 0);
  });

  test('slice with no directory has empty tasks list', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    // S02 has no slice directory in the default fixture
    const s02 = data.milestones[0].slices[1];
    assert.strictEqual(s02.tasks.length, 0);
    assert.strictEqual(s02.research, null);
  });
});

describe('buildPlanningArtifacts', () => {
  let tmpDir;
  beforeEach(() => { tmpDir = createTempDir('gsd2-artifacts-'); });
  afterEach(() => { cleanup(tmpDir); });

  test('produces PROJECT.md, REQUIREMENTS.md, ROADMAP.md, STATE.md, config.json', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    assert.ok(artifacts.has('PROJECT.md'));
    assert.ok(artifacts.has('REQUIREMENTS.md'));
    assert.ok(artifacts.has('ROADMAP.md'));
    assert.ok(artifacts.has('STATE.md'));
    assert.ok(artifacts.has('config.json'));
  });

  test('S01 (done) maps to phase 01 with PLAN and SUMMARY', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    assert.ok(artifacts.has('phases/01-setup/01-CONTEXT.md'));
    assert.ok(artifacts.has('phases/01-setup/01-RESEARCH.md'));
    assert.ok(artifacts.has('phases/01-setup/01-01-PLAN.md'));
    assert.ok(artifacts.has('phases/01-setup/01-01-SUMMARY.md'));
  });

  test('S02 (pending) maps to phase 02 with only CONTEXT and PLAN', () => {
    const gsdDir = makeGsd2Project(tmpDir, { withS02Dir: true });
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    assert.ok(artifacts.has('phases/02-auth-system/02-CONTEXT.md'));
    assert.ok(artifacts.has('phases/02-auth-system/02-01-PLAN.md'));
    assert.ok(!artifacts.has('phases/02-auth-system/02-01-SUMMARY.md'), 'no summary for pending task');
  });

  test('ROADMAP.md marks S01 done, S02 pending', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    const roadmap = artifacts.get('ROADMAP.md');
    assert.ok(roadmap.includes('[x]'));
    assert.ok(roadmap.includes('[ ]'));
  });

  test('PLAN.md includes frontmatter with phase and plan keys', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    const plan = artifacts.get('phases/01-setup/01-01-PLAN.md');
    assert.ok(plan.includes('phase: "01"'));
    assert.ok(plan.includes('plan: "01"'));
    assert.ok(plan.includes('type: "implementation"'));
  });

  test('SUMMARY.md strips GSD-2 frontmatter and adds v1 frontmatter', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    const summary = artifacts.get('phases/01-setup/01-01-SUMMARY.md');
    assert.ok(summary.includes('phase: "01"'));
    assert.ok(summary.includes('plan: "01"'));
    // GSD-2 frontmatter field should not appear
    assert.ok(!summary.includes('completed_at:'));
    // Body content should be preserved
    assert.ok(summary.includes('Init Project'));
  });

  test('config.json is valid JSON', () => {
    const gsdDir = makeGsd2Project(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    assert.doesNotThrow(() => JSON.parse(artifacts.get('config.json')));
  });

  test('multi-milestone: slices numbered sequentially across milestones', () => {
    const gsdDir = makeTwoMilestoneProject(tmpDir);
    const data = parseGsd2(gsdDir);
    const artifacts = buildPlanningArtifacts(data);
    // M001/S01 → phase 01, M001/S02 → phase 02, M002/S01 → phase 03
    assert.ok(artifacts.has('phases/01-core/01-CONTEXT.md'));
    assert.ok(artifacts.has('phases/02-api/02-CONTEXT.md'));
    assert.ok(artifacts.has('phases/03-dashboard/03-CONTEXT.md'));
  });
});

describe('buildRoadmapMd', () => {
  test('produces milestone sections with checked/unchecked phases', () => {
    const milestones = [{ id: 'M001', title: 'Alpha', slices: [] }];
    const phaseMap = [
      { milestoneId: 'M001', milestoneTitle: 'Alpha', slice: { done: true, title: 'Core' }, phaseNum: 1 },
      { milestoneId: 'M001', milestoneTitle: 'Alpha', slice: { done: false, title: 'API' }, phaseNum: 2 },
    ];
    const roadmap = buildRoadmapMd(milestones, phaseMap);
    assert.ok(roadmap.includes('## M001: Alpha'));
    assert.ok(roadmap.includes('[x]'));
    assert.ok(roadmap.includes('[ ]'));
    assert.ok(roadmap.includes('Phase 01: core'));
    assert.ok(roadmap.includes('Phase 02: api'));
  });
});

describe('buildStateMd', () => {
  test('sets current phase to first incomplete slice', () => {
    const phaseMap = [
      { milestoneId: 'M001', milestoneTitle: 'Alpha', slice: { done: true, title: 'Core' }, phaseNum: 1 },
      { milestoneId: 'M001', milestoneTitle: 'Alpha', slice: { done: false, title: 'API Layer' }, phaseNum: 2 },
    ];
    const state = buildStateMd(phaseMap);
    assert.ok(state.includes('Phase: 02'));
    assert.ok(state.includes('api-layer'));
    assert.ok(state.includes('Ready to plan'));
  });

  test('reports all complete when all slices done', () => {
    const phaseMap = [
      { milestoneId: 'M001', milestoneTitle: 'Alpha', slice: { done: true, title: 'Core' }, phaseNum: 1 },
    ];
    const state = buildStateMd(phaseMap);
    assert.ok(state.includes('All phases complete'));
  });
});

// ─── CLI Integration Tests ──────────────────────────────────────────────────

describe('gsd-tools from-gsd2 CLI', () => {
  let tmpDir;
  beforeEach(() => { tmpDir = createTempDir('gsd2-cli-'); });
  afterEach(() => { cleanup(tmpDir); });

  test('--dry-run returns preview without writing files', () => {
    makeGsd2Project(tmpDir);
    const result = runGsdTools(['from-gsd2', '--dry-run', '--raw'], tmpDir);
    assert.ok(result.success, result.error);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.dryRun, true);
    assert.ok(parsed.preview.includes('PROJECT.md'));
    assert.ok(!fs.existsSync(path.join(tmpDir, '.planning')), 'no files written in dry-run');
  });

  test('writes .planning/ directory with correct structure', () => {
    makeGsd2Project(tmpDir);
    const result = runGsdTools(['from-gsd2', '--raw'], tmpDir);
    assert.ok(result.success, result.error);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.success, true);
    assert.ok(parsed.filesWritten > 0);
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'ROADMAP.md')));
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'STATE.md')));
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'PROJECT.md')));
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'phases', '01-setup', '01-01-PLAN.md')));
  });

  test('errors when no .gsd/ directory present', () => {
    const result = runGsdTools(['from-gsd2', '--raw'], tmpDir);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.success, false);
    assert.ok(parsed.error.includes('No .gsd/'));
  });

  test('errors when .planning/ already exists without --force', () => {
    makeGsd2Project(tmpDir);
    fs.mkdirSync(path.join(tmpDir, '.planning'), { recursive: true });
    const result = runGsdTools(['from-gsd2', '--raw'], tmpDir);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.success, false);
    assert.ok(parsed.error.includes('already exists'));
  });

  test('--force overwrites existing .planning/', () => {
    makeGsd2Project(tmpDir);
    fs.mkdirSync(path.join(tmpDir, '.planning'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, '.planning', 'OLD.md'), 'old content');
    const result = runGsdTools(['from-gsd2', '--force', '--raw'], tmpDir);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.success, true);
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'ROADMAP.md')));
  });

  test('--path resolves target directory', () => {
    const projectDir = path.join(tmpDir, 'myproject');
    fs.mkdirSync(projectDir, { recursive: true });
    makeGsd2Project(projectDir);
    // Run from tmpDir but point at projectDir
    const result = runGsdTools(['from-gsd2', '--path', projectDir, '--dry-run', '--raw'], tmpDir);
    assert.ok(result.success, result.error);
    const parsed = JSON.parse(result.output);
    assert.strictEqual(parsed.dryRun, true);
    assert.ok(parsed.preview.includes('PROJECT.md'));
  });

  test('completion state: S01 done → [x] in ROADMAP.md', () => {
    makeGsd2Project(tmpDir);
    runGsdTools(['from-gsd2', '--raw'], tmpDir);
    const roadmap = fs.readFileSync(path.join(tmpDir, '.planning', 'ROADMAP.md'), 'utf8');
    assert.ok(roadmap.includes('[x]'));
    // S02 is pending
    assert.ok(roadmap.includes('[ ]'));
  });

  test('SUMMARY.md written for completed task, not for pending', () => {
    makeGsd2Project(tmpDir, { withS02Dir: true });
    runGsdTools(['from-gsd2', '--raw'], tmpDir);
    // S01/T01 is done → SUMMARY exists
    assert.ok(fs.existsSync(path.join(tmpDir, '.planning', 'phases', '01-setup', '01-01-SUMMARY.md')));
    // S02/T01 is pending → no SUMMARY
    assert.ok(!fs.existsSync(path.join(tmpDir, '.planning', 'phases', '02-auth-system', '02-01-SUMMARY.md')));
  });
});
@@ -29,10 +29,11 @@ const runtimeMap = {
   '9': 'gemini',
   '10': 'kilo',
   '11': 'opencode',
-  '12': 'trae',
-  '13': 'windsurf'
+  '12': 'qwen',
+  '13': 'trae',
+  '14': 'windsurf'
 };
-const allRuntimes = ['claude', 'antigravity', 'augment', 'cline', 'codebuddy', 'codex', 'copilot', 'cursor', 'gemini', 'kilo', 'opencode', 'trae', 'windsurf'];
+const allRuntimes = ['claude', 'antigravity', 'augment', 'cline', 'codebuddy', 'codex', 'copilot', 'cursor', 'gemini', 'kilo', 'opencode', 'qwen', 'trae', 'windsurf'];
 
 /**
  * Simulate the parsing logic from promptRuntime without requiring readline.
@@ -41,7 +42,7 @@ const allRuntimes = ['claude', 'antigravity', 'augment', 'cline', 'codebuddy', '
 function parseRuntimeInput(input) {
   input = input.trim() || '1';
 
-  if (input === '14') {
+  if (input === '15') {
     return allRuntimes;
   }
 
@@ -89,16 +90,20 @@ describe('multi-runtime selection parsing', () => {
     assert.deepStrictEqual(parseRuntimeInput('11'), ['opencode']);
   });
 
+  test('single choice for qwen', () => {
+    assert.deepStrictEqual(parseRuntimeInput('12'), ['qwen']);
+  });
+
   test('single choice for trae', () => {
-    assert.deepStrictEqual(parseRuntimeInput('12'), ['trae']);
+    assert.deepStrictEqual(parseRuntimeInput('13'), ['trae']);
   });
 
   test('single choice for windsurf', () => {
-    assert.deepStrictEqual(parseRuntimeInput('13'), ['windsurf']);
+    assert.deepStrictEqual(parseRuntimeInput('14'), ['windsurf']);
   });
 
-  test('choice 14 returns all runtimes', () => {
-    assert.deepStrictEqual(parseRuntimeInput('14'), allRuntimes);
+  test('choice 15 returns all runtimes', () => {
+    assert.deepStrictEqual(parseRuntimeInput('15'), allRuntimes);
   });
 
   test('empty input defaults to claude', () => {
@@ -107,13 +112,13 @@ describe('multi-runtime selection parsing', () => {
   });
 
   test('invalid choices are ignored, falls back to claude if all invalid', () => {
-    assert.deepStrictEqual(parseRuntimeInput('15'), ['claude']);
+    assert.deepStrictEqual(parseRuntimeInput('16'), ['claude']);
     assert.deepStrictEqual(parseRuntimeInput('0'), ['claude']);
    assert.deepStrictEqual(parseRuntimeInput('abc'), ['claude']);
   });
 
   test('invalid choices mixed with valid are filtered out', () => {
-    assert.deepStrictEqual(parseRuntimeInput('1,15,7'), ['claude', 'copilot']);
+    assert.deepStrictEqual(parseRuntimeInput('1,16,7'), ['claude', 'copilot']);
     assert.deepStrictEqual(parseRuntimeInput('abc 3 xyz'), ['augment']);
   });
 
@@ -129,7 +134,7 @@ describe('multi-runtime selection parsing', () => {
 });
 
 describe('install.js source contains multi-select support', () => {
-  test('runtimeMap is defined with all 13 runtimes', () => {
+  test('runtimeMap is defined with all 14 runtimes', () => {
     for (const [key, name] of Object.entries(runtimeMap)) {
       assert.ok(
         installSrc.includes(`'${key}': '${name}'`),
@@ -146,21 +151,25 @@ describe('install.js source contains multi-select support', () => {
     }
   });
 
-  test('all shortcut uses option 14', () => {
+  test('all shortcut uses option 15', () => {
     assert.ok(
-      installSrc.includes("if (input === '14')"),
-      'all shortcut uses option 14'
+      installSrc.includes("if (input === '15')"),
+      'all shortcut uses option 15'
     );
   });
 
-  test('prompt lists Trae as option 12 and All as option 14', () => {
+  test('prompt lists Qwen Code as option 12, Trae as option 13 and All as option 15', () => {
     assert.ok(
-      installSrc.includes('12${reset}) Trae'),
-      'prompt lists Trae as option 12'
+      installSrc.includes('12${reset}) Qwen Code'),
+      'prompt lists Qwen Code as option 12'
     );
     assert.ok(
-      installSrc.includes('14${reset}) All'),
-      'prompt lists All as option 14'
+      installSrc.includes('13${reset}) Trae'),
+      'prompt lists Trae as option 13'
     );
+    assert.ok(
+      installSrc.includes('15${reset}) All'),
+      'prompt lists All as option 15'
+    );
   });
 
178
tests/qwen-install.test.cjs
Normal file
@@ -0,0 +1,178 @@
process.env.GSD_TEST_MODE = '1';

const { test, describe, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('node:fs');
const path = require('node:path');
const os = require('node:os');
const { createTempDir, cleanup } = require('./helpers.cjs');

const {
  getDirName,
  getGlobalDir,
  getConfigDirFromHome,
  install,
  uninstall,
  writeManifest,
} = require('../bin/install.js');

describe('Qwen Code runtime directory mapping', () => {
  test('maps Qwen to .qwen for local installs', () => {
    assert.strictEqual(getDirName('qwen'), '.qwen');
  });

  test('maps Qwen to ~/.qwen for global installs', () => {
    assert.strictEqual(getGlobalDir('qwen'), path.join(os.homedir(), '.qwen'));
  });

  test('returns .qwen config fragments for local and global installs', () => {
    assert.strictEqual(getConfigDirFromHome('qwen', false), "'.qwen'");
    assert.strictEqual(getConfigDirFromHome('qwen', true), "'.qwen'");
  });
});

describe('getGlobalDir (Qwen Code)', () => {
  let originalQwenConfigDir;

  beforeEach(() => {
    originalQwenConfigDir = process.env.QWEN_CONFIG_DIR;
  });

  afterEach(() => {
    if (originalQwenConfigDir !== undefined) {
      process.env.QWEN_CONFIG_DIR = originalQwenConfigDir;
    } else {
      delete process.env.QWEN_CONFIG_DIR;
    }
  });

  test('returns ~/.qwen with no env var or explicit dir', () => {
    delete process.env.QWEN_CONFIG_DIR;
    const result = getGlobalDir('qwen');
    assert.strictEqual(result, path.join(os.homedir(), '.qwen'));
  });

  test('returns explicit dir when provided', () => {
    const result = getGlobalDir('qwen', '/custom/qwen-path');
    assert.strictEqual(result, '/custom/qwen-path');
  });

  test('respects QWEN_CONFIG_DIR env var', () => {
    process.env.QWEN_CONFIG_DIR = '~/custom-qwen';
    const result = getGlobalDir('qwen');
    assert.strictEqual(result, path.join(os.homedir(), 'custom-qwen'));
  });

  test('explicit dir takes priority over QWEN_CONFIG_DIR', () => {
    process.env.QWEN_CONFIG_DIR = '~/from-env';
    const result = getGlobalDir('qwen', '/explicit/path');
    assert.strictEqual(result, '/explicit/path');
  });

  test('does not break other runtimes', () => {
    assert.strictEqual(getGlobalDir('claude'), path.join(os.homedir(), '.claude'));
    assert.strictEqual(getGlobalDir('codex'), path.join(os.homedir(), '.codex'));
  });
});

describe('Qwen Code local install/uninstall', () => {
  let tmpDir;
  let previousCwd;

  beforeEach(() => {
    tmpDir = createTempDir('gsd-qwen-install-');
    previousCwd = process.cwd();
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(previousCwd);
    cleanup(tmpDir);
  });

  test('installs GSD into ./.qwen and removes it cleanly', () => {
    const result = install(false, 'qwen');
    const targetDir = path.join(tmpDir, '.qwen');

    assert.strictEqual(result.runtime, 'qwen');
    assert.strictEqual(result.configDir, fs.realpathSync(targetDir));

    assert.ok(fs.existsSync(path.join(targetDir, 'skills', 'gsd-help', 'SKILL.md')));
    assert.ok(fs.existsSync(path.join(targetDir, 'get-shit-done', 'VERSION')));
    assert.ok(fs.existsSync(path.join(targetDir, 'agents')));

    const manifest = writeManifest(targetDir, 'qwen');
    assert.ok(Object.keys(manifest.files).some(file => file.startsWith('skills/gsd-help/')), manifest);

    uninstall(false, 'qwen');

    assert.ok(!fs.existsSync(path.join(targetDir, 'skills', 'gsd-help')), 'Qwen skill directory removed');
    assert.ok(!fs.existsSync(path.join(targetDir, 'get-shit-done')), 'get-shit-done removed');
  });
});

describe('E2E: Qwen Code uninstall skills cleanup', () => {
  let tmpDir;
  let previousCwd;

  beforeEach(() => {
    tmpDir = createTempDir('gsd-qwen-uninstall-');
    previousCwd = process.cwd();
    process.chdir(tmpDir);
  });

  afterEach(() => {
    process.chdir(previousCwd);
    cleanup(tmpDir);
  });

  test('removes all gsd-* skill directories on --qwen --uninstall', () => {
    const targetDir = path.join(tmpDir, '.qwen');
    install(false, 'qwen');

    const skillsDir = path.join(targetDir, 'skills');
    assert.ok(fs.existsSync(skillsDir), 'skills dir exists after install');

    const installedSkills = fs.readdirSync(skillsDir, { withFileTypes: true })
      .filter(e => e.isDirectory() && e.name.startsWith('gsd-'));
    assert.ok(installedSkills.length > 0, `found ${installedSkills.length} gsd-* skill dirs before uninstall`);

    uninstall(false, 'qwen');

    if (fs.existsSync(skillsDir)) {
      const remainingGsd = fs.readdirSync(skillsDir, { withFileTypes: true })
        .filter(e => e.isDirectory() && e.name.startsWith('gsd-'));
      assert.strictEqual(remainingGsd.length, 0,
        `Expected 0 gsd-* skill dirs after uninstall, found: ${remainingGsd.map(e => e.name).join(', ')}`);
    }
  });

  test('preserves non-GSD skill directories during --qwen --uninstall', () => {
    const targetDir = path.join(tmpDir, '.qwen');
    install(false, 'qwen');

    const customSkillDir = path.join(targetDir, 'skills', 'my-custom-skill');
    fs.mkdirSync(customSkillDir, { recursive: true });
    fs.writeFileSync(path.join(customSkillDir, 'SKILL.md'), '# My Custom Skill\n');

    assert.ok(fs.existsSync(path.join(customSkillDir, 'SKILL.md')), 'custom skill exists before uninstall');

    uninstall(false, 'qwen');

    assert.ok(fs.existsSync(path.join(customSkillDir, 'SKILL.md')),
      'Non-GSD skill directory should be preserved after Qwen uninstall');
  });

  test('removes engine directory on --qwen --uninstall', () => {
    const targetDir = path.join(tmpDir, '.qwen');
    install(false, 'qwen');

    assert.ok(fs.existsSync(path.join(targetDir, 'get-shit-done', 'VERSION')),
      'engine exists before uninstall');

    uninstall(false, 'qwen');

    assert.ok(!fs.existsSync(path.join(targetDir, 'get-shit-done')),
      'get-shit-done engine should be removed after Qwen uninstall');
  });
});
286
tests/qwen-skills-migration.test.cjs
Normal file
@@ -0,0 +1,286 @@
/**
|
||||
* GSD Tools Tests - Qwen Code Skills Migration
|
||||
*
|
||||
* Tests for installing GSD for Qwen Code using the standard
|
||||
* skills/gsd-xxx/SKILL.md format (same open standard as Claude Code 2.1.88+).
|
||||
*
|
||||
* Uses node:test and node:assert (NOT Jest).
|
||||
*/
|
||||
|
||||
process.env.GSD_TEST_MODE = '1';
|
||||
|
||||
const { test, describe, beforeEach, afterEach } = require('node:test');
|
||||
const assert = require('node:assert/strict');
|
||||
const path = require('path');
|
||||
const os = require('os');
|
||||
const fs = require('fs');
|
||||
|
||||
const {
|
||||
convertClaudeCommandToClaudeSkill,
|
||||
copyCommandsAsClaudeSkills,
|
||||
} = require('../bin/install.js');
|
||||
|
||||
// ─── convertClaudeCommandToClaudeSkill (used by Qwen via copyCommandsAsClaudeSkills) ──
|
||||
|
||||
describe('Qwen Code: convertClaudeCommandToClaudeSkill', () => {
|
||||
test('preserves allowed-tools multiline YAML list', () => {
|
||||
const input = [
|
||||
'---',
|
||||
'name: gsd:next',
|
||||
'description: Advance to the next step',
|
||||
'allowed-tools:',
|
||||
' - Read',
|
||||
' - Bash',
|
||||
' - Grep',
|
||||
'---',
|
||||
'',
|
||||
'Body content here.',
|
||||
].join('\n');
|
||||
|
||||
const result = convertClaudeCommandToClaudeSkill(input, 'gsd-next');
|
||||
assert.ok(result.includes('allowed-tools:'), 'allowed-tools field is present');
|
||||
assert.ok(result.includes('Read'), 'Read tool preserved');
|
||||
assert.ok(result.includes('Bash'), 'Bash tool preserved');
|
||||
assert.ok(result.includes('Grep'), 'Grep tool preserved');
|
||||
});
|
||||
|
||||
test('preserves argument-hint', () => {
|
||||
const input = [
|
||||
'---',
|
||||
'name: gsd:debug',
|
||||
'description: Debug issues',
|
||||
'argument-hint: "[issue description]"',
|
||||
'allowed-tools:',
|
||||
' - Read',
|
||||
' - Bash',
|
||||
'---',
|
||||
'',
|
||||
'Debug body.',
|
||||
].join('\n');
|
||||
|
||||
const result = convertClaudeCommandToClaudeSkill(input, 'gsd-debug');
|
||||
assert.ok(result.includes('argument-hint:'), 'argument-hint field is present');
|
||||
assert.ok(
|
||||
result.includes('[issue description]'),
|
||||
'argument-hint value preserved'
|
||||
);
|
||||
});
|
||||
|
||||
test('converts name format from gsd:xxx to skill naming', () => {
|
||||
const input = [
|
||||
'---',
|
||||
'name: gsd:next',
|
||||
'description: Advance workflow',
|
||||
'---',
|
||||
'',
|
||||
'Body.',
|
||||
].join('\n');
|
||||
|
||||
const result = convertClaudeCommandToClaudeSkill(input, 'gsd-next');
|
||||
assert.ok(result.includes('name: gsd-next'), 'name uses skill naming convention');
|
||||
assert.ok(!result.includes('name: gsd:next'), 'old name format removed');
|
||||
});
|
||||
|
||||
test('preserves body content unchanged', () => {
|
||||
const body = '\n<objective>\nDo the thing.\n</objective>\n\n<process>\nStep 1.\nStep 2.\n</process>\n';
|
||||
const input = [
|
||||
'---',
|
||||
'name: gsd:test',
|
||||
'description: Test command',
|
||||
'---',
|
||||
body,
|
||||
].join('');
|
||||
|
||||
const result = convertClaudeCommandToClaudeSkill(input, 'gsd-test');
|
||||
assert.ok(result.includes('<objective>'), 'objective tag preserved');
|
||||
assert.ok(result.includes('Do the thing.'), 'body text preserved');
|
||||
assert.ok(result.includes('<process>'), 'process tag preserved');
|
||||
});
|
||||
|
||||
test('produces valid SKILL.md frontmatter starting with ---', () => {
|
||||
const input = [
|
||||
'---',
|
||||
'name: gsd:plan',
|
||||
'description: Plan a phase',
|
||||
'---',
|
||||
'',
|
||||
'Plan body.',
|
||||
].join('\n');
|
||||
|
||||
const result = convertClaudeCommandToClaudeSkill(input, 'gsd-plan');
|
||||
assert.ok(result.startsWith('---\n'), 'frontmatter starts with ---');
|
||||
assert.ok(result.includes('\n---\n'), 'frontmatter closes with ---');
|
||||
});
|
||||
});
|
||||
|
||||
// ─── copyCommandsAsClaudeSkills (used for Qwen skills install) ─────────────
|
||||
|
||||
describe('Qwen Code: copyCommandsAsClaudeSkills', () => {
|
||||
let tmpDir;
|
||||
|
||||
beforeEach(() => {
|
||||
tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-qwen-test-'));
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
if (fs.existsSync(tmpDir)) {
|
||||
fs.rmSync(tmpDir, { recursive: true });
|
||||
}
|
||||
});
|
||||
|
||||
test('creates skills/gsd-xxx/SKILL.md directory structure', () => {
|
||||
// Create source command files
|
||||
const srcDir = path.join(tmpDir, 'src', 'commands', 'gsd');
|
||||
fs.mkdirSync(srcDir, { recursive: true });
|
||||
fs.writeFileSync(path.join(srcDir, 'quick.md'), [
|
||||
'---',
|
||||
'name: gsd:quick',
|
||||
'description: Execute a quick task',
|
||||
'allowed-tools:',
|
||||
' - Read',
|
||||
' - Bash',
|
||||
'---',
|
||||
'',
|
||||
'<objective>Quick task body</objective>',
|
||||
].join('\n'));
|
||||
|
||||
const skillsDir = path.join(tmpDir, 'dest', 'skills');
|
||||
copyCommandsAsClaudeSkills(srcDir, skillsDir, 'gsd', '/test/prefix/', 'qwen', false);
|
||||
|
||||
// Verify SKILL.md was created
|
||||
const skillPath = path.join(skillsDir, 'gsd-quick', 'SKILL.md');
|
||||
assert.ok(fs.existsSync(skillPath), 'gsd-quick/SKILL.md exists');
|
||||
|
||||
// Verify content
|
||||
const content = fs.readFileSync(skillPath, 'utf8');
|
||||
assert.ok(content.includes('name: gsd-quick'), 'skill name converted');
|
||||
assert.ok(content.includes('description:'), 'description present');
|
||||
assert.ok(content.includes('allowed-tools:'), 'allowed-tools preserved');
|
||||
assert.ok(content.includes('<objective>'), 'body content preserved');
|
||||
});
|
||||
|
||||
  test('replaces ~/.claude/ paths with pathPrefix', () => {
    const srcDir = path.join(tmpDir, 'src', 'commands', 'gsd');
    fs.mkdirSync(srcDir, { recursive: true });
    fs.writeFileSync(path.join(srcDir, 'next.md'), [
      '---',
      'name: gsd:next',
      'description: Next step',
      '---',
      '',
      'Reference: @~/.claude/get-shit-done/workflows/next.md',
    ].join('\n'));

    const skillsDir = path.join(tmpDir, 'dest', 'skills');
    copyCommandsAsClaudeSkills(srcDir, skillsDir, 'gsd', '$HOME/.qwen/', 'qwen', false);

    const content = fs.readFileSync(path.join(skillsDir, 'gsd-next', 'SKILL.md'), 'utf8');
    assert.ok(content.includes('$HOME/.qwen/'), 'path replaced to .qwen/');
    assert.ok(!content.includes('~/.claude/'), 'old claude path removed');
  });

  test('replaces $HOME/.claude/ paths with pathPrefix', () => {
    const srcDir = path.join(tmpDir, 'src', 'commands', 'gsd');
    fs.mkdirSync(srcDir, { recursive: true });
    fs.writeFileSync(path.join(srcDir, 'plan.md'), [
      '---',
      'name: gsd:plan',
      'description: Plan phase',
      '---',
      '',
      'Reference: $HOME/.claude/get-shit-done/workflows/plan.md',
    ].join('\n'));

    const skillsDir = path.join(tmpDir, 'dest', 'skills');
    copyCommandsAsClaudeSkills(srcDir, skillsDir, 'gsd', '$HOME/.qwen/', 'qwen', false);

    const content = fs.readFileSync(path.join(skillsDir, 'gsd-plan', 'SKILL.md'), 'utf8');
    assert.ok(content.includes('$HOME/.qwen/'), 'path replaced to .qwen/');
    assert.ok(!content.includes('$HOME/.claude/'), 'old claude path removed');
  });

  test('removes stale gsd- skills before installing new ones', () => {
    const srcDir = path.join(tmpDir, 'src', 'commands', 'gsd');
    fs.mkdirSync(srcDir, { recursive: true });
    fs.writeFileSync(path.join(srcDir, 'quick.md'), [
      '---',
      'name: gsd:quick',
      'description: Quick task',
      '---',
      '',
      'Body',
    ].join('\n'));

    const skillsDir = path.join(tmpDir, 'dest', 'skills');
    // Pre-create a stale skill
    fs.mkdirSync(path.join(skillsDir, 'gsd-old-skill'), { recursive: true });
    fs.writeFileSync(path.join(skillsDir, 'gsd-old-skill', 'SKILL.md'), 'old');

    copyCommandsAsClaudeSkills(srcDir, skillsDir, 'gsd', '/test/', 'qwen', false);

    assert.ok(!fs.existsSync(path.join(skillsDir, 'gsd-old-skill')), 'stale skill removed');
    assert.ok(fs.existsSync(path.join(skillsDir, 'gsd-quick', 'SKILL.md')), 'new skill installed');
  });

  test('preserves agent field in frontmatter', () => {
    const srcDir = path.join(tmpDir, 'src', 'commands', 'gsd');
    fs.mkdirSync(srcDir, { recursive: true });
    fs.writeFileSync(path.join(srcDir, 'execute.md'), [
      '---',
      'name: gsd:execute',
      'description: Execute phase',
      'agent: gsd-executor',
      'allowed-tools:',
      ' - Read',
      ' - Bash',
      ' - Task',
      '---',
      '',
      'Execute body',
    ].join('\n'));

    const skillsDir = path.join(tmpDir, 'dest', 'skills');
    copyCommandsAsClaudeSkills(srcDir, skillsDir, 'gsd', '/test/', 'qwen', false);

    const content = fs.readFileSync(path.join(skillsDir, 'gsd-execute', 'SKILL.md'), 'utf8');
    assert.ok(content.includes('agent: gsd-executor'), 'agent field preserved');
  });
});

// ─── Integration: SKILL.md format validation ────────────────────────────────

describe('Qwen Code: SKILL.md format validation', () => {
  test('SKILL.md frontmatter is valid YAML structure', () => {
    const input = [
      '---',
      'name: gsd:review',
      'description: Code review with quality checks',
      'argument-hint: "[PR number or branch]"',
      'agent: gsd-code-reviewer',
      'allowed-tools:',
      ' - Read',
      ' - Grep',
      ' - Bash',
      '---',
      '',
      '<objective>Review code</objective>',
    ].join('\n');

    const result = convertClaudeCommandToClaudeSkill(input, 'gsd-review');

    // Parse the frontmatter
    const fmMatch = result.match(/^---\n([\s\S]*?)\n---/);
    assert.ok(fmMatch, 'has frontmatter block');

    const fmLines = fmMatch[1].split('\n');
    const hasName = fmLines.some(l => l.startsWith('name: gsd-review'));
    const hasDesc = fmLines.some(l => l.startsWith('description:'));
    const hasAgent = fmLines.some(l => l.startsWith('agent:'));
    const hasTools = fmLines.some(l => l.startsWith('allowed-tools:'));

    assert.ok(hasName, 'name field correct');
    assert.ok(hasDesc, 'description field present');
    assert.ok(hasAgent, 'agent field present');
    assert.ok(hasTools, 'allowed-tools field present');
  });
});