Mirror of https://github.com/glittercowboy/get-shit-done, synced 2026-04-25 17:25:23 +02:00

merge: resolve conflicts with upstream main (firecrawl + exa_search)

ensureConfigFile(): keep our refactored version that delegates to
buildNewProjectConfig({}) instead of upstream's duplicated logic.

buildNewProjectConfig(): add firecrawl and exa_search API key
detection alongside existing brave_search, matching upstream's
new integrations.
.github/workflows/auto-label-issues.yml (vendored): 2 changes

@@ -10,7 +10,7 @@ jobs:
     permissions:
       issues: write
     steps:
-      - uses: actions/github-script@v7
+      - uses: actions/github-script@v8
        with:
          script: |
            await github.rest.issues.addLabels({
.github/workflows/test.yml (vendored): 4 changes

@@ -25,10 +25,10 @@ jobs:
         node-version: [20, 22, 24]

     steps:
-      - uses: actions/checkout@34e114876b0b11c390a56381ad16ebd13914f8d5 # v4
+      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2

       - name: Set up Node.js ${{ matrix.node-version }}
-        uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
+        uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
         with:
           node-version: ${{ matrix.node-version }}
           cache: 'npm'
@@ -1,7 +1,7 @@
 ---
 name: gsd-phase-researcher
 description: Researches how to implement a phase before planning. Produces RESEARCH.md consumed by gsd-planner. Spawned by /gsd:plan-phase orchestrator.
-tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*
+tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*, mcp__firecrawl__*, mcp__exa__*
 color: cyan
 # hooks:
 #   PostToolUse:
@@ -137,6 +137,31 @@ If `brave_search: false` (or not set), use built-in WebSearch tool instead.
 
 Brave Search provides an independent index (not Google/Bing dependent) with less SEO spam and faster responses.
 
+### Exa Semantic Search (MCP)
+
+Check `exa_search` from init context. If `true`, use Exa for semantic, research-heavy queries:
+
+```
+mcp__exa__web_search_exa with query: "your semantic query"
+```
+
+**Best for:** Research questions where keyword search fails — "best approaches to X", finding technical/academic content, discovering niche libraries. Returns semantically relevant results.
+
+If `exa_search: false` (or not set), fall back to WebSearch or Brave Search.
+
+### Firecrawl Deep Scraping (MCP)
+
+Check `firecrawl` from init context. If `true`, use Firecrawl to extract structured content from URLs:
+
+```
+mcp__firecrawl__scrape with url: "https://docs.example.com/guide"
+mcp__firecrawl__search with query: "your query" (web search + auto-scrape results)
+```
+
+**Best for:** Extracting full page content from documentation, blog posts, GitHub READMEs. Use after finding a URL from Exa, WebSearch, or known docs. Returns clean markdown.
+
+If `firecrawl: false` (or not set), fall back to WebFetch.
+
 ## Verification Protocol
 
 **WebSearch findings MUST be verified:**
@@ -161,7 +186,7 @@ For each WebSearch finding:
 | MEDIUM | WebSearch verified with official source, multiple credible sources | State with attribution |
 | LOW | WebSearch only, single source, unverified | Flag as needing validation |
 
-Priority: Context7 > Official Docs > Official GitHub > Verified WebSearch > Unverified WebSearch
+Priority: Context7 > Exa (verified) > Firecrawl (official docs) > Official GitHub > Brave/WebSearch (verified) > WebSearch (unverified)
 
 </source_hierarchy>
@@ -1,7 +1,7 @@
 ---
 name: gsd-project-researcher
 description: Researches domain ecosystem before roadmap creation. Produces files in .planning/research/ consumed during roadmap creation. Spawned by /gsd:new-project or /gsd:new-milestone orchestrators.
-tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*
+tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*, mcp__firecrawl__*, mcp__exa__*
 color: cyan
 # hooks:
 #   PostToolUse:
@@ -116,6 +116,31 @@ If `brave_search: false` (or not set), use built-in WebSearch tool instead.
 
 Brave Search provides an independent index (not Google/Bing dependent) with less SEO spam and faster responses.
 
+### Exa Semantic Search (MCP)
+
+Check `exa_search` from orchestrator context. If `true`, use Exa for research-heavy, semantic queries:
+
+```
+mcp__exa__web_search_exa with query: "your semantic query"
+```
+
+**Best for:** Research questions where keyword search fails — "best approaches to X", finding technical/academic content, discovering niche libraries, ecosystem exploration. Returns semantically relevant results rather than keyword matches.
+
+If `exa_search: false` (or not set), fall back to WebSearch or Brave Search.
+
+### Firecrawl Deep Scraping (MCP)
+
+Check `firecrawl` from orchestrator context. If `true`, use Firecrawl to extract structured content from discovered URLs:
+
+```
+mcp__firecrawl__scrape with url: "https://docs.example.com/guide"
+mcp__firecrawl__search with query: "your query" (web search + auto-scrape results)
+```
+
+**Best for:** Extracting full page content from documentation, blog posts, GitHub READMEs, comparison articles. Use after finding a relevant URL from Exa, WebSearch, or known docs. Returns clean markdown instead of raw HTML.
+
+If `firecrawl: false` (or not set), fall back to WebFetch.
+
 ## Verification Protocol
 
 **WebSearch findings must be verified:**
@@ -138,7 +163,7 @@ Never present LOW confidence findings as authoritative.
 | MEDIUM | WebSearch verified with official source, multiple credible sources agree | State with attribution |
 | LOW | WebSearch only, single source, unverified | Flag as needing validation |
 
-**Source priority:** Context7 → Official Docs → Official GitHub → WebSearch (verified) → WebSearch (unverified)
+**Source priority:** Context7 → Exa (verified) → Firecrawl (official docs) → Official GitHub → Brave/WebSearch (verified) → WebSearch (unverified)
 
 </tool_strategy>
@@ -1,7 +1,7 @@
 ---
 name: gsd-ui-researcher
 description: Produces UI-SPEC.md design contract for frontend phases. Reads upstream artifacts, detects design system state, asks only unanswered questions. Spawned by /gsd:ui-phase orchestrator.
-tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*
+tools: Read, Write, Bash, Grep, Glob, WebSearch, WebFetch, mcp__context7__*, mcp__firecrawl__*, mcp__exa__*
 color: "#E879F9"
 # hooks:
 #   PostToolUse:
@@ -89,7 +89,11 @@ Your UI-SPEC.md is consumed by:
 |----------|------|---------|-------------|
 | 1st | Codebase Grep/Glob | Existing tokens, components, styles, config files | HIGH |
 | 2nd | Context7 | Component library API docs, shadcn preset format | HIGH |
-| 3rd | WebSearch | Design pattern references, accessibility standards | Needs verification |
+| 3rd | Exa (MCP) | Design pattern references, accessibility standards, semantic research | MEDIUM (verify) |
+| 4th | Firecrawl (MCP) | Deep scrape component library docs, design system references | HIGH (content depends on source) |
+| 5th | WebSearch | Fallback keyword search for ecosystem discovery | Needs verification |
 
+**Exa/Firecrawl:** Check `exa_search` and `firecrawl` from orchestrator context. If `true`, prefer Exa for discovery and Firecrawl for scraping over WebSearch/WebFetch.
+
 **Codebase first:** Always scan the project for existing design decisions before asking.
bin/install.js: 1015 changes (diff suppressed because it is too large)
@@ -0,0 +1,700 @@

# Materialize new-project config on initialization

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** When `/gsd:new-project` creates `.planning/config.json`, the file contains all effective defaults — not just the 6 user-chosen keys — so developers can see every setting without reading source code.

**Architecture:** Add a single JS function `buildNewProjectConfig(cwd, userChoices)` in `config.cjs` as the one source of truth for a new project's full config. Expose it as a CLI command `config-new-project`. Update the `new-project.md` workflow to call this command instead of writing a partial JSON inline.

**Tech Stack:** Node.js/CommonJS, existing gsd-tools CLI, `node:test` for tests.

---
## Background: what exists today

`new-project.md` Step 5 writes this partial config (the AI fills the template):

```json
{
  "mode": "...", "granularity": "...", "parallelization": "...",
  "commit_docs": "...", "model_profile": "...",
  "workflow": { "research", "plan_check", "verifier", "nyquist_validation" }
}
```

Missing keys are silently resolved by `loadConfig()` at runtime:

- `search_gitignored: false`
- `brave_search: false` (or env-detected `true`)
- `git.branching_strategy: "none"`
- `git.phase_branch_template: "gsd/phase-{phase}-{slug}"`
- `git.milestone_branch_template: "gsd/{milestone}-{slug}"`
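That runtime resolution looks roughly like this (a minimal sketch, not the actual `loadConfig()` source; the key names are taken from the list above):

```javascript
// Sketch: how runtime defaulting hides keys from the on-disk config.
// The file may contain only the user-chosen keys; everything else is
// filled in memory, so it never becomes visible in config.json.
const onDisk = { mode: 'yolo', granularity: 'standard' };

function resolve(config) {
  return {
    search_gitignored: false,
    brave_search: false,
    ...config,
    git: {
      branching_strategy: 'none',
      phase_branch_template: 'gsd/phase-{phase}-{slug}',
      milestone_branch_template: 'gsd/{milestone}-{slug}',
      ...(config.git || {}),
    },
  };
}

const effective = resolve(onDisk);
console.log(Object.keys(onDisk).length);    // 2 keys visible on disk
console.log(Object.keys(effective).length); // 5 keys actually in effect
```

The gap between the two key counts is exactly the discoverability problem this plan fixes.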
Full config that should exist from the start:

```json
{
  "mode": "yolo|interactive",
  "granularity": "coarse|standard|fine",
  "model_profile": "balanced",
  "commit_docs": true,
  "parallelization": true,
  "search_gitignored": false,
  "brave_search": false,
  "git": {
    "branching_strategy": "none",
    "phase_branch_template": "gsd/phase-{phase}-{slug}",
    "milestone_branch_template": "gsd/{milestone}-{slug}"
  },
  "workflow": {
    "research": true,
    "plan_check": true,
    "verifier": true,
    "nyquist_validation": true
  }
}
```

---
## File map

| File | Action | Purpose |
|------|--------|---------|
| `get-shit-done/bin/lib/config.cjs` | Modify | Add `buildNewProjectConfig()` + `cmdConfigNewProject()` |
| `get-shit-done/bin/gsd-tools.cjs` | Modify | Register `config-new-project` case + update usage string |
| `get-shit-done/workflows/new-project.md` | Modify | Steps 2a + 5: replace inline JSON write with CLI call |
| `tests/config.test.cjs` | Modify | Add `config-new-project` test suite |

---
## Task 1: Add `buildNewProjectConfig` and `cmdConfigNewProject` to config.cjs

**Files:**

- Modify: `get-shit-done/bin/lib/config.cjs`

- [ ] **Step 1.1: Write the failing tests first**

Add to `tests/config.test.cjs` (after the `config-get` suite, before `module.exports`):
```js
// ─── config-new-project ──────────────────────────────────────────────────────

describe('config-new-project command', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempProject();
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('creates full config with all expected top-level and nested keys', () => {
    const choices = JSON.stringify({
      mode: 'interactive',
      granularity: 'standard',
      parallelization: true,
      commit_docs: true,
      model_profile: 'balanced',
      workflow: { research: true, plan_check: true, verifier: true, nyquist_validation: true },
    });
    const result = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const config = readConfig(tmpDir);

    // User choices present
    assert.strictEqual(config.mode, 'interactive');
    assert.strictEqual(config.granularity, 'standard');
    assert.strictEqual(config.parallelization, true);
    assert.strictEqual(config.commit_docs, true);
    assert.strictEqual(config.model_profile, 'balanced');

    // Defaults materialized
    assert.strictEqual(typeof config.search_gitignored, 'boolean');
    assert.strictEqual(typeof config.brave_search, 'boolean');

    // git section present with all three keys
    assert.ok(config.git && typeof config.git === 'object', 'git section should exist');
    assert.strictEqual(config.git.branching_strategy, 'none');
    assert.strictEqual(config.git.phase_branch_template, 'gsd/phase-{phase}-{slug}');
    assert.strictEqual(config.git.milestone_branch_template, 'gsd/{milestone}-{slug}');

    // workflow section present with all four keys
    assert.ok(config.workflow && typeof config.workflow === 'object', 'workflow section should exist');
    assert.strictEqual(config.workflow.research, true);
    assert.strictEqual(config.workflow.plan_check, true);
    assert.strictEqual(config.workflow.verifier, true);
    assert.strictEqual(config.workflow.nyquist_validation, true);
  });

  test('user choices override defaults', () => {
    const choices = JSON.stringify({
      mode: 'yolo',
      granularity: 'coarse',
      parallelization: false,
      commit_docs: false,
      model_profile: 'quality',
      workflow: { research: false, plan_check: false, verifier: true, nyquist_validation: false },
    });
    const result = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const config = readConfig(tmpDir);
    assert.strictEqual(config.mode, 'yolo');
    assert.strictEqual(config.granularity, 'coarse');
    assert.strictEqual(config.parallelization, false);
    assert.strictEqual(config.commit_docs, false);
    assert.strictEqual(config.model_profile, 'quality');
    assert.strictEqual(config.workflow.research, false);
    assert.strictEqual(config.workflow.plan_check, false);
    assert.strictEqual(config.workflow.verifier, true);
    assert.strictEqual(config.workflow.nyquist_validation, false);
    // Defaults still present for non-chosen keys
    assert.strictEqual(config.git.branching_strategy, 'none');
    assert.strictEqual(typeof config.search_gitignored, 'boolean');
  });

  test('works with empty choices — all defaults materialized', () => {
    const result = runGsdTools(['config-new-project', '{}'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const config = readConfig(tmpDir);
    assert.strictEqual(config.model_profile, 'balanced');
    assert.strictEqual(config.commit_docs, true);
    assert.strictEqual(config.parallelization, true);
    assert.strictEqual(config.search_gitignored, false);
    assert.ok(config.git && typeof config.git === 'object');
    assert.strictEqual(config.git.branching_strategy, 'none');
    assert.ok(config.workflow && typeof config.workflow === 'object');
    assert.strictEqual(config.workflow.nyquist_validation, true);
  });

  test('is idempotent — returns already_exists if config exists', () => {
    // First call: create
    const choices = JSON.stringify({ mode: 'yolo', granularity: 'fine' });
    const first = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(first.success, `First call failed: ${first.error}`);
    const firstOut = JSON.parse(first.output);
    assert.strictEqual(firstOut.created, true);

    // Second call: idempotent
    const second = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(second.success, `Second call failed: ${second.error}`);
    const secondOut = JSON.parse(second.output);
    assert.strictEqual(secondOut.created, false);
    assert.strictEqual(secondOut.reason, 'already_exists');

    // Config unchanged
    const config = readConfig(tmpDir);
    assert.strictEqual(config.mode, 'yolo');
    assert.strictEqual(config.granularity, 'fine');
  });

  test('auto_advance in workflow choices is preserved', () => {
    const choices = JSON.stringify({
      mode: 'yolo',
      granularity: 'standard',
      workflow: { research: true, plan_check: true, verifier: true, nyquist_validation: true, auto_advance: true },
    });
    const result = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const config = readConfig(tmpDir);
    assert.strictEqual(config.workflow.auto_advance, true);
  });

  test('rejects invalid JSON choices', () => {
    const result = runGsdTools(['config-new-project', '{not-json}'], tmpDir);
    assert.strictEqual(result.success, false);
    assert.ok(result.error.includes('Invalid JSON'), `Expected "Invalid JSON" in: ${result.error}`);
  });

  test('output JSON has created:true on success', () => {
    const choices = JSON.stringify({ mode: 'interactive', granularity: 'standard' });
    const result = runGsdTools(['config-new-project', choices], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);
    const out = JSON.parse(result.output);
    assert.strictEqual(out.created, true);
    assert.strictEqual(out.path, '.planning/config.json');
  });
});
```
- [ ] **Step 1.2: Run failing tests to confirm they fail**

```bash
cd /Users/diego/Dev/get-shit-done
node --test tests/config.test.cjs 2>&1 | grep -E "config-new-project|FAIL|Error"
```

Expected: All `config-new-project` tests fail with "config-new-project is not a valid command" or similar.

- [ ] **Step 1.3: Implement `buildNewProjectConfig` and `cmdConfigNewProject` in config.cjs**

In `get-shit-done/bin/lib/config.cjs`, add the following after the `validateKnownConfigKeyPath` function (around line 35) and before `ensureConfigFile`:
```js
/**
 * Build a fully-materialized config for a new project.
 *
 * Merges (in order of increasing priority):
 *   1. Hardcoded defaults
 *   2. User-level defaults from ~/.gsd/defaults.json (if present)
 *   3. userChoices (the settings the user explicitly selected during new-project)
 *
 * Returns a plain object — does NOT write any files.
 */
function buildNewProjectConfig(cwd, userChoices) {
  const choices = userChoices || {};
  const homedir = require('os').homedir();

  // Detect Brave Search API key availability
  const braveKeyFile = path.join(homedir, '.gsd', 'brave_api_key');
  const hasBraveSearch = !!(process.env.BRAVE_API_KEY || fs.existsSync(braveKeyFile));

  // Load user-level defaults from ~/.gsd/defaults.json if available
  const globalDefaultsPath = path.join(homedir, '.gsd', 'defaults.json');
  let userDefaults = {};
  try {
    if (fs.existsSync(globalDefaultsPath)) {
      userDefaults = JSON.parse(fs.readFileSync(globalDefaultsPath, 'utf-8'));
      // Migrate deprecated "depth" key to "granularity"
      if ('depth' in userDefaults && !('granularity' in userDefaults)) {
        const depthToGranularity = { quick: 'coarse', standard: 'standard', comprehensive: 'fine' };
        userDefaults.granularity = depthToGranularity[userDefaults.depth] || userDefaults.depth;
        delete userDefaults.depth;
        try {
          fs.writeFileSync(globalDefaultsPath, JSON.stringify(userDefaults, null, 2), 'utf-8');
        } catch {}
      }
    }
  } catch {
    // Ignore malformed global defaults
  }

  const hardcoded = {
    model_profile: 'balanced',
    commit_docs: true,
    parallelization: true,
    search_gitignored: false,
    brave_search: hasBraveSearch,
    git: {
      branching_strategy: 'none',
      phase_branch_template: 'gsd/phase-{phase}-{slug}',
      milestone_branch_template: 'gsd/{milestone}-{slug}',
    },
    workflow: {
      research: true,
      plan_check: true,
      verifier: true,
      nyquist_validation: true,
    },
  };

  // Three-level merge: hardcoded <- userDefaults <- choices
  return {
    ...hardcoded,
    ...userDefaults,
    ...choices,
    git: {
      ...hardcoded.git,
      ...(userDefaults.git || {}),
      ...(choices.git || {}),
    },
    workflow: {
      ...hardcoded.workflow,
      ...(userDefaults.workflow || {}),
      ...(choices.workflow || {}),
    },
  };
}

/**
 * Command: create a fully-materialized .planning/config.json for a new project.
 *
 * Accepts user-chosen settings as a JSON string (the keys the user explicitly
 * configured during /gsd:new-project). All remaining keys are filled from
 * hardcoded defaults and optional ~/.gsd/defaults.json.
 *
 * Idempotent: if config.json already exists, returns { created: false }.
 */
function cmdConfigNewProject(cwd, choicesJson, raw) {
  const configPath = path.join(cwd, '.planning', 'config.json');
  const planningDir = path.join(cwd, '.planning');

  // Idempotent: don't overwrite existing config
  if (fs.existsSync(configPath)) {
    output({ created: false, reason: 'already_exists' }, raw, 'exists');
    return;
  }

  // Parse user choices
  let userChoices = {};
  if (choicesJson && choicesJson.trim() !== '') {
    try {
      userChoices = JSON.parse(choicesJson);
    } catch (err) {
      error('Invalid JSON for config-new-project: ' + err.message);
    }
  }

  // Ensure .planning directory exists
  try {
    if (!fs.existsSync(planningDir)) {
      fs.mkdirSync(planningDir, { recursive: true });
    }
  } catch (err) {
    error('Failed to create .planning directory: ' + err.message);
  }

  const config = buildNewProjectConfig(cwd, userChoices);

  try {
    fs.writeFileSync(configPath, JSON.stringify(config, null, 2), 'utf-8');
    output({ created: true, path: '.planning/config.json' }, raw, 'created');
  } catch (err) {
    error('Failed to write config.json: ' + err.message);
  }
}
```
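A quick standalone check of the three-level precedence (this mirrors the merge expression above rather than importing the real module):

```javascript
// Standalone sketch of the hardcoded <- userDefaults <- choices precedence.
const hardcoded    = { model_profile: 'balanced', commit_docs: true, parallelization: true };
const userDefaults = { model_profile: 'quality' };                    // from ~/.gsd/defaults.json
const choices      = { model_profile: 'budget', commit_docs: false }; // explicit user selections

const merged = { ...hardcoded, ...userDefaults, ...choices };

console.log(merged.model_profile);   // 'budget': the explicit choice wins over both defaults
console.log(merged.commit_docs);     // false: the choice overrides the hardcoded default
console.log(merged.parallelization); // true: an untouched hardcoded default survives
```

The explicit `git` and `workflow` spreads in the real function exist because object spread is shallow: without them, a user-level `git` object would replace the hardcoded one wholesale instead of merging key by key.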
Also add `cmdConfigNewProject` to the `module.exports` at the bottom of `config.cjs`.
- [ ] **Step 1.4: Run tests to verify they pass**

```bash
cd /Users/diego/Dev/get-shit-done
node --test tests/config.test.cjs 2>&1 | tail -20
```

Expected: All `config-new-project` tests pass. Existing tests still pass.

- [ ] **Step 1.5: Commit**

```bash
cd /Users/diego/Dev/get-shit-done
git add get-shit-done/bin/lib/config.cjs tests/config.test.cjs
git commit -m "feat: add config-new-project command for full config materialization"
```
---

## Task 2: Register `config-new-project` in gsd-tools.cjs

**Files:**

- Modify: `get-shit-done/bin/gsd-tools.cjs`

- [ ] **Step 2.1: Add the case to the switch in gsd-tools.cjs**

After the `config-get` case (around line 401), add:

```js
case 'config-new-project': {
  config.cmdConfigNewProject(cwd, args[1], raw);
  break;
}
```

Also update the usage string on line 178 to include `config-new-project`:

Current: `...config-ensure-section, init`
New: `...config-ensure-section, config-new-project, init`
- [ ] **Step 2.2: Smoke-test the CLI registration**

```bash
cd /Users/diego/Dev/get-shit-done
node get-shit-done/bin/gsd-tools.cjs config-new-project '{"mode":"interactive","granularity":"standard"}' --cwd /tmp/gsd-smoke-$(date +%s)
```

Expected: outputs `{"created":true,"path":".planning/config.json"}` (or similar).

Clean up: `rm -rf /tmp/gsd-smoke-*`

- [ ] **Step 2.3: Run full test suite**

```bash
cd /Users/diego/Dev/get-shit-done
node --test tests/config.test.cjs 2>&1 | tail -10
```

Expected: All pass.

- [ ] **Step 2.4: Commit**

```bash
cd /Users/diego/Dev/get-shit-done
git add get-shit-done/bin/gsd-tools.cjs
git commit -m "feat: register config-new-project in gsd-tools CLI router"
```
---

## Task 3: Update new-project.md workflow to use config-new-project

**Files:**

- Modify: `get-shit-done/workflows/new-project.md`

This is the core change. Two places need updating:

- **Step 2a** (auto mode config creation, around line 168–195)
- **Step 5** (interactive mode config creation, around line 470–498)

- [ ] **Step 3.1: Update Step 2a (auto mode)**

Find the block in Step 2a that creates config.json:
````markdown
Create `.planning/config.json` with mode set to "yolo":

```json
{
  "mode": "yolo",
  "granularity": "[selected]",
  ...
}
```
````

Replace the inline JSON write instruction with:

````markdown
Create `.planning/config.json` using the CLI (fills in all defaults automatically):

```bash
mkdir -p .planning
node "$HOME/.claude/get-shit-done/bin/gsd-tools.cjs" config-new-project "$(cat <<'CHOICES'
{
  "mode": "yolo",
  "granularity": "[selected: coarse|standard|fine]",
  "parallelization": [true|false],
  "commit_docs": [true|false],
  "model_profile": "[selected: quality|balanced|budget|inherit]",
  "workflow": {
    "research": [true|false],
    "plan_check": [true|false],
    "verifier": [true|false],
    "nyquist_validation": [true|false],
    "auto_advance": true
  }
}
CHOICES
)"
```

The command merges your selections with all runtime defaults (`search_gitignored`, `brave_search`, `git` section), producing a fully-materialized config.
````
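The `"$(cat <<'CHOICES' … )"` pattern above passes the multi-line JSON to the CLI as a single argument; quoting the heredoc delimiter suppresses `$`-expansion, so template placeholders and dollar signs reach the command verbatim. A minimal demonstration (the delimiter name here is arbitrary):

```shell
# With a quoted delimiter ('EOF'), the heredoc body is taken literally:
# no parameter expansion, so "$HOME" and {slug} survive unchanged.
arg="$(cat <<'EOF'
{"template": "gsd/phase-{phase}-{slug}", "home": "$HOME"}
EOF
)"
echo "$arg"
```

With an unquoted delimiter, `$HOME` would expand inside the body, corrupting the JSON the CLI receives.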
- [ ] **Step 3.2: Update Step 5 (interactive mode)**

Find the block in Step 5 that creates config.json:

````markdown
Create `.planning/config.json` with all settings:

```json
{
  "mode": "yolo|interactive",
  ...
}
```
````

Replace with:
````markdown
Create `.planning/config.json` using the CLI (fills in all defaults automatically):

```bash
mkdir -p .planning
node "$HOME/.claude/get-shit-done/bin/gsd-tools.cjs" config-new-project "$(cat <<'CHOICES'
{
  "mode": "[selected: yolo|interactive]",
  "granularity": "[selected: coarse|standard|fine]",
  "parallelization": [true|false],
  "commit_docs": [true|false],
  "model_profile": "[selected: quality|balanced|budget|inherit]",
  "workflow": {
    "research": [true|false],
    "plan_check": [true|false],
    "verifier": [true|false],
    "nyquist_validation": [true|false]
  }
}
CHOICES
)"
```

The command merges your selections with all runtime defaults (`search_gitignored`, `brave_search`, `git` section), producing a fully-materialized config.
````
- [ ] **Step 3.3: Verify the workflow file reads correctly**

```bash
cd /Users/diego/Dev/get-shit-done
grep -n "config-new-project\|config\.json\|CHOICES" get-shit-done/workflows/new-project.md
```

Expected: 2 occurrences of `config-new-project` (one per step), no more inline JSON templates for config creation.

- [ ] **Step 3.4: Commit**

```bash
cd /Users/diego/Dev/get-shit-done
git add get-shit-done/workflows/new-project.md
git commit -m "feat: use config-new-project in new-project workflow for full config materialization"
```

---
## Task 4: Validation

- [ ] **Step 4.1: Run the full test suite**

```bash
cd /Users/diego/Dev/get-shit-done
node --test tests/ 2>&1 | tail -30
```

Expected: All tests pass (no regressions).
- [ ] **Step 4.2: Manual end-to-end validation**

Simulate what `new-project.md` does for a new project:

```bash
# Create a fresh project dir
TMP=$(mktemp -d)
cd "$TMP"

# Step 1 simulation: what init new-project returns
node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs init new-project --cwd "$TMP"

# Step 5 simulation: create full config
node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-new-project '{
  "mode": "interactive",
  "granularity": "standard",
  "parallelization": true,
  "commit_docs": true,
  "model_profile": "balanced",
  "workflow": {
    "research": true,
    "plan_check": true,
    "verifier": true,
    "nyquist_validation": true
  }
}' --cwd "$TMP"

# Verify the file has all expected keys
echo "=== Generated config.json ==="
cat "$TMP/.planning/config.json"

# Clean up
rm -rf "$TMP"
```

Expected output: a config.json with `mode`, `granularity`, `model_profile`, `commit_docs`, `parallelization`, `search_gitignored`, `brave_search`, `git` (3 sub-keys), and `workflow` (4 sub-keys): 9 top-level keys, 16 keys in total counting the `git` and `workflow` sub-keys.
- [ ] **Step 4.3: Verify idempotency**

```bash
TMP=$(mktemp -d)
CHOICES='{"mode":"yolo","granularity":"coarse"}'

node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-new-project "$CHOICES" --cwd "$TMP"
FIRST=$(cat "$TMP/.planning/config.json")

# Second call should be no-op
node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-new-project "$CHOICES" --cwd "$TMP"
SECOND=$(cat "$TMP/.planning/config.json")

[ "$FIRST" = "$SECOND" ] && echo "IDEMPOTENT: OK" || echo "IDEMPOTENT: FAIL"
rm -rf "$TMP"
```

Expected: `IDEMPOTENT: OK`
- [ ] **Step 4.4: Verify loadConfig still reads the new format correctly**

```bash
TMP=$(mktemp -d)
node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-new-project '{
  "mode":"yolo","granularity":"standard","parallelization":true,"commit_docs":true,
  "model_profile":"balanced",
  "workflow":{"research":true,"plan_check":false,"verifier":true,"nyquist_validation":true}
}' --cwd "$TMP"

# loadConfig should correctly read plan_check (nested as workflow.plan_check)
node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-get workflow.plan_check --cwd "$TMP"
# Expected: false

node /Users/diego/Dev/get-shit-done/get-shit-done/bin/gsd-tools.cjs config-get git.branching_strategy --cwd "$TMP"
# Expected: "none"

rm -rf "$TMP"
```

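`config-get` accepts dotted paths like `workflow.plan_check`. A sketch of how such a lookup can resolve against a nested config object; `getByPath` is illustrative, not the gsd-tools source:

```javascript
// Resolve a dotted path like "workflow.plan_check" against a nested object,
// returning undefined if any segment along the way is missing.
function getByPath(obj, dotted) {
  return dotted.split('.').reduce(
    (cur, key) => (cur && typeof cur === 'object' ? cur[key] : undefined),
    obj,
  );
}

const config = {
  workflow: { plan_check: false },
  git: { branching_strategy: 'none' },
};
console.log(getByPath(config, 'workflow.plan_check'));    // false
console.log(getByPath(config, 'git.branching_strategy')); // none
console.log(getByPath(config, 'git.missing.deeper'));     // undefined
```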
- [ ] **Step 4.5: Final full test suite + commit**

```bash
cd /Users/diego/Dev/get-shit-done
node --test tests/ 2>&1 | grep -E "pass|fail|error" | tail -5
```

Expected: All pass, 0 failures.

---

## Appendix: PR description for upstream

```
feat: materialize all config defaults at new-project initialization

**Problem:**
`/gsd:new-project` creates `.planning/config.json` with only the 6 keys
the user explicitly chose during onboarding. Five additional keys
(`search_gitignored`, `brave_search`, `git.branching_strategy`,
`git.phase_branch_template`, `git.milestone_branch_template`) are resolved
silently by `loadConfig()` at runtime but never written to disk.

This creates two problems:
1. **Discoverability**: users can't see or understand `git.branching_strategy`
   without reading source code — it doesn't appear in their config.
2. **Implicit expansion**: the first time `/gsd:settings` or `config-set`
   writes to the config, those keys still aren't added. The config only
   reflects a fraction of the effective configuration.

**Solution:**
Add a `config-new-project` CLI command to `gsd-tools.cjs`. The command:
- Accepts user-chosen values as JSON
- Merges them with all runtime defaults (including env-detected `brave_search`)
- Writes the fully materialized config in one shot

Update the `new-project.md` workflow (Steps 2a and 5) to call this command
instead of writing a hardcoded partial JSON template. Defaults now live in
exactly one place: `buildNewProjectConfig()` in `config.cjs`.

**Why this is conservative:**
- No changes to `loadConfig()`, `ensureConfigFile()`, or any read path
- No new config keys introduced
- No semantic changes — same values the system was already resolving silently
- Fully backward-compatible: `loadConfig()` continues to handle both the old
  partial format (existing projects) and the new full format
- Idempotent: calling `config-new-project` twice is safe
- No new user-facing flags

**Why this improves discoverability:**
A developer opening `.planning/config.json` for the first time can now see
`git.branching_strategy: "none"` and immediately understand that branching
is available and configurable, without reading the GSD source.
```

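The merge the Solution section describes (user choices layered over runtime defaults) can be sketched like this; `materializeConfig` is a hypothetical simplification of `buildNewProjectConfig()`, not its source:

```javascript
// User-chosen values win over defaults; nested sections merge key-by-key so a
// partial "workflow" choice keeps the remaining workflow defaults intact.
function materializeConfig(userChoices, runtimeDefaults) {
  return {
    ...runtimeDefaults,
    ...userChoices,
    workflow: { ...runtimeDefaults.workflow, ...(userChoices.workflow || {}) },
    git: { ...runtimeDefaults.git, ...(userChoices.git || {}) },
  };
}

const full = materializeConfig(
  { mode: 'yolo', workflow: { plan_check: false } },
  {
    mode: 'interactive',
    brave_search: false,
    workflow: { plan_check: true, research: true },
    git: { branching_strategy: 'none' },
  },
);
console.log(full.mode);                   // yolo
console.log(full.workflow.plan_check);    // false
console.log(full.workflow.research);      // true
console.log(full.git.branching_strategy); // none
```

Because every default is spread into the result, the written file carries the full effective configuration, which is exactly the discoverability point the PR text makes.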
@@ -13,7 +13,7 @@ const {

 const VALID_CONFIG_KEYS = new Set([
   'mode', 'granularity', 'parallelization', 'commit_docs', 'model_profile',
-  'search_gitignored', 'brave_search',
+  'search_gitignored', 'brave_search', 'firecrawl', 'exa_search',
   'workflow.research', 'workflow.plan_check', 'workflow.verifier',
   'workflow.nyquist_validation', 'workflow.ui_phase', 'workflow.ui_safety_gate',
   'workflow.auto_advance', 'workflow.node_repair', 'workflow.node_repair_budget',

@@ -55,9 +55,13 @@ function buildNewProjectConfig(userChoices) {
   const choices = userChoices || {};
   const homedir = require('os').homedir();

-  // Detect Brave Search API key availability
+  // Detect API key availability
   const braveKeyFile = path.join(homedir, '.gsd', 'brave_api_key');
   const hasBraveSearch = !!(process.env.BRAVE_API_KEY || fs.existsSync(braveKeyFile));
+  const firecrawlKeyFile = path.join(homedir, '.gsd', 'firecrawl_api_key');
+  const hasFirecrawl = !!(process.env.FIRECRAWL_API_KEY || fs.existsSync(firecrawlKeyFile));
+  const exaKeyFile = path.join(homedir, '.gsd', 'exa_api_key');
+  const hasExaSearch = !!(process.env.EXA_API_KEY || fs.existsSync(exaKeyFile));

   // Load user-level defaults from ~/.gsd/defaults.json if available
   const globalDefaultsPath = path.join(homedir, '.gsd', 'defaults.json');

@@ -85,6 +89,8 @@ function buildNewProjectConfig(userChoices) {
     parallelization: true,
     search_gitignored: false,
     brave_search: hasBraveSearch,
+    firecrawl: hasFirecrawl,
+    exa_search: hasExaSearch,
     git: {
       branching_strategy: 'none',
       phase_branch_template: 'gsd/phase-{phase}-{slug}',

@@ -161,6 +161,8 @@ function loadConfig(cwd) {
     nyquist_validation: true,
     parallelization: true,
     brave_search: false,
+    firecrawl: false,
+    exa_search: false,
     text_mode: false, // when true, use plain-text numbered lists instead of AskUserQuestion menus
     sub_repos: [],
     resolve_model_ids: false, // when true, resolve aliases (opus/sonnet/haiku) to full model IDs

@@ -242,6 +244,8 @@ function loadConfig(cwd) {
     nyquist_validation: get('nyquist_validation', { section: 'workflow', field: 'nyquist_validation' }) ?? defaults.nyquist_validation,
     parallelization,
     brave_search: get('brave_search') ?? defaults.brave_search,
+    firecrawl: get('firecrawl') ?? defaults.firecrawl,
+    exa_search: get('exa_search') ?? defaults.exa_search,
     text_mode: get('text_mode', { section: 'workflow', field: 'text_mode' }) ?? defaults.text_mode,
     sub_repos: get('sub_repos', { section: 'planning', field: 'sub_repos' }) ?? defaults.sub_repos,
     resolve_model_ids: get('resolve_model_ids') ?? defaults.resolve_model_ids,

@@ -196,6 +196,14 @@ function cmdInitNewProject(cwd, raw) {
   const braveKeyFile = path.join(homedir, '.gsd', 'brave_api_key');
   const hasBraveSearch = !!(process.env.BRAVE_API_KEY || fs.existsSync(braveKeyFile));

+  // Detect Firecrawl API key availability
+  const firecrawlKeyFile = path.join(homedir, '.gsd', 'firecrawl_api_key');
+  const hasFirecrawl = !!(process.env.FIRECRAWL_API_KEY || fs.existsSync(firecrawlKeyFile));
+
+  // Detect Exa API key availability
+  const exaKeyFile = path.join(homedir, '.gsd', 'exa_api_key');
+  const hasExaSearch = !!(process.env.EXA_API_KEY || fs.existsSync(exaKeyFile));
+
   // Detect existing code (cross-platform — no Unix `find` dependency)
   let hasCode = false;
   let hasPackageFile = false;

@@ -248,6 +256,8 @@ function cmdInitNewProject(cwd, raw) {

     // Enhanced search
     brave_search_available: hasBraveSearch,
+    firecrawl_available: hasFirecrawl,
+    exa_search_available: hasExaSearch,

     // File paths
     project_path: '.planning/PROJECT.md',

@@ -474,6 +484,8 @@ function cmdInitPhaseOp(cwd, phase, raw) {
     // Config
     commit_docs: config.commit_docs,
     brave_search: config.brave_search,
+    firecrawl: config.firecrawl,
+    exa_search: config.exa_search,

     // Phase info
     phase_found: !!phaseInfo,

@@ -12,7 +12,7 @@ Read all files referenced by the invoking prompt's execution_context before starting.
 Gather project statistics:

 ```bash
-STATS=$(node "$GSD_TOOLS" stats json)
+STATS=$(node "$HOME/.claude/get-shit-done/bin/gsd-tools.cjs" stats json)
 if [[ "$STATS" == @file:* ]]; then STATS=$(cat "${STATS#@file:}"); fi
 ```

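The `get('key') ?? defaults.key` pattern in the loadConfig hunks above is worth noting: `??` falls back only on `null`/`undefined`, so an explicit `false` in the config file survives rather than being replaced by the default. A small standalone sketch of that behavior (the `get` helper here is illustrative):

```javascript
const defaults = { brave_search: false, firecrawl: false, exa_search: false };
const fileConfig = { brave_search: true }; // user enabled one integration

// Illustrative stand-in for the real get(): undefined when the key is absent.
const get = (key) => fileConfig[key];

const resolved = {
  brave_search: get('brave_search') ?? defaults.brave_search,
  firecrawl: get('firecrawl') ?? defaults.firecrawl,
  exa_search: get('exa_search') ?? defaults.exa_search,
};
console.log(resolved); // { brave_search: true, firecrawl: false, exa_search: false }
```

With `||` instead of `??`, a stored `false` would be indistinguishable from "not set", which is why the nullish operator is the right choice here.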
2
package-lock.json
generated
@@ -16,7 +16,7 @@
       "esbuild": "^0.24.0"
     },
     "engines": {
-      "node": ">=16.7.0"
+      "node": ">=20.0.0"
     }
   },
   "node_modules/@bcoe/v8-coverage": {

@@ -21,10 +21,57 @@ const {
  generateCodexConfigBlock,
  stripGsdFromCodexConfig,
  mergeCodexConfig,
  install,
  GSD_CODEX_MARKER,
  CODEX_AGENT_SANDBOX,
} = require('../bin/install.js');

function runCodexInstall(codexHome, cwd = path.join(__dirname, '..')) {
  const previousCodeHome = process.env.CODEX_HOME;
  const previousCwd = process.cwd();
  process.env.CODEX_HOME = codexHome;

  try {
    process.chdir(cwd);
    return install(true, 'codex');
  } finally {
    process.chdir(previousCwd);
    if (previousCodeHome === undefined) {
      delete process.env.CODEX_HOME;
    } else {
      process.env.CODEX_HOME = previousCodeHome;
    }
  }
}

function readCodexConfig(codexHome) {
  return fs.readFileSync(path.join(codexHome, 'config.toml'), 'utf8');
}

function writeCodexConfig(codexHome, content) {
  fs.mkdirSync(codexHome, { recursive: true });
  fs.writeFileSync(path.join(codexHome, 'config.toml'), content, 'utf8');
}

function countMatches(content, pattern) {
  return (content.match(pattern) || []).length;
}

function assertNoDraftRootKeys(content) {
  assert.ok(!content.includes('model = "gpt-5.4"'), 'does not inject draft model default');
  assert.ok(!content.includes('model_reasoning_effort = "high"'), 'does not inject draft reasoning default');
  assert.ok(!content.includes('disable_response_storage = true'), 'does not inject draft storage default');
}

function assertUsesOnlyEol(content, eol) {
  if (eol === '\r\n') {
    assert.ok(content.includes('\r\n'), 'contains CRLF line endings');
    assert.ok(!content.replace(/\r\n/g, '').includes('\n'), 'does not contain bare LF line endings');
    return;
  }
  assert.ok(!content.includes('\r\n'), 'does not contain CRLF line endings');
}

// ─── getCodexSkillAdapterHeader ─────────────────────────────────────────────────

describe('getCodexSkillAdapterHeader', () => {

@@ -474,6 +521,77 @@ describe('mergeCodexConfig', () => {
    assert.ok(!beforeMarker.includes('[agents.gsd-'), 'no leaked [agents.gsd-*] above marker');
  });

  test('case 2 strips leaked GSD-managed sections above marker in CRLF files', () => {
    const configPath = path.join(tmpDir, 'config.toml');
    const brokenContent = [
      '[features]',
      'child_agents_md = false',
      '',
      '[agents]',
      'max_threads = 4',
      '',
      '[agents.gsd-executor]',
      'description = "stale"',
      'config_file = "agents/gsd-executor.toml"',
      '',
      GSD_CODEX_MARKER,
      '',
      '[agents.gsd-executor]',
      'description = "Executes plans"',
      'config_file = "agents/gsd-executor.toml"',
      '',
    ].join('\r\n');
    fs.writeFileSync(configPath, brokenContent, 'utf8');

    mergeCodexConfig(configPath, sampleBlock);
    mergeCodexConfig(configPath, sampleBlock);

    const content = fs.readFileSync(configPath, 'utf8');
    const markerIndex = content.indexOf(GSD_CODEX_MARKER);
    const beforeMarker = content.slice(0, markerIndex);

    assert.ok(content.includes('child_agents_md = false'), 'preserves user feature keys');
    assert.strictEqual(countMatches(beforeMarker, /^\[agents\]\s*$/gm), 0, 'removes leaked [agents] above marker');
    assert.strictEqual(countMatches(beforeMarker, /^\[agents\.gsd-executor\]\s*$/gm), 0, 'removes leaked GSD agent section above marker');
    assert.strictEqual(countMatches(content, /^\[agents\.gsd-executor\]\s*$/gm), 1, 'keeps one managed agent section');
    assertUsesOnlyEol(content, '\r\n');
  });

  test('case 2 preserves user-authored [agents] tables while stripping leaked GSD sections in CRLF files', () => {
    const configPath = path.join(tmpDir, 'config.toml');
    const brokenContent = [
      '[features]',
      'child_agents_md = false',
      '',
      '[agents]',
      'default = "custom-agent"',
      '',
      '[agents.gsd-executor]',
      'description = "stale"',
      'config_file = "agents/gsd-executor.toml"',
      '',
      GSD_CODEX_MARKER,
      '',
      '[agents.gsd-executor]',
      'description = "Executes plans"',
      'config_file = "agents/gsd-executor.toml"',
      '',
    ].join('\r\n');
    fs.writeFileSync(configPath, brokenContent, 'utf8');

    mergeCodexConfig(configPath, sampleBlock);
    mergeCodexConfig(configPath, sampleBlock);

    const content = fs.readFileSync(configPath, 'utf8');
    const markerIndex = content.indexOf(GSD_CODEX_MARKER);
    const beforeMarker = content.slice(0, markerIndex);

    assert.ok(beforeMarker.includes('[agents]\r\ndefault = "custom-agent"\r\n'), 'preserves user-authored [agents] table');
    assert.strictEqual(countMatches(beforeMarker, /^\[agents\.gsd-executor\]\s*$/gm), 0, 'removes leaked GSD agent section above marker');
    assert.strictEqual(countMatches(content, /^\[agents\.gsd-executor\]\s*$/gm), 1, 'keeps one managed agent section in the GSD block');
    assertUsesOnlyEol(content, '\r\n');
  });

  test('case 2 idempotent after case 3 with existing [features]', () => {
    const configPath = path.join(tmpDir, 'config.toml');
    fs.writeFileSync(configPath, '[features]\nother_feature = true\n');

@@ -489,6 +607,29 @@ describe('mergeCodexConfig', () => {
    assert.strictEqual(first, second, 'idempotent after 2nd merge');
    assert.strictEqual(second, third, 'idempotent after 3rd merge');
  });

  test('preserves CRLF when appending GSD block to existing config', () => {
    const configPath = path.join(tmpDir, 'config.toml');
    fs.writeFileSync(configPath, '[model]\r\nname = "o3"\r\n', 'utf8');

    mergeCodexConfig(configPath, sampleBlock);

    const content = fs.readFileSync(configPath, 'utf8');
    assert.ok(content.includes('[model]\r\nname = "o3"\r\n'), 'preserves existing CRLF content');
    assert.ok(content.includes(`${GSD_CODEX_MARKER}\r\n`), 'writes marker with CRLF');
    assertUsesOnlyEol(content, '\r\n');
  });

  test('uses the first newline style when appending GSD block to mixed-EOL configs', () => {
    const configPath = path.join(tmpDir, 'config.toml');
    fs.writeFileSync(configPath, '# first line wins\n[model]\r\nname = "o3"\r\n', 'utf8');

    mergeCodexConfig(configPath, sampleBlock);

    const content = fs.readFileSync(configPath, 'utf8');
    assert.ok(content.includes('# first line wins\n[model]\r\nname = "o3"'), 'preserves the existing mixed-EOL model content');
    assert.ok(content.includes(`\n\n${GSD_CODEX_MARKER}\n`), 'writes the managed block using the first newline style');
  });
});

// ─── Integration: installCodexConfig ────────────────────────────────────────────

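The EOL tests above pin down a "first newline style wins" rule for mixed-EOL files. A sketch of how such detection can work, inferred from the test names and assertions rather than from the install.js source:

```javascript
// Decide which EOL style an appended block should use: look at the first
// newline in the file; if it is preceded by '\r', the file leads with CRLF.
// (Behavior inferred from the mixed-EOL tests above; illustrative only.)
function detectEol(content) {
  const idx = content.indexOf('\n');
  if (idx === -1) return '\n'; // no newline yet: default to LF
  return content[idx - 1] === '\r' ? '\r\n' : '\n';
}

console.log(JSON.stringify(detectEol('[model]\r\nname = "o3"\r\n')));     // "\r\n"
console.log(JSON.stringify(detectEol('# first line wins\n[model]\r\n'))); // "\n"
```

This explains the mixed-EOL assertion: a file whose first line ends in bare LF gets the managed block appended with LF, even though later lines use CRLF.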
@@ -572,3 +713,709 @@ describe('codex features section safety', () => {
|
||||
assert.strictEqual(nonBooleanKeys.length, 0, 'no non-boolean keys in a clean config');
|
||||
});
|
||||
});
|
||||
|
||||
describe('Codex install hook configuration (e2e)', () => {
|
||||
let tmpDir;
|
||||
let codexHome;
|
||||
|
||||
beforeEach(() => {
|
||||
tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-codex-e2e-'));
|
||||
codexHome = path.join(tmpDir, 'codex-home');
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
fs.rmSync(tmpDir, { recursive: true, force: true });
|
||||
});
|
||||
|
||||
test('fresh CODEX_HOME enables codex_hooks without draft root defaults', () => {
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.ok(content.includes('[features]\ncodex_hooks = true\n'), 'writes codex_hooks feature');
|
||||
assert.ok(content.includes('# GSD Hooks\n[[hooks]]\nevent = "SessionStart"\n'), 'writes GSD SessionStart hook block');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'writes one codex_hooks key');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'writes one GSD update hook');
|
||||
assertNoDraftRootKeys(content);
|
||||
assertUsesOnlyEol(content, '\n');
|
||||
});
|
||||
|
||||
test('existing LF config without [features] gets one features block and preserves user content', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'# user comment',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
'[[hooks]]',
|
||||
'event = "SessionStart"',
|
||||
'command = "echo custom"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'creates one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'creates one codex_hooks key');
|
||||
assert.ok(content.includes('# user comment'), 'preserves user comment');
|
||||
assert.ok(content.includes('[model]\nname = "o3"'), 'preserves model section');
|
||||
assert.ok(content.includes('command = "echo custom"'), 'preserves custom hook');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'adds one GSD update hook');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing CRLF config without [features] preserves CRLF and adds codex_hooks', () => {
|
||||
writeCodexConfig(codexHome, '# user comment\r\n[model]\r\nname = "o3"\r\n');
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'creates one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'creates one codex_hooks key');
|
||||
assert.ok(content.includes('# user comment\r\n[model]\r\nname = "o3"\r\n'), 'preserves existing CRLF content');
|
||||
assertUsesOnlyEol(content, '\r\n');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing CRLF [features] comment-only table gets codex_hooks without losing adjacent text', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'# user comment',
|
||||
'[features]',
|
||||
'# keep me',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\r\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds one codex_hooks key');
|
||||
assert.ok(content.includes('[features]\r\n# keep me\r\n\r\ncodex_hooks = true\r\n'), 'adds codex_hooks within comment-only table');
|
||||
assert.ok(content.includes('[model]\r\nname = "o3"\r\n'), 'preserves following table');
|
||||
assertUsesOnlyEol(content, '\r\n');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing [features] with trailing comment gets one codex_hooks without a second table', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'[features] # keep comment',
|
||||
'other_feature = true',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\s*\[features\](?:\s*#.*)?$/gm), 1, 'keeps one commented [features] header');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds one codex_hooks key');
|
||||
assert.ok(content.includes('[features] # keep comment\nother_feature = true'), 'preserves commented features table');
|
||||
assert.ok(content.indexOf('codex_hooks = true') > content.indexOf('[features] # keep comment'), 'adds codex_hooks within existing features table');
|
||||
assert.ok(content.indexOf('codex_hooks = true') < content.indexOf('[model]'), 'does not create a second features table before model');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing [features] at EOF without trailing newline is updated in place', () => {
|
||||
writeCodexConfig(codexHome, '[model]\nname = "o3"\n\n[features]');
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds one codex_hooks key');
|
||||
assert.ok(content.indexOf('codex_hooks = true') > content.indexOf('[features]'), 'adds codex_hooks after the existing EOF features header');
|
||||
assert.ok(content.indexOf('codex_hooks = true') < content.indexOf('[agents.gsd-codebase-mapper]'), 'keeps codex_hooks before the next real table');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing empty [features] and codex_hooks = false are normalized and remain idempotent', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'[features]',
|
||||
'codex_hooks = false',
|
||||
'other_feature = true',
|
||||
'',
|
||||
'[[hooks]]',
|
||||
'event = "SessionStart"',
|
||||
'command = "echo custom"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'normalizes to one codex_hooks = true');
|
||||
assert.ok(!content.includes('codex_hooks = false'), 'removes false codex_hooks value');
|
||||
assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
|
||||
assert.ok(content.includes('command = "echo custom"'), 'preserves custom hook');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'does not duplicate GSD update hook');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('quoted codex_hooks keys inside [features] are normalized without adding a bare duplicate', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'[features]',
|
||||
'"codex_hooks" = false',
|
||||
'other_feature = true',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^"codex_hooks" = true$/gm), 1, 'normalizes the quoted key to true');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 0, 'does not append a bare duplicate codex_hooks key');
|
||||
assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('quoted [features] headers are recognized as the existing features table', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'["features"]',
|
||||
'"codex_hooks" = false',
|
||||
'other_feature = true',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[(?:"features"|'features'|features)\]\s*$/gm), 1, 'keeps one features table');
|
||||
assert.strictEqual(countMatches(content, /^"codex_hooks" = true$/gm), 1, 'normalizes the quoted codex_hooks key to true');
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 0, 'does not prepend a second bare features table');
|
||||
assert.ok(content.includes('other_feature = true'), 'preserves existing feature keys');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'keeps one GSD update hook');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('quoted table headers containing # are parsed without treating # as a comment start', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'[features."a#b"]',
|
||||
'enabled = true',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.ok(content.includes('[features."a#b"]\nenabled = true'), 'preserves the quoted nested features table');
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'adds one real top-level features table');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds one codex_hooks key');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'remains idempotent for the GSD hook block');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing dotted features config stays dotted and does not grow a [features] table', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'features.other_feature = true',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 0, 'does not add a [features] table');
|
||||
assert.strictEqual(countMatches(content, /^features\.codex_hooks = true$/gm), 1, 'adds one dotted codex_hooks key');
|
||||
assert.ok(content.includes('features.other_feature = true'), 'preserves existing dotted features key');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'adds one GSD update hook for dotted codex_hooks and remains idempotent');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('root inline-table features assignments are left untouched without appending invalid dotted keys or hooks', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'features = { other_feature = true }',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.ok(content.includes('features = { other_feature = true }'), 'preserves the root inline-table assignment');
|
||||
assert.strictEqual(countMatches(content, /^features\.codex_hooks = true$/gm), 0, 'does not append an invalid dotted codex_hooks key');
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 0, 'does not prepend a features table');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 0, 'does not add the GSD hook block when codex_hooks cannot be enabled safely');
|
||||
assert.ok(content.includes('[agents.gsd-executor]'), 'still installs the managed agent block');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('root scalar features assignments are left untouched without appending invalid dotted keys or hooks', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'features = "disabled"',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.ok(content.includes('features = "disabled"'), 'preserves the root scalar assignment');
|
||||
assert.strictEqual(countMatches(content, /^features\.codex_hooks = true$/gm), 0, 'does not append an invalid dotted codex_hooks key');
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 0, 'does not prepend a features table');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 0, 'does not add the GSD hook block when codex_hooks cannot be enabled safely');
|
||||
assert.ok(content.includes('[agents.gsd-executor]'), 'still installs the managed agent block');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('quoted dotted codex_hooks keys stay dotted and are normalized without duplication', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'features."codex_hooks" = false',
|
||||
'features.other_feature = true',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 0, 'does not add a [features] table');
|
||||
assert.strictEqual(countMatches(content, /^features\."codex_hooks" = true$/gm), 1, 'normalizes the quoted dotted key to true');
|
||||
assert.strictEqual(countMatches(content, /^features\.codex_hooks = true$/gm), 0, 'does not append a bare dotted duplicate');
|
||||
assert.ok(content.includes('features.other_feature = true'), 'preserves other dotted features keys');
|
||||
assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'adds one GSD update hook for quoted dotted codex_hooks and remains idempotent');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('multiline dotted features assignments insert codex_hooks after the full assignment block', () => {
|
||||
writeCodexConfig(codexHome, [
|
||||
'features.notes = """',
|
||||
'keep-me',
|
||||
'"""',
|
||||
'',
|
||||
'[model]',
|
||||
'name = "o3"',
|
||||
'',
|
||||
].join('\n'));
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.ok(content.includes('features.notes = """\nkeep-me\n"""'), 'preserves the multiline dotted assignment');
|
||||
assert.strictEqual(countMatches(content, /^features\.codex_hooks = true$/gm), 1, 'adds one dotted codex_hooks key');
|
||||
assert.ok(content.indexOf('features.codex_hooks = true') > content.indexOf('"""'), 'inserts codex_hooks after the multiline assignment closes');
|
||||
assert.ok(content.indexOf('features.codex_hooks = true') < content.indexOf('[model]'), 'inserts codex_hooks before the next table');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
test('existing empty [features] table is populated with one codex_hooks key', () => {
|
||||
writeCodexConfig(codexHome, '[features]\r\n\r\n[model]\r\nname = "o3"\r\n');
|
||||
|
||||
runCodexInstall(codexHome);
|
||||
|
||||
const content = readCodexConfig(codexHome);
|
||||
assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
|
||||
assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds one codex_hooks key');
|
||||
assert.ok(content.includes('[features]\r\n\r\ncodex_hooks = true\r\n'), 'adds codex_hooks to empty table');
|
||||
assertUsesOnlyEol(content, '\r\n');
|
||||
assertNoDraftRootKeys(content);
|
||||
});
|
||||
|
||||
  test('multiline strings inside [features] do not create fake tables or fake codex_hooks matches', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'notes = \'\'\'',
      '[model]',
      'codex_hooks = false',
      '\'\'\'',
      'other_feature = true',
      '',
      '[[hooks]]',
      'event = "AfterCommand"',
      'command = "echo custom-after-command"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'adds a real codex_hooks key once');
    assert.ok(content.includes('notes = \'\'\'\n[model]\ncodex_hooks = false\n\'\'\''), 'preserves multiline string content');
    assert.strictEqual(countMatches(content, /^codex_hooks = false$/gm), 1, 'does not rewrite codex_hooks text inside multiline string');
    assert.ok(content.indexOf('codex_hooks = true') > content.indexOf('other_feature = true'), 'does not stop the features section at multiline string content');
    assert.ok(content.indexOf('codex_hooks = true') < content.indexOf('[[hooks]]'), 'inserts the real codex_hooks key before the next table');
    assertNoDraftRootKeys(content);
  });

  test('non-boolean codex_hooks assignments are normalized to true without duplication', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = "sometimes"',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'normalizes to one true value');
    assert.ok(!content.includes('codex_hooks = "sometimes"'), 'removes non-boolean value');
    assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
    assertNoDraftRootKeys(content);
  });

  test('multiline basic-string codex_hooks assignments are fully normalized without leaving trailing lines behind', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = """',
      'multiline-basic-sentinel',
      'still-in-string',
      '"""',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'replaces the multiline basic-string assignment with one true value');
    assert.ok(!content.includes('multiline-basic-sentinel'), 'removes multiline basic-string continuation lines');
    assert.ok(content.includes('other_feature = true'), 'preserves following feature keys');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'remains idempotent for the GSD hook block');
    assertNoDraftRootKeys(content);
  });

  test('multiline literal-string codex_hooks assignments are fully normalized without leaving trailing lines behind', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = \'\'\'',
      'multiline-literal-sentinel',
      'still-in-literal',
      '\'\'\'',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'replaces the multiline literal-string assignment with one true value');
    assert.ok(!content.includes('multiline-literal-sentinel'), 'removes multiline literal-string continuation lines');
    assert.ok(content.includes('other_feature = true'), 'preserves following feature keys');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'remains idempotent for the GSD hook block');
    assertNoDraftRootKeys(content);
  });

  test('multiline array codex_hooks assignments are fully normalized without leaving trailing lines behind', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = [',
      '  "array-sentinel-1",',
      '  "array-sentinel-2",',
      ']',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'replaces the multiline array assignment with one true value');
    assert.ok(!content.includes('array-sentinel-1'), 'removes multiline array continuation lines');
    assert.ok(!content.includes('array-sentinel-2'), 'removes multiline array continuation lines');
    assert.ok(content.includes('other_feature = true'), 'preserves following feature keys');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'remains idempotent for the GSD hook block');
    assertNoDraftRootKeys(content);
  });

  test('triple-quoted codex_hooks values keep inline comments when normalized', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = """sometimes""" # keep me',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
    assert.strictEqual(countMatches(content, /^codex_hooks = true # keep me$/gm), 1, 'normalizes to true and preserves inline comment');
    assert.ok(!content.includes('"""sometimes"""'), 'removes the old triple-quoted value');
    assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
    assertNoDraftRootKeys(content);
  });

  test('existing CRLF codex_hooks = true stays single and preserves non-GSD hooks', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = true',
      'other_feature = true',
      '',
      '[[hooks]]',
      'event = "AfterCommand"',
      'command = "echo custom-after-command"',
      '',
    ].join('\r\n'));

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'keeps one codex_hooks = true');
    assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
    assert.strictEqual(countMatches(content, /echo custom-after-command/g), 1, 'preserves non-GSD hook exactly once');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'keeps one GSD update hook');
    assertUsesOnlyEol(content, '\r\n');
    assertNoDraftRootKeys(content);
  });

  test('codex_hooks = true with an inline comment is treated as enabled for hook installation', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = true # keep me',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.strictEqual(countMatches(content, /^\[features\]\s*$/gm), 1, 'keeps one [features] section');
    assert.strictEqual(countMatches(content, /^codex_hooks = true # keep me$/gm), 1, 'preserves the commented true value');
    assert.ok(content.includes('other_feature = true'), 'preserves other feature keys');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'adds the GSD update hook once');
    assertNoDraftRootKeys(content);
  });

  test('mixed-EOL configs use the first newline style for inserted Codex content', () => {
    writeCodexConfig(codexHome, '# first line wins\n[model]\r\nname = "o3"\r\n');

    runCodexInstall(codexHome);
    runCodexInstall(codexHome);

    const content = readCodexConfig(codexHome);
    assert.ok(content.includes('[features]\ncodex_hooks = true\n\n# first line wins\n'), 'prepends the features block using the first newline style');
    assert.ok(content.includes(`# GSD Agent Configuration — managed by get-shit-done installer\n`), 'writes the managed agent block using the first newline style');
    assert.ok(content.includes('# GSD Hooks\n[[hooks]]\nevent = "SessionStart"\n'), 'writes the GSD hook block using the first newline style');
    assert.ok(content.includes('[model]\r\nname = "o3"'), 'preserves the existing CRLF model lines');
    assert.strictEqual(countMatches(content, /^codex_hooks = true$/gm), 1, 'remains idempotent on repeated installs');
    assert.strictEqual(countMatches(content, /gsd-update-check\.js/g), 1, 'does not duplicate the GSD hook block');
    assertNoDraftRootKeys(content);
  });
});

describe('Codex uninstall symmetry for hook-enabled configs', () => {
  let tmpDir;
  let codexHome;

  beforeEach(() => {
    tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'gsd-codex-uninstall-'));
    codexHome = path.join(tmpDir, 'codex-home');
  });

  afterEach(() => {
    fs.rmSync(tmpDir, { recursive: true, force: true });
  });

  test('fresh install removes the GSD-added codex_hooks feature on uninstall', () => {
    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.strictEqual(cleaned, null, 'fresh GSD-only config strips back to nothing');
  });

  test('install then uninstall removes [features].codex_hooks while preserving other feature keys, comments, hooks, and CRLF', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      '# keep me',
      'other_feature = true',
      '',
      '[[hooks]]',
      'event = "AfterCommand"',
      'command = "echo custom-after-command"',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\r\n'));

    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned, 'preserves user config after uninstall cleanup');
    assert.strictEqual(countMatches(cleaned, /^\[features\](?:\s*#.*)?$/gm), 1, 'keeps the existing features table');
    assert.strictEqual(countMatches(cleaned, /^codex_hooks = true$/gm), 0, 'removes the GSD-added codex_hooks key');
    assert.ok(cleaned.includes('# keep me'), 'preserves user comments in [features]');
    assert.ok(cleaned.includes('other_feature = true'), 'preserves other feature keys');
    assert.strictEqual(countMatches(cleaned, /echo custom-after-command/g), 1, 'preserves non-GSD hooks');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes only the GSD update hook');
    assert.strictEqual(countMatches(cleaned, /\[agents\.gsd-/g), 0, 'removes managed GSD agent sections');
    assertUsesOnlyEol(cleaned, '\r\n');
  });

  test('install then uninstall removes dotted features.codex_hooks without creating a [features] table', () => {
    writeCodexConfig(codexHome, [
      'features.other_feature = true',
      '',
      '[[hooks]]',
      'event = "AfterCommand"',
      'command = "echo custom-after-command"',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned.includes('features.other_feature = true'), 'preserves other dotted feature keys');
    assert.strictEqual(countMatches(cleaned, /^features\.codex_hooks = true$/gm), 0, 'removes the dotted GSD codex_hooks key');
    assert.strictEqual(countMatches(cleaned, /^\[features\]\s*$/gm), 0, 'does not leave behind a [features] table');
    assert.strictEqual(countMatches(cleaned, /echo custom-after-command/g), 1, 'preserves non-GSD hooks');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes the GSD update hook');
  });

  test('install then uninstall preserves a pre-existing [features].codex_hooks = true', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      'codex_hooks = true',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned.includes('[features]\ncodex_hooks = true\nother_feature = true'), 'preserves the user-authored codex_hooks assignment');
    assert.strictEqual(countMatches(cleaned, /^codex_hooks = true$/gm), 1, 'keeps the pre-existing codex_hooks key');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes the GSD update hook');
    assert.strictEqual(countMatches(cleaned, /\[agents\.gsd-/g), 0, 'removes managed GSD agent sections');
  });

  test('install then uninstall preserves a pre-existing quoted [features]."codex_hooks" = true', () => {
    writeCodexConfig(codexHome, [
      '[features]',
      '"codex_hooks" = true',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned.includes('[features]\n"codex_hooks" = true\nother_feature = true'), 'preserves the user-authored quoted codex_hooks assignment');
    assert.strictEqual(countMatches(cleaned, /^"codex_hooks" = true$/gm), 1, 'keeps the pre-existing quoted codex_hooks key');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes the GSD update hook');
    assert.strictEqual(countMatches(cleaned, /\[agents\.gsd-/g), 0, 'removes managed GSD agent sections');
  });

  test('install then uninstall preserves a pre-existing root dotted features.codex_hooks = true', () => {
    writeCodexConfig(codexHome, [
      'features.codex_hooks = true',
      'features.other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\n'));

    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned.includes('features.codex_hooks = true\nfeatures.other_feature = true'), 'preserves the user-authored dotted codex_hooks assignment');
    assert.strictEqual(countMatches(cleaned, /^features\.codex_hooks = true$/gm), 1, 'keeps the pre-existing dotted codex_hooks key');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes the GSD update hook');
    assert.strictEqual(countMatches(cleaned, /\[agents\.gsd-/g), 0, 'removes managed GSD agent sections');
  });

  test('install then uninstall leaves short-circuited root features assignments untouched', () => {
    const cases = [
      'features = { other_feature = true }\n\n[model]\nname = "o3"\n',
      'features = "disabled"\n\n[model]\nname = "o3"\n',
    ];

    for (const initialContent of cases) {
      writeCodexConfig(codexHome, initialContent);
      runCodexInstall(codexHome);

      const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
      assert.strictEqual(cleaned, initialContent, `preserves short-circuited root features assignment: ${initialContent.split('\n')[0]}`);

      fs.rmSync(codexHome, { recursive: true, force: true });
      fs.mkdirSync(codexHome, { recursive: true });
    }
  });

  test('install then uninstall keeps mixed-EOL user content stable while removing GSD hook state', () => {
    const initialContent = [
      '# first line wins',
      '[features]',
      'other_feature = true',
      '',
      '[model]',
      'name = "o3"',
      '',
    ].join('\r\n').replace(/^# first line wins\r\n/, '# first line wins\n');

    writeCodexConfig(codexHome, initialContent);
    runCodexInstall(codexHome);

    const cleaned = stripGsdFromCodexConfig(readCodexConfig(codexHome));
    assert.ok(cleaned.includes('# first line wins\n[features]\r\nother_feature = true\r\n\r\n[model]\r\nname = "o3"'), 'preserves the original mixed-EOL user content');
    assert.strictEqual(countMatches(cleaned, /^codex_hooks = true$/gm), 0, 'removes the injected codex_hooks key');
    assert.strictEqual(countMatches(cleaned, /gsd-update-check\.js/g), 0, 'removes the GSD update hook');
    assert.strictEqual(countMatches(cleaned, /\[agents\.gsd-/g), 0, 'removes managed GSD agent sections');
  });
});