mirror of
https://github.com/glittercowboy/get-shit-done
synced 2026-04-25 17:25:23 +02:00
* feat: auto-remap codebase after significant phase execution (#2003)

Adds a post-phase structural drift detector that compares the committed tree against `.planning/codebase/STRUCTURE.md` and either warns or auto-remaps the affected subtrees when drift exceeds a configurable threshold.

## Summary

- New `bin/lib/drift.cjs` — pure detector covering four drift categories: new directories outside mapped paths, new barrel exports at `(packages|apps)/*/src/index.*`, new migration files, and new route modules. Prioritizes the most-specific category per file.
- New `verify codebase-drift` CLI subcommand + SDK handler, registered as `gsd-sdk query verify.codebase-drift`.
- New `codebase_drift_gate` step in `execute-phase` between `schema_drift_gate` and `verify_phase_goal`. Non-blocking by contract — any error logs and the phase continues.
- Two new config keys: `workflow.drift_threshold` (int, default 3) and `workflow.drift_action` (`warn` | `auto-remap`, default `warn`), with enum/integer validation in `config-set`.
- `gsd-codebase-mapper` learns an optional `--paths <p1,p2,...>` scope hint for incremental remapping; agent/workflow docs updated.
- `last_mapped_commit` lives in YAML frontmatter on each `.planning/codebase/*.md` file; `readMappedCommit`/`writeMappedCommit` round-trip helpers ship in `drift.cjs`.

## Tests

- 55 new tests in `tests/drift-detection.test.cjs` covering: classification, threshold gating at 2/3/4 elements, warn vs. auto-remap routing, affected-path scoping, `--paths` sanitization (traversal, absolute, shell metacharacter rejection), frontmatter round-trip, defensive paths (missing STRUCTURE.md, malformed input, non-git repos), CLI JSON output, and documentation parity.
- Full suite: 5044 pass / 0 fail.

## Documentation

- `docs/CONFIGURATION.md` — rows for both new keys.
- `docs/ARCHITECTURE.md` — section on the post-execute drift gate.
- `docs/AGENTS.md` — `--paths` flag on `gsd-codebase-mapper`.
- `docs/USER-GUIDE.md` — user-facing behavior note + toggle commands.
- `docs/FEATURES.md` — new 27a section with REQ-DRIFT-01..06.
- `docs/INVENTORY.md` + `docs/INVENTORY-MANIFEST.json` — drift.cjs listed.
- `get-shit-done/workflows/execute-phase.md` — `codebase_drift_gate` step.
- `get-shit-done/workflows/map-codebase.md` — `parse_paths_flag` step.
- `agents/gsd-codebase-mapper.md` — `--paths` directive under parse_focus.

## Design decisions

- **Frontmatter over sidecar JSON** for `last_mapped_commit`: keeps the baseline attached to the file, survives git moves, survives per-doc regeneration, no extra file lifecycle.
- **Substring match against STRUCTURE.md** for `isPathMapped`: the map is free-form markdown, not a structured manifest; any mention of a path prefix counts as "mapped territory". Cheap, no parser, zero false negatives on reasonable maps.
- **Category priority migration > route > barrel > new_dir** so a file matching multiple rules counts exactly once at the most specific level.
- **Empty-tree SHA fallback** (`4b825dc6…`) when `last_mapped_commit` is absent — semantically correct (no baseline means everything is drift) and deterministic across repos.
- **Four layers of non-blocking** — detector try/catch, CLI try/catch, SDK handler try/catch, and workflow `|| echo` shell fallback. Any single layer failing still returns a valid skipped result.
- **SDK handler delegates to `gsd-tools.cjs`** rather than re-porting the detector to TypeScript, keeping drift logic in one canonical place.

Closes #2003

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* docs(mapper): tag --paths fenced block as text (CodeRabbit MD040)

Comment 3127255172.

* docs(config): use /gsd- dash command syntax in drift_action row (CodeRabbit)

Comment 3127255180. Matches the convention used by every other command reference in docs/CONFIGURATION.md.

* fix(execute-phase): initialize AGENT_SKILLS_MAPPER + tag fenced blocks

Two CodeRabbit findings on the auto-remap branch of the drift gate:

- 3127255186 (must-fix): the mapper Task prompt referenced ${AGENT_SKILLS_MAPPER} but only AGENT_SKILLS (for gsd-executor) is loaded at init_context (line 72). Without this fix the literal placeholder string would leak into the spawned mapper's prompt. Add an explicit gsd-sdk query agent-skills gsd-codebase-mapper step right before the Task spawn.
- 3127255183: tag the warn-message and Task() fenced code blocks as text to satisfy markdownlint MD040.

* docs(map-codebase): wire PATH_SCOPE_HINT through every mapper prompt

CodeRabbit (review id 4158286952, comment 3127255190) flagged that the parse_paths_flag step defined incremental-remap semantics but did not inject a normalized variable into the spawn_agents and sequential_mapping mapper prompts, so incremental remap could silently regress to a whole-repo scan.

- Define SCOPED_PATHS / PATH_SCOPE_HINT in parse_paths_flag.
- Inject ${PATH_SCOPE_HINT} into all four spawn_agents Task prompts.
- Document the same scope contract for sequential_mapping mode.

* fix(drift): writeMappedCommit tolerates missing target file

CodeRabbit (review id 4158286952, drift.cjs:349-355 nitpick) noted that readMappedCommit returns null on ENOENT but writeMappedCommit threw — an asymmetry that breaks first-time stamping of a freshly produced doc that the caller has not yet written.

- Catch ENOENT on the read; treat absent file as empty content.
- Add a regression test that calls writeMappedCommit on a non-existent path and asserts the file is created with correct frontmatter. Test was authored to fail before the fix (ENOENT) and passes after.

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
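The "most specific category wins" rule from the design decisions can be sketched in a few lines — a minimal, self-contained illustration, not the shipped `drift.cjs`:

```javascript
// Sketch of the category-priority rule described above. The priority table
// mirrors the design note: migration > route > barrel > new_dir.
const CATEGORY_PRIORITY = { new_dir: 0, barrel: 1, route: 2, migration: 3 };

// Given every category a file matched, keep only the most specific one,
// so each file contributes exactly one drift element.
function resolveCategory(matches) {
  let best = null;
  for (const cat of matches) {
    if (best === null || CATEGORY_PRIORITY[cat] > CATEGORY_PRIORITY[best]) {
      best = cat;
    }
  }
  return best;
}

console.log(resolveCategory(['barrel', 'migration'])); // prints 'migration'
```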
@@ -94,6 +94,19 @@ Based on focus, determine which documents you'll write:

- `arch` → ARCHITECTURE.md, STRUCTURE.md
- `quality` → CONVENTIONS.md, TESTING.md
- `concerns` → CONCERNS.md

**Optional `--paths` scope hint (#2003):**

The prompt may include a line of the form:

```text
--paths <p1>,<p2>,...
```

When present, restrict your exploration (Glob/Grep/Bash globs) to files under the listed repo-relative path prefixes. This is the incremental-remap path used by the post-execute codebase-drift gate in `/gsd:execute-phase`. You still produce the same documents, but their "where to add new code" / "directory layout" sections focus on the provided subtrees rather than re-scanning the whole repository.

**Path validation:** Reject any `--paths` value containing `..`, starting with `/`, or containing shell metacharacters (`;`, `` ` ``, `$`, `&`, `|`, `<`, `>`). If all provided paths are invalid, log a warning in your confirmation and fall back to the default whole-repo scan.

If no `--paths` hint is provided, behave exactly as before.

</step>

<step name="explore_codebase">
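The path-validation contract above can be sketched as a predicate — an illustrative helper, not code from the repository (the shipped implementation uses an allowlist regex, `sanitizePaths` in `bin/lib/drift.cjs`):

```javascript
// Sketch of the --paths validation rules described above (illustrative only).
const SHELL_META = /[;`$&|<>]/;

function isSafePathArg(p) {
  if (typeof p !== 'string' || p.length === 0) return false;
  if (p.includes('..')) return false;   // no traversal
  if (p.startsWith('/')) return false;  // repo-relative only
  if (SHELL_META.test(p)) return false; // no shell metacharacters
  return true;
}

// Invalid entries are dropped; an all-invalid list means fall back
// to the default whole-repo scan.
function filterPaths(values) {
  return values.filter(isSafePathArg);
}
```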
@@ -343,18 +343,26 @@ GSD uses a multi-agent architecture where thin orchestrators (workflow files) sp

| Property | Value |
|----------|-------|
| **Spawned by** | `/gsd-map-codebase` |
| **Spawned by** | `/gsd-map-codebase`, post-execute drift gate in `/gsd:execute-phase` |
| **Parallelism** | 4 instances (tech, architecture, quality, concerns) |
| **Tools** | Read, Bash, Grep, Glob, Write |
| **Model (balanced)** | Haiku |
| **Color** | Cyan |
| **Produces** | `.planning/codebase/*.md` (7 documents) |
| **Produces** | `.planning/codebase/*.md` (7 documents, with `last_mapped_commit` frontmatter) |

**Key behaviors:**
- Read-only exploration + structured output
- Writes documents directly to disk
- No reasoning required — pattern extraction from file contents

**`--paths <p1,p2,...>` scope hint (#2003):**

Accepts an optional `--paths` directive in its prompt. When present, the mapper restricts Glob/Grep/Bash exploration to the listed repo-relative path prefixes — this is the incremental-remap path used by the post-execute codebase-drift gate. Path values that contain `..`, start with `/`, or include shell metacharacters are rejected. Without the hint, the mapper runs its default whole-repo scan.
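A hedged sketch of how such a prefix restriction could work in practice — `inScope` is a hypothetical helper for illustration, not part of the agent:

```javascript
// Illustrative only: restrict candidate files to the --paths prefixes.
// With no hint, everything is in scope (the default whole-repo scan).
function inScope(file, scopedPaths) {
  if (!scopedPaths || scopedPaths.length === 0) return true;
  return scopedPaths.some((p) => file === p || file.startsWith(p + '/'));
}

const scoped = ['packages/api', 'apps/web'];
const files = ['packages/api/src/index.ts', 'packages/cli/main.ts', 'apps/web/page.tsx'];
console.log(files.filter((f) => inScope(f, scoped)));
```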

---

### gsd-debugger
@@ -467,8 +467,8 @@ Equivalent paths for other runtimes:

│   ├── ARCHITECTURE.md
│   └── PITFALLS.md
├── codebase/ # Brownfield mapping (from /gsd-map-codebase)
│   ├── STACK.md
│   ├── ARCHITECTURE.md
│   ├── STACK.md # YAML frontmatter carries `last_mapped_commit`
│   ├── ARCHITECTURE.md # for the post-execute drift gate (#2003)
│   ├── CONVENTIONS.md
│   ├── CONCERNS.md
│   ├── STRUCTURE.md
@@ -502,6 +502,30 @@ Equivalent paths for other runtimes:

└── continue-here.md # Context handoff (from pause-work)
```

### Post-Execute Codebase Drift Gate (#2003)

After the last wave of `/gsd:execute-phase` commits, the workflow runs a non-blocking `codebase_drift_gate` step (between `schema_drift_gate` and `verify_phase_goal`). It compares the diff `last_mapped_commit..HEAD` against `.planning/codebase/STRUCTURE.md` and counts four kinds of structural elements:

1. New directories outside mapped paths
2. New barrel exports at `(packages|apps)/<name>/src/index.*`
3. New migration files
4. New route modules under `routes/` or `api/`

If the count meets `workflow.drift_threshold` (default 3), the gate either **warns** (default) with the suggested `/gsd:map-codebase --paths …` command, or **auto-remaps** (`workflow.drift_action = auto-remap`) by spawning `gsd-codebase-mapper` scoped to the affected paths. Any error in detection or remap is logged and the phase continues — drift detection cannot fail verification.

`last_mapped_commit` lives in YAML frontmatter at the top of each `.planning/codebase/*.md` file; `bin/lib/drift.cjs` provides `readMappedCommit` and `writeMappedCommit` round-trip helpers.
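The round-trip shape can be pictured as follows — a minimal sketch of the frontmatter contract, assuming the behavior described above rather than reproducing the shipped helpers:

```javascript
// Minimal sketch of the last_mapped_commit frontmatter round-trip described
// above. Illustrative only; the real helpers live in bin/lib/drift.cjs.
function stampFrontmatter(content, sha) {
  const m = content.match(/^---\n([\s\S]*?)\n---\n?/);
  const body = m ? content.slice(m[0].length) : content;
  // Preserve other frontmatter keys; replace any existing stamp.
  const kept = m
    ? m[1].split('\n').filter((l) => !l.startsWith('last_mapped_commit:'))
    : [];
  const lines = ['---', ...kept, `last_mapped_commit: ${sha}`, '---'];
  return lines.join('\n') + '\n' + body;
}

function readStamp(content) {
  const m = content.match(/^---\n([\s\S]*?)\n---/);
  if (!m) return null;
  const kv = m[1].match(/^last_mapped_commit:\s*(\S+)/m);
  return kv ? kv[1] : null;
}
```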

---

## Installer Architecture
@@ -167,6 +167,8 @@ All workflow toggles follow the **absent = enabled** pattern. If a key is missin

| `workflow.pattern_mapper` | boolean | `true` | Run the `gsd-pattern-mapper` agent between research and planning to map new files to existing codebase analogs |
| `workflow.subagent_timeout` | number | `600` | Timeout in seconds for individual subagent invocations. Increase for long-running research or execution phases |
| `workflow.inline_plan_threshold` | number | `3` | Maximum number of tasks in a phase before the planner generates a separate PLAN.md file instead of inlining tasks in the prompt |
| `workflow.drift_threshold` | number | `3` | Minimum number of new structural elements (new directories, barrel exports, migrations, route modules) introduced during a phase before the post-execute codebase-drift gate takes action. See [#2003](https://github.com/gsd-build/get-shit-done/issues/2003). Added in v1.39 |
| `workflow.drift_action` | string | `warn` | What to do when `workflow.drift_threshold` is exceeded after `/gsd-execute-phase`. `warn` prints a message suggesting `/gsd-map-codebase --paths …`; `auto-remap` spawns `gsd-codebase-mapper` scoped to the affected paths. Added in v1.39 |

### Recommended Presets
@@ -802,6 +802,45 @@

| `TESTING.md` | Test infrastructure, coverage, patterns |
| `INTEGRATIONS.md` | External services, APIs, third-party dependencies |

**Incremental remap — `--paths` (#2003):** The mapper accepts an optional `--paths <p1,p2,...>` scope hint. When provided, it restricts exploration to the listed repo-relative prefixes instead of scanning the whole tree. This is the pathway used by the post-execute codebase-drift gate to refresh only the subtrees the phase actually changed. Each produced document carries `last_mapped_commit` in its YAML frontmatter so drift can be measured against the mapping point, not HEAD.

### 27a. Post-Execute Codebase Drift Detection

**Introduced by:** #2003
**Trigger:** Runs automatically at the end of every `/gsd:execute-phase`
**Configuration:**
- `workflow.drift_threshold` (integer, default `3`) — minimum new structural elements before the gate acts.
- `workflow.drift_action` (`warn` | `auto-remap`, default `warn`) — warn-only or spawn `gsd-codebase-mapper` with `--paths` scoped to affected subtrees.

**What counts as drift:**
- New directory outside mapped paths
- New barrel export at `(packages|apps)/*/src/index.*`
- New migration file (supabase/prisma/drizzle/src/migrations/…)
- New route module under `routes/` or `api/`

**Non-blocking guarantee:** any internal failure (missing STRUCTURE.md, git errors, mapper spawn failure) logs a single line and the phase continues. Drift detection cannot fail verification.
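One way to picture the non-blocking guarantee is as nested fallback layers, each converting a failure below it into a valid skipped result — a hedged sketch, not the shipped gate:

```javascript
// Illustrative layering of the non-blocking guarantee described above.
// detectLayer stands in for the real detector; each outer layer catches
// anything the inner layer lets escape.
function detectLayer(input) {
  try {
    if (!input) throw new Error('bad input');
    return { skipped: false, elements: [] };
  } catch (err) {
    return { skipped: true, reason: 'detector:' + err.message };
  }
}

function cliLayer(input) {
  try {
    return detectLayer(input);
  } catch {
    return { skipped: true, reason: 'cli-error' };
  }
}

function gateLayer(input) {
  try {
    return cliLayer(input);
  } catch {
    // analog of the workflow's `|| echo` shell fallback
    return { skipped: true, reason: 'workflow-fallback' };
  }
}
```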
**Requirements:**
- REQ-DRIFT-01: System MUST detect the four drift categories from `git diff --name-status last_mapped_commit..HEAD`
- REQ-DRIFT-02: Action fires only when element count ≥ `workflow.drift_threshold`
- REQ-DRIFT-03: `warn` action MUST NOT spawn any agent
- REQ-DRIFT-04: `auto-remap` action MUST pass sanitized `--paths` to the mapper
- REQ-DRIFT-05: Detection/remap failure MUST be non-blocking for `/gsd:execute-phase`
- REQ-DRIFT-06: `last_mapped_commit` round-trip through YAML frontmatter on each `.planning/codebase/*.md` file

---

## Utility Features
@@ -266,6 +266,7 @@

"config.cjs",
"core.cjs",
"docs.cjs",
"drift.cjs",
"frontmatter.cjs",
"graphify.cjs",
"gsd2-import.cjs",
@@ -356,7 +356,7 @@ The `gsd-planner` agent is decomposed into a core agent plus reference modules t

---

## CLI Modules (26 shipped)
## CLI Modules (27 shipped)

Full listing: `get-shit-done/bin/lib/*.cjs`.
@@ -369,6 +369,7 @@ Full listing: `get-shit-done/bin/lib/*.cjs`.

| `config.cjs` | `config.json` read/write, section initialization; imports validator from `config-schema.cjs` |
| `core.cjs` | Error handling, output formatting, shared utilities, runtime fallbacks |
| `docs.cjs` | Docs-update workflow init, Markdown scanning, monorepo detection |
| `drift.cjs` | Post-execute codebase structural drift detector (#2003): classifies file changes into new-dir/barrel/migration/route categories and round-trips `last_mapped_commit` frontmatter |
| `frontmatter.cjs` | YAML frontmatter CRUD operations |
| `graphify.cjs` | Knowledge-graph build/query/status/diff for `/gsd-graphify` |
| `gsd2-import.cjs` | External-plan ingest for `/gsd-from-gsd2` |
@@ -585,6 +585,20 @@ claude --dangerously-skip-permissions

# (normal phase workflow from here)
```

**Post-execute drift detection (#2003).** After every `/gsd:execute-phase`, GSD checks whether the phase introduced enough structural change (new directories, barrel exports, migrations, or route modules) to make `.planning/codebase/STRUCTURE.md` stale. If it did, the default behavior is to print a one-shot warning suggesting the exact `/gsd:map-codebase --paths …` invocation to refresh just the affected subtrees. Flip the behavior with:

```bash
/gsd:settings workflow.drift_action auto-remap  # remap automatically
/gsd:settings workflow.drift_threshold 5        # tune sensitivity
```

The gate is non-blocking: any internal failure logs and the phase continues.

### Quick Bug Fix

```bash
@@ -112,6 +112,7 @@

 * verify artifacts <plan-file>          Check must_haves.artifacts
 * verify key-links <plan-file>          Check must_haves.key_links
 * verify schema-drift <phase> [--skip]  Detect schema file changes without push
 * verify codebase-drift                 Detect structural drift since last codebase map (#2003)
 *
 * Template Fill:
 * template fill summary --phase N       Create pre-filled SUMMARY.md

@@ -593,8 +594,10 @@ async function runCommand(command, args, cwd, raw, defaultValue) {

} else if (subcommand === 'schema-drift') {
  const skipFlag = args.includes('--skip');
  verify.cmdVerifySchemaDrift(cwd, args[2], skipFlag, raw);
} else if (subcommand === 'codebase-drift') {
  verify.cmdVerifyCodebaseDrift(cwd, raw);
} else {
  error('Unknown verify subcommand. Available: plan-structure, phase-completeness, references, commits, artifacts, key-links, schema-drift');
  error('Unknown verify subcommand. Available: plan-structure, phase-completeness, references, commits, artifacts, key-links, schema-drift, codebase-drift');
}
break;
}
@@ -37,6 +37,8 @@ const VALID_CONFIG_KEYS = new Set([

'workflow.security_enforcement',
'workflow.security_asvs_level',
'workflow.security_block_on',
'workflow.drift_threshold',
'workflow.drift_action',
'git.branching_strategy', 'git.base_branch', 'git.phase_branch_template', 'git.milestone_branch_template', 'git.quick_branch_template',
'planning.commit_docs', 'planning.search_gitignored', 'planning.sub_repos',
'workflow.cross_ai_execution', 'workflow.cross_ai_command', 'workflow.cross_ai_timeout',

@@ -333,6 +333,17 @@ function cmdConfigSet(cwd, keyPath, value, raw) {

  error(`Invalid context value '${value}'. Valid values: ${VALID_CONTEXT_VALUES.join(', ')}`);
}

// Codebase drift detector (#2003)
const VALID_DRIFT_ACTIONS = ['warn', 'auto-remap'];
if (keyPath === 'workflow.drift_action' && !VALID_DRIFT_ACTIONS.includes(String(parsedValue))) {
  error(`Invalid workflow.drift_action '${value}'. Valid values: ${VALID_DRIFT_ACTIONS.join(', ')}`);
}
if (keyPath === 'workflow.drift_threshold') {
  if (typeof parsedValue !== 'number' || !Number.isInteger(parsedValue) || parsedValue < 1) {
    error(`Invalid workflow.drift_threshold '${value}'. Must be a positive integer.`);
  }
}

const setConfigValueResult = setConfigValue(cwd, keyPath, parsedValue);
output(setConfigValueResult, raw, `${keyPath}=${parsedValue}`);
}
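The two checks above can be exercised in isolation — a self-contained restatement of the same rules (`validateDriftKey` is hypothetical; the real checks live inline in `cmdConfigSet`):

```javascript
// Stand-alone sketch of the drift-key validation shown above.
const VALID_DRIFT_ACTIONS = ['warn', 'auto-remap'];

function validateDriftKey(keyPath, parsedValue) {
  if (keyPath === 'workflow.drift_action') {
    // enum check: only 'warn' or 'auto-remap' are accepted
    return VALID_DRIFT_ACTIONS.includes(String(parsedValue));
  }
  if (keyPath === 'workflow.drift_threshold') {
    // positive-integer check
    return typeof parsedValue === 'number'
      && Number.isInteger(parsedValue)
      && parsedValue >= 1;
  }
  return true; // other keys are validated elsewhere
}
```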
378 get-shit-done/bin/lib/drift.cjs Normal file
@@ -0,0 +1,378 @@
/**
 * Codebase Drift Detection (#2003)
 *
 * Detects structural drift between a committed codebase and the
 * `.planning/codebase/STRUCTURE.md` map produced by `gsd-codebase-mapper`.
 *
 * Four categories of drift element:
 * - new_dir   → a newly-added file whose directory prefix does not appear
 *               in STRUCTURE.md
 * - barrel    → a newly-added barrel export at
 *               (packages|apps)/<name>/src/index.(ts|tsx|js|mjs|cjs)
 * - migration → a newly-added migration file under one of the recognized
 *               migration directories (supabase, prisma, drizzle, src/migrations, …)
 * - route     → a newly-added route module under a `routes/` or `api/` dir
 *
 * Each file is counted at most once; when a file matches multiple categories
 * the most specific category wins (migration > route > barrel > new_dir).
 *
 * Design decisions (see PR for full rubber-duck):
 * - The library is pure. It takes parsed git diff output and returns a
 *   structured result. The CLI/workflow layer is responsible for running
 *   git and for spawning mappers.
 * - `last_mapped_commit` is stored as YAML-style frontmatter at the top of
 *   each `.planning/codebase/*.md` file. This keeps the baseline attached
 *   to the file, survives git moves, and avoids a sidecar JSON.
 * - The detector NEVER throws on malformed input — it returns a
 *   `{ skipped: true }` result. The phase workflow depends on this
 *   non-blocking guarantee.
 */

'use strict';

const fs = require('node:fs');

// ─── Constants ───────────────────────────────────────────────────────────────

const DRIFT_CATEGORIES = Object.freeze(['new_dir', 'barrel', 'migration', 'route']);

// Category priority when a single file matches multiple rules.
// Higher index = more specific = wins.
const CATEGORY_PRIORITY = { new_dir: 0, barrel: 1, route: 2, migration: 3 };

const BARREL_RE = /^(packages|apps)\/[^/]+\/src\/index\.(ts|tsx|js|mjs|cjs)$/;

const MIGRATION_RES = [
  /^supabase\/migrations\/.+\.sql$/,
  /^prisma\/migrations\/.+/,
  /^drizzle\/meta\/.+/,
  /^drizzle\/migrations\/.+/,
  /^src\/migrations\/.+\.(ts|js|sql)$/,
  /^db\/migrations\/.+\.(sql|ts|js)$/,
  /^migrations\/.+\.(sql|ts|js)$/,
];

const ROUTE_RES = [
  /^(apps|packages)\/[^/]+\/src\/routes\/.+\.(ts|tsx|js|jsx|mjs|cjs)$/,
  /^src\/routes\/.+\.(ts|tsx|js|jsx|mjs|cjs)$/,
  /^src\/api\/.+\.(ts|tsx|js|jsx|mjs|cjs)$/,
  /^(apps|packages)\/[^/]+\/src\/api\/.+\.(ts|tsx|js|jsx|mjs|cjs)$/,
];

// A conservative allowlist for `--paths` arguments passed to the mapper:
// repo-relative path components separated by /, containing only
// alphanumerics, dash, underscore, and dot (no `..`, no `/..`).
const SAFE_PATH_RE = /^(?!.*\.\.)(?:[A-Za-z0-9_.][A-Za-z0-9_.\-]*)(?:\/[A-Za-z0-9_.][A-Za-z0-9_.\-]*)*$/;

// ─── Classification ──────────────────────────────────────────────────────────

/**
 * Classify a single file path into a drift category or null.
 *
 * @param {string} file - repo-relative path, forward slashes.
 * @returns {'barrel'|'migration'|'route'|null}
 */
function classifyFile(file) {
  if (typeof file !== 'string' || !file) return null;
  const norm = file.replace(/\\/g, '/');
  if (MIGRATION_RES.some((r) => r.test(norm))) return 'migration';
  if (ROUTE_RES.some((r) => r.test(norm))) return 'route';
  if (BARREL_RE.test(norm)) return 'barrel';
  return null;
}

/**
 * True iff any prefix of `file` (dir1, dir1/dir2, …) appears as a substring
 * of `structureMd`. Used to decide whether a file is in "mapped territory".
 *
 * Matching is deliberately substring-based — STRUCTURE.md is free-form
 * markdown, not a structured manifest. If the map mentions `src/lib/` the
 * check `structureMd.includes('src/lib')` holds.
 */
function isPathMapped(file, structureMd) {
  const norm = file.replace(/\\/g, '/');
  const parts = norm.split('/');
  // Check prefixes from longest to shortest; any hit means "mapped".
  for (let i = parts.length - 1; i >= 1; i--) {
    const prefix = parts.slice(0, i).join('/');
    if (structureMd.includes(prefix)) return true;
  }
  // Finally, if even the top-level dir is mentioned, count as mapped.
  if (parts.length > 0 && structureMd.includes(parts[0] + '/')) return true;
  if (parts.length > 0 && structureMd.includes('`' + parts[0] + '`')) return true;
  return false;
}

// ─── Main detection ──────────────────────────────────────────────────────────

/**
 * Detect codebase drift.
 *
 * @param {object} input
 * @param {string[]} input.addedFiles - files with git status A (new)
 * @param {string[]} input.modifiedFiles - files with git status M
 * @param {string[]} input.deletedFiles - files with git status D
 * @param {string|null|undefined} input.structureMd - contents of STRUCTURE.md
 * @param {number} [input.threshold=3] - min number of drift elements that triggers action
 * @param {'warn'|'auto-remap'} [input.action='warn']
 * @returns {object} result
 */
function detectDrift(input) {
  try {
    if (!input || typeof input !== 'object') {
      return skipped('invalid-input');
    }
    const {
      addedFiles,
      modifiedFiles,
      deletedFiles,
      structureMd,
    } = input;
    const threshold = Number.isInteger(input.threshold) && input.threshold >= 1
      ? input.threshold
      : 3;
    const action = input.action === 'auto-remap' ? 'auto-remap' : 'warn';

    if (structureMd === null || structureMd === undefined) {
      return skipped('missing-structure-md');
    }
    if (typeof structureMd !== 'string') {
      return skipped('invalid-structure-md');
    }

    const added = Array.isArray(addedFiles) ? addedFiles.filter((x) => typeof x === 'string') : [];
    const modified = Array.isArray(modifiedFiles) ? modifiedFiles : [];
    const deleted = Array.isArray(deletedFiles) ? deletedFiles : [];

    // Build elements. One element per file, highest-priority category wins.
    /** @type {{category: string, path: string}[]} */
    const elements = [];
    const seen = new Map();

    for (const rawFile of added) {
      const file = rawFile.replace(/\\/g, '/');
      const specific = classifyFile(file);
      let category = specific;
      if (!category) {
        if (!isPathMapped(file, structureMd)) {
          category = 'new_dir';
        } else {
          continue; // mapped, known, ordinary file — not drift
        }
      }
      // Dedup: if we've already counted this path at higher-or-equal priority, skip
      const prior = seen.get(file);
      if (prior && CATEGORY_PRIORITY[prior] >= CATEGORY_PRIORITY[category]) continue;
      seen.set(file, category);
    }

    for (const [file, category] of seen.entries()) {
      elements.push({ category, path: file });
    }

    // Sort for stable output.
    elements.sort((a, b) =>
      a.category === b.category
        ? a.path.localeCompare(b.path)
        : a.category.localeCompare(b.category),
    );

    const actionRequired = elements.length >= threshold;
    let directive = 'none';
    let spawnMapper = false;
    let affectedPaths = [];
    let message = '';

    if (actionRequired) {
      directive = action;
      affectedPaths = chooseAffectedPaths(elements.map((e) => e.path));
      if (action === 'auto-remap') {
        spawnMapper = true;
      }
      message = buildMessage(elements, affectedPaths, action);
    }

    return {
      skipped: false,
      elements,
      actionRequired,
      directive,
      spawnMapper,
      affectedPaths,
      threshold,
      action,
      message,
      counts: {
        added: added.length,
        modified: modified.length,
        deleted: deleted.length,
      },
    };
  } catch (err) {
    // Non-blocking: never throw from this function.
    return skipped('exception:' + (err && err.message ? err.message : String(err)));
  }
}

function skipped(reason) {
  return {
    skipped: true,
    reason,
    elements: [],
    actionRequired: false,
    directive: 'none',
    spawnMapper: false,
    affectedPaths: [],
    message: '',
  };
}

function buildMessage(elements, affectedPaths, action) {
  const byCat = {};
  for (const e of elements) {
    (byCat[e.category] ||= []).push(e.path);
  }
  const lines = [
    `Codebase drift detected: ${elements.length} structural element(s) since last mapping.`,
    '',
  ];
  const labels = {
    new_dir: 'New directories',
    barrel: 'New barrel exports',
    migration: 'New migrations',
    route: 'New route modules',
  };
  for (const cat of ['new_dir', 'barrel', 'migration', 'route']) {
    if (byCat[cat]) {
      lines.push(`${labels[cat]}:`);
      for (const p of byCat[cat]) lines.push(`  - ${p}`);
    }
  }
  lines.push('');
  if (action === 'auto-remap') {
    lines.push(`Auto-remap scheduled for paths: ${affectedPaths.join(', ')}`);
  } else {
    lines.push(
      `Run /gsd:map-codebase --paths ${affectedPaths.join(',')} to refresh planning context.`,
    );
  }
  return lines.join('\n');
}

// ─── Affected paths ──────────────────────────────────────────────────────────

/**
 * Collapse a list of drifted file paths into a sorted, deduplicated list of
 * the top-level directory prefixes (depth 2 when the repo uses an
 * `<apps|packages>/<name>/…` layout; depth 1 otherwise).
 */
function chooseAffectedPaths(paths) {
  const out = new Set();
  for (const raw of paths || []) {
    if (typeof raw !== 'string' || !raw) continue;
    const file = raw.replace(/\\/g, '/');
    const parts = file.split('/');
    if (parts.length === 0) continue;
    const top = parts[0];
    if ((top === 'apps' || top === 'packages') && parts.length >= 2) {
      out.add(`${top}/${parts[1]}`);
    } else {
      out.add(top);
    }
  }
  return [...out].sort();
}

/**
 * Filter `paths` to only those that are safe to splice into a mapper prompt.
 * Any path that is absolute, contains traversal, or includes shell
 * metacharacters is dropped.
 */
function sanitizePaths(paths) {
  if (!Array.isArray(paths)) return [];
  const out = [];
  for (const p of paths) {
    if (typeof p !== 'string') continue;
    if (p.startsWith('/')) continue;
    if (!SAFE_PATH_RE.test(p)) continue;
    out.push(p);
  }
  return out;
}

// ─── Frontmatter helpers ─────────────────────────────────────────────────────

const FRONTMATTER_RE = /^---\r?\n([\s\S]*?)\r?\n---\r?\n?/;

function parseFrontmatter(content) {
  if (typeof content !== 'string') return { data: {}, body: '' };
  const m = content.match(FRONTMATTER_RE);
  if (!m) return { data: {}, body: content };
  const data = {};
  for (const line of m[1].split(/\r?\n/)) {
    const kv = line.match(/^([A-Za-z0-9_][A-Za-z0-9_-]*):\s*(.*)$/);
    if (!kv) continue;
    data[kv[1]] = kv[2];
  }
  return { data, body: content.slice(m[0].length) };
}

function serializeFrontmatter(data, body) {
  const keys = Object.keys(data);
  if (keys.length === 0) return body;
  const lines = ['---'];
  for (const k of keys) lines.push(`${k}: ${data[k]}`);
  lines.push('---');
  return lines.join('\n') + '\n' + body;
}

/**
 * Read `last_mapped_commit` from the frontmatter of a `.planning/codebase/*.md`
 * file. Returns null if the file does not exist or has no frontmatter.
 */
function readMappedCommit(filePath) {
  let content;
  try {
    content = fs.readFileSync(filePath, 'utf8');
  } catch {
    return null;
  }
  const { data } = parseFrontmatter(content);
  const sha = data.last_mapped_commit;
  return typeof sha === 'string' && sha.length > 0 ? sha : null;
}

/**
 * Upsert `last_mapped_commit` and `last_mapped_at` into the frontmatter of
 * the given file, preserving any other frontmatter keys and the body.
 */
function writeMappedCommit(filePath, commitSha, isoDate) {
  // Symmetric with readMappedCommit (which returns null on missing files):
  // tolerate a missing target by creating a minimal frontmatter-only file
  // rather than throwing ENOENT. This matters when a mapper produces a new
  // doc and the caller stamps it before any prior content existed.
  let content = '';
  try {
    content = fs.readFileSync(filePath, 'utf8');
  } catch (err) {
    if (err.code !== 'ENOENT') throw err;
  }
  const { data, body } = parseFrontmatter(content);
  data.last_mapped_commit = commitSha;
  if (isoDate) data.last_mapped_at = isoDate;
  fs.writeFileSync(filePath, serializeFrontmatter(data, body));
}

// ─── Exports ─────────────────────────────────────────────────────────────────

module.exports = {
  DRIFT_CATEGORIES,
  classifyFile,
  detectDrift,
  chooseAffectedPaths,
  sanitizePaths,
  readMappedCommit,
  writeMappedCommit,
  // Exposed for the CLI layer to reuse the same parser.
  parseFrontmatter,
};
@@ -1169,6 +1169,141 @@ function cmdVerifySchemaDrift(cwd, phaseArg, skipFlag, raw) {
  }, raw);
}

// ─── Codebase Drift Detection (#2003) ────────────────────────────────────────

/**
 * Detect structural drift between the committed tree and
 * `.planning/codebase/STRUCTURE.md`. Non-blocking: any failure returns a
 * `{ skipped: true }` JSON result with a reason; the command never exits
 * non-zero so `execute-phase`'s drift gate cannot fail the phase.
 */
function cmdVerifyCodebaseDrift(cwd, raw) {
  const drift = require('./drift.cjs');

  const emit = (payload) => output(payload, raw);

  try {
    const codebaseDir = path.join(planningDir(cwd), 'codebase');
    const structurePath = path.join(codebaseDir, 'STRUCTURE.md');
    if (!fs.existsSync(structurePath)) {
      emit({
        skipped: true,
        reason: 'no-structure-md',
        action_required: false,
        directive: 'none',
        elements: [],
      });
      return;
    }

    let structureMd;
    try {
      structureMd = fs.readFileSync(structurePath, 'utf-8');
    } catch (err) {
      emit({
        skipped: true,
        reason: 'cannot-read-structure-md: ' + err.message,
        action_required: false,
        directive: 'none',
        elements: [],
      });
      return;
    }

    const lastMapped = drift.readMappedCommit(structurePath);

    // Verify we're inside a git repo and resolve the diff range.
    const revProbe = execGit(cwd, ['rev-parse', 'HEAD']);
    if (revProbe.exitCode !== 0) {
      emit({
        skipped: true,
        reason: 'not-a-git-repo',
        action_required: false,
        directive: 'none',
        elements: [],
      });
      return;
    }

    // Empty-tree SHA is a stable fallback when no mapping commit is recorded.
    const EMPTY_TREE = '4b825dc642cb6eb9a060e54bf8d69288fbee4904';
    let base = lastMapped;
    if (!base) {
      base = EMPTY_TREE;
    } else {
      // Verify the commit is reachable; if not, fall back to EMPTY_TREE.
      const verify = execGit(cwd, ['cat-file', '-t', base]);
      if (verify.exitCode !== 0) base = EMPTY_TREE;
    }

    const diff = execGit(cwd, ['diff', '--name-status', base, 'HEAD']);
    if (diff.exitCode !== 0) {
      emit({
        skipped: true,
        reason: 'git-diff-failed',
        action_required: false,
        directive: 'none',
        elements: [],
      });
      return;
    }

    const added = [];
    const modified = [];
    const deleted = [];
    for (const line of diff.stdout.split(/\r?\n/)) {
      if (!line.trim()) continue;
      const m = line.match(/^([A-Z])\d*\t(.+?)(?:\t(.+))?$/);
      if (!m) continue;
      const status = m[1];
      // For renames (R), use the new path (m[3] if present, else m[2]).
      const file = m[3] || m[2];
      if (status === 'A' || status === 'R' || status === 'C') added.push(file);
      else if (status === 'M') modified.push(file);
      else if (status === 'D') deleted.push(file);
    }

    // Threshold and action read from config, with defaults.
    const config = loadConfig(cwd);
    const threshold = Number.isInteger(config?.workflow?.drift_threshold) && config.workflow.drift_threshold >= 1
      ? config.workflow.drift_threshold
      : 3;
    const action = config?.workflow?.drift_action === 'auto-remap' ? 'auto-remap' : 'warn';

    const result = drift.detectDrift({
      addedFiles: added,
      modifiedFiles: modified,
      deletedFiles: deleted,
      structureMd,
      threshold,
      action,
    });

    emit({
      skipped: !!result.skipped,
      reason: result.reason || null,
      action_required: !!result.actionRequired,
      directive: result.directive,
      spawn_mapper: !!result.spawnMapper,
      affected_paths: result.affectedPaths || [],
      elements: result.elements || [],
      threshold,
      action,
      last_mapped_commit: lastMapped,
      message: result.message || '',
    });
  } catch (err) {
    // Non-blocking: never bubble up an exception.
    emit({
      skipped: true,
      reason: 'exception: ' + (err && err.message ? err.message : String(err)),
      action_required: false,
      directive: 'none',
      elements: [],
    });
  }
}

module.exports = {
  cmdVerifySummary,
  cmdVerifyPlanStructure,
@@ -1181,4 +1316,5 @@ module.exports = {
  cmdValidateHealth,
  cmdValidateAgents,
  cmdVerifySchemaDrift,
  cmdVerifyCodebaseDrift,
};
@@ -1270,6 +1270,86 @@ If `TEXT_MODE` is true, present as a plain-text numbered list. Otherwise use Ask
**If user selects option 3:** Stop execution. Report partial completion.
</step>

<step name="codebase_drift_gate">
Post-execution structural drift detection (#2003). Runs after the last wave
commits, before verification. **Non-blocking by contract:** any internal
error here MUST fall through and continue to `verify_phase_goal`. The phase
is never failed by this gate.

```bash
DRIFT=$(gsd-sdk query verify.codebase-drift 2>/dev/null || echo '{"skipped":true,"reason":"sdk-failed"}')
```

Parse JSON for: `skipped`, `reason`, `action_required`, `directive`,
`spawn_mapper`, `affected_paths`, `elements`, `threshold`, `action`,
`last_mapped_commit`, `message`.
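A dependency-free probe for the `skipped` flag can be sketched as follows. This is illustrative only: the agent may parse the JSON however it likes, and only the field names listed above are contractual. The `DRIFT` value here reuses the gate's own fallback string.

```bash
# Hedged sketch: substring probe for "skipped" without requiring jq.
DRIFT='{"skipped":true,"reason":"sdk-failed"}'
case "$DRIFT" in
  *'"skipped":true'*) SKIPPED=1 ;;
  *) SKIPPED=0 ;;
esac
```

When `SKIPPED` is set this way, the gate's skipped branch below applies directly.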

**If `skipped` is true (no STRUCTURE.md, missing git, or any internal error):**
Log one line — `Codebase drift check skipped: {reason}` — and continue to
`verify_phase_goal`. Do NOT prompt the user. Do NOT block.

**If `action_required` is false:** Continue silently to `verify_phase_goal`.

**If `action_required` is true AND `directive` is `warn`:**
Print the `message` field verbatim. The format is:

```text
Codebase drift detected: {N} structural element(s) since last mapping.

New directories:
- {path}
New barrel exports:
- {path}
New migrations:
- {path}
New route modules:
- {path}

Run /gsd:map-codebase --paths {affected_paths} to refresh planning context.
```

Then continue to `verify_phase_goal`. Do NOT block. Do NOT spawn anything.

**If `action_required` is true AND `directive` is `auto-remap`:**

First load the mapper agent's skill bundle (the executor's `AGENT_SKILLS`
from step `init_context` is for `gsd-executor`, not the mapper):

```bash
AGENT_SKILLS_MAPPER=$(gsd-sdk query agent-skills gsd-codebase-mapper 2>/dev/null || true)
```

Then spawn `gsd-codebase-mapper` agents with the `--paths` hint:

```text
Task(
  subagent_type="gsd-codebase-mapper",
  description="Incremental codebase remap (drift)",
  prompt="Focus: arch
Today's date: {date}
--paths {affected_paths joined by comma}

Refresh STRUCTURE.md and ARCHITECTURE.md scoped to the listed paths only.
Stamp last_mapped_commit in each document's frontmatter.
${AGENT_SKILLS_MAPPER}"
)
```

If the spawn fails or the agent reports an error: log `Codebase drift
auto-remap failed: {reason}` and continue to `verify_phase_goal`. The phase
is NOT failed by a remap failure.

If the remap succeeds: log `Codebase drift auto-remap completed for paths:
{affected_paths}` and continue to `verify_phase_goal`.

The two relevant config keys (invalid values fall back to defaults; errors never block):
- `workflow.drift_threshold` (integer, default 3) — minimum drift elements before action
- `workflow.drift_action` — `warn` (default) or `auto-remap`

This step is fully non-blocking — it never fails the phase, and any
exception path returns control to `verify_phase_goal`.
</step>

<step name="verify_phase_goal">
Verify phase achieved its GOAL, not just completed tasks.

@@ -27,6 +27,44 @@ Documents are reference material for Claude when planning/executing. Always incl

<process>

<step name="parse_paths_flag" priority="first">
Parse an optional `--paths <p1,p2,...>` argument. When supplied (by the
post-execute codebase-drift gate in `/gsd:execute-phase` or by a user running
`/gsd:map-codebase --paths apps/accounting,packages/ui`), the workflow
operates in **incremental-remap mode**:

- Pass `--paths <p1>,<p2>,...` through to each spawned `gsd-codebase-mapper`
  agent's prompt. Agents scope their Glob/Grep/Bash exploration to the listed
  repo-relative prefixes only — no whole-repo scan.
- Reject path values that contain `..`, start with `/`, or include shell
  metacharacters (`;`, `` ` ``, `$`, `&`, `|`, `<`, `>`). If all provided
  paths are invalid, fall back to a normal whole-repo run.
- On write, each mapper stamps `last_mapped_commit: <HEAD sha>` into the YAML
  frontmatter of every document it produces (see `bin/lib/drift.cjs:writeMappedCommit`).
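
The rejection rules above can be sketched in portable shell. This is illustrative only — `bin/lib/drift.cjs` holds the canonical `sanitizePaths` implementation, and the `RAW` input here is a made-up example:

```bash
# Hedged sketch of --paths validation: drop absolute, traversal, and
# metacharacter-bearing entries; keep the rest, comma-joined.
RAW="apps/web,../evil,packages/ui;rm"
SCOPED_PATHS=""
OLDIFS=$IFS; IFS=','
for p in $RAW; do
  case "$p" in
    /*|*..*|*';'*|*'`'*|*'$'*|*'&'*|*'|'*|*'<'*|*'>'*) continue ;;
  esac
  SCOPED_PATHS="${SCOPED_PATHS:+$SCOPED_PATHS,}$p"
done
IFS=$OLDIFS
```

With the sample input, only `apps/web` survives; the other two entries trip the traversal and metacharacter rules respectively.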

**Explicit contract — propagate `--paths` through a single normalized
variable.** Downstream steps (`spawn_agents`, `sequential_mapping`, and any
Task-mode prompt construction) MUST use `${PATH_SCOPE_HINT}` to ensure every
mapper receives the same deterministic scope. Without this contract,
incremental-remap can silently regress to a whole-repo scan.

```bash
# Validated, comma-separated paths (empty if --paths absent or all rejected):
SCOPED_PATHS="<validated paths or empty>"
if [ -n "$SCOPED_PATHS" ]; then
  PATH_SCOPE_HINT="--paths $SCOPED_PATHS"
else
  PATH_SCOPE_HINT=""
fi
```

All mapper prompts built later in this workflow MUST include
`${PATH_SCOPE_HINT}` (expanded to empty when full-repo mode is in effect).

When `--paths` is absent, behave exactly as before: full-repo scan, all 7
documents refreshed.
</step>

<step name="init_context" priority="first">
Load codebase mapping context:

@@ -124,6 +162,8 @@ Write these documents to .planning/codebase/:

IMPORTANT: Use {date} for all [YYYY-MM-DD] date placeholders in documents.

Scope: ${PATH_SCOPE_HINT:-(full repo)} — when --paths is supplied, restrict exploration to those prefixes only.

Explore thoroughly. Write documents directly using templates. Return confirmation only.
${AGENT_SKILLS_MAPPER}"
)
@@ -148,6 +188,8 @@ Write these documents to .planning/codebase/:

IMPORTANT: Use {date} for all [YYYY-MM-DD] date placeholders in documents.

Scope: ${PATH_SCOPE_HINT:-(full repo)} — when --paths is supplied, restrict exploration to those prefixes only.

Explore thoroughly. Write documents directly using templates. Return confirmation only.
${AGENT_SKILLS_MAPPER}"
)
@@ -172,6 +214,8 @@ Write these documents to .planning/codebase/:

IMPORTANT: Use {date} for all [YYYY-MM-DD] date placeholders in documents.

Scope: ${PATH_SCOPE_HINT:-(full repo)} — when --paths is supplied, restrict exploration to those prefixes only.

Explore thoroughly. Write documents directly using templates. Return confirmation only.
${AGENT_SKILLS_MAPPER}"
)
@@ -195,6 +239,8 @@ Write this document to .planning/codebase/:

IMPORTANT: Use {date} for all [YYYY-MM-DD] date placeholders in documents.

Scope: ${PATH_SCOPE_HINT:-(full repo)} — when --paths is supplied, restrict exploration to those prefixes only.

Explore thoroughly. Write document directly using template. Return confirmation only.
${AGENT_SKILLS_MAPPER}"
)
@@ -246,6 +292,8 @@ When the `Task` tool is unavailable, perform codebase mapping sequentially in th

**IMPORTANT:** Use `{date}` from init context for all `[YYYY-MM-DD]` date placeholders in documents. NEVER guess the date.

**SCOPE:** When `${PATH_SCOPE_HINT}` is non-empty (i.e. `--paths` was supplied), restrict every pass below to the validated path prefixes in `${SCOPED_PATHS}`. Do NOT scan files outside those prefixes. When `${PATH_SCOPE_HINT}` is empty, perform a full-repo scan.

Perform all 4 mapping passes sequentially:

**Pass 1: Tech Focus**

@@ -56,7 +56,7 @@ import { agentSkills } from './skills.js';
import { requirementsMarkComplete, roadmapAnnotateDependencies } from './roadmap.js';
import { roadmapUpdatePlanProgress } from './roadmap-update-plan-progress.js';
import { statePlannedPhase } from './state-mutation.js';
import { verifySchemaDrift } from './verify.js';
import { verifySchemaDrift, verifyCodebaseDrift } from './verify.js';
import {
  todoMatchPhase, statsJson, statsTable, progressBar, progressTable, listTodos, todoComplete,
} from './progress.js';
@@ -460,6 +460,8 @@ export function createRegistry(
  registry.register('state planned-phase', statePlannedPhase);
  registry.register('verify.schema-drift', verifySchemaDrift);
  registry.register('verify schema-drift', verifySchemaDrift);
  registry.register('verify.codebase-drift', verifyCodebaseDrift);
  registry.register('verify codebase-drift', verifyCodebaseDrift);
  registry.register('todo.match-phase', todoMatchPhase);
  registry.register('todo match-phase', todoMatchPhase);
  registry.register('list-todos', listTodos);

@@ -643,3 +643,56 @@ export const verifySchemaDrift: QueryHandler = async (args, projectDir, workstre
    },
  };
};

/**
 * verify.codebase-drift — structural drift detector (#2003).
 *
 * Non-blocking by contract: every failure mode returns a successful response
 * with `{ skipped: true, reason }`. The post-execute drift gate in
 * `/gsd:execute-phase` relies on this guarantee.
 *
 * Delegates to the Node-side implementation in `bin/lib/drift.cjs` and
 * `bin/lib/verify.cjs` via a child process so the drift logic stays in one
 * canonical place (see `cmdVerifyCodebaseDrift`).
 */
export const verifyCodebaseDrift: QueryHandler = async (_args, projectDir) => {
  try {
    const { execFileSync } = await import('node:child_process');
    const { fileURLToPath } = await import('node:url');
    const { dirname, resolve } = await import('node:path');
    const here = typeof __dirname === 'string'
      ? __dirname
      : dirname(fileURLToPath(import.meta.url));
    // sdk/src/query -> ../../../get-shit-done/bin/gsd-tools.cjs
    // sdk/dist/query -> ../../../get-shit-done/bin/gsd-tools.cjs
    const toolsPath = resolve(here, '..', '..', '..', 'get-shit-done', 'bin', 'gsd-tools.cjs');
    const out = execFileSync(process.execPath, [toolsPath, 'verify', 'codebase-drift'], {
      cwd: projectDir,
      encoding: 'utf-8',
      stdio: ['pipe', 'pipe', 'pipe'],
    }).trim();
    try {
      return { data: JSON.parse(out) };
    } catch {
      return {
        data: {
          skipped: true,
          reason: 'sdk-parse-failed',
          action_required: false,
          directive: 'none',
          elements: [],
        },
      };
    }
  } catch (err) {
    return {
      data: {
        skipped: true,
        reason: 'sdk-exception: ' + (err instanceof Error ? err.message : String(err)),
        action_required: false,
        directive: 'none',
        elements: [],
      },
    };
  }
};

663 tests/drift-detection.test.cjs Normal file
@@ -0,0 +1,663 @@
/**
 * GSD Tools Tests — Codebase Drift Detection (#2003)
 *
 * Unit tests for bin/lib/drift.cjs plus CLI surface via verify codebase-drift.
 * Exercises the four drift categories (new dir, barrel, migration, route),
 * threshold gating, warn vs. auto-remap, last_mapped_commit round-trip,
 * config validation, mapper --paths passthrough, and graceful failure paths.
 */

'use strict';

const { test, describe, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('node:fs');
const path = require('node:path');
const { execFileSync } = require('node:child_process');
const {
  createTempProject,
  createTempGitProject,
  cleanup,
  runGsdTools,
} = require('./helpers.cjs');

const DRIFT_PATH = path.join(
  __dirname,
  '..',
  'get-shit-done',
  'bin',
  'lib',
  'drift.cjs',
);
const CONFIG_SCHEMA_PATH = path.join(
  __dirname,
  '..',
  'get-shit-done',
  'bin',
  'lib',
  'config-schema.cjs',
);

const {
  detectDrift,
  classifyFile,
  readMappedCommit,
  writeMappedCommit,
  chooseAffectedPaths,
  sanitizePaths,
  DRIFT_CATEGORIES,
} = require(DRIFT_PATH);

// Small wrapper around execFileSync so tests don't sprinkle shell=true calls.
function git(cwd, ...args) {
  return execFileSync('git', args, { cwd, encoding: 'utf8', stdio: ['pipe', 'pipe', 'pipe'] }).trim();
}

// ─── Unit: classifyFile ──────────────────────────────────────────────────────

describe('classifyFile', () => {
  test('classifies packages barrel export', () => {
    assert.strictEqual(classifyFile('packages/foo/src/index.ts'), 'barrel');
  });

  test('classifies apps barrel export', () => {
    assert.strictEqual(classifyFile('apps/web/src/index.tsx'), 'barrel');
  });

  test('classifies supabase migration', () => {
    assert.strictEqual(
      classifyFile('supabase/migrations/20240101_init.sql'),
      'migration',
    );
  });

  test('classifies prisma migration folder', () => {
    assert.strictEqual(
      classifyFile('prisma/migrations/20240101_init/migration.sql'),
      'migration',
    );
  });

  test('classifies drizzle meta migration', () => {
    assert.strictEqual(classifyFile('drizzle/meta/_journal.json'), 'migration');
  });

  test('classifies route module', () => {
    assert.strictEqual(
      classifyFile('apps/web/src/routes/journal.ts'),
      'route',
    );
    assert.strictEqual(
      classifyFile('src/api/users.ts'),
      'route',
    );
  });

  test('returns null for ordinary source file', () => {
    assert.strictEqual(classifyFile('src/lib/util.ts'), null);
  });
});

// ─── Unit: detectDrift categories ────────────────────────────────────────────

describe('detectDrift — categories', () => {
  const baseStructure = [
    '# Codebase Structure',
    '',
    '- `src/lib/` — helpers',
    '- `bin/` — CLIs',
    '',
  ].join('\n');

  test('identifies new directory outside mapped paths', () => {
    const result = detectDrift({
      addedFiles: ['newpkg/src/thing.ts'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    const newDirs = result.elements.filter((e) => e.category === 'new_dir');
    assert.ok(newDirs.length >= 1, 'should find at least one new directory');
    assert.ok(
      newDirs.some((e) => e.path.startsWith('newpkg')),
      'should flag newpkg as new',
    );
  });

  test('does not flag files in already-mapped paths', () => {
    const result = detectDrift({
      addedFiles: ['src/lib/newhelper.ts'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    const newDirs = result.elements.filter((e) => e.category === 'new_dir');
    assert.strictEqual(
      newDirs.length,
      0,
      'src/lib is mapped — no new_dir drift',
    );
  });

  test('identifies new barrel export', () => {
    const result = detectDrift({
      addedFiles: ['packages/widgets/src/index.ts'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    assert.ok(result.elements.some((e) => e.category === 'barrel'));
  });

  test('identifies new migration', () => {
    const result = detectDrift({
      addedFiles: ['supabase/migrations/20240501_add_accounts.sql'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    assert.ok(result.elements.some((e) => e.category === 'migration'));
  });

  test('identifies new route module', () => {
    const result = detectDrift({
      addedFiles: ['apps/accounting/src/routes/journal.ts'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    assert.ok(result.elements.some((e) => e.category === 'route'));
  });

  test('prioritizes higher-specificity category per file', () => {
    const result = detectDrift({
      addedFiles: ['supabase/migrations/20240101_init.sql'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: baseStructure,
    });
    const perFile = result.elements.filter(
      (e) => e.path === 'supabase/migrations/20240101_init.sql',
    );
    assert.strictEqual(perFile.length, 1, 'file counted once');
    assert.strictEqual(perFile[0].category, 'migration');
  });
});
// ─── Unit: threshold gating ──────────────────────────────────────────────────
|
||||
|
||||
describe('detectDrift — threshold gating', () => {
|
||||
test('2 elements under default threshold → no action', () => {
|
||||
const result = detectDrift({
|
||||
addedFiles: [
|
||||
'packages/a/src/index.ts',
|
||||
'packages/b/src/index.ts',
|
||||
],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 3,
|
||||
});
|
||||
assert.strictEqual(result.elements.length >= 2, true);
|
||||
assert.strictEqual(result.actionRequired, false);
|
||||
});
|
||||
|
||||
test('3 elements at threshold → action required', () => {
|
||||
const result = detectDrift({
|
||||
addedFiles: [
|
||||
'packages/a/src/index.ts',
|
||||
'packages/b/src/index.ts',
|
||||
'packages/c/src/index.ts',
|
||||
],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 3,
|
||||
});
|
||||
assert.strictEqual(result.actionRequired, true);
|
||||
});
|
||||
|
||||
test('4 elements exceeds threshold → action required', () => {
|
||||
const result = detectDrift({
|
||||
addedFiles: [
|
||||
'packages/a/src/index.ts',
|
||||
'packages/b/src/index.ts',
|
||||
'packages/c/src/index.ts',
|
||||
'supabase/migrations/1.sql',
|
||||
],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 3,
|
||||
});
|
||||
assert.strictEqual(result.actionRequired, true);
|
||||
});
|
||||
|
||||
test('respects custom threshold value', () => {
|
||||
const result = detectDrift({
|
||||
addedFiles: ['packages/a/src/index.ts', 'packages/b/src/index.ts'],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 2,
|
||||
});
|
||||
assert.strictEqual(result.actionRequired, true);
|
||||
});
|
||||
});
|
||||
|
||||
// ─── Unit: action routing ────────────────────────────────────────────────────
|
||||
|
||||
describe('detectDrift — action routing', () => {
|
||||
const over = {
|
||||
addedFiles: [
|
||||
'packages/a/src/index.ts',
|
||||
'packages/b/src/index.ts',
|
||||
'packages/c/src/index.ts',
|
||||
],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 3,
|
||||
};
|
||||
|
||||
test('warn action yields warn directive and no mapper spawn request', () => {
|
||||
const result = detectDrift({ ...over, action: 'warn' });
|
||||
assert.strictEqual(result.directive, 'warn');
|
||||
assert.strictEqual(result.spawnMapper, false);
|
||||
assert.ok(result.message.includes('drift'), 'message mentions drift');
|
||||
});
|
||||
|
||||
test('auto-remap action yields spawn directive with affected paths', () => {
|
||||
const result = detectDrift({ ...over, action: 'auto-remap' });
|
||||
assert.strictEqual(result.directive, 'auto-remap');
|
||||
assert.strictEqual(result.spawnMapper, true);
|
||||
assert.ok(Array.isArray(result.affectedPaths));
|
||||
assert.ok(result.affectedPaths.length > 0);
|
||||
for (const p of result.affectedPaths) {
|
||||
assert.ok(!p.startsWith('/'), 'no absolute paths');
|
||||
assert.ok(!p.includes('..'), 'no traversal');
|
||||
}
|
||||
});
|
||||
|
||||
test('below-threshold inputs produce no directive', () => {
|
||||
const result = detectDrift({
|
||||
addedFiles: ['packages/a/src/index.ts'],
|
||||
modifiedFiles: [],
|
||||
deletedFiles: [],
|
||||
structureMd: '# only src/ mapped',
|
||||
threshold: 3,
|
||||
action: 'auto-remap',
|
||||
});
|
||||
assert.strictEqual(result.actionRequired, false);
|
||||
assert.strictEqual(result.spawnMapper, false);
|
||||
assert.strictEqual(result.directive, 'none');
|
||||
});
|
||||
});
|
||||
|
||||
// ─── Unit: affected-paths scoping ────────────────────────────────────────────
|
||||
|
||||
describe('chooseAffectedPaths', () => {
|
||||
test('collapses files into top-level prefixes', () => {
|
||||
const paths = chooseAffectedPaths([
|
||||
'apps/accounting/src/routes/a.ts',
|
||||
'apps/accounting/src/routes/b.ts',
|
||||
'packages/ui/src/index.ts',
|
||||
]);
|
||||
assert.ok(paths.includes('apps/accounting'));
|
||||
assert.ok(paths.includes('packages/ui'));
|
||||
});
|
||||
|
||||
test('deduplicates and sorts', () => {
|
||||
const paths = chooseAffectedPaths([
|
||||
'zzz/a.ts',
|
||||
'aaa/b.ts',
|
||||
'zzz/c.ts',
|
||||
]);
|
||||
assert.deepStrictEqual(paths, ['aaa', 'zzz']);
|
||||
});
|
||||
|
||||
test('returns [] for empty input', () => {
|
||||
assert.deepStrictEqual(chooseAffectedPaths([]), []);
|
||||
});
|
||||
});
|
||||
|
||||
// ─── Unit: sanitizePaths ─────────────────────────────────────────────────────
|
||||
|
||||
describe('sanitizePaths', () => {
|
||||
test('rejects traversal', () => {
|
||||
assert.deepStrictEqual(sanitizePaths(['../evil']), []);
|
||||
assert.deepStrictEqual(sanitizePaths(['foo/../evil']), []);
|
||||
});
|
||||
|
||||
test('rejects absolute paths', () => {
|
||||
assert.deepStrictEqual(sanitizePaths(['/etc/passwd']), []);
|
||||
});
|
||||
|
||||
test('rejects shell metacharacters', () => {
|
||||
assert.deepStrictEqual(sanitizePaths(['foo;rm -rf /']), []);
|
||||
assert.deepStrictEqual(sanitizePaths(['foo`id`']), []);
|
||||
assert.deepStrictEqual(sanitizePaths(['foo$(id)']), []);
|
||||
});
|
||||
|
||||
test('accepts normal repo-relative paths', () => {
|
||||
assert.deepStrictEqual(
|
||||
sanitizePaths(['apps/web', 'packages/ui']),
|
||||
['apps/web', 'packages/ui'],
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
// ─── Unit: last_mapped_commit frontmatter round-trip ─────────────────────────
|
||||
|
||||
describe('last_mapped_commit frontmatter', () => {
|
||||
let tmp;
|
||||
beforeEach(() => {
|
||||
tmp = createTempProject('gsd-drift-');
|
||||
fs.mkdirSync(path.join(tmp, '.planning', 'codebase'), { recursive: true });
|
||||
});
|
||||
afterEach(() => cleanup(tmp));
|
||||
|
||||
test('writeMappedCommit creates frontmatter on fresh file', () => {
|
||||
const file = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
|
||||
fs.writeFileSync(file, '# Codebase Structure\n\nBody\n');
|
||||
writeMappedCommit(file, 'deadbeef00000000000000000000000000000000', '2026-04-22');
|
||||
const content = fs.readFileSync(file, 'utf8');
|
||||
assert.ok(content.startsWith('---\n'));
|
||||
assert.ok(content.includes('last_mapped_commit: deadbeef00000000000000000000000000000000'));
|
||||
assert.ok(content.includes('# Codebase Structure'));
|
||||
});
|
||||
|
  test('writeMappedCommit updates existing frontmatter', () => {
    const file = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
    fs.writeFileSync(
      file,
      '---\nlast_mapped_commit: aaaa\nother: keep-me\n---\n# body\n',
    );
    writeMappedCommit(file, 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb', '2026-04-22');
    const content = fs.readFileSync(file, 'utf8');
    assert.ok(content.includes('last_mapped_commit: bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb'));
    assert.ok(content.includes('other: keep-me'), 'preserves other keys');
    assert.ok(content.includes('# body'));
  });

  test('readMappedCommit round-trips via write', () => {
    const file = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
    fs.writeFileSync(file, '# body\n');
    writeMappedCommit(file, 'cafebabe00000000000000000000000000000000', '2026-04-22');
    assert.strictEqual(
      readMappedCommit(file),
      'cafebabe00000000000000000000000000000000',
    );
  });

  test('readMappedCommit returns null when file missing', () => {
    assert.strictEqual(readMappedCommit('/nonexistent/path.md'), null);
  });

  test('readMappedCommit returns null when frontmatter absent', () => {
    const file = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
    fs.writeFileSync(file, '# No frontmatter\n');
    assert.strictEqual(readMappedCommit(file), null);
  });

  test('writeMappedCommit creates the file when it does not exist (symmetry with readMappedCommit)', () => {
    const file = path.join(tmp, '.planning', 'codebase', 'NEW-DOC.md');
    assert.strictEqual(fs.existsSync(file), false, 'precondition: file absent');
    // Must not throw — readMappedCommit returns null for missing files,
    // writeMappedCommit must defensively create them.
    writeMappedCommit(file, 'feedface00000000000000000000000000000000', '2026-04-22');
    assert.strictEqual(fs.existsSync(file), true, 'file created');
    assert.strictEqual(
      readMappedCommit(file),
      'feedface00000000000000000000000000000000',
    );
  });
});

// ─── Unit: negative / defensive ──────────────────────────────────────────────

describe('detectDrift — defensive paths', () => {
  test('missing structureMd → skipped result, no throw', () => {
    const result = detectDrift({
      addedFiles: ['foo/bar.ts'],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: null,
    });
    assert.strictEqual(result.skipped, true);
    assert.strictEqual(result.actionRequired, false);
    assert.ok(result.reason);
  });

  test('empty inputs → no drift', () => {
    const result = detectDrift({
      addedFiles: [],
      modifiedFiles: [],
      deletedFiles: [],
      structureMd: '# structure',
    });
    assert.strictEqual(result.elements.length, 0);
    assert.strictEqual(result.actionRequired, false);
  });

  test('categories constant is exposed and stable', () => {
    assert.ok(Array.isArray(DRIFT_CATEGORIES));
    assert.deepStrictEqual(
      [...DRIFT_CATEGORIES].sort(),
      ['barrel', 'migration', 'new_dir', 'route'],
    );
  });
});

// ─── Unit: non-blocking guarantee ────────────────────────────────────────────

describe('detectDrift — non-blocking guarantee', () => {
  test('never throws on malformed input', () => {
    assert.doesNotThrow(() => detectDrift({}));
    assert.doesNotThrow(() => detectDrift({ addedFiles: null }));
    assert.doesNotThrow(() => detectDrift({ addedFiles: ['x'], structureMd: undefined }));
  });

  test('malformed input returns a skipped result (never crashes the phase)', () => {
    const r = detectDrift({});
    assert.strictEqual(r.skipped, true);
    assert.strictEqual(r.actionRequired, false);
  });
});

// ─── Config validation: new keys present and restricted ──────────────────────

describe('config-schema — drift keys', () => {
  test('workflow.drift_threshold in VALID_CONFIG_KEYS', () => {
    const { VALID_CONFIG_KEYS } = require(CONFIG_SCHEMA_PATH);
    assert.ok(VALID_CONFIG_KEYS.has('workflow.drift_threshold'));
  });

  test('workflow.drift_action in VALID_CONFIG_KEYS', () => {
    const { VALID_CONFIG_KEYS } = require(CONFIG_SCHEMA_PATH);
    assert.ok(VALID_CONFIG_KEYS.has('workflow.drift_action'));
  });
});

describe('config-set drift validation via CLI', () => {
  let tmp;
  beforeEach(() => {
    tmp = createTempGitProject('gsd-drift-cfg-');
  });
  afterEach(() => cleanup(tmp));

  test('accepts warn', () => {
    const r = runGsdTools(['config-set', 'workflow.drift_action', 'warn'], tmp);
    assert.strictEqual(r.success, true, r.error);
  });

  test('accepts auto-remap', () => {
    const r = runGsdTools(['config-set', 'workflow.drift_action', 'auto-remap'], tmp);
    assert.strictEqual(r.success, true, r.error);
  });

  test('rejects bogus drift_action value', () => {
    const r = runGsdTools(['config-set', 'workflow.drift_action', 'sometimes'], tmp);
    assert.strictEqual(r.success, false);
  });

  test('drift_threshold accepts integer', () => {
    const r = runGsdTools(['config-set', 'workflow.drift_threshold', '5'], tmp);
    assert.strictEqual(r.success, true, r.error);
  });

  test('drift_threshold rejects non-numeric', () => {
    const r = runGsdTools(['config-set', 'workflow.drift_threshold', 'many'], tmp);
    assert.strictEqual(r.success, false);
  });
});

// ─── Docs parity for CONFIGURATION.md ────────────────────────────────────────

describe('docs parity', () => {
  test('workflow.drift_threshold mentioned in docs/CONFIGURATION.md', () => {
    const md = fs.readFileSync(
      path.join(__dirname, '..', 'docs', 'CONFIGURATION.md'),
      'utf8',
    );
    assert.ok(md.includes('`workflow.drift_threshold`'));
  });

  test('workflow.drift_action mentioned in docs/CONFIGURATION.md', () => {
    const md = fs.readFileSync(
      path.join(__dirname, '..', 'docs', 'CONFIGURATION.md'),
      'utf8',
    );
    assert.ok(md.includes('`workflow.drift_action`'));
  });
});

// ─── Mapper --paths flag documented ──────────────────────────────────────────

describe('gsd-codebase-mapper --paths flag', () => {
  test('agent doc mentions --paths', () => {
    const doc = fs.readFileSync(
      path.join(__dirname, '..', 'agents', 'gsd-codebase-mapper.md'),
      'utf8',
    );
    assert.ok(/--paths/.test(doc));
  });

  test('AGENTS.md mentions --paths for mapper', () => {
    const doc = fs.readFileSync(
      path.join(__dirname, '..', 'docs', 'AGENTS.md'),
      'utf8',
    );
    assert.ok(/--paths/.test(doc));
  });

  test('map-codebase workflow documents --paths passthrough', () => {
    const doc = fs.readFileSync(
      path.join(__dirname, '..', 'get-shit-done', 'workflows', 'map-codebase.md'),
      'utf8',
    );
    assert.ok(/--paths/.test(doc));
  });
});

// ─── Execute-phase workflow integration ──────────────────────────────────────

describe('execute-phase integrates codebase_drift_gate', () => {
  test('workflow references a codebase drift step', () => {
    const doc = fs.readFileSync(
      path.join(__dirname, '..', 'get-shit-done', 'workflows', 'execute-phase.md'),
      'utf8',
    );
    assert.ok(/codebase_drift_gate|codebase-drift/.test(doc));
  });

  test('workflow documents non-blocking guarantee for drift', () => {
    const doc = fs.readFileSync(
      path.join(__dirname, '..', 'get-shit-done', 'workflows', 'execute-phase.md'),
      'utf8',
    );
    assert.ok(/non[- ]blocking/i.test(doc) || /continue on (error|failure)/i.test(doc));
  });
});

// ─── CLI: verify codebase-drift subcommand ───────────────────────────────────

describe('verify codebase-drift CLI', () => {
  let tmp;
  beforeEach(() => {
    tmp = createTempGitProject('gsd-drift-cli-');
    fs.mkdirSync(path.join(tmp, '.planning', 'codebase'), { recursive: true });
  });
  afterEach(() => cleanup(tmp));

  test('returns skipped JSON when STRUCTURE.md missing', () => {
    const r = runGsdTools(['verify', 'codebase-drift'], tmp);
    assert.strictEqual(r.success, true, r.error);
    const data = JSON.parse(r.output);
    assert.strictEqual(data.skipped, true);
    assert.strictEqual(data.action_required, false);
  });

  test('returns no-drift result when STRUCTURE.md is fresh', () => {
    const structure = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
    fs.writeFileSync(structure, '# Codebase Structure\n\n- `src/`\n');
    const head = git(tmp, 'rev-parse', 'HEAD');
    writeMappedCommit(structure, head, '2026-04-22');
    const r = runGsdTools(['verify', 'codebase-drift'], tmp);
    assert.strictEqual(r.success, true, r.error);
    const data = JSON.parse(r.output);
    assert.strictEqual(data.action_required, false);
  });

  test('detects drift when new files added after last_mapped_commit', () => {
    const structure = path.join(tmp, '.planning', 'codebase', 'STRUCTURE.md');
    fs.writeFileSync(structure, '# Codebase Structure\n\n- `src/`\n');
    const head = git(tmp, 'rev-parse', 'HEAD');
    writeMappedCommit(structure, head, '2026-04-22');
    git(tmp, 'add', '-A');
    git(tmp, 'commit', '-m', 'map codebase');
    for (const pkg of ['alpha', 'beta', 'gamma']) {
      const dir = path.join(tmp, 'packages', pkg, 'src');
      fs.mkdirSync(dir, { recursive: true });
      fs.writeFileSync(path.join(dir, 'index.ts'), 'export {};\n');
    }
    git(tmp, 'add', '-A');
    git(tmp, 'commit', '-m', 'add packages');
    const r = runGsdTools(['verify', 'codebase-drift'], tmp);
    assert.strictEqual(r.success, true, r.error);
    const data = JSON.parse(r.output);
    assert.strictEqual(data.action_required, true);
    assert.strictEqual(data.directive, 'warn');
    assert.ok(data.elements.length >= 3);
  });

  test('never exits non-zero when git repo is missing (non-blocking)', () => {
    const nonGit = createTempProject('gsd-drift-nongit-');
    try {
      const r = runGsdTools(['verify', 'codebase-drift'], nonGit);
      assert.strictEqual(r.success, true, 'must exit 0 even without git');
      const data = JSON.parse(r.output);
      assert.strictEqual(data.skipped, true);
    } finally {
      cleanup(nonGit);
    }
  });
});