mirror of
https://github.com/glittercowboy/get-shit-done
synced 2026-04-25 17:25:23 +02:00
* feat: auto-remap codebase after significant phase execution (#2003)

  Adds a post-phase structural drift detector that compares the committed tree against `.planning/codebase/STRUCTURE.md` and either warns or auto-remaps the affected subtrees when drift exceeds a configurable threshold.

  ## Summary
  - New `bin/lib/drift.cjs` — pure detector covering four drift categories: new directories outside mapped paths, new barrel exports at `(packages|apps)/*/src/index.*`, new migration files, and new route modules. Prioritizes the most-specific category per file.
  - New `verify codebase-drift` CLI subcommand + SDK handler, registered as `gsd-sdk query verify.codebase-drift`.
  - New `codebase_drift_gate` step in `execute-phase` between `schema_drift_gate` and `verify_phase_goal`. Non-blocking by contract — any error logs and the phase continues.
  - Two new config keys: `workflow.drift_threshold` (int, default 3) and `workflow.drift_action` (`warn` | `auto-remap`, default `warn`), with enum/integer validation in `config-set`.
  - `gsd-codebase-mapper` learns an optional `--paths <p1,p2,...>` scope hint for incremental remapping; agent/workflow docs updated.
  - `last_mapped_commit` lives in YAML frontmatter on each `.planning/codebase/*.md` file; `readMappedCommit`/`writeMappedCommit` round-trip helpers ship in `drift.cjs`.

  ## Tests
  - 55 new tests in `tests/drift-detection.test.cjs` covering: classification, threshold gating at 2/3/4 elements, warn vs. auto-remap routing, affected-path scoping, `--paths` sanitization (traversal, absolute, shell metacharacter rejection), frontmatter round-trip, defensive paths (missing STRUCTURE.md, malformed input, non-git repos), CLI JSON output, and documentation parity.
  - Full suite: 5044 pass / 0 fail.

  ## Documentation
  - `docs/CONFIGURATION.md` — rows for both new keys.
  - `docs/ARCHITECTURE.md` — section on the post-execute drift gate.
  - `docs/AGENTS.md` — `--paths` flag on `gsd-codebase-mapper`.
  - `docs/USER-GUIDE.md` — user-facing behavior note + toggle commands.
  - `docs/FEATURES.md` — new 27a section with REQ-DRIFT-01..06.
  - `docs/INVENTORY.md` + `docs/INVENTORY-MANIFEST.json` — drift.cjs listed.
  - `get-shit-done/workflows/execute-phase.md` — `codebase_drift_gate` step.
  - `get-shit-done/workflows/map-codebase.md` — `parse_paths_flag` step.
  - `agents/gsd-codebase-mapper.md` — `--paths` directive under parse_focus.

  ## Design decisions
  - **Frontmatter over sidecar JSON** for `last_mapped_commit`: keeps the baseline attached to the file, survives git moves, survives per-doc regeneration, no extra file lifecycle.
  - **Substring match against STRUCTURE.md** for `isPathMapped`: the map is free-form markdown, not a structured manifest; any mention of a path prefix counts as "mapped territory". Cheap, no parser, zero false negatives on reasonable maps.
  - **Category priority migration > route > barrel > new_dir** so a file matching multiple rules counts exactly once at the most specific level.
  - **Empty-tree SHA fallback** (`4b825dc6…`) when `last_mapped_commit` is absent — semantically correct (no baseline means everything is drift) and deterministic across repos.
  - **Four layers of non-blocking** — detector try/catch, CLI try/catch, SDK handler try/catch, and workflow `|| echo` shell fallback. Any single layer failing still returns a valid skipped result.
  - **SDK handler delegates to `gsd-tools.cjs`** rather than re-porting the detector to TypeScript, keeping drift logic in one canonical place.

  Closes #2003

  Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* docs(mapper): tag --paths fenced block as text (CodeRabbit MD040)

  Comment 3127255172.

* docs(config): use /gsd- dash command syntax in drift_action row (CodeRabbit)

  Comment 3127255180. Matches the convention used by every other command reference in docs/CONFIGURATION.md.

* fix(execute-phase): initialize AGENT_SKILLS_MAPPER + tag fenced blocks

  Two CodeRabbit findings on the auto-remap branch of the drift gate:
  - 3127255186 (must-fix): the mapper Task prompt referenced ${AGENT_SKILLS_MAPPER} but only AGENT_SKILLS (for gsd-executor) is loaded at init_context (line 72). Without this fix the literal placeholder string would leak into the spawned mapper's prompt. Add an explicit gsd-sdk query agent-skills gsd-codebase-mapper step right before the Task spawn.
  - 3127255183: tag the warn-message and Task() fenced code blocks as text to satisfy markdownlint MD040.

* docs(map-codebase): wire PATH_SCOPE_HINT through every mapper prompt

  CodeRabbit (review id 4158286952, comment 3127255190) flagged that the parse_paths_flag step defined incremental-remap semantics but did not inject a normalized variable into the spawn_agents and sequential_mapping mapper prompts, so incremental remap could silently regress to a whole-repo scan.
  - Define SCOPED_PATHS / PATH_SCOPE_HINT in parse_paths_flag.
  - Inject ${PATH_SCOPE_HINT} into all four spawn_agents Task prompts.
  - Document the same scope contract for sequential_mapping mode.

* fix(drift): writeMappedCommit tolerates missing target file

  CodeRabbit (review id 4158286952, drift.cjs:349-355 nitpick) noted that readMappedCommit returns null on ENOENT but writeMappedCommit threw — an asymmetry that breaks first-time stamping of a freshly produced doc that the caller has not yet written.
  - Catch ENOENT on the read; treat absent file as empty content.
  - Add a regression test that calls writeMappedCommit on a non-existent path and asserts the file is created with correct frontmatter. Test was authored to fail before the fix (ENOENT) and passes after.

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
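The `readMappedCommit`/`writeMappedCommit` round-trip and the ENOENT fix described above can be sketched in TypeScript. This is a hypothetical port for illustration only — the canonical implementation lives in `bin/lib/drift.cjs`, and the names and signatures here are assumptions, not the real API:

```typescript
// Hypothetical TypeScript sketch of the last_mapped_commit frontmatter
// round-trip. Works on in-memory strings; absent input (null) is treated
// as empty content, mirroring the ENOENT-tolerance fix.
const KEY = 'last_mapped_commit';

// Read the baseline commit from a doc's YAML frontmatter, or null if absent.
function readMappedCommitFrom(content: string): string | null {
  const m = content.match(/^---\n([\s\S]*?)\n---/);
  if (!m) return null;
  const line = m[1].split('\n').find(l => l.startsWith(`${KEY}:`));
  return line ? line.slice(KEY.length + 1).trim() : null;
}

// Write (or replace) the baseline commit, creating frontmatter if missing.
function writeMappedCommitTo(content: string | null, sha: string): string {
  const body = content ?? ''; // missing file -> empty content
  if (readMappedCommitFrom(body) !== null) {
    return body.replace(new RegExp(`^${KEY}:.*$`, 'm'), `${KEY}: ${sha}`);
  }
  if (/^---\n[\s\S]*?\n---/.test(body)) {
    return body.replace(/^---\n/, `---\n${KEY}: ${sha}\n`);
  }
  return `---\n${KEY}: ${sha}\n---\n${body}`;
}
```

First-time stamping of a doc that does not exist yet therefore just produces a fresh frontmatter block instead of throwing.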
699 lines
24 KiB
TypeScript
/**
 * Verification query handlers — plan structure, phase completeness, artifact checks.
 *
 * Ported from get-shit-done/bin/lib/verify.cjs.
 * Provides plan validation, phase completeness checking, and artifact verification
 * as native TypeScript query handlers registered in the SDK query registry.
 *
 * @example
 * ```typescript
 * import { verifyPlanStructure, verifyPhaseCompleteness, verifyArtifacts } from './verify.js';
 *
 * const result = await verifyPlanStructure(['path/to/plan.md'], '/project');
 * // { data: { valid: true, errors: [], warnings: [], task_count: 2, ... } }
 * ```
 */

import { readFile, readdir } from 'node:fs/promises';
import { existsSync, readdirSync, readFileSync, statSync } from 'node:fs';
import { join, isAbsolute } from 'node:path';
import { GSDError, ErrorClassification } from '../errors.js';
import { extractFrontmatter, parseMustHavesBlock } from './frontmatter.js';
import {
  comparePhaseNum,
  normalizePhaseName,
  phaseTokenMatches,
  planningPaths,
} from './helpers.js';
import type { QueryHandler } from './utils.js';
// ─── verifyPlanStructure ───────────────────────────────────────────────────

/**
 * Validate plan structure against required schema.
 *
 * Port of `cmdVerifyPlanStructure` from `verify.cjs` lines 108-167.
 * Checks required frontmatter fields, task XML elements, wave/depends_on
 * consistency, and autonomous/checkpoint consistency.
 *
 * @param args - args[0]: file path (required)
 * @param projectDir - Project root directory
 * @returns QueryResult with { valid, errors, warnings, task_count, tasks, frontmatter_fields }
 * @throws GSDError with Validation classification if file path missing
 */
export const verifyPlanStructure: QueryHandler = async (args, projectDir) => {
  const filePath = args[0];
  if (!filePath) {
    throw new GSDError('file path required', ErrorClassification.Validation);
  }

  // T-12-01: Null byte rejection on file paths
  if (filePath.includes('\0')) {
    throw new GSDError('file path contains null bytes', ErrorClassification.Validation);
  }

  const fullPath = isAbsolute(filePath) ? filePath : join(projectDir, filePath);

  let content: string;
  try {
    content = await readFile(fullPath, 'utf-8');
  } catch {
    return { data: { error: 'File not found', path: filePath } };
  }

  const fm = extractFrontmatter(content);
  const errors: string[] = [];
  const warnings: string[] = [];

  // Check required frontmatter fields
  const required = ['phase', 'plan', 'type', 'wave', 'depends_on', 'files_modified', 'autonomous', 'must_haves'];
  for (const field of required) {
    if (fm[field] === undefined) errors.push(`Missing required frontmatter field: ${field}`);
  }

  // Parse and check task elements
  // T-12-03: Use non-greedy [\s\S]*? to avoid catastrophic backtracking
  const taskPattern = /<task[^>]*>([\s\S]*?)<\/task>/g;
  const tasks: Array<{ name: string; hasFiles: boolean; hasAction: boolean; hasVerify: boolean; hasDone: boolean }> = [];
  let taskMatch: RegExpExecArray | null;
  while ((taskMatch = taskPattern.exec(content)) !== null) {
    const taskContent = taskMatch[1];
    const nameMatch = taskContent.match(/<name>([\s\S]*?)<\/name>/);
    const taskName = nameMatch ? nameMatch[1].trim() : 'unnamed';
    const hasFiles = /<files>/.test(taskContent);
    const hasAction = /<action>/.test(taskContent);
    const hasVerify = /<verify>/.test(taskContent);
    const hasDone = /<done>/.test(taskContent);

    if (!nameMatch) errors.push('Task missing <name> element');
    if (!hasAction) errors.push(`Task '${taskName}' missing <action>`);
    if (!hasVerify) warnings.push(`Task '${taskName}' missing <verify>`);
    if (!hasDone) warnings.push(`Task '${taskName}' missing <done>`);
    if (!hasFiles) warnings.push(`Task '${taskName}' missing <files>`);

    tasks.push({ name: taskName, hasFiles, hasAction, hasVerify, hasDone });
  }

  if (tasks.length === 0) warnings.push('No <task> elements found');

  // Wave/depends_on consistency
  if (fm.wave && parseInt(String(fm.wave), 10) > 1 && (!fm.depends_on || (Array.isArray(fm.depends_on) && fm.depends_on.length === 0))) {
    warnings.push('Wave > 1 but depends_on is empty');
  }

  // Autonomous/checkpoint consistency
  const hasCheckpoints = /<task\s+type=["']?checkpoint/.test(content);
  if (hasCheckpoints && fm.autonomous !== 'false' && fm.autonomous !== false) {
    errors.push('Has checkpoint tasks but autonomous is not false');
  }

  return {
    data: {
      valid: errors.length === 0,
      errors,
      warnings,
      task_count: tasks.length,
      tasks,
      frontmatter_fields: Object.keys(fm),
    },
  };
};
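The non-greedy `<task>` scan used by `verifyPlanStructure` can be exercised in isolation. A minimal sketch of the same pattern (the `scanTasks` helper is illustrative, not part of the module):

```typescript
// Isolated sketch of the <task> scan: the non-greedy [\s\S]*? body match
// (T-12-03) keeps adjacent tasks separate instead of swallowing everything
// between the first <task> and the last </task>.
function scanTasks(content: string): Array<{ name: string; hasAction: boolean }> {
  const taskPattern = /<task[^>]*>([\s\S]*?)<\/task>/g;
  const tasks: Array<{ name: string; hasAction: boolean }> = [];
  let m: RegExpExecArray | null;
  while ((m = taskPattern.exec(content)) !== null) {
    const body = m[1];
    const nameMatch = body.match(/<name>([\s\S]*?)<\/name>/);
    tasks.push({
      name: nameMatch ? nameMatch[1].trim() : 'unnamed',
      hasAction: /<action>/.test(body),
    });
  }
  return tasks;
}
```

A greedy `[\s\S]*` would collapse two adjacent tasks into one match; the `?` is what makes the per-task warnings above possible.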
// ─── verifyPhaseCompleteness ───────────────────────────────────────────────

/**
 * Check phase completeness by matching PLAN files to SUMMARY files.
 *
 * Port of `cmdVerifyPhaseCompleteness` from `verify.cjs` lines 169-213.
 * Scans a phase directory for PLAN and SUMMARY files, identifies incomplete
 * plans (no summary) and orphan summaries (no plan).
 *
 * @param args - args[0]: phase number (required)
 * @param projectDir - Project root directory
 * @param workstream - Optional workstream used to scope the planning paths
 * @returns QueryResult with { complete, phase, plan_count, summary_count, incomplete_plans, orphan_summaries, errors, warnings }
 * @throws GSDError with Validation classification if phase number missing
 */
export const verifyPhaseCompleteness: QueryHandler = async (args, projectDir, workstream) => {
  const phase = args[0];
  if (!phase) {
    throw new GSDError('phase required', ErrorClassification.Validation);
  }

  const phasesDir = planningPaths(projectDir, workstream).phases;
  const normalized = normalizePhaseName(phase);

  // Find phase directory (mirror findPhase pattern from phase.ts)
  let phaseDir: string | null = null;
  let phaseNumber: string = normalized;
  try {
    const entries = await readdir(phasesDir, { withFileTypes: true });
    const dirs = entries
      .filter(e => e.isDirectory())
      .map(e => e.name)
      .sort();
    const match = dirs.find(d => phaseTokenMatches(d, normalized));
    if (match) {
      phaseDir = join(phasesDir, match);
      // Extract phase number from directory name
      const numMatch = match.match(/^(\d+[A-Z]?(?:\.\d+)*)/i);
      if (numMatch) phaseNumber = numMatch[1];
    }
  } catch { /* phases dir doesn't exist */ }

  if (!phaseDir) {
    return { data: { error: 'Phase not found', phase } };
  }

  const errors: string[] = [];
  const warnings: string[] = [];

  // List plans and summaries
  let files: string[];
  try {
    files = await readdir(phaseDir);
  } catch {
    return { data: { error: 'Cannot read phase directory' } };
  }

  const plans = files.filter(f => /-PLAN\.md$/i.test(f));
  const summaries = files.filter(f => /-SUMMARY\.md$/i.test(f));

  // Extract plan IDs (everything before -PLAN.md / -SUMMARY.md)
  const planIds = new Set(plans.map(p => p.replace(/-PLAN\.md$/i, '')));
  const summaryIds = new Set(summaries.map(s => s.replace(/-SUMMARY\.md$/i, '')));

  // Plans without summaries
  const incompletePlans = [...planIds].filter(id => !summaryIds.has(id));
  if (incompletePlans.length > 0) {
    errors.push(`Plans without summaries: ${incompletePlans.join(', ')}`);
  }

  // Summaries without plans (orphans)
  const orphanSummaries = [...summaryIds].filter(id => !planIds.has(id));
  if (orphanSummaries.length > 0) {
    warnings.push(`Summaries without plans: ${orphanSummaries.join(', ')}`);
  }

  return {
    data: {
      complete: errors.length === 0,
      phase: phaseNumber,
      plan_count: plans.length,
      summary_count: summaries.length,
      incomplete_plans: incompletePlans,
      orphan_summaries: orphanSummaries,
      errors,
      warnings,
    },
  };
};
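The ID-matching core of `verifyPhaseCompleteness` reduces to two set differences over the suffix-stripped filenames. A standalone sketch (the helper name is illustrative):

```typescript
// Strip the -PLAN.md / -SUMMARY.md suffixes and set-diff in both directions:
// plans with no summary are "incomplete"; summaries with no plan are "orphans".
function matchPlansToSummaries(files: string[]): { incomplete: string[]; orphans: string[] } {
  const planIds = new Set(
    files.filter(f => /-PLAN\.md$/i.test(f)).map(f => f.replace(/-PLAN\.md$/i, '')),
  );
  const summaryIds = new Set(
    files.filter(f => /-SUMMARY\.md$/i.test(f)).map(f => f.replace(/-SUMMARY\.md$/i, '')),
  );
  return {
    incomplete: [...planIds].filter(id => !summaryIds.has(id)),
    orphans: [...summaryIds].filter(id => !planIds.has(id)),
  };
}
```

Incomplete plans become errors (the phase is not complete); orphan summaries are only warnings.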
// ─── verifyArtifacts ───────────────────────────────────────────────────────

/**
 * Verify artifact file existence and content from must_haves.artifacts.
 *
 * Port of `cmdVerifyArtifacts` from `verify.cjs` lines 283-336.
 * Reads must_haves.artifacts from plan frontmatter and checks each artifact
 * for file existence, min_lines, contains, and exports.
 *
 * @param args - args[0]: plan file path (required)
 * @param projectDir - Project root directory
 * @returns QueryResult with { all_passed, passed, total, artifacts }
 * @throws GSDError with Validation classification if file path missing
 */
export const verifyArtifacts: QueryHandler = async (args, projectDir) => {
  const planFilePath = args[0];
  if (!planFilePath) {
    throw new GSDError('plan file path required', ErrorClassification.Validation);
  }

  // T-12-01: Null byte rejection on file paths
  if (planFilePath.includes('\0')) {
    throw new GSDError('file path contains null bytes', ErrorClassification.Validation);
  }

  const fullPath = isAbsolute(planFilePath) ? planFilePath : join(projectDir, planFilePath);

  let content: string;
  try {
    content = await readFile(fullPath, 'utf-8');
  } catch {
    return { data: { error: 'File not found', path: planFilePath } };
  }

  const { items: artifacts } = parseMustHavesBlock(content, 'artifacts');
  if (artifacts.length === 0) {
    return { data: { error: 'No must_haves.artifacts found in frontmatter', path: planFilePath } };
  }

  const results: Array<{ path: string; exists: boolean; issues: string[]; passed: boolean }> = [];

  for (const artifact of artifacts) {
    if (typeof artifact === 'string') continue; // skip simple string items
    const artObj = artifact as Record<string, unknown>;
    const artPath = artObj.path as string | undefined;
    if (!artPath) continue;

    const artFullPath = join(projectDir, artPath);
    let exists = false;
    let fileContent = '';

    try {
      fileContent = await readFile(artFullPath, 'utf-8');
      exists = true;
    } catch {
      // File doesn't exist
    }

    const check: { path: string; exists: boolean; issues: string[]; passed: boolean } = {
      path: artPath,
      exists,
      issues: [],
      passed: false,
    };

    if (exists) {
      const lineCount = fileContent.split('\n').length;

      if (artObj.min_lines && lineCount < (artObj.min_lines as number)) {
        check.issues.push(`Only ${lineCount} lines, need ${artObj.min_lines}`);
      }
      if (artObj.contains && !fileContent.includes(artObj.contains as string)) {
        check.issues.push(`Missing pattern: ${artObj.contains}`);
      }
      if (artObj.exports) {
        const exports = Array.isArray(artObj.exports) ? artObj.exports : [artObj.exports];
        for (const exp of exports) {
          if (!fileContent.includes(String(exp))) {
            check.issues.push(`Missing export: ${exp}`);
          }
        }
      }
      check.passed = check.issues.length === 0;
    } else {
      check.issues.push('File not found');
    }

    results.push(check);
  }

  const passed = results.filter(r => r.passed).length;
  return {
    data: {
      all_passed: results.length > 0 && passed === results.length,
      passed,
      total: results.length,
      artifacts: results,
    },
  };
};
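The per-artifact content checks (`min_lines`, `contains`, `exports`) are plain string inspections once the file has been read. A standalone sketch with a hypothetical helper name:

```typescript
// Content-only sketch of the per-artifact checks in verifyArtifacts.
// File I/O is factored out; the caller supplies the file content and spec.
function checkArtifactContent(
  content: string,
  spec: { min_lines?: number; contains?: string; exports?: string[] },
): string[] {
  const issues: string[] = [];
  const lineCount = content.split('\n').length;
  if (spec.min_lines && lineCount < spec.min_lines) {
    issues.push(`Only ${lineCount} lines, need ${spec.min_lines}`);
  }
  if (spec.contains && !content.includes(spec.contains)) {
    issues.push(`Missing pattern: ${spec.contains}`);
  }
  for (const exp of spec.exports ?? []) {
    // Substring match, mirroring the handler: crude but dependency-free.
    if (!content.includes(exp)) issues.push(`Missing export: ${exp}`);
  }
  return issues;
}
```

An empty issues list marks the artifact as passed; a missing file short-circuits to a single "File not found" issue before these checks run.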
// ─── verifyCommits ────────────────────────────────────────────────────────

/**
 * Verify that commit hashes referenced in SUMMARY.md files actually exist.
 *
 * Port of `cmdVerifyCommits` from `verify.cjs` lines 262-282.
 * Used by gsd-verifier agent to confirm commits mentioned in summaries
 * are real commits in the git history.
 *
 * @param args - One or more commit hashes
 * @param projectDir - Project root directory
 * @returns QueryResult with { all_valid, valid, invalid, total }
 */
export const verifyCommits: QueryHandler = async (args, projectDir) => {
  if (args.length === 0) {
    throw new GSDError('At least one commit hash required', ErrorClassification.Validation);
  }

  const { execGit } = await import('./commit.js');
  const valid: string[] = [];
  const invalid: string[] = [];

  for (const hash of args) {
    const result = execGit(projectDir, ['cat-file', '-t', hash]);
    if (result.exitCode === 0 && result.stdout.trim() === 'commit') {
      valid.push(hash);
    } else {
      invalid.push(hash);
    }
  }

  return {
    data: {
      all_valid: invalid.length === 0,
      valid,
      invalid,
      total: args.length,
    },
  };
};
// ─── verifyReferences ─────────────────────────────────────────────────────

/**
 * Verify that @-references and backtick file paths in a document resolve.
 *
 * Port of `cmdVerifyReferences` from `verify.cjs` lines 217-260.
 *
 * @param args - args[0]: file path (required)
 * @param projectDir - Project root directory
 * @returns QueryResult with { valid, found (count), missing, total }
 */
export const verifyReferences: QueryHandler = async (args, projectDir) => {
  const filePath = args[0];
  if (!filePath) {
    throw new GSDError('file path required', ErrorClassification.Validation);
  }

  const fullPath = isAbsolute(filePath) ? filePath : join(projectDir, filePath);

  let content: string;
  try {
    content = await readFile(fullPath, 'utf-8');
  } catch {
    return { data: { error: 'File not found', path: filePath } };
  }

  const found: string[] = [];
  const missing: string[] = [];

  const atRefs = content.match(/@([^\s\n,)]+\/[^\s\n,)]+)/g) || [];
  for (const ref of atRefs) {
    const cleanRef = ref.slice(1);
    const resolved = cleanRef.startsWith('~/')
      ? join(process.env.HOME || '', cleanRef.slice(2))
      : join(projectDir, cleanRef);
    if (existsSync(resolved)) {
      found.push(cleanRef);
    } else {
      missing.push(cleanRef);
    }
  }

  const backtickRefs = content.match(/`([^`]+\/[^`]+\.[a-zA-Z]{1,10})`/g) || [];
  for (const ref of backtickRefs) {
    const cleanRef = ref.slice(1, -1);
    if (cleanRef.startsWith('http') || cleanRef.includes('${') || cleanRef.includes('{{')) continue;
    if (found.includes(cleanRef) || missing.includes(cleanRef)) continue;
    const resolved = join(projectDir, cleanRef);
    if (existsSync(resolved)) {
      found.push(cleanRef);
    } else {
      missing.push(cleanRef);
    }
  }

  return {
    data: {
      valid: missing.length === 0,
      found: found.length,
      missing,
      total: found.length + missing.length,
    },
  };
};
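The @-reference scan can be isolated from the filesystem checks. A sketch of just the extraction step (the helper name is illustrative; the handler above additionally resolves `~/` against `HOME` and tests existence):

```typescript
// Extraction-only sketch of the @-reference scan in verifyReferences.
// A candidate must contain a slash, so bare "@mentions" are ignored;
// whitespace, commas, and closing parens terminate the path.
function extractAtRefs(content: string): string[] {
  const atRefs = content.match(/@([^\s\n,)]+\/[^\s\n,)]+)/g) || [];
  return atRefs.map(r => r.slice(1)); // drop the leading '@'
}
```

The slash requirement is what keeps email-style tokens and plain @-handles out of the missing-reference report.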
// ─── verifySummary ────────────────────────────────────────────────────────

/**
 * Verify a SUMMARY.md file: existence, file spot-checks, commit refs, self-check section.
 *
 * Port of `cmdVerifySummary` from verify.cjs lines 13-107.
 *
 * @param args - args[0]: summary path (required), args[1]: optional --check-count N (default 2)
 * @returns QueryResult with { passed, checks, errors }
 */
export const verifySummary: QueryHandler = async (args, projectDir) => {
  const summaryPath = args[0];
  if (!summaryPath) {
    throw new GSDError('summary-path required', ErrorClassification.Validation);
  }

  const checkCountIdx = args.indexOf('--check-count');
  const checkCount = checkCountIdx !== -1 ? parseInt(args[checkCountIdx + 1], 10) || 2 : 2;

  const fullPath = join(projectDir, summaryPath);

  if (!existsSync(fullPath)) {
    return {
      data: {
        passed: false,
        checks: {
          summary_exists: false,
          files_created: { checked: 0, found: 0, missing: [] },
          commits_exist: false,
          self_check: 'not_found',
        },
        errors: ['SUMMARY.md not found'],
      },
    };
  }

  const content = readFileSync(fullPath, 'utf-8');
  const errors: string[] = [];

  const mentionedFiles = new Set<string>();
  const patterns = [
    /`([^`]+\.[a-zA-Z]+)`/g,
    /(?:Created|Modified|Added|Updated|Edited):\s*`?([^\s`]+\.[a-zA-Z]+)`?/gi,
  ];
  for (const pattern of patterns) {
    let m;
    while ((m = pattern.exec(content)) !== null) {
      const filePath = m[1];
      if (filePath && !filePath.startsWith('http') && filePath.includes('/')) {
        mentionedFiles.add(filePath);
      }
    }
  }

  const filesToCheck = Array.from(mentionedFiles).slice(0, checkCount);
  const missing: string[] = [];
  for (const file of filesToCheck) {
    if (!existsSync(join(projectDir, file))) {
      missing.push(file);
    }
  }

  const { execGit } = await import('./commit.js');
  const commitHashPattern = /\b[0-9a-f]{7,40}\b/g;
  const hashes = content.match(commitHashPattern) || [];
  let commitsExist = false;
  for (const hash of hashes.slice(0, 3)) {
    const result = execGit(projectDir, ['cat-file', '-t', hash]);
    if (result.exitCode === 0 && result.stdout.trim() === 'commit') {
      commitsExist = true;
      break;
    }
  }

  let selfCheck = 'not_found';
  const selfCheckPattern = /##\s*(?:Self[- ]?Check|Verification|Quality Check)/i;
  if (selfCheckPattern.test(content)) {
    const passPattern = /(?:all\s+)?(?:pass|✓|✅|complete|succeeded)/i;
    const failPattern = /(?:fail|✗|❌|incomplete|blocked)/i;
    const checkSection = content.slice(content.search(selfCheckPattern));
    if (failPattern.test(checkSection)) {
      selfCheck = 'failed';
    } else if (passPattern.test(checkSection)) {
      selfCheck = 'passed';
    }
  }

  if (missing.length > 0) errors.push('Missing files: ' + missing.join(', '));
  if (!commitsExist && hashes.length > 0) errors.push('Referenced commit hashes not found in git history');
  if (selfCheck === 'failed') errors.push('Self-check section indicates failure');

  const passed = missing.length === 0 && selfCheck !== 'failed';
  return {
    data: {
      passed,
      checks: {
        summary_exists: true,
        files_created: { checked: filesToCheck.length, found: filesToCheck.length - missing.length, missing },
        commits_exist: commitsExist,
        self_check: selfCheck,
      },
      errors,
    },
  };
};
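The self-check classification above is a pure function of the document text: find the section, then let fail markers win over pass markers. A sketch with a hypothetical standalone name, reusing the same regexes as the handler:

```typescript
// Standalone sketch of the self-check classification in verifySummary.
// Fail markers take precedence so "incomplete" never reads as "complete".
function classifySelfCheck(content: string): 'passed' | 'failed' | 'not_found' {
  const sectionPattern = /##\s*(?:Self[- ]?Check|Verification|Quality Check)/i;
  if (!sectionPattern.test(content)) return 'not_found';
  const section = content.slice(content.search(sectionPattern));
  if (/(?:fail|✗|❌|incomplete|blocked)/i.test(section)) return 'failed';
  if (/(?:all\s+)?(?:pass|✓|✅|complete|succeeded)/i.test(section)) return 'passed';
  return 'not_found';
}
```

Checking fail before pass matters: "incomplete" contains "complete", so the reverse order would misclassify a failing summary as passing.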
// ─── verifyPathExists ─────────────────────────────────────────────────────

/**
 * Check file/directory existence and return type.
 *
 * Port of `cmdVerifyPathExists` from commands.cjs lines 111-132.
 *
 * @param args - args[0]: path to check (required)
 */
export const verifyPathExists: QueryHandler = async (args, projectDir) => {
  const targetPath = args[0];
  if (!targetPath) {
    throw new GSDError('path required for verification', ErrorClassification.Validation);
  }
  if (targetPath.includes('\0')) {
    throw new GSDError('path contains null bytes', ErrorClassification.Validation);
  }

  const fullPath = isAbsolute(targetPath) ? targetPath : join(projectDir, targetPath);

  try {
    const stats = statSync(fullPath);
    const type = stats.isDirectory() ? 'directory' : stats.isFile() ? 'file' : 'other';
    return { data: { exists: true, type } };
  } catch {
    return { data: { exists: false, type: null } };
  }
};
// ─── verifySchemaDrift ────────────────────────────────────────────────────

/**
 * Detect schema drift for a phase — port of `cmdVerifySchemaDrift` from verify.cjs lines 1013–1086.
 */
export const verifySchemaDrift: QueryHandler = async (args, projectDir, workstream) => {
  const phaseArg = args[0];
  const skipFlag = args.includes('--skip');

  if (!phaseArg) {
    throw new GSDError('Usage: verify schema-drift <phase> [--skip]', ErrorClassification.Validation);
  }

  const { checkSchemaDrift } = await import('./schema-detect.js');
  const { execGit } = await import('./commit.js');

  const phasesDir = planningPaths(projectDir, workstream).phases;
  if (!existsSync(phasesDir)) {
    return {
      data: {
        drift_detected: false,
        blocking: false,
        message: 'No phases directory',
      },
    };
  }

  const normalized = normalizePhaseName(phaseArg);
  const dirNames = readdirSync(phasesDir, { withFileTypes: true })
    .filter(e => e.isDirectory())
    .map(e => e.name)
    .sort((a, b) => comparePhaseNum(a, b));

  let phaseDirName = dirNames.find(d => phaseTokenMatches(d, normalized)) ?? null;
  if (!phaseDirName && /^[\d.]+/.test(phaseArg)) {
    const exact = join(phasesDir, phaseArg);
    if (existsSync(exact)) phaseDirName = phaseArg;
  }

  if (!phaseDirName) {
    return {
      data: {
        drift_detected: false,
        blocking: false,
        message: `Phase directory not found: ${phaseArg}`,
      },
    };
  }

  const phaseDir = join(phasesDir, phaseDirName);

  function filesModifiedFromFrontmatter(fm: Record<string, unknown>): string[] {
    const v = fm.files_modified;
    if (Array.isArray(v)) return v.map(x => String(x).trim()).filter(Boolean);
    if (typeof v === 'string') {
      const t = v.trim();
      return t ? [t] : [];
    }
    return [];
  }

  const allFiles: string[] = [];
  const planFiles = readdirSync(phaseDir).filter(f => f.endsWith('-PLAN.md') || f === 'PLAN.md');
  for (const pf of planFiles) {
    const content = readFileSync(join(phaseDir, pf), 'utf-8');
    const fm = extractFrontmatter(content) as Record<string, unknown>;
    allFiles.push(...filesModifiedFromFrontmatter(fm));
  }

  let executionLog = '';
  const summaryFiles = readdirSync(phaseDir).filter(f => f.endsWith('-SUMMARY.md'));
  for (const sf of summaryFiles) {
    executionLog += readFileSync(join(phaseDir, sf), 'utf-8') + '\n';
  }

  const gitLog = execGit(projectDir, ['log', '--oneline', '--all', '-50']);
  if (gitLog.exitCode === 0) {
    executionLog += '\n' + gitLog.stdout;
  }

  const result = checkSchemaDrift(allFiles, executionLog, { skipCheck: !!skipFlag });

  return {
    data: {
      drift_detected: result.driftDetected,
      blocking: result.blocking,
      schema_files: result.schemaFiles,
      orms: result.orms,
      unpushed_orms: result.unpushedOrms,
      message: result.message,
      skipped: result.skipped || false,
    },
  };
};
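The `files_modified` normalization inside `verifySchemaDrift` is a pure function and can be lifted out for testing. A standalone copy under a hypothetical name — frontmatter may carry an array, a scalar string, or nothing:

```typescript
// Standalone copy of the files_modified normalization used by
// verifySchemaDrift: arrays are trimmed and filtered, scalars are wrapped,
// anything else yields an empty list.
function filesModifiedFrom(fm: Record<string, unknown>): string[] {
  const v = fm.files_modified;
  if (Array.isArray(v)) return v.map(x => String(x).trim()).filter(Boolean);
  if (typeof v === 'string') {
    const t = v.trim();
    return t ? [t] : [];
  }
  return [];
}
```

Normalizing here keeps `checkSchemaDrift` free of YAML-shape special cases.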
/**
 * verify.codebase-drift — structural drift detector (#2003).
 *
 * Non-blocking by contract: every failure mode returns a successful response
 * with `{ skipped: true, reason }`. The post-execute drift gate in
 * `/gsd:execute-phase` relies on this guarantee.
 *
 * Delegates to the Node-side implementation in `bin/lib/drift.cjs` and
 * `bin/lib/verify.cjs` via a child process so the drift logic stays in one
 * canonical place (see `cmdVerifyCodebaseDrift`).
 */
export const verifyCodebaseDrift: QueryHandler = async (_args, projectDir) => {
  try {
    const { execFileSync } = await import('node:child_process');
    const { fileURLToPath } = await import('node:url');
    const { dirname, resolve } = await import('node:path');
    const here = typeof __dirname === 'string'
      ? __dirname
      : dirname(fileURLToPath(import.meta.url));
    // sdk/src/query -> ../../../get-shit-done/bin/gsd-tools.cjs
    // sdk/dist/query -> ../../../get-shit-done/bin/gsd-tools.cjs
    const toolsPath = resolve(here, '..', '..', '..', 'get-shit-done', 'bin', 'gsd-tools.cjs');
    const out = execFileSync(process.execPath, [toolsPath, 'verify', 'codebase-drift'], {
      cwd: projectDir,
      encoding: 'utf-8',
      stdio: ['pipe', 'pipe', 'pipe'],
    }).trim();
    try {
      return { data: JSON.parse(out) };
    } catch {
      return {
        data: {
          skipped: true,
          reason: 'sdk-parse-failed',
          action_required: false,
          directive: 'none',
          elements: [],
        },
      };
    }
  } catch (err) {
    return {
      data: {
        skipped: true,
        reason: 'sdk-exception: ' + (err instanceof Error ? err.message : String(err)),
        action_required: false,
        directive: 'none',
        elements: [],
      },
    };
  }
};
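On the consumer side, the non-blocking contract means any `skipped` payload degrades to a no-op. A hedged sketch of how a caller might honor it — the `DriftResult` shape and `driftDirective` helper are assumptions inferred from the skipped payloads above, not part of the SDK:

```typescript
// Hypothetical consumer of verifyCodebaseDrift's result: skipped results
// and action_required=false both collapse to 'none', so the drift gate can
// never fail a phase regardless of which of the four layers degraded.
interface DriftResult {
  skipped?: boolean;
  action_required?: boolean;
  directive?: string;
  elements?: string[];
}

function driftDirective(result: DriftResult): string {
  if (result.skipped) return 'none';          // any failure mode is a no-op
  if (!result.action_required) return 'none'; // drift below threshold
  return result.directive ?? 'none';
}
```

This mirrors the "four layers of non-blocking" design: every layer that fails still hands the workflow something this function maps to 'none'.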