mirror of
https://github.com/glittercowboy/get-shit-done
synced 2026-04-25 17:25:23 +02:00
feat: add /gsd:docs-update command for verified documentation generation (#1532)
* docs(01-02): complete gsd-doc-writer agent skeleton plan
- SUMMARY.md for plan 01-02
- STATE.md advanced to plan 2/2, progress 50%
- ROADMAP.md updated with phase 1 plan progress
- REQUIREMENTS.md marked DOCG-01 and DOCG-08 complete
* feat(01-01): create lib/docs.cjs with cmdDocsInit and detection helpers
- Add cmdDocsInit following cmdInitMapCodebase pattern
- Add hasGsdMarker(), scanExistingDocs(), detectProjectType()
- Add detectDocTooling(), detectMonorepoWorkspaces() private helpers
- GSD_MARKER constant for generated-by tracking
- Only Node.js built-ins and local lib requires used
* feat(01-01): wire docs-init into gsd-tools.cjs and register gsd-doc-writer model profile
- Add const docs = require('./lib/docs.cjs') to gsd-tools.cjs
- Add case 'docs-init' routing to docs.cmdDocsInit
- Add docs-init to help text and JSDoc header
- Register gsd-doc-writer in MODEL_PROFILES (quality:opus, balanced:sonnet, budget:haiku)
- Fix docs.cjs: inline withProjectRoot logic via checkAgentsInstalled (private in init.cjs)
* docs(01-01): complete docs-init command plan
- SUMMARY.md documenting cmdDocsInit, detection helpers, wiring
- STATE.md advanced, progress updated to 100%
- ROADMAP.md phase 1 marked Complete
- REQUIREMENTS.md INFRA-01, INFRA-02, CONS-03 marked complete
* feat(01-02): create gsd-doc-writer agent skeleton
- YAML frontmatter with name, description, tools, color: purple
- role block with doc_assignment receiving convention
- create_mode and update_mode sections
- 9 stub template sections (readme, architecture, getting_started, development, testing, api, configuration, deployment, contributing)
- Each template has Required Sections list and Phase 3 TODO
- critical_rules prohibiting GSD methodology and CHANGELOG
- success_criteria checklist
- No GSD methodology leaks in template sections
* feat(02-01): add docs-update workflow Steps 1-6 — init, validate, classify, route, resolve, detect
- init_context step calling docs-init with @file: handling and agent-skills loading
- validate_agents step warns on missing gsd-doc-writer without halting
- classify_project step maps project_type signals to 5 primary labels plus conditional docs
- build_doc_queue step with always-on 6 docs and conditional API/CONTRIBUTING/DEPLOYMENT routing
- resolve_modes step with doc-type to canonical path mapping and create/update detection
- detect_runtime_capabilities step with Task tool detection and sequential fallback routing
* docs(02-01): complete docs-update workflow plan — 13-step orchestration for parallel doc generation
- 02-01-SUMMARY.md: plan results, decisions, file inventory
- STATE.md: advanced to last plan, progress 100%, decisions recorded
- ROADMAP.md: Phase 2 marked Complete (1/1 plans with summary)
- REQUIREMENTS.md: marked INFRA-04, DOCG-03, DOCG-04, CONS-01, CONS-02, CONS-04 complete
* docs(03-02): complete command entry point and workflow extension plan
- 03-02-SUMMARY.md: plan results, decisions, file inventory
- STATE.md: advanced to plan 2, progress 100%, decisions recorded
- ROADMAP.md: Phase 3 marked Complete (2/2 plans with summaries)
- REQUIREMENTS.md: marked INFRA-03, EXIST-01, EXIST-02, EXIST-04 complete
* feat(03-01): fill all 9 doc templates, add supplement mode and per-package README template
- Replace all 9 template stubs with full content guidance (Required Sections, Content Discovery, Format Notes)
- Add shared doc_tooling_guidance block for Docusaurus, VitePress, MkDocs, Storybook routing
- Add supplement_mode block: append-only strategy with heading comparison and safety rules
- Add template_readme_per_package for monorepo per-package README generation
- Update role block to list supplement as third mode; add rule 7 to critical_rules
- Add supplement mode check to success_criteria
- Remove all Phase 3 TODO stubs and placeholder comments
* feat(03-02): add docs-update command entry point with --force and --verify-only flags
- YAML frontmatter with name, argument-hint, allowed-tools
- objective block documents flag semantics with literal-token enforcement pattern
- execution_context references docs-update.md workflow
- context block passes $ARGUMENTS and documents flag derivation rules
- --force takes precedence over --verify-only when both present
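The flag-derivation rule can be sketched as follows (a hypothetical helper for illustration, not the command's actual parser):

```javascript
// Derives flag state from the raw $ARGUMENTS string.
// --force takes precedence: when both flags appear, verify_only is false.
function deriveFlags(args) {
  const tokens = args.split(/\s+/).filter(Boolean);
  const force = tokens.includes('--force');
  const verifyOnly = tokens.includes('--verify-only');
  return {
    force,
    verify_only: verifyOnly && !force,
  };
}
```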
* feat(03-02): extend docs-update workflow with preservation_check, monorepo dispatch, and verify-only
- preservation_check step between resolve_modes and detect_runtime_capabilities
- preservation_check skips on --force, --verify-only, or no hand-written docs
- per-file AskUserQuestion choice: preserve/supplement/regenerate with fallback default to preserve
- dispatch_monorepo_packages step after collect_wave_2 for per-package READMEs
- verify_only_report early-exit step with VERIFY marker count and Phase 4 deferral message
- preservation_mode field added to all doc_assignment blocks in dispatch_wave_1, dispatch_wave_2
- sequential_generation extended with monorepo per-package section
- commit_docs updated to include per-package README files pattern
- report extended with per-package README rows and preservation decisions
- success_criteria updated with preservation, --force, --verify-only, and monorepo checks
* feat(04-01): create gsd-doc-verifier agent with claim extraction and filesystem verification
- YAML frontmatter with name, description, tools, and color fields
- claim_extraction section with 5 categories: file paths, commands, API endpoints, functions, dependencies
- skip_rules section for VERIFY markers, placeholders, example prefixes, and diff blocks
- verification_process with 6 steps using filesystem tools only (no self-consistency checks)
- output_format with exact JSON shape per D-01
- critical_rules enforcing filesystem-only verification and read-only operation
* feat(04-01): add fix_mode to gsd-doc-writer with surgical correction instructions
- Add fix_mode section after supplement_mode in modes block
- Document fix mode as valid option in role block mode list
- Add failures field to doc_assignment fields (fix mode only)
- fix_mode enforces surgical precision: only correct listed failing lines
- VERIFY marker fallback when correct value cannot be determined
* test(04-03): add docs-init integration test suite
- 13 tests across 4 describe blocks covering JSON output shape, project type
detection, existing doc scanning, GSD marker detection, and doc tooling
- Tests use node:test + node:assert/strict with beforeEach/afterEach lifecycle
- All 13 tests pass with `node --test tests/docs-update.test.cjs`
* feat(04-02): add verify_docs, fix_loop, scan_for_secrets steps to docs-update workflow
- verify_docs step spawns gsd-doc-verifier per generated doc and collects structured JSON results
- fix_loop step bounded at 2 iterations with regression detection (D-05/D-06)
- scan_for_secrets step uses exact map-codebase grep pattern before commit (D-07/D-08)
- verify_only_report updated to invoke real gsd-doc-verifier instead of VERIFY marker count stub
- success_criteria updated with 4 new verification gate checklist items
* docs(04-02): complete verification gate workflow steps plan
- SUMMARY.md: verify_docs, fix_loop, scan_for_secrets, and updated verify_only_report
- STATE.md: advanced to ready_for_verification, 100% progress, decisions logged
- ROADMAP.md: phase 4 marked Complete (3/3 plans with SUMMARYs)
- REQUIREMENTS.md: VERF-01, VERF-02, VERF-03 all marked complete
* refactor(profiles): add gsd-doc-verifier to MODEL_PROFILES
* feat(agents): Add critical rules for file creation and update install test
* docs(05): create phase plan for docs output refinement
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* feat(05-01): make scanExistingDocs recursive into docs/ subdirectories
- Replace flat docs/ scan with recursive walkDir helper (MAX_DEPTH=4)
- Add SKIP_DIRS filtering at every level of recursive walk
- Add fallback to documentation/ or doc/ when docs/ does not exist
- Update JSDoc to reflect recursive scanning behavior
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* feat(05-01): update gsd-doc-writer default path guidance to docs/
- Change "No tooling detected" guidance to default to docs/ directory
- Add README.md and CONTRIBUTING.md as root-level exceptions
- Add instruction to create docs/ directory if it does not exist
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* feat(05-02): invert path table to default docs to docs/ directory
- Invert resolve_modes path table: docs/ is primary for all types except readme and contributing
- Add mkdir -p docs/ instruction before agent dispatch
- Update all downstream path references: collect_wave_1, collect_wave_2, commit_docs, report, verify tables
- Update sequential_generation wave_1_outputs and resolved path references
- Update success criteria and verify_only_report examples to use docs/ paths
* feat(05-02): add CONTRIBUTING confirmation gate and existing doc review queue
- Add CONTRIBUTING.md user confirmation prompt in build_doc_queue (skipped with --force or when file exists)
- Add review_queue for non-canonical existing docs (verification only, not rewriting)
- Add review_queue verification in verify_docs step with fix_loop exclusion
- Add existing doc accuracy review section to report step with manual correction guidance
* docs(05-02): complete path table inversion and doc queue improvements plan
- Add 05-02-SUMMARY.md with execution results
- Update STATE.md with position, decisions, and metrics
- Update ROADMAP.md with phase 05 plan progress
* fix(05): replace plain text y/n prompts with AskUserQuestion in docs-update workflow
Three prompts were using plain text (y/n) instead of GSD's standard
AskUserQuestion pattern: CONTRIBUTING.md confirmation, doc queue
proceed gate, and secrets scan confirmation.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* feat(05): structure-aware paths, non-canonical doc fixes, and gap detection
- resolve_modes now inspects existing doc directory structure and places
new docs in matching subdirectories (e.g., docs/architecture/ if that
pattern exists), instead of dumping everything flat into docs/
- Non-canonical docs with inaccuracies are now sent to gsd-doc-writer
in fix mode for surgical corrections, not just reported
- Added documentation gap detection step that scans the codebase for
undocumented areas and prompts user to create missing docs
- Added type: custom support to gsd-doc-writer with template_custom
section for gap-detected documentation
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* fix(05): smarter structure-aware path resolution for grouped doc directories
When a project uses grouped subdirectories (docs/architecture/,
docs/api/, docs/guides/), ALL canonical docs must be placed in
appropriate groups — none left flat in docs/. Added resolution
chain per doc type with fallback creation. Filenames now match
existing naming style (lowercase-kebab vs UPPERCASE). Queue
presentation shows actual resolved paths, not defaults.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
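One way the filename-style matching could work is a majority vote over existing names (a hypothetical sketch; the workflow's actual heuristic is not spelled out in this commit):

```javascript
// Infers whether existing doc filenames favor UPPERCASE or
// lowercase-kebab, then formats a new doc's filename to match.
function matchNamingStyle(existingNames, docType) {
  const upper = existingNames.filter(n => /^[A-Z0-9_-]+\.md$/.test(n)).length;
  const lower = existingNames.length - upper;
  return upper > lower
    ? `${docType.toUpperCase()}.md`
    : `${docType.toLowerCase().replace(/_/g, '-')}.md`;
}
```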
* fix(05): restore mode resolution table as primary queue presentation
The table showing resolved paths, modes, and sources for each doc
must be displayed before the proceed/abort confirmation. It was
replaced by a simple list — now restored as the canonical queue view.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* fix(05): use table format for existing docs review queue presentation
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* feat(05): add work manifest for structured handoffs between workflow steps
Root cause from smoke test: orchestrator forgot to verify 45 non-canonical
docs because the review_queue had no structural scaffolding — it existed
only in orchestrator memory. Fix:
1. Write docs-work-manifest.json to .planning/tmp/ after resolve_modes
with all canonical_queue, review_queue, and gap_queue items
2. Every subsequent step (dispatch, collect, verify, fix_loop, report)
MUST read the manifest first — single source of truth
3. Restructured verify_docs into explicit Phase 1 (canonical) and
Phase 2 (non-canonical) with separate dispatch for each
4. Both queues now eligible for fix_loop corrections
5. Added manifest read instructions to all dispatch/collect steps
Follows the same pattern as execute-phase's phase-plan-index for
tracking work items across multi-step orchestration.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* docs(05): update workflow purpose to reflect full command scope
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* refactor(05): remove redundant steps from docs-update workflow
- Remove validate_agents step (if command is available, agents are installed)
- Remove agents_installed/missing_agents extraction from init_context
- Remove available_agent_types block (agent types specified in each Task call)
- Remove detect_runtime_capabilities step (runtime knows its own tools)
- Replace hardcoded flat paths in collect_wave_1/2 with manifest resolved_paths
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
* fix(05): restore available_agent_types section required by test suite
Test enforces that workflows spawning named agents must declare them
in an <available_agent_types> block. Added back with both gsd-doc-writer
and gsd-doc-verifier listed.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
---------
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
This commit is contained in:
tests/docs-update.test.cjs (272 lines, new file)
@@ -0,0 +1,272 @@
/**
 * GSD Tools Tests - docs-update
 *
 * Integration tests for the docs-init gsd-tools subcommand.
 * Covers: JSON output shape, project type detection, existing doc scanning,
 * GSD marker detection, and doc tooling detection.
 *
 * Requirements: VERF-03
 */

const { test, describe, beforeEach, afterEach } = require('node:test');
const assert = require('node:assert/strict');
const fs = require('fs');
const path = require('path');
const { runGsdTools, createTempProject, cleanup } = require('./helpers.cjs');

// ─── JSON output shape ────────────────────────────────────────────────────────

describe('docs-init command', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempProject();
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('returns expected JSON shape', () => {
    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);

    // Top-level scalar fields
    assert.strictEqual(typeof data.doc_writer_model, 'string');
    assert.strictEqual(typeof data.commit_docs, 'boolean');
    assert.strictEqual(typeof data.planning_exists, 'boolean');
    assert.strictEqual(typeof data.project_root, 'string');
    assert.strictEqual(typeof data.agents_installed, 'boolean');

    // Array fields
    assert.ok(Array.isArray(data.existing_docs), 'existing_docs should be an array');
    assert.ok(Array.isArray(data.monorepo_workspaces), 'monorepo_workspaces should be an array');
    assert.ok(Array.isArray(data.missing_agents), 'missing_agents should be an array');

    // project_type object with 7 boolean fields
    assert.ok(data.project_type && typeof data.project_type === 'object', 'project_type should be an object');
    assert.strictEqual(typeof data.project_type.has_package_json, 'boolean');
    assert.strictEqual(typeof data.project_type.has_api_routes, 'boolean');
    assert.strictEqual(typeof data.project_type.has_cli_bin, 'boolean');
    assert.strictEqual(typeof data.project_type.is_open_source, 'boolean');
    assert.strictEqual(typeof data.project_type.has_deploy_config, 'boolean');
    assert.strictEqual(typeof data.project_type.is_monorepo, 'boolean');
    assert.strictEqual(typeof data.project_type.has_tests, 'boolean');

    // doc_tooling object with 4 boolean fields
    assert.ok(data.doc_tooling && typeof data.doc_tooling === 'object', 'doc_tooling should be an object');
    assert.strictEqual(typeof data.doc_tooling.docusaurus, 'boolean');
    assert.strictEqual(typeof data.doc_tooling.vitepress, 'boolean');
    assert.strictEqual(typeof data.doc_tooling.mkdocs, 'boolean');
    assert.strictEqual(typeof data.doc_tooling.storybook, 'boolean');

    // planning_exists is true since createTempProject creates .planning/
    assert.strictEqual(data.planning_exists, true);
  });

  test('bare project returns all false signals', () => {
    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);

    // All project_type fields should be false for a bare project
    assert.strictEqual(data.project_type.has_package_json, false);
    assert.strictEqual(data.project_type.has_api_routes, false);
    assert.strictEqual(data.project_type.has_cli_bin, false);
    assert.strictEqual(data.project_type.is_open_source, false);
    assert.strictEqual(data.project_type.has_deploy_config, false);
    assert.strictEqual(data.project_type.is_monorepo, false);
    assert.strictEqual(data.project_type.has_tests, false);

    // No docs, no workspaces, no doc tooling
    assert.deepEqual(data.existing_docs, []);
    assert.deepEqual(data.monorepo_workspaces, []);
    assert.strictEqual(data.doc_tooling.docusaurus, false);
    assert.strictEqual(data.doc_tooling.vitepress, false);
    assert.strictEqual(data.doc_tooling.mkdocs, false);
    assert.strictEqual(data.doc_tooling.storybook, false);
  });
});

// ─── project type detection ───────────────────────────────────────────────────

describe('project type detection', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempProject();
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('detects CLI tool from package.json bin field', () => {
    fs.writeFileSync(
      path.join(tmpDir, 'package.json'),
      JSON.stringify({ name: 'my-cli', bin: { mycli: 'bin/cli.js' } }),
      'utf-8'
    );

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.has_cli_bin, true);
    assert.strictEqual(data.project_type.has_package_json, true);
  });

  test('detects open source from LICENSE file', () => {
    fs.writeFileSync(path.join(tmpDir, 'LICENSE'), 'MIT License', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.is_open_source, true);
  });

  test('detects monorepo from package.json workspaces', () => {
    fs.writeFileSync(
      path.join(tmpDir, 'package.json'),
      JSON.stringify({ name: 'mono', workspaces: ['packages/*'] }),
      'utf-8'
    );

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.is_monorepo, true);
    assert.ok(data.monorepo_workspaces.includes('packages/*'), 'monorepo_workspaces should contain packages/*');
  });

  test('detects tests from tests directory', () => {
    fs.mkdirSync(path.join(tmpDir, 'tests'), { recursive: true });

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.has_tests, true);
  });

  test('detects deploy config from Dockerfile', () => {
    fs.writeFileSync(path.join(tmpDir, 'Dockerfile'), 'FROM node:20', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.has_deploy_config, true);
  });

  test('detects API routes from src/app/api directory', () => {
    fs.mkdirSync(path.join(tmpDir, 'src', 'app', 'api'), { recursive: true });

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.project_type.has_api_routes, true);
  });
});

// ─── existing doc scanning ────────────────────────────────────────────────────

describe('existing doc scanning', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempProject();
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('scans .md files in project root', () => {
    fs.writeFileSync(path.join(tmpDir, 'README.md'), '# README\n', 'utf-8');
    fs.writeFileSync(path.join(tmpDir, 'ARCHITECTURE.md'), '# Architecture\n', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.ok(data.existing_docs.length >= 2, 'existing_docs should contain at least 2 entries');

    const paths = data.existing_docs.map(d => d.path);
    assert.ok(paths.includes('README.md'), 'existing_docs should contain README.md');
    assert.ok(paths.includes('ARCHITECTURE.md'), 'existing_docs should contain ARCHITECTURE.md');
  });

  test('detects GSD marker in existing docs', () => {
    fs.writeFileSync(
      path.join(tmpDir, 'README.md'),
      '<!-- generated-by: gsd-doc-writer -->\n# README\n',
      'utf-8'
    );
    fs.writeFileSync(path.join(tmpDir, 'NOTES.md'), '# Notes\n', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);

    const readmeEntry = data.existing_docs.find(d => d.path === 'README.md');
    assert.ok(readmeEntry, 'README.md should appear in existing_docs');
    assert.strictEqual(readmeEntry.has_gsd_marker, true, 'README.md should have GSD marker');

    const notesEntry = data.existing_docs.find(d => d.path === 'NOTES.md');
    assert.ok(notesEntry, 'NOTES.md should appear in existing_docs');
    assert.strictEqual(notesEntry.has_gsd_marker, false, 'NOTES.md should not have GSD marker');
  });
});

// ─── doc tooling detection ────────────────────────────────────────────────────

describe('doc tooling detection', () => {
  let tmpDir;

  beforeEach(() => {
    tmpDir = createTempProject();
  });

  afterEach(() => {
    cleanup(tmpDir);
  });

  test('detects Docusaurus config', () => {
    fs.writeFileSync(path.join(tmpDir, 'docusaurus.config.js'), 'module.exports = {};', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.doc_tooling.docusaurus, true);
  });

  test('detects VitePress config', () => {
    fs.mkdirSync(path.join(tmpDir, '.vitepress'), { recursive: true });
    fs.writeFileSync(path.join(tmpDir, '.vitepress', 'config.ts'), 'export default {};', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.doc_tooling.vitepress, true);
  });

  test('detects MkDocs config', () => {
    fs.writeFileSync(path.join(tmpDir, 'mkdocs.yml'), 'site_name: test', 'utf-8');

    const result = runGsdTools(['docs-init'], tmpDir);
    assert.ok(result.success, `Command failed: ${result.error}`);

    const data = JSON.parse(result.output);
    assert.strictEqual(data.doc_tooling.mkdocs, true);
  });
});