mirror of
https://github.com/thedotmack/claude-mem
synced 2026-04-25 17:15:04 +02:00
refactor: decompose monolithic services into modular architecture (#534)
* docs: add monolith refactor report with system breakdown

  Comprehensive analysis of the codebase identifying:
  - 14 files over 500 lines requiring refactoring
  - 3 critical monoliths (SessionStore, SearchManager, worker-service)
  - 80% code duplication across agent files
  - 5-phase refactoring roadmap with domain-based architecture

* fix: prevent memory_session_id from equaling content_session_id

  The bug: memory_session_id was initialized to contentSessionId as a
  "placeholder for FK purposes". This caused the SDK resume logic to inject
  memory agent messages into the USER's Claude Code transcript, corrupting
  their conversation history.

  Root cause:
  - SessionStore.createSDKSession initialized memory_session_id = contentSessionId
  - SDKAgent checked memorySessionId !== contentSessionId, but this check only
    worked if the session was fetched fresh from the DB

  The fix:
  - SessionStore: initialize memory_session_id as NULL, not contentSessionId
  - SDKAgent: simple truthy check !!session.memorySessionId (NULL = fresh start)
  - Database migration: ran UPDATE to set memory_session_id = NULL for 1807
    existing sessions that had the bug

  Also adds [ALIGNMENT] logging across the session lifecycle to help debug
  session continuity issues:
  - Hook entry: contentSessionId + promptNumber
  - DB lookup: contentSessionId → memorySessionId mapping proof
  - Resume decision: shows which memorySessionId will be used for resume
  - Capture: logs when memorySessionId is captured from first SDK response

  UI: added an "Alignment" quick filter button in LogsModal to show only
  alignment logs for debugging session continuity.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor: improve error handling in worker-service.ts

  - Fix GENERIC_CATCH anti-patterns by logging full error objects instead of
    just messages
  - Add [ANTI-PATTERN IGNORED] markers for legitimate cases (cleanup, hot paths)
  - Simplify error handling comments to be more concise
  - Improve httpShutdown() error discrimination for ECONNREFUSED
  - Reduce LARGE_TRY_BLOCK issues in initialization code

  Part of anti-pattern cleanup plan (132 total issues)

* refactor: improve error logging in SearchManager.ts

  - Pass full error objects to logger instead of just error.message
  - Fixes PARTIAL_ERROR_LOGGING anti-patterns (10 instances)
  - Better debugging visibility when Chroma queries fail

  Part of anti-pattern cleanup (133 remaining)

* refactor: improve error logging across SessionStore and mcp-server

  - SessionStore.ts: fix error logging in column rename utility
  - mcp-server.ts: log full error objects instead of just error.message
  - Improve error handling in Worker API calls and tool execution

  Part of anti-pattern cleanup (133 remaining)

* Refactor hooks to streamline error handling and loading states

  - Simplified error handling in useContextPreview by removing try-catch and
    directly checking response status.
  - Refactored usePagination to eliminate try-catch, improving readability and
    maintaining error handling through response checks.
  - Cleaned up useSSE by removing unnecessary try-catch around JSON parsing,
    ensuring clarity in message handling.
  - Enhanced useSettings by streamlining the saving process, removing
    try-catch, and directly checking the result for success.

* refactor: add error handling back to SearchManager Chroma calls

  - Wrap queryChroma calls in try-catch to prevent generator crashes
  - Log Chroma errors as warnings and fall back gracefully
  - Fixes generator failures when Chroma has issues
  - Part of anti-pattern cleanup recovery

* feat: add generator failure investigation report and observation duplication
  regression report

  - Created a comprehensive investigation report detailing the root cause of
    generator failures during anti-pattern cleanup, including the impact,
    investigation process, and implemented fixes.
  - Documented the critical regression causing observation duplication due to
    race conditions in the SDK agent, outlining symptoms, root cause analysis,
    and proposed fixes.

* fix: address PR #528 review comments - atomic cleanup and detector improvements

  This commit addresses critical review feedback from PR #528:

  ## 1. Atomic Message Cleanup (Fix Race Condition)

  **Problem**: The SessionRoutes.ts generator error handler had a race condition
  - Queried messages, then marked them failed in a loop
  - A crash during the loop → partial marking → inconsistent state

  **Solution**:
  - Added `markSessionMessagesFailed()` to PendingMessageStore.ts
  - Single atomic UPDATE statement replaces the loop
  - Follows existing pattern from `resetProcessingToPending()`

  **Files**:
  - src/services/sqlite/PendingMessageStore.ts (new method)
  - src/services/worker/http/routes/SessionRoutes.ts (use new method)

  ## 2. Anti-Pattern Detector Improvements

  **Problem**: The detector didn't recognize the logger.failure() method
  - Lines 212 & 335 already included "failure"
  - Lines 112-113 (PARTIAL_ERROR_LOGGING detection) did not

  **Solution**: Updated regex patterns to include "failure" for consistency

  **Files**:
  - scripts/anti-pattern-test/detect-error-handling-antipatterns.ts

  ## 3. Documentation

  **PR Comment**: Added clarification on the memory_session_id fix location
  - Points to SessionStore.ts:1155
  - Explains why NULL initialization prevents the message injection bug

  ## Review Response

  Addresses "Must Address Before Merge" items from review:
  ✅ Clarified memory_session_id bug fix location (via PR comment)
  ✅ Made generator error handler message cleanup atomic
  ❌ Deferred comprehensive test suite to follow-up PR (keeps PR focused)

  ## Testing

  - Build passes with no errors
  - Anti-pattern detector runs successfully
  - Atomic cleanup follows proven pattern from existing methods

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: FOREIGN KEY constraint and missing failed_at_epoch column

  Two critical bugs fixed:

  1. Missing failed_at_epoch column in pending_messages table
     - Added migration 20 to create the column
     - Fixes error when trying to mark messages as failed

  2. FOREIGN KEY constraint failed when storing observations
     - All three agents (SDK, Gemini, OpenRouter) were passing
       session.contentSessionId instead of session.memorySessionId
     - storeObservationsAndMarkComplete expects memorySessionId
     - Added null check and clear error message

  However, observations were still not saving - see the investigation report.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* Refactor hook input parsing to improve error handling

  - Added a nested try-catch block in new-hook.ts, save-hook.ts, and
    summary-hook.ts to handle JSON parsing errors more gracefully.
  - Replaced direct error throwing with logging of the error details using
    logger.error.
  - Ensured that the process exits cleanly after handling input in all three
    hooks.

* docs: update monolith report post session-logging merge

  - SessionStore grew to 2,011 lines (49 methods) - highest priority
  - SearchManager reduced to 1,778 lines (improved)
  - Agent files reduced by ~45 lines combined
  - Added trend indicators and post-merge observations
  - Core refactoring proposal remains valid

* refactor(sqlite): decompose SessionStore into modular architecture

  Extract the 2,011-line SessionStore.ts monolith into focused,
  single-responsibility modules following a grep-optimized progressive
  disclosure pattern.

  New module structure:
  - sessions/ - Session creation and retrieval (create.ts, get.ts, types.ts)
  - observations/ - Observation storage and queries (store.ts, get.ts,
    recent.ts, files.ts, types.ts)
  - summaries/ - Summary storage and queries (store.ts, get.ts, recent.ts,
    types.ts)
  - prompts/ - User prompt management (store.ts, get.ts, types.ts)
  - timeline/ - Cross-entity timeline queries (queries.ts)
  - import/ - Bulk import operations (bulk.ts)
  - migrations/ - Database migrations (runner.ts)

  New coordinator files:
  - Database.ts - ClaudeMemDatabase class with re-exports
  - transactions.ts - Atomic cross-entity transactions
  - Named re-export facades (Sessions.ts, Observations.ts, etc.)

  Key design decisions:
  - All functions take `db: Database` as first parameter (functional style)
  - Named re-exports instead of index.ts for grep-friendliness
  - SessionStore retained as backward-compatible wrapper
  - Target file size: 50-150 lines (60% compliance)

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(agents): extract shared logic into modular architecture

  Consolidate duplicate code across SDKAgent, GeminiAgent, and OpenRouterAgent
  into focused utility modules. Total reduction: 500 lines (29%).

  New modules in src/services/worker/agents/:
  - ResponseProcessor.ts: atomic DB transactions, Chroma sync, SSE broadcast
  - ObservationBroadcaster.ts: SSE event formatting and dispatch
  - SessionCleanupHelper.ts: session state cleanup and stuck message reset
  - FallbackErrorHandler.ts: provider error detection for fallback logic
  - types.ts: shared interfaces (WorkerRef, SSE payloads, StorageResult)

  Bug fix: SDKAgent was incorrectly using obs.files instead of obs.files_read
  and hardcoding files_modified to an empty array.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(search): extract search strategies into modular architecture

  Decompose SearchManager into a focused strategy pattern with:
  - SearchOrchestrator: coordinates strategy selection and fallback
  - ChromaSearchStrategy: vector semantic search via ChromaDB
  - SQLiteSearchStrategy: filter-only queries for date/project/type
  - HybridSearchStrategy: metadata filtering + semantic ranking
  - ResultFormatter: Markdown table formatting for results
  - TimelineBuilder: chronological timeline construction
  - Filter modules: DateFilter, ProjectFilter, TypeFilter

  SearchManager now delegates to the new infrastructure while maintaining full
  backward compatibility with the existing public API.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(context): decompose context-generator into modular architecture

  Extract the 660-line monolith into focused components:
  - ContextBuilder: main orchestrator (~160 lines)
  - ContextConfigLoader: configuration loading
  - TokenCalculator: token budget calculations
  - ObservationCompiler: data retrieval and query building
  - MarkdownFormatter/ColorFormatter: output formatting
  - Section renderers: Header, Timeline, Summary, Footer

  Maintains full backward compatibility - context-generator.ts now delegates
  to the new ContextBuilder while preserving the public API.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* refactor(worker): decompose worker-service into modular infrastructure

  Split the 2,000+ line monolith into focused modules:

  Infrastructure:
  - ProcessManager: PID files, signal handlers, child process cleanup
  - HealthMonitor: port checks, health polling, version matching
  - GracefulShutdown: coordinated cleanup on exit

  Server:
  - Server: Express app setup, core routes, route registration
  - Middleware: re-exports from existing middleware
  - ErrorHandler: centralized error handling with AppError class

  Integrations:
  - CursorHooksInstaller: full Cursor IDE integration (registry, hooks, MCP)

  WorkerService now acts as a thin coordinator wiring all components together.
  Maintains full backward compatibility with the existing public API.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* Refactor session queue processing and database interactions

  - Implement claim-and-delete pattern in SessionQueueProcessor to simplify
    message handling and eliminate duplicate processing.
  - Update PendingMessageStore to support atomic claim-and-delete operations,
    removing the need for intermediate processing states.
  - Introduce storeObservations method in SessionStore for simplified
    observation and summary storage without message tracking.
  - Remove deprecated methods and clean up session state management in worker
    agents.
  - Adjust response processing to accommodate new storage patterns, ensuring
    atomic transactions for observations and summaries.
  - Remove unnecessary reset logic for stuck messages due to the new queue
    handling approach.

* Add duplicate observation cleanup script

  Script to clean up duplicate observations created by the batching bug where
  observations were stored once per message ID instead of once per
  observation. Includes safety checks to always keep at least one copy.

  Usage:
    bun scripts/cleanup-duplicates.ts               # Dry run
    bun scripts/cleanup-duplicates.ts --execute     # Delete duplicates
    bun scripts/cleanup-duplicates.ts --aggressive  # Ignore time window

  🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: Claude <noreply@anthropic.com>
306
MONOLITH_REFACTOR_REPORT.md
Normal file
@@ -0,0 +1,306 @@
# Monolith Refactor Report

> **Last Updated:** 2026-01-03 (post session-logging merge)

## Executive Summary

The claude-mem codebase contains **~21,000 lines** of TypeScript across 71+ files. Analysis reveals several monolithic files that violate single-responsibility principles and create tight coupling. This report identifies refactoring targets and proposes a modular architecture.

**Recent Changes:** The `session-logging` branch merge improved error handling across the codebase. SearchManager was reduced by ~180 lines, but SessionStore grew by ~110 lines due to new migrations and logging.

---

## Part 1: Monolith Files Identified

### Critical Priority (>1500 lines)

| File | Lines | Methods | Primary Issues | Trend |
|------|-------|---------|----------------|-------|
| `src/services/worker-service.ts` | 2,034 | - | Server init, process management, Cursor hooks, MCP setup all mixed | ↓ -28 |
| `src/services/sqlite/SessionStore.ts` | 2,011 | 49 | Migrations + CRUD + queries + transformations all in one class | ↑ +108 |
| `src/services/worker/SearchManager.ts` | 1,778 | 17 | Three search strategies crammed together, formatting mixed in | ↓ -178 |

### High Priority (500-1500 lines)

| File | Lines | Issues | Trend |
|------|-------|--------|-------|
| `src/services/sync/ChromaSync.ts` | 870 | Sync and query operations mixed | — |
| `src/services/context-generator.ts` | 659 | 23 standalone functions, no class structure | — |
| `src/services/worker/http/routes/SessionRoutes.ts` | 625 | Provider selection mixed with business logic | ↑ +7 |
| `src/services/worker/OpenRouterAgent.ts` | 599 | 80% code duplicated from other agents | ↓ -15 |
| `src/services/worker/GeminiAgent.ts` | 574 | 80% code duplicated from other agents | ↓ -15 |
| `src/services/worker/SDKAgent.ts` | 546 | Base patterns duplicated across all agents | ↓ -15 |
| `src/services/sqlite/SessionSearch.ts` | 526 | FTS5 tables maintained for backward compat | — |
| `src/services/sqlite/migrations.ts` | 509 | All 11 migrations in single file | — |
| `src/services/sqlite/PendingMessageStore.ts` | 447 | Message queue operations | ↑ +21 |
| `src/services/worker/http/routes/SettingsRoutes.ts` | 414 | File I/O, validation, git ops mixed | — |

### Code Duplication Issue

The three agent files (`SDKAgent`, `GeminiAgent`, `OpenRouterAgent`) share ~80% duplicate code:

- Message building logic
- Result parsing
- Context updating
- Database sync patterns

---

## Part 2: System Breakdown Proposal

### Domain-Based Module Architecture

```
src/
├── domains/                  # Business domain modules
│   ├── sessions/             # Session lifecycle
│   │   ├── SessionRepository.ts
│   │   ├── SessionService.ts
│   │   └── types.ts
│   │
│   ├── observations/         # Observation management
│   │   ├── ObservationRepository.ts
│   │   ├── ObservationService.ts
│   │   └── types.ts
│   │
│   ├── summaries/            # Summary generation
│   │   ├── SummaryRepository.ts
│   │   ├── SummaryService.ts
│   │   └── types.ts
│   │
│   ├── prompts/              # Prompt storage
│   │   ├── PromptRepository.ts
│   │   └── types.ts
│   │
│   └── search/               # Search subsystem
│       ├── strategies/
│       │   ├── ChromaSearchStrategy.ts
│       │   ├── FilterSearchStrategy.ts
│       │   └── SearchStrategy.ts (interface)
│       ├── SearchOrchestrator.ts
│       ├── ResultFormatter.ts
│       └── TimelineBuilder.ts
│
├── infrastructure/           # Cross-cutting infrastructure
│   ├── database/
│   │   ├── DatabaseConnection.ts
│   │   ├── TransactionManager.ts
│   │   └── migrations/
│   │       ├── MigrationRunner.ts
│   │       ├── 001_initial.ts
│   │       ├── 002_add_prompts.ts
│   │       └── ...
│   │
│   ├── vector/
│   │   ├── ChromaClient.ts
│   │   ├── ChromaSyncManager.ts
│   │   └── ChromaQueryEngine.ts
│   │
│   └── agents/
│       ├── BaseAgent.ts      # Shared agent logic
│       ├── AgentFactory.ts
│       ├── MessageBuilder.ts
│       ├── ResponseParser.ts
│       ├── providers/
│       │   ├── ClaudeProvider.ts
│       │   ├── GeminiProvider.ts
│       │   └── OpenRouterProvider.ts
│       └── types.ts
│
├── api/                      # HTTP layer
│   ├── routes/
│   │   ├── sessions.ts
│   │   ├── data.ts
│   │   ├── search.ts
│   │   ├── settings.ts
│   │   └── viewer.ts
│   ├── middleware/
│   └── server.ts
│
├── context/                  # Context injection
│   ├── ContextBuilder.ts
│   ├── ContextConfigLoader.ts
│   ├── ObservationCompiler.ts
│   └── TokenCalculator.ts
│
└── shared/                   # Shared utilities (existing)
    ├── logger.ts
    ├── settings.ts
    └── ...
```

---

## Part 3: Refactoring Targets by Priority

### Phase 1: Database Layer Decomposition

**Target:** `src/services/sqlite/SessionStore.ts` (2,011 lines, 49 methods → ~5 files)

| Extract To | Responsibility | Est. Lines |
|------------|----------------|------------|
| `domains/sessions/SessionRepository.ts` | Session CRUD ops | ~300 |
| `domains/observations/ObservationRepository.ts` | Observation storage/retrieval | ~400 |
| `domains/summaries/SummaryRepository.ts` | Summary storage/retrieval | ~200 |
| `infrastructure/database/migrations/MigrationRunner.ts` | Schema migrations | ~250 |

**Benefits:**

- Single responsibility per file
- Testable in isolation
- Reduces coupling
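As a rough illustration of the Phase 1 split, a repository can own one table's CRUD behind a narrow handle interface, so it is testable without the full `DatabaseManager`. This is a sketch only: the `Db`/`Stmt` interfaces, the `FakeDb` stand-in, and the trimmed table shape are all hypothetical, not the real claude-mem API (the real code would pass `bun:sqlite`'s `Database`).

```typescript
// Minimal statement/handle interfaces so the repository can be exercised
// without a real SQLite connection (hypothetical names for illustration).
interface Stmt {
  run(...args: unknown[]): void;
  get(...args: unknown[]): unknown;
}
interface Db {
  prepare(sql: string): Stmt;
}

interface SessionRow {
  content_session_id: string;
  memory_session_id: string | null;
}

class SessionRepository {
  constructor(private db: Db) {}

  // memory_session_id is intentionally NULL at creation; it is only captured
  // later from the first SDK response (per the memory_session_id bug fix).
  create(contentSessionId: string): void {
    this.db
      .prepare(
        'INSERT INTO sessions (content_session_id, memory_session_id) VALUES (?, NULL)'
      )
      .run(contentSessionId);
  }

  get(contentSessionId: string): SessionRow | undefined {
    return this.db
      .prepare('SELECT * FROM sessions WHERE content_session_id = ?')
      .get(contentSessionId) as SessionRow | undefined;
  }
}

// Tiny in-memory fake standing in for SQLite, just to show the testing seam.
class FakeDb implements Db {
  rows = new Map<string, SessionRow>();
  prepare(sql: string): Stmt {
    const isInsert = sql.startsWith('INSERT');
    return {
      run: (...args: unknown[]) => {
        if (isInsert) {
          const id = args[0] as string;
          this.rows.set(id, { content_session_id: id, memory_session_id: null });
        }
      },
      get: (...args: unknown[]) => this.rows.get(args[0] as string),
    };
  }
}
```

The point of the seam is that "testable in isolation" falls out for free: a unit test constructs `SessionRepository` with a fake or in-memory database and never touches the worker process.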
---

### Phase 2: Agent Consolidation

**Target:** 3 agent files (1,719 lines → ~800 lines total)

| Extract To | Responsibility |
|------------|----------------|
| `infrastructure/agents/BaseAgent.ts` | Common agent logic, prompt building |
| `infrastructure/agents/MessageBuilder.ts` | Message construction |
| `infrastructure/agents/ResponseParser.ts` | Result parsing (observations, summaries) |
| `infrastructure/agents/providers/*.ts` | Provider-specific API calls only |

**Benefits:**

- Eliminates 80% code duplication
- Easy to add new providers
- Centralized message format changes
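One way to picture the consolidation is a template-method base class: shared prompt building and parsing live once, and each provider overrides only the API call. All names below are illustrative stand-ins, and the one-observation-per-line parser is a placeholder for the real observation/summary parser.

```typescript
// Hypothetical template-method sketch of BaseAgent: the base class owns the
// shared pipeline, providers supply only the API call.
interface ParsedResult {
  observations: string[];
}

abstract class BaseAgent {
  // Shared message building (today duplicated across all three agents).
  protected buildPrompt(messages: string[]): string {
    return messages.join('\n');
  }

  // Shared result parsing: one observation per non-empty line, standing in
  // for the real ResponseParser.
  protected parseResponse(raw: string): ParsedResult {
    return { observations: raw.split('\n').filter((l) => l.length > 0) };
  }

  // The only provider-specific piece (Claude/Gemini/OpenRouter).
  protected abstract callProvider(prompt: string): Promise<string>;

  async process(messages: string[]): Promise<ParsedResult> {
    const raw = await this.callProvider(this.buildPrompt(messages));
    return this.parseResponse(raw);
  }
}

// A stub provider standing in for a real API client.
class EchoProvider extends BaseAgent {
  protected async callProvider(prompt: string): Promise<string> {
    return prompt.toUpperCase();
  }
}
```

Adding a new provider then means writing one `callProvider` override rather than copying ~550 lines of pipeline code.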
---

### Phase 3: Search Strategy Pattern

**Target:** `src/services/worker/SearchManager.ts` (1,778 lines → ~5 files)

| Extract To | Responsibility |
|------------|----------------|
| `domains/search/SearchOrchestrator.ts` | Coordinates search strategies |
| `domains/search/strategies/ChromaSearchStrategy.ts` | Vector search via Chroma |
| `domains/search/strategies/FilterSearchStrategy.ts` | SQLite filter-based search |
| `domains/search/ResultFormatter.ts` | Formats search results |
| `domains/search/TimelineBuilder.ts` | Constructs timeline views |

**Benefits:**

- Strategy pattern for extensibility
- Clear fallback logic
- Testable strategies
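The "clear fallback logic" benefit can be made concrete with a minimal orchestrator sketch: walk the strategies in priority order and fall through when one declines the query or throws (the case that previously crashed generators when Chroma was down). Interfaces and stub strategies here are hypothetical, not the real claude-mem types.

```typescript
// Hypothetical sketch of the Phase 3 strategy pattern with fallback.
interface Query {
  text?: string;
  project?: string;
}

interface SearchStrategy {
  name: string;
  canHandle(q: Query): boolean;
  search(q: Query): string[];
}

class SearchOrchestrator {
  constructor(private strategies: SearchStrategy[]) {}

  search(q: Query): { strategy: string; results: string[] } {
    for (const s of this.strategies) {
      if (!s.canHandle(q)) continue;
      try {
        return { strategy: s.name, results: s.search(q) };
      } catch {
        // Fall through to the next strategy (e.g. SQLite when Chroma is down).
        continue;
      }
    }
    return { strategy: 'none', results: [] };
  }
}

// Stand-ins for ChromaSearchStrategy and FilterSearchStrategy.
const chroma: SearchStrategy = {
  name: 'chroma',
  canHandle: (q) => q.text !== undefined,
  search: () => {
    throw new Error('Chroma unavailable');
  },
};

const sqliteFilter: SearchStrategy = {
  name: 'sqlite',
  canHandle: () => true,
  search: (q) => [`filter results for ${q.project ?? 'all projects'}`],
};
```

Because each strategy is a small object behind one interface, the fallback path can be tested directly by handing the orchestrator a strategy that throws.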
---

### Phase 4: Context Generator Restructure

**Target:** `src/services/context-generator.ts` (659 lines → ~4 files)

| Extract To | Responsibility |
|------------|----------------|
| `context/ContextBuilder.ts` | Main builder class |
| `context/ContextConfigLoader.ts` | Config loading/validation |
| `context/ObservationCompiler.ts` | Compiles observations for injection |
| `context/TokenCalculator.ts` | Token budget calculations |

**Benefits:**

- Class-based structure
- Clear dependencies
- Easier testing

---

### Phase 5: Server/Infrastructure Split

**Target:** `src/services/worker-service.ts` (2,034 lines → ~4 files)

| Extract To | Responsibility |
|------------|----------------|
| `api/server.ts` | Express app, route registration |
| `infrastructure/ProcessManager.ts` | PID files, signal handlers |
| `infrastructure/CursorHooksInstaller.ts` | Cursor integration |
| `infrastructure/MCPClientManager.ts` | MCP client lifecycle |

---

## Part 4: Dependency Reduction Strategy

### Current Pain Points

1. **SessionStore** imported by 7+ files directly
2. No abstraction between routes and data access
3. All routes depend on `DatabaseManager`, which exposes the raw `SessionStore`

### Proposed Dependency Injection

```typescript
// infrastructure/container.ts
export interface ServiceContainer {
  sessions: SessionService;
  observations: ObservationService;
  summaries: SummaryService;
  search: SearchOrchestrator;
  agents: AgentFactory;
}

// Usage in routes
app.post('/sessions', (req, res) => {
  const { sessions } = getContainer();
  sessions.create(req.body);
});
```
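A minimal way to back the `getContainer()` call above is a lazily initialized singleton: services are wired once on first access, and routes never construct repositories themselves. The service shapes below are trimmed to one method each purely for illustration; they are not the real interfaces.

```typescript
// Hypothetical lazy singleton behind getContainer().
interface SessionService {
  create(body: { id: string }): string;
}

interface ServiceContainer {
  sessions: SessionService;
}

let container: ServiceContainer | null = null;

function createContainer(): ServiceContainer {
  // Real wiring would construct repositories, agents, search, etc. here, once.
  return {
    sessions: {
      create: (body) => `created session ${body.id}`,
    },
  };
}

export function getContainer(): ServiceContainer {
  if (container === null) {
    container = createContainer();
  }
  return container;
}
```

Tests can swap the wiring by resetting the module-level `container` (or by injecting a factory), which is exactly the seam the routes currently lack.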
---

## Part 5: Migration Strategy

### Incremental Approach

Each phase can be done independently without breaking the system:

1. **Create new modules** alongside existing code
2. **Migrate routes one at a time** to use new modules
3. **Deprecate old code** once all routes are migrated
4. **Remove deprecated code** after testing
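Step 1 is what makes the approach safe: the existing monolith can become a thin facade over the extracted module, so its 7+ importers migrate one at a time while behavior stays identical. The sketch below uses hypothetical names; only the delegation shape matters.

```typescript
// Hypothetical facade during migration: the legacy class keeps its public API
// but delegates to the extracted repository.
class ExtractedSessionRepository {
  private sessions = new Set<string>();

  create(id: string): void {
    this.sessions.add(id);
  }

  exists(id: string): boolean {
    return this.sessions.has(id);
  }
}

/** @deprecated Use ExtractedSessionRepository directly; kept while routes migrate. */
class LegacySessionStore {
  constructor(private repo: ExtractedSessionRepository) {}

  // Old method name preserved; the body is now a one-line delegation, so
  // callers see no behavior change.
  createSession(id: string): void {
    this.repo.create(id);
  }
}
```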
### Testing Requirements

- Unit tests for each extracted module
- Integration tests for repository operations
- End-to-end tests for API routes

---

## Appendix: File Size Distribution

```
2,034  src/services/worker-service.ts                     ████████████████████
2,011  src/services/sqlite/SessionStore.ts                ████████████████████
1,778  src/services/worker/SearchManager.ts               █████████████████
  870  src/services/sync/ChromaSync.ts                    ████████
  659  src/services/context-generator.ts                  ██████
  625  src/services/worker/http/routes/SessionRoutes.ts   ██████
  599  src/services/worker/OpenRouterAgent.ts             █████
  574  src/services/worker/GeminiAgent.ts                 █████
  546  src/services/worker/SDKAgent.ts                    █████
  526  src/services/sqlite/SessionSearch.ts               █████
  509  src/services/sqlite/migrations.ts                  █████
  466  src/services/worker/http/routes/DataRoutes.ts      ████
  447  src/services/sqlite/PendingMessageStore.ts         ████
  414  src/services/worker/http/routes/SettingsRoutes.ts  ████
```

---

## Summary

| Metric | Current | After Refactor |
|--------|---------|----------------|
| Files >500 lines | 14 | 0-2 |
| Max file size | 2,034 | ~400 |
| Code duplication | ~1,100 lines | ~100 lines |
| Testable modules | Low | High |

**Recommended Start:** Phase 1 (SessionStore decomposition) - highest impact, clearest boundaries, and **growing** (now 2,011 lines with 49 methods).

### Key Observations Post-Merge

1. **SessionStore is still the top priority** - it grew by 108 lines and is now the 2nd largest file
2. **SearchManager improved** - down 178 lines from the error handling refactor
3. **Agent files slightly smaller** - ~45 lines combined reduction
4. **Core architecture unchanged** - the proposed modular structure remains valid
File diff suppressed because one or more lines are too long
243
scripts/cleanup-duplicates.ts
Normal file
@@ -0,0 +1,243 @@
#!/usr/bin/env bun
/**
 * Cleanup script for duplicate observations created by the batching bug.
 *
 * The bug: When multiple messages were batched together, observations were stored
 * once per message ID instead of once per observation. For example, if 4 messages
 * were batched and produced 3 observations, those 3 observations were stored
 * 12 times (4×3) instead of 3 times.
 *
 * This script identifies duplicates by matching on:
 * - memory_session_id (same session)
 * - title + subtitle + narrative (same content)
 * - type (same observation type)
 * - created_at_epoch within 60 seconds (same batch window)
 *
 * Usage:
 *   bun scripts/cleanup-duplicates.ts               # Dry run (default)
 *   bun scripts/cleanup-duplicates.ts --execute     # Actually delete duplicates
 *   bun scripts/cleanup-duplicates.ts --strict      # 5-second window only
 *   bun scripts/cleanup-duplicates.ts --aggressive  # Ignore time window
 */

import { Database } from 'bun:sqlite';
import { homedir } from 'os';
import { join } from 'path';

const DB_PATH = join(homedir(), '.claude-mem', 'claude-mem.db');

// Time window modes for duplicate detection
const TIME_WINDOW_MODES = {
  strict: 5,      // 5 seconds - only exact duplicates from same batch
  normal: 60,     // 60 seconds - duplicates within same minute
  aggressive: 0,  // 0 = ignore time entirely, match on session+content+type only
};

interface DuplicateGroup {
  memory_session_id: string;
  title: string;
  type: string;
  epoch_bucket: number;
  count: number;
  ids: number[];
  keep_id: number;
  delete_ids: number[];
}

interface ObservationRow {
  id: number;
  memory_session_id: string;
  title: string | null;
  subtitle: string | null;
  narrative: string | null;
  type: string;
  created_at_epoch: number;
}

function main() {
  const dryRun = !process.argv.includes('--execute');
  const aggressive = process.argv.includes('--aggressive');
  const strict = process.argv.includes('--strict');

  // Determine time window
  let windowMode: keyof typeof TIME_WINDOW_MODES = 'normal';
  if (aggressive) windowMode = 'aggressive';
  if (strict) windowMode = 'strict';
  const batchWindowSeconds = TIME_WINDOW_MODES[windowMode];

  console.log('='.repeat(60));
  console.log('Claude-Mem Duplicate Observation Cleanup');
  console.log('='.repeat(60));
  console.log(`Mode: ${dryRun ? 'DRY RUN (use --execute to delete)' : 'EXECUTE'}`);
  console.log(`Database: ${DB_PATH}`);
  console.log(`Time window: ${windowMode} (${batchWindowSeconds === 0 ? 'ignore time' : batchWindowSeconds + ' seconds'})`);
  console.log('');
  console.log('Options:');
  console.log('  --execute     Actually delete duplicates (default: dry run)');
  console.log('  --strict      5-second window (exact batch duplicates only)');
  console.log('  --aggressive  Ignore time, match on session+content+type only');
  console.log('');

  const db = dryRun
    ? new Database(DB_PATH, { readonly: true })
    : new Database(DB_PATH);

  // Get total observation count
  const totalCount = db.prepare('SELECT COUNT(*) as count FROM observations').get() as { count: number };
  console.log(`Total observations in database: ${totalCount.count}`);

  // Find all observations and group by content fingerprint
  const observations = db.prepare(`
    SELECT
      id,
      memory_session_id,
      title,
      subtitle,
      narrative,
      type,
      created_at_epoch
    FROM observations
    ORDER BY memory_session_id, title, type, created_at_epoch
  `).all() as ObservationRow[];

  console.log(`Analyzing ${observations.length} observations for duplicates...`);
  console.log('');

  // Group observations by fingerprint (session + content + type + time bucket)
  const groups = new Map<string, ObservationRow[]>();

  for (const obs of observations) {
    // Skip observations without title (can't dedupe without content identifier)
    if (obs.title === null) continue;

    // Create content key from title + subtitle + narrative
    const contentKey = `${obs.title}|${obs.subtitle || ''}|${obs.narrative || ''}`;

    // Create fingerprint based on time window mode
    let fingerprint: string;
    if (batchWindowSeconds === 0) {
      // Aggressive mode: ignore time entirely
      fingerprint = `${obs.memory_session_id}|${obs.type}|${contentKey}`;
    } else {
      // Normal/strict mode: include time bucket
      const epochBucket = Math.floor(obs.created_at_epoch / batchWindowSeconds);
      fingerprint = `${obs.memory_session_id}|${obs.type}|${epochBucket}|${contentKey}`;
    }

    if (!groups.has(fingerprint)) {
      groups.set(fingerprint, []);
    }
    groups.get(fingerprint)!.push(obs);
  }

  // Find groups with duplicates
  const duplicateGroups: DuplicateGroup[] = [];

  for (const [fingerprint, rows] of groups) {
    if (rows.length > 1) {
      // Sort by id to keep the oldest (lowest id)
      rows.sort((a, b) => a.id - b.id);
      const keepId = rows[0].id;
      const deleteIds = rows.slice(1).map(r => r.id);

      // SAFETY: Never delete all copies - always keep at least one
      if (deleteIds.length >= rows.length) {
        throw new Error(`SAFETY VIOLATION: Would delete all ${rows.length} copies! Aborting.`);
      }
      if (!deleteIds.every(id => id !== keepId)) {
        throw new Error(`SAFETY VIOLATION: Delete list contains keep_id ${keepId}! Aborting.`);
      }

      const title = rows[0].title || '';
      duplicateGroups.push({
        memory_session_id: rows[0].memory_session_id,
        title: title.substring(0, 100) + (title.length > 100 ? '...' : ''),
        type: rows[0].type,
        epoch_bucket: batchWindowSeconds > 0 ? Math.floor(rows[0].created_at_epoch / batchWindowSeconds) : 0,
        count: rows.length,
        ids: rows.map(r => r.id),
        keep_id: keepId,
        delete_ids: deleteIds,
      });
    }
  }

  if (duplicateGroups.length === 0) {
    console.log('No duplicate observations found!');
    db.close();
    return;
  }

  // Calculate stats
  const totalDuplicates = duplicateGroups.reduce((sum, g) => sum + g.delete_ids.length, 0);
  const affectedSessions = new Set(duplicateGroups.map(g => g.memory_session_id)).size;

  console.log('DUPLICATE ANALYSIS:');
  console.log('-'.repeat(60));
  console.log(`Duplicate groups found: ${duplicateGroups.length}`);
  console.log(`Total duplicates to remove: ${totalDuplicates}`);
  console.log(`Affected sessions: ${affectedSessions}`);
  console.log(`Observations after cleanup: ${totalCount.count - totalDuplicates}`);
  console.log('');

  // Show sample of duplicates
  console.log('SAMPLE DUPLICATES (first 10 groups):');
  console.log('-'.repeat(60));

  for (const group of duplicateGroups.slice(0, 10)) {
    console.log(`Session: ${group.memory_session_id.substring(0, 20)}...`);
    console.log(`Type: ${group.type}`);
    console.log(`Count: ${group.count} copies (keeping id=${group.keep_id}, deleting ${group.delete_ids.length})`);
    console.log(`Title: "${group.title}"`);
    console.log('');
  }

  if (duplicateGroups.length > 10) {
    console.log(`... and ${duplicateGroups.length - 10} more groups`);
    console.log('');
  }

  // Execute deletion if not dry run
  if (!dryRun) {
    console.log('EXECUTING DELETION...');
    console.log('-'.repeat(60));

    const allDeleteIds = duplicateGroups.flatMap(g => g.delete_ids);

    // Delete in batches of 500 to avoid SQLite limits
    const BATCH_SIZE = 500;
    let deleted = 0;

    db.exec('BEGIN TRANSACTION');

    try {
      for (let i = 0; i < allDeleteIds.length; i += BATCH_SIZE) {
        const batch = allDeleteIds.slice(i, i + BATCH_SIZE);
        const placeholders = batch.map(() => '?').join(',');
        const stmt = db.prepare(`DELETE FROM observations WHERE id IN (${placeholders})`);
        const result = stmt.run(...batch);
        deleted += result.changes;
        console.log(`Deleted batch ${Math.floor(i / BATCH_SIZE) + 1}: ${result.changes} observations`);
|
||||
}
|
||||
|
||||
db.exec('COMMIT');
|
||||
console.log('');
|
||||
console.log(`Successfully deleted ${deleted} duplicate observations!`);
|
||||
|
||||
// Verify final count
|
||||
const finalCount = db.prepare('SELECT COUNT(*) as count FROM observations').get() as { count: number };
|
||||
console.log(`Final observation count: ${finalCount.count}`);
|
||||
|
||||
} catch (error) {
|
||||
db.exec('ROLLBACK');
|
||||
console.error('Error during deletion, rolled back:', error);
|
||||
process.exit(1);
|
||||
}
|
||||
} else {
|
||||
console.log('DRY RUN COMPLETE');
|
||||
console.log('-'.repeat(60));
|
||||
console.log('No changes were made. Run with --execute to delete duplicates.');
|
||||
}
|
||||
|
||||
db.close();
|
||||
}
|
||||
|
||||
main();
|
||||
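Stripped to its essentials, the grouping pass above is: build a composite fingerprint per row, bucket rows by it, and in each bucket keep the lowest id. A minimal self-contained sketch of that plan-building step (the `Obs` shape and `dedupePlan` name are illustrative, not the script's actual exports):

```typescript
// Sketch of the fingerprint-based dedupe plan: group rows by a composite
// key, keep the lowest id in each group, mark the rest for deletion.
interface Obs {
  id: number;
  memory_session_id: string;
  type: string;
  created_at_epoch: number;
  contentKey: string;
}

function dedupePlan(rows: Obs[], batchWindowSeconds: number): { keep: number; remove: number[] }[] {
  const groups = new Map<string, Obs[]>();
  for (const obs of rows) {
    // batchWindowSeconds <= 0 models "aggressive" mode: time is ignored
    const bucket = batchWindowSeconds > 0
      ? Math.floor(obs.created_at_epoch / batchWindowSeconds)
      : '';
    const fp = `${obs.memory_session_id}|${obs.type}|${bucket}|${obs.contentKey}`;
    if (!groups.has(fp)) groups.set(fp, []);
    groups.get(fp)!.push(obs);
  }
  const plan: { keep: number; remove: number[] }[] = [];
  for (const bucketRows of groups.values()) {
    if (bucketRows.length > 1) {
      bucketRows.sort((a, b) => a.id - b.id); // oldest (lowest id) wins
      plan.push({ keep: bucketRows[0].id, remove: bucketRows.slice(1).map(r => r.id) });
    }
  }
  return plan;
}
```

Because deletion is derived from `slice(1)` of a sorted bucket, the kept id can never appear in the remove list, which is what the script's safety checks assert.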
src/services/Context.ts (new file, 8 lines)
@@ -0,0 +1,8 @@
/**
 * Context - Named re-export facade
 *
 * Provides a clean import path for context generation functionality.
 * Import from './Context.js' or './context/index.js'.
 */

export * from './context/index.js';
@@ -1,659 +1,18 @@
/**
 * Context Generator - generates context injection for SessionStart
 * Context Generator - DEPRECATED
 *
 * This module contains all the logic for building the context injection string.
 * It's used by the worker service and called via HTTP from the context-hook.
 * This file is maintained for backward compatibility.
 * New code should import from './Context.js' or './context/index.js'.
 *
 * The context generation logic has been restructured into:
 * - src/services/context/ContextBuilder.ts - Main orchestrator
 * - src/services/context/ContextConfigLoader.ts - Configuration loading
 * - src/services/context/TokenCalculator.ts - Token economics
 * - src/services/context/ObservationCompiler.ts - Data retrieval
 * - src/services/context/formatters/ - Output formatting
 * - src/services/context/sections/ - Section rendering
 */

import path from 'path';
import { homedir } from 'os';
import { existsSync, readFileSync, unlinkSync } from 'fs';
import { SessionStore } from './sqlite/SessionStore.js';
import { logger } from '../utils/logger.js';
import { SettingsDefaultsManager } from '../shared/SettingsDefaultsManager.js';
import {
  parseJsonArray,
  formatDateTime,
  formatTime,
  formatDate,
  toRelativePath,
  extractFirstFile
} from '../shared/timeline-formatting.js';
import { getProjectName } from '../utils/project-name.js';
import { ModeManager } from './domain/ModeManager.js';

// Version marker path - use homedir-based path that works in both CJS and ESM contexts
const VERSION_MARKER_PATH = path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack', 'plugin', '.install-version');

interface ContextConfig {
  // Display counts
  totalObservationCount: number;
  fullObservationCount: number;
  sessionCount: number;

  // Token display toggles
  showReadTokens: boolean;
  showWorkTokens: boolean;
  showSavingsAmount: boolean;
  showSavingsPercent: boolean;

  // Filters
  observationTypes: Set<string>;
  observationConcepts: Set<string>;

  // Display options
  fullObservationField: 'narrative' | 'facts';
  showLastSummary: boolean;
  showLastMessage: boolean;
}

/**
 * Load all context configuration settings
 * Priority: ~/.claude-mem/settings.json > env var > defaults
 */
function loadContextConfig(): ContextConfig {
  const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');
  const settings = SettingsDefaultsManager.loadFromFile(settingsPath);

  // For non-code modes, use all types/concepts from active mode instead of settings
  const modeId = settings.CLAUDE_MEM_MODE;
  const isCodeMode = modeId === 'code' || modeId.startsWith('code--');

  let observationTypes: Set<string>;
  let observationConcepts: Set<string>;

  if (isCodeMode) {
    // Code mode: use settings-based filtering
    observationTypes = new Set(
      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim()).filter(Boolean)
    );
    observationConcepts = new Set(
      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim()).filter(Boolean)
    );
  } else {
    // Non-code modes: use all types/concepts from active mode
    const mode = ModeManager.getInstance().getActiveMode();
    observationTypes = new Set(mode.observation_types.map(t => t.id));
    observationConcepts = new Set(mode.observation_concepts.map(c => c.id));
  }

  return {
    totalObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10),
    fullObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_FULL_COUNT, 10),
    sessionCount: parseInt(settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT, 10),
    showReadTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS === 'true',
    showWorkTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS === 'true',
    showSavingsAmount: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT === 'true',
    showSavingsPercent: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT === 'true',
    observationTypes,
    observationConcepts,
    fullObservationField: settings.CLAUDE_MEM_CONTEXT_FULL_FIELD as 'narrative' | 'facts',
    showLastSummary: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY === 'true',
    showLastMessage: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE === 'true',
  };
}

// Configuration constants
const CHARS_PER_TOKEN_ESTIMATE = 4;
const SUMMARY_LOOKAHEAD = 1;

export interface ContextInput {
  session_id?: string;
  transcript_path?: string;
  cwd?: string;
  hook_event_name?: string;
  source?: "startup" | "resume" | "clear" | "compact";
  [key: string]: any;
}

// ANSI color codes for terminal output
const colors = {
  reset: '\x1b[0m',
  bright: '\x1b[1m',
  dim: '\x1b[2m',
  cyan: '\x1b[36m',
  green: '\x1b[32m',
  yellow: '\x1b[33m',
  blue: '\x1b[34m',
  magenta: '\x1b[35m',
  gray: '\x1b[90m',
  red: '\x1b[31m',
};

interface Observation {
  id: number;
  memory_session_id: string;
  type: string;
  title: string | null;
  subtitle: string | null;
  narrative: string | null;
  facts: string | null;
  concepts: string | null;
  files_read: string | null;
  files_modified: string | null;
  discovery_tokens: number | null;
  created_at: string;
  created_at_epoch: number;
}

interface SessionSummary {
  id: number;
  memory_session_id: string;
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  created_at: string;
  created_at_epoch: number;
}

// Helper: Render a summary field
function renderSummaryField(label: string, value: string | null, color: string, useColors: boolean): string[] {
  if (!value) return [];

  if (useColors) {
    return [`${color}${label}:${colors.reset} ${value}`, ''];
  }
  return [`**${label}**: ${value}`, ''];
}

// Helper: Convert cwd path to dashed format
function cwdToDashed(cwd: string): string {
  return cwd.replace(/\//g, '-');
}

// Helper: Extract last assistant message from transcript file
function extractPriorMessages(transcriptPath: string): { userMessage: string; assistantMessage: string } {
  try {
    if (!existsSync(transcriptPath)) {
      return { userMessage: '', assistantMessage: '' };
    }

    const content = readFileSync(transcriptPath, 'utf-8').trim();
    if (!content) {
      return { userMessage: '', assistantMessage: '' };
    }

    const lines = content.split('\n').filter(line => line.trim());
    let lastAssistantMessage = '';

    for (let i = lines.length - 1; i >= 0; i--) {
      try {
        const line = lines[i];
        if (!line.includes('"type":"assistant"')) {
          continue;
        }

        const entry = JSON.parse(line);
        if (entry.type === 'assistant' && entry.message?.content && Array.isArray(entry.message.content)) {
          let text = '';
          for (const block of entry.message.content) {
            if (block.type === 'text') {
              text += block.text;
            }
          }
          text = text.replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g, '').trim();
          if (text) {
            lastAssistantMessage = text;
            break;
          }
        }
      } catch (parseError) {
        logger.debug('PARSER', 'Skipping malformed transcript line', { lineIndex: i }, parseError as Error);
        continue;
      }
    }

    return { userMessage: '', assistantMessage: lastAssistantMessage };
  } catch (error) {
    logger.failure('WORKER', `Failed to extract prior messages from transcript`, { transcriptPath }, error as Error);
    return { userMessage: '', assistantMessage: '' };
  }
}
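The reverse scan in `extractPriorMessages` can be sketched in isolation: walk JSONL lines from the end, and return the first assistant entry whose text survives `<system-reminder>` stripping. A hedged, self-contained sketch (`lastAssistantText` is an illustrative helper, not part of this module):

```typescript
// Sketch of the reverse-scan pattern: newest line first, skip non-assistant
// and malformed lines, strip system-reminder blocks before accepting text.
function lastAssistantText(jsonl: string): string {
  const lines = jsonl.split('\n').filter(l => l.trim());
  for (let i = lines.length - 1; i >= 0; i--) {
    // Cheap substring pre-filter before paying for JSON.parse
    if (!lines[i].includes('"type":"assistant"')) continue;
    try {
      const entry = JSON.parse(lines[i]);
      if (entry.type !== 'assistant' || !Array.isArray(entry.message?.content)) continue;
      const text = entry.message.content
        .filter((b: any) => b.type === 'text')
        .map((b: any) => b.text)
        .join('')
        .replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g, '')
        .trim();
      if (text) return text;
    } catch {
      continue; // skip malformed lines rather than aborting the scan
    }
  }
  return '';
}
```

The per-line try/catch matters: one corrupt transcript line should not discard an otherwise recoverable prior message.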
/**
 * Generate context for a project
 */
export async function generateContext(input?: ContextInput, useColors: boolean = false): Promise<string> {
  const config = loadContextConfig();
  const cwd = input?.cwd ?? process.cwd();
  const project = getProjectName(cwd);

  let db: SessionStore | null = null;
  try {
    db = new SessionStore();
  } catch (error: any) {
    if (error.code === 'ERR_DLOPEN_FAILED') {
      try {
        unlinkSync(VERSION_MARKER_PATH);
      } catch (unlinkError) {
        logger.debug('SYSTEM', 'Marker file cleanup failed (may not exist)', {}, unlinkError as Error);
      }
      logger.error('SYSTEM', 'Native module rebuild needed - restart Claude Code to auto-fix');
      return '';
    }
    throw error;
  }

  // Build SQL WHERE clause for observation types
  const typeArray = Array.from(config.observationTypes);
  const typePlaceholders = typeArray.map(() => '?').join(',');

  // Build SQL WHERE clause for concepts
  const conceptArray = Array.from(config.observationConcepts);
  const conceptPlaceholders = conceptArray.map(() => '?').join(',');

  // Get recent observations
  const observations = db.db.prepare(`
    SELECT
      id, memory_session_id, type, title, subtitle, narrative,
      facts, concepts, files_read, files_modified, discovery_tokens,
      created_at, created_at_epoch
    FROM observations
    WHERE project = ?
      AND type IN (${typePlaceholders})
      AND EXISTS (
        SELECT 1 FROM json_each(concepts)
        WHERE value IN (${conceptPlaceholders})
      )
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `).all(project, ...typeArray, ...conceptArray, config.totalObservationCount) as Observation[];

  // Get recent summaries
  const recentSummaries = db.db.prepare(`
    SELECT id, memory_session_id, request, investigated, learned, completed, next_steps, created_at, created_at_epoch
    FROM session_summaries
    WHERE project = ?
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `).all(project, config.sessionCount + SUMMARY_LOOKAHEAD) as SessionSummary[];

  // Retrieve prior session messages if enabled
  let priorUserMessage = '';
  let priorAssistantMessage = '';

  if (config.showLastMessage && observations.length > 0) {
    const currentSessionId = input?.session_id;
    const priorSessionObs = observations.find(obs => obs.memory_session_id !== currentSessionId);

    if (priorSessionObs) {
      const priorSessionId = priorSessionObs.memory_session_id;
      const dashedCwd = cwdToDashed(cwd);
      const transcriptPath = path.join(homedir(), '.claude', 'projects', dashedCwd, `${priorSessionId}.jsonl`);
      const messages = extractPriorMessages(transcriptPath);
      priorUserMessage = messages.userMessage;
      priorAssistantMessage = messages.assistantMessage;
    }
  }

  // If we have neither observations nor summaries, show empty state
  if (observations.length === 0 && recentSummaries.length === 0) {
    db?.close();
    if (useColors) {
      return `\n${colors.bright}${colors.cyan}[${project}] recent context${colors.reset}\n${colors.gray}${'─'.repeat(60)}${colors.reset}\n\n${colors.dim}No previous sessions found for this project yet.${colors.reset}\n`;
    }
    return `# [${project}] recent context\n\nNo previous sessions found for this project yet.`;
  }

  const displaySummaries = recentSummaries.slice(0, config.sessionCount);
  const timelineObs = observations;

  // Build output
  const output: string[] = [];

  // Header
  if (useColors) {
    output.push('');
    output.push(`${colors.bright}${colors.cyan}[${project}] recent context${colors.reset}`);
    output.push(`${colors.gray}${'─'.repeat(60)}${colors.reset}`);
    output.push('');
  } else {
    output.push(`# [${project}] recent context`);
    output.push('');
  }

  // Chronological Timeline
  if (timelineObs.length > 0) {
    // Legend - generate dynamically from active mode
    const mode = ModeManager.getInstance().getActiveMode();
    const typeLegendItems = mode.observation_types.map(t => `${t.emoji} ${t.id}`).join(' | ');
    if (useColors) {
      output.push(`${colors.dim}Legend: 🎯 session-request | ${typeLegendItems}${colors.reset}`);
    } else {
      output.push(`**Legend:** 🎯 session-request | ${typeLegendItems}`);
    }
    output.push('');

    // Column Key
    if (useColors) {
      output.push(`${colors.bright}💡 Column Key${colors.reset}`);
      output.push(`${colors.dim} Read: Tokens to read this observation (cost to learn it now)${colors.reset}`);
      output.push(`${colors.dim} Work: Tokens spent on work that produced this record (🔍 research, 🛠️ building, ⚖️ deciding)${colors.reset}`);
    } else {
      output.push(`💡 **Column Key**:`);
      output.push(`- **Read**: Tokens to read this observation (cost to learn it now)`);
      output.push(`- **Work**: Tokens spent on work that produced this record (🔍 research, 🛠️ building, ⚖️ deciding)`);
    }
    output.push('');

    // Context Index Instructions
    if (useColors) {
      output.push(`${colors.dim}💡 Context Index: This semantic index (titles, types, files, tokens) is usually sufficient to understand past work.${colors.reset}`);
      output.push('');
      output.push(`${colors.dim}When you need implementation details, rationale, or debugging context:${colors.reset}`);
      output.push(`${colors.dim} - Use the mem-search skill to fetch full observations on-demand${colors.reset}`);
      output.push(`${colors.dim} - Critical types (🔴 bugfix, ⚖️ decision) often need detailed fetching${colors.reset}`);
      output.push(`${colors.dim} - Trust this index over re-reading code for past decisions and learnings${colors.reset}`);
    } else {
      output.push(`💡 **Context Index:** This semantic index (titles, types, files, tokens) is usually sufficient to understand past work.`);
      output.push('');
      output.push(`When you need implementation details, rationale, or debugging context:`);
      output.push(`- Use the mem-search skill to fetch full observations on-demand`);
      output.push(`- Critical types (🔴 bugfix, ⚖️ decision) often need detailed fetching`);
      output.push(`- Trust this index over re-reading code for past decisions and learnings`);
    }
    output.push('');

    // Context Economics
    const totalObservations = observations.length;
    const totalReadTokens = observations.reduce((sum, obs) => {
      const obsSize = (obs.title?.length || 0) +
        (obs.subtitle?.length || 0) +
        (obs.narrative?.length || 0) +
        JSON.stringify(obs.facts || []).length;
      return sum + Math.ceil(obsSize / CHARS_PER_TOKEN_ESTIMATE);
    }, 0);
    const totalDiscoveryTokens = observations.reduce((sum, obs) => sum + (obs.discovery_tokens || 0), 0);
    const savings = totalDiscoveryTokens - totalReadTokens;
    const savingsPercent = totalDiscoveryTokens > 0
      ? Math.round((savings / totalDiscoveryTokens) * 100)
      : 0;

    const showContextEconomics = config.showReadTokens || config.showWorkTokens ||
      config.showSavingsAmount || config.showSavingsPercent;

    if (showContextEconomics) {
      if (useColors) {
        output.push(`${colors.bright}${colors.cyan}📊 Context Economics${colors.reset}`);
        output.push(`${colors.dim} Loading: ${totalObservations} observations (${totalReadTokens.toLocaleString()} tokens to read)${colors.reset}`);
        output.push(`${colors.dim} Work investment: ${totalDiscoveryTokens.toLocaleString()} tokens spent on research, building, and decisions${colors.reset}`);
        if (totalDiscoveryTokens > 0 && (config.showSavingsAmount || config.showSavingsPercent)) {
          let savingsLine = ' Your savings: ';
          if (config.showSavingsAmount && config.showSavingsPercent) {
            savingsLine += `${savings.toLocaleString()} tokens (${savingsPercent}% reduction from reuse)`;
          } else if (config.showSavingsAmount) {
            savingsLine += `${savings.toLocaleString()} tokens`;
          } else {
            savingsLine += `${savingsPercent}% reduction from reuse`;
          }
          output.push(`${colors.green}${savingsLine}${colors.reset}`);
        }
        output.push('');
      } else {
        output.push(`📊 **Context Economics**:`);
        output.push(`- Loading: ${totalObservations} observations (${totalReadTokens.toLocaleString()} tokens to read)`);
        output.push(`- Work investment: ${totalDiscoveryTokens.toLocaleString()} tokens spent on research, building, and decisions`);
        if (totalDiscoveryTokens > 0 && (config.showSavingsAmount || config.showSavingsPercent)) {
          let savingsLine = '- Your savings: ';
          if (config.showSavingsAmount && config.showSavingsPercent) {
            savingsLine += `${savings.toLocaleString()} tokens (${savingsPercent}% reduction from reuse)`;
          } else if (config.showSavingsAmount) {
            savingsLine += `${savings.toLocaleString()} tokens`;
          } else {
            savingsLine += `${savingsPercent}% reduction from reuse`;
          }
          output.push(savingsLine);
        }
        output.push('');
      }
    }

    // Prepare summaries for timeline display
    const mostRecentSummaryId = recentSummaries[0]?.id;

    interface SummaryTimelineItem extends SessionSummary {
      displayEpoch: number;
      displayTime: string;
      shouldShowLink: boolean;
    }

    const summariesForTimeline: SummaryTimelineItem[] = displaySummaries.map((summary, i) => {
      const olderSummary = i === 0 ? null : recentSummaries[i + 1];
      return {
        ...summary,
        displayEpoch: olderSummary ? olderSummary.created_at_epoch : summary.created_at_epoch,
        displayTime: olderSummary ? olderSummary.created_at : summary.created_at,
        shouldShowLink: summary.id !== mostRecentSummaryId
      };
    });

    // Identify which observations should show full details
    const fullObservationIds = new Set(
      observations
        .slice(0, config.fullObservationCount)
        .map(obs => obs.id)
    );

    type TimelineItem =
      | { type: 'observation'; data: Observation }
      | { type: 'summary'; data: SummaryTimelineItem };

    const timeline: TimelineItem[] = [
      ...timelineObs.map(obs => ({ type: 'observation' as const, data: obs })),
      ...summariesForTimeline.map(summary => ({ type: 'summary' as const, data: summary }))
    ];

    // Sort chronologically
    timeline.sort((a, b) => {
      const aEpoch = a.type === 'observation' ? a.data.created_at_epoch : a.data.displayEpoch;
      const bEpoch = b.type === 'observation' ? b.data.created_at_epoch : b.data.displayEpoch;
      return aEpoch - bEpoch;
    });

    // Group by day
    const itemsByDay = new Map<string, TimelineItem[]>();
    for (const item of timeline) {
      const itemDate = item.type === 'observation' ? item.data.created_at : item.data.displayTime;
      const day = formatDate(itemDate);
      if (!itemsByDay.has(day)) {
        itemsByDay.set(day, []);
      }
      itemsByDay.get(day)!.push(item);
    }

    // Sort days chronologically
    const sortedDays = Array.from(itemsByDay.entries()).sort((a, b) => {
      const aDate = new Date(a[0]).getTime();
      const bDate = new Date(b[0]).getTime();
      return aDate - bDate;
    });
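The merge-then-group step above reduces to: combine two streams, sort by epoch, bucket by day. A minimal sketch with illustrative names (`Item`, `groupByDay`), assuming day strings are precomputed rather than derived via `formatDate`:

```typescript
// Sketch of merging observations and summaries into one chronological
// timeline, then bucketing into per-day groups in insertion order.
type Item = { kind: 'observation' | 'summary'; epoch: number; day: string };

function groupByDay(observations: Item[], summaries: Item[]): Map<string, Item[]> {
  const timeline = [...observations, ...summaries].sort((a, b) => a.epoch - b.epoch);
  const byDay = new Map<string, Item[]>();
  for (const item of timeline) {
    if (!byDay.has(item.day)) byDay.set(item.day, []);
    byDay.get(item.day)!.push(item);
  }
  return byDay;
}
```

Sorting before bucketing means each day's array is already in chronological order, so the renderer never re-sorts within a day.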
    // Render each day's timeline
    for (const [day, dayItems] of sortedDays) {
      if (useColors) {
        output.push(`${colors.bright}${colors.cyan}${day}${colors.reset}`);
        output.push('');
      } else {
        output.push(`### ${day}`);
        output.push('');
      }

      let currentFile: string | null = null;
      let lastTime = '';
      let tableOpen = false;

      for (const item of dayItems) {
        if (item.type === 'summary') {
          if (tableOpen) {
            output.push('');
            tableOpen = false;
            currentFile = null;
            lastTime = '';
          }

          const summary = item.data;
          const summaryTitle = `${summary.request || 'Session started'} (${formatDateTime(summary.displayTime)})`;

          if (useColors) {
            output.push(`🎯 ${colors.yellow}#S${summary.id}${colors.reset} ${summaryTitle}`);
          } else {
            output.push(`**🎯 #S${summary.id}** ${summaryTitle}`);
          }
          output.push('');
        } else {
          const obs = item.data;
          const file = extractFirstFile(obs.files_modified, cwd);

          if (file !== currentFile) {
            if (tableOpen) {
              output.push('');
            }

            if (useColors) {
              output.push(`${colors.dim}${file}${colors.reset}`);
            } else {
              output.push(`**${file}**`);
            }

            if (!useColors) {
              output.push(`| ID | Time | T | Title | Read | Work |`);
              output.push(`|----|------|---|-------|------|------|`);
            }

            currentFile = file;
            tableOpen = true;
            lastTime = '';
          }

          const time = formatTime(obs.created_at);
          const title = obs.title || 'Untitled';
          const icon = ModeManager.getInstance().getTypeIcon(obs.type);

          const obsSize = (obs.title?.length || 0) +
            (obs.subtitle?.length || 0) +
            (obs.narrative?.length || 0) +
            JSON.stringify(obs.facts || []).length;
          const readTokens = Math.ceil(obsSize / CHARS_PER_TOKEN_ESTIMATE);
          const discoveryTokens = obs.discovery_tokens || 0;
          const workEmoji = ModeManager.getInstance().getWorkEmoji(obs.type);
          const discoveryDisplay = discoveryTokens > 0 ? `${workEmoji} ${discoveryTokens.toLocaleString()}` : '-';

          const showTime = time !== lastTime;
          const timeDisplay = showTime ? time : '';
          lastTime = time;

          const shouldShowFull = fullObservationIds.has(obs.id);

          if (shouldShowFull) {
            const detailField = config.fullObservationField === 'narrative'
              ? obs.narrative
              : (obs.facts ? parseJsonArray(obs.facts).join('\n') : null);

            if (useColors) {
              const timePart = showTime ? `${colors.dim}${time}${colors.reset}` : ' '.repeat(time.length);
              const readPart = (config.showReadTokens && readTokens > 0) ? `${colors.dim}(~${readTokens}t)${colors.reset}` : '';
              const discoveryPart = (config.showWorkTokens && discoveryTokens > 0) ? `${colors.dim}(${workEmoji} ${discoveryTokens.toLocaleString()}t)${colors.reset}` : '';

              output.push(` ${colors.dim}#${obs.id}${colors.reset} ${timePart} ${icon} ${colors.bright}${title}${colors.reset}`);
              if (detailField) {
                output.push(` ${colors.dim}${detailField}${colors.reset}`);
              }
              if (readPart || discoveryPart) {
                output.push(` ${readPart} ${discoveryPart}`);
              }
              output.push('');
            } else {
              if (tableOpen) {
                output.push('');
                tableOpen = false;
              }

              output.push(`**#${obs.id}** ${timeDisplay || '″'} ${icon} **${title}**`);
              if (detailField) {
                output.push('');
                output.push(detailField);
                output.push('');
              }
              const tokenParts: string[] = [];
              if (config.showReadTokens) {
                tokenParts.push(`Read: ~${readTokens}`);
              }
              if (config.showWorkTokens) {
                tokenParts.push(`Work: ${discoveryDisplay}`);
              }
              if (tokenParts.length > 0) {
                output.push(tokenParts.join(', '));
              }
              output.push('');
              currentFile = null;
            }
          } else {
            if (useColors) {
              const timePart = showTime ? `${colors.dim}${time}${colors.reset}` : ' '.repeat(time.length);
              const readPart = (config.showReadTokens && readTokens > 0) ? `${colors.dim}(~${readTokens}t)${colors.reset}` : '';
              const discoveryPart = (config.showWorkTokens && discoveryTokens > 0) ? `${colors.dim}(${workEmoji} ${discoveryTokens.toLocaleString()}t)${colors.reset}` : '';
              output.push(` ${colors.dim}#${obs.id}${colors.reset} ${timePart} ${icon} ${title} ${readPart} ${discoveryPart}`);
            } else {
              const readCol = config.showReadTokens ? `~${readTokens}` : '';
              const workCol = config.showWorkTokens ? discoveryDisplay : '';
              output.push(`| #${obs.id} | ${timeDisplay || '″'} | ${icon} | ${title} | ${readCol} | ${workCol} |`);
            }
          }
        }
      }

      if (tableOpen) {
        output.push('');
      }
    }

    // Add full summary details for most recent session
    const mostRecentSummary = recentSummaries[0];
    const mostRecentObservation = observations[0];

    const shouldShowSummary = config.showLastSummary &&
      mostRecentSummary &&
      (mostRecentSummary.investigated || mostRecentSummary.learned || mostRecentSummary.completed || mostRecentSummary.next_steps) &&
      (!mostRecentObservation || mostRecentSummary.created_at_epoch > mostRecentObservation.created_at_epoch);

    if (shouldShowSummary) {
      output.push(...renderSummaryField('Investigated', mostRecentSummary.investigated, colors.blue, useColors));
      output.push(...renderSummaryField('Learned', mostRecentSummary.learned, colors.yellow, useColors));
      output.push(...renderSummaryField('Completed', mostRecentSummary.completed, colors.green, useColors));
      output.push(...renderSummaryField('Next Steps', mostRecentSummary.next_steps, colors.magenta, useColors));
    }

    // Previously section
    if (priorAssistantMessage) {
      output.push('');
      output.push('---');
      output.push('');
      if (useColors) {
        output.push(`${colors.bright}${colors.magenta}📋 Previously${colors.reset}`);
        output.push('');
        output.push(`${colors.dim}A: ${priorAssistantMessage}${colors.reset}`);
      } else {
        output.push(`**📋 Previously**`);
        output.push('');
        output.push(`A: ${priorAssistantMessage}`);
      }
      output.push('');
    }

    // Footer
    if (showContextEconomics && totalDiscoveryTokens > 0 && savings > 0) {
      const workTokensK = Math.round(totalDiscoveryTokens / 1000);
      output.push('');
      if (useColors) {
        output.push(`${colors.dim}💰 Access ${workTokensK}k tokens of past research & decisions for just ${totalReadTokens.toLocaleString()}t. Use the mem-search skill to access memories by ID instead of re-reading files.${colors.reset}`);
      } else {
        output.push(`💰 Access ${workTokensK}k tokens of past research & decisions for just ${totalReadTokens.toLocaleString()}t. Use the mem-search skill to access memories by ID instead of re-reading files.`);
      }
    }
  }

  db?.close();
  return output.join('\n').trimEnd();
}
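The token economics above rest on two small calculations: a roughly 4-characters-per-token read estimate and a savings percentage against recorded discovery tokens. A self-contained sketch (helper names are illustrative; only `CHARS_PER_TOKEN_ESTIMATE` mirrors the file's constant):

```typescript
// Sketch of the read-cost estimate and savings math used in the
// Context Economics section above.
const CHARS_PER_TOKEN_ESTIMATE = 4;

function readTokens(obs: { title?: string; subtitle?: string; narrative?: string; facts?: string[] }): number {
  // Sum the characters a reader would actually consume, then round up
  const size = (obs.title?.length || 0)
    + (obs.subtitle?.length || 0)
    + (obs.narrative?.length || 0)
    + JSON.stringify(obs.facts || []).length;
  return Math.ceil(size / CHARS_PER_TOKEN_ESTIMATE);
}

function savingsPercent(totalDiscovery: number, totalRead: number): number {
  // Guard the divide-by-zero case the same way the original does
  return totalDiscovery > 0
    ? Math.round(((totalDiscovery - totalRead) / totalDiscovery) * 100)
    : 0;
}
```

The `JSON.stringify(obs.facts || [])` term means an observation with no facts still contributes the two characters of `[]`, which is why estimates are floors-of-ceilings rather than exact.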
// Re-export everything from the new context module
export { generateContext } from './context/index.js';
export type { ContextInput, ContextConfig } from './context/types.js';
src/services/context/ContextBuilder.ts (new file, 161 lines)
@@ -0,0 +1,161 @@
/**
 * ContextBuilder - Main orchestrator for context generation
 *
 * Coordinates all context generation components to build the final output.
 * This is the primary entry point for context generation.
 */

import path from 'path';
import { homedir } from 'os';
import { unlinkSync } from 'fs';
import { SessionStore } from '../sqlite/SessionStore.js';
import { logger } from '../../utils/logger.js';
import { getProjectName } from '../../utils/project-name.js';

import type { ContextInput, ContextConfig, Observation, SessionSummary } from './types.js';
import { loadContextConfig } from './ContextConfigLoader.js';
import { calculateTokenEconomics } from './TokenCalculator.js';
import {
  queryObservations,
  querySummaries,
  getPriorSessionMessages,
  prepareSummariesForTimeline,
  buildTimeline,
  getFullObservationIds,
} from './ObservationCompiler.js';
import { renderHeader } from './sections/HeaderRenderer.js';
import { renderTimeline } from './sections/TimelineRenderer.js';
import { shouldShowSummary, renderSummaryFields } from './sections/SummaryRenderer.js';
import { renderPreviouslySection, renderFooter } from './sections/FooterRenderer.js';
import { renderMarkdownEmptyState } from './formatters/MarkdownFormatter.js';
import { renderColorEmptyState } from './formatters/ColorFormatter.js';

// Version marker path for native module error handling
const VERSION_MARKER_PATH = path.join(
  homedir(),
  '.claude',
  'plugins',
  'marketplaces',
  'thedotmack',
  'plugin',
  '.install-version'
);

/**
 * Initialize database connection with error handling
 */
function initializeDatabase(): SessionStore | null {
  try {
    return new SessionStore();
  } catch (error: any) {
    if (error.code === 'ERR_DLOPEN_FAILED') {
      try {
        unlinkSync(VERSION_MARKER_PATH);
      } catch (unlinkError) {
        logger.debug('SYSTEM', 'Marker file cleanup failed (may not exist)', {}, unlinkError as Error);
      }
      logger.error('SYSTEM', 'Native module rebuild needed - restart Claude Code to auto-fix');
      return null;
    }
    throw error;
  }
}

/**
 * Render empty state when no data exists
 */
function renderEmptyState(project: string, useColors: boolean): string {
  return useColors ? renderColorEmptyState(project) : renderMarkdownEmptyState(project);
}
|
||||
/**
|
||||
* Build context output from loaded data
|
||||
*/
|
||||
function buildContextOutput(
|
||||
project: string,
|
||||
observations: Observation[],
|
||||
summaries: SessionSummary[],
|
||||
config: ContextConfig,
|
||||
cwd: string,
|
||||
sessionId: string | undefined,
|
||||
useColors: boolean
|
||||
): string {
|
||||
const output: string[] = [];
|
||||
|
||||
// Calculate token economics
|
||||
const economics = calculateTokenEconomics(observations);
|
||||
|
||||
// Render header section
|
||||
output.push(...renderHeader(project, economics, config, useColors));
|
||||
|
||||
// Prepare timeline data
|
||||
const displaySummaries = summaries.slice(0, config.sessionCount);
|
||||
const summariesForTimeline = prepareSummariesForTimeline(displaySummaries, summaries);
|
||||
const timeline = buildTimeline(observations, summariesForTimeline);
|
||||
const fullObservationIds = getFullObservationIds(observations, config.fullObservationCount);
|
||||
|
||||
// Render timeline
|
||||
output.push(...renderTimeline(timeline, fullObservationIds, config, cwd, useColors));
|
||||
|
||||
// Render most recent summary if applicable
|
||||
const mostRecentSummary = summaries[0];
|
||||
const mostRecentObservation = observations[0];
|
||||
|
||||
if (shouldShowSummary(config, mostRecentSummary, mostRecentObservation)) {
|
||||
output.push(...renderSummaryFields(mostRecentSummary, useColors));
|
||||
}
|
||||
|
||||
// Render previously section (prior assistant message)
|
||||
const priorMessages = getPriorSessionMessages(observations, config, sessionId, cwd);
|
||||
output.push(...renderPreviouslySection(priorMessages, useColors));
|
||||
|
||||
// Render footer
|
||||
output.push(...renderFooter(economics, config, useColors));
|
||||
|
||||
return output.join('\n').trimEnd();
|
||||
}
|
||||
|
||||
/**
|
||||
* Generate context for a project
|
||||
*
|
||||
* Main entry point for context generation. Orchestrates loading config,
|
||||
* querying data, and rendering the final context string.
|
||||
*/
|
||||
export async function generateContext(
|
||||
input?: ContextInput,
|
||||
useColors: boolean = false
|
||||
): Promise<string> {
|
||||
const config = loadContextConfig();
|
||||
const cwd = input?.cwd ?? process.cwd();
|
||||
const project = getProjectName(cwd);
|
||||
|
||||
// Initialize database
|
||||
const db = initializeDatabase();
|
||||
if (!db) {
|
||||
return '';
|
||||
}
|
||||
|
||||
try {
|
||||
// Query data
|
||||
const observations = queryObservations(db, project, config);
|
||||
const summaries = querySummaries(db, project, config);
|
||||
|
||||
// Handle empty state
|
||||
if (observations.length === 0 && summaries.length === 0) {
|
||||
return renderEmptyState(project, useColors);
|
||||
}
|
||||
|
||||
// Build and return context
|
||||
return buildContextOutput(
|
||||
project,
|
||||
observations,
|
||||
summaries,
|
||||
config,
|
||||
cwd,
|
||||
input?.session_id,
|
||||
useColors
|
||||
);
|
||||
} finally {
|
||||
db.close();
|
||||
}
|
||||
}
|
||||
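`generateContext` above guards the `SessionStore` handle with `try`/`finally` so the database closes on every exit path, where the old monolith relied on a single `db?.close()` call. A reduced, standalone sketch of that pattern (the `FakeStore`/`withStore` names are illustrative, not from the codebase):

```typescript
// Stand-in for SessionStore; only the close() contract matters here.
class FakeStore {
  closed = false;
  close(): void { this.closed = true; }
}

// Run work against a store, guaranteeing close() on both return and throw.
function withStore<T>(store: FakeStore, work: (s: FakeStore) => T): T {
  try {
    return work(store);
  } finally {
    store.close(); // runs whether work() returned or threw
  }
}

const s = new FakeStore();
try {
  withStore(s, () => { throw new Error('query failed'); });
} catch {
  // the error propagates, but the handle is already closed
}
```

The same guarantee holds for early returns such as the empty-state branch, which is why the refactored version no longer needs optional chaining on `close()`.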
57
src/services/context/ContextConfigLoader.ts
Normal file
@@ -0,0 +1,57 @@
/**
 * ContextConfigLoader - Loads and validates context configuration
 *
 * Handles loading settings from file with mode-based filtering for observation types.
 */

import path from 'path';
import { homedir } from 'os';
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import { ModeManager } from '../domain/ModeManager.js';
import type { ContextConfig } from './types.js';

/**
 * Load all context configuration settings
 * Priority: ~/.claude-mem/settings.json > env var > defaults
 */
export function loadContextConfig(): ContextConfig {
  const settingsPath = path.join(homedir(), '.claude-mem', 'settings.json');
  const settings = SettingsDefaultsManager.loadFromFile(settingsPath);

  // For non-code modes, use all types/concepts from active mode instead of settings
  const modeId = settings.CLAUDE_MEM_MODE;
  const isCodeMode = modeId === 'code' || modeId.startsWith('code--');

  let observationTypes: Set<string>;
  let observationConcepts: Set<string>;

  if (isCodeMode) {
    // Code mode: use settings-based filtering
    observationTypes = new Set(
      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES.split(',').map((t: string) => t.trim()).filter(Boolean)
    );
    observationConcepts = new Set(
      settings.CLAUDE_MEM_CONTEXT_OBSERVATION_CONCEPTS.split(',').map((c: string) => c.trim()).filter(Boolean)
    );
  } else {
    // Non-code modes: use all types/concepts from active mode
    const mode = ModeManager.getInstance().getActiveMode();
    observationTypes = new Set(mode.observation_types.map(t => t.id));
    observationConcepts = new Set(mode.observation_concepts.map(c => c.id));
  }

  return {
    totalObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_OBSERVATIONS, 10),
    fullObservationCount: parseInt(settings.CLAUDE_MEM_CONTEXT_FULL_COUNT, 10),
    sessionCount: parseInt(settings.CLAUDE_MEM_CONTEXT_SESSION_COUNT, 10),
    showReadTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_READ_TOKENS === 'true',
    showWorkTokens: settings.CLAUDE_MEM_CONTEXT_SHOW_WORK_TOKENS === 'true',
    showSavingsAmount: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_AMOUNT === 'true',
    showSavingsPercent: settings.CLAUDE_MEM_CONTEXT_SHOW_SAVINGS_PERCENT === 'true',
    observationTypes,
    observationConcepts,
    fullObservationField: settings.CLAUDE_MEM_CONTEXT_FULL_FIELD as 'narrative' | 'facts',
    showLastSummary: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_SUMMARY === 'true',
    showLastMessage: settings.CLAUDE_MEM_CONTEXT_SHOW_LAST_MESSAGE === 'true',
  };
}
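The code-mode branch turns comma-separated settings strings into `Set`s via a `split`/`trim`/`filter(Boolean)` chain. A minimal standalone sketch of that parsing (the `parseCsvSet` helper name is ours, not from the codebase):

```typescript
// Hypothetical helper mirroring the chain used on CLAUDE_MEM_CONTEXT_OBSERVATION_TYPES:
// split on commas, trim whitespace, drop empty entries, dedupe via Set.
function parseCsvSet(raw: string): Set<string> {
  return new Set(
    raw.split(',').map((item) => item.trim()).filter(Boolean)
  );
}

// Stray whitespace and empty segments are tolerated:
// 'bugfix, decision,,feature ' yields { bugfix, decision, feature }.
const types = parseCsvSet('bugfix, decision,,feature ');
```

The `filter(Boolean)` step is what makes trailing commas and double commas in user settings harmless rather than producing an empty-string type that would never match.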
202
src/services/context/ObservationCompiler.ts
Normal file
@@ -0,0 +1,202 @@
/**
 * ObservationCompiler - Query building and data retrieval for context
 *
 * Handles database queries for observations and summaries, plus transcript extraction.
 */

import path from 'path';
import { homedir } from 'os';
import { existsSync, readFileSync } from 'fs';
import { SessionStore } from '../sqlite/SessionStore.js';
import { logger } from '../../utils/logger.js';
import type {
  ContextConfig,
  Observation,
  SessionSummary,
  SummaryTimelineItem,
  TimelineItem,
  PriorMessages,
} from './types.js';
import { SUMMARY_LOOKAHEAD } from './types.js';

/**
 * Query observations from database with type and concept filtering
 */
export function queryObservations(
  db: SessionStore,
  project: string,
  config: ContextConfig
): Observation[] {
  const typeArray = Array.from(config.observationTypes);
  const typePlaceholders = typeArray.map(() => '?').join(',');
  const conceptArray = Array.from(config.observationConcepts);
  const conceptPlaceholders = conceptArray.map(() => '?').join(',');

  return db.db.prepare(`
    SELECT
      id, memory_session_id, type, title, subtitle, narrative,
      facts, concepts, files_read, files_modified, discovery_tokens,
      created_at, created_at_epoch
    FROM observations
    WHERE project = ?
      AND type IN (${typePlaceholders})
      AND EXISTS (
        SELECT 1 FROM json_each(concepts)
        WHERE value IN (${conceptPlaceholders})
      )
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `).all(project, ...typeArray, ...conceptArray, config.totalObservationCount) as Observation[];
}

/**
 * Query recent session summaries from database
 */
export function querySummaries(
  db: SessionStore,
  project: string,
  config: ContextConfig
): SessionSummary[] {
  return db.db.prepare(`
    SELECT id, memory_session_id, request, investigated, learned, completed, next_steps, created_at, created_at_epoch
    FROM session_summaries
    WHERE project = ?
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `).all(project, config.sessionCount + SUMMARY_LOOKAHEAD) as SessionSummary[];
}

/**
 * Convert cwd path to dashed format for transcript lookup
 */
function cwdToDashed(cwd: string): string {
  return cwd.replace(/\//g, '-');
}

/**
 * Extract prior messages from transcript file
 */
export function extractPriorMessages(transcriptPath: string): PriorMessages {
  try {
    if (!existsSync(transcriptPath)) {
      return { userMessage: '', assistantMessage: '' };
    }

    const content = readFileSync(transcriptPath, 'utf-8').trim();
    if (!content) {
      return { userMessage: '', assistantMessage: '' };
    }

    const lines = content.split('\n').filter(line => line.trim());
    let lastAssistantMessage = '';

    for (let i = lines.length - 1; i >= 0; i--) {
      try {
        const line = lines[i];
        if (!line.includes('"type":"assistant"')) {
          continue;
        }

        const entry = JSON.parse(line);
        if (entry.type === 'assistant' && entry.message?.content && Array.isArray(entry.message.content)) {
          let text = '';
          for (const block of entry.message.content) {
            if (block.type === 'text') {
              text += block.text;
            }
          }
          text = text.replace(/<system-reminder>[\s\S]*?<\/system-reminder>/g, '').trim();
          if (text) {
            lastAssistantMessage = text;
            break;
          }
        }
      } catch (parseError) {
        logger.debug('PARSER', 'Skipping malformed transcript line', { lineIndex: i }, parseError as Error);
        continue;
      }
    }

    return { userMessage: '', assistantMessage: lastAssistantMessage };
  } catch (error) {
    logger.failure('WORKER', `Failed to extract prior messages from transcript`, { transcriptPath }, error as Error);
    return { userMessage: '', assistantMessage: '' };
  }
}

/**
 * Get prior session messages if enabled
 */
export function getPriorSessionMessages(
  observations: Observation[],
  config: ContextConfig,
  currentSessionId: string | undefined,
  cwd: string
): PriorMessages {
  if (!config.showLastMessage || observations.length === 0) {
    return { userMessage: '', assistantMessage: '' };
  }

  const priorSessionObs = observations.find(obs => obs.memory_session_id !== currentSessionId);
  if (!priorSessionObs) {
    return { userMessage: '', assistantMessage: '' };
  }

  const priorSessionId = priorSessionObs.memory_session_id;
  const dashedCwd = cwdToDashed(cwd);
  const transcriptPath = path.join(homedir(), '.claude', 'projects', dashedCwd, `${priorSessionId}.jsonl`);
  return extractPriorMessages(transcriptPath);
}

/**
 * Prepare summaries for timeline display
 */
export function prepareSummariesForTimeline(
  displaySummaries: SessionSummary[],
  allSummaries: SessionSummary[]
): SummaryTimelineItem[] {
  const mostRecentSummaryId = allSummaries[0]?.id;

  return displaySummaries.map((summary, i) => {
    const olderSummary = i === 0 ? null : allSummaries[i + 1];
    return {
      ...summary,
      displayEpoch: olderSummary ? olderSummary.created_at_epoch : summary.created_at_epoch,
      displayTime: olderSummary ? olderSummary.created_at : summary.created_at,
      shouldShowLink: summary.id !== mostRecentSummaryId
    };
  });
}

/**
 * Build unified timeline from observations and summaries
 */
export function buildTimeline(
  observations: Observation[],
  summaries: SummaryTimelineItem[]
): TimelineItem[] {
  const timeline: TimelineItem[] = [
    ...observations.map(obs => ({ type: 'observation' as const, data: obs })),
    ...summaries.map(summary => ({ type: 'summary' as const, data: summary }))
  ];

  // Sort chronologically
  timeline.sort((a, b) => {
    const aEpoch = a.type === 'observation' ? a.data.created_at_epoch : a.data.displayEpoch;
    const bEpoch = b.type === 'observation' ? b.data.created_at_epoch : b.data.displayEpoch;
    return aEpoch - bEpoch;
  });

  return timeline;
}

/**
 * Get set of observation IDs that should show full details
 */
export function getFullObservationIds(observations: Observation[], count: number): Set<number> {
  return new Set(
    observations
      .slice(0, count)
      .map(obs => obs.id)
  );
}
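`buildTimeline` merges two heterogeneous lists and sorts them ascending by epoch, with summaries sorting on `displayEpoch` (a borrowed timestamp from `prepareSummariesForTimeline`) rather than their own `created_at_epoch`. A reduced sketch of that discriminated-union sort, with minimal local types standing in for the real `Observation` and `SummaryTimelineItem`:

```typescript
// Minimal stand-ins: only the fields the sort comparator touches.
type Item =
  | { type: 'observation'; data: { created_at_epoch: number } }
  | { type: 'summary'; data: { displayEpoch: number } };

// Same shape as buildTimeline's comparator: pick the epoch per variant, sort ascending.
function mergeTimeline(items: Item[]): Item[] {
  return [...items].sort((a, b) => {
    const aEpoch = a.type === 'observation' ? a.data.created_at_epoch : a.data.displayEpoch;
    const bEpoch = b.type === 'observation' ? b.data.created_at_epoch : b.data.displayEpoch;
    return aEpoch - bEpoch;
  });
}

const merged = mergeTimeline([
  { type: 'summary', data: { displayEpoch: 200 } },
  { type: 'observation', data: { created_at_epoch: 100 } },
]);
// Oldest first: the observation (epoch 100) precedes the summary (epoch 200).
```

The `type` discriminant lets TypeScript narrow `data` inside each ternary branch, so no casts are needed even though the two variants carry different timestamp fields.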
78
src/services/context/TokenCalculator.ts
Normal file
@@ -0,0 +1,78 @@
/**
 * TokenCalculator - Token budget calculations for context economics
 *
 * Handles estimation of token counts for observations and context economics.
 */

import type { Observation, TokenEconomics, ContextConfig } from './types.js';
import { CHARS_PER_TOKEN_ESTIMATE } from './types.js';
import { ModeManager } from '../domain/ModeManager.js';

/**
 * Calculate token count for a single observation
 */
export function calculateObservationTokens(obs: Observation): number {
  const obsSize = (obs.title?.length || 0) +
    (obs.subtitle?.length || 0) +
    (obs.narrative?.length || 0) +
    JSON.stringify(obs.facts || []).length;
  return Math.ceil(obsSize / CHARS_PER_TOKEN_ESTIMATE);
}

/**
 * Calculate context economics for a set of observations
 */
export function calculateTokenEconomics(observations: Observation[]): TokenEconomics {
  const totalObservations = observations.length;

  const totalReadTokens = observations.reduce((sum, obs) => {
    return sum + calculateObservationTokens(obs);
  }, 0);

  const totalDiscoveryTokens = observations.reduce((sum, obs) => {
    return sum + (obs.discovery_tokens || 0);
  }, 0);

  const savings = totalDiscoveryTokens - totalReadTokens;
  const savingsPercent = totalDiscoveryTokens > 0
    ? Math.round((savings / totalDiscoveryTokens) * 100)
    : 0;

  return {
    totalObservations,
    totalReadTokens,
    totalDiscoveryTokens,
    savings,
    savingsPercent,
  };
}

/**
 * Get work emoji for an observation type
 */
export function getWorkEmoji(obsType: string): string {
  return ModeManager.getInstance().getWorkEmoji(obsType);
}

/**
 * Format token display for an observation
 */
export function formatObservationTokenDisplay(
  obs: Observation,
  config: ContextConfig
): { readTokens: number; discoveryTokens: number; discoveryDisplay: string; workEmoji: string } {
  const readTokens = calculateObservationTokens(obs);
  const discoveryTokens = obs.discovery_tokens || 0;
  const workEmoji = getWorkEmoji(obs.type);
  const discoveryDisplay = discoveryTokens > 0 ? `${workEmoji} ${discoveryTokens.toLocaleString()}` : '-';

  return { readTokens, discoveryTokens, discoveryDisplay, workEmoji };
}

/**
 * Check if context economics should be shown
 */
export function shouldShowContextEconomics(config: ContextConfig): boolean {
  return config.showReadTokens || config.showWorkTokens ||
    config.showSavingsAmount || config.showSavingsPercent;
}
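The economics above reduce to simple arithmetic over two totals: tokens to re-read the stored observations versus tokens originally spent producing them. A standalone sketch of the flow, assuming the common ~4 chars/token heuristic for `CHARS_PER_TOKEN_ESTIMATE` (the real constant lives in `types.ts` and may differ):

```typescript
const CHARS_PER_TOKEN = 4; // assumption; the codebase uses CHARS_PER_TOKEN_ESTIMATE from types.ts

// Simplified observation shape: total character size plus recorded discovery cost.
function economics(obs: { chars: number; discovery_tokens: number }[]) {
  // Cost to read the stored records now.
  const totalReadTokens = obs.reduce((sum, o) => sum + Math.ceil(o.chars / CHARS_PER_TOKEN), 0);
  // Cost of the original work that produced them.
  const totalDiscoveryTokens = obs.reduce((sum, o) => sum + o.discovery_tokens, 0);
  const savings = totalDiscoveryTokens - totalReadTokens;
  const savingsPercent = totalDiscoveryTokens > 0
    ? Math.round((savings / totalDiscoveryTokens) * 100)
    : 0; // guard: no discovery tokens means no meaningful percentage
  return { totalReadTokens, totalDiscoveryTokens, savings, savingsPercent };
}

// 400 chars → 100 read tokens vs 1,000 tokens of original work: 900 saved, 90%.
const e = economics([{ chars: 400, discovery_tokens: 1000 }]);
```

The `totalDiscoveryTokens > 0` guard is the same divide-by-zero protection `calculateTokenEconomics` applies before computing `savingsPercent`.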
223
src/services/context/formatters/ColorFormatter.ts
Normal file
@@ -0,0 +1,223 @@
/**
 * ColorFormatter - Formats context output with ANSI colors for terminal
 *
 * Handles all colored formatting for context injection (terminal display).
 */

import type {
  ContextConfig,
  Observation,
  TokenEconomics,
  PriorMessages,
} from '../types.js';
import { colors } from '../types.js';
import { ModeManager } from '../../domain/ModeManager.js';
import { formatObservationTokenDisplay } from '../TokenCalculator.js';

/**
 * Render colored header
 */
export function renderColorHeader(project: string): string[] {
  return [
    '',
    `${colors.bright}${colors.cyan}[${project}] recent context${colors.reset}`,
    `${colors.gray}${'─'.repeat(60)}${colors.reset}`,
    ''
  ];
}

/**
 * Render colored legend
 */
export function renderColorLegend(): string[] {
  const mode = ModeManager.getInstance().getActiveMode();
  const typeLegendItems = mode.observation_types.map(t => `${t.emoji} ${t.id}`).join(' | ');

  return [
    `${colors.dim}Legend: session-request | ${typeLegendItems}${colors.reset}`,
    ''
  ];
}

/**
 * Render colored column key
 */
export function renderColorColumnKey(): string[] {
  return [
    `${colors.bright}Column Key${colors.reset}`,
    `${colors.dim}  Read: Tokens to read this observation (cost to learn it now)${colors.reset}`,
    `${colors.dim}  Work: Tokens spent on work that produced this record (research, building, deciding)${colors.reset}`,
    ''
  ];
}

/**
 * Render colored context index instructions
 */
export function renderColorContextIndex(): string[] {
  return [
    `${colors.dim}Context Index: This semantic index (titles, types, files, tokens) is usually sufficient to understand past work.${colors.reset}`,
    '',
    `${colors.dim}When you need implementation details, rationale, or debugging context:${colors.reset}`,
    `${colors.dim}  - Use the mem-search skill to fetch full observations on-demand${colors.reset}`,
    `${colors.dim}  - Critical types (bugfix, decision) often need detailed fetching${colors.reset}`,
    `${colors.dim}  - Trust this index over re-reading code for past decisions and learnings${colors.reset}`,
    ''
  ];
}

/**
 * Render colored context economics
 */
export function renderColorContextEconomics(
  economics: TokenEconomics,
  config: ContextConfig
): string[] {
  const output: string[] = [];

  output.push(`${colors.bright}${colors.cyan}Context Economics${colors.reset}`);
  output.push(`${colors.dim}  Loading: ${economics.totalObservations} observations (${economics.totalReadTokens.toLocaleString()} tokens to read)${colors.reset}`);
  output.push(`${colors.dim}  Work investment: ${economics.totalDiscoveryTokens.toLocaleString()} tokens spent on research, building, and decisions${colors.reset}`);

  if (economics.totalDiscoveryTokens > 0 && (config.showSavingsAmount || config.showSavingsPercent)) {
    let savingsLine = '  Your savings: ';
    if (config.showSavingsAmount && config.showSavingsPercent) {
      savingsLine += `${economics.savings.toLocaleString()} tokens (${economics.savingsPercent}% reduction from reuse)`;
    } else if (config.showSavingsAmount) {
      savingsLine += `${economics.savings.toLocaleString()} tokens`;
    } else {
      savingsLine += `${economics.savingsPercent}% reduction from reuse`;
    }
    output.push(`${colors.green}${savingsLine}${colors.reset}`);
  }
  output.push('');

  return output;
}

/**
 * Render colored day header
 */
export function renderColorDayHeader(day: string): string[] {
  return [
    `${colors.bright}${colors.cyan}${day}${colors.reset}`,
    ''
  ];
}

/**
 * Render colored file header
 */
export function renderColorFileHeader(file: string): string[] {
  return [
    `${colors.dim}${file}${colors.reset}`
  ];
}

/**
 * Render colored table row for observation
 */
export function renderColorTableRow(
  obs: Observation,
  time: string,
  showTime: boolean,
  config: ContextConfig
): string {
  const title = obs.title || 'Untitled';
  const icon = ModeManager.getInstance().getTypeIcon(obs.type);
  const { readTokens, discoveryTokens, workEmoji } = formatObservationTokenDisplay(obs, config);

  const timePart = showTime ? `${colors.dim}${time}${colors.reset}` : ' '.repeat(time.length);
  const readPart = (config.showReadTokens && readTokens > 0) ? `${colors.dim}(~${readTokens}t)${colors.reset}` : '';
  const discoveryPart = (config.showWorkTokens && discoveryTokens > 0) ? `${colors.dim}(${workEmoji} ${discoveryTokens.toLocaleString()}t)${colors.reset}` : '';

  return `  ${colors.dim}#${obs.id}${colors.reset} ${timePart} ${icon} ${title} ${readPart} ${discoveryPart}`;
}

/**
 * Render colored full observation
 */
export function renderColorFullObservation(
  obs: Observation,
  time: string,
  showTime: boolean,
  detailField: string | null,
  config: ContextConfig
): string[] {
  const output: string[] = [];
  const title = obs.title || 'Untitled';
  const icon = ModeManager.getInstance().getTypeIcon(obs.type);
  const { readTokens, discoveryTokens, workEmoji } = formatObservationTokenDisplay(obs, config);

  const timePart = showTime ? `${colors.dim}${time}${colors.reset}` : ' '.repeat(time.length);
  const readPart = (config.showReadTokens && readTokens > 0) ? `${colors.dim}(~${readTokens}t)${colors.reset}` : '';
  const discoveryPart = (config.showWorkTokens && discoveryTokens > 0) ? `${colors.dim}(${workEmoji} ${discoveryTokens.toLocaleString()}t)${colors.reset}` : '';

  output.push(`  ${colors.dim}#${obs.id}${colors.reset} ${timePart} ${icon} ${colors.bright}${title}${colors.reset}`);
  if (detailField) {
    output.push(`  ${colors.dim}${detailField}${colors.reset}`);
  }
  if (readPart || discoveryPart) {
    output.push(`  ${readPart} ${discoveryPart}`);
  }
  output.push('');

  return output;
}

/**
 * Render colored summary item in timeline
 */
export function renderColorSummaryItem(
  summary: { id: number; request: string | null },
  formattedTime: string
): string[] {
  const summaryTitle = `${summary.request || 'Session started'} (${formattedTime})`;
  return [
    `${colors.yellow}#S${summary.id}${colors.reset} ${summaryTitle}`,
    ''
  ];
}

/**
 * Render colored summary field
 */
export function renderColorSummaryField(label: string, value: string | null, color: string): string[] {
  if (!value) return [];
  return [`${color}${label}:${colors.reset} ${value}`, ''];
}

/**
 * Render colored previously section
 */
export function renderColorPreviouslySection(priorMessages: PriorMessages): string[] {
  if (!priorMessages.assistantMessage) return [];

  return [
    '',
    '---',
    '',
    `${colors.bright}${colors.magenta}Previously${colors.reset}`,
    '',
    `${colors.dim}A: ${priorMessages.assistantMessage}${colors.reset}`,
    ''
  ];
}

/**
 * Render colored footer
 */
export function renderColorFooter(totalDiscoveryTokens: number, totalReadTokens: number): string[] {
  const workTokensK = Math.round(totalDiscoveryTokens / 1000);
  return [
    '',
    `${colors.dim}Access ${workTokensK}k tokens of past research & decisions for just ${totalReadTokens.toLocaleString()}t. Use the mem-search skill to access memories by ID instead of re-reading files.${colors.reset}`
  ];
}

/**
 * Render colored empty state
 */
export function renderColorEmptyState(project: string): string {
  return `\n${colors.bright}${colors.cyan}[${project}] recent context${colors.reset}\n${colors.gray}${'─'.repeat(60)}${colors.reset}\n\n${colors.dim}No previous sessions found for this project yet.${colors.reset}\n`;
}
226
src/services/context/formatters/MarkdownFormatter.ts
Normal file
@@ -0,0 +1,226 @@
/**
 * MarkdownFormatter - Formats context output as markdown (non-colored mode)
 *
 * Handles all markdown formatting for context injection.
 */

import type {
  ContextConfig,
  Observation,
  SessionSummary,
  TokenEconomics,
  PriorMessages,
} from '../types.js';
import { ModeManager } from '../../domain/ModeManager.js';
import { formatObservationTokenDisplay } from '../TokenCalculator.js';

/**
 * Render markdown header
 */
export function renderMarkdownHeader(project: string): string[] {
  return [
    `# [${project}] recent context`,
    ''
  ];
}

/**
 * Render markdown legend
 */
export function renderMarkdownLegend(): string[] {
  const mode = ModeManager.getInstance().getActiveMode();
  const typeLegendItems = mode.observation_types.map(t => `${t.emoji} ${t.id}`).join(' | ');

  return [
    `**Legend:** session-request | ${typeLegendItems}`,
    ''
  ];
}

/**
 * Render markdown column key
 */
export function renderMarkdownColumnKey(): string[] {
  return [
    `**Column Key**:`,
    `- **Read**: Tokens to read this observation (cost to learn it now)`,
    `- **Work**: Tokens spent on work that produced this record (research, building, deciding)`,
    ''
  ];
}

/**
 * Render markdown context index instructions
 */
export function renderMarkdownContextIndex(): string[] {
  return [
    `**Context Index:** This semantic index (titles, types, files, tokens) is usually sufficient to understand past work.`,
    '',
    `When you need implementation details, rationale, or debugging context:`,
    `- Use the mem-search skill to fetch full observations on-demand`,
    `- Critical types (bugfix, decision) often need detailed fetching`,
    `- Trust this index over re-reading code for past decisions and learnings`,
    ''
  ];
}

/**
 * Render markdown context economics
 */
export function renderMarkdownContextEconomics(
  economics: TokenEconomics,
  config: ContextConfig
): string[] {
  const output: string[] = [];

  output.push(`**Context Economics**:`);
  output.push(`- Loading: ${economics.totalObservations} observations (${economics.totalReadTokens.toLocaleString()} tokens to read)`);
  output.push(`- Work investment: ${economics.totalDiscoveryTokens.toLocaleString()} tokens spent on research, building, and decisions`);

  if (economics.totalDiscoveryTokens > 0 && (config.showSavingsAmount || config.showSavingsPercent)) {
    let savingsLine = '- Your savings: ';
    if (config.showSavingsAmount && config.showSavingsPercent) {
      savingsLine += `${economics.savings.toLocaleString()} tokens (${economics.savingsPercent}% reduction from reuse)`;
    } else if (config.showSavingsAmount) {
      savingsLine += `${economics.savings.toLocaleString()} tokens`;
    } else {
      savingsLine += `${economics.savingsPercent}% reduction from reuse`;
    }
    output.push(savingsLine);
  }
  output.push('');

  return output;
}

/**
 * Render markdown day header
 */
export function renderMarkdownDayHeader(day: string): string[] {
  return [
    `### ${day}`,
    ''
  ];
}

/**
 * Render markdown file header with table header
 */
export function renderMarkdownFileHeader(file: string): string[] {
  return [
    `**${file}**`,
    `| ID | Time | T | Title | Read | Work |`,
    `|----|------|---|-------|------|------|`
  ];
}

/**
 * Render markdown table row for observation
 */
export function renderMarkdownTableRow(
  obs: Observation,
  timeDisplay: string,
  config: ContextConfig
): string {
  const title = obs.title || 'Untitled';
  const icon = ModeManager.getInstance().getTypeIcon(obs.type);
  const { readTokens, discoveryDisplay } = formatObservationTokenDisplay(obs, config);

  const readCol = config.showReadTokens ? `~${readTokens}` : '';
  const workCol = config.showWorkTokens ? discoveryDisplay : '';

  return `| #${obs.id} | ${timeDisplay || '"'} | ${icon} | ${title} | ${readCol} | ${workCol} |`;
}

/**
 * Render markdown full observation
 */
export function renderMarkdownFullObservation(
  obs: Observation,
  timeDisplay: string,
  detailField: string | null,
  config: ContextConfig
): string[] {
  const output: string[] = [];
  const title = obs.title || 'Untitled';
  const icon = ModeManager.getInstance().getTypeIcon(obs.type);
  const { readTokens, discoveryDisplay } = formatObservationTokenDisplay(obs, config);

  output.push(`**#${obs.id}** ${timeDisplay || '"'} ${icon} **${title}**`);
  if (detailField) {
    output.push('');
    output.push(detailField);
    output.push('');
  }

  const tokenParts: string[] = [];
  if (config.showReadTokens) {
    tokenParts.push(`Read: ~${readTokens}`);
  }
  if (config.showWorkTokens) {
    tokenParts.push(`Work: ${discoveryDisplay}`);
  }
  if (tokenParts.length > 0) {
    output.push(tokenParts.join(', '));
  }
  output.push('');

  return output;
}

/**
 * Render markdown summary item in timeline
 */
export function renderMarkdownSummaryItem(
  summary: { id: number; request: string | null },
  formattedTime: string
): string[] {
  const summaryTitle = `${summary.request || 'Session started'} (${formattedTime})`;
  return [
    `**#S${summary.id}** ${summaryTitle}`,
    ''
  ];
}

/**
 * Render markdown summary field
 */
export function renderMarkdownSummaryField(label: string, value: string | null): string[] {
  if (!value) return [];
  return [`**${label}**: ${value}`, ''];
}

/**
 * Render markdown previously section
 */
export function renderMarkdownPreviouslySection(priorMessages: PriorMessages): string[] {
  if (!priorMessages.assistantMessage) return [];

  return [
    '',
    '---',
    '',
    `**Previously**`,
    '',
    `A: ${priorMessages.assistantMessage}`,
    ''
  ];
}

/**
 * Render markdown footer
 */
export function renderMarkdownFooter(totalDiscoveryTokens: number, totalReadTokens: number): string[] {
  const workTokensK = Math.round(totalDiscoveryTokens / 1000);
  return [
'',
|
||||
`Access ${workTokensK}k tokens of past research & decisions for just ${totalReadTokens.toLocaleString()}t. Use the mem-search skill to access memories by ID instead of re-reading files.`
|
||||
];
|
||||
}
|
||||
|
||||
/**
|
||||
* Render markdown empty state
|
||||
*/
|
||||
export function renderMarkdownEmptyState(project: string): string {
|
||||
return `# [${project}] recent context\n\nNo previous sessions found for this project yet.`;
|
||||
}
|
||||
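As a standalone illustration of the table shape these markdown renderers compose, here is a minimal sketch with simplified stand-ins (the real functions also pull icons from ModeManager and token columns from config; `fileHeader`/`tableRow` below are hypothetical names for illustration only):

```typescript
// Simplified stand-ins for renderMarkdownFileHeader / renderMarkdownTableRow.
function fileHeader(file: string): string[] {
  return [
    `**${file}**`,
    `| ID | Time | T | Title | Read | Work |`,
    `|----|------|---|-------|------|------|`,
  ];
}

function tableRow(id: number, timeDisplay: string, icon: string, title: string): string {
  // An empty timeDisplay renders as a ditto mark, matching the real renderer
  return `| #${id} | ${timeDisplay || '"'} | ${icon} | ${title} |  |  |`;
}

const lines = [
  ...fileHeader('src/app.ts'),
  tableRow(42, '10:15', 'F', 'Traced session bug'),
  tableRow(43, '', 'F', 'Applied fix'), // same minute: ditto mark
];
console.log(lines.join('\n'));
```

The ditto mark keeps consecutive rows from the same minute visually grouped under one timestamp.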
src/services/context/index.ts (new file, 18 lines)
@@ -0,0 +1,18 @@
/**
 * Context Module - Public API
 *
 * Re-exports the main context generation functionality.
 */

export { generateContext } from './ContextBuilder.js';
export type { ContextInput, ContextConfig } from './types.js';

// Component exports for advanced usage
export { loadContextConfig } from './ContextConfigLoader.js';
export { calculateTokenEconomics, calculateObservationTokens } from './TokenCalculator.js';
export {
  queryObservations,
  querySummaries,
  buildTimeline,
  getPriorSessionMessages,
} from './ObservationCompiler.js';
src/services/context/sections/FooterRenderer.ts (new file, 42 lines)
@@ -0,0 +1,42 @@
/**
 * FooterRenderer - Renders the context footer sections
 *
 * Handles rendering of the previously section and token savings footer.
 */

import type { ContextConfig, TokenEconomics, PriorMessages } from '../types.js';
import { shouldShowContextEconomics } from '../TokenCalculator.js';
import * as Markdown from '../formatters/MarkdownFormatter.js';
import * as Color from '../formatters/ColorFormatter.js';

/**
 * Render the previously section (prior assistant message)
 */
export function renderPreviouslySection(
  priorMessages: PriorMessages,
  useColors: boolean
): string[] {
  if (useColors) {
    return Color.renderColorPreviouslySection(priorMessages);
  }
  return Markdown.renderMarkdownPreviouslySection(priorMessages);
}

/**
 * Render the footer with token savings info
 */
export function renderFooter(
  economics: TokenEconomics,
  config: ContextConfig,
  useColors: boolean
): string[] {
  // Only show footer if we have savings to display
  if (!shouldShowContextEconomics(config) || economics.totalDiscoveryTokens <= 0 || economics.savings <= 0) {
    return [];
  }

  if (useColors) {
    return Color.renderColorFooter(economics.totalDiscoveryTokens, economics.totalReadTokens);
  }
  return Markdown.renderMarkdownFooter(economics.totalDiscoveryTokens, economics.totalReadTokens);
}
src/services/context/sections/HeaderRenderer.ts (new file, 61 lines)
@@ -0,0 +1,61 @@
/**
 * HeaderRenderer - Renders the context header sections
 *
 * Handles rendering of header, legend, column key, context index, and economics.
 */

import type { ContextConfig, TokenEconomics } from '../types.js';
import { shouldShowContextEconomics } from '../TokenCalculator.js';
import * as Markdown from '../formatters/MarkdownFormatter.js';
import * as Color from '../formatters/ColorFormatter.js';

/**
 * Render the complete header section
 */
export function renderHeader(
  project: string,
  economics: TokenEconomics,
  config: ContextConfig,
  useColors: boolean
): string[] {
  const output: string[] = [];

  // Main header
  if (useColors) {
    output.push(...Color.renderColorHeader(project));
  } else {
    output.push(...Markdown.renderMarkdownHeader(project));
  }

  // Legend
  if (useColors) {
    output.push(...Color.renderColorLegend());
  } else {
    output.push(...Markdown.renderMarkdownLegend());
  }

  // Column key
  if (useColors) {
    output.push(...Color.renderColorColumnKey());
  } else {
    output.push(...Markdown.renderMarkdownColumnKey());
  }

  // Context index instructions
  if (useColors) {
    output.push(...Color.renderColorContextIndex());
  } else {
    output.push(...Markdown.renderMarkdownContextIndex());
  }

  // Context economics
  if (shouldShowContextEconomics(config)) {
    if (useColors) {
      output.push(...Color.renderColorContextEconomics(economics, config));
    } else {
      output.push(...Markdown.renderMarkdownContextEconomics(economics, config));
    }
  }

  return output;
}
src/services/context/sections/SummaryRenderer.ts (new file, 65 lines)
@@ -0,0 +1,65 @@
/**
 * SummaryRenderer - Renders the summary section at the end of context
 *
 * Handles rendering of the most recent session summary fields.
 */

import type { ContextConfig, Observation, SessionSummary } from '../types.js';
import { colors } from '../types.js';
import * as Markdown from '../formatters/MarkdownFormatter.js';
import * as Color from '../formatters/ColorFormatter.js';

/**
 * Check if summary should be displayed
 */
export function shouldShowSummary(
  config: ContextConfig,
  mostRecentSummary: SessionSummary | undefined,
  mostRecentObservation: Observation | undefined
): boolean {
  if (!config.showLastSummary || !mostRecentSummary) {
    return false;
  }

  const hasContent = !!(
    mostRecentSummary.investigated ||
    mostRecentSummary.learned ||
    mostRecentSummary.completed ||
    mostRecentSummary.next_steps
  );

  if (!hasContent) {
    return false;
  }

  // Only show if summary is more recent than observations
  if (mostRecentObservation && mostRecentSummary.created_at_epoch <= mostRecentObservation.created_at_epoch) {
    return false;
  }

  return true;
}

/**
 * Render summary fields
 */
export function renderSummaryFields(
  summary: SessionSummary,
  useColors: boolean
): string[] {
  const output: string[] = [];

  if (useColors) {
    output.push(...Color.renderColorSummaryField('Investigated', summary.investigated, colors.blue));
    output.push(...Color.renderColorSummaryField('Learned', summary.learned, colors.yellow));
    output.push(...Color.renderColorSummaryField('Completed', summary.completed, colors.green));
    output.push(...Color.renderColorSummaryField('Next Steps', summary.next_steps, colors.magenta));
  } else {
    output.push(...Markdown.renderMarkdownSummaryField('Investigated', summary.investigated));
    output.push(...Markdown.renderMarkdownSummaryField('Learned', summary.learned));
    output.push(...Markdown.renderMarkdownSummaryField('Completed', summary.completed));
    output.push(...Markdown.renderMarkdownSummaryField('Next Steps', summary.next_steps));
  }

  return output;
}
src/services/context/sections/TimelineRenderer.ts (new file, 170 lines)
@@ -0,0 +1,170 @@
/**
 * TimelineRenderer - Renders the chronological timeline of observations and summaries
 *
 * Handles day grouping, file grouping within days, and table rendering.
 */

import type {
  ContextConfig,
  Observation,
  TimelineItem,
  SummaryTimelineItem,
} from '../types.js';
import { formatTime, formatDate, formatDateTime, extractFirstFile, parseJsonArray } from '../../../shared/timeline-formatting.js';
import * as Markdown from '../formatters/MarkdownFormatter.js';
import * as Color from '../formatters/ColorFormatter.js';

/**
 * Group timeline items by day
 */
export function groupTimelineByDay(timeline: TimelineItem[]): Map<string, TimelineItem[]> {
  const itemsByDay = new Map<string, TimelineItem[]>();

  for (const item of timeline) {
    const itemDate = item.type === 'observation' ? item.data.created_at : item.data.displayTime;
    const day = formatDate(itemDate);
    if (!itemsByDay.has(day)) {
      itemsByDay.set(day, []);
    }
    itemsByDay.get(day)!.push(item);
  }

  // Sort days chronologically
  const sortedEntries = Array.from(itemsByDay.entries()).sort((a, b) => {
    const aDate = new Date(a[0]).getTime();
    const bDate = new Date(b[0]).getTime();
    return aDate - bDate;
  });

  return new Map(sortedEntries);
}

/**
 * Get detail field content for full observation display
 */
function getDetailField(obs: Observation, config: ContextConfig): string | null {
  if (config.fullObservationField === 'narrative') {
    return obs.narrative;
  }
  return obs.facts ? parseJsonArray(obs.facts).join('\n') : null;
}

/**
 * Render a single day's timeline items
 */
export function renderDayTimeline(
  day: string,
  dayItems: TimelineItem[],
  fullObservationIds: Set<number>,
  config: ContextConfig,
  cwd: string,
  useColors: boolean
): string[] {
  const output: string[] = [];

  // Day header
  if (useColors) {
    output.push(...Color.renderColorDayHeader(day));
  } else {
    output.push(...Markdown.renderMarkdownDayHeader(day));
  }

  let currentFile: string | null = null;
  let lastTime = '';
  let tableOpen = false;

  for (const item of dayItems) {
    if (item.type === 'summary') {
      // Close any open table before summary
      if (tableOpen) {
        output.push('');
        tableOpen = false;
        currentFile = null;
        lastTime = '';
      }

      const summary = item.data as SummaryTimelineItem;
      const formattedTime = formatDateTime(summary.displayTime);

      if (useColors) {
        output.push(...Color.renderColorSummaryItem(summary, formattedTime));
      } else {
        output.push(...Markdown.renderMarkdownSummaryItem(summary, formattedTime));
      }
    } else {
      const obs = item.data as Observation;
      const file = extractFirstFile(obs.files_modified, cwd);
      const time = formatTime(obs.created_at);
      const showTime = time !== lastTime;
      const timeDisplay = showTime ? time : '';
      lastTime = time;

      const shouldShowFull = fullObservationIds.has(obs.id);

      // Check if we need a new file section
      if (file !== currentFile) {
        if (tableOpen) {
          output.push('');
        }

        if (useColors) {
          output.push(...Color.renderColorFileHeader(file));
        } else {
          output.push(...Markdown.renderMarkdownFileHeader(file));
        }

        currentFile = file;
        tableOpen = true;
      }

      if (shouldShowFull) {
        const detailField = getDetailField(obs, config);

        if (useColors) {
          output.push(...Color.renderColorFullObservation(obs, time, showTime, detailField, config));
        } else {
          // Close table for full observation in markdown mode
          if (tableOpen) {
            output.push('');
            tableOpen = false;
          }
          output.push(...Markdown.renderMarkdownFullObservation(obs, timeDisplay, detailField, config));
          currentFile = null; // Reset to trigger new table header if needed
        }
      } else {
        if (useColors) {
          output.push(Color.renderColorTableRow(obs, time, showTime, config));
        } else {
          output.push(Markdown.renderMarkdownTableRow(obs, timeDisplay, config));
        }
      }
    }
  }

  // Close any remaining open table
  if (tableOpen) {
    output.push('');
  }

  return output;
}

/**
 * Render the complete timeline
 */
export function renderTimeline(
  timeline: TimelineItem[],
  fullObservationIds: Set<number>,
  config: ContextConfig,
  cwd: string,
  useColors: boolean
): string[] {
  const output: string[] = [];
  const itemsByDay = groupTimelineByDay(timeline);

  for (const [day, dayItems] of itemsByDay) {
    output.push(...renderDayTimeline(day, dayItems, fullObservationIds, config, cwd, useColors));
  }

  return output;
}
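The day-grouping step above can be sketched independently. This is a minimal, self-contained version that assumes ISO-like date strings (the real code formats dates through the shared timeline-formatting helpers, and `Item`/`groupByDay` here are hypothetical names):

```typescript
// Minimal sketch of grouping timeline items by calendar day,
// then re-ordering the map entries chronologically by day.
interface Item { createdAt: string; label: string; }

function groupByDay(items: Item[]): Map<string, Item[]> {
  const byDay = new Map<string, Item[]>();
  for (const item of items) {
    const day = item.createdAt.slice(0, 10); // YYYY-MM-DD
    if (!byDay.has(day)) byDay.set(day, []);
    byDay.get(day)!.push(item);
  }
  // Map preserves insertion order, so re-insert sorted entries
  return new Map([...byDay.entries()].sort(
    (a, b) => new Date(a[0]).getTime() - new Date(b[0]).getTime()
  ));
}

const grouped = groupByDay([
  { createdAt: '2025-01-02T09:00:00Z', label: 'b' },
  { createdAt: '2025-01-01T10:00:00Z', label: 'a' },
  { createdAt: '2025-01-02T11:00:00Z', label: 'c' },
]);
console.log([...grouped.keys()]); // → ['2025-01-01', '2025-01-02']
```

Rebuilding the Map from sorted entries is what guarantees iteration in day order, since JavaScript Maps iterate in insertion order.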
src/services/context/types.ts (new file, 131 lines)
@@ -0,0 +1,131 @@
/**
 * Context Types - Shared types for context generation module
 */

/**
 * Input parameters for context generation
 */
export interface ContextInput {
  session_id?: string;
  transcript_path?: string;
  cwd?: string;
  hook_event_name?: string;
  source?: "startup" | "resume" | "clear" | "compact";
  [key: string]: any;
}

/**
 * Configuration for context generation
 */
export interface ContextConfig {
  // Display counts
  totalObservationCount: number;
  fullObservationCount: number;
  sessionCount: number;

  // Token display toggles
  showReadTokens: boolean;
  showWorkTokens: boolean;
  showSavingsAmount: boolean;
  showSavingsPercent: boolean;

  // Filters
  observationTypes: Set<string>;
  observationConcepts: Set<string>;

  // Display options
  fullObservationField: 'narrative' | 'facts';
  showLastSummary: boolean;
  showLastMessage: boolean;
}

/**
 * Observation record from database
 */
export interface Observation {
  id: number;
  memory_session_id: string;
  type: string;
  title: string | null;
  subtitle: string | null;
  narrative: string | null;
  facts: string | null;
  concepts: string | null;
  files_read: string | null;
  files_modified: string | null;
  discovery_tokens: number | null;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Session summary record from database
 */
export interface SessionSummary {
  id: number;
  memory_session_id: string;
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Summary with timeline display info
 */
export interface SummaryTimelineItem extends SessionSummary {
  displayEpoch: number;
  displayTime: string;
  shouldShowLink: boolean;
}

/**
 * Timeline item - either observation or summary
 */
export type TimelineItem =
  | { type: 'observation'; data: Observation }
  | { type: 'summary'; data: SummaryTimelineItem };

/**
 * Token economics data
 */
export interface TokenEconomics {
  totalObservations: number;
  totalReadTokens: number;
  totalDiscoveryTokens: number;
  savings: number;
  savingsPercent: number;
}

/**
 * Prior messages from transcript
 */
export interface PriorMessages {
  userMessage: string;
  assistantMessage: string;
}

/**
 * ANSI color codes for terminal output
 */
export const colors = {
  reset: '\x1b[0m',
  bright: '\x1b[1m',
  dim: '\x1b[2m',
  cyan: '\x1b[36m',
  green: '\x1b[32m',
  yellow: '\x1b[33m',
  blue: '\x1b[34m',
  magenta: '\x1b[35m',
  gray: '\x1b[90m',
  red: '\x1b[31m',
};

/**
 * Configuration constants
 */
export const CHARS_PER_TOKEN_ESTIMATE = 4;
export const SUMMARY_LOOKAHEAD = 1;
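`CHARS_PER_TOKEN_ESTIMATE = 4` implies the module approximates token counts from character length. A plausible sketch of such an estimator (a hypothetical helper — the diff does not show the actual call site):

```typescript
const CHARS_PER_TOKEN_ESTIMATE = 4;

// Rough token estimate from text length, rounding up so short
// strings still count as at least one token.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / CHARS_PER_TOKEN_ESTIMATE);
}

console.log(estimateTokens('a'.repeat(100))); // → 25
```

Four characters per token is a common rule of thumb for English text; it is an approximation, not a tokenizer.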
src/services/infrastructure/GracefulShutdown.ts (new file, 115 lines)
@@ -0,0 +1,115 @@
/**
 * GracefulShutdown - Cleanup utilities for graceful exit
 *
 * Extracted from worker-service.ts to provide centralized shutdown coordination.
 * Handles:
 * - HTTP server closure (with Windows-specific delays)
 * - Session manager shutdown coordination
 * - Child process cleanup (Windows zombie port fix)
 */

import http from 'http';
import { logger } from '../../utils/logger.js';
import {
  getChildProcesses,
  forceKillProcess,
  waitForProcessesExit,
  removePidFile
} from './ProcessManager.js';

export interface ShutdownableService {
  shutdownAll(): Promise<void>;
}

export interface CloseableClient {
  close(): Promise<void>;
}

export interface CloseableDatabase {
  close(): Promise<void>;
}

/**
 * Configuration for graceful shutdown
 */
export interface GracefulShutdownConfig {
  server: http.Server | null;
  sessionManager: ShutdownableService;
  mcpClient?: CloseableClient;
  dbManager?: CloseableDatabase;
}

/**
 * Perform graceful shutdown of all services
 *
 * IMPORTANT: On Windows, we must kill all child processes before exiting
 * to prevent zombie ports. The socket handle can be inherited by children,
 * and if not properly closed, the port stays bound after process death.
 */
export async function performGracefulShutdown(config: GracefulShutdownConfig): Promise<void> {
  logger.info('SYSTEM', 'Shutdown initiated');

  // Clean up PID file on shutdown
  removePidFile();

  // STEP 1: Enumerate all child processes BEFORE we start closing things
  const childPids = await getChildProcesses(process.pid);
  logger.info('SYSTEM', 'Found child processes', { count: childPids.length, pids: childPids });

  // STEP 2: Close HTTP server first
  if (config.server) {
    await closeHttpServer(config.server);
    logger.info('SYSTEM', 'HTTP server closed');
  }

  // STEP 3: Shutdown active sessions
  await config.sessionManager.shutdownAll();

  // STEP 4: Close MCP client connection (signals child to exit gracefully)
  if (config.mcpClient) {
    await config.mcpClient.close();
    logger.info('SYSTEM', 'MCP client closed');
  }

  // STEP 5: Close database connection (includes ChromaSync cleanup)
  if (config.dbManager) {
    await config.dbManager.close();
  }

  // STEP 6: Force kill any remaining child processes (Windows zombie port fix)
  if (childPids.length > 0) {
    logger.info('SYSTEM', 'Force killing remaining children');
    for (const pid of childPids) {
      await forceKillProcess(pid);
    }
    // Wait for children to fully exit
    await waitForProcessesExit(childPids, 5000);
  }

  logger.info('SYSTEM', 'Worker shutdown complete');
}

/**
 * Close HTTP server with Windows-specific delays
 * Windows needs extra time to release sockets properly
 */
async function closeHttpServer(server: http.Server): Promise<void> {
  // Close all active connections
  server.closeAllConnections();

  // Give Windows time to close connections before closing server (prevents zombie ports)
  if (process.platform === 'win32') {
    await new Promise(r => setTimeout(r, 500));
  }

  // Close the server
  await new Promise<void>((resolve, reject) => {
    server.close(err => err ? reject(err) : resolve());
  });

  // Extra delay on Windows to ensure port is fully released
  if (process.platform === 'win32') {
    await new Promise(r => setTimeout(r, 500));
    logger.info('SYSTEM', 'Waited for Windows port cleanup');
  }
}
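The promisified `server.close` inside `closeHttpServer` is a reusable pattern worth isolating: it lets callback-style shutdown steps be awaited in sequence. A self-contained sketch (the `closeServer`/`demo` names are illustrative, not from the diff):

```typescript
import http from 'http';

// Wrap the callback-style server.close in a Promise so shutdown
// steps can be awaited one after another.
function closeServer(server: http.Server): Promise<void> {
  return new Promise((resolve, reject) => {
    server.close(err => (err ? reject(err) : resolve()));
  });
}

async function demo(): Promise<string> {
  const server = http.createServer((_req, res) => res.end('ok'));
  // Listen on an ephemeral port, then shut down cleanly
  await new Promise<void>(resolve => server.listen(0, resolve));
  await closeServer(server);
  return 'closed';
}

demo().then(result => console.log(result));
```

Node rejects `server.close` with `ERR_SERVER_NOT_RUNNING` if the server was never listening, which is why the real code guards with `if (config.server)` before calling its close helper.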
src/services/infrastructure/HealthMonitor.ts (new file, 143 lines)
@@ -0,0 +1,143 @@
/**
 * HealthMonitor - Port monitoring, health checks, and version checking
 *
 * Extracted from worker-service.ts monolith to provide centralized health monitoring.
 * Handles:
 * - Port availability checking
 * - Worker health/readiness polling
 * - Version mismatch detection (critical for plugin updates)
 * - HTTP-based shutdown requests
 */

import path from 'path';
import { homedir } from 'os';
import { readFileSync } from 'fs';
import { logger } from '../../utils/logger.js';

/**
 * Check if a port is in use by querying the health endpoint
 */
export async function isPortInUse(port: number): Promise<boolean> {
  try {
    // Note: Removed AbortSignal.timeout to avoid Windows Bun cleanup issue (libuv assertion)
    const response = await fetch(`http://127.0.0.1:${port}/api/health`);
    return response.ok;
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Health check polls every 500ms, logging would flood
    return false;
  }
}

/**
 * Wait for the worker to become fully ready (passes readiness check)
 * @param port Worker port to check
 * @param timeoutMs Maximum time to wait in milliseconds
 * @returns true if worker became ready, false if timeout
 */
export async function waitForHealth(port: number, timeoutMs: number = 30000): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    try {
      // Note: Removed AbortSignal.timeout to avoid Windows Bun cleanup issue (libuv assertion)
      const response = await fetch(`http://127.0.0.1:${port}/api/readiness`);
      if (response.ok) return true;
    } catch (error) {
      // [ANTI-PATTERN IGNORED]: Retry loop - expected failures during startup, will retry
      logger.debug('SYSTEM', 'Service not ready yet, will retry', { port }, error as Error);
    }
    await new Promise(r => setTimeout(r, 500));
  }
  return false;
}

/**
 * Wait for a port to become free (no longer responding to health checks)
 * Used after shutdown to confirm the port is available for restart
 */
export async function waitForPortFree(port: number, timeoutMs: number = 10000): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (!(await isPortInUse(port))) return true;
    await new Promise(r => setTimeout(r, 500));
  }
  return false;
}

/**
 * Send HTTP shutdown request to a running worker
 * @param port Worker port
 * @returns true if shutdown request was acknowledged, false otherwise
 */
export async function httpShutdown(port: number): Promise<boolean> {
  try {
    // Note: Removed AbortSignal.timeout to avoid Windows Bun cleanup issue (libuv assertion)
    const response = await fetch(`http://127.0.0.1:${port}/api/admin/shutdown`, {
      method: 'POST'
    });
    if (!response.ok) {
      logger.warn('SYSTEM', 'Shutdown request returned error', { port, status: response.status });
      return false;
    }
    return true;
  } catch (error) {
    // Connection refused is expected if worker already stopped
    if (error instanceof Error && error.message?.includes('ECONNREFUSED')) {
      logger.debug('SYSTEM', 'Worker already stopped', { port }, error);
      return false;
    }
    // Unexpected error - log full details
    logger.warn('SYSTEM', 'Shutdown request failed unexpectedly', { port }, error as Error);
    return false;
  }
}

/**
 * Get the plugin version from the installed marketplace package.json
 * This is the "expected" version that should be running
 */
export function getInstalledPluginVersion(): string {
  const marketplaceRoot = path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack');
  const packageJsonPath = path.join(marketplaceRoot, 'package.json');
  const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf-8'));
  return packageJson.version;
}

/**
 * Get the running worker's version via API
 * This is the "actual" version currently running
 */
export async function getRunningWorkerVersion(port: number): Promise<string | null> {
  try {
    const response = await fetch(`http://127.0.0.1:${port}/api/version`);
    if (!response.ok) return null;
    const data = await response.json() as { version: string };
    return data.version;
  } catch {
    // Expected: worker not running or version endpoint unavailable
    logger.debug('SYSTEM', 'Could not fetch worker version', { port });
    return null;
  }
}

export interface VersionCheckResult {
  matches: boolean;
  pluginVersion: string;
  workerVersion: string | null;
}

/**
 * Check if worker version matches plugin version
 * Critical for detecting when plugin is updated but worker is still running old code
 * Reports a match when versions agree, or when the worker version cannot be
 * determined (assume match for graceful degradation)
 */
export async function checkVersionMatch(port: number): Promise<VersionCheckResult> {
  const pluginVersion = getInstalledPluginVersion();
  const workerVersion = await getRunningWorkerVersion(port);

  // If we can't get worker version, assume it matches (graceful degradation)
  if (!workerVersion) {
    return { matches: true, pluginVersion, workerVersion };
  }

  return { matches: pluginVersion === workerVersion, pluginVersion, workerVersion };
}
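`waitForHealth` and `waitForPortFree` share the same poll-until-timeout shape. A generic sketch of that pattern (`pollUntil` is a hypothetical helper, not part of the diff):

```typescript
// Poll an async predicate every intervalMs until it returns true
// or timeoutMs elapses - the shape shared by waitForHealth/waitForPortFree.
async function pollUntil(
  check: () => Promise<boolean>,
  timeoutMs: number,
  intervalMs = 500
): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (await check()) return true;
    await new Promise(r => setTimeout(r, intervalMs));
  }
  return false;
}

// Example: the predicate succeeds on the third poll.
let attempts = 0;
const ok = await pollUntil(async () => ++attempts >= 3, 2000, 10);
console.log(ok, attempts); // → true 3
```

Factoring the loop this way keeps the timeout/interval policy in one place, while each caller supplies only its predicate (an HTTP health probe, a port check, and so on).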
src/services/infrastructure/ProcessManager.ts (new file, 306 lines)
@@ -0,0 +1,306 @@
|
||||
/**
|
||||
* ProcessManager - PID files, signal handlers, and child process lifecycle management
|
||||
*
|
||||
* Extracted from worker-service.ts monolith to provide centralized process management.
|
||||
* Handles:
|
||||
* - PID file management for daemon coordination
|
||||
* - Signal handler registration for graceful shutdown
|
||||
* - Child process enumeration and cleanup (especially for Windows zombie port fix)
|
||||
*/
|
||||
|
||||
import path from 'path';
|
||||
import { homedir } from 'os';
|
||||
import { existsSync, writeFileSync, readFileSync, unlinkSync, mkdirSync } from 'fs';
|
||||
import { exec, execSync, spawn } from 'child_process';
|
||||
import { promisify } from 'util';
|
||||
import { logger } from '../../utils/logger.js';
|
||||
|
||||
const execAsync = promisify(exec);
|
||||
|
||||
// Standard paths for PID file management
|
||||
const DATA_DIR = path.join(homedir(), '.claude-mem');
|
||||
const PID_FILE = path.join(DATA_DIR, 'worker.pid');
|
||||
|
||||
export interface PidInfo {
|
||||
pid: number;
|
||||
port: number;
|
||||
startedAt: string;
|
||||
}
|
||||
|
||||
/**
|
||||
* Write PID info to the standard PID file location
|
||||
*/
|
||||
export function writePidFile(info: PidInfo): void {
|
||||
mkdirSync(DATA_DIR, { recursive: true });
|
||||
writeFileSync(PID_FILE, JSON.stringify(info, null, 2));
|
||||
}
|
||||
|
||||
/**
|
||||
* Read PID info from the standard PID file location
|
||||
* Returns null if file doesn't exist or is corrupted
|
||||
*/
|
||||
export function readPidFile(): PidInfo | null {
|
||||
if (!existsSync(PID_FILE)) return null;
|
||||
|
||||
try {
|
||||
return JSON.parse(readFileSync(PID_FILE, 'utf-8'));
|
||||
} catch (error) {
|
||||
logger.warn('SYSTEM', 'Failed to parse PID file', { path: PID_FILE }, error as Error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Remove the PID file (called during shutdown)
|
||||
*/
|
||||
export function removePidFile(): void {
|
||||
if (!existsSync(PID_FILE)) return;
|
||||
|
||||
try {
|
||||
unlinkSync(PID_FILE);
|
||||
} catch (error) {
|
||||
// [ANTI-PATTERN IGNORED]: Cleanup function - PID file removal failure is non-critical
|
||||
logger.warn('SYSTEM', 'Failed to remove PID file', { path: PID_FILE }, error as Error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get platform-adjusted timeout (Windows socket cleanup is slower)
|
||||
*/
|
||||
export function getPlatformTimeout(baseMs: number): number {
|
||||
const WINDOWS_MULTIPLIER = 2.0;
|
||||
return process.platform === 'win32' ? Math.round(baseMs * WINDOWS_MULTIPLIER) : baseMs;
|
||||
}
|
||||

/**
 * Get all child process PIDs (Windows-specific).
 * Used during cleanup to prevent zombie ports when the parent exits.
 */
export async function getChildProcesses(parentPid: number): Promise<number[]> {
  if (process.platform !== 'win32') {
    return [];
  }

  // SECURITY: Validate PID is a positive integer to prevent command injection
  if (!Number.isInteger(parentPid) || parentPid <= 0) {
    logger.warn('SYSTEM', 'Invalid parent PID for child process enumeration', { parentPid });
    return [];
  }

  try {
    const cmd = `powershell -Command "Get-CimInstance Win32_Process | Where-Object { $_.ParentProcessId -eq ${parentPid} } | Select-Object -ExpandProperty ProcessId"`;
    const { stdout } = await execAsync(cmd, { timeout: 60000 });
    return stdout
      .trim()
      .split('\n')
      .map(s => parseInt(s.trim(), 10))
      .filter(n => !isNaN(n) && Number.isInteger(n) && n > 0);
  } catch (error) {
    // Shutdown cleanup - failure is non-critical, continue without child process cleanup
    logger.warn('SYSTEM', 'Failed to enumerate child processes', { parentPid }, error as Error);
    return [];
  }
}
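The stdout-parsing step above can be isolated as a pure function, which makes the blank-line and malformed-line handling easy to verify (`parsePidList` is an illustrative name, not an export of the module):

```typescript
// Hypothetical extraction of the PowerShell stdout parsing: keep only
// positive integer PIDs, dropping blanks and malformed lines.
function parsePidList(stdout: string): number[] {
  return stdout
    .trim()
    .split('\n')
    .map(s => parseInt(s.trim(), 10))
    .filter(n => !isNaN(n) && Number.isInteger(n) && n > 0);
}

console.log(parsePidList('123\n456\n\nabc\n-7\n')); // → [ 123, 456 ]
```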
/**
 * Force kill a process by PID.
 * Windows: uses taskkill /F /T to kill the process tree.
 * Unix: uses SIGKILL.
 */
export async function forceKillProcess(pid: number): Promise<void> {
  // SECURITY: Validate PID is a positive integer to prevent command injection
  if (!Number.isInteger(pid) || pid <= 0) {
    logger.warn('SYSTEM', 'Invalid PID for force kill', { pid });
    return;
  }

  try {
    if (process.platform === 'win32') {
      // /T kills the entire process tree, /F forces termination
      await execAsync(`taskkill /PID ${pid} /T /F`, { timeout: 60000 });
    } else {
      process.kill(pid, 'SIGKILL');
    }
    logger.info('SYSTEM', 'Killed process', { pid });
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Shutdown cleanup - process already exited, continue
    logger.debug('SYSTEM', 'Process already exited during force kill', { pid }, error as Error);
  }
}

/**
 * Wait for processes to fully exit
 */
export async function waitForProcessesExit(pids: number[], timeoutMs: number): Promise<void> {
  const start = Date.now();

  while (Date.now() - start < timeoutMs) {
    const stillAlive = pids.filter(pid => {
      try {
        process.kill(pid, 0);
        return true;
      } catch {
        // [ANTI-PATTERN IGNORED]: Tight loop checking 100s of PIDs every 100ms during cleanup
        return false;
      }
    });

    if (stillAlive.length === 0) {
      logger.info('SYSTEM', 'All child processes exited');
      return;
    }

    logger.debug('SYSTEM', 'Waiting for processes to exit', { stillAlive });
    await new Promise(r => setTimeout(r, 100));
  }

  logger.warn('SYSTEM', 'Timeout waiting for child processes to exit');
}
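The loop above is an instance of a general poll-until-timeout pattern: check a condition, sleep a fixed interval, and give up after a deadline. A generic sketch of that pattern (`pollUntil` is a hypothetical name, not part of the module; the real code additionally probes liveness with `process.kill(pid, 0)`):

```typescript
// Poll a predicate until it returns true or the timeout elapses.
// Resolves true on success, false on timeout.
async function pollUntil(
  done: () => boolean,
  timeoutMs: number,
  intervalMs = 100
): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (done()) return true;
    await new Promise(r => setTimeout(r, intervalMs));
  }
  return false;
}

// Simulate processes that all "exit" after two checks.
let checks = 0;
pollUntil(() => ++checks >= 2, 1000, 10).then(ok => console.log(ok, checks)); // → true 2
```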
/**
 * Clean up orphaned chroma-mcp processes from previous worker sessions.
 * Prevents process accumulation and memory leaks.
 */
export async function cleanupOrphanedProcesses(): Promise<void> {
  const isWindows = process.platform === 'win32';
  const pids: number[] = [];

  try {
    if (isWindows) {
      // Windows: Use PowerShell Get-CimInstance to find chroma-mcp processes
      const cmd = `powershell -Command "Get-CimInstance Win32_Process | Where-Object { $_.Name -like '*python*' -and $_.CommandLine -like '*chroma-mcp*' } | Select-Object -ExpandProperty ProcessId"`;
      const { stdout } = await execAsync(cmd, { timeout: 60000 });

      if (!stdout.trim()) {
        logger.debug('SYSTEM', 'No orphaned chroma-mcp processes found (Windows)');
        return;
      }

      const pidStrings = stdout.trim().split('\n');
      for (const pidStr of pidStrings) {
        const pid = parseInt(pidStr.trim(), 10);
        // SECURITY: Validate PID is a positive integer before adding it to the list
        if (!isNaN(pid) && Number.isInteger(pid) && pid > 0) {
          pids.push(pid);
        }
      }
    } else {
      // Unix: Use ps aux | grep
      const { stdout } = await execAsync('ps aux | grep "chroma-mcp" | grep -v grep || true');

      if (!stdout.trim()) {
        logger.debug('SYSTEM', 'No orphaned chroma-mcp processes found (Unix)');
        return;
      }

      const lines = stdout.trim().split('\n');
      for (const line of lines) {
        const parts = line.trim().split(/\s+/);
        if (parts.length > 1) {
          const pid = parseInt(parts[1], 10);
          // SECURITY: Validate PID is a positive integer before adding it to the list
          if (!isNaN(pid) && Number.isInteger(pid) && pid > 0) {
            pids.push(pid);
          }
        }
      }
    }
  } catch (error) {
    // Orphan cleanup is non-critical - log and continue
    logger.warn('SYSTEM', 'Failed to enumerate orphaned processes', {}, error as Error);
    return;
  }

  if (pids.length === 0) {
    return;
  }

  logger.info('SYSTEM', 'Cleaning up orphaned chroma-mcp processes', {
    platform: isWindows ? 'Windows' : 'Unix',
    count: pids.length,
    pids
  });

  // Kill all found processes
  if (isWindows) {
    for (const pid of pids) {
      // SECURITY: Double-check PID validation before using it in the taskkill command
      if (!Number.isInteger(pid) || pid <= 0) {
        logger.warn('SYSTEM', 'Skipping invalid PID', { pid });
        continue;
      }
      try {
        execSync(`taskkill /PID ${pid} /T /F`, { timeout: 60000, stdio: 'ignore' });
      } catch (error) {
        // [ANTI-PATTERN IGNORED]: Cleanup loop - process may have exited, continue to next PID
        logger.debug('SYSTEM', 'Failed to kill process, may have already exited', { pid }, error as Error);
      }
    }
  } else {
    for (const pid of pids) {
      try {
        process.kill(pid, 'SIGKILL');
      } catch (error) {
        // [ANTI-PATTERN IGNORED]: Cleanup loop - process may have exited, continue to next PID
        logger.debug('SYSTEM', 'Process already exited', { pid }, error as Error);
      }
    }
  }

  logger.info('SYSTEM', 'Orphaned processes cleaned up', { count: pids.length });
}
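The Unix branch pulls PIDs out of `ps aux` output by taking the second whitespace-separated field of each line. That extraction can be sketched as a pure function over sample lines (`pidsFromPsOutput` is a hypothetical name; the sample output is illustrative):

```typescript
// Extract the PID column (second field) from `ps aux`-style output,
// keeping only positive integers.
function pidsFromPsOutput(stdout: string): number[] {
  const pids: number[] = [];
  for (const line of stdout.trim().split('\n')) {
    const parts = line.trim().split(/\s+/);
    if (parts.length > 1) {
      const pid = parseInt(parts[1], 10);
      if (!isNaN(pid) && Number.isInteger(pid) && pid > 0) pids.push(pid);
    }
  }
  return pids;
}

const sample = 'user  8912  0.1  python chroma-mcp\nuser  9034  0.0  python chroma-mcp';
console.log(pidsFromPsOutput(sample)); // → [ 8912, 9034 ]
```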
/**
 * Spawn a detached daemon process.
 * Returns the child PID, or undefined if the spawn failed.
 */
export function spawnDaemon(
  scriptPath: string,
  port: number,
  extraEnv: Record<string, string> = {}
): number | undefined {
  const child = spawn(process.execPath, [scriptPath, '--daemon'], {
    detached: true,
    stdio: 'ignore',
    windowsHide: true,
    env: {
      ...process.env,
      CLAUDE_MEM_WORKER_PORT: String(port),
      ...extraEnv
    }
  });

  if (child.pid === undefined) {
    return undefined;
  }

  child.unref();
  return child.pid;
}
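The spread order in the `env` object matters: the parent environment is inherited first, the worker port is set on top of it, and `extraEnv` wins last. A pure sketch of that merge (`buildDaemonEnv` is a hypothetical helper; the port value is illustrative):

```typescript
// Inherit the base env, set the worker port, let explicit extras override.
function buildDaemonEnv(
  base: Record<string, string | undefined>,
  port: number,
  extraEnv: Record<string, string> = {}
): Record<string, string | undefined> {
  return {
    ...base,
    CLAUDE_MEM_WORKER_PORT: String(port),
    ...extraEnv
  };
}

const env = buildDaemonEnv({ PATH: '/usr/bin' }, 37777, { DEBUG: '1' });
console.log(env.CLAUDE_MEM_WORKER_PORT, env.DEBUG, env.PATH); // → 37777 1 /usr/bin
```

Because `extraEnv` is spread last, a caller can even override `CLAUDE_MEM_WORKER_PORT` deliberately; swapping the spread order would reverse that precedence.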
/**
 * Signal handler factory for graceful shutdown.
 * Returns a handler function that can be passed to process.on('SIGTERM') etc.
 */
export function createSignalHandler(
  shutdownFn: () => Promise<void>,
  isShuttingDownRef: { value: boolean }
): (signal: string) => Promise<void> {
  return async (signal: string) => {
    if (isShuttingDownRef.value) {
      logger.warn('SYSTEM', `Received ${signal} but shutdown already in progress`);
      return;
    }
    isShuttingDownRef.value = true;

    logger.info('SYSTEM', `Received ${signal}, shutting down...`);
    try {
      await shutdownFn();
      process.exit(0);
    } catch (error) {
      // Top-level signal handler - log any shutdown error and exit
      logger.error('SYSTEM', 'Error during shutdown', {}, error as Error);
      process.exit(1);
    }
  };
}
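The `isShuttingDownRef` latch is set synchronously before the first `await`, so a second signal arriving while shutdown is in flight becomes a no-op. A sketch of that re-entrancy guard with `process.exit` replaced by an injectable `exitFn` so the behavior is observable (all names here are hypothetical stand-ins, not the module's API):

```typescript
// Simplified clone of the signal-handler factory with a mockable exit.
function makeHandler(
  shutdownFn: () => Promise<void>,
  ref: { value: boolean },
  exitFn: (code: number) => void
): (signal: string) => Promise<void> {
  return async (_signal: string) => {
    if (ref.value) return;   // shutdown already in progress: ignore
    ref.value = true;        // latch set before the first await
    try {
      await shutdownFn();
      exitFn(0);
    } catch {
      exitFn(1);
    }
  };
}

let shutdowns = 0;
const exits: number[] = [];
const handler = makeHandler(async () => { shutdowns++; }, { value: false }, c => exits.push(c));
Promise.all([handler('SIGTERM'), handler('SIGINT')]).then(() =>
  console.log(shutdowns, exits) // → 1 [ 0 ]
);
```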
7  src/services/infrastructure/index.ts  Normal file
@@ -0,0 +1,7 @@
/**
 * Infrastructure module - Process management, health monitoring, and shutdown utilities
 */

export * from './ProcessManager.js';
export * from './HealthMonitor.js';
export * from './GracefulShutdown.js';
668  src/services/integrations/CursorHooksInstaller.ts  Normal file
@@ -0,0 +1,668 @@
/**
 * CursorHooksInstaller - Cursor IDE integration for claude-mem
 *
 * Extracted from the worker-service.ts monolith to provide centralized Cursor integration.
 * Handles:
 * - Cursor hooks installation/uninstallation
 * - MCP server configuration
 * - Context file generation
 * - Project registry management
 */

import path from 'path';
import { homedir } from 'os';
import { existsSync, readFileSync, writeFileSync, unlinkSync, mkdirSync } from 'fs';
import { exec } from 'child_process';
import { promisify } from 'util';
import { logger } from '../../utils/logger.js';
import { getWorkerPort } from '../../shared/worker-utils.js';
import {
  readCursorRegistry as readCursorRegistryFromFile,
  writeCursorRegistry as writeCursorRegistryToFile,
  writeContextFile,
  type CursorProjectRegistry
} from '../../utils/cursor-utils.js';
import type { CursorInstallTarget, CursorHooksJson, CursorMcpConfig, Platform } from './types.js';

const execAsync = promisify(exec);

// Standard paths
const DATA_DIR = path.join(homedir(), '.claude-mem');
const CURSOR_REGISTRY_FILE = path.join(DATA_DIR, 'cursor-projects.json');

// ============================================================================
// Platform Detection
// ============================================================================

/**
 * Detect platform for script selection
 */
export function detectPlatform(): Platform {
  return process.platform === 'win32' ? 'windows' : 'unix';
}

/**
 * Get script extension based on platform
 */
export function getScriptExtension(): string {
  return detectPlatform() === 'windows' ? '.ps1' : '.sh';
}

// ============================================================================
// Project Registry
// ============================================================================

/**
 * Read the Cursor project registry
 */
export function readCursorRegistry(): CursorProjectRegistry {
  return readCursorRegistryFromFile(CURSOR_REGISTRY_FILE);
}

/**
 * Write the Cursor project registry
 */
export function writeCursorRegistry(registry: CursorProjectRegistry): void {
  writeCursorRegistryToFile(CURSOR_REGISTRY_FILE, registry);
}

/**
 * Register a project for auto-context updates
 */
export function registerCursorProject(projectName: string, workspacePath: string): void {
  const registry = readCursorRegistry();
  registry[projectName] = {
    workspacePath,
    installedAt: new Date().toISOString()
  };
  writeCursorRegistry(registry);
  logger.info('CURSOR', 'Registered project for auto-context updates', { projectName, workspacePath });
}

/**
 * Unregister a project from auto-context updates
 */
export function unregisterCursorProject(projectName: string): void {
  const registry = readCursorRegistry();
  if (registry[projectName]) {
    delete registry[projectName];
    writeCursorRegistry(registry);
    logger.info('CURSOR', 'Unregistered project', { projectName });
  }
}
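The registry is a plain name-to-entry map persisted as JSON; register and unregister just add or delete a key. An in-memory sketch of that cycle with pure helpers (hypothetical stand-ins for the file-backed functions above):

```typescript
interface RegistryEntry { workspacePath: string; installedAt: string; }
type Registry = Record<string, RegistryEntry>;

// Return a copy of the registry with the project added.
function withProject(reg: Registry, name: string, workspacePath: string): Registry {
  return { ...reg, [name]: { workspacePath, installedAt: new Date().toISOString() } };
}

// Return a copy of the registry with the project removed.
function withoutProject(reg: Registry, name: string): Registry {
  const { [name]: _removed, ...rest } = reg;
  return rest;
}

let reg: Registry = {};
reg = withProject(reg, 'my-app', '/home/me/my-app');
console.log('my-app' in reg); // → true
reg = withoutProject(reg, 'my-app');
console.log('my-app' in reg); // → false
```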
/**
 * Update Cursor context files for all registered projects matching this project name.
 * Called by SDK agents after saving a summary.
 */
export async function updateCursorContextForProject(projectName: string, port: number): Promise<void> {
  const registry = readCursorRegistry();
  const entry = registry[projectName];

  if (!entry) return; // Project doesn't have Cursor hooks installed

  try {
    // Fetch fresh context from the worker
    const response = await fetch(
      `http://127.0.0.1:${port}/api/context/inject?project=${encodeURIComponent(projectName)}`
    );

    if (!response.ok) return;

    const context = await response.text();
    if (!context || !context.trim()) return;

    // Write to the project's Cursor rules file using the shared utility
    writeContextFile(entry.workspacePath, context);
    logger.debug('CURSOR', 'Updated context file', { projectName, workspacePath: entry.workspacePath });
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Background context update - failure is non-critical, user workflow continues
    logger.warn('CURSOR', 'Failed to update context file', { projectName }, error as Error);
  }
}

// ============================================================================
// Path Finding
// ============================================================================

/**
 * Find the cursor-hooks directory.
 * Searches in order: marketplace install, source repo.
 * Checks for both bash (common.sh) and PowerShell (common.ps1) scripts.
 */
export function findCursorHooksDir(): string | null {
  const possiblePaths = [
    // Marketplace install location
    path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack', 'cursor-hooks'),
    // Development/source location (relative to built worker-service.cjs in plugin/scripts/)
    path.join(path.dirname(__filename), '..', '..', 'cursor-hooks'),
    // Alternative dev location
    path.join(process.cwd(), 'cursor-hooks'),
  ];

  for (const p of possiblePaths) {
    // Check for either the bash or the PowerShell common script
    if (existsSync(path.join(p, 'common.sh')) || existsSync(path.join(p, 'common.ps1'))) {
      return p;
    }
  }
  return null;
}

/**
 * Find the MCP server script path.
 * Searches in order: marketplace install, source repo.
 */
export function findMcpServerPath(): string | null {
  const possiblePaths = [
    // Marketplace install location
    path.join(homedir(), '.claude', 'plugins', 'marketplaces', 'thedotmack', 'plugin', 'scripts', 'mcp-server.cjs'),
    // Development/source location (relative to built worker-service.cjs in plugin/scripts/)
    path.join(path.dirname(__filename), 'mcp-server.cjs'),
    // Alternative dev location
    path.join(process.cwd(), 'plugin', 'scripts', 'mcp-server.cjs'),
  ];

  for (const p of possiblePaths) {
    if (existsSync(p)) {
      return p;
    }
  }
  return null;
}
/**
 * Get the target directory for Cursor hooks based on the install target
 */
export function getTargetDir(target: CursorInstallTarget): string | null {
  switch (target) {
    case 'project':
      return path.join(process.cwd(), '.cursor');
    case 'user':
      return path.join(homedir(), '.cursor');
    case 'enterprise':
      if (process.platform === 'darwin') {
        return '/Library/Application Support/Cursor';
      } else if (process.platform === 'linux') {
        return '/etc/cursor';
      } else if (process.platform === 'win32') {
        return path.join(process.env.ProgramData || 'C:\\ProgramData', 'Cursor');
      }
      return null;
    default:
      return null;
  }
}
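The enterprise branch is the only platform-dependent part of the mapping. Parameterizing the platform makes that branch testable without touching `process.platform` (`enterpriseDir` is a hypothetical refactor, and the Windows path here hardcodes the `ProgramData` default for simplicity):

```typescript
// Enterprise install directory per platform; unknown platforms get null.
function enterpriseDir(platform: string): string | null {
  if (platform === 'darwin') return '/Library/Application Support/Cursor';
  if (platform === 'linux') return '/etc/cursor';
  if (platform === 'win32') return 'C:\\ProgramData\\Cursor';   // real code honors %ProgramData%
  return null;
}

console.log(enterpriseDir('linux'));   // → /etc/cursor
console.log(enterpriseDir('freebsd')); // → null
```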

// ============================================================================
// MCP Configuration
// ============================================================================

/**
 * Configure the MCP server in Cursor's mcp.json
 * @param target 'project' or 'user'
 * @returns 0 on success, 1 on failure
 */
export function configureCursorMcp(target: CursorInstallTarget): number {
  const mcpServerPath = findMcpServerPath();

  if (!mcpServerPath) {
    console.error('Could not find MCP server script');
    console.error('  Expected at: ~/.claude/plugins/marketplaces/thedotmack/plugin/scripts/mcp-server.cjs');
    return 1;
  }

  const targetDir = getTargetDir(target);
  if (!targetDir) {
    console.error(`Invalid target: ${target}. Use: project or user`);
    return 1;
  }

  const mcpJsonPath = path.join(targetDir, 'mcp.json');

  try {
    // Create the directory if needed
    mkdirSync(targetDir, { recursive: true });

    // Load the existing config or create a new one
    let config: CursorMcpConfig = { mcpServers: {} };
    if (existsSync(mcpJsonPath)) {
      try {
        config = JSON.parse(readFileSync(mcpJsonPath, 'utf-8'));
        if (!config.mcpServers) {
          config.mcpServers = {};
        }
      } catch (error) {
        // [ANTI-PATTERN IGNORED]: Fallback behavior - corrupt config, continue with empty
        logger.warn('SYSTEM', 'Corrupt mcp.json, creating new config', { path: mcpJsonPath }, error as Error);
        config = { mcpServers: {} };
      }
    }

    // Add the claude-mem MCP server
    config.mcpServers['claude-mem'] = {
      command: 'node',
      args: [mcpServerPath]
    };

    writeFileSync(mcpJsonPath, JSON.stringify(config, null, 2));
    console.log(`  Configured MCP server in ${target === 'user' ? '~/.cursor' : '.cursor'}/mcp.json`);
    console.log(`  Server path: ${mcpServerPath}`);

    return 0;
  } catch (error) {
    console.error(`Failed to configure MCP: ${(error as Error).message}`);
    return 1;
  }
}
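The important property of the merge step is that it preserves any MCP servers the user already configured and only adds (or overwrites) the `claude-mem` entry, falling back to an empty config when the file is corrupt. A pure sketch of that step (`mergeMcpConfig` is a hypothetical helper; the server path is illustrative):

```typescript
interface McpConfig { mcpServers: Record<string, { command: string; args?: string[] }>; }

// Parse the existing mcp.json if valid, fall back to empty, then add
// the claude-mem entry without disturbing other servers.
function mergeMcpConfig(existingJson: string | null, serverPath: string): McpConfig {
  let config: McpConfig = { mcpServers: {} };
  if (existingJson !== null) {
    try {
      config = JSON.parse(existingJson);
      if (!config.mcpServers) config.mcpServers = {};
    } catch {
      config = { mcpServers: {} };   // corrupt file: start fresh
    }
  }
  config.mcpServers['claude-mem'] = { command: 'node', args: [serverPath] };
  return config;
}

const merged = mergeMcpConfig('{"mcpServers":{"other":{"command":"foo"}}}', '/tmp/mcp-server.cjs');
console.log(Object.keys(merged.mcpServers)); // → [ 'other', 'claude-mem' ]
```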
// ============================================================================
// Hook Installation
// ============================================================================

/**
 * Install Cursor hooks
 */
export async function installCursorHooks(sourceDir: string, target: CursorInstallTarget): Promise<number> {
  const platform = detectPlatform();
  const scriptExt = getScriptExtension();

  console.log(`\nInstalling Claude-Mem Cursor hooks (${target} level, ${platform})...\n`);

  const targetDir = getTargetDir(target);
  if (!targetDir) {
    console.error(`Invalid target: ${target}. Use: project, user, or enterprise`);
    return 1;
  }

  const hooksDir = path.join(targetDir, 'hooks');
  const workspaceRoot = process.cwd();

  try {
    // Create directories
    mkdirSync(hooksDir, { recursive: true });

    // Determine which scripts to copy based on platform
    const commonScript = platform === 'windows' ? 'common.ps1' : 'common.sh';
    const hookScripts = [
      `session-init${scriptExt}`,
      `context-inject${scriptExt}`,
      `save-observation${scriptExt}`,
      `save-file-edit${scriptExt}`,
      `session-summary${scriptExt}`
    ];

    const scripts = [commonScript, ...hookScripts];

    for (const script of scripts) {
      const srcPath = path.join(sourceDir, script);
      const dstPath = path.join(hooksDir, script);

      if (existsSync(srcPath)) {
        const content = readFileSync(srcPath, 'utf-8');
        // Unix scripts need execute permission; Windows PowerShell doesn't
        const mode = platform === 'windows' ? undefined : 0o755;
        writeFileSync(dstPath, content, mode ? { mode } : undefined);
        console.log(`  Copied ${script}`);
      } else {
        console.warn(`  ${script} not found in source`);
      }
    }

    // Generate hooks.json with correct paths and platform-appropriate commands
    const hooksJsonPath = path.join(targetDir, 'hooks.json');
    const hookPrefix = target === 'project' ? './.cursor/hooks/' : `${hooksDir}/`;

    // For PowerShell, we need to invoke via powershell.exe
    const makeHookCommand = (scriptName: string) => {
      const scriptPath = `${hookPrefix}${scriptName}${scriptExt}`;
      if (platform === 'windows') {
        // PowerShell execution: use -ExecutionPolicy Bypass to ensure scripts run
        return `powershell.exe -ExecutionPolicy Bypass -File "${scriptPath}"`;
      }
      return scriptPath;
    };

    const hooksJson: CursorHooksJson = {
      version: 1,
      hooks: {
        beforeSubmitPrompt: [
          { command: makeHookCommand('session-init') },
          { command: makeHookCommand('context-inject') }
        ],
        afterMCPExecution: [
          { command: makeHookCommand('save-observation') }
        ],
        afterShellExecution: [
          { command: makeHookCommand('save-observation') }
        ],
        afterFileEdit: [
          { command: makeHookCommand('save-file-edit') }
        ],
        stop: [
          { command: makeHookCommand('session-summary') }
        ]
      }
    };

    writeFileSync(hooksJsonPath, JSON.stringify(hooksJson, null, 2));
    console.log(`  Created hooks.json (${platform} mode)`);

    // For project-level installs: create the initial context file
    if (target === 'project') {
      await setupProjectContext(targetDir, workspaceRoot);
    }

    console.log(`
Installation complete!

Hooks installed to: ${targetDir}/hooks.json
Scripts installed to: ${hooksDir}

Next steps:
1. Start claude-mem worker: claude-mem start
2. Restart Cursor to load the hooks
3. Check Cursor Settings → Hooks tab to verify

Context Injection:
Context from past sessions is stored in .cursor/rules/claude-mem-context.mdc
and automatically included in every chat. It updates after each session ends.
`);

    return 0;
  } catch (error) {
    console.error(`\nInstallation failed: ${(error as Error).message}`);
    if (target === 'enterprise') {
      console.error('  Tip: Enterprise installation may require sudo/admin privileges');
    }
    return 1;
  }
}
/**
 * Set up the initial context file for a project-level installation
 */
async function setupProjectContext(targetDir: string, workspaceRoot: string): Promise<void> {
  const rulesDir = path.join(targetDir, 'rules');
  mkdirSync(rulesDir, { recursive: true });

  const port = getWorkerPort();
  const projectName = path.basename(workspaceRoot);
  let contextGenerated = false;

  console.log(`  Generating initial context...`);

  try {
    // Check whether the worker is running
    const healthResponse = await fetch(`http://127.0.0.1:${port}/api/readiness`);
    if (healthResponse.ok) {
      // Fetch context
      const contextResponse = await fetch(
        `http://127.0.0.1:${port}/api/context/inject?project=${encodeURIComponent(projectName)}`
      );
      if (contextResponse.ok) {
        const context = await contextResponse.text();
        if (context && context.trim()) {
          writeContextFile(workspaceRoot, context);
          contextGenerated = true;
          console.log(`  Generated initial context from existing memory`);
        }
      }
    }
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Fallback behavior - worker not running, use placeholder
    logger.debug('CURSOR', 'Worker not running during install', {}, error as Error);
  }

  if (!contextGenerated) {
    // Create a placeholder context file
    const rulesFile = path.join(rulesDir, 'claude-mem-context.mdc');
    const placeholderContent = `---
alwaysApply: true
description: "Claude-mem context from past sessions (auto-updated)"
---

# Memory Context from Past Sessions

*No context yet. Complete your first session and context will appear here.*

Use claude-mem's MCP search tools for manual memory queries.
`;
    writeFileSync(rulesFile, placeholderContent);
    console.log(`  Created placeholder context file (will populate after first session)`);
  }

  // Register the project for automatic context updates after summaries
  registerCursorProject(projectName, workspaceRoot);
  console.log(`  Registered for auto-context updates`);
}
/**
 * Uninstall Cursor hooks
 */
export function uninstallCursorHooks(target: CursorInstallTarget): number {
  console.log(`\nUninstalling Claude-Mem Cursor hooks (${target} level)...\n`);

  const targetDir = getTargetDir(target);
  if (!targetDir) {
    console.error(`Invalid target: ${target}`);
    return 1;
  }

  try {
    const hooksDir = path.join(targetDir, 'hooks');
    const hooksJsonPath = path.join(targetDir, 'hooks.json');

    // Remove hook scripts for both platforms (in case the user switched platforms)
    const bashScripts = ['common.sh', 'session-init.sh', 'context-inject.sh',
                         'save-observation.sh', 'save-file-edit.sh', 'session-summary.sh'];
    const psScripts = ['common.ps1', 'session-init.ps1', 'context-inject.ps1',
                       'save-observation.ps1', 'save-file-edit.ps1', 'session-summary.ps1'];

    const allScripts = [...bashScripts, ...psScripts];

    for (const script of allScripts) {
      const scriptPath = path.join(hooksDir, script);
      if (existsSync(scriptPath)) {
        unlinkSync(scriptPath);
        console.log(`  Removed ${script}`);
      }
    }

    // Remove hooks.json
    if (existsSync(hooksJsonPath)) {
      unlinkSync(hooksJsonPath);
      console.log(`  Removed hooks.json`);
    }

    // Remove the context file and unregister if project-level
    if (target === 'project') {
      const contextFile = path.join(targetDir, 'rules', 'claude-mem-context.mdc');
      if (existsSync(contextFile)) {
        unlinkSync(contextFile);
        console.log(`  Removed context file`);
      }

      // Unregister from auto-context updates
      const projectName = path.basename(process.cwd());
      unregisterCursorProject(projectName);
      console.log(`  Unregistered from auto-context updates`);
    }

    console.log(`\nUninstallation complete!\n`);
    console.log('Restart Cursor to apply changes.');

    return 0;
  } catch (error) {
    console.error(`\nUninstallation failed: ${(error as Error).message}`);
    return 1;
  }
}
/**
 * Check Cursor hooks installation status
 */
export function checkCursorHooksStatus(): number {
  console.log('\nClaude-Mem Cursor Hooks Status\n');

  const locations: Array<{ name: string; dir: string }> = [
    { name: 'Project', dir: path.join(process.cwd(), '.cursor') },
    { name: 'User', dir: path.join(homedir(), '.cursor') },
  ];

  if (process.platform === 'darwin') {
    locations.push({ name: 'Enterprise', dir: '/Library/Application Support/Cursor' });
  } else if (process.platform === 'linux') {
    locations.push({ name: 'Enterprise', dir: '/etc/cursor' });
  }

  let anyInstalled = false;

  for (const loc of locations) {
    const hooksJson = path.join(loc.dir, 'hooks.json');
    const hooksDir = path.join(loc.dir, 'hooks');

    if (existsSync(hooksJson)) {
      anyInstalled = true;
      console.log(`${loc.name}: Installed`);
      console.log(`  Config: ${hooksJson}`);

      // Detect which platform's scripts are installed
      const bashScripts = ['session-init.sh', 'context-inject.sh', 'save-observation.sh'];
      const psScripts = ['session-init.ps1', 'context-inject.ps1', 'save-observation.ps1'];

      const hasBash = bashScripts.some(s => existsSync(path.join(hooksDir, s)));
      const hasPs = psScripts.some(s => existsSync(path.join(hooksDir, s)));

      if (hasBash && hasPs) {
        console.log(`  Platform: Both (bash + PowerShell)`);
      } else if (hasBash) {
        console.log(`  Platform: Unix (bash)`);
      } else if (hasPs) {
        console.log(`  Platform: Windows (PowerShell)`);
      } else {
        console.log(`  No hook scripts found`);
      }

      // Check for the appropriate scripts for the current platform
      const platform = detectPlatform();
      const scripts = platform === 'windows' ? psScripts : bashScripts;
      const missing = scripts.filter(s => !existsSync(path.join(hooksDir, s)));

      if (missing.length > 0) {
        console.log(`  Missing ${platform} scripts: ${missing.join(', ')}`);
      } else {
        console.log(`  Scripts: All present for ${platform}`);
      }

      // Check for the context file (project only)
      if (loc.name === 'Project') {
        const contextFile = path.join(loc.dir, 'rules', 'claude-mem-context.mdc');
        if (existsSync(contextFile)) {
          console.log(`  Context: Active`);
        } else {
          console.log(`  Context: Not yet generated (will be created on first prompt)`);
        }
      }
    } else {
      console.log(`${loc.name}: Not installed`);
    }
    console.log('');
  }

  if (!anyInstalled) {
    console.log('No hooks installed. Run: claude-mem cursor install\n');
  }

  return 0;
}
/**
 * Detect whether Claude Code is available.
 * Checks for the Claude Code CLI and the plugin directory.
 */
export async function detectClaudeCode(): Promise<boolean> {
  try {
    // Check for the Claude Code CLI
    const { stdout } = await execAsync('which claude || where claude', { timeout: 5000 });
    if (stdout.trim()) {
      return true;
    }
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Fallback behavior - CLI not found, continue to directory check
    logger.debug('SYSTEM', 'Claude CLI not in PATH', {}, error as Error);
  }

  // Check for the Claude Code plugin directory
  const pluginDir = path.join(homedir(), '.claude', 'plugins');
  return existsSync(pluginDir);
}
/**
 * Handle the cursor subcommand for hooks installation
 */
export async function handleCursorCommand(subcommand: string, args: string[]): Promise<number> {
  switch (subcommand) {
    case 'install': {
      const target = (args[0] || 'project') as CursorInstallTarget;
      const cursorHooksDir = findCursorHooksDir();

      if (!cursorHooksDir) {
        console.error('Could not find cursor-hooks directory');
        console.error('  Expected at: ~/.claude/plugins/marketplaces/thedotmack/cursor-hooks/');
        return 1;
      }

      return installCursorHooks(cursorHooksDir, target);
    }

    case 'uninstall': {
      const target = (args[0] || 'project') as CursorInstallTarget;
      return uninstallCursorHooks(target);
    }

    case 'status': {
      return checkCursorHooksStatus();
    }

    case 'setup': {
      // Interactive guided setup is handled by main() in worker-service.ts;
      // this branch is a placeholder and should not be reached.
      console.log('Use the main entry point for setup');
      return 0;
    }

    default: {
      console.log(`
Claude-Mem Cursor Integration

Usage: claude-mem cursor <command> [options]

Commands:
  setup                Interactive guided setup (recommended for first-time users)

  install [target]     Install Cursor hooks
                       target: project (default), user, or enterprise

  uninstall [target]   Remove Cursor hooks
                       target: project (default), user, or enterprise

  status               Check installation status

Examples:
  npm run cursor:setup              # Interactive wizard (recommended)
  npm run cursor:install            # Install for current project
  claude-mem cursor install user    # Install globally for user
  claude-mem cursor uninstall       # Remove from current project
  claude-mem cursor status          # Check if hooks are installed

For more info: https://docs.claude-mem.ai/cursor
`);
      return 0;
    }
  }
}
6 src/services/integrations/index.ts Normal file
@@ -0,0 +1,6 @@
/**
 * Integrations module - IDE integrations (Cursor, etc.)
 */

export * from './types.js';
export * from './CursorHooksInstaller.js';
27 src/services/integrations/types.ts Normal file
@@ -0,0 +1,27 @@
/**
 * Integration Types - Shared types for IDE integrations
 */

export interface CursorMcpConfig {
  mcpServers: {
    [name: string]: {
      command: string;
      args?: string[];
      env?: Record<string, string>;
    };
  };
}

export type CursorInstallTarget = 'project' | 'user' | 'enterprise';
export type Platform = 'windows' | 'unix';

export interface CursorHooksJson {
  version: number;
  hooks: {
    beforeSubmitPrompt?: Array<{ command: string }>;
    afterMCPExecution?: Array<{ command: string }>;
    afterShellExecution?: Array<{ command: string }>;
    afterFileEdit?: Array<{ command: string }>;
    stop?: Array<{ command: string }>;
  };
}
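The `CursorHooksJson` shape above maps each Cursor lifecycle event to a list of shell commands. A minimal sketch of building and serializing one such object (the hook commands shown are illustrative placeholders, not the installer's actual output):

```typescript
// Local copy of the interface from types.ts, so this sketch is self-contained.
interface CursorHooksJson {
  version: number;
  hooks: {
    beforeSubmitPrompt?: Array<{ command: string }>;
    stop?: Array<{ command: string }>;
  };
}

// Hypothetical hook commands - the real installer writes its own paths.
const hooksJson: CursorHooksJson = {
  version: 1,
  hooks: {
    beforeSubmitPrompt: [{ command: 'claude-mem hook before-submit' }],
    stop: [{ command: 'claude-mem hook stop' }],
  },
};

// This is the JSON the installer would write to hooks.json.
const serialized = JSON.stringify(hooksJson, null, 2);
```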
@@ -11,21 +11,22 @@ export class SessionQueueProcessor {

   /**
    * Create an async iterator that yields messages as they become available.
-   * Uses atomic database claiming to prevent race conditions.
+   * Uses atomic claim-and-delete to prevent duplicates.
+   * The queue is a pure buffer: claim it, delete it, process in memory.
    * Waits for 'message' event when queue is empty.
    */
   async *createIterator(sessionDbId: number, signal: AbortSignal): AsyncIterableIterator<PendingMessageWithId> {
     while (!signal.aborted) {
       try {
-        // 1. Atomically claim next message from DB
-        const persistentMessage = this.store.claimNextMessage(sessionDbId);
+        // Atomically claim AND DELETE next message from DB
+        // Message is now in memory only - no "processing" state tracking needed
+        const persistentMessage = this.store.claimAndDelete(sessionDbId);

         if (persistentMessage) {
-          // Yield the message for processing
+          // Yield the message for processing (it's already deleted from queue)
           yield this.toPendingMessageWithId(persistentMessage);
         } else {
-          // 2. Queue empty - wait for wake-up event
-          // We use a promise that resolves on 'message' event or abort
+          // Queue empty - wait for wake-up event
           await this.waitForMessage(signal);
         }
       } catch (error) {
102 src/services/server/ErrorHandler.ts Normal file
@@ -0,0 +1,102 @@
/**
 * ErrorHandler - Centralized error handling for Express
 *
 * Provides error handling middleware and utilities for the server.
 */

import { Request, Response, NextFunction, ErrorRequestHandler } from 'express';
import { logger } from '../../utils/logger.js';

/**
 * Standard error response format
 */
export interface ErrorResponse {
  error: string;
  message: string;
  code?: string;
  details?: unknown;
}

/**
 * Application error with additional context
 */
export class AppError extends Error {
  constructor(
    message: string,
    public statusCode: number = 500,
    public code?: string,
    public details?: unknown
  ) {
    super(message);
    this.name = 'AppError';
  }
}

/**
 * Create an error response object
 */
export function createErrorResponse(
  error: string,
  message: string,
  code?: string,
  details?: unknown
): ErrorResponse {
  const response: ErrorResponse = { error, message };
  if (code) response.code = code;
  if (details) response.details = details;
  return response;
}

/**
 * Global error handler middleware
 * Should be registered last in the middleware chain
 */
export const errorHandler: ErrorRequestHandler = (
  err: Error | AppError,
  req: Request,
  res: Response,
  _next: NextFunction
): void => {
  // Determine status code
  const statusCode = err instanceof AppError ? err.statusCode : 500;

  // Log error
  logger.error('HTTP', `Error handling ${req.method} ${req.path}`, {
    statusCode,
    error: err.message,
    code: err instanceof AppError ? err.code : undefined
  }, err);

  // Build response
  const response = createErrorResponse(
    err.name || 'Error',
    err.message,
    err instanceof AppError ? err.code : undefined,
    err instanceof AppError ? err.details : undefined
  );

  // Send response (don't call next, as we've handled the error)
  res.status(statusCode).json(response);
};

/**
 * Not found handler - for routes that don't exist
 */
export function notFoundHandler(req: Request, res: Response): void {
  res.status(404).json(createErrorResponse(
    'NotFound',
    `Cannot ${req.method} ${req.path}`
  ));
}

/**
 * Async wrapper to catch errors in async route handlers
 * Automatically passes errors to Express error handler
 */
export function asyncHandler<T>(
  fn: (req: Request, res: Response, next: NextFunction) => Promise<T>
): (req: Request, res: Response, next: NextFunction) => void {
  return (req: Request, res: Response, next: NextFunction): void => {
    Promise.resolve(fn(req, res, next)).catch(next);
  };
}
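The `asyncHandler` wrapper exists because Express 4 does not catch rejected promises from async route handlers on its own. A self-contained sketch of the pattern with stubbed `req`/`res`/`next` values (no Express import; the types here are simplified stand-ins, not the real Express signatures):

```typescript
type Next = (err?: Error) => void;

// Same pattern as asyncHandler in ErrorHandler.ts: resolve the handler's
// return value and route any rejection into next(), where the global
// errorHandler middleware would pick it up.
function asyncHandler(
  fn: (req: object, res: object, next: Next) => Promise<unknown>
): (req: object, res: object, next: Next) => void {
  return (req, res, next) => {
    Promise.resolve(fn(req, res, next)).catch(next);
  };
}

const captured: Error[] = [];
const next: Next = (err) => { if (err) captured.push(err); };

// A handler that rejects - without the wrapper this would surface as an
// unhandled promise rejection instead of a 500 response.
const route = asyncHandler(async () => {
  throw new Error('boom');
});

route({}, {}, next);
```

The rejection reaches `next()` on a later microtask, which is one reason `errorHandler` must be registered after all routes via `finalizeRoutes()`.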
14 src/services/server/Middleware.ts Normal file
@@ -0,0 +1,14 @@
/**
 * Server Middleware - Re-exports and enhances existing middleware
 *
 * This module provides a unified interface for server middleware.
 * Re-exports from worker/http/middleware.ts to maintain backward compatibility
 * while providing a cleaner import path for server setup.
 */

// Re-export all middleware from the existing location
export {
  createMiddleware,
  requireLocalhost,
  summarizeRequestBody
} from '../worker/http/middleware.js';
271 src/services/server/Server.ts Normal file
@@ -0,0 +1,271 @@
/**
 * Server - Express app setup and route registration
 *
 * Extracted from worker-service.ts monolith to provide centralized HTTP server management.
 * Handles:
 * - Express app creation and configuration
 * - Middleware registration
 * - Route registration (delegates to route handlers)
 * - Core system endpoints (health, readiness, version, admin)
 */

import express, { Request, Response, Application } from 'express';
import http from 'http';
import * as fs from 'fs';
import path from 'path';
import { logger } from '../../utils/logger.js';
import { createMiddleware, summarizeRequestBody, requireLocalhost } from './Middleware.js';
import { errorHandler, notFoundHandler } from './ErrorHandler.js';

// Build-time injected version constant (set by esbuild define)
declare const __DEFAULT_PACKAGE_VERSION__: string;
const BUILT_IN_VERSION = typeof __DEFAULT_PACKAGE_VERSION__ !== 'undefined'
  ? __DEFAULT_PACKAGE_VERSION__
  : 'development';

/**
 * Interface for route handlers that can be registered with the server
 */
export interface RouteHandler {
  setupRoutes(app: Application): void;
}

/**
 * Options for initializing the server
 */
export interface ServerOptions {
  /** Whether initialization is complete (for readiness check) */
  getInitializationComplete: () => boolean;
  /** Whether MCP is ready (for health/readiness info) */
  getMcpReady: () => boolean;
  /** Shutdown function for admin endpoints */
  onShutdown: () => Promise<void>;
  /** Restart function for admin endpoints */
  onRestart: () => Promise<void>;
}

/**
 * Express application and HTTP server wrapper
 * Provides centralized setup for middleware and routes
 */
export class Server {
  readonly app: Application;
  private server: http.Server | null = null;
  private readonly options: ServerOptions;
  private readonly startTime: number = Date.now();

  constructor(options: ServerOptions) {
    this.options = options;
    this.app = express();
    this.setupMiddleware();
    this.setupCoreRoutes();
  }

  /**
   * Get the underlying HTTP server
   */
  getHttpServer(): http.Server | null {
    return this.server;
  }

  /**
   * Start listening on the specified host and port
   */
  async listen(port: number, host: string): Promise<void> {
    return new Promise<void>((resolve, reject) => {
      this.server = this.app.listen(port, host, () => {
        logger.info('SYSTEM', 'HTTP server started', { host, port, pid: process.pid });
        resolve();
      });
      this.server.on('error', reject);
    });
  }

  /**
   * Close the HTTP server
   */
  async close(): Promise<void> {
    if (!this.server) return;

    // Close all active connections
    this.server.closeAllConnections();

    // Give Windows time to close connections before closing server
    if (process.platform === 'win32') {
      await new Promise(r => setTimeout(r, 500));
    }

    // Close the server
    await new Promise<void>((resolve, reject) => {
      this.server!.close(err => err ? reject(err) : resolve());
    });

    // Extra delay on Windows to ensure port is fully released
    if (process.platform === 'win32') {
      await new Promise(r => setTimeout(r, 500));
    }

    this.server = null;
    logger.info('SYSTEM', 'HTTP server closed');
  }

  /**
   * Register a route handler
   */
  registerRoutes(handler: RouteHandler): void {
    handler.setupRoutes(this.app);
  }

  /**
   * Finalize route setup by adding error handlers
   * Call this after all routes have been registered
   */
  finalizeRoutes(): void {
    // 404 handler for unmatched routes
    this.app.use(notFoundHandler);

    // Global error handler (must be last)
    this.app.use(errorHandler);
  }

  /**
   * Setup Express middleware
   */
  private setupMiddleware(): void {
    const middlewares = createMiddleware(summarizeRequestBody);
    middlewares.forEach(mw => this.app.use(mw));
  }

  /**
   * Setup core system routes (health, readiness, version, admin)
   */
  private setupCoreRoutes(): void {
    // Test build ID for debugging which build is running
    const TEST_BUILD_ID = 'TEST-008-wrapper-ipc';

    // Health check endpoint - always responds, even during initialization
    this.app.get('/api/health', (_req: Request, res: Response) => {
      res.status(200).json({
        status: 'ok',
        build: TEST_BUILD_ID,
        managed: process.env.CLAUDE_MEM_MANAGED === 'true',
        hasIpc: typeof process.send === 'function',
        platform: process.platform,
        pid: process.pid,
        initialized: this.options.getInitializationComplete(),
        mcpReady: this.options.getMcpReady(),
      });
    });

    // Readiness check endpoint - returns 503 until full initialization completes
    this.app.get('/api/readiness', (_req: Request, res: Response) => {
      if (this.options.getInitializationComplete()) {
        res.status(200).json({
          status: 'ready',
          mcpReady: this.options.getMcpReady(),
        });
      } else {
        res.status(503).json({
          status: 'initializing',
          message: 'Worker is still initializing, please retry',
        });
      }
    });

    // Version endpoint - returns the worker's built-in version
    this.app.get('/api/version', (_req: Request, res: Response) => {
      res.status(200).json({ version: BUILT_IN_VERSION });
    });

    // Instructions endpoint - loads SKILL.md sections on-demand
    this.app.get('/api/instructions', async (req: Request, res: Response) => {
      const topic = (req.query.topic as string) || 'all';
      const operation = req.query.operation as string | undefined;

      try {
        let content: string;

        if (operation) {
          const operationPath = path.join(__dirname, '../skills/mem-search/operations', `${operation}.md`);
          content = await fs.promises.readFile(operationPath, 'utf-8');
        } else {
          const skillPath = path.join(__dirname, '../skills/mem-search/SKILL.md');
          const fullContent = await fs.promises.readFile(skillPath, 'utf-8');
          content = this.extractInstructionSection(fullContent, topic);
        }

        res.json({
          content: [{ type: 'text', text: content }]
        });
      } catch (error) {
        res.status(404).json({ error: 'Instruction not found' });
      }
    });

    // Admin endpoints for process management (localhost-only)
    this.app.post('/api/admin/restart', requireLocalhost, async (_req: Request, res: Response) => {
      res.json({ status: 'restarting' });

      // Handle Windows managed mode via IPC
      const isWindowsManaged = process.platform === 'win32' &&
        process.env.CLAUDE_MEM_MANAGED === 'true' &&
        process.send;

      if (isWindowsManaged) {
        logger.info('SYSTEM', 'Sending restart request to wrapper');
        process.send!({ type: 'restart' });
      } else {
        // Unix or standalone Windows - handle restart ourselves
        setTimeout(async () => {
          await this.options.onRestart();
        }, 100);
      }
    });

    this.app.post('/api/admin/shutdown', requireLocalhost, async (_req: Request, res: Response) => {
      res.json({ status: 'shutting_down' });

      // Handle Windows managed mode via IPC
      const isWindowsManaged = process.platform === 'win32' &&
        process.env.CLAUDE_MEM_MANAGED === 'true' &&
        process.send;

      if (isWindowsManaged) {
        logger.info('SYSTEM', 'Sending shutdown request to wrapper');
        process.send!({ type: 'shutdown' });
      } else {
        // Unix or standalone Windows - handle shutdown ourselves
        setTimeout(async () => {
          await this.options.onShutdown();
        }, 100);
      }
    });
  }

  /**
   * Extract a specific section from instruction content
   */
  private extractInstructionSection(content: string, topic: string): string {
    const sections: Record<string, string> = {
      'workflow': this.extractBetween(content, '## The Workflow', '## Search Parameters'),
      'search_params': this.extractBetween(content, '## Search Parameters', '## Examples'),
      'examples': this.extractBetween(content, '## Examples', '## Why This Workflow'),
      'all': content
    };

    return sections[topic] || sections['all'];
  }

  /**
   * Extract text between two markers
   */
  private extractBetween(content: string, startMarker: string, endMarker: string): string {
    const startIdx = content.indexOf(startMarker);
    const endIdx = content.indexOf(endMarker);

    if (startIdx === -1) return content;
    if (endIdx === -1) return content.substring(startIdx);

    return content.substring(startIdx, endIdx).trim();
  }
}
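The marker-based section extraction in `extractBetween` is plain `indexOf` slicing, and it degrades gracefully: a missing start marker returns the whole document, a missing end marker returns the tail. A self-contained sketch of that behavior (the function body mirrors the one in Server.ts; the sample document is made up):

```typescript
// Mirrors Server.extractBetween: slice content between two heading markers.
function extractBetween(content: string, startMarker: string, endMarker: string): string {
  const startIdx = content.indexOf(startMarker);
  const endIdx = content.indexOf(endMarker);

  if (startIdx === -1) return content;                   // no start: return everything
  if (endIdx === -1) return content.substring(startIdx); // no end: return the tail

  return content.substring(startIdx, endIdx).trim();
}

// Hypothetical SKILL.md-style document with the markers the server looks for.
const doc = '# Title\n## The Workflow\nsearch first\n## Search Parameters\nquery, limit\n';

const workflow = extractBetween(doc, '## The Workflow', '## Search Parameters');
const missing = extractBetween(doc, '## Nope', '## Search Parameters');
```

Note the fallbacks mean a typo in a marker never throws; the caller just gets a larger slice than expected.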
7 src/services/server/index.ts Normal file
@@ -0,0 +1,7 @@
/**
 * Server module - HTTP server, middleware, and error handling
 */

export * from './Server.js';
export * from './Middleware.js';
export * from './ErrorHandler.js';
@@ -1,6 +1,7 @@
 import { Database } from 'bun:sqlite';
 import { DATA_DIR, DB_PATH, ensureDir } from '../../shared/paths.js';
 import { logger } from '../../utils/logger.js';
+import { MigrationRunner } from './migrations/runner.js';

 // SQLite configuration constants
 const SQLITE_MMAP_SIZE_BYTES = 256 * 1024 * 1024; // 256MB
@@ -14,8 +15,53 @@ export interface Migration {

 let dbInstance: Database | null = null;

+/**
+ * ClaudeMemDatabase - New entry point for the sqlite module
+ *
+ * Replaces SessionStore as the database coordinator.
+ * Sets up bun:sqlite with optimized settings and runs all migrations.
+ *
+ * Usage:
+ *   const db = new ClaudeMemDatabase();                     // uses default DB_PATH
+ *   const db = new ClaudeMemDatabase('/path/to/db.sqlite');
+ *   const db = new ClaudeMemDatabase(':memory:');           // for tests
+ */
+export class ClaudeMemDatabase {
+  public db: Database;
+
+  constructor(dbPath: string = DB_PATH) {
+    // Ensure data directory exists (skip for in-memory databases)
+    if (dbPath !== ':memory:') {
+      ensureDir(DATA_DIR);
+    }
+
+    // Create database connection
+    this.db = new Database(dbPath, { create: true, readwrite: true });
+
+    // Apply optimized SQLite settings
+    this.db.run('PRAGMA journal_mode = WAL');
+    this.db.run('PRAGMA synchronous = NORMAL');
+    this.db.run('PRAGMA foreign_keys = ON');
+    this.db.run('PRAGMA temp_store = memory');
+    this.db.run(`PRAGMA mmap_size = ${SQLITE_MMAP_SIZE_BYTES}`);
+    this.db.run(`PRAGMA cache_size = ${SQLITE_CACHE_SIZE_PAGES}`);
+
+    // Run all migrations
+    const migrationRunner = new MigrationRunner(this.db);
+    migrationRunner.runAllMigrations();
+  }
+
+  /**
+   * Close the database connection
+   */
+  close(): void {
+    this.db.close();
+  }
+}
+
 /**
  * SQLite Database singleton with migration support and optimized settings
+ * @deprecated Use ClaudeMemDatabase instead for new code
  */
 export class DatabaseManager {
   private static instance: DatabaseManager;
@@ -173,4 +219,17 @@ export async function initializeDatabase(): Promise<Database> {
   return await manager.initialize();
 }

-export { Database };
+// Re-export bun:sqlite Database type
+export { Database };
+
+// Re-export MigrationRunner for external use
+export { MigrationRunner } from './migrations/runner.js';
+
+// Re-export all module functions for convenient imports
+export * from './Sessions.js';
+export * from './Observations.js';
+export * from './Summaries.js';
+export * from './Prompts.js';
+export * from './Timeline.js';
+export * from './Import.js';
+export * from './transactions.js';
5 src/services/sqlite/Import.ts Normal file
@@ -0,0 +1,5 @@
/**
 * Import functions for bulk data import with duplicate checking
 */

export * from './import/bulk.js';
10 src/services/sqlite/Observations.ts Normal file
@@ -0,0 +1,10 @@
/**
 * Observations module - named re-exports
 * Provides all observation-related database operations
 */

export * from './observations/types.js';
export * from './observations/store.js';
export * from './observations/get.js';
export * from './observations/recent.js';
export * from './observations/files.js';
@@ -26,17 +26,14 @@ export interface PersistentPendingMessage {
 /**
  * PendingMessageStore - Persistent work queue for SDK messages
  *
- * Messages are persisted before processing and marked complete after success.
- * This enables recovery from SDK hangs and worker crashes.
+ * Messages are persisted before processing using a claim-and-delete pattern.
+ * This simplifies the lifecycle and eliminates duplicate processing bugs.
  *
  * Lifecycle:
  * 1. enqueue() - Message persisted with status 'pending'
- * 2. markProcessing() - Status changes to 'processing' when yielded to SDK
- * 3. markProcessed() - Status changes to 'processed' after successful SDK response
- * 4. markFailed() - Status changes to 'failed' if max retries exceeded
+ * 2. claimAndDelete() - Atomically claims and deletes message (process in memory)
  *
  * Recovery:
  * - resetStuckMessages() - Moves 'processing' messages back to 'pending' if stuck
  * - getSessionsWithPendingMessages() - Find sessions that need recovery on startup
  */
 export class PendingMessageStore {
@@ -80,14 +77,13 @@
   }

   /**
-   * Atomically claim the next pending message for processing
-   * Finds oldest pending -> marks processing -> returns it
-   * Uses a transaction to prevent race conditions
+   * Atomically claim and DELETE the next pending message.
+   * Finds oldest pending -> returns it -> deletes from queue.
+   * The queue is a pure buffer: claim it, delete it, process in memory.
+   * Uses a transaction to prevent race conditions.
    */
-  claimNextMessage(sessionDbId: number): PersistentPendingMessage | null {
-    const now = Date.now();
-
-    const claimTx = this.db.transaction((sessionId: number, timestamp: number) => {
+  claimAndDelete(sessionDbId: number): PersistentPendingMessage | null {
+    const claimTx = this.db.transaction((sessionId: number) => {
       const peekStmt = this.db.prepare(`
         SELECT * FROM pending_messages
         WHERE session_db_id = ? AND status = 'pending'
@@ -95,26 +91,16 @@
         LIMIT 1
       `);
       const msg = peekStmt.get(sessionId) as PersistentPendingMessage | null;

       if (msg) {
-        const updateStmt = this.db.prepare(`
-          UPDATE pending_messages
-          SET status = 'processing', started_processing_at_epoch = ?
-          WHERE id = ?
-        `);
-        updateStmt.run(timestamp, msg.id);
-
-        // Return updated object
-        return {
-          ...msg,
-          status: 'processing',
-          started_processing_at_epoch: timestamp
-        } as PersistentPendingMessage;
+        // Delete immediately - no "processing" state needed
+        const deleteStmt = this.db.prepare('DELETE FROM pending_messages WHERE id = ?');
+        deleteStmt.run(msg.id);
       }
-      return null;
+      return msg;
     });

-    return claimTx(sessionDbId, now) as PersistentPendingMessage | null;
+    return claimTx(sessionDbId) as PersistentPendingMessage | null;
   }

   /**
@@ -254,38 +240,7 @@
   }

-  /**
-   * Mark message as being processed (status: pending -> processing)
-   */
-  markProcessing(messageId: number): void {
-    const now = Date.now();
-    const stmt = this.db.prepare(`
-      UPDATE pending_messages
-      SET status = 'processing', started_processing_at_epoch = ?
-      WHERE id = ? AND status = 'pending'
-    `);
-    stmt.run(now, messageId);
-  }
-
-  /**
-   * Mark message as successfully processed (status: processing -> processed)
-   * Clears tool_input and tool_response to save space (observations are already saved)
-   */
-  markProcessed(messageId: number): void {
-    const now = Date.now();
-    const stmt = this.db.prepare(`
-      UPDATE pending_messages
-      SET
-        status = 'processed',
-        completed_at_epoch = ?,
-        tool_input = NULL,
-        tool_response = NULL
-      WHERE id = ? AND status = 'processing'
-    `);
-    stmt.run(now, messageId);
-  }
-
   /**
-   * Mark message as failed (status: processing -> failed or back to pending for retry)
+   * Mark message as failed (status: pending -> failed or back to pending for retry)
    * If retry_count < maxRetries, moves back to 'pending' for retry
    * Otherwise marks as 'failed' permanently
    */
@@ -381,28 +336,6 @@
     return result ? { sessionDbId: result.session_db_id, contentSessionId: result.content_session_id } : null;
   }

-  /**
-   * Cleanup old processed messages (retention policy)
-   * Keeps the most recent N processed messages, deletes the rest
-   * @param retentionCount Number of processed messages to keep (default: 100)
-   * @returns Number of messages deleted
-   */
-  cleanupProcessed(retentionCount: number = 100): number {
-    const stmt = this.db.prepare(`
-      DELETE FROM pending_messages
-      WHERE status = 'processed'
-        AND id NOT IN (
-          SELECT id FROM pending_messages
-          WHERE status = 'processed'
-          ORDER BY completed_at_epoch DESC
-          LIMIT ?
-        )
-    `);
-
-    const result = stmt.run(retentionCount);
-    return result.changes;
-  }
-
   /**
    * Clear all failed messages from the queue
    * @returns Number of messages deleted
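The claim-and-delete pattern replaces the pending → processing → processed state machine with one atomic step: take the oldest message and remove it in the same transaction, so a crash mid-processing can never double-process a message. An in-memory sketch of the semantics (arrays instead of SQLite; in the real store the peek and delete run inside a single `db.transaction`):

```typescript
interface PendingMessage { id: number; sessionDbId: number; payload: string; }

// Pure-buffer queue: claiming a message removes it immediately.
class InMemoryQueue {
  private messages: PendingMessage[] = [];
  private nextId = 1;

  enqueue(sessionDbId: number, payload: string): void {
    this.messages.push({ id: this.nextId++, sessionDbId, payload });
  }

  // Analogue of claimAndDelete(): find the oldest message for the session,
  // remove it from the queue, and return it in one step.
  claimAndDelete(sessionDbId: number): PendingMessage | null {
    const idx = this.messages.findIndex(m => m.sessionDbId === sessionDbId);
    if (idx === -1) return null;
    const [msg] = this.messages.splice(idx, 1);
    return msg;
  }
}

const queue = new InMemoryQueue();
queue.enqueue(7, 'first');
queue.enqueue(7, 'second');

const a = queue.claimAndDelete(7); // oldest message, now gone from the queue
const b = queue.claimAndDelete(7);
const c = queue.claimAndDelete(7); // queue drained
```

The trade-off is that a crash after the claim loses the in-flight message, which the refactor accepts in exchange for eliminating the stuck-"processing" recovery paths.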
10 src/services/sqlite/Prompts.ts Normal file
@@ -0,0 +1,10 @@
/**
 * User prompts module - named re-exports
 *
 * Provides all user prompt database operations as standalone functions.
 * Each function takes `db: Database` as first parameter.
 */

export * from './prompts/types.js';
export * from './prompts/store.js';
export * from './prompts/get.js';
@@ -1324,6 +1324,119 @@ export class SessionStore {
  }

  /**
   * ATOMIC: Store observations + summary (no message tracking)
   *
   * Simplified version for use with claim-and-delete queue pattern.
   * Messages are deleted from queue immediately on claim, so there's no
   * message completion to track. This just stores observations and summary.
   *
   * @param memorySessionId - SDK memory session ID
   * @param project - Project name
   * @param observations - Array of observations to store (can be empty)
   * @param summary - Optional summary to store
   * @param promptNumber - Optional prompt number
   * @param discoveryTokens - Discovery tokens count
   * @param overrideTimestampEpoch - Optional override timestamp
   * @returns Object with observation IDs, optional summary ID, and timestamp
   */
  storeObservations(
    memorySessionId: string,
    project: string,
    observations: Array<{
      type: string;
      title: string | null;
      subtitle: string | null;
      facts: string[];
      narrative: string | null;
      concepts: string[];
      files_read: string[];
      files_modified: string[];
    }>,
    summary: {
      request: string;
      investigated: string;
      learned: string;
      completed: string;
      next_steps: string;
      notes: string | null;
    } | null,
    promptNumber?: number,
    discoveryTokens: number = 0,
    overrideTimestampEpoch?: number
  ): { observationIds: number[]; summaryId: number | null; createdAtEpoch: number } {
    // Use override timestamp if provided
    const timestampEpoch = overrideTimestampEpoch ?? Date.now();
    const timestampIso = new Date(timestampEpoch).toISOString();

    // Create transaction that wraps all operations
    const storeTx = this.db.transaction(() => {
      const observationIds: number[] = [];

      // 1. Store all observations
      const obsStmt = this.db.prepare(`
        INSERT INTO observations
          (memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
           files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
        VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
      `);

      for (const observation of observations) {
        const result = obsStmt.run(
          memorySessionId,
          project,
          observation.type,
          observation.title,
          observation.subtitle,
          JSON.stringify(observation.facts),
          observation.narrative,
          JSON.stringify(observation.concepts),
          JSON.stringify(observation.files_read),
          JSON.stringify(observation.files_modified),
          promptNumber || null,
          discoveryTokens,
          timestampIso,
          timestampEpoch
        );
        observationIds.push(Number(result.lastInsertRowid));
      }

      // 2. Store summary if provided
      let summaryId: number | null = null;
      if (summary) {
        const summaryStmt = this.db.prepare(`
          INSERT INTO session_summaries
            (memory_session_id, project, request, investigated, learned, completed,
             next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
          VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
        `);

        const result = summaryStmt.run(
          memorySessionId,
          project,
          summary.request,
          summary.investigated,
          summary.learned,
          summary.completed,
          summary.next_steps,
          summary.notes,
          promptNumber || null,
          discoveryTokens,
          timestampIso,
          timestampEpoch
        );
        summaryId = Number(result.lastInsertRowid);
      }

      return { observationIds, summaryId, createdAtEpoch: timestampEpoch };
    });

    // Execute the transaction and return results
    return storeTx();
  }

  /**
   * @deprecated Use storeObservations instead. This method is kept for backwards compatibility.
   *
   * ATOMIC: Store observations + summary + mark pending message as processed
   *
   * This method wraps observation storage, summary storage, and message completion
11 src/services/sqlite/Sessions.ts Normal file
@@ -0,0 +1,11 @@
/**
 * Sessions module - re-exports all session-related functions
 *
 * Usage:
 *   import { createSDKSession, getSessionById } from './Sessions.js';
 *   const sessionId = createSDKSession(db, contentId, project, prompt);
 */

export * from './sessions/types.js';
export * from './sessions/create.js';
export * from './sessions/get.js';
7 src/services/sqlite/Summaries.ts Normal file
@@ -0,0 +1,7 @@
/**
 * Summaries module - Named re-exports for summary-related database operations
 */
export * from './summaries/types.js';
export * from './summaries/store.js';
export * from './summaries/get.js';
export * from './summaries/recent.js';
8 src/services/sqlite/Timeline.ts Normal file
@@ -0,0 +1,8 @@
/**
 * Timeline module re-exports
 * Provides time-based context queries for observations, sessions, and prompts
 *
 * grep-friendly: Timeline, getTimelineAroundTimestamp, getTimelineAroundObservation, getAllProjects
 */

export * from './timeline/queries.js';
236 src/services/sqlite/import/bulk.ts Normal file
@@ -0,0 +1,236 @@
/**
 * Bulk import functions for importing data with duplicate checking
 */

import { Database } from 'bun:sqlite';

export interface ImportResult {
  imported: boolean;
  id: number;
}

/**
 * Import SDK session with duplicate checking
 * Duplicates are identified by content_session_id
 */
export function importSdkSession(
  db: Database,
  session: {
    content_session_id: string;
    memory_session_id: string;
    project: string;
    user_prompt: string;
    started_at: string;
    started_at_epoch: number;
    completed_at: string | null;
    completed_at_epoch: number | null;
    status: string;
  }
): ImportResult {
  // Check if session already exists
  const existing = db
    .prepare('SELECT id FROM sdk_sessions WHERE content_session_id = ?')
    .get(session.content_session_id) as { id: number } | undefined;

  if (existing) {
    return { imported: false, id: existing.id };
  }

  const stmt = db.prepare(`
    INSERT INTO sdk_sessions (
      content_session_id, memory_session_id, project, user_prompt,
      started_at, started_at_epoch, completed_at, completed_at_epoch, status
    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    session.content_session_id,
    session.memory_session_id,
    session.project,
    session.user_prompt,
    session.started_at,
    session.started_at_epoch,
    session.completed_at,
    session.completed_at_epoch,
    session.status
  );

  return { imported: true, id: result.lastInsertRowid as number };
}

/**
 * Import session summary with duplicate checking
 * Duplicates are identified by memory_session_id
 */
export function importSessionSummary(
  db: Database,
  summary: {
    memory_session_id: string;
    project: string;
    request: string | null;
    investigated: string | null;
    learned: string | null;
    completed: string | null;
    next_steps: string | null;
    files_read: string | null;
    files_edited: string | null;
    notes: string | null;
    prompt_number: number | null;
    discovery_tokens: number;
    created_at: string;
    created_at_epoch: number;
  }
): ImportResult {
  // Check if summary already exists for this session
  const existing = db
    .prepare('SELECT id FROM session_summaries WHERE memory_session_id = ?')
    .get(summary.memory_session_id) as { id: number } | undefined;

  if (existing) {
    return { imported: false, id: existing.id };
  }

  const stmt = db.prepare(`
    INSERT INTO session_summaries (
      memory_session_id, project, request, investigated, learned,
      completed, next_steps, files_read, files_edited, notes,
      prompt_number, discovery_tokens, created_at, created_at_epoch
    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    summary.memory_session_id,
    summary.project,
    summary.request,
    summary.investigated,
    summary.learned,
    summary.completed,
    summary.next_steps,
    summary.files_read,
    summary.files_edited,
    summary.notes,
    summary.prompt_number,
    summary.discovery_tokens || 0,
    summary.created_at,
    summary.created_at_epoch
  );

  return { imported: true, id: result.lastInsertRowid as number };
}

/**
 * Import observation with duplicate checking
 * Duplicates are identified by memory_session_id + title + created_at_epoch
 */
export function importObservation(
  db: Database,
  obs: {
    memory_session_id: string;
    project: string;
    text: string | null;
    type: string;
    title: string | null;
    subtitle: string | null;
    facts: string | null;
    narrative: string | null;
    concepts: string | null;
    files_read: string | null;
    files_modified: string | null;
    prompt_number: number | null;
    discovery_tokens: number;
    created_at: string;
    created_at_epoch: number;
  }
): ImportResult {
  // Check if observation already exists
  const existing = db
    .prepare(
      `
      SELECT id FROM observations
      WHERE memory_session_id = ? AND title = ? AND created_at_epoch = ?
      `
    )
    .get(obs.memory_session_id, obs.title, obs.created_at_epoch) as
    | { id: number }
    | undefined;

  if (existing) {
    return { imported: false, id: existing.id };
  }

  const stmt = db.prepare(`
    INSERT INTO observations (
      memory_session_id, project, text, type, title, subtitle,
      facts, narrative, concepts, files_read, files_modified,
      prompt_number, discovery_tokens, created_at, created_at_epoch
    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    obs.memory_session_id,
    obs.project,
    obs.text,
    obs.type,
    obs.title,
    obs.subtitle,
    obs.facts,
    obs.narrative,
    obs.concepts,
    obs.files_read,
    obs.files_modified,
    obs.prompt_number,
    obs.discovery_tokens || 0,
    obs.created_at,
    obs.created_at_epoch
  );

  return { imported: true, id: result.lastInsertRowid as number };
}

/**
 * Import user prompt with duplicate checking
 * Duplicates are identified by content_session_id + prompt_number
 */
export function importUserPrompt(
  db: Database,
  prompt: {
    content_session_id: string;
    prompt_number: number;
    prompt_text: string;
    created_at: string;
    created_at_epoch: number;
  }
): ImportResult {
  // Check if prompt already exists
  const existing = db
    .prepare(
      `
      SELECT id FROM user_prompts
      WHERE content_session_id = ? AND prompt_number = ?
      `
    )
    .get(prompt.content_session_id, prompt.prompt_number) as
    | { id: number }
    | undefined;

  if (existing) {
    return { imported: false, id: existing.id };
  }

  const stmt = db.prepare(`
    INSERT INTO user_prompts (
      content_session_id, prompt_number, prompt_text,
      created_at, created_at_epoch
    ) VALUES (?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    prompt.content_session_id,
    prompt.prompt_number,
    prompt.prompt_text,
    prompt.created_at,
    prompt.created_at_epoch
  );

  return { imported: true, id: result.lastInsertRowid as number };
}
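The four import functions above share one idempotency pattern: look up the row by its natural key, return the existing id on a hit, otherwise insert and report the new id. A minimal sketch of that pattern, with an in-memory `Map` standing in for the bun:sqlite database (the map and `importOnce` are hypothetical illustrations, not code from this PR):

```typescript
// Duplicate-check-then-insert, as used by importSdkSession and friends.
// An in-memory Map plays the role of the sdk_sessions table here.
interface ImportResult {
  imported: boolean;
  id: number;
}

const rowsByContentSessionId = new Map<string, number>();
let nextRowId = 1;

function importOnce(contentSessionId: string): ImportResult {
  const existingId = rowsByContentSessionId.get(contentSessionId);
  if (existingId !== undefined) {
    // Duplicate: report the id of the row that is already there
    return { imported: false, id: existingId };
  }
  const id = nextRowId++;
  rowsByContentSessionId.set(contentSessionId, id);
  return { imported: true, id };
}

const first = importOnce('session-a');
const again = importOnce('session-a');
console.log(first.imported, again.imported, first.id === again.id);
// → true false true
```

Re-running an import is therefore safe: the second call reports `imported: false` with the same id, which is what lets the bulk importer load the same export file twice without creating duplicate rows.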
@@ -1,7 +1,14 @@
// Export main components
export { DatabaseManager, getDatabase, initializeDatabase } from './Database.js';
export {
  ClaudeMemDatabase,
  DatabaseManager,
  getDatabase,
  initializeDatabase,
  MigrationRunner
} from './Database.js';

// Export session store (CRUD operations for sessions, observations, summaries)
// @deprecated Use modular functions from Database.ts instead
export { SessionStore } from './SessionStore.js';

// Export session search (FTS5 and structured search)
@@ -12,3 +19,14 @@ export * from './types.js';

// Export migrations
export { migrations } from './migrations.js';

// Export transactions
export { storeObservations, storeObservationsAndMarkComplete } from './transactions.js';

// Re-export all modular functions for convenient access
export * from './Sessions.js';
export * from './Observations.js';
export * from './Summaries.js';
export * from './Prompts.js';
export * from './Timeline.js';
export * from './Import.js';
@@ -1,6 +1,9 @@
import { Database } from 'bun:sqlite';
import { Migration } from './Database.js';

// Re-export MigrationRunner for SessionStore migration extraction
export { MigrationRunner } from './migrations/runner.js';

/**
 * Initial schema migration - creates all core tables
 */
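Every migration in the runner below follows the same guard: consult `schema_versions`, skip if the version is already recorded, otherwise apply the change and record the version. A rough sketch of that guard under stated assumptions (a `Set` stands in for the `schema_versions` table, and `runMigration` is an illustrative helper, not code from this PR):

```typescript
// Version-guard pattern: each migration runs at most once, and re-running
// the whole sequence is a no-op for versions already recorded.
const appliedVersions = new Set<number>([4, 5]); // versions already applied

function runMigration(version: number, migrate: () => void): boolean {
  if (appliedVersions.has(version)) {
    return false; // already applied, skip
  }
  migrate();
  appliedVersions.add(version); // mirrors INSERT OR IGNORE INTO schema_versions
  return true;
}

let migrationsRun = 0;
runMigration(5, () => { migrationsRun++; }); // skipped: 5 is recorded
runMigration(6, () => { migrationsRun++; }); // runs
runMigration(6, () => { migrationsRun++; }); // skipped on re-run
console.log(migrationsRun, appliedVersions.has(6));
// → 1 true
```

The real runner also checks `PRAGMA table_info` before each `ALTER TABLE`, so a migration stays idempotent even when the version row is missing but the column already exists.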
631 src/services/sqlite/migrations/runner.ts Normal file
@@ -0,0 +1,631 @@
import { Database } from 'bun:sqlite';
import { logger } from '../../../utils/logger.js';
import {
  TableColumnInfo,
  IndexInfo,
  TableNameRow,
  SchemaVersion
} from '../../../types/database.js';

/**
 * MigrationRunner handles all database schema migrations
 * Extracted from SessionStore to separate concerns
 */
export class MigrationRunner {
  constructor(private db: Database) {}

  /**
   * Run all migrations in order
   * This is the only public method - all migrations are internal
   */
  runAllMigrations(): void {
    this.initializeSchema();
    this.ensureWorkerPortColumn();
    this.ensurePromptTrackingColumns();
    this.removeSessionSummariesUniqueConstraint();
    this.addObservationHierarchicalFields();
    this.makeObservationsTextNullable();
    this.createUserPromptsTable();
    this.ensureDiscoveryTokensColumn();
    this.createPendingMessagesTable();
    this.renameSessionIdColumns();
    this.repairSessionIdColumnRename();
    this.addFailedAtEpochColumn();
  }

  /**
   * Initialize database schema using migrations (migration004)
   * This runs the core SDK tables migration if no tables exist
   */
  private initializeSchema(): void {
    // Create schema_versions table if it doesn't exist
    this.db.run(`
      CREATE TABLE IF NOT EXISTS schema_versions (
        id INTEGER PRIMARY KEY,
        version INTEGER UNIQUE NOT NULL,
        applied_at TEXT NOT NULL
      )
    `);

    // Get applied migrations
    const appliedVersions = this.db.prepare('SELECT version FROM schema_versions ORDER BY version').all() as SchemaVersion[];
    const maxApplied = appliedVersions.length > 0 ? Math.max(...appliedVersions.map(v => v.version)) : 0;

    // Only run migration004 if no migrations have been applied
    // This creates the sdk_sessions, observations, and session_summaries tables
    if (maxApplied === 0) {
      logger.info('DB', 'Initializing fresh database with migration004');

      // Migration004: SDK agent architecture tables
      this.db.run(`
        CREATE TABLE IF NOT EXISTS sdk_sessions (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          content_session_id TEXT UNIQUE NOT NULL,
          memory_session_id TEXT UNIQUE,
          project TEXT NOT NULL,
          user_prompt TEXT,
          started_at TEXT NOT NULL,
          started_at_epoch INTEGER NOT NULL,
          completed_at TEXT,
          completed_at_epoch INTEGER,
          status TEXT CHECK(status IN ('active', 'completed', 'failed')) NOT NULL DEFAULT 'active'
        );

        CREATE INDEX IF NOT EXISTS idx_sdk_sessions_claude_id ON sdk_sessions(content_session_id);
        CREATE INDEX IF NOT EXISTS idx_sdk_sessions_sdk_id ON sdk_sessions(memory_session_id);
        CREATE INDEX IF NOT EXISTS idx_sdk_sessions_project ON sdk_sessions(project);
        CREATE INDEX IF NOT EXISTS idx_sdk_sessions_status ON sdk_sessions(status);
        CREATE INDEX IF NOT EXISTS idx_sdk_sessions_started ON sdk_sessions(started_at_epoch DESC);

        CREATE TABLE IF NOT EXISTS observations (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          memory_session_id TEXT NOT NULL,
          project TEXT NOT NULL,
          text TEXT NOT NULL,
          type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery')),
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
          FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
        );

        CREATE INDEX IF NOT EXISTS idx_observations_sdk_session ON observations(memory_session_id);
        CREATE INDEX IF NOT EXISTS idx_observations_project ON observations(project);
        CREATE INDEX IF NOT EXISTS idx_observations_type ON observations(type);
        CREATE INDEX IF NOT EXISTS idx_observations_created ON observations(created_at_epoch DESC);

        CREATE TABLE IF NOT EXISTS session_summaries (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          memory_session_id TEXT UNIQUE NOT NULL,
          project TEXT NOT NULL,
          request TEXT,
          investigated TEXT,
          learned TEXT,
          completed TEXT,
          next_steps TEXT,
          files_read TEXT,
          files_edited TEXT,
          notes TEXT,
          created_at TEXT NOT NULL,
          created_at_epoch INTEGER NOT NULL,
          FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
        );

        CREATE INDEX IF NOT EXISTS idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
        CREATE INDEX IF NOT EXISTS idx_session_summaries_project ON session_summaries(project);
        CREATE INDEX IF NOT EXISTS idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
      `);

      // Record migration004 as applied
      this.db.prepare('INSERT INTO schema_versions (version, applied_at) VALUES (?, ?)').run(4, new Date().toISOString());

      logger.info('DB', 'Migration004 applied successfully');
    }
  }

  /**
   * Ensure worker_port column exists (migration 5)
   */
  private ensureWorkerPortColumn(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(5) as SchemaVersion | undefined;
    if (applied) return;

    // Check if column exists
    const tableInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
    const hasWorkerPort = tableInfo.some(col => col.name === 'worker_port');

    if (!hasWorkerPort) {
      this.db.run('ALTER TABLE sdk_sessions ADD COLUMN worker_port INTEGER');
      logger.info('DB', 'Added worker_port column to sdk_sessions table');
    }

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(5, new Date().toISOString());
  }

  /**
   * Ensure prompt tracking columns exist (migration 6)
   */
  private ensurePromptTrackingColumns(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(6) as SchemaVersion | undefined;
    if (applied) return;

    // Check sdk_sessions for prompt_counter
    const sessionsInfo = this.db.query('PRAGMA table_info(sdk_sessions)').all() as TableColumnInfo[];
    const hasPromptCounter = sessionsInfo.some(col => col.name === 'prompt_counter');

    if (!hasPromptCounter) {
      this.db.run('ALTER TABLE sdk_sessions ADD COLUMN prompt_counter INTEGER DEFAULT 0');
      logger.info('DB', 'Added prompt_counter column to sdk_sessions table');
    }

    // Check observations for prompt_number
    const observationsInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const obsHasPromptNumber = observationsInfo.some(col => col.name === 'prompt_number');

    if (!obsHasPromptNumber) {
      this.db.run('ALTER TABLE observations ADD COLUMN prompt_number INTEGER');
      logger.info('DB', 'Added prompt_number column to observations table');
    }

    // Check session_summaries for prompt_number
    const summariesInfo = this.db.query('PRAGMA table_info(session_summaries)').all() as TableColumnInfo[];
    const sumHasPromptNumber = summariesInfo.some(col => col.name === 'prompt_number');

    if (!sumHasPromptNumber) {
      this.db.run('ALTER TABLE session_summaries ADD COLUMN prompt_number INTEGER');
      logger.info('DB', 'Added prompt_number column to session_summaries table');
    }

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(6, new Date().toISOString());
  }

  /**
   * Remove UNIQUE constraint from session_summaries.memory_session_id (migration 7)
   */
  private removeSessionSummariesUniqueConstraint(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(7) as SchemaVersion | undefined;
    if (applied) return;

    // Check if UNIQUE constraint exists
    const summariesIndexes = this.db.query('PRAGMA index_list(session_summaries)').all() as IndexInfo[];
    const hasUniqueConstraint = summariesIndexes.some(idx => idx.unique === 1);

    if (!hasUniqueConstraint) {
      // Already migrated (no constraint exists)
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(7, new Date().toISOString());
      return;
    }

    logger.info('DB', 'Removing UNIQUE constraint from session_summaries.memory_session_id');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    // Create new table without UNIQUE constraint
    this.db.run(`
      CREATE TABLE session_summaries_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        request TEXT,
        investigated TEXT,
        learned TEXT,
        completed TEXT,
        next_steps TEXT,
        files_read TEXT,
        files_edited TEXT,
        notes TEXT,
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `);

    // Copy data from old table
    this.db.run(`
      INSERT INTO session_summaries_new
      SELECT id, memory_session_id, project, request, investigated, learned,
             completed, next_steps, files_read, files_edited, notes,
             prompt_number, created_at, created_at_epoch
      FROM session_summaries
    `);

    // Drop old table
    this.db.run('DROP TABLE session_summaries');

    // Rename new table
    this.db.run('ALTER TABLE session_summaries_new RENAME TO session_summaries');

    // Recreate indexes
    this.db.run(`
      CREATE INDEX idx_session_summaries_sdk_session ON session_summaries(memory_session_id);
      CREATE INDEX idx_session_summaries_project ON session_summaries(project);
      CREATE INDEX idx_session_summaries_created ON session_summaries(created_at_epoch DESC);
    `);

    // Commit transaction
    this.db.run('COMMIT');

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(7, new Date().toISOString());

    logger.info('DB', 'Successfully removed UNIQUE constraint from session_summaries.memory_session_id');
  }

  /**
   * Add hierarchical fields to observations table (migration 8)
   */
  private addObservationHierarchicalFields(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(8) as SchemaVersion | undefined;
    if (applied) return;

    // Check if new fields already exist
    const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const hasTitle = tableInfo.some(col => col.name === 'title');

    if (hasTitle) {
      // Already migrated
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(8, new Date().toISOString());
      return;
    }

    logger.info('DB', 'Adding hierarchical fields to observations table');

    // Add new columns
    this.db.run(`
      ALTER TABLE observations ADD COLUMN title TEXT;
      ALTER TABLE observations ADD COLUMN subtitle TEXT;
      ALTER TABLE observations ADD COLUMN facts TEXT;
      ALTER TABLE observations ADD COLUMN narrative TEXT;
      ALTER TABLE observations ADD COLUMN concepts TEXT;
      ALTER TABLE observations ADD COLUMN files_read TEXT;
      ALTER TABLE observations ADD COLUMN files_modified TEXT;
    `);

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(8, new Date().toISOString());

    logger.info('DB', 'Successfully added hierarchical fields to observations table');
  }

  /**
   * Make observations.text nullable (migration 9)
   * The text field is deprecated in favor of structured fields (title, subtitle, narrative, etc.)
   */
  private makeObservationsTextNullable(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(9) as SchemaVersion | undefined;
    if (applied) return;

    // Check if text column is already nullable
    const tableInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const textColumn = tableInfo.find(col => col.name === 'text');

    if (!textColumn || textColumn.notnull === 0) {
      // Already migrated or text column doesn't exist
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(9, new Date().toISOString());
      return;
    }

    logger.info('DB', 'Making observations.text nullable');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    // Create new table with text as nullable
    this.db.run(`
      CREATE TABLE observations_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        memory_session_id TEXT NOT NULL,
        project TEXT NOT NULL,
        text TEXT,
        type TEXT NOT NULL CHECK(type IN ('decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change')),
        title TEXT,
        subtitle TEXT,
        facts TEXT,
        narrative TEXT,
        concepts TEXT,
        files_read TEXT,
        files_modified TEXT,
        prompt_number INTEGER,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(memory_session_id) REFERENCES sdk_sessions(memory_session_id) ON DELETE CASCADE
      )
    `);

    // Copy data from old table (all existing columns)
    this.db.run(`
      INSERT INTO observations_new
      SELECT id, memory_session_id, project, text, type, title, subtitle, facts,
             narrative, concepts, files_read, files_modified, prompt_number,
             created_at, created_at_epoch
      FROM observations
    `);

    // Drop old table
    this.db.run('DROP TABLE observations');

    // Rename new table
    this.db.run('ALTER TABLE observations_new RENAME TO observations');

    // Recreate indexes
    this.db.run(`
      CREATE INDEX idx_observations_sdk_session ON observations(memory_session_id);
      CREATE INDEX idx_observations_project ON observations(project);
      CREATE INDEX idx_observations_type ON observations(type);
      CREATE INDEX idx_observations_created ON observations(created_at_epoch DESC);
    `);

    // Commit transaction
    this.db.run('COMMIT');

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(9, new Date().toISOString());

    logger.info('DB', 'Successfully made observations.text nullable');
  }

  /**
   * Create user_prompts table with FTS5 support (migration 10)
   */
  private createUserPromptsTable(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(10) as SchemaVersion | undefined;
    if (applied) return;

    // Check if table already exists
    const tableInfo = this.db.query('PRAGMA table_info(user_prompts)').all() as TableColumnInfo[];
    if (tableInfo.length > 0) {
      // Already migrated
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());
      return;
    }

    logger.info('DB', 'Creating user_prompts table with FTS5 support');

    // Begin transaction
    this.db.run('BEGIN TRANSACTION');

    // Create main table (using content_session_id since memory_session_id is set asynchronously by worker)
    this.db.run(`
      CREATE TABLE user_prompts (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content_session_id TEXT NOT NULL,
        prompt_number INTEGER NOT NULL,
        prompt_text TEXT NOT NULL,
        created_at TEXT NOT NULL,
        created_at_epoch INTEGER NOT NULL,
        FOREIGN KEY(content_session_id) REFERENCES sdk_sessions(content_session_id) ON DELETE CASCADE
      );

      CREATE INDEX idx_user_prompts_claude_session ON user_prompts(content_session_id);
      CREATE INDEX idx_user_prompts_created ON user_prompts(created_at_epoch DESC);
      CREATE INDEX idx_user_prompts_prompt_number ON user_prompts(prompt_number);
      CREATE INDEX idx_user_prompts_lookup ON user_prompts(content_session_id, prompt_number);
    `);

    // Create FTS5 virtual table
    this.db.run(`
      CREATE VIRTUAL TABLE user_prompts_fts USING fts5(
        prompt_text,
        content='user_prompts',
        content_rowid='id'
      );
    `);

    // Create triggers to sync FTS5
    this.db.run(`
      CREATE TRIGGER user_prompts_ai AFTER INSERT ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(rowid, prompt_text)
        VALUES (new.id, new.prompt_text);
      END;

      CREATE TRIGGER user_prompts_ad AFTER DELETE ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
        VALUES('delete', old.id, old.prompt_text);
      END;

      CREATE TRIGGER user_prompts_au AFTER UPDATE ON user_prompts BEGIN
        INSERT INTO user_prompts_fts(user_prompts_fts, rowid, prompt_text)
        VALUES('delete', old.id, old.prompt_text);
        INSERT INTO user_prompts_fts(rowid, prompt_text)
        VALUES (new.id, new.prompt_text);
      END;
    `);

    // Commit transaction
    this.db.run('COMMIT');

    // Record migration
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(10, new Date().toISOString());

    logger.info('DB', 'Successfully created user_prompts table with FTS5 support');
  }

  /**
   * Ensure discovery_tokens column exists (migration 11)
   * CRITICAL: This migration was incorrectly using version 7 (which was already taken by removeSessionSummariesUniqueConstraint)
   * The duplicate version number may have caused migration tracking issues in some databases
   */
  private ensureDiscoveryTokensColumn(): void {
    // Check if migration already applied to avoid unnecessary re-runs
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(11) as SchemaVersion | undefined;
    if (applied) return;

    // Check if discovery_tokens column exists in observations table
    const observationsInfo = this.db.query('PRAGMA table_info(observations)').all() as TableColumnInfo[];
    const obsHasDiscoveryTokens = observationsInfo.some(col => col.name === 'discovery_tokens');

    if (!obsHasDiscoveryTokens) {
      this.db.run('ALTER TABLE observations ADD COLUMN discovery_tokens INTEGER DEFAULT 0');
      logger.info('DB', 'Added discovery_tokens column to observations table');
    }

    // Check if discovery_tokens column exists in session_summaries table
    const summariesInfo = this.db.query('PRAGMA table_info(session_summaries)').all() as TableColumnInfo[];
    const sumHasDiscoveryTokens = summariesInfo.some(col => col.name === 'discovery_tokens');

    if (!sumHasDiscoveryTokens) {
      this.db.run('ALTER TABLE session_summaries ADD COLUMN discovery_tokens INTEGER DEFAULT 0');
      logger.info('DB', 'Added discovery_tokens column to session_summaries table');
    }

    // Record migration only after successful column verification/addition
    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(11, new Date().toISOString());
  }

  /**
   * Create pending_messages table for persistent work queue (migration 16)
   * Messages are persisted before processing and deleted after success.
   * Enables recovery from SDK hangs and worker crashes.
   */
  private createPendingMessagesTable(): void {
    // Check if migration already applied
    const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(16) as SchemaVersion | undefined;
    if (applied) return;

    // Check if table already exists
    const tables = this.db.query("SELECT name FROM sqlite_master WHERE type='table' AND name='pending_messages'").all() as TableNameRow[];
    if (tables.length > 0) {
      this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(16, new Date().toISOString());
      return;
    }

    logger.info('DB', 'Creating pending_messages table');

    this.db.run(`
      CREATE TABLE pending_messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        session_db_id INTEGER NOT NULL,
        content_session_id TEXT NOT NULL,
        message_type TEXT NOT NULL CHECK(message_type IN ('observation', 'summarize')),
        tool_name TEXT,
        tool_input TEXT,
        tool_response TEXT,
        cwd TEXT,
        last_user_message TEXT,
        last_assistant_message TEXT,
        prompt_number INTEGER,
        status TEXT NOT NULL DEFAULT 'pending' CHECK(status IN ('pending', 'processing', 'processed', 'failed')),
        retry_count INTEGER NOT NULL DEFAULT 0,
        created_at_epoch INTEGER NOT NULL,
        started_processing_at_epoch INTEGER,
        completed_at_epoch INTEGER,
        FOREIGN KEY (session_db_id) REFERENCES sdk_sessions(id) ON DELETE CASCADE
      )
    `);

    this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_session ON pending_messages(session_db_id)');
    this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_status ON pending_messages(status)');
    this.db.run('CREATE INDEX IF NOT EXISTS idx_pending_messages_claude_session ON pending_messages(content_session_id)');

    this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(16, new Date().toISOString());

    logger.info('DB', 'pending_messages table created successfully');
  }

/**
|
||||
* Rename session ID columns for semantic clarity (migration 17)
|
||||
* - claude_session_id -> content_session_id (user's observed session)
|
||||
* - sdk_session_id -> memory_session_id (memory agent's session for resume)
|
||||
*
|
||||
* IDEMPOTENT: Checks each table individually before renaming.
|
||||
* This handles databases in any intermediate state (partial migration, fresh install, etc.)
|
||||
*/
|
||||
private renameSessionIdColumns(): void {
|
||||
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(17) as SchemaVersion | undefined;
|
||||
if (applied) return;
|
||||
|
||||
logger.info('DB', 'Checking session ID columns for semantic clarity rename');
|
||||
|
||||
let renamesPerformed = 0;
|
||||
|
||||
// Helper to safely rename a column if it exists
|
||||
const safeRenameColumn = (table: string, oldCol: string, newCol: string): boolean => {
|
||||
const tableInfo = this.db.query(`PRAGMA table_info(${table})`).all() as TableColumnInfo[];
|
||||
const hasOldCol = tableInfo.some(col => col.name === oldCol);
|
||||
const hasNewCol = tableInfo.some(col => col.name === newCol);
|
||||
|
||||
if (hasNewCol) {
|
||||
// Already renamed, nothing to do
|
||||
return false;
|
||||
}
|
||||
|
||||
if (hasOldCol) {
|
||||
// SQLite 3.25+ supports ALTER TABLE RENAME COLUMN
|
||||
this.db.run(`ALTER TABLE ${table} RENAME COLUMN ${oldCol} TO ${newCol}`);
|
||||
logger.info('DB', `Renamed ${table}.${oldCol} to ${newCol}`);
|
||||
return true;
|
||||
}
|
||||
|
||||
// Neither column exists - table might not exist or has different schema
|
||||
logger.warn('DB', `Column ${oldCol} not found in ${table}, skipping rename`);
|
||||
return false;
|
||||
};
|
||||
|
||||
// Rename in sdk_sessions table
|
||||
if (safeRenameColumn('sdk_sessions', 'claude_session_id', 'content_session_id')) renamesPerformed++;
|
||||
if (safeRenameColumn('sdk_sessions', 'sdk_session_id', 'memory_session_id')) renamesPerformed++;
|
||||
|
||||
// Rename in pending_messages table
|
||||
if (safeRenameColumn('pending_messages', 'claude_session_id', 'content_session_id')) renamesPerformed++;
|
||||
|
||||
// Rename in observations table
|
||||
if (safeRenameColumn('observations', 'sdk_session_id', 'memory_session_id')) renamesPerformed++;
|
||||
|
||||
// Rename in session_summaries table
|
||||
if (safeRenameColumn('session_summaries', 'sdk_session_id', 'memory_session_id')) renamesPerformed++;
|
||||
|
||||
// Rename in user_prompts table
|
||||
if (safeRenameColumn('user_prompts', 'claude_session_id', 'content_session_id')) renamesPerformed++;
|
||||
|
||||
// Record migration
|
||||
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(17, new Date().toISOString());
|
||||
|
||||
if (renamesPerformed > 0) {
|
||||
logger.info('DB', `Successfully renamed ${renamesPerformed} session ID columns`);
|
||||
} else {
|
||||
logger.info('DB', 'No session ID column renames needed (already up to date)');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Repair session ID column renames (migration 19)
|
||||
* DEPRECATED: Migration 17 is now fully idempotent and handles all cases.
|
||||
* This migration is kept for backwards compatibility but does nothing.
|
||||
*/
|
||||
private repairSessionIdColumnRename(): void {
|
||||
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(19) as SchemaVersion | undefined;
|
||||
if (applied) return;
|
||||
|
||||
// Migration 17 now handles all column rename cases idempotently.
|
||||
// Just record this migration as applied.
|
||||
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(19, new Date().toISOString());
|
||||
}
|
||||
|
||||
/**
|
||||
* Add failed_at_epoch column to pending_messages (migration 20)
|
||||
* Used by markSessionMessagesFailed() for error recovery tracking
|
||||
*/
|
||||
private addFailedAtEpochColumn(): void {
|
||||
const applied = this.db.prepare('SELECT version FROM schema_versions WHERE version = ?').get(20) as SchemaVersion | undefined;
|
||||
if (applied) return;
|
||||
|
||||
const tableInfo = this.db.query('PRAGMA table_info(pending_messages)').all() as TableColumnInfo[];
|
||||
const hasColumn = tableInfo.some(col => col.name === 'failed_at_epoch');
|
||||
|
||||
if (!hasColumn) {
|
||||
this.db.run('ALTER TABLE pending_messages ADD COLUMN failed_at_epoch INTEGER');
|
||||
logger.info('DB', 'Added failed_at_epoch column to pending_messages table');
|
||||
}
|
||||
|
||||
this.db.prepare('INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)').run(20, new Date().toISOString());
|
||||
}
|
||||
}
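Every migration above is gated the same way: check schema_versions for the migration's version number, bail out if present, otherwise apply the change and record the version. A minimal sketch of that gating pattern, using an in-memory `Set` as a stand-in for the schema_versions table (the `Migration` type and `runMigrationOnce` helper here are illustrative, not part of the codebase):

```typescript
// Sketch of the version-gating pattern shared by the migrations above.
// `appliedVersions` stands in for the schema_versions table; the real
// code issues a SELECT and an INSERT OR IGNORE against SQLite instead.
type Migration = { version: number; apply: () => void };

function runMigrationOnce(appliedVersions: Set<number>, migration: Migration): boolean {
  // Equivalent of: SELECT version FROM schema_versions WHERE version = ?
  if (appliedVersions.has(migration.version)) return false;

  migration.apply();

  // Equivalent of: INSERT OR IGNORE INTO schema_versions (version, applied_at) VALUES (?, ?)
  appliedVersions.add(migration.version);
  return true;
}

const applied = new Set<number>();
let runs = 0;
const migration16: Migration = { version: 16, apply: () => { runs++; } };

const firstRun = runMigrationOnce(applied, migration16);  // applies the migration
const secondRun = runMigrationOnce(applied, migration16); // gated: no-op on re-run
console.log(firstRun, secondRun, runs); // true false 1
```

Because the version is recorded even when the work is skipped (as in `createPendingMessagesTable` when the table already exists), a migration never re-fires on later startups.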
src/services/sqlite/observations/files.ts (new file, 52 lines)
@@ -0,0 +1,52 @@
/**
 * Session file retrieval functions
 * Extracted from SessionStore.ts for modular organization
 */

import { Database } from 'bun:sqlite';
import type { SessionFilesResult } from './types.js';

/**
 * Get aggregated files from all observations for a session
 */
export function getFilesForSession(
  db: Database,
  memorySessionId: string
): SessionFilesResult {
  const stmt = db.prepare(`
    SELECT files_read, files_modified
    FROM observations
    WHERE memory_session_id = ?
  `);

  const rows = stmt.all(memorySessionId) as Array<{
    files_read: string | null;
    files_modified: string | null;
  }>;

  const filesReadSet = new Set<string>();
  const filesModifiedSet = new Set<string>();

  for (const row of rows) {
    // Parse files_read
    if (row.files_read) {
      const files = JSON.parse(row.files_read);
      if (Array.isArray(files)) {
        files.forEach(f => filesReadSet.add(f));
      }
    }

    // Parse files_modified
    if (row.files_modified) {
      const files = JSON.parse(row.files_modified);
      if (Array.isArray(files)) {
        files.forEach(f => filesModifiedSet.add(f));
      }
    }
  }

  return {
    filesRead: Array.from(filesReadSet),
    filesModified: Array.from(filesModifiedSet)
  };
}
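The aggregation step in getFilesForSession merges JSON-encoded arrays from many observation rows into two de-duplicated lists. A sketch of just that merge step, with `FileRow` and `mergeFileLists` as illustrative names mirroring the row shape the query returns:

```typescript
// Sketch of the de-duplication performed by getFilesForSession: each row
// carries JSON-encoded string arrays, and a path seen in several rows must
// collapse into one entry per output list.
interface FileRow {
  files_read: string | null;
  files_modified: string | null;
}

function mergeFileLists(rows: FileRow[]): { filesRead: string[]; filesModified: string[] } {
  const read = new Set<string>();
  const modified = new Set<string>();
  for (const row of rows) {
    // NULL columns are skipped; non-null columns hold JSON arrays of paths
    for (const f of row.files_read ? (JSON.parse(row.files_read) as string[]) : []) read.add(f);
    for (const f of row.files_modified ? (JSON.parse(row.files_modified) as string[]) : []) modified.add(f);
  }
  return { filesRead: Array.from(read), filesModified: Array.from(modified) };
}

const merged = mergeFileLists([
  { files_read: '["a.ts","b.ts"]', files_modified: '["a.ts"]' },
  { files_read: '["b.ts"]', files_modified: null },
]);
console.log(merged.filesRead); // [ 'a.ts', 'b.ts' ]
```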
src/services/sqlite/observations/get.ts (new file, 112 lines)
@@ -0,0 +1,112 @@
/**
 * Observation retrieval functions
 * Extracted from SessionStore.ts for modular organization
 */

import { Database } from 'bun:sqlite';
import type { ObservationRecord } from '../../../types/database.js';
import type { GetObservationsByIdsOptions, ObservationSessionRow } from './types.js';

/**
 * Get a single observation by ID
 */
export function getObservationById(db: Database, id: number): ObservationRecord | null {
  const stmt = db.prepare(`
    SELECT *
    FROM observations
    WHERE id = ?
  `);

  return (stmt.get(id) as ObservationRecord | undefined) || null;
}

/**
 * Get observations by array of IDs with ordering and limit
 */
export function getObservationsByIds(
  db: Database,
  ids: number[],
  options: GetObservationsByIdsOptions = {}
): ObservationRecord[] {
  if (ids.length === 0) return [];

  const { orderBy = 'date_desc', limit, project, type, concepts, files } = options;
  const orderClause = orderBy === 'date_asc' ? 'ASC' : 'DESC';
  const limitClause = limit ? `LIMIT ${limit}` : '';

  // Build placeholders for IN clause
  const placeholders = ids.map(() => '?').join(',');
  const params: (number | string)[] = [...ids];
  const additionalConditions: string[] = [];

  // Apply project filter
  if (project) {
    additionalConditions.push('project = ?');
    params.push(project);
  }

  // Apply type filter
  if (type) {
    if (Array.isArray(type)) {
      const typePlaceholders = type.map(() => '?').join(',');
      additionalConditions.push(`type IN (${typePlaceholders})`);
      params.push(...type);
    } else {
      additionalConditions.push('type = ?');
      params.push(type);
    }
  }

  // Apply concepts filter
  if (concepts) {
    const conceptsList = Array.isArray(concepts) ? concepts : [concepts];
    const conceptConditions = conceptsList.map(() =>
      'EXISTS (SELECT 1 FROM json_each(concepts) WHERE value = ?)'
    );
    params.push(...conceptsList);
    additionalConditions.push(`(${conceptConditions.join(' OR ')})`);
  }

  // Apply files filter
  if (files) {
    const filesList = Array.isArray(files) ? files : [files];
    const fileConditions = filesList.map(() => {
      return '(EXISTS (SELECT 1 FROM json_each(files_read) WHERE value LIKE ?) OR EXISTS (SELECT 1 FROM json_each(files_modified) WHERE value LIKE ?))';
    });
    filesList.forEach(file => {
      params.push(`%${file}%`, `%${file}%`);
    });
    additionalConditions.push(`(${fileConditions.join(' OR ')})`);
  }

  const whereClause = additionalConditions.length > 0
    ? `WHERE id IN (${placeholders}) AND ${additionalConditions.join(' AND ')}`
    : `WHERE id IN (${placeholders})`;

  const stmt = db.prepare(`
    SELECT *
    FROM observations
    ${whereClause}
    ORDER BY created_at_epoch ${orderClause}
    ${limitClause}
  `);

  return stmt.all(...params) as ObservationRecord[];
}

/**
 * Get observations for a specific session
 */
export function getObservationsForSession(
  db: Database,
  memorySessionId: string
): ObservationSessionRow[] {
  const stmt = db.prepare(`
    SELECT title, subtitle, type, prompt_number
    FROM observations
    WHERE memory_session_id = ?
    ORDER BY created_at_epoch ASC
  `);

  return stmt.all(memorySessionId) as ObservationSessionRow[];
}
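getObservationsByIds builds its SQL by accumulating condition strings and bound parameters in lockstep, so the final statement stays fully parameterized no matter which filters are set. A reduced sketch of that clause-building pattern (the `buildWhere` helper and its two filters are illustrative; the real function supports more filters):

```typescript
// Sketch of the WHERE-clause accumulation used in getObservationsByIds:
// every '?' pushed into the SQL has a matching value pushed into params,
// so values are never interpolated into the statement text.
function buildWhere(
  ids: number[],
  project?: string,
  types?: string[]
): { sql: string; params: (number | string)[] } {
  const params: (number | string)[] = [...ids];
  const conditions: string[] = [`id IN (${ids.map(() => '?').join(',')})`];

  if (project) {
    conditions.push('project = ?');
    params.push(project);
  }
  if (types && types.length > 0) {
    conditions.push(`type IN (${types.map(() => '?').join(',')})`);
    params.push(...types);
  }

  return { sql: `WHERE ${conditions.join(' AND ')}`, params };
}

const { sql, params } = buildWhere([1, 2], 'claude-mem', ['bugfix']);
console.log(sql);    // WHERE id IN (?,?) AND project = ? AND type IN (?)
console.log(params); // [ 1, 2, 'claude-mem', 'bugfix' ]
```

Only structural fragments (placeholder lists, ORDER BY direction) are spliced into the SQL text; user-supplied values always travel through the params array.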
src/services/sqlite/observations/recent.ts (new file, 43 lines)
@@ -0,0 +1,43 @@
/**
 * Recent observation retrieval functions
 * Extracted from SessionStore.ts for modular organization
 */

import { Database } from 'bun:sqlite';
import type { RecentObservationRow, AllRecentObservationRow } from './types.js';

/**
 * Get recent observations for a project
 */
export function getRecentObservations(
  db: Database,
  project: string,
  limit: number = 20
): RecentObservationRow[] {
  const stmt = db.prepare(`
    SELECT type, text, prompt_number, created_at
    FROM observations
    WHERE project = ?
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(project, limit) as RecentObservationRow[];
}

/**
 * Get recent observations across all projects (for web UI)
 */
export function getAllRecentObservations(
  db: Database,
  limit: number = 100
): AllRecentObservationRow[] {
  const stmt = db.prepare(`
    SELECT id, type, title, subtitle, text, project, prompt_number, created_at, created_at_epoch
    FROM observations
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(limit) as AllRecentObservationRow[];
}
src/services/sqlite/observations/store.ts (new file, 54 lines)
@@ -0,0 +1,54 @@
/**
 * Store observation function
 * Extracted from SessionStore.ts for modular organization
 */

import { Database } from 'bun:sqlite';
import type { ObservationInput, StoreObservationResult } from './types.js';

/**
 * Store an observation (from SDK parsing)
 * Assumes session already exists (created by hook)
 */
export function storeObservation(
  db: Database,
  memorySessionId: string,
  project: string,
  observation: ObservationInput,
  promptNumber?: number,
  discoveryTokens: number = 0,
  overrideTimestampEpoch?: number
): StoreObservationResult {
  // Use override timestamp if provided (for processing backlog messages with original timestamps)
  const timestampEpoch = overrideTimestampEpoch ?? Date.now();
  const timestampIso = new Date(timestampEpoch).toISOString();

  const stmt = db.prepare(`
    INSERT INTO observations
      (memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
       files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
    VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    memorySessionId,
    project,
    observation.type,
    observation.title,
    observation.subtitle,
    JSON.stringify(observation.facts),
    observation.narrative,
    JSON.stringify(observation.concepts),
    JSON.stringify(observation.files_read),
    JSON.stringify(observation.files_modified),
    promptNumber || null,
    discoveryTokens,
    timestampIso,
    timestampEpoch
  );

  return {
    id: Number(result.lastInsertRowid),
    createdAtEpoch: timestampEpoch
  };
}
src/services/sqlite/observations/types.ts (new file, 81 lines)
@@ -0,0 +1,81 @@
/**
 * Type definitions for observation operations
 * Extracted from SessionStore.ts for modular organization
 */

/**
 * Input type for storeObservation function
 */
export interface ObservationInput {
  type: string;
  title: string | null;
  subtitle: string | null;
  facts: string[];
  narrative: string | null;
  concepts: string[];
  files_read: string[];
  files_modified: string[];
}

/**
 * Result from storing an observation
 */
export interface StoreObservationResult {
  id: number;
  createdAtEpoch: number;
}

/**
 * Options for getObservationsByIds
 */
export interface GetObservationsByIdsOptions {
  orderBy?: 'date_desc' | 'date_asc';
  limit?: number;
  project?: string;
  type?: string | string[];
  concepts?: string | string[];
  files?: string | string[];
}

/**
 * Result type for getFilesForSession
 */
export interface SessionFilesResult {
  filesRead: string[];
  filesModified: string[];
}

/**
 * Simple observation row for getObservationsForSession
 */
export interface ObservationSessionRow {
  title: string;
  subtitle: string;
  type: string;
  prompt_number: number | null;
}

/**
 * Recent observation row type
 */
export interface RecentObservationRow {
  type: string;
  text: string;
  prompt_number: number | null;
  created_at: string;
}

/**
 * Full recent observation row (for web UI)
 */
export interface AllRecentObservationRow {
  id: number;
  type: string;
  title: string | null;
  subtitle: string | null;
  text: string;
  project: string;
  prompt_number: number | null;
  created_at: string;
  created_at_epoch: number;
}
src/services/sqlite/prompts/get.ts (new file, 168 lines)
@@ -0,0 +1,168 @@
/**
 * User prompt retrieval operations
 */

import type { Database } from 'bun:sqlite';
import type { UserPromptRecord, LatestPromptResult } from '../../../types/database.js';
import type { RecentUserPromptResult, PromptWithProject, GetPromptsByIdsOptions } from './types.js';

/**
 * Get user prompt by session ID and prompt number
 * @returns The prompt text, or null if not found
 */
export function getUserPrompt(
  db: Database,
  contentSessionId: string,
  promptNumber: number
): string | null {
  const stmt = db.prepare(`
    SELECT prompt_text
    FROM user_prompts
    WHERE content_session_id = ? AND prompt_number = ?
    LIMIT 1
  `);

  const result = stmt.get(contentSessionId, promptNumber) as { prompt_text: string } | undefined;
  return result?.prompt_text ?? null;
}

/**
 * Get current prompt number by counting user_prompts for this session
 * Replaces the prompt_counter column which is no longer maintained
 */
export function getPromptNumberFromUserPrompts(db: Database, contentSessionId: string): number {
  const result = db.prepare(`
    SELECT COUNT(*) as count FROM user_prompts WHERE content_session_id = ?
  `).get(contentSessionId) as { count: number };
  return result.count;
}

/**
 * Get latest user prompt with session info for a Claude session
 * Used for syncing prompts to Chroma during session initialization
 */
export function getLatestUserPrompt(
  db: Database,
  contentSessionId: string
): LatestPromptResult | undefined {
  const stmt = db.prepare(`
    SELECT
      up.*,
      s.memory_session_id,
      s.project
    FROM user_prompts up
    JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
    WHERE up.content_session_id = ?
    ORDER BY up.created_at_epoch DESC
    LIMIT 1
  `);

  return stmt.get(contentSessionId) as LatestPromptResult | undefined;
}

/**
 * Get recent user prompts across all sessions (for web UI)
 */
export function getAllRecentUserPrompts(
  db: Database,
  limit: number = 100
): RecentUserPromptResult[] {
  const stmt = db.prepare(`
    SELECT
      up.id,
      up.content_session_id,
      s.project,
      up.prompt_number,
      up.prompt_text,
      up.created_at,
      up.created_at_epoch
    FROM user_prompts up
    LEFT JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
    ORDER BY up.created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(limit) as RecentUserPromptResult[];
}

/**
 * Get a single user prompt by ID
 */
export function getPromptById(db: Database, id: number): PromptWithProject | null {
  const stmt = db.prepare(`
    SELECT
      p.id,
      p.content_session_id,
      p.prompt_number,
      p.prompt_text,
      s.project,
      p.created_at,
      p.created_at_epoch
    FROM user_prompts p
    LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
    WHERE p.id = ?
    LIMIT 1
  `);

  return (stmt.get(id) as PromptWithProject | undefined) || null;
}

/**
 * Get multiple user prompts by IDs
 */
export function getPromptsByIds(db: Database, ids: number[]): PromptWithProject[] {
  if (ids.length === 0) return [];

  const placeholders = ids.map(() => '?').join(',');
  const stmt = db.prepare(`
    SELECT
      p.id,
      p.content_session_id,
      p.prompt_number,
      p.prompt_text,
      s.project,
      p.created_at,
      p.created_at_epoch
    FROM user_prompts p
    LEFT JOIN sdk_sessions s ON p.content_session_id = s.content_session_id
    WHERE p.id IN (${placeholders})
    ORDER BY p.created_at_epoch DESC
  `);

  return stmt.all(...ids) as PromptWithProject[];
}

/**
 * Get user prompts by IDs (for hybrid Chroma search)
 * Returns prompts in specified temporal order with optional project filter
 */
export function getUserPromptsByIds(
  db: Database,
  ids: number[],
  options: GetPromptsByIdsOptions = {}
): UserPromptRecord[] {
  if (ids.length === 0) return [];

  const { orderBy = 'date_desc', limit, project } = options;
  const orderClause = orderBy === 'date_asc' ? 'ASC' : 'DESC';
  const limitClause = limit ? `LIMIT ${limit}` : '';
  const placeholders = ids.map(() => '?').join(',');
  const params: (number | string)[] = [...ids];

  const projectFilter = project ? 'AND s.project = ?' : '';
  if (project) params.push(project);

  const stmt = db.prepare(`
    SELECT
      up.*,
      s.project,
      s.memory_session_id
    FROM user_prompts up
    JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
    WHERE up.id IN (${placeholders}) ${projectFilter}
    ORDER BY up.created_at_epoch ${orderClause}
    ${limitClause}
  `);

  return stmt.all(...params) as UserPromptRecord[];
}
src/services/sqlite/prompts/store.ts (new file, 28 lines)
@@ -0,0 +1,28 @@
/**
 * User prompt storage operations
 */

import type { Database } from 'bun:sqlite';

/**
 * Save a user prompt to the database
 * @returns The inserted row ID
 */
export function saveUserPrompt(
  db: Database,
  contentSessionId: string,
  promptNumber: number,
  promptText: string
): number {
  const now = new Date();
  const nowEpoch = now.getTime();

  const stmt = db.prepare(`
    INSERT INTO user_prompts
      (content_session_id, prompt_number, prompt_text, created_at, created_at_epoch)
    VALUES (?, ?, ?, ?, ?)
  `);

  const result = stmt.run(contentSessionId, promptNumber, promptText, now.toISOString(), nowEpoch);
  return Number(result.lastInsertRowid);
}
src/services/sqlite/prompts/types.ts (new file, 40 lines)
@@ -0,0 +1,40 @@
/**
 * Type definitions for user prompts module
 */

/**
 * Result type for getAllRecentUserPrompts
 */
export interface RecentUserPromptResult {
  id: number;
  content_session_id: string;
  project: string;
  prompt_number: number;
  prompt_text: string;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Result type for getPromptById and getPromptsByIds
 */
export interface PromptWithProject {
  id: number;
  content_session_id: string;
  prompt_number: number;
  prompt_text: string;
  project: string;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Options for getUserPromptsByIds
 */
export interface GetPromptsByIdsOptions {
  orderBy?: 'date_desc' | 'date_asc';
  limit?: number;
  project?: string;
}
src/services/sqlite/sessions/create.ts (new file, 62 lines)
@@ -0,0 +1,62 @@
/**
 * Session creation and update functions
 * Database-first parameter pattern for functional composition
 */

import type { Database } from 'bun:sqlite';

/**
 * Create a new SDK session (idempotent - returns existing session ID if already exists)
 *
 * IDEMPOTENCY via INSERT OR IGNORE pattern:
 * - Prompt #1: session_id not in database -> INSERT creates new row
 * - Prompt #2+: session_id exists -> INSERT ignored, fetch existing ID
 * - Result: Same database ID returned for all prompts in conversation
 *
 * WHY THIS MATTERS:
 * - NO "does session exist?" checks needed anywhere
 * - NO risk of creating duplicate sessions
 * - ALL hooks automatically connected via session_id
 * - SAVE hook observations go to correct session (same session_id)
 * - SDKAgent continuation prompt has correct context (same session_id)
 */
export function createSDKSession(
  db: Database,
  contentSessionId: string,
  project: string,
  userPrompt: string
): number {
  const now = new Date();
  const nowEpoch = now.getTime();

  // Pure INSERT OR IGNORE - no updates, no complexity
  // NOTE: memory_session_id starts as NULL. It is captured by SDKAgent from the first SDK
  // response and stored via updateMemorySessionId(). CRITICAL: memory_session_id must NEVER
  // equal contentSessionId - that would inject memory messages into the user's transcript!
  db.prepare(`
    INSERT OR IGNORE INTO sdk_sessions
      (content_session_id, memory_session_id, project, user_prompt, started_at, started_at_epoch, status)
    VALUES (?, NULL, ?, ?, ?, ?, 'active')
  `).run(contentSessionId, project, userPrompt, now.toISOString(), nowEpoch);

  // Return existing or new ID
  const row = db.prepare('SELECT id FROM sdk_sessions WHERE content_session_id = ?')
    .get(contentSessionId) as { id: number };
  return row.id;
}

/**
 * Update the memory session ID for a session
 * Called by SDKAgent when it captures the session ID from the first SDK message
 */
export function updateMemorySessionId(
  db: Database,
  sessionDbId: number,
  memorySessionId: string
): void {
  db.prepare(`
    UPDATE sdk_sessions
    SET memory_session_id = ?
    WHERE id = ?
  `).run(memorySessionId, sessionDbId);
}
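The idempotency contract documented on createSDKSession (first call inserts, later calls with the same content session ID fall through to the same row) can be sketched without a database, using a `Map` as a stand-in for sdk_sessions (the `makeSessionCreator` helper is illustrative, not part of the codebase):

```typescript
// Sketch of the INSERT OR IGNORE idempotency contract: the first call for a
// content session ID creates a row; every later call returns the same ID.
// A Map keyed by contentSessionId stands in for the sdk_sessions table.
function makeSessionCreator() {
  const sessions = new Map<string, number>();
  let nextId = 1;
  return (contentSessionId: string): number => {
    // INSERT OR IGNORE: only insert when the key is absent
    if (!sessions.has(contentSessionId)) {
      sessions.set(contentSessionId, nextId++);
    }
    // Always fetch and return the (existing or new) ID, like the SELECT that follows
    return sessions.get(contentSessionId)!;
  };
}

const createSession = makeSessionCreator();
const first = createSession('session-abc');  // prompt #1: creates the row
const second = createSession('session-abc'); // prompt #2: insert ignored, same ID
const other = createSession('session-xyz');  // different conversation, new ID
console.log(first === second, first === other); // true false
```

The unconditional SELECT after the INSERT OR IGNORE is what makes the real function safe to call from every hook without any "does this session exist?" branching.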
src/services/sqlite/sessions/get.ts (new file, 106 lines)
@@ -0,0 +1,106 @@
/**
 * Session retrieval functions
 * Database-first parameter pattern for functional composition
 */

import type { Database } from 'bun:sqlite';
import type {
  SessionBasic,
  SessionFull,
  SessionWithStatus,
  SessionSummaryDetail,
} from './types.js';

/**
 * Get session by ID (basic fields only)
 */
export function getSessionById(db: Database, id: number): SessionBasic | null {
  const stmt = db.prepare(`
    SELECT id, content_session_id, memory_session_id, project, user_prompt
    FROM sdk_sessions
    WHERE id = ?
    LIMIT 1
  `);

  return (stmt.get(id) as SessionBasic | undefined) || null;
}

/**
 * Get SDK sessions by memory session IDs
 * Used for exporting session metadata
 */
export function getSdkSessionsBySessionIds(
  db: Database,
  memorySessionIds: string[]
): SessionFull[] {
  if (memorySessionIds.length === 0) return [];

  const placeholders = memorySessionIds.map(() => '?').join(',');
  const stmt = db.prepare(`
    SELECT id, content_session_id, memory_session_id, project, user_prompt,
           started_at, started_at_epoch, completed_at, completed_at_epoch, status
    FROM sdk_sessions
    WHERE memory_session_id IN (${placeholders})
    ORDER BY started_at_epoch DESC
  `);

  return stmt.all(...memorySessionIds) as SessionFull[];
}

/**
 * Get recent sessions with their status and summary info
 * Returns sessions ordered oldest-first for display
 */
export function getRecentSessionsWithStatus(
  db: Database,
  project: string,
  limit: number = 3
): SessionWithStatus[] {
  const stmt = db.prepare(`
    SELECT * FROM (
      SELECT
        s.memory_session_id,
        s.status,
        s.started_at,
        s.started_at_epoch,
        s.user_prompt,
        CASE WHEN sum.memory_session_id IS NOT NULL THEN 1 ELSE 0 END as has_summary
      FROM sdk_sessions s
      LEFT JOIN session_summaries sum ON s.memory_session_id = sum.memory_session_id
      WHERE s.project = ? AND s.memory_session_id IS NOT NULL
      GROUP BY s.memory_session_id
      ORDER BY s.started_at_epoch DESC
      LIMIT ?
    )
    ORDER BY started_at_epoch ASC
  `);

  return stmt.all(project, limit) as SessionWithStatus[];
}

/**
 * Get full session summary by ID (includes request_summary and learned_summary)
 */
export function getSessionSummaryById(
  db: Database,
  id: number
): SessionSummaryDetail | null {
  const stmt = db.prepare(`
    SELECT
      id,
      memory_session_id,
      content_session_id,
      project,
      user_prompt,
      request_summary,
      learned_summary,
      status,
      created_at,
      created_at_epoch
    FROM sdk_sessions
    WHERE id = ?
    LIMIT 1
  `);

  return (stmt.get(id) as SessionSummaryDetail | undefined) || null;
}
src/services/sqlite/sessions/types.ts (new file, 58 lines)
@@ -0,0 +1,58 @@
/**
 * Session-related type definitions
 * Standalone types for session query results
 */

/**
 * Basic session info (minimal fields)
 */
export interface SessionBasic {
  id: number;
  content_session_id: string;
  memory_session_id: string | null;
  project: string;
  user_prompt: string;
}

/**
 * Full session record with timestamps
 */
export interface SessionFull {
  id: number;
  content_session_id: string;
  memory_session_id: string;
  project: string;
  user_prompt: string;
  started_at: string;
  started_at_epoch: number;
  completed_at: string | null;
  completed_at_epoch: number | null;
  status: string;
}

/**
 * Session with summary info for status display
 */
export interface SessionWithStatus {
  memory_session_id: string | null;
  status: string;
  started_at: string;
  user_prompt: string | null;
  has_summary: boolean;
}

/**
 * Session summary with all detail fields
 */
export interface SessionSummaryDetail {
  id: number;
  memory_session_id: string | null;
  content_session_id: string;
  project: string;
  user_prompt: string;
  request_summary: string | null;
  learned_summary: string | null;
  status: string;
  created_at: string;
  created_at_epoch: number;
}
src/services/sqlite/summaries/get.ts (new file, 86 lines)
@@ -0,0 +1,86 @@
/**
 * Get session summaries from the database
 */
import type { Database } from 'bun:sqlite';
import type { SessionSummaryRecord } from '../../../types/database.js';
import type { SessionSummary, GetByIdsOptions } from './types.js';

/**
 * Get summary for a specific session
 *
 * @param db - Database instance
 * @param memorySessionId - SDK memory session ID
 * @returns Most recent summary for the session, or null if none exists
 */
export function getSummaryForSession(
  db: Database,
  memorySessionId: string
): SessionSummary | null {
  const stmt = db.prepare(`
    SELECT
      request, investigated, learned, completed, next_steps,
      files_read, files_edited, notes, prompt_number, created_at,
      created_at_epoch
    FROM session_summaries
    WHERE memory_session_id = ?
    ORDER BY created_at_epoch DESC
    LIMIT 1
  `);

  return (stmt.get(memorySessionId) as SessionSummary | undefined) || null;
}

/**
 * Get a single session summary by ID
 *
 * @param db - Database instance
 * @param id - Summary ID
 * @returns Full summary record or null if not found
 */
export function getSummaryById(
  db: Database,
  id: number
): SessionSummaryRecord | null {
  const stmt = db.prepare(`
    SELECT * FROM session_summaries WHERE id = ?
  `);

  return (stmt.get(id) as SessionSummaryRecord | undefined) || null;
}

/**
 * Get session summaries by IDs (for hybrid Chroma search)
 * Returns summaries in specified temporal order
 *
 * @param db - Database instance
 * @param ids - Array of summary IDs
 * @param options - Query options (orderBy, limit, project)
 */
export function getSummariesByIds(
  db: Database,
  ids: number[],
  options: GetByIdsOptions = {}
): SessionSummaryRecord[] {
  if (ids.length === 0) return [];

  const { orderBy = 'date_desc', limit, project } = options;
  const orderClause = orderBy === 'date_asc' ? 'ASC' : 'DESC';
  const limitClause = limit ? `LIMIT ${limit}` : '';
  const placeholders = ids.map(() => '?').join(',');
  const params: (number | string)[] = [...ids];

  // Apply project filter
  const whereClause = project
    ? `WHERE id IN (${placeholders}) AND project = ?`
    : `WHERE id IN (${placeholders})`;
  if (project) params.push(project);

  const stmt = db.prepare(`
    SELECT * FROM session_summaries
    ${whereClause}
    ORDER BY created_at_epoch ${orderClause}
    ${limitClause}
  `);

  return stmt.all(...params) as SessionSummaryRecord[];
}
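The dynamic `WHERE id IN (...)` construction in `getSummariesByIds` can be sketched in isolation. `buildInClause` is a hypothetical helper name used only for illustration; it mirrors the placeholder and parameter-ordering logic above without the database dependency:

```typescript
// Hypothetical helper mirroring the IN-clause logic in getSummariesByIds:
// one '?' placeholder per ID, with the optional project filter appended
// last so parameter order matches placeholder order.
function buildInClause(
  ids: number[],
  project?: string
): { where: string; params: (number | string)[] } {
  const placeholders = ids.map(() => '?').join(',');
  const params: (number | string)[] = [...ids];
  const where = project
    ? `WHERE id IN (${placeholders}) AND project = ?`
    : `WHERE id IN (${placeholders})`;
  if (project) params.push(project);
  return { where, params };
}
```

Keeping the IDs as bound parameters (rather than interpolating them into the SQL string) preserves statement caching and avoids injection even though the IDs are numeric.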
77
src/services/sqlite/summaries/recent.ts
Normal file
@@ -0,0 +1,77 @@
/**
 * Get recent session summaries from the database
 */
import type { Database } from 'bun:sqlite';
import type { RecentSummary, SummaryWithSessionInfo, FullSummary } from './types.js';

/**
 * Get recent session summaries for a project
 *
 * @param db - Database instance
 * @param project - Project name to filter by
 * @param limit - Maximum number of summaries to return (default 10)
 */
export function getRecentSummaries(
  db: Database,
  project: string,
  limit: number = 10
): RecentSummary[] {
  const stmt = db.prepare(`
    SELECT
      request, investigated, learned, completed, next_steps,
      files_read, files_edited, notes, prompt_number, created_at
    FROM session_summaries
    WHERE project = ?
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(project, limit) as RecentSummary[];
}

/**
 * Get recent summaries with session info for context display
 *
 * @param db - Database instance
 * @param project - Project name to filter by
 * @param limit - Maximum number of summaries to return (default 3)
 */
export function getRecentSummariesWithSessionInfo(
  db: Database,
  project: string,
  limit: number = 3
): SummaryWithSessionInfo[] {
  const stmt = db.prepare(`
    SELECT
      memory_session_id, request, learned, completed, next_steps,
      prompt_number, created_at
    FROM session_summaries
    WHERE project = ?
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(project, limit) as SummaryWithSessionInfo[];
}

/**
 * Get recent summaries across all projects (for web UI)
 *
 * @param db - Database instance
 * @param limit - Maximum number of summaries to return (default 50)
 */
export function getAllRecentSummaries(
  db: Database,
  limit: number = 50
): FullSummary[] {
  const stmt = db.prepare(`
    SELECT id, request, investigated, learned, completed, next_steps,
           files_read, files_edited, notes, project, prompt_number,
           created_at, created_at_epoch
    FROM session_summaries
    ORDER BY created_at_epoch DESC
    LIMIT ?
  `);

  return stmt.all(limit) as FullSummary[];
}
58
src/services/sqlite/summaries/store.ts
Normal file
@@ -0,0 +1,58 @@
/**
 * Store session summaries in the database
 */
import type { Database } from 'bun:sqlite';
import type { SummaryInput, StoreSummaryResult } from './types.js';

/**
 * Store a session summary (from SDK parsing)
 * Assumes session already exists - will fail with FK error if not
 *
 * @param db - Database instance
 * @param memorySessionId - SDK memory session ID
 * @param project - Project name
 * @param summary - Summary content from SDK parsing
 * @param promptNumber - Optional prompt number
 * @param discoveryTokens - Token count for discovery (default 0)
 * @param overrideTimestampEpoch - Optional timestamp override for backlog processing
 */
export function storeSummary(
  db: Database,
  memorySessionId: string,
  project: string,
  summary: SummaryInput,
  promptNumber?: number,
  discoveryTokens: number = 0,
  overrideTimestampEpoch?: number
): StoreSummaryResult {
  // Use override timestamp if provided (for processing backlog messages with original timestamps)
  const timestampEpoch = overrideTimestampEpoch ?? Date.now();
  const timestampIso = new Date(timestampEpoch).toISOString();

  const stmt = db.prepare(`
    INSERT INTO session_summaries
      (memory_session_id, project, request, investigated, learned, completed,
       next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
    VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  `);

  const result = stmt.run(
    memorySessionId,
    project,
    summary.request,
    summary.investigated,
    summary.learned,
    summary.completed,
    summary.next_steps,
    summary.notes,
    promptNumber || null,
    discoveryTokens,
    timestampIso,
    timestampEpoch
  );

  return {
    id: Number(result.lastInsertRowid),
    createdAtEpoch: timestampEpoch
  };
}
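The timestamp-override behavior in `storeSummary` is worth noting: it uses nullish coalescing (`??`), so an explicit epoch of `0` would be honored rather than replaced by the current time, which matters for backlog processing with original timestamps. A minimal sketch, with `resolveTimestamp` as a hypothetical helper name:

```typescript
// Hypothetical sketch of the timestamp resolution in storeSummary:
// an explicit override epoch (backlog processing) wins over "now",
// and `??` (not `||`) means even epoch 0 counts as an override.
function resolveTimestamp(overrideEpoch?: number): { epoch: number; iso: string } {
  const epoch = overrideEpoch ?? Date.now();
  return { epoch, iso: new Date(epoch).toISOString() };
}
```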
97
src/services/sqlite/summaries/types.ts
Normal file
@@ -0,0 +1,97 @@
/**
 * Type definitions for summary-related database operations
 */

/**
 * Summary input for storage (from SDK parsing)
 */
export interface SummaryInput {
  request: string;
  investigated: string;
  learned: string;
  completed: string;
  next_steps: string;
  notes: string | null;
}

/**
 * Result from storing a summary
 */
export interface StoreSummaryResult {
  id: number;
  createdAtEpoch: number;
}

/**
 * Summary for a specific session (minimal fields)
 */
export interface SessionSummary {
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  files_read: string | null;
  files_edited: string | null;
  notes: string | null;
  prompt_number: number | null;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Summary with session info for context display
 */
export interface SummaryWithSessionInfo {
  memory_session_id: string;
  request: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  prompt_number: number | null;
  created_at: string;
}

/**
 * Recent summary (for project-scoped queries)
 */
export interface RecentSummary {
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  files_read: string | null;
  files_edited: string | null;
  notes: string | null;
  prompt_number: number | null;
  created_at: string;
}

/**
 * Full summary with all fields (for web UI)
 */
export interface FullSummary {
  id: number;
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  files_read: string | null;
  files_edited: string | null;
  notes: string | null;
  project: string;
  prompt_number: number | null;
  created_at: string;
  created_at_epoch: number;
}

/**
 * Options for getByIds query
 */
export interface GetByIdsOptions {
  orderBy?: 'date_desc' | 'date_asc';
  limit?: number;
  project?: string;
}
218
src/services/sqlite/timeline/queries.ts
Normal file
@@ -0,0 +1,218 @@
/**
 * Timeline query functions
 * Provides time-based context queries for observations, sessions, and prompts
 *
 * grep-friendly: getTimelineAroundTimestamp, getTimelineAroundObservation, getAllProjects
 */

import type { Database } from 'bun:sqlite';
import type { ObservationRecord, SessionSummaryRecord, UserPromptRecord } from '../../../types/database.js';
import { logger } from '../../../utils/logger.js';

/**
 * Timeline result containing observations, sessions, and prompts within a time window
 */
export interface TimelineResult {
  observations: ObservationRecord[];
  sessions: Array<{
    id: number;
    memory_session_id: string;
    project: string;
    request: string | null;
    completed: string | null;
    next_steps: string | null;
    created_at: string;
    created_at_epoch: number;
  }>;
  prompts: Array<{
    id: number;
    content_session_id: string;
    prompt_number: number;
    prompt_text: string;
    project: string | undefined;
    created_at: string;
    created_at_epoch: number;
  }>;
}

/**
 * Get timeline around a specific timestamp
 * Convenience wrapper that delegates to getTimelineAroundObservation with null anchor
 *
 * @param db Database connection
 * @param anchorEpoch Epoch timestamp to anchor the query around
 * @param depthBefore Number of records to retrieve before anchor (any type)
 * @param depthAfter Number of records to retrieve after anchor (any type)
 * @param project Optional project filter
 * @returns Object containing observations, sessions, and prompts for the specified window
 */
export function getTimelineAroundTimestamp(
  db: Database,
  anchorEpoch: number,
  depthBefore: number = 10,
  depthAfter: number = 10,
  project?: string
): TimelineResult {
  return getTimelineAroundObservation(db, null, anchorEpoch, depthBefore, depthAfter, project);
}

/**
 * Get timeline around a specific observation ID
 * Uses observation ID offsets to determine time boundaries, then fetches all record types in that window
 *
 * @param db Database connection
 * @param anchorObservationId Observation ID to anchor around (null for timestamp-based)
 * @param anchorEpoch Epoch timestamp fallback or anchor for timestamp-based queries
 * @param depthBefore Number of records to retrieve before anchor
 * @param depthAfter Number of records to retrieve after anchor
 * @param project Optional project filter
 * @returns Object containing observations, sessions, and prompts for the specified window
 */
export function getTimelineAroundObservation(
  db: Database,
  anchorObservationId: number | null,
  anchorEpoch: number,
  depthBefore: number = 10,
  depthAfter: number = 10,
  project?: string
): TimelineResult {
  const projectFilter = project ? 'AND project = ?' : '';
  const projectParams = project ? [project] : [];

  let startEpoch: number;
  let endEpoch: number;

  if (anchorObservationId !== null) {
    // Get boundary observations by ID offset
    const beforeQuery = `
      SELECT id, created_at_epoch
      FROM observations
      WHERE id <= ? ${projectFilter}
      ORDER BY id DESC
      LIMIT ?
    `;
    const afterQuery = `
      SELECT id, created_at_epoch
      FROM observations
      WHERE id >= ? ${projectFilter}
      ORDER BY id ASC
      LIMIT ?
    `;

    try {
      const beforeRecords = db.prepare(beforeQuery).all(anchorObservationId, ...projectParams, depthBefore + 1) as Array<{id: number; created_at_epoch: number}>;
      const afterRecords = db.prepare(afterQuery).all(anchorObservationId, ...projectParams, depthAfter + 1) as Array<{id: number; created_at_epoch: number}>;

      // Get the earliest and latest timestamps from boundary observations
      if (beforeRecords.length === 0 && afterRecords.length === 0) {
        return { observations: [], sessions: [], prompts: [] };
      }

      startEpoch = beforeRecords.length > 0 ? beforeRecords[beforeRecords.length - 1].created_at_epoch : anchorEpoch;
      endEpoch = afterRecords.length > 0 ? afterRecords[afterRecords.length - 1].created_at_epoch : anchorEpoch;
    } catch (err: any) {
      logger.error('DB', 'Error getting boundary observations', undefined, { error: err, project });
      return { observations: [], sessions: [], prompts: [] };
    }
  } else {
    // For timestamp-based anchors, use time-based boundaries
    // Get observations to find the time window
    const beforeQuery = `
      SELECT created_at_epoch
      FROM observations
      WHERE created_at_epoch <= ? ${projectFilter}
      ORDER BY created_at_epoch DESC
      LIMIT ?
    `;
    const afterQuery = `
      SELECT created_at_epoch
      FROM observations
      WHERE created_at_epoch >= ? ${projectFilter}
      ORDER BY created_at_epoch ASC
      LIMIT ?
    `;

    try {
      const beforeRecords = db.prepare(beforeQuery).all(anchorEpoch, ...projectParams, depthBefore) as Array<{created_at_epoch: number}>;
      const afterRecords = db.prepare(afterQuery).all(anchorEpoch, ...projectParams, depthAfter + 1) as Array<{created_at_epoch: number}>;

      if (beforeRecords.length === 0 && afterRecords.length === 0) {
        return { observations: [], sessions: [], prompts: [] };
      }

      startEpoch = beforeRecords.length > 0 ? beforeRecords[beforeRecords.length - 1].created_at_epoch : anchorEpoch;
      endEpoch = afterRecords.length > 0 ? afterRecords[afterRecords.length - 1].created_at_epoch : anchorEpoch;
    } catch (err: any) {
      logger.error('DB', 'Error getting boundary timestamps', undefined, { error: err, project });
      return { observations: [], sessions: [], prompts: [] };
    }
  }

  // Now query ALL record types within the time window
  const obsQuery = `
    SELECT *
    FROM observations
    WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${projectFilter}
    ORDER BY created_at_epoch ASC
  `;

  const sessQuery = `
    SELECT *
    FROM session_summaries
    WHERE created_at_epoch >= ? AND created_at_epoch <= ? ${projectFilter}
    ORDER BY created_at_epoch ASC
  `;

  const promptQuery = `
    SELECT up.*, s.project, s.memory_session_id
    FROM user_prompts up
    JOIN sdk_sessions s ON up.content_session_id = s.content_session_id
    WHERE up.created_at_epoch >= ? AND up.created_at_epoch <= ? ${projectFilter.replace('project', 's.project')}
    ORDER BY up.created_at_epoch ASC
  `;

  const observations = db.prepare(obsQuery).all(startEpoch, endEpoch, ...projectParams) as ObservationRecord[];
  const sessions = db.prepare(sessQuery).all(startEpoch, endEpoch, ...projectParams) as SessionSummaryRecord[];
  const prompts = db.prepare(promptQuery).all(startEpoch, endEpoch, ...projectParams) as UserPromptRecord[];

  return {
    observations,
    sessions: sessions.map(s => ({
      id: s.id,
      memory_session_id: s.memory_session_id,
      project: s.project,
      request: s.request,
      completed: s.completed,
      next_steps: s.next_steps,
      created_at: s.created_at,
      created_at_epoch: s.created_at_epoch
    })),
    prompts: prompts.map(p => ({
      id: p.id,
      content_session_id: p.content_session_id,
      prompt_number: p.prompt_number,
      prompt_text: p.prompt_text,
      project: p.project,
      created_at: p.created_at,
      created_at_epoch: p.created_at_epoch
    }))
  };
}

/**
 * Get all unique projects from the database (for web UI project filter)
 *
 * @param db Database connection
 * @returns Array of unique project names
 */
export function getAllProjects(db: Database): string[] {
  const stmt = db.prepare(`
    SELECT DISTINCT project
    FROM sdk_sessions
    WHERE project IS NOT NULL AND project != ''
    ORDER BY project ASC
  `);

  const rows = stmt.all() as Array<{ project: string }>;
  return rows.map(row => row.project);
}
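The window-boundary logic in `getTimelineAroundObservation` follows one pattern in both branches: the furthest record fetched in each direction defines the epoch window, and the anchor epoch fills in when one side is empty. A minimal sketch, with `computeWindow` as a hypothetical name for illustration only:

```typescript
// Hypothetical sketch of the boundary computation in getTimelineAroundObservation.
// `before` holds epochs fetched away from the anchor (descending order),
// `after` holds epochs fetched away from the anchor (ascending order);
// the last element of each is the window edge.
function computeWindow(
  before: number[],
  after: number[],
  anchorEpoch: number
): { start: number; end: number } | null {
  if (before.length === 0 && after.length === 0) return null; // no records at all
  const start = before.length > 0 ? before[before.length - 1] : anchorEpoch;
  const end = after.length > 0 ? after[after.length - 1] : anchorEpoch;
  return { start, end };
}
```

The window is then used as an inclusive `BETWEEN`-style filter across all three record types, which is why sessions and prompts can appear even though only observations define the boundaries.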
236
src/services/sqlite/transactions.ts
Normal file
@@ -0,0 +1,236 @@
/**
 * Cross-boundary database transactions
 *
 * This module contains atomic transactions that span multiple domains
 * (observations, summaries, pending messages). These functions ensure
 * data consistency across domain boundaries.
 */

import { Database } from 'bun:sqlite';
import type { ObservationInput } from './observations/types.js';
import type { SummaryInput } from './summaries/types.js';

/**
 * Result from storeObservations / storeObservationsAndMarkComplete transaction
 */
export interface StoreObservationsResult {
  observationIds: number[];
  summaryId: number | null;
  createdAtEpoch: number;
}

// Legacy alias for backwards compatibility
export type StoreAndMarkCompleteResult = StoreObservationsResult;

/**
 * ATOMIC: Store observations + summary + mark pending message as processed
 *
 * This function wraps observation storage, summary storage, and message completion
 * in a single database transaction to prevent race conditions. If the worker crashes
 * during processing, either all operations succeed together or all fail together.
 *
 * This fixes the observation duplication bug where observations were stored but
 * the message wasn't marked complete, causing reprocessing on crash recovery.
 *
 * @param db - Database instance
 * @param memorySessionId - SDK memory session ID
 * @param project - Project name
 * @param observations - Array of observations to store (can be empty)
 * @param summary - Optional summary to store
 * @param messageId - Pending message ID to mark as processed
 * @param promptNumber - Optional prompt number
 * @param discoveryTokens - Discovery tokens count
 * @param overrideTimestampEpoch - Optional override timestamp
 * @returns Object with observation IDs, optional summary ID, and timestamp
 */
export function storeObservationsAndMarkComplete(
  db: Database,
  memorySessionId: string,
  project: string,
  observations: ObservationInput[],
  summary: SummaryInput | null,
  messageId: number,
  promptNumber?: number,
  discoveryTokens: number = 0,
  overrideTimestampEpoch?: number
): StoreAndMarkCompleteResult {
  // Use override timestamp if provided
  const timestampEpoch = overrideTimestampEpoch ?? Date.now();
  const timestampIso = new Date(timestampEpoch).toISOString();

  // Create transaction that wraps all operations
  const storeAndMarkTx = db.transaction(() => {
    const observationIds: number[] = [];

    // 1. Store all observations
    const obsStmt = db.prepare(`
      INSERT INTO observations
        (memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
         files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
      VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    `);

    for (const observation of observations) {
      const result = obsStmt.run(
        memorySessionId,
        project,
        observation.type,
        observation.title,
        observation.subtitle,
        JSON.stringify(observation.facts),
        observation.narrative,
        JSON.stringify(observation.concepts),
        JSON.stringify(observation.files_read),
        JSON.stringify(observation.files_modified),
        promptNumber || null,
        discoveryTokens,
        timestampIso,
        timestampEpoch
      );
      observationIds.push(Number(result.lastInsertRowid));
    }

    // 2. Store summary if provided
    let summaryId: number | null = null;
    if (summary) {
      const summaryStmt = db.prepare(`
        INSERT INTO session_summaries
          (memory_session_id, project, request, investigated, learned, completed,
           next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
        VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
      `);

      const result = summaryStmt.run(
        memorySessionId,
        project,
        summary.request,
        summary.investigated,
        summary.learned,
        summary.completed,
        summary.next_steps,
        summary.notes,
        promptNumber || null,
        discoveryTokens,
        timestampIso,
        timestampEpoch
      );
      summaryId = Number(result.lastInsertRowid);
    }

    // 3. Mark pending message as processed
    // This UPDATE is part of the same transaction, so if it fails,
    // observations and summary will be rolled back
    const updateStmt = db.prepare(`
      UPDATE pending_messages
      SET
        status = 'processed',
        completed_at_epoch = ?,
        tool_input = NULL,
        tool_response = NULL
      WHERE id = ? AND status = 'processing'
    `);
    updateStmt.run(timestampEpoch, messageId);

    return { observationIds, summaryId, createdAtEpoch: timestampEpoch };
  });

  // Execute the transaction and return results
  return storeAndMarkTx();
}

/**
 * ATOMIC: Store observations + summary (no message tracking)
 *
 * Simplified version for use with claim-and-delete queue pattern.
 * Messages are deleted from queue immediately on claim, so there's no
 * message completion to track. This just stores observations and summary.
 *
 * @param db - Database instance
 * @param memorySessionId - SDK memory session ID
 * @param project - Project name
 * @param observations - Array of observations to store (can be empty)
 * @param summary - Optional summary to store
 * @param promptNumber - Optional prompt number
 * @param discoveryTokens - Discovery tokens count
 * @param overrideTimestampEpoch - Optional override timestamp
 * @returns Object with observation IDs, optional summary ID, and timestamp
 */
export function storeObservations(
  db: Database,
  memorySessionId: string,
  project: string,
  observations: ObservationInput[],
  summary: SummaryInput | null,
  promptNumber?: number,
  discoveryTokens: number = 0,
  overrideTimestampEpoch?: number
): StoreObservationsResult {
  // Use override timestamp if provided
  const timestampEpoch = overrideTimestampEpoch ?? Date.now();
  const timestampIso = new Date(timestampEpoch).toISOString();

  // Create transaction that wraps all operations
  const storeTx = db.transaction(() => {
    const observationIds: number[] = [];

    // 1. Store all observations
    const obsStmt = db.prepare(`
      INSERT INTO observations
        (memory_session_id, project, type, title, subtitle, facts, narrative, concepts,
         files_read, files_modified, prompt_number, discovery_tokens, created_at, created_at_epoch)
      VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    `);

    for (const observation of observations) {
      const result = obsStmt.run(
        memorySessionId,
        project,
        observation.type,
        observation.title,
        observation.subtitle,
        JSON.stringify(observation.facts),
        observation.narrative,
        JSON.stringify(observation.concepts),
        JSON.stringify(observation.files_read),
        JSON.stringify(observation.files_modified),
        promptNumber || null,
        discoveryTokens,
        timestampIso,
        timestampEpoch
      );
      observationIds.push(Number(result.lastInsertRowid));
    }

    // 2. Store summary if provided
    let summaryId: number | null = null;
    if (summary) {
      const summaryStmt = db.prepare(`
        INSERT INTO session_summaries
          (memory_session_id, project, request, investigated, learned, completed,
           next_steps, notes, prompt_number, discovery_tokens, created_at, created_at_epoch)
        VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
      `);

      const result = summaryStmt.run(
        memorySessionId,
        project,
        summary.request,
        summary.investigated,
        summary.learned,
        summary.completed,
        summary.next_steps,
        summary.notes,
        promptNumber || null,
        discoveryTokens,
        timestampIso,
        timestampEpoch
      );
      summaryId = Number(result.lastInsertRowid);
    }

    return { observationIds, summaryId, createdAtEpoch: timestampEpoch };
  });

  // Execute the transaction and return results
  return storeTx();
}
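The guarantee these transactions rely on is all-or-nothing: if any step throws, every prior step's effect is discarded. A minimal in-memory sketch of that rollback semantics (hypothetical, not the `bun:sqlite` API, which provides it via `db.transaction()`):

```typescript
// Hypothetical in-memory illustration of the rollback guarantee used by
// storeObservationsAndMarkComplete: snapshot state, run every operation,
// and restore the snapshot if any operation throws.
function runTransaction<T>(
  state: Map<string, unknown>,
  ops: Array<(s: Map<string, unknown>) => T>
): T[] {
  const snapshot = new Map(state); // "BEGIN": capture current state
  try {
    return ops.map(op => op(state)); // run all operations in order
  } catch (err) {
    state.clear(); // "ROLLBACK": restore the snapshot
    for (const [k, v] of snapshot) state.set(k, v);
    throw err;
  }
}
```

In SQLite the same property means the observation INSERTs, the summary INSERT, and the pending-message UPDATE either all commit together or leave the database untouched, which is what prevents the reprocessing-on-crash duplication bug described above.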
File diff suppressed because it is too large
@@ -30,7 +30,6 @@ export interface ActiveSession {
  startTime: number;
  cumulativeInputTokens: number; // Track input tokens for discovery cost
  cumulativeOutputTokens: number; // Track output tokens for discovery cost
  pendingProcessingIds: Set<number>; // Track ALL message IDs yielded but not yet processed
  earliestPendingTimestamp: number | null; // Original timestamp of earliest pending message (for accurate observation timestamps)
  conversationHistory: ConversationMessage[]; // Shared conversation history for provider switching
  currentProvider: 'claude' | 'gemini' | 'openrouter' | null; // Track which provider is currently running
@@ -15,13 +15,17 @@ import { homedir } from 'os';
import { DatabaseManager } from './DatabaseManager.js';
import { SessionManager } from './SessionManager.js';
import { logger } from '../../utils/logger.js';
import { parseObservations, parseSummary } from '../../sdk/parser.js';
import { buildInitPrompt, buildObservationPrompt, buildSummaryPrompt, buildContinuationPrompt } from '../../sdk/prompts.js';
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import type { ActiveSession, ConversationMessage } from '../worker-types.js';
import { ModeManager } from '../domain/ModeManager.js';
import { updateCursorContextForProject } from '../worker-service.js';
import { getWorkerPort } from '../../shared/worker-utils.js';
import {
  processAgentResponse,
  shouldFallbackToClaude,
  isAbortError,
  type WorkerRef,
  type FallbackAgent
} from './agents/index.js';

// Gemini API endpoint
const GEMINI_API_URL = 'https://generativelanguage.googleapis.com/v1beta/models';

@@ -96,11 +100,6 @@ interface GeminiContent {
  parts: Array<{ text: string }>;
}

// Forward declaration for fallback agent type
type FallbackAgent = {
  startSession(session: ActiveSession, worker?: any): Promise<void>;
};

export class GeminiAgent {
  private dbManager: DatabaseManager;
  private sessionManager: SessionManager;

@@ -119,28 +118,11 @@ export class GeminiAgent {
    this.fallbackAgent = agent;
  }

  /**
   * Check if an error should trigger fallback to Claude
   */
  private shouldFallbackToClaude(error: any): boolean {
    const message = error?.message || '';
    // Fall back on rate limit (429), server errors (5xx), or network issues
    return (
      message.includes('429') ||
      message.includes('500') ||
      message.includes('502') ||
      message.includes('503') ||
      message.includes('ECONNREFUSED') ||
      message.includes('ETIMEDOUT') ||
      message.includes('fetch failed')
    );
  }

  /**
   * Start Gemini agent for a session
   * Uses multi-turn conversation to maintain context across messages
   */
  async startSession(session: ActiveSession, worker?: any): Promise<void> {
  async startSession(session: ActiveSession, worker?: WorkerRef): Promise<void> {
    try {
      // Get Gemini configuration
      const { apiKey, model, rateLimitingEnabled } = this.getGeminiConfig();

@@ -170,8 +152,17 @@ export class GeminiAgent {
        session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7); // Rough estimate
        session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);

        // Process response (no original timestamp for init - not from queue)
        await this.processGeminiResponse(session, initResponse.content, worker, tokensUsed, null);
        // Process response using shared ResponseProcessor (no original timestamp for init - not from queue)
        await processAgentResponse(
          initResponse.content,
          session,
          this.dbManager,
          this.sessionManager,
          worker,
          tokensUsed,
          null,
          'Gemini'
        );
      } else {
        logger.warn('SDK', 'Empty Gemini init response - session may lack context', {
          sessionId: session.sessionDbId,
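The 70/30 split above is a rough accounting heuristic: the agent only has a combined token count, so it attributes roughly 70% to input and 30% to output for discovery-cost tracking. Sketched as a hypothetical helper (`splitTokenEstimate` is an illustration name, not in the diff):

```typescript
// Hypothetical sketch of the rough token accounting in GeminiAgent:
// a combined usage count is attributed ~70% to input and ~30% to output.
// Math.floor keeps both shares integral; their sum may undercount by 1.
function splitTokenEstimate(tokensUsed: number): { input: number; output: number } {
  return {
    input: Math.floor(tokensUsed * 0.7),
    output: Math.floor(tokensUsed * 0.3),
  };
}
```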
@@ -205,18 +196,27 @@
        session.conversationHistory.push({ role: 'user', content: obsPrompt });
        const obsResponse = await this.queryGeminiMultiTurn(session.conversationHistory, apiKey, model, rateLimitingEnabled);

        let tokensUsed = 0;
        if (obsResponse.content) {
          // Add response to conversation history
          session.conversationHistory.push({ role: 'assistant', content: obsResponse.content });

          const tokensUsed = obsResponse.tokensUsed || 0;
          tokensUsed = obsResponse.tokensUsed || 0;
          session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7);
          session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);
        }

        // Process response (even if empty) - empty responses will have no observations/summaries
        // but messages still need to be marked complete atomically
        await this.processGeminiResponse(session, obsResponse.content || '', worker, tokensUsed, originalTimestamp);
        // Process response using shared ResponseProcessor
        await processAgentResponse(
          obsResponse.content || '',
          session,
          this.dbManager,
          this.sessionManager,
          worker,
          tokensUsed,
          originalTimestamp,
          'Gemini'
        );

      } else if (message.type === 'summarize') {
        // Build summary prompt

@@ -232,18 +232,27 @@
        session.conversationHistory.push({ role: 'user', content: summaryPrompt });
        const summaryResponse = await this.queryGeminiMultiTurn(session.conversationHistory, apiKey, model, rateLimitingEnabled);

        let tokensUsed = 0;
        if (summaryResponse.content) {
          // Add response to conversation history
          session.conversationHistory.push({ role: 'assistant', content: summaryResponse.content });

          const tokensUsed = summaryResponse.tokensUsed || 0;
          tokensUsed = summaryResponse.tokensUsed || 0;
          session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7);
|
||||
session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);
|
||||
}
|
||||
|
||||
// Process response (even if empty) - empty responses will have no observations/summaries
|
||||
// but messages still need to be marked complete atomically
|
||||
await this.processGeminiResponse(session, summaryResponse.content || '', worker, tokensUsed, originalTimestamp);
|
||||
// Process response using shared ResponseProcessor
|
||||
await processAgentResponse(
|
||||
summaryResponse.content || '',
|
||||
session,
|
||||
this.dbManager,
|
||||
this.sessionManager,
|
||||
worker,
|
||||
tokensUsed,
|
||||
originalTimestamp,
|
||||
'Gemini'
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -255,37 +264,26 @@ export class GeminiAgent {
|
||||
historyLength: session.conversationHistory.length
|
||||
});
|
||||
|
||||
} catch (error: any) {
|
||||
if (error.name === 'AbortError') {
|
||||
} catch (error: unknown) {
|
||||
if (isAbortError(error)) {
|
||||
logger.warn('SDK', 'Gemini agent aborted', { sessionId: session.sessionDbId });
|
||||
throw error;
|
||||
}
|
||||
|
||||
// Check if we should fall back to Claude
|
||||
if (this.shouldFallbackToClaude(error) && this.fallbackAgent) {
|
||||
if (shouldFallbackToClaude(error) && this.fallbackAgent) {
|
||||
logger.warn('SDK', 'Gemini API failed, falling back to Claude SDK', {
|
||||
sessionDbId: session.sessionDbId,
|
||||
error: error.message,
|
||||
error: error instanceof Error ? error.message : String(error),
|
||||
historyLength: session.conversationHistory.length
|
||||
});
|
||||
|
||||
// Reset any 'processing' messages back to 'pending' so Claude can retry them
|
||||
// This handles the case where Gemini failed mid-processing a message
|
||||
const pendingStore = this.sessionManager.getPendingMessageStore();
|
||||
const resetCount = pendingStore.resetStuckMessages(0); // 0 = reset ALL processing messages
|
||||
if (resetCount > 0) {
|
||||
logger.info('SDK', 'Reset processing messages for fallback', {
|
||||
sessionDbId: session.sessionDbId,
|
||||
resetCount
|
||||
});
|
||||
}
|
||||
|
||||
// Fall back to Claude - it will use the same session with shared conversationHistory
|
||||
// Note: Claude SDK will continue processing from current state
|
||||
// Note: With claim-and-delete queue pattern, messages are already deleted on claim
|
||||
return this.fallbackAgent.startSession(session, worker);
|
||||
}
|
||||
|
||||
logger.failure('SDK', 'Gemini agent error', { sessionDbId: session.sessionDbId }, error);
|
||||
logger.failure('SDK', 'Gemini agent error', { sessionDbId: session.sessionDbId }, error as Error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
@@ -356,166 +354,6 @@ export class GeminiAgent {
|
||||
return { content, tokensUsed };
|
||||
}
|
||||
|
||||
/**
|
||||
* Process Gemini response (same format as Claude)
|
||||
* @param originalTimestamp - Original epoch when message was queued (for backlog processing accuracy)
|
||||
*/
|
||||
private async processGeminiResponse(
|
||||
session: ActiveSession,
|
||||
text: string,
|
||||
worker: any | undefined,
|
||||
discoveryTokens: number,
|
||||
originalTimestamp: number | null
|
||||
): Promise<void> {
|
||||
// Parse observations and summary
|
||||
const observations = parseObservations(text, session.contentSessionId);
|
||||
const summary = parseSummary(text, session.sessionDbId);
|
||||
|
||||
// Convert nullable fields to empty strings for storeSummary (if summary exists)
|
||||
const summaryForStore = summary ? {
|
||||
request: summary.request || '',
|
||||
investigated: summary.investigated || '',
|
||||
learned: summary.learned || '',
|
||||
completed: summary.completed || '',
|
||||
next_steps: summary.next_steps || '',
|
||||
notes: summary.notes
|
||||
} : null;
|
||||
|
||||
// Get the pending message ID(s) for this response
|
||||
const pendingMessageStore = this.sessionManager.getPendingMessageStore();
|
||||
const sessionStore = this.dbManager.getSessionStore();
|
||||
|
||||
if (session.pendingProcessingIds.size > 0) {
|
||||
// ATOMIC TRANSACTION: Store observations + summary + mark message(s) complete
|
||||
for (const messageId of session.pendingProcessingIds) {
|
||||
// CRITICAL: Must use memorySessionId (not contentSessionId) for FK constraint
|
||||
if (!session.memorySessionId) {
|
||||
throw new Error('Cannot store observations: memorySessionId not yet captured');
|
||||
}
|
||||
|
||||
const result = sessionStore.storeObservationsAndMarkComplete(
|
||||
session.memorySessionId,
|
||||
session.project,
|
||||
observations,
|
||||
summaryForStore,
|
||||
messageId,
|
||||
pendingMessageStore,
|
||||
session.lastPromptNumber,
|
||||
discoveryTokens,
|
||||
originalTimestamp ?? undefined
|
||||
);
|
||||
|
||||
logger.info('SDK', 'Gemini observations and summary saved atomically', {
|
||||
sessionId: session.sessionDbId,
|
||||
messageId,
|
||||
observationCount: result.observationIds.length,
|
||||
hasSummary: !!result.summaryId,
|
||||
atomicTransaction: true
|
||||
});
|
||||
|
||||
// AFTER transaction commits - async operations (can fail safely)
|
||||
for (let i = 0; i < observations.length; i++) {
|
||||
const obsId = result.observationIds[i];
|
||||
const obs = observations[i];
|
||||
|
||||
this.dbManager.getChromaSync().syncObservation(
|
||||
obsId,
|
||||
session.contentSessionId,
|
||||
session.project,
|
||||
obs,
|
||||
session.lastPromptNumber,
|
||||
result.createdAtEpoch,
|
||||
discoveryTokens
|
||||
).catch(err => {
|
||||
logger.warn('SDK', 'Gemini chroma sync failed', { obsId }, err);
|
||||
});
|
||||
|
||||
// Broadcast to SSE clients
|
||||
if (worker && worker.sseBroadcaster) {
|
||||
worker.sseBroadcaster.broadcast({
|
||||
type: 'new_observation',
|
||||
observation: {
|
||||
id: obsId,
|
||||
memory_session_id: session.memorySessionId,
|
||||
session_id: session.contentSessionId,
|
||||
type: obs.type,
|
||||
title: obs.title,
|
||||
subtitle: obs.subtitle,
|
||||
text: null,
|
||||
narrative: obs.narrative || null,
|
||||
facts: JSON.stringify(obs.facts || []),
|
||||
concepts: JSON.stringify(obs.concepts || []),
|
||||
files_read: JSON.stringify(obs.files_read || []),
|
||||
files_modified: JSON.stringify(obs.files_modified || []),
|
||||
project: session.project,
|
||||
prompt_number: session.lastPromptNumber,
|
||||
created_at_epoch: result.createdAtEpoch
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Sync summary to Chroma (if present)
|
||||
if (summaryForStore && result.summaryId) {
|
||||
this.dbManager.getChromaSync().syncSummary(
|
||||
result.summaryId,
|
||||
session.contentSessionId,
|
||||
session.project,
|
||||
summaryForStore,
|
||||
session.lastPromptNumber,
|
||||
result.createdAtEpoch,
|
||||
discoveryTokens
|
||||
).catch(err => {
|
||||
logger.warn('SDK', 'Gemini chroma sync failed', { summaryId: result.summaryId }, err);
|
||||
});
|
||||
|
||||
// Broadcast to SSE clients
|
||||
if (worker && worker.sseBroadcaster) {
|
||||
worker.sseBroadcaster.broadcast({
|
||||
type: 'new_summary',
|
||||
summary: {
|
||||
id: result.summaryId,
|
||||
session_id: session.contentSessionId,
|
||||
request: summary!.request,
|
||||
investigated: summary!.investigated,
|
||||
learned: summary!.learned,
|
||||
completed: summary!.completed,
|
||||
next_steps: summary!.next_steps,
|
||||
notes: summary!.notes,
|
||||
project: session.project,
|
||||
prompt_number: session.lastPromptNumber,
|
||||
created_at_epoch: result.createdAtEpoch
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Update Cursor context file for registered projects (fire-and-forget)
|
||||
updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
|
||||
logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Clear the processed message IDs
|
||||
session.pendingProcessingIds.clear();
|
||||
session.earliestPendingTimestamp = null;
|
||||
|
||||
// Clean up old processed messages
|
||||
const deletedCount = pendingMessageStore.cleanupProcessed(100);
|
||||
if (deletedCount > 0) {
|
||||
logger.debug('SDK', 'Cleaned up old processed messages', { deletedCount });
|
||||
}
|
||||
|
||||
// Broadcast activity status after processing
|
||||
if (worker && typeof worker.broadcastProcessingStatus === 'function') {
|
||||
worker.broadcastProcessingStatus();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// REMOVED: markMessagesProcessed() - replaced by atomic transaction in processGeminiResponse()
|
||||
// Messages are now marked complete atomically with observation storage to prevent duplicates
|
||||
|
||||
/**
|
||||
* Get Gemini configuration from settings or environment
|
||||
*/
|
||||
|
||||
@@ -14,14 +14,18 @@
import { DatabaseManager } from './DatabaseManager.js';
import { SessionManager } from './SessionManager.js';
import { logger } from '../../utils/logger.js';
import { parseObservations, parseSummary } from '../../sdk/parser.js';
import { buildInitPrompt, buildObservationPrompt, buildSummaryPrompt, buildContinuationPrompt } from '../../sdk/prompts.js';
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import { USER_SETTINGS_PATH } from '../../shared/paths.js';
import type { ActiveSession, ConversationMessage } from '../worker-types.js';
import { ModeManager } from '../domain/ModeManager.js';
import { updateCursorContextForProject } from '../worker-service.js';
import { getWorkerPort } from '../../shared/worker-utils.js';
import {
  processAgentResponse,
  shouldFallbackToClaude,
  isAbortError,
  type WorkerRef,
  type FallbackAgent
} from './agents/index.js';

// OpenRouter API endpoint
const OPENROUTER_API_URL = 'https://openrouter.ai/api/v1/chat/completions';
@@ -29,7 +33,7 @@ const OPENROUTER_API_URL = 'https://openrouter.ai/api/v1/chat/completions';
// Context window management constants (defaults, overridable via settings)
const DEFAULT_MAX_CONTEXT_MESSAGES = 20; // Maximum messages to keep in conversation history
const DEFAULT_MAX_ESTIMATED_TOKENS = 100000; // ~100k tokens max context (safety limit)
const CHARS_PER_TOKEN_ESTIMATE = 4; // Conservative estimate: 1 token ≈ 4 chars
const CHARS_PER_TOKEN_ESTIMATE = 4; // Conservative estimate: 1 token = 4 chars

// OpenAI-compatible message format
interface OpenAIMessage {
@@ -56,11 +60,6 @@ interface OpenRouterResponse {
  };
}

// Forward declaration for fallback agent type
type FallbackAgent = {
  startSession(session: ActiveSession, worker?: any): Promise<void>;
};

export class OpenRouterAgent {
  private dbManager: DatabaseManager;
  private sessionManager: SessionManager;
@@ -79,28 +78,11 @@ export class OpenRouterAgent {
    this.fallbackAgent = agent;
  }

  /**
   * Check if an error should trigger fallback to Claude
   */
  private shouldFallbackToClaude(error: any): boolean {
    const message = error?.message || '';
    // Fall back on rate limit (429), server errors (5xx), or network issues
    return (
      message.includes('429') ||
      message.includes('500') ||
      message.includes('502') ||
      message.includes('503') ||
      message.includes('ECONNREFUSED') ||
      message.includes('ETIMEDOUT') ||
      message.includes('fetch failed')
    );
  }

  /**
   * Start OpenRouter agent for a session
   * Uses multi-turn conversation to maintain context across messages
   */
  async startSession(session: ActiveSession, worker?: any): Promise<void> {
  async startSession(session: ActiveSession, worker?: WorkerRef): Promise<void> {
    try {
      // Get OpenRouter configuration
      const { apiKey, model, siteUrl, appName } = this.getOpenRouterConfig();
@@ -130,8 +112,17 @@ export class OpenRouterAgent {
        session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7); // Rough estimate
        session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);

        // Process response (no original timestamp for init - not from queue)
        await this.processOpenRouterResponse(session, initResponse.content, worker, tokensUsed, null);
        // Process response using shared ResponseProcessor (no original timestamp for init - not from queue)
        await processAgentResponse(
          initResponse.content,
          session,
          this.dbManager,
          this.sessionManager,
          worker,
          tokensUsed,
          null,
          'OpenRouter'
        );
      } else {
        logger.warn('SDK', 'Empty OpenRouter init response - session may lack context', {
          sessionId: session.sessionDbId,
@@ -164,18 +155,27 @@ export class OpenRouterAgent {
          session.conversationHistory.push({ role: 'user', content: obsPrompt });
          const obsResponse = await this.queryOpenRouterMultiTurn(session.conversationHistory, apiKey, model, siteUrl, appName);

          let tokensUsed = 0;
          if (obsResponse.content) {
            // Add response to conversation history
            session.conversationHistory.push({ role: 'assistant', content: obsResponse.content });

            const tokensUsed = obsResponse.tokensUsed || 0;
            tokensUsed = obsResponse.tokensUsed || 0;
            session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7);
            session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);
          }

          // Process response (even if empty) - empty responses will have no observations/summaries
          // but messages still need to be marked complete atomically
          await this.processOpenRouterResponse(session, obsResponse.content || '', worker, tokensUsed, originalTimestamp);
          // Process response using shared ResponseProcessor
          await processAgentResponse(
            obsResponse.content || '',
            session,
            this.dbManager,
            this.sessionManager,
            worker,
            tokensUsed,
            originalTimestamp,
            'OpenRouter'
          );

        } else if (message.type === 'summarize') {
          // Build summary prompt
@@ -191,18 +191,27 @@ export class OpenRouterAgent {
          session.conversationHistory.push({ role: 'user', content: summaryPrompt });
          const summaryResponse = await this.queryOpenRouterMultiTurn(session.conversationHistory, apiKey, model, siteUrl, appName);

          let tokensUsed = 0;
          if (summaryResponse.content) {
            // Add response to conversation history
            session.conversationHistory.push({ role: 'assistant', content: summaryResponse.content });

            const tokensUsed = summaryResponse.tokensUsed || 0;
            tokensUsed = summaryResponse.tokensUsed || 0;
            session.cumulativeInputTokens += Math.floor(tokensUsed * 0.7);
            session.cumulativeOutputTokens += Math.floor(tokensUsed * 0.3);
          }

          // Process response (even if empty) - empty responses will have no observations/summaries
          // but messages still need to be marked complete atomically
          await this.processOpenRouterResponse(session, summaryResponse.content || '', worker, tokensUsed, originalTimestamp);
          // Process response using shared ResponseProcessor
          await processAgentResponse(
            summaryResponse.content || '',
            session,
            this.dbManager,
            this.sessionManager,
            worker,
            tokensUsed,
            originalTimestamp,
            'OpenRouter'
          );
        }
      }

@@ -215,35 +224,26 @@ export class OpenRouterAgent {
        model
      });

    } catch (error: any) {
      if (error.name === 'AbortError') {
    } catch (error: unknown) {
      if (isAbortError(error)) {
        logger.warn('SDK', 'OpenRouter agent aborted', { sessionId: session.sessionDbId });
        throw error;
      }

      // Check if we should fall back to Claude
      if (this.shouldFallbackToClaude(error) && this.fallbackAgent) {
      if (shouldFallbackToClaude(error) && this.fallbackAgent) {
        logger.warn('SDK', 'OpenRouter API failed, falling back to Claude SDK', {
          sessionDbId: session.sessionDbId,
          error: error.message,
          error: error instanceof Error ? error.message : String(error),
          historyLength: session.conversationHistory.length
        });

        // Reset any 'processing' messages back to 'pending' so Claude can retry them
        const pendingStore = this.sessionManager.getPendingMessageStore();
        const resetCount = pendingStore.resetStuckMessages(0); // 0 = reset ALL processing messages
        if (resetCount > 0) {
          logger.info('SDK', 'Reset processing messages for fallback', {
            sessionDbId: session.sessionDbId,
            resetCount
          });
        }

        // Fall back to Claude - it will use the same session with shared conversationHistory
        // Note: With claim-and-delete queue pattern, messages are already deleted on claim
        return this.fallbackAgent.startSession(session, worker);
      }

      logger.failure('SDK', 'OpenRouter agent error', { sessionDbId: session.sessionDbId }, error);
      logger.failure('SDK', 'OpenRouter agent error', { sessionDbId: session.sessionDbId }, error as Error);
      throw error;
    }
  }
@@ -260,9 +260,7 @@ export class OpenRouterAgent {
   * Keeps most recent messages within token budget
   */
  private truncateHistory(history: ConversationMessage[]): ConversationMessage[] {
    const settings = SettingsDefaultsManager.loadFromFile(
      USER_SETTINGS_PATH
    );
    const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);

    const MAX_CONTEXT_MESSAGES = parseInt(settings.CLAUDE_MEM_OPENROUTER_MAX_CONTEXT_MESSAGES) || DEFAULT_MAX_CONTEXT_MESSAGES;
    const MAX_ESTIMATED_TOKENS = parseInt(settings.CLAUDE_MEM_OPENROUTER_MAX_TOKENS) || DEFAULT_MAX_ESTIMATED_TOKENS;
@@ -399,166 +397,6 @@ export class OpenRouterAgent {
    return { content, tokensUsed };
  }

  /**
   * Process OpenRouter response (same format as Claude/Gemini)
   * @param originalTimestamp - Original epoch when message was queued (for backlog processing accuracy)
   */
  private async processOpenRouterResponse(
    session: ActiveSession,
    text: string,
    worker: any | undefined,
    discoveryTokens: number,
    originalTimestamp: number | null
  ): Promise<void> {
    // Parse observations and summary
    const observations = parseObservations(text, session.contentSessionId);
    const summary = parseSummary(text, session.sessionDbId);

    // Convert nullable fields to empty strings for storeSummary (if summary exists)
    const summaryForStore = summary ? {
      request: summary.request || '',
      investigated: summary.investigated || '',
      learned: summary.learned || '',
      completed: summary.completed || '',
      next_steps: summary.next_steps || '',
      notes: summary.notes
    } : null;

    // Get the pending message ID(s) for this response
    const pendingMessageStore = this.sessionManager.getPendingMessageStore();
    const sessionStore = this.dbManager.getSessionStore();

    if (session.pendingProcessingIds.size > 0) {
      // ATOMIC TRANSACTION: Store observations + summary + mark message(s) complete
      for (const messageId of session.pendingProcessingIds) {
        // CRITICAL: Must use memorySessionId (not contentSessionId) for FK constraint
        if (!session.memorySessionId) {
          throw new Error('Cannot store observations: memorySessionId not yet captured');
        }

        const result = sessionStore.storeObservationsAndMarkComplete(
          session.memorySessionId,
          session.project,
          observations,
          summaryForStore,
          messageId,
          pendingMessageStore,
          session.lastPromptNumber,
          discoveryTokens,
          originalTimestamp ?? undefined
        );

        logger.info('SDK', 'OpenRouter observations and summary saved atomically', {
          sessionId: session.sessionDbId,
          messageId,
          observationCount: result.observationIds.length,
          hasSummary: !!result.summaryId,
          atomicTransaction: true
        });

        // AFTER transaction commits - async operations (can fail safely)
        for (let i = 0; i < observations.length; i++) {
          const obsId = result.observationIds[i];
          const obs = observations[i];

          this.dbManager.getChromaSync().syncObservation(
            obsId,
            session.contentSessionId,
            session.project,
            obs,
            session.lastPromptNumber,
            result.createdAtEpoch,
            discoveryTokens
          ).catch(err => {
            logger.warn('SDK', 'OpenRouter chroma sync failed', { obsId }, err);
          });

          // Broadcast to SSE clients
          if (worker && worker.sseBroadcaster) {
            worker.sseBroadcaster.broadcast({
              type: 'new_observation',
              observation: {
                id: obsId,
                memory_session_id: session.memorySessionId,
                session_id: session.contentSessionId,
                type: obs.type,
                title: obs.title,
                subtitle: obs.subtitle,
                text: null,
                narrative: obs.narrative || null,
                facts: JSON.stringify(obs.facts || []),
                concepts: JSON.stringify(obs.concepts || []),
                files_read: JSON.stringify(obs.files_read || []),
                files_modified: JSON.stringify(obs.files_modified || []),
                project: session.project,
                prompt_number: session.lastPromptNumber,
                created_at_epoch: result.createdAtEpoch
              }
            });
          }
        }

        // Sync summary to Chroma (if present)
        if (summaryForStore && result.summaryId) {
          this.dbManager.getChromaSync().syncSummary(
            result.summaryId,
            session.contentSessionId,
            session.project,
            summaryForStore,
            session.lastPromptNumber,
            result.createdAtEpoch,
            discoveryTokens
          ).catch(err => {
            logger.warn('SDK', 'OpenRouter chroma sync failed', { summaryId: result.summaryId }, err);
          });

          // Broadcast to SSE clients
          if (worker && worker.sseBroadcaster) {
            worker.sseBroadcaster.broadcast({
              type: 'new_summary',
              summary: {
                id: result.summaryId,
                session_id: session.contentSessionId,
                request: summary!.request,
                investigated: summary!.investigated,
                learned: summary!.learned,
                completed: summary!.completed,
                next_steps: summary!.next_steps,
                notes: summary!.notes,
                project: session.project,
                prompt_number: session.lastPromptNumber,
                created_at_epoch: result.createdAtEpoch
              }
            });
          }

          // Update Cursor context file for registered projects (fire-and-forget)
          updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
            logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
          });
        }
      }

      // Clear the processed message IDs
      session.pendingProcessingIds.clear();
      session.earliestPendingTimestamp = null;

      // Clean up old processed messages
      const deletedCount = pendingMessageStore.cleanupProcessed(100);
      if (deletedCount > 0) {
        logger.debug('SDK', 'Cleaned up old processed messages', { deletedCount });
      }

      // Broadcast activity status after processing
      if (worker && typeof worker.broadcastProcessingStatus === 'function') {
        worker.broadcastProcessingStatus();
      }
    }
  }

  // REMOVED: markMessagesProcessed() - replaced by atomic transaction in processOpenRouterResponse()
  // Messages are now marked complete atomically with observation storage to prevent duplicates

  /**
   * Get OpenRouter configuration from settings or environment
   */

@@ -14,14 +14,12 @@ import path from 'path';
import { DatabaseManager } from './DatabaseManager.js';
import { SessionManager } from './SessionManager.js';
import { logger } from '../../utils/logger.js';
import { parseObservations, parseSummary } from '../../sdk/parser.js';
import { buildInitPrompt, buildObservationPrompt, buildSummaryPrompt, buildContinuationPrompt } from '../../sdk/prompts.js';
import { SettingsDefaultsManager } from '../../shared/SettingsDefaultsManager.js';
import { USER_SETTINGS_PATH } from '../../shared/paths.js';
import type { ActiveSession, SDKUserMessage, PendingMessage } from '../worker-types.js';
import type { ActiveSession, SDKUserMessage } from '../worker-types.js';
import { ModeManager } from '../domain/ModeManager.js';
import { updateCursorContextForProject } from '../worker-service.js';
import { getWorkerPort } from '../../shared/worker-utils.js';
import { processAgentResponse, type WorkerRef } from './agents/index.js';

// Import Agent SDK (assumes it's installed)
// @ts-ignore - Agent SDK types may not be available
@@ -40,161 +38,163 @@ export class SDKAgent {
   * Start SDK agent for a session (event-driven, no polling)
   * @param worker WorkerService reference for spinner control (optional)
   */
  async startSession(session: ActiveSession, worker?: any): Promise<void> {
  async startSession(session: ActiveSession, worker?: WorkerRef): Promise<void> {
    // Find Claude executable
    const claudePath = this.findClaudeExecutable();

    // Get model ID and disallowed tools
    const modelId = this.getModelId();
    // Memory agent is OBSERVER ONLY - no tools allowed
    const disallowedTools = [
      'Bash',           // Prevent infinite loops
      'Read',           // No file reading
      'Write',          // No file writing
      'Edit',           // No file editing
      'Grep',           // No code searching
      'Glob',           // No file pattern matching
      'WebFetch',       // No web fetching
      'WebSearch',      // No web searching
      'Task',           // No spawning sub-agents
      'NotebookEdit',   // No notebook editing
      'AskUserQuestion',// No asking questions
      'TodoWrite'       // No todo management
    ];

    // Create message generator (event-driven)
    const messageGenerator = this.createMessageGenerator(session);

    // CRITICAL: Only resume if memorySessionId exists (was captured from a previous SDK response).
    // memorySessionId starts as NULL and is captured on first SDK message.
    // NEVER use contentSessionId for resume - that would inject messages into the user's transcript!
    const hasRealMemorySessionId = !!session.memorySessionId;

    // Find Claude executable
    const claudePath = this.findClaudeExecutable();

    // Get model ID and disallowed tools
    const modelId = this.getModelId();
    // Memory agent is OBSERVER ONLY - no tools allowed
    const disallowedTools = [
      'Bash',           // Prevent infinite loops
      'Read',           // No file reading
      'Write',          // No file writing
      'Edit',           // No file editing
      'Grep',           // No code searching
      'Glob',           // No file pattern matching
      'WebFetch',       // No web fetching
      'WebSearch',      // No web searching
      'Task',           // No spawning sub-agents
      'NotebookEdit',   // No notebook editing
      'AskUserQuestion',// No asking questions
      'TodoWrite'       // No todo management
    ];

    // Create message generator (event-driven)
    const messageGenerator = this.createMessageGenerator(session);

    // CRITICAL: Only resume if memorySessionId exists (was captured from a previous SDK response).
    // memorySessionId starts as NULL and is captured on first SDK message.
    // NEVER use contentSessionId for resume - that would inject messages into the user's transcript!
    const hasRealMemorySessionId = !!session.memorySessionId;

    logger.info('SDK', 'Starting SDK query', {
      sessionDbId: session.sessionDbId,
      contentSessionId: session.contentSessionId,
      memorySessionId: session.memorySessionId,
      hasRealMemorySessionId,
      resume_parameter: hasRealMemorySessionId ? session.memorySessionId : '(none - fresh start)',
      lastPromptNumber: session.lastPromptNumber
    });

    // SESSION ALIGNMENT LOG: Resume decision proof - show if we're resuming with correct memorySessionId
    if (session.lastPromptNumber > 1) {
      logger.info('SDK', `[ALIGNMENT] Resume Decision | contentSessionId=${session.contentSessionId} | memorySessionId=${session.memorySessionId} | prompt#=${session.lastPromptNumber} | hasRealMemorySessionId=${hasRealMemorySessionId} | resumeWith=${hasRealMemorySessionId ? session.memorySessionId : 'NONE (fresh SDK session)'}`);
    } else {
      logger.info('SDK', `[ALIGNMENT] First Prompt | contentSessionId=${session.contentSessionId} | prompt#=${session.lastPromptNumber} | Will capture memorySessionId from first SDK response`);
    }

    // Run Agent SDK query loop
    // Only resume if we have a captured memory session ID
    const queryResult = query({
      prompt: messageGenerator,
      options: {
        model: modelId,
        // Resume with captured memorySessionId (null on first prompt, real ID on subsequent)
        ...(hasRealMemorySessionId && { resume: session.memorySessionId }),
        disallowedTools,
        abortController: session.abortController,
        pathToClaudeCodeExecutable: claudePath
      }
    });

    // Process SDK messages
    for await (const message of queryResult) {
      // Capture memory session ID from first SDK message (any type has session_id)
      // This enables resume for subsequent generator starts within the same user session
      if (!session.memorySessionId && message.session_id) {
        session.memorySessionId = message.session_id;
        // Persist to database for cross-restart recovery
        this.dbManager.getSessionStore().updateMemorySessionId(
          session.sessionDbId,
          message.session_id
        );
        logger.info('SDK', 'Captured memory session ID', {
          sessionDbId: session.sessionDbId,
          memorySessionId: message.session_id
        });
        // SESSION ALIGNMENT LOG: Memory session ID captured - now contentSessionId→memorySessionId mapping is complete
        logger.info('SDK', `[ALIGNMENT] Captured | contentSessionId=${session.contentSessionId} → memorySessionId=${message.session_id} | Future prompts will resume with this ID`);
      }

      // Handle assistant messages
      if (message.type === 'assistant') {
        const content = message.message.content;
        const textContent = Array.isArray(content)
          ? content.filter((c: any) => c.type === 'text').map((c: any) => c.text).join('\n')
          : typeof content === 'string' ? content : '';

        const responseSize = textContent.length;

        // Capture token state BEFORE updating (for delta calculation)
        const tokensBeforeResponse = session.cumulativeInputTokens + session.cumulativeOutputTokens;

        // Extract and track token usage
        const usage = message.message.usage;
        if (usage) {
          session.cumulativeInputTokens += usage.input_tokens || 0;
          session.cumulativeOutputTokens += usage.output_tokens || 0;

          // Cache creation counts as discovery, cache read doesn't
          if (usage.cache_creation_input_tokens) {
            session.cumulativeInputTokens += usage.cache_creation_input_tokens;
          }

          logger.debug('SDK', 'Token usage captured', {
            sessionId: session.sessionDbId,
            inputTokens: usage.input_tokens,
outputTokens: usage.output_tokens,
|
||||
cacheCreation: usage.cache_creation_input_tokens || 0,
|
||||
cacheRead: usage.cache_read_input_tokens || 0,
|
||||
cumulativeInput: session.cumulativeInputTokens,
|
||||
cumulativeOutput: session.cumulativeOutputTokens
|
||||
});
|
||||
}
|
||||
|
||||
// Calculate discovery tokens (delta for this response only)
|
||||
const discoveryTokens = (session.cumulativeInputTokens + session.cumulativeOutputTokens) - tokensBeforeResponse;
|
||||
|
||||
// Process response (empty or not) and mark messages as processed
|
||||
// Capture earliest timestamp BEFORE processing (will be cleared after)
|
||||
const originalTimestamp = session.earliestPendingTimestamp;
|
||||
|
||||
if (responseSize > 0) {
|
||||
const truncatedResponse = responseSize > 100
|
||||
? textContent.substring(0, 100) + '...'
|
||||
: textContent;
|
||||
logger.dataOut('SDK', `Response received (${responseSize} chars)`, {
|
||||
sessionId: session.sessionDbId,
|
||||
promptNumber: session.lastPromptNumber
|
||||
}, truncatedResponse);
|
||||
}
|
||||
|
||||
// Parse and process response (even if empty) with discovery token delta and original timestamp
|
||||
// Empty responses will result in empty observations array and null summary
|
||||
await this.processSDKResponse(session, textContent, worker, discoveryTokens, originalTimestamp);
|
||||
}
|
||||
|
||||
// Log result messages
|
||||
if (message.type === 'result' && message.subtype === 'success') {
|
||||
// Usage telemetry is captured at SDK level
|
||||
}
|
||||
}
|
||||
|
||||
// Mark session complete
|
||||
const sessionDuration = Date.now() - session.startTime;
|
||||
logger.success('SDK', 'Agent completed', {
|
||||
sessionId: session.sessionDbId,
|
||||
duration: `${(sessionDuration / 1000).toFixed(1)}s`
|
||||
});
|
||||
|
||||
logger.info('SDK', 'Starting SDK query', {
|
||||
sessionDbId: session.sessionDbId,
|
||||
contentSessionId: session.contentSessionId,
|
||||
memorySessionId: session.memorySessionId,
|
||||
hasRealMemorySessionId,
|
||||
resume_parameter: hasRealMemorySessionId ? session.memorySessionId : '(none - fresh start)',
|
||||
lastPromptNumber: session.lastPromptNumber
|
||||
});
|
||||
|
||||
// SESSION ALIGNMENT LOG: Resume decision proof - show if we're resuming with correct memorySessionId
|
||||
if (session.lastPromptNumber > 1) {
|
||||
logger.info('SDK', `[ALIGNMENT] Resume Decision | contentSessionId=${session.contentSessionId} | memorySessionId=${session.memorySessionId} | prompt#=${session.lastPromptNumber} | hasRealMemorySessionId=${hasRealMemorySessionId} | resumeWith=${hasRealMemorySessionId ? session.memorySessionId : 'NONE (fresh SDK session)'}`);
|
||||
} else {
|
||||
logger.info('SDK', `[ALIGNMENT] First Prompt | contentSessionId=${session.contentSessionId} | prompt#=${session.lastPromptNumber} | Will capture memorySessionId from first SDK response`);
|
||||
}
|
||||
|
||||
// Run Agent SDK query loop
|
||||
// Only resume if we have a captured memory session ID
|
||||
const queryResult = query({
|
||||
prompt: messageGenerator,
|
||||
options: {
|
||||
model: modelId,
|
||||
// Resume with captured memorySessionId (null on first prompt, real ID on subsequent)
|
||||
...(hasRealMemorySessionId && { resume: session.memorySessionId }),
|
||||
disallowedTools,
|
||||
abortController: session.abortController,
|
||||
pathToClaudeCodeExecutable: claudePath
|
||||
}
|
||||
});
|
||||
|
||||
// Process SDK messages
|
||||
for await (const message of queryResult) {
|
||||
// Capture memory session ID from first SDK message (any type has session_id)
|
||||
// This enables resume for subsequent generator starts within the same user session
|
||||
if (!session.memorySessionId && message.session_id) {
|
||||
session.memorySessionId = message.session_id;
|
||||
// Persist to database for cross-restart recovery
|
||||
this.dbManager.getSessionStore().updateMemorySessionId(
|
||||
session.sessionDbId,
|
||||
message.session_id
|
||||
);
|
||||
logger.info('SDK', 'Captured memory session ID', {
|
||||
sessionDbId: session.sessionDbId,
|
||||
memorySessionId: message.session_id
|
||||
});
|
||||
// SESSION ALIGNMENT LOG: Memory session ID captured - now contentSessionId→memorySessionId mapping is complete
|
||||
logger.info('SDK', `[ALIGNMENT] Captured | contentSessionId=${session.contentSessionId} → memorySessionId=${message.session_id} | Future prompts will resume with this ID`);
|
||||
}
|
||||
|
||||
// Handle assistant messages
|
||||
if (message.type === 'assistant') {
|
||||
const content = message.message.content;
|
||||
const textContent = Array.isArray(content)
|
||||
? content.filter((c: any) => c.type === 'text').map((c: any) => c.text).join('\n')
|
||||
: typeof content === 'string' ? content : '';
|
||||
|
||||
const responseSize = textContent.length;
|
||||
|
||||
// Capture token state BEFORE updating (for delta calculation)
|
||||
const tokensBeforeResponse = session.cumulativeInputTokens + session.cumulativeOutputTokens;
|
||||
|
||||
// Extract and track token usage
|
||||
const usage = message.message.usage;
|
||||
if (usage) {
|
||||
session.cumulativeInputTokens += usage.input_tokens || 0;
|
||||
session.cumulativeOutputTokens += usage.output_tokens || 0;
|
||||
|
||||
// Cache creation counts as discovery, cache read doesn't
|
||||
if (usage.cache_creation_input_tokens) {
|
||||
session.cumulativeInputTokens += usage.cache_creation_input_tokens;
|
||||
}
|
||||
|
||||
logger.debug('SDK', 'Token usage captured', {
|
||||
sessionId: session.sessionDbId,
|
||||
inputTokens: usage.input_tokens,
|
||||
outputTokens: usage.output_tokens,
|
||||
cacheCreation: usage.cache_creation_input_tokens || 0,
|
||||
cacheRead: usage.cache_read_input_tokens || 0,
|
||||
cumulativeInput: session.cumulativeInputTokens,
|
||||
cumulativeOutput: session.cumulativeOutputTokens
|
||||
});
|
||||
}
|
||||
|
||||
// Calculate discovery tokens (delta for this response only)
|
||||
const discoveryTokens = (session.cumulativeInputTokens + session.cumulativeOutputTokens) - tokensBeforeResponse;
|
||||
|
||||
// Process response (empty or not) and mark messages as processed
|
||||
// Capture earliest timestamp BEFORE processing (will be cleared after)
|
||||
const originalTimestamp = session.earliestPendingTimestamp;
|
||||
|
||||
if (responseSize > 0) {
|
||||
const truncatedResponse = responseSize > 100
|
||||
? textContent.substring(0, 100) + '...'
|
||||
: textContent;
|
||||
logger.dataOut('SDK', `Response received (${responseSize} chars)`, {
|
||||
sessionId: session.sessionDbId,
|
||||
promptNumber: session.lastPromptNumber
|
||||
}, truncatedResponse);
|
||||
}
|
||||
|
||||
// Parse and process response using shared ResponseProcessor
|
||||
await processAgentResponse(
|
||||
textContent,
|
||||
session,
|
||||
this.dbManager,
|
||||
this.sessionManager,
|
||||
worker,
|
||||
discoveryTokens,
|
||||
originalTimestamp,
|
||||
'SDK'
|
||||
);
|
||||
}
|
||||
|
||||
// Log result messages
|
||||
if (message.type === 'result' && message.subtype === 'success') {
|
||||
// Usage telemetry is captured at SDK level
|
||||
}
|
||||
}
|
||||
|
||||
// Mark session complete
|
||||
const sessionDuration = Date.now() - session.startTime;
|
||||
logger.success('SDK', 'Agent completed', {
|
||||
sessionId: session.sessionDbId,
|
||||
duration: `${(sessionDuration / 1000).toFixed(1)}s`
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Create event-driven message generator (yields messages from SessionManager)
|
||||
*
|
||||
@@ -315,190 +315,6 @@ export class SDKAgent {
|
||||
}
|
||||
}
|
||||
|
||||
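The capture-then-resume contract above can be reduced to a small invariant: never resume with the user's `contentSessionId`, resume only with a `memorySessionId` captured from the SDK itself, and capture it exactly once. A minimal sketch, using hypothetical names (`MemorySession`, `resumeOptions`, `captureSessionId`) that do not exist in the codebase:

```typescript
// Hypothetical minimal model of the capture-then-resume contract.
type MemorySession = { memorySessionId: string | null };

// Build the SDK resume options for a turn. A null memorySessionId means
// "fresh start" - never fall back to the user's contentSessionId.
function resumeOptions(session: MemorySession): { resume?: string } {
  return session.memorySessionId ? { resume: session.memorySessionId } : {};
}

// Capture the SDK's session_id from the first message that carries one;
// later messages never overwrite it.
function captureSessionId(session: MemorySession, messageSessionId?: string): void {
  if (!session.memorySessionId && messageSessionId) {
    session.memorySessionId = messageSessionId;
  }
}

const session: MemorySession = { memorySessionId: null };
const first = resumeOptions(session);  // {} - fresh start on the first prompt
captureSessionId(session, 'sdk-abc');
const later = resumeOptions(session);  // { resume: 'sdk-abc' } on later prompts
```

The fix described in the commit message amounts to making the "fresh start" branch the only behavior when nothing has been captured yet, instead of seeding `memorySessionId` with a placeholder.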
/**
 * Process SDK response text (parse XML, save to database, sync to Chroma)
 * @param discoveryTokens - Token cost for discovering this response (delta, not cumulative)
 * @param originalTimestamp - Original epoch when message was queued (for backlog processing accuracy)
 *
 * Also captures assistant responses to shared conversation history for provider interop.
 * This allows Gemini to see full context if the provider is switched mid-session.
 *
 * CRITICAL: Uses an atomic transaction to prevent observation duplication on crash recovery.
 */
private async processSDKResponse(session: ActiveSession, text: string, worker: any | undefined, discoveryTokens: number, originalTimestamp: number | null): Promise<void> {
  // Add assistant response to shared conversation history for provider interop
  if (text) {
    session.conversationHistory.push({ role: 'assistant', content: text });
  }

  // Parse observations and summary
  const observations = parseObservations(text, session.contentSessionId);
  const summary = parseSummary(text, session.sessionDbId);

  // Get the pending message ID(s) for this response.
  // In normal operation this is ONE message (FIFO processing),
  // but we handle multiple for safety (in case the SDK batches messages).
  const pendingMessageStore = this.sessionManager.getPendingMessageStore();
  const sessionStore = this.dbManager.getSessionStore();

  if (session.pendingProcessingIds.size > 0) {
    // ATOMIC TRANSACTION: store observations + summary + mark message(s) complete.
    // This prevents duplicates if the worker crashes after storing but before marking complete.
    for (const messageId of session.pendingProcessingIds) {
      // CRITICAL: Must use memorySessionId (not contentSessionId) for the FK constraint
      if (!session.memorySessionId) {
        throw new Error('Cannot store observations: memorySessionId not yet captured');
      }

      const result = sessionStore.storeObservationsAndMarkComplete(
        session.memorySessionId,
        session.project,
        observations,
        summary || null,
        messageId,
        pendingMessageStore,
        session.lastPromptNumber,
        discoveryTokens,
        originalTimestamp ?? undefined
      );

      // Log what was saved
      logger.info('SDK', 'Observations and summary saved atomically', {
        sessionId: session.sessionDbId,
        messageId,
        observationCount: result.observationIds.length,
        hasSummary: !!result.summaryId,
        atomicTransaction: true
      });

      // AFTER the transaction commits - async operations (can fail safely without data loss)
      // Sync observations to Chroma
      for (let i = 0; i < observations.length; i++) {
        const obsId = result.observationIds[i];
        const obs = observations[i];
        const chromaStart = Date.now();

        this.dbManager.getChromaSync().syncObservation(
          obsId,
          session.contentSessionId,
          session.project,
          obs,
          session.lastPromptNumber,
          result.createdAtEpoch,
          discoveryTokens
        ).then(() => {
          const chromaDuration = Date.now() - chromaStart;
          logger.debug('CHROMA', 'Observation synced', {
            obsId,
            duration: `${chromaDuration}ms`,
            type: obs.type,
            title: obs.title || '(untitled)'
          });
        }).catch((error) => {
          logger.warn('CHROMA', 'Observation sync failed, continuing without vector search', {
            obsId,
            type: obs.type,
            title: obs.title || '(untitled)'
          }, error);
        });

        // Broadcast to SSE clients (for the web UI)
        if (worker && worker.sseBroadcaster) {
          worker.sseBroadcaster.broadcast({
            type: 'new_observation',
            observation: {
              id: obsId,
              memory_session_id: session.memorySessionId,
              session_id: session.contentSessionId,
              type: obs.type,
              title: obs.title,
              subtitle: obs.subtitle,
              text: obs.text || null,
              narrative: obs.narrative || null,
              facts: JSON.stringify(obs.facts || []),
              concepts: JSON.stringify(obs.concepts || []),
              files_read: JSON.stringify(obs.files || []),
              files_modified: JSON.stringify([]),
              project: session.project,
              prompt_number: session.lastPromptNumber,
              created_at_epoch: result.createdAtEpoch
            }
          });
        }
      }

      // Sync summary to Chroma (if present)
      if (summary && result.summaryId) {
        const chromaStart = Date.now();
        this.dbManager.getChromaSync().syncSummary(
          result.summaryId,
          session.contentSessionId,
          session.project,
          summary,
          session.lastPromptNumber,
          result.createdAtEpoch,
          discoveryTokens
        ).then(() => {
          const chromaDuration = Date.now() - chromaStart;
          logger.debug('CHROMA', 'Summary synced', {
            summaryId: result.summaryId,
            duration: `${chromaDuration}ms`,
            request: summary.request || '(no request)'
          });
        }).catch((error) => {
          logger.warn('CHROMA', 'Summary sync failed, continuing without vector search', {
            summaryId: result.summaryId,
            request: summary.request || '(no request)'
          }, error);
        });

        // Broadcast to SSE clients (for the web UI)
        if (worker && worker.sseBroadcaster) {
          worker.sseBroadcaster.broadcast({
            type: 'new_summary',
            summary: {
              id: result.summaryId,
              session_id: session.contentSessionId,
              request: summary.request,
              investigated: summary.investigated,
              learned: summary.learned,
              completed: summary.completed,
              next_steps: summary.next_steps,
              notes: summary.notes,
              project: session.project,
              prompt_number: session.lastPromptNumber,
              created_at_epoch: result.createdAtEpoch
            }
          });
        }

        // Update Cursor context file for registered projects (fire-and-forget)
        updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
          logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
        });
      }
    }

    // Clear the processed message IDs
    session.pendingProcessingIds.clear();
    session.earliestPendingTimestamp = null;

    // Clean up old processed messages (keep the last 100 for UI display)
    const deletedCount = pendingMessageStore.cleanupProcessed(100);
    if (deletedCount > 0) {
      logger.debug('SDK', 'Cleaned up old processed messages', { deletedCount });
    }

    // Broadcast activity status after processing (the queue may have changed)
    if (worker && typeof worker.broadcastProcessingStatus === 'function') {
      worker.broadcastProcessingStatus();
    }
  }
}

// REMOVED: markMessagesProcessed() - replaced by the atomic transaction in processSDKResponse().
// Messages are now marked complete atomically with observation storage to prevent duplicates.
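The atomicity invariant that `storeObservationsAndMarkComplete` provides - observation rows and the message's "complete" flag change together or not at all - can be illustrated with a toy in-memory store. All names below (`FakeStore`, `atomically`, `storeAndMarkComplete`) are hypothetical stand-ins, not the real SQLite-backed implementation:

```typescript
// Toy store demonstrating the crash-recovery invariant: a failure
// mid-write must leave neither the observations nor the done flag behind.
class FakeStore {
  observations: string[] = [];
  done = new Set<number>();

  // Run fn atomically: on any throw, roll both mutations back.
  atomically(fn: () => void): void {
    const obsSnapshot = [...this.observations];
    const doneSnapshot = new Set(this.done);
    try {
      fn();
    } catch (err) {
      this.observations = obsSnapshot;
      this.done = doneSnapshot;
      throw err;
    }
  }

  storeAndMarkComplete(obs: string[], messageId: number): void {
    this.atomically(() => {
      this.observations.push(...obs);
      if (messageId < 0) throw new Error('simulated crash mid-write');
      this.done.add(messageId);
    });
  }
}

const store = new FakeStore();
store.storeAndMarkComplete(['obs-1'], 1); // both effects committed
try {
  store.storeAndMarkComplete(['obs-2'], -1); // fails after the obs write...
} catch {
  // ...and the rollback discards the partial write.
}
```

Without this pairing, a crash between "store observations" and "mark message complete" would cause the message to be reprocessed on restart, duplicating its observations - exactly the failure mode the comment above calls out.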
// ============================================================================
// Configuration Helpers
// ============================================================================
@@ -508,7 +324,7 @@ export class SDKAgent {
 */
private findClaudeExecutable(): string {
  const settings = SettingsDefaultsManager.loadFromFile(USER_SETTINGS_PATH);

  // 1. Check configured path
  if (settings.CLAUDE_CODE_PATH) {
    // Lazy load fs to keep startup fast
@@ -522,10 +338,10 @@ export class SDKAgent {
  // 2. Try auto-detection
  try {
    const claudePath = execSync(
      process.platform === 'win32' ? 'where claude' : 'which claude',
      { encoding: 'utf8', windowsHide: true, stdio: ['ignore', 'pipe', 'ignore'] }
    ).trim().split('\n')[0].trim();

    if (claudePath) return claudePath;
  } catch (error) {
    // [ANTI-PATTERN IGNORED]: Fallback behavior - which/where failed, continue so a clear error can be thrown
src/services/worker/Search.ts (new file, 7 lines)
@@ -0,0 +1,7 @@
/**
 * Search.ts - Named re-export facade for the search module
 *
 * Provides a clean import path for the search module.
 */

export * from './search/index.js';
@@ -1,9 +1,16 @@
/**
 * SearchManager - Core search orchestration for claude-mem
 *
 * This class is a thin wrapper that delegates to the modular search infrastructure.
 * It maintains the same public interface for backward compatibility.
 *
 * The actual search logic now lives in:
 * - SearchOrchestrator: strategy selection and coordination
 * - ChromaSearchStrategy: vector-based semantic search
 * - SQLiteSearchStrategy: filter-only queries
 * - HybridSearchStrategy: metadata filtering + semantic ranking
 * - ResultFormatter: output formatting
 * - TimelineBuilder: timeline construction
 */
import { basename } from 'path';
@@ -17,21 +24,36 @@ import { logger } from '../../utils/logger.js';
import { formatDate, formatTime, formatDateTime, extractFirstFile, groupByDate, estimateTokens } from '../../shared/timeline-formatting.js';
import { ModeManager } from '../domain/ModeManager.js';

import {
  SearchOrchestrator,
  TimelineBuilder,
  TimelineData,
  SEARCH_CONSTANTS
} from './search/index.js';

export class SearchManager {
  private orchestrator: SearchOrchestrator;
  private timelineBuilder: TimelineBuilder;

  constructor(
    private sessionSearch: SessionSearch,
    private sessionStore: SessionStore,
    private chromaSync: ChromaSync,
    private formatter: FormattingService,
    private timelineService: TimelineService
  ) {
    // Initialize the new modular search infrastructure
    this.orchestrator = new SearchOrchestrator(
      sessionSearch,
      sessionStore,
      chromaSync
    );
    this.timelineBuilder = new TimelineBuilder();
  }
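The pattern the constructor above sets up - a class that keeps its old public surface but forwards everything to a new orchestrator - can be reduced to a few lines. A sketch with illustrative stand-in types (`Orchestrator`, `StubOrchestrator`, `Facade` are not the real classes):

```typescript
// Illustrative stand-ins for SearchOrchestrator and SearchManager.
interface Orchestrator {
  search(query: string): string[];
}

class StubOrchestrator implements Orchestrator {
  search(query: string): string[] {
    return [`result for ${query}`];
  }
}

// The facade keeps the public method callers already depend on,
// but owns no search logic itself - the body is pure delegation.
class Facade {
  private orchestrator: Orchestrator;

  constructor(orchestrator: Orchestrator = new StubOrchestrator()) {
    this.orchestrator = orchestrator;
  }

  search(query: string): string[] {
    return this.orchestrator.search(query);
  }
}

const facade = new Facade();
const hits = facade.search('sqlite'); // delegated to the orchestrator
```

Keeping the facade's signatures unchanged is what lets the MCP server and other callers stay untouched while the strategy classes evolve behind it.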
  /**
   * Query Chroma vector database via ChromaSync
   * @deprecated Use orchestrator.search() instead
   */
  private async queryChroma(
    query: string,
@@ -71,8 +93,8 @@ export class SearchManager {
    // Flatten dateStart/dateEnd into a dateRange object
    if (normalized.dateStart || normalized.dateEnd) {
      normalized.dateRange = {
        start: normalized.dateStart,
        end: normalized.dateEnd
      };
      delete normalized.dateStart;
      delete normalized.dateEnd;
@@ -104,13 +126,13 @@ export class SearchManager {
      logger.debug('SEARCH', 'Filter-only query (no query text), using direct SQLite filtering', { enablesDateFilters: true });
      const obsOptions = { ...options, type: obs_type, concepts, files };
      if (searchObservations) {
        observations = this.sessionSearch.searchObservations(undefined, obsOptions);
      }
      if (searchSessions) {
        sessions = this.sessionSearch.searchSessions(undefined, options);
      }
      if (searchPrompts) {
        prompts = this.sessionSearch.searchUserPrompts(undefined, options);
      }
    }
    // PATH 2: CHROMA SEMANTIC SEARCH (query text + Chroma available)
@@ -121,11 +143,11 @@ export class SearchManager {
      // Build Chroma where-filter for doc_type
      let whereFilter: Record<string, any> | undefined;
      if (type === 'observations') {
        whereFilter = { doc_type: 'observation' };
      } else if (type === 'sessions') {
        whereFilter = { doc_type: 'session_summary' };
      } else if (type === 'prompts') {
        whereFilter = { doc_type: 'user_prompt' };
      }

      // Step 1: Chroma semantic search with optional type filter
@@ -134,51 +156,51 @@ export class SearchManager {
      logger.debug('SEARCH', 'ChromaDB returned semantic matches', { matchCount: chromaResults.ids.length });

      if (chromaResults.ids.length > 0) {
        // Step 2: Filter by recency (90 days)
        const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
        const recentMetadata = chromaResults.metadatas.map((meta, idx) => ({
          id: chromaResults.ids[idx],
          meta,
          isRecent: meta && meta.created_at_epoch > ninetyDaysAgo
        })).filter(item => item.isRecent);

        logger.debug('SEARCH', 'Results within 90-day window', { count: recentMetadata.length });

        // Step 3: Categorize IDs by document type
        const obsIds: number[] = [];
        const sessionIds: number[] = [];
        const promptIds: number[] = [];

        for (const item of recentMetadata) {
          const docType = item.meta?.doc_type;
          if (docType === 'observation' && searchObservations) {
            obsIds.push(item.id);
          } else if (docType === 'session_summary' && searchSessions) {
            sessionIds.push(item.id);
          } else if (docType === 'user_prompt' && searchPrompts) {
            promptIds.push(item.id);
          }
        }

        logger.debug('SEARCH', 'Categorized results by type', { observations: obsIds.length, sessions: sessionIds.length, prompts: promptIds.length });

        // Step 4: Hydrate from SQLite with additional filters
        if (obsIds.length > 0) {
          // Apply obs_type, concepts, and files filters if provided
          const obsOptions = { ...options, type: obs_type, concepts, files };
          observations = this.sessionStore.getObservationsByIds(obsIds, obsOptions);
        }
        if (sessionIds.length > 0) {
          sessions = this.sessionStore.getSessionSummariesByIds(sessionIds, { orderBy: 'date_desc', limit: options.limit, project: options.project });
        }
        if (promptIds.length > 0) {
          prompts = this.sessionStore.getUserPromptsByIds(promptIds, { orderBy: 'date_desc', limit: options.limit, project: options.project });
        }

        logger.debug('SEARCH', 'Hydrated results from SQLite', { observations: observations.length, sessions: sessions.length, prompts: prompts.length });
      } else {
        // Chroma returned 0 results - that is the correct answer; do not fall back to FTS5
        logger.debug('SEARCH', 'ChromaDB found no matches (final result, no FTS5 fallback)', {});
      }
    }
    // ChromaDB not initialized - mark as failed to show a proper error message
@@ -196,28 +218,28 @@ export class SearchManager {
    // JSON format: return raw data for programmatic access (e.g., export scripts)
    if (format === 'json') {
      return {
        observations,
        sessions,
        prompts,
        totalResults,
        query: query || ''
      };
    }
    if (totalResults === 0) {
      if (chromaFailed) {
        return {
          content: [{
            type: 'text' as const,
            text: `Vector search failed - semantic search unavailable.\n\nTo enable semantic search:\n1. Install uv: https://docs.astral.sh/uv/getting-started/installation/\n2. Restart the worker: npm run worker:restart\n\nNote: You can still use filter-only searches (date ranges, types, files) without a query term.`
          }]
        };
      }
      return {
        content: [{
          type: 'text' as const,
          text: `No results found matching "${query}"`
        }]
      };
    }

@@ -231,22 +253,22 @@ export class SearchManager {
    const allResults: CombinedResult[] = [
      ...observations.map(obs => ({
        type: 'observation' as const,
        data: obs,
        epoch: obs.created_at_epoch,
        created_at: obs.created_at
      })),
      ...sessions.map(sess => ({
        type: 'session' as const,
        data: sess,
        epoch: sess.created_at_epoch,
        created_at: sess.created_at
      })),
      ...prompts.map(prompt => ({
        type: 'prompt' as const,
        data: prompt,
        epoch: prompt.created_at_epoch,
        created_at: prompt.created_at
      }))
    ];
@@ -276,46 +298,46 @@ export class SearchManager
      // Group by file within this day
      const resultsByFile = new Map<string, CombinedResult[]>();
      for (const result of dayResults) {
        let file = 'General';
        if (result.type === 'observation') {
          file = extractFirstFile(result.data.files_modified, cwd);
        }
        if (!resultsByFile.has(file)) {
          resultsByFile.set(file, []);
        }
        resultsByFile.get(file)!.push(result);
      }

      // Render each file section
      for (const [file, fileResults] of resultsByFile) {
        lines.push(`**${file}**`);
        lines.push(this.formatter.formatSearchTableHeader());

        let lastTime = '';
        for (const result of fileResults) {
          if (result.type === 'observation') {
            const formatted = this.formatter.formatObservationSearchRow(result.data as ObservationSearchResult, lastTime);
            lines.push(formatted.row);
            lastTime = formatted.time;
          } else if (result.type === 'session') {
            const formatted = this.formatter.formatSessionSearchRow(result.data as SessionSummarySearchResult, lastTime);
            lines.push(formatted.row);
            lastTime = formatted.time;
          } else {
            const formatted = this.formatter.formatUserPromptSearchRow(result.data as UserPromptSearchResult, lastTime);
            lines.push(formatted.row);
            lastTime = formatted.time;
          }
        }

        lines.push('');
      }
    }

    return {
      content: [{
        type: 'text' as const,
        text: lines.join('\n')
      }]
    };
  }
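The grouping step above can be sketched in isolation. `CombinedResult` is simplified here to just the fields the grouping touches (the real type carries the full search payload, and `extractFirstFile` is replaced by a plain field lookup):

```typescript
// Simplified stand-in for the real CombinedResult search type.
type CombinedResult = { type: 'observation' | 'session' | 'prompt'; file?: string };

// Bucket results by file: observations keyed by their first modified file,
// everything else collected under a shared 'General' bucket.
function groupByFile(dayResults: CombinedResult[]): Map<string, CombinedResult[]> {
  const resultsByFile = new Map<string, CombinedResult[]>();
  for (const result of dayResults) {
    const file = result.type === 'observation' && result.file ? result.file : 'General';
    if (!resultsByFile.has(file)) {
      resultsByFile.set(file, []);
    }
    resultsByFile.get(file)!.push(result);
  }
  return resultsByFile;
}
```

The `get(file)!` non-null assertion is safe because the preceding `has` check guarantees the bucket exists, matching the pattern in the diff.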
@@ -364,7 +386,7 @@ export class SearchManager
      logger.debug('SEARCH', 'Chroma returned semantic matches for timeline', { matchCount: chromaResults?.ids?.length ?? 0 });

      if (chromaResults?.ids && chromaResults.ids.length > 0) {
-       const ninetyDaysAgo = Date.now() - RECENCY_WINDOW_MS;
+       const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
        const recentIds = chromaResults.ids.filter((_id, idx) => {
          const meta = chromaResults.metadatas[idx];
          return meta && meta.created_at_epoch > ninetyDaysAgo;
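The recency filter this hunk centralizes can be sketched as follows. The 90-day value of `SEARCH_CONSTANTS.RECENCY_WINDOW_MS` is an assumption based on the "(90 days)" comments elsewhere in the diff, and the metadata shape is trimmed to the one field the filter reads:

```typescript
// Assumed value; the diff only shows the constant being moved into SEARCH_CONSTANTS.
const SEARCH_CONSTANTS = { RECENCY_WINDOW_MS: 90 * 24 * 60 * 60 * 1000 };

// Keep only ids whose parallel metadata entry exists and is newer than the cutoff.
function filterRecent(
  ids: string[],
  metadatas: Array<{ created_at_epoch: number } | null>,
  now: number
): string[] {
  const cutoff = now - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
  return ids.filter((_id, idx) => {
    const meta = metadatas[idx];
    return meta !== null && meta.created_at_epoch > cutoff;
  });
}
```

Grouping the window into a named constants object is what lets the five duplicated `ninetyDaysAgo` sites in this diff share one definition.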
@@ -488,7 +510,7 @@ export class SearchManager
        lines.push(`# Timeline around anchor: ${anchorId}`);
      }

-     lines.push(`**Window:** ${depth_before} records before → ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
+     lines.push(`**Window:** ${depth_before} records before -> ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
      lines.push('');

@@ -534,9 +556,9 @@ export class SearchManager

          const sess = item.data as SessionSummarySearchResult;
          const title = sess.request || 'Session summary';
-         const marker = isAnchor ? ' ← **ANCHOR**' : '';
+         const marker = isAnchor ? ' <- **ANCHOR**' : '';

-         lines.push(`**🎯 #S${sess.id}** ${title} (${formatDateTime(item.epoch)})${marker}`);
+         lines.push(`**\uD83C\uDFAF #S${sess.id}** ${title} (${formatDateTime(item.epoch)})${marker}`);
          lines.push('');
        } else if (item.type === 'prompt') {
          if (tableOpen) {
@@ -549,7 +571,7 @@ export class SearchManager
          const prompt = item.data as UserPromptSearchResult;
          const truncated = prompt.prompt_text.length > 100 ? prompt.prompt_text.substring(0, 100) + '...' : prompt.prompt_text;

-         lines.push(`**💬 User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
+         lines.push(`**\uD83D\uDCAC User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
          lines.push(`> ${truncated}`);
          lines.push('');
        } else if (item.type === 'observation') {
@@ -577,10 +599,10 @@ export class SearchManager
          const tokens = estimateTokens(obs.narrative);

          const showTime = time !== lastTime;
-         const timeDisplay = showTime ? time : '″';
+         const timeDisplay = showTime ? time : '"';
          lastTime = time;

-         const anchorMarker = isAnchor ? ' ← **ANCHOR**' : '';
+         const anchorMarker = isAnchor ? ' <- **ANCHOR**' : '';
          lines.push(`| #${obs.id} | ${timeDisplay} | ${icon} | ${title}${anchorMarker} | ~${tokens} |`);
        }
      }
@@ -592,8 +614,8 @@ export class SearchManager

    return {
      content: [{
        type: 'text' as const,
        text: lines.join('\n')
      }]
    };
  }
@@ -830,7 +852,7 @@ export class SearchManager

    if (chromaResults.ids.length > 0) {
      // Step 2: Filter by recency (90 days)
-     const ninetyDaysAgo = Date.now() - RECENCY_WINDOW_MS;
+     const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
      const recentIds = chromaResults.ids.filter((_id, idx) => {
        const meta = chromaResults.metadatas[idx];
        return meta && meta.created_at_epoch > ninetyDaysAgo;
@@ -887,7 +909,7 @@ export class SearchManager

    if (chromaResults.ids.length > 0) {
      // Step 2: Filter by recency (90 days)
-     const ninetyDaysAgo = Date.now() - RECENCY_WINDOW_MS;
+     const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
      const recentIds = chromaResults.ids.filter((_id, idx) => {
        const meta = chromaResults.metadatas[idx];
        return meta && meta.created_at_epoch > ninetyDaysAgo;
@@ -944,7 +966,7 @@ export class SearchManager

    if (chromaResults.ids.length > 0) {
      // Step 2: Filter by recency (90 days)
-     const ninetyDaysAgo = Date.now() - RECENCY_WINDOW_MS;
+     const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
      const recentIds = chromaResults.ids.filter((_id, idx) => {
        const meta = chromaResults.metadatas[idx];
        return meta && meta.created_at_epoch > ninetyDaysAgo;
@@ -1425,7 +1447,7 @@ export class SearchManager

    // Header
    lines.push(`# Timeline around anchor: ${anchorId}`);
-   lines.push(`**Window:** ${depth_before} records before → ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
+   lines.push(`**Window:** ${depth_before} records before -> ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
    lines.push('');

@@ -1473,9 +1495,9 @@ export class SearchManager {
|
||||
// Render session
|
||||
const sess = item.data as SessionSummarySearchResult;
|
||||
const title = sess.request || 'Session summary';
|
||||
const marker = isAnchor ? ' ← **ANCHOR**' : '';
|
||||
const marker = isAnchor ? ' <- **ANCHOR**' : '';
|
||||
|
||||
lines.push(`**🎯 #S${sess.id}** ${title} (${formatDateTime(item.epoch)})${marker}`);
|
||||
lines.push(`**\uD83C\uDFAF #S${sess.id}** ${title} (${formatDateTime(item.epoch)})${marker}`);
|
||||
lines.push('');
|
||||
} else if (item.type === 'prompt') {
|
||||
// Close any open table
|
||||
@@ -1490,7 +1512,7 @@ export class SearchManager
          const prompt = item.data as UserPromptSearchResult;
          const truncated = prompt.prompt_text.length > 100 ? prompt.prompt_text.substring(0, 100) + '...' : prompt.prompt_text;

-         lines.push(`**💬 User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
+         lines.push(`**\uD83D\uDCAC User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
          lines.push(`> ${truncated}`);
          lines.push('');
        } else if (item.type === 'observation') {
@@ -1523,10 +1545,10 @@ export class SearchManager
          const tokens = estimateTokens(obs.narrative);

          const showTime = time !== lastTime;
-         const timeDisplay = showTime ? time : '″';
+         const timeDisplay = showTime ? time : '"';
          lastTime = time;

-         const anchorMarker = isAnchor ? ' ← **ANCHOR**' : '';
+         const anchorMarker = isAnchor ? ' <- **ANCHOR**' : '';
          lines.push(`| #${obs.id} | ${timeDisplay} | ${icon} | ${title}${anchorMarker} | ~${tokens} |`);
        }
      }
@@ -1563,7 +1585,7 @@ export class SearchManager

    if (chromaResults.ids.length > 0) {
      // Filter by recency (90 days)
-     const ninetyDaysAgo = Date.now() - RECENCY_WINDOW_MS;
+     const ninetyDaysAgo = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
      const recentIds = chromaResults.ids.filter((_id, idx) => {
        const meta = chromaResults.metadatas[idx];
        return meta && meta.created_at_epoch > ninetyDaysAgo;
@@ -1659,7 +1681,7 @@ export class SearchManager
    // Header
    lines.push(`# Timeline for query: "${query}"`);
    lines.push(`**Anchor:** Observation #${topResult.id} - ${topResult.title || 'Untitled'}`);
-   lines.push(`**Window:** ${depth_before} records before → ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
+   lines.push(`**Window:** ${depth_before} records before -> ${depth_after} records after | **Items:** ${filteredItems?.length ?? 0}`);
    lines.push('');

@@ -1705,7 +1727,7 @@ export class SearchManager
          const sess = item.data as SessionSummarySearchResult;
          const title = sess.request || 'Session summary';

-         lines.push(`**🎯 #S${sess.id}** ${title} (${formatDateTime(item.epoch)})`);
+         lines.push(`**\uD83C\uDFAF #S${sess.id}** ${title} (${formatDateTime(item.epoch)})`);
          lines.push('');
        } else if (item.type === 'prompt') {
          // Close any open table
@@ -1720,7 +1742,7 @@ export class SearchManager
          const prompt = item.data as UserPromptSearchResult;
          const truncated = prompt.prompt_text.length > 100 ? prompt.prompt_text.substring(0, 100) + '...' : prompt.prompt_text;

-         lines.push(`**💬 User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
+         lines.push(`**\uD83D\uDCAC User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
          lines.push(`> ${truncated}`);
          lines.push('');
        } else if (item.type === 'observation') {
@@ -1753,10 +1775,10 @@ export class SearchManager
          const tokens = estimateTokens(obs.narrative);

          const showTime = time !== lastTime;
-         const timeDisplay = showTime ? time : '″';
+         const timeDisplay = showTime ? time : '"';
          lastTime = time;

-         const anchorMarker = isAnchor ? ' ← **ANCHOR**' : '';
+         const anchorMarker = isAnchor ? ' <- **ANCHOR**' : '';
          lines.push(`| #${obs.id} | ${timeDisplay} | ${icon} | ${title}${anchorMarker} | ~${tokens} |`);
        }
      }

@@ -137,7 +137,6 @@ export class SessionManager
      startTime: Date.now(),
      cumulativeInputTokens: 0,
      cumulativeOutputTokens: 0,
-     pendingProcessingIds: new Set(),
      earliestPendingTimestamp: null,
      conversationHistory: [], // Initialize empty - will be populated by agents
      currentProvider: null // Will be set when generator starts
@@ -379,12 +378,9 @@ export class SessionManager
    }

    const processor = new SessionQueueProcessor(this.getPendingStore(), emitter);

-   // Use the robust Pump iterator
+   // Use the robust iterator - messages are deleted on claim (no tracking needed)
    for await (const message of processor.createIterator(sessionDbId, session.abortController.signal)) {
-     // Track this message ID for completion marking
-     session.pendingProcessingIds.add(message._persistentId);
-
      // Track earliest timestamp for accurate observation timestamps
      // This ensures backlog messages get their original timestamps, not current time
      if (session.earliestPendingTimestamp === null) {
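The claim-and-delete pattern this hunk moves to can be sketched with a toy queue. `ClaimQueue` is a hypothetical stand-in for `SessionQueueProcessor`, not the real implementation; the point is that removing a message at claim time makes separate `pendingProcessingIds` bookkeeping and completion marking unnecessary:

```typescript
// Toy claim-and-delete queue: claiming a message removes it immediately,
// so there is no "in flight" set to reconcile after processing.
class ClaimQueue<T> {
  private items: T[] = [];

  enqueue(item: T): void {
    this.items.push(item);
  }

  // Delete-on-claim: the caller owns the message from here on.
  claim(): T | undefined {
    return this.items.shift();
  }

  get size(): number {
    return this.items.length;
  }
}
```

The trade-off (acknowledged by the real refactor's reliance on original timestamps) is that a crash after claim loses the message instead of re-processing it, in exchange for much simpler state.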
src/services/worker/agents/FallbackErrorHandler.ts (new file, 73 lines)
@@ -0,0 +1,73 @@
/**
 * FallbackErrorHandler: Error detection for provider fallback
 *
 * Responsibility:
 * - Determine if an error should trigger fallback to Claude SDK
 * - Provide consistent error classification across Gemini and OpenRouter
 */

import { FALLBACK_ERROR_PATTERNS } from './types.js';

/**
 * Check if an error should trigger fallback to Claude SDK
 *
 * Errors that trigger fallback:
 * - 429: Rate limit exceeded
 * - 500/502/503: Server errors
 * - ECONNREFUSED: Connection refused (server down)
 * - ETIMEDOUT: Request timeout
 * - fetch failed: Network failure
 *
 * @param error - Error object to check
 * @returns true if the error should trigger fallback to Claude
 */
export function shouldFallbackToClaude(error: unknown): boolean {
  const message = getErrorMessage(error);

  return FALLBACK_ERROR_PATTERNS.some(pattern => message.includes(pattern));
}

/**
 * Extract error message from various error types
 */
function getErrorMessage(error: unknown): string {
  if (error === null || error === undefined) {
    return '';
  }

  if (typeof error === 'string') {
    return error;
  }

  if (error instanceof Error) {
    return error.message;
  }

  if (typeof error === 'object' && 'message' in error) {
    return String((error as { message: unknown }).message);
  }

  return String(error);
}

/**
 * Check if error is an AbortError (user cancelled)
 *
 * @param error - Error object to check
 * @returns true if this is an abort/cancellation error
 */
export function isAbortError(error: unknown): boolean {
  if (error === null || error === undefined) {
    return false;
  }

  if (error instanceof Error && error.name === 'AbortError') {
    return true;
  }

  if (typeof error === 'object' && 'name' in error) {
    return (error as { name: unknown }).name === 'AbortError';
  }

  return false;
}
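A self-contained sketch of the classification behavior, with the pattern list copied from `FALLBACK_ERROR_PATTERNS` in types.ts and the message extraction reduced to its essentials:

```typescript
// Pattern list mirrors FALLBACK_ERROR_PATTERNS from the diff.
const PATTERNS = ['429', '500', '502', '503', 'ECONNREFUSED', 'ETIMEDOUT', 'fetch failed'];

// Condensed version of getErrorMessage: Error, string, or best-effort String().
function messageOf(error: unknown): string {
  if (error instanceof Error) return error.message;
  return typeof error === 'string' ? error : String(error ?? '');
}

// Substring match against the pattern list, as shouldFallbackToClaude does.
function shouldFallback(error: unknown): boolean {
  const message = messageOf(error);
  return PATTERNS.some(p => message.includes(p));
}
```

Note the design choice the module encodes: only transient infrastructure failures (rate limits, 5xx, network errors) trigger fallback; application errors such as bad credentials do not, so they surface to the caller instead of silently switching providers.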
src/services/worker/agents/ObservationBroadcaster.ts (new file, 54 lines)
@@ -0,0 +1,54 @@
/**
 * ObservationBroadcaster: SSE broadcasting for observations and summaries
 *
 * Responsibility:
 * - Broadcast new observations to SSE clients
 * - Broadcast new summaries to SSE clients
 * - Handle worker reference safely (null checks)
 *
 * BUGFIX: This module fixes the incorrect field names in SDKAgent:
 * - SDKAgent used `obs.files` which doesn't exist - should be `obs.files_read`
 * - SDKAgent used hardcoded `files_modified: JSON.stringify([])` - should use `obs.files_modified`
 */

import type { WorkerRef, ObservationSSEPayload, SummarySSEPayload } from './types.js';

/**
 * Broadcast a new observation to SSE clients
 *
 * @param worker - Worker reference with SSE broadcaster (can be undefined)
 * @param payload - Observation data to broadcast
 */
export function broadcastObservation(
  worker: WorkerRef | undefined,
  payload: ObservationSSEPayload
): void {
  if (!worker?.sseBroadcaster) {
    return;
  }

  worker.sseBroadcaster.broadcast({
    type: 'new_observation',
    observation: payload
  });
}

/**
 * Broadcast a new summary to SSE clients
 *
 * @param worker - Worker reference with SSE broadcaster (can be undefined)
 * @param payload - Summary data to broadcast
 */
export function broadcastSummary(
  worker: WorkerRef | undefined,
  payload: SummarySSEPayload
): void {
  if (!worker?.sseBroadcaster) {
    return;
  }

  worker.sseBroadcaster.broadcast({
    type: 'new_summary',
    summary: payload
  });
}
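The null-safe no-op behavior can be sketched with a trimmed `WorkerRef`. The boolean return is added here purely so the sketch is observable in a test (the real broadcast functions return `void`):

```typescript
// Trimmed WorkerRef: just the optional broadcaster the null check guards.
interface WorkerRefSketch {
  sseBroadcaster?: { broadcast(event: { type: string }): void };
}

// When no worker (or no sseBroadcaster) is attached, the call is a silent no-op.
function broadcast(worker: WorkerRefSketch | undefined, event: { type: string }): boolean {
  if (!worker?.sseBroadcaster) return false; // no clients attached; skip safely
  worker.sseBroadcaster.broadcast(event);
  return true;
}
```

This is what lets agents run without a full worker context (e.g. in tests), as the `WorkerRef` doc comment in types.ts says.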
src/services/worker/agents/ResponseProcessor.ts (new file, 270 lines)
@@ -0,0 +1,270 @@
/**
 * ResponseProcessor: Shared response processing for all agent implementations
 *
 * Responsibility:
 * - Parse observations and summaries from agent responses
 * - Execute atomic database transactions
 * - Orchestrate Chroma sync (fire-and-forget)
 * - Broadcast to SSE clients
 * - Clean up processed messages
 *
 * This module extracts 150+ lines of duplicate code from SDKAgent, GeminiAgent, and OpenRouterAgent.
 */

import { logger } from '../../../utils/logger.js';
import { parseObservations, parseSummary, type ParsedObservation, type ParsedSummary } from '../../../sdk/parser.js';
import { updateCursorContextForProject } from '../../worker-service.js';
import { getWorkerPort } from '../../../shared/worker-utils.js';
import type { ActiveSession } from '../../worker-types.js';
import type { DatabaseManager } from '../DatabaseManager.js';
import type { SessionManager } from '../SessionManager.js';
import type { WorkerRef, StorageResult } from './types.js';
import { broadcastObservation, broadcastSummary } from './ObservationBroadcaster.js';
import { cleanupProcessedMessages } from './SessionCleanupHelper.js';

/**
 * Process agent response text (parse XML, save to database, sync to Chroma, broadcast SSE)
 *
 * This is the unified response processor that handles:
 * 1. Adding response to conversation history (for provider interop)
 * 2. Parsing observations and summaries from XML
 * 3. Atomic database transaction to store observations + summary
 * 4. Async Chroma sync (fire-and-forget, failures are non-critical)
 * 5. SSE broadcast to web UI clients
 * 6. Session cleanup
 *
 * @param text - Response text from the agent
 * @param session - Active session being processed
 * @param dbManager - Database manager for storage operations
 * @param sessionManager - Session manager for message tracking
 * @param worker - Worker reference for SSE broadcasting (optional)
 * @param discoveryTokens - Token cost delta for this response
 * @param originalTimestamp - Original epoch when message was queued (for accurate timestamps)
 * @param agentName - Name of the agent for logging (e.g., 'SDK', 'Gemini', 'OpenRouter')
 */
export async function processAgentResponse(
  text: string,
  session: ActiveSession,
  dbManager: DatabaseManager,
  sessionManager: SessionManager,
  worker: WorkerRef | undefined,
  discoveryTokens: number,
  originalTimestamp: number | null,
  agentName: string
): Promise<void> {
  // Add assistant response to shared conversation history for provider interop
  if (text) {
    session.conversationHistory.push({ role: 'assistant', content: text });
  }

  // Parse observations and summary
  const observations = parseObservations(text, session.contentSessionId);
  const summary = parseSummary(text, session.sessionDbId);

  // Convert nullable fields to empty strings for storeSummary (if summary exists)
  const summaryForStore = normalizeSummaryForStorage(summary);

  // Get session store for atomic transaction
  const sessionStore = dbManager.getSessionStore();

  // CRITICAL: Must use memorySessionId (not contentSessionId) for FK constraint
  if (!session.memorySessionId) {
    throw new Error('Cannot store observations: memorySessionId not yet captured');
  }

  // ATOMIC TRANSACTION: Store observations + summary ONCE
  // Messages are already deleted from queue on claim, so no completion tracking needed
  const result = sessionStore.storeObservations(
    session.memorySessionId,
    session.project,
    observations,
    summaryForStore,
    session.lastPromptNumber,
    discoveryTokens,
    originalTimestamp ?? undefined
  );

  // Log what was saved
  logger.info('SDK', `${agentName} observations and summary saved atomically`, {
    sessionId: session.sessionDbId,
    observationCount: result.observationIds.length,
    hasSummary: !!result.summaryId,
    atomicTransaction: true
  });

  // AFTER transaction commits - async operations (can fail safely without data loss)
  await syncAndBroadcastObservations(
    observations,
    result,
    session,
    dbManager,
    worker,
    discoveryTokens,
    agentName
  );

  // Sync and broadcast summary if present
  await syncAndBroadcastSummary(
    summary,
    summaryForStore,
    result,
    session,
    dbManager,
    worker,
    discoveryTokens,
    agentName
  );

  // Clean up session state
  cleanupProcessedMessages(session, worker);
}

/**
 * Normalize summary for storage (convert null fields to empty strings)
 */
function normalizeSummaryForStorage(summary: ParsedSummary | null): {
  request: string;
  investigated: string;
  learned: string;
  completed: string;
  next_steps: string;
  notes: string | null;
} | null {
  if (!summary) return null;

  return {
    request: summary.request || '',
    investigated: summary.investigated || '',
    learned: summary.learned || '',
    completed: summary.completed || '',
    next_steps: summary.next_steps || '',
    notes: summary.notes
  };
}

/**
 * Sync observations to Chroma and broadcast to SSE clients
 */
async function syncAndBroadcastObservations(
  observations: ParsedObservation[],
  result: StorageResult,
  session: ActiveSession,
  dbManager: DatabaseManager,
  worker: WorkerRef | undefined,
  discoveryTokens: number,
  agentName: string
): Promise<void> {
  for (let i = 0; i < observations.length; i++) {
    const obsId = result.observationIds[i];
    const obs = observations[i];
    const chromaStart = Date.now();

    // Sync to Chroma (fire-and-forget)
    dbManager.getChromaSync().syncObservation(
      obsId,
      session.contentSessionId,
      session.project,
      obs,
      session.lastPromptNumber,
      result.createdAtEpoch,
      discoveryTokens
    ).then(() => {
      const chromaDuration = Date.now() - chromaStart;
      logger.debug('CHROMA', 'Observation synced', {
        obsId,
        duration: `${chromaDuration}ms`,
        type: obs.type,
        title: obs.title || '(untitled)'
      });
    }).catch((error) => {
      logger.warn('CHROMA', `${agentName} chroma sync failed, continuing without vector search`, {
        obsId,
        type: obs.type,
        title: obs.title || '(untitled)'
      }, error);
    });

    // Broadcast to SSE clients (for web UI)
    // BUGFIX: Use obs.files_read and obs.files_modified (not obs.files)
    broadcastObservation(worker, {
      id: obsId,
      memory_session_id: session.memorySessionId,
      session_id: session.contentSessionId,
      type: obs.type,
      title: obs.title,
      subtitle: obs.subtitle,
      text: null, // text field is not in ParsedObservation
      narrative: obs.narrative || null,
      facts: JSON.stringify(obs.facts || []),
      concepts: JSON.stringify(obs.concepts || []),
      files_read: JSON.stringify(obs.files_read || []),
      files_modified: JSON.stringify(obs.files_modified || []),
      project: session.project,
      prompt_number: session.lastPromptNumber,
      created_at_epoch: result.createdAtEpoch
    });
  }
}

/**
 * Sync summary to Chroma and broadcast to SSE clients
 */
async function syncAndBroadcastSummary(
  summary: ParsedSummary | null,
  summaryForStore: { request: string; investigated: string; learned: string; completed: string; next_steps: string; notes: string | null } | null,
  result: StorageResult,
  session: ActiveSession,
  dbManager: DatabaseManager,
  worker: WorkerRef | undefined,
  discoveryTokens: number,
  agentName: string
): Promise<void> {
  if (!summaryForStore || !result.summaryId) {
    return;
  }

  const chromaStart = Date.now();

  // Sync to Chroma (fire-and-forget)
  dbManager.getChromaSync().syncSummary(
    result.summaryId,
    session.contentSessionId,
    session.project,
    summaryForStore,
    session.lastPromptNumber,
    result.createdAtEpoch,
    discoveryTokens
  ).then(() => {
    const chromaDuration = Date.now() - chromaStart;
    logger.debug('CHROMA', 'Summary synced', {
      summaryId: result.summaryId,
      duration: `${chromaDuration}ms`,
      request: summaryForStore.request || '(no request)'
    });
  }).catch((error) => {
    logger.warn('CHROMA', `${agentName} chroma sync failed, continuing without vector search`, {
      summaryId: result.summaryId,
      request: summaryForStore.request || '(no request)'
    }, error);
  });

  // Broadcast to SSE clients (for web UI)
  broadcastSummary(worker, {
    id: result.summaryId,
    session_id: session.contentSessionId,
    request: summary!.request,
    investigated: summary!.investigated,
    learned: summary!.learned,
    completed: summary!.completed,
    next_steps: summary!.next_steps,
    notes: summary!.notes,
    project: session.project,
    prompt_number: session.lastPromptNumber,
    created_at_epoch: result.createdAtEpoch
  });

  // Update Cursor context file for registered projects (fire-and-forget)
  updateCursorContextForProject(session.project, getWorkerPort()).catch(error => {
    logger.warn('CURSOR', 'Context update failed (non-critical)', { project: session.project }, error as Error);
  });
}
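The `memorySessionId` guard at the heart of this module's FK handling (and of the corruption bug the PR fixes) can be isolated as a tiny hypothetical helper, `requireMemorySessionId`, which is not in the diff but captures the check `processAgentResponse` performs before storing:

```typescript
// Observations must be stored against the memory agent's own session id,
// never the user's content session id; a NULL memorySessionId means the
// first SDK response has not been captured yet, so storage must fail loudly.
function requireMemorySessionId(session: { memorySessionId: string | null }): string {
  if (!session.memorySessionId) {
    throw new Error('Cannot store observations: memorySessionId not yet captured');
  }
  return session.memorySessionId;
}
```

Throwing here (rather than falling back to `contentSessionId`, as the old placeholder initialization effectively did) is what prevents memory-agent messages from being injected into the user's transcript on resume.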
src/services/worker/agents/SessionCleanupHelper.ts (new file, 36 lines)
@@ -0,0 +1,36 @@
/**
 * SessionCleanupHelper: Session state cleanup after response processing
 *
 * Responsibility:
 * - Reset earliest pending timestamp
 * - Broadcast processing status updates
 *
 * NOTE: With claim-and-delete queue pattern, messages are deleted on claim,
 * so there's no pendingProcessingIds tracking or processed message cleanup.
 */

import type { ActiveSession } from '../../worker-types.js';
import type { WorkerRef } from './types.js';

/**
 * Clean up session state after response processing
 *
 * With claim-and-delete queue pattern, this function simply:
 * 1. Resets the earliest pending timestamp
 * 2. Broadcasts updated processing status to SSE clients
 *
 * @param session - Active session to clean up
 * @param worker - Worker reference for status broadcasting (optional)
 */
export function cleanupProcessedMessages(
  session: ActiveSession,
  worker: WorkerRef | undefined
): void {
  // Reset earliest pending timestamp for next batch
  session.earliestPendingTimestamp = null;

  // Broadcast activity status after processing (queue may have changed)
  if (worker && typeof worker.broadcastProcessingStatus === 'function') {
    worker.broadcastProcessingStatus();
  }
}
src/services/worker/agents/index.ts (new file, 38 lines)
@@ -0,0 +1,38 @@
/**
 * Agent Consolidation Module
 *
 * This module provides shared utilities for SDK, Gemini, and OpenRouter agents.
 * It extracts common patterns to reduce code duplication and ensure consistent behavior.
 *
 * Usage:
 * ```typescript
 * import { processAgentResponse, shouldFallbackToClaude } from './agents/index.js';
 * ```
 */

// Types
export type {
  WorkerRef,
  ObservationSSEPayload,
  SummarySSEPayload,
  SSEEventPayload,
  StorageResult,
  ResponseProcessingContext,
  ParsedResponse,
  FallbackAgent,
  BaseAgentConfig,
} from './types.js';

export { FALLBACK_ERROR_PATTERNS } from './types.js';

// Response Processing
export { processAgentResponse } from './ResponseProcessor.js';

// SSE Broadcasting
export { broadcastObservation, broadcastSummary } from './ObservationBroadcaster.js';

// Session Cleanup
export { cleanupProcessedMessages } from './SessionCleanupHelper.js';

// Error Handling
export { shouldFallbackToClaude, isAbortError } from './FallbackErrorHandler.js';
src/services/worker/agents/types.ts (new file, 133 lines)
@@ -0,0 +1,133 @@
|
||||
/**
|
||||
* Shared agent types for SDK, Gemini, and OpenRouter agents
|
||||
*
|
||||
* Responsibility:
|
||||
* - Define common interfaces used across all agent implementations
|
||||
* - Provide type safety for response processing and broadcasting
|
||||
*/
|
||||
|
||||
import type { ActiveSession } from '../../worker-types.js';
|
||||
import type { ParsedObservation, ParsedSummary } from '../../../sdk/parser.js';
|
||||
|
||||
// ============================================================================
|
||||
// Worker Reference Type
|
||||
// ============================================================================
|
||||
|
||||
/**
|
||||
* Worker reference for SSE broadcasting and status updates
|
||||
* Both sseBroadcaster and broadcastProcessingStatus are optional
|
||||
* to allow agents to run without a full worker context (e.g., testing)
|
||||
*/
|
||||
export interface WorkerRef {
|
||||
sseBroadcaster?: {
|
||||
broadcast(event: SSEEventPayload): void;
|
||||
};
|
||||
broadcastProcessingStatus?: () => void;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// SSE Event Payloads
|
||||
// ============================================================================

export interface ObservationSSEPayload {
  id: number;
  memory_session_id: string | null;
  session_id: string;
  type: string;
  title: string | null;
  subtitle: string | null;
  text: string | null;
  narrative: string | null;
  facts: string;          // JSON stringified
  concepts: string;       // JSON stringified
  files_read: string;     // JSON stringified
  files_modified: string; // JSON stringified
  project: string;
  prompt_number: number;
  created_at_epoch: number;
}

export interface SummarySSEPayload {
  id: number;
  session_id: string;
  request: string | null;
  investigated: string | null;
  learned: string | null;
  completed: string | null;
  next_steps: string | null;
  notes: string | null;
  project: string;
  prompt_number: number;
  created_at_epoch: number;
}

export type SSEEventPayload =
  | { type: 'new_observation'; observation: ObservationSSEPayload }
  | { type: 'new_summary'; summary: SummarySSEPayload };

// ============================================================================
// Response Processing Types
// ============================================================================

/**
 * Result from atomic database transaction for observations/summary storage
 */
export interface StorageResult {
  observationIds: number[];
  summaryId: number | null;
  createdAtEpoch: number;
}

/**
 * Context needed for response processing
 */
export interface ResponseProcessingContext {
  session: ActiveSession;
  worker: WorkerRef | undefined;
  discoveryTokens: number;
  originalTimestamp: number | null;
}

/**
 * Parsed response data ready for storage
 */
export interface ParsedResponse {
  observations: ParsedObservation[];
  summary: ParsedSummary | null;
}

// ============================================================================
// Fallback Agent Interface
// ============================================================================

/**
 * Interface for fallback agent (used by Gemini/OpenRouter to fall back to Claude)
 */
export interface FallbackAgent {
  startSession(session: ActiveSession, worker?: WorkerRef): Promise<void>;
}

// ============================================================================
// Agent Configuration Types
// ============================================================================

/**
 * Base configuration shared across all agents
 */
export interface BaseAgentConfig {
  dbManager: import('../DatabaseManager.js').DatabaseManager;
  sessionManager: import('../SessionManager.js').SessionManager;
}

/**
 * Error codes that should trigger fallback to Claude
 */
export const FALLBACK_ERROR_PATTERNS = [
  '429',          // Rate limit
  '500',          // Internal server error
  '502',          // Bad gateway
  '503',          // Service unavailable
  'ECONNREFUSED', // Connection refused
  'ETIMEDOUT',    // Timeout
  'fetch failed', // Network failure
] as const;
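A consumer of `FALLBACK_ERROR_PATTERNS` would typically substring-match error text against each entry. A minimal sketch of that check; the `shouldFallbackToClaude` helper is hypothetical (not part of this diff), and the constant is reproduced so the sketch is self-contained:

```typescript
// The pattern list, copied from the types module above.
const FALLBACK_ERROR_PATTERNS = [
  '429', '500', '502', '503', 'ECONNREFUSED', 'ETIMEDOUT', 'fetch failed',
] as const;

// Hypothetical helper: decide whether an error should trigger the Claude fallback
// by checking its message against every known transient-failure pattern.
function shouldFallbackToClaude(error: unknown): boolean {
  const message = error instanceof Error ? error.message : String(error);
  return FALLBACK_ERROR_PATTERNS.some((pattern) => message.includes(pattern));
}
```

Because the match is a plain `includes`, a message like `"request failed: 429 Too Many Requests"` triggers fallback while an authentication error does not.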
302
src/services/worker/search/ResultFormatter.ts
Normal file
@@ -0,0 +1,302 @@
/**
 * ResultFormatter - Formats search results for display
 *
 * Consolidates formatting logic from FormattingService and SearchManager.
 * Provides consistent table and text formatting for all search result types.
 */

import {
  ObservationSearchResult,
  SessionSummarySearchResult,
  UserPromptSearchResult,
  CombinedResult,
  SearchResults
} from './types.js';
import { ModeManager } from '../../domain/ModeManager.js';
import { formatTime, extractFirstFile, groupByDate, estimateTokens } from '../../../shared/timeline-formatting.js';

const CHARS_PER_TOKEN_ESTIMATE = 4;

export class ResultFormatter {
  /**
   * Format search results as markdown text
   */
  formatSearchResults(
    results: SearchResults,
    query: string,
    chromaFailed: boolean = false
  ): string {
    const totalResults = results.observations.length +
      results.sessions.length +
      results.prompts.length;

    if (totalResults === 0) {
      if (chromaFailed) {
        return this.formatChromaFailureMessage();
      }
      return `No results found matching "${query}"`;
    }

    // Combine all results with timestamps for unified sorting
    const combined = this.combineResults(results);

    // Sort by date
    combined.sort((a, b) => b.epoch - a.epoch);

    // Group by date, then by file within each day
    const cwd = process.cwd();
    const resultsByDate = groupByDate(combined, item => item.created_at);

    // Build output with date/file grouping
    const lines: string[] = [];
    lines.push(`Found ${totalResults} result(s) matching "${query}" (${results.observations.length} obs, ${results.sessions.length} sessions, ${results.prompts.length} prompts)`);
    lines.push('');

    for (const [day, dayResults] of resultsByDate) {
      lines.push(`### ${day}`);
      lines.push('');

      // Group by file within this day
      const resultsByFile = new Map<string, CombinedResult[]>();
      for (const result of dayResults) {
        let file = 'General';
        if (result.type === 'observation') {
          file = extractFirstFile(
            (result.data as ObservationSearchResult).files_modified,
            cwd
          );
        }
        if (!resultsByFile.has(file)) {
          resultsByFile.set(file, []);
        }
        resultsByFile.get(file)!.push(result);
      }

      // Render each file section
      for (const [file, fileResults] of resultsByFile) {
        lines.push(`**${file}**`);
        lines.push(this.formatSearchTableHeader());

        let lastTime = '';
        for (const result of fileResults) {
          if (result.type === 'observation') {
            const formatted = this.formatObservationSearchRow(
              result.data as ObservationSearchResult,
              lastTime
            );
            lines.push(formatted.row);
            lastTime = formatted.time;
          } else if (result.type === 'session') {
            const formatted = this.formatSessionSearchRow(
              result.data as SessionSummarySearchResult,
              lastTime
            );
            lines.push(formatted.row);
            lastTime = formatted.time;
          } else {
            const formatted = this.formatPromptSearchRow(
              result.data as UserPromptSearchResult,
              lastTime
            );
            lines.push(formatted.row);
            lastTime = formatted.time;
          }
        }

        lines.push('');
      }
    }

    return lines.join('\n');
  }

  /**
   * Combine results into unified format
   */
  combineResults(results: SearchResults): CombinedResult[] {
    return [
      ...results.observations.map(obs => ({
        type: 'observation' as const,
        data: obs,
        epoch: obs.created_at_epoch,
        created_at: obs.created_at
      })),
      ...results.sessions.map(sess => ({
        type: 'session' as const,
        data: sess,
        epoch: sess.created_at_epoch,
        created_at: sess.created_at
      })),
      ...results.prompts.map(prompt => ({
        type: 'prompt' as const,
        data: prompt,
        epoch: prompt.created_at_epoch,
        created_at: prompt.created_at
      }))
    ];
  }

  /**
   * Format search table header (no Work column)
   */
  formatSearchTableHeader(): string {
    return `| ID | Time | T | Title | Read |
|----|------|---|-------|------|`;
  }

  /**
   * Format full table header (with Work column)
   */
  formatTableHeader(): string {
    return `| ID | Time | T | Title | Read | Work |
|-----|------|---|-------|------|------|`;
  }

  /**
   * Format observation as table row for search results
   */
  formatObservationSearchRow(
    obs: ObservationSearchResult,
    lastTime: string
  ): { row: string; time: string } {
    const id = `#${obs.id}`;
    const time = formatTime(obs.created_at_epoch);
    const icon = ModeManager.getInstance().getTypeIcon(obs.type);
    const title = obs.title || 'Untitled';
    const readTokens = this.estimateReadTokens(obs);

    const timeDisplay = time === lastTime ? '"' : time;

    return {
      row: `| ${id} | ${timeDisplay} | ${icon} | ${title} | ~${readTokens} |`,
      time
    };
  }

  /**
   * Format session as table row for search results
   */
  formatSessionSearchRow(
    session: SessionSummarySearchResult,
    lastTime: string
  ): { row: string; time: string } {
    const id = `#S${session.id}`;
    const time = formatTime(session.created_at_epoch);
    const icon = '\uD83C\uDFAF'; // Target emoji
    const title = session.request ||
      `Session ${session.memory_session_id?.substring(0, 8) || 'unknown'}`;

    const timeDisplay = time === lastTime ? '"' : time;

    return {
      row: `| ${id} | ${timeDisplay} | ${icon} | ${title} | - |`,
      time
    };
  }

  /**
   * Format user prompt as table row for search results
   */
  formatPromptSearchRow(
    prompt: UserPromptSearchResult,
    lastTime: string
  ): { row: string; time: string } {
    const id = `#P${prompt.id}`;
    const time = formatTime(prompt.created_at_epoch);
    const icon = '\uD83D\uDCAC'; // Speech bubble emoji
    const title = prompt.prompt_text.length > 60
      ? prompt.prompt_text.substring(0, 57) + '...'
      : prompt.prompt_text;

    const timeDisplay = time === lastTime ? '"' : time;

    return {
      row: `| ${id} | ${timeDisplay} | ${icon} | ${title} | - |`,
      time
    };
  }

  /**
   * Format observation as index row (with Work column)
   */
  formatObservationIndex(obs: ObservationSearchResult, _index: number): string {
    const id = `#${obs.id}`;
    const time = formatTime(obs.created_at_epoch);
    const icon = ModeManager.getInstance().getTypeIcon(obs.type);
    const title = obs.title || 'Untitled';
    const readTokens = this.estimateReadTokens(obs);
    const workEmoji = ModeManager.getInstance().getWorkEmoji(obs.type);
    const workTokens = obs.discovery_tokens || 0;
    const workDisplay = workTokens > 0 ? `${workEmoji} ${workTokens}` : '-';

    return `| ${id} | ${time} | ${icon} | ${title} | ~${readTokens} | ${workDisplay} |`;
  }

  /**
   * Format session as index row
   */
  formatSessionIndex(session: SessionSummarySearchResult, _index: number): string {
    const id = `#S${session.id}`;
    const time = formatTime(session.created_at_epoch);
    const icon = '\uD83C\uDFAF';
    const title = session.request ||
      `Session ${session.memory_session_id?.substring(0, 8) || 'unknown'}`;

    return `| ${id} | ${time} | ${icon} | ${title} | - | - |`;
  }

  /**
   * Format user prompt as index row
   */
  formatPromptIndex(prompt: UserPromptSearchResult, _index: number): string {
    const id = `#P${prompt.id}`;
    const time = formatTime(prompt.created_at_epoch);
    const icon = '\uD83D\uDCAC';
    const title = prompt.prompt_text.length > 60
      ? prompt.prompt_text.substring(0, 57) + '...'
      : prompt.prompt_text;

    return `| ${id} | ${time} | ${icon} | ${title} | - | - |`;
  }

  /**
   * Estimate read tokens for an observation
   */
  private estimateReadTokens(obs: ObservationSearchResult): number {
    const size = (obs.title?.length || 0) +
      (obs.subtitle?.length || 0) +
      (obs.narrative?.length || 0) +
      (obs.facts?.length || 0);
    return Math.ceil(size / CHARS_PER_TOKEN_ESTIMATE);
  }

  /**
   * Format Chroma failure message
   */
  private formatChromaFailureMessage(): string {
    return `Vector search failed - semantic search unavailable.

To enable semantic search:
1. Install uv: https://docs.astral.sh/uv/getting-started/installation/
2. Restart the worker: npm run worker:restart

Note: You can still use filter-only searches (date ranges, types, files) without a query term.`;
  }

  /**
   * Format search tips footer
   */
  formatSearchTips(): string {
    return `
---
Search Strategy:
1. Search with index to see titles, dates, IDs
2. Use timeline to get context around interesting results
3. Batch fetch full details: get_observations(ids=[...])

Tips:
- Filter by type: obs_type="bugfix,feature"
- Filter by date: dateStart="2025-01-01"
- Sort: orderBy="date_desc" or "date_asc"`;
  }
}
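The `~${readTokens}` column in the tables above comes from `estimateReadTokens`, a characters-divided-by-4 heuristic. A standalone sketch of the same calculation (the field names mirror the observation shape, but the function here is a simplified stand-in, not the class method):

```typescript
const CHARS_PER_TOKEN_ESTIMATE = 4;

// Standalone version of the chars/4 read-token heuristic: sum the lengths of
// the displayable text fields and round up to whole tokens.
function estimateReadTokens(obs: {
  title?: string | null;
  subtitle?: string | null;
  narrative?: string | null;
  facts?: string | null;
}): number {
  const size = (obs.title?.length || 0) +
    (obs.subtitle?.length || 0) +
    (obs.narrative?.length || 0) +
    (obs.facts?.length || 0);
  return Math.ceil(size / CHARS_PER_TOKEN_ESTIMATE);
}
```

Rounding up means even a one-character observation costs at least one token in the display, which keeps the `~N` column from showing a misleading zero.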
287
src/services/worker/search/SearchOrchestrator.ts
Normal file
@@ -0,0 +1,287 @@
/**
 * SearchOrchestrator - Coordinates search strategies and handles fallback logic
 *
 * This is the main entry point for search operations. It:
 * 1. Normalizes input parameters
 * 2. Selects the appropriate strategy
 * 3. Executes the search
 * 4. Handles fallbacks on failure
 * 5. Delegates to formatters for output
 */

import { SessionSearch } from '../../sqlite/SessionSearch.js';
import { SessionStore } from '../../sqlite/SessionStore.js';
import { ChromaSync } from '../../sync/ChromaSync.js';

import { ChromaSearchStrategy } from './strategies/ChromaSearchStrategy.js';
import { SQLiteSearchStrategy } from './strategies/SQLiteSearchStrategy.js';
import { HybridSearchStrategy } from './strategies/HybridSearchStrategy.js';

import { ResultFormatter } from './ResultFormatter.js';
import { TimelineBuilder, TimelineItem, TimelineData } from './TimelineBuilder.js';

import {
  StrategySearchOptions,
  StrategySearchResult,
  SearchResults,
  SEARCH_CONSTANTS,
  ObservationSearchResult
} from './types.js';
import { logger } from '../../../utils/logger.js';

/**
 * Normalized parameters from URL-friendly format
 */
interface NormalizedParams extends StrategySearchOptions {
  concepts?: string[];
  files?: string[];
  obsType?: string[];
}

export class SearchOrchestrator {
  private chromaStrategy: ChromaSearchStrategy | null = null;
  private sqliteStrategy: SQLiteSearchStrategy;
  private hybridStrategy: HybridSearchStrategy | null = null;
  private resultFormatter: ResultFormatter;
  private timelineBuilder: TimelineBuilder;

  constructor(
    private sessionSearch: SessionSearch,
    private sessionStore: SessionStore,
    private chromaSync: ChromaSync | null
  ) {
    // Initialize strategies
    this.sqliteStrategy = new SQLiteSearchStrategy(sessionSearch);

    if (chromaSync) {
      this.chromaStrategy = new ChromaSearchStrategy(chromaSync, sessionStore);
      this.hybridStrategy = new HybridSearchStrategy(chromaSync, sessionStore, sessionSearch);
    }

    this.resultFormatter = new ResultFormatter();
    this.timelineBuilder = new TimelineBuilder();
  }

  /**
   * Main search entry point
   */
  async search(args: any): Promise<StrategySearchResult> {
    const options = this.normalizeParams(args);

    // Decision tree for strategy selection
    return await this.executeWithFallback(options);
  }

  /**
   * Execute search with fallback logic
   */
  private async executeWithFallback(
    options: NormalizedParams
  ): Promise<StrategySearchResult> {
    // PATH 1: FILTER-ONLY (no query text) - Use SQLite
    if (!options.query) {
      logger.debug('SEARCH', 'Orchestrator: Filter-only query, using SQLite', {});
      return await this.sqliteStrategy.search(options);
    }

    // PATH 2: CHROMA SEMANTIC SEARCH (query text + Chroma available)
    if (this.chromaStrategy) {
      logger.debug('SEARCH', 'Orchestrator: Using Chroma semantic search', {});
      const result = await this.chromaStrategy.search(options);

      // If Chroma succeeded (even with 0 results), return
      if (result.usedChroma) {
        return result;
      }

      // Chroma failed - fall back to SQLite for filter-only
      logger.debug('SEARCH', 'Orchestrator: Chroma failed, falling back to SQLite', {});
      const fallbackResult = await this.sqliteStrategy.search({
        ...options,
        query: undefined // Remove query for SQLite fallback
      });

      return {
        ...fallbackResult,
        fellBack: true
      };
    }

    // PATH 3: No Chroma available
    logger.debug('SEARCH', 'Orchestrator: Chroma not available', {});
    return {
      results: { observations: [], sessions: [], prompts: [] },
      usedChroma: false,
      fellBack: false,
      strategy: 'sqlite'
    };
  }

  /**
   * Find by concept with hybrid search
   */
  async findByConcept(concept: string, args: any): Promise<StrategySearchResult> {
    const options = this.normalizeParams(args);

    if (this.hybridStrategy) {
      return await this.hybridStrategy.findByConcept(concept, options);
    }

    // Fallback to SQLite
    const results = this.sqliteStrategy.findByConcept(concept, options);
    return {
      results: { observations: results, sessions: [], prompts: [] },
      usedChroma: false,
      fellBack: false,
      strategy: 'sqlite'
    };
  }

  /**
   * Find by type with hybrid search
   */
  async findByType(type: string | string[], args: any): Promise<StrategySearchResult> {
    const options = this.normalizeParams(args);

    if (this.hybridStrategy) {
      return await this.hybridStrategy.findByType(type, options);
    }

    // Fallback to SQLite
    const results = this.sqliteStrategy.findByType(type, options);
    return {
      results: { observations: results, sessions: [], prompts: [] },
      usedChroma: false,
      fellBack: false,
      strategy: 'sqlite'
    };
  }

  /**
   * Find by file with hybrid search
   */
  async findByFile(filePath: string, args: any): Promise<{
    observations: ObservationSearchResult[];
    sessions: any[];
    usedChroma: boolean;
  }> {
    const options = this.normalizeParams(args);

    if (this.hybridStrategy) {
      return await this.hybridStrategy.findByFile(filePath, options);
    }

    // Fallback to SQLite
    const results = this.sqliteStrategy.findByFile(filePath, options);
    return { ...results, usedChroma: false };
  }

  /**
   * Get timeline around anchor
   */
  getTimeline(
    timelineData: TimelineData,
    anchorId: number | string,
    anchorEpoch: number,
    depthBefore: number,
    depthAfter: number
  ): TimelineItem[] {
    const items = this.timelineBuilder.buildTimeline(timelineData);
    return this.timelineBuilder.filterByDepth(items, anchorId, anchorEpoch, depthBefore, depthAfter);
  }

  /**
   * Format timeline for display
   */
  formatTimeline(
    items: TimelineItem[],
    anchorId: number | string | null,
    options: {
      query?: string;
      depthBefore?: number;
      depthAfter?: number;
    } = {}
  ): string {
    return this.timelineBuilder.formatTimeline(items, anchorId, options);
  }

  /**
   * Format search results for display
   */
  formatSearchResults(
    results: SearchResults,
    query: string,
    chromaFailed: boolean = false
  ): string {
    return this.resultFormatter.formatSearchResults(results, query, chromaFailed);
  }

  /**
   * Get result formatter for direct access
   */
  getFormatter(): ResultFormatter {
    return this.resultFormatter;
  }

  /**
   * Get timeline builder for direct access
   */
  getTimelineBuilder(): TimelineBuilder {
    return this.timelineBuilder;
  }

  /**
   * Normalize query parameters from URL-friendly format
   */
  private normalizeParams(args: any): NormalizedParams {
    const normalized: any = { ...args };

    // Parse comma-separated concepts into array
    if (normalized.concepts && typeof normalized.concepts === 'string') {
      normalized.concepts = normalized.concepts.split(',').map((s: string) => s.trim()).filter(Boolean);
    }

    // Parse comma-separated files into array
    if (normalized.files && typeof normalized.files === 'string') {
      normalized.files = normalized.files.split(',').map((s: string) => s.trim()).filter(Boolean);
    }

    // Parse comma-separated obs_type into array
    if (normalized.obs_type && typeof normalized.obs_type === 'string') {
      normalized.obsType = normalized.obs_type.split(',').map((s: string) => s.trim()).filter(Boolean);
      delete normalized.obs_type;
    }

    // Parse comma-separated type (for filterSchema) into array
    if (normalized.type && typeof normalized.type === 'string' && normalized.type.includes(',')) {
      normalized.type = normalized.type.split(',').map((s: string) => s.trim()).filter(Boolean);
    }

    // Map 'type' param to 'searchType' for API consistency
    if (normalized.type && !normalized.searchType) {
      if (['observations', 'sessions', 'prompts'].includes(normalized.type)) {
        normalized.searchType = normalized.type;
        delete normalized.type;
      }
    }

    // Flatten dateStart/dateEnd into dateRange object
    if (normalized.dateStart || normalized.dateEnd) {
      normalized.dateRange = {
        start: normalized.dateStart,
        end: normalized.dateEnd
      };
      delete normalized.dateStart;
      delete normalized.dateEnd;
    }

    return normalized;
  }

  /**
   * Check if Chroma is available
   */
  isChromaAvailable(): boolean {
    return !!this.chromaSync;
  }
}
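The orchestrator's `normalizeParams` converts URL-friendly inputs (comma-separated lists, flat `dateStart`/`dateEnd` fields) into structured options. A standalone sketch of the two core rules, separate from the class and covering only a subset of the real parameters:

```typescript
// Standalone sketch of the URL-friendly parameter normalization:
// comma-separated strings become arrays, and dateStart/dateEnd collapse
// into a single dateRange object.
function normalizeParams(args: Record<string, any>): Record<string, any> {
  const normalized: Record<string, any> = { ...args };

  // Comma-separated strings become trimmed, non-empty arrays.
  for (const key of ['concepts', 'files']) {
    if (typeof normalized[key] === 'string') {
      normalized[key] = normalized[key]
        .split(',')
        .map((s: string) => s.trim())
        .filter(Boolean);
    }
  }

  // dateStart/dateEnd flatten into a dateRange object.
  if (normalized.dateStart || normalized.dateEnd) {
    normalized.dateRange = { start: normalized.dateStart, end: normalized.dateEnd };
    delete normalized.dateStart;
    delete normalized.dateEnd;
  }

  return normalized;
}
```

The `filter(Boolean)` step drops empty segments, so a trailing comma in `concepts="auth,search,"` does not produce an empty filter value.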
302
src/services/worker/search/TimelineBuilder.ts
Normal file
@@ -0,0 +1,302 @@
/**
 * TimelineBuilder - Constructs timeline views for search results
 *
 * Builds chronological views around anchor points with depth control.
 * Used by the timeline tool and get_context_timeline tool.
 */

import {
  ObservationSearchResult,
  SessionSummarySearchResult,
  UserPromptSearchResult,
  CombinedResult
} from './types.js';
import { ModeManager } from '../../domain/ModeManager.js';
import {
  formatDate,
  formatTime,
  formatDateTime,
  extractFirstFile,
  estimateTokens
} from '../../../shared/timeline-formatting.js';

/**
 * Timeline item for unified chronological display
 */
export interface TimelineItem {
  type: 'observation' | 'session' | 'prompt';
  data: ObservationSearchResult | SessionSummarySearchResult | UserPromptSearchResult;
  epoch: number;
}

/**
 * Raw timeline data from SessionStore
 */
export interface TimelineData {
  observations: ObservationSearchResult[];
  sessions: SessionSummarySearchResult[];
  prompts: UserPromptSearchResult[];
}

export class TimelineBuilder {
  /**
   * Build timeline items from raw data
   */
  buildTimeline(data: TimelineData): TimelineItem[] {
    const items: TimelineItem[] = [
      ...data.observations.map(obs => ({
        type: 'observation' as const,
        data: obs,
        epoch: obs.created_at_epoch
      })),
      ...data.sessions.map(sess => ({
        type: 'session' as const,
        data: sess,
        epoch: sess.created_at_epoch
      })),
      ...data.prompts.map(prompt => ({
        type: 'prompt' as const,
        data: prompt,
        epoch: prompt.created_at_epoch
      }))
    ];

    // Sort chronologically
    items.sort((a, b) => a.epoch - b.epoch);
    return items;
  }

  /**
   * Filter timeline items to respect depth window around anchor
   */
  filterByDepth(
    items: TimelineItem[],
    anchorId: number | string,
    anchorEpoch: number,
    depthBefore: number,
    depthAfter: number
  ): TimelineItem[] {
    if (items.length === 0) return items;

    const anchorIndex = this.findAnchorIndex(items, anchorId, anchorEpoch);

    if (anchorIndex === -1) return items;

    const startIndex = Math.max(0, anchorIndex - depthBefore);
    const endIndex = Math.min(items.length, anchorIndex + depthAfter + 1);
    return items.slice(startIndex, endIndex);
  }

  /**
   * Find anchor index in timeline items
   */
  private findAnchorIndex(
    items: TimelineItem[],
    anchorId: number | string,
    anchorEpoch: number
  ): number {
    if (typeof anchorId === 'number') {
      // Observation ID
      return items.findIndex(
        item => item.type === 'observation' &&
          (item.data as ObservationSearchResult).id === anchorId
      );
    }

    if (typeof anchorId === 'string' && anchorId.startsWith('S')) {
      // Session ID
      const sessionNum = parseInt(anchorId.slice(1), 10);
      return items.findIndex(
        item => item.type === 'session' &&
          (item.data as SessionSummarySearchResult).id === sessionNum
      );
    }

    // Timestamp anchor - find closest item
    const index = items.findIndex(item => item.epoch >= anchorEpoch);
    return index === -1 ? items.length - 1 : index;
  }

  /**
   * Format timeline as markdown
   */
  formatTimeline(
    items: TimelineItem[],
    anchorId: number | string | null,
    options: {
      query?: string;
      depthBefore?: number;
      depthAfter?: number;
      cwd?: string;
    } = {}
  ): string {
    const { query, depthBefore, depthAfter, cwd = process.cwd() } = options;

    if (items.length === 0) {
      return query
        ? `Found observation matching "${query}", but no timeline context available.`
        : 'No timeline items found';
    }

    const lines: string[] = [];

    // Header
    if (query && anchorId) {
      const anchorObs = items.find(
        item => item.type === 'observation' &&
          (item.data as ObservationSearchResult).id === anchorId
      );
      const anchorTitle = anchorObs
        ? ((anchorObs.data as ObservationSearchResult).title || 'Untitled')
        : 'Unknown';
      lines.push(`# Timeline for query: "${query}"`);
      lines.push(`**Anchor:** Observation #${anchorId} - ${anchorTitle}`);
    } else if (anchorId) {
      lines.push(`# Timeline around anchor: ${anchorId}`);
    } else {
      lines.push(`# Timeline`);
    }

    if (depthBefore !== undefined && depthAfter !== undefined) {
      lines.push(`**Window:** ${depthBefore} records before -> ${depthAfter} records after | **Items:** ${items.length}`);
    } else {
      lines.push(`**Items:** ${items.length}`);
    }
    lines.push('');

    // Group by day
    const dayMap = this.groupByDay(items);
    const sortedDays = this.sortDaysChronologically(dayMap);

    // Render each day
    for (const [day, dayItems] of sortedDays) {
      lines.push(`### ${day}`);
      lines.push('');

      let currentFile: string | null = null;
      let lastTime = '';
      let tableOpen = false;

      for (const item of dayItems) {
        const isAnchor = this.isAnchorItem(item, anchorId);

        if (item.type === 'session') {
          // Close any open table
          if (tableOpen) {
            lines.push('');
            tableOpen = false;
            currentFile = null;
            lastTime = '';
          }

          const sess = item.data as SessionSummarySearchResult;
          const title = sess.request || 'Session summary';
          const marker = isAnchor ? ' <- **ANCHOR**' : '';

          lines.push(`**\uD83C\uDFAF #S${sess.id}** ${title} (${formatDateTime(item.epoch)})${marker}`);
          lines.push('');

        } else if (item.type === 'prompt') {
          if (tableOpen) {
            lines.push('');
            tableOpen = false;
            currentFile = null;
            lastTime = '';
          }

          const prompt = item.data as UserPromptSearchResult;
          const truncated = prompt.prompt_text.length > 100
            ? prompt.prompt_text.substring(0, 100) + '...'
            : prompt.prompt_text;

          lines.push(`**\uD83D\uDCAC User Prompt #${prompt.prompt_number}** (${formatDateTime(item.epoch)})`);
          lines.push(`> ${truncated}`);
          lines.push('');

        } else if (item.type === 'observation') {
          const obs = item.data as ObservationSearchResult;
          const file = extractFirstFile(obs.files_modified, cwd);

          if (file !== currentFile) {
            if (tableOpen) {
              lines.push('');
            }

            lines.push(`**${file}**`);
            lines.push(`| ID | Time | T | Title | Tokens |`);
            lines.push(`|----|------|---|-------|--------|`);

            currentFile = file;
            tableOpen = true;
            lastTime = '';
          }

          const icon = ModeManager.getInstance().getTypeIcon(obs.type);
          const time = formatTime(item.epoch);
          const title = obs.title || 'Untitled';
          const tokens = estimateTokens(obs.narrative);

          const showTime = time !== lastTime;
          const timeDisplay = showTime ? time : '"';
          lastTime = time;

          const anchorMarker = isAnchor ? ' <- **ANCHOR**' : '';
          lines.push(`| #${obs.id} | ${timeDisplay} | ${icon} | ${title}${anchorMarker} | ~${tokens} |`);
        }
      }

      if (tableOpen) {
        lines.push('');
      }
    }

    return lines.join('\n');
  }

  /**
   * Group timeline items by day
   */
  private groupByDay(items: TimelineItem[]): Map<string, TimelineItem[]> {
    const dayMap = new Map<string, TimelineItem[]>();

    for (const item of items) {
      const day = formatDate(item.epoch);
      if (!dayMap.has(day)) {
        dayMap.set(day, []);
      }
      dayMap.get(day)!.push(item);
    }

    return dayMap;
  }

  /**
   * Sort days chronologically
   */
  private sortDaysChronologically(
    dayMap: Map<string, TimelineItem[]>
  ): Array<[string, TimelineItem[]]> {
    return Array.from(dayMap.entries()).sort((a, b) => {
      const aDate = new Date(a[0]).getTime();
      const bDate = new Date(b[0]).getTime();
      return aDate - bDate;
    });
  }

  /**
   * Check if item is the anchor
   */
  private isAnchorItem(item: TimelineItem, anchorId: number | string | null): boolean {
    if (anchorId === null) return false;

    if (typeof anchorId === 'number' && item.type === 'observation') {
      return (item.data as ObservationSearchResult).id === anchorId;
    }

    if (typeof anchorId === 'string' && anchorId.startsWith('S') && item.type === 'session') {
      return `S${(item.data as SessionSummarySearchResult).id}` === anchorId;
    }

    return false;
  }
}
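The depth window in `filterByDepth` is just an index slice around the anchor: `depthBefore` items on one side, `depthAfter` on the other, clamped to the array bounds. A standalone sketch over plain values (a simplified stand-in for the real method, which first resolves the anchor index):

```typescript
// Standalone sketch of the depth-window slice used by filterByDepth.
// anchorIndex of -1 mirrors the real method's "anchor not found" behavior:
// the full list is returned unchanged.
function sliceAroundAnchor<T>(
  items: T[],
  anchorIndex: number,
  depthBefore: number,
  depthAfter: number
): T[] {
  if (items.length === 0 || anchorIndex === -1) return items;
  const start = Math.max(0, anchorIndex - depthBefore);
  const end = Math.min(items.length, anchorIndex + depthAfter + 1);
  return items.slice(start, end);
}
```

The `+ 1` in the end bound keeps the anchor itself inside the window, so `depthBefore = depthAfter = 0` yields exactly the anchor item.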
101
src/services/worker/search/filters/DateFilter.ts
Normal file
@@ -0,0 +1,101 @@
/**
 * DateFilter - Date range filtering for search results
 *
 * Provides utilities for filtering search results by date range.
 */

import { DateRange, SEARCH_CONSTANTS } from '../types.js';

/**
 * Parse date range values to epoch milliseconds
 */
export function parseDateRange(dateRange?: DateRange): {
  startEpoch?: number;
  endEpoch?: number;
} {
  if (!dateRange) {
    return {};
  }

  const result: { startEpoch?: number; endEpoch?: number } = {};

  if (dateRange.start) {
    result.startEpoch = typeof dateRange.start === 'number'
      ? dateRange.start
      : new Date(dateRange.start).getTime();
  }

  if (dateRange.end) {
    result.endEpoch = typeof dateRange.end === 'number'
      ? dateRange.end
      : new Date(dateRange.end).getTime();
  }

  return result;
}

/**
 * Check if an epoch timestamp is within a date range
 */
export function isWithinDateRange(
  epoch: number,
  dateRange?: DateRange
): boolean {
  if (!dateRange) {
    return true;
  }

  const { startEpoch, endEpoch } = parseDateRange(dateRange);

  if (startEpoch && epoch < startEpoch) {
    return false;
  }

  if (endEpoch && epoch > endEpoch) {
    return false;
  }

  return true;
}

/**
 * Check if an epoch timestamp is within the recency window
 */
export function isRecent(epoch: number): boolean {
  const cutoff = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;
  return epoch > cutoff;
}

/**
 * Filter combined results by date range
 */
export function filterResultsByDate<T extends { epoch: number }>(
  results: T[],
  dateRange?: DateRange
): T[] {
  if (!dateRange) {
    return results;
  }

  return results.filter(result => isWithinDateRange(result.epoch, dateRange));
}

/**
 * Get date boundaries for common ranges
 */
export function getDateBoundaries(range: 'today' | 'week' | 'month' | '90days'): DateRange {
  const now = Date.now();
  const startOfToday = new Date();
  startOfToday.setHours(0, 0, 0, 0);

  switch (range) {
    case 'today':
      return { start: startOfToday.getTime() };
    case 'week':
      return { start: now - 7 * 24 * 60 * 60 * 1000 };
    case 'month':
      return { start: now - 30 * 24 * 60 * 60 * 1000 };
    case '90days':
      return { start: now - SEARCH_CONSTANTS.RECENCY_WINDOW_MS };
  }
}
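The DateFilter contract above (string or epoch bounds both normalize to epoch milliseconds, and each bound is optional) can be exercised with a small standalone sketch; the helpers below mirror the file's logic rather than importing it, so the shapes are simplified stand-ins:

```typescript
// Standalone sketch mirroring DateFilter: string or epoch bounds both
// normalize to epoch milliseconds, and each bound is optional.
type DateRange = { start?: string | number; end?: string | number };

function parseDateRange(dateRange?: DateRange): { startEpoch?: number; endEpoch?: number } {
  if (!dateRange) return {};
  const toEpoch = (v: string | number) => (typeof v === 'number' ? v : new Date(v).getTime());
  return {
    ...(dateRange.start ? { startEpoch: toEpoch(dateRange.start) } : {}),
    ...(dateRange.end ? { endEpoch: toEpoch(dateRange.end) } : {})
  };
}

function isWithinDateRange(epoch: number, dateRange?: DateRange): boolean {
  const { startEpoch, endEpoch } = parseDateRange(dateRange);
  if (startEpoch && epoch < startEpoch) return false;
  if (endEpoch && epoch > endEpoch) return false;
  return true;
}

// An ISO string and its epoch equivalent behave identically.
const jan1 = Date.UTC(2024, 0, 1);
console.assert(isWithinDateRange(jan1, { start: '2023-01-01' }) === true);
console.assert(isWithinDateRange(jan1, { end: '2023-12-31' }) === false);
console.assert(isWithinDateRange(jan1) === true); // no range = no filtering
```

Note that an absent range (or an absent bound) means "no constraint", which is why every public filter in this module degrades to a pass-through.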
59
src/services/worker/search/filters/ProjectFilter.ts
Normal file
@@ -0,0 +1,59 @@
/**
 * ProjectFilter - Project scoping for search results
 *
 * Provides utilities for filtering search results by project.
 */

import { basename } from 'path';

/**
 * Get the current project name from cwd
 */
export function getCurrentProject(): string {
  return basename(process.cwd());
}

/**
 * Normalize project name for filtering
 */
export function normalizeProject(project?: string): string | undefined {
  if (!project) {
    return undefined;
  }

  // Remove leading/trailing whitespace
  const trimmed = project.trim();
  if (!trimmed) {
    return undefined;
  }

  return trimmed;
}

/**
 * Check if a result matches the project filter
 */
export function matchesProject(
  resultProject: string,
  filterProject?: string
): boolean {
  if (!filterProject) {
    return true;
  }

  return resultProject === filterProject;
}

/**
 * Filter results by project
 */
export function filterResultsByProject<T extends { project: string }>(
  results: T[],
  project?: string
): T[] {
  if (!project) {
    return results;
  }

  return results.filter(result => matchesProject(result.project, project));
}
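The key design choice in ProjectFilter is that an absent or whitespace-only project disables filtering entirely rather than matching nothing; a standalone sketch of that contract (mirroring, not importing, the file above):

```typescript
// Standalone sketch of the ProjectFilter contract: an absent or
// whitespace-only project disables filtering instead of matching nothing.
function normalizeProject(project?: string): string | undefined {
  const trimmed = project?.trim();
  return trimmed ? trimmed : undefined;
}

function matchesProject(resultProject: string, filterProject?: string): boolean {
  return !filterProject || resultProject === filterProject;
}

console.assert(normalizeProject('  claude-mem  ') === 'claude-mem');
console.assert(normalizeProject('   ') === undefined); // blank = no filter
console.assert(matchesProject('claude-mem', undefined) === true); // no filter matches all
console.assert(matchesProject('claude-mem', 'other') === false);
```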
75
src/services/worker/search/filters/TypeFilter.ts
Normal file
@@ -0,0 +1,75 @@
/**
 * TypeFilter - Observation type filtering for search results
 *
 * Provides utilities for filtering observations by type.
 */

type ObservationType = 'decision' | 'bugfix' | 'feature' | 'refactor' | 'discovery' | 'change';

/**
 * Valid observation types
 */
export const OBSERVATION_TYPES: ObservationType[] = [
  'decision',
  'bugfix',
  'feature',
  'refactor',
  'discovery',
  'change'
];

/**
 * Normalize type filter value(s)
 */
export function normalizeType(
  type?: string | string[]
): ObservationType[] | undefined {
  if (!type) {
    return undefined;
  }

  const types = Array.isArray(type) ? type : [type];
  const normalized = types
    .map(t => t.trim().toLowerCase())
    .filter(t => OBSERVATION_TYPES.includes(t as ObservationType)) as ObservationType[];

  return normalized.length > 0 ? normalized : undefined;
}

/**
 * Check if a result matches the type filter
 */
export function matchesType(
  resultType: string,
  filterTypes?: ObservationType[]
): boolean {
  if (!filterTypes || filterTypes.length === 0) {
    return true;
  }

  return filterTypes.includes(resultType as ObservationType);
}

/**
 * Filter observations by type
 */
export function filterObservationsByType<T extends { type: string }>(
  observations: T[],
  types?: ObservationType[]
): T[] {
  if (!types || types.length === 0) {
    return observations;
  }

  return observations.filter(obs => matchesType(obs.type, types));
}

/**
 * Parse comma-separated type string
 */
export function parseTypeString(typeString: string): ObservationType[] {
  return typeString
    .split(',')
    .map(t => t.trim().toLowerCase())
    .filter(t => OBSERVATION_TYPES.includes(t as ObservationType)) as ObservationType[];
}
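`normalizeType` accepts a single string or an array, case/whitespace-normalizes each entry, drops anything outside the allow-list, and treats an empty result as "no type filter". A standalone sketch of that behavior (the allow-list is copied from the file above):

```typescript
// Standalone sketch of normalizeType: inputs are case/whitespace-normalized,
// unknown types are dropped, and an empty result means "no type filter".
const OBSERVATION_TYPES = ['decision', 'bugfix', 'feature', 'refactor', 'discovery', 'change'];

function normalizeType(type?: string | string[]): string[] | undefined {
  if (!type) return undefined;
  const types = Array.isArray(type) ? type : [type];
  const normalized = types
    .map(t => t.trim().toLowerCase())
    .filter(t => OBSERVATION_TYPES.includes(t));
  return normalized.length > 0 ? normalized : undefined;
}

console.assert(JSON.stringify(normalizeType(' Bugfix ')) === '["bugfix"]');
console.assert(normalizeType(['nonsense']) === undefined); // nothing valid survives
console.assert(JSON.stringify(normalizeType(['FEATURE', 'decision'])) === '["feature","decision"]');
```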
26
src/services/worker/search/index.ts
Normal file
@@ -0,0 +1,26 @@
/**
 * Search Module - Named exports for search functionality
 *
 * This is the public API for the search module.
 */

// Main orchestrator
export { SearchOrchestrator } from './SearchOrchestrator.js';

// Formatters
export { ResultFormatter } from './ResultFormatter.js';
export { TimelineBuilder, TimelineItem, TimelineData } from './TimelineBuilder.js';

// Strategies
export { SearchStrategy, BaseSearchStrategy } from './strategies/SearchStrategy.js';
export { ChromaSearchStrategy } from './strategies/ChromaSearchStrategy.js';
export { SQLiteSearchStrategy } from './strategies/SQLiteSearchStrategy.js';
export { HybridSearchStrategy } from './strategies/HybridSearchStrategy.js';

// Filters
export * from './filters/DateFilter.js';
export * from './filters/ProjectFilter.js';
export * from './filters/TypeFilter.js';

// Types
export * from './types.js';
213
src/services/worker/search/strategies/ChromaSearchStrategy.ts
Normal file
@@ -0,0 +1,213 @@
/**
 * ChromaSearchStrategy - Vector-based semantic search via Chroma
 *
 * This strategy handles semantic search queries using ChromaDB:
 * 1. Query Chroma for semantically similar documents
 * 2. Filter by recency (90-day window)
 * 3. Categorize by document type
 * 4. Hydrate from SQLite
 *
 * Used when: Query text is provided and Chroma is available
 */

import { BaseSearchStrategy, SearchStrategy } from './SearchStrategy.js';
import {
  StrategySearchOptions,
  StrategySearchResult,
  SEARCH_CONSTANTS,
  ChromaMetadata,
  ObservationSearchResult,
  SessionSummarySearchResult,
  UserPromptSearchResult
} from '../types.js';
import { ChromaSync } from '../../../sync/ChromaSync.js';
import { SessionStore } from '../../../sqlite/SessionStore.js';
import { logger } from '../../../../utils/logger.js';

export class ChromaSearchStrategy extends BaseSearchStrategy implements SearchStrategy {
  readonly name = 'chroma';

  constructor(
    private chromaSync: ChromaSync,
    private sessionStore: SessionStore
  ) {
    super();
  }

  canHandle(options: StrategySearchOptions): boolean {
    // Can handle when query text is provided and Chroma is available
    return !!options.query && !!this.chromaSync;
  }

  async search(options: StrategySearchOptions): Promise<StrategySearchResult> {
    const {
      query,
      searchType = 'all',
      obsType,
      concepts,
      files,
      limit = SEARCH_CONSTANTS.DEFAULT_LIMIT,
      project,
      orderBy = 'date_desc'
    } = options;

    if (!query) {
      return this.emptyResult('chroma');
    }

    const searchObservations = searchType === 'all' || searchType === 'observations';
    const searchSessions = searchType === 'all' || searchType === 'sessions';
    const searchPrompts = searchType === 'all' || searchType === 'prompts';

    let observations: ObservationSearchResult[] = [];
    let sessions: SessionSummarySearchResult[] = [];
    let prompts: UserPromptSearchResult[] = [];

    try {
      // Build Chroma where filter for doc_type
      const whereFilter = this.buildWhereFilter(searchType);

      // Step 1: Chroma semantic search
      logger.debug('SEARCH', 'ChromaSearchStrategy: Querying Chroma', { query, searchType });
      const chromaResults = await this.chromaSync.queryChroma(
        query,
        SEARCH_CONSTANTS.CHROMA_BATCH_SIZE,
        whereFilter
      );

      logger.debug('SEARCH', 'ChromaSearchStrategy: Chroma returned matches', {
        matchCount: chromaResults.ids.length
      });

      if (chromaResults.ids.length === 0) {
        // No matches - this is the correct answer
        return {
          results: { observations: [], sessions: [], prompts: [] },
          usedChroma: true,
          fellBack: false,
          strategy: 'chroma'
        };
      }

      // Step 2: Filter by recency (90 days)
      const recentItems = this.filterByRecency(chromaResults);
      logger.debug('SEARCH', 'ChromaSearchStrategy: Filtered by recency', {
        count: recentItems.length
      });

      // Step 3: Categorize by document type
      const categorized = this.categorizeByDocType(recentItems, {
        searchObservations,
        searchSessions,
        searchPrompts
      });

      // Step 4: Hydrate from SQLite with additional filters
      if (categorized.obsIds.length > 0) {
        const obsOptions = { type: obsType, concepts, files, orderBy, limit, project };
        observations = this.sessionStore.getObservationsByIds(categorized.obsIds, obsOptions);
      }

      if (categorized.sessionIds.length > 0) {
        sessions = this.sessionStore.getSessionSummariesByIds(categorized.sessionIds, {
          orderBy,
          limit,
          project
        });
      }

      if (categorized.promptIds.length > 0) {
        prompts = this.sessionStore.getUserPromptsByIds(categorized.promptIds, {
          orderBy,
          limit,
          project
        });
      }

      logger.debug('SEARCH', 'ChromaSearchStrategy: Hydrated results', {
        observations: observations.length,
        sessions: sessions.length,
        prompts: prompts.length
      });

      return {
        results: { observations, sessions, prompts },
        usedChroma: true,
        fellBack: false,
        strategy: 'chroma'
      };

    } catch (error) {
      logger.warn('SEARCH', 'ChromaSearchStrategy: Search failed', {}, error as Error);
      // Return empty result - caller may try fallback strategy
      return {
        results: { observations: [], sessions: [], prompts: [] },
        usedChroma: false,
        fellBack: false,
        strategy: 'chroma'
      };
    }
  }

  /**
   * Build Chroma where filter for document type
   */
  private buildWhereFilter(searchType: string): Record<string, any> | undefined {
    switch (searchType) {
      case 'observations':
        return { doc_type: 'observation' };
      case 'sessions':
        return { doc_type: 'session_summary' };
      case 'prompts':
        return { doc_type: 'user_prompt' };
      default:
        return undefined;
    }
  }

  /**
   * Filter results by recency (90-day window)
   */
  private filterByRecency(chromaResults: {
    ids: number[];
    metadatas: ChromaMetadata[];
  }): Array<{ id: number; meta: ChromaMetadata }> {
    const cutoff = Date.now() - SEARCH_CONSTANTS.RECENCY_WINDOW_MS;

    return chromaResults.metadatas
      .map((meta, idx) => ({
        id: chromaResults.ids[idx],
        meta
      }))
      .filter(item => item.meta && item.meta.created_at_epoch > cutoff);
  }

  /**
   * Categorize IDs by document type
   */
  private categorizeByDocType(
    items: Array<{ id: number; meta: ChromaMetadata }>,
    options: {
      searchObservations: boolean;
      searchSessions: boolean;
      searchPrompts: boolean;
    }
  ): { obsIds: number[]; sessionIds: number[]; promptIds: number[] } {
    const obsIds: number[] = [];
    const sessionIds: number[] = [];
    const promptIds: number[] = [];

    for (const item of items) {
      const docType = item.meta?.doc_type;
      if (docType === 'observation' && options.searchObservations) {
        obsIds.push(item.id);
      } else if (docType === 'session_summary' && options.searchSessions) {
        sessionIds.push(item.id);
      } else if (docType === 'user_prompt' && options.searchPrompts) {
        promptIds.push(item.id);
      }
    }

    return { obsIds, sessionIds, promptIds };
  }
}
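Steps 2 and 3 of the ChromaSearchStrategy pipeline (recency filter, then doc-type bucketing) can be sketched as a pair of pure functions. The shapes below are simplified stand-ins for the real `ChromaMetadata` type, and the explicit `now` parameter is an addition for testability:

```typescript
// Standalone sketch of steps 2-3: pair Chroma ids with their metadata,
// drop anything older than the 90-day recency window, then bucket the
// surviving ids by doc_type.
interface Meta { doc_type: 'observation' | 'session_summary' | 'user_prompt'; created_at_epoch: number }

const RECENCY_WINDOW_MS = 90 * 24 * 60 * 60 * 1000;

function filterByRecency(ids: number[], metadatas: Meta[], now: number) {
  const cutoff = now - RECENCY_WINDOW_MS;
  return metadatas
    .map((meta, idx) => ({ id: ids[idx], meta }))
    .filter(item => item.meta && item.meta.created_at_epoch > cutoff);
}

function categorizeByDocType(items: Array<{ id: number; meta: Meta }>) {
  const buckets = { obsIds: [] as number[], sessionIds: [] as number[], promptIds: [] as number[] };
  for (const { id, meta } of items) {
    if (meta.doc_type === 'observation') buckets.obsIds.push(id);
    else if (meta.doc_type === 'session_summary') buckets.sessionIds.push(id);
    else buckets.promptIds.push(id);
  }
  return buckets;
}

const now = Date.now();
const recent = filterByRecency(
  [1, 2, 3],
  [
    { doc_type: 'observation', created_at_epoch: now - 1000 },
    { doc_type: 'session_summary', created_at_epoch: now - 100 * 24 * 60 * 60 * 1000 }, // too old
    { doc_type: 'user_prompt', created_at_epoch: now - 2000 }
  ],
  now
);
const buckets = categorizeByDocType(recent);
console.assert(JSON.stringify(buckets.obsIds) === '[1]');
console.assert(buckets.sessionIds.length === 0);
console.assert(JSON.stringify(buckets.promptIds) === '[3]');
```

The real strategy additionally gates each bucket on the requested `searchType` before hydrating from SQLite.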
270
src/services/worker/search/strategies/HybridSearchStrategy.ts
Normal file
@@ -0,0 +1,270 @@
/**
 * HybridSearchStrategy - Combines metadata filtering with semantic ranking
 *
 * This strategy provides the best of both worlds:
 * 1. SQLite metadata filter (get all IDs matching criteria)
 * 2. Chroma semantic ranking (rank by relevance)
 * 3. Intersection (keep only IDs from step 1, in rank order from step 2)
 * 4. Hydrate from SQLite in semantic rank order
 *
 * Used for: findByConcept, findByFile, findByType with Chroma available
 */

import { BaseSearchStrategy, SearchStrategy } from './SearchStrategy.js';
import {
  StrategySearchOptions,
  StrategySearchResult,
  SEARCH_CONSTANTS,
  ObservationSearchResult,
  SessionSummarySearchResult
} from '../types.js';
import { ChromaSync } from '../../../sync/ChromaSync.js';
import { SessionStore } from '../../../sqlite/SessionStore.js';
import { SessionSearch } from '../../../sqlite/SessionSearch.js';
import { logger } from '../../../../utils/logger.js';

export class HybridSearchStrategy extends BaseSearchStrategy implements SearchStrategy {
  readonly name = 'hybrid';

  constructor(
    private chromaSync: ChromaSync,
    private sessionStore: SessionStore,
    private sessionSearch: SessionSearch
  ) {
    super();
  }

  canHandle(options: StrategySearchOptions): boolean {
    // Can handle when we have metadata filters and Chroma is available
    return !!this.chromaSync && (
      !!options.concepts ||
      !!options.files ||
      (!!options.type && !!options.query) ||
      options.strategyHint === 'hybrid'
    );
  }

  async search(options: StrategySearchOptions): Promise<StrategySearchResult> {
    // This is the generic hybrid search - specific operations use dedicated methods
    const { query, limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project } = options;

    if (!query) {
      return this.emptyResult('hybrid');
    }

    // For generic hybrid search, use the standard Chroma path
    // More specific operations (findByConcept, etc.) have dedicated methods
    return this.emptyResult('hybrid');
  }

  /**
   * Find observations by concept with semantic ranking
   * Pattern: Metadata filter -> Chroma ranking -> Intersection -> Hydrate
   */
  async findByConcept(
    concept: string,
    options: StrategySearchOptions
  ): Promise<StrategySearchResult> {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy } = options;
    const filterOptions = { limit, project, dateRange, orderBy };

    try {
      logger.debug('SEARCH', 'HybridSearchStrategy: findByConcept', { concept });

      // Step 1: SQLite metadata filter
      const metadataResults = this.sessionSearch.findByConcept(concept, filterOptions);
      logger.debug('SEARCH', 'HybridSearchStrategy: Found metadata matches', {
        count: metadataResults.length
      });

      if (metadataResults.length === 0) {
        return this.emptyResult('hybrid');
      }

      // Step 2: Chroma semantic ranking
      const ids = metadataResults.map(obs => obs.id);
      const chromaResults = await this.chromaSync.queryChroma(
        concept,
        Math.min(ids.length, SEARCH_CONSTANTS.CHROMA_BATCH_SIZE)
      );

      // Step 3: Intersect - keep only IDs from metadata, in Chroma rank order
      const rankedIds = this.intersectWithRanking(ids, chromaResults.ids);
      logger.debug('SEARCH', 'HybridSearchStrategy: Ranked by semantic relevance', {
        count: rankedIds.length
      });

      // Step 4: Hydrate in semantic rank order
      if (rankedIds.length > 0) {
        const observations = this.sessionStore.getObservationsByIds(rankedIds, { limit });
        // Restore semantic ranking order
        observations.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));

        return {
          results: { observations, sessions: [], prompts: [] },
          usedChroma: true,
          fellBack: false,
          strategy: 'hybrid'
        };
      }

      return this.emptyResult('hybrid');

    } catch (error) {
      logger.warn('SEARCH', 'HybridSearchStrategy: findByConcept failed', {}, error as Error);
      // Fall back to metadata-only results
      const results = this.sessionSearch.findByConcept(concept, filterOptions);
      return {
        results: { observations: results, sessions: [], prompts: [] },
        usedChroma: false,
        fellBack: true,
        strategy: 'hybrid'
      };
    }
  }

  /**
   * Find observations by type with semantic ranking
   */
  async findByType(
    type: string | string[],
    options: StrategySearchOptions
  ): Promise<StrategySearchResult> {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy } = options;
    const filterOptions = { limit, project, dateRange, orderBy };
    const typeStr = Array.isArray(type) ? type.join(', ') : type;

    try {
      logger.debug('SEARCH', 'HybridSearchStrategy: findByType', { type: typeStr });

      // Step 1: SQLite metadata filter
      const metadataResults = this.sessionSearch.findByType(type as any, filterOptions);
      logger.debug('SEARCH', 'HybridSearchStrategy: Found metadata matches', {
        count: metadataResults.length
      });

      if (metadataResults.length === 0) {
        return this.emptyResult('hybrid');
      }

      // Step 2: Chroma semantic ranking
      const ids = metadataResults.map(obs => obs.id);
      const chromaResults = await this.chromaSync.queryChroma(
        typeStr,
        Math.min(ids.length, SEARCH_CONSTANTS.CHROMA_BATCH_SIZE)
      );

      // Step 3: Intersect with ranking
      const rankedIds = this.intersectWithRanking(ids, chromaResults.ids);
      logger.debug('SEARCH', 'HybridSearchStrategy: Ranked by semantic relevance', {
        count: rankedIds.length
      });

      // Step 4: Hydrate in rank order
      if (rankedIds.length > 0) {
        const observations = this.sessionStore.getObservationsByIds(rankedIds, { limit });
        observations.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));

        return {
          results: { observations, sessions: [], prompts: [] },
          usedChroma: true,
          fellBack: false,
          strategy: 'hybrid'
        };
      }

      return this.emptyResult('hybrid');

    } catch (error) {
      logger.warn('SEARCH', 'HybridSearchStrategy: findByType failed', {}, error as Error);
      const results = this.sessionSearch.findByType(type as any, filterOptions);
      return {
        results: { observations: results, sessions: [], prompts: [] },
        usedChroma: false,
        fellBack: true,
        strategy: 'hybrid'
      };
    }
  }

  /**
   * Find observations and sessions by file path with semantic ranking
   */
  async findByFile(
    filePath: string,
    options: StrategySearchOptions
  ): Promise<{
    observations: ObservationSearchResult[];
    sessions: SessionSummarySearchResult[];
    usedChroma: boolean;
  }> {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy } = options;
    const filterOptions = { limit, project, dateRange, orderBy };

    try {
      logger.debug('SEARCH', 'HybridSearchStrategy: findByFile', { filePath });

      // Step 1: SQLite metadata filter
      const metadataResults = this.sessionSearch.findByFile(filePath, filterOptions);
      logger.debug('SEARCH', 'HybridSearchStrategy: Found file matches', {
        observations: metadataResults.observations.length,
        sessions: metadataResults.sessions.length
      });

      // Sessions don't need semantic ranking (already summarized)
      const sessions = metadataResults.sessions;

      if (metadataResults.observations.length === 0) {
        return { observations: [], sessions, usedChroma: false };
      }

      // Step 2: Chroma semantic ranking for observations
      const ids = metadataResults.observations.map(obs => obs.id);
      const chromaResults = await this.chromaSync.queryChroma(
        filePath,
        Math.min(ids.length, SEARCH_CONSTANTS.CHROMA_BATCH_SIZE)
      );

      // Step 3: Intersect with ranking
      const rankedIds = this.intersectWithRanking(ids, chromaResults.ids);
      logger.debug('SEARCH', 'HybridSearchStrategy: Ranked observations', {
        count: rankedIds.length
      });

      // Step 4: Hydrate in rank order
      if (rankedIds.length > 0) {
        const observations = this.sessionStore.getObservationsByIds(rankedIds, { limit });
        observations.sort((a, b) => rankedIds.indexOf(a.id) - rankedIds.indexOf(b.id));

        return { observations, sessions, usedChroma: true };
      }

      return { observations: [], sessions, usedChroma: false };

    } catch (error) {
      logger.warn('SEARCH', 'HybridSearchStrategy: findByFile failed', {}, error as Error);
      const results = this.sessionSearch.findByFile(filePath, filterOptions);
      return {
        observations: results.observations,
        sessions: results.sessions,
        usedChroma: false
      };
    }
  }

  /**
   * Intersect metadata IDs with Chroma IDs, preserving Chroma's rank order
   */
  private intersectWithRanking(metadataIds: number[], chromaIds: number[]): number[] {
    const metadataSet = new Set(metadataIds);
    const rankedIds: number[] = [];

    for (const chromaId of chromaIds) {
      if (metadataSet.has(chromaId) && !rankedIds.includes(chromaId)) {
        rankedIds.push(chromaId);
      }
    }

    return rankedIds;
  }
}
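The core of the hybrid pattern is `intersectWithRanking`: the metadata IDs define the allowed set, Chroma's ID order defines the ranking, and duplicates are dropped. A standalone sketch mirroring that method:

```typescript
// Standalone sketch of intersectWithRanking: keep only ids present in the
// metadata set, in Chroma's rank order, deduplicated.
function intersectWithRanking(metadataIds: number[], chromaIds: number[]): number[] {
  const allowed = new Set(metadataIds);
  const ranked: number[] = [];
  for (const id of chromaIds) {
    if (allowed.has(id) && !ranked.includes(id)) ranked.push(id);
  }
  return ranked;
}

// Chroma ranks 7 highest; 9 is not in the metadata set, so it is dropped,
// and the repeated 7 is deduplicated.
console.assert(JSON.stringify(intersectWithRanking([3, 5, 7], [7, 9, 3, 7])) === '[7,3]');
```

Note that a metadata ID Chroma never returned (5 above) silently falls out of the result, which is why the strategy falls back to metadata-only ordering when the intersection comes back empty.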
131
src/services/worker/search/strategies/SQLiteSearchStrategy.ts
Normal file
@@ -0,0 +1,131 @@
/**
 * SQLiteSearchStrategy - Direct SQLite queries for filter-only searches
 *
 * This strategy handles searches without query text (filter-only):
 * - Date range filtering
 * - Project filtering
 * - Type filtering
 * - Concept/file filtering
 *
 * Used when: No query text is provided, or as a fallback when Chroma fails
 */

import { BaseSearchStrategy, SearchStrategy } from './SearchStrategy.js';
import {
  StrategySearchOptions,
  StrategySearchResult,
  SEARCH_CONSTANTS,
  ObservationSearchResult,
  SessionSummarySearchResult,
  UserPromptSearchResult
} from '../types.js';
import { SessionSearch } from '../../../sqlite/SessionSearch.js';
import { logger } from '../../../../utils/logger.js';

export class SQLiteSearchStrategy extends BaseSearchStrategy implements SearchStrategy {
  readonly name = 'sqlite';

  constructor(private sessionSearch: SessionSearch) {
    super();
  }

  canHandle(options: StrategySearchOptions): boolean {
    // Can handle filter-only queries (no query text)
    // Also used as fallback when Chroma is unavailable
    return !options.query || options.strategyHint === 'sqlite';
  }

  async search(options: StrategySearchOptions): Promise<StrategySearchResult> {
    const {
      searchType = 'all',
      obsType,
      concepts,
      files,
      limit = SEARCH_CONSTANTS.DEFAULT_LIMIT,
      offset = 0,
      project,
      dateRange,
      orderBy = 'date_desc'
    } = options;

    const searchObservations = searchType === 'all' || searchType === 'observations';
    const searchSessions = searchType === 'all' || searchType === 'sessions';
    const searchPrompts = searchType === 'all' || searchType === 'prompts';

    let observations: ObservationSearchResult[] = [];
    let sessions: SessionSummarySearchResult[] = [];
    let prompts: UserPromptSearchResult[] = [];

    const baseOptions = { limit, offset, orderBy, project, dateRange };

    logger.debug('SEARCH', 'SQLiteSearchStrategy: Filter-only query', {
      searchType,
      hasDateRange: !!dateRange,
      hasProject: !!project
    });

    try {
      if (searchObservations) {
        const obsOptions = {
          ...baseOptions,
          type: obsType,
          concepts,
          files
        };
        observations = this.sessionSearch.searchObservations(undefined, obsOptions);
      }

      if (searchSessions) {
        sessions = this.sessionSearch.searchSessions(undefined, baseOptions);
      }

      if (searchPrompts) {
        prompts = this.sessionSearch.searchUserPrompts(undefined, baseOptions);
      }

      logger.debug('SEARCH', 'SQLiteSearchStrategy: Results', {
        observations: observations.length,
        sessions: sessions.length,
        prompts: prompts.length
      });

      return {
        results: { observations, sessions, prompts },
        usedChroma: false,
        fellBack: false,
        strategy: 'sqlite'
      };

    } catch (error) {
      logger.warn('SEARCH', 'SQLiteSearchStrategy: Search failed', {}, error as Error);
      return this.emptyResult('sqlite');
    }
  }

  /**
   * Find observations by concept (used by findByConcept tool)
   */
  findByConcept(concept: string, options: StrategySearchOptions): ObservationSearchResult[] {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy = 'date_desc' } = options;
    return this.sessionSearch.findByConcept(concept, { limit, project, dateRange, orderBy });
  }

  /**
   * Find observations by type (used by findByType tool)
   */
  findByType(type: string | string[], options: StrategySearchOptions): ObservationSearchResult[] {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy = 'date_desc' } = options;
    return this.sessionSearch.findByType(type as any, { limit, project, dateRange, orderBy });
  }

  /**
   * Find observations and sessions by file path (used by findByFile tool)
   */
  findByFile(filePath: string, options: StrategySearchOptions): {
    observations: ObservationSearchResult[];
    sessions: SessionSummarySearchResult[];
  } {
    const { limit = SEARCH_CONSTANTS.DEFAULT_LIMIT, project, dateRange, orderBy = 'date_desc' } = options;
    return this.sessionSearch.findByFile(filePath, { limit, project, dateRange, orderBy });
  }
}
60
src/services/worker/search/strategies/SearchStrategy.ts
Normal file
@@ -0,0 +1,60 @@
/**
 * SearchStrategy - Interface for search strategy implementations
 *
 * Each strategy implements a different approach to searching:
 * - ChromaSearchStrategy: Vector-based semantic search via Chroma
 * - SQLiteSearchStrategy: Direct SQLite queries for filter-only searches
 * - HybridSearchStrategy: Metadata filtering + semantic ranking
 */

import { SearchResults, StrategySearchOptions, StrategySearchResult } from '../types.js';

/**
 * Base interface for all search strategies
 */
export interface SearchStrategy {
  /**
   * Execute a search with the given options
   * @param options Search options including query and filters
   * @returns Promise resolving to categorized search results
   */
  search(options: StrategySearchOptions): Promise<StrategySearchResult>;

  /**
   * Check if this strategy can handle the given search options
   * @param options Search options to evaluate
   * @returns true if this strategy can handle the search
   */
  canHandle(options: StrategySearchOptions): boolean;

  /**
   * Strategy name for logging and debugging
   */
  readonly name: string;
}

/**
 * Abstract base class providing common functionality for strategies
 */
export abstract class BaseSearchStrategy implements SearchStrategy {
  abstract readonly name: string;

  abstract search(options: StrategySearchOptions): Promise<StrategySearchResult>;
  abstract canHandle(options: StrategySearchOptions): boolean;

  /**
   * Create an empty search result
   */
  protected emptyResult(strategy: 'chroma' | 'sqlite' | 'hybrid'): StrategySearchResult {
    return {
      results: {
        observations: [],
        sessions: [],
        prompts: []
      },
      usedChroma: strategy === 'chroma' || strategy === 'hybrid',
      fellBack: false,
      strategy
    };
  }
}
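The `canHandle`/`search` split suggests how an orchestrator can dispatch: walk the strategies in priority order and take the first whose `canHandle()` accepts the options. The sketch below is illustrative of that pattern only; the names and the priority order are assumptions, not the real SearchOrchestrator code:

```typescript
// Sketch of first-match strategy dispatch over canHandle(), with simplified
// stand-ins for StrategySearchOptions and the strategy classes.
interface Options { query?: string; concepts?: string }
interface Strategy { name: string; canHandle(o: Options): boolean }

function pickStrategy(strategies: Strategy[], options: Options): Strategy | undefined {
  return strategies.find(s => s.canHandle(options));
}

// Hypothetical canHandle predicates echoing the three strategies' rules.
const hybrid: Strategy = { name: 'hybrid', canHandle: o => !!o.concepts };
const chroma: Strategy = { name: 'chroma', canHandle: o => !!o.query };
const sqlite: Strategy = { name: 'sqlite', canHandle: o => !o.query };

console.assert(pickStrategy([hybrid, chroma, sqlite], { concepts: 'auth' })?.name === 'hybrid');
console.assert(pickStrategy([hybrid, chroma, sqlite], { query: 'login bug' })?.name === 'chroma');
console.assert(pickStrategy([hybrid, chroma, sqlite], {})?.name === 'sqlite');
```

Ordering matters here: because SQLiteSearchStrategy also serves as the Chroma fallback, it belongs last so the more capable strategies get first refusal.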
120
src/services/worker/search/types.ts
Normal file
@@ -0,0 +1,120 @@
/**
 * Search Types - Type definitions for the search module
 * Centralizes all search-related types, options, and result interfaces
 */

import { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult, SearchOptions, DateRange } from '../../sqlite/types.js';

// Re-export base types for convenience
export { ObservationSearchResult, SessionSummarySearchResult, UserPromptSearchResult, SearchOptions, DateRange };

/**
 * Constants used across search strategies
 */
export const SEARCH_CONSTANTS = {
  RECENCY_WINDOW_DAYS: 90,
  RECENCY_WINDOW_MS: 90 * 24 * 60 * 60 * 1000,
  DEFAULT_LIMIT: 20,
  CHROMA_BATCH_SIZE: 100
} as const;
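A sketch of how the recency constants might be applied. The helper `isWithinRecencyWindow` is hypothetical (this diff only defines the constants, not their call sites); it assumes epochs are millisecond timestamps, matching `RECENCY_WINDOW_MS`.

```typescript
// Mirrors SEARCH_CONSTANTS.RECENCY_WINDOW_MS: 90 days in milliseconds.
const RECENCY_WINDOW_MS = 90 * 24 * 60 * 60 * 1000;

// Hypothetical helper: true if a document's created_at_epoch falls inside
// the recency window relative to `now`.
function isWithinRecencyWindow(createdAtEpoch: number, now: number = Date.now()): boolean {
  return now - createdAtEpoch <= RECENCY_WINDOW_MS;
}

const now = Date.now();
console.log(isWithinRecencyWindow(now - 1000, now));                       // 1 second old: recent
console.log(isWithinRecencyWindow(now - 100 * 24 * 60 * 60 * 1000, now)); // 100 days old: stale
```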
/**
 * Document types stored in Chroma
 */
export type ChromaDocType = 'observation' | 'session_summary' | 'user_prompt';

/**
 * Chroma query result with typed metadata
 */
export interface ChromaQueryResult {
  ids: number[];
  distances: number[];
  metadatas: ChromaMetadata[];
}

/**
 * Metadata stored with each Chroma document
 */
export interface ChromaMetadata {
  sqlite_id: number;
  doc_type: ChromaDocType;
  memory_session_id: string;
  project: string;
  created_at_epoch: number;
  type?: string;
  title?: string;
  subtitle?: string;
  concepts?: string;
  files_read?: string;
  files_modified?: string;
  field_type?: string;
  prompt_number?: number;
}

/**
 * Unified search result type for all document types
 */
export type SearchResult = ObservationSearchResult | SessionSummarySearchResult | UserPromptSearchResult;

/**
 * Search results container with categorized results
 */
export interface SearchResults {
  observations: ObservationSearchResult[];
  sessions: SessionSummarySearchResult[];
  prompts: UserPromptSearchResult[];
}

/**
 * Extended search options for the search module
 */
export interface ExtendedSearchOptions extends SearchOptions {
  /** Type filter for search API (observations, sessions, prompts) */
  searchType?: 'observations' | 'sessions' | 'prompts' | 'all';
  /** Observation type filter (decision, bugfix, feature, etc.) */
  obsType?: string | string[];
  /** Concept tags to filter by */
  concepts?: string | string[];
  /** File paths to filter by */
  files?: string | string[];
  /** Output format */
  format?: 'text' | 'json';
}

/**
 * Search strategy selection hint
 */
export type SearchStrategyHint = 'chroma' | 'sqlite' | 'hybrid' | 'auto';

/**
 * Options passed to search strategies
 */
export interface StrategySearchOptions extends ExtendedSearchOptions {
  /** Query text for semantic search (optional for filter-only queries) */
  query?: string;
  /** Force a specific strategy */
  strategyHint?: SearchStrategyHint;
}

/**
 * Result from a search strategy
 */
export interface StrategySearchResult {
  results: SearchResults;
  /** Whether Chroma was used successfully */
  usedChroma: boolean;
  /** Whether fallback was triggered */
  fellBack: boolean;
  /** Strategy that produced the results */
  strategy: SearchStrategyHint;
}

/**
 * Combined result type for timeline items
 */
export interface CombinedResult {
  type: 'observation' | 'session' | 'prompt';
  data: SearchResult;
  epoch: number;
  created_at: string;
}
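The `CombinedResult` shape above carries an `epoch` precisely so heterogeneous results can be merged into one timeline. A sketch, assuming newest-first ordering capped at `DEFAULT_LIMIT`; the `toTimeline` helper and the simplified inline types are illustrative, not part of this diff.

```typescript
// Simplified stand-in for the real CombinedResult (data is left untyped here).
interface CombinedResult {
  type: 'observation' | 'session' | 'prompt';
  data: unknown;
  epoch: number;
  created_at: string;
}

// Mirrors SEARCH_CONSTANTS.DEFAULT_LIMIT.
const DEFAULT_LIMIT = 20;

// Hypothetical helper: merge mixed results into a single timeline,
// newest first, capped at the default result limit.
function toTimeline(items: CombinedResult[], limit: number = DEFAULT_LIMIT): CombinedResult[] {
  return [...items].sort((a, b) => b.epoch - a.epoch).slice(0, limit);
}

const timeline = toTimeline([
  { type: 'prompt', data: null, epoch: 1700000100000, created_at: '2023-11-14T22:15:00Z' },
  { type: 'observation', data: null, epoch: 1700000200000, created_at: '2023-11-14T22:16:40Z' },
]);
console.log(timeline.map(i => i.type)); // observation before prompt (newer epoch first)
```

Keeping both `epoch` and `created_at` avoids re-parsing ISO strings in the sort hot path while still having a display-ready value.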