chore: bump version to 10.7.0

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: Alex Newman
Date: 2026-04-04 14:58:05 -07:00
parent 902db6b2e1
commit c5129ed016
12 changed files with 209 additions and 40 deletions


@@ -0,0 +1,7 @@
<claude-mem-context>
# claude-mem: Cross-Session Memory
*No context yet. Complete your first session and context will appear here.*
Use claude-mem's MCP search tools for manual memory queries.
</claude-mem-context>


@@ -10,7 +10,7 @@
"plugins": [
{
"name": "claude-mem",
"version": "10.6.3",
"version": "10.7.0",
"source": "./plugin",
"description": "Persistent memory system for Claude Code - context compression across sessions"
}

.github/copilot-instructions.md (vendored, new file)

@@ -0,0 +1,7 @@
<claude-mem-context>
# claude-mem: Cross-Session Memory
*No context yet. Complete your first session and context will appear here.*
Use claude-mem's MCP search tools for manual memory queries.
</claude-mem-context>


@@ -0,0 +1,68 @@
# Memory Context from Past Sessions
The following context is from claude-mem, a persistent memory system that tracks your coding sessions.
# $CMEM claude-mem 2026-04-03 6:48pm PDT
Legend: 🎯session 🔴bugfix 🟣feature 🔄refactor ✅change 🔵discovery ⚖decision
Format: ID TIME TYPE TITLE
Fetch details: get_observations([IDs]) | Search: mem-search skill
Stats: 50 obs (18,868t read) | 401,168t work | 95% savings
### Apr 3, 2026
62994 1:47p 🔴 Merge Commit Finalized on thedotmack/npx-gemini-cli Branch
62995 1:48p 🔵 Worker Running but Health Endpoint Doesn't Accept POST
62996 " 🔵 Worker Health Endpoint Returns Detailed Status via GET
62997 1:49p 🔵 Worker Service Timeout and Shutdown Behavior in worker-service.ts
62998 " 🔵 claude-mem Hook Architecture Defined in plugin/hooks/hooks.json
62999 " 🔵 Session Idle Timeout Architecture: Two-Tier System in claude-mem
63000 " 🔵 Orphan Reaper Runs Every 30 Seconds; Sessions Orphaned After 6 Hours
63001 1:51p 🔵 POST /api/sessions/complete Removes Sessions from Active Map to Unblock Orphan Reaper
63002 1:52p 🔵 Stop Hook Summarize Flow: Extracts Last Assistant Message from Transcript
63004 " 🔵 POST /api/sessions/summarize: Privacy Check Before Queuing SDK Agent
63005 " 🔵 SessionManager.deleteSession Verifies Subprocess Exit to Prevent Zombies
63007 " 🔵 deleteSession: 4-Step Teardown with Generator and Subprocess Timeouts
63008 1:53p 🔵 Queue Depth Always Read from Database; Generator Restarts Capped at 3
63009 " 🔴 Fixed Lost Summaries: session-complete Now Waits for Pending Work Before Deleting Session
63010 1:54p 🔴 SessionEnd Hook Timeout Increased to 180s
63014 2:00p 🔵 claude-mem Hook Architecture and Exit Code System
63015 2:01p 🔵 SessionEnd Hook Has a 1.5s Default Timeout Controlled by Environment Variable
63016 2:02p 🔴 Stop Hook Now Owns Full Session Lifecycle: Summarize → Poll → Complete
63017 " 🔵 Missing /api/sessions/status Route — Only DB-ID Variant Exists
63018 2:03p 🔴 Added /api/sessions/status Route Registration to SessionRoutes
63020 " 🟣 Added handleStatusByClaudeId Handler for GET /api/sessions/status
63022 " 🔄 Removed Pending-Work Polling from /api/sessions/complete — Moved to Stop Hook
63024 " 🔄 SessionEnd Hook Reverted to Fast Fire-and-Forget (2s Timeout)
63026 2:04p 🔵 claude-mem hooks.json Full Hook Lifecycle Configuration
63027 2:05p ✅ Push to Pull Request
63028 " 🔵 Pre-Push State: claude-mem Repository Changes
63029 " 🔴 Fix Lost Summaries: Move Summary Wait into Stop Hook
63035 2:11p ✅ Testing Plan Created for tmux-cli npx Installation Flows
63036 2:12p 🔵 claude-mem Supports 13 npx Installation Flows Across IDE Integrations
63037 " 🔵 Detailed Integration Strategies for All 13 claude-mem npx Installation Flows
63038 2:13p ✅ NPX Install Flow Test Plan Document Created
63039 " ✅ 12 TODO Tasks Created for npx Install Flow Testing
63040 2:19p 🟣 Comprehensive Test Suite Requested for Claude-Mem CLI
63041 2:20p 🔵 NPX Install Flow Test Plan Exists for 12 IDE Integrations
63042 " 🟣 Phase 2 E2E Runtime Testing Added to NPX Install Test Plan
63043 " ✅ Test Tasks Updated with Phase 2 E2E Runtime Steps for 5 IDE Flows
63044 " ✅ All Remaining Test Tasks (612) Updated with Phase 2 E2E Runtime Steps
63079 6:31p ⚖️ Test Execution via Subagents Using /do Command
63080 6:32p 🔵 IDE Auto-Detection Module in claude-mem
63081 " 🔵 Install Command Architecture with Multi-IDE Dispatch
63082 " 🔵 MCP Integrations Module for 6 IDEs
63083 " 🔵 Cursor, Windsurf, and Gemini CLI Hook-Based Integrations
63084 " 🔵 OpenCode, OpenClaw, and Codex CLI Installers
63085 6:33p 🔵 tmux-cli Available for Automated Testing
63086 " 🔵 NPX Install Flow Test Plan — 12 IDE Flows
63087 6:34p 🟣 Detailed Test Execution Plan Created for NPX Install Flows
63103 6:47p 🔵 NPX Install Fails for Windsurf IDE with Missing rxjs Dependency
63104 " 🔵 Windsurf Install Failure Was a Dependency Ordering Race
63105 " 🟣 claude-mem Gemini CLI Integration: 8 Hooks Registered
63106 " 🟣 claude-mem OpenCode Integration: Plugin File + AGENTS.md Context
Access 401k tokens of past work via get_observations([IDs]) or mem-search skill.
---
*Auto-updated by claude-mem after each session. Use MCP search tools for detailed queries.*

WARP.md (new file)

@@ -0,0 +1,7 @@
<claude-mem-context>
# claude-mem: Cross-Session Memory
*No context yet. Complete your first session and context will appear here.*
Use claude-mem's MCP search tools for manual memory queries.
</claude-mem-context>


@@ -1,6 +1,6 @@
{
"name": "claude-mem",
"version": "10.6.3",
"version": "10.7.0",
"description": "Memory compression system for Claude Code - persist context across sessions",
"keywords": [
"claude",


@@ -1,6 +1,6 @@
{
"name": "claude-mem",
"version": "10.6.3",
"version": "10.7.0",
"description": "Persistent memory system for Claude Code - seamlessly preserve context across sessions",
"author": {
"name": "Alex Newman"


@@ -1,6 +1,6 @@
{
"name": "claude-mem-plugin",
"version": "10.6.3",
"version": "10.7.0",
"private": true,
"description": "Runtime dependencies for claude-mem bundled hooks",
"type": "module",


@@ -114,7 +114,7 @@ Set the \`cycles\` parameter to \`"ref"\` to resolve cyclical schemas with defs.
${c}`}var bP=new Set([".js",".jsx",".ts",".tsx",".mjs",".cjs",".py",".pyw",".go",".rs",".rb",".java",".cs",".cpp",".c",".h",".hpp",".swift",".kt",".php",".vue",".svelte"]),xP=new Set(["node_modules",".git","dist","build",".next","__pycache__",".venv","venv","env",".env","target","vendor",".cache",".turbo","coverage",".nyc_output",".claude",".smart-file-read"]),kP=512*1024;async function*n$(e,t,r=20){if(r<=0)return;let n;try{n=await(0,Sn.readdir)(e,{withFileTypes:!0})}catch{return}for(let o of n){if(o.name.startsWith(".")&&o.name!=="."||xP.has(o.name))continue;let i=(0,pi.join)(e,o.name);if(o.isDirectory())yield*n$(i,t,r-1);else if(o.isFile()){let a=o.name.slice(o.name.lastIndexOf("."));bP.has(a)&&(yield i)}}}async function SP(e){try{let t=await(0,Sn.stat)(e);if(t.size>kP||t.size===0)return null;let r=await(0,Sn.readFile)(e,"utf-8");return r.slice(0,1e3).includes("\0")?null:r}catch{return null}}async function o$(e,t,r={}){let n=r.maxResults||20,o=t.toLowerCase(),i=o.split(/[\s_\-./]+/).filter(h=>h.length>0),a=[];for await(let h of n$(e,e)){if(r.filePattern&&!(0,pi.relative)(e,h).toLowerCase().includes(r.filePattern.toLowerCase()))continue;let _=await SP(h);_&&a.push({absolutePath:h,relativePath:(0,pi.relative)(e,h),content:_})}let s=e$(a),c=[],u=[],l=0;for(let[h,_]of s){l+=wP(_);let E=Os(h.toLowerCase(),i)>0,I=[],A=(j,Le)=>{for(let de of j){let Wt=0,Qe="",Kt=Os(de.name.toLowerCase(),i);Kt>0&&(Wt+=Kt*3,Qe="name match"),de.signature.toLowerCase().includes(o)&&(Wt+=2,Qe=Qe?`${Qe} + signature`:"signature match"),de.jsdoc&&de.jsdoc.toLowerCase().includes(o)&&(Wt+=1,Qe=Qe?`${Qe} + jsdoc`:"jsdoc match"),Wt>0&&(E=!0,I.push({filePath:h,symbolName:Le?`${Le}.${de.name}`:de.name,kind:de.kind,signature:de.signature,jsdoc:de.jsdoc,lineStart:de.lineStart,lineEnd:de.lineEnd,matchReason:Qe})),de.children&&A(de.children,de.name)}};A(_.symbols),E&&(c.push(_),u.push(...I))}u.sort((h,_)=>{let b=Os(h.symbolName.toLowerCase(),i);return Os(_.symbolName.toLowerCase(),i)-b});let 
d=u.slice(0,n),m=new Set(d.map(h=>h.filePath)),p=c.filter(h=>m.has(h.filePath)).slice(0,n),g=p.reduce((h,_)=>h+_.foldedTokenEstimate,0);return{foldedFiles:p,matchingSymbols:d,totalFilesScanned:a.length,totalSymbolsFound:l,tokenEstimate:g}}function Os(e,t){let r=0;for(let n of t)if(e===n)r+=10;else if(e.includes(n))r+=5;else{let o=0,i=0;for(let a of n){let s=e.indexOf(a,o);s!==-1&&(i++,o=s+1)}i===n.length&&(r+=1)}return r}function wP(e){let t=e.symbols.length;for(let r of e.symbols)r.children&&(t+=r.children.length);return t}function i$(e,t){let r=[];if(r.push(`\u{1F50D} Smart Search: "${t}"`),r.push(` Scanned ${e.totalFilesScanned} files, found ${e.totalSymbolsFound} symbols`),r.push(` ${e.matchingSymbols.length} matches across ${e.foldedFiles.length} files (~${e.tokenEstimate} tokens for folded view)`),r.push(""),e.matchingSymbols.length===0)return r.push(" No matching symbols found."),r.join(`
`);r.push("\u2500\u2500 Matching Symbols \u2500\u2500"),r.push("");for(let n of e.matchingSymbols){if(r.push(` ${n.kind} ${n.symbolName} (${n.filePath}:${n.lineStart+1})`),r.push(` ${n.signature}`),n.jsdoc){let o=n.jsdoc.split(`
`).find(i=>i.replace(/^[\s*/]+/,"").trim().length>0);o&&r.push(` \u{1F4AC} ${o.replace(/^[\s*/]+/,"").trim()}`)}r.push("")}r.push("\u2500\u2500 Folded File Views \u2500\u2500"),r.push("");for(let n of e.foldedFiles)r.push(kn(n)),r.push("");return r.push("\u2500\u2500 Actions \u2500\u2500"),r.push(" To see full implementation: use smart_unfold with file path and symbol name"),r.join(`
`)}var Of=require("node:fs/promises"),js=require("node:path"),zP="10.6.3";console.log=(...e)=>{ve.error("CONSOLE","Intercepted console output (MCP protocol protection)",void 0,{args:e})};var a$={search:"/api/search",timeline:"/api/timeline"};async function s$(e,t){ve.debug("SYSTEM","\u2192 Worker API",void 0,{endpoint:e,params:t});try{let r=new URLSearchParams;for(let[a,s]of Object.entries(t))s!=null&&r.append(a,String(s));let n=`${e}?${r}`,o=await Ts(n);if(!o.ok){let a=await o.text();throw new Error(`Worker API error (${o.status}): ${a}`)}let i=await o.json();return ve.debug("SYSTEM","\u2190 Worker API success",void 0,{endpoint:e}),i}catch(r){return ve.error("SYSTEM","\u2190 Worker API error",{endpoint:e},r),{content:[{type:"text",text:`Error calling Worker API: ${r instanceof Error?r.message:String(r)}`}],isError:!0}}}async function IP(e,t){ve.debug("HTTP","Worker API request (POST)",void 0,{endpoint:e});try{let r=await Ts(e,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify(t)});if(!r.ok){let o=await r.text();throw new Error(`Worker API error (${r.status}): ${o}`)}let n=await r.json();return ve.debug("HTTP","Worker API success (POST)",void 0,{endpoint:e}),{content:[{type:"text",text:JSON.stringify(n,null,2)}]}}catch(r){return ve.error("HTTP","Worker API error (POST)",{endpoint:e},r),{content:[{type:"text",text:`Error calling Worker API: ${r instanceof Error?r.message:String(r)}`}],isError:!0}}}async function EP(){try{return(await Ts("/api/health")).ok}catch(e){return ve.debug("SYSTEM","Worker health check failed",{},e),!1}}var c$=[{name:"__IMPORTANT",description:`3-LAYER WORKFLOW (ALWAYS FOLLOW):
`)}var Of=require("node:fs/promises"),js=require("node:path"),zP="10.7.0";console.log=(...e)=>{ve.error("CONSOLE","Intercepted console output (MCP protocol protection)",void 0,{args:e})};var a$={search:"/api/search",timeline:"/api/timeline"};async function s$(e,t){ve.debug("SYSTEM","\u2192 Worker API",void 0,{endpoint:e,params:t});try{let r=new URLSearchParams;for(let[a,s]of Object.entries(t))s!=null&&r.append(a,String(s));let n=`${e}?${r}`,o=await Ts(n);if(!o.ok){let a=await o.text();throw new Error(`Worker API error (${o.status}): ${a}`)}let i=await o.json();return ve.debug("SYSTEM","\u2190 Worker API success",void 0,{endpoint:e}),i}catch(r){return ve.error("SYSTEM","\u2190 Worker API error",{endpoint:e},r),{content:[{type:"text",text:`Error calling Worker API: ${r instanceof Error?r.message:String(r)}`}],isError:!0}}}async function IP(e,t){ve.debug("HTTP","Worker API request (POST)",void 0,{endpoint:e});try{let r=await Ts(e,{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify(t)});if(!r.ok){let o=await r.text();throw new Error(`Worker API error (${r.status}): ${o}`)}let n=await r.json();return ve.debug("HTTP","Worker API success (POST)",void 0,{endpoint:e}),{content:[{type:"text",text:JSON.stringify(n,null,2)}]}}catch(r){return ve.error("HTTP","Worker API error (POST)",{endpoint:e},r),{content:[{type:"text",text:`Error calling Worker API: ${r instanceof Error?r.message:String(r)}`}],isError:!0}}}async function EP(){try{return(await Ts("/api/health")).ok}catch(e){return ve.debug("SYSTEM","Worker health check failed",{},e),!1}}var c$=[{name:"__IMPORTANT",description:`3-LAYER WORKFLOW (ALWAYS FOLLOW):
1. search(query) \u2192 Get index with IDs (~50-100 tokens/result)
2. timeline(anchor=ID) \u2192 Get context around interesting results
3. get_observations([IDs]) \u2192 Fetch full details ONLY for filtered IDs

File diff suppressed because one or more lines are too long


@@ -0,0 +1,42 @@
---
name: claude-code-plugin-release
description: Automated semantic versioning and release workflow for Claude Code plugins. Handles version increments across package.json, marketplace.json, and plugin.json, build verification, git tagging, GitHub releases, and changelog generation.
---
# Version Bump & Release Workflow
**IMPORTANT:** You must first plan and write detailed release notes before starting the version bump workflow.
**CRITICAL:** ALWAYS commit EVERYTHING (including build artifacts). At the end of this workflow, NOTHING should be left uncommitted or unpushed. Run `git status` at the end to verify.
## Preparation
1. **Analyze**: Determine if the change is a **PATCH** (bug fixes), **MINOR** (features), or **MAJOR** (breaking) update.
2. **Environment**: Identify the repository owner and name (e.g., from `git remote -v`).
3. **Paths**: Verify existence of `package.json`, `.claude-plugin/marketplace.json`, and `plugin/.claude-plugin/plugin.json`.
## Workflow
1. **Update**: Increment version strings in all configuration files.
2. **Verify**: Use `grep` to ensure all files match the new version.
3. **Build**: Run `npm run build` to generate fresh artifacts.
4. **Commit**: Stage all changes including artifacts: `git add -A && git commit -m "chore: bump version to X.Y.Z"`.
5. **Tag**: Create an annotated tag: `git tag -a vX.Y.Z -m "Version X.Y.Z"`.
6. **Push**: `git push origin main && git push origin vX.Y.Z`.
7. **Release**: `gh release create vX.Y.Z --title "vX.Y.Z" --notes "RELEASE_NOTES"`.
8. **Changelog**: Regenerate `CHANGELOG.md` using the GitHub API and the provided script:
```bash
gh api repos/{owner}/{repo}/releases --paginate | ./scripts/generate_changelog.js > CHANGELOG.md
```
9. **Sync**: Commit and push the updated `CHANGELOG.md`.
10. **Notify**: Run `npm run discord:notify vX.Y.Z` if applicable.
11. **Finalize**: Run `git status` to ensure a clean working tree.
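Steps 1 and 2 above can be sketched as a small shell helper, using the three manifest paths listed under Preparation. The `bump_and_verify` function name and the `sed`-based rewrite are illustrative assumptions, not part of the workflow's official tooling:

```shell
#!/usr/bin/env bash
# Sketch of workflow steps 1-2: bump the "version" field in each manifest,
# then grep-verify that all three files agree. bump_and_verify is a
# hypothetical helper name; adapt paths if your layout differs.
set -euo pipefail

MANIFESTS=(
  package.json
  .claude-plugin/marketplace.json
  plugin/.claude-plugin/plugin.json
)

bump_and_verify() {
  local new_version="$1"
  local f
  # Step 1: rewrite the version string in place.
  for f in "${MANIFESTS[@]}"; do
    sed -i.bak -E "s/\"version\": *\"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"${new_version}\"/" "$f"
    rm -f "$f.bak"
  done
  # Step 2: every manifest must now carry the new version.
  for f in "${MANIFESTS[@]}"; do
    grep -q "\"version\": \"${new_version}\"" "$f" \
      || { echo "version mismatch in $f" >&2; return 1; }
  done
}
```

Failing loudly on a mismatch here keeps a half-bumped tree from ever reaching the commit and tag steps.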
## Checklist
- [ ] All config files have matching versions
- [ ] `npm run build` succeeded
- [ ] Git tag created and pushed
- [ ] GitHub release created with notes
- [ ] `CHANGELOG.md` updated and pushed
- [ ] `git status` shows clean tree


@@ -0,0 +1,37 @@
#!/usr/bin/env node
const fs = require('fs');

/**
 * Processes GitHub release JSON from stdin and outputs a formatted CHANGELOG.md
 */
function generate() {
  try {
    const input = fs.readFileSync(0, 'utf8');
    if (!input || input.trim() === '') {
      process.stderr.write('No input received on stdin\n');
      process.exit(1);
    }
    const releases = JSON.parse(input);
    const lines = ['# Changelog', '', 'All notable changes to this project.', ''];
    releases.slice(0, 50).forEach(r => {
      const date = r.published_at.split('T')[0];
      lines.push(`## [${r.tag_name}] - ${date}`);
      lines.push('');
      if (r.body) lines.push(r.body.trim());
      lines.push('');
    });
    process.stdout.write(lines.join('\n') + '\n');
  } catch (err) {
    process.stderr.write(`Error generating changelog: ${err.message}\n`);
    process.exit(1);
  }
}

generate();