Commit Graph

3 Commits

Elie Habib
6c3d2770f7 feat: split settings into LLMs and API Keys tabs, fix keychain vault and Ollama UX
- Split settings window into 3 tabs: LLMs (Ollama/Groq/OpenRouter),
  API Keys (data feeds), and Debug & Logs
- Add featureFilter option to RuntimeConfigPanel for rendering subsets
- Consolidate keychain to single JSON vault entry (1 macOS prompt vs 20)
- Add Ollama model discovery with /api/tags + /v1/models fallback (see the sketch after this entry)
- Strip <think> reasoning tokens from Ollama responses
- Suppress thinking with think:false in Ollama request body
- Parallel secret verification with 15s global timeout
- Fix manual model input overlapping dropdown (CSS grid-area + hidden-input class)
- Add loading spinners to settings tab panels
- Suppress notification popups when settings window is open
- Filter embed models from Ollama dropdown
- Fix settings window black screen flash with inline dark background
2026-02-20 00:02:48 +04:00
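
The Ollama-related items above (model discovery fallback, embed-model filtering, <think> stripping) could look roughly like the following. The /api/tags and /v1/models routes are Ollama's real native and OpenAI-compatible listing endpoints, but the function names, response handling, and the embed-filter heuristic are illustrative assumptions, not the code from this commit.

```ts
// Sketch: list Ollama models via the native /api/tags route, falling back to
// the OpenAI-compatible /v1/models route. Names and the embed-model filter
// heuristic are assumptions for illustration.
export async function discoverOllamaModels(baseUrl: string): Promise<string[]> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    if (!res.ok) throw new Error(`tags request failed: ${res.status}`);
    const data = (await res.json()) as { models: { name: string }[] };
    return filterEmbedModels(data.models.map((m) => m.name));
  } catch {
    // Fallback for servers that only expose the OpenAI-compatible listing.
    const res = await fetch(`${baseUrl}/v1/models`);
    const data = (await res.json()) as { data: { id: string }[] };
    return filterEmbedModels(data.data.map((m) => m.id));
  }
}

// Hypothetical heuristic: keep embedding-only models out of the dropdown.
function filterEmbedModels(names: string[]): string[] {
  return names.filter((name) => !/embed/i.test(name));
}

// Remove <think>...</think> reasoning blocks from a model response.
export function stripThinkTokens(text: string): string {
  return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}
```
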
Elie Habib
1d1b1b209f fix: harden OpenAI-compatible endpoint flow for Ollama/LM Studio
2026-02-19 20:18:13 +04:00
Claude
5cdc41712c refactor: unify summarization providers behind common interfaces
Server-side: extract shared CORS, validation, caching, prompt building,
and response shaping into api/_summarize-handler.js factory. Each
endpoint (Groq, OpenRouter, Ollama) becomes a thin wrapper calling
createSummarizeHandler() with a provider config: credentials, API URL,
model, headers, and provider label.
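
A minimal sketch of that factory shape, assuming a Vercel-style (req, res) serverless handler and an OpenAI-compatible chat-completions request; the option names and response shaping are simplified guesses, and the real handler also covers the CORS, validation, caching, and prompt-building pieces mentioned above:

```ts
// Sketch of api/_summarize-handler.js (shown in TypeScript for clarity).
// ProviderConfig fields and the request/response shapes are assumptions.
type ProviderConfig = {
  label: string;                     // e.g. "Groq"
  apiUrl: string;                    // chat-completions endpoint URL
  model: string;
  apiKey?: string;
  headers?: Record<string, string>;
};

export function createSummarizeHandler(cfg: ProviderConfig) {
  return async function handler(req: any, res: any) {
    if (req.method !== "POST") {
      return res.status(405).json({ error: "Method not allowed" });
    }
    const text = req.body?.text;
    if (!text) return res.status(400).json({ error: "Missing text" });

    const upstream = await fetch(cfg.apiUrl, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        ...(cfg.apiKey ? { Authorization: `Bearer ${cfg.apiKey}` } : {}),
        ...cfg.headers,
      },
      body: JSON.stringify({
        model: cfg.model,
        messages: [{ role: "user", content: `Summarize:\n\n${text}` }],
      }),
    });
    if (!upstream.ok) {
      return res.status(502).json({ error: `${cfg.label} request failed` });
    }

    const data = await upstream.json();
    return res.status(200).json({
      provider: cfg.label,
      summary: data?.choices?.[0]?.message?.content ?? "",
    });
  };
}
```
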

Client-side: replace three near-identical tryOllama/tryGroq/tryOpenRouter
functions with a single tryApiProvider() driven by an API_PROVIDERS
config array. Add runApiChain() helper that loops the chain with
progress callbacks. Simplify translateText() from three copy-pasted
blocks to a single loop over the same provider array.
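
A rough sketch of that config-driven client chain; the provider fields, endpoint paths, and the progress callback signature are assumptions, not the actual summarization.ts code:

```ts
// Sketch: one generic provider call plus a chain runner, replacing the three
// copy-pasted tryOllama/tryGroq/tryOpenRouter functions. Field names and
// endpoint paths are illustrative.
type ApiProvider = { name: string; endpoint: string };

const API_PROVIDERS: ApiProvider[] = [
  { name: "Ollama", endpoint: "/api/ollama-summarize" },
  { name: "Groq", endpoint: "/api/groq-summarize" },
  { name: "OpenRouter", endpoint: "/api/openrouter-summarize" },
];

async function tryApiProvider(provider: ApiProvider, text: string): Promise<string | null> {
  try {
    const res = await fetch(provider.endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text }),
    });
    if (!res.ok) return null;
    const { summary } = (await res.json()) as { summary?: string };
    return summary || null;
  } catch {
    return null; // network error: fall through to the next provider
  }
}

// Walk the provider chain in order, reporting progress, until one succeeds.
async function runApiChain(
  text: string,
  onProgress?: (providerName: string) => void,
): Promise<string | null> {
  for (const provider of API_PROVIDERS) {
    onProgress?.(provider.name);
    const summary = await tryApiProvider(provider, text);
    if (summary) return summary;
  }
  return null;
}
```
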

- groq-summarize.js: 297 → 30 lines
- openrouter-summarize.js: 295 → 33 lines
- ollama-summarize.js: 289 → 34 lines
- summarization.ts: 336 → 239 lines
- New _summarize-handler.js: 315 lines (shared)
- Net: 566 lines of duplicated code removed

Adding a new LLM provider now requires only a provider config object
in the endpoint file + one entry in the API_PROVIDERS array.
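
For example, a hypothetical new endpoint could reduce to a single factory call; the provider name, URL, model, and env var below are invented purely for illustration:

```ts
// api/acme-summarize.js (hypothetical provider, illustrative names only)
import { createSummarizeHandler } from "./_summarize-handler.js";

export default createSummarizeHandler({
  label: "Acme",
  apiUrl: "https://api.acme.example/v1/chat/completions",
  model: "acme-small",
  apiKey: process.env.ACME_API_KEY,
});

// ...plus one client-side entry in API_PROVIDERS:
// { name: "Acme", endpoint: "/api/acme-summarize" }
```
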

Tests: 13 new tests for the shared factory (cache key, dedup, handler
creation, fallback, error casing, HTTP methods). All 42 existing tests
pass unchanged.

https://claude.ai/code/session_01AGg9fG6LZ8Y6XhvLszdfeY
2026-02-19 15:11:25 +00:00