worldmonitor/scripts
Elie Habib ac9e3c8af2 refactor(llm): consolidate provider chain to single source of truth (#1640)
* fix(relay): add LLM fallback chain to ais-relay classify

Replace the single Groq-only LLM call with a provider fallback chain
(Groq → OpenRouter → Ollama), matching the seed-insights.mjs pattern.
If Groq fails or is unavailable, classify falls through to the
next configured provider automatically.
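The fallback pattern described above can be sketched as a loop over an ordered provider list, where the first provider to answer wins. This is an illustrative sketch only; `Provider` and `classifyWithFallback` are hypothetical names, not the actual ais-relay code.

```typescript
// Hypothetical sketch of an ordered LLM provider fallback chain.
type Provider = {
  name: string;
  call: (prompt: string) => Promise<string>;
};

async function classifyWithFallback(
  providers: Provider[],
  prompt: string,
): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      // First provider to succeed wins.
      return await p.call(prompt);
    } catch (err) {
      // Fall through to the next configured provider.
      lastError = err;
    }
  }
  throw new Error(`all LLM providers failed: ${String(lastError)}`);
}
```

A caller would pass the providers in the chain order it wants; an unavailable provider simply throws and the loop moves on.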

* refactor(llm): consolidate provider chain to single source of truth

- Fix OpenRouter model: openrouter/free → google/gemini-2.5-flash in canonical llm.ts
- Migrate 4 intelligence handlers (classify-event, batch-classify, deduct-situation,
  get-country-intel-brief) from hardcoded Groq-only to callLlm() with full
  ollama → groq → openrouter fallback chain
- Remove duplicate getProviderCredentials from news/v1/_shared.ts, re-export canonical
- Remove orphaned GROQ_API_URL/GROQ_MODEL from intelligence/v1/_shared.ts
- Reorder script provider chains (ais-relay.cjs, seed-insights.mjs) to canonical
  ollama → groq → openrouter order
- Net -161 lines: eliminated duplicated provider logic across 9 files
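A single source of truth for the chain order might look like the sketch below: one canonical ordering constant, filtered down to the providers that actually have credentials configured. The function name and the specific env-var names (`OLLAMA_URL`, `GROQ_API_KEY`, `OPENROUTER_API_KEY`) are assumptions for illustration, not the actual llm.ts API.

```typescript
// Canonical chain order, per the commit: ollama → groq → openrouter.
const CHAIN_ORDER = ["ollama", "groq", "openrouter"] as const;
type ProviderName = (typeof CHAIN_ORDER)[number];

// Hypothetical helper: keep only providers whose credentials are set,
// preserving the canonical order.
function configuredProviders(
  env: Record<string, string | undefined>,
): ProviderName[] {
  const available: Record<ProviderName, boolean> = {
    ollama: Boolean(env.OLLAMA_URL),
    groq: Boolean(env.GROQ_API_KEY),
    openrouter: Boolean(env.OPENROUTER_API_KEY),
  };
  return CHAIN_ORDER.filter((p) => available[p]);
}
```

Scripts and handlers then consume the same filtered list instead of each hardcoding its own order, which is what eliminates the duplication counted in the -161 lines.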

* fix: eliminate double JSON parse in classify-event, throw on runSeed verification failure
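A double-parse bug of this kind typically arises when one code path hands over an already-decoded value and another hands over a raw JSON string, and the receiver parses unconditionally. The defensive single-parse helper below is a hypothetical sketch of the shape of the fix, not the actual classify-event code.

```typescript
// Hypothetical: parse raw JSON exactly once. If the decoded value is itself
// a JSON string (double-encoded payload), decode that inner layer; never
// re-parse a value that is already an object.
function parseOnce(raw: string): unknown {
  const value = JSON.parse(raw);
  return typeof value === "string" ? JSON.parse(value) : value;
}
```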

* fix(tests): add llm module alias to country-intel-brief test fixture

* fix: preserve generic LLM_API_* fallback, add retry to seed verification

- Add 'generic' provider to callLlm() chain for the LLM_API_URL/LLM_API_KEY/LLM_MODEL
  env vars (preserves existing OpenAI-compatible endpoint contract)
- Change seed verification to warn-only with 1 retry instead of fatal throw
  (write already succeeded, transient read failure shouldn't fail the job)
- Update docs to reflect new provider fallback chain
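The warn-only verification change above can be sketched as: try the read, retry once on failure, and log a warning instead of throwing, because the write has already been committed. `verifySeed` and its parameters are illustrative placeholders, not the actual seed-insights.mjs code.

```typescript
// Hypothetical sketch: verify a seed write with one retry, warning instead
// of throwing, since a transient read failure shouldn't fail the job.
async function verifySeed(
  verify: () => Promise<boolean>,
  warn: (msg: string) => void,
  retries = 1,
): Promise<void> {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      if (await verify()) return; // read confirmed the write
    } catch {
      // Transient read failure: fall through and retry.
    }
  }
  warn("seed verification failed after retry; write already committed");
}
```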
2026-03-15 11:44:42 +04:00