fix(llm): pin LLM edge functions to US/EU regions to prevent geo-block 403s (#2541)

OpenRouter returns 403 'This model is not available in your region' when
Vercel routes through edge nodes in regions where Google Gemini is blocked.
Pin chat-analyst, news, and intelligence edge functions to iad1/lhr1/fra1/sfo1.

Also improves error logging in callLlmReasoningStream to include model name
and full response body on non-2xx for easier future diagnosis.
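A minimal sketch of what the improved non-2xx handling might look like. This is illustrative only: `describeLlmError` and its signature are hypothetical (the actual change lives in `callLlmReasoningStream`, whose internals are not shown in this diff); the point is that the error message now carries the model name and the full response body rather than just the status code.

```typescript
// Hypothetical helper illustrating the improved error logging on non-2xx
// responses. Only `status` and `text()` are assumed on the response, so a
// mock works as well as a real fetch Response.
async function describeLlmError(
  model: string,
  res: { status: number; text(): Promise<string> },
): Promise<string> {
  // Read the full body so geo-block messages like OpenRouter's
  // "This model is not available in your region" surface in logs.
  const body = await res.text();
  return `LLM request failed: model=${model} status=${res.status} body=${body}`;
}
```

With this, a 403 from a geo-blocked edge node logs the offending model and the provider's explanation instead of an opaque status code.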
Author: Elie Habib
Date: 2026-03-30 11:08:14 +04:00 (committed by GitHub)
Parent: bf27e474c2
Commit: cf5328f2d4
4 changed files with 5 additions and 4 deletions

@@ -1,4 +1,4 @@
-export const config = { runtime: 'edge' };
+export const config = { runtime: 'edge', regions: ['iad1', 'lhr1', 'fra1', 'sfo1'] };
 import { createDomainGateway, serverOptions } from '../../../server/gateway';
 import { createIntelligenceServiceRoutes } from '../../../src/generated/server/worldmonitor/intelligence/v1/service_server';