mirror of
https://github.com/koala73/worldmonitor.git
synced 2026-04-25 17:14:57 +02:00
* chore(api): enforce sebuf contract via exceptions manifest (#3207)

Adds api/api-route-exceptions.json as the single source of truth for non-proto /api/ endpoints, with scripts/enforce-sebuf-api-contract.mjs gating every PR via npm run lint:api-contract.

Fixes the root-only blind spot in the prior allowlist (tests/edge-functions.test.mjs), which only scanned top-level *.js files and missed nested paths and .ts endpoints — the gap that let api/supply-chain/v1/country-products.ts and friends drift under proto domain URL prefixes unchallenged.

Checks both directions: every api/<domain>/v<N>/[rpc].ts must pair with a generated service_server.ts (so a deleted proto fails CI), and every generated service must have an HTTP gateway (no orphaned generated code).

Manifest entries require category + reason + owner, with removal_issue mandatory for temporary categories (deferred, migration-pending) and forbidden for permanent ones. .github/CODEOWNERS pins the manifest to @SebastienMelki so new exceptions don't slip through review. The manifest only shrinks: migration-pending entries (19 today) will be removed as subsequent commits in this PR land each migration.

* refactor(maritime): migrate /api/ais-snapshot → maritime/v1.GetVesselSnapshot (#3207)

The proto VesselSnapshot was carrying density + disruptions, but the frontend also needed sequence, relay status, and candidate_reports to drive the position-callback system. Those only lived on the raw relay passthrough, so the client had to keep hitting /api/ais-snapshot whenever callbacks were registered and fall back to the proto RPC only when the relay URL was gone. This commit pushes all three missing fields through the proto contract and collapses the dual fetch path into one proto client call.

Proto changes (proto/worldmonitor/maritime/v1/):
- VesselSnapshot gains sequence, status, candidate_reports.
- GetVesselSnapshotRequest gains include_candidates (query: include_candidates).
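The exceptions-manifest rules from the chore(api) commit above can be sketched as a small validation pass. This is a hypothetical sketch: the fields (category, reason, owner, removal_issue) and the temporary categories are taken from the commit message, but the type names and the 'permanent' / 'internal-helper' literals are assumptions, not the real scripts/enforce-sebuf-api-contract.mjs.

```typescript
// Hypothetical sketch of the manifest-entry rules described above; the real
// enforcement script may differ in detail.
type ExceptionEntry = {
  path: string;
  category: 'deferred' | 'migration-pending' | 'internal-helper' | 'permanent';
  reason: string;
  owner: string;
  removal_issue?: string;
};

// Temporary categories must name the issue that retires them; permanent ones must not.
const TEMPORARY = new Set(['deferred', 'migration-pending']);

function validateEntry(e: ExceptionEntry): string[] {
  const violations: string[] = [];
  if (!e.reason || !e.owner) violations.push(`${e.path}: reason and owner are required`);
  if (TEMPORARY.has(e.category) && !e.removal_issue)
    violations.push(`${e.path}: removal_issue is mandatory for temporary categories`);
  if (!TEMPORARY.has(e.category) && e.removal_issue)
    violations.push(`${e.path}: removal_issue is forbidden for permanent categories`);
  return violations;
}
```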
Handler (server/worldmonitor/maritime/v1/get-vessel-snapshot.ts):
- Forwards include_candidates to ?candidates=... on the relay.
- Separate 5-min in-memory caches for the candidates=on and candidates=off variants; they have very different payload sizes and should not share a slot.
- Per-request in-flight dedup preserved per variant.

Frontend (src/services/maritime/index.ts):
- fetchSnapshotPayload now calls MaritimeServiceClient.getVesselSnapshot directly, with includeCandidates threaded through. The raw-relay path, SNAPSHOT_PROXY_URL, DIRECT_RAILWAY_SNAPSHOT_URL, and LOCAL_SNAPSHOT_FALLBACK are gone — production already routed via Vercel, the "direct" branch only ever fired on localhost, and the proto gateway covers both.
- New toLegacyCandidateReport helper mirrors toDensityZone/toDisruptionEvent.

api/ais-snapshot.js deleted; manifest entry removed.

Codegen was scoped to worldmonitor.maritime.v1 (buf generate --path) — regenerating the full tree drops // @ts-nocheck from every client/server file and surfaces pre-existing type errors across 30+ unrelated services, which is not in scope for this PR.

Shape-diff vs legacy payload:
- disruptions / density: proto carries the same fields, just with the GeoCoordinates wrapper and enum strings (remapped client-side via the existing toDisruptionEvent / toDensityZone helpers).
- sequence, status.{connected,vessels,messages}: now populated from the proto response — previously hardcoded to 0/false in the proto fallback.
- candidateReports: same shape; optional numeric fields come through as 0 instead of undefined, which the legacy consumer already handled.

* refactor(sanctions): migrate /api/sanctions-entity-search → LookupSanctionEntity (#3207)

The proto docstring already claimed "OFAC + OpenSanctions" coverage, but the handler only fuzzy-matched a local OFAC Redis index — narrower than the legacy /api/sanctions-entity-search, which proxied OpenSanctions live (the source advertised in docs/api-proxies.mdx).
Deleting the legacy endpoint without expanding the handler would have been a silent coverage regression for external consumers.

Handler changes (server/worldmonitor/sanctions/v1/lookup-entity.ts):
- Primary path: live search against api.opensanctions.org/search/default with an 8s timeout and the same User-Agent the legacy edge fn used.
- Fallback path: the existing OFAC local fuzzy match, kept intact for when OpenSanctions is unreachable / rate-limiting.
- Response source field flips between 'opensanctions' (happy path) and 'ofac' (fallback) so clients can tell which index answered.
- Query validation tightened: rejects q > 200 chars (matches the legacy cap).

Rate limiting:
- Added /api/sanctions/v1/lookup-entity to ENDPOINT_RATE_POLICIES at 30/min per IP — matches the legacy createIpRateLimiter budget. The gateway already enforces per-endpoint policies via checkEndpointRateLimit.

Docs:
- docs/api-proxies.mdx — dropped the /api/sanctions-entity-search row (plus the orphaned /api/ais-snapshot row left over from the previous commit in this PR).
- docs/panels/sanctions-pressure.mdx — points at the new RPC URL and describes the OpenSanctions-primary / OFAC-fallback semantics.

api/sanctions-entity-search.js deleted; manifest entry removed.

* refactor(military): migrate /api/military-flights → ListMilitaryFlights (#3207)

Legacy /api/military-flights read a pre-baked Redis blob written by the seed-military-flights cron and returned flights in a flat app-friendly shape (lat/lon, lowercase enums, lastSeenMs). The proto RPC takes a bbox, fetches OpenSky live, classifies server-side, and returns nested GeoCoordinates + MILITARY_*_TYPE_* enum strings + lastSeenAt — same data, different contract.

fetchFromRedis in src/services/military-flights.ts was doing nothing sebuf-aware. Renamed it to fetchViaProto and rewrote it to:
- Instantiate MilitaryServiceClient against getRpcBaseUrl().
- Iterate MILITARY_QUERY_REGIONS (PACIFIC + WESTERN) in parallel — the same regions the desktop OpenSky path and the seed cron already use, so dashboard coverage tracks the analytic pipeline.
- Dedup by hexCode across regions.
- Map proto → app shape via the new mapProtoFlight helper plus three reverse enum maps (AIRCRAFT_TYPE_REVERSE, OPERATOR_REVERSE, CONFIDENCE_REVERSE).

The seed cron (scripts/seed-military-flights.mjs) stays put: it feeds regional-snapshot mobility, cross-source signals, correlation, and the health freshness check (api/health.js: 'military:flights:v1'). None of those read the legacy HTTP endpoint; they read the Redis key directly. The proto handler uses its own per-bbox cache keys under the same prefix, so dashboard traffic no longer races the seed cron's blob — the two paths diverge by a small refresh lag, which is acceptable.

Docs: dropped the /api/military-flights row from docs/api-proxies.mdx.

api/military-flights.js deleted; manifest entry removed.

Shape-diff vs legacy:
- f.location.{latitude,longitude} → f.lat, f.lon
- f.aircraftType: MILITARY_AIRCRAFT_TYPE_TANKER → 'tanker' via reverse map
- f.operator: MILITARY_OPERATOR_USAF → 'usaf' via reverse map
- f.confidence: MILITARY_CONFIDENCE_LOW → 'low' via reverse map
- f.lastSeenAt (number) → f.lastSeen (Date)
- f.enrichment → f.enriched (with field renames)
- Extra fields registration / aircraftModel / origin / destination / firstSeenAt now flow through where the proto populates them.

* fix(supply-chain): thread includeCandidates through chokepoint status (#3207)

Caught by the tsconfig.api.json typecheck in the pre-push hook (not covered by the plain tsc --noEmit run that ran before I pushed the ais-snapshot commit). The chokepoint status handler calls getVesselSnapshot internally with a static no-auth request — now required to include the new includeCandidates bool from the proto extension. Passing false: server-internal callers don't need per-vessel reports.
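The proto → app mapping from the military-flights shape-diff above can be sketched as follows. The reverse-map names come from the commit message; the exact field sets and the 'unknown' fallback are illustrative assumptions, not the real mapProtoFlight.

```typescript
// Hypothetical sketch of mapProtoFlight and the reverse enum maps; field
// coverage here is a subset of the real shape-diff.
const AIRCRAFT_TYPE_REVERSE: Record<string, string> = {
  MILITARY_AIRCRAFT_TYPE_TANKER: 'tanker',
};
const OPERATOR_REVERSE: Record<string, string> = {
  MILITARY_OPERATOR_USAF: 'usaf',
};
const CONFIDENCE_REVERSE: Record<string, string> = {
  MILITARY_CONFIDENCE_LOW: 'low',
};

interface ProtoFlight {
  hexCode: string;
  location: { latitude: number; longitude: number };
  aircraftType: string;
  operator: string;
  confidence: string;
  lastSeenAt: number;
}

function mapProtoFlight(f: ProtoFlight) {
  return {
    hexCode: f.hexCode,
    lat: f.location.latitude,            // f.location.{latitude,longitude} → f.lat, f.lon
    lon: f.location.longitude,
    aircraftType: AIRCRAFT_TYPE_REVERSE[f.aircraftType] ?? 'unknown',
    operator: OPERATOR_REVERSE[f.operator] ?? 'unknown',
    confidence: CONFIDENCE_REVERSE[f.confidence] ?? 'unknown',
    lastSeen: new Date(f.lastSeenAt),    // f.lastSeenAt (number) → f.lastSeen (Date)
  };
}
```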
* test(maritime): update getVesselSnapshot cache assertions (#3207)

The ais-snapshot migration replaced the single cachedSnapshot/cacheTimestamp pair with a per-variant cache so candidates-on and candidates-off payloads don't evict each other. The pre-push hook surfaced that tests/server-handlers still asserted the old variable names. Rewrote the assertions to match the new shape while preserving the invariants they actually guard:
- Freshness check against the slot TTL.
- Cache read before relay call.
- Per-slot in-flight dedup.
- Stale-serve on relay failure (result ?? slot.snapshot).

* chore(proto): restore // @ts-nocheck on regenerated maritime files (#3207)

I ran 'buf generate --path worldmonitor/maritime/v1' to scope the proto regen to the one service I was changing (to avoid the toolchain drift that drops @ts-nocheck from 60+ unrelated files — separate issue). But the repo convention is the 'make generate' target, which runs buf and then sed-prepends '// @ts-nocheck' to every generated .ts file. My scoped command skipped the sed step. The proto-check CI enforces the sed output, so the two maritime files need the directive restored.

* refactor(enrichment): decomm /api/enrichment/{company,signals} legacy edge fns (#3207)

Both endpoints were already ported to IntelligenceService:
- getCompanyEnrichment (/api/intelligence/v1/get-company-enrichment)
- listCompanySignals (/api/intelligence/v1/list-company-signals)

No frontend callers of the legacy /api/enrichment/* paths exist.
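The four invariants listed in the maritime test commit above can be made concrete with a minimal cache sketch. This assumes a slot shape like { snapshot, fetchedAt, inflight }; the real handler's internals may differ.

```typescript
// Minimal sketch of a per-variant snapshot cache with the invariants the
// rewritten assertions guard. All names here are assumptions.
type Slot = { snapshot: unknown | null; fetchedAt: number; inflight: Promise<unknown | null> | null };

const TTL_MS = 5 * 60_000;
const slots: Record<'on' | 'off', Slot> = {
  on: { snapshot: null, fetchedAt: 0, inflight: null },
  off: { snapshot: null, fetchedAt: 0, inflight: null },
};

async function getSnapshot(variant: 'on' | 'off', fetchRelay: () => Promise<unknown | null>) {
  const slot = slots[variant];
  // Invariant: cache read happens before any relay call (freshness vs slot TTL).
  if (slot.snapshot !== null && Date.now() - slot.fetchedAt < TTL_MS) return slot.snapshot;
  // Invariant: per-slot in-flight dedup — concurrent callers share one relay request.
  if (slot.inflight) return slot.inflight;
  slot.inflight = fetchRelay()
    .catch(() => null)
    .then((result) => {
      slot.inflight = null;
      if (result !== null) {
        slot.snapshot = result;
        slot.fetchedAt = Date.now();
      }
      // Invariant: stale-serve on relay failure (result ?? slot.snapshot).
      return result ?? slot.snapshot;
    });
  return slot.inflight;
}
```

Because the two variants hold separate slots, a candidates-on fetch can never evict the candidates-off payload, which is the regression the updated assertions pin.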
Removes:
- api/enrichment/company.js, signals.js, _domain.js
- api-route-exceptions.json migration-pending entries (58 remain)
- docs/api-proxies.mdx rows for /api/enrichment/{company,signals}
- docs/architecture.mdx reference, updated to the IntelligenceService RPCs

Verified: typecheck, typecheck:api, lint:api-contract (89 files / 58 entries), lint:boundaries, tests/edge-functions.test.mjs (136 pass), tests/enrichment-caching.test.mjs (14 pass — still guards the intelligence/v1 handlers), make generate is zero-diff.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(leads): migrate /api/{contact,register-interest} → LeadsService (#3207)

New leads/v1 sebuf service with two POST RPCs:
- SubmitContact → /api/leads/v1/submit-contact
- RegisterInterest → /api/leads/v1/register-interest

Handler logic ported 1:1 from api/contact.js + api/register-interest.js:
- Turnstile verification (desktop sources bypass, preserved)
- Honeypot (website field) silently accepts without upstream calls
- Free-email-domain gate on SubmitContact (422 ApiError)
- validateEmail (disposable/offensive/typo-TLD/MX) on RegisterInterest
- Convex writes via ConvexHttpClient (contactMessages:submit, registerInterest:register)
- Resend notification + confirmation emails (HTML templates unchanged)

Shared helpers moved to server/_shared/:
- turnstile.ts (getClientIp + verifyTurnstile)
- email-validation.ts (disposable/offensive/MX checks)

Rate limits preserved via ENDPOINT_RATE_POLICIES:
- submit-contact: 3/hour per IP (was in-memory 3/hr)
- register-interest: 5/hour per IP (was in-memory 5/hr; desktop sources were previously capped at 2/hr via a shared in-memory map — now 5/hr like everyone else, accepting the small regression in exchange for Upstash-backed global limiting)

Callers updated:
- pro-test/src/App.tsx contact form → new submit-contact path
- src-tauri/sidecar/local-api-server.mjs cloud-fallback rewrites /api/register-interest → /api/leads/v1/register-interest when
  proxying; keeps the local path for older desktop builds
- src/services/runtime.ts isKeyFreeApiTarget allows both old and new paths through the WORLDMONITOR_API_KEY-optional gate

Tests:
- tests/contact-handler.test.mjs rewritten to call the submitContact handler directly; asserts on ValidationError / ApiError
- tests/email-validation.test.mjs + tests/turnstile.test.mjs point at the new server/_shared/ modules

Deleted: api/contact.js, api/register-interest.js, api/_ip-rate-limit.js, api/_turnstile.js, api/_email-validation.js, api/_turnstile.test.mjs. Manifest entries removed (58 → 56). Docs updated (api-platform, api-commerce, usage-rate-limits).

Verified: npm run typecheck + typecheck:api + lint:api-contract (88 files / 56 entries) + lint:boundaries pass; full test:data (5852 tests) passes; make generate is zero-diff.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore(pro-test): rebuild bundle for leads/v1 contact form (#3207)

Updates the enterprise contact form to POST to /api/leads/v1/submit-contact (the old path /api/contact was removed in the previous commit). The bundle is rebuilt from the pro-test/src/App.tsx source change in 9ccd309d.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): address HIGH review findings 1-3 (#3207)

Three review findings from @koala73 on the sebuf-migration PR, all silent bugs that would have shipped to prod:

### 1. Sanctions rate-limit policy was dead code

ENDPOINT_RATE_POLICIES keyed the 30/min budget under /api/sanctions/v1/lookup-entity, but the generated route (from the proto RPC LookupSanctionEntity) is /api/sanctions/v1/lookup-sanction-entity. hasEndpointRatePolicy / getEndpointRatelimit are exact-string pathname lookups, so the mismatch meant the endpoint fell through to the generic 600/min global limiter instead of the advertised 30/min. Net effect: the live OpenSanctions proxy endpoint (unauthenticated, external upstream) had 20x the intended rate budget.
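The failure mode in finding #1 comes down to an exact-string lookup with a generic fallback. A minimal sketch (the limits are from the commit; the function name and policy shape are illustrative, not the real gateway code):

```typescript
// Hypothetical sketch of why a mistyped policy key becomes dead code.
const ENDPOINT_RATE_POLICIES: Record<string, { limit: number; windowSec: number }> = {
  // Keyed by the generated route — one character of drift and the policy never fires.
  '/api/sanctions/v1/lookup-sanction-entity': { limit: 30, windowSec: 60 },
};

const GLOBAL_POLICY = { limit: 600, windowSec: 60 };

function getEndpointPolicy(pathname: string) {
  // Exact-string lookup: no prefix matching, no normalization.
  return ENDPOINT_RATE_POLICIES[pathname] ?? GLOBAL_POLICY;
}
```

A mistyped key silently routes the endpoint to the 600/min global limiter, which is exactly the 20x budget inflation described above and what the later `lint:rate-limit-policies` check guards against.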
Fixed by renaming the policy key to match the generated route.

### 2. Lost stale-seed fallback on military-flights

Legacy api/military-flights.js cascaded military:flights:v1 → military:flights:stale:v1 before returning empty. The new proto handler went straight to live OpenSky/relay and returned null on a miss. A relay or OpenSky hiccup used to serve stale seeded data (24h TTL); under the new handler it showed an empty map.

Both keys are still written by scripts/seed-military-flights.mjs on every run — the fix just reads the stale key when the live fetch returns null, converts the seed's app-shape flights (flat lat/lon, lowercase enums, lastSeenMs) to the proto shape (nested GeoCoordinates, enum strings, lastSeenAt), and filters to the request bbox. Read via getRawJson (unprefixed) to match the seed cron's writes, which bypass the env-prefix system.

### 3. Hex-code casing mismatch broke getFlightByHex

The seed cron writes hexCode: icao24.toUpperCase() (uppercase); src/services/military-flights.ts:getFlightByHex uppercases the lookup input: f.hexCode === hexCode.toUpperCase(). The new proto handler preserved OpenSky's lowercase icao24, and mapProtoFlight is a pass-through. getFlightByHex was silently returning undefined for every call after the migration.

Fix: uppercase in the proto handler (live + stale paths), and document the invariant in a comment on MilitaryFlight.hex_code in military_flight.proto so future handlers don't re-break it.

### Verified

- typecheck + typecheck:api clean
- lint:api-contract (56 entries) / lint:boundaries clean
- tests/edge-functions.test.mjs 130 pass
- make generate zero-diff (openapi spec regenerated for the proto comment)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): restore desktop 2/hr rate cap on register-interest (#3207)

Addresses HIGH review finding #4 from @koala73.
The legacy api/register-interest.js applied a nested 2/hr per-IP cap when `source === 'desktop-settings'`, on top of the generic 5/hr endpoint budget. The sebuf migration lost this — desktop-source requests now enjoy the full 5/hr cap. Since `source` is an unsigned, client-supplied field, anyone sending `source: 'desktop-settings'` skips Turnstile AND gets 5/hr. Without the tighter cap, the Turnstile bypass is cheaper to abuse.

Added `checkScopedRateLimit` to `server/_shared/rate-limit.ts` — a reusable second-stage Upstash limiter keyed on an opaque scope string + caller identifier. Fails open on Redis errors to match the existing checkRateLimit / checkEndpointRateLimit semantics. Handlers that need per-subscope caps on top of the gateway-level endpoint budget use this helper.

In register-interest: when `isDesktopSource`, call checkScopedRateLimit with scope `/api/leads/v1/register-interest#desktop`, limit=2, window=1h, and the IP as identifier. On exceeded → throw ApiError(429).

### What this does not fix

This caps the blast radius of the Turnstile bypass but does not close it — an attacker sending `source: 'desktop-settings'` still skips Turnstile (just at 2/hr instead of 5/hr). The proper fix is a signed desktop-secret header that authenticates the bypass; filed as follow-up #3252. That requires coordinated Tauri build + Vercel env changes out of scope for #3207.

### Verified

- typecheck + typecheck:api clean
- lint:api-contract (56 entries)
- tests/edge-functions.test.mjs + contact-handler.test.mjs (147 pass)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): MEDIUM + LOW + rate-limit-policy CI check (#3207)

Closes out the remaining @koala73 review findings from #3242 that didn't already land in the HIGH-fix commits, plus the requested CI check that would have caught HIGH #1 (the dead-code policy key) at review time.
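The scoped second-stage limiter described in the desktop-cap commit above can be sketched as follows. The real checkScopedRateLimit is Upstash-backed and fails open on Redis errors; this sketch substitutes an in-memory window so the call pattern is visible, and every name beyond the scope string is an assumption.

```typescript
// Hypothetical in-memory stand-in for the Upstash-backed scoped limiter.
type ScopedResult = { allowed: boolean; retryAfterSec: number };

const windows = new Map<string, { count: number; resetAt: number }>();

function checkScopedRateLimit(scope: string, identifier: string, limit: number, windowSec: number): ScopedResult {
  const key = `${scope}:${identifier}`; // opaque scope + caller identifier
  const now = Date.now();
  const w = windows.get(key);
  if (!w || now >= w.resetAt) {
    windows.set(key, { count: 1, resetAt: now + windowSec * 1000 });
    return { allowed: true, retryAfterSec: 0 };
  }
  w.count += 1;
  if (w.count > limit) return { allowed: false, retryAfterSec: Math.ceil((w.resetAt - now) / 1000) };
  return { allowed: true, retryAfterSec: 0 };
}
```

The handler would call this only for desktop-source requests, with scope `/api/leads/v1/register-interest#desktop`, limit 2 and a one-hour window, so the nested cap stacks on top of the gateway's 5/hr endpoint budget.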
### MEDIUM #5 — Turnstile missing-secret policy default

Flip `verifyTurnstile`'s default `missingSecretPolicy` from `'allow'` to `'allow-in-development'`. Dev with no secret = pass (expected locally); prod with no secret = reject + log. submit-contact was already explicitly overriding to `'allow-in-development'`; register-interest was silently getting `'allow'`. The safe default now means a future missing-secret misconfiguration in prod gets caught instead of silently letting bots through. Removed the now-redundant override in submit-contact.

### MEDIUM #6 — Silent enum fallbacks in maritime client

`toDisruptionEvent` mapped `AIS_DISRUPTION_TYPE_UNSPECIFIED` / unknown enum values → `gap_spike` / `low` silently. Refactored to return null when either enum is unknown; the caller filters nulls out of the array. The handler doesn't produce UNSPECIFIED today, but the `gap_spike` default would have mislabeled the first new enum value the proto ever adds — dropping unknowns is safer than shipping wrong labels.

### LOW — Copy drift in register-interest email

The email template hardcoded `435+ Sources`; PR #3241 bumped marketing to `500+`. Bumped in the rewritten file to stay consistent. The `as any` on Convex mutation names carried over from legacy and is filed as follow-up #3253.

### Rate-limit-policy coverage lint

`scripts/enforce-rate-limit-policies.mjs` validates that every key in `ENDPOINT_RATE_POLICIES` resolves to a proto-generated gateway route by cross-referencing `docs/api/*.openapi.yaml`. It fails with the sanctions-entity-search incident referenced in the error message so future drift has a paper trail. Wired into package.json (`lint:rate-limit-policies`) and the pre-push hook alongside `lint:boundaries`. Smoke-tested both directions — a clean repo passes (5 policies / 175 routes), and seeded drift (the exact HIGH #1 typo) fails with the advertised remedy text.
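The drop-unknown-enums pattern from MEDIUM #6 can be sketched directly. Only `AIS_DISRUPTION_TYPE_UNSPECIFIED` and the `gap_spike` / `low` labels appear in the log; the other enum strings here are illustrative assumptions.

```typescript
// Hypothetical sketch of the fixed toDisruptionEvent: unknown enum values
// map to null and get filtered, instead of silently defaulting.
const DISRUPTION_TYPE: Record<string, string> = {
  AIS_DISRUPTION_TYPE_GAP_SPIKE: 'gap_spike', // assumed proto enum string
};
const SEVERITY: Record<string, string> = {
  AIS_DISRUPTION_SEVERITY_LOW: 'low', // assumed proto enum string
};

function toDisruptionEvent(proto: { type: string; severity: string }) {
  const type = DISRUPTION_TYPE[proto.type];
  const severity = SEVERITY[proto.severity];
  if (!type || !severity) return null; // drop unknowns rather than mislabel them
  return { type, severity };
}

// Caller filters the nulls out of the mapped array.
function toDisruptionEvents(protos: { type: string; severity: string }[]) {
  return protos.map(toDisruptionEvent).filter((e): e is { type: string; severity: string } => e !== null);
}
```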
### Verified

- `lint:rate-limit-policies` ✓
- `typecheck` + `typecheck:api` ✓
- `lint:api-contract` ✓ (56 entries)
- `lint:boundaries` ✓
- edge-functions + contact-handler tests (147 pass)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 5): decomm /api/eia/* + migrate /api/satellites → IntelligenceService (#3207)

Both targets turned out to be decomm-not-migration cases. The original plan called for two new services (economic/v1.GetEiaSeries + natural/v1.ListSatellitePositions), but research found neither was needed:

### /api/eia/[[...path]].js — pure decomm, zero consumers

The "catch-all" is a misnomer — only two paths actually worked, /api/eia/health and /api/eia/petroleum, both Redis-only readers. Zero frontend callers in src/. Zero server-side readers. Nothing consumes the `energy:eia-petroleum:v1` key that seed-eia-petroleum.mjs writes daily.

The EIA data the frontend actually uses goes through the existing typed RPCs in economic/v1: GetEnergyPrices, GetCrudeInventories, GetNatGasStorage, GetEnergyCapacity. None of those touch /api/eia/*. Building GetEiaSeries would have been dead code.

Deleted the legacy file + its test (tests/api-eia-petroleum.test.mjs — it only covered the legacy endpoint, no behavior to preserve). Empty api/eia/ dir removed.

**Note for review:** the Redis seed cron keeps running daily and nothing consumes it. If that stays unused, seed-eia-petroleum.mjs should be retired too (separate PR). Out of scope for sebuf-migration.

### /api/satellites.js — Learning #2 strikes again

IntelligenceService.ListSatellites already exists at /api/intelligence/v1/list-satellites, reads the same Redis key (intelligence:satellites:tle:v1), and supports an optional country filter the legacy endpoint didn't have. One frontend caller in src/services/satellites.ts needed to switch from `fetch(toApiUrl('/api/satellites'))` to the typed IntelligenceServiceClient.listSatellites.
The shape diff was tiny — legacy `noradId` became proto `id` (handler line 36 already picks either); everything else is identical. alt/velocity/inclination in the proto are ignored by the caller since it propagates positions client-side via satellite.js. Kept the client-side cache + failure cooldown + 20s timeout (still valid concerns at the caller level).

### Manifest + docs

- api-route-exceptions.json: 56 → 54 entries (both removed)
- docs/api-proxies.mdx: dropped the two rows from the raw-data passthroughs table

### Verified

- typecheck + typecheck:api ✓
- lint:api-contract (54 entries) / lint:boundaries / lint:rate-limit-policies ✓
- tests/edge-functions.test.mjs 127 pass (down from 130 — 3 tests were for the deleted eia endpoint)
- make generate zero-diff (no proto changes)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 6): migrate /api/supply-chain/v1/{country-products,multi-sector-cost-shock} → SupplyChainService (#3207)

Both endpoints were hand-rolled TS handlers sitting under a proto URL prefix — the exact drift the manifest guardrail flagged. Promoted both to typed RPCs:
- GetCountryProducts → /api/supply-chain/v1/get-country-products
- GetMultiSectorCostShock → /api/supply-chain/v1/get-multi-sector-cost-shock

Handlers preserve the existing semantics: PRO gate via isCallerPremium(ctx.request), iso2 / chokepointId validation, raw bilateral-hs4 Redis read (skips the env prefix to match the seeder's writes), CHOKEPOINT_STATUS_KEY for the war-risk tier, and the math from _multi-sector-shock.ts unchanged. Empty-data and non-PRO paths return the typed empty payload (no 403 — the sebuf gateway pattern is empty-payload-on-deny).

The client wrapper switches from premiumFetch to client.getCountryProducts / client.getMultiSectorCostShock. The legacy MultiSectorShock / MultiSectorShockResponse / CountryProductsResponse names remain as type aliases of the generated proto types so the CountryBriefPanel + CountryDeepDivePanel callsites compile with zero churn.
Manifest 54 → 52. Rate-limit gateway routes 175 → 177.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(gateway): add cache-tier entries for new supply-chain RPCs (#3207)

The pre-push tests/route-cache-tier.test.mjs caught the missing entries. Both are PRO-gated and request-varying — they match the existing supply-chain PRO cohort (get-country-cost-shock, get-bypass-options, etc.) at the slow-browser tier.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 7): migrate /api/scenario/v1/{run,status,templates} → ScenarioService (#3207)

Promotes the three literal-filename scenario endpoints to a typed sebuf service with three RPCs:
- POST /api/scenario/v1/run-scenario (RunScenario)
- GET /api/scenario/v1/get-scenario-status (GetScenarioStatus)
- GET /api/scenario/v1/list-scenario-templates (ListScenarioTemplates)

Preserves all security invariants from the legacy handlers:
- 405 for wrong method (sebuf service-config method gate)
- scenarioId validation against the SCENARIO_TEMPLATES registry
- iso2 regex ^[A-Z]{2}$
- JOB_ID_RE path-traversal guard on status
- Per-IP 10/min rate limit (moved to the gateway ENDPOINT_RATE_POLICIES)
- Queue-depth backpressure (>100 → 429)
- PRO gating via isCallerPremium
- AbortSignal.timeout on every Redis pipeline (runRedisPipeline helper)

Wire-level diffs vs legacy:
- Per-user RL is now enforced at the gateway (same 10/min/IP budget).
- The rate-limit response omits the Retry-After header; retryAfter is in the body per the error-mapper.ts convention.
- ListScenarioTemplates emits affectedHs2: [] when the registry entry is null (the all-sectors sentinel); a proto repeated field cannot carry null.
- RunScenario returns { jobId, status } (no statusUrl field — unused by SupplyChainPanel, dropped from the wire).
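Two of the scenario-service invariants listed above can be sketched as input guards. The iso2 regex is quoted from the commit; JOB_ID_RE's real pattern is not shown in the log, so the one below is an assumption (the point being that it admits no '/', '.' or ':' characters, hence no path traversal via the job id).

```typescript
// Hypothetical sketch of two scenario-status input guards.
const ISO2_RE = /^[A-Z]{2}$/;              // quoted from the commit message
const JOB_ID_RE = /^[A-Za-z0-9_-]{1,64}$/; // assumed shape for illustration

function validateStatusRequest(iso2: string, jobId: string): string[] {
  const violations: string[] = [];
  if (!ISO2_RE.test(iso2)) violations.push('iso2 must match ^[A-Z]{2}$');
  if (!JOB_ID_RE.test(jobId)) violations.push('jobId fails the path-traversal guard');
  return violations;
}
```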
Gateway wiring:
- server/gateway.ts RPC_CACHE_TIER: list-scenario-templates → 'daily' (matches legacy max-age=3600); get-scenario-status → 'slow-browser' (premium short-circuit target; an explicit entry is required by tests/route-cache-tier.test.mjs).
- src/shared/premium-paths.ts: swaps the old run/status paths for the new run-scenario/get-scenario-status paths.
- api/scenario/v1/{run,status,templates}.ts deleted; 3 manifest exceptions removed (52 → 49 migration-pending).

Client:
- src/services/scenario/index.ts — typed client wrapper using premiumFetch (injects the Clerk bearer / API key).
- src/components/SupplyChainPanel.ts — the polling loop swapped from premiumFetch strings to runScenario/getScenarioStatus. The hard 20s timeout on run is preserved via AbortSignal.any.

Tests:
- tests/scenario-handler.test.mjs — 18 new handler-level tests covering every security invariant + the worker envelope coercion.
- tests/edge-functions.test.mjs — scenario sections removed, replaced with a breadcrumb pointer to the new test file.

Docs: api-scenarios.mdx, scenario-engine.mdx, usage-rate-limits.mdx, usage-errors.mdx, supply-chain.mdx refreshed with the new paths.

Verified: typecheck, typecheck:api, lint:api-contract (49 entries), lint:rate-limit-policies (6/180), lint:boundaries, route-cache-tier (parity), full edge-functions (117) + scenario-handler (18).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 8): migrate /api/v2/shipping/{route-intelligence,webhooks} → ShippingV2Service (#3207)

Partner-facing endpoints promoted to a typed sebuf service. Wire shape preserved byte-for-byte (camelCase field names, ISO-8601 fetchedAt, the same subscriberId/secret formats, the same SET + SADD + EXPIRE 30-day Redis pipeline). Partner URLs /api/v2/shipping/* are unchanged.
RPCs landed:
- GET /route-intelligence → RouteIntelligence (PRO, slow-browser)
- POST /webhooks → RegisterWebhook (PRO)
- GET /webhooks → ListWebhooks (PRO, slow-browser)

The existing path-parameter URLs remain on the legacy edge-function layout because sebuf's HTTP annotations don't currently model path params (grep proto/**/*.proto for `path: "{…}"` returns zero). Those endpoints are split into two Vercel dynamic-route files under api/v2/shipping/webhooks/, behaviorally identical to the previous hybrid file but cleanly separated:
- GET /webhooks/{subscriberId} → [subscriberId].ts
- POST /webhooks/{subscriberId}/rotate-secret → [subscriberId]/[action].ts
- POST /webhooks/{subscriberId}/reactivate → [subscriberId]/[action].ts

Both get manifest entries under `migration-pending` pointing at #3207.

Other changes:
- scripts/enforce-sebuf-api-contract.mjs: extended GATEWAY_RE to accept api/v{N}/{domain}/[rpc].ts (version-first) alongside the canonical api/{domain}/v{N}/[rpc].ts; the first use of the reversed ordering is shipping/v2, because that's the partner contract.
- vite.config.ts: dev-server sebuf interceptor regex extended to match both layouts; shipping/v2 import + allRoutes entry added.
- server/gateway.ts: RPC_CACHE_TIER entries for /api/v2/shipping/route-intelligence + /webhooks (slow-browser; premium-gated endpoints short-circuit to slow-browser, but the entries are required by tests/route-cache-tier.test.mjs).
- src/shared/premium-paths.ts: route-intelligence + webhooks added.
- tests/shipping-v2-handler.test.mjs: 18 handler-level tests covering the PRO gate, iso2/cargoType/hs2 coercion, SSRF guards (http://, RFC1918, cloud metadata, IMDS), the chokepoint whitelist, the alertThreshold range, secret/subscriberId formats, pipeline shape + 30-day TTL, cross-tenant owner isolation, and `secret` omission from the list response.
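The SSRF guards those tests cover (http://, RFC1918, cloud metadata, IMDS) can be sketched as a webhook-URL check. This is a hypothetical sketch: the real handler's checks and their exact coverage may be stricter, and the function name is illustrative.

```typescript
// Hypothetical sketch of a webhook-URL SSRF guard matching the test list above.
function isSafeWebhookUrl(raw: string): boolean {
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    return false; // unparseable
  }
  if (url.protocol !== 'https:') return false; // rejects http:// (and everything else)
  const host = url.hostname;
  // Cloud metadata / IMDS endpoints.
  if (host === 'metadata.google.internal' || /^169\.254\./.test(host)) return false;
  // Loopback and RFC1918 private ranges.
  if (host === 'localhost' || /^127\./.test(host)) return false;
  if (/^10\./.test(host) || /^192\.168\./.test(host) || /^172\.(1[6-9]|2\d|3[01])\./.test(host)) return false;
  return true;
}
```

A production guard would also need to worry about DNS rebinding and hostnames that resolve to private addresses, which a purely syntactic check like this cannot catch.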
Manifest delta:
- Removed: api/v2/shipping/route-intelligence.ts, api/v2/shipping/webhooks.ts
- Added: api/v2/shipping/webhooks/[subscriberId].ts (migration-pending)
- Added: api/v2/shipping/webhooks/[subscriberId]/[action].ts (migration-pending)
- Added: api/internal/brief-why-matters.ts (internal-helper) — a regression surface from the #3248 main merge, which introduced the file without a manifest entry. Filed here to keep the lint green; not strictly in scope for commit 8, but unblocking.

Net result: 49 → 47 `migration-pending` entries (a net removal even though the webhook path-params stay pending, because two files collapsed into two dynamic routes).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 1): SupplyChainServiceClient must use premiumFetch (#3207)

Signed-in browser PRO users were silently hitting 401 on 8 supply-chain premium endpoints (country-products, multi-sector-cost-shock, country-chokepoint-index, bypass-options, country-cost-shock, sector-dependency, route-explorer-lane, route-impact). The shared client was constructed with globalThis.fetch, so no Clerk bearer or X-WorldMonitor-Key was injected. The gateway's validateApiKey runs with forceKey=true for PREMIUM_RPC_PATHS and 401s before isCallerPremium is consulted. The generated client's try/catch collapses the 401 into an empty-fallback return, leaving panels blank with no visible error.

The fix is one line at the client constructor: swap globalThis.fetch for premiumFetch. The same pattern is already in use for insider-transactions, stock-analysis, stock-backtest, scenario, and trade (premiumClient) — this was an omission on this client, not a new pattern. premiumFetch no-ops safely when no credentials are available, so the 5 non-premium methods on this client (shippingRates, chokepointStatus, chokepointHistory, criticalMinerals, shippingStress) continue to work unchanged.
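The seam the one-line fix touches can be sketched as follows. premiumFetch, globalThis.fetch, and the two header names are from the log; the helper shapes here are illustrative, not the generated client's real API.

```typescript
// Hypothetical sketch of credential injection at the client's fetch seam.
type Init = { headers?: Record<string, string> };

// Pure helper: attach whatever credentials exist; no-op when there are none.
function withPremiumHeaders(init: Init, bearer: string | null, apiKey: string | null): Init {
  const headers = { ...(init.headers ?? {}) };
  if (bearer) headers['Authorization'] = `Bearer ${bearer}`;
  if (apiKey) headers['X-WorldMonitor-Key'] = apiKey;
  return { ...init, headers };
}

// The generated client takes its transport at construction time. Handing it a
// bare fetch is exactly the bug: nothing ever attaches credentials, so
// forceKey premium routes 401 before isCallerPremium is even consulted.
function makePremiumFetch(
  baseFetch: (input: string, init?: Init) => Promise<unknown>,
  getBearer: () => string | null,
  getKey: () => string | null,
) {
  return (input: string, init: Init = {}) => baseFetch(input, withPremiumHeaders(init, getBearer(), getKey()));
}
```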
This also fixes two panels that were latently broken on main (chokepoint-index, bypass-options, etc. — predating #3207, not regressions from it). Commit 6 expanded the surface by routing two more methods through the same buggy client; this commit fixes the class.

From the koala73 review (#3242 second pass, HIGH new #1):

> Exact class PR #3233 fixed for RegionalIntelligenceBoard /
> DeductionPanel / trade / country-intel. Supply-chain was not in
> #3233's scope.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 2): restore 400 on input-shape errors for 2 supply-chain handlers (#3207)

Commit 6 collapsed all non-happy paths into an empty 200 on `get-country-products` and `get-multi-sector-cost-shock`, including caller-bug cases that legacy returned 400 for:
- get-country-products: malformed iso2 → empty 200 (was 400)
- get-multi-sector-cost-shock: malformed iso2 / missing chokepointId / unknown chokepointId → empty 200 (was 400)

The commit message for 6 called out the 403-for-non-PRO → empty-200 shift ("the sebuf gateway pattern is empty-payload-on-deny") but not the 400 shift. They're different classes:
- Empty-payload 200 for PRO deny: an intentional contract change, already documented and applied across the service. Generated clients treat "you lack PRO" as "no data" — fine.
- Empty-payload 200 for malformed input: a caller bug silently masked. External API consumers can't distinguish "bad wiring" from "genuinely no data", test harnesses lose the signal, and bad calling code doesn't surface in Sentry.

Fix: `throw new ValidationError(violations)` on the 3 input-shape branches. The generated sebuf server maps ValidationError → HTTP 400 (see src/generated/server/.../service_server.ts and leads/v1, which already uses this pattern). PRO-gate deny stays empty-200 — that contract shift was intentional and is preserved.
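The three-way contract this fix restores can be sketched in one handler. ValidationError and the empty-payload-on-deny pattern are from the log; the handler signature, the iso2 regex reuse, and the data parameter are illustrative assumptions.

```typescript
// Hypothetical sketch of the three-way contract: bad input throws (the
// generated server maps ValidationError → 400); PRO deny and no-data both
// return the typed empty payload with a 200.
class ValidationError extends Error {
  constructor(public violations: string[]) {
    super(violations.join('; '));
  }
}

const ISO2_RE = /^[A-Z]{2}$/;

function getCountryProducts(iso2: string, isPro: boolean, data: string[] | null): { products: string[] } {
  if (!ISO2_RE.test(iso2)) throw new ValidationError(['iso2 must match ^[A-Z]{2}$']); // caller bug → 400
  if (!isPro) return { products: [] }; // empty-payload-on-deny: intentional, preserved
  return { products: data ?? [] };     // genuinely no data → empty 200, unchanged
}
```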
Regression tests added at tests/supply-chain-validation.test.mjs (8 cases) pinning the three-way contract:

- bad input → 400 (ValidationError)
- PRO-gate deny on valid input → 200 empty
- valid PRO input, no data in Redis → 200 empty (unchanged)

From koala73 review (#3242 second-pass, HIGH new #2).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 3): restore statusUrl on RunScenarioResponse + document 202→200 wire break (#3207)

Commit 7 silently shifted /api/scenario/v1/run-scenario's response contract in two ways that the commit message covered only partially:

1. HTTP 202 Accepted → HTTP 200 OK
2. Dropped `statusUrl` string from the response body

The `statusUrl` drop was mentioned as "unused by SupplyChainPanel" but not framed as a contract change. The 202 → 200 shift was not mentioned at all. This is a same-version (v1 → v1) migration, so external callers that key off either signal — `response.status === 202` or `response.body.statusUrl` — silently branch incorrectly.

Evaluated options:

(a) sebuf per-RPC status-code config — not available. sebuf's HttpConfig only models `path` and `method`; no status annotation.
(b) Bump to scenario/v2 — judged heavier than the break itself for a single status-code shift. No in-repo caller uses 202 or statusUrl; the docs-level impact is containable.
(c) Accept the break, document explicitly, partially restore.

Took option (c):

- Restored `statusUrl` in the proto (new field `string status_url = 3` on RunScenarioResponse). The server computes `/api/scenario/v1/get-scenario-status?jobId=<encoded job_id>` and populates it on every successful enqueue. External callers that followed this URL keep working unchanged.
- 202 → 200 is not recoverable inside the sebuf generator, so it is called out explicitly in two places:
  - docs/api-scenarios.mdx now includes a prominent `<Warning>` block documenting the v1→v1 contract shift + the suggested migration (branch on response body shape, not HTTP status).
  - The RunScenarioResponse proto comment explains why 200 is the new success status on enqueue. The OpenAPI bundle was regenerated to reflect the restored statusUrl field.
- Regression test added in tests/scenario-handler.test.mjs pinning `statusUrl` to the exact URL-encoded shape — locks the invariant so a future proto rename or handler refactor can't silently drop it again.

From koala73 review (#3242 second-pass, HIGH new #3).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 1/2): close webhook tenant-isolation gap on shipping/v2 (#3207)

Koala flagged this as a merge blocker in PR #3242 review. server/worldmonitor/shipping/v2/{register-webhook,list-webhooks}.ts migrated without reinstating validateApiKey(req, { forceKey: true }), diverging from both the sibling api/v2/shipping/webhooks/[subscriberId] routes and the documented "X-WorldMonitor-Key required" contract in docs/api-shipping-v2.mdx.

Attack surface: the gateway accepts Clerk bearer auth as a pro signal. A Clerk-authenticated pro user with no X-WorldMonitor-Key reaches the handler, callerFingerprint() falls back to 'anon', and every such caller collapses into a shared webhook:owner:anon:v1 bucket. The defense-in-depth ownerTag !== ownerHash check in list-webhooks.ts doesn't catch it because both sides equal 'anon' — every Clerk-session holder could enumerate / overwrite every other Clerk-session pro tenant's registered webhook URLs.

Fix: reinstate validateApiKey(ctx.request, { forceKey: true }) at the top of each handler, throwing ApiError(401) when absent. Matches the sibling routes exactly and the published partner contract.

Tests:
- tests/shipping-v2-handler.test.mjs: two existing "non-PRO → 403" tests for register/list were using makeCtx() with no key, which now fails at the 401 layer first. Renamed to "no API key → 401 (tenant-isolation gate)" with a comment explaining the failure mode being tested. 18/18 pass.
Verified: typecheck:api, lint:api-contract (no change), lint:boundaries, lint:rate-limit-policies, test:data (6005/6005).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 2/2): restore v1 path aliases on scenario + supply-chain (#3207)

Koala flagged this as a merge blocker in PR #3242 review. Commits 6 + 7 of #3207 renamed five documented v1 URLs to the sebuf method-derived paths and deleted the legacy edge-function files:

POST /api/scenario/v1/run → run-scenario
GET /api/scenario/v1/status → get-scenario-status
GET /api/scenario/v1/templates → list-scenario-templates
GET /api/supply-chain/v1/country-products → get-country-products
GET /api/supply-chain/v1/multi-sector-cost-shock → get-multi-sector-cost-shock

server/router.ts is an exact static-match table (a Map keyed on `METHOD PATH`), so any external caller — docs, partner scripts, grep-the-internet — hitting the old documented URL would 404 on first request after merge. Commit 8 (shipping/v2) preserved partner URLs byte-for-byte; the scenario + supply-chain renames missed that discipline.

Fix: add five thin alias edge functions that rewrite the pathname to the canonical sebuf path and delegate to the domain [rpc].ts gateway via a new server/alias-rewrite.ts helper. Premium gating, rate limits, entitlement checks, and cache-tier lookups all fire on the canonical path — aliases are pure URL rewrites, not a duplicate handler pipeline.

api/scenario/v1/{run,status,templates}.ts
api/supply-chain/v1/{country-products,multi-sector-cost-shock}.ts

Vite dev parity: file-based routing at api/ is a Vercel concern, so the dev middleware (vite.config.ts) gets a matching V1_ALIASES rewrite map before the router dispatch.

Manifest: 5 new entries under `deferred` with removal_issue=#3282 (tracking their retirement at the next v1→v2 break). lint:api-contract stays green (89 files checked, 55 manifest entries validated).
Docs:
- docs/api-scenarios.mdx: migration callout at the top with the full old→new URL table and a link to the retirement issue.
- CHANGELOG.md + docs/changelog.mdx: Changed entry documenting the rename + alias compat + the 202→200 shift (from commit 23c821a1).

Verified: typecheck:api, lint:api-contract, lint:rate-limit-policies, lint:boundaries, test:data (6005/6005).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
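The one-line constructor swap from HIGH 1 can be sketched as follows. Everything below is a hypothetical stand-in — `makePremiumFetch` and this client shape are illustrative, not the repo's generated client or its real premium helpers — but it shows the bug class: a client built on `globalThis.fetch` hits premium endpoints credential-less, while the wrapper injects the bearer and no-ops when no credentials exist.

```typescript
type FetchLike = (input: string, init?: RequestInit) => Promise<Response>;

// Hypothetical stand-in for premiumFetch: inject the Clerk bearer when a
// token is available; fall through unchanged when it is not — which is why
// the non-premium methods keep working with or without credentials.
function makePremiumFetch(getToken: () => string | null, base: FetchLike): FetchLike {
  return async (input, init = {}) => {
    const headers = new Headers(init.headers);
    const token = getToken();
    if (token) headers.set('Authorization', `Bearer ${token}`);
    return base(input, { ...init, headers });
  };
}

// Illustrative client shape. The fix is entirely in what gets passed here:
// premiumFetch instead of globalThis.fetch.
class SupplyChainServiceClient {
  constructor(private fetchImpl: FetchLike) {}
  getCountryProducts(iso2: string): Promise<Response> {
    return this.fetchImpl(`/api/supply-chain/v1/get-country-products?iso2=${iso2}`);
  }
}
```

With `globalThis.fetch` in the constructor slot, the gateway's forceKey check 401s before any premium logic runs; with the wrapper, signed-in pro users carry their bearer on every RPC.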
This commit is contained in:
75
server/_shared/email-validation.ts
Normal file
@@ -0,0 +1,75 @@
const DISPOSABLE_DOMAINS = new Set<string>([
  'guerrillamail.com', 'guerrillamail.de', 'guerrillamail.net', 'guerrillamail.org',
  'guerrillamailblock.com', 'grr.la', 'sharklasers.com', 'spam4.me',
  'tempmail.com', 'temp-mail.org', 'temp-mail.io',
  'throwaway.email', 'throwaway.com',
  'mailinator.com', 'mailnesia.com', 'maildrop.cc',
  'yopmail.com', 'yopmail.fr', 'yopmail.net',
  'trashmail.com', 'trashmail.me', 'trashmail.net',
  'dispostable.com', 'discard.email',
  'fakeinbox.com', 'fakemail.net',
  'getnada.com', 'nada.email',
  'tempinbox.com', 'tempr.email', 'tempmailaddress.com',
  'emailondeck.com', '33mail.com',
  'mohmal.com', 'mohmal.im', 'mohmal.in',
  'harakirimail.com', 'crazymailing.com',
  'inboxbear.com', 'mailcatch.com',
  'mintemail.com', 'mt2015.com',
  'spamgourmet.com', 'spamgourmet.net',
  'mailexpire.com', 'mailforspam.com',
  'safetymail.info', 'trashymail.com',
  'mytemp.email', 'tempail.com',
  'burnermail.io',
  'passinbox.com', 'passmail.net', 'passmail.com',
  'silomails.com', 'slmail.me',
  'spam.me', 'spambox.us',
]);

const OFFENSIVE_RE = /(nigger|faggot|fuckfaggot)/i;

const TYPO_TLDS = new Set<string>(['con', 'coma', 'comhade', 'gmai', 'gmial']);

export type EmailValidationResult = { valid: true } | { valid: false; reason: string };

async function hasMxRecords(domain: string): Promise<boolean> {
  try {
    const res = await fetch(
      `https://cloudflare-dns.com/dns-query?name=${encodeURIComponent(domain)}&type=MX`,
      { headers: { Accept: 'application/dns-json' }, signal: AbortSignal.timeout(3000) },
    );
    if (!res.ok) return true;
    const data = (await res.json()) as { Answer?: unknown[] };
    return Array.isArray(data.Answer) && data.Answer.length > 0;
  } catch {
    return true;
  }
}

export async function validateEmail(email: string): Promise<EmailValidationResult> {
  const normalized = email.trim().toLowerCase();
  const atIdx = normalized.indexOf('@');
  if (atIdx < 1) return { valid: false, reason: 'Invalid email format' };

  const domain = normalized.slice(atIdx + 1);
  const localPart = normalized.slice(0, atIdx);

  if (OFFENSIVE_RE.test(localPart) || OFFENSIVE_RE.test(domain)) {
    return { valid: false, reason: 'Email address not accepted' };
  }

  if (DISPOSABLE_DOMAINS.has(domain)) {
    return { valid: false, reason: 'Disposable email addresses are not allowed. Please use a permanent email.' };
  }

  const tld = domain.split('.').pop();
  if (tld && TYPO_TLDS.has(tld)) {
    return { valid: false, reason: 'This email domain looks like a typo. Please check the ending.' };
  }

  const mx = await hasMxRecords(domain);
  if (!mx) {
    return { valid: false, reason: 'This email domain does not accept mail. Please check for typos.' };
  }

  return { valid: true };
}
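validateEmail runs its checks cheapest-first: shape, then the offensive/disposable blocklists, then typo TLDs, and only then the network MX probe (which fails open on DoH errors). A minimal offline sketch of that ordering — names and blocklist contents here are illustrative, not the module's API — with the DNS-over-HTTPS probe replaced by an injected checker so it runs without network access:

```typescript
type Result = { valid: true } | { valid: false; reason: string };

// Tiny illustrative blocklists; the real module's sets are much larger.
const DISPOSABLE = new Set(['mailinator.com']);
const TYPOS = new Set(['con', 'gmai']);

// Same check ordering as validateEmail above, with the MX lookup injected
// so the expensive network step is both last and swappable in tests.
async function checkEmail(
  email: string,
  hasMx: (domain: string) => Promise<boolean>,
): Promise<Result> {
  const normalized = email.trim().toLowerCase();
  const at = normalized.indexOf('@');
  if (at < 1) return { valid: false, reason: 'Invalid email format' };
  const domain = normalized.slice(at + 1);
  if (DISPOSABLE.has(domain)) return { valid: false, reason: 'Disposable domain' };
  const tld = domain.split('.').pop();
  if (tld && TYPOS.has(tld)) return { valid: false, reason: 'Typo TLD' };
  if (!(await hasMx(domain))) return { valid: false, reason: 'Domain does not accept mail' };
  return { valid: true };
}
```

Injecting the MX checker is also what makes the fail-open behavior easy to pin in tests without stubbing global fetch.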
@@ -80,6 +80,18 @@ interface EndpointRatePolicy {
const ENDPOINT_RATE_POLICIES: Record<string, EndpointRatePolicy> = {
  '/api/news/v1/summarize-article-cache': { limit: 3000, window: '60 s' },
  '/api/intelligence/v1/classify-event': { limit: 600, window: '60 s' },
  // Legacy /api/sanctions-entity-search rate limit was 30/min per IP. Preserve
  // that budget now that LookupSanctionEntity proxies OpenSanctions live.
  '/api/sanctions/v1/lookup-sanction-entity': { limit: 30, window: '60 s' },
  // Lead capture: preserve the 3/hr and 5/hr budgets from legacy api/contact.js
  // and api/register-interest.js. Lower limits than the normal IP rate limit since
  // these hit Convex + Resend per request.
  '/api/leads/v1/submit-contact': { limit: 3, window: '1 h' },
  '/api/leads/v1/register-interest': { limit: 5, window: '1 h' },
  // Scenario engine: legacy /api/scenario/v1/run capped at 10 jobs/min/IP via
  // inline Upstash INCR. The gateway now enforces the same budget with per-IP
  // keying in checkEndpointRateLimit.
  '/api/scenario/v1/run-scenario': { limit: 10, window: '60 s' },
};

const endpointLimiters = new Map<string, Ratelimit>();

@@ -131,3 +143,59 @@ export async function checkEndpointRateLimit(
    return null;
  }
}

// --- In-handler scoped rate limits ---
//
// Handlers that need a per-subscope cap *in addition to* the gateway-level
// endpoint policy (e.g. a tighter budget for one request variant) use this
// helper. The gateway's checkEndpointRateLimit still runs first — this is a
// second stage.

const scopedLimiters = new Map<string, Ratelimit>();

function getScopedRatelimit(scope: string, limit: number, window: Duration): Ratelimit | null {
  const cacheKey = `${scope}|${limit}|${window}`;
  const cached = scopedLimiters.get(cacheKey);
  if (cached) return cached;

  const url = process.env.UPSTASH_REDIS_REST_URL;
  const token = process.env.UPSTASH_REDIS_REST_TOKEN;
  if (!url || !token) return null;

  const rl = new Ratelimit({
    redis: new Redis({ url, token }),
    limiter: Ratelimit.slidingWindow(limit, window),
    prefix: 'rl:scope',
    analytics: false,
  });
  scopedLimiters.set(cacheKey, rl);
  return rl;
}

export interface ScopedRateLimitResult {
  allowed: boolean;
  limit: number;
  reset: number;
}

/**
 * Returns whether the request is under the scoped budget. `scope` is an
 * opaque namespace (e.g. `${pathname}#desktop`); `identifier` is usually the
 * client IP but can be any stable caller identifier. Fail-open on Redis errors
 * to stay consistent with checkRateLimit / checkEndpointRateLimit semantics.
 */
export async function checkScopedRateLimit(
  scope: string,
  limit: number,
  window: Duration,
  identifier: string,
): Promise<ScopedRateLimitResult> {
  const rl = getScopedRatelimit(scope, limit, window);
  if (!rl) return { allowed: true, limit, reset: 0 };
  try {
    const result = await rl.limit(`${scope}:${identifier}`);
    return { allowed: result.success, limit: result.limit, reset: result.reset };
  } catch {
    return { allowed: true, limit, reset: 0 };
  }
}
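The scoped limiter composes with the gateway budget as a second stage, keyed on `${scope}:${identifier}`. A purely illustrative in-memory stand-in (fixed-window here, where the Upstash-backed code above uses a sliding window and Redis) shows the keying and windowing idea:

```typescript
// In-memory fixed-window counter, one entry per `${scope}:${identifier}`.
// Illustrative only: the real helper delegates to Upstash Ratelimit.
const windows = new Map<string, { count: number; resetAt: number }>();

function scopedAllow(
  scope: string,
  identifier: string,
  limit: number,
  windowMs: number,
  now: number = Date.now(), // injectable clock so the behavior is testable
): boolean {
  const key = `${scope}:${identifier}`;
  const entry = windows.get(key);
  if (!entry || now >= entry.resetAt) {
    // First hit in this window (or the previous window expired): reset.
    windows.set(key, { count: 1, resetAt: now + windowMs });
    return true;
  }
  entry.count += 1;
  return entry.count <= limit;
}
```

With the desktop cap described later in this PR (2/hr per IP on the `#desktop` subscope), a third signup from the same IP inside the hour is denied while other IPs and other scopes are unaffected.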
52
server/_shared/turnstile.ts
Normal file
@@ -0,0 +1,52 @@
const TURNSTILE_VERIFY_URL = 'https://challenges.cloudflare.com/turnstile/v0/siteverify';

export function getClientIp(request: Request): string {
  return (
    request.headers.get('x-real-ip') ||
    request.headers.get('cf-connecting-ip') ||
    request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ||
    'unknown'
  );
}

export type TurnstileMissingSecretPolicy = 'allow' | 'allow-in-development' | 'deny';

export interface VerifyTurnstileArgs {
  token: string;
  ip: string;
  logPrefix?: string;
  missingSecretPolicy?: TurnstileMissingSecretPolicy;
}

export async function verifyTurnstile({
  token,
  ip,
  logPrefix = '[turnstile]',
  // Default: dev = allow (missing secret is expected locally), prod = deny.
  // Callers that need the opposite (deliberately allow missing-secret in prod)
  // can still pass 'allow' explicitly.
  missingSecretPolicy = 'allow-in-development',
}: VerifyTurnstileArgs): Promise<boolean> {
  const secret = process.env.TURNSTILE_SECRET_KEY;
  if (!secret) {
    if (missingSecretPolicy === 'allow') return true;
    // 'deny' rejects unconditionally — without this branch it would fall
    // through to the development check and allow missing-secret locally.
    if (missingSecretPolicy === 'deny') {
      console.error(`${logPrefix} TURNSTILE_SECRET_KEY not set, rejecting ('deny' policy)`);
      return false;
    }

    const isDevelopment = (process.env.VERCEL_ENV ?? 'development') === 'development';
    if (isDevelopment) return true;

    console.error(`${logPrefix} TURNSTILE_SECRET_KEY not set in production, rejecting`);
    return false;
  }

  try {
    const res = await fetch(TURNSTILE_VERIFY_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({ secret, response: token, remoteip: ip }),
    });
    const data = (await res.json()) as { success?: boolean };
    return data.success === true;
  } catch {
    return false;
  }
}
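The missing-secret branch amounts to a small decision table. Restated as a pure function (illustrative only — treating 'deny' as an unconditional reject, which is what the policy name implies):

```typescript
type MissingSecretPolicy = 'allow' | 'allow-in-development' | 'deny';

// What should verification return when TURNSTILE_SECRET_KEY is unset,
// keyed on policy and deploy environment? An unset VERCEL_ENV is treated
// as local development, matching the `?? 'development'` fallback above.
function missingSecretVerdict(
  policy: MissingSecretPolicy,
  vercelEnv: string | undefined,
): boolean {
  if (policy === 'allow') return true;
  if (policy === 'deny') return false;
  return (vercelEnv ?? 'development') === 'development';
}
```

Pulling the decision out as a pure function makes it trivial to table-test without touching process.env.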
31
server/alias-rewrite.ts
Normal file
@@ -0,0 +1,31 @@
/**
 * URL-rewrite alias helper for legacy v1 paths that were renamed during the
 * sebuf migration (#3207). The sebuf generator produces RPC URLs derived from
 * method names (e.g. `run-scenario`), which diverge from the documented v1
 * URLs (`run`). These aliases keep the old documented URLs working
 * byte-for-byte — external callers, docs, and partner scripts don't break.
 *
 * Each alias edge function rewrites the request pathname to the new sebuf
 * path and hands off to the domain gateway. The gateway applies auth, rate
 * limiting, and entitlement checks against the *new* path, so premium
 * gating / cache tiers / entitlement maps stay keyed on a single canonical
 * URL.
 *
 * Trivially deleted when v1 retires — just `rm` the alias files.
 */
export async function rewriteToSebuf(
  req: Request,
  newPath: string,
  gateway: (req: Request) => Promise<Response>,
): Promise<Response> {
  const url = new URL(req.url);
  url.pathname = newPath;
  const body =
    req.method === 'GET' || req.method === 'HEAD' ? undefined : await req.arrayBuffer();
  const rewritten = new Request(url.toString(), {
    method: req.method,
    headers: req.headers,
    body,
  });
  return gateway(rewritten);
}
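Usage shape for an alias edge function: the demo below re-declares the helper locally so it runs standalone (it mirrors rewriteToSebuf above; the stub gateway is hypothetical and simply echoes the path and query it received), showing that the pathname is rewritten while the query string survives intact.

```typescript
// Local mirror of the helper above so this demo is self-contained.
async function rewrite(
  req: Request,
  newPath: string,
  gateway: (r: Request) => Promise<Response>,
): Promise<Response> {
  const url = new URL(req.url);
  url.pathname = newPath; // only the path changes; search params are kept
  const body =
    req.method === 'GET' || req.method === 'HEAD' ? undefined : await req.arrayBuffer();
  return gateway(new Request(url.toString(), { method: req.method, headers: req.headers, body }));
}

// Stub gateway: echoes path + query so we can observe what it was handed.
const echoGateway = async (r: Request): Promise<Response> => {
  const u = new URL(r.url);
  return new Response(u.pathname + u.search);
};
```

An alias file such as api/scenario/v1/run.ts then reduces to a one-liner delegating to `rewriteToSebuf(req, '/api/scenario/v1/run-scenario', gateway)`.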
@@ -218,9 +218,17 @@ const RPC_CACHE_TIER: Record<string, CacheTier> = {
  '/api/supply-chain/v1/get-country-chokepoint-index': 'slow-browser',
  '/api/supply-chain/v1/get-bypass-options': 'slow-browser',
  '/api/supply-chain/v1/get-country-cost-shock': 'slow-browser',
  '/api/supply-chain/v1/get-country-products': 'slow-browser',
  '/api/supply-chain/v1/get-multi-sector-cost-shock': 'slow-browser',
  '/api/supply-chain/v1/get-sector-dependency': 'slow-browser',
  '/api/supply-chain/v1/get-route-explorer-lane': 'slow-browser',
  '/api/supply-chain/v1/get-route-impact': 'slow-browser',
  // Scenario engine: list-scenario-templates is a compile-time constant catalog;
  // daily tier gives browser max-age=3600 matching the legacy /api/scenario/v1/templates
  // endpoint header. get-scenario-status is premium-gated — gateway short-circuits
  // to 'slow-browser' but the entry is still required by tests/route-cache-tier.test.mjs.
  '/api/scenario/v1/list-scenario-templates': 'daily',
  '/api/scenario/v1/get-scenario-status': 'slow-browser',
  '/api/health/v1/list-disease-outbreaks': 'slow',
  '/api/health/v1/list-air-quality-alerts': 'fast',
  '/api/intelligence/v1/get-social-velocity': 'fast',
@@ -241,6 +249,13 @@ const RPC_CACHE_TIER: Record<string, CacheTier> = {
  '/api/intelligence/v1/get-regional-brief': 'slow',
  '/api/resilience/v1/get-resilience-score': 'slow',
  '/api/resilience/v1/get-resilience-ranking': 'slow',

  // Partner-facing shipping/v2. route-intelligence is premium-gated; gateway
  // short-circuits to slow-browser. Entry required by tests/route-cache-tier.test.mjs.
  '/api/v2/shipping/route-intelligence': 'slow-browser',
  // GET /webhooks lists caller's webhooks — premium-gated; short-circuited to
  // slow-browser. Entry required by tests/route-cache-tier.test.mjs.
  '/api/v2/shipping/webhooks': 'slow-browser',
};

import { PREMIUM_RPC_PATHS } from '../src/shared/premium-paths';
9
server/worldmonitor/leads/v1/handler.ts
Normal file
@@ -0,0 +1,9 @@
import type { LeadsServiceHandler } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';

import { registerInterest } from './register-interest';
import { submitContact } from './submit-contact';

export const leadsHandler: LeadsServiceHandler = {
  submitContact,
  registerInterest,
};
272
server/worldmonitor/leads/v1/register-interest.ts
Normal file
@@ -0,0 +1,272 @@
|
||||
/**
|
||||
* RPC: registerInterest -- Adds an email to the Pro waitlist and emails a confirmation.
|
||||
* Port from api/register-interest.js
|
||||
* Sources: Convex registerInterest:register mutation + Resend confirmation email
|
||||
*/
|
||||
|
||||
import { ConvexHttpClient } from 'convex/browser';
|
||||
import type {
|
||||
ServerContext,
|
||||
RegisterInterestRequest,
|
||||
RegisterInterestResponse,
|
||||
} from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
|
||||
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
|
||||
import { getClientIp, verifyTurnstile } from '../../../_shared/turnstile';
|
||||
import { validateEmail } from '../../../_shared/email-validation';
|
||||
import { checkScopedRateLimit } from '../../../_shared/rate-limit';
|
||||
|
||||
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
|
||||
const MAX_EMAIL_LENGTH = 320;
|
||||
const MAX_META_LENGTH = 100;
|
||||
|
||||
const DESKTOP_SOURCES = new Set<string>(['desktop-settings']);
|
||||
|
||||
// Legacy api/register-interest.js capped desktop-source signups at 2/hr per IP
|
||||
// on top of the generic 5/hr endpoint budget. Since `source` is an unsigned
|
||||
// client-supplied field, this cap is the backstop — the signed-header fix that
|
||||
// actually authenticates the desktop bypass is tracked as a follow-up.
|
||||
const DESKTOP_RATE_SCOPE = '/api/leads/v1/register-interest#desktop';
|
||||
const DESKTOP_RATE_LIMIT = 2;
|
||||
const DESKTOP_RATE_WINDOW = '1 h' as const;
|
||||
|
||||
interface ConvexRegisterResult {
|
||||
status: 'registered' | 'already_registered';
|
||||
referralCode: string;
|
||||
referralCount: number;
|
||||
position?: number;
|
||||
emailSuppressed?: boolean;
|
||||
}
|
||||
|
||||
async function sendConfirmationEmail(email: string, referralCode: string): Promise<void> {
|
||||
const referralLink = `https://worldmonitor.app/pro?ref=${referralCode}`;
|
||||
const shareText = encodeURIComponent("I just joined the World Monitor Pro waitlist \u2014 real-time global intelligence powered by AI. Join me:");
|
||||
const shareUrl = encodeURIComponent(referralLink);
|
||||
const twitterShare = `https://x.com/intent/tweet?text=${shareText}&url=${shareUrl}`;
|
||||
const linkedinShare = `https://www.linkedin.com/sharing/share-offsite/?url=${shareUrl}`;
|
||||
const whatsappShare = `https://wa.me/?text=${shareText}%20${shareUrl}`;
|
||||
const telegramShare = `https://t.me/share/url?url=${shareUrl}&text=${encodeURIComponent('Join the World Monitor Pro waitlist:')}`;
|
||||
|
||||
const resendKey = process.env.RESEND_API_KEY;
|
||||
if (!resendKey) {
|
||||
console.warn('[register-interest] RESEND_API_KEY not set — skipping email');
|
||||
return;
|
||||
}
|
||||
try {
|
||||
const resendRes = await fetch('https://api.resend.com/emails', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': `Bearer ${resendKey}`,
|
||||
},
|
||||
body: JSON.stringify({
|
||||
from: 'World Monitor <noreply@worldmonitor.app>',
|
||||
to: [email],
|
||||
subject: "You\u2019re on the World Monitor Pro waitlist",
|
||||
html: `
|
||||
<div style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; max-width: 600px; margin: 0 auto; background: #0a0a0a; color: #e0e0e0;">
|
||||
<div style="background: #4ade80; height: 4px;"></div>
|
||||
<div style="padding: 40px 32px 0;">
|
||||
<table cellpadding="0" cellspacing="0" border="0" style="margin: 0 auto 32px;">
|
||||
<tr>
|
||||
<td style="width: 40px; height: 40px; vertical-align: middle;">
|
||||
<img src="https://www.worldmonitor.app/favico/android-chrome-192x192.png" width="40" height="40" alt="WorldMonitor" style="border-radius: 50%; display: block;" />
|
||||
</td>
|
||||
<td style="padding-left: 12px;">
|
||||
<div style="font-size: 16px; font-weight: 800; color: #fff; letter-spacing: -0.5px;">WORLD MONITOR</div>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; border-left: 3px solid #4ade80; padding: 20px 24px; margin-bottom: 28px;">
|
||||
<p style="font-size: 18px; font-weight: 600; color: #fff; margin: 0 0 8px;">You\u2019re on the Pro waitlist.</p>
|
||||
<p style="font-size: 14px; color: #999; margin: 0; line-height: 1.5;">We\u2019ll notify you the moment Pro launches. Here\u2019s what you\u2019ll get:</p>
|
||||
</div>
|
||||
<table cellpadding="0" cellspacing="0" border="0" width="100%" style="margin-bottom: 28px;">
|
||||
<tr>
|
||||
<td style="width: 50%; padding: 12px; vertical-align: top;">
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; padding: 16px; height: 100%;">
|
||||
<div style="font-size: 20px; margin-bottom: 8px;">⚡</div>
|
||||
<div style="font-size: 13px; font-weight: 700; color: #fff; margin-bottom: 4px;">Near-Real-Time</div>
|
||||
<div style="font-size: 12px; color: #888; line-height: 1.4;">Data refresh under 60 seconds via priority pipeline</div>
|
||||
</div>
|
||||
</td>
|
||||
<td style="width: 50%; padding: 12px; vertical-align: top;">
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; padding: 16px; height: 100%;">
|
||||
<div style="font-size: 20px; margin-bottom: 8px;">🧠</div>
|
||||
<div style="font-size: 13px; font-weight: 700; color: #fff; margin-bottom: 4px;">AI Analyst</div>
|
||||
<div style="font-size: 12px; color: #888; line-height: 1.4;">Morning briefs, flash alerts, pattern detection</div>
|
||||
</div>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td style="width: 50%; padding: 12px; vertical-align: top;">
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; padding: 16px; height: 100%;">
|
||||
<div style="font-size: 20px; margin-bottom: 8px;">📨</div>
|
||||
<div style="font-size: 13px; font-weight: 700; color: #fff; margin-bottom: 4px;">Delivered to You</div>
|
||||
<div style="font-size: 12px; color: #888; line-height: 1.4;">Slack, Telegram, WhatsApp, Email, Discord</div>
|
||||
</div>
|
||||
</td>
|
||||
<td style="width: 50%; padding: 12px; vertical-align: top;">
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; padding: 16px; height: 100%;">
|
||||
<div style="font-size: 20px; margin-bottom: 8px;">🔑</div>
|
||||
<div style="font-size: 13px; font-weight: 700; color: #fff; margin-bottom: 4px;">30+ Services, 1 Key</div>
|
||||
<div style="font-size: 12px; color: #888; line-height: 1.4;">ACLED, NASA FIRMS, OpenSky, Finnhub, and more</div>
|
||||
</div>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
<table cellpadding="0" cellspacing="0" border="0" width="100%" style="margin-bottom: 28px; background: #111; border: 1px solid #1a1a1a;">
|
||||
<tr>
|
||||
<td style="text-align: center; padding: 16px 8px; width: 33%;">
|
||||
<div style="font-size: 22px; font-weight: 800; color: #4ade80;">2M+</div>
|
||||
<div style="font-size: 10px; color: #666; text-transform: uppercase; letter-spacing: 1px;">Users</div>
|
||||
</td>
|
||||
<td style="text-align: center; padding: 16px 8px; width: 33%; border-left: 1px solid #1a1a1a; border-right: 1px solid #1a1a1a;">
|
||||
<div style="font-size: 22px; font-weight: 800; color: #4ade80;">500+</div>
|
||||
<div style="font-size: 10px; color: #666; text-transform: uppercase; letter-spacing: 1px;">Sources</div>
|
||||
</td>
|
||||
<td style="text-align: center; padding: 16px 8px; width: 33%;">
|
||||
<div style="font-size: 22px; font-weight: 800; color: #4ade80;">190+</div>
|
||||
<div style="font-size: 10px; color: #666; text-transform: uppercase; letter-spacing: 1px;">Countries</div>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
<div style="text-align: center; margin-bottom: 24px;">
|
||||
<div style="display: inline-block; background: #111; border: 1px solid #4ade80; padding: 12px 28px;">
|
||||
<div style="font-size: 18px; font-weight: 800; color: #fff;">You're in!</div>
|
||||
<div style="font-size: 11px; color: #4ade80; text-transform: uppercase; letter-spacing: 2px; margin-top: 4px;">Waitlist confirmed</div>
|
||||
</div>
|
||||
</div>
|
||||
<div style="background: #111; border: 1px solid #1a1a1a; border-left: 3px solid #4ade80; padding: 20px 24px; margin-bottom: 24px;">
|
||||
<p style="font-size: 16px; font-weight: 700; color: #fff; margin: 0 0 8px;">Move up the line \u2014 invite friends</p>
|
||||
<p style="font-size: 13px; color: #888; margin: 0 0 16px; line-height: 1.5;">Each friend who joins through your link bumps you closer to the front. Top referrers get early access.</p>
|
||||
<div style="background: #0a0a0a; border: 1px solid #222; padding: 12px 16px; margin-bottom: 16px; word-break: break-all;">
|
||||
<a href="${referralLink}" style="color: #4ade80; text-decoration: none; font-size: 13px; font-family: monospace;">${referralLink}</a>
|
||||
</div>
|
||||
<table cellpadding="0" cellspacing="0" border="0" width="100%">
|
||||
<tr>
|
||||
<td style="width: 25%; text-align: center; padding: 4px;">
|
||||
<a href="${twitterShare}" style="display: inline-block; background: #1a1a1a; border: 1px solid #222; color: #ccc; text-decoration: none; padding: 8px 0; width: 100%; font-size: 11px; font-weight: 600;">X</a>
|
||||
</td>
|
||||
<td style="width: 25%; text-align: center; padding: 4px;">
|
||||
<a href="${linkedinShare}" style="display: inline-block; background: #1a1a1a; border: 1px solid #222; color: #ccc; text-decoration: none; padding: 8px 0; width: 100%; font-size: 11px; font-weight: 600;">LinkedIn</a>
|
||||
</td>
|
||||
<td style="width: 25%; text-align: center; padding: 4px;">
|
||||
<a href="${whatsappShare}" style="display: inline-block; background: #1a1a1a; border: 1px solid #222; color: #ccc; text-decoration: none; padding: 8px 0; width: 100%; font-size: 11px; font-weight: 600;">WhatsApp</a>
|
||||
</td>
|
||||
<td style="width: 25%; text-align: center; padding: 4px;">
|
||||
<a href="${telegramShare}" style="display: inline-block; background: #1a1a1a; border: 1px solid #222; color: #ccc; text-decoration: none; padding: 8px 0; width: 100%; font-size: 11px; font-weight: 600;">Telegram</a>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
<div style="text-align: center; margin-bottom: 36px;">
|
||||
<a href="https://worldmonitor.app" style="display: inline-block; background: #4ade80; color: #0a0a0a; padding: 14px 36px; text-decoration: none; font-weight: 800; font-size: 13px; text-transform: uppercase; letter-spacing: 1.5px; border-radius: 2px;">Explore the Free Dashboard</a>
|
||||
<p style="font-size: 12px; color: #555; margin-top: 12px;">The free dashboard stays free forever. Pro adds intelligence on top.</p>
|
||||
</div>
|
||||
</div>
|
||||
<div style="border-top: 1px solid #1a1a1a; padding: 24px 32px; text-align: center;">
|
||||
<div style="margin-bottom: 16px;">
|
||||
<a href="https://x.com/eliehabib" style="color: #666; text-decoration: none; font-size: 12px; margin: 0 12px;">X / Twitter</a>
|
||||
<a href="https://github.com/koala73/worldmonitor" style="color: #666; text-decoration: none; font-size: 12px; margin: 0 12px;">GitHub</a>
|
||||
<a href="https://worldmonitor.app/pro" style="color: #666; text-decoration: none; font-size: 12px; margin: 0 12px;">Pro Waitlist</a>
|
||||
</div>
|
||||
<p style="font-size: 11px; color: #444; margin: 0; line-height: 1.6;">
|
||||
World Monitor \u2014 Real-time intelligence for a connected world.<br />
|
||||
<a href="https://worldmonitor.app" style="color: #4ade80; text-decoration: none;">worldmonitor.app</a>
|
||||
</p>
|
||||
</div>
|
||||
</div>`,
|
||||
}),
|
||||
});
|
||||
if (!resendRes.ok) {
|
||||
const body = await resendRes.text();
|
||||
console.error(`[register-interest] Resend ${resendRes.status}:`, body);
|
||||
} else {
|
||||
console.log(`[register-interest] Email sent to ${email}`);
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('[register-interest] Resend error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
export async function registerInterest(
|
||||
ctx: ServerContext,
|
||||
req: RegisterInterestRequest,
|
||||
): Promise<RegisterInterestResponse> {
|
||||
// Honeypot — silently accept but do nothing.
|
||||
if (req.website) {
|
||||
return { status: 'registered', referralCode: '', referralCount: 0, position: 0, emailSuppressed: false };
|
||||
}
|
||||
|
||||
const ip = getClientIp(ctx.request);
|
||||
const isDesktopSource = typeof req.source === 'string' && DESKTOP_SOURCES.has(req.source);
|
||||
|
||||
  // Desktop sources bypass Turnstile (no browser captcha). `source` is
  // attacker-controlled, so anyone claiming desktop-settings skips the
  // captcha — apply a tighter 2/hr per-IP cap on that path to limit abuse
  // (matches the legacy handler's in-memory secondary cap). The proper fix is
  // a signed desktop-secret header; tracked as a follow-up.
  if (isDesktopSource) {
    const scoped = await checkScopedRateLimit(
      DESKTOP_RATE_SCOPE,
      DESKTOP_RATE_LIMIT,
      DESKTOP_RATE_WINDOW,
      ip,
    );
    if (!scoped.allowed) {
      throw new ApiError(429, 'Too many requests', '');
    }
  } else {
    const turnstileOk = await verifyTurnstile({
      token: req.turnstileToken || '',
      ip,
      logPrefix: '[register-interest]',
    });
    if (!turnstileOk) {
      throw new ApiError(403, 'Bot verification failed', '');
    }
  }

  const { email, source, appVersion, referredBy } = req;
  if (!email || email.length > MAX_EMAIL_LENGTH || !EMAIL_RE.test(email)) {
    throw new ValidationError([{ field: 'email', description: 'Invalid email address' }]);
  }

  const emailCheck = await validateEmail(email);
  if (!emailCheck.valid) {
    throw new ValidationError([{ field: 'email', description: emailCheck.reason }]);
  }

  const safeSource = source ? source.slice(0, MAX_META_LENGTH) : 'unknown';
  const safeAppVersion = appVersion ? appVersion.slice(0, MAX_META_LENGTH) : 'unknown';
  const safeReferredBy = referredBy ? referredBy.slice(0, 20) : undefined;

  const convexUrl = process.env.CONVEX_URL;
  if (!convexUrl) {
    throw new ApiError(503, 'Registration service unavailable', '');
  }

  const client = new ConvexHttpClient(convexUrl);
  const result = (await client.mutation('registerInterest:register' as any, {
    email,
    source: safeSource,
    appVersion: safeAppVersion,
    referredBy: safeReferredBy,
  })) as ConvexRegisterResult;

  if (result.status === 'registered' && result.referralCode) {
    if (!result.emailSuppressed) {
      await sendConfirmationEmail(email, result.referralCode);
    } else {
      console.log(`[register-interest] Skipped email to suppressed address: ${email}`);
    }
  }

  return {
    status: result.status,
    referralCode: result.referralCode,
    referralCount: result.referralCount,
    position: result.position ?? 0,
    emailSuppressed: result.emailSuppressed ?? false,
  };
}
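The desktop path above relies on `checkScopedRateLimit(scope, limit, window, ip)` as its only abuse control. As an illustrative sketch only (the real helper is shared and presumably Redis-backed; names, the in-memory Map, and the millisecond window unit are assumptions), a fixed-window limiter with that contract could look like:

```typescript
// Illustrative in-memory fixed-window limiter matching the shape of the
// checkScopedRateLimit(scope, limit, windowMs, ip) call in the handler.
// NOT the real implementation — a sketch of the contract only.
const buckets = new Map<string, { count: number; windowStart: number }>();

async function checkScopedRateLimit(
  scope: string,
  limit: number,
  windowMs: number,
  ip: string,
): Promise<{ allowed: boolean }> {
  const key = `${scope}:${ip}`; // cap is per scope *and* per IP
  const now = Date.now();
  const bucket = buckets.get(key);
  if (!bucket || now - bucket.windowStart >= windowMs) {
    // Window expired (or first request): start a fresh count.
    buckets.set(key, { count: 1, windowStart: now });
    return { allowed: true };
  }
  bucket.count += 1;
  return { allowed: bucket.count <= limit };
}
```

With `limit = 2` and a one-hour window this reproduces the "2/hr per-IP" behavior the comment describes: the third request from the same IP inside the window is rejected, while other IPs are unaffected.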
177
server/worldmonitor/leads/v1/submit-contact.ts
Normal file
@@ -0,0 +1,177 @@
/**
 * RPC: submitContact -- Stores an enterprise contact submission and emails ops.
 * Port from api/contact.js
 * Sources: Convex contactMessages:submit mutation + Resend notification email
 */

import { ConvexHttpClient } from 'convex/browser';
import type {
  ServerContext,
  SubmitContactRequest,
  SubmitContactResponse,
} from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { getClientIp, verifyTurnstile } from '../../../_shared/turnstile';

const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const PHONE_RE = /^[+(]?\d[\d\s()./-]{4,23}\d$/;
const MAX_FIELD = 500;
const MAX_MESSAGE = 2000;

const FREE_EMAIL_DOMAINS = new Set<string>([
  'gmail.com', 'googlemail.com', 'yahoo.com', 'yahoo.fr', 'yahoo.co.uk', 'yahoo.co.jp',
  'hotmail.com', 'hotmail.fr', 'hotmail.co.uk', 'outlook.com', 'outlook.fr',
  'live.com', 'live.fr', 'msn.com', 'aol.com', 'icloud.com', 'me.com', 'mac.com',
  'protonmail.com', 'proton.me', 'mail.com', 'zoho.com', 'yandex.com', 'yandex.ru',
  'gmx.com', 'gmx.net', 'gmx.de', 'web.de', 'mail.ru', 'inbox.com',
  'fastmail.com', 'tutanota.com', 'tuta.io', 'hey.com',
  'qq.com', '163.com', '126.com', 'sina.com', 'foxmail.com',
  'rediffmail.com', 'ymail.com', 'rocketmail.com',
  'wanadoo.fr', 'free.fr', 'laposte.net', 'orange.fr', 'sfr.fr',
  't-online.de', 'libero.it', 'virgilio.it',
]);

function escapeHtml(str: string): string {
  return str
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function sanitizeForSubject(str: string, maxLen = 50): string {
  return str.replace(/[\r\n\0]/g, '').slice(0, maxLen);
}

async function sendNotificationEmail(
  name: string,
  email: string,
  organization: string,
  phone: string,
  message: string | undefined,
  ip: string,
  country: string | null,
): Promise<boolean> {
  const resendKey = process.env.RESEND_API_KEY;
  if (!resendKey) {
    console.error('[contact] RESEND_API_KEY not set — lead stored in Convex but notification NOT sent');
    return false;
  }
  const notifyEmail = process.env.CONTACT_NOTIFY_EMAIL || 'elie@worldmonitor.app';
  const emailDomain = (email.split('@')[1] || '').toLowerCase();
  try {
    const res = await fetch('https://api.resend.com/emails', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${resendKey}`,
      },
      body: JSON.stringify({
        from: 'World Monitor <noreply@worldmonitor.app>',
        to: [notifyEmail],
        subject: `[WM Enterprise] ${sanitizeForSubject(name)} from ${sanitizeForSubject(organization)}`,
        html: `
  <div style="font-family: -apple-system, sans-serif; max-width: 600px; margin: 0 auto;">
    <h2 style="color: #4ade80;">New Enterprise Contact</h2>
    <table style="width: 100%; border-collapse: collapse;">
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Name</td><td style="padding: 8px;">${escapeHtml(name)}</td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Email</td><td style="padding: 8px;"><a href="mailto:${escapeHtml(email)}">${escapeHtml(email)}</a></td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Domain</td><td style="padding: 8px;"><a href="https://${escapeHtml(emailDomain)}" target="_blank">${escapeHtml(emailDomain)}</a></td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Company</td><td style="padding: 8px;">${escapeHtml(organization)}</td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Phone</td><td style="padding: 8px;"><a href="tel:${escapeHtml(phone)}">${escapeHtml(phone)}</a></td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">Message</td><td style="padding: 8px;">${escapeHtml(message || 'N/A')}</td></tr>
      <tr><td style="padding: 8px; font-weight: bold; color: #666;">IP</td><td style="padding: 8px; font-family: monospace;">${escapeHtml(ip || 'unknown')}</td></tr>
      ${country ? `<tr><td style="padding: 8px; font-weight: bold; color: #666;">Country</td><td style="padding: 8px;">${escapeHtml(country)}</td></tr>` : ''}
    </table>
    <p style="color: #999; font-size: 12px; margin-top: 24px;">Sent from worldmonitor.app enterprise contact form</p>
  </div>`,
      }),
    });
    if (!res.ok) {
      const body = await res.text();
      console.error(`[contact] Resend ${res.status}:`, body);
      return false;
    }
    return true;
  } catch (err) {
    console.error('[contact] Resend error:', err);
    return false;
  }
}

export async function submitContact(
  ctx: ServerContext,
  req: SubmitContactRequest,
): Promise<SubmitContactResponse> {
  // Honeypot — silently accept but do nothing (bots auto-fill hidden field).
  if (req.website) {
    return { status: 'sent', emailSent: false };
  }

  const ip = getClientIp(ctx.request);
  const country = ctx.request.headers.get('cf-ipcountry')
    || ctx.request.headers.get('x-vercel-ip-country');

  const turnstileOk = await verifyTurnstile({
    token: req.turnstileToken || '',
    ip,
    logPrefix: '[contact]',
  });
  if (!turnstileOk) {
    throw new ApiError(403, 'Bot verification failed', '');
  }

  const { email, name, organization, phone, message, source } = req;

  if (!email || !EMAIL_RE.test(email)) {
    throw new ValidationError([{ field: 'email', description: 'Invalid email' }]);
  }

  const emailDomain = email.split('@')[1]?.toLowerCase();
  if (emailDomain && FREE_EMAIL_DOMAINS.has(emailDomain)) {
    throw new ApiError(422, 'Please use your work email address', '');
  }

  if (!name || name.trim().length === 0) {
    throw new ValidationError([{ field: 'name', description: 'Name is required' }]);
  }
  if (!organization || organization.trim().length === 0) {
    throw new ValidationError([{ field: 'organization', description: 'Company is required' }]);
  }
  if (!phone || !PHONE_RE.test(phone.trim())) {
    throw new ValidationError([{ field: 'phone', description: 'Valid phone number is required' }]);
  }

  const safeName = name.slice(0, MAX_FIELD);
  const safeOrg = organization.slice(0, MAX_FIELD);
  const safePhone = phone.trim().slice(0, 30);
  const safeMsg = message ? message.slice(0, MAX_MESSAGE) : undefined;
  const safeSource = source ? source.slice(0, 100) : 'enterprise-contact';

  const convexUrl = process.env.CONVEX_URL;
  if (!convexUrl) {
    throw new ApiError(503, 'Service unavailable', '');
  }

  const client = new ConvexHttpClient(convexUrl);
  await client.mutation('contactMessages:submit' as any, {
    name: safeName,
    email: email.trim(),
    organization: safeOrg,
    phone: safePhone,
    message: safeMsg,
    source: safeSource,
  });

  const emailSent = await sendNotificationEmail(
    safeName,
    email.trim(),
    safeOrg,
    safePhone,
    safeMsg,
    ip,
    country,
  );

  return { status: 'sent', emailSent };
}
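The two sanitizers in submit-contact.ts each close a different injection vector: `escapeHtml` neutralizes markup before interpolation into the notification body, and `sanitizeForSubject` strips CR/LF/NUL so a crafted name cannot smuggle extra email headers through the subject line. A self-contained usage sketch (the helper bodies are copied from the file; the attacker payload is made up):

```typescript
// Copies of the submit-contact.ts helpers, inlined so this sketch runs standalone.
function escapeHtml(str: string): string {
  return str
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function sanitizeForSubject(str: string, maxLen = 50): string {
  return str.replace(/[\r\n\0]/g, '').slice(0, maxLen);
}

// Hypothetical hostile form input: script injection + CRLF header-split attempt.
const attackerName = '<img src=x onerror=alert(1)>\r\nBcc: victim@example.com';

const bodyCell = escapeHtml(attackerName);       // renders as inert text in the email table
const subject = sanitizeForSubject(attackerName); // single line, capped at 50 chars
```

Note that `&` must be escaped first; reversing the order would double-escape the entities produced by the later replaces.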
@@ -7,6 +7,7 @@ import type {
  AisDisruption,
  AisDisruptionType,
  AisDisruptionSeverity,
  SnapshotCandidateReport,
} from '../../../../src/generated/server/worldmonitor/maritime/v1/service_server';

import { getRelayBaseUrl, getRelayHeaders } from '../../../_shared/relay';

@@ -26,44 +27,73 @@ const SEVERITY_MAP: Record<string, AisDisruptionSeverity> = {
  high: 'AIS_DISRUPTION_SEVERITY_HIGH',
};

// In-memory cache (matches old /api/ais-snapshot behavior)
// Cache the two variants separately — candidate reports materially change
// payload size, and clients with no position callbacks should not have to
// wait on or pay for the heavier payload.
const SNAPSHOT_CACHE_TTL_MS = 300_000; // 5 min -- matches client poll interval
let cachedSnapshot: VesselSnapshot | undefined;
let cacheTimestamp = 0;
let inFlightRequest: Promise<VesselSnapshot | undefined> | null = null;

async function fetchVesselSnapshot(): Promise<VesselSnapshot | undefined> {
  // Return cached if fresh
interface SnapshotCacheSlot {
  snapshot: VesselSnapshot | undefined;
  timestamp: number;
  inFlight: Promise<VesselSnapshot | undefined> | null;
}

const cache: Record<'with' | 'without', SnapshotCacheSlot> = {
  with: { snapshot: undefined, timestamp: 0, inFlight: null },
  without: { snapshot: undefined, timestamp: 0, inFlight: null },
};

async function fetchVesselSnapshot(includeCandidates: boolean): Promise<VesselSnapshot | undefined> {
  const slot = cache[includeCandidates ? 'with' : 'without'];
  const now = Date.now();
  if (cachedSnapshot && (now - cacheTimestamp) < SNAPSHOT_CACHE_TTL_MS) {
    return cachedSnapshot;
  if (slot.snapshot && (now - slot.timestamp) < SNAPSHOT_CACHE_TTL_MS) {
    return slot.snapshot;
  }

  // In-flight dedup: if a request is already running, await it
  if (inFlightRequest) {
    return inFlightRequest;
  if (slot.inFlight) {
    return slot.inFlight;
  }

  inFlightRequest = fetchVesselSnapshotFromRelay();
  slot.inFlight = fetchVesselSnapshotFromRelay(includeCandidates);
  try {
    const result = await inFlightRequest;
    const result = await slot.inFlight;
    if (result) {
      cachedSnapshot = result;
      cacheTimestamp = Date.now();
      slot.snapshot = result;
      slot.timestamp = Date.now();
    }
    return result ?? cachedSnapshot; // serve stale on relay failure
    return result ?? slot.snapshot; // serve stale on relay failure
  } finally {
    inFlightRequest = null;
    slot.inFlight = null;
  }
}

async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefined> {
function toCandidateReport(raw: any): SnapshotCandidateReport | null {
  if (!raw || typeof raw !== 'object') return null;
  const mmsi = String(raw.mmsi ?? '');
  if (!mmsi) return null;
  const lat = Number(raw.lat);
  const lon = Number(raw.lon);
  if (!Number.isFinite(lat) || !Number.isFinite(lon)) return null;
  return {
    mmsi,
    name: String(raw.name ?? ''),
    lat,
    lon,
    shipType: Number.isFinite(Number(raw.shipType)) ? Number(raw.shipType) : 0,
    heading: Number.isFinite(Number(raw.heading)) ? Number(raw.heading) : 0,
    speed: Number.isFinite(Number(raw.speed)) ? Number(raw.speed) : 0,
    course: Number.isFinite(Number(raw.course)) ? Number(raw.course) : 0,
    timestamp: Number.isFinite(Number(raw.timestamp)) ? Number(raw.timestamp) : Date.now(),
  };
}

async function fetchVesselSnapshotFromRelay(includeCandidates: boolean): Promise<VesselSnapshot | undefined> {
  try {
    const relayBaseUrl = getRelayBaseUrl();
    if (!relayBaseUrl) return undefined;

    const response = await fetch(
      `${relayBaseUrl}/ais/snapshot?candidates=false`,
      `${relayBaseUrl}/ais/snapshot?candidates=${includeCandidates ? 'true' : 'false'}`,
      {
        headers: getRelayHeaders(),
        signal: AbortSignal.timeout(10000),
@@ -107,10 +137,22 @@ async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefine
      description: String(d.description || ''),
    }));

    const rawStatus = (data.status && typeof data.status === 'object') ? data.status : {};
    const candidateReports = (includeCandidates && Array.isArray(data.candidateReports))
      ? data.candidateReports.map(toCandidateReport).filter((r: SnapshotCandidateReport | null): r is SnapshotCandidateReport => r !== null)
      : [];

    return {
      snapshotAt: Date.now(),
      densityZones,
      disruptions,
      sequence: Number.isFinite(Number(data.sequence)) ? Number(data.sequence) : 0,
      status: {
        connected: Boolean(rawStatus.connected),
        vessels: Number.isFinite(Number(rawStatus.vessels)) ? Number(rawStatus.vessels) : 0,
        messages: Number.isFinite(Number(rawStatus.messages)) ? Number(rawStatus.messages) : 0,
      },
      candidateReports,
    };
  } catch {
    return undefined;
@@ -123,10 +165,10 @@ async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefine

export async function getVesselSnapshot(
  _ctx: ServerContext,
  _req: GetVesselSnapshotRequest,
  req: GetVesselSnapshotRequest,
): Promise<GetVesselSnapshotResponse> {
  try {
    const snapshot = await fetchVesselSnapshot();
    const snapshot = await fetchVesselSnapshot(Boolean(req.includeCandidates));
    return { snapshot };
  } catch {
    return { snapshot: undefined };

@@ -3,15 +3,18 @@ import type {
  ListMilitaryFlightsRequest,
  ListMilitaryFlightsResponse,
  MilitaryAircraftType,
  MilitaryOperator,
  MilitaryConfidence,
} from '../../../../src/generated/server/worldmonitor/military/v1/service_server';

import { isMilitaryCallsign, isMilitaryHex, detectAircraftType, UPSTREAM_TIMEOUT_MS } from './_shared';
import { cachedFetchJson } from '../../../_shared/redis';
import { cachedFetchJson, getRawJson } from '../../../_shared/redis';
import { markNoCacheResponse } from '../../../_shared/response-headers';
import { getRelayBaseUrl, getRelayHeaders } from '../../../_shared/relay';

const REDIS_CACHE_KEY = 'military:flights:v1';
const REDIS_CACHE_TTL = 600; // 10 min — reduce upstream API pressure
const REDIS_STALE_KEY = 'military:flights:stale:v1';

/** Snap a coordinate to a grid step so nearby bbox values share cache entries. */
const quantize = (v: number, step: number) => Math.round(v / step) * step;
@@ -53,8 +56,110 @@ const AIRCRAFT_TYPE_MAP: Record<string, string> = {
  reconnaissance: 'MILITARY_AIRCRAFT_TYPE_RECONNAISSANCE',
  drone: 'MILITARY_AIRCRAFT_TYPE_DRONE',
  bomber: 'MILITARY_AIRCRAFT_TYPE_BOMBER',
  fighter: 'MILITARY_AIRCRAFT_TYPE_FIGHTER',
  helicopter: 'MILITARY_AIRCRAFT_TYPE_HELICOPTER',
  vip: 'MILITARY_AIRCRAFT_TYPE_VIP',
  special_ops: 'MILITARY_AIRCRAFT_TYPE_SPECIAL_OPS',
};

const OPERATOR_MAP: Record<string, string> = {
  usaf: 'MILITARY_OPERATOR_USAF',
  raf: 'MILITARY_OPERATOR_RAF',
  faf: 'MILITARY_OPERATOR_FAF',
  gaf: 'MILITARY_OPERATOR_GAF',
  iaf: 'MILITARY_OPERATOR_IAF',
  nato: 'MILITARY_OPERATOR_NATO',
  other: 'MILITARY_OPERATOR_OTHER',
};

const CONFIDENCE_MAP: Record<string, string> = {
  high: 'MILITARY_CONFIDENCE_HIGH',
  medium: 'MILITARY_CONFIDENCE_MEDIUM',
  low: 'MILITARY_CONFIDENCE_LOW',
};

interface StaleFlight {
  id?: string;
  callsign?: string;
  hexCode?: string;
  registration?: string;
  aircraftType?: string;
  aircraftModel?: string;
  operator?: string;
  operatorCountry?: string;
  lat?: number | null;
  lon?: number | null;
  altitude?: number;
  heading?: number;
  speed?: number;
  verticalRate?: number;
  onGround?: boolean;
  squawk?: string;
  origin?: string;
  destination?: string;
  lastSeenMs?: number;
  firstSeenMs?: number;
  confidence?: string;
  isInteresting?: boolean;
  note?: string;
}

interface StalePayload {
  flights?: StaleFlight[];
  fetchedAt?: number;
}

/**
 * Convert the seed cron's app-shape flight (flat lat/lon, lowercase enums,
 * lastSeenMs) into the proto shape (nested GeoCoordinates, enum strings,
 * lastSeenAt). Mirrors the inverse of src/services/military-flights.ts:mapProtoFlight.
 * hexCode is canonicalized to uppercase per the invariant documented on
 * MilitaryFlight.hex_code in military_flight.proto.
 */
function staleToProto(f: StaleFlight): ListMilitaryFlightsResponse['flights'][number] | null {
  if (f.lat == null || f.lon == null) return null;
  const icao = (f.hexCode || f.id || '').toUpperCase();
  if (!icao) return null;
  return {
    id: icao,
    callsign: (f.callsign || '').trim(),
    hexCode: icao,
    registration: f.registration || '',
    aircraftType: (AIRCRAFT_TYPE_MAP[f.aircraftType || ''] || 'MILITARY_AIRCRAFT_TYPE_UNKNOWN') as MilitaryAircraftType,
    aircraftModel: f.aircraftModel || '',
    operator: (OPERATOR_MAP[f.operator || ''] || 'MILITARY_OPERATOR_OTHER') as MilitaryOperator,
    operatorCountry: f.operatorCountry || '',
    location: { latitude: f.lat, longitude: f.lon },
    altitude: f.altitude ?? 0,
    heading: f.heading ?? 0,
    speed: f.speed ?? 0,
    verticalRate: f.verticalRate ?? 0,
    onGround: f.onGround ?? false,
    squawk: f.squawk || '',
    origin: f.origin || '',
    destination: f.destination || '',
    lastSeenAt: f.lastSeenMs ?? Date.now(),
    firstSeenAt: f.firstSeenMs ?? 0,
    confidence: (CONFIDENCE_MAP[f.confidence || ''] || 'MILITARY_CONFIDENCE_LOW') as MilitaryConfidence,
    isInteresting: f.isInteresting ?? false,
    note: f.note || '',
    enrichment: undefined,
  };
}

async function fetchStaleFallback(): Promise<ListMilitaryFlightsResponse['flights'] | null> {
  try {
    const raw = (await getRawJson(REDIS_STALE_KEY)) as StalePayload | null;
    if (!raw || !Array.isArray(raw.flights) || raw.flights.length === 0) return null;
    const flights = raw.flights
      .map(staleToProto)
      .filter((f): f is NonNullable<typeof f> => f != null);
    return flights.length > 0 ? flights : null;
  } catch {
    return null;
  }
}

export async function listMilitaryFlights(
  ctx: ServerContext,
  req: ListMilitaryFlightsRequest,
@@ -115,11 +220,17 @@ export async function listMilitaryFlights(
      if (!isMilitaryCallsign(callsign) && !isMilitaryHex(icao24)) continue;

      const aircraftType = detectAircraftType(callsign);
      // Canonicalize hex_code to uppercase — the seed cron
      // (scripts/seed-military-flights.mjs) writes uppercase, and
      // src/services/military-flights.ts getFlightByHex uppercases the
      // lookup input. Preserving OpenSky's lowercase here would break
      // every hex lookup silently.
      const hex = icao24.toUpperCase();

      flights.push({
        id: icao24,
        id: hex,
        callsign: (callsign || '').trim(),
        hexCode: icao24,
        hexCode: hex,
        registration: '',
        aircraftType: (AIRCRAFT_TYPE_MAP[aircraftType] || 'MILITARY_AIRCRAFT_TYPE_UNKNOWN') as MilitaryAircraftType,
        aircraftModel: '',
@@ -148,6 +259,15 @@ export async function listMilitaryFlights(
  );

  if (!fullResult) {
    // Live fetch failed. The legacy /api/military-flights handler cascaded
    // military:flights:v1 → military:flights:stale:v1 before returning empty.
    // The seed cron (scripts/seed-military-flights.mjs) writes both keys
    // every run; stale has a 24h TTL versus 10min live, so it's the right
    // fallback when OpenSky / the relay hiccups.
    const staleFlights = await fetchStaleFallback();
    if (staleFlights && staleFlights.length > 0) {
      return { flights: filterFlightsToBounds(staleFlights, requestBounds), clusters: [], pagination: undefined };
    }
    markNoCacheResponse(ctx.request);
    return { flights: [], clusters: [], pagination: undefined };
  }

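The comment in the `!fullResult` branch describes a two-tier cascade: serve the 10-minute live key, fall back to the 24-hour stale key, and only then return empty. A minimal sketch of that cascade under stated assumptions (`readJson` stands in for the shared Redis helper; the function name and return shape are illustrative, not the handler's actual code):

```typescript
// Two-tier read: live key first, then the longer-lived stale key, then empty.
// Key names are the ones the diff uses; everything else is a sketch.
async function readFlightsWithFallback(
  readJson: (key: string) => Promise<unknown>,
): Promise<{ flights: unknown[]; stale: boolean }> {
  const tiers = [
    ['military:flights:v1', false],        // 10 min TTL — normal path
    ['military:flights:stale:v1', true],   // 24 h TTL — survives upstream outages
  ] as const;
  for (const [key, stale] of tiers) {
    try {
      const raw = (await readJson(key)) as { flights?: unknown[] } | null;
      if (raw && Array.isArray(raw.flights) && raw.flights.length > 0) {
        return { flights: raw.flights, stale };
      }
    } catch {
      // fall through to the next tier
    }
  }
  return { flights: [], stale: false };
}
```

Because the seed cron writes both keys on every run, the stale tier only diverges from live by at most one seeding interval, which is why serving it is preferable to an empty map.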
@@ -11,7 +11,10 @@ import { getCachedJson } from '../../../_shared/redis';
const ENTITY_INDEX_KEY = 'sanctions:entities:v1';
const DEFAULT_MAX = 10;
const MAX_RESULTS_LIMIT = 50;
const MAX_QUERY_LENGTH = 200;
const MIN_QUERY_LENGTH = 2;
const OPENSANCTIONS_BASE = 'https://api.opensanctions.org';
const OPENSANCTIONS_TIMEOUT_MS = 8_000;

interface EntityIndexRecord {
  id: string;
@@ -21,6 +24,24 @@ interface EntityIndexRecord {
  pr: string[];
}

interface OpenSanctionsHit {
  id?: string;
  schema?: string;
  caption?: string;
  properties?: {
    name?: string[];
    country?: string[];
    nationality?: string[];
    program?: string[];
    sanctions?: string[];
  };
}

interface OpenSanctionsSearchResponse {
  results?: OpenSanctionsHit[];
  total?: { value?: number };
}

function normalize(s: string): string {
  return s.toLowerCase().replace(/[^a-z0-9]/g, ' ').replace(/\s+/g, ' ').trim();
}
@@ -30,59 +51,122 @@ function clampMax(value: number): number {
  return Math.min(Math.max(Math.trunc(value), 1), MAX_RESULTS_LIMIT);
}

function entityTypeFromSchema(schema: string): string {
  if (schema === 'Vessel') return 'vessel';
  if (schema === 'Aircraft') return 'aircraft';
  if (schema === 'Person') return 'individual';
  return 'entity';
}

function normalizeOpenSanctionsHit(hit: OpenSanctionsHit): SanctionEntityMatch | null {
  const props = hit.properties ?? {};
  const name = (props.name ?? [hit.caption ?? '']).filter(Boolean)[0] ?? '';
  if (!name || !hit.id) return null;
  const countries = (props.country ?? props.nationality ?? []).slice(0, 3);
  const programs = (props.program ?? props.sanctions ?? []).slice(0, 3);
  return {
    id: `opensanctions:${hit.id}`,
    name,
    entityType: entityTypeFromSchema(hit.schema ?? ''),
    countryCodes: countries,
    programs,
  };
}

async function searchOpenSanctions(q: string, limit: number): Promise<{ results: SanctionEntityMatch[]; total: number } | null> {
  const url = new URL(`${OPENSANCTIONS_BASE}/search/default`);
  url.searchParams.set('q', q);
  url.searchParams.set('limit', String(limit));

  const resp = await fetch(url.toString(), {
    headers: {
      'User-Agent': 'WorldMonitor/1.0 sanctions-search',
      Accept: 'application/json',
    },
    signal: AbortSignal.timeout(OPENSANCTIONS_TIMEOUT_MS),
  });

  if (!resp.ok) return null;

  const data = (await resp.json()) as OpenSanctionsSearchResponse;
  const hits = Array.isArray(data.results) ? data.results : [];
  const results = hits
    .map(normalizeOpenSanctionsHit)
    .filter((r): r is SanctionEntityMatch => r !== null);
  const total = data.total?.value ?? results.length;
  return { results, total };
}

function searchOfacLocal(q: string, maxResults: number, raw: unknown): { results: SanctionEntityMatch[]; total: number } {
  if (!Array.isArray(raw)) return { results: [], total: 0 };

  const index = raw as EntityIndexRecord[];
  const needle = normalize(q);
  const tokens = needle.split(' ').filter(Boolean);
  const scored: Array<{ score: number; entry: EntityIndexRecord }> = [];

  for (const entry of index) {
    const haystack = normalize(entry.name);

    if (haystack === needle) {
      scored.push({ score: 100, entry });
      continue;
    }
    if (haystack.startsWith(needle)) {
      scored.push({ score: 80, entry });
      continue;
    }
    if (tokens.length > 0 && tokens.every((t) => haystack.includes(t))) {
      const pos = haystack.indexOf(tokens[0] ?? '');
      scored.push({ score: 60 - Math.min(pos, 20), entry });
      continue;
    }
    const matchCount = tokens.filter((t) => haystack.includes(t)).length;
    if (matchCount > 0) {
      scored.push({ score: matchCount * 10, entry });
    }
  }

  scored.sort((a, b) => b.score - a.score);

  const results: SanctionEntityMatch[] = scored.slice(0, maxResults).map(({ entry }) => ({
    id: entry.id,
    name: entry.name,
    entityType: entry.et,
    countryCodes: entry.cc,
    programs: entry.pr,
  }));

  return { results, total: scored.length };
}
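The local OFAC matcher uses four scoring tiers: exact normalized match (100), prefix match (80), all query tokens present (60 minus the first token's position, floored at 40), and otherwise 10 points per matched token. Isolated as a pure function for illustration (`scoreMatch` is a name invented for this sketch; the tier values and `normalize` are copied from the diff):

```typescript
// normalize copied from the handler: lowercase, strip punctuation, squeeze spaces.
function normalize(s: string): string {
  return s.toLowerCase().replace(/[^a-z0-9]/g, ' ').replace(/\s+/g, ' ').trim();
}

// Tiered score for one index entry against one query (0 = no match).
function scoreMatch(name: string, query: string): number {
  const haystack = normalize(name);
  const needle = normalize(query);
  const tokens = needle.split(' ').filter(Boolean);
  if (haystack === needle) return 100;                  // exact
  if (haystack.startsWith(needle)) return 80;           // prefix
  if (tokens.length > 0 && tokens.every((t) => haystack.includes(t))) {
    // all tokens present: earlier first-token position scores higher
    return 60 - Math.min(haystack.indexOf(tokens[0] ?? ''), 20);
  }
  return tokens.filter((t) => haystack.includes(t)).length * 10; // partial
}
```

The tier gaps are wide enough that a partial-token match can never outrank an exact or prefix match, which keeps the top of the result list stable as the index grows.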

export const lookupSanctionEntity: SanctionsServiceHandler['lookupSanctionEntity'] = async (
  _ctx: ServerContext,
  req: LookupSanctionEntityRequest,
): Promise<LookupSanctionEntityResponse> => {
  const q = (req.q ?? '').trim();
  if (q.length < MIN_QUERY_LENGTH) {
    return { results: [], total: 0, source: 'ofac' };
  if (q.length < MIN_QUERY_LENGTH || q.length > MAX_QUERY_LENGTH) {
    return { results: [], total: 0, source: 'opensanctions' };
  }

  const maxResults = clampMax(req.maxResults);
  const needle = normalize(q);
  const tokens = needle.split(' ').filter(Boolean);

  // Primary: live query against OpenSanctions — broader global coverage than
  // the local OFAC index. Matches the legacy /api/sanctions-entity-search path.
  try {
    const upstream = await searchOpenSanctions(q, maxResults);
    if (upstream) {
      return { ...upstream, source: 'opensanctions' };
    }
  } catch {
    // fall through to OFAC fallback
  }

  // Fallback: local OFAC fuzzy match from the seeded Redis index. Keeps the
  // endpoint useful when OpenSanctions is unreachable or rate-limiting us.
  try {
    const raw = await getCachedJson(ENTITY_INDEX_KEY, true);
    if (!Array.isArray(raw)) return { results: [], total: 0, source: 'ofac' };

    const index = raw as EntityIndexRecord[];
    const scored: Array<{ score: number; entry: EntityIndexRecord }> = [];

    for (const entry of index) {
      const haystack = normalize(entry.name);

      if (haystack === needle) {
        scored.push({ score: 100, entry });
        continue;
      }
      if (haystack.startsWith(needle)) {
        scored.push({ score: 80, entry });
        continue;
      }
      if (tokens.length > 0 && tokens.every((t) => haystack.includes(t))) {
        const pos = haystack.indexOf(tokens[0] ?? '');
        scored.push({ score: 60 - Math.min(pos, 20), entry });
        continue;
      }
      const matchCount = tokens.filter((t) => haystack.includes(t)).length;
      if (matchCount > 0) {
        scored.push({ score: matchCount * 10, entry });
      }
    }

    scored.sort((a, b) => b.score - a.score);

    const results: SanctionEntityMatch[] = scored.slice(0, maxResults).map(({ entry }) => ({
      id: entry.id,
      name: entry.name,
      entityType: entry.et,
      countryCodes: entry.cc,
      programs: entry.pr,
    }));

    return { results, total: scored.length, source: 'ofac' };
    return { ...searchOfacLocal(q, maxResults, raw), source: 'ofac' };
  } catch {
    return { results: [], total: 0, source: 'ofac' };
  }

99
server/worldmonitor/scenario/v1/get-scenario-status.ts
Normal file
@@ -0,0 +1,99 @@
import type {
  ServerContext,
  GetScenarioStatusRequest,
  GetScenarioStatusResponse,
  ScenarioResult,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';

import { isCallerPremium } from '../../../_shared/premium-check';
import { getRawJson } from '../../../_shared/redis';

// Matches jobIds produced by run-scenario.ts: `scenario:{13-digit-ts}:{8-char-suffix}`.
// Guards `GET /scenario-result/{jobId}` against path-traversal via crafted jobId.
const JOB_ID_RE = /^scenario:\d{13}:[a-z0-9]{8}$/;

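A quick sketch of what the `JOB_ID_RE` guard accepts and rejects (the regex is copied from the file; the sample ids are made up):

```typescript
// jobId shape guard from get-scenario-status.ts.
const JOB_ID_RE = /^scenario:\d{13}:[a-z0-9]{8}$/;

const ok = JOB_ID_RE.test('scenario:1717171717171:a1b2c3d4');     // 13-digit ms timestamp + 8-char suffix
const traversal = JOB_ID_RE.test('scenario:../../flush:deadbeef'); // rejected: non-digits in the timestamp slot
const shortTs = JOB_ID_RE.test('scenario:123:abcdefgh');           // rejected: timestamp not 13 digits
```

Anchoring with `^…$` and using fixed-width character classes means no attacker-chosen separators or path segments can reach the key lookup.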
interface WorkerResultEnvelope {
  status?: string;
  result?: unknown;
  error?: unknown;
}

function coerceImpactCountries(raw: unknown): ScenarioResult['topImpactCountries'] {
  if (!Array.isArray(raw)) return [];
  const out: ScenarioResult['topImpactCountries'] = [];
  for (const entry of raw) {
    if (!entry || typeof entry !== 'object') continue;
    const c = entry as { iso2?: unknown; totalImpact?: unknown; impactPct?: unknown };
    out.push({
      iso2: typeof c.iso2 === 'string' ? c.iso2 : '',
      totalImpact: typeof c.totalImpact === 'number' ? c.totalImpact : 0,
      impactPct: typeof c.impactPct === 'number' ? c.impactPct : 0,
    });
  }
  return out;
}

function coerceTemplate(raw: unknown): ScenarioResult['template'] {
  if (!raw || typeof raw !== 'object') return undefined;
  const t = raw as { name?: unknown; disruptionPct?: unknown; durationDays?: unknown; costShockMultiplier?: unknown };
  return {
    name: typeof t.name === 'string' ? t.name : '',
    disruptionPct: typeof t.disruptionPct === 'number' ? t.disruptionPct : 0,
    durationDays: typeof t.durationDays === 'number' ? t.durationDays : 0,
    costShockMultiplier: typeof t.costShockMultiplier === 'number' ? t.costShockMultiplier : 1,
  };
}

function coerceResult(raw: unknown): ScenarioResult | undefined {
  if (!raw || typeof raw !== 'object') return undefined;
  const r = raw as { affectedChokepointIds?: unknown; topImpactCountries?: unknown; template?: unknown };
  return {
    affectedChokepointIds: Array.isArray(r.affectedChokepointIds)
      ? r.affectedChokepointIds.filter((id): id is string => typeof id === 'string')
      : [],
    topImpactCountries: coerceImpactCountries(r.topImpactCountries),
    template: coerceTemplate(r.template),
  };
}

export async function getScenarioStatus(
  ctx: ServerContext,
  req: GetScenarioStatusRequest,
): Promise<GetScenarioStatusResponse> {
  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) {
    throw new ApiError(403, 'PRO subscription required', '');
  }

  const jobId = req.jobId ?? '';
|
||||
if (!JOB_ID_RE.test(jobId)) {
|
||||
throw new ValidationError([{ field: 'jobId', description: 'Invalid or missing jobId' }]);
|
||||
}
|
||||
|
||||
// Worker writes under the raw (unprefixed) key, so we must read raw.
|
||||
let envelope: WorkerResultEnvelope | null = null;
|
||||
try {
|
||||
envelope = await getRawJson(`scenario-result:${jobId}`) as WorkerResultEnvelope | null;
|
||||
} catch {
|
||||
throw new ApiError(502, 'Failed to fetch job status', '');
|
||||
}
|
||||
|
||||
if (!envelope) {
|
||||
return { status: 'pending', error: '' };
|
||||
}
|
||||
|
||||
const status = typeof envelope.status === 'string' ? envelope.status : 'pending';
|
||||
|
||||
if (status === 'done') {
|
||||
const result = coerceResult(envelope.result);
|
||||
return { status: 'done', result, error: '' };
|
||||
}
|
||||
|
||||
if (status === 'failed') {
|
||||
const error = typeof envelope.error === 'string' ? envelope.error : 'computation_error';
|
||||
return { status: 'failed', error };
|
||||
}
|
||||
|
||||
return { status, error: '' };
|
||||
}
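The jobId format guard can be exercised in isolation. This sketch re-declares the same regex under a hypothetical name so it runs standalone: a 13-digit millisecond timestamp plus an 8-character lowercase-alphanumeric suffix passes; anything else, including path-traversal attempts, is rejected before any Redis key is built from it.

```typescript
// Same pattern as JOB_ID_RE in get-scenario-status.ts; names here are
// illustrative only.
const JOB_ID_FORMAT = /^scenario:\d{13}:[a-z0-9]{8}$/;

function isValidJobId(jobId: string): boolean {
  return JOB_ID_FORMAT.test(jobId);
}
```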
11
server/worldmonitor/scenario/v1/handler.ts
Normal file
@@ -0,0 +1,11 @@
import type { ScenarioServiceHandler } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';

import { runScenario } from './run-scenario';
import { getScenarioStatus } from './get-scenario-status';
import { listScenarioTemplates } from './list-scenario-templates';

export const scenarioHandler: ScenarioServiceHandler = {
  runScenario,
  getScenarioStatus,
  listScenarioTemplates,
};
26
server/worldmonitor/scenario/v1/list-scenario-templates.ts
Normal file
@@ -0,0 +1,26 @@
import type {
  ServerContext,
  ListScenarioTemplatesRequest,
  ListScenarioTemplatesResponse,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';

import { SCENARIO_TEMPLATES } from '../../supply-chain/v1/scenario-templates';

export async function listScenarioTemplates(
  _ctx: ServerContext,
  _req: ListScenarioTemplatesRequest,
): Promise<ListScenarioTemplatesResponse> {
  return {
    templates: SCENARIO_TEMPLATES.map((t) => ({
      id: t.id,
      name: t.name,
      affectedChokepointIds: [...t.affectedChokepointIds],
      disruptionPct: t.disruptionPct,
      durationDays: t.durationDays,
      // Empty array means ALL sectors on the wire (mirrors the `affectedHs2: null`
      // template convention — proto `repeated` cannot carry null).
      affectedHs2: t.affectedHs2 ? [...t.affectedHs2] : [],
      costShockMultiplier: t.costShockMultiplier,
    })),
  };
}
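The `affectedHs2` wire convention noted in the mapping above can be sketched as a tiny encoder (the helper name is hypothetical): a null template value means "all sectors", which must be encoded as an empty array because proto3 `repeated` fields cannot carry null.

```typescript
// Null in the template means "all sectors"; [] is its wire encoding, since a
// proto3 repeated field has no null representation.
function encodeAffectedHs2(affectedHs2: string[] | null): string[] {
  return affectedHs2 ? [...affectedHs2] : [];
}
```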
79
server/worldmonitor/scenario/v1/run-scenario.ts
Normal file
@@ -0,0 +1,79 @@
import type {
  ServerContext,
  RunScenarioRequest,
  RunScenarioResponse,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';

import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import { getScenarioTemplate } from '../../supply-chain/v1/scenario-templates';

const QUEUE_KEY = 'scenario-queue:pending';
const MAX_QUEUE_DEPTH = 100;
const JOB_ID_CHARSET = 'abcdefghijklmnopqrstuvwxyz0123456789';

function generateJobId(): string {
  const ts = Date.now();
  let suffix = '';
  const array = new Uint8Array(8);
  crypto.getRandomValues(array);
  for (const byte of array) suffix += JOB_ID_CHARSET[byte % JOB_ID_CHARSET.length];
  return `scenario:${ts}:${suffix}`;
}

export async function runScenario(
  ctx: ServerContext,
  req: RunScenarioRequest,
): Promise<RunScenarioResponse> {
  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) {
    throw new ApiError(403, 'PRO subscription required', '');
  }

  const scenarioId = (req.scenarioId ?? '').trim();
  if (!scenarioId) {
    throw new ValidationError([{ field: 'scenarioId', description: 'scenarioId is required' }]);
  }
  if (!getScenarioTemplate(scenarioId)) {
    throw new ValidationError([{ field: 'scenarioId', description: `Unknown scenario: ${scenarioId}` }]);
  }

  const iso2 = req.iso2 ? req.iso2.trim() : '';
  if (iso2 && !/^[A-Z]{2}$/.test(iso2)) {
    throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase country code' }]);
  }

  // Queue-depth backpressure. Raw key: worker reads it unprefixed, so we must too.
  const [depthEntry] = await runRedisPipeline([['LLEN', QUEUE_KEY]], true);
  const depth = typeof depthEntry?.result === 'number' ? depthEntry.result : 0;
  if (depth > MAX_QUEUE_DEPTH) {
    throw new ApiError(429, 'Scenario queue is at capacity, please try again later', '');
  }

  const jobId = generateJobId();
  const payload = JSON.stringify({
    jobId,
    scenarioId,
    iso2: iso2 || null,
    enqueuedAt: Date.now(),
  });

  // Upstash RPUSH returns the new list length; helper returns [] on transport
  // failure. Either no entry or a non-numeric result means the enqueue never
  // landed — surface as 502 so the caller retries.
  const [pushEntry] = await runRedisPipeline([['RPUSH', QUEUE_KEY, payload]], true);
  if (!pushEntry || typeof pushEntry.result !== 'number') {
    throw new ApiError(502, 'Failed to enqueue scenario job', '');
  }

  // statusUrl is a server-computed convenience URL preserved from the legacy
  // /api/scenario/v1/run contract so external callers can keep polling via the
  // response body rather than hardcoding the status path. See the proto comment
  // on RunScenarioResponse for why this matters on a v1 → v1 migration.
  return {
    jobId,
    status: 'pending',
    statusUrl: `/api/scenario/v1/get-scenario-status?jobId=${encodeURIComponent(jobId)}`,
  };
}
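The backpressure decision above can be isolated into a pure predicate (the helper name is hypothetical): the LLEN result comes back through the pipeline helper, a non-numeric result falls back to depth 0 (fail-open, as in the handler), and only a depth strictly above MAX_QUEUE_DEPTH is rejected with 429.

```typescript
// Pure sketch of runScenario's queue-depth check; constants mirror the diff.
const MAX_QUEUE_DEPTH = 100;

function shouldRejectEnqueue(llenResult: unknown): boolean {
  // Non-numeric pipeline result is treated as an empty queue (fail-open).
  const depth = typeof llenResult === 'number' ? llenResult : 0;
  return depth > MAX_QUEUE_DEPTH;
}
```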
11
server/worldmonitor/shipping/v2/handler.ts
Normal file
@@ -0,0 +1,11 @@
import type { ShippingV2ServiceHandler } from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';

import { routeIntelligence } from './route-intelligence';
import { registerWebhook } from './register-webhook';
import { listWebhooks } from './list-webhooks';

export const shippingV2Handler: ShippingV2ServiceHandler = {
  routeIntelligence,
  registerWebhook,
  listWebhooks,
};
70
server/worldmonitor/shipping/v2/list-webhooks.ts
Normal file
@@ -0,0 +1,70 @@
import type {
  ServerContext,
  ListWebhooksRequest,
  ListWebhooksResponse,
  WebhookSummary,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import { ApiError } from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';

// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../../api/_api-key.js';
import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import {
  webhookKey,
  ownerIndexKey,
  callerFingerprint,
  type WebhookRecord,
} from './webhook-shared';

export async function listWebhooks(
  ctx: ServerContext,
  _req: ListWebhooksRequest,
): Promise<ListWebhooksResponse> {
  // Without forceKey, Clerk-authenticated pro callers reach this handler with
  // no API key, callerFingerprint() returns the 'anon' fallback, and the
  // ownerTag !== ownerHash defense-in-depth below collapses because both
  // sides equal 'anon' — exposing every 'anon'-bucket tenant's webhooks to
  // every Clerk-session holder. See registerWebhook for full rationale.
  const apiKeyResult = validateApiKey(ctx.request, { forceKey: true }) as {
    valid: boolean; required: boolean; error?: string;
  };
  if (apiKeyResult.required && !apiKeyResult.valid) {
    throw new ApiError(401, apiKeyResult.error ?? 'API key required', '');
  }

  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) {
    throw new ApiError(403, 'PRO subscription required', '');
  }

  const ownerHash = await callerFingerprint(ctx.request);
  const smembersResult = await runRedisPipeline([['SMEMBERS', ownerIndexKey(ownerHash)]]);
  const memberIds = (smembersResult[0]?.result as string[] | null) ?? [];

  if (memberIds.length === 0) {
    return { webhooks: [] };
  }

  const getResults = await runRedisPipeline(memberIds.map(id => ['GET', webhookKey(id)]));
  const webhooks: WebhookSummary[] = [];
  for (const r of getResults) {
    if (!r.result || typeof r.result !== 'string') continue;
    try {
      const record = JSON.parse(r.result) as WebhookRecord;
      if (record.ownerTag !== ownerHash) continue;
      webhooks.push({
        subscriberId: record.subscriberId,
        callbackUrl: record.callbackUrl,
        chokepointIds: record.chokepointIds,
        alertThreshold: record.alertThreshold,
        createdAt: record.createdAt,
        active: record.active,
      });
    } catch {
      // skip malformed
    }
  }

  return { webhooks };
}
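The defense-in-depth filter in `listWebhooks` can be distilled to a pure function (names here are hypothetical): even if the owner index set contains a stale or foreign subscriber id, a record is only returned when its stored `ownerTag` matches the caller's fingerprint hash.

```typescript
// Sketch of the ownerTag cross-check that backs up the owner index set.
interface OwnedRecord { subscriberId: string; ownerTag: string }

function ownedOnly(records: OwnedRecord[], ownerHash: string): OwnedRecord[] {
  return records.filter(r => r.ownerTag === ownerHash);
}
```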
100
server/worldmonitor/shipping/v2/register-webhook.ts
Normal file
@@ -0,0 +1,100 @@
import type {
  ServerContext,
  RegisterWebhookRequest,
  RegisterWebhookResponse,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import {
  ApiError,
  ValidationError,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';

// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../../api/_api-key.js';
import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import {
  WEBHOOK_TTL,
  VALID_CHOKEPOINT_IDS,
  isBlockedCallbackUrl,
  generateSecret,
  generateSubscriberId,
  webhookKey,
  ownerIndexKey,
  callerFingerprint,
  type WebhookRecord,
} from './webhook-shared';

export async function registerWebhook(
  ctx: ServerContext,
  req: RegisterWebhookRequest,
): Promise<RegisterWebhookResponse> {
  // Webhooks are per-tenant keyed on callerFingerprint(), which hashes the
  // API key. Without forceKey, a Clerk-authenticated pro caller reaches this
  // handler with no API key, callerFingerprint() falls back to 'anon', and
  // every such caller collapses into a shared 'anon' owner bucket — letting
  // one Clerk-session holder enumerate/overwrite other tenants' webhooks.
  // Matches the legacy `api/v2/shipping/webhooks/[subscriberId]{,/[action]}.ts`
  // gate and the documented "X-WorldMonitor-Key required" contract in
  // docs/api-shipping-v2.mdx.
  const apiKeyResult = validateApiKey(ctx.request, { forceKey: true }) as {
    valid: boolean; required: boolean; error?: string;
  };
  if (apiKeyResult.required && !apiKeyResult.valid) {
    throw new ApiError(401, apiKeyResult.error ?? 'API key required', '');
  }

  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) {
    throw new ApiError(403, 'PRO subscription required', '');
  }

  const callbackUrl = (req.callbackUrl ?? '').trim();
  if (!callbackUrl) {
    throw new ValidationError([{ field: 'callbackUrl', description: 'callbackUrl is required' }]);
  }

  const ssrfError = isBlockedCallbackUrl(callbackUrl);
  if (ssrfError) {
    throw new ValidationError([{ field: 'callbackUrl', description: ssrfError }]);
  }

  const chokepointIds = Array.isArray(req.chokepointIds) ? req.chokepointIds : [];
  const invalidCp = chokepointIds.find(id => !VALID_CHOKEPOINT_IDS.has(id));
  if (invalidCp) {
    throw new ValidationError([
      { field: 'chokepointIds', description: `Unknown chokepoint ID: ${invalidCp}` },
    ]);
  }

  // Proto default int32 is 0 — treat 0 as "unset" to preserve the legacy
  // default of 50 when the caller omits alertThreshold.
  const alertThreshold = req.alertThreshold > 0 ? req.alertThreshold : 50;
  if (alertThreshold < 0 || alertThreshold > 100) {
    throw new ValidationError([
      { field: 'alertThreshold', description: 'alertThreshold must be a number between 0 and 100' },
    ]);
  }

  const ownerTag = await callerFingerprint(ctx.request);
  const newSubscriberId = generateSubscriberId();
  const secret = await generateSecret();

  const record: WebhookRecord = {
    subscriberId: newSubscriberId,
    ownerTag,
    callbackUrl,
    chokepointIds: chokepointIds.length ? chokepointIds : [...VALID_CHOKEPOINT_IDS],
    alertThreshold,
    createdAt: new Date().toISOString(),
    active: true,
    secret,
  };

  await runRedisPipeline([
    ['SET', webhookKey(newSubscriberId), JSON.stringify(record), 'EX', String(WEBHOOK_TTL)],
    ['SADD', ownerIndexKey(ownerTag), newSubscriberId],
    ['EXPIRE', ownerIndexKey(ownerTag), String(WEBHOOK_TTL)],
  ]);

  return { subscriberId: newSubscriberId, secret };
}
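The proto-default handling for `alertThreshold` can be sketched as a small resolver (the helper name is hypothetical): int32 defaults to 0 on the wire, so 0 is treated as "unset" and mapped to the legacy default of 50 before the range check runs.

```typescript
// Sketch of registerWebhook's alertThreshold coercion; 0 and negatives both
// map to the legacy default, since proto3 cannot distinguish 0 from omitted.
const DEFAULT_ALERT_THRESHOLD = 50;

function resolveAlertThreshold(wireValue: number): number {
  return wireValue > 0 ? wireValue : DEFAULT_ALERT_THRESHOLD;
}
```

One consequence worth noting: a caller cannot explicitly request a threshold of 0, because it is indistinguishable from unset on the wire.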
116
server/worldmonitor/shipping/v2/route-intelligence.ts
Normal file
@@ -0,0 +1,116 @@
import type {
  ServerContext,
  RouteIntelligenceRequest,
  RouteIntelligenceResponse,
  ChokepointExposure,
  BypassOption,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import {
  ApiError,
  ValidationError,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';

import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';
import { CHOKEPOINT_STATUS_KEY } from '../../../_shared/cache-keys';
import { BYPASS_CORRIDORS_BY_CHOKEPOINT, type CargoType } from '../../../_shared/bypass-corridors';
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';
import COUNTRY_PORT_CLUSTERS from '../../../../scripts/shared/country-port-clusters.json';

interface PortClusterEntry {
  nearestRouteIds: string[];
  coastSide: string;
}

interface ChokepointStatusEntry {
  id: string;
  name?: string;
  disruptionScore?: number;
  warRiskTier?: string;
}

interface ChokepointStatusResponse {
  chokepoints?: ChokepointStatusEntry[];
}

const VALID_CARGO_TYPES = new Set(['container', 'tanker', 'bulk', 'roro']);

export async function routeIntelligence(
  ctx: ServerContext,
  req: RouteIntelligenceRequest,
): Promise<RouteIntelligenceResponse> {
  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) {
    throw new ApiError(403, 'PRO subscription required', '');
  }

  const fromIso2 = (req.fromIso2 ?? '').trim().toUpperCase();
  const toIso2 = (req.toIso2 ?? '').trim().toUpperCase();
  if (!/^[A-Z]{2}$/.test(fromIso2) || !/^[A-Z]{2}$/.test(toIso2)) {
    throw new ValidationError([
      { field: 'fromIso2', description: 'fromIso2 and toIso2 must be valid 2-letter ISO country codes' },
    ]);
  }

  const cargoTypeRaw = (req.cargoType ?? '').trim().toLowerCase();
  const cargoType: CargoType = (VALID_CARGO_TYPES.has(cargoTypeRaw) ? cargoTypeRaw : 'container') as CargoType;
  const hs2 = (req.hs2 ?? '').trim().replace(/\D/g, '') || '27';

  const clusters = COUNTRY_PORT_CLUSTERS as unknown as Record<string, PortClusterEntry>;
  const fromCluster = clusters[fromIso2];
  const toCluster = clusters[toIso2];

  const fromRoutes = new Set(fromCluster?.nearestRouteIds ?? []);
  const toRoutes = new Set(toCluster?.nearestRouteIds ?? []);
  const sharedRoutes = [...fromRoutes].filter(r => toRoutes.has(r));
  const primaryRouteId = sharedRoutes[0] ?? fromCluster?.nearestRouteIds[0] ?? '';

  const statusRaw = (await getCachedJson(CHOKEPOINT_STATUS_KEY).catch(() => null)) as ChokepointStatusResponse | null;
  const statusMap = new Map<string, ChokepointStatusEntry>(
    (statusRaw?.chokepoints ?? []).map(cp => [cp.id, cp]),
  );

  const relevantRouteSet = new Set(sharedRoutes.length ? sharedRoutes : (fromCluster?.nearestRouteIds ?? []));
  const chokepointExposures: ChokepointExposure[] = CHOKEPOINT_REGISTRY
    .filter(cp => cp.routeIds.some(r => relevantRouteSet.has(r)))
    .map(cp => {
      const overlap = cp.routeIds.filter(r => relevantRouteSet.has(r)).length;
      const exposurePct = Math.round((overlap / Math.max(cp.routeIds.length, 1)) * 100);
      return { chokepointId: cp.id, chokepointName: cp.displayName, exposurePct };
    })
    .filter(e => e.exposurePct > 0)
    .sort((a, b) => b.exposurePct - a.exposurePct);

  const primaryChokepoint = chokepointExposures[0];
  const primaryCpStatus = primaryChokepoint ? statusMap.get(primaryChokepoint.chokepointId) : null;

  const disruptionScore = primaryCpStatus?.disruptionScore ?? 0;
  const warRiskTier = primaryCpStatus?.warRiskTier ?? 'WAR_RISK_TIER_NORMAL';

  const bypassOptions: BypassOption[] = primaryChokepoint
    ? (BYPASS_CORRIDORS_BY_CHOKEPOINT[primaryChokepoint.chokepointId] ?? [])
      .filter(c => c.suitableCargoTypes.length === 0 || c.suitableCargoTypes.includes(cargoType))
      .slice(0, 5)
      .map(c => ({
        id: c.id,
        name: c.name,
        type: c.type,
        addedTransitDays: c.addedTransitDays,
        addedCostMultiplier: c.addedCostMultiplier,
        activationThreshold: c.activationThreshold,
      }))
    : [];

  return {
    fromIso2,
    toIso2,
    cargoType,
    hs2,
    primaryRouteId,
    chokepointExposures,
    bypassOptions,
    warRiskTier,
    disruptionScore,
    fetchedAt: new Date().toISOString(),
  };
}
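The exposure calculation above can be isolated into a standalone helper (the function name is hypothetical): the percentage of a chokepoint's routes that overlap the relevant route set, guarded against a zero-length route list.

```typescript
// Sketch of routeIntelligence's per-chokepoint exposure math.
function exposurePct(routeIds: string[], relevantRoutes: Set<string>): number {
  const overlap = routeIds.filter(r => relevantRoutes.has(r)).length;
  // Math.max guards against division by zero for an empty routeIds list.
  return Math.round((overlap / Math.max(routeIds.length, 1)) * 100);
}
```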
102
server/worldmonitor/shipping/v2/webhook-shared.ts
Normal file
@@ -0,0 +1,102 @@
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';

export const WEBHOOK_TTL = 86400 * 30; // 30 days
export const VALID_CHOKEPOINT_IDS = new Set(CHOKEPOINT_REGISTRY.map(c => c.id));

// Private IP ranges + known cloud metadata hostnames blocked at registration.
// DNS rebinding is not mitigated here (no DNS resolution in edge runtime); the
// delivery worker must re-resolve and re-check before sending.
export const PRIVATE_HOSTNAME_PATTERNS = [
  /^localhost$/i,
  /^127\.\d+\.\d+\.\d+$/,
  /^10\.\d+\.\d+\.\d+$/,
  /^192\.168\.\d+\.\d+$/,
  /^172\.(1[6-9]|2\d|3[01])\.\d+\.\d+$/,
  /^169\.254\.\d+\.\d+$/,
  /^fd[0-9a-f]{2}:/i,
  /^fe80:/i,
  /^::1$/,
  /^0\.0\.0\.0$/,
  /^0\.\d+\.\d+\.\d+$/,
  /^100\.(6[4-9]|[7-9]\d|1[01]\d|12[0-7])\.\d+\.\d+$/,
];

export const BLOCKED_METADATA_HOSTNAMES = new Set([
  '169.254.169.254',
  'metadata.google.internal',
  'metadata.internal',
  'instance-data',
  'metadata',
  'computemetadata',
  'link-local.s3.amazonaws.com',
]);

export function isBlockedCallbackUrl(rawUrl: string): string | null {
  let parsed: URL;
  try {
    parsed = new URL(rawUrl);
  } catch {
    return 'callbackUrl is not a valid URL';
  }

  if (parsed.protocol !== 'https:') {
    return 'callbackUrl must use https';
  }

  const hostname = parsed.hostname.toLowerCase();

  if (BLOCKED_METADATA_HOSTNAMES.has(hostname)) {
    return 'callbackUrl hostname is a blocked metadata endpoint';
  }

  for (const pattern of PRIVATE_HOSTNAME_PATTERNS) {
    if (pattern.test(hostname)) {
      return `callbackUrl resolves to a private/reserved address: ${hostname}`;
    }
  }

  return null;
}

export async function generateSecret(): Promise<string> {
  const bytes = new Uint8Array(32);
  crypto.getRandomValues(bytes);
  return [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}

export function generateSubscriberId(): string {
  const bytes = new Uint8Array(12);
  crypto.getRandomValues(bytes);
  return 'wh_' + [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}

export function webhookKey(subscriberId: string): string {
  return `webhook:sub:${subscriberId}:v1`;
}

export function ownerIndexKey(ownerHash: string): string {
  return `webhook:owner:${ownerHash}:v1`;
}

/** SHA-256 hash of the caller's API key — used as ownerTag and owner index key. Never secret. */
export async function callerFingerprint(req: Request): Promise<string> {
  const key =
    req.headers.get('X-WorldMonitor-Key') ??
    req.headers.get('X-Api-Key') ??
    '';
  if (!key) return 'anon';
  const encoded = new TextEncoder().encode(key);
  const hashBuffer = await crypto.subtle.digest('SHA-256', encoded);
  return Array.from(new Uint8Array(hashBuffer)).map(b => b.toString(16).padStart(2, '0')).join('');
}

export interface WebhookRecord {
  subscriberId: string;
  ownerTag: string;
  callbackUrl: string;
  chokepointIds: string[];
  alertThreshold: number;
  createdAt: string;
  active: boolean;
  secret: string;
}
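A minimal usage sketch of the registration-time SSRF guard. It re-declares a trimmed pattern list so it runs standalone; the full list in webhook-shared.ts also covers the other RFC 1918 ranges, CGNAT space, and IPv6 link-local/ULA prefixes.

```typescript
// Trimmed, illustrative subset of PRIVATE_HOSTNAME_PATTERNS.
const SAMPLE_PRIVATE_PATTERNS = [/^localhost$/i, /^127\.\d+\.\d+\.\d+$/, /^169\.254\.\d+\.\d+$/];

function blockedReason(rawUrl: string): string | null {
  let parsed: URL;
  try {
    parsed = new URL(rawUrl);
  } catch {
    return 'invalid URL';
  }
  if (parsed.protocol !== 'https:') return 'must use https';
  const hostname = parsed.hostname.toLowerCase();
  for (const pattern of SAMPLE_PRIVATE_PATTERNS) {
    if (pattern.test(hostname)) return `private/reserved address: ${hostname}`;
  }
  return null; // null means the URL is allowed
}
```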
@@ -257,7 +257,7 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {

   const [navResult, vesselResult, transitSummariesData, flowsData] = await Promise.all([
     listNavigationalWarnings(ctx, { area: '', pageSize: 0, cursor: '' }).catch((): ListNavigationalWarningsResponse => { navFailed = true; return { warnings: [], pagination: undefined }; }),
-    getVesselSnapshot(ctx, { neLat: 90, neLon: 180, swLat: -90, swLon: -180 }).catch((): GetVesselSnapshotResponse => { vesselFailed = true; return { snapshot: undefined }; }),
+    getVesselSnapshot(ctx, { neLat: 90, neLon: 180, swLat: -90, swLon: -180, includeCandidates: false }).catch((): GetVesselSnapshotResponse => { vesselFailed = true; return { snapshot: undefined }; }),
     getCachedJson(TRANSIT_SUMMARIES_KEY, true).catch(() => null) as Promise<TransitSummariesPayload | null>,
     getCachedJson(FLOWS_KEY, true).catch(() => null) as Promise<Record<string, FlowEstimateEntry> | null>,
   ]);
47
server/worldmonitor/supply-chain/v1/get-country-products.ts
Normal file
@@ -0,0 +1,47 @@
import type {
  ServerContext,
  GetCountryProductsRequest,
  GetCountryProductsResponse,
  CountryProduct,
} from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { ValidationError } from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';

import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';

interface BilateralHs4Payload {
  iso2: string;
  products?: CountryProduct[];
  fetchedAt?: string;
}

export async function getCountryProducts(
  ctx: ServerContext,
  req: GetCountryProductsRequest,
): Promise<GetCountryProductsResponse> {
  const iso2 = (req.iso2 ?? '').trim().toUpperCase();

  // Input-shape errors return 400 — restoring the legacy /api/supply-chain/v1/
  // country-products contract which predated the sebuf migration. Empty-payload-200
  // is reserved for the PRO-gate deny path (intentional contract shift), not for
  // caller bugs (malformed/missing fields). Distinguishing the two matters for
  // logging, external API consumers, and silent-failure detection.
  if (!/^[A-Z]{2}$/.test(iso2)) {
    throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase ISO country code' }]);
  }

  const isPro = await isCallerPremium(ctx.request);
  const empty: GetCountryProductsResponse = { iso2, products: [], fetchedAt: '' };
  if (!isPro) return empty;

  // Seeder writes via raw key (no env-prefix) — match it on read.
  const key = `comtrade:bilateral-hs4:${iso2}:v1`;
  const payload = await getCachedJson(key, true).catch(() => null) as BilateralHs4Payload | null;
  if (!payload) return empty;

  return {
    iso2,
    products: Array.isArray(payload.products) ? payload.products : [],
    fetchedAt: payload.fetchedAt ?? '',
  };
}
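The 400-vs-empty-200 contract split described in the comment above can be written as a hypothetical decision table: malformed input throws (mapped to HTTP 400), a well-formed request from a non-PRO caller gets the empty 200 payload, and everything else proceeds to the cache read.

```typescript
// Illustrative gate only; the real handler throws ValidationError for the
// first case and returns the empty payload for the second.
type GateOutcome = 'validation-error' | 'empty-200' | 'proceed';

function countryProductsGate(iso2: string, isPro: boolean): GateOutcome {
  if (!/^[A-Z]{2}$/.test(iso2)) return 'validation-error';
  if (!isPro) return 'empty-200';
  return 'proceed';
}
```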
@@ -0,0 +1,129 @@
import type {
  ServerContext,
  GetMultiSectorCostShockRequest,
  GetMultiSectorCostShockResponse,
  ChokepointInfo,
  MultiSectorCostShock,
  WarRiskTier,
} from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { ValidationError } from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';

import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';
import { CHOKEPOINT_STATUS_KEY } from '../../../_shared/cache-keys';
import {
  aggregateAnnualImportsByHs2,
  clampClosureDays,
  computeMultiSectorShocks,
  MULTI_SECTOR_HS2_LABELS,
  SEEDED_HS2_CODES,
  type SeededProduct,
} from './_multi-sector-shock';

interface CountryProductsCache {
  iso2: string;
  products?: SeededProduct[];
  fetchedAt?: string;
}

function emptySectorSkeleton(closureDays: number): MultiSectorCostShock[] {
  return SEEDED_HS2_CODES.map(hs2 => ({
    hs2,
    hs2Label: MULTI_SECTOR_HS2_LABELS[hs2] ?? `HS ${hs2}`,
    importValueAnnual: 0,
    freightAddedPctPerTon: 0,
    warRiskPremiumBps: 0,
    addedTransitDays: 0,
    totalCostShockPerDay: 0,
    totalCostShock30Days: 0,
    totalCostShock90Days: 0,
    totalCostShock: 0,
    closureDays,
  }));
}

function emptyResponse(
  iso2: string,
  chokepointId: string,
  closureDays: number,
  warRiskTier: WarRiskTier = 'WAR_RISK_TIER_UNSPECIFIED',
  unavailableReason = '',
  sectors: MultiSectorCostShock[] = [],
): GetMultiSectorCostShockResponse {
  return {
    iso2,
    chokepointId,
    closureDays,
    warRiskTier,
    sectors,
    totalAddedCost: 0,
    fetchedAt: new Date().toISOString(),
    unavailableReason,
  };
}

export async function getMultiSectorCostShock(
  ctx: ServerContext,
  req: GetMultiSectorCostShockRequest,
): Promise<GetMultiSectorCostShockResponse> {
  const iso2 = (req.iso2 ?? '').trim().toUpperCase();
  const chokepointId = (req.chokepointId ?? '').trim().toLowerCase();
  const closureDays = clampClosureDays(req.closureDays ?? 30);

  // Input-shape errors return 400 — restoring the legacy /api/supply-chain/v1/
  // multi-sector-cost-shock contract. Empty-payload-200 is reserved for the
  // PRO-gate deny path (intentional contract shift), not for caller bugs
  // (malformed or missing fields). Distinguishing the two matters for external
  // API consumers, tests, and silent-failure detection in logs.
  if (!/^[A-Z]{2}$/.test(iso2)) {
    throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase ISO country code' }]);
  }
  if (!chokepointId) {
    throw new ValidationError([{ field: 'chokepointId', description: 'chokepointId is required' }]);
  }
  if (!CHOKEPOINT_REGISTRY.some(c => c.id === chokepointId)) {
    throw new ValidationError([{ field: 'chokepointId', description: `Unknown chokepointId: ${chokepointId}` }]);
  }

  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) return emptyResponse(iso2, chokepointId, closureDays);

  // Seeder writes the products payload via raw key (no env-prefix) — read raw.
  const productsKey = `comtrade:bilateral-hs4:${iso2}:v1`;
  const [productsCache, statusCache] = await Promise.all([
    getCachedJson(productsKey, true).catch(() => null) as Promise<CountryProductsCache | null>,
    getCachedJson(CHOKEPOINT_STATUS_KEY).catch(() => null) as Promise<{ chokepoints?: ChokepointInfo[] } | null>,
  ]);

  const products = Array.isArray(productsCache?.products) ? productsCache.products : [];
  const importsByHs2 = aggregateAnnualImportsByHs2(products);
  const hasAnyImports = Object.values(importsByHs2).some(v => v > 0);
  const warRiskTier = (statusCache?.chokepoints?.find(c => c.id === chokepointId)?.warRiskTier
    ?? 'WAR_RISK_TIER_NORMAL') as WarRiskTier;

  if (!hasAnyImports) {
    return emptyResponse(
      iso2,
      chokepointId,
      closureDays,
      warRiskTier,
      'No seeded import data available for this country',
      emptySectorSkeleton(closureDays),
    );
  }

  const sectors = computeMultiSectorShocks(importsByHs2, chokepointId, warRiskTier, closureDays);
  const totalAddedCost = sectors.reduce((sum, s) => sum + s.totalCostShock, 0);

  return {
    iso2,
    chokepointId,
    closureDays,
    warRiskTier,
    sectors,
    totalAddedCost,
    fetchedAt: new Date().toISOString(),
    unavailableReason: '',
  };
}
|
||||
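The 400-vs-empty-200 split enforced by the handler can be shown in isolation. Below is a minimal, hypothetical mirror of the input-normalization contract; `ValidationError` here is a simplified stand-in for the project's own class, which the sebuf gateway is assumed to map to an HTTP 400 response, and the function name is illustrative:

```typescript
// Hypothetical stand-in for the project's ValidationError class.
// Carries the field-level issues that a gateway would serialize into a 400 body.
class ValidationError extends Error {
  constructor(public issues: { field: string; description: string }[]) {
    super(issues.map(i => `${i.field}: ${i.description}`).join('; '));
  }
}

// Mirrors the normalization in getMultiSectorCostShock: trim, uppercase,
// then reject anything that is not a 2-letter ISO country code.
function normalizeIso2(raw: string | undefined): string {
  const iso2 = (raw ?? '').trim().toUpperCase();
  if (!/^[A-Z]{2}$/.test(iso2)) {
    throw new ValidationError([
      { field: 'iso2', description: 'iso2 must be a 2-letter uppercase ISO country code' },
    ]);
  }
  return iso2;
}

console.log(normalizeIso2(' sg ')); // → "SG"
```

The key design point the comment in the handler calls out: a throw (caller bug, 400) and an empty payload (intentional PRO-gate deny, 200) are distinguishable both to API consumers and in logs.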
@@ -8,6 +8,8 @@ import { getShippingStress } from './get-shipping-stress';
 import { getCountryChokepointIndex } from './get-country-chokepoint-index';
 import { getBypassOptions } from './get-bypass-options';
 import { getCountryCostShock } from './get-country-cost-shock';
+import { getCountryProducts } from './get-country-products';
+import { getMultiSectorCostShock } from './get-multi-sector-cost-shock';
 import { getSectorDependency } from './get-sector-dependency';
 import { getRouteExplorerLane } from './get-route-explorer-lane';
 import { getRouteImpact } from './get-route-impact';
@@ -21,6 +23,8 @@ export const supplyChainHandler: SupplyChainServiceHandler = {
   getCountryChokepointIndex,
   getBypassOptions,
   getCountryCostShock,
+  getCountryProducts,
+  getMultiSectorCostShock,
   getSectorDependency,
   getRouteExplorerLane,
   getRouteImpact,
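The hunk above registers each RPC implementation on a handler object whose shape is pinned by the generated `SupplyChainServiceHandler` interface, so forgetting to wire a new RPC fails type-checking rather than surfacing at runtime. A simplified sketch of that pattern, with all names below being illustrative stand-ins rather than the generated sebuf types:

```typescript
// Stand-in for a generated service-handler interface: one typed method per RPC.
interface DemoServiceHandler {
  getMultiSectorCostShock: (req: { iso2: string }) => Promise<{ iso2: string }>;
}

// Wiring an implementation into the handler map; omitting the property
// (or changing its signature) would be a compile-time error.
const demoHandler: DemoServiceHandler = {
  getMultiSectorCostShock: async (req) => ({ iso2: req.iso2.toUpperCase() }),
};

demoHandler
  .getMultiSectorCostShock({ iso2: 'sg' })
  .then(res => console.log(res.iso2)); // → "SG"
```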