chore(api): enforce sebuf contract + migrate drifting endpoints (#3207) (#3242)

* chore(api): enforce sebuf contract via exceptions manifest (#3207)

Adds api/api-route-exceptions.json as the single source of truth for
non-proto /api/ endpoints, with scripts/enforce-sebuf-api-contract.mjs
gating every PR via npm run lint:api-contract. Fixes the root-only blind
spot in the prior allowlist (tests/edge-functions.test.mjs), which only
scanned top-level *.js files and missed nested paths and .ts endpoints —
the gap that let api/supply-chain/v1/country-products.ts and friends
drift under proto domain URL prefixes unchallenged.

Checks both directions: every api/<domain>/v<N>/[rpc].ts must pair with
a generated service_server.ts (so a deleted proto fails CI), and every
generated service must have an HTTP gateway (no orphaned generated code).

Manifest entries require category + reason + owner, with removal_issue
mandatory for temporary categories (deferred, migration-pending) and
forbidden for permanent ones. .github/CODEOWNERS pins the manifest to
@SebastienMelki so new exceptions don't slip through review.

The manifest only shrinks: migration-pending entries (19 today) will be
removed as subsequent commits in this PR land each migration.
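
A sketch of the category rules above as a validator, in TypeScript for illustration; field and category names come from this description, but the exact manifest schema is an assumption:

```typescript
// Hypothetical shape of an api-route-exceptions.json entry. Field names
// (category, reason, owner, removal_issue) are from the text; the schema
// itself and the "permanent" label are assumptions.
type ExceptionCategory = "deferred" | "migration-pending" | "internal-helper" | "permanent";

interface RouteException {
  path: string;
  category: ExceptionCategory;
  reason: string;
  owner: string;
  removal_issue?: string; // required for temporary categories, forbidden otherwise
}

function validateEntry(e: RouteException): string[] {
  const temporary = e.category === "deferred" || e.category === "migration-pending";
  const errors: string[] = [];
  if (temporary && !e.removal_issue) {
    errors.push(`${e.path}: temporary category '${e.category}' requires removal_issue`);
  }
  if (!temporary && e.removal_issue) {
    errors.push(`${e.path}: removal_issue forbidden for '${e.category}'`);
  }
  return errors;
}
```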

* refactor(maritime): migrate /api/ais-snapshot → maritime/v1.GetVesselSnapshot (#3207)

The proto VesselSnapshot was carrying density + disruptions but the frontend
also needed sequence, relay status, and candidate_reports to drive the
position-callback system. Those only lived on the raw relay passthrough, so
the client had to keep hitting /api/ais-snapshot whenever callbacks were
registered and fall back to the proto RPC only when the relay URL was gone.

This commit pushes all three missing fields through the proto contract and
collapses the dual-fetch-path into one proto client call.

Proto changes (proto/worldmonitor/maritime/v1/):
  - VesselSnapshot gains sequence, status, candidate_reports.
  - GetVesselSnapshotRequest gains include_candidates (query: include_candidates).

Handler (server/worldmonitor/maritime/v1/get-vessel-snapshot.ts):
  - Forwards include_candidates to ?candidates=... on the relay.
  - Separate 5-min in-memory caches for the candidates=on and candidates=off
    variants; they have very different payload sizes and should not share a slot.
  - Per-request in-flight dedup preserved per-variant.
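
The per-variant cache plus in-flight dedup can be sketched roughly like this (types and names are illustrative, not the handler's actual code):

```typescript
// Minimal sketch of per-variant caching with in-flight dedup and
// stale-serve. Shapes and names are assumptions, not the real handler.
type Snapshot = { vessels: number };

interface CacheSlot {
  snapshot: Snapshot | null;
  fetchedAt: number;
  inFlight: Promise<Snapshot> | null;
}

const SNAPSHOT_TTL_MS = 5 * 60 * 1000;
const slots = new Map<string, CacheSlot>(); // one slot per candidates variant

function getSlot(includeCandidates: boolean): CacheSlot {
  const key = includeCandidates ? "candidates-on" : "candidates-off";
  let slot = slots.get(key);
  if (!slot) {
    slot = { snapshot: null, fetchedAt: 0, inFlight: null };
    slots.set(key, slot);
  }
  return slot;
}

function getSnapshot(
  includeCandidates: boolean,
  fetchRelay: () => Promise<Snapshot>,
  now = Date.now(),
): Promise<Snapshot> {
  const slot = getSlot(includeCandidates);
  // Cache read before relay call.
  if (slot.snapshot && now - slot.fetchedAt < SNAPSHOT_TTL_MS) {
    return Promise.resolve(slot.snapshot);
  }
  // Per-variant in-flight dedup: concurrent callers share one relay request.
  if (slot.inFlight) return slot.inFlight;
  slot.inFlight = fetchRelay()
    .then((snap) => {
      slot.snapshot = snap;
      slot.fetchedAt = Date.now();
      return snap;
    })
    .catch((err) => {
      if (slot.snapshot) return slot.snapshot; // stale-serve on relay failure
      throw err;
    })
    .finally(() => {
      slot.inFlight = null;
    });
  return slot.inFlight;
}
```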

Frontend (src/services/maritime/index.ts):
  - fetchSnapshotPayload now calls MaritimeServiceClient.getVesselSnapshot
    directly with includeCandidates threaded through. The raw-relay path,
    SNAPSHOT_PROXY_URL, DIRECT_RAILWAY_SNAPSHOT_URL and LOCAL_SNAPSHOT_FALLBACK
    are gone — production already routed via Vercel, the "direct" branch only
    ever fired on localhost, and the proto gateway covers both.
  - New toLegacyCandidateReport helper mirrors toDensityZone/toDisruptionEvent.

api/ais-snapshot.js deleted; manifest entry removed. Codegen was
deliberately scoped to worldmonitor.maritime.v1 (buf generate --path),
because regenerating the full tree drops // @ts-nocheck from every
client/server file and surfaces pre-existing type errors across 30+
unrelated services, which is not in scope for this PR.

Shape-diff vs legacy payload:
  - disruptions / density: proto carries the same fields, just with the
    GeoCoordinates wrapper and enum strings (remapped client-side via
    existing toDisruptionEvent / toDensityZone helpers).
  - sequence, status.{connected,vessels,messages}: now populated from the
    proto response — was hardcoded to 0/false in the prior proto fallback.
  - candidateReports: same shape; optional numeric fields come through as
    0 instead of undefined, which the legacy consumer already handled.

* refactor(sanctions): migrate /api/sanctions-entity-search → LookupSanctionEntity (#3207)

The proto docstring already claimed "OFAC + OpenSanctions" coverage but the
handler only fuzzy-matched a local OFAC Redis index — narrower than the
legacy /api/sanctions-entity-search, which proxied OpenSanctions live (the
source advertised in docs/api-proxies.mdx). Deleting the legacy without
expanding the handler would have been a silent coverage regression for
external consumers.

Handler changes (server/worldmonitor/sanctions/v1/lookup-entity.ts):
  - Primary path: live search against api.opensanctions.org/search/default
    with an 8s timeout and the same User-Agent the legacy edge fn used.
  - Fallback path: the existing OFAC local fuzzy match, kept intact for when
    OpenSanctions is unreachable / rate-limiting.
  - Response source field flips between 'opensanctions' (happy path) and
    'ofac' (fallback) so clients can tell which index answered.
  - Query validation tightened: rejects q > 200 chars (matches legacy cap).
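
Rough shape of the primary/fallback flow, with all names and signatures assumed (the real handler lives at server/worldmonitor/sanctions/v1/lookup-entity.ts):

```typescript
// Illustrative sketch of OpenSanctions-primary / OFAC-fallback with the
// 8s timeout, 200-char cap, and source flag. Signatures are assumptions.
type LookupResult = { entities: unknown[]; source: "opensanctions" | "ofac" };

async function lookupEntity(
  q: string,
  searchOpenSanctions: (q: string, signal: AbortSignal) => Promise<unknown[]>,
  searchOfacLocal: (q: string) => Promise<unknown[]>,
): Promise<LookupResult> {
  if (q.length === 0 || q.length > 200) throw new Error("q must be 1-200 chars");
  try {
    // Primary: live OpenSanctions search, bounded by an 8s timeout.
    const entities = await searchOpenSanctions(q, AbortSignal.timeout(8000));
    return { entities, source: "opensanctions" };
  } catch {
    // Fallback: local OFAC fuzzy match when OpenSanctions is unreachable.
    return { entities: await searchOfacLocal(q), source: "ofac" };
  }
}
```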

Rate limiting:
  - Added /api/sanctions/v1/lookup-entity to ENDPOINT_RATE_POLICIES at 30/min
    per IP — matches the legacy createIpRateLimiter budget. The gateway
    already enforces per-endpoint policies via checkEndpointRateLimit.

Docs:
  - docs/api-proxies.mdx — dropped the /api/sanctions-entity-search row
    (plus the orphaned /api/ais-snapshot row left over from the previous
    commit in this PR).
  - docs/panels/sanctions-pressure.mdx — points at the new RPC URL and
    describes the OpenSanctions-primary / OFAC-fallback semantics.

api/sanctions-entity-search.js deleted; manifest entry removed.

* refactor(military): migrate /api/military-flights → ListMilitaryFlights (#3207)

Legacy /api/military-flights read a pre-baked Redis blob written by the
seed-military-flights cron and returned flights in a flat app-friendly
shape (lat/lon, lowercase enums, lastSeenMs). The proto RPC takes a bbox,
fetches OpenSky live, classifies server-side, and returns nested
GeoCoordinates + MILITARY_*_TYPE_* enum strings + lastSeenAt — same data,
different contract.

fetchFromRedis in src/services/military-flights.ts had nothing
sebuf-aware about it. Renamed it to fetchViaProto and rewrote it to:

  - Instantiate MilitaryServiceClient against getRpcBaseUrl().
  - Iterate MILITARY_QUERY_REGIONS (PACIFIC + WESTERN) in parallel — same
    regions the desktop OpenSky path and the seed cron already use, so
    dashboard coverage tracks the analytic pipeline.
  - Dedup by hexCode across regions.
  - Map proto → app shape via new mapProtoFlight helper plus three reverse
    enum maps (AIRCRAFT_TYPE_REVERSE, OPERATOR_REVERSE, CONFIDENCE_REVERSE).

The seed cron (scripts/seed-military-flights.mjs) stays put: it feeds
regional-snapshot mobility, cross-source signals, correlation, and the
health freshness check (api/health.js: 'military:flights:v1'). None of
those read the legacy HTTP endpoint; they read the Redis key directly.
The proto handler uses its own per-bbox cache keys under the same prefix,
so dashboard traffic no longer races the seed cron's blob — the two paths
diverge by a small refresh lag, which is acceptable.

Docs: dropped the /api/military-flights row from docs/api-proxies.mdx.

api/military-flights.js deleted; manifest entry removed.

Shape-diff vs legacy:
  - f.location.{latitude,longitude} → f.lat, f.lon
  - f.aircraftType: MILITARY_AIRCRAFT_TYPE_TANKER → 'tanker' via reverse map
  - f.operator: MILITARY_OPERATOR_USAF → 'usaf' via reverse map
  - f.confidence: MILITARY_CONFIDENCE_LOW → 'low' via reverse map
  - f.lastSeenAt (number) → f.lastSeen (Date)
  - f.enrichment → f.enriched (with field renames)
  - Extra fields registration / aircraftModel / origin / destination /
    firstSeenAt now flow through where proto populates them.
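
The shape-diff reads as a mapper along these lines (enum tables truncated, field shapes assumed):

```typescript
// Illustrative reverse enum maps + mapper for the shape-diff above.
// Tables are truncated and the flight shapes are assumptions.
const AIRCRAFT_TYPE_REVERSE: Record<string, string> = {
  MILITARY_AIRCRAFT_TYPE_TANKER: "tanker",
};
const CONFIDENCE_REVERSE: Record<string, string> = {
  MILITARY_CONFIDENCE_LOW: "low",
};

interface ProtoFlight {
  hexCode: string;
  location: { latitude: number; longitude: number };
  aircraftType: string;
  confidence: string;
  lastSeenAt: number; // epoch ms
}

function mapProtoFlightSketch(f: ProtoFlight) {
  return {
    hexCode: f.hexCode,
    lat: f.location.latitude,         // nested GeoCoordinates -> flat lat/lon
    lon: f.location.longitude,
    aircraftType: AIRCRAFT_TYPE_REVERSE[f.aircraftType] ?? "unknown",
    confidence: CONFIDENCE_REVERSE[f.confidence] ?? "unknown",
    lastSeen: new Date(f.lastSeenAt), // number -> Date
  };
}
```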

* fix(supply-chain): thread includeCandidates through chokepoint status (#3207)

Caught by the tsconfig.api.json typecheck in the pre-push hook (not
covered by the plain tsc --noEmit pass done before pushing the
ais-snapshot commit). The chokepoint status handler calls
getVesselSnapshot internally
with a static no-auth request — now required to include the new
includeCandidates bool from the proto extension.

Passing false: server-internal callers don't need per-vessel reports.

* test(maritime): update getVesselSnapshot cache assertions (#3207)

The ais-snapshot migration replaced the single cachedSnapshot/cacheTimestamp
pair with a per-variant cache so candidates-on and candidates-off payloads
don't evict each other. The pre-push hook surfaced that
tests/server-handlers still asserted the old variable names. Rewrote
the assertions to match the new shape while preserving the invariants
they actually guard:

  - Freshness check against slot TTL.
  - Cache read before relay call.
  - Per-slot in-flight dedup.
  - Stale-serve on relay failure (result ?? slot.snapshot).

* chore(proto): restore // @ts-nocheck on regenerated maritime files (#3207)

I ran 'buf generate --path worldmonitor/maritime/v1' to scope the proto
regen to the one service I was changing (to avoid the toolchain drift
that drops @ts-nocheck from 60+ unrelated files — separate issue). But
the repo convention is the 'make generate' target, which runs buf and
then sed-prepends '// @ts-nocheck' to every generated .ts file. My
scoped command skipped the sed step. The proto-check CI enforces the
sed output, so the two maritime files need the directive restored.

* refactor(enrichment): decomm /api/enrichment/{company,signals} legacy edge fns (#3207)

Both endpoints were already ported to IntelligenceService:
  - getCompanyEnrichment  (/api/intelligence/v1/get-company-enrichment)
  - listCompanySignals    (/api/intelligence/v1/list-company-signals)

No frontend callers of the legacy /api/enrichment/* paths exist. Removes:
  - api/enrichment/company.js, signals.js, _domain.js
  - api-route-exceptions.json migration-pending entries (58 remain)
  - docs/api-proxies.mdx rows for /api/enrichment/{company,signals}
  - docs/architecture.mdx reference updated to the IntelligenceService RPCs

Verified: typecheck, typecheck:api, lint:api-contract (89 files / 58 entries),
lint:boundaries, tests/edge-functions.test.mjs (136 pass),
tests/enrichment-caching.test.mjs (14 pass — still guards the intelligence/v1
handlers), make generate is zero-diff.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(leads): migrate /api/{contact,register-interest} → LeadsService (#3207)

New leads/v1 sebuf service with two POST RPCs:
  - SubmitContact    → /api/leads/v1/submit-contact
  - RegisterInterest → /api/leads/v1/register-interest

Handler logic ported 1:1 from api/contact.js + api/register-interest.js:
  - Turnstile verification (desktop sources bypass, preserved)
  - Honeypot (website field) silently accepts without upstream calls
  - Free-email-domain gate on SubmitContact (422 ApiError)
  - validateEmail (disposable/offensive/typo-TLD/MX) on RegisterInterest
  - Convex writes via ConvexHttpClient (contactMessages:submit, registerInterest:register)
  - Resend notification + confirmation emails (HTML templates unchanged)

Shared helpers moved to server/_shared/:
  - turnstile.ts (getClientIp + verifyTurnstile)
  - email-validation.ts (disposable/offensive/MX checks)

Rate limits preserved via ENDPOINT_RATE_POLICIES:
  - submit-contact:    3/hour per IP (was in-memory 3/hr)
  - register-interest: 5/hour per IP (was in-memory 5/hr; desktop
    sources previously capped at 2/hr via shared in-memory map —
    now 5/hr like everyone else, accepting the small regression in
    exchange for Upstash-backed global limiting)

Callers updated:
  - pro-test/src/App.tsx contact form → new submit-contact path
  - src-tauri/sidecar/local-api-server.mjs cloud-fallback rewrites
    /api/register-interest → /api/leads/v1/register-interest when
    proxying; keeps local path for older desktop builds
  - src/services/runtime.ts isKeyFreeApiTarget allows both old and
    new paths through the WORLDMONITOR_API_KEY-optional gate

Tests:
  - tests/contact-handler.test.mjs rewritten to call submitContact
    handler directly; asserts on ValidationError / ApiError
  - tests/email-validation.test.mjs + tests/turnstile.test.mjs
    point at the new server/_shared/ modules

Deleted: api/contact.js, api/register-interest.js, api/_ip-rate-limit.js,
api/_turnstile.js, api/_email-validation.js, api/_turnstile.test.mjs.
Manifest entries removed (58 → 56). Docs updated (api-platform,
api-commerce, usage-rate-limits).

Verified: npm run typecheck + typecheck:api + lint:api-contract
(88 files / 56 entries) + lint:boundaries pass; full test:data
(5852 tests) passes; make generate is zero-diff.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* chore(pro-test): rebuild bundle for leads/v1 contact form (#3207)

Updates the enterprise contact form to POST to /api/leads/v1/submit-contact
(old path /api/contact removed in the previous commit).

Bundle is rebuilt from pro-test/src/App.tsx source change in 9ccd309d.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): address HIGH review findings 1-3 (#3207)

Three review findings from @koala73 on the sebuf-migration PR, all
silent bugs that would have shipped to prod:

### 1. Sanctions rate-limit policy was dead code

ENDPOINT_RATE_POLICIES keyed the 30/min budget under
/api/sanctions/v1/lookup-entity, but the generated route (from the
proto RPC LookupSanctionEntity) is /api/sanctions/v1/lookup-sanction-entity.
hasEndpointRatePolicy / getEndpointRatelimit are exact-string pathname
lookups, so the mismatch meant the endpoint fell through to the
generic 600/min global limiter instead of the advertised 30/min.

Net effect: the live OpenSanctions proxy endpoint (unauthenticated,
external upstream) had 20x the intended rate budget. Fixed by renaming
the policy key to match the generated route.

### 2. Lost stale-seed fallback on military-flights

Legacy api/military-flights.js cascaded military:flights:v1 →
military:flights:stale:v1 before returning empty. The new proto
handler went straight to live OpenSky/relay and returned null on miss.

Relay or OpenSky hiccup used to serve stale seeded data (24h TTL);
under the new handler it showed an empty map. Both keys are still
written by scripts/seed-military-flights.mjs on every run — fix just
reads the stale key when the live fetch returns null, converts the
seed's app-shape flights (flat lat/lon, lowercase enums, lastSeenMs)
to the proto shape (nested GeoCoordinates, enum strings, lastSeenAt),
and filters to the request bbox.

Read via getRawJson (unprefixed) to match the seed cron's writes,
which bypass the env-prefix system.
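
A sketch of the stale-seed conversion and bbox filter, with assumed field names:

```typescript
// Illustrative seed-shape -> proto-shape conversion + bbox filter.
// Field names are assumptions based on the description above.
interface SeedFlight { hexCode: string; lat: number; lon: number; confidence: string; lastSeenMs: number }
interface Bbox { minLat: number; maxLat: number; minLon: number; maxLon: number }

function seedToProtoShape(f: SeedFlight) {
  return {
    hexCode: f.hexCode.toUpperCase(),                 // uppercase invariant
    location: { latitude: f.lat, longitude: f.lon },  // flat -> GeoCoordinates
    confidence: `MILITARY_CONFIDENCE_${f.confidence.toUpperCase()}`,
    lastSeenAt: f.lastSeenMs,                         // lastSeenMs -> lastSeenAt
  };
}

function filterToBbox(flights: ReturnType<typeof seedToProtoShape>[], bbox: Bbox) {
  return flights.filter((f) =>
    f.location.latitude >= bbox.minLat && f.location.latitude <= bbox.maxLat &&
    f.location.longitude >= bbox.minLon && f.location.longitude <= bbox.maxLon);
}
```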

### 3. Hex-code casing mismatch broke getFlightByHex

The seed cron writes hexCode: icao24.toUpperCase() (uppercase);
src/services/military-flights.ts:getFlightByHex uppercases the lookup
input: f.hexCode === hexCode.toUpperCase(). The new proto handler
preserved OpenSky's lowercase icao24, and mapProtoFlight is a
pass-through. getFlightByHex was silently returning undefined for
every call after the migration.

Fix: uppercase in the proto handler (live + stale paths), and document
the invariant in a comment on MilitaryFlight.hex_code in
military_flight.proto so future handlers don't re-break it.

### Verified

- typecheck + typecheck:api clean
- lint:api-contract (56 entries) / lint:boundaries clean
- tests/edge-functions.test.mjs 130 pass
- make generate zero-diff (openapi spec regenerated for proto comment)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): restore desktop 2/hr rate cap on register-interest (#3207)

Addresses HIGH review finding #4 from @koala73. The legacy
api/register-interest.js applied a nested 2/hr per-IP cap when
`source === 'desktop-settings'`, on top of the generic 5/hr endpoint
budget. The sebuf migration lost this — desktop-source requests now
enjoy the full 5/hr cap.

Since `source` is an unsigned client-supplied field, anyone sending
`source: 'desktop-settings'` skips Turnstile AND gets 5/hr. Without
the tighter cap the Turnstile bypass is cheaper to abuse.

Added `checkScopedRateLimit` to `server/_shared/rate-limit.ts` — a
reusable second-stage Upstash limiter keyed on an opaque scope string
+ caller identifier. Fail-open on Redis errors to match existing
checkRateLimit / checkEndpointRateLimit semantics. Handlers that need
per-subscope caps on top of the gateway-level endpoint budget use this
helper.

In register-interest: when `isDesktopSource`, call checkScopedRateLimit
with scope `/api/leads/v1/register-interest#desktop`, limit=2, window=1h,
IP as identifier. On exceeded → throw ApiError(429).
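
The scoped limiter's keying can be sketched with an in-memory stand-in (the real helper is Upstash-backed and fail-open; everything here beyond the checkScopedRateLimit name is an assumption):

```typescript
// In-memory stand-in illustrating the scope + identifier keying of the
// second-stage limiter. The real helper uses Upstash, not a Map.
const windows = new Map<string, { count: number; resetAt: number }>();

function checkScopedRateLimit(
  scope: string,
  identifier: string,
  limit: number,
  windowMs: number,
  now = Date.now(),
): boolean {
  const key = `${scope}:${identifier}`;
  const w = windows.get(key);
  if (!w || now >= w.resetAt) {
    windows.set(key, { count: 1, resetAt: now + windowMs });
    return true;
  }
  w.count += 1;
  return w.count <= limit; // false -> caller throws ApiError(429)
}
```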

### What this does not fix

This caps the blast radius of the Turnstile bypass but does not close
it — an attacker sending `source: 'desktop-settings'` still skips
Turnstile (just at 2/hr instead of 5/hr). The proper fix is a signed
desktop-secret header that authenticates the bypass; filed as
follow-up #3252. That requires coordinated Tauri build + Vercel env
changes out of scope for #3207.

### Verified

- typecheck + typecheck:api clean
- lint:api-contract (56 entries)
- tests/edge-functions.test.mjs + contact-handler.test.mjs (147 pass)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review): MEDIUM + LOW + rate-limit-policy CI check (#3207)

Closes out the remaining @koala73 review findings from #3242 that
didn't already land in the HIGH-fix commits, plus the requested CI
check that would have caught HIGH #1 (dead-code policy key) at
review time.

### MEDIUM #5 — Turnstile missing-secret policy default

Flip `verifyTurnstile`'s default `missingSecretPolicy` from `'allow'`
to `'allow-in-development'`. Dev with no secret = pass (expected
local); prod with no secret = reject + log. submit-contact was
already explicitly overriding to `'allow-in-development'`;
register-interest was silently getting `'allow'`. Safe default now
means a future missing-secret misconfiguration in prod gets caught
instead of silently letting bots through. Removed the now-redundant
override in submit-contact.
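
The policy semantics, sketched (policy names come from the text; the function shape is an assumption):

```typescript
// Illustrative decision table for missingSecretPolicy. The real
// verifyTurnstile signature is assumed, not quoted.
type MissingSecretPolicy = "allow" | "allow-in-development";

function passOnMissingSecret(policy: MissingSecretPolicy, isDevelopment: boolean): boolean {
  if (policy === "allow") return true; // old default: bots pass even in prod
  return isDevelopment;                // new default: dev passes, prod rejects + logs
}
```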

### MEDIUM #6 — Silent enum fallbacks in maritime client

`toDisruptionEvent` mapped `AIS_DISRUPTION_TYPE_UNSPECIFIED` / unknown
enum values → `gap_spike` / `low` silently. Refactored to return null
when either enum is unknown; caller filters nulls out of the array.
Handler doesn't produce UNSPECIFIED today, but the `gap_spike`
default would have mislabeled the first new enum value the proto
ever adds — dropping unknowns is safer than shipping wrong labels.
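
The refactor amounts to returning null on unknown enums and filtering, roughly (maps truncated, shapes assumed):

```typescript
// Sketch of the null-on-unknown-enum pattern. The real map and event
// shapes are richer; this only illustrates the drop-don't-mislabel rule.
const DISRUPTION_TYPE_MAP: Record<string, string> = {
  AIS_DISRUPTION_TYPE_GAP_SPIKE: "gap_spike",
};

function toDisruptionEventSketch(e: { type: string }): { type: string } | null {
  const mapped = DISRUPTION_TYPE_MAP[e.type];
  return mapped ? { type: mapped } : null; // unknown or UNSPECIFIED: drop
}

function mapDisruptions(raw: { type: string }[]) {
  return raw
    .map(toDisruptionEventSketch)
    .filter((e): e is { type: string } => e !== null);
}
```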

### LOW — Copy drift in register-interest email

The email template hardcoded `435+ Sources`; PR #3241 bumped the
marketing copy to `500+`. Updated the rewritten template to match.

The `as any` on Convex mutation names carried over from the legacy
code; filed as follow-up #3253.

### Rate-limit-policy coverage lint

`scripts/enforce-rate-limit-policies.mjs` validates every key in
`ENDPOINT_RATE_POLICIES` resolves to a proto-generated gateway route
by cross-referencing `docs/api/*.openapi.yaml`. Failure output
references the sanctions-entity-search incident so future drift has a
paper trail.

Wired into package.json (`lint:rate-limit-policies`) and the pre-push
hook alongside `lint:boundaries`. Smoke-tested both directions —
clean repo passes (5 policies / 175 routes), seeded drift (the exact
HIGH #1 typo) fails with the advertised remedy text.
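
The core of the cross-reference is a set-difference; a sketch with illustrative names:

```typescript
// Sketch of the policy-coverage check: every ENDPOINT_RATE_POLICIES key
// must resolve to a generated gateway route. Names are illustrative.
function findDeadPolicyKeys(policyKeys: string[], generatedRoutes: Set<string>): string[] {
  return policyKeys.filter((key) => !generatedRoutes.has(key));
}
```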

### Verified
- `lint:rate-limit-policies` ✓
- `typecheck` + `typecheck:api` ✓
- `lint:api-contract` ✓ (56 entries)
- `lint:boundaries` ✓
- edge-functions + contact-handler tests (147 pass)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 5): decomm /api/eia/* + migrate /api/satellites → IntelligenceService (#3207)

Both targets turned out to be decomm-not-migration cases. The original
plan called for two new services (economic/v1.GetEiaSeries +
natural/v1.ListSatellitePositions) but research found neither was
needed:

### /api/eia/[[...path]].js — pure decomm, zero consumers

The "catch-all" is a misnomer — only two paths actually worked,
/api/eia/health and /api/eia/petroleum, both Redis-only readers.
Zero frontend callers in src/. Zero server-side readers. Nothing
consumes the `energy:eia-petroleum:v1` key that seed-eia-petroleum.mjs
writes daily.

The EIA data the frontend actually uses goes through existing typed
RPCs in economic/v1: GetEnergyPrices, GetCrudeInventories,
GetNatGasStorage, GetEnergyCapacity. None of those touch /api/eia/*.

Building GetEiaSeries would have been dead code. Deleted the legacy
file + its test (tests/api-eia-petroleum.test.mjs — it only covered
the legacy endpoint, no behavior to preserve). Empty api/eia/ dir
removed.

**Note for review:** the Redis seed cron keeps running daily and
nothing consumes it. If that stays unused, seed-eia-petroleum.mjs
should be retired too (separate PR). Out of scope for sebuf-migration.

### /api/satellites.js — Learning #2 strikes again

IntelligenceService.ListSatellites already exists at
/api/intelligence/v1/list-satellites, reads the same Redis key
(intelligence:satellites:tle:v1), and supports an optional country
filter the legacy didn't have.

One frontend caller in src/services/satellites.ts needed to switch
from `fetch(toApiUrl('/api/satellites'))` to the typed
IntelligenceServiceClient.listSatellites. Shape diff was tiny —
legacy `noradId` became proto `id` (handler line 36 already picks
either), everything else identical. alt/velocity/inclination in the
proto are ignored by the caller since it propagates positions
client-side via satellite.js.

Kept the client-side cache + failure cooldown + 20s timeout (still
valid concerns at the caller level).

### Manifest + docs
- api-route-exceptions.json: 56 → 54 entries (both removed)
- docs/api-proxies.mdx: dropped the two rows from the Raw-data
  passthroughs table

### Verified
- typecheck + typecheck:api ✓
- lint:api-contract (54 entries) / lint:boundaries / lint:rate-limit-policies ✓
- tests/edge-functions.test.mjs 127 pass (down from 130 — 3 tests were
  for the deleted eia endpoint)
- make generate zero-diff (no proto changes)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 6): migrate /api/supply-chain/v1/{country-products,multi-sector-cost-shock} → SupplyChainService (#3207)

Both endpoints were hand-rolled TS handlers sitting under a proto URL prefix —
the exact drift the manifest guardrail flagged. Promoted both to typed RPCs:

- GetCountryProducts → /api/supply-chain/v1/get-country-products
- GetMultiSectorCostShock → /api/supply-chain/v1/get-multi-sector-cost-shock

Handlers preserve the existing semantics: PRO-gate via isCallerPremium(ctx.request),
iso2 / chokepointId validation, raw bilateral-hs4 Redis read (skip env-prefix to
match seeder writes), CHOKEPOINT_STATUS_KEY for war-risk tier, and the math from
_multi-sector-shock.ts unchanged. Empty-data and non-PRO paths return the typed
empty payload (no 403 — the sebuf gateway pattern is empty-payload-on-deny).

Client wrapper switches from premiumFetch to client.getCountryProducts/
client.getMultiSectorCostShock. Legacy MultiSectorShock / MultiSectorShockResponse /
CountryProductsResponse names remain as type aliases of the generated proto types
so CountryBriefPanel + CountryDeepDivePanel callsites compile with zero churn.

Manifest 54 → 52. Rate-limit gateway routes 175 → 177.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(gateway): add cache-tier entries for new supply-chain RPCs (#3207)

Pre-push tests/route-cache-tier.test.mjs caught the missing entries.
Both PRO-gated, request-varying — match the existing supply-chain PRO cohort
(get-country-cost-shock, get-bypass-options, etc.) at slow-browser tier.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 7): migrate /api/scenario/v1/{run,status,templates} → ScenarioService (#3207)

Promote the three literal-filename scenario endpoints to a typed sebuf
service with three RPCs:

  POST /api/scenario/v1/run-scenario        (RunScenario)
  GET  /api/scenario/v1/get-scenario-status (GetScenarioStatus)
  GET  /api/scenario/v1/list-scenario-templates (ListScenarioTemplates)

Preserves all security invariants from the legacy handlers:
- 405 for wrong method (sebuf service-config method gate)
- scenarioId validation against SCENARIO_TEMPLATES registry
- iso2 regex ^[A-Z]{2}$
- JOB_ID_RE path-traversal guard on status
- Per-IP 10/min rate limit (moved to gateway ENDPOINT_RATE_POLICIES)
- Queue-depth backpressure (>100 → 429)
- PRO gating via isCallerPremium
- AbortSignal.timeout on every Redis pipeline (runRedisPipeline helper)
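
The input guards can be sketched as follows; the iso2 regex is quoted above, while the JOB_ID_RE pattern here is an assumption (the real one lives in the handler):

```typescript
// Illustrative input guards for the scenario RPCs. ISO2_RE is from the
// text; JOB_ID_RE's exact pattern is assumed.
const ISO2_RE = /^[A-Z]{2}$/;
const JOB_ID_RE = /^[A-Za-z0-9_-]{1,64}$/; // assumed: rejects "../" traversal

function validateScenarioInput(scenarioId: string, iso2: string, templates: Set<string>): string[] {
  const violations: string[] = [];
  if (!templates.has(scenarioId)) violations.push("unknown scenarioId");
  if (!ISO2_RE.test(iso2)) violations.push("iso2 must match ^[A-Z]{2}$");
  return violations;
}
```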

Wire-level diffs vs legacy:
- Per-user RL now enforced at the gateway (same 10/min/IP budget).
- Rate-limit response omits Retry-After header; retryAfter is in the
  body per error-mapper.ts convention.
- ListScenarioTemplates emits affectedHs2: [] when the registry entry
  is null (all-sectors sentinel); proto repeated cannot carry null.
- RunScenario returns { jobId, status } (no statusUrl field — unused
  by SupplyChainPanel, drop from wire).

Gateway wiring:
- server/gateway.ts RPC_CACHE_TIER: list-scenario-templates → 'daily'
  (matches legacy max-age=3600); get-scenario-status → 'slow-browser'
  (premium short-circuit target, explicit entry required by
  tests/route-cache-tier.test.mjs).
- src/shared/premium-paths.ts: swap old run/status for the new
  run-scenario/get-scenario-status paths.
- api/scenario/v1/{run,status,templates}.ts deleted; 3 manifest
  exceptions removed (52 → 49 migration-pending).

Client:
- src/services/scenario/index.ts — typed client wrapper using
  premiumFetch (injects Clerk bearer / API key).
- src/components/SupplyChainPanel.ts — polling loop swapped from
  premiumFetch strings to runScenario/getScenarioStatus. Hard 20s
  timeout on run preserved via AbortSignal.any.

Tests:
- tests/scenario-handler.test.mjs — 18 new handler-level tests
  covering every security invariant + the worker envelope coercion.
- tests/edge-functions.test.mjs — scenario sections removed,
  replaced with a breadcrumb pointer to the new test file.

Docs: api-scenarios.mdx, scenario-engine.mdx, usage-rate-limits.mdx,
usage-errors.mdx, supply-chain.mdx refreshed with new paths.

Verified: typecheck, typecheck:api, lint:api-contract (49 entries),
lint:rate-limit-policies (6/180), lint:boundaries, route-cache-tier
(parity), full edge-functions (117) + scenario-handler (18).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* refactor(commit 8): migrate /api/v2/shipping/{route-intelligence,webhooks} → ShippingV2Service (#3207)

Partner-facing endpoints promoted to a typed sebuf service. Wire shape
preserved byte-for-byte (camelCase field names, ISO-8601 fetchedAt, the
same subscriberId/secret formats, the same SET + SADD + EXPIRE 30-day
Redis pipeline). Partner URLs /api/v2/shipping/* are unchanged.

RPCs landed:
- GET  /route-intelligence  → RouteIntelligence  (PRO, slow-browser)
- POST /webhooks            → RegisterWebhook    (PRO)
- GET  /webhooks            → ListWebhooks       (PRO, slow-browser)

The existing path-parameter URLs remain on the legacy edge-function
layout because sebuf's HTTP annotations don't currently model path
params (grep proto/**/*.proto for `path: "{…}"` returns zero). Those
endpoints are split into two Vercel dynamic-route files under
api/v2/shipping/webhooks/, behaviorally identical to the previous
hybrid file but cleanly separated:
- GET  /webhooks/{subscriberId}                → [subscriberId].ts
- POST /webhooks/{subscriberId}/rotate-secret  → [subscriberId]/[action].ts
- POST /webhooks/{subscriberId}/reactivate     → [subscriberId]/[action].ts

Both get manifest entries under `migration-pending` pointing at #3207.

Other changes
- scripts/enforce-sebuf-api-contract.mjs: extended GATEWAY_RE to accept
  api/v{N}/{domain}/[rpc].ts (version-first) alongside the canonical
  api/{domain}/v{N}/[rpc].ts; first-use of the reversed ordering is
  shipping/v2 because that's the partner contract.
- vite.config.ts: dev-server sebuf interceptor regex extended to match
  both layouts; shipping/v2 import + allRoutes entry added.
- server/gateway.ts: RPC_CACHE_TIER entries for /api/v2/shipping/
  route-intelligence + /webhooks (slow-browser; premium-gated endpoints
  short-circuit to slow-browser but the entries are required by
  tests/route-cache-tier.test.mjs).
- src/shared/premium-paths.ts: route-intelligence + webhooks added.
- tests/shipping-v2-handler.test.mjs: 18 handler-level tests covering
  PRO gate, iso2/cargoType/hs2 coercion, SSRF guards (http://, RFC1918,
  cloud metadata, IMDS), chokepoint whitelist, alertThreshold range,
  secret/subscriberId format, pipeline shape + 30-day TTL, cross-tenant
  owner isolation, `secret` omission from list response.
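
The SSRF checks the tests cover look roughly like this (assumed helper, not the repo's code):

```typescript
// Sketch of webhook-URL SSRF validation: https-only, no cloud metadata /
// IMDS hosts, no RFC1918 or loopback ranges. Helper name is assumed.
function parseUrl(raw: string): URL | null {
  try { return new URL(raw); } catch { return null; }
}

function isSafeWebhookUrl(raw: string): boolean {
  const url = parseUrl(raw);
  if (!url) return false;
  if (url.protocol !== "https:") return false; // reject http://
  const host = url.hostname;
  if (host === "169.254.169.254" || host === "metadata.google.internal") return false;
  // RFC1918, loopback, link-local literal IPs.
  if (/^(10\.|127\.|169\.254\.|192\.168\.|172\.(1[6-9]|2\d|3[01])\.)/.test(host)) return false;
  return true;
}
```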

Manifest delta
- Removed: api/v2/shipping/route-intelligence.ts, api/v2/shipping/webhooks.ts
- Added:   api/v2/shipping/webhooks/[subscriberId].ts (migration-pending)
- Added:   api/v2/shipping/webhooks/[subscriberId]/[action].ts (migration-pending)
- Added:   api/internal/brief-why-matters.ts (internal-helper) — regression
  surface from the #3248 main merge, which introduced the file without a
  manifest entry. Added here to keep the lint green; not strictly in
  scope for commit 8, but it unblocks CI.

Net result: 49 → 47 `migration-pending` entries (a net reduction even
though the webhook path-param routes stay pending, because the two
legacy files were replaced by the two dynamic routes).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 1): SupplyChainServiceClient must use premiumFetch (#3207)

Signed-in browser pro users were silently hitting 401 on 8 supply-chain
premium endpoints (country-products, multi-sector-cost-shock,
country-chokepoint-index, bypass-options, country-cost-shock,
sector-dependency, route-explorer-lane, route-impact). The shared
client was constructed with globalThis.fetch, so no Clerk bearer or
X-WorldMonitor-Key was injected. The gateway's validateApiKey runs
with forceKey=true for PREMIUM_RPC_PATHS and 401s before isCallerPremium
is consulted. The generated client's try/catch collapses the 401 into
an empty-fallback return, leaving panels blank with no visible error.

Fix is one line at the client constructor: swap globalThis.fetch for
premiumFetch. The same pattern is already in use for insider-transactions,
stock-analysis, stock-backtest, scenario, trade (premiumClient) — this
was an omission on this client, not a new pattern.

premiumFetch no-ops safely when no credentials are available, so the
5 non-premium methods on this client (shippingRates, chokepointStatus,
chokepointHistory, criticalMinerals, shippingStress) continue to work
unchanged.

This also fixes panels that were already latently broken on main
(chokepoint-index, bypass-options, etc.; these predate #3207 and are
not regressions from it). Commit 6 expanded the surface by routing two
more methods through the same buggy client; this commit fixes the
whole class.

From koala73 review (#3242 second-pass, HIGH new #1):
> Exact class PR #3233 fixed for RegionalIntelligenceBoard /
> DeductionPanel / trade / country-intel. Supply-chain was not in
> #3233's scope.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 2): restore 400 on input-shape errors for 2 supply-chain handlers (#3207)

Commit 6 collapsed all non-happy paths into empty-200 on
`get-country-products` and `get-multi-sector-cost-shock`, including
caller-bug cases that legacy returned 400 for:

- get-country-products: malformed iso2 → empty 200 (was 400)
- get-multi-sector-cost-shock: malformed iso2 / missing chokepointId /
  unknown chokepointId → empty 200 (was 400)

The commit message for 6 called out the 403-for-non-pro → empty-200
shift ("sebuf gateway pattern is empty-payload-on-deny") but not the
400 shift. They're different classes:

- Empty-payload-200 for PRO-deny: intentional contract change, already
  documented and applied across the service. Generated clients treat
  "you lack PRO" as "no data" — fine.
- Empty-payload-200 for malformed input: caller bug silently masked.
  External API consumers can't distinguish "bad wiring" from "genuinely
  no data", test harnesses lose the signal, bad calling code doesn't
  surface in Sentry.

Fix: `throw new ValidationError(violations)` on the 3 input-shape
branches. The generated sebuf server maps ValidationError → HTTP 400
(see src/generated/server/.../service_server.ts and leads/v1 which
already uses this pattern).

PRO-gate deny stays as empty-200 — that contract shift was intentional
and is preserved.

Regression tests added at tests/supply-chain-validation.test.mjs (8
cases) pinning the three-way contract:
- bad input                         → 400 (ValidationError)
- PRO-gate deny on valid input      → 200 empty
- valid PRO input, no data in Redis → 200 empty (unchanged)
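
The three-way contract above can be sketched as follows. Names here are illustrative stand-ins; the actual ValidationError → HTTP 400 mapping lives in the generated sebuf server wrapper, not in the handler:

```typescript
// Thrown for caller bugs; the generated sebuf server maps this to HTTP 400.
class ValidationError extends Error {
  constructor(public readonly violations: string[]) {
    super(violations.join('; '));
    this.name = 'ValidationError';
  }
}

interface CallCtx {
  iso2: string | null;
  isPro: boolean;
  lookup: (iso2: string) => unknown[] | null; // Redis read, stubbed here
}

function getCountryProducts(ctx: CallCtx): { products: unknown[] } {
  // Input-shape error: throw, so the caller sees 400, not a silent empty 200.
  if (!ctx.iso2 || !/^[A-Za-z]{2}$/.test(ctx.iso2)) {
    throw new ValidationError(['iso2 must be a 2-letter country code']);
  }
  // PRO-gate deny: intentional empty-200 contract, preserved.
  if (!ctx.isPro) return { products: [] };
  // Valid PRO input with no data in Redis: empty 200, unchanged.
  return { products: ctx.lookup(ctx.iso2) ?? [] };
}
```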

From koala73 review (#3242 second-pass, HIGH new #2).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 3): restore statusUrl on RunScenarioResponse + document 202→200 wire break (#3207)

Commit 7 silently shifted /api/scenario/v1/run-scenario's response
contract in two ways that the commit message covered only partially:

1. HTTP 202 Accepted → HTTP 200 OK
2. Dropped `statusUrl` string from the response body

The `statusUrl` drop was mentioned as "unused by SupplyChainPanel" but
not framed as a contract change. The 202 → 200 shift was not mentioned
at all. This is a same-version (v1 → v1) migration, so external callers
that key off either signal — `response.status === 202` or
`response.body.statusUrl` — silently branch incorrectly.

Evaluated options:
  (a) sebuf per-RPC status-code config — not available. sebuf's
      HttpConfig only models `path` and `method`; no status annotation.
  (b) Bump to scenario/v2 — judged heavier than the break itself for
      a single status-code shift. No in-repo caller uses 202 or
      statusUrl; the docs-level impact is containable.
  (c) Accept the break, document explicitly, partially restore.

Took option (c):

- Restored `statusUrl` in the proto (new field `string status_url = 3`
  on RunScenarioResponse). Server computes
  `/api/scenario/v1/get-scenario-status?jobId=<encoded job_id>` and
  populates it on every successful enqueue. External callers that
  followed this URL keep working unchanged.
- 202 → 200 is not recoverable inside the sebuf generator, so it is
  called out explicitly in two places:
    - docs/api-scenarios.mdx now includes a prominent `<Warning>` block
      documenting the v1→v1 contract shift + the suggested migration
      (branch on response body shape, not HTTP status).
    - RunScenarioResponse proto comment explains why 200 is the new
      success status on enqueue.
  OpenAPI bundle regenerated to reflect the restored statusUrl field.

- Regression test added in tests/scenario-handler.test.mjs pinning
  `statusUrl` to the exact URL-encoded shape — locks the invariant so
  a future proto rename or handler refactor can't silently drop it
  again.
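
The restored computation is small enough to sketch directly. The helper name is hypothetical; the URL-encoded shape is exactly what the regression test pins:

```typescript
// Computes the restored statusUrl (proto field `string status_url = 3` on
// RunScenarioResponse). job_id is URL-encoded so IDs containing '/' or
// spaces survive as a single query-param value.
function buildStatusUrl(jobId: string): string {
  return `/api/scenario/v1/get-scenario-status?jobId=${encodeURIComponent(jobId)}`;
}
```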

From koala73 review (#3242 second-pass, HIGH new #3).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 1/2): close webhook tenant-isolation gap on shipping/v2 (#3207)

Koala flagged this as a merge blocker in PR #3242 review.

server/worldmonitor/shipping/v2/{register-webhook,list-webhooks}.ts
migrated without reinstating validateApiKey(req, { forceKey: true }),
diverging from both the sibling api/v2/shipping/webhooks/[subscriberId]
routes and the documented "X-WorldMonitor-Key required" contract in
docs/api-shipping-v2.mdx.

Attack surface: the gateway accepts Clerk bearer auth as a pro signal.
A Clerk-authenticated pro user with no X-WorldMonitor-Key reaches the
handler, callerFingerprint() falls back to 'anon', and every such
caller collapses into a shared webhook:owner:anon:v1 bucket. The
defense-in-depth ownerTag !== ownerHash check in list-webhooks.ts
doesn't catch it because both sides equal 'anon' — every Clerk-session
holder could enumerate / overwrite every other Clerk-session pro
tenant's registered webhook URLs.

Fix: reinstate validateApiKey(ctx.request, { forceKey: true }) at the
top of each handler, throwing ApiError(401) when absent. Matches the
sibling routes exactly and the published partner contract.
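
A sketch of the reinstated gate, with stand-ins for the repo's validateApiKey / ApiError / callerFingerprint helpers (signatures here are illustrative). The point is ordering: the key check runs before any fingerprinting, so a Clerk-only caller can no longer reach the shared 'anon' bucket:

```typescript
class ApiError extends Error {
  constructor(public readonly status: number, message: string) { super(message); }
}

// With forceKey, a missing X-WorldMonitor-Key is a hard 401 before the
// handler body runs.
function validateApiKey(headers: Headers, opts: { forceKey: boolean }): string | null {
  const key = headers.get('x-worldmonitor-key');
  if (!key && opts.forceKey) throw new ApiError(401, 'X-WorldMonitor-Key required');
  return key;
}

function callerFingerprint(key: string | null): string {
  // With forceKey enforced above, the 'anon' fallback is unreachable for
  // webhook owners, so no two keyless tenants can share a bucket.
  return `webhook:owner:${key ?? 'anon'}:v1`;
}
```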

Tests:
- tests/shipping-v2-handler.test.mjs: two existing "non-PRO → 403"
  tests for register/list were using makeCtx() with no key, so they now
  fail at the 401 layer first. Renamed to "no API key → 401
  (tenant-isolation gate)" with a comment explaining the failure mode
  being tested. 18/18 pass.

Verified: typecheck:api, lint:api-contract (no change), lint:boundaries,
lint:rate-limit-policies, test:data (6005/6005).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

* fix(review HIGH 2/2): restore v1 path aliases on scenario + supply-chain (#3207)

Koala flagged this as a merge blocker in PR #3242 review.

Commits 6 + 7 of #3207 renamed five documented v1 URLs to the sebuf
method-derived paths and deleted the legacy edge-function files:

  POST /api/scenario/v1/run                       → run-scenario
  GET  /api/scenario/v1/status                    → get-scenario-status
  GET  /api/scenario/v1/templates                 → list-scenario-templates
  GET  /api/supply-chain/v1/country-products      → get-country-products
  GET  /api/supply-chain/v1/multi-sector-cost-shock → get-multi-sector-cost-shock

server/router.ts is an exact static-match table (Map keyed on `METHOD
PATH`), so any external caller — docs, partner scripts, grep-the-
internet — hitting the old documented URL would 404 on first request
after merge. Commit 8 (shipping/v2) preserved partner URLs byte-for-
byte; the scenario + supply-chain renames missed that discipline.

Fix: add five thin alias edge functions that rewrite the pathname to
the canonical sebuf path and delegate to the domain [rpc].ts gateway
via a new server/alias-rewrite.ts helper. Premium gating, rate limits,
entitlement checks, and cache-tier lookups all fire on the canonical
path — aliases are pure URL rewrites, not a duplicate handler pipeline.

  api/scenario/v1/{run,status,templates}.ts
  api/supply-chain/v1/{country-products,multi-sector-cost-shock}.ts

Vite dev parity: file-based routing at api/ is a Vercel concern, so the
dev middleware (vite.config.ts) gets a matching V1_ALIASES rewrite map
before the router dispatch.
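
A minimal sketch of the rewrite, assuming illustrative names for what server/alias-rewrite.ts exposes (V1_ALIASES, rewriteAlias); body forwarding is elided for brevity:

```typescript
// Old documented v1 URL → canonical sebuf method-derived path.
const V1_ALIASES = new Map<string, string>([
  ['/api/scenario/v1/run', '/api/scenario/v1/run-scenario'],
  ['/api/scenario/v1/status', '/api/scenario/v1/get-scenario-status'],
  ['/api/scenario/v1/templates', '/api/scenario/v1/list-scenario-templates'],
  ['/api/supply-chain/v1/country-products', '/api/supply-chain/v1/get-country-products'],
  ['/api/supply-chain/v1/multi-sector-cost-shock', '/api/supply-chain/v1/get-multi-sector-cost-shock'],
]);

function rewriteAlias(req: Request): Request {
  const url = new URL(req.url);
  const canonical = V1_ALIASES.get(url.pathname);
  if (!canonical) return req; // not an alias: pass through untouched
  url.pathname = canonical;   // pure URL rewrite; gating, rate limits, and
                              // cache-tier lookups all fire on the canonical path
  return new Request(url.toString(), { method: req.method, headers: req.headers });
}
```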

Manifest: 5 new entries under `deferred` with removal_issue=#3282
(tracking their retirement at the next v1→v2 break). lint:api-contract
stays green (89 files checked, 55 manifest entries validated).

Docs:
- docs/api-scenarios.mdx: migration callout at the top with the full
  old→new URL table and a link to the retirement issue.
- CHANGELOG.md + docs/changelog.mdx: Changed entry documenting the
  rename + alias compat + the 202→200 shift (from commit 23c821a1).

Verified: typecheck:api, lint:api-contract, lint:rate-limit-policies,
lint:boundaries, test:data (6005/6005).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Sebastien Melki, 2026-04-22 09:55:59 +03:00, committed by GitHub
commit 58e42aadf9 (parent 425507d15a)
129 changed files with 7156 additions and 2876 deletions

.github/CODEOWNERS (new file)

@@ -0,0 +1,6 @@
# Adding or modifying api-route-exceptions.json opts an endpoint out of the
# sebuf contract. Every change here needs review to prevent drift.
/api/api-route-exceptions.json @SebastienMelki
# The enforcement script and its CI wiring are the teeth behind the manifest.
/scripts/enforce-sebuf-api-contract.mjs @SebastienMelki


@@ -43,6 +43,7 @@ jobs:
- run: npm run lint:unicode
- run: npm run lint
- run: npm run lint:boundaries
- run: npm run lint:api-contract
- name: Markdown lint
run: npm run lint:md
- name: Version sync check


@@ -65,6 +65,9 @@ node scripts/check-unicode-safety.mjs || exit 1
echo "Running architectural boundary check..."
npm run lint:boundaries || exit 1
echo "Running rate-limit policy coverage check..."
npm run lint:rate-limit-policies || exit 1
echo "Running edge function bundle check..."
while IFS= read -r f; do
npx esbuild "$f" --bundle --format=esm --platform=browser --outfile=/dev/null 2>/dev/null || {


@@ -17,9 +17,23 @@ All notable changes to World Monitor are documented here.
- PortWatch, CorridorRisk, and transit seed loops on Railway relay (#1560)
- R2 trace storage for forecast debugging with Cloudflare API upload (#1655)
### Changed
- **Sebuf API migration (#3207)** — scenario + supply-chain endpoints migrated to the typed sebuf contract. RPC URLs now derive from method names; the five renamed v1 URLs remain live as thin aliases so existing integrations keep working:
- `/api/scenario/v1/run` → `/api/scenario/v1/run-scenario`
- `/api/scenario/v1/status` → `/api/scenario/v1/get-scenario-status`
- `/api/scenario/v1/templates` → `/api/scenario/v1/list-scenario-templates`
- `/api/supply-chain/v1/country-products` → `/api/supply-chain/v1/get-country-products`
- `/api/supply-chain/v1/multi-sector-cost-shock` → `/api/supply-chain/v1/get-multi-sector-cost-shock`
Aliases will retire at the next v1→v2 break ([#3282](https://github.com/koala73/worldmonitor/issues/3282)).
- `POST /api/scenario/v1/run-scenario` now returns `200 OK` instead of the pre-migration `202 Accepted` on successful enqueue. sebuf's HTTP annotations don't support per-RPC status codes. Branch on response body `status === "pending"` instead of `response.status === 202`. `statusUrl` is preserved.
### Security
- CDN-Cache-Control header now only set for trusted origins (worldmonitor.app, Vercel previews, Tauri); no-origin server-side requests always reach the edge function so `validateApiKey` can run, closing a potential cache-bypass path for external scrapers
- **Shipping v2 webhook tenant isolation (#3242)** — `POST /api/v2/shipping/webhooks` (register) and `GET /api/v2/shipping/webhooks` (list) now enforce `validateApiKey(req, { forceKey: true })`, matching the sibling `[subscriberId]{,/[action]}` routes and the documented contract in `docs/api-shipping-v2.mdx`. Without this gate, a Clerk-authenticated pro user with no API key would fall through `callerFingerprint()` to the shared `'anon'` bucket and see/overwrite webhooks owned by other `'anon'`-bucket tenants.
### Fixed


@@ -1,20 +0,0 @@
export function createIpRateLimiter({ limit, windowMs }) {
const rateLimitMap = new Map();
function getEntry(ip) {
return rateLimitMap.get(ip) || null;
}
function isRateLimited(ip) {
const now = Date.now();
const entry = getEntry(ip);
if (!entry || now - entry.windowStart > windowMs) {
rateLimitMap.set(ip, { windowStart: now, count: 1 });
return false;
}
entry.count += 1;
return entry.count > limit;
}
return { isRateLimited, getEntry };
}


@@ -1,16 +0,0 @@
import { createRelayHandler } from './_relay.js';
export const config = { runtime: 'edge' };
export default createRelayHandler({
relayPath: '/ais/snapshot',
timeout: 12000,
requireApiKey: true,
requireRateLimit: true,
cacheHeaders: (ok) => ({
'Cache-Control': ok
? 'public, max-age=60, s-maxage=300, stale-while-revalidate=600, stale-if-error=900'
: 'public, max-age=10, s-maxage=30, stale-while-revalidate=120',
...(ok && { 'CDN-Cache-Control': 'public, s-maxage=300, stale-while-revalidate=600, stale-if-error=900' }),
}),
});


@@ -0,0 +1,398 @@
{
"$comment": "Single source of truth for non-proto /api/ endpoints. All new JSON data APIs MUST use sebuf (proto → buf generate → handler). This manifest is the only escape hatch, and every entry is reviewed by @SebastienMelki (see .github/CODEOWNERS). Categories: external-protocol (MCP / OAuth — shape dictated by external spec), non-json (binary/HTML/image responses), upstream-proxy (raw pass-through of an external feed), ops-admin (health/cron/version — operator plumbing, not a product API), internal-helper (dashboard-internal bundle, not user-facing), deferred (should migrate eventually — must have a removal_issue), migration-pending (actively being migrated in an open PR — removed as its commit lands). See docs/adding-endpoints.mdx.",
"schema_version": 1,
"exceptions": [
{
"path": "api/mcp.ts",
"category": "external-protocol",
"reason": "MCP streamable-HTTP transport — JSON-RPC 2.0 envelope dictated by the Model Context Protocol spec, not something we can or should redefine as proto.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/mcp-proxy.js",
"category": "external-protocol",
"reason": "Proxy for the MCP transport — same shape constraint as api/mcp.ts.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/oauth/authorize.js",
"category": "external-protocol",
"reason": "OAuth 2.0 authorization endpoint. Response is an HTML consent page and 302 redirect; request/response shape is dictated by RFC 6749.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/oauth/register.js",
"category": "external-protocol",
"reason": "OAuth 2.0 dynamic client registration (RFC 7591). Shape fixed by the spec.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/oauth/token.js",
"category": "external-protocol",
"reason": "OAuth 2.0 token endpoint (RFC 6749). application/x-www-form-urlencoded request body; shape fixed by the spec.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/discord/oauth/callback.ts",
"category": "external-protocol",
"reason": "Discord OAuth redirect target — response is an HTML popup-closer page, query-param shape fixed by Discord.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/discord/oauth/start.ts",
"category": "external-protocol",
"reason": "Discord OAuth initiator — issues 302 to Discord's authorize URL. Not a JSON API.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/slack/oauth/callback.ts",
"category": "external-protocol",
"reason": "Slack OAuth redirect target — HTML response with postMessage + window.close, query-param shape fixed by Slack.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/slack/oauth/start.ts",
"category": "external-protocol",
"reason": "Slack OAuth initiator — 302 to Slack's authorize URL. Not a JSON API.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/download.js",
"category": "non-json",
"reason": "Binary file download (zip/csv/xlsx). Content-Type is not application/json.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/og-story.js",
"category": "non-json",
"reason": "Open Graph preview image (PNG via @vercel/og). Binary response.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/story.js",
"category": "non-json",
"reason": "Rendered HTML story page for social embeds — response is text/html, not JSON.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/youtube/embed.js",
"category": "non-json",
"reason": "YouTube oEmbed passthrough — shape dictated by YouTube's oEmbed response, served as-is.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/youtube/live.js",
"category": "non-json",
"reason": "Streams YouTube live metadata — chunked text response, not a typed JSON payload.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/brief/carousel/[userId]/[issueDate]/[page].ts",
"category": "non-json",
"reason": "Rendered carousel page image for brief social posts. Binary image response, dynamic path segments.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/opensky.js",
"category": "upstream-proxy",
"reason": "Transparent proxy to OpenSky Network API. Shape is OpenSky's; we don't remodel it.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/polymarket.js",
"category": "upstream-proxy",
"reason": "Transparent proxy to Polymarket gamma API. Shape is Polymarket's.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/gpsjam.js",
"category": "upstream-proxy",
"reason": "Transparent proxy to gpsjam.org tile/feed. Shape is upstream's.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/oref-alerts.js",
"category": "upstream-proxy",
"reason": "Transparent proxy to Pikud HaOref (IDF Home Front Command) alert feed. Shape is upstream's.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/rss-proxy.js",
"category": "upstream-proxy",
"reason": "Generic RSS/Atom XML proxy for CORS-blocked feeds. Response is XML, not JSON.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/telegram-feed.js",
"category": "upstream-proxy",
"reason": "Telegram channel feed proxy — passes upstream MTProto-derived shape through.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/supply-chain/hormuz-tracker.js",
"category": "upstream-proxy",
"reason": "Transparent proxy to Hormuz strait AIS tracker feed. Shape is upstream's.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/health.js",
"category": "ops-admin",
"reason": "Liveness probe hit by uptime monitor and load balancer. Not a product API; plain-text OK response.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/seed-health.js",
"category": "ops-admin",
"reason": "Cron-triggered data-freshness check for feed seeds. Operator tool, not a user-facing API.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/version.js",
"category": "ops-admin",
"reason": "Build-version probe for the desktop auto-updater. Tiny plain-JSON operator plumbing.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/cache-purge.js",
"category": "ops-admin",
"reason": "Admin-gated cache invalidation endpoint. Internal operator action, not a product API.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/seed-contract-probe.ts",
"category": "ops-admin",
"reason": "Cron probe that verifies seed contract shapes against upstreams. Operator telemetry.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/invalidate-user-api-key-cache.ts",
"category": "ops-admin",
"reason": "Admin-gated cache bust for user API key lookups. Internal operator action.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/fwdstart.js",
"category": "ops-admin",
"reason": "Tauri desktop updater bootstrap — starts the sidecar forwarding flow. Operator plumbing, not JSON.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/notify.ts",
"category": "ops-admin",
"reason": "Outbound notification dispatch (Slack/Discord/email) driven by cron. Internal, not a typed user API.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/bootstrap.js",
"category": "internal-helper",
"reason": "Dashboard-internal config bundle assembled at request time. Exposing as a user-facing API would implicitly commit us to its shape; keep it deliberately unversioned and out of the API surface.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/geo.js",
"category": "internal-helper",
"reason": "Lightweight IP-to-geo lookup wrapping Vercel's request.geo. Dashboard-internal helper; not worth a service of its own.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/reverse-geocode.js",
"category": "internal-helper",
"reason": "Reverse-geocode helper used only by the map layer for label rendering. Wraps an upstream provider; shape tracks upstream, not a versioned product contract.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/internal/brief-why-matters.ts",
"category": "internal-helper",
"reason": "Internal brief-pipeline helper — auth'd by RELAY_SHARED_SECRET (Railway cron only), not a user-facing API. Generated on merge of #3248 from main without a manifest entry; filed here to keep the lint green.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/data/city-coords.ts",
"category": "internal-helper",
"reason": "Static city-coordinates payload served from the deploy artifact. Returns a fixed reference table, not a queryable service.",
"owner": "@SebastienMelki",
"removal_issue": null
},
{
"path": "api/latest-brief.ts",
"category": "deferred",
"reason": "Returns the current user's latest brief. Auth-gated and Clerk-coupled; migrating requires modeling Brief in proto and auth context in handler. Deferred to brief/v1 service.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/brief/share-url.ts",
"category": "deferred",
"reason": "Creates a shareable public URL for a brief. Part of the brief/v1 service work.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/brief/public/[hash].ts",
"category": "deferred",
"reason": "Resolves a share hash to public-safe brief JSON. Part of the brief/v1 service work; dynamic path segment needs proto path-param modeling.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/brief/[userId]/[issueDate].ts",
"category": "deferred",
"reason": "Fetches a specific brief by user + issue date. Part of the brief/v1 service work.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/notification-channels.ts",
"category": "deferred",
"reason": "Lists / configures user notification channels. Auth-gated; migrating requires user/v1 or notifications/v1 service. Deferred until Clerk migration settles.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/user-prefs.ts",
"category": "deferred",
"reason": "Reads / writes user dashboard preferences. Auth-gated; part of user/v1 service work pending Clerk migration.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/create-checkout.ts",
"category": "deferred",
"reason": "Creates a Dodo Payments checkout session. Payments domain is still stabilizing (Clerk + Dodo integration); migrate once shape is frozen.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/customer-portal.ts",
"category": "deferred",
"reason": "Issues a Dodo customer-portal redirect URL. Paired with create-checkout.ts; migrate together.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/product-catalog.js",
"category": "deferred",
"reason": "Returns Dodo product catalog (pricing tiers). Migrate alongside the rest of the payments surface.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/referral/me.ts",
"category": "deferred",
"reason": "Returns the signed-in user's referral state. Auth-gated; part of user/v1 service.",
"owner": "@SebastienMelki",
"removal_issue": "TBD"
},
{
"path": "api/scenario/v1/run.ts",
"category": "deferred",
"reason": "URL-compat alias for POST /api/scenario/v1/run-scenario. Thin gateway wrapper that rewrites the documented pre-#3207 v1 URL to the canonical sebuf RPC path. Not a new endpoint — preserves the partner-documented wire contract. Retires at the next v1→v2 break.",
"owner": "@SebastienMelki",
"removal_issue": "#3282"
},
{
"path": "api/scenario/v1/status.ts",
"category": "deferred",
"reason": "URL-compat alias for GET /api/scenario/v1/get-scenario-status. See api/scenario/v1/run.ts for the same rationale.",
"owner": "@SebastienMelki",
"removal_issue": "#3282"
},
{
"path": "api/scenario/v1/templates.ts",
"category": "deferred",
"reason": "URL-compat alias for GET /api/scenario/v1/list-scenario-templates. See api/scenario/v1/run.ts for the same rationale.",
"owner": "@SebastienMelki",
"removal_issue": "#3282"
},
{
"path": "api/supply-chain/v1/country-products.ts",
"category": "deferred",
"reason": "URL-compat alias for GET /api/supply-chain/v1/get-country-products. See api/scenario/v1/run.ts for the same rationale.",
"owner": "@SebastienMelki",
"removal_issue": "#3282"
},
{
"path": "api/supply-chain/v1/multi-sector-cost-shock.ts",
"category": "deferred",
"reason": "URL-compat alias for GET /api/supply-chain/v1/get-multi-sector-cost-shock. See api/scenario/v1/run.ts for the same rationale.",
"owner": "@SebastienMelki",
"removal_issue": "#3282"
},
{
"path": "api/v2/shipping/webhooks/[subscriberId].ts",
"category": "migration-pending",
"reason": "Partner-facing path-parameter endpoint (GET status by subscriber id). Cannot migrate to sebuf yet — no path-param support in the annotation layer. Paired with the typed ShippingV2Service on the same base URL; tracked for eventual migration once sebuf path params are available.",
"owner": "@SebastienMelki",
"removal_issue": "#3207"
},
{
"path": "api/v2/shipping/webhooks/[subscriberId]/[action].ts",
"category": "migration-pending",
"reason": "Partner-facing path-parameter endpoints (POST rotate-secret, POST reactivate). Cannot migrate to sebuf yet — no path-param support. Paired with the typed ShippingV2Service; tracked for eventual migration once sebuf path params are available.",
"owner": "@SebastienMelki",
"removal_issue": "#3207"
},
{
"path": "api/chat-analyst.ts",
"category": "migration-pending",
"reason": "SSE streaming endpoint. Migrating to analyst/v1.ChatAnalyst (streaming RPC) in commit 9 of #3207. Blocked on sebuf#150 (TS-server SSE codegen).",
"owner": "@SebastienMelki",
"removal_issue": "#3207"
},
{
"path": "api/widget-agent.ts",
"category": "migration-pending",
"reason": "Migrating to analyst/v1.WidgetComplete in commit 9 of #3207. Blocked on sebuf#150.",
"owner": "@SebastienMelki",
"removal_issue": "#3207"
},
{
"path": "api/skills/fetch-agentskills.ts",
"category": "migration-pending",
"reason": "Migrating to analyst/v1.ListAgentSkills in commit 9 of #3207. Blocked on sebuf#150.",
"owner": "@SebastienMelki",
"removal_issue": "#3207"
}
]
}


@@ -1,61 +0,0 @@
// EIA (Energy Information Administration) passthrough.
// Redis-only reader. Railway seeder `seed-eia-petroleum.mjs` (bundled in
// `seed-bundle-energy-sources`) writes `energy:eia-petroleum:v1`; this
// endpoint reads from Redis and never hits api.eia.gov at request time.
// Gold standard per feedback_vercel_reads_only.md.
import { getCorsHeaders, isDisallowedOrigin } from '../_cors.js';
import { readJsonFromUpstash } from '../_upstash-json.js';
export const config = { runtime: 'edge' };
const CANONICAL_KEY = 'energy:eia-petroleum:v1';
export default async function handler(req) {
const cors = getCorsHeaders(req);
if (isDisallowedOrigin(req)) {
return new Response(JSON.stringify({ error: 'Origin not allowed' }), { status: 403, headers: cors });
}
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'GET') {
return Response.json({ error: 'Method not allowed' }, { status: 405, headers: cors });
}
const url = new URL(req.url);
const path = url.pathname.replace('/api/eia', '');
if (path === '/health' || path === '') {
return Response.json({ configured: true }, { headers: cors });
}
if (path === '/petroleum') {
let data;
try {
data = await readJsonFromUpstash(CANONICAL_KEY, 3_000);
} catch {
data = null;
}
if (!data) {
return Response.json(
{ error: 'Data not yet seeded', hint: 'Retry in a few minutes' },
{
status: 503,
headers: { ...cors, 'Cache-Control': 'no-store', 'Retry-After': '300' },
},
);
}
return Response.json(data, {
headers: {
...cors,
'Cache-Control': 'public, max-age=1800, s-maxage=1800, stale-while-revalidate=86400',
},
});
}
return Response.json({ error: 'Not found' }, { status: 404, headers: cors });
}


@@ -1,19 +0,0 @@
const DOMAIN_SUFFIX_RE = /\.(com|io|co|org|net|ai|dev|app)$/;
export function toOrgSlugFromDomain(domain) {
return (domain || '')
.trim()
.toLowerCase()
.replace(DOMAIN_SUFFIX_RE, '')
.split('.')
.pop() || '';
}
export function inferCompanyNameFromDomain(domain) {
const orgSlug = toOrgSlugFromDomain(domain);
if (!orgSlug) return domain || '';
return orgSlug
.replace(/-/g, ' ')
.replace(/\b\w/g, (c) => c.toUpperCase());
}


@@ -1,203 +0,0 @@
/**
* Company Enrichment API — Vercel Edge Function
* Aggregates company data from multiple public sources:
* - GitHub org data
* - Hacker News mentions
* - SEC EDGAR filings (public US companies)
* - Tech stack inference from GitHub repos
*
* GET /api/enrichment/company?domain=example.com
* GET /api/enrichment/company?name=Stripe
*/
import { getCorsHeaders, isDisallowedOrigin } from '../_cors.js';
import { checkRateLimit } from '../_rate-limit.js';
import { inferCompanyNameFromDomain, toOrgSlugFromDomain } from './_domain.js';
export const config = { runtime: 'edge' };
const UA = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36';
const CACHE_TTL_SECONDS = 3600;
const GITHUB_API_HEADERS = Object.freeze({ Accept: 'application/vnd.github.v3+json', 'User-Agent': UA });
async function fetchGitHubOrg(name) {
try {
const res = await fetch(`https://api.github.com/orgs/${encodeURIComponent(name)}`, {
headers: GITHUB_API_HEADERS,
signal: AbortSignal.timeout(5000),
});
if (!res.ok) return null;
const data = await res.json();
return {
name: data.name || data.login,
description: data.description,
blog: data.blog,
location: data.location,
publicRepos: data.public_repos,
followers: data.followers,
avatarUrl: data.avatar_url,
createdAt: data.created_at,
};
} catch {
return null;
}
}
async function fetchGitHubTechStack(orgName) {
try {
const res = await fetch(
`https://api.github.com/orgs/${encodeURIComponent(orgName)}/repos?sort=stars&per_page=10`,
{
headers: GITHUB_API_HEADERS,
signal: AbortSignal.timeout(5000),
},
);
if (!res.ok) return [];
const repos = await res.json();
const languages = new Map();
for (const repo of repos) {
if (repo.language) {
languages.set(repo.language, (languages.get(repo.language) || 0) + repo.stargazers_count + 1);
}
}
return Array.from(languages.entries())
.sort((a, b) => b[1] - a[1])
.slice(0, 10)
.map(([lang, score]) => ({ name: lang, category: 'Programming Language', confidence: Math.min(1, score / 100) }));
} catch {
return [];
}
}
async function fetchSECData(companyName) {
try {
const res = await fetch(
`https://efts.sec.gov/LATEST/search-index?q=${encodeURIComponent(companyName)}&dateRange=custom&startdt=${getDateMonthsAgo(6)}&enddt=${getTodayISO()}&forms=10-K,10-Q,8-K&from=0&size=5`,
{
headers: { 'User-Agent': 'WorldMonitor research@worldmonitor.app', 'Accept': 'application/json' },
signal: AbortSignal.timeout(8000),
},
);
if (!res.ok) return null;
const data = await res.json();
if (!data.hits || !data.hits.hits || data.hits.hits.length === 0) return null;
return {
totalFilings: data.hits.total?.value || 0,
recentFilings: data.hits.hits.slice(0, 5).map((h) => ({
form: h._source?.form_type || h._source?.file_type,
date: h._source?.file_date || h._source?.period_of_report,
description: h._source?.display_names?.[0] || companyName,
})),
};
} catch {
return null;
}
}
async function fetchHackerNewsMentions(companyName) {
try {
const res = await fetch(
`https://hn.algolia.com/api/v1/search?query=${encodeURIComponent(companyName)}&tags=story&hitsPerPage=5`,
{
headers: { 'User-Agent': UA },
signal: AbortSignal.timeout(5000),
},
);
if (!res.ok) return [];
const data = await res.json();
return (data.hits || []).map((h) => ({
title: h.title,
url: h.url,
points: h.points,
comments: h.num_comments,
date: h.created_at,
}));
} catch {
return [];
}
}
function getTodayISO() {
return toISODate(new Date());
}
function getDateMonthsAgo(months) {
const d = new Date();
d.setMonth(d.getMonth() - months);
return toISODate(d);
}
function toISODate(date) {
return date.toISOString().split('T')[0];
}
export default async function handler(req) {
const cors = getCorsHeaders(req, 'GET, OPTIONS');
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (isDisallowedOrigin(req)) {
return new Response('Forbidden', { status: 403, headers: cors });
}
const rateLimitResult = await checkRateLimit(req, 'enrichment', 30, '60s');
if (rateLimitResult) return rateLimitResult;
const url = new URL(req.url);
const domain = url.searchParams.get('domain')?.trim().toLowerCase();
const name = url.searchParams.get('name')?.trim();
if (!domain && !name) {
return new Response(JSON.stringify({ error: 'Provide ?domain= or ?name= parameter' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const companyName = name || (domain ? inferCompanyNameFromDomain(domain) : 'Unknown');
const searchName = domain ? toOrgSlugFromDomain(domain) : companyName.toLowerCase().replace(/\s+/g, '');
const [githubOrg, techStack, secData, hnMentions] = await Promise.all([
fetchGitHubOrg(searchName),
fetchGitHubTechStack(searchName),
fetchSECData(companyName),
fetchHackerNewsMentions(companyName),
]);
const enrichedData = {
company: {
name: githubOrg?.name || companyName,
domain: domain || githubOrg?.blog?.replace(/^https?:\/\//, '').replace(/\/$/, '') || null,
description: githubOrg?.description || null,
location: githubOrg?.location || null,
website: githubOrg?.blog || (domain ? `https://${domain}` : null),
founded: githubOrg?.createdAt ? new Date(githubOrg.createdAt).getFullYear() : null,
},
github: githubOrg ? {
publicRepos: githubOrg.publicRepos,
followers: githubOrg.followers,
avatarUrl: githubOrg.avatarUrl,
} : null,
techStack: techStack.length > 0 ? techStack : null,
secFilings: secData,
hackerNewsMentions: hnMentions.length > 0 ? hnMentions : null,
enrichedAt: new Date().toISOString(),
sources: [
githubOrg ? 'github' : null,
techStack.length > 0 ? 'github_repos' : null,
secData ? 'sec_edgar' : null,
hnMentions.length > 0 ? 'hacker_news' : null,
].filter(Boolean),
};
return new Response(JSON.stringify(enrichedData), {
status: 200,
headers: {
...cors,
'Content-Type': 'application/json',
'Cache-Control': `public, s-maxage=${CACHE_TTL_SECONDS}, stale-while-revalidate=${CACHE_TTL_SECONDS * 2}`,
},
});
}
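The enrichment handler above fans out to four sources in one `Promise.all` and degrades gracefully: every fetcher swallows its own errors and resolves to `null` or `[]`, and the `sources` array is derived afterwards with `filter(Boolean)`. A minimal sketch of that pattern (the `enrich`/`safeFetch` names are illustrative, not part of the codebase):

```javascript
// Each source fetcher resolves to a neutral fallback on failure, so one
// upstream outage never rejects the whole Promise.all.
async function safeFetch(fn, fallback) {
  try { return await fn(); } catch { return fallback; }
}

async function enrich(fetchers) {
  const [github, sec, hn] = await Promise.all([
    safeFetch(fetchers.github, null),
    safeFetch(fetchers.sec, null),
    safeFetch(fetchers.hn, []),
  ]);
  return {
    github,
    secFilings: sec,
    hnMentions: hn.length > 0 ? hn : null,
    // Same filter(Boolean) trick the handler uses to list live sources.
    sources: [
      github && 'github',
      sec && 'sec_edgar',
      hn.length > 0 && 'hacker_news',
    ].filter(Boolean),
  };
}
```

A failed SEC lookup simply drops `sec_edgar` from `sources` rather than failing the request.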


@@ -1,218 +0,0 @@
/**
* Signal Discovery API — Vercel Edge Function
* Discovers activity signals for a company from public sources:
* - News mentions (Hacker News)
* - GitHub activity spikes
* - Job posting signals (HN hiring threads)
*
* GET /api/enrichment/signals?company=Stripe&domain=stripe.com
*/
import { getCorsHeaders, isDisallowedOrigin } from '../_cors.js';
import { checkRateLimit } from '../_rate-limit.js';
import { toOrgSlugFromDomain } from './_domain.js';
export const config = { runtime: 'edge' };
const UA = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36';
const UPSTREAM_TIMEOUT_MS = 5000;
const DEFAULT_HEADERS = Object.freeze({ 'User-Agent': UA });
const GITHUB_HEADERS = Object.freeze({ Accept: 'application/vnd.github.v3+json', ...DEFAULT_HEADERS });
const SIGNAL_KEYWORDS = {
hiring_surge: ['hiring', 'we\'re hiring', 'join our team', 'open positions', 'new roles', 'growing team'],
funding_event: ['raised', 'funding', 'series', 'investment', 'valuation', 'backed by'],
expansion_signal: ['expansion', 'new office', 'opening', 'entering market', 'new region', 'international'],
technology_adoption: ['migrating to', 'adopting', 'implementing', 'rolling out', 'tech stack', 'infrastructure'],
executive_movement: ['appointed', 'joins as', 'new ceo', 'new cto', 'new vp', 'leadership change', 'promoted to'],
financial_trigger: ['revenue', 'ipo', 'acquisition', 'merger', 'quarterly results', 'earnings'],
};
function classifySignal(text) {
const lower = text.toLowerCase();
for (const [type, keywords] of Object.entries(SIGNAL_KEYWORDS)) {
for (const kw of keywords) {
if (lower.includes(kw)) return type;
}
}
return 'press_release';
}
function scoreSignalStrength(points, comments, recencyDays) {
let score = 0;
if (points > 100) score += 3;
else if (points > 30) score += 2;
else score += 1;
if (comments > 50) score += 2;
else if (comments > 10) score += 1;
if (recencyDays <= 3) score += 3;
else if (recencyDays <= 7) score += 2;
else if (recencyDays <= 14) score += 1;
if (score >= 7) return 'critical';
if (score >= 5) return 'high';
if (score >= 3) return 'medium';
return 'low';
}
async function fetchHNSignals(companyName) {
try {
const res = await fetch(
`https://hn.algolia.com/api/v1/search_by_date?query=${encodeURIComponent(companyName)}&tags=story&hitsPerPage=20&numericFilters=created_at_i>${Math.floor(Date.now() / 1000) - 30 * 86400}`,
{
headers: DEFAULT_HEADERS,
signal: AbortSignal.timeout(UPSTREAM_TIMEOUT_MS),
},
);
if (!res.ok) return [];
const data = await res.json();
const now = Date.now();
return (data.hits || []).map((h) => {
const recencyDays = (now - new Date(h.created_at).getTime()) / 86400000;
return {
type: classifySignal(h.title),
title: h.title,
url: h.url || `https://news.ycombinator.com/item?id=${h.objectID}`,
source: 'Hacker News',
sourceTier: 2,
timestamp: h.created_at,
strength: scoreSignalStrength(h.points || 0, h.num_comments || 0, recencyDays),
engagement: { points: h.points, comments: h.num_comments },
};
});
} catch {
return [];
}
}
async function fetchGitHubSignals(orgName) {
try {
const res = await fetch(
`https://api.github.com/orgs/${encodeURIComponent(orgName)}/repos?sort=created&per_page=10`,
{
headers: GITHUB_HEADERS,
signal: AbortSignal.timeout(UPSTREAM_TIMEOUT_MS),
},
);
if (!res.ok) return [];
const repos = await res.json();
const now = Date.now();
const thirtyDaysAgo = now - 30 * 86400000;
return repos
.filter((r) => new Date(r.created_at).getTime() > thirtyDaysAgo)
.map((r) => ({
type: 'technology_adoption',
title: `New repository: ${r.full_name} (${r.description || 'No description'})`,
url: r.html_url,
source: 'GitHub',
sourceTier: 2,
timestamp: r.created_at,
strength: r.stargazers_count > 50 ? 'high' : r.stargazers_count > 10 ? 'medium' : 'low',
engagement: { stars: r.stargazers_count, forks: r.forks_count },
}));
} catch {
return [];
}
}
async function fetchJobSignals(companyName) {
try {
const res = await fetch(
`https://hn.algolia.com/api/v1/search?query=${encodeURIComponent(companyName)}&tags=comment,ask_hn&hitsPerPage=10&numericFilters=created_at_i>${Math.floor(Date.now() / 1000) - 60 * 86400}`,
{
headers: DEFAULT_HEADERS,
signal: AbortSignal.timeout(UPSTREAM_TIMEOUT_MS),
},
);
if (!res.ok) return [];
const data = await res.json();
const hiringComments = (data.hits || []).filter((h) => {
const text = (h.comment_text || '').toLowerCase();
return text.includes('hiring') || text.includes('job') || text.includes('apply');
});
if (hiringComments.length === 0) return [];
return [{
type: 'hiring_surge',
title: `${companyName} hiring activity (${hiringComments.length} mentions in HN hiring threads)`,
url: `https://news.ycombinator.com/item?id=${hiringComments[0].story_id}`,
source: 'HN Hiring Threads',
sourceTier: 3,
timestamp: hiringComments[0].created_at,
strength: hiringComments.length >= 3 ? 'high' : 'medium',
engagement: { mentions: hiringComments.length },
}];
} catch {
return [];
}
}
export default async function handler(req) {
const cors = getCorsHeaders(req, 'GET, OPTIONS');
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (isDisallowedOrigin(req)) {
return new Response('Forbidden', { status: 403, headers: cors });
}
const rateLimitResult = await checkRateLimit(req, 'signals', 20, '60s');
if (rateLimitResult) return rateLimitResult;
const url = new URL(req.url);
const company = url.searchParams.get('company')?.trim();
const domain = url.searchParams.get('domain')?.trim().toLowerCase();
if (!company) {
return new Response(JSON.stringify({ error: 'Provide ?company= parameter' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const orgName = toOrgSlugFromDomain(domain) || company.toLowerCase().replace(/\s+/g, '');
const [hnSignals, githubSignals, jobSignals] = await Promise.all([
fetchHNSignals(company),
fetchGitHubSignals(orgName),
fetchJobSignals(company),
]);
const allSignals = [...hnSignals, ...githubSignals, ...jobSignals]
.sort((a, b) => new Date(b.timestamp).getTime() - new Date(a.timestamp).getTime());
const signalTypeCounts = {};
for (const s of allSignals) {
signalTypeCounts[s.type] = (signalTypeCounts[s.type] || 0) + 1;
}
const result = {
company,
domain: domain || null,
signals: allSignals,
summary: {
totalSignals: allSignals.length,
byType: signalTypeCounts,
strongestSignal: allSignals[0] || null,
signalDiversity: Object.keys(signalTypeCounts).length,
},
discoveredAt: new Date().toISOString(),
};
return new Response(JSON.stringify(result), {
status: 200,
headers: {
...cors,
'Content-Type': 'application/json',
'Cache-Control': 'public, s-maxage=1800, stale-while-revalidate=3600',
},
});
}
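The signals handler scores each hit by summing engagement and recency buckets into a 0-8 score, then mapping score ranges to strength tiers. The scorer is reproduced verbatim from the file above so the tier boundaries can be sanity-checked:

```javascript
// Engagement (points, comments) plus recency buckets sum to a 0-8 score;
// 7+ is critical, 5-6 high, 3-4 medium, under 3 low.
function scoreSignalStrength(points, comments, recencyDays) {
  let score = 0;
  if (points > 100) score += 3;
  else if (points > 30) score += 2;
  else score += 1;
  if (comments > 50) score += 2;
  else if (comments > 10) score += 1;
  if (recencyDays <= 3) score += 3;
  else if (recencyDays <= 7) score += 2;
  else if (recencyDays <= 14) score += 1;
  if (score >= 7) return 'critical';
  if (score >= 5) return 'high';
  if (score >= 3) return 'medium';
  return 'low';
}
```

A fresh, heavily discussed story (150 points, 60 comments, 1 day old) scores 8 and lands on `critical`; a month-old story with no engagement scores 1 and lands on `low`.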

api/leads/v1/[rpc].ts Normal file

@@ -0,0 +1,9 @@
export const config = { runtime: 'edge' };
import { createDomainGateway, serverOptions } from '../../../server/gateway';
import { createLeadsServiceRoutes } from '../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { leadsHandler } from '../../../server/worldmonitor/leads/v1/handler';
export default createDomainGateway(
createLeadsServiceRoutes(leadsHandler, serverOptions),
);


@@ -1,68 +0,0 @@
import { getCorsHeaders, isDisallowedOrigin } from './_cors.js';
import { jsonResponse } from './_json-response.js';
import { readJsonFromUpstash } from './_upstash-json.js';
export const config = { runtime: 'edge' };
const REDIS_KEY = 'military:flights:v1';
const STALE_KEY = 'military:flights:stale:v1';
let cached = null;
let cachedAt = 0;
const CACHE_TTL = 120_000;
let negUntil = 0;
const NEG_TTL = 30_000;
async function fetchMilitaryFlightsData() {
const now = Date.now();
if (cached && now - cachedAt < CACHE_TTL) return cached;
if (now < negUntil) return null;
let data;
try { data = await readJsonFromUpstash(REDIS_KEY); } catch { data = null; }
if (!data) {
try { data = await readJsonFromUpstash(STALE_KEY); } catch { data = null; }
}
if (!data) {
negUntil = now + NEG_TTL;
return null;
}
cached = data;
cachedAt = now;
return data;
}
export default async function handler(req) {
const corsHeaders = getCorsHeaders(req, 'GET, OPTIONS');
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: corsHeaders });
}
if (isDisallowedOrigin(req)) {
return jsonResponse({ error: 'Origin not allowed' }, 403, corsHeaders);
}
const data = await fetchMilitaryFlightsData();
if (!data) {
return jsonResponse(
{ error: 'Military flight data temporarily unavailable' },
503,
{ 'Cache-Control': 'no-cache, no-store', ...corsHeaders },
);
}
return jsonResponse(
data,
200,
{
'Cache-Control': 's-maxage=120, stale-while-revalidate=60, stale-if-error=300',
...corsHeaders,
},
);
}
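The military-flights handler above (and the satellites handler later in this diff) share the same module-scope cache shape: a positive TTL for hits plus a shorter negative TTL so a Redis miss is not retried on every request. A generic sketch of that pattern, with an injectable clock for testing (the `makeNegativeCache` name is illustrative):

```javascript
// Positive results are cached for ttlMs; a miss or failure is remembered for
// negTtlMs so the upstream is not hammered while the key is empty or down.
function makeNegativeCache(loader, { ttlMs, negTtlMs }) {
  let cached = null;
  let cachedAt = 0;
  let negUntil = 0;
  return async function get(now = Date.now()) {
    if (cached && now - cachedAt < ttlMs) return cached;
    if (now < negUntil) return null;           // negative-cache window still open
    const data = await loader().catch(() => null);
    if (!data) {
      negUntil = now + negTtlMs;               // remember the miss briefly
      return null;
    }
    cached = data;
    cachedAt = now;
    return data;
  };
}
```

The file above instantiates this with a 120s positive / 30s negative TTL; the satellites endpoint uses 600s / 60s.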


@@ -1,88 +0,0 @@
// Edge function: on-demand OpenSanctions entity search (Phase 2 — issue #2042).
// Proxies to https://api.opensanctions.org — no auth required for basic search.
// Merges results with OFAC via the RPC lookup endpoint for a unified response.
export const config = { runtime: 'edge' };
import { createIpRateLimiter } from './_ip-rate-limit.js';
import { jsonResponse } from './_json-response.js';
import { getClientIp } from './_turnstile.js';
const OPENSANCTIONS_BASE = 'https://api.opensanctions.org';
const OPENSANCTIONS_TIMEOUT_MS = 8_000;
const MAX_RESULTS = 20;
const rateLimiter = createIpRateLimiter({ limit: 30, windowMs: 60_000 });
function normalizeEntity(hit) {
const props = hit.properties ?? {};
const name = (props.name ?? [hit.caption]).filter(Boolean)[0] ?? '';
const countries = props.country ?? props.nationality ?? [];
const programs = props.program ?? props.sanctions ?? [];
const schema = hit.schema ?? '';
let entityType = 'entity';
if (schema === 'Vessel') entityType = 'vessel';
else if (schema === 'Aircraft') entityType = 'aircraft';
else if (schema === 'Person') entityType = 'individual';
return {
id: `opensanctions:${hit.id}`,
name,
entityType,
countryCodes: countries.slice(0, 3),
programs: programs.slice(0, 3),
datasets: hit.datasets ?? [],
score: hit.score ?? 0,
};
}
export default async function handler(req) {
const ip = getClientIp(req);
if (rateLimiter.isRateLimited(ip)) {
return jsonResponse({ error: 'Too many requests' }, 429);
}
const { searchParams } = new URL(req.url);
const q = (searchParams.get('q') ?? '').trim();
if (!q || q.length < 2) {
return jsonResponse({ error: 'q must be at least 2 characters' }, 400);
}
if (q.length > 200) {
return jsonResponse({ error: 'q must be at most 200 characters' }, 400);
}
const limitRaw = Number(searchParams.get('limit') ?? '10');
const limit = Math.min(Number.isFinite(limitRaw) && limitRaw > 0 ? Math.trunc(limitRaw) : 10, MAX_RESULTS);
try {
const url = new URL(`${OPENSANCTIONS_BASE}/search/default`);
url.searchParams.set('q', q);
url.searchParams.set('limit', String(limit));
const resp = await fetch(url.toString(), {
headers: {
'User-Agent': 'WorldMonitor/1.0 sanctions-search',
Accept: 'application/json',
},
signal: AbortSignal.timeout(OPENSANCTIONS_TIMEOUT_MS),
});
if (!resp.ok) {
return jsonResponse({ results: [], total: 0, source: 'opensanctions', error: `upstream HTTP ${resp.status}` }, 200);
}
const data = await resp.json();
const results = (data.results ?? []).map(normalizeEntity);
return jsonResponse({
results,
total: data.total?.value ?? results.length,
source: 'opensanctions',
}, 200, { 'Cache-Control': 'no-store' });
} catch (err) {
const message = err instanceof Error ? err.message : String(err);
return jsonResponse({ results: [], total: 0, source: 'opensanctions', error: message }, 200);
}
}
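The sanctions-search handler clamps the caller-supplied `limit` before hitting OpenSanctions. The parsing is worth isolating because `Number('')` is `0` and `Number('abc')` is `NaN`, both of which must fall back to the default. A sketch of the same logic (`clampLimit` is an illustrative name; the constants match the file above):

```javascript
const MAX_RESULTS = 20;

// Mirrors the handler: non-numeric or non-positive input falls back to 10,
// fractional values are truncated, and everything is capped at MAX_RESULTS.
function clampLimit(raw, fallback = 10) {
  const n = Number(raw ?? String(fallback));
  return Math.min(Number.isFinite(n) && n > 0 ? Math.trunc(n) : fallback, MAX_RESULTS);
}
```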


@@ -1,49 +0,0 @@
import { getCorsHeaders, isDisallowedOrigin } from './_cors.js';
import { jsonResponse } from './_json-response.js';
import { readJsonFromUpstash } from './_upstash-json.js';
export const config = { runtime: 'edge' };
const REDIS_KEY = 'intelligence:satellites:tle:v1';
let cached = null;
let cachedAt = 0;
const CACHE_TTL = 600_000;
let negUntil = 0;
const NEG_TTL = 60_000;
async function fetchSatelliteData() {
const now = Date.now();
if (cached && now - cachedAt < CACHE_TTL) return cached;
if (now < negUntil) return null;
let data;
try { data = await readJsonFromUpstash(REDIS_KEY); } catch { data = null; }
if (!data) {
negUntil = now + NEG_TTL;
return null;
}
cached = data;
cachedAt = now;
return data;
}
export default async function handler(req) {
const corsHeaders = getCorsHeaders(req, 'GET, OPTIONS');
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: corsHeaders });
}
if (isDisallowedOrigin(req)) {
return jsonResponse({ error: 'Origin not allowed' }, 403, corsHeaders);
}
const data = await fetchSatelliteData();
if (!data) {
return jsonResponse({ error: 'Satellite data temporarily unavailable' }, 503, {
'Cache-Control': 'no-cache, no-store', ...corsHeaders,
});
}
return jsonResponse(data, 200, {
'Cache-Control': 's-maxage=3600, stale-while-revalidate=1800, stale-if-error=3600',
...corsHeaders,
});
}

api/scenario/v1/[rpc].ts Normal file

@@ -0,0 +1,9 @@
export const config = { runtime: 'edge' };
import { createDomainGateway, serverOptions } from '../../../server/gateway';
import { createScenarioServiceRoutes } from '../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { scenarioHandler } from '../../../server/worldmonitor/scenario/v1/handler';
export default createDomainGateway(
createScenarioServiceRoutes(scenarioHandler, serverOptions),
);


@@ -1,163 +1,8 @@
export const config = { runtime: 'edge' };
import { isCallerPremium } from '../../../server/_shared/premium-check';
import { getScenarioTemplate } from '../../../server/worldmonitor/supply-chain/v1/scenario-templates';
import gateway from './[rpc]';
import { rewriteToSebuf } from '../../../server/alias-rewrite';
const JOB_ID_CHARSET = 'abcdefghijklmnopqrstuvwxyz0123456789';
function generateJobId(): string {
const ts = Date.now();
let suffix = '';
const array = new Uint8Array(8);
crypto.getRandomValues(array);
for (const byte of array) suffix += JOB_ID_CHARSET[byte % JOB_ID_CHARSET.length];
return `scenario:${ts}:${suffix}`;
}
function getClientIp(req: Request): string {
return (
req.headers.get('cf-connecting-ip') ||
req.headers.get('x-real-ip') ||
req.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ||
'0.0.0.0'
);
}
export default async function handler(req: Request): Promise<Response> {
if (req.method !== 'POST') {
return new Response('', { status: 405 });
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { 'Content-Type': 'application/json' },
});
}
const url = process.env.UPSTASH_REDIS_REST_URL;
const token = process.env.UPSTASH_REDIS_REST_TOKEN;
if (!url || !token) {
return new Response(JSON.stringify({ error: 'Service temporarily unavailable' }), {
status: 503,
headers: { 'Content-Type': 'application/json' },
});
}
// Per-user rate limit: 10 scenario jobs per user per minute (fixed one-minute window via INCR+EXPIRE on a minute-bucketed key).
const identifier = getClientIp(req);
const minute = Math.floor(Date.now() / 60_000);
const rateLimitKey = `rate:scenario:${identifier}:${minute}`;
const rlResp = await fetch(`${url}/pipeline`, {
method: 'POST',
headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
body: JSON.stringify([
['INCR', rateLimitKey],
['EXPIRE', rateLimitKey, 60],
['LLEN', 'scenario-queue:pending'],
]),
signal: AbortSignal.timeout(5_000),
}).catch(() => null);
if (rlResp?.ok) {
const rlResults = (await rlResp.json()) as Array<{ result: number }>;
const count = rlResults[0]?.result ?? 0;
const queueDepth = rlResults[2]?.result ?? 0;
if (count > 10) {
return new Response(JSON.stringify({ error: 'Rate limit exceeded: 10 scenario jobs per minute' }), {
status: 429,
headers: {
'Content-Type': 'application/json',
'Retry-After': '60',
},
});
}
if (queueDepth > 100) {
return new Response(JSON.stringify({ error: 'Scenario queue is at capacity, please try again later' }), {
status: 429,
headers: {
'Content-Type': 'application/json',
'Retry-After': '30',
},
});
}
}
let body: Record<string, unknown>;
try {
body = await req.json();
} catch {
return new Response(JSON.stringify({ error: 'Invalid JSON body' }), {
status: 400,
headers: { 'Content-Type': 'application/json' },
});
}
const { scenarioId, iso2 } = body as { scenarioId?: string; iso2?: string };
if (!scenarioId || typeof scenarioId !== 'string') {
return new Response(JSON.stringify({ error: 'scenarioId is required' }), {
status: 400,
headers: { 'Content-Type': 'application/json' },
});
}
if (!getScenarioTemplate(scenarioId)) {
return new Response(JSON.stringify({ error: `Unknown scenario: ${scenarioId}` }), {
status: 400,
headers: { 'Content-Type': 'application/json' },
});
}
if (iso2 !== undefined && iso2 !== null && (typeof iso2 !== 'string' || !/^[A-Z]{2}$/.test(iso2))) {
return new Response(JSON.stringify({ error: 'iso2 must be a 2-letter uppercase country code' }), {
status: 400,
headers: { 'Content-Type': 'application/json' },
});
}
const jobId = generateJobId();
const payload = JSON.stringify({
jobId,
scenarioId,
iso2: iso2 ?? null,
enqueuedAt: Date.now(),
});
// Upstash REST command format: POST base URL with body `[CMD, ...args]`.
// The previous `/rpush/{key}` + `body: [payload]` form caused Upstash to
// store the literal array-string (`["{jobId:...}"]`) as the list value,
// which broke the scenario-worker's JSON.parse → destructure flow and
// made every job fail field validation (Railway log 2026-04-18: repeated
// `[scenario-worker] Job failed field validation, discarding: ["{...`).
// This command format matches `upstashLpush` in scripts/ais-relay.cjs.
const redisResp = await fetch(url, {
method: 'POST',
headers: {
Authorization: `Bearer ${token}`,
'Content-Type': 'application/json',
},
body: JSON.stringify(['RPUSH', 'scenario-queue:pending', payload]),
signal: AbortSignal.timeout(5_000),
});
if (!redisResp.ok) {
console.error('[scenario/run] Redis enqueue failed:', redisResp.status);
return new Response(JSON.stringify({ error: 'Failed to enqueue scenario job' }), {
status: 502,
headers: { 'Content-Type': 'application/json' },
});
}
return new Response(
JSON.stringify({ jobId, status: 'pending', statusUrl: `/api/scenario/v1/status?jobId=${jobId}` }),
{
status: 202,
headers: { 'Content-Type': 'application/json' },
},
);
}
// Alias for documented v1 URL. See server/alias-rewrite.ts.
export default (req: Request) =>
rewriteToSebuf(req, '/api/scenario/v1/run-scenario', gateway);
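The removed handler above rate-limits by bucketing requests into fixed one-minute windows keyed by IP (`rate:scenario:{ip}:{minute}`). A minimal in-memory sketch of the same window logic, with the Redis INCR/EXPIRE pair replaced by a `Map` purely for illustration:

```javascript
// Fixed-window limiter: key = identifier + current minute bucket. Each call
// increments the bucket; the call is limited once the count exceeds `limit`,
// mirroring the INCR result check (`count > 10`) in the handler above.
function makeFixedWindowLimiter(limit) {
  const counts = new Map();
  return function isLimited(identifier, nowMs = Date.now()) {
    const minute = Math.floor(nowMs / 60_000);
    const key = `${identifier}:${minute}`;
    const count = (counts.get(key) ?? 0) + 1;
    counts.set(key, count);
    return count > limit;
  };
}
```

The trade-off of a fixed window is a possible burst of up to 2x the limit across a minute boundary; in Redis the EXPIRE of 60s garbage-collects old buckets automatically.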


@@ -1,77 +1,8 @@
export const config = { runtime: 'edge' };
import { isCallerPremium } from '../../../server/_shared/premium-check';
import gateway from './[rpc]';
import { rewriteToSebuf } from '../../../server/alias-rewrite';
/** Matches jobIds produced by run.ts: "scenario:{timestamp}:{8-char-suffix}" */
const JOB_ID_RE = /^scenario:\d{13}:[a-z0-9]{8}$/;
export default async function handler(req: Request): Promise<Response> {
if (req.method !== 'GET') {
return new Response('', { status: 405 });
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { 'Content-Type': 'application/json' },
});
}
const { searchParams } = new URL(req.url);
const jobId = searchParams.get('jobId');
if (!jobId || !JOB_ID_RE.test(jobId)) {
return new Response(
JSON.stringify({ error: 'Invalid or missing jobId' }),
{ status: 400, headers: { 'Content-Type': 'application/json' } },
);
}
const url = process.env.UPSTASH_REDIS_REST_URL;
const token = process.env.UPSTASH_REDIS_REST_TOKEN;
if (!url || !token) {
return new Response(
JSON.stringify({ error: 'Service temporarily unavailable' }),
{ status: 503, headers: { 'Content-Type': 'application/json' } },
);
}
const resultKey = `scenario-result:${jobId}`;
const redisResp = await fetch(`${url}/get/${encodeURIComponent(resultKey)}`, {
headers: { Authorization: `Bearer ${token}` },
signal: AbortSignal.timeout(5_000),
});
if (!redisResp.ok) {
console.error('[scenario/status] Redis get failed:', redisResp.status);
return new Response(
JSON.stringify({ error: 'Failed to fetch job status' }),
{ status: 502, headers: { 'Content-Type': 'application/json' } },
);
}
const data = (await redisResp.json()) as { result?: string | null };
if (!data.result) {
return new Response(
JSON.stringify({ jobId, status: 'pending' }),
{ status: 200, headers: { 'Content-Type': 'application/json' } },
);
}
let parsed: unknown;
try {
parsed = JSON.parse(data.result);
} catch {
return new Response(
JSON.stringify({ error: 'Corrupted job result' }),
{ status: 500, headers: { 'Content-Type': 'application/json' } },
);
}
return new Response(
JSON.stringify(parsed),
{ status: 200, headers: { 'Content-Type': 'application/json' } },
);
}
// Alias for documented v1 URL. See server/alias-rewrite.ts.
export default (req: Request) =>
rewriteToSebuf(req, '/api/scenario/v1/get-scenario-status', gateway);


@@ -1,27 +1,8 @@
export const config = { runtime: 'edge' };
import { SCENARIO_TEMPLATES } from '../../../server/worldmonitor/supply-chain/v1/scenario-templates';
import gateway from './[rpc]';
import { rewriteToSebuf } from '../../../server/alias-rewrite';
export default async function handler(req: Request): Promise<Response> {
if (req.method !== 'GET') {
return new Response('', { status: 405 });
}
const templates = SCENARIO_TEMPLATES.map(t => ({
id: t.id,
name: t.name,
affectedChokepointIds: t.affectedChokepointIds,
disruptionPct: t.disruptionPct,
durationDays: t.durationDays,
affectedHs2: t.affectedHs2,
costShockMultiplier: t.costShockMultiplier,
}));
return new Response(JSON.stringify({ templates }), {
status: 200,
headers: {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=3600',
},
});
}
// Alias for documented v1 URL. See server/alias-rewrite.ts.
export default (req: Request) =>
rewriteToSebuf(req, '/api/scenario/v1/list-scenario-templates', gateway);


@@ -1,56 +1,8 @@
export const config = { runtime: 'edge' };
import { isCallerPremium } from '../../../server/_shared/premium-check';
// @ts-expect-error — JS module, no declaration file
import { readJsonFromUpstash } from '../../_upstash-json.js';
import gateway from './[rpc]';
import { rewriteToSebuf } from '../../../server/alias-rewrite';
export default async function handler(req: Request): Promise<Response> {
if (req.method !== 'GET') {
return new Response('', { status: 405 });
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(
JSON.stringify({ error: 'PRO subscription required' }),
{ status: 403, headers: { 'Content-Type': 'application/json' } },
);
}
const { searchParams } = new URL(req.url);
const iso2 = searchParams.get('iso2')?.toUpperCase();
if (!iso2 || !/^[A-Z]{2}$/.test(iso2)) {
return new Response(
JSON.stringify({ error: 'Invalid or missing iso2 parameter' }),
{ status: 400, headers: { 'Content-Type': 'application/json' } },
);
}
const key = `comtrade:bilateral-hs4:${iso2}:v1`;
try {
const data = await readJsonFromUpstash(key, 5_000);
if (!data) {
return new Response(
JSON.stringify({ iso2, products: [], fetchedAt: '' }),
{ status: 200, headers: { 'Content-Type': 'application/json', 'Cache-Control': 'no-store' } },
);
}
return new Response(
JSON.stringify(data),
{
status: 200,
headers: {
'Content-Type': 'application/json',
'Cache-Control': 'private, max-age=3600',
'Vary': 'Authorization, Cookie',
},
},
);
} catch {
return new Response(
JSON.stringify({ error: 'Failed to fetch product data' }),
{ status: 502, headers: { 'Content-Type': 'application/json' } },
);
}
}
// Alias for documented v1 URL. See server/alias-rewrite.ts.
export default (req: Request) =>
rewriteToSebuf(req, '/api/supply-chain/v1/get-country-products', gateway);


@@ -1,158 +1,8 @@
export const config = { runtime: 'edge' };
import { isCallerPremium } from '../../../server/_shared/premium-check';
import { CHOKEPOINT_REGISTRY } from '../../../server/_shared/chokepoint-registry';
import { CHOKEPOINT_STATUS_KEY } from '../../../server/_shared/cache-keys';
import {
aggregateAnnualImportsByHs2,
clampClosureDays,
computeMultiSectorShocks,
MULTI_SECTOR_HS2_LABELS,
SEEDED_HS2_CODES,
type MultiSectorCostShock,
type SeededProduct,
} from '../../../server/worldmonitor/supply-chain/v1/_multi-sector-shock';
// @ts-expect-error — JS module, no declaration file
import { readJsonFromUpstash } from '../../_upstash-json.js';
import gateway from './[rpc]';
import { rewriteToSebuf } from '../../../server/alias-rewrite';
interface ChokepointStatusCache {
chokepoints?: Array<{ id: string; warRiskTier?: string }>;
}
interface CountryProductsCache {
iso2: string;
products?: SeededProduct[];
fetchedAt?: string;
}
export interface MultiSectorCostShockResponse {
iso2: string;
chokepointId: string;
closureDays: number;
warRiskTier: string;
sectors: MultiSectorCostShock[];
totalAddedCost: number;
fetchedAt: string;
unavailableReason: string;
}
function emptyResponse(
iso2: string,
chokepointId: string,
closureDays: number,
reason = '',
): MultiSectorCostShockResponse {
return {
iso2,
chokepointId,
closureDays,
warRiskTier: 'WAR_RISK_TIER_UNSPECIFIED',
sectors: [],
totalAddedCost: 0,
fetchedAt: new Date().toISOString(),
unavailableReason: reason,
};
}
export default async function handler(req: Request): Promise<Response> {
if (req.method !== 'GET') {
return new Response('', { status: 405 });
}
const { searchParams } = new URL(req.url);
const iso2 = (searchParams.get('iso2') ?? '').toUpperCase();
const chokepointId = (searchParams.get('chokepointId') ?? '').trim().toLowerCase();
const rawDays = Number(searchParams.get('closureDays') ?? '30');
const closureDays = clampClosureDays(rawDays);
if (!/^[A-Z]{2}$/.test(iso2)) {
return new Response(
JSON.stringify({ error: 'Invalid or missing iso2 parameter' }),
{ status: 400, headers: { 'Content-Type': 'application/json' } },
);
}
if (!chokepointId) {
return new Response(
JSON.stringify({ error: 'Invalid or missing chokepointId parameter' }),
{ status: 400, headers: { 'Content-Type': 'application/json' } },
);
}
if (!CHOKEPOINT_REGISTRY.some(c => c.id === chokepointId)) {
return new Response(
JSON.stringify({ error: `Unknown chokepointId: ${chokepointId}` }),
{ status: 400, headers: { 'Content-Type': 'application/json' } },
);
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(
JSON.stringify({ error: 'PRO subscription required' }),
{ status: 403, headers: { 'Content-Type': 'application/json' } },
);
}
// Parallel Redis reads: country products + chokepoint status (for war risk tier).
const productsKey = `comtrade:bilateral-hs4:${iso2}:v1`;
const [productsCache, statusCache] = await Promise.all([
readJsonFromUpstash(productsKey, 5_000).catch(() => null) as Promise<CountryProductsCache | null>,
readJsonFromUpstash(CHOKEPOINT_STATUS_KEY, 5_000).catch(() => null) as Promise<ChokepointStatusCache | null>,
]);
const products = Array.isArray(productsCache?.products) ? productsCache.products : [];
const importsByHs2 = aggregateAnnualImportsByHs2(products);
const hasAnyImports = Object.values(importsByHs2).some(v => v > 0);
const warRiskTier = statusCache?.chokepoints?.find(c => c.id === chokepointId)?.warRiskTier
?? 'WAR_RISK_TIER_NORMAL';
if (!hasAnyImports) {
return new Response(
JSON.stringify({
...emptyResponse(iso2, chokepointId, closureDays, 'No seeded import data available for this country'),
// Still emit the empty sector skeleton so the UI can render rows at 0.
sectors: SEEDED_HS2_CODES.map(hs2 => ({
hs2,
hs2Label: MULTI_SECTOR_HS2_LABELS[hs2] ?? `HS ${hs2}`,
importValueAnnual: 0,
freightAddedPctPerTon: 0,
warRiskPremiumBps: 0,
addedTransitDays: 0,
totalCostShockPerDay: 0,
totalCostShock30Days: 0,
totalCostShock90Days: 0,
totalCostShock: 0,
closureDays,
})),
warRiskTier,
} satisfies MultiSectorCostShockResponse),
{ status: 200, headers: { 'Content-Type': 'application/json', 'Cache-Control': 'no-store' } },
);
}
const sectors = computeMultiSectorShocks(importsByHs2, chokepointId, warRiskTier, closureDays);
const totalAddedCost = sectors.reduce((sum, s) => sum + s.totalCostShock, 0);
const response: MultiSectorCostShockResponse = {
iso2,
chokepointId,
closureDays,
warRiskTier,
sectors,
totalAddedCost,
fetchedAt: new Date().toISOString(),
unavailableReason: '',
};
return new Response(
JSON.stringify(response),
{
status: 200,
headers: {
'Content-Type': 'application/json',
// Closure duration is user-controlled, so cache is private + short.
'Cache-Control': 'private, max-age=60',
'Vary': 'Authorization, Cookie, X-WorldMonitor-Key',
},
},
);
}
// Alias for documented v1 URL. See server/alias-rewrite.ts.
export default (req: Request) =>
rewriteToSebuf(req, '/api/supply-chain/v1/get-multi-sector-cost-shock', gateway);

api/v2/shipping/[rpc].ts Normal file

@@ -0,0 +1,9 @@
export const config = { runtime: 'edge' };
import { createDomainGateway, serverOptions } from '../../../server/gateway';
import { createShippingV2ServiceRoutes } from '../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import { shippingV2Handler } from '../../../server/worldmonitor/shipping/v2/handler';
export default createDomainGateway(
createShippingV2ServiceRoutes(shippingV2Handler, serverOptions),
);


@@ -1,158 +0,0 @@
/**
* GET /api/v2/shipping/route-intelligence
*
* Vendor-facing route intelligence API. Returns the primary trade route, chokepoint
* exposures, bypass options, war risk tier, and disruption score for a given
* country pair + cargo type.
*
* Authentication: X-WorldMonitor-Key required (forceKey: true). Browser origins
* are NOT exempt — this endpoint is designed for server-to-server integration.
*/
export const config = { runtime: 'edge' };
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../_api-key.js';
// @ts-expect-error — JS module, no declaration file
import { getCorsHeaders } from '../../_cors.js';
import { isCallerPremium } from '../../../server/_shared/premium-check';
import { getCachedJson } from '../../../server/_shared/redis';
import { CHOKEPOINT_STATUS_KEY } from '../../../server/_shared/cache-keys';
import { BYPASS_CORRIDORS_BY_CHOKEPOINT } from '../../../server/_shared/bypass-corridors';
import { CHOKEPOINT_REGISTRY } from '../../../server/_shared/chokepoint-registry';
import COUNTRY_PORT_CLUSTERS from '../../../scripts/shared/country-port-clusters.json';
interface PortClusterEntry {
nearestRouteIds: string[];
coastSide: string;
}
interface ChokepointStatus {
id: string;
name?: string;
disruptionScore?: number;
warRiskTier?: string;
}
interface ChokepointStatusResponse {
chokepoints?: ChokepointStatus[];
}
export default async function handler(req: Request): Promise<Response> {
const cors = getCorsHeaders(req);
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'GET') {
return new Response(JSON.stringify({ error: 'Method not allowed' }), { status: 405, headers: { ...cors, 'Content-Type': 'application/json' } });
}
let apiKeyResult = validateApiKey(req, { forceKey: true });
// Fallback: wm_ user keys are validated async via Convex, not in the static key list
const wmKey =
req.headers.get('X-WorldMonitor-Key') ??
req.headers.get('X-Api-Key') ??
'';
if (apiKeyResult.required && !apiKeyResult.valid && wmKey.startsWith('wm_')) {
const { validateUserApiKey } = await import('../../../server/_shared/user-api-key');
const userKeyResult = await validateUserApiKey(wmKey);
if (userKeyResult) {
apiKeyResult = { valid: true, required: true };
}
}
if (apiKeyResult.required && !apiKeyResult.valid) {
return new Response(JSON.stringify({ error: apiKeyResult.error ?? 'API key required' }), {
status: 401,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const { searchParams } = new URL(req.url);
const fromIso2 = searchParams.get('fromIso2')?.trim().toUpperCase() ?? '';
const toIso2 = searchParams.get('toIso2')?.trim().toUpperCase() ?? '';
const cargoType = (searchParams.get('cargoType')?.trim().toLowerCase() ?? 'container') as 'container' | 'tanker' | 'bulk' | 'roro';
const hs2 = searchParams.get('hs2')?.trim().replace(/\D/g, '') || '27';
if (!/^[A-Z]{2}$/.test(fromIso2) || !/^[A-Z]{2}$/.test(toIso2)) {
return new Response(JSON.stringify({ error: 'fromIso2 and toIso2 must be valid 2-letter ISO country codes' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const clusters = COUNTRY_PORT_CLUSTERS as unknown as Record<string, PortClusterEntry>;
const fromCluster = clusters[fromIso2];
const toCluster = clusters[toIso2];
const fromRoutes = new Set(fromCluster?.nearestRouteIds ?? []);
const toRoutes = new Set(toCluster?.nearestRouteIds ?? []);
const sharedRoutes = [...fromRoutes].filter(r => toRoutes.has(r));
const primaryRouteId = sharedRoutes[0] ?? fromCluster?.nearestRouteIds[0] ?? '';
// Load live chokepoint data
const statusRaw = await getCachedJson(CHOKEPOINT_STATUS_KEY).catch(() => null) as ChokepointStatusResponse | null;
const statusMap = new Map<string, ChokepointStatus>(
(statusRaw?.chokepoints ?? []).map(cp => [cp.id, cp])
);
// Find chokepoints on the primary route and shared routes
const relevantRouteSet = new Set(sharedRoutes.length ? sharedRoutes : (fromCluster?.nearestRouteIds ?? []));
const chokepointExposures = CHOKEPOINT_REGISTRY
.filter(cp => cp.routeIds.some(r => relevantRouteSet.has(r)))
.map(cp => {
const overlap = cp.routeIds.filter(r => relevantRouteSet.has(r)).length;
const exposurePct = Math.round((overlap / Math.max(cp.routeIds.length, 1)) * 100);
return { chokepointId: cp.id, chokepointName: cp.displayName, exposurePct };
})
.filter(e => e.exposurePct > 0)
.sort((a, b) => b.exposurePct - a.exposurePct);
const primaryChokepoint = chokepointExposures[0];
const primaryCpStatus = primaryChokepoint ? statusMap.get(primaryChokepoint.chokepointId) : null;
const disruptionScore = primaryCpStatus?.disruptionScore ?? 0;
const warRiskTier = primaryCpStatus?.warRiskTier ?? 'WAR_RISK_TIER_NORMAL';
// Bypass options for the primary chokepoint
const corridors = primaryChokepoint
? (BYPASS_CORRIDORS_BY_CHOKEPOINT[primaryChokepoint.chokepointId] ?? [])
.filter(c => c.suitableCargoTypes.length === 0 || c.suitableCargoTypes.includes(cargoType))
.slice(0, 5)
.map(c => ({
id: c.id,
name: c.name,
type: c.type,
addedTransitDays: c.addedTransitDays,
addedCostMultiplier: c.addedCostMultiplier,
activationThreshold: c.activationThreshold,
}))
: [];
const body = {
fromIso2,
toIso2,
cargoType,
hs2,
primaryRouteId,
chokepointExposures,
bypassOptions: corridors,
warRiskTier,
disruptionScore,
fetchedAt: new Date().toISOString(),
};
return new Response(JSON.stringify(body), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json', 'Cache-Control': 'public, max-age=60, stale-while-revalidate=120' },
});
}


@@ -1,356 +0,0 @@
/**
* POST /api/v2/shipping/webhooks — Register a webhook for chokepoint disruption alerts.
* GET /api/v2/shipping/webhooks — List webhooks for the authenticated caller.
*
* Payload: { callbackUrl, chokepointIds[], alertThreshold }
* Response: { subscriberId, secret }
*
* Security:
* - X-WorldMonitor-Key required (forceKey: true)
* - SSRF prevention: callbackUrl hostname is validated against private IP ranges.
* LIMITATION: DNS rebinding is not mitigated in the edge runtime (no DNS resolution
* at registration time). The delivery worker MUST resolve the URL before sending and
* re-check it against PRIVATE_HOSTNAME_PATTERNS. HTTPS-only is required to limit
* exposure (TLS certs cannot be issued for private IPs via public CAs).
* - HMAC signatures: webhook deliveries include X-WM-Signature: sha256=<HMAC-SHA256(payload, secret)>
* - Ownership: SHA-256 of the caller's API key is stored as ownerTag; an owner index (Redis Set)
* enables list queries without a full scan.
*/
export const config = { runtime: 'edge' };
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../_api-key.js';
// @ts-expect-error — JS module, no declaration file
import { getCorsHeaders } from '../../_cors.js';
import { isCallerPremium } from '../../../server/_shared/premium-check';
import { getCachedJson, setCachedJson, runRedisPipeline } from '../../../server/_shared/redis';
import { CHOKEPOINT_REGISTRY } from '../../../server/_shared/chokepoint-registry';
const WEBHOOK_TTL = 86400 * 30; // 30 days
const VALID_CHOKEPOINT_IDS = new Set(CHOKEPOINT_REGISTRY.map(c => c.id));
// Private IP ranges + known cloud metadata hostnames blocked at registration.
// NOTE: DNS rebinding bypass is not mitigated here (no DNS resolution in edge runtime).
// The delivery worker must re-validate the resolved IP before sending.
const PRIVATE_HOSTNAME_PATTERNS = [
/^localhost$/i,
/^127\.\d+\.\d+\.\d+$/,
/^10\.\d+\.\d+\.\d+$/,
/^192\.168\.\d+\.\d+$/,
/^172\.(1[6-9]|2\d|3[01])\.\d+\.\d+$/,
/^169\.254\.\d+\.\d+$/, // link-local + AWS/GCP/Azure IMDS
/^fd[0-9a-f]{2}:/i, // IPv6 ULA (fd00::/8)
/^fe80:/i, // IPv6 link-local
/^::1$/, // IPv6 loopback
/^0\.0\.0\.0$/,
/^0\.\d+\.\d+\.\d+$/, // RFC 1122 "this network"
/^100\.(6[4-9]|[7-9]\d|1[01]\d|12[0-7])\.\d+\.\d+$/, // RFC 6598 shared address
];
// Known cloud metadata endpoints that must be blocked explicitly even if the
// IP regex above misses a future alias or IPv6 variant.
const BLOCKED_METADATA_HOSTNAMES = new Set([
'169.254.169.254', // AWS/Azure/GCP IMDS (IPv4)
'metadata.google.internal', // GCP metadata server
'metadata.internal', // GCP alternative alias
'instance-data', // OpenStack metadata
'metadata', // generic cloud metadata alias
'computemetadata', // GCP legacy
'link-local.s3.amazonaws.com',
]);
function isBlockedCallbackUrl(rawUrl: string): string | null {
let parsed: URL;
try {
parsed = new URL(rawUrl);
} catch {
return 'callbackUrl is not a valid URL';
}
// HTTPS is required — TLS certs cannot be issued for private IPs via public CAs,
// which prevents the most common DNS-rebinding variant in practice.
if (parsed.protocol !== 'https:') {
return 'callbackUrl must use https';
}
const hostname = parsed.hostname.toLowerCase();
if (BLOCKED_METADATA_HOSTNAMES.has(hostname)) {
return 'callbackUrl hostname is a blocked metadata endpoint';
}
for (const pattern of PRIVATE_HOSTNAME_PATTERNS) {
if (pattern.test(hostname)) {
return `callbackUrl resolves to a private/reserved address: ${hostname}`;
}
}
return null;
}
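`isBlockedCallbackUrl` only screens the literal hostname, and the comments above require the delivery worker to resolve DNS and re-check the resulting address before every send. A sketch of the IP-side half of that check, assuming a Node-based worker that first resolves the hostname via `dns.promises.lookup` (`isPrivateIpv4` is hypothetical, not an export of this repo):

```typescript
// Hypothetical worker-side check: after resolving callbackUrl's hostname to an
// IPv4 address, reject private/reserved ranges before delivering. Mirrors the
// registration-time PRIVATE_HOSTNAME_PATTERNS, but operating on the *resolved*
// address is what actually defeats DNS rebinding.
function isPrivateIpv4(ip: string): boolean {
  const octets = ip.split('.').map(Number);
  if (octets.length !== 4 || octets.some(o => Number.isNaN(o) || o < 0 || o > 255)) {
    return true; // not a clean IPv4 literal: fail closed
  }
  const [a, b] = octets;
  return (
    a === 0 ||                              // RFC 1122 "this network"
    a === 10 ||                             // RFC 1918
    a === 127 ||                            // loopback
    (a === 100 && b >= 64 && b <= 127) ||   // RFC 6598 shared address space
    (a === 169 && b === 254) ||             // link-local + cloud IMDS
    (a === 172 && b >= 16 && b <= 31) ||    // RFC 1918
    (a === 192 && b === 168)                // RFC 1918
  );
}
```

IPv6 results would need the equivalent checks for `::1`, `fe80::/10`, and `fd00::/8` from the pattern list above.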
async function generateSecret(): Promise<string> {
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
return [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}
function generateSubscriberId(): string {
const bytes = new Uint8Array(12);
crypto.getRandomValues(bytes);
return 'wh_' + [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}
function webhookKey(subscriberId: string): string {
return `webhook:sub:${subscriberId}:v1`;
}
function ownerIndexKey(ownerHash: string): string {
return `webhook:owner:${ownerHash}:v1`;
}
/** SHA-256 hash of the caller's API key — used as ownerTag and owner index key. Never secret. */
async function callerFingerprint(req: Request): Promise<string> {
const key =
req.headers.get('X-WorldMonitor-Key') ??
req.headers.get('X-Api-Key') ??
'';
if (!key) return 'anon';
const encoded = new TextEncoder().encode(key);
const hashBuffer = await crypto.subtle.digest('SHA-256', encoded);
return Array.from(new Uint8Array(hashBuffer)).map(b => b.toString(16).padStart(2, '0')).join('');
}
interface WebhookRecord {
subscriberId: string;
ownerTag: string; // SHA-256 hash of the registrant's API key for ownership checks
callbackUrl: string;
chokepointIds: string[];
alertThreshold: number;
createdAt: string;
active: boolean;
// secret is persisted so delivery workers can sign payloads via HMAC-SHA256.
// Stored in trusted Redis; rotated via /rotate-secret.
secret: string;
}
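Since the persisted `secret` exists solely so deliveries can be signed, a receiving service would verify `X-WM-Signature` roughly as follows. This is a sketch assuming a Node receiver and the signature format stated in the header comment (`sha256=` + hex HMAC-SHA256 of the raw payload); it is not code from this repo:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Recompute the HMAC over the raw (unparsed) payload bytes and compare in
// constant time. Parsing and re-serializing JSON before signing would break
// verification, so always hash the body exactly as received.
function verifyWebhookSignature(rawPayload: string, header: string, secret: string): boolean {
  const expected = 'sha256=' + createHmac('sha256', secret).update(rawPayload).digest('hex');
  const a = Buffer.from(header);
  const b = Buffer.from(expected);
  return a.length === b.length && timingSafeEqual(a, b);
}
```

After `/rotate-secret`, receivers should accept the new secret immediately; signatures made with the old secret will simply fail this check.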
export default async function handler(req: Request): Promise<Response> {
const cors = getCorsHeaders(req);
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
const apiKeyResult = validateApiKey(req, { forceKey: true });
if (apiKeyResult.required && !apiKeyResult.valid) {
return new Response(JSON.stringify({ error: apiKeyResult.error ?? 'API key required' }), {
status: 401,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const url = new URL(req.url);
const pathParts = url.pathname.replace(/\/+$/, '').split('/');
// Find the wh_* segment anywhere in the path (handles /webhooks/wh_xxx/action)
const whIndex = pathParts.findIndex(p => p.startsWith('wh_'));
const subscriberId = whIndex !== -1 ? pathParts[whIndex] : null;
// Action is the segment after the wh_* segment, if present
const action = whIndex !== -1 ? (pathParts[whIndex + 1] ?? null) : null;
// POST /api/v2/shipping/webhooks — Register new webhook
if (req.method === 'POST' && !subscriberId) {
let body: { callbackUrl?: string; chokepointIds?: string[]; alertThreshold?: number };
try {
body = await req.json() as typeof body;
} catch {
return new Response(JSON.stringify({ error: 'Request body must be valid JSON' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const { callbackUrl, chokepointIds = [], alertThreshold = 50 } = body;
if (!callbackUrl) {
return new Response(JSON.stringify({ error: 'callbackUrl is required' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const ssrfError = isBlockedCallbackUrl(callbackUrl);
if (ssrfError) {
return new Response(JSON.stringify({ error: ssrfError }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const invalidCp = chokepointIds.find(id => !VALID_CHOKEPOINT_IDS.has(id));
if (invalidCp) {
return new Response(JSON.stringify({ error: `Unknown chokepoint ID: ${invalidCp}` }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
if (typeof alertThreshold !== 'number' || alertThreshold < 0 || alertThreshold > 100) {
return new Response(JSON.stringify({ error: 'alertThreshold must be a number between 0 and 100' }), {
status: 400,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const ownerTag = await callerFingerprint(req);
const newSubscriberId = generateSubscriberId();
const secret = await generateSecret();
const record: WebhookRecord = {
subscriberId: newSubscriberId,
ownerTag,
callbackUrl,
chokepointIds: chokepointIds.length ? chokepointIds : [...VALID_CHOKEPOINT_IDS],
alertThreshold,
createdAt: new Date().toISOString(),
active: true,
secret, // persisted so delivery workers can compute HMAC signatures
};
// Persist record + update owner index (Redis Set) atomically via pipeline.
// raw = false so all keys are prefixed consistently with getCachedJson reads.
await runRedisPipeline([
['SET', webhookKey(newSubscriberId), JSON.stringify(record), 'EX', String(WEBHOOK_TTL)],
['SADD', ownerIndexKey(ownerTag), newSubscriberId],
['EXPIRE', ownerIndexKey(ownerTag), String(WEBHOOK_TTL)],
]);
return new Response(JSON.stringify({ subscriberId: newSubscriberId, secret }), {
status: 201,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
// Helper: load record + verify ownership in one place
async function loadOwned(subId: string): Promise<WebhookRecord | 'not_found' | 'forbidden'> {
const record = await getCachedJson(webhookKey(subId)).catch(() => null) as WebhookRecord | null;
if (!record) return 'not_found';
const ownerHash = await callerFingerprint(req);
if (record.ownerTag !== ownerHash) return 'forbidden';
return record;
}
// GET /api/v2/shipping/webhooks — List caller's webhooks
if (req.method === 'GET' && !subscriberId) {
const ownerHash = await callerFingerprint(req);
const smembersResult = await runRedisPipeline([['SMEMBERS', ownerIndexKey(ownerHash)]]);
const memberIds = (smembersResult[0]?.result as string[] | null) ?? [];
if (memberIds.length === 0) {
return new Response(JSON.stringify({ webhooks: [] }), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const getResults = await runRedisPipeline(memberIds.map(id => ['GET', webhookKey(id)]));
const webhooks = getResults
.map((r) => {
if (!r.result || typeof r.result !== 'string') return null;
try {
const record = JSON.parse(r.result) as WebhookRecord;
if (record.ownerTag !== ownerHash) return null; // defensive ownership check
return {
subscriberId: record.subscriberId,
callbackUrl: record.callbackUrl,
chokepointIds: record.chokepointIds,
alertThreshold: record.alertThreshold,
createdAt: record.createdAt,
active: record.active,
};
} catch {
return null;
}
})
.filter(Boolean);
return new Response(JSON.stringify({ webhooks }), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
// GET /api/v2/shipping/webhooks/{subscriberId} — Status check
if (req.method === 'GET' && subscriberId && !action) {
const result = await loadOwned(subscriberId);
if (result === 'not_found') {
return new Response(JSON.stringify({ error: 'Webhook not found' }), { status: 404, headers: { ...cors, 'Content-Type': 'application/json' } });
}
if (result === 'forbidden') {
return new Response(JSON.stringify({ error: 'Forbidden' }), { status: 403, headers: { ...cors, 'Content-Type': 'application/json' } });
}
return new Response(JSON.stringify({
subscriberId: result.subscriberId,
callbackUrl: result.callbackUrl,
chokepointIds: result.chokepointIds,
alertThreshold: result.alertThreshold,
createdAt: result.createdAt,
active: result.active,
// secret is intentionally omitted from status responses
}), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
// POST /api/v2/shipping/webhooks/{subscriberId}/rotate-secret
if (req.method === 'POST' && subscriberId && action === 'rotate-secret') {
const result = await loadOwned(subscriberId);
if (result === 'not_found') {
return new Response(JSON.stringify({ error: 'Webhook not found' }), { status: 404, headers: { ...cors, 'Content-Type': 'application/json' } });
}
if (result === 'forbidden') {
return new Response(JSON.stringify({ error: 'Forbidden' }), { status: 403, headers: { ...cors, 'Content-Type': 'application/json' } });
}
const newSecret = await generateSecret();
await setCachedJson(webhookKey(subscriberId), { ...result, secret: newSecret }, WEBHOOK_TTL);
return new Response(JSON.stringify({ subscriberId, secret: newSecret, rotatedAt: new Date().toISOString() }), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
// POST /api/v2/shipping/webhooks/{subscriberId}/reactivate
if (req.method === 'POST' && subscriberId && action === 'reactivate') {
const result = await loadOwned(subscriberId);
if (result === 'not_found') {
return new Response(JSON.stringify({ error: 'Webhook not found' }), { status: 404, headers: { ...cors, 'Content-Type': 'application/json' } });
}
if (result === 'forbidden') {
return new Response(JSON.stringify({ error: 'Forbidden' }), { status: 403, headers: { ...cors, 'Content-Type': 'application/json' } });
}
await setCachedJson(webhookKey(subscriberId), { ...result, active: true }, WEBHOOK_TTL);
return new Response(JSON.stringify({ subscriberId, active: true }), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
return new Response(JSON.stringify({ error: 'Not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}


@@ -0,0 +1,89 @@
/**
* GET /api/v2/shipping/webhooks/{subscriberId} — Status read for a single
* webhook. Preserved on the legacy path-param URL shape because sebuf does
* not currently support path-parameter RPC paths; tracked for eventual
* migration under #3207.
*/
export const config = { runtime: 'edge' };
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../_api-key.js';
// @ts-expect-error — JS module, no declaration file
import { getCorsHeaders } from '../../../_cors.js';
import { isCallerPremium } from '../../../../server/_shared/premium-check';
import { getCachedJson } from '../../../../server/_shared/redis';
import {
webhookKey,
callerFingerprint,
type WebhookRecord,
} from '../../../../server/worldmonitor/shipping/v2/webhook-shared';
export default async function handler(req: Request): Promise<Response> {
const cors = getCorsHeaders(req);
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'GET') {
return new Response(JSON.stringify({ error: 'Method not allowed' }), {
status: 405,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const apiKeyResult = validateApiKey(req, { forceKey: true });
if (apiKeyResult.required && !apiKeyResult.valid) {
return new Response(JSON.stringify({ error: apiKeyResult.error ?? 'API key required' }), {
status: 401,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const url = new URL(req.url);
const parts = url.pathname.replace(/\/+$/, '').split('/');
const subscriberId = parts[parts.length - 1];
if (!subscriberId || !subscriberId.startsWith('wh_')) {
return new Response(JSON.stringify({ error: 'Webhook not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const record = (await getCachedJson(webhookKey(subscriberId)).catch(() => null)) as WebhookRecord | null;
if (!record) {
return new Response(JSON.stringify({ error: 'Webhook not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const ownerHash = await callerFingerprint(req);
if (record.ownerTag !== ownerHash) {
return new Response(JSON.stringify({ error: 'Forbidden' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
return new Response(
JSON.stringify({
subscriberId: record.subscriberId,
callbackUrl: record.callbackUrl,
chokepointIds: record.chokepointIds,
alertThreshold: record.alertThreshold,
createdAt: record.createdAt,
active: record.active,
}),
{ status: 200, headers: { ...cors, 'Content-Type': 'application/json' } },
);
}


@@ -0,0 +1,105 @@
/**
* POST /api/v2/shipping/webhooks/{subscriberId}/rotate-secret
* POST /api/v2/shipping/webhooks/{subscriberId}/reactivate
*
* Preserved on the legacy path-param URL shape because sebuf does not
* currently support path-parameter RPC paths; tracked for eventual
* migration under #3207.
*/
export const config = { runtime: 'edge' };
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../../_api-key.js';
// @ts-expect-error — JS module, no declaration file
import { getCorsHeaders } from '../../../../_cors.js';
import { isCallerPremium } from '../../../../../server/_shared/premium-check';
import { getCachedJson, setCachedJson } from '../../../../../server/_shared/redis';
import {
WEBHOOK_TTL,
webhookKey,
callerFingerprint,
generateSecret,
type WebhookRecord,
} from '../../../../../server/worldmonitor/shipping/v2/webhook-shared';
export default async function handler(req: Request): Promise<Response> {
const cors = getCorsHeaders(req);
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'POST') {
return new Response(JSON.stringify({ error: 'Method not allowed' }), {
status: 405,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const apiKeyResult = validateApiKey(req, { forceKey: true });
if (apiKeyResult.required && !apiKeyResult.valid) {
return new Response(JSON.stringify({ error: apiKeyResult.error ?? 'API key required' }), {
status: 401,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const isPro = await isCallerPremium(req);
if (!isPro) {
return new Response(JSON.stringify({ error: 'PRO subscription required' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const url = new URL(req.url);
const parts = url.pathname.replace(/\/+$/, '').split('/');
const action = parts[parts.length - 1];
const subscriberId = parts[parts.length - 2];
if (!subscriberId || !subscriberId.startsWith('wh_')) {
return new Response(JSON.stringify({ error: 'Webhook not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
if (action !== 'rotate-secret' && action !== 'reactivate') {
return new Response(JSON.stringify({ error: 'Not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const record = (await getCachedJson(webhookKey(subscriberId)).catch(() => null)) as WebhookRecord | null;
if (!record) {
return new Response(JSON.stringify({ error: 'Webhook not found' }), {
status: 404,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
const ownerHash = await callerFingerprint(req);
if (record.ownerTag !== ownerHash) {
return new Response(JSON.stringify({ error: 'Forbidden' }), {
status: 403,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}
if (action === 'rotate-secret') {
const newSecret = await generateSecret();
await setCachedJson(webhookKey(subscriberId), { ...record, secret: newSecret }, WEBHOOK_TTL);
return new Response(
JSON.stringify({ subscriberId, secret: newSecret, rotatedAt: new Date().toISOString() }),
{ status: 200, headers: { ...cors, 'Content-Type': 'application/json' } },
);
}
// action === 'reactivate'
await setCachedJson(webhookKey(subscriberId), { ...record, active: true }, WEBHOOK_TTL);
return new Response(JSON.stringify({ subscriberId, active: true }), {
status: 200,
headers: { ...cors, 'Content-Type': 'application/json' },
});
}


@@ -2,11 +2,13 @@
title: "Adding API Endpoints"
description: "All JSON API endpoints in World Monitor must use sebuf. This guide walks through adding a new RPC to an existing service and adding an entirely new service."
---
All JSON API endpoints in World Monitor **must** use sebuf. Do not create standalone `api/*.js` files — the legacy pattern is deprecated and being removed.
All JSON API endpoints in World Monitor **must** use sebuf. Do not create standalone `api/*.js` or `api/*.ts` files for new data APIs — the legacy pattern is deprecated and being removed.
This guide walks through adding a new RPC to an existing service and adding an entirely new service.
> **Important:** After modifying any `.proto` file, you **must** run `make generate` before building or pushing. The generated TypeScript files in `src/generated/` are checked into the repo and must stay in sync with the proto definitions. CI does not run generation yet — this is your responsibility until we add it to the pipeline (see [#200](https://github.com/koala73/worldmonitor/issues/200)).
> **Enforcement:** `npm run lint:api-contract` runs in CI (see `.github/workflows/lint-code.yml`). It walks every file under `api/`, pairs each sebuf gateway (`api/<domain>/v<N>/[rpc].ts`) with a generated service under `src/generated/server/worldmonitor/`, and rejects any file that is neither a gateway nor listed in `api/api-route-exceptions.json`. The manifest is the only escape hatch for endpoints that genuinely cannot be proto — OAuth callbacks, binary responses, upstream proxies, operator plumbing — and every entry is pinned to @SebastienMelki via `.github/CODEOWNERS`. Expect reviewer pushback on new entries.
>
> **Generation freshness:** After modifying any `.proto` file, run `make generate` before pushing. The generated TypeScript in `src/generated/` is checked in and must stay in sync; `.github/workflows/proto-check.yml` fails the PR if it drifts.
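The exact schema of `api/api-route-exceptions.json` is not shown in this diff, but an entry along these lines would satisfy the stated requirements. Path, issue number, and field layout are illustrative only:

```json
{
  "path": "api/stripe-webhook.ts",
  "category": "migration-pending",
  "reason": "Signature verification needs the raw request body, which the sebuf gateway does not expose.",
  "owner": "@SebastienMelki",
  "removal_issue": "https://github.com/koala73/worldmonitor/issues/0000"
}
```

Note that `removal_issue` is mandatory for the temporary categories (`deferred`, `migration-pending`) and forbidden for permanent ones.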
## Prerequisites


@@ -112,6 +112,6 @@ No `referrals` count or `rewardMonths` is returned today — Dodo's `affonso_ref
## Waitlist
### `POST /api/register-interest`
### `POST /api/leads/v1/register-interest`
Captures an email into the Convex waitlist table. Turnstile-verified, rate-limited per IP. See [Platform endpoints](/api-platform) for the request shape.
Captures an email into the Convex waitlist table. Turnstile-verified (desktop sources bypass the check), rate-limited per IP. Part of `LeadsService`; see [Platform endpoints](/api-platform) for the request shape.


@@ -146,10 +146,10 @@ Redirects to the matching asset on the latest GitHub release of `koala73/worldmo
Caches the 302 for 5 minutes (`s-maxage=300`, `stale-while-revalidate=60`, `stale-if-error=600`).
### `POST /api/contact`
### `POST /api/leads/v1/submit-contact`
Public contact form. Turnstile-verified, rate-limited per IP.
Public enterprise contact form. Turnstile-verified, rate-limited per IP. Part of `LeadsService`.
### `POST /api/register-interest`
### `POST /api/leads/v1/register-interest`
Captures email for desktop-app early-access waitlist. Writes to Convex.
Captures email for Pro-waitlist signup. Writes to Convex and sends a confirmation email. Part of `LeadsService`.


@@ -18,14 +18,9 @@ Most proxies are intended for **our own dashboard** and are lightly gated. They
| Endpoint | Upstream | Purpose |
|----------|----------|---------|
| `GET /api/eia/[...path]` | EIA v2 (energy.gov) | Catch-all proxy for US Energy Information Administration series. Path segments are appended to `https://api.eia.gov/v2/...`. |
| `GET /api/opensky` | OpenSky Network | Live ADS-B state vectors. Used by the flights layer. |
| `GET /api/polymarket` | Polymarket gamma-api | Active event contracts. |
| `GET /api/satellites` | CelesTrak | LEO/GEO satellite orbital elements. |
| `GET /api/military-flights` | ADS-B Exchange / adsb.lol | Identified military aircraft. |
| `GET /api/ais-snapshot` | Internal AIS seed | Snapshot of latest vessel positions for the currently-loaded bbox. |
| `GET /api/gpsjam` | gpsjam.org | GPS interference hotspot reports. |
| `GET /api/sanctions-entity-search?q=...` | OFAC SDN | Fuzzy-match sanctions entity search. |
| `GET /api/oref-alerts` | OREF (Israel Home Front Command) | Tzeva Adom rocket alert mirror. |
| `GET /api/supply-chain/hormuz-tracker` | Internal AIS + registry | Real-time Hormuz transit dashboard data. |
@@ -40,14 +35,6 @@ All proxies:
Fetches an RSS/Atom feed and returns the parsed JSON. The URL must match one of the patterns in `_rss-allowed-domains.js` — arbitrary URLs are refused to prevent SSRF.
### `GET /api/enrichment/company?domain=<host>`
Returns company metadata (name, logo, industry, HQ country) for a website domain. Composite of public sources.
### `GET /api/enrichment/signals?domain=<host>`
Returns trust and risk signals (TLS grade, DNS age, WHOIS country, threat-list membership) for a domain.
## Skills registry
### `GET /api/skills/fetch-agentskills`


@@ -6,12 +6,24 @@ description: "Run pre-defined supply-chain disruption scenarios against any coun
The **scenarios** API is a PRO-only, job-queued surface on top of the WorldMonitor chokepoint + trade dataset. Callers enqueue a named scenario template against an optional country, then poll a job-id until the worker completes.
<Info>
This service is documented inline (not yet proto-backed). Proto migration is tracked in [issue #3207](https://github.com/koala73/worldmonitor/issues/3207) and will replace this page with auto-generated reference.
This service is proto-backed — see `proto/worldmonitor/scenario/v1/service.proto`. Auto-generated reference will replace this page once the scenario service is included in the published OpenAPI bundle.
</Info>
<Note>
**Legacy v1 URL aliases** — the sebuf migration (#3207) renamed the three v1 endpoints to align with the proto RPC names. The old URLs are preserved as thin aliases so existing integrations keep working:
| Legacy URL | Canonical URL |
|---|---|
| `POST /api/scenario/v1/run` | `POST /api/scenario/v1/run-scenario` |
| `GET /api/scenario/v1/status` | `GET /api/scenario/v1/get-scenario-status` |
| `GET /api/scenario/v1/templates` | `GET /api/scenario/v1/list-scenario-templates` |
Prefer the canonical URLs in new code — the aliases will retire at the next v1→v2 break (tracked in [#3282](https://github.com/koala73/worldmonitor/issues/3282)).
</Note>
## List templates
### `GET /api/scenario/v1/templates`
### `GET /api/scenario/v1/list-scenario-templates`
Returns the catalog of pre-defined scenario templates. Cached `public, max-age=3600`.
@@ -32,18 +44,18 @@ Returns the catalog of pre-defined scenario templates. Cached `public, max-age=3
}
```
Other shipped templates at the time of writing: `taiwan-strait-full-closure`, `suez-bab-simultaneous`, `panama-drought-50pct`, `russia-baltic-grain-suspension`, `us-tariff-escalation-electronics`. Use the live `/templates` response as the source of truth — the set grows over time.
Other shipped templates at the time of writing: `taiwan-strait-full-closure`, `suez-bab-simultaneous`, `panama-drought-50pct`, `russia-baltic-grain-suspension`, `us-tariff-escalation-electronics`. Use the live `/list-scenario-templates` response as the source of truth — the set grows over time. `affectedHs2: []` on the wire means the scenario affects ALL sectors (the registry's `null` sentinel, which `repeated string` cannot carry directly).
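On the client side, the sentinel decoding described above is a one-liner. Sketch only; the function name is ours, but the convention follows the wire shape quoted in this doc:

```typescript
// Decode the wire sentinel: an empty affectedHs2 list means "all sectors".
// Restores the registry's null convention that repeated string cannot carry.
function decodeAffectedHs2(affectedHs2: string[]): string[] | null {
  return affectedHs2.length === 0 ? null : affectedHs2;
}
```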
## Run a scenario
### `POST /api/scenario/v1/run`
### `POST /api/scenario/v1/run-scenario`
Enqueues a job. Returns `202 Accepted` with a `jobId` the caller must poll.
Enqueues a job. Returns the assigned `jobId` the caller must poll.
- **Auth**: PRO entitlement required. Granted by either (a) a valid `X-WorldMonitor-Key` (env key from `WORLDMONITOR_VALID_KEYS`, or a user-owned `wm_`-prefixed key whose owner has the `apiAccess` entitlement), **or** (b) a Clerk bearer token whose user has role `pro` or Dodo entitlement tier ≥ 1. A trusted browser Origin alone is **not** sufficient — `isCallerPremium()` in `server/_shared/premium-check.ts` only counts explicit credentials. Browser calls work because `premiumFetch()` (`src/services/premium-fetch.ts`) injects one of the two credential forms on the caller's behalf.
- **Rate limits**:
- 10 jobs / minute / IP (enforced at the gateway via `ENDPOINT_RATE_POLICIES` in `server/_shared/rate-limit.ts`)
- Global queue capped at 100 in-flight jobs; excess rejected with `429`
**Request**:
```json
{
  "scenarioId": "taiwan-strait-full-closure",
  "iso2": "JP"
}
```
- `scenarioId` — id from `/list-scenario-templates`. Required.
- `iso2` — optional ISO-3166-1 alpha-2 (uppercase). Scopes the scenario to one country. Empty string = scope-all.
**Response (`200`)**:
```json
{
"jobId": "scenario:1713456789012:a1b2c3d4",
"status": "pending",
"statusUrl": "/api/scenario/v1/get-scenario-status?jobId=scenario%3A1713456789012%3Aa1b2c3d4"
}
```
- `statusUrl` — server-computed convenience URL. Callers that don't want to hardcode the status path can follow this directly (it URL-encodes the `jobId`).
<Warning>
**Wire-contract change (v1 → v1)** — the pre-sebuf-migration endpoint returned `202 Accepted` on successful enqueue; the migrated endpoint returns `200 OK`. No per-RPC status-code configuration is available in sebuf's HTTP annotations today, and introducing a `/v2` for a single status-code shift was judged heavier than the break itself.
If your integration branches on `response.status === 202`, switch to branching on response body shape (`response.body.status === "pending"` indicates enqueue success). `statusUrl` is preserved exactly as before and is a safe signal to key off.
</Warning>
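For callers migrating off the 202 check, a hedged sketch of the body-shape branch — `enqueueSucceeded` is an illustrative helper of ours; the field names come from the documented response above:

```typescript
// Documented response body of POST /api/scenario/v1/run-scenario.
interface RunScenarioResponse {
  jobId: string;
  status: string; // "pending" on successful enqueue
  statusUrl: string;
}

// Accept both the legacy 202 and the migrated 200, keying on body shape
// rather than on the HTTP status code alone.
function enqueueSucceeded(httpStatus: number, body: RunScenarioResponse): boolean {
  return (httpStatus === 200 || httpStatus === 202) && body.status === "pending";
}
```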
**Errors**:
| Status | `message` | Cause |
|--------|-----------|-------|
| 400 | `Validation failed` (violations include `scenarioId`) | Missing or unknown `scenarioId` |
| 400 | `Validation failed` (violations include `iso2`) | Malformed `iso2` |
| 403 | `PRO subscription required` | Not PRO |
| 405 | — | Method other than `POST` (enforced by sebuf service-config) |
| 429 | `Too many requests` | Per-IP 10/min gateway rate limit |
| 429 | `Scenario queue is at capacity, please try again later` | Global queue > 100 |
| 502 | `Failed to enqueue scenario job` | Redis enqueue failure |
| 503 | `Service temporarily unavailable` | Missing env |
## Poll job status
### `GET /api/scenario/v1/get-scenario-status?jobId=<jobId>`
Returns the job's current state as written by the worker, or a synthesised `pending` stub while the job is still queued.
- **Auth**: same as `/run-scenario`
- **jobId format**: `scenario:{unix-ms}:{8-char-suffix}` — strictly validated to guard against path traversal
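Client-side pre-validation can reuse the same grammar the handler enforces. The pattern below mirrors the `^scenario:[0-9]{13}:[a-z0-9]{8}$` constraint published in the OpenAPI bundle; the helper name is ours:

```typescript
// jobId grammar: scenario:{unix-ms, 13 digits}:{8-char lowercase alnum suffix}.
const JOB_ID_RE = /^scenario:[0-9]{13}:[a-z0-9]{8}$/;

// Reject anything that could smuggle path segments into the status lookup.
function isValidJobId(jobId: string): boolean {
  return JOB_ID_RE.test(jobId);
}
```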
**Status lifecycle**:
| `status` | When |
|---|---|
| `pending` | Job enqueued but worker has not picked it up yet. Synthesised by the status handler when no Redis record exists. |
| `processing` | Worker dequeued the job and started computing. |
| `done` | Worker completed successfully; `result` is populated. |
| `failed` | Worker hit a computation error; `error` is populated. |
**Pending response (`200`)**:
```json
{ "status": "pending", "error": "" }
```
**Processing response (`200`)**:
```json
{ "status": "processing", "error": "" }
```
**Done response (`200`)** — `result` carries the worker's computed payload:
```json
{
"status": "done",
"error": "",
"result": {
"affectedChokepointIds": ["hormuz_strait"],
"topImpactCountries": [
{ "iso2": "JP", "totalImpact": 1500.0, "impactPct": 100 }
],
"template": {
"name": "hormuz_strait",
"disruptionPct": 100,
"durationDays": 14,
"costShockMultiplier": 2.10
}
}
}
```
**Failed response (`200`)**:
```json
{ "status": "failed", "error": "computation_error" }
```
**Poll loop**: treat `pending` and `processing` as non-terminal; only `done` and `failed` are terminal. Both `pending` and `processing` can legitimately persist for several seconds under load.
**Errors**:
| Status | `message` | Cause |
|--------|-----------|-------|
| 400 | `Validation failed` (violations include `jobId`) | Missing or malformed `jobId` |
| 403 | `PRO subscription required` | Not PRO |
| 405 | — | Method other than `GET` (enforced by sebuf service-config) |
| 502 | `Failed to fetch job status` | Redis read failure |
| 503 | `Service temporarily unavailable` | Missing env |
## Polling strategy
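Putting the lifecycle together, a minimal polling sketch (the fetcher is injected so the loop stays transport-agnostic; `pollScenario` and `isTerminal` are illustrative names, and in a real client the fetcher would wrap `premiumFetch()` against `GET /api/scenario/v1/get-scenario-status?jobId=<jobId>`):

```typescript
type JobStatus = "pending" | "processing" | "done" | "failed";

interface StatusResponse {
  status: JobStatus;
  error: string;
  result?: unknown;
}

// Only "done" and "failed" end the loop; "pending" and "processing" can
// legitimately persist for several seconds under load.
function isTerminal(status: JobStatus): boolean {
  return status === "done" || status === "failed";
}

// Poll an injected fetcher until a terminal status, with a fixed interval.
async function pollScenario(
  fetchStatus: () => Promise<StatusResponse>,
  intervalMs = 2000,
  maxAttempts = 60,
): Promise<StatusResponse> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetchStatus();
    if (isTerminal(res.status)) return res;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("scenario job did not reach a terminal state in time");
}
```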

{"components":{"schemas":{"Error":{"description":"Error is returned when a handler encounters an error. It contains a simple error message that the developer can customize.","properties":{"message":{"description":"Error message (e.g., 'user not found', 'database connection failed')","type":"string"}},"type":"object"},"FieldViolation":{"description":"FieldViolation describes a single validation error for a specific field.","properties":{"description":{"description":"Human-readable description of the validation violation (e.g., 'must be a valid email address', 'required field missing')","type":"string"},"field":{"description":"The field path that failed validation (e.g., 'user.email' for nested fields). For header validation, this will be the header name (e.g., 'X-API-Key')","type":"string"}},"required":["field","description"],"type":"object"},"RegisterInterestRequest":{"description":"RegisterInterestRequest carries a Pro-waitlist signup.","properties":{"appVersion":{"type":"string"},"email":{"type":"string"},"referredBy":{"type":"string"},"source":{"type":"string"},"turnstileToken":{"description":"Cloudflare Turnstile token. Desktop sources bypass Turnstile; see handler.","type":"string"},"website":{"description":"Honeypot — bots auto-fill this hidden field; real submissions leave it empty.","type":"string"}},"type":"object"},"RegisterInterestResponse":{"description":"RegisterInterestResponse mirrors the Convex registerInterest:register return shape.","properties":{"emailSuppressed":{"description":"True when the email is on the suppression list (prior bounce) and no confirmation was sent.","type":"boolean"},"position":{"description":"Waitlist position at registration time. Present only when status == \"registered\".","format":"int32","type":"integer"},"referralCode":{"description":"Stable referral code for this email.","type":"string"},"referralCount":{"description":"Number of signups credited to this email.","format":"int32","type":"integer"},"status":{"description":"\"registered\" for a new signup; \"already_registered\" for a returning email.","type":"string"}},"type":"object"},"SubmitContactRequest":{"description":"SubmitContactRequest carries an enterprise contact form submission.","properties":{"email":{"type":"string"},"message":{"type":"string"},"name":{"type":"string"},"organization":{"type":"string"},"phone":{"type":"string"},"source":{"type":"string"},"turnstileToken":{"description":"Cloudflare Turnstile token proving the submitter is human.","type":"string"},"website":{"description":"Honeypot — bots auto-fill this hidden field; real submissions leave it empty.","type":"string"}},"type":"object"},"SubmitContactResponse":{"description":"SubmitContactResponse reports the outcome of storing the lead and notifying ops.","properties":{"emailSent":{"description":"True when the Resend notification to ops was delivered.","type":"boolean"},"status":{"description":"Always \"sent\" on success.","type":"string"}},"type":"object"},"ValidationError":{"description":"ValidationError is returned when request validation fails. It contains a list of field violations describing what went wrong.","properties":{"violations":{"description":"List of validation violations","items":{"$ref":"#/components/schemas/FieldViolation"},"type":"array"}},"required":["violations"],"type":"object"}}},"info":{"title":"LeadsService API","version":"1.0.0"},"openapi":"3.1.0","paths":{"/api/leads/v1/register-interest":{"post":{"description":"RegisterInterest adds an email to the Pro waitlist and sends a confirmation email.","operationId":"RegisterInterest","requestBody":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/RegisterInterestRequest"}}},"required":true},"responses":{"200":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/RegisterInterestResponse"}}},"description":"Successful response"},"400":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/ValidationError"}}},"description":"Validation error"},"default":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/Error"}}},"description":"Error response"}},"summary":"RegisterInterest","tags":["LeadsService"]}},"/api/leads/v1/submit-contact":{"post":{"description":"SubmitContact stores an enterprise contact submission in Convex and emails ops.","operationId":"SubmitContact","requestBody":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/SubmitContactRequest"}}},"required":true},"responses":{"200":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/SubmitContactResponse"}}},"description":"Successful response"},"400":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/ValidationError"}}},"description":"Validation error"},"default":{"content":{"application/json":{"schema":{"$ref":"#/components/schemas/Error"}}},"description":"Error response"}},"summary":"SubmitContact","tags":["LeadsService"]}}}}

openapi: 3.1.0
info:
title: LeadsService API
version: 1.0.0
paths:
/api/leads/v1/submit-contact:
post:
tags:
- LeadsService
summary: SubmitContact
description: SubmitContact stores an enterprise contact submission in Convex and emails ops.
operationId: SubmitContact
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/SubmitContactRequest'
required: true
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/SubmitContactResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/leads/v1/register-interest:
post:
tags:
- LeadsService
summary: RegisterInterest
description: RegisterInterest adds an email to the Pro waitlist and sends a confirmation email.
operationId: RegisterInterest
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/RegisterInterestRequest'
required: true
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/RegisterInterestResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
components:
schemas:
Error:
type: object
properties:
message:
type: string
description: Error message (e.g., 'user not found', 'database connection failed')
description: Error is returned when a handler encounters an error. It contains a simple error message that the developer can customize.
FieldViolation:
type: object
properties:
field:
type: string
description: The field path that failed validation (e.g., 'user.email' for nested fields). For header validation, this will be the header name (e.g., 'X-API-Key')
description:
type: string
description: Human-readable description of the validation violation (e.g., 'must be a valid email address', 'required field missing')
required:
- field
- description
description: FieldViolation describes a single validation error for a specific field.
ValidationError:
type: object
properties:
violations:
type: array
items:
$ref: '#/components/schemas/FieldViolation'
description: List of validation violations
required:
- violations
description: ValidationError is returned when request validation fails. It contains a list of field violations describing what went wrong.
SubmitContactRequest:
type: object
properties:
email:
type: string
name:
type: string
organization:
type: string
phone:
type: string
message:
type: string
source:
type: string
website:
type: string
description: Honeypot — bots auto-fill this hidden field; real submissions leave it empty.
turnstileToken:
type: string
description: Cloudflare Turnstile token proving the submitter is human.
description: SubmitContactRequest carries an enterprise contact form submission.
SubmitContactResponse:
type: object
properties:
status:
type: string
description: Always "sent" on success.
emailSent:
type: boolean
description: True when the Resend notification to ops was delivered.
description: SubmitContactResponse reports the outcome of storing the lead and notifying ops.
RegisterInterestRequest:
type: object
properties:
email:
type: string
source:
type: string
appVersion:
type: string
referredBy:
type: string
website:
type: string
description: Honeypot — bots auto-fill this hidden field; real submissions leave it empty.
turnstileToken:
type: string
description: Cloudflare Turnstile token. Desktop sources bypass Turnstile; see handler.
description: RegisterInterestRequest carries a Pro-waitlist signup.
RegisterInterestResponse:
type: object
properties:
status:
type: string
description: '"registered" for a new signup; "already_registered" for a returning email.'
referralCode:
type: string
description: Stable referral code for this email.
referralCount:
type: integer
format: int32
description: Number of signups credited to this email.
position:
type: integer
format: int32
description: Waitlist position at registration time. Present only when status == "registered".
emailSuppressed:
type: boolean
description: True when the email is on the suppression list (prior bounce) and no confirmation was sent.
description: RegisterInterestResponse mirrors the Convex registerInterest:register return shape.


@@ -39,6 +39,15 @@ paths:
schema:
type: number
format: double
- name: include_candidates
in: query
description: |-
When true, populate VesselSnapshot.candidate_reports with per-vessel
position reports. Clients with no position callbacks should leave this
false to keep responses small.
required: false
schema:
type: boolean
responses:
"200":
description: Successful response
@@ -156,6 +165,12 @@ components:
type: number
format: double
description: South-west corner longitude of bounding box.
includeCandidates:
type: boolean
description: |-
When true, populate VesselSnapshot.candidate_reports with per-vessel
position reports. Clients with no position callbacks should leave this
false to keep responses small.
description: GetVesselSnapshotRequest specifies filters for the vessel snapshot.
GetVesselSnapshotResponse:
type: object
@@ -178,6 +193,18 @@ components:
type: array
items:
$ref: '#/components/schemas/AisDisruption'
sequence:
type: integer
format: int32
description: |-
Monotonic sequence number from the relay. Clients use this to detect stale
responses during polling.
status:
$ref: '#/components/schemas/AisSnapshotStatus'
candidateReports:
type: array
items:
$ref: '#/components/schemas/SnapshotCandidateReport'
description: VesselSnapshot represents a point-in-time view of civilian AIS vessel data.
AisDensityZone:
type: object
@@ -281,6 +308,61 @@ components:
required:
- id
description: AisDisruption represents a detected anomaly in AIS vessel tracking data.
AisSnapshotStatus:
type: object
properties:
connected:
type: boolean
description: Whether the relay WebSocket is connected to the AIS provider.
vessels:
type: integer
format: int32
description: Number of vessels currently tracked by the relay.
messages:
type: integer
format: int32
description: Total AIS messages processed in the current session.
description: AisSnapshotStatus reports relay health at the time of the snapshot.
SnapshotCandidateReport:
type: object
properties:
mmsi:
type: string
description: Maritime Mobile Service Identity.
name:
type: string
description: Vessel name (may be empty if unknown).
lat:
type: number
format: double
description: Latitude in decimal degrees.
lon:
type: number
format: double
description: Longitude in decimal degrees.
shipType:
type: integer
format: int32
description: AIS ship type code (0 if unknown).
heading:
type: integer
format: int32
description: Heading in degrees (0-359, or 511 for unavailable).
speed:
type: number
format: double
description: Speed over ground in knots.
course:
type: integer
format: int32
description: Course over ground in degrees.
timestamp:
type: integer
format: int64
description: 'Report timestamp, as Unix epoch milliseconds. Warning: Values > 2^53 may lose precision in JavaScript'
description: |-
SnapshotCandidateReport is a per-vessel position report attached to a
snapshot. Used to drive the client-side position callback system.
ListNavigationalWarningsRequest:
type: object
properties:


@@ -513,7 +513,11 @@ components:
description: Aircraft callsign.
hexCode:
type: string
description: ICAO 24-bit hex address.
description: |-
ICAO 24-bit hex address. Canonical form is UPPERCASE — seeders and
handlers must uppercase before writing so hex-based lookups
(src/services/military-flights.ts:getFlightByHex) match regardless of
upstream source casing.
registration:
type: string
description: Aircraft registration number.

openapi: 3.1.0
info:
title: ScenarioService API
version: 1.0.0
paths:
/api/scenario/v1/run-scenario:
post:
tags:
- ScenarioService
summary: RunScenario
description: |-
RunScenario enqueues a scenario job on scenario-queue:pending. PRO-gated.
The scenario-worker (scripts/scenario-worker.mjs) pulls jobs off the
queue via BLMOVE and writes results under scenario-result:{job_id}.
operationId: RunScenario
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/RunScenarioRequest'
required: true
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/RunScenarioResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/scenario/v1/get-scenario-status:
get:
tags:
- ScenarioService
summary: GetScenarioStatus
description: |-
GetScenarioStatus polls a single job's result. PRO-gated.
Returns status="pending" when no result key exists, mirroring the
worker's lifecycle state once the key is written.
operationId: GetScenarioStatus
parameters:
- name: jobId
in: query
description: |-
Job id of the form `scenario:{epoch_ms}:{8-char-suffix}`. Path-traversal
guarded by JOB_ID_RE in the handler.
required: false
schema:
type: string
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/GetScenarioStatusResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/scenario/v1/list-scenario-templates:
get:
tags:
- ScenarioService
summary: ListScenarioTemplates
description: |-
ListScenarioTemplates returns the catalog of pre-defined scenarios.
Not PRO-gated — used by documented public API consumers.
operationId: ListScenarioTemplates
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/ListScenarioTemplatesResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
components:
schemas:
Error:
type: object
properties:
message:
type: string
description: Error message (e.g., 'user not found', 'database connection failed')
description: Error is returned when a handler encounters an error. It contains a simple error message that the developer can customize.
FieldViolation:
type: object
properties:
field:
type: string
description: The field path that failed validation (e.g., 'user.email' for nested fields). For header validation, this will be the header name (e.g., 'X-API-Key')
description:
type: string
description: Human-readable description of the validation violation (e.g., 'must be a valid email address', 'required field missing')
required:
- field
- description
description: FieldViolation describes a single validation error for a specific field.
ValidationError:
type: object
properties:
violations:
type: array
items:
$ref: '#/components/schemas/FieldViolation'
description: List of validation violations
required:
- violations
description: ValidationError is returned when request validation fails. It contains a list of field violations describing what went wrong.
RunScenarioRequest:
type: object
properties:
scenarioId:
type: string
maxLength: 128
minLength: 1
description: Scenario template id — must match an entry in SCENARIO_TEMPLATES.
iso2:
type: string
pattern: ^([A-Z]{2})?$
description: |-
Optional 2-letter ISO country code to scope the impact computation.
When absent, the worker computes for all countries with seeded exposure.
required:
- scenarioId
description: |-
RunScenarioRequest enqueues a scenario job on the scenario-queue:pending
Upstash list for the async scenario-worker to pick up.
RunScenarioResponse:
type: object
properties:
jobId:
type: string
description: Generated job id of the form `scenario:{epoch_ms}:{8-char-suffix}`.
status:
type: string
description: Always "pending" at enqueue time.
statusUrl:
type: string
description: |-
Convenience URL the caller can use to poll this job's status.
Server-computed as `/api/scenario/v1/get-scenario-status?jobId=<job_id>`.
Restored after the v1 → v1 sebuf migration because external callers
may key off this field.
description: |-
RunScenarioResponse carries the enqueued job id. Clients poll
GetScenarioStatus with this id until status != "pending".
NOTE: the legacy (pre-sebuf) endpoint returned HTTP 202 Accepted on
enqueue; the sebuf-generated server emits 200 OK for all successful
responses (no per-RPC status-code configuration is available in the
current sebuf HTTP annotations). The 202 → 200 shift on a same-version
(v1 → v1) migration is called out in docs/api-scenarios.mdx and the
OpenAPI bundle; external consumers keying off `response.status === 202`
need to branch on response body shape instead.
GetScenarioStatusRequest:
type: object
properties:
jobId:
type: string
pattern: ^scenario:[0-9]{13}:[a-z0-9]{8}$
description: |-
Job id of the form `scenario:{epoch_ms}:{8-char-suffix}`. Path-traversal
guarded by JOB_ID_RE in the handler.
required:
- jobId
description: GetScenarioStatusRequest polls the worker result for an enqueued job id.
GetScenarioStatusResponse:
type: object
properties:
status:
type: string
result:
$ref: '#/components/schemas/ScenarioResult'
error:
type: string
description: Populated only when status == "failed".
description: |-
GetScenarioStatusResponse reflects the worker's lifecycle state.
"pending" — no key yet (job still queued or very-recent enqueue).
"processing" — worker has claimed the job but hasn't completed compute.
"done" — compute succeeded; `result` is populated.
"failed" — compute errored; `error` is populated.
ScenarioResult:
type: object
properties:
affectedChokepointIds:
type: array
items:
type: string
description: Chokepoint ids disrupted by this scenario.
topImpactCountries:
type: array
items:
$ref: '#/components/schemas/ScenarioImpactCountry'
template:
$ref: '#/components/schemas/ScenarioResultTemplate'
description: |-
ScenarioResult is the computed payload the scenario-worker writes back
under the `scenario-result:{job_id}` Redis key. Populated only when
GetScenarioStatusResponse.status == "done".
ScenarioImpactCountry:
type: object
properties:
iso2:
type: string
description: 2-letter ISO country code.
totalImpact:
type: number
format: double
description: |-
Raw weighted impact value aggregated across the country's exposed HS2
chapters. Relative-only — not a currency amount.
impactPct:
type: integer
format: int32
description: Impact as a 0-100 share of the worst-hit country.
description: ScenarioImpactCountry carries a single country's scenario impact score.
ScenarioResultTemplate:
type: object
properties:
name:
type: string
description: |-
Display name (worker derives this from affected_chokepoint_ids; may be
`tariff_shock` for tariff-type scenarios).
disruptionPct:
type: integer
format: int32
description: 0-100 percent of chokepoint capacity blocked.
durationDays:
type: integer
format: int32
description: Estimated duration of disruption in days.
costShockMultiplier:
type: number
format: double
description: Freight cost multiplier applied on top of bypass corridor costs.
description: |-
ScenarioResultTemplate carries template parameters echoed into the worker's
computed result so clients can render them without re-looking up the
template registry.
ListScenarioTemplatesRequest:
type: object
ListScenarioTemplatesResponse:
type: object
properties:
templates:
type: array
items:
$ref: '#/components/schemas/ScenarioTemplate'
ScenarioTemplate:
type: object
properties:
id:
type: string
name:
type: string
affectedChokepointIds:
type: array
items:
type: string
description: |-
Chokepoint ids this scenario disrupts. Empty for tariff-shock scenarios
that have no physical chokepoint closure.
disruptionPct:
type: integer
format: int32
description: 0-100 percent of chokepoint capacity blocked.
durationDays:
type: integer
format: int32
description: Estimated duration of disruption in days.
affectedHs2:
type: array
items:
type: string
description: HS2 chapter codes affected. Empty means ALL sectors are affected.
costShockMultiplier:
type: number
format: double
description: Freight cost multiplier applied on top of bypass corridor costs.
description: |-
ScenarioTemplate mirrors the catalog shape served by
GET /api/scenario/v1/list-scenario-templates. The authoritative template
registry lives in server/worldmonitor/supply-chain/v1/scenario-templates.ts.

openapi: 3.1.0
info:
title: ShippingV2Service API
version: 1.0.0
paths:
/api/v2/shipping/route-intelligence:
get:
tags:
- ShippingV2Service
summary: RouteIntelligence
description: |-
RouteIntelligence scores a country-pair trade route for chokepoint exposure
and current disruption risk. Partner-facing; wire shape is byte-compatible
with the pre-migration JSON response documented at docs/api-shipping-v2.mdx.
operationId: RouteIntelligence
parameters:
- name: fromIso2
in: query
description: Origin country, ISO-3166-1 alpha-2 uppercase.
required: false
schema:
type: string
- name: toIso2
in: query
description: Destination country, ISO-3166-1 alpha-2 uppercase.
required: false
schema:
type: string
- name: cargoType
in: query
description: |-
Cargo type — one of: container (default), tanker, bulk, roro.
Empty string defers to the server default. Unknown values are coerced to
"container" to preserve legacy behavior.
required: false
schema:
type: string
- name: hs2
in: query
description: |-
2-digit HS commodity code (default "27" — mineral fuels). Non-digit
characters are stripped server-side to match legacy behavior.
required: false
schema:
type: string
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/RouteIntelligenceResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/v2/shipping/webhooks:
get:
tags:
- ShippingV2Service
summary: ListWebhooks
description: |-
ListWebhooks returns the caller's registered webhooks filtered by the
SHA-256 owner tag of the calling API key. The `secret` is intentionally
omitted from the response; use rotate-secret to obtain a new one.
operationId: ListWebhooks
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/ListWebhooksResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
post:
tags:
- ShippingV2Service
summary: RegisterWebhook
description: |-
RegisterWebhook subscribes a callback URL to chokepoint disruption alerts.
Returns the subscriberId and the raw HMAC secret — the secret is never
returned again except via rotate-secret.
operationId: RegisterWebhook
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/RegisterWebhookRequest'
required: true
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/RegisterWebhookResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
components:
schemas:
Error:
type: object
properties:
message:
type: string
description: Error message (e.g., 'user not found', 'database connection failed')
description: Error is returned when a handler encounters an error. It contains a simple error message that the developer can customize.
FieldViolation:
type: object
properties:
field:
type: string
description: The field path that failed validation (e.g., 'user.email' for nested fields). For header validation, this will be the header name (e.g., 'X-API-Key')
description:
type: string
description: Human-readable description of the validation violation (e.g., 'must be a valid email address', 'required field missing')
required:
- field
- description
description: FieldViolation describes a single validation error for a specific field.
ValidationError:
type: object
properties:
violations:
type: array
items:
$ref: '#/components/schemas/FieldViolation'
description: List of validation violations
required:
- violations
description: ValidationError is returned when request validation fails. It contains a list of field violations describing what went wrong.
RouteIntelligenceRequest:
type: object
properties:
fromIso2:
type: string
pattern: ^[A-Z]{2}$
description: Origin country, ISO-3166-1 alpha-2 uppercase.
toIso2:
type: string
pattern: ^[A-Z]{2}$
description: Destination country, ISO-3166-1 alpha-2 uppercase.
cargoType:
type: string
description: |-
Cargo type — one of: container (default), tanker, bulk, roro.
Empty string defers to the server default. Unknown values are coerced to
"container" to preserve legacy behavior.
hs2:
type: string
description: |-
2-digit HS commodity code (default "27" — mineral fuels). Non-digit
characters are stripped server-side to match legacy behavior.
required:
- fromIso2
- toIso2
description: |-
RouteIntelligenceRequest scopes a route-intelligence query by origin and
destination country. Query-parameter names are preserved verbatim from the
legacy partner contract (fromIso2/toIso2/cargoType/hs2 — camelCase).
RouteIntelligenceResponse:
type: object
properties:
fromIso2:
type: string
toIso2:
type: string
cargoType:
type: string
hs2:
type: string
primaryRouteId:
type: string
chokepointExposures:
type: array
items:
$ref: '#/components/schemas/ChokepointExposure'
bypassOptions:
type: array
items:
$ref: '#/components/schemas/BypassOption'
warRiskTier:
type: string
description: War-risk tier enum string, e.g., "WAR_RISK_TIER_NORMAL" or "WAR_RISK_TIER_ELEVATED".
disruptionScore:
type: integer
format: int32
description: Disruption score of the primary chokepoint, 0-100.
fetchedAt:
type: string
description: ISO-8601 timestamp of when the response was assembled.
description: |-
RouteIntelligenceResponse wire shape preserved byte-for-byte from the
pre-migration JSON at docs/api-shipping-v2.mdx. `fetched_at` is intentionally
a string (ISO-8601) rather than int64 epoch ms because partners depend on
the ISO-8601 shape.
ChokepointExposure:
type: object
properties:
chokepointId:
type: string
chokepointName:
type: string
exposurePct:
type: integer
format: int32
description: Single chokepoint exposure for a route.
BypassOption:
type: object
properties:
id:
type: string
name:
type: string
type:
type: string
description: Type of bypass (e.g., "maritime_detour", "land_corridor").
addedTransitDays:
type: integer
format: int32
addedCostMultiplier:
type: number
format: double
activationThreshold:
type: string
description: Enum-like string, e.g., "DISRUPTION_SCORE_60".
description: Single bypass-corridor option around a disrupted chokepoint.
RegisterWebhookRequest:
type: object
properties:
callbackUrl:
type: string
maxLength: 2048
minLength: 8
description: |-
HTTPS callback URL. Must not resolve to a private/loopback address at
registration time (SSRF guard). The delivery worker re-validates the
resolved IP before each send to mitigate DNS rebinding.
chokepointIds:
type: array
items:
type: string
description: |-
Zero or more chokepoint IDs to subscribe to. Empty list subscribes to
the entire CHOKEPOINT_REGISTRY. Unknown IDs fail with 400.
alertThreshold:
type: integer
maximum: 100
minimum: 0
format: int32
description: Disruption-score threshold for delivery, 0-100. Default 50.
required:
- callbackUrl
description: |-
RegisterWebhookRequest creates a new chokepoint-disruption webhook
subscription. Wire shape is byte-compatible with the pre-migration
legacy POST body.
RegisterWebhookResponse:
type: object
properties:
subscriberId:
type: string
description: '`wh_` prefix + 24 lowercase hex chars (12 random bytes).'
secret:
type: string
description: Raw 64-char lowercase hex secret (32 random bytes). No `whsec_` prefix.
description: |-
RegisterWebhookResponse wire shape preserved exactly — partners persist the
`secret` because the server never returns it again except via rotate-secret.
ListWebhooksRequest:
type: object
description: |-
ListWebhooksRequest has no fields — the owner is derived from the caller's
API-key fingerprint (SHA-256 of X-WorldMonitor-Key).
ListWebhooksResponse:
type: object
properties:
webhooks:
type: array
items:
$ref: '#/components/schemas/WebhookSummary'
description: |-
ListWebhooksResponse wire shape preserved exactly: the `webhooks` field
name and the omission of `secret` are part of the partner contract.
WebhookSummary:
type: object
properties:
subscriberId:
type: string
callbackUrl:
type: string
chokepointIds:
type: array
items:
type: string
alertThreshold:
type: integer
format: int32
createdAt:
type: string
description: ISO-8601 timestamp of registration.
active:
type: boolean
description: |-
Single webhook record in the list response. `secret` is intentionally
omitted; use rotate-secret to obtain a new one.

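The `RegisterWebhookResponse` formats documented above (`wh_` + 24 lowercase hex chars for `subscriberId`, a raw 64-char hex `secret`) can be checked client-side before persisting the secret. A minimal sketch — the type and function names are illustrative, not part of any generated client:

```typescript
// Sketch: guard the RegisterWebhook response before persisting the secret.
// Formats come from the schema above: subscriberId is `wh_` + 24 lowercase
// hex chars; secret is a raw 64-char lowercase hex string (no `whsec_` prefix).
const SUBSCRIBER_ID_RE = /^wh_[0-9a-f]{24}$/;
const SECRET_RE = /^[0-9a-f]{64}$/;

interface RegisterWebhookResult {
  subscriberId: string;
  secret: string;
}

function isRegisterWebhookResult(x: unknown): x is RegisterWebhookResult {
  if (typeof x !== "object" || x === null) return false;
  const r = x as Record<string, unknown>;
  return (
    typeof r.subscriberId === "string" &&
    SUBSCRIBER_ID_RE.test(r.subscriberId) &&
    typeof r.secret === "string" &&
    SECRET_RE.test(r.secret)
  );
}
```

Persist `secret` immediately when the guard passes — per the schema, the server never returns it again except via rotate-secret.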
File diff suppressed because one or more lines are too long


@@ -266,6 +266,84 @@ paths:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/supply-chain/v1/get-country-products:
get:
tags:
- SupplyChainService
summary: GetCountryProducts
description: GetCountryProducts returns the seeded bilateral-HS4 import basket for a country. PRO-gated.
operationId: GetCountryProducts
parameters:
- name: iso2
in: query
required: false
schema:
type: string
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/GetCountryProductsResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/supply-chain/v1/get-multi-sector-cost-shock:
get:
tags:
- SupplyChainService
summary: GetMultiSectorCostShock
description: |-
GetMultiSectorCostShock returns per-sector cost-shock estimates for a
country+chokepoint+closure-window. PRO-gated.
operationId: GetMultiSectorCostShock
parameters:
- name: iso2
in: query
required: false
schema:
type: string
- name: chokepointId
in: query
required: false
schema:
type: string
- name: closureDays
in: query
description: Closure-window duration in days. Server clamps to [1, 365]. Defaults to 30.
required: false
schema:
type: integer
format: int32
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: '#/components/schemas/GetMultiSectorCostShockResponse'
"400":
description: Validation error
content:
application/json:
schema:
$ref: '#/components/schemas/ValidationError'
default:
description: Error response
content:
application/json:
schema:
$ref: '#/components/schemas/Error'
/api/supply-chain/v1/get-sector-dependency:
get:
tags:
@@ -991,6 +1069,153 @@ components:
description: Null/unavailable explanation for non-energy sectors
fetchedAt:
type: string
GetCountryProductsRequest:
type: object
properties:
iso2:
type: string
pattern: ^[A-Z]{2}$
required:
- iso2
GetCountryProductsResponse:
type: object
properties:
iso2:
type: string
products:
type: array
items:
$ref: '#/components/schemas/CountryProduct'
fetchedAt:
type: string
description: ISO timestamp from the seeded payload (empty when no data is cached).
CountryProduct:
type: object
properties:
hs4:
type: string
description:
type: string
totalValue:
type: number
format: double
topExporters:
type: array
items:
$ref: '#/components/schemas/ProductExporter'
year:
type: integer
format: int32
ProductExporter:
type: object
properties:
partnerCode:
type: integer
format: int32
partnerIso2:
type: string
value:
type: number
format: double
share:
type: number
format: double
GetMultiSectorCostShockRequest:
type: object
properties:
iso2:
type: string
pattern: ^[A-Z]{2}$
chokepointId:
type: string
closureDays:
type: integer
format: int32
description: Closure-window duration in days. Server clamps to [1, 365]. Defaults to 30.
required:
- iso2
- chokepointId
GetMultiSectorCostShockResponse:
type: object
properties:
iso2:
type: string
chokepointId:
type: string
closureDays:
type: integer
format: int32
description: Server-clamped closure-window duration in days (1-365).
warRiskTier:
type: string
enum:
- WAR_RISK_TIER_UNSPECIFIED
- WAR_RISK_TIER_NORMAL
- WAR_RISK_TIER_ELEVATED
- WAR_RISK_TIER_HIGH
- WAR_RISK_TIER_CRITICAL
- WAR_RISK_TIER_WAR_ZONE
description: |-
War risk tier derived from Lloyd's JWC Listed Areas + OSINT threat classification.
This is a FREE field (no PRO gate) — it exposes the existing server-internal
threatLevel from ChokepointConfig, making it available to clients for badges
and bypass corridor scoring.
sectors:
type: array
items:
$ref: '#/components/schemas/MultiSectorCostShock'
totalAddedCost:
type: number
format: double
description: Sum of total_cost_shock across all sectors.
fetchedAt:
type: string
unavailableReason:
type: string
description: Populated when no seeded import data is available for the country.
MultiSectorCostShock:
type: object
properties:
hs2:
type: string
description: HS2 chapter code (e.g. "27" mineral fuels, "85" electronics).
hs2Label:
type: string
description: Friendly chapter label (e.g. "Energy", "Electronics").
importValueAnnual:
type: number
format: double
description: Total annual import value (USD) for this sector.
freightAddedPctPerTon:
type: number
format: double
description: Bypass-corridor freight uplift fraction (0.10 == +10% per ton).
warRiskPremiumBps:
type: integer
format: int32
description: War-risk insurance premium (basis points) sourced from the chokepoint tier.
addedTransitDays:
type: integer
format: int32
description: Bypass-corridor transit penalty (informational).
totalCostShockPerDay:
type: number
format: double
totalCostShock30Days:
type: number
format: double
totalCostShock90Days:
type: number
format: double
totalCostShock:
type: number
format: double
description: Cost for the requested closure_days window.
closureDays:
type: integer
format: int32
description: Echoes the clamped closure duration used for total_cost_shock (1-365).
GetSectorDependencyRequest:
type: object
properties:

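`totalAddedCost` is documented above as the sum of `total_cost_shock` across all sectors, so a client can cross-check the aggregate against the per-sector rows. A sketch under that stated invariant — the function names are illustrative:

```typescript
// Sketch: verify GetMultiSectorCostShockResponse.totalAddedCost against the
// per-sector rows. The schema states totalAddedCost is the sum of
// total_cost_shock across all sectors; floating-point sums need a tolerance.
function sectorShockSum(sectors: Array<{ totalCostShock: number }>): number {
  return sectors.reduce((acc, s) => acc + s.totalCostShock, 0);
}

function totalsAgree(
  totalAddedCost: number,
  sectors: Array<{ totalCostShock: number }>,
  epsilon = 1e-6,
): boolean {
  return Math.abs(sectorShockSum(sectors) - totalAddedCost) <= epsilon;
}
```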

@@ -271,7 +271,7 @@ World Monitor uses 60+ Vercel Edge Functions as a lightweight API layer, split i
- **BIS Integration** — policy rates, real effective exchange rates, and credit-to-GDP ratios from the Bank for International Settlements, cached with 30-minute TTL
- **WTO Trade Policy** — trade restrictions, tariff trends, bilateral trade flows, and SPS/TBT barriers from the World Trade Organization
- **Supply Chain Intelligence** — maritime chokepoint disruption scores (cross-referencing NGA warnings + AIS data), FRED shipping freight indices with spike detection, and critical mineral supply concentration via Herfindahl-Hirschman Index analysis
- - **Company Enrichment** — `/api/enrichment/company` aggregates GitHub organization data, inferred tech stack (derived from repository language distributions weighted by star count), SEC EDGAR public filings (10-K, 10-Q, 8-K), and Hacker News mentions into a single response. `/api/enrichment/signals` surfaces real-time company activity signals — funding events, hiring surges, executive changes, and expansion announcements — sourced from Hacker News and GitHub, each classified by signal type and scored for strength based on engagement, comment volume, and recency
+ - **Company Enrichment** — `IntelligenceService.GetCompanyEnrichment` aggregates GitHub organization data, inferred tech stack (derived from repository language distributions weighted by star count), SEC EDGAR public filings (10-K, 10-Q, 8-K), and Hacker News mentions into a single response. `IntelligenceService.ListCompanySignals` surfaces real-time company activity signals — funding events, hiring surges, executive changes, and expansion announcements — sourced from Hacker News and GitHub, each classified by signal type and scored for strength based on engagement, comment volume, and recency
All edge functions include circuit breaker logic and return cached stale data when upstream APIs are unavailable, ensuring the dashboard never shows blank panels.


@@ -21,9 +21,23 @@ All notable changes to World Monitor are documented here. Subscribe via [RSS](/c
- R2 trace storage for forecast debugging with Cloudflare API upload (#1655)
- `@ts-nocheck` injection in Makefile generate target for CI proto-freshness parity (#1637)
### Changed
- **Sebuf API migration (#3207)** — scenario + supply-chain endpoints moved to the typed sebuf contract. RPC URLs now derive from method names; five renamed v1 URLs remain live as thin aliases so existing integrations keep working:
- `/api/scenario/v1/run` → `/api/scenario/v1/run-scenario`
- `/api/scenario/v1/status` → `/api/scenario/v1/get-scenario-status`
- `/api/scenario/v1/templates` → `/api/scenario/v1/list-scenario-templates`
- `/api/supply-chain/v1/country-products` → `/api/supply-chain/v1/get-country-products`
- `/api/supply-chain/v1/multi-sector-cost-shock` → `/api/supply-chain/v1/get-multi-sector-cost-shock`
Aliases retire at the next v1→v2 break (tracked in [#3282](https://github.com/koala73/worldmonitor/issues/3282)).
- `POST /api/scenario/v1/run-scenario` now returns `200 OK` instead of the pre-migration `202 Accepted`. sebuf's HTTP annotations don't carry per-RPC status codes. Branch on response body `status === "pending"` instead of `response.status === 202`. `statusUrl` field is preserved.
### Security
- **CDN cache bypass closed**: `CDN-Cache-Control` header now only emitted for trusted origins (worldmonitor.app, Vercel previews, Tauri). No-origin server-side requests always invoke the edge function so `validateApiKey` runs, preventing a cached 200 from being served to external scrapers.
- **Shipping v2 webhook tenant isolation (#3242)**: `POST /api/v2/shipping/webhooks` and `GET /api/v2/shipping/webhooks` now enforce `validateApiKey(req, { forceKey: true })`, matching the sibling `[subscriberId]{,/[action]}` routes. Without this gate, a Clerk-authenticated pro user with no API key would collapse into a shared `'anon'` fingerprint bucket and see/overwrite webhooks owned by other `'anon'`-bucket tenants.
### Fixed


@@ -42,4 +42,4 @@ OFAC publishes list updates on an irregular cadence (daily-to-weekly). The panel
## API reference
- [Sanctions service](/api/SanctionsService.openapi.yaml) — sanctions pressure + entity search RPCs.
- - Related: `/api/sanctions-entity-search` endpoint (documented under [Proxies](/api-proxies)) for fuzzy SDN entity lookups.
+ - Related: `GET /api/sanctions/v1/lookup-entity?q=...` — fuzzy entity lookup against OpenSanctions (live) with OFAC local index as a fallback.


@@ -24,7 +24,7 @@ Panel id is `supply-chain`; canonical component is `src/components/SupplyChainPa
## Data sources
- Reads the cached supply-chain dataset (composite stress, carrier list, minerals, scenario templates). The Scenario Engine triggers call `POST /api/scenario/v1/run` and poll `/status` — see [Scenarios API](/api-scenarios) for the HTTP contract.
+ Reads the cached supply-chain dataset (composite stress, carrier list, minerals, scenario templates). The Scenario Engine triggers call `POST /api/scenario/v1/run-scenario` and poll `GET /api/scenario/v1/get-scenario-status` — see [Scenarios API](/api-scenarios) for the HTTP contract.
## Refresh cadence


@@ -45,9 +45,9 @@ The UI is state-driven, not modal — activating a scenario sets a `scenarioStat
## Tier & gating
- Scenario Engine is **PRO**. Free users see the trigger buttons but are blocked at activation: a `scenario-engine` gate-hit event is logged and the map is not repainted. The `/api/scenario/v1/run` endpoint itself also enforces PRO at the edge (`api/scenario/v1/run.ts`).
+ Scenario Engine is **PRO**. Free users see the trigger buttons but are blocked at activation: a `scenario-engine` gate-hit event is logged and the map is not repainted. The `ScenarioService.RunScenario` handler also enforces PRO at the edge (`server/worldmonitor/scenario/v1/run-scenario.ts`).
- Rate limits on the API side — 10 jobs / minute / user, with a global queue cap at 100 in-flight — are documented in [Scenarios API](/api-scenarios#run-a-scenario).
+ Rate limits on the API side — 10 jobs / minute / IP, with a global queue cap at 100 in-flight — are documented in [Scenarios API](/api-scenarios#run-a-scenario).
## Run it yourself
@@ -59,7 +59,7 @@ The workflow is inherently async — the edge function enqueues a job, a Railway
4. When the result lands, the map repaints, and a scenario banner is prepended to the panel. The banner always shows: a ⚠ icon, the scenario name, the top 5 impacted countries with per-country impact %, and a **×** dismiss control. When the scenario's result payload includes template parameters (duration, disruption %, cost-shock multiplier), the banner additionally renders a chip row (e.g. `14d · +110% cost`) and a tagline line such as *"Simulating 14d / 100% closure / +110% cost on 1 chokepoint. Chokepoint card below shows projected score; map highlights disrupted routes."* The affected chokepoints themselves are highlighted on the map and on the chokepoint cards rather than listed by name in the banner.
5. Click the **×** dismiss control on the banner (aria-label: "Dismiss scenario") to clear the scenario state — the map repaints to its baseline and the panel re-renders without the projected score and red-border callouts.
- For scripted use, see [`POST /api/scenario/v1/run`](/api-scenarios#run-a-scenario) — enqueue, then poll `/status` until the response has a terminal status (`"done"` on success, `"failed"` on error). Non-terminal states are `"pending"` (queued) and `"processing"` (worker started); both can persist for several seconds. See the [status lifecycle table](/api-scenarios#poll-job-status) for the full contract.
+ For scripted use, see [`POST /api/scenario/v1/run-scenario`](/api-scenarios#run-a-scenario) — enqueue, then poll `GET /api/scenario/v1/get-scenario-status` until the response has a terminal status (`"done"` on success, `"failed"` on error). Non-terminal states are `"pending"` (queued) and `"processing"` (worker started); both can persist for several seconds. See the [status lifecycle table](/api-scenarios#poll-job-status) for the full contract.
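The enqueue-then-poll flow can be sketched as below. The URLs, the four status strings, and the camelCase `jobId` field follow the contract documented on this page; the helper names and the 2-second poll interval are illustrative assumptions. Note the enqueue now returns 200 OK rather than 202, so the sketch branches on the body's `status` field, never on `response.status`:

```typescript
type ScenarioStatus = "pending" | "processing" | "done" | "failed";

// Terminal states per the status lifecycle: "done" and "failed" end polling;
// "pending" and "processing" mean keep waiting.
function isTerminalStatus(status: ScenarioStatus): boolean {
  return status === "done" || status === "failed";
}

// Sketch: enqueue a scenario, then poll until a terminal status arrives.
async function runScenarioAndWait(baseUrl: string, scenarioId: string) {
  const run = await fetch(`${baseUrl}/api/scenario/v1/run-scenario`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ scenarioId }),
  });
  if (!run.ok) throw new Error(`enqueue failed: ${run.status}`);
  const { jobId } = (await run.json()) as { jobId: string; status: string };

  for (;;) {
    const res = await fetch(
      `${baseUrl}/api/scenario/v1/get-scenario-status?jobId=${encodeURIComponent(jobId)}`,
    );
    const body = (await res.json()) as { status: ScenarioStatus; error?: string };
    if (isTerminalStatus(body.status)) return body;
    await new Promise((r) => setTimeout(r, 2000)); // illustrative poll interval
  }
}
```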
## Data behind Scenario Engine


@@ -28,7 +28,7 @@ OAuth endpoints follow [RFC 6749 §5.2](https://datatracker.ietf.org/doc/html/rf
| Code | Meaning | Retry? |
|------|---------|--------|
| `200` | OK | — |
- | `202` | Accepted — job enqueued (e.g. `scenario/v1/run`). Poll `statusUrl`. | — |
+ | `202` | Accepted — job enqueued. Poll for terminal status. | — |
| `304` | Not Modified (conditional cache hit) | — |
| `400` | Bad request — validation error | No — fix input |
| `401` | Missing / invalid auth | No — fix auth |
@@ -55,13 +55,13 @@ OAuth endpoints follow [RFC 6749 §5.2](https://datatracker.ietf.org/doc/html/rf
| `Rate limit exceeded` | 429 fired; honor `Retry-After`. |
| `Service temporarily unavailable` | Upstash or another hard dependency missing at request time. |
| `service_unavailable` | Signing secret / required env not configured. |
- | `Failed to enqueue scenario job` | Redis pipeline failure on `/api/scenario/v1/run`. |
+ | `Failed to enqueue scenario job` | Redis pipeline failure on `/api/scenario/v1/run-scenario`. |
## Retry strategy
**Idempotent reads** (`GET`): retry 429/5xx with exponential backoff (1s, 2s, 4s, cap 30s, 5 attempts). Most GET responses are cached at the edge, so the retry usually goes faster.
- **Writes**: never auto-retry 4xx. For 5xx on writes, inspect: `POST /api/brief/share-url` and `POST /api/v2/shipping/webhooks` are idempotent; `POST /api/scenario/v1/run` is **not** — it enqueues a new job on each call. Retrying a 5xx on `run` may double-charge the rate-limit counter.
+ **Writes**: never auto-retry 4xx. For 5xx on writes, inspect: `POST /api/brief/share-url` and `POST /api/v2/shipping/webhooks` are idempotent; `POST /api/scenario/v1/run-scenario` is **not** — it enqueues a new job on each call. Retrying a 5xx on `run-scenario` may double-charge the rate-limit counter.
**MCP**: the server returns tool errors in the JSON-RPC result with `isError: true` and a text explanation — those are not HTTP errors. Handle them at the tool-call layer.
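The documented read-retry schedule (1 s, 2 s, 4 s, doubling to a 30 s cap, 5 attempts) and the retry-eligible status codes reduce to two small functions. A sketch — the names are illustrative:

```typescript
// Sketch of the documented idempotent-GET retry schedule:
// 1s, 2s, 4s, ... doubling, capped at 30s, for at most 5 attempts.
const MAX_RETRY_ATTEMPTS = 5;

function backoffDelayMs(attempt: number): number {
  // attempt 0 -> 1000ms, 1 -> 2000ms, 2 -> 4000ms, capped at 30000ms
  return Math.min(1000 * 2 ** attempt, 30_000);
}

// Per the retry strategy above: retry 429 and 5xx on reads; never other 4xx.
function isRetryableStatus(status: number): boolean {
  return status === 429 || status >= 500;
}
```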


@@ -37,10 +37,10 @@ Exceeding any of these during the OAuth flow will cause the MCP client to fail t
| Endpoint | Limit | Window | Scope |
|----------|-------|--------|-------|
- | `POST /api/scenario/v1/run` | 10 | 60 s | Per user |
- | `POST /api/scenario/v1/run` (queue depth) | 100 in-flight | — | Global |
- | `POST /api/register-interest` | 5 | 60 min | Per IP + Turnstile |
- | `POST /api/contact` | 3 | 60 min | Per IP + Turnstile |
+ | `POST /api/scenario/v1/run-scenario` | 10 | 60 s | Per IP |
+ | `POST /api/scenario/v1/run-scenario` (queue depth) | 100 in-flight | — | Global |
+ | `POST /api/leads/v1/register-interest` | 5 | 60 min | Per IP + Turnstile (desktop sources bypass Turnstile) |
+ | `POST /api/leads/v1/submit-contact` | 3 | 60 min | Per IP + Turnstile |
Other write endpoints (`/api/brief/share-url`, `/api/notification-channels`, `/api/create-checkout`, `/api/customer-portal`, etc.) fall back to the default per-IP limit above.
@@ -74,5 +74,5 @@ Content-Type: application/json
- Webhook callback URLs must be HTTPS (except localhost).
- `api/download` file sizes capped at ~50 MB per request.
- - `POST /api/scenario/v1/run` globally pauses new jobs when the pending queue exceeds **100** — returns 429 with `Retry-After: 30`.
+ - `POST /api/scenario/v1/run-scenario` globally pauses new jobs when the pending queue exceeds **100** — returns 429.
- `api/v2/shipping/webhooks` TTL is **30 days** — re-register to extend.


@@ -8,6 +8,8 @@
"lint": "biome lint ./src ./server ./api ./tests ./e2e ./scripts ./middleware.ts",
"lint:fix": "biome check ./src ./server ./api ./tests ./e2e ./scripts ./middleware.ts --fix",
"lint:boundaries": "node scripts/lint-boundaries.mjs",
"lint:api-contract": "node scripts/enforce-sebuf-api-contract.mjs",
"lint:rate-limit-policies": "node scripts/enforce-rate-limit-policies.mjs",
"lint:unicode": "node scripts/check-unicode-safety.mjs",
"lint:unicode:staged": "node scripts/check-unicode-safety.mjs --staged",
"lint:md": "markdownlint-cli2 '**/*.md' '!**/node_modules/**' '!.agent/**' '!.agents/**' '!.claude/**' '!.factory/**' '!.windsurf/**' '!skills/**' '!docs/internal/**' '!docs/Docs_To_Review/**' '!todos/**' '!docs/plans/**'",


@@ -1097,7 +1097,7 @@ const EnterprisePage = () => (
const turnstileWidget = form.querySelector('.cf-turnstile') as HTMLElement | null;
const turnstileToken = turnstileWidget?.dataset.token || '';
try {
- const res = await fetch(`${API_BASE}/contact`, {
+ const res = await fetch(`${API_BASE}/leads/v1/submit-contact`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
@@ -1115,7 +1115,7 @@ const EnterprisePage = () => (
if (!res.ok) {
const data = await res.json().catch(() => ({}));
if (res.status === 422 && errorEl) {
- errorEl.textContent = data.error || t('enterpriseShowcase.workEmailRequired');
+ errorEl.textContent = data.message || data.error || t('enterpriseShowcase.workEmailRequired');
errorEl.classList.remove('hidden');
btn.textContent = origText;
btn.disabled = false;


@@ -0,0 +1,29 @@
syntax = "proto3";
package worldmonitor.leads.v1;
// RegisterInterestRequest carries a Pro-waitlist signup.
message RegisterInterestRequest {
string email = 1;
string source = 2;
string app_version = 3;
string referred_by = 4;
// Honeypot — bots auto-fill this hidden field; real submissions leave it empty.
string website = 5;
// Cloudflare Turnstile token. Desktop sources bypass Turnstile; see handler.
string turnstile_token = 6;
}
// RegisterInterestResponse mirrors the Convex registerInterest:register return shape.
message RegisterInterestResponse {
// "registered" for a new signup; "already_registered" for a returning email.
string status = 1;
// Stable referral code for this email.
string referral_code = 2;
// Number of signups credited to this email.
int32 referral_count = 3;
// Waitlist position at registration time. Present only when status == "registered".
int32 position = 4;
// True when the email is on the suppression list (prior bounce) and no confirmation was sent.
bool email_suppressed = 5;
}

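A client-side body for `POST /api/leads/v1/register-interest` can be built from the message above. This sketch assumes sebuf's camelCase JSON mapping (as seen in the generated OpenAPI schemas elsewhere in this PR) — the field names are an assumption, not confirmed by the proto alone. The honeypot `website` field must stay empty for real submissions:

```typescript
// Sketch: build a RegisterInterest POST body. Field names ASSUME the
// camelCase JSON mapping used by the generated OpenAPI schemas in this PR.
interface RegisterInterestBody {
  email: string;
  source: string;
  appVersion: string;
  referredBy: string;
  website: string; // honeypot — must remain empty for real submissions
  turnstileToken: string;
}

function buildRegisterInterestBody(
  email: string,
  source: string,
  opts: { appVersion?: string; referredBy?: string; turnstileToken?: string } = {},
): RegisterInterestBody {
  return {
    email,
    source,
    appVersion: opts.appVersion ?? "",
    referredBy: opts.referredBy ?? "",
    website: "", // never populated by the client UI — bots auto-fill it
    turnstileToken: opts.turnstileToken ?? "", // desktop sources bypass Turnstile
  };
}
```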

@@ -0,0 +1,22 @@
syntax = "proto3";
package worldmonitor.leads.v1;
import "sebuf/http/annotations.proto";
import "worldmonitor/leads/v1/register_interest.proto";
import "worldmonitor/leads/v1/submit_contact.proto";
// LeadsService handles public-facing lead capture: enterprise contact and Pro-waitlist signups.
service LeadsService {
option (sebuf.http.service_config) = {base_path: "/api/leads/v1"};
// SubmitContact stores an enterprise contact submission in Convex and emails ops.
rpc SubmitContact(SubmitContactRequest) returns (SubmitContactResponse) {
option (sebuf.http.config) = {path: "/submit-contact", method: HTTP_METHOD_POST};
}
// RegisterInterest adds an email to the Pro waitlist and sends a confirmation email.
rpc RegisterInterest(RegisterInterestRequest) returns (RegisterInterestResponse) {
option (sebuf.http.config) = {path: "/register-interest", method: HTTP_METHOD_POST};
}
}


@@ -0,0 +1,25 @@
syntax = "proto3";
package worldmonitor.leads.v1;
// SubmitContactRequest carries an enterprise contact form submission.
message SubmitContactRequest {
string email = 1;
string name = 2;
string organization = 3;
string phone = 4;
string message = 5;
string source = 6;
// Honeypot — bots auto-fill this hidden field; real submissions leave it empty.
string website = 7;
// Cloudflare Turnstile token proving the submitter is human.
string turnstile_token = 8;
}
// SubmitContactResponse reports the outcome of storing the lead and notifying ops.
message SubmitContactResponse {
// Always "sent" on success.
string status = 1;
// True when the Resend notification to ops was delivered.
bool email_sent = 2;
}


@@ -15,6 +15,10 @@ message GetVesselSnapshotRequest {
double sw_lat = 3 [(sebuf.http.query) = { name: "sw_lat" }];
// South-west corner longitude of bounding box.
double sw_lon = 4 [(sebuf.http.query) = { name: "sw_lon" }];
// When true, populate VesselSnapshot.candidate_reports with per-vessel
// position reports. Clients with no position callbacks should leave this
// false to keep responses small.
bool include_candidates = 5 [(sebuf.http.query) = { name: "include_candidates" }];
}
// GetVesselSnapshotResponse contains the vessel traffic snapshot.


@@ -14,6 +14,47 @@ message VesselSnapshot {
repeated AisDensityZone density_zones = 2;
// Detected AIS disruptions.
repeated AisDisruption disruptions = 3;
// Monotonic sequence number from the relay. Clients use this to detect stale
// responses during polling.
int32 sequence = 4;
// Relay health status.
AisSnapshotStatus status = 5;
// Recent position reports for individual vessels. Only populated when the
// request sets include_candidates=true — empty otherwise.
repeated SnapshotCandidateReport candidate_reports = 6;
}
// AisSnapshotStatus reports relay health at the time of the snapshot.
message AisSnapshotStatus {
// Whether the relay WebSocket is connected to the AIS provider.
bool connected = 1;
// Number of vessels currently tracked by the relay.
int32 vessels = 2;
// Total AIS messages processed in the current session.
int32 messages = 3;
}
// SnapshotCandidateReport is a per-vessel position report attached to a
// snapshot. Used to drive the client-side position callback system.
message SnapshotCandidateReport {
// Maritime Mobile Service Identity.
string mmsi = 1;
// Vessel name (may be empty if unknown).
string name = 2;
// Latitude in decimal degrees.
double lat = 3;
// Longitude in decimal degrees.
double lon = 4;
// AIS ship type code (0 if unknown).
int32 ship_type = 5;
// Heading in degrees (0-359, or 511 for unavailable).
int32 heading = 6;
// Speed over ground in knots.
double speed = 7;
// Course over ground in degrees.
int32 course = 8;
// Report timestamp, as Unix epoch milliseconds.
int64 timestamp = 9 [(sebuf.http.int64_encoding) = INT64_ENCODING_NUMBER];
}
// AisDensityZone represents a zone of concentrated vessel traffic.

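The `sequence` field's intended use — discarding stale responses during polling — can be sketched as a small client-side guard (the class name is illustrative, not from the actual client):

```typescript
// Sketch: drop out-of-order snapshots using the relay's monotonic sequence.
// Per the proto comment above, a snapshot whose sequence is not greater than
// the last accepted one is stale (or a duplicate) and should be ignored.
class SnapshotSequenceGuard {
  private lastSequence = -1;

  // Returns true when the snapshot is fresh, recording its sequence.
  accept(sequence: number): boolean {
    if (sequence <= this.lastSequence) return false; // stale or duplicate
    this.lastSequence = sequence;
    return true;
  }
}
```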

@@ -15,7 +15,10 @@ message MilitaryFlight {
];
// Aircraft callsign.
string callsign = 2;
// ICAO 24-bit hex address.
// ICAO 24-bit hex address. Canonical form is UPPERCASE — seeders and
// handlers must uppercase before writing so hex-based lookups
// (src/services/military-flights.ts:getFlightByHex) match regardless of
// upstream source casing.
string hex_code = 3;
// Aircraft registration number.
string registration = 4;

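The canonical-casing rule in the `hex_code` comment above is easy to enforce at write time. A sketch — `canonicalHexCode` is an illustrative name, not the actual seeder helper:

```typescript
// Sketch: normalize an ICAO 24-bit hex address to its canonical UPPERCASE
// form before writing, so hex-based lookups (e.g. getFlightByHex) match
// regardless of upstream source casing, per the proto comment above.
function canonicalHexCode(hex: string): string {
  return hex.trim().toUpperCase();
}
```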

@@ -0,0 +1,68 @@
syntax = "proto3";
package worldmonitor.scenario.v1;
import "buf/validate/validate.proto";
import "sebuf/http/annotations.proto";
// ScenarioImpactCountry carries a single country's scenario impact score.
message ScenarioImpactCountry {
// 2-letter ISO country code.
string iso2 = 1;
// Raw weighted impact value aggregated across the country's exposed HS2
// chapters. Relative-only — not a currency amount.
double total_impact = 2;
// Impact as a 0-100 share of the worst-hit country.
int32 impact_pct = 3;
}
// ScenarioResultTemplate carries template parameters echoed into the worker's
// computed result so clients can render them without re-looking up the
// template registry.
message ScenarioResultTemplate {
// Display name (worker derives this from affected_chokepoint_ids; may be
// `tariff_shock` for tariff-type scenarios).
string name = 1;
// 0-100 percent of chokepoint capacity blocked.
int32 disruption_pct = 2;
// Estimated duration of disruption in days.
int32 duration_days = 3;
// Freight cost multiplier applied on top of bypass corridor costs.
double cost_shock_multiplier = 4;
}
// ScenarioResult is the computed payload the scenario-worker writes back
// under the `scenario-result:{job_id}` Redis key. Populated only when
// GetScenarioStatusResponse.status == "done".
message ScenarioResult {
// Chokepoint ids disrupted by this scenario.
repeated string affected_chokepoint_ids = 1;
// Top 20 countries by aggregated impact, sorted desc by total_impact.
repeated ScenarioImpactCountry top_impact_countries = 2;
// Template parameters echoed from the registry.
ScenarioResultTemplate template = 3;
}
// GetScenarioStatusRequest polls the worker result for an enqueued job id.
message GetScenarioStatusRequest {
// Job id of the form `scenario:{epoch_ms}:{8-char-suffix}`. Path-traversal
// guarded by JOB_ID_RE in the handler.
string job_id = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.pattern = "^scenario:[0-9]{13}:[a-z0-9]{8}$",
(sebuf.http.query) = {name: "jobId"}
];
}
// GetScenarioStatusResponse reflects the worker's lifecycle state.
// "pending" — no key yet (job still queued or very-recent enqueue).
// "processing" — worker has claimed the job but hasn't completed compute.
// "done" — compute succeeded; `result` is populated.
// "failed" — compute errored; `error` is populated.
message GetScenarioStatusResponse {
string status = 1;
// Populated only when status == "done".
ScenarioResult result = 2;
// Populated only when status == "failed".
string error = 3;
}

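The `job_id` constraint doubles as a client-side guard before building a status URL. A sketch reusing the exact pattern from the validation rule above — the constant mirrors the handler's JOB_ID_RE per the comment, but is reproduced here rather than imported:

```typescript
// Sketch: pre-validate a scenario job id before polling. Pattern reproduced
// from the buf.validate rule above: `scenario:` + 13-digit epoch-ms +
// `:` + 8-char lowercase alphanumeric suffix. Rejects path-traversal input.
const JOB_ID_RE = /^scenario:[0-9]{13}:[a-z0-9]{8}$/;

function isValidScenarioJobId(jobId: string): boolean {
  return JOB_ID_RE.test(jobId);
}
```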

@@ -0,0 +1,28 @@
syntax = "proto3";
package worldmonitor.scenario.v1;
// ScenarioTemplate mirrors the catalog shape served by
// GET /api/scenario/v1/list-scenario-templates. The authoritative template
// registry lives in server/worldmonitor/supply-chain/v1/scenario-templates.ts.
message ScenarioTemplate {
string id = 1;
string name = 2;
// Chokepoint ids this scenario disrupts. Empty for tariff-shock scenarios
// that have no physical chokepoint closure.
repeated string affected_chokepoint_ids = 3;
// 0-100 percent of chokepoint capacity blocked.
int32 disruption_pct = 4;
// Estimated duration of disruption in days.
int32 duration_days = 5;
// HS2 chapter codes affected. Empty means ALL sectors are affected.
repeated string affected_hs2 = 6;
// Freight cost multiplier applied on top of bypass corridor costs.
double cost_shock_multiplier = 7;
}
message ListScenarioTemplatesRequest {}
message ListScenarioTemplatesResponse {
repeated ScenarioTemplate templates = 1;
}

View File

@@ -0,0 +1,43 @@
syntax = "proto3";
package worldmonitor.scenario.v1;
import "buf/validate/validate.proto";
// RunScenarioRequest enqueues a scenario job on the scenario-queue:pending
// Upstash list for the async scenario-worker to pick up.
message RunScenarioRequest {
// Scenario template id — must match an entry in SCENARIO_TEMPLATES.
string scenario_id = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.min_len = 1,
(buf.validate.field).string.max_len = 128
];
// Optional 2-letter ISO country code to scope the impact computation.
// When absent, the worker computes for all countries with seeded exposure.
string iso2 = 2 [
(buf.validate.field).string.pattern = "^([A-Z]{2})?$"
];
}
// RunScenarioResponse carries the enqueued job id. Clients poll
// GetScenarioStatus with this id until status != "pending".
//
// NOTE: the legacy (pre-sebuf) endpoint returned HTTP 202 Accepted on
// enqueue; the sebuf-generated server emits 200 OK for all successful
// responses (no per-RPC status-code configuration is available in the
// current sebuf HTTP annotations). The 202 → 200 shift on a same-version
// (v1 → v1) migration is called out in docs/api-scenarios.mdx and the
// OpenAPI bundle; external consumers keying off `response.status === 202`
// need to branch on response body shape instead.
message RunScenarioResponse {
// Generated job id of the form `scenario:{epoch_ms}:{8-char-suffix}`.
string job_id = 1;
// Always "pending" at enqueue time.
string status = 2;
// Convenience URL the caller can use to poll this job's status.
// Server-computed as `/api/scenario/v1/get-scenario-status?jobId=<job_id>`.
// Restored after the v1 → v1 sebuf migration because external callers
// may key off this field.
string status_url = 3;
}
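
Per the 202 → 200 note above, pollers should branch on body shape rather than HTTP status. A minimal sketch of that dispatch, assuming the JSON field names follow the proto messages in this file and the status-lifecycle strings documented on GetScenarioStatusResponse:

```javascript
// Classifies a scenario poll response by body shape, not HTTP status code,
// since the sebuf gateway returns 200 OK for every successful response.
// Field/status names are taken from the proto comments above; the dispatch
// itself is illustrative, not the shipped client.
function classifyScenarioResponse(body) {
  if (body.status === 'done') return { kind: 'done', result: body.result };
  if (body.status === 'failed') return { kind: 'failed', error: body.error };
  if (body.status === 'pending' || body.status === 'processing') {
    return { kind: 'in-progress' };
  }
  return { kind: 'unknown' };
}

console.log(classifyScenarioResponse({ status: 'pending' })); // { kind: 'in-progress' }
```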

View File

@@ -0,0 +1,34 @@
syntax = "proto3";
package worldmonitor.scenario.v1;
import "sebuf/http/annotations.proto";
import "worldmonitor/scenario/v1/run_scenario.proto";
import "worldmonitor/scenario/v1/get_scenario_status.proto";
import "worldmonitor/scenario/v1/list_scenario_templates.proto";
// ScenarioService exposes the scenario engine: enqueue a scenario job,
// poll its status, and list the template catalog.
service ScenarioService {
option (sebuf.http.service_config) = {base_path: "/api/scenario/v1"};
// RunScenario enqueues a scenario job on scenario-queue:pending. PRO-gated.
// The scenario-worker (scripts/scenario-worker.mjs) pulls jobs off the
// queue via BLMOVE and writes results under scenario-result:{job_id}.
rpc RunScenario(RunScenarioRequest) returns (RunScenarioResponse) {
option (sebuf.http.config) = {path: "/run-scenario", method: HTTP_METHOD_POST};
}
// GetScenarioStatus polls a single job's result. PRO-gated.
// Returns status="pending" when no result key exists, mirroring the
// worker's lifecycle state once the key is written.
rpc GetScenarioStatus(GetScenarioStatusRequest) returns (GetScenarioStatusResponse) {
option (sebuf.http.config) = {path: "/get-scenario-status", method: HTTP_METHOD_GET};
}
// ListScenarioTemplates returns the catalog of pre-defined scenarios.
// Not PRO-gated — used by documented public API consumers.
rpc ListScenarioTemplates(ListScenarioTemplatesRequest) returns (ListScenarioTemplatesResponse) {
option (sebuf.http.config) = {path: "/list-scenario-templates", method: HTTP_METHOD_GET};
}
}

View File

@@ -0,0 +1,25 @@
syntax = "proto3";
package worldmonitor.shipping.v2;
// ListWebhooksRequest has no fields — the owner is derived from the caller's
// API-key fingerprint (SHA-256 of X-WorldMonitor-Key).
message ListWebhooksRequest {}
// Single webhook record in the list response. `secret` is intentionally
// omitted; use rotate-secret to obtain a new one.
message WebhookSummary {
string subscriber_id = 1;
string callback_url = 2;
repeated string chokepoint_ids = 3;
int32 alert_threshold = 4;
// ISO-8601 timestamp of registration.
string created_at = 5;
bool active = 6;
}
// ListWebhooksResponse wire shape preserved exactly: the `webhooks` field
// name and the omission of `secret` are part of the partner contract.
message ListWebhooksResponse {
repeated WebhookSummary webhooks = 1;
}

View File

@@ -0,0 +1,36 @@
syntax = "proto3";
package worldmonitor.shipping.v2;
import "buf/validate/validate.proto";
// RegisterWebhookRequest creates a new chokepoint-disruption webhook
// subscription. Wire shape is byte-compatible with the pre-migration
// legacy POST body.
message RegisterWebhookRequest {
// HTTPS callback URL. Must not resolve to a private/loopback address at
// registration time (SSRF guard). The delivery worker re-validates the
// resolved IP before each send to mitigate DNS rebinding.
string callback_url = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.min_len = 8,
(buf.validate.field).string.max_len = 2048
];
// Zero or more chokepoint IDs to subscribe to. Empty list subscribes to
// the entire CHOKEPOINT_REGISTRY. Unknown IDs fail with 400.
repeated string chokepoint_ids = 2;
// Disruption-score threshold for delivery, 0-100. Default 50.
int32 alert_threshold = 3 [
(buf.validate.field).int32.gte = 0,
(buf.validate.field).int32.lte = 100
];
}
// RegisterWebhookResponse wire shape preserved exactly — partners persist the
// `secret` because the server never returns it again except via rotate-secret.
message RegisterWebhookResponse {
// `wh_` prefix + 24 lowercase hex chars (12 random bytes).
string subscriber_id = 1;
// Raw 64-char lowercase hex secret (32 random bytes). No `whsec_` prefix.
string secret = 2;
}
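
Since the raw hex `secret` is only ever returned at registration (or rotation), partners use it to verify delivery signatures. The header name and the hex-HMAC-SHA256-over-raw-body scheme below are assumptions — this diff does not show the delivery worker's actual signing format — but the constant-time comparison pattern is the part worth copying:

```javascript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Partner-side verification sketch. `secret` is the raw 64-char lowercase hex
// string from RegisterWebhookResponse. The signing scheme is an assumption.
function verifySignature(rawBody, signatureHex, secret) {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected, 'hex');
  const b = Buffer.from(signatureHex, 'hex');
  // timingSafeEqual throws on length mismatch, so check length first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```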

View File

@@ -0,0 +1,70 @@
syntax = "proto3";
package worldmonitor.shipping.v2;
import "buf/validate/validate.proto";
import "sebuf/http/annotations.proto";
// RouteIntelligenceRequest scopes a route-intelligence query by origin and
// destination country. Query-parameter names are preserved verbatim from the
// legacy partner contract (fromIso2/toIso2/cargoType/hs2 — camelCase).
message RouteIntelligenceRequest {
// Origin country, ISO-3166-1 alpha-2 uppercase.
string from_iso2 = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.pattern = "^[A-Z]{2}$",
(sebuf.http.query) = { name: "fromIso2" }
];
// Destination country, ISO-3166-1 alpha-2 uppercase.
string to_iso2 = 2 [
(buf.validate.field).required = true,
(buf.validate.field).string.pattern = "^[A-Z]{2}$",
(sebuf.http.query) = { name: "toIso2" }
];
// Cargo type — one of: container (default), tanker, bulk, roro.
// Empty string defers to the server default. Unknown values are coerced to
// "container" to preserve legacy behavior.
string cargo_type = 3 [(sebuf.http.query) = { name: "cargoType" }];
// 2-digit HS commodity code (default "27" — mineral fuels). Non-digit
// characters are stripped server-side to match legacy behavior.
string hs2 = 4 [(sebuf.http.query) = { name: "hs2" }];
}
// Single chokepoint exposure for a route.
message ChokepointExposure {
string chokepoint_id = 1;
string chokepoint_name = 2;
int32 exposure_pct = 3;
}
// Single bypass-corridor option around a disrupted chokepoint.
message BypassOption {
string id = 1;
string name = 2;
// Type of bypass (e.g., "maritime_detour", "land_corridor").
string type = 3;
int32 added_transit_days = 4;
double added_cost_multiplier = 5;
// Enum-like string, e.g., "DISRUPTION_SCORE_60".
string activation_threshold = 6;
}
// RouteIntelligenceResponse wire shape preserved byte-for-byte from the
// pre-migration JSON at docs/api-shipping-v2.mdx. `fetched_at` is intentionally
// a string (ISO-8601) rather than int64 epoch ms because partners depend on
// the ISO-8601 shape.
message RouteIntelligenceResponse {
string from_iso2 = 1;
string to_iso2 = 2;
string cargo_type = 3;
string hs2 = 4;
string primary_route_id = 5;
repeated ChokepointExposure chokepoint_exposures = 6;
repeated BypassOption bypass_options = 7;
// War-risk tier enum string, e.g., "WAR_RISK_TIER_NORMAL" or "WAR_RISK_TIER_ELEVATED".
string war_risk_tier = 8;
// Disruption score of the primary chokepoint, 0-100.
int32 disruption_score = 9;
// ISO-8601 timestamp of when the response was assembled.
string fetched_at = 10;
}
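
A caller-side sketch of the preserved camelCase query names, built from the `(sebuf.http.query)` annotations above (the path comes from the service's base_path; the helper itself is illustrative):

```javascript
// Builds a route-intelligence URL using the legacy partner query names
// (fromIso2/toIso2/cargoType/hs2) preserved by the proto annotations.
function routeIntelligenceUrl(fromIso2, toIso2, { cargoType, hs2 } = {}) {
  const params = new URLSearchParams({ fromIso2, toIso2 });
  if (cargoType) params.set('cargoType', cargoType);
  if (hs2) params.set('hs2', hs2);
  return `/api/v2/shipping/route-intelligence?${params}`;
}

console.log(routeIntelligenceUrl('SG', 'NL', { hs2: '27' }));
// /api/v2/shipping/route-intelligence?fromIso2=SG&toIso2=NL&hs2=27
```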

View File

@@ -0,0 +1,42 @@
syntax = "proto3";
package worldmonitor.shipping.v2;
import "sebuf/http/annotations.proto";
import "worldmonitor/shipping/v2/route_intelligence.proto";
import "worldmonitor/shipping/v2/register_webhook.proto";
import "worldmonitor/shipping/v2/list_webhooks.proto";
// ShippingV2Service is the partner-facing (vendor) surface for chokepoint
// route intelligence and disruption-alert webhooks. PRO-gated, authenticated
// via X-WorldMonitor-Key (server-to-server; browser origins are NOT exempt).
//
// The base_path intentionally reverses the usual /api/{domain}/v{N} ordering
// because the existing partner contract is /api/v2/shipping/*. The proto does
// not model the path-parameter endpoints that also live on this surface
// (GET /webhooks/{id}, POST /webhooks/{id}/rotate-secret, POST /webhooks/{id}/reactivate);
// those remain on the legacy file layout until sebuf path-params are supported.
service ShippingV2Service {
option (sebuf.http.service_config) = {base_path: "/api/v2/shipping"};
// RouteIntelligence scores a country-pair trade route for chokepoint exposure
// and current disruption risk. Partner-facing; wire shape is byte-compatible
// with the pre-migration JSON response documented at docs/api-shipping-v2.mdx.
rpc RouteIntelligence(RouteIntelligenceRequest) returns (RouteIntelligenceResponse) {
option (sebuf.http.config) = {path: "/route-intelligence", method: HTTP_METHOD_GET};
}
// RegisterWebhook subscribes a callback URL to chokepoint disruption alerts.
// Returns the subscriberId and the raw HMAC secret — the secret is never
// returned again except via rotate-secret.
rpc RegisterWebhook(RegisterWebhookRequest) returns (RegisterWebhookResponse) {
option (sebuf.http.config) = {path: "/webhooks", method: HTTP_METHOD_POST};
}
// ListWebhooks returns the caller's registered webhooks filtered by the
// SHA-256 owner tag of the calling API key. The `secret` is intentionally
// omitted from the response; use rotate-secret to obtain a new one.
rpc ListWebhooks(ListWebhooksRequest) returns (ListWebhooksResponse) {
option (sebuf.http.config) = {path: "/webhooks", method: HTTP_METHOD_GET};
}
}

View File

@@ -0,0 +1,36 @@
syntax = "proto3";
package worldmonitor.supply_chain.v1;
import "buf/validate/validate.proto";
import "sebuf/http/annotations.proto";
message ProductExporter {
int32 partner_code = 1;
string partner_iso2 = 2;
double value = 3;
double share = 4;
}
message CountryProduct {
string hs4 = 1;
string description = 2;
double total_value = 3;
repeated ProductExporter top_exporters = 4;
int32 year = 5;
}
message GetCountryProductsRequest {
string iso2 = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.len = 2,
(buf.validate.field).string.pattern = "^[A-Z]{2}$",
(sebuf.http.query) = {name: "iso2"}
];
}
message GetCountryProductsResponse {
string iso2 = 1;
repeated CountryProduct products = 2;
// ISO timestamp from the seeded payload (empty when no data is cached).
string fetched_at = 3;
}

View File

@@ -0,0 +1,58 @@
syntax = "proto3";
package worldmonitor.supply_chain.v1;
import "buf/validate/validate.proto";
import "sebuf/http/annotations.proto";
import "worldmonitor/supply_chain/v1/supply_chain_data.proto";
message MultiSectorCostShock {
// HS2 chapter code (e.g. "27" mineral fuels, "85" electronics).
string hs2 = 1;
// Friendly chapter label (e.g. "Energy", "Electronics").
string hs2_label = 2;
// Total annual import value (USD) for this sector.
double import_value_annual = 3;
// Bypass-corridor freight uplift fraction (0.10 == +10% per ton).
double freight_added_pct_per_ton = 4;
// War-risk insurance premium (basis points) sourced from the chokepoint tier.
int32 war_risk_premium_bps = 5;
// Bypass-corridor transit penalty (informational).
int32 added_transit_days = 6;
double total_cost_shock_per_day = 7;
double total_cost_shock_30_days = 8;
double total_cost_shock_90_days = 9;
// Cost for the requested closure_days window.
double total_cost_shock = 10;
// Echoes the clamped closure duration used for total_cost_shock (1-365).
int32 closure_days = 11;
}
message GetMultiSectorCostShockRequest {
string iso2 = 1 [
(buf.validate.field).required = true,
(buf.validate.field).string.len = 2,
(buf.validate.field).string.pattern = "^[A-Z]{2}$",
(sebuf.http.query) = {name: "iso2"}
];
string chokepoint_id = 2 [
(buf.validate.field).required = true,
(sebuf.http.query) = {name: "chokepointId"}
];
// Closure-window duration in days. Server clamps to [1, 365]. Defaults to 30.
int32 closure_days = 3 [(sebuf.http.query) = {name: "closureDays"}];
}
message GetMultiSectorCostShockResponse {
string iso2 = 1;
string chokepoint_id = 2;
// Server-clamped closure-window duration in days (1-365).
int32 closure_days = 3;
WarRiskTier war_risk_tier = 4;
// Per-sector shock entries (10 seeded HS2 codes), sorted by total_cost_shock_per_day desc.
repeated MultiSectorCostShock sectors = 5;
// Sum of total_cost_shock across all sectors.
double total_added_cost = 6;
string fetched_at = 7;
// Populated when no seeded import data is available for the country.
string unavailable_reason = 8;
}
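
The closure-window semantics documented on `closure_days` (default 30, server-clamped to [1, 365]) can be sketched as a one-liner. The exact coercion of non-numeric input is an assumption; only the default and the clamp range come from the proto comments above:

```javascript
// Mirrors the documented closureDays behavior: default 30, clamp to [1, 365].
// Non-numeric handling is assumed, not confirmed by this diff.
function clampClosureDays(raw) {
  const n = Number.parseInt(raw, 10);
  if (!Number.isFinite(n)) return 30;
  return Math.min(365, Math.max(1, n));
}

console.log(clampClosureDays('90'));   // 90
console.log(clampClosureDays('0'));    // 1
console.log(clampClosureDays('9999')); // 365
```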

View File

@@ -11,6 +11,8 @@ import "worldmonitor/supply_chain/v1/get_shipping_stress.proto";
import "worldmonitor/supply_chain/v1/get_country_chokepoint_index.proto";
import "worldmonitor/supply_chain/v1/get_bypass_options.proto";
import "worldmonitor/supply_chain/v1/get_country_cost_shock.proto";
import "worldmonitor/supply_chain/v1/get_country_products.proto";
import "worldmonitor/supply_chain/v1/get_multi_sector_cost_shock.proto";
import "worldmonitor/supply_chain/v1/get_sector_dependency.proto";
import "worldmonitor/supply_chain/v1/get_route_explorer_lane.proto";
import "worldmonitor/supply_chain/v1/get_route_impact.proto";
@@ -57,6 +59,17 @@ service SupplyChainService {
option (sebuf.http.config) = {path: "/get-country-cost-shock", method: HTTP_METHOD_GET};
}
// GetCountryProducts returns the seeded bilateral-HS4 import basket for a country. PRO-gated.
rpc GetCountryProducts(GetCountryProductsRequest) returns (GetCountryProductsResponse) {
option (sebuf.http.config) = {path: "/get-country-products", method: HTTP_METHOD_GET};
}
// GetMultiSectorCostShock returns per-sector cost-shock estimates for a
// country+chokepoint+closure-window. PRO-gated.
rpc GetMultiSectorCostShock(GetMultiSectorCostShockRequest) returns (GetMultiSectorCostShockResponse) {
option (sebuf.http.config) = {path: "/get-multi-sector-cost-shock", method: HTTP_METHOD_GET};
}
// GetSectorDependency returns dependency flags and risk profile for a country+HS2 sector. PRO-gated.
rpc GetSectorDependency(GetSectorDependencyRequest) returns (GetSectorDependencyResponse) {
option (sebuf.http.config) = {path: "/get-sector-dependency", method: HTTP_METHOD_GET};

File diff suppressed because one or more lines are too long

View File

@@ -144,7 +144,7 @@
}
</script>
<script src="https://challenges.cloudflare.com/turnstile/v0/api.js?render=explicit" async defer></script>
<script type="module" crossorigin src="/pro/assets/index-BzWOWshY.js"></script>
<script type="module" crossorigin src="/pro/assets/index-BzYxK1gb.js"></script>
<link rel="stylesheet" crossorigin href="/pro/assets/index-InU6PrNf.css">
</head>
<body>

View File

@@ -0,0 +1,76 @@
#!/usr/bin/env node
/**
* Validates every key in ENDPOINT_RATE_POLICIES (server/_shared/rate-limit.ts)
* is a real gateway route by checking the OpenAPI specs generated from protos.
* Catches rename-drift that causes policies to become dead code (the
* sanctions-entity-search review finding — the policy key was
* `/api/sanctions/v1/lookup-entity` but the proto RPC generates path
* `/api/sanctions/v1/lookup-sanction-entity`, so the 30/min limit never
* applied and the endpoint fell through to the 600/min global limiter).
*
* Runs in the same pre-push + CI context as lint:api-contract.
*/
import { readFileSync, readdirSync } from 'node:fs';
import { join } from 'node:path';
const ROOT = new URL('..', import.meta.url).pathname;
const OPENAPI_DIR = join(ROOT, 'docs/api');
const RATE_LIMIT_SRC = join(ROOT, 'server/_shared/rate-limit.ts');
function extractPolicyKeys() {
const src = readFileSync(RATE_LIMIT_SRC, 'utf8');
const match = src.match(/ENDPOINT_RATE_POLICIES:\s*Record<[^>]+>\s*=\s*\{([\s\S]*?)\n\};/);
if (!match) {
throw new Error('Could not locate ENDPOINT_RATE_POLICIES in rate-limit.ts');
}
const block = match[1];
const keys = [];
// Match quoted keys: '/api/...' or "/api/..."
const keyRe = /['"](\/api\/[^'"]+)['"]\s*:/g;
let m;
while ((m = keyRe.exec(block)) !== null) {
keys.push(m[1]);
}
return keys;
}
function extractRoutesFromOpenApi() {
const routes = new Set();
const files = readdirSync(OPENAPI_DIR).filter((f) => f.endsWith('.openapi.yaml'));
for (const file of files) {
const yaml = readFileSync(join(OPENAPI_DIR, file), 'utf8');
// OpenAPI paths section — each route is a key directly under `paths:`,
// indented 4 spaces in the generated bundles. Strip the trailing colon.
const pathRe = /^\s{4}(\/api\/[^\s:]+):/gm;
let m;
while ((m = pathRe.exec(yaml)) !== null) {
routes.add(m[1]);
}
}
return routes;
}
function main() {
const keys = extractPolicyKeys();
const routes = extractRoutesFromOpenApi();
const missing = keys.filter((k) => !routes.has(k));
if (missing.length > 0) {
console.error('✗ ENDPOINT_RATE_POLICIES key(s) do not match any generated gateway route:\n');
for (const key of missing) {
console.error(` - ${key}`);
}
console.error('\nEach key must be a proto-generated RPC path. Check that:');
console.error(' 1. The key matches the path in docs/api/<Service>.openapi.yaml exactly.');
console.error(' 2. If you renamed the RPC in proto, update the policy key to match.');
console.error(' 3. If the policy is for a non-proto legacy route, remove it once that route is migrated.\n');
console.error('Similar issues in history: review of #3242 flagged the sanctions-entity-search');
console.error('policy under `/api/sanctions/v1/lookup-entity` when the generated path was');
console.error('`/api/sanctions/v1/lookup-sanction-entity` — the policy was dead code.');
process.exit(1);
}
console.log(`✓ rate-limit policies clean: ${keys.length} policies validated against ${routes.size} gateway routes.`);
}
main();
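
The key-extraction regex from `extractPolicyKeys` is worth seeing against a sample block — it picks up both quote styles and ignores non-`/api/` keys and quoted values like `'60 s'` (the sample below is illustrative, not the real policy map):

```javascript
// The same keyRe as in extractPolicyKeys, run over a representative block.
const keyRe = /['"](\/api\/[^'"]+)['"]\s*:/g;
const sample = `
  '/api/scenario/v1/run-scenario': { limit: 10, window: '60 s' },
  "/api/leads/v1/submit-contact": { limit: 3, window: '1 h' },
  someOtherKey: { limit: 1, window: '60 s' },
`;
const keys = [...sample.matchAll(keyRe)].map((m) => m[1]);
console.log(keys);
// [ '/api/scenario/v1/run-scenario', '/api/leads/v1/submit-contact' ]
```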

View File

@@ -0,0 +1,315 @@
#!/usr/bin/env node
/**
* Sebuf API contract enforcement.
*
* Every file under api/ must be one of:
* 1. A sebuf gateway — api/<kebab-domain>/v<N>/[rpc].ts paired with a
* generated service_server under src/generated/server/worldmonitor/<snake_domain>/v<N>/.
* 2. A listed entry in api/api-route-exceptions.json with category, reason,
* owner, and (for temporary categories) a removal_issue.
*
* Also checks the reverse: every generated service has a gateway. This catches
* the case where a proto is deleted but the gateway wrapper is left behind.
*
* Skips: underscore-prefixed helpers, *.test.*, and anything gitignored (the
* compiled sidecar bundles at api/[[...path]].js and api/<domain>/v1/[rpc].js
* are build artifacts, not source).
*
* Exit 0 clean, 1 on any violation. Output is agent-readable: file:line + remedy.
*/
import { existsSync, readFileSync, readdirSync } from 'node:fs';
import { join, relative, sep } from 'node:path';
import { execFileSync } from 'node:child_process';
const ROOT = process.cwd();
const API_DIR = join(ROOT, 'api');
const GEN_SERVER_DIR = join(ROOT, 'src/generated/server/worldmonitor');
const MANIFEST_PATH = join(API_DIR, 'api-route-exceptions.json');
const VALID_CATEGORIES = new Set([
'external-protocol',
'non-json',
'upstream-proxy',
'ops-admin',
'internal-helper',
'deferred',
'migration-pending',
]);
// Categories that describe *permanent* exceptions — never expected to leave the
// manifest. A removal_issue on these would be misleading.
const PERMANENT_CATEGORIES = new Set([
'external-protocol',
'non-json',
'upstream-proxy',
'ops-admin',
'internal-helper',
]);
const violations = [];
function violation(file, message, remedy) {
violations.push({ file, message, remedy });
}
// --- Enumerate candidate files under api/ ---
function walk(dir, acc = []) {
for (const entry of readdirSync(dir, { withFileTypes: true })) {
const full = join(dir, entry.name);
if (entry.isDirectory()) {
walk(full, acc);
} else {
acc.push(full);
}
}
return acc;
}
const SOURCE_EXTS = ['.ts', '.tsx', '.js', '.mjs', '.cjs'];
function isSourceFile(path) {
const base = path.split(sep).pop();
if (base.startsWith('_')) return false;
if (base.includes('.test.')) return false;
return SOURCE_EXTS.some((ext) => base.endsWith(ext));
}
const allApiFiles = walk(API_DIR).filter(isSourceFile);
// Filter out gitignored paths in one batch.
function filterIgnored(files) {
if (files.length === 0) return [];
const relPaths = files.map((f) => relative(ROOT, f)).join('\n');
let ignored = new Set();
try {
// --stdin returns the ignored paths (one per line). Exit 0 = some matched,
// 1 = none matched, 128 = error. We treat 0 and 1 as success.
const output = execFileSync('git', ['check-ignore', '--stdin'], {
input: relPaths,
encoding: 'utf8',
stdio: ['pipe', 'pipe', 'pipe'],
});
ignored = new Set(output.split('\n').filter(Boolean));
} catch (err) {
// exit code 1 = no paths ignored; treat as empty.
if (err.status === 1) {
ignored = new Set();
} else {
throw err;
}
}
return files.filter((f) => !ignored.has(relative(ROOT, f)));
}
const candidateFiles = filterIgnored(allApiFiles);
// --- Load manifest ---
if (!existsSync(MANIFEST_PATH)) {
console.error(`${relative(ROOT, MANIFEST_PATH)} is missing.`);
console.error(
' Remedy: restore api-route-exceptions.json (see docs/adding-endpoints.mdx). It is the single source of truth for non-proto endpoints.',
);
process.exit(1);
}
let manifest;
try {
manifest = JSON.parse(readFileSync(MANIFEST_PATH, 'utf8'));
} catch (err) {
console.error(`${relative(ROOT, MANIFEST_PATH)} is not valid JSON: ${err.message}`);
process.exit(1);
}
if (!Array.isArray(manifest.exceptions)) {
console.error(`${relative(ROOT, MANIFEST_PATH)} is missing the "exceptions" array.`);
process.exit(1);
}
// Validate every manifest entry's shape.
const manifestByPath = new Map();
for (const [idx, entry] of manifest.exceptions.entries()) {
const label = `api-route-exceptions.json[${idx}]`;
if (typeof entry.path !== 'string' || entry.path.length === 0) {
violation(label, 'entry is missing a non-empty "path" string', 'Set "path" to the api/ path this entry covers.');
continue;
}
if (manifestByPath.has(entry.path)) {
violation(
label,
`duplicate entry for path "${entry.path}"`,
'Remove the duplicate — one entry per path.',
);
}
manifestByPath.set(entry.path, entry);
if (!VALID_CATEGORIES.has(entry.category)) {
violation(
label,
`invalid category "${entry.category}" for ${entry.path}`,
`category must be one of: ${[...VALID_CATEGORIES].join(', ')}.`,
);
}
if (typeof entry.reason !== 'string' || entry.reason.trim().length < 10) {
violation(
label,
`entry for ${entry.path} is missing a substantive "reason" (≥10 chars)`,
'Write a one-sentence reason explaining why this endpoint cannot or should not be a sebuf RPC.',
);
}
if (typeof entry.owner !== 'string' || !entry.owner.startsWith('@')) {
violation(
label,
`entry for ${entry.path} has an invalid "owner" (must be a GitHub handle starting with @)`,
'Set "owner" to a GitHub handle like @SebastienMelki.',
);
}
if (entry.removal_issue !== null && entry.removal_issue !== undefined) {
if (typeof entry.removal_issue !== 'string') {
violation(
label,
`entry for ${entry.path} has non-string "removal_issue"`,
'Set "removal_issue" to null, "TBD", or an issue reference like "#3207".',
);
} else if (
entry.removal_issue !== 'TBD' &&
!/^#\d+$/.test(entry.removal_issue)
) {
violation(
label,
`entry for ${entry.path} has malformed "removal_issue" "${entry.removal_issue}"`,
'Use null for permanent exceptions, "TBD" while an issue is being filed, or "#<number>" once tracked.',
);
}
}
if (PERMANENT_CATEGORIES.has(entry.category) && entry.removal_issue) {
violation(
label,
`entry for ${entry.path} is category "${entry.category}" but has a "removal_issue" set`,
'Permanent categories (external-protocol, non-json, upstream-proxy, ops-admin, internal-helper) do not track removal. Set removal_issue to null.',
);
}
if (!PERMANENT_CATEGORIES.has(entry.category) && !entry.removal_issue) {
violation(
label,
`entry for ${entry.path} is category "${entry.category}" but has no "removal_issue"`,
'Temporary categories (deferred, migration-pending) must declare a tracking issue or "TBD".',
);
}
// Reverse pointer: manifest must not name a file that doesn't exist.
const absolute = join(ROOT, entry.path);
if (!existsSync(absolute)) {
violation(
label,
`entry for ${entry.path} points to a file that does not exist`,
'Remove the entry if the file was deleted, or fix the path.',
);
}
}
// --- Classify each api/ source file ---
// Sebuf gateway pattern — two accepted forms:
// 1. api/<kebab-domain>/v<N>/[rpc].ts (standard, domain-first)
// 2. api/v<N>/<kebab-domain>/[rpc].ts (version-first, for partner-URL
// preservation where the external contract already uses that layout —
// e.g. /api/v2/shipping/*).
// Both map to src/generated/server/worldmonitor/<snake_domain>/v<N>/.
const GATEWAY_RE = /^api\/(?:([a-z][a-z0-9-]*)\/v(\d+)|v(\d+)\/([a-z][a-z0-9-]*))\/\[rpc\]\.(ts|tsx|js|mjs|cjs)$/;
function kebabToSnake(s) {
return s.replace(/-/g, '_');
}
const seenGatewayDomains = new Set();
for (const absolute of candidateFiles) {
const rel = relative(ROOT, absolute).split(sep).join('/');
// Skip the manifest itself — it isn't an endpoint.
if (rel === 'api/api-route-exceptions.json') continue;
const gatewayMatch = rel.match(GATEWAY_RE);
if (gatewayMatch) {
// Group 1+2 = standard form (domain, version); 3+4 = version-first form (version, domain).
const domain = gatewayMatch[1] ?? gatewayMatch[4];
const version = gatewayMatch[2] ?? gatewayMatch[3];
const snakeDomain = kebabToSnake(domain);
const expectedServer = join(
GEN_SERVER_DIR,
snakeDomain,
`v${version}`,
'service_server.ts',
);
seenGatewayDomains.add(`${snakeDomain}/v${version}`);
if (!existsSync(expectedServer)) {
violation(
rel,
`sebuf gateway for /${domain}/v${version} has no matching generated service`,
`Expected ${relative(ROOT, expectedServer)}. Either regenerate (cd proto && buf generate), restore the proto under proto/worldmonitor/${snakeDomain}/v${version}/service.proto, or delete this orphaned gateway.`,
);
}
continue;
}
if (manifestByPath.has(rel)) {
// The entry was already validated above. Nothing more to check here.
continue;
}
violation(
rel,
'file under api/ is neither a sebuf gateway nor a listed exception',
'New JSON data APIs must be sebuf RPCs (proto → buf generate → handler). See docs/adding-endpoints.mdx. If this endpoint genuinely cannot be proto (external protocol, binary response, upstream proxy, ops plumbing), add an entry to api/api-route-exceptions.json — expect reviewer pushback.',
);
}
// --- Bidirectional check: every generated service has a gateway ---
if (existsSync(GEN_SERVER_DIR)) {
for (const domainDir of readdirSync(GEN_SERVER_DIR, { withFileTypes: true })) {
if (!domainDir.isDirectory()) continue;
const snakeDomain = domainDir.name;
const domainPath = join(GEN_SERVER_DIR, snakeDomain);
for (const versionDir of readdirSync(domainPath, { withFileTypes: true })) {
if (!versionDir.isDirectory()) continue;
if (!/^v\d+$/.test(versionDir.name)) continue;
const serviceServer = join(
domainPath,
versionDir.name,
'service_server.ts',
);
if (!existsSync(serviceServer)) continue;
const key = `${snakeDomain}/${versionDir.name}`;
if (!seenGatewayDomains.has(key)) {
const kebabDomain = snakeDomain.replace(/_/g, '-');
violation(
relative(ROOT, serviceServer),
`generated service ${snakeDomain}/${versionDir.name} has no HTTP gateway under api/`,
`Create api/${kebabDomain}/${versionDir.name}/[rpc].ts (follow the pattern from any existing domain — it just imports the generated server factory and re-exports as the edge handler).`,
);
}
}
}
}
// --- Output ---
if (violations.length === 0) {
console.log(
`✓ sebuf API contract clean: ${candidateFiles.length} api/ files checked, ${manifest.exceptions.length} manifest entries validated.`,
);
process.exit(0);
}
console.error(`✖ sebuf API contract: ${violations.length} violation(s):\n`);
for (const v of violations) {
console.error(` ${v.file}`);
console.error(` ${v.message}`);
console.error(` Remedy: ${v.remedy}`);
console.error('');
}
process.exit(1);
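
The two accepted gateway layouts that GATEWAY_RE encodes (standard domain-first and version-first for partner-URL preservation) look like this in isolation — the regex is copied verbatim from the script above, the `parseGateway` wrapper is just a demo:

```javascript
// Both accepted gateway path forms, using the script's GATEWAY_RE.
const GATEWAY_RE = /^api\/(?:([a-z][a-z0-9-]*)\/v(\d+)|v(\d+)\/([a-z][a-z0-9-]*))\/\[rpc\]\.(ts|tsx|js|mjs|cjs)$/;

function parseGateway(rel) {
  const m = rel.match(GATEWAY_RE);
  if (!m) return null;
  // Groups 1+2 = domain-first form; groups 3+4 = version-first form.
  return { domain: m[1] ?? m[4], version: m[2] ?? m[3] };
}

console.log(parseGateway('api/supply-chain/v1/[rpc].ts'));
// { domain: 'supply-chain', version: '1' }
console.log(parseGateway('api/v2/shipping/[rpc].ts'));
// { domain: 'shipping', version: '2' }
console.log(parseGateway('api/supply-chain/country-products.ts')); // null
```

The `null` case is exactly the drift this script exists to catch: a nested `.ts` endpoint that is neither a gateway nor a manifest entry.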

View File

@@ -1,4 +1,4 @@
const DISPOSABLE_DOMAINS = new Set([
const DISPOSABLE_DOMAINS = new Set<string>([
'guerrillamail.com', 'guerrillamail.de', 'guerrillamail.net', 'guerrillamail.org',
'guerrillamailblock.com', 'grr.la', 'sharklasers.com', 'spam4.me',
'tempmail.com', 'temp-mail.org', 'temp-mail.io',
@@ -27,23 +27,25 @@ const DISPOSABLE_DOMAINS = new Set([
const OFFENSIVE_RE = /(nigger|faggot|fuckfaggot)/i;
const TYPO_TLDS = new Set(['con', 'coma', 'comhade', 'gmai', 'gmial']);
const TYPO_TLDS = new Set<string>(['con', 'coma', 'comhade', 'gmai', 'gmial']);
async function hasMxRecords(domain) {
export type EmailValidationResult = { valid: true } | { valid: false; reason: string };
async function hasMxRecords(domain: string): Promise<boolean> {
try {
const res = await fetch(
`https://cloudflare-dns.com/dns-query?name=${encodeURIComponent(domain)}&type=MX`,
{ headers: { Accept: 'application/dns-json' }, signal: AbortSignal.timeout(3000) }
{ headers: { Accept: 'application/dns-json' }, signal: AbortSignal.timeout(3000) },
);
if (!res.ok) return true;
const data = await res.json();
const data = (await res.json()) as { Answer?: unknown[] };
return Array.isArray(data.Answer) && data.Answer.length > 0;
} catch {
return true;
}
}
export async function validateEmail(email) {
export async function validateEmail(email: string): Promise<EmailValidationResult> {
const normalized = email.trim().toLowerCase();
const atIdx = normalized.indexOf('@');
if (atIdx < 1) return { valid: false, reason: 'Invalid email format' };
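
The `atIdx < 1` check rejects two cases at once: `indexOf` returns -1 when there is no `@`, and 0 when the local part is empty (`@example.com`). A sketch of just that shape check, pulled out of `validateEmail` above:

```javascript
// Isolated shape check from validateEmail: requires at least one character
// before the '@'. Everything after this (disposable-domain, MX lookup) is
// separate and not reproduced here.
function hasPlausibleShape(email) {
  const normalized = email.trim().toLowerCase();
  return normalized.indexOf('@') >= 1;
}

console.log(hasPlausibleShape('a@example.com')); // true
console.log(hasPlausibleShape('@example.com')); // false
console.log(hasPlausibleShape('nobody'));       // false
```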

View File

@@ -80,6 +80,18 @@ interface EndpointRatePolicy {
const ENDPOINT_RATE_POLICIES: Record<string, EndpointRatePolicy> = {
'/api/news/v1/summarize-article-cache': { limit: 3000, window: '60 s' },
'/api/intelligence/v1/classify-event': { limit: 600, window: '60 s' },
// Legacy /api/sanctions-entity-search rate limit was 30/min per IP. Preserve
// that budget now that LookupSanctionEntity proxies OpenSanctions live.
'/api/sanctions/v1/lookup-sanction-entity': { limit: 30, window: '60 s' },
// Lead capture: preserve the 3/hr and 5/hr budgets from legacy api/contact.js
// and api/register-interest.js. Tighter than the normal per-IP limit because
// each request hits Convex + Resend.
'/api/leads/v1/submit-contact': { limit: 3, window: '1 h' },
'/api/leads/v1/register-interest': { limit: 5, window: '1 h' },
// Scenario engine: legacy /api/scenario/v1/run capped at 10 jobs/min/IP via
// inline Upstash INCR. Gateway now enforces the same budget with per-IP
// keying in checkEndpointRateLimit.
'/api/scenario/v1/run-scenario': { limit: 10, window: '60 s' },
};
const endpointLimiters = new Map<string, Ratelimit>();
@@ -131,3 +143,59 @@ export async function checkEndpointRateLimit(
return null;
}
}
// --- In-handler scoped rate limits ---
//
// Handlers that need a per-subscope cap *in addition to* the gateway-level
// endpoint policy (e.g. a tighter budget for one request variant) use this
// helper. Gateway's checkEndpointRateLimit still runs first — this is a
// second stage.
const scopedLimiters = new Map<string, Ratelimit>();
function getScopedRatelimit(scope: string, limit: number, window: Duration): Ratelimit | null {
const cacheKey = `${scope}|${limit}|${window}`;
const cached = scopedLimiters.get(cacheKey);
if (cached) return cached;
const url = process.env.UPSTASH_REDIS_REST_URL;
const token = process.env.UPSTASH_REDIS_REST_TOKEN;
if (!url || !token) return null;
const rl = new Ratelimit({
redis: new Redis({ url, token }),
limiter: Ratelimit.slidingWindow(limit, window),
prefix: 'rl:scope',
analytics: false,
});
scopedLimiters.set(cacheKey, rl);
return rl;
}
export interface ScopedRateLimitResult {
allowed: boolean;
limit: number;
reset: number;
}
/**
* Returns whether the request is under the scoped budget. `scope` is an
* opaque namespace (e.g. `${pathname}#desktop`); `identifier` is usually the
 * client IP but can be any stable caller identifier. Fails open on Redis errors
 * to stay consistent with checkRateLimit / checkEndpointRateLimit semantics.
*/
export async function checkScopedRateLimit(
scope: string,
limit: number,
window: Duration,
identifier: string,
): Promise<ScopedRateLimitResult> {
const rl = getScopedRatelimit(scope, limit, window);
if (!rl) return { allowed: true, limit, reset: 0 };
try {
const result = await rl.limit(`${scope}:${identifier}`);
return { allowed: result.success, limit: result.limit, reset: result.reset };
} catch {
return { allowed: true, limit, reset: 0 };
}
}
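The fail-open contract can be illustrated with an in-memory stand-in for the Upstash limiter. `MemoryLimiter` below is a fixed-window counter for demonstration only, not the real sliding-window `Ratelimit`; the `checkScopedRateLimit` shown mirrors the helper's semantics under that assumption:

```typescript
interface ScopedRateLimitResult {
  allowed: boolean;
  limit: number;
  reset: number;
}

// Illustrative in-memory fixed-window limiter standing in for Upstash Ratelimit.
class MemoryLimiter {
  private counts = new Map<string, number>();
  constructor(private max: number) {}
  async limit(key: string): Promise<{ success: boolean; limit: number; reset: number }> {
    const n = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, n);
    return { success: n <= this.max, limit: this.max, reset: 0 };
  }
}

async function checkScopedRateLimit(
  limiter: MemoryLimiter | null,
  scope: string,
  limit: number,
  identifier: string,
): Promise<ScopedRateLimitResult> {
  // Fail open when Redis is not configured, matching the real helper.
  if (!limiter) return { allowed: true, limit, reset: 0 };
  try {
    const r = await limiter.limit(`${scope}:${identifier}`);
    return { allowed: r.success, limit: r.limit, reset: r.reset };
  } catch {
    // Fail open on limiter errors as well.
    return { allowed: true, limit, reset: 0 };
  }
}

(async () => {
  const rl = new MemoryLimiter(2);
  const scope = '/api/leads/v1/register-interest#desktop';
  await checkScopedRateLimit(rl, scope, 2, '1.2.3.4'); // allowed
  await checkScopedRateLimit(rl, scope, 2, '1.2.3.4'); // allowed
  const third = await checkScopedRateLimit(rl, scope, 2, '1.2.3.4');
  console.log(third.allowed); // false — over the 2/window budget
})();
```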


@@ -1,7 +1,6 @@
const TURNSTILE_VERIFY_URL = 'https://challenges.cloudflare.com/turnstile/v0/siteverify';
export function getClientIp(request) {
// Prefer platform-populated IP headers before falling back to x-forwarded-for.
export function getClientIp(request: Request): string {
return (
request.headers.get('x-real-ip') ||
request.headers.get('cf-connecting-ip') ||
@@ -10,12 +9,24 @@ export function getClientIp(request) {
);
}
export type TurnstileMissingSecretPolicy = 'allow' | 'allow-in-development' | 'deny';
export interface VerifyTurnstileArgs {
token: string;
ip: string;
logPrefix?: string;
missingSecretPolicy?: TurnstileMissingSecretPolicy;
}
export async function verifyTurnstile({
token,
ip,
logPrefix = '[turnstile]',
missingSecretPolicy = 'allow',
}) {
// Default: dev = allow (missing secret is expected locally), prod = deny.
// Callers that need the opposite (deliberately allow missing-secret in prod)
// can still pass 'allow' explicitly.
missingSecretPolicy = 'allow-in-development',
}: VerifyTurnstileArgs): Promise<boolean> {
const secret = process.env.TURNSTILE_SECRET_KEY;
if (!secret) {
if (missingSecretPolicy === 'allow') return true;
@@ -33,7 +44,7 @@ export async function verifyTurnstile({
headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
body: new URLSearchParams({ secret, response: token, remoteip: ip }),
});
const data = await res.json();
const data = (await res.json()) as { success?: boolean };
return data.success === true;
} catch {
return false;

server/alias-rewrite.ts Normal file

@@ -0,0 +1,31 @@
/**
* URL-rewrite alias helper for legacy v1 paths that were renamed during the
* sebuf migration (#3207). The sebuf generator produces RPC URLs derived from
* method names (e.g. `run-scenario`), which diverge from the documented v1
* URLs (`run`). These aliases keep the old documented URLs working byte-for-
* byte — external callers, docs, and partner scripts don't break.
*
* Each alias edge function rewrites the request pathname to the new sebuf
* path and hands off to the domain gateway. The gateway applies auth, rate
* limiting, and entitlement checks against the *new* path, so premium
* gating / cache tiers / entitlement maps stay keyed on a single canonical
* URL.
*
* Trivially deleted when v1 retires — just `rm` the alias files.
*/
export async function rewriteToSebuf(
req: Request,
newPath: string,
gateway: (req: Request) => Promise<Response>,
): Promise<Response> {
const url = new URL(req.url);
url.pathname = newPath;
const body =
req.method === 'GET' || req.method === 'HEAD' ? undefined : await req.arrayBuffer();
const rewritten = new Request(url.toString(), {
method: req.method,
headers: req.headers,
body,
});
return gateway(rewritten);
}
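An alias edge function is then a one-liner over this helper. The demo below is self-contained (it reproduces `rewriteToSebuf` so it can run standalone) and uses a stub gateway that echoes the pathname it received, showing that the gateway only ever sees the new sebuf path:

```typescript
// Reproduced from server/alias-rewrite.ts for a runnable standalone demo.
async function rewriteToSebuf(
  req: Request,
  newPath: string,
  gateway: (req: Request) => Promise<Response>,
): Promise<Response> {
  const url = new URL(req.url);
  url.pathname = newPath;
  const body =
    req.method === 'GET' || req.method === 'HEAD' ? undefined : await req.arrayBuffer();
  const rewritten = new Request(url.toString(), {
    method: req.method,
    headers: req.headers,
    body,
  });
  return gateway(rewritten);
}

// Stub gateway: responds with the pathname it was asked to serve.
const echoGateway = async (req: Request) => new Response(new URL(req.url).pathname);

(async () => {
  // Old documented v1 URL comes in; the gateway sees the sebuf path.
  const res = await rewriteToSebuf(
    new Request('https://worldmonitor.app/api/scenario/v1/run', { method: 'GET' }),
    '/api/scenario/v1/run-scenario',
    echoGateway,
  );
  console.log(await res.text()); // /api/scenario/v1/run-scenario
})();
```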


@@ -218,9 +218,17 @@ const RPC_CACHE_TIER: Record<string, CacheTier> = {
'/api/supply-chain/v1/get-country-chokepoint-index': 'slow-browser',
'/api/supply-chain/v1/get-bypass-options': 'slow-browser',
'/api/supply-chain/v1/get-country-cost-shock': 'slow-browser',
'/api/supply-chain/v1/get-country-products': 'slow-browser',
'/api/supply-chain/v1/get-multi-sector-cost-shock': 'slow-browser',
'/api/supply-chain/v1/get-sector-dependency': 'slow-browser',
'/api/supply-chain/v1/get-route-explorer-lane': 'slow-browser',
'/api/supply-chain/v1/get-route-impact': 'slow-browser',
// Scenario engine: list-scenario-templates is a compile-time constant catalog;
// daily tier gives browser max-age=3600 matching the legacy /api/scenario/v1/templates
// endpoint header. get-scenario-status is premium-gated — gateway short-circuits
// to 'slow-browser' but the entry is still required by tests/route-cache-tier.test.mjs.
'/api/scenario/v1/list-scenario-templates': 'daily',
'/api/scenario/v1/get-scenario-status': 'slow-browser',
'/api/health/v1/list-disease-outbreaks': 'slow',
'/api/health/v1/list-air-quality-alerts': 'fast',
'/api/intelligence/v1/get-social-velocity': 'fast',
@@ -241,6 +249,13 @@ const RPC_CACHE_TIER: Record<string, CacheTier> = {
'/api/intelligence/v1/get-regional-brief': 'slow',
'/api/resilience/v1/get-resilience-score': 'slow',
'/api/resilience/v1/get-resilience-ranking': 'slow',
// Partner-facing shipping/v2. route-intelligence is premium-gated; gateway
// short-circuits to slow-browser. Entry required by tests/route-cache-tier.test.mjs.
'/api/v2/shipping/route-intelligence': 'slow-browser',
// GET /webhooks lists caller's webhooks — premium-gated; short-circuited to
// slow-browser. Entry required by tests/route-cache-tier.test.mjs.
'/api/v2/shipping/webhooks': 'slow-browser',
};
import { PREMIUM_RPC_PATHS } from '../src/shared/premium-paths';


@@ -0,0 +1,9 @@
import type { LeadsServiceHandler } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { registerInterest } from './register-interest';
import { submitContact } from './submit-contact';
export const leadsHandler: LeadsServiceHandler = {
submitContact,
registerInterest,
};


@@ -1,24 +1,45 @@
export const config = { runtime: 'edge' };
/**
* RPC: registerInterest -- Adds an email to the Pro waitlist and emails a confirmation.
* Port from api/register-interest.js
* Sources: Convex registerInterest:register mutation + Resend confirmation email
*/
import { ConvexHttpClient } from 'convex/browser';
import { getCorsHeaders, isDisallowedOrigin } from './_cors.js';
import { getClientIp, verifyTurnstile } from './_turnstile.js';
import { jsonResponse } from './_json-response.js';
import { createIpRateLimiter } from './_ip-rate-limit.js';
import { validateEmail } from './_email-validation.js';
import type {
ServerContext,
RegisterInterestRequest,
RegisterInterestResponse,
} from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { getClientIp, verifyTurnstile } from '../../../_shared/turnstile';
import { validateEmail } from '../../../_shared/email-validation';
import { checkScopedRateLimit } from '../../../_shared/rate-limit';
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const MAX_EMAIL_LENGTH = 320;
const MAX_META_LENGTH = 100;
const RATE_LIMIT = 5;
const RATE_WINDOW_MS = 60 * 60 * 1000;
const DESKTOP_SOURCES = new Set<string>(['desktop-settings']);
const rateLimiter = createIpRateLimiter({ limit: RATE_LIMIT, windowMs: RATE_WINDOW_MS });
// Legacy api/register-interest.js capped desktop-source signups at 2/hr per IP
// on top of the generic 5/hr endpoint budget. Since `source` is an unsigned
// client-supplied field, this cap is the backstop — the signed-header fix that
// actually authenticates the desktop bypass is tracked as a follow-up.
const DESKTOP_RATE_SCOPE = '/api/leads/v1/register-interest#desktop';
const DESKTOP_RATE_LIMIT = 2;
const DESKTOP_RATE_WINDOW = '1 h' as const;
async function sendConfirmationEmail(email, referralCode) {
interface ConvexRegisterResult {
status: 'registered' | 'already_registered';
referralCode: string;
referralCount: number;
position?: number;
emailSuppressed?: boolean;
}
async function sendConfirmationEmail(email: string, referralCode: string): Promise<void> {
const referralLink = `https://worldmonitor.app/pro?ref=${referralCode}`;
const shareText = encodeURIComponent('I just joined the World Monitor Pro waitlist \u2014 real-time global intelligence powered by AI. Join me:');
const shareText = encodeURIComponent("I just joined the World Monitor Pro waitlist \u2014 real-time global intelligence powered by AI. Join me:");
const shareUrl = encodeURIComponent(referralLink);
const twitterShare = `https://x.com/intent/tweet?text=${shareText}&url=${shareUrl}`;
const linkedinShare = `https://www.linkedin.com/sharing/share-offsite/?url=${shareUrl}`;
@@ -40,7 +61,7 @@ async function sendConfirmationEmail(email, referralCode) {
body: JSON.stringify({
from: 'World Monitor <noreply@worldmonitor.app>',
to: [email],
subject: 'You\u2019re on the World Monitor Pro waitlist',
subject: "You\u2019re on the World Monitor Pro waitlist",
html: `
<div style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; max-width: 600px; margin: 0 auto; background: #0a0a0a; color: #e0e0e0;">
<div style="background: #4ade80; height: 4px;"></div>
@@ -100,7 +121,7 @@ async function sendConfirmationEmail(email, referralCode) {
<div style="font-size: 10px; color: #666; text-transform: uppercase; letter-spacing: 1px;">Users</div>
</td>
<td style="text-align: center; padding: 16px 8px; width: 33%; border-left: 1px solid #1a1a1a; border-right: 1px solid #1a1a1a;">
<div style="font-size: 22px; font-weight: 800; color: #4ade80;">435+</div>
<div style="font-size: 22px; font-weight: 800; color: #4ade80;">500+</div>
<div style="font-size: 10px; color: #666; text-transform: uppercase; letter-spacing: 1px;">Sources</div>
</td>
<td style="text-align: center; padding: 16px 8px; width: 33%;">
@@ -168,105 +189,84 @@ async function sendConfirmationEmail(email, referralCode) {
}
}
export default async function handler(req) {
if (isDisallowedOrigin(req)) {
return jsonResponse({ error: 'Origin not allowed' }, 403);
export async function registerInterest(
ctx: ServerContext,
req: RegisterInterestRequest,
): Promise<RegisterInterestResponse> {
// Honeypot — silently accept but do nothing.
if (req.website) {
return { status: 'registered', referralCode: '', referralCount: 0, position: 0, emailSuppressed: false };
}
const cors = getCorsHeaders(req, 'POST, OPTIONS');
const ip = getClientIp(ctx.request);
const isDesktopSource = typeof req.source === 'string' && DESKTOP_SOURCES.has(req.source);
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'POST') {
return jsonResponse({ error: 'Method not allowed' }, 405, cors);
}
const ip = getClientIp(req);
if (rateLimiter.isRateLimited(ip)) {
return jsonResponse({ error: 'Too many requests' }, 429, cors);
}
let body;
try {
body = await req.json();
} catch {
return jsonResponse({ error: 'Invalid JSON' }, 400, cors);
}
// Honeypot — bots auto-fill this hidden field; real users leave it empty
if (body.website) {
return jsonResponse({ status: 'registered' }, 200, cors);
}
// Cloudflare Turnstile verification — skip for desktop app (no browser captcha available).
// Desktop bypasses captcha, so enforce stricter rate limit (2/hr vs 5/hr).
const DESKTOP_SOURCES = new Set(['desktop-settings']);
const isDesktopSource = typeof body.source === 'string' && DESKTOP_SOURCES.has(body.source);
// Desktop sources bypass Turnstile (no browser captcha). `source` is
// attacker-controlled, so anyone claiming desktop-settings skips the
// captcha — apply a tighter 2/hr per-IP cap on that path to cap abuse
// (matches the legacy handler's in-memory secondary cap). Proper fix is
// a signed desktop-secret header; tracked as a follow-up.
if (isDesktopSource) {
const entry = rateLimiter.getEntry(ip);
if (entry && entry.count > 2) {
return jsonResponse({ error: 'Rate limit exceeded' }, 429, cors);
const scoped = await checkScopedRateLimit(
DESKTOP_RATE_SCOPE,
DESKTOP_RATE_LIMIT,
DESKTOP_RATE_WINDOW,
ip,
);
if (!scoped.allowed) {
throw new ApiError(429, 'Too many requests', '');
}
} else {
const turnstileOk = await verifyTurnstile({
token: body.turnstileToken || '',
token: req.turnstileToken || '',
ip,
logPrefix: '[register-interest]',
});
if (!turnstileOk) {
return jsonResponse({ error: 'Bot verification failed' }, 403, cors);
throw new ApiError(403, 'Bot verification failed', '');
}
}
const { email, source, appVersion, referredBy } = body;
if (!email || typeof email !== 'string' || email.length > MAX_EMAIL_LENGTH || !EMAIL_RE.test(email)) {
return jsonResponse({ error: 'Invalid email address' }, 400, cors);
const { email, source, appVersion, referredBy } = req;
if (!email || email.length > MAX_EMAIL_LENGTH || !EMAIL_RE.test(email)) {
throw new ValidationError([{ field: 'email', description: 'Invalid email address' }]);
}
const emailCheck = await validateEmail(email);
if (!emailCheck.valid) {
return jsonResponse({ error: emailCheck.reason }, 400, cors);
throw new ValidationError([{ field: 'email', description: emailCheck.reason }]);
}
const safeSource = typeof source === 'string'
? source.slice(0, MAX_META_LENGTH)
: 'unknown';
const safeAppVersion = typeof appVersion === 'string'
? appVersion.slice(0, MAX_META_LENGTH)
: 'unknown';
const safeReferredBy = typeof referredBy === 'string'
? referredBy.slice(0, 20)
: undefined;
const safeSource = source ? source.slice(0, MAX_META_LENGTH) : 'unknown';
const safeAppVersion = appVersion ? appVersion.slice(0, MAX_META_LENGTH) : 'unknown';
const safeReferredBy = referredBy ? referredBy.slice(0, 20) : undefined;
const convexUrl = process.env.CONVEX_URL;
if (!convexUrl) {
return jsonResponse({ error: 'Registration service unavailable' }, 503, cors);
throw new ApiError(503, 'Registration service unavailable', '');
}
try {
const client = new ConvexHttpClient(convexUrl);
const result = await client.mutation('registerInterest:register', {
email,
source: safeSource,
appVersion: safeAppVersion,
referredBy: safeReferredBy,
});
const client = new ConvexHttpClient(convexUrl);
const result = (await client.mutation('registerInterest:register' as any, {
email,
source: safeSource,
appVersion: safeAppVersion,
referredBy: safeReferredBy,
})) as ConvexRegisterResult;
// Send confirmation email for new registrations (awaited to avoid Edge isolate termination)
// Skip if email is on the suppression list (previously bounced)
if (result.status === 'registered' && result.referralCode) {
if (!result.emailSuppressed) {
await sendConfirmationEmail(email, result.referralCode);
} else {
console.log(`[register-interest] Skipped email to suppressed address: ${email}`);
}
if (result.status === 'registered' && result.referralCode) {
if (!result.emailSuppressed) {
await sendConfirmationEmail(email, result.referralCode);
} else {
console.log(`[register-interest] Skipped email to suppressed address: ${email}`);
}
return jsonResponse(result, 200, cors);
} catch (err) {
console.error('[register-interest] Convex error:', err);
return jsonResponse({ error: 'Registration failed' }, 500, cors);
}
return {
status: result.status,
referralCode: result.referralCode,
referralCount: result.referralCount,
position: result.position ?? 0,
emailSuppressed: result.emailSuppressed ?? false,
};
}


@@ -1,17 +1,24 @@
export const config = { runtime: 'edge' };
/**
* RPC: submitContact -- Stores an enterprise contact submission and emails ops.
* Port from api/contact.js
* Sources: Convex contactMessages:submit mutation + Resend notification email
*/
import { ConvexHttpClient } from 'convex/browser';
import { getCorsHeaders, isDisallowedOrigin } from './_cors.js';
import { getClientIp, verifyTurnstile } from './_turnstile.js';
import { jsonResponse } from './_json-response.js';
import { createIpRateLimiter } from './_ip-rate-limit.js';
import type {
ServerContext,
SubmitContactRequest,
SubmitContactResponse,
} from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/leads/v1/service_server';
import { getClientIp, verifyTurnstile } from '../../../_shared/turnstile';
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const PHONE_RE = /^[+(]?\d[\d\s()./-]{4,23}\d$/;
const MAX_FIELD = 500;
const MAX_MESSAGE = 2000;
const FREE_EMAIL_DOMAINS = new Set([
const FREE_EMAIL_DOMAINS = new Set<string>([
'gmail.com', 'googlemail.com', 'yahoo.com', 'yahoo.fr', 'yahoo.co.uk', 'yahoo.co.jp',
'hotmail.com', 'hotmail.fr', 'hotmail.co.uk', 'outlook.com', 'outlook.fr',
'live.com', 'live.fr', 'msn.com', 'aol.com', 'icloud.com', 'me.com', 'mac.com',
@@ -24,12 +31,27 @@ const FREE_EMAIL_DOMAINS = new Set([
't-online.de', 'libero.it', 'virgilio.it',
]);
const RATE_LIMIT = 3;
const RATE_WINDOW_MS = 60 * 60 * 1000;
function escapeHtml(str: string): string {
return str
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;');
}
const rateLimiter = createIpRateLimiter({ limit: RATE_LIMIT, windowMs: RATE_WINDOW_MS });
function sanitizeForSubject(str: string, maxLen = 50): string {
return str.replace(/[\r\n\0]/g, '').slice(0, maxLen);
}
async function sendNotificationEmail(name, email, organization, phone, message, ip, country) {
async function sendNotificationEmail(
name: string,
email: string,
organization: string,
phone: string,
message: string | undefined,
ip: string,
country: string | null,
): Promise<boolean> {
const resendKey = process.env.RESEND_API_KEY;
if (!resendKey) {
console.error('[contact] RESEND_API_KEY not set — lead stored in Convex but notification NOT sent');
@@ -77,109 +99,79 @@ async function sendNotificationEmail(name, email, organization, phone, message,
}
}
function escapeHtml(str) {
return str
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;');
}
function sanitizeForSubject(str, maxLen = 50) {
return str.replace(/[\r\n\0]/g, '').slice(0, maxLen);
}
export default async function handler(req) {
if (isDisallowedOrigin(req)) {
return jsonResponse({ error: 'Origin not allowed' }, 403);
export async function submitContact(
ctx: ServerContext,
req: SubmitContactRequest,
): Promise<SubmitContactResponse> {
// Honeypot — silently accept but do nothing (bots auto-fill hidden field).
if (req.website) {
return { status: 'sent', emailSent: false };
}
const cors = getCorsHeaders(req, 'POST, OPTIONS');
if (req.method === 'OPTIONS') {
return new Response(null, { status: 204, headers: cors });
}
if (req.method !== 'POST') {
return jsonResponse({ error: 'Method not allowed' }, 405, cors);
}
const ip = getClientIp(req);
const country = req.headers.get('cf-ipcountry') || req.headers.get('x-vercel-ip-country') || null;
if (rateLimiter.isRateLimited(ip)) {
return jsonResponse({ error: 'Too many requests' }, 429, cors);
}
let body;
try {
body = await req.json();
} catch {
return jsonResponse({ error: 'Invalid JSON' }, 400, cors);
}
if (body.website) {
return jsonResponse({ status: 'sent' }, 200, cors);
}
const ip = getClientIp(ctx.request);
const country = ctx.request.headers.get('cf-ipcountry')
|| ctx.request.headers.get('x-vercel-ip-country');
const turnstileOk = await verifyTurnstile({
token: body.turnstileToken || '',
token: req.turnstileToken || '',
ip,
logPrefix: '[contact]',
missingSecretPolicy: 'allow-in-development',
});
if (!turnstileOk) {
return jsonResponse({ error: 'Bot verification failed' }, 403, cors);
throw new ApiError(403, 'Bot verification failed', '');
}
const { email, name, organization, phone, message, source } = body;
const { email, name, organization, phone, message, source } = req;
if (!email || typeof email !== 'string' || !EMAIL_RE.test(email)) {
return jsonResponse({ error: 'Invalid email' }, 400, cors);
if (!email || !EMAIL_RE.test(email)) {
throw new ValidationError([{ field: 'email', description: 'Invalid email' }]);
}
const emailDomain = email.split('@')[1]?.toLowerCase();
if (emailDomain && FREE_EMAIL_DOMAINS.has(emailDomain)) {
return jsonResponse({ error: 'Please use your work email address' }, 422, cors);
throw new ApiError(422, 'Please use your work email address', '');
}
if (!name || typeof name !== 'string' || name.trim().length === 0) {
return jsonResponse({ error: 'Name is required' }, 400, cors);
if (!name || name.trim().length === 0) {
throw new ValidationError([{ field: 'name', description: 'Name is required' }]);
}
if (!organization || typeof organization !== 'string' || organization.trim().length === 0) {
return jsonResponse({ error: 'Company is required' }, 400, cors);
if (!organization || organization.trim().length === 0) {
throw new ValidationError([{ field: 'organization', description: 'Company is required' }]);
}
if (!phone || typeof phone !== 'string' || !PHONE_RE.test(phone.trim())) {
return jsonResponse({ error: 'Valid phone number is required' }, 400, cors);
if (!phone || !PHONE_RE.test(phone.trim())) {
throw new ValidationError([{ field: 'phone', description: 'Valid phone number is required' }]);
}
const safeName = name.slice(0, MAX_FIELD);
const safeOrg = organization.slice(0, MAX_FIELD);
const safePhone = phone.trim().slice(0, 30);
const safeMsg = typeof message === 'string' ? message.slice(0, MAX_MESSAGE) : undefined;
const safeSource = typeof source === 'string' ? source.slice(0, 100) : 'enterprise-contact';
const safeMsg = message ? message.slice(0, MAX_MESSAGE) : undefined;
const safeSource = source ? source.slice(0, 100) : 'enterprise-contact';
const convexUrl = process.env.CONVEX_URL;
if (!convexUrl) {
return jsonResponse({ error: 'Service unavailable' }, 503, cors);
throw new ApiError(503, 'Service unavailable', '');
}
try {
const client = new ConvexHttpClient(convexUrl);
await client.mutation('contactMessages:submit', {
name: safeName,
email: email.trim(),
organization: safeOrg,
phone: safePhone,
message: safeMsg,
source: safeSource,
});
const client = new ConvexHttpClient(convexUrl);
await client.mutation('contactMessages:submit' as any, {
name: safeName,
email: email.trim(),
organization: safeOrg,
phone: safePhone,
message: safeMsg,
source: safeSource,
});
const emailSent = await sendNotificationEmail(safeName, email.trim(), safeOrg, safePhone, safeMsg, ip, country);
const emailSent = await sendNotificationEmail(
safeName,
email.trim(),
safeOrg,
safePhone,
safeMsg,
ip,
country,
);
return jsonResponse({ status: 'sent', emailSent }, 200, cors);
} catch (err) {
console.error('[contact] error:', err);
return jsonResponse({ error: 'Failed to send message' }, 500, cors);
}
return { status: 'sent', emailSent };
}


@@ -7,6 +7,7 @@ import type {
AisDisruption,
AisDisruptionType,
AisDisruptionSeverity,
SnapshotCandidateReport,
} from '../../../../src/generated/server/worldmonitor/maritime/v1/service_server';
import { getRelayBaseUrl, getRelayHeaders } from '../../../_shared/relay';
@@ -26,44 +27,73 @@ const SEVERITY_MAP: Record<string, AisDisruptionSeverity> = {
high: 'AIS_DISRUPTION_SEVERITY_HIGH',
};
// In-memory cache (matches old /api/ais-snapshot behavior)
// Cache the two variants separately — candidate reports materially change
// payload size, and clients with no position callbacks should not have to
// wait on or pay for the heavier payload.
const SNAPSHOT_CACHE_TTL_MS = 300_000; // 5 min -- matches client poll interval
let cachedSnapshot: VesselSnapshot | undefined;
let cacheTimestamp = 0;
let inFlightRequest: Promise<VesselSnapshot | undefined> | null = null;
async function fetchVesselSnapshot(): Promise<VesselSnapshot | undefined> {
// Return cached if fresh
interface SnapshotCacheSlot {
snapshot: VesselSnapshot | undefined;
timestamp: number;
inFlight: Promise<VesselSnapshot | undefined> | null;
}
const cache: Record<'with' | 'without', SnapshotCacheSlot> = {
with: { snapshot: undefined, timestamp: 0, inFlight: null },
without: { snapshot: undefined, timestamp: 0, inFlight: null },
};
async function fetchVesselSnapshot(includeCandidates: boolean): Promise<VesselSnapshot | undefined> {
const slot = cache[includeCandidates ? 'with' : 'without'];
const now = Date.now();
if (cachedSnapshot && (now - cacheTimestamp) < SNAPSHOT_CACHE_TTL_MS) {
return cachedSnapshot;
if (slot.snapshot && (now - slot.timestamp) < SNAPSHOT_CACHE_TTL_MS) {
return slot.snapshot;
}
// In-flight dedup: if a request is already running, await it
if (inFlightRequest) {
return inFlightRequest;
if (slot.inFlight) {
return slot.inFlight;
}
inFlightRequest = fetchVesselSnapshotFromRelay();
slot.inFlight = fetchVesselSnapshotFromRelay(includeCandidates);
try {
const result = await inFlightRequest;
const result = await slot.inFlight;
if (result) {
cachedSnapshot = result;
cacheTimestamp = Date.now();
slot.snapshot = result;
slot.timestamp = Date.now();
}
return result ?? cachedSnapshot; // serve stale on relay failure
return result ?? slot.snapshot; // serve stale on relay failure
} finally {
inFlightRequest = null;
slot.inFlight = null;
}
}
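The per-slot pattern here (fresh-cache check, in-flight dedup, serve-stale-on-failure) is a generic single-flight fetch-through cache. A minimal sketch with illustrative names, showing that two concurrent callers share one upstream request:

```typescript
interface Slot<T> {
  value: T | undefined;
  timestamp: number;
  inFlight: Promise<T | undefined> | null;
}

function makeSlot<T>(): Slot<T> {
  return { value: undefined, timestamp: 0, inFlight: null };
}

// Fresh hit -> cached value; concurrent callers await the same upstream
// promise; upstream failure serves the last good (stale) value.
async function fetchThrough<T>(
  slot: Slot<T>,
  ttlMs: number,
  fetcher: () => Promise<T | undefined>,
): Promise<T | undefined> {
  const now = Date.now();
  if (slot.value !== undefined && now - slot.timestamp < ttlMs) return slot.value;
  if (slot.inFlight) return slot.inFlight;
  slot.inFlight = fetcher();
  try {
    const result = await slot.inFlight;
    if (result !== undefined) {
      slot.value = result;
      slot.timestamp = Date.now();
    }
    return result ?? slot.value; // serve stale on failure
  } finally {
    slot.inFlight = null;
  }
}

(async () => {
  let calls = 0;
  const slot = makeSlot<number>();
  const fetcher = async () => {
    calls++;
    return 42;
  };
  // Two concurrent callers, one upstream call.
  const [a, b] = await Promise.all([
    fetchThrough(slot, 1000, fetcher),
    fetchThrough(slot, 1000, fetcher),
  ]);
  console.log(a, b, calls); // 42 42 1
})();
```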
async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefined> {
function toCandidateReport(raw: any): SnapshotCandidateReport | null {
if (!raw || typeof raw !== 'object') return null;
const mmsi = String(raw.mmsi ?? '');
if (!mmsi) return null;
const lat = Number(raw.lat);
const lon = Number(raw.lon);
if (!Number.isFinite(lat) || !Number.isFinite(lon)) return null;
return {
mmsi,
name: String(raw.name ?? ''),
lat,
lon,
shipType: Number.isFinite(Number(raw.shipType)) ? Number(raw.shipType) : 0,
heading: Number.isFinite(Number(raw.heading)) ? Number(raw.heading) : 0,
speed: Number.isFinite(Number(raw.speed)) ? Number(raw.speed) : 0,
course: Number.isFinite(Number(raw.course)) ? Number(raw.course) : 0,
timestamp: Number.isFinite(Number(raw.timestamp)) ? Number(raw.timestamp) : Date.now(),
};
}
async function fetchVesselSnapshotFromRelay(includeCandidates: boolean): Promise<VesselSnapshot | undefined> {
try {
const relayBaseUrl = getRelayBaseUrl();
if (!relayBaseUrl) return undefined;
const response = await fetch(
`${relayBaseUrl}/ais/snapshot?candidates=false`,
`${relayBaseUrl}/ais/snapshot?candidates=${includeCandidates ? 'true' : 'false'}`,
{
headers: getRelayHeaders(),
signal: AbortSignal.timeout(10000),
@@ -107,10 +137,22 @@ async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefine
description: String(d.description || ''),
}));
const rawStatus = (data.status && typeof data.status === 'object') ? data.status : {};
const candidateReports = (includeCandidates && Array.isArray(data.candidateReports))
? data.candidateReports.map(toCandidateReport).filter((r: SnapshotCandidateReport | null): r is SnapshotCandidateReport => r !== null)
: [];
return {
snapshotAt: Date.now(),
densityZones,
disruptions,
sequence: Number.isFinite(Number(data.sequence)) ? Number(data.sequence) : 0,
status: {
connected: Boolean(rawStatus.connected),
vessels: Number.isFinite(Number(rawStatus.vessels)) ? Number(rawStatus.vessels) : 0,
messages: Number.isFinite(Number(rawStatus.messages)) ? Number(rawStatus.messages) : 0,
},
candidateReports,
};
} catch {
return undefined;
@@ -123,10 +165,10 @@ async function fetchVesselSnapshotFromRelay(): Promise<VesselSnapshot | undefine
export async function getVesselSnapshot(
_ctx: ServerContext,
_req: GetVesselSnapshotRequest,
req: GetVesselSnapshotRequest,
): Promise<GetVesselSnapshotResponse> {
try {
const snapshot = await fetchVesselSnapshot();
const snapshot = await fetchVesselSnapshot(Boolean(req.includeCandidates));
return { snapshot };
} catch {
return { snapshot: undefined };


@@ -3,15 +3,18 @@ import type {
ListMilitaryFlightsRequest,
ListMilitaryFlightsResponse,
MilitaryAircraftType,
MilitaryOperator,
MilitaryConfidence,
} from '../../../../src/generated/server/worldmonitor/military/v1/service_server';
import { isMilitaryCallsign, isMilitaryHex, detectAircraftType, UPSTREAM_TIMEOUT_MS } from './_shared';
import { cachedFetchJson } from '../../../_shared/redis';
import { cachedFetchJson, getRawJson } from '../../../_shared/redis';
import { markNoCacheResponse } from '../../../_shared/response-headers';
import { getRelayBaseUrl, getRelayHeaders } from '../../../_shared/relay';
const REDIS_CACHE_KEY = 'military:flights:v1';
const REDIS_CACHE_TTL = 600; // 10 min — reduce upstream API pressure
const REDIS_STALE_KEY = 'military:flights:stale:v1';
/** Snap a coordinate to a grid step so nearby bbox values share cache entries. */
const quantize = (v: number, step: number) => Math.round(v / step) * step;
@@ -53,8 +56,110 @@ const AIRCRAFT_TYPE_MAP: Record<string, string> = {
reconnaissance: 'MILITARY_AIRCRAFT_TYPE_RECONNAISSANCE',
drone: 'MILITARY_AIRCRAFT_TYPE_DRONE',
bomber: 'MILITARY_AIRCRAFT_TYPE_BOMBER',
fighter: 'MILITARY_AIRCRAFT_TYPE_FIGHTER',
helicopter: 'MILITARY_AIRCRAFT_TYPE_HELICOPTER',
vip: 'MILITARY_AIRCRAFT_TYPE_VIP',
special_ops: 'MILITARY_AIRCRAFT_TYPE_SPECIAL_OPS',
};
const OPERATOR_MAP: Record<string, string> = {
usaf: 'MILITARY_OPERATOR_USAF',
raf: 'MILITARY_OPERATOR_RAF',
faf: 'MILITARY_OPERATOR_FAF',
gaf: 'MILITARY_OPERATOR_GAF',
iaf: 'MILITARY_OPERATOR_IAF',
nato: 'MILITARY_OPERATOR_NATO',
other: 'MILITARY_OPERATOR_OTHER',
};
const CONFIDENCE_MAP: Record<string, string> = {
high: 'MILITARY_CONFIDENCE_HIGH',
medium: 'MILITARY_CONFIDENCE_MEDIUM',
low: 'MILITARY_CONFIDENCE_LOW',
};
interface StaleFlight {
id?: string;
callsign?: string;
hexCode?: string;
registration?: string;
aircraftType?: string;
aircraftModel?: string;
operator?: string;
operatorCountry?: string;
lat?: number | null;
lon?: number | null;
altitude?: number;
heading?: number;
speed?: number;
verticalRate?: number;
onGround?: boolean;
squawk?: string;
origin?: string;
destination?: string;
lastSeenMs?: number;
firstSeenMs?: number;
confidence?: string;
isInteresting?: boolean;
note?: string;
}
interface StalePayload {
flights?: StaleFlight[];
fetchedAt?: number;
}
/**
* Convert the seed cron's app-shape flight (flat lat/lon, lowercase enums,
* lastSeenMs) into the proto shape (nested GeoCoordinates, enum strings,
* lastSeenAt). Mirrors the inverse of src/services/military-flights.ts:mapProtoFlight.
* hexCode is canonicalized to uppercase per the invariant documented on
* MilitaryFlight.hex_code in military_flight.proto.
*/
function staleToProto(f: StaleFlight): ListMilitaryFlightsResponse['flights'][number] | null {
if (f.lat == null || f.lon == null) return null;
const icao = (f.hexCode || f.id || '').toUpperCase();
if (!icao) return null;
return {
id: icao,
callsign: (f.callsign || '').trim(),
hexCode: icao,
registration: f.registration || '',
aircraftType: (AIRCRAFT_TYPE_MAP[f.aircraftType || ''] || 'MILITARY_AIRCRAFT_TYPE_UNKNOWN') as MilitaryAircraftType,
aircraftModel: f.aircraftModel || '',
operator: (OPERATOR_MAP[f.operator || ''] || 'MILITARY_OPERATOR_OTHER') as MilitaryOperator,
operatorCountry: f.operatorCountry || '',
location: { latitude: f.lat, longitude: f.lon },
altitude: f.altitude ?? 0,
heading: f.heading ?? 0,
speed: f.speed ?? 0,
verticalRate: f.verticalRate ?? 0,
onGround: f.onGround ?? false,
squawk: f.squawk || '',
origin: f.origin || '',
destination: f.destination || '',
lastSeenAt: f.lastSeenMs ?? Date.now(),
firstSeenAt: f.firstSeenMs ?? 0,
confidence: (CONFIDENCE_MAP[f.confidence || ''] || 'MILITARY_CONFIDENCE_LOW') as MilitaryConfidence,
isInteresting: f.isInteresting ?? false,
note: f.note || '',
enrichment: undefined,
};
}
async function fetchStaleFallback(): Promise<ListMilitaryFlightsResponse['flights'] | null> {
try {
const raw = (await getRawJson(REDIS_STALE_KEY)) as StalePayload | null;
if (!raw || !Array.isArray(raw.flights) || raw.flights.length === 0) return null;
const flights = raw.flights
.map(staleToProto)
.filter((f): f is NonNullable<typeof f> => f != null);
return flights.length > 0 ? flights : null;
} catch {
return null;
}
}
export async function listMilitaryFlights(
ctx: ServerContext,
req: ListMilitaryFlightsRequest,
@@ -115,11 +220,17 @@ export async function listMilitaryFlights(
if (!isMilitaryCallsign(callsign) && !isMilitaryHex(icao24)) continue;
const aircraftType = detectAircraftType(callsign);
// Canonicalize hex_code to uppercase — the seed cron
// (scripts/seed-military-flights.mjs) writes uppercase, and
// src/services/military-flights.ts getFlightByHex uppercases the
// lookup input. Preserving OpenSky's lowercase here would break
// every hex lookup silently.
const hex = icao24.toUpperCase();
flights.push({
id: hex,
callsign: (callsign || '').trim(),
hexCode: hex,
registration: '',
aircraftType: (AIRCRAFT_TYPE_MAP[aircraftType] || 'MILITARY_AIRCRAFT_TYPE_UNKNOWN') as MilitaryAircraftType,
aircraftModel: '',
@@ -148,6 +259,15 @@ export async function listMilitaryFlights(
);
if (!fullResult) {
// Live fetch failed. The legacy /api/military-flights handler cascaded
// military:flights:v1 → military:flights:stale:v1 before returning empty.
// The seed cron (scripts/seed-military-flights.mjs) writes both keys
// every run; stale has a 24h TTL versus 10min live, so it's the right
// fallback when OpenSky / the relay hiccups.
const staleFlights = await fetchStaleFallback();
if (staleFlights && staleFlights.length > 0) {
return { flights: filterFlightsToBounds(staleFlights, requestBounds), clusters: [], pagination: undefined };
}
markNoCacheResponse(ctx.request);
return { flights: [], clusters: [], pagination: undefined };
}

View File

@@ -11,7 +11,10 @@ import { getCachedJson } from '../../../_shared/redis';
const ENTITY_INDEX_KEY = 'sanctions:entities:v1';
const DEFAULT_MAX = 10;
const MAX_RESULTS_LIMIT = 50;
const MAX_QUERY_LENGTH = 200;
const MIN_QUERY_LENGTH = 2;
const OPENSANCTIONS_BASE = 'https://api.opensanctions.org';
const OPENSANCTIONS_TIMEOUT_MS = 8_000;
interface EntityIndexRecord {
id: string;
@@ -21,6 +24,24 @@ interface EntityIndexRecord {
pr: string[];
}
interface OpenSanctionsHit {
id?: string;
schema?: string;
caption?: string;
properties?: {
name?: string[];
country?: string[];
nationality?: string[];
program?: string[];
sanctions?: string[];
};
}
interface OpenSanctionsSearchResponse {
results?: OpenSanctionsHit[];
total?: { value?: number };
}
function normalize(s: string): string {
return s.toLowerCase().replace(/[^a-z0-9]/g, ' ').replace(/\s+/g, ' ').trim();
}
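Because all matching below runs on this normalized form, punctuation and casing never affect scores. A standalone restatement of the transform:

```typescript
// Same transform as normalize() above, restated so it runs in isolation.
function normalize(s: string): string {
  return s.toLowerCase().replace(/[^a-z0-9]/g, ' ').replace(/\s+/g, ' ').trim();
}

// Punctuation and case collapse away; the matcher compares plain tokens.
console.log(normalize('JSC "Rosoboronexport"')); // jsc rosoboronexport
console.log(normalize('AERO-SPACE  Ltd.'));      // aero space ltd
```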
@@ -30,59 +51,122 @@ function clampMax(value: number): number {
return Math.min(Math.max(Math.trunc(value), 1), MAX_RESULTS_LIMIT);
}
function entityTypeFromSchema(schema: string): string {
if (schema === 'Vessel') return 'vessel';
if (schema === 'Aircraft') return 'aircraft';
if (schema === 'Person') return 'individual';
return 'entity';
}
function normalizeOpenSanctionsHit(hit: OpenSanctionsHit): SanctionEntityMatch | null {
const props = hit.properties ?? {};
const name = (props.name ?? [hit.caption ?? '']).filter(Boolean)[0] ?? '';
if (!name || !hit.id) return null;
const countries = (props.country ?? props.nationality ?? []).slice(0, 3);
const programs = (props.program ?? props.sanctions ?? []).slice(0, 3);
return {
id: `opensanctions:${hit.id}`,
name,
entityType: entityTypeFromSchema(hit.schema ?? ''),
countryCodes: countries,
programs,
};
}
async function searchOpenSanctions(q: string, limit: number): Promise<{ results: SanctionEntityMatch[]; total: number } | null> {
const url = new URL(`${OPENSANCTIONS_BASE}/search/default`);
url.searchParams.set('q', q);
url.searchParams.set('limit', String(limit));
const resp = await fetch(url.toString(), {
headers: {
'User-Agent': 'WorldMonitor/1.0 sanctions-search',
Accept: 'application/json',
},
signal: AbortSignal.timeout(OPENSANCTIONS_TIMEOUT_MS),
});
if (!resp.ok) return null;
const data = (await resp.json()) as OpenSanctionsSearchResponse;
const hits = Array.isArray(data.results) ? data.results : [];
const results = hits
.map(normalizeOpenSanctionsHit)
.filter((r): r is SanctionEntityMatch => r !== null);
const total = data.total?.value ?? results.length;
return { results, total };
}
function searchOfacLocal(q: string, maxResults: number, raw: unknown): { results: SanctionEntityMatch[]; total: number } {
if (!Array.isArray(raw)) return { results: [], total: 0 };
const index = raw as EntityIndexRecord[];
const needle = normalize(q);
const tokens = needle.split(' ').filter(Boolean);
const scored: Array<{ score: number; entry: EntityIndexRecord }> = [];
for (const entry of index) {
const haystack = normalize(entry.name);
if (haystack === needle) {
scored.push({ score: 100, entry });
continue;
}
if (haystack.startsWith(needle)) {
scored.push({ score: 80, entry });
continue;
}
if (tokens.length > 0 && tokens.every((t) => haystack.includes(t))) {
const pos = haystack.indexOf(tokens[0] ?? '');
scored.push({ score: 60 - Math.min(pos, 20), entry });
continue;
}
const matchCount = tokens.filter((t) => haystack.includes(t)).length;
if (matchCount > 0) {
scored.push({ score: matchCount * 10, entry });
}
}
scored.sort((a, b) => b.score - a.score);
const results: SanctionEntityMatch[] = scored.slice(0, maxResults).map(({ entry }) => ({
id: entry.id,
name: entry.name,
entityType: entry.et,
countryCodes: entry.cc,
programs: entry.pr,
}));
return { results, total: scored.length };
}
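The tiered scores above (exact 100, prefix 80, all-tokens 60 minus capped first-token position, otherwise 10 per matching token) can be restated as a standalone function; both inputs are assumed pre-normalized, and the entity names are made up for illustration:

```typescript
// Standalone restatement of the tiered scoring in searchOfacLocal.
function scoreEntry(needle: string, haystack: string): number {
  const tokens = needle.split(' ').filter(Boolean);
  if (haystack === needle) return 100;                  // exact match
  if (haystack.startsWith(needle)) return 80;           // prefix match
  if (tokens.length > 0 && tokens.every((t) => haystack.includes(t))) {
    // All tokens present: earlier first-token position scores higher.
    return 60 - Math.min(haystack.indexOf(tokens[0] ?? ''), 20);
  }
  return tokens.filter((t) => haystack.includes(t)).length * 10; // partial
}

console.log(scoreEntry('wagner group', 'wagner group'));     // 100
console.log(scoreEntry('wagner', 'wagner group pmc'));       // 80
console.log(scoreEntry('group wagner', 'wagner group pmc')); // 53
console.log(scoreEntry('wagner alpha', 'wagner group pmc')); // 10
```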
export const lookupSanctionEntity: SanctionsServiceHandler['lookupSanctionEntity'] = async (
_ctx: ServerContext,
req: LookupSanctionEntityRequest,
): Promise<LookupSanctionEntityResponse> => {
const q = (req.q ?? '').trim();
if (q.length < MIN_QUERY_LENGTH || q.length > MAX_QUERY_LENGTH) {
return { results: [], total: 0, source: 'opensanctions' };
}
const maxResults = clampMax(req.maxResults);
// Primary: live query against OpenSanctions — broader global coverage than
// the local OFAC index. Matches the legacy /api/sanctions-entity-search path.
try {
const upstream = await searchOpenSanctions(q, maxResults);
if (upstream) {
return { ...upstream, source: 'opensanctions' };
}
} catch {
// fall through to OFAC fallback
}
// Fallback: local OFAC fuzzy match from the seeded Redis index. Keeps the
// endpoint useful when OpenSanctions is unreachable or rate-limiting us.
try {
const raw = await getCachedJson(ENTITY_INDEX_KEY, true);
return { ...searchOfacLocal(q, maxResults, raw), source: 'ofac' };
} catch {
return { results: [], total: 0, source: 'ofac' };
}

View File

@@ -0,0 +1,99 @@
import type {
ServerContext,
GetScenarioStatusRequest,
GetScenarioStatusResponse,
ScenarioResult,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { isCallerPremium } from '../../../_shared/premium-check';
import { getRawJson } from '../../../_shared/redis';
// Matches jobIds produced by run-scenario.ts: `scenario:{13-digit-ts}:{8-char-suffix}`.
// Guards `GET /scenario-result/{jobId}` against path-traversal via crafted jobId.
const JOB_ID_RE = /^scenario:\d{13}:[a-z0-9]{8}$/;
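A quick sanity sketch of what the anchored pattern accepts and rejects (sample jobIds are illustrative):

```typescript
// Same anchored shape check as JOB_ID_RE above.
const JOB_ID_RE = /^scenario:\d{13}:[a-z0-9]{8}$/;

console.log(JOB_ID_RE.test('scenario:1712345678901:a1b2c3d4'));   // true
console.log(JOB_ID_RE.test('scenario:..%2F..%2Fetc:passwd'));     // false: no 13-digit ts
console.log(JOB_ID_RE.test('scenario:1712345678901:a1b2c3d4:x')); // false: $ anchor rejects suffix
```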
interface WorkerResultEnvelope {
status?: string;
result?: unknown;
error?: unknown;
}
function coerceImpactCountries(raw: unknown): ScenarioResult['topImpactCountries'] {
if (!Array.isArray(raw)) return [];
const out: ScenarioResult['topImpactCountries'] = [];
for (const entry of raw) {
if (!entry || typeof entry !== 'object') continue;
const c = entry as { iso2?: unknown; totalImpact?: unknown; impactPct?: unknown };
out.push({
iso2: typeof c.iso2 === 'string' ? c.iso2 : '',
totalImpact: typeof c.totalImpact === 'number' ? c.totalImpact : 0,
impactPct: typeof c.impactPct === 'number' ? c.impactPct : 0,
});
}
return out;
}
function coerceTemplate(raw: unknown): ScenarioResult['template'] {
if (!raw || typeof raw !== 'object') return undefined;
const t = raw as { name?: unknown; disruptionPct?: unknown; durationDays?: unknown; costShockMultiplier?: unknown };
return {
name: typeof t.name === 'string' ? t.name : '',
disruptionPct: typeof t.disruptionPct === 'number' ? t.disruptionPct : 0,
durationDays: typeof t.durationDays === 'number' ? t.durationDays : 0,
costShockMultiplier: typeof t.costShockMultiplier === 'number' ? t.costShockMultiplier : 1,
};
}
function coerceResult(raw: unknown): ScenarioResult | undefined {
if (!raw || typeof raw !== 'object') return undefined;
const r = raw as { affectedChokepointIds?: unknown; topImpactCountries?: unknown; template?: unknown };
return {
affectedChokepointIds: Array.isArray(r.affectedChokepointIds)
? r.affectedChokepointIds.filter((id): id is string => typeof id === 'string')
: [],
topImpactCountries: coerceImpactCountries(r.topImpactCountries),
template: coerceTemplate(r.template),
};
}
export async function getScenarioStatus(
ctx: ServerContext,
req: GetScenarioStatusRequest,
): Promise<GetScenarioStatusResponse> {
const isPro = await isCallerPremium(ctx.request);
if (!isPro) {
throw new ApiError(403, 'PRO subscription required', '');
}
const jobId = req.jobId ?? '';
if (!JOB_ID_RE.test(jobId)) {
throw new ValidationError([{ field: 'jobId', description: 'Invalid or missing jobId' }]);
}
// Worker writes under the raw (unprefixed) key, so we must read raw.
let envelope: WorkerResultEnvelope | null = null;
try {
envelope = await getRawJson(`scenario-result:${jobId}`) as WorkerResultEnvelope | null;
} catch {
throw new ApiError(502, 'Failed to fetch job status', '');
}
if (!envelope) {
return { status: 'pending', error: '' };
}
const status = typeof envelope.status === 'string' ? envelope.status : 'pending';
if (status === 'done') {
const result = coerceResult(envelope.result);
return { status: 'done', result, error: '' };
}
if (status === 'failed') {
const error = typeof envelope.error === 'string' ? envelope.error : 'computation_error';
return { status: 'failed', error };
}
return { status, error: '' };
}

View File

@@ -0,0 +1,11 @@
import type { ScenarioServiceHandler } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { runScenario } from './run-scenario';
import { getScenarioStatus } from './get-scenario-status';
import { listScenarioTemplates } from './list-scenario-templates';
export const scenarioHandler: ScenarioServiceHandler = {
runScenario,
getScenarioStatus,
listScenarioTemplates,
};

View File

@@ -0,0 +1,26 @@
import type {
ServerContext,
ListScenarioTemplatesRequest,
ListScenarioTemplatesResponse,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { SCENARIO_TEMPLATES } from '../../supply-chain/v1/scenario-templates';
export async function listScenarioTemplates(
_ctx: ServerContext,
_req: ListScenarioTemplatesRequest,
): Promise<ListScenarioTemplatesResponse> {
return {
templates: SCENARIO_TEMPLATES.map((t) => ({
id: t.id,
name: t.name,
affectedChokepointIds: [...t.affectedChokepointIds],
disruptionPct: t.disruptionPct,
durationDays: t.durationDays,
// Empty array means ALL sectors on the wire (mirrors the `affectedHs2: null`
// template convention — proto `repeated` cannot carry null).
affectedHs2: t.affectedHs2 ? [...t.affectedHs2] : [],
costShockMultiplier: t.costShockMultiplier,
})),
};
}
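The null-to-empty mapping in the comment above, pulled out as a standalone sketch:

```typescript
// Proto repeated fields cannot carry null, so a template's
// `affectedHs2: null` ("all sectors") becomes an empty array on the wire.
const toWireHs2 = (affectedHs2: string[] | null): string[] =>
  affectedHs2 ? [...affectedHs2] : [];

console.log(toWireHs2(null));         // [] (means ALL sectors)
console.log(toWireHs2(['27', '85'])); // ['27', '85']
```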

View File

@@ -0,0 +1,79 @@
import type {
ServerContext,
RunScenarioRequest,
RunScenarioResponse,
} from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { ApiError, ValidationError } from '../../../../src/generated/server/worldmonitor/scenario/v1/service_server';
import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import { getScenarioTemplate } from '../../supply-chain/v1/scenario-templates';
const QUEUE_KEY = 'scenario-queue:pending';
const MAX_QUEUE_DEPTH = 100;
const JOB_ID_CHARSET = 'abcdefghijklmnopqrstuvwxyz0123456789';
function generateJobId(): string {
const ts = Date.now();
let suffix = '';
const array = new Uint8Array(8);
crypto.getRandomValues(array);
for (const byte of array) suffix += JOB_ID_CHARSET[byte % JOB_ID_CHARSET.length];
return `scenario:${ts}:${suffix}`;
}
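Restated standalone (with `webcrypto` imported from node:crypto, since the global `crypto` above is assumed to come from the edge runtime): every id this produces satisfies the JOB_ID_RE guard in get-scenario-status.ts. Note `byte % 36` carries a slight modulo bias, which is acceptable for job ids that are not secrets.

```typescript
import { webcrypto } from 'node:crypto';

// Mirrors JOB_ID_CHARSET: 36 lowercase alphanumerics.
const CHARSET = 'abcdefghijklmnopqrstuvwxyz0123456789';

function sketchJobId(): string {
  const bytes = new Uint8Array(8);
  webcrypto.getRandomValues(bytes);
  let suffix = '';
  // byte % 36 is slightly biased (256 is not a multiple of 36); fine for ids.
  for (const b of bytes) suffix += CHARSET[b % CHARSET.length];
  return `scenario:${Date.now()}:${suffix}`;
}

// Shape matches the status handler's guard: 13-digit ts, 8-char suffix.
console.log(/^scenario:\d{13}:[a-z0-9]{8}$/.test(sketchJobId())); // true
```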
export async function runScenario(
ctx: ServerContext,
req: RunScenarioRequest,
): Promise<RunScenarioResponse> {
const isPro = await isCallerPremium(ctx.request);
if (!isPro) {
throw new ApiError(403, 'PRO subscription required', '');
}
const scenarioId = (req.scenarioId ?? '').trim();
if (!scenarioId) {
throw new ValidationError([{ field: 'scenarioId', description: 'scenarioId is required' }]);
}
if (!getScenarioTemplate(scenarioId)) {
throw new ValidationError([{ field: 'scenarioId', description: `Unknown scenario: ${scenarioId}` }]);
}
const iso2 = req.iso2 ? req.iso2.trim() : '';
if (iso2 && !/^[A-Z]{2}$/.test(iso2)) {
throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase country code' }]);
}
// Queue-depth backpressure. Raw key: worker reads it unprefixed, so we must too.
const [depthEntry] = await runRedisPipeline([['LLEN', QUEUE_KEY]], true);
const depth = typeof depthEntry?.result === 'number' ? depthEntry.result : 0;
if (depth > MAX_QUEUE_DEPTH) {
throw new ApiError(429, 'Scenario queue is at capacity, please try again later', '');
}
const jobId = generateJobId();
const payload = JSON.stringify({
jobId,
scenarioId,
iso2: iso2 || null,
enqueuedAt: Date.now(),
});
// Upstash RPUSH returns the new list length; helper returns [] on transport
// failure. Either no entry or a non-numeric result means the enqueue never
// landed — surface as 502 so the caller retries.
const [pushEntry] = await runRedisPipeline([['RPUSH', QUEUE_KEY, payload]], true);
if (!pushEntry || typeof pushEntry.result !== 'number') {
throw new ApiError(502, 'Failed to enqueue scenario job', '');
}
// statusUrl is a server-computed convenience URL preserved from the legacy
// /api/scenario/v1/run contract so external callers can keep polling via the
// response body rather than hardcoding the status path. See the proto comment
// on RunScenarioResponse for why this matters on a v1 → v1 migration.
return {
jobId,
status: 'pending',
statusUrl: `/api/scenario/v1/get-scenario-status?jobId=${encodeURIComponent(jobId)}`,
};
}

View File

@@ -0,0 +1,11 @@
import type { ShippingV2ServiceHandler } from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import { routeIntelligence } from './route-intelligence';
import { registerWebhook } from './register-webhook';
import { listWebhooks } from './list-webhooks';
export const shippingV2Handler: ShippingV2ServiceHandler = {
routeIntelligence,
registerWebhook,
listWebhooks,
};

View File

@@ -0,0 +1,70 @@
import type {
ServerContext,
ListWebhooksRequest,
ListWebhooksResponse,
WebhookSummary,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import { ApiError } from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../../api/_api-key.js';
import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import {
webhookKey,
ownerIndexKey,
callerFingerprint,
type WebhookRecord,
} from './webhook-shared';
export async function listWebhooks(
ctx: ServerContext,
_req: ListWebhooksRequest,
): Promise<ListWebhooksResponse> {
// Without forceKey, Clerk-authenticated pro callers reach this handler with
// no API key, callerFingerprint() returns the 'anon' fallback, and the
// ownerTag !== ownerHash defense-in-depth below collapses because both
// sides equal 'anon' — exposing every 'anon'-bucket tenant's webhooks to
// every Clerk-session holder. See registerWebhook for full rationale.
const apiKeyResult = validateApiKey(ctx.request, { forceKey: true }) as {
valid: boolean; required: boolean; error?: string;
};
if (apiKeyResult.required && !apiKeyResult.valid) {
throw new ApiError(401, apiKeyResult.error ?? 'API key required', '');
}
const isPro = await isCallerPremium(ctx.request);
if (!isPro) {
throw new ApiError(403, 'PRO subscription required', '');
}
const ownerHash = await callerFingerprint(ctx.request);
const smembersResult = await runRedisPipeline([['SMEMBERS', ownerIndexKey(ownerHash)]]);
const memberIds = (smembersResult[0]?.result as string[] | null) ?? [];
if (memberIds.length === 0) {
return { webhooks: [] };
}
const getResults = await runRedisPipeline(memberIds.map(id => ['GET', webhookKey(id)]));
const webhooks: WebhookSummary[] = [];
for (const r of getResults) {
if (!r.result || typeof r.result !== 'string') continue;
try {
const record = JSON.parse(r.result) as WebhookRecord;
if (record.ownerTag !== ownerHash) continue;
webhooks.push({
subscriberId: record.subscriberId,
callbackUrl: record.callbackUrl,
chokepointIds: record.chokepointIds,
alertThreshold: record.alertThreshold,
createdAt: record.createdAt,
active: record.active,
});
} catch {
// skip malformed
}
}
return { webhooks };
}

View File

@@ -0,0 +1,100 @@
import type {
ServerContext,
RegisterWebhookRequest,
RegisterWebhookResponse,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import {
ApiError,
ValidationError,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
// @ts-expect-error — JS module, no declaration file
import { validateApiKey } from '../../../../api/_api-key.js';
import { isCallerPremium } from '../../../_shared/premium-check';
import { runRedisPipeline } from '../../../_shared/redis';
import {
WEBHOOK_TTL,
VALID_CHOKEPOINT_IDS,
isBlockedCallbackUrl,
generateSecret,
generateSubscriberId,
webhookKey,
ownerIndexKey,
callerFingerprint,
type WebhookRecord,
} from './webhook-shared';
export async function registerWebhook(
ctx: ServerContext,
req: RegisterWebhookRequest,
): Promise<RegisterWebhookResponse> {
// Webhooks are per-tenant keyed on callerFingerprint(), which hashes the
// API key. Without forceKey, a Clerk-authenticated pro caller reaches this
// handler with no API key, callerFingerprint() falls back to 'anon', and
// every such caller collapses into a shared 'anon' owner bucket — letting
// one Clerk-session holder enumerate/overwrite other tenants' webhooks.
// Matches the legacy `api/v2/shipping/webhooks/[subscriberId]{,/[action]}.ts`
// gate and the documented "X-WorldMonitor-Key required" contract in
// docs/api-shipping-v2.mdx.
const apiKeyResult = validateApiKey(ctx.request, { forceKey: true }) as {
valid: boolean; required: boolean; error?: string;
};
if (apiKeyResult.required && !apiKeyResult.valid) {
throw new ApiError(401, apiKeyResult.error ?? 'API key required', '');
}
const isPro = await isCallerPremium(ctx.request);
if (!isPro) {
throw new ApiError(403, 'PRO subscription required', '');
}
const callbackUrl = (req.callbackUrl ?? '').trim();
if (!callbackUrl) {
throw new ValidationError([{ field: 'callbackUrl', description: 'callbackUrl is required' }]);
}
const ssrfError = isBlockedCallbackUrl(callbackUrl);
if (ssrfError) {
throw new ValidationError([{ field: 'callbackUrl', description: ssrfError }]);
}
const chokepointIds = Array.isArray(req.chokepointIds) ? req.chokepointIds : [];
const invalidCp = chokepointIds.find(id => !VALID_CHOKEPOINT_IDS.has(id));
if (invalidCp) {
throw new ValidationError([
{ field: 'chokepointIds', description: `Unknown chokepoint ID: ${invalidCp}` },
]);
}
// Proto default int32 is 0 — treat 0 as "unset" to preserve the legacy
// default of 50 when the caller omits alertThreshold. Validate the raw wire
// value: after the default is applied the result can never be negative, so
// checking the coerced value would silently accept negative inputs.
if (req.alertThreshold < 0 || req.alertThreshold > 100) {
throw new ValidationError([
{ field: 'alertThreshold', description: 'alertThreshold must be a number between 0 and 100' },
]);
}
const alertThreshold = req.alertThreshold > 0 ? req.alertThreshold : 50;
const ownerTag = await callerFingerprint(ctx.request);
const newSubscriberId = generateSubscriberId();
const secret = await generateSecret();
const record: WebhookRecord = {
subscriberId: newSubscriberId,
ownerTag,
callbackUrl,
chokepointIds: chokepointIds.length ? chokepointIds : [...VALID_CHOKEPOINT_IDS],
alertThreshold,
createdAt: new Date().toISOString(),
active: true,
secret,
};
await runRedisPipeline([
['SET', webhookKey(newSubscriberId), JSON.stringify(record), 'EX', String(WEBHOOK_TTL)],
['SADD', ownerIndexKey(ownerTag), newSubscriberId],
['EXPIRE', ownerIndexKey(ownerTag), String(WEBHOOK_TTL)],
]);
return { subscriberId: newSubscriberId, secret };
}
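The proto-default coercion above in isolation: a wire value of 0 (field omitted by the caller) maps to the legacy default of 50, while explicit positive values pass through.

```typescript
// Proto int32 fields default to 0 on the wire, so 0 is read as "unset"
// and mapped to the legacy default; positive values pass through untouched.
const effectiveThreshold = (wire: number): number => (wire > 0 ? wire : 50);

console.log(effectiveThreshold(0));  // 50 (caller omitted the field)
console.log(effectiveThreshold(75)); // 75
```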

View File

@@ -0,0 +1,116 @@
import type {
ServerContext,
RouteIntelligenceRequest,
RouteIntelligenceResponse,
ChokepointExposure,
BypassOption,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import {
ApiError,
ValidationError,
} from '../../../../src/generated/server/worldmonitor/shipping/v2/service_server';
import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';
import { CHOKEPOINT_STATUS_KEY } from '../../../_shared/cache-keys';
import { BYPASS_CORRIDORS_BY_CHOKEPOINT, type CargoType } from '../../../_shared/bypass-corridors';
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';
import COUNTRY_PORT_CLUSTERS from '../../../../scripts/shared/country-port-clusters.json';
interface PortClusterEntry {
nearestRouteIds: string[];
coastSide: string;
}
interface ChokepointStatusEntry {
id: string;
name?: string;
disruptionScore?: number;
warRiskTier?: string;
}
interface ChokepointStatusResponse {
chokepoints?: ChokepointStatusEntry[];
}
const VALID_CARGO_TYPES = new Set(['container', 'tanker', 'bulk', 'roro']);
export async function routeIntelligence(
ctx: ServerContext,
req: RouteIntelligenceRequest,
): Promise<RouteIntelligenceResponse> {
const isPro = await isCallerPremium(ctx.request);
if (!isPro) {
throw new ApiError(403, 'PRO subscription required', '');
}
const fromIso2 = (req.fromIso2 ?? '').trim().toUpperCase();
const toIso2 = (req.toIso2 ?? '').trim().toUpperCase();
const badIso = !/^[A-Z]{2}$/.test(fromIso2) ? 'fromIso2' : !/^[A-Z]{2}$/.test(toIso2) ? 'toIso2' : null;
if (badIso) {
throw new ValidationError([
{ field: badIso, description: `${badIso} must be a valid 2-letter ISO country code` },
]);
}
const cargoTypeRaw = (req.cargoType ?? '').trim().toLowerCase();
const cargoType: CargoType = (VALID_CARGO_TYPES.has(cargoTypeRaw) ? cargoTypeRaw : 'container') as CargoType;
const hs2 = (req.hs2 ?? '').trim().replace(/\D/g, '') || '27';
const clusters = COUNTRY_PORT_CLUSTERS as unknown as Record<string, PortClusterEntry>;
const fromCluster = clusters[fromIso2];
const toCluster = clusters[toIso2];
const fromRoutes = new Set(fromCluster?.nearestRouteIds ?? []);
const toRoutes = new Set(toCluster?.nearestRouteIds ?? []);
const sharedRoutes = [...fromRoutes].filter(r => toRoutes.has(r));
const primaryRouteId = sharedRoutes[0] ?? fromCluster?.nearestRouteIds[0] ?? '';
const statusRaw = (await getCachedJson(CHOKEPOINT_STATUS_KEY).catch(() => null)) as ChokepointStatusResponse | null;
const statusMap = new Map<string, ChokepointStatusEntry>(
(statusRaw?.chokepoints ?? []).map(cp => [cp.id, cp]),
);
const relevantRouteSet = new Set(sharedRoutes.length ? sharedRoutes : (fromCluster?.nearestRouteIds ?? []));
const chokepointExposures: ChokepointExposure[] = CHOKEPOINT_REGISTRY
.filter(cp => cp.routeIds.some(r => relevantRouteSet.has(r)))
.map(cp => {
const overlap = cp.routeIds.filter(r => relevantRouteSet.has(r)).length;
const exposurePct = Math.round((overlap / Math.max(cp.routeIds.length, 1)) * 100);
return { chokepointId: cp.id, chokepointName: cp.displayName, exposurePct };
})
.filter(e => e.exposurePct > 0)
.sort((a, b) => b.exposurePct - a.exposurePct);
const primaryChokepoint = chokepointExposures[0];
const primaryCpStatus = primaryChokepoint ? statusMap.get(primaryChokepoint.chokepointId) : null;
const disruptionScore = primaryCpStatus?.disruptionScore ?? 0;
const warRiskTier = primaryCpStatus?.warRiskTier ?? 'WAR_RISK_TIER_NORMAL';
const bypassOptions: BypassOption[] = primaryChokepoint
? (BYPASS_CORRIDORS_BY_CHOKEPOINT[primaryChokepoint.chokepointId] ?? [])
.filter(c => c.suitableCargoTypes.length === 0 || c.suitableCargoTypes.includes(cargoType))
.slice(0, 5)
.map(c => ({
id: c.id,
name: c.name,
type: c.type,
addedTransitDays: c.addedTransitDays,
addedCostMultiplier: c.addedCostMultiplier,
activationThreshold: c.activationThreshold,
}))
: [];
return {
fromIso2,
toIso2,
cargoType,
hs2,
primaryRouteId,
chokepointExposures,
bypassOptions,
warRiskTier,
disruptionScore,
fetchedAt: new Date().toISOString(),
};
}
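The exposure math in isolation, with illustrative route ids (the real registry ids may differ): each chokepoint's exposure is the share of its routes that fall in the corridor's relevant route set, as a rounded percent.

```typescript
// Exposure math from routeIntelligence above.
function exposurePct(routeIds: string[], relevant: Set<string>): number {
  const overlap = routeIds.filter((r) => relevant.has(r)).length;
  // Math.max guards divide-by-zero for a chokepoint with no routes.
  return Math.round((overlap / Math.max(routeIds.length, 1)) * 100);
}

const relevant = new Set(['red-sea', 'suez-med']); // hypothetical route ids
console.log(exposurePct(['red-sea', 'suez-med', 'cape-route'], relevant)); // 67
console.log(exposurePct([], relevant));                                    // 0
```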

View File

@@ -0,0 +1,102 @@
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';
export const WEBHOOK_TTL = 86400 * 30; // 30 days
export const VALID_CHOKEPOINT_IDS = new Set(CHOKEPOINT_REGISTRY.map(c => c.id));
// Private IP ranges + known cloud metadata hostnames blocked at registration.
// DNS rebinding is not mitigated here (no DNS resolution in edge runtime); the
// delivery worker must re-resolve and re-check before sending.
export const PRIVATE_HOSTNAME_PATTERNS = [
/^localhost$/i,
/^127\.\d+\.\d+\.\d+$/,
/^10\.\d+\.\d+\.\d+$/,
/^192\.168\.\d+\.\d+$/,
/^172\.(1[6-9]|2\d|3[01])\.\d+\.\d+$/,
/^169\.254\.\d+\.\d+$/,
/^f[cd][0-9a-f]{2}:/i, // ULA fc00::/7 (fc00...fdff, not just fd)
/^fe80:/i,
/^::1$/,
/^0\.0\.0\.0$/,
/^0\.\d+\.\d+\.\d+$/,
/^100\.(6[4-9]|[7-9]\d|1[01]\d|12[0-7])\.\d+\.\d+$/,
];
export const BLOCKED_METADATA_HOSTNAMES = new Set([
'169.254.169.254',
'metadata.google.internal',
'metadata.internal',
'instance-data',
'metadata',
'computemetadata',
'link-local.s3.amazonaws.com',
]);
export function isBlockedCallbackUrl(rawUrl: string): string | null {
let parsed: URL;
try {
parsed = new URL(rawUrl);
} catch {
return 'callbackUrl is not a valid URL';
}
if (parsed.protocol !== 'https:') {
return 'callbackUrl must use https';
}
const hostname = parsed.hostname.toLowerCase();
if (BLOCKED_METADATA_HOSTNAMES.has(hostname)) {
return 'callbackUrl hostname is a blocked metadata endpoint';
}
for (const pattern of PRIVATE_HOSTNAME_PATTERNS) {
if (pattern.test(hostname)) {
return `callbackUrl resolves to a private/reserved address: ${hostname}`;
}
}
return null;
}
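A trimmed standalone version of the check, restating the https gate plus a couple of the patterns above, to show the accept/reject behavior (hostnames are illustrative):

```typescript
// Trimmed sketch of isBlockedCallbackUrl: https-only, one metadata host,
// and two of the private-range patterns from PRIVATE_HOSTNAME_PATTERNS.
function isBlocked(rawUrl: string): string | null {
  let parsed: URL;
  try {
    parsed = new URL(rawUrl);
  } catch {
    return 'invalid';
  }
  if (parsed.protocol !== 'https:') return 'https required';
  const host = parsed.hostname.toLowerCase();
  if (host === '169.254.169.254') return 'metadata endpoint';
  if (/^10\.\d+\.\d+\.\d+$/.test(host) || /^localhost$/i.test(host)) {
    return 'private address';
  }
  return null; // allowed
}

console.log(isBlocked('https://hooks.example.com/x')); // null
console.log(isBlocked('http://hooks.example.com/x'));  // https required
console.log(isBlocked('https://10.0.0.8/x'));          // private address
```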
export async function generateSecret(): Promise<string> {
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
return [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}
export function generateSubscriberId(): string {
const bytes = new Uint8Array(12);
crypto.getRandomValues(bytes);
return 'wh_' + [...bytes].map(b => b.toString(16).padStart(2, '0')).join('');
}
export function webhookKey(subscriberId: string): string {
return `webhook:sub:${subscriberId}:v1`;
}
export function ownerIndexKey(ownerHash: string): string {
return `webhook:owner:${ownerHash}:v1`;
}
/** SHA-256 hash of the caller's API key — used as ownerTag and owner index key. Never secret. */
export async function callerFingerprint(req: Request): Promise<string> {
const key =
req.headers.get('X-WorldMonitor-Key') ??
req.headers.get('X-Api-Key') ??
'';
if (!key) return 'anon';
const encoded = new TextEncoder().encode(key);
const hashBuffer = await crypto.subtle.digest('SHA-256', encoded);
return Array.from(new Uint8Array(hashBuffer)).map(b => b.toString(16).padStart(2, '0')).join('');
}
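The same digest computed synchronously via node:crypto for a standalone check (the handler itself uses crypto.subtle.digest; both produce identical 64-char lowercase hex):

```typescript
import { createHash } from 'node:crypto';

// Equivalent of callerFingerprint's hashing step: hex SHA-256 of the API
// key, with the same 'anon' fallback for keyless callers. One-way, so it is
// safe to embed in Redis key names as the owner tag.
function fingerprint(apiKey: string): string {
  if (!apiKey) return 'anon';
  return createHash('sha256').update(apiKey).digest('hex');
}

console.log(fingerprint(''));                   // anon
console.log(fingerprint('wm_test_key').length); // 64
```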
export interface WebhookRecord {
subscriberId: string;
ownerTag: string;
callbackUrl: string;
chokepointIds: string[];
alertThreshold: number;
createdAt: string;
active: boolean;
secret: string;
}

View File

@@ -257,7 +257,7 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {
const [navResult, vesselResult, transitSummariesData, flowsData] = await Promise.all([
listNavigationalWarnings(ctx, { area: '', pageSize: 0, cursor: '' }).catch((): ListNavigationalWarningsResponse => { navFailed = true; return { warnings: [], pagination: undefined }; }),
getVesselSnapshot(ctx, { neLat: 90, neLon: 180, swLat: -90, swLon: -180, includeCandidates: false }).catch((): GetVesselSnapshotResponse => { vesselFailed = true; return { snapshot: undefined }; }),
getCachedJson(TRANSIT_SUMMARIES_KEY, true).catch(() => null) as Promise<TransitSummariesPayload | null>,
getCachedJson(FLOWS_KEY, true).catch(() => null) as Promise<Record<string, FlowEstimateEntry> | null>,
]);

View File

@@ -0,0 +1,47 @@
import type {
ServerContext,
GetCountryProductsRequest,
GetCountryProductsResponse,
CountryProduct,
} from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { ValidationError } from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';
interface BilateralHs4Payload {
iso2: string;
products?: CountryProduct[];
fetchedAt?: string;
}
export async function getCountryProducts(
ctx: ServerContext,
req: GetCountryProductsRequest,
): Promise<GetCountryProductsResponse> {
const iso2 = (req.iso2 ?? '').trim().toUpperCase();
// Input-shape errors return 400 — restoring the legacy /api/supply-chain/v1/
// country-products contract which predated the sebuf migration. Empty-payload-200
// is reserved for the PRO-gate deny path (intentional contract shift), not for
// caller bugs (malformed/missing fields). Distinguishing the two matters for
// logging, external API consumers, and silent-failure detection.
if (!/^[A-Z]{2}$/.test(iso2)) {
throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase ISO country code' }]);
}
const isPro = await isCallerPremium(ctx.request);
const empty: GetCountryProductsResponse = { iso2, products: [], fetchedAt: '' };
if (!isPro) return empty;
// Seeder writes via raw key (no env-prefix) — match it on read.
const key = `comtrade:bilateral-hs4:${iso2}:v1`;
const payload = await getCachedJson(key, true).catch(() => null) as BilateralHs4Payload | null;
if (!payload) return empty;
return {
iso2,
products: Array.isArray(payload.products) ? payload.products : [],
fetchedAt: payload.fetchedAt ?? '',
};
}

View File

@@ -0,0 +1,129 @@
import type {
  ServerContext,
  GetMultiSectorCostShockRequest,
  GetMultiSectorCostShockResponse,
  ChokepointInfo,
  MultiSectorCostShock,
  WarRiskTier,
} from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { ValidationError } from '../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server';
import { isCallerPremium } from '../../../_shared/premium-check';
import { getCachedJson } from '../../../_shared/redis';
import { CHOKEPOINT_REGISTRY } from '../../../_shared/chokepoint-registry';
import { CHOKEPOINT_STATUS_KEY } from '../../../_shared/cache-keys';
import {
  aggregateAnnualImportsByHs2,
  clampClosureDays,
  computeMultiSectorShocks,
  MULTI_SECTOR_HS2_LABELS,
  SEEDED_HS2_CODES,
  type SeededProduct,
} from './_multi-sector-shock';

interface CountryProductsCache {
  iso2: string;
  products?: SeededProduct[];
  fetchedAt?: string;
}

function emptySectorSkeleton(closureDays: number): MultiSectorCostShock[] {
  return SEEDED_HS2_CODES.map(hs2 => ({
    hs2,
    hs2Label: MULTI_SECTOR_HS2_LABELS[hs2] ?? `HS ${hs2}`,
    importValueAnnual: 0,
    freightAddedPctPerTon: 0,
    warRiskPremiumBps: 0,
    addedTransitDays: 0,
    totalCostShockPerDay: 0,
    totalCostShock30Days: 0,
    totalCostShock90Days: 0,
    totalCostShock: 0,
    closureDays,
  }));
}

function emptyResponse(
  iso2: string,
  chokepointId: string,
  closureDays: number,
  warRiskTier: WarRiskTier = 'WAR_RISK_TIER_UNSPECIFIED',
  unavailableReason = '',
  sectors: MultiSectorCostShock[] = [],
): GetMultiSectorCostShockResponse {
  return {
    iso2,
    chokepointId,
    closureDays,
    warRiskTier,
    sectors,
    totalAddedCost: 0,
    fetchedAt: new Date().toISOString(),
    unavailableReason,
  };
}

export async function getMultiSectorCostShock(
  ctx: ServerContext,
  req: GetMultiSectorCostShockRequest,
): Promise<GetMultiSectorCostShockResponse> {
  const iso2 = (req.iso2 ?? '').trim().toUpperCase();
  const chokepointId = (req.chokepointId ?? '').trim().toLowerCase();
  const closureDays = clampClosureDays(req.closureDays ?? 30);
  // Input-shape errors return 400 — restoring the legacy /api/supply-chain/v1/
  // multi-sector-cost-shock contract. Empty-payload-200 is reserved for the
  // PRO-gate deny path (intentional contract shift), not for caller bugs
  // (malformed or missing fields). Distinguishing the two matters for external
  // API consumers, tests, and silent-failure detection in logs.
  if (!/^[A-Z]{2}$/.test(iso2)) {
    throw new ValidationError([{ field: 'iso2', description: 'iso2 must be a 2-letter uppercase ISO country code' }]);
  }
  if (!chokepointId) {
    throw new ValidationError([{ field: 'chokepointId', description: 'chokepointId is required' }]);
  }
  if (!CHOKEPOINT_REGISTRY.some(c => c.id === chokepointId)) {
    throw new ValidationError([{ field: 'chokepointId', description: `Unknown chokepointId: ${chokepointId}` }]);
  }
  const isPro = await isCallerPremium(ctx.request);
  if (!isPro) return emptyResponse(iso2, chokepointId, closureDays);
  // Seeder writes the products payload via raw key (no env-prefix) — read raw.
  const productsKey = `comtrade:bilateral-hs4:${iso2}:v1`;
  const [productsCache, statusCache] = await Promise.all([
    getCachedJson(productsKey, true).catch(() => null) as Promise<CountryProductsCache | null>,
    getCachedJson(CHOKEPOINT_STATUS_KEY).catch(() => null) as Promise<{ chokepoints?: ChokepointInfo[] } | null>,
  ]);
  const products = Array.isArray(productsCache?.products) ? productsCache.products : [];
  const importsByHs2 = aggregateAnnualImportsByHs2(products);
  const hasAnyImports = Object.values(importsByHs2).some(v => v > 0);
  const warRiskTier = (statusCache?.chokepoints?.find(c => c.id === chokepointId)?.warRiskTier
    ?? 'WAR_RISK_TIER_NORMAL') as WarRiskTier;
  if (!hasAnyImports) {
    return emptyResponse(
      iso2,
      chokepointId,
      closureDays,
      warRiskTier,
      'No seeded import data available for this country',
      emptySectorSkeleton(closureDays),
    );
  }
  const sectors = computeMultiSectorShocks(importsByHs2, chokepointId, warRiskTier, closureDays);
  const totalAddedCost = sectors.reduce((sum, s) => sum + s.totalCostShock, 0);
  return {
    iso2,
    chokepointId,
    closureDays,
    warRiskTier,
    sectors,
    totalAddedCost,
    fetchedAt: new Date().toISOString(),
    unavailableReason: '',
  };
}
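The no-data branch deliberately returns one zeroed row per seeded HS2 code rather than an empty array, so downstream charts keep a stable row set. A self-contained sketch of that design choice, using illustrative stand-in codes and labels (the real `SEEDED_HS2_CODES` and label map live in `./_multi-sector-shock`):

```typescript
// Sketch: the skeleton keeps one row per seeded HS2 code with all costs
// zeroed, and falls back to a generic "HS <code>" label when no friendly
// name is registered. Codes and labels below are hypothetical.
const SEEDED_HS2_CODES = ["27", "85", "87"];
const MULTI_SECTOR_HS2_LABELS: Record<string, string> = {
  "27": "Mineral fuels",
  "85": "Electrical machinery",
  // "87" intentionally unlabeled to exercise the fallback.
};

interface SectorRow { hs2: string; hs2Label: string; totalCostShock: number; closureDays: number }

function emptySectorSkeleton(closureDays: number): SectorRow[] {
  return SEEDED_HS2_CODES.map(hs2 => ({
    hs2,
    hs2Label: MULTI_SECTOR_HS2_LABELS[hs2] ?? `HS ${hs2}`,
    totalCostShock: 0,
    closureDays,
  }));
}

const rows = emptySectorSkeleton(30);
console.log(rows.length);      // 3 — one row per seeded code, never empty
console.log(rows[2].hs2Label); // "HS 87" — generic fallback label
```

A frontend diffing these rows by `hs2` sees the same keys whether or not data is seeded, which avoids layout jumps when data arrives later.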


@@ -8,6 +8,8 @@ import { getShippingStress } from './get-shipping-stress';
 import { getCountryChokepointIndex } from './get-country-chokepoint-index';
 import { getBypassOptions } from './get-bypass-options';
 import { getCountryCostShock } from './get-country-cost-shock';
+import { getCountryProducts } from './get-country-products';
+import { getMultiSectorCostShock } from './get-multi-sector-cost-shock';
 import { getSectorDependency } from './get-sector-dependency';
 import { getRouteExplorerLane } from './get-route-explorer-lane';
 import { getRouteImpact } from './get-route-impact';
@@ -21,6 +23,8 @@ export const supplyChainHandler: SupplyChainServiceHandler = {
   getCountryChokepointIndex,
   getBypassOptions,
   getCountryCostShock,
+  getCountryProducts,
+  getMultiSectorCostShock,
   getSectorDependency,
   getRouteExplorerLane,
   getRouteImpact,


@@ -1212,10 +1212,14 @@ async function dispatch(requestUrl, req, routes, context) {
   }
   // Registration — call Convex directly when CONVEX_URL is available (self-hosted),
   // otherwise proxy to cloud (desktop sidecar never has CONVEX_URL).
+  // Keeps the legacy /api/register-interest local path so older desktop builds
+  // continue to work; cloud fallback rewrites to the new sebuf RPC path.
   if (requestUrl.pathname === '/api/register-interest' && req.method === 'POST') {
     const convexUrl = process.env.CONVEX_URL;
     if (!convexUrl) {
-      const cloudResponse = await tryCloudFallback(requestUrl, req, context, 'no CONVEX_URL');
+      const cloudUrl = new URL(requestUrl);
+      cloudUrl.pathname = '/api/leads/v1/register-interest';
+      const cloudResponse = await tryCloudFallback(cloudUrl, req, context, 'no CONVEX_URL');
       if (cloudResponse) return cloudResponse;
       return json({ error: 'Registration service unavailable' }, 503);
     }
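The rewrite in this hunk leans on the fact that cloning a `URL` and reassigning only `pathname` preserves the origin and query string. A minimal sketch with a hypothetical host:

```typescript
// Cloning a URL and reassigning only `pathname` keeps the origin and query
// string intact, which is what lets the legacy local path fall back to the
// sebuf RPC route without losing request parameters. Host is illustrative.
const requestUrl = new URL("https://cloud.example.test/api/register-interest?src=desktop");
const cloudUrl = new URL(requestUrl); // clone — the original stays untouched
cloudUrl.pathname = "/api/leads/v1/register-interest";

console.log(cloudUrl.toString());
// "https://cloud.example.test/api/leads/v1/register-interest?src=desktop"
console.log(requestUrl.pathname); // still "/api/register-interest"
```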


@@ -17,7 +17,7 @@ import { isDesktopRuntime } from '@/services/runtime';
 import { getAuthState, subscribeAuthState } from '@/services/auth-state';
 import { hasPremiumAccess } from '@/services/panel-gating';
 import { trackGateHit } from '@/services/analytics';
-import { premiumFetch } from '@/services/premium-fetch';
+import { runScenario, getScenarioStatus } from '@/services/scenario';
 
 type TabId = 'chokepoints' | 'shipping' | 'indicators' | 'minerals' | 'stress';
@@ -847,14 +847,8 @@ export class SupplyChainPanel extends Panel {
     // Hard timeout on POST /run so a hanging edge function can't leave
     // the button in "Computing…" indefinitely.
     const runSignal = AbortSignal.any([signal, AbortSignal.timeout(20_000)]);
-    const runResp = await premiumFetch('/api/scenario/v1/run', {
-      method: 'POST',
-      headers: { 'Content-Type': 'application/json' },
-      body: JSON.stringify({ scenarioId }),
-      signal: runSignal,
-    });
-    if (!runResp.ok) throw new Error(`Run failed: ${runResp.status}`);
-    const { jobId } = await runResp.json() as { jobId: string };
+    const runResp = await runScenario({ scenarioId, iso2: '' }, { signal: runSignal });
+    const jobId = runResp.jobId;
     let result: ScenarioResult | null = null;
     // 60 × 1s = 60s max (worker typically completes in <1s). 1s poll keeps
     // the perceived latency <2s in the common case. First iteration polls
@@ -864,9 +858,7 @@ export class SupplyChainPanel extends Panel {
       if (signal.aborted) { resetButton('Simulate Closure'); return; }
       if (!this.content.isConnected) return; // panel gone — nothing to update
       if (i > 0) await new Promise(r => setTimeout(r, 1000));
-      const statusResp = await premiumFetch(`/api/scenario/v1/status?jobId=${encodeURIComponent(jobId)}`, { signal });
-      if (!statusResp.ok) throw new Error(`Status poll failed: ${statusResp.status}`);
-      const status = await statusResp.json() as { status: string; result?: ScenarioResult };
+      const status = await getScenarioStatus(jobId, { signal });
       if (status.status === 'done') {
         const r = status.result;
         if (!r || !Array.isArray(r.topImpactCountries)) throw new Error('done without valid result');
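The loop above follows a poll-until-done pattern: the first iteration polls immediately, later iterations wait between attempts, and a bounded attempt count caps total wait. A generic, self-contained sketch of that pattern (the helper name and stub status source are hypothetical; `getScenarioStatus` in the panel plays the `getStatus` role):

```typescript
// Generic poll-until-done sketch. First iteration polls immediately; later
// iterations sleep intervalMs first; maxAttempts bounds the total wait.
async function pollUntilDone<T>(
  getStatus: () => Promise<{ status: string; result?: T }>,
  maxAttempts = 60,
  intervalMs = 1000,
): Promise<T | null> {
  for (let i = 0; i < maxAttempts; i++) {
    if (i > 0) await new Promise(r => setTimeout(r, intervalMs));
    const s = await getStatus();
    if (s.status === "done") return s.result ?? null;
    if (s.status === "error") throw new Error("job failed");
  }
  return null; // timed out without completing
}

// Stub job that reports "running" twice, then "done" on the third poll.
let n = 0;
const result = await pollUntilDone<number>(
  async () => (++n < 3 ? { status: "running" } : { status: "done", result: 7 }),
  60,
  1, // 1ms interval keeps the sketch fast
);
console.log(n, result); // 3 7
```

Polling immediately on the first iteration is what keeps the common fast-worker case under the perceived-latency budget the comment describes.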


@@ -0,0 +1,149 @@
// @ts-nocheck
// Code generated by protoc-gen-ts-client. DO NOT EDIT.
// source: worldmonitor/leads/v1/service.proto

export interface SubmitContactRequest {
  email: string;
  name: string;
  organization: string;
  phone: string;
  message: string;
  source: string;
  website: string;
  turnstileToken: string;
}

export interface SubmitContactResponse {
  status: string;
  emailSent: boolean;
}

export interface RegisterInterestRequest {
  email: string;
  source: string;
  appVersion: string;
  referredBy: string;
  website: string;
  turnstileToken: string;
}

export interface RegisterInterestResponse {
  status: string;
  referralCode: string;
  referralCount: number;
  position: number;
  emailSuppressed: boolean;
}

export interface FieldViolation {
  field: string;
  description: string;
}

export class ValidationError extends Error {
  violations: FieldViolation[];
  constructor(violations: FieldViolation[]) {
    super("Validation failed");
    this.name = "ValidationError";
    this.violations = violations;
  }
}

export class ApiError extends Error {
  statusCode: number;
  body: string;
  constructor(statusCode: number, message: string, body: string) {
    super(message);
    this.name = "ApiError";
    this.statusCode = statusCode;
    this.body = body;
  }
}

export interface LeadsServiceClientOptions {
  fetch?: typeof fetch;
  defaultHeaders?: Record<string, string>;
}

export interface LeadsServiceCallOptions {
  headers?: Record<string, string>;
  signal?: AbortSignal;
}

export class LeadsServiceClient {
  private baseURL: string;
  private fetchFn: typeof fetch;
  private defaultHeaders: Record<string, string>;

  constructor(baseURL: string, options?: LeadsServiceClientOptions) {
    this.baseURL = baseURL.replace(/\/+$/, "");
    this.fetchFn = options?.fetch ?? globalThis.fetch;
    this.defaultHeaders = { ...options?.defaultHeaders };
  }

  async submitContact(req: SubmitContactRequest, options?: LeadsServiceCallOptions): Promise<SubmitContactResponse> {
    let path = "/api/leads/v1/submit-contact";
    const url = this.baseURL + path;
    const headers: Record<string, string> = {
      "Content-Type": "application/json",
      ...this.defaultHeaders,
      ...options?.headers,
    };
    const resp = await this.fetchFn(url, {
      method: "POST",
      headers,
      body: JSON.stringify(req),
      signal: options?.signal,
    });
    if (!resp.ok) {
      return this.handleError(resp);
    }
    return await resp.json() as SubmitContactResponse;
  }

  async registerInterest(req: RegisterInterestRequest, options?: LeadsServiceCallOptions): Promise<RegisterInterestResponse> {
    let path = "/api/leads/v1/register-interest";
    const url = this.baseURL + path;
    const headers: Record<string, string> = {
      "Content-Type": "application/json",
      ...this.defaultHeaders,
      ...options?.headers,
    };
    const resp = await this.fetchFn(url, {
      method: "POST",
      headers,
      body: JSON.stringify(req),
      signal: options?.signal,
    });
    if (!resp.ok) {
      return this.handleError(resp);
    }
    return await resp.json() as RegisterInterestResponse;
  }

  private async handleError(resp: Response): Promise<never> {
    const body = await resp.text();
    if (resp.status === 400) {
      try {
        const parsed = JSON.parse(body);
        if (parsed.violations) {
          throw new ValidationError(parsed.violations);
        }
      } catch (e) {
        if (e instanceof ValidationError) throw e;
      }
    }
    throw new ApiError(resp.status, `Request failed with status ${resp.status}`, body);
  }
}
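The generated client's call pattern — trailing-slash normalization on `baseURL`, then a JSON POST to the fixed sebuf route — can be exercised without a network by injecting a stubbed `fetch`. A self-contained miniature of that pattern (class and response shape below are trimmed stand-ins for the generated code, and the host is hypothetical):

```typescript
// Minimal stand-in for the generated client, verifying the baseURL
// normalization and route shape against a stubbed fetch. Not the real
// generated class — just the same call pattern.
interface MiniRegisterInterestResponse { status: string; referralCode: string }

class MiniLeadsClient {
  private baseURL: string;
  constructor(baseURL: string, private fetchFn: typeof fetch) {
    this.baseURL = baseURL.replace(/\/+$/, ""); // strip trailing slashes, as the generated code does
  }
  async registerInterest(req: { email: string }): Promise<MiniRegisterInterestResponse> {
    const resp = await this.fetchFn(this.baseURL + "/api/leads/v1/register-interest", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(req),
    });
    if (!resp.ok) throw new Error(`Request failed with status ${resp.status}`);
    return await resp.json() as MiniRegisterInterestResponse;
  }
}

// Stub fetch records the URL it was called with and returns a canned body.
const seen: string[] = [];
const stubFetch: typeof fetch = async (input) => {
  seen.push(String(input));
  return new Response(JSON.stringify({ status: "ok", referralCode: "ABC123" }),
    { status: 200, headers: { "Content-Type": "application/json" } });
};

const client = new MiniLeadsClient("https://example.test///", stubFetch);
const resp = await client.registerInterest({ email: "a@b.c" });
console.log(seen[0], resp.referralCode);
// https://example.test/api/leads/v1/register-interest ABC123
```

The real generated client adds the same hook via `LeadsServiceClientOptions.fetch`, which is what makes it testable without hitting `/api/leads/v1/*`.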
