fix(supply-chain): increase Redis timeout for PortWatch and remove content height cap (#1598)

* fix(supply-chain): increase Redis timeout for PortWatch and remove content height cap

Root cause: getCachedJson has a 1500ms timeout, but the PortWatch
payload (~149KB for 13 chokepoints x 175 days) exceeds it in
high-latency Edge regions. The fetch silently times out and returns
null, so the handler builds responses with empty transit summaries.

Fix: add optional timeoutMs param to getCachedJson, use 5000ms for
the PortWatch fetch. Also remove the 300px max-height on
.economic-content so the Supply Chain panel fills available height.
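A minimal sketch of the timeout change, assuming a Promise.race-style helper. The real getCachedJson lives in _shared/redis; the fake redisGet here stands in for its Upstash REST call, and the parameter order is illustrative:

```javascript
const DEFAULT_TIMEOUT_MS = 1500;

// Stand-in for the real Upstash REST read (assumption: the actual
// helper performs an HTTPS call against Upstash).
const fakeStore = new Map();
async function redisGet(key) {
  return fakeStore.has(key) ? fakeStore.get(key) : null;
}

// The fixed 1500ms budget becomes a default that callers can override,
// so the ~149KB PortWatch read can be given 5000ms.
async function getCachedJson(key, parse = true, timeoutMs = DEFAULT_TIMEOUT_MS) {
  let timer;
  const timeout = new Promise((resolve) => {
    timer = setTimeout(() => resolve(null), timeoutMs);
    timer.unref?.();
  });
  // Cache misses, parse errors, and slow reads all degrade to null.
  const read = redisGet(key)
    .then((raw) => (raw == null ? null : parse ? JSON.parse(raw) : raw))
    .catch(() => null);
  const value = await Promise.race([read, timeout]);
  clearTimeout(timer);
  return value;
}

// PortWatch fetch gets the larger budget:
// await getCachedJson('supply_chain:portwatch:v1', true, 5000);
```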

* refactor(supply-chain): move transit summary assembly to Railway relay

Vercel Edge was reading 3 large Redis keys (PortWatch 149KB, transit
counts, CorridorRisk) and assembling transit summaries on every request.
The 1500ms Redis timeout caused the 149KB PortWatch fetch to silently
fail on high-latency Edge regions (Mumbai bom1), leaving all transit
data empty.

Now Railway builds the pre-assembled transit summaries (including
anomaly detection) and writes them to a single key. Vercel reads
ONE small pre-built key instead of 3 raw keys.

Flow: Railway seeds PortWatch + transit counts -> builds summaries ->
writes supply_chain:transit-summaries:v1 -> Vercel reads it.

This follows the gold standard: "Vercel reads Redis ONLY; Railway
makes ALL external API calls and data assembly."
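The single-key handoff can be sketched as follows. The key name and the { summaries, fetchedAt } envelope are from this commit; publishSummaries, readSummaries, and the redis client shape are illustrative:

```javascript
const TRANSIT_SUMMARY_REDIS_KEY = 'supply_chain:transit-summaries:v1';

// Railway side: one write of the pre-assembled payload.
async function publishSummaries(redis, summaries) {
  await redis.set(
    TRANSIT_SUMMARY_REDIS_KEY,
    JSON.stringify({ summaries, fetchedAt: Date.now() }),
  );
}

// Vercel side: one small read instead of three large raw keys.
async function readSummaries(redis) {
  const raw = await redis.get(TRANSIT_SUMMARY_REDIS_KEY);
  return raw ? JSON.parse(raw).summaries : {};
}
```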

* test(supply-chain): add sync tests for relay threat levels and name mappings

detectTrafficAnomalyRelay and CHOKEPOINT_THREAT_LEVELS in the relay are
duplicated from _scoring.mjs and get-chokepoint-status.ts because
ais-relay.cjs is CJS. Added sync tests that validate:
- Every canonical chokepoint has a relay threat level
- Relay threat levels match handler CHOKEPOINTS config
- RELAY_NAME_TO_ID covers all canonical chokepoints

This catches drift between the two source-of-truth files.

* fix(ui): restore bounded scroll on economic-content with flex layout

The previous fix replaced max-height: 300px with flex: 1 1 auto, but
.panel-content was not a flex container, so the flex rule was ignored.
This caused the tabs to scroll away with the content.

Fix: use :has(.economic-content) to make .panel-content a flex column
only for panels with tabbed economic content. Tabs stay pinned, content
area scrolls independently.

* feat(supply-chain): fix CorridorRisk API integration (open beta, no key needed)

The CorridorRisk API is in open beta at corridorrisk.io/api/corridors
(not api.corridorrisk.io/v1/corridors). No API key required during beta.

Changes:
- Fix URL to corridorrisk.io/api/corridors
- Remove API key requirement (open beta)
- Update name matching for actual API names (e.g. "Persian Gulf &
  Strait of Hormuz" -> hormuz_strait)
- Derive riskLevel from score (>=70 critical, >=50 high, etc.)
- Store riskScore, vesselCount, eventCount7d, riskSummary
- Feed CorridorRisk data into transit summaries
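The full score-to-level ladder elided by the "etc." above, matching the thresholds in the relay diff (the function name is illustrative; the relay inlines this as a ternary chain):

```javascript
// Derive a corridor risk level from the 0-100 CorridorRisk score.
function deriveRiskLevel(score) {
  if (score >= 70) return 'critical';
  if (score >= 50) return 'high';
  if (score >= 30) return 'elevated';
  return 'normal';
}
```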

* test(supply-chain): comprehensive transit summary integration tests

75 tests across 10 suites covering:
- Relay seedTransitSummaries assembly (Redis key, fields, triggers)
- CorridorRisk name mapping and risk level derivation from score
- Handler reads pre-built summaries (not raw upstream keys)
- Handler isolation: no PortWatchData/CorridorRiskData/CANONICAL_CHOKEPOINTS imports
- detectTrafficAnomalyRelay sync with _scoring.mjs (side-by-side execution)
- detectTrafficAnomaly edge cases (boundaries, threat levels, unsorted history)
- CHOKEPOINT_THREAT_LEVELS relay-handler sync validation

* fix(supply-chain): hydrate transit summaries from Redis on relay restart

After relay restart, latestPortwatchData and latestCorridorRiskData are
null. The initial seedTransitSummaries call (35s after boot) would
return early with no data, leaving the transit-summaries:v1 key stale
until the next PortWatch seed completes (6+ seconds later).

Fix: seedTransitSummaries now reads persisted PortWatch and CorridorRisk
data from Redis when in-memory state is empty. This covers the cold-start
gap so Vercel always has fresh transit summaries.

Also adds 5 tests validating the hydration path order and assignment.

* fix(supply-chain): add fallback to raw Redis keys when pre-built summaries are empty

P1: If supply_chain:transit-summaries:v1 is absent (relay not deployed,
restart in progress, or transient PortWatch failure), the handler now
falls back to reading the raw portwatch, corridorrisk, and transit count
keys directly and assembling summaries on the fly.

This ensures corridor risk data (riskLevel, incidentCount7d, disruptionPct)
is never silently zeroed out, and users keep history/counts even during
the 6-hour PortWatch re-seed window.

Strategy: pre-built summaries (fast path) -> raw keys fallback (slow path)
-> all-zero defaults (last resort).
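A compact sketch of that three-tier resolution (resolveSummaries and its two callback parameters are illustrative, not the handler's actual API):

```javascript
// readPrebuilt: fetches the pre-built transit-summaries key.
// buildFromRaw: assembles summaries from the raw portwatch/corridorrisk/
// transit-count keys. Both return null/empty on failure.
async function resolveSummaries(readPrebuilt, buildFromRaw) {
  // Fast path: the single pre-built key written by Railway.
  const prebuilt = await readPrebuilt();
  if (prebuilt && Object.keys(prebuilt.summaries ?? {}).length > 0) {
    return prebuilt.summaries;
  }
  // Slow path: assemble from the raw upstream keys on the fly.
  const assembled = await buildFromRaw();
  if (assembled && Object.keys(assembled).length > 0) return assembled;
  // Last resort: empty map; the handler then emits all-zero defaults.
  return {};
}
```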

Author: Elie Habib
Date: 2026-03-14 23:27:27 +04:00
Committed by: GitHub
Parent: 519ae55980
Commit: 13bb3ef080
6 changed files with 803 additions and 75 deletions

View File

@@ -64,6 +64,7 @@ const STANDALONE_KEYS = {
portwatch: 'supply_chain:portwatch:v1',
corridorrisk: 'supply_chain:corridorrisk:v1',
chokepointTransits: 'supply_chain:chokepoint_transits:v1',
+transitSummaries: 'supply_chain:transit-summaries:v1',
};
const SEED_META = {
@@ -117,6 +118,7 @@ const SEED_META = {
portwatch: { key: 'seed-meta:supply_chain:portwatch', maxStaleMin: 720 },
corridorrisk: { key: 'seed-meta:supply_chain:corridorrisk', maxStaleMin: 120 },
chokepointTransits: { key: 'seed-meta:supply_chain:chokepoint_transits', maxStaleMin: 15 },
+transitSummaries: { key: 'seed-meta:supply_chain:transit-summaries', maxStaleMin: 15 },
};
// Standalone keys that are populated on-demand by RPC handlers (not seeds).

View File

@@ -3688,6 +3688,7 @@ const PORTWATCH_CHOKEPOINT_NAMES = [
{ name: 'Lombok Strait', id: 'lombok_strait' },
];
let portwatchSeedInFlight = false;
+let latestPortwatchData = null;
function pwFormatDate(ts) {
const d = new Date(ts);
@@ -3777,9 +3778,11 @@ async function seedPortWatch() {
console.warn('[PortWatch] No data fetched — skipping');
return;
}
+latestPortwatchData = result;
const ok = await upstashSet(PORTWATCH_REDIS_KEY, result, PORTWATCH_TTL);
await upstashSet('seed-meta:supply_chain:portwatch', { fetchedAt: Date.now(), recordCount: Object.keys(result).length }, 604800);
console.log(`[PortWatch] Seeded ${Object.keys(result).length} chokepoints (redis: ${ok ? 'OK' : 'FAIL'}) in ${((Date.now() - t0) / 1000).toFixed(1)}s`);
+seedTransitSummaries().catch(e => console.warn('[TransitSummary] Post-PortWatch seed error:', e?.message || e));
} catch (e) {
console.warn('[PortWatch] Seed error:', e?.message || e);
} finally {
@@ -3799,33 +3802,32 @@ async function startPortWatchSeedLoop() {
}, PORTWATCH_SEED_INTERVAL_MS).unref?.();
}
const CORRIDOR_RISK_API_KEY = process.env.CORRIDOR_RISK_API_KEY || '';
-const CORRIDOR_RISK_BASE_URL = 'https://api.corridorrisk.io/v1/corridors';
+const CORRIDOR_RISK_BASE_URL = 'https://corridorrisk.io/api/corridors';
const CORRIDOR_RISK_REDIS_KEY = 'supply_chain:corridorrisk:v1';
const CORRIDOR_RISK_TTL = 7200;
const CORRIDOR_RISK_SEED_INTERVAL_MS = 60 * 60 * 1000;
-const CORRIDOR_RISK_NAMES = [
-{ name: 'Suez', id: 'suez' },
-{ name: 'Malacca', id: 'malacca_strait' },
-{ name: 'Hormuz', id: 'hormuz_strait' },
-{ name: 'Bab el-Mandeb', id: 'bab_el_mandeb' },
-{ name: 'Panama', id: 'panama' },
-{ name: 'Taiwan', id: 'taiwan_strait' },
-{ name: 'Cape of Good Hope', id: 'cape_of_good_hope' },
+// API name -> canonical chokepoint ID (partial substring match)
+const CORRIDOR_RISK_NAME_MAP = [
+{ pattern: 'hormuz', id: 'hormuz_strait' },
+{ pattern: 'bab-el-mandeb', id: 'bab_el_mandeb' },
+{ pattern: 'red sea', id: 'bab_el_mandeb' },
+{ pattern: 'suez', id: 'suez' },
+{ pattern: 'south china sea', id: 'taiwan_strait' },
+{ pattern: 'black sea', id: 'bosphorus' },
];
let corridorRiskSeedInFlight = false;
+let latestCorridorRiskData = null;
async function seedCorridorRisk() {
-if (!CORRIDOR_RISK_API_KEY) return;
if (corridorRiskSeedInFlight) return;
corridorRiskSeedInFlight = true;
const t0 = Date.now();
try {
const resp = await fetch(CORRIDOR_RISK_BASE_URL, {
headers: {
Authorization: `Bearer ${CORRIDOR_RISK_API_KEY}`,
Accept: 'application/json',
'User-Agent': CHROME_UA,
Referer: 'https://corridorrisk.io/dashboard.html',
},
signal: AbortSignal.timeout(10000),
});
@@ -3833,32 +3835,37 @@ async function seedCorridorRisk() {
console.warn(`[CorridorRisk] HTTP ${resp.status}`);
return;
}
-const body = await resp.json();
-const corridors = Array.isArray(body) ? body : body.data;
-if (!corridors?.length) {
+const corridors = await resp.json();
+if (!Array.isArray(corridors) || !corridors.length) {
console.warn('[CorridorRisk] No corridors returned — skipping');
return;
}
const crNameMap = new Map(CORRIDOR_RISK_NAMES.map(c => [c.name.toLowerCase(), c.id]));
const result = {};
for (const corridor of corridors) {
const name = corridor.name;
if (!name) continue;
const id = crNameMap.get(name.toLowerCase());
if (!id) continue;
result[id] = {
riskLevel: String(corridor.risk_level ?? ''),
const name = (corridor.name || '').toLowerCase();
const mapping = CORRIDOR_RISK_NAME_MAP.find(m => name.includes(m.pattern));
if (!mapping) continue;
const score = Number(corridor.score ?? 0);
const riskLevel = score >= 70 ? 'critical' : score >= 50 ? 'high' : score >= 30 ? 'elevated' : 'normal';
result[mapping.id] = {
riskLevel,
riskScore: score,
incidentCount7d: Number(corridor.incident_count_7d ?? 0),
eventCount7d: Number(corridor.event_count_7d ?? 0),
disruptionPct: Number(corridor.disruption_pct ?? 0),
vesselCount: Number(corridor.vessel_count ?? 0),
riskSummary: String(corridor.risk_summary || '').slice(0, 200),
};
}
if (Object.keys(result).length === 0) {
console.warn('[CorridorRisk] No matching corridors — skipping');
return;
}
+latestCorridorRiskData = result;
const ok = await upstashSet(CORRIDOR_RISK_REDIS_KEY, result, CORRIDOR_RISK_TTL);
await upstashSet('seed-meta:supply_chain:corridorrisk', { fetchedAt: Date.now(), recordCount: Object.keys(result).length }, 604800);
console.log(`[CorridorRisk] Seeded ${Object.keys(result).length} corridors (redis: ${ok ? 'OK' : 'FAIL'}) in ${((Date.now() - t0) / 1000).toFixed(1)}s`);
+seedTransitSummaries().catch(e => console.warn('[TransitSummary] Post-CorridorRisk seed error:', e?.message || e));
} catch (e) {
console.warn('[CorridorRisk] Seed error:', e?.message || e);
} finally {
@@ -3871,10 +3878,6 @@ async function startCorridorRiskSeedLoop() {
console.log('[CorridorRisk] Disabled (no Upstash Redis)');
return;
}
-if (!CORRIDOR_RISK_API_KEY) {
-console.log('[CorridorRisk] Disabled (no CORRIDOR_RISK_API_KEY)');
-return;
-}
console.log(`[CorridorRisk] Seed loop starting (interval ${CORRIDOR_RISK_SEED_INTERVAL_MS / 1000 / 60}min)`);
seedCorridorRisk().catch(e => console.warn('[CorridorRisk] Initial seed error:', e?.message || e));
setInterval(() => {
@@ -4711,6 +4714,123 @@ setInterval(() => {
seedChokepointTransits().catch(err => console.error('[Transit] Seed error:', err.message));
}, CHOKEPOINT_TRANSIT_INTERVAL_MS).unref?.();
// --- Pre-assembled Transit Summaries (Railway advantage: avoids large Redis reads on Vercel) ---
const TRANSIT_SUMMARY_REDIS_KEY = 'supply_chain:transit-summaries:v1';
const TRANSIT_SUMMARY_TTL = 900; // 15 min
const TRANSIT_SUMMARY_INTERVAL_MS = 10 * 60 * 1000;
// Threat levels for anomaly detection.
// IMPORTANT: Must stay in sync with CHOKEPOINTS[].threatLevel in
// server/worldmonitor/supply-chain/v1/get-chokepoint-status.ts
// Only war_zone and critical trigger anomaly signals.
const CHOKEPOINT_THREAT_LEVELS = {
suez: 'high', malacca_strait: 'normal', hormuz_strait: 'war_zone',
bab_el_mandeb: 'critical', panama: 'normal', taiwan_strait: 'elevated',
cape_of_good_hope: 'normal', gibraltar: 'normal', bosphorus: 'elevated',
korea_strait: 'normal', dover_strait: 'normal', kerch_strait: 'war_zone',
lombok_strait: 'normal',
};
// ID mapping: relay geofence name -> canonical ID
const RELAY_NAME_TO_ID = {
'Suez Canal': 'suez', 'Malacca Strait': 'malacca_strait',
'Strait of Hormuz': 'hormuz_strait', 'Bab el-Mandeb Strait': 'bab_el_mandeb',
'Panama Canal': 'panama', 'Taiwan Strait': 'taiwan_strait',
'Cape of Good Hope': 'cape_of_good_hope', 'Gibraltar Strait': 'gibraltar',
'Bosporus Strait': 'bosphorus', 'Korea Strait': 'korea_strait',
'Dover Strait': 'dover_strait', 'Kerch Strait': 'kerch_strait',
'Lombok Strait': 'lombok_strait',
'South China Sea': null, 'Black Sea': null, // area geofences, not chokepoints
};
// Duplicated from server/worldmonitor/supply-chain/v1/_scoring.mjs because
// ais-relay.cjs is CJS and cannot import .mjs modules. Keep in sync.
function detectTrafficAnomalyRelay(history, threatLevel) {
if (!history || history.length < 37) return { dropPct: 0, signal: false };
const sorted = [...history].sort((a, b) => b.date.localeCompare(a.date));
let recent7 = 0, baseline30 = 0;
for (let i = 0; i < 7 && i < sorted.length; i++) recent7 += sorted[i].total;
for (let i = 7; i < 37 && i < sorted.length; i++) baseline30 += sorted[i].total;
const baselineAvg7 = (baseline30 / Math.min(30, sorted.length - 7)) * 7;
if (baselineAvg7 < 14) return { dropPct: 0, signal: false };
const dropPct = Math.round(((baselineAvg7 - recent7) / baselineAvg7) * 100);
const isHighThreat = threatLevel === 'war_zone' || threatLevel === 'critical';
return { dropPct, signal: dropPct >= 50 && isHighThreat };
}
async function seedTransitSummaries() {
// Hydrate from Redis on cold start (in-memory state lost after relay restart)
if (!latestPortwatchData) {
const persisted = await upstashGet(PORTWATCH_REDIS_KEY);
if (persisted && typeof persisted === 'object' && Object.keys(persisted).length > 0) {
latestPortwatchData = persisted;
console.log(`[TransitSummary] Hydrated PortWatch from Redis (${Object.keys(persisted).length} chokepoints)`);
}
}
if (!latestCorridorRiskData) {
const persisted = await upstashGet(CORRIDOR_RISK_REDIS_KEY);
if (persisted && typeof persisted === 'object' && Object.keys(persisted).length > 0) {
latestCorridorRiskData = persisted;
console.log(`[TransitSummary] Hydrated CorridorRisk from Redis (${Object.keys(persisted).length} corridors)`);
}
}
const pw = latestPortwatchData;
if (!pw || Object.keys(pw).length === 0) return;
const now = Date.now();
const summaries = {};
for (const [cpId, cpData] of Object.entries(pw)) {
const threatLevel = CHOKEPOINT_THREAT_LEVELS[cpId] || 'normal';
const anomaly = detectTrafficAnomalyRelay(cpData.history, threatLevel);
// Get relay transit counts for this chokepoint
let relayTransit = null;
for (const [relayName, canonicalId] of Object.entries(RELAY_NAME_TO_ID)) {
if (canonicalId === cpId) {
const crossings = chokepointCrossings.get(relayName) || [];
const recent = crossings.filter(c => now - c.ts < TRANSIT_WINDOW_MS);
if (recent.length > 0) {
relayTransit = {
tanker: recent.filter(c => c.type === 'tanker').length,
cargo: recent.filter(c => c.type === 'cargo').length,
other: recent.filter(c => c.type === 'other').length,
total: recent.length,
};
}
break;
}
}
const cr = latestCorridorRiskData?.[cpId];
summaries[cpId] = {
todayTotal: relayTransit?.total ?? 0,
todayTanker: relayTransit?.tanker ?? 0,
todayCargo: relayTransit?.cargo ?? 0,
todayOther: relayTransit?.other ?? 0,
wowChangePct: cpData.wowChangePct ?? 0,
history: cpData.history ?? [],
riskLevel: cr?.riskLevel ?? '',
incidentCount7d: cr?.incidentCount7d ?? 0,
disruptionPct: cr?.disruptionPct ?? 0,
anomaly,
};
}
const ok = await upstashSet(TRANSIT_SUMMARY_REDIS_KEY, { summaries, fetchedAt: now }, TRANSIT_SUMMARY_TTL);
await upstashSet('seed-meta:supply_chain:transit-summaries', { fetchedAt: now, recordCount: Object.keys(summaries).length }, 604800);
console.log(`[TransitSummary] Seeded ${Object.keys(summaries).length} summaries (redis: ${ok ? 'OK' : 'FAIL'})`);
}
// Seed transit summaries every 10 min (same as transit counter)
setTimeout(() => {
seedTransitSummaries().catch(e => console.warn('[TransitSummary] Initial seed error:', e?.message || e));
}, 35_000);
setInterval(() => {
seedTransitSummaries().catch(e => console.warn('[TransitSummary] Seed error:', e?.message || e));
}, TRANSIT_SUMMARY_INTERVAL_MS).unref?.();
// UCDP GED Events cache (persistent in-memory — Railway advantage)
const UCDP_CACHE_TTL_MS = 6 * 60 * 60 * 1000; // 6 hours
const UCDP_RELAY_MAX_PAGES = 12;

View File

@@ -16,15 +16,15 @@ import { cachedFetchJson, getCachedJson } from '../../../_shared/redis';
import { listNavigationalWarnings } from '../../maritime/v1/list-navigational-warnings';
import { getVesselSnapshot } from '../../maritime/v1/get-vessel-snapshot';
import type { PortWatchData } from './_portwatch-upstream';
import type { CorridorRiskData } from './_corridorrisk-upstream';
import { CANONICAL_CHOKEPOINTS } from './_chokepoint-ids';
// @ts-expect-error — .mjs module, no declaration file
import { computeDisruptionScore, scoreToStatus, SEVERITY_SCORE, THREAT_LEVEL, detectTrafficAnomaly } from './_scoring.mjs';
const REDIS_CACHE_KEY = 'supply_chain:chokepoints:v4';
const PORTWATCH_REDIS_KEY = 'supply_chain:portwatch:v1';
const CORRIDORRISK_REDIS_KEY = 'supply_chain:corridorrisk:v1';
const RELAY_TRANSIT_KEY = 'supply_chain:chokepoint_transits:v1';
const TRANSIT_SUMMARIES_KEY = 'supply_chain:transit-summaries:v1';
const PORTWATCH_FALLBACK_KEY = 'supply_chain:portwatch:v1';
const CORRIDORRISK_FALLBACK_KEY = 'supply_chain:corridorrisk:v1';
const TRANSIT_COUNTS_FALLBACK_KEY = 'supply_chain:chokepoint_transits:v1';
const REDIS_CACHE_TTL = 300; // 5 min
const THREAT_CONFIG_MAX_AGE_DAYS = 120;
const NEARBY_CHOKEPOINT_RADIUS_KM = 300;
@@ -67,15 +67,21 @@ interface ChokepointConfig {
type DirectionLabel = 'eastbound' | 'westbound' | 'northbound' | 'southbound';
interface RelayTransitEntry {
tanker: number;
cargo: number;
other: number;
total: number;
interface PreBuiltTransitSummary {
todayTotal: number;
todayTanker: number;
todayCargo: number;
todayOther: number;
wowChangePct: number;
history: { date: string; tanker: number; cargo: number; other: number; total: number }[];
riskLevel: string;
incidentCount7d: number;
disruptionPct: number;
anomaly: { dropPct: number; signal: boolean };
}
interface RelayTransitPayload {
transits: Record<string, RelayTransitEntry>;
interface TransitSummariesPayload {
summaries: Record<string, PreBuiltTransitSummary>;
fetchedAt: number;
}
@@ -231,36 +237,43 @@ interface ChokepointFetchResult {
upstreamUnavailable: boolean;
}
function buildRelayLookup(transitData: RelayTransitPayload | null): Map<string, RelayTransitEntry> {
const map = new Map<string, RelayTransitEntry>();
if (!transitData?.transits) return map;
for (const [relayName, entry] of Object.entries(transitData.transits)) {
const canonical = CANONICAL_CHOKEPOINTS.find(c => c.relayName === relayName);
if (canonical) map.set(canonical.id, entry);
}
return map;
}
interface CorridorRiskEntry { riskLevel: string; incidentCount7d: number; disruptionPct: number }
interface RelayTransitEntry { tanker: number; cargo: number; other: number; total: number }
interface RelayTransitPayload { transits: Record<string, RelayTransitEntry>; fetchedAt: number }
function buildTransitSummary(
cp: ChokepointConfig,
function buildFallbackSummaries(
portwatch: PortWatchData | null,
relayLookup: Map<string, RelayTransitEntry>,
corridorRisk: CorridorRiskData | null,
): import('../../../../src/generated/server/worldmonitor/supply_chain/v1/service_server').TransitSummary {
const relay = relayLookup.get(cp.id);
const pw = portwatch?.[cp.id];
const cr = corridorRisk?.[cp.id];
return {
todayTotal: relay?.total ?? 0,
todayTanker: relay?.tanker ?? 0,
todayCargo: relay?.cargo ?? 0,
todayOther: relay?.other ?? 0,
wowChangePct: pw?.wowChangePct ?? 0,
history: pw?.history ?? [],
riskLevel: cr?.riskLevel ?? '',
incidentCount7d: cr?.incidentCount7d ?? 0,
disruptionPct: cr?.disruptionPct ?? 0,
};
corridorRisk: Record<string, CorridorRiskEntry> | null,
transitData: RelayTransitPayload | null,
chokepoints: ChokepointConfig[],
): Record<string, PreBuiltTransitSummary> {
const summaries: Record<string, PreBuiltTransitSummary> = {};
const relayMap = new Map<string, RelayTransitEntry>();
if (transitData?.transits) {
for (const [relayName, entry] of Object.entries(transitData.transits)) {
const canonical = CANONICAL_CHOKEPOINTS.find(c => c.relayName === relayName);
if (canonical) relayMap.set(canonical.id, entry);
}
}
for (const cp of chokepoints) {
const pw = portwatch?.[cp.id];
const cr = corridorRisk?.[cp.id];
const relay = relayMap.get(cp.id);
const anomaly = detectTrafficAnomaly(pw?.history ?? [], cp.threatLevel);
summaries[cp.id] = {
todayTotal: relay?.total ?? 0,
todayTanker: relay?.tanker ?? 0,
todayCargo: relay?.cargo ?? 0,
todayOther: relay?.other ?? 0,
wowChangePct: pw?.wowChangePct ?? 0,
history: pw?.history ?? [],
riskLevel: cr?.riskLevel ?? '',
incidentCount7d: cr?.incidentCount7d ?? 0,
disruptionPct: cr?.disruptionPct ?? 0,
anomaly,
};
}
return summaries;
}
async function fetchChokepointData(): Promise<ChokepointFetchResult> {
@@ -269,21 +282,31 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {
let navFailed = false;
let vesselFailed = false;
const [navResult, vesselResult, portwatchData, corridorRiskData, transitData] = await Promise.all([
const [navResult, vesselResult, transitSummariesData] = await Promise.all([
listNavigationalWarnings(ctx, { area: '', pageSize: 0, cursor: '' }).catch((): ListNavigationalWarningsResponse => { navFailed = true; return { warnings: [], pagination: undefined }; }),
getVesselSnapshot(ctx, { neLat: 90, neLon: 180, swLat: -90, swLon: -180 }).catch((): GetVesselSnapshotResponse => { vesselFailed = true; return { snapshot: undefined }; }),
getCachedJson(PORTWATCH_REDIS_KEY, true).catch(() => null) as Promise<PortWatchData | null>,
getCachedJson(CORRIDORRISK_REDIS_KEY, true).catch(() => null) as Promise<CorridorRiskData | null>,
getCachedJson(RELAY_TRANSIT_KEY, true).catch(() => null) as Promise<RelayTransitPayload | null>,
getCachedJson(TRANSIT_SUMMARIES_KEY, true).catch(() => null) as Promise<TransitSummariesPayload | null>,
]);
let summaries = transitSummariesData?.summaries ?? {};
// Fallback: if pre-built summaries are empty, read raw upstream keys directly
if (Object.keys(summaries).length === 0) {
const [portwatch, corridorRisk, transitCounts] = await Promise.all([
getCachedJson(PORTWATCH_FALLBACK_KEY, true).catch(() => null) as Promise<PortWatchData | null>,
getCachedJson(CORRIDORRISK_FALLBACK_KEY, true).catch(() => null) as Promise<Record<string, CorridorRiskEntry> | null>,
getCachedJson(TRANSIT_COUNTS_FALLBACK_KEY, true).catch(() => null) as Promise<RelayTransitPayload | null>,
]);
if (portwatch && Object.keys(portwatch).length > 0) {
summaries = buildFallbackSummaries(portwatch, corridorRisk, transitCounts, CHOKEPOINTS);
}
}
const warnings = navResult.warnings || [];
const disruptions: AisDisruption[] = vesselResult.snapshot?.disruptions || [];
const upstreamUnavailable = (navFailed && vesselFailed) || (navFailed && disruptions.length === 0) || (vesselFailed && warnings.length === 0);
const warningsByChokepoint = groupWarningsByChokepoint(warnings);
const disruptionsByChokepoint = groupDisruptionsByChokepoint(disruptions);
const threatConfigFresh = isThreatConfigFresh();
const relayLookup = buildRelayLookup(transitData);
const chokepoints = CHOKEPOINTS.map((cp): ChokepointInfo => {
const matchedWarnings = warningsByChokepoint.get(cp.id) ?? [];
@@ -295,8 +318,8 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {
}, 0);
const threatScore = (THREAT_LEVEL as Record<string, number>)[cp.threatLevel] ?? 0;
const pw = portwatchData?.[cp.id];
const anomaly = detectTrafficAnomaly(pw?.history ?? [], cp.threatLevel);
const ts = summaries[cp.id];
const anomaly = ts?.anomaly ?? { dropPct: 0, signal: false };
const anomalyBonus = anomaly.signal ? 10 : 0;
const disruptionScore = Math.min(100, computeDisruptionScore(threatScore, matchedWarnings.length, maxSeverity) + anomalyBonus);
const status = scoreToStatus(disruptionScore);
@@ -308,7 +331,7 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {
descriptions.push(cp.threatDescription);
}
if (anomaly.signal) {
-descriptions.push(`Traffic down ${anomaly.dropPct}% vs 30-day baseline vessels may be transiting dark (AIS off)`);
+descriptions.push(`Traffic down ${anomaly.dropPct}% vs 30-day baseline, vessels may be transiting dark (AIS off)`);
}
if (!threatConfigFresh) {
descriptions.push(THREAT_CONFIG_STALE_NOTE);
@@ -334,7 +357,17 @@ async function fetchChokepointData(): Promise<ChokepointFetchResult> {
description: descriptions.join('; '),
directions: cp.directions,
directionalDwt: [],
transitSummary: buildTransitSummary(cp, portwatchData, relayLookup, corridorRiskData),
transitSummary: ts ? {
todayTotal: ts.todayTotal,
todayTanker: ts.todayTanker,
todayCargo: ts.todayCargo,
todayOther: ts.todayOther,
wowChangePct: ts.wowChangePct,
history: ts.history,
riskLevel: ts.riskLevel,
incidentCount7d: ts.incidentCount7d,
disruptionPct: ts.disruptionPct,
} : { todayTotal: 0, todayTanker: 0, todayCargo: 0, todayOther: 0, wowChangePct: 0, history: [], riskLevel: '', incidentCount7d: 0, disruptionPct: 0 },
};
});

View File

@@ -8502,9 +8502,15 @@ a.prediction-link:hover {
/* economic-tabs: now uses shared .panel-tabs / .panel-tab */
+.panel-content:has(.economic-content) {
+display: flex;
+flex-direction: column;
+}
.economic-content {
padding: 8px;
-max-height: 300px;
+flex: 1 1 0;
+min-height: 0;
overflow-y: auto;
}

View File

@@ -69,6 +69,36 @@ describe('portwatchNameToId', () => {
});
});
import { readFileSync } from 'node:fs';
const relaySrc = readFileSync('scripts/ais-relay.cjs', 'utf8');
const handlerSrc = readFileSync('server/worldmonitor/supply-chain/v1/get-chokepoint-status.ts', 'utf8');
describe('relay CHOKEPOINT_THREAT_LEVELS sync', () => {
it('relay has a threat level entry for every canonical chokepoint', () => {
for (const cp of CANONICAL_CHOKEPOINTS) {
assert.match(relaySrc, new RegExp(`${cp.id}:\\s*'`), `Missing relay threat level for ${cp.id}`);
}
});
it('relay threat levels match handler CHOKEPOINTS config', () => {
const relayBlock = relaySrc.match(/CHOKEPOINT_THREAT_LEVELS\s*=\s*\{([^}]+)\}/)?.[1] || '';
for (const cp of CANONICAL_CHOKEPOINTS) {
const relayMatch = relayBlock.match(new RegExp(`${cp.id}:\\s*'(\\w+)'`));
const handlerMatch = handlerSrc.match(new RegExp(`id:\\s*'${cp.id}'[^}]*threatLevel:\\s*'(\\w+)'`));
if (relayMatch && handlerMatch) {
assert.equal(relayMatch[1], handlerMatch[1], `Threat level mismatch for ${cp.id}: relay=${relayMatch[1]} handler=${handlerMatch[1]}`);
}
}
});
it('relay RELAY_NAME_TO_ID covers all canonical chokepoints', () => {
for (const cp of CANONICAL_CHOKEPOINTS) {
assert.match(relaySrc, new RegExp(`'${cp.relayName}':\\s*'${cp.id}'`), `Missing relay name mapping for ${cp.relayName} -> ${cp.id}`);
}
});
});
describe('corridorRiskNameToId', () => {
it('maps "Hormuz" to hormuz_strait', () => {
assert.equal(corridorRiskNameToId('Hormuz'), 'hormuz_strait');

View File

@@ -0,0 +1,537 @@
import { describe, it } from 'node:test';
import assert from 'node:assert/strict';
import { readFileSync } from 'node:fs';
import { dirname, resolve } from 'node:path';
import { fileURLToPath } from 'node:url';
import { detectTrafficAnomaly } from '../server/worldmonitor/supply-chain/v1/_scoring.mjs';
import {
CANONICAL_CHOKEPOINTS,
corridorRiskNameToId,
} from '../server/worldmonitor/supply-chain/v1/_chokepoint-ids.ts';
const __dirname = dirname(fileURLToPath(import.meta.url));
const root = resolve(__dirname, '..');
const relaySrc = readFileSync(resolve(root, 'scripts/ais-relay.cjs'), 'utf-8');
const handlerSrc = readFileSync(resolve(root, 'server/worldmonitor/supply-chain/v1/get-chokepoint-status.ts'), 'utf-8');
function makeDays(count, dailyTotal, startOffset) {
const days = [];
for (let i = 0; i < count; i++) {
const d = new Date(Date.now() - (startOffset + i) * 86400000);
days.push({
date: d.toISOString().slice(0, 10),
tanker: 0,
cargo: dailyTotal,
other: 0,
total: dailyTotal,
});
}
return days;
}
// ---------------------------------------------------------------------------
// 1. seedTransitSummaries relay source analysis
// ---------------------------------------------------------------------------
describe('seedTransitSummaries (relay)', () => {
it('defines seedTransitSummaries function', () => {
assert.match(relaySrc, /async function seedTransitSummaries\(\)/);
});
it('writes to supply_chain:transit-summaries:v1 Redis key', () => {
assert.match(relaySrc, /supply_chain:transit-summaries:v1/);
});
it('writes seed-meta for transit-summaries', () => {
assert.match(relaySrc, /seed-meta:supply_chain:transit-summaries/);
});
it('summary object includes all required fields', () => {
assert.match(relaySrc, /todayTotal:/);
assert.match(relaySrc, /todayTanker:/);
assert.match(relaySrc, /todayCargo:/);
assert.match(relaySrc, /todayOther:/);
assert.match(relaySrc, /wowChangePct:/);
assert.match(relaySrc, /history:/);
assert.match(relaySrc, /riskLevel:/);
assert.match(relaySrc, /incidentCount7d:/);
assert.match(relaySrc, /disruptionPct:/);
assert.match(relaySrc, /anomaly/);
});
it('reads latestCorridorRiskData for riskLevel/incidentCount7d/disruptionPct', () => {
assert.match(relaySrc, /latestCorridorRiskData\?\.\[cpId\]/);
assert.match(relaySrc, /cr\?\.riskLevel/);
assert.match(relaySrc, /cr\?\.incidentCount7d/);
assert.match(relaySrc, /cr\?\.disruptionPct/);
});
it('reads latestPortwatchData for history and wowChangePct', () => {
assert.match(relaySrc, /latestPortwatchData/);
assert.match(relaySrc, /cpData\.history/);
assert.match(relaySrc, /cpData\.wowChangePct/);
});
it('calls detectTrafficAnomalyRelay with history and threat level', () => {
assert.match(relaySrc, /detectTrafficAnomalyRelay\(cpData\.history,\s*threatLevel\)/);
});
it('wraps summaries in { summaries, fetchedAt } envelope', () => {
assert.match(relaySrc, /\{\s*summaries,\s*fetchedAt:\s*now\s*\}/);
});
it('is triggered after PortWatch seed completes', () => {
const portWatchBlock = relaySrc.match(/\[PortWatch\] Seeded[\s\S]{0,200}seedTransitSummaries/);
assert.ok(portWatchBlock, 'seedTransitSummaries should be called after PortWatch seed');
});
it('is triggered after CorridorRisk seed completes', () => {
const corridorBlock = relaySrc.match(/\[CorridorRisk\] Seeded[\s\S]{0,200}seedTransitSummaries/);
assert.ok(corridorBlock, 'seedTransitSummaries should be called after CorridorRisk seed');
});
it('runs on 10 minute interval', () => {
assert.match(relaySrc, /TRANSIT_SUMMARY_INTERVAL_MS\s*=\s*10\s*\*\s*60\s*\*\s*1000/);
});
it('has 15 minute TTL', () => {
assert.match(relaySrc, /TRANSIT_SUMMARY_TTL\s*=\s*900/);
});
});
// ---------------------------------------------------------------------------
// 2. CORRIDOR_RISK_NAME_MAP and seedCorridorRisk
// ---------------------------------------------------------------------------
describe('CORRIDOR_RISK_NAME_MAP (relay)', () => {
it('defines CORRIDOR_RISK_NAME_MAP array', () => {
assert.match(relaySrc, /const CORRIDOR_RISK_NAME_MAP\s*=\s*\[/);
});
it('maps hormuz to hormuz_strait', () => {
assert.match(relaySrc, /pattern:\s*'hormuz'.*id:\s*'hormuz_strait'/);
});
it('maps bab-el-mandeb to bab_el_mandeb', () => {
assert.match(relaySrc, /pattern:\s*'bab-el-mandeb'.*id:\s*'bab_el_mandeb'/);
});
it('maps red sea to bab_el_mandeb', () => {
assert.match(relaySrc, /pattern:\s*'red sea'.*id:\s*'bab_el_mandeb'/);
});
it('maps suez to suez', () => {
assert.match(relaySrc, /pattern:\s*'suez'.*id:\s*'suez'/);
});
it('maps south china sea to taiwan_strait', () => {
assert.match(relaySrc, /pattern:\s*'south china sea'.*id:\s*'taiwan_strait'/);
});
it('maps black sea to bosphorus', () => {
assert.match(relaySrc, /pattern:\s*'black sea'.*id:\s*'bosphorus'/);
});
it('has exactly 6 mapping entries', () => {
const mapBlock = relaySrc.match(/CORRIDOR_RISK_NAME_MAP\s*=\s*\[([\s\S]*?)\];/);
assert.ok(mapBlock, 'CORRIDOR_RISK_NAME_MAP block not found');
const patterns = [...mapBlock[1].matchAll(/pattern:\s*'/g)];
assert.equal(patterns.length, 6);
});
});
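The entries asserted above are substring patterns, not exact keys. A minimal sketch of how a relay-side resolver might apply them — the `resolveCorridorId` helper and the first-substring-match rule are assumptions for illustration, not the relay's actual code:

```javascript
// Hypothetical resolver: the first pattern that appears as a substring of
// the lowercased corridor name wins. Entries mirror the tests above.
const CORRIDOR_RISK_NAME_MAP = [
  { pattern: 'hormuz', id: 'hormuz_strait' },
  { pattern: 'bab-el-mandeb', id: 'bab_el_mandeb' },
  { pattern: 'red sea', id: 'bab_el_mandeb' },
  { pattern: 'suez', id: 'suez' },
  { pattern: 'south china sea', id: 'taiwan_strait' },
  { pattern: 'black sea', id: 'bosphorus' },
];

function resolveCorridorId(name) {
  const lower = String(name || '').toLowerCase();
  const entry = CORRIDOR_RISK_NAME_MAP.find(e => lower.includes(e.pattern));
  return entry ? entry.id : null;
}
```

Substring matching is why 'red sea' and 'bab-el-mandeb' can share a target ID: any upstream corridor label containing either phrase folds into the same canonical chokepoint.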
describe('seedCorridorRisk risk level derivation', () => {
// Extract the risk-level derivation logic from relay source to test boundaries
const riskLevelLine = relaySrc.match(/const riskLevel = score >= 70 \? 'critical' : score >= 50 \? 'high' : score >= 30 \? 'elevated' : 'normal'/);
assert.ok(riskLevelLine, 'risk level derivation logic not found in relay');
// Re-implement for direct boundary testing
function deriveRiskLevel(score) {
return score >= 70 ? 'critical' : score >= 50 ? 'high' : score >= 30 ? 'elevated' : 'normal';
}
it('score >= 70 is critical', () => {
assert.equal(deriveRiskLevel(70), 'critical');
assert.equal(deriveRiskLevel(100), 'critical');
});
it('score 50-69 is high', () => {
assert.equal(deriveRiskLevel(50), 'high');
assert.equal(deriveRiskLevel(69), 'high');
});
it('score 30-49 is elevated', () => {
assert.equal(deriveRiskLevel(30), 'elevated');
assert.equal(deriveRiskLevel(49), 'elevated');
});
it('score < 30 is normal', () => {
assert.equal(deriveRiskLevel(0), 'normal');
assert.equal(deriveRiskLevel(29), 'normal');
});
it('boundary: score 69 is high (not critical)', () => {
assert.equal(deriveRiskLevel(69), 'high');
});
it('boundary: score 49 is elevated (not high)', () => {
assert.equal(deriveRiskLevel(49), 'elevated');
});
it('boundary: score 29 is normal (not elevated)', () => {
assert.equal(deriveRiskLevel(29), 'normal');
});
});
describe('seedCorridorRisk output fields', () => {
it('writes riskLevel to result', () => {
assert.match(relaySrc, /riskLevel,/);
});
it('writes riskScore', () => {
assert.match(relaySrc, /riskScore:\s*score/);
});
it('writes incidentCount7d from incident_count_7d', () => {
assert.match(relaySrc, /incidentCount7d:\s*Number\(corridor\.incident_count_7d/);
});
it('writes disruptionPct from disruption_pct', () => {
assert.match(relaySrc, /disruptionPct:\s*Number\(corridor\.disruption_pct/);
});
it('writes eventCount7d from event_count_7d', () => {
assert.match(relaySrc, /eventCount7d:\s*Number\(corridor\.event_count_7d/);
});
it('writes vesselCount from vessel_count', () => {
assert.match(relaySrc, /vesselCount:\s*Number\(corridor\.vessel_count/);
});
it('truncates riskSummary to 200 chars', () => {
assert.match(relaySrc, /\.slice\(0,\s*200\)/);
});
it('stores result in latestCorridorRiskData for transit summary assembly', () => {
assert.match(relaySrc, /latestCorridorRiskData\s*=\s*result/);
});
it('writes to corridor risk Redis key', () => {
assert.match(relaySrc, /supply_chain:corridorrisk/);
});
it('writes seed-meta for corridor risk', () => {
assert.match(relaySrc, /seed-meta:supply_chain:corridorrisk/);
});
});
// ---------------------------------------------------------------------------
// 3. Vercel handler consuming pre-built summaries
// ---------------------------------------------------------------------------
describe('get-chokepoint-status handler (source analysis)', () => {
it('defines TRANSIT_SUMMARIES_KEY pointing to transit-summaries:v1', () => {
assert.match(handlerSrc, /TRANSIT_SUMMARIES_KEY\s*=\s*'supply_chain:transit-summaries:v1'/);
});
it('reads transit summaries via getCachedJson', () => {
assert.match(handlerSrc, /getCachedJson\(TRANSIT_SUMMARIES_KEY/);
});
it('imports PortWatchData for fallback assembly', () => {
assert.match(handlerSrc, /import.*PortWatchData/);
});
it('does NOT import CorridorRiskData (uses local interface)', () => {
assert.doesNotMatch(handlerSrc, /import.*CorridorRiskData/);
});
it('imports CANONICAL_CHOKEPOINTS for fallback relay-name mapping', () => {
assert.match(handlerSrc, /import.*CANONICAL_CHOKEPOINTS/);
});
it('does NOT import portwatchNameToId or corridorRiskNameToId', () => {
assert.doesNotMatch(handlerSrc, /import.*portwatchNameToId/);
assert.doesNotMatch(handlerSrc, /import.*corridorRiskNameToId/);
});
it('defines PreBuiltTransitSummary interface with all required fields', () => {
assert.match(handlerSrc, /interface PreBuiltTransitSummary/);
assert.match(handlerSrc, /todayTotal:\s*number/);
assert.match(handlerSrc, /todayTanker:\s*number/);
assert.match(handlerSrc, /todayCargo:\s*number/);
assert.match(handlerSrc, /todayOther:\s*number/);
assert.match(handlerSrc, /wowChangePct:\s*number/);
assert.match(handlerSrc, /riskLevel:\s*string/);
assert.match(handlerSrc, /incidentCount7d:\s*number/);
assert.match(handlerSrc, /disruptionPct:\s*number/);
assert.match(handlerSrc, /anomaly:\s*\{\s*dropPct:\s*number;\s*signal:\s*boolean\s*\}/);
});
it('defines TransitSummariesPayload with summaries record and fetchedAt', () => {
assert.match(handlerSrc, /interface TransitSummariesPayload/);
assert.match(handlerSrc, /summaries:\s*Record<string,\s*PreBuiltTransitSummary>/);
assert.match(handlerSrc, /fetchedAt:\s*number/);
});
it('maps transit summary data into ChokepointInfo.transitSummary', () => {
assert.match(handlerSrc, /transitSummary:\s*ts\s*\?/);
});
it('provides zero-value fallback when no transit summary exists', () => {
assert.match(handlerSrc, /todayTotal:\s*0,\s*todayTanker:\s*0/);
});
it('uses anomaly.signal for bonus scoring', () => {
assert.match(handlerSrc, /anomalyBonus\s*=\s*anomaly\.signal\s*\?\s*10\s*:\s*0/);
});
it('includes anomaly drop description when signal is true', () => {
assert.match(handlerSrc, /Traffic down.*dropPct.*baseline/);
});
});
// ---------------------------------------------------------------------------
// 4. CORRIDOR_RISK_NAME_MAP alignment with _chokepoint-ids
// ---------------------------------------------------------------------------
describe('corridor risk name map alignment with canonical IDs', () => {
const mapBlock = relaySrc.match(/CORRIDOR_RISK_NAME_MAP\s*=\s*\[([\s\S]*?)\];/);
assert.ok(mapBlock, 'CORRIDOR_RISK_NAME_MAP block not found in relay');
const entries = [...mapBlock[1].matchAll(/\{\s*pattern:\s*'([^']+)',\s*id:\s*'([^']+)'\s*\}/g)];
it('all mapped IDs are valid canonical chokepoint IDs', () => {
const canonicalIds = new Set(CANONICAL_CHOKEPOINTS.map(c => c.id));
for (const [, , id] of entries) {
assert.ok(canonicalIds.has(id), `${id} is not a canonical chokepoint ID`);
}
});
it('corridorRiskNameToId covers chokepoints with non-null corridorRiskName', () => {
const withCr = CANONICAL_CHOKEPOINTS.filter(c => c.corridorRiskName !== null);
for (const cp of withCr) {
assert.equal(corridorRiskNameToId(cp.corridorRiskName), cp.id,
`corridorRiskNameToId('${cp.corridorRiskName}') should return '${cp.id}'`);
}
});
});
// ---------------------------------------------------------------------------
// 5. detectTrafficAnomalyRelay sync with _scoring.mjs version
// ---------------------------------------------------------------------------
describe('detectTrafficAnomalyRelay sync with _scoring.mjs', () => {
// Extract the relay copy of detectTrafficAnomalyRelay
const fnMatch = relaySrc.match(/function detectTrafficAnomalyRelay\(history, threatLevel\)\s*\{([\s\S]*?)\n\}/);
assert.ok(fnMatch, 'detectTrafficAnomalyRelay not found in relay source');
const relayFn = new Function('history', 'threatLevel', fnMatch[1]);
it('matches _scoring.mjs for war_zone with large drop', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(30, 100, 7)];
const scoringResult = detectTrafficAnomaly(history, 'war_zone');
const relayResult = relayFn(history, 'war_zone');
assert.deepEqual(relayResult, scoringResult);
});
it('matches _scoring.mjs for normal threat level', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(30, 100, 7)];
const scoringResult = detectTrafficAnomaly(history, 'normal');
const relayResult = relayFn(history, 'normal');
assert.deepEqual(relayResult, scoringResult);
});
it('matches _scoring.mjs for insufficient history', () => {
const history = makeDays(20, 100, 0);
const scoringResult = detectTrafficAnomaly(history, 'war_zone');
const relayResult = relayFn(history, 'war_zone');
assert.deepEqual(relayResult, scoringResult);
});
it('matches _scoring.mjs for low baseline', () => {
const history = [...makeDays(7, 0, 0), ...makeDays(30, 1, 7)];
const scoringResult = detectTrafficAnomaly(history, 'war_zone');
const relayResult = relayFn(history, 'war_zone');
assert.deepEqual(relayResult, scoringResult);
});
it('matches _scoring.mjs for critical threat level', () => {
const history = [...makeDays(7, 10, 0), ...makeDays(30, 100, 7)];
const scoringResult = detectTrafficAnomaly(history, 'critical');
const relayResult = relayFn(history, 'critical');
assert.deepEqual(relayResult, scoringResult);
});
});
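The `new Function` trick above is the general pattern for sync-testing a function duplicated into a CJS file that cannot be imported directly: extract the body with a regex, rebuild a callable, and compare outputs against the canonical copy. A standalone illustration (the sample source string is invented for the demo):

```javascript
// Extract a named function's body from source text and rebuild it with
// new Function so the copy can be exercised on real inputs.
const sampleSrc = `
function double(x) {
  return x * 2;
}
`;
// Lazy capture up to the first newline-brace, matching the style used
// for detectTrafficAnomalyRelay above.
const bodyMatch = sampleSrc.match(/function double\(x\)\s*\{([\s\S]*?)\n\}/);
if (!bodyMatch) throw new Error('double not found in source');
const rebuiltDouble = new Function('x', bodyMatch[1]);
```

The limitation, worth keeping in mind for the relay tests: the rebuilt function loses its original closure, so this only works for pure functions whose behavior depends solely on their parameters.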
// ---------------------------------------------------------------------------
// 6. detectTrafficAnomaly (_scoring.mjs) edge cases
// ---------------------------------------------------------------------------
describe('detectTrafficAnomaly edge cases (_scoring.mjs)', () => {
it('null history returns no signal', () => {
const result = detectTrafficAnomaly(null, 'war_zone');
assert.deepEqual(result, { dropPct: 0, signal: false });
});
it('empty array returns no signal', () => {
const result = detectTrafficAnomaly([], 'war_zone');
assert.deepEqual(result, { dropPct: 0, signal: false });
});
it('exactly 37 days is sufficient', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(30, 100, 7)];
assert.equal(history.length, 37);
const result = detectTrafficAnomaly(history, 'war_zone');
assert.ok(result.signal, 'should detect anomaly with exactly 37 days');
assert.ok(result.dropPct >= 90);
});
it('36 days is insufficient', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(29, 100, 7)];
assert.equal(history.length, 36);
const result = detectTrafficAnomaly(history, 'war_zone');
assert.equal(result.signal, false);
assert.equal(result.dropPct, 0);
});
it('equal traffic recent vs baseline yields dropPct 0, no signal', () => {
const history = [...makeDays(7, 100, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.equal(result.dropPct, 0);
assert.equal(result.signal, false);
});
it('increased traffic yields negative dropPct, no signal', () => {
const history = [...makeDays(7, 200, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.ok(result.dropPct < 0, `expected negative dropPct, got ${result.dropPct}`);
assert.equal(result.signal, false);
});
it('exactly 50% drop in war_zone triggers signal', () => {
const history = [...makeDays(7, 50, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.equal(result.dropPct, 50);
assert.equal(result.signal, true);
});
it('49% drop in war_zone does NOT trigger signal', () => {
const history = [...makeDays(7, 51, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.ok(result.dropPct < 50);
assert.equal(result.signal, false);
});
it('elevated threat level does not trigger signal even with large drop', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'elevated');
assert.equal(result.signal, false);
assert.ok(result.dropPct >= 90);
});
it('high threat level does not trigger signal even with large drop', () => {
const history = [...makeDays(7, 5, 0), ...makeDays(30, 100, 7)];
const result = detectTrafficAnomaly(history, 'high');
assert.equal(result.signal, false);
});
it('unsorted history is handled correctly (sorted internally)', () => {
const history = [...makeDays(30, 100, 7), ...makeDays(7, 5, 0)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.ok(result.signal);
assert.ok(result.dropPct >= 90);
});
it('baseline < 2 vessels/day avg (< 14 total over 7 days) returns no signal', () => {
// baseline30 of 1/day -> baselineAvg7 = (30*1/30)*7 = 7 < 14
const history = [...makeDays(7, 0, 0), ...makeDays(30, 1, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.equal(result.signal, false);
assert.equal(result.dropPct, 0);
});
it('baseline of exactly 2 vessels/day (14/week) is accepted', () => {
const history = [...makeDays(7, 0, 0), ...makeDays(30, 2, 7)];
const result = detectTrafficAnomaly(history, 'war_zone');
assert.ok(result.dropPct > 0, 'should compute dropPct when baseline is 14/week');
});
});
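Taken together, the edge cases above pin down the algorithm's observable contract. A reference sketch consistent with these assertions — this is a reconstruction from the tests, not the `_scoring.mjs` source, and the `{ date, count }` record shape is an assumption about what `makeDays` produces:

```javascript
// Reconstructed contract: compare the most recent 7 days of transit counts
// against a 7-day average derived from the prior 30 days. Only 'war_zone'
// and 'critical' threat levels can raise the signal, and thin baselines
// (under 2 vessels/day, i.e. < 14 per week) never signal.
function detectTrafficAnomalySketch(history, threatLevel) {
  if (!Array.isArray(history) || history.length < 37) {
    return { dropPct: 0, signal: false };
  }
  const sorted = [...history].sort((a, b) => (a.date < b.date ? -1 : 1));
  const recent7 = sorted.slice(-7).reduce((s, d) => s + d.count, 0);
  const baseline30 = sorted.slice(-37, -7).reduce((s, d) => s + d.count, 0);
  const baselineAvg7 = (baseline30 / 30) * 7;
  if (baselineAvg7 < 14) return { dropPct: 0, signal: false };
  const dropPct = Math.round((1 - recent7 / baselineAvg7) * 100);
  const armed = threatLevel === 'war_zone' || threatLevel === 'critical';
  return { dropPct, signal: armed && dropPct >= 50 };
}
```

If the real implementation rounds differently or sorts by another field, the boundary tests above (49 vs 50, 36 vs 37 days) are exactly the ones that would expose the drift.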
// ---------------------------------------------------------------------------
// 7. CHOKEPOINT_THREAT_LEVELS sync between relay and handler
// ---------------------------------------------------------------------------
describe('CHOKEPOINT_THREAT_LEVELS relay-handler sync', () => {
const relayBlock = relaySrc.match(/CHOKEPOINT_THREAT_LEVELS\s*=\s*\{([^}]+)\}/)?.[1] || '';
it('relay defines threat levels for all 13 canonical chokepoints', () => {
for (const cp of CANONICAL_CHOKEPOINTS) {
assert.match(relayBlock, new RegExp(`${cp.id}:\\s*'`),
`Missing threat level for ${cp.id} in relay`);
}
});
it('relay threat levels match handler CHOKEPOINTS config', () => {
for (const cp of CANONICAL_CHOKEPOINTS) {
const relayMatch = relayBlock.match(new RegExp(`${cp.id}:\\s*'(\\w+)'`));
const handlerMatch = handlerSrc.match(new RegExp(`id:\\s*'${cp.id}'[^}]*threatLevel:\\s*'(\\w+)'`));
// Compare only when both sides define a level; relay-side coverage of
// all 13 chokepoints is asserted by the previous test.
if (relayMatch && handlerMatch) {
assert.equal(relayMatch[1], handlerMatch[1],
`Threat level mismatch for ${cp.id}: relay=${relayMatch[1]} handler=${handlerMatch[1]}`);
}
}
});
});
// ---------------------------------------------------------------------------
// 8. Handler reads pre-built summaries first, falls back to raw keys
// ---------------------------------------------------------------------------
describe('handler transit data strategy', () => {
it('reads TRANSIT_SUMMARIES_KEY as primary source', () => {
assert.match(handlerSrc, /TRANSIT_SUMMARIES_KEY/);
});
it('has fallback keys for portwatch, corridorrisk, and transit counts', () => {
assert.match(handlerSrc, /PORTWATCH_FALLBACK_KEY/);
assert.match(handlerSrc, /CORRIDORRISK_FALLBACK_KEY/);
assert.match(handlerSrc, /TRANSIT_COUNTS_FALLBACK_KEY/);
});
it('fallback triggers only when pre-built summaries are empty', () => {
assert.match(handlerSrc, /Object\.keys\(summaries\)\.length === 0/);
});
it('fallback builds summaries with detectTrafficAnomaly', () => {
assert.match(handlerSrc, /buildFallbackSummaries/);
assert.match(handlerSrc, /detectTrafficAnomaly/);
});
it('does NOT call getPortWatchTransits or fetchCorridorRisk (no upstream fetch)', () => {
assert.doesNotMatch(handlerSrc, /getPortWatchTransits/);
assert.doesNotMatch(handlerSrc, /fetchCorridorRisk/);
});
});
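A sketch of the read strategy these tests describe: read the pre-built key first, and fall back to raw-key assembly only when it comes back empty. The key name comes from the tests; the `getCachedJson` call shape and the injected `buildFallbackSummaries` are assumptions for illustration:

```javascript
// Hedged sketch of the handler's transit-data strategy: one small
// pre-built Redis key on the happy path, raw-key assembly only as a
// fallback when that key is missing or empty.
const TRANSIT_SUMMARIES_KEY = 'supply_chain:transit-summaries:v1';

async function loadTransitSummaries(getCachedJson, buildFallbackSummaries) {
  const payload = await getCachedJson(TRANSIT_SUMMARIES_KEY);
  const summaries = payload?.summaries ?? {};
  if (Object.keys(summaries).length === 0) {
    // Pre-built key absent or empty: assemble from the fallback keys.
    return buildFallbackSummaries();
  }
  return summaries;
}
```

This is what keeps the Edge handler within the Redis timeout: the pre-built key is a fraction of the ~149KB raw PortWatch payload, and the fallback path only runs when Railway has not yet seeded it.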
describe('seedTransitSummaries cold-start hydration', () => {
it('reads PortWatch from Redis when latestPortwatchData is null', () => {
assert.match(relaySrc, /if\s*\(\s*!latestPortwatchData\s*\)/);
assert.match(relaySrc, /upstashGet\(PORTWATCH_REDIS_KEY\)/);
assert.match(relaySrc, /Hydrated PortWatch from Redis/);
});
it('reads CorridorRisk from Redis when latestCorridorRiskData is null', () => {
assert.match(relaySrc, /if\s*\(\s*!latestCorridorRiskData\s*\)/);
assert.match(relaySrc, /upstashGet\(CORRIDOR_RISK_REDIS_KEY\)/);
assert.match(relaySrc, /Hydrated CorridorRisk from Redis/);
});
it('hydration happens BEFORE the empty-check early return', () => {
const fnBody = relaySrc.match(/async function seedTransitSummaries\(\)\s*\{([\s\S]*?)\n\}/)?.[1] || '';
const hydratePos = fnBody.indexOf('upstashGet(PORTWATCH_REDIS_KEY)');
const earlyReturnPos = fnBody.indexOf('if (!pw || Object.keys(pw).length === 0) return');
assert.ok(hydratePos > 0, 'hydration code not found');
assert.ok(earlyReturnPos > 0, 'early return not found');
assert.ok(hydratePos < earlyReturnPos, 'hydration must happen BEFORE the empty-data early return');
});
it('assigns hydrated data back to latestPortwatchData', () => {
const fnBody = relaySrc.match(/async function seedTransitSummaries\(\)\s*\{([\s\S]*?)\n\}/)?.[1] || '';
assert.match(fnBody, /latestPortwatchData\s*=\s*persisted/);
});
it('assigns hydrated data back to latestCorridorRiskData', () => {
const fnBody = relaySrc.match(/async function seedTransitSummaries\(\)\s*\{([\s\S]*?)\n\}/)?.[1] || '';
assert.match(fnBody, /latestCorridorRiskData\s*=\s*persisted/);
});
});
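The ordering requirement tested above can be sketched as follows — the state object, `upstashGet` call shape, and return value are assumptions; only the hydrate-before-early-return ordering is what the tests actually pin down:

```javascript
// Hedged sketch of the cold-start path: hydrate module state from Redis
// BEFORE the empty-data early return, so a freshly restarted relay process
// can rebuild summaries from previously seeded keys instead of bailing out.
async function hydrateThenBuild(state, upstashGet, portwatchKey) {
  if (!state.latestPortwatchData) {
    const persisted = await upstashGet(portwatchKey); // hydration first
    if (persisted) state.latestPortwatchData = persisted;
  }
  const pw = state.latestPortwatchData;
  if (!pw || Object.keys(pw).length === 0) return null; // early return second
  return { summaries: pw, fetchedAt: Date.now() };
}
```

If the two steps were swapped, a restarted process with empty in-memory state would return early on every cycle and the pre-built summaries key would never be written, which is exactly the regression the positional `indexOf` test guards against.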