mirror of https://github.com/koala73/worldmonitor.git (synced 2026-04-25 17:14:57 +02:00)
* feat(seed-contract): PR 2a — runSeed envelope dual-write + 91 seeders migrated
Opt-in contract path in runSeed: when opts.declareRecords is provided, write
{_seed, data} envelope to the canonical key alongside legacy seed-meta:*
(dual-write). State machine: OK / OK_ZERO / RETRY with zeroIsValid opt.
declareRecords throws or returns non-integer → hard fail (contract violation).
extraKeys[*] support per-key declareRecords; each extra key writes its own
envelope. Legacy seeders (no declareRecords) entirely unchanged.
Migrated all 91 scripts/seed-*.mjs to contract mode. Each exports
declareRecords returning the canonical record count, and passes
schemaVersion: 1 + maxStaleMin (matched to api/health.js SEED_META, or 2.5x
interval where no registry entry exists). Contract conformance reports 84/86
seeders with full descriptor (2 pre-existing warnings).
Legacy seed-meta keys still written so unmigrated readers keep working;
follow-up slices flip health.js + readers to envelope-first.
Tests: 61/61 PR 1 tests still pass.
Next slices for PR 2:
- api/health.js registry collapse + 15 seed-bundle-*.mjs canonicalKey wiring
- reader migration (mcp, resilience, aviation, displacement, regional-snapshot)
- direct writers — ais-relay.cjs, consumer-prices-core publish.ts
- public-boundary stripSeedEnvelope + test migration
Plan: docs/plans/2026-04-14-002-fix-runseed-zero-record-lockout-plan.md
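The OK / OK_ZERO / RETRY decision above reads more clearly as code. A minimal sketch under assumed names (resolvePublishState is hypothetical; the real logic lives inside runSeed in scripts/_seed-utils.mjs and has more branches):

```javascript
// Hypothetical sketch of the PR 2a publish decision: declareRecords drives
// the state, and a thrown error or non-integer count is a hard contract violation.
function resolvePublishState(declareRecords, data, { zeroIsValid = false } = {}) {
  const count = declareRecords(data); // a throw here propagates: hard fail
  if (!Number.isInteger(count)) {
    throw new Error(`declareRecords returned non-integer: ${count}`);
  }
  if (count > 0) return { state: 'OK', recordCount: count };
  return zeroIsValid
    ? { state: 'OK_ZERO', recordCount: 0 } // legitimate empty dataset, safe to write
    : { state: 'RETRY', recordCount: 0 };  // zero records: do not overwrite good data
}
```

Under this sketch, dual-write then means: when the state is OK or OK_ZERO, write the {_seed, data} envelope to the canonical key and the legacy seed-meta:* value side by side.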
* fix(seed-contract): unwrap envelopes in internal cross-seed readers
After PR 2a enveloped 91 canonical keys as {_seed, data}, every script-side
reader that returned the raw parsed JSON started silently handing callers the
envelope instead of the bare payload. WoW baselines (bigmac, grocery-basket,
fear-greed) saw undefined .countries / .composite; seed-climate-anomalies saw
undefined .normals from climate:zone-normals:v1; seed-thermal-escalation saw
undefined .fireDetections from wildfire:fires:v1; seed-forecasts' ~40-key
pipeline batch returned envelopes for every input.
Fix: route every script-side reader through unwrapEnvelope(...).data. Legacy
bare-shape values pass through unchanged (unwrapEnvelope returns
{_seed: null, data: raw} for any non-envelope shape).
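The pass-through behavior can be pinned down in a few lines. A standalone sketch (the real unwrapEnvelope lives in the envelope source module and may check more):

```javascript
// Sketch: unwrap {_seed, data} envelopes; any legacy bare shape passes
// through unchanged as {_seed: null, data: raw}.
function unwrapEnvelopeSketch(raw) {
  const isEnvelope =
    raw !== null &&
    typeof raw === 'object' &&
    !Array.isArray(raw) &&
    '_seed' in raw &&
    'data' in raw;
  return isEnvelope ? { _seed: raw._seed, data: raw.data } : { _seed: null, data: raw };
}
```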
Changed:
- scripts/_seed-utils.mjs: import unwrapEnvelope; redisGet, readSeedSnapshot,
verifySeedKey all unwrap. Exported new readCanonicalValue() helper for
cross-seed consumers.
- 18 seed-*.mjs scripts with local redisGet-style helpers or inline fetch
patched to unwrap via the envelope source module (subagent sweep).
- scripts/seed-forecasts.mjs pipeline batch: parse() unwraps each result.
- scripts/seed-energy-spine.mjs redisMget: unwraps each result.
Tests:
- tests/seed-utils-envelope-reads.test.mjs: 7 new cases covering envelope
+ legacy + null paths for readSeedSnapshot and verifySeedKey.
- Full seed suite: 67/67 pass (was 61, +6 new).
Addresses both of the user's P1 findings on PR #3097.
* feat(seed-contract): envelope-aware reads in server + api helpers
Every RPC and public-boundary reader now automatically strips _seed from
contract-mode canonical keys. Legacy bare-shape values pass through unchanged
(unwrapEnvelope no-ops on non-envelope shapes).
Changed helpers (one-place fix — unblocks ~60 call sites):
- server/_shared/redis.ts: getRawJson, getCachedJson, getCachedJsonBatch
unwrap by default. cachedFetchJson inherits via getCachedJson.
- api/_upstash-json.js: readJsonFromUpstash unwraps (covers api/mcp.ts
tool responses + all its canonical-key reads).
- api/bootstrap.js: getCachedJsonBatch unwraps (public-boundary —
clients never see envelope metadata).
Left intentionally unchanged:
- api/health.js / api/seed-health.js: read only seed-meta:* keys which
remain bare-shape during dual-write. unwrapEnvelope already imported at
the meta-read boundary (PR 1) as a defensive no-op.
Tests: 67/67 seed tests pass. typecheck + typecheck:api clean.
This is the blast-radius fix the PR #3097 review called out — external
readers that would otherwise see {_seed, data} after the writer side
migrated.
* fix(test): strip export keyword in vm.runInContext'd seed source
cross-source-signals-regulatory.test.mjs loads scripts/seed-cross-source-signals.mjs
via vm.runInContext, which cannot parse ESM `export` syntax. PR 2a added
`export function declareRecords` to every seeder, which broke this test's
static-analysis approach.
Fix: strip the `export` keyword from the declareRecords line in the
preprocessed source string so the function body still evaluates as a plain
declaration.
Full test:data suite: 5307/5307 pass. typecheck + typecheck:api clean.
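Assuming the preprocessing works roughly like this (the regex and helper name are illustrative, not the exact ones in the test):

```javascript
// Sketch: vm.runInContext cannot parse ESM `export`, so strip the keyword
// from the declareRecords line before evaluating the source string.
function stripDeclareRecordsExport(source) {
  return source.replace(/^export\s+(function\s+declareRecords\b)/m, '$1');
}
```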
* feat(seed-contract): consumer-prices publish.ts writes envelopes
Wrap the 5 canonical keys written by consumer-prices-core/src/jobs/publish.ts
(overview, movers:7d/30d, freshness, categories:7d/30d/90d, retailer-spread,
basket-series) in {_seed, data} envelopes. Legacy seed-meta:<key> writes
preserved for dual-write.
Inlined a buildEnvelope helper (10 lines) rather than taking a cross-package
dependency — consumer-prices-core is a standalone npm package. Documented the
four-file parity contract (mjs source, ts mirror, js edge mirror, this copy).
Contract fields: sourceVersion='consumer-prices-core-publish-v1', schemaVersion=1,
state='OK' (recordCount>0) or 'OK_ZERO' (legitimate zero).
Typecheck: no new errors in publish.ts.
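A sketch of what that inlined helper plausibly looks like (field names taken from this commit; the exact structure is an assumption):

```javascript
// Sketch of the inlined envelope helper: no cross-package dependency,
// just the contract fields this commit names.
function buildEnvelope(data, recordCount) {
  return {
    _seed: {
      fetchedAt: Date.now(),
      recordCount,
      state: recordCount > 0 ? 'OK' : 'OK_ZERO', // legitimate zero is OK_ZERO
      sourceVersion: 'consumer-prices-core-publish-v1',
      schemaVersion: 1,
    },
    data,
  };
}
```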
* fix(seed-contract): 3 more server-side readers unwrap envelopes
Found during final audit:
- server/worldmonitor/resilience/v1/_shared.ts: resilience score reader
parsed cached GetResilienceScoreResponse raw. Contract-mode seed-resilience-scores
now envelopes those keys.
- server/worldmonitor/resilience/v1/get-resilience-ranking.ts: p05/p95
interval lookup parsed raw from seed-resilience-scores' extra-key path.
- server/worldmonitor/infrastructure/v1/_shared.ts: mgetJson() used for
count-source keys (wildfire:fires:v1, news:insights:v1) which are both
contract-mode now.
All three now unwrap via server/_shared/seed-envelope. Legacy shapes pass
through unchanged.
Typecheck clean.
* feat(seed-contract): ais-relay.cjs direct writes produce envelopes
32 canonical-key write sites in scripts/ais-relay.cjs now produce {_seed, data}
envelopes. Inlined buildEnvelope() (CJS module can't require ESM source) +
envelopeWrite(key, data, ttlSeconds, meta) wrapper. Enveloped keys span market
bootstrap, aviation, cyber-threats, theater-posture, weather-alerts, economic
spending/fred/worldbank, tech-events, corridor-risk, usni-fleet, shipping-stress,
social:reddit, wsb-tickers, pizzint, product-catalog, chokepoint transits,
ucdp-events, satellites, oref.
Left bare (not seeded data keys): seed-meta:* (dual-write legacy),
classifyCacheKey LLM cache, notam:prev-closed-state internal state,
wm:notif:scan-dedup flags.
Updated tests/ucdp-seed-resilience.test.mjs regex to accept both upstashSet
(pre-contract) and envelopeWrite (post-contract) call patterns.
* feat(seed-contract): 15 bundle files add canonicalKey for envelope gate
54 bundle sections across 12 files now declare canonicalKey alongside the
existing seedMetaKey. _bundle-runner.mjs (from PR 1) prefers canonicalKey
when both are present — gates section runs on envelope._seed.fetchedAt
read directly from the data key, eliminating the meta-outlives-data class
of bugs.
Files touched:
- climate (5), derived-signals (2), ecb-eu (3), energy-sources (6),
health (2), imf-extended (4), macro (10), market-backup (9),
portwatch (4), relay-backup (2), resilience-recovery (5), static-ref (2)
Skipped (14 sections, 3 whole bundles): multi-key writers, dynamic
templated keys (displacement year-scoped), or non-runSeed orchestrators
(regional brief cron, resilience-scores' 222-country publish, validation/
benchmark scripts). These continue to use seedMetaKey or their own gate.
seedMetaKey preserved everywhere — dual-write. _bundle-runner.mjs falls
back to legacy when canonicalKey is absent.
All 15 bundles pass node --check. test:data: 5307/5307. typecheck:all: clean.
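The envelope-first gate can be sketched as follows (field and option names assumed; _bundle-runner.mjs internals may differ):

```javascript
// Sketch: prefer canonicalKey when present, gating a section's run on the
// envelope's own fetchedAt instead of the separate seed-meta timestamp.
// This removes the meta-outlives-data class of bugs: freshness is read
// from the same key the data lives in.
function isSectionFresh(section, readJson, maxStaleMin, now = Date.now()) {
  const key = section.canonicalKey ?? section.seedMetaKey; // legacy fallback
  const value = readJson(key);
  const fetchedAt = section.canonicalKey
    ? value?._seed?.fetchedAt // envelope path: timestamp travels with the data
    : value?.fetchedAt;       // legacy path: bare seed-meta shape
  if (!Number.isFinite(fetchedAt)) return false;
  return (now - fetchedAt) / 60_000 <= maxStaleMin;
}
```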
* fix(seed-contract): 4 PR #3097 review P1s — transform/declareRecords mismatches + envelope leaks
Addresses both P1 findings and the extra-key seed-meta leak surfaced in review:
1. runSeed helper-level invariant: seed-meta:* keys NEVER envelope.
scripts/_seed-utils.mjs exports shouldEnvelopeKey(key) — returns false for
any key starting with 'seed-meta:'. Both atomicPublish (canonical) and
writeExtraKey (extras) gate the envelope wrap through this helper. Fixes
seed-iea-oil-stocks' ANALYSIS_META_EXTRA_KEY silently getting enveloped,
which broke health.js parsing the value as bare {fetchedAt, recordCount}.
Also defends against any future manual writeExtraKey(..., envelopeMeta)
call that happens to target a seed-meta:* key.
2. seed-token-panels canonical + extras fixed.
publishTransform returns data.defi (the defi panel itself, shape {tokens}).
Old declareRecords counted data.defi.tokens + data.ai.tokens + data.other.tokens
on the transformed payload → 0 → RETRY path → canonical market:defi-tokens:v1
never wrote, and because runSeed returned before the extraKeys loop,
market:ai-tokens:v1 + market:other-tokens:v1 stayed stale too.
New: declareRecords counts data.tokens on the transformed shape. AI_KEY +
OTHER_KEY extras reuse the same function (transforms return structurally
identical panels). Added isMain guard so test imports don't fire runSeed.
3. api/product-catalog.js cached reader unwraps envelope.
ais-relay.cjs now envelopes product-catalog:v2 via envelopeWrite(). The
edge reader did raw JSON.parse(result) and returned {_seed, data} to
clients, breaking the cached path. Fix: import unwrapEnvelope from
./_seed-envelope.js, apply after JSON.parse. One site — :238-241 is
downstream of getFromCache(), so the single reader fix covers both.
4. Regression lock tests/seed-contract-transform-regressions.test.mjs (11 cases):
- shouldEnvelopeKey invariant: seed-meta:* false, canonical true
- Token-panels declareRecords works on transformed shape (canonical + both extras)
- Explicit repro of pre-fix buggy signature returning 0 — guards against revert
- resolveRecordCount accepts 0, rejects non-integer
- Product-catalog envelope unwrap returns bare shape; legacy passes through
Verification:
- npm run test:data → 5318/5318 pass (was 5307 — 11 new regressions)
- npm run typecheck:all → clean
- node --check on every modified script
iea-oil-stocks canonical declareRecords was NOT broken (user confirmed during
review — buildIndex preserves .members); only its ANALYSIS_META_EXTRA_KEY
was affected, now covered generically by commit 1's helper invariant.
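Item 1's helper-level invariant is small enough to sketch exactly as described:

```javascript
// seed-meta:* keys must NEVER be enveloped; any other key may be.
// Both atomicPublish and writeExtraKey gate the wrap through this check.
function shouldEnvelopeKey(key) {
  return !String(key).startsWith('seed-meta:');
}
```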
* fix(seed-contract): seed-token-panels validateFn also runs on post-transform shape
Review finding: fixing declareRecords wasn't sufficient — atomicPublish() runs
validateFn(publishData) on the transformed payload too. seed-token-panels'
validate() checked data.defi/.ai/.other on the transformed {tokens} shape,
returned false, and runSeed took the early skipped-write branch (before even
reaching the declareRecords RETRY logic). Net effect: same as before the
declareRecords fix — canonical + both extras stayed stale.
Fix: validate() now checks the canonical defi panel directly (Array.isArray
(data?.tokens) && has at least one t.price > 0). AI/OTHER panels are validated
implicitly by their own extraKey declareRecords on write.
Audited the other 9 seeders with publishTransform (bls-series, bis-extended,
bis-data, gdelt-intel, trade-flows, iea-oil-stocks, jodi-gas, sanctions-pressure,
forecasts): all validateFns correctly target the post-transform shape. Only
token-panels regressed.
Added 4 regression tests (tests/seed-contract-transform-regressions.test.mjs):
- validate accepts transformed panel with priced tokens
- validate rejects all-zero-price tokens
- validate rejects empty/missing tokens
- Explicit pre-fix repro (buggy old signature fails on transformed shape)
Verification:
- npm run test:data → 5322/5322 pass (was 5318; +4 new)
- npm run typecheck:all → clean
- node --check clean
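A sketch of the fixed validate() described above (the check matches the commit text; the helper name is illustrative):

```javascript
// Sketch: validateFn runs on the transformed payload, i.e. the canonical
// defi panel with shape {tokens}, so it must check data.tokens directly,
// not the pre-transform data.defi/.ai/.other structure.
function validateTokenPanel(data) {
  return Array.isArray(data?.tokens) && data.tokens.some(t => Number(t?.price) > 0);
}
```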
* feat(seed-contract): add /api/seed-contract-probe validation endpoint
Single machine-readable gate for 'is PR #3097 working in production'.
Replaces the curl/jq ritual with one authenticated edge call that returns
HTTP 200 with ok:true, or 503 plus the list of failing checks.
What it validates:
- 8 canonical keys have {_seed, data} envelopes with required data fields
and minRecords floors (fsi-eu, zone-normals, 3 token panels + minRecords
guard against token-panels RETRY regression, product-catalog, wildfire,
earthquakes).
- 2 seed-meta:* keys remain BARE (shouldEnvelopeKey invariant; guards
against iea-oil-stocks ANALYSIS_META_EXTRA_KEY-class regressions).
- /api/product-catalog + /api/bootstrap responses contain no '_seed' leak.
Auth: x-probe-secret header must match RELAY_SHARED_SECRET (reuses existing
Vercel↔Railway internal trust boundary).
Probe logic is exported (checkProbe, checkPublicBoundary, DEFAULT_PROBES) for
hermetic testing. tests/seed-contract-probe.test.mjs covers every branch:
envelope pass/fail on field/records/shape, bare pass/fail on shape/field,
missing/malformed JSON, Redis non-2xx, boundary seed-leak detection,
DEFAULT_PROBES sanity (seed-meta invariant present, token-panels minRecords
guard present).
Usage:
curl -H "x-probe-secret: $RELAY_SHARED_SECRET" \
https://api.worldmonitor.app/api/seed-contract-probe
PR 3 will extend the probe with a stricter mode that asserts seed-meta:*
keys are GONE (not just bare) once legacy dual-write is removed.
Verification:
- tests/seed-contract-probe.test.mjs → 15/15 pass
- npm run test:data → 5338/5338 (was 5322; +16 new incl. conformance)
- npm run typecheck:all → clean
* fix(seed-contract): tighten probe — minRecords on AI/OTHER + cache-path source header
Review P2 findings: the probe's stated guards were weaker than advertised.
1. market:ai-tokens:v1 + market:other-tokens:v1 probes claimed to guard the
token-panels extra-key RETRY regression but only checked shape='envelope'
+ dataHas:['tokens']. If an extra-key declareRecords regressed to 0, both
probes would still pass because checkProbe() only inspects _seed.recordCount
when minRecords is set. Now both enforce minRecords: 1.
2. /api/product-catalog boundary check only asserted no '_seed' leak — which
is also true for the static fallback path. A broken cached reader
(getFromCache returning null or throwing) could serve fallback silently
and still pass this probe. Now:
- api/product-catalog.js emits X-Product-Catalog-Source: cache|dodo|fallback
on the response (the json() helper gained an optional source param wired
to each of the three branches).
- checkPublicBoundary declaratively requires that header's value match
'cache' for /api/product-catalog, so a fallback-serve fails the probe
with reason 'source:fallback!=cache' or 'source:missing!=cache'.
Test updates (tests/seed-contract-probe.test.mjs):
- Boundary check reworked to use a BOUNDARY_CHECKS config with optional
requireSourceHeader per endpoint.
- New cases: served-from-cache passes, served-from-fallback fails with source
mismatch, missing header fails, seed-leak still takes precedence, bad
status fails.
- Token-panels sanity test now asserts minRecords≥1 on all 3 panels.
Verification:
- tests/seed-contract-probe.test.mjs → 17/17 pass (was 15, +2 net)
- npm run test:data → 5340/5340
- npm run typecheck:all → clean
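The minRecords semantics behind finding 1 can be sketched like this (an assumed, simplified shape of checkProbe; the real probe config carries more fields):

```javascript
// Sketch: recordCount is only inspected when a probe sets minRecords,
// which is why a shape-only probe missed a RETRY regression to 0 records.
function checkProbe(probe, value) {
  if (probe.shape === 'envelope') {
    if (!value || typeof value !== 'object' || !('_seed' in value)) {
      return { ok: false, reason: 'shape!=envelope' };
    }
    for (const field of probe.dataHas ?? []) {
      if (!(field in (value.data ?? {}))) return { ok: false, reason: `data missing ${field}` };
    }
    if (probe.minRecords != null && !((value._seed?.recordCount ?? 0) >= probe.minRecords)) {
      return { ok: false, reason: `recordCount<${probe.minRecords}` };
    }
    return { ok: true };
  }
  // shape === 'bare': a seed-meta:* value must NOT be an envelope
  return value && typeof value === 'object' && !('_seed' in value)
    ? { ok: true }
    : { ok: false, reason: 'shape!=bare' };
}
```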
438 lines
14 KiB
JavaScript
#!/usr/bin/env node

import { loadEnvFile, CHROME_UA, runSeed } from './_seed-utils.mjs';

loadEnvFile(import.meta.url);

const EONET_API_URL = 'https://eonet.gsfc.nasa.gov/api/v3/events';
const GDACS_API = 'https://www.gdacs.org/gdacsapi/api/events/geteventlist/MAP';
const NHC_BASE = 'https://mapservices.weather.noaa.gov/tropical/rest/services/tropical/NHC_tropical_weather/MapServer';
const CANONICAL_KEY = 'natural:events:v1';
const CACHE_TTL = 43200; // 12h — 6x the 2h cron interval; was 1h (TTL < maxStaleMin:120 — panel went dark before health alarmed)

const DAYS = 30;
const WILDFIRE_MAX_AGE_MS = 48 * 60 * 60 * 1000;

const GDACS_TO_CATEGORY = {
  EQ: 'earthquakes',
  FL: 'floods',
  TC: 'severeStorms',
  VO: 'volcanoes',
  WF: 'wildfires',
  DR: 'drought',
};

const EVENT_TYPE_NAMES = {
  EQ: 'Earthquake',
  FL: 'Flood',
  TC: 'Tropical Cyclone',
  VO: 'Volcano',
  WF: 'Wildfire',
  DR: 'Drought',
};

const NATURAL_EVENT_CATEGORIES = new Set([
  'severeStorms', 'wildfires', 'volcanoes', 'earthquakes', 'floods',
  'landslides', 'drought', 'dustHaze', 'snow', 'tempExtremes',
  'seaLakeIce', 'waterColor', 'manmade',
]);

function normalizeCategory(id) {
  const c = String(id || '').trim();
  return NATURAL_EVENT_CATEGORIES.has(c) ? c : 'manmade';
}

async function fetchEonet(days) {
  const url = `${EONET_API_URL}?status=open&days=${days}`;
  const res = await fetch(url, {
    headers: { Accept: 'application/json', 'User-Agent': CHROME_UA },
    signal: AbortSignal.timeout(15_000),
  });
  if (!res.ok) throw new Error(`EONET ${res.status}`);

  const data = await res.json();
  const events = [];
  const now = Date.now();

  for (const event of data.events || []) {
    const category = event.categories?.[0];
    if (!category) continue;
    const normalizedCategory = normalizeCategory(category.id);
    if (normalizedCategory === 'earthquakes') continue;

    const latestGeo = event.geometry?.[event.geometry.length - 1];
    if (!latestGeo || latestGeo.type !== 'Point') continue;

    const eventDate = new Date(latestGeo.date);
    const [lon, lat] = latestGeo.coordinates;

    if (normalizedCategory === 'wildfires' && now - eventDate.getTime() > WILDFIRE_MAX_AGE_MS) continue;

    const source = event.sources?.[0];
    events.push({
      id: event.id || '',
      title: event.title || '',
      description: event.description || '',
      category: normalizedCategory,
      categoryTitle: category.title || '',
      lat,
      lon,
      date: eventDate.getTime(),
      magnitude: latestGeo.magnitudeValue ?? 0,
      magnitudeUnit: latestGeo.magnitudeUnit || '',
      sourceUrl: source?.url || '',
      sourceName: source?.id || '',
      closed: event.closed !== null,
    });
  }

  return events;
}

// Saffir-Simpson category from max sustained wind in knots
function classifyWind(kt) {
  if (kt >= 137) return { category: 5, classification: 'Category 5' };
  if (kt >= 113) return { category: 4, classification: 'Category 4' };
  if (kt >= 96) return { category: 3, classification: 'Category 3' };
  if (kt >= 83) return { category: 2, classification: 'Category 2' };
  if (kt >= 64) return { category: 1, classification: 'Category 1' };
  if (kt >= 34) return { category: 0, classification: 'Tropical Storm' };
  return { category: 0, classification: 'Tropical Depression' };
}

function parseGdacsTcFields(props) {
  const fields = {};
  fields.stormId = `gdacs-TC-${props.eventid}`;

  const name = String(props.name || '');
  const nameMatch = name.match(/(?:Hurricane|Typhoon|Cyclone|Storm|Depression)\s+(.+)/i);
  fields.stormName = nameMatch ? nameMatch[1].trim() : name.trim() || undefined;

  const desc = String(props.description || '') + ' ' + String(props.severitydata?.severitytext || '');

  const windPatterns = [
    /(\d+(?:\.\d+)?)\s*(?:kn(?:ots?)?|kt)/i,
    /(\d+(?:\.\d+)?)\s*mph/i,
    /(\d+(?:\.\d+)?)\s*km\/?h/i,
  ];
  for (const [i, pat] of windPatterns.entries()) {
    const m = desc.match(pat);
    if (m) {
      let val = parseFloat(m[1]);
      if (i === 1) val = Math.round(val * 0.868976); // mph to kt
      else if (i === 2) val = Math.round(val * 0.539957); // km/h to kt
      if (val > 0 && val <= 200) {
        fields.windKt = Math.round(val);
        const { category, classification } = classifyWind(fields.windKt);
        fields.stormCategory = category;
        fields.classification = classification;
      }
      break;
    }
  }

  const pressureMatch = desc.match(/(\d{3,4})\s*(?:mb|hPa|mbar)/i);
  if (pressureMatch) {
    const p = parseInt(pressureMatch[1], 10);
    if (p >= 850 && p <= 1050) fields.pressureMb = p;
  }

  return fields;
}

async function fetchGdacs() {
  const res = await fetch(GDACS_API, {
    headers: { Accept: 'application/json', 'User-Agent': CHROME_UA },
    signal: AbortSignal.timeout(15_000),
  });
  if (!res.ok) throw new Error(`GDACS ${res.status}`);

  const data = await res.json();
  const features = data.features || [];
  const seen = new Set();
  const events = [];

  for (const f of features) {
    if (!f.geometry || f.geometry.type !== 'Point') continue;
    const props = f.properties;
    const key = `${props.eventtype}-${props.eventid}`;
    if (seen.has(key)) continue;
    seen.add(key);

    if (props.alertlevel === 'Green') continue;

    const category = GDACS_TO_CATEGORY[props.eventtype] || 'manmade';
    const alertPrefix = props.alertlevel === 'Red' ? '\u{1F534} ' : props.alertlevel === 'Orange' ? '\u{1F7E0} ' : '';
    const description = props.description || EVENT_TYPE_NAMES[props.eventtype] || props.eventtype;
    const severity = props.severitydata?.severitytext || '';

    const tcFields = props.eventtype === 'TC' ? parseGdacsTcFields(props) : {};

    events.push({
      id: `gdacs-${props.eventtype}-${props.eventid}`,
      title: `${alertPrefix}${props.name || ''}`,
      description: `${description}${severity ? ` - ${severity}` : ''}`,
      category,
      categoryTitle: description,
      lat: f.geometry.coordinates[1] ?? 0,
      lon: f.geometry.coordinates[0] ?? 0,
      date: new Date(props.fromdate || 0).getTime(),
      magnitude: 0,
      magnitudeUnit: '',
      sourceUrl: props.url?.report || '',
      sourceName: 'GDACS',
      closed: false,
      ...tcFields,
      forecastTrack: [],
      conePolygon: [],
      pastTrack: [],
    });
  }

  return events.slice(0, 100);
}

// NHC ArcGIS layer IDs per storm slot (5 slots per basin)
// Each slot has: forecastPoints, forecastTrack, forecastCone, pastPoints, pastTrack
const NHC_STORM_SLOTS = [];
const BASIN_OFFSETS = { AT: 4, EP: 134, CP: 264 };
const BASIN_CODES = { AT: 'AL', EP: 'EP', CP: 'CP' };
for (const [prefix, base] of Object.entries(BASIN_OFFSETS)) {
  for (let i = 0; i < 5; i++) {
    const offset = base + i * 26;
    NHC_STORM_SLOTS.push({
      basin: BASIN_CODES[prefix],
      forecastPoints: offset + 2,
      forecastTrack: offset + 3,
      forecastCone: offset + 4,
      pastPoints: offset + 7,
      pastTrack: offset + 8,
    });
  }
}

async function nhcQuery(layerId) {
  const url = `${NHC_BASE}/${layerId}/query?where=1%3D1&outFields=*&f=geojson`;
  const res = await fetch(url, {
    headers: { Accept: 'application/json', 'User-Agent': CHROME_UA },
    signal: AbortSignal.timeout(15_000),
  });
  if (!res.ok) return { type: 'FeatureCollection', features: [] };
  return res.json();
}

const NHC_STORM_TYPES = {
  HU: 'Hurricane', TS: 'Tropical Storm', TD: 'Tropical Depression',
  STS: 'Subtropical Storm', STD: 'Subtropical Depression',
  EX: 'Post-Tropical', PT: 'Post-Tropical',
};

async function fetchNhc() {
  // Query all forecast point layers to find active storms
  const pointQueries = NHC_STORM_SLOTS.map(s => nhcQuery(s.forecastPoints));
  const pointResults = await Promise.allSettled(pointQueries);

  const activeSlots = [];
  for (let i = 0; i < NHC_STORM_SLOTS.length; i++) {
    const r = pointResults[i];
    if (r.status === 'fulfilled' && r.value.features?.length > 0) {
      activeSlots.push({ slot: NHC_STORM_SLOTS[i], points: r.value });
    }
  }

  if (activeSlots.length === 0) return [];

  // Fetch track, cone, past data for active storms only
  const detailQueries = activeSlots.map(async ({ slot, points }) => {
    const [coneRes, pastPtsRes] = await Promise.allSettled([
      nhcQuery(slot.forecastCone),
      nhcQuery(slot.pastPoints),
    ]);
    return {
      slot, points,
      cone: coneRes.status === 'fulfilled' ? coneRes.value : null,
      pastPts: pastPtsRes.status === 'fulfilled' ? pastPtsRes.value : null,
    };
  });
  const stormData = await Promise.all(detailQueries);

  const events = [];
  for (const { slot, points, cone, pastPts } of stormData) {
    // Current position = forecast point with tau=0
    const currentPt = points.features.find(f => f.properties?.tau === 0 || f.properties?.fcstprd === 0);
    if (!currentPt) continue;

    const p = currentPt.properties;
    const stormName = p.stormname || '';
    const windKt = p.maxwind || 0;
    const ssNum = p.ssnum || 0;
    const stormType = p.stormtype || 'TS';
    const advisNum = p.advisnum || '';
    const stormNum = p.stormnum || 0;
    const stormId = `nhc-${slot.basin}${String(stormNum).padStart(2, '0')}-${advisNum}`;

    const classification = NHC_STORM_TYPES[stormType] || classifyWind(windKt).classification;
    const typeLabel = NHC_STORM_TYPES[stormType] || stormType;
    const title = `${typeLabel} ${stormName}`;

    // Build forecast track from forecast points
    const forecastTrack = points.features
      .filter(f => f.properties?.tau > 0 || f.properties?.fcstprd > 0)
      .sort((a, b) => (a.properties.tau || a.properties.fcstprd) - (b.properties.tau || b.properties.fcstprd))
      .map(f => ({
        lat: f.geometry.coordinates[1],
        lon: f.geometry.coordinates[0],
        hour: f.properties.tau || f.properties.fcstprd || 0,
        windKt: f.properties.maxwind || 0,
        category: f.properties.ssnum || 0,
      }));

    // Build cone polygon from forecast cone geometry (CoordRing format)
    const conePolygon = [];
    if (cone?.features?.length > 0) {
      for (const f of cone.features) {
        const rings =
          f.geometry?.type === 'Polygon' ? f.geometry.coordinates || [] :
          f.geometry?.type === 'MultiPolygon' ? (f.geometry.coordinates || []).flat() :
          [];
        for (const ring of rings) {
          conePolygon.push({ points: ring.map(([lon, lat]) => ({ lon, lat })) });
        }
      }
    }

    // Build past track from past points
    const pastTrack = [];
    if (pastPts?.features?.length > 0) {
      const sorted = pastPts.features
        .filter(f => f.geometry?.coordinates)
        .sort((a, b) => (a.properties.dtg || 0) - (b.properties.dtg || 0));
      for (const f of sorted) {
        pastTrack.push({
          lat: f.geometry.coordinates[1],
          lon: f.geometry.coordinates[0],
          windKt: f.properties.intensity ?? 0,
          timestamp: f.properties.dtg ?? 0,
        });
      }
    }

    const lat = currentPt.geometry.coordinates[1];
    const lon = currentPt.geometry.coordinates[0];
    if (lat < -90 || lat > 90 || lon < -180 || lon > 180) continue;
    if (windKt < 0 || windKt > 200) continue;

    const pressureMb = p.mslp >= 850 && p.mslp <= 1050 ? p.mslp : undefined;
    const advDate = p.advdate ? new Date(p.advdate).getTime() : Date.now();

    events.push({
      id: stormId,
      title,
      description: `${title}, Max wind ${windKt} kt${pressureMb ? `, Pressure ${pressureMb} mb` : ''}`,
      category: 'severeStorms',
      categoryTitle: 'Tropical Cyclone',
      lat,
      lon,
      date: Number.isFinite(advDate) ? advDate : Date.now(),
      magnitude: windKt,
      magnitudeUnit: 'kt',
      sourceUrl: 'https://www.nhc.noaa.gov/',
      sourceName: 'NHC',
      closed: false,
      stormId,
      stormName,
      basin: slot.basin,
      stormCategory: ssNum,
      classification,
      windKt,
      pressureMb,
      movementDir: p.tcdir ?? undefined,
      movementSpeedKt: p.tcspd ?? undefined,
      forecastTrack,
      conePolygon,
      pastTrack,
    });
  }

  return events;
}

async function fetchNaturalEvents() {
  const [eonetResult, gdacsResult, nhcResult] = await Promise.allSettled([
    fetchEonet(DAYS),
    fetchGdacs(),
    fetchNhc(),
  ]);

  const eonetEvents = eonetResult.status === 'fulfilled' ? eonetResult.value : [];
  const gdacsEvents = gdacsResult.status === 'fulfilled' ? gdacsResult.value : [];
  const nhcEvents = nhcResult.status === 'fulfilled' ? nhcResult.value : [];

  if (eonetResult.status === 'rejected') console.log('[EONET]', eonetResult.reason?.message);
  if (gdacsResult.status === 'rejected') console.log('[GDACS]', gdacsResult.reason?.message);
  if (nhcResult.status === 'rejected') console.log('[NHC]', nhcResult.reason?.message);

  // NHC events take priority for storms (have forecast tracks/cones)
  // Dedup GDACS TC events against NHC by storm name proximity
  const nhcStorms = nhcEvents
    .filter(e => e.stormName)
    .map(e => ({ name: (e.stormName || '').toLowerCase(), lat: e.lat, lon: e.lon }));
  const seenLocations = new Set();
  const merged = [];

  // Add NHC storms first (highest quality data with tracks/cones)
  for (const event of nhcEvents) {
    const k = `${event.lat.toFixed(1)}-${event.lon.toFixed(1)}-${event.category}`;
    seenLocations.add(k);
    merged.push(event);
  }

  // Add GDACS events, skipping TC events that match NHC storms by name
  for (const event of gdacsEvents) {
    if (event.category === 'severeStorms' && event.stormName) {
      const gName = event.stormName.toLowerCase();
      const isDupe = nhcStorms.some(n =>
        n.name === gName && Math.abs(n.lat - event.lat) < 10 && Math.abs(n.lon - event.lon) < 30
      );
      if (isDupe) continue;
    }
    const k = `${event.lat.toFixed(1)}-${event.lon.toFixed(1)}-${event.category}`;
    if (!seenLocations.has(k)) {
      seenLocations.add(k);
      merged.push(event);
    }
  }

  // Add EONET events
  for (const event of eonetEvents) {
    const k = `${event.lat.toFixed(1)}-${event.lon.toFixed(1)}-${event.category}`;
    if (!seenLocations.has(k)) {
      seenLocations.add(k);
      merged.push(event);
    }
  }

  if (merged.length === 0) return null;
  return { events: merged };
}

function validate(data) {
  return Array.isArray(data?.events);
}

export function declareRecords(data) {
  return Array.isArray(data?.events) ? data.events.length : 0;
}

runSeed('natural', 'events', CANONICAL_KEY, fetchNaturalEvents, {
  validateFn: validate,
  ttlSeconds: CACHE_TTL,
  sourceVersion: 'eonet+gdacs+nhc',

  declareRecords,
  schemaVersion: 1,
  maxStaleMin: 360,
}).catch((err) => {
  const _cause = err.cause ? ` (cause: ${err.cause.message || err.cause.code || err.cause})` : '';
  console.error('FATAL:', (err.message || err) + _cause);
  process.exit(1);
});