Mirror of https://github.com/koala73/worldmonitor.git, synced 2026-04-25 17:14:57 +02:00
* feat(seed-contract): PR 2a — runSeed envelope dual-write + 91 seeders migrated
Opt-in contract path in runSeed: when opts.declareRecords is provided, write
{_seed, data} envelope to the canonical key alongside legacy seed-meta:*
(dual-write). State machine: OK / OK_ZERO / RETRY with zeroIsValid opt.
declareRecords throws or returns non-integer → hard fail (contract violation).
extraKeys[*] support per-key declareRecords; each extra key writes its own
envelope. Legacy seeders (no declareRecords) entirely unchanged.
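For illustration only, the dual-written envelope on a canonical key looks roughly like this (field names taken from this PR; the exact _seed field set may differ):

    {
      _seed: {
        fetchedAt: 1744600000000,  // epoch ms
        recordCount: 123,
        state: 'OK',               // or 'OK_ZERO'; RETRY skips the write entirely
        schemaVersion: 1,
        sourceVersion: '...',
        maxStaleMin: 120,
      },
      data: { /* the bare payload legacy readers used to see */ },
    }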
Migrated all 91 scripts/seed-*.mjs to contract mode. Each exports
declareRecords returning the canonical record count, and passes
schemaVersion: 1 + maxStaleMin (matched to api/health.js SEED_META, or 2.5x
interval where no registry entry exists). Contract conformance reports 84/86
seeders with full descriptor (2 pre-existing warnings).
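A migrated seeder's opt-in, in sketch form (the option names come from this commit; the payload shape and exact runSeed call signature are assumptions):

    export function declareRecords(data) {
      return Object.keys(data.countries).length; // canonical record count (integer)
    }

    await runSeed({
      // ...existing options...
      declareRecords,
      schemaVersion: 1,
      maxStaleMin: 180, // matched to the health.js registry, or 2.5x cron interval
    });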
Legacy seed-meta keys still written so unmigrated readers keep working;
follow-up slices flip health.js + readers to envelope-first.
Tests: 61/61 PR 1 tests still pass.
Next slices for PR 2:
- api/health.js registry collapse + 15 seed-bundle-*.mjs canonicalKey wiring
- reader migration (mcp, resilience, aviation, displacement, regional-snapshot)
- direct writers — ais-relay.cjs, consumer-prices-core publish.ts
- public-boundary stripSeedEnvelope + test migration
Plan: docs/plans/2026-04-14-002-fix-runseed-zero-record-lockout-plan.md
* fix(seed-contract): unwrap envelopes in internal cross-seed readers
After PR 2a enveloped 91 canonical keys as {_seed, data}, every script-side
reader that returned the raw parsed JSON started silently handing callers the
envelope instead of the bare payload. WoW baselines (bigmac, grocery-basket,
fear-greed) saw undefined .countries / .composite; seed-climate-anomalies saw
undefined .normals from climate:zone-normals:v1; seed-thermal-escalation saw
undefined .fireDetections from wildfire:fires:v1; seed-forecasts' ~40-key
pipeline batch returned envelopes for every input.
Fix: route every script-side reader through unwrapEnvelope(...).data. Legacy
bare-shape values pass through unchanged (unwrapEnvelope returns
{_seed: null, data: raw} for any non-envelope shape), as sketched below.
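That contract reduces to roughly this (the envelope-detection predicate is an assumption; the return shapes are from this commit):

    export function unwrapEnvelope(value) {
      // Assumed detection: an object carrying both _seed and data is an envelope
      if (value && typeof value === 'object' && '_seed' in value && 'data' in value) {
        return { _seed: value._seed, data: value.data };
      }
      return { _seed: null, data: value }; // legacy bare shape passes through
    }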
Changed:
- scripts/_seed-utils.mjs: import unwrapEnvelope; redisGet, readSeedSnapshot,
verifySeedKey all unwrap. Exported new readCanonicalValue() helper for
cross-seed consumers (sketched after this list).
- 18 seed-*.mjs scripts with local redisGet-style helpers or inline fetch
patched to unwrap via the envelope source module (subagent sweep).
- scripts/seed-forecasts.mjs pipeline batch: parse() unwraps each result.
- scripts/seed-energy-spine.mjs redisMget: unwraps each result.
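A minimal sketch of what readCanonicalValue() could look like (redisGetRaw is a hypothetical stand-in for the underlying Upstash GET):

    export async function readCanonicalValue(key) {
      const raw = await redisGetRaw(key); // hypothetical raw read
      if (raw == null) return null;
      return unwrapEnvelope(JSON.parse(raw)).data; // bare payload either way
    }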
Tests:
- tests/seed-utils-envelope-reads.test.mjs: 7 new cases covering envelope
+ legacy + null paths for readSeedSnapshot and verifySeedKey.
- Full seed suite: 67/67 pass (was 61, +6 new).
Addresses both of the user's P1 findings on PR #3097.
* feat(seed-contract): envelope-aware reads in server + api helpers
Every RPC and public-boundary reader now automatically strips _seed from
contract-mode canonical keys. Legacy bare-shape values pass through unchanged
(unwrapEnvelope no-ops on non-envelope shapes).
Changed helpers (one-place fix — unblocks ~60 call sites; pattern sketched after this list):
- server/_shared/redis.ts: getRawJson, getCachedJson, getCachedJsonBatch
unwrap by default. cachedFetchJson inherits via getCachedJson.
- api/_upstash-json.js: readJsonFromUpstash unwraps (covers api/mcp.ts
tool responses + all its canonical-key reads).
- api/bootstrap.js: getCachedJsonBatch unwraps (public-boundary —
clients never see envelope metadata).
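The unwrap-by-default pattern, sketched (real signatures in server/_shared/redis.ts may differ; readAndParseJson is a hypothetical stand-in for the existing read path):

    async function getRawJson(key) {
      const parsed = await readAndParseJson(key); // hypothetical existing read
      return parsed == null ? null : unwrapEnvelope(parsed).data;
    }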
Left intentionally unchanged:
- api/health.js / api/seed-health.js: read only seed-meta:* keys which
remain bare-shape during dual-write. unwrapEnvelope already imported at
the meta-read boundary (PR 1) as a defensive no-op.
Tests: 67/67 seed tests pass. typecheck + typecheck:api clean.
This is the blast-radius fix the PR #3097 review called out — external
readers that would otherwise see {_seed, data} after the writer side
migrated.
* fix(test): strip export keyword in vm.runInContext'd seed source
cross-source-signals-regulatory.test.mjs loads scripts/seed-cross-source-signals.mjs
via vm.runInContext, which cannot parse ESM `export` syntax. PR 2a added
`export function declareRecords` to every seeder, which broke this test's
static-analysis approach.
Fix: strip the `export` keyword from the declareRecords line in the
preprocessed source string so the function body still evaluates as a plain
declaration.
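Roughly, as a sketch (the exact regex and sandbox setup may differ):

    import vm from 'node:vm';
    // Strip the ESM keyword so the source evaluates as a plain script
    const patched = source.replace(/^export (?=function declareRecords)/m, '');
    vm.runInContext(patched, vm.createContext({ /* test sandbox globals */ }));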
Full test:data suite: 5307/5307 pass. typecheck + typecheck:api clean.
* feat(seed-contract): consumer-prices publish.ts writes envelopes
Wrap the 5 canonical keys written by consumer-prices-core/src/jobs/publish.ts
(overview, movers:7d/30d, freshness, categories:7d/30d/90d, retailer-spread,
basket-series) in {_seed, data} envelopes. Legacy seed-meta:<key> writes
preserved for dual-write.
Inlined a buildEnvelope helper (10 lines) rather than taking a cross-package
dependency — consumer-prices-core is a standalone npm package. Documented the
four-file parity contract (mjs source, ts mirror, js edge mirror, this copy).
Contract fields: sourceVersion='consumer-prices-core-publish-v1', schemaVersion=1,
state='OK' (recordCount>0) or 'OK_ZERO' (legitimate zero).
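Roughly what the inlined helper could look like (state and version fields from this commit; exact layout assumed):

    function buildEnvelope(data, recordCount) {
      return {
        _seed: {
          fetchedAt: Date.now(),
          recordCount,
          state: recordCount > 0 ? 'OK' : 'OK_ZERO',
          schemaVersion: 1,
          sourceVersion: 'consumer-prices-core-publish-v1',
        },
        data,
      };
    }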
Typecheck: no new errors in publish.ts.
* fix(seed-contract): 3 more server-side readers unwrap envelopes
Found during final audit:
- server/worldmonitor/resilience/v1/_shared.ts: resilience score reader
parsed cached GetResilienceScoreResponse raw. Contract-mode seed-resilience-scores
now envelopes those keys.
- server/worldmonitor/resilience/v1/get-resilience-ranking.ts: p05/p95
interval lookup parsed raw from seed-resilience-scores' extra-key path.
- server/worldmonitor/infrastructure/v1/_shared.ts: mgetJson() used for
count-source keys (wildfire:fires:v1, news:insights:v1) which are both
contract-mode now.
All three now unwrap via server/_shared/seed-envelope. Legacy shapes pass
through unchanged.
Typecheck clean.
* feat(seed-contract): ais-relay.cjs direct writes produce envelopes
32 canonical-key write sites in scripts/ais-relay.cjs now produce {_seed, data}
envelopes. Inlined buildEnvelope() (CJS module can't require ESM source) +
envelopeWrite(key, data, ttlSeconds, meta) wrapper. Enveloped keys span market
bootstrap, aviation, cyber-threats, theater-posture, weather-alerts, economic
spending/fred/worldbank, tech-events, corridor-risk, usni-fleet, shipping-stress,
social:reddit, wsb-tickers, pizzint, product-catalog, chokepoint transits,
ucdp-events, satellites, oref.
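Sketch of the wrapper (upstashSet is the pre-existing write helper it replaces at envelope call sites; the exact signature is assumed):

    function envelopeWrite(key, data, ttlSeconds, meta) {
      return upstashSet(key, JSON.stringify(buildEnvelope(data, meta)), ttlSeconds);
    }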
Left bare (not seeded data keys): seed-meta:* (dual-write legacy),
classifyCacheKey LLM cache, notam:prev-closed-state internal state,
wm:notif:scan-dedup flags.
Updated tests/ucdp-seed-resilience.test.mjs regex to accept both upstashSet
(pre-contract) and envelopeWrite (post-contract) call patterns.
* feat(seed-contract): 15 bundle files add canonicalKey for envelope gate
54 bundle sections across 12 files now declare canonicalKey alongside the
existing seedMetaKey. _bundle-runner.mjs (from PR 1) prefers canonicalKey
when both are present — gates section runs on envelope._seed.fetchedAt
read directly from the data key, eliminating the meta-outlives-data class
of bugs.
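Illustratively, a migrated section carries both keys (other section fields unchanged and omitted here):

    {
      seedMetaKey: 'seed-meta:climate:zone-normals', // legacy gate, kept for dual-write
      canonicalKey: 'climate:zone-normals:v1',       // runner prefers this when present
      // ...rest of section config...
    }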
Files touched:
- climate (5), derived-signals (2), ecb-eu (3), energy-sources (6),
health (2), imf-extended (4), macro (10), market-backup (9),
portwatch (4), relay-backup (2), resilience-recovery (5), static-ref (2)
Skipped (14 sections, 3 whole bundles): multi-key writers, dynamic
templated keys (displacement year-scoped), or non-runSeed orchestrators
(regional brief cron, resilience-scores' 222-country publish, validation/
benchmark scripts). These continue to use seedMetaKey or their own gate.
seedMetaKey preserved everywhere — dual-write. _bundle-runner.mjs falls
back to legacy when canonicalKey is absent.
All 15 bundles pass node --check. test:data: 5307/5307. typecheck:all: clean.
* fix(seed-contract): 4 PR #3097 review P1s — transform/declareRecords mismatches + envelope leaks
Addresses both P1 findings and the extra-key seed-meta leak surfaced in review:
1. runSeed helper-level invariant: seed-meta:* keys NEVER envelope.
scripts/_seed-utils.mjs exports shouldEnvelopeKey(key) (sketched after this
list) — returns false for any key starting with 'seed-meta:'. Both atomicPublish (canonical) and
writeExtraKey (extras) gate the envelope wrap through this helper. Fixes
seed-iea-oil-stocks' ANALYSIS_META_EXTRA_KEY silently getting enveloped,
which broke health.js parsing the value as bare {fetchedAt, recordCount}.
Also defends against any future manual writeExtraKey(..., envelopeMeta)
call that happens to target a seed-meta:* key.
2. seed-token-panels canonical + extras fixed.
publishTransform returns data.defi (the defi panel itself, shape {tokens}).
Old declareRecords counted data.defi.tokens + data.ai.tokens + data.other.tokens
on the transformed payload → 0 → RETRY path → canonical market:defi-tokens:v1
never wrote, and because runSeed returned before the extraKeys loop,
market:ai-tokens:v1 + market:other-tokens:v1 stayed stale too.
New: declareRecords counts data.tokens on the transformed shape. AI_KEY +
OTHER_KEY extras reuse the same function (transforms return structurally
identical panels). Added isMain guard so test imports don't fire runSeed.
3. api/product-catalog.js cached reader unwraps envelope.
ais-relay.cjs now envelopes product-catalog:v2 via envelopeWrite(). The
edge reader did raw JSON.parse(result) and returned {_seed, data} to
clients, breaking the cached path. Fix: import unwrapEnvelope from
./_seed-envelope.js, apply after JSON.parse. One site — :238-241 is
downstream of getFromCache(), so the single reader fix covers both.
4. Regression lock tests/seed-contract-transform-regressions.test.mjs (11 cases):
- shouldEnvelopeKey invariant: seed-meta:* false, canonical true
- Token-panels declareRecords works on transformed shape (canonical + both extras)
- Explicit repro of pre-fix buggy signature returning 0 — guards against revert
- resolveRecordCount accepts 0, rejects non-integer
- Product-catalog envelope unwrap returns bare shape; legacy passes through
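The invariant from item 1 is small enough to sketch in full:

    export function shouldEnvelopeKey(key) {
      // seed-meta:* stays bare-shape during dual-write; everything else envelopes
      return !String(key).startsWith('seed-meta:');
    }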
Verification:
- npm run test:data → 5318/5318 pass (was 5307; +11 new regression tests)
- npm run typecheck:all → clean
- node --check on every modified script
iea-oil-stocks canonical declareRecords was NOT broken (user confirmed during
review — buildIndex preserves .members); only its ANALYSIS_META_EXTRA_KEY
was affected, now covered generically by commit 1's helper invariant.
* fix(seed-contract): seed-token-panels validateFn also runs on post-transform shape
Review finding: fixing declareRecords wasn't sufficient — atomicPublish() runs
validateFn(publishData) on the transformed payload too. seed-token-panels'
validate() checked data.defi/.ai/.other on the transformed {tokens} shape,
returned false, and runSeed took the early skipped-write branch (before even
reaching the declareRecords RETRY logic). Net effect: same as before the
declareRecords fix — canonical + both extras stayed stale.
Fix: validate() now checks the canonical defi panel directly (Array.isArray
(data?.tokens) && has at least one t.price > 0). AI/OTHER panels are validated
implicitly by their own extraKey declareRecords on write.
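The fixed validator reduces to roughly this (predicate as described above):

    function validate(data) {
      return Array.isArray(data?.tokens) && data.tokens.some((t) => (t?.price ?? 0) > 0);
    }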
Audited the other 9 seeders with publishTransform (bls-series, bis-extended,
bis-data, gdelt-intel, trade-flows, iea-oil-stocks, jodi-gas, sanctions-pressure,
forecasts): all validateFn's correctly target the post-transform shape. Only
token-panels regressed.
Added 4 regression tests (tests/seed-contract-transform-regressions.test.mjs):
- validate accepts transformed panel with priced tokens
- validate rejects all-zero-price tokens
- validate rejects empty/missing tokens
- Explicit pre-fix repro (buggy old signature fails on transformed shape)
Verification:
- npm run test:data → 5322/5322 pass (was 5318; +4 new)
- npm run typecheck:all → clean
- node --check clean
* feat(seed-contract): add /api/seed-contract-probe validation endpoint
Single machine-readable gate for 'is PR #3097 working in production'.
Replaces the curl/jq ritual with one authenticated edge call that returns
HTTP 200 ok:true or 503 + failing check list.
What it validates:
- 8 canonical keys have {_seed, data} envelopes with required data fields
and minRecords floors (fsi-eu, zone-normals, 3 token panels + minRecords
guard against token-panels RETRY regression, product-catalog, wildfire,
earthquakes).
- 2 seed-meta:* keys remain BARE (shouldEnvelopeKey invariant; guards
against iea-oil-stocks ANALYSIS_META_EXTRA_KEY-class regressions).
- /api/product-catalog + /api/bootstrap responses contain no '_seed' leak.
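Implied shape of a DEFAULT_PROBES entry (field names from the checks above; the exact entries and keys may differ):

    { key: 'wildfire:fires:v1', shape: 'envelope', dataHas: ['fireDetections'], minRecords: 1 }
    { key: 'seed-meta:some:key', shape: 'bare' } // hypothetical seed-meta invariant probe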
Auth: x-probe-secret header must match RELAY_SHARED_SECRET (reuses existing
Vercel↔Railway internal trust boundary).
Probe logic is exported (checkProbe, checkPublicBoundary, DEFAULT_PROBES) for
hermetic testing. tests/seed-contract-probe.test.mjs covers every branch:
envelope pass/fail on field/records/shape, bare pass/fail on shape/field,
missing/malformed JSON, Redis non-2xx, boundary seed-leak detection,
DEFAULT_PROBES sanity (seed-meta invariant present, token-panels minRecords
guard present).
Usage:
curl -H "x-probe-secret: $RELAY_SHARED_SECRET" \
https://api.worldmonitor.app/api/seed-contract-probe
PR 3 will extend the probe with a stricter mode that asserts seed-meta:*
keys are GONE (not just bare) once legacy dual-write is removed.
Verification:
- tests/seed-contract-probe.test.mjs → 15/15 pass
- npm run test:data → 5338/5338 (was 5322; +16 new incl. conformance)
- npm run typecheck:all → clean
* fix(seed-contract): tighten probe — minRecords on AI/OTHER + cache-path source header
Review P2 findings: the probe's stated guards were weaker than advertised.
1. market:ai-tokens:v1 + market:other-tokens:v1 probes claimed to guard the
token-panels extra-key RETRY regression but only checked shape='envelope'
+ dataHas:['tokens']. If an extra-key declareRecords regressed to 0, both
probes would still pass because checkProbe() only inspects _seed.recordCount
when minRecords is set. Now both enforce minRecords: 1.
2. /api/product-catalog boundary check only asserted no '_seed' leak — which
is also true for the static fallback path. A broken cached reader
(getFromCache returning null or throwing) could serve fallback silently
and still pass this probe. Now:
- api/product-catalog.js emits X-Product-Catalog-Source: cache|dodo|fallback
on the response (the json() helper gained an optional source param wired
to each of the three branches).
- checkPublicBoundary declaratively requires that header's value match
'cache' for /api/product-catalog, so a fallback-serve fails the probe
with reason 'source:fallback!=cache' or 'source:missing!=cache'.
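Implied shape of a BOUNDARY_CHECKS entry (requireSourceHeader is named in this commit; the surrounding field names are assumed):

    {
      path: '/api/product-catalog',
      forbidSubstring: '_seed',
      requireSourceHeader: { name: 'X-Product-Catalog-Source', expect: 'cache' },
    }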
Test updates (tests/seed-contract-probe.test.mjs):
- Boundary check reworked to use a BOUNDARY_CHECKS config with optional
requireSourceHeader per endpoint.
- New cases: served-from-cache passes, served-from-fallback fails with source
mismatch, missing header fails, seed-leak still takes precedence, bad
status fails.
- Token-panels sanity test now asserts minRecords≥1 on all 3 panels.
Verification:
- tests/seed-contract-probe.test.mjs → 17/17 pass (was 15, +2 net)
- npm run test:data → 5340/5340
- npm run typecheck:all → clean
395 lines · 12 KiB · JavaScript
#!/usr/bin/env node

import {
  acquireLockSafely,
  CHROME_UA,
  extendExistingTtl,
  getRedisCredentials,
  loadEnvFile,
  logSeedResult,
  releaseLock,
  withRetry,
} from './_seed-utils.mjs';
import { resolveIso2 } from './_country-resolver.mjs';
import { unwrapEnvelope } from './_seed-envelope-source.mjs';

loadEnvFile(import.meta.url);

export const EMBER_KEY_PREFIX = 'energy:ember:v1:';
export const EMBER_ALL_KEY = 'energy:ember:v1:_all';
export const EMBER_META_KEY = 'seed-meta:energy:ember';
export const EMBER_TTL_SECONDS = 259200; // 72h = 3× daily cron interval

const EMBER_CSV_URL =
  'https://storage.googleapis.com/emb-prod-bkt-publicdata/public-downloads/monthly_full_release_long_format.csv';
const LOCK_DOMAIN = 'energy:ember';
const LOCK_TTL_MS = 20 * 60 * 1000; // 20 min
const MIN_COUNTRIES = 60;
const MIN_COUNT_RATIO = 0.75; // abort if new count < 75% of previous

function parseDelimitedRow(line, delimiter) {
  const cells = [];
  let current = '';
  let inQuotes = false;

  for (let idx = 0; idx < line.length; idx += 1) {
    const char = line[idx];
    const next = line[idx + 1];

    if (char === '"') {
      if (inQuotes && next === '"') {
        current += '"';
        idx += 1;
      } else {
        inQuotes = !inQuotes;
      }
      continue;
    }

    if (char === delimiter && !inQuotes) {
      cells.push(current);
      current = '';
      continue;
    }

    current += char;
  }

  cells.push(current);
  return cells.map((cell) => cell.trim());
}

function parseDelimitedText(text, delimiter) {
  const lines = text
    .replace(/^\uFEFF/, '')
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter(Boolean);
  if (lines.length < 2) return [];

  const headers = parseDelimitedRow(lines[0], delimiter);
  return lines.slice(1).map((line) => {
    const values = parseDelimitedRow(line, delimiter);
    return Object.fromEntries(headers.map((h, i) => [h, values[i] ?? '']));
  });
}

function safeFloat(value) {
  const n = parseFloat(value);
  return Number.isFinite(n) ? n : null;
}

// Real Ember monthly CSV column names (confirmed from header row)
const COLS = {
  iso3: 'ISO 3 code',
  series: 'Variable',
  unit: 'Unit',
  value: 'Value',
  date: 'Date',
};

/**
 * Parse Ember long-format monthly CSV.
 * Returns Map<iso2, EmberCountryData>.
 * @param {string} csvText
 * @returns {Map<string, {dataMonth: string, fossilShare: number|null, renewShare: number|null, nuclearShare: number|null, coalShare: number|null, gasShare: number|null, demandTwh: number|null}>}
 */
export function parseEmberCsv(csvText) {
  const rows = parseDelimitedText(csvText, ',');
  if (rows.length === 0) throw new Error('Ember CSV: no data rows');

  // Schema sentinel — abort if Fossil series is missing entirely
  const hasFossil = rows.some((r) => String(r[COLS.series] || '').trim() === 'Fossil');
  if (!hasFossil) {
    throw new Error('Ember CSV schema changed — "Fossil" series not found; update parser');
  }

  // Group by ISO 3 code, filter to TWh rows only
  /** @type {Map<string, Array<{date: string, series: string, value: number}>>} */
  const byIso3 = new Map();
  for (const row of rows) {
    const iso3 = String(row[COLS.iso3] || '').trim();
    if (!iso3) continue;
    if (String(row[COLS.unit] || '').trim() !== 'TWh') continue;

    const value = safeFloat(row[COLS.value]);
    if (value === null) continue;

    const series = String(row[COLS.series] || '').trim();
    const date = String(row[COLS.date] || '').trim();
    if (!series || !date) continue;

    if (!byIso3.has(iso3)) byIso3.set(iso3, []);
    byIso3.get(iso3).push({ date, series, value });
  }

  /** @type {Map<string, object>} */
  const result = new Map();

  for (const [iso3, entries] of byIso3) {
    const iso2 = resolveIso2({ iso3 });
    if (!iso2) continue;

    // Find most recent month — max date string (YYYY-MM-DD lexicographic order works)
    const maxDate = entries.reduce((best, e) => (e.date > best ? e.date : best), '');
    if (!maxDate) continue;

    const monthEntries = entries.filter((e) => e.date === maxDate);

    // Build series lookup for this month
    const seriesMap = new Map();
    for (const e of monthEntries) {
      seriesMap.set(e.series, e.value);
    }

    const total = seriesMap.get('Total Generation') ?? null;
    if (!total || total === 0) continue;

    const fossil = seriesMap.get('Fossil') ?? null;
    const renew = seriesMap.get('Renewables') ?? null;
    const nuclear = seriesMap.get('Nuclear') ?? null;
    const coal = seriesMap.get('Coal') ?? null;
    const gas = seriesMap.get('Gas') ?? null;

    const fossilShare = fossil !== null ? (fossil / total) * 100 : null;
    const renewShare = renew !== null ? (renew / total) * 100 : null;
    const nuclearShare = nuclear !== null ? (nuclear / total) * 100 : null;
    const coalShare = coal !== null ? (coal / total) * 100 : null;
    const gasShare = gas !== null ? (gas / total) * 100 : null;

    // dataMonth = YYYY-MM from YYYY-MM-DD
    const dataMonth = maxDate.slice(0, 7);

    result.set(iso2, {
      dataMonth,
      fossilShare,
      renewShare,
      nuclearShare,
      coalShare,
      gasShare,
      demandTwh: total,
    });
  }

  return result;
}

/**
 * Build compact bulk map of all countries keyed by ISO2.
 * Omits redundant fields to reduce payload size.
 * @param {Map<string, object>} countries
 * @returns {Record<string, {dataMonth: string, fossilShare: number|null, renewShare: number|null, nuclearShare: number|null, coalShare: number|null, gasShare: number|null, demandTwh: number|null}>}
 */
export function buildAllCountriesMap(countries) {
  const result = {};
  for (const [iso2, entry] of countries) {
    result[iso2] = {
      dataMonth: entry.dataMonth,
      fossilShare: entry.fossilShare,
      renewShare: entry.renewShare,
      nuclearShare: entry.nuclearShare,
      coalShare: entry.coalShare,
      gasShare: entry.gasShare,
      demandTwh: entry.demandTwh,
    };
  }
  return result;
}

async function redisPipeline(commands) {
  const { url, token } = getRedisCredentials();
  const response = await fetch(`${url}/pipeline`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(commands),
    signal: AbortSignal.timeout(30_000),
  });
  if (!response.ok) {
    const text = await response.text().catch(() => '');
    throw new Error(`Redis pipeline failed: HTTP ${response.status} — ${text.slice(0, 200)}`);
  }
  return response.json();
}

async function redisGet(key) {
  const { url, token } = getRedisCredentials();
  const resp = await fetch(`${url}/get/${encodeURIComponent(key)}`, {
    headers: { Authorization: `Bearer ${token}` },
    signal: AbortSignal.timeout(5_000),
  });
  if (!resp.ok) return null;
  const data = await resp.json();
  return data.result ? unwrapEnvelope(JSON.parse(data.result)).data : null;
}

async function preservePreviousSnapshot(errorMsg, stashedAllMap = null, newCountryKeys = null, dataWritten = false) {
  console.error('[EmberElectricity] Preserving previous snapshot:', errorMsg);

  const existingMeta = await redisGet(EMBER_META_KEY).catch(() => null);

  if (stashedAllMap && typeof stashedAllMap === 'object' && !dataWritten) {
    const restoreCmds = [];
    for (const [iso2, val] of Object.entries(stashedAllMap)) {
      restoreCmds.push([
        'SET', `${EMBER_KEY_PREFIX}${iso2}`, JSON.stringify(val), 'EX', EMBER_TTL_SECONDS,
      ]);
    }
    restoreCmds.push(['SET', EMBER_ALL_KEY, JSON.stringify(stashedAllMap), 'EX', EMBER_TTL_SECONDS]);
    if (newCountryKeys) {
      const oldIso2Set = new Set(Object.keys(stashedAllMap));
      for (const iso2 of newCountryKeys) {
        if (!oldIso2Set.has(iso2)) {
          restoreCmds.push(['DEL', `${EMBER_KEY_PREFIX}${iso2}`]);
        }
      }
    }
    await redisPipeline(restoreCmds).catch((e) =>
      console.error('[EmberElectricity] Snapshot restore failed:', e),
    );
  } else if (!dataWritten) {
    const existingAll = await redisGet(EMBER_ALL_KEY).catch(() => null);
    const iso2Keys = existingAll && typeof existingAll === 'object'
      ? Object.keys(existingAll).map((iso2) => `${EMBER_KEY_PREFIX}${iso2}`)
      : [];

    await extendExistingTtl(
      [...iso2Keys, EMBER_ALL_KEY, EMBER_META_KEY],
      EMBER_TTL_SECONDS,
    );
  }

  const metaPayload = {
    fetchedAt: Date.now(),
    recordCount: existingMeta?.recordCount ?? null,
    status: 'error',
    error: errorMsg,
  };
  await redisPipeline([
    ['SET', EMBER_META_KEY, JSON.stringify(metaPayload), 'EX', EMBER_TTL_SECONDS],
  ]);
}

export async function main() {
  const startedAt = Date.now();
  const runId = `energy:ember:${startedAt}`;
  const lock = await acquireLockSafely(LOCK_DOMAIN, runId, LOCK_TTL_MS, { label: LOCK_DOMAIN });
  if (lock.skipped) {
    console.log('[EmberElectricity] Lock held by concurrent run, skipping');
    return;
  }
  if (!lock.locked) {
    console.log('[EmberElectricity] Lock held by another run, skipping');
    return;
  }

  let oldAllMap = null;
  let newCountryKeys = null;
  let dataWritten = false;

  try {
    const csvText = await withRetry(
      () =>
        fetch(EMBER_CSV_URL, {
          headers: { 'User-Agent': CHROME_UA },
          signal: AbortSignal.timeout(5 * 60 * 1000), // 5 min — large CSV
        }).then((r) => {
          if (!r.ok) throw new Error(`Ember HTTP ${r.status}`);
          return r.text();
        }),
      2,
      2000,
    );

    const countries = parseEmberCsv(csvText);
    console.log(`[EmberElectricity] Parsed ${countries.size} countries`);

    if (countries.size < MIN_COUNTRIES) {
      throw new Error(
        `Ember: only ${countries.size} countries parsed, expected >=${MIN_COUNTRIES}`,
      );
    }

    // Count-drop guard: abort if new count < 75% of previous
    const prevMeta = await redisGet(EMBER_META_KEY).catch(() => null);
    if (prevMeta && typeof prevMeta === 'object' && prevMeta.recordCount > 0) {
      if (countries.size < prevMeta.recordCount * MIN_COUNT_RATIO) {
        throw new Error(
          `Ember: country count dropped from ${prevMeta.recordCount} to ${countries.size} (<75% threshold) — aborting`,
        );
      }
    }

    newCountryKeys = new Set(countries.keys());

    const allCountriesMap = buildAllCountriesMap(countries);

    const metaPayload = {
      fetchedAt: Date.now(),
      recordCount: countries.size,
      sourceVersion: 'ember-monthly-v1',
    };

    // Stash old _all for restore on failure
    oldAllMap = await redisGet(EMBER_ALL_KEY).catch(() => null);

    // Phase A: write all per-country keys + _all in a single pipeline
    const dataCommands = [];
    for (const [iso2, payload] of countries) {
      dataCommands.push([
        'SET',
        `${EMBER_KEY_PREFIX}${iso2}`,
        JSON.stringify(payload),
        'EX',
        EMBER_TTL_SECONDS,
      ]);
    }
    dataCommands.push([
      'SET',
      EMBER_ALL_KEY,
      JSON.stringify(allCountriesMap),
      'EX',
      EMBER_TTL_SECONDS,
    ]);

    // DEL obsolete per-country keys no longer in the new dataset
    const oldIso2Set = oldAllMap && typeof oldAllMap === 'object' ? new Set(Object.keys(oldAllMap)) : new Set();
    for (const iso2 of oldIso2Set) {
      if (!newCountryKeys.has(iso2)) {
        dataCommands.push(['DEL', `${EMBER_KEY_PREFIX}${iso2}`]);
      }
    }

    const dataResults = await redisPipeline(dataCommands);
    const dataFailures = dataResults.filter((r) => r?.error || r?.result === 'ERR');
    if (dataFailures.length > 0) {
      throw new Error(
        `Redis pipeline: ${dataFailures.length}/${dataCommands.length} data commands failed`,
      );
    }
    dataWritten = true;

    // Phase B: seed-meta (only after all data is fully written)
    await redisPipeline([['SET', EMBER_META_KEY, JSON.stringify(metaPayload), 'EX', EMBER_TTL_SECONDS]]);

    logSeedResult('energy:ember', countries.size, Date.now() - startedAt);
    console.log(`[EmberElectricity] Seeded ${countries.size} countries`);
  } catch (err) {
    await preservePreviousSnapshot(String(err), oldAllMap, newCountryKeys, dataWritten).catch((e) =>
      console.error('[EmberElectricity] Failed to preserve snapshot:', e),
    );
    throw err;
  } finally {
    await releaseLock(LOCK_DOMAIN, runId);
  }
}

if (process.argv[1]?.endsWith('seed-ember-electricity.mjs')) {
  main().catch((err) => {
    console.error(err);
    process.exit(1);
  });
}