worldmonitor/scripts/ais-relay.cjs
Sebastien Melki b1d835b69f feat: HappyMonitor — positive news dashboard (happy.worldmonitor.app) (#229)
* chore: add project config

* docs: add domain research (stack, features, architecture, pitfalls)

* docs: define v1 requirements

* docs: create roadmap (9 phases)

* docs(01): capture phase context

* docs(state): record phase 1 context session

* docs(01): research phase domain

* docs(01): create phase plan

* fix(01): revise plans based on checker feedback

* feat(01-01): register happy variant in config system and build tooling

- Add 'happy' to allowed stored variants in variant.ts
- Create variants/happy.ts with panels, map layers, and VariantConfig
- Add HAPPY_PANELS, HAPPY_MAP_LAYERS, HAPPY_MOBILE_MAP_LAYERS inline in panels.ts
- Update ternary export chains to select happy config when SITE_VARIANT === 'happy'
- Add happy entry to VARIANT_META in vite.config.ts
- Add dev:happy and build:happy scripts to package.json

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
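
The ternary export chain described above can be sketched as follows. SITE_VARIANT and the *_PANELS names come from this commit; the panel key lists are invented placeholders, not the real configs:

```typescript
// Minimal sketch of the variant-selecting export in panels.ts.
type Variant = 'full' | 'tech' | 'finance' | 'happy';

const FULL_PANELS = ['live-news', 'markets'];
const TECH_PANELS = ['tech-events', 'service-status'];
const FINANCE_PANELS = ['markets', 'etf-flows'];
const HAPPY_PANELS = ['positive-feed', 'counters'];

// One ternary chain per exported config; 'happy' is checked first here,
// but any order works as long as each variant maps to its own constant.
function selectPanels(variant: Variant): string[] {
  return variant === 'happy' ? HAPPY_PANELS
    : variant === 'tech' ? TECH_PANELS
    : variant === 'finance' ? FINANCE_PANELS
    : FULL_PANELS;
}
```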

* feat(01-01): update index.html for variant detection, CSP, and Google Fonts

- Add happy.worldmonitor.app to CSP frame-src directive
- Extend inline script to detect variant from hostname (happy/tech/finance) and localStorage
- Set data-variant attribute on html element before first paint to prevent FOUC
- Add Google Fonts preconnect and Nunito stylesheet links
- Add favicon variant path replacement in htmlVariantPlugin for non-full variants

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
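
The hostname/localStorage detection above can be sketched as a pure function (the real inline script reads `location.hostname` and localStorage directly; the subdomain mapping here is an assumption based on the domains in this commit):

```typescript
// Sketch of the before-first-paint variant detection in index.html.
function detectVariant(hostname: string, stored?: string | null): string {
  if (hostname.startsWith('happy.')) return 'happy';
  if (hostname.startsWith('tech.')) return 'tech';
  if (hostname.startsWith('finance.')) return 'finance';
  // Fall back to a previously stored preference, then the full variant.
  return stored ?? 'full';
}

// The inline script would then set, before any CSS paints:
//   document.documentElement.dataset.variant =
//     detectVariant(location.hostname, localStorage.getItem('variant'));
```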

* feat(01-01): create happy variant favicon assets

- Create SVG globe favicon in sage green (#6B8F5E) and warm gold (#C4A35A)
- Generate PNG favicons at all required sizes (16, 32, 180, 192, 512)
- Generate favicon.ico with PNG-in-ICO wrapper
- Create branded OG image (1200x630) with cream background, sage/gold scheme

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(01-01): complete variant registration plan

- Create 01-01-SUMMARY.md documenting variant registration
- Update STATE.md with plan 1 completion, metrics, decisions
- Update ROADMAP.md with phase 01 progress (1/3 plans)
- Mark INFRA-01, INFRA-02, INFRA-03 requirements complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(01-02): create happy variant CSS theme with warm palette and semantic overrides

- Complete happy-theme.css with light mode (cream/sage), dark mode (navy/warm), and semantic colors
- 179 lines covering all CSS custom properties: backgrounds, text, borders, overlays, map, panels
- Nunito typography and 14px panel border radius for soft rounded aesthetic
- Semantic colors remapped: gold (critical), sage (growth), blue (hope), pink (kindness)
- Dark mode uses warm navy/sage tones, never pure black
- Import added to main.css after panels.css

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(01-02): add happy variant skeleton shell overrides and theme-color meta

- Inline skeleton styles for happy variant light mode (cream bg, Nunito font, sage dot, warm shimmer)
- Inline skeleton styles for happy variant dark mode (navy bg, warm borders, sage tones)
- Rounded corners (14px) on skeleton panels and map for soft aesthetic
- Softer pill border-radius (8px) in happy variant
- htmlVariantPlugin: theme-color meta updated to #FAFAF5 for happy variant mobile chrome

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(01-02): complete happy theme CSS plan

- SUMMARY.md with execution results and self-check
- STATE.md advanced to plan 2/3, decisions logged
- ROADMAP.md progress updated (2/3 plans complete)
- REQUIREMENTS.md: THEME-01, THEME-03, THEME-04 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(01-03): create warm basemap styles and wire variant-aware map selection

- Add happy-light.json: sage land, cream background, light blue ocean (forked from CARTO Voyager)
- Add happy-dark.json: dark sage land, navy background, dark navy ocean (forked from CARTO Dark Matter)
- Both styles preserve CARTO CDN source/sprite/glyph URLs for tile loading
- DeckGLMap.ts selects happy basemap URLs when SITE_VARIANT is 'happy'

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(01-03): style panel chrome, empty states, and loading for happy variant

- Panels get 14px rounded corners with subtle warm shadows
- Panel titles use normal casing (no uppercase) for friendlier feel
- Empty states (.panel-empty, .empty-state) show a nature-themed sprout SVG icon
- Loading radar animation softened to 3s rotation with sage-green glow
- Status dots use gentle happy-pulse animation (2.5s ease-in-out)
- Error states use warm gold tones instead of harsh red
- Map controls, tabs, badges all get rounded corners
- Severity badges use warm semantic colors
- Download banner and posture radar adapted to warm theme

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(01-03): bridge SITE_VARIANT to data-variant attribute on <html>

The CSS theme overrides rely on [data-variant="happy"] on the document root,
but the inline script only detects variant from hostname/localStorage. This
leaves local dev (VITE_VARIANT=happy) and Vercel deployments without the
attribute set. Two fixes:

1. main.ts sets document.documentElement.dataset.variant from SITE_VARIANT
2. Vite htmlVariantPlugin injects build-time variant fallback into inline script

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(01-03): boost CSS specificity so happy theme wins over :root

The happy-theme.css was imported before :root in main.css, and both
[data-variant="happy"] and :root have equal specificity (0-1-0), so
:root variables won after in the cascade. Fix by using :root[data-variant="happy"]
(specificity 0-2-0) which always beats :root (0-1-0).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
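
The specificity arithmetic above can be sketched as follows (illustrative selectors and property, not the actual theme file):

```css
/* :root alone: one pseudo-class → specificity 0-1-0 */
:root {
  --panel-bg: #ffffff;
}

/* [data-variant="happy"] alone: one attribute selector → also 0-1-0,
   so source order decides — and the :root block loaded later, winning. */
[data-variant="happy"] {
  --panel-bg: #fafaf5;
}

/* :root[data-variant="happy"]: pseudo-class + attribute → 0-2-0,
   which beats :root regardless of import order. */
:root[data-variant="happy"] {
  --panel-bg: #fafaf5;
}
```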

* fix(01): fix CSS cascade — import happy-theme after main.css in main.ts

The root cause: happy-theme.css was @imported inside main.css (line 4),
which meant Vite loaded it BEFORE the :root block (line 9+). With equal
specificity, the later :root variables always won.

Fix: remove @import from main.css, import happy-theme.css directly in
main.ts after main.css. This ensures cascade order is correct — happy
theme variables come last and win. No !important needed.

Also consolidated semantic color variables into the same selector blocks
to reduce redundancy.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(01): fix CSS cascade with @layer base and theme toggle for happy variant

- Wrap main.css in @layer base via base-layer.css so happy-theme.css
  (unlayered) always wins the cascade for custom properties
- Remove duplicate <link> stylesheet from index.html (was double-loading)
- Default happy variant to light theme (data-theme="light") so the
  theme toggle works on first click instead of requiring two clicks
- Force build-time variant in inline script — stale localStorage can no
  longer override the deployment variant
- Prioritize VITE_VARIANT env over localStorage in variant.ts so
  variant-specific builds are deterministic

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
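
The @layer arrangement described above can be sketched as follows (file names from this commit; contents illustrative). Unlayered styles beat layered ones at equal specificity, which is what makes this robust to import order:

```css
/* base-layer.css — wraps the default theme in a named layer */
@layer base;
@import url('./main.css') layer(base);

/* happy-theme.css stays unlayered, so its custom properties win over
   anything in @layer base no matter where the import lands. */
:root[data-variant='happy'] {
  --panel-bg: #fafaf5;
}
```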

* docs(01-03): complete map basemap & panel chrome plan — Phase 1 done

- Add 01-03-SUMMARY.md with task commits, deviations, and self-check
- Update STATE.md: Phase 1 complete, advance to ready for Phase 2
- Update ROADMAP.md: mark Phase 1 plans 3/3 complete
- Update REQUIREMENTS.md: mark THEME-02 and THEME-05 complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-01): complete phase execution

* docs(phase-02): research curated content pipeline

* docs(02): create phase plan — curated content pipeline

* feat(02-01): add positive RSS feeds for happy variant

- Add HAPPY_FEEDS record with 8 feeds across 5 categories (positive, science, nature, health, inspiring)
- Update FEEDS export ternary to route happy variant to HAPPY_FEEDS
- Add happy source tiers to SOURCE_TIERS (Tier 2 for main sources, Tier 3 for category feeds)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(02-01): extend GDELT with tone filtering and positive topic queries

- Add tone_filter (field 4) and sort (field 5) to SearchGdeltDocumentsRequest proto
- Regenerate TypeScript client/server types via buf generate
- Handler appends toneFilter to GDELT query string, uses req.sort for sort param
- Add POSITIVE_GDELT_TOPICS array with 5 positive topic queries
- Add fetchPositiveGdeltArticles() with tone>5 and ToneDesc defaults
- Add fetchPositiveTopicIntelligence() and fetchAllPositiveTopicIntelligence() helpers
- Existing fetchGdeltArticles() backward compatible (empty toneFilter/sort = no change)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
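
The handler change can be sketched as URL construction against the public GDELT DOC 2.0 API, which accepts tone expressions (e.g. `tone>5`) inside the query string and a `sort` parameter. The function name and defaults are assumptions; only the parameter names follow the GDELT API:

```typescript
// Hypothetical sketch of appending toneFilter and sort to a GDELT request.
function buildGdeltUrl(query: string, toneFilter = '', sort = ''): string {
  // Empty toneFilter/sort leaves existing callers' URLs unchanged,
  // which is the backward-compatibility property noted above.
  const q = toneFilter ? `${query} ${toneFilter}` : query;
  const params = new URLSearchParams({
    query: q,
    mode: 'ArtList',
    format: 'json',
  });
  if (sort) params.set('sort', sort);
  return `https://api.gdeltproject.org/api/v2/doc/doc?${params.toString()}`;
}
```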

* docs(02-01): complete positive feeds & GDELT tone filtering plan

- Create 02-01-SUMMARY.md with execution results
- Update STATE.md: phase 2, plan 1 of 2, decisions, metrics
- Update ROADMAP.md: phase 02 progress (1/2 plans)
- Mark FEED-01 and FEED-03 requirements complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(02-02): add positive content classifier and extend NewsItem type

- Create positive-classifier.ts with 6 content categories (science-health, nature-wildlife, humanity-kindness, innovation-tech, climate-wins, culture-community)
- Source-based pre-mapping for GNN category feeds (fast path)
- Priority-ordered keyword classification for general positive feeds (slow path)
- Add happyCategory optional field to NewsItem interface
- Export HAPPY_CATEGORY_LABELS and HAPPY_CATEGORY_ALL for downstream UI use

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
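
The priority-ordered keyword classification (the "slow path" above) can be sketched as follows. The category names come from this commit; the keyword lists are invented for illustration and are far smaller than the real classifier's:

```typescript
type HappyCategory =
  | 'science-health' | 'nature-wildlife' | 'humanity-kindness'
  | 'innovation-tech' | 'climate-wins' | 'culture-community';

// First matching rule wins — array order encodes priority.
const KEYWORD_RULES: Array<[HappyCategory, string[]]> = [
  ['science-health', ['vaccine', 'cure', 'breakthrough']],
  ['nature-wildlife', ['species', 'wildlife', 'forest']],
  ['climate-wins', ['solar', 'renewable', 'emissions']],
  ['humanity-kindness', ['volunteer', 'donated', 'rescue']],
];

function classifyTitle(title: string): HappyCategory | undefined {
  const lower = title.toLowerCase();
  for (const [category, keywords] of KEYWORD_RULES) {
    if (keywords.some((k) => lower.includes(k))) return category;
  }
  return undefined; // leave the item uncategorized rather than guess
}
```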

* chore(02-02): clean up happy variant config and verify feed wiring

- Remove dead FEEDS placeholder from happy.ts (now handled by HAPPY_FEEDS in feeds.ts)
- Remove unused Feed type import
- Verified SOURCE_TIERS has all 8 happy feed entries (Tier 2: GNN/Positive.News/RTBC/Optimist, Tier 3: GNN category feeds)
- Verified FEEDS export routes to HAPPY_FEEDS when SITE_VARIANT=happy
- Verified App.ts loadNews() dynamically iterates FEEDS keys
- Happy variant builds successfully

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(02-02): complete content category classifier plan

- SUMMARY.md documenting classifier implementation and feed wiring cleanup
- STATE.md updated: Phase 2 complete, 5 total plans done, 56% progress
- ROADMAP.md updated: Phase 02 marked complete (2/2 plans)
- REQUIREMENTS.md: FEED-04 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(02-03): create gap closure plan for classifier wiring

* feat(02-03): wire classifyNewsItem into happy variant news ingestion

- Import classifyNewsItem from positive-classifier service
- Add classification step in loadNewsCategory() after fetchCategoryFeeds
- Guard with SITE_VARIANT === 'happy' to avoid impact on other variants
- In-place mutation via for..of loop sets happyCategory on every NewsItem

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(02-03): complete classifier wiring gap closure plan

- Add 02-03-SUMMARY.md documenting classifier wiring completion
- Update STATE.md with plan 3/3 position and decisions
- Update ROADMAP.md with completed plan checkboxes
- Include 02-VERIFICATION.md phase verification document

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-2): complete phase execution

* test(02): complete UAT - 1 passed, 1 blocker diagnosed

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-3): research positive news feed & quality pipeline

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(03): create phase plan for positive news feed and quality pipeline

* fix(03): revise plans based on checker feedback

* feat(03-02): add imageUrl to NewsItem and extract images from RSS

- Add optional imageUrl field to NewsItem interface
- Add extractImageUrl() helper to rss.ts with 4-strategy image extraction
  (media:content, media:thumbnail, enclosure, img-in-description)
- Wire image extraction into fetchFeed() for happy variant only
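
The 4-strategy extraction can be sketched as ordered regex probes over a raw RSS `<item>` string. The real rss.ts likely operates on parsed XML; regexes keep the sketch self-contained, and attribute order in strategy 3 is an assumption:

```typescript
function extractImageUrl(itemXml: string): string | undefined {
  const strategies = [
    /<media:content[^>]*url="([^"]+)"/,             // 1. media:content
    /<media:thumbnail[^>]*url="([^"]+)"/,           // 2. media:thumbnail
    /<enclosure[^>]*url="([^"]+)"[^>]*type="image/, // 3. image enclosure
    /<img[^>]*src="([^"]+)"/,                       // 4. <img> in description
  ];
  for (const re of strategies) {
    const m = itemXml.match(re);
    if (m) return m[1];
  }
  return undefined;
}
```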

* feat(03-01): add happy variant guards to all App.ts code paths

- Skip DEFCON/PizzInt indicator for happy variant
- Add happy variant link (sun icon) to variant switcher header
- Show 'Good News Map' title for happy variant map section
- Skip LiveNewsPanel, LiveWebcams, TechEvents, ServiceStatus, TechReadiness, MacroSignals, ETFFlows, Stablecoin panels for happy
- Gate live-news first-position logic with happy exclusion
- Only load 'news' data for happy variant (skip markets, predictions, pizzint, fred, oil, spending, intelligence, military layers)
- Only schedule 'news' refresh interval for happy (skip all geopolitical/financial refreshes)
- Add happy-specific search modal with positive placeholder and no military/geopolitical sources

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(03-02): create PositiveNewsFeedPanel with filter bar and card rendering

- New PositiveNewsFeedPanel component extending Panel with:
  - Category filter bar (All + 6 positive categories)
  - Rich card rendering with image, title, source, category badge, time
  - Filter state preserved across data refreshes
  - Proper cleanup in destroy()
- Add CSS styles to happy-theme.css for cards and filter bar
  - Category-specific badge colors using theme variables
  - Scoped under [data-variant="happy"] to avoid affecting other variants

* feat(03-01): return empty channels for happy variant in LiveNewsPanel

- Defense-in-depth: LIVE_CHANNELS returns empty array for happy variant
- Ensures zero Bloomberg/war streams even if the panel is somehow instantiated
- Combined with createPanels() guard from Task 1 for belt-and-suspenders safety

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(03-02): complete positive news feed panel plan

- Created 03-02-SUMMARY.md with execution results
- Updated STATE.md with position, decisions, and metrics
- Updated ROADMAP.md with phase 03 progress (2/3 plans)
- Marked NEWS-01, NEWS-02 requirements as complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(03-01): complete Happy Variant App.ts Integration plan

- SUMMARY.md with execution results and decisions
- STATE.md updated with 03-01 decisions and session info
- ROADMAP.md progress updated (2/3 phase 3 plans)
- NEWS-03 requirement marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(03-03): create sentiment gate service for ML-based filtering

- Exports filterBySentiment() wrapping mlWorker.classifySentiment()
- Default threshold 0.85 with localStorage override for tuning
- Graceful degradation: returns all items if ML unavailable
- Batches titles at 20 items per call (ML_THRESHOLDS.maxTextsPerBatch)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
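
The batching, threshold, and graceful-degradation behavior above can be sketched with the classifier injected (the real service wraps mlWorker.classifySentiment(); names and the score convention are assumptions):

```typescript
const MAX_TEXTS_PER_BATCH = 20;   // ML_THRESHOLDS.maxTextsPerBatch in the commit
const DEFAULT_THRESHOLD = 0.85;

type Classifier = (titles: string[]) => Promise<number[]>; // positivity 0..1

async function filterBySentiment<T extends { title: string }>(
  items: T[],
  classify: Classifier,
  threshold = DEFAULT_THRESHOLD,
): Promise<T[]> {
  try {
    const kept: T[] = [];
    for (let i = 0; i < items.length; i += MAX_TEXTS_PER_BATCH) {
      const batch = items.slice(i, i + MAX_TEXTS_PER_BATCH);
      const scores = await classify(batch.map((it) => it.title));
      batch.forEach((it, j) => {
        if (scores[j] >= threshold) kept.push(it);
      });
    }
    return kept;
  } catch {
    // Graceful degradation: if the ML worker is unavailable, keep everything.
    return items;
  }
}
```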

* feat(03-03): wire multi-stage quality pipeline and positive-feed panel into App.ts

- Register 'positive-feed' in HAPPY_PANELS replacing 'live-news'
- Import PositiveNewsFeedPanel, filterBySentiment, fetchAllPositiveTopicIntelligence
- Add positivePanel + happyAllItems class properties
- Create PositiveNewsFeedPanel in createPanels() for happy variant
- Accumulate curated items in loadNewsCategory() for happy variant
- Implement loadHappySupplementaryAndRender() 4-stage pipeline:
  1. Curated feeds render immediately (non-blocking UX)
  2. GDELT positive articles fetched as supplementary
  3. Sentiment-filtered via DistilBERT-SST2 (filterBySentiment)
  4. Merged + sorted by date, re-rendered
- Auto-refresh on REFRESH_INTERVALS.feeds re-runs full pipeline
- ML failure degrades gracefully to curated-only display

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(03-03): complete quality pipeline plan - phase 3 done

- Summary: multi-stage positive news pipeline with ML sentiment gate
- STATE.md: phase 3 complete (3/3), 89% progress
- ROADMAP.md: phase 03 marked complete
- REQUIREMENTS.md: FEED-02, FEED-05 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(03): wire positive-feed panel key in panels.ts and add happy map layer/legend config

The executor updated happy.ts but the actual HAPPY_PANELS export comes from
panels.ts — it still had 'live-news' instead of 'positive-feed', so the panel
never rendered. Also adds happyLayers (natural only) and happy legend to Map.ts
to hide military layer toggles and geopolitical legend items.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-3): complete phase execution

* docs(phase-4): research global map & positive events

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(04): create phase plan — global map & positive events

* fix(04): revise plans based on checker feedback

* feat(04-01): add positiveEvents and kindness keys to MapLayers interface and all variant configs

- Add positiveEvents and kindness boolean keys to MapLayers interface
- Update all 10 variant layer configs (8 in panels.ts + 2 in happy.ts)
- Happy variant: positiveEvents=true, kindness=true; all others: false
- Fix variant config files (full, tech, finance) and e2e harnesses for compilation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(04-01): add happy variant layer toggles and legend in DeckGLMap

- Add happy branch to createLayerToggles with 3 toggles: Positive Events, Acts of Kindness, Natural Events
- Add happy branch to createLegend with 4 items: Positive Event (green), Breakthrough (gold), Act of Kindness (light green), Natural Event (orange)
- Non-happy variants unchanged

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(04-01): complete map layer config & happy variant toggles plan

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(04-02): add positive events geocoding pipeline and map layer

- Proto service PositiveEventsService with ListPositiveGeoEvents RPC
- Server-side GDELT GEO fetch with positive topic queries, dedup, classification
- Client-side service calling server RPC + RSS geocoding via inferGeoHubsFromTitle
- DeckGLMap green/gold ScatterplotLayer with pulse animation for significant events
- Tooltip shows event name, category, and report count
- Routes registered in api gateway and vite dev server

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(04-02): wire positive events loading into App.ts happy variant pipeline

- Import fetchPositiveGeoEvents and geocodePositiveNewsItems
- Load positive events in loadAllData() for happy variant with positiveEvents toggle
- loadPositiveEvents() merges GDELT GEO RPC + geocoded RSS items, deduplicates by name
- loadDataForLayer switch case for toggling positiveEvents layer on/off
- MapContainer.setPositiveEvents() delegates to DeckGLMap

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(04-02): complete positive events geocoding pipeline plan

- SUMMARY.md with task commits, decisions, deviations
- STATE.md updated with position, metrics, decisions
- ROADMAP.md and REQUIREMENTS.md updated

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(04-03): create kindness-data service with baseline generator and curated events

- Add KindnessPoint interface for map visualization data
- Add MAJOR_CITIES constant with ~60 cities worldwide (population-weighted)
- Implement generateBaselineKindness() producing 50-80 synthetic points per cycle
- Implement extractKindnessEvents() for real kindness items from curated news
- Export fetchKindnessData() merging baseline + real events

* feat(04-03): add kindness layer to DeckGLMap and wire into App.ts pipeline

- Add createKindnessLayers() with solid green fill + gentle pulse ring for real events
- Add kindness-layer tooltip showing city name and description
- Add setKindnessData() setter in DeckGLMap and MapContainer
- Wire loadKindnessData() into App.ts loadAllData and loadDataForLayer
- Kindness layer gated by mapLayers.kindness toggle (happy variant only)
- Pulse animation triggers when real kindness events are present

* docs(04-03): complete kindness data pipeline & map layer plan

- Create 04-03-SUMMARY.md documenting kindness layer implementation
- Update STATE.md: phase 04 complete (3/3 plans), advance position
- Update ROADMAP.md: phase 04 marked complete
- Mark KIND-01 and KIND-02 requirements as complete

* docs(phase-4): complete phase execution

* docs(phase-5): research humanity data panels domain

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(05-humanity-data-panels): create phase plan

* feat(05-01): create humanity counters service with metric definitions and rate calculations

- Define 6 positive global metrics with annual totals from UN/WHO/World Bank/UNESCO
- Calculate per-second rates from annual totals / 31,536,000 seconds
- Absolute-time getCounterValue() avoids drift across tabs/throttling
- Locale-aware formatCounterValue() using Intl.NumberFormat
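
The drift-free counter math can be sketched as a pure function of wall-clock time: the rate derives from an annual total, and every tab (or a throttled rAF loop) computes the same value from the same timestamp. The interface shape is an assumption:

```typescript
const SECONDS_PER_YEAR = 31_536_000; // 365 * 24 * 3600

interface CounterDef {
  annualTotal: number; // e.g. a UN/WHO/World Bank yearly figure
  epochMs: number;     // fixed reference instant, e.g. start of the year
}

// Absolute-time calculation: no accumulated increments, hence no drift.
function getCounterValue(def: CounterDef, nowMs: number): number {
  const perSecond = def.annualTotal / SECONDS_PER_YEAR;
  const elapsedSeconds = Math.max(0, (nowMs - def.epochMs) / 1000);
  return Math.floor(elapsedSeconds * perSecond);
}
```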

* feat(05-02): install papaparse and create progress data service

- Install papaparse + @types/papaparse for potential OWID CSV fallback
- Create src/services/progress-data.ts with 4 World Bank indicators
- Export PROGRESS_INDICATORS (life expectancy, literacy, child mortality, poverty)
- Export fetchProgressData() using existing getIndicatorData() RPC
- Null value filtering, year sorting, invertTrend-aware change calculation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(05-01): create CountersPanel component with 60fps animated ticking numbers

- Extend Panel base class with counters-grid of 6 counter cards
- requestAnimationFrame loop updates all values at 60fps
- Absolute-time calculation via getCounterValue() prevents drift
- textContent updates (not innerHTML) avoid layout thrashing
- startTicking() / destroy() lifecycle methods for App.ts integration

* feat(05-02): create ProgressChartsPanel with D3.js area charts

- Extend Panel base class with id 'progress', title 'Human Progress'
- Render 4 stacked D3 area charts (life expectancy, literacy, child mortality, poverty)
- Warm happy-theme colors: sage green, soft blue, warm gold, muted rose
- d3.area() with curveMonotoneX for smooth filled curves
- Header with label, change badge (e.g., "+58.0% since 1960"), and unit
- Hover tooltip with bisector-based nearest data point detection
- ResizeObserver with 200ms debounce for responsive re-rendering
- Clean destroy() lifecycle with observer disconnection

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(05-01): complete ticking counters service & panel plan

- SUMMARY.md with execution results and self-check
- STATE.md updated to phase 5, plan 1/3
- ROADMAP.md progress updated
- Requirements COUNT-01, COUNT-02, COUNT-03 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(05-02): complete progress charts panel plan

- Create 05-02-SUMMARY.md with execution results
- Update STATE.md: plan 2/3, decisions, metrics
- Update ROADMAP.md: phase 05 progress (2/3 plans)
- Mark PROG-01, PROG-02, PROG-03 complete in REQUIREMENTS.md

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(05-03): wire CountersPanel and ProgressChartsPanel into App.ts lifecycle

- Import CountersPanel, ProgressChartsPanel, and fetchProgressData
- Add class properties for both new panels
- Instantiate both panels in createPanels() gated by SITE_VARIANT === 'happy'
- Add progress data loading task in refreshAll() for happy variant
- Add loadProgressData() private method calling fetchProgressData + setData
- Add destroy() cleanup for both panels (stops rAF loop and ResizeObserver)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(05-03): add counter and progress chart CSS styles to happy-theme.css

- Counters grid: responsive 3-column layout (3/2/1 at 900px/500px breakpoints)
- Counter cards: hover lift, tabular-nums for jitter-free 60fps updates
- Counter icon/value/label/source typography hierarchy
- Progress chart containers: stacked with border dividers
- Chart header with label, badge, and unit display
- D3 SVG axis styling (tick text fill, domain stroke)
- Hover tooltip with absolute positioning and shadow
- Dark mode adjustments for card hover shadow and tooltip shadow
- All selectors scoped under [data-variant='happy']

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(05-03): complete panel wiring & CSS plan

- Create 05-03-SUMMARY.md with execution results
- Update STATE.md: phase 5 complete (3/3 plans), decisions, metrics
- Update ROADMAP.md: phase 05 progress (3/3 summaries, Complete)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-5): complete phase execution

* docs(06): research phase 6 content spotlight panels

* docs(phase-6): create phase plan

* feat(06-01): add science RSS feeds and BreakthroughsTickerPanel

- Expand HAPPY_FEEDS.science from 1 to 5 feeds (ScienceDaily, Nature News, Live Science, New Scientist)
- Create BreakthroughsTickerPanel extending Panel with horizontal scrolling ticker
- Doubled content rendering for seamless infinite CSS scroll animation
- Sanitized HTML output using escapeHtml/sanitizeUrl

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
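
The doubled-content scroll trick can be sketched as follows (class names invented; the keyframe name matches this branch's later CSS commit). Because the track holds the item list twice, animating to -50% lands exactly on the duplicate's start, making the loop point invisible:

```css
@keyframes happy-ticker-scroll {
  from { transform: translateX(0); }
  to   { transform: translateX(-50%); }
}

.ticker-track {
  display: inline-flex;
  animation: happy-ticker-scroll 60s linear infinite;
}

.ticker-track:hover {
  animation-play-state: paused; /* let readers stop on an item */
}
```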

* feat(06-01): create HeroSpotlightPanel with photo, map location, and hero card

- Create HeroSpotlightPanel extending Panel for daily hero spotlight
- Render hero card with image, source, title, time, and optional map button
- Conditionally show "Show on map" button only when both lat and lon exist
- Expose onLocationRequest callback for App.ts map integration wiring
- Sanitized HTML output using escapeHtml/sanitizeUrl

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(06-02): add GoodThingsDigestPanel with progressive AI summarization

- Panel extends Panel base class with id 'digest', title '5 Good Things'
- Renders numbered story cards with titles immediately (progressive rendering)
- Summarizes each story in parallel via generateSummary() with Promise.allSettled
- AbortController cancels in-flight summaries on re-render or destroy
- Graceful fallback to truncated title on summarization failure
- Passes [title, source] to satisfy generateSummary's 2-headline minimum

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(06-02): complete Good Things Digest Panel plan

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(06-01): complete content spotlight panels plan

- Add 06-01-SUMMARY.md with execution results
- Update STATE.md with position, decisions, metrics
- Update ROADMAP.md and REQUIREMENTS.md progress

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(06-03): wire Phase 6 panels into App.ts lifecycle and update happy.ts config

- Import and instantiate BreakthroughsTickerPanel, HeroSpotlightPanel, GoodThingsDigestPanel in createPanels()
- Wire heroPanel.onLocationRequest callback to map.setCenter + map.flashLocation
- Distribute data to all three panels after content pipeline in loadHappySupplementaryAndRender()
- Add destroy calls for all three panels in App.destroy()
- Add digest key to DEFAULT_PANELS in happy.ts config

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(06-03): add CSS styles for ticker, hero card, and digest panels

- Add happy-ticker-scroll keyframe animation for infinite horizontal scroll
- Add breakthroughs ticker styles (wrapper, track, items with hover pause)
- Add hero spotlight card styles (image, body, source, title, location button)
- Add digest list styles (numbered cards, titles, sources, progressive summaries)
- Add dark mode overrides for all three panel types
- All selectors scoped under [data-variant="happy"]

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(06-03): complete panel wiring & CSS plan

- Create 06-03-SUMMARY.md with execution results
- Update STATE.md: phase 6 complete, 18 plans done, 78% progress
- Update ROADMAP.md: phase 06 marked complete (3/3 plans)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-6): complete phase execution

* docs(07): research conservation & energy trackers phase

* docs(07-conservation-energy-trackers): create phase plan

* feat(07-02): add renewable energy data service

- Fetch World Bank EG.ELC.RNEW.ZS indicator (IEA-sourced) for global + 7 regions
- Return global percentage, historical time-series, and regional breakdown
- Graceful degradation: individual region failures skipped, complete failure returns zeroed data
- Follow proven progress-data.ts pattern for getIndicatorData() RPC usage

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(07-01): add conservation wins dataset and data service

- Create conservation-wins.json with 10 species recovery stories and population timelines
- Create conservation-data.ts with SpeciesRecovery interface and fetchConservationWins() loader
- Species data sourced from USFWS, IUCN, NOAA, WWF, and other published reports

* feat(07-02): add RenewableEnergyPanel with D3 arc gauge and regional breakdown

- Animated D3 arc gauge showing global renewable electricity % with 1.5s easeCubicOut
- Historical trend sparkline using d3.area() + curveMonotoneX below gauge
- Regional breakdown with horizontal bars sorted by percentage descending
- All colors use getCSSColor() for theme-aware rendering
- Empty state handling when no data available

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(07-01): add SpeciesComebackPanel with D3 sparklines and species cards

- Create SpeciesComebackPanel extending Panel base class
- Render species cards with photo (lazy loading + error fallback), info badges, D3 sparkline, and summary
- D3 sparklines use area + line with curveMonotoneX and viewBox for responsive sizing
- Recovery status badges (recovered/recovering/stabilized) and IUCN category badges
- Population values formatted with Intl.NumberFormat for readability

* docs(07-02): complete renewable energy panel plan

- SUMMARY.md with task commits, decisions, self-check
- STATE.md updated to phase 7 plan 2, 83% progress
- ROADMAP.md phase 07 progress updated
- REQUIREMENTS.md: ENERGY-01, ENERGY-02, ENERGY-03 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(07-01): complete species comeback panel plan

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(07-03): wire species and renewable panels into App.ts lifecycle

- Add imports for SpeciesComebackPanel, RenewableEnergyPanel, and data services
- Add class properties for speciesPanel and renewablePanel
- Instantiate both panels in createPanels() gated by SITE_VARIANT === 'happy'
- Add loadSpeciesData() and loadRenewableData() tasks in refreshAll()
- Add destroy cleanup for both panels before map cleanup
- Add species and renewable entries to happy.ts DEFAULT_PANELS config

* feat(07-03): add CSS styles for species cards and renewable energy gauge

- Species card grid layout with 2-column responsive grid
- Photo, info, badges (recovered/recovering/stabilized/IUCN), sparkline, summary styles
- Renewable energy gauge section, historical sparkline, and regional bar chart styles
- Dark mode overrides for species card hover shadow and IUCN badge background
- All styles scoped with [data-variant='happy'] using existing CSS variables

* docs(07-03): complete panel wiring & CSS plan

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(happy): add missing panel entries and RSS proxy for dev mode

HAPPY_PANELS in panels.ts was missing digest, species, and renewable
entries — panels were constructed but never appended to the grid because
the panelOrder loop only iterated the 6 original keys.

Also adds RSS proxy middleware for Vite dev server, fixes sebuf route
regex to match hyphenated domains (positive-events), and adds happy
feed domains to the rss-proxy allowlist.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: progress data lookup, ticker speed, ultrawide layout gap

1. Progress/renewable data: World Bank API returns countryiso3code "WLD"
   for world aggregate, but services were looking up by request code "1W".
   Changed lookups to use "WLD".

2. Breakthroughs ticker: slowed animation from 30s to 60s duration.

3. Ultrawide layout (>2000px): replaced float-based layout with CSS grid.
   Map stays in left column (60%), panels grid in right column (40%).
   Eliminates dead space under the map where panels used to wrap below.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
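
A minimal sketch of the lookup mismatch (data values invented; only the response shape and the "1W" vs "WLD" code behavior come from the fix description):

```javascript
// World Bank is requested with aggregate code "1W", but the response rows
// carry countryiso3code "WLD" for the world aggregate. Values are illustrative.
const rows = [
  { countryiso3code: 'WLD', date: '2023', value: 30.2 }, // world aggregate row
  { countryiso3code: 'USA', date: '2023', value: 22.7 },
];
const byCode = (code) => rows.find((r) => r.countryiso3code === code);
const broken = byCode('1W');  // old lookup key: never present in the response
const fixed = byCode('WLD');  // the code the API actually returns
```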

* fix: UI polish — counter overflow, ticker speed, monitors panel, filter tabs

- Counter values: responsive font-size with clamp(), overflow protection,
  tighter card padding to prevent large numbers from overflowing
- Breakthroughs ticker: slowed from 60s to 120s animation duration
- My Monitors panel: gate monitors from panel order in happy variant
  (was unconditionally pushed into panelOrder regardless of variant)
- Filter tabs: smaller padding/font, flex-shrink:0, fade mask on right
  edge to hint at scrollable overflow

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(happy): exclude APT groups layer from happy variant map

The APT groups layer (cyber threat actors like Fancy Bear, Cozy Bear)
was only excluded for the tech variant. Now also excluded for happy,
since cyber threat data has no place on a Good News Map.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(happy-map): labeled markers, remove fake baseline, fix APT leak

- Positive events now show category emoji + location name as colored
  text labels (TextLayer) instead of bare dots. Labels filter by zoom
  level to avoid clutter at global view.
- Removed synthetic kindness baseline (50-80 fake "Volunteers at work"
  dots in random cities). Only real kindness events from news remain.
- Kindness events also get labeled dots with headlines.
- Improved tooltips with proper category names and source counts.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(happy-map): disable earthquakes, fix GDELT query syntax

- Disable natural events layer (earthquakes) for happy variant —
  not positive news
- Fix GDELT GEO positive queries: OR terms require parentheses
  per GDELT API syntax, added third query for charity/volunteer news
- Updated both desktop and mobile happy map layer configs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(happy): ultrawide grid overflow, panel text polish

Ultrawide: set min-height:0 on map/panels grid children so they
respect 1fr row constraint and scroll independently instead of
pushing content below the viewport.

Panel CSS: softer word-break on counters, line-clamp on digest
and species summaries, ticker title max-width, consistent
text-dim color instead of opacity hacks.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(08-map-data-overlays): research phase domain

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(08-map-data-overlays): create phase plan

* Add Global Giving Activity Index with multi-platform aggregation (#255)

* feat(08-01): add static data for happiness scores, renewable installations, and recovery zones

- Create world-happiness.json with 152 country scores from WHR 2025
- Create renewable-installations.json with 92 global entries (solar/wind/hydro/geothermal)
- Extend conservation-wins.json with recoveryZone lat/lon for all 10 species

* feat(08-01): add service loaders, extend MapLayers with happiness/species/energy keys

- Create happiness-data.ts with fetchHappinessScores() returning Map<ISO2, score>
- Create renewable-installations.ts with fetchRenewableInstallations() returning typed array
- Extend SpeciesRecovery interface with optional recoveryZone field
- Add happiness, speciesRecovery, renewableInstallations to MapLayers interface
- Update all 8 variant MapLayers configs (happiness=true in happy, false elsewhere)
- Update e2e harness files with new layer keys

* docs(08-01): complete data foundation plan summary and state updates

- Create 08-01-SUMMARY.md with execution results
- Update STATE.md to phase 8, plan 1/2
- Update ROADMAP.md progress for phase 08
- Mark requirements MAP-03, MAP-04, MAP-05 complete

* feat(08-02): add happiness choropleth, species recovery, and renewable installation overlay layers

- Add three Deck.gl layer creation methods with color-coded rendering
- Add public data setters for happiness scores, species recovery zones, and renewable installations
- Wire layers into buildLayers() gated by MapLayers keys
- Add tooltip cases for all three new layer types
- Extend happy variant layer toggles (World Happiness, Species Recovery, Clean Energy)
- Extend happy variant legend with choropleth, species, and renewable entries
- Cache country GeoJSON reference in loadCountryBoundaries() for choropleth reuse

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(08-02): wire MapContainer delegation and App.ts data loading for map overlays

- Add MapContainer delegation methods for happiness, species recovery, and renewable installations
- Add happiness scores and renewable installations map data loading in App.ts refreshAll()
- Chain species recovery zone data to map from existing loadSpeciesData()
- All three overlay datasets flow from App.ts through MapContainer to DeckGLMap

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(08-02): complete map overlay layers plan

- Create 08-02-SUMMARY.md with execution results
- Update STATE.md: phase 8 complete (2/2 plans), 22 total plans, decisions logged
- Update ROADMAP.md: phase 08 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-8): complete phase execution

* docs(roadmap): add Phase 7.1 gap closure for renewable energy installation & coal data

Addresses Phase 7 verification gaps (ENERGY-01, ENERGY-03): renewable panel
lacks solar/wind installation growth and coal plant closure visualizations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(7.1): research renewable energy installation & coal retirement data

* docs(71): create phase plans for renewable energy installation & coal retirement data

* feat(71-01): add GetEnergyCapacity RPC proto and server handler

- Create get_energy_capacity.proto with request/response messages
- Add GetEnergyCapacity RPC to EconomicService in service.proto
- Implement server handler with EIA capability API integration
- Coal code fallback (COL -> BIT/SUB/LIG/RC) for sub-type support
- Redis cache with 24h TTL for annual capacity data
- Register handler in economic service handler

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(71-01): add client-side fetchEnergyCapacity with circuit breaker

- Add GetEnergyCapacityResponse import and capacityBreaker to economic service
- Export fetchEnergyCapacityRpc() with energyEia feature gating
- Add CapacitySeries/CapacityDataPoint types to renewable-energy-data.ts
- Export fetchEnergyCapacity() that transforms proto types to domain types

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(71-01): complete EIA energy capacity data pipeline plan

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(71-02): add setCapacityData() with D3 stacked area chart to RenewableEnergyPanel

- setCapacityData() renders D3 stacked area (solar yellow + wind blue) with coal decline (red)
- Chart labeled 'US Installed Capacity (EIA)' with compact inline legend
- Appends below existing gauge/sparkline/regions without replacing content
- CSS styles for capacity section, header, legend in happy-theme.css

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(71-02): wire EIA capacity data loading in App.ts loadRenewableData()

- Import fetchEnergyCapacity from renewable-energy-data service
- Call fetchEnergyCapacity() after World Bank gauge data, pass to setCapacityData()
- Wrapped in try/catch so EIA failure does not break existing gauge

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(71-02): complete EIA capacity visualization plan

- SUMMARY.md documenting D3 stacked area chart implementation
- STATE.md updated: Phase 7.1 complete (2/2 plans), progress 100%
- ROADMAP.md updated with plan progress

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-71): complete phase execution

* docs(phase-09): research sharing, TV mode & polish domain

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(09): create phase plan for sharing, TV mode & polish

* docs(phase-09): plan Sharing, TV Mode & Polish

3 plans in 2 waves covering share cards (Canvas 2D renderer),
TV/ambient mode (fullscreen panel cycling + CSS particles),
and celebration animations (canvas-confetti milestones).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-01): create Canvas 2D renderer for happy share cards

- 1080x1080 branded PNG with warm gradient per category
- Category badge, headline word-wrap, source, date, HappyMonitor branding
- shareHappyCard() with Web Share API -> clipboard -> download fallback
- wrapText() helper for Canvas 2D manual line breaking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-02): create TvModeController and TV mode CSS

- TvModeController class manages fullscreen, panel cycling with configurable 30s-2min interval
- CSS [data-tv-mode] attribute drives larger typography, hidden interactive elements, smooth panel transitions
- Ambient floating particles (CSS-only, opacity 0.04) with reduced motion support
- TV exit button appears on hover, hidden by default outside TV mode

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-02): wire TV mode into App.ts header and lifecycle

- TV mode button with monitor icon in happy variant header
- TV exit button at page level, visible on hover in TV mode
- Shift+T keyboard shortcut toggles TV mode
- TvModeController instantiated lazily on first toggle
- Proper cleanup in destroy() method

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-01): add share button to positive news cards with handler

- Share button (SVG upload icon) appears on card hover, top-right
- Delegated click handler prevents link navigation, calls shareHappyCard
- Brief .shared visual feedback (green, scale) for 1.5s on click
- Dark mode support for share button background
- Fix: tv-mode.ts panelKeys index guard (pre-existing build blocker)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(09-02): complete TV Mode plan

- SUMMARY.md with task commits, deviations, decisions
- STATE.md updated: position, metrics, decisions, session
- ROADMAP.md updated: phase 09 progress (2/3 plans)
- REQUIREMENTS.md updated: TV-01, TV-02, TV-03 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(09-01): complete positive news share cards plan

- SUMMARY.md with Canvas 2D renderer and share button accomplishments
- STATE.md updated with decisions and session continuity
- ROADMAP.md progress updated (2/3 plans in phase 09)
- REQUIREMENTS.md: SHARE-01, SHARE-02, SHARE-03 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-03): add celebration service with canvas-confetti

- Install canvas-confetti + @types/canvas-confetti
- Create src/services/celebration.ts with warm nature-inspired palette
- Session-level dedup (Set<string>) prevents repeat celebrations
- Respects prefers-reduced-motion media query
- Milestone detection for species recovery + renewable energy records
- Moderate particle counts (40-80) for "warm, not birthday party" feel

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(09-03): wire milestone celebrations into App.ts data pipelines

- Import checkMilestones in App.ts
- Call checkMilestones after species data loads with recovery statuses
- Call checkMilestones after renewable energy data loads with global percentage
- All celebration calls gated behind SITE_VARIANT === 'happy'
- Placed after panel setData() so data is visible before confetti fires

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(09-03): complete celebration animations plan

- 09-03-SUMMARY.md with execution results
- STATE.md updated: phase 09 complete, 26 plans total, 100% progress
- ROADMAP.md updated with phase 09 completion
- REQUIREMENTS.md: THEME-06 marked complete

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-09): complete phase execution

* fix(happy): remove natural events layer from happy variant

Natural events (earthquakes, volcanoes, storms) were leaking into the
happy variant through stale localStorage and the layer toggle UI. Force
all non-happy layers off regardless of localStorage state, and remove
the natural events toggle from both DeckGL and SVG map layer configs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(phase-7.1): complete phase execution — mark all phases done

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs(v1): complete milestone audit — 49/49 requirements satisfied

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(happy): close audit tech debt — map layer defaults, theme-color meta

- Enable speciesRecovery and renewableInstallations layers by default
  in HAPPY_MAP_LAYERS (panels.ts + happy.ts) so MAP-04/MAP-05 are
  visible on first load
- Use happy-specific theme-color meta values (#FAFAF5 light, #1A2332
  dark) in setTheme() and applyStoredTheme() instead of generic colors

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: add checkpoint for giving integration handoff

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(giving): integrate Global Giving Activity Index from PR #254

Cherry-pick the giving feature that was left behind when PR #255
batch-merged without including #254's proto/handler/panel files.

Adds:
- Proto definitions (GivingService, GivingSummary, PlatformGiving, etc.)
- Server handler: GoFundMe/GlobalGiving/JustGiving/crypto/OECD aggregation
- Client service with circuit breaker
- GivingPanel with tabs (platforms, categories, crypto, institutional)
- Full wiring: API routes, vite dev server, data freshness, panel config
- Happy variant panel config entry

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(giving): move panel init and data fetch out of full-variant-only blocks

The GivingPanel was instantiated inside `if (SITE_VARIANT === 'full')` and
the data fetch was inside `loadIntelligenceSignals()` (also full-only).
Moved both to variant-agnostic scope so the panel works on happy variant.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(giving): bypass debounced setContent so tab buttons are clickable

Panel.setContent() is debounced (150ms), so event listeners attached
immediately after it were binding to DOM elements that got replaced by
the deferred innerHTML write. Write directly to this.content.innerHTML
like other interactive panels do.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
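
A minimal sketch of the timing bug (class and method names invented here, not the actual Panel code): a deferred content write replaces the nodes that listeners were attached to.

```javascript
// FakePanel simulates a debounced setContent(): the node swap happens on a
// later flush, not at call time, so anything bound in between goes stale.
class FakePanel {
  constructor() { this.content = { node: { id: 0 } }; this.pending = null; }
  setContent() {
    // Deferred write: queued now, applied later.
    this.pending = () => { this.content.node = { id: this.content.node.id + 1 }; };
  }
  flushDebounce() { if (this.pending) { this.pending(); this.pending = null; } }
}
const panel = new FakePanel();
panel.setContent();
const boundTo = panel.content.node; // a listener attached here binds the OLD node
panel.flushDebounce();              // deferred write swaps the node out
const staleBinding = boundTo !== panel.content.node; // true: binding points at a replaced node
// The fix writes content synchronously before attaching listeners, so no
// deferred swap can invalidate them.
```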

* chore: remove .planning/ from repo and gitignore it

Planning files served their purpose during happy monitor development.
They remain on disk for reference but no longer tracked.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: merge new panels into saved panelSettings so they aren't hidden

When panelSettings is loaded from localStorage, any panels added since
the user last saved settings would be missing from the config. The
applyPanelSettings loop wouldn't touch them, but without a config entry
they also wouldn't appear in the settings toggle UI correctly.

Now merges DEFAULT_PANELS entries into loaded settings for any keys
that don't exist yet, so new panels are visible by default.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
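
The merge described above can be sketched like this (panel keys and config shape are invented for illustration):

```javascript
// Saved settings predate two panels; defaults fill only the missing keys,
// never overriding an explicit user choice.
const DEFAULT_PANELS = {
  hero: { visible: true },
  digest: { visible: true },
  giving: { visible: true },
};
const saved = { hero: { visible: false } }; // user hid 'hero' before the others existed
const merged = { ...saved };
for (const [key, def] of Object.entries(DEFAULT_PANELS)) {
  if (!(key in merged)) merged[key] = { ...def }; // new panels visible by default
}
```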

* fix: giving data baselines, theme toggle persistence, and client caching

- Replace broken GoFundMe (301→404) and GlobalGiving (401) API calls
  with hardcoded baselines from published annual reports. Activity index
  rises from 42 to 56 as all 3 platforms now report non-zero volumes.
- Fix happy variant theme toggle not persisting across page reloads:
  applyStoredTheme() couldn't distinguish "no preference" from "user
  chose dark" — both returned DEFAULT_THEME. Now checks raw localStorage.
- Fix inline script in index.html not setting data-theme="dark" for
  happy variant, causing CSS :root[data-variant="happy"] (light) to
  win over :root[data-variant="happy"][data-theme="dark"].
- Add client-side caching to giving service: persistCache on circuit
  breaker, 30min in-memory TTL, and request deduplication.
- Add Playwright E2E tests for theme toggle (8 tests, all passing).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
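
A sketch of the "no preference" vs "explicit choice" distinction (storage key and function shape assumed; a Map stands in for `window.localStorage`):

```javascript
const DEFAULT_THEME = 'light';
const storage = new Map(); // stand-in for localStorage in this sketch
function applyStoredTheme() {
  // Check the raw entry: absence means "no preference", anything else is an
  // explicit user choice that must survive reloads.
  const raw = storage.get('theme');
  return raw === 'dark' || raw === 'light' ? raw : DEFAULT_THEME;
}
const beforeChoice = applyStoredTheme(); // falls back to the default
storage.set('theme', 'dark');            // user explicitly picks dark
const afterChoice = applyStoredTheme();  // explicit choice persists
```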

* perf: add persistent cache to all 29 circuit breakers across 19 services

Enable persistCache and set appropriate cacheTtlMs on every circuit
breaker that lacked them. Data survives page reloads via IndexedDB
fallback and reduces redundant API calls on navigation.

TTLs matched to data freshness: 5min for real-time feeds (weather,
earthquakes, wildfires, aviation), 10min for event data (conflict,
cyber, unrest, climate, research), 15-30min for slow-moving data
(economic indicators, energy capacity, population exposure).

Market quotes breaker intentionally left at cacheTtlMs: 0 (real-time).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
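
The TTL tiers above could be expressed as a table like this (key names invented; only the durations and the zero-TTL market-quotes exception come from the commit):

```javascript
// TTLs matched to how fast each feed actually changes.
const CACHE_TTLS_MS = {
  weather: 5 * 60_000,        // real-time feeds
  earthquakes: 5 * 60_000,
  conflict: 10 * 60_000,      // event data
  cyber: 10 * 60_000,
  economic: 30 * 60_000,      // slow-moving indicators
  energyCapacity: 30 * 60_000,
  marketQuotes: 0,            // intentionally uncached: real-time
};
```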

* feat: expand map labels progressively as user zooms in

Labels now show more text at higher zoom levels instead of always
truncating at 30 chars. Zoom <3: 20 chars, <5: 35, <7: 60, 7+: full.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: keep 30-char baseline for map labels, expand to full text at zoom 6+

Previous change was too aggressive with low-zoom truncation (20 chars).
Now keeps original 30-char limit at global view, progressively expands
to 50/80/200 chars as user zooms in. Also scales font size with zoom.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Revert "fix: keep 30-char baseline for map labels, expand to full text at zoom 6+"

This reverts commit 33b8a8accc2d48acd45f3dcea97a083b8bcebbf0.

* Revert "feat: expand map labels progressively as user zooms in"

This reverts commit 285f91fe471925ca445243ae5d8ac37723f2eda7.

* perf: stale-while-revalidate for instant page load

Circuit breaker now returns stale cached data immediately and refreshes
in the background, instead of blocking on API calls when cache exceeds
TTL. Also persists happyAllItems to IndexedDB so Hero, Digest, and
Breakthroughs panels render instantly from cache on page reload.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
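
The pattern can be sketched synchronously (API shape invented, not the actual circuit-breaker code): a cached value is always returned immediately, and passing the TTL only schedules a refresh instead of blocking.

```javascript
function createSwrCache(ttlMs, refreshInBackground) {
  let entry = null; // { value, at }
  return {
    put(value, at) { entry = { value, at }; },
    get(now) {
      if (!entry) return null; // cold start: caller must fetch and wait once
      if (now - entry.at > ttlMs) refreshInBackground(entry.value); // stale: serve anyway
      return entry.value;      // instant whenever anything is cached
    },
  };
}
const refreshed = [];
const cache = createSwrCache(1000, (stale) => refreshed.push(stale));
cache.put('happyAllItems@t0', 0);
const freshHit = cache.get(500);  // within TTL: no refresh triggered
const staleHit = cache.get(5000); // past TTL: returned immediately, refresh queued
```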

* fix: address PR #229 review — 4 issues from koala

1. P1: Fix duplicate event listeners in PositiveNewsFeedPanel.renderCards()
   — remove listener before re-adding to prevent stacking on re-renders

2. P1: Fix TV mode cycling hidden panels causing blank screen
   — filter out user-disabled panels from cycle list, rebuild keys on toggle

3. P2: Fix positive classifier false positives for short keywords
   — "ai" and "art" now use space-delimited matching to avoid substring hits
     (e.g. "aid", "rain", "said", "start", "part")

4. P3: Fix CSP blocking Google Fonts stylesheet for Nunito
   — add https://fonts.googleapis.com to style-src directive

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
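
Issue 3 can be sketched like this (function shape invented; the two keywords and the false-positive examples come from the review fix):

```javascript
const SHORT_KEYWORDS = ['ai', 'art'];
function matchesShortKeyword(headline) {
  // Pad with spaces so only whole space-delimited tokens match; a production
  // classifier would likely strip punctuation first as well.
  const padded = ` ${headline.toLowerCase()} `;
  return SHORT_KEYWORDS.some((k) => padded.includes(` ${k} `));
}
const noFalsePositive = !matchesShortKeyword('Foreign aid reaches the region');
const stillMatches = matchesShortKeyword('New AI model spots rare whales');
```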

* refactor: decompose App.ts into focused modules under src/app/

Break the 4,597-line monolithic App class into 7 focused modules plus a
~460-line thin orchestrator. Each module implements the AppModule lifecycle
(init/destroy) and communicates via a shared AppContext state object with
narrow callback interfaces — no circular dependencies.

Modules extracted:
- app-context.ts: shared state types (AppContext, AppModule, etc.)
- desktop-updater.ts: desktop version checking + update badge
- country-intel.ts: country briefs, timeline, CII signals
- search-manager.ts: search modal, result routing, index updates
- refresh-scheduler.ts: periodic data refresh with jitter/backoff
- panel-layout.ts: panel creation, grid layout, drag-drop
- data-loader.ts: all 36 data loading methods
- event-handlers.ts: DOM events, shortcuts, idle detection, URL sync

Verified: tsc --noEmit (zero errors), all 3 variant builds pass
(full, tech, finance), runtime smoke test confirms no regressions.
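
The module lifecycle described above might look roughly like this (module names come from the list; the factory shape and reverse-order teardown are assumptions of this sketch):

```javascript
// Each module receives the shared context and exposes init/destroy.
function createModule(name, ctx, log) {
  return {
    init() { log.push(`${name}:init`); },       // wire up listeners, timers, etc.
    destroy() { log.push(`${name}:destroy`); }, // release them on shutdown
  };
}
const log = [];
const ctx = {}; // stand-in for the shared AppContext state object
const modules = ['desktop-updater', 'search-manager'].map((n) => createModule(n, ctx, log));
modules.forEach((m) => m.init());
[...modules].reverse().forEach((m) => m.destroy()); // teardown mirrors init order
```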

* fix: resolve test failures and missing CSS token from PR review

1. flushStaleRefreshes test now reads from refresh-scheduler.ts (moved
   during App.ts modularization)
2. e2e runtime tests updated to import DesktopUpdater and DataLoaderManager
   instead of App.prototype for resolveUpdateDownloadUrl and loadMarkets
3. Add --semantic-positive CSS variable to main.css and happy-theme.css
   (both light and dark variants)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: hide happy variant button from other variants

The button is only visible when already on the happy variant. This
allows merging the modularized App.ts without exposing the unfinished
happy layout to users — layout work continues in a follow-up PR.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Elie Habib <elie.habib@gmail.com>
2026-02-25 10:05:26 +04:00

2524 lines · 94 KiB · JavaScript
#!/usr/bin/env node
/**
 * AIS WebSocket Relay Server
 * Proxies aisstream.io data to browsers via WebSocket
 *
 * Deploy on Railway with:
 *   AISSTREAM_API_KEY=your_key
 *
 * Local: node scripts/ais-relay.cjs
 */
const http = require('http');
const https = require('https');
const zlib = require('zlib');
const path = require('path');
const { readFileSync } = require('fs');
const crypto = require('crypto');
const { WebSocketServer, WebSocket } = require('ws');
const AISSTREAM_URL = 'wss://stream.aisstream.io/v0/stream';
const API_KEY = process.env.AISSTREAM_API_KEY || process.env.VITE_AISSTREAM_API_KEY;
const PORT = process.env.PORT || 3004;
if (!API_KEY) {
  console.error('[Relay] Error: AISSTREAM_API_KEY environment variable not set');
  console.error('[Relay] Get a free key at https://aisstream.io');
  process.exit(1);
}
const MAX_WS_CLIENTS = 10; // Cap WS clients — app uses HTTP snapshots, not WS
const UPSTREAM_QUEUE_HIGH_WATER = Math.max(500, Number(process.env.AIS_UPSTREAM_QUEUE_HIGH_WATER || 4000));
const UPSTREAM_QUEUE_LOW_WATER = Math.max(
  100,
  Math.min(UPSTREAM_QUEUE_HIGH_WATER - 1, Number(process.env.AIS_UPSTREAM_QUEUE_LOW_WATER || 1000))
);
const UPSTREAM_QUEUE_HARD_CAP = Math.max(
  UPSTREAM_QUEUE_HIGH_WATER + 1,
  Number(process.env.AIS_UPSTREAM_QUEUE_HARD_CAP || 8000)
);
const UPSTREAM_DRAIN_BATCH = Math.max(1, Number(process.env.AIS_UPSTREAM_DRAIN_BATCH || 250));
const UPSTREAM_DRAIN_BUDGET_MS = Math.max(2, Number(process.env.AIS_UPSTREAM_DRAIN_BUDGET_MS || 20));
const MAX_VESSELS = 50000; // hard cap on vessels Map
const MAX_VESSEL_HISTORY = 50000;
const MAX_DENSITY_CELLS = 5000;
const RELAY_SHARED_SECRET = process.env.RELAY_SHARED_SECRET || '';
const RELAY_AUTH_HEADER = (process.env.RELAY_AUTH_HEADER || 'x-relay-key').toLowerCase();
const ALLOW_UNAUTHENTICATED_RELAY = process.env.ALLOW_UNAUTHENTICATED_RELAY === 'true';
const IS_PRODUCTION_RELAY = process.env.NODE_ENV === 'production'
  || !!process.env.RAILWAY_ENVIRONMENT
  || !!process.env.RAILWAY_PROJECT_ID
  || !!process.env.RAILWAY_STATIC_URL;
const RELAY_RATE_LIMIT_WINDOW_MS = Math.max(1000, Number(process.env.RELAY_RATE_LIMIT_WINDOW_MS || 60000));
const RELAY_RATE_LIMIT_MAX = Number.isFinite(Number(process.env.RELAY_RATE_LIMIT_MAX))
  ? Number(process.env.RELAY_RATE_LIMIT_MAX) : 1200;
const RELAY_OPENSKY_RATE_LIMIT_MAX = Number.isFinite(Number(process.env.RELAY_OPENSKY_RATE_LIMIT_MAX))
  ? Number(process.env.RELAY_OPENSKY_RATE_LIMIT_MAX) : 600;
const RELAY_RSS_RATE_LIMIT_MAX = Number.isFinite(Number(process.env.RELAY_RSS_RATE_LIMIT_MAX))
  ? Number(process.env.RELAY_RSS_RATE_LIMIT_MAX) : 300;
const RELAY_LOG_THROTTLE_MS = Math.max(1000, Number(process.env.RELAY_LOG_THROTTLE_MS || 10000));
const ALLOW_VERCEL_PREVIEW_ORIGINS = process.env.ALLOW_VERCEL_PREVIEW_ORIGINS === 'true';
if (IS_PRODUCTION_RELAY && !RELAY_SHARED_SECRET && !ALLOW_UNAUTHENTICATED_RELAY) {
  console.error('[Relay] Error: RELAY_SHARED_SECRET is required in production');
  console.error('[Relay] Set RELAY_SHARED_SECRET on Railway and Vercel to secure relay endpoints');
  console.error('[Relay] To bypass temporarily (not recommended), set ALLOW_UNAUTHENTICATED_RELAY=true');
  process.exit(1);
}
let upstreamSocket = null;
let upstreamPaused = false;
let upstreamQueue = [];
let upstreamQueueReadIndex = 0;
let upstreamDrainScheduled = false;
let clients = new Set();
let messageCount = 0;
let droppedMessages = 0;
const requestRateBuckets = new Map(); // key: route:ip -> { count, resetAt }
const logThrottleState = new Map(); // key: event key -> timestamp
// Safe response: guard against "headers already sent" crashes
function safeEnd(res, statusCode, headers, body) {
  if (res.headersSent || res.writableEnded) return false;
  try {
    res.writeHead(statusCode, headers);
    res.end(body);
    return true;
  } catch {
    return false;
  }
}
// gzip compress & send a response (reduces egress ~80% for JSON)
function sendCompressed(req, res, statusCode, headers, body) {
  if (res.headersSent || res.writableEnded) return;
  const acceptEncoding = req.headers['accept-encoding'] || '';
  if (acceptEncoding.includes('gzip')) {
    zlib.gzip(typeof body === 'string' ? Buffer.from(body) : body, (err, compressed) => {
      if (err || res.headersSent || res.writableEnded) {
        safeEnd(res, statusCode, headers, body);
        return;
      }
      safeEnd(res, statusCode, { ...headers, 'Content-Encoding': 'gzip', 'Vary': 'Accept-Encoding' }, compressed);
    });
  } else {
    safeEnd(res, statusCode, headers, body);
  }
}
// Pre-gzipped response: serve a cached gzip buffer directly (zero CPU per request)
function sendPreGzipped(req, res, statusCode, headers, rawBody, gzippedBody) {
  if (res.headersSent || res.writableEnded) return;
  const acceptEncoding = req.headers['accept-encoding'] || '';
  if (acceptEncoding.includes('gzip') && gzippedBody) {
    safeEnd(res, statusCode, { ...headers, 'Content-Encoding': 'gzip', 'Vary': 'Accept-Encoding' }, gzippedBody);
  } else {
    safeEnd(res, statusCode, headers, rawBody);
  }
}
// ─────────────────────────────────────────────────────────────
// Telegram OSINT ingestion (public channels) → Early Signals
// Web-first: runs on this Railway relay process, serves /telegram/feed
// Requires env:
// - TELEGRAM_API_ID
// - TELEGRAM_API_HASH
// - TELEGRAM_SESSION (StringSession)
// ─────────────────────────────────────────────────────────────
const TELEGRAM_ENABLED = Boolean(process.env.TELEGRAM_API_ID && process.env.TELEGRAM_API_HASH && process.env.TELEGRAM_SESSION);
const TELEGRAM_POLL_INTERVAL_MS = Math.max(15_000, Number(process.env.TELEGRAM_POLL_INTERVAL_MS || 60_000));
const TELEGRAM_MAX_FEED_ITEMS = Math.max(50, Number(process.env.TELEGRAM_MAX_FEED_ITEMS || 200));
const TELEGRAM_MAX_TEXT_CHARS = Math.max(200, Number(process.env.TELEGRAM_MAX_TEXT_CHARS || 800));
const telegramState = {
  client: null,
  channels: [],
  cursorByHandle: Object.create(null),
  items: [],
  lastPollAt: 0,
  lastError: null,
  startedAt: Date.now(),
};
function loadTelegramChannels() {
  // Product-managed curated list lives in repo root under data/ (shared by web + desktop).
  // Relay is executed from scripts/, so resolve ../data.
  const p = path.join(__dirname, '..', 'data', 'telegram-channels.json');
  const set = String(process.env.TELEGRAM_CHANNEL_SET || 'full').toLowerCase();
  try {
    const raw = JSON.parse(readFileSync(p, 'utf8'));
    const bucket = raw?.channels?.[set];
    const channels = Array.isArray(bucket) ? bucket : [];
    telegramState.channels = channels
      .filter(c => c && typeof c.handle === 'string' && c.handle.length > 1)
      .map(c => ({
        handle: String(c.handle).replace(/^@/, ''),
        label: c.label ? String(c.label) : undefined,
        topic: c.topic ? String(c.topic) : undefined,
        region: c.region ? String(c.region) : undefined,
        tier: c.tier != null ? Number(c.tier) : undefined,
        enabled: c.enabled !== false,
        maxMessages: c.maxMessages != null ? Number(c.maxMessages) : undefined,
      }))
      .filter(c => c.enabled);
    if (!telegramState.channels.length) {
      console.warn(`[Relay] Telegram channel set "${set}" is empty — no channels to poll`);
    }
    return telegramState.channels;
  } catch (e) {
    telegramState.channels = [];
    telegramState.lastError = `failed to load telegram-channels.json: ${e?.message || String(e)}`;
    return [];
  }
}
function normalizeTelegramMessage(msg, channel) {
  const textRaw = String(msg?.message || '');
  const text = textRaw.slice(0, TELEGRAM_MAX_TEXT_CHARS);
  const ts = msg?.date ? new Date(msg.date * 1000).toISOString() : new Date().toISOString();
  return {
    id: `${channel.handle}:${msg.id}`,
    source: 'telegram',
    channel: channel.handle,
    channelTitle: channel.label || channel.handle,
    url: `https://t.me/${channel.handle}/${msg.id}`,
    ts,
    text,
    topic: channel.topic || 'other',
    tags: [channel.region].filter(Boolean),
    earlySignal: true,
  };
}
async function initTelegramClientIfNeeded() {
  if (!TELEGRAM_ENABLED) return false;
  if (telegramState.client) return true;
  const apiId = parseInt(String(process.env.TELEGRAM_API_ID || ''), 10);
  const apiHash = String(process.env.TELEGRAM_API_HASH || '');
  const sessionStr = String(process.env.TELEGRAM_SESSION || '');
  if (!apiId || !apiHash || !sessionStr) return false;
  try {
    const { TelegramClient } = await import('telegram');
    const { StringSession } = await import('telegram/sessions');
    const client = new TelegramClient(new StringSession(sessionStr), apiId, apiHash, {
      connectionRetries: 3,
    });
    await client.connect();
    telegramState.client = client;
    telegramState.lastError = null;
    console.log('[Relay] Telegram client connected');
    return true;
  } catch (e) {
    telegramState.lastError = `telegram init failed: ${e?.message || String(e)}`;
    console.warn('[Relay] Telegram init failed:', telegramState.lastError);
    return false;
  }
}
async function pollTelegramOnce() {
  const ok = await initTelegramClientIfNeeded();
  if (!ok) return;
  const channels = telegramState.channels.length ? telegramState.channels : loadTelegramChannels();
  if (!channels.length) return;
  const client = telegramState.client;
  const newItems = [];
  for (const channel of channels) {
    const handle = channel.handle;
    const minId = telegramState.cursorByHandle[handle] || 0;
    try {
      const entity = await client.getEntity(handle);
      const msgs = await client.getMessages(entity, {
        limit: Math.max(1, Math.min(50, channel.maxMessages || 25)),
        minId,
      });
      for (const msg of msgs) {
        if (!msg || !msg.id || !msg.message) continue;
        const item = normalizeTelegramMessage(msg, channel);
        newItems.push(item);
        if (!telegramState.cursorByHandle[handle] || msg.id > telegramState.cursorByHandle[handle]) {
          telegramState.cursorByHandle[handle] = msg.id;
        }
      }
      // Gentle rate limiting between channels
      await new Promise(r => setTimeout(r, Math.max(300, Number(process.env.TELEGRAM_RATE_LIMIT_MS || 800))));
    } catch (e) {
      const em = e?.message || String(e);
      telegramState.lastError = `poll ${handle} failed: ${em}`;
      console.warn('[Relay] Telegram poll error:', telegramState.lastError);
    }
  }
  if (newItems.length) {
    const seen = new Set();
    telegramState.items = [...newItems, ...telegramState.items]
      .filter(item => {
        if (seen.has(item.id)) return false;
        seen.add(item.id);
        return true;
      })
      .sort((a, b) => (b.ts || '').localeCompare(a.ts || ''))
      .slice(0, TELEGRAM_MAX_FEED_ITEMS);
  }
  telegramState.lastPollAt = Date.now();
}
function startTelegramPollLoop() {
  if (!TELEGRAM_ENABLED) return;
  loadTelegramChannels();
  // Don't block server startup.
  pollTelegramOnce().catch(e => console.warn('[Relay] Telegram poll error:', e?.message || e));
  setInterval(() => {
    pollTelegramOnce().catch(e => console.warn('[Relay] Telegram poll error:', e?.message || e));
  }, TELEGRAM_POLL_INTERVAL_MS).unref?.();
  console.log('[Relay] Telegram poll loop started');
}
function gzipSyncBuffer(body) {
try {
return zlib.gzipSync(typeof body === 'string' ? Buffer.from(body) : body);
} catch {
return null;
}
}
function getClientIp(req) {
const xRealIp = req.headers['x-real-ip'];
if (typeof xRealIp === 'string' && xRealIp.trim()) {
return xRealIp.trim();
}
const xff = req.headers['x-forwarded-for'];
if (typeof xff === 'string' && xff) {
const parts = xff.split(',').map((part) => part.trim()).filter(Boolean);
// Proxy chain order is client,proxy1,proxy2...; use first hop as client IP.
if (parts.length > 0) return parts[0];
}
return req.socket?.remoteAddress || 'unknown';
}
function safeTokenEquals(provided, expected) {
const a = Buffer.from(provided || '');
const b = Buffer.from(expected || '');
if (a.length !== b.length) return false;
return crypto.timingSafeEqual(a, b);
}
function getRelaySecretFromRequest(req) {
const direct = req.headers[RELAY_AUTH_HEADER];
if (typeof direct === 'string' && direct.trim()) return direct.trim();
const auth = req.headers.authorization;
if (typeof auth === 'string' && auth.toLowerCase().startsWith('bearer ')) {
const token = auth.slice(7).trim();
if (token) return token;
}
return '';
}
function isAuthorizedRequest(req) {
if (!RELAY_SHARED_SECRET) return true;
const provided = getRelaySecretFromRequest(req);
if (!provided) return false;
return safeTokenEquals(provided, RELAY_SHARED_SECRET);
}
function getRouteGroup(pathname) {
if (pathname.startsWith('/opensky')) return 'opensky';
if (pathname.startsWith('/rss')) return 'rss';
if (pathname.startsWith('/ais/snapshot')) return 'snapshot';
if (pathname.startsWith('/worldbank')) return 'worldbank';
if (pathname.startsWith('/polymarket')) return 'polymarket';
if (pathname.startsWith('/ucdp-events')) return 'ucdp-events';
return 'other';
}
function getRateLimitForPath(pathname) {
if (pathname.startsWith('/opensky')) return RELAY_OPENSKY_RATE_LIMIT_MAX;
if (pathname.startsWith('/rss')) return RELAY_RSS_RATE_LIMIT_MAX;
return RELAY_RATE_LIMIT_MAX;
}
function consumeRateLimit(req, pathname) {
const maxRequests = getRateLimitForPath(pathname);
if (!Number.isFinite(maxRequests) || maxRequests <= 0) return { limited: false, limit: 0, remaining: 0, resetInMs: 0 };
const now = Date.now();
const ip = getClientIp(req);
const key = `${getRouteGroup(pathname)}:${ip}`;
const existing = requestRateBuckets.get(key);
if (!existing || now >= existing.resetAt) {
const next = { count: 1, resetAt: now + RELAY_RATE_LIMIT_WINDOW_MS };
requestRateBuckets.set(key, next);
return { limited: false, limit: maxRequests, remaining: Math.max(0, maxRequests - 1), resetInMs: next.resetAt - now };
}
existing.count += 1;
const limited = existing.count > maxRequests;
return {
limited,
limit: maxRequests,
remaining: Math.max(0, maxRequests - existing.count),
resetInMs: Math.max(0, existing.resetAt - now),
};
}
function logThrottled(level, key, ...args) {
const now = Date.now();
const last = logThrottleState.get(key) || 0;
if (now - last < RELAY_LOG_THROTTLE_MS) return;
logThrottleState.set(key, now);
console[level](...args);
}
const METRICS_WINDOW_SECONDS = Math.max(10, Number(process.env.RELAY_METRICS_WINDOW_SECONDS || 60));
const relayMetricsBuckets = new Map(); // key: unix second -> rolling metrics bucket
const relayMetricsLifetime = {
openskyRequests: 0,
openskyCacheHit: 0,
openskyNegativeHit: 0,
openskyDedup: 0,
openskyDedupNeg: 0,
openskyDedupEmpty: 0,
openskyMiss: 0,
openskyUpstreamFetches: 0,
drops: 0,
};
let relayMetricsQueueMaxLifetime = 0;
let relayMetricsCurrentSec = 0;
let relayMetricsCurrentBucket = null;
let relayMetricsLastPruneSec = 0;
function createRelayMetricsBucket() {
return {
openskyRequests: 0,
openskyCacheHit: 0,
openskyNegativeHit: 0,
openskyDedup: 0,
openskyDedupNeg: 0,
openskyDedupEmpty: 0,
openskyMiss: 0,
openskyUpstreamFetches: 0,
drops: 0,
queueMax: 0,
};
}
function getMetricsNowSec() {
return Math.floor(Date.now() / 1000);
}
function pruneRelayMetricsBuckets(nowSec = getMetricsNowSec()) {
const minSec = nowSec - METRICS_WINDOW_SECONDS + 1;
for (const sec of relayMetricsBuckets.keys()) {
if (sec < minSec) relayMetricsBuckets.delete(sec);
}
if (relayMetricsCurrentSec < minSec) {
relayMetricsCurrentSec = 0;
relayMetricsCurrentBucket = null;
}
}
function getRelayMetricsBucket(nowSec = getMetricsNowSec()) {
if (nowSec !== relayMetricsLastPruneSec) {
pruneRelayMetricsBuckets(nowSec);
relayMetricsLastPruneSec = nowSec;
}
if (relayMetricsCurrentBucket && relayMetricsCurrentSec === nowSec) {
return relayMetricsCurrentBucket;
}
let bucket = relayMetricsBuckets.get(nowSec);
if (!bucket) {
bucket = createRelayMetricsBucket();
relayMetricsBuckets.set(nowSec, bucket);
}
relayMetricsCurrentSec = nowSec;
relayMetricsCurrentBucket = bucket;
return bucket;
}
function incrementRelayMetric(field, amount = 1) {
const bucket = getRelayMetricsBucket();
bucket[field] = (bucket[field] || 0) + amount;
if (Object.prototype.hasOwnProperty.call(relayMetricsLifetime, field)) {
relayMetricsLifetime[field] += amount;
}
}
function sampleRelayQueueSize(queueSize) {
const bucket = getRelayMetricsBucket();
if (queueSize > bucket.queueMax) bucket.queueMax = queueSize;
if (queueSize > relayMetricsQueueMaxLifetime) relayMetricsQueueMaxLifetime = queueSize;
}
function safeRatio(numerator, denominator) {
if (!denominator) return 0;
return Number((numerator / denominator).toFixed(4));
}
function getRelayRollingMetrics() {
const nowSec = getMetricsNowSec();
const minSec = nowSec - METRICS_WINDOW_SECONDS + 1;
pruneRelayMetricsBuckets(nowSec);
const rollup = createRelayMetricsBucket();
for (const [sec, bucket] of relayMetricsBuckets) {
if (sec < minSec) continue;
rollup.openskyRequests += bucket.openskyRequests;
rollup.openskyCacheHit += bucket.openskyCacheHit;
rollup.openskyNegativeHit += bucket.openskyNegativeHit;
rollup.openskyDedup += bucket.openskyDedup;
rollup.openskyDedupNeg += bucket.openskyDedupNeg;
rollup.openskyDedupEmpty += bucket.openskyDedupEmpty;
rollup.openskyMiss += bucket.openskyMiss;
rollup.openskyUpstreamFetches += bucket.openskyUpstreamFetches;
rollup.drops += bucket.drops;
if (bucket.queueMax > rollup.queueMax) rollup.queueMax = bucket.queueMax;
}
const dedupCount = rollup.openskyDedup + rollup.openskyDedupNeg + rollup.openskyDedupEmpty;
const cacheServedCount = rollup.openskyCacheHit + rollup.openskyNegativeHit + dedupCount;
return {
windowSeconds: METRICS_WINDOW_SECONDS,
generatedAt: new Date().toISOString(),
opensky: {
requests: rollup.openskyRequests,
hitRatio: safeRatio(cacheServedCount, rollup.openskyRequests),
dedupRatio: safeRatio(dedupCount, rollup.openskyRequests),
cacheHits: rollup.openskyCacheHit,
negativeHits: rollup.openskyNegativeHit,
dedupHits: dedupCount,
misses: rollup.openskyMiss,
upstreamFetches: rollup.openskyUpstreamFetches,
global429CooldownRemainingMs: Math.max(0, openskyGlobal429Until - Date.now()),
requestSpacingMs: OPENSKY_REQUEST_SPACING_MS,
},
ais: {
queueMax: rollup.queueMax,
currentQueue: getUpstreamQueueSize(),
drops: rollup.drops,
dropsPerSec: Number((rollup.drops / METRICS_WINDOW_SECONDS).toFixed(4)),
upstreamPaused,
},
lifetime: {
openskyRequests: relayMetricsLifetime.openskyRequests,
openskyCacheHit: relayMetricsLifetime.openskyCacheHit,
openskyNegativeHit: relayMetricsLifetime.openskyNegativeHit,
openskyDedup: relayMetricsLifetime.openskyDedup + relayMetricsLifetime.openskyDedupNeg + relayMetricsLifetime.openskyDedupEmpty,
openskyMiss: relayMetricsLifetime.openskyMiss,
openskyUpstreamFetches: relayMetricsLifetime.openskyUpstreamFetches,
drops: relayMetricsLifetime.drops,
queueMax: relayMetricsQueueMaxLifetime,
},
};
}
// AIS aggregate state for snapshot API (server-side fanout)
const GRID_SIZE = 2;
const DENSITY_WINDOW = 30 * 60 * 1000; // 30 minutes
const GAP_THRESHOLD = 60 * 60 * 1000; // 1 hour
const SNAPSHOT_INTERVAL_MS = Math.max(2000, Number(process.env.AIS_SNAPSHOT_INTERVAL_MS || 5000));
const CANDIDATE_RETENTION_MS = 2 * 60 * 60 * 1000; // 2 hours
const MAX_DENSITY_ZONES = 200;
const MAX_CANDIDATE_REPORTS = 1500;
const vessels = new Map();
const vesselHistory = new Map();
const densityGrid = new Map();
const candidateReports = new Map();
let snapshotSequence = 0;
let lastSnapshot = null;
let lastSnapshotAt = 0;
// Pre-serialized cache: avoids JSON.stringify + gzip per request
let lastSnapshotJson = null; // cached JSON string (no candidates)
let lastSnapshotGzip = null; // cached gzip buffer (no candidates)
let lastSnapshotWithCandJson = null;
let lastSnapshotWithCandGzip = null;
// Chokepoint spatial index: maintain per-chokepoint vessel membership at ingest time
// instead of scanning O(chokepoints * vessels) on every snapshot
const chokepointBuckets = new Map(); // key: chokepoint name -> Set of MMSIs
const vesselChokepoints = new Map(); // key: MMSI -> Set of chokepoint names
const CHOKEPOINTS = [
{ name: 'Strait of Hormuz', lat: 26.5, lon: 56.5, radius: 2 },
{ name: 'Suez Canal', lat: 30.0, lon: 32.5, radius: 1 },
{ name: 'Strait of Malacca', lat: 2.5, lon: 101.5, radius: 2 },
{ name: 'Bab el-Mandeb', lat: 12.5, lon: 43.5, radius: 1.5 },
{ name: 'Panama Canal', lat: 9.0, lon: -79.5, radius: 1 },
{ name: 'Taiwan Strait', lat: 24.5, lon: 119.5, radius: 2 },
{ name: 'South China Sea', lat: 15.0, lon: 115.0, radius: 5 },
{ name: 'Black Sea', lat: 43.5, lon: 34.0, radius: 3 },
];
const NAVAL_PREFIX_RE = /^(USS|USNS|HMS|HMAS|HMCS|INS|JS|ROKS|TCG|FS|BNS|RFS|PLAN|PLA|CGC|PNS|KRI|ITS|SNS|MMSI)/i;
function getGridKey(lat, lon) {
const gridLat = Math.floor(lat / GRID_SIZE) * GRID_SIZE;
const gridLon = Math.floor(lon / GRID_SIZE) * GRID_SIZE;
return `${gridLat},${gridLon}`;
}
// Heuristic military/law-enforcement detector: AIS ship type 35 (military ops),
// 55 (law enforcement), 50-59 (special craft), known naval name prefixes,
// or MMSI suffix patterns commonly used by state vessels.
function isLikelyMilitaryCandidate(meta) {
const mmsi = String(meta?.MMSI || '');
const shipType = Number(meta?.ShipType);
const name = (meta?.ShipName || '').trim().toUpperCase();
if (Number.isFinite(shipType) && (shipType === 35 || shipType === 55 || (shipType >= 50 && shipType <= 59))) {
return true;
}
if (name && NAVAL_PREFIX_RE.test(name)) return true;
if (mmsi.length >= 9) {
const suffix = mmsi.substring(3);
if (suffix.startsWith('00') || suffix.startsWith('99')) return true;
}
return false;
}
function getUpstreamQueueSize() {
return upstreamQueue.length - upstreamQueueReadIndex;
}
function enqueueUpstreamMessage(raw) {
upstreamQueue.push(raw);
sampleRelayQueueSize(getUpstreamQueueSize());
}
function dequeueUpstreamMessage() {
if (upstreamQueueReadIndex >= upstreamQueue.length) return null;
const raw = upstreamQueue[upstreamQueueReadIndex++];
// Periodically compact the consumed prefix so the queue array does not grow without bound.
if (upstreamQueueReadIndex >= 1024 && upstreamQueueReadIndex * 2 >= upstreamQueue.length) {
upstreamQueue = upstreamQueue.slice(upstreamQueueReadIndex);
upstreamQueueReadIndex = 0;
}
return raw;
}
function clearUpstreamQueue() {
upstreamQueue = [];
upstreamQueueReadIndex = 0;
upstreamDrainScheduled = false;
sampleRelayQueueSize(0);
}
// Trim a Map to maxSize by deleting the entries with the oldest timestamps first.
function evictMapByTimestamp(map, maxSize, getTimestamp) {
if (map.size <= maxSize) return;
const sorted = [...map.entries()].sort((a, b) => {
const tsA = Number(getTimestamp(a[1])) || 0;
const tsB = Number(getTimestamp(b[1])) || 0;
return tsA - tsB;
});
const removeCount = map.size - maxSize;
for (let i = 0; i < removeCount; i++) {
map.delete(sorted[i][0]);
}
}
function removeVesselFromChokepoints(mmsi) {
const previous = vesselChokepoints.get(mmsi);
if (!previous) return;
for (const cpName of previous) {
const bucket = chokepointBuckets.get(cpName);
if (!bucket) continue;
bucket.delete(mmsi);
if (bucket.size === 0) chokepointBuckets.delete(cpName);
}
vesselChokepoints.delete(mmsi);
}
function updateVesselChokepoints(mmsi, lat, lon) {
const next = new Set();
for (const cp of CHOKEPOINTS) {
const dlat = lat - cp.lat;
const dlon = lon - cp.lon;
if (dlat * dlat + dlon * dlon <= cp.radius * cp.radius) {
next.add(cp.name);
}
}
const previous = vesselChokepoints.get(mmsi) || new Set();
for (const cpName of previous) {
if (next.has(cpName)) continue;
const bucket = chokepointBuckets.get(cpName);
if (!bucket) continue;
bucket.delete(mmsi);
if (bucket.size === 0) chokepointBuckets.delete(cpName);
}
for (const cpName of next) {
let bucket = chokepointBuckets.get(cpName);
if (!bucket) {
bucket = new Set();
chokepointBuckets.set(cpName, bucket);
}
bucket.add(mmsi);
}
if (next.size === 0) vesselChokepoints.delete(mmsi);
else vesselChokepoints.set(mmsi, next);
}
function processRawUpstreamMessage(raw) {
messageCount++;
if (messageCount % 5000 === 0) {
const mem = process.memoryUsage();
console.log(`[Relay] ${messageCount} msgs, ${clients.size} ws-clients, ${vessels.size} vessels, queue=${getUpstreamQueueSize()}, dropped=${droppedMessages}, rss=${(mem.rss / 1024 / 1024).toFixed(0)}MB heap=${(mem.heapUsed / 1024 / 1024).toFixed(0)}MB, cache: opensky=${openskyResponseCache.size} opensky_neg=${openskyNegativeCache.size} rss_feed=${rssResponseCache.size}`);
}
try {
const parsed = JSON.parse(raw);
if (parsed?.MessageType === 'PositionReport') {
processPositionReportForSnapshot(parsed);
}
} catch {
// Ignore malformed upstream payloads
}
// Heavily throttled WS fanout: forward only every 50th message.
// The app primarily uses HTTP snapshot polling; WS is for rare external consumers.
if (clients.size > 0 && messageCount % 50 === 0) {
const message = raw.toString();
for (const client of clients) {
if (client.readyState === WebSocket.OPEN) {
// Per-client backpressure: skip if client buffer is backed up
if (client.bufferedAmount < 1024 * 1024) {
client.send(message);
}
}
}
}
}
function processPositionReportForSnapshot(data) {
const meta = data?.MetaData;
const pos = data?.Message?.PositionReport;
if (!meta || !pos) return;
const mmsi = String(meta.MMSI || '');
if (!mmsi) return;
const lat = Number.isFinite(pos.Latitude) ? pos.Latitude : meta.latitude;
const lon = Number.isFinite(pos.Longitude) ? pos.Longitude : meta.longitude;
if (!Number.isFinite(lat) || !Number.isFinite(lon)) return;
const now = Date.now();
vessels.set(mmsi, {
mmsi,
name: meta.ShipName || '',
lat,
lon,
timestamp: now,
shipType: meta.ShipType,
heading: pos.TrueHeading,
speed: pos.Sog,
course: pos.Cog,
});
const history = vesselHistory.get(mmsi) || [];
history.push(now);
if (history.length > 10) history.shift();
vesselHistory.set(mmsi, history);
const gridKey = getGridKey(lat, lon);
let cell = densityGrid.get(gridKey);
if (!cell) {
cell = {
lat: Math.floor(lat / GRID_SIZE) * GRID_SIZE + GRID_SIZE / 2,
lon: Math.floor(lon / GRID_SIZE) * GRID_SIZE + GRID_SIZE / 2,
vessels: new Set(),
lastUpdate: now,
previousCount: 0,
};
densityGrid.set(gridKey, cell);
}
cell.vessels.add(mmsi);
cell.lastUpdate = now;
// Maintain exact chokepoint membership so moving vessels don't get "stuck" in old buckets.
updateVesselChokepoints(mmsi, lat, lon);
if (isLikelyMilitaryCandidate(meta)) {
candidateReports.set(mmsi, {
mmsi,
name: meta.ShipName || '',
lat,
lon,
shipType: meta.ShipType,
heading: pos.TrueHeading,
speed: pos.Sog,
course: pos.Cog,
timestamp: now,
});
}
}
function cleanupAggregates() {
const now = Date.now();
const cutoff = now - DENSITY_WINDOW;
for (const [mmsi, vessel] of vessels) {
if (vessel.timestamp < cutoff) {
vessels.delete(mmsi);
removeVesselFromChokepoints(mmsi);
}
}
// Hard cap: if still over limit, evict oldest
if (vessels.size > MAX_VESSELS) {
const sorted = [...vessels.entries()].sort((a, b) => a[1].timestamp - b[1].timestamp);
const toRemove = sorted.slice(0, vessels.size - MAX_VESSELS);
for (const [mmsi] of toRemove) {
vessels.delete(mmsi);
removeVesselFromChokepoints(mmsi);
}
}
for (const [mmsi, history] of vesselHistory) {
const filtered = history.filter((ts) => ts >= cutoff);
if (filtered.length === 0) {
vesselHistory.delete(mmsi);
} else {
vesselHistory.set(mmsi, filtered);
}
}
// Hard cap: keep the most recent vessel histories.
evictMapByTimestamp(vesselHistory, MAX_VESSEL_HISTORY, (history) => history[history.length - 1] || 0);
for (const [key, cell] of densityGrid) {
cell.previousCount = cell.vessels.size;
for (const mmsi of cell.vessels) {
const vessel = vessels.get(mmsi);
if (!vessel || vessel.timestamp < cutoff) {
cell.vessels.delete(mmsi);
}
}
if (cell.vessels.size === 0 && now - cell.lastUpdate > DENSITY_WINDOW * 2) {
densityGrid.delete(key);
}
}
// Hard cap: keep the most recently updated cells.
evictMapByTimestamp(densityGrid, MAX_DENSITY_CELLS, (cell) => cell.lastUpdate || 0);
for (const [mmsi, report] of candidateReports) {
if (report.timestamp < now - CANDIDATE_RETENTION_MS) {
candidateReports.delete(mmsi);
}
}
// Hard cap: keep freshest candidate reports.
evictMapByTimestamp(candidateReports, MAX_CANDIDATE_REPORTS, (report) => report.timestamp || 0);
// Clean chokepoint buckets: remove stale vessels
for (const [cpName, bucket] of chokepointBuckets) {
for (const mmsi of bucket) {
if (vessels.has(mmsi)) continue;
bucket.delete(mmsi);
const memberships = vesselChokepoints.get(mmsi);
if (memberships) {
memberships.delete(cpName);
if (memberships.size === 0) vesselChokepoints.delete(mmsi);
}
}
if (bucket.size === 0) chokepointBuckets.delete(cpName);
}
}
function detectDisruptions() {
const disruptions = [];
const now = Date.now();
// O(chokepoints) using pre-built spatial buckets instead of O(chokepoints × vessels)
for (const chokepoint of CHOKEPOINTS) {
const bucket = chokepointBuckets.get(chokepoint.name);
const vesselCount = bucket ? bucket.size : 0;
if (vesselCount >= 5) {
const normalTraffic = chokepoint.radius * 10; // crude baseline scaled by chokepoint size
const severity = vesselCount > normalTraffic * 1.5
? 'high'
: vesselCount > normalTraffic
? 'elevated'
: 'low';
disruptions.push({
id: `chokepoint-${chokepoint.name.toLowerCase().replace(/\s+/g, '-')}`,
name: chokepoint.name,
type: 'chokepoint_congestion',
lat: chokepoint.lat,
lon: chokepoint.lon,
severity,
changePct: normalTraffic > 0 ? Math.round((vesselCount / normalTraffic - 1) * 100) : 0,
windowHours: 1,
vesselCount,
region: chokepoint.name,
description: `${vesselCount} vessels in ${chokepoint.name}`,
});
}
}
let darkShipCount = 0;
for (const history of vesselHistory.values()) {
if (history.length >= 2) {
const lastSeen = history[history.length - 1];
const secondLast = history[history.length - 2];
if (lastSeen - secondLast > GAP_THRESHOLD && now - lastSeen < 10 * 60 * 1000) {
darkShipCount++;
}
}
}
if (darkShipCount >= 1) {
disruptions.push({
id: 'global-gap-spike',
name: 'AIS Gap Spike Detected',
type: 'gap_spike',
lat: 0,
lon: 0,
severity: darkShipCount > 20 ? 'high' : darkShipCount > 10 ? 'elevated' : 'low',
changePct: darkShipCount * 10,
windowHours: 1,
darkShips: darkShipCount,
description: `${darkShipCount} vessels returned after extended AIS silence`,
});
}
return disruptions;
}
function calculateDensityZones() {
const zones = [];
const allCells = Array.from(densityGrid.values()).filter((c) => c.vessels.size >= 2);
if (allCells.length === 0) return zones;
const vesselCounts = allCells.map((c) => c.vessels.size);
const maxVessels = Math.max(...vesselCounts);
const minVessels = Math.min(...vesselCounts);
const logMax = Math.log(maxVessels + 1);
const logMin = Math.log(minVessels + 1);
for (const [key, cell] of densityGrid) {
if (cell.vessels.size < 2) continue;
const logCurrent = Math.log(cell.vessels.size + 1);
const intensity = logMax > logMin
? 0.2 + (0.8 * (logCurrent - logMin) / (logMax - logMin))
: 0.5;
const deltaPct = cell.previousCount > 0
? Math.round(((cell.vessels.size - cell.previousCount) / cell.previousCount) * 100)
: 0;
zones.push({
id: `density-${key}`,
name: `Zone ${key}`,
lat: cell.lat,
lon: cell.lon,
intensity,
deltaPct,
shipsPerDay: cell.vessels.size * 48, // extrapolate the 30-minute density window to 24h
note: cell.vessels.size >= 10 ? 'High traffic area' : undefined,
});
}
return zones
.sort((a, b) => b.intensity - a.intensity)
.slice(0, MAX_DENSITY_ZONES);
}
function getCandidateReportsSnapshot() {
return Array.from(candidateReports.values())
.sort((a, b) => b.timestamp - a.timestamp)
.slice(0, MAX_CANDIDATE_REPORTS);
}
function buildSnapshot() {
const now = Date.now();
if (lastSnapshot && now - lastSnapshotAt < Math.floor(SNAPSHOT_INTERVAL_MS / 2)) {
return lastSnapshot;
}
cleanupAggregates();
snapshotSequence++;
lastSnapshot = {
sequence: snapshotSequence,
timestamp: new Date(now).toISOString(),
status: {
connected: upstreamSocket?.readyState === WebSocket.OPEN,
vessels: vessels.size,
messages: messageCount,
clients: clients.size,
droppedMessages,
},
disruptions: detectDisruptions(),
density: calculateDensityZones(),
};
lastSnapshotAt = now;
// Pre-serialize JSON once (avoid per-request JSON.stringify)
const basePayload = { ...lastSnapshot, candidateReports: [] };
lastSnapshotJson = JSON.stringify(basePayload);
const withCandPayload = { ...lastSnapshot, candidateReports: getCandidateReportsSnapshot() };
lastSnapshotWithCandJson = JSON.stringify(withCandPayload);
// Pre-gzip both variants asynchronously (zero CPU on request path)
zlib.gzip(Buffer.from(lastSnapshotJson), (err, buf) => {
if (!err) lastSnapshotGzip = buf;
});
zlib.gzip(Buffer.from(lastSnapshotWithCandJson), (err, buf) => {
if (!err) lastSnapshotWithCandGzip = buf;
});
return lastSnapshot;
}
setInterval(() => {
if (upstreamSocket?.readyState === WebSocket.OPEN || vessels.size > 0) {
buildSnapshot();
}
}, SNAPSHOT_INTERVAL_MS);
// UCDP GED Events cache (persistent in-memory — Railway advantage)
const UCDP_CACHE_TTL_MS = 6 * 60 * 60 * 1000; // 6 hours
const UCDP_PAGE_SIZE = 1000;
const UCDP_MAX_PAGES = 12;
const UCDP_FETCH_TIMEOUT = 30000; // 30s per page (no Railway limit)
const UCDP_TRAILING_WINDOW_MS = 365 * 24 * 60 * 60 * 1000;
let ucdpCache = { data: null, timestamp: 0 };
let ucdpFetchInProgress = false;
const UCDP_VIOLENCE_TYPE_MAP = {
1: 'state-based',
2: 'non-state',
3: 'one-sided',
};
function ucdpParseDateMs(value) {
if (!value) return NaN;
return Date.parse(String(value));
}
function ucdpGetMaxDateMs(events) {
let maxMs = NaN;
for (const event of events) {
const ms = ucdpParseDateMs(event?.date_start);
if (!Number.isFinite(ms)) continue;
if (!Number.isFinite(maxMs) || ms > maxMs) maxMs = ms;
}
return maxMs;
}
// UCDP GED releases are versioned "YY.1"; try the current year, then the
// previous year, then known-good fallbacks.
function ucdpBuildVersionCandidates() {
const year = new Date().getFullYear() - 2000;
return Array.from(new Set([`${year}.1`, `${year - 1}.1`, '25.1', '24.1']));
}
async function ucdpFetchPage(version, page) {
const url = `https://ucdpapi.pcr.uu.se/api/gedevents/${version}?pagesize=${UCDP_PAGE_SIZE}&page=${page}`;
return new Promise((resolve, reject) => {
const req = https.get(url, { headers: { Accept: 'application/json' }, timeout: UCDP_FETCH_TIMEOUT }, (res) => {
if (res.statusCode !== 200) {
res.resume();
return reject(new Error(`UCDP API ${res.statusCode} (v${version} p${page})`));
}
let data = '';
res.on('data', chunk => data += chunk);
res.on('end', () => {
try { resolve(JSON.parse(data)); }
catch (e) { reject(new Error(`UCDP JSON parse error: ${e.message}`)); }
});
});
req.on('error', reject);
req.on('timeout', () => { req.destroy(); reject(new Error('UCDP timeout')); });
});
}
async function ucdpDiscoverVersion() {
const candidates = ucdpBuildVersionCandidates();
for (const version of candidates) {
try {
const page0 = await ucdpFetchPage(version, 0);
if (Array.isArray(page0?.Result)) return { version, page0 };
} catch { /* next candidate */ }
}
throw new Error('No valid UCDP GED version found');
}
async function ucdpFetchAllEvents() {
const { version, page0 } = await ucdpDiscoverVersion();
const totalPages = Math.max(1, Number(page0?.TotalPages) || 1);
const newestPage = totalPages - 1;
let allEvents = [];
let latestDatasetMs = NaN;
for (let offset = 0; offset < UCDP_MAX_PAGES && (newestPage - offset) >= 0; offset++) {
const page = newestPage - offset;
const rawData = page === 0 ? page0 : await ucdpFetchPage(version, page);
const events = Array.isArray(rawData?.Result) ? rawData.Result : [];
allEvents = allEvents.concat(events);
const pageMaxMs = ucdpGetMaxDateMs(events);
if (!Number.isFinite(latestDatasetMs) && Number.isFinite(pageMaxMs)) {
latestDatasetMs = pageMaxMs;
}
if (Number.isFinite(latestDatasetMs) && Number.isFinite(pageMaxMs)) {
if (pageMaxMs < latestDatasetMs - UCDP_TRAILING_WINDOW_MS) break;
}
console.log(`[UCDP] Fetched v${version} page ${page} (${events.length} events)`);
}
const sanitized = allEvents
.filter(e => {
if (!Number.isFinite(latestDatasetMs)) return true;
const ms = ucdpParseDateMs(e?.date_start);
return Number.isFinite(ms) && ms >= (latestDatasetMs - UCDP_TRAILING_WINDOW_MS);
})
.map(e => ({
id: String(e.id || ''),
date_start: e.date_start || '',
date_end: e.date_end || '',
latitude: Number(e.latitude) || 0,
longitude: Number(e.longitude) || 0,
country: e.country || '',
side_a: (e.side_a || '').substring(0, 200),
side_b: (e.side_b || '').substring(0, 200),
deaths_best: Number(e.best) || 0,
deaths_low: Number(e.low) || 0,
deaths_high: Number(e.high) || 0,
type_of_violence: UCDP_VIOLENCE_TYPE_MAP[e.type_of_violence] || 'state-based',
source_original: (e.source_original || '').substring(0, 300),
}))
.sort((a, b) => {
const bMs = ucdpParseDateMs(b.date_start);
const aMs = ucdpParseDateMs(a.date_start);
return (Number.isFinite(bMs) ? bMs : 0) - (Number.isFinite(aMs) ? aMs : 0);
});
return {
success: true,
count: sanitized.length,
data: sanitized,
version,
cached_at: new Date().toISOString(),
};
}
async function handleUcdpEventsRequest(req, res) {
const now = Date.now();
if (ucdpCache.data && now - ucdpCache.timestamp < UCDP_CACHE_TTL_MS) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=3600',
'X-Cache': 'HIT',
}, JSON.stringify(ucdpCache.data));
}
if (ucdpCache.data && !ucdpFetchInProgress) {
ucdpFetchInProgress = true;
ucdpFetchAllEvents()
.then(result => {
ucdpCache = { data: result, timestamp: Date.now() };
console.log(`[UCDP] Background refresh: ${result.count} events (v${result.version})`);
})
.catch(err => console.error('[UCDP] Background refresh error:', err.message))
.finally(() => { ucdpFetchInProgress = false; });
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=600',
'X-Cache': 'STALE',
}, JSON.stringify(ucdpCache.data));
}
if (ucdpFetchInProgress) {
res.writeHead(202, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify({ success: false, count: 0, data: [], cached_at: '', message: 'Fetch in progress' }));
}
try {
ucdpFetchInProgress = true;
console.log('[UCDP] Cold fetch starting...');
const result = await ucdpFetchAllEvents();
ucdpCache = { data: result, timestamp: Date.now() };
ucdpFetchInProgress = false;
console.log(`[UCDP] Cold fetch complete: ${result.count} events (v${result.version})`);
sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=3600',
'X-Cache': 'MISS',
}, JSON.stringify(result));
} catch (err) {
ucdpFetchInProgress = false;
console.error('[UCDP] Fetch error:', err.message);
res.writeHead(500, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ success: false, error: err.message, count: 0, data: [] }));
}
}
// ── Response caches (eliminates ~1.2TB/day OpenSky + ~30GB/day RSS egress) ──
const openskyResponseCache = new Map(); // key: sorted query params → { data, gzip, timestamp }
const openskyNegativeCache = new Map(); // key: cacheKey → { status, timestamp, body, gzip } — prevents retry storms on 429/5xx
const openskyInFlight = new Map(); // key: cacheKey → Promise (dedup concurrent requests)
const OPENSKY_CACHE_TTL_MS = Number(process.env.OPENSKY_CACHE_TTL_MS) || 60 * 1000; // 60s default — env-configurable
const OPENSKY_NEGATIVE_CACHE_TTL_MS = Number(process.env.OPENSKY_NEGATIVE_CACHE_TTL_MS) || 30 * 1000; // 30s — env-configurable
const OPENSKY_CACHE_MAX_ENTRIES = Math.max(10, Number(process.env.OPENSKY_CACHE_MAX_ENTRIES || 128));
const OPENSKY_NEGATIVE_CACHE_MAX_ENTRIES = Math.max(10, Number(process.env.OPENSKY_NEGATIVE_CACHE_MAX_ENTRIES || 256));
const OPENSKY_BBOX_QUANT_STEP = Number.isFinite(Number(process.env.OPENSKY_BBOX_QUANT_STEP))
? Math.max(0, Number(process.env.OPENSKY_BBOX_QUANT_STEP)) : 0.01;
const OPENSKY_BBOX_DECIMALS = OPENSKY_BBOX_QUANT_STEP > 0
? Math.min(6, ((String(OPENSKY_BBOX_QUANT_STEP).split('.')[1] || '').length || 0))
: 6;
const OPENSKY_DEDUP_EMPTY_RESPONSE_JSON = JSON.stringify({ states: [], time: 0 });
const OPENSKY_DEDUP_EMPTY_RESPONSE_GZIP = gzipSyncBuffer(OPENSKY_DEDUP_EMPTY_RESPONSE_JSON);
const rssResponseCache = new Map(); // key: feed URL → { data, contentType, timestamp, statusCode }
const rssInFlight = new Map(); // key: feed URL → Promise (dedup concurrent requests)
const RSS_CACHE_TTL_MS = 5 * 60 * 1000; // 5 min — RSS feeds rarely update faster
const RSS_NEGATIVE_CACHE_TTL_MS = 60 * 1000; // 1 min — cache failures to prevent thundering herd
const RSS_CACHE_MAX_ENTRIES = 200; // hard cap — ~20 allowed domains × ~5 paths max, with headroom
function setBoundedCacheEntry(cache, key, value, maxEntries) {
if (!cache.has(key) && cache.size >= maxEntries) {
const oldest = cache.keys().next().value;
if (oldest !== undefined) cache.delete(oldest);
}
cache.set(key, value);
}
// Refresh insertion order (delete + re-set) so bounded caches evict least-recently-used first.
function touchCacheEntry(cache, key, entry) {
cache.delete(key);
cache.set(key, entry);
}
function cacheOpenSkyPositive(cacheKey, data) {
setBoundedCacheEntry(openskyResponseCache, cacheKey, {
data,
gzip: gzipSyncBuffer(data),
timestamp: Date.now(),
}, OPENSKY_CACHE_MAX_ENTRIES);
}
function cacheOpenSkyNegative(cacheKey, status) {
const now = Date.now();
const body = JSON.stringify({ states: [], time: now });
setBoundedCacheEntry(openskyNegativeCache, cacheKey, {
status,
timestamp: now,
body,
gzip: gzipSyncBuffer(body),
}, OPENSKY_NEGATIVE_CACHE_MAX_ENTRIES);
}
function quantizeCoordinate(value) {
if (!OPENSKY_BBOX_QUANT_STEP) return value;
return Math.round(value / OPENSKY_BBOX_QUANT_STEP) * OPENSKY_BBOX_QUANT_STEP;
}
function formatCoordinate(value) {
return Number(value.toFixed(OPENSKY_BBOX_DECIMALS)).toString();
}
function normalizeOpenSkyBbox(params) {
const keys = ['lamin', 'lomin', 'lamax', 'lomax'];
const hasAny = keys.some(k => params.has(k));
if (!hasAny) {
return { cacheKey: ',,,', queryParams: [] };
}
if (!keys.every(k => params.has(k))) {
return { error: 'Provide all bbox params: lamin,lomin,lamax,lomax' };
}
const values = {};
for (const key of keys) {
const raw = params.get(key);
if (raw === null || raw.trim() === '') return { error: `Invalid ${key} value` };
const parsed = Number(raw);
if (!Number.isFinite(parsed)) return { error: `Invalid ${key} value` };
values[key] = parsed;
}
if (values.lamin < -90 || values.lamax > 90 || values.lomin < -180 || values.lomax > 180) {
return { error: 'Bbox out of range' };
}
if (values.lamin > values.lamax || values.lomin > values.lomax) {
return { error: 'Invalid bbox ordering' };
}
const normalized = {};
for (const key of keys) normalized[key] = formatCoordinate(quantizeCoordinate(values[key]));
return {
cacheKey: keys.map(k => normalized[k]).join(','),
queryParams: keys.map(k => `${k}=${encodeURIComponent(normalized[k])}`),
};
}
// OpenSky OAuth2 token cache + mutex to prevent thundering herd
let openskyToken = null;
let openskyTokenExpiry = 0;
let openskyTokenPromise = null; // mutex: single in-flight token request
let openskyAuthCooldownUntil = 0; // backoff after repeated failures
const OPENSKY_AUTH_COOLDOWN_MS = 60000; // 1 min cooldown after auth failure
// Global OpenSky rate limiter — serializes upstream requests and enforces 429 cooldown
let openskyGlobal429Until = 0; // timestamp: block ALL upstream requests until this time
const OPENSKY_429_COOLDOWN_MS = Number(process.env.OPENSKY_429_COOLDOWN_MS) || 90 * 1000; // 90s cooldown after any 429
const OPENSKY_REQUEST_SPACING_MS = Number(process.env.OPENSKY_REQUEST_SPACING_MS) || 2000; // 2s minimum between consecutive upstream requests
let openskyLastUpstreamTime = 0;
let openskyUpstreamQueue = Promise.resolve(); // serial chain — only 1 upstream request at a time
async function getOpenSkyToken() {
const clientId = process.env.OPENSKY_CLIENT_ID;
const clientSecret = process.env.OPENSKY_CLIENT_SECRET;
if (!clientId || !clientSecret) {
return null;
}
// Return cached token if still valid (with 60s buffer)
if (openskyToken && Date.now() < openskyTokenExpiry - 60000) {
return openskyToken;
}
// Cooldown: don't retry auth if it recently failed (prevents stampede)
if (Date.now() < openskyAuthCooldownUntil) {
return null;
}
// Mutex: if a token fetch is already in flight, wait for it
if (openskyTokenPromise) {
return openskyTokenPromise;
}
openskyTokenPromise = _fetchOpenSkyToken(clientId, clientSecret);
try {
return await openskyTokenPromise;
} finally {
openskyTokenPromise = null;
}
}
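The mutex above is the "single-flight" pattern: concurrent callers join one in-flight promise instead of each firing their own token request. A generic sketch of the same shape (names here are hypothetical, not part of the relay):

```javascript
// Single-flight wrapper: at most one fetchFn() call is in flight at a time;
// concurrent callers all receive the same promise.
function makeSingleFlight(fetchFn) {
  let inFlight = null;
  return async function () {
    if (inFlight) return inFlight; // join the existing request
    inFlight = fetchFn();
    try {
      return await inFlight;
    } finally {
      inFlight = null; // allow the next refresh once this one settles
    }
  };
}
```

As in `getOpenSkyToken`, clearing the promise in `finally` matters: a failed fetch must not pin a rejected promise forever.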
function _attemptOpenSkyTokenFetch(clientId, clientSecret) {
return new Promise((resolve) => {
const postData = `grant_type=client_credentials&client_id=${encodeURIComponent(clientId)}&client_secret=${encodeURIComponent(clientSecret)}`;
const req = https.request({
hostname: 'auth.opensky-network.org',
port: 443,
family: 4,
path: '/auth/realms/opensky-network/protocol/openid-connect/token',
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
'Content-Length': Buffer.byteLength(postData),
'User-Agent': 'WorldMonitor/1.0',
},
timeout: 10000
}, (res) => {
let data = '';
res.on('data', chunk => data += chunk);
res.on('end', () => {
try {
const json = JSON.parse(data);
if (json.access_token) {
resolve({ token: json.access_token, expiresIn: json.expires_in || 1800 });
} else {
resolve({ error: json.error || 'no_access_token', status: res.statusCode });
}
} catch (e) {
resolve({ error: `parse: ${e.message}`, status: res.statusCode });
}
});
});
req.on('error', (err) => {
resolve({ error: `${err.code || 'UNKNOWN'}: ${err.message}` });
});
req.on('timeout', () => {
req.destroy();
resolve({ error: 'TIMEOUT' });
});
req.write(postData);
req.end();
});
}
const OPENSKY_AUTH_MAX_RETRIES = 3;
const OPENSKY_AUTH_RETRY_DELAYS = [0, 2000, 5000];
async function _fetchOpenSkyToken(clientId, clientSecret) {
try {
for (let attempt = 0; attempt < OPENSKY_AUTH_MAX_RETRIES; attempt++) {
if (attempt > 0) {
const delay = OPENSKY_AUTH_RETRY_DELAYS[attempt] || 5000;
console.log(`[Relay] OpenSky auth retry ${attempt + 1}/${OPENSKY_AUTH_MAX_RETRIES} in ${delay}ms...`);
await new Promise(r => setTimeout(r, delay));
} else {
console.log('[Relay] Fetching new OpenSky OAuth2 token...');
}
const result = await _attemptOpenSkyTokenFetch(clientId, clientSecret);
if (result.token) {
openskyToken = result.token;
openskyTokenExpiry = Date.now() + result.expiresIn * 1000;
console.log('[Relay] OpenSky token acquired, expires in', result.expiresIn, 'seconds');
return openskyToken;
}
console.error(`[Relay] OpenSky auth attempt ${attempt + 1} failed:`, result.error, result.status ? `(HTTP ${result.status})` : '');
}
openskyAuthCooldownUntil = Date.now() + OPENSKY_AUTH_COOLDOWN_MS;
console.warn(`[Relay] OpenSky auth failed after ${OPENSKY_AUTH_MAX_RETRIES} attempts, cooling down for ${OPENSKY_AUTH_COOLDOWN_MS / 1000}s`);
return null;
} catch (err) {
console.error('[Relay] OpenSky token error:', err.message);
openskyAuthCooldownUntil = Date.now() + OPENSKY_AUTH_COOLDOWN_MS;
return null;
}
}
// Promisified upstream OpenSky fetch (single request)
function _openskyRawFetch(url, token) {
return new Promise((resolve) => {
const request = https.get(url, {
family: 4,
headers: {
'Accept': 'application/json',
'User-Agent': 'WorldMonitor/1.0',
'Authorization': `Bearer ${token}`,
},
timeout: 15000,
}, (response) => {
let data = '';
response.on('data', chunk => data += chunk);
response.on('end', () => resolve({ status: response.statusCode || 502, data }));
});
request.on('error', (err) => resolve({ status: 0, data: null, error: err }));
request.on('timeout', () => { request.destroy(); resolve({ status: 504, data: null, error: new Error('timeout') }); });
});
}
// Serialized queue — ensures only 1 upstream request at a time with minimum spacing.
// Prevents 5 concurrent bbox queries from all getting 429'd.
function openskyQueuedFetch(url, token) {
const job = openskyUpstreamQueue.then(async () => {
if (Date.now() < openskyGlobal429Until) {
return { status: 429, data: JSON.stringify({ states: [], time: Date.now() }), rateLimited: true };
}
const wait = OPENSKY_REQUEST_SPACING_MS - (Date.now() - openskyLastUpstreamTime);
if (wait > 0) await new Promise(r => setTimeout(r, wait));
if (Date.now() < openskyGlobal429Until) {
return { status: 429, data: JSON.stringify({ states: [], time: Date.now() }), rateLimited: true };
}
openskyLastUpstreamTime = Date.now();
return _openskyRawFetch(url, token);
});
openskyUpstreamQueue = job.catch(() => {});
return job;
}
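The queue above serializes jobs by chaining them onto one shared promise. A stripped-down sketch of that serializer (the 429-cooldown check is omitted; `spacingMs` and the helper names are illustrative):

```javascript
// Promise-chain serializer: jobs run one at a time, in submission order, with a
// minimum gap between starts. The chain itself swallows failures so one failed
// job cannot poison every job queued after it.
function makeSerialQueue(spacingMs) {
  let chain = Promise.resolve();
  let lastStart = 0;
  return function enqueue(job) {
    const run = chain.then(async () => {
      const wait = spacingMs - (Date.now() - lastStart);
      if (wait > 0) await new Promise(r => setTimeout(r, wait));
      lastStart = Date.now();
      return job();
    });
    chain = run.catch(() => {}); // keep the chain alive past rejections
    return run; // caller still sees the real result or rejection
  };
}
```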
async function handleOpenSkyRequest(req, res, PORT) {
let cacheKey = '';
let settleFlight = null;
try {
const url = new URL(req.url, `http://localhost:${PORT}`);
const params = url.searchParams;
const normalizedBbox = normalizeOpenSkyBbox(params);
if (normalizedBbox.error) {
return safeEnd(res, 400, { 'Content-Type': 'application/json' }, JSON.stringify({
error: normalizedBbox.error,
time: Date.now(),
states: [],
}));
}
cacheKey = normalizedBbox.cacheKey;
incrementRelayMetric('openskyRequests');
// 1. Check positive cache (30s TTL)
const cached = openskyResponseCache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < OPENSKY_CACHE_TTL_MS) {
incrementRelayMetric('openskyCacheHit');
touchCacheEntry(openskyResponseCache, cacheKey, cached); // LRU
return sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=30',
'X-Cache': 'HIT',
}, cached.data, cached.gzip);
}
// 2. Check negative cache — prevents retry storms when upstream returns 429/5xx
const negCached = openskyNegativeCache.get(cacheKey);
if (negCached && Date.now() - negCached.timestamp < OPENSKY_NEGATIVE_CACHE_TTL_MS) {
incrementRelayMetric('openskyNegativeHit');
touchCacheEntry(openskyNegativeCache, cacheKey, negCached); // LRU
return sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'no-cache',
'X-Cache': 'NEG',
}, negCached.body, negCached.gzip);
}
// 2b. Global 429 cooldown — blocks ALL bbox queries when OpenSky is rate-limiting.
// Without this, 5 unique bbox keys all fire simultaneously when neg cache expires,
// ALL get 429'd, and the cycle repeats forever with zero data flowing.
if (Date.now() < openskyGlobal429Until) {
incrementRelayMetric('openskyNegativeHit');
cacheOpenSkyNegative(cacheKey, 429);
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'no-cache',
'X-Cache': 'RATE-LIMITED',
}, JSON.stringify({ states: [], time: Date.now() }));
}
// 3. Dedup concurrent requests — await in-flight and return result OR empty (never fall through)
const existing = openskyInFlight.get(cacheKey);
if (existing) {
try {
await existing;
} catch { /* in-flight failed */ }
const deduped = openskyResponseCache.get(cacheKey);
if (deduped && Date.now() - deduped.timestamp < OPENSKY_CACHE_TTL_MS) {
incrementRelayMetric('openskyDedup');
touchCacheEntry(openskyResponseCache, cacheKey, deduped); // LRU
return sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=30',
'X-Cache': 'DEDUP',
}, deduped.data, deduped.gzip);
}
const dedupNeg = openskyNegativeCache.get(cacheKey);
if (dedupNeg && Date.now() - dedupNeg.timestamp < OPENSKY_NEGATIVE_CACHE_TTL_MS) {
incrementRelayMetric('openskyDedupNeg');
touchCacheEntry(openskyNegativeCache, cacheKey, dedupNeg); // LRU
return sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'no-cache',
'X-Cache': 'DEDUP-NEG',
}, dedupNeg.body, dedupNeg.gzip);
}
// In-flight completed but no cache entry (upstream failed) — return empty instead of thundering herd
incrementRelayMetric('openskyDedupEmpty');
return sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'no-cache',
'X-Cache': 'DEDUP-EMPTY',
}, OPENSKY_DEDUP_EMPTY_RESPONSE_JSON, OPENSKY_DEDUP_EMPTY_RESPONSE_GZIP);
}
incrementRelayMetric('openskyMiss');
// 4. Set in-flight BEFORE async token fetch to prevent race window
let resolveFlight;
let flightSettled = false;
const flightPromise = new Promise((resolve) => { resolveFlight = resolve; });
settleFlight = () => {
if (flightSettled) return;
flightSettled = true;
resolveFlight();
};
openskyInFlight.set(cacheKey, flightPromise);
const token = await getOpenSkyToken();
if (!token) {
// Do NOT negative-cache auth failures — they poison ALL bbox keys.
// Only negative-cache actual upstream 429/5xx responses.
settleFlight();
openskyInFlight.delete(cacheKey);
return safeEnd(res, 503, { 'Content-Type': 'application/json' },
JSON.stringify({ error: 'OpenSky not configured or auth failed', time: Date.now(), states: [] }));
}
let openskyUrl = 'https://opensky-network.org/api/states/all';
if (normalizedBbox.queryParams.length > 0) {
openskyUrl += '?' + normalizedBbox.queryParams.join('&');
}
logThrottled('log', `opensky-miss:${cacheKey}`, '[Relay] OpenSky request (MISS):', openskyUrl);
incrementRelayMetric('openskyUpstreamFetches');
// Serialized fetch — queued with spacing to prevent concurrent 429 storms
const result = await openskyQueuedFetch(openskyUrl, token);
const upstreamStatus = result.status || 502;
if (upstreamStatus === 401) {
openskyToken = null;
openskyTokenExpiry = 0;
}
if (upstreamStatus === 429 && !result.rateLimited) {
openskyGlobal429Until = Date.now() + OPENSKY_429_COOLDOWN_MS;
console.warn(`[Relay] OpenSky 429 — global cooldown ${OPENSKY_429_COOLDOWN_MS / 1000}s (all bbox queries blocked)`);
}
if (upstreamStatus === 200 && result.data) {
cacheOpenSkyPositive(cacheKey, result.data);
openskyNegativeCache.delete(cacheKey);
} else if (result.error) {
logThrottled('error', `opensky-error:${cacheKey}:${result.error.code || result.error.message}`, '[Relay] OpenSky error:', result.error.message);
cacheOpenSkyNegative(cacheKey, upstreamStatus || 500);
} else {
cacheOpenSkyNegative(cacheKey, upstreamStatus);
logThrottled('warn', `opensky-upstream-${upstreamStatus}:${cacheKey}`,
`[Relay] OpenSky upstream ${upstreamStatus} for ${openskyUrl}, negative-cached for ${OPENSKY_NEGATIVE_CACHE_TTL_MS / 1000}s`);
}
settleFlight();
openskyInFlight.delete(cacheKey);
// Serve stale cache on network errors
if (result.error && cached) {
return sendPreGzipped(req, res, 200, { 'Content-Type': 'application/json', 'X-Cache': 'STALE' }, cached.data, cached.gzip);
}
const responseData = result.data || JSON.stringify({ error: result.error?.message || 'upstream error', time: Date.now(), states: null });
return sendCompressed(req, res, upstreamStatus, {
'Content-Type': 'application/json',
'Cache-Control': upstreamStatus === 200 ? 'public, max-age=30' : 'no-cache',
'X-Cache': result.rateLimited ? 'RATE-LIMITED' : 'MISS',
}, responseData);
} catch (err) {
if (settleFlight) settleFlight();
if (!cacheKey) {
try {
const params = new URL(req.url, `http://localhost:${PORT}`).searchParams;
cacheKey = normalizeOpenSkyBbox(params).cacheKey || ',,,';
} catch {
cacheKey = ',,,';
}
}
openskyInFlight.delete(cacheKey);
safeEnd(res, 500, { 'Content-Type': 'application/json' },
JSON.stringify({ error: err.message, time: Date.now(), states: null }));
}
}
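The handler above consults its caches in a fixed order: fresh positive entry, then fresh negative entry, then (after the cooldown and dedup checks) upstream. A self-contained sketch of that two-tier lookup, with illustrative TTL values rather than the relay's constants:

```javascript
// Two-tier cache lookup: positive entries are preferred, negative entries
// (cached upstream failures) suppress retries for a shorter window, and only
// a full miss should reach upstream.
function lookupTiered(positive, negative, key, now, posTtlMs, negTtlMs) {
  const pos = positive.get(key);
  if (pos && now - pos.timestamp < posTtlMs) return { tier: 'HIT', entry: pos };
  const neg = negative.get(key);
  if (neg && now - neg.timestamp < negTtlMs) return { tier: 'NEG', entry: neg };
  return { tier: 'MISS' };
}
```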
// ── World Bank proxy (World Bank blocks Vercel edge IPs with 403) ──
const worldbankCache = new Map(); // key: query string → { data, timestamp }
const WORLDBANK_CACHE_TTL_MS = 30 * 60 * 1000; // 30 min — data rarely changes
function handleWorldBankRequest(req, res) {
const url = new URL(req.url, `http://localhost:${PORT}`);
const qs = url.search || '';
const cacheKey = qs;
const cached = worldbankCache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < WORLDBANK_CACHE_TTL_MS) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=1800',
'X-Cache': 'HIT',
}, cached.data);
}
// Passthrough: the client sends the same params as /api/worldbank, so we
// re-fetch from upstream using those params directly
const wbParams = new URLSearchParams(url.searchParams);
const action = wbParams.get('action');
if (action === 'indicators') {
// Static response — return indicator list directly (same as api/worldbank.js)
const indicators = {
'IT.NET.USER.ZS': 'Internet Users (% of population)',
'IT.CEL.SETS.P2': 'Mobile Subscriptions (per 100 people)',
'IT.NET.BBND.P2': 'Fixed Broadband Subscriptions (per 100 people)',
'IT.NET.SECR.P6': 'Secure Internet Servers (per million people)',
'GB.XPD.RSDV.GD.ZS': 'R&D Expenditure (% of GDP)',
'IP.PAT.RESD': 'Patent Applications (residents)',
'IP.PAT.NRES': 'Patent Applications (non-residents)',
'IP.TMK.TOTL': 'Trademark Applications',
'TX.VAL.TECH.MF.ZS': 'High-Tech Exports (% of manufactured exports)',
'BX.GSR.CCIS.ZS': 'ICT Service Exports (% of service exports)',
'TM.VAL.ICTG.ZS.UN': 'ICT Goods Imports (% of total goods imports)',
'SE.TER.ENRR': 'Tertiary Education Enrollment (%)',
'SE.XPD.TOTL.GD.ZS': 'Education Expenditure (% of GDP)',
'NY.GDP.MKTP.KD.ZG': 'GDP Growth (annual %)',
'NY.GDP.PCAP.CD': 'GDP per Capita (current US$)',
'NE.EXP.GNFS.ZS': 'Exports of Goods & Services (% of GDP)',
};
const defaultCountries = [
'USA','CHN','JPN','DEU','KOR','GBR','IND','ISR','SGP','TWN',
'FRA','CAN','SWE','NLD','CHE','FIN','IRL','AUS','BRA','IDN',
'ARE','SAU','QAT','BHR','EGY','TUR','MYS','THA','VNM','PHL',
'ESP','ITA','POL','CZE','DNK','NOR','AUT','BEL','PRT','EST',
'MEX','ARG','CHL','COL','ZAF','NGA','KEN',
];
const body = JSON.stringify({ indicators, defaultCountries });
worldbankCache.set(cacheKey, { data: body, timestamp: Date.now() });
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=86400',
'X-Cache': 'MISS',
}, body);
}
const indicator = wbParams.get('indicator');
if (!indicator) {
res.writeHead(400, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify({ error: 'Missing indicator parameter' }));
}
const country = wbParams.get('country');
const countries = wbParams.get('countries');
const years = Math.max(1, Math.min(60, parseInt(wbParams.get('years') || '5', 10) || 5));
const countryList = country || (countries ? countries.split(',').join(';') : [
'USA','CHN','JPN','DEU','KOR','GBR','IND','ISR','SGP','TWN',
'FRA','CAN','SWE','NLD','CHE','FIN','IRL','AUS','BRA','IDN',
'ARE','SAU','QAT','BHR','EGY','TUR','MYS','THA','VNM','PHL',
'ESP','ITA','POL','CZE','DNK','NOR','AUT','BEL','PRT','EST',
'MEX','ARG','CHL','COL','ZAF','NGA','KEN',
].join(';'));
const currentYear = new Date().getFullYear();
const startYear = currentYear - years;
const TECH_INDICATORS = {
'IT.NET.USER.ZS': 'Internet Users (% of population)',
'IT.CEL.SETS.P2': 'Mobile Subscriptions (per 100 people)',
'IT.NET.BBND.P2': 'Fixed Broadband Subscriptions (per 100 people)',
'IT.NET.SECR.P6': 'Secure Internet Servers (per million people)',
'GB.XPD.RSDV.GD.ZS': 'R&D Expenditure (% of GDP)',
'IP.PAT.RESD': 'Patent Applications (residents)',
'IP.PAT.NRES': 'Patent Applications (non-residents)',
'IP.TMK.TOTL': 'Trademark Applications',
'TX.VAL.TECH.MF.ZS': 'High-Tech Exports (% of manufactured exports)',
'BX.GSR.CCIS.ZS': 'ICT Service Exports (% of service exports)',
'TM.VAL.ICTG.ZS.UN': 'ICT Goods Imports (% of total goods imports)',
'SE.TER.ENRR': 'Tertiary Education Enrollment (%)',
'SE.XPD.TOTL.GD.ZS': 'Education Expenditure (% of GDP)',
'NY.GDP.MKTP.KD.ZG': 'GDP Growth (annual %)',
'NY.GDP.PCAP.CD': 'GDP per Capita (current US$)',
'NE.EXP.GNFS.ZS': 'Exports of Goods & Services (% of GDP)',
};
const wbUrl = `https://api.worldbank.org/v2/country/${countryList}/indicator/${encodeURIComponent(indicator)}?format=json&date=${startYear}:${currentYear}&per_page=1000`;
console.log('[Relay] World Bank request (MISS):', indicator);
const request = https.get(wbUrl, {
headers: {
'Accept': 'application/json',
'User-Agent': 'Mozilla/5.0 (compatible; WorldMonitor/1.0; +https://worldmonitor.app)',
},
timeout: 15000,
}, (response) => {
if (response.statusCode !== 200) {
response.resume(); // drain the upstream socket so the connection can be reused
res.writeHead(response.statusCode || 502, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify({ error: `World Bank API ${response.statusCode}` }));
}
let rawData = '';
response.on('data', chunk => rawData += chunk);
response.on('end', () => {
try {
const parsed = JSON.parse(rawData);
// Transform raw World Bank response to match client-expected format
if (!parsed || !Array.isArray(parsed) || parsed.length < 2 || !parsed[1]) {
const empty = JSON.stringify({
indicator,
indicatorName: TECH_INDICATORS[indicator] || indicator,
metadata: { page: 1, pages: 1, total: 0 },
byCountry: {}, latestByCountry: {}, timeSeries: [],
});
worldbankCache.set(cacheKey, { data: empty, timestamp: Date.now() });
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=1800',
'X-Cache': 'MISS',
}, empty);
}
const [metadata, records] = parsed;
const transformed = {
indicator,
indicatorName: TECH_INDICATORS[indicator] || (records[0]?.indicator?.value || indicator),
metadata: { page: metadata.page, pages: metadata.pages, total: metadata.total },
byCountry: {}, latestByCountry: {}, timeSeries: [],
};
for (const record of records || []) {
const cc = record.countryiso3code || record.country?.id;
const cn = record.country?.value;
const yr = record.date;
const val = record.value;
if (!cc || val === null) continue;
if (!transformed.byCountry[cc]) transformed.byCountry[cc] = { code: cc, name: cn, values: [] };
transformed.byCountry[cc].values.push({ year: yr, value: val });
if (!transformed.latestByCountry[cc] || yr > transformed.latestByCountry[cc].year) {
transformed.latestByCountry[cc] = { code: cc, name: cn, year: yr, value: val };
}
transformed.timeSeries.push({ countryCode: cc, countryName: cn, year: yr, value: val });
}
for (const c of Object.values(transformed.byCountry)) c.values.sort((a, b) => a.year - b.year);
transformed.timeSeries.sort((a, b) => b.year - a.year || a.countryCode.localeCompare(b.countryCode));
const body = JSON.stringify(transformed);
worldbankCache.set(cacheKey, { data: body, timestamp: Date.now() });
sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=1800',
'X-Cache': 'MISS',
}, body);
} catch (e) {
console.error('[Relay] World Bank parse error:', e.message);
res.writeHead(500, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Parse error' }));
}
});
});
request.on('error', (err) => {
console.error('[Relay] World Bank error:', err.message);
if (cached) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'X-Cache': 'STALE',
}, cached.data);
}
res.writeHead(502, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: err.message }));
});
request.on('timeout', () => {
request.destroy();
if (cached) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'X-Cache': 'STALE',
}, cached.data);
}
res.writeHead(504, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'World Bank request timeout' }));
});
}
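The transform inside the handler above groups raw World Bank rows per country and tracks the newest non-null value separately. The same rollup as a standalone sketch (field names match the upstream response shape used above; the function name is hypothetical):

```javascript
// Roll World Bank rows up into per-country series plus a latest-value index.
// Years arrive as 4-digit strings, so string comparison orders them correctly.
function rollupWorldBank(records) {
  const byCountry = {};
  const latestByCountry = {};
  for (const r of records) {
    const cc = r.countryiso3code;
    if (!cc || r.value === null) continue; // skip rows with no usable datum
    if (!byCountry[cc]) byCountry[cc] = { code: cc, values: [] };
    byCountry[cc].values.push({ year: r.date, value: r.value });
    if (!latestByCountry[cc] || r.date > latestByCountry[cc].year) {
      latestByCountry[cc] = { code: cc, year: r.date, value: r.value };
    }
  }
  return { byCountry, latestByCountry };
}
```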
// ── Polymarket proxy (Cloudflare JA3 blocks Vercel edge runtime) ──
const polymarketCache = new Map(); // key: query string → { data, timestamp }
const POLYMARKET_CACHE_TTL_MS = 2 * 60 * 1000; // 2 min — market data changes frequently
function handlePolymarketRequest(req, res) {
const url = new URL(req.url, `http://localhost:${PORT}`);
const cacheKey = url.search || '';
const cached = polymarketCache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < POLYMARKET_CACHE_TTL_MS) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=120',
'X-Cache': 'HIT',
'X-Polymarket-Source': 'railway-cache',
}, cached.data);
}
const endpoint = url.searchParams.get('endpoint') || 'markets';
const params = new URLSearchParams();
params.set('closed', url.searchParams.get('closed') || 'false');
params.set('order', url.searchParams.get('order') || 'volume');
params.set('ascending', url.searchParams.get('ascending') || 'false');
const limit = Math.max(1, Math.min(100, parseInt(url.searchParams.get('limit') || '50', 10) || 50));
params.set('limit', String(limit));
const tag = url.searchParams.get('tag') || url.searchParams.get('tag_slug');
if (tag && endpoint === 'events') params.set('tag_slug', tag.replace(/[^a-z0-9-]/gi, '').slice(0, 100));
const gammaUrl = `https://gamma-api.polymarket.com/${endpoint}?${params}`;
console.log('[Relay] Polymarket request (MISS):', endpoint, tag || '');
const request = https.get(gammaUrl, {
headers: { 'Accept': 'application/json' },
timeout: 10000,
}, (response) => {
if (response.statusCode !== 200) {
console.error(`[Relay] Polymarket upstream ${response.statusCode}`);
response.resume(); // drain the upstream socket so the connection can be reused
res.writeHead(response.statusCode || 502, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify([]));
}
let data = '';
response.on('data', chunk => data += chunk);
response.on('end', () => {
polymarketCache.set(cacheKey, { data, timestamp: Date.now() });
sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=120',
'X-Cache': 'MISS',
'X-Polymarket-Source': 'railway',
}, data);
});
});
request.on('error', (err) => {
console.error('[Relay] Polymarket error:', err.message);
if (cached) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'X-Cache': 'STALE',
'X-Polymarket-Source': 'railway-stale',
}, cached.data);
}
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify([]));
});
request.on('timeout', () => {
request.destroy();
if (cached) {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'X-Cache': 'STALE',
'X-Polymarket-Source': 'railway-stale',
}, cached.data);
}
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify([]));
});
}
// Periodic cache cleanup to prevent memory leaks
setInterval(() => {
const now = Date.now();
for (const [key, entry] of openskyResponseCache) {
if (now - entry.timestamp > OPENSKY_CACHE_TTL_MS * 2) openskyResponseCache.delete(key);
}
for (const [key, entry] of openskyNegativeCache) {
if (now - entry.timestamp > OPENSKY_NEGATIVE_CACHE_TTL_MS * 2) openskyNegativeCache.delete(key);
}
for (const [key, entry] of rssResponseCache) {
const maxAge = (entry.statusCode && entry.statusCode >= 200 && entry.statusCode < 300)
? RSS_CACHE_TTL_MS * 2 : RSS_NEGATIVE_CACHE_TTL_MS * 2;
if (now - entry.timestamp > maxAge) rssResponseCache.delete(key);
}
for (const [key, entry] of worldbankCache) {
if (now - entry.timestamp > WORLDBANK_CACHE_TTL_MS * 2) worldbankCache.delete(key);
}
for (const [key, entry] of polymarketCache) {
if (now - entry.timestamp > POLYMARKET_CACHE_TTL_MS * 2) polymarketCache.delete(key);
}
for (const [key, bucket] of requestRateBuckets) {
if (now >= bucket.resetAt + RELAY_RATE_LIMIT_WINDOW_MS * 2) requestRateBuckets.delete(key);
}
for (const [key, ts] of logThrottleState) {
if (now - ts > RELAY_LOG_THROTTLE_MS * 6) logThrottleState.delete(key);
}
}, 60 * 1000);
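Each loop in the sweep above applies the same rule: evict entries older than a multiple of their TTL so idle keys cannot accumulate. Factored out as a generic sketch (the relay inlines this per cache; the helper name is hypothetical):

```javascript
// Evict cache entries older than ttlMs * factor. The factor keeps recently
// stale entries around briefly so they can still be served as STALE fallbacks.
function sweepExpired(cache, ttlMs, now, factor = 2) {
  let evicted = 0;
  for (const [key, entry] of cache) {
    if (now - entry.timestamp > ttlMs * factor) {
      cache.delete(key);
      evicted++;
    }
  }
  return evicted;
}
```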
// CORS origin allowlist — only our domains can use this relay
const ALLOWED_ORIGINS = [
'https://worldmonitor.app',
'https://tech.worldmonitor.app',
'https://finance.worldmonitor.app',
'https://happy.worldmonitor.app', // HappyMonitor variant
'http://localhost:5173', // Vite dev
'http://localhost:5174', // Vite dev alt port
'http://localhost:4173', // Vite preview
'https://localhost', // Tauri desktop
'tauri://localhost', // Tauri iOS/macOS
];
function getCorsOrigin(req) {
const origin = req.headers.origin || '';
if (ALLOWED_ORIGINS.includes(origin)) return origin;
// Optional: allow Vercel preview deployments when explicitly enabled.
if (ALLOW_VERCEL_PREVIEW_ORIGINS && origin.endsWith('.vercel.app')) return origin;
return '';
}
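`getCorsOrigin` above echoes the Origin header back only on an exact allowlist match (or, optionally, a `.vercel.app` suffix); any other origin gets no CORS header at all, which the preflight handler then turns into a 403. The same check as a pure function (names are illustrative):

```javascript
// Exact-match CORS origin resolution: return the origin to echo back, or ''
// to withhold the Access-Control-Allow-Origin header entirely.
function resolveCorsOrigin(origin, allowlist, allowVercelPreviews = false) {
  if (allowlist.includes(origin)) return origin;
  if (allowVercelPreviews && origin.endsWith('.vercel.app')) return origin;
  return '';
}
```

Echoing a single validated origin (plus `Vary: Origin`, as the server does below) is safer than `*` because it keeps the allowlist enforceable while remaining cache-correct.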
const server = http.createServer(async (req, res) => {
const pathname = (req.url || '/').split('?')[0];
const corsOrigin = getCorsOrigin(req);
if (corsOrigin) {
res.setHeader('Access-Control-Allow-Origin', corsOrigin);
res.setHeader('Vary', 'Origin');
}
res.setHeader('Access-Control-Allow-Methods', 'GET, OPTIONS');
res.setHeader('Access-Control-Allow-Headers', `Content-Type, Authorization, ${RELAY_AUTH_HEADER}`);
// Handle CORS preflight
if (req.method === 'OPTIONS') {
res.writeHead(corsOrigin ? 204 : 403);
return res.end();
}
const isPublicRoute = pathname === '/health' || pathname === '/';
if (!isPublicRoute) {
if (!isAuthorizedRequest(req)) {
return safeEnd(res, 401, { 'Content-Type': 'application/json' },
JSON.stringify({ error: 'Unauthorized', time: Date.now() }));
}
const rl = consumeRateLimit(req, pathname);
if (rl.limited) {
const retryAfterSec = Math.max(1, Math.ceil(rl.resetInMs / 1000));
return safeEnd(res, 429, {
'Content-Type': 'application/json',
'Retry-After': String(retryAfterSec),
'X-RateLimit-Limit': String(rl.limit),
'X-RateLimit-Remaining': String(rl.remaining),
'X-RateLimit-Reset': String(retryAfterSec),
}, JSON.stringify({ error: 'Too many requests', time: Date.now() }));
}
}
if (pathname === '/health' || pathname === '/') {
const mem = process.memoryUsage();
sendCompressed(req, res, 200, { 'Content-Type': 'application/json' }, JSON.stringify({
status: 'ok',
clients: clients.size,
messages: messageCount,
droppedMessages,
connected: upstreamSocket?.readyState === WebSocket.OPEN,
upstreamPaused,
vessels: vessels.size,
densityZones: Array.from(densityGrid.values()).filter(c => c.vessels.size >= 2).length,
telegram: {
enabled: TELEGRAM_ENABLED,
channels: telegramState.channels?.length || 0,
items: telegramState.items?.length || 0,
lastPollAt: telegramState.lastPollAt ? new Date(telegramState.lastPollAt).toISOString() : null,
hasError: !!telegramState.lastError,
},
memory: {
rss: `${(mem.rss / 1024 / 1024).toFixed(0)}MB`,
heapUsed: `${(mem.heapUsed / 1024 / 1024).toFixed(0)}MB`,
heapTotal: `${(mem.heapTotal / 1024 / 1024).toFixed(0)}MB`,
},
cache: {
opensky: openskyResponseCache.size,
opensky_neg: openskyNegativeCache.size,
rss: rssResponseCache.size,
ucdp: ucdpCache.data ? 'warm' : 'cold',
worldbank: worldbankCache.size,
polymarket: polymarketCache.size,
},
auth: {
sharedSecretEnabled: !!RELAY_SHARED_SECRET,
authHeader: RELAY_AUTH_HEADER,
allowVercelPreviewOrigins: ALLOW_VERCEL_PREVIEW_ORIGINS,
},
rateLimit: {
windowMs: RELAY_RATE_LIMIT_WINDOW_MS,
defaultMax: RELAY_RATE_LIMIT_MAX,
openskyMax: RELAY_OPENSKY_RATE_LIMIT_MAX,
rssMax: RELAY_RSS_RATE_LIMIT_MAX,
},
}));
} else if (pathname === '/metrics') {
return sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'no-store',
}, JSON.stringify(getRelayRollingMetrics()));
} else if (pathname.startsWith('/ais/snapshot')) {
// Aggregated AIS snapshot for server-side fanout — serve pre-serialized + pre-gzipped
connectUpstream();
buildSnapshot(); // ensures cache is warm
const url = new URL(req.url, `http://localhost:${PORT}`);
const includeCandidates = url.searchParams.get('candidates') === 'true';
const json = includeCandidates ? lastSnapshotWithCandJson : lastSnapshotJson;
const gz = includeCandidates ? lastSnapshotWithCandGzip : lastSnapshotGzip;
if (json) {
sendPreGzipped(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=2',
}, json, gz);
} else {
// Cold start fallback
const payload = { ...lastSnapshot, candidateReports: includeCandidates ? getCandidateReportsSnapshot() : [] };
sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=2',
}, JSON.stringify(payload));
}
} else if (pathname === '/opensky-reset') {
openskyToken = null;
openskyTokenExpiry = 0;
openskyTokenPromise = null;
openskyAuthCooldownUntil = 0;
openskyGlobal429Until = 0;
openskyNegativeCache.clear();
console.log('[Relay] OpenSky auth + rate-limit state reset via /opensky-reset');
const tokenStart = Date.now();
const token = await getOpenSkyToken();
return sendCompressed(req, res, 200, { 'Content-Type': 'application/json' }, JSON.stringify({
reset: true,
tokenAcquired: !!token,
latencyMs: Date.now() - tokenStart,
negativeCacheCleared: true,
rateLimitCooldownCleared: true,
}));
} else if (pathname === '/opensky-diag') {
// Temporary diagnostic route with safe output only (no token payloads).
const now = Date.now();
const hasFreshToken = !!(openskyToken && now < openskyTokenExpiry - 60000);
const diag = { timestamp: new Date().toISOString(), steps: [] };
const clientId = process.env.OPENSKY_CLIENT_ID;
const clientSecret = process.env.OPENSKY_CLIENT_SECRET;
diag.steps.push({ step: 'env_check', hasClientId: !!clientId, hasClientSecret: !!clientSecret });
diag.steps.push({
step: 'auth_state',
cachedToken: !!openskyToken,
freshToken: hasFreshToken,
tokenExpiry: openskyTokenExpiry ? new Date(openskyTokenExpiry).toISOString() : null,
cooldownRemainingMs: Math.max(0, openskyAuthCooldownUntil - now),
tokenFetchInFlight: !!openskyTokenPromise,
global429CooldownRemainingMs: Math.max(0, openskyGlobal429Until - now),
requestSpacingMs: OPENSKY_REQUEST_SPACING_MS,
});
if (!clientId || !clientSecret) {
diag.steps.push({ step: 'FAILED', reason: 'Missing OPENSKY_CLIENT_ID or OPENSKY_CLIENT_SECRET' });
res.writeHead(200, { 'Content-Type': 'application/json', 'Cache-Control': 'no-store' });
return res.end(JSON.stringify(diag, null, 2));
}
// Use shared token path so diagnostics respect mutex + cooldown protections.
const tokenStart = Date.now();
const token = await getOpenSkyToken();
diag.steps.push({
step: 'token_request',
method: 'getOpenSkyToken',
success: !!token,
fromCache: hasFreshToken,
latencyMs: Date.now() - tokenStart,
cooldownRemainingMs: Math.max(0, openskyAuthCooldownUntil - Date.now()),
});
if (token) {
const apiResult = await new Promise((resolve) => {
const start = Date.now();
const apiReq = https.get('https://opensky-network.org/api/states/all?lamin=47&lomin=5&lamax=48&lomax=6', {
family: 4,
headers: { 'Authorization': `Bearer ${token}`, 'Accept': 'application/json' },
timeout: 15000,
}, (apiRes) => {
let data = '';
apiRes.on('data', chunk => data += chunk);
apiRes.on('end', () => resolve({
status: apiRes.statusCode,
latencyMs: Date.now() - start,
bodyLength: data.length,
statesCount: (data.match(/"states":\s*\[/) ? 'present' : 'missing'),
}));
});
apiReq.on('error', (err) => resolve({ error: err.message, code: err.code, latencyMs: Date.now() - start }));
apiReq.on('timeout', () => { apiReq.destroy(); resolve({ error: 'timeout', latencyMs: Date.now() - start }); });
});
diag.steps.push({ step: 'api_request', ...apiResult });
} else {
diag.steps.push({ step: 'api_request', skipped: true, reason: 'No token available (auth failure or cooldown active)' });
}
res.writeHead(200, { 'Content-Type': 'application/json', 'Cache-Control': 'no-store' });
res.end(JSON.stringify(diag, null, 2));
} else if (pathname === '/telegram' || pathname.startsWith('/telegram/')) {
// Telegram Early Signals feed (public channels)
try {
const url = new URL(req.url, `http://localhost:${PORT}`);
const limit = Math.max(1, Math.min(200, Number(url.searchParams.get('limit') || 50)));
const topic = (url.searchParams.get('topic') || '').trim().toLowerCase();
const channel = (url.searchParams.get('channel') || '').trim().toLowerCase();
const items = Array.isArray(telegramState.items) ? telegramState.items : [];
const filtered = items.filter((it) => {
if (topic && String(it.topic || '').toLowerCase() !== topic) return false;
if (channel && String(it.channel || '').toLowerCase() !== channel) return false;
return true;
}).slice(0, limit);
sendCompressed(req, res, 200, {
'Content-Type': 'application/json',
'Cache-Control': 'public, max-age=10',
}, JSON.stringify({
source: 'telegram',
earlySignal: true,
enabled: TELEGRAM_ENABLED,
count: filtered.length,
updatedAt: telegramState.lastPollAt ? new Date(telegramState.lastPollAt).toISOString() : null,
items: filtered,
}));
} catch (e) {
res.writeHead(500, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Internal error' }));
}
} else if (pathname.startsWith('/rss')) {
// Proxy RSS feeds that block Vercel IPs
let feedUrl = '';
try {
const url = new URL(req.url, `http://localhost:${PORT}`);
feedUrl = url.searchParams.get('url') || '';
if (!feedUrl) {
res.writeHead(400, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify({ error: 'Missing url parameter' }));
}
// Allow domains that block Vercel IPs (must match feeds.ts railwayRss usage)
const allowedDomains = [
// Original
'rss.cnn.com',
'www.defensenews.com',
'layoffs.fyi',
// International Organizations
'news.un.org',
'www.cisa.gov',
'www.iaea.org',
'www.who.int',
'www.crisisgroup.org',
// Middle East & Regional News
'english.alarabiya.net',
'www.arabnews.com',
'www.timesofisrael.com',
'www.scmp.com',
'kyivindependent.com',
'www.themoscowtimes.com',
// Africa
'feeds.24.com',
'feeds.capi24.com', // News24 redirect destination
'islandtimes.org',
'www.atlanticcouncil.org',
// RSSHub (NHK, MIIT, MOFCOM)
'rsshub.app',
];
const parsed = new URL(feedUrl);
if (!allowedDomains.includes(parsed.hostname)) {
res.writeHead(403, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify({ error: 'Domain not allowed on Railway proxy' }));
}
// Serve from cache if fresh (5 min for success, 1 min for failures)
const rssCached = rssResponseCache.get(feedUrl);
if (rssCached) {
const ttl = (rssCached.statusCode && rssCached.statusCode >= 200 && rssCached.statusCode < 300)
? RSS_CACHE_TTL_MS : RSS_NEGATIVE_CACHE_TTL_MS;
if (Date.now() - rssCached.timestamp < ttl) {
return sendCompressed(req, res, rssCached.statusCode || 200, {
'Content-Type': rssCached.contentType || 'application/xml',
'Cache-Control': rssCached.statusCode >= 200 && rssCached.statusCode < 300 ? 'public, max-age=300' : 'no-cache',
'X-Cache': 'HIT',
}, rssCached.data);
}
}
// In-flight dedup: if another request for the same feed is already fetching,
// wait for it and serve from cache instead of hammering upstream.
const existing = rssInFlight.get(feedUrl);
if (existing) {
try {
await existing;
const deduped = rssResponseCache.get(feedUrl);
if (deduped) {
return sendCompressed(req, res, deduped.statusCode || 200, {
'Content-Type': deduped.contentType || 'application/xml',
'Cache-Control': deduped.statusCode >= 200 && deduped.statusCode < 300 ? 'public, max-age=300' : 'no-cache',
'X-Cache': 'DEDUP',
}, deduped.data);
}
// In-flight completed but nothing cached — serve 502 instead of cascading
return safeEnd(res, 502, { 'Content-Type': 'application/json' },
JSON.stringify({ error: 'Upstream fetch completed but not cached' }));
} catch {
// In-flight fetch failed — serve 502 instead of starting another fetch
return safeEnd(res, 502, { 'Content-Type': 'application/json' },
JSON.stringify({ error: 'Upstream fetch failed' }));
}
}
logThrottled('log', `rss-miss:${feedUrl}`, '[Relay] RSS request (MISS):', feedUrl);
const fetchPromise = new Promise((resolveInFlight, rejectInFlight) => {
let responseHandled = false;
const sendError = (statusCode, message) => {
// Always settle the in-flight promise first so rssInFlight never leaks a
// forever-pending entry (dedup waiters would hang on it). Settling twice is a no-op.
rejectInFlight(new Error(message));
if (responseHandled || res.headersSent) return;
responseHandled = true;
res.writeHead(statusCode, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: message }));
};
const fetchWithRedirects = (url, redirectCount = 0) => {
if (redirectCount > 3) {
return sendError(502, 'Too many redirects');
}
const protocol = url.startsWith('https') ? https : http;
const request = protocol.get(url, {
headers: {
'Accept': 'application/rss+xml, application/xml, text/xml, */*',
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36',
'Accept-Language': 'en-US,en;q=0.9',
},
timeout: 15000
}, (response) => {
if ([301, 302, 303, 307, 308].includes(response.statusCode) && response.headers.location) {
const redirectUrl = response.headers.location.startsWith('http')
? response.headers.location
: new URL(response.headers.location, url).href;
logThrottled('log', `rss-redirect:${feedUrl}:${redirectUrl}`, `[Relay] Following redirect to: ${redirectUrl}`);
return fetchWithRedirects(redirectUrl, redirectCount + 1);
}
const encoding = response.headers['content-encoding'];
let stream = response;
if (encoding === 'gzip' || encoding === 'deflate') {
stream = encoding === 'gzip' ? response.pipe(zlib.createGunzip()) : response.pipe(zlib.createInflate());
}
const chunks = [];
stream.on('data', chunk => chunks.push(chunk));
stream.on('end', () => {
// Settle the in-flight promise even when the response was already handled
// elsewhere, so the rssInFlight entry is always cleaned up.
if (responseHandled || res.headersSent) { resolveInFlight(); return; }
responseHandled = true;
const data = Buffer.concat(chunks);
// Cache all responses: 2xx with full TTL, non-2xx with short TTL (negative cache)
// FIFO eviction: drop oldest-inserted entry if at capacity
if (rssResponseCache.size >= RSS_CACHE_MAX_ENTRIES && !rssResponseCache.has(feedUrl)) {
const oldest = rssResponseCache.keys().next().value;
if (oldest) rssResponseCache.delete(oldest);
}
const upstreamType = response.headers['content-type'] || 'application/xml';
rssResponseCache.set(feedUrl, { data, contentType: upstreamType, statusCode: response.statusCode, timestamp: Date.now() });
if (response.statusCode < 200 || response.statusCode >= 300) {
logThrottled('warn', `rss-upstream:${feedUrl}:${response.statusCode}`, `[Relay] RSS upstream ${response.statusCode} for ${feedUrl}`);
}
resolveInFlight();
sendCompressed(req, res, response.statusCode, {
'Content-Type': upstreamType,
'Cache-Control': response.statusCode >= 200 && response.statusCode < 300 ? 'public, max-age=300' : 'no-cache',
'X-Cache': 'MISS',
}, data);
});
stream.on('error', (err) => {
logThrottled('error', `rss-decompress:${feedUrl}:${err.code || err.message}`, '[Relay] Decompression error:', err.message);
sendError(502, 'Decompression failed: ' + err.message);
});
});
request.on('error', (err) => {
logThrottled('error', `rss-error:${feedUrl}:${err.code || err.message}`, '[Relay] RSS error:', err.message);
// Serve stale on error
if (rssCached) {
if (!responseHandled && !res.headersSent) {
responseHandled = true;
sendCompressed(req, res, 200, { 'Content-Type': 'application/xml', 'X-Cache': 'STALE' }, rssCached.data);
}
resolveInFlight();
return;
}
sendError(502, err.message);
});
request.on('timeout', () => {
request.destroy();
if (rssCached && !responseHandled && !res.headersSent) {
responseHandled = true;
sendCompressed(req, res, 200, { 'Content-Type': 'application/xml', 'X-Cache': 'STALE' }, rssCached.data);
resolveInFlight();
return;
}
sendError(504, 'Request timeout');
});
};
fetchWithRedirects(feedUrl);
}); // end fetchPromise
rssInFlight.set(feedUrl, fetchPromise);
fetchPromise.catch(() => {}).finally(() => rssInFlight.delete(feedUrl));
} catch (err) {
if (feedUrl) rssInFlight.delete(feedUrl);
if (!res.headersSent) {
res.writeHead(500, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: err.message }));
}
}
} else if (pathname.startsWith('/ucdp-events')) {
handleUcdpEventsRequest(req, res);
} else if (pathname.startsWith('/opensky')) {
handleOpenSkyRequest(req, res, PORT);
} else if (pathname.startsWith('/worldbank')) {
handleWorldBankRequest(req, res);
} else if (pathname.startsWith('/polymarket')) {
handlePolymarketRequest(req, res);
} else {
res.writeHead(404);
res.end();
}
});
function connectUpstream() {
// Skip if already connected or connecting
if (upstreamSocket?.readyState === WebSocket.OPEN ||
upstreamSocket?.readyState === WebSocket.CONNECTING) return;
console.log('[Relay] Connecting to aisstream.io...');
const socket = new WebSocket(AISSTREAM_URL);
upstreamSocket = socket;
clearUpstreamQueue();
upstreamPaused = false;
const scheduleUpstreamDrain = () => {
if (upstreamDrainScheduled) return;
upstreamDrainScheduled = true;
setImmediate(drainUpstreamQueue);
};
const drainUpstreamQueue = () => {
if (upstreamSocket !== socket) {
clearUpstreamQueue();
upstreamPaused = false;
return;
}
upstreamDrainScheduled = false;
const startedAt = Date.now();
let processed = 0;
while (processed < UPSTREAM_DRAIN_BATCH &&
getUpstreamQueueSize() > 0 &&
Date.now() - startedAt < UPSTREAM_DRAIN_BUDGET_MS) {
const raw = dequeueUpstreamMessage();
if (!raw) break;
processRawUpstreamMessage(raw);
processed++;
}
const queueSize = getUpstreamQueueSize();
if (queueSize >= UPSTREAM_QUEUE_HIGH_WATER && !upstreamPaused) {
upstreamPaused = true;
socket.pause();
console.warn(`[Relay] Upstream paused (queue=${queueSize}, dropped=${droppedMessages})`);
} else if (upstreamPaused && queueSize <= UPSTREAM_QUEUE_LOW_WATER) {
upstreamPaused = false;
socket.resume();
console.log(`[Relay] Upstream resumed (queue=${queueSize})`);
}
if (queueSize > 0) scheduleUpstreamDrain();
};
socket.on('open', () => {
// Verify this socket is still the current one (race condition guard)
if (upstreamSocket !== socket) {
console.log('[Relay] Stale socket open event, closing');
socket.close();
return;
}
console.log('[Relay] Connected to aisstream.io');
socket.send(JSON.stringify({
APIKey: API_KEY,
BoundingBoxes: [[[-90, -180], [90, 180]]],
FilterMessageTypes: ['PositionReport'],
}));
});
socket.on('message', (data) => {
if (upstreamSocket !== socket) return;
// ws can deliver Buffer, Buffer[] (fragmented frames), or ArrayBuffer; normalize all three
const raw = Buffer.isBuffer(data) ? data : Array.isArray(data) ? Buffer.concat(data) : Buffer.from(data);
if (getUpstreamQueueSize() >= UPSTREAM_QUEUE_HARD_CAP) {
droppedMessages++;
incrementRelayMetric('drops');
return;
}
enqueueUpstreamMessage(raw);
if (!upstreamPaused && getUpstreamQueueSize() >= UPSTREAM_QUEUE_HIGH_WATER) {
upstreamPaused = true;
socket.pause();
console.warn(`[Relay] Upstream paused (queue=${getUpstreamQueueSize()}, dropped=${droppedMessages})`);
}
scheduleUpstreamDrain();
});
socket.on('close', () => {
if (upstreamSocket === socket) {
upstreamSocket = null;
clearUpstreamQueue();
upstreamPaused = false;
console.log('[Relay] Disconnected, reconnecting in 5s...');
setTimeout(connectUpstream, 5000);
}
});
socket.on('error', (err) => {
console.error('[Relay] Upstream error:', err.message);
});
}
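// Backpressure sketch (illustrative summary of the handlers above; the real
// thresholds are the UPSTREAM_* constants defined elsewhere in this file):
//   - enqueue incoming frames until UPSTREAM_QUEUE_HARD_CAP, dropping (and counting) beyond it
//   - pause the upstream socket once the queue reaches UPSTREAM_QUEUE_HIGH_WATER
//   - drain up to UPSTREAM_DRAIN_BATCH messages per tick, within UPSTREAM_DRAIN_BUDGET_MS
//   - resume the socket once the queue falls back to UPSTREAM_QUEUE_LOW_WATER
// This assumes the usual ordering HARD_CAP > HIGH_WATER > LOW_WATER.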
const wss = new WebSocketServer({ server });
server.listen(PORT, () => {
console.log(`[Relay] WebSocket relay on port ${PORT}`);
startTelegramPollLoop();
});
wss.on('connection', (ws, req) => {
if (!isAuthorizedRequest(req)) {
ws.close(1008, 'Unauthorized');
return;
}
const wsOrigin = req.headers.origin || '';
if (wsOrigin && !getCorsOrigin(req)) {
ws.close(1008, 'Origin not allowed');
return;
}
if (clients.size >= MAX_WS_CLIENTS) {
console.log(`[Relay] WS client rejected (max ${MAX_WS_CLIENTS})`);
ws.close(1013, 'Max clients reached');
return;
}
console.log(`[Relay] Client connected (${clients.size + 1}/${MAX_WS_CLIENTS})`);
clients.add(ws);
connectUpstream();
ws.on('close', () => {
clients.delete(ws);
});
ws.on('error', (err) => {
console.error('[Relay] Client error:', err.message);
clients.delete(ws);
});
});
// Memory / health monitor — log every 60s and force GC if available
setInterval(() => {
const mem = process.memoryUsage();
const rssMB = mem.rss / 1024 / 1024;
console.log(`[Monitor] rss=${rssMB.toFixed(0)}MB heap=${(mem.heapUsed / 1024 / 1024).toFixed(0)}MB/${(mem.heapTotal / 1024 / 1024).toFixed(0)}MB external=${(mem.external / 1024 / 1024).toFixed(0)}MB vessels=${vessels.size} density=${densityGrid.size} candidates=${candidateReports.size} msgs=${messageCount} dropped=${droppedMessages}`);
// Emergency cleanup if memory exceeds 450MB RSS
if (rssMB > 450) {
console.warn('[Monitor] High memory — forcing aggressive cleanup');
cleanupAggregates();
// Clear heavy caches only (RSS/polymarket/worldbank are tiny, keep them)
openskyResponseCache.clear();
openskyNegativeCache.clear();
if (global.gc) global.gc();
}
}, 60 * 1000);
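// Example requests against the HTTP endpoints above (a sketch — host/port are
// hypothetical and any required auth headers are omitted):
//   curl 'http://localhost:3000/telegram?limit=20&topic=security'
//   curl 'http://localhost:3000/rss?url=https://rss.cnn.com/rss/cnn_latest.rss'
// The /rss proxy only accepts URLs whose hostname is in allowedDomains; anything
// else returns 403. Responses carry an X-Cache header of HIT, DEDUP, MISS, or STALE.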