Mirror of https://github.com/koala73/worldmonitor.git, synced 2026-04-25 17:14:57 +02:00
* fix(consumer-prices): ensure scrape job exits 0 to unblock aggregate and publish

  Playwright browser teardown or lingering event-loop handles were causing Node to exit with a non-zero code, silently breaking the `&&` chain so aggregate and publish never ran, leaving Redis empty and the panel stuck at "Data collection in progress". Wraps the entry point in an explicit async `main()` with `try`/`finally` and forces `process.exit(0)` so the `&&` chain always proceeds to aggregate and publish regardless of Playwright or pg cleanup errors.

* fix(consumer-prices): preserve non-zero exit code for real scrape failures

  The previous fix always exited 0, masking actual failures such as DB errors or config issues and making them look like successful runs to Railway. Now only teardown noise (Playwright/pg handles) is neutralized by forcing `process.exit()`; real failures set `process.exitCode = 1`, so Railway still sees a failed run and alerting remains accurate.
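The exit-handling pattern described by the second fix can be sketched roughly as follows. This is a minimal sketch, not worldmonitor's actual code: the `exitCodeFor` helper and the empty `main()` body are illustrative assumptions; only the `process.exitCode` / forced `process.exit()` split comes from the commit messages.

```javascript
// Decide the exit code: 0 for a clean run, 1 for a real scrape failure
// (hypothetical helper, shown to make the policy explicit).
function exitCodeFor(err) {
  return err ? 1 : 0;
}

async function main() {
  // Placeholder for the real work: launch Playwright, scrape consumer
  // prices, write rows via pg. Browser/pool handles opened here are the
  // kind of teardown noise that used to keep Node alive or flip the
  // exit code after the work had already succeeded.
}

main()
  .catch((err) => {
    // Real failure (DB error, bad config): record it so Railway still
    // reports the run as failed and alerting stays accurate.
    console.error(err);
    process.exitCode = exitCodeFor(err);
  })
  .finally(() => {
    // Force the exit so lingering Playwright/pg handles can never block
    // the `&& aggregate && publish` chain, while preserving any failure
    // code set above.
    process.exit(process.exitCode ?? 0);
  });
```

The key point is that `process.exit()` is called unconditionally in `finally`, so teardown problems cannot change the outcome, while the argument passed to it still reflects whether the scrape itself succeeded.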