feat: self-hosted Docker stack (#1521)

* feat: self-hosted Docker stack with nginx, Redis REST proxy, and seeders

Multi-stage Docker build: esbuild TS handler compilation, Vite frontend
build, nginx + Node.js API under supervisord. Upstash-compatible Redis
REST proxy with a command allowlist for security. AIS relay WebSocket
sidecar. Seeder wrapper script that auto-sources env vars from
docker-compose.override.yml. Self-hosting guide with architecture
diagram, API key setup, and troubleshooting.

Security: Redis proxy command allowlist (blocks FLUSHALL/CONFIG/EVAL),
nginx security headers (X-Content-Type-Options, X-Frame-Options,
Referrer-Policy), non-root container user.
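
The allowlist idea can be sketched in shell (the real proxy is Node.js; commands beyond the ones named in this log, such as GET/SET/DEL, are assumed examples, not the actual list):

```shell
#!/bin/sh
# Sketch of the proxy's deny-by-default idea: only listed Redis commands
# pass through; everything else (FLUSHALL, CONFIG, EVAL, ...) is rejected.
# The exact list lives in the Node.js proxy; this is illustrative.
ALLOWED="GET SET DEL EXPIRE HLEN LTRIM ZREVRANGE ZRANDMEMBER"

is_allowed() {
  cmd=$(printf '%s' "$1" | tr '[:lower:]' '[:upper:]')
  case " $ALLOWED " in
    *" $cmd "*) return 0 ;;
    *)          return 1 ;;
  esac
}

is_allowed set      && echo "set: allowed"
is_allowed FLUSHALL || echo "FLUSHALL: blocked"
```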

* feat(docker): add Docker secrets support for API keys

Entrypoint reads /run/secrets/* files and exports as env vars at
startup. Secrets take priority over environment block values and
stay out of docker inspect / process metadata.

Both methods (env vars and secrets) work simultaneously.
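
A minimal sketch of that entrypoint step (directory and variable names are illustrative stand-ins for /run/secrets; the real logic is in the image's entrypoint script):

```shell
#!/bin/sh
# Each file under the secrets dir is exported as an env var named after
# the file, overriding any value from the compose environment block.
SECRETS_DIR=$(mktemp -d)                 # stand-in for /run/secrets
printf 'gsk_demo123' > "$SECRETS_DIR/GROQ_API_KEY"

GROQ_API_KEY="from-environment-block"    # loses to the secret file
for f in "$SECRETS_DIR"/*; do
  [ -f "$f" ] || continue
  export "$(basename "$f")=$(cat "$f")"
done
echo "$GROQ_API_KEY"                     # -> gsk_demo123
```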

* fix(docker): point supervisord at templated nginx config

The entrypoint runs envsubst on nginx.conf.template and writes
the result to /tmp/nginx.conf (with LOCAL_API_PORT substituted
and listening on port 8080 for non-root). But supervisord was
still launching nginx with /etc/nginx/nginx.conf — the default
Alpine config that listens on port 80, which fails with
"Permission denied" under the non-root appuser.
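
The corrected supervisord entry, sketched as a config fragment (only the -c path comes from this log; the program name and extra flags are assumptions):

```ini
[program:nginx]
; Point nginx at the envsubst-rendered config in /tmp (listens on 8080,
; works for the non-root appuser), not the stock /etc/nginx/nginx.conf.
command=nginx -c /tmp/nginx.conf -g "daemon off;"
autorestart=true
```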

* fix(docker): remove KEYS from Redis allowlist, fix nginx header inheritance, add LLM vars to seeders

- Remove KEYS from redis-rest-proxy allowlist (O(N) blocking, Redis DoS risk)
- Move security headers into each nginx location block to prevent add_header
  inheritance suppression
- Add LLM_API_URL / LLM_API_KEY / LLM_MODEL to run-seeders.sh grep filter
  so LLM API keys set in docker-compose.override.yml are forwarded to seed scripts
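
The nginx behavior behind the header fix: `add_header` directives are inherited from the enclosing level only when the current block defines none of its own, so a location that adds any header silently drops the server-level security headers. A hedged sketch (the /api/ path and upstream port are illustrative; header values are from this log):

```nginx
location /api/ {
    # Any add_header here suppresses inheritance of server-level headers,
    # so the security headers must be repeated in each location block.
    add_header Cache-Control "no-store";
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-Frame-Options "DENY" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;
    proxy_pass http://127.0.0.1:46123;
}
```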

* fix(docker): add path-based POST to Redis proxy, expand allowlist, add missing seeder secrets

- Add POST /{command}/{args...} handler to redis-rest-proxy so Upstash-style
  path POSTs work (setCachedJson uses POST /set/<key>/<value>/EX/<ttl>)
- Expand allowlist: HLEN, LTRIM (seed-military-bases, seed-forecasts),
  ZREVRANGE (premium-stock-store), ZRANDMEMBER (seed-military-bases)
- Add ACLED_EMAIL, ACLED_PASSWORD, OPENROUTER_API_KEY, OLLAMA_API_URL,
  OLLAMA_MODEL to run-seeders.sh so override keys reach host-run seeders
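
The path form that handler has to accept can be illustrated with a small shell helper (purely illustrative; real parsing, URL-decoding, and body handling live in the Node.js proxy):

```shell
#!/bin/sh
# Split an Upstash-style POST path such as the one setCachedJson sends
# (POST /set/<key>/<value>/EX/<ttl>) into a Redis command plus arguments.
path_to_command() {
  printf '%s' "${1#/}" | tr '/' ' '
}

path_to_command "/set/news:top/cached-json/EX/300"
echo
```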

---------

Co-authored-by: Elie Habib <elie.habib@gmail.com>
Author: Jon Torrez (committed by GitHub)
Date:   2026-03-19 03:07:20 -05:00
Commit: f4183f99c7 (parent f2b84ac4c7)
14 changed files with 957 additions and 0 deletions

docker-compose.yml (new file, 112 lines)
@@ -0,0 +1,112 @@
# =============================================================================
# World Monitor — Docker / Podman Compose
# =============================================================================
# Self-contained stack: app + Redis + AIS relay.
#
# Quick start:
#   cp .env.example .env           # add your API keys
#   docker compose up -d --build
#
# The app will be available at http://localhost:3000
# =============================================================================
services:
  worldmonitor:
    build:
      context: .
      dockerfile: Dockerfile
    image: worldmonitor:latest
    container_name: worldmonitor
    ports:
      - "${WM_PORT:-3000}:8080"
    environment:
      UPSTASH_REDIS_REST_URL: "http://redis-rest:80"
      UPSTASH_REDIS_REST_TOKEN: "${REDIS_TOKEN:-wm-local-token}"
      LOCAL_API_PORT: "46123"
      LOCAL_API_MODE: "docker"
      LOCAL_API_CLOUD_FALLBACK: "false"
      WS_RELAY_URL: "http://ais-relay:3004"
      # LLM provider (any OpenAI-compatible endpoint)
      LLM_API_URL: "${LLM_API_URL:-}"
      LLM_API_KEY: "${LLM_API_KEY:-}"
      LLM_MODEL: "${LLM_MODEL:-}"
      GROQ_API_KEY: "${GROQ_API_KEY:-}"
      # Data source API keys (optional — features degrade gracefully)
      AISSTREAM_API_KEY: "${AISSTREAM_API_KEY:-}"
      FINNHUB_API_KEY: "${FINNHUB_API_KEY:-}"
      EIA_API_KEY: "${EIA_API_KEY:-}"
      FRED_API_KEY: "${FRED_API_KEY:-}"
      ACLED_ACCESS_TOKEN: "${ACLED_ACCESS_TOKEN:-}"
      NASA_FIRMS_API_KEY: "${NASA_FIRMS_API_KEY:-}"
      CLOUDFLARE_API_TOKEN: "${CLOUDFLARE_API_TOKEN:-}"
      AVIATIONSTACK_API: "${AVIATIONSTACK_API:-}"
    # Docker secrets (recommended for API keys — keeps them out of docker inspect).
    # Create secrets/ dir with one file per key, then uncomment below.
    # See SELF_HOSTING.md or docker-compose.override.yml for details.
    # secrets:
    #   - GROQ_API_KEY
    #   - AISSTREAM_API_KEY
    #   - FINNHUB_API_KEY
    #   - FRED_API_KEY
    #   - NASA_FIRMS_API_KEY
    #   - LLM_API_KEY
    depends_on:
      redis-rest:
        condition: service_started
      ais-relay:
        condition: service_started
    restart: unless-stopped

  ais-relay:
    build:
      context: .
      dockerfile: Dockerfile.relay
    image: worldmonitor-ais-relay:latest
    container_name: worldmonitor-ais-relay
    environment:
      AISSTREAM_API_KEY: "${AISSTREAM_API_KEY:-}"
      PORT: "3004"
    restart: unless-stopped

  redis:
    image: docker.io/redis:7-alpine
    container_name: worldmonitor-redis
    command: redis-server --maxmemory 256mb --maxmemory-policy allkeys-lru
    volumes:
      - redis-data:/data
    restart: unless-stopped

  redis-rest:
    build:
      context: docker
      dockerfile: Dockerfile.redis-rest
    image: worldmonitor-redis-rest:latest
    container_name: worldmonitor-redis-rest
    ports:
      - "127.0.0.1:8079:80"
    environment:
      SRH_TOKEN: "${REDIS_TOKEN:-wm-local-token}"
      SRH_CONNECTION_STRING: "redis://redis:6379"
    depends_on:
      - redis
    restart: unless-stopped

# Docker secrets — uncomment and point to your secret files.
# Example: echo "gsk_abc123" > secrets/groq_api_key.txt
# secrets:
#   GROQ_API_KEY:
#     file: ./secrets/groq_api_key.txt
#   AISSTREAM_API_KEY:
#     file: ./secrets/aisstream_api_key.txt
#   FINNHUB_API_KEY:
#     file: ./secrets/finnhub_api_key.txt
#   FRED_API_KEY:
#     file: ./secrets/fred_api_key.txt
#   NASA_FIRMS_API_KEY:
#     file: ./secrets/nasa_firms_api_key.txt
#   LLM_API_KEY:
#     file: ./secrets/llm_api_key.txt

volumes:
  redis-data: