mirror of
https://github.com/koala73/worldmonitor.git
synced 2026-04-25 17:14:57 +02:00
feat(brief): consolidate composer into digest cron (retire standalone service) (#3157)
* feat(brief): consolidate composer into digest cron (retire standalone service)
Merges the Phase 3a standalone Railway composer into the existing
digest cron. End state: one cron (seed-digest-notifications.mjs)
writes brief:{userId}:{issueDate} for every eligible user AND
dispatches the digest to their configured channels with a signed
magazine URL appended. Net -1 Railway service.
User's architectural note: "there is no reason to have 1 digest
preparing all and sending, then another doing a duplicate". This
delivers that — infrastructure consolidation, same send cadence,
single source of truth for brief envelopes.
File moves / deletes:
- scripts/seed-brief-composer.mjs → scripts/lib/brief-compose.mjs
Pure-helpers library: no main(), no env guards, no cron. Exports
composeBriefForRule + groupEligibleRulesByUser + dedupeRulesByUser
(shim) + shouldExitNonZero + date helpers + extractInsights.
- Dockerfile.seed-brief-composer → deleted.
- The seed-brief-composer Railway service is retired (user confirmed
they would delete it manually).
New files:
- scripts/lib/brief-url-sign.mjs — plain .mjs port of the sign path
in server/_shared/brief-url.ts (Web Crypto only, no node:crypto).
- tests/brief-url-sign.test.mjs — parity tests that confirm tokens
minted by the scripts-side signer verify via the edge-side verifier
and produce byte-identical output for identical input.
Digest cron (scripts/seed-digest-notifications.mjs):
- Reads news:insights:v1 once per run, composes per-user brief
envelopes, SETEX brief:{userId}:{issueDate} via body-POST pipeline.
- Signs magazine URL per user (BRIEF_URL_SIGNING_SECRET +
WORLDMONITOR_PUBLIC_BASE_URL new env requirements, see pre-merge).
- Injects magazineUrl into buildChannelBodies for every channel
(email, telegram, slack, discord) as a "📖 Open your WorldMonitor
Brief magazine" footer CTA.
- Email HTML gets a dedicated data-brief-cta-slot near the top of
the body with a styled button.
- Compose failures NEVER block the digest send — the digest cron's
existing behaviour is preserved when the brief pipeline has issues.
- Brief compose extracted to its own functions (composeBriefsForRun
+ composeAndStoreBriefForUser) to keep main's biome complexity at
baseline (64 — was 63 before; inline would have pushed to 117).
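The write path above boils down to one key pattern and one TTL. A minimal illustration (the key shape and 7-day TTL come from this PR; the envelope body and the `user_2abc` id are placeholders, not the real BriefEnvelope contract):

```javascript
// Illustrative shape of the per-user brief write described above.
const userId = 'user_2abc';            // hypothetical Clerk-style id
const issueDate = '2026-04-18';        // issueDateInTz(nowMs, rule.digestTimezone)
const key = `brief:${userId}:${issueDate}`;
const ttlSeconds = 7 * 24 * 60 * 60;   // 7-day TTL

// One body-POST pipeline command (not a path-embedded SETEX — large
// JSON envelopes would risk hitting URL length limits):
const command = ['SETEX', key, String(ttlSeconds), JSON.stringify({ issueDate })];
console.log(key); // brief:user_2abc:2026-04-18
```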
Tests: 98/98 across the brief suite. New parity tests confirm
cross-module signer agreement.
PRE-MERGE: add BRIEF_URL_SIGNING_SECRET and WORLDMONITOR_PUBLIC_BASE_URL
to the digest-notifications Railway service env (same values already
set on Vercel for Phase 2). Without them, brief compose is
auto-disabled and the digest falls back to its current behaviour — safe to
deploy before env is set.
* fix(brief): digest Dockerfile + propagate compose failure to exit code
Addresses two seventh-round review findings on PR #3157.
1. Cross-directory imports + current Railway build root (todo 230).
The consolidated digest cron imports from ../api, ../shared, and
(transitively via scripts/lib/brief-compose.mjs) ../server/_shared.
The running digest-notifications Railway service builds from the
scripts/ root — those parent paths are outside the deploy tree, so
the next rebuild would crash at startup with ERR_MODULE_NOT_FOUND.
New Dockerfile.digest-notifications (repo-root build context)
COPYs exactly the modules the cron needs: scripts/ contents,
scripts/lib/, shared/brief-envelope.*, shared/brief-filter.*,
server/_shared/brief-render.*, api/_upstash-json.js,
api/_seed-envelope.js. Tight list to keep the watch surface small.
Pattern matches the retired Dockerfile.seed-brief-composer + the
existing Dockerfile.relay.
2. Silent compose failures (todo 231). composeBriefsForRun logged
counters but never exited non-zero. An Upstash outage or missing
signing secret silently dropped every brief write while Railway
showed the cron green. The retired standalone composer exited 1
on structural failures; that observability was lost in the
consolidation.
Changed the compose fn to return {briefByUser, composeSuccess,
composeFailed}. Main captures the counters, runs the full digest
send loop first (compose-layer breakage must NEVER block
user-visible digest delivery), then calls shouldExitNonZero at the
very end. Exit-on-failure gives ops the Railway-red signal
without touching send behaviour.
Also: a total read failure of news:insights:v1 (catch branch)
now counts as 1 compose failure so the gate trips on
insights-key infra breakage, not just per-user write failures.
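The gate's denominator contract is easiest to see with the function body (copied verbatim from scripts/lib/brief-compose.mjs in this PR) next to a few concrete counter values:

```javascript
// Exit gate: non-zero exit when the failure ratio is structurally bad.
// Body as it appears in scripts/lib/brief-compose.mjs.
function shouldExitNonZero({ success, failed, thresholdRatio = 0.05 }) {
  if (failed <= 0) return false;
  const attempted = success + failed;
  if (attempted <= 0) return false;
  const threshold = Math.max(1, Math.floor(attempted * thresholdRatio));
  return failed >= threshold;
}

console.log(shouldExitNonZero({ success: 0, failed: 0 }));  // false — nothing attempted
console.log(shouldExitNonZero({ success: 10, failed: 1 }));  // true — threshold floors at 1
console.log(shouldExitNonZero({ success: 99, failed: 1 }));  // false — 1 < floor(100 * 0.05)
```

The `Math.max(1, …)` floor is what lets a single failure (e.g. the missing-secret case counted as `composeFailed = 1` with zero successes) trip the gate on small runs.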
Tests unchanged (98/98). Typecheck + node --check clean. Biome
complexity ticks 63→65 — same pre-existing bucket, already tolerated
by CI; no new blocker.
PRE-MERGE Railway work still pending: set BRIEF_URL_SIGNING_SECRET
+ WORLDMONITOR_PUBLIC_BASE_URL on the digest-notifications service,
AND switch its dockerfilePath to /Dockerfile.digest-notifications
before merging. Without the dockerfilePath switch, the next rebuild
fails.
* fix(brief): Dockerfile type:module + explicit missing-secret tripwire
Addresses two eighth-round review findings on PR #3157.
1. ESM .js files parse as CommonJS in the container (todo 232).
Dockerfile.digest-notifications COPYs shared/*.js,
server/_shared/*.js, api/*.js — all ESM because the repo-root
package.json has "type":"module". But the image never copies the
root package.json, so Node's nearest-pjson walk inside /app/
reaches / without finding one and defaults to CommonJS. First
`export` statement throws `SyntaxError: Unexpected token 'export'`
at startup.
Fix: write a minimal /app/package.json with {"type":"module"}
early in the build. Avoids dragging the full root package.json
into the image while still giving Node the ESM hint it needs for
repo-owned .js files.
2. Missing BRIEF_URL_SIGNING_SECRET silently tolerated (todo 233).
The old gate folded "operator-disabled" (BRIEF_COMPOSE_ENABLED=0)
and "required secret missing in rollout" into the same boolean
via AND. A production deploy that forgot the env var would skip
brief compose without any failure signal — Railway green, no
briefs, no CTA in digests, nobody notices.
Split the two states: BRIEF_COMPOSE_DISABLED_BY_OPERATOR (explicit
kill switch, silent) and BRIEF_SIGNING_SECRET_MISSING (the misconfig
we care about). When the secret is missing without the operator
flag, composeBriefsForRun returns composeFailed=1 on first call
so the end-of-run exit gate trips and Railway flags the run red.
Digest send still proceeds — compose-layer issues never block
notifications.
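A hedged sketch of the two-state split. The env var names match the commit message; the function name and return shape here are illustrative, not the repo's exact internals:

```javascript
// Two distinct states instead of one AND-folded boolean:
// operator kill switch stays silent, missing secret counts as a failure.
function briefComposeGate(env) {
  const disabledByOperator = env.BRIEF_COMPOSE_ENABLED === '0'; // explicit kill switch
  const secretMissing = !env.BRIEF_URL_SIGNING_SECRET;          // rollout misconfig
  if (disabledByOperator) return { compose: false, composeFailed: 0 }; // silent, by design
  if (secretMissing) return { compose: false, composeFailed: 1 };      // trips the exit gate
  return { compose: true, composeFailed: 0 };
}

console.log(briefComposeGate({ BRIEF_COMPOSE_ENABLED: '0' }).composeFailed);   // 0
console.log(briefComposeGate({}).composeFailed);                               // 1
console.log(briefComposeGate({ BRIEF_URL_SIGNING_SECRET: 's3cr3t' }).compose); // true
```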
Tests: 98/98. Syntax + node --check clean.
* fix(brief): address 2 remaining P2 review comments on PR #3157
Greptile review (2026-04-18T05:04Z) flagged three P2 items. The
first (shouldExitNonZero never wired into cron) was already fixed in
commit 35a46aa34. This commit addresses the other two.
1. composeBriefForRule: issuedAt used Date.now() instead of the
caller-supplied nowMs. Under the digest cron the delta is
milliseconds and harmless, but it broke the function's
determinism contract — same input must produce same output for
tests + retries. Now uses the passed nowMs.
2. buildChannelBodies: magazineUrl embedded raw inside Telegram HTML
<a href="..."> and Slack <URL|text> syntax. The URL is
HMAC-signed and shape-validated upstream (userId regex + YYYY-MM-DD
date), so injection is practically impossible — but the email
CTA (injectBriefCta) escapes per-target and channel footers
should match that discipline. Added:
- Telegram: escape &, <, >, " to HTML entities
- Slack: strip <, >, | (mrkdwn metacharacters)
Discord and plain-text paths unchanged — Discord links tolerate
raw URLs, plain text has no metacharacters to escape.
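A sketch of the two escapes. The helper names here are hypothetical; the character sets are exactly the ones listed above:

```javascript
// Telegram HTML parse mode: &, <, >, " must become entities inside href.
// Ampersand is replaced first so entities are not double-escaped.
function escapeTelegramHtmlUrl(url) {
  return url
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Slack <URL|text> links: <, > and | are mrkdwn metacharacters — strip them.
function sanitizeSlackMrkdwnUrl(url) {
  return url.replace(/[<>|]/g, '');
}

console.log(escapeTelegramHtmlUrl('https://x.test/?a=1&b="2"'));
// https://x.test/?a=1&amp;b=&quot;2&quot;
console.log(sanitizeSlackMrkdwnUrl('https://x.test/<p>|q'));
// https://x.test/pq
```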
Tests: 98/98 still pass (deterministic issuedAt change was
transparent to existing assertions because tests already pass nowMs
explicitly via the issuedAt fixture field).
This commit is contained in:
65  Dockerfile.digest-notifications  Normal file
@@ -0,0 +1,65 @@
# =============================================================================
# Digest notifications cron (consolidated: digest + brief compose + channel send)
# =============================================================================
# Runs scripts/seed-digest-notifications.mjs as a Railway cron (every 30 min).
# The script now also owns the brief envelope write path — per-user
# brief:{userId}:{issueDate} keys are produced here, and every channel
# dispatch (email/telegram/slack/discord) gets a signed magazine URL CTA.
#
# Historical context: before 2026-04-18 this service built from the
# scripts/ root with plain `npm ci`. The consolidation PR introduced
# cross-directory imports (shared/*, server/_shared/*, api/*) so the
# service now needs repo-root as build context with the specific
# modules COPY'd in. The retired seed-brief-composer Dockerfile had
# the same pattern.
#
# Required env (Railway service vars):
#   UPSTASH_REDIS_REST_URL
#   UPSTASH_REDIS_REST_TOKEN
#   CONVEX_URL (or CONVEX_SITE_URL)
#   RELAY_SHARED_SECRET
#   RESEND_API_KEY
#   TELEGRAM_BOT_TOKEN
#   BRIEF_URL_SIGNING_SECRET (brief compose disabled without this)
#   WORLDMONITOR_PUBLIC_BASE_URL (defaults to https://worldmonitor.app)
# Optional:
#   DIGEST_CRON_ENABLED=0 (kill switch for the whole cron)
#   BRIEF_COMPOSE_ENABLED=0 (kill switch for just brief compose)
#   AI_DIGEST_ENABLED=0 (kill switch for AI summary LLM call)
# =============================================================================

FROM node:22-alpine

WORKDIR /app

# The repo-root package.json has "type":"module" which tells Node to
# parse .js files under shared/, server/_shared/, api/ as ESM. Inside
# this image we don't ship the full root package.json (it would pull
# in dev-deps metadata we don't need), but the .js files we DO ship
# still need the nearest-pjson walk to resolve to an ESM declaration.
# A minimal /app/package.json avoids "SyntaxError: Unexpected token
# 'export'" at container startup.
RUN printf '{"type":"module","private":true}\n' > /app/package.json

# Install scripts/ runtime dependencies (resend, convex, etc.).
COPY scripts/package.json scripts/package-lock.json ./scripts/
RUN npm ci --prefix scripts --omit=dev

# Digest cron + shared script helpers it imports via createRequire.
COPY scripts/seed-digest-notifications.mjs ./scripts/
COPY scripts/_digest-markdown.mjs ./scripts/
COPY scripts/lib/ ./scripts/lib/

# Brief envelope contract + filter + renderer assertion. These live
# under shared/ and server/_shared/ in the repo and are imported from
# scripts/lib/brief-compose.mjs. Keep this COPY list tight — adding
# unrelated shared/* files expands the rebuild watch surface.
COPY shared/brief-envelope.js shared/brief-envelope.d.ts ./shared/
COPY shared/brief-filter.js shared/brief-filter.d.ts ./shared/
COPY server/_shared/brief-render.js server/_shared/brief-render.d.ts ./server/_shared/

# Upstash REST helper (brief compose uses redisPipeline + readRawJson).
COPY api/_upstash-json.js ./api/
COPY api/_seed-envelope.js ./api/

CMD ["node", "scripts/seed-digest-notifications.mjs"]
49  Dockerfile.seed-brief-composer  Deleted file
@@ -1,49 +0,0 @@
# =============================================================================
# WorldMonitor Brief Composer (Phase 3a)
# =============================================================================
# Runs scripts/seed-brief-composer.mjs as a standalone Railway cron.
# Reads news:insights:v1 from Upstash, queries Convex for enabled
# alert rules, writes brief:{userId}:{issueDate} back to Upstash.
#
# Required env (set in Railway service vars):
#   UPSTASH_REDIS_REST_URL
#   UPSTASH_REDIS_REST_TOKEN
#   CONVEX_URL (or CONVEX_SITE_URL)
#   RELAY_SHARED_SECRET
# Optional:
#   BRIEF_COMPOSER_ENABLED=0 (kill switch during incidents)
#
# Runtime characteristics: ~1 Upstash GET + N SETEX. N ≈ enabled PRO
# users. No LLM calls (Phase 3b adds those); no outbound fan-out
# (Phase 3c). CPU-cheap; memory bound by topStories payload size.
# =============================================================================

FROM node:22-alpine

WORKDIR /app

# Install scripts/ runtime dependencies (undici for Intl, etc.). The
# composer itself uses only built-ins + the repo's own modules, but
# tsx is needed to execute .mjs with explicit type-stripping when
# shared/ imports traverse into server/_shared/brief-render.js
# (which does not use TS).
COPY scripts/package.json scripts/package-lock.json ./scripts/
RUN npm ci --prefix scripts --omit=dev

# Composer script
COPY scripts/seed-brief-composer.mjs ./scripts/seed-brief-composer.mjs

# Shared contract + renderer validator (assembly step calls
# assertBriefEnvelope to refuse to write a malformed envelope).
COPY shared/brief-envelope.js ./shared/brief-envelope.js
COPY shared/brief-envelope.d.ts ./shared/brief-envelope.d.ts
COPY shared/brief-filter.js ./shared/brief-filter.js
COPY shared/brief-filter.d.ts ./shared/brief-filter.d.ts
COPY server/_shared/brief-render.js ./server/_shared/brief-render.js
COPY server/_shared/brief-render.d.ts ./server/_shared/brief-render.d.ts

# Upstash REST helper (reused from api/_upstash-json.js)
COPY api/_upstash-json.js ./api/_upstash-json.js
COPY api/_seed-envelope.js ./api/_seed-envelope.js

CMD ["node", "scripts/seed-brief-composer.mjs"]
181  scripts/lib/brief-compose.mjs  Normal file
@@ -0,0 +1,181 @@
// WorldMonitor Brief compose library.
//
// Pure helpers for producing the per-user brief envelope that the
// hosted magazine route (api/brief/*) + dashboard panel + future
// channels all consume. Shared between:
//  - scripts/seed-digest-notifications.mjs (the consolidated cron;
//    composes a brief for every user it's about to dispatch a
//    digest to, so the magazine URL can be injected into the
//    notification output).
//  - future tests + ad-hoc tools.
//
// Deliberately has NO top-level side effects: no env guards, no
// process.exit, no main(). Import anywhere.
//
// History: this file used to include a stand-alone Railway cron
// (`seed-brief-composer.mjs`). That path was retired in the
// consolidation PR — the digest cron now owns the compose+send
// pipeline so there is exactly one cron writing brief:{userId}:
// {issueDate} keys.

import {
  assembleStubbedBriefEnvelope,
  filterTopStories,
  issueDateInTz,
} from '../../shared/brief-filter.js';

// ── Rule dedupe (one brief per user, not per variant) ───────────────────────

const SENSITIVITY_RANK = { all: 0, high: 1, critical: 2 };

function compareRules(a, b) {
  const aFull = a.variant === 'full' ? 0 : 1;
  const bFull = b.variant === 'full' ? 0 : 1;
  if (aFull !== bFull) return aFull - bFull;
  const aRank = SENSITIVITY_RANK[a.sensitivity ?? 'all'] ?? 0;
  const bRank = SENSITIVITY_RANK[b.sensitivity ?? 'all'] ?? 0;
  if (aRank !== bRank) return aRank - bRank;
  return (a.updatedAt ?? 0) - (b.updatedAt ?? 0);
}

/**
 * Group eligible (not-opted-out) rules by userId with each user's
 * candidates sorted in preference order. Callers walk the candidate
 * list and take the first that produces non-empty stories — falls
 * back across variants cleanly.
 */
export function groupEligibleRulesByUser(rules) {
  const byUser = new Map();
  for (const rule of rules) {
    if (!rule || typeof rule.userId !== 'string') continue;
    if (rule.aiDigestEnabled === false) continue;
    const list = byUser.get(rule.userId);
    if (list) list.push(rule);
    else byUser.set(rule.userId, [rule]);
  }
  for (const list of byUser.values()) list.sort(compareRules);
  return byUser;
}

/**
 * @deprecated Kept for existing test imports. Prefer
 * groupEligibleRulesByUser + per-user fallback at call sites.
 */
export function dedupeRulesByUser(rules) {
  const out = [];
  for (const candidates of groupEligibleRulesByUser(rules).values()) {
    if (candidates.length > 0) out.push(candidates[0]);
  }
  return out;
}

// ── Failure gate ─────────────────────────────────────────────────────────────

/**
 * Decide whether the consolidated cron should exit non-zero because
 * the brief-write failure rate is structurally bad (not just a
 * transient blip). Denominator is ATTEMPTED writes, not eligible
 * users: skipped-empty users never reach the write path and must not
 * dilute the ratio.
 *
 * @param {{ success: number; failed: number; thresholdRatio?: number }} counters
 */
export function shouldExitNonZero({ success, failed, thresholdRatio = 0.05 }) {
  if (failed <= 0) return false;
  const attempted = success + failed;
  if (attempted <= 0) return false;
  const threshold = Math.max(1, Math.floor(attempted * thresholdRatio));
  return failed >= threshold;
}

// ── Insights fetch ───────────────────────────────────────────────────────────

/** Unwrap news:insights:v1 envelope and project the fields the brief needs. */
export function extractInsights(raw) {
  const data = raw?.data ?? raw;
  const topStories = Array.isArray(data?.topStories) ? data.topStories : [];
  const clusterCount = Number.isFinite(data?.clusterCount) ? data.clusterCount : topStories.length;
  const multiSourceCount = Number.isFinite(data?.multiSourceCount) ? data.multiSourceCount : 0;
  return {
    topStories,
    numbers: { clusters: clusterCount, multiSource: multiSourceCount },
  };
}

// ── Date + display helpers ───────────────────────────────────────────────────

const MONTH_NAMES = [
  'January', 'February', 'March', 'April', 'May', 'June',
  'July', 'August', 'September', 'October', 'November', 'December',
];

export function dateLongFromIso(iso) {
  const [y, m, d] = iso.split('-').map(Number);
  return `${d} ${MONTH_NAMES[m - 1]} ${y}`;
}

export function issueCodeFromIso(iso) {
  const [, m, d] = iso.split('-');
  return `${d}.${m}`;
}

export function localHourInTz(nowMs, timezone) {
  try {
    const fmt = new Intl.DateTimeFormat('en-US', {
      timeZone: timezone,
      hour: 'numeric',
      hour12: false,
    });
    const hour = fmt.formatToParts(new Date(nowMs)).find((p) => p.type === 'hour')?.value;
    const n = Number(hour);
    return Number.isFinite(n) ? n : 9;
  } catch {
    return 9;
  }
}

export function userDisplayNameFromId(userId) {
  // Clerk IDs look like "user_2abc…". Phase 3b will hydrate real
  // names via a Convex query; for now a generic placeholder so the
  // magazine's greeting reads naturally.
  void userId;
  return 'Reader';
}

// ── Compose a full brief for a single rule ──────────────────────────────────

const MAX_STORIES_PER_USER = 12;

/**
 * Filter + assemble a BriefEnvelope for one alert rule. Returns null
 * when the filter produces zero stories — the caller decides whether
 * to fall back to another variant or skip the user.
 *
 * @param {object} rule — enabled alertRule row
 * @param {{ topStories: unknown[]; numbers: { clusters: number; multiSource: number } }} insights
 * @param {{ nowMs: number }} [opts]
 */
export function composeBriefForRule(rule, insights, { nowMs = Date.now() } = {}) {
  const sensitivity = rule.sensitivity ?? 'all';
  const tz = rule.digestTimezone ?? 'UTC';
  const stories = filterTopStories({
    stories: insights.topStories,
    sensitivity,
    maxStories: MAX_STORIES_PER_USER,
  });
  if (stories.length === 0) return null;
  const issueDate = issueDateInTz(nowMs, tz);
  return assembleStubbedBriefEnvelope({
    user: { name: userDisplayNameFromId(rule.userId), tz },
    stories,
    issueDate,
    dateLong: dateLongFromIso(issueDate),
    issue: issueCodeFromIso(issueDate),
    insightsNumbers: insights.numbers,
    // Same nowMs as the rest of the envelope so the function stays
    // deterministic for a given input — tests + retries see identical
    // output.
    issuedAt: nowMs,
    localHour: localHourInTz(nowMs, tz),
  });
}
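A runnable usage sketch of the grouping helper above (`compareRules` and `groupEligibleRulesByUser` copied from the diff so the example stands alone; the rule objects are made-up fixtures):

```javascript
const SENSITIVITY_RANK = { all: 0, high: 1, critical: 2 };

function compareRules(a, b) {
  const aFull = a.variant === 'full' ? 0 : 1;
  const bFull = b.variant === 'full' ? 0 : 1;
  if (aFull !== bFull) return aFull - bFull;
  const aRank = SENSITIVITY_RANK[a.sensitivity ?? 'all'] ?? 0;
  const bRank = SENSITIVITY_RANK[b.sensitivity ?? 'all'] ?? 0;
  if (aRank !== bRank) return aRank - bRank;
  return (a.updatedAt ?? 0) - (b.updatedAt ?? 0);
}

function groupEligibleRulesByUser(rules) {
  const byUser = new Map();
  for (const rule of rules) {
    if (!rule || typeof rule.userId !== 'string') continue;
    if (rule.aiDigestEnabled === false) continue;
    const list = byUser.get(rule.userId);
    if (list) list.push(rule);
    else byUser.set(rule.userId, [rule]);
  }
  for (const list of byUser.values()) list.sort(compareRules);
  return byUser;
}

// Two variants for one user: 'full' ranks first regardless of input
// order; a rule with aiDigestEnabled === false is dropped entirely.
const grouped = groupEligibleRulesByUser([
  { userId: 'u1', variant: 'brief', sensitivity: 'critical', updatedAt: 2 },
  { userId: 'u1', variant: 'full', sensitivity: 'high', updatedAt: 1 },
  { userId: 'u2', variant: 'full', aiDigestEnabled: false },
]);
console.log(grouped.get('u1')[0].variant); // 'full'
console.log(grouped.has('u2'));            // false
```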
78  scripts/lib/brief-url-sign.mjs  Normal file
@@ -0,0 +1,78 @@
// HMAC URL signer for scripts/ cron code.
//
// Port of the sign path in server/_shared/brief-url.ts. The edge
// route still owns verify (that code runs unchanged); the digest
// cron only needs to mint magazine URLs to embed in notification
// bodies.
//
// Kept in parity with the TS module — any change to the signing
// formula MUST happen in both places in the same PR. A regression
// test in tests/brief-url-sign.test.mjs produces a token with this
// helper and verifies it via the edge's verifyBriefToken.
//
// No node:crypto — Web Crypto (crypto.subtle + btoa) only. That lets
// the same helper run on Node 18+, Vercel Edge, Cloudflare Workers,
// and Tauri if ever needed from a non-cron path.

const USER_ID_RE = /^[A-Za-z0-9_-]{1,128}$/;
const ISSUE_DATE_RE = /^\d{4}-\d{2}-\d{2}$/;

export class BriefUrlError extends Error {
  constructor(code, message) {
    super(message);
    this.code = code;
    this.name = 'BriefUrlError';
  }
}

function assertShape(userId, issueDate) {
  if (!USER_ID_RE.test(userId)) {
    throw new BriefUrlError('invalid_user_id', 'userId must match [A-Za-z0-9_-]{1,128}');
  }
  if (!ISSUE_DATE_RE.test(issueDate)) {
    throw new BriefUrlError('invalid_issue_date', 'issueDate must match YYYY-MM-DD');
  }
}

function base64url(bytes) {
  let bin = '';
  for (const b of bytes) bin += String.fromCharCode(b);
  return btoa(bin).replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

async function hmacSha256(secret, message) {
  const key = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(secret),
    { name: 'HMAC', hash: 'SHA-256' },
    false,
    ['sign'],
  );
  const sig = await crypto.subtle.sign('HMAC', key, new TextEncoder().encode(message));
  return new Uint8Array(sig);
}

/**
 * Deterministically sign `${userId}:${issueDate}` and return a
 * base64url-encoded token (43 chars, no padding).
 * @param {string} userId @param {string} issueDate @param {string} secret
 * @returns {Promise<string>}
 */
export async function signBriefToken(userId, issueDate, secret) {
  assertShape(userId, issueDate);
  if (!secret) {
    throw new BriefUrlError('missing_secret', 'BRIEF_URL_SIGNING_SECRET is not configured');
  }
  const sig = await hmacSha256(secret, `${userId}:${issueDate}`);
  return base64url(sig);
}

/**
 * @param {{ userId: string; issueDate: string; baseUrl: string; secret: string }} opts
 * @returns {Promise<string>}
 */
export async function signBriefUrl({ userId, issueDate, baseUrl, secret }) {
  const token = await signBriefToken(userId, issueDate, secret);
  const trimmed = baseUrl.replace(/\/+$/, '');
  return `${trimmed}/api/brief/${encodeURIComponent(userId)}/${encodeURIComponent(issueDate)}?t=${token}`;
}
@@ -1,393 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
/**
|
||||
* WorldMonitor Brief composer — Railway cron.
|
||||
*
|
||||
* Phase 3a of docs/plans/2026-04-17-003-feat-worldmonitor-brief-
|
||||
* magazine-plan.md. Produces the per-user envelopes that Phases 1+2
|
||||
* already know how to serve; Phase 3b will replace the stubbed
|
||||
* digest text with LLM output.
|
||||
*
|
||||
* Per run:
|
||||
* 1. Fetch the global news-intelligence bundle once.
|
||||
* 2. Ask Convex for every enabled alert-rule with digestMode set.
|
||||
* This matches the eligibility set already used by
|
||||
* seed-digest-notifications — brief access is free-riding on
|
||||
* the digest opt-in.
|
||||
* 3. For each rule:
|
||||
* - Compute issueDate from rule.digestTimezone.
|
||||
* - Filter insights.topStories by rule.sensitivity.
|
||||
* - Assemble a BriefEnvelope with stubbed digest text.
|
||||
* - SETEX brief:{userId}:{issueDate} with a 7-day TTL.
|
||||
* 4. Log per-status counters (success / skipped_empty / failed).
|
||||
*
|
||||
* The script is idempotent within a day: re-running overwrites the
|
||||
* same key with the same envelope (modulo issuedAt). Phase 3c adds
|
||||
* fan-out events on first-write only.
|
||||
*/
|
||||
|
||||
import { createRequire } from 'node:module';
|
||||
import { fileURLToPath } from 'node:url';
|
||||
import { readRawJsonFromUpstash, redisPipeline } from '../api/_upstash-json.js';
|
||||
import {
|
||||
assembleStubbedBriefEnvelope,
|
||||
filterTopStories,
|
||||
issueDateInTz,
|
||||
} from '../shared/brief-filter.js';
|
||||
|
||||
const require = createRequire(import.meta.url);
|
||||
|
||||
// ── Config ────────────────────────────────────────────────────────────────────
|
||||
|
||||
const UPSTASH_URL = process.env.UPSTASH_REDIS_REST_URL ?? '';
|
||||
const UPSTASH_TOKEN = process.env.UPSTASH_REDIS_REST_TOKEN ?? '';
|
||||
const CONVEX_SITE_URL =
|
||||
process.env.CONVEX_SITE_URL ??
|
||||
(process.env.CONVEX_URL ?? '').replace('.convex.cloud', '.convex.site');
|
||||
const RELAY_SECRET = process.env.RELAY_SHARED_SECRET ?? '';
|
||||
|
||||
const BRIEF_TTL_SECONDS = 7 * 24 * 60 * 60; // 7 days
|
||||
const MAX_STORIES_PER_USER = 12;
|
||||
const INSIGHTS_KEY = 'news:insights:v1';
|
||||
|
||||
// ── Upstash helpers ──────────────────────────────────────────────────────────
|
||||
|
||||
/**
|
||||
* Write the brief envelope via the Upstash REST pipeline endpoint
|
||||
* (body-POST), not the path-embedded SETEX form. Realistic briefs
|
||||
* (12 stories, per-story description + whyMatters near caps) encode
|
||||
* to 5–20 KB of JSON; URL-encoding inflates that further and can hit
|
||||
* CDN / edge / Node HTTP request-target limits (commonly 8–16 KB).
|
||||
* `redisPipeline` places the command in a JSON body where size
|
||||
* limits are generous and uniform with the rest of the codebase's
|
||||
* Upstash writes.
|
||||
*/
|
||||
async function upstashSetex(key, value, ttlSeconds) {
|
||||
const results = await redisPipeline([
|
||||
['SETEX', key, String(ttlSeconds), JSON.stringify(value)],
|
||||
]);
|
||||
if (!results || !Array.isArray(results) || results.length === 0) {
|
||||
throw new Error(`Upstash SETEX failed for ${key}: null pipeline response`);
|
||||
}
|
||||
const result = results[0];
|
||||
// Upstash pipeline returns either {result} or {error} per command.
|
||||
if (result && typeof result === 'object' && 'error' in result) {
|
||||
throw new Error(`Upstash SETEX failed for ${key}: ${result.error}`);
|
||||
}
|
||||
return result;
|
||||
}
|
||||
|
||||
// ── Date helpers ─────────────────────────────────────────────────────────────
|
||||
|
||||
const MONTH_NAMES = [
|
||||
'January', 'February', 'March', 'April', 'May', 'June',
|
||||
'July', 'August', 'September', 'October', 'November', 'December',
|
||||
];
|
||||
|
||||
function dateLongFromIso(iso) {
|
||||
// iso is YYYY-MM-DD. Parse literally to avoid tz drift.
|
||||
const [y, m, d] = iso.split('-').map(Number);
|
||||
return `${d} ${MONTH_NAMES[m - 1]} ${y}`;
|
||||
}
|
||||
|
||||
function issueCodeFromIso(iso) {
|
||||
// "2026-04-18" → "18.04"
|
||||
const [, m, d] = iso.split('-');
|
||||
return `${d}.${m}`;
|
||||
}
|
||||
|
||||
function localHourInTz(nowMs, timezone) {
|
||||
try {
|
||||
const fmt = new Intl.DateTimeFormat('en-US', {
|
||||
timeZone: timezone,
|
||||
hour: 'numeric',
|
||||
hour12: false,
|
||||
});
|
||||
const hour = fmt.formatToParts(new Date(nowMs))
|
||||
.find((p) => p.type === 'hour')?.value;
|
||||
const n = Number(hour);
|
||||
return Number.isFinite(n) ? n : 9;
|
||||
} catch {
|
||||
return 9;
|
||||
}
|
||||
}
|
||||
|
||||
// ── Convex helpers ───────────────────────────────────────────────────────────
|
||||
|
||||
async function fetchDigestRules() {
|
||||
const res = await fetch(`${CONVEX_SITE_URL}/relay/digest-rules`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
Authorization: `Bearer ${RELAY_SECRET}`,
|
||||
'User-Agent': 'worldmonitor-brief-composer/1.0',
|
||||
},
|
||||
signal: AbortSignal.timeout(10_000),
|
||||
});
|
||||
if (!res.ok) {
|
||||
throw new Error(`Failed to fetch digest rules: HTTP ${res.status}`);
|
||||
}
|
||||
const rules = await res.json();
|
||||
if (!Array.isArray(rules)) {
|
||||
throw new Error('digest-rules response was not an array');
|
||||
}
|
||||
return rules;
|
||||
}
|
||||
|
||||
// ── Failure gate ─────────────────────────────────────────────────────────────
|
||||
|
||||
/**
|
||||
* Decide whether the cron should exit non-zero so Railway flags the
|
||||
* run. Denominator is ATTEMPTED writes (success + failed); skipped-
|
||||
* empty users never reached the write path and must not inflate it.
|
||||
* Exported so the denominator contract is testable without mocking
|
||||
* Redis + LLM + the whole cron.
|
||||
*
|
||||
* @param {{ success: number; failed: number; thresholdRatio?: number }} counters
|
||||
* @returns {boolean}
|
||||
*/
|
||||
export function shouldExitNonZero({ success, failed, thresholdRatio = 0.05 }) {
|
||||
if (failed <= 0) return false;
|
||||
const attempted = success + failed;
|
||||
if (attempted <= 0) return false;
|
||||
const threshold = Math.max(1, Math.floor(attempted * thresholdRatio));
|
||||
return failed >= threshold;
|
||||
}
|
||||
|
||||
// ── User-name lookup (best effort) ───────────────────────────────────────────
|
||||
|
||||
function userDisplayNameFromId(userId) {
|
||||
// Clerk IDs look like "user_2abc..." — not display-friendly. Phase
|
||||
// 3b will hydrate names via a Convex query; Phase 3a uses a
|
||||
// generic "you" so the greeting still reads naturally without a
|
||||
// round-trip we don't yet need.
|
||||
void userId;
|
||||
return 'Reader';
|
||||
}
|
||||
|
||||
// ── Rule dedupe (one brief per user, not per variant) ───────────────────────

// Most-permissive-first ranking. Lower = broader.
const SENSITIVITY_RANK = { all: 0, high: 1, critical: 2 };

function compareRules(a, b) {
  // Prefer the 'full' variant — it's the superset dashboard.
  const aFull = a.variant === 'full' ? 0 : 1;
  const bFull = b.variant === 'full' ? 0 : 1;
  if (aFull !== bFull) return aFull - bFull;
  // Tie-break on most permissive sensitivity (broadest brief).
  const aRank = SENSITIVITY_RANK[a.sensitivity ?? 'all'] ?? 0;
  const bRank = SENSITIVITY_RANK[b.sensitivity ?? 'all'] ?? 0;
  if (aRank !== bRank) return aRank - bRank;
  // Final tie-break: earlier-updated rule wins for determinism.
  return (a.updatedAt ?? 0) - (b.updatedAt ?? 0);
}

/**
 * Group eligible (non-opted-out) rules by userId, with each user's
 * candidates sorted in preference order (best first). Returns a
 * Map of userId → ranked candidates (iterable as `[userId,
 * candidates[]]` pairs) so the main loop can try each variant in
 * order and fall back when the preferred one produces zero stories.
 *
 * aiDigestEnabled is pre-filtered here so a user whose preferred
 * variant is opted out but another variant is opted in still
 * produces a brief — the dedupe must not pick a variant that can
 * never emit.
 */
export function groupEligibleRulesByUser(rules) {
  /** @type {Map<string, any[]>} */
  const byUser = new Map();
  for (const rule of rules) {
    if (!rule || typeof rule.userId !== 'string') continue;
    // Default is OPT-IN — only an explicit false opts the user out.
    if (rule.aiDigestEnabled === false) continue;
    const list = byUser.get(rule.userId);
    if (list) list.push(rule);
    else byUser.set(rule.userId, [rule]);
  }
  for (const list of byUser.values()) {
    list.sort(compareRules);
  }
  return byUser;
}

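The opt-out filtering and ranking compose like this. A condensed re-statement of the helpers above, for illustration only (not the exported module):

```javascript
// Condensed re-statement of the ranking + grouping, for illustration.
const SENSITIVITY_RANK = { all: 0, high: 1, critical: 2 };
function compareRules(a, b) {
  const aFull = a.variant === 'full' ? 0 : 1;
  const bFull = b.variant === 'full' ? 0 : 1;
  if (aFull !== bFull) return aFull - bFull; // 'full' variant first
  const aRank = SENSITIVITY_RANK[a.sensitivity ?? 'all'] ?? 0;
  const bRank = SENSITIVITY_RANK[b.sensitivity ?? 'all'] ?? 0;
  if (aRank !== bRank) return aRank - bRank; // then broadest sensitivity
  return (a.updatedAt ?? 0) - (b.updatedAt ?? 0);
}
function groupEligibleRulesByUser(rules) {
  const byUser = new Map();
  for (const rule of rules) {
    if (!rule || typeof rule.userId !== 'string') continue;
    if (rule.aiDigestEnabled === false) continue; // explicit opt-out only
    const list = byUser.get(rule.userId) ?? [];
    list.push(rule);
    byUser.set(rule.userId, list);
  }
  for (const list of byUser.values()) list.sort(compareRules);
  return byUser;
}

const grouped = groupEligibleRulesByUser([
  { userId: 'u1', variant: 'tech', sensitivity: 'high', updatedAt: 2 },
  { userId: 'u1', variant: 'full', sensitivity: 'critical', updatedAt: 1 },
  { userId: 'u1', variant: 'full', sensitivity: 'all', aiDigestEnabled: false },
  { userId: 'u2', variant: 'tech', aiDigestEnabled: false },
]);
// u1: the opted-out full/all rule is filtered; full/critical still outranks
// tech/high because the variant check runs before the sensitivity tie-break.
console.log(grouped.get('u1').map((r) => `${r.variant}/${r.sensitivity}`));
// u2 only had an opted-out rule, so the user is absent entirely.
console.log(grouped.has('u2')); // false
```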
/**
 * @deprecated Kept so the existing dedupe tests still compile.
 * Prefer groupEligibleRulesByUser + per-user fallback in callers.
 */
export function dedupeRulesByUser(rules) {
  const grouped = groupEligibleRulesByUser(rules);
  const out = [];
  for (const candidates of grouped.values()) {
    if (candidates.length > 0) out.push(candidates[0]);
  }
  return out;
}

// ── Insights fetch ───────────────────────────────────────────────────────────

function extractInsights(raw) {
  // news:insights:v1 is stored as a seed envelope {_seed, data}.
  // readRawJsonFromUpstash intentionally does not unwrap; do so here.
  const data = raw?.data ?? raw;
  const topStories = Array.isArray(data?.topStories) ? data.topStories : [];
  const clusterCount = Number.isFinite(data?.clusterCount) ? data.clusterCount : topStories.length;
  const multiSourceCount = Number.isFinite(data?.multiSourceCount) ? data.multiSourceCount : 0;
  return {
    topStories,
    numbers: {
      clusters: clusterCount,
      multiSource: multiSourceCount,
    },
  };
}

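The envelope unwrap is defensive in both directions: wrapped and bare payloads normalise to the same shape, and missing counters fall back instead of throwing. A condensed copy, for illustration only:

```javascript
// Condensed copy of extractInsights, for illustration only.
function extractInsights(raw) {
  const data = raw?.data ?? raw; // unwrap the {_seed, data} envelope if present
  const topStories = Array.isArray(data?.topStories) ? data.topStories : [];
  const clusterCount = Number.isFinite(data?.clusterCount) ? data.clusterCount : topStories.length;
  const multiSourceCount = Number.isFinite(data?.multiSourceCount) ? data.multiSourceCount : 0;
  return { topStories, numbers: { clusters: clusterCount, multiSource: multiSourceCount } };
}

// As stored under news:insights:v1 (seed envelope) …
const wrapped = extractInsights({ _seed: true, data: { topStories: [{ id: 1 }], clusterCount: 7 } });
// … and as a bare payload with no counters at all.
const bare = extractInsights({ topStories: [{ id: 1 }] });
console.log(wrapped.numbers); // { clusters: 7, multiSource: 0 }
console.log(bare.numbers);    // { clusters: 1, multiSource: 0 }
```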
// ── SIGTERM handling ─────────────────────────────────────────────────────────
// Matches the bundle-runner SIGTERM pattern (feedback note
// bundle-runner-sigkill-leaks-child-lock). This script does not take
// a distributed lock, but it does perform many parallel Upstash
// writes; SIGTERM during the loop should flush partial progress
// cleanly instead of throwing mid-fetch.
let shuttingDown = false;
process.on('SIGTERM', () => {
  shuttingDown = true;
  console.log('[brief-composer] SIGTERM received — finishing current iteration');
});

// ── Main ─────────────────────────────────────────────────────────────────────

async function main() {
  const startMs = Date.now();
  console.log('[brief-composer] Run start:', new Date(startMs).toISOString());

  let insightsRaw;
  try {
    insightsRaw = await readRawJsonFromUpstash(INSIGHTS_KEY);
  } catch (err) {
    console.error('[brief-composer] failed to read', INSIGHTS_KEY, err.message);
    process.exit(1);
  }
  if (!insightsRaw) {
    console.warn('[brief-composer] insights key empty; no brief to compose');
    return;
  }

  const insights = extractInsights(insightsRaw);
  if (insights.topStories.length === 0) {
    console.warn('[brief-composer] upstream topStories empty; no brief to compose');
    return;
  }

  let rules;
  try {
    rules = await fetchDigestRules();
  } catch (err) {
    console.error('[brief-composer]', err.message);
    process.exit(1);
  }
  console.log(`[brief-composer] Rules to process: ${rules.length}`);

  // Briefs are user-scoped, but alertRules are (userId, variant)-scoped.
  // Group eligible (not-opted-out) rules by user in preference order
  // so we can fall back across variants when the preferred one can't
  // emit (opt-out on that variant, or zero matching stories).
  const eligibleByUser = groupEligibleRulesByUser(rules);

  let success = 0;
  let skippedEmpty = 0;
  let failed = 0;

  for (const [userId, candidates] of eligibleByUser) {
    if (shuttingDown) break;
    try {
      // Walk preference order; first variant with non-empty stories wins.
      let chosen = null;
      let chosenStories = null;
      for (const candidate of candidates) {
        const sensitivity = candidate.sensitivity ?? 'all';
        const stories = filterTopStories({
          stories: insights.topStories,
          sensitivity,
          maxStories: MAX_STORIES_PER_USER,
        });
        if (stories.length > 0) {
          chosen = candidate;
          chosenStories = stories;
          break;
        }
      }
      if (!chosen) {
        skippedEmpty += 1;
        continue;
      }
      if (candidates.length > 1) {
        console.log(
          `[brief-composer] dedup: userId=${userId} chose variant=${chosen.variant} sensitivity=${chosen.sensitivity ?? 'all'} from ${candidates.length} enabled variants`,
        );
      }

      const tz = chosen.digestTimezone ?? 'UTC';
      const issueDate = issueDateInTz(startMs, tz);
      const envelope = assembleStubbedBriefEnvelope({
        user: { name: userDisplayNameFromId(chosen.userId), tz },
        stories: chosenStories,
        issueDate,
        dateLong: dateLongFromIso(issueDate),
        issue: issueCodeFromIso(issueDate),
        insightsNumbers: insights.numbers,
        issuedAt: Date.now(),
        localHour: localHourInTz(startMs, tz),
      });

      const key = `brief:${chosen.userId}:${issueDate}`;
      await upstashSetex(key, envelope, BRIEF_TTL_SECONDS);
      success += 1;
    } catch (err) {
      failed += 1;
      const variants = candidates.map((c) => c.variant).join(',');
      console.error(
        `[brief-composer] failed for user=${userId} variants=${variants}:`,
        err.message,
      );
    }
  }

  const eligibleUserCount = eligibleByUser.size;
  const attempted = success + failed;
  const durationMs = Date.now() - startMs;
  console.log(
    `[brief-composer] Done: rules=${rules.length} eligible_users=${eligibleUserCount} attempted=${attempted} success=${success} skipped_empty=${skippedEmpty} failed=${failed} duration_ms=${durationMs}`,
  );

  if (shouldExitNonZero({ success, failed })) process.exit(1);
}

// Only run the cron loop when executed as a script, never on import.
// Tests import this file for the dedupe helpers and must not trigger
// process.exit() at module load. Matches feedback_seed_isMain_guard.
function isMain() {
  if (!process.argv[1]) return false;
  try {
    return fileURLToPath(import.meta.url) === process.argv[1];
  } catch {
    return false;
  }
}

if (isMain()) {
  if (process.env.BRIEF_COMPOSER_ENABLED === '0') {
    console.log('[brief-composer] BRIEF_COMPOSER_ENABLED=0 — skipping run');
    process.exit(0);
  }
  if (!UPSTASH_URL || !UPSTASH_TOKEN) {
    console.error('[brief-composer] UPSTASH_REDIS_REST_URL/TOKEN not set');
    process.exit(1);
  }
  if (!CONVEX_SITE_URL || !RELAY_SECRET) {
    console.error('[brief-composer] CONVEX_SITE_URL / RELAY_SHARED_SECRET not set');
    process.exit(1);
  }
  main().catch((err) => {
    console.error('[brief-composer] fatal:', err);
    process.exit(1);
  });
}
@@ -29,6 +29,14 @@ const { decrypt } = require('./lib/crypto.cjs');
 const { callLLM } = require('./lib/llm-chain.cjs');
 const { fetchUserPreferences, extractUserContext, formatUserProfile } = require('./lib/user-context.cjs');
 const { Resend } = require('resend');
+import { readRawJsonFromUpstash, redisPipeline } from '../api/_upstash-json.js';
+import {
+  composeBriefForRule,
+  extractInsights,
+  groupEligibleRulesByUser,
+  shouldExitNonZero as shouldExitOnBriefFailures,
+} from './lib/brief-compose.mjs';
+import { signBriefUrl, BriefUrlError } from './lib/brief-url-sign.mjs';
 
 // ── Config ────────────────────────────────────────────────────────────────────
 
@@ -67,6 +75,23 @@ const AI_SUMMARY_CACHE_TTL = 3600; // 1h
 const AI_DIGEST_ENABLED = process.env.AI_DIGEST_ENABLED !== '0';
 const ENTITLEMENT_CACHE_TTL = 900; // 15 min
 
+// ── Brief composer (consolidation of the retired seed-brief-composer) ──────
+
+const BRIEF_URL_SIGNING_SECRET = process.env.BRIEF_URL_SIGNING_SECRET ?? '';
+const WORLDMONITOR_PUBLIC_BASE_URL =
+  process.env.WORLDMONITOR_PUBLIC_BASE_URL ?? 'https://worldmonitor.app';
+const BRIEF_TTL_SECONDS = 7 * 24 * 60 * 60; // 7 days
+const INSIGHTS_KEY = 'news:insights:v1';
+
+// Operator kill switch — used to intentionally silence brief compose
+// without surfacing a Railway red flag. Distinguished from "secret
+// missing in a production rollout", which IS worth flagging.
+const BRIEF_COMPOSE_DISABLED_BY_OPERATOR = process.env.BRIEF_COMPOSE_ENABLED === '0';
+const BRIEF_COMPOSE_ENABLED =
+  !BRIEF_COMPOSE_DISABLED_BY_OPERATOR && BRIEF_URL_SIGNING_SECRET !== '';
+const BRIEF_SIGNING_SECRET_MISSING =
+  !BRIEF_COMPOSE_DISABLED_BY_OPERATOR && BRIEF_URL_SIGNING_SECRET === '';
+
 // ── Redis helpers ──────────────────────────────────────────────────────────────
 
 async function upstashRest(...args) {
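The three derived flags above collapse to a small state table over two inputs, the operator switch and the secret. A sketch (the helper name is an assumption, not the cron's code):

```javascript
// Hypothetical helper restating the flag derivation above, for illustration.
function briefComposeState(env) {
  const secret = env.BRIEF_URL_SIGNING_SECRET ?? '';
  const operatorDisabled = env.BRIEF_COMPOSE_ENABLED === '0';
  return {
    enabled: !operatorDisabled && secret !== '',
    secretMissing: !operatorDisabled && secret === '', // the red-flag case
  };
}

// Secret present, no kill switch: compose runs.
console.log(briefComposeState({ BRIEF_URL_SIGNING_SECRET: 's3cret' }));
// Kill switch set: silent no-op, even though the secret is absent.
console.log(briefComposeState({ BRIEF_COMPOSE_ENABLED: '0' }));
// Neither set: misconfigured rollout, counted as a compose failure.
console.log(briefComposeState({}));
```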
@@ -407,6 +432,7 @@ function formatDigestHtml(stories, nowMs) {
       </tr>
     </table>
     <div data-ai-summary-slot></div>
+    <div data-brief-cta-slot></div>
     <table cellpadding="0" cellspacing="0" border="0" width="100%" style="margin-bottom: 24px;">
       <tr>
         <td style="text-align: center; padding: 14px 8px; width: 33%; background: #161616; border: 1px solid #222;">
@@ -793,20 +819,46 @@ const DIVIDER = '─'.repeat(40);
  * Keeps the per-channel formatting logic out of main() so its cognitive
  * complexity stays within the lint budget.
  */
-function buildChannelBodies(storyListPlain, aiSummary) {
+function buildChannelBodies(storyListPlain, aiSummary, magazineUrl) {
+  // The URL is already HMAC-signed and shape-validated at sign time
+  // (userId regex + YYYY-MM-DD), but we still escape it per-target
+  // as defence-in-depth — same discipline injectBriefCta uses for
+  // the email button. Each target has different metacharacter rules.
+  const telegramSafeUrl = magazineUrl
+    ? String(magazineUrl)
+        .replace(/&/g, '&amp;')
+        .replace(/</g, '&lt;')
+        .replace(/>/g, '&gt;')
+        .replace(/"/g, '&quot;')
+    : '';
+  const slackSafeUrl = magazineUrl
+    ? String(magazineUrl).replace(/[<>|]/g, '')
+    : '';
+  const briefFooterPlain = magazineUrl
+    ? `\n\n${DIVIDER}\n\n📖 Open your WorldMonitor Brief magazine:\n${magazineUrl}`
+    : '';
+  const briefFooterTelegram = magazineUrl
+    ? `\n\n${DIVIDER}\n\n📖 <a href="${telegramSafeUrl}">Open your WorldMonitor Brief magazine</a>`
+    : '';
+  const briefFooterSlack = magazineUrl
+    ? `\n\n${DIVIDER}\n\n📖 <${slackSafeUrl}|Open your WorldMonitor Brief magazine>`
+    : '';
+  const briefFooterDiscord = magazineUrl
+    ? `\n\n${DIVIDER}\n\n📖 [Open your WorldMonitor Brief magazine](${magazineUrl})`
+    : '';
   if (!aiSummary) {
     return {
-      text: storyListPlain,
-      telegramText: escapeTelegramHtml(storyListPlain),
-      slackText: escapeSlackMrkdwn(storyListPlain),
-      discordText: storyListPlain,
+      text: `${storyListPlain}${briefFooterPlain}`,
+      telegramText: `${escapeTelegramHtml(storyListPlain)}${briefFooterTelegram}`,
+      slackText: `${escapeSlackMrkdwn(storyListPlain)}${briefFooterSlack}`,
+      discordText: `${storyListPlain}${briefFooterDiscord}`,
     };
   }
   return {
-    text: `EXECUTIVE SUMMARY\n\n${aiSummary}\n\n${DIVIDER}\n\n${storyListPlain}`,
-    telegramText: `<b>EXECUTIVE SUMMARY</b>\n\n${markdownToTelegramHtml(aiSummary)}\n\n${DIVIDER}\n\n${escapeTelegramHtml(storyListPlain)}`,
-    slackText: `*EXECUTIVE SUMMARY*\n\n${markdownToSlackMrkdwn(aiSummary)}\n\n${DIVIDER}\n\n${escapeSlackMrkdwn(storyListPlain)}`,
-    discordText: `**EXECUTIVE SUMMARY**\n\n${markdownToDiscord(aiSummary)}\n\n${DIVIDER}\n\n${storyListPlain}`,
+    text: `EXECUTIVE SUMMARY\n\n${aiSummary}\n\n${DIVIDER}\n\n${storyListPlain}${briefFooterPlain}`,
+    telegramText: `<b>EXECUTIVE SUMMARY</b>\n\n${markdownToTelegramHtml(aiSummary)}\n\n${DIVIDER}\n\n${escapeTelegramHtml(storyListPlain)}${briefFooterTelegram}`,
+    slackText: `*EXECUTIVE SUMMARY*\n\n${markdownToSlackMrkdwn(aiSummary)}\n\n${DIVIDER}\n\n${escapeSlackMrkdwn(storyListPlain)}${briefFooterSlack}`,
+    discordText: `**EXECUTIVE SUMMARY**\n\n${markdownToDiscord(aiSummary)}\n\n${DIVIDER}\n\n${storyListPlain}${briefFooterDiscord}`,
   };
 }
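The per-target escaping can be exercised in isolation. A sketch of the discipline above, not the cron's exact helpers:

```javascript
// HTML-attribute escaping: ampersand first, then angle brackets and quotes.
const escapeHtmlAttr = (s) =>
  String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
// Slack mrkdwn links are delimited by < > |, so those are stripped outright
// rather than entity-escaped.
const slackSafe = (s) => String(s).replace(/[<>|]/g, '');

const url = 'https://worldmonitor.app/api/brief/user_1/2026-04-18?t=abc&x="<y>"';
console.log(escapeHtmlAttr(url));
// → https://worldmonitor.app/api/brief/user_1/2026-04-18?t=abc&amp;x=&quot;&lt;y&gt;&quot;
console.log(slackSafe('<https://example.com|x>'));
// → https://example.comx
```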
@@ -825,6 +877,142 @@ function injectEmailSummary(html, aiSummary) {
   return html.replace('<div data-ai-summary-slot></div>', summaryHtml);
 }
 
+/**
+ * Inject the "Open your brief" CTA into the email HTML. Placed near
+ * the top of the body so recipients see the magazine link before the
+ * story list. Uses inline styles only (Gmail / Outlook friendly).
+ * When no magazineUrl is present (composer skipped / signing
+ * failed), the slot is stripped so the email stays clean.
+ */
+function injectBriefCta(html, magazineUrl) {
+  if (!html) return html;
+  if (!magazineUrl) return html.replace('<div data-brief-cta-slot></div>', '');
+  const escapedUrl = String(magazineUrl)
+    .replace(/&/g, '&amp;')
+    .replace(/</g, '&lt;')
+    .replace(/>/g, '&gt;')
+    .replace(/"/g, '&quot;');
+  const ctaHtml = `<div style="margin:0 0 24px 0;">
+  <a href="${escapedUrl}" style="display:inline-block;background:#f2ede4;color:#0a0a0a;text-decoration:none;font-weight:700;font-size:14px;letter-spacing:0.08em;padding:14px 22px;border-radius:4px;">Open your WorldMonitor Brief →</a>
+  <div style="margin-top:10px;font-size:11px;color:#888;line-height:1.5;">Your personalised editorial magazine. Opens in the browser — scroll or swipe through today's threads.</div>
+</div>`;
+  return html.replace('<div data-brief-cta-slot></div>', ctaHtml);
+}
+
+// ── Brief composition (runs once per cron tick, before digest loop) ─────────
+
+/**
+ * Write brief:{userId}:{issueDate} for every eligible user and
+ * return { briefByUser, counters } for the digest loop + main's
+ * end-of-run exit gate. One brief per user regardless of how many
+ * variants they have enabled.
+ *
+ * Returns empty counters when brief composition is disabled,
+ * insights are unavailable, or the signing secret is missing. Never
+ * throws — the digest send path must remain independent of the
+ * brief path, so main() handles exit codes at the very end, AFTER
+ * the digest has been dispatched.
+ *
+ * @param {unknown[]} rules
+ * @param {number} nowMs
+ * @returns {Promise<{ briefByUser: Map<string, object>; composeSuccess: number; composeFailed: number }>}
+ */
+async function composeBriefsForRun(rules, nowMs) {
+  const briefByUser = new Map();
+  // Missing secret without explicit operator-disable = misconfigured
+  // rollout. Count it as a compose failure so the end-of-run exit
+  // gate trips and Railway flags the run red. Digest send still
+  // proceeds (compose failures must never block notification
+  // delivery to users).
+  if (BRIEF_SIGNING_SECRET_MISSING) {
+    console.error(
+      '[digest] brief: BRIEF_URL_SIGNING_SECRET not configured. Set BRIEF_COMPOSE_ENABLED=0 to silence intentionally.',
+    );
+    return { briefByUser, composeSuccess: 0, composeFailed: 1 };
+  }
+  if (!BRIEF_COMPOSE_ENABLED) return { briefByUser, composeSuccess: 0, composeFailed: 0 };
+
+  let insightsRaw = null;
+  try {
+    insightsRaw = await readRawJsonFromUpstash(INSIGHTS_KEY);
+  } catch (err) {
+    console.warn('[digest] brief: insights read failed, skipping brief composition:', err.message);
+    // An infra-level read failure is a compose-layer failure worth
+    // the Railway red flag — count it as one failure so the exit
+    // gate catches it. We still return a valid shape so the digest
+    // send path runs normally.
+    return { briefByUser, composeSuccess: 0, composeFailed: 1 };
+  }
+  if (!insightsRaw) return { briefByUser, composeSuccess: 0, composeFailed: 0 };
+
+  const insights = extractInsights(insightsRaw);
+  if (insights.topStories.length === 0) return { briefByUser, composeSuccess: 0, composeFailed: 0 };
+
+  const eligibleByUser = groupEligibleRulesByUser(rules);
+  let composeSuccess = 0;
+  let composeFailed = 0;
+  for (const [userId, candidates] of eligibleByUser) {
+    try {
+      const hit = await composeAndStoreBriefForUser(userId, candidates, insights, nowMs);
+      if (hit) {
+        briefByUser.set(userId, hit);
+        composeSuccess++;
+      }
+    } catch (err) {
+      composeFailed++;
+      if (err instanceof BriefUrlError) {
+        console.warn(`[digest] brief: sign failed for ${userId} (${err.code}): ${err.message}`);
+      } else {
+        console.warn(`[digest] brief: compose failed for ${userId}:`, err.message);
+      }
+    }
+  }
+  console.log(
+    `[digest] brief: compose_success=${composeSuccess} compose_failed=${composeFailed} total_users=${eligibleByUser.size}`,
+  );
+  return { briefByUser, composeSuccess, composeFailed };
+}
+
+/**
+ * Per-user: walk candidates until one produces stories, SETEX the
+ * envelope, sign the magazine URL. Returns the entry the caller
+ * should stash in briefByUser, or null when no candidate had stories.
+ */
+async function composeAndStoreBriefForUser(userId, candidates, insights, nowMs) {
+  let envelope = null;
+  let chosenVariant = null;
+  for (const candidate of candidates) {
+    const composed = composeBriefForRule(candidate, insights, { nowMs });
+    if (composed) {
+      envelope = composed;
+      chosenVariant = candidate.variant;
+      break;
+    }
+  }
+  if (!envelope) return null;
+
+  const issueDate = envelope.data.date;
+  const key = `brief:${userId}:${issueDate}`;
+  const pipelineResult = await redisPipeline([
+    ['SETEX', key, String(BRIEF_TTL_SECONDS), JSON.stringify(envelope)],
+  ]);
+  if (!pipelineResult || !Array.isArray(pipelineResult) || pipelineResult.length === 0) {
+    throw new Error('null pipeline response from Upstash');
+  }
+  const cell = pipelineResult[0];
+  if (cell && typeof cell === 'object' && 'error' in cell) {
+    throw new Error(`Upstash SETEX error: ${cell.error}`);
+  }
+
+  const magazineUrl = await signBriefUrl({
+    userId,
+    issueDate,
+    baseUrl: WORLDMONITOR_PUBLIC_BASE_URL,
+    secret: BRIEF_URL_SIGNING_SECRET,
+  });
+  return { envelope, magazineUrl, chosenVariant };
+}
+
 // ── Main ──────────────────────────────────────────────────────────────────────
 
 async function main() {
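The per-cell error check in composeAndStoreBriefForUser can be exercised standalone. This sketch assumes the Upstash REST pipeline response shape, an array with one `{ result }` or `{ error }` cell per command:

```javascript
// Assumed Upstash REST pipeline shape: one { result } or { error } cell per
// command, in order. Throwing on an error cell mirrors the cron's check.
function assertPipelineOk(pipelineResult) {
  if (!pipelineResult || !Array.isArray(pipelineResult) || pipelineResult.length === 0) {
    throw new Error('null pipeline response from Upstash');
  }
  for (const cell of pipelineResult) {
    if (cell && typeof cell === 'object' && 'error' in cell) {
      throw new Error(`Upstash pipeline error: ${cell.error}`);
    }
  }
  return pipelineResult.map((cell) => cell?.result);
}

console.log(assertPipelineOk([{ result: 'OK' }])); // [ 'OK' ]
try {
  assertPipelineOk([{ error: 'WRONGTYPE Operation against a key' }]);
} catch (err) {
  console.log(err.message); // Upstash pipeline error: WRONGTYPE ...
}
```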
@@ -856,6 +1044,14 @@ async function main() {
     return;
   }
 
+  // Compose per-user brief envelopes once per run (extracted so main's
+  // complexity score stays in the biome budget). Failures MUST NOT
+  // block digest sends — we carry counters forward and apply the
+  // exit-non-zero gate AFTER the digest dispatch so Railway still
+  // surfaces compose-layer breakage without skipping user-visible
+  // digest delivery.
+  const { briefByUser, composeSuccess, composeFailed } = await composeBriefsForRun(rules, nowMs);
+
   let sentCount = 0;
 
   for (const rule of rules) {
@@ -919,8 +1115,15 @@ async function main() {
     if (!storyListPlain) continue;
     const htmlRaw = formatDigestHtml(stories, nowMs);
 
-    const { text, telegramText, slackText, discordText } = buildChannelBodies(storyListPlain, aiSummary);
-    const html = injectEmailSummary(htmlRaw, aiSummary);
+    const brief = briefByUser.get(rule.userId);
+    const magazineUrl = brief?.magazineUrl ?? null;
+    const { text, telegramText, slackText, discordText } = buildChannelBodies(
+      storyListPlain,
+      aiSummary,
+      magazineUrl,
+    );
+    const htmlWithSummary = injectEmailSummary(htmlRaw, aiSummary);
+    const html = injectBriefCta(htmlWithSummary, magazineUrl);
 
     const shortDate = new Intl.DateTimeFormat('en-US', { month: 'short', day: 'numeric' }).format(new Date(nowMs));
     const subject = aiSummary ? `WorldMonitor Intelligence Brief — ${shortDate}` : `WorldMonitor Digest — ${shortDate}`;
@@ -955,6 +1158,18 @@ async function main() {
   }
 
   console.log(`[digest] Cron run complete: ${sentCount} digest(s) sent`);
 
+  // Brief-compose failure gate. Runs at the very end so a compose-
+  // layer outage (Upstash blip, insights key stale, signing secret
+  // missing) never blocks digest delivery to users — but Railway
+  // still flips the run red so ops see the signal. Denominator is
+  // attempted writes (shouldExitNonZero enforces this).
+  if (shouldExitOnBriefFailures({ success: composeSuccess, failed: composeFailed })) {
+    console.warn(
+      `[digest] brief: exiting non-zero — compose_failed=${composeFailed} compose_success=${composeSuccess} crossed the threshold`,
+    );
+    process.exit(1);
+  }
 }
 
 main().catch((err) => {
@@ -13,7 +13,7 @@ import {
   dedupeRulesByUser,
   groupEligibleRulesByUser,
   shouldExitNonZero,
-} from '../scripts/seed-brief-composer.mjs';
+} from '../scripts/lib/brief-compose.mjs';
 
 function rule(overrides = {}) {
   return {
72 tests/brief-url-sign.test.mjs (new file)
@@ -0,0 +1,72 @@
+// Parity tests for the scripts-side HMAC signer.
+//
+// The signing algorithm lives in TWO places: server/_shared/brief-
+// url.ts (used by edge routes for verify, and by any TS code that
+// wants to mint URLs) and scripts/lib/brief-url-sign.mjs (used by
+// the consolidated digest cron to mint magazine URLs to embed in
+// notification bodies).
+//
+// Any drift between them silently produces tokens the edge route
+// cannot verify. These tests prove: (a) the same (userId, date,
+// secret) input produces byte-identical tokens across both modules,
+// (b) tokens signed by the scripts side pass the edge-side
+// verifyBriefToken.
+
+import { describe, it } from 'node:test';
+import assert from 'node:assert/strict';
+import {
+  BriefUrlError,
+  signBriefToken as signTokenScripts,
+  signBriefUrl as signUrlScripts,
+} from '../scripts/lib/brief-url-sign.mjs';
+import {
+  signBriefToken as signTokenEdge,
+  verifyBriefToken,
+} from '../server/_shared/brief-url.ts';
+
+const SECRET = 'consolidation-parity-secret-0xdead';
+const USER_ID = 'user_consolidated123';
+const ISSUE_DATE = '2026-04-18';
+
+describe('scripts/lib/brief-url-sign parity with server/_shared/brief-url', () => {
+  it('produces byte-identical tokens for the same inputs', async () => {
+    const a = await signTokenScripts(USER_ID, ISSUE_DATE, SECRET);
+    const b = await signTokenEdge(USER_ID, ISSUE_DATE, SECRET);
+    assert.equal(a, b, 'scripts + edge signers must agree byte-for-byte');
+  });
+
+  it('scripts-signed tokens pass edge-side verifyBriefToken', async () => {
+    const token = await signTokenScripts(USER_ID, ISSUE_DATE, SECRET);
+    assert.equal(
+      await verifyBriefToken(USER_ID, ISSUE_DATE, token, SECRET),
+      true,
+    );
+  });
+
+  it('signBriefUrl composes a working URL', async () => {
+    const url = await signUrlScripts({
+      userId: USER_ID,
+      issueDate: ISSUE_DATE,
+      baseUrl: 'https://worldmonitor.app',
+      secret: SECRET,
+    });
+    assert.match(
+      url,
+      new RegExp(`^https://worldmonitor\\.app/api/brief/${USER_ID}/${ISSUE_DATE}\\?t=[A-Za-z0-9_-]{43}$`),
+    );
+  });
+
+  it('rejects malformed userId at sign time', async () => {
+    await assert.rejects(
+      () => signTokenScripts('user with spaces', ISSUE_DATE, SECRET),
+      (err) => err instanceof BriefUrlError && err.code === 'invalid_user_id',
+    );
+  });
+
+  it('rejects empty secret at sign time', async () => {
+    await assert.rejects(
+      () => signTokenScripts(USER_ID, ISSUE_DATE, ''),
+      (err) => err instanceof BriefUrlError && err.code === 'missing_secret',
+    );
+  });
+});
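For orientation, the token construction this parity suite guards is, in outline, an HMAC-SHA256 over the userId and issue date, base64url-encoded without padding; 32 MAC bytes encode to exactly the 43 url-safe characters the URL regex above expects. A hedged sketch: the repo's signer is Web Crypto-only, while this sketch uses node:crypto's synchronous HMAC purely to stay short, and the exact message layout is an assumption, not the repo's code:

```javascript
// Sketch of the token shape only — the real signers live in
// scripts/lib/brief-url-sign.mjs and server/_shared/brief-url.ts.
// The `${userId}:${issueDate}` message layout is an assumption.
import { createHmac } from 'node:crypto';

function signTokenSketch(userId, issueDate, secret) {
  if (!secret) throw new Error('missing_secret');
  if (!/^[A-Za-z0-9_-]+$/.test(userId)) throw new Error('invalid_user_id');
  // 'base64url' digest: url-safe alphabet, no '=' padding.
  return createHmac('sha256', secret)
    .update(`${userId}:${issueDate}`)
    .digest('base64url'); // 32 bytes → 43 chars
}

const token = signTokenSketch('user_consolidated123', '2026-04-18', 'demo-secret');
console.log(token.length);                      // 43
console.log(/^[A-Za-z0-9_-]{43}$/.test(token)); // true
```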