Build plan

Built in a day.

Most of the code already exists. What’s left is signing up for four accounts, running two SQL files against Neon, and deploying five workers and one dashboard via wrangler. A focused day. Sandwich at lunchtime.

Block 01 · morning

Sign up. Four tabs.

Open four browser tabs, sign up for each, save the keys somewhere safe. None of this needs technical skill.

10 min · Cloudflare

Sign up at dash.cloudflare.com

Add a card. Upgrade to Workers Paid ($5/month). This is the platform that runs everything.

5 min · Neon

Create a Neon Postgres project

console.neon.tech. Region eu-west-2. Copy the pooled connection string. This is the database.

5 min · Companies House

Get a Companies House REST API key

developer.company-information.service.gov.uk. Free. The crawler uses this for filings, charges, officer changes.

5 min · Anthropic

Anthropic API key, £20 credit

console.anthropic.com. Powers the article classifier. £20 lasts months at typical volume.

Block 02 · midday

Database, workers, deploy.

This is the technical block. Two SQL files into Neon, five wrangler deploy commands, secrets pasted in. Following DEPLOY.md step by step, it takes about two hours, including the inevitable typo or two.

10 min · Database

Run schema.sql then seed.sql against Neon

Paste the contents of each file into the Neon SQL editor. The schema creates all tables and views; the seed populates 11 sample UK retailers (Debenhams, Carpetright, Cath Kidston, River Island, SecretSales, Next, Currys, JD Sports, Boots, M&S, John Lewis) with realistic-looking signals so the dashboard works on day one.
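If you prefer a terminal to the SQL editor, the same two files can be run with psql. A minimal sketch, assuming psql is installed, you are in the repo root, and DATABASE_URL holds the pooled Neon connection string saved earlier:

```shell
# Schema first, then seed. ON_ERROR_STOP aborts on the first failed
# statement instead of ploughing on with a half-built schema.
psql "$DATABASE_URL" -v ON_ERROR_STOP=1 -f schema.sql
psql "$DATABASE_URL" -v ON_ERROR_STOP=1 -f seed.sql
```

Either route ends in the same place; the editor is easier if you haven't got psql handy.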

10 min · Optional

Load the bigger news-source list

seed-news-sources-uk-de.sql adds ~100 RSS feeds across UK and Germany. Run the smoke-test script afterwards to disable any that 404. This replaces the six starter sources in the main seed.

90 min · Workers

Deploy the five Cloudflare Workers

Companies House crawler. News scraper. LLM classifier. Risk scorer. CSV ingest. For each: cd workers/<name>, npm install, wrangler secret put DATABASE_URL, wrangler deploy. The CH worker also takes CH_API_KEY; the classifier takes ANTHROPIC_API_KEY.
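The per-worker routine can be scripted. A sketch in which the directory names are assumptions (match them to the actual workers/ folder); the commands are echoed as a dry run, so pipe the output to sh, or drop the echo, to actually deploy:

```shell
# Worker directory names below are guesses; edit to match the repo.
workers="companies-house news-scraper classifier risk-scorer csv-ingest"

# Build the command list as a dry run.
cmds=$(for w in $workers; do
  echo "cd workers/$w && npm install && wrangler secret put DATABASE_URL && wrangler deploy"
done)
echo "$cmds"

# Per the text, two workers take an extra secret:
#   companies-house: wrangler secret put CH_API_KEY
#   classifier:      wrangler secret put ANTHROPIC_API_KEY
```

wrangler secret put reads the value from stdin, so have the keys from the morning block ready to paste.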

5 min · KV + R2

Create a KV namespace and an R2 bucket

One-liner each. KV is for the Companies House rate limiter. R2 is where Redbrain finance will drop daily invoice CSVs for the csv-ingest worker.
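The two one-liners might look like this; the binding and bucket names here are assumptions, so use whatever the workers' wrangler.toml files actually reference:

```shell
# KV namespace for the Companies House rate limiter (name is a guess).
wrangler kv namespace create CH_RATE_LIMIT

# R2 bucket for the daily invoice CSV drops (name is a guess).
wrangler r2 bucket create invoice-drops
```

Copy the namespace ID wrangler prints into the CH worker's wrangler.toml before deploying it.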

Block 03 · afternoon

Dashboard live. Tune. Share.

The last hour. Deploy the dashboard, put a password on it, share the URL. Then spend twenty minutes adjusting the sliders to match Redbrain’s actual risk tolerance.

15 min · Dashboard

Deploy the Next.js dashboard to Pages

cd dashboard, npm install, npm run pages:deploy. Set DATABASE_URL and DASHBOARD_PASSWORD as Pages secrets. Redeploy once more so the env vars take effect.
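As a sketch, with the Pages project name an assumption (use whatever npm run pages:deploy creates):

```shell
cd dashboard
npm install
npm run pages:deploy

# Set secrets, then redeploy so they take effect.
wrangler pages secret put DATABASE_URL --project-name=dashboard
wrangler pages secret put DASHBOARD_PASSWORD --project-name=dashboard
npm run pages:deploy
```

The second deploy is not optional; Pages only picks up new secrets on the next deployment.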

10 min · Tune

Open Settings, drag the sliders

Walk down the fifteen scoring parameters. Half-life. Severity weights. Band thresholds. Action triggers. Defaults are sensible, but Redbrain’s book has a specific shape — nudge until the right merchants surface at the top.

10 min · Watchlist

Star the 5–10 merchants you already worry about

Apply tags — Critical watch, Renewals due, PE-owned. The system starts pulling extra data on the deeper-tagged ones immediately. Confirm they sit at the top of the Finance view.

10 min · Alerts

Wire your finance team’s email and the #risk Slack channel

Add recipients on the Alerts page. The five default rules cover most needs — critical breach, score jump, overdue threshold, watchlist movement, daily digest. Mute the ones you don’t want, route the rest. Done.

After today

The first two weeks.

A day to deploy. A fortnight for the data to fill in.

  1. Day 1 onwards: Companies House data flows.

    The CH worker runs every 6 hours. Within a day or two, every merchant in the book has its complete filing history, charges, officer changes, accounts-overdue flags loaded. Real signals start replacing the seeded ones.

  2. Hour 1 onwards: News classifier kicks in.

    RSS scraping every 30 minutes across ~100 sources. Each article goes through Claude Haiku to extract merchant mentions and distress severity. Press signals stack on top of CH and internal data.

  3. Week 1: Wire up the finance CSV pipeline.

    The csv-ingest worker watches an R2 bucket. Drop daily invoice and exposure CSVs in, and the worker upserts into Neon overnight. From this point, DSO and exposure data is live, not seeded — the most predictive signals in the whole system.

  4. Week 2: First tuning pass.

    Review the merchants the system flagged against what finance and credit already thought. Adjust sliders. Reset or add watchlist tags. Calibrate alert thresholds so the team gets pinged for what matters without alarm fatigue.

What was an 8-week plan in the original proposal is now a one-day deploy because the code already exists. The real work is the tuning, not the building.