Deployment & operations¶
Runbooks for shipping changes, re-running the pipeline, uploading data, and checking the health of the system.
Components to deploy¶
| Component | Tool | Frequency |
|---|---|---|
| Data (KV) | `wrangler kv bulk put` | When pipeline is re-run |
| Worker API | `wrangler deploy` | On backend changes |
| Frontend | `wrangler pages deploy` | On frontend changes |
| Supabase schema | SQL editor in dashboard | When a new migration exists |
| Docs | `mkdocs gh-deploy` (or Cloudflare Pages) | On doc updates |
Runbooks¶
1. Re-run the data pipeline and upload¶
Use this when:
- KPI formulas change (edit code/scores.py or weights in code/config.py)
- New data source added
- Data sources updated (SITG quarterly refresh)
- New curated data (e.g., international schools list)
# 1. Run the pipeline
cd code
conda activate hood-analyzer
python pipeline.py
# ~1.5 min without transit, ~39 min with r5py
# 2. Sanity check the output
conda run -n hood-analyzer python -c "
import pandas as pd
df = pd.read_parquet('../output/geneva_kpis_by_h3.parquet')
print(df.shape)
print(df[['daily_life_score', 'family_score', 'smart_living_score']].describe())
"
# 3. Upload to KV (both batches — ~17K writes total)
cd ..
npx wrangler kv bulk put \
--namespace-id=69e40836e3ec453f874e1531d2ed5d54 \
--remote \
output/kv_export/kv_batch_001.json
npx wrangler kv bulk put \
--namespace-id=69e40836e3ec453f874e1531d2ed5d54 \
--remote \
output/kv_export/kv_batch_002.json
Note on plan limits: The Workers Free plan allows 1,000 KV writes/day — one pipeline upload uses 17,097. You need the Workers Paid plan (CHF 5/month) for iteration. The free plan is only OK if you upload infrequently.
Key insight: Because the worker reads live KV for saved reports (not the Supabase snapshot), existing unlocked reports automatically show the new data on the user's next view. No migration needed.
2. Deploy the worker¶
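From the repo root, the deploy itself is one command (the same invocation the disaster-recovery runbook below uses):

```shell
cd worker && npx wrangler deploy
```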
Deploying picks up the wrangler.toml bindings. Existing secrets
(SUPABASE_SERVICE_KEY, STRIPE_SECRET_KEY, etc.) are preserved across deploys.
Verify with:
curl https://neighborhood-report-api.dn-de-ridder.workers.dev/api/health
# → {"status":"ok","service":"neighborhood-report-api","timestamp":"..."}
3. Deploy the frontend¶
cd frontend
npm run build
npx wrangler pages deploy dist --project-name=neighborhood-report --commit-dirty=true
--commit-dirty=true suppresses the "you have uncommitted changes"
warning — useful during active development, skip it for production
releases.
Output includes a preview URL like
https://abc12345.neighborhood-report.pages.dev. The canonical
locascore.ch domain points at the latest main branch deployment.
4. Run a Supabase migration¶
- Open the Supabase dashboard → SQL editor
- Paste the contents of supabase/migration_XXX_name.sql
- Hit Run
- Verify no errors
All migrations are idempotent where possible (functions and views use
CREATE OR REPLACE). Tables use plain CREATE TABLE, not CREATE TABLE IF
NOT EXISTS — running them twice will fail, which is a feature (it
prevents accidental re-runs).
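As a sketch of that convention (object names here are hypothetical, not from an actual migration):

```sql
-- Re-runnable: a second run replaces the view in place
CREATE OR REPLACE VIEW admin_example_view AS
  SELECT id, created_at FROM purchases;

-- Intentionally not re-runnable: a second run fails fast
CREATE TABLE example_table (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  created_at timestamptz NOT NULL DEFAULT now()
);
```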
5. Deploy the docs (this wiki)¶
# Local preview
mkdocs serve
# → http://127.0.0.1:8000
# Build static site
mkdocs build
# → site/
# Deploy to Cloudflare Pages (create a project first, one-time)
npx wrangler pages deploy site --project-name=locascore-docs
Alternative: GitHub Pages via mkdocs gh-deploy — but Cloudflare Pages
is already the platform of record for the rest of the stack.
Checking the health of the system¶
API health¶
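Same check as in the worker deploy runbook:

```shell
curl https://neighborhood-report-api.dn-de-ridder.workers.dev/api/health
# → {"status":"ok","service":"neighborhood-report-api","timestamp":"..."}
```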
KV content¶
# How many keys are in the namespace?
npx wrangler kv key list --namespace-id=69e40836e3ec453f874e1531d2ed5d54 --remote \
| jq 'length'
# → should be 17097 (or close, depending on the current grid size)
# Inspect a specific cell
npx wrangler kv key get 8a1f91ac356ffff \
--namespace-id=69e40836e3ec453f874e1531d2ed5d54 --remote \
| jq '.smart_living_score'
Admin queries (from Supabase SQL editor)¶
After migration 009, these views exist for one-off queries. Examples:
-- Recent unresolved feedback
SELECT * FROM admin_feedback_inbox LIMIT 20;
-- Feedback breakdown by category (last 30d)
SELECT * FROM admin_feedback_stats_30d;
-- Lead pipeline (uncontacted captures)
SELECT * FROM admin_email_captures_pipeline LIMIT 50;
-- Daily report unlocks
SELECT * FROM admin_reports_daily;
-- Revenue snapshot
SELECT * FROM admin_purchases_30d;
-- Captures by commune (where's the demand?)
SELECT * FROM admin_email_captures_by_commune;
All views are restricted to the service_role (the dashboard user) —
not reachable from the frontend or anon API.
GA4 funnel events¶
Events instrumented (partial list — see frontend/src/utils/analytics.ts
for the full EVENTS const):
- address_searched, teaser_cta_clicked, auth_checkout_modal_opened
- auth_modal_signin_submitted, auth_modal_signin_failed, auth_modal_signup_failed (with reason)
- pricing_step_viewed, checkout_started, checkout_completed, checkout_cancelled
- report_unlock_started, report_unlock_success, report_unlock_failed
- email_capture_submitted / success / failed
- feedback_submitted / success / failed
- compare_mode_toggled, compare_viewed
Build funnel reports in GA4 like:
address_searched → teaser_cta_clicked → auth_checkout_modal_opened
→ pricing_step_viewed → checkout_started → checkout_completed
→ report_unlock_success
In dev, every event is logged to console.debug('[analytics] event: ...')
so you can verify instrumentation without GA.
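A minimal sketch of that pattern (the real helper lives in frontend/src/utils/analytics.ts; the function name and shape here are illustrative):

```typescript
// Hypothetical GA4 event helper with a dev-mode console fallback.
type EventParams = Record<string, string | number | boolean>;

export function track(event: string, params: EventParams = {}): void {
  const gtag = (globalThis as { gtag?: (...args: unknown[]) => void }).gtag;
  if (gtag) {
    gtag("event", event, params); // forward to GA4 when the tag is loaded
  } else {
    // Dev fallback: log so instrumentation can be verified without GA
    console.debug(`[analytics] event: ${event}`, params);
  }
}

track("address_searched", { commune: "Carouge" });
```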
Common operational issues¶
KV upload fails with "free usage limit for this operation for today"¶
Cause: Workers Free plan, 1,000 KV writes/day cap. Fix: Upgrade to Workers Paid ($5/mo). Limit jumps to 1M writes/month.
Frontend deploy shows old data even after KV upload¶
Cause: Cloudflare edge caching (rare) or browser cache.
Fix: Hard-refresh (Cmd+Shift+R). The worker sets Cache-Control:
private, max-age=3600 for report responses — after an hour the cache
expires naturally. If you need to force an immediate refresh, bump the
deployment (frontend redeploy).
"Report not found" after unlock¶
Cause: Race between unlock_report RPC committing and the frontend
reading back. Or the client navigated to /report/:slug before the
unlock response arrived.
Fix: Check Supabase logs — the unlock RPC should have inserted a
row. Check the worker logs for the exact error. useReport will
gracefully fall back to /api/report/unlock if the saved report isn't
found by address.
"No tokens available" for a user who just paid¶
Cause: Stripe webhook didn't fire, or process_purchase RPC failed.
Check:
-- Did the purchase land?
SELECT * FROM purchases
WHERE user_id = '<uuid>'
ORDER BY created_at DESC LIMIT 5;
-- Is the token balance correct?
SELECT id, email, token_balance FROM profiles WHERE id = '<uuid>';
-- Recent completed stripe sessions
SELECT stripe_session_id, status, credits_remaining
FROM purchases
WHERE stripe_session_id LIKE 'cs_%'
ORDER BY created_at DESC LIMIT 10;
If the webhook fired but the RPC failed, check Supabase logs for the
specific error and re-run process_purchase manually with the session ID.
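A hypothetical manual re-run; the parameter list is an assumption, so check the migration that defines the RPC before using it:

```sql
-- Assumes process_purchase takes the Stripe checkout session ID.
-- Verify the actual signature in the Supabase migration first.
SELECT process_purchase('cs_live_...');
```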
"Outside Geneva" for a valid Geneva address¶
Cause: Swiss Federal geocoder occasionally returns points slightly
outside the canton polygon (e.g., for addresses on the canton border).
Fix: worker/src/utils/h3.ts::validateCoords uses a bounding box
with a small buffer. If a legit address falls outside, increase the
buffer in that file.
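The idea, as a sketch; the bounds and buffer below are illustrative, not the real values in worker/src/utils/h3.ts:

```typescript
// Hypothetical bounding box for canton Geneva plus a tolerance buffer.
// The actual bounds and buffer live in worker/src/utils/h3.ts::validateCoords.
const GENEVA_BOUNDS = { minLat: 46.13, maxLat: 46.37, minLng: 5.95, maxLng: 6.32 };
const BUFFER_DEG = 0.01; // widen this if legit border addresses get rejected

export function validateCoords(lat: number, lng: number): boolean {
  return (
    lat >= GENEVA_BOUNDS.minLat - BUFFER_DEG &&
    lat <= GENEVA_BOUNDS.maxLat + BUFFER_DEG &&
    lng >= GENEVA_BOUNDS.minLng - BUFFER_DEG &&
    lng <= GENEVA_BOUNDS.maxLng + BUFFER_DEG
  );
}
```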
Disaster recovery¶
Nuclear option: rebuild everything from source¶
# 1. Re-run the pipeline
cd code && python pipeline.py
# 2. Re-upload both KV batches
cd .. && for i in 1 2; do
npx wrangler kv bulk put \
--namespace-id=69e40836e3ec453f874e1531d2ed5d54 --remote \
output/kv_export/kv_batch_00${i}.json
done
# 3. Re-deploy worker
cd worker && npx wrangler deploy
# 4. Re-deploy frontend
cd ../frontend && npm run build
npx wrangler pages deploy dist --project-name=neighborhood-report
Restoring user data¶
Supabase handles backups automatically (check your plan's retention in the dashboard; point-in-time recovery is not included on the free tier). To restore a specific row, use the dashboard's restore feature or run a manual query.
Environment variables¶
Worker secrets (set via wrangler secret put)¶
SUPABASE_SERVICE_KEY # from Supabase dashboard → Settings → API
STRIPE_SECRET_KEY # from Stripe dashboard → Developers → API keys
STRIPE_WEBHOOK_SECRET # from Stripe webhook endpoint
MAPBOX_TOKEN # from Mapbox account
TURNSTILE_SECRET_KEY # from Cloudflare Turnstile (currently unused)
Worker env vars (in wrangler.toml)¶
Frontend env vars (in .env.local, not checked in)¶
VITE_API_URL # https://neighborhood-report-api.dn-de-ridder.workers.dev
VITE_SUPABASE_URL # https://<project>.supabase.co
VITE_SUPABASE_ANON_KEY # from Supabase → Settings → API
VITE_MAPBOX_TOKEN # public Mapbox token
VITE_TURNSTILE_SITE_KEY # currently unused
VITE_GA_MEASUREMENT_ID # G-XXXXXXXXXX from GA4 (optional)
Releasing a new version¶
LocaScore uses semantic versioning. See
CHANGELOG.md
at the repo root for the full release history.
When to bump¶
- Patch (0.9.x) — bug fixes, copy tweaks, data refreshes. No schema changes, no new features.
- Minor (0.x.0) — new user-facing features, scoring changes, schema additions (new KPI columns, new DB tables).
- Major (x.0.0) — currently 0.x, will bump to 1.0.0 at public launch. After 1.0, reserved for breaking changes.
Release checklist¶
- Update CHANGELOG.md: move items from [Unreleased] into a new version heading with today's date. Use the Keep a Changelog sections (Added/Changed/Deprecated/Removed/Fixed/Security).
- Bump versions in both package.json files (they should stay in sync). The frontend footer picks up the new version automatically via the Vite define in vite.config.ts.
- Commit the version bump.
- Tag the release.
- Deploy (in order):
    - Run pipeline + upload KV (if data changed)
    - Deploy worker: cd worker && npx wrangler deploy
    - Deploy frontend: cd frontend && npm run build && npx wrangler pages deploy dist --project-name=neighborhood-report
- (Optional) Create a GitHub release from the tag. The web UI can auto-generate release notes from commits, or you can paste the relevant CHANGELOG.md section.
- Verify by checking the footer on locascore.ch — it should display the new version.
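The commit-and-tag steps can be sketched as follows (version number and file paths are illustrative):

```shell
git add CHANGELOG.md package.json frontend/package.json
git commit -m "chore: release v0.9.1"
git tag -a v0.9.1 -m "Release 0.9.1"
git push origin main --tags
```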
Hotfix flow¶
For urgent fixes (e.g., a production bug):
- Branch from the tag: git checkout -b hotfix/0.9.1 v0.9.0
- Apply the fix
- Bump to 0.9.1 in both package.json files
- Update CHANGELOG.md with a new [0.9.1] entry
- Commit, tag, push, deploy (same as above)
- Merge the hotfix branch back into main
Pre-launch checklist¶
- All migrations (001-009) run in Supabase production
- KV has 17K+ entries (wrangler kv key list | jq length)
- /api/health returns 200 OK
- Teaser endpoint returns data for a known address
- Unlock flow works end-to-end (sign up → buy → view report)
- Playwright tests pass (npm run test:e2e — 10 passing minimum)
- GA4 is live and receiving events
- Cloudflare Bot Fight Mode enabled
- Rate limits verified (curl the teaser endpoint 15 times, expect 429)
- Privacy policy + Terms of service accessible at /privacy and /terms
- Data attribution in the footer (SITG, OSM, OFEV, TPG, Mapbox)