Each night the AS400 dumps an invoice export to R2. At 4 AM EST, the worker reads the file, geocodes addresses, splits into Belleville and Kingston routes, and writes the result back to R2 — the same path the driver tablets already read. By 8 AM, drivers see today's stops.
- `handleInvoiceCron` picks the first existing R2 path and auto-detects JSON vs CSV from content.
- `buildInvoiceRoutes` normalizes field names. AS400 column names drift between reports — the alias map absorbs that.
- An explicit `branch` column wins; otherwise the branch is inferred from the postal FSA via `checkZone()`. Pickups are skipped.
- Output: `deliveries/{date}/belleville.json` + `kingston.json` — the same shape the driver tablet reads. The write step is sketched below.
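The final step is a pair of R2 writes. A minimal sketch, assuming the `CATALOG_BUCKET` binding from the config below and a hypothetical `writeRoutes` helper; the real write lives inside `handleInvoiceCron`:

```ts
// Hedged sketch: write one JSON route file per branch to the paths the
// driver tablets already poll. The `routes` shape is an assumption.
async function writeRoutes(
  env: { CATALOG_BUCKET: R2Bucket },
  date: string,
  routes: { belleville: unknown[]; kingston: unknown[] },
): Promise<void> {
  for (const branch of ['belleville', 'kingston'] as const) {
    await env.CATALOG_BUCKET.put(
      `deliveries/${date}/${branch}.json`,
      JSON.stringify(routes[branch]),
      { httpMetadata: { contentType: 'application/json' } },
    );
  }
}
```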
The 04:00 invoice cron fires after the AS400 export's 1–3 AM window, leaving roughly an hour of buffer. Drivers see the route when tablets wake at 8 AM or on any manual refresh.
We don't yet know the exact path AS400 writes to. The cron tries these six in order and uses the first that exists — once the actual path is known, move it to the top for a slight speedup.
| # | R2 path | Note |
|---|---|---|
| 1 | as400/invoices-{today}.json | Date-stamped, JSON — preferred |
| 2 | as400/invoices-{today}.csv | Date-stamped, CSV |
| 3 | as400/invoices-latest.json | Sentinel "latest" file, JSON |
| 4 | as400/invoices-latest.csv | Sentinel "latest" file, CSV |
| 5 | as400/invoices/{today}.json | Folder-style, JSON |
| 6 | as400/invoices/{today}.csv | Folder-style, CSV |
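The probe itself is a simple first-match loop. A minimal sketch, assuming the `CATALOG_BUCKET` R2 binding and a pre-formatted `today` date string; the candidate list mirrors the table above:

```ts
// Hedged sketch: try each candidate key in priority order, return the first hit.
async function findInvoiceExport(
  env: { CATALOG_BUCKET: R2Bucket },
  today: string, // e.g. "2025-01-31"; the exact date format is an assumption
): Promise<{ key: string; text: string } | null> {
  const candidates = [
    `as400/invoices-${today}.json`,
    `as400/invoices-${today}.csv`,
    `as400/invoices-latest.json`,
    `as400/invoices-latest.csv`,
    `as400/invoices/${today}.json`,
    `as400/invoices/${today}.csv`,
  ];
  for (const key of candidates) {
    const obj = await env.CATALOG_BUCKET.get(key);
    if (obj !== null) return { key, text: await obj.text() };
  }
  return null; // no export found
}
```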
Both JSON and CSV are accepted — detected by content shape, not file extension. The first non-whitespace character decides:
```ts
function detectFormat(text: string): 'csv' | 'json' {
  const trimmed = text.trimStart();
  return (trimmed.startsWith('[') || trimmed.startsWith('{')) ? 'json' : 'csv';
}
```
JSON wrappers are accepted — the AS400 dump might be any of these shapes (unwrapping sketched after the list):

- `[ { ... }, { ... } ]` — top-level array
- `{ "invoices": [ ... ] }`
- `{ "records": [ ... ] }`
- `{ "data": [ ... ] }`
- `{ "rows": [ ... ] }`
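A minimal unwrap sketch, assuming a hypothetical `unwrapInvoices` helper; the real logic lives in the worker's parser:

```ts
// Hedged sketch: accept a bare array or any of the documented wrapper keys.
function unwrapInvoices(parsed: unknown): Record<string, string>[] {
  if (Array.isArray(parsed)) return parsed; // top-level array
  const obj = parsed as Record<string, unknown>;
  for (const key of ['invoices', 'records', 'data', 'rows']) {
    const value = obj?.[key];
    if (Array.isArray(value)) return value as Record<string, string>[];
  }
  throw new Error('Unrecognized AS400 JSON shape');
}
```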
AS400 reports rename columns between exports. Each canonical field in the parser maps to multiple accepted header names — case-insensitive — so an export switching from `invoice_no` to `document_no` doesn't break the cron.
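An illustrative sketch of the alias table and resolver. Beyond the `invoice_no`/`document_no` pair and the aliased fields named in the test fixtures below, the canonical names and aliases here are assumptions:

```ts
// Hedged sketch of the alias table. Only invoice_no/document_no, customer,
// ship_to, and warehouse appear in this doc; the rest are plausible stand-ins.
const FIELD_ALIASES: Record<string, string[]> = {
  invoiceNumber: ['invoice_no', 'document_no'],
  customerName:  ['customer', 'customer_name'],
  address:       ['ship_to', 'address'],
  branch:        ['branch', 'warehouse'],
};

// Resolve a canonical field from a raw row, matching headers case-insensitively.
function resolveField(row: Record<string, string>, canonical: string): string | undefined {
  const lowered = new Map(
    Object.entries(row).map(([k, v]) => [k.toLowerCase(), v]),
  );
  for (const alias of FIELD_ALIASES[canonical] ?? []) {
    const hit = lowered.get(alias);
    if (hit !== undefined) return hit;
  }
  return undefined;
}
```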
Branch assignment follows three rules, in order:

1. An explicit column wins: if `branch=belleville` or `branch=kingston`, that's the answer.
2. Otherwise, infer from the postal FSA via `checkZone()`: K8N/K8P/K8R/K8V → Belleville; K7K/K7L/K7M/K7P/K7R → Kingston (sketched below).
3. Pickups are skipped: if `branch=pickup`, `fulfillment=pickup`, or the address contains "PICKUP", the invoice is filtered out — it stays at the counter, not on a truck.
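A minimal sketch of the zone check; the FSA lists come straight from the rules above, while the exact signature of `checkZone()` is an assumption:

```ts
// Hedged sketch: map the first three postal characters (the FSA) to a branch.
const BELLEVILLE_FSAS = new Set(['K8N', 'K8P', 'K8R', 'K8V']);
const KINGSTON_FSAS   = new Set(['K7K', 'K7L', 'K7M', 'K7P', 'K7R']);

function checkZone(postal: string): 'belleville' | 'kingston' | null {
  const fsa = postal.trim().toUpperCase().slice(0, 3);
  if (BELLEVILLE_FSAS.has(fsa)) return 'belleville';
  if (KINGSTON_FSAS.has(fsa)) return 'kingston';
  return null; // outside both zones; handling here is an assumption
}
```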
The job can also be triggered on demand via `POST /cron/run`.
The dispatch dashboard's Import Invoices button stays as the safety net whenever automation fails or someone wants to test a one-off. It accepts CSV via multipart upload and writes to the same R2 paths.
```sh
# From the dashboard — admin role only
POST /api/v1/dispatch/import-invoices
authorization: Bearer <admin-jwt>
content-type: multipart/form-data

# Or directly via curl with the cron token (double quotes so the shell
# actually expands $CRON_TRIGGER_TOKEN)
curl -X POST https://bot.613parts.ca/cron/import-invoices \
  -H "authorization: Bearer $CRON_TRIGGER_TOKEN" \
  -H 'content-type: text/csv' \
  --data-binary @invoices.csv
```
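On the worker side, a hedged sketch of the handler; the form field name `file` and the `Env` type are assumptions, and both entry points funnel into the same parse-and-write path as the cron:

```ts
// Hedged sketch: accept either a multipart upload (dashboard) or a raw
// text/csv body (curl), then hand off to the shared import pipeline.
async function handleImportInvoices(request: Request, env: Env): Promise<Response> {
  const contentType = request.headers.get('content-type') ?? '';
  let text: string;
  if (contentType.includes('multipart/form-data')) {
    const form = await request.formData();
    const file = form.get('file'); // form field name is an assumption
    if (!(file instanceof File)) {
      return new Response('missing file', { status: 400 });
    }
    text = await file.text();
  } else {
    text = await request.text(); // raw CSV body
  }
  // ...same detectFormat → parse → buildInvoiceRoutes → R2 write as the cron
  return new Response('ok');
}
```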
Vitest suite at `worker/test/invoice-import.test.ts` covers auth, validation, CSV parsing, branch grouping, R2 writes, and JSON/CSV parity. Two fixtures:
- `worker/test/data/sample-invoices.csv` — 10 invoices, mixed branches, includes a pickup
- `worker/test/data/sample-invoices.json` — same 10 invoices in AS400 JSON shape with aliased field names (`invoice_no`, `customer`, `ship_to`, `warehouse`) to verify alias resolution
The parity test confirms that feeding either fixture produces the same routes — same invoice numbers in the same branches, same per-stop totals, same item counts. Run via `npm test invoice-import` from the worker directory.
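The parity check itself is short. A hedged sketch, where the `../src/invoices` module path and the `parseInvoices` helper are assumptions (`buildInvoiceRoutes` is the function named above):

```ts
import { describe, it, expect } from 'vitest';
import { readFileSync } from 'node:fs';
// Module path and parseInvoices are assumptions; buildInvoiceRoutes is named in this doc.
import { parseInvoices, buildInvoiceRoutes } from '../src/invoices';

describe('JSON/CSV parity', () => {
  it('builds identical routes from either fixture', () => {
    // parseInvoices is assumed to auto-detect format, as the cron does.
    const fromCsv  = buildInvoiceRoutes(parseInvoices(readFileSync('test/data/sample-invoices.csv', 'utf8')));
    const fromJson = buildInvoiceRoutes(parseInvoices(readFileSync('test/data/sample-invoices.json', 'utf8')));
    expect(fromCsv).toEqual(fromJson);
  });
});
```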
```toml
# wrangler.toml
[triggers]
crons = ["0 7 * * *", "0 9 * * *"]

# R2 bucket — already provisioned
[[r2_buckets]]
binding = "CATALOG_BUCKET"
bucket_name = "613parts-catalog"
```

```sh
# Required secrets
wrangler secret put GOOGLE_MAPS_API_KEY   # geocoding, ~1500 calls/mo
wrangler secret put SLACK_WEBHOOK_URL     # ops notifications (optional)
wrangler secret put CRON_TRIGGER_TOKEN    # manual override + dashboard auth
```