613parts ops · technical spec

Invoice Auto-Pickup

Each night the AS400 dumps an invoice export to R2. At 4 AM EST, the worker reads the file, geocodes addresses, splits into Belleville and Kingston routes, and writes the result back to R2 — the same path the driver tablets already read. By 8 AM, drivers see today's stops.

Pipeline

1. AS400 nightly export · ~01:00–03:00 EST
   Drops the invoice file to R2 — JSON or CSV, exact path varies. The cron tries 6 candidate paths in order.
2. Cron fires · 04:00 EST
   handleInvoiceCron picks the first existing R2 path and auto-detects JSON vs CSV from the content.
3. Parse + alias-map columns · ~200ms
   buildInvoiceRoutes normalizes field names. AS400 column names drift between reports — the alias map absorbs that.
4. Geocode in parallel · 2–8 sec
   Google Maps API with a 30-day KV cache. Repeat customer addresses are free; only new ones hit the API.
5. Group by branch · <50ms
   An explicit branch column wins; otherwise the branch is inferred from the postal FSA via checkZone(). Pickups are skipped.
6. Write routes to R2 · ~100ms
   deliveries/{date}/belleville.json + kingston.json — the same shape the driver tablet reads.
7. Notify ops · ~200ms
   Slack message with stats: invoice count, branch breakdown, geocode failures, warnings. Never silent; a status message is posted even on a clean run.
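Step 4's caching can be sketched as follows. This is a minimal sketch, not the worker's actual code: KvLike mirrors only the subset of Cloudflare KV used here, geocodeCached and the geo: key scheme are illustrative names, and the real Google Maps call is injected as a function so the cache logic stands alone.

```typescript
// Sketch of the 30-day geocode cache (pipeline step 4).
// KvLike mirrors the slice of Cloudflare KV this needs; the actual
// Google Maps call is passed in as `geocode`. Names are illustrative.
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

type LatLng = { lat: number; lng: number };
type GeocodeFn = (address: string) => Promise<LatLng>;

const THIRTY_DAYS = 30 * 24 * 60 * 60; // KV expirationTtl is in seconds

async function geocodeCached(
  address: string,
  kv: KvLike,
  geocode: GeocodeFn,
): Promise<LatLng> {
  // Normalize so "123 Front St" and "123 FRONT ST " share one cache entry.
  const key = `geo:${address.trim().toLowerCase()}`;
  const hit = await kv.get(key);
  if (hit) return JSON.parse(hit); // repeat customer: no API call
  const result = await geocode(address); // new address: one API call
  await kv.put(key, JSON.stringify(result), { expirationTtl: THIRTY_DAYS });
  return result;
}
```

With the dependencies injected, the cache-hit behaviour is testable without KV or the Maps API.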

Schedule

0 7 * * * · 07:00 UTC (02:00 EST) · Catalog regeneration — DAI rims × AS400 stock × fitment → R2
0 9 * * * · 09:00 UTC (04:00 EST) · Invoice import — AS400 export → driver routes

The 04:00 invoice cron fires after the AS400 export's 1–3 AM window, giving it ~1 hour of buffer. Drivers see the route at 8 AM tablet wake or any manual refresh.
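Both crons land in the worker's single scheduled() handler; Cloudflare passes the matched expression on controller.cron, so a small pure router can decide which job to run. The function and task names below are illustrative, not the worker's actual identifiers:

```typescript
// Map the matched cron expression to a job. Keeping this pure makes
// the dispatch testable without a Workers runtime. Names illustrative.
type CronTask = 'catalog-regen' | 'invoice-import' | null;

function taskForCron(cron: string): CronTask {
  switch (cron) {
    case '0 7 * * *': return 'catalog-regen';  // 02:00 EST
    case '0 9 * * *': return 'invoice-import'; // 04:00 EST
    default: return null;                      // unknown schedule: no-op
  }
}

// Inside the worker, roughly:
// export default {
//   async scheduled(controller, env, ctx) {
//     if (taskForCron(controller.cron) === 'invoice-import') {
//       await handleInvoiceCron(env);
//     }
//   },
// };
```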

Candidate paths

We don't yet know the exact path AS400 writes to. The cron tries these six in order and uses the first that exists — once the actual path is known, move it to the top for a slight speedup.

#   R2 path                        Note
1   as400/invoices-{today}.json    Date-stamped, JSON — preferred
2   as400/invoices-{today}.csv     Date-stamped, CSV
3   as400/invoices-latest.json     Sentinel "latest" file, JSON
4   as400/invoices-latest.csv      Sentinel "latest" file, CSV
5   as400/invoices/{today}.json    Folder-style, JSON
6   as400/invoices/{today}.csv     Folder-style, CSV
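The probe order above can be generated from the run date. A minimal sketch, assuming {today} is the worker's run date formatted YYYY-MM-DD; the helper name is not the worker's real one:

```typescript
// The six candidate keys, in probe order. `today` is assumed to be
// the worker's run date as YYYY-MM-DD; helper name is illustrative.
function candidatePaths(today: string): string[] {
  return [
    `as400/invoices-${today}.json`, // date-stamped JSON (preferred)
    `as400/invoices-${today}.csv`,  // date-stamped CSV
    `as400/invoices-latest.json`,   // sentinel "latest", JSON
    `as400/invoices-latest.csv`,    // sentinel "latest", CSV
    `as400/invoices/${today}.json`, // folder-style, JSON
    `as400/invoices/${today}.csv`,  // folder-style, CSV
  ];
}

// Probe loop, roughly (R2Bucket.get returns null for a missing key):
// for (const key of candidatePaths(today)) {
//   const obj = await env.CATALOG_BUCKET.get(key);
//   if (obj) return { key, text: await obj.text() };
// }
```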

Format auto-detection

Both JSON and CSV are accepted — detected by content shape, not file extension. The first non-whitespace character decides:

function detectFormat(text: string): 'csv' | 'json' {
  const trimmed = text.trimStart();
  return (trimmed.startsWith('[') || trimmed.startsWith('{'))
    ? 'json'
    : 'csv';
}

Multiple JSON wrapper shapes are accepted as well; the exact shape of the AS400 dump is not yet pinned down, so the parser tolerates common envelopes around the invoice array.
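Since the real wrapper is unknown, a tolerant unwrap might look like this. The envelope keys checked (invoices, data, rows) are guesses at likely AS400 shapes, not a confirmed contract:

```typescript
// Tolerant unwrap for the JSON variant. The envelope keys here are
// assumptions about plausible AS400 dump shapes, not a known contract.
type Row = Record<string, unknown>;

function extractRows(parsed: unknown): Row[] {
  if (Array.isArray(parsed)) return parsed as Row[]; // bare array
  if (parsed && typeof parsed === 'object') {
    const obj = parsed as Record<string, unknown>;
    for (const key of ['invoices', 'data', 'rows']) {
      if (Array.isArray(obj[key])) return obj[key] as Row[]; // enveloped array
    }
  }
  throw new Error('Unrecognized JSON wrapper shape');
}
```

Throwing on an unrecognized shape feeds the "parse failure" path below rather than silently writing empty routes.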

Column aliases

AS400 reports rename columns between exports. Each canonical field in the parser maps to multiple accepted header names — case-insensitive — so an export switching from invoice_no to document_no doesn't break the cron.

Canonical field    Accepted aliases
invoice_number     invoice · invoice_no · invoice# · inv_no · document_no
customer_name      customer · name · bill_to · ship_to_name
customer_phone     phone · tel · phone_number
ship_address       address · ship_to · delivery_address · street
branch             location · warehouse · wh · origin
total_amount       total · amount · invoice_total · grand_total
line_items         items · parts · description · lines
delivery_window    window · delivery_time · time_slot
notes              instructions · special_instructions · comments
fulfillment        ship_method · method · delivery_type
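The lookup direction matters: raw headers resolve to canonical fields, case-insensitively. A sketch of that index, showing only a slice of the alias table above (the structure is illustrative; the parser's internal shape may differ):

```typescript
// Case-insensitive alias lookup: each raw header resolves to at most
// one canonical field. Only a slice of the full alias map is shown.
const ALIASES: Record<string, string[]> = {
  invoice_number: ['invoice', 'invoice_no', 'invoice#', 'inv_no', 'document_no'],
  customer_name: ['customer', 'name', 'bill_to', 'ship_to_name'],
  branch: ['location', 'warehouse', 'wh', 'origin'],
};

// Flatten once into a lowercase-header -> canonical-field index.
const HEADER_INDEX = new Map<string, string>();
for (const [canonical, aliases] of Object.entries(ALIASES)) {
  HEADER_INDEX.set(canonical, canonical); // canonical names also match
  for (const a of aliases) HEADER_INDEX.set(a.toLowerCase(), canonical);
}

function canonicalField(rawHeader: string): string | undefined {
  return HEADER_INDEX.get(rawHeader.trim().toLowerCase());
}
```

An export that renames invoice_no to Document_No still resolves to invoice_number without a code change.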

Branch detection

1. Explicit column wins. If the row has branch=belleville or branch=kingston, that's the answer.
2. Postal FSA fallback. Otherwise, extract the postal code from the geocoded address (or the raw input) and call checkZone(). K8N/K8P/K8R/K8V → Belleville; K7K/K7L/K7M/K7P/K7R → Kingston.
3. Default + warning. If still ambiguous, default to Belleville and emit a warning. Dispatch sees the flagged invoice in the morning report and can reassign manually via the dashboard.

Pickup invoices are skipped entirely: if branch=pickup, fulfillment=pickup, or the address contains "PICKUP", the invoice is filtered out — it stays at the counter, not on a truck.
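The FSA fallback can be sketched directly from the rules above. checkZone's real signature may differ; this mirrors the behaviour described, with the null case left to the caller's default-plus-warning step:

```typescript
// FSA -> branch mapping per the rules above. checkZone's actual
// signature in the worker may differ; this mirrors the behaviour.
type Branch = 'belleville' | 'kingston' | null;

const BELLEVILLE_FSAS = new Set(['K8N', 'K8P', 'K8R', 'K8V']);
const KINGSTON_FSAS = new Set(['K7K', 'K7L', 'K7M', 'K7P', 'K7R']);

function checkZone(postal: string): Branch {
  // A Canadian FSA is the first three characters of the postal code.
  const fsa = postal.trim().toUpperCase().slice(0, 3);
  if (BELLEVILLE_FSAS.has(fsa)) return 'belleville';
  if (KINGSTON_FSAS.has(fsa)) return 'kingston';
  return null; // ambiguous: caller defaults to Belleville + warning
}
```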

Failure modes

Clean run · level=info
All 6 R2 candidates checked, file found, parsed, geocoded, written. Slack gets a green status with the stats.
Soft warnings · level=warning
File found and parsed, but some invoices had geocoding failures or unknown branches. Routes still written; flagged invoices listed in the Slack message for dispatch review.
No file found · level=warning
None of the 6 candidate paths exist. Slack alert lists every path tried. Drivers see yesterday's route until the file appears or Preet uploads manually via the dispatch dashboard's "Import" button.
Parse failure · level=error
File found but content is malformed. Slack gets the error message plus the first 200 chars of the file for diagnosis. Routes are not overwritten; yesterday's data remains.
R2 write failure · level=error
Routes built fine but the write to R2 failed. Rare; usually a Cloudflare incident. Retries on the next cron, or trigger manually via POST /cron/run.
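The notification itself can be a pure payload builder, which keeps the level logic testable. Slack incoming webhooks accept a simple {"text": ...} JSON body; the RunStats shape and function name below are illustrative assumptions, not the worker's actual types:

```typescript
// Pure builder for the ops notification. Slack incoming webhooks
// accept { text: string }; the stats shape here is an assumption.
interface RunStats {
  invoices: number;
  belleville: number;
  kingston: number;
  geocodeFailures: string[]; // invoice numbers that failed to geocode
  warnings: string[];        // e.g. unknown-branch defaults
}

function buildSlackSummary(stats: RunStats): { text: string } {
  const level =
    stats.geocodeFailures.length || stats.warnings.length ? 'warning' : 'info';
  const lines = [
    `Invoice import (${level}): ${stats.invoices} invoices`,
    `Belleville ${stats.belleville} · Kingston ${stats.kingston}`,
  ];
  if (stats.geocodeFailures.length)
    lines.push(`Geocode failures: ${stats.geocodeFailures.join(', ')}`);
  lines.push(...stats.warnings);
  return { text: lines.join('\n') };
}
```

The error-level modes (parse failure, R2 write failure) would take a different path with the raw error attached; only the info/warning summary is sketched here.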

Manual override

The dispatch dashboard's Import Invoices button stays as the safety net whenever automation fails or someone wants to test a one-off. It accepts CSV via multipart upload and writes to the same R2 paths.

# From the dashboard — admin role only
POST /api/v1/dispatch/import-invoices
authorization: Bearer <admin-jwt>
content-type: multipart/form-data

# Or directly via curl with the cron token
curl -X POST https://bot.613parts.ca/cron/import-invoices \
  -H "authorization: Bearer $CRON_TRIGGER_TOKEN" \
  -H 'content-type: text/csv' \
  --data-binary @invoices.csv

Testing

Vitest suite at worker/test/invoice-import.test.ts covers auth, validation, CSV parsing, branch grouping, R2 writes, and JSON/CSV parity, using two fixtures (one JSON, one CSV) that carry the same invoices.

The parity test confirms feeding either fixture produces the same routes — same invoice numbers in same branches, same per-stop totals, same item counts. Run via npm test invoice-import from the worker directory.

Configuration

# wrangler.toml
[triggers]
crons = ["0 7 * * *", "0 9 * * *"]

# Required secrets
wrangler secret put GOOGLE_MAPS_API_KEY     # geocoding, ~1500 calls/mo
wrangler secret put SLACK_WEBHOOK_URL       # ops notifications (optional)
wrangler secret put CRON_TRIGGER_TOKEN      # manual override + dashboard auth

# R2 bucket — already provisioned
[[r2_buckets]]
binding = "CATALOG_BUCKET"
bucket_name = "613parts-catalog"