River Pulse, Coast to Headwaters
Blend streamflow, tide, and rainfall to explain hydrologic surges and catchment response.
Build goals
Blend streamflow, tide, and rainfall data to explain hydrologic surges and catchment response, from tidal coastal reaches up to headwater gauges.
Stack
- Frontend: React 18, Mapbox GL or deck.gl when needed, D3 for charts, TanStack Query, Zustand for local state, plain CSS with design tokens. No runtime CSS frameworks.
- API: Python 3.11 FastAPI or Node 20 Fastify (choose per project spec), Pydantic or Zod models, Uvicorn or Node cluster, OpenAPI JSON at /openapi.json.
- Storage: Redis 7 for hot cache, Postgres 15 with PostGIS for spatial data and the Timescale extension for time series where needed, S3-compatible bucket for tiles and artifacts.
- Ingest: Async fetchers with ETag or Last-Modified conditional requests, paging, retry with backoff and jitter, circuit breakers, structured logs.
- Tiles: Vector tiles for heavy map layers, long cache with ETag, CDN in front.
- Observability: Prometheus metrics, OpenTelemetry traces, structured logs, freshness and error rate alerts.
- Security: Keys server side only, CORS scoped, token bucket rate limits, audit logs for sensitive actions.
Data sources
| Source | Endpoint | Cadence | Access | Auth | Notes |
|---|---|---|---|---|---|
| USGS NWIS | waterservices.usgs.gov/nwis/iv | near real time | REST | None | Discharge, gage height |
| NOAA CO-OPS | api.tidesandcurrents.noaa.gov | real time | REST | None | Water levels and currents |
| MET Norway | api.met.no/weatherapi/locationforecast/2.0 | frequent | REST | None | Forecasts; requires an identifying User-Agent |
| Open-Meteo | api.open-meteo.com/v1/forecast | frequent | REST | None | Unified forecast access |
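As an illustration of the first row, a request to the NWIS instantaneous-values service for the last 72 hours of discharge and gage height might be assembled like this. The parameter codes (00060 = discharge in cfs, 00065 = gage height) are real USGS codes; the site number in the usage example is just a well-known gauge used as a placeholder:

```python
from urllib.parse import urlencode

NWIS_IV = "https://waterservices.usgs.gov/nwis/iv/"

def nwis_iv_url(site: str, params: str = "00060,00065", period: str = "P3D") -> str:
    """Build an NWIS iv request URL for the given site over an ISO-8601 period."""
    query = urlencode({
        "format": "json",
        "sites": site,
        "parameterCd": params,  # 00060 discharge, 00065 gage height
        "period": period,       # P3D ~ a 72 h window
    })
    return f"{NWIS_IV}?{query}"
```

For example, `nwis_iv_url("01646500")` targets the Potomac River gauge at Little Falls.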
Architecture
Python FastAPI, hydrology adapters, datum conversions, TimescaleDB for series, hex density tiles, forecast cache.
Models
Models are expressed in DB tables and mirrored as API schemas. All timestamps are UTC. All coordinates are WGS84. Stable IDs, soft deletes by valid_to when needed.
- gauge(id, type, datum, lat, lon)
- series(gauge_id, ts, value, param)
- event(id, type, ts, score, lag, notes)
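In the API these tables would surface as Pydantic (or Zod) schemas per the stack notes; a dependency-free sketch of the same shapes, with field meanings as comments:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Gauge:
    id: str       # stable ID
    type: str     # e.g. "river" or "tide"
    datum: str    # vertical datum, e.g. "NAVD88" or "MLLW"
    lat: float    # WGS84
    lon: float    # WGS84

@dataclass(frozen=True)
class SeriesPoint:
    gauge_id: str
    ts: datetime  # always UTC
    value: float
    param: str    # parameter code, e.g. "00060"
```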
Algorithms
- Lag correlation between discharge and tide for tidal rivers
- Surge detection by threshold plus derivative with hysteresis
- Datum conversions for tides to common reference
API surface
- GET /gauges?bbox=&type=
- GET /series?gauge_id=&param=&since=&until=&bin=
- GET /events?gauge_id=&since=&until=
- GET /tiles/hex/{z}/{x}/{y}.pbf
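The `bin` query parameter implies server-side downsampling. A minimal sketch of mean-binning, assuming epoch-second timestamps; in production this would be a Timescale `time_bucket` query rather than Python:

```python
from collections import defaultdict

def bin_series(points: list[tuple[int, float]],
               bin_seconds: int) -> list[tuple[int, float]]:
    """Average (epoch_ts, value) points into fixed-width bins keyed by bin start."""
    buckets: dict[int, list[float]] = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % bin_seconds].append(value)
    return sorted((start, sum(vs) / len(vs)) for start, vs in buckets.items())
```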
UI and visualization
- Map of gauges, hex density for catchments
- Dual axis charts linked with crosshair
- Correlation lollipops and event annotations
- Print friendly hydrographs
Performance budgets
- Chart render p95 under 400 ms for a 1-year span.
- Freshness under 15 minutes for live gauges.
- FCP under 2 s on a mid-tier laptop over broadband.
- API p95 under 300 ms for common list endpoints, p99 under 800 ms.
- Map render p95 frame time under 20 ms for target layers and volumes (documented per tool).
- Frontend app code under 180 KB gzip, excluding the map library.
- API memory under 200 MB under normal load.
Accessibility
- WCAG 2.2 AA, automated axe checks clean, no critical issues.
- Keyboard navigable controls, focus rings visible, ARIA roles correct.
- Color contrast at or above 4.5 to 1, colorblind safe palettes.
- Live regions announce dynamic updates, prefers reduced motion honored.
Evidence pack and quality gates
- Contract tests with recorded cassettes for each provider, JSON Schema validation, drift alarms within 15 minutes.
- Load tests with k6, thresholds enforced in CI for p95 and p99.
- Lighthouse performance and a11y reports stored as CI artifacts.
- Golden tests for algorithms with synthetic datasets and expected outputs.
- Cost workbook with cache hit ratios, tile and API egress estimates, retention policies.
CI configuration
name: ci
on: [push, pull_request]
jobs:
  api:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgis/postgis:15-3.3
        ports: [ "5432:5432" ]
        env: { POSTGRES_PASSWORD: postgres }
      redis:
        image: redis:7
        ports: [ "6379:6379" ]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: "20" }
      - uses: actions/setup-python@v5
        with: { python-version: "3.11" }
      - run: pip install -e "packages/api[dev]"
      - run: psql postgresql://postgres:postgres@localhost:5432/postgres -f packages/api/src/db/schema.sql
      - run: pytest -q packages/api/src/tests
      - run: cd packages/web && npm ci && npm run build && npm test --silent
Risks and mitigations
- Tidal datum mismatches: cover with datum conversion test suites.
- Mixed station time zones: normalize to UTC in storage, localize on display.
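A sketch of the time-zone mitigation using stdlib zoneinfo; the station's IANA zone name would come from provider metadata:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_utc(naive_local: datetime, station_tz: str) -> datetime:
    """Interpret a station's naive local timestamp in its zone; store as UTC."""
    return naive_local.replace(tzinfo=ZoneInfo(station_tz)).astimezone(timezone.utc)

def display_local(utc_ts: datetime, viewer_tz: str) -> datetime:
    """Localize a stored UTC timestamp for display."""
    return utc_ts.astimezone(ZoneInfo(viewer_tz))
```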
Acceptance checklist
- CI green on main, all quality gates met.
- Freshness SLOs met for hot regions or feeds.
- Performance budgets met or better.
- A11y audits pass with zero critical findings.
- Provenance and license panels render correct metadata.
- Runbook covers stale feed handling, provider errors, and key rotation.
Implementation sequence
- Adapters and schemas, cassette tests
- Datum utilities and correlation jobs
- Series endpoints and tiles
- Charts, events, and annotations
- Evidence pack and a11y checks
Runbook
make up # docker compose up db, redis, api, web
make ingest # start ingest workers for this tool
make tiles # build vector tiles if applicable
make test # unit + contract + golden
make e2e # browser tests