Cyclists die.
Cities don't hear about it.

Semicolon is a native iOS hazard dashcam for bikes, scooters, and cars. Multicam capture, LiDAR-aware perception, on-device voice — the clip, the GPS, and the heatmap the city actually needs.

// pov.ascii — front camera, demo-ride-1
[!] approaching hazard / severity 56, over threshold

/ pipeline

One pipeline.
Three sensor classes.

Bike, scooter, or car — the same perception loop, voice layer, and danger-segment store. Detection in the moment, evidence after the fact, aggregated insight for the people who set the speed limits.

01

Bike

Rider-mounted detection of vehicles, pedestrians, door zones, blocked bike lanes.

02

Scooter

Same pipeline at scooter speeds and on shared micro-mobility paths.

03

Car

Dashcam mode that warns drivers about cyclists, pedestrians, and unsafe passing.

/ on the road

Recorded mid-ride.

Two clips straight from an iPhone. The left shows the app in motion; the right shows the depth-aware feed with the multicam preview and live YOLO overlay.

app preview
on the road

/ product

Detect. Alert.
Archive. Aggregate.

Four small pieces that add up to one calm, capable safety co-pilot for everyone sharing the road.

01 / detect

Hazards spotted before you can react.

AVCaptureMultiCamSession runs front + rear simultaneously, frames stream to a YOLOv8 sidecar, and ARKit scene-depth fuses LiDAR distance into the risk score. Close passes, doorings, blocked bike lanes, pedestrian conflicts — flagged in real time.
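A minimal sketch of that capture setup, assuming the standard multicam wiring pattern (explicit input-port-to-output connections); the delegate body and class name are illustrative, not the shipped implementation:

```swift
import AVFoundation

// Sketch: front + rear capture in one AVCaptureMultiCamSession.
// Frames arriving in the delegate would be forwarded to the YOLOv8 sidecar.
final class RideCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureMultiCamSession()
    private let queue = DispatchQueue(label: "ride.capture")

    func start() {
        guard AVCaptureMultiCamSession.isMultiCamSupported else { return }
        session.beginConfiguration()
        for position: AVCaptureDevice.Position in [.front, .back] {
            guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video,
                                                       position: position),
                  let input = try? AVCaptureDeviceInput(device: device),
                  session.canAddInput(input) else { continue }
            session.addInputWithNoConnections(input)

            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: queue)
            guard session.canAddOutput(output) else { continue }
            session.addOutputWithNoConnections(output)

            // Multicam requires wiring each input port to its output explicitly.
            if let port = input.ports(for: .video,
                                      sourceDeviceType: device.deviceType,
                                      sourceDevicePosition: position).first {
                let connection = AVCaptureConnection(inputPorts: [port], output: output)
                if session.canAddConnection(connection) { session.addConnection(connection) }
            }
        }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Per-frame hook: hand the buffer to the perception pipeline here.
    }
}
```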

02 / alert

A voice you can hear over wind.

AVSpeechSynthesizer speaks the tuned warning on-device the moment severity crosses threshold — no cloud round-trip, no audio latency. The rolling 60-second clip is locked, uploaded, and added to the gallery.
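The alert path can be sketched in a few lines; the threshold value and the spoken phrasing here are assumptions for illustration:

```swift
import AVFoundation

// Sketch: speak a hazard warning fully on-device once severity crosses
// the threshold. Threshold and wording are illustrative.
let synthesizer = AVSpeechSynthesizer()
let severityThreshold = 56

func alertIfNeeded(severity: Int, hazard: String) {
    guard severity >= severityThreshold else { return }
    let utterance = AVSpeechUtterance(string: "Caution. \(hazard) ahead.")
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)   // no cloud round-trip
}
```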

03 / archive

Evidence with GPS, heading, and speed attached.

Every saved clip lands in MongoDB Atlas with a 2dsphere geo index, a thumbnail, and full telemetry — speed, heading, severity, ride mode. Replay the ride in 3D from the records console; export the corridor report.
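The shape of a saved event might look like the following; the field names are assumptions, not the exact Atlas schema. The `location` field follows GeoJSON point order (longitude first) so a `2dsphere` index can serve proximity queries:

```swift
import Foundation

// Illustrative document shape for one archived hazard event.
struct HazardEvent: Codable {
    struct GeoPoint: Codable {
        var type = "Point"
        let coordinates: [Double]   // [longitude, latitude] — GeoJSON order
    }
    let location: GeoPoint
    let speedMetersPerSecond: Double
    let headingDegrees: Double
    let severity: Int
    let rideMode: String            // "bike" | "scooter" | "car"
    let clipURL: URL
    let recordedAt: Date
}
```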

04 / aggregate

Repeated near-misses become a danger zone.

When the same corridor lights up across riders, events aggregate into geofenced danger segments — a heatmap city traffic engineers can actually act on, and a per-corridor incident report.
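A toy version of that aggregation, bucketing events onto a coarse grid and promoting any cell with repeated near-misses; the cell size and event threshold are illustrative, not the production geofencing logic:

```swift
import Foundation

struct Hit { let lat: Double; let lon: Double; let severity: Int }

// Sketch: group hits into ~100 m grid cells, keep cells with repeats.
func dangerSegments(from hits: [Hit],
                    cellDegrees: Double = 0.001,
                    minEvents: Int = 3) -> [(cell: String, count: Int, peakSeverity: Int)] {
    var cells: [String: (count: Int, peak: Int)] = [:]
    for h in hits {
        let key = "\(Int(h.lat / cellDegrees)):\(Int(h.lon / cellDegrees))"
        let prev = cells[key] ?? (0, 0)
        cells[key] = (prev.count + 1, max(prev.peak, h.severity))
    }
    return cells
        .filter { $0.value.count >= minEvents }
        .map { ($0.key, $0.value.count, $0.value.peak) }
}
```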

/ stack

Native iOS, wired to one open server.

01 / MongoDB Atlas
Hazard events + clips, 2dsphere geo index

02 / YOLOv8 sidecar
Real-time perception over Wi-Fi

03 / ARKit / LiDAR
Scene-depth fusion in low light

04 / AVFoundation
AVCaptureMultiCamSession dashcam

05 / Speech frameworks
SFSpeechRecognizer + AVSpeechSynthesizer, on-device

06 / MapKit
Apple Maps overlay with imperial turn-by-turn

/ flow

Mount. Ride. Replay.

01

Mount and start a ride

Bar-mount on a bike or scooter; vent-mount in a car. Pick a ride mode and the multicam perception loop begins.

02

Hazards trigger themselves

YOLO scores each frame, LiDAR boosts the risk in tight passes, and AVSpeechSynthesizer voices the alert when severity crosses threshold. Or just say "save clip."
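The "save clip" voice trigger can be sketched with on-device speech recognition; the callback name `lockRollingClip` is hypothetical:

```swift
import Speech
import AVFoundation

// Sketch: listen for "save clip" entirely on-device and fire a callback.
final class VoiceTrigger {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let engine = AVAudioEngine()

    func start(onSaveClip lockRollingClip: @escaping () -> Void) throws {
        request.requiresOnDeviceRecognition = true   // no cloud round-trip
        let input = engine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { [weak self] buffer, _ in
            self?.request.append(buffer)
        }
        engine.prepare()
        try engine.start()
        recognizer?.recognitionTask(with: request) { result, _ in
            if result?.bestTranscription.formattedString
                .lowercased().contains("save clip") == true {
                lockRollingClip()
            }
        }
    }
}
```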

03

Replay, report, repeat

Open the records console for the 3D ride replay, the danger-zone heatmap, and the corridor incident report.