Let’s get one thing out of the way first: nobody — not tech optimists, not self-driving skeptics, not even the most die-hard gearhead who still insists that automatic transmissions are for the weak — wants to see a child get hit by a car. Human-driven or robot-driven, that part is awful and not up for debate.
But Waymo has been walking a tightrope of public trust for years now, and this incident near Grant Elementary School in Santa Monica is the kind of event that makes that walk even more precarious.
The Unfortunate Waymo Incident
According to a preliminary report from the National Transportation Safety Board released on March 3, the crash occurred around 8:30 a.m. on January 23 — which is, as anyone who has ever lived near a school knows, peak chaos hour. A 2024 Jaguar I-Pace running Waymo’s fifth-generation automated driving system was traveling southbound on 24th Street near Pearl Street when a 9-year-old student stepped out of a vehicle farther back in a line of stopped cars and walked into the roadway between vehicles.
The Waymo vehicle, traveling at about 17 mph, braked before impact but still struck the child near the front-right headlight. The girl fell, but was able to get up on her own and walk to the curb. She reported minor injuries and didn’t need to go to the hospital, which, all things considered, is the best possible outcome from a very bad situation.
Investigators reconstructed the sequence of events using footage from a nearby school security camera and from the vehicle’s own onboard cameras, according to a local news station. The crash happened in a 25-mph school zone, roughly 40 feet past the end of an adjacent 15-mph school zone segment. Waymo’s own account noted that the child had emerged from behind a double-parked SUV, leaving very little time for any system — human or automated — to react.
The Robot Called 911 (And That’s Actually Kind of Remarkable)
Here’s a detail worth pausing on: after the collision, a Waymo remote assistance operator — a real, human person sitting in Novi, Michigan — contacted 911 and later directed the vehicle to pull over north of the scene, where it stayed put until Santa Monica police arrived.
That’s not nothing. There are plenty of human drivers who have struck pedestrians and kept going, and even more bystanders who just stand there, slack-jawed and recording.
Waymo’s system didn’t flee. It didn’t panic. It called for help, moved safely out of the flow of traffic, and waited. Whether that rises to the level of “good” behavior in this situation is a matter of perspective, but it’s worth noting.
The Investigations Piling Up
The NTSB wasn’t the only federal body that came knocking. The National Highway Traffic Safety Administration also opened a preliminary evaluation on January 28, specifically probing whether the vehicle exercised appropriate caution given that it was operating near an elementary school during morning drop-off hours — a time when small humans in backpacks are essentially expected to materialize out of nowhere.
That’s the crux of the regulatory concern here. Waymo’s system is classified as Level 4 automation — meaning it can, in theory, handle all driving tasks without any human in the vehicle, so long as it stays within its defined operating conditions. But “can handle all driving tasks” and “should operate at normal speeds next to an elementary school at 8:30 on a Tuesday morning” may not be the same thing.
Both investigations are ongoing, and the NTSB has been careful to label everything released so far as preliminary. Things may look different once the full picture emerges.
The Car Community Has Feelings About This (Shocking, We Know)
If you spend any time in automotive enthusiast circles — the forums, the comment sections, the YouTube channels where people argue passionately about whether a manual gearbox makes you morally superior — you’ll know that Waymo and its robotaxi peers have never exactly been welcomed with open arms.
The critiques range from reasonable (“should we really be trusting billion-dollar software with children’s lives?”) to entertainingly unhinged (“this is how Skynet starts”). But incidents like this January crash have a way of collapsing that spectrum, especially as they keep piling up. For a moment, the guy who names his cars and the transportation policy wonk end up agreeing on something: this is a problem.
And honestly? That’s fair. The vision Waymo and its competitors have been selling is one where autonomous vehicles are meaningfully safer than human drivers. That’s a high bar to clear, and every crash — especially one involving a child, in a school zone, during drop-off — is a public data point against the pitch.
What This Means Going Forward
Waymo has logged millions of driverless miles, and its safety record, in aggregate, still compares favorably to the human average. That context matters. But context doesn’t erase a January morning in Santa Monica.
The more pressing question is whether Waymo’s system was calibrated appropriately for a school zone environment. Operating at 17 mph in a 25-mph zone is technically legal, but does legal mean safe when you’re surrounded by children who haven’t yet learned that “look both ways” applies to robots, too?
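For a rough sense of what a couple of miles per hour actually buys, here’s a back-of-the-envelope stopping-distance sketch. The reaction time and deceleration figures below are our own illustrative assumptions (nothing here comes from the NTSB report, and real-world values vary with the vehicle, the road, and the system), but the quadratic braking term is why small speed differences matter more than they look:

```python
# Back-of-the-envelope stopping-distance comparison.
# All figures are illustrative assumptions, not values from the
# NTSB report: reaction time and deceleration vary widely.

MPH_TO_MS = 0.44704          # miles per hour -> meters per second
REACTION_TIME_S = 1.0        # assumed perception-reaction time
DECELERATION_MS2 = 7.0       # assumed hard-braking deceleration (~0.7 g)

def stopping_distance_m(speed_mph: float) -> float:
    """Reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MS
    reaction = v * REACTION_TIME_S          # distance covered before braking
    braking = v ** 2 / (2 * DECELERATION_MS2)   # v^2 / (2a)
    return reaction + braking

for mph in (15, 17, 25):
    meters = stopping_distance_m(mph)
    print(f"{mph} mph -> ~{meters:.1f} m (~{meters * 3.28:.0f} ft) to stop")
```

By that crude math, the gap between 15 and 17 mph is roughly six feet of stopping distance, which is not nothing when a kid steps out from between parked cars. Whether Waymo’s calibration accounted for that kind of margin is the open question.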
That’s exactly what investigators are trying to determine. And until they do, the skeptics — even the ones who cling to their manual transmissions like a life raft — will have something worth paying attention to.