Comment by aanet
15 hours ago
This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAP programs (e.g., Euro NCAP, Japan NCAP) include in their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2], has been textbook compliance. (I'm not defending their performance... just their response to the collision.) This test/protocol is hard for any driver, human drivers included, let alone ADAS/L3/L4 vehicles, for various reasons: pedestrian occlusion, late pedestrian detection, late braking, slick roads, insufficient braking, etc.
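To see why this protocol is so unforgiving, here's a back-of-envelope kinematics sketch. Every number in it (pedestrian speed, lateral gap, latency, deceleration) is an illustrative assumption on my part, not data from this incident:

```python
# Crude time budget for a suddenly-revealed pedestrian.
# All parameters are illustrative assumptions, NOT incident data.
MPH_TO_MS = 0.44704

v_car = 17 * MPH_TO_MS   # ego speed, ~7.6 m/s
v_ped = 2.0              # pedestrian moving laterally, m/s (assumed)
lateral_gap = 1.5        # occluding SUV's edge to ego's path, m (assumed)
latency = 0.3            # detection-to-brake latency, s (assumed)
decel = 8.0              # hard braking, ~0.8 g, m/s^2 (assumed)

t_budget = lateral_gap / v_ped                 # time until ped reaches ego's path
t_brake = max(0.0, t_budget - latency)         # braking window actually available
v_contact = max(0.0, v_car - decel * t_brake)  # speed when ped arrives in path

print(f"time budget: {t_budget:.2f} s, braking window: {t_brake:.2f} s")
print(f"residual speed: {v_contact / MPH_TO_MS:.1f} mph")
```

With these made-up but plausible numbers, the vehicle has well under a second to react and still arrives at the pedestrian's line carrying roughly 9 mph. That is the core difficulty, regardless of who or what is driving.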
Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be a big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
Waymo’s performance, once the pedestrian was revealed, sounds pretty good. But is 17 mph a safe speed in an active school drop-off area? I admit that I don’t think I ever personally pay attention to the speedometer in such a place, but 17 mph seems excessive even for an ordinary parking lot.
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
> But is 17 mph a safe speed in an active school drop-off area?
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless of that, 17 mph per se is not a very fast speed (my gut check: turning through intersections at >10-11 mph feels fast, but going straight at 15-20 mph doesn't; YMMV). More generally, though, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids in the area? If so, how many, and where? How and where were they moving? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue whether 17 mph was indeed the "right" decision. My gut feel is that Waymo will argue it was (though they will likely make the driving policy more cautious, especially in the presence of VRUs, and child VRUs in particular).
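For a rough sense of what speed buys (or costs) you here: total stopping distance is reaction distance plus braking distance, and the braking term grows with the square of speed. A quick sketch, where the reaction time and deceleration are assumed round numbers, not measurements:

```python
# Rough stopping distances at school-zone speeds.
# Reaction time and deceleration are assumed round numbers, not measurements.
MPH_TO_MS = 0.44704
T_REACT = 0.5   # s, perception-to-brake latency (assumed)
DECEL = 7.0     # m/s^2, ~0.7 g on dry pavement (assumed)

for mph in (15, 17, 25):
    v = mph * MPH_TO_MS
    d = v * T_REACT + v**2 / (2 * DECEL)  # reaction + braking distance
    print(f"{mph:2d} mph -> {d:4.1f} m to a full stop")
```

Roughly 6.5 m at 15 mph, 8 m at 17 mph, and 14.5 m at the posted 25 mph: the difference between 15 and 17 is small, while driving at the limit nearly doubles the stopping distance. That's the quantitative case for slowing well below the limit around child VRUs.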
> Technically, in CA, the speed limit in school zones is 25 mph
Legally, a speed limit is a 'limit' on speed, not a suggested or safe speed. So it's never a valid legal argument that you were driving under the limit; the standard is that you slow down or give more room in places like a school drop-off while kids are being dropped off or picked up.
In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
> In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1], but notably, three papers compare Waymo's rider-only (i.e., no safety driver) performance with human drivers, at 7.1 million miles [2], 25 million miles [3], and 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards, as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read through all their papers, but some notable ones discuss the difficulty of comparing AVs vs. human drivers [5], and various research on characterizing uncertainty / risk of collision, comparing AVs to a non-impaired, eyes-on human driver [6].
As one may expect, at least one of the challenges is that collisions are almost always very _lagging_ indicators of safety (i.e., the collision already happened: lost property, lost limbs, lost lives, etc.).
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better-than-human driving behavior overall, but they are showing that their AVs are better than humans in certain high-risk (potential) collision types.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
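To make the rate-comparison point concrete: at tens of millions of miles you can already say something statistically meaningful about the more frequent crash classes, even though they're lagging indicators. A toy sketch, where the exposure, counts, and human benchmark rate are hypothetical, NOT Waymo's numbers:

```python
import math

# Toy rare-event rate comparison under a Poisson model.
# Exposure, counts, and benchmark rate are hypothetical, NOT Waymo data.
miles = 56e6            # exposure, miles (hypothetical)
observed = 20           # injury-level crashes observed (hypothetical)
human_rate = 1.0e-6     # benchmark: 1 crash per million miles (hypothetical)

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), summed directly."""
    term, total = math.exp(-mu), math.exp(-mu)
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

expected = human_rate * miles  # crashes a human fleet would log at same exposure
p = poisson_cdf(observed, expected)
print(f"expected at human rate: {expected:.0f}, observed: {observed}, p = {p:.1e}")
```

With those made-up numbers the difference is statistically overwhelming; but the rarer the event class (fatalities are on the order of 1 per ~100 million human-driven miles), the more exposure you need before the comparison means anything. That asymmetry is a big part of why the question is hard to answer cleanly.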
---
[1] https://waymo.com/safety/research
[2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[3] https://waymo.com/research/do-autonomous-vehicles-outperform...
[4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[5] https://waymo.com/research/comparative-safety-performance-of...
[6] https://waymo.com/blog/2022/09/benchmarking-av-safety/
[edit: reference]
Still, how many people do they kill per mile compared to humans?
> Waymo will still have to accept some responsibility
Why? That would only be true if they weren't supposed to be on the road in the first place, which is not the case.
Think of it like dog ownership: if my dog hurts someone, that's on me. Property that causes harm is the owner's responsibility.
If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.
In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for whatever they deem the involvement to have been, which may not match what actually happened.
> In a technical sense, maybe, but it's all going to be about optics.
Indeed, it is, and that is exactly why Waymo will have to accept some responsibility. I can bet that internally Waymo's PR and Legal teams are working overtime to coordinate the details with NHTSA. We, the general public, may or may not know the details at all, if ever. However, Waymo's technical teams (Safety, etc) will also be working overtime to figure out what they could have done better.
As I mentioned, this is a standard test, and Waymo likely has 1000s of variations of it in their simulation platforms. They will sweep across all plausible parameters to tighten this test, including the MER (minimum expected response from the AV); perhaps raise the bar on the MER (e.g., brake at max deceleration in some cases, trading off comfort metrics there); and calculate the effects on local traffic (e.g., "did we endanger the vehicles behind us by braking too hard? If so, by how much?"). All of these are expected actions that the general public will never see (except, perhaps, via some technical papers).
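For a flavor of what that kind of sweep looks like, here's a toy version. The parameter grids, the kinematic model, and the MER pass criterion are all illustrative placeholders of mine, not Waymo's actual tooling:

```python
# Toy scenario sweep for a suddenly-revealed-pedestrian test.
# Parameter grids, model, and the MER criterion are illustrative only.
import itertools

speeds_mph  = [10, 13, 15, 17, 20, 25]   # ego speed at reveal
gaps_m      = [0.5, 1.0, 1.5, 2.0]       # occlusion-to-path lateral gap
ped_speeds  = [1.0, 1.5, 2.0, 3.0]       # pedestrian speed, m/s
latencies_s = [0.2, 0.3, 0.5]            # detection-to-brake latency

DECEL = 8.0                               # commanded hard brake, m/s^2
MER_IMPACT_MPH = 10.0                     # max allowed contact speed (made up)

def contact_speed_mph(v_mph, gap, v_ped, latency):
    """Crude closed-form model: brake from reveal until ped reaches path."""
    v = v_mph * 0.44704
    t_brake = max(0.0, gap / v_ped - latency)
    return max(0.0, v - DECEL * t_brake) / 0.44704

failures = [
    combo
    for combo in itertools.product(speeds_mph, gaps_m, ped_speeds, latencies_s)
    if contact_speed_mph(*combo) > MER_IMPACT_MPH
]
print(f"{len(failures)} of {6 * 4 * 4 * 3} scenario variants exceed the MER")
```

The real platforms sweep vastly richer scenario and sensor models, of course, but the basic shape (grid over scenario parameters, pass/fail check of each variant against an MER) is the idea described above.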
Regardless, the PR effects of this collision do not look good, especially as Waymo is expanding its service to other cities (Miami just announced; London by EOY 2026). This PR coverage has the potential to do more damage to the company than the actual physical damage to the poor traumatized kid and his family. THAT is the responsibility only the company will pay for.
To be sure, my intuition tells me this is not the last such collision. Expect to see some more, by other companies, as they commercialize their own services. It's a matter of statistics.
Bringing a vehicle onto the public roads is a privilege not a right. Any harm to pedestrians that results is your responsibility, not anyone else's.
The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stronger.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. That's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
At least in the interim, wouldn't doing what you propose cause more deaths, if robot drivers are less harmful than humans but the rules demand even more than that? (I can see the point in making rules stronger as better options become available, but by that logic, shouldn't we already be moving toward requiring robots and outlawing human drivers if that's safer?)
In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
> I would say that Waymo's response, per their blog post [2] has been textbook compliance.
Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.
> In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
Most likely, yes, the NHTSA investigation will be the credible source of info for this case. HOWEVER, Waymo will likely fight tooth and nail to keep it from becoming public. They will likely cite "proprietary algorithms / design", etc., to protect it from being released publicly. So, net-net, I dunno... We'll have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made their investigation data public, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
[1] http://www.koopman.us/
[2] https://www.gmu.edu/profiles/cummings
There is already widespread discussion on LinkedIn about this thread... but usefully, here [1] is NHTSA's Office of Defects Investigation report. Nothing much new there, tbh.
As I suspected, legal scholars are already calling for "voluntary disclosure" from Waymo re: its annotated videos of the collision [2]. FWIW, my skepticism about Waymo actually releasing them remains...
[1] https://static.nhtsa.gov/odi/inv/2026/INOA-PE26001-10005.pdf
[2] https://www.linkedin.com/posts/matthew-wansley-62b5b9126_a-w...
> You, Sir, cite two companies that are diametrically opposite on the safety spectrum
Cringe. Stop it. Simping for Google stopped being cool nearly two decades ago.
Still relies on an actual driver.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
"Waymo Driver" is their term for their self driving software.
Though given the situation, a human driver would not have been going 17 mph in a school zone during drop-off near double-parked vehicles.
1. I often see signs in such areas that flash when people exceed the limit. I’d urge you to pull over and see how often humans drive above the limit.

2. I’d also urge you to pull over and watch how many drivers are not consistently looking at the road: using their phones, looking down at climate/entertainment/vehicle controls, looking at a passenger, etc.