> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
You're omitting the context provided by the article. This wasn't just a random scenario. Not only was this by an elementary school, but during school drop-off hours, with both children and double-parked cars in the vicinity. If somebody doesn't know what double parking is: it's when cars park alongside cars that are already parallel parked, effectively in the roadway, making it difficult to see what's beyond them.
So you are around young children with visibility significantly impaired because of double parking. I'd love to see video of the incident, because driving 17 mph (27 km/h for metric types) in this context is reckless and not something a human would typically do: a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you slow way down in this scenario is that one of those cars could suddenly swing open a door, which, again, would not be particularly surprising in this sort of context.
If you drive in Sweden you will occasionally come upon a form of speed-reduction strategy that may seem counterintuitive. They all aim to make driving harder and feel more dangerous in order to force attention and lower speed.
One is to merge the two directions of a road into a single lane, forcing drivers to cooperate and take turns passing through it, one car at a time.
On a combined car and pedestrian road (max speed 7 km/h) near where I live, they intentionally added large view-blocking objects that limit visibility and make the road harder to navigate. This forces drivers to drive very slowly, even when alone on the road, since they can't see whether a car or person might be behind the next object.
On another road they added several tight S-curves in a row, where if you drive anything faster than 20 km/h you will fail the turns and ride up onto the artificially constructed curbs.
On other roads they put a sign in the middle of the two-way road while drastically narrowing the lane against the curb, forcing drivers to slow down in order to center the car and squeeze through.
The common thread in each of these is that a human driver's fear of crashing makes them pay extra attention and slow down.
> It's likely that a fully-attentive human driver would have done worse.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to slow way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (t intersection) recently when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
I think my problem is that it reacted after seeing the child step out from behind the SUV.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively": look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag. Slow the fuck down.)
multiple children in my area have died due to being hit by distracted drivers driving near schools. One incident resulted in 2 children being dragged 60 yards. Here's a snippet from an article about the death I was referencing:
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV'; was the Waymo overtaking?). Overtaking is something I would probably never do there (and it should be banned in school zones by road signs).
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
>It's likely that a fully-attentive human driver would have done worse.
Maybe. Depends on the position of the sun and the shadows. I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving systems pick up on that.
This exact scenario happened with my dad 50 years ago when a little girl ran out to the street from between some parked cars. It's an extremely difficult scenario to avoid an accident in.
A human driver in a school zone during morning drop off would be scanning the sidewalks and paying attention to children that disappear behind a double parked suv or car in the first place, no?
As described by the NHTSA brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.
Please, please remember that any data from Waymo will inherently support their position and cannot be taken at face value. They have a significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
> It's likely that a fully-attentive human driver would have done worse.
> a huge portion of human drivers
What are you basing any of these blind assertions on? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that, in particular because it's very far out of line with real-world data provided by the government.
Waymo is intentionally leaving out the following details:
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
Exactly. That's why I've always said that driving is a truly AGI-requiring activity. It's not just about sensors and speed limits and feedback loops. It's about having a true understanding of everything that's happening around you:
Having an understanding of the density and make-up of an obstacle that blew in front of you: it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn't justify a swerve.
Getting into the mind of the driver in front of you: seeing subtle hints that they're looking down, and recognizing that they're not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they're not quite there yet.
Or, in this case: hearing the sounds of children playing, recognizing that it's 3:20 PM and school is out, seeing the double-parked cars you mentioned, all of it instantly screaming to a human driver to be extremely cautious because kids could jump out from anywhere.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
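If anyone wants to check that napkin math, here's a quick sketch. It assumes constant deceleration and the same brake-onset point in every scenario, which we don't actually know from the blog post:

    # Under constant deceleration, v_contact^2 = v_start^2 - 2*a*d, and the
    # 2*a*d term is the same for any start speed over the same distance.
    def contact_speed(v_start_mph, v2_shed):
        """Speed (mph) at the contact point after shedding v2_shed mph^2."""
        return max(v_start_mph**2 - v2_shed, 0.0) ** 0.5

    shed = 17**2 - 5**2  # 264 mph^2: what the Waymo shed before contact

    for v0 in (17, 16, 15):
        print(f"from {v0} mph -> contact at {contact_speed(v0, shed):.1f} mph")
    # from 17 mph -> contact at 5.0 mph
    # from 16 mph -> contact at 0.0 mph  (stops just short of contact)
    # from 15 mph -> contact at 0.0 mph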
I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.
Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.
What I find a bit confusing is that no one is putting any blame on the kid. I did the same thing as a kid, except it was a school bus instead of an SUV, and that was a fucking stupid thing to do (I remember starting to run across the street, and the next thing I remember is lying in a hospital bed), even though I had been told to always cross the street from behind the bus, not in front of it.
I bet we'll look back on the SUV mania as something crazy, like smoking on a plane or using leaded gasoline. Irrationally large cars that people buy because everyone is afraid of another SUV hitting them while they're in a sedan. The tragedy of the commons.
The best reaction from Waymo would have been to start to lobby against letting those monster-trucks park on streets near schools. They are killing so many children, I'm flabbergasted they are still allowed outside of worksites.
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
>Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
Sure, but also throw in whether that driver is staring at their phone, distracted by something else, etc. I had been a skeptic of all this stuff for a while, but riding in a Waymo in heavy fog changed my mind, when I questioned how well I or another driver would've done at that time of day and in those conditions.
For me it would be interesting to know if 17 mph was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather? Were there cars parked that would make a defensive driver slow down even more?
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
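For the curious, here's roughly what a toy version of that looks like with Python's bleak library, as I understand its API; the RSSI cutoff is an arbitrary assumption, and this is nowhere near an automotive-grade sensor:

    # Toy desktop sketch of "count nearby phones via BLE" (pip install bleak).
    # Caveats: modern phones randomize their BLE addresses and advertise
    # intermittently, and RSSI is a very crude distance proxy.
    import asyncio
    from bleak import BleakScanner

    NEARBY_RSSI_DBM = -70  # arbitrary "close by" cutoff (assumption)

    async def count_nearby(scan_seconds=5.0):
        # return_adv=True yields {address: (BLEDevice, AdvertisementData)}
        found = await BleakScanner.discover(timeout=scan_seconds, return_adv=True)
        return sum(1 for _, adv in found.values() if adv.rssi >= NEARBY_RSSI_DBM)

    if __name__ == "__main__":
        print(f"{asyncio.run(count_nearby())} BLE advertisers nearby")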
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insist on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
> I honestly cannot imagine a better outcome or handling of the situation.
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions.
That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars, especially those huge SUVs or pickup trucks with big covers on the back. You can't see anything coming unless you stick your head out.
When I was a boy, I ran into the street from between two parked cars. I did not notice the car coming, but he noticed me popping out from nowhere, and screeched to a stop.
I saw a girl dart out between two parked cars on a stroad. She was less lucky. The car did slam on its brakes; I have no idea what speed it was ultimately going when it hit the girl. It wasn't enough to send her flying, but it was enough to knock her over hard. The dad was sitting in his front yard and had her up and in his car and, I'm guessing, rushed her to the hospital.
Those kinds of neighborhoods, where the outer houses face the fast, large roads, are less common now I think, but lots of them are left over from 50+ years ago.
If the person got up and walked away I'm not sure what damage you'd be doing by reasonably removing your car from blocking others while waiting for police.
Waymo driver? The vehicles are autonomous. I otherwise applaud Waymo's response, and I hope they are as cooperative as they say they will be. However, referring to the autonomous vehicle as having a driver is a dangerous way to phrase it. It's not passive voice, per se, but it has the same effect of obscuring responsibility. Waymo should say we, Waymo LLC, subsidiary of Alphabet, Inc., braked hard...
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
In fact I would call that “superhuman” behavior across the board.
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.
I have watched other parent drivers in the car pick-up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
How do you know how quickly the software braked? A blog post by a company selling a product is not credible material. We need independent sources.
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAP programs (like Euro NCAP, Japan NCAP) have as part of their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
Waymo’s performance, once the pedestrian was revealed, sounds pretty good. But is 17mph a safe speed at an active school dropoff area? I admit that I don’t think I ever personally pay attention to the speedometer in such a place, but 17mph seems excessive even for an ordinary parking lot.
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
> But is 17mph a safe speed at an active school dropoff area?
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph as needed). That would be something the investigation checks, of course. But regardless, 17 mph per se is not a very fast speed (my gut check: turning through intersections at >10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). More generally, though, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids in the area? If so, how many and where? And how/where were they running/moving? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue whether 17 mph was indeed the "right" decision. My gut feel is Waymo will argue that it was (though they will likely make the driving policy more cautious, especially in the presence of VRUs, and child VRUs in particular).
In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
> In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1] but notably, three papers talk about comparison of Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human driver, at 7.1 million miles [2], 25 million miles [3], 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better-than-human driving behavior, but they are showing that their AVs are better than humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for what they deem the involvement was, which may not be the actual scenario.
The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stronger.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
> In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
Most likely, yes, the NHTSA investigation will be credible source of info for this case. HOWEVER, Waymo will likely fight it tooth-and-nail from letting it be public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made their investigation data public, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile, in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time-pressured and feel like that will make up for being 5 minutes late on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for its lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7,000 child pedestrian injuries per year [2] is ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.
However, the child pedestrian injury rate is only an official estimate (it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
I suspect that highway miles heavily skew this statistic. There's naturally far fewer pedestrians on highways (lower numerator), people travel longer distances on highways (higher denominator), and Waymo vehicles didn't drive on highways until recently. If you look only at non-highway miles, you'll get a much more accurate comparison.
> we should default to the calculation of 2-4x the rate.
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
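To put numbers on how loose a single observation is, here's an exact Poisson interval sketch (scipy; the mileage figure is an assumption within the 100-200M range people are quoting in this thread):

    # One observed event pins down a rate very loosely. Exact Poisson 95%
    # bounds for k observed events (chi-squared form):
    #   lower = chi2.ppf(0.025, 2k) / 2,  upper = chi2.ppf(0.975, 2k + 2) / 2
    from scipy.stats import chi2

    k = 1            # one child-pedestrian contact observed
    miles = 150e6    # assumed Waymo driverless miles

    lam_lo = chi2.ppf(0.025, 2 * k) / 2      # ~0.0253 expected events
    lam_hi = chi2.ppf(0.975, 2 * k + 2) / 2  # ~5.57 expected events

    print(f"95% CI: 1 event per {miles / lam_hi:,.0f} to {miles / lam_lo:,.0f} miles")
    # -> roughly 1 per 27 million to 1 per 5.9 billion miles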
Would this Waymo incident be counted as an injury? Sounds like the victim was relatively unharmed? Presumably there are human-driver incidents like this where a car hits a child at low speeds, with effectively no injuries, but is never recorded as such?
People's standards for when they're willing to cede control over their lives, both as the passenger and as the pedestrian in the situation, are higher for a machine than for a human.
And for not totally irrational reasons: the machine follows its programming and does not fear death, and with 100% certainty the machine has bugs that will eventually kill someone for a really stupid reason, and nobody wants that to be them. Then there's the general https://xkcd.com/2030/ problem of people rightfully not trusting technology, because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
>People's standards for when they're willing to cede control over their lives both as the passenger and the pedestrian in the situation to a machine are higher than a human.
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
We should all think twice before taking a company PR statement completely at face value and praising them for slowing down faster than their own internal "model" says a human driver would. Companies are heavily interested in protecting their bottom line and in a situation like this probably had 5-10 people carefully craft every single word of the statement for maximum damage control.
Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double parked cars, etc? The same Waymo that is under investigation for passing school busses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
What I'm going to say will sound batshit insane: the problem is that if we don't praise company PR, the other side will use this as an excuse to push even harder regulations, keep them out of new cities, and slow down the adoption rate, while factually ignoring that this is just a safer method of transport. I wish I were not a bootlicker, but I really want robotaxis to be available everywhere in the world at some point, and such issues should not slow them down IF the technology is better than, and especially not worse than, humans on average.
Do you know anyone who works at Waymo? The cynicism is silly. Just because some people at some companies behave horribly, it doesn't mean all or even most do.
Look at Waymo's history in the space, meet some of the people working there, then make a decision.
You don't have to think anyone is behaving horribly to acknowledge that a company's PR department will tend to put out the version of the story that makes them look best.
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers while on my bike and how many times I have watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in their roof rack.
For real, I am convinced these are people who never walk or bike, at least around cities like Santa Monica. I am an everyday urban walker and I have to constantly be on alert not to be hit, even when I'm behaving predictably and with the right of way.
Yeah I have to wonder if any of the "humans would do it better" people actually have children and have dropped them off in a school zone. Drivers are on their phones rolling through school zones at 25-30 during pickup/dropoff hours all the fucking time.
A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that the child was there. If that driver had also been driving a large SUV, the child would have been pushed to the ground and run over, so probably a fatality. And functionally nobody would have given a shit, apart from some lame finger-pointing at (probably) the kid's parents.
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
> A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that child was there.
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
No, Waymo’s quote supports the grandparent comment - it was about a “fully attentive human driver” - unless you are arguing that human drivers are consistently “fully attentive”?
> And it is not the child’s or their parents’ fault either: Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults.
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own, not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot to not park right next to cross walk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and then ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
At some point children are capable of pursuing Darwin Awards. Parents may enable this, but ultimately, if one's child does something stupid contrary to one's guidance and restrictions, they may end up with a Darwin for it. Two hundred years ago the child mortality rate was one in two, as in you lost one of every two children born, and most of those deaths were not the fault of the child or the parents. Society has for many years been pushing that rate down, to the point that a near-death involving a neglectful parent and a witless child is apparently (?) newsworthy. But the number of deaths will never reach zero, whether with humans or robots or empty plains and blue skies. There will always be a Veruca Salt throwing themselves into the furnace, no matter how many safety processes we impose on roads, cars, drivers, and/or robots.
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It'll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you'll find that the true threat to children's safety on school roads is children's parents, just as the schools have known for decades. And that's not a war you'll win arguing against robots. (It's a war you'll win arguing against child-killing urban roadway design, though!)
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
you're exactly right. the fixation on human vs AV error rates completely misses the point. even if we achieve 'perfect' AVs, mixing heavy machinery with children guarantees conflict. physics dictate cars can't stop instantly. the only solution is removing cars, not better drivers.
most commenters here are ignoring the structural incentives. the long term threat of waymo isn't safety, it's the enclosure of public infrastructure. these companies are building a permission structure to lobby personal vehicles and public transit off the road.
transportation demand is inelastic. if we allow a transition where mobility is captured by private platforms, the consumer loses all leverage. the endgame is the american healthcare model: capture the market, kill alternatives, and extract max rent because the user has no choice. we need dense urban cores and mass transit, not a dependency on rent seeking oligopolies
If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit? It didn't honor the 15 mph limit.
Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up precisely because kids do dumb stuff like this. Others do too, though not everyone of course, incredibly.
We are responsible for the consequences of our actions. The speed limit is almost irrelevant; drive slowly enough so you don't hit anyone - especially in a school zone.
> We are responsible for the consequences of our actions.
We're not though. Drivers are allowed to kill as many people as they like as long as they're apologetic and weren't drinking; at most they pay a small fine.
So the waymo was speeding! All the dumbasses on here defending waymo when it was going 17 > 15.
Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
> Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double-parked car, in an area with chock-full street parking (hence the double parking), near "something" that's a magnet for pedestrians, and probably with a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone that warrants a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
From a purely stats pov, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously to not brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms assuming uncorrelated errors. This is why I love the concept of Lidar and optical together.
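A toy version of that calculation, with made-up per-sensor miss rates rather than real specs:

    # The "brake if ANY detector fires" argument, in numbers. With
    # independent detectors, the system misses only if every detector misses:
    #   P(system miss) = p1 * p2 * ... * pn
    miss_rates = {"lidar": 0.01, "camera": 0.03, "radar": 0.05}  # placeholders

    p_system_miss = 1.0
    for sensor, p in miss_rates.items():
        p_system_miss *= p

    print(f"best single sensor miss rate: {min(miss_rates.values()):.0e}")  # 1e-02
    print(f"combined independent miss:    {p_system_miss:.1e}")             # 1.5e-05

    # The trade-off: false alarms combine the other way,
    # P(any phantom brake) = 1 - prod(1 - fp_i),
    # which is the sensitivity-vs-specificity point made downthread.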
The true self-driving trolley problem. How many rear-end collisions and riders' annoyance caused by phantom braking a manufacturer (or a society) is going to tolerate to save one child per N million miles?
Uncorrelated approach improves sensitivity at the cost of specificity. Early sensor fusion might improve both (maybe at the cost of somewhat lesser sensitivity).
Yeah, if a human made the same mistakes as the Waymo driving too fast near the school, then they would have hurt the kid much worse than the Waymo did.
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
Kinetic energy is a bad metric. Acceleration is what splats people.
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, hence why nobody ever uses it.
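For what it's worth, you can still sketch it if you're willing to assume a stopping distance, which shows both points at once: acceleration needs an assumed d, and for fixed d it scales with v^2 just like kinetic energy does:

    # Constant-deceleration approximation: a = v^2 / (2*d). The stopping
    # distance d below is an arbitrary assumption, which is exactly the
    # point above: you need a distance (or time) to quote an acceleration.
    G = 9.81              # m/s^2
    MPH_TO_MS = 0.44704
    d = 0.3               # assumed effective stopping distance, metres

    for v_mph in (6, 14, 17):
        v = v_mph * MPH_TO_MS
        a = v**2 / (2 * d)
        print(f"{v_mph:>2} mph impact -> ~{a / G:.1f} g over {d} m")
    # 6 mph -> ~1.2 g ; 14 mph -> ~6.7 g ; 17 mph -> ~9.8 g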
The Waymo Driver tech is impressive. That said, an experienced driver might have recognized the pattern where a stopped big vehicle occludes part of the road, leading to exactly this kind of situation, and might have stopped or slowed almost to a halt before passing. The Waymo Driver reacts faster but is not able to predict such scenarios by filling in the gaps and simulating the world to inform decisions. Chapeau to Waymo anyway.
There have been many instances of Waymo preventing a collision by predicting pedestrians emerging from occlusion. This isn’t new information at all for them. Some accidents are simply physically impossible to prevent. I don’t know for sure if this one was one of those, but I’m fairly confident it couldn’t have been from prediction failure.
See past examples:
https://youtube.com/watch?v=hubWIuuz-e4 — first save is a child emerging from a parked car. Notice how Waymo slows down preemptively before the child starts moving.
I think this is definitely an improvement to consider, but when comparing, I think the big numbers, i.e. the statistics, are the only thing that matters. Some humans could detect the pattern and come to a full halt; another human driver could be speeding while texting.
Absent more precise information, this is a statistical negative mark for Waymo, putting their child pedestrian injury rate at ~2-4x the US human average.
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
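The same arithmetic as a quick script, using the rough figures cited above rather than anything authoritative:

    human_miles_per_year = 3.3e12        # total US vehicle miles travelled [1]
    child_injuries_per_year = 7000       # US child pedestrian injuries [2]

    human_miles_per_injury = human_miles_per_year / child_injuries_per_year
    print(f"human average: 1 injury per {human_miles_per_injury / 1e6:,.0f}M miles")
    # -> ~471M miles

    for waymo_miles in (100e6, 200e6):   # Waymo driverless miles, 1 contact
        print(f"at {waymo_miles / 1e6:.0f}M miles: "
              f"~{human_miles_per_injury / waymo_miles:.1f}x the human rate")
    # -> ~4.7x at 100M, ~2.4x at 200M (the ~2-4x range quoted above)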
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers; Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
I don't think this comparison is meaningful given the sample size of 1 and the differences between your datasets. The standard error margins from the small sample size alone are so large that you could not reasonably claim humans are safer (the 95% CI for Waymo is about 1 per 20 million miles to 1 per 8 billion miles). Then there are the dataset differences:
1. The NHTSA data is based on police-reported crash data, which reports far fewer injuries than the CDC reports based on ED visits. The child in this case appeared mostly unharmed and situations like this would likely not be counted in the NHTSA data.
2. Waymo taxis operate primarily in densely populated urban environments while human driver milage includes highways and rural roads where you're much less likely to collide with pedestrians per mile driven.
Waymo's 90% crash reduction claim is at least an apples-to-apples comparison.
Do we even know that the child was injured? All I've seen anyone officially claim is that the Waymo made contact, the kid fell over, then stood up and walked to the side of the road. Assuming the Waymo was still braking hard, 6mph means it was about 1/4s and about 30cm from reaching a full stop, so it could be a very minor incident we're talking about here.
I'm not aware of any statistics for how often children come into contact with human-driven cars.
I don't think I'd want to take much from such a statistical result yet. A sample size of 1 accident just isn't enough information to get a real rate from, not that I want to see more collisions with children. Though this is also muddied by the fact that Waymo will most likely adjust their software to make this less likely, and we won't know exactly how or how many miles each version has. I'd also like to see the data for human incidents over just the temperate suburban areas like Waymo operates in.
> child pedestrian injury rate at ~2-4x higher than the US human average.
If this incident had happened with a human driven vehicle would it even have been reported?
I don't know exactly what a 6 mph collision looks like, but I think it's likely the child had nothing more than some bruises, and if a human had done it they would have just said sorry, made sure the kid was OK, and left.
The only question I have is whether the speed it was going was situationally appropriate and whether we’d expect a human to be considered reckless under the same circumstances. 17mph sounds pretty slow but it really depends on context.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post.
The issue is that I don't trust a private company's word. You can't even trust the president of the US government nowadays... release the video footage or get lost.
It's interesting how polarized this comments section is. Lots of people claiming a human driver would definitely have been driving slower. Lots of people claiming statistics show that human drivers do worse in this scenario in aggregate. Of course, neither side presents convincing evidence.
> any other car been there, probably including Tesla
Cheap shots. If this were Tesla there would be live media coverage across every news outlet around the world and congressmen racing to start investigations.
Look at any thread where Tesla is mentioned and count how many Waymo simps are mansplaining lidar.
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as hard as possible, and the company took care of all the things a human driver would have been expected to do in the same situation. Could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm going to ignore, as no-signal, Waymo's models that say an attentive human driver would have done worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
It's hard to imagine how any driver could have reacted better in this situation.
The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones before, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge in consequence. Part of Waymo's research is to show that even injury accidents are rarely reported to the police.
Are you thinking of civil liability or criminal liability?
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.
> Normal person would drive carefully around blind spots.
I can't tell if you are using sarcasm here or are serious. I guess it depends on your definition of normal person (obviously not average, but an idealized driver maybe?).
> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.
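As a toy illustration of that logic (every name and threshold here is made up; production AV stacks are enormously more sophisticated than a single time-to-collision check):

    def should_brake(dist_ahead_m, lateral_m, closing_speed_mps,
                     path_half_width_m=1.2, ttc_threshold_s=1.5):
        # Brake if a tracked object is inside our swept path and closing fast.
        if abs(lateral_m) > path_half_width_m:
            return False                  # not in our path (yet)
        if closing_speed_mps <= 0:
            return False                  # not closing on us
        ttc = dist_ahead_m / closing_speed_mps
        return ttc < ttc_threshold_s

    # Child steps out 6 m ahead while we do 17 mph (~7.6 m/s): TTC ~0.8 s.
    print(should_brake(6.0, 0.3, 7.6))    # -> True

The point being: the trigger doesn't care what class of object it is, only where it is and how fast the gap is closing.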
I know submissions are not meant to contain modifications to article titles, but would it be so bad to have added "at 6mph" and/or "minor injuries" to the title?
> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety
It's a child now. All I want to ask is: what has to happen before they stop killing pets and people?
The real but contentious answer is to change our street and urban design. You can only do so much to make a giant metal machine safe for children and small animals to be struck by. Reducing the frequency of cars and pedestrians occupying the same space will go further than trying to engineer the equivalent of a pool that is impossible to drown in.
Do you think that a company that operates autonomous vehicles will support legislation that makes it easier and safer to move around on foot without getting hit by a car? Or will they lobby for car-centric urban design, like many many companies before them?
OK
It's like this: if I hit a child in a school district, I lose my licence for many years, and if I don't or can't show remorse, it could be longer; I pay fines, etc.
Therefore Waymo must have its algorithm terminated, i.e. totally destroyed, all the hardware smashed, and they never get to try again with any derivative of this technology, as there is no reasonable, understandable path towards repentance and rehabilitation. It is literally a monster running over children.
Or, if it was carrying an ICE team, then never mind.
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
> Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
I can see that: prioritize obstacle predictability over transit time. A school zone at certain times of day is very unpredictable with respect to obstacles, while a more car-congested area would be easier to navigate, just slower. Same goes for residential areas during Halloween.
>To put this in perspective, our peer-reviewed model shows that a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph. This significant reduction in impact speed and severity is a demonstration of the material safety benefit of the Waymo Driver.
The statistically relevant question is: How many human drivers have hit children near elementary schools, since Waymo's last accident?
If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.
A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.
This is wrong, although something quite like it is right.
Imagine that there are only 10 Waymo journeys per year, and every year one of them hits a child near an elementary school, while there are 1000000 non-Waymo journeys per year, and every year two of them hit children near elementary schools. In this scenario Waymo has half as many accidents but is clearly much more dangerous.
Here in the real world, obviously the figures aren't anywhere near so extreme, but it's still the case that the great majority of cars on the road are not Waymos, so after counting how many human drivers have had similar accidents you need to scale that figure in proportion to the ratio of human to Waymo car-miles.
(Also, you need to consider the severity of the accidents. That comparison probably favours Waymo; at any rate, they're arguing that it does in this case, that a human driver in the same situation would have hit the child at a much higher and hence more damaging speed.)
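The toy numbers above, written out as per-journey rates:

    waymo_rate = 1 / 10              # 0.1 strikes per Waymo journey
    human_rate = 2 / 1_000_000       # 0.000002 strikes per human journey
    print(waymo_rate / human_rate)   # 50000.0 -- 50,000x worse per journey

Hence the need to normalize by exposure (journeys or miles) before comparing raw counts.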
Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.
But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail but leave me at fault. In other words, I trust my own driving more than the FSD (I could be right or wrong, but I think most people will feel the same way).
I believe Mercedes is the only consumer car manufacturer that is advertising an SAE Level 3 system. My understanding is that L3 is where the manufacturer says you can take your attention off the road while the system is active, so they're assuming liability.
That's pretty hyperbolic. At less than 20 mph, a car-vs-pedestrian collision is unlikely to result in death. IIHS says [1] in an article about other things:
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized. Still, it is valuable to consider the situation (when we have sufficient information) and validate Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school dropoff: many human drivers are extra cautious in that context and may not have reached that speed; depending on the end-to-end route, some human drivers would have avoided the street with the school altogether based on the time, etc. It certainly seems like a good result for the premise (child unexpectedly appears from between large parked vehicles), but maybe there should have been an expectation.
For me, the policy question is this: if this had been a human driver, we would have a clear person to sue for liability and damages. For a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring a suit, whereas a private driver would lean on their insurance.
So you're worried that instead of facing off against an insurance agency, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.
Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.
Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, poor financial feedback loop. No real skin in the game.
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.
Disagree; most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would be driving very carefully near blocked sight lines.
Better reporting would have found out the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxes/etc. than humans when riding a bicycle.
We're impatient emotional creatures. Sometimes when I'm on a bike the bike lane merges onto the road for a stretch, and there's no choice but to take up a lane. I've had people accelerate behind me and screech the tyres, stopping just short of my back wheel in a threatening manner, repeatedly, as I rode the short distance in the lane before the bike lane re-opened.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi it seems to drive on the cautious side already.
Give me the unemotional, infinitely patient, drives-very-much-on-the-cautious-side automatic taxi over humans any day.
Couldn't be any more callous and clinical. This press release alone makes me want to not use their service.
> “Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.
Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.
That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.
It might not be possible for a lot of places — I don’t really know.
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
And before the argument "Self driving is acceptable so long as the accident/risk is lower than with human drivers" comes up, can I please get this out of the way: no, it's not. Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or have personal liability. We accept the risks with humans because those humans accept risk. Self driving abstracts the legal risk, and removes the physical risk.
I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar solutions gets a lot of press. This was a major incident with a happy ending. Those are quite rare. The lethal ones even rarer.
As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues but is also remarkable for the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it was really unsafe, there would be some numbers on that. Normal statistics in the US are measured in ~17 deaths per 100K drivers per year. 40K+ fatalities overall. FSD for all its faults and failings isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples and oranges comparison of course. But the bar for safety is pretty low as soon as you include human drivers.
Liability weighs more heavily on companies than safety does. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data at the price of more deaths by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the number of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.
> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.
I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are) I don't think it can be representative of the risks of driving in general.
Naturally, robotaxis will benefit from better infra outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.
> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
If Waymo is to be believed, they hit the kid at 6 mph and estimated that a human driver at full attention would have hit the kid at 14 mph. The Waymo was traveling 17 mph. The situation of "kid running out between cars" will likely never be solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
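Putting rough numbers on that worst-case rule (every parameter below is an illustrative assumption): if a sprinter could emerge from behind parked cars 6 feet away at any instant, the fastest speed at which you can still guarantee a full stop before they reach your lane works out to roughly walking pace:

    GAP = 1.8            # m of lateral clearance to the parked cars (~6 ft)
    V_PED = 3.5          # m/s, assumed sprinting child
    T_REACT = 0.3        # s, assumed perception + actuation delay
    A = 0.8 * 9.81       # m/s^2, assumed hard braking

    t_available = GAP / V_PED            # ~0.51 s until they're in our lane
    t_braking = t_available - T_REACT    # time actually spent decelerating
    v_max = A * t_braking                # highest speed that reaches 0 in time
    print(v_max / 0.44704)               # ~3.8 mph

So a literal zero-risk guarantee is incompatible with traffic moving at all; the real argument is about how much residual risk is acceptable, not whether it exists.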
Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.
> The situation of "kid running out between cars" will likley never be solved
Nuanced disagree (I agree with your physics), in that an element of the issue is design. Kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.
One simple change could be adding a chain-link fence / boundary between the parked cars and the driving lanes, increasing visibility and reaction time.
>We accept the risks with humans because those humans accept risk.
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
I think a very good reason to want to know who's liable is because Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?
"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.
Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.
But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.
Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.
From the Waymo blog...
> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
Yup. And to add
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
You're omitting the context provided by the article. This wasn't just a random scenario. Not only was this by an elementary school, but during school drop off hours, with both children and doubled parked cars in the vicinity. If somebody doesn't know what double parking is - it's when cars parallel park beside one another, implicitly on the road, making it difficult to see what's beyond them.
So you are around young children with visibility significantly impaired because of double parking. I'd love to see video of the incident because driving 17mph (27kph for metric types) in this context is reckless and not something human would typically do, because a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you also slow way down in this scenario is one of those cars suddenly swinging open their door which, again, would not be particularly surprising in this sort of context.
If you drive in Sweden you will occasionally come up to a form of speed reduction strategy that may seem counterintuitive. They all add to make driving harder and feel more dangerous in order to force attention and lower speed.
One is to merge opposite directional roads into a single lane, forcing drivers to cooperate and take turn to pass it, one car at a time.
For a combined car and pedestrian road (max speed of 7km/h) near where I live, they intentionally added large obfuscating objects on the road that limited visibility and harder to navigate. This forces drivers to drive very slow, even when alone on the road, as they can't see if a car or person may be behind the next object.
In an other road they added several tight S curves in a row, where if you drive anything faster than 20km/h you will fail the turns and drive onto the artificial constructed curbs.
In other roads they put a sign in the middle of two way roads while at the same time drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
In each of those is that a human driver with human fear of crashing will cause drivers to pay extra attention and slow down.
> It's likely that a fully-attentive human driver would have done worse.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big picture scenario much earlier. Are there muliple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to be slowing way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (T intersection) when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS-brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
I think my problem is that it reacted after seeing the child step out from behind the SUV.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)
Multiple children in my area have died after being hit by distracted drivers near schools. One incident resulted in 2 children being dragged 60 yards. Here's a snippet from an article about the death I was referencing:
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV'; was the Waymo overtaking?), and overtaking is something I would probably never do there (it should be banned in school zones by road signs).
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
If I was a human driver in that contextual situation I wouldn't even be going 14mph in the first place...
I don't find that hard to believe; a child running out from behind an SUV is really scary.
It depends. A driver may have seen a child dart behind a car and expect them to emerge on the other side.
Does Waymo have the same object permanence and trajectory prediction (combined) to that of a human?
Once the video evidence is out, it might become evident.
Generally Waymo seems to be a responsible actor, so maybe that is the case, and this can help demonstrate the potential benefits of autonomous vehicles.
Alternatively, if even they can't get this right, then it may cast doubt on the maturity of the entire ecosystem.
>It's likely that a fully-attentive human driver would have done worse.
Maybe. It depends on the position of the sun and shadows. I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving systems pick up on that.
This exact scenario happened with my dad 50 years ago when a little girl ran out to the street from between some parked cars. It's an extremely difficult scenario to avoid an accident in.
A human driver in a school zone during morning drop-off would be scanning the sidewalks and paying attention to children that disappear behind a double-parked SUV or car in the first place, no?
As described by the nhtsa brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
"fully attentive human driver ..." is Waymo's claim, and it could be biased in their favor.
> It's likely that a fully-attentive human driver would have done worse.
Why is it likely? Are we taking the vendor's claims in a blog post as truth?
Who benefits from a statement like this?
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
I remember someone using similar language when Uber self driving killed someone - and when the video was released, it was laughable.
It is also crazy that this happened 6 days ago at this point and video was NOT part of the press releases. LOL
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
A fully attentive human would've known he was near a school and wouldn't have been driving at 17 mph to begin with.
> It's likely that a fully-attentive human driver would have done worse.
> a huge portion of human drivers
What are you basing any of these blind assertions on? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that, in particular because it's very far out of line with real-world data provided by the government.
Waymo is intentionally leaving out the following details:
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
[0] https://www.safedrivingforlife.info/free-practice-tests/haza...
Exactly. That's why I've always said that driving is a truly AGI-requiring activity. It's not just about sensors and speed limits and feedback loops. It's about having a true understanding of everything that's happening around you:
Having an understanding for the density and make up of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment though large will do no damage and doesn’t justify a swerve.
Getting in the mind of a car in front of you, by seeing subtle hints of where the driver is looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it's 3:20 PM and that school is out, noticing the other cars double parked as you mentioned: all of it screaming instantly to a human driver to be extremely cautious because kids could be jumping out from anywhere.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast
Hey, I'd agree with this -- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1 mph slower would likely have resulted in no contact in this scenario.
But I'd say the majority of the time it's OK to pass an elementary school at 20-25 mph. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back-of-napkin math says an attentive human driver going at 12 mph would hit the pedestrian at the same speed, if what we've been told is accurate.)
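A quick check of that kinematics: ignoring reaction-time distance (which cancels if both cases react at the same point), a fixed deceleration over a fixed distance removes a fixed amount of v^2, so the whole comparison can stay in mph^2:

    K = 17**2 - 6**2    # braking "budget" actually used, per Waymo's <6 mph figure
    for v0 in (17, 16, 15, 14):
        v2 = v0**2 - K
        print(v0, "->", f"{v2 ** 0.5:.1f} mph impact" if v2 > 0 else "no contact")

With the under-6 mph figure, a 16 mph start still grazes at ~1.7 mph and 15 mph stops short; with the 5 mph figure above, even 16 mph stops short. Either way, a couple of mph of initial speed is the whole ballgame.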
Whoa! You're allowed to double park outside a school over there?!
> From the Waymo blog...
I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.
Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.
An honest account of this situation would place at least some blame on there being a tall SUV blocking visibility.
These giant SUVs really are the worst when it comes to child safety
What I find a bit confusing is that no one is putting any blame on the kid. I did the same thing as a kid, except it was a school bus instead of an SUV, and it was a fucking stupid thing to do (I remember starting to run across the street, and the next thing I remember is being in a hospital bed), even though I had been told to always cross the street from behind the bus, not in front of it.
That day I learned why it was so.
I bet in the future we'll look back on the SUV mania as something crazy, like smoking on a plane or using lead in gasoline. Irrationally large cars that people buy because everyone is afraid of another SUV hitting them in a sedan. A tragedy of the commons.
The best reaction from Waymo would have been to start to lobby against letting those monster-trucks park on streets near schools. They are killing so many children, I'm flabbergasted they are still allowed outside of worksites.
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car had some level of caution, which is good.
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents, to see how "most people" would have acted in the minutes leading up to the event as well as during the accident itself.
> a full-scale vehicle simulator
The UK has such a thing, and this vehicle would have failed a driving test there.
>Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
Sure, but also throw in whether that driver is staring at their phone, distracted by something else, etc. I have been a skeptic of all this stuff for a while, but riding in a Waymo in heavy fog changed my mind when I questioned how well I or another driver would've done at that time of day and in those conditions.
17 mph is pretty slow unless it’s a school zone
I easily can: when in a school zone, never ever go so fast that you can't stop before hitting a kid, especially when visibility is limited.
For me it would be interesting to know if 17 mph was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather? Were there cars parked that would make a defensive driver slow down even more?
It's hardly surprising that the version of events from the PR department makes Waymo sound completely blameless.
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
> Humans have this intuitive sense.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
What's the success rate of this intuitive sense that humans have? Intuitions are wrong frequently.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
The general public is stupid.
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
"from behind a tall SUV, "
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
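A hedged sketch of that idea using the cross-platform bleak library (a toy, not a safety sensor: plenty of kids carry no radio, RSSI is a poor range proxy, and I'm assuming bleak's current callback API here):

    import asyncio
    from bleak import BleakScanner

    async def count_nearby_devices(window_s=3.0, rssi_floor=-75):
        seen = {}
        def on_adv(device, adv):
            seen[device.address] = adv.rssi      # keep last RSSI per address
        scanner = BleakScanner(detection_callback=on_adv)
        await scanner.start()
        await asyncio.sleep(window_s)            # passively listen for a bit
        await scanner.stop()
        # crude proximity filter: stronger signal ~= closer
        return sum(1 for rssi in seen.values() if rssi >= rssi_floor)

    print(asyncio.run(count_nearby_devices()))

At best this gives a weak prior ("people probably nearby"), never a substitute for lidar/camera detection.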
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insists on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
Then they are being very transparent about it.
As every company should, when they have a success. Are they also as transparent about their failures?
as far as we know
> reducing speed from approximately 17 mph
Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?
> I honestly cannot imagine a better outcome or handling of the situation.
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
Can’t trust a private company.
Where is the video recording?
I suspect the robotaxi may have done better than a human.
Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions. That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
I don't think Waymo is interested in using a video of their car striking a child as marketing.
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
When I was a boy, I ran into the street from between two parked cars. I did not notice the car coming, but he noticed me popping out from nowhere, and screeched to a stop.
I was very very lucky.
I saw a girl dart out between two parked cars on a stroad. She was less lucky. The car did slam on its brakes; I have no idea what speed it was ultimately going when it hit the girl. It wasn't enough to send her flying, but it was enough to knock her over hard. The dad was sitting in his front yard, had her up and in his car, and I'm guessing rushed her to the hospital.
Those kinds of neighborhoods, where the outer houses face the fast, large roads, are less common now I think, but lots of them are left over from 50+ years ago.
It’s great handling of the situation. They should release a video as well.
Indeed. Rather than having the company telling me that they did great I'd rather make up my own mind and watch the video.
We should take their reporting with a grain of salt and wait for official results.
Well done, Waymo!
This is great.
what about all the traffic violations though?
https://news.ycombinator.com/item?id=46814583
> remained stopped, moved to the side of the road
Stopped or moved? Is it allowed in CA to move the car at all after a serious accident happens?
If the person got up and walked away I'm not sure what damage you'd be doing by reasonably removing your car from blocking others while waiting for police.
Take that particular Waymo car off the road. Seems absurd, but they still hit someone.
The car is not the problem. The problem is the intersection of human and machine operating independently of each other with conflicting intention.
I am personally a fan of entirely automated but slow traffic. 10mph limit with zero traffic is fast enough for any metro area.
Waymo driver? The vehicles are autonomous. I otherwise applaud Waymo's response, and I hope they are as cooperative as they say they will be. However, referring to the autonomous vehicle as having a driver is a dangerous way to phrase it. It's not passive voice, per se, but it has the same effect of obscuring responsibility. Waymo should say we, Waymo LLC, subsidiary of Alphabet, Inc., braked hard...
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
> Waymo driver? The vehicles are autonomous
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
EDIT: replies say I'm misremembering, disregard.
That was Cruise, and that was fixed by Cruise ceasing operations.
I don’t think that was Waymo right? Cruise is already wound down as far as I know.
> I honestly cannot imagine a better outcome or handling of the situation.
> From the Waymo blog
Yeah, like, no shit, Sherlock. We'd better wait for some video before forming our opinions.
In fact I would call that “superhuman” behavior across the board.
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.
I have watched other parent drivers in the school pick-up line for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
How do you know how quickly the software braked? A blog post by a company selling a product is not credible material. We need independent sources.
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
> I honestly cannot imagine a better outcome or handling of the situation.
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
Most humans in that situation won't have the reaction speed to do shit about it, and it could result in a severe injury or death.
Humans are not going to win on reaction time but prevention is arguably much more important.
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
How would standard automatic braking (standard in some brands) have performed here?
This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAPs (like Euro NCAP, Japan NCAP) include in their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
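To make the difficulty concrete, here's a toy version of the revealed-pedestrian geometry (every number is an illustrative assumption, not data from this incident):

    MPH = 0.44704

    def impact_speed(v0_mph, reveal_dist_m, t_react_s, decel_mps2):
        # Speed (mph) on reaching the pedestrian, or 0 if stopped short.
        v0 = v0_mph * MPH
        d_brake = reveal_dist_m - v0 * t_react_s  # distance left once braking starts
        if d_brake <= 0:
            return v0_mph                         # still mid-reaction at impact
        v2 = v0**2 - 2 * decel_mps2 * d_brake
        return (v2 ** 0.5) / MPH if v2 > 0 else 0.0

    # 17 mph, pedestrian revealed 5.5 m ahead, hard braking at 0.8 g:
    print(impact_speed(17, 5.5, 0.3, 0.8 * 9.81))   # ~6 mph
    print(impact_speed(17, 5.5, 0.6, 0.8 * 9.81))   # ~15 mph

Note how 0.3 s of extra delay turns a ~6 mph tap into a ~15 mph hit, and the result swings just as hard with the assumed reveal distance and deceleration. That sensitivity is exactly why the investigation data matters more than anyone's model.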
Waymo’s performance, once the pedestrian was revealed, sounds pretty good. But is 17mph a safe speed at an active school dropoff area? I admit that I don’t think I ever personally pay attention to the speedometer in such a place, but 17mph seems excessive even for an ordinary parking lot.
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
> But is 17mph a safe speed at an active school dropoff area?
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless of that, 17 mph per se is not a very fast speed (my gut check: turning through intersections at >10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). But more generally, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most oblivious of critters). Did the Waymo see the kids around in the area? If so, how many and where? And how/where were they running/moving? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue whether 17 mph was indeed the "right" decision. My gut feeling is Waymo will argue that it was (but they might well make the driving policy more cautious, esp. in the presence of VRUs, and child VRUs in particular).
In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
> In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1], but notably, three papers compare Waymo's rider-only (i.e. no safety driver) performance vis-à-vis human drivers, at 7.1 million miles [2], 25 million miles [3], and 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards, as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better-than-human driving behavior, but they are showing that their AVs are better than humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
---
[1] https://waymo.com/safety/research
[2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[3] https://waymo.com/research/do-autonomous-vehicles-outperform...
[4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[5] https://waymo.com/research/comparative-safety-performance-of...
[6] https://waymo.com/blog/2022/09/benchmarking-av-safety/
[edit: reference]
> Waymo will still have to accept some responsibility
Why? This is only true if they weren't supposed to be on the road in the first place. Which is not true.
Think of it like dog ownership: if my dog hurts someone, that's on me. Property that causes harm is the owner's responsibility.
If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.
In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for whatever they deem Waymo's involvement to have been, which may not match the actual scenario.
1 reply →
Bringing a vehicle onto the public roads is a privilege not a right. Any harm to pedestrians that results is your responsibility, not anyone else's.
The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stronger.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
1 reply →
In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
> I would say that Waymo's response, per their blog post [2] has been textbook compliance.
Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.
> In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
Most likely, yes, the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth-and-nail to keep it from being made public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
[1] http://www.koopman.us/
[2] https://www.gmu.edu/profiles/cummings
2 replies →
Still relies on an actual driver.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
"Waymo Driver" is their term for their self driving software.
Though given the situation, a human driver would not have been going 17 mph in a school zone during drop-off near double-parked vehicles.
1 reply →
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile in my area of the world parents are busy, stressed, and on their phones, pressing the accelerator hard because they're time-pressured and feel like that will make up for the 5 minutes late they are on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7,000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.
However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate (a quick arithmetic check follows the references below).
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
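To make the arithmetic explicit, here is a minimal sketch in Python, treating the figures above as assumptions (the mileage and injury counts are the cited estimates; the Waymo mileage is the rough range given):

    # Back-of-the-envelope rate comparison using the figures cited above.
    # All inputs are assumptions from the comment, not verified data.
    human_miles_per_year = 3.0e12        # ~3 trillion US vehicle-miles/year [1]
    child_injuries_per_year = 7_000      # ~7,000 child pedestrian injuries/year [2]

    miles_per_injury_human = human_miles_per_year / child_injuries_per_year
    print(f"Human average: 1 injury per ~{miles_per_injury_human / 1e6:.0f}M miles")

    for waymo_miles in (100e6, 200e6):   # Waymo autonomous mileage, 1 injury observed
        ratio = miles_per_injury_human / waymo_miles
        print(f"At {waymo_miles / 1e6:.0f}M Waymo miles: ~{ratio:.1f}x the human rate")
    # -> ~4.3x at 100M miles, ~2.1x at 200M miles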
I suspect that highway miles heavily skew this statistic. There's naturally far fewer pedestrians on highways (lower numerator), people travel longer distances on highways (higher denominator), and Waymo vehicles didn't drive on highways until recently. If you look only at non-highway miles, you'll get a much more accurate comparison.
4 replies →
> we should default to the calculation of 2-4x the rate.
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
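To put a number on how little one observation tells you, a quick likelihood sketch (illustrative numbers only):

    # One roll shows a six. How strongly does that favor "the die always
    # rolls six" over "the die is fair"? Illustrative numbers only.
    p_fair = 1 / 6     # P(six | fair die)
    p_loaded = 1.0     # P(six | die that always shows six)

    print(f"likelihood ratio: {p_loaded / p_fair:.0f}:1")  # -> 6:1
    # Even against the most extreme hypothesis, a single roll yields only
    # 6:1 evidence; one pedestrian incident similarly cannot pin down a rate.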
5 replies →
Would this Waymo incident be counted as an injury? Sounds like the victim was relatively unharmed? Presumably there are human-driver incidents like this where a car hits a child at low speeds, with effectively no injuries, but is never recorded as such?
If that's the case, then that's great info. Thank you for adding :)
People's standards for when they're willing to cede control over their lives to a machine, both as the passenger and as the pedestrian in the situation, are higher than for a human.
And for not totally irrational reasons, like: a machine follows its programming and does not fear death; and with 100% certainty the machine has bugs which will eventually end up killing someone for a really stupid reason, and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology, because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
>People's standards for when they're willing to cede control over their lives to a machine, both as the passenger and as the pedestrian in the situation, are higher than for a human.
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
3 replies →
We should all think twice before taking a company PR statement completely at face value and praising them for slowing down faster than their own internal "model" says a human driver would. Companies are heavily interested in protecting their bottom line and in a situation like this probably had 5-10 people carefully craft every single word of the statement for maximum damage control.
Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure, it sounds great if you read that the Waymo slowed down faster than a human would. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double-parked cars, etc.? The same Waymo that is under investigation for passing school buses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
It's going to sound batshit insane, what I'm about to say: the problem is, if we don't praise company PR, the other side will use this as an excuse to push even harder regulations, not allow them in newer cities, and slow down the adoption rate, while ignoring the fact that this is just a safer method of transport. I wish I were not a bootlicker, but I really want robotaxis to be available everywhere in the world at some point, and such issues should not slow them down IF they're better, and especially not worse, than humans on average.
You're right, what you're saying is batshit insane.
2 replies →
One of the few seeing through Waymo PR bullshit.
Do you know anyone who works at Waymo? The cynicism is silly. Just because some people at some companies behave horribly, it doesn't mean all or even most do.
Look at Waymo's history in the space, meet some of the people working there, then make a decision.
You don't have to think anyone is behaving horribly to acknowledge that a company's PR department will tend to put out the version of the story that makes them look best.
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a driver ran right through it and the biker they creamed got stuck in the roof rack.
For real, I am convinced these are people who never walk or bike, at least around cities like Santa Monica. I am an everyday urban walker and I have to constantly be on alert not to be hit, even when I'm behaving predictably and with the right of way.
Yeah I have to wonder if any of the "humans would do it better" people actually have children and have dropped them off in a school zone. Drivers are on their phones rolling through school zones at 25-30 during pickup/dropoff hours all the fucking time.
A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that child was there. If that driver had also been driving a large SUV, that child would have been pushed to the ground and run over, so probably a fatality. And also functionally nobody would have given a shit, apart from some lame finger-pointing at (probably) the kid's parents.
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
> A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that child was there.
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
No, Waymo’s quote supports the grandparent comment - it was about a “fully attentive human driver” - unless you are arguing that human drivers are consistently “fully attentive”?
> And it is not the child’s or their parents’ fault either: Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults.
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own: not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot not to park right next to a crosswalk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid went behind a car and ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
When my kids were school age, I taught them that the purpose of crosswalk lines is to determine who pays for your funeral.
They got the point.
This is a very good way of putting it.
We live very close to Grant. We go through this intersection to walk our kids to their schools & know the crossing guards pretty well.
This matches exactly what they said.
That kid is lucky it was a Waymo & not a human driven car.
Do you think Waymos should be banned from driving through Santa Monica?
No. They are by far the safest drivers in Santa Monica. Ideally we get to a point where human drivers are banned.
I do not like the phase "it's the kid's fault" for a kid being hit by a robot-car.
It is never a 6 year old's fault if they get struck by a robot.
Exactly. It's his parents' fault.
At some point children are capable of pursuing Darwin Awards. Parents may enable this, but ultimately if one’s child does something stupid contrary to one’s guidance and restrictions, they may end up with a Darwin for it. Two hundred years ago the child mortality rate was half, as in you lost one child per two, and most of those were not the fault of the child or parents. Society for quite some years has been pushing that down, to the point that a near-death involving a neglectful parent and a witless child is apparently (?) newsworthy — but the number of deaths will never reach zero, whether humans or robots or empty plains and blue skies. There will always be a Veruca Salt throwing themselves into the furnace no matter how many safety processes we impose onto roads, cars, drivers, and/or robots.
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It'll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you'll find that the true threat to children's safety on school roads is children's parents — just as the schools have known for decades. And that's not a war you'll win arguing against robots. (It's a war you'll win arguing against child-killing urban roadway design, though!)
1 reply →
No-fault accidents happen. Accidents can have causes that are not legal nor moral blame.
2 replies →
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
you're exactly right. the fixation on human vs AV error rates completely misses the point. even if we achieve 'perfect' AVs, mixing heavy machinery with children guarantees conflict. physics dictate cars can't stop instantly. the only solution is removing cars, not better drivers.
most commenters here are ignoring the structural incentives. the long term threat of waymo isn't safety, it's the enclosure of public infrastructure. these companies are building a permission structure to lobby personal vehicles and public transit off the road.
transportation demand is inelastic. if we allow a transition where mobility is captured by private platforms, the consumer loses all leverage. the endgame is the american healthcare model: capture the market, kill alternatives, and extract max rent because the user has no choice. we need dense urban cores and mass transit, not a dependency on rent seeking oligopolies
The school speed limit there is 15 mph, and that wasn't enough to prevent an accident.
https://www.yahoo.com/news/articles/child-struck-waymo-near-...
https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
The interesting thing is a 12 mph speed limit would be honored by an autonomous vehicle but probably ignored by humans.
If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit? It didn't honor the 15 mph limit.
1 reply →
Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up precisely because kids do dumb stuff like this. Others do too, though not everyone of course, incredibly.
1 reply →
We are responsible for the consequences of our actions. The speed limit is almost irrelevant; drive slowly enough so you don't hit anyone - especially in a school zone.
> We are responsible for the consequences of our actions.
We're not though. Drivers are allowed to kill as many people as they like as long as they're apologetic and weren't drinking; at most they pay a small fine.
1 reply →
So drive at 0mph?
So the Waymo was speeding! All the dumbasses on here defending Waymo when it was going 17 > 15.
Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Depends on where the Waymo was.
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
https://maps.app.goo.gl/Vhce7puwwYyDYEuo6
> Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double-parked car, in an area chock full of street parking (hence the double park), near "something" that's a magnet for pedestrians, and probably a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone that warrants a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
7 replies →
For reference, here's a link to Waymo's blog post: https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
It honked at you? But local laws dictate that it angrily flashes its high beams at you.
In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
The unpredictability was jarring to me as a passenger in a Waymo.
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based on trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated them, making their control stack switch to some other "panic" controller.
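Waymo hasn't published that level of detail, so purely as an illustration of the pattern being asked about (a nominal controller plus a fallback that engages when inputs leave a validated envelope), a hypothetical sketch could look like this; every name and threshold here is invented:

    # Hypothetical envelope check with a fallback "panic" controller.
    # This illustrates the general pattern only; it does not reflect
    # Waymo's actual stack, and all thresholds are made up.
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        distance_m: float    # distance to the obstacle along our path
        closing_mps: float   # closing speed toward it

    MAX_CLOSING_MPS = 8.0    # envelope where nominal guarantees are assumed valid
    MIN_TTC_S = 1.5          # minimum acceptable time-to-collision

    def select_controller(obs: Obstacle) -> str:
        ttc = obs.distance_m / obs.closing_mps if obs.closing_mps > 0 else float("inf")
        if obs.closing_mps <= MAX_CLOSING_MPS and ttc >= MIN_TTC_S:
            return "nominal"         # keep tracking the planned trajectory
        return "emergency_brake"     # maximal-deceleration fallback

    # A child stepping out 3 m ahead while closing at 4 m/s violates the envelope:
    print(select_controller(Obstacle(distance_m=3.0, closing_mps=4.0)))  # emergency_brake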
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
From a purely stats POV, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously for the car not to brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms, assuming uncorrelated errors. This is why I love the concept of lidar and optical together.
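A quick sketch of that arithmetic, assuming each detection mechanism misses independently with probability p (the 1% miss rate is purely illustrative):

    # Probability that ALL mechanisms miss, under the uncorrelated-errors
    # assumption: p ** n. Braking when ANY one flips drives false negatives
    # down exponentially, at the cost of more false-positive braking.
    p = 0.01                 # illustrative per-mechanism miss rate
    for n in (1, 2, 3):      # e.g. camera; camera+lidar; camera+lidar+radar
        print(f"{n} independent mechanism(s): miss rate {p ** n:.0e}")
    # -> 1e-02, 1e-04, 1e-06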
The true self-driving trolley problem. How many rear-end collisions and riders' annoyance caused by phantom braking a manufacturer (or a society) is going to tolerate to save one child per N million miles?
An uncorrelated approach improves sensitivity at the cost of specificity. Early sensor fusion might improve both (maybe at the cost of somewhat lower sensitivity).
With above-average human reflexes, the kid would have been hit at 14mph instead of 6mph.
About 5x more kinetic energy.
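That follows from kinetic energy scaling with the square of speed; the mass cancels in the ratio:

    # KE = 0.5 * m * v**2, so the ratio depends only on the speeds.
    v_human, v_waymo = 14, 6            # impact speeds in mph
    print((v_human / v_waymo) ** 2)     # -> ~5.4x the kinetic energy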
Yeah, if a human made the same mistakes as the Waymo driving too fast near the school, then they would have hurt the kid much worse than the Waymo did.
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
But there may be a better solution...
But would a human be driving at 17 in a school zone during drop-off hours? I'd argue a human may be slower exactly because of this scenario.
4 replies →
Kinetic energy is a bad metric. Acceleration is what splats people.
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, hence why nobody ever uses it.
The Waymo Driver tech is impressive. That said, an experienced driver might have recognized the pattern where a big stopped vehicle occludes part of the road, leading to exactly this kind of situation, and might have stopped or slowed almost to a halt before passing. The Waymo Driver reacts faster but is not able to predict such scenarios by filling in the gaps, simulating the world to inform decisions. Chapeau to Waymo anyways.
There have been many instances of Waymo preventing a collision by predicting pedestrians emerging from occlusion. This isn’t new information at all for them. Some accidents are simply physically impossible to prevent. I don’t know for sure if this one was one of those, but I’m fairly confident it couldn’t have been from prediction failure.
See past examples:
https://youtube.com/watch?v=hubWIuuz-e4 — first save is a child emerging from a parked car. Notice how Waymo slows down preemptively before the child starts moving.
https://www.reddit.com/r/waymo/s/ivQPuExwNW — detects foot movement from under the bus.
https://www.reddit.com/r/waymo/s/LURJ8isQJ6 — stops for dogs and children running onto the street at night.
This one should have been prevented because the Waymo should have been driving at max 10mph
I think this is definitely an improvement to consider, but when comparing, I think the big numbers, i.e. the statistics, are the only thing that matters. Some humans could detect the pattern and come to a full halt; another human driver could be speeding while texting.
Absent more precise information, this is a statistical negative mark for Waymo, putting their child pedestrian injury rate at ~2-4x the US human average.
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers. Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for and Waymo does not, causing a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...
[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...
[5] https://waymo.com/safety/impact/
I don't think this comparison is meaningful given the sample size of 1 and the differences between your datasets. The standard error margins from the small sample size alone are so large that you could not reasonably claim humans are safer (the 95% CI for Waymo is about 1 per 20 million miles to 1 per 8 billion miles). Then there are the dataset differences:
1. The NHTSA data is based on police-reported crash data, which reports far fewer injuries than the CDC reports based on ED visits. The child in this case appeared mostly unharmed and situations like this would likely not be counted in the NHTSA data.
2. Waymo taxis operate primarily in densely populated urban environments, while human driver mileage includes highways and rural roads where you're much less likely to collide with pedestrians per mile driven.
Waymo's 90% crash reduction claim is at least an apples-to-apples comparison.
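For anyone who wants to reproduce an interval like that, here is a sketch of the exact (Garwood) Poisson confidence interval for one observed event; the mileage figure is an assumption, not verified data:

    # Exact 95% Poisson CI for the injury rate given k = 1 observed event.
    from scipy.stats import chi2

    k = 1            # observed child pedestrian injuries
    miles = 150e6    # assumed Waymo autonomous miles (midpoint of 100-200M)

    lower = chi2.ppf(0.025, 2 * k) / 2           # ~0.0253 expected events
    upper = chi2.ppf(0.975, 2 * (k + 1)) / 2     # ~5.57 expected events

    print(f"1 injury per {miles / upper / 1e6:.0f}M to {miles / lower / 1e6:.0f}M miles")
    # -> roughly 1 per ~27M to 1 per ~5,900M miles: far too wide to
    #    distinguish Waymo's rate from the human average.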
Do we even know that the child was injured? All I've seen anyone officially claim is that the Waymo made contact, the kid fell over, then stood up and walked to the side of the road. Assuming the Waymo was still braking hard, 6mph means it was about 1/4s and about 30cm from reaching a full stop, so it could be a very minor incident we're talking about here.
I'm not aware of any statistics for how often children come into contact with human-driven cars.
I don't think I'd want to take much from such a statistical result yet. A sample size of 1 accident just isn't enough information to get a real rate from, not that I want to see more collisions with children. Though this is also muddied by the fact that Waymo will most likely adjust their software to make this less likely, and we won't know exactly how or how many miles each version has. I'd also like to see the data for human incidents over just the temperate suburban areas like Waymo operates in.
> child pedestrian injury rate at ~2-4x the US human average.
If this incident had happened with a human driven vehicle would it even have been reported?
I don't know exactly what a 6 mph collision looks like, but I think it's likely the child had nothing more than some bruises, and if a human had done it they would have just said sorry, made sure the kid was OK, and left.
The only question I have is whether the speed it was going was situationally appropriate and whether we’d expect a human to be considered reckless under the same circumstances. 17mph sounds pretty slow but it really depends on context.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post.
The issue is that I don't trust a private company's word. You can't even trust the president of the US government nowadays... release the video footage or get lost.
It's interesting how polarized this comments section is. Lots of people claiming a human driver would definitely have been driving slower. Lots of people claiming statistics show that human drivers do worse in this scenario in aggregate. Of course, neither side presents convincing evidence.
And a truly disappointing number of people just accepting company PR as a complete account.
I don't like even the very idea of self-driving cars, but based on the description of the accident, I think the machine passed this one with flying colors.
Oddly I cannot decide if this is cause for damnation or celebration
Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
> Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.
> Waymo is held to a significantly higher standard than human drivers.
They have to be, as a machine cannot be held accountable for a decision.
9 replies →
This is sad, but it unfortunately probably happens more frequently with human drivers and people walking out into traffic, and you never hear about it.
Straw man argument.
Basically, Waymo just prevented a kid's potential death.
Had any other car been there, probably including Tesla, the poor kid would have been hit with 4-10x more force.
> any other car been there, probably including Tesla
Cheap shots. If this was Tesla there would be live media coverage across every news outlet around the world and congressmen racing to start investigation.
Look at any thread where Tesla is mentioned and how many Waymo simps are mansplaining lidar.
You just invented a hypothetical situation in your head then drew conclusions from it. In my version, the other car misses the kid entirely.
Yeah, but Tesla has a proven bad safety record. Waymo doesn't and the GP comment is alluding to that
1 reply →
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took care of all the things a human driver would have been expected to do in the same situation. Could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm just going to ignore, as no-signal, Waymo's models that say an attentive human driver would have been worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
It's hard to imagine how any driver could have reacted better in this situation.
The argument that questions "would a human be driving 17 mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones, and human drivers routinely drive above 17 mph (in some cases, over the typical 20 mph or 25 mph legal limit). It feels like, in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
Who is legally responsible in case a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In the case of Waymo?
When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge in consequence. Part of Waymo's research is to show that even injury accidents are rarely reported to the police.
Are you thinking of civil liability or criminal liability?
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.
Waymo failed to stop and hit a child. A normal person would drive carefully around blind spots. I wonder what the comments would be if a Tesla hit a child.
> A normal person would drive carefully around blind spots.
I can't tell if you are using sarcasm here or are serious. I guess it depends on your definition of normal person (obviously not average, but an idealized driver maybe?).
From the Waymo blog...
> The Waymo Driver braked hard...
By Waymo Driver, they don't mean a human, do they?
No, this is the term they use to refer to their package of sensors, compute, and software.
If what Waymo wrote is true, this sounds more like the kid's fault, or the guardian's.
Seems the robotaxi saved a life.
related: https://www.cbsnews.com/news/ntsb-investigation-waymo-robota...
> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.
It is more complicated than that. It depends on the size of the object and many other factors.
The object could be a paper bag flying in the wind, or leaves falling from the tree.
You're right: a quick search shows that pedestrian fatalities are 43% higher on Halloween.
That's probably more a function of more people being in the road than people not understanding what object they're about to hit.
2 replies →
Lidar would pick up a moving object in 3D so unlikely to just keep going.
"Oh that obstructing object doesn't look like a child? Gun it, YOLO." Lmao.
I suspect the cars are trying to avoid running into anything, as that's generally considered bad.
I know submissions are not meant to contain modifications to article titles, but would it be so bad to have added "at 6mph" and/or "minor injuries" to the title?
I don't disagree with you but unfortunately I needed to keep from editorializing and I was restricted by a strict title length limit.
Will post it here:
> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety
It's a child now. All I wanna ask is: what should happen so they stop killing pets and people?
The real but contentious answer is to change our street and urban design. You can only do so much to make a giant metal machine safe for children and small animals to be struck by. Reducing the frequency of cars and pedestrians occupying the same space will go further than trying to engineer the equivalent of a pool that is impossible to drown in.
Do you think that a company that operates autonomous vehicles will support legislation that makes it easier and safer to move around on foot without getting hit by a car? Or will they lobby for car-centric urban design, like many many companies before them?
1 reply →
Laws broken: 0
Nothing to see here.
hmm idk how i feel about taking one on the freeway anymore.
Amazing response to this situation.
Great job, Waymo, for maybe hitting a little kid less than your study assumes a human would have! Is that study legit? Who cares, we trust you!
If this had been Tesla, HN would have crashed from all the dunking.
OK, it's like this! If I hit a child in a school district, I lose my licence for many years, and if I don't or can't show remorse, it could be longer; I pay fines, etc. Therefore Waymo must have its algorithm terminated, i.e. totally destroyed, all the hardware smashed, and they never get to try again with any derivative of this technology, as there is no reasonable, understandable path toward repentance and rehabilitation; it is literally a monster running over children. Or was it carrying an ICE team? Then never mind.
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
> Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
I can see that, prioritize obstacle predictability over transit time. A school zone at certain times of day is very unpredictable with respect to obstacles but a more car congested area would be easier to navigate but slower. Same goes for residential areas during Halloween.
Waymo will 100% go down a route human drivers avoid because it will have "less traffic".
>To put this in perspective, our peer-reviewed model shows that a fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph. This significant reduction in impact speed and severity is a demonstration of the material safety benefit of the Waymo Driver.
Our car hits better is a win, I guess?
Glad the child is okay.
When is enough, enough? Software devs working on autonomous driving: look in your soul and update your resume.
The statistically relevant question is: How many human drivers have hit children near elementary schools, since Waymo's last accident?
If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.
A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.
Is Waymo statistically safer? (spoiler: yes)
Please read this article: https://www.bloomberg.com/news/features/2026-01-06/are-auton...
Spoiler: we definitely don't know yet whether Waymo is statistically safer
This is wrong, although something quite like it is right.
Imagine that there are only 10 Waymo journeys per year, and every year one of them hits a child near an elementary school, while there are 1000000 non-Waymo journeys per year, and every year two of them hit children near elementary schools. In this scenario Waymo has half as many accidents but is clearly much more dangerous.
Here in the real world, obviously the figures aren't anywhere near so extreme, but it's still the case that the great majority of cars on the road are not Waymos, so after counting how many human drivers have had similar accidents you need to scale that figure in proportion to the ratio of human to Waymo car-miles.
(Also, you need to consider the severity of the accidents. That comparison probably favours Waymo; at any rate, they're arguing that it does in this case, that a human driver in the same situation would have hit the child at a much higher and hence more damaging speed.)
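To put numbers on that toy example:

    # Raw accident counts vs. per-journey rates in the toy scenario above.
    waymo_accidents, waymo_journeys = 1, 10
    human_accidents, human_journeys = 2, 1_000_000

    waymo_rate = waymo_accidents / waymo_journeys    # 0.1 per journey
    human_rate = human_accidents / human_journeys    # 0.000002 per journey
    print(f"Waymo rate is {waymo_rate / human_rate:,.0f}x the human rate")
    # -> 50,000x: half the accidents in absolute terms, yet far more
    #    dangerous per journey.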
So confident yet so wrong.
Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.
But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail in a way I'd be at fault for. In other words, I trust my own driving more than FSD (I could be right or wrong, but I think most people feel the same way).
I believe Mercedes is the only consumer car manufacturer that is advertising an SAE Level 3 system. My understanding is that L3 is where the manufacturer says you can take your attention off the road while the system is active, so they're assuming liability.
https://www.mbusa.com/en/owners/manuals/drive-pilot
A human driver would most likely have killed this child. That's what should be on the ledger.
That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. IIHS says [1], in an article about other things:
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized; although it is valuable to consider the situation (when we have sufficient information) and validate Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate, given the context of a busy school dropoff: many human drivers are extra cautious in that context and may not have reached that speed; depending on the end-to-end route, some human drivers would have avoided the street with the school altogether based on the time of day, etc. It certainly seems like a good result for the premise (a child unexpectedly appears from between large parked vehicles), but maybe there should have been an expectation.
[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...
There's a 50/50 chance that a distracted driver wouldn't have slowed at all and run the child over.
> To estimate injury risk at different impact speeds, IIHS researchers examined 202 crashes involving pedestrians ages 16 or older
A child is probably more likely than an adult to die in a collision at the same speed.
How many human drivers do under 20mph, like ever?
3 replies →
For me, the policy question I want answered: if this were a human driver, we would have a clear person to sue for liability and damages. For a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring suit, whereas a private driver would lean on their insurance.
So you're worried that instead of facing off against an insurance company, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.
Is there actually any difference? I'd have though that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.
Personally I'm a lot more interested in kids not dying than in making income for injury lawyers. But that's just me.
4 replies →
Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, a poor financial feedback loop. No real skin in the game.
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, an effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.
8 replies →
No, "the ledger" should record actual facts, and not whatever fictional alternatives we imagine.
Fact: This child's life was saved by the car being driven by a computer program instead of a human.
5 replies →
Would have. Could have. Should have.
Most humans would be halfway into the other lane after seeing kids near the street.
Apologists see something different than I do.
Perception.
Disagree; most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would have been driving very carefully near blocked sight lines.
Better reporting would have asked real people for the name of the elementary school, so we could see some pictures of the area. The link to NHTSA doesn't point to the investigation, but it can be found under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxes/etc. than humans when riding a bicycle.
We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch and there's no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it repeatedly as I rode the short distance in the lane before the bike lane reopened.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like i describe above. It also disregards that every time i see an automated taxi it seems to drive on the cautious side already.
Give me the unemotional, infinitely patient, drives-very-much-on-the-cautious-side automatic taxi over humans any day.
Dog. Bites. Man.
Couldn't be any more callous and clinical. This press release alone makes me want to not use their service.
*“Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.*
Alternate headline: Waymo saves child's life
In this timeline, we want our headlines to somehow reflect the contents of the story.
Saved child from what? From themselves. You can't take full credit for partially solving a problem that you, yourself, created.
can we just get waymo tech in buses?
Big vehicles that demand respect and aren't expected to turn on a dime, known stops.
Q: Why did the self-driving car cross the road?
A: It thought it saw a child on the other side.
That's Tesla. Waymo seems mostly ok.
Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.
I’m actually pretty surprised Waymo as a general rule doesn’t completely avoid driving in school zones unless absolutely unavoidable.
Any accident is bad. But accidents involving children are especially bad.
That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.
It might not be possible for a lot of places — I don’t really know.
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
2 replies →
Well, I'm a human and I figure out how to avoid school zones.
Waymo is a subsidiary of Alphabet Inc. the same parent company as Google LLC
It was formerly known as the Google Self Driving Car Project
And before the argument "self driving is acceptable so long as the accident/risk is lower than with human drivers" comes up, can I please get this out of the way: no, it's not. Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or have personal liability. We accept the risks with humans because those humans accept risk. Self driving abstracts the legal risk, and removes the physical risk.
I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar systems gets a lot of press. This was a major incident with a happy ending. Those are quite rare; the lethal ones even rarer.
As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues but is also remarkable for the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. Normal statistics in the US are measured in ~17 deaths per 100K drivers per year. 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.
Liability weighs heavier for companies than safety. It's fine by them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data, at the price of more deaths, by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.
> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.
I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general. Naturally, robotaxis will benefit from better infra outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.
6 replies →
> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
Do you mean like this?
https://waymo.com/safety/impact/
Yes but ideally from some objective source.
3 replies →
If Waymo is to be believed, they hit the kid at 6 mph and estimated that a human driver at full attention would have hit the kid at 14 mph. The Waymo was traveling at 17 mph. The situation of "kid running out between cars" will likely never be solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
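For intuition on the physical cap, here is a minimal sketch assuming zero reaction latency and constant hard braking at ~0.9g (both assumptions; real values vary with tires and surface):

    # Impact speed vs. distance available to brake, with instant reaction
    # and constant deceleration: v**2 = v0**2 - 2*a*d. Inputs are illustrative.
    MPH_TO_MPS = 0.44704
    decel = 0.9 * 9.81                   # ~hard braking on dry pavement, m/s^2

    def impact_speed_mph(v0_mph: float, distance_m: float) -> float:
        v0 = v0_mph * MPH_TO_MPS
        v_sq = v0 * v0 - 2 * decel * distance_m
        return max(v_sq, 0.0) ** 0.5 / MPH_TO_MPS

    for d in (1.0, 2.0, 3.0, 4.0):       # meters from detection point to child
        print(f"{d:.0f} m available: impact at {impact_speed_mph(17, d):.1f} mph")
    # Even with zero latency, a child appearing ~2-3 m ahead of a 17 mph car
    # is struck at several mph; only a lower travel speed changes that.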
That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
5 replies →
Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.
> The situation of "kid running out between cars" will likely never be solved
Nuanced disagreement (I agree with your physics), in that an element of the issue is design. Kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.
One simple change could be adding a chain-link fence or other boundary between parked cars and moving traffic, increasing both visibility and reaction time.
Second-order benefit: More Waymos = fewer parked cars
>We accept the risks with humans because those humans accept risk.
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
I think a very good reason to want to know who's liable is that Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
Even in terms of plain results, I'd say the consequences-based system isn't working so well if it's producing 40,000 US deaths annually.
Yes
Orders of magnitude? Something like 100 people die on US roads each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?
"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.
Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.
I generally agree the bar is high.
But human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist, and even killing someone can carry minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?
> Self driving needs to be orders of magnitude safer for us to acknowledge it
All data indicates that Waymo is ~10x safer so far.
"90% Fewer serious injury or worse crashes"
https://waymo.com/safety/impact/
> The vehicle remained stopped, moved to the side of the road
How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.
My reading is that they mean it stopped the progression of the journey, rather than that it made no movement whatsoever.
I agree, it’s poorly worded but I think that’s what they mean.
I also assume a human took over (called the police, moved the car, etc) once it hit the kid.
They mean the vehicle didn't drive away. It moved to the side of the road and then stopped and waited.
So many tech lovers defending Waymo.
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or better than a drunk driver, does not make it less bad when I hit someone. A car is a giant weapon; if you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate, probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that Waymo gets a pass.
Waymos have definitely become more aggressive as they've become successful. They drive the speed limit down my local street, and when I see them I think, wtf, that's too fast. It's one thing when there are no cars around, but with cars or people around, the appropriate speed changes. Let's audit Waymo. They certainly have an aggressiveness setting; let's see the data on how it's changing, and how safety buffers have shrunk as that setting has been dialed up.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.