I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. It is not like a compression algorithm, where results are easily testable and verifiable. If Waymos start behaving oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, into pedestrians, or into other vehicles.
I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors):
https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...
And we haven't yet reached the point where people routinely walk straight into the paths of these cars, either obliviously or defiantly. https://www.youtube.com/shorts/nVEDebSuEUs
There are already anecdotes of people aggressively jaywalking in front of a Waymo because they know it will stop, and of people driving more aggressively around Waymos because they know it will always defer to them.
Has this been true of other Google products? They never get worse?