Comment by jerlam

18 hours ago

I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. It is not like a compression algorithm, where results are easily testable and verifiable. If Waymos start behaving oddly in school zones, that may cause secondary accidents: drivers attempting to go around the "broken" Waymo could crash into it, into pedestrians, or into other vehicles.
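
To make the contrast concrete, here is a minimal sketch of the kind of self-contained round-trip test a compression algorithm admits (Python, using zlib as a stand-in codec; the harness is my own illustration, not any particular project's test suite):

    import random
    import zlib

    def round_trip_ok(data: bytes) -> bool:
        # A lossless codec is correct if decompress(compress(x)) == x
        # for every input; the property is checkable in isolation.
        return zlib.decompress(zlib.compress(data)) == data

    # Verify the property on many random payloads of varying sizes.
    # No other actors are involved in the outcome.
    for _ in range(1000):
        payload = random.randbytes(random.randint(0, 4096))
        assert round_trip_ok(payload)
    print("all round trips passed")

Nothing comparable exists for self-driving: there is no closed-form oracle for "handled the school zone correctly," and the inputs include other drivers' reactions to the car's behavior.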

I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors):

https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...