Comment by Initial_BP

3 years ago

I totally agree with you that Tesla FSD seems more likely to have a serious crash, but if actual fully autonomous cars are the end goal, then the behavior of learning/beta models doesn't really matter except to the extent that it lets them get to the end goal (for the sake of this argument).

All fully autonomous cars are in a different legal situation than Tesla. Tesla sells Joe Shmoe a car and then tells him he can run FSD, but he's responsible and has to remain attentive. Then they get info about every disengagement and (mostly) avoid legal responsibility for accidents in many cases.

Waymo is fully responsible for every accident, etc., so they HAVE to proceed more cautiously or they'll lose the ability to run their cars. As someone else pointed out, they often operate only in very specific areas, and often even on specific streets within a geofence. So while on the surface Waymo may have full self-driving operating more effectively with fewer problems, they're doing so in a much more controlled environment and not getting the variety of data that Tesla has from cars disengaging literally anywhere in the US.

> but if actual fully autonomous cars are the end goal, then the behavior of learning/beta models doesn't really matter

i didn't sign up to be killed by some idiot tech bro testing a class project where they plumbed alexnet into the steering wheel of a 2000kg vehicle and took a couple of steps downhill

the streets are already dangerous enough for pedestrians