Comment by charcircuit
15 hours ago
So if the law says that a human in the car has to be responsible, then it is impossible for a self-driving car to exist. I do not think tying the definition to legal liability is right.
I don't see why self-driving couldn't just be steering and pedals. It would be pretty limiting, but the car would at least be able to drive itself in a circle.
No. The law allows passengers in self-driving taxis not to be responsible, including taxis operated by Tesla.
Here Tesla makes it clear to people who turn on “Full Self-Driving” that the driver must maintain supervision, and thus responsibility. As such, it's Tesla's choice that they aren't selling self-driving cars.
It wouldn’t be such a big deal if some random engineer said they’d eventually do X, but when it’s the CEO repeatedly saying the same thing across many public appearances, that’s as binding as a Super Bowl advertisement.
It's not about legal liability, though I admit the example of tickets was informal and confusing.
Let me state it a little more clearly: the driver is the component in the vehicle system design that's ultimately responsible for ensuring the safety invariants are maintained. In a normal car, that's clearly the human in the driver's seat. Less obviously, the same is true of a Tesla with FSD. If we move that human to a remote control room, they're still the driver even if they're not physically in the vehicle.
It's only when the computer itself becomes responsible for maintaining system safety that it becomes the driver. Waymo is an example. Waymo also employs people in a remote call center, but those humans aren't responsible for safety and hence aren't drivers. But a Waymo employee out on the street using the sub-5 mph remote control mode is driving the car, because they've taken on the safety role again.
Legal liability can follow from this, but it's a much more complicated classification, one I don't expect will ever have a singular answer, or even a knowable answer in many cases.