Comment by AlotOfReading

17 hours ago

This is a common misconception. People tend to think driving is controlling the steering and pedals, so if FSD does those things it must be driving.

It's not. Driving is whatever has ultimate responsibility for the vehicle and its occupants. If a cop pulls you over while FSD is enabled, it's not Tesla who's paying the ticket. If FSD has an issue, you're the driver who has to respond.

Think of FSD as a very nice cruise control. You're still driving, even if you aren't touching the wheel.

Sort of like how programming isn't the same as writing code — it also involves a bunch of other things, like all the design and planning work.

It's a common misconception because the thing is called "full self driving."

  • technically it was called "Full Self Driving (BETA)" and then "Full Self Driving (Supervised)"

The bottom line is, no one else is even remotely close to that experience for the driver, liable or not. Probably with good reason, as every other car company actually listens to its lawyers.

So if the law says that a human in the car has to be responsible, then it is impossible for a self-driving car to exist. I do not think tying the definition to legal liability is right.

I don't see why self-driving couldn't just be steering and pedals. It would be pretty limiting, but the car would at least be able to drive itself in a circle.

  • No. The law allows passengers in a self-driving taxi not to be responsible, including taxis operated by Tesla.

    Here Tesla makes it clear to people who turn on “Full self driving” that the driver must maintain supervision and thus responsibility. As such, it’s Tesla’s choice that they aren’t selling self-driving cars.

    It wouldn’t be such a big deal if some random engineer said they’d eventually do X, but when it’s the CEO repeatedly saying the same thing across many public appearances, that’s as binding as a Super Bowl advertisement.

  • It's not about legal liability, though I admit the example of tickets was informal and confusing.

    Let me state it a little more clearly: the driver is the component in the vehicle system design that's ultimately responsible for ensuring the safety invariants are maintained. In a normal car, that's clearly the human in the driver's seat. Less obviously, the same is true of a Tesla with FSD. If we move that human to a remote control room, they're still the driver even if they're not physically in the vehicle.

    It's only when the computer itself becomes responsible for maintaining system safety that it becomes the driver. Waymo is an example. Waymo also employs people in a remote call center, but those humans aren't responsible for safety and hence aren't drivers. But a Waymo employee out on the street using their <5mph remote control mode is driving it, because they've taken on the safety role again.

    Legal liability can follow from this, but it's a much more complicated classification that I don't expect will ever have a singular answer, or even a knowable answer in many cases.