
Comment by charcircuit

17 hours ago

From what I've seen on YouTube, the cars do drive themselves. This seems more like the type of thing with AI where people move the goalposts of what AI means. Just because a car did not slow down in a school zone, that doesn't mean that the car wasn't driving itself.

This is a common misconception. People tend to think driving is controlling the steering and pedals, so if FSD does those things it must be driving.

It's not. Driving is whatever has ultimate responsibility for the vehicle and its occupants. If a cop pulls you over while FSD is enabled, it's not Tesla who's paying the ticket. If FSD has an issue, you're the driver who has to respond.

Think of FSD as a very nice cruise control. You're still driving, even if you aren't touching the wheel.

  • Sort of how programming isn't the same as writing code — it also involves a bunch of other things, like all the design and planning work.

  • It's a common misconception because the thing is called "full self driving."

    • technically it was called "Full Self Driving (BETA)" and then "Full Self Driving (Supervised)"

  • The bottom line is, no one else is even remotely close to that experience for the driver, liable or not. Probably with good reason, as every other car company actually listens to their lawyers.

  • So if the law says that a human in the car has to be responsible then it is impossible for a self driving car to exist. I do not think tying the definition to legal liability is right.

    I don't see why self driving couldn't just be steering and pedals. It would be pretty limiting but it would be able to drive itself in a circle at least.

    • No. The law allows passengers in self-driving taxis not to be responsible — including taxis operated by Tesla.

      Here Tesla makes it clear to people who turn on “Full Self Driving” that the driver must maintain supervision, and thus responsibility. As such, it’s Tesla’s choice that they aren’t selling self-driving cars.

      It wouldn’t be such a big deal if some random engineer said they’d eventually do X, but when it’s the CEO repeatedly saying the same across many public appearances that’s as binding as a Super Bowl advertisement.

    • It's not about legal liability, though I admit the example of tickets was informal and confusing.

      Let me state it a little more clearly: the driver is the component in the vehicle system design that's ultimately responsible for ensuring the safety invariants are maintained. In a normal car, that's clearly the human in the driver's seat. Less obviously, the same is true of a Tesla with FSD. If we move that human to a remote control room, they're still the driver even if they're not physically in the vehicle.

      It's only when the computer itself becomes responsible for maintaining system safety that it becomes the driver. Waymo is an example. Waymo also employs people in a remote call center, but those humans aren't responsible for safety and hence aren't drivers. But a Waymo employee out on the street using their <5mph remote control mode is driving it, because they've taken on the safety role again.

      Legal liability can follow from this, but it's a much more complicated classification that I don't expect to ever have a singular answer, or even a knowable answer in many cases.

By that logic it’s ok if the car slams itself against a concrete wall - just because it failed to stop in time doesn’t mean it wasn’t driving itself.

Self driving cars are supposed to obey the same rules as human drivers.

  • Well ... yes. By that logic it is the case. It applies to humans too - if a human slams their car into a concrete wall then the human was still driving the car. They did a bad job of it, but they were in fact driving.

    A car being driven autonomously doesn't imply much about the quality of that driving. They're still going to make bad decisions and have accidents, just like humans do (a friend of mine died slamming their car into a tree). There is probably some minimum where we'd say that it isn't really driving because it can't do anything right, but modern self driving systems are past that.

    • > A car being driven autonomously doesn't imply much about the quality of that driving

      Only that’s not what they’re selling us - they say autonomous cars are safer than humans, fewer accidents per mile driven, faster reaction times yadda yadda. I think this implies quality and not respecting speed limits is not something that sounds very high-quality. At least not while they have to share the road with humans.


  • Both statements can be true. Human vs. self-driving is a different classification from good vs. bad driving. Humans can slam into a wall too.

>the cars do drive themselves

Those are cars with the "HW4" FSD hardware, which was released in Mar 2023.

There were a lot of cars sold with "HW2" (nVidia-based) and "HW3" (Tesla silicon). Those cars apparently cannot be upgraded to HW4 because of physical size differences between the units. HW2 was able to be upgraded to HW3.

Those videos you are talking about seeing do not represent the FSD experience for all, or possibly even most, Tesla FSD vehicles in the wild.

By this definition, putting a brick on the accelerator and tying the steering wheel in place is self-driving.

It's fairly simple. Tesla says I have to supervise, and they are not liable for anything the car does wrong. It is not full self-driving any more than a 40-year-old car with cruise control is.

AI never had goalposts; it means programming meant to mimic human behavior, like AI opponents in old video games.

Tesla FSD won't be level 5 until Tesla has liability for any crashes it causes the way Waymo does.

Elon Musk's claims included (exact quotes; these posts are still on X):

Jan 10, 2016: In ~2 years, summon should work anywhere connected by land & not blocked by borders, eg you're in LA and the car is in NY

Jul 16, 2019: If we make all cars with FSD package self-driving, as planned, any such Tesla should be worth $100k to $200k, as utility increases from ~12 hours/week to ~60 hours/week

These aren't goalposts moved by antis; these are the expectations set by Elon Musk himself when advertising his products.

Those YouTubers are all there to make Tesla look good. It’s a grift. The ones that are honest and show the bad side get kicked out of the Tesla club fast and dogpiled.

Also a school zone is one of the most basic things the car should be able to handle. If it can’t do that, it’s not ready for public use.

  • >Also a school zone is one of the most basic things the car should be able to handle. If it can’t do that, it’s not ready for public use.

    Humans don't always follow the law driving through school zones. And when humans speed through a school zone, the human is definitely driving the car. Are we ready to let humans drive on public roads?

    The argument has to go into the magnitude of the problem to get anywhere meaningful.

    • We require individual humans to be licensed to drive and can revoke that license as needed.

See, that's really the best argument for this. It can drive itself the same way I can fly an Airbus A321. You can't sue me because I didn't land the plane "intact".