Comment by kjkjadksj
19 hours ago
Why does tesla pretend to be autonomous? My friends with tesla fsd use it fully autonomously. It even finds a spot and parks for them.
The company selling the car is adamant that none of their cars are fully autonomous in every single legal or regulatory context. Any accident caused by the car is 100% the fault of the driver. But the company markets their cars as fully autonomous. That's pretty much the definition of pretending to be autonomous.
It's a level 2 system, it can't be operated unattended. Your friends are risking their lives as several people (now dead) have found out.
I think we are at the point where the data suggests they bear more risk when they drive the Tesla themselves. See the Bloomberg report on accidents per mile.
Wikipedia lists two fatal crashes involving Tesla FSD and one involving Waymo.
> one involving Waymo
Are you referring to the one where a Waymo, and several other cars, were stopped at a traffic light, when another car (incidentally, a Tesla) barreled into the traffic stack at 90 MPH, killing several people?
Because I am not aware of any other fatal accidents where a Waymo was even slightly involved. I think it's, at best, misleading to refer to that in the same sentence as FSD-involved fatalities where FSD was the direct cause.
The key difference is that the Teslas killed their passengers, the Waymo hit someone outside the car (and it wasn't the Waymo's fault, it was hit by another car).
4 replies →
Wikipedia lists at least 28 fatal crashes involving Tesla FSD:
https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...
2 replies →
There is no world in which New York lets Teslas drive autonomously in the next decade. Had they not been grandfathered in in California, I doubt politics there would have allowed it either.
Sources? Haven't heard of deaths except total idiots sleeping at 80mph.
If the car needs any occupant to be awake, it is not an autonomous vehicle.
Some of the best marketing ever behind convincing people that the word "autonomous" does not mean what we all know it means.
Are you trying to draw a distinction between sleeping versus looking away from the road and not paying attention to it? I expect both situations to have similar results with similar levels of danger in a Tesla, and the latter is the bare minimum for autonomous/unattended.
You don't need to cite accidents when you're stating the true fact that the system is not approved for unattended use.
It's just pretending to do that, seemingly?
If I can't use the center console to pick a song on Spotify without the car yelling at me to watch the road, it's not autonomous.
No, rather, if the manufacturer of the self-driving software doesn't take full legal liability for actions taken by the car, then it's not autonomous. This is the one and only criterion for a self-driving vehicle.
Sounds like we're in agreement then.
Right now, Tesla skirts legal liability by saying that the driver needs to watch the road and be ready to take control, and then uses measures like detecting your hand on the wheel and tracking your gaze to make sure you're watching the road. If a car driving on FSD crashes, Tesla will say it's the driver's fault for not monitoring the drive.
1 reply →
That is for the lawyers, not indicative of capability.
I’ve taken a nap in my Waymos. One can’t in a Tesla. That is a difference in capability.