Comment by n0tth3dro1ds

3 years ago

A software system sold as “self-driving” with the lawyer-derived safety valve of an attention warning is not safe. All automotive companies have done takeover studies for ADAS systems. Spoiler: there’s no such thing as a timely takeover at highway speeds. The more a system drives itself, the more drivers are lulled into a state of inattention (or, in the case of many Tesla drivers, literal sleep). This doesn’t work at 70 mph. It’s false advertising on top of a highly unsafe apparatus.

Walter Huang’s Tesla drove into a barrier on the 101 not far from Tesla’s Palo Alto facility. It drove into a barrier because of its naive vision-only system paired with the constantly changing and faded lane lines on the 101 due to construction. If Teslas can’t drive the 101 without mistakes, they can’t drive anywhere. It’s literally right down the road from their autonomous driving team’s office.

How do you see the path from no autonomy, to partial autonomy, to full autonomy?

What if it’s a net positive (a reduction in deaths) and delaying the progress causes more deaths? Who is accountable for those deaths?

The Tesla system was approved by the relevant authorities, or people would not be able to drive them in the US. If the system was required by law to nag the driver and didn’t, then it is a breach of the law. If the system is imperfect, known to be so, and approved that way, I see no foul play.

If the law isn’t good, change the law.

  • >How do you see the path from no autonomy, to partial autonomy, to full autonomy?

    Honestly that isn't the question here and also not society's problem. If Tesla can't make the business model work, that is a Tesla problem.

    Plenty of other companies are making great progress on autonomous vehicles without passing a non-working system off as something that "drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot". It is more expensive that way and requires more upfront cost, but them's the breaks.

    >If the law isn’t good, change the law.

    This thread is literally about a deposition in a court case. It might not require any change of law. It might not be a criminal trial, but civil cases can be just as effective at getting bad actors and their faulty products out of the market.

    • You assume "bad actor". I think it’s a good actor that is going to save a lot of lives. But the media/government is going to use every accident to punish Tesla and slow them down, costing way more lives. Remember, the media is paid largely by all the other automakers, and Tesla spends zero on publicity, making it enemy no. 1 of the media. The same is true of the current government, which is heavily financed by the United Auto Workers.

      There is always a tradeoff between progress and safety. 46,000 people die in car accidents each year in the US. I hope that this also counts in the balance, not only those who ignore all the advice while using Autopilot.

  • >How do you see the path from no autonomy, to partial autonomy, to full autonomy?

    I don’t. Should you be able to sell an unsafe car just because you speculate about some non-existent alien technology that would make it safe if it existed? Why?

    > If the system is imperfect, known to be so and approved that way I see no foul play.

    So if NHTSA doesn’t force their hand then they have no obligation to make a safe driving system, even if they know it to be unsafe? I’m very glad that countless engineers at Volvo (3-point seatbelts) and GM (airbags) and others didn’t have that mindset.