Comment by alfor

3 years ago

This concerns a video from 2016, in a lawsuit about a crash in 2018 in which the driver was repeatedly told by the car to pay attention.

The driver knew about how this part of the road was incorrectly recognized by the software yet didn’t pay attention.

You get sued for the lives you didn’t save and get almost zero credit for the ones you did.

A software system sold as “self-driving” with the lawyer-derived safety valve of an attention warning is not safe. All automotive companies have done takeover studies for ADAS systems. Spoiler: there’s no such thing as a timely takeover at highway speeds. The more a system drives itself, the more drivers are lulled into a state of inattention (or in the case of many Tesla drivers, literal sleep). This doesn’t work at 70 mph. It’s false advertising on top of a highly unsafe apparatus.

Walter Huang’s Tesla drove into a barrier on the 101 not far from Tesla’s Palo Alto facility. It drove into a barrier because of its naive vision-only system paired with the constantly changing and faded lane lines on the 101 due to construction. If Teslas can’t drive the 101 without mistakes, they can’t drive anywhere. It’s literally right down the road from their autonomous driving team’s office.

  • How do you see the path from no autonomy, to partial autonomy, to full autonomy?

What if it’s a net positive (a reduction in deaths) and delaying the progress causes more deaths? Who is accountable for those deaths?

The Tesla system was approved by the relevant authorities, or they would not be able to drive them in the US. If the system was required by law to nag the driver and didn’t, then that is a breach of the law. If the system is imperfect, known to be so, and approved that way, I see no foul play.

    If the law isn’t good, change the law.

    • >How do you see the path from no autonomy, to partial autonomy, to full autonomy?

      Honestly that isn't the question here and also not society's problem. If Tesla can't make the business model work, that is a Tesla problem.

Plenty of other companies are making great progress on autonomous vehicles without passing a non-working system off as something that "drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot". It is more expensive that way and requires more upfront cost, but them's the breaks.

      >If the law isn’t good, change the law.

      This thread is literally about a deposition in a court case. It might not require any change of law. It might not be a criminal trial, but civil cases can be just as effective at getting bad actors and their faulty products out of the market.


    • >How do you see the path from no autonomy, to partial autonomy, to full autonomy?

      I don’t. Should you be able to sell an unsafe car just because you speculate about some non-existent alien technology that would make it safe if it existed? Why?

      > If the system is imperfect, know to be so and approved that way I see no foul play.

      So if NHTSA doesn’t force their hand then they have no obligation to make a safe driving system, even if they know it to be unsafe? I’m very glad that countless engineers at Volvo (3-point seatbelts) and GM (airbags) and others didn’t have that mindset.