Comment by scottbez1
16 hours ago
Yeah, Tesla gets to blame the “driver”, and has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible.
And the system is designed to set up drivers for failure.
An HCI challenge with mostly-autonomous systems is that operators lose situational awareness of the system, and when things go wrong you can easily get worse outcomes than if the system were fully manual with an engaged operator.
This is a well-known challenge in the nuclear energy and airline industries (see Air France 447): how do you keep operators fully engaged when they almost never need to intervene? Otherwise they're likely to be missing critical context and will make the wrong decision when it matters. These days you could probably argue the same is true of software engineers reviewing LLM-generated code that's often - but not always - correct.
> has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible
Really? That's crazy.