
Comment by jasoncartwright

17 hours ago

Sounds like it's neither self-driving nor autonomous if I'm on the hook when it goes wrong.

Yeah, Tesla gets to blame the “driver”, and has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible.

And the system is designed to set up drivers for failure.

An HCI challenge with mostly-autonomous systems is that operators lose situational awareness, and when things go wrong you can easily get worse outcomes than with a fully manual system and an engaged operator.

This is a well-known challenge in the nuclear energy sector and the airline industry (Air France 447): how do you keep operators fully engaged when they almost never need to intervene? Otherwise they're likely to miss critical context and make the wrong decisions. These days you could probably argue the same is true of software engineers reviewing LLM code that's often - but not always - correct.

  • > has a history of releasing partial and carefully curated subsets of data from crashes to try to shift as much blame onto the driver as possible

    Really? That's crazy.

It's neither self-driving nor autonomous, and eventually not even a car! (as Tesla slowly exits the car business). It will be 'insurance' on speculation as a service, as Tesla skyrockets to a $20T market cap. Tesla will successfully transition from a small-revenue company to a pre-revenue one: https://www.youtube.com/watch?v=SYJdKW-UnFQ

The last few years of Tesla 'growth' show how this transition is unfolding. S and X production is shut down; just a few more models to go.

  • I wonder if they will try to sell off the car business once they can hype up something else. It seems odd to just let the car business die.

Especially since they can push regressions over the air: you could be lulled into a sense of safety and robustness that isn't there, and then, bam, you pay the cost of the regressions, not Tesla.