Comment by tsimionescu
3 hours ago
All of your arguments are addressed in the article itself, and its conclusions still hold, based on the publicly available data.
The 3x figure in the title is based on a comparison of the Tesla reports with estimated average human driver miles without an incident, not based on police report data. The comparison with police-report data would lead to a 9x figure instead, which the article presents but quickly dismisses.
The denominator problem is made up. Tesla Robotaxi has only been launched in one location, Austin, and only since late June (the 28th, so at most a few days' discrepancy with "July"). So the crash data and the miles data can only refer to that same period. Furthermore, if the miles figure actually covers some additional length of time, the picture gets even worse for Tesla: the miles that genuinely correspond to those 9 incidents would be a smaller denominator than the published one.
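To make the denominator point concrete, here's a toy back-of-the-envelope sketch (every number is made up, purely to show the direction of the bias; the real figures would come from Tesla's disclosures):

    # Hypothetical illustration of the denominator argument.
    crashes = 9                  # incidents in the crash-reporting window
    reported_miles = 250_000     # published miles figure (made up)

    naive = reported_miles / crashes  # looks like ~27.8k miles per crash

    # If the miles figure covers a longer span than the crash data,
    # only part of it belongs in the denominator:
    overlap = 0.8                # assumed share of miles inside the window
    corrected = reported_miles * overlap / crashes  # ~22.2k, i.e. worse

Any mismatch between the two windows can only shrink the denominator that belongs to those 9 crashes, so it can only make Tesla's rate look worse, not better.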
The analysis indeed doesn't distinguish between types of accidents, but this is irrelevant: the human-driver estimates of miles driven without incident don't distinguish between types of incidents either, so the comparison is still apples to apples (unless you believe people intentionally tried to get the Tesla cars to crash, which makes little sense).
The comparison to Waymo is also done based on incidents reported by both companies under the same reporting requirements, to the same federal agency. The crash definitions and reporting practices are already harmonized, at least to a good extent, through this.
Overall there is no way to look at this data and draw a conclusion significantly different from the article's: Tesla is bad at autonomous driving, and has a long way to go until it can be considered safe on public roads. We should also remember that these robotaxis are not even fully autonomous: each car has a human safety monitor ready to step in and take control of the vehicle at any time to avoid incidents. So the real incident rate, if the safety monitors weren't there, would certainly be even worse than this.
I'd also mention that 5 months of data is not that small a sample size, despite your trying to make it sound so ("only 9 crashes").
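For anyone who wants to sanity-check how noisy 9 events actually is, here's a quick sketch of an exact 95% Poisson interval (the standard Garwood construction via chi-square quantiles):

    from scipy.stats import chi2

    k = 9  # observed crashes
    lo = chi2.ppf(0.025, 2 * k) / 2          # ~4.1
    hi = chi2.ppf(0.975, 2 * (k + 1)) / 2    # ~17.1
    print(f"95% CI for the true count: [{lo:.1f}, {hi:.1f}]")

Even at the lower bound (roughly 4.1/9 of the point estimate), a 9x-worse rate would still come out around 4x worse than humans. Small count, yes, but not so small that the conclusion flips.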
I agree with most of your points and your conclusion, but to be fair, OP was asserting that human drivers under-report incidents, which I believe: super minor bumps where the drivers get out, determine there's barely a scratch, and go on, or solo low-speed collisions with a garage wall or a tree.
I don’t think it invalidates the conclusion, but it seems like one fair point in an otherwise off-target defense.
Sure, but the 3x comparison is not based on reported incidents; it's based on estimates of incidents that actually occur. I think it's fair to assume such estimates draw on repair data and other market statistics that don't necessarily depend on reporting. We also have no a-priori reason to believe the Tesla reports include every single incident either, especially given their history with FSD incident disclosures.
"estimates" (with air quotes)
To add to this, more data from more regions means the estimate of average human miles without an incident is more accurate, simply because it is estimated from a larger sample, so more likely to be representative.
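That's just the usual 1/sqrt(n) behavior of sampling error; a quick illustration (the sample sizes are arbitrary):

    import math

    # The standard error of a sample mean shrinks like 1/sqrt(n).
    for n in (1_000, 100_000, 10_000_000):
        print(n, 1 / math.sqrt(n))

A nationwide miles-between-incident estimate is drawn from vastly more driver-miles than any single region, so its sampling error is correspondingly tiny.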
> The 3x figure in the title is based on a comparison of the Tesla reports with estimated average human driver miles without an incident, not based on police report data. The comparison with police-report data would lead to a 9x figure instead, which the article presents but quickly dismisses.
I think OP's point still stands here. Who would people report minor incidents to, other than the police, that would produce publicly available data? This number had to come from somewhere, and police reports are the only source that makes sense to me.
If I bump my car into a post, I'm not telling any government office about it.
I don't know, since they unfortunately don't cite a source for that number, but I can imagine some sources of data: insurers, vehicle repair and paint shops. Average miles driven without incident is plausibly an important input for insurance companies (even minor incidents typically incur some repair cost), so it seems likely that people have studied this and care about the accuracy of the numbers.
Of course, I fully admit that for all I know it's possible the article entirely made up these numbers, I haven't tried to look for an alternative source or anything.
The article lists the crashes right at the top. One of the 9 involved hitting a fixed object; the rest involved collisions with people, cars, or animals, or resulted in injuries.
So, let's exclude hitting fixed objects as you suggest (though the incident we'd be excluding might have been anything from a totaled car and a huge fire to zero damage), and also assume that humans fail to report injury / serious-property-damage accidents more often than not (as the article does).
That gets the crash rate down from an unbiased 9x to a lowball 2.66x higher than human drivers. And that's with human safety monitors supervising the cars.
2.66x is still so poor they should be pulled off the streets, IMO.
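For what it's worth, the arithmetic behind that figure (a sketch, taking the article's 3x estimate-adjusted ratio as the starting point):

    tesla_crashes = 9
    ratio_vs_estimated_humans = 3.0  # the article's figure once estimated
                                     # unreported human incidents are added

    # Drop the single fixed-object crash: 8 of 9 incidents remain.
    adjusted = ratio_vs_estimated_humans * (tesla_crashes - 1) / tesla_crashes
    print(f"{adjusted:.2f}x")  # ~2.67x, the lowball figure above (give or
                               # take rounding)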
> So, let's exclude hitting fixed objects as you suggest (though the incident we'd be excluding might have been anything from a totaled car and huge fire to zero damage)
I don't know what data is available, but what I really care about more than anything is incidents where a human could be killed or harmed, followed by animals, then other property, and finally the car itself. So I'm not arguing to exclude hitting fixed objects; I'm arguing that the severity of incidents matters much more than the total count.
Even when comparing to human drivers, if Tesla's autopilot gets into 200 fender benders and 0 fatal crashes, I'd prefer that over a human driver getting into 190 fender benders and 10 fatal crashes. I suspect the numbers would actually lean the other way, though: more major incidents from automated cars, because when they succeed they handle situations perfectly, and when they fail they simply don't see the stopped car ahead and hit it at full speed.
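To make the severity point concrete, a toy expected-harm weighting (the weights are made up; any weighting that values a life far above bodywork gives the same ordering):

    FENDER_BENDER = 1      # arbitrary unit of harm
    FATAL = 10_000         # assumed: a fatality dwarfs any fender bender

    autopilot = 200 * FENDER_BENDER + 0 * FATAL   # 200
    human = 190 * FENDER_BENDER + 10 * FATAL      # 100,190
    print(autopilot < human)  # True: fewer fatalities wins easily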
> That gets the crash rate down from an unbiased 9x to a lowball 2.66x higher than human drivers. That's with human monitors supervising the cars.
> 2.66x is still so poor they should be pulled of the streets IMO.
I'm really not here to argue they are safe or anything like that. It just seems clear to me that every assumption in this article is made in the direction that makes Tesla look worse.
FTA:
>> However, that figure doesn’t include non-police-reported incidents. When adding those, or rather an estimate of those, humans are closer to 200,000 miles between crashes, which is still a lot better than Tesla’s robotaxi in Austin.
Insurers?
I can't be certain about auto insurers, but healthcare insurers just straight up sell the insurance claims data. I would be surprised if auto insurers haven't found that same "innovation."
That's a fair point, but I'll note that the one time I hit an inanimate object with my car I wasn't about to needlessly involve anyone. Fixed the damage to the vehicle myself and got on with life.
So I think it's reasonable to wonder about the accuracy of the estimates for humans. We (i.e., society) could really use a rigorous dataset for this.