Comment by SilverBirch

4 hours ago

To be honest I think the true story here is:

> the fleet has traveled approximately 500,000 miles

Let's say they average 10 mph and operate 10 hours a day; that's 5,000 car-days of travel, or, to put it another way, about 30 cars over 6 months.
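That back-of-envelope arithmetic, sketched out (the 10 mph and 10-hours-a-day figures are rough guesses, as above):

```python
# Rough fleet-size estimate from the reported mileage.
# Assumptions (guesses, as in the comment): ~10 mph average, ~10 hours/day per car.
total_miles = 500_000
miles_per_car_day = 10 * 10                 # 100 miles per car per day
car_days = total_miles / miles_per_car_day  # 5,000 car-days of travel
cars = car_days / (6 * 30)                  # spread over ~6 months (~180 days)
print(car_days, round(cars))                # 5000.0 28 -> "about 30 cars"
```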

That's tiny! That's a robotaxi company that is literally smaller than a lot of taxi companies.

One crash in this context is going to completely blow out their statistics, so it's kind of dumb to even talk about the statistics today. The real takeaway is that the Robotaxis don't really exist: they're in an experimental phase, and we're not going to get real statistics until they're doing 1,000x that mileage. That won't happen until they've built something that actually works, and that may never happen.

The more I think about your comment on statistics, the more I change my mind.

At first, I think you're right - these are (thankfully) rare events, and because of this the accident count is Poisson distributed. At such a low rate it's really hard to know what the true average is, so we really do need more time/miles to know how well or badly the Teslas are performing. I also suspect they are getting safer over time, but again… more data required. Still, we do have the statistical models to work with these rare events.

But then I think about your comment about it only being 30 cars operating over 6 months. Which makes sense, except that it's not like having a fleet of individual drivers. These robotaxis should all be running the same software, so statistically it's more like one person driving 500,000 miles. That is a lot of miles! I've been driving for over 30 years and I don't think I've driven that many. This should be enough data for a comparison.

If we compare the Tesla accident rate to human drivers in a consistent manner (same accident classification), it's a valid comparison. So I think it works out like this: given a human accident rate of 1 per 500,000 miles, the probability of a human having 9 accidents over the same mileage is ~1 x 10^-6. (Never do live math on the internet, but I think this is about right.)
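That figure checks out under a Poisson model (a quick sketch; λ = 1 expected accident per 500,000 miles is the assumed human rate from above, and 9 is the accident count discussed in this thread):

```python
import math

# P(exactly 9 accidents) for a driver whose expected rate is
# 1 accident per 500,000 miles, over 500,000 miles (Poisson, lam = 1).
lam, k = 1.0, 9
p_exactly_9 = math.exp(-lam) * lam**k / math.factorial(k)

# The tail P(X >= 9) is only slightly larger, since 10, 11, ... are rarer still.
p_at_least_9 = 1 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

print(f"{p_exactly_9:.2e}")   # ~1.01e-06
print(f"{p_at_least_9:.2e}")  # ~1.13e-06
```

So "~1 x 10^-6" holds whether you read it as exactly 9 accidents or at least 9.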

Hopefully they will get better.

  • 500,000 / 30 years is ~16,667 mi/yr. While it's a bit above the US average, it's not incredibly so. Plenty of regular commuters will have driven more than that many miles in 30 years.

    • That’s not the point (I’m a bit of an outlier; I don’t drive much daily, but I make long trips fairly often). The point of focusing on 500,000 miles is that it should be enough of an observation period to make some comparisons. The parent comment made it seem like that was too low; putting it into the context of how much I’ve driven makes me think that 500,000 miles is enough to make a valid comparison.

Wait, so your argument is that there are only 9 crashes, so we should wait until there are possibly 9,000 crashes to make an assessment? That's crazy dangerous.

At least 3 of them sound dangerous already, and it's on Tesla to convince us they're safe. It could be a statistical anomaly so far, but hovering at 9x the alternative doesn't provide confidence.

  • No, my argument is you shouldn't draw a statistical conclusion with this data. That's all. I'm kind of pushing in the direction you were pointing in the second part - it's not enough data to make statistical inferences. We should examine each incident, identify the root cause and come to a conclusion as to whether that means the system is not fit for purpose. I just don't think the statistics are useful.
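One way to make "not enough data" concrete: with only 9 observed crashes, the exact 95% Poisson confidence interval on the underlying rate is very wide. A sketch, stdlib only, assuming the 9-crashes-in-~500,000-miles figure from this thread:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

def bisect_root(f, lo, hi, iters=100):
    """Find where the decreasing function f crosses zero on [lo, hi]."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

observed = 9  # crashes in ~500,000 miles (figure from this thread)
# Exact (Garwood) 95% CI for the Poisson mean, via the tail conditions:
#   lower: P(X >= 9 | lam) = 0.025  <=>  P(X <= 8 | lam) = 0.975
#   upper: P(X <= 9 | lam) = 0.025
lam_lo = bisect_root(lambda lam: poisson_cdf(observed - 1, lam) - 0.975, 0.01, 50)
lam_hi = bisect_root(lambda lam: poisson_cdf(observed, lam) - 0.025, 0.01, 50)
print(f"95% CI for expected crashes per 500k miles: [{lam_lo:.2f}, {lam_hi:.2f}]")
# -> roughly [4.12, 17.08]: the data are consistent with a true rate
#    anywhere from ~4 to ~17 crashes per 500k miles.
```

Under these assumptions the interval spans about a factor of four, which is the sense in which the statistics aren't yet useful.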

> The real takeaway is that the Robotaxis don't really exist

More accurately, the real takeaway is that Tesla's robo-taxis don't really exist.

But deep learning is also about statistics.

So if the crash statistics are insufficient, then we cannot trust the deep learning.

  • I suspect Tesla claims they do the deep learning on sensor data from their entire fleet of cars sold, not just the robotaxis.

>One crash in this context is going to just completely blow out their statistics.

One crash in 500,000 miles would merely put them on par with a human driver.

One crash every 50,000 miles would be more like having my sister behind the wheel.

I’ll be sure to tell the next insurer that she’s not a bad driver - she’s just one person operating an itty bitty fleet consisting of one vehicle!

If the cybertaxi were a human driver accruing double points 7 months into its probationary license, it would never have made it to 9 accidents: its license would have been revoked or suspended after the first two or three accidents in her state, and then it would have been thrown in JAIL as a “scofflaw” if it kept driving.

  • > One crash in 500,000 miles would merely put them on par with a human driver.

    > One crash every 50,000 miles would be more like having my sister behind the wheel.

    I'm not sure if that leads to the conclusion that you want it to.

    • From the tone, it seems that the poster's sister is a particularly bad driver (or at least they believe her to be). While an autonomous car that can drive as well as even a bad human driver is definitely a major technological accomplishment, we all know that threshold was passed a long time ago. However, if Tesla's robotaxis (with human monitors on board, let's not forget - these are not fully autonomous cars like Waymo's!) are at best as good as some of the worst human drivers, then they have no business being allowed on public roads. Remember that human drivers can also lose their license if caught driving too poorly.