Comment by ACCount37

8 hours ago

This LIDAR wank annoys me.

If you can train a policy that drives well on cameras, you can get self-driving. If you can't, you're fucked, and no amount of extra sensors will save you.

Self-driving isn't a sensor problem. It always was, is, and always will be an AI problem.

No amount of LIDAR engineering will ever get you a LIDAR that outputs ground truth steering commands. The best you'll ever get is noisy depth estimate speckles that you'll have to massage with, guess what, AI, to get them to do anything of use.

Sensor suite choice is an aside. Camera-only 360° coverage? Good enough to move on. The rest of the problem lies with AI.

Even the best AI can't drive without good sensors. Cameras have to guess distance, and they fail when there is insufficient contrast, direct sunlight, and so on. LiDARs don't have to guess distance.

  • Cameras also fail when weather conditions cake your car in snow and/or mud while you're driving. Actually, from what I just looked up, this is an issue with LiDAR as well. So it seems to me like we don't even have the sensors we need to do this properly yet, unless we can somehow make them all self-cleaning.

    It always goes back to my long-standing belief that we need dedicated lanes with roadside RFID tags to really make this self-driving thing work well enough.

    • Nah. That's a common "thought about it for 15 seconds but not 15 minutes" mistake.

      Making a car that drives well on arbitrary roads is freakishly hard. Having to adapt every single road in the world before even a single self-driving car can use them? That's a task that makes the previous one look easy.

      Learned sensor fusion policy that can compensate for partial sensor degradation, detect severe dropout, and handle both safely? Very hard. Getting the world that can't fix the low tech potholes on every other road to set up and maintain machine specific infrastructure everywhere? A nonstarter.


You are correct, but the problem is that nobody at Tesla, or any other self-driving company for that matter, knows what they are doing when it comes to AI.

If you are doing an end-to-end driving policy (i.e. the wrong way of doing it), having LIDAR is important as a correction factor for the cameras.
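The "correction factor" idea can be sketched in miniature (a hypothetical illustration, not any company's actual stack) as inverse-variance fusion of two depth estimates: the noisier sensor gets less weight, so a tight LIDAR return pulls a bad camera guess back toward the truth.

```python
def fuse_depth(cam_depth_m: float, cam_var: float,
               lidar_depth_m: float, lidar_var: float) -> float:
    """Inverse-variance weighted fusion of two depth estimates.

    The sensor with larger variance (more noise) gets less weight,
    so a low-noise LIDAR return dominates a shaky camera estimate.
    """
    w_cam = 1.0 / cam_var
    w_lidar = 1.0 / lidar_var
    return (w_cam * cam_depth_m + w_lidar * lidar_depth_m) / (w_cam + w_lidar)

# Camera badly overestimates range (say, low contrast); LIDAR is tight.
fused = fuse_depth(cam_depth_m=55.0, cam_var=25.0,
                   lidar_depth_m=30.0, lidar_var=0.04)
print(round(fused, 2))  # → 30.04, pulled almost entirely to the LIDAR value
```

A learned fusion policy would replace the fixed variances with ones estimated per-frame, which is what lets it down-weight a sensor that is currently degraded.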

  • So far, end to end seems to be the only way to train complex AI systems that actually works.

    Every time you pit the sheer violent force of end to end backpropagation against compartmentalization and lines drawn by humans, at a sufficient scale, backpropagation gets its win.
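The end-to-end claim above can be shown in a toy sketch (hypothetical, pure NumPy): a single linear "policy" trained by gradient descent directly from fake flattened camera features to a steering value, supervised only on the final command, with no hand-drawn intermediate modules such as lane or depth estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy end-to-end setup: raw sensor vectors map straight to a steering
# command. The only supervision is the final command, not any
# hand-engineered intermediate representation.
X = rng.normal(size=(256, 32))       # fake flattened camera features
true_w = rng.normal(size=32)         # hypothetical "expert" mapping
y = X @ true_w                       # expert steering demonstrations

w = np.zeros(32)
lr = 0.1
for _ in range(500):
    pred = X @ w
    grad = X.T @ (pred - y) / len(y)  # gradient of mean squared error
    w -= lr * grad

mse = float(np.mean((X @ w - y) ** 2))
print(mse < 1e-3)  # → True: the policy fits the demonstrations
```

Real driving policies are deep networks, not linear maps, but the structure is the same: one loss on the final output, backpropagated through everything at once.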

> If you can train a policy that drives well on cameras, you can get self-driving. If you can't, you're fucked, and no amount of extra sensors will save you.

Source: trust me, bro? This statement has no factual basis. And dismissing the approach used by nearly every self-driving developer except Tesla as "wank" is not an argument either; it's just hate.

> Self-driving isn't a sensor problem. It always was, is, and always will be an AI problem.

AI + cameras have relevant limitations that LIDAR augmented suites don't. You can paint a photorealistic roadway onto a brick wall and AI + cameras will try to drive right through it, dubbed the "Wile E. Coyote" problem.
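The painted-wall failure can be reduced to a sketch (hypothetical helper, not a real stack): the camera's appearance-based verdict can be spoofed by a photorealistic mural, but a time-of-flight range measurement is indifferent to what is painted on the surface, so it vetoes the "clear path" call.

```python
def path_is_clear(camera_says_road: bool, lidar_range_m: float,
                  stop_distance_m: float = 30.0) -> bool:
    """Clear only if the camera sees road AND nothing returns a LIDAR
    echo closer than the stopping distance.

    The camera term is appearance-based and spoofable; the LIDAR term
    is a direct distance measurement.
    """
    return camera_says_road and lidar_range_m > stop_distance_m

# Photorealistic wall painted like a road, 12 m ahead:
# the camera is fooled, the LIDAR return is not.
print(path_is_clear(camera_says_road=True, lidar_range_m=12.0))  # → False
```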