
Comment by oliveshell

3 years ago

Radar, ultrasonic proximity sensors, and/or LiDAR, presumably.

Tesla has famously removed all radar/ultrasonic sensors from their newer cars in favor of a purely camera-based system.

But they removed them only after showing they could accurately estimate distance with vision alone.
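
To make "distance from vision" concrete, here is the textbook geometry for one classical way to do it, stereo triangulation. This is not a claim about Tesla's actual method, and the numbers are invented:

    # Illustrative only: textbook stereo triangulation, one classical way
    # vision recovers metric distance. Not Tesla's actual pipeline;
    # all numbers here are invented.
    def stereo_depth(focal_px, baseline_m, disparity_px):
        # focal_px: focal length in pixels (hypothetical calibration value)
        # baseline_m: distance between the two camera centres, in metres
        # disparity_px: horizontal pixel shift of the point between images
        if disparity_px <= 0:
            raise ValueError("zero disparity: point at infinity or bad match")
        return focal_px * baseline_m / disparity_px

    # 1000 px focal length, 12 cm baseline, 8 px disparity -> 15 m away.
    print(stereo_depth(1000.0, 0.12, 8.0))  # 15.0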

LiDAR doesn't make sense to me as a sensor because it only works in good weather. It's like a car without windscreen wipers.

  • Appreciate the downvote/disagreement.

    But here is Karpathy explaining how vision can be used to measure distance to objects accurately [0].

    And here is why LiDAR struggles in rain [1]: "... In heavy rain, for example, the light pulses emitted from the lidar system are partially reflected off of rain droplets which adds noise to the data, called 'echoes'."

    Which implies you need to fall back to vision; see [0] also for why radar is unreliable.
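
    To make those "echoes" concrete: droplet reflections show up as weak, spurious returns that the pipeline then has to try to filter out. A toy sketch of such a filter, with made-up threshold and data (real pipelines use far more elaborate outlier rejection):

        # Toy illustration of the 'echo' noise described in [1]: rain
        # droplets produce weak, spurious lidar returns. The intensity
        # threshold is made up, not from any real system.
        from dataclasses import dataclass

        @dataclass
        class LidarReturn:
            distance_m: float
            intensity: float  # normalised 0..1

        def drop_rain_echoes(points, min_intensity=0.15):
            # Keep only returns strong enough to plausibly be a surface.
            return [p for p in points if p.intensity >= min_intensity]

        scan = [LidarReturn(4.2, 0.80),   # solid obstacle
                LidarReturn(1.1, 0.05),   # likely a raindrop echo
                LidarReturn(9.7, 0.60)]
        print(len(drop_rain_echoes(scan)))  # 2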

    [0] https://www.youtube.com/watch?v=g6bOwQdCJrc
    [1] https://en.wikipedia.org/wiki/Lidar

How do humans drive without these sensors?

  • With superior reasoning abilities. Extra sensors generate more data, which makes reasoning simpler.

    • Extra sensors also generate a lot of extra complexity in fusing them all into a common, consistent view of the world. And if your goal is just 'more data', then you might as well add more cameras; that's also 'more data'. I guess what you wanted to say was 'diverse data'.

      As with everything else, there are benefits and costs to both approaches.
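
      To make the fusion cost concrete: even the easiest sub-problem, merging two noisy range estimates, already needs a per-sensor noise model. A minimal inverse-variance sketch, assuming independent Gaussian noise and invented numbers; real stacks additionally face time alignment, calibration, and outright conflicting detections:

          # Minimal sketch of one fusion step: merging two independent
          # range estimates (say camera and radar) by inverse-variance
          # weighting. All numbers are invented.
          def fuse(est_a, var_a, est_b, var_b):
              # Returns (fused estimate, fused variance).
              w_a, w_b = 1.0 / var_a, 1.0 / var_b
              fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
              return fused, 1.0 / (w_a + w_b)

          # Camera says 20.0 m (variance 4.0), radar says 22.0 m
          # (variance 1.0): the result leans toward the confident sensor.
          print(fuse(20.0, 4.0, 22.0, 1.0))  # (21.6, 0.8)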

    • Except that radar and LiDAR produce false negatives in poor weather/conditions, making reasoning harder.