Comment by pkulak
9 years ago
Sorry, I kinda tuned out when you mentioned lidar like it's some requirement for autonomous driving.
Lidar is AMAZING for giving press demos on sunny days. For the real world with rain, snow, leaves, plastic bags, etc? Useless.
The future is radar + cameras + a LOT of software blood, sweat and tears.
That's like saying our eyes are useless because we can't see in the dark.
Our eyes are not LIDAR, though, and we can drive pretty well.
In fact, the reason we have crashes is NOT that our eyes lack laser-return-timing distance measurement — having two eyes gives us plenty of depth perception. We crash because of lapses in attention.
At this point, there is no reason to believe that a machine can't match and then outperform a human at a driving task given the same inputs. Sure, human eyes have about 5 million cone cells and a 1080p feed has only about 2 million pixels, but 4K has about 8.3 million, and more importantly, that level of precision is unnecessary for regular driving.
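The pixel arithmetic here is easy to check (the cone-cell figure is a rough ballpark, not precise biology):

```python
# Rough resolution comparison: human cone cells vs. common camera feeds.
# The cone count is an approximate, commonly cited figure (assumption).
human_cones = 5_000_000

feeds = {
    "1080p": 1920 * 1080,   # ~2.1 million pixels
    "4K":    3840 * 2160,   # ~8.3 million pixels
}

for name, pixels in feeds.items():
    print(f"{name}: {pixels:,} pixels ({pixels / human_cones:.2f}x cone count)")
```

So a single 4K camera already lands in the same order of magnitude as one retina's cone count.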
And Tesla doesn’t even bet just on the visible spectrum; it also relies on radar.
The trick is in our wetware. What the brain does with visual input is not just trivial object recognition. It relies on a complex internal model of the world to both augment the object recognition and to sometimes discard the visual data as invalid.
So sure, in theory cameras would be enough. But we're not there yet with the software; we can't use camera input well enough. So if you can side-step the need for not-yet-invented ML methods by simply adding LIDAR to the sensor suite, that's the obvious way to go.
Compare with powered flight: we didn't get very far by trying to copy the way birds do it. The trick is in the super-light materials birds are made of, and the energy efficiency of their organisms. We only succeeded at powered flight when we brute-forced it by strapping a gasoline engine onto a bunch of wooden planks.
3 replies →
Correct, eyes in fact are useless if you can't see in the dark.
Now imagine walking on top of a skyscraper in pitch darkness. Yes, your eyes work in the light, but in this case you will likely fall to your death.
You can't really drive in the dark, can you? What if it gets dark for 1 second on a cliffside turn?
Actually, in the dark you often drive on a leap of faith about the state of the road, i.e. with very little visibility into what might come from the side of the road (no light) or around a turn. We shouldn't. But we do.
2 replies →
There's always the option to enable the headlights in such a case. LEDs switch on way faster than incandescent bulbs. And if they're needed at all - a camera has way more flexibility in brightness input range than a human eye.
A car like a Tesla also has high-quality maps and GPS sensors - these alone are way better than what you get in your smartphone, and enough to keep the car from going over the cliff.
It's pretty obvious that camera-based self-driving software would take advantage of the car's headlights, just like a human does, so it never has to drive in full darkness.
7 replies →
Not really. Self driving cars should be able to drive in all of those conditions (and one condition might even quickly turn into another, e.g. sunny into rain).
If a technology only helps with some of the cases (e.g. fair weather) and does not work for the others, then there are two cases:
(a) A single replacement technology will be found that works in 100% of cases.
or:
(b) The technology will only be used in the cases where it works well, and the other cases will be handled by some alternative technology suited only to them.
In the case of (a), Lidar is indeed useless (or at best, only used as a supplementary technology in favourable conditions).
And I fail to see how (b) can be the case -- that is, how there can be another technology that will solve the rain/snow/night driving problem, but which cannot also outperform/replace Lidar for fair weather driving.
> then there are two cases:
Isn't it interesting that we have five senses, when we could just have one that works in 100% of the cases? A third option is a system based on multi-sensory inputs. Several inputs that are just marginal on their own can provide good performance when combined.
> The future is radar + cameras + a LOT of software
... all of which Waymo's solution also has, in addition to LIDAR.
I will take your self driving car seriously if you can drive it in all conditions in India.
Until then it's just an attempt to make something that breaks at the next unanticipated exception.
Human beings cannot drive in all conditions in India.
Just to get our car out of the garage, I had to plead and negotiate with N vegetable vendors with makeshift stores on the road.
Also: bicycles, motorbikes, rickshaws (in human-pulled, CNG, and electric varieties!), and pedestrians mixed into traffic everywhere.
Yep, I have a feeling that for driving in India/Iran/Pakistan/Bangladesh you're going to need a strong AI to negotiate with the street vendors. In Iran I even had a particularly... enthusiastic flower seller actually maneuver himself to make it even harder to drive away.
Not to take away from anyone's work in this area, but I have no idea how long it'll take to go from "works in America" to "works in India". In many countries the safest option (to evade disaster) can occasionally be "floor it and break the speed limit" to get away from x dangerous thing. I'm not sure if that's something that Google is willing to write into an AI.
1 reply →
>>I had to plead and negotiate with N vegetable vendors with makeshift stores on the road.
This is a very practical test case for a car on a road. Not just in India but anywhere in the world.
Instead of N vegetable vendors you could have N traffic cops. How do you manage the human interaction part in the self driving car?
Well, most non-Indian drivers also can't make it in "all conditions in India" so that's a moot point.
Made me laugh but it's a very true statement. When I was driving whilst holidaying in India I realised that the main rule of the road was "largest vehicle wins"! This actually made for quite an easy to understand system with few questions of whose right of way it was.
bike < car < van < truck
which makes sense because if you're the one who's going to come off worse in an impact then you really want to give way - especially if you have a massive painted tipper truck hurtling towards you!
It will be very interesting to see how self driving systems can cope with these local unwritten bylaws.
It will most definitely be interesting, but it's silly to say that self-driving cars can't be taken seriously if they don't master these conditions (yet). Hell, I couldn't master those conditions myself, nor do I need to, because I live in the inner city of a Western-European metropolis. What I need my self-driving car to do varies massively from what people in other regions of the globe may need it to do, but that doesn't make it any less useful for me.
>This actually made for quite an easy to understand system with few questions of whose right of way it was. bike < car < van < truck
That doesn't make much sense, because the main (and most common) question would still be between vehicles of the same class (car vs. car), and this doesn't solve it.
>>It will be very interesting to see how self driving systems can cope with these local unwritten bylaws.
This is why self driving AI will require Hard AI.
India is a perfect test bed for these people to test their algorithms. And for heaven's sake, why would you test it in some place like the US? Cars in the US are pretty much trains on roads anyway.
3 replies →
I wouldn't drive in India. And I'm human.
You are only human. It's superhumans who can drive there.
I imagine a self-driving car will drive there similarly to a human: slowly, because of the extreme risk. Humans probably take more risks than they should, so a self-driving car in India might seem to drive differently than a human would. But the algorithm stays the same: drive at a speed matching the obstacle risk, and avoid hitting things. Perhaps parking would be the biggest difference.
IMO, India has the simplest driving algorithm: keep going where you're going and don't hit anything. Also, honk if you think you might be in someone's blind spot.
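Half-jokingly, that heuristic fits in a few lines. This is purely a hypothetical sketch; the function name and thresholds are invented for illustration:

```python
def drive_step(obstacle_distance_m, in_possible_blindspot):
    """One control tick of the 'don't hit anything, honk a lot' heuristic.

    Returns (target_speed_kmh, honk). Thresholds are made-up numbers.
    """
    if obstacle_distance_m < 2:
        speed = 0     # stop rather than hit anything
    elif obstacle_distance_m < 10:
        speed = 10    # crawl through dense mixed traffic
    else:
        speed = 40    # relatively open road
    honk = in_possible_blindspot
    return speed, honk

print(drive_step(1.5, False))  # (0, False)
print(drive_step(5.0, True))   # (10, True)
```

The hard part, of course, is everything hidden inside `obstacle_distance_m` and `in_possible_blindspot` — which is the whole perception problem this thread is arguing about.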
That is important only if you live in India, or in any other third-world country with similar disregard for anything resembling a traffic rule.
The Western world will be perfectly happy with cars that can only drive on our roads, and eventually manufacturers will pick up other places as well.
Look at the potential bright side - this might bring some order to the mess one can see daily on the roads of bigger cities.
An autonomous driving feature that you can use on most days but not all of them is still very useful and would have a good market.
It's not going to rain or snow today, and if it would, then I can take the wheel myself.
Isn't Tesla partnered with Nvidia to solve the computation side of things?
I thought I remembered Nvidia presenting some additional stuff about it in their Tech demo recently.
I know Karpathy wasn't involved in the "end-to-end" research paper Nvidia published, but I wouldn't be surprised if they were involved somehow, and if anyone could push the CNN tech in that paper further, it just might be Karpathy.