Comment by Animats
9 years ago
What this really reflects is that Tesla has painted itself into a corner. They've shipped vehicles with a weak sensor suite that's claimed to be sufficient to support self-driving, leaving the software for later. Tesla, unlike everybody else who's serious, doesn't have a LIDAR.
Now, it's "later", their software demos are about where Google was in 2010, and Tesla has a big problem. This is a really hard problem to do with cameras alone. Deep learning is useful, but it's not magic, and it's not strong AI. No wonder their head of automatic driving quit. Karpathy may bail in a few months, once he realizes he's joined a death march.
If anything, Tesla should have learned by now that you don't want to need to recognize objects to avoid them. The Mobileye system works that way, being very focused on identifying moving cars, pedestrians, and bicycles. It's led to at least four high speed crashes with stationary objects it didn't identify as obstacles. This is pathetic. We had avoidance of big stationary objects working in the DARPA Grand Challenge back in 2005.
With a good LIDAR, you get a point cloud. This tells you where there's something. Maybe you can identify some of the "somethings", but if there's an unidentified object out there, you know it's there. The planner can plot a course that stays on the road surface and doesn't hit anything. Object recognition is mostly for identifying other road users and trying to predict their behavior.
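The "plot a course that stays on the road and doesn't hit anything" idea can be sketched without any object recognition at all. This is a simplified 2D occupancy-grid toy, not any real planner's API; all names, thresholds, and cell sizes here are made up for illustration:

```python
import numpy as np

def occupied_cells(points, cell=0.25, min_height=0.3):
    """Project LIDAR returns above the road surface into a 2D grid.

    points: Nx3 array of (x, y, z) returns in the vehicle frame (metres).
    Anything taller than min_height is treated as an obstacle,
    whether or not anything ever classifies it.
    """
    obstacles = points[points[:, 2] > min_height]
    cells = np.floor(obstacles[:, :2] / cell).astype(int)
    return {tuple(c) for c in cells}

def path_is_clear(path, occupied, cell=0.25):
    """A planned path is drivable only if none of its cells are occupied."""
    return all((int(x // cell), int(y // cell)) not in occupied for x, y in path)

# An unidentified object 10 m ahead still blocks the path,
# even though nothing tried to recognise it.
cloud = np.array([[10.0, 0.0, 0.8], [10.1, 0.1, 0.9], [5.0, 3.0, 0.05]])
occ = occupied_cells(cloud)
straight_ahead = [(d, 0.0) for d in np.arange(0.0, 20.0, 0.25)]
print(path_is_clear(straight_ahead, occ))  # False: something is there
```

The point of the sketch is the one made above: the unclassified returns at 10 m block the path regardless of whether any recognizer ever labels them, which is exactly what a camera-plus-classifier pipeline can fail to guarantee.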
Compare Chris Urmson's talk and videos at SXSW 2016 [1] with Tesla's demo videos from last month.[2] Notice how aware the Google/Waymo vehicle is of what other road users are doing, and how it has a comprehensive overview of the situation. See Urmson show how it handled encountering unusual situations such as someone in a powered wheelchair chasing a duck with a broom. Note Urmson's detailed analysis of how a Google car scraped the side of a bus at 2MPH while maneuvering around sandbags placed in the parking lane.
Now watch Tesla's sped-up video, slowed down to normal speed. (1/4 speed is about right for viewing.) Tesla wouldn't detect small sandbags; they don't even see traffic cones. Note how few roadside objects they mark. If it's outside the lines, they just don't care. There's not enough info to take evasive action in an emergency. Or even avoid a pothole.
Prediction: 2020 will be the year the big players have self-driving. It will use LIDAR, cameras, and radars. Continental will have a good LIDAR, using the technology from Advanced Scientific Concepts, at an affordable price point.
Tesla will try to ship a self-driving system before that while trying to avoid financial responsibility for crashes. People will die because of this.
[1] https://www.youtube.com/watch?v=Uj-rK8V-rik [2] https://player.vimeo.com/video/192179727
> Prediction: 2020 will be the year the big players have self-driving.
I think that depends on what you mean with self-driving. My prediction is that in 2020 we will have slightly better driver assistance systems. Maybe a lane assist system which won't kill me if I don't babysit it on a road with construction ongoing and multiple lane markings or if corners get too tight. Maybe some limited self-driving, e.g. only available in dedicated areas like highway/autobahn.
Keep in mind that 2020 is 3 years away, which is less than the development cycle of a car. Or in other words: if something is to be released in 2020, you would already see it driving around on the roads for first tests.
My personal prediction is that we will see reliable and advanced self-driving technology in mass production cars maybe in 2 full car generations from now - which is 2030.
I am curious about what percentage of accidents are caused by the driver falling asleep, and if a lot of lives could be saved by a driver assistance system that self-drives with even the crudest of sensors until the driver is successfully woken up, which would probably be several seconds later.
I don't know why you're getting downvoted. I think watching highway death numbers is an excellent way to judge the success of self-driving cars.
At a minimum, a self-stopping car should be the basic requirement: any vehicle which senses a lack of input from the driver, or input that conflicts with what its sensors perceive, should be able to turn on a hazard alarm for all other surrounding vehicles and safely stop itself on the nearest shoulder.
If all sensors indicate that there is no emergency obstacle requiring sudden swerving or braking, the car should be able to safely decelerate and stay in a lane, etc... but this might be too complex a problem. (I was thinking of a driver having a seizure or some other episode - but if a driver is under duress this may be a bad thing.)
2020 is the year major manufacturers say they will have self-driving cars on the market.
- Volvo [1]
- Chrysler [2]
- Ford [3] (2021)
- GM [4] (2018, first live test fleet)
- Toyota [5] (although they just started with self-driving)
[1] https://www.digitaltrends.com/cars/volvo-predicts-fully-autn... [2] https://www.usatoday.com/story/tech/news/2017/04/25/google-s... [3] http://money.cnn.com/2016/08/16/technology/ford-self-driving... [4] http://fortune.com/2017/02/17/gm-lyft-chevy-bolt-fleet/ [5] https://www.wsj.com/articles/toyota-aims-to-make-self-drivin...
> Maybe a lane assist system which won't kill me if I don't babysit it on a road with construction ongoing and multiple lane markings or if corners get too tight.
Or if someone else makes a mistake.
That's another scenario that can have very unpredictable and immediate consequences ranging from 'nothing' to 'two car accident with fatalities'. Even in relatively placid (when it comes to driving) NL I see this kind of situation at least once per year.
Then there are blow-outs and other instant changes of the situation. I do believe that especially in those cases it should not take long before computers are better than humans because of their superior reaction speed.
Yup, these are the reasons I'm not interested in those "hands-free but still assisted" autodriving deals. I'd love cruise control assist and lane assist in my car, but that's about all I want until I can be 100% sure cars can handle the situations by themselves.
>>I think that depends on what you mean with self-driving.
It means a car that doesn't have a steering wheel, brake, or accelerator. That kind of car is quite far off.
That's at least 4-5 automotive cycles away, IMO (so 25-30 years away).
And that's for the high end. The low end cars are probably 50-60 years away, IMO.
As far as I know, no car manufacturer ships cars equipped with LIDAR. Nor do they seem to have a camera setup as extensive as Tesla's. So I fail to see how Tesla has painted itself into a corner. The worst case is that they don't reach full autonomy with the current hardware setup. It certainly would be a big marketing blunder if they don't, but if necessary they can add LIDAR to production, if they choose to.
> As far as I know, no car manufacturer ships cars equipped with LIDAR.
I can confirm that, at least in a sense, this is false. There are plenty of series-production cars with LIDARs - not the scanning kind you are thinking of, but a simpler kind of lidar tech[1]. I know that is not what you were talking about, but I thought it's worth pointing out other, existing, alternative approaches.
[1]http://www.conti-online.com/www/industrial_sensors_de_en/the...
I think Tesla not waiting for Lidars to get cheaper is the right decision. They can always integrate Lidar in new models once it gets cheaper. Right now, no production car ships with a scanning Lidar that I know of.
Waymo cars are really expensive because of that, and they can't scale because Velodyne can't make Lidars that quickly.
Also consider the fact that Lidars are literally beams of light being emitted and their reflections measured. If every car has a Lidar, you get interference, and it's no longer the gold standard of measurement.
Tesla conquering the problem with algorithms is the right approach. Remember, our brains use algorithms and two cameras to drive too. So it's technically possible.
Nvidia keeps pumping out faster GPUs, cameras keep getting better, and Tesla's getting more and more data. They just need better algos while they wait for cheaper sensors that scale.
That's a very wise business move while everyone else waits for a magic bullet.
Shipping hardware that isn't needed is not how automotive engineering works today.
Most automotive projects look like this: You want to release a new car model in year X (which is typically around now() + 3-5 years). Then you start a development project exactly for that car, which involves creating roadmaps for the car, creating the architecture, sourcing the components, packaging everything together, and testing everything. Most components (including infotainment systems, driver assistance systems, etc.) are contracted to sub-suppliers, which develop them specifically for that car model (or maybe a range of models from one OEM). At the end of the development cycle you have exactly one car which has (hopefully) everything that was planned for that model and which will get sold. In parallel, the development cycle for the next model begins, where there might be only a minimal reuse from the last one. E.g. it might be decided that one critical driver assistance component is sourced from another supplier, now works completely differently, and also requires changes to the remaining components.
So if you do not intend to upgrade something or reuse it, it just doesn't make sense to include additional hardware for it. We will see the required hardware in cars that will also make use of its functionality.
For Tesla it will be quite interesting if they will really deliver huge autonomous functions on that hardware, or whether we will see a new generation Model S (with overhauled hardware) before anyway. I'm personally pretty sure that we will see new model generations before the software will be on a "fully autonomous" level.
>where there might be only a minimal reuse from the last one
Not from what I saw from some manufacturers. A lot of software and parts are reused across multiple models.
They may have painted themselves into a corner by saying the current hardware is sufficient for "full self-driving capabilities." The worst case is they're putting half-baked solutions on the road.
Yeah. Theoretically the claim is true (at least wrt. the sensors, not sure about computing power onboard). Tesla's hardware is already better than human hardware for this task.
The trick is, they'd have to advance the state of the art in software quite far, to derive "full self-driving capabilities" from this hardware.
The Chevrolet Bolt self-driving test fleet comes this way. The difference is that Tesla has been talking loudly to get people not to see the shortcomings other companies are actively compensating for.
So what if they can add it to future production? Their talk promises or implies they have all they need, yet a cursory review of random YouTube videos will show you how limited their system still is.
This may be another "war" where they lead the charge but falter in securing the win. You can play the car marketing game at times similarly to the technology market, but in the area of safety there is no compromise. Instead of acting like a tech company pushing a new tablet, they should have acted like SpaceX.
I think Nissan has bet that in a few years, LIDAR will be inexpensive: https://www.youtube.com/watch?v=cfRqNAhAe6c
They're probably right, since it's been reported that Google's Waymo has achieved just that - press reports from January say $8,000 per sensor which is positively affordable when compared to $75,000 for a competing sensor.
https://arstechnica.com/cars/2017/01/googles-waymo-invests-i...
I think most people think LIDAR is going to eventually be inexpensive -- even Tesla has a few test cars with LIDAR. However, it's a fair weather sensor, so you'll have to do a lot more for level 5.
If I wanted to put my money on this bet, how should I proceed?
> no car manufacturer ships cars equipped with LIDAR
For the love of god, please do not comment in public if you do not understand the subject. Take a look at the Chrysler + Waymo minivans, Volvo + Uber SUVs, and Mercedes self-driving test cars. They all have lidars.
Sorry, but you did not get the point the OP was making. None of those you mentioned are production cars like the Tesla; they are test mules. The OP's point is correct; he/she is not talking about modded research cars.
None of which are production vehicles being shipped from manufacturers. They are test vehicles, nothing more.
For the love of god, stop the hyperbole, and try to understand the comments you are answering better.
See how condescending this is?
Sorry, I kinda tuned out when you mentioned lidar like it's some requirement for autonomous driving.
Lidar is AMAZING for giving press demos on sunny days. For the real world with rain, snow, leaves, plastic bags, etc? Useless.
The future is radar + cameras + a LOT of software blood, sweat and tears.
That's like saying our eyes are useless because we can't see in the dark.
Our eyes are not LIDAR, though, and we can drive pretty well.
In fact, the reason we have crashes is NOT our eyes' lack of distance detection through laser return timing — having two eyes is enough for depth perception. We have crashes because of lapses in attention instead.
At this point, there is no reason to believe that a machine can't achieve and outperform a human on a driving task given the same inputs. Sure, human eyes have 5 million cone cells and 1080p feeds only have 2 million pixels, but 4K has over 8 million, and more importantly, that level of precision is unnecessary for regular driving.
And Tesla doesn’t even bet just on the visible spectrum; it also relies on radar.
Correct, eyes in fact are useless if you can't see in the dark.
Now imagine walking on top of a skyscraper in pitch darkness. Yes, your eyes work in light, but in this case you will likely fall to your death.
You can't really drive in the dark, can you? What if it gets dark for 1 second on a cliffside turn?
Not really. Self driving cars should be able to drive in all of those conditions (and one condition might even quickly turn into another, e.g. sunny into rain).
If a technology only helps with some of the cases (e.g. fair weather) and does not work for the others, then there are two cases:
(a) A single replacement technology will be found that works in 100% of cases.
or:
(b) The technology will only be used for the cases it works well in, and the other cases will be handled by some alternative technology that is likewise suited only to them.
In the case of (a), Lidar is indeed useless (or at best, only used as a supplementary technology in favourable conditions).
And I fail to see how (b) can be the case -- that is, how there can be another technology that will solve the rain/snow/night driving problem, but which cannot also outperform/replace Lidar for fair weather driving.
> The future is radar + cameras + a LOT of software
... all of which Waymo's solution also has, in addition to LIDAR.
I will take your self-driving car seriously if you can drive it in all conditions in India.
Until then it's just an attempt to make something that breaks at the next unanticipated exception.
Human beings can not drive in all conditions in India.
Just to get our car out of the garage, I had to plead and negotiate with N vegetable vendors with makeshift stores on the road.
Also: bicycles, motorbikes, rickshaws (in human-pulled, CNG, and electric varieties!), and pedestrians mixed in traffic everywhere.
Well, most non-Indian drivers also can't make it in "all conditions in India" so that's a moot point.
Made me laugh, but it's a very true statement. When I was driving whilst holidaying in India I realised that the main rule of the road was "largest vehicle wins"! This actually made for quite an easy-to-understand system, with few questions about whose right of way it was.
bike < car < van < truck
which makes sense because if you're the one who's going to come off worse in an impact then you really want to give way - especially if you have a massive painted tipper truck hurtling towards you!
It will be very interesting to see how self driving systems can cope with these local unwritten bylaws.
I wouldn't drive in India. And I'm human.
I imagine a self-driving car will drive there similarly to a human: slowly, because of the extreme risk. Humans arguably take more risks than they should, so a self-driving car in India might seem to drive differently than a human would. But the algorithm stays the same: drive at a speed matching the obstacle risk, and avoid hitting things. Perhaps the parking aspect would be the most different.
IMO, India has the simplest driving algorithm: Keep going where you're going and don't hit anything. Also, honk if you think you might be in someone's blindspot.
That is important only if you live in India, or any other third-world country with a similar disrespect, visible from a mile away, for anything resembling a traffic rule.
The western world will be perfectly happy with cars that can only drive on our roads, and eventually manufacturers will pick up other places as well.
Look at the potential bright side - this might bring some order to the mess one can see daily on the roads of bigger cities.
An autonomous driving feature that you can use on most days but not all of them is still very useful and would have a good market.
It's not going to rain or snow today, and if it would, then I can take the wheel myself.
Isn't Tesla partnered with Nvidia to solve the computation side of things?
I thought I remembered Nvidia presenting some additional stuff about it in their Tech demo recently.
I know Karpathy wasn't involved in the "end-to-end" research paper Nvidia published - but I wouldn't be surprised if they were involved somehow, and if anyone could push the CNN tech in that paper further it just might be Karpathy.
When Musk said they were doing the first decent electric sports car, everybody became an expert and said it wouldn't work. Then he decided to do SolarCity and SpaceX. And the crowd again said "naaaa".
Now the guy is tackling another hard problem and everybody knows better.
My thoughts exactly. Elon Musk is not known for vaporware or BSing. If he says something is doable, there's a good reason for it. Musk has a combination of vision, an understanding of physics, and execution ability unmatched by any other person (in my opinion). To say that he has painted himself into a corner is highly misinformed.
>Elon Musk is not known for vaporware or BSing...
Hyperloop though..
I mean, I think bullshit can be sold even to "techies", aka the HN crowd, if it is wrapped just the right way...
He doesn't have any educational background in AI or its underpinnings. Running OpenAI doesn't instantly make him an expert. The above comment is from an expert with experience in the self-driving space.
The difference between Tesla and Google is that Tesla actually has to ship these cars to customers right now. Wouldn't LIDAR double the cost of a Tesla right now?
Tesla didn't have to ship a car with unusable self-driving hardware. They'll probably have to eat the cost of a retrofit package on some vehicles to make that work. Like the Roadster transmission problem, where they had to replace all the early drivetrains with the two-speed transmission.
Nobody has built automotive LIDAR units in volume yet. That's why they're so expensive. It's not an inherently expensive technology once someone is ready to order a million units. It does take custom silicon. Tesla, at 25K units per quarter, may not be big enough to start that market.
Continental, which is a very large auto parts maker in Germany, has demo units of their flash LIDAR. They plan to ship in quantity in 2020. Custom ASICs have to be designed and fabbed to get the price down.[1]
[1] http://www.jobs.net/jobs/continental-corporation-careers/en-...
Isn't it possible that by the time LIDAR is technically and economically ready for general deployment, current Tesla models will have accumulated enough mileage that Tesla avoids retrofitting completely?
Classic engineering ethics problem. Management says they have to ship "right now", but you know that if you do, 1 in 1000 customers will die. If you wait a couple of years, that'll go down to 1 in 1,000,000, but your company might go bankrupt.
Engineers at Takata and in GM's ignition key department made one choice, Waymo seems to be making the other.
Waymo has the benefit of not going bankrupt if they wait another couple of years.
Cost should not be an excuse at the expense of safety. Plain and simple.
If that were true, nobody would ever ship anything to do with safety for less than a million dollars. We make trade-offs between cost and safety all the time. Doctors walk that line day by day; it's a big part of their job. In particular, car safety regulations walk a very fine line between safety and cost. No regulation requires every car to have every one of the best and most advanced safety features. If they did, no cars could be sold for less than hundreds of thousands of dollars and they'd all look and perform like blocky vans with great huge crumple zones.
What's unsafe about radar cruise plus lane keep? People act like Tesla is shipping cars that fling themselves into pedestrians at every opportunity. Somehow we all manage to absolve the auto maker when someone with cruise control set on their 1998 Mazda rear ends someone on the freeway. Let's judge Tesla for autonomous safety when they produce an autonomous car.
I know it's popular to bash on Tesla's current challenges with their Autopilot software, but I think it's a bit unfair to expect them to be back in front of the pack just yet. They had their big breakup with MobilEye in, what, September last year? Nine months is pretty quick to go from total reliance on a vendor package to a reasonably functional fully in-house system.
Can you explain the "back" part of "expect them to be back in front of the pack"?
As far as I know they've never been in the front of the pack.
With AP1 they were the only company that had anything approaching level 2/3 in a publicly available production vehicle. Google might have been ahead of them but it's hard to tell since all we ever saw were carefully staged demos.
Actually, they are and have always been in front of the pack.
Tesla had been working on a mobileye replacement for some time before they parted ways.
>Tesla will try to ship a self-driving system before [...] People will die because of this.
On the other hand pushing the envelope on self driving technology using cheap sensors will probably help reduce the world's 1m annual auto deaths earlier than otherwise. Thousands of people will not die because of this.
They really only hired a well-known, cool guy for a post with a lot of exposure. Your comment feels like pretty intense overanalysis to me.
Their previous guy quit. That indicates a problem.
Human drivers are also not equipped with a LIDAR. We rely on stereo vision combined with a couple of low-tech instruments (rearview mirror, left and right side mirrors, and looking over the shoulder to achieve an approx 250-300 degree field of view) to navigate the road in a vehicle. If you extrapolate the current state of AI and treat 10-20 in-car cameras + radar as the equivalent of what a human brings to the table, then I fail to see why Tesla has painted itself into a corner.
Agreed. Raquel Urtasun's research is the cutting edge in this area: https://www.cs.toronto.edu/~urtasun/. And she was recently hired/retained by Uber to lead their robot-car efforts in Canada. Here's a recent video of her research from the National Academy of Sciences: https://www.youtube.com/watch?v=sW4M7-xcseI
true, but bird wings flap and airplane wings don't flap.
Do submarines swim?
Doesn't matter; only the results matter. Planes work and work well - they crush birds in every performance metric, and sometimes literally. Can a self-driving car be made safe without lidar? I suspect so, but I am not certain - and I am no expert.
Have there been advances in using LIDAR in rain/fog/snow? For all the autonomous car demos this seems like too large of a use case to gloss over...
Yes, but they haven't made it down to the automotive level yet.
Most automotive LIDARs just report the time of the first return, but it's possible to do more processing. Airborne LIDAR surveys often record "first and last"; the first return is the canopy of trees or plants; the last is from ground level.
It's also possible to use range gating in fog, smoke, and dust conditions.[1][2] Returns from outside the range gate are ignored. You can move through depth ranges in slices until something interesting shows up. This seems to be in use for military purposes, but hasn't reached the civilian market yet.
Range gated LIDAR imagers have been around for at least 15 years. By now, it should be possible to obtain a full list of returns for each pixel for several frames in succession, crunch on that, and automatically filter out noise such as rain, snow, and dust. It's a lot of data per frame, but not more than GPUs already handle. Some recent work in China seems to be working to make range-gated imaging more automatic in bad conditions.[3]
[1] http://www.sensorsinc.com/applications/military/laser-range-... [2] http://www.obzerv.com/en/videos [3] http://proceedings.spiedigitallibrary.org/proceeding.aspx?ar...
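The range-gating idea described above is simple enough to sketch. Everything here (the function name, the idea of per-pixel lists of return distances) is a simplified illustration of the technique, not any vendor's API:

```python
import numpy as np

def range_gated_frame(returns, gate_start, gate_end):
    """Keep, per pixel, only the returns inside the range gate.

    returns: list of per-pixel sequences of return distances (metres).
    Rain, snow, and dust mostly produce close-in returns; ignoring
    everything outside the gate leaves whatever solid surface lies
    in the depth slice of interest.
    """
    frame = np.full(len(returns), np.nan)
    for i, dists in enumerate(returns):
        d = np.asarray(dists)
        in_gate = d[(d >= gate_start) & (d < gate_end)]
        if in_gate.size:
            frame[i] = in_gate.min()  # nearest surface inside the gate
    return frame

# Three pixels: raindrops near 2 m, a car around 31 m, and one pixel
# with nothing but rain. Gating the 25-40 m slice drops the rain.
pixels = [[2.1, 31.0], [1.8, 30.5], [2.4]]
print(range_gated_frame(pixels, 25.0, 40.0))
```

Sweeping `gate_start`/`gate_end` through successive depth slices, as described above, is then just calling this per slice and keeping whichever slice returns something interesting.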
I find it funny that people can so flippantly state that Tesla (or any of these cars) has a "weak sensor system", as if it were such a trivial problem.
"Well, there's your problem right there, let's just slap on some strong sensors and you should be good to go!"
You know what has a weak sensor system? Any car without any sensors.
Yea this seems like the key question. I don't know if Level 5 in two years is feasible, but if it's Level 5 or bust, then LIDAR won't fly (AIUI; would certainly welcome corrections).
On one hand, not pulling in potential safety improvements because they only work in good weather seems wrong, but on the other hand...that might be what needs to happen from a cost/marketing/legal perspective.
> Tesla will try to ship a self-driving system before that while trying to avoid financial responsibility for crashes. People will die because of this.
This is a pretty strong statement. Would you sign up for a slightly more specific version of your claim?
"I believe the Tesla self-driving system that ships by the end of 2020 will be statistically less safe than unassisted human drivers."
Full autonomy conflicts with Tesla's business model: selling cars. I think Tesla's real goal is to build the easiest-to-drive mass-production car. Mass production means the car always needs a person to babysit it. For full autonomy, they don't need mass production (>10M units) to build the service network.
> People will die because of this.
And not just drivers of Teslas.
>Or even avoid a pothole.
Humans can avoid potholes with one eye. I don't know why you assume LiDAR is a requirement for this.
One eye backed by millions of years of training in depth perception and object recognition... The argument that humans are able to drive with just two cameras and software (the brain) is deeply flawed, because the brain is highly advanced in the areas necessary for driving, and Tesla's claim that it will replicate that any time soon is absurd. This is why you need additional sensors like LIDAR to alleviate the computational load.
The human brain is a hard AI entity that can think through problems in any generic situation.
In self-driving AI you are programming the car to do a specific thing. Sooner or later you will run into a situation in which the algorithm will panic and can't do much.
If only the brain were smart enough to focus on driving instead of distractions...
Yeah, Tesla may fail to ship a safe LIDAR-less car.
But the problem of making a self-driving car without LIDAR or something equivalent is awesomely challenging! And I bet Andrej Karpathy will really enjoy working on it.
And the tech resulting from this line of work will surely find its way into other things. (I guess the military has wet dreams about this stuff... I mean, even "unsafeness" can become a "feature" here: "uhm, look, that school we blew up by, uhm, mistake... was an AI error... like... this stuff happens, you know, even Tesla's cars have an accident from time to time, that's life". Well, those dreams could also be nightmares: basically any "self-driving thingie" is a potential guided missile, and dirt-cheap-because-lidar-less stuff has the potential of becoming ubiquitous, and unmaintained/unupdated/unsecured/hackable, leading to nightmarish urban warfare scenarios...)
And: "People will die because of this."... Uhm, yeah, they will, but if people ain't dying it means research is not moving fast enough, and competition will overtake you. I'd be more worried about when this stuff will be deployed on buses with tens of people, but hopefully public transport would stay a safe decade behind bleeding-edge stuff :)
And about Tesla: however this plays out, Elon Musk has made quite a lot of what would've technically been considered "bad business decisions" and things have turned out OK so far... so I wouldn't feel sorry for them or short their stock ;)
> but if people ain't dying it means research is not moving fast enough
Could you help me understand this further? It feels quite insensitive to me.
If companies wait for the tech to be perfectly safe, it will never be released.