"Recall" is a technical term meaning: "public dangerous defect notice".
A "recall" is stating that the defective version of the product in the field must be "removed/recalled" and replaced/updated with a non-defective version at the manufacturer's expense. It just so happens that the removal and replacement of defective software from the field can occur remotely.
The important part is that the manufacturer delivered a defective product that risks your safety, that fixing that safety defect is the responsibility of the manufacturer, and the system is unsafe until that occurs.
Yes, this is a common terminology issue. "Recall" is legally defined in terms of the kind of problems that require one, not the solution to those problems, because the relevant regulations were written when there was no way to fix consumer products other than physically delivering them to the manufacturer or an authorized repair person.
How about a Mastodon, Lamb of God take with Floods of Triton:
Heap data upon this modern age
All human drivers now phased away
A lidar's glow, the soft wheel's echo
Autonomous force of code remains
We are last of the before rides
Now hear the robot cars rise
Hum into eternity
Remember this, all roadways lead to the fleet
That gives me a horrifying idea for a short story about a Waymo being hacked to carry out an assassination where it purposely drives off a bridge and into a lake.
Since a recall on cars no longer means doing anything to the car's physical location, I think the regulator, NHTSA, should update this term.
It just creates alarmist headlines for what's really an over-the-air update, although "recall" is currently still the regulatorily accurate term in the vehicle space.
Cars, especially EVs, have many similarities to gigantic phones. Imagine if a routine software update from Apple were called a "recall"; that functionally describes what's happening here.
NHTSA should at least distinguish between "omg, we have to get these cars off the road and bring them to the shop immediately!" and "over-the-air software update".
Exactly what I was thinking. The CNBC article feels very clickbaity because the opening lead doesn't say it's just a software update. They make it sound like the cars need to be taken back to a factory somewhere to get their systems updated, which is not true: they just get a software update.
Maybe you drive into flood waters, but I don't. That's not a difficult skill to pull off.
We're still in the early days of self driving cars, and as much simulation and miles as they have, they're still constantly getting exposed to real world conditions that are new to them. The world is dynamic, so this will always remain true.
It remains to be seen where we'll converge on capability, incident rate, and acceptance.
> It remains to be seen where we'll converge on capability, incident rate, and acceptance.
I think we're already there with Waymo as the example. We may later choose to diverge from this now-accepted path, but for the moment we have a blueprint, and fixing edge cases with a software update is apparently acceptable, if you just look at all the Waymos operating legally right now.
Maybe you don't drive into flood waters, but your Uber driver might, and that's what Waymo is trying to replace, not your personal driving.
In that context I think comparing it to the average human driver makes a lot of sense, because even if you personally are an even better driver, or even if human drivers are better at some specific things, we have more than enough data to show that Waymo reduces accident rates overall in their current rollout.
The world is dynamic, so sure, it will always be true in some technical sense. But I am confident that eventually we’ll have trained them on enough scenarios that novelty will have a smaller and smaller effect on their ability to safely navigate through the world.
>A product recall is a request from a manufacturer to return a product after the discovery of safety issues...
I think using the term for a software update is abusing the language a bit, and it may confuse people who have a real recall where the thing has to go to the dealer.
Waymos are fleet vehicles. Recalls go to the owners, just like with other fleet vehicles such as rental cars, taxis, limos, delivery services, utilities, and city/state/federal government. It doesn't really matter who is whose customer.
I really want car companies to just automate publishing “recalls” for every commit pushed to any car ever. Flood this broken term and force a distinction between “the airbags will literally explode and destroy your face” and “the radio volume is too quiet sometimes”
We really need a better term for when an urgent software update for a vehicle is issued. The extreme majority of the population completely misunderstands it when a "recall" is done when it's actually just an OTA software update.
That you need world models to sensibly deploy "thinking" machines in the real world. Else they do stupid shit like drive straight into water. You can bruteforce some semblance of thinking by training on literally all knowledge that can be digitized but even that is proving to not be quite enough.
Just this morning I was almost killed twice on my bike ride to work by two separate drivers, one of whom looked to be 80 and could barely see over the dashboard, and one who was on their phone. I didn’t even bother trying to remember the plate numbers, knowing that the odds of any kind of consequences are absolute zero.
No, we can’t go back to driving our own vehicles. Waymo everywhere and human driving outlawed, ASAP.
Agree. Multiple people I know have bought Teslas because they don’t trust themselves or their spouses to drive safely, and want them to use FSD. There should be incentives to get people onto self driving.
That's a tough problem - distinguishing wet pavement from deep water. Humans make that mistake frequently. Autonomous vehicles should probably be equipped with a water sensor. (We did that in our DARPA Grand Challenge vehicle back in 2005). Then they can enter water very cautiously and see if it's too deep. This may make them too cautious about shallow puddles on roads, though.
It’s a particularly hard problem in Texas. We get torrential rains and the landscape is relatively flat. Couple that with shallow soil over lots of limestone and it means flooding is really common. We also have roads that have a “low water crossing,” where a road crosses a creekbed that is normally dry but which will flood. There are often water depth signs there (basically a vertical ruler with feet marks so you can see where the water is up to). We lose people to this scenario (driving into flood waters) every year. It’s particularly problematic when it’s dark and you miss a warning sign. Before you know it, you’re in deep water and the flow can sweep the whole car downstream until it gets pinned against a tree, possibly with water forcing its way into the car.
Yeah, people are bad at guessing and the usual "Plan continuation bias" kicks in.
I was travelling in a group to eat lunch with friends once, after heavy rains. We reached a spot where the road dips under a bridge and is known to flood. There was standing water, and the driver figured it was probably not too high. He drove in and, nope, water over the air intake, bye bye engine, and we walked the rest of the way to lunch.
I absolutely should have said "No, don't" but the plan says we have to drive under that bridge, there is no plan B. Of course plan A being "Wreck car" is a stupid plan, but the bias meant I didn't say "No" and I should have.
You wouldn't die there, just trash the car, the flooding is localised - but there are definitely other sites around here where in flood conditions you could die if you drove into water that's deeper than you realised.
This case makes me think of my brother's place in rural Tennessee. To get to his house, you drive through a small creek, year round. For a hundred years in their community, they've managed without a bridge. I'm not sure driverless cars are ready for edge cases like this. Also, no one tell Enterprise I drove their rental through a creek.
I've lived here over a decade. Lived through multiple floods. Never once have I driven into water without being aware of it.
Texas has it easy.
I've seen several places in England (and at least one in the western United States) where they have fords.
For those not familiar, water runs over the road full-time, and people are expected to just drive through it like it's no big deal. Except for right after a storm, when it is a big deal. It's essentially the intersection of a road and a stream where a bridge should be, but nobody ever built one.
Reminds me of all the Waymo vehicles stalled during that San Francisco blackout a while ago.
I have always believed that when people cite statistics on Waymos beating human drivers on safety statistics, that is only in the case of the happy path, or "happy road". The safety statistics could plummet in specific scenarios that lack training data or forethought, and they could crop up at any time.
Right, but humans are terrible at the happy path. I’d take 20% safer on the happy path over 40% less safe in unforeseen circumstances. The failure mode being “stopped car” is also not that bad.
If they have a laser measurement of the road from before, couldn't they compare the level of the water against the expected road surface?
Such a detailed database of fine-grained road geometry gets stale very quickly due to road maintenance and construction. In the US, highway lanes are shifted sideways frequently.
That seems a very risky assumption for any car (self-driving or human driver) during flash floods. "Turn around, don't drown."
You think you know how deep it is because you've taken that road many times before (or, in this case, because you have a historical laser measurement).
But you don't know:
- Maybe the road underneath has fully collapsed.
- Maybe the flow of water is extremely strong, so you need to accurately estimate that too.
You underestimate how frequently details like this change in the real world and how difficult it is to reliably integrate them into the mapping models with very low error rates.
Aggregating this data in something close to real-time, verifying and corroborating that the change to the road model is real and correct, and then pushing those model updates to every vehicle that may need it almost immediately is not really a solved problem.
That's so much extra complexity
If they have a pre-existing database of every road, sure. And if it's kept up-to-date at all times in all vehicles.
I thought the same thing. A very small float switch would work here. Somewhere between the radiator and the bumper. Fording depth is different for every vehicle.
This is also why they recommend not using polarized lenses while cycling; they can obscure slicks or water in certain sections. I still use mine, but I know they're not as ideal as photochromic lenses.
This is how I got my shoes wet climbing around rock pools last weekend.
Pretty sure the right answer mainly involves the car knowing about the weather and other emergency events.
It doesn't take much of a rainstorm to see localized flooding. Some debris over the storm drain is enough to flood a street. Hard to anticipate that happening.
All that it can take is a broken fire hydrant in the wrong place.
Doesn't Land Rover historically have like a wading sensor?
They should really just park themselves on the side of the road (or observe in real time), wait for another car to go first, then follow that path
It would have to know the height of that car above the ground, how high the air intake is, etc. A lifted offroader could make it through a much deeper body of water than, e.g., a Prius.
By a water sensor do you mean a sensor to detect the water level relative to the chassis? It seems like a very inexpensive downward-facing ultrasound sensor could work.
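To sketch the idea (everything here is hypothetical: the sensor mounting height and echo timing are made up, and the fixed speed-of-sound constant ignores temperature and road spray):

```python
# Hypothetical sketch: estimating standing-water depth from a
# downward-facing ultrasonic range sensor mounted on the chassis.
# The sensor's height above dry pavement is assumed known from
# the vehicle's geometry.

SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 C

def range_from_echo(echo_time_s: float) -> float:
    """Round-trip echo time -> one-way distance to the reflecting surface."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def water_depth(sensor_height_m: float, echo_time_s: float) -> float:
    """Water depth = sensor height minus distance to the water surface.
    Clamped to zero for dry pavement (distance ~= sensor height)."""
    surface_distance = range_from_echo(echo_time_s)
    return max(0.0, sensor_height_m - surface_distance)

# Sensor mounted 0.50 m above dry pavement; an echo after ~2.04 ms means
# the reflecting surface is ~0.35 m away, i.e. ~0.15 m of water.
depth = water_depth(0.50, 0.00204)
```

Of course, as noted downthread, this only tells you anything once the water is already under the car.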
When you're going 35 mph and suddenly hit a 2 ft deep puddle (I've done this), that sensor isn't going to help at all.
Is ultrasound less expensive than a moisture sensor?
The problem with both is they effectively require the vehicle to be in the water already. They need something that can tell depth before the vehicle has to slow down.
I've used an ultrasonic sensor to detect the water level in a tank before, I don't think it would work as you describe.
Also, the sensor didn't work in that context either as condensation kept forming on it.
If they've mapped the surface of the water relative to themselves... couldn't they slowly wade in and just calculate the depth from that 3D model, without extra sensors?
That assumes there's no abrupt cliff to fall off... but short of the ability to make a 3D map underwater, that approach seems inevitable.
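A minimal sketch of that depth check, assuming a prior elevation profile of the road ahead and an observed water-surface level (the numbers and the fording limit are invented for illustration):

```python
# Illustrative sketch (hypothetical data): with a prior elevation map of
# the road and an observed water-surface level, the predicted depth at
# each point ahead is simply surface elevation minus road elevation.

MAX_FORDING_DEPTH_M = 0.30  # assumed limit for this vehicle

def predicted_depths(road_elevations_m, water_surface_m):
    """Depth profile along the path ahead, clamped to zero where dry."""
    return [max(0.0, water_surface_m - z) for z in road_elevations_m]

def safe_to_proceed(road_elevations_m, water_surface_m):
    """Proceed only if every predicted depth is within the fording limit."""
    return all(d <= MAX_FORDING_DEPTH_M
               for d in predicted_depths(road_elevations_m, water_surface_m))

# The road dips 0.5 m below the observed water surface mid-crossing: abort.
profile = [10.2, 10.0, 9.7, 9.9, 10.2]   # road elevation samples (m)
ok = safe_to_proceed(profile, water_surface_m=10.2)
```

The obvious failure mode is exactly the one raised upthread: the map is stale, and the road underneath may no longer match the profile.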
Human drivers look for Waymos up ahead, water up to their windows.
<jk>
> Humans make that mistake frequently.
They have been known to make that mistake, but using the word "frequently" conflates the number of incidents with the rate per total miles driven. It also ignores that humans often drink, that most of these types of accidents happen after 2am, and that they happen most often in the state of Florida.
> equipped with a water sensor
Car washes will be fun.
> DARPA Grand Challenge
The problems the grand challenge ignores are more important than the ones it solved.
Do they make this mistake frequently? How frequently? I've seen people overestimate things, but I don't think this is as hard as one might think
My comma.ai can do this. I'm pretty sure Tesla can as well.
> frequently
I've never made that mistake; I'm not aware of anyone I know doing it. I very rarely see it myself, except on news footage. Of course it happens some time somewhere but that says nothing about frequency.
> That's a tough problem
Not really. Don't drive where you don't know it's safe. Definitely don't drive into moving water - puddles only, and only if not too deep: I can usually figure it out based on the rest of the road - unless it's a sinkhole, the geometry is somewhat consistent - and especially by looking at objects in the water such as other cars driving through it. Sorry your friend isn't competent to figure it out.
People here are always quick to defend the autonomous cars, like a close friend. How often will we fall in love with a technology or company? It always distorts the truth.
It’s definitely a thing humans do a lot in certain places. Perhaps where you live, it isn’t as much of an issue, so naturally you and nobody you know has encountered it.
Any human can distinguish wet pavement from a flooded street. Some voluntarily drive into the flooded street.
And that is the difference. In a Waymo you are a prisoner, in your own car you can turn around.
I rode my motorcycle into a hole that almost swallowed the front tyre entirely in rural Australia. That hole had just been a slight depression that collected water the last time I rode through it, and there was no visual indication that it was now deeper.
Not every human can tell the difference between an inch of water, which is perfectly safe to drive on (if you go slow enough not to hydroplane), and a flooded street. They can tell the difference between an inch of water and wet pavement, though.
This is the naive, circular "if you can't stop, you're following too close" sort of take. It makes for good rightthink points on reddit and communities of similar quality, but you're not actually going to build anything useful thinking like that.
In order to drive reasonably, humans need to drive through water that is 6-12 in deep on occasion. That's just how it is. Near me it's whenever the storm drain at the bottom of the hill clogs.
Article's current (possibly original), less ambiguous title: "Waymo recalls 3,800 robotaxis after glitch allowed some vehicles to ‘drive into standing water’"
IOW 3,800 Waymo vehicles aren't currently sat spinning their wheels in water.
This is important as flooded vehicles are a common sight on the salvage-title market.
Though the idea of a single rider calling for a Waymo and, slowly, one by one, 3,800 Waymos driving into a flood and being washed away ...
That's the promise of self-driving cars.
Every time an issue is found, no matter how minor, it's fixed and updated everywhere. From now on, every car of that model (and future models, and related models) will no longer have that problem. Several passes of that improvement cycle, and self-driving cars become safer (and more efficient/comfortable/etc) than human drivers. At least, that's how it's supposed to work.
So underappreciated. The article is written as if this is some kind of sign that Waymos just aren't ready. But human drivers are not improving at all.
It's an interesting case of whether it's possible to infer the condition of wading and avoid having to install a sensor specific to a one in a million trips circumstance.
The inference would come from standing water slowing down the vehicle and likely requiring steering correction, in combination with some machine vision for identifying standing water.
Then there's the advantage of being Google and having hundreds of thousands of people in the same area using Google maps and navigation. Accelerometers in phones can detect crashes pretty reliably. There's a good chance they can reliably detect deceleration from standing water and report the location of the hazard.
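A hedged sketch of what that fleet-side detection might look like. The thresholds and sample counts are invented for illustration; a real system would fuse GPS, wheel speed, wiper state, and braking signals:

```python
# Hypothetical sketch: flag a possible standing-water hazard from a
# sudden, sustained longitudinal deceleration that is NOT explained
# by braking. Thresholds here are made up.

DECEL_THRESHOLD_M_S2 = 3.0   # "something is dragging on the car"
MIN_SAMPLES = 3              # consecutive samples over threshold

def water_hazard_suspected(decels_m_s2, brake_applied):
    """True if several consecutive high-deceleration samples occur
    while the brake is not applied."""
    run = 0
    for decel, braking in zip(decels_m_s2, brake_applied):
        run = run + 1 if (decel > DECEL_THRESHOLD_M_S2 and not braking) else 0
        if run >= MIN_SAMPLES:
            return True
    return False

# Three consecutive unexplained ~4 m/s^2 samples: report the location.
suspected = water_hazard_suspected(
    [0.5, 3.5, 4.0, 3.8, 1.0],
    [False, False, False, False, False],
)
```

Crash detection on phones already works roughly this way (large accelerometer events corroborated across signals), so extending it to "hit a deep puddle" doesn't seem far-fetched.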
If we're lucky, this'll help get us better roads. I'm never exactly happy slowing down to 10mph on the freeway because it isn't obvious whether the 100ft long puddle is an inch deep or 2ft, and I can sidestep some of that danger sometimes by proactively choosing safer lanes, but our roads are dangerous and don't handle water correctly.
How is this only being solved now? Isn't this a very common thing that happens on roads, or what am I missing? At least a few times a year I have to go around major puddles. Does Waymo just speedboat through them? I find that hard to believe.
Around 40 years ago, I was cycling around Andover (Hants, UK). Me and a mate were whizzing around near a small artificial lake. I decided to run into what I thought was shallow water and it wasn't. With hindsight that was a really daft failure of perception but you live and learn.
Forty years later touch wood I have not yet broken myself or a car ...
Can anyone with a better understanding of the LIDAR vs. camera approaches to autonomous driving explain how Tesla would handle such a situation?
This is a HW4 Tesla on FSD 14.3.2 trying to drive into a lake five days ago (a la The Office): https://www.reddit.com/r/TeslaFSD/comments/1t9rl2u/fsd_tried..., so I would not say Tesla has solved standing water yet.
That said, FSD seems quite capable of routing around standing water in many cases (e.g. https://xcancel.com/planoken/status/2030754820462633031, https://www.reddit.com/r/TeslaFSD/comments/1pw9f2m/fsd_navig..., https://xcancel.com/BLKMDL3/status/1991862465328779317, https://xcancel.com/JVTacoma/status/2046313902749921638), so handling the remaining cases seems more like a model intelligence / data issue rather than a sensor limitation. Lidar beams generally bounce off mirrorlike surfaces without returning to the sensor, so I think all lidar would tell you about standing water is "there's something shiny/reflective within this region of the image", which you already know from cameras+headlights.
Waymo has LIDAR and cameras, so it is better equipped for every situation.
This seems tautological, but in practice, you might expect to see different results.
Engineering hours are finite, so if they're spread across interpreting signals from two different sources, they might not go deep enough to make either one as good as it could be.
Having your engineering resources more focused on a particular approach might actually yield better results.
I say this as someone who's dealing with LiDAR + vision vs pure vision in a different domain, and at this point, I actually think our pure vision systems are better.
1 reply →
What if they oppose each other? Which one do you trust?
1 reply →
Unless the power is out
https://abc7news.com/post/san-francisco-leaders-press-waymo-...
2 replies →
LIDAR isn't helpful for water. Standing water behaves like a mirror on LIDAR.
This is one of the reasons why I'm suspicious of camera-only systems, here in Finland. Half the year there's a lot of snow and ice around. Which I imagine means most of the view is "white" and "shiny". Coupled with the dark winters it's gotta be a nightmare to deal with.
2 replies →
Not necessarily. Depending on angle and water depth, multi-return LIDAR can give you returns from both water surface and the road surface beneath, in the same way multi-return LIDAR can produce returns from vegetation and the ground beneath.
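As a rough sketch of how a dual-return beam could become a depth estimate: the pulse travels slower in water (refractive index ~1.33), and the sensor assumes it always moves at c, so the raw range difference between the surface return and the bottom return overstates the geometric depth by that factor. This is my own back-of-envelope illustration, assuming near-vertical incidence (no Snell's-law angle correction), not a description of any production system.

```python
N_WATER = 1.33  # refractive index of water; light travels at c / n in water

def water_depth(first_return_m, last_return_m, n=N_WATER):
    """Estimate water depth from a dual-return lidar beam.

    first_return_m: range to the water surface (first return).
    last_return_m: apparent range to the road beneath (last return).
    The extra in-water path reads n times too long because the sensor
    assumes vacuum-speed light; divide by n to correct. Assumes the
    beam hits the water near vertically.
    """
    apparent = last_return_m - first_return_m
    if apparent <= 0:
        return 0.0
    return apparent / n
```

For example, a first return at 10.0 m and a last return at 10.665 m along the same beam works out to roughly 0.5 m of water. In practice the bottom return would be weak and noisy, and at shallow grazing angles (the usual case from a roof-mounted sensor looking down the road) most of the energy reflects away, which is the limitation the parent comments describe.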
Could you use a different spectrum of EM radiation to detect water? There are parts of the microwave band that attenuate the signal by absorption and I wonder if you could use that. The only clue a human driver has in that situation is in the visible spectrum. The lines of the road disappear from view, which can be challenging to see at night.
If the LIDAR can sense the road close enough to the front of the car, then it could estimate how far underwater the car is.
1 reply →
Maybe they're secretly developing Waymo submarines...
They are rolling these out in New Orleans soon. Standing water is everywhere, and sometimes you have big hidden potholes. You just need to know the roads. Should be fun.
On a normal day it should suffice to train the model to use its judgment and maybe monitor how other cars are reacting to water covering the road, but when it starts flooding everywhere maybe they should pause the service until it dries out.
I have seen them around here in beta form, but yes God help them
What is a recall in this case? Is them getting a software update a recall now?
They suspended service areas they deem high risk until the software update can be applied. So while, yes, it's just a software update, it's a recall in the sense that they've temporarily pulled all the cars off the road in certain areas
"Recall" is a technical term meaning: "public dangerous defect notice".
A "recall" is stating that the defective version of the product in the field must be "removed/recalled" and replaced/updated with a non-defective version at the manufacturer's expense. It just so happens that the removal and replacement of defective software from the field can occur remotely.
The important part is that the manufacturer delivered a defective product that risks your safety, that fixing that safety defect is the responsibility of the manufacturer, and the system is unsafe until that occurs.
I think so. For some kind of legalese reasons that's generally what a Tesla "recall" amounts to these days.
Yes, this is a common terminology issue. "Recall" is legally defined in terms of the kind of problems that require one, not the solution to those problems, because the relevant regulations were written when there was no way to fix consumer products other than physically delivering them to the manufacturer or an authorized repair person.
Waymo: *locks doors, chorus to Floods by Pantera starts playing, guns it into the water*
“Wash away maaaaan, take him with the floooood”
How about a Mastodon, Lamb of God take with Floods of Triton:
That gives me a horrifying idea for a short story about a Waymo being hacked to carry out an assassination where it purposely drives off a bridge and into a lake.
Since a recall on cars no longer means doing anything to the car's physical location, I think the regulator (NHTSA) should update this term.
It just creates alarmist headlines for what's really an over-the-air update, although "recall" is currently still an accurate regulatory term in the vehicle space.
Cars, especially EVs, are in many ways gigantic phones. Imagine if a routine software update from Apple were called a "recall" — that functionally describes what's happening here.
NHTSA should at least distinguish between "omg we have to get these cars off the road and bring them to the shop immediately!" versus "over the air software update"
Exactly what I was thinking. The CNBC article feels very clickbaity because it doesn't say in the opening lead that it's just a software update. It makes it sound like the cars need to be taken back to some factory somewhere to have their systems updated, which isn't true; they just get an over-the-air update.
Go fish
This is ok though because humans drive into flood waters too.
Look, you can't make progress without getting your feet wet and then diving straight into the deep end.
Maybe you drive into flood waters, but I don't. That's not a difficult skill to pull off.
We're still in the early days of self driving cars, and as much simulation and miles as they have, they're still constantly getting exposed to real world conditions that are new to them. The world is dynamic, so this will always remain true.
It remains to be seen where we'll converge on capability, incident rate, and acceptance.
> It remains to be seen where we'll converge on capability, incident rate, and acceptance.
I think we're already there with Waymo as the example. We may later choose to diverge from this now-accepted path, but for the moment we have a blueprint, and fixing edge cases with a software update is apparently acceptable, if you just look at all the Waymos operating legally right now.
Maybe you don't drive into flood waters, but your Uber driver might, and that's what Waymo is trying to replace, not your personal driving.
In that context I think comparing it to the average human driver makes a lot of sense, because even if you personally are an even better driver, or even if human drivers are better at some specific things, we have more than enough data to show that Waymo reduces accident rates overall in their current rollout.
The world is dynamic, so sure, it will always be true in some technical sense. But I am confident that eventually we’ll have trained them on enough scenarios that novelty will have a smaller and smaller effect on their ability to safely navigate through the world.
"recall" = applies software update
Wikipedia has
>A product recall is a request from a manufacturer to return a product after the discovery of safety issues...
I think using the term for a software update is abusing the language a bit. And may confuse people who have a real recall where the thing has to go to the dealer.
Yes, "recall" brings to mind serious issues like the gas tank exploding on the Ford Pinto.
1 reply →
Also I think it's wrong to call something a recall if it's not owned by customers. Waymo is a service.
Waymos are fleet vehicles. Recalls go to the owners, just like with other fleet vehicles such as rental cars, taxis, limos, delivery services, utilities, and city/state/federal government. It doesn't really matter who is whose customer.
aw, I was having fun imagining 3,800 Johnny cabs just immediately changing route to go to headquarters.
The difference between that and usual software updates I'm guessing is the cars are pulled from service until the update takes place.
Recall makes for better headlines.
I really want car companies to just automate publishing “recalls” for every commit pushed to any car ever. Flood this broken term and force a distinction between “the airbags will literally explode and destroy your face” and “the radio volume is too quiet sometimes”
3 replies →
Gah, thanks for this. Thought I was used to that sleight of hand, but this one got me.
Legally and technically true, and I hate it.
We really need a better term for when an urgent software update for a vehicle is issued. The vast majority of the population completely misunderstands it when a "recall" turns out to be just an OTA software update.
We've updated the title above. Thanks!
[flagged]
[dead]
LeCun is right.
About what
That you need world models to sensibly deploy "thinking" machines in the real world. Otherwise they do stupid shit like drive straight into water. You can brute-force some semblance of thinking by training on literally all knowledge that can be digitized, but even that is proving to be not quite enough.
2 replies →
FFS, can we just go back to talking to each other in person and driving our own vehicles? Where'd the 90s go?
If the car drives itself we will have more time to talk to each other in person.
Or invest in public transport instead
Just this morning I was almost killed twice on my bike ride to work by two separate drivers, one of whom looked to be 80 and could barely see over the dashboard, and one who was on their phone. I didn’t even bother trying to remember the plate numbers, knowing that the odds of any kind of consequences are absolute zero. No, we can’t go back to driving our own vehicles. Waymo everywhere and human driving outlawed, ASAP.
Agree. Multiple people I know have bought Teslas because they don’t trust themselves or their spouses to drive safely, and want them to use FSD. There should be incentives to get people onto self driving.
6 replies →
Actually, we’ve just returned to 2007.
https://youtu.be/DOW_kPzY_JY
> can we just go back to talking to each other in person
He posts on an internet message board
If they are recalled, do the cars drive themselves back to the factory?