> "the resulting congestion required law enforcement to manually manage intersections"
Does anyone know if a Waymo vehicle will actually respond to a LEO giving directions at a dark intersection, or if it will just disregard them in favour of treating it as a 4 way stop?
I suddenly find that I really want an answer to this as well because I'm now imagining what might ensue if one of these attempted to board a car ferry. Typically there's a sign "turn headlights off", you're expected to maintain something like 5 mph (the flow of traffic should never stop), and you get directed by a human to cross multiple lane markings often deviating from the path that the vehicle immediately in front of you took.
Car ferries don't really make much sense in a Waymo-ubiquitous world. It's not your vehicle; there isn't really a reason why you would need the same vehicle on the other side of the ferry ride. You're better off having one Waymo network on one side of the waterway, a separate Waymo network on the other side, and a passenger-only ferry with a much higher passenger capacity (and oftentimes much higher speed, since you can use hull forms like wave-piercing catamarans, hydrofoils, and hovercraft when you aren't carrying cars).
3 replies →
I think that Waymo isn't concerned about those types of scenarios because they only operate in a limited area, and can tune their systems to operate best in that area (e.g. not worrying about car ferries, human-operated parking lots, etc.)
1 reply →
Your scenario seems to have a lot of overlap with a construction worker directing traffic around a road construction site. I have no idea if Waymo is any good at navigating these, but I am sure there is a lot of model training around these scenarios because they are common in urban driving environments.
Don't they just have a stop/go board? Whereas an LEO at a crossing would have to use hand signals
1 reply →
This was identified as one of the early challenges of self-driving: reading the traffic-directing gestures of people in the roadway. It does it, but the jury is out on whether it does it well.
It also needs to be able to ensure the signals are coming from a human that actually has authority to command it. Don't really want it taking hand signals from anyone.
It's hard for humans as well.
I often see human drivers confused by a police officer's gestures, with the officer gesturing more and more emphatically until the driver figures it out.
3 replies →
The amount of times this has been asked with no confirmation leads me to believe they still do not.
Tesla fanboys gush about how FSD can understand LEOs directing traffic in irregular conditions, but no company I'm aware of has confirmed their systems are capable.
> The amount of times this has been asked with no confirmation leads me to believe they still do not.
They do follow hand signals from police. There are many videos documenting the behaviour. Here is one from waymo: https://waymo.com/blog/2024/03/scaling-waymo-one-safely-acro...
Look for the embed next to the text saying “The Waymo Driver recently interpreting a police officer’s hand signals in a Los Angeles intersection.”
Or here is a video observing the behaviour in the wild: https://youtu.be/3Qk_QhG5whw?si=GCBBNJqB22GRvxk1
Do you want confirmation about something more specific?
5 replies →
Teslas currently have a driver in the front who could take over in these situations.
Waymo said they normally handle traffic light outages as 4-way stops, but sometimes call home for help - perhaps if they detect someone in the intersection directing traffic ?
Makes you wonder in general how these cars are designed to handle police directing traffic.
5 replies →
> we are now implementing fleet-wide updates
That ~1000 drivers on the road are all better trained on what to do in the next power outage is incredible.
There will always be unexpected events and mistakes made on the roads. Continual improvement that is locked in algorithmically across the entire fleet is way better than any individual driver's learning / training / behavior changes.
Humans seemed to navigate this just fine, even with all the Waymo road blocks and without extra training. If every unknown requires a software update, this system is doomed to repeat this behavior over and over in the long term.
Humans do dumb stuff like drive their cars into flowing floodwaters and they show no signs of stopping. The Waymo Driver (the name for the hardware and software stack) is getting smarter all the time.
6 replies →
> seemed to navigate this just fine
From my understanding, the reason the Waymos didn't handle this was that humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it correctly, then Waymos would have handled this better.
4 replies →
My lived experience with human drivers and outages at intersections is most people get it very wrong. If you're lucky and the lit intersection is 1 lane in each direction, more often than not everything works out well. But any intersection with multiple lanes or especially an intersection that is one primary road and a lower traffic secondary is going to be full of people just flying through as if they were on green the whole time.
The average human driver is much worse than waymo.
No one seems sufficiently outraged that a private company's equipment blocked the public roads during an emergency.
No one seems sufficiently outraged that human drivers kill 40,000 people a year in the US.
It's approximately one 9/11 a month. And that's just the deaths.
Worldwide, 1.2m people die from vehicle accidents every year; car/motorcycle crashes are the leading cause of death for people aged 5-29 worldwide.
https://www.transportation.gov/NRSS/SafetyProblem
https://www.who.int/news-room/fact-sheets/detail/road-traffi...
Road casualties are tied to geographical areas and America is an infamously dangerous place to live in when it comes to traffic. By fixing education, road design, and other factors, those 40k killed can be reduced by seven times before you even need to bother with automation. There's a human driver problem, but it's much smaller than the American driver problem.
Also, that still doesn't excuse Waymo blocking roads. These are two different, independent problems. More people die in car crashes than they do in plane crashes, but that doesn't mean we should be replacing all cars with planes either.
5 replies →
Seriously. People are outraged about the theoretical potential for human harm while there is a god damn constant death rate here that is 4x higher than every other western country.
I mean really. I’m a self driving skeptic exactly because our roads are inherently dangerous. I’ve been outraged at Cruise and Tesla for hiding their safety shortcomings and acting in bad faith.
Everything I’ve seen from Waymo has been exceptional… and I literally live in a damn neighborhood that lost power, and saw multiple stopped Waymos in the street.
They failed safe: not perfect, definitely in need of improvement, but safe. At the same time we have video of a Tesla blowing through a blacked-out intersection, and I saw a damn Muni bus do the same thing, as well as at least a dozen cars doing the same damn thing.
People need to be at least somewhat consistent in their arguments.
36 replies →
So this makes it desirable for someone's robots to block roads during an emergency?
1 reply →
To adapt this to a tech-head mindset:
Imagine that when smartphones were first coming out they could only function with recent battery-tech breakthroughs. Mass adoption was pretty quick, but there was scattered reporting that a host of usage patterns could cause the battery to heat up and explode, injuring or killing the user and everyone in a 5-10 ft radius.
Now, the smartphone is a pretty darn useful device and rapidly changes how lots of businesses, physical and digital, operate. Some are pushing for bans on public usage of this new battery technology until significant safety improvements can be made. Others argue that it's too late, we're too dependent on smartphones and banning their public use would cause more harm than good. Random explosions continue for decades. The batteries become safer, but also smartphone adoption reaches saturation. 40,000 people die in random smartphone explosions every year in the US.
The spontaneous explosions become so common and normalized that just about everyone knows someone who got caught up in one, a dead friend of a friend, at least. The prevailing attitude is that more education about what settings on a phone shouldn't be turned on together is the only solution. If only people would remember, consistently, every time, to turn on airplane mode before putting the phone in a pocket. Every death is the fault of someone not paying sufficient attention and noticing that the way they were sitting was pressing the camera button through their pants. Every phone user knows that that sort of recklessness can cause the phone to explode!
You as an engineer know how people interact with the software you deploy, right? You know that regardless of education, a significant portion of your users are going to misunderstand how to do something, get themselves in a weird state, tap without thinking. What if every instance in your logs of a user doing something strange or thoughtless was correlated with the potential for injury? You'd pull your software from the market, right? Not auto-makers. They fundamentally cannot reckon with the fact that mass adoption of their product means mass death. Institutionally incapable.
The only responsible thing to do is to limit automobile use to those with extensive training and greatly reduce volume. The US needs blue collar jobs anyway, so why not start up some wide-scale mass-transit projects? It's all a matter of political will, of believing that positive change is possible, and that's sorely lacking.
2 replies →
Because more than half the people who die did it to themselves: speeding, alcohol, tiredness, cell phones, and often neglected car maintenance too.
We all know we can die when we drive poorly or ignore shocks and tires. But we don't like the idea of dying because of someone else.
I believe that is caused by having lots of cars driving around.
1 reply →
> No one seems sufficiently outraged
Harvesting outrage is about the only reliable function the internet seems to have at this point. You're not seeing enough of it?
I've seen plenty but about the wrong things.
> a private company's equipment blocked the public roads
That would be like every traffic incident ever? I don't think the US has public cars or state-owned utilities.
My concern is that one company can have a malfunction which shuts down traffic in a city. That seems new or historically rare. I understand large scale deployment will find new system design flaws so I’m not outraged, but I do think we should consider what this means for us, if anything.
6 replies →
Typically people move aside for emergency vehicles
1 reply →
Why would I be, when I don't have any standard for comparison?
How many human drivers did similar because the power went out?
Just stopped in the middle of traffic for no reason? Approximately zero.
2 replies →
On the contrary, I would prefer HN detach all threads expressing "concern." That way we don't have to make a subjective call if a comment is "concern" or "concern trolling" at all - they are equally uninteresting and do not advance curiosity.
Based. Anyone complaining about HN being "insufficiently outraged" should go to Twitter and never return.
2 replies →
How is this mode not a standard part of their disaster recovery plan? Especially in SF and the Bay Area, they need to assume an earthquake is going to take out a lot of infrastructure. Did they not take into account that this would happen?
> While we successfully traversed more than 7,000 dark signals on Saturday, the outage created a concentrated spike in these requests. This created a backlog that, in some cases, led to response delays contributing to congestion on already-overwhelmed streets. We established these confirmation protocols out of an abundance of caution during our early deployment, and we are now refining them to match our current scale. While this strategy was effective during smaller outages, we are now implementing fleet-wide updates that provide the Driver with specific power outage context, allowing it to navigate more decisively.
Sounds like it was, and you're not correctly understanding the complexity of running this at scale.
Sounds like their disaster recovery plan was insufficient, intensified traffic jams in already congested areas because of "backlog", and is now being fixed to support the current scale.
The fact this backlog created issues indicates that it's perhaps Waymo that doesn't understand the complexity of running at that scale, because their systems got overwhelmed.
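For intuition, the backlog the blog post describes is plain queue arithmetic: once confirmation requests arrive faster than remote assistance can clear them, the queue (and each car's wait) grows without bound. A back-of-envelope sketch with entirely made-up rates:

```python
# Hypothetical numbers: a citywide outage turns every dark signal into a
# confirmation request. If arrivals outpace operator capacity, the queue
# grows linearly and per-car wait times balloon.

def backlog_after(seconds, arrival_rate, service_rate):
    """Queue length after `seconds`, assuming constant rates (requests/sec)."""
    growth = arrival_rate - service_rate
    return max(0.0, growth * seconds)

# Normal day: 0.2 requests/sec against 0.5/sec of capacity -> no queue.
print(backlog_after(3600, 0.2, 0.5))  # 0.0

# Outage spike: 2 requests/sec against the same 0.5/sec capacity.
# After one hour the backlog is 5,400 requests deep.
print(backlog_after(3600, 2.0, 0.5))  # 5400.0
```

The fleet-wide "power outage context" fix amounts to cutting the arrival rate (fewer intersections need a human answer), which is the only lever that helps once you are past saturation.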
13 replies →
If the onboard software has detected an unusual situation it doesn't understand, moving may be a bad idea. Possible problems requiring a management decision include flooding, fires, earthquakes, riots, street parties, power outages, building collapses... Handling all that onboard is tough. For different situations, a nearby "safe place" to stop varies. The control center doesn't do remote driving, says Waymo. They provide hints, probably along the lines of "back out, turn around, and get out of this area", or "clear the intersection, then stop and unload your passenger".
Waymo didn't give much info. For example, is loss of contact with the control center a stop condition? After some number of seconds, probably. A car contacting the control center for assistance and not getting an answer is probably a stop condition. Apparently here they overloaded the control center. That's an indication that this really is automated. There's not one person per car back at HQ; probably far fewer than that. That's good for scaling.
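None of this logic is public, but the fail-safe policy the comment describes (treat the dark signal as a four-way stop, ask for confirmation, stop in place if no answer comes back) can be sketched as follows; every name and threshold here is invented:

```python
CONFIRM_TIMEOUT_S = 30  # invented threshold; the real value is not public

def handle_dark_signal(ask_control_center):
    """Hypothetical fail-safe policy for an unlit intersection.

    `ask_control_center(timeout=...)` returns a hint string, or None if
    the control center is unreachable or backlogged past the timeout.
    """
    hint = ask_control_center(timeout=CONFIRM_TIMEOUT_S)
    if hint is None:
        # No answer: fail safe by stopping rather than guessing.
        return "stop_in_place"
    if hint == "proceed":
        return "treat_as_four_way_stop"
    return hint  # e.g. "back out and reroute"

# An overwhelmed confirmation service looks like a timeout to each car:
print(handle_dark_signal(lambda timeout: None))       # stop_in_place
print(handle_dark_signal(lambda timeout: "proceed"))  # treat_as_four_way_stop
```

Under a policy like this, overloading the control center during a mass outage converts directly into cars stopped in the street, which matches what residents reported.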
> For example, is loss of contact with the control center a stop condition?
Almost certainly no - you don't want the vehicle to enter a tunnel, then stop halfway through due to a lack of cell signal.
Rather, areas where signal dropouts are common would be made into no-go areas for route planning purposes.
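That heuristic fits naturally into the route planner as an edge penalty: known dead zones aren't a runtime stop condition, they're just prohibitively expensive to traverse. A sketch with an invented road graph, using ordinary Dijkstra:

```python
import heapq

def shortest_path(graph, start, goal, dead_zones, penalty=float("inf")):
    """Dijkstra where any edge into a known cell dead zone costs `penalty` extra."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, cost in graph.get(node, []):
            extra = penalty if nxt in dead_zones else 0.0
            nd = d + cost + extra
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Invented map: the direct route runs through a tunnel with no signal.
graph = {"A": [("tunnel", 1.0), ("surface", 3.0)],
         "tunnel": [("B", 1.0)],
         "surface": [("B", 3.0)]}
print(shortest_path(graph, "A", "B", dead_zones={"tunnel"}))  # 6.0, via surface
```

With an infinite penalty the dead zone is a hard no-go area; a finite penalty would let the planner take it only when no reasonable alternative exists.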
Relying on what is essentially remote dispatch to resolve these error states is a disaster.
I suspected this. They were moving, but to an observer the movement looked random. Around Arguello and Geary in SF on Saturday at 6 PM, I saw maybe 20 stopped Waymos and only about 2 actually navigating. What was worse, there was little to no connectivity across all 3 main providers deeper in the power outage area as well - Spruce and Geary, or west of Park Presidio (I have 2 phones, with Google Fi/T-Mobile, AT&T, and Verizon).
Interesting that some legacy safety/precaution code caused more timid and disruptive driving behavior than the current software route planner would've chosen on its own.
The blog post makes no mention of the cellular network congestion/dropped packets that affected people during the power outage. I had bars but was unable to load websites for most of the day. Were Waymos unaffected by the network problems, or were request timeouts encompassed in the word “backlog” used by the blog post?
The networking on AVs is usually redundant across multiple cellular networks to deal with coverage and outage issues. They also use business SIMs, which usually have slightly higher network priority than consumer plans. If Waymo has also negotiated use of one of the infrastructure QCIs, it would take some seriously disastrous network conditions for them to experience meaningful congestion.
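The redundancy described above boils down to priority failover across modems. A minimal sketch, with the carrier names and health checks invented (real AV stacks likely bond links rather than hard-failover, but the fallback ordering is the same idea):

```python
def pick_link(links):
    """Return the name of the highest-priority healthy link, else None.

    `links` is a list of (name, priority, is_healthy) tuples;
    a lower priority number wins.
    """
    healthy = [link for link in links if link[2]]
    if not healthy:
        return None  # all carriers down: the car must act autonomously
    return min(healthy, key=lambda link: link[1])[0]

links = [("carrier_a", 0, False),  # primary SIM, tower lost power
         ("carrier_b", 1, True),   # secondary SIM, still up
         ("carrier_c", 2, True)]
print(pick_link(links))  # carrier_b
```

The interesting failure mode in this outage wasn't total loss of links but congestion on links that were nominally "up", which a health check like this wouldn't necessarily catch.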
I wonder if the cars also have some sort of mesh (LoRaWAN?) network to help each other out in temporary dead zones, emergencies, etc.
2 replies →
Pardon my being under-informed, but does anyone know why Civic Center, the Presidio, the Park, and the Golden Gate were all dark the longest? Was there some separated municipal circuit they were on that was restored last as it was more complicated? Entered the thread thinking there would be more discussion on the actual architectural mishaps of the grid here rather than those of Waymo alone.
Sending power outage context to the vehicles does not seem like enough of a response. I hope at least they have internal plans for more. For large, complex systems, you want multiple layers of protection. The response feels way too reactive when they could use this incident to guide improvements across the board.
Related context:
Waymo halts service during S.F. blackout after causing traffic jams
https://news.ycombinator.com/item?id=46342412
Do Waymos have Starlink or another satellite-based provider as backup? Otherwise, what do they do if cell service goes down and they need to phone home for confirmation?
Cell service is usually around for a while when power goes down.
I doubt they have more than that.
That seems like a major oversight. Adding Starlink wouldn’t add that much marginal cost.
2 replies →
The way all Waymos are updated to learn from this incident reminds me of Pluribus.
Pluribus is certainly in part a commentary on software machine learning models!
I always find it kind of funny when an article starts with "At [company], our mission is to be the world's [the product outranking competitors in its domain]".
I mean, come on: unless you really are a nonprofit trying to save the planet or something (no, building a better X is not saving the planet), your mission is to get rich and grow into a monopoly in your field.
Tesla FSD would never have this issue according to Elon Musk.
- written from my flying roadster
[dead]
[flagged]
It’s level 4 autonomous driving, not level 5.
https://brx-content.fullsight.org/site/binaries/content/asse...
People downvoting you may think that this is an uninteresting quibble: we may not find it very surprising that sometimes Waymo asks for human guidance, and we don't necessarily think "autonomous" is an all or nothing designator.
Definition in the Oxford dictionary: "Of, pertaining to, or characterized by autonomy; self-governing, independent; free of external influence or control."
Self-driving car advertisers like Musk or Waymo just want to co-opt this term because it sounds cool. The term also deliberately hides the fact that these vehicles surveil and track you.
EDIT: It is the full definition in the printed Shorter Oxford English Dictionary (which is a large two volume publication). It is understandable that morons downvote it.
8 replies →
This reads to me, an angry resident, as an AI generated article that attempts to leverage the chaos that they caused, for marketing purposes — not as any sort of genuine remorse — underscoring why we shouldn’t be banning AI regulation in the USA.
> The situation was severe enough that the San Francisco Department of Emergency Management advised residents to stay home, underscoring the extraordinary nature of the weekend's disruptions.
Waymo cannot point to this as an extenuating circumstance when they were a major contributing factor.
The symbolic irony of this situation is almost too rich to bear.