Comment by bee_rider
8 hours ago
I don’t really get the water concerns in datacenter cooling. Even if a lot of water were used for cooling with every prompt (which he argues against here, but even so)… water “used up” by cooling just comes out a little hotter, right? Maybe evaporated. Then it’ll come back in the form of rain. This isn’t an industrial chemistry process that leaves some toxic waste in the water. Or an agricultural one that puts water in plants and then ships it off to some other region. It just becomes another path through the water cycle.
I actually don’t get how this can be a real thing that people are worried about. Is there some astroturfing behind this? Maybe an attempt to make environmentalists and AI skeptics look stupid?
The absolute strongest complaint is that DCs consume treated, potable water, which is less abundant / easily re-created than any old non-potable source. (Of course the easy solution here is DCs just ingest / treat their own non-potable source. Or utilities charge rates sufficient to price in the externality of drawing down more potable water. The economics still work for DCs if they need to treat their own water -- the fundamental problem is that utilities are underpricing their potable water, so DCs prefer it all else being equal.)
Why don’t data centers use gray water more often? Wouldn’t that be better for basically everyone?
My guess is it’s some combination of the infrastructure not existing, the distribution being bad, and the treatment costs not penciling out.
But that feels like the kind of thing municipal utilities could solve with pricing. Potable water should probably be priced differently for residential use than for big commercial/industrial users, in a way that pushes them toward non-potable sources wherever possible.
A fun Texas water fact I always bring up: the entire state’s monthly freshwater use is roughly a week of freshwater inflow into the Chesapeake Bay. Texas would be the 8th-largest GDP in the world if it were a country, and its whole monthly freshwater demand is basically a week or so of the water that the Chesapeake just dumps into the ocean. (Of course, estuaries make use of the water, so it’s not just wasted, but it’s illustrative imo.)
Another fun comparison point: in a year, Texas uses about 0.08% of the volume of the Great Lakes in freshwater, but roughly 30-50% of the combined volume of all the lakes in Texas.
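Rough math, if anyone wants to sanity-check those ratios -- the inputs below are approximate, from-memory figures (statewide use, Chesapeake inflow, Great Lakes volume, reservoir storage), so treat it as an illustration rather than a citation:

```python
# Back-of-envelope check on the Texas water comparisons above.
# All inputs are rough, from-memory figures -- illustrative only.

GAL_PER_ACRE_FOOT = 325_851

texas_use_gal_per_year = 14e6 * GAL_PER_ACRE_FOOT       # assume ~14M acre-feet/yr total statewide use
chesapeake_inflow_gal_per_day = 50e9                    # assume ~50B gallons/day average freshwater inflow
great_lakes_volume_gal = 6e15                           # assume ~6 quadrillion gallons
texas_reservoir_storage_gal = 31e6 * GAL_PER_ACRE_FOOT  # assume ~31M acre-feet of reservoir storage

monthly_use = texas_use_gal_per_year / 12
print(f"One month of Texas use ≈ {monthly_use / chesapeake_inflow_gal_per_day:.1f} days of Chesapeake inflow")
print(f"Yearly Texas use ≈ {100 * texas_use_gal_per_year / great_lakes_volume_gal:.2f}% of Great Lakes volume")
print(f"Yearly Texas use ≈ {100 * texas_use_gal_per_year / texas_reservoir_storage_gal:.0f}% of Texas reservoir storage")
```

With those inputs you land right around the figures above: a month of statewide use is about a week of Chesapeake inflow, and a year of use is roughly 0.08% of the Great Lakes but ~45% of in-state reservoir storage.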
We've got a lot of water, but it's not distributed evenly, and we should probably build some sort of water pipeline eventually so water-rich states can sell to water-poor states.
Again, this is all just speculation by someone who knows not a damn thing about municipal water management.
> Why don’t data centers use gray water more often?
DCs will just use the cheapest source that meets their needs. If they have to treat greywater and that costs more than municipal potable water, they'll use the potable water. (In part this is utilities selling their potable water too cheaply.)
> Wouldn’t that be better for basically everyone?
No; if it was cheaper for DCs, they'd already be doing it. But it isn't an insurmountable cost -- DCs still pencil with slightly more expensive cooling.
Those of us by the Great Lakes would prefer that our water not get sold to other places, thanks.
Grey water would normally get treated and then discharged into a river or lake or other local water body. If you evaporate it at a data center, then you break that local loop. It's really only different from using potable water in that you save a bit on the expense of fully treating it.
Grey water from where?
Using a quarter of the entire monthly freshwater inflow into the Chesapeake Bay makes it sound enormous. That's multiple major rivers' worth of water for a bit over 30 million people.
I live near the Potomac and always figured the region was wet enough that water was not a concern. You have me rethinking that somewhat.
Because they're taking water from already parched regions, often pumping it out of the ground. Even if the water did come back locally as rain (it doesn't), drawing down aquifers faster than they recharge still makes it impossible for people to live off the same water sources sustainably.
People are losing their minds in Wisconsin saying proposed data centers will drain lake michigan. I'm not kidding.
Hope they don't find out how much is lost naturally to evaporation each year.
Just 30 minutes from where I live, data centers are having an impact on water used for farming.
https://www.theguardian.com/global-development/2024/sep/25/m...
https://www.bbc.com/news/articles/cx2ngz7ep1eo
If only we could do water-intensive activities in areas where water is abundant and then ship the resulting products to where they are needed...
In a far science fiction future, I could e.g. imagine connecting LLM inference data centers to a global data network instead of always having to drive up to them to ask my prompts.
The water isn’t gone but if it comes back as rain, it at least has to be cleaned again, since data centers probably don’t use raw rainwater for cooling.
It’s probably still not too bad, but there’s at least some treatment work that gets “used up” when you let tap water (or, more likely, demineralized water used for cooling) evaporate.
The problem is that data centers use SO MUCH water... sure we humans let water evaporate, but this is a new source of water "waste" to the tune of nearly 2 billion gallons/year, just in Loudoun County, Virginia & connected water users [0] (rough scale math below).
When that water source is underground wells, the aquifer can take years (on the fast end) or decades (more typically) to recharge. Look at California's water issues -- so many wells extracting water for farming have literally changed the land's topography through subsidence.
Also, when water 'comes back', it might come back in the ocean and not on land... reducing the available fresh water without desalination.
Data centers need the water for cooling... but maybe there's room to find incentives for them to do it responsibly, while making sure our water bills don't go up the way our electric bills are because of the extra load they're putting on utilities.
[0]: https://www.theregister.com/2024/08/19/virginia_datacenter_w...
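Back-of-envelope on that 2 billion gallons/year figure, just for scale. The per-person number is my own assumption (the oft-quoted ~82 gallons/person/day of residential use), not something from the article:

```python
# Rough scale check on the ~2 billion gallons/year Loudoun County figure above.
# per_person_gal_per_day is an assumed average residential figure, not from the article.

dc_water_gal_per_year = 2e9
per_person_gal_per_day = 82

gal_per_day = dc_water_gal_per_year / 365
people_equivalent = gal_per_day / per_person_gal_per_day

print(f"~{gal_per_day / 1e6:.1f} million gallons/day")
print(f"≈ the residential use of ~{people_equivalent:,.0f} people")
```

That works out to roughly 5.5 million gallons/day, or the residential use of a town of 65,000-70,000 people -- not world-ending, but very noticeable for one county's water system.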
It doesn't come out a little hotter; it gets evaporated in cooling towers. Same result as any other water usage. Cooling towers can't use seawater either. Most datacenters are in places where fresh water is abundant anyway, but some are not.
Anyway agricultural water usage is way worse in California.
Agriculture is used to grow _food_.
Some of it, but then some AI is used to cure cancer.
But it doesn't have to happen in California.
The rain doesn’t fall directly above where the water evaporates. And “slightly warmer” waste water can have major ecological impacts, destroying native life in the lakes and rivers where it is discharged. Plus, if the water is drawn from underground aquifers that may not be refilling fast enough, or taken from downstream users, that’s something to be concerned about.
I have also wondered this and came to a similar conclusion about the politics.
This whole time I've been wondering how it's possible that people don't realize how common evaporative cooling is for much larger buildings that are far more numerous than these data centers, and especially in dry climates where drought is common.
> Or an agricultural one that puts water in plants and then ships it off to some other region
Just like agriculture, a data center puts water into cooling chips and ships the tokens off to some other region?
I honestly don't know if you are an AI astroturfing bot. No, I am not being sarcastic. Given this is the top comment and there is no reply yet, here you go.
For a pre-chewed eli5 overview, check this: https://www.eesi.org/articles/view/data-centers-and-water-co...
A responsible human must always verify information, so I'll cite DW as a secondary information source. For instance https://www.dw.com/en/why-does-ai-need-so-much-energy/video-...
tldr: chip immersion uses less water but is more expensive. Water evaporation is the opposite. Datacenters will use the cheapest they can get away with. Water is scarce; evaporated water is as unavailable as contaminated water. Read the information sources.
The explanation about chip immersion is wrong, though. It's not a water-saving technique; it's for cooling dense racks more effectively and maybe saving energy. That warm coolant still needs to be cooled before it goes back through the chips, most likely the same way: an evaporative cooling tower. Air cooling systems also involve a similar fluid loop; it just doesn't go into the chips.
"Since the technology uses synthetic fluids, it requires significantly less water than other approaches." This is like saying that a new car radiator uses less water than an old water-based one: like, yeah, technically it requires water to work, but you aren't boiling it away.
It does mention the real ways to use less water, too: either chillers (which use way more power) or running in a colder climate (rough evaporation numbers below).
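For a sense of why evaporative cooling moves so much water, here's the basic physics. The 2,400 kJ/kg latent-heat figure is approximate (it varies a bit with temperature), and the result ignores blowdown and drift, so real towers use somewhat more:

```python
# Rough estimate of evaporative water loss per MW of heat rejected.
# Uses the latent heat of vaporization of water (~2,400 kJ/kg near ambient);
# ignores blowdown and drift, so real cooling towers use somewhat more.

heat_rejected_kw = 1_000                 # 1 MW of heat to reject
latent_heat_kj_per_kg = 2_400            # approximate latent heat of vaporization

evap_kg_per_s = heat_rejected_kw / latent_heat_kj_per_kg
evap_m3_per_day = evap_kg_per_s * 86_400 / 1_000     # 1,000 kg of water ≈ 1 m³
evap_gal_per_day = evap_m3_per_day * 264.17

print(f"~{evap_kg_per_s:.2f} kg/s evaporated")
print(f"~{evap_m3_per_day:.0f} m³/day (~{evap_gal_per_day:,.0f} gallons/day) per MW")
```

So a campus rejecting 100 MW of heat purely evaporatively is in the rough ballpark of a million gallons a day, which is why siting and the water source matter so much.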
I’m not a bot, but maybe I was too quick to not inspect my gut response. I guess I’ll look into it more, maybe this can be a learning experience.
FWIW the comment is just at +2 at the moment, I think it is just at the top of the thread because it is recent and has discussion.
> tldr: chip immersion uses less water but is more expensive. Water evaporation is the opposite. Datacenters will use the cheapest they can get away with.
This suggests a simple fix: charge the datacenters (not people) more for the water, to make the other option competitive (toy sketch below).
No need to throw out the baby with the... erm, bathwater.
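As a toy illustration of how that pricing lever works (all the dollar figures are invented for the example, not real utility rates):

```python
# Toy model of the "price the externality" idea: a datacenter simply picks the
# cheaper source, so raising the industrial potable rate past the cost of
# treating greywater flips the decision. All rates below are hypothetical.

def chosen_source(potable_rate, greywater_cost):
    """Return whichever source is cheaper per 1,000 gallons."""
    return "potable" if potable_rate < greywater_cost else "greywater"

greywater_cost = 6.00                      # assumed $/kgal to take in and treat greywater on-site
for potable_rate in (4.00, 6.50, 9.00):    # assumed $/kgal industrial potable rates
    print(f"potable at ${potable_rate:.2f}/kgal -> datacenter uses {chosen_source(potable_rate, greywater_cost)}")
```

The datacenter economics work either way; which source actually gets used is almost entirely a function of how the utility prices the potable supply.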
By that argument water use is never a bad thing since all water comes back as rain. The problem is that data centers need to use clean water, which has to be treated. On a local scale, a large data center could starve a community of potable water, even if the state-wide water use is very small.