
Comment by margalabargala

2 days ago

Data centers consume enormous amounts of water for evaporative cooling. What part is nonsense?

If the data center is built somewhere with ample water supplies this isn't an issue. If it's pulling from groundwater this can be a huge issue. Groundwater isn't infinite and is being depleted in many areas.

In the USA, data centers consume about 164 billion gallons of water annually [1].

Irrigation withdraws 118 billion gallons per day [2], and thermoelectric power plants a further 133 billion gallons per day [2].

There are enormous amounts, and then there are enormous amounts. If you really want to get mad about water being wasted, look up what Californian alfalfa growers pay for their water.
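
To put the cited figures on a common footing, here's a quick unit-conversion sketch (Python) expressing the annual datacenter figure on the same per-day basis as the irrigation and thermoelectric withdrawals; the inputs are just the numbers cited above.

```python
# Put the cited figures on a common per-day basis.
datacenters_gal_per_year = 164e9    # [1] annual US datacenter consumption
irrigation_gal_per_day = 118e9      # [2] US irrigation withdrawals
thermoelectric_gal_per_day = 133e9  # [2] US thermoelectric withdrawals

datacenters_gal_per_day = datacenters_gal_per_year / 365
print(f"Data centers:       {datacenters_gal_per_day / 1e9:.2f} billion gal/day")
print(f"vs. irrigation:     {datacenters_gal_per_day / irrigation_gal_per_day:.1%}")
print(f"vs. thermoelectric: {datacenters_gal_per_day / thermoelectric_gal_per_day:.1%}")
```

That works out to roughly 0.45 billion gallons per day, well under 1% of irrigation's daily draw.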

[1] https://www.eesi.org/articles/view/data-centers-and-water-co...
[2] https://pubs.usgs.gov/fs/2018/3035/fs20183035.pdf

  • New datacenter projects are usually closed loop now.

    From your first citation:

    > Closed-loop cooling systems enable the reuse of both recycled wastewater and freshwater, allowing water supplies to be used multiple times. A cooling tower can use external air to cool the heated water, allowing it to return to its original temperature. These systems can reduce freshwater use by up to 70%.

    • Citation please, I don’t buy it. Evaporative cooling towers almost double the efficiency of heat rejection vs a closed loop system. I don’t see any data center operator giving up those operating cost efficiency gains just to save some water, but I could be wrong.
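
      One way to see where those efficiency gains come from: an evaporative tower can cool water toward the ambient wet-bulb temperature, while a dry closed-loop cooler can only approach the (higher) dry-bulb temperature. A minimal sketch of that difference; the temperatures and approach values below are illustrative assumptions, not measurements:

      ```python
      # Illustrative cooling-water supply temperatures for evaporative
      # vs. dry (closed-loop) heat rejection on a hot, dry day.
      dry_bulb_c = 35.0  # assumed ambient air temperature
      wet_bulb_c = 22.0  # assumed wet-bulb temperature (dry climate)
      approach_c = 4.0   # assumed tower/cooler approach temperature

      evaporative_supply_c = wet_bulb_c + approach_c  # tower approaches wet bulb
      dry_cooler_supply_c = dry_bulb_c + approach_c   # dry cooler limited by dry bulb

      print(f"Evaporative tower supply: {evaporative_supply_c:.0f} C")
      print(f"Dry cooler supply:        {dry_cooler_supply_c:.0f} C")
      # Colder supply water means less chiller work per unit of heat
      # rejected, which is where the operating-cost advantage comes from.
      ```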


  • It's not a question of quantity but of distribution.

    I'm not defending the waste of water that is growing alfalfa in the desert for export, but there are plenty of places datacenters are built where the water they use is impactful.

    They can both be bad. And unlike the legal mess that is US irrigation water rights, data centers are a lot easier to do something about.

I was under the impression they capture the evaporation, let it cool, and recycle it?

  • I guess it's possible to have a condensing station, but generally speaking you'd need to supply input energy to get the vapor to cool down and condense somehow. The bigger question is: if a datacenter uses evaporative cooling, where does the moisture go? If it just feeds a cloud system that rains on nearby fields, it's not much different from irrigating crops. If it feeds clouds that drift offshore and rain into the ocean, it's similar to just diverting drinking water into the ocean.

    • I must be missing something: why can't it be an entirely closed loop, like the water radiator in an old car? A simple fan blowing air through large radiator cores would surely condense the vapor within the system, keeping the water in the loop.


  • > I was under the impression they capture the evaporation, let it cool, and recycle it?

    So, how do they get rid of the latent heat of evaporation that's released when the water recondenses?

    The whole point of evaporative cooling is to soak up that latent heat and release it later, out in the environment, when the water recondenses somewhere else.

    It's kind of like why Dune's stillsuits don't work.
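
    For scale, a back-of-the-envelope number on that latent heat (Python; 2.26 MJ/kg is the textbook latent heat of vaporization of water, and it's of the same order at ambient temperatures):

    ```python
    # Heat released when water vapor recondenses -- the same heat the
    # evaporative cooler exported by evaporating it in the first place.
    LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water

    # Recapturing the vapor on site means a condenser must reject this
    # heat all over again, which is the original problem.
    energy_mj = LATENT_HEAT_J_PER_KG * 1.0 / 1e6  # per liter (~1 kg)
    print(f"Condensing 1 L of vapor releases ~{energy_mj:.2f} MJ (~0.63 kWh)")
    ```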

  • A 1 GW heat source evaporates about 9 million gallons per day.

    In 2024, US data centers consumed power at an average rate of about 21 GW.

    So, that would be about 70 billion gallons per year evaporated.
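
    Those figures check out under the assumption that essentially all of the heat is rejected by evaporation. A quick sanity check (Python), using ~2.44 MJ/kg for the latent heat of vaporization of water near 25 C and 3.785 L per US gallon:

    ```python
    # Sanity-check the evaporation arithmetic above.
    LATENT_HEAT_J_PER_KG = 2.44e6  # latent heat of vaporization near 25 C
    KG_PER_GALLON = 3.785          # ~1 kg/L times 3.785 L per US gallon
    SECONDS_PER_DAY = 86_400

    def gallons_per_day(heat_watts: float) -> float:
        """Water evaporated per day if all heat is rejected evaporatively."""
        kg_per_day = heat_watts / LATENT_HEAT_J_PER_KG * SECONDS_PER_DAY
        return kg_per_day / KG_PER_GALLON

    print(f"1 GW:  {gallons_per_day(1e9) / 1e6:.1f} million gal/day")          # ~9.4
    print(f"21 GW: {gallons_per_day(21e9) * 365 / 1e9:.0f} billion gal/year")  # ~72
    ```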

They’ll be built and deployed in space soon. Elon said so.

  • The reason they consume water is the same reason space is a bad place to put data centres: getting rid of the heat is a challenge. Having only radiative heat dissipation is going to severely limit space-based manufacturing and computing; it already puts significant constraints on the space station.
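
    To put a number on that constraint, here's a Stefan-Boltzmann sketch; the 1 GW load, 300 K radiator temperature, and 0.9 emissivity are illustrative assumptions:

    ```python
    # Radiator area needed to reject heat purely by radiation into deep space.
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def radiator_area_m2(heat_watts: float, temp_k: float,
                         emissivity: float = 0.9) -> float:
        """Single-sided radiator area, ignoring solar and earthshine input."""
        return heat_watts / (emissivity * SIGMA * temp_k ** 4)

    # Illustrative: a 1 GW datacenter with radiators at 300 K (~27 C).
    area = radiator_area_m2(1e9, 300.0)
    print(f"~{area / 1e6:.1f} km^2 of single-sided radiator")  # ~2.4 km^2
    ```

    A double-sided panel halves that, but it's still on the order of a square kilometer of radiator per gigawatt.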