Comment by Havoc
13 hours ago
Could someone explain to me why they don't build these things near oceans? Like nuclear plants that need plenty of cooling capacity too
Two loop cycle with heat exchanger to get rid of the heat
So Ashburn VA is a datacenter hub because the very first non-government Internet Exchange Point (IXP) anywhere in the world was there (https://en.wikipedia.org/wiki/MAE-East). Back in the 1990s something like half of all internet traffic worldwide hit MAE-East. That in turn made AWS put their first region there (us-east-1 preceded eu-west-1 by 2 years and us-west-1 by 3 years). Then, because there were lots of people who knew how to build DCs, and lots of vendors who knew how to supply them, the Dulles Corridor became a major hub for lots of companies' datacenters. For AWS, because us-east-1 was the first, it's by far the most gnarly and weird, and a lot of control planes for other AWS services end up relying on it. Which is why it goes down more often than other regions, and when it does go down it makes national news, unlike, say, eu-south-2 in Spain.
But NoVA is basically the same sort of economic cluster that Paul Krugman won his Nobel Prize in Economics for studying, just for datacenters, not factories.
Well said. I'll also add that with these networks, the sooner you can get traffic off your network the better. There's strong incentive to have your datacenter near these peering points. And since MAE-East was the first, it's been the largest, as it's had the longest time to snowball. AOL's HQ was here, Equinix built their peering point soon after MAE-East, etc.
There's a great read about the whole area here: https://www.amazon.com/Internet-Alley-Technology-1945-2005-I...
As for AWS, I often see it repeated that the DCs are the oldest and therefore in disrepair. That's not true; many of the first ones have since been replaced. But there are services that are located here and only here.
But I'll also add, a lot of customers default to using US-East-1 without considering others, and too many deploy in only one AZ. Part of this is AWS's fault as their new services often launch in US-East-1 and West-2 first, so customers go to East-1 to get the new features first.
Speaking as one who was with AWS for 10 years as a TAM and Well-Architected contributor, I saw a lot of customers who didn't design with much resiliency in mind, and so they get adversely affected when east-1 has an issue (either regional or AZ). The other regions have their fair share of issues as well. It's not so much that east-1 necessarily fails more than the others, it's that it has so many AZs and so many workloads that people notice it more.
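To make the multi-AZ point concrete, here is a minimal boto3 sketch of spreading an Auto Scaling group across every available AZ in a region instead of pinning everything to one; the group name and launch template are hypothetical placeholders, not anything from the comments.

```python
# Hypothetical sketch: spread an Auto Scaling group across every available AZ
# in a region. Group and launch-template names are placeholders.
import boto3

region = "us-east-1"
ec2 = boto3.client("ec2", region_name=region)
autoscaling = boto3.client("autoscaling", region_name=region)

# Discover the AZs available to this account in the region.
azs = [
    z["ZoneName"]
    for z in ec2.describe_availability_zones(
        Filters=[{"Name": "state", "Values": ["available"]}]
    )["AvailabilityZones"]
]

# At least one instance per AZ: a single-AZ outage then removes only a slice
# of capacity instead of the whole workload.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-fleet",                 # placeholder
    LaunchTemplate={"LaunchTemplateName": "web-lt"},  # placeholder
    MinSize=len(azs),
    MaxSize=3 * len(azs),
    AvailabilityZones=azs,
)
```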
> But there are services that are located here and only here
Why is that? You would think that company-ending events like IAM going poof due to its dependence on us-east-1 would be a top priority to fix?
The underlying reason is more that by being on the US east coast you have roughly equal latency to customers on the US west coast and in Europe. That's a very large population covered from a single site.
If you're building a single datacenter site this is where you start building first.
LATAM as well; all the major submarine cables land on the east coast. Surprisingly, even from Mexico the latency is often better to US East than to US West.
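A back-of-the-envelope sketch of that coverage claim, assuming ~200,000 km/s propagation in fiber and a ~1.4x route-inflation factor over the great circle (both assumptions, as are the coordinates):

```python
# Rough RTT estimate from great-circle distance. Light in fiber travels at
# roughly 200,000 km/s, and real routes are longer than the great circle,
# so apply an assumed ~1.4x inflation factor. Purely illustrative numbers.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

FIBER_KM_PER_MS = 200   # ~200,000 km/s in fiber
ROUTE_FACTOR = 1.4      # assumed path inflation over the great circle

ashburn = (39.0, -77.5)  # approximate Northern Virginia coordinates
for name, coords in {
    "San Francisco": (37.8, -122.4),
    "London": (51.5, -0.1),
    "Mexico City": (19.4, -99.1),
    "Sao Paulo": (-23.5, -46.6),
}.items():
    dist = haversine_km(*ashburn, *coords)
    rtt_ms = 2 * dist * ROUTE_FACTOR / FIBER_KM_PER_MS
    print(f"{name:>13}: ~{dist:5.0f} km great circle, ~{rtt_ms:3.0f} ms RTT")
```

Under these assumptions the US west coast and Western Europe land in the same rough 55-85 ms band from Northern Virginia, which is the "one site covers a lot of people" argument.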
Amusingly I've been part of two critical downtime heating incidents at two different datacenters: one was when Hosting.com's SOMA datacenter got so hot that they were using hoses on the roof to cool it down; and the second one was when Alibaba's Chai Wan datacenter got so hot everything running there went down, including the control plane. So I imagine the proximity to the ocean does not yield any additional advantage in terms of emergency heat sinking. You have x capacity to pump heat out and it doesn't matter if you're next to the sea or in the middle of Nebraska because your entire system needs to be built to be rated for some performance.
yeah but capacity is easier/cheaper to build/overbuild if you can access cold-ish water at all times
Didn't really help Fukushima though. In fact, the ocean came to it. They didn't have to go get it.
Off the top of my head: a water system carrying ocean levels of salt is much more expensive to maintain (even just the secondary loop).
Coastal land is much more expensive, and if you go to a remote coastal site, you probably won't have as good access to power.
Coastal sites are usually exposed to more severe weather events.
And other fun unpredictable things: e.g. the Diablo Canyon nuclear facility has had issues with debris and jellyfish migrations blocking its saltwater cooling intake.
https://www.nbcnews.com/news/world/diablo-canyon-nuclear-pla...
And oysters / mussels / clams / every other creature that starts small and turns calcium into brick finds your cooling system to be a delightful place to raise a family, especially in delicate heat exchangers with small easily blockable passages.
I had a class in my masters about data centers (HPC Infrastructures). The professor used some data centers somewhere in the middle of the USA, in an area with hot weather, as an example, and compared them with an ideal scenario (weather, power source, etc.).
In one of the slides there was a list of factors that influence the decision of where to build a data center, and several of the items involved finding a place with enough space and enough skilled people to staff the data center. He also commented that sometimes politics is involved in choosing the site for the next data center.
Oceans have salt, and saltwater is worse for electronics than fresh water. You also need sufficient water depth, otherwise the intake water warms to surface temperature. It also needs to be price-competitive with traditional evaporative cooling.
Toronto is the textbook example of this working. It's on a freshwater lake that is deep relatively close to the shore, and the downtown has expensive real estate blocking traditional methods.
https://en.wikipedia.org/wiki/Deep_Lake_Water_Cooling_System
In a proper 2-loop cooling system, the primary loop (with direct electronics contact) and secondary loop (with seawater/external cooling source) are hydraulically isolated by a heat exchanger. The salt water or whatever never gets anywhere near the electronics.
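A minimal steady-state sketch of that isolation, with round-number assumptions (a 10 MW IT load, typical water heat capacities, and assumed temperature rises, none of which come from the comment): each loop independently obeys Q = m·c_p·ΔT, and the heat exchanger moves Q between them without the fluids ever mixing.

```python
# Steady-state two-loop cooling sketch: the heat exchanger transfers the full
# IT load from the clean primary loop to the seawater secondary loop, so the
# only thing the loops share is heat. All figures are illustrative assumptions.
Q_WATTS = 10e6            # assumed 10 MW of IT load to reject

CP_FRESH = 4186           # J/(kg*K), primary loop (treated fresh water)
CP_SEAWATER = 3990        # J/(kg*K), secondary loop (seawater)

DT_PRIMARY = 12.0         # assumed K rise across the server-side exchangers
DT_SECONDARY = 8.0        # assumed K rise allowed in the seawater discharge

# Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT), for each loop separately.
m_primary = Q_WATTS / (CP_FRESH * DT_PRIMARY)         # kg/s of clean water
m_secondary = Q_WATTS / (CP_SEAWATER * DT_SECONDARY)  # kg/s of seawater

print(f"primary loop flow:   ~{m_primary:5.0f} kg/s")
print(f"secondary loop flow: ~{m_secondary:5.0f} kg/s")
```

The seawater side never touches the racks; the cost is owning an intake, filtration, and an exchanger that the ocean will try to foul, which is what the following comments get at.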
Salt comes in with the air; just being near the ocean corrodes everything. Corrosion-resistant materials like stainless steel and bronze are very expensive, and even then, not everything can be made from them, for strength reasons.
The problem is that it's still in contact with something, even if it's just the secondary loop. Saltwater is not just incredibly aggressive toward metal; the bigger problem with using it for cooling is fouling. Fish, mussels, algae, debris: there are a lot of things that can clog up your entire setup.
Lots of proposals to build them near Lake Michigan recently but the residents of Wisconsin only want auto parts stores and paper mills. They've been completely demonized. Cities and counties are passing no data center laws even though it's the perfect place for it.
Paper mills need a lot of heat energy to run the processes. Data centres produce a lot of heat. Sounds like a good combination?
Cold water -> data centre cooling loop -> warm water -> paper mill with heat pumps to transform low-grade heat into the required temperatures -> profit
Data center cooling system output has miserably low extractable energy.
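A rough way to see the tension, under assumed temperatures (nothing here is from the comments): take ~35 °C datacenter return water and lift it to ~130 °C paper-mill process heat with a heat pump; the Carnot limit caps how cheap that lift can be.

```python
# Back-of-the-envelope heat pump lift from datacenter waste heat to paper-mill
# process temperatures. Temperatures and the efficiency factor are assumptions.
T_SOURCE_C = 35.0       # assumed warm-water return from the datacenter loop
T_SINK_C = 130.0        # assumed process-heat temperature a mill might want

T_source = T_SOURCE_C + 273.15  # convert to kelvin
T_sink = T_SINK_C + 273.15

cop_carnot = T_sink / (T_sink - T_source)   # ideal (Carnot) heating COP
cop_real = 0.5 * cop_carnot                 # assume ~50% of Carnot in practice

print(f"Carnot COP:    {cop_carnot:.2f}")
print(f"Realistic COP: ~{cop_real:.2f} "
      f"-> roughly 1 kWh of electricity per {cop_real:.1f} kWh of delivered heat")
```

Whether a COP around 2 pencils out depends on electricity prices and how far the heat has to travel, which is roughly the disagreement between the two comments above.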
This is just a guess, but land near oceans is more expensive/populated, and water is comparatively cheap
They are, sometimes. Google built this one in Finland in 2011 at the site of an old paper mill, which was already set up to draw water from the Baltic Sea (which isn't as salty as the Atlantic is, but still not fresh water):
https://datacenters.google/locations/hamina-finland/
> Using a cooling system with seawater from the Bay of Finland and a new offsite heat recovery facility, our Hamina data centre is at the forefront of progressing our sustainability and energy-efficiency efforts.
Humidity and corrosion, it's a trade-off (pick your poison).