Comment by cobolcomesback
9 hours ago
It’s to run LLMs.
In the before-AI world, it mattered a lot where data centers were geographically located. They needed to be in the same general location as population centers for latency reasons, and they needed to be near major fiber hubs (with multiple connections and providers) for connectivity and failover. They also needed cheap power. This means there are only a few ideal locations in the US: places like Virginia, Oregon, Ohio, Dallas, Kansas City, Denver, and SF are all big fiber hubs. Oregon, for example, also has cheap power and water.
Then you have the compounding effect where, as you expand, you want new data centers near your already existing ones for inter-DC latency reasons. AWS can't expand us-east-1 capacity by building a data center in Oklahoma because it would break things like inter-DC replication.
Enter LLMs: a massive need for expanded compute capacity, but latency and failover connectivity don't really matter (the extra latency from sending a prompt to compute far away is dwarfed by the inference time, and latency for training matters even less). This opens up the possibility of placing data centers in locations they couldn't be before, and now the big priorities are just open land, cheap power, and water.
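To make the "dwarfed by inference time" point concrete, here's a back-of-envelope sketch. All the numbers are hypothetical (RTTs, token count, per-token generation speed are illustrative assumptions, not measurements), but the shape of the arithmetic is the argument:

```python
# Back-of-envelope comparison: extra network latency from a distant
# data center vs. total LLM response time. Numbers are illustrative
# assumptions, not measurements.

def response_time_ms(network_rtt_ms: float, tokens: int, ms_per_token: float) -> float:
    """User-perceived time: one network round trip plus token generation."""
    return network_rtt_ms + tokens * ms_per_token

# Assumptions: ~20 ms RTT to a nearby DC, ~70 ms cross-country,
# ~500 output tokens at ~20 ms per token.
near = response_time_ms(20, 500, 20)   # nearby data center
far = response_time_ms(70, 500, 20)    # distant data center

overhead_pct = (far - near) / near * 100
print(f"near: {near:.0f} ms, far: {far:.0f} ms, extra: {overhead_pct:.2f}%")
```

With these assumptions, putting the GPUs across the country adds well under 1% to the total response time, which is why inference capacity can chase cheap land and power instead of population centers.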
>Oregon for example also has cheap power and water.
Cheap for whom? For the companies having billions upon billions of dollars shoved into their pockets while still managing to lose all that money?
Power won't be cheap after the data centers move in. Then the price of power goes up for everyone, including the residents who lived there before the data center was built. The "AI" companies won't care; they'll just do another round of funding.
https://www.axios.com/2025/08/29/electric-power-bill-costs-a...