Comment by ToucanLoucan
4 days ago
I mean, I'm fine with datacenters plugging into the grid, if they pay for it. I don't understand (and I mean feel free to explain it) this weird shit where a datacenter goes up and everybody's power bills start increasing. I have assumed that it's because the grid's facilities require upgrades to meet the new demand, but in the case of the "new demand" being "one structure consuming an assload of power" it feels incredibly shitty to lay that burden on the taxpayers.
A lot of the increase in bills people are seeing comes from necessary upgrades to the distribution infrastructure - something that was going to happen anyway.
It's due to lack of investment in the power grid on a generational timeline. We used up every bit of slack and extra capacity in the name of efficiency and not needing to spend the money on building stuff.
It's also nearly impossible to build large-scale things like long distance transmission lines - so even stuff like solar fields and wind farms are difficult to make pencil out these days. You are talking a decade or more to get anything big done, if you are lucky.
We ran out of parlour tricks like gaming efficiency and curtailing residential usage, and we ran out of industry to offshore. This was coming for us either way; the AI datacenter buildouts just weren't expected and pulled demand forward by some number of years.
I was always planning on building an off-grid power setup for exactly this reason - the writing was on the wall decades ago. It just came a bit sooner than I expected!
A large industrial-scale power user that operates at roughly the same base load 24x7 is an absolute dream customer for a grid operator. The fact that we can't make the perfect customer profile pencil out without raising rates should be a giant red flashing warning sign with bells going off for everyone. Heck, these facilities can typically even participate in demand-shedding programs on top of being ideal.
We've been living off the cheap power our grandparents invested in building for us. Time has come to pay the piper.
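A rough sketch of why that flat 24x7 profile is so valuable (toy numbers of my own, not from any real utility): the load factor - average demand divided by peak demand - is close to 1, so every bit of capacity built for that customer earns revenue all day instead of sitting idle between peaks.

    # Toy illustration (invented numbers): load factor = average load / peak load.
    # A flat 24x7 datacenter uses the capacity built for it every hour;
    # a peaky residential profile needs capacity that sits idle most of the day.
    def load_factor(hourly_mw):
        return sum(hourly_mw) / (len(hourly_mw) * max(hourly_mw))

    datacenter  = [100] * 24                        # flat 100 MW, all day
    residential = [30] * 17 + [90] * 5 + [30] * 2   # short 90 MW evening peak

    print(load_factor(datacenter))    # 1.0  -> capacity fully utilized
    print(load_factor(residential))   # ~0.47 -> capacity sized for a brief peak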
Ideally, the revenue from the new customer would be enough to cover the upgrades, so long as the new customer makes an up-front commitment (against which loans can be written) that makes their risk (of having to pay for the upgrades even if they shut down much sooner than expected) about equal to what it would be if they built their own off-grid system. And then the utility could sell to existing customers for slightly less than before, thanks to scale and an overall reduction in the peak-to-baseline ratio.
But I guess this isn't how the world works.
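To put toy numbers on that "ideally" case (all invented, just to show the shape of the argument): the committed revenue over the contract term can dwarf the upgrade cost, and spreading the existing fixed costs over more sales nudges everyone else's per-kWh share down a little instead of up.

    # Invented numbers, purely illustrative.
    upgrade_cost     = 500e6                 # one-time grid upgrade for the new load, $
    commitment_years = 15                    # up-front contractual commitment
    dc_rate          = 0.08                  # negotiated bulk rate, $/kWh
    dc_energy_kwh    = 100_000 * 8760 * commitment_years   # 100 MW flat load: kW * h/yr * yr

    print(dc_energy_kwh * dc_rate >= upgrade_cost)   # True: the commitment can back the loans

    # Existing fixed costs get spread over more sales, so everyone else's
    # per-kWh share can drop slightly instead of rising.
    existing_fixed = 2e9                     # fixed grid costs, $/yr
    existing_sales = 20e9                    # kWh/yr before the datacenter
    print(existing_fixed / existing_sales)                       # 0.100 $/kWh before
    print(existing_fixed / (existing_sales + 100_000 * 8760))    # ~0.096 $/kWh after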
As you say, it's because the connection between the increased load and the spending it triggers is distant enough that it's hard to account for. If the datacenter operator argues (often with support from the power company, which has to convince government officials that its rate increase is OK) that most of the grid upgrades were going to happen regardless, and that they've already paid for the increase fairly attributed to their operations, how can you really know whether that's true?
There's also the supply/demand aspect of it. Some electricity is cheaper to provide than other electricity - the cheapest is the renewables or nuclear already built in the area, but when demand is high, the grid provider will source electricity from more expensive sources: coal, natural gas, or imports from neighboring utilities. So, using some made-up numbers, if your existing cost for 100MW is $0.10/Wh, getting the next 100MW might cost $0.50/Wh, bumping the cost for everyone up to $0.30/Wh.
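The same made-up numbers in code (reading the prices as $/kWh, as the reply below points out): the cheap existing generation and the expensive marginal supply get averaged, and the whole customer base pays the blend.

    # Made-up merit-order example: cheap existing supply plus expensive
    # marginal supply, blended across everyone.
    blocks = [
        (100, 0.10),   # MW from renewables/nuclear already built
        (100, 0.50),   # MW from marginal gas/coal/imports for the new demand
    ]
    total_mw = sum(mw for mw, _ in blocks)
    blended  = sum(mw * price for mw, price in blocks) / total_mw
    print(blended)     # 0.30 -> everyone's cost rises to the blended average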
kWh, but yes. I'm in CA so we don't have data centers because the cost of a kWh is already like $123134^100
Power doesn't just apparate out of thin air. It has to be generated, and that has costs. If the grid suddenly draws more power, then more costly sources have to feed it. Everyone pays for the same power.
The big consumer also buys in bulk and negotiates better rates etc.