Comment by dehrmann

2 days ago

> "some of those cost increases [to Oregon electric consumers] came from data centers coming onto our shared grid" Wochele said

[citation needed]

The article did go on...

> According to Oregon CUB, large industrial users, like data centers, that have connected to Portland General Electric’s system pay about 8 cents per kilowatt hour, or kWH, which is the unit of energy used when 1,000 watts of power is used in an hour. Residential customers in the same PGE system pay close to 20 cents per kilowatt hour

But that's a disingenuous comparison. Data centers are cheaper to serve because there's ~one massive line going to one place, power use is generally more fixed and predictable, and they might be paying less because they can reduce power use during heat waves.

The article doesn't detail this, and it's quite hard to find specifics without looking at a particular bill or provider, but for my provider in NYS, the cents/kWh electricity cost does not include transmission costs or the fee for being connected to the grid. Those are separate line items that cover the cost of the lines and infrastructure for the community. On a more arguable note, even if you only use "one big line" to connect, you're still part of the grid and should be shouldering some of the burden of maintaining that grid, not just your line.
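
For a sense of how those separate line items add up, here's a rough sketch of a residential bill in Python; every rate and the usage figure below is a made-up placeholder, not an actual tariff:

    # Back-of-envelope: what a kWh "really" costs a residential customer once
    # delivery charges and the fixed connection fee are folded in.
    # All numbers are illustrative placeholders, not real tariff rates.

    monthly_usage_kwh = 600      # assumed household consumption per month
    supply_rate = 0.10           # $/kWh "energy" line on the bill (assumed)
    delivery_rate = 0.07         # $/kWh transmission/distribution line (assumed)
    connection_fee = 20.00       # fixed $/month just for being on the grid (assumed)

    energy_charge = monthly_usage_kwh * supply_rate
    delivery_charge = monthly_usage_kwh * delivery_rate
    total_bill = energy_charge + delivery_charge + connection_fee

    print(f"headline energy rate : ${supply_rate:.2f}/kWh")
    print(f"effective all-in rate: ${total_bill / monthly_usage_kwh:.2f}/kWh")
    # With these placeholders the all-in rate is ~$0.20/kWh, roughly double the
    # headline energy rate, which is why quoting cents/kWh alone is misleading.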

If power use is an issue during heat waves, it means the build-out of solar capacity went wrong somewhere in the process.

In Poland/the EU we have an electricity surplus during summer, not a deficit.

  • It just depends on how the electricity is used. If you have lots of electric heaters and no air conditioning in your country, then demand is high in winter and low in summer. But if it's the other way around, then demand peaks during heat waves.

    • Our current solar capacity / peak generation is roughly enough to cover ACs even if adoption were 90% (like the US) rather than 10%. On top of that we have other sources, which we need anyway for winter (rough numbers in the sketch below).

      Oregon was very slow to adopt solar due to poor regulation and other reasons. It even says so in the article.
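
      A minimal back-of-envelope for that claim, in Python; the capacity, household count, and per-unit draw are placeholder assumptions, not official figures for Poland or anywhere else:

          # Rough check of the claim above: could peak solar output cover
          # air-conditioning demand even at US-style adoption rates?
          # Every input is a placeholder assumption, not an official figure.

          peak_solar_gw = 20.0   # assumed solar capacity at peak output, GW
          households_m = 14.0    # assumed number of households, millions
          ac_adoption = 0.90     # assume ~90% of homes run AC (US-like)
          ac_draw_kw = 1.2       # assumed average draw per running unit, kW

          # millions of households * kW per household = GW
          ac_load_gw = households_m * ac_adoption * ac_draw_kw

          print(f"estimated AC load : {ac_load_gw:.1f} GW")
          print(f"assumed peak solar: {peak_solar_gw:.1f} GW")
          # With these placeholders the two are in the same ballpark; swap in
          # real local figures to test the claim properly.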

Be that as it may, they use up a lot of power and are pushing demand ahead of supply, which can only mean higher prices:

https://www.opb.org/article/2024/08/26/fast-growing-energy-d...

  • This was predicted far before the latest datacenter investment craze.

    It’s what happens when a generation of people decide to stop building electrical capacity. The writing has been on the wall for decades, and the AI craze was the final tipping point. It will be scapegoated, of course.

    I’m highly skeptical that these are great investments overall, but the fact that the answer isn’t “use this money to help subsidize rapidly expanding grid capacity” is indicative of the unseriousness of society today.

    Ideally you use this newfound demand to build out capacity, and if the demand goes away because it was malinvestment, at least you have a bunch of shiny new grid capacity that was half paid for by stupid venture capital money.

There is a standing charge which is supposed to cover distribution costs, and a per-kWh price which is supposed to cover the energy itself. Almost 50% of my bill is the standing charge (I don’t know how high it is for PGE, though).
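
To see how a fixed standing charge distorts a bare cents-per-kWh comparison, here's a quick sketch; the charge and the usage levels are made-up placeholders, not PGE's or my provider's actual tariff:

    # How a fixed standing charge changes the effective per-kWh price.
    # The standing charge and usage levels are illustrative placeholders.

    standing_charge = 30.00   # $/month, fixed regardless of usage (assumed)
    energy_rate = 0.10        # $/kWh for the energy itself (assumed)

    for label, usage_kwh in [("small household", 300),
                             ("large household", 1_000),
                             ("industrial user", 1_000_000)]:
        total = standing_charge + usage_kwh * energy_rate
        print(f"{label:>16}: {total / usage_kwh:.4f} $/kWh effective")
    # The fixed charge dominates a small bill but vanishes into a huge one, so a
    # high-volume customer's blended rate looks far cheaper even at the same
    # underlying energy price.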

It’s not really disingenuous, as distribution from a local substation to individual customers is cheap in most cases. When you start talking about 10,000+ homes on, say, 1-acre lots, they collectively use a lot of power in a fairly small area.

Most of the distribution costs occur on the other side of a substation, due to efficiency losses from long-distance transmission and the like. A data center located next to a power plant has some advantages, but it still needs power when that power plant is offline.
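
For a rough sense of scale, here's a sketch comparing that neighborhood's aggregate load with a single large data center; all figures are hypothetical placeholders:

    # Scale comparison: a dense residential area vs. one large data center.
    # All figures are hypothetical placeholders for illustration.

    homes = 10_000
    avg_kw_per_home = 1.5                               # assumed average demand, kW
    neighborhood_mw = homes * avg_kw_per_home / 1_000   # kW -> MW

    data_center_mw = 100                                # assumed draw of one large data center, MW

    print(f"10,000 homes   : ~{neighborhood_mw:.0f} MW spread across many local feeders")
    print(f"one data center: ~{data_center_mw} MW at a single connection point")
    # The data center can draw as much as several such neighborhoods combined,
    # but it does so through one dedicated feed, while the neighborhood needs a
    # whole web of local distribution to reach every home.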