Comment by YetAnotherNick
16 hours ago
Why do you believe this? Datacenters use just 1-1.3 percent of grid electricity, and even if you suppose AI doubled that usage (which I really doubt), the number would still be tiny.
Also, AI training is the easiest workload to regulate, since you can train only when cheaper green energy is available.
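A rough back-of-envelope of that claim (the 1-1.3% share and the hypothetical 2x are the only inputs, taken straight from this comment; everything else is just arithmetic):

```python
# Back-of-envelope: if datacenters draw ~1-1.3% of grid electricity and AI
# doubles that (a hypothetical worst case), how big is the resulting share?
datacenter_share = 0.013      # upper end of the 1-1.3% estimate
ai_growth_factor = 2.0        # assumed doubling, for the sake of argument

new_share = datacenter_share * ai_growth_factor
print(f"share after doubling: {new_share:.1%}")                       # ~2.6%
print(f"extra grid-wide demand: {new_share - datacenter_share:.1%}")  # ~1.3%
```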
I also had doubts, but I asked ChatGPT and it confirms it's an issue - including sources.
https://chatgpt.com/share/678b6b3e-9708-8009-bcad-8ba84a5145...
The issue is that datacenters are often localised, so even if they draw just 1% of total power, they can cause problems for the local grid.
Still, by themselves, grid issues don't mean climate issues. And to be credible, any argument about the CO2 cost should also consider the alternative: even if AI were causing 1% or 2% or 10% of energy use, the real question is how much it saves by making society more efficient. And even if it saved nothing, it's again more a question of energy companies polluting with CO2.
Microsoft, which hosts OpenAI, is famously strong on CO2 emissions - so far it has gone well beyond what other companies are doing.
ChatGPT didn't "confirm" anything there. It is not a meaningful reference.
What do you mean by "confirms the issue"? What's the issue, exactly?
The issue is that when you have high local usage, your grid loses the ability to respond to peaks, since that capacity is now always in use. Essentially it raises the baseline load, which means your elasticity is pretty much gone.
A grid isn't a magic battery that is always there; it is constantly fluctuating, regardless of the intent of producers and consumers. You need enough elasticity to deal with that fact. Changing that is hard (and expensive), but it is the only way (such is the technical reality).
The solution is not to build, say, 1000 extra coal-fired generating facilities, since you can't really turn them on or off at will. The same goes for gas, nuclear, etc. You need a few of them for your baseline load (combined with other sources like solar, wind, hydro, whatever), make sure your non-renewable sources have margin and redundancy, and use storage for the rest. This was always the case, and it will always be the case.
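A minimal sketch of that headroom argument (every capacity and load figure here is made up for illustration): once an always-on load eats into the spare dispatchable capacity, the grid can no longer cover the same demand peak.

```python
# Toy model: fixed dispatchable capacity, an average baseline, and a new
# always-on datacenter load. The question is how much headroom is left
# to absorb a worst-case demand spike. All figures are hypothetical.
CAPACITY_MW = 10_000      # total dispatchable generation
BASELINE_MW = 7_000       # average demand before the new load
PEAK_SWING_MW = 2_500     # worst-case spike the grid must be able to absorb

def headroom(always_on_load_mw: float) -> float:
    """Spare capacity left once the always-on load is added to the baseline."""
    return CAPACITY_MW - (BASELINE_MW + always_on_load_mw)

for dc_load in (0, 500, 1_500):
    spare = headroom(dc_load)
    ok = spare >= PEAK_SWING_MW
    print(f"datacenter load {dc_load:>5} MW -> headroom {spare:>5.0f} MW "
          f"({'covers' if ok else 'CANNOT cover'} a {PEAK_SWING_MW} MW peak)")
```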
But with information technology, you can now permanently raise demand on the grid to an extreme degree, and that is where the problem becomes much more apparent. And because it's not manufacturing (which is an extreme consumer of energy), you don't really get the "run on lower output" option. You can't have an LLM do "just a little bit of inferencing", just like you can't have Netflix send only half a movie to "save power".
In the past we had the luxury of lower nighttime demand, which meant industry could shift its usage there, but datacenters don't sleep at night. And they can't hold their work for batch processing later in the day either.
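To put the day/night point in numbers (an entirely made-up hourly demand curve): the same daily energy raises the system peak when it runs 24/7, but not when it can be shifted into the nighttime trough the way old batch workloads were.

```python
# Hypothetical hourly demand (MW): 8 night hours, 12 daytime hours, 4 evening hours.
day_demand = [60] * 8 + [90] * 12 + [70] * 4

flat_load  = [10] * 24                      # always-on datacenter: 10 MW, 24/7
night_load = [30] * 8 + [0] * 12 + [0] * 4  # same 240 MWh squeezed into the night

peak_flat  = max(d + l for d, l in zip(day_demand, flat_load))
peak_night = max(d + l for d, l in zip(day_demand, night_load))
print(peak_flat, peak_night)  # 100 vs 90: only the always-on load raises the peak
```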