Comment by jillesvangurp

3 hours ago

I did a quick ChatGPT research fact check on this and did not find any obvious red flags. That's not a substitute for good research, obviously. I don't think it matters unless they are off by something outrageous, like at least an order of magnitude. That would push it close to one tenth of a percent, at which point you could argue it rivals some of the more minor sources of emissions, like aviation (2-3%, I believe). Even then you'd still be off by another order of magnitude. I have no reason to believe that is the case. But please do share if you have other/better information.
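The orders-of-magnitude argument above can be sketched numerically. This is a minimal sanity check, not a verification: the ~0.02% share is the report's figure cited later in this thread, and the 2-3% aviation range is the rough number mentioned above; neither is independently verified here.

```python
# Sketch of the orders-of-magnitude argument using figures from this thread
# (assumed, not verified): the report's ~0.02% share and aviation at 2-3%.
claimed_share = 0.02            # percent of emissions, per the report discussed
aviation_share = 2.5            # percent, midpoint of the 2-3% range cited above

inflated = claimed_share * 10   # if the report were off by a full order of magnitude
ratio_to_aviation = aviation_share / inflated

print(f"inflated share: {inflated:.2f}%")                    # a few tenths of a percent
print(f"aviation would still be ~{ratio_to_aviation:.0f}x larger")
```

Even granting a 10x error, the resulting share sits roughly an order of magnitude below aviation, which is the point being made above.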

I agree with you that reports like this typically have agendas and lots of little white lies, half truths, or assumptions that you might challenge. The question is whether they are overstating or understating the problem, and why. I can't judge that. I have my suspicions, but I kept those out of my original comment; other, of course, than pointing out that based on the published numbers, this does not seem like a very big problem.

Curious to hear which facts you verified with ChatGPT. I did provide some stats, sourced from the links I shared, and they suggest the 28 PJ figure is off by an order of magnitude, and that the conclusion of 0.02% of emissions might be off by as much as two orders of magnitude. What stats did you find that back up the paper's summary?

From the Technology Review article:

“In analyzing both public and proprietary data about data centers as a whole, as well as the specific needs of AI, the researchers came to a clear conclusion. Data centers in the US used somewhere around 200 terawatt-hours of electricity in 2024, roughly what it takes to power Thailand for a year. AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million US homes for a year.”

53 to 76 TWh ≈ 191 to 274 PJ, already used by AI in 2024
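The conversion behind that line is exact (1 Wh = 3600 J, so 1 TWh = 3.6 PJ); the 53-76 TWh range is the Technology Review figure quoted above, and the 28 PJ figure is the paper's number questioned earlier in the thread:

```python
# TWh -> PJ: 1 TWh = 1e12 Wh * 3600 J/Wh = 3.6e15 J = 3.6 PJ (exact).
TWH_TO_PJ = 3.6

low_twh, high_twh = 53, 76          # Technology Review's AI-server range for 2024
low_pj = low_twh * TWH_TO_PJ        # about 191 PJ
high_pj = high_twh * TWH_TO_PJ      # about 274 PJ

# Compared against the paper's 28 PJ figure discussed earlier in the thread:
paper_pj = 28
print(f"{low_pj:.0f}-{high_pj:.0f} PJ vs {paper_pj} PJ "
      f"(~{low_pj / paper_pj:.0f}x to {high_pj / paper_pj:.0f}x higher)")
```

Even the low end of the range is several times the paper's figure, consistent with the order-of-magnitude discrepancy claimed above.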