Comment by dahart

6 hours ago

> we’re talking about 0.019% of US emissions

That’s assuming the numbers are accurate and in the ballpark, and I’m having a really hard time getting the numbers in the paper to add up. Do you believe them, or better yet, do you have other sources that support or confirm these numbers?

Just from googling, what I get back are estimates that AI already consumed over 200 PJ in 2024, nearly 10x the number in the article, and that this is projected to double in the next few years. US electricity production already accounts for ~25-30% of US CO2 emissions, data centers are at least a quarter of that, and AI is now a huge driver of data center energy use. Data centers are using more than 4% of US electricity.

How is it possible that projected AI emissions are 0.019% from this one paper, while multiple other sources are estimating AI is already responsible for on the order of 2% of US emissions in 2024? I’m seeing a 100x discrepancy…
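As a quick back-of-the-envelope check on that gap, using only the two headline percentages above (a rough sketch, not new data):

    # Ratio between the paper's projected AI share of US emissions (0.019%)
    # and the rough ~2% share suggested by other 2024 estimates.
    paper_share = 0.019 / 100
    other_share = 2.0 / 100
    print(other_share / paper_share)  # ~105, i.e. roughly a 100x gap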

I don’t suspect the authors have intentionally downplayed their estimates, but a lot of the paper’s data is old enough that it’s not useful for examining AI trends today. The energy use data is from 2016 and 2019. The energy use of inference is based on GPT-3 and usage numbers from 2023. The estimate of NVIDIA servers sold is from 2023. AI usage has exploded in the last two years, and I suspect their estimates are off by orders of magnitude as a result.

The authors’ estimate of 28 PJ of future AI energy use rests on a whole stack of assumptions, where small errors at every step can compound into very large errors in the final number. It’s based on guesses about how automatable jobs are, not on observations of how AI energy use is actually changing today.
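To illustrate how that kind of chained estimate can drift (purely hypothetical numbers, not from the paper): if the estimate multiplies together, say, five assumptions and each is off by only a factor of 2 in the same direction, the result is off by 32x.

    # Illustrative only: compounding of per-step errors in a chain of
    # multiplied assumptions. The step count and 2x error are hypothetical.
    error_per_step = 2.0
    steps = 5
    print(error_per_step ** steps)  # 32.0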

https://www.pewresearch.org/short-reads/2025/10/24/what-we-k...

https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

I did a quick ChatGPT research fact-check on this and did not find any obvious red flags. That's not a substitute for good research, obviously. I don't think it matters unless they are off by something outrageous, like at least an order of magnitude. That would push it close to a tenth of a percent, at which point you could start comparing it to the more minor sources of emissions like aviation (2-3%, I believe), though you'd still be off by another order of magnitude. I don't have any reason to believe that is the case. But please do share if you have other/better information.

I agree with you that reports like this typically have agendas and lots of little white lies, half-truths, or assumptions you might challenge. The question is whether they are overstating or understating the problem, and why. I can't judge that. I have my suspicions, but I kept those out of my original comment; other, of course, than pointing out that, based on the published numbers, this does not seem like a very big problem.

  • Curious to hear what facts you verified with ChatGPT. I did provide some stats, sourced from the links I shared, and they suggest 28 PJ is off by an order of magnitude and that the conclusion of 0.02% of emissions might be off by as much as two orders of magnitude. What stats did you find that back up the paper’s summary?

    From the technology review article:

    “In analyzing both public and proprietary data about data centers as a whole, as well as the specific needs of AI, the researchers came to a clear conclusion. Data centers in the US used somewhere around 200 terawatt-hours of electricity in 2024, roughly what it takes to power Thailand for a year. AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million US homes for a year.”

    53 to 76 TWh ≈ 191 to 274 PJ, already used by AI in 2024
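
    Spelling out the unit conversion (1 TWh = 3.6 PJ) and comparing against the paper’s 28 PJ projection quoted earlier:

      # Convert the Technology Review range from TWh to PJ and compare it
      # to the paper's 28 PJ projection (all figures quoted above).
      PJ_PER_TWH = 3.6
      low, high = 53 * PJ_PER_TWH, 76 * PJ_PER_TWH
      print(low, high)            # 190.8 273.6 PJ used by AI in 2024
      print(low / 28, high / 28)  # ~6.8x to ~9.8x the paper's 28 PJ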