
Comment by Legend2440

2 months ago

“AI is killing the planet” is basically made up. It’s not. Not even slightly. Like all industries, it uses some resources, but this is not a bad thing.

People who are mad about AI just reach for the environmental argument to try to get the moral high ground.

it does not use "some" resources

it uses a fuck ton of resources[0]

and instead of reducing energy production and emissions we will now be increasing them, which, given current climate prediction models, is in fact "killing the planet"

[0] https://www.iea.org/reports/energy-and-ai/energy-supply-for-...

  • Data centers account for roughly 1% of global electricity demand and about 0.5% of CO2 emissions, per your link. That's for data centers as a whole; the IEA and some other orgs group "data centres, AI, and cryptocurrency" into a single aggregate. On its own, AI accounts for roughly 10-14% of a given data center's total energy; cloud deployments make up ~54% and traditional compute around ~35%.
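
    Back-of-envelope, combining those rough shares (the numbers below are the approximate figures above, illustrative only, not precise IEA values), AI lands at around a tenth of a percent of global electricity:

      # Rough sketch: AI's implied share of global electricity, using the
      # approximate percentages quoted above (assumptions, not exact IEA data).
      dc_share_of_global_electricity = 0.01      # data centers ~1% of global demand
      ai_share_of_dc_energy = (0.10, 0.14)       # AI ~10-14% of data-center energy

      low, high = (dc_share_of_global_electricity * s for s in ai_share_of_dc_energy)
      print(f"AI ~{low:.2%} to {high:.2%} of global electricity")  # ~0.10% to 0.14%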

    The fact is that AI, by any definable metric, is only a sliver of the global energy supply right now. Outside the social media hype, what actual climate scientists and orgs talk about isn't (mostly) what AI is consuming now, it's what the picture looks like within the next decade. THAT is the real horror show if we don't pull policy levers.

    Anyone who says that AI energy consumption is "killing the planet" is either intentionally misleading or unbelievably misinformed. What's actually, factually "killing the planet" are energy/power, heavy industry (steel, cement, chemicals), transport, and agriculture/land use. AI consumption is a rounding error compared to these. We'll ignore the fact that AI is actually being used to manage data-center energy efficiency and has reduced energy consumption at some hyperscale DCs (Amazon, Alibaba, Alphabet, Microsoft) by up to 40%, making it one of the only industry sectors with a real, non-trivial chance at net zero if deployed at scale.

    The most interesting thing about this whole paradigm is just how deep a grasp AI (specifically LLMs) has on the collective social gullet. It's like nothing I've ever been a part of. When Deepwater Horizon blew up and spilled roughly 210M gallons of crude into the Gulf of Mexico, people (rightfully so) got pissed at BP and Transocean.

    Nobody, from what I remember, got angry at the actual, physical metal structure.

    • > what actual climate scientists and orgs talk about isn't (mostly) what AI is consuming now, it's what the picture looks like within the next decade

      that's the point - obviously the planet is not dying _today_, but if we keep failing to cut emissions at this rate, we will kill it. So no, "killing the planet" is not misinformed or misleading.

      > Nobody, from what I remember, got angry at the actual, physical metal structure.

      Nobody's mad at LLMs either. It's the companies that control them and are fueling the AI "arms race" that are the problem.


  • This, and the insane amount of resources (energy and materials) it takes to build the disposable hardware. And all the waste it's producing.

    Simon,

    > I find Claude Code personally useful and aim to help people understand why that is.

    No offense, but we don't really need your help. You went on a mission to teach people how to use LLMs; I don't know why you feel the urge, but it's not too late to quit, or even to teach them not to, and why.

    • Given everything I've learned over the last ~3 years, I think encouraging professional programmers (and increasingly other knowledge workers) not to learn AI tools would be genuinely unethical.

      Like being an accountant in 1985 who learns to use Lotus 1-2-3 and then tells their peers that they should actively avoid getting a PC because this "spreadsheet" thing will all blow over pretty soon.
