Comment by camillomiller
16 hours ago
Is that worth the cost of this technology? Both in terms of financial shenanigans and its environmental cost?
Are you asking if the 10 seconds it takes AI to generate an image is more costly to the environment than a commissioned graphics artist using a laptop for 5-6 hours, or a painter who uses physical media sourced from all over the world?
In short, yes.
A modern laptop runs almost fanless, like a 486 from the days of yore.
A single H200 pumps out 700W continuously in a data center, and you run thousands of them.
Also, don't forget the training and fine tuning runs required for the models.
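Taking the numbers in this thread at face value, the per-image inference energy is easy to estimate. A minimal sketch, assuming a laptop averages around 20 W (my assumption; the thread doesn't give a figure) and using the 700 W / 10 s and 5-6 hour claims above; training energy is deliberately excluded here:

```python
# Back-of-envelope energy per image, using numbers from the thread.
h200_watts = 700          # H200 draw claimed above
inference_seconds = 10    # generation time claimed above
laptop_watts = 20         # ASSUMED average draw of a modern laptop
commission_hours = 5.5    # midpoint of the 5-6 hours mentioned above

wh_inference = h200_watts * inference_seconds / 3600   # watt-hours
wh_commission = laptop_watts * commission_hours

print(f"inference:  {wh_inference:.2f} Wh")   # 1.94 Wh
print(f"commission: {wh_commission:.0f} Wh")  # 110 Wh
```

On these (contested) assumptions the single inference is far cheaper, which is why the rest of the thread shifts the argument to training runs and total volume.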
Mass transportation / global logistics can be very efficient and cheap.
Before the pandemic, it was in some cases cheaper to import fresh tomatoes from half a world away than to grow them locally. A single container of painting supplies is nothing in the grand scheme of things, especially compared with what data centers are consuming and emitting.
This argument is so flawed that its conclusion almost loops back around to being correct again:
No, in terms of unit economics, I'm almost certain that the painting supplies have a bigger ecological/resource footprint than an LLM per icon generated, and I'm pretty sure the cost of shipping tomatoes does not decrease that footprint, even if it possibly dwarfs it.
But yes, due to Jevons paradox, total resource use might well increase despite all that. I, for example, would never have commissioned a professional icon for my silly little iOS shortcuts on my home screen, so my silly icon-related carbon footprint went from exactly zero to slightly above that.
This is a plainly dishonest comparison. A single H200 does not need to run continuously for you to generate a dozen pictures. And then you immediately pivot to comparing the paint usage against "the grand scheme of things" - by that standard, 700W is also nothing in the grand scheme of things.
These are unfair comparisons. It's not just a single laptop running all day; it's all the graphic designers' laptops that get replaced. It's not a single container of painting supplies; it's all of them (which are toxic, by the way).
So if power were plentiful and environmentally clean, you'd be on board with it?
Cheaper/faster tech increases overall consumption though. Without the friction of commissioning a graphics artist to design something, a user can generate thousands of images (and iterate on those images multiple times to achieve what they want), resulting in way more images overall.
I'm not really well versed on the environmental cost, more just (neutrally) pointing out that comparing a single 10s image to a 5-6 hour commission ignores the fact that the majority of these images probably would never have existed in the first place without AI.
Also, ignoring training when talking about the environmental costs is bad faith. Without training this image would not exist, and if nobody were generating images like these, the training would not happen. So we should really count the 10 seconds it took for inference, plus the weeks or months of high-intensity compute it took to train the model.
Wow, do you hold a degree in false dichotomies?
The environmental cost is significantly overblown, especially water usage.
I work with direct liquid cooled systems. If a datacenter runs open DLC systems (as most AI datacenters in the US in fact do), a lot of water is being wasted, 24/7/365.
A mid-tier top-500 system (think #250-#325) consumes about 0.75MW of power. AI data centers consume orders of magnitude more. To cool that behemoth, you need to pump tons of water per minute through the inner loop.
The outer loop might flow more slowly, but it's still a lot of heated water at the end of the day.
To prevent water wastage, you can go closed loop (for both inner and outer loops), but you can't escape the heat you generate and dump into the atmosphere.
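The "tons of water per minute" claim checks out with basic heat-transfer arithmetic. A sketch, assuming a 10 K temperature rise across the loop (my assumption; the comment doesn't state a ΔT) and the 0.75 MW figure above:

```python
# Coolant flow needed to carry away 0.75 MW of heat.
power_watts = 0.75e6      # mid-tier top-500 system, per the comment
cp_water = 4186           # specific heat of water, J/(kg*K)
delta_t = 10              # ASSUMED inlet/outlet temperature rise, K

kg_per_second = power_watts / (cp_water * delta_t)
tons_per_minute = kg_per_second * 60 / 1000
print(f"{kg_per_second:.1f} kg/s = {tons_per_minute:.2f} t/min")  # 17.9 kg/s = 1.08 t/min
```

So even the modest 0.75 MW system circulates roughly a metric ton of water per minute; scale that by orders of magnitude for an AI datacenter.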
So, the environmental cost is overblown in the same sense that Chernobyl or fallout from a nuclear bomb is overblown.
So, it's not.
It's not that it doesn't use water; it's that water is not scarce unless you live in a desert.
As a country, we use 322 billion gallons of water per day. A few million gallons for a datacenter is nothing.
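The share implied by those figures is easy to compute. A sketch, assuming "a few million gallons" means 5 million (my assumption; the comment leaves it vague):

```python
# Datacenter water use as a share of US daily withdrawals,
# using the figures from the comment above.
us_daily_gallons = 322e9    # 322 billion gallons/day, as cited above
datacenter_gallons = 5e6    # ASSUMED value for "a few million gallons"

share = datacenter_gallons / us_daily_gallons
print(f"{share:.4%}")  # 0.0016%
```

Whether a national average is the right denominator (versus the local watershed where the datacenter sits) is exactly what the replies below dispute.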
The environmental cost of Chernobyl is indeed often overblown. Nature in the exclusion zone is arguably much better off now than before!
The cost to humans living in affected areas was massive and high profile, but it's very questionable whether it was higher than that of an equivalent amount of coal-burning plants. Fortunately, that's not a tradeoff we have to debate anymore, since there are now renewables with far fewer downsides and externalities.
Nuclear bombs (at least those being actually used) by design kill people, so I’m not sure what the externalities even are if the main utility is already to intentionally cause harm.
Depends on whether you believe it will ever become cheaper: the hardware, more efficient smaller models, or energy itself. The techno-optimist believes that is the inevitable and investable future. But on what horizon, and will it get "zip drived" before then?
absolutely without a doubt it is
If that energy is used for research, maybe. If used to answer customer questions or generate Studio Ghibli knock-offs, it's not worth it, even a bit.
what’s the difference between those two? how can you say one has more value than the other?
To you. Fortunately nobody elected you chief resource allocator of the planet.
And I say that as somebody who also finds Ghibli knock-off avatars used by AI bros in incredibly bad taste (or, arguably an even worse crime against taste, a dated 2025 vibe).