Where in the world are you getting the numbers for how much energy video streaming uses? I am quite sure that, just as with LLMs, most of the energy goes into the initial encoding of the video, and nowadays any rational service encodes videos to several bitrates to avoid JIT transcoding.
Networking can’t take that much energy, unless perhaps we are talking about purely wireless networking with cell towers?
Yet we also see that hyperscale cloud emissions targets have been reversed due to AI investment, datacenter growth is hitting grid capacity limits in many regions, and peaker plants and other non-renewable resources on the grid are being deployed more to handle this specific growth from AI. By qualifying the claim to "ChatGPT" the author can maybe make the argument they are making, but I don't believe the larger argument would hold for AI as a whole, or once you convert the electricity use to emissions.
I'm personally on the side that the ROI will probably work out in the long run, but not by minimizing the potential impact; the focus should stay on how we can make this technology (currently in its infancy) more efficient.
Yep, this is the real answer. It's also the only answer. The big fiction was everyone getting hopped up on the idea that "karma" was going to be real, and that people's virtue would be correctly identified by overt environmentalism rather than action.
Fossil fuel companies won, and they won decades ago, when BP paid an advertising firm to come up with the "personal carbon footprint" as a meaningful metric. It has basically been destroying environmentalism since... well, I'll let you know when it stops.
Why do you believe this? Datacenters use just 1-1.3 percent of grid electricity, and even if you suppose AI doubled that usage (which I really doubt), the number would still be tiny.
Also, AI training is the easiest workload to regulate, as you can schedule training for when cheaper green energy is available.
The issue is that datacenters are often localised, so even if they draw just 1% of power overall, they can cause local issues.
Still, by itself, grid issues don't mean climate issues. And any argument complaining about a CO2 cost should, to be reliable, also consider the alternative cost. Even if AI were causing 1% or 2% or 10% of energy use, the real question is how much it saves by making society more efficient. And even if it weren't, it's again more a question of energy companies polluting with CO2.
Microsoft, which hosts OpenAI, is famously strong on its CO2 commitments - so far it has gone well beyond what other companies were doing.
Is that true though? Data centers can be placed anywhere in the USA; they could be placed near a bunch of hydro or wind-farm resources in the western grid, which has little coal anyway outside of one line from Utah to SoCal. The AI doesn't have to be located anywhere near where it is used, since fiber is probably easier to run than a high-voltage power line.
There are a large number of reasons the AI datacenters are geographically distributed--just to list a few off the top of my head which come up as top drivers: latency, data sovereignty, resilience, grid capacity, renewable energy availability.
This is a great article for discussion. However, articles like this must link to references. It is one thing to assert, another to prove. I do agree that heating/cooling, car and transport use, and diet play massive roles in climate change that should not be subsumed by other debates.
The flip side to the author's argument is that LLMs are not only used by home users doing 20 searches a day. Governments and Mega-Corporations are chewing through GPU hours on god-knows-what. New nuclear and other power facilities are being proposed to power their use; this is not insignificant. Schneider Electric predicts 93 GW of power demand from AI by 2028. https://www.powerelectronicsnews.com/schneider-electric-pred...
The question this is addressing concerns personal use. Is it ethical to use ChatGPT on a personal basis? A surprising number of people will say that it isn't because of the energy and water usage of those prompts.
I would be surprised if many people said it is unethical to use LLMs like ChatGPT for environmental reasons, as opposed to ethical principles such as encouraging unfair use of IP and copyright violation.
Still, LLM queries are not all equal. The environmental justification does not account for models querying other services, like the famous case where a single ChatGPT query resulted in thousands of HTTP requests.
I feel it's great that people have gotten invested in energy use this way, even if it's a bit lopsided. We should use it in a positive way to get public opinion and the political Overton window behind rapid decarbonization and closure of oil fields.
This used to be the stick they used to beat bitcoin with. I guess it's a good stick because you can hit any technology with it and you can conveniently forget all the terrible uses to which electricity is put.
The section on training feels weak, and that's what the discussion is mainly about.
Many companies are now trying to train models as big as GPT-4. OpenAI is training models that may well be much larger than GPT-4 (o1 and o3). Framing it as a one-time cost doesn't seem accurate - it doesn't look like the big companies will stop training new ones any time soon; they'll keep doing it. So one model might only be used for half a year. And many models may not end up used at all. This might stop at some point, but that's hypothetical.
It briefly touches on training, but uses a seemingly misleading statistic that comes from models vastly smaller than GPT-4.
This article [1] says that 300 [round-trip] flights are similar to training one AI model. Its reference for an AI model is a study done on 5-year-old models like BERT (110M parameters), Transformer (213M parameters), and GPT-2. Considering that models today may be more than a thousand times larger, this is not a credible comparison.
Similar to the logic of "1 mile versus 60 miles in a massive cruise ship"... the article seems to be ironically making a very similar mistake.
[1] https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....
737-800 burns about 3t of fuel per hour. NYC-SFO is about 6h, so 18t of fuel. Jet fuel energy density is 43MJ/kg, so 774000 MJ per flight, which is 215 MWh. Assuming the 60 GWh figure is true (seems widely cited on the internets), it comes down to 279 one-way flights.
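A minimal sketch of that arithmetic, reusing the same assumed figures as above (the 3 t/h burn rate, 43 MJ/kg jet fuel, and the widely repeated but unverified 60 GWh training estimate):

```python
# Back-of-the-envelope flight-vs-training comparison; all inputs are the
# rough assumptions stated above, not measured data.
fuel_burn_tonnes_per_hour = 3       # approx. 737-800 burn rate
flight_hours = 6                    # NYC-SFO, one way
jet_fuel_mj_per_kg = 43             # energy density of jet fuel

fuel_kg = fuel_burn_tonnes_per_hour * flight_hours * 1000   # 18,000 kg
flight_mwh = fuel_kg * jet_fuel_mj_per_kg / 3600            # ~215 MWh per flight

training_gwh = 60                   # widely cited, unverified estimate
print(round(training_gwh * 1000 / flight_mwh))              # ~279 one-way flights
```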
~90% of the plastic debris in the ocean comes from ten rivers [0]. Eight are in China/SEA. Millions and billions of single-use items are sitting in warehouses and on store shelves wrapped in plastic. Even before the plastic is discarded, the factories these items are produced in dump metric tons of waste into the oceans/soil with little repercussion.
Point is, none of our "personal lifestyle decisions" - not eating meat, not mining bitcoin, not using ChatGPT, not driving cars - are more than a drop in the bucket compared to standard-practice overseas manufacturing.
Us privileged folks could "just boycott", "buy renewable", "vote with your wallet", etc., but sales will move to a less developed area and the pollution will continue. This is not to say that the environment isn't important - it's critically important. It's just to say that until corporations are forced to do things the right way, it's ludicrous to point fingers at each other and worry that what we do day-to-day is destroying the planet.
[0] https://pubs.acs.org/doi/10.1021/acs.est.7b02368
That's definitely not true. Let's take Americans, for example, driving their cars to work. Americans account for about 15% of the world's emissions, of which 25% or so is transportation, of which well over half is cars. So Americans not driving to work would have a direct impact on 2-3% of the world's overall emissions. Likewise, your decisions on all the other things, if taken in aggregate, will have a significant impact on overall emissions.
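The chain of fractions behind that claim, spelled out (the shares are the commenter's approximations, not official statistics):

```python
us_share_of_global_emissions = 0.15
transport_share_of_us = 0.25
car_share_of_transport = 0.55        # "well over half"

us_cars_share_of_global = (us_share_of_global_emissions
                           * transport_share_of_us
                           * car_share_of_transport)
print(f"{us_cars_share_of_global:.1%}")   # ~2.1%, i.e. the claimed 2-3% ballpark
```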
"Driving to work" is hardly a "vote with your wallet" style consumer choice. Our housing, building, and transportation policies have been geared towards encouraging car-dependence for nearly a century. In places with better public transit and bike lanes, people spontaneously choose to use those modes of transport. Just like with companies dumping as much plastic waste/CO2 as they can get away with, this is a policy problem, plain and simple. No amount of pro-environment metal straw campaigns will solve it. At best environmentally-conscious messaging could encourage changes in voting behavior which influence policy. At worst people could be convinced that they're "doing their part" and fail to consider systemic changes.
Meat and dairy specifically account for around 14.5% of global greenhouse gas emissions, according to the UN’s Food and Agriculture Organization (FAO).
If people collectively just ate a bit less meat and dairy, it would go a long way. Don't even have to be perfect. Just show a little bit of restraint.
How much of Americans driving to work is because they choose to, though? Amazon's 5-day RTO policy is a good example. How many of the people now going to an office 5 days a week would've done so without the mandate? I see the traffic every day, and saw the same area before the mandate, so I can tell you with confidence that there are many more cars on the road as a result of this commute. This all funnels back to the corporate decision to mandate 5 days in office.
> if taken in aggregate, will have a significant impact
This is a good sentiment. But, in context, it is a fallacy. A harmful one.
Consumer action on transport and whatnot, assuming a massive and persistent global awareness effort... has the potential of adding up to a rounding error.
Housing policy, transport policy, urban planning... these are what affect transport emissions. Not individual choices.
Look at our environmental history. Consumer choice has no wins.
It's propaganda. Role reversal. Something for certain organizations to do. It is not an actual effort to achieve environmental benefit.
We should be demanding governments clean up. Governments and NGOs should not be demanding that we clean up.
The emissions from vehicles are different from plastics produced by factories.
Also, while important, 2-3% of world emissions is a drop in the bucket compared to the other 97%. Let's consider the other causes and how we can fix them.
Think about this: for many people, not driving to work is a big deal. If people collectively decide to do that, that's a lot of effort and inconvenience just for 2-3%.
> That's definitely not true. Let's take Americans, for example, driving their cars to work.
Even an example like this that is carefully chosen to make consumers feel/act more responsible falls short. You want people to change their lives/careers to not drive? Ok, but most people already want to work from home, so even the personal “choice” about whether to drive a car is basically stuck like other issues pending government / corporate action, in this case to either improve transit or to divest from expensive commercial real estate. This is really obvious isn’t it?
Grabbing back our feeling of agency should not come at the expense of blaming the public under the ridiculous pretense of “educating” them, because after years of that it just obscures the issues and amounts to misinformation. Fwiw I’m more inclined to agree with admonishing consumers to “use gasoline responsibly!” than say, water usage arguments where cutting my shower in half is supposed to somehow fix decades of irresponsible farming, etc. But after a while, people mistrust the frame itself where consumers are blamed, and so we also need to think carefully about the way we conduct these arguments.
I wish the world would ditch public transit entirely. It's nothing but a misery generator. It's far better to switch to remote work and distributed cities.
I have always felt this way too. Our personal choices do not move the needle on fossil fuels and plastics. One could embrace aversion to these out of a sense of sustainability to signal virtue, but let's not pretend it will save the planet. It won't. Restricting aviation flights, stopping wars and minimizing the dirty fuel used in maritime freight does much more. But the world will not do it.
While I agree in general, my opinion is that consumer choices do also matter and can move the needle, slowly, with larger cultural change.
Personally, trying to make better choices, big or small, isn't about "virtue signalling". It's about acknowledging the issues and living according to one's values.
This line of thinking is what undermines democracies and ruins the environment. Your choice might just be a drop in the ocean, but guess what the ocean is made out of.
Descriptively / "objectively" if you make your demand cleaner, you decrease demand for dirty consumption. You can't say individuals don't matter by comparing them to the world, that's invalid.
Normatively, is it a useful lie? Maybe, to some extent. People are lazy, selfish, and stupid. Peter Singer points out that we might be nice to people nearby, but we don't give money to people starving in other countries even if we think it will make a real difference. And no human can really know how even a pencil is made, so we make poor decisions. A carbon tax would unleash the free market on the problem. But saying individuals can't act is not good leadership, if even the people who say they want to fix the issue won't make personal sacrifices, why should the average voter?
Regarding the immediate effect I am sure your point is valid. But it’s also a bit of a cynical point of view, wouldn’t you say?
People make these statements and pursue these personal lifestyle decisions because of their dreams for a better future - not its immediate effect. Just as companies need a vision to succeed, societies need vision as well. If a lot of people are vocal about something and live it, it has a chance of becoming anchored in laws and so force companies to do the “right thing”. Regulation follows collective values.
This massively lets the Philippines off the hook. China has a gazillion people, and so does India, and the rest of SE Asia is bad for pollution, but the Philippines — with 1.5% of the world’s population — is responsible for an incredible 36% of ocean plastic pollution.
Also a call-out to Malaysia who are an upper-middle income country and contribute far too much per capita given their income situation, but again, they are a drop in the ocean compared to the (much, much poorer) Philippines.
Having spent half my life in South-East Asia, there’s a cultural problem that needs fixing there.
Per capita beef consumption is down by 35% in the US since the 70s. From 62kg/person/year to 35.
Beef produces ~100kg of CO2 per kg of meat. That's a reduction of roughly 2,700 kg of CO2 per capita, per year.
That's not nothing. By simply reducing beef consumption by 1 kilogram a month, you can prevent more than a metric ton of CO2. If 5% of Americans cut 1 kilo of beef a month, that'd knock out 15 million tons of CO2.
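The same arithmetic as a sketch; the ~100 kg CO2e/kg factor is the rough figure used above, and the adult population used for the "5% of Americans" scenario is my assumption:

```python
kg_co2e_per_kg_beef = 100            # rough emission factor used above

# Per-capita drop since the 70s: 62 kg/yr -> 35 kg/yr
print((62 - 35) * kg_co2e_per_kg_beef)          # ~2,700 kg CO2e per person per year

# "5% of Americans cut 1 kg of beef a month"
adults = 250e6                        # assumed US adult population
tonnes = 0.05 * adults * 12 * kg_co2e_per_kg_beef / 1000
print(f"{tonnes/1e6:.0f} million tonnes CO2e per year")     # ~15 million
```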
Small changes can have an impact on aggregate, and just because someone else is not making those changes doesn't excuse us looking at ourselves and saying, "I could choose something else this time".
It’s helpful to put this issue into perspective. But dismissing issues as not worth caring about on the grounds that there exist larger problems is fallacious and, to me, quite a dangerous way to live life.
“Why worry about your town’s water quality when some countries don’t have access to clean water?”
“Why go to the dentist for a cavity when some people have no teeth?“
“Why campaign for animal rights when there are so many human rights abuses going on?”
This is nothing but head-in-the-sand, arms-in-the-air, feel-good baloney to convince oneself to sleep well at night.
Guess what happens when you buy a used laptop instead of a new one?
That's right: less "standard practice overseas manufacturing".
Lifestyle change right there.
Buying less, using the same things for longer, and buying used goods instead of new are lifestyle changes that anyone can make, and they have an undeniable, very clear impact by reducing the amount of stuff that needs to get made. Using my smartphone for 6 years instead of changing every 3 years doesn't mean the one I didn't buy gets sold elsewhere. It means one less sale.
This reads like an attempt to pass the blame to others. Per capita CO₂ emissions in the US are among the highest in the world, and significantly higher than those in China or SEA. This is despite the US/Europe having moved some of our dirtiest/cheapest manufacturing to that region.
Personal choices matter. See the amount of energy used on air conditioning in the US compared to areas of Europe with comparable weather for a banal example. If we want to significantly reduce emissions it will happen through a combination of personal choices, corporate action and government policy.
"Personal impact" is just laundering the responsibility of government and corporations so it looks like it's our fault.
It is true that everyone everywhere all at once could suddenly make the right decision forever and save the planet. But that is a statistical anomaly so extreme it's not worth pursuing as a policy. No policy maker worth their salt would look at that and consider it valid long term.
We have a playbook. We refuse to use it. We ban products, and then the companies that refuse to change or cheat get shuttered, and we move on.
well, it's really not about the destruction of the planet but making our habitat more hostile and humans more sick.
sure, STEM will continue to find remedies and cures but at some point we're fucked just because the gene pool was reduced to an unnaturally selected bunch that survived & thrived completely alienated from the actual world.
sure, no biggie, wahaha, that's the name of the game, the old will die, the young repeat the same nonsense and that microbiome and all that other stuff we carry with us as hosts, potentially most likely in a beneficial symbiotic relationship, have no implicit mechanisms to cancel the contract and pivot towards some species or other that won't be d u m b enough to shit all over it's own home & garden, consequently ruining the bio-chemistry with the smell, taste and look of feces everywhere - in the body as well as outside - and all that while it's getting a bit hot in here.
and I doubt that the consequences of controlled demise in a deteriorating environment all while the meds and drugs of leadership and the people fade out quite a few of the brains and the bodies implicit reactions to a lot of sensory perceptions to everything that was vital, crucial to notice for a 'million' years can't be projected to at least some degree. I mean "blindspots" are a thinking tool, after all, but those thinking brains and minds believe in black swans and the better angels of our nature so that doesn't really mean a thing.
the population itself is fine, a habit of psycho-social education and all consecutive upper levels being insanely afraid of competition and insights from below. thing is, whatever financial survival schemes people are running, they all have death cult written all over their faces.
btw, most of this was for fun, I'm really not worried at all. climate change is more a cycle than man-made acceleration. my only point of interest is the deterioration of the species due to all the things that we do and then worry more about the habitat than our and all kinds.
we absolutely can turn the planet into a conservatory. through any climate.
One of the most imminent problems with the environment isn't plastic pollution (which is of course terrible, might well have unforeseen ramifications via microplastics, and is negatively impacting biodiversity), but CO2 and other gases impacting the climate.
While we should strive to fix both, it's more important in the short term to limit the amount of CO2 pollution before it's too late.
Finally someone who says this out loud. What we do is more or less fly poop... good for our own well-being, but with almost zero impact.
I'll go on doing some things because I think some of them are better ways to handle this or that, or because it's better for my health, but with no expectation that I'll change anything.
It talks at great length about data center trends relating to generative AI, from the perspective of someone who has been deeply involved in researching power usage and sustainability for two decades.
I find the following to be a great point regarding what we ought to consider when adapting our lifestyle to reduce negative environmental impact:
> In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives.
The further example with a hospital emitting more than a cruise ship is a good illustration of the issue.
Continuing this line of thought, when thinking about your use of an LLM like ChatGPT, you ought to weigh not merely its emissions and water usage, but also the larger picture as to how it benefits the human society.
For example: Was this tech built with ethically sound methods[0]? What are its foreseeable long-term effects on human flourishing? Does it harm the livelihoods of many people while increasing the wealth gap with the tech elites? Does it negatively impact open information sharing (willingness to run self-hosted original-content websites or communities open to the public, or even the feasibility of doing so[1][2]), motivation and capability to learn, creativity? And so forth.
[0] I’m not going to debate utilitarianism vs. deontology here, will just say that “the ends justify the means” does not strike me as a great principle to live by.
> Google, Microsoft, Meta and Amazon all have net-zero emission targets which they take very seriously, making them "some of the most significant corporate purchasers of renewable energy in the world". This helps explain why they're taking very real interest in nuclear power.
Nuclear is indeed (more or less) zero-emission, but it's not renewable.
Thank you for the synthesis and link to the original article, it's a good read!
Such a stupid post, I know people on HN don’t like absolute descriptors like that and sorry for that.
Obviously LLMs like ChatGPT don’t use the most energy when answering your question; they churn through insane amounts of water and energy during training, so much so that big tech companies do not disclose those amounts and try to obscure them as much as possible.
You aren’t destroying the environment by using it RIGHT NOW, but you are telling the corresponding company that owns the LLM you use “there is interest in this product”, en masse. With these interest indicators they will plan for the future and plan for even more environmental destruction.
It's not like they are mixing that water with oil and pumping it into the aquifer. Water evaporates and turns into clouds, which precipitate into rain that falls on the ground and into water bodies, where it can be used again. So what's the problem with datacenter water usage? Has the water cycle stopped and I was not informed?
Fresh water is finite. Infinite in reuse, but we can only take so much from a river before that river ceases to be. If you have a megabit connection, it doesn't matter that your cloud backups have infinite storage, you are limited by bandwidth.
Water vapor stays aloft for a while, so there's no guarantee it enters the same watershed it was drawn from.
It's also a powerful greenhouse gas, so even though it's removed quickly, raising the rate we produce it results in more insulation.
Even if it's not a strictly finite resource, we need to be judicious and wise in how we allocate it.
Plenty of companies have revealed exactly how much energy and CO2 they used training a model. Just off the top of my head, I've seen those numbers for Meta's Llama models, Microsoft's Phi series and DeepSeek's models - including their impressive DeepSeek v3, which trained for less than $6m in cost - a huge reduction compared to other similar models, and a useful illustration of how much more efficient this stuff can get on the training side of things.
Anyone care to have a go at back of the envelope number for training energy use amortized per query for ChatGPT's models? Is the training or the inference going to dominate?
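Here's a rough stab at that back-of-the-envelope question; every input is an assumption for illustration, since OpenAI hasn't published the real figures:

```python
# Training energy amortized per query, under stated assumptions only.
training_energy_gwh = 60          # the widely repeated GPT-4-scale estimate, unverified
serving_lifetime_days = 365       # assume the model serves traffic for about a year
queries_per_day = 1e9             # assumed order of magnitude for ChatGPT traffic

training_wh_per_query = training_energy_gwh * 1e9 / (serving_lifetime_days * queries_per_day)
inference_wh_per_query = 0.3      # assumed; per-query estimates commonly range ~0.3-3 Wh

print(f"amortized training: {training_wh_per_query:.2f} Wh/query")   # ~0.16 Wh
print(f"inference:          {inference_wh_per_query:.2f} Wh/query")
```

With these made-up inputs the two end up the same order of magnitude; which one dominates depends almost entirely on the per-query inference figure and on how long, and how heavily, the model is actually served.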
Similar feelings about the repeated references to the apparently agreed consensus that individual action is pointless vs systematic change like switching to a renewable energy system. Jevons Paradox would like a word.
I don’t care about energy usage. How exhausting it must be to be a climate hysterical person and try to factor the climate cost of every single action you take in life.
Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great. But I refuse to worry and be burdened by anxiety over every single unit of electricity consumed. Such a tiring, bullshit part of life the progressives have foisted on elites. And it is elites only as poors don’t give a shit
> Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great.
> And it is elites only as poors don’t give a shit
The poor people are also consumers; raising prices of energy for that group is a fantastic way to get kicked out of office even if you're an actual literal dictator.
People are complex.
The screeds you're objecting to are part of the political process to tell governments to do something, even if that something ends up being a mix of what you suggest plus subsidies for the poor, or something completely different - in any case, to avoid being defenestrated.
> "Emissions caused by chatgpt use are not significant in comparison to everything else."
Emissions directly caused by Average Joe using ChatGPT are not significant compared to everything else. 50,000 questions is a lot for an individual using ChatGPT casually, but nothing for the businesses using ChatGPT to crunch data; 50,000 "questions" will be lucky to get you through the hour.
Those businesses aren't crunching data just for the sake of it. They are doing so ultimately because that very same aforementioned Average Joe is going to buy something that was produced out of that data crunching. It is the indirect use that raises the "ChatGPT is bad for the environment" alarm. At the very least, we don't have a good handle on what the actual scale is. How many indirect "questions" am I asking ChatGPT daily?
The article needs to exist because the idea that ChatGPT usage is environmentally disastrous really has started to make its way into the human hive mind.
I'm glad someone is trying to push back against that - I see it every day.
Hey all I wrote this post. To clear up a few points:
I meant this post to tell individuals that worrying about the emissions they personally cause using ChatGPT is silly, not that AI more broadly isn't using a lot of energy.
I can't really factor in how demand for ChatGPT is affecting the future of AI. If you don't want to use ChatGPT because you're worried about creating more demand, that's more legit, but worrying about the emissions associated with individual searches right now, on their own, is a silly distraction.
One criticism is that I didn't talk about training enough. I included a section on training in the emissions and water sections, but if there's more you think I should address or change I'm all ears. Please either share them in the comments on the post or here.
I saw someone assumed I'm an e/acc. I'm very much not and am pretty worried about risks from advanced AI. Had hoped the link to an 80,000 Hours article might've been a clue there.
Someone else assumed I work for Microsoft. I actually exclusively use Claude but wanted to write this for a general audience and way fewer people know about Claude. I used ChatGPT for some research here that I could link people to just to show what it can do.
If using ChatGPT somehow saves you from making one trip to the doctor in your car, it can offset an entire year's worth of ChatGPT usage in terms of CO2 impact.
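A hedged back-of-the-envelope version of that claim; both sides are assumptions, not measurements:

```python
# ChatGPT side: assumed per-query footprint and usage pattern
g_co2e_per_query = 3               # assumed, toward the high end of per-query estimates
queries_per_day = 10
year_of_usage_kg = g_co2e_per_query * queries_per_day * 365 / 1000    # ~11 kg

# Car side: one skipped round trip to the doctor
round_trip_km = 30                 # assumed distance
g_co2e_per_km = 250                # typical petrol car
one_trip_kg = round_trip_km * g_co2e_per_km / 1000                    # ~7.5 kg

print(year_of_usage_kg, one_trip_kg)   # same order of magnitude
```

So with these assumed numbers one skipped trip covers most of a year of casual use; a longer trip, or a lower per-query estimate, tips it the other way.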
ChatGPT is probably adequate to provide a slightly more user-friendly but also slightly less reliable replacement for a reliable consumer-oriented medical reference book or website, for the task of determining whether self-care or seeing a doctor is appropriate for symptoms not obviously posing an immediate emergency.
Sort of off-topic, but it does make one think about usage of compute (and the backing energy / resources required for that)...
i.e. it doesn't seem too much of an exaggeration to say that we might be getting closer and closer to a situation where LLMs (or any other ML inference) are being run so much, for so many different reasons and requests, that the usage does become significant in the future.
Similarly, going into detail on what the compute is being used for: i.e. no doubt there are situations currently going on where Person A uses a LLM to expand something like "make a long detailed report about our sales figures", which produces a 20 page report and delivers it to Person B. Person B then says "I haven't time to read all this, LLM please summarise it for me".
So you'd basically have LLM inference compute being used as a very inefficient method of data/request transfer, with the sender expanding a short amount of information to deliver to the recipient, and the recipient then using an LLM on the other side to reduce it back to something more manageable.
That sounds like the opposite of data compression (inflation?) where the data size is increased before sending, then on receiving it is compressed back to a smaller form.
To me the "ChatGPT is destroying the environment "card always felt somewhat like bad faith arguing from the anti-AI crowd trying to find any excuse for being against AI. Like, the same people who complained about "using AI is destroying environment" seemed to have no issue with boarding a plane which would emit a bunch of CO2 so that they can have a vacation in Europe or the like. Selective environmentalism.
Who is this person you’re constructing? Being concerned about plane emissions and travel is an incredibly common thing and people are adjusting their lifestyles accordingly - lots of overnight sleeper train lines are reopening due to demand.
It's less bad faith, more a meme that has become so prevalent that it's impossible to dispel, as it's something too nuanced for social media. I've seen more than a few social media posts asking "do they really cut down a rainforest every time someone generates an AI image?"
I mean, it's a literal net new expenditure of power and water. I also deeply doubt they have "no issue" with plane travel. You're just assuming the worst and most hypocritical position to someone, which seems deeply bad faith as well.
It's literally true that most of the AI vendors and their data center partners are writing off energy and water conservation targets they'd had for the near future because of LLM money. That is actually bad in the short and likely long term, especially as people use LLMs for increasingly frivolous things. Is it really necessary to have an LLM essentially do a "I'm Feeling Lucky" Google Search for you at some multiple of that environmental cost? Because that's what most of my friends and coworkers use ChatGPT for. Very rarely are they using it to get anything more complex than just searching wikipedia or documentation you could have had bookmarked.
A person has a choice over whether they take a flight and whether it's worth it for them. They have no power, except for raising a complaint in public, over whether OpenAI or Google or whoever spends vast amounts of money and power to train some new model. If your bar is that no one is allowed to complain about a company burning energy unless they live a totally blameless life farming their own food without electricity, then random companies will get to do any destructive act they want.
What about the people who take the protection of the environment seriously?
They now face a setback, because not only did we not reach our previous goals on lowering energy consumption, we are putting new consumption on top of that. Just because the existing consumption is worse doesn't make the new consumption good.
There is a reason why MS missed its CO2 targets and why everyone is in search of more energy sources.
what does "water used by data center" even mean? Does it consume the water somehow? What does it turn into? Steam? So uploading a 1GB file boils away nearly 1 liter of water? Or is it turned into bits somehow in some kind of mass to energy conversion? I sorta doubt that. Also this means data centers would have cooling towers like some power stations. Are we talking about the cooling towers of power stations?
I think at least that graph is complete non-sense. I will try and have chatGPT explain it to me.
Yes, datacenters have cooling towers. There are lots of good articles about this topic. A good starting point is "water usage effectiveness" (WUE) which is one way this is tracked.
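For anyone wondering how "water per X" figures get derived: WUE is litres of water evaporated per kWh of IT energy, so the conversion is a one-liner. Both inputs below are assumptions, not numbers for any particular datacenter:

```python
wue_litres_per_kwh = 1.8      # roughly the commonly cited industry-average WUE
it_energy_kwh = 0.004         # assumed datacenter-side energy for some workload

print(it_energy_kwh * wue_litres_per_kwh)   # ~0.007 litres evaporated for that workload
```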
The one near here just has heat exchangers. But even if all the others use evaporative cooling, then "potential water usage" is extremely misleading, because it's not like the water is consumed - it's just temporarily unavailable.
Also, why doesn't uploading a 1GB file to my NAS boil a liter of water? Are all the switches and routers between me and the datacenter water-cooled? I can see such switches existing, but I don't see them being the norm. Why doesn't the DSLAM on the street outside emit steam? Is there maybe one bad switch somewhere that just spews steam?
What I am saying is that, without further explanation, that graph is... bad.
"personal carbon footprint" is a term invented by BP and is the single hack that derailed the environment discussion by making people personally responsible and removing the actual polluters from the discussion.
The article could indeed be written by the same kind of people, given that it glosses over training cost as if AI companies aren't pushing datacenter power/GPU capacity to the limit to produce incrementally better models by brute force. It all falls apart as soon as you stop individualizing the numbers and add back in all of the supposedly non-recurring (but actually recurring, because we can't stop ourselves from redoing it) cost.
> and it’s completely clear to me that one side is getting it entirely wrong and spreading misleading ideas
What a great way to start an article. I get it as: "I am not open to listening to your arguments, and in fact if you disagree with me, I will assume that you are a moron".
It reminds me of people saying "planes are not the problem: actually if you compare it to driving a car, it uses less energy per person and per km". Except that as soon as you take a passenger in your car, the car is better (why did you assume that the plane was full and the car almost empty?). And that you don't remotely drive as far with your car as you fly with a plane. Obviously planes are worse than cars. If you need to imagine people commuting by car to the other side of the continent to prove your point, maybe it's not valid?
The fact is that the footprint of IT is increasing every year. And quite obviously, LLMs use more energy than "traditional" searches. Any new technology that makes us use more energy is bad for the environment.
Unless you don't understand how bad the situation is: we have largely missed the goal of keeping global warming to 1.5C (thinking that we could reach it is absurd at this point). To keep 2C, we need to reduce global emissions by 5% every year. That's a Covid crisis every year. Let's be honest, it probably won't happen. So we'll go higher than 2C, fine. At the other end of the spectrum, 4C means that a big stripe (where billions of people live) around the equator will become unlivable for human beings (similar to being on Mars: you need equipment just to survive outside). I guess I don't need to argue how bad that would be, and we are currently going there. ChatGPT is part of that effort, as a new technology that makes us increase our emissions instead of doing the opposite.
> Except that it doesn't work if you don't drive your car alone (if you assume the plane is full of passengers, why not assuming that the car is, as well?)
These can be measured for averages. Lots of cars with one person in them, seldom cars fully packed; lots of planes fully packed, seldom (but it does happen) that the plane is almost empty.
> we have largely missed the goal of keeping global warming to 1.5C (thinking that we could reach it is absurd at this point).
Probably, yes; last year passed the threshold — it would be a pleasant *surprise* if that turned out to have been a fluke 14* years early.
* 14 because it would take 14 years for the exponential — seen for the last 30 years — for PV to replace all forms of power consumption; not just electricity, everything. But even then we'd also need to make rapid simultaneous progress with non-energy CO2 sources like cattle and concrete.
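The footnote's "14 years" is just exponential extrapolation; a sketch with assumed inputs (the current solar share of total energy use and its growth rate are both approximations, not the parent's exact figures):

```python
import math

pv_share_of_total_energy = 0.02   # assumed current solar share of *all* energy, not just electricity
annual_growth = 0.32              # assumed continuation of the historical growth rate

years_to_full_replacement = math.log(1 / pv_share_of_total_energy) / math.log(1 + annual_growth)
print(f"{years_to_full_replacement:.0f} years")   # ~14 with these inputs
```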
> around the equator will become unlivable for human beings (similar to being on Mars: you need equipment just to survive outside)
In so far as your bracket, sure; but there's a huge gap in what equipment you would need.
The comparison I often make is that Mars combines the moisture of the Sahara, the warmth of the Antarctic, the air pressure of the peak of Mount Everest, and the soil quality of a superfund cleanup site, before then revealing that it's actually worse on all counts.
> These can be measured for averages. Lots of cars with one person in them
Sure, but the point should be that we should strive to share cars, not that it's okay to take the plane! Especially given the second argument which is that you don't drive 1000km every time you take your car. The footprint per km is not enough: when you take the plane you typically go much further!
> Probably, yes; last year passed the threshold
That, plus the IPCC scenario that keeps us under 1.5C says that in a few decades, not only will we not be extracting any carbon anymore, but we will be pumping carbon underground faster than we are extracting it now! And that's with the IPCC models, which tend to be optimistic (we measure that every year)!
> 14 because it would take 14 years for the exponential — seen for the last 30 years — for PV to replace all forms of power consumption
And you would have to take into account that PV today entirely relies on oil. We are going towards a world with less and less oil, and we don't know how it will impact our capacity of production for PVs. But probably it won't help.
> In so far as your bracket, sure; but there's a huge gap in what equipment you would need.
Sure. It was a quick way to say that the combination of humidity and temperature will be such that sweating won't help humans regulate their temperature. And when we can't regulate our temperature, we die. By any account, this means that billions of people will have to relocate, which means global wars (with entire countries moving with their entire armies).
Now of course that would be infinitely better than trying to live on Mars, which is why it is preposterous to even consider Mars.
It's discomforting to me when people compare resource usage of ChatGPT, a computer, to the resource usage of a human being.
I've seen charts like this before that compare resource usage of people to corporations, implying corporations are the bigger problem. The implication here seems to be the opposite, and that tone feels just a little eugenicist.
A lot of conversations regarding the environment feel so frustrating because they are either qualitative or use aggregate high level data or are like we'll be dead in 50 years (lol my personal favorite)
Why not start capturing waste/energy data for all human made items like nutritional data on food? It won't add much overhead or stifle economies as people fear
That way when I log in to use any online service or when I buy/drive a car or when I buy an item I can see how much energy was consumed and how much waste I produced exactly
In my country there's a lot of institutional hype about green algorithms. I find the whole idea quite irrelevant (for the reasons explained in this post) but of course, it's a way to get funding for those of us who work in AI/NLP (we don't have much money for GPUs and can't do things like training big LLMs, so it's easy to pitch everything we do as "green", and then get funding because that's considered strategic and yadda yadda).
It's funny, but sad, how no one calls out the bullshit, because we would be sabotaging ourselves.
The major players in AI are collectively burning 1-2 gigawatts, day and night, on research and development of the next generation of LLMs. This is as much as my city of a million people. The impact is real, and focusing on inference cost per query kind of misses the point. Every person who uses these tools contributes to the demand and bears some of the responsibility. Similar to how I have responsibility for the carbon emissions of a flight, even if the plane would have flown without me.
I'm saying this as someone who finds LLMs helpful, and uses them without feeling particularly guilty about it. But we should be honest about the costs.
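A quick sanity check on the city comparison above; the per-capita figure is my assumption for average electricity draw with all sectors included:

```python
ai_rnd_power_gw = 1.5            # midpoint of the 1-2 GW estimate above
avg_kw_per_resident = 1.5        # assumed US-style average electricity draw per resident

equivalent_population = ai_rnd_power_gw * 1e6 / avg_kw_per_resident
print(f"~{equivalent_population/1e6:.1f} million people")   # ~1.0 million
```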
Perhaps off topic, what exactly does the “one way european flight” mean in the context of avoiding co2 emissions? I.e. what is the choice or scenario here?
I published this as a comment as well, but it's probably worth noting that the ChatGPT water/power numbers cited (the ones most widely cited in these discussions) come from an April 2023 paper (Li et al, arXiv:2304.03271) that estimates water/power usage based off of GPT-3 (175B dense model) numbers published from OpenAI's original GPT-3 paper. From Section 3.3.2 Inference:
> As a representative usage scenario for an LLM, we consider a conversation task, which typically includes a CPU-intensive prompt phase that processes the user’s input (a.k.a., prompt) and a memory-intensive token phase that produces outputs [37]. More specifically, we consider a medium-sized request, each with approximately ≤800 words of input and 150 – 300 words of output [37]. The official estimate shows that GPT-3 consumes an order of 0.4 kWh electricity to generate 100 pages of content (e.g., roughly 0.004 kWh per page) [18]. Thus, we consider 0.004 kWh as the per-request server energy consumption for our conversation task. The PUE, WUE, and EWIF are the same as those used for estimating the training water consumption.
There is a slightly newer paper (Oct 2023) that directly measured power usage on a Llama 65B (on V100/A100 hardware) and showed 14X better efficiency. [2] Ethan Mollick linked to it recently and got me curious, since I've recently been running my own inference (performance) testing and it'd be easy enough to just calculate power usage. My results [3] on the latest stable vLLM from last week, on a standard H100 node with Llama 3.3 70B FP8, were almost 10X better tokens/joule than the 2023 V100/A100 testing, which seems about right to me. This is without fancy look-ahead, speculative decoding, or prefix caching taken into account - just raw token generation. This is 120X more efficient than the commonly cited "ChatGPT" numbers and 250X more efficient than the Llama-3-70B numbers cited in the latest version (v4, 2025-01-15) of that same paper.
For those interested in a full analysis/table with all the citations (including my full testing results) see this o1 chat that calculated the relative efficiency differences and made a nice results table for me: https://chatgpt.com/share/678b55bb-336c-8012-97cc-b94f70919d...
(It's worth pointing out that that used 45s of TTC, which is a point that is not lost on me!)
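For context on how the per-request numbers in these comparisons are derived from raw measurements, here's the shape of the calculation; the throughput and power values are placeholders, not the actual test results mentioned above:

```python
node_power_w = 5600          # assumed full-node draw (GPUs plus host) under load
tokens_per_second = 5000     # assumed aggregate generation throughput with batching

joules_per_token = node_power_w / tokens_per_second          # ~1.1 J/token with these inputs
wh_per_reply = joules_per_token * 300 / 3600                 # energy for a 300-token reply

print(f"{joules_per_token:.2f} J/token, {wh_per_reply:.3f} Wh per 300-token reply")
```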
Indeed, it doesn't solve the problem that people will misinterpret data and spread misinformation to justify their bad feeling about AI with invalid arguments.
"Other bad things exist" does not mean this thing isn't bad. Or absolutely could be that all these other things are huge energy wasters AND chat gpt is an energy waster.
We have to stop thinking about problems so linearly -- it's not "solve only the worst one first", because we'll forever find reasons to not try and solve that one, and we'll throw up our hands.
Like, we're well aware animal agriculture is a huge environmental impact. But getting everyone to go vegetarian before we start thinking about any other emissions source is a recipe for inaction. We're going to have to make progress, little by little, on all of these things.
Except that's not the point of this: the point is that humans have absolutely finite time to think about problems, and so the way you distract from a problem is by inventing a new more exciting one.
LLMs are in the news cycle, so sending all the activists after LLMs sure does a good job ensuring they're not going after anything which would be more effective doesn't it? (setting aside my thoughts for the moment of the utility of the 'direct action' type activists who I think have been useless for a good long while now - there could not possibly be more 'awareness' of climate change).
Then again, if you can stall a nascent polluter before it becomes entrenched, maybe that's the right time to intervene. Getting people to not eat meat is hard, we've been eating meat forever. Getting people to not use LLMs? That's where most of us were up until very recently.
Voluntary conservation was only working by accident and guilt tripping never works. The grid needs to become clean so that we can have new industries.
The grid being clean means not having any fossil power. We can only get there by shutting down all fossil fuel power plants.
We can not get there by adding new power generation.
I also had doubts, but I asked ChatGPT and it confirms it's an issue, including sources:
https://chatgpt.com/share/678b6b3e-9708-8009-bcad-8ba84a5145...
That was already done years ago and people are predicting that the grid will be maxed out soon.
> However articles like this must link to references.
There are links to sources for every piece of data in the article.
Where?
One of the most crucial points: "Training an AI model emits as much as 200 plane flights from New York to San Francisco."
This seems to come from this blog https://icecat.com/blog/is-ai-truly-a-sustainable-choice/#:~....
which refers to this article https://www.technologyreview.com/2019/06/06/239031/training-...
which is talking about models like *GPT-2, BERT, and ELMo* -- _5+ year old models_ at this point.
The keystone statement is incredibly vague, and likely misleading. What is "an AI model"? From what I found, this is referring to GPT-2.
> Governments and Mega-Corporations are chewing through GPU hours on god-knows-what.
The "I don't know so it must be huge" argument?
Not knowing what it's being spent on is separate from knowing whether it's being spent.
I am sure that’s intentional, because this article is the same thing we see from e/acc personalities any time the environmental impact is brought up.
Deflection away from what actually uses power and pretending the entire system is just an API like anything else.
This assumes all emissions / externalities are created equal, which they are not.
3 replies →
The emissions from vehicles are different from plastics produced by factories.
Also, while important, 2-3% of world emissions is a drop in the bucket compared to the other 97%. Let's consider the other causes and how we can fix them.
Think about this: for many people, not driving to work is a big deal. If people collectively decide to do that, that's a lot of effort and inconvenience just for 2-3%.
2 replies →
> That's definitely not true. Let's take Americans, for example, driving their cars to work.
Even an example like this that is carefully chosen to make consumers feel/act more responsible falls short. You want people to change their lives/careers to not drive? Ok, but most people already want to work from home, so even the personal “choice” about whether to drive a car is basically stuck like other issues pending government / corporate action, in this case to either improve transit or to divest from expensive commercial real estate. This is really obvious isn’t it?
Grabbing back our feeling of agency should not come at the expense of blaming the public under the ridiculous pretense of “educating” them, because after years of that it just obscures the issues and amounts to misinformation. Fwiw I’m more inclined to agree with admonishing consumers to “use gasoline responsibly!” than say, water usage arguments where cutting my shower in half is supposed to somehow fix decades of irresponsible farming, etc. But after a while, people mistrust the frame itself where consumers are blamed, and so we also need to think carefully about the way we conduct these arguments.
I think many Americans driving to work would be happy to work from home if not for RTO mandates (encouraged by the government, at least on a local level).
The amount of people who choose to not drive to work is significantly impacted by policy.
> Let's take Americans, for example, driving their cars to work.
This is easily solved by switching to EVs. A small-size EV (perfect for personal transportation) is only slightly less CO2-efficient than rail ( https://ourworldindata.org/travel-carbon-footprint ).
I wish the world would ditch public transit entirely. It's nothing but a misery generator. It's far better to switch to remote work and distributed cities.
2 replies →
I have always felt this way too. Our personal choices do not move the needle on fossil fuels and plastics. One could embrace aversion to these out of a sense of sustainability, to signal virtue, but let's not pretend it will save the planet. It won't. Restricting aviation, stopping wars and minimizing the dirty fuel used in maritime freight would do much more. But the world will not do it.
While I agree in general, my opinion is that consumer choices do also matter and can move the needle, slowly, alongside larger cultural change.
Personally, trying to make better choices, big or small, isn't about "virtue signalling". It's about acknowledging the issues and living according to one's values.
This line of thinking is what undermines democracies and ruins the environment. Your choice might just be a drop in the ocean, but guess what the ocean is made out of.
> it's just to say that until corporations are forced to do things the right way
But this isn't going to happen by itself. We need to vote for people who believe in regulating these corporations (rather than deregulating them).
> "vote with your wallet", etc, but sales will move to a less developed area and the pollution will continue.
But voting with your wallet is literally moving sales to a more developed area with less pollution?
I think this is wrong.
Descriptively / "objectively", if you make your own demand cleaner, you decrease demand for dirty consumption. You can't say individuals don't matter by comparing one person to the whole world; that comparison is invalid.
Normatively, is it a useful lie? Maybe, to some extent. People are lazy, selfish, and stupid. Peter Singer points out that we might be nice to people nearby, but we don't give money to people starving in other countries even if we think it would make a real difference. And no human can really know how even a pencil is made, so we make poor decisions. A carbon tax would unleash the free market on the problem. But saying individuals can't act is not good leadership: if even the people who say they want to fix the issue won't make personal sacrifices, why should the average voter?
Regarding the immediate effect I am sure your point is valid. But it’s also a bit of a cynical point of view, wouldn’t you say? People make these statements and pursue these personal lifestyle decisions because of their dreams for a better future - not its immediate effect. Just as companies need a vision to succeed, societies need vision as well. If a lot of people are vocal about something and live it, it has a chance of becoming anchored in laws and so force companies to do the “right thing”. Regulation follows collective values.
This massively lets the Philippines off the hook. China has a gazillion people, and so does India, and the rest of SE Asia is bad for pollution, but the Philippines — with 1.5% of the world’s population — is an incredible 36% of ocean plastic pollution.
Also a call-out to Malaysia who are an upper-middle income country and contribute far too much per capita given their income situation, but again, they are a drop in the ocean compared to the (much, much poorer) Philippines.
Having spent half my life in South-East Asia, there’s a cultural problem that needs fixing there.
A pretty graph that makes it clear just how bad the most egregious polluters are comparatively: https://ourworldindata.org/grapher/ocean-plastic-waste-per-c...
What are they doing in the Philippines? Dumping household waste straight into the rivers?
2 replies →
Per capita beef consumption in the US is down by roughly 40% since the 70s, from about 62 kg/person/year to 35.
Beef produces ~100 kg of CO2 per kg of meat. That's a reduction of roughly 2,700 kg of CO2 per capita, per year.
That's not nothing. By simply reducing beef consumption by 1 kilogram a month, you can prevent more than a metric ton of CO2 a year. If 5% of Americans cut 1 kilo of beef a month, that'd knock out on the order of 15-20 million tons of CO2 a year (rough arithmetic below).
Small changes can have an impact on aggregate, and just because someone else is not making those changes doesn't excuse us looking at ourselves and saying, "I could choose something else this time".
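A quick back-of-envelope check of the numbers above (all inputs are the comment's rough figures, not precise data):

    # Per-capita saving from the historical drop in beef consumption
    drop_kg_per_year = 62 - 35        # less beef per person per year vs. the 70s
    co2e_per_kg_beef = 100            # ~100 kg CO2e per kg of beef (high-end estimate)
    print(drop_kg_per_year * co2e_per_kg_beef)   # 2700 kg CO2e saved per person per year

    # "5% of Americans cut 1 kg of beef a month" scenario
    participants = 0.05 * 330e6                       # assume ~330M Americans
    annual_saving_kg = 12 * 1 * co2e_per_kg_beef      # 1200 kg (~1.2 t) per participant
    print(participants * annual_saving_kg / 1e9)      # ~19.8 million tonnes CO2e per year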
It’s helpful to put this issue into perspective. But dismissing issues as not worth caring about on the grounds that there exist larger problems is fallacious and, to me, quite a dangerous way to live life.
“Why worry about your town’s water quality when some countries don’t have access to clean water?”
“Why go to the dentist for a cavity when some people have no teeth?“
“Why campaign for animal rights when there are so many human rights abuses going on?”
This is nothing but head-in-the-sand, arms-in-the-air, feel-good baloney to convince oneself to sleep well at night.
Guess what happens when you buy a used laptop instead of a new one?
That's right: less "standard practice overseas manufacturing".
Lifestyle change right there.
Buying less, using the same things for longer, and buying used goods instead of new are lifestyle changes anyone can make, and they have an undeniably clear impact by reducing the amount of stuff that needs to get made. Using my smartphone for 6 years instead of changing every 3 years doesn't mean the one I didn't buy gets sold elsewhere. It means one less sale.
This reads like an attempt to pass the blame to others. Per capita CO₂ emissions in the US are among the highest in the world, and significantly higher than those in China or SEA. This is despite the US/Europe moving some of our dirtiest/cheapest manufacturing to that region.
Personal choices matter. See the amount of energy used on air conditioning in the US compared to areas of Europe with comparable weather for a banal example. If we want to significantly reduce emissions it will happen through a combination of personal choices, corporate action and government policy.
"Personal impact" is just laundering the responsibility of government and corporations so it looks like it's our fault.
It is true that everyone everywhere all at once could suddenly make the right decision forever and save the planet. But that is a statistical anomaly so extreme it's not worth pursuing as policy. No policy maker worth their salt would look at that and consider it a valid long-term plan.
We have a playbook. We refuse to use it. We ban products, and then the companies that refuse to change or cheat get shuttered, and we move on.
well, it's really not about the destruction of the planet but making our habitat more hostile and humans more sick.
sure, STEM will continue to find remedies and cures but at some point we're fucked just because the gene pool was reduced to an unnaturally selected bunch that survived & thrived completely alienated from the actual world.
sure, no biggie, wahaha, that's the name of the game: the old will die, the young repeat the same nonsense, and the microbiome and all that other stuff we carry with us as hosts - potentially, most likely, in a beneficial symbiotic relationship - have no implicit mechanisms to cancel the contract and pivot towards some other species that won't be d u m b enough to shit all over its own home & garden, consequently ruining the bio-chemistry with the smell, taste and look of feces everywhere - in the body as well as outside - all while it's getting a bit hot in here.
and I doubt that the consequences of controlled demise in a deteriorating environment all while the meds and drugs of leadership and the people fade out quite a few of the brains and the bodies implicit reactions to a lot of sensory perceptions to everything that was vital, crucial to notice for a 'million' years can't be projected to at least some degree. I mean "blindspots" are a thinking tool, after all, but those thinking brains and minds believe in black swans and the better angels of our nature so that doesn't really mean a thing.
the population itself is fine, a habit of psycho-social education and all consecutive upper levels being insanely afraid of competition and insights from below. thing is, whatever financial survival schemes people are running, they all have death cult written all over their faces.
btw, most of this was for fun, I'm really not worried at all. climate change is more a cycle than man-made acceleration. my only point of interest is the deterioration of the species due to all the things that we do, while we worry more about the habitat than about our own kind and all the others.
we absolutely can turn the planet into a conservatory. through any climate.
And 80% of all the trash in the oceans comes from the fishing industry (e.g. abandoned nets).
Are we talking about pollution or emissions?
One of the most pressing environmental problems isn't plastic pollution (which is of course terrible, might well have unforeseen ramifications via microplastics, and is negatively impacting biodiversity), but CO2 and other gases affecting the climate.
While we should strive to fix both, it's more important in the short term to limit the amount of CO2 pollution before it's too late.
Finally, someone who says this out loud. What we do is more or less fly poop... good for our own well-being but with almost zero impact. I'll go on doing some things because I think some of them are better ways to handle this or that, or better for my health, but with no expectation that I'll change anything.
The absolute best thing I've read on this subject is this article here: https://about.bnef.com/blog/liebreich-generative-ai-the-powe...
It talks at great length about data center trends relating to generative AI, from the perspective of someone who has been deeply involved in researching power usage and sustainability for two decades.
I made my own notes on that piece here (for if you don't have a half hour to spend reading the original): https://simonwillison.net/2025/Jan/12/generative-ai-the-powe...
I find the following to be a great point regarding what we ought to consider when adapting our lifestyle to reduce negative environmental impact:
> In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives.
The further example with a hospital emitting more than a cruise ship is a good illustration of the issue.
Continuing this line of thought, when thinking about your use of an LLM like ChatGPT, you ought to weigh not merely its emissions and water usage, but also the larger picture as to how it benefits the human society.
For example: Was this tech built with ethically sound methods[0]? What are its foreseeable long-term effects on human flourishing? Does it harm the livelihoods of many people while increasing the wealth gap with the tech elites? Does it negatively impact open information sharing (willingness to run self-hosted original-content websites or communities open to the public, or even the feasibility of doing so[1][2]), motivation and capability to learn, creativity? And so forth.
[0] I’m not going to debate utilitarianism vs. deontology here, will just say that “the ends justify the means” does not strike me as a great principle to live by.
[1] https://news.ycombinator.com/item?id=42549624
Hello Simon,
You mention that
> Google, Microsoft, Meta and Amazon all have net-zero emission targets which they take very seriously, making them "some of the most significant corporate purchasers of renewable energy in the world". This helps explain why they're taking very real interest in nuclear power.
Nuclear is indeed (more or less) zero-emission, but it's not renewable.
Thank you for the synthesis and link to the original article, it's a good read!
Such a stupid post. I know people on HN don’t like absolute descriptors like that, and sorry for that.
Obviously LLMs like ChatGPT don’t use the most energy when answering your question; they churn through insane amounts of water and energy during training, so much so that big tech companies don’t disclose those amounts and try to obscure them as much as possible.
You aren’t destroying the environment by using it RIGHT NOW, but you are telling the corresponding company that owns the LLM you use “there is interest in this product”, en masse. With these interest indicators they will plan for the future and plan for even more environmental destruction.
It's not like they are mixing that water with oil and pumping it into the aquifer. Water evaporates, turns into clouds, which precipitate into rain that falls on the ground and into water bodies, where it can be used again. So what's the problem with datacenter water usage? Has the water cycle stopped and I wasn't informed?
Fresh water is finite. Infinite in reuse, but we can only take so much from a river before that river ceases to be. If you have a megabit connection, it doesn't matter that your cloud backups have infinite storage, you are limited by bandwidth.
Water vapor stays aloft for a while, so there's no guarantee it enters the same watershed it was drawn from.
It's also a powerful greenhouse gas, so even though it's removed quickly, raising the rate we produce it results in more insulation.
Even if it's not a finite resource, we need to be judicious and wise in how we allocate it.
[flagged]
Plenty of companies have revealed exactly how much energy and CO2 they used training a model. Just off the top of my head, I've seen those numbers for Meta's Llama models, Microsoft's Phi series and DeepSeek's models - including their impressive DeepSeek v3, which trained for less than $6m in cost - a huge reduction compared to other similar models, and a useful illustration of how much more efficient this stuff can get on the training side of things.
Anyone care to have a go at back of the envelope number for training energy use amortized per query for ChatGPT's models? Is the training or the inference going to dominate?
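Here's one very rough way to frame it; every number below is an illustrative assumption, not a disclosed figure for any particular model:

    # Very rough amortization sketch. All inputs are illustrative assumptions.
    training_energy_kwh = 1.3e6    # assume ~1.3 GWh for a GPT-3-class training run
                                   # (ballpark of published GPT-3 estimates)
    queries_per_day = 1e9          # assume ~1 billion queries/day across all users
    lifetime_days = 365            # assume the model serves traffic for a year
    inference_wh_per_query = 3.0   # assume ~3 Wh per query (commonly cited ballpark)

    amortized_training_wh = training_energy_kwh * 1000 / (queries_per_day * lifetime_days)
    print(f"training, amortized: ~{amortized_training_wh:.4f} Wh/query")  # ~0.0036 Wh
    print(f"inference:           ~{inference_wh_per_query} Wh/query")
    # Under these assumptions inference dominates by roughly three orders of
    # magnitude; fewer queries or a much larger training run shifts the balance.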
Similar feelings about the repeated references to the apparently agreed consensus that individual action is pointless vs. systemic change like switching to a renewable energy system. Jevons Paradox would like a word.
I don’t care about energy usage. How exhausting it must be to be a climate hysterical person and try to factor the climate cost of every single action you take in life.
Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great. But I refuse to worry and be burdened by anxiety over every single unit of electricity consumed. Such a tiring, bullshit part of life the progressives have foisted on elites. And it is elites only as poors don’t give a shit
> Charge the consumer of energy the requisite price. If you want to make them pay for some externality, great.
> And it is elites only as poors don’t give a shit
The poor people are also consumers; raising prices of energy for that group is a fantastic way to get kicked out of office even if you're an actual literal dictator.
People are complex.
The screeds you're objecting to are part of the political process of telling governments to do something, even if that something ends up being a mix of what you suggest plus subsidies for the poor, or something completely different - in any case, to avoid being defenestrated.
The title does not match the content.
A more appropriate title is "Emissions caused by chatgpt use are not significant in comparison to everything else."
But, given that title, it becomes somewhat obvious that the article itself doesn't need to exist.
> "Emissions caused by chatgpt use are not significant in comparison to everything else."
Emissions directly caused by the Average Joe using ChatGPT are not significant compared to everything else. 50,000 questions is a lot for an individual using ChatGPT casually, but nothing for the businesses using ChatGPT to crunch data. 50,000 "questions" will be lucky to get you through the hour.
Those businesses aren't crunching data just for the sake of it. They are doing so ultimately because that very same aforementioned Average Joe is going to buy something that was produced out of that data crunching. It is the indirect use that raises the "ChatGPT is bad for the environment" alarm. At the very least, we don't have a good handle on what the actual scale is. How many indirect "questions" am I asking ChatGPT daily?
> given that title, it becomes somewhat obvious that the article itself doesn't need to exist.
Why? I regularly hear people trying to argue that LLMs are an environmental distaster.
Because LLMs are an environmental disaster.
It's not about any individual usage. It's about the global technology as a whole, which has yet to prove useful and is already bad for the environment.
Any new usage should be free of impact on the environment.
(Note: The technology of LLM itself is not an environmental disaster, but how it is put in use currently isn't the way).
8 replies →
The article needs to exist because the idea that ChatGPT usage is environmentally disastrous really has started to make its way into the human hive mind.
I'm glad someone is trying to push back against that - I see it every day.
Training a new model (like GPT-4) is way more costly than running it.
Hey all I wrote this post. To clear up a few points:
I meant this post to tell individuals that worrying about the emissions they personally cause using ChatGPT is silly, not that AI more broadly isn't using a lot of energy.
I can't really factor in how demand for ChatGPT is affecting the future of AI. If you don't want to use ChatGPT because you're worried about creating more demand, that's more legit, but worrying about the emissions associated with individual searches right now, on their own, is a silly distraction.
One criticism is that I didn't talk about training enough. I included a section on training in the emissions and water sections, but if there's more you think I should address or change I'm all ears. Please either share them in the comments on the post or here.
I saw someone assumed I'm an e/acc. I'm very much not and am pretty worried about risks from advanced AI. Had hoped the link to an 80,000 Hours article might've been a clue there.
Someone else assumed I work for Microsoft. I actually exclusively use Claude but wanted to write this for a general audience and way fewer people know about Claude. I used ChatGPT for some research here that I could link people to just to show what it can do.
If using ChatGPT somehow saves you from making one trip to the doctor in your car, it can offset an entire year's worth of ChatGPT usage in terms of CO2 impact.
if your use of chatgpt saves you from a trip to the doctor I would be very concerned
Early days, but not as crazy as it sounds: https://jamanetwork.com/journals/jamanetworkopen/fullarticle...
"The LLM alone scored 16 percentage points (95% CI, 2-30 percentage points; P = .03) higher than the conventional resources group."
1 reply →
ChatGPT is probably adequate to provide a slightly more user-friendly but also slightly less reliable replacement for a reliable consumer-oriented medical reference book or website, for the task of determining whether self-care or seeing a doctor is appropriate for symptoms not obviously posing an immediate emergency.
Most doctor visits are for benign matters...
5 replies →
If ChatGPT somehow makes you eat more burgers, it could make water consumption worse.
Sort of off-topic, but it does make one think about usage of compute (and the backing energy / resources required for that)...
i.e. it doesn't seem too much of an exaggeration to say that we might be getting closer and closer to a situation where LLMs (or any other ML inference) are being run so much, for so many different reasons/requests, that the usage does become significant in the future.
Similarly, going into detail on what the compute is being used for: i.e. no doubt there are situations currently going on where Person A uses a LLM to expand something like "make a long detailed report about our sales figures", which produces a 20 page report and delivers it to Person B. Person B then says "I haven't time to read all this, LLM please summarise it for me".
So you'd basically have LLM inference compute being used as a very inefficient method of data/request transfer, with the sender expanding a short amount of information to deliver to the recipient, and the recipient then using an LLM on the other side to reduce it back to something more manageable.
That sounds like the opposite of data compression (inflation?) where the data size is increased before sending, then on receiving it is compressed back to a smaller form.
Lossy data inflation.
Data center emissions probably 662% higher than big tech claims. Can it keep up the ruse?: https://www.theguardian.com/technology/2024/sep/15/data-cent...
To me the "ChatGPT is destroying the environment" card always felt somewhat like bad-faith arguing from the anti-AI crowd trying to find any excuse to be against AI. Like, the same people who complained about "using AI is destroying the environment" seemed to have no issue with boarding a plane which would emit a bunch of CO2 so that they could have a vacation in Europe or the like. Selective environmentalism.
Who is this person you’re constructing? Being concerned about plane emissions and travel is an incredibly common thing and people are adjusting their lifestyles accordingly - lots of overnight sleeper train lines are reopening due to demand.
It's less bad faith, more a meme that has become so prevalent that it's impossible to dispel, as it's something too nuanced for social media. I've seen more than a few social media posts asking "do they really cut down a rainforest every time someone generates an AI image?"
I mean, it's a literal net new expenditure of power and water. I also deeply doubt they have "no issue" with plane travel. You're just assuming the worst and most hypocritical position to someone, which seems deeply bad faith as well.
It's literally true that most of the AI vendors and their data center partners are writing off energy and water conservation targets they'd had for the near future because of LLM money. That is actually bad in the short and likely long term, especially as people use LLMs for increasingly frivolous things. Is it really necessary to have an LLM essentially do a "I'm Feeling Lucky" Google Search for you at some multiple of that environmental cost? Because that's what most of my friends and coworkers use ChatGPT for. Very rarely are they using it to get anything more complex than just searching wikipedia or documentation you could have had bookmarked.
A person has a choice about whether they take a flight and whether it's worth it to them. They have no power, except raising a complaint in public, over whether OpenAI or Google or whoever spends vast amounts of money and power to train some new model. If your bar is that no one is allowed to complain about a company burning energy unless they live a totally blameless life farming their own food without electricity, then random companies will get to do any destructive act they want.
You are talking about hypocrites.
What about the people who take the protection of the environment seriously?
They now face a setback, because not only did we fail to reach our previous goals on lowering energy consumption, but now we've put new consumption on top of that. Just because the existing consumption is worse doesn't make the new one good.
There is a reason why MS missed its CO2 targets and why everyone is in search of more energy sources.
They all create more CO2.
What does "water used by data center" even mean? Does it consume the water somehow? What does it turn into? Steam? So uploading a 1GB file boils away nearly 1 liter of water? Or is it turned into bits somehow in some kind of mass-to-energy conversion? I sorta doubt that. Also, this would mean data centers have cooling towers like some power stations. Are we talking about the cooling towers of power stations?
I think that graph, at least, is complete nonsense. I will try and have ChatGPT explain it to me.
> what does "water used by data center" even mean?
This doesn't clarify what exactly it includes, but there are two main things that generally are included:
(1) Direct water use for cooling (which, yes, ends up as steam from cooling towers), and
(2) Water used in generating electricity consumed by data centers, which, yeah, is again evaporated in cooling towers.
Yes, datacenters have cooling towers. There are lots of good articles about this topic. A good starting point is "water usage effectiveness" (WUE) which is one way this is tracked.
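As a rough illustration of how a WUE figure turns into per-request water use (all inputs below are assumptions picked for the example, not measurements):

    # Illustrative only: converting an assumed per-query energy figure into litres
    # via WUE (litres of water evaporated per kWh). All inputs are assumptions.
    energy_per_query_kwh = 0.003   # assume ~3 Wh per query
    onsite_wue = 1.8               # assume ~1.8 L/kWh evaporated on-site
    offsite_wue = 3.0              # assume ~3 L/kWh evaporated at the power plant

    litres = energy_per_query_kwh * (onsite_wue + offsite_wue)
    print(f"~{litres * 1000:.0f} ml of water per query under these assumptions")  # ~14 ml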
The one near here just has heat exchangers. But even if all the others use evaporative cooling, "potential water usage" is extremely misleading, because it's not like the water is consumed - it's just temporarily unavailable.
Also, why doesn't uploading a 1GB file to my NAS boil a liter of water? Are maybe all the switches and routers between me and the datacenter water-cooled? I can see such switches existing, but I don't see them being the norm. Why doesn't the DSLAM on the street outside emit steam? Is there maybe one bad switch somewhere that just spews steam?
What I am saying is that, without further explanation, that graph is... bad.
3 replies →
"personal carbon footprint" is a term invented by BP and is the single hack that derailed the environment discussion by making people personally responsible and removing the actual polluters from the discussion.
The article could indeed be written by the same kind of people, given it glosses over training cost as if AI companies aren't pushing datacenter power/GPU capacities to the limit to produce incrementally better models by brute force. It all falls apart as soon as you stop individualizing the numbers and add back in all of the supposedly non-recurring (but actually recurring, because we can't stop ourselves from redoing it) costs.
> and it’s completely clear to me that one side is getting it entirely wrong and spreading misleading ideas
What a great way to start an article. I get it as: "I am not open to listening to your arguments, and in fact if you disagree with me, I will assume that you are a moron".
It reminds me of people saying "planes are not the problem: actually if you compare it to driving a car, it uses less energy per person and per km". Except that as soon as you take a passenger in your car, the car is better (why did you assume that the plane was full and the car almost empty?). And that you don't remotely drive as far with your car as you fly with a plane. Obviously planes are worse than cars. If you need to imagine people commuting by car to the other side of the continent to prove your point, maybe it's not valid?
The fact is that the footprint of IT is increasing every year. And quite obviously, LLMs use more energy than "traditional" searches. Any new technology that makes us use more energy is bad for the environment.
Unless you don't understand how bad the situation is: we have largely missed the goal of keeping global warming to 1.5C (thinking that we could reach it is absurd at this point). To keep 2C, we need to reduce global emissions by 5% every year. That's a Covid crisis every year. Let's be honest, it probably won't happen. So we'll go higher than 2C, fine. At the other end of the spectrum, 4C means that a big stripe (where billions of people live) around the equator will become unlivable for human beings (similar to being on Mars: you need equipment just to survive outside). I guess I don't need to argue how bad that would be, and we are currently going there. ChatGPT is part of that effort, as a new technology that makes us increase our emissions instead of doing the opposite.
I take your general point, but:
> Except that it doesn't work if you don't drive your car alone (if you assume the plane is full of passengers, why not assuming that the car is, as well?)
These can be measured for averages. Lots of cars with one person in them, seldom cars fully packed; lots of planes fully packed, seldom (but it does happen) that the plane is almost empty.
> we have largely missed the goal of keeping global warming to 1.5C (thinking that we could reach it is absurd at this point).
Probably, yes; last year passed the threshold — it would be a pleasant *surprise* if that turned out to have been a fluke 14* years early.
* 14 because it would take 14 years for the exponential — seen for the last 30 years — for PV to replace all forms of power consumption; not just electricity, everything. But even then we'd also need to make rapid simultaneous progress with non-energy CO2 sources like cattle and concrete.
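For what it's worth, a sketch of that extrapolation with assumed inputs (the 14-year figure above implies slightly different ones):

    # Sketch of the "years until PV covers everything" extrapolation.
    # Inputs are assumptions, not the parent's actual numbers.
    import math

    pv_share_now = 0.02   # assume PV is ~2% of total energy consumption today
    growth_rate = 0.25    # assume ~25%/year sustained growth

    years = math.log(1 / pv_share_now) / math.log(1 + growth_rate)
    print(f"~{years:.0f} years")   # ~18 years under these assumptions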
> around the equator will become unlivable for human beings (similar to being on Mars: you need equipment just to survive outside)
In so far as your bracket, sure; but there's a huge gap in what equipment you would need.
The comparison I often make is that Mars combines the moisture of the Sahara, the warmth of the Antarctic, the air pressure of the peak of Mount Everest, and the soil quality of a superfund cleanup site, before then revealing that it's actually worse on all counts.
> These can be measured for averages. Lots of cars with one person in them
Sure, but the point should be that we should strive to share cars, not that it's okay to take the plane! Especially given the second argument which is that you don't drive 1000km every time you take your car. The footprint per km is not enough: when you take the plane you typically go much further!
> Probably, yes; last year passed the threshold
That, plus the IPCC scenario that keeps us under 1.5C says that in a few decades, not only will we not be extracting any carbon anymore, but we will be pumping carbon underground faster than we are extracting it now! And that's with the IPCC models, which tend to be optimistic (we measure that every year)!
> 14 because it would take 14 years for the exponential — seen for the last 30 years — for PV to replace all forms of power consumption
And you would have to take into account that PV today entirely relies on oil. We are going towards a world with less and less oil, and we don't know how it will impact our capacity of production for PVs. But probably it won't help.
> In so far as your bracket, sure; but there's a huge gap in what equipment you would need.
Sure. It was a quick way to say that the combination of humidity and temperature will be such that sweating won't help humans regulate their temperature. And when we can't regulate our temperature, we die. By any account, this means that billions of people will have to relocate, which means global wars (with entire countries moving with their entire armies).
Now of course that would be infinitely better than trying to live on Mars, which is why it is preposterous to even consider Mars.
> It is extremely bad to distract the climate movement with debates about inconsequential levels of emission
This. So we should focus on optimizing transport, heating, energy and food.
It's discomforting to me when people compare resource usage of ChatGPT, a computer, to the resource usage of a human being.
I've seen charts like this before that compare resource usage of people to corporations, implying corporations are the bigger problem. The implication here seems to be the opposite, and that tone feels just a little eugenicist.
The use of ChatGPT doesn’t replace the other things; it comes on top of them.
MS is missing its CO2 targets because of AI not because of burgers.
The whole argument is, it’s not bad because other things are worse.
We are racing towards the abyss but don’t worry AI only accelerates a little more.
A lot of conversations regarding the environment feel frustrating because they are either qualitative, use aggregate high-level data, or boil down to "we'll be dead in 50 years" (lol, my personal favorite).
Why not start capturing waste/energy data for all human-made items, like nutritional data on food? It won't add much overhead or stifle economies the way people fear.
That way, when I log in to use any online service, or when I buy/drive a car, or when I buy an item, I can see exactly how much energy was consumed and how much waste I produced.
In my country there's a lot of institutional hype about green algorithms. I find the whole idea quite irrelevant (for the reasons explained in this post) but of course, it's a way to get funding for those of us who work in AI/NLP (we don't have much money for GPUs and can't do things like training big LLMs, so it's easy to pitch everything we do as "green", and then get funding because that's considered strategic and yadda yadda).
It's funny, but sad, how no one calls out the bullshit, because we would be sabotaging ourselves.
Enforce a global carbon tax, price it in, and tax the land
International tax agreements are crucial if any progress is to be made on preventing climate disaster.
Which major parties support it? Who is even talking about it?
It’s such an obviously needed mechanism, but hard to get anyone enthused about it.
This group have some proposals on the topic —
https://en.m.wikipedia.org/wiki/Association_for_the_Taxation...
The major players in AI are collectively burning 1-2 gigawatts, day and night, on research and development of the next generation of LLMs. This is as much as my city of a million people. The impact is real, and focusing on inference cost per query kind of misses the point. Every person who uses these tools contributes to the demand and bears some of the responsibility. Similar to how I have responsibility for the carbon emissions of a flight, even if the plane would have flown without me.
I'm saying this as someone who finds LLMs helpful, and uses them without feeling particularly guilty about it. But we should be honest about the costs.
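A quick sanity check of that comparison, with assumed inputs:

    # Sanity check of "1-2 GW ≈ a city of a million people".
    # Inputs are rough assumptions, not figures for any particular city.
    per_capita_kwh_year = 12_000   # assume ~12 MWh/person/year total electricity (US-ish, all sectors)
    population = 1_000_000

    avg_demand_gw = population * per_capita_kwh_year / (365 * 24) / 1e6
    print(f"~{avg_demand_gw:.1f} GW average demand")   # ~1.4 GW, so the comparison is plausible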
The numbers are wrong. Just trying to stay on topic: data centers and AI in reality consume a lot more resources than your article poses.
Perhaps off topic, what exactly does the “one way european flight” mean in the context of avoiding co2 emissions? I.e. what is the choice or scenario here?
Comparing magnitudes does not prove the headline, poor analysis.
Why did this get flagged?
I emailed hn@ to try to find out, I’ve no idea why.
Using it, sure, now do operating it and building it.
I mean... doing most things is bad for the environment. Do less of everything you don't need to do.
I published this as a comment as well, but it's probably worth noting that the ChatGPT water/power numbers cited (the ones most widely cited in these discussions) come from an April 2023 paper (Li et al, arXiv:2304.03271) that estimates water/power usage based off of GPT-3 (175B dense model) numbers published from OpenAI's original GPT-3 2021 paper. From Section 3.3.2 Inference:
> As a representative usage scenario for an LLM, we consider a conversation task, which typically includes a CPU-intensive prompt phase that processes the user’s input (a.k.a., prompt) and a memory-intensive token phase that produces outputs [37]. More specifically, we consider a medium-sized request, each with approximately ≤800 words of input and 150 – 300 words of output [37]. The official estimate shows that GPT-3 consumes an order of 0.4 kWh electricity to generate 100 pages of content (e.g., roughly 0.004 kWh per page) [18]. Thus, we consider 0.004 kWh as the per-request server energy consumption for our conversation task. The PUE, WUE, and EWIF are the same as those used for estimating the training water consumption.
There is a slightly newer paper (Oct 2023) that directly measured power usage on a Llama 65B (on V100/A100 hardware) that showed a 14X better efficiency. [2] Ethan Mollick linked to it recently and got me curious since I've recently been running my own inference (performance) testing and it'd be easy enough to just calculate power usage. My results [3] on the latest stable vLLM from last week on a standard H100 node w/ Llama 3.3 70B FP8 was almost a 10X better token/joule than the 2023 V100/A100 testing, which seems about right to me. This is without fancy look-ahead, speculative decode, prefix caching taken into account, just raw token generation. This is 120X more efficient than the commonly cited "ChatGPT" numbers and 250X more efficient than the Llama-3-70B numbers cited in the latest version (v4, 2025-01-15) of that same paper.
For those interested in a full analysis/table with all the citations (including my full testing results) see this o1 chat that calculated the relative efficiency differences and made a nice results table for me: https://chatgpt.com/share/678b55bb-336c-8012-97cc-b94f70919d...
(It's worth pointing out that that used 45s of TTC, which is a point that is not lost on me!)
[1] https://arxiv.org/abs/2304.03271
[2] https://arxiv.org/abs/2310.03003
[3] https://gist.github.com/lhl/bf81a9c7dfc4244c974335e1605dcf22
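For anyone who wants to redo that kind of comparison, the core arithmetic is just node power draw over token throughput; the inputs below are placeholders, not the measured figures from the links above:

    # Minimal joules-per-token / Wh-per-query arithmetic, with placeholder inputs.
    node_power_watts = 10_000   # assume an 8xH100 node drawing ~10 kW under load
    tokens_per_second = 6_000   # assume aggregate throughput across concurrent requests
    tokens_per_query = 1_000    # assume ~1k output tokens per response

    joules_per_token = node_power_watts / tokens_per_second        # ~1.7 J/token
    wh_per_query = joules_per_token * tokens_per_query / 3600      # ~0.46 Wh
    print(f"{joules_per_token:.2f} J/token, ~{wh_per_query:.2f} Wh per response")
    # Swap in your own measured throughput and node power to redo the comparison.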
Just build nuclear.
That doesn't solve the problem
Indeed, it doesn't solve the problem that people will misinterpret data and spread misinformation to justify their bad feeling about AI with invalid arguments.
In the same way that leaving food out doesn’t create a cockroach infestation.
What about training it?
Yeah, there are lots of things worse than LLMs. But, at least, they are useful and profitable.
"Other bad things exist" does not mean this thing isn't bad. Or absolutely could be that all these other things are huge energy wasters AND chat gpt is an energy waster.
We have to stop thinking about problems so linearly -- it's not "solve only the worst one first", because we'll forever find reasons to not try and solve that one, and we'll throw up our hands.
Like, we're well aware animal agriculture is a huge environmental impact. But getting everyone to go vegetarian before we start thinking about any other emissions source is a recipe for inaction. We're going to have to make progress, little by little, on all of these things.
Except that's not the point of this: the point is that humans have absolutely finite time to think about problems, and so the way you distract from a problem is by inventing a new more exciting one.
LLMs are in the news cycle, so sending all the activists after LLMs sure does a good job of ensuring they're not going after anything that would be more effective, doesn't it? (Setting aside, for the moment, my thoughts on the utility of the 'direct action' type activists, who I think have been useless for a good long while now - there could not possibly be more 'awareness' of climate change.)
Maybe because we didn’t make real progress on the existing pollution and have a greater chance of stopping a new polluter.
How long are climate change and its reasons known?
In the end people vote climate change deniers because they don’t like the inconvenient truth
Then again, if you can stall a nascent polluter before it becomes entrenched, maybe that's the right time to intervene. Getting people to not eat meat is hard, we've been eating meat forever. Getting people to not use LLMs? That's where most of us were up until very recently.
1 reply →
[dead]
This guy must work at Microsoft.