The environment is a terrible reason to avoid ChatGPT
People are saying you shouldn’t use ChatGPT due to statistics like:
A ChatGPT query emits 10x more emissions than a Google search.
Writing an email with ChatGPT uses a whole bottle of water.
ChatGPT uses as much energy as 20,000 households.
These stats are wrong or misleading. They’re bad reasons to not use AI.
1. These estimates are often far too high
The claim that a ChatGPT query uses 10x the energy of a Google search is based on an estimate from 2023 that each query uses 3 watt-hours.
But AI models have become dramatically more efficient, and more detailed estimates have since been made. In 2025, the non-profit Epoch AI estimated that a typical ChatGPT query uses 0.3 Wh, a figure later confirmed by the CEO of OpenAI, as well as by Google. That’s a tenth of the original estimate, and would make a query roughly equivalent to a Google search.
The bottle-of-water-per-email claim comes from the Washington Post, which gives no source or working and represents a worst-case scenario. A more realistic estimate is 2ml per query. So even if you make 10 queries to write a single email, that’s 20ml of water, 25 times less than a 500ml bottle.
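As a sanity check, here is that arithmetic spelled out (a sketch using the post's figures: roughly 2ml of water per query, a standard 500ml bottle, and an assumed 10 queries to draft one email):

```python
# Back-of-the-envelope check of the water-per-email claim.
# Assumptions from the post: ~2 ml per query, 10 queries per email,
# compared against a standard 500 ml bottle.
ML_PER_QUERY = 2
QUERIES_PER_EMAIL = 10
BOTTLE_ML = 500

water_per_email_ml = ML_PER_QUERY * QUERIES_PER_EMAIL  # 20 ml
ratio = BOTTLE_ML / water_per_email_ml                 # 25.0

print(f"Water per email: {water_per_email_ml} ml, "
      f"{ratio:.0f}x less than a bottle")
```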
2. AI’s energy use is tiny relative to other things
The 0.3 watt-hours needed for one prompt is about the same as running a 10-watt LED bulb for two minutes.
There are just as many grounds for criticising the energy consumption of Netflix as of ChatGPT, but worrying about either is silly. Our entire online lives – all the streaming, browsing and zooming we do – use only about 2% of total energy.1 AI, in turn, remains under 20% of that.2
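Multiplying those two footnoted figures gives AI's share of total energy use, a quick sanity check on the post's numbers (2% is an approximation and 20% an upper bound):

```python
# AI's rough share of total energy use, per the post's footnotes:
# online activity is ~2% of total energy, and AI is under 20% of that.
online_share = 0.02           # all online activity, fraction of total energy
ai_fraction_of_online = 0.20  # AI's upper-bound fraction of online energy

ai_share_of_total = online_share * ai_fraction_of_online
print(f"AI is under {ai_share_of_total:.1%} of total energy use")
# prints "AI is under 0.4% of total energy use"
```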
Reducing how much you fly, eat meat or heat your home will reduce emissions hundreds of times more than cutting your use of ChatGPT.

The same is true of water. The average American uses 1600 liters of water per day, so even if you make 100 prompts per day at 2ml per prompt, that’s only about 0.01% of your total water consumption. Running a shower for one second uses far more. We would never worry about conserving this little water in any other context.
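The same kind of check works here (a sketch using the post's figures: 2ml per prompt, 100 prompts per day, 1600 litres of daily water use):

```python
# Share of daily water use from heavy ChatGPT use.
# Assumptions from the post: 2 ml per prompt, 100 prompts per day,
# and 1600 litres of total daily water use for the average American.
ML_PER_PROMPT = 2
PROMPTS_PER_DAY = 100
DAILY_WATER_ML = 1600 * 1000  # 1600 litres in millilitres

share = ML_PER_PROMPT * PROMPTS_PER_DAY / DAILY_WATER_ML
print(f"{share:.4%} of daily water use")  # prints "0.0125% of daily water use"
```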
All this is because the virtual world is far more energy efficient than the ‘real’ one. Reading an ebook for an hour uses about 20 times less energy than reading a paper one. In fact, a study in Nature estimated that using GPT produces 100-1000x lower emissions than having a human do the same work. Human workers commute to climate-controlled offices, which uses a lot of energy. The virtual world is also already electrified, making it easier to decarbonise. If your sole goal were to reduce CO2 emissions, you should be hoping to move everything online and automate as much as possible. (Though personally I think that’s a bad goal.)
Isn’t AI’s energy use growing rapidly? Yes, but that’s because people find it really useful. It’s extremely misleading to talk about energy consumption without setting it against the value created. Everything we do uses some energy; doing things online uses comparatively little, and never going online again would be rather costly, so it should be one of the last things to cut. The International Energy Agency even estimates that AI could reduce more emissions than it produces, by better optimising transport and power generation.
3. Cutting individual emissions is an inefficient way to fight climate change in the first place
A typical citizen of the US or EU emits 5-15 tonnes of CO2 per year, so cutting your emissions to zero would theoretically save that much. But spending $1000 per year on carbon credits would reduce emissions by the same amount, and be a helluva lot easier.3
And that’s not the most efficient option. Founders Pledge is a philanthropic advisory that has searched for the charities that best reduce CO2 emissions. They’re skeptical of many of the options, but estimate that the Clean Air Task Force, which advocates for investment in neglected green energy technology, has reduced emissions in the past for well under $10 per tonne.4 A donation of $1000 would therefore likely reduce emissions by over ten times as much as cutting your personal emissions to zero.
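The comparison can be made concrete with a rough calculation (assumptions: a ~10-tonne personal footprint, the midpoint of the 5-15 range above; credits at ~$100/tonne; Clean Air Task Force at ~$10/tonne):

```python
# Three ways to cut CO2 with the same effort, using the post's figures.
# Assumptions: ~10 tonnes/year personal footprint (midpoint of 5-15),
# EU carbon credits at ~$100/tonne, CATF donations at ~$10/tonne.
BUDGET = 1000          # dollars per year
PERSONAL_TONNES = 10   # cutting your own emissions to zero

credits_tonnes = BUDGET / 100  # tonnes avoided via carbon credits
catf_tonnes = BUDGET / 10      # tonnes avoided via a CATF donation

print(f"Personal cut: {PERSONAL_TONNES} t, "
      f"credits: {credits_tonnes:.0f} t, "
      f"CATF donation: {catf_tonnes:.0f} t")
```

On these numbers, $1000 in credits roughly matches zeroing out your personal emissions, while the same donation to CATF avoids about ten times more.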
This makes sense because your donations can be directed towards the most efficient ways of reducing CO2 emissions in the entire world. This probably looks more like investment in green energy, electrification and policy change than you scrimping on your showers.
I used donations to illustrate, but the same point applies to where you direct your time. Fighting climate change is important, but we should focus our time and money towards what reduces emissions the most for the least cost. What you do with your donations, political influence, volunteering and most of all your career matters thousands of times more than your personal emissions.
In sum
AI’s energy consumption is only a small fraction of your online activities, which are only a small fraction of your personal emissions, which are only a small driver of your potential impact on climate change.
There are real reasons to be concerned about AI – from total transformation of the economy, to loss of control, to WW3 or gradual disempowerment – but carbon emissions from personal use of the existing models isn’t one of them. It’s like worrying about plastic straws when an asteroid is hurtling towards Earth.
Thank you to Andy Masley for inspiring this post and providing a lot of the research. Please check out his Substack.
All US data centres use about 4% of electricity as of 2024. If we include all the power used on end-devices like smartphones, and on electricity transmission, we might end up at ~8% of electricity used on the internet.
In the US, only about 21% of energy is used on electricity, so the total energy consumption of all online activities is under 10%*21% = 2.1%.
What we know about energy use at U.S. data centers amid the AI boom, Pew Research, October 2025, link.
How much electricity is used for lighting in the United States?, U.S. Energy Information Administration
https://archive.ph/hrRzc
AI workloads are perhaps 5-15% of data centre consumption (e.g. see this estimate by Goldman Sachs), and datacentres are perhaps half of the electricity used to run the internet. This is projected to rise, but will likely still remain a minority for years ahead.
EU carbon credits cost under $100 per tonne. If you buy one and don’t exercise it, companies in the EU are legally permitted to emit one tonne less of CO2 in total.
For instance, they believe that even a conservative estimate of their past work reduced emissions for $1.63 per tonne. See the background section of their full report (which also discusses the broader case for thinking we can reduce emissions far more effectively than carbon credits).
https://www.founderspledge.com/research/changing-landscape



This feels quite misleading to me; below are my arguments.
Future growth concerns: While current per-query energy use is relatively low, the article doesn't fully address concerns about AI's rapidly growing total energy consumption as usage scales.
Training vs inference: The article focuses on inference (running queries) but doesn't discuss training costs, which are substantially higher.
Water usage calculations: Data center water use is complex and varies by cooling method and location. The 2ml estimate may not capture all indirect water consumption.
What is low per query can sum to a very large impact across millions of users, frequent use, model retraining, data-centre proliferation, hardware lifecycle costs, rare-earth mining, e-waste disposal, cooling infrastructure, etc. Indeed, recent reports find that AI’s environmental footprint, direct and indirect, is nontrivial.
There is a rhetorical stance that seems to treat “naturalness” or “environment as cost” purely instrumentally: i.e. environment only matters insofar as humans benefit. That risks falling into the classic philosophical criticism of “appeal to nature”: just because something is natural (or artificial) doesn’t by itself make it morally good or bad.
Hope these neuromorphic chips, which show 100-fold energy reductions, will come to the rescue!!
https://www.perplexity.ai/page/brain-inspired-chips-slash-ai-bSeuGr3OQ7i3YTVOOhKbsQ