Discussion about this post

Madhusudhan Pathak

This feels quite misleading to me; my arguments are below.

Future growth concerns: While current per-query energy use is relatively low, the article doesn't fully address concerns about AI's rapidly growing total energy consumption as usage scales.

Training vs inference: The article focuses on inference (running queries) but doesn't discuss training costs, which are substantially higher.

Water usage calculations: Data center water use is complex and varies by cooling method and location. The 2ml estimate may not capture all indirect water consumption.

What is “low” per query can sum to a very large impact across millions of users, frequent use, model retraining, data-centre proliferation, hardware lifecycle costs, rare-earth mining, e-waste disposal, cooling infrastructure, etc. Indeed, recent reports find that AI’s environmental footprint, direct and indirect, is non-trivial.
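The aggregation point above can be made concrete with a back-of-envelope calculation. All figures here are illustrative assumptions, not measured values (only the 2 ml water estimate comes from the discussion above; the per-query energy and global query volume are placeholders):

```python
# Back-of-envelope aggregation: small per-query footprints times a large
# query volume. Every constant below is an illustrative assumption.
ENERGY_PER_QUERY_WH = 0.3        # assumed per-query energy, in watt-hours
WATER_PER_QUERY_ML = 2.0         # the 2 ml per-query estimate discussed above
QUERIES_PER_DAY = 1_000_000_000  # assumed global daily query volume

# Convert to fleet-scale units: Wh -> MWh (/1e6), ml -> cubic metres (/1e6).
daily_energy_mwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1e6
daily_water_m3 = WATER_PER_QUERY_ML * QUERIES_PER_DAY / 1e6

print(f"{daily_energy_mwh:.0f} MWh/day, {daily_water_m3:.0f} m^3/day")
# With these assumed inputs: 300 MWh/day and 2000 m^3/day.
```

Under these assumptions, figures that look negligible per query add up to hundreds of megawatt-hours and thousands of cubic metres of water per day, before counting training, hardware lifecycle, or indirect costs.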

There is a rhetorical stance that seems to treat “naturalness”, or the environment as a cost, purely instrumentally: i.e. the environment only matters insofar as humans benefit. That risks falling into the classic philosophical criticism of the “appeal to nature”: just because something is natural (or artificial) doesn’t by itself make it morally good or bad.

Intellibytes

Hopefully the neuromorphic chips that show 100-fold energy reductions will come to the rescue!

https://www.perplexity.ai/page/brain-inspired-chips-slash-ai-bSeuGr3OQ7i3YTVOOhKbsQ

