Environmental impacts of large language models

research
Published: October 22, 2025

Just want to quickly highlight a few investigative blogs on the environmental impacts of LLMs.

There are two main issues: water use (for cooling servers) and carbon emissions from manufacturing the hardware, training the models, and then using the models to generate responses (‘inference’).

This blog post does a deep dive into the water issue. The key finding is that the water concerns seem to be overstated (he says ‘fake’, but I don’t agree with that word choice; water issues are real, just overstated).

Then academic Hannah Ritchie, author of the ‘Sustainability by Numbers’ page, sums up the carbon cost of ChatGPT. She draws heavily on Andy Masley’s detailed work.

In particular, the blog’s author (Andy Masley) argues that the public has a limited attention span for environmental issues and that AI isn’t the biggest priority. He sums it up nicely:

“AI water use is mostly fake, personal prompt costs are mostly fake, but AI as a whole is going to put a huge strain on our electrical grid going forward that’s going to matter a lot for the green energy transition, local electricity prices, and air pollution”

By ‘fake’, he means your personal usage is trivially small. He highlights golf courses and video streaming as much larger sources of water use and emissions, respectively.

What also matters is the counterfactual, i.e. what would you be doing if you weren’t using ChatGPT? If you would otherwise be online shopping, using a microwave or streaming videos, then your personal emissions are probably slightly lower when you switch that time to ChatGPT.
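To make that counterfactual concrete, here is a minimal back-of-envelope sketch. Every figure in it (energy per prompt, prompts per hour, streaming energy, grid carbon intensity) is an illustrative assumption I have plugged in, not a number taken from the posts above, so treat the output as an order-of-magnitude comparison only.

```python
# Rough counterfactual comparison: an hour spent prompting a chatbot vs. an
# hour of video streaming. All figures below are illustrative assumptions,
# not measured values -- swap in numbers you trust.

WH_PER_PROMPT = 0.3         # assumed energy per short chat prompt, watt-hours
PROMPTS_PER_HOUR = 20       # assumed casual usage rate
WH_PER_HOUR_STREAMING = 80  # assumed energy for one hour of HD streaming
GRID_G_CO2_PER_KWH = 400    # assumed grid carbon intensity, g CO2 per kWh

def grams_co2(watt_hours: float) -> float:
    """Convert watt-hours to grams of CO2 at the assumed grid intensity."""
    return watt_hours / 1000 * GRID_G_CO2_PER_KWH

chat_wh = WH_PER_PROMPT * PROMPTS_PER_HOUR
stream_wh = WH_PER_HOUR_STREAMING

print(f"1 h of chat prompts: ~{chat_wh:.0f} Wh, ~{grams_co2(chat_wh):.0f} g CO2")
print(f"1 h of streaming:    ~{stream_wh:.0f} Wh, ~{grams_co2(stream_wh):.0f} g CO2")
```

With those (very rough) assumptions, an hour of casual prompting comes out well below an hour of streaming, which is essentially the point the posts above are making.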

If you are concerned about the impacts of your AI use, you can easily do a personal offset by committing to some simple behaviour changes, e.g. turning your computer off at night and switching it off at the power plug.
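As a sanity check on how far a behaviour change like that goes, here is another hedged sketch. Again, the standby power and usage figures are assumptions chosen for illustration, not measurements.

```python
# How much does switching a computer off at the wall overnight save, compared
# with a year of daily chat prompts? All figures are illustrative assumptions.

STANDBY_WATTS = 5          # assumed standby/idle draw if left on overnight
HOURS_OFF_PER_NIGHT = 12   # assumed hours the machine would otherwise sit idle
WH_PER_PROMPT = 0.3        # same assumed per-prompt figure as above
PROMPTS_PER_DAY = 30       # assumed daily usage

saved_kwh_per_year = STANDBY_WATTS * HOURS_OFF_PER_NIGHT * 365 / 1000
prompt_kwh_per_year = WH_PER_PROMPT * PROMPTS_PER_DAY * 365 / 1000

print(f"Overnight switch-off saves: ~{saved_kwh_per_year:.1f} kWh/year")
print(f"Daily prompting costs:      ~{prompt_kwh_per_year:.1f} kWh/year")
```

Under these assumptions the switch-off habit saves several times more electricity over a year than the prompting costs, so a simple behaviour change can plausibly more than cover casual personal use.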

What about coders?

One unaddressed question in these posts is the personal impact for genAI power users (like me). By that I mean people using it for coding. The blogs above deal with people doing the occasional short ChatGPT prompt. Coders might be leaning on AI hundreds of times a day, with prompts that range from short to very long (and more energy intensive).

That’s something I’d like to do the numbers for later on.