Here’s How Much a Single Google Gemini Chat Costs the Planet

Google has published new data showing how much energy and water its AI systems actually consume, arguing that most industry estimates paint too rosy a picture.
In a blog post, the company explained that calculating AI’s environmental footprint isn’t straightforward. Most reports only measure how much power chips like GPUs and TPUs use when actively working. But Google says that ignores big pieces of the picture, like energy spent keeping extra servers on standby, power used by CPUs and memory, and even the water needed to cool its massive data centres.
When those hidden costs are included, Google says that one average Gemini text prompt uses about 0.24 watt-hours of electricity, produces 0.03 grams of CO₂, and consumes 0.26 millilitres of water—about five drops. Google compares that energy use to watching TV for less than nine seconds. All of this information is detailed in a new technical paper.
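Google's comparisons are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a television drawing roughly 100 W and a water drop of about 0.05 mL; neither reference value comes from Google's paper, only the per-prompt figures do.

```python
# Sanity-checking Google's per-prompt comparisons.
# Assumed reference values (NOT from Google's paper):
TV_POWER_W = 100        # a typical modern TV draws on the order of 100 W
DROP_VOLUME_ML = 0.05   # a common approximation for one drop of water

# Google's figures for one average Gemini text prompt:
prompt_energy_wh = 0.24
prompt_water_ml = 0.26

# Seconds of TV watching with the same energy: convert Wh to joules,
# then divide by the TV's power draw in watts.
tv_seconds = prompt_energy_wh * 3600 / TV_POWER_W

# Number of drops in the per-prompt water use.
drops = prompt_water_ml / DROP_VOLUME_ML

print(f"~{tv_seconds:.1f} s of TV")    # ~8.6 s, i.e. "less than nine seconds"
print(f"~{drops:.1f} drops of water")  # ~5.2 drops, i.e. "about five drops"
```

Both of Google's round-number comparisons check out under these everyday reference values.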
Google argues that it has been working to make AI more efficient through new model designs, specialized algorithms, and custom chips. Its latest TPU, called Ironwood, is said to be 30 times more efficient than its first version. The company also highlighted techniques like speculative decoding, in which a smaller model drafts several tokens ahead and the larger model verifies them in a single pass, saving time and energy.
Even with these efficiency gains, Google acknowledged that demand for AI is growing fast. Running big models at scale uses enormous resources, from electricity to fresh water for cooling. Google says it is trying to cut back by building more efficient data centres, investing in clean energy, and reducing water use in regions where supplies are stressed.
