AI is everywhere. It’s writing job applications, diagnosing illnesses, and helping marketers write blog posts just like this one (but not this one). But it comes with a dark cloud overhead: a growing concern about its environmental impact.
From water-hungry data centres to carbon-intensive GPUs, AI has been branded as one of the digital world’s dirtiest secrets. But is that fair? Or is it just another round of tech panic?
Let’s break it down: comparing AI to the wider internet, unpacking the biggest environmental burdens, exploring how politeness affects emissions, and identifying practical ways to reduce AI’s footprint through smarter prompt design.
Training vs. using AI: Where the real cost lies
There’s no question: training large AI models uses a staggering amount of energy. GPT-3’s training run consumed around 1,287 MWh, roughly the electricity that 120 average US homes use in a year. It also emitted an estimated 500 metric tonnes of CO₂, and that’s before a single user had typed anything into a chat box.
But the bigger, long-term impact doesn’t come from training. It comes from inference – the act of using the model over and over, millions of times a day. A single ChatGPT prompt is estimated to consume 10 to 100 times the energy of a standard Google search. At over 2.5 billion prompts daily, that adds up quickly.
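To get a feel for the scale, here’s a rough back-of-envelope sketch. The per-search energy figure and the 10–100× multiplier are illustrative assumptions based on the estimates above, not measured values.

```python
# Back-of-envelope estimate of daily ChatGPT inference energy.
# All per-query figures are illustrative assumptions, not measured values.

GOOGLE_SEARCH_WH = 0.3                            # assumed watt-hours per Google search
AI_MULTIPLIER_LOW, AI_MULTIPLIER_HIGH = 10, 100   # "10 to 100 times" a search
PROMPTS_PER_DAY = 2.5e9                           # ~2.5 billion ChatGPT prompts per day

low_mwh = GOOGLE_SEARCH_WH * AI_MULTIPLIER_LOW * PROMPTS_PER_DAY / 1e6    # Wh -> MWh
high_mwh = GOOGLE_SEARCH_WH * AI_MULTIPLIER_HIGH * PROMPTS_PER_DAY / 1e6  # Wh -> MWh

print(f"Estimated daily inference energy: {low_mwh:,.0f} to {high_mwh:,.0f} MWh")
# Under these assumptions, a single day of prompts can exceed the ~1,287 MWh
# estimated for GPT-3's entire training run.
```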
AI vs. Google: Scale matters
To understand how AI’s energy use compares to the broader internet, it helps to look at query volumes. Google currently processes around 14 billion searches per day. That’s over 5 trillion searches per year. ChatGPT, by comparison, handles 2.5 billion prompts daily, more than double what it was handling just eight months ago.
So yes, Google search still dwarfs AI usage by a factor of roughly six to one, but AI is growing at breakneck speed. That growth, combined with a significantly higher per-query energy cost, explains why AI’s environmental impact is attracting so much attention.
The hidden water cost
AI’s energy needs are just one side of the story. The water required to cool the data centres powering large models like GPT-4 is another. These facilities rely heavily on water-based cooling systems.
Training GPT-3 alone is estimated to have used over 700,000 litres of fresh water. At current growth rates, AI-related data centres could withdraw 6 billion cubic metres of water annually by 2027 – more than the total annual water consumption of the UK.
Given that access to fresh water is fairly important to the survival of the species, we should probably start to worry when a digital system’s water use begins to rival that of entire countries.
And then there’s this: Politeness costs power
Here’s a curveball you probably weren’t expecting.
Sam Altman, CEO of OpenAI, recently revealed that including polite phrases like “please” and “thank you” in ChatGPT prompts contributes to tens of millions of dollars a year in added energy costs. Why? Because every word you type becomes one or more tokens that the AI model must interpret, analyse, and respond to.
Politeness might seem small in isolation, but when multiplied across billions of daily queries, the impact adds up. Some estimates suggest that these extra tokens result in thousands of tonnes of CO₂ emissions annually.
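How do a few extra words per prompt turn into tonnes of CO₂? Here’s a rough sketch of the arithmetic, where the energy-per-token and grid-carbon figures are illustrative assumptions rather than published measurements:

```python
# Rough scale of "please" and "thank you" across billions of prompts.
# The per-token energy and grid-carbon figures are illustrative assumptions.

EXTRA_TOKENS = 4                 # "please" + "thank you" adds a handful of tokens
PROMPTS_PER_DAY = 2.5e9          # ~2.5 billion ChatGPT prompts per day
ENERGY_PER_TOKEN_WH = 0.001      # assumed ~1 mWh of processing per extra token
GRID_KG_CO2_PER_KWH = 0.4        # assumed average grid carbon intensity

extra_kwh_per_year = EXTRA_TOKENS * PROMPTS_PER_DAY * ENERGY_PER_TOKEN_WH * 365 / 1000
extra_tonnes_co2 = extra_kwh_per_year * GRID_KG_CO2_PER_KWH / 1000

print(f"~{extra_kwh_per_year:,.0f} kWh and ~{extra_tonnes_co2:,.0f} tonnes of CO2 per year")
```

Swap in your own assumptions and the totals will move, but the point stands: tiny per-prompt overheads multiply fast at this scale.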
Of course, there are good reasons to be polite. Many users find they get better responses with more courteous phrasing. But if you’re looking to trim your digital carbon footprint, trimming your prompts might be a good place to start.
Token efficiency: How to write smarter prompts
The concept of “tokens” is central to how AI models work. A token is typically around four characters, and every prompt and response is made up of them. More tokens mean more processing, which means more energy.
With some large models handling hundreds of trillions of tokens per month, cutting token use at scale could significantly reduce energy consumption. One proposed strategy could lower usage by 60%, potentially saving millions in infrastructure and electricity costs.
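If you want to see tokens in action, OpenAI’s open-source tiktoken library counts them for you. A minimal sketch (the example prompts below are just for illustration):

```python
# Count tokens with OpenAI's tiktoken library (pip install tiktoken).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by recent OpenAI models

polite = "Hello! Could you please summarise this article for me? Thank you so much!"
terse = "Summarise this article."

for label, prompt in [("polite", polite), ("terse", terse)]:
    tokens = enc.encode(prompt)
    print(f"{label}: {len(tokens)} tokens")
```

The polite version comes out several tokens longer for exactly the same request.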
Here are a few ways to cut token use without compromising output quality (a short code sketch follows the list):
- Be concise: Remove redundant words and avoid overly long phrasing. (PowerBrain AI)
- Trim punctuation: Unnecessary punctuation adds tokens. (PowerBrain AI)
- Use summaries: In chained queries, summarise prior messages instead of pasting entire conversations. (Stack Overflow)
- Preprocess your input: Clean and structure prompts before sending. (OpenMeter)
- Optimise context windows: Drop irrelevant or outdated conversation history. (ArXiv: Semantic Compression)
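As a minimal sketch of the first few ideas above, here is one way to pre-process prompts before sending them. The summarise_history helper is a hypothetical stand-in for whatever summarisation step you would actually use:

```python
# Minimal prompt pre-processing: drop courtesy filler, collapse whitespace,
# and keep only the most recent turns of conversation history.
import re

FILLER = re.compile(r"\b(please|kindly|thank you|thanks)\b[.,!]?\s*", re.IGNORECASE)

def trim_prompt(text: str) -> str:
    text = FILLER.sub("", text)               # remove courtesy filler words
    text = re.sub(r"\s+", " ", text).strip()  # collapse repeated whitespace
    return text

def summarise_history(turns: list[str]) -> str:
    # Hypothetical placeholder: keep only the last two turns instead of
    # pasting the entire conversation back into every prompt.
    return " ".join(turns[-2:])

print(trim_prompt("Could you please  summarise this report? Thanks!"))
print(summarise_history(["turn 1 ...", "turn 2 ...", "turn 3 ..."]))
```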
What about the internet more broadly?
Let’s not forget: the internet is far from clean. Between the data centres, networks, and devices involved, digital activity contributes 2–4% of global greenhouse gas emissions, similar to the aviation industry.
That said, the internet’s energy consumption has stabilised somewhat in recent years, thanks to efficiency gains and more mature infrastructure. Streaming, browsing, and emailing all have relatively low energy costs per interaction, especially when compared to AI.
So, while the internet is vast, AI is concentrated and intensive.
Can AI help fix itself?
Despite its environmental footprint, AI is not just a problem. It could also be part of the solution.
AI is already helping optimise power grids, monitor deforestation, improve agricultural efficiency, and reduce waste in supply chains. Some sources even suggest AI will become a core part of climate adaptation strategy. If powered sustainably, these applications could help reduce emissions elsewhere, offsetting some of the damage caused by the technology itself.
But this positive potential won’t happen by default. It depends on how responsibly we scale AI, and whether we prioritise sustainability alongside performance.
So, is AI as bad as they say?
Yes. And no.
AI is far more resource-hungry per use than most other internet services. Its energy and water demands are growing rapidly, and its environmental impact is concentrated in fewer systems, making it both easier to monitor and harder to ignore.
But it’s not beyond redemption. With cleaner energy, smarter hardware, better prompt design, and careful deployment, AI could become leaner and more useful to sustainability efforts around the world.
In the meantime:
- Be polite to people.
- Be concise with AI.
- If you can Google that simple question, just Google that simple question.
- Don’t be lazy – if it takes 5 minutes to do without AI, just do it.