
‘I wonder how much money OpenAI has lost in electricity costs from people saying “please” and “thank you” to their models.’
This is a question that someone recently posed on X about ChatGPT, OpenAI’s virtual assistant powered by artificial intelligence (AI).
And the answer is ‘tens of millions of dollars’, according to OpenAI chief executive Sam Altman.
‘Please’ and ‘thank you’ are about as British as you can get – and eight in 10 Britons say they are friendly to AI chatbots.
But being polite to AI comes at a cost – and not just financially.
ChatGPT gulps 39.16 million gallons of water a day
Every time you ask ChatGPT to write an email, rip apart your Instagram profile or plan your monthly budget, it uses energy.
And the bot uses an estimated 40 million kilowatt-hours of electricity every day – enough to charge 8 million phones, according to Business Energy UK.
Data centres – the engine rooms of AI – use water to stay cool. At least 39.16 million gallons a day, to be precise.
That’s enough water to fill 978,000 baths or flush a toilet 24 million times.
Why does AI need so much energy?
ChatGPT is an example of generative AI – tech that can make content like text and images. It can do this because it’s a large language model, a neural network that learns by analysing data from across the internet.
This requires staggering amounts of power to pull off, Morten Goodwin, a professor at the University of Agder in Norway, told Metro.
‘Data must be transmitted, processed, and stored, whether the message is a complex request or a simple “thank you”,’ the chief scientist at AI Experts said.

‘The same is true for a Google search, an email, or a Teams meeting. You could even argue that humans saying “thank you” to each other also requires energy, albeit a very small amount.’
Companies try to meet AI’s insatiable hunger by using planet-warming fossil fuels, Dr Daniel Farrelly, principal lecturer in psychology at the University of Worcester, told Metro.
‘All online activity has a carbon footprint, from using AI chatbots right down to sending text messages,’ he said.
‘Although these single effects can often be small, they occur millions – billions – of times a day across the world, so the environmental impact from these, in total, can be considerable.
‘Combine this with the fact that these costs are invisible to us (compared to, say, the vapour trail we can see in the sky from the fuel that aeroplanes burn), it makes the potential impact on the environment of online activity a real issue.’
Do we need to be polite to AI?

When it comes to chatbots, flattery will get you nowhere. Polite prompts have a ‘negligible’ effect on how well AI performs, a study found.
Speaking to Metro, co-author Neil Johnson, a professor of physics at George Washington University, said: ‘Are you nice to your toaster? We don’t put birthday wrapping around the bread slice to make it look nicer.
‘Likewise, being nice to AI adds extra “packing” words that can confuse it, and cost you and the company money – particularly if you are paying for your prompts to it.’
Saying ‘please’ and ‘thank you’ adds to an AI’s electricity bill because of how AI ‘thinks’, Robert Blackwell, a senior research associate at the Alan Turing Institute, explained to Metro.
‘When chatting with an AI, words are tokenised – split into smaller pieces – before being processed,’ he said.
‘The more tokens or words used, the higher the cost for the companies running these models.

‘Newer reasoning models use even more tokens as they try to justify and check their answers.’
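The token-counting idea above can be sketched in a few lines of Python. This is a toy illustration only: real chatbots use subword tokenisers such as byte-pair encoding rather than a plain whitespace split, so the counts here are indicative, not exact – but the principle holds that extra pleasantries mean extra tokens, and extra tokens mean extra processing cost.

```python
def count_tokens(prompt: str) -> int:
    """Roughly approximate a token count by splitting on whitespace.

    Real tokenisers (e.g. byte-pair encoding) split text into subword
    pieces, so actual counts differ - this is for illustration only.
    """
    return len(prompt.split())


# A terse prompt versus a polite one asking for the same thing:
terse = "Summarise this article."
polite = "Hello! Please could you summarise this article? Thank you so much."

print(count_tokens(terse))   # 3
print(count_tokens(polite))  # 11
```

Even in this crude approximation, the polite version is more than three times as long – and since providers bill and budget per token, the difference compounds across millions of conversations a day.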
There are some reasons to be kind to AI, though. A growing amount of research suggests that how we treat AI reflects how we treat one another.
Goodwin, who is also deputy director at the Centre for Artificial Intelligence Research, said that language models learn from the people who use them.
‘If you are polite everywhere, even to chatbots, the norm becomes to be polite,’ he said.
Why do we say please and thank you to AI?
A big draw to AI is how it can carry out tasks to an almost human-level proficiency. That’s a lot for the average person to get their head around.
So we anthropomorphise AI – project human attributes onto objects – to make sense of ‘something that feels human but isn’t’, Luise Freese, who runs the tech blog M365 Princess, told Metro.

Terminator is partly to blame, she joked, as AI is seen as being as exciting as it is scary.
‘The idea of robot overlords is burned into pop culture. So we joke and humanise these systems; it’s a coping mechanism,’ the Microsoft MVP winner for M365 development and Business Applications told Metro.
‘But that’s where it gets tricky: these tools don’t have thoughts or feelings; they just mirror patterns. When we treat them like friends, we risk forgetting that.’
Many chatbots make things up, something that happens so frequently that researchers coined a term for it: ‘hallucination’. Medical experts have found that chatbots come up with phoney health studies, while mental health professionals worry about people turning to the bots for therapy.
Some people place a lot of faith in AI, believing it has the same level of understanding and empathy as a human, when it does not, Ana Valdivia, a departmental research lecturer in AI at Oxford, told Metro.
‘The tendency to humanise AI isn’t merely innocent curiosity,’ she said, ‘it is a byproduct of how these technologies are marketed and framed, often encouraging emotional dependency or misplaced trust in systems that are, at their core, mechanical.’