
AI’s power hunger is becoming hard to ignore

The sheer amount of natural resources that AI uses every time someone plugs a query into anything should be a major concern.
By Dr Jannie Zaaiman, CEO, South Africa Information and Communication Technology Association.
Johannesburg, 18 Sept 2025
Dr Jannie Zaaiman, CEO of the South Africa Information and Communication Technology Association.

Artificial intelligence (AI) really has become part of our lives. You can't escape its tentacles even if you try. Even everything we Google, which was pretty much AI to start with, now comes with an AI summary, often with untrustworthy or outdated linkbacks.

AI is becoming increasingly prevalent in higher education institutions, where it is used for research and, unethically, to write entire papers that are easy to spot unless the student has run them through another AI tool: a humaniser.

Bookkeepers use AI to automatically capture invoices. Accountants then use AI to build dashboards that pin down where expenses can be cut without harming growth potential and, even better, help companies spot fraud, identify risks, improve management and work out strategies.

I’ve written about the potential of AI to help in everyday work here.

But there are nefarious sides to AI as well.

Hands up everyone who isn't getting a ton of robocalls each day offering loans, debt solutions or funeral cover. I thought so. The solution to that is even more AI, often in the form of Truecaller, because you can't trace these companies back no matter how hard you try.


Lawyers get AI to do their research for them. Some of the outcomes would be quite funny if they weren't tragic. ITWeb has reported on the sort of nonsense that AI spits out in the legal field, with made-up case law leading to heads of argument that seem to have been written by someone who took one too many magic mushrooms. Hopefully, judges will continue catching this practice before it turns into real case law.

What isn’t so funny, however, is the sheer amount of natural resources that AI uses every time you plug a query into anything. Unless you are running an application that doesn’t rely on some form of the cloud – highly unlikely in today’s environment – you are part of a population that is using up water and electricity like no-one’s business. So am I.

This is one of the reasons DeepSeek was such a disruptive entrant to the machine learning chatbot sector. The Chinese competitor to the likes of ChatGPT, Bard, Manus and Perplexity, among others, contends it uses fewer AI chips and, therefore, less electricity than other similar offerings. Just don't ask it anything about China; it won't answer that.

Of course, some of its assertions have been found wanting, which isn't surprising.

What we can take away from DeepSeek's claims – true or not – is the need to discuss the damaging effect that AI could have on our planet at a time when we are, in accordance with the Paris Agreement and the United Nations' Sustainable Development Goals, trying to limit our impact on the climate.

Companies like Google, Amazon (for its Web Services unit), Oracle and Microsoft are investing in their own power stations, many of them nuclear (which brings its own conversation), just to keep up with the rising demand for the computing power that AI requires.

As far back as 2019, researchers from the University of Massachusetts Amherst found that training a large AI model emits as much carbon dioxide as 300 round-trip flights between New York and San Francisco, or almost five times the lifetime emissions of an average car.

Accenture's mid-2025 research found that, within just the next four-and-a-half years, the data centres that power AI are set to use as much power each year as Canada does. Keeping them cool could draw more freshwater than countries like Norway or Sweden withdraw from their rivers, lakes and aquifers.

Yes, there is research showing that end-users like you and me don't do that much harm when we run a ChatGPT query. Sustainability by Numbers states that 100 searches a day amount to about 2% of the electricity the average Brit consumes every 24 hours.
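To see roughly where a figure like that could come from, here is a back-of-the-envelope sketch. The inputs are my own assumptions rather than Sustainability by Numbers' exact numbers: call it about 3Wh of electricity per chatbot query (a commonly cited older estimate) and roughly 12kWh of electricity per person per day in the UK.

\[
100 \times 3\,\mathrm{Wh} = 0.3\,\mathrm{kWh}, \qquad \frac{0.3\,\mathrm{kWh}}{12\,\mathrm{kWh}} \approx 2.5\%
\]

On those assumptions, even a heavy day of chatbot use lands in the same low-single-digit percentage range the research cites.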

That's not the point, though. The point is that the demand is there, tech companies will build to meet it, and we need to do something about it. I'll chat in next month's column about how we can start doing that.
