
AI Has A Giant, Destructive Flaw

This could break the AI hype train.

Will Lockett
Published in Predict · 5 min read · Feb 23, 2024


Last month, OpenAI CEO Sam Altman attended a World Economic Forum conference. Such an occurrence should be far from noteworthy, except Sam let slip one of AI’s biggest secrets. During one of the many meetings, he warned that the next wave of AI systems will consume vastly more power than expected, and that our current energy systems will struggle to cope. He even went as far as to say, “There’s no way to get there [next-gen AI] without a breakthrough.” This marks a profound turning point in the AI hype train, as until now, AI’s energy usage and associated carbon emissions have been the elephant in the room. With this one statement, Altman has painted the elephant pink and got it to do circus tricks for everyone. So, is this energy usage really a problem? What’s its impact? Does Altman have a solution? And can AI be made more energy efficient?

Let’s start with the basics. AI uses energy in two very different ways: training and inference. Training is when you pump a biblical amount of data into an AI and teach it which patterns it should recognise and what it should do with them. Inference is when you take a trained AI and query it for an output.
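To get a feel for the gap between these two modes, here is a rough back-of-the-envelope sketch. The 1,300 MWh training figure is the estimate cited below for OpenAI; the per-query inference energy is an assumed, illustrative value, not a measured one.

```python
# Illustrative comparison of training vs inference energy.
# TRAINING_ENERGY_MWH comes from the estimate cited in the article;
# ENERGY_PER_QUERY_WH is an assumption for illustration only.

TRAINING_ENERGY_MWH = 1_300       # estimated energy to train the model (MWh)
ENERGY_PER_QUERY_WH = 0.3         # assumed energy per inference query (Wh)

# Convert the training run into watt-hours, then ask how many
# individual queries would consume the same amount of energy.
training_energy_wh = TRAINING_ENERGY_MWH * 1_000_000
queries_equivalent = training_energy_wh / ENERGY_PER_QUERY_WH

print(f"One training run ~= {queries_equivalent:,.0f} inference queries")
```

Under these assumptions, a single training run costs as much energy as billions of individual queries, which is why training dominates the energy conversation even though inference happens far more often.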

Of the two, training is by far the more energy-intensive. For example, it has been estimated that OpenAI used 1,300 MWh of energy to…

