
This Technology Could Unlock The Future Of AI

Well, sort of…

Will Lockett
Published in Predict
Aug 4, 2024

By now, AI’s addiction to energy is well known. Not only does it take an ungodly amount of energy to train and set up an AI model, but it also takes a tremendous amount of energy to run AI services. Take OpenAI’s GPT-4, which reportedly took over 50 GWh of electricity to train, while ChatGPT burns through around 500,000 kWh a day servicing user queries. In total, its energy footprint is comparable to that of a medium-sized city. And as these models get bigger and more capable, their energy use climbs dramatically. The IEA predicts that global data centres, driven in no small part by AI, will consume a staggering 1,000 TWh of electricity a year by 2026! That is roughly equivalent to Japan’s annual electricity consumption, and it will only grow from there.

Obviously, such energy usage isn’t sustainable. It will slow our net-zero transition, incur huge carbon emissions, and even strain power supplies and disrupt industry. Sam Altman, CEO of OpenAI, has, rather questionably, suggested that nuclear fusion is needed to unlock the future of AI. But researchers from the University of Minnesota Twin Cities may have just saved the future of AI with a far more tenable solution.
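To put those figures in perspective before we get to that solution, here is a quick back-of-the-envelope calculation using only the numbers quoted above. The average-household figure of roughly 10,000 kWh per year is my own assumption for illustration, not a figure from the research.

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
# The household figure is an assumption for illustration, not from the article.

TRAINING_ENERGY_KWH = 50e6        # ~50 GWh to train GPT-4 (figure quoted above)
DAILY_INFERENCE_KWH = 500_000     # ~500,000 kWh/day serving queries (figure quoted above)
IEA_2026_PROJECTION_TWH = 1_000   # projected data-centre demand by 2026 (figure quoted above)
HOUSEHOLD_KWH_PER_YEAR = 10_000   # assumed average household usage (illustrative only)

annual_inference_kwh = DAILY_INFERENCE_KWH * 365
households_equivalent = annual_inference_kwh / HOUSEHOLD_KWH_PER_YEAR
training_as_days_of_inference = TRAINING_ENERGY_KWH / DAILY_INFERENCE_KWH
iea_household_years = IEA_2026_PROJECTION_TWH * 1e9 / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual inference energy: {annual_inference_kwh / 1e6:.1f} GWh")
print(f"Roughly the yearly usage of {households_equivalent:,.0f} households")
print(f"Training cost the same energy as {training_as_days_of_inference:.0f} days of inference")
print(f"IEA 2026 projection: {iea_household_years / 1e6:.0f} million household-years of electricity")
```

On those assumptions, serving queries alone adds up to about 182.5 GWh a year, the usage of well over 18,000 households, which is where the "medium-sized city" comparison comes from.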

This solution is a new way of computing known as Computational Random-Access Memory (CRAM), which can train and run AI up to 1,000 times more efficiently than current systems!
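If that 1,000x figure holds up, here is what it would mean for the numbers quoted earlier. This is a rough sketch of the claim applied to the article’s own figures, not a measurement or benchmark.

```python
# What the claimed 1,000x efficiency gain would mean for the figures quoted earlier.
# A rough illustration of the claim, not a measurement.

EFFICIENCY_GAIN = 1_000           # claimed CRAM improvement

daily_inference_kwh = 500_000     # ChatGPT's reported daily energy use
training_kwh = 50e6               # GPT-4's reported training energy

print(f"Daily inference at 1,000x efficiency: {daily_inference_kwh / EFFICIENCY_GAIN:,.0f} kWh/day")
print(f"Training at 1,000x efficiency: {training_kwh / EFFICIENCY_GAIN / 1e3:,.0f} MWh total")
```

In other words, the annual inference figure would fall from the equivalent of roughly 18,000 households to fewer than 20, using the same assumed household figure as above.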


Will Lockett

Independent journalist covering global politics, climate change and technology. Get articles early at www.planetearthandbeyond.co