This Technology Could Unlock The Future Of AI
Well, sort of…
By now, AI’s addiction to energy is well known. Not only does training an AI model take an ungodly amount of energy, but running AI services does too. Take OpenAI’s GPT-4, which reportedly took over 50 GWh of energy to train and uses around 500,000 kWh daily to service user queries. In total, its energy footprint is equivalent to that of a medium-sized city. And as AI models grow, their energy use is climbing rapidly. The IEA predicts that the AI industry will use a staggering 1,000 TWh of energy each year by 2026. That is equivalent to Japan’s annual energy usage, and it will only continue to grow from there.

Obviously, such energy usage isn’t sustainable: it will slow our net zero transition, incur huge carbon emissions, and could even strain power supplies and disrupt industry. Sam Altman, CEO of OpenAI, has, rather questionably, suggested that nuclear fusion is needed to unlock the future of AI. But researchers from the University of Minnesota Twin Cities may have just saved the future of AI with a far more tenable solution.
This solution is a new way of computing known as Computational Random-Access Memory (CRAM) that can train and operate AI 1,000 times more efficiently than current systems!
How? Well, current AI systems must rapidly and constantly shuttle data between memory and computational processors, both during training and when answering queries, and that data movement consumes a tremendous amount of energy. CRAM combines memory and processing in the same hardware, meaning the data can be processed where it is stored without needing to be transferred, dramatically reducing energy use.
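To see why eliminating data movement matters so much, here is a minimal back-of-envelope sketch in Python. The per-byte energy figures are hypothetical placeholders chosen only to illustrate the principle (in real hardware, moving data off-chip typically costs far more energy than the arithmetic itself); they are not measurements from the Minnesota team's work.

```python
# Hypothetical per-byte energy costs, in picojoules (illustrative only).
E_TRANSFER_PER_BYTE = 100.0  # moving a byte between memory and the processor
E_COMPUTE_PER_BYTE = 1.0     # performing the arithmetic on that byte

def conventional_energy(bytes_processed: float) -> float:
    """Conventional system: every byte is moved to the processor, then computed on."""
    return bytes_processed * (E_TRANSFER_PER_BYTE + E_COMPUTE_PER_BYTE)

def in_memory_energy(bytes_processed: float) -> float:
    """CRAM-style system: bytes are processed where they are stored, so no transfer."""
    return bytes_processed * E_COMPUTE_PER_BYTE

workload = 1e12  # bytes touched in a hypothetical training step
savings = conventional_energy(workload) / in_memory_energy(workload)
print(savings)  # → 101.0
```

Under these assumed numbers, almost all of the energy in the conventional system goes to data movement, which is exactly the cost that computing in memory removes.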
The team that developed this technology has successfully patented their work and is now looking to collaborate with semiconductor industry leaders to produce hardware specifically to advance AI functionality.
So, how will this impact the AI world? If, hypothetically, the entire AI industry could adopt this technology by 2026, it would use about 1 TWh of energy per year, roughly 2.5% of London’s annual energy use, rather than the predicted 1,000 TWh per year. Not only would this make operating AI far cheaper and potentially make the technology decently profitable, but it would also solve the energy supply and carbon emission issues.
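The arithmetic behind those figures can be checked in a few lines. The 1,000 TWh prediction and the 1,000x efficiency factor come from the article; London's annual energy use (about 40 TWh) is inferred here from the article's 2.5% figure, not taken from an independent source.

```python
# Back-of-envelope check of the article's projected savings.
predicted_ai_use_2026_twh = 1000  # IEA prediction cited in the article
cram_efficiency_factor = 1000     # "1,000 times more efficiently"

cram_use_twh = predicted_ai_use_2026_twh / cram_efficiency_factor
london_annual_twh = 40            # implied by the article: 1 TWh / 0.025

print(cram_use_twh)                      # → 1.0 TWh per year
print(cram_use_twh / london_annual_twh)  # → 0.025, i.e. 2.5% of London's usage
```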