Energy Week Panel: AI Poses Problems, Offers Solutions for Decarbonization, Climate, and Energy

TDK Ventures
Feb 27, 2024 · 6 min read

Artificial intelligence has seen explosive growth in recent years, with new large language models like ChatGPT demonstrating impressive capabilities. That progress, however, may come at an environmental cost. The final panel of Energy Week 2023, sponsored by TDK Ventures and Climate Investment, discussed this impact. TDK Ventures’ Anil Achyuta moderated the panel and opened by citing Energy and Policy Considerations for Deep Learning in NLP, a 2019 study that found training a single AI model can emit 626,000 pounds of CO2 equivalent, similar to the lifetime emissions of five average American cars. He also cited estimates that training GPT-3 in Microsoft’s state-of-the-art US data centers can directly evaporate 700,000 liters of clean, fresh water.

The explosion of data and the centers required to store and analyze it are partly to blame.

Kelly Chen, a founding partner at NIF, said that even before generative AI came on the scene, data centers already accounted for over 1% of global carbon emissions.

“Now, with generative AI and all the requirements we will have with larger models, more data, more compute, we will be looking at a lot of strain and need for infrastructure in the next five to 10 years,” she predicted. “As there’s more compute and potentially higher temperatures, we’ll need innovative methods of cooling. On the climate side, there will be more and more needs. How that implements in final output, emissions and total energy usage, there are interesting solutions from startups and large corporates that ensure a slow scaling of environmental impact that will one day decline.”

Helen Lin of At One Ventures pointed out that if Google switched all its 9 billion daily searches to chatbot interactions, its energy consumption would instantly rival that of Ireland. The good news, she said, is that mitigants exist on both the hardware and the software side. The bad news is that Moore’s law may have run its course for efficiency increases.
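Lin’s Ireland comparison can be sanity-checked with a rough back-of-envelope calculation. Only the 9 billion daily searches come from the panel; the per-query energy figure and Ireland’s annual electricity consumption below are illustrative assumptions:

```python
# Back-of-envelope sketch of Lin's comparison. The per-query energy and
# Ireland's annual consumption are assumed figures, not panel numbers.
DAILY_SEARCHES = 9e9            # daily Google searches, cited on the panel
WH_PER_LLM_QUERY = 9.0          # assumed watt-hours per chatbot response
IRELAND_TWH_PER_YEAR = 29.0     # assumed annual Irish electricity use, TWh

# Convert watt-hours per day into terawatt-hours per year.
annual_twh = DAILY_SEARCHES * WH_PER_LLM_QUERY * 365 / 1e12
print(f"Chatbot-style search would use ~{annual_twh:.1f} TWh/year, "
      f"about {annual_twh / IRELAND_TWH_PER_YEAR:.0%} of Ireland's consumption")
```

Under these assumptions the annual total lands at roughly 30 TWh, in the same range as a small European country, which is the scale of Lin’s point even if the exact per-query figure shifts.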

“A lot of people are saying we have reached a threshold of saturation with how much efficiency we can continue to gain through hardware such as servers, cooling, and chips,” Lin said. “So, the increase in energy intensity from internet usage and regular search has already been offset by hardware improvements. If we’ve hit the ceiling, we may be on the cusp of spiking energy consumption (from generative AI).”

All that data is needed to inform the generative AI large language models that have become popular in all aspects of human enterprise. But while GPT-4 gobbled up data to integrate 1.5 trillion parameters during its training regimen, that upfront energy consumption and large carbon footprint are not the real problem, according to Andy Perry, director of Energy Transition and Environment at Faculty. Training the recently released BLOOM model emitted about 50 tons of CO2, he said.

“That’s about 45 UK households over the course of a year. That’s a lot, but training the model is a one-off activity, and even though it’s being done hundreds of times, it’s probably manageable when amortized over the model’s lifetime use. There are cases to be made to justify it.”

But with the popularity of these models comes the need for many more data centers worldwide, with their attendant cooling needs, energy consumption, and emissions, and the use of large language models and generative AI is set to explode. Each use may not cost much in terms of carbon release, but multiplied by billions of operations per day, it could pose a real problem, he said.
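Perry’s point about small per-use costs compounding can be illustrated with the same kind of rough arithmetic. Only the 50-ton training figure comes from the panel; the per-query footprint and query volume are assumptions chosen purely to show the scale effect:

```python
# Sketch of how small per-query emissions overtake a one-off training cost.
# Only the 50-ton training figure comes from the panel; the rest is assumed.
TRAINING_TONNES_CO2 = 50.0      # BLOOM-scale training footprint (panel figure)
GRAMS_CO2_PER_QUERY = 2.0       # assumed grams of CO2 per model response
QUERIES_PER_DAY = 1e9           # assumed global query volume

daily_inference_tonnes = QUERIES_PER_DAY * GRAMS_CO2_PER_QUERY / 1e6
days_to_match_training = TRAINING_TONNES_CO2 / daily_inference_tonnes
print(f"Inference emits ~{daily_inference_tonnes:,.0f} t CO2/day; "
      f"the training footprint is matched in {days_to_match_training:.3f} days")
```

At these assumed rates, cumulative inference emissions exceed the entire training footprint within the first hour of a single day’s traffic, which is why the panel treats the per-use side, not training, as the real problem.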

Chen agreed, noting that ChatGPT set the record as the fastest software to reach 100 million users and that an AI chat prompt has four to five times the carbon intensity of an online search. She noted that in the pre-ChatGPT era, Google devoted a host of resources in an effort to shrink data centers’ carbon footprint and boost energy efficiency.

“They had been working on it a long time,” she said. “Then, DeepMind came in, and they reduced the energy use and emissions from Google’s data centers by something like 40% using AI. I’m excited to see what the implications are (for generative AI).”

She said this next generation of AI holds great promise for language models, with numerous startups building applications on top of the technology.

“What’s most investable in the field has been computer vision because you can have very narrow use cases to train on,” she said. “In practice, when you don’t have large models for language, the execution looks terrible and is not usable commercially. Chatbots five years ago were not something people wanted to interface with. We needed the new, improved iteration. We needed to start large; now we can think about shrinking it down.”

On the usage side, Lin suggested that investors and researchers have a duty to not get carried away by generative AI’s capabilities at the cost of unnecessary resource usage. She suggested that “regular” narrow AI that performs specific intelligent tasks may often be more appropriate and energy-efficient than comprehensive systems.

“The decision to utilize data and compute-hungry generative AI should be deliberate based on true incremental value add,” she advised. “AI solutions can also accelerate innovation pipelines and trim costly trial-and-error processes across sectors like renewable energy and sustainable development — boosting efficiency. But reasonable restraint must be exercised in backing AI ventures that drive disproportionate emissions.”

Perry noted that generative AI is, indeed, different from traditional AI, which is narrowly focused on solving a particular problem by building an algorithm and the software around it. Traditional AI exerts a low impact because it is constrained in the type of data going in and the task it is asked to perform.

“Generative AI…trained on everything so it can answer anything it gets asked,” he explained. “When you’re not resource-constrained, you will keep adding data and parameters to make it more powerful and expand its field of vision. While today it’s an arms race of ever-expansion, we will reach a point where we reverse that and realize that constrained models will be more effective because they are trained for specific purposes. That will also narrow the resource requirement.”

Lin said At One invests in AI “in any industry or sector where you have a lot of experimental iteration that can be costly. You need people; you’re burning runway. The more you have to try and fail, for a startup, is death.”

That applies whether the startup is involved in food tech or green cement, she said. Either way, “you’re trying to find different mixes of ingredients and processes to get to the ultimate outcome of performance. Trial and error that traditionally has been done by a bunch of smart scientists in a lab kills a lot of early-stage companies. You can accelerate that with AI, and if there’s a gain to be made with generative AI, it can be helpful for a lot of companies in our space.”

Conscientious investing will become easier once the industry implements more robust tracking and transparency around energy consumption from hardware and software operations.

“Quantifying the exact impact of AI processes is an essential first step organizations should embed in their monitoring stack moving forward,” he said. “Impact assessments can then inform usage guidelines or constraints aligned with carbon budgets.”

Moving forward, Lin suggested generative AI is here to stay.

“It will grow and become a massive energy-intensity stream within society,” she said. “It will have implications for resources cutting across industry verticals. I think there’s going to be $50 billion in growth in construction just due to data center expansions. With the knock-on effects, we have to step up even more to look at the impact on HVAC and energy efficiency.”

The panelists concurred that while AI holds tremendous promise, leaders in technology and business must proactively assess and monitor both economic and environmental costs associated with rapid advancements. Responsible innovation practices, open tracking of emissions data, and cross-sector collaboration will be vital to delivering climate-conscious AI. The goal should be enjoying productivity benefits while respecting planetary boundaries — not solving problems with new problems.

Energy transition is the biggest driver of the world’s ability to combat climate change, but it is basically an optimization problem, Perry said.

“Building renewables in itself is not difficult,” he continued. “We understand how to build wind farms and solar panels. We also understand how to build electric vehicles and make them cost competitive. And we know how to build heat pumps. Broadly, the problems of electrifying demand and creating renewable generation are largely solved from a capital point of view. Our biggest problem is how to make the system work, how to connect these two things at scale — networks, flexibility, balancing supply and demand. AI is perfectly suited to solving those problems.”

TDK Ventures and Climate Investment hosted Energy Week 2023 at London’s Goldsmiths Center to highlight the efforts of industry entrepreneurs, investors, and the scientific community in solving the world’s energy challenges and finding sustainable paths to decarbonization.
