On AI’s Climate Paradox

AI can create more efficient systems, but at what cost?

Mike Grindle
The New Climate.
11 min read · Oct 26, 2023



When it comes to climate change, technology often plays a dual role. On the one hand, technological advancements, particularly in computing, have led to more energy-efficient hardware, are powering greener technologies, and are helping us better understand weather patterns; technology undeniably has a significant role to play in tackling the climate crisis.

Yet, at the same time, technology is also driving the issues many hope it will alleviate, with a study by Lancaster University suggesting that ICT currently accounts for around 2.1–3.9% of all global emissions. And nowhere is this paradox better reflected than in the booming market of AI technology, which, one way or the other, has significant implications for the environment. The question is, will it help or hinder our climate’s future?

How AI could become a “climate warrior”

Today, we often associate AI with its uncanny ability to respond to writing and image prompts. However, the tech’s true usefulness may lie not in the things it can make but in how it can highlight inefficiencies and propose better solutions via advanced AI modeling and data analysis.

As a result, many experts are excited by what the technology can do to create more efficient and less polluting systems. For instance, AI technologies can analyze weather data from around the world, monitor and mitigate emissions, optimize transportation networks and agricultural processes, and help communities prepare for the impacts of extreme weather.

Companies are already investing big in the hopes that AI can enable them and others to reach climate targets. For example, in 2017, Microsoft put $50 million into its AI for Earth initiative (which has now evolved into Microsoft’s Planetary Computer Project), granting access to AI for projects that analyze climate data.

Airlines are utilizing machine learning to optimize flight routes in the hope of reducing fuel consumption in the aviation industry. Many hope the same will happen in the cargo shipping industry, which is currently a major polluter.

Meanwhile, Google is utilizing its DeepMind AI to reduce energy consumption in its own data centers. And perhaps the most significant investment has come from the US government, which has put $3.5 billion into creating a smart energy grid that uses AI to measure how much energy is needed, where and when, while accounting for the effects of weather on green energy generation.

There’s just one problem with all these exciting projects. And that problem is AI.

Why AI might not be our climate savior


The issues with AI begin with questions surrounding the ethics of how companies create the systems themselves. The reality is that AI needs data. Or, to put it more bluntly, human surveillance data.

As an essay by Adrienne Williams and her co-authors describes, AI systems are not built on clever code alone, but often on exploitation:

Far from the sophisticated, sentient machines portrayed in media and pop culture, so-called AI systems are fueled by millions of underpaid workers around the world, performing repetitive tasks under precarious labor conditions. And unlike the ‘AI researchers’ paid six-figure salaries in Silicon Valley corporations, these exploited workers are often recruited out of impoverished populations.

There are also many substantiated fears that, due to their often proprietary nature, AI systems could grant governments and corporations too much power over employees and citizens, something we’re already seeing in the insidious rise of so-called “bossware” and job cuts. Then there’s the fact that AI is not always reliable, with certain AI systems having a known tendency to “hallucinate,” and the fact that data sets are vulnerable to attack from bad actors.

Assessing AI’s climate footprint (and why it’s so difficult to do)

But the big issue with AI, from a climate perspective, is its massive carbon footprint. Assessing AI’s carbon footprint is a notoriously challenging task, not least because corporations are tight-lipped about how much energy and resources their AI models are using. But thanks to the hard work of researchers, we do have some indications.

A 2019 paper published by researchers at the University of Massachusetts, Amherst, which focused on natural language processing (the subfield of AI that teaches machines to use human languages), found that training a single AI model can produce 626,000 pounds of carbon dioxide equivalent. To put that figure in perspective, it's roughly equivalent to the lifetime emissions of five average American cars (manufacturing included).
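As a rough sanity check on that comparison, a few lines of arithmetic reproduce both figures. The per-car benchmark of roughly 126,000 pounds of lifetime CO2e (manufacturing included) is the figure the researchers used; treat it here as an assumption:

```python
# Back-of-envelope check on the UMass Amherst comparison.
# Assumption: ~126,000 lbs CO2e per average American car over its
# lifetime, manufacturing included (the benchmark used in the paper).
LBS_PER_METRIC_TON = 2204.62

training_lbs = 626_000      # CO2e from training one large NLP model
car_lifetime_lbs = 126_000  # assumed per-car lifetime CO2e

training_tons = training_lbs / LBS_PER_METRIC_TON
cars_equivalent = training_lbs / car_lifetime_lbs

print(f"{training_tons:.0f} metric tons CO2e")  # ~284 t
print(f"{cars_equivalent:.1f} car lifetimes")   # ~5.0 cars
```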

Later, in 2022, AI startup Hugging Face released a study that estimated the overall emissions output for their large language model (LLM) called BLOOM. In the end, Hugging Face estimated that BLOOM’s training alone led to 25 metric tons of carbon dioxide emissions and that this figure doubled once they accounted for the manufacturing of computer equipment, the broader computer infrastructure, and post-training.

That may seem like a lot, but it doesn't tell the whole story. Experts actually consider BLOOM's carbon footprint fairly green compared to other LLMs of its size, since it was trained on a French supercomputer powered largely by nuclear energy. Furthermore, Hugging Face's transparency is uncharacteristic of the sector.

“Our goal was to go above and beyond just the carbon dioxide emissions of the electricity consumed during training and to account for a larger part of the life cycle in order to help the AI community get a better idea of their impact on the environment and how we could begin to reduce it.” — Sasha Luccioni, a researcher at Hugging Face.

In another study, Alex de Vries attempted to analyze AI's overall footprint and warned that, by 2027, AI could consume as much electricity as a country the size of the Netherlands, or around 0.5% of global electricity consumption. But again, the study notes that such research is largely speculative, because tech firms don't disclose enough data to make accurate predictions.

Reflecting on his research, de Vries says his findings show that AI should only be used for important tasks, telling The Verge that:

A key takeaway from the article is this call to action for people to just be mindful about what they’re going to be using AI for. This is not specific to AI. Even with blockchain, we have a similar phase where everyone just saw blockchain as a miracle cure … if you’re going to be expending a lot of resources and setting up these really large models and trying them for some time, that’s going to be a potential big waste of power.

But to understand why AI requires so much energy, we need to turn to the data centers that power these technologies, which may be wreaking havoc on local communities.

AI data centers: water-thirsty behemoths


AI technologies such as LLMs require warehouse-sized data centers full of high-tech equipment to work. At first glance, these may seem much like the data centers that power the rest of the internet. But as Danny Quinn, managing director of the Scottish data center firm DataVita, explained to the BBC, the difference in energy use between a rack of standard servers and one containing AI processors is significant:

A standard rack full of normal kit is about 4 kilowatts (kW) of power, which is equivalent to a family house. Whereas an AI kit rack would be about 20 times that, so about 80kW of power. And you could have hundreds, if not thousands, of these within a single data center.
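Taking Quinn's per-rack figures at face value, a short sketch shows how quickly this scales to facility level. The rack counts below are illustrative assumptions, not figures from the interview:

```python
# Scaling Quinn's per-rack figures to a whole facility:
# ~4 kW for a standard rack vs ~80 kW for an AI rack.
# Rack counts are illustrative assumptions.
STANDARD_RACK_KW = 4
AI_RACK_KW = 80

for racks in (100, 500, 1000):
    standard_mw = racks * STANDARD_RACK_KW / 1000
    ai_mw = racks * AI_RACK_KW / 1000
    print(f"{racks:>5} racks: standard {standard_mw:>4.1f} MW, AI {ai_mw:>5.1f} MW")
```

At a thousand racks, the AI facility draws 80 MW where a conventional one would draw 4 MW, which is why a single site can rival the load of a small town.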

These warehouses generate massive amounts of heat and must be cooled to protect the equipment within them. As a result, they require significant amounts of water. However, just how much water they need is another trade secret that tech firms are keeping to themselves.

What we do know is that, according to the sustainability report of Microsoft (which has invested heavily in AI), the company's water consumption jumped by 34% between 2021 and 2022, to 6.4 million cubic meters, about 2,500 Olympic swimming pools' worth. Of course, demand has only grown since then, largely thanks to the release of ChatGPT, meaning more water and energy are required to keep these systems running.
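The Olympic-pool comparison checks out if you take a pool to be roughly 2,500 cubic meters (a nominal 50 m × 25 m × 2 m, which is my assumption rather than a figure from Microsoft's report):

```python
# Microsoft's reported 2022 water consumption, expressed in Olympic
# swimming pools. Pool volume (50 x 25 x 2 m) is an assumed nominal
# figure, not from the sustainability report.
water_2022_m3 = 6_400_000
pool_m3 = 50 * 25 * 2  # 2,500 m^3 per pool

pools = water_2022_m3 / pool_m3
water_2021_m3 = water_2022_m3 / 1.34  # back out the reported 34% jump

print(f"{pools:.0f} pools")  # 2560 pools
print(f"implied 2021 consumption: ~{water_2021_m3:,.0f} m^3")
```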

Rural locations have become a hotbed for these data centers, often overwhelming local infrastructure and communities. Not only do they use up vast quantities of resources, but they also sprawl over significant amounts of land and are notoriously loud. And more are being built at a startling rate.

As revealed by Insider following a freedom of information request, tech giant Amazon “operates, or is in the process of building or planning, 102 data centers in northern Virginia. Together, the facilities, when they are all up and running, will have emergency generators capable of producing more than 4.6 gigawatts of power. That’s almost enough backup electrical capacity to light up all of New York City on an average day.”

In Phoenix, Karla Moran, an executive at Salt River Project, one of two major utilities serving the region, says that growing data center power needs are not only rivaling the output of entire utilities but also forcing energy grids to delay their move away from fossil fuels. Discussing the local power company's approval of new methane gas facilities in the area, Moran notes that the “7,000 megawatts of data center requests currently in our pipeline,” which nearly rivals Salt River Project's entire 11,000-megawatt system, is “one of the main reasons we look at having a resource like that.”

AI, consumers and the rebound effect


The optimistic response to concerns regarding AI’s footprint is the same argument Silicon Valley has been peddling for years. That is, that efficiency and progress will eventually catch up and solve the issue — we just need to be patient. Yet, the reality is that technological advances have historically failed to live up to this expectation.

Rather than reducing emissions, more energy-efficient technology often has the paradoxical effect of increasing demand to the point where the level of pollution created by the newer tech actually increases. This phenomenon is sometimes known as the Jevons Paradox or the rebound effect, and we’ve seen it play out multiple times throughout history.

Consider, for instance, how the advent and widespread adoption of LED bulbs has largely failed to reduce energy consumption: the reduced cost per light leads people to keep lights on longer or buy more of them. In fact, LED lighting has driven a huge increase in the adoption of outdoor lights. Meanwhile, in the automobile industry, improvements in fuel efficiency have made driving more affordable and therefore enabled more driving, a pattern that has only continued with the adoption of electric cars.

More specific to the ICT sector, the quest for “innovation” means we now have mainstream operating systems with ridiculous hardware requirements for simple tasks. Furthermore, the widespread adoption of the web has resulted in the internet becoming increasingly energy-hungry despite vast improvements in hardware. Meanwhile, while smart tech may reduce energy use in your home, questions remain regarding the energy consumption of back-end cloud infrastructure.

All this is to say that while current tech is always becoming more efficient, that efficiency becomes irrelevant in the face of ever-growing markets and increased computational requirements.

And will AI truly be a world-changing technology? Or another victim of hype, like blockchain: a fascinating technology that has so far been largely squandered on resource-wasting, money-grabbing ventures such as cryptocurrencies like Bitcoin, “whose attributed 2021 annual emissions… produce[d] emissions responsible for around 19,000 future deaths,” according to one bluntly-put study.

Ultimately, the concern is that, even if AI efficiency improves, it may suffer the same fate as other technological advances — that this tech will become just another polluting consequence of our endless march toward “progress.” And considering the pressing issues that currently face us, there’s the question as to whether the estimated $1 trillion being invested into AI wouldn’t be better spent on other environmental projects that we know work.

But the problems don’t stop there, as AI’s direct emissions don’t tell the full story. Because businesses use AI-powered advertising and recommendation algorithms to get people to buy or consume more things, AI may inevitably end up driving emissions up further in other areas. And in a time when we’re already consuming (and ultimately, wasting) more things than at any other point in human history, that’s not an invalid concern.

Making AI work for us

“Not only is AI a solution in search of a problem, but it’s also swiftly becoming something of a problem in search of a solution.” - Lucas Ropek, writing for Gizmodo

For all the analogies I've used here, visualizing AI's energy use, carbon emissions, and water consumption is a real challenge for anyone, let alone a layperson. Undeniably, that makes raising awareness and understanding difficult. Nonetheless, researchers agree that promoting transparency, such as open reporting on the carbon footprint of AI models and revealing (and removing) inefficient systems, is vital to ensuring AI aids rather than hinders climate action. With that in mind, if corporations aren't willing to be open and honest, it may be time for government legislation to intervene.

Luccioni, who also collaborated on a study quantifying the carbon cost of machine learning that led to the creation of an emissions calculator, notes that changing how we power AI is vital:

Using renewable energy grids for training neural networks is the single biggest change that can be made [regarding AI’s carbon footprint]. It can make emissions vary by a factor of 40, between a fully renewable grid and a fully coal grid.

They also note the importance of bringing the conversation into mainstream discussion and “getting researchers to divulge how much carbon dioxide was produced by their research, to reuse models instead of training them from scratch and… [use] more efficient GPUs.”
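Luccioni's factor of 40 falls out directly from typical grid carbon intensities. The numbers below are commonly cited approximations, not figures from her study: coal at roughly 820 gCO2e/kWh, wind or hydro at roughly 20 gCO2e/kWh, and an illustrative training run of 433 MWh (roughly what BLOOM's training is reported to have consumed):

```python
# Same training run, different grids. Carbon intensities are commonly
# cited approximations (assumptions): coal ~820 gCO2e/kWh, wind/hydro
# ~20 gCO2e/kWh. The 433 MWh training energy is illustrative.
def training_emissions_tons(mwh: float, grams_co2e_per_kwh: float) -> float:
    """Metric tons of CO2e for a training run on a given grid."""
    return mwh * 1000 * grams_co2e_per_kwh / 1e6

TRAINING_MWH = 433
coal = training_emissions_tons(TRAINING_MWH, 820)
renewable = training_emissions_tons(TRAINING_MWH, 20)

print(f"coal grid: {coal:.0f} t CO2e, renewable grid: {renewable:.0f} t CO2e")
print(f"ratio: {coal / renewable:.0f}x")  # ~41x, in line with the factor of 40
```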

In the balance

Today, there’s a feeling that AI is a runaway technology without a clear goal, often used by bad actors to create spam, spread misinformation, steal from creators, or create entirely useless digital products. Meanwhile, tech startups and big tech companies have found themselves pouring money into what effectively seems like a bottomless pit of interesting but often largely useless applications that generate little in the way of substantial value for them or people.

But there’s no denying that AI technology has real, arguably vital applications for the fight to prevent further climate catastrophe, or for helping humankind adapt to its devastating effects. The question is whether we can steer the use of this technology to the applications where it is truly needed, or whether it will be just another product for us to consume. Certainly, it feels too important to simply let markets, corporations or any algorithm decide its fate, and therefore our fate, for us.
