The carbon footprint of ChatGPT

Chris Pointon
3 min read · Dec 22, 2022


Photo by Matthias Heyde on Unsplash

UPDATE, March 3 2023: The spiralling use of ChatGPT means it is most likely hosted in a range of locations with different electricity carbon intensities. This makes it impossible to give a reasonable estimate of the CO₂ footprint. I’m going to stop updating this article but leave it in place to show the methodology. Kasper Groes Albin Ludvigsen has estimated ChatGPT’s electricity consumption for January 2023 as equivalent to that of up to 175,000 Danes. We both hope that Microsoft will soon publish its own analysis of the carbon footprint, using data only it can access.

When something as big as OpenAI’s ChatGPT comes along, the first thought of the community at ClimateAction.Tech isn’t “Ooh, let’s try it!”, but rather “Hmm, I wonder what its carbon footprint is?”. Since the internet is the planet’s largest coal-powered machine, anything that gets a lot of traffic is likely to be emitting a lot of CO₂. So let’s see if we can work out how much…

This tweet about the hosting cost of ChatGPT starts with an estimate that each word of response takes 350ms on an A100 GPU. It then guesses at 30 words per response and the number of responses per day:

Update, February 20, 2023: following a UBS estimate of 13 million users per day reported in Wired. I’ve also reduced the number of questions per day per user from 10 to 5 based on some feedback.
13 million users per day with 5 questions each =
65 million responses =
1.95 billion words per day
* 0.35s per word / 3,600 seconds per hour
= 189,583 hours of A100 GPU time per day

Original:
1 million users with 10 questions each =
10,000,000 responses =
300,000,000 words per day
* 0.35s per word / 3,600 seconds per hour
= 29,167 hours of A100 GPU time per day
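
Here is the same arithmetic as a minimal Python sketch. Every input (users per day, questions per user, words per response, seconds per word) is one of the assumptions above, not a measured value:

```python
# Back-of-envelope estimate of daily A100 GPU hours for ChatGPT.
# All inputs are assumptions carried over from the tweet / UBS figure above.

SECONDS_PER_WORD = 0.35    # estimated A100 time per generated word
WORDS_PER_RESPONSE = 30    # guessed average response length

def gpu_hours_per_day(users_per_day: int, questions_per_user: int) -> float:
    """Daily A100 GPU hours implied by the assumptions above."""
    responses = users_per_day * questions_per_user
    words = responses * WORDS_PER_RESPONSE
    return words * SECONDS_PER_WORD / 3600

print(gpu_hours_per_day(13_000_000, 5))   # ~189,583 hours/day (Feb 2023 update)
print(gpu_hours_per_day(1_000_000, 10))   # ~29,167 hours/day (original estimate)
```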

Cloud Carbon Footprint lists a minimum power consumption of 46W and a maximum of 407W for an A100 in an Azure datacenter (see MIN_WATTS_BY_COMPUTE_PROCESSOR and MAX_WATTS_BY_COMPUTE_PROCESSOR). I’m guessing not many ChatGPT processors are standing idle, so I expect they’re consuming at the top end of that range.

Update, February 20, 2023:
189,583 hours * 407W =
77,160kWh per day

Original:
29,167 hours * 407W =
11,870kWh per day
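
And the conversion to electricity as a sketch, assuming every GPU really does draw the 407W maximum:

```python
# Convert daily GPU hours into electricity consumption, assuming each A100
# draws the 407W maximum listed by Cloud Carbon Footprint for Azure.
A100_MAX_WATTS = 407

def kwh_per_day(gpu_hours: float, watts: float = A100_MAX_WATTS) -> float:
    """Daily electricity use in kWh for the given GPU hours."""
    return gpu_hours * watts / 1000   # watt-hours -> kilowatt-hours

print(kwh_per_day(189_583))   # ~77,160 kWh/day (Feb 2023 update)
print(kwh_per_day(29_167))    # ~11,871 kWh/day (original estimate)
```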

I believe ChatGPT is hosted in California, and Cloud Carbon Footprint (same file) says the emission factor for the Western USA is 0.000322167 tonnes of CO₂e per kWh. So the CO₂ footprint is:

0.000322167 * 77,160 =
24.86 tCO₂e per day

Original:
0.000322167 * 11,870 =
3.82 tCO₂e per day

That’s about 19 (original: 3) months of an average American’s footprint of approximately 15 tCO₂e per year. Or, put another way, it’s the same CO₂ emission rate as 605 (original: 93) Americans.
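
Putting the last step into the same sketch; the emission factor and the ~15 tCO₂e/year American average are the figures quoted above, not independently verified:

```python
# Convert daily electricity use into CO2e using the US West emission factor,
# then compare with the ~15 tCO2e/year average American footprint.
EMISSION_FACTOR_T_PER_KWH = 0.000322167   # tCO2e per kWh (Cloud Carbon Footprint, US West)
US_FOOTPRINT_T_PER_YEAR = 15

daily_tco2e = 77_160 * EMISSION_FACTOR_T_PER_KWH
print(daily_tco2e)                                    # ~24.9 tCO2e per day
print(daily_tco2e / (US_FOOTPRINT_T_PER_YEAR / 12))   # ~19-20 months of one American's footprint
print(daily_tco2e * 365 / US_FOOTPRINT_T_PER_YEAR)    # ~605 Americans' annual emission rate
```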

What’s missing from this quick analysis:

  • The actual number of queries per day that OpenAI users are generating
  • Emissions from training the model. In an article about the CO₂ footprint of a single ChatGPT instance, Kasper Groes Albin Ludvigsen lists this at 522 tCO₂e. These emissions are amortised over the lifetime of the model (a rough sense of scale is sketched below)
  • CO₂ emissions of the end-user equipment accessing ChatGPT. This includes power consumption and a share of the emissions from producing and disposing of the device. It’s probably the largest component of the footprint, and it’s impossible to calculate without knowing what devices are accessing ChatGPT and from where. OpenAI could provide some idea of this if they added something similar to the Website Carbon Calculator to the service.
  • Non-GPU emissions like networking, RAM and SSDs
  • The embodied carbon of the datacenter

Cloud Carbon Footprint does have estimates of the last two of these, but I’ve left them out for simplicity.
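
To give a rough sense of scale for the training-emissions point above, a tiny sketch; the one-year model lifetime is purely an illustrative assumption on my part, not a figure from any source:

```python
# Spread the 522 tCO2e training estimate over a model lifetime to compare it
# with the daily inference footprint estimated above.
TRAINING_TCO2E = 522
ASSUMED_LIFETIME_DAYS = 365   # purely illustrative assumption

print(TRAINING_TCO2E / ASSUMED_LIFETIME_DAYS)   # ~1.4 tCO2e/day, vs ~24.9 tCO2e/day for inference
```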

Written by Chris Pointon

Internet entrepreneur and technologist. Co-founder of @Racefully and @Databoxer
