Power Hungry: Manufacturing Intellectual Capital
One thousand four hundred terawatt-hours (TWh). That’s a massive amount of energy. Even one of the dumbest technologies ever invented by humans, proof-of-work cryptocurrencies, consumes only about one-tenth of that energy annually. To put 1,400 TWh into perspective, it’s roughly triple Germany’s annual electricity consumption.
Given the headlines, you’d be forgiven for thinking AI is on track to burn that amount of energy. In reality, even aggressive estimates of AI energy demand by 2030 are below 200 TWh — slightly more than the Bitcoin network uses today to manufacture hopes and dreams.
What could possibly consume 1,400 TWh per year?
That’s how much energy it takes to power our current stock of global intellectual capital. I’m talking about human brains.
People Powered
Each human brain runs on roughly 20 watts, which works out to less than half a kilowatt-hour per day. That’s an incredible figure, considering our capabilities. However, any number becomes massive when you multiply it by 8 billion.
Human intellectual capital is efficient, but it’s not terribly productive. Less than half of humans participate in the workforce, and those who do are only productive for a few hours daily. If you do the math, only about 5% of those 1,400 terawatt-hours generate productive output.
Assumptions:
Global population = 8.2 billion
Workforce participation = 3.5 billion (43%)
Days worked per year (average) = 250 (68%)
Productive hours per day (estimate) = 4 (17%)
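Here’s a rough sketch of that arithmetic in Python. The 0.48 kWh/day figure is my gloss on “less than half a kilowatt-hour” (a roughly 20-watt brain running around the clock); everything else comes straight from the assumptions above.

```python
# Back-of-envelope sketch using the assumptions listed above.
BRAIN_KWH_PER_DAY = 0.48   # ~20 W running 24 hours, just under 0.5 kWh/day
POPULATION = 8.2e9
WORKFORCE = 3.5e9          # ~43% participation
DAYS_WORKED = 250          # ~68% of the year
PRODUCTIVE_HOURS = 4       # ~17% of a 24-hour day

total_twh = BRAIN_KWH_PER_DAY * POPULATION * 365 / 1e9  # kWh -> TWh
productive_share = (WORKFORCE / POPULATION) * (DAYS_WORKED / 365) * (PRODUCTIVE_HOURS / 24)

print(f"Global brain energy: ~{total_twh:,.0f} TWh per year")  # ~1,400 TWh
print(f"Productive share: ~{productive_share:.0%}")            # ~5%
```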
This thought exercise led me down a rabbit hole. Is AI more or less energy efficient than human brains? Is manufacturing intellectual capital a viable business model? Should we slow AI production until the models are more efficient and renewable energy is more widespread?
Let’s dive into the numbers.
Artificial Minds
AI models are notoriously power-hungry. Public data is limited, but GPT-3 reportedly used 1,287 megawatt-hours (MWh) for training. Training requirements scale roughly quadratically with model parameters, so GPT-4 might have required more than 100,000 MWh. That number is highly speculative, so let’s use GPT-3 as the benchmark.
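As an illustration of that scaling argument (not a measurement), here’s how a six-figure MWh estimate falls out if you assume training energy grows with the square of the parameter count and, speculatively, that GPT-4 has about ten times GPT-3’s parameters.

```python
# Speculative scaling sketch: training energy assumed to grow with the
# square of the parameter count. The 10x parameter multiple is a guess.
GPT3_TRAINING_MWH = 1_287
PARAM_MULTIPLE = 10  # hypothetical GPT-4 parameter count relative to GPT-3's 175B

gpt4_estimate_mwh = GPT3_TRAINING_MWH * PARAM_MULTIPLE ** 2
print(f"GPT-4 training estimate: ~{gpt4_estimate_mwh:,.0f} MWh")  # ~128,700 MWh
```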
Humans aren’t economically productive until they’re trained. Having a bunch of three-year-olds running around the office may be fun, but they won’t get much work done.
For this comparison, assume it takes 18 years to train a human to compete with GPT-3 on similar tasks (e.g., writing, reasoning). Using the figures I cited earlier, that translates to about 3.2 MWh to produce a single 18-year-old human brain.
The implication is that training GPT-3 took about as much energy as producing four hundred human brains. That’s a large number, but it’s not ridiculous. I doubt four hundred humans could accurately memorize every pattern encoded in GPT-3’s 175 billion parameters.
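The arithmetic behind both numbers, again assuming the brain draws about 0.48 kWh per day:

```python
# Energy to "train" one human brain for 18 years, compared with GPT-3.
BRAIN_KWH_PER_DAY = 0.48
TRAINING_YEARS = 18
GPT3_TRAINING_MWH = 1_287

human_training_mwh = BRAIN_KWH_PER_DAY * 365 * TRAINING_YEARS / 1_000  # kWh -> MWh
equivalent_brains = GPT3_TRAINING_MWH / human_training_mwh

print(f"One 18-year-old brain: ~{human_training_mwh:.1f} MWh")    # ~3.2 MWh
print(f"GPT-3 training = ~{equivalent_brains:.0f} human brains")  # ~400
```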
For fun, let’s do the calculation in reverse. GPT-3 was trained on about 300 billion tokens. Using conservative assumptions, it would take four hundred humans about 13 years to process the tokens used to train GPT-3.
Assumptions:
GPT-3 training tokens = 300 billion
Words per token = 0.75
Human reading rate = 250 words per minute
Hours per day = 8
Number of humans = 400 (from previous calculation)
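Putting those assumptions together:

```python
# Reverse calculation: how long would 400 humans need to read GPT-3's training data?
TRAINING_TOKENS = 300e9
WORDS_PER_TOKEN = 0.75
READING_WPM = 250
HOURS_PER_DAY = 8
HUMANS = 400

total_words = TRAINING_TOKENS * WORDS_PER_TOKEN          # 225 billion words
words_per_human_day = READING_WPM * 60 * HOURS_PER_DAY   # 120,000 words per day
years = total_words / (words_per_human_day * HUMANS) / 365
print(f"Reading time: ~{years:.0f} years")               # ~13 years
```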
Is GPT-3 as capable as four hundred teenagers? Which would you rather have working in your office?
I’ll be the first to admit these calculations are ridiculous. You could write a dissertation about the flaws in my logic and simplifying assumptions. That said, I don’t see overwhelming evidence that AI models are more power-hungry than human brains. If anything, they may be less.
One Less Crisis
Let’s stop worrying about AI energy consumption. If we use 200 TWh for AI in 2030, it’ll mean we’re manufacturing and employing intellectual capital at a breakneck pace. If AI capabilities plateau before then, I’m confident we’ll slow model training and divert resources to our old technology — human capital.
I’m concerned about AI safety and labor market disruptions. You don’t need wild assumptions or rigorous calculations to validate those risks. All you need is an imagination and an understanding of how previous technologies upended society. Energy is different.
We’ve relied on human brains for thousands of years to power economic growth. That model is becoming less sustainable by the day. If you care about lifting humans out of poverty while conserving natural resources, you should be all-in on AI — and maybe steer clear of crypto.
Note: While discussing this article, a friend asked about inference. AI may be competitive with human brains during training, but what about when performing tasks? I ran those calculations as well, and the results were similar. My brain used 2.2 Wh to generate a 61-word email response. Using the best assumptions I could find online, I estimated ChatGPT used 2.8 Wh for its 60-word response. That ratio is in the same ballpark as the one I found for training, and the training assumptions are more robust. AI draws way more power than the human brain but does so for significantly less time.
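For the curious, here’s one way the 2.2 Wh figure pencils out: assume the brain draws roughly 20 watts and that the email took about six and a half minutes to compose. Both inputs are rough estimates, and the ChatGPT number is a third-party estimate rather than a measurement.

```python
# Rough inference comparison. Brain power and composition time are estimates.
BRAIN_WATTS = 20
MINUTES_TO_COMPOSE = 6.5   # rough guess at time spent on a 61-word reply

brain_wh = BRAIN_WATTS * MINUTES_TO_COMPOSE / 60
chatgpt_wh = 2.8           # estimated energy for its 60-word response

print(f"Brain: ~{brain_wh:.1f} Wh")      # ~2.2 Wh
print(f"ChatGPT: ~{chatgpt_wh:.1f} Wh")
```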