“Hard Ceilings” Doomed Past Societies
Can we do better?
by Ian Morris
Without energy, we die. That is the law of entropy: over time, complex arrangements of matter break down unless they can capture constant inputs of free energy from outside themselves. You can test the theory yourself: if you stop taking in energy from food, the complex arrangement that is you will break down after a couple of weeks. If you stop taking in water, the breakdown will begin after a couple of days. If you stop taking in oxygen, it comes after a couple of minutes.
So far, humans have found three main ways of capturing energy in addition to breathing and drinking water. The first, historically, was foraging (hunting wild animals and gathering wild plants). Next came farming (cultivating domesticated animals and plants), and then the use of fossil fuels (augmenting the energy of animals and plants with that trapped in coal and oil). We have tinkered with a fourth set of sources — nuclear and solar — but have not yet done much with them.
There were firm ceilings on the amount of energy that the first two sources, foraging and farming, could provide. For foraging, the ceiling was a little under 10,000 kilocalories per person per day, and for farming, a little over 30,000 (these energy budgets had to cover not just food, by the way, but also fuel, shelter, clothing, and everything else that people do). Contemporary Americans consume on average about 230,000 kilocalories per day, most of it turned into electricity to power our machines. We do not know — yet — whether there is also a hard ceiling on what is possible in fossil fuel societies, but the implication of history seems to be that there is.
Past experience suggests that the environment largely determines where the hard ceiling lies for each method of energy capture. The Kwakiutl (also known as Kwakwaka’wakw), foragers in resource-rich British Columbia, consumed right around 10,000 kilocalories per person per day, while the !Kung San, foragers in the resource-poor Kalahari Desert, managed on half that. However, history also shows that culture determined just how much of their homeland’s energy potential a particular group would capture. As Adam Smith recognized in the eighteenth century, a farming society’s wealth could rise enormously as it increased its division of labor. Specialization produced gains in everything, including in the invention of ships, tools, and techniques that further raised energy capture.
However, as people in Smith’s age also knew, there were limits beyond which market-friendly values and an ever-finer division of labor simply could not take you. Everywhere from Massachusetts to Manchuria, output per person was declining in the late eighteenth century, because there were simply too many people to feed if the only methods available were those of an agrarian world.
The big problem, recognized by Smith’s near-contemporary Thomas Malthus, was that when people did well at capturing energy, they turned much of that energy into more of themselves. Farmers could breed faster than they could increase energy supply (when freed from other constraints, population can double every seventeen years), meaning that hungry mouths tended to consume all the surplus available. Farming societies might stay trapped against the hard ceiling for a while — two thousand years ago, this happened to Rome for roughly two centuries, and a thousand years later, the same thing happened again to Song dynasty China — but stagnation never lasts forever, because a society pressing against the limits of what is possible is very unstable indeed.
The most frequent outcome, by far, is that societies stretched to the limit will collapse. Judging from our historical record, which goes back to 2200 BC in the Near East, the same five forces — population movements, disease, government breakdown, famine, and climate change — are always involved in major breakdowns. Whether we are talking about foragers, farmers, or fossil fuel users, the struggle to keep everything going in an overcrowded world tends to degrade the environment and exhaust the stocks of plants, animals, and fuels that economies depend on. Fragility replaces resilience, and when everything is under stress, even quite small shocks can bring the entire system crashing down. Often, in fact, it takes only one small region buckling under the strain for contagion to spread everywhere. In the worst cases, long dark ages can result. After the collapse of the Indus Valley civilization around 1900 BC, it took a millennium for South Asian social development to return to pre-breakdown levels; after the fall of the Roman Empire, it took Europe even longer.
Once in a very long while, however, societies pressing against the hard ceiling do not collapse. Instead, necessity is the mother of invention; someone innovates and finds a way to shatter the constraints (and, equally important, the innovation is recognized for what it is and is widely adopted). This is what happened around 9500 BC in an area that archaeologists call “the Hilly Flanks,” stretching along the borderlands of what are now Israel, Jordan, Syria, Turkey, Iraq, and Iran. Here there were unusually dense concentrations of plants and animals — wheat, barley, beans, sheep, goats, cows — that had the genetic potential to be domesticated; and, by constantly tinkering with the best ways to cultivate or herd their wild predecessors, humans created the world’s first genetically modified organisms. Farming blew away the ceiling that had limited what foragers could do, and over the next ten millennia, farmers took over almost the whole planet.
There was no breakthrough of equivalent importance until about AD 1800, when entrepreneurs in Britain — confronting rising population and falling wages and profits — figured out how to burn coal and turn the heat this released into motion that could power machines, vastly augmenting the muscles of men and other animals. Population and wealth boomed, and over the next two centuries fossil-fuel industry spread nearly everywhere.
This has been the story so far, but I will close with some good news, some bad news, some worse news, and some better news about where it might take us next. The good news is that so far, when ways to break through a hard ceiling have existed, people have eventually found them. If solar, nuclear, nano, or any other kind of energy can solve the problems that we have created through our exploitation of fossil fuels, we can be optimistic that someone will work out how to do this.
The bad news, though, is that word “eventually.” Every time a society runs up against the hard ceiling that constrains it, a natural experiment is played out, with the forces of human ingenuity running a race against those of mounting catastrophe. Most of the time, catastrophe wins. Thousands of foraging bands ran up against the limits of what their environments could support. Almost all of these bands failed to find a way through, and paid the price in starvation, disease, and extinction. Only a tiny handful found the path toward farming.
We know of just half a dozen farming societies that subsequently reached the limits of what the agrarian age could support — the Roman Empire in the first two centuries AD, Song China in the eleventh and twelfth centuries, and Europe, the Ottoman Empire, Mughal India, and Qing China in the eighteenth. Only one area — eighteenth-century England — figured out fossil fuels.
There is a trend here: the bigger and more developed societies become, the fewer of them there are, and the fewer chances humanity gets to work through its problems. In our own age, we effectively have just one global fossil-fuel society, and we only get one chance to shatter the ceiling over it. Failure is not an option.
And now for the worse news. In the past, mass violence accompanied every failed attempt we know of to break through the hard ceiling. Whether we are talking about the end of the Roman and Han Empires or the collapse of the Ottomans, Mughals, and Qing, the death toll always ran into the millions. In the twenty-first century we can expect the same, but worse, because of course we now have nuclear weapons. Roman emperors would have loved the bomb, but because they and their rivals did not have it, Europe eventually recovered from the end of the ancient world (even if it took about fifteen hundred years).
It is certainly true that right now we do not have enough nuclear weapons to kill everyone on earth. For every twenty warheads that existed in 1986, the peak year, there is now just one; but on the other hand, Russia is about to expand its arsenal (the first major power to do so in a generation), and if we choose to, we can quickly take the world back into nuclear-annihilation territory. This time around, we could have a collapse that no one ever recovers from.
Compared to that scenario, of course, absolutely anything would count as better news, but I want to close on a genuinely positive note. Arguably, the greatest achievement in human history has been our progress in mastering violence. In the age of foragers, people on average stood a 10–20 percent chance of dying violently. In the age of farming, that figure regularly dropped below 5 percent, and was sometimes just half that.
Since the industrial revolution, it has gone down even further. Despite two world wars, two atomic attacks, and multiple genocides, the twentieth-century global rate of violent death was just 1–2 percent, and the World Health Organization tells us that since 2000 it has been just 0.7 percent.
I wrote at length about why this has happened in my book War! What is it Good For? and will not repeat those arguments here; all I need to say in conclusion is that there really are good reasons to be optimistic that we can manage the challenges of the twenty-first century without resorting to the old solution of killing everyone who disagrees.
Maybe we do not have to end up in the world of Oryx & Crake after all.