The importance of being antifragile
The recent years of financial turmoil have taught us how little it sometimes takes to trigger a financial crisis. We’ve lived through so many of these moments now, and in a way they’re all really the same moment, repeated over and over again like in a nightmare: A stock market, a bank, a currency, or an entire economy stands on the edge of the abyss. We know that if it falls, it will take others down with it, and they in turn may bring down others. If this is allowed to go on unchecked, the whole world economy may be in danger.
Sound familiar? It should, and there’s no reason to think we’ve seen the end of it. As a market libertarian it pains me to say this, but much of the global economy and financial system as we know it today is fragile.
It’s fragile in much the same way as a vase standing on the edge of a table is fragile. Perhaps the vase has been in your family for a hundred years, much like an old and respectable bank, but that is of no help whatsoever if it does in fact now stand on the edge of the table, and somebody bumps into it. It will fall, break into pieces, and be gone for good. The fragility of the vase follows from the material it is made of – and from the stupidity of the person who placed it on the edge of the table. Later they’ll offer all sorts of excuses. How could they have predicted that anyone would bump into it? It was quite unlikely, look, here are the calculations. But these excuses are of no help to anyone. The vase is gone.
If you look around you, at things big and small, at objects, organizations and systems, you’ll find many examples of things that are fragile, and can be destroyed if the right person bumps into them in the right way. But the Lebanese-American thinker Nassim Taleb argues in his new book Antifragile that there also exist things that are the exact opposite of fragile. Things that are not merely robust, but beyond robustness, such that accidents and chance events tend to make them better and stronger – much like a vase that becomes harder to break every time you drop it on the floor.
One example of this, according to Taleb, is the aviation industry. Whenever an airplane has an accident, an attempt is made to discover why, correct the problem, and prevent it from ever happening again. The paradoxical result of this learning process is that flying becomes safer with every plane that crashes. The plane itself is fragile, but the aviation industry is the opposite of fragile – unexpected events improve it.
You see the same phenomenon in industries where the level of competition and entrepreneurship is high. The nightlife in your city gets better with every restaurant that goes bankrupt. The bankruptcy itself is a sad event, and negative for those concerned, but the overall result of bankruptcies is to improve the quality of the restaurants that survive.
Your body, too, deals with certain types of stress by improving. When you exercise, you expose yourself to an unexpected strain. You tear and stress your muscles, and it hurts. But after a few days you’re not just all right again, you’re stronger than you were before. Your muscles are the opposite of fragile – stress makes them stronger.
Nassim Taleb’s new word for this opposite of fragility is antifragility. I love neologisms, and enjoy creating them. The right new word sharpens the mind, and makes useful concepts easier to think about. Antifragility lights up a part of the world we often overlook: Things that grow stronger from chaos, uncertainty, resistance and stress.
Searching for black swans
Nassim Taleb is best known for his previous book, The Black Swan (2007), which I have discussed earlier. In it, he warned that economists, politicians and social scientists tend to underrate the importance of unlikely events. They believe in a world of friendly bell curve distributions, where what is unlikely is also unimportant. In reality, it is unlikely events that shape history. The September 11 terrorist attacks shaped an entire decade of international politics. The Harry Potter novels conquered the imagination of an entire generation of children and youths.
In retrospect, both of these events seem inevitable, but this is only a comforting story we tell ourselves to make the world seem intelligible. Reality is chaotic. The friendly stories come later.
Based on his experience from working in the financial industry, Taleb believed that this industry was particularly ignorant of the importance of negative, unexpected events, and thus also particularly vulnerable to them. He warned that the mortgage institution Fannie Mae was sitting on a basement full of dynamite, and might explode any moment. Then the financial crisis came, and proved him right. Many have claimed that they predicted it. Taleb bet money that it would happen – and made a profit.
After The Black Swan and the financial crisis, many have hailed Taleb as a prophet, and asked him to predict “the next Black Swan”, misunderstanding Taleb’s gloomy message, which was that such predictions are impossible. Black swans are, and always will be, unexpected. Taleb had not predicted the events that would topple the financial markets. He had simply observed that they were poorly prepared to deal with the unexpected.
Consider the vase that stands on the edge of the table. You don’t need to predict who is going to bump into it, or when, or why. All you need to do is ask yourself what will happen if somebody does bump into it. It will fall down, and break. Which means it’s a good idea to place it further in on the table.
Taleb believes that not only is it impossible to predict the specific events that will push the vase over the edge of the table, it is harmful to even try. So what should one do instead? What is the financial equivalent of moving the vase away from the edge? In 2007, Taleb had no answer to this question. Now, in Antifragile, he does, or at least he proposes a perspective for finding one. Financial institutions should give up trying to predict the unpredictable, and instead do something about their real problem: Their fragility. They should try to become robust to unexpected events, or, even better, antifragile. Had they done this in the early 2000s, they would have survived the financial crisis we had, but also all the other financial crises we could have had instead.
In practice this means replacing the misguided quest for perfect predictions with simpler heuristics, such as favoring situations with a limited downside and an unlimited upside over those with a limited upside and an unlimited downside.
Taleb’s ambitions reach far beyond saving financial markets. As the examples at the start, of airplanes and restaurants and muscles, illustrate, he sees fragility and antifragility everywhere in the world around us.
Fragility is simply any condition where the potential upside is limited while the downside is unlimited, a situation where things will probably turn out well, but only a little well, and in the worst case they may end in disaster. The best a bank with large loans and investments may hope for is to get their money back with a bit of profit on top. But in the worst case, they’ll lose more money in a day than they’ve earned in total since the last crisis.
Or, imagine that you own an uninsured house. The best thing that can possibly happen to your house, and the most likely, is that nothing will happen to it, and it remains standing just as it is. But the worst that can happen is that it burns down and you lose everything except for your debt. Fragility feels safe, but the fact that the upside is more probable is outweighed by the fact that the downside is so horrible.
Antifragility is the opposite of this, a condition where the potential downside is limited, but the upside is unlimited. A situation where things will probably go badly, but only a little badly, and in the best case they will go really well. An everyday example is asking someone out on a date. The worst, and most likely, outcome is that they decline, which is sad but no disaster. But the best outcome is that you will find someone to spend the rest of your life with.
Or let’s say you write a novel. The worst, and most likely, outcome is that you will have wasted your time, because nobody wants to read it. Again, this is sad, but no disaster. You’ve lost time and effort, but it is a limited loss. But the best possible outcome is practically unlimited: That you will have written the next Harry Potter or Fifty Shades of Grey.
Antifragility is frightening, but the fact that the downside is more probable is outweighed by the fact that the upside is so wonderful.
Which of these situations you’re in determines how you feel about random events. When you’re fragile, chance is an enemy you want to protect yourself from. When you are antifragile, chance is not only a welcome friend, but a necessary one. Randomness will do nothing good for your uninsured house, but it could do wonders for your novel.
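Taleb’s asymmetry can be made concrete with a toy simulation. The probabilities and payoff sizes below are my own illustrative assumptions, not Taleb’s numbers: the fragile position usually gains a little but occasionally loses catastrophically, while the antifragile position usually loses a little but occasionally wins big.

```python
import random

random.seed(0)  # make the sketch reproducible

def fragile_payoff():
    # Fragile: a small, likely gain; a rare, huge loss
    # (like a leveraged bank collecting interest until the crisis hits)
    return 1 if random.random() < 0.99 else -500

def antifragile_payoff():
    # Antifragile: a small, likely loss; a rare, huge gain
    # (like writing novels that mostly flop, until one becomes a hit)
    return -1 if random.random() < 0.99 else 500

trials = 100_000
fragile_total = sum(fragile_payoff() for _ in range(trials))
antifragile_total = sum(antifragile_payoff() for _ in range(trials))

print("fragile:", fragile_total)        # almost always deeply negative
print("antifragile:", antifragile_total)  # almost always strongly positive
```

With these assumed numbers, the fragile bet feels safe day to day yet loses heavily over time, while the antifragile bet feels like constant failure yet comes out far ahead – which is the point of the vase, the bank, and the novelist.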
In an antifragile system, what is negative on one level may be positive on a higher level. As readers, we benefit from all the countless authors who are probably wasting their time writing books nobody will read, because among them there will also be a few good ones. And it’s because of entrepreneurs who start restaurants that will probably go bankrupt that food lovers can pick and choose among the good ones that survive.
Taleb sees these people as martyrs and everyday heroes. They get the downside, we the upside.
In an antifragile system, we must embrace randomness and stressors. If instead we protect ourselves against them, all the benefits of antifragility disappear. If nobody dares to write novels, because they’re afraid of failing, we will get no more Harry Potters. If you stop stressing your muscles, because you can’t stand the pain, they will waste away. And if you avoid adversity in small things, you will be unprepared to deal with it in large things.
The Swedish psychiatrist David Eberhard has argued that Scandinavians are addicted to the feeling of being safe. They would rather live their lives from beginning to end in a sterile bubble, free of pain and stress. But sooner or later the bubble bursts, and when it does, you are as fragile as an animal born in captivity that is released into the wild.
Not quite libertarianism
Safety-obsessed politicians aim to create a safe economy, free from downturns, bankruptcies and layoffs. But protecting the economy by preventing failure is like protecting your body by avoiding exercise. And when the crisis finally does arrive, despite all your best efforts, it will do more damage than if you had allowed yourself to weather it earlier. You end up with an economy of dinosaurs: Expensive, unprofitable, and too big to fail.
This may sound like market libertarianism, and it almost is, but not quite. Taleb detests the naive rationalism of many economists. He sees them as ornithologists who would teach birds how to fly. And their dream of a highly efficient global economy is making the world more fragile.
Taleb does not disapprove of all economists. He borrows heavily from Friedrich Hayek, the patron saint of pragmatic libertarians. Hayek’s main idea was that it is not possible even for a well-intended central government to control and run an entire economy. It simply does not have access to the necessary information, which is distributed among all the actors, all the companies and individuals who make up the economy, and can never be gathered in one place. Therefore the economy should be free and decentralized, so that each of us may make use of what little we know.
Hayek is an undogmatic libertarian. A social democrat who reads Hayek will probably become a smarter social democrat. Want a welfare state? Sure, if you must, but try not to be a naive rationalist who tampers with forces you don’t understand. It’s this pragmatic aspect of Hayek that Taleb builds on, in a way that is in direct conflict with many liberal and libertarian ideas. They too suffer from naive rationalism, the belief that if only we remove all rules and all regulations, everything will just magically turn out for the best. Open borders, super states and unlimited globalization: the more freedom we have, the better the outcome will be.
Too soon to be optimistic
But while freedom and antifragility overlap, they’re not the same thing. Antifragility is a particular kind of freedom: Decentralized, redundant and small-scale.
If you want to learn if something is fragile, robust, or antifragile, there is a simple test you can perform: Sit down, wait about 1000 years, and see what happens to it. Time eats away the fragile, allows the robust to stand unharmed, and strengthens the antifragile. If a technology, tradition or institution has already survived for a long time, it is either robust or antifragile. (Religion is here to stay.)
But if something is brand new, it is too soon to tell. Taleb dismisses optimists who believe that the world is becoming ever safer, ever richer, and ever more peaceful, and that this process will continue forever. It might, but we don’t have enough data yet.
Wars are fewer these days, but the last one we had in our part of the world was also the deadliest war in history. The one we might have had in 1962 would have been even deadlier. How deadly will the next great war be? We have only had nuclear weapons for 70 years. After 700 we can breathe easier, and after 7000 we can relax a little. There have been other periods of growth and peace before the current one. They didn’t last forever.
And when the next disaster finally does occur, Taleb believes it will do more damage in a world of tightly knit, efficient, debt-ridden, naive-rationalist super states than in a world of redundancy and slack.
Theoreticians believe we can safely take on all sorts of risks. We can take on the risk of debt, and the risk of climate change, because their calculations predict that it will turn out well. But reality is meaner than their theories.
The ethics of punditry
Taleb directs his harshest criticism at the world’s economic, political and journalistic elite – the Davos man. He accuses them of being corrupt, because they have found a way to cheat the system, so that they keep the upside when things go well, but the rest of us get the downside when things go badly.
The financial crisis taught us that in the financial sector, profits are private, but losses are public. And whenever a pundit makes a mistake, those who listened to their poor advice get the downside, while they themselves usually keep their jobs. Thomas Friedman was an enthusiastic supporter of the Iraq war. He hoped it would democratize the Arab world. He got nearly everything wrong, but he still writes for the New York Times.
Joseph Stiglitz judged Fannie Mae to be rock solid. Now he claims that he predicted the financial crisis.
There’s an uneasy tone in many newspaper articles about Taleb. Reviewers complain of his unnecessarily large ego, and that he often speculates about subjects he probably doesn’t know much about, such as dieting. Both of these accusations are correct. But I wonder if some of this uneasiness is also caused by a guilty conscience. Taleb’s ego is a useful excuse for changing the subject from his uncompromising attitude towards the media.
So what is it that he actually wants? Surely he does not want to punish experts when they make bad predictions? Well, yes. That is exactly what he wants.
The ancient lawmaker Hammurabi would have builders executed when the houses they’d built collapsed and killed their owners. Taleb does not go quite that far, but he believes we have a lot to learn from Hammurabi, and that it is unethical to have an opinion unless you have something to lose by it. Warmongers should send their own children or grandchildren into the war. Financial experts should invest in their own advice. The least anyone should be exposed to when they make a bad prediction is public humiliation.
I like this idea, so I will begin this year with a promise. When 2013 is over, I will write an article that lists all the errors I’ve made in this column for Aftenposten. Factual errors, misunderstandings, bad analysis, and most importantly failed predictions.
It costs me nothing, nothing at all, to write some words in a column. The least I can do is try to draw attention to my own mistakes – as when I, like Thomas Friedman, thought the Iraq war was a good idea.