Monday, 03 December 2018
Shift #14. Principle: Truth.
From weak, to strong, to fuzzy thinking.
Everyone begins with a weak frame. If you have a weak frame, that means your thinking is environmental: you are easily influenced by circumstances, and by the thinking of the people around you. They give you confidence: they tell you that you’re safe, tell you what reality is, tell you what to expect. And they reward you for operating within whatever framework prevails. If the framework changes, you adapt. You don’t need to understand, to ask questions like: what makes the new way of thinking better than the old way, how did we come to these conclusions, etcetera? No, you don’t need to understand the new thinking, you just need to embrace the new values: the new incentives, the new rules, the new expectations for behavior. You just need to have faith. Faith in the model, faith in the incentives. If you act how you’re expected to act, you’ll be rewarded.
If you have a strong frame, that means that your thinking is driven by a single model: you have a worldview, a way of processing reality, a kind of intellectual machine. New information goes in, gets run through a set of definitions, rules, and principles, and out come your reactions: your perceptions, judgements and decisions about it. Your model tells you what to think and what to do.
Over time, you may improve your model, but you’re always building on top of the same foundation. There may be edge cases and exceptions where your model doesn’t give you a satisfying answer, but by and large, it gives you satisfying clarity on the important things. That is, you have a high degree of confidence in your premises and early conclusions; they give you confidence that you are accurately perceiving reality; they have some predictive value, so you can use the information you have today to anticipate the future; and most importantly, they serve you well on a day-to-day basis, so you feel rewarded for being right, for having insight into truth.
Why most people have weak frames.
Most people, even most intelligent people, operate with a weak frame. It is just easier not to think for yourself. Aside from the natural difficulty, people who think for themselves tend to be punished by the group. There’s a good reason for this: the vast majority of humanity operates within a hierarchy. Hierarchies have strong frames. But they want followers with weak frames. They don’t necessarily want you to understand their thinking, but they do need you to embrace their values. Hierarchies impose these values both explicitly and implicitly. If you question their thinking or deviate from their values, you’re out of alignment. You’re interrupting their flying formation and disrupting their cadence, so naturally, you’ll be punished. Hierarchies use rewards and punishments to incentivize alignment.
Why some people risk strong frames.
Some people who leave hierarchies risk strong frames, because they care about truth. It is very risky to challenge the strong frame you’re operating inside of. But some people just can’t stand being trapped within someone else’s thinking and values. So they begin to think for themselves, and eventually they develop a strong frame of their own. And either they join another hierarchy, another group, another society, that is more aligned with their new thinking and values, or they start their own.
Some people who stay risk strong frames, because they want power. If you’re at the bottom of a hierarchy, you’re not incentivized to understand its thinking, just its values. But as you climb the hierarchy, you need to understand its thinking. And at the top, you gain doctrinal authority, that is: you can change the thinking, as long as you can maintain alignment.
Elites fight over dogma because dogma is mind-control, and mind-control is power. So if you want power, you eventually must comprehend the strong frame of your own hierarchy. Even if you don’t care about truth, you need to understand enough about the beliefs of your organization to make sure they are aligned with your goals.
It isn’t enough for your dogmas to be consistent with your internal goals, you have to be mindful of external realities. Every hierarchy is a garden that exists in the jungle: the internal politics of the garden often put the garden at risk of the external politics of the jungle. Hierarchies often fail because the person who wins the internal game to ascend to the top is unprepared to play the external game, or cannot align the internal game with the external game. Domestic policy and foreign policy conflict at the peril of the state. This is true for all hierarchies: families, groups and companies, not just governments.
What happens when strong frames break.
But what happens when your model breaks? Strong frames break. No model is perfect. Reality is always changing. Every argument has a counterargument. Bull markets turn into bear markets. New information will arise that the model will process incorrectly. New opportunities that it will miss. In valuing certain things, a model will overlook others, which become threats.
Medieval scholasticism undervalued science and incorrectly processed Galileo’s evidence for heliocentrism. The hierarchy rejected his truth — “E pur si muove,” and yet it moves, and yet the earth moves around the sun. The hierarchy’s rejection of truth didn’t change truth, and that eventually diminished its power. The model broke, and people lost faith in it.
Gödel’s Incompleteness Theorems are a landmark in the history of thought. For centuries, Western civilization had confidently invested in “the truth project”: in the belief that a single model could be built to understand reality. Science replaced philosophy, which had replaced religion — but the Enlightenment still believed in monolithic, objective truth. Until Gödel. Gödel showed that even in mathematics, it was impossible to build an algorithm, a truth machine, a single consistent model capable of generating all truths: any such system rich enough to express arithmetic contains true statements it cannot prove. Any strong frame in mathematics would generate an incomplete set of truths. The only way to generate a more complete set of truths is to build more and more models, which are mutually inconsistent.
Why fuzzy thinking outperforms in the long run.
Great thinkers and great organizations break their own models. Accepting the thinking of others prepares you to challenge the thinking of others: a weak frame prepares you for a strong frame. In the same way, if you’ve never built a strong frame before, you’re not ready for fuzzy thinking. But if you’ve invested years into building an ideology, a coherent worldview, an intellectual framework, a philosophy, a religion, a model — then you know how attractive it is, and how terrible it is when it shatters. And then you are lost. You need a new model, but now you distrust all models. You’re in a Gödelian crisis.
Fuzzy thinkers build models then break models, but they keep building models. They don’t lose faith in the value of models. They just realize that every model captures a different truth, and is useful in a different situation. And the reverse is true: every model lies, distorts reality in certain ways, and is dangerous if misapplied. So fuzzy thinkers collect models, and they know which model to apply in a given situation. If a situation arises for which they have no model, they attempt to construct one. They are always triangulating. Always gathering new information. Always recognizing patterns and seeking gestalts.
Weak thinkers outperform in the short term, because hierarchies reward them. Strong thinkers outperform in the medium term, because they create new hierarchies or ascend existing hierarchies, and gain power over weak thinkers. But fuzzy thinkers outperform in the long term.
Strong thinkers move faster than weak thinkers and fuzzy thinkers, because they don’t question their models. As long as their models are right, they are rewarded for their confidence. But as soon as their models break, they run off a cliff.
Fuzzy thinkers are slow because they are always paying a thinking tax. They’re always questioning their premises and conclusions, seeking to identify their blind spots, factoring in unaccounted-for risks, integrating new information, seeking radical counterpoints. But in the long run, this tax turns into an investment. Their intellectual machine runs not just on one engine, but on many engines. They are more flexible; their multi-framework framework is prepared to handle a wider range of situations.
Confident philosophies turn into religions. Confident strong frame thinkers are religious in nature. They are dogmatic, evangelical and imperialistic. By dogmatic, I mean they are confident that their version of the truth is The Truth. By evangelical and imperialistic, I mean, broadly speaking, that they are expansionist: because they believe they possess The Truth, they are trying to spread it far and wide, and turn it into Power, that is, take over the world.
While religions and empires have their vices, they also have their virtues. Confident regimes invest the most aggressively in the future, undertake the most ambitious building, infrastructure, and construction projects, and make the most passionate art.
The danger with fuzzy thinkers is that they struggle with confidence. They know they don’t possess The Truth, and they know that the truths they do possess are flawed, so they hesitate to impose them on others. They hesitate to persuade, to invest, to build and to make art.
Paradoxes to transcend.
The ideal leader and the ideal organization would somehow transcend these paradoxes. They would both aggressively question themselves and aggressively expand. They would be confident in their skepticism, and skeptical of their lack of confidence. They would penalize weak frames, enforce a strong frame, yet celebrate fuzzy thinking. They would train as many fuzzy thinkers as possible to facilitate periodic model breaking, but preserve a command-and-control hierarchy to operate effectively once decisions are made. They would continually re-align internal politics with external reality. And reconcile truth and power.
This isn’t an answer of course, but a question.