Edward Lorenz was trying to take a shortcut.
As a mathematician and meteorologist, he was developing a computer model that could accurately predict the weather. It was something of a sore spot for meteorologists that while physics could precisely predict the return of Halley’s Comet and the timing of eclipses, they still couldn’t accurately forecast the weather.
Lorenz hoped that with his new, state-of-the-art 1960s computer technology, he’d be able to accurately predict the weather by simulating temperature, pressure, and wind speed.
But these computers took a long time to run. So one day in 1961, wanting to reexamine one simulation at greater length, he started it halfway through. He manually input the data from an earlier experiment’s printout to create the same starting condition. Then he went to grab a cup of coffee while the computer did its work.
He was expecting the new run to duplicate the previous one exactly, as he’d entered the same data. Yet it produced a completely different outcome. Lorenz described the pair as “two random weathers out of a hat.”
He checked through the results, pored over the code, and inspected the computer for a bug. After weeks of analysis, he found the problem. The computer’s memory stored numbers to six decimal places, but the printout showed only three. So instead of entering .506127, he had entered .506.
For many calculations, this difference wouldn’t matter. We’re used to the idea that a small error in the input data will lead to a correspondingly small error in the output. Weather, to the frustration of meteorologists, is different. It’s an interdependent environment. A small event, such as the flap of a butterfly’s wings, creates eddies of air that influence the air around them, producing compounding effects and chain reactions. As Lorenz described it,
“It implies that two states differing by imperceptible amounts may eventually evolve into two considerably different states. If, then, there is any error whatever in observing the present state — and in any real system such errors seem inevitable — an acceptable prediction of the instantaneous state in the distant future may well be impossible.”
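You can see this sensitivity in a few lines of code. The sketch below uses Lorenz’s later, simplified three-variable system (not his original weather model) with his classic parameters, plus a hypothetical `simulate` helper and simple Euler integration. Two runs start from values differing only in the digits his printout dropped, yet the trajectories end up wildly far apart:

```python
# A minimal sketch of sensitive dependence on initial conditions, using
# the three-variable Lorenz system. The `simulate` helper and the Euler
# integration scheme are illustrative choices, not Lorenz's original setup.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations one step with simple Euler integration."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def simulate(x0, steps=3000):
    """Return the full trajectory starting from (x0, 1, 1)."""
    state = (x0, 1.0, 1.0)
    trajectory = [state]
    for _ in range(steps):
        state = lorenz_step(state)
        trajectory.append(state)
    return trajectory

# Two runs differing only in the rounded-off digits, like Lorenz's printout.
run_a = simulate(0.506127)
run_b = simulate(0.506)

# The largest gap between the two runs dwarfs the initial difference.
gap = max(abs(a[0] - b[0]) for a, b in zip(run_a, run_b))
print(f"initial difference: 0.000127, largest gap: {gap:.1f}")
```

An initial difference of about one ten-thousandth grows until the two “weathers” share nothing but their equations, which is exactly what Lorenz saw on his printout.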
In 1972, he presented a paper on his findings, titled “Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” He didn’t actually answer the question.
So Does the Flap of that Brazilian Butterfly’s Wings Set Off a Tornado in Texas?
“See, here I’m now by myself, uh, talking to myself. That’s, that’s Chaos Theory.” — Dr. Ian Malcolm, Jurassic Park
Okay, you may be thinking, I saw Jurassic Park and as creepy as Jeff Goldblum’s character was, he did a decent job of laying out the fundamentals. I get it. But could that butterfly create a tornado?
Maybe. I don’t know. And that’s the whole point.
Lorenz didn’t answer the question because he couldn’t. The wind coming off a butterfly’s wings leads to that Texas tornado if thousands of other minute variables change in a certain way. Or it could turn into a cyclone off the coast of Florida if different conditions line up to create that situation. The sheer quantity of potential variables offers an unlimited amount of potential options — making any one outcome unpredictable.
In this way, most people misuse the term “butterfly effect.” We tend to associate it with a small input catalyzing a much larger outcome. As if it’s a lever that we can use to magnify our influence and help drive a major impact.
This is the opposite of Lorenz’s point. He showed that these small impacts may have a major influence on the system. But they could also have no effect at all. It’s not about the size of the impact. It’s about the unpredictability of the impact.
If you release a crate of butterflies in Brazil, you’ll create variations that may increase the likelihood of that Texas tornado. But your actions could just as easily prevent one and put thousands of Texans in your debt. There’s no way to know. It’s simply too complex.
Okay, you might be thinking now, question answered, but not really. That was sort of a cop-out. And sure, I’ll stop referring to my push-up habit as a butterfly effect. But didn’t you say something about this making me a better leader?
I’m so glad you asked.
You Can’t Control Complexity.
“If everything seems under control, you’re just not going fast enough.” — Mario Andretti
The internal combustion engine in a car is a complicated system. It has a lot of parts. And every time I take one apart, I have plenty of issues reassembling it. But you can define the different relationships between these parts. Each component plays a definitive role in the system and it’s not difficult to map out the response when fuel’s injected into the cylinder or when current goes through a spark plug.
Weather’s different. It’s a complex system. Each variable interacts with many others, creating interdependence across the entire system.
Other complex systems include the environment, financial markets, and people’s behaviors. While we can understand the different components of these systems, the high level of interdependence makes predicting specific outcomes nearly impossible, regardless of what your financial advisor may tell you.
The other thing to know about complex systems is that they’re adaptive. They actively respond to what happens. Species adapt to changes in the environment. Markets respond to changing innovations and growth. And people will generally adapt within changing circumstances and use them to their advantage.
If you’re leading people in these environments, the key is to leverage this. Don’t try to predict future behavior and control people toward achieving it. Since you can’t predict that future, leading this way is like driving using your rear-view mirror. It might work on a straight road, but the moment the road curves, you’re in trouble.
Don’t try to control complexity. Create an environment where people can adapt within the complexity.
People intuitively know this. They know that they’re at their best when they have specific goals but broad latitude to accomplish them. They know that simply executing a list of tasks is a good way to turn off their brains and, lemming-like, follow orders even if it leads them over a cliff.
Yet somehow, when people get that manager title, they tend to forget it. They prioritize control, even though that’s not how they’d want to be managed.
“The more you tighten your grip, Tarkin, the more star systems will slip through your fingers.” — Princess Leia, Star Wars
It’s often said that change is inevitable. And if you take vending machines out of the equation, this is more true today than ever before.
Since we can’t predict how people will behave in the face of uncertainty, trying to control their behaviors in advance is a recipe for disaster. In this instance, our instinct for control is self-defeating.
Think about thriving organizations in today’s world. Their success comes from adapting with developing trends and evolving breakthrough ideas. It’s rarely due to enforcing blind compliance to a set process.
There’s a lot of talk about agility in organizations today. It’s moved beyond software development and become a buzzword in most senior management meetings and strategies. Yet talking about agility doesn’t make it happen. Telling employees to be agile while also demanding they follow the procedures and do as they’re told doesn’t work.
Twenty years ago, 17 programmers got together and wrote the “Manifesto for Agile Software Development.” Its four simple statements express the core values for promoting agility over control:
Individuals and interactions over processes and tools.
Working software over comprehensive documentation.
Customer collaboration over contract negotiation.
Responding to change over following a plan.
Think about your own company. Are you embracing these principles or simply telling people to be agile because management says so?
Processes and tools can be helpful. But when they overly control individual initiative and collaboration, they stop being tools and start acting as constraints. The point is to give people guidance and support to make the right decision, not force them into it.
Documentation offers management another method of control, particularly on projects that bring high degrees of risk. In today’s world, comprehensive business cases and extensive design documentation are less useful than working prototypes to test problem and market assumptions.
Contracts are important to align everyone to the same set of expectations. But most are full of legal terms and conditions that you need a law degree to understand. Regardless of what they say, they communicate the very clear message that if something goes wrong, we’re not in this together. They also limit flexibility. As the situation changes, our contracts should be a tool to help us collaborate on a better path, not hold us hostage to outdated stipulations.
Plans are important to outline a path. But it’s always easiest to plan derivative work, leading people to avoid reaching for bold new goals. You also can’t plan your way out of problems. The more time you spend developing a plan and perfecting an approach, the more likely you are to become attached to it. The best plans aren’t seen as a means of management control, but offer a variety of options and decision points to adapt within a changing world.
All of these ideas seek to do one thing: reduce centralized control and distribute it to the people doing the work. As you look at your own company, where are you giving people the freedom to adapt within a complex world? And where are you still trying to drive by looking through the rear-view mirror?
Don’t Try to Control Complexity. Adapt with It.
“Intelligence is the ability to adapt to change.” — Stephen Hawking
Like Lorenz’s butterfly effect, the major impacts of many of our daily decisions won’t reveal themselves for some time. And given the complexity of our environments, the direct role of any one decision is often lost in the noise of the system.
Fortunately, we have opportunities that Lorenz never did. Instead of trying to predict outcomes based on a litany of inputs, we can adjust those inputs along the way. We still can’t predict the results of complex systems, but we can better adapt along with them.
The key is to give up on control. The key is to stop trying to drive by looking through the rear-view mirror. And move to a model that gives that authority to those who are doing the work.
The key is to lead people as each of us would like to be led. Don’t try to control complexity. Give people the freedom and support to adapt with it.