Best Practices are Useless in Complex Systems

Jen Briselli
Published in Topology
May 3, 2023 · 7 min read

…and you’re probably working in a complex system.

Image of an old map with the words Terra Incognita
Are you WFTI (Working from Terra Incognita)?

Are most of the challenges you find yourself working on clearly defined problems with straightforward dynamics and obvious solutions?

Right. I didn’t think so.

Yet so many organizational leaders rely on “industry standard” tactics, or seek out “best practices” to guide their strategy, especially in times of uncertainty. It makes sense to some degree; the appeal of best practices exists because they have produced good outcomes in the past, and they do provide value in some domains. But in an increasingly interconnected world where technology, information, and customer expectations evolve at an accelerating rate, insights from past performance quickly become irrelevant in many scenarios and, worse, become red herrings that muddy priorities, hinder decision making, and stifle innovation.

As the saying goes,

Best practice is, by definition, past practice.

If the last few years have taught us anything, it’s that transformation is not a momentary disruption but a persistent evolutionary process. Ambiguity is not an obstacle to be overcome but the water we all swim in. Thriving in that ambiguity and facilitating positive change requires us to rethink our design strategies, organizational structures, and collaborative dynamics because we cannot control the future — only how quickly we learn and respond to it.

Best practices are a form of crystallized learning; they can be quite useful in simple or predictable scenarios, but that very same crystallization becomes brittleness when complex, fluid situations demand agility and creative adaptation. While highly ordered and predictable problem spaces do still exist, far more of the domains we operate in today behave like complex adaptive ecosystems rather than simple machines: think public health, emerging tech, consumer behavior, distributed organizations, etc. Traditional problem solving methods that rely on analysis-by-parts fall short in these domains because both the interdependent components themselves and the relationships between them are constantly changing, giving rise to unpredictable, emergent behavior.

Image from Back To the Future featuring Doc and the caption: Best Practices? Where we’re going there are no best practices.

Complex systems are characterized by self-organization and emergence: the behavior of the whole differs from the sum of its parts’ behaviors. Strategies rooted in best practice or industry standards, though well-intentioned, assume a stable environment. Complex systems, however, are dynamic and unpredictable. What worked for an organization a few years ago may not work today. What a competitor in your space is doing right now won’t necessarily work for you. Conventional “tried and true” approaches neglect second- and third-order consequences in both space and time. They also risk premature convergence: when system components act with agency rather than following mechanistic rules, blind spots around that agency perpetuate a false sense of security, one that prevents organizations from sensing emergent signals and adapting to changing expectations, markets, and resources.
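To make “emergence” concrete, here is a minimal sketch (my toy illustration, not from the article) using Conway’s Game of Life: every cell follows the same trivial local rule, yet a “glider” pattern travels diagonally across the grid, a behavior that belongs to the whole configuration rather than to any individual cell.

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next step with exactly 3 neighbors, or 2 if already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
# After 4 steps the same shape reappears, shifted one cell diagonally:
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

No cell “knows” it is part of a glider; the motion exists only at the level of the whole.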

So what should we be doing instead?

Know thyself/thy-system

For one thing, we must get better at understanding the types of domains we are working within. Dave Snowden’s Cynefin Framework is one valuable conceptual map that supports sense making and decision making by differentiating Clear, Complicated, Complex, and Chaotic scenarios:

Clear: Cause-and-effect relationships are clear and can be easily understood and managed with well-established best practices or standard operating procedures.

Complicated: Cause-and-effect relationships are not obvious, but can be discovered through analysis and expertise, and multiple solutions can be applied to manage the situation, requiring specialized knowledge and experience to solve the problem effectively.

Complex: Cause-and-effect relationships are unpredictable and require experimentation, sense making, and emergence to respond effectively, often involving collaboration, innovation, and adaptive approaches to address the situation.

Chaotic: No cause-and-effect relationships are evident, and the situation requires urgent action to establish stability before moving to a more manageable scenario, often requiring decisive leadership, rapid experimentation, and improvisation to survive the crisis.

Image representing the Cynefin framework outlining Chaotic, Complex, Complicated, and Clear scenarios and associated characteristics.
What type of scenario are you navigating/designing?
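Snowden pairs each Cynefin domain with a canonical decision sequence; sketched here as a lookup table (the sequences and practice labels are Cynefin’s standard ones, not from this article):

```python
# Cynefin's canonical decision sequences, one per domain. Note the
# ordering: in Complex domains you act (probe) before you can make
# sense, whereas in the ordered domains sensing comes first.
CYNEFIN_RESPONSES = {
    "Clear":       ("sense", "categorize", "respond"),  # apply best practice
    "Complicated": ("sense", "analyze", "respond"),     # apply good practice
    "Complex":     ("probe", "sense", "respond"),       # discover emergent practice
    "Chaotic":     ("act", "sense", "respond"),         # discover novel practice
}

assert CYNEFIN_RESPONSES["Complex"][0] == "probe"
```

The table makes the article’s point compact: best practice is the prescribed response only in the Clear domain.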

Of course, no particular domain remains static; most situations are fluid and will shift among these four types over time. As shared understanding of a system grows, there may be a clockwise drift from Chaotic through Complex and Complicated to Clear, while the reverse can happen as knowledge is clouded or the status quo is challenged.

There are many interesting dynamics that can play out, but one key takeaway is that we often mistake Complex scenarios for merely Complicated ones. Complex scenarios are not ordered and predictable the way Complicated ones are, yet too often leaders respond to them with decision making strategies that only work in ordered, predictable scenarios and are ill suited to Complex challenges. As Einstein (supposedly?) said:

Everything should be made as simple as possible, but not simpler.

No silver bullets

We also must be careful not to replace best practices with a reductionist model of “leverage points,” either. Donella Meadows writes eloquently about the temptation to reduce leverage points to silver bullets:

Folks who do systems analysis have a great belief in “leverage points.” These are places within a complex system (a corporation, an economy, a living body, a city, an ecosystem) where a small shift in one thing can produce big changes in everything.

This idea is not unique to systems analysis — it’s embedded in legend. The silver bullet, the trimtab, the miracle cure, the secret passage, the magic password, the single hero who turns the tide of history. The nearly effortless way to cut through or leap over huge obstacles. We not only want to believe that there are leverage points, we want to know where they are and how to get our hands on them. Leverage points are points of power.

But,

Leverage points are not intuitive. We intuitively use them backward, systematically worsening whatever problems we are trying to solve. The systems analysts I know have come up with no quick or easy formulas for finding leverage points. When we study a system, we usually learn where leverage points are. But a new system we’ve never encountered? Well, our counterintuitions aren’t that well developed. Give us a few months or years and we’ll figure it out. And we know from bitter experience that, because of counterintuitiveness, when we do discover the system’s leverage points, hardly anybody will believe us.

Systems thinking helps us identify leverage points, but instead of expecting them to serve up magic solutions and final answers, we must regard them as beacons that light up the places in a system where an experiment or intervention can teach us more about the system. They show us where we can enter into dialogue with a system, rather than control it.

So then what?

When you’re navigating (or designing experiences in) complex domains, ditch the industry standards and expert wisdom of simpler problem solving methods and embrace these more effective strategies instead:

Diverse Perspectives

Complex domains require a diversity of perspectives and experiences to fully understand and respond to emergent behaviors. Instead of relying on analysis-by-parts problem solving or elevating one particular expert model above another, seek out transdisciplinary viewpoints that apply different lenses to the situation, and incorporate them into your sense making and decision making process.

Experimentation

In complex situations, it’s impossible to predict the outcomes of a particular design intervention or other “solution,” no matter how careful the analysis or deep the expertise. Experimentation allows for continuous learning and adaptation, rather than being locked into a predetermined course of action, and it need not be cumbersome or costly. Pursue a range of lighter, faster, more distributed interventions with real time feedback loops to accelerate learning.
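One lightweight shape for “many small interventions with real time feedback loops” is the multi-armed bandit pattern from reinforcement learning (my analogy, not the author’s): run several cheap experiments in parallel and let observed feedback continuously reallocate effort toward whatever is working.

```python
import random

def epsilon_greedy(success_rates, rounds=5000, epsilon=0.1, seed=7):
    """Toy epsilon-greedy bandit. `success_rates` are the payoff odds
    of each candidate intervention, hidden from the decision maker."""
    rng = random.Random(seed)
    n = len(success_rates)
    wins, pulls = [0] * n, [0] * n
    for _ in range(rounds):
        if rng.random() < epsilon:           # explore: try one at random
            arm = rng.randrange(n)
        else:                                # exploit: back the current best
            arm = max(range(n),
                      key=lambda i: wins[i] / pulls[i] if pulls[i] else 1.0)
        pulls[arm] += 1
        wins[arm] += rng.random() < success_rates[arm]
    return pulls

# Three hypothetical interventions; feedback shifts effort to the winner.
pulls = epsilon_greedy([0.1, 0.6, 0.2])
assert pulls[1] == max(pulls)
```

No prediction of the winner was needed up front; the allocation emerged from the feedback loop itself.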

Learning Loops

Speaking of learning, one of the most critical aspects of system behavior is how effectively information flows within it, and establishing the necessary infrastructure (human and non-human) to empower feedback is a key strategy for adaptive learning organizations. Establishing both the cultural and tactical practices that empower triple loop learning ensures that we learn quickly from the signals we sense and the interventions we deploy. (By the way, Pavel Samsonov recently published a nice explainer relating second, third, and even fourth loop learning to product development and AI in particular).

Disintermediation

This is a fancy way of saying we must reduce the distance between the information needed to act and the people who need it and act on it. A conventional decision making hierarchy simply creates layers of abstraction that not only delay effective action but also introduce noise and interpretation, much like the children’s game of Telephone. In complex systems, we can understand the dynamics we observe and respond to them more efficiently by distributing decision making power; we can leverage the collective intelligence of the ecosystem by supporting informal networks that connect people to one another across silos and contexts.
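The Telephone effect can be put in rough numbers with a toy model (my illustration, with a made-up noise rate): if each layer of hierarchy independently garbles a signal with probability p, the odds that it arrives intact decay exponentially with the number of layers.

```python
def intact_probability(layers, noise=0.15):
    """Chance a signal survives `layers` relays unchanged, assuming each
    relay independently distorts it with probability `noise` (made up)."""
    return (1 - noise) ** layers

# Direct access vs. a five-layer reporting chain at 15% noise per hop:
assert round(intact_probability(1), 3) == 0.85
assert round(intact_probability(5), 3) == 0.444
```

In this caricature, distributing decision making power is simply reducing `layers`.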

Image of an old railroad track that runs into a body of water and disappears.

As Russell Ackoff says, leaders “are not confronted with problems that are independent of each other, but with dynamic situations that consist of complex systems of changing problems that interact with each other. I call such situations messes. Problems are extracted from messes by analysis. [Great leaders] do not solve problems, they manage messes.”

When you find yourself trying to solve isolated problems instead of managing the holistic mess, ditch best practices and embrace context-specific emergent practices by sensing first, then responding. The system will tell you more about itself than conventional wisdom or external sources can; you just have to be able to hear it when it does.

Jen is co-founder and principal at Topology and was previously Chief Design Strategy Officer at Mad*Pow. Find her on Medium and LinkedIn.
