The cost of complexity

Why getting it right requires a rethinking of how we get it wrong

Samuel Bernstein
The Rubicon
May 4, 2015 · 5 min read


How did this go so wrong? The VA was supposed to build a 15-building hospital in Aurora, Colorado, for $630 million by the end of 2014. Now, the earliest the hospital will open is 2017, and the price tag will be at least $1.73 billion.

Congress has kicked into gear to channel our popular sense of incredulity. A group of lawmakers from Washington recently toured the construction site. “We’ve got to hold those accountable who are responsible for this, and we’ve got to make sure it never happens again,” said Colorado Rep. Mike Coffman, a Republican.

For many Americans, it would be a troubling story if it weren’t so familiar. The government may do a lot of good, but it certainly can’t seem to do it on schedule or under budget.

Consider the F-35 Joint Strike Fighter. Marketed as a wonder weapon that will be a mainstay of the Pentagon’s arsenal for decades, the project is seven years behind schedule and already at least $167 billion over budget. All told, the program is expected to cost nearly $1.5 trillion over the lifetime of the aircraft.

Americans have a right to be upset when their government doesn’t meet its commitments. But outrage only gets us so far. Do we ever actually learn anything about how and why these large projects falter?

Complexity and uncertainty

Part of the answer has to do with the relationship between complexity and uncertainty. It makes intuitive sense that as a process becomes more complex, its final outcome becomes more uncertain. Statistics has a name for how this compounding works: the propagation of uncertainty.


The important thing to know is that it deals with how the uncertainty of individual variables within an equation affects the uncertainty of the equation as a whole. It’s not a perfect metaphor, but try thinking of a project plan as a math problem where each variable is the amount of time a step requires. Each variable, however, is only an estimate, and therefore somewhat uncertain.
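
For readers who want the textbook version: if the total schedule is a function f of the individual task durations x_1 through x_n, each with its own uncertainty σ, the standard first-order formula (a sketch of the general case, assuming the estimates are independent) is

    \sigma_f \approx \sqrt{ \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} \sigma_{x_i}^{2} }

For a schedule that is a simple sum of tasks, every partial derivative equals 1 and the individual uncertainties add in quadrature. When tasks are positively correlated, say the same late cement trucks hold up several steps at once, the combined uncertainty grows larger still.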

For instance, the cement trucks may not show up on time. If the trucks are unlikely to show up on time, then there is also less certainty about how long it will take to pour the foundation. And so on. In this way, uncertainty can cascade through a project.

With large projects that have many interconnected steps, even a small amount of uncertainty at the individual-task level can quickly “propagate” into extreme cost and scheduling overruns. Therefore, accuracy in predicting how many resources individual steps will require becomes exponentially more important as a project grows in size.
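
To make the cascade concrete, here is a toy Monte Carlo sketch in Python. Every number in it is invented for illustration; nothing is drawn from the VA project or any real schedule. Each of 50 sequential tasks is estimated at 10 days, and each estimate is honest in the sense that the median simulated duration matches it, but delays are more likely than early finishes:

    # Toy simulation (all numbers invented): 50 sequential tasks, each
    # estimated at 10 days. Actual durations are lognormal, so each
    # task's *median* matches its estimate, but the distribution has a
    # long right tail (delays beat early finishes).
    import math
    import random
    import statistics

    random.seed(42)

    N_TASKS = 50        # sequential steps in the plan
    MU = math.log(10)   # median task duration = 10 days = the estimate
    SIGMA = 0.4         # per-task uncertainty (made-up, right-skewed)
    N_RUNS = 10_000     # number of simulated projects

    plan = N_TASKS * 10  # 500 planned days
    totals = [
        sum(random.lognormvariate(MU, SIGMA) for _ in range(N_TASKS))
        for _ in range(N_RUNS)
    ]

    print(f"planned:         {plan} days")
    print(f"median outcome:  {statistics.median(totals):.0f} days")
    print(f"95th percentile: {statistics.quantiles(totals, n=20)[-1]:.0f} days")

With these invented parameters, the median simulated project comes in roughly eight percent over the 500-day plan, and the unlucky tail runs far longer, even though no single estimate was biased high. Skew plus scale is enough to make overruns the rule rather than the exception.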

The Planning Fallacy

Now for some more bad news. People are systematically terrible at predicting how much time (and, by extension, money) tasks require. What’s more, when planners predict their own work, they consistently underestimate how long it will take.

Nobel Prize-winning psychologist Daniel Kahneman and his collaborator Amos Tversky called this the Planning Fallacy, and it has been repeatedly observed in projects large and small. For instance, one study of Canadian taxpayers found people consistently filed their tax returns about one week later than they had intended. Essentially, we are a species of wishful thinkers.

Now, most people would not be extremely critical of a person who was a few days late with the phone bill. But when similarly minor errors occur in the context of a massively complex and interconnected process, the end results can be calamitous.

Solving the problem

These dynamics don’t excuse the VA’s gross mismanagement of the Aurora hospital project. Planning complex projects is a discipline in and of itself, and there are numerous well-tested methodologies to reduce the risk of cost and scheduling overruns.

The VA is organizing an investigation into the project that will hopefully yield more insight into what went wrong. However, the frequency of such missteps — from the F-35 to the bumpy launch of the insurance exchanges — suggests there are systematic problems with how the government (and others) tackles complex projects.

And make no mistake: the planning problems of the 21st century are problems of complexity. We are increasingly building systems of systems, all of which must integrate to perform as expected.

For instance, it is wrong to think of the F-35 as just another airplane. It is innovative because it ties the aircraft’s various sensors, weapons, and mechanical features into an integrated whole. Its many software systems must all talk to one another. It is even designed to proactively monitor itself for mechanical failures and notify maintainers of potential problems.

That feature was integral to keeping the plane’s projected lifetime costs down. Unfortunately, it is also underperforming and behind schedule. It may be only marginally deficient, but when planning assumptions are this sensitive to minor disruptions, the consequences are substantial.

Getting it right

So what should be done? A key way to mitigate the negative effects of the Planning Fallacy is to have external auditors challenge the assumptions of a project.

In theory, this is already done as part of the government’s contract-selection process. But budget pressures have increasingly hollowed out the government’s contracting and acquisition workforce.

The Department of Defense may be learning from past mistakes. It has placed a special emphasis on protecting — and even expanding — its acquisition workforce as its overall budget declines. Why? Because it knows the return on investment for having good program oversight is substantial.

Such external checks are especially important when contracting for large projects. The Planning Fallacy is at work in many scenarios, but the incentives of competitive contracting arguably make things worse. When competing for business, a company’s tendency to use optimistic assumptions — even subconsciously — is placed into overdrive.

To err is human

At the implementation level, planners may be able to guard against errors by pushing more formal quality-control processes down to workers on the front lines. In his book, “The Checklist Manifesto: How to Get Things Right,” Atul Gawande suggests such an approach.

Essentially, Gawande argues that complexity is outstripping our mental capacity to function effectively in the modern world. His solution is simple checklists. For instance, in 2001 a five-point checklist used in the intensive care unit at Johns Hopkins Hospital essentially eradicated central line infections — saving an estimated eight lives over 27 months.

The key insight is that our assumptions about performance, accountability, and process improvement are often inadequate in the context of complexity. Addressing this challenge requires changes in both process and culture.

We can start by asking the right questions when things go wrong, and perhaps being a bit more humble when we critique the missteps of others. To err is human, but the nature of the modern world increases the consequences of those errors exponentially.
