Scylla and Charybdis: The Psychology of High Stakes Decision-Making

Richard M. Adler
5 min read · Nov 26, 2019


In Greek mythology, the hero Odysseus was compelled to sail through a perilous narrow strait on his long journey home from the Trojan War. The strait was guarded by two formidable monsters: Scylla, a six-headed sea serpent, perched on the rocks on one side, while Charybdis, an enormous whirlpool, loomed on the other side of the channel. Following the advice of the sorceress Circe, Odysseus chose to steer clear of Charybdis, only to lose six of his crew to Scylla when they became transfixed by the fearsome whirlpool as they sailed past it.

The modern expression “caught between a rock and a hard place” derives from this story; it refers to situations in which people must choose between two equally unpleasant options. The myth of Scylla and Charybdis also provides an apt metaphor for businesses and other organizations facing critical decisions. In this context, the dilemma lies not in choosing between two problematic alternatives, but in navigating between two inescapable risks that arise in the process of making high-stakes decisions.

The Law of Unintended Consequences (LUC) states that decisions to intervene in complex situations create unanticipated and often undesirable outcomes. There are two primary causes of LUC: cognitive biases and bounded rationality. Biases refer to the flawed judgments and choices that often result when intuitions honed for assessing and responding to strangers and everyday situations are applied to much more complicated problems, such as corporate mergers, growth strategies, and new policies or regulations. For example, we tend to interpret situations in terms of vivid stories, stereotypes, or other statistically invalid samples, and to bring weakly relevant evidence or analogies to bear. We rely on simplistic rules of thumb to predict how situations will change and how actions will play out over time. Most of these intuitive judgments occur reflexively, below conscious awareness, making them impossible to avoid. Adding insult to injury, our beliefs, desires, and feelings, such as overconfidence, fear, and peer pressure, further bias what data we choose to gather, how we weigh it, what decision options we formulate, and how we evaluate them.

The other cause of LUC, bounded rationality, refers to constraints on human capacities to reason deliberately about complex environments such as organizations, markets, and societies. For example, the information that we can obtain for critical decisions is generally incomplete and imprecise (e.g., market data, competitor strategies, and customer preferences). Equally important, social scientific knowledge of individual, group, and societal behaviors is imperfect. This prevents us from predicting the evolution of market or social situations, much less the outcomes of candidate decisions intended to respond to or shape them. These problems are aggravated by the inherent uncertainty of future forces, trends, and events. Bounded rationality precludes the possibility of discovering “optimal” critical decisions; at best, we can devise and identify options that are relatively satisfactory.

The process of making critical decisions corresponds to the narrow strait that Odysseus was forced to navigate as he sailed for home. Leaders are compelled to make critical decisions in order to respond to looming problems or opportunities. The monster Charybdis corresponds to cognitive biases, which draw businesses and governments unwittingly into flawed decisions that produce unexpected and undesirable outcomes. Bounded rationality maps to Scylla, a set of human limitations that undermines our best efforts to make decisions rigorously and correctly in complex situations.

Most accounts of critical decision-making focus on one cause of LUC or the other, and propose measures restricted to mitigating that single danger. Discussions that favor a psychological orientation concentrate on decision errors induced by cognitive biases. They prescribe debiasing as the remedy: a set of techniques that compensate for our intuitive foibles by overriding flawed judgments or choices with more carefully reasoned ones. For example, action-oriented biases such as overconfidence encourage cutting corners in the decision process, resulting in precipitous and often overly aggressive decisions. These tendencies can be countered by adhering to disciplined decision processes, paying more attention to unfavorable data and conflicting opinions, generating more credible alternatives, and considering risks more conscientiously. By contrast, decision scientists identify bounded rationality as the primary culprit. They prescribe analytic and simulation techniques to refine judgments (e.g., about preferences, classifications, and probabilities) and to improve projections and comparisons of decision options. Decision sciences push back the bounds of rationality by amplifying our limited capacities to detect useful patterns in large data sets, project complex situational dynamics, and compare outcomes quickly, consistently, and at scale.
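As a concrete illustration of what such tools automate, consider the minimal Python sketch below. It projects outcome distributions for two candidate options via Monte Carlo simulation; every option name and number in it is invented for illustration and is not drawn from any particular decision-science product.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def project_outcome(revenue, growth_mean, growth_sd, years=5):
    """Project one possible five-year revenue path under noisy annual growth."""
    for _ in range(years):
        revenue *= 1 + random.gauss(growth_mean, growth_sd)
    return revenue

def monte_carlo(option, trials=10_000):
    """Estimate the distribution of outcomes for one decision option."""
    results = sorted(
        project_outcome(option["revenue"], option["growth_mean"], option["growth_sd"])
        for _ in range(trials)
    )
    return {
        "p10": results[int(0.10 * trials)],  # downside case
        "median": results[trials // 2],
        "p90": results[int(0.90 * trials)],  # upside case
    }

# Two hypothetical options: aggressive expansion vs. steady organic growth.
options = {
    "expand": {"revenue": 100.0, "growth_mean": 0.12, "growth_sd": 0.15},
    "steady": {"revenue": 100.0, "growth_mean": 0.06, "growth_sd": 0.04},
}

for name, option in options.items():
    stats = monte_carlo(option)
    print(f"{name}: p10={stats['p10']:.1f}, "
          f"median={stats['median']:.1f}, p90={stats['p90']:.1f}")
```

The value of such a projection lies in the spread, not just the central estimate: an option with the better median can still carry the worse downside, a comparison that unaided intuition handles poorly at scale.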

The metaphor of Scylla and Charybdis offers a vivid reminder of the dangers of these siloed approaches. Odysseus was forced to deal with both threats in the strait. Similarly, businesses trying to navigate critical decisions can’t escape the dual menaces of cognitive biases and bounded rationality. Ignoring either peril invites calamity. Without vigorous defenses against cognitive biases, we are vulnerable to flawed judgments and choices; these faulty inputs can compromise an otherwise rigorous analytical decision-making process. For example, biased intuitive assessments about the current situation, the effects of proposed actions, or future conditions “contaminate” the deliberate formulation and evaluation of decision options. But neglecting the potent tools offered by decision sciences is equally problematic. Debiasing only reduces the frequency and severity of flawed judgments and choices. It does nothing to extend our abilities to develop and validate complicated decision options for navigating complex environments. Tools for simulation and analysis are just as necessary as debiasing to minimize the ravages of LUC.

My book, Bending the Law of Unintended Consequences, introduces a method for “test driving” critical decisions. Following the strategy of Odysseus, this method plots a course for decision-makers that minimizes exposure to both causes of LUC. It defends against cognitive biases by using models tailored to particular types of critical decisions, such as managing risk, growth, and competition. Each model is crafted from expert knowledge about the minimal set of data inputs that is necessary, the ingredients of a “complete” decision option, and how to simulate the effects of executing decision options in complex environments. The resulting models act as templates that reduce the opportunities for decision-makers and analysts to unwittingly inject flawed judgments, decision options, or comparisons. The test drive method also resists bounded rationality by using powerful “what-if” simulation and analytic tools to project and compare the likely outcomes of alternative decisions. Crucially, a test drive simulates each available decision option many times, against different sets of plausible assumptions about future conditions and the “physics” of implementing decisions in complex business and social environments. This approach insulates decision-makers from choosing an option that performs well only relative to a single fragile set of assumptions about the current situation, future conditions, and imperfectly understood dynamics.
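To make the contrast with single-scenario analysis concrete, here is a minimal sketch of the underlying idea in Python. It is not code from the book; the scenarios, options, and payoff model are all hypothetical stand-ins. Each option is simulated under several distinct assumption sets and then ranked by its worst-case performance across all of them.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical assumption sets ("scenarios") about future conditions.
SCENARIOS = {
    "boom":       {"growth_mean": 0.10, "growth_sd": 0.05},
    "stagnation": {"growth_mean": 0.01, "growth_sd": 0.03},
    "shock":      {"growth_mean": -0.05, "growth_sd": 0.10},
}

# Hypothetical decision options; "sensitivity" captures how strongly each
# option's payoff depends on how the environment actually evolves.
OPTIONS = {
    "aggressive": {"investment": 40.0, "multiplier": 1.4, "sensitivity": 8.0},
    "cautious":   {"investment": 10.0, "multiplier": 1.3, "sensitivity": 2.0},
}

def simulate(option, scenario, trials=5_000):
    """Average net payoff of one option under one scenario's assumptions."""
    total = 0.0
    for _ in range(trials):
        growth = random.gauss(scenario["growth_mean"], scenario["growth_sd"])
        gross = (option["investment"] * option["multiplier"]
                 * (1 + option["sensitivity"] * growth))
        total += gross - option["investment"]
    return total / trials

# "Test drive" every option against every scenario, then rank by worst case,
# so the choice is robust rather than tuned to a single set of assumptions.
for name, option in OPTIONS.items():
    by_scenario = {scn: simulate(option, s) for scn, s in SCENARIOS.items()}
    worst = min(by_scenario.values())
    print(name, {k: round(v, 1) for k, v in by_scenario.items()},
          "worst case:", round(worst, 1))
```

Ranking by worst case is only one possible robustness criterion; minimax regret or a weighted average across scenarios would slot into the same loop.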

Odysseus worked to avoid both Scylla and Charybdis. He was only partially successful, but he did save his ship and most of his crew; most ships attempting to pass through the strait were destroyed by one monster or the other. Decision-makers must adopt a similar strategy, coordinating defenses against both cognitive biases and bounded rationality to avoid “train wrecks” and minimize negative unintended outcomes.


Richard M. Adler

My interests include critical decision-making (see my book “Bending the Law of Unintended Consequences” [Springer, 2020]), AI, and philosophy of physics.