Human Biases Can Be Adaptive

A Counterpoint to Kahneman et al. (2011)

--

A working paper

In the interest of productive dialectic, I present some alternatives to "The Big Idea: Before You Make That Big Decision..." by Daniel Kahneman, Dan Lovallo, and Olivier Sibony, published in the June 2011 issue of the Harvard Business Review, in which prominent scientific findings about cognitive “bias” are presented as guidelines that can be utilized in business.

The content below was extracted from my communication with colleagues in July 2011 about the relevance of Professor Kahneman’s increasingly popular ideas, which extrapolate from scientific research in the laboratory, to our work on decision making “in the wild”: decision making that is fundamentally situated and social, intersubjectively verifiable, subject to validation by observable outcomes, and reflective of both choice and responsibility.

Why revive this counterpoint to Kahneman now? There are three reasons. One is its relevance to collaborative inquiry about prosocial communities for online games. Another is that, over the last three years, I believe there have been persistently narrow interpretations and potentially counterproductive applications of Professor Kahneman’s important work. And, most recently, there is continuing attention to that work outside the scientific community that I believe fosters misinterpretations and misapplications of it. See, for example, a recent video interview of Daniel Kahneman on Inc.’s Idea Lab.

What to do about individual bias?

The immediate question is what one does with identified biases. Does one eliminate them in a search for the golden idol of objectivity? Or does one utilize any given bias as a deep perspective on the world (on a decision) that is grounded in a particular individual's extensive experience with the world? The former is helpful, perhaps, only to isolated decision makers in an unchanging world. The latter is helpful to groups that thrive in times of change. It allows for intersubjective validation as well as confrontation, and it fosters discovery of new solutions across complementary perspectives. It emphasizes that one should allow oneself (and one's biases) to be confronted with a different perspective. It demands a second-person standpoint, as opposed to a third-person standpoint, in leadership.

Related to this is the question for business of how to improve decision making. Eliminate bias or utilize bias? There is a good reason why the bias paranoia of behavioral economics hasn't had the impact that many would have expected. The notion of removing bias (e.g., through homogenization and de-individuation) is anathema to the notion of innovation and, arguably, to the American experience. Fortunately, in the decades (a generation, really) since the psychology on which behavioral economics is based was developed, science has progressed by leaps and bounds in understanding what it means to be human. In particular, the most salient psychology in board rooms these days, if not in popular society, is the psychology of creativity, innovation, collaborative innovation, and collective intelligence. Even science itself is looking inward at such issues, and at its own dead ends of parochial pedigree, in an effort to understand and promote transdisciplinary science.

Individual bias as diversity enables collective wisdom through crystallization (from Riccio & Darwin, 2010)

A twelve-question checklist

From this perspective, I suggest re-interpretations of the twelve-question checklist offered by Kahneman et al. (2011):

1. "Rationalization" is good because it is inseparable from the process of making sense of a new idea. If one is to communicate a strategy, one must translate the strategy in terms of experiences that are familiar. The experiences that matter most in a business context, of course, are the ones that exemplify what one has been able to do or to get done. If this inclines one toward empire building, what is wrong with that as long as others are involved in the decision making? At least there will be options on the table that are more likely to be executable insofar as they are grounded in the commitment of the proponent. The alternative to rationalization bias is abstractness and personal disconnection from an idea.

2. "Affect heuristics" are good to the extent that they reveal sources of motivation once the plan meets reality. In execution, plans will encounter friction, opposition, competition, ambiguity, and diversion that require perseverance and even dauntlessness. Affect bias is invaluable under such circumstances and it is not blind, for example, if others are involved in execution. Affect bias also is not blind because it elicits more salient emotions when expectations are disconfirmed. If one is trained to use one's emotion experience rather than to ignore it, affect can be a sentinel for unexpected events and one's tendencies to react to them. In current scientific parlance, emotion is engagement with the world.

3. Lack of "diversity" is a source of bias that is, in fact, problematic, but it is not cognitive bias. It is organizational (or sociological) bias. The only psychological issue addressed in this item on the checklist is the perseverance or dauntlessness required to overcome conformity. See item 2 above for clues about the sources of this personal agency, which we might not want to eliminate.

4. "Saliency" bias is good insofar as it reflects grounding of some idea in one's past experience. It speaks directly to the ability to execute a plan or decision. If you can't implement a decision, what is the point. I see no way of addressing strategy execution without eliciting personal salience, that is, without understanding how a decision can be implemented in ways similar to actions and accomplishments in the past experience of stakeholders. Notice the personal significance, and juxtapose the personal significance that an idea has for a diversity of stakeholders but certainly don't try to eliminate it or suggest that it is problematic. The alternative to saliency bias is abstractness and speculation about executability.

5. Credible alternatives are good, but to suggest that they can be generated "objectively" (presumably as opposed to subjectively) creates an infinite regress of logical problems. Moreover, such folly would make it more difficult to encourage "genuine admission of uncertainty," which is the most important recommendation in this item of the checklist. A more serious problem with this recommendation is that it imposes an arbitrary framework on decision making, specifically that alternatives have to be considered concurrently and before the fact (before execution). This is not the way decision making works in the real world, such as in continuous experimentation.

6. It is good to consider what you don't know but, as in item 5, the problem with this item on the checklist is that it imposes an arbitrary framework on decision making. In real-world decision making, it is more important to educate the attention by keeping critical information requirements in mind during execution of a plan or follow-through on a decision. These requirements typically relate to make-or-break assumptions in the decision making that occurred before the fact. Being ready for the unknown (e.g., to notice it) also is well served by hunches and intuitions, which essentially are incomplete thoughts or emotions that suggest the violation of expectations or the emergence of new knowledge.

7. "Anchoring" bias is a good thing insofar as it reflects implicit models about cause-effect relationships or the meaning of some fact or observation. Again the solution here is to consider multiple models and their associated anchors not to eliminate cause-effect assumptions entirely. Diversity in a group makes this easier as an organizational solution rather than a individual-psychological solution.

8. Halo effects are good insofar as they introduce parsimony into decision making, and this parsimony greatly facilitates and simplifies communication. Again, the organizational solution of diversity in a group is the answer, not encouraging individuals to be more abstract or unnecessarily complex in their thinking (which is exactly what will happen in a narrow group if one is concerned about halo effects). One should strive for diversity that creates tension and balance with respect to the most likely halo effects. If one is concerned about the halo of leadership, for example, it would be wise to include individuals who differ with respect to their status in an organization and their views of leadership.

9. In the pure (experimental) case, the sunk-cost fallacy is problematic, but in almost no real-world case are the assumptions behind sunk cost as simple, and as clearly vitiated, as in experimental or hypothetical cases; perhaps this is why the tendency appears in the absence of any complexity (in the laboratory). In any case, it is good to consider potential sunk-cost effects, because doing so tends to reveal the differing assumptions of a diversity of people who have different stakes, experience, or depth of knowledge concerning the sunk cost. Again, authentic diversity is the key here, not avoiding digging as deeply as possible into the lessons learned from prior experience.

10. Inside views are important because, without them, you cannot really evaluate an outside view or perhaps even recognize what an outside view is. One certainly can't war-game without inside views as well as outside views. Again, it is useful to have multiple viewpoints and to juxtapose them, both to understand each one better and to have the potential to realize emergent properties. In any case, the point is not to eliminate "biases" like the planning fallacy but to be educated by them.

11. This item in the checklist should be recognized as an act of desperation necessitated by the arbitrary framework of planning that is done entirely up front. It is good that we are biased not to take flights of fancy too far into the extreme of the unknown; that simply gets too far from what we can actually learn about. It wastes time insofar as it doesn't educate the attention for anything that is likely to happen during execution of a plan. It is better to be prepared to re-plan. For re-planning, one should prepare to observe and even elicit variability (for the purpose of exploration) with respect to the most critical assumptions or constraints of a plan. Thought experiments are entertaining, and they are all you have when you can't translate theory into action; by definition, however, that is not the case in planning.

12. Loss aversion is good insofar as it reflects the fundamental human capacity to perceive the world in terms of personal meaning or consequences. The solution, again, is diversity of personal meaning, not meaninglessness.

Notes

Human beings are social animals. All psychological attributes of individuals should be viewed in this context.

Evidence-based decision making is not merely fact-based decision making. It is about an iterative cycle of planning, execution, and re-planning that combines collaborative reflection on shared or sharable experience with acute awareness of experience as it unfolds (as in hypothesis formulation and testing).

See the July 2011 issue of the Harvard Business Review dedicated to collaboration in business for much more useful and practical recommendations about how to utilize personal experience and intuition ("bias") in organizations.

On individual bias and collective clarity

The points below are a response to the recent video interview of Daniel Kahneman on Inc.’s Idea Lab.

Bias = focus: Utilize a diversity of cognitive biases through collective intelligence to appreciate multiple facets of a situation. We are social creatures. Our apparent flaws as individuals (idiosyncrasies?) necessitate the socially extended self and, arguably, our idiosyncratic biases are necessitated by the collective intelligence of the socially extended self.
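One way to make the "bias = focus" claim concrete, using an identity that is not in the original text (it is sometimes called the diversity prediction theorem, after Scott Page), is as follows. If n individuals produce estimates x_1, ..., x_n of some true value θ, and x̄ is the group's average estimate, then simple algebra gives:

```latex
% Collective error = average individual error - diversity of estimates.
(\bar{x} - \theta)^2
  = \underbrace{\frac{1}{n}\sum_{i=1}^{n} (x_i - \theta)^2}_{\text{average individual error}}
  - \underbrace{\frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x})^2}_{\text{diversity of estimates}}
```

The group's error equals the average individual error minus the diversity of the estimates. Individually biased estimates therefore do not preclude collective accuracy; on the contrary, the more the individual biases diverge while bracketing the truth, the better the group does relative to its average member.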

Emotion = bias = focus: Damping emotion and the attendant bias is like de-focusing a lens or turning down the gain on a sensor. You will be less likely to sense something you might misinterpret, but you also will be less likely to sense something you would interpret correctly. The solution is to utilize multiple sensors and, as in adaptive dual-control systems, to act and decide over longer time scales based on a multiplicity of data sources explored actively over shorter time scales. For human beings, this is accomplished in a group of mindfully imperfect individuals. Passion can inform reason, as David Hume argued.
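To make the sensor analogy concrete, here is a minimal simulation of my own; the specific biases, gain, and averaging window are illustrative assumptions, not anything proposed in the original article. Each "sensor" stands in for an individual with a fixed idiosyncratic bias. Damping one sensor's gain attenuates the signal along with the noise, whereas pooling undamped, diversely biased sensors over a longer window recovers the signal:

```python
# A sketch of the lens/gain analogy: several "sensors" (individuals),
# each with a fixed idiosyncratic bias, estimate the same signal.
import random

random.seed(1)

TRUE_SIGNAL = 10.0
BIASES = [-3.0, -1.0, 0.5, 1.5, 2.0]  # diverse biases, roughly zero-mean
NOISE = 1.0                           # per-reading noise (std. dev.)
WINDOW = 50                           # the longer time scale

def reading(bias, gain=1.0):
    """One short-time-scale reading from a biased, noisy sensor.
    Lower gain 'damps' the sensor: less noise, but less signal too."""
    return gain * (TRUE_SIGNAL + bias + random.gauss(0.0, NOISE))

# A single damped sensor: attenuates the signal along with the noise.
damped = sum(reading(BIASES[0], gain=0.2) for _ in range(WINDOW)) / WINDOW

# A group of undamped, diverse sensors pooled over the same window.
pooled = sum(
    reading(b) for _ in range(WINDOW) for b in BIASES
) / (WINDOW * len(BIASES))

print(f"true signal:            {TRUE_SIGNAL:.2f}")
print(f"one damped sensor:      {damped:.2f}")   # far from the signal
print(f"pooled diverse sensors: {pooled:.2f}")   # close to the signal
```

This is the computational face of the identity above: the pooled estimate is accurate because the individual biases are diverse and roughly cancel, not because any individual sensor was de-biased.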

Metacognition => Emotional Intelligence => Collective Intelligence: Kahneman's "system 2" can involve metacognitive awareness of the biases of "system 1," and this self-understanding can be utilized strategically, especially in groups that intentionally develop a deep understanding of the various emotional lenses of the individuals in the group. An important implication for collective intelligence, and for a coherent diversity of quality, is that groups can benefit by putting individuals in extreme (in extremis) situations or mindsets that amplify their individual biases so that, in the group as a whole, the various facets of a situation are seen more clearly.

Gamers and military leader developers may understand more than behavioral economists about human decision making "in the wild." Kahneman's work was worthy of the Nobel Prize (in economics, by the way) for bringing economists halfway to the reality of human decision making. We need to look elsewhere to make the rest of the journey from the laboratory to the reality of the social environment.

Gary E. Riccio, Ph.D. (revised March 27, 2014)

--


Science in the Wild
3.1 - Collective Intelligence in Scientific Inquiry

Conversations about various manifestations of science in business that address public needs and engagement in the experience economy (launched February 2014)