A Nation of Strangers

Olson Zaltman
Mar 9, 2019 · 12 min read

Part I of a two-part essay on what divides us — and what can unite us

by Gerald Zaltman of Olson Zaltman and Harvard Business School

Today’s world of information consumption is puzzling. We are taught from a young age to take actions and base opinions on facts, and the internet makes it easier than ever to evaluate truth claims on most topics. At the same time, we see a steady decay in the use of facts. People seem increasingly inclined to let opinion determine what constitutes fact rather than the reverse. When opinion collides with fact, especially on social policies, opinion tends to prevail.


More and more people see one another as ignorant of the truth on topics like climate change, immigration, educational policies, free speech, evolution, gun control, inequities in the distribution of wealth, the use of vaccines, trade policy, and much, much more. How others think has become foreign to us, and as a result, we are becoming a nation of strangers, and not just strangers, but opponents. Each side sees the other as embracing opinion over facts — a division I call the “fake news social divide.” This divide, which runs through family, friends, and coworkers, is worrisome to me.

This essay is a call for constructive sharing of conflicting thoughts. It is meant for a very broad audience. It seeks to restore and even enhance the pivotal role that legitimate differences in opinion and interpretations of fact play in a healthy democracy. The essay is not about changing minds, resolving conflicts, or negotiating compromises. There are many other helpful guides for those activities. However, none of those activities can occur without first having constructive, respectful sharing of thoughts with those with whom we disagree.

As a nation, we need to find ways of handling legitimate, conflicting judgments about facts to spur productive discussion on social-policy divisions rather than using conflicting judgments as goads to deepen and widen those divisions, as we seem to be doing now. The metaphor of the American melting pot doesn’t — or shouldn’t — mean erasure of differences or subordination of all into an undifferentiated whole but rather the creation of a unique national stew in which each contributor adds a flavor or texture that is irreplaceable and without which the stew would be that much the poorer. We are at risk of forgetting the virtue of many flavors and textures as it applies to diversity in ideas.

As a nation, we need to find ways of handling legitimate, conflicting judgments about facts.

Part I of this essay provides background about how minds handle ideas, regardless of the position taken. The goal is to create more openness to hearing alternative positions. This is not the same thing as trying to persuade people to adopt a different position. Gaining a better understanding of and respect for what people of opposed views think is very difficult and requires patience. Resistance to this process is natural and is to be anticipated and respected. Our unexamined, preconceived notions and our resistance to hearing ideas we disagree with make discussion hard, but the results can be worth the effort.

Part II presents sample approaches to facilitate constructive interactions among conflicting parties. These approaches, called prompts, are just suggestions for making a conversation easier. They may be adapted according to the issue, participants, and settings involved. A major assumption is that goodwill exists among the partisans involved. I know that assumption exists among the friends, family, and coworkers with whom I differ on one or another social issue. I believe this is also true for the great majority of our nation’s citizens.

Part I: How Minds Operate

The need to know, to understand the world around us, is a core human need. Our happiness and survival depend on it. This need drives learning, imagination, problem solving, and adapting to change. It determines how we allocate attention, time, and other scarce information-processing resources among competing demands.

The need to know operates unconsciously, though it occasionally surfaces as a conscious feeling. For example, we become aware of feeling certain or uncertain, or confident or uneasy about something. We say we get cold feet or have second thoughts about a decision. Or we experience an aha moment or a gut feeling in which we suddenly believe beyond doubt that an idea or action is correct. Such experiences may be intense, and yet our awareness of our need to know remains slight.

Our sense of what we know, on the other hand, is outsized. Our minds play up our sense of our own knowledge while downplaying what we don’t know. This leads to a failure to critically question our own opinions and beliefs to see if they have factual support. I’ve called this “the fallacy of expertise.” It usually partners with confirmation bias — the tendency to seek supporting evidence for a favored position while avoiding evidence against it.

Our sense of what we know is outsized

There is benefit, of course, to having the need to know operate unconsciously. By operating below awareness, it saves us from “paralysis by analysis” in everyday matters. Imagine if you had to stop and think about which news channel to watch every time you turned on the TV, and further, if you had to check each statement made on the news program, and then the sources you consulted when checking those statements, and then the sources you used to vet the sources you consulted, and so on. Our conscious mind doesn’t have sufficient bandwidth to always be asking and answering the question, “How certain am I that I know what I need to know?” Instead, this question is addressed, if at all, in the unconscious mind, where it escapes scrutiny.

We can see how this plays out in today’s world through the following example. Consider an outlandish, impromptu claim made by a candidate at a rally. Imagine the claim receives immediate, rousing applause. Let’s say the audience believes from the outset that the candidate is a truthful person with their best interests at heart. These beliefs automatically confer validity upon a questionable claim. For supporters, a preexisting halo of faith and approval surrounds the candidate, so they give him or her the benefit of any doubt. Pausing, even briefly, to examine the validity of the new claim is unnecessary. Furthermore, others in the crowd are signaling their validation of the claim with cheers. People don’t stop to wonder if this manifestation of the wisdom of crowds is fallible.

The Trickster Mind and Neural Networks

When it comes to the need to know, the mind likes to play tricks. One distinguished psychologist describes the human brain as “a master of deception. It creates experiences and directs actions with a magician’s skill, never revealing how it does so, all the while giving us a false sense of confidence that its products — our day-to-day experiences — reveal its inner workings.”[1]

One such deception has just been noted: the belief that we know more than we actually do. This fallacy has the knock-on effect of hiding from us the incorrect judgments we make based on our assumption of being knowledgeable. Let’s explore this further.

Suppose we listen to a political debate and as a consequence decide to support one of the candidates. Describing events, we say to friends that this candidate’s arguments were superior and that won our allegiance.

Maybe. However, it is quite possible the reverse occurred: we found the candidate’s arguments to be superior because of an unconscious allegiance we felt prior to the debate. Our notion that we carefully considered each candidate’s statements in the debate and only then developed a reasoned preference for one candidate over the other may be an illusion. We want to believe we made a reasoned choice after careful consideration of each candidate’s merits — that is what we are taught to do. It is what we believe a reasonable person does, and we consider ourselves reasonable. But in fact, any number of unconscious preferences relating to gender, ethnicity, voice, facial characteristics, etc. may have had more influence over our decision.

Our notion that we carefully considered each candidate’s statements in the debate and only then developed a reasoned preference for one candidate over the other may be an illusion.

To understand how the trickster element of our mind is so powerful, it helps to understand how the brain’s neural networks operate. These networks are formed by connections among neurons. The more frequently the connections are activated, the stronger or more firmly “wired” together they become. Every idea is represented by a set of neurons. Each set, in turn, is wired to other sets relating to the same topic. For example, the set of neurons involved in liking a candidate may connect to the neurons involved in the willingness to make get-out-the-vote phone calls on his or her behalf, and both ideas may link to the set of neurons involved in being willing to donate to the candidate’s campaign. Operating together, these sets form a more comprehensive network — or what social scientists often call a framework or frame.

We have an incalculably large number of neural networks. They mediate everything we think about, regardless of how right or wrong that thinking might be. Different frameworks can also be connected to one another. Activating one may trigger activation of another. For instance, activating thoughts about international labor policy may bring to mind thoughts about immigration policy (or vice versa) since policies in the two areas may affect one another. Or, thoughts about a candidate’s ethnicity may activate thoughts about his or her position on immigration, which in turn may activate thoughts about the candidate’s trustworthiness.

As the neural connections involved in our image of a candidate — say, the belief that the candidate is caring, is of the “right” gender, and is trustworthy and intelligent — are reinforced by various sources of information, they become increasingly difficult to change. Opinions we encounter in sources that reinforce our image of the candidate are likely to be treated as “facts” and further strengthen these neural connections.

Opinions inconsistent with our image, on the other hand, will be ignored or interpreted as inaccuracies or lies, especially if those opinions come from a disliked news channel or disliked candidate. Indeed, when negative statements about our preferred candidate come from disliked sources, those statements reinforce our belief in our chosen candidate’s integrity. If we believe our candidate is an honest person, then we feel that sources saying that (for example) our candidate is dishonest must be mistaken at best or lies at worst. We feel no need to fact-check claims that suggest our candidate has told a lie.

This is also why media habits are hard to alter. Viewers of Fox News won’t watch MSNBC, and vice versa. The neural processes supporting these news consumption habits are difficult to change. People prefer the comfort of being told they are right to the discomfort of being told they are wrong and need to reassess their position. A powerful jolt is required to rewire media consumption habits. That jolt is unlikely to originate with our wired-in news channel preferences.

Strength in Connections

Because a given idea lives within a network of related ideas, any effort to introduce a new idea or modify an existing one must consider all linked ideas, not just the single idea to be changed. The entire system of ideas needs to be addressed. Suppose you support relatively unfettered admission of asylum seekers to the country, and you are arguing with those who fear that admitting asylum seekers will result in higher crime rates. In this situation, it is not enough simply to assert that allowing asylum seekers into the country will not increase crime rates. Your argument needs to consider other, related ideas in the framework, such as contributions immigrants make to (or burdens they place on) the economy, their participation in our military, the complexity of border security, the humanitarian value of a welcoming community, etc.

Your argument needs to consider other, related ideas in the mental framework….not just the single idea to be changed.

And ideas exist in more than one network, just as people do. Consider the multiple networks you belong to: networks of family, friends, online communities, coworkers, and so forth, each having some influence on you and through you on other people and groups you belong to. When trying to make a change, these related frames must also be considered. A health-related example may be helpful.

Some time ago, my firm, Olson Zaltman, conducted a smoking cessation study for a state department of health. This study revealed that many active smokers accept evidence that cigarette smoking is harmful to them. They continue smoking not simply because of the power of nicotine addiction but because of their strong attachments to other people who happen to be heavy smokers and because of their attachments to the settings — at work, church events, and other social situations — where their valued interactions with those people occur. Constantly being with smokers in these settings adds an impossible hurdle to changing their own behavior. To give up smoking would require giving up the valued social contacts and settings. Many smokers choose not to pay so high a price. The framework of ideas about proper health behaviors loses when it clashes with frameworks about the value of a specific job, a church community, or feelings about the importance of close friendships. Once that fact was realized, the health department’s personnel developed a program of “quitting buddies” to create alternative rich social connections for those seeking to quit — an effort that proved successful.

Back to Feelings

How do thoughts encountered in the news, at political events, in personal conversations, and elsewhere make their way into our unconscious thinking as “facts”? A few factors involved in this process are introduced below. Note the importance of feelings:

1. Frequent exposure to an idea helps embed it in our thinking. A single tweet by the president about a “witch hunt” can be rebroadcast multiple times a day by various news networks. This repetition gives the idea greater salience and power. Repeated information becomes more influential when we are told it originates with an anonymous “insider” source, when we forget the idea’s original source, or when we are unaware it is an attempt to influence us.

2. The more distinctive an idea is, the more likely it is to stick with us and exert a silent influence. A novel twist in how an idea is conveyed — for example, calling Central American migrants traveling north a “caravan,” a term with its own unique associations — can make a cautionary message more impactful.

Repeated information becomes more influential…when we are unaware it is an attempt to influence us

3. An idea or message that is personally relevant is more apt to be accepted. A claim that immigrants cause crime is more believable to people who have been victimized in the past (regardless of by whom) than to those who haven’t. Activation of a past incident of victimhood raises thoughts of future vulnerability.

4. The capacity of a message to trigger stories increases its influence. Suggesting immigrants are gang members and generally desperate can be effective even when one has never been a victim but simply knows of someone who has been victimized — even if one does not know that person personally. The victim’s experience, relayed by a newscast or by a mutual acquaintance, becomes a borrowed story, one that gets replayed in our mind and shared with others. And, as the triggered story finds an active home in our memory, it becomes more personally relevant.

5. If an idea is enhanced by a mood, its impact will be greater. Cues in a message can induce certain moods. A tense movie scene becomes more frightening with anxiety-raising background music. A somber mood is established when a political ad has glimpses of coyotes skulking around at night as a crowd of poorly clad people sneak under a border fence.

6. Finally, the more powerful the emotional content of a message, the greater its persuasive impact. Political attack ads often create feelings of physical vulnerability and an imminent sense of losing something highly valued, such as medical insurance or a retirement nest egg.

Part I Summary

Well-entrenched patterns of thought are hard to change. It usually takes an unavoidable reason to question core thinking before we will engage in the arduous task of reassessment. In a world that puts many demands on our time and energy, it is easier to automatically conclude that a preferred position is correct than to fact-check its supporting claims. We cling to the illusion of being consciously rational when we are not. We imagine that we examine facts first, then develop an informed preference or feeling, and then act, when the reverse is more common — we tend first to feel that an alleged fact is or is not correct, and then assess it, using those feelings as a filter. Then we act. Or, we use feeling as a basis for acting and then rationalize our action as the last step. Either way, feelings often override rational thinking. Rational thinking is a follow-on process whose veto power over our feelings and actions is not strong.

In short, the factuality of information is not always, or even usually, central to how minds operate. Many factors, most of which operate below awareness, affect the processing of information. Very deliberate efforts are required to bring these to consciousness, where they can be approached in a systematic way. This is the focus of Part II.

[1] Lisa Feldman Barrett, How Emotions Are Made: The Secret Life of the Brain (New York: Houghton Mifflin Harcourt, 2017), 278.

Note to Readers: This document draws from many published sources and critical reading by colleagues. Readers are encouraged to consult Gerald Zaltman, Unlocked: Keys to Improve Your Thinking (Amazon, 2018). The author may be contacted at gzaltman@olsonzaltman.com.

Gerald Zaltman is founding partner at Olson Zaltman and the Joseph C. Wilson Professor of Business Administration Emeritus at the Harvard Business School.
