Paying off mental IOUs

Daniel Greene
Aug 30, 2020


One of my favorite concepts in psychology is the “illusion of explanatory depth”:

If you asked one hundred people on the street if they understand how a refrigerator works, most would respond, yes, they do. But ask them to then produce a detailed, step-by-step explanation of how exactly a refrigerator works and you would likely hear silence or stammering. This powerful but inaccurate feeling of knowing is what Leonid Rozenblit and Frank Keil in 2002 termed the illusion of explanatory depth, stating, “Most people feel they understand the world with far greater detail, coherence, and depth than they really do.”

Rozenblit and Keil initially demonstrated the illusion of explanatory depth through multi-phase studies. In a first phase, they asked participants to rate how well they understood artifacts such as a sewing machine, crossbow, or cell phone. In a second phase, they asked participants to write a detailed explanation of how each artifact works, and afterwards asked them to re-rate how well they understood each one. Study after study showed that ratings of self-knowledge dropped dramatically from phase one to phase two, after participants were faced with their inability to explain how the artifact in question operates. Of course, the illusion extends well beyond artifacts, to how we think about scientific fields, mental illnesses, economic markets, and virtually anything we are capable of (mis)understanding.

Understanding the illusion of explanatory depth can also help us combat political extremism. In 2013, Philip Fernbach and colleagues demonstrated that the illusion underlies people's policy positions on issues like single-payer health care, a national flat tax, and a cap-and-trade system for carbon emissions. As in Rozenblit and Keil's studies, Fernbach and colleagues first asked people to rate how well they understood these issues, and then asked them to explain how each issue works and subsequently re-rate their understanding. In addition, participants rated the extremity of their attitudes on these issues both before and after offering an explanation. Both self-reported understanding and attitude extremity dropped significantly after the explanation task: people who strongly supported or opposed an issue became more moderate. What is more, reduced extremity also reduced willingness to donate money to a group advocating for the issue. These studies suggest that puncturing the illusion of explanatory depth is a powerful tool for cooling off heated political disagreements.

It’s so easy to fool yourself into thinking that you know more about something than you really do, whether it’s about a refrigerator or a social issue. And most social issues are even more complicated than refrigerators. But if you try to explain what you know, you tend to realize how little is really there.

When I look closely, a surprisingly large fraction of my ideas are more or less empty labels. I might use a word or phrase like "refrigerator" or "healthcare system", but I can't actually unpack it into simpler parts and explain how those parts interact. These labels are like IOUs for actual explicit models or informed opinions. I catch myself in this all the time. Sometimes I find myself endorsing strong opinions about social issues that I know almost nothing about. When I try to explain my position, I find that I have very little to say!

Where did all of these IOUs come from? I seem to get around fine in the world. I know enough about refrigerators to keep my food cold, and I know enough about healthcare systems to make coherent noises about them. I imagine that at some point in the past, I first heard the grownups on TV talk about something called "the healthcare system." At that moment, I felt a flash of novelty: "What is healthcare? What is NOT healthcare? How does it work? How could it work?" And then an instant later, I thought, "It's fine, just go with it, you can figure it out later." I left myself an IOU for a future informed opinion. Eventually, I forgot that it was an IOU and just adopted the opinion wholesale. Then, as I started building other IOUs on the premises of older IOUs, I slowly accumulated a kind of hidden epistemic debt. By now, who knows which parts of my worldview feel coherent only because they rest on flimsy, unexamined foundations.

Of course, there isn’t enough time to cash in all of my IOUs. I need to act in a complex world, and some things are more important to understand than others. But it’s unsettling to remember that I forgot my own ignorance.

There’s another problem. I have absorbed many IOUs from my own subculture, especially about social and political issues. They help me to fit in. So unless my subculture happens to be right about everything, I should expect (on average) to drift away from my subculture’s worldview as I start cashing in my IOUs.

No wonder fact-checking your own beliefs is so uncommon. It's hard to notice your blind spots, it's hard to fill them, and doing so is going to move you away from the people who imparted your worldview in the first place.


Daniel Greene

Biosecurity researcher, social scientist, fragile blob of atoms, baffled river of experience. This blog is for personal musings. www.danielgreene.net