Idea Inoculation + Inferential Distance

Author’s note: This one’s more of a snippet, but I claim it’s super important. If you’d normally spend 7min reading my post and it takes you 3min to read it this time, consider reading it twice, maybe?

Inferential distance is the gap between [your hypotheses and world model] and [my hypotheses and world model]. It's just how far out we each have to reach toward the other in order to understand one another.

If you and I grew up in the same town, went to the same schools, have the same color of skin, have parents in the same economic bracket who attended the same social functions, and both ended up reading Less Wrong together, the odds are that the inferential distance between us for any given set of thoughts is pretty small. If I want to communicate some new insight to you, I don’t have to reach out very far. I understand which parts will be leaps of faith for you, and which prerequisites you have versus which you don’t — I can lean on a shared vocabulary and shared experiences and a shared understanding of how the world works. In short, I’m unlikely to be surprised by which parts of the explanation are easy and which parts you’re going to struggle with.

If, on the other hand, I'm teleported back in time to the deck of the Santa Maria with the imperative to change Christopher Columbus's mind about a few things or else all of humanity dies in a bleak and hopeless future, there's a lot less of that common context. Even assuming magical translation, Christopher Columbus and I are simply not going to understand each other. Things that are obviously true to one of us will seem confusing and false and badly in need of justification to the other, and conclusions that seem to obviously follow for one of us will be gigantic leaps of faith for the other.

It’s right in the name — inferential distance. It’s not about the “what” so much as it is about the “how” — how you infer new conclusions from a given set of information. When there’s a large inferential distance between you and someone else, you don’t just disagree on the object level; you also often disagree about what counts as evidence, what counts as logic, and what counts as self-evident truth.

(You can see this all over the place. Millennials talking to Boomers, red tribe talking to blue tribe, Westerners talking to Africans or East Asians, the downtrodden talking to the privileged. Scott of SlateStarCodex did an excellent piece on inferential distance here, though I don’t think he ever used the term in the text.)

On its own, this is maybe not all that big of a deal. So you have to take things slower, maybe do a little extra work to define your terms and draw out the other person’s cruxes. So what?

The “so what” is another, equally common effect called idea inoculation.

Basically, it’s an effect in which a person who is exposed to a weak, badly-argued, or uncanny-valley version of an idea is afterwards inoculated against stronger, better versions of that idea. The analogy to vaccines is extremely apt — your brain is attempting to conserve energy and distill patterns of inference, and once it gets the shape of an idea and attaches the flag “bullshit” to it, it’s ever after going to lean toward attaching that same flag to any idea with a similar shape.

(If you’ve ever met somebody who’s extremely reluctant to engage with anything related to rationality because they once saw an episode of The Big Bang Theory and absolutely hate Sheldon, then you’ve encountered idea inoculation.)

Idea inoculation doesn’t mean you’re stupid — it occurs in all of us, though with some awareness and some effort you can de-anchor from your original impression and try to hear a second version with fresh ears. The problem is, when you combine idea inoculation with inferential distance, you get a recipe for disaster — if your first attempt to bridge the gap fails, your second attempt will also have to overcome the person’s rapidly developing resistance. You might think that each successive attempt will bring you closer to the day you finally establish common ground and start communicating, but alas — often, each attempt just increases their resistance to the core concept, as they build up a library of all the times they saw something like this defeated, proven wrong, made to look silly and naive.

The moral of the story is that, in practice, chances to explain are limited. When dealing with large inferential gaps, tread lightly and proceed with great care, lest you fritter away the non-renewable resource of the other person not having already decided that this new idea is dumb and dismissible.