Reasonably Polarized: Why politics is more rational than you think.

Kevin Dorst
Science and Philosophy
8 min read · Sep 6, 2020

A Standard Story

I haven’t seen Becca in a decade. I don’t know what she thinks about Trump, or Medicare for All, or defunding the police.

But I can guess.

Becca and I grew up in a small Midwestern town. Cows, cornfields, and college football. Both of us were moderate in our politics; she a touch more conservative than I, but it hardly mattered, and we hardly noticed.

After graduation, we went our separate ways. I, to a liberal university in a Midwestern city, and then to graduate school on the East Coast. She, to a conservative community college, and then to settle down in rural Missouri.

I — of course — became increasingly liberal. I came to believe that gender roles are oppressive, that racism is systemic, and that our national myths let the powerful paper over the past.

And Becca?

You and I can both guess how her story differs. She’s probably more concerned by shifting gender norms than by the long roots of sexism; more worried by rioters in Portland than by police shootings in Ferguson; and more convinced of America’s greatness than of its deep flaws.

In short: we started with similar opinions, set out on different life trajectories, and, 10 years down the line, we deeply disagree.

So far, so familiar. The story of me and Becca is one tiny piece of the modern American story: one of pervasive — and increasing — political polarization.

It’s often noted that this polarization is profound: partisans now disagree so much that they often struggle to understand each other.

It’s often noted that this polarization is persistent: when partisans sit down to talk about their now-opposed beliefs, they rarely rethink or revise them.

But what’s rarely emphasized is that this polarization is predictable: people setting out on different life trajectories can see all this coming. When Becca and I said goodbye in the summer of 2010, we both suspected that we wouldn’t be coming back. That when we met again, our disagreements would be larger. That we’d understand each other less, trust each other less, like each other less.

And we were right. That’s why I haven’t seen her in a decade.

Told this way, the story of polarization raises questions that are both political and personal. What should I now think of Becca — and of myself? How should I reconcile the strength of my current beliefs with the fact that they were utterly predictable? And what should I reach for to explain how I came to disagree so profoundly with my old friends?

The Standard Story: Irrationality

The story says, in short, that politics makes us stupid. That despite our best intentions, we glom onto the beliefs of our peers, interpret information in biased ways, defend our beliefs as if they were cherished possessions, and thus wind up wildly overconfident. You’ve probably heard the buzzwords: “confirmation bias”, “the group-polarization effect”, “motivated reasoning”, “the overconfidence effect”, and so on.

This irrationalist picture of human nature has quite the pedigree — it has won Nobel Prizes, started academic subfields, and embedded itself firmly in the popular imagination.

When combined with a new wave of research on the informational traps of the modern internet, the standard story offers a simple explanation for why political polarization has exploded: our biases have led us to misuse our new informational choices. Again, you’ve probably heard the buzzwords: “echo chambers”, “filter bubbles”, “fake news”, “the Daily Me”, and so on.

The result? A bunch of pig-headed people who increasingly think that they are right and balanced, while the other side is wrong and biased.

It’s a striking story. But it doesn’t work.

It says that polarization is predictable because irrationality is predictable: that Becca and I knew that, due to our biases, I’d get enthralled by liberal professors and she’d get taken in by conservative preachers.

But that’s wrong. When I looked ahead in 2010, I didn’t see systematic biases leading to the changes in my opinions. And looking back today, I don’t see them now.

If I did see them, then I’d give up those opinions. For no one thinks to themselves, “Gender roles are oppressive, racism is systemic, and national myths are lies — but the reason I believe all that is that I interpreted evidence in a biased and irrational way.” More generally: it’s incoherent to believe that your own beliefs are irrational. Therefore, so long as we hold onto our political beliefs, we can’t think that they were formed in a systematically irrational way.

So I don’t see systematic irrationality in my past. Nor do I suspect it in Becca’s. She was just as sharp and critically minded as I was; if conservative preachers changed her mind, it was not for a lack of rationality.

It turns out that tellers of the irrationalist tale must agree. For despite the many controversies surrounding political (ir)rationality, one piece of common ground is that both sides are equally susceptible to the factors that lead to polarization. As far as the psychological evidence is concerned, the “other side” is no less rational than you — so if you don’t blame your beliefs on irrationality (as you can’t), then you shouldn’t blame theirs on it either.

In short: given that we can’t believe that our own beliefs are irrational, the irrationalist explanation of polarization falls apart.

Suppose you find this argument convincing. Even so, you may find yourself puzzled. After all: what could explain our profound, persistent, and predictable polarization, if not irrationality? As we’ll see, there’s a genuine philosophical puzzle here. And when we can’t see our way to the solution, it’s very natural to fall back on irrationalism.

In particular: since we can’t view our own beliefs as irrational, it’s natural to instead blame polarization on the other side’s irrationality: “I can’t understand how rational people could see Trump so differently. But I’m not irrational — so the irrational ones must be Becca and her conservative friends, right?”

That thought turns our disagreement into something more. Not only do we think the other side is wrong — we now think they are irrational. Or biased. Or dumb. And that process of demonization — more than anything else — is the sad story of American polarization.

A Reasonable Story

What if it need not be so? What if we could think the other side is wrong, and not think they are dumb? What if we could tell a story on which diverging life trajectories can lead rational people — ones who care about the truth — to be persistently, profoundly, and predictably polarized?

That’s what I’m going to do. I’m going to show how findings from psychology, political science, and philosophy allow us to see polarization as the result of reasonable people doing the best they can with the information they have. To argue that the fault lies not in ourselves, but in the systems we inhabit. And to paint a picture on which our polarized politics consists largely of individually rational actors, collectively acting out a tragedy.

Here is the key. When evidence is ambiguous — when it is hard to know how to interpret it — it can lead rational people to predictably polarize.

This is a theorem in standard (i.e. Bayesian) models of rational belief. It makes concrete and confirmed empirical predictions. And it offers a unified explanation of our buzzwords: confirmation bias, the group-polarization effect, motivated reasoning, and the overconfidence effect are all to be expected from rational people who care about the truth but face systematically ambiguous evidence.
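If you like code, here’s a minimal sketch of that structure. Everything in it is a toy: the numbers are invented, and the way the agent discounts ambiguous evidence is stipulated rather than derived from a model of rational belief (that derivation is what the technical appendix is for). But it shows the bare mechanism: make one kind of evidence clear and the other kind ambiguous, and credences can be expected to move in a particular direction.

```python
import random

def expected_posterior(discount, trials=100_000, prior=0.5, p_find=0.75):
    """Toy evidence-search model (all parameters hypothetical).
    H: 'the word is in the grid'. If H is true, the searcher finds
    the word with probability p_find. Finding it is unambiguous
    evidence that settles H. Failing to find it is ambiguous: maybe
    H is false, maybe the searcher just missed the word.
    `discount` interpolates between the fully calibrated Bayesian
    update on a failed search (0) and not updating at all (1)."""
    # Posterior a fully calibrated Bayesian would assign to H after a failed search:
    calibrated = prior * (1 - p_find) / (prior * (1 - p_find) + (1 - prior))
    total = 0.0
    for _ in range(trials):
        h = random.random() < prior                # nature settles whether H holds
        if h and random.random() < p_find:         # clear, conclusive evidence for H
            total += 1.0
        else:                                      # ambiguous failure to find
            total += discount * prior + (1 - discount) * calibrated
    return total / trials

# With no discounting, the martingale property holds: expected posterior = prior.
print(expected_posterior(discount=0.0))   # ~0.50: no predictable movement
# Discounting the ambiguous evidence breaks the symmetry:
print(expected_posterior(discount=0.5))   # ~0.59: predictable drift toward H
```

The philosophical work, of course, lies in showing that discounting ambiguous evidence in this way can itself be rational; that is what the formal models are for.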

More than that: this story explains why polarization has exploded in recent decades. Changes in our social and informational networks have made it so that, with increasing regularity, the evidence we receive in favor of our political beliefs tends to be unambiguous and therefore strong — while the evidence we receive against them tends to be ambiguous and therefore weak. The rise in this systematic asymmetry is what explains the rise in polarization.
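Here’s how that asymmetry compounds, again with invented numbers (the Bayes factors below are hypothetical stand-ins for “strong” and “weak”). Suppose each news cycle delivers one confirming item that arrives unambiguously and one disconfirming item that arrives ambiguously. The count of evidence is balanced; the force of it is not.

```python
def drift(prior=0.5, rounds=20, strong=4.0, weak=1.5):
    """Each round delivers one confirming and one disconfirming item.
    The confirming item is unambiguous, so it carries a strong Bayes
    factor (`strong`); the disconfirming item is ambiguous, so it
    carries only a weak one (`weak`). Both factors are hypothetical."""
    odds = prior / (1 - prior)
    for _ in range(rounds):
        odds *= strong     # clear evidence in favor
        odds /= weak       # ambiguous evidence against
    return odds / (1 + odds)

print(drift())  # credence climbs toward 1 despite an even count of pro and con items
```

Run for twenty rounds, a credence that starts at 0.5 ends up nearly certain. Two people on opposite sides of the same issue, each facing this asymmetry in their own direction, will predictably march apart.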

In short: the standard story is right about which mechanisms lead people to polarize, but wrong about what this means about people. People polarize because they look at information that confirms their beliefs and talk to people who are like-minded. But they do these things not because they are irrational, biased, or dumb. They do them because it is the best way to navigate the landscape of complex, ambiguous evidence that pervades our politics.

That’s the claim I’m going to defend over the coming weeks.

Here’s how. I’ll start with a possibility proof — a simple demonstration of how ambiguous evidence can lead rational people to predictably polarize. Our goal will then be to figure out what this demonstration tells us about real-world polarization.

To do that, we need to dive into both the empirical and theoretical details. In what sense has the United States become increasingly “polarized” — and why? What would it mean for this polarization to be “rational” — and how could ambiguous evidence make it so? How does such evidence explain the mechanisms that drive polarization — and what, therefore, might we do about them? I’ll do my best to answer each of these questions.

This, obviously, is a big project. That means two things.

First, it means I’m going to present it in two streams. The core will be this blog, which will explain the main empirical and conceptual ideas in an intuitive way. In parallel, I’ll post an expanding technical appendix that develops the details underlying each stage in the argument.

Second, it means that this is a work in progress. It’ll eventually be a book, but — though I’ve been working on it for years — getting to a finished product will be a long process. This blog is my way of nudging it along.

That means I want your help! The more feedback I get, the better this project will become. I want to know which explanations do (and don’t) make sense; what strands of argument are (and are not) compelling; and — most of all — what you find of value (or not) in the story I’m going to tell.

So please: send me your questions, reactions, and suggestions.

And together, I hope, we can figure out what happened between me and my old friends — and, maybe, what happened between you and yours.

What next?
If you’re interested in this project, consider signing up for the newsletter, following me on Twitter, or spreading the word.
Coming up next week: an experiment that demonstrates how ambiguous evidence can lead people to polarize.


PS. Thanks to Cosmo Grant, Rachel Fraser, and Ginger Schultheis for helpful feedback on previous drafts of this post — and to Liam Kofi Bright, Cailin O’Connor, Kevin Zollman, and especially Agnes Callard for much help and advice getting this project off the ground.

Originally published at https://www.kevindorst.com.


Kevin Dorst is a philosopher at the University of Pittsburgh, working on the question of how irrational we truly are.