A Short Digression on Facebook and Bias

Facebook, the world’s largest source of news with well over a billion readers, was recently accused of biasing its Trending Topics sidebar against conservative political content.

I don’t care. Really.

This is a classic case of missing the forest for the trees. Yes, Facebook has a bias problem, but not because its staffers allegedly favor one American political party over another. It has a bias problem because its financial raison d’être is to keep its users engaged and on the page, and to do so its algorithms have to prefer the juiciest content for any given user. Users find themselves trapped in an echo chamber of similar ideas.

People who engage with anti-vaccination content on Facebook? They’re more likely to receive updates about chemtrails, 9/11 trutherism, and the Flat Earth theory.

People who “like” right-leaning content will find themselves in an echo chamber of other right-leaning content. People who “like” left-leaning content will find themselves in an echo chamber of other left-leaning content.
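The feedback loop described above can be sketched in a few lines. This is a toy model, not Facebook's actual system: assume a ranker that scores each post by how often the user has already liked posts of the same leaning, and a user who always engages with the top of the feed. The names and data here are invented for illustration.

```python
# Hypothetical sketch of engagement-based ranking: the feed surfaces
# whatever resembles the user's past likes, so each "like" narrows
# what the user sees next.

def rank_feed(posts, liked_leanings):
    """Score each post by how often the user liked that leaning."""
    def score(post):
        return liked_leanings.count(post["leaning"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Tax cuts work", "leaning": "right"},
    {"title": "Raise the minimum wage", "leaning": "left"},
    {"title": "Local bake sale", "leaning": "neutral"},
    {"title": "Deregulation wins", "leaning": "right"},
]

liked = ["right"]        # the user starts with one right-leaning like
for _ in range(3):       # each cycle, the user likes the top post
    feed = rank_feed(posts, liked)
    liked.append(feed[0]["leaning"])

print(liked)             # → ['right', 'right', 'right', 'right']
```

After a single initial like, every subsequent recommendation reinforces the same leaning: the echo chamber is not a policy decision, it falls out of the scoring function.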

This is not how a healthy civic society operates. We should be exposing ourselves to views that we don’t understand and don’t agree with. Discourse is about engaging with people who think differently from us, and understanding that, while they may be the opposition, they are certainly not the enemy.

Our algorithmically sorted newsfeeds don’t allow for this. They can’t. Their parent companies couldn’t sell our data to advertisers otherwise.