Bursting the evidence bubble
Wrote an op-ed for Australasian Science magazine, arguing that focusing too much on ‘bursting the filter bubble’ is a red herring for improving public discourse, and might in fact make matters worse. You can access the piece from the citation below, or here’s the final draft I sent.
Knight, S. (2017). Bursting the News Filter Bubble. Australasian Science Magazine, July/August. Retrieved from http://www.australasianscience.com.au/article/issue-julyaugust-2017/bursting-news-filter-bubble.html
After the US presidential election, Google searches for Breitbart News peaked; friends of mine (not secret Trump supporters) turned to the right-wing source to try to understand the views it espouses. Since then there have been frequent calls for more of us to step out of our social media echo chambers and to ‘burst our filter bubble’. These echo chambers, it’s alleged, combine with a filter bubble effect: social media and search engine personalization that emphasizes content similar to what you have viewed or liked before. So your Facebook feed only exposes you to views you already agree with, and to information that supports those views, leading to a general deterioration in public and political debate as we seem unable or unwilling to engage across perspectives.
If we believe this argument, then your use of Facebook presents an information-access issue: it insulates you from diverse perspectives, exposure to which would improve political discourse. Empirical research on this topic is hard: companies control their data, users typically don’t state their politics explicitly, and the impact of proprietary algorithms can only be guessed at. Whether you’re liberal or conservative, you’re more likely to believe information that confirms your prior beliefs. The question is, what role is technology playing in this? And does this cognitive bias mean that we should indiscriminately take a more even-handed approach to sources?
The research that has been conducted, mostly in the US, paints a complex picture. The nature of the topic being discussed matters; partisan issues are more likely to divide our consumption of media sources and blogs. However, most people consume pretty centrist media; it’s a relatively small number, particularly of Republicans, that consumes a highly polarized media diet. The idea that online and offline consumption of news media radically differ may be overstated. In fact, users of social media and personalized news aggregation sites are more likely to be exposed to, not insulated from, diverse perspectives. So if we’re in a bubble, this seems to be down to personal selection of sources, rather than algorithms directing the content that we view or discuss. Insofar as there is evidence for filter bubbles, they’re a symptom, not a cause, of echo chambers.
Many of the calls for us to burst out of the filter bubble pay little heed to the legitimacy of the perspectives espoused. This is concerning given the long history of manufacturing false equivalence, including on climate change, the relationship between tobacco and cancer, and vaccine denialism. In each case, there has been an attempt to manufacture debate and suggest that opposing arguments deserve equal airtime even where they do not have equal evidence.
In the case of climate science, research suggests that one echo chamber is based around a small but powerful group of denialists, who repeat and amplify individual sources of climate-science denial, while those who trust the science on climate change draw on information from multiple sources. These ‘sides’ are not equivalent, and moralizing over the emergence of bubbles based on broad sources of high-quality evidence is misguided. Problematically, though, the tendency of news outlets to report opposing sides as though they were equivalent, in an attempt to avoid partisanship, makes it harder for people to navigate this evidence. Your ability to reconcile these competing claims is related to how you think about corroboration and expertise. Urging people to ‘burst their filter bubble’ might in fact legitimize denialist perspectives, resulting in their repetition and more widespread acceptance.
A recent report indicated that (in the US) people felt better informed in 2016 than in 2011. It also indicated that, even taking account of political affiliation and demographic factors, placing value on evidence (expertise, corroboration of information, well-grounded data) trumps partisanship on politically contentious issues such as climate change and support for health care reform. That’s where our focus should be. Exposure to others’ experiences, through demographic rather than ideological diversity, may be important in grounding an empathetic discussion and understanding of evidence. However, focusing attention on bursting filter bubbles, whether technologically or deliberately, could backfire, giving false equivalence to misinformation and strengthening prior bias. Instead, we need a greater focus on the evidence, and how people treat it.
If this topic interests you, I’ve seen a couple of other great pieces recently on similar topics:
I also wrote a recent piece on epistemic cognition and fake news for The Psychologist…
The op-ed is reproduced under clause 7 of the copyright agreement, with thanks to Australasian Science Magazine. Find the original via the citation above.
Originally published at Finding Knowledge.