Should I fear the ‘Filter Bubble’?

Kat Clinton
Apr 5, 2019

When I was introduced to the concept of ‘filter bubbles’ by watching Eli Pariser’s TED Talk “Beware online ‘filter bubbles’” (2011), I initially got a sense of ‘Fear of Missing Out’ (FOMO). What has the internet been hiding from me!?

Pariser argues that many platforms, such as Facebook and Google, are providing “algorithmic editing of the web”, personalising each user’s online experience and news consumption. The online information or news we see is tailored to what the algorithm thinks we want to see and will consume, not necessarily what we need to see. This is what Pariser calls a ‘filter bubble’. In his view, a ‘filter bubble’ denies us a balanced or diverse view on a topic, which can lead to confirmation bias and fails to broaden our worldview.

My sense when watching the TED Talk was that Pariser holds a very strong, one-sided point of view on this topic. He describes an anti-democratic, even dystopian, notion that big internet companies decide what information we do or do not see, with no transparency about how it works and no control on our part.

Pariser offered a couple of personal observations, a simple Google search and his Facebook feed, but no quantitative research to back up his argument. He also didn’t explore the opposite viewpoint: that personalisation could actually be a positive experience for the user.

Since Pariser’s 2011 talk, more recent research has examined the diversity of online news. One study found “automated serendipity”: algorithmically selected news actually exposed users to more diverse news sources than those they would normally use (Fletcher and Nielsen 2018). Another found that search engines promote diversified information by surfacing a more widespread and varied range of viewpoints, vastly broadening the information available to people and their choice of news outlets (Flaxman, Goel, and Rao 2016).

I’m glad Pariser brought this discussion into the public discourse, because it makes people aware that the information we view online may be filtered or personalised for us in some respect. I feel there need to be more awareness programs out there to educate people on this topic. It may be unrealistic to expect the younger generation to understand what filter bubbles and personalisation are, and why they matter, without exposure to the topic in the education system (Powers 2017), such as through a Digital Literacy program.

Pariser provided a compelling argument that served as a reminder of what my personal data can be used for online, but I personally don’t think we are being led into filter-bubble isolation. With some simple strategies I have learnt to expand my information sources online, and I no longer have ‘FOMO’. One way I do this is by accessing information on more than one platform, such as creating a “Scholarly Bubble” on Twitter, where I follow only academic sources. Another is by using sites on the “Deep Web” that Google Search may not reach and that don’t appear in standard search results (University of Illinois 2019).

References:

Fletcher, Richard, and Rasmus Kleis Nielsen. “Automated Serendipity.” Digital Journalism 6.8 (2018): 976–89. Web.

TED (2011). “Beware online ‘filter bubbles’ | Eli Pariser” [Video file]. Retrieved from https://youtu.be/B8ofWFx525s

Flaxman, Seth, Sharad Goel, and Justin M. Rao. “Filter Bubbles, Echo Chambers, and Online News Consumption.” Public Opinion Quarterly 80.S1 (2016): 298–320. https://doi.org/10.1093/poq/nfw006

Powers, Elia. “My News Feed Is Filtered?” Digital Journalism 5.10 (2017): 1315–35. Web.

University of Illinois (2019). “Protecting Your Data on the Web: Dive Deeper.” Retrieved from http://guides.library.illinois.edu/c.php?g=348478&p=2347796
