The Danger of Filter Bubbles

Before social media and search engines, finding information online was far harder than it is today. Now users can search for anything they want, but is the information they find complete? Search engines and social media platforms use a user's data and online behavior patterns to create an over-personalized online experience. These increasingly sophisticated algorithms isolate the user from other viewpoints, creating a filter bubble.

Filter bubbles occur when users encounter only information they already agree with. They have significant social implications because they lead to close-mindedness and the polarization of groups, a phenomenon we are witnessing in society today. The code and algorithms within social media platforms and search engines create these filters. In Critical Questions for Big Data, Danah Boyd and Kate Crawford note Lawrence Lessig's argument that one of the four forces regulating social systems is architecture, which in the context of technology means code. This code creates algorithms that perpetuate bias because it dictates what people see online. That polarization makes it hard for people to accept any viewpoint that is not their own.

Different search results may appear on Google depending on the user data that is collected, such as search history or geographic location. Source: Rhodes, L. (Producer), & Orlowski, J. (Director). (2020). The Social Dilemma [Motion picture]. USA: Exposure Labs.

Platforms create filter bubbles through algorithms that gather as much data on the user as possible. In Weapons of Math Destruction, Cathy O'Neil argues that mathematics is not only entangled in the world's problems but is also fueling many of them. The mathematics within these algorithms may appear objective; what is concerning is how corporations use the data those algorithms produce. Content targeted through the collection and calculation of user data creates an environment in which the user never has to consider a different viewpoint. This lack of perspective is troubling because, in order to learn and solve problems, people need to understand and respect multiple viewpoints.
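To make the mechanism concrete, here is a minimal, hypothetical sketch of engagement-based personalization. It is not any platform's real code; the topic tags, scoring rule, and data are invented for illustration. The recommender ranks candidate articles purely by overlap with topics the user has already clicked on, so content from outside the user's history can never rise to the top:

```python
from collections import Counter

def recommend(click_history, candidates, k=3):
    """Rank candidate articles by topic overlap with the user's past clicks.

    Articles whose topics the user has never engaged with score zero,
    so they are pushed to the bottom of the feed: the bubble reinforces itself.
    """
    # Count how often each topic appears in the user's click history.
    interests = Counter(
        topic for article in click_history for topic in article["topics"]
    )

    def score(article):
        return sum(interests[topic] for topic in article["topics"])

    return sorted(candidates, key=score, reverse=True)[:k]

# A user who has only ever clicked one side of an issue (invented data):
history = [
    {"title": "Article A", "topics": ["policy-x", "opinion-left"]},
    {"title": "Article B", "topics": ["opinion-left"]},
]
candidates = [
    {"title": "More of the same", "topics": ["opinion-left"]},
    {"title": "Opposing view", "topics": ["opinion-right"]},
    {"title": "Neutral report", "topics": ["policy-x"]},
]

top = recommend(history, candidates, k=2)
# The opposing view scores zero and never surfaces in the top results.
```

Even this toy version shows the core dynamic the essay describes: a purely engagement-driven score optimizes for agreement, not for breadth of perspective.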

Before powerful algorithms and big data, people were exposed to multiple opinions through conversation, debate, and reliable sources, and that exposure fostered greater tolerance of opposing views. Now, people look up information with a search engine, and their existing beliefs are confirmed because the algorithm already knows what they want to see. Data, algorithms, and mathematics have the potential to improve society, but it is essential that users are educated about the dangers of living in a filtered world and stay aware of how the information they consume online is selected for them.

References:

Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679. https://doi.org/10.1080/1369118X.2012.678878

Filter Bubble. (2018, May 17). Techopedia. Retrieved February 11, 2021, from https://www.techopedia.com/definition/28556/filter-bubble

O’Neil, C. (2016). Weapons of math destruction. Crown Books.

Rhodes, L. (Producer), & Orlowski, J. (Director). (2020). The Social Dilemma [Motion picture]. USA: Exposure Labs.
