The Causes and Effects of “Filter Bubbles” and How to Break Free

Kristen Allred
Apr 13, 2018 · 11 min read

Anyone who uses the internet has experienced the filtering of information. Given the massive amount of material online, it is necessary to refine what we see, and at first glance this inevitable filtering isn’t a big problem. Eli Pariser, a long-time internet activist, argues that these filtering algorithms are biased and don’t show users content that disagrees with them. Filter bubbles, also known as echo chambers, were defined in Pariser’s 2011 TED talk, “Beware online filter bubbles.” Pariser says, “your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out” (Pariser, 4:06). Pariser argues that algorithms create filter bubbles that are affecting our society, and that they will bring negative effects if we don’t code the algorithms with a sense of civic responsibility.

While it is true that filter bubbles are a problem, algorithms don’t explain everything. Filter bubbles are also an issue of human nature: they feed the worst of our weaknesses, because we don’t want our ideas to be challenged. We share the blame for putting ourselves into filter bubbles. Filter bubbles have negative effects, and everyone has a role in breaking free of them, from the creators of social media platforms, to schools, down to each of us individually.

In his TED talk, Pariser says that the internet at first meant a connection to the world, something that brings everyone together, yet it is doing the opposite. He argues that algorithms learn who we are and create our filter bubble based on what we click on and how long we spend looking at particular content. These filter bubbles, he continues, can and will negatively affect society, because algorithms are confining people to their small bubbles of information and polarizing our opinions. He describes the effect of filter bubbles in his own Facebook feed: though he is liberal, he still likes learning about points of view beyond his own, yet one day he noticed that opposing opinions had disappeared from his feed. Facebook had learned that he clicked more on liberal links than conservative links and, without consulting him, had edited the conservative ones out. Beyond Facebook, he raises concerns about Google. Pariser states that there isn’t a standard Google anymore; each person receives different search results depending on who they are. Curious, Pariser ran a small experiment, asking friends to send screenshots of their search results after Googling “Egypt”. The results were substantially different: one person’s page contained results about the riots in Egypt, while another’s focused on travel. Pariser argues that this kind of personalization is sweeping the entire internet, saying “this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see” (Pariser, 5:18). His main argument is that algorithms are doing a poor job controlling the flow of information and should be encoded with a sense of civic responsibility. In addition, users of the internet should have control over what they see, so the internet can bring new ideas and varied perspectives rather than confining people to their personal filter bubbles. Pariser makes good points, but algorithms can’t be shouldered with all the blame.

To understand how filter bubbles form, it is important to understand how algorithms work, because although they aren’t the sole cause, they do play a fundamental role. These algorithms are large, complex computer programs that decide how relevant each piece of information is to each individual. Every website uses a different system, but the overall goal is the same: filter the flood of information down to what the site thinks you will be interested in. Websites want you to use their services and stay on them as long as possible, so they learn what each user believes and tailor information accordingly. This shows up in the advertisements that appear in our feeds, in our search results, and in the order content is displayed.
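
To make the mechanism concrete, here is a minimal, hypothetical sketch of engagement-based ranking. The scoring rule, field names, and example data are illustrative assumptions of my own, not any real platform’s code; real recommendation systems are far more elaborate, but the basic feedback loop looks like this.

```python
# Illustrative toy example only: rank feed items by how closely their topics
# match what the user has already clicked on.

from collections import Counter

def score(item_topics, click_history):
    """Score an item by how often its topics appear in the user's past clicks."""
    counts = Counter(click_history)  # e.g. {"liberal politics": 9, "travel": 1}
    return sum(counts[topic] for topic in item_topics)

def personalize(feed, click_history):
    """Sort the feed so the most familiar (most 'engaging') items come first."""
    return sorted(feed, key=lambda item: score(item["topics"], click_history), reverse=True)

# A user who mostly clicks on one viewpoint sees that viewpoint ranked first.
feed = [
    {"title": "Op-ed A", "topics": ["liberal politics"]},
    {"title": "Op-ed B", "topics": ["conservative politics"]},
    {"title": "Travel in Egypt", "topics": ["travel"]},
]
history = ["liberal politics"] * 9 + ["travel"]
print([item["title"] for item in personalize(feed, history)])
# -> ['Op-ed A', 'Travel in Egypt', 'Op-ed B']
```

Because the ranking only ever learns from the clicks it already produced, familiar viewpoints keep winning and unfamiliar ones quietly sink, which is exactly the “editing out” Pariser describes.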

Algorithms do play a part in the creation of filter bubbles, but they aren’t the root of the issue. Users of the internet also put themselves in filter bubbles, because those bubbles feed our human weaknesses. We all tend to read content we agree with. We don’t want our ideas to be challenged, and we don’t want to be around people who disagree with us. Instead, we want easy lives and people in our immediate circle who are easy to deal with. Jeff Passe discusses homophily, the human tendency to form bonds with others who share our beliefs (Passe, 2). We seclude ourselves from differing ideas by surrounding ourselves with online friends who share our opinions and by subscribing to sources that produce content supporting our beliefs. The open-ended nature of social media allows us to follow only like-minded people who already agree with us, which creates an echo chamber of information. We aren’t mixing, sharing, and understanding other points of view. Our human nature leads to polarization, and evidence suggests that the more polarized someone is, the more they seek out content that agrees with them (Passe, 3). Online, people can find whatever they want to confirm their own biases. This creates a downward spiral that produces closed-minded people who are unwilling to recognize the issue, and if individuals go too long without seeing content that challenges their points of view, negative effects follow.
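
That reinforcement loop can be sketched with a toy simulation. The model below is an illustrative assumption, not drawn from Passe’s or Del Vicario’s data: a reader who starts barely off center, only accepts articles from “their side,” and drifts slightly toward whatever they read ends up far from where they began.

```python
# Toy model of selective exposure (illustrative assumptions only).
import random

random.seed(1)
leaning = 0.05                        # barely off center on a -1..+1 opinion scale

for _ in range(200):
    article = random.uniform(-1, 1)   # a stream of articles across the spectrum
    if article * leaning > 0:         # the reader only engages with agreeable articles
        leaning += 0.05 * (article - leaning)  # reading nudges the leaning toward the article

print(round(leaning, 2))  # ends up well away from center, near the middle of "their side"
```

Even with no algorithm involved, pure selective exposure is enough to pull a nearly neutral reader toward one pole.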

When individuals are engrossed in their personal filter bubbles, negative results follow. Filter bubbles create an informational barrier that prevents people from seeing opposing viewpoints, giving the “impression that our narrow self-interest is all that exists” (Pariser, 3:02). Because of this, a person can become polarized to their side of a particular argument. When individuals on both sides of an issue are polarized, see only their side, and continually see their own opinion reinforced, a solution will never be reached: both sides refuse to accept the opposing argument, and as the divide deepens, individuals struggle even to agree to disagree (Panke, 259). Furthermore, because the two sides only see content from their respective echo chambers, they can’t have effective conversations with people who disagree with them. Most people also get their news from a single source; Panke notes that “consistent liberals and conservatives often live in separate media worlds and show little overlap in the sources they trust for political news” (Panke, 259). This separation means they aren’t exposed to the same information, so they cannot have an effective conversation on the topic.

Misinformation escalates the issue, and there will always be misinformation online. Del Vicario states, “Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia” (Del Vicario, 558). Misinformation is one of the factors driving the formation of filter bubbles, and when people are polarized to their side of an argument, they are more likely to accept these falsities as truth, deepening the divide even further. “The new opportunities to access and share information can lead to the spread of manipulative and deceptive messages” (Panke, 248), and because of the anonymity of social media there are usually no repercussions for spreading false information. By the time it gets taken down, it’s too late.

In addition, one of the biggest problems with filter bubbles is that they are invisible: people don’t realize they are seeing something different from what everyone else sees. This leads them to believe their opinion must be right, because all they see is their own side, and they assume everyone else is seeing the same thing.

Pariser argues that the negative effects of filter bubbles are damaging democracy as a whole. Democracy is based on the entire population helping to make decisions, and it gives power to the people by emphasizing the majority. If people in the United States are unable to have civil debates, or if we are consuming false information, democracy is undermined. Panke says, “The ease to form communities of like-minded peers can result in echo chambers that lack critical discussion, divergent opinion, and political discourse” (Panke, 248). For democracy to be an effective system, each person needs to be well informed on the issues they’re voting on and able to have critical discussions. Democracy is threatened by how easily misinformation can spread and be believed.

These issues don’t just affect large national arguments or our interactions online; they can affect our personal relationships as well. In day-to-day conversation, if two people are polarized to opposite sides of an issue, they will be unable to have a civil conversation. That struggle to express their beliefs has the potential to lead to violence, and the inability to hold a civil discussion is a problem that will only grow if filter bubbles continue to divide us.

With this said, we all have an important role to play in breaking free of filter bubbles. First, creators of social media platforms and other sources of information should strive to build unbiased websites and apply a sense of civic duty to their algorithms. Pariser makes a clever comparison: “The best editing gives us a bit of both … it gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters is that because they’re looking at what you click on first, it can throw that off balance. And instead of a balanced information diet, you end up surrounded by information junk food” (Pariser, 5:20). It’s important that we see content that challenges our point of view, because even though algorithms aren’t the only cause of filter bubbles, they still play a big role.

Next, schools: education is crucial to this issue. One of the biggest problems with filter bubbles is that people don’t realize they are in them, so raising awareness through education would help reduce filter bubbles’ negative effects. It is incredibly important to teach people to seek out knowledge, to research issues, and to remember that not everything online is true. If people get a well-rounded education and learn to have meaningful, learning-oriented conversations, the negative effects of filter bubbles will be greatly reduced.

Finally, each individual has a responsibility to break out of their own personal filter bubble. Simply turning off customization features and targeted ads on websites limits the effect that algorithms have on our content. However, a hunger for truth is the most important part of overcoming filter bubbles. We can’t keep living in our personal bubbles and accepting everything we hear as truth right away. If everyone looked at different opinions on issues and read about problems from multiple sources, democracy wouldn’t be threatened the way it is. Those few steps alone can pop people’s filter bubbles, but above all it is crucial that awareness be brought to the issue; otherwise people will continue to live in their personal internet and suffer the consequences. We all have a role to play in ensuring the continued progress of society.

Some disagree with Pariser, saying that filter bubbles aren’t an issue and that he blew the problem out of proportion. Most articles that contradict Pariser focus on the algorithmic side of filter bubbles. Jacob Weisberg challenges Pariser’s views on the effect that algorithms have, saying that the amount of personalization on Google is much less than what Pariser showed in his TED talk. Weisberg asked Google for a response, and an employee commented, “We actually have algorithms in place designed specifically to limit personalization and promote variety in the results page” (Weisberg). He closes by saying that if you’re worried about Google giving you a skewed worldview, you can turn the customization feature off. While these are valid points, Weisberg only addresses the effect of algorithms. He makes no comment on how individuals build their own filter bubbles, so these arguments don’t discredit the relevance of filter bubbles in our society.

Most people don’t know what filter bubbles are, let alone that they are inside one, so why would they go out of their way to seek out truth if they don’t know they are receiving a skewed view of the world? Why would they turn off a customization feature if they didn’t know their results were being personalized? Even if there weren’t any algorithms on any website, we would still put ourselves in filter bubbles, because our nature as humans isn’t to seek out information that challenges us. We seclude ourselves. However, algorithms do exist: the more we click on certain links, the more priority they will have in our feeds. Even if Google claims to have algorithms that encourage diversity, what about Facebook or news sites? Our human nature leads us to seek out content we agree with, and the algorithms learn who we are and personalize our content accordingly, polarizing us even more. Even if the personalization of our search results isn’t as drastic as Pariser showed, the algorithms still have an effect. The combined effect of algorithms and our own ignorance is what creates filter bubbles, and they will continue to harm our society if awareness isn’t brought to this issue.

Filter bubbles are a solvable problem, but if no action is taken to raise awareness and people don’t do their part in breaking free of them, the result could be highly problematic. Filter bubbles have many negative effects that need to be reduced for people to get along and for democracy to thrive. It won’t be easy, because filter bubbles are so ingrained in our human nature, and it is hard to seek out the truth, especially when we learn that we were wrong. Democracy, and the relationships we form with others, will suffer if no action is taken. By making people aware of filter bubbles and of how we are manipulated by them, we can break out of those bubbles: consume information from a variety of credible sites, and seek out multiple sides of arguments. Online, we need to cultivate an exchange of learning rather than simply promoting our own ideas. We each have a role in this issue, and in the long run we can push for algorithms to be reprogrammed to give a more balanced flow of information. For now, it is important for each person to recognize the profound effects of polarization and make an effort to break free of their filter bubble.

Works Cited

Del Vicario, Michela, et al. “The Spreading of Misinformation Online.” PNAS Proceedings of the National Academy of Sciences of the United States of America, vol. 113, no. 3, 19 Jan. 2016, pp. 554–559. EBSCOhost, doi:10.1073/pnas.1517441113.

Panke, Stefanie, and John Stephens. “Beyond the Echo Chamber: Pedagogical Tools for Civic Engagement Discourse and Reflection.” Journal of Educational Technology & Society, vol. 21, no. 1, Jan. 2018, pp. 248–263. EBSCOhost, ezproxy.uvu.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=eft&AN=127424795&site=eds-live.

Pariser, Eli. “Beware Online ‘Filter Bubbles.’” TED: Ideas Worth Spreading, Mar. 2011, www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.

Passe, Jeff, et al. “Homophily, Echo Chambers, & Selective Exposure in Social Networks: What Should Civic Educators Do?” The Journal of Social Studies Research, 03 Aug. 2017. EBSCOhost, doi:10.1016/j.jssr.2017.08.001.

Weisberg, Jacob. “Eli Pariser’s The Filter Bubble: Is Web personalization turning us into solipsistic twits?” Slate Magazine, 10 June 2011, www.slate.com/articles/news_and_politics/the_big_idea/2011/06/bubble_trouble.html.
