Do you ever feel as if you see the same kinds of articles on Facebook time and time again, or that certain websites consistently make their way to the top of your Google searches? If so, you are experiencing the refining power of the filter bubble. Filter bubbles are the personalized worlds we experience online, created by algorithms that automatically select the content you see based on your search history, the websites you visit, and the types of articles you read. Nearly all of the major online platforms use this kind of filtering, including Facebook and Google. To illustrate how this works, imagine you are a religious person. You are probably going to visit more faith-based websites and read articles posted by like-minded people. The filter bubble will pick up on these patterns and start tailoring your Facebook newsfeed and Google searches to match your beliefs. For example, you may begin to see only your religious friends’ posts or get search results from websites with religious affiliations.
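To make the idea concrete, here is a minimal sketch of that kind of personalization. This is a toy model, not Facebook’s or Google’s actual (proprietary) ranking system: it simply scores candidate articles by how much their topics overlap with topics in a user’s reading history, so whatever the user already reads floats to the top. The article titles and topic labels are invented for illustration.

```python
from collections import Counter

def rank_articles(history, candidates):
    """Rank candidate articles by topic overlap with past reading.

    A toy model of filter-bubble personalization: the more a topic
    appears in the user's history, the higher any article on that
    topic is ranked. Real platform algorithms are far more complex.
    """
    # Count how often each topic appears in the reading history.
    interest = Counter(t for article in history for t in article["topics"])

    # An article's score is the total interest weight of its topics.
    def score(article):
        return sum(interest[t] for t in article["topics"])

    return sorted(candidates, key=score, reverse=True)

# Hypothetical history dominated by faith-based reading.
history = [
    {"title": "Local parish news", "topics": ["religion", "community"]},
    {"title": "Raising kids in the faith", "topics": ["religion", "family"]},
]
candidates = [
    {"title": "New physics result", "topics": ["science"]},
    {"title": "Church fundraiser this weekend", "topics": ["religion", "community"]},
]

for article in rank_articles(history, candidates):
    print(article["title"])
```

Run on this history, the religious article outranks the science story, even though the user never asked for that preference: the ordering emerges silently from past behavior, which is exactly the dynamic described above.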
Before the Internet made its debut, newspapers served as the main news outlet. In those days, editors were known as “gatekeepers” because they decided what information would be dispersed to the public and what would be withheld. The rise of the Internet led to the development of a new gatekeeper: the algorithm. However, the algorithms that power the filter bubble lack one key element that human gatekeepers possess: ethics. Algorithms control our view of the world but lack human understanding. They prevent us from seeing information we need in order to have a well-rounded view of important issues and to understand other people’s perspectives. Some individuals might prefer to see only information in line with their own interests and beliefs, but I believe a happy medium can be established. I think the filter bubble crosses the line when it completely cuts certain people or websites out of your view. However, as a user of these services, you can prevent this from happening by visiting a variety of websites, reading all kinds of articles, and integrating diversity into your online searches. Additionally, you can delete your browser’s cache, cookies, and history, search the web in incognito mode, or enable the “do not track” feature in Safari.
If I were Facebook or Google, I would allow my consumers to opt in or out of the filter bubble and provide them with instructions on how to do so. One of my main issues with the filter bubble is that it monitors your history and makes decisions for you without your permission or knowledge. I think it is a helpful feature for some, but an intrusive hindrance for others. I prefer a “standard Google,” where everyone gets the same results and has access to the same material. I think it is important to get people out of their comfort zones by presenting them with challenging information, rather than putting a protective shield around them. In my opinion, Facebook should show users posts from all of their friends on their feed, regardless of their religious beliefs, political affiliation, interests, etc. I believe users, not algorithms, should have the power to pick what they do or do not see online. There are always two sides to every story, and I, for one, would like to see them both.