Locked in a (Filter) Bubble


Have you ever noticed how the stories in your Facebook News Feed seem to pop up at the eerily perfect time and always feel so relatable? Or how the ads alongside your Google search are for the exact thing you were looking at on Amazon yesterday? Even if you haven't noticed, trust me: it's been happening.

The reason behind these personalized versions of the internet we have all come to know and use every day is something called a "filter bubble." A filter bubble is the product of an algorithm a website uses to predict what you want to see based on what you've looked at previously. For example, if you like a video of puppies on Facebook on Monday, on Tuesday there will probably be more puppy videos in your News Feed.
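To make that mechanism concrete, here is a minimal sketch in Python of the kind of preference-weighted ranking a feed might use. The topic tags, scoring rule, and data shapes are my own illustrative assumptions, not Facebook's actual algorithm; real systems are vastly more elaborate, but the feedback loop is the same: past likes boost similar content.

```python
from collections import Counter

def rank_feed(posts, liked_topics):
    """Rank candidate posts by overlap with topics the user has liked.

    posts: list of dicts like {"id": 1, "topics": {"puppies", "news"}}
    liked_topics: Counter mapping topic -> number of past likes
    """
    def score(post):
        # A post scores higher the more its topics match past likes,
        # so Monday's puppy likes surface more puppy videos on Tuesday.
        return sum(liked_topics[t] for t in post["topics"])

    return sorted(posts, key=score, reverse=True)

# Toy usage: the user has liked puppy videos most, so puppy posts rise.
likes = Counter({"puppies": 3, "news": 1})
feed = [
    {"id": 1, "topics": {"politics", "news"}},
    {"id": 2, "topics": {"puppies"}},
    {"id": 3, "topics": {"cooking"}},
]
print([post["id"] for post in rank_feed(feed, likes)])  # [2, 1, 3]
```

Run it and the puppy post jumps to the top while the cooking post sinks to the bottom, which is exactly the loop that, at scale, quietly narrows what you see.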

While these bubbles seem, on the surface, totally harmless and even helpful, they have a dark side. As we keep using the same websites and they gather more data about each of us, their algorithms become more precise. They tailor the content each of us sees to whatever the algorithm predicts we want. So instead of receiving a diverse stream of content on a variety of topics, we become trapped in our bubble, seeing only what the algorithm wants us to see.

These filter bubbles have become the modern-day gatekeepers of information. There have been information bubbles before, like politically leaning newspapers and magazines, but nothing like this. The difference between the filter bubbles of yesterday and today is that today's are all-encompassing, allowing people to never even encounter another perspective. And because creating content has become so easy, with blogs, Twitter, and the like, it can seem as though we are getting diverse information even when we are really seeing the same filtered, biased information from multiple outlets.

Filter bubbles are a hot-button issue because they can completely blind a person to a huge part of the world and keep them inside their tiny, comfortable bubble. The world is full of things that make people uncomfortable, and often it is exactly those things that broaden our perspectives and help us relate more closely to one another. Filter bubbles, on the other hand, separate us by interest and content, deepening the social divide between people with different interests. While these filters help us find oodles of the content we want to see, they have made it increasingly difficult to be an informed member of society with a balanced perspective on a range of issues.

The solution to this problem is incredibly complicated. On one hand, people want to see what they want to see, and these filters have put more desired content into the hands of those who desire it. On the other hand, I would argue that information gatekeepers like Facebook and Google have a social responsibility to present at least some varying content, even if it makes some users uncomfortable. Mark Zuckerberg, who bears his own share of the blame for the pervasiveness of these filter bubbles, has spoken about the dynamic behind them, saying, "a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa." It is an issue worth talking about, and as the internet evolves and these algorithms become ever smarter, we can hope that content becomes more relatable and more diverse at the same time.