https://ru.wikipedia.org/wiki/%D0%A4%D0%B0%D0%B9%D0%BB:%D0%9F%D1%83%D0%B7%D1%8B%D1%80%D1%8C_%D1%84%D0%B8%D0%BB%D1%8C%D1%82%D1%80%D0%BE%D0%B2.jpg

Filter Bubbles

Rocio Perez
The Open Book

--

A “filter bubble” shapes what you see when you search or browse sites like Google and Facebook. These sites run algorithms that guess what information you want to see based on your past behavior, and in doing so they prevent you from seeing perspectives or values other than your own. For example, Eli Pariser, who coined the term “filter bubble,” explained in a TED Talk that Facebook uses your past activity, including which statuses you liked or commented on, to determine which posts appear in your news feed.

Pariser also brings up the idea of information “gatekeepers.” In the past, he explains, human editors were the gatekeepers who decided what information people received. Today, I would argue, our gatekeepers are the internet itself and those who have the power to build and control it. Those who work for Facebook or Google should spend their time preventing these filter bubbles. According to a contributor at MIT Technology Review, researchers at Yahoo Labs have found a possible way to conquer the filter bubble: a recommendation engine that connects users to people who hold opposing views on one topic but share similar values on others. That way, users encounter opposing views from people with whom they still have some common ground.
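The article doesn’t describe how the Yahoo Labs engine actually works, but the idea can be sketched in a few lines. The following is a minimal illustration, not their implementation: it assumes made-up users (“alice,” “bob,” “carol”) with stance scores from -1.0 (against) to +1.0 (for) on a few topics, and recommends people who disagree on a target topic but agree elsewhere.

```python
# Hypothetical stance data: each user's position on several topics,
# scored from -1.0 (strongly against) to +1.0 (strongly for).
stances = {
    "alice": {"climate": 0.9, "taxes": 0.4, "transit": 0.8},
    "bob":   {"climate": -0.8, "taxes": 0.5, "transit": 0.7},
    "carol": {"climate": -0.9, "taxes": -0.6, "transit": -0.7},
}

def similarity(a, b, skip):
    """Average closeness on all shared topics except `skip` (1.0 = identical)."""
    topics = [t for t in a if t in b and t != skip]
    if not topics:
        return 0.0
    return 1.0 - sum(abs(a[t] - b[t]) for t in topics) / (2 * len(topics))

def recommend(user, topic, min_gap=1.0):
    """Suggest users who disagree with `user` on `topic` (stance gap >= min_gap)
    but agree with them elsewhere, ranked by that remaining similarity."""
    me = stances[user]
    candidates = []
    for other, them in stances.items():
        if other == user or topic not in them:
            continue
        if abs(me[topic] - them[topic]) >= min_gap:
            candidates.append((other, similarity(me, them, skip=topic)))
    return sorted(candidates, key=lambda pair: -pair[1])

print(recommend("alice", "climate"))
```

For Alice, both Bob and Carol oppose her on climate, but Bob ranks first because he agrees with her on taxes and transit, which is exactly the point: the opposing view comes from someone she still shares values with.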

Information gatekeepers have a responsibility to give users information from various perspectives, not just a user’s own. Pariser even argues that filter bubbles prevent us from having a functioning democracy. If we can’t easily access all forms of information, then we’ll be further alienated from positive change in society. If people can’t see all sides of a story, then they’ll be subject to biases and won’t be able to question their own values. Questioning your values is essential for gaining knowledge and changing for the better.

If I were Facebook or Google, I would definitely try to find a way to prevent filter bubbles, especially because they have important implications for staying connected to others and getting a proper education. The MIT Technology Review contributor did note that social media users aren’t always happy when those sites change in any way. Still, the Yahoo Labs researchers had some success in countering filter bubbles with their engine. Facebook could follow their lead, or it could mix a user’s preferences with some opposing views so that users are exposed to other ideas. Google could even offer an “opposing views” setting to give people more control over what they’re exposed to. Either way, websites should allow more user control so we can decide for ourselves whether we want to be informed about other perspectives.
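The “mix preferences with opposing views” idea above could be as simple as reserving a slice of the feed for items outside the user’s bubble. This is a toy sketch with invented item names; the `opposing_ratio` parameter is an assumption standing in for the user-facing “opposing views” setting the post suggests.

```python
import random

# Hypothetical ranked feed: items a ranking model already scored as matching
# the user's preferences, plus a separate pool of opposing-view items.
preferred = ["status A", "status B", "status C", "status D"]
opposing = ["counterpoint X", "counterpoint Y"]

def blended_feed(preferred, opposing, opposing_ratio=0.25, seed=None):
    """Interleave opposing-view items into a preference-ranked feed so that
    roughly `opposing_ratio` of the result challenges the user's bubble.
    Exposing `opposing_ratio` as a settings slider would give users the
    kind of control the post argues for."""
    rng = random.Random(seed)
    feed = list(preferred)
    n_opposing = max(1, round(opposing_ratio * len(feed))) if opposing else 0
    for item in rng.sample(opposing, min(n_opposing, len(opposing))):
        # Insert below the top slot so the feed still leads with a preference.
        feed.insert(rng.randrange(1, len(feed) + 1), item)
    return feed

print(blended_feed(preferred, opposing, seed=0))
```

Setting `opposing_ratio=0` reproduces today’s bubble; raising it toward 1 would flood the feed with challenges, which is why a user-controlled dial, rather than a fixed platform choice, matches the post’s conclusion.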

--