The Filter Bubble

Given the massive number of posts published on Facebook every day and the limited time users spend online, it is practically impossible to read, or even see, every post. The platform therefore selects which posts are shown to each user and which remain hidden. Even though this is fairly obvious, most people never stop to wonder how Facebook prioritizes.

The news feed is constructed by an algorithm (called EdgeRank) that is continuously being improved. EdgeRank analyses every available signal to determine the relevance of each post and each person. In the beginning, relevance depended only on how many people liked a post; nowadays, hundreds of other indicators are taken into account, such as the popularity of the post creator’s earlier posts (both overall and with the individual viewer), the type of post (status update, photo, video, or link) and the time the post was published, to name just a few.
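The exact formula is proprietary, but EdgeRank was publicly described in broad strokes as multiplying an affinity score, a content-type weight, and a time decay. The following is only a toy sketch of that idea; the field names, weights, and half-life value are invented for illustration, not Facebook's actual parameters.

```python
from dataclasses import dataclass
import time

@dataclass
class Post:
    author_affinity: float   # how often the viewer interacts with the author (0..1)
    type_weight: float       # hypothetical weight per content type, e.g. photo > link
    created_at: float        # Unix timestamp of publication

def edge_rank(post: Post, now: float, half_life_hours: float = 24.0) -> float:
    """Toy relevance score: affinity x content weight x exponential time decay."""
    age_hours = max(0.0, (now - post.created_at) / 3600.0)
    decay = 0.5 ** (age_hours / half_life_hours)  # halves every half_life_hours
    return post.author_affinity * post.type_weight * decay

# Ranking a small feed: recent posts from close friends float to the top.
now = time.time()
feed = [
    Post(0.9, 1.5, now - 2 * 3600),    # close friend, photo, 2 h old
    Post(0.2, 1.0, now - 1 * 3600),    # distant acquaintance, link, 1 h old
    Post(0.9, 1.5, now - 72 * 3600),   # close friend, photo, 3 days old
]
ranked = sorted(feed, key=lambda p: edge_rank(p, now), reverse=True)
```

Even in this crude sketch, a fresh post from a low-affinity contact can outrank an older post from a close friend, which mirrors the trade-offs the real system has to balance.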

However, some filters may be more worrying, especially with regard to political orientation. The internet activist Eli Pariser gives the following example: because he clicked on his liberal friends’ posts more often, his conservative friends’ posts eventually stopped appearing in his news feed. Such a personalized, filtered news feed is questionable because you are only fed information from sources you agree with: on the one hand friends who share your opinion, on the other hand the pages you have liked. The result is that you live in a “filter bubble” of your own making. In the long term, and perhaps slightly exaggerated, this phenomenon might lead to political radicalisation, since you no longer come into contact with opposing opinions.

Naturally, Facebook is not the only platform employing such filtering; Google does so as well. On the one hand, this filtering process is certainly useful in terms of efficiency: a musician and a geologist get different results when they search for “types of rock”. On the other hand, it is alarming if people are unaware that the list of search results is personalized and does not provide an objective overview.

Another problem with algorithms concerns their designers: programmers are mostly white academics between 25 and 35 years old. Since their algorithms, at least in part, reflect their creators’ ideals, a more diverse group of designers would help reduce algorithmic bias.