The Iron Bubble (or: how the Facebook algorithm insulates fascists from reality)

Tracking Exposed
Mar 3, 2018 · 3 min read

The second post of this series presented an analysis of the data we gathered to document the biases of the Facebook news feed algorithm in the context of the upcoming Italian elections. In this post, we want to concentrate on a specific aspect, the fascist point of view, which revealed a disturbing trend.

(A quick summary of the project if you missed the first post: we created six Facebook bots, gave each political sympathies by making them “like” the Facebook pages of specific parties, and then let them scroll their news feeds for three weeks, recording what they saw. All six bots “followed” the same pages, but no two of them saw the same posts: the algorithm chose the “best” posts for each of them)

As the visualization below shows, Michele, the bot pretending to be a fascist, enjoyed a radically different news feed experience from the others:

Total number of posts seen by each bot (the wider the bar, the more posts), grouped by how many times the posts were repeated (the higher up the bar, the more times).

Fash-bot Michele was shown a much smaller variety of posts, repeated far more often than normal: it saw some posts as many as 29 times in the 20 days covered by the data set.
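
For readers who want to check this kind of count themselves, here is a minimal sketch of how the per-bot repetition distribution behind the chart could be computed. This is not our actual fbtrex pipeline: it assumes a hypothetical CSV export named impressions.csv with one row per recorded impression and columns bot, post_id and seen_at.

```python
# Minimal sketch (not the fbtrex pipeline) of the repetition counts behind the chart.
# Assumes a hypothetical CSV of recorded impressions: bot, post_id, seen_at.
import pandas as pd

impressions = pd.read_csv("impressions.csv")  # hypothetical export of the bots' timelines

# How many times each bot was shown each post over the 20 days.
repeats = (
    impressions.groupby(["bot", "post_id"])
    .size()
    .rename("times_shown")
    .reset_index()
)

# Distribution used in the chart: for every bot, how many distinct posts
# were shown exactly 1, 2, ..., 29 times.
distribution = (
    repeats.groupby(["bot", "times_shown"])
    .size()
    .rename("distinct_posts")
    .reset_index()
)
print(distribution.sort_values(["bot", "times_shown"]))
```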

For comparison, imagine a “fair” distribution, in which every post shows up at least once in each bot’s news feed and half of them appear twice. If that were the case, the graph would look like this:

Hypothetical “fair” distribution (for comparison only, not a real data set)
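
And here is a toy construction of that “fair” baseline, matching the description above (every post shown at least once, half of them twice). The post count is illustrative, not taken from our data.

```python
# Toy construction of the hypothetical "fair" feed described above:
# every post is shown once, and half of them a second time.
import pandas as pd

def fair_baseline(n_posts: int) -> pd.DataFrame:
    """Repetition distribution of the hypothetical fair feed."""
    shown_twice = n_posts // 2
    shown_once = n_posts - shown_twice
    return pd.DataFrame(
        {"times_shown": [1, 2], "distinct_posts": [shown_once, shown_twice]}
    )

print(fair_baseline(1000))  # illustrative number of distinct posts
```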

I mostly agree with studies such as “Political polarization? Don’t blame the web”, especially because the opposite belief is a kind of techno-determinism that, in my view, glosses over a lot of political complexity. But the data above show a frightening situation: the Michele bot has been segregated by the algorithm and only receives content from a very narrow political area. Sure, let’s make fun of fascists because they mostly see pictures or because they are ill-informed, but these people will vote in 31 hours. Not cool.

If you are curious which posts the Facebook algorithm deemed so essential that they had to be shown 29 times each (more than once a day, on average), here they are, all three of them. The third is peculiar, with its message that “mass media does not give us a platform, they never even mention our name, but people still declare they will vote for us. Mass media is a scam, spread the word”.

This is the 5th post in the Italian Elections 2018 series: 0- fbtrex Background, 1- Testing Facebook algorithm in an electoral campaign (methodology), 2- first sighting of Facebook power abuse, 3- judging algorithm discrimination, 4- Facebook stab online media twice.
