fbtrex — Background

Tracking Exposed
Mar 2, 2018


facebook.tracking.exposed (fbtrex) is a unique tool for observing the social network from different points of view: it is our approach to the collective monitoring of algorithmic influence.

It is currently focused on Facebook, but it could be expanded to cover more platforms in the future.

The project’s goal is to provide an open technology that enables analyses from different backgrounds, because the impact of the Algorithm is not just about “political campaigning”: it is about power.

To observe a complex system such as a religious organization, a nation-state, or a multinational corporation, one has to look at it from many points of view to understand its impact and influence, because every person interacting with the entity will see something different. The same approach applies to the social graph and the information flowing through it. We already know Facebook is not neutral, but to understand the biases of the Algorithm, we cannot look at it from inside our personal profiles alone.

With a browser extension you can record what Facebook gives to you (as opposed to what you give to Facebook), and the data collected this way allow you to literally rewind your Facebook timeline. But, as in the analogy above, every user has a personal, unique, algorithm-filtered observation point. Combined, these data make it possible to figure out how the Facebook algorithm works: to what extent it deforms, shapes, “curates”, or possibly censors the information you receive.
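
As a rough illustration of what “recording what Facebook gives to you” means in practice, here is a minimal sketch of a content script that snapshots the posts rendered in the newsfeed and sends them to a collection server. The selectors, endpoint, and field names are hypothetical assumptions for the example, not the actual fbtrex code.

```typescript
// Hypothetical content-script sketch (selectors, endpoint, and field names
// are illustrative, not the real fbtrex implementation).

// A minimal record of one impression: what the timeline showed, and when.
interface TimelineImpression {
  html: string;            // outer HTML of the post as rendered for this user
  impressionTime: string;  // ISO timestamp of when it was captured
  visibility: "public" | "restricted";
}

// Hypothetical selector for a rendered newsfeed post.
const POST_SELECTOR = "[role='article']";
// Hypothetical collection endpoint run by the researchers.
const COLLECTOR_URL = "https://example.org/api/v1/events";

function snapshotVisiblePosts(): TimelineImpression[] {
  const posts = Array.from(document.querySelectorAll<HTMLElement>(POST_SELECTOR));
  return posts.map((node) => ({
    html: node.outerHTML,
    impressionTime: new Date().toISOString(),
    // Illustrative heuristic: restricted-audience posts are flagged here and
    // dropped before upload; only public posts are kept.
    visibility: node.querySelector("[aria-label*='Public']") ? "public" : "restricted",
  }));
}

// Periodically send what the timeline showed (not what the user typed).
setInterval(() => {
  const impressions = snapshotVisiblePosts().filter((i) => i.visibility === "public");
  if (impressions.length === 0) return;
  fetch(COLLECTOR_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(impressions),
  }).catch(() => { /* collection is best-effort; never break the page */ });
}, 10_000);
```

The point of the sketch is the direction of the data flow: it passively observes what the timeline renders for you, and it sends nothing that you write.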

Because, don’t forget, algorithms can have unintended consequences.

This project has a social rather than a business purpose (although we still have to find ethical ways to sustain it), and we have bound ourselves to some self-imposed policies regulating access to this dataset:

  1. Only the public posts that appear on your timeline are collected. If a post is shared with “friends” or another restricted audience, it is ignored. Only the person who contributes the data can share it, and they can share it with third parties if they want.
  2. Access to the full dataset is limited to research that analyzes collective phenomena in the public interest: the output must not contain personal information or behavior linkable to a single user. We enforce this by running the MapReduce jobs on our servers and submitting the output to a privacy assessment (see the sketch below this list).
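
To illustrate the kind of aggregation this policy allows, here is a minimal sketch in the spirit of the map/reduce step mentioned in point 2: it reduces collected impressions to per-publisher counts. The record shape and field names are assumptions, not the real fbtrex schema; the point is that the output contains no per-user information.

```typescript
// Hypothetical aggregation sketch: the record shape and field names are
// illustrative, not the real fbtrex schema.

interface CollectedPost {
  contributorId: string;   // pseudonymous id of the contributing profile
  source: string;          // the page or publisher that authored the post
  impressionTime: string;  // when it appeared on that contributor's timeline
}

// "Map" step: key each impression by publisher; "reduce" step: count them.
// The output deliberately drops contributor ids, so no single user's
// behavior can be reconstructed from it.
function impressionsPerSource(posts: CollectedPost[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const post of posts) {
    counts.set(post.source, (counts.get(post.source) ?? 0) + 1);
  }
  return counts;
}

// Example: aggregate impressions without exposing who saw what.
const sample: CollectedPost[] = [
  { contributorId: "u1", source: "Newspaper A", impressionTime: "2018-02-20T10:00:00Z" },
  { contributorId: "u2", source: "Newspaper A", impressionTime: "2018-02-20T10:05:00Z" },
  { contributorId: "u1", source: "Party B", impressionTime: "2018-02-20T10:06:00Z" },
];
console.log(Object.fromEntries(impressionsPerSource(sample)));
// { "Newspaper A": 2, "Party B": 1 }
```

This mirrors policy 2: contributors’ identifiers never leave the server-side aggregation; only the collective counts do.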

The goal of this phase is primarily informative: we want to show, as concretely as possible, what an algorithm does. The project, nicknamed fbtrex, is part of the tracking-exposed umbrella, because behavioral tracking is a form of passive surveillance, and algorithmic influence is a consequence of this otherwise invisible user profiling.

COOL! But, really, nobody cares.

You, who can fully understand these words, are a small minority among the minority of highly educated humans in this world. Very likely you will end up contributing to the interests of this control system. Nothing personal: we are like you. That’s why we can’t be the only target audience.

After the 2016 voting games, narratives like “oh my god, I can’t believe citizens are so easily manipulated; some hidden conspiracy has to be blamed” gained traction, and this is good, because it was about time user profiling and algorithmic manipulation got addressed in a more mainstream fashion.

The problem surfaces prominently during a public event like an election, but in fact it is there every day of the year.

Elections are good at catching attention, we need to tell a story, and an opportunity showed up in a country we know pretty well. We are analyzing this Mess-Ahead-Coming-Soon, election day on the 4th of March:

https://www.youtube.com/watch?v=LdhQzXHYLZ4

This is post number 0 of the Italian Elections 2018 series: 1- Testing Facebook algorithm in an electoral campaign (methodology), 2- first sighting of Facebook power abuse, 3- judging algorithm discrimination, 4- Facebook stab online media twice, 5- The Iron Bubble (or: how the Facebook algorithm insulates fascists from reality).
