Facebook stabs online media twice
With this update, we will also prioritize posts that spark conversations and meaningful interactions between people. To do this, we will predict which posts you might want to interact with your friends about, and show these posts higher in feed. These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to — whether that’s a post from a friend seeking advice, a friend asking for recommendations for a trip, or a news article or video prompting lots of discussion.
This is an official Facebook update. But thanks to the data collected with fbtrex, we can see the consequences of this announcement.
Which is absurd on its own! Why does Facebook announce an update if we can't know what happens behind the scenes? The algorithm changes constantly anyway, so what was different this January?
In any case, as a third-party evaluation technology, we can provide independent insight. What follows is a long-term analysis of the phenomena seen in the earlier post, using this color code:
Below you'll see only three users: Britta, Antonietta and Santiago look equivalent, while Oliviero and Michele changed over the days.
In January (the yellow rectangle), the users were treated nearly the same way. Around the 6th of February, Michele's ratio began to change, with pictures increasing; in the black square it reaches its maximum. For Oliviero the change occurred later: in the green box, you'll see that the overwhelming majority of posts showing up in that News Feed are videos.
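The ratio shown in these charts can be reproduced from the collected impressions. A minimal sketch, assuming each impression record carries a `user`, a `day` and a post `type` (these field names are my assumption about the dataset layout, not necessarily the fbtrex schema):

```python
from collections import Counter, defaultdict

def daily_type_ratio(impressions):
    """For each (user, day), compute the share of each post type
    (photo, video, link, text) among the collected impressions."""
    counts = defaultdict(Counter)  # (user, day) -> Counter of post types
    for imp in impressions:
        counts[(imp["user"], imp["day"])][imp["type"]] += 1
    ratios = {}
    for key, counter in counts.items():
        total = sum(counter.values())
        ratios[key] = {t: n / total for t, n in counter.items()}
    return ratios

# Dummy data illustrating a feed shifting towards pictures
sample = [
    {"user": "michele", "day": "2018-02-06", "type": "photo"},
    {"user": "michele", "day": "2018-02-06", "type": "photo"},
    {"user": "michele", "day": "2018-02-06", "type": "link"},
    {"user": "michele", "day": "2018-02-06", "type": "video"},
]
print(daily_type_ratio(sample)[("michele", "2018-02-06")]["photo"])  # 0.5
```

Plotting these ratios day by day, one per user, gives exactly the kind of timeline where a yellow, black or green box can be drawn around a regime change.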
… and this is “bringing people closer together”??
Online media monetize with links, but refresh after refresh, links aren't the content that keeps showing up:
It is clear: news media monetize with links, but in the long term only pictures keep showing up. Next you'll find the same visualization, but with January and February compared. Note that there is no Display 0 below, only actual impressions.
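The month-over-month comparison can be sketched the same way: group impressions by month and compute the share of each type, skipping the "Display 0" rows (the `impressed` flag and field names below are my assumptions for illustration):

```python
from collections import Counter

def monthly_type_share(impressions):
    """Share of each post type per month, counting only posts
    that were actually displayed (impressed > 0)."""
    by_month = {}
    for imp in impressions:
        if imp["impressed"] == 0:  # exclude "Display 0" rows
            continue
        month = imp["day"][:7]  # e.g. "2018-01"
        by_month.setdefault(month, Counter())[imp["type"]] += 1
    return {
        m: {t: n / sum(c.values()) for t, n in c.items()}
        for m, c in by_month.items()
    }

sample = [
    {"day": "2018-01-15", "type": "link",  "impressed": 1},
    {"day": "2018-01-16", "type": "photo", "impressed": 1},
    {"day": "2018-02-10", "type": "photo", "impressed": 1},
    {"day": "2018-02-11", "type": "link",  "impressed": 0},  # never shown
]
print(monthly_type_share(sample)["2018-02"]["photo"])  # 1.0
```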
What happens next?
This is a preliminary analysis made with the data collected between the 10th of January and the 21st of February. After the election, collection will continue for another week, and we will release a clean dataset and an updated blog post.
To be fully transparent, the project is not doing well. Without resources and a dedicated team working on it, we can't keep it alive. It has been in β-stage since December 2016 and has displayed its potential, but now we need: a fundraiser, a UX designer, communication, software developers and data analysts. It is a free software project, and I'm open to moving into an existing organization that shares the passion for an open Internet: controlling our own algorithms sounds radical only because we have a shitty status quo ;)
The goal is to create an infrastructure which enables third-party analysis and data reuse. Algorithm analysis should be part of these analyses. The architecture must embed privacy-preserving capabilities.
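One minimal example of such a privacy-preserving capability: pseudonymizing contributor identifiers with a keyed hash before any dataset is released, so records can still be grouped per user without exposing who the user is. This is a sketch of the general technique, not the actual fbtrex implementation:

```python
import hashlib
import hmac

SERVER_SECRET = b"replace-with-a-real-secret"  # kept server-side, never published

def pseudonymize(user_id: str) -> str:
    """Map a contributor id to a stable token: the same user always
    gets the same token, but without the secret the token cannot be
    reversed or brute-forced from a list of candidate ids."""
    return hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonymize("michele") == pseudonymize("michele"))   # True: stable per user
print(pseudonymize("michele") == pseudonymize("oliviero"))  # False: users stay distinct
```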
At the moment we have a small team of volunteers, to which I have to give a huge THANK YOU for helping with this Italian election monitoring: Federico Sarchi, Rugantio, Velenux, Laura Boschi, Gianluca Oldani, Riccardo Coluccini, Raffaele Angus and Manuel d’Orso.
Talking again about a hypothetical roadmap:
- supporting new platforms and mobile apps
- letting users test their own algorithms over the content they shared
- letting users compare their information diets (we can't develop algorithms that tell you whether you got decent information or not, but at least you can put your experience in context)
- providing full transparency on the algorithms running over the dataset: if we want to teach users to demand algorithm transparency, we have to start with ourselves
- improving communication outside our techno-hacktivist bubble
- surviving any kind of Facebook countermeasure
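The second roadmap point could look like this in practice: the user supplies their own scoring function and re-ranks their collected posts with it, instead of accepting Facebook's opaque ordering. A sketch of the idea, not an existing fbtrex feature:

```python
def rerank(posts, score):
    """Re-order a user's collected posts with a user-supplied
    scoring function; highest score first."""
    return sorted(posts, key=score, reverse=True)

posts = [
    {"type": "photo", "source": "friend"},
    {"type": "link",  "source": "news"},
    {"type": "video", "source": "page"},
]

# Example user-defined algorithm: prefer links, especially from news sources
def prefer_news_links(post):
    return 2 * (post["type"] == "link") + (post["source"] == "news")

print(rerank(posts, prefer_news_links)[0]["type"])  # link
```

The point is that the ranking logic is a plain function the user can read, edit and swap, which is exactly the transparency the News Feed denies.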
And to do this, I need a team and a few months of cover. For any additional information, contact claudio aţ tracking dũt exposed
This is a post in the Italian Elections 2018 series: 0- fbtrex Background, 1- Testing Facebook algorithm in an electoral campaign (methodology), 2- first sighting of Facebook power abuse, 3- judging algorithm discrimination, 5- The Iron Bubble (or: how the Facebook algorithm insulates fascists from reality).