Outsmarting algorithms

“Who decides what matters to me?”

Claudio Agosti is a professional programmer and information security analyst based in Berlin. Agosti is part of ALEX (Algorithms Exposed), a pan-European research collective committed to algorithm literacy and algorithm accountability. He explains how every user can contribute to the process.

Every day, all over the world, algorithms on social media manipulate information to influence our decision-making. Whether they’re helping us streamline our search for a new vacuum cleaner or presenting us with radical religious or political ideas, algorithms determine what we see online, and may influence our behaviour offline. Big social media companies use these algorithms to manipulate content without our (informed) consent, while selling the information they collect on us to third parties in the process.

Facebook, for example, has built a platform that combines likes, reactions, and engagement into a metric of conversion success. Basically, the more you click, comment, and spend time looking at posts, the more profitable the Facebook platform becomes. And that system, which is optimized for maximum profitability for the company, is easily manipulated: campaign managers and political parties happily hire marketing and social media specialists to optimize their online presence for the platform, or buy the services of click farms, which make money by setting up fake profiles.

Accountability of big social tech companies is still virtually non-existent: the European Parliament, the US Congress, and the UK Parliament have all questioned Mark Zuckerberg on Facebook’s profit-generating strategy, and the company signalled significant changes to its business model while vowing to stop political propaganda on its network. But despite signing a code of practice on disinformation, the company has not complied with its own promises. The only form of accountability it has so far delivered on concerns advertising: Facebook promised the European Commission an “ad library” to offer some transparency into political advertising.

Cruising content

I am part of a group of researchers based in Berlin and Amsterdam that, under the name ALEX (short for Algorithms Exposed), has designed a tool that can monitor online behaviour. The tool I built has previously tracked misinformation during Italian elections, and it looks for evidence of potentially illicit and dangerous online influence. It helps people understand how what they view online is influenced by advertisers or, worse, by malicious political actors and other non-state actors. We have, in short, found a way to outsmart algorithms.

Our software allows researchers and journalists to analyse information circulated on Facebook and YouTube by corporations, political interest groups, and foreign powers. The tool takes a copy of the personalized content the platform shows you and extracts metadata to permit analysis and data reuse. It can track language or open-access posts that people have shared in order to map key trends on a topic. In short, the subject of our investigation is Facebook itself.
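To make the idea of “extracting metadata for reuse” concrete, here is a minimal sketch of that step. The real extension’s capture format is not described here, so the record layout, field names, and `extract_metadata` function below are all assumptions for illustration, not the project’s actual code:

```python
from dataclasses import dataclass

# Hypothetical record layout: the real capture format is not public here,
# so these fields are an assumption for illustration only.
@dataclass
class PostMetadata:
    source: str        # page or group that published the post
    post_type: str     # e.g. "photo", "video", "sponsored"
    permalink: str     # kept only for publicly shared posts
    reactions: dict    # reaction name -> count

def extract_metadata(raw: dict) -> PostMetadata:
    """Reduce a captured timeline entry to reusable, impersonal metadata.

    Anything tied to the observing user (their name, their friends'
    comments) is deliberately dropped, so individuals are not profiled.
    """
    return PostMetadata(
        source=raw.get("publisher", "unknown"),
        post_type="sponsored" if raw.get("is_ad") else raw.get("type", "post"),
        permalink=raw["permalink"] if raw.get("is_public") else "",
        reactions={k: int(v) for k, v in raw.get("reactions", {}).items()},
    )

# Example captured entry (invented data)
entry = {
    "publisher": "Example Page",
    "type": "photo",
    "is_ad": False,
    "is_public": True,
    "permalink": "https://example.com/post/123",
    "reactions": {"like": 10, "angry": 2},
}
print(extract_metadata(entry).reactions["angry"])  # prints 2
```

The key design point is the one the project emphasises: metadata about *what the platform showed* is kept, while identifiers of *who was shown it* are discarded.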

I’ll give you an example of how the tool works. Daria Tombolelli and Hilla Zucker, to whom I had granted access to the database, explored data from roughly 1,000 people across the world during a “datathon” organized by berlin-dssg.org. These 1,000 individuals saw a total of 4.5 million posts in one year, all selected by Facebook’s algorithms as “meaningful content”. The compiled data showed a really interesting trend: Facebook users posted a noticeably higher share of angry reactions in the second half of the year.

If Facebook’s mission is to bring people closer together, why then do people experience anger when they spend time on the platform? Were people just particularly angry in that time period? Possibly. Or was Facebook circulating more anger- and emotion-eliciting advertisements? More research is needed to figure this out.
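The trend analysis above boils down to a simple aggregation: split observed reactions by half-year and compare the share of “angry” ones. The sketch below uses a handful of invented observations (the real dataset covers about 4.5 million posts) purely to show the computation:

```python
from collections import Counter
from datetime import date

# Invented sample of (date, reaction) observations; the real ALEX dataset
# is far larger, so this merely illustrates the aggregation step.
observations = [
    (date(2018, 2, 1), "like"), (date(2018, 3, 5), "angry"),
    (date(2018, 8, 9), "angry"), (date(2018, 10, 2), "angry"),
    (date(2018, 11, 20), "like"),
]

def angry_share(obs, half):
    """Fraction of 'angry' reactions in a half-year (1 = Jan-Jun, 2 = Jul-Dec)."""
    in_half = [reaction for d, reaction in obs if ((d.month - 1) // 6) + 1 == half]
    counts = Counter(in_half)
    return counts["angry"] / len(in_half)

first, second = angry_share(observations, 1), angry_share(observations, 2)
print(second > first)  # True for this toy sample: anger rose in the second half
```

Note what this comparison cannot tell us on its own: whether users were angrier, or whether the platform surfaced more anger-eliciting content, which is exactly the open question above.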

Fight misinformation from your living room

We need your help to stop malicious social media practices and to fight future threats to democracy. This is even more urgent with the European Parliament elections right around the corner, since transparent and unbiased access to information is critical to a representative result.

To outsmart the complex algorithmic systems deployed by Facebook and other corporate actors, we need more data. Misinformation is targeted. And because our experiences are personalized, we lack a shared, collective experience, which limits our understanding of the danger of the practice.

How can you help?

  1. Install our web browser extension (download it here for Firefox or Chrome!). The more of us who sign up, the more accurately we can measure malicious practices! Your privacy is not in any danger, since we only analyse posts that are shared publicly. You retain full control over your data, and we will not profile individuals.
  2. Use the European Map of topics. This tool transforms observed Facebook posts into an RSS feed, letting you follow updates free of any personalized filtering algorithms.
  3. Contribute your creativity and expertise! Are you a researcher or journalist? Use our data to analyse online phenomena on social media and publish your work to combat the threat to our common Europe. Email us at support at tracking dot exposed. If you want to use the dataset for your research, read more about our data activism here: https://eu19.tracking.exposed/page/data-activism/.

Thank you for helping us outsmart algorithms and create a more transparent and accountable platform!