10,000 Ways Meta Manipulates your Facebook Feed

Sharon Winkler
Published in Social Media Harms · Oct 28, 2021


Facebook’s algorithms are shaping your view of the world and your emotions


Author’s note, April 27, 2024: In the absence of any U.S. federal regulation of social media companies, all of the concerns raised in this piece, written in 2021, remain accurate. Many thanks to Frances Haugen and former Meta engineer Arturo Bejar for their courage in exposing social media harms.

Facebook has been intentionally manipulating its users for years, subtly changing its algorithmic recommendation system, or “Feed,” to boost users’ time on the platform and increase corporate revenues. The “Facebook Papers” are a collection of thousands of documents accumulated by former Facebook (FB) employee Frances Haugen during her last months working for the company. Haugen submitted these documents to the U.S. Securities and Exchange Commission, the U.S. Congress, and multiple news organizations.

Following the release of these documents, a wave of current and former Facebook employees began speaking with journalists about their technical work at FB. For years, many FB employees had expressed concerns within the company about the negative effects of the News Feed on users’ mental and physical health, and about its unintended amplification of misinformation, online bullying, hate speech, and violent content. These comments were shared on an internal employee messaging system, and screenshots of them are included in the Facebook Papers.

How FB’s Feed Manipulates Users

Information on how the FB Feed is manipulated is beginning to be discussed in public forums. Merrill and Oremus’ Washington Post article, “Five points for anger, one for a ‘like’: How Facebook’s formula fostered rage and misinformation,” based on information obtained through the Facebook Papers, highlighted the following:

  • FB uses 10,000 data points, or “signals,” to determine which posts are given higher priority in an individual user’s News Feed
  • 2016: FB adds additional reaction emojis: “love,” “haha,” “wow,” “sad,” and “angry”
  • 2017: News Feed algorithm changed so that posts receiving the above emoji reactions get five times (5X) the ranking points of a “like”
  • Ranking signals, or points, also include:
    • long comments on a user’s post are awarded more points than single-character plain text or avatar responses
    • live video gets substantially more points than recorded video
    • better internet service = more points
  • 2018: angry emoji downgraded to 4X a like
  • 2019: no limit on how high a ranking score could go. FB identified one post with 1 billion points; even if FB cut the points in half, that post would still score high enough to appear at the top of a user’s News Feed
  • 2020 study: angry was the least used of the six emoji reactions, with 429 million angry clicks per week, compared with 63 billion likes and 11 billion loves. The document states that angry reactions were “much more frequent” on problematic posts: “civic low quality news, civic misinformation, health misinformation, and health antivax content”
  • Sept 2021: angry emoji demoted to a score of 0 points; love and sad changed to 2 points; single-character comments no longer receive points
  • After the Jan 6, 2021 riot: cap on the weight of live videos reduced to 60X a like
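Taken together, this reporting describes a weighted scoring system. The sketch below is purely illustrative; the function and variable names are mine, not Facebook’s, and only the reaction weights come from the reporting above (FB’s actual system uses roughly 10,000 signals that are not public):

```python
# Hypothetical sketch of the emoji-weighted ranking described above.
# Only the weights are from the Facebook Papers reporting; everything
# else is an assumption for illustration.

REACTION_WEIGHTS_2017 = {
    "like": 1,
    "love": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5,
}

# After the Sept 2021 change: angry demoted to 0, love/sad cut to 2.
# (Values for haha/wow were not reported; assumed unchanged here.)
REACTION_WEIGHTS_2021 = {
    "like": 1,
    "love": 2, "sad": 2,
    "haha": 5, "wow": 5,
    "angry": 0,
}

def rank_score(reactions: dict, weights: dict) -> int:
    """Sum each reaction count times its ranking weight."""
    return sum(weights.get(name, 0) * count for name, count in reactions.items())

# A post with 100 likes and 40 angry reactions:
post = {"like": 100, "angry": 40}
score_2017 = rank_score(post, REACTION_WEIGHTS_2017)  # 100*1 + 40*5 = 300
score_2021 = rank_score(post, REACTION_WEIGHTS_2021)  # 100*1 + 40*0 = 100
```

Under the 2017 weights, the anger-heavy post scores three times higher than under the 2021 weights, which is the mechanism the reporting says favored rage-inducing content.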

Frances Haugen also discussed the inner workings of Facebook’s Feed on the podcast Your Undivided Attention. Her disclosures include:

  • Algorithmic systems (the FB News Feed) are trained to monitor user interactions
  • FB group invitations: users who receive a group invitation from another user see posts from that group for 30 days (“autoinject”), even if they are not interested in the group
  • FB has no limit on the number of users an individual user can invite to a FB group. Haugen claims that one user sent QAnon group requests to 300,000 users. Those users received QAnon content from this group for 30 days without specifically requesting those posts
  • User engagement is also part of the points system. A user who posts, comments, or uses emojis 1,000 times per day receives 20X more points than the average user. Users at the 99.9th percentile of engagement get 100X the points of the “average” FB user
  • FB knows that people experiencing negative life events (the loss of a spouse or loved one, the end of a romantic relationship) and other socially isolated people spend more time on FB. These users’ posts are ranked higher than those of users with lower engagement, meaning that people who are not depressed have their FB feeds influenced by people who are
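Haugen’s description implies a producer-side multiplier layered on top of per-post scoring: the more a poster engages, the more their posts count. A hedged sketch follows; only the 20X and 100X multipliers come from her disclosures, while the percentile thresholds and all names are illustrative assumptions:

```python
# Hypothetical producer-engagement boost implied by Haugen's disclosures.
# Only the 20X and 100X figures are from the source; the percentile
# thresholds below are assumptions for illustration.

def engagement_multiplier(percentile: float) -> int:
    """Return the reported points boost for a poster's engagement percentile."""
    if percentile >= 99.9:   # top 0.1% of engagers: reported 100X boost
        return 100
    if percentile >= 99.0:   # assumed band for ~1,000-actions/day users: 20X
        return 20
    return 1                 # "average" user baseline

def boosted_score(base_score: int, poster_percentile: float) -> int:
    """Scale a post's base ranking score by its author's engagement boost."""
    return base_score * engagement_multiplier(poster_percentile)

# The same post scores very differently depending on who posted it:
# boosted_score(300, 50.0)  -> 300     (average user)
# boosted_score(300, 99.5)  -> 6000    (heavy engager)
# boosted_score(300, 99.95) -> 30000   (top 0.1%)
```

This structure is what makes the article’s later questions pointed: a small group of hyper-engaged (and, per Haugen, disproportionately isolated or distressed) users can dominate what everyone else sees.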

Takeaways on these revelations:

  • Does a user who engages 1,000 times per day on FB somehow use it less than a user at the 99.9th percentile of engagement?
  • Could it be that automated users (bots) have an exponential effect upon what human users see in their News Feeds?
  • Is a human user who engages 1,000 times per day addicted to social media use?
  • Do depressed/anxious/angry users have an outsized influence on what an average user would see in their News Feed? If so, what effect does this have on the average user?
  • Why are you no longer seeing positive posts from in-real-life friends? See emoji ranking changes implemented in 2017.

Real Life Consequences of FB’s Feed

Maria Ressa, the 2021 Nobel Peace Prize winner and a journalist who exposed government corruption in the Philippines, put it best in her interview on the podcast Sway:

“The weaknesses of human beings’ biologies are being exploited by these [social media] platforms.”

Ressa and her website Rappler have been targeted for years by the Duterte government in the Philippines, which has used FB extensively as its media platform to spread propaganda. Ressa asserts that in 2017, 97 percent of Philippine internet users had FB accounts. Rappler’s research team analyzed approximately 500,000 FB posts about Ressa from 2016 onward. They found that 60 percent of the posts were meant to attack the credibility of her work, and 40 percent were attacks on her personally: principally death threats and memes exploiting physical flaws, such as her eczema and atopic dermatitis (the nickname these memes used for her was “scrotum face”). Further Rappler research showed that women were attacked in FB posts 10 times more often than men. Ressa goes on to say:

“So if women are attacked 10 times more than men, what’s the end goal of these attacks? There are two. The first is to pound the target to silence. The second is to create a bandwagon effect, manufactured reality, to make anyone else who’s not aware of it think that this is actually true.”

Ressa discussed the problem of these algorithmically accelerated attack posts with FB leadership, CEO Mark Zuckerberg and COO Sheryl Sandberg, at the April 2017 F8 conference. Both said they understood her concerns and would get back to her. Neither ever contacted her again.

No Oversight of Technology Companies

To date, no U.S. government agency or legislation requires any technology company to disclose the data collected and used in its algorithmic ranking systems, or to share internal research with government agencies. These companies are free to change their algorithms at any time, in any way, regardless of the negative effects those changes may have on their users. Until this changes, any user of any social media platform is being manipulated in ways they cannot know and frequently do not recognize.

For now, the only way to avoid being manipulated is to not participate on social media platforms.

A parting thought:

“The real problem of humanity is the following: We have Paleolithic emotions, medieval institutions and godlike technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.”

Edward O. Wilson, American biologist

Social Media Harms provides a listing of peer-reviewed studies, books and articles from authoritative sources that document the negative effects of social media use. The website also includes links to organizations that promote safe social media and internet use.

Written by Sharon Winkler, Publisher/Editor of Social Media Harms (https://socialmediaharms.org). Mother, Grandmother, Retired U.S. Naval Officer.