4 ways Facebook’s cash-machine algorithms are shredding the moral fiber of our society

The same core functions that enable Facebook to proverbially print money are sowing the societal chaos we’re experiencing today in 2017.

Since its launch in 2004, Facebook has ridden its groundbreaking advertising engine to become one of the most valuable companies in the world, with 2016 revenue of US$27.6 billion, profit of US$12.4 billion, and a current market cap hovering in the US$500 billion range.

With more than 2 billion users, no information system has ever had a bigger audience or been more effective at the capture and sale of attention.

Facebook has also unleashed Trump. #MAGA. Brexit. Ethnic cleansing in Myanmar. ISIS. Fake news. Duterte. Russian troll armies. Your crazy uncle sharing racist memes.

The times seem crazy because they are, and our global addiction to Facebook’s algorithm is fueling the entropy.

Let us briefly summarize the fundamental premise of how Facebook makes money: the more hooked on the social network you are (the more time you spend and the more content you create), the more money they make. You are the product being bought and sold, and every second of your time, every interaction, every message, every post you publish, consume, or share adds to the data Facebook collects and monetizes.

They monetize this data through targeted advertisements using the incredibly detailed profile you willingly provide via your actions on the site. And they don’t need to control those actions to profit from you.

Facebook doesn’t care what you do — plant a tree, kiss your girlfriend, murder a puppy, paint a picture. As long as you do it on Facebook, they can monetize it.

Now let’s examine the four most lucrative and destructive features of Facebook’s algorithm.

1. Virality: Facebook accelerates the spread of information
 
This is the most famous and easily understood Facebook function. With the advent of the share button, Facebook may have devised the most powerful tool for the spread of information ever created.

In addition to allowing you to share content with your connections or the public in a single click, Facebook’s algorithm includes an accelerant that prioritizes content that is being shared heavily, no matter its inherent worth. Essentially, they add gasoline to the fire, allowing messages to spread faster and wider than any other information system has ever made possible.
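
To make the mechanics concrete, here is a minimal sketch of what such an accelerant might look like. To be clear: this is not Facebook’s actual code, and the scoring formula, weights, and field names are all invented for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    shares_last_hour: int  # recent share velocity
    base_quality: float    # whatever signal of "inherent worth" exists

def feed_score(post: Post) -> float:
    """Toy ranking score with a virality accelerant.

    The log term means a post being shared heavily right now gets a big
    boost no matter its base_quality -- gasoline on the fire.
    """
    virality_boost = math.log1p(post.shares_last_hour)
    return post.base_quality + 3.0 * virality_boost  # the weight is made up

posts = [
    Post("thoughtful-essay", shares_last_hour=4, base_quality=0.9),
    Post("outrage-bait", shares_last_hour=2000, base_quality=0.1),
]
for post in sorted(posts, key=feed_score, reverse=True):
    print(post.post_id, round(feed_score(post), 2))
# outrage-bait ranks first despite its low base_quality
```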

How this makes money for Facebook: Simple. The more viral content there is on Facebook, the more time people will spend consuming it and engaging with it.

How this is tearing us apart: Virality can be hijacked. See item 2 below.

2. Extremism: Facebook rewards radicalism
 
In the battle for what goes viral, extremism wins.

Posts that are “extreme” — but within Facebook’s content guidelines — will almost always provoke more reactions than those that are rational, making the rise of clickbait, sensationalism, and fake news inevitable.

Which do you think will perform better, a fake story headlined “President sexually assaults chief of staff” or a true story labeled “President outlines climate change agenda”? We know the answer.
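
A back-of-the-envelope sketch shows why: if the feed’s objective is predicted reactions, truthfulness never enters the formula. The reaction rates below are invented numbers, purely for illustration, not real data.

```python
# Invented reaction rates, purely for illustration -- not real data.
AUDIENCE = 100_000

stories = {
    "President sexually assaults chief of staff (fake)": 0.08,  # 8% react
    "President outlines climate change agenda (true)": 0.005,   # 0.5% react
}

def expected_reactions(reaction_rate: float) -> float:
    """Predicted engagement. Truthfulness never enters the formula."""
    return reaction_rate * AUDIENCE

ranked = sorted(stories, key=lambda h: expected_reactions(stories[h]), reverse=True)
for headline in ranked:
    print(headline, "->", int(expected_reactions(stories[headline])), "predicted reactions")
# An engagement-maximizing feed surfaces the fake story every time.
```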

To give a very real-world example: BuzzFeed’s Craig Silverman ran a much-cited analysis last year which found that, in the final three months before the 2016 US presidential election, the top 20 most popular fake news stories received more Facebook engagement than the top 20 most popular stories from major news outlets.

Everyone from the New York Times to your grandma is a content producer on Facebook, and most people quickly realize that Facebook rewards the extreme with more reach.

This phenomenon has permeated our behavior and fomented more overall extremism in society — from the rise of extreme urban exploration (those photos of Russian kids dangling from the tops of skyscrapers) to extreme food (restaurants serving cheeseburgers with pizza slices as buns) to extreme racism, intolerance, and bigotry (again, your uncle sharing racist memes).

How this makes money for Facebook: More provocative content on the platform means more interactions means more data points to monetize. Truth and measured arguments don’t pay like hate.

How this is tearing us apart: The increased consumption of extreme content is pushing real-world attitudes and behavior toward radicalism.

3. Division: Facebook sifts us into silos

Every time you log on and begin scrolling, Facebook selects from thousands of posts to give you more of what you want, based on your previous actions within the platform.

The longer you stay on the platform, the more data they collect, the more they know about you, and the better they can target ads to you.

The better they can lump you into interest silos (white female Fox News fan from rural Pennsylvania, Asian-American male surfer from San Francisco, African-American from Milwaukee), the more ready-to-target preset groups Facebook can offer advertisers. And the more you’re being spoon-fed content you like and agree with, the less likely you are to leave the platform.

In short, the filter bubble is real.
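
Here is a stripped-down sketch of how interest-based feed selection could deepen that bubble. The topic labels, weights, and update rule are all assumptions made for illustration, not Facebook’s actual system.

```python
from collections import Counter

# Hypothetical interest profile; topic labels and weights are invented.
user_interests = Counter({"fox_news": 5.0, "rural_pa": 3.0})

candidate_posts = [
    {"id": "p1", "topics": {"fox_news"}},
    {"id": "p2", "topics": {"surfing", "san_francisco"}},
    {"id": "p3", "topics": {"fox_news", "rural_pa"}},
]

def affinity(post: dict) -> float:
    """Score a post by overlap with what the user already engages with."""
    return sum(user_interests[topic] for topic in post["topics"])

def build_feed() -> list:
    return sorted(candidate_posts, key=affinity, reverse=True)

# Each engagement reinforces the same interests, so the next feed is even
# more homogeneous: the silo deepens with every scroll.
for post in build_feed()[:2]:
    for topic in post["topics"]:
        user_interests[topic] += 1.0

print([post["id"] for post in build_feed()])
# ['p3', 'p1', 'p2'] -- the surfing post never cracks the top of the feed
```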
 
How this makes money for Facebook: Facebook’s dominance in digital advertising rests on its ability to segment and target groups of people more effectively and cheaply than any other method.

How this is tearing us apart: We are becoming more and more divided into silos, insulated from different viewpoints, with our own beliefs constantly reaffirmed by the filter bubble. We’re becoming more susceptible to mob mentality because we’re surrounded (virtually) by peers who seem to think the same way we do.

4. Artificial intelligence: Facebook uses machine learning to optimize the cycle

This is the scariest and most pernicious feature of Facebook’s algorithm, and the one least understood and controllable, even by Facebook itself.

Facebook’s machine learning is constantly tweaking, testing, and improving aspects of its algorithm to better fulfill its business goals.

Content and behaviors that make money are automatically rewarded and prioritized within the system, and the system optimizes for them to happen again.
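
A toy simulation, in the spirit of a multi-armed bandit, shows how such a loop drifts toward whatever pays. The content categories, engagement rates, and learning rate below are all invented for illustration.

```python
import random

# Hypothetical content categories with invented engagement rates.
ENGAGEMENT_RATE = {"measured_news": 0.02, "cute_animals": 0.05, "outrage": 0.12}

weights = {category: 1.0 for category in ENGAGEMENT_RATE}  # start neutral
LEARNING_RATE = 0.1

def pick_category() -> str:
    """Sample a category in proportion to its learned weight."""
    categories = list(weights)
    return random.choices(categories, [weights[c] for c in categories])[0]

random.seed(42)  # reproducible toy run
for _ in range(10_000):
    category = pick_category()
    engaged = random.random() < ENGAGEMENT_RATE[category]  # simulated reaction
    if engaged:
        weights[category] += LEARNING_RATE  # whatever pays gets shown more

print({c: round(w, 1) for c, w in weights.items()})
# "outrage" accumulates by far the most weight; no human decided that --
# the loop simply optimized for engagement.
```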

How this makes money for Facebook: With machine-learning algorithms replacing huge amounts of human work, Facebook is able to relentlessly make its cash machine more efficient and effective.

How this is tearing us apart: Last month, ProPublica published a scoop revealing that advertisers could target people who had expressed interest in the topics “Jew hater,” “How to burn Jews,” and “History of ‘why Jews ruin the world’” on Facebook. No humans were involved in generating these targeting categories; they were created by artificial intelligence based on user actions in the system.

Facebook quickly took the categories down, but while they existed, anyone with a credit card and a Facebook account could pay to send targeted messaging to Nazis.

I believe that in the future, we will look back on these times and say: What the hell were we thinking? How could we have been so blind? How did we let a company run wild with a business model that so clearly fueled chaos?

It seems that governments and the media are finally waking up to this threat — storm clouds loom on Facebook’s horizon.

That storm can’t come soon enough.

Byron Perry is the Founder and CEO of Coconuts, an online news publisher reaching an audience of millions in Asia. He has spent the last six years wrangling Facebook’s algorithms, and you can follow him on Twitter at @Byron_Perry.

Graphic: Prae Sakaowan