Engines of Polarization

A small change to an algorithm is at the heart of fake news, tribalism and extreme behavior on social media.

In October 2009 Facebook made a fateful change to its algorithm, one that has had profound and polarizing effects on society. It's a lesson in unintended consequences: on the surface, the tweak was made to improve user experience. In reality it was a strategic shift toward hyper-monetization, and this drive to turn the free social network into a money-printing machine also ended up exacerbating the worst human tendencies.

What happened was a decision to change the news feed from a chronological stream of updates to a new order based on the popularity of a post. Popularity was quantified by a new algorithm that favored engagements: the more engagements a post received, the better its chances of rising to the top of a newsfeed.

“A reaction of any kind, according to the amoral logic of the algorithm, is good.”

An engagement — as defined by Facebook and all the major social networks — is any kind of interaction with a post. That could be a like, a share, a comment, or a view. It doesn’t matter if you agree or disagree with the content. It doesn’t matter if you spit your coffee out in laughter or disgust. A reaction of any kind, according to the amoral logic of the algorithm, is “good.”
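The ranking logic described above can be sketched as a toy scorer. To be clear, the weights, field names, and structure here are illustrative assumptions, not Facebook's actual (proprietary) code; the point is only that every interaction, positive or negative, pushes a post up the feed:

```python
# Toy sketch of an engagement-weighted feed ranker. This is NOT
# Facebook's real algorithm -- the weights and fields are invented
# to illustrate the "any reaction is good" logic described above.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int = 0
    comments: int = 0
    shares: int = 0

# Every interaction adds to the score; the scorer is indifferent
# to whether the reaction was approval or outrage.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0}

def engagement_score(post: Post) -> float:
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts rise to the top, regardless of recency.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("a", "quiet longform essay", likes=10),
    Post("b", "outrage bait", likes=50, comments=120, shares=40),
]
feed = rank_feed(posts)
```

Under this kind of scoring, the angry comment thread under "outrage bait" counts in its favor just as much as genuine appreciation would.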

This algorithm change, combined with the social network's rapid growth, turned Facebook into an advertising powerhouse. Exponential growth meant it was getting harder and harder to reach people in newsfeeds, and the popularity algorithm further diminished returns by quickly deprioritizing posts that didn't get immediate interaction.
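That "immediate interaction" effect can be illustrated with a time-decay term. Again, the half-life and formula below are invented for illustration, not drawn from any disclosed ranking system; they just show why a post that fails to catch fire early gets buried:

```python
# Toy illustration of engagement decay: the same raw engagement is
# worth far less as a post ages, so posts without early traction
# sink quickly. The half-life value is an invented assumption.
def decayed_score(engagements: int, hours_since_post: float,
                  half_life_hours: float = 6.0) -> float:
    # Score halves every `half_life_hours` after posting.
    decay = 0.5 ** (hours_since_post / half_life_hours)
    return engagements * decay

# 100 engagements an hour after posting vs. a day after posting.
early_hit = decayed_score(engagements=100, hours_since_post=1)
slow_burn = decayed_score(engagements=100, hours_since_post=24)
```

In this sketch the day-old post keeps only about 6% of its score, which is why a burst of reactions in the first hours matters so much more than steady interest over days.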

To those seeking to cut through the newsfeed clutter, the solution Facebook offered was simple: pay to boost your content so it appears higher in your network's newsfeeds. Publishers realized the good old days of free posting and high engagement were over. You had to "pay to play." This system has translated into a multi-billion-dollar profit scheme for the company, which is why it has no reason to change it.

However, a paid boost alone wasn't enough to make content go viral. Salacious and controversial content inevitably began drowning out everything else, because that's what garnered the most immediate interaction.

Just look at Kim Kardashian's 2014 #BreakTheInternet Paper magazine cover. Featuring her glossy derriere, it was a calculated effort to push our primordial buttons, and it took off on social, indeed "breaking the internet" for a few days. Marketers and publishers were all abuzz about it.

With this new algorithm, sex, violence and hot-button issues have even more impact, especially when packaged in short, easily digestible pieces with captivating images. And, with the power of paid boosting, lurid content gets even more life.

A sponsored post created by a troll farm

By the time Paper’s campaign went viral, the formula had been clear for years. Publishers had figured out that provocative, easy-to-scan content was the key to virality, and content creators zeroed in on one of the key ingredients favored by the algorithm: controversy. Controversial topics are inherently divisive and tribal, and posting a controversial opinion about a trending topic is a great way to game the system.

Unscrupulous content-farm operators realized they could generate misleading, even outright false, articles and quickly gain traction on social networks. As Samanth Subramanian uncovered in a 2017 Wired report, during the contentious 2016 presidential election a cottage industry of fake-news purveyors sprang up in unlikely places like Macedonia, where teenagers spotted trends, generated content with controversial headlines, and posted the links on social networks. They boosted this content to targeted audiences (followers of Donald Trump, the NRA, evangelical groups, and so on) in order to exploit our tribal tendencies.

Facebook profited from tactics like these, collecting the money these operators spent on sponsored posts. Google, whose ad networks ran on the sites receiving traffic from Facebook, also made revenue whenever eyeballs perused the content.

These tactics are representative of what's going on throughout the rest of the "free" internet. In essence, much of the internet as we know it today, dependent on revenue generated by ad technology, incentivizes this type of behavior. You can bet that if well-thought-out, fair-minded longform articles were all the rage, that's what would be prioritized. Unfortunately, that's just not in our nature.


“As in ancient Roman times, colorful filler is what holds our attention.”

"If it bleeds, it leads" is an old, unattributed saying that acted as the strategic North Star for pre-internet media moguls. Sensational stories launched broadcasts and graced the covers and front pages of print media. The sexier, bloodier and more lurid the story, the more likely people were to tune in and read. There was never a conspiracy to push sensationalist stories on us. It simply happened to be what we wanted. "Bread and circuses" is the way to the common man's heart, wrote the Roman satirist Juvenal. As in ancient Roman times, colorful filler is what holds our attention.

In the age of social networking, the attention game is won by whoever has the best algorithm. All of the social networks have their own version of the popularity engine, designed to keep their audiences captive. The problem is that our natural tendency toward bread and circuses is now supercharged by machine-learning algorithms. These disembodied demigods of the modern era, ruling us silently from the cloud, as cold and calculating as HAL 9000, register division and tribalism only as positive signals.


A little postscript: I almost titled this "The Link Between Kim Kardashian's Butt and Russian Fake News Troll Farms" but realized I'd be undermining my own critique.

Like what you read? Give Drew Minh a round of applause.

From a quick cheer to a standing ovation, clap to show how much you enjoyed this story.