Facebook Is The Primary Cause Of Americans’ Addiction To Toxic, False Conspiracy Theories

Facebook’s mortal sin is using its recommendation engine to multiply & amplify the false, angry, toxic content published on its platform

--

Image by ijmaki from Pixabay

By David Grace (Amazon Page | David Grace Website)

Facebook’s Toxic Conduct

Facebook’s crucial, fundamental crime is not its failure to curate/police the content published on its platform. That’s its secondary misdeed.

Facebook’s principal, toxic sin is using its recommendation engine to continuously multiply and redistribute hate and lies to thousands or millions of other users whom it thinks might want to see them.

What’s Causing Massive Anger & Belief In False Conspiracy Theories?

I’ve published two columns about the fundamental causes of the hostile divisions in this country.

What I Missed In Those Columns

I now think that I failed to address the root source of the radicalization of large segments of the American public. I now believe that the driving force behind this toxic rancor is money, specifically Facebook’s and Fox’s business models, which were designed to generate huge advertising profits from content that promotes anger, hatred and lies.

A Fundamental Change In Information Delivery: Push Distribution Has Eclipsed Pull Distribution

Before the internet, people searched for the information that interested them and bought it. You went looking for the information you wanted and then you grabbed it.

After the internet, third parties realized that they could make more money by figuring out what content might be of interest to people in certain demographics, collecting or creating it, and then sending that content to them.

Much news and opinion content now operates on a “push” model: the seller identifies likely interested recipients, sends the content to them in return for their engagement, and so reaps large amounts of advertising dollars.

Push Marketing By Facebook & Fox Is The Primary Creator Of Our Toxic Divisions

Facebook makes its money from advertising.

Advertising dollars depend on attracting as many eyeballs per day as possible and as high a level of engagement as possible. The more eyeballs, and the more engaged they are, the more money Facebook makes.

Eyeballs and engagement are acquired by identifying and delivering content that most strongly interests/appeals to users, content that keeps them reading and keeps them coming back.

How do you identify the content that most interests your users? You watch what they click on, what they read, and then you constantly send them more content like that.
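In software terms, that loop can be surprisingly simple. Here is a toy sketch of a click-driven recommender (the function, the topic tags, and the data shapes are all hypothetical illustrations; Facebook’s actual system is vastly more complex):

```python
from collections import Counter

def recommend(posts, click_history, k=3):
    """Rank candidate posts by how often the user clicked that topic.

    posts: list of (post_id, topic) tuples
    click_history: list of topics the user previously clicked on
    """
    topic_counts = Counter(click_history)
    # Score each post by the user's past engagement with its topic;
    # Python's sort is stable, so ties keep their original order
    ranked = sorted(posts, key=lambda p: topic_counts[p[1]], reverse=True)
    return [post_id for post_id, _ in ranked[:k]]

posts = [(1, "quilting"), (2, "cats"), (3, "politics"), (4, "quilting")]
history = ["quilting", "quilting", "cats"]
print(recommend(posts, history))  # → [1, 4, 2]
```

Note what is missing: nothing in this logic examines what a “topic” actually says, only how often it was clicked.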

It sounds harmless. You’ve got a user who clicks on posts on quilting so you send them links to new posts related to quilting, or cats, or baking cakes. What could possibly go wrong with that?

A great deal.

Facebook’s Recommendation Engine Amplifies Hate & Lies

When you watch what people pay attention to and, on a daily or even hourly basis, send them more of the same, you legitimize that content; you reinforce it; you multiply it; you concentrate it; you amplify it. AND that stream of similar content displaces the user’s exposure to any contrary content.

That’s not a problem when the content is about making quilts or baking cakes, but it is a huge problem when the content consists of political or social ideas like religious terrorism, anti-Semitism, false conspiracy theories, etc.

The successful use of Goebbels’s “Big Lie” tactic relies on the lie’s constant repetition, because the more often people hear a lie, and the more sources they hear it from, the more likely they are to believe that the lie is true.

So, if someone who mildly distrusts Jews reads a Facebook post that claims that Jews control Hollywood, Facebook’s recommendation algorithm is going to notice and automatically feed that person other posts about Jews controlling Hollywood.

The Feedback Effect

If the user clicks on those other “Jews are bad” posts, Facebook’s algorithm is going to send that user even more posts about Jews controlling the banks, and Jews running a secret government, and Jews . . . . well, you get the idea.

It could start with a post about how bad conservatives are, or gun owners are, or how great the Taliban is or how the Clintons are running a pedophile ring out of the White House basement or . . . anything.

The Recommendation Algorithm Is Blind To Content

The Facebook algorithm knows nothing about the content of the posts it’s promoting (pushing), and it doesn’t care. It’s just code designed to send people more of the same, and if “the same” is crazy conspiracy theories, racism, smear campaigns, Russian propaganda, jihad promotions, or radical Islamic philosophy, it doesn’t matter. The code neither knows nor cares what’s in the posts it’s sending you.

It’s not that Facebook was designed to promote the Taliban or anti-Semitism or QAnon or any other particular notion. It’s that IN ORDER TO MAKE MORE MONEY Facebook’s feedback-effect code promotes and amplifies any and every notion somebody clicks on no matter how false, crazy or toxic it might be.

What counts is that Facebook monitors whatever the user looks at and then relentlessly gives them more and more and more of the same and less and less and less of anything not the same.

Facebook’s “If you liked that then you’ll like this” algorithm constantly multiplies and reinforces the original content, no matter what it was, and simultaneously avoids troubling you with any contrary information.

The code goes out into Facebook’s data stream, finds more of the same, and floods the result back onto the user, like a kid with a magnifying glass focusing sunlight on a line of ants.

Hour by hour, day by day, the user who first read a post claiming that Jews control Hollywood gets sucked into a black hole of ever-expanding anti-Semitic content. The feedback effect. And the more of that content they get and read, the more they are sent, and the more that Big Lie seems to them to be true.
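A purely illustrative simulation shows how quickly this spiral narrows a feed. Assume a ten-post feed allocated in proportion to past clicks, and a user with only a mild initial tilt toward one topic (every number below is invented for the toy model; this is not Facebook’s code):

```python
def simulate(rounds=20):
    """Toy feedback loop: the feed mirrors past clicks, and clicks
    mirror the feed, so a small initial tilt snowballs."""
    clicks = {"conspiracy": 2, "quilting": 1}  # one extra early click
    shares = []
    for _ in range(rounds):
        total = sum(clicks.values())
        # A 10-post feed allocated in proportion to past clicks
        feed = {t: round(10 * c / total) for t, c in clicks.items()}
        # Mild preference: the user clicks every "conspiracy" post
        # shown but only half of the "quilting" posts shown
        clicks["conspiracy"] += feed["conspiracy"]
        clicks["quilting"] += feed["quilting"] // 2
        shares.append(clicks["conspiracy"] / sum(clicks.values()))
    return shares

shares = simulate()
print(f"conspiracy share of clicks: round 1 = {shares[0]:.0%}, "
      f"round 20 = {shares[-1]:.0%}")
```

One extra click and a slight preference are enough: within a handful of rounds the feed is almost entirely one topic, and the other topic has effectively vanished from view.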

What Could Be Done To Reduce The Damage

Of course, it doesn’t have to be that way.

Disable The Recommendation Code

Firstly, Facebook could simply disable its recommendation engine entirely so that people see what their friends post plus those things that they actively search for. But that would cost Facebook engagement, which means it would cost it money, so Facebook isn’t going to do that.

Reverse The Recommendation Code

Secondly, Facebook could become proactive and flip its recommendation engine 180 degrees so that it sends people posts that are exactly the opposite of the ones they previously clicked on, an anti-feedback effect.

That is, if I clicked on the post claiming that Jews control Hollywood, Facebook would send me other posts debunking the claim that Jews control Hollywood.

If I clicked on a post claiming that people will lose weight by eating five pounds of bananas a day, Facebook would send me posts claiming that eating five pounds of bananas a day will not cause me to lose weight and that, in fact, it might harm my health.
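Mechanically, the flipped engine would be the same few lines of code with the lookup inverted. Here is a toy sketch, assuming some hypothetical mapping from each claim to content debunking it (building and maintaining that mapping is, of course, the hard part Facebook would have to do):

```python
def counter_recommend(posts, click_history, counters, k=3):
    """Recommend posts that rebut, rather than reinforce, clicked topics.

    counters maps a topic to the topic that debunks it (hypothetical).
    """
    wanted = {counters[t] for t in click_history if t in counters}
    return [pid for pid, topic in posts if topic in wanted][:k]

counters = {"jews-control-hollywood": "hollywood-ownership-facts",
            "banana-diet": "banana-diet-debunked"}
posts = [(1, "banana-diet"), (2, "banana-diet-debunked"), (3, "cats")]
print(counter_recommend(posts, ["banana-diet"], counters))  # → [2]
```

The point of the sketch is that the barrier is not technical: the anti-feedback loop is no harder to write than the feedback loop.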

But let’s be realistic. Facebook is not going to either disable or flip its recommendation algorithm, because that would cost it money, and money is the only thing that matters to it.

Fox’s Business Model

Tobin Smith lays out the details of how Fox implemented its business plan to make money by picking a target group and stoking its viewers’ anger at that group in his book:

FoxNation: Inside the Network’s Playbook of Tribal Warfare

In his Medium article “FEAR & UNbalanced: Confessions of a 14-Year Fox News Hitman. How Roger Ailes & Fox News Got Rich Scamming America’s La Z Boy Cowboys and Selling Out America’s Soul” Smith said of Fox:

“By careful design and staging Fox News manipulated (and ultimately addicted) the most vulnerable people in America to the most powerful drug cocktail ever: visceral gut feelings of existential outrage relieved by the most powerful emotions of all . . . the thrill of your tribe’s victory over its enemy and the ultimate triumph of good over evil.

“. . . Fox News turned our democracy and politics into performance art and efficiently sold the soul of America to the highest bidder in return for 2-minute ad sequences aired during the performance intermissions.”

The Source Of America’s Embrace Of Hate & False Conspiracy Theories

So, if you believe that lies, false conspiracy theories, calls for religious or political violence, and demonization of people who don’t share your political views are serious threats to our country and you want to know who’s to blame, and why, the answer is clear.

Blame Facebook and any other media platform that is designed to increase use and engagement by systematically multiplying, concentrating and amplifying any content that seems to interest any of their members, and

Blame Fox and any other media platform whose business model is founded on increasing eyeballs and engagement by stoking its consumers’ anger and hatred toward some designated enemy group.

And the motive for both of these business plans is simple: more MONEY.

— David Grace (Amazon Page | David Grace Website)

To see a searchable list of all David Grace’s columns in chronological order, CLICK HERE

To see a list of David Grace’s columns sorted by topic/subject matter, CLICK HERE.

Follow David Grace on Twitter at: https://twitter.com/davidgraceauth

--

David Grace
Government & Political Theory Columns by David Grace

Graduate of Stanford University & U.C. Berkeley Law School. Author of 16 novels and over 400 Medium columns on Economics, Politics, Law, Humor & Satire.