Photo by Thought Catalog on Unsplash

Facebook doesn’t have a plan to fix its content problem.

Is public pressure the only thing that will force Facebook to take down content that promotes hate speech and incites violence?

jonesey
6 min read · Aug 9, 2018


This week, Facebook finally announced that it was removing several Infowars pages, a move that followed weeks of public backlash on the heels of an uncomfortable interview in which Mark Zuckerberg defended allowing Holocaust denial on his platform.

With an actual ban, Facebook finally answered the question: is it okay to publish news stories suggesting that the Sandy Hook and Stoneman Douglas school shootings were staged, or that Hillary Clinton was running a child sex ring out of a pizza restaurant?

The answer to that question is, “generally no,” assuming:

  1. You’re a high-profile publisher.
  2. The public backlash from inaction proves costlier than acting.

In what would seem like a no-brainer in another, more sensible universe, Facebook took down the pages citing community standards violations. The site’s credo is to give the world a platform for an open exchange of ideas — “to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”

In theory, the good ideas and real news will prevail against the evil and fake.

In practice, an unrestricted content marketplace looks a lot like unrestricted capitalism. Free-market capitalists insist the market will balance itself: the most innovative and industrious will compete, and only the very best goods and services will rise to the top. But in practice, the greedy and opportunistic are always rewarded for finding ways to grift and cheat the system. In the content economy, pages like Infowars hold a monopoly on outrage and attention.

Is common sense prevailing in this content economy? I would suggest scrolling through your own newsfeed to decide whether you think that's happening, or whether Facebook has opened a portal to the Upside Down, spewing out a dark, uncontrollable mass of hate and lies that poisons our public discourse.

The only enforcement tool Facebook has is a rather toothless community standards policy that threatens "strikes" before taking down pages.

…we do not share the specific number of strikes that leads to a temporary block or permanent suspension.

How many strikes before you're out, according to Facebook's content umpires? No one knows for sure.

Photo by Jilbert Ebrahimi on Unsplash

If you had a website that 45% of Americans turn to first for news, if 1 out of every 4 inbound clicks to news media websites came from that site, and if the site punished news outlets that don't publish with its proprietary Instant Articles tool or pay to boost their posts for visibility, one would think that website is, in fact, a news website.

Yet this news website seems utterly indifferent to its content.

An aspect of Facebook's content problem that I feel is under-discussed is that all news feed content is weighted equally and treated with the same visual rhetoric. In other words, as users scroll through their newsfeeds, a thoroughly sourced article from a legitimate news organization is given the same visual weight as a post from an Alex Jones fan page falsely claiming that a war with Russia is imminent.

These two posts are visually framed the same way. That’s a problem.
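To make the point concrete, here is a minimal sketch in Python of the kind of source-context label the feed currently lacks. The page names, the ratings, and the `SOURCE_RATINGS` lookup are entirely hypothetical illustrations, not anything Facebook actually exposes.

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    headline: str
    source: str

# Hypothetical credibility labels; a real system would need independent,
# transparent fact-checking data behind these, which Facebook does not expose.
SOURCE_RATINGS = {
    "Legitimate News Org": "verified news organization",
    "Alex Jones Fan Page": "unverified page with repeated false claims",
}

def render(item: FeedItem) -> str:
    """Prefix each post with visible source context instead of giving
    every post in the feed identical visual framing."""
    label = SOURCE_RATINGS.get(item.source, "no source information")
    return f"[{label.upper()}] {item.source}: {item.headline}"

for item in [
    FeedItem("Senate passes sanctions bill", "Legitimate News Org"),
    FeedItem("War with Russia is imminent", "Alex Jones Fan Page"),
]:
    print(render(item))
```

Even a crude label like this would break the visual equivalence between a sourced report and a hoax; the hard part, of course, is deciding who assigns the ratings.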

Without any visual cues or added context, how do we expect the average user to discern truth from bullshit on a site that nearly 1 in 2 Americans go to for news? Media literacy is sorely lacking among Americans, and worse, discrediting journalists has become a policy cornerstone of the Republican party. How else does Donald Trump get elected despite receiving the endorsement of hardly any major local newspapers? The people in charge, the ones you're told to trust, have been telling you not to trust your newspaper.

We know now that Facebook was never destined to be a benevolent marketplace of ideas and free expression. It's the legacy of an algorithm from a website originally built to compare pictures of women. Facebook works because it's based on an underlying formula that quantifies and extracts the worst traits of people: vanity, insecurity, and jealousy. What should we expect to happen when we put a tool with so much potential for malice into the hands of anyone with bad intentions and an internet connection?

Photo by Tertia van Rensburg on Unsplash

No doubt an example has been made of Alex Jones to show, symbolically, that Facebook is capable of cleaning up its content. But there's still a lot of horrifying content out there. You see it in your timeline, I see it in mine. Just searching for an example of an Alex Jones fan page felt like descending into the internet's rotten hole of hate and stupidity, one that seems to grow deeper every day.

Twitter, which deserves just as much scrutiny and backlash as Facebook for its inattentiveness to toxic content, recently announced that it is bringing in a team of political science researchers to monitor the health of discourse on the platform. I can spoil the ending for you: on the Twitter side of the tracks, the conversations are not healthy. For a 12-year-old social network, the fact that it is only now thinking about the health of its discourse is astonishing. Perhaps the cost of policing its content has been too high, whether in staffing overhead or in the risk of upsetting a large population of users who would wrongly feel their rights were violated by being blocked from the platform. Or perhaps the content problem was one it was simply hoping to deal with later.

Later is now.

The social-media-fueled misinformation crisis will be viewed by historians as one of the great challenges our country has ever faced. Action is desperately needed, and it will come either from within or from outside. That means social media companies must take a stance on their content. It means they have to decide whether they are news content creators and aggregators. It means conversational health, the veracity of news stories, and the prevalence of threats and abuse should be regarded as core KPIs, treated the same way active users and engagement rates are treated in quantifying success. Choosing this option will require bold leadership; there will be backlash and financial burdens for taking a stance.

But it’s necessary.
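As a rough illustration only, with made-up metric names and numbers, here is what reporting content-health KPIs side by side with growth KPIs might look like. Real definitions of veracity and abuse prevalence would be far harder to pin down than this sketch suggests.

```python
from dataclasses import dataclass

@dataclass
class WeeklyMetrics:
    daily_active_users: int     # the growth KPI platforms already optimize
    engagement_rate: float      # likewise
    flagged_false_stories: int  # hypothetical veracity signal
    abuse_reports: int          # hypothetical threat/abuse signal
    total_stories: int

def kpi_report(m: WeeklyMetrics) -> dict:
    """Report health KPIs with the same prominence as growth KPIs."""
    return {
        "daily_active_users": m.daily_active_users,
        "engagement_rate": m.engagement_rate,
        # Health KPIs, surfaced alongside growth rather than buried:
        "false_story_rate": m.flagged_false_stories / m.total_stories,
        "abuse_reports_per_1k_users": 1000 * m.abuse_reports / m.daily_active_users,
    }

print(kpi_report(WeeklyMetrics(2_000_000, 0.31, 420, 8_500, 90_000)))
```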

If the social media networks won't find a way to police the toxic content on their platforms, then it's up to policymakers to step in, preferably before the electorate becomes so misinformed that we can no longer elect people capable of legislating solutions.

Our modern society functions because we have always been able to rely on a free press to do one thing particularly well: dispel bullshit. If we lose this important check and balance because the average citizen, consuming information through a platform that by design obscures the cues needed to judge the veracity of content, can no longer discern what is real from what is fake, then what follows?

Lawmakers have historically been behind the curve when it comes to regulating technology, but they also historically catch up. U.S. Senator Mark Warner (D-VA) recently put forth a proposal addressing a wide range of issues with social media, including data privacy, content regulation, media literacy, and protection against foreign influence operations that disrupt U.S. discourse.

If enacted, how these proposals would shift websites' liability for their users' content, how they would interact with the aging 1996 Communications Decency Act, and what legal battles would inevitably follow are questions for law and media scholars far better equipped than I am.

But in layman's terms, we have a big problem when the CEO of the world's third-most-visited website defends allowing Holocaust deniers to propagate false news across his site for all users to see.

Action is necessary, either through difficult decisions made within a social media boardroom or through regulation by the government.

The alternative is a deepening of this country's divisions and a very dark future ahead of us.
