Facebook-Russia news cycle driven by media wanting Facebook to pick winners

This morning I was reading an article on Buzzfeed titled How People Inside Facebook are Reacting to the Company’s Election Crises and I was surprised that the article’s opening premise came out so boldly in favor of Facebook editorially managing the content of its users’ news feeds. The article begins:

In the summer of 2015, a Facebook engineer was combing through the company’s internal data when he noticed something unusual. He was searching to determine which websites received the most referral traffic from its billion-plus users. The top 25 included the usual suspects — YouTube and the Huffington Post, along with a few obscure hyperpartisan sites he didn’t recognize.
“Conservative Tribune, Western Journalism, and Breitbart were regularly in the top 10 of news and media websites,” the engineer told BuzzFeed News. “They often ranked higher than established brands like the New York Times and got far more traffic from Facebook than CNN. It was wild.”
Troubled by the trend, the engineer posted a list of these sites and associated URLs to one of Facebook’s internal employee forums. The discussion was brief — and uneventful. “There was this general sense of, ‘Yeah, this is pretty crazy, but what do you want us to do about it?’” the engineer explained.
Today, the engineer’s anecdote reads as a missed opportunity — a warning of an impending storm of misinformation blithely dismissed.

Fundamentally, the Buzzfeed article and others like it argue that Facebook should be in the position to decide that a big news site like the Wall Street Journal deserves more traffic than a blog like Daring Fireball or TechCrunch, and that if that isn’t the case then Facebook should artificially change things. At least, that’s the tech news analogy of the argument being made about political news sites.

Facebook: Media Company or Communications Platform?

Buzzfeed and other members of the press constantly want Facebook to act like a media company, and there have literally been dozens of articles written on the topic. In layman’s terms, the argument is that since a large number of people get their news from Facebook, the company should have a perspective on the content of users’ news feeds and actively demote or promote content based on some editorial viewpoint. As a tech company, Facebook believes it is just a communications platform where content that users like reading ends up being what is most popular on the site.

Where the media does have a legitimate argument is when content is specifically promoted to users by Facebook itself, such as in its Trending Topics feed. This is content that was editorially chosen by Facebook, and here the failure is on them for repeatedly promoting fake news stories, such as the claims that Megyn Kelly secretly supported Hillary Clinton during the election or that Pope Francis endorsed Donald Trump for US president.

However, I think it is extremely dangerous if the news feed algorithm that selects content viewed by 2 billion users a month decides what to promote based on how much it aligns with the politics of Mark Zuckerberg or a bunch of product managers in Palo Alto.

Two Key Challenges for Facebook’s Ad Platform Highlighted by the 2016 Election

Modern ad platform companies like Facebook, Google, and my employer Microsoft pride themselves on the fact that anyone can show up and buy ads on their platforms, whether their budget is $20 or $20 million, and show those ads to anyone, anywhere in the world.

The first issue is that this openness is actually problematic when it comes to political content during election seasons. We live in a world where disinformation campaigns, such as the KGB’s OPERATION INFEKTION which spread the myth that AIDS was created by the US government to target minorities, can now be run very cheaply and easily. Elections are very susceptible to good or bad news about specific candidates, so any platform that can be used to reach a large number of voters is at risk of being used by foreign agents to influence a country’s elections in a direction they find more favorable.

However, better policing of election-related ads is actually fairly straightforward and not a significant challenge.

The second issue is the more pernicious problem. Ads can be purchased that influence people’s opinions on a topic in a way that indirectly maps to political influence. For example, any ad about gun control will elicit predictable responses from people who vote Democrat versus Republican in the United States.

In fact, when looking at the Federal Election Commission’s rules on Foreign Nationals, one finds it says:

Despite the general prohibition on foreign national contributions and donations, foreign nationals may lawfully engage in political activity that is not connected with any election to political office at the federal, state, or local levels. The Commission has issued advisory opinions that help to define the parameters of that activity.
In AO 1989–32 (McCarthy), the Commission concluded that a foreign national could not contribute to a ballot measure committee that had coordinated its efforts with a nonfederal candidate’s re-election campaign. Also, in AO 1984–41 (National Conservative Foundation), the Commission allowed a foreign national to underwrite the broadcast of apolitical ads that attempted to expose the alleged political bias of the media. The Commission found that these ads were permissible because they were not “election influencing” in that they did not mention candidates, political offices, political parties, incumbent federal officeholders or any past or future election.

Election-influencing ads are actually fairly narrowly defined. A Russian buying an ad claiming that illegal immigrants are causing an increase in violent crime, or that US welfare checks are being used by immigrant Muslim men to support their four wives, isn’t actually doing anything illegal.

More specifically, the ads created by Russians on Facebook and other sites were just reposts of content originally created by Americans. In the New York Times article How Russia Harvested American Rage to Reshape U.S. Politics, we learn:

YouTube videos of police beatings on American streets. A widely circulated internet hoax about Muslim men in Michigan collecting welfare for multiple wives. A local news story about two veterans brutally mugged on a freezing winter night.
All of these were recorded, posted or written by Americans. Yet all ended up becoming grist for a network of Facebook pages linked to a shadowy Russian company that has carried out propaganda campaigns for the Kremlin, and which is now believed to be at the center of a far-reaching Russian program to influence the 2016 presidential election.

All of this content was created by Americans and can be advertised by Americans on practically any major social media service today without any issue.

Even when these ads were paid for by foreign nationals, as with the Russians and the 2016 election, the only reason that’s an issue is if those Russians were colluding with the Trump campaign; otherwise there’s nothing wrong with Russians promoting a local news story from Michigan.

To summarize, the second problem is that anyone can advertise half-truths and outright lies to an audience whose political and social leanings they can microtarget, eliciting responses that indirectly influence an election.

How Serious is this Problem?

This may sound abstract, but we know a man stabbed his father to death after arguing about #pizzagate, while another showed up at a local pizzeria ready to commit a mass shooting based on the same hoax. People are getting motivated to kill one another over fake news spread on Facebook. There’s no doubt that others were motivated to turn out and vote a particular way because of stories they read that incensed or outraged them. In fact, Facebook used to brag about how its house ads influenced voter turnout, until bragging about influencing elections became passé.

There’s no easy fix, since even if you ban ads that are outright lies, you can’t ban ads that espouse a particular pessimistic view of the truth (e.g. highlighting how many crimes are committed by immigrants when, in truth, US citizens commit crimes at a higher rate than immigrants) without effectively censoring various political points of view.

Now Playing: Future & Young Thug — Patek Water (featuring Offset)
