Why corporate social media initiatives won’t stop “fake news”

[Image: HOPE worldwide members Lea Sutrisna Tan (left) and Nani Kumala Dewi translate Indonesian newspapers. Photo via Flickr.]

I’m sure Mark Zuckerberg means well. Maybe.

But he appears more than a little overwhelmed by his own creation these days. Social-media-enabled “information” is looking like Frankenstein’s monster in the post-2016-election US.

Remember in the movie where the little girl gets thrown into the lake?

Following the US election of Donald Trump this month, Mark Zuckerberg posted a lengthy note to Facebook, seeming to do a bit of soul-searching about social media (mis)information while also carefully hedging on the role that Facebook and social media might have played in electing this person. Zuckerberg focused mostly on this idea of “fake news”:

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.

He’s really trying to have it both ways, no? Facebook doesn’t promote misinformation. And even if it did, it wouldn’t matter. That’s some impressive bobbing and weaving from the Facebook CEO.

If you’re going to use a figure like “99%” and words like “fake news,” “authentic,” “hoax,” and “partisan,” it seems only fair to ask whose definitions you are using.

Zuckerberg doesn’t explain that exactly, but goes on to point at “the community” as arbiter, as if the users of the Facebook platform constitute any kind of community as usually defined. Shared values? Common purposes? Some shared cultural norms? Whatever. It’s not Facebook’s job to figure that out.

The second bob-and-weave moment in that very short paragraph was his confident assertion that it is extremely unlikely “hoaxes” changed the election outcome in one direction or the other.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Simple definition of hoax (noun): “an act that is meant to trick or deceive people”

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Is it likely “hoaxes” swayed the election results? Pretty doubtful. Is it likely that poorly reported and misleading news, or questionable paid content shared on social media, influenced voters? Almost certainly.

And how do I know? Data.

Dozens of studies have examined links between social media participation and civic activism, the influences on perceptions of credibility, and the growing reliance on “secondary gatekeeping” through social sources. We’ve measured the influence of “questionable” information on elections and public policy questions like climate change and vaccination.

Researchers have found that news travels through online social networks predictably enough that they were able to build mathematical models of the news cycle (Leskovec, Backstrom, & Kleinberg, 2009).
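To make that concrete, one standard family of such models is the independent cascade: each share gives every neighbor an independent chance of resharing. The sketch below is purely illustrative; the network, names, and probability are invented, and this is one generic diffusion model, not any specific researcher’s.

```python
import random

def independent_cascade(graph, seeds, p=0.2, rng=None):
    """Simulate one run of the independent cascade model.

    graph: dict mapping each node to a list of neighbors.
    seeds: nodes that share the story first.
    p: probability a share convinces each neighbor to reshare.
    Returns the set of all nodes that ended up sharing.
    """
    rng = rng or random.Random(42)  # fixed seed so runs are repeatable
    shared = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for node in frontier:
            for neighbor in graph.get(node, []):
                # Each newly shared story gets one chance per neighbor.
                if neighbor not in shared and rng.random() < p:
                    shared.add(neighbor)
                    next_frontier.append(neighbor)
        frontier = next_frontier
    return shared

# A toy friendship network: one well-connected sharer can reach much of it.
network = {
    "alice": ["bob", "carol", "dave"],
    "bob": ["alice", "erin"],
    "carol": ["alice", "frank"],
    "dave": ["alice"],
    "erin": ["bob"],
    "frank": ["carol"],
}
reached = independent_cascade(network, seeds=["alice"], p=0.5)
```

Even this toy version shows the dynamic the research formalizes: how far a story spreads depends less on its accuracy than on where it starts and how readily each hop reshares it.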

None of this should be surprising in 2016.

Zuck Part II

Zuckerberg followed up with a post on November 19 announcing some proposed initiatives being explored at Facebook.

Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation.
We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.

Again, the “community.” Per this model, the community will be able to identify the fake news so Facebook won’t have to be an arbiter.

Zuckerberg’s proposal includes these suggestions:

* Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
* Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
* Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

Again, these are all dependent on the community being somewhat coherent, cooperative, and using the tools that Facebook has given them in the way they are intended.
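As a rough illustration of the signal-combining Zuckerberg describes, community reports might be normalized against reach and weighted together with third-party fact-checker flags. Everything below, the field names, the weights, the threshold, is invented for the example and says nothing about Facebook’s actual system.

```python
def misinformation_score(reports, shares, flagged_by_checker,
                         report_weight=1.0, checker_weight=5.0):
    """Combine crowd and third-party signals into one score.

    reports: number of users who reported the link as false.
    shares: total shares, used to normalize the raw report count.
    flagged_by_checker: whether a trusted fact-checker flagged it.
    """
    report_rate = reports / max(shares, 1)  # crowd signal, normalized by reach
    score = report_weight * report_rate
    if flagged_by_checker:
        score += checker_weight             # a strong third-party signal
    return score

def should_warn(reports, shares, flagged_by_checker, threshold=0.5):
    """Show a warning label once the combined score passes a threshold."""
    return misinformation_score(reports, shares, flagged_by_checker) >= threshold
```

With these invented weights, 300 reports on 1,000 shares would not trigger a warning on its own, but a single fact-checker flag would. Which is exactly the author’s point: the output depends entirely on how “the community” behaves and on whose flags the platform decides to trust.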

The post also includes this point, which is worth some extra attention.

* Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

But do “fake news economics” look any different from “real news economics”? And is Facebook comfortable disrupting both at a real cost to its bottom line?

In July, the social network posted record earnings: quarterly sales were up 59 percent from the previous year, and profits almost tripled to $2.06 billion. While active users of Facebook — now 1.71 billion monthly active users — were up 15 percent, the real story was how much each individual user was worth. The company makes $3.82 a year from each global user, up from $2.76 a year ago, and an average of $14.34 per user in the United States, up from $9.30 a year ago. Much of this growth comes from the fact that advertisers not only have an enormous audience in Facebook but an audience they can slice into the tranches they hope to reach.
New York Times, November 21, 2016


“If you’re not paying for it, you’re not the customer; you’re the product being sold.” (blue_beetle of Metafilter, August 26, 2010)

Facebook and other social media platforms are not journalism organizations. Full stop.

They make money by monetizing us for content producers. How they select those content-producing clients, we have no way of knowing beyond what they choose to tell us. They are not a news content organization. They are not journalists.

Traditional journalism, for all its flaws, self-mythologizing, and tendency to protect the status quo, operates under a set of professional norms aimed at an ideal of reporting: doing the quasi-metaphysical work of turning “a thing that happens” somewhere in the world into “news.” It is a professional culture that has embraced and defended its epistemological authority from the earliest days of print up to today (see Breed, Gans, and Singer below).

And the foundational value of that culture was accepting responsibility for the information you put out into the world.

No bobbing and weaving.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —


Betsch, C., Renkewitz, F., Betsch, T., & Ulshofer, C. (2010). The influence of vaccine-critical websites on perceiving vaccination risks. Journal of Health Psychology, 15(3), 446–455.

Breed, W. (1955). Social control in the newsroom: A functional analysis. Social Forces, 33(4), 326–335.

Carlson, M. (2007). Blogs and journalistic authority. Journalism Studies, 8(2), 264–279. doi:10.1080/14616700601148861.

Gans, H. J. (1979). Deciding what’s news: A study of CBS Evening News, NBC Nightly News, Newsweek, and Time. Evanston, IL: Medill School of Journalism, Northwestern University Press.

Gil de Zúñiga, H., Molyneux, L., & Zheng, P. (2014). Social media, political expression, and political participation: Panel analysis of lagged and concurrent relationships. Journal of Communication, 64(4), 612–634.

Johnson, T. J., Kaye, B. K., Bichard, S. L., & Wong, W. J. (2007). Every blog has its day: Politically-interested Internet users’ perceptions of blog credibility. Journal of Computer-Mediated Communication.

Leiserowitz, A. A., Maibach, E. W., Roser-Renouf, C., Smith, N., & Dawson, E. (2010). Climategate, public opinion, and the loss of trust [Working paper].

Leskovec, J., Backstrom, L., & Kleinberg, J. (2009). Meme-tracking and the dynamics of the news cycle. In Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining (pp. 497–506).

Massanari, A. L., & Howard, P. N. (2011). Information Technologies and Omnivorous News Diets over Three U.S. Presidential Elections. Journal of Information Technology & Politics, 8(2), 177–198. doi:10.1080/19331681.2011.541702

Singer, J. B. (2014). User-generated visibility: Secondary gatekeeping in a shared media space. New Media & Society, 16(1), 55–73.