Can Facebook do political discourse without undermining its own mission?

Mark Zuckerberg thinks it’s ‘crazy’ to suggest hoax stories on Facebook influenced the US election result. This from his Facebook page:

“Of all the content on Facebook, more than 99% of what people see is authentic… The hoaxes that do exist are not limited to one partisan view… Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.

“That said, we don’t want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here…”

I think there’s an underlying problem here, one that’s been exposed by the race to the bottom in political discourse.

Consider a peer-reviewed academic study by researchers Jens Krause and John Dyer, which showed that as a crowd gets bigger, the proportion of informed individuals needed to steer it gets smaller.

Their experiments involved asking groups of people to walk randomly around a large hall. A small number of ‘plants’ in the group were given more detailed instructions about where to walk. Communication was forbidden, but everyone had to stay within arm’s length of another person.

In all cases, the ‘plants’ were followed by the others in the crowd, forming snake-like structures.

Different ratios of plants to crowd size were tested. In a small group, four plants were needed to herd just one person: an informed proportion of 80%. In crowds of 200 or more, just five per cent of the group was enough to influence the direction in which it travelled.
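The mechanic is easy to reproduce in a toy simulation. Below is a minimal sketch of my own (a simple flocking-style model, not the researchers’ actual protocol, with all parameters invented for illustration): uninformed agents copy the average heading of anyone within ‘arm’s length’, while a five per cent minority of plants also feels a pull toward a target the rest know nothing about.

```python
import math
import random

# Toy sketch of the herding effect (my construction, not Krause and
# Dyer's protocol). Uninformed agents copy the average heading of
# anyone within 'arm's length'; a 5% minority of plants also steers
# toward a target the others know nothing about.

N = 200                 # crowd size
PLANTS = int(N * 0.05)  # informed minority: five per cent
STEPS = 400
TARGET = (200.0, 0.0)   # where the plants are quietly heading

random.seed(1)
agents = [
    {"x": random.uniform(-10, 10), "y": random.uniform(-10, 10),
     "h": random.uniform(0, 2 * math.pi), "plant": i < PLANTS}
    for i in range(N)
]

def mean_angle(angles):
    """Circular mean, so headings near 0 and 2*pi average sensibly."""
    return math.atan2(sum(math.sin(a) for a in angles),
                      sum(math.cos(a) for a in angles))

for _ in range(STEPS):
    new_headings = []
    for a in agents:
        # copy the average heading of everyone within 'arm's length'
        near = [b["h"] for b in agents if b is not a and
                math.hypot(b["x"] - a["x"], b["y"] - a["y"]) < 2.0]
        h = mean_angle(near + [a["h"]])
        if a["plant"]:
            # plants blend local consensus with a pull toward the target
            goal = math.atan2(TARGET[1] - a["y"], TARGET[0] - a["x"])
            h = mean_angle([h, goal])
        new_headings.append(h)
    # update everyone at once, then take a unit step
    for a, h in zip(agents, new_headings):
        a["h"] = h
        a["x"] += math.cos(h)
        a["y"] += math.sin(h)

cx = sum(a["x"] for a in agents) / N
cy = sum(a["y"] for a in agents) / N
print(f"crowd centre after {STEPS} steps: ({cx:.1f}, {cy:.1f})")
```

Run it a few times: even with the plants outnumbered roughly twenty to one, the crowd’s centre tends to drift toward the target, which is the large-crowd end of the ratio effect described above.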

Now consider the extreme concentrating effect of platforms like Facebook — a black hole for content.

Of course, misinformation can occur on both sides of the political divide. But a danger of social media is the reduction of complex problems to soundbites, or 140 characters. The most effective way to get a message across in so little space is to appeal to the heart, not the head: to bypass our rational side and tap instead into our base fears and desires.

Given what we know about the gravitational effects of social media and the way our ‘fast’, instinctive brain makes most of our decisions, stories that tap into our base emotions are more likely to gain traction and proliferate than rational arguments made in the same medium. Social media selects for populist political discourse.

Is that a problem for Facebook? It’s certainly a reasonable hypothesis. It wouldn’t be at all crazy to suggest Mark Zuckerberg test it before he starts looking at his algorithms. After all, Facebook’s mission is to ‘make the world more open and connected.’ It doesn’t feel that open and connected right now.