Exorcising the demons of the social networks

Enrique Dans

--

Three interesting articles posted today describe successful efforts to eliminate problems such as fake news, terrorist propaganda and bullying on social networks.

“What Github did to kill its trolls” is a long and detailed piece on how the largest platform for publishing and collaborating on open source software dealt with a problem that had begun to threaten its very existence: by 2014, more and more people were leaving the site after complaining about trolling, bullying, sexism, abuse, harassment and insults.

The solution? To create community management teams made up of people from very different backgrounds, with very different ideas and a strong commitment to closing the loopholes that permitted bad behavior. This was a very different response from Twitter’s, where such behavior is sadly the norm, and which at best offers victims of bullying the means to pretend the problem isn’t there.

In “Why Snapchat and Apple don’t have a fake news problem”, BuzzFeed looks at what Apple News and Snapchat Discover have done to avoid the problems Facebook is now having to deal with. In simple terms, the two companies exercise greater control over what is published on their platforms: Apple News curates all content, reviews publishers and monitors RSS feeds, and users can flag fake news or hate speech. On Snapchat, which now has 150 million daily users, posts from the people users follow are displayed chronologically rather than by popularity or a personalized algorithm, and profiles do not display follower counts; users cannot even see how many followers they themselves have.

These are not the kinds of sites where Macedonian teenagers are going to have much success placing material, or where politicians with dubious reputations can post completely fake news, but rather sites with clearly established rules about who is admitted. In the case of Apple News, we’re talking about 70 million active users and content from more than 4,000 publications, with access regulated by the same type of rigid and well-known rules that govern the company’s App Store: break them and you will soon find yourself excluded.

Finally, in “Facebook, Microsoft, Twitter and YouTube collaborate to remove ‘terrorist content’ from their services”, we begin to see rival services collaborating to produce a greater good, in this case eliminating terrorist propaganda and calls to action from the web.

The problems most platforms face come not from technology, but from their users, who apparently need to be subject to control and supervision to prevent the worst of their nature from surfacing.

In my view, Github’s approach is the more democratic: anyone can create a project on the platform without any limitation, and problems only arise if they break the rules, at which point the community management team intervenes.

Apple and Snapchat’s solution is certainly effective, but it is rather drastic and imposes a significant entry barrier, meaning that people with valuable contributions to make could be left out.

Mechanisms of this type, perhaps applied in different combinations, are increasingly necessary to manage the internet, and studying cases like these can provide solutions. The dream of an internet open to all has come up against the worst aspects of human nature, exposing its many limitations; sad, but true.

(In Spanish, here)

--

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)