IMAGE: Alexander Parenkin — 123RF

The dangerous chemistry of the internet

Enrique Dans

--

What are the ingredients that have turned the internet into a home for lies, malware and misinformation? Firstly, clearly, the removal of entry barriers to publication. This is a phenomenon I perhaps appreciate better than most: it has allowed me to share my thoughts with you every day, rather than having to send them to a magazine or newspaper, trust their more or less rigorous filters, wait for confirmation that there is space and then, finally, see my article in print a few days later, as used to be the case. Compared to that laborious and time-consuming process, effectively being my own publisher seems undoubtedly better. The removal of entry barriers to publishing looks, in principle, like good news, giving more people a voice and democratizing society … until, of course, the smart alecks decide that they also have a right to be heard.

The second ingredient is the set of incentives that advertising generates. Advertising on the internet seems like a good idea: it is a medium that allows advertisers to target their products more accurately, offers faster feedback and allows many people to fund their publications. If only it were that simple. The use of traditional models leads to abuse, to intrusive formats and, above all, to the principle of quantity over quality, which measures success in terms of website traffic at the expense of everything else, including common sense. There are sites, and many of them, that automatically reload their pages every few minutes to add a page view, as well as others that are filled with clickbait. The more page views, the more money.

Thirdly, there are the social networks that allow us to connect with our friends and with the people who share our ideas and interests. Again, apparently a great invention, until we discover that for many people they are the prism through which they look at the world, chasing likes, doing anything to attract followers, pursuing their 15 minutes of fame and sharing things that probably shouldn't be shared.

Finally, there are the recommendation algorithms that choose the material we supposedly want to see, based on what we have looked at previously, what our friends have been looking at, or what has provoked a reaction from us. Again, this seems like a good idea, something that works for us and helps us choose what to read amid a vast ocean of information. But combine this with the fact that we tend to have friends who think along the same lines as we do, and the result is Eli Pariser's famous filter bubble: a hall of mirrors in which our thinking is amplified, corroborated and multiplied infinitely by that of others, leading us to feel validated, to believe that everyone thinks like us, or that others only stay silent out of political correctness, and that racists, misogynists, fascists and other undesirables are ready to take over the streets and beat up the first immigrant they come across, or to vote for the first candidate whose ideas are not only unconstitutional but go against all common sense. No, the issue with racists, misogynists, fascists and the like is not about political correctness anymore, but the contrary: it is about teaching these abominable people what it takes to live in a modern society.
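To make that dynamic concrete, here is a minimal, hypothetical sketch of the kind of engagement-based ranking described above: content is scored by how closely it matches what we and our friends have already engaged with, so whatever we clicked on yesterday is what we are shown tomorrow. Every name, topic and weight in it is an illustrative assumption, not any real platform's algorithm.

```python
# Toy illustration of engagement-based recommendation and the filter bubble.
# All weights, topics and names are assumptions for the sake of the example.

from dataclasses import dataclass


@dataclass
class Item:
    title: str
    topic: str


def score(item, user_history, friends_history):
    """Rank an item by prior engagement: the more its topic already appears
    in the user's history and their friends' histories, the higher it scores."""
    own = user_history.count(item.topic)
    social = friends_history.count(item.topic)
    return 1.0 * own + 0.5 * social  # weights chosen arbitrarily


def recommend(items, user_history, friends_history, k=3):
    """Return the k items most similar to what was already consumed."""
    ranked = sorted(items,
                    key=lambda it: score(it, user_history, friends_history),
                    reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    feed = [Item("Story A", "politics"), Item("Story B", "science"),
            Item("Story C", "politics"), Item("Story D", "travel")]
    # A user whose circle mostly clicked on politics is shown mostly politics:
    # the engagement feeds on itself and the bubble tightens.
    print(recommend(feed,
                    user_history=["politics", "politics", "science"],
                    friends_history=["politics", "politics", "politics"]))
```

Even this crude sketch shows the loop: nothing in it optimizes for accuracy or diversity, only for resemblance to past behaviour, which is precisely what turns a convenience into a hall of mirrors.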

This is the so-called tabloidization of the web or, as some see it, the web as a reflection of society. Obviously, the web itself is not to blame and it makes no sense to abolish it. Instead, the mix of ingredients described above needs to be regulated: their use facilitated, their abuse prevented. That is easier said than done, and a task that some will try to use to impose censorship, that may well see the innocent suffer alongside the guilty, that will see the factories churning out fake news confused with satirists or humorists, and that will require us to take into account any number of subtleties.

It will be difficult, but we have a responsibility to try. We have created something that turns popularity into truth, and it is pretty obvious that sooner or later it will need to be brought under control. Doing so will not be easy, but if we don't, the outcome will be far worse and we will have contributed nothing positive to our society.

(In Spanish, here)

--

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)