Is Democracy Built for the Internet?

Internet technology is said to be the great facilitator of democracy, but research suggests that we might not be well-equipped to handle it.

The Internet has promised a lot. It has the power to give voice to the voiceless and to provide a conduit for the exposure to, and exposition of, free expression. It's billed as the great hero of free speech, and it is for precisely this reason that the Internet continues to be so heavily censored in countries with strong authoritarian regimes. From the dictator's point of view, unfettered access to the Internet seems deeply threatening, and probably is. It may spread contagions such as freedom of speech, representative government, freedom of religion, human rights, and democracy. How could any self-respecting authoritarian regime NOT control which ideas might "infect" its population?

In recent months a series of troubling reports has caused me to question whether technology really will be the boon for democracy it purports to be. It started when Gizmodo published a series of articles exposing some of the processes behind how the "trending" news on Facebook actually trends, including the alarming discovery that legitimate news trends were "blacklisted" on a daily basis. After the initial article, Gizmodo soon published a follow-up which extensively quoted former employees who worked on Facebook's "trending" news. They described how Facebook "routinely suppressed conservative news," and how they would overtly insert ideologically motivated stories, which would otherwise not be trending, into the "trending" section.

Put simply, Facebook was (is?) presenting a manufactured image of the social world, one that more closely resembles a world they approve of than the world we actually live in. This is then presented as social reality. That was no trivial thing at the time, and with rising awareness of fake news and "alternative facts," it is more relevant than ever.

Part of the challenge is that humans are built to use the attention of others as a cue for status. Joseph Henrich and Francisco J. Gil-White, then at the University of Michigan and the University of Pennsylvania respectively, developed the theory of "information goods," showing that humans, unlike chimpanzees, use what they term "relative prestige" to assign status, and then use that status as a signal of what to believe and whom to emulate.

The challenge is that this evolutionary mechanism developed long before mass media. It developed through direct observation, where trusting the attention of others to point you in the right direction was adaptively useful. The Internet is particularly adept at scrambling these signals. Because seeing others give their attention to something or somebody signals that they have something to teach us, we come to associate prestige with expertise. And so Donald Trump has a giant swath of the nation, made up of people who are, I assume, mostly reasonable and well-adjusted, who actually think him trustworthy. They're surrounded by others who are paying attention to him, especially online, and so something basic, even primordial, signals that this is someone who deserves attention, emulation, and belief. They trust him, at least in part, because his ability to attract attention signals trustworthiness.

This also has huge ramifications for our consumption of the news, personalization algorithms, and the general tendency to surround ourselves with those like us. This is no trivial thing. The "secret" decisions that go into the algorithms behind what we are shown can be extremely powerful. Robert Epstein, the renowned psychologist and former editor-in-chief of Psychology Today, published a series of experiments he recently conducted with Ronald E. Robertson involving over 4,500 undecided voters. They were able to show that Google's search algorithm has the power to "shift the voting preferences of undecided voters by 20 percent," and even by as much as 80 percent in some demographic groups. All this was accomplished without the voters being aware of any manipulation. They call this phenomenon the "search engine manipulation effect."

From a societal and sociological standpoint, we often don’t take platforms like Facebook and Google seriously enough. Facebook, for example, has some serious intentions when it comes to the news. Last June Mark Zuckerberg was asked about the role he sees for Facebook and the news, and he stated that it is his intention for Facebook to become “the primary news experience people have.” There’s a problem when we are led to believe that we are being fed ‘unbiased’ reflections of the social world around us, but we are instead being delivered a carefully constructed, and approved, worldview masquerading as “what your friends think.”

Have you ever played the telephone game where somebody gets the bright idea in the middle of the chain to just make up something different and pass that along? Apparently Facebook is that guy, perhaps unintentionally and apparently sometimes intentionally. Clicks, likes, and shares signal what we believe to be worthy of our attention, and like it or not, that influences how we and others see the world. That interaction is short-circuited if you're not getting what others actually think is important, but instead what a few young social engineers at Facebook, or even an algorithm, would prefer you to think is important. On that level it's not so unlike China, North Korea, or Russia. Those governments can, and do, control what is written in the news, but at least knowledgeable people there can take the news with a grain of salt. Given the speed and convenience with which I imagine most of Facebook's 1.6 billion users interpret and react to the "trending" section in the corner of Facebook, I think it is safe to consider that section largely unsalted.

The Internet may also subvert democracy in much more subtle ways. Online, as offline, we tend to gravitate towards those like us. Psychologists call this homophily. The difference is that on the Internet we are never forced to sit down and have a face-to-face discussion with someone with whom we disagree. Rather, we can join thousands, or millions, of our like-minded 'friends' in condemning, ridiculing, and crusading against those on the "other side."

Research on group polarization has shown that deliberation with like-minded individuals tends to move groups and individuals towards a more extreme point than their pre-deliberation judgments. In other words, people holding moderate, nuanced, or more inclusive views in the beginning, after speaking to others of the same opinion, tend to experience a form of radicalization: they hold tighter and more firmly to their original beliefs, and actually take them to more extreme places. This is called the echo-chamber effect, and it plays on those same evolutionary inclinations. If 137,000 other people (the number of likes on a Trump tweet yesterday) like it, it must be right, right?! But the "other side" is doing that too. So is every other side. We're more and more sure that we're more and more right about more and more stuff that we're certain is more and more apocalyptically important, and the Internet is part of the reason why. All of a sudden democracy isn't so much debating ideas in the town square and choosing among them at the ballot box as it is existential warfare in high-stakes games of us vs. them.

The same thing plays out with the news, "fake" or otherwise. Research has shown that repeatedly hearing a lie causes us to believe it more each time, even when we know it is completely false. In another experiment, when people were directed to simply write one paragraph defending what they were told was a lie, they still increasingly believed it. And perhaps most damning is research showing that even when people are told by the original source that the information is not true, they continue to include it in their decisions and, at least to some extent, believe it. If there's one thing that social media, online news aggregators, and Donald Trump are good at, it's repetition. And it's incredibly hard to unbelieve something.

I personally did some academic research on the growth of Internet use in Egypt before the "Arab Spring." I was hoping to gain some insight into how the Internet may have played a part in the lead-up to the revolution. The data showed a significant negative relationship between accessing the Internet and trusting your neighbor. Put simply, Egyptians who got online more ended up trusting their neighbors less.

Findings in psychology suggest that we already tend to think that people agree with us more than they actually do (this is called false consensus bias). We also tend to look for things that agree with what we already think (confirmation bias). We are not condemned to these fates, but without concerted effort, they are our initial inclination. The question is, how much concerted effort do any of us have to spend sifting through the social flows of our online lives? Add to that the evolutionary pull to trust and follow those who are getting the most attention, combined with the almost miraculous ability to surround our digital selves with what others are paying attention to, and I suggest that there may be cause for concern. Celebrity is trustworthiness, likes are validation, and the stakes are increasingly extreme.

I don't hate technology. I live in the heart of Silicon Valley and technology is the focus of my career. I bought the first Android phone (the G1!). I later switched to the iPhone (sorry!) and now lament when I am even a month behind the latest model. I would be uncomfortable if my boss saw my browsing history because of how much time I spend reading the news. But there's a strong case to be made that without some thoughtful work on the part of governments, companies, and individuals, we may be wildly under-prepared as a species for the onslaught of online political warfare, and in the end technology and democracy may not be such great bedfellows after all.