The reckoning for social media and the internet: 7 warning signs

Photo: iStock/Rasica

(This article is based on a speech given at the Social Media and Society conference in Toronto on July 29, 2017. View the slides for the presentation.)

The future of the internet and its social media offspring seems up for grabs these days. Researchers have documented the impact of weaponized narratives aimed at creating confusion and conflict among groups in order to undermine civilizations; how bot armies use new technology tools to manipulate public opinion through “computational propaganda”; and the ways surveillance and filtering techniques are used by governments to undermine opponents. Moreover, the U.S. intelligence community now lists cyberwar as a critical threat.

Relatedly, internet godfather Vinton Cerf argued in a recent Pew Research Center report that, “Trust is rapidly leaking out of the internet environment.” And commentators like The Guardian newspaper’s John Naughton have wondered if the internet should be considered a “failed state.”

Data from Pew Research Center illustrate a number of ways in which problems are arising on multiple fronts for the internet and social media, even as adoption of these technologies continues apace. Here are seven key themes:

1. ‘Total noise’ confuses: There is mounting evidence that the environment of “total noise,” as the late writer David Foster Wallace called it, distresses citizens. Almost two-thirds of adults (64%) say fake news causes a great deal of confusion about basic facts tied to current events, and 23% report having shared fake news themselves — sometimes inadvertently and sometimes knowingly. Some 60% of American adults say this statement describes them: “I find it difficult to know whether the information I find online is trustworthy.” The problem has even reached the intimate level: 26% of Americans (including 46% of those ages 18 to 29) say false information about them has been posted online.

At the same time, however, most Americans like having so much information at their fingertips: just 20% say they feel overloaded by information, a smaller share than a decade ago. And majorities are confident in their ability to use the internet to keep up with information, and say that having a lot of information gives them a feeling of control.

2. Political polarization affects people’s judgments and interactions, and that plays out online: Pew Research Center has documented the growth of polarization as the parties have become more ideologically homogeneous. This is a broad phenomenon with impacts far beyond online life. Still, there are ways that it ties to people’s views about woes in the information environment and what happens on the internet.

First, people think the “total noise” information environment muddles democratic deliberation. Just before the election, 81% of Americans believed that supporters of Donald Trump and Hillary Clinton could not even agree on basic facts. That has clear implications for the capacity of political institutions to address issues. In the midst of this cacophony, citizens can lose heart. Some 62% of citizens, Republicans and Democrats alike, say that on important political issues their side is losing more often than winning.

Second, citizens’ exposure to partisan-infused material in many places, including online news and social media, might be shaping their views of each other’s characters. Supporters of each party are much more likely to describe those in the opposition as closed-minded, immoral, dishonest and unintelligent than to credit them with the opposite, positive traits. Not surprisingly, for the most ideologically consistent citizens, this antipathy spreads to their views about their neighbors. Many said in our 2014 study that it was important to them to live near people who share their views and to nurture close friendships with those who are like-minded. These divisions continue today: 59% of Americans say it is “stressful and frustrating” to talk about politics with people who have a different opinion of Trump than they do; just 35% find such conversations “interesting and informative.”

3. A fractured media ecosystem lets people tailor their information diet and customize their trust: The embittering trends underlying polarization play out in people’s media choices. The Center’s exploration of people’s preferred media sources shows a world where red and blue diverge. They gravitate to specific sources they feel are trustworthy. This disaggregation of trust also shows up in other findings: Americans have differing and discriminating views about the trustworthiness of specific government agencies and about the reliability of different corporate actors to protect their personal data. There are also sharp partisan divisions in views of the impact of national institutions, most notably the national news media and colleges and universities.

4. Internet users can be disinhibited, and that can lead to harassment: People think the internet fosters anonymity, and many believe that anonymity allows users to be more critical of others, even threatening. Fully 41% of adults have experienced online harassment, including 18% who have experienced severe forms such as physical threats and stalking. Asked about their most recent episode, 54% of those who were harassed say the incident involved someone they did not know.

This disinhibition can also play out in political discussions. One study asked people to compare political discussions online and offline. Social media users were vastly more likely to say the discussions on those platforms are less respectful, less likely to come to a resolution, less likely to be focused on policy debates and less informative than discussions in other places. They also said the online discussions were considerably angrier.

5. ‘Attention economy’ platforms incentivize outrage: The prospect of moving away from these dismissive or hostile views is not helped by the social media business model. Many experts canvassed by Pew Research Center believe that the economic structure of key social media platforms rewards anger, outrage, anxiety-producing claims and menacing statements, rather than civil discourse. The Center’s own study of the statements of U.S. lawmakers found that significantly more social media engagement was generated by indignant language than by less intense disagreement. We also found that highly ideological members of Congress have more Facebook followers than moderates do.

6. Bots are now key actors in information environments: Social media are being gamed to exploit the algorithms that underlie the platforms — and to take advantage of human fallibilities. For instance, the Oxford Computational Propaganda Project documented how, after one 2016 presidential campaign debate, one-third of pro-Trump Twitter traffic and one-fifth of pro-Clinton traffic was driven by bots. More broadly, the Project argued that bots affect information flows by “manufacturing consensus, or giving the illusion of significant online popularity” and by “democratizing propaganda through enabling nearly anyone to amplify online interactions for partisan ends.”

Of course, some kinds of bots can be incredibly helpful, and almost all of us profit from the ways algorithms help us navigate information spaces and the physical world. At the same time, there is clear evidence that “bad bots” can wreak all kinds of damage, ranging from devastating denial-of-service attacks to promoting fake information.

7. People can be turned off by all the drama and commotion: Pew Research Center data also show that people can be worn out by the challenges of social media. By the end of the last election, some 37% of social media users said they were worn out by the volume of political posts they saw, and 39% had taken steps to block other users or minimize the content they saw from them for political reasons.

What can be done?

As experts ponder the reckoning for the internet on these issues, they focus on several possible remedies: They hope that technological changes, driven by artificial intelligence, other algorithms and perhaps blockchain, will weed out some of the bad behavior, and that humans will get smarter about avoiding the pitfalls of operating in digital environments. Still, they worry that these problems, anchored as they are in human nature, may never be fully overcome.

When ordinary users think about solutions, especially in the context of harassment, they see a variety of actors as responsible for improvements: 64% believe online services themselves should have a major role in addressing the problems; 60% see other users who witness problems as major actors; 49% look to law enforcement; and 32% say elected officials should play a major role in addressing these issues. In short, they are saying a networked cluster of problems should be confronted by a networked cluster of fixers.