Social Media: Driving or Diminishing Civic Engagement?

Kelly Born, Hewlett Foundation


Social media platforms are having a profound impact on civic engagement and democracy. The scale of the impact is evident — globally, there are 2 billion people on Facebook, with the average user spending almost an hour a day on the site. Almost 70 percent of Americans now get news on Facebook, with 10 percent getting news on Twitter or YouTube. But while the scale is evident, the nature of their impact is not yet clear — in large part because these platforms have, to date, shared little data about how their technologies are affecting society and democracy.

Social media platforms offer many democratic benefits — connecting citizens with each other (and their representatives), encouraging voter turnout, even supporting virtual town halls. They have given voice to diverse and minority viewpoints whose access to broad-scale distribution was previously severely constrained. They have supported connectivity among democratic advocates in authoritarian regimes around the world — making it possible to build coalitions, take action, and exercise free speech.

But, over time, the downsides at the intersection of social media and civic engagement have also become evident: these platforms can suppress civic and political engagement on one hand and, on the other, spread inaccurate and uncivil disinformation that may be driving less constructive, more polarizing forms of engagement.

Before the advent of social media, it took time for citizens to express their views and frustrations — writing, stamping, and mailing letters to the editor; crafting op-eds (and having them regularly rejected); and so on. Today, users can share extreme opinions, vitriolic comments, or harassing tweets in real time. Several online behaviors — which often target journalists, women, and ethnic or racial minorities — are key:

  • Trolling involves spreading deliberately offensive or provocative content, often with the aim of upsetting or silencing audiences.
  • Doxing involves the distribution of a target’s personal information across the internet without consent.
  • “Flooding” tactics (sometimes called “reverse censorship”) “distort or drown out disfavored speech through the creation and dissemination of fake news [and/or disinformation], the payment of fake commentators, [or] the deployment of propaganda robots [bots].”

Together, these tactics can dramatically reduce engagement among the very communities to which social media initially seemed to have given voice.


Even for those who aren’t actively silenced, widespread confusion about the accuracy and authenticity of online information may be enough to simply demotivate political engagement. The same anonymity that has empowered minority voices around the world has enabled the rise of bots, and the removal of traditional gatekeepers and markers of quality — a newspaper’s brand, for example — makes it much more difficult for consumers to discern the quality and accuracy of articles. At the same time, the sheer volume of content may be overwhelming and demotivating to many, especially when the origins are unclear and citizens cannot determine whether they are receiving information from a fellow American, a Russian, or indeed even a fellow human.

While this climate of confusion and vitriol risks suppressing engagement further, on the flip side, others are being persuaded, polarized, and mobilized by these dynamics. Just as social media gave pro-democratic minorities a voice during the Middle East’s Arab Spring, these platforms have also given voice to extremist, anti-democratic minorities in the liberal West — and social platforms are in some ways perfectly tailored to make disinformation go viral. Social media may not be the great equalizer or “liberation technology” many had initially hoped it would be.

Many have noted that “what’s good for capturing human attention is often bad for humans.” By virtue of business models that rely on exposure to advertisements, social media platforms prioritize engagement. Unfortunately, recent studies confirm that fake news and false rumors appear to “reach more people, penetrate deeper into the social network, and spread much faster than accurate stories.” False stories reach “1,500 people six times quicker, on average, than a true story does.” This is not just the result of artificial amplification by bots, but of the human tendency to prefer novel and emotional content that aligns with pre-existing political worldviews and “tribal” identities. Posts using moral and emotional (and thus potentially more polarizing) language appear to receive a roughly 20 percent boost in sharing. On platforms that reward engagement, these tendencies to engage more with novel, inaccurate, self-reaffirming, and emotionally charged content are likely encouraging more destructive forms of political engagement offline as well.


While solutions are not yet clear, there is some hope that, even as online platforms may be negatively impacting civic engagement, citizens can also be part of the solution: by decreasing social media usage, or by pausing and reflecting before sharing potentially problematic content. Of course, funders can support these grassroots actions by helping to explain, elevate, organize, and incentivize them. But changing the practices of hundreds of millions of citizens is much harder than moving upstream to change the practices of a handful of online platforms. Here, funders can support research to better understand the nature of these problems and possible solutions. Once the most productive solutions are identified, funders can help organize citizens to begin putting pressure on governments and platforms to address these negative features effectively.

There is still much we don’t know about the influence of social media on democracy. The rise of social media and removal of gatekeepers appears to have unleashed a tidal wave of disinformation, with repercussions throughout the public square. These platforms may be suppressing engagement on the one hand, while radicalizing and increasing engagement on the other. This is a new reality, and research is still nascent. What is clear is that anyone who cares about civic engagement needs to care about digital disinformation.

Kelly Born is a program officer for both Special Projects and the Hewlett Foundation’s democracy-related grantmaking. In her role with the Madison Initiative, Born’s work focuses on an array of issues related to reducing today’s political polarization. Born oversees the Madison Initiative’s grantmaking in the areas of civic engagement and disinformation. She also serves on the Board of Directors for Philanthropy for Active Civic Engagement (PACE). Before joining the Hewlett Foundation, Kelly worked as a strategy consultant with the Monitor Institute, a nonprofit consulting firm, where she supported a range of foundations’ strategic planning efforts. In addition to her experience as a strategy consultant, Kelly has worked with various nonprofit and multilateral organizations including Ashoka in Peru, the World Bank’s microfinance group CGAP in Paris, Technoserve in East Africa, and both The Asia Foundation and Rubicon National Social Innovation in the Bay Area. Kelly guest lectures on impact investing at Stanford’s Graduate School of Business and on Women and Development at UC Santa Cruz, where she lives.

This piece is a commentary on the PACE paper: “Infogagement: Citizenship and Democracy in the Age of Connection.” Please see the publication for more commentaries and the original paper — and follow #Infogagement to continue the conversation.
