How can the science of catastrophic collective behavior provide insight into the woes of social media platforms?
On Thursday, November 4, the Berkman Klein Center’s Institute for Rebooting Social Media hosted an invited workshop to collaboratively explore this question. The inspiration for the workshop came from the paper “Stewardship of global collective behavior,” spearheaded by Dr. Joseph Bak-Coleman and co-written with authors from the physical, biological, and social sciences. The paper uses examples from climate change and conservation biology to illuminate critical questions about social media, including:
- What does it mean to treat social media like a crisis discipline, or a discipline that lacks the luxury of time?
- How does network structure, and the types of interactions allowed via that structure, affect the spread of information in a group?
- How could we more intentionally and productively align individual decision-making with large-scale behavior when thinking about stewardship of social media?
Moderated by Professor James Mickens and Dr. Joseph Bak-Coleman, the workshop brought together experts from across academia, industry, and civil society to envision what good stewardship of social media could look like.
Aligning on values, designing effective levers, working across disciplines
Values are collective beacons. However, there is no consensus on what values social media platforms should enforce and promote. Workshop participants agreed that in thinking about “rebooting social media,” we must first articulate the important values that social media should protect. Yet, because what is considered valuable differs across regions, communities, and groups, there is no single set of values we can use to design platforms and develop policy. Moreover, in a social media industry whose values have historically been dictated by business objectives, participants worried that a values-driven approach may be incompatible with business models dependent on advertising and engagement. Nevertheless, all agreed that a values-first approach is critical to starting the conversation and can help identify the relevant stakeholders and their respective incentives.
Centering on human values (as opposed to purely economic ones) reminds us who platforms should ultimately serve. Social media is driven by content submitted by humans, so the study of social media is in many ways a study of human behavior. Many workshop participants stated that inclusivity must be prioritized in the process of defining values, to ensure all perspectives — international, multicultural, academic, and industry — are included in the conversation.
Trying to define shared values is hard, and putting those values into practice raises additional challenges. For example, Yiqing Hua highlighted some of the difficulties by raising several questions: Where is the line between good and bad coordination? How should platforms moderate coordinated online activity from K-pop fandoms versus QAnon? Professor Martin Wattenberg agreed, noting that many undesirable behaviors look very similar to desirable ones.
Levers to Encourage Constructive Behavior
Well-defined values provide a goal to move towards, but workshop participants also stressed the need to better understand how to move social media platforms towards these values. For example, how do we design social media to encourage socially healthy behavior? What technical approaches can we use, and what features of social media platforms would have to change?
Professor James Mickens noted the importance of differentiating between mechanism and policy. Policies specify how users may interact with the system; mechanisms are the technical levers that implement those policies. Currently, social media platforms hide many of their mechanisms: there is little transparency about what data platforms track, store, and compute. While companies must keep some degree of secrecy to retain competitive advantages, transparency is essential for external accountability. Given today’s lack of transparency, we can often only analyze the front-facing policies that have led to our current state of social brokenness.
For example, we know that frictionless resharing and dense networks encourage virality and the spread of clickbait. We also understand that a lack of content moderation enables fake news and hate speech.
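The link between frictionless resharing and virality can be made concrete with a toy model. The sketch below is an invented random-network cascade, not any platform’s actual mechanics; all names and parameters are illustrative. A post spreads from one user, and a `friction` knob scales down the probability that anyone who sees the post reshares it:

```python
import random

def cascade_size(n_users, avg_degree, reshare_prob, friction, seed=0):
    """Toy independent-cascade model: a post spreads from one seed user
    over a random follower network; `friction` scales down the chance
    that a user who sees the post reshares it."""
    rng = random.Random(seed)
    # Random directed network: each user's posts are shown to avg_degree followers.
    followers = {u: rng.sample(range(n_users), avg_degree)
                 for u in range(n_users)}
    effective_p = reshare_prob * (1 - friction)
    seen, frontier = {0}, [0]
    while frontier:
        sharer = frontier.pop()
        for follower in followers[sharer]:
            if follower not in seen and rng.random() < effective_p:
                seen.add(follower)
                frontier.append(follower)
    return len(seen)

def avg_cascade(friction, trials=30):
    """Average cascade size over several random networks."""
    return sum(cascade_size(3000, 20, 0.08, friction, seed=s)
               for s in range(trials)) / trials

# Halving the effective reshare rate pushes the cascade from
# supercritical (reaches much of the network) to subcritical (dies out).
print(avg_cascade(friction=0.0), avg_cascade(friction=0.5))
```

With 20 followers per user and an 8% reshare chance, each reshare spawns 1.6 others on average; 50% friction drops that below the critical value of 1, which is why seemingly small design changes can have outsized effects on virality.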
Workshop participants brainstormed a variety of policies that would counteract the relentless engagement-based optimization of current social media platforms. For example, to deter social media addiction, platforms could enforce usage caps, much like the limits that casinos place on gambling. To prevent the unfettered spread of disinformation, platforms could add more friction and noise to the resharing process and build networks around smaller communities. Alternatively, social media platforms could reduce paternalism by giving users more control of their experience.
Platforms could define open protocols that allow users to “bring your own recommendation algorithm.” Platforms could also encourage decentralized ownership and self-governance, similar to communities on Reddit and Wikipedia.
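The “bring your own recommendation algorithm” idea can be sketched as a minimal protocol: the platform supplies a pool of candidate posts, and any user-chosen scoring function decides the order of the feed. Everything below (function names, post fields, the rankers themselves) is a hypothetical illustration, not an existing platform API:

```python
from typing import Callable, Dict, List

# Hypothetical open protocol: the platform exposes candidates, the
# user plugs in any function that maps a post to a score.
Post = Dict[str, int]
Ranker = Callable[[Post], float]

def build_feed(candidates: List[Post], ranker: Ranker,
               limit: int = 10) -> List[Post]:
    """Order the candidate pool by the user's own ranker."""
    return sorted(candidates, key=ranker, reverse=True)[:limit]

# One user optimizes for recency; another down-ranks viral pile-ons.
recency_ranker = lambda post: post["timestamp"]
calm_ranker = lambda post: -post["reshares"]

posts = [{"id": 1, "timestamp": 300, "reshares": 9000},
         {"id": 2, "timestamp": 200, "reshares": 12}]
print([p["id"] for p in build_feed(posts, recency_ranker)])  # → [1, 2]
print([p["id"] for p in build_feed(posts, calm_ranker)])     # → [2, 1]
```

The point of the protocol is that the same candidate pool yields different feeds for different users, without the platform dictating a single engagement-maximizing ordering.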
These policies would be layered atop underlying mechanisms. For example, a platform might have to build new infrastructure that collects statistics measuring post virality, social network connectivity, and group size. Platforms might need a system of protocols for communication with other social media platforms. Platforms might need to add official mechanisms for users to communicate with platform engineers and vote on proposed engineering changes. Devising the right set of policies and mechanisms requires creativity, nuance, and forward-looking analysis to effectively encourage constructive behavior. Some of the needed mechanisms may already exist within current platforms, hidden from public view; however, other mechanisms will need to be constructed from scratch.
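The mechanism/policy layering might look like this in miniature. In this hypothetical sketch, the platform exposes low-level mechanisms (a virality statistic, a reshare-delay knob), and a policy is just a rule that reads one mechanism and actuates another; all names and thresholds are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Post:
    author: str
    reshares: int = 0

@dataclass
class PlatformMechanisms:
    """Low-level levers a platform could expose (names invented)."""
    posts: List[Post] = field(default_factory=list)
    reshare_delay_s: int = 0  # friction knob

    def max_virality(self) -> int:
        # Mechanism: measure, don't judge.
        return max((p.reshares for p in self.posts), default=0)

    def set_reshare_delay(self, seconds: int) -> None:
        self.reshare_delay_s = seconds

def friction_policy(mech: PlatformMechanisms, threshold: int = 1000,
                    delay_s: int = 30) -> None:
    """Policy: IF virality exceeds a threshold, THEN apply friction.
    The policy decides; the mechanisms carry it out."""
    if mech.max_virality() > threshold:
        mech.set_reshare_delay(delay_s)

mech = PlatformMechanisms(posts=[Post("a", reshares=5000),
                                 Post("b", reshares=3)])
friction_policy(mech)
print(mech.reshare_delay_s)  # → 30: the policy engaged the delay mechanism
```

Keeping the two layers separate means a regulator or researcher could audit the policy without the platform disclosing every internal mechanism, and the same mechanisms could serve very different policies.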
One final theme that emerged from both the paper and the workshop is the importance of an interdisciplinary approach that draws on an expanded set of people, industries, and policy areas. For example, managing the risk of catastrophes on social media may resemble managing systemic risk in finance, ecology, and engineering. Information spread in social networks may be similar to information flow in schools of fish or flocks of birds. Given the relative newness of social media and its rapid evolution, knowledge, methods, and ideas from other domains will be critical to determining good stewardship of online behavior.
For example, climate science researchers have aligned on a single metric — global temperature rise — as a guidepost for societal action. Could we design a singular metric to guide social media policy? Devising such a metric is hard, because social media interactions are driven by a large number of qualitative measures (e.g., individual happiness, social cohesion, the ability to challenge entrenched power dynamics) and quantitative network metrics (e.g., fan-out ratios and information exchange rates). That makes it difficult for any metric to capture the full complexity of human engagement online. Rather than trying to create a single metric to optimize, we might be better served by defining a series of warning thresholds for a variety of individual metrics.
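A warning-threshold approach could be sketched as follows, with each metric keeping its own threshold and direction rather than being collapsed into a single score. The metric names echo those mentioned above, but every threshold value here is an invented placeholder:

```python
# Each metric keeps its own warning threshold and direction; the names
# echo metrics discussed above, but every value is a placeholder.
THRESHOLDS = {
    "mean_fanout_ratio":  ("above", 2.0),   # reshares generated per view
    "info_exchange_rate": ("above", 1e6),   # messages per minute, platform-wide
    "reported_happiness": ("below", 0.5),   # hypothetical survey score in [0, 1]
}

def warnings(metrics):
    """Return the metrics that have crossed their warning thresholds,
    rather than collapsing everything into one number to optimize."""
    tripped = []
    for name, (direction, limit) in THRESHOLDS.items():
        value = metrics.get(name)
        if value is None:
            continue  # unmeasured metrics simply don't fire
        if (direction == "above" and value > limit) or \
           (direction == "below" and value < limit):
            tripped.append(name)
    return tripped

print(warnings({"mean_fanout_ratio": 2.7,
                "info_exchange_rate": 4e5,
                "reported_happiness": 0.4}))
# → ['mean_fanout_ratio', 'reported_happiness']
```

Unlike a single optimization target, a tripped threshold is only a signal for human attention, which sidesteps the problem of any one metric failing to capture the full complexity of online engagement.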
Questions for Further Consideration
The workshop generated a number of broad questions about the future of social media that the Institute for Rebooting Social Media hopes to explore:
- How have different group values shaped the status quo of social media today?
- How can we rethink the ad-driven business model of social media?
- Who should set the norms and rules of social media? Who are we leaving out of these conversations, and how can we include more perspectives more effectively?
- To what extent should platforms moderate user behaviors? How are rules and guidelines received by those being moderated?
- How do we balance long-term versus short-term incentives?
- How do we include measures of both online and offline behaviors in our analysis of social media?
- How do we measure the values that we care about, and how do we validate our progress as we develop new kinds of social media that center those values?
The challenges posed by social media cannot be fixed by one sector or group alone. A large-scale collective problem must be met with a large-scale collective solution. Analyzing social media via a crisis discipline framework highlights the urgency of the problem and the interdisciplinary effort needed to address the issue.
This recap was written by Emily Hong and Neeraj Chandra. The workshop was organized by Professor James Mickens, Hilary Ross, and Joanne Cheung, in collaboration with Dr. Joe Bak-Coleman at the University of Washington Center for an Informed Public, with research assistance from Emily Hong and Neeraj Chandra.
For updates from the Institute for Rebooting Social Media, including future events, research, calls for applications, and additional opportunities to get involved, sign up for Berkman Klein’s newsletter and follow Rebooting Social Media on Medium.