CheckYourself: care before you share

An intervention that asks readers to pause and think critically before sharing a news article on social media.

Karen Escarcha
Sep 28, 2020 · 12 min read

Team: John Baldridge, Yuri Deng, Karen Escarcha

This post documents our team’s thought process and intervention idea for Project 1: Navigating Information as part of Communication Design Studio, taught by Stacie Rohrbach in Carnegie Mellon University’s School of Design.

Project Brief

We were tasked with investigating a range of news sources to better understand how they present information and what the creators’ motivations might be. We spent about two weeks researching news sources independently and then we came together as a team and spent another two weeks developing an intervention. We conducted research by browsing native news websites and social media platforms. Due to the COVID-19 pandemic and geographical differences (one of us is based in China), we kept our meetings virtual.

Investigating news sources

Across the three team members, we looked into seven contemporary news sources: The Blaze, .coda, Fox News, The Guardian, HuffPost, Mother Jones, and PBS News Hour. We analyzed their structure (layout, hierarchy, grids, etc.), form (color, typefaces, contrast, etc.), and content (imagery, sentence structure, etc.) and uncovered inherent patterns and relationships among the sources.


We also considered current trends in media, technology, and culture to develop a deeper understanding of how news is currently created and spread. We were all drawn to the issue of bias and the lack of transparency and trust in the news. Our research showed that trust in the media is at an almost unprecedented low, differing drastically along party lines, with Democrats having more trust in mass media than Republicans. Only 41 percent of Americans say they trust mass media, according to a 2019 Gallup report.

GALLUP 2019. Source: https://news.gallup.com/poll/267047/americans-trust-mass-media-edges-down.aspx
  • 13% of Americans trust the media “a great deal,” and 28% “a fair amount”
  • 69% of Democrats, 36% of independents, and only 15% of Republicans say they trust the media

According to a 2020 Gallup/Knight Foundation survey on trust in the media, most Americans say it is harder to stay well-informed and to determine which news sources are accurate and credible.

Our research also found that the rise of social media has changed the information landscape drastically. This includes the way we consume news. In fact, according to a 2019 Pew Research study, social media is now one of the most popular ways in which people, especially young adults, get their political news. It also states that “Americans who mainly get their news on social media are less engaged and less knowledgeable.”

We studied communication design theories to inform our decoding of these news sources. For example, Kevin Lynch’s The Image of the City helped us build new frameworks that we can apply to the design of our intervention. Additionally, we studied sketching as a means to internalize our findings as well as outwardly communicate with others. In both studying theory and practicing sketching, we were able to learn more about constructing new mental models and modes of communication.

Overarching goals

Our goal was to use communication design to intervene in the consumption of news and help people become better-informed citizens. We wanted to learn about the similarities and differences between our identified news sources and translate our research findings into a compelling intervention. This point of intervention is important given the large amount of news people come across in their daily lives. Oftentimes, we feel overwhelmed by all of the information and might even fall prey to consuming and spreading inaccurate information. Our goal is to help people think more critically about the news they come across to prevent the spread of misinformation or the formation of narrow views.

We also wanted to challenge ourselves to explore news sources we would not otherwise consume. In identifying a range of news across the political spectrum, we set out to learn how bias, trust, and transparency play a role in both the consumption and sharing of news.

Methodology and Findings

The first step we took was to familiarize ourselves with contemporary news sources. Individually, we identified 2–3 news sources and explored their native news sites. We deliberately chose news sources spread across the political spectrum to give ourselves a wide range for analysis. We considered the structure, form, and content of each site and compared them side by side to surface inherent patterns, similarities, and differences.

Although each member’s initial research was conducted independently, there was an overlap in the way we approached these sites. Our observations mainly reflected two aspects: visual communication and the expression of ideology.

Visual communication analysis

We first observed the visual structure of these news websites. By analyzing the grids and hierarchy of their page layouts, we found that traditional news websites usually adopt a stricter layout. Fox News, for example, relies on dense columns and lists. In contrast, Mother Jones, an independent news platform, adopts a relatively loose and free layout. All of the websites use typography, color, and images to give news pieces different levels of importance.

However, media born entirely on the internet and traditional news organizations that originated offline show different tendencies on this point. We used intuitive methods of analysis, such as drawing reference lines on representative pages of each website for comparison and annotating the differences in visual elements on the pages. Finally, we summarized the visual form of these websites in analysis charts to show their visual differences at a glance.

Implicit and explicit attitudes

We found a variety of ways that news sites subtly convey their implicit or explicit attitudes. We analyzed how they do this by reading closely and tracking our observations. As we read, we raised questions such as: Who funds the website? How does it emphasize the importance of a piece of news? What elements does it emphasize within a piece of news? What wording do different news organizations use when covering the same story? Do they use an objective narrative tone, or do they constantly emphasize certain points of view?

We found that PBS occasionally mentions its sponsors on its website, while the Guardian states that it is completely neutral and relies on readers’ donations. Other news platforms may not mention their funding sources at all. When we compared headlines, we found that HuffPost tries to attract attention with sensational headlines, while the Guardian uses more neutral wording most of the time. By exploring answers to these kinds of questions, we learned more about how news conveys information, ideas, and prejudice.

We also discussed how websites arrange content in time and space. Traditional news websites usually classify news by the region and type of event, and news from the United States and some European countries tends to dominate. For websites like Coda, which explore topics in depth, we can see that topics from different regions are treated relatively equally. This kind of news website focuses more on what different events have in common and emphasizes shared concerns about global social problems. On the other hand, this model of categorization may lend a more subjective color to the reports.

The following diagram summarizes three approaches to categorizing news: The Guardian organizes news mostly by region and global trends, then uses Opinion pieces to convey insight; .Coda sorts all news under four topics and dives deeply into them, as long as they reflect shared social concerns; Mother Jones focuses mostly on political news, and even in its other channels, like environment and crime, you can still read about politics between the lines.

One surprising but common finding is that news organizations are making content easier to share in prepackaged ways on social media. We suspect that news organizations are leveraging the trust and networks of their readers in order to reach people and make up for a declining general audience.

Sharing findings

In communicating our findings with other students, we used napkin sketches to explain the content of our research. This exercise helped us quickly summarize and present our findings. It also revealed what we had overlooked compared with others’ observations.

Design Intervention

Our general assumption is that individuals who share news and information on social media generally care what their followers think of them. They share news to signal that they are informed (or to show their stance on an issue) or to inform or persuade others about a topic. Regardless of why they are sharing, trust is key: the individual sharing the news wants to be considered reliable and trustworthy, so trust is then placed on the individual sharing the news with their social networks.

From our research, we know that trust in traditional news media is low and more citizens are turning to social media to stay informed and engaged on pressing topics. However, when individuals use their social media feeds as a news aggregator, they may become more insulated and less exposed to outside perspectives. According to Wired, most Americans describe themselves as holding balanced views, when in reality people naturally move toward content that reaffirms their own opinions. This confirmation bias, or the tendency to see new evidence as confirmation of one’s existing beliefs, is especially dangerous on social media. Social media algorithms are designed to identify our preferences and show us more of what we “like.” Simply put, they can turn our preferences into polarization. Our intervention is meant to break that cycle.

CheckYourself intervention

Our intervention is called CheckYourself. CheckYourself is an online tool that informs users about their own bias and gives them more reliable information about the news they are reading and/or sharing. It is designed to enable users to think critically about the content they are consuming and sharing by creating a moment of pause that gives them time to reflect. The goal is to have the user think critically about the information before they share it more broadly. A key component of this intervention is the ability to provide the user with in-the-moment resources that allow them to broaden their understanding of a topic.


Target audience

Our target audience is individuals who are interested in sharing news and information online. They tend to share information that reaffirms their own opinions and biases. They care about being in good standing with their online community and tend to think that this information is helpful and accurate. They also tend to share without thinking critically about the credibility of the news source and/or the information that they are sharing. Many in this target audience could be older adults, as our research suggested that older adults were deemed “super sharers” and were more susceptible to sharing false or misleading information (Science, 2019).

For example, if a user goes to share an article directly from the news source via the “share on social” button, or simply enters the webpage URL into a sharing tool (Facebook, Twitter, LinkedIn, email, etc.), CheckYourself gives the user critical information about the news source.

A user may be notified that the article contains false or misleading information, or that it comes from an unreliable source. We then point the user to credible information from more trustworthy sources. The intervention also informs the user about the news source’s nefarious tactics, like pre-populated share text, which might seem helpful but, when left unaltered, allows the news source to speak for the user. With that in mind, the intervention detects such text, notifies the user, and encourages them to edit it and add their own point of view.
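
To make this flow concrete, here is a minimal sketch of how such a share check could work. Everything in it is an assumption made for illustration: the checkShare function, the SourceRating type, the hard-coded SOURCE_RATINGS list, and the suggested alternatives are not part of an actual implementation.

```typescript
// Hypothetical sketch of the CheckYourself share check (all names are illustrative).

type SourceRating = "reliable" | "mixed" | "unreliable";

interface ShareCheckResult {
  rating: SourceRating;
  warnings: string[];         // messages shown in the pause prompt
  suggestedSources: string[]; // more trustworthy coverage of the same topic
}

// Stand-in for a source-credibility dataset the tool would maintain.
const SOURCE_RATINGS: Record<string, SourceRating> = {
  "theguardian.com": "reliable",
  "example-tabloid.com": "unreliable", // placeholder domain
};

const SUGGESTED_ALTERNATIVES = ["apnews.com", "reuters.com"]; // illustrative only

export function checkShare(articleUrl: string, prepopulatedText?: string): ShareCheckResult {
  const domain = new URL(articleUrl).hostname.replace(/^www\./, "");
  const rating = SOURCE_RATINGS[domain] ?? "mixed";
  const warnings: string[] = [];

  if (rating === "unreliable") {
    warnings.push(`${domain} has a history of publishing false or misleading information.`);
  }

  // Pre-populated share text lets the outlet speak in the user's voice,
  // so the prompt encourages editing it before posting.
  if (prepopulatedText && prepopulatedText.trim().length > 0) {
    warnings.push("This share text was written by the news source. Consider adding your own point of view.");
  }

  return {
    rating,
    warnings,
    suggestedSources: rating === "reliable" ? [] : SUGGESTED_ALTERNATIVES,
  };
}
```

A real version would draw on an actively maintained credibility dataset and fact-checking feeds rather than a hard-coded list; the sketch only shows where the moment of pause would be generated.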

We intentionally designed the intervention to be a little uncomfortable since our goal is to push back against the immediacy of news in order to advance the accuracy and accountability of the content being shared. The main objective of the intervention is to increase (or broaden) the user’s perspective and challenge them to think critically about the information that they consume and share.

Example of a CheckYourself prompt — misleading article
Additional examples of CheckYourself prompts — sponsored content (left); pre-populated text (right)

Intervention entry points

Awareness of the CheckYourself intervention will be built through traditional advertising and by adding a verified banner to every social media post that was filtered through the intervention. We think this seal of approval will serve two purposes: 1) it gives credibility and reassurance to the user when they share information, and 2) it serves as an advertisement to other users who might be interested in learning more about improving their own online sharing behaviors.

Another entry point comes in the form of a quiz. In this scenario, we imagine that an individual who is already aware of their own bias shares the quiz with others in their network whom they think could benefit from this intervention. The hope is that the quiz will be an engaging entry point for individuals who do not think they have a problem with spreading misinformation.


Once the user creates an account with CheckYourself, the tool is omnipresent while they engage with content online. This functionality was inspired by Grammarly’s integration across platforms and devices. CheckYourself intervenes when the user is reading an article online by inserting missing information on the screen and deeming the article reliable or not. It also intervenes when a user is about to share a misleading or false article on any of their social media or sharing platforms (Facebook, Twitter, LinkedIn, or email). We feel that this seamless integration across platforms and devices is key to meeting users where they are and protecting them no matter where they are browsing.
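
As a rough illustration of that Grammarly-style omnipresence, the sketch below imagines CheckYourself as a browser-extension content script that annotates the article a user is currently reading. The import of the hypothetical checkShare helper from the earlier sketch, the banner markup, and the navigation handling are all assumptions rather than a description of a built product.

```typescript
// Sketch of a browser-extension content script that labels the current article.
// checkShare is the hypothetical helper sketched earlier in this post.
import { checkShare } from "./checkShare";

function annotateArticle(): void {
  // Prefer the canonical URL when the page declares one.
  const canonical = document.querySelector<HTMLLinkElement>('link[rel="canonical"]');
  const articleUrl = canonical?.href ?? window.location.href;
  const result = checkShare(articleUrl);

  // Replace any existing banner, then insert a simple reliability label at the top of the page.
  document.getElementById("checkyourself-banner")?.remove();
  const banner = document.createElement("div");
  banner.id = "checkyourself-banner";
  banner.textContent =
    result.rating === "reliable"
      ? "CheckYourself: this source is generally considered reliable."
      : `CheckYourself: ${result.warnings.join(" ") || "we could not verify this source."}`;
  banner.style.cssText = "padding:8px;border:1px solid #ccc;font-size:14px;";
  document.body.prepend(banner);
}

// Re-run the check on single-page-app navigations, where the URL can change
// without a full page load.
let lastUrl = window.location.href;
new MutationObserver(() => {
  if (window.location.href !== lastUrl) {
    lastUrl = window.location.href;
    annotateArticle();
  }
}).observe(document.body, { childList: true, subtree: true });

annotateArticle();
```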

Potential outcomes

We know that our world is getting more polarized, dangerous, and confusing by the second. With the advent of social media, information is fast and cheap. It buzzes around us at a dizzying speed and invades our lives in myriad ways. From push notifications to social media alerts, news corporations are vying for our attention and counting on us to use our own networks to spread their message on their behalf. Some of that information should be shared, while other information might add to the significant problems we see around us. It is more important than ever to filter this information so that consumers can make well-informed decisions, decisions that will affect not only us but generations of citizens to come.

Future design features would include weekly status reports on the user’s sharing habits. These reports would include a trustworthiness rating of the news they have been sharing along with news source breakdowns and comparisons between the CheckYourself news sources and the non-verified news sources. This will allow the user time to reflect on the news they have been sharing along with key insights into their unconscious bias and behaviors.
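
The sketch below shows one way such a weekly report could be assembled from a log of the user’s shares. The ShareLogEntry shape and the reuse of the hypothetical checkShare helper are assumptions made for illustration only.

```typescript
// Sketch of the weekly sharing report, built from a hypothetical share log.
import { checkShare } from "./checkShare";

interface ShareLogEntry {
  url: string;
  sharedAt: Date;
  verified: boolean; // true if the share went through a CheckYourself prompt
}

interface WeeklyReport {
  totalShares: number;
  reliableShares: number;   // shares whose source rated "reliable"
  verifiedShares: number;   // shares that passed through the intervention
  sharesBySource: Record<string, number>;
}

export function buildWeeklyReport(log: ShareLogEntry[], weekStart: Date): WeeklyReport {
  const weekEnd = new Date(weekStart.getTime() + 7 * 24 * 60 * 60 * 1000);
  const thisWeek = log.filter(
    (e) => e.sharedAt.getTime() >= weekStart.getTime() && e.sharedAt.getTime() < weekEnd.getTime()
  );

  const report: WeeklyReport = {
    totalShares: thisWeek.length,
    reliableShares: 0,
    verifiedShares: 0,
    sharesBySource: {},
  };

  for (const entry of thisWeek) {
    const domain = new URL(entry.url).hostname.replace(/^www\./, "");
    report.sharesBySource[domain] = (report.sharesBySource[domain] ?? 0) + 1;
    if (checkShare(entry.url).rating === "reliable") report.reliableShares += 1;
    if (entry.verified) report.verifiedShares += 1;
  }

  return report;
}
```

Comparing reliableShares and verifiedShares against totalShares gives the kind of trustworthiness rating and verified-versus-non-verified breakdown described above.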

In conclusion, we feel that this intervention will help individual users escape their own confirmation bias, explore new sources of information, understand some of the tactics news sources are using to mislead users, and think more critically about the information that they consume and share. More importantly, we are hopeful that this verified information will help purify the murky waters of misinformation and in the process create more engaged, enlightened, and informed citizens.

Sources

American views: Trust, media and democracy. (2020). Gallup/Knight Foundation Survey. Retrieved September 25, 2020, from https://knightfoundation.org/wp-content/uploads/2020/08/American-Views-2020-Trust-Media-and-Democracy.pdf

Mitchell, A. (2020, August 27). Americans Who Mainly Get Their News on Social Media Are Less Engaged, Less Knowledgeable. Pew Research Center. Retrieved September 25, 2020, from https://www.journalism.org/2020/07/30/americans-who-mainly-get-their-news-on-social-media-are-less-engaged-less-knowledgeable/

Brenan, M. (2020, September 11). Americans’ Trust in Mass Media Edges Down to 41%. Retrieved September 25, 2020, from https://news.gallup.com/poll/267047/americans-trust-mass-media-edges-down.aspx

Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019, January 25). Fake news on Twitter during the 2016 U.S. presidential election. Retrieved September 25, 2020, from https://science.sciencemag.org/content/363/6425/374.full

Seneca, C. (n.d.). How to Break Out of Your Social Media Echo Chamber. Retrieved September 25, 2020, from https://www.wired.com/story/facebook-twitter-echo-chamber-confirmation-bias/

