Overcoming Irrationality in Social Media

Glenn Hopper
8 min read · Feb 22, 2021


A strong argument could be made that social media and rationality are as different as chalk and cheese. However, an enlightened, cogent approach to interacting with internet communities and applications has the potential to foster better communication, increase productivity, improve self-confidence, and break the positive feedback loops that narrow users’ world views.

Social media, broadly defined as any website or application that enables users to create and share content and interact with others, began as a microcosm of the real world where users communicated with one another in much the same way they would with a neighbor they passed on the street. But as the power of online social networks grew, as users began interacting with people from whom they were further removed in the real world, and as the architects of these networks sought to bring in more users and keep them engaged for longer, the nature of the discourse shifted toward the extreme.

With users spending an average of 2 hours and 24 minutes per day across an average of eight social networks and messaging apps in 2020, social media evolved from a simple communication tool, akin to the telephone or fax machine, into something far more invasive and insidious. The digital world became its own society, governed by an unseen and unaccountable mob, with rules, norms, and standards distinct from those of the outside world.

Despite these new and ambiguous norms, users frequently turn to social media to help make decisions on everything from shopping choices to politics. A 2017 Institute for Public Relations study of 1,783 internet users found social media was influential in making decisions and seeking advice in travel, financial services, retail, and healthcare. While study participants said they were most likely to follow social media guidance from close friends and family, online promotions, forums, and online reviews were also deemed influential. In an environment where sources can’t always be verified or validated, and where the voices and opinions of close friends and relatives are mixed with those of conspiracy theory crackpots, internet trolls, and other bad actors, decision-making is complicated.

The two primary systemic problems with making decisions based on information gleaned from social media are 1) the sheer volume of information available, and 2) the uneven way in which this information is distributed.

TMI

According to a March 2020 Research & Markets report, more than 2.5 quintillion bytes of data are generated every day. While social media certainly doesn’t account for all of that data, it contributes to the amount of information to which we are exposed on an ongoing basis. From social media notifications to emails, pop-up advertisements, calendar reminders, news flash updates, and text messages, the amount of external information we are exposed to is greater than at any point in history. This level of exposure leads to myriad problems, including how we choose to filter the information we receive and how well we are able to process and respond to it.

In his book The Paradox of Choice: Why More Is Less, psychologist Barry Schwartz notes how an abundance of options in everything from blue jeans to breakfast cereal leads to increased anxiety and depression. The more options available, he notes, the harder it is to make a decision, and the less happy deciders are regardless of what they decide. He argues that decision makers should become “satisficers,” who are willing to settle for “good enough” rather than hold out for the perfect option. With the abundance of information available today, the book’s premise seems even more prescient now than when it was published in 2004.

Further, the brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. To cope with the otherwise overwhelming amount of information, people use heuristics to decide what gets their attention. These mental shortcuts are useful but can lead to biases.

One of the most common biases encountered on social media is confirmation bias, our underlying tendency to focus on information that fits our existing beliefs. In social networks, the social biases that guide users’ selection of friends ultimately influence the information they see. This tendency to consider information more favorably if it comes from within a user’s social circle creates “echo chambers,” where friend groups circulate only information that everyone in the group agrees with. Confirmation bias in social media can lead to group polarization, where an entire group shifts to a more radical viewpoint as individual members adopt stances slightly more extreme than the perceived mean in an effort to curry favor with the group.
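
To make that drift concrete, here is a toy simulation, purely illustrative and not drawn from any study: assume each member of an echo chamber updates their stance to sit slightly beyond the perceived group mean, and watch the average harden over a few dozen rounds. All parameters are invented.

```python
import random

def simulate_polarization(n_members=50, rounds=30, overshoot=0.03):
    """Toy model of group polarization: each round, members move toward
    the group mean but overshoot it slightly toward the extreme."""
    # Stances on a 0-to-1 scale; 0 is neutral, 1 is fully radicalized.
    stances = [random.uniform(0.0, 0.4) for _ in range(n_members)]
    for _ in range(rounds):
        mean = sum(stances) / len(stances)
        # Blend each stance with the perceived mean, plus a small overshoot.
        stances = [min(1.0, 0.5 * s + 0.5 * mean + overshoot) for s in stances]
    return sum(stances) / len(stances)

random.seed(1)
print(f"Group mean after drift: {simulate_polarization():.2f}")
```

Even with a tiny overshoot each round, the group mean climbs steadily toward the extreme, which is exactly the dynamic described above.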

Welcome to the Machine

Members of social networks fuel these feedback loops on their own, but the echo chambers get an added boost from the social media applications themselves. Social media platforms and search engines use individualized algorithms designed to deliver the type of content users are most likely to engage with. The algorithms work to keep users engaged but lead to even further information selection by filtering away information that does not match a user’s click habits or search history. This “filter bubble” effect may prevent users from ever encountering diverse perspectives. Social media users are also influenced by “trending” or popular content, which the algorithms promote in individual users’ feeds across the platform. This type of content promotion can lead to the bandwagon effect, which refers to people’s tendency to adopt certain behaviors or beliefs because they see many others doing the same.
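
As a deliberately simplified illustration (not any platform’s actual ranking code), the sketch below scores posts purely by a user’s past clicks on the same topic; anything outside the user’s click history sinks to the bottom of the feed, which is the filter bubble effect in miniature. The `Post` structure, topics, and click counts are all invented.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    author: str

def rank_feed(posts, click_history):
    """Rank posts by predicted engagement, approximated here as the
    number of times the user previously clicked on the same topic."""
    return sorted(posts, key=lambda p: click_history.get(p.topic, 0),
                  reverse=True)

# A user who mostly clicks on one political viewpoint and a little sports.
click_history = {"politics_side_a": 42, "sports": 3}
feed = [
    Post("politics_side_b", "unfamiliar_voice"),  # never clicked: buried
    Post("politics_side_a", "friend"),
    Post("sports", "friend"),
]
for post in rank_feed(feed, click_history):
    print(post.topic, "by", post.author)
```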

The stronger the group polarization and the commitment to the information within a person’s filter bubble, the more likely the social media user is to fall victim to cognitive dissonance: the discomfort of encountering information that conflicts with existing beliefs, which users resolve by rejecting the new information (even in the face of facts) rather than updating those beliefs.

Filter bubbles, confirmation bias, information selection, and information overload all combine to make people vulnerable to misinformation spread on social media platforms. This is a problem when social media is among the primary sources of news in the U.S., having surpassed print newspapers as a news source in 2018. In a medium where users are routinely exposed to conspiracy theories and extreme political views, and where purported “news” stories go unchecked, the platforms are ripe for the spread of misinformation and other low-credibility content.

The Way You Make Me Feel

When observed beyond the warm glow of a cell phone screen, the hazards of social media make it seem about as inviting as a nest of rattlesnakes, which could make one wonder why roughly one third of the world population uses it — and with increasing vim and vigor. In 2019, average time spent on social media was 144 minutes per day, an increase of 62.5% since 2012.

From positive feedback loops to constant feedback from digital connections, the draw of social media seems clear, but the designers of these platforms go further to keep users engaged and coming back as frequently as possible.


Designers use psychological tools and tricks to manipulate users and keep them on their sites as long as possible. By tapping into users’ dopamine and reward systems, the applications train users to associate usage with mental rewards. The act of dragging down a social media screen on a cell phone is akin to pulling a lever on a slot machine. Intermittent reinforcement trains users to expect a reward (new content) as they scroll the site. The reward is unpredictable, which keeps the person coming back for more.

Users are also rewarded with positive feedback (likes) and engagement with other users. These are digital versions of interactions we are hardwired to seek out, and the ubiquity and persistence of these applications make them the easiest and most accessible way to achieve these sensations. But there is also a downside to depending on social media for approval, communication, and recognition. When people replace strong and supportive real-world relationships with more tenuous digital relationships, the interactions can be as fickle as a fairy throng. Ultimately, though, users come back time and again for one simple reason: doing so makes them feel good.
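
The pull-to-refresh mechanic can be modeled as a variable-ratio reward schedule. The toy sketch below is not any app’s actual logic, and the 30% payout probability is arbitrary; it simply shows that the number of “pulls” before a reward arrives is unpredictable, which is what makes the gesture so compelling.

```python
import random

def pull_to_refresh(p_new_content=0.3):
    """One 'pull' of the feed: with some probability there is fresh
    content (the reward); otherwise nothing new appears."""
    return random.random() < p_new_content

random.seed(7)
for session in range(5):
    pulls = 1
    while not pull_to_refresh():
        pulls += 1
    print(f"Session {session + 1}: new content after {pulls} pull(s)")
```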

With such powerful forces drawing users to social media, and with an overwhelming array of information confounding decision-making, it is important for users to apply rational thought and logic to their interactions with it. Awareness of the interplay of these cognitive biases is the first step in eradicating them from social media interactions, interpretations, and beliefs.

Rather than blindly using social media with no meta-awareness of the context in which they are operating, users could take the following steps to navigate the environment:

· Be Aware. Users should develop insight and awareness of their thought processes and understand that as humans they are potentially prone to errors or biases. For example, before flying into a rage at a social media posting by some random internet troll, users should ask themselves, “Is this worth my emotional energy?”

· Assess, Assess Again. Just as in the real world, citizens of the digital world should routinely look at their thinking and decision-making processes. Ask, “How else might I look at this?” Rather than believe or share every bit of information that comes across the transom, users should consider the story’s content, source, and intent.

· Take an Outside View. Taking an outsider’s view is an important way to evaluate the nature of the information users receive from the internet. A good question to ask when evaluating a piece of information from the internet is, “What would someone who disagreed with me think when reading this article?” The question is posed not with the intent that the user change her opinion, but rather that she consider and understand that there is an opposing view. An even greater step toward taking an outside view would be to seek out an article written from the standpoint of someone holding the opposing view.

· Ask, “What if I’m wrong?” This is a continuation of the outside-view approach. When deciding whether to believe, share, or embrace an opinion expressed on the internet, pausing to ask, “What if I’m wrong?” kicks users out of fast, automatic System 1 thinking and into slower, more deliberate System 2 reasoning. This is the essence of rational thinking.

· Embrace Uncertainty. Finally, if a user runs through the rational decision-making process and is still not sure of the veracity of a posting, he should acknowledge, understand and appreciate that uncertainty is part of life and take the opportunity to sharpen his thinking skills and learn from experience.

Social media is not inherently good or bad; rather, it is a reflection of those who use it, built on a collection of rational and irrational inputs. Adopting a rational approach to social media consumption and interaction can help users maintain a healthy balance. But users aren’t the only ones with a role in this.

The social media companies themselves could do a better job if they wanted to (or were nudged by policy makers to do so). While the genie may be out of the bottle with regard to the time people spend on the platforms, the companies could modify their algorithms and content policies to combat positive feedback loops and the spread of misinformation. One route would be to modify their algorithms to expose users to information outside their filter bubbles. Researchers at Aalto University in Finland have developed one such algorithm, which works by selecting a set of influential users who can be convinced to spread information about their viewpoints to the other side, with the goal of maximizing the number of users exposed to both perspectives.
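
The sketch below is not the Aalto researchers’ implementation; it is a simplified greedy version of the general idea, with invented data structures: repeatedly pick the user whose reach covers the most people who have so far seen only one side, on the assumption that the chosen user will carry the other side’s viewpoint to them.

```python
def greedy_bridge_seeds(users, exposure, reach, k=3):
    """Pick up to k 'bridge' users. exposure[u] is the side user u has
    seen ('A', 'B', or 'both'); reach[u] is the set of users u's posts
    reach. Each pick maximizes the number of newly balanced users."""
    seeds = []
    balanced = {u for u in users if exposure[u] == "both"}
    for _ in range(k):
        def gain(candidate):
            return len(set(reach[candidate]) - balanced)
        candidates = [u for u in users if u not in seeds]
        best = max(candidates, key=gain, default=None)
        if best is None or gain(best) == 0:
            break
        seeds.append(best)
        balanced |= set(reach[best])
    return seeds

# Tiny invented network: u2 reaches the most one-sided users, so it goes first.
users = ["u1", "u2", "u3", "u4"]
exposure = {"u1": "A", "u2": "A", "u3": "B", "u4": "both"}
reach = {"u1": {"u2"}, "u2": {"u1", "u3"}, "u3": {"u2"}, "u4": {"u1"}}
print(greedy_bridge_seeds(users, exposure, reach, k=2))  # e.g. ['u2', 'u1']
```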

Both users and social media developers have a part to play if we are to change today’s combative, confusing social media climate. But rational actors needn’t wait for the broader world to change in order to improve their own social media experience.
