Social Media and its Effects on Political Polarization

Richie Beano
May 5, 2019 · 11 min read


This post is an academic piece on a topic that matters greatly to the rising generations. It is by no means definitive; it is a hypothesis that I will keep testing now and in the future.

I’ve long been concerned with what happened in the 2016 election. Not the fact that Trump was elected president, but the manner in which that became our reality. This is a look at the general psychology of our nation: how our disposition toward quick moral judgments, catalyzed by social media use, has deepened the division in the United States.

To begin, I think it’s smart to start with some studies concerning the problems we see today with “Left vs. Right,” BLM vs. ALM, or any other topic that has revealed extreme partisanship online and in the real world.

Looking at this graph, you’ll see the difference in opinion at the top of the political food chain. But if we are the ones voting these members into place, then what is going on with the citizens? See below.

I realize there is only one graph here, but the evidence is growing. People of all ages are taking sides on divisive political issues to extremes that used to be left to the fringes. I still think it is the fringes that are creating this perception of a divided nation; the difference is that the fringes are the voices we hear now. For a very long time, people could only receive news from a handful of top media sources. Now everybody with an internet connection has a voice, and the rippling effects of a single voice with a large following are immense. Where can this be explained best?

Marketing.

Marketing relies on the fringes. Just listen to Seth Godin, arguably the most intelligent mind in present-day marketing, explain this idea:

https://www.youtube.com/watch?v=zzCkCPIw1mU

Try your best to ignore the guy filming.

Marketing is more than just companies trying to sell you something. Marketing encompasses the whole cross-section of ideas put forth by individuals and groups alike: ideas that feed into your mind and leave you to choose whether you will accept them or not. The problem isn’t marketing; the problem is the lack of knowledge about marketing.

Why? Because a lack of knowledge about marketing is a lack of knowledge about people trying to sell you not products, but ideas. It’s an old concept: salespeople are the best at it, and you don’t realize it’s happening until you’ve accepted or purchased some meaningless product. On social media, the fringes are our salespeople. The fringes sell radical ideas and bias in small, tolerable nuggets until those nuggets make up your natural mode of thinking.

The rules are changing a bit now, but for a time the effectiveness of targeted distribution was scary. Until recently, online advertisements could be “dark”: targeted at certain people but invisible on the advertiser’s main page for all to see. Now companies must take a more open stance and promote posts that actually exist on their page.

These ideas tie into my main point: social media’s highly addictive nature, compounded by its potential for mass distribution, has affected the minds of an entire population. I will not focus on exactly how the Russians have a hand in this; the New York Times Opinion piece below explains it far better than I ever could. I will focus on the priming of United States citizens’ (and international) minds that led to this point.

https://www.youtube.com/watch?v=tR_6dibpDfo

As seen in the video, we’ve believed just about everything Russia threw our way. They organized public protests that led to the deaths of our citizens, and so much more. The divergence between conservative and liberal stances in the graphs shown earlier is likely at least partly a result of Russia’s meddling on our social media platforms. But this problem does not start with Russia; it starts with the platforms themselves.

So where do we begin the discussion of potential causes? Jonathan Haidt is a leading moral psychologist and Professor of Ethical Leadership at New York University’s Stern School of Business. I’ll be using his book The Righteous Mind to address this issue.

How We Make Moral Decisions

One of the biggest takeaways from the book is that we lead our decisions and judgments with intuition and rationalize our behavior afterward.

Example: “Julie and Mark are brother and sister. They are traveling together in France on summer vacation from college. One night they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At very least it would be a new experience for each of them. Julie was already taking birth control pills, but Mark uses a condom too, just to be safe. They both enjoy making love, but they decide not to do it again. They keep that night as a special secret, which makes them feel even closer to each other. What do you think about that, was it OK for them to have sex?” (Haidt 45)

Did you feel your gut do a little flip? Did something feel wrong as you read the story? Is what they did inherently wrong? This is your intuition: your emotional response to a “wrong” story. But what exactly is so wrong about the situation? If you were raised in a traditional religious household, you may cite your beliefs as the reason for your disgust. If not, maybe you’ll think of the dangers of a possible pregnancy (children of incest generally have a higher risk of birth defects). But with two forms of birth control there is practically zero chance they have a child, so what is the problem? Even if you cite religion, is anyone being physically hurt? They enjoyed it and decided not to do it again. If it’s kept forever between themselves, nobody else can be hurt by this knowledge.

Haidt uses this story because it is built to elicit a gut reaction of disgust or wrongness. Sleeping with a sibling? In most societies, it seems like an absolute wrong. The order of our reactions is the guiding principle of this entire argument: we feel emotion first and rationalize it after.

Moral Decision Making on Social Media

Have you ever scrolled through Twitter and seen a tweet from someone you disliked? Those micro-level gut reactions are part of the same process. You probably don’t know anything about this person. For instance, if you consider yourself liberal and a Donald Trump tweet happens to enter your timeline, I’m guessing your first reaction is some minor form of disgust at the sight of his name. Or, for a staunch Republican: do you feel even the slightest negativity upon hearing the name Alexandria Ocasio-Cortez? Gut reaction. If someone asked you at that moment why you felt disgusted, you might just recite something you heard from a prominent name in politics or social media.

This is a normal phenomenon now, and has been utilized by the Russians. Take those gut reactions, and amplify them as much as possible.

These photos are made to pull at your heartstrings. Every proponent of Black Lives Matter would have been a potential Facebook ad target for the photo on the left. The one on the right, well, it’s from a fake account called “Army of Jesus,” so I think you can assume it was aimed at Christians.

https://www.youtube.com/watch?v=CI6rX96oYnY

Social Media Usage

Haidt describes some scary statistics on social media usage. The effect on young girls is especially alarming: self-harm and mental health issues are on the rise for this demographic and continue to get worse.

My argument takes the form of contesting the developers of these platforms. The introduction of “Time Spent” analysis within apps like Facebook, YouTube, and Instagram is nice but likely not solving the problem. While we wait for statistics on those added features, let me propose an idea: we need reactionary fixes, not just preventative ones.

Account “timers” exist as a “preventative measure” (whether the developers call them that or not). The data showing the negative effects of social media platforms is within our reach. A mental shift has begun, yet the best developers can do is give users a usage timer. There are options to limit time on these platforms as well, but kids are already so addicted that only parents with control of their children’s phones can impose such restrictions.

I think the fixes need to be far better. Implementing or removing features in ways that weaken the dopamine reward mechanism would be a great start. Addiction to platform content should be weaned off, just as a cigarette smoker may use nicotine patches to try to break their addiction.

This technology is far ahead of what we as a species have evolved to deal with. Instant, direct communication with massive populations and highly accurate targeting have laid the groundwork for manipulation in many forms.

Where This Comes Together

Social media and the dopamine responses it elicits have created a need for attention and approval even as attention spans shrink. In a split second, anybody can block communication with anyone else online. “Smart” targeting of content has produced echo chambers in the minds of every person with a smartphone and a social media account.

Why has this been allowed? Because it works. We see what we would like to see and nearly none of what we wouldn’t. Take a look at this news application that Apple has featured in the News category of the App Store:

It’s a news app that only shows you news you agree with, and you can “communicate” with the community. There is no debate, no structured argument encouraged. Pure, unadulterated agreement: an internet form of groupthink.

But this exists on Twitter, Facebook, Instagram, everywhere. The major platforms already make this choice for you.

Having instant access to agreeable, dopamine-releasing updates all day long pulls the human mind into the echo chamber of “our” ideals. A quote from The Righteous Mind describing how we process moral issues helps here. (Note: the “elephant” Haidt describes in the book is our intuition; the “rider” is the rational part of the mind that works to justify the elephant, even when it disagrees.)

“The main way that we change our minds on moral issues is by interacting with other people. We are terrible at seeking evidence that challenges our own beliefs, but other people do us this favor, just as we are quite good at finding errors in other people’s beliefs. When discussions are hostile, the odds of change are slight. The elephant leans away from the opponent, and the rider works frantically to rebut the opponent’s charges.” (79)

Yet the book quotes a study in which subjects were given time before deciding on a moral issue; they were forced to stop and think for a bit. See below:

“But Paxton and Greene added a twist to the experiment: some subjects were not allowed to respond right away. The computer forced them to wait for two minutes before they could declare their judgment about Julie and Mark. For these subjects the elephant leaned, but quick affective flashes don’t last for two minutes. While the subject was sitting there staring at the screen, the lean diminished and the rider had the time and freedom to think about the supporting argument. People who were forced to reflect on the weak argument still ended up condemning Julie and Mark, slightly more than people who got to answer immediately. But people who were forced to reflect on the good argument for two minutes actually did become substantially more tolerant toward Julie and Mark’s decision to have sex. The delay allowed the rider to think for himself and to decide upon a judgment that for many subjects was contrary to the elephant’s initial inclination.” (81)

As the study shows, taking time to think made subjects more tolerant on a morally “disgusting” question. Time is the main variable being abused by social media. Everything is so quick that it demands the same speed of decision-making from the user. Why? Because another juicy nugget might be in the comments, or just below. We then jump to the conclusion that the random commenter whose point matches our intuitive decision-making is truthful and correct.

If I can boil this down into an equation, it would look something like this:

(Social Media during impressionable development) + (Addictive UI/UX Design made to hook people with dopamine responses) + (Mass Targeted Distribution) + (Fringe radically biased thinking) + (Quick, Judge-and-Keep-Scrolling usage) = A very large population hooked on being divided against each other.

We need people to be less “groupish” and more “selfish”. See here:

Many political scientists used to assume that people vote selfishly, choosing the candidate or policy that will benefit them the most. But decades of research on public opinion have led to the conclusion that self-interest is a weak predictor of policy preferences. Parents of children in public school are not more supportive of government aid to schools than other citizens; young men subject to the draft are not more opposed to military escalation than men too old to be drafted; and people who lack health insurance are not more likely to support government-issued health insurance than people covered by insurance. Rather, people care about their groups, whether those be racial, regional, religious, or political. The political scientist Don Kinder summarizes the findings like this: “In matters of public opinion, citizens seem to be asking themselves not ‘What’s in it for me?’ but rather ‘What’s in it for my group?’” Political opinions function as “badges of social membership.” They’re like the array of bumper stickers people put on their cars showing the political causes, universities, and sports teams they support. Our politics is groupish, not selfish… Several studies have documented the “attitude polarization” effect that happens when you give a single body of information to people with differing partisan leanings. Liberals and conservatives actually move further apart when they read about research on whether the death penalty deters crime, or when they rate the quality of arguments made by candidates in a presidential debate, or when they evaluate arguments about affirmative action or gun control. (103)

Yet social media creates and supports groups. Groups now help businesses, independent thinkers, creators, anyone. And still the divide worsens. If there is now a place for everybody, then there is also a place where everybody else is wrong. Groups can help in some arenas, but in politics a selfish attitude must be taught. You can be selfish and still want to help other people; selfishness is not just a personality trait reserved for the “all-powerful” leaders of the world.

So I guess what I am saying is: be a bit selfish. Decide what works for you, and you alone, based on your life and experiences. Is that helping less privileged groups? Great, then you at least know it is a genuine part of your personality. We need to stop believing that politicians and independent thinkers are pioneers of thought deciding what is best for us. They say things for votes, that is all. You need to decide where your priorities lie in building a happy, content life, not accept what someone else decides you must prioritize.

Hi, I’m Danny/Richie (either works). I’m a soon-to-be college grad with a heavy focus on creativity and psychology/mental health. I would love to hear from you. Contact me at danny@collectivema.com!

