Censorship: Killer or Cure?

The Book Thief. Fahrenheit 451. 1984. The Handmaid’s Tale.

Nothing kills creativity and individuality quite as efficiently as criminalizing them. Writers have been warning readers about this danger at least since Francis Bacon's The Advancement of Learning was published in 1605, and the fear has persisted ever since. Though often presented through fiction, it has rapidly become reality in the information age. The continued banning of influential novels like To Kill a Mockingbird and The Catcher in the Rye raises the question of what long-term impact this never-ending suppression will have on future generations. The answer: a negative one.

By cross-referencing a list of “20 Indispensable High School Reads” with a list of the top 100 most challenged novels, I was less than surprised to find that nine books appear on both. They range from coming-of-age stories like The Perks of Being a Wallflower by Stephen Chbosky to explorations of racial identity such as The Bluest Eye by Toni Morrison. With such a wide range of works targeted, I began to wonder what the censoring groups were actually aiming to achieve.

These acts of censorship undermine the old warning that those who do not learn from history are condemned to repeat it. If future generations are prevented from encountering these themes in digestible forms, how will they learn to sympathize with characters like Atticus Finch or Holden Caulfield? Where will they learn the lessons of Lord of the Flies, or how to sit with the heart-wrenching pain at the end of Of Mice and Men? These experiences link generations together, and barring future adolescents from them breaks that connection.

This topic recently resurfaced in my hometown. Though the town holds plenty of other traditional ideologies, book censorship was never one it bought into. That was, at least, until Christmas of 2022, when a teacher at the junior high offered Looking for Alaska by John Green as an optional gift for her students. The situation spiraled out of control, ending up in news articles from out-of-state publications and earning a TikTok shoutout from the author himself.

The case eventually landed on the desk of my county’s prosecutor, where it ended with no charges filed. Watching the snowball effect that followed this one act of censorship made me think about how a traditional censorship fight can turn into a question of information-technology ethics. Here, the escalation from small-town dispute to global social media frenzy forced it into exactly that territory.

With the increased use of social media to air one’s grievances, it has become progressively easier for traditional forms of censorship to take on modern shapes. In recent years, the battleground has largely shifted from physical publications to the digital space, and platforms like Twitter have made this age-old debate feel newly contemporary.

With this transition from one medium to another came a new question of responsibility. Twitter’s status after Elon Musk’s acquisition fuels the conversation: as a privately held company, Twitter is no longer answerable to public shareholders and their sense of what is appropriate, leaving the power squarely in Musk’s hands.

When the question of responsibility enters the equation, the line between right and wrong blurs. How much responsibility does a private company bear for the platform it provides? How much responsibility does a poster have for the people who view their content? How much responsibility does an ordinary user have for the content on any platform they choose to browse? These questions are interconnected by nature, and together they deepen the confusion between right and wrong.

On a social media platform, the censorship argument gains further depth. While curbing the spread of hate speech and outdated ideologies is crucial for society’s necessary growth toward tolerance, this grey area provokes uproar from every belief system. The question shifts from one of impact to one of extremity: is it worse for everything or for nothing to be censored? Each comes with its own downsides, of course, but is it worse to eliminate all freedom of speech, including things like time-sensitive news alerts, or to allow all speech, including neo-Nazi ideologies? This is where the aforementioned issue of responsibility comes in, and it answers the question for you.

When it comes to modern censorship, the bulk of the responsibility must fall on the general user (or their guardian, considering that 97% of 13- to 17-year-olds use social media). This comes down to the unfortunate reality of the digital footprint. Online, even deleted posts live on in perpetuity. Whether through screenshots, word of mouth, or services like the Wayback Machine, the most hateful posts remain immortal in one way or another. Censoring the original post may do some good, but it rarely stops the spread entirely. When that happens, it falls on the user to control their own intake.

On top of this, online censorship often discourages users from continuing to create content at all. Content creators have been forced to invent a new online vocabulary, known as “algospeak,” just to afford themselves the simple ability to say words like “sex” or “suicide.”

“Shadow-banning” refers to removing, hiding, or muting someone’s posts without alerting them. Through this practice, some of the more blatant and extreme cases of online censorship have occurred, including artists having their work blocked and posts from plus-sized models being specifically targeted. With censoring algorithms, social media platforms can invisibly suppress a creator’s work so that the content on their platform aligns more closely with their ideals.

While this discourages content creators in general, the disproportionate censorship of minority groups on social media multiplies the discouragement for those communities. A study conducted by students at the University of Michigan found that conservatives, Black people, and transgender people are censored more often than other groups.

Conservative participants in the study most often had content removed for misinformation, hate speech, adult topics, or a connection to COVID-19. Transgender participants found that following site guidelines was not enough to keep their content from being flagged as adult, and their other removed content included anything critical of a dominant group as well as posts specifically about queer and transgender issues. The Black participants mainly reported that their most-removed content related to racism and racial justice.

Even though the transgender and Black participants followed the rules and policies of the various platforms, their posts were censored at a similar rate to those of conservative participants whose content, more often than not, broke those rules. If creators from marginalized communities will be censored regardless of how closely they obey the regulations, why would they keep creating content at all?

Self-censorship, also referred to as the “spiral of silence,” pushes people to alter what they share online in order to fit in with prevailing ideals. As a byproduct of general censorship, this spiral can disproportionately impact those already hit hardest by it, such as the groups mentioned above.

Viewed alongside lingering prejudices and other cases of unnecessary censorship, the study’s findings speak further to the ethics (or lack thereof) of this issue, and it all comes back to the question of whether censorship is a killer or a cure.

To finalize my thoughts on the topic, I returned to the fundamental meaning of censorship: the suppression of words, images, or ideas that are “offensive.” I found myself back at the question of responsibility. After all, who gets to decide what is or isn’t offensive? With algorithms tossing awareness posts and hate speech into the same unruly box, why should we rely on censorship at all?

To be completely honest, we shouldn’t. Given the lack of a concrete definition of what is or isn’t offensive (is “crazy” offensive? what counts as racist?), it’s impossible for any one human being to decide what gets censored. Relying on algorithms isn’t a solution either, as we’ve seen with the disproportionate blocking happening on almost every mainstream social media platform.

Though censorship could prove beneficial when used correctly, it has become a go-to response that relies far too much on the existence of a perfect, unbiased mind. Since no such mind exists, and likely never will, the downsides will continue to outweigh and discredit whatever positives censorship could offer. As long as it keeps inhibiting creativity, suppressing individual voices, and further marginalizing already mistreated groups, the conclusion is clear: censorship is better off in your trash bin.
