TikTok’s Addictive and Unethical Algorithm

TikTok is a social media application that uses algorithms to customize content to users’ preferences. (Source: Getty Images) “Some TikTok users are criticizing the platform for inequitable treatment of creators of color.” © Getty Images

The only thing that gets me through the agony of my biweekly stair stepper routine is the entertainment that TikTok provides. What used to feel like an impossible 30 minutes quickly became enjoyable once I began setting my phone on the ledge of the stair stepper and watching TikTok while I climbed. In the time I have spent on the app, it has curated a list of videos that precisely targets my niche interests. When I was planning a post-graduation trip to Europe with my friends, my TikTok feed quickly filled with videos about which restaurants to dine at in Paris, the last stop on the trip. When I started researching clean eating and gut health, my feed filled with recommendations from dietitians promoting a healthy gut. I began to see how the algorithm curated a feed that catered directly to my needs and interests, which held my attention on the stair stepper and led me to use TikTok more than any other social media platform on my phone. The more I talked to other people, the more I realized they had similar experiences: people were hooked on feeds that were highly personalized to reflect their own values and interests.

Despite my positive experience on the app, TikTok’s algorithm reinforces oppressive beliefs through the echo chambers it creates. Moreover, the algorithm is shaped by the oppressive beliefs of its creators, who lack diverse perspectives, which points to the importance of increasing diverse representation in STEM-related careers.

TikTok’s unique algorithm creates harmful echo chambers

As of December 29, 2021, TikTok made headlines when it surpassed Google and Facebook as the world’s most popular web domain (“TikTok Surpasses Google, Facebook As World’s Most Popular Web Domain”). What sets TikTok apart from other platforms is its algorithm, which “applies a personalization algorithm… [such that] the recommendations [that the algorithm generates] are not only similar in terms of type of content, but in physical attributes such as race, age, or facial features” of the user (“#ForYou: The User’s Perspective on How TikTok Recommends Videos”). The algorithm therefore feeds users personalized content that it predicts will be well received, and hides content that might conflict with the user’s beliefs or values. This is problematic because “recommendation systems pose the risk of causing echo chambers and filter bubbles,” since individuals only see content that aligns with their personal preferences (“#ForYou: The User’s Perspective on How TikTok Recommends Videos”). Echo chambers, which reinforce people’s existing beliefs, are harmful because shielding “users from access to various perspectives [which] might induce biases and limit public debate” (“#ForYou: The User’s Perspective on How TikTok Recommends Videos”). Echo chambers can be particularly harmful when users already hold sexist, homophobic, transphobic, or racist beliefs.
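To make this feedback loop concrete, here is a minimal, hypothetical sketch of an engagement-driven recommender. To be clear, this is not TikTok’s actual system; the topics, weights, and update rule are my own illustrative assumptions. It shows how a recommender that simply reinforces whatever a user engages with will collapse a once-varied feed onto a single topic:

```python
# A minimal, hypothetical sketch of an engagement-driven recommender.
# NOT TikTok's actual algorithm: the topics, weights, and update rule
# are illustrative assumptions chosen to show the feedback loop.
import random

TOPICS = ["travel", "nutrition", "politics", "comedy", "sports"]

def recommend(affinity, n=5):
    """Sample n videos, weighting each topic by the user's affinity."""
    topics = list(affinity)
    weights = [affinity[t] for t in topics]
    return random.choices(topics, weights=weights, k=n)

def update(affinity, topic, engaged):
    """Reinforce topics the user engages with, then renormalize."""
    if engaged:
        affinity[topic] *= 1.5  # positive feedback: show more of the same
    total = sum(affinity.values())
    for t in affinity:
        affinity[t] /= total

# Start from a uniform feed, then simulate a user who only engages
# with "nutrition" content, like the gut-health example above.
affinity = {t: 1.0 / len(TOPICS) for t in TOPICS}
for _ in range(200):
    for topic in recommend(affinity):
        update(affinity, topic, engaged=(topic == "nutrition"))

for t in sorted(affinity, key=affinity.get, reverse=True):
    print(f"{t:>10}: {affinity[t]:.1%} of recommendation weight")
```

Run the loop and “nutrition” ends up dominating the simulated feed while every other topic fades toward invisibility. The filter bubble emerges from the update rule itself; no one ever decides to hide anything.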

A personal anecdote illustrates how TikTok’s algorithm creates echo chambers that reinforce people’s existing beliefs. My cousin recently came out as transgender and wants to transition from identifying as male to female. Her parents, because of their own religious beliefs, did not support or believe in the idea of somebody changing their gender identity. After my cousin revealed her gender identity to her parents and they voiced their discontent, they were scrolling through TikTok and came across transphobic content asserting that gender is innate and cannot be changed. These videos reinforced their existing views and became, for them, evidence for not supporting my cousin. While an echo chamber is not innately unethical, it limits the diversity of opinions and knowledge that users encounter. This, in turn, can harm people in society, such as my cousin.

TikTok’s algorithm is inherently oppressive

TikTok has also received backlash for reinforcing racist views among users through the way its algorithm monitors and enforces its community guidelines. One TikTok user, Tiahra Nelson, described her negative experience with the platform: “I ran into complications when two of my videos were removed because they ‘violated the community guidelines.’ One of the videos addressed the way I personally felt about non-African Americans using the n-word. The sound I used was a man saying ‘what’s wrong with you’ continuously. The other video that was taken down was a video of me reading Twitter memes. These were comedic memes that the Black community could relate to because a great amount of [Black people] have experienced the circumstances listed. It’s always fun to make these videos because I’m able to relate to them, and others are able to relate to them as well. When I came across [a] girl’s video that degraded the African American community, I couldn’t comprehend why it continued to appear on the For You page because based on TikTok’s community guidelines, the video should’ve been taken down” (“Does TikTok Have A Race Problem?”). TikTok’s outright removal of Tiahra’s content, alongside its apparent negligence toward overtly racist content, looks like a huge error in the application’s algorithm. Or are the racist views it supports a result of the innately racist architecture of the algorithm itself?
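Tiahra’s experience is easier to understand if we look at how context-blind moderation rules behave. The following is a minimal, hypothetical sketch, not TikTok’s actual moderation system: the blocklist, the stand-in term, and the example posts are all my own assumptions. It shows how a rule that is “applied equally” can still flag a Black creator’s commentary while letting coded racism pass:

```python
# A minimal, hypothetical sketch of context-blind keyword moderation.
# NOT TikTok's moderation system: the blocklist and example posts are
# illustrative assumptions showing how a "neutral" rule can penalize
# in-group speech while missing coded hate speech.

# A context-blind blocklist flags any post containing a listed term,
# regardless of who is speaking or why.
BLOCKLIST = {"reclaimed_term"}  # stand-in for a reclaimed slur

def violates_guidelines(post: str) -> bool:
    """Flag a post if any of its words appear on the blocklist."""
    words = post.lower().split()
    return any(term in words for term in BLOCKLIST)

posts = [
    # A Black creator discussing who may use a reclaimed term: flagged.
    "my feelings on non-Black people using the reclaimed_term",
    # Coded, dog-whistle racism with no blocklisted word: passes.
    "certain communities just have a culture problem",
]

for post in posts:
    print(violates_guidelines(post), "->", post)
```

The asymmetry here is not a malfunction; the rule does exactly what it was written to do. Bias enters through what the rule’s authors chose to look for, and what they never thought to look for, which is precisely Noble’s point below.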

Photo of TikTok user Tiahra Nelson (Source: Forbes). “TikTok user Tiahra Nelson had two of her videos removed because they violated the community guidelines.” © Tiahra Nelson

This example of how TikTok’s algorithm treats the content of women of color is explored in “Algorithms of Oppression” by Safiya Noble. Noble describes multiple examples of how negative biases against women of color are embedded in search engines and algorithms. One notable example from the text is how previous versions of Google’s search algorithm returned “big booty” and other sexually explicit terms as top results for the search “black girls.” This example, together with TikTok’s treatment of Tiahra’s content, shows how racism and sexism are part of the architecture and language of technology. While people are taught to believe that technology and algorithms are “benign, neutral, or objective… the people who make these decisions [about algorithm creation] hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors” (“Algorithms of Oppression”). Seen this way, the fact that TikTok took down Tiahra Nelson’s content for violating “community guidelines” while outwardly racist, sexist, and homophobic videos remained on the app is not a “glitch” in the system. Rather, TikTok’s algorithm is inherently racist and sexist because the creators of the algorithm uphold those views. This matters because algorithms like TikTok’s interact directly with users and thereby reinforce “oppressive social and economic relations” (“Algorithms of Oppression”).

Problem: These algorithms are biased due to a lack of diversity

The fact that these algorithms create echo chambers that reinforce oppressive views, and are themselves racist and sexist, reflects the people who create them. Specifically, the fields of data science and artificial intelligence are dominated by elite white men. Only 26% of people in “computer and mathematical occupations” are women, and only 12% of those women are Black or Latinx (Data Feminism). Further, “Blacks make up 11% of the U.S. workforce overall but represent 9% of STEM workers, while Hispanics comprise 16% of the U.S. workforce but only 7% of all STEM workers” (Diversity in the STEM workforce varies widely across jobs). While women, Black, and Hispanic workers are underrepresented in STEM occupations, White (69%) and Asian (13%) workers are overrepresented (Diversity in the STEM workforce varies widely across jobs).

Lack of female representation in computer-science-related careers is a worsening issue. (Source: Pew Research Center analysis of 1990 and 2000 decennial censuses and 2014–2016 American Community Survey.) “Women’s representation in computer jobs has declined since 1990.” © 2022 Pew Research Center

Given this, it is relevant to discuss the privilege that certain groups hold in relation to others, and how that privilege shapes data science and artificial intelligence. The privilege hazard is “the phenomenon that makes those who occupy the most privileged positions among us so poorly equipped to recognize instances of oppression in the world” (Data Feminism). Because positions in data science and artificial intelligence are dominated by those who hold the most power in society, these groups largely lack lived experience of oppression. This limits their ability to foresee harm and to develop solutions that might prevent it (Data Feminism). It is therefore possible that TikTok’s algorithm was never intended to harm users; rather, the lack of diversity involved in its creation had two significant effects. First, it left the algorithm’s creators unable to foresee how the echo chambers TikTok builds around users’ beliefs could negatively affect society. Second, the lack of diversity within technology firms like TikTok, together with the sexist and racist beliefs documented to circulate in these environments, allowed those beliefs to be built into TikTok’s algorithm.

Solution: Increase diversity in STEM and technology-related positions

Given what we know about the current state of diverse representation in STEM-related positions, and about the unjust algorithms that result, a critical next step is ensuring that the people who create these algorithms represent a diverse array of identities. Increasing the diversity of technology firms’ workforces is a necessary first move: minorities are currently underrepresented in STEM and design-related positions within these firms, and employee diversity carries competitive business advantages. Companies that employ a diverse workforce “have repeatedly been shown to make better decisions, come up with more competitive products, and better understand potential customers” (Design Justice), which is linked to increased sales revenue, more customers, and greater relative profits (Design Justice). Additionally, employing people from diverse backgrounds helps bring to light the oppression that algorithms can perpetuate, because those employees bring their collective lived experiences of oppression to the table. This combats the privilege hazard that currently plagues the technology space and supports the creation of technologies with less racist, sexist, and otherwise oppressive effects. A diverse workforce could therefore not only make technology firms more economically successful and efficient, but also lead to technological advancements that have a more positive effect on society as a whole.

In sum, while users like myself enjoy the entertainment that TikTok offers, arguably making it the world’s most popular platform, there are important consequences associated with TikTok’s algorithm. First, the algorithm creates echo chambers for users and reinforces harmful beliefs. Second, the inherently racist nature of the algorithm harms users. Because there is a lack of diversity in STEM-related careers, the creators of TikTok’s algorithm failed to foresee these consequences and failed to stop negative beliefs from becoming part of the algorithm’s architecture. This ultimately points to the importance of increased diversity in STEM-related careers.

Sources

Asare, Janice Gassam. (2020). “Does TikTok Have A Race Problem?” Forbes. https://www.forbes.com/sites/janicegassam/2020/04/14/does-tiktok-have-a-race-problem/?sh=59f129cd3260

Costanza-Chock, Sasha. (2020). “Design Justice: Community-Led Practices to Build the Worlds We Need.” MIT Press. https://search.lib.umich.edu/catalog/record/99187396258606381

D’Ignazio, Catherine, and Klein, Lauren F. (2020). “Data Feminism.” MIT Press, ProQuest Ebook Central. http://ebookcentral.proquest.com/lib/umichigan/detail.action?docID=6120950

Parker, Kim, and Funk, Cary. (2018). “Diversity in the STEM workforce varies widely across jobs.” Pew Research Center. https://www.pewresearch.org/social-trends/2018/01/09/diversity-in-the-stem-workforce-varies-widely-across-jobs/

Moreno, Johan. (2021). “TikTok Surpasses Google, Facebook As World’s Most Popular Web Domain.” Forbes. https://www.forbes.com/sites/johanmoreno/2021/12/29/tiktok-surpasses-google-facebook-as-worlds-most-popular-web-destination/?sh=3471095e43ef

Noble, Safiya Umoja. (2018). “Algorithms of Oppression: How Search Engines Reinforce Racism.” NYU Press. https://www.science.org/doi/10.1126/science.abm5861

Scalvini, Marco. (2020). “#ForYou: The User’s Perspective on How TikTok Recommends Videos.” Advance, SAGE Preprints. https://advance.sagepub.com/articles/preprint/Negotiating_morality_and_ethics_the_post-Millennial_perspective_on_TikTok/12800663/3

TikTok. (2022). “Introduction: Community Guidelines.” TikTok. https://www.tiktok.com/community-guidelines?lang=en
