What TikTok Knows & What's Wrong With That

Marcelo Lopez
SI 410: Ethics and Information Technology
5 min read · Feb 22, 2022

I often find myself spending a lot of my time on TikTok, so much time that it is regularly the top app in my Screen Time section. Thinking about it now, I believe that if I spent this time on another app such as Duolingo (a language-learning app) practicing my native tongue, Spanish, I wouldn't be slowly losing it. I use TikTok so much that I often propose watching the short clips as a social activity with other people.

When doing so, I can't help but notice how vastly different some of our "For You" pages, or feeds, can be, while with other people they look practically identical. Although we may get our videos in a different order, we eventually end up watching the same content. With some friends, I see content that I most likely would never have seen on my own.

For example, on my significant other's phone, the first video that pops up is a cat video, and the next one is also a cat. After about four videos, she finally gets a video that isn't a cat. As an avid cat lover, this makes a lot of sense for her. Meanwhile, on my roommate's phone, he gets nerdy computer science jokes and tech-related videos.

TikTok is personalized to reflect each user's interests and likes, and that's it.

One day I noticed that a lot of the content creators on my feed looked a lot like me, and that without my realizing it I was being recommended videos in Spanish. I have never given TikTok any information about my ethnicity or language, so I was shocked. After doing some research online, I found this was a common experience for many others.

Marc Faddoul, an AI researcher at UC Berkeley's School of Information, found that after following certain users, the platform's algorithm would suggest other users who looked similar in ethnicity and gender. It did not stop there, though: Faddoul claimed that suggestions were often influenced by age, body shape, hairstyle, and even whether the person has a visible disability. He stated:

“Users tend to follow accounts from similar ages, gender, or cultures. In aggregate, these patterns are strong, and they translate into physical characteristics. Moreover, TikTok is inherently a very appearance-driven platform, and users have quite consistent tastes when it comes to the faces and bodies they enjoy looking at. Therefore, if an algorithm is trained purely on user-engagement data, it will generate what appears to be appearance-based filter bubbles. In their defense, there was almost certainly no explicit intention from TikTok for their system to behave this way.” But it’s not an excuse, he says. “Such implicit biases can and need to be anticipated and accounted for, or at least acknowledged, especially for such a widely used algorithm, that is so impactful in shaping our cultures.” (LINK)

Faddoul did not conduct a formal study, and TikTok says it was not able to replicate his results. Furthermore, TikTok denies using profile pictures as part of its algorithm; instead, the app uses collaborative filtering, where recommendations are made based on what other, similar users have done.
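To make "collaborative filtering" concrete, here is a minimal sketch of the user-based variant in Python. This is my own illustration under simplified assumptions (the users, videos, and binary like/no-like interactions are all made up), not TikTok's actual code, but it shows the key point: the system never needs to be told anything explicit about who you are, only which users' engagement histories look like yours.

```python
# A minimal sketch of user-based collaborative filtering.
# Illustration only: the users, videos, and interactions are invented,
# and this is not TikTok's real system.
from math import sqrt

# 1 = positive interaction (like, rewatch), 0 = no interaction.
interactions = {
    "alice": {"cat_1": 1, "cat_2": 1, "cooking_1": 0, "tech_1": 0},
    "bob":   {"cat_1": 1, "cat_2": 0, "cooking_1": 1, "tech_1": 0},
    "carol": {"cat_1": 0, "cat_2": 0, "cooking_1": 1, "tech_1": 1},
}

def cosine_similarity(u, v):
    """Cosine similarity between two users' interaction vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(target, interactions):
    """Score each unseen video by how much similar users liked it."""
    scores = {}
    for other, vec in interactions.items():
        if other == target:
            continue
        sim = cosine_similarity(interactions[target], vec)
        for video, liked in vec.items():
            if liked and not interactions[target][video]:
                scores[video] = scores.get(video, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", interactions))  # ['cooking_1', 'tech_1']
```

Notice that nothing in this sketch asks for ethnicity, language, or appearance. If Spanish speakers, or people who look like me, happen to engage with similar videos, those traits emerge from the interaction patterns alone, which is exactly how the recommendations Faddoul describes can form without anyone explicitly programming them.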

Although most people may not care about how Tiktok knows so much information about us, and more likely than not cares even less about how this information is used as long as they get good videos recommended, it is and can be damaging. Tiktok is a very entertaining app, there is no doubt about that, it has carefully created and catered its algorithm to have us users hooked to it. Regardless of that, it unintentionally can support negative beliefs through the echo chamber it creates, and its algorithms are influenced by the unconscious biased beliefs of the algorithm’s creators. In addition to this, it places its users in filter bubbles with other users who look like them, thus potentially silencing minority voices and thoughts.

The simple way a majority of social media algorithms work is that when a user positively interacts with content, similar content is more likely to show up; when a user negatively interacts with content, similar content is later hidden. In the paper "#ForYou: The User's Perspective on How TikTok Recommends Videos," Marco Scalvini states:

“Traditional social media recommend content from people we follow or we agree with. For this reason, algorithmic recommendation systems keep feeding similar content, hiding content that may vary from the user’s taste or beliefs. Recommendation systems suggest new content after a thorough consideration of user preferences, which are recorded via their social media interactions, such as which accounts they follow and which posts they comment on. Such preferences are analyzed by recommendation systems to not only find out the type of similar content said user is attracted to but also to hide the ones we would not enjoy.”
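As a rough illustration of the feedback loop Scalvini describes, here is a hypothetical Python sketch in which positive interactions raise a topic's running score and negative ones suppress it. The actions, weights, and topics are all invented for illustration; a real recommender is far more complex, but the drift toward whatever the user already engages with is the same.

```python
# A hypothetical sketch of an engagement feedback loop.
# The actions, weights, and topics are invented; real systems differ.
from collections import defaultdict

# Assumed weights: positive actions boost a topic, negative ones bury it.
WEIGHTS = {"like": 1.0, "rewatch": 1.5, "share": 2.0,
           "skip": -0.5, "not_interested": -3.0}

scores = defaultdict(float)  # topic -> running preference score

def record(topic, action):
    """Update the preference score for a topic after an interaction."""
    scores[topic] += WEIGHTS[action]

def next_feed(candidates, n=3):
    """Rank candidate topics by learned preference; ties keep input order."""
    return sorted(candidates, key=lambda t: scores[t], reverse=True)[:n]

record("conspiracy", "rewatch")   # user lingers on a conspiracy video
record("conspiracy", "like")      # ...and likes the next one
record("cats", "skip")            # ...but scrolls past a cat video

print(next_feed(["cats", "conspiracy", "cooking", "tech"]))
# ['conspiracy', 'cooking', 'tech']
```

In this toy model, two positive interactions are enough for "conspiracy" to outrank every topic the user has not engaged with, and only the heavily weighted "not_interested" action is strong enough to undo them, which matches my own experience climbing back out of the rabbit hole described below.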

TikTok follows this traditional way social networks algorithmically recommend content, so it wouldn't be wrong to say that TikTok creates and maintains echo chambers, much like other platforms such as Facebook that have been shown to create them.

I can vouch for this firsthand from my experience on the platform. At some point I went through a conspiracy-theory phase where I watched these videos and egged on other users, though I never truly believed the theories. It got to the point where I was getting recommendations for conspiracy theories with racist and xenophobic undertones. As a person of color, I couldn't think positively of that content or the ideas being spewed, so I reported the video and marked it as "Not Interested." Knowing that I had gone too deep into the rabbit hole, I slowly but surely marked all the theories as not interested, and just as slowly removed myself from that echo chamber.

Through this, I saw how easy it would be for young or naive users to believe the misinformation being spread and slowly enter an echo chamber where their negative beliefs are reinforced. An echo chamber leads its members to distrust everybody outside of it. In addition, because of that distrust of outsiders, information can spread inside the chamber with no factual support, meaning baseless claims are easily accepted as truth, further reinforcing negative and oppressive beliefs. Right now it is easy to end up on any side of a political echo chamber on TikTok, which is an issue for a platform with such a young user demographic.

TikTok inadvertently suggests videos that match user preferences, and if those preferences are political or oppressive, it is extremely easy for users to end up in an echo chamber. In addition, because the algorithm is created by privileged individuals, it can carry the biases of creators who are in a position of power. That bias can surface as recommendations favoring users with fairer skin, thinner noses, or bodies shaped a certain way. TikTok silences users who do not fit the status quo of the algorithm's creator(s), while also letting users get trapped in echo chambers of content creators who look like them. As a result, content created by people of color can be hidden from users who do not look like them. Although TikTok may be a fun and quirky video app, it may be more destructive than we think.
