TikTok Knows Your Ethnicity and Uses It Against You

Timothy Wang
SI 410: Ethics and Information Technology
8 min read · Feb 22, 2022
Photo of diverse people from APA.org, © S. Pappas, 2019

As if it were yesterday, I still remember the curiosity and uneasiness that consumed me the moment I realized that TikTok knew my ethnic identity. Like any other day, I was scrolling through my ‘For You’ feed when I noticed a pattern that had been under my nose the entire time. With each consecutive scroll, I was presented with content created by somebody who looked just like me. In shock, I quickly scrolled five more times… I was full of questions and hopeful doubts about the power of their algorithm. How could they know who I was, my race and identity as an Asian American, without me ever mentioning my East Asian background? And was it fair to assume that this was the only content I wanted to see? When I finished that fifth and final scroll, I let out a scoff of disbelief: the pattern held true.

It turns out that my experience with TikTok making suggestions based on race was not one in a million. In a Wired article, Marc Faddoul, an AI researcher at UC Berkeley’s School of Information, found while using the platform that after he followed certain users, TikTok’s algorithm would automatically suggest other users whose profile pictures showed the same ethnicity. Eerily, taken together with my experiences, this means that despite TikTok denying any use of profile pictures for personalized content filtering, its content curation algorithm can effectively see your ethnicity from your actions and tailor the content presented to you accordingly. When confronted about racial filtering in its algorithm, TikTok has downplayed the effects, claiming that it has not been able to replicate similar results and that its algorithm is based solely on collaborative filtering, that is, recommendations based on the behavior of other users (Why is TikTok creating filter bubbles based on your race?). However, the experiences of its users of color remain real and unanswered. How is TikTok really using the data that reveals our race, and what are the implications? The reality is that this direction of the algorithm, using race as a categorical filter for interests and personalized recommendations, whether intentional or not, is problematic by design because it discriminates against individuals of color by locking them into race-based filter bubbles, limits the reach of minority voices, and creates racist assumptions about desired content.
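
To see how a recommender can pick up on race without ever being told about it, consider a minimal sketch of collaborative filtering, the technique TikTok says it relies on. The toy data, variable names, and cosine-similarity scoring below are my own illustrative assumptions, not TikTok’s actual code; the point is only that no field stores ethnicity, yet if engagement patterns happen to correlate with creator ethnicity, the “users like you” signal reproduces that correlation.

```python
# Illustrative sketch only: user-based collaborative filtering on toy data.
# No field here stores race or ethnicity. But if engagement happens to
# correlate with creator ethnicity, the "similar users" signal will still
# reproduce that pattern in the recommendations.
import numpy as np

# Rows = users, columns = videos; 1 means the user engaged with the video.
# Toy data: users 0-1 and users 2-3 happen to engage with disjoint creator groups.
engagement = np.array([
    [1, 1, 0, 0, 0],   # user 0
    [0, 1, 1, 0, 0],   # user 1
    [0, 0, 0, 1, 1],   # user 2
    [0, 0, 0, 1, 1],   # user 3
], dtype=float)

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend(user_idx, k=1):
    """Score unseen videos by the engagement of the most similar users."""
    sims = np.array([cosine_similarity(engagement[user_idx], engagement[u])
                     for u in range(len(engagement))])
    sims[user_idx] = 0.0                          # ignore self-similarity
    scores = sims @ engagement                    # weight others' engagement by similarity
    scores[engagement[user_idx] > 0] = -np.inf    # do not re-recommend seen videos
    return np.argsort(scores)[::-1][:k]

print(recommend(0))  # -> [2]: another video from the cluster user 0 already watches
```

Nothing in this sketch mentions race, yet if the columns a user engages with cluster along racial lines, the recommendations will too. That is exactly why a company can truthfully say it only does collaborative filtering while users of color still end up in racially patterned feeds.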

Discrimination Through Race-based Filter Bubbles

Firstly, using race as a categorical filter for personalized recommendations is problematic because the resulting race-based filter bubbles prevent individuals of a given race from equally experiencing the mainstream content that others see. Filter bubbles are commonly described as the echo chambers that are created when social media companies present ideas to users that they already agree with in order to increase engagement (Analyzing the Impact of Filter Bubbles on Social Network Polarization). This segregation of content has been shown to have a variety of effects on individuals with regard to ideological polarization in politics (Fake news and ideological polarization: Filter bubbles and selective exposure on social media). However, when the content being filtered is filtered through an algorithmic categorization of race, a new problem emerges: racial discrimination. If an algorithm decides that members of a certain race would agree more with others of the same race, and creates a filter bubble that locks their interactions within that group, the system is effectively deciding that each race should only interact within itself and that individuals would be better off consuming content produced by creators of the same race. This is not only unfair but serves as a modern form of racial discrimination enacted through algorithmic technology. It is unjust and problematic to present users with content that an overseeing system determines is a better fit for their race, because it prevents them from accessing the same content that a more dominant group has full access to.
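
As a rough intuition for how such a bubble hardens, here is a toy feedback-loop simulation. It is not TikTok’s system: the two cluster labels, the multiplicative reinforcement factor, and the assumption that the user engages with whatever is shown are all invented for illustration. It only shows that once a recommender reinforces whatever a user has already engaged with, an early lead for one content cluster snowballs into a nearly exclusive feed.

```python
# Toy simulation of a recommendation feedback loop (not TikTok's system).
# Every engagement multiplies the weight of the engaged content cluster, so
# whichever cluster gets an early lead ends up dominating the feed.
import random

random.seed(42)
weights = {"A": 1.0, "B": 1.0}   # two arbitrary content clusters, initially equal

history = []
for _ in range(1000):
    # Recommend a cluster in proportion to its current weight.
    shown = random.choices(list(weights), weights=list(weights.values()))[0]
    history.append(shown)
    # The (simulated) user engages with whatever is shown, reinforcing it.
    weights[shown] *= 1.05

last_100 = history[-100:]
print({c: last_100.count(c) for c in weights})  # one cluster dominates the recent feed
```

If the clusters in a real system happen to track creator ethnicity, this same dynamic is what turns a slight early skew into the kind of racially uniform feed I described above.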

I can testify to this with my own experiences as well. The story in my opening paragraph about my experience as an Asian American on the platform was only a snapshot of the full picture. As I continued using the platform, I also noticed that because my content feed was predominantly filled with videos made by other East Asians, I would rarely see the original instances of other mainstream content deemed “viral”. Certain trends would often never make it to my “For You” page, and when a particular platform trend did, it was typically a reenactment created by an Asian content creator. This segregation into my race-based filter bubble prevented me from experiencing the same content that others could, which made me feel discriminated against. Considered on an international scale, with millions of users experiencing the same divisions, this is a major problem. No algorithm should make users feel separated from others because of their race.

Limiting The Voices of Minorities

Picture of Jalaiah Harmon doing the Renegade from The New York Times, © J. Frank, 2020

Another problematic aspect of using race as a filter for recommendations is that it actively suppresses the voices of minorities on the platform. Because the racial filter bubbles instantiated by the algorithm mean that minority creators predominantly reach other members of the same minority group, it is much more challenging for their content to reach the mainstream feeds of more racially dominant groups of users. The effects of this were visible when Black content creator Jalaiah Harmon was not initially given credit for creating the viral TikTok dance The Renegade, because her reach as a minority on the platform was overpowered by Charli D’Amelio, a White content creator who copied her dance and was originally credited as its creator (Does TikTok Have A Race Problem?). In Chapter 1 of Data Feminism, D’Ignazio and Klein describe this imbalance as power in the matrix of domination: “some groups experience unearned advantages — because various systems have been designed by people like them and work for people like them — and other groups experience systematic disadvantages — because those same systems were not designed by them or with people like them in mind.” Jalaiah Harmon lacked power because TikTok’s algorithm, which was not designed by or intended to work for people like her, unjustly used her race as a factor for distribution, positioning her at a disadvantage to White creators on a platform with a predominantly White user base. By contrast, Charli D’Amelio held the structural privileges of power and was able to go viral off a copied idea because the system had been designed for her to succeed. She did not have to face the limitations of race-based filter bubbles because her ethnicity aligned with the mainstream majority. Jalaiah’s experience with these racial categorizations determining and limiting who could see her video therefore points to a bigger picture of the problems within the algorithm. In truth, she created a major trend that spread across international borders, but the algorithm determined that her impact should be racially contained and downplayed. Despite her success as a dancer and content creator today, her following on Instagram alone still falls 47 million short of that of her more privileged counterpart, Charli D’Amelio. Had Jalaiah Harmon’s race not been a filter for recommendations, perhaps she would have been the one to have her own TV show and clothing line.

Creating Racist Assumptions About Desired Content

Lastly, TikTok’s use of race as a categorical filter for determining personalized interests is problematic because it creates racist assumptions about the content that individuals of color want to see. Yes, studies show that individuals can be more empathetic toward others of the same ethnicity in certain scenarios (Personality and Individual Differences). However, that does not necessarily mean that the content ethnic minorities most want to see comes from creators of the same ethnicity. Further showcasing this point, studies conducted with children on the effects of in-group and out-group ethnicity on attitudes found no clear intention to align with other individuals of the same ethnicity when presented with the option to do so (British Journal of Developmental Psychology). This suggests that individuals likely do not have racial preferences regarding the ethnic background of the creators whose content they consume, just as they do not have preferences for the races of the people they interact with in real life. Users likely just want to see content that is relevant and entertaining to them (I know that this is what I hope for the most each time I scroll one step closer to infinity). It is clear that TikTok understands this, which is why it has been able to reach its level of success. However, an algorithm that assumes race can serve as a proxy for relevance raises issues of racism, especially when those assumptions are made by a small subsection of dominant individuals who are likely of a different ethnicity. Is it fair for them to assume that the content I want to see comes from creators of my own race? I certainly do not think so.

Conclusion

All in all, it is clear that TikTok’s algorithm, which indirectly uses race as a filter for interests and personalized recommendations, creates avenues for many problems in its design. Not only does it create racial filter bubbles that keep users of color from accessing the full depth of mainstream content, but it effectively segregates and suffocates the voices of minorities, preventing them from fully surfacing to the open public and cutting off opportunities for minoritized groups to offer their support toward social movements and find other forms of success on the platform. In addition, its method of using race as a measure of relevancy leads to further issues of ignorance and racism.

Many of these problems are likely not intentional, but rather the effects of privilege hazards (“the phenomenon that makes those who occupy the most privileged positions among us — those with good educations, respected credentials, and professional accolades — so poorly equipped to recognize instances of oppression in the world” (D’Ignazio and Klein, 2020)) among the individuals who design and code these algorithms. Their lack of lived experience in unprivileged positions, “of how things truly are,” profoundly limits “their ability to foresee and prevent harm, to identify existing problems in the world, and to imagine possible solutions” (D’Ignazio and Klein, 2020). As detailed in the first chapter of Data Feminism, this often results in hard-coded “sexism, racism, and other forms of discrimination into the digital infrastructure of our societies,” and, in our case of discussion, into TikTok.

It is not an easy cycle to break out of, but it begins with discussing these implications, hiring more people of color in these fields, and challenging existing systems and assumptions. In recent times, TikTok has made small advancements in this regard. In 2020, it partnered with Mutale Nkonde, CEO of AI For the People, along with a growing number of Black advisors, to further efforts toward equality for the Black community on the platform (Months after TikTok apologized to Black creators, many say little has changed). It has also launched an account called @BlackTikTok, run by Black TikTok employees and aimed at uniting the experiences of Black creators (TikTok algorithm error sparks allegations of racial bias). However, these are only the first steps toward a less problematic TikTok algorithm. Not only do these actions need to be reproduced on a much larger scale, but some Black content creators have even said that these new initiatives have created greater division between Black and White creators on the platform. How can those feelings be remedied? And where do other ethnicities fall in the plan of action for enacting change? One day, I hope I can perform those five scrolls again and see a new pattern — diversity and equality.
