Censorship and Shadowing: What Is TikTok's Problem with the SES Equality, Justice, Woman Platform?

Asmin Ayçe İdil Kaya
Women in Technology
6 min read · Nov 13, 2023
Source: Image by Emily Rand / © LOTI / Better Images of AI / AI City / CC-BY 4.0

The SES Equality, Justice, Woman Platform has been covering algorithmic censorship and shadowing in its content for some time, and as a platform we now face this censorship directly on some of our own social media channels.

While bringing women's and gender equality issues to the headlines on the SES Equality, Justice, Woman Platform website, we also compile the weekly agenda into newsletters for our followers. Every week we additionally turn these agendas into podcasts with the "Raise Your Voice!" (Yükselt SES'ini!) program, interpreting the week's events from a feminist perspective for our listeners. One of the topics we address is algorithmic censorship and shadowing on social media. However, even content criticizing this issue is being censored: TikTok, for instance, removed the video in which we discussed algorithmic censorship. Let's first look at what algorithmic censorship and shadowing are, and then examine the instances the SES Equality, Justice, Woman Platform has experienced.

What is Algorithmic Censorship or Shadowing?

First, let me explain what I mean by algorithmic censorship, a topic we touched upon in our podcast episodes over the past few weeks on the SES Equality, Justice, Woman Platform. The algorithmic censorship we encounter on social media can be defined as technology companies deliberately preventing the spread of content on a particular topic or situation. These major companies can remove such content after it is published through deliberate decisions (which may be political, economic, or made for other reasons). They do not say outright, "We are removing it for this reason"; instead, they present you with hundreds of pages of community rules and claim the content violates them.

Another form of algorithmic intervention is shadowing, where content is deliberately not surfaced. We can detect it by observing that content on certain topics receives drastically fewer views than the rest. For example, if a post that normally receives at least 500 views suddenly drops to 1 or 2, we can reasonably say it is being shadowed: if your content appears on only one person's homepage, the platform is simply not distributing it. A rough way to spot this pattern is sketched below.
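To make the detection idea concrete, here is a minimal sketch in Python of the heuristic just described: flag a post whose views fall far below the account's typical reach. The function name, the 10% threshold, and all the numbers are our own illustrative assumptions, not anything TikTok exposes; you would have to collect per-post view counts from your own account analytics.

```python
from statistics import median

# Minimal sketch (hypothetical): flag a post whose views fall far below
# the account's typical reach. Threshold and data are illustrative.

def flag_shadowed(history: list[int], new_views: int, ratio: float = 0.1) -> bool:
    """Return True if new_views is below `ratio` of the typical view count.

    history:   view counts of comparable past posts.
    new_views: views of the post under suspicion.
    ratio:     fraction of typical reach below which we flag (heuristic).
    """
    baseline = median(history)  # median resists one-off viral outliers
    return new_views < baseline * ratio

# Example with numbers on the scale this article reports:
history = [220, 310, 250, 500, 280]   # posts that reached feeds normally
print(flag_shadowed(history, 4))      # True: 4 views vs. ~280 typical
print(flag_shadowed(history, 240))    # False: within the normal range
```

Of course, a single low-view post proves nothing by itself; the heuristic only becomes meaningful when, as in the cases below, the flagged posts cluster around the same kinds of topics.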

When Did the SES Equality, Justice, Woman Platform Encounter Algorithmic Censorship or Shadowing?

Our platform is currently being actively censored on TikTok; more precisely, on that channel we can detect both algorithmic censorship and shadowing. Although we are relatively new to TikTok, we can't say we get along very well. Since we joined, our content has received normal or above-average views relative to our follower count whenever it touches on popular or non-controversial subjects. However, when it comes to politics, and especially to criticism of the capitalist system, we encounter censorship and shadowing.

Let’s identify the content that was directly removed from the platform:

1- First, a protest video about the Akbelen resistance in Turkey, published by our platform, was removed from TikTok. After our objection, however, we were able to put the content back into circulation, and since the issue was high on the agenda in Turkey, its views were in the normal range.

2- Second, the commemorative content about the Mahsa Amini protests in Iran, which we published on the anniversary of her death, was removed from TikTok. In it, we discussed the current situation in Iran through an interview with a woman living there. The footage consisted only of protest scenes; there was no violent or triggering moment in the content, a matter the SES Equality, Justice, Woman Platform is very sensitive about. After we objected to the removal, the content was restored but marked as a "sensitive image," and its views were accordingly much lower.

3- Another piece of content removed from TikTok was the podcast video about Israel bombing the hospital in Gaza. The video, published after the bombing, explained the aftermath of the incident and the current situation in Gaza. We posted it on Instagram without any problem, but TikTok removed it as soon as we shared it. When we objected, we received no response, meaning the video was removed from circulation entirely. We have heard from other cases that coverage of the war in Gaza has been subjected to algorithmic censorship and shadowing on social media platforms. In a report, Amnesty International states: "There are worrying notifications that content shared by Palestinians and those advocating for Palestinian rights is potentially subjected to discriminatory content control by social media platforms." We believe we faced this censorship because our content expressed support for the Palestinian people and called for an end to the war.

4- One of the most interesting cases of censorship involved our podcast video from last week, the very one in which we criticized this algorithmic censorship and shadowing; it was removed from TikTok. The content used only screen recordings, yet TikTok removed it as soon as we shared it. After we objected, we received a response and TikTok put the content back into circulation. On the other side of the coin, however, we think the content was then shadowed, because its views came in at roughly a quarter of our normal scale.

In all of these cases, we clearly saw that TikTok is not very open to critical content, especially on political issues, and wants to prevent its spread. While we cannot say directly what it supports, it clearly does not support alternative, feminist, or pro-peace views.

Let’s share some of the shadowing situations we’ve encountered:

1- The first instance of shadowing involved our content on HPV vaccines. When we demanded in a TikTok video that the HPV vaccine be made free, the video received only 29 views, whereas our average on TikTok was around 200–300.

2- Another video we believe was shadowed concerned girls' difficulties in accessing sports, posted after the success of women's volleyball in Turkey. Its view count was absurdly stuck at 4; had it reached the homepage, the normal range would have been 200–300 views.

3- The most recent shadowing we encountered involved the call by Fidan Ataselim (a feminist activist from Turkey) for the big women's march on November 12th. We shared this call video on TikTok and, again, it received only 4 views. TikTok did not surface this content.

We can say with certainty that the shadowing we observed on TikTok applied to content addressing political issues. When TikTok cannot find anything in the content that violates its "community rules," we believe it shadows the content instead: if it can't find a reason to delete something, it prefers not to show it to anyone. Likewise, we consistently observe low views on content criticizing the system and dominant politics, which points to systematic interference by the algorithm, as the simple comparison below illustrates.
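To show what "systematic" means here, this sketch groups the view counts quoted in this article by topic and compares their medians. The grouping and the non-political figures are our own illustrative assumptions drawn from the ranges reported above, not platform data.

```python
from statistics import median

# Hypothetical grouping of the view counts quoted in this article.
views = {
    "political":     [29, 4, 4],        # HPV vaccine, girls in sports, march call
    "non_political": [200, 250, 300],   # typical range reported for other posts
}

for topic, counts in views.items():
    print(f"{topic}: median {median(counts)} views")
# political: median 4 views
# non_political: median 250 views
```

A gap this large, appearing post after post on the same kinds of topics, is what distinguishes systematic shadowing from the ordinary variance of a recommendation algorithm.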

What do we do?

While using a platform like TikTok, there is little to do but publicize the algorithmic censorship and shadowing we experience and remind people that what they see on social media is heavily curated and biased. As a platform, our answer is to keep producing critical content in this field and to publicize it wherever we can reach people.

Author and Translator: Asmin Ayçe İdil Kaya

Original Content in Turkish: https://susma24.com/tiktokun-yukselt-sesini-podcast-ile-derdi-ne/


Asmin Ayçe İdil Kaya is a 27-year-old journalist from Turkey, specialized in women's and gender studies, researching gender discrimination in knowledge production.