TikTok’s Curated Utopia Isn’t So Perfect

Kaela Olsen
May 26, 2020



A call to action for the leaders of TikTok to address the platform’s sexual harassment issues and its deceitful, dangerous political censorship practices.

To: Vanessa Pappas, General Manager of TikTok for the U.S., Australia, and New Zealand

From: Kaela Olsen

TikTok, a Chinese video-sharing social networking app, has experienced a rapid rise in popularity in the past few years, surpassing apps like Facebook and Snapchat in monthly users. TikTok’s explosive success has triggered increasing concern about its content moderation practices, most notably regarding sexual harassment and political censorship. After analyzing the criticisms of TikTok’s censorship policies and TikTok’s response, this memo recommends that TikTok 1) allow users to see statistical summaries of each video’s audience and to “block” certain audience groups; and 2) track and, if necessary, suppress trending videos containing hashtags with political keywords.

Section 1 provides the relevant background on TikTok’s rapid expansion and acquisition of Musical.ly. Section 2 covers the first of two key issues, the sexual harassment of minors, broken down into 2A) Historical Allegations Against TikTok; 2B) the Hypersexualization of Underage Users; 2C) TikTok’s Response; 2D) Response Shortcomings & Recommended Reforms. Section 3 assesses the second key issue, TikTok’s preferential censorship guidelines, separated into 3A) the U.S. National Security Investigation of TikTok; 3B) Censorship in Compliance with the Chinese Government; 3C) TikTok’s Response; 3D) Response Shortcomings & Recommended Reforms.

Section 1: Background

TikTok is undoubtedly the social media platform of 2020. The Chinese video-sharing app has suddenly become a global cultural phenomenon, already surpassing Facebook and WhatsApp as the world’s most downloaded non-gaming app this year (Khan 2020). By March 2020, TikTok had amassed more than 1.5 billion downloads and is expected to garner more than 50 million users by 2021 (Ng 2020).

TikTok’s Beijing-based parent company, ByteDance, was valued at $75 billion, making it the world’s most valuable startup. ByteDance’s rapid growth is primarily due to its ability to hack its way into a new market by spending absurd amounts of money to reach audiences: advertising on competing platforms, aggressively courting influencers, or lifting government bans with promises of billion-dollar investments in a country. In November 2017, ByteDance bought Musical.ly for about a billion dollars, merging the app with TikTok in August of 2018 (Tolentino & Petrusich 2019). The merger folded Musical.ly’s user base into TikTok’s, establishing the ByteDance brand in the U.S. and gaining a tight grasp on the psyche of American teenagers and young adults alike.

Unlike Facebook, Instagram, and Snapchat, TikTok is a social network that has nothing to do with one’s actual social network. Instead of feeding users posts from their real-life friends, TikTok presents a personalized “For You” page as the default landing screen: a continually refined flow of 15-second videos. The “For You” page’s precision comes from ByteDance’s machine-learning system, which analyzes each video and tracks user behavior (Tolentino & Petrusich 2019). Users share, comment on, and like videos, and create their own remixes and renditions of dance clips and act-out memes (Phillips 2019).

Section 2: The Sexual Harassment of Minors

2A: Historical Allegations Against TikTok

In April 2019, the Indian government banned new downloads of TikTok, citing concerns that it exposed minors to pornography and sexual predation. ByteDance responded by announcing plans to hire more local content moderators and to invest a billion dollars in India over the next three years. India quickly lifted the ban, and TikTok launched a new campaign, paying fourteen hundred dollars to users who promoted TikTok in India (Tolentino & Petrusich 2019). In July 2018, the Indonesian government temporarily banned TikTok for containing “pornography, inappropriate content, and blasphemy,” issuing a list of demands with which TikTok needed to comply before being reinstated. ByteDance acquiesced, promising to remove “all negative content” and to establish a team of 20 content moderators in Indonesia. A week later, the government overturned the ban (Phillips 2019).

TikTok again used its vault of financial resources to penetrate a new market by eliminating its competition in the U.S. Before being merged with TikTok, Musical.ly, a lip-syncing app based in Shanghai, was “the youngest social network we’ve seen,” with users in “first, second, third grade,” according to a U.S. ad agency. In 2018, ByteDance consolidated Musical.ly and TikTok, continuing its expeditious expansion. In February 2019, the Federal Trade Commission found that a large percentage of Musical.ly users (now TikTok users) were under the age of 13, uncovering “disturbing practices, including collecting and exposing the location” of these preteens. The app failed to ask for ages or parental consent, as required by federal law (Tolentino & Petrusich 2019). The FTC fined Musical.ly, owned by ByteDance, $5.7 million for violating COPPA, the U.S. children’s privacy law (Perez 2019a).

2B: Hypersexualization of Underage Users

TikTok has continually faced issues around the protection of its young audience; in the U.S., 60% of its users are between the ages of 16 and 24 (Schomer 2019). TikTok users’ 15-second dance videos have become the most significant virtual stage for adolescent expression today. As teens create dances to accompany explicit lyrics riddled with sexual innuendo, TikTok’s hypersexualization of teens is uniquely severe. At times, users are too young to realize the meaning of the sexual lyrics to which they dance (Phillips 2019).

What further propagates this issue is TikTok’s default landing page, the “For You” page, a feed filled with videos from anyone with a public account. TikTok’s algorithms ignore a user’s real-life social network and encourage users to discover content from strangers (Schomer 2019). As such, the public-by-default accounts of millions of teenagers have the potential to be exposed to a seemingly infinite abyss of strangers hidden behind their phone screens. Without the social ties they might have on Facebook or Instagram, these strangers lack social accountability or the need to uphold a reputation.

As a result of countless sexually predatory accounts that leave comments on the videos of underage users or manipulate them over direct message, TikTok has been called a “hunting ground” for child predators. TikTok users have filed many complaints about sexual harassment on the platform, describing TikTok as slow or absent in banning abusive accounts or removing videos. However, TikTok claims it “deploys a combination of policies, technology, and moderation strategies to address problematic content or accounts” (Reyes 2019). This combination has ultimately failed, as sexual predation overruns the app to the point that TikTok users take it upon themselves to moderate problematic accounts by reporting or flagging posts (Schomer 2019).

2C: TikTok’s Response

In April 2019, in response to its FTC fine, TikTok introduced an age-gate feature for new users, allowing only those 13 years and older to create an account; the sign-up form asks for a user’s birthday but defaults to the current date (Tolentino & Petrusich 2019). On April 30, 2020, TikTok launched a “Family Pairing” setting that allows parents to set controls on the accounts of their 13- to 16-year-olds for Screen Time Management, Restricted Mode (which limits inappropriate content), and Direct Messages. TikTok also now automatically disables direct messaging for all users under 16. TikTok’s website was recently updated to provide resources like educational safety videos and parental guides (Perez 2020).

2D: Response Shortcomings & Recommended Reforms

TikTok’s reforms provide safety measures only for users who say they are under 16. However, today’s adolescents are more than technologically fluent enough to lie about their age on a user profile. For any preteen, a simple age gate is as easy to crack as downloading the app itself. This fact alone renders TikTok’s recent reforms, including the age gate and the new Family Pairing feature, completely useless. Furthermore, when TikTok’s “For You” page continuously supplies videos of underage teenagers dancing provocatively, it is unrealistic to expect users on the receiving end not to recreate them. Instead of placing the burden of safety precautions on underage users and parents, the creators of TikTok ought to take responsibility for making it clear when users are exposed to potential predators.

TikTok should provide users with a short statistical summary of the age, gender, and location demographics of each video’s audience to help inform and warn them. If users see that their videos are reaching audiences they feel uncomfortable with, TikTok can build in a control so that users can “block out” those demographic groups. If TikTok’s widely renowned AI system can compute the perfect algorithm to feed users precisely what they want to see before they even know it, then it can protect users by suppressing the spread of their videos to strangers of a certain age, gender, or location. Previously vulnerable users would be empowered with visibility into who consumes their content, the first step in cracking the glass that the app’s sexual predators hide behind.
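To make this recommendation concrete, the following is a minimal sketch of how an audience summary and a demographic “block” control might fit together. Everything in it, from the class names to the age buckets, is a hypothetical illustration of the proposal, not an existing TikTok feature or API.

```python
# Hypothetical sketch of the proposed audience-visibility control (not TikTok's API).
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Viewer:
    age: int
    gender: str
    country: str

@dataclass
class AudienceControls:
    # Demographic groups a creator has chosen to block, e.g. blocked_age_ranges=[(30, 120)]
    blocked_age_ranges: list = field(default_factory=list)
    blocked_countries: set = field(default_factory=set)

def audience_summary(viewers):
    """Aggregate who watched a video into a short report shown to the creator."""
    ages = Counter(
        "under 18" if v.age < 18 else "18-24" if v.age <= 24 else "25+"
        for v in viewers
    )
    countries = Counter(v.country for v in viewers)
    return {"age_breakdown": dict(ages), "top_countries": countries.most_common(3)}

def may_deliver(viewer, controls):
    """Return False if the viewer falls in a demographic group the creator blocked."""
    if viewer.country in controls.blocked_countries:
        return False
    return not any(lo <= viewer.age <= hi for lo, hi in controls.blocked_age_ranges)

# Example: a teenage creator blocks delivery of their videos to viewers aged 30 and up.
controls = AudienceControls(blocked_age_ranges=[(30, 120)])
viewers = [Viewer(16, "f", "US"), Viewer(45, "m", "US"), Viewer(22, "f", "CA")]
print(audience_summary(viewers))
print([may_deliver(v, controls) for v in viewers])  # [True, False, True]
```

The design point is simple: the demographic signals the recommendation system already collects for targeting could just as easily be surfaced to creators and honored as delivery constraints.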

Section 3: Preferential Political Censorship

3A: U.S. National Security Investigation of TikTok

TikTok has been the target of a U.S. national security investigation, as politicians fear it could become a source of foreign-controlled disinformation if the Chinese-owned company complies with state intelligence work. In October 2019, Sen. Marco Rubio sent a letter to Treasury Secretary Steven Mnuchin requesting that the Committee on Foreign Investment in the United States look into ByteDance’s 2017 acquisition of Musical.ly, claiming there is “growing evidence” that TikTok’s U.S. platform is engaging in censorship. That same month, Senators Chuck Schumer and Tom Cotton wrote to Joseph Maguire, then Acting Director of National Intelligence, to warn of TikTok’s “censorship or manipulation of certain content” (Perez 2019d). In December 2019, U.S. military branches banned TikTok on government-issued smartphones due to concerns surrounding cybersecurity and spying by the Chinese government (Vigdor 2020).

3B: Censorship in Compliance with the Chinese Government

In internal documents obtained by The Intercept, ByteDance instructed TikTok content moderators to censor political speech by punishing users who harmed “national honor” or broadcast about “state organs such as police,” to remove livestreams of military movements and natural disasters, and to delete videos that “defamed civil servants” or threatened “national security.” ByteDance also ordered the suppression of posts from users deemed too ugly, poor, or disabled for the platform, as well as posts from users identified as LGBTQ+ (Biddle et al. 2020). TikTok also reportedly censors material politically sensitive to the Chinese Communist Party, including content related to the Hong Kong protests, Tiananmen Square, Tibetan and Taiwanese independence, and the treatment of Uighurs (Vigdor 2020).

Leaked internal moderation rules show that the company categorized controversial content as “deleted,” “visible to self,” “not recommended,” or “not for feed,” a category similar to “not recommended” except that it also prevents videos from appearing in search. The rules note that most political content during election periods should be marked “not recommended,” including partisan speeches, party banners, and police content such as videos outside stations or jails. Content about riots and protests, including references to Tibet, Taiwan, and Tiananmen Square, was previously marked “not recommended” but is now considered “not for feed” because it results in “real-world harm” (Chen 2020). These rigid constraints, enforced without users’ knowledge, were intended by ByteDance to bolster the app’s image as a global paragon of self-expression and creativity by removing whatever it considered unappealing. The guidelines allowed ByteDance to keep gaining traction while discouraging political dissent with a level of authoritarianism customary in China (Biddle et al. 2020).
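For readers unfamiliar with tiered moderation, the sketch below models how such a visibility scheme works in practice. The four category names come from the leaked rules described above; the topic table, matching logic, and function names are simplified assumptions for illustration only, not ByteDance’s actual system.

```python
# Illustrative model of a tiered-visibility moderation scheme.
# Category names follow the leaked rules reported by Chen (2020); the topic rules are assumed.
from enum import Enum

class Visibility(Enum):
    DELETED = "deleted"                  # removed entirely
    VISIBLE_TO_SELF = "visible to self"  # only the uploader can still see it
    NOT_FOR_FEED = "not for feed"        # excluded from the feed and from search
    NOT_RECOMMENDED = "not recommended"  # excluded from the "For You" feed only
    PUBLIC = "public"                    # unrestricted

# Hypothetical topic-to-category table in the spirit of the leaked rules.
TOPIC_RULES = {
    "partisan speech": Visibility.NOT_RECOMMENDED,
    "police content": Visibility.NOT_RECOMMENDED,
    "protest content": Visibility.NOT_FOR_FEED,
}

def classify(video_topics):
    """Return the most restrictive visibility tier triggered by a video's topics."""
    order = list(Visibility)  # members are declared from most to least restrictive
    hits = [TOPIC_RULES.get(topic, Visibility.PUBLIC) for topic in video_topics]
    return min(hits, key=order.index)

print(classify(["dance", "partisan speech"]))  # Visibility.NOT_RECOMMENDED
print(classify(["protest content", "dance"]))  # Visibility.NOT_FOR_FEED
```

The practical consequence of such a scheme is that demoted videos are never deleted, so creators have no obvious signal that their reach has been throttled.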

3C: TikTok’s Response

In response to reports exposing TikTok’s discriminatory censorship guidelines, Josh Gartner, TikTok’s Senior Director of Corporate Communications, stated that these “early and misguided” rules “represented an early blunt attempt at preventing bullying, but are no longer in place, and were already out of use when The Intercept obtained them.” However, the leaked documents make no mention of bullying; instead, they explicitly justify the rules as a means of attracting users (Biddle et al. 2020).

Following the accusations of censoring in compliance with China’s government, TikTok announced that it would delegate all video-removal decision-making to local teams in the U.S., Europe, and India, halting the use of China-based moderators to monitor overseas content (Schomer 2019). In 2019, Gartner announced “fully autonomous” “Trust and Safety” moderation teams to oversee the “development and execution of our moderation policies, headed by industry experts with extensive experience in these areas” (Biddle et al. 2020). Even so, some markets, like Germany, still rely on ByteDance moderators in China to review content (Kubota et al. 2020). In October 2019, TikTok declared a ban on political advertisements to create a “positive, refreshing environment” on the app. TikTok also tapped the corporate law firm K&L Gates to help with its moderation policies, forming a committee of experts to advise on child safety, hate speech, misinformation, and bullying. Along with creating this Advisory Council, TikTok released new Community Guidelines, published its first Transparency Report, hired a global General Counsel, and announced a Transparency Center in LA open to outside experts wishing to review its moderation practices (Perez 2020).

3D: Response Shortcomings & Recommended Reforms

These reforms were timed precisely to when TikTok was under the scrutiny of U.S. politicians and regulators, helping the company manage its narrative more than make tangible change (Perez 2019d). Despite TikTok’s October 2019 ban on political advertisements, the platform continues to be flooded by political memes and conversations on issues from climate change to Chinese censorship, LGBTQ+ rights, and the U.S. presidential race (Ng 2020). Despite projecting an image of political objectivity, TikTok admitted to censoring Hong Kong protest content, with the hashtag #HongKong tagged in a disproportionately small number of posts. ByteDance reported that its Hong Kong protest censorship is consistent with its broader policy of censoring all political content. However, TikTok continues to support U.S. political hashtags: #Trump2020 has 115.1m views, #MAGA has 83.4m views, #BlackLivesMatter has 83.4m views, and #AntiELAB has 4,735 views (Perez 2019d). This disparity between the prevalence of Chinese and U.S. political messaging rips a gaping hole through TikTok’s credibility and its promise of equitable censorship.

To resolve this, TikTok ought to carefully monitor hashtags with over ten thousand views on the platform, combing through tags for political keywords such as “Trump,” “MAGA,” “Election,” or “Democrat.” Tags containing keywords that begin trending should be moved to the “not for feed” category, and popular videos tagged with these hashtags should be treated the same. Eliminating such a wide variety of hashtags and the videos associated with them might cut out a decent chunk of the videos available on the app, and thus fail to represent the full range of ways users wish to express themselves. However, TikTok must make this sacrifice if it wants to remain politically unbiased and preserve the platform’s utopia of self-expression, creativity, and silliness, without the tarnish of real-world politics.
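A minimal sketch of the recommended hashtag monitor appears below. The view threshold, keyword list, and function names are this memo’s assumptions, not part of any existing TikTok interface; the essential property is that U.S. and Chinese political tags are held to the same demotion threshold.

```python
# Sketch of the recommended hashtag monitor; thresholds and keywords are assumptions.
POLITICAL_KEYWORDS = {"trump", "maga", "election", "democrat", "hongkong", "antielab"}
VIEW_THRESHOLD = 10_000  # only hashtags with more than 10,000 views are examined

def is_political(hashtag):
    """True if the tag contains any political keyword as a substring."""
    tag = hashtag.lower().lstrip("#")
    return any(keyword in tag for keyword in POLITICAL_KEYWORDS)

def flag_trending_tags(tag_view_counts):
    """Return the tags that should be demoted to the 'not for feed' category."""
    return sorted(
        tag for tag, views in tag_view_counts.items()
        if views > VIEW_THRESHOLD and is_political(tag)
    )

# Example with made-up view counts: the same rule applies regardless of which
# country's politics a tag refers to.
counts = {"#Trump2020": 115_100_000, "#AntiELAB": 4_735, "#fyp": 900_000_000}
print(flag_trending_tags(counts))  # ['#Trump2020'] (#AntiELAB is below the threshold)
```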

Section 4: Conclusion

TikTok has become one of the world’s fastest-growing apps and a global cultural phenomenon. However, with its meteoric expansion have come extreme growing pains. Controlled by the obscenely affluent Beijing-based ByteDance, TikTok has faced countless concerns regarding its moderation and censorship policies: specifically, the sexual harassment of minors and its deceptive political censorship.

While attempting to bandage the platform’s sexual harassment problem with age gates and parental protections, TikTok has failed to truly mitigate the issue. Both the Indian and Indonesian governments placed bans on downloads of the app, citing sexual predation and pornography. TikTok’s predominantly young user base, which commonly performs short dances alongside explicit music, breeds a community in which sexual predation is frequent. To amend this, TikTok should empower users with the ability to see the demographics of their audience and to block certain audience groups. The second key concern is the censorship guidelines that TikTok employs to moderate content, combined with its familial ties to China. TikTok has recently been the target of a U.S. national security investigation, apprehension from multiple politicians, and a ban on government-issued smartphones. TikTok reportedly censors material deemed politically sensitive to the Chinese Communist Party on its U.S. platform, and leaks of its internal moderation rules show censorship of Hong Kong, Tibet, Taiwan, and Tiananmen Square. Following these accusations, TikTok announced its intent to localize all content moderation. However, despite TikTok’s firm statements of political impartiality and detachment from the Chinese government, the platform continues to censor Chinese and American political videos and hashtags with an astounding disparity. TikTok should closely monitor trending hashtags containing political keywords and remove videos with those hashtags, treating all politically controversial hashtags with equal and unbiased thresholds for removal.

As seen in its policies for moderating offensive content, TikTok has worked tirelessly since its earliest stages to curate a perfect, utopian platform for children, teenagers, and young adults alike to create, self-express, and share with an infinite abyss of strangers. If TikTok so deeply desires to protect its “positive, refreshing environment,” then its utmost priority needs to be abolishing the platform’s sexual harassment and deceptive political censorship. If its primary focus is “creating an entertaining, genuine experience for our community” (Perez 2019c), as its business leaders preach, then it has to make tangible changes that prove it.

References:

Biddle, Sam, et al. “TikTok Told Moderators: Suppress Posts by the ‘Ugly’ and Poor.” The Intercept, 16 Mar. 2020, theintercept.com/2020/03/16/tiktok-app-moderators-users-discrimination/.

Chen, Angela. “A Leaked Excerpt of TikTok Moderation Rules Shows How Political Content Gets Buried.” MIT Technology Review, MIT Technology Review, 2 Apr. 2020, www.technologyreview.com/2019/11/25/102440/tiktok-content-moderation-politics-protest-netzpolitik/.

Khan, Coco. “TikTok Is the Social Media Sensation of Lockdown. Could I Become Its New Star?” The Guardian, Guardian News and Media, 14 Apr. 2020, www.theguardian.com/technology/2020/apr/14/tiktok-is-the-social-media-sensation-of-lockdown-could-i-become-its-new-star.

Kubota, Yoko, et al. “WSJ News Exclusive | TikTok to Stop Using China-Based Moderators to Monitor Overseas Content.” The Wall Street Journal, Dow Jones & Company, 15 Mar. 2020, www.wsj.com/articles/tiktok-to-stop-using-china-based-moderators-to-monitor-overseas-content-11584300597.

Newton, Casey. “The Threats against TikTok Are Beginning to Add Up.” The Verge, The Verge, 5 Nov. 2019, www.theverge.com/interface/2019/11/5/20947952/tiktok-threats-regulation-competition-pr-china-hearing.

Ng, Alfred. “US Officials in Contact with TikTok over Political Disinformation.” CNET, CNET, 3 Mar. 2020, www.cnet.com/news/us-officials-in-contact-with-tiktok-over-political-disinformation/.

Perez, Sarah. (2019a). “FTC Ruling Sees Musical.ly (TikTok) Fined $5.7M for Violating Children’s Privacy Law, App Updated with Age Gate.” TechCrunch, TechCrunch, 27 Feb. 2019, techcrunch.com/2019/02/27/musical-ly-tiktok-fined-5-7m-by-ftc-for-violating-childrens-privacy-laws-will-update-app-with-age-gate/.

Perez, Sarah. (2019b). “It’s Time to Pay Serious Attention to TikTok.” TechCrunch, TechCrunch, 29 Jan. 2019, techcrunch.com/2019/01/29/its-time-to-pay-serious-attention-totiktok/?guccounter=1.

Perez, Sarah. (2019c) “TikTok Explains Its Ban on Political Advertising.” TechCrunch, TechCrunch, 3 Oct. 2019, techcrunch.com/2019/10/03/tiktok-explains-its-ban-on-political-advertising/.

Perez, Sarah. (2019d). “TikTok Taps Corporate Law Firm K&L Gates to Advise on Its US Content Moderation Policies.” TechCrunch, TechCrunch, 15 Oct. 2019, techcrunch.com/2019/10/15/tiktok-taps-corporate-law-firm-kl-gates-to-advise-on-its-u-s-content-moderation-policies/.

Perez, Sarah. (2020). “TikTok to Launch Parental Controls Globally, Disable Direct Messaging for Users under 16.” TechCrunch, TechCrunch, 16 Apr. 2020, techcrunch.com/2020/04/16/tiktok-to-launch-parental-controls-globally-disable-direct-messaging-for-users-under-16/.

Phillips, Owen. “The App That Exposes Teens to Catcalls and Harassment.” Medium, OneZero, 13 Dec. 2019, onezero.medium.com/the-app-that-exposes-teens-to-catcalls-and-harassment-tiktok-musically-d98be52c6ff1.

Reyes, Mariel Soto. “Tiktok Users Are Taking Content Moderation into Own Hands.” Business Insider, Business Insider, 26 June 2019, www.businessinsider.com/tiktok-users-take-content-moderation-lead-2019-6.

Schomer, Audrey. “TikTok’s Growth Will Spark Content Moderation Pains — and Greater Transparency Is the Fix.” Business Insider, Business Insider, 14 Aug. 2019, www.businessinsider.com/tiktoks-growth-sparks-content-moderation-pains-2019-8.

Tolentino, Jia, and Amanda Petrusich. “How TikTok Holds Our Attention.” The New Yorker, 30 Nov. 2019, www.newyorker.com/magazine/2019/09/30/how-tiktok-holds-our-attention.

Vigdor, Neil. “U.S. Military Branches Block Access to TikTok App Amid Pentagon Warning.” The New York Times, The New York Times, 4 Jan. 2020, www.nytimes.com/2020/01/04/us/tiktok-pentagon-military-ban.html.
