Strategic Knowledge

Teens use “algorithmic folklore” to crack TikTok’s black box

Iretiolu Akinrinade
Data & Society: Points
6 min read · Jul 14, 2021


Photo by Franck on Unsplash

By Ireti Akinrinade and Joan Mukogosi, Research Assistants, Data & Society

In Unseen Teen: The Challenges of Building Healthy Tech for Young People, researchers at Data & Society asked tech workers how they consider or design for the health and well-being of the people who use their platforms, particularly adolescents. One tech worker explained: “Obviously we do have teen users, we assume, but we don’t collect age data about people. Just like we don’t collect really any personal data about people so we don’t have any basis to know who is a minor and who is not.” The report calls this abdication of responsibility “strategic ignorance,” and the research suggests that by carefully choosing what they don’t know about their users, tech companies fail to protect the health and well-being of adolescents on their platforms.

Despite the prevalence of strategic ignorance inside social media and gaming companies, today’s teen tech users have developed a number of creative and often hilarious strategies to make sure that they are seen, heard, and valued online.

Photo by Shingi Rice on Unsplash

While companies feign ignorance about marginalized users on their sites, these savvy users flip the script, developing strategic knowledge about the platforms they frequent. Strategic knowledge is an imaginative praxis that uses deliberate engagement tactics to feed algorithms, in order to specify which content should grace a user’s feed. As a foil to strategic ignorance, strategic knowledge is an affirmation that the human recipients of algorithmic decisions can wrest power from the humans behind the code.

Some platforms view themselves as a digital commons and are therefore hesitant to moderate content heavily. Conversely, gaming sites devoted to building worlds, or platforms like TikTok, an app that publicly self-identifies as a place “to inspire creativity and bring joy,” may be more willing than other companies to suppress user-generated content that doesn’t match the company’s sense of self.

These corporate identities are operationalized through the platforms’ algorithms, and on TikTok, coding for “joy” has resulted in a discriminatory algorithm that further suppresses marginalized users. TikTok users have speculated about coded discrimination on the platform, sharing individual experiences and anecdotal evidence to identify and disrupt the algorithm. Engaging in collective guesswork, these users take to the comments section to propose different theories about why the algorithm acts in discriminatory ways.

In a recent video, @jameslxke asks his followers why TikTok’s algorithm puts trans users in harm’s way by promoting their content on conservative For You pages. If the algorithm is smart enough to know each user’s identity and is intent on keeping users on its platform, @jameslxke reasons, then why does it put vulnerable users at risk for what he calls a “digital lynching”? In the comment section of @jameslxke’s video, users speculate that creating conflict serves the platform’s bottom line: “tiktok does it on purpose bc arguing/dialogue keeps people on the app & the shock value of sending videos to ppl who wont enjoy it boost their app.” By sharing experiences, asking questions, and crowdsourcing answers, teens are developing an algorithmic folklore while discerning the potential motivations behind TikTok’s software engineering.

Other TikTok users create multiple accounts to segment their different interests, or to exert control over what content the algorithm recommends to them and how their own content is recommended to others. Through selective interaction with particular types of content, users can reach different niches ranging from “garden-tok” to “academic-tok.” One teen user, @izaiahisaac, made a video describing how he curated a separate “spam account” that gives him access to a conservative TikTok feed. On this burner account, @izaiahisaac noticed a trend in which users who had tested positive for COVID-19 deliberately entered public spaces with the intent to infect others. By sharing this content with the followers of his main account, Izaiah single-handedly broke through the silo that TikTok’s algorithm builds between users with different political views or offline identities. Rather than waiting for conservative content to trickle from one end of TikTok’s political spectrum to his own, Izaiah used the platform’s coded categorization of identity markers to inform his followers about right-wing tactics without exposing himself to harm.

Illustration by Cathryn Virginia for Data & Society’s Unseen Teen report

While scholarship on TikTok remains in its early stages, journalists and experts have investigated the experiences of teens on the app. In a 2020 Twitter thread, Marc Faddoul, an AI researcher at UC Berkeley School of Information, described an experiment demonstrating that TikTok’s recommendation algorithm tended to suggest accounts whose profile pictures matched the race, age, or facial characteristics of the accounts he already followed. His theory about TikTok’s “physiognomic bubbles” (which bear a relationship to a painful history of eugenicist classification, and may well imply racial bias) was bolstered when internal documents reportedly revealed that the platform directed global content moderators to suppress posts by people with certain facial characteristics. Reporters at The Intercept who reviewed the documents note that “although what it takes to earn a spot on the ‘For You’ page remains a mystery, the document reveals that it takes very little to be excluded, all based on the argument that uploads by unattractive, poor, or otherwise undesirable users could ‘decrease the short-term new user retention rate.’”

The incentive to retain users by hyper-personalizing feeds illuminates an important contradiction inherent in strategic ignorance: though in many cases these companies collect enough personalized data to target ads and videos to specific identities and demographic groups, that data often fails to cross over to their user experience and product development teams. This siloing of information and lack of corporate will to intervene in teen well-being contribute to a glaring gap in strategic knowledge: questions that explicitly critique the people behind the algorithm are absent from this collective guesswork. While teenagers debate the algorithmic incentives coded into their user experience, it seems that TikTok’s corporate black box remains intact.

This gap in strategic knowledge may be a product of limited time and energy: reverse-engineering TikTok’s algorithm by assembling folk knowledge is a daunting task. Young users may not realize just how integral human actions are to algorithmic outcomes, and corporate strategic ignorance only serves to deepen that gap. Gaining insight into the human decisions and corporate mechanisms behind the app would help teenagers refute the notion that algorithms are purely artificial and that digital infrastructure is immovable. Digital agency, born of deeper connections between young people and the people who build the platforms where they spend meaningful time, is a valuable tool for teenagers seeking to shape their digital worlds.

While TikTok has responded to calls for transparency by releasing information about how its black-box “For You” page algorithm works, teenage users are still fighting to understand how to use the app safely, effectively, and expressively. Whether through not collecting data, collecting too much content, or maintaining unclear lines of responsibility for knowledge, platforms’ attempts to obscure their inner workings are being subverted by teenagers across cyberspace every single day. Strategic knowledge offers proof that teenagers are refusing to be forgotten by the people who build the platforms they love. By molding platforms that seek to ignore them into spaces for growth, connection, education, and joy, teenagers are sending a signal to tech workers that their experiences matter.
