Everything You “Want” to See: The Developmental Impact of Algorithmic Targeting on Young Men

Charlie Brunold
Writing 150
10 min read · Nov 17, 2023

If you have used social media, you have undoubtedly experienced content personalization. It feels great when a site shows you content tailored to your interests, whether it’s cats, cars, or charcuterie boards. This is no coincidence: through algorithmic targeting, a site can predict with high accuracy which posts you will like most. Algorithmic targeting takes a user’s information and mathematically calculates the combination of content they want to see. Yet these algorithms tend to make generalizations based on broad demographic categories, and those generalizations carry dangerous repercussions. In particular, algorithmic targeting pushes violent, sexual, and inappropriate content to young men, enabling companies and creators alike to profit from the manipulation of boys’ foundational development and the reinforcement of historical gender stereotypes.

Algorithmic targeting, also known as individualized advertising or targeted profiling, is a digital strategy that uses complex algorithms to leverage an individual user’s data and generate personalized content and advertising. Companies like Meta, TikTok, and X all claim that their algorithms create a user-centric experience by recommending content based solely on a user’s interests and preferences, but they fail to describe the specific criteria that cause certain content to arrive on somebody’s page. Contrary to what these companies may claim, recommendation algorithms operate most effectively by grouping users together into broad categories.

When a user joins a site, the algorithm has no information about their content preferences and, consequently, cannot use previous behavior to curate a personalized experience. As a result, algorithms default to recommending content based on a user’s broad demographic information, such as their age, race, and gender (Bischoff). Many of these initial recommendations rest on stereotypes about those broad demographic categories, and young men in particular are the ones who suffer.
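To make this concrete, here is a minimal sketch of what recommender-systems literature calls a “cold start” fallback. Everything in it is invented for illustration: the engagement log, the group labels, and the scoring. Real platforms use far more sophisticated models, but the underlying logic is the same: with no personal history to go on, recommend whatever the user’s demographic group engages with most.

```python
from collections import defaultdict

# Hypothetical engagement log: (demographic_group, post_id, engaged?).
# Every value here is made up for illustration.
ENGAGEMENT_LOG = [
    ("male_13_17", "post_fight_clip", True),
    ("male_13_17", "post_fight_clip", True),
    ("male_13_17", "post_cooking", False),
    ("female_13_17", "post_cooking", True),
]

def group_popularity(log):
    """Aggregate engagement rate per demographic group and post."""
    stats = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # group -> post -> [clicks, views]
    for group, post, engaged in log:
        stats[group][post][1] += 1
        if engaged:
            stats[group][post][0] += 1
    return stats

def recommend(user_history, demographic_group, log, k=2):
    """Cold start: with no per-user history, fall back to whatever the
    user's demographic group clicks on most, stereotypes included."""
    if user_history:
        raise NotImplementedError("a per-user model would go here")
    posts = group_popularity(log)[demographic_group]
    ranked = sorted(posts, key=lambda p: posts[p][0] / posts[p][1], reverse=True)
    return ranked[:k]

# A brand-new teenage male account is served the group's top content
# immediately, before he has expressed a single preference of his own.
print(recommend([], "male_13_17", ENGAGEMENT_LOG))  # ['post_fight_clip', 'post_cooking']
```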

The algorithm leans on the tropes of hypermasculinity, bombarding young men with graphically violent, overtly sexual, and blatantly misogynistic content in hopes of increasing user interaction. When I first signed up for my Instagram account, for example, my entire Explore page was filled with sexually suggestive imagery before I had so much as changed a setting. By setting my gender to male and indicating that I was a teenager, I had given the company all it needed to start recommending what it thought would drive my engagement. This is no mistake: the prevailing stereotype throughout recent history is that men are more aggressive, sexually minded, and emotionless than women, and, conveniently for the algorithm, men tend to interact with this type of content more regularly.

Organizations have long exploited prevailing gendered standards to corner specific markets. Take, for example, the 1942 advertisement for Tandaco seasoning, in which a woman ponders her husband’s fury and disappointment if she were to cook a poor dinner. The advertisement suggests that by using Tandaco seasoning, the woman can create a meal that her husband and his associates will love; women were explicitly encouraged to use the product out of fear that, without it, they would fail to fill their gender role correctly (Bondfield). Tandaco, like countless other companies, capitalized on the fears created by gender norms to advertise its products. The exploitation of gender roles is very much still present today and, in fact, more dangerous than ever, because companies now use precise data systems to group individuals by demographic for algorithmic targeting.

Worst of all, the general public largely accepts these recommendation algorithms, which use gender norms to isolate specific groups, as a positive technology. Over 75% of social media users indicated they would be comfortable forfeiting their data to receive better recommendations (A. Smith). Like the false generosity Freire discusses in his book Pedagogy of the Oppressed, companies that use recommendation algorithms mask their exploitation of young men behind the promise of an improved user experience.

Additionally, companies engage in Freire’s concept of banking education when they deploy personalization algorithms. The impressionable young man trusts the underlying recommendation algorithm to provide content that aligns with his interests, and so he willingly consumes whatever gets placed on his page. If he is never given the chance to articulate his questions and process the unfiltered imagery on his device, the young man runs a high risk of unknowingly and permanently harming his emotional and mental development during his formative years.

Considering that the average U.S. teen spends seven hours a day consuming online media, there is no question that young men will, at some point, come across inappropriate content (AAFP). Furthermore, men are 10% more likely than women to see content that involves harassment or bullying online (A. Smith). It is clear that our stereotypical views about gender roles continue to propagate in the modern age.

Because users reinforce recommendation algorithms every time they interact with posts on their page, the algorithm pushes whatever type of content young men show interest in. These algorithms thereby create a positive feedback loop: the more time a user spends on a site, the more attached they become to the personalization of their posts, and the more attached they become, the more they consume. Through this cyclic relationship, young men are subliminally pressured into consuming whatever is placed on their page. Moreover, most consumers, the young man included, may know that an algorithm drives the recommendations on their page, yet they have no idea of the underlying systems at play; they hold only the illusion of agency over the content they see. In this way, through a form of banking education as theorized by Freire, the algorithm oppresses young men by showing them content that aligns with the stereotypical gender norms of our society.
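A toy simulation makes this feedback loop visible. The categories and click-through rates below are made up for illustration, but the dynamic is real: each click raises a category’s score, a higher score means more exposure, and more exposure produces more clicks.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical baseline click-through rates for a new teenage male
# account; all of these numbers are invented.
CLICK_RATE = {"sports": 0.30, "risk_content": 0.45, "cooking": 0.10, "music": 0.20}
score = {category: 1.0 for category in CLICK_RATE}  # the model's interest estimates

for _ in range(200):
    # Exposure follows the model: recommend proportionally to current scores.
    shown = random.choices(list(score), weights=list(score.values()))[0]
    # Every click reinforces the score, closing the feedback loop.
    if random.random() < CLICK_RATE[shown]:
        score[shown] += 1.0

total = sum(score.values())
print({category: round(s / total, 2) for category, s in score.items()})
```

In most runs, the highest click-rate category ends up dominating the simulated feed, even though the user never asked for more of it; the loop mistakes reflexive attention for preference.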

As Freire writes, “The oppressed, instead of striving for liberation, tend themselves to become oppressors” (Freire 45). Men who grow up under this form of banking education and false generosity are likely to become oppressors themselves. As young men see more and more “risk” content on their pages, they become desensitized to graphic violence online and grow more likely to engage in violent behavior in the future (Huesmann).

bell hooks explores this topic in her book The Will to Change: Men, Masculinity, and Love: “It is not true that men are unwilling to change. It is true that many men are afraid to change. It is true that masses of men have not even begun to look at the ways that patriarchy keeps them from knowing themselves, from being in touch with their feelings, from loving” (hooks xvii). It is not that men do not want to change; it is that, especially on social media governed by recommendation algorithms, they are never given the chance to see any reality other than the one presented in the media. That, along with the fact that many young men lack a safe and stable environment in which to learn the importance of growing into deep, emotional human beings, sets many boys behind and pushes them toward a future of reinforcing patriarchy without any idea that there could be another way of thinking.

Depressingly, social media companies and the creators on their sites are incentivized to capitalize on this exploitation of young men because graphically explicit posts generate higher engagement metrics. As a result, creators like Andrew Tate can, without repercussions, position themselves to capitalize on a young man’s desire to conform to the societal standards he sees online. Tate, with a social media following of over 8 million users, cultivated a massive presence online by spewing ignorant and sexist ideologies that diminish women while praising toxic masculinity. In a YouTube interview, for instance, Tate said: “I’m a realist and when you’re a realist, you’re sexist. There’s no way you can be rooted in reality and not be sexist” (Radford).

How does somebody with such misogynistic and incomprehensible opinions of the world come to find such a massive following? The answer lies in the algorithm and a company’s capitalistic motivation to maximize profit. Finding that young men quickly latch on to Tate’s ramblings about masculinity, the algorithm recommends his account to boys. As I explored in my WP2 (Brunold), algorithms are simply mathematical models that aim to solve a given task as effectively as possible. According to YouGov, 60% of boys aged 6–15 have heard of Tate, and one in six boys that age have a favorable view of him (M. Smith). When a recommendation algorithm sees that Tate’s fanbase comprises mostly young men, it does what it knows will earn the company the most profit: it recommends the misogynistic creator to other young men. Since companies gain so much traffic from young males (on YouTube, for example, 54.4% of users are male and 17.8% are under the age of 18), they have no reason to ban the content that generates the most buzz (Ceci; Kemp).
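To see why the recommendation falls out this way, consider a stripped-down model of the company’s objective. All of the numbers below are hypothetical, and real ranking systems optimize far richer signals, but the core incentive is the same: the group that engages most is the group that earns the most.

```python
# Hypothetical per-demographic engagement rates with a given creator's
# content; every number here is invented for illustration.
ENGAGEMENT_BY_GROUP = {
    ("male", "13-17"): 0.62,
    ("male", "18-24"): 0.48,
    ("female", "13-17"): 0.08,
    ("female", "18-24"): 0.05,
}

def expected_revenue(group, ad_value_per_click=0.02, impressions=1000):
    """Expected ad revenue from pushing the creator to `impressions`
    users in the given demographic group."""
    return ENGAGEMENT_BY_GROUP[group] * ad_value_per_click * impressions

# The objective is revenue, not user well-being: the model simply picks
# whichever group monetizes best. Here, that is teenage boys.
best_group = max(ENGAGEMENT_BY_GROUP, key=expected_revenue)
print(best_group, round(expected_revenue(best_group), 2))  # ('male', '13-17') 12.4
```

Nothing in that objective asks whether the content is good for the boys it targets; the question never enters the math.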

It is hard to say exactly why young men buy so heavily into the platforms of Tate and other hypermasculine creators, but it may come down to the fact that modern masculinity is in crisis. Through the algorithmic targeting of young men, companies signal the roles that men and women are supposed to play in our society. Those roles, not much different from the ones of forty years ago, leave men feeling isolated and without answers in their search for purpose and guidance. As a result, they turn back to the very algorithm that reduced them to stereotypical gender roles, and to online mentors who see them only as sources of profit and fame.

If we want to change how social media shapes young men, we must first control what is in our hands. We must recognize that recommendation algorithms reinforce historical gender roles, and that young men do not yet have the cognitive development to critically evaluate what is recommended on their pages. Parents, then, need to be the ones to step up and regulate their children’s time online, keeping them away from social media sites until they are old enough to join. A child makes that job much more challenging, however, when they continually beg to join these sites. I did this with my parents: I was fixated on joining social media because my friends already had accounts, and I thought I was missing out on connections. This desire to join these glorified data farms is a fascinating social phenomenon that should not go unnoted in any discussion of algorithmic targeting. From personal experience, I can say that peer pressure drives much of a young man’s desire to join these sites. But by doing something they think will make them look “cool,” boys inadvertently expose themselves to the detrimental effects of these algorithms at much too young an age.

This is why social media companies must be held accountable for age-verifying every user so that nobody under 13 can join or use their sites. Furthermore, companies should be limited in how they apply recommendation algorithms to users under the age of 18, so that those users are not targeted with overtly violent and sexual content. To be clear, we should not remove these algorithms outright; recommendation algorithms can do great things to improve a user’s experience on a site. But when used maliciously on individuals who lack the reasoning skills to decipher the imagery on their devices, they are tools of developmental destruction. A clear line needs to be drawn, and we need to step up to eliminate any possibility that young people are targeted with inappropriate content simply for fitting into a specific demographic. Maybe, once recommendation algorithms stop targeting the impressionable category of young men, we will have a better future in which boys can grow up to know the joys of a loving, emotional life, finally breaking away from the social gender norms that have engulfed our society for centuries.

Works Cited

AAFP. “Violence in the Media and Entertainment (Position Paper).” AAFP, 12 Dec. 2019, www.aafp.org/about/policies/all/violence-media-entertainment.html.

Bischoff, Manon. “How Recommendation Algorithms Work — and Why They May Miss the Mark.” Scientific American, 20 Aug. 2023, www.scientificamerican.com/article/how-recommendation-algorithms-work-and-why-they-may-miss-the-mark/.

Bondfield, Mel. “Exploring Gender Roles in Vintage Advertising: NFSA.” Exploring Gender Roles in Vintage Advertising | National Film and Sound Archive of Australia, www.nfsa.gov.au/latest/exploring-gender-roles-vintage-advertising. Accessed 16 Nov. 2023.

Brunold, Charlie. “Hands-on AI: Learning How Computers Learn.” Medium, Medium, 16 Oct. 2023, medium.com/@cbrunold/wp2-d2215dabfc12.

Ceci, Laura. “Global YouTube User Distribution by Gender 2023.” Statista, 25 Oct. 2023, www.statista.com/statistics/1287032/distribution-youtube-users-gender/#:~:text=As%20of%20October%202023%2C%20approximately,45.6%20percent%20of%20the%20total.

Freire, Paulo, et al. Pedagogy of the Oppressed. The Continuum International Publishing Group Ltd., 1993.

hooks, bell. The Will to Change: Men, Masculinity, and Love. Washington Square Press, 2005.

Huesmann, L. Rowell. “The Impact of Electronic Media Violence: Scientific Theory and Research.” The Journal of Adolescent Health: Official Publication of the Society for Adolescent Medicine, U.S. National Library of Medicine, Dec. 2007, www.ncbi.nlm.nih.gov/pmc/articles/PMC2704015/.

Radford, Antoinette. “Who Is Andrew Tate? The Self-Proclaimed Misogynist Influencer.” BBC News, BBC, 4 Aug. 2023, www.bbc.com/news/uk-64125045.

“Shedding More Light on How Instagram Works.” About Instagram, Instagram, 8 June 2021, about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works.

Smith, Aaron. “2. Algorithms in Action: The Content People See on Social Media.” Pew Research Center: Internet, Science & Tech, Pew Research Center, 16 Nov. 2018, www.pewresearch.org/internet/2018/11/16/algorithms-in-action-the-content-people-see-on-social-media/.

Smith, Aaron. “Public Attitudes toward Computer Algorithms.” Pew Research Center: Internet, Science & Tech, Pew Research Center, 16 Nov. 2018, www.pewresearch.org/internet/2018/11/16/public-attitudes-toward-computer-algorithms/.

Smith, Ben. “How Tiktok Reads Your Mind.” The New York Times, The New York Times, 6 Dec. 2021, www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.

Smith, Matthew. “One in Six Boys Aged 6–15 Have a Positive View of Andrew Tate.” YouGov, YouGov, 27 Sept. 2023, yougov.co.uk/society/articles/47419-one-in-six-boys-aged-6-15-have-a-positive-view-of-andrew-tate.

“Twitter’s Recommendation Algorithm.” Twitter, Twitter, 31 Mar. 2023, blog.twitter.com/engineering/en_us/topics/open-source/2023/twitter-recommendation-algorithm.

“YouTube Users, Stats, Data, Trends, and More — DataReportal — Global Digital Insights.” DataReportal, datareportal.com/essential-youtube-stats#:~:text=Data%20published%20in%20the%20platform’s,audiences%20using%20ads%20on%20YouTube:&text=377.0%20million%20users%20aged%2018,of%20YouTube’s%20total%20ad%20audience. Accessed 16 Nov. 2023.
