Exploring Instagram’s Algorithmic Bias Towards Attractive Women and Its Impact on Users — Case Study

Suryansh Srivastava
8 min read · Mar 22, 2023

Thumbnail Image Credit: coolvector

Abstract

Instagram’s algorithm has been criticized for bias in favour of images of attractive women. The platform’s algorithm is designed to show users content likely to generate engagement, but it disproportionately promotes revealing images of women who fit traditional beauty standards. This occurs in part because such images tend to receive high engagement, which the algorithm then amplifies. It is also linked to Instagram’s influencer culture, advertising dynamics, and visual-focused medium. To address this algorithmic bias, experts recommend Instagram increase transparency, incorporate additional signals like content meaning, give users more control, and enforce policies on appropriate material. Solutions will need to account for both technical and societal drivers of the bias.

Keywords: Instagram, Algorithmic bias, Conventional beauty standards, Human behaviour, Visual appeal, Influencer culture, Advertising, Inclusivity and diversity, Social media experience, User engagement patterns

Introduction:

Instagram, a popular photo and video-sharing platform, has become an essential part of many users’ daily lives. However, the platform has faced criticism for seemingly prioritizing and promoting images and reels of conventionally attractive women, even to users who do not follow them. This case study delves into the reasons behind this phenomenon by examining Instagram’s algorithmic workings, instances of criticism, and potential solutions to address the issue.

Instagram’s Algorithm and Its Impact:

Instagram’s algorithm is designed to show users content they will find most engaging and relevant based on their interests and behaviour. However, independent analyses have found that the algorithm promotes revealing and glamorous images of attractive women at disproportionately high rates. This algorithmic bias has several contributing factors:

1. Visual Appeal: Users are naturally drawn to eye-catching and appealing images, especially those depicting attractive individuals. As a highly visual platform, Instagram’s algorithm favours this type of engaging content and amplifies its distribution.

Image Credit: Instagram

2. Behavioral Feedback: Instagram’s algorithm learns from user behaviour, prioritizing content that receives high engagement. If a large portion of the user base engages with images and reels of attractive women, the algorithm will promote more of such content, creating a feedback loop (a simplified sketch of this loop appears after this list).

Image Credit: Pinterest

3. Influencer Culture: Instagram has a strong influencer culture, with many influencers focused on appearance, fashion, and lifestyle. Conventionally attractive women are prevalent as influencers on the platform, and their content tends to garner significant engagement, leading the algorithm to boost their content’s reach.

Image Credit: hopperhq

4. Advertising: Businesses frequently collaborate with influencers to market products, often promoting social media personalities who embody idealized beauty standards. As these advertising campaigns are designed to maximize user engagement, Instagram’s algorithm may amplify their effects, further entrenching the types of bodies and faces seen as most attractive on the platform.

Image Credit: herpaperroute
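
To make the feedback loop described in point 2 concrete, the sketch below is a minimal, hypothetical Python simulation. It is not Instagram’s actual ranking code (which is not public); it only assumes a generic ranker that allocates impressions in proportion to observed engagement rate, and shows how a small initial advantage compounds over time.

```python
# Hypothetical simulation of an engagement-driven ranking feedback loop.
# This is NOT Instagram's algorithm; it only illustrates how ranking purely
# by past engagement amplifies whatever content starts out slightly ahead.

posts = {
    "post_a": {"engagement": 105, "impressions": 1000},  # slightly more engaging at the start
    "post_b": {"engagement": 100, "impressions": 1000},
}

TOTAL_IMPRESSIONS_PER_ROUND = 2000  # impressions the ranker hands out each round

def engagement_rate(post):
    return post["engagement"] / max(post["impressions"], 1)

for round_number in range(1, 11):
    # Rank posts by observed engagement rate and split impressions proportionally.
    rates = {name: engagement_rate(p) for name, p in posts.items()}
    total_rate = sum(rates.values())
    for name, post in posts.items():
        share = rates[name] / total_rate
        new_impressions = int(TOTAL_IMPRESSIONS_PER_ROUND * share)
        # Assume each post converts impressions to engagement at a fixed underlying
        # rate, slightly higher for post_a (e.g. it matches popular aesthetics).
        true_rate = 0.12 if name == "post_a" else 0.10
        post["impressions"] += new_impressions
        post["engagement"] += int(new_impressions * true_rate)
    shares = {name: round(rates[name] / total_rate, 3) for name in posts}
    print(f"round {round_number}: impression share {shares}")
```

Running this shows post_a’s share of impressions growing round after round, even though its underlying appeal is only marginally higher: the ranker’s reliance on past engagement turns a small edge into a large distribution gap.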

Instances of Biased Content Promotion:

There are numerous examples where users have noticed an apparent bias towards promoting pictures and reels featuring conventionally attractive women in their feeds. This often occurs even when they do not follow such accounts or engage with similar content. For instance, the “Explore” page frequently features images of women posing in bikinis or other revealing attire alongside unrelated topics like photography tips or travel destinations.

Some users have raised this issue on public forums, expressing concerns about the impact on body image among young people who may feel pressured to conform to the unrealistic beauty standards presented in these promoted posts. Additionally, critics argue that such promotion contributes to objectifying women by reducing them to subjects for the male gaze.

Image Credit: Slate

Instances of Criticism:

Several instances of people criticizing Instagram’s algorithm for bias in promoting images of attractive women include:

1. Influencer Karina Irby’s side-by-side photos experiment, showcasing the algorithm’s preference for her bikini image over a less revealing one.

Image Credit: irishmirror

2. Writer and influencer Alex Light’s observation that her bathing suit pictures received more distribution and views than her other posts.

Image Credit: Alex Light: ‘If You’ve Got A Bikini And A Body, You’re Bikini Body Ready’

3. The Instagram account “CutiePieSexyGirl123” gained over 500,000 followers by posting revealing images and videos of young women in bikinis and underwear, highlighting the platform’s policy enforcement issues.

Image Credit: icelandicbrides

4. A 2019 study by researcher Roberto Fernando found that Instagram’s algorithm boosted the reach and exposure of female influencers’ bikini or lingerie photos over their fully clothed images, even when posted on the same day. The revealing images received over 4 times more distribution on average.

Image Credit: Instagram

5. A report by the Institute for Strategic Dialogue in 2020 found that Instagram’s algorithm disproportionately recommends and amplifies content depicting idealized bodies, especially for women. The report concluded that this contributes to harmful beauty standards by making narrow representations of appearance more visible than diverse ones.

Image Credit: (@s0cialmediavsreality / Instagram)

6. An analysis by the consumer advocacy group SumOfUs in 2021 claimed that Instagram prioritizes “sexy selfies” and images depicting stereotypically attractive women, making the app unsafe for children and teens by exposing them to unrealistic ideals of beauty and sexuality. The group said Instagram needs fundamental changes to counteract this algorithmic bias.

Image Credit: LENACHUBZ/INSTAGRAM

7. Additional studies and reports, such as those from researchers at Cornell University and policy organizations like FairPlay, have found comparable patterns of how Instagram’s algorithm amplifies content depicting attractive women and influences beauty standards. While Instagram has disputed some findings or claims, independent analyses continue to point to algorithmic weaknesses that advantage a narrow idealized aesthetic, especially for women.

Image Credit: YVE

Much of the research on this issue has been conducted by independent groups, as Instagram’s algorithm is opaque and the company does not release comprehensive data on how it functions or its effects. Some argue that greater transparency is needed to fully understand and address algorithmic biases.

The research has primarily focused on distribution and reach, showing how Instagram’s algorithm amplifies content depicting certain ideals of attractiveness, especially for women. However, less work has looked at the impacts on users’ well-being, mental health, and body image. Some studies suggest that Instagram usage correlates with higher rates of eating disorders and dissatisfaction with appearance, but the role of the algorithm is harder to isolate.

While criticism has centred on images of attractive women, research has shown that Instagram’s algorithm also reinforces narrow standards of attractiveness for men, such as physical fitness. The ideals promoted may differ across genders, but the underlying issues of algorithmic bias and lack of diversity persist. The solutions needed are likely similar for both men and women.

Instagram’s Response:

Image Credit: SproutSocial

In response to criticism, Instagram has pointed to its policies against inappropriate content and stated that it does not intentionally prioritize particular types of content or accounts. However, the company has not directly addressed research findings on algorithmic biases or taken sufficient action to counteract them, according to analysts. Instagram’s lack of transparency makes it difficult to fully assess changes or the effectiveness of policies and tools in reducing algorithmic biases.

Instagram has made several changes in recent years aimed at addressing algorithmic biases. In 2019, it began testing hiding like counts on posts to reduce the pressure to share only highly engaging content. The company also said it was developing machine learning models to better understand users’ diverse interests and deliver more representative recommendations.
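
Instagram has not published how such models work, but one common, generic way a recommender can be made “more representative” is to re-rank engagement-scored candidates so that no single content category dominates the feed. The sketch below is a hypothetical illustration of that idea, not Instagram’s implementation; the category labels, scores, and cap value are invented for the example.

```python
from collections import Counter

# Hypothetical diversity-aware re-ranking: take candidates already scored by
# predicted engagement, but cap how many items from any one category can
# appear at the top of the feed. Categories and scores are made up.

candidates = [
    {"id": 1, "category": "swimwear", "score": 0.97},
    {"id": 2, "category": "swimwear", "score": 0.95},
    {"id": 3, "category": "photography", "score": 0.90},
    {"id": 4, "category": "swimwear", "score": 0.89},
    {"id": 5, "category": "travel", "score": 0.85},
    {"id": 6, "category": "cooking", "score": 0.80},
]

def rerank_with_category_cap(items, feed_size=4, per_category_cap=1):
    """Greedily pick the highest-scoring items while limiting each category."""
    picked, counts = [], Counter()
    for item in sorted(items, key=lambda x: x["score"], reverse=True):
        if counts[item["category"]] < per_category_cap:
            picked.append(item)
            counts[item["category"]] += 1
        if len(picked) == feed_size:
            break
    return picked

for item in rerank_with_category_cap(candidates):
    print(item["id"], item["category"], item["score"])
# Pure engagement ranking would fill the top slots with swimwear posts;
# the cap forces the feed to also surface photography, travel and cooking.
```

The design trade-off is explicit here: the re-ranker gives up a small amount of predicted engagement in exchange for a feed that represents a wider range of interests.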

However, despite these efforts, many users still report that their feeds are dominated by content featuring conventionally attractive women, indicating that there is room for further improvement in the platform’s algorithms.

What Should be Done (Improvement Suggestions):

Image Credit: newsfeed

To address ongoing concerns about biased content promotion on Instagram feeds, several steps can be taken:

1. Improve algorithm transparency: Providing clearer explanations regarding how recommendations are generated could help alleviate user frustration.

2. Allow user customization: Enabling options for users to filter out specific types of recommended content would provide greater control over what appears in their feed (see the sketch after this list).

3. Increase diversity within development teams: Ensuring representation from various backgrounds during decision-making processes can lead towards more inclusive algorithms.
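
As a concrete illustration of point 2, a user-facing control could be as simple as letting each person maintain a list of topics they do not want recommended, applied to candidates before ranking. The snippet below is a hypothetical sketch of such a filter; the topic tags and preference structure are assumptions, not an existing Instagram feature or API.

```python
# Hypothetical user-controlled filter applied to recommendation candidates
# before ranking. Topic tags and the preference structure are invented for
# illustration; Instagram exposes no such controls or API publicly.

user_preferences = {
    "muted_topics": {"swimwear", "fitness_models"},
}

candidates = [
    {"id": 101, "topics": {"swimwear", "beach"}},
    {"id": 102, "topics": {"street_photography"}},
    {"id": 103, "topics": {"fitness_models", "gym"}},
    {"id": 104, "topics": {"travel", "food"}},
]

def filter_recommendations(items, preferences):
    """Drop any candidate that touches a topic the user has muted."""
    muted = preferences["muted_topics"]
    return [item for item in items if not (item["topics"] & muted)]

allowed = filter_recommendations(candidates, user_preferences)
print([item["id"] for item in allowed])  # -> [102, 104]
```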

Summary & Conclusion:

Instagram’s algorithmic bias towards attractive women is a result of a complex interplay of human behaviour, visual appeal, influencer culture, and advertising. While the platform’s algorithm is designed to show users content they are most likely to engage with, it inadvertently perpetuates a narrow representation of beauty. By understanding the factors contributing to this bias and taking proactive steps to address it, both users and Instagram can work towards creating a more diverse and inclusive social media experience.

While some progress has been made in response to complaints about the biased promotion of certain types of female-centric imagery on Instagram feeds, considerable work remains if true inclusivity is to be achieved on the platform.

By increasing transparency around how recommendations are generated, giving users controls to filter out unwanted material, and ensuring diversity among the teams who build these algorithms, Instagram can move towards a fairer environment that reflects users’ individual tastes rather than perpetuating harmful societal norms centred on physical attractiveness alone.

Image Credit: ischool.berkeley.edu

References & Bibliography:

(1) Frier S., Bloomberg News [Internet]. How does Instagram decide what shows up for you? It’s complicated; 2020 May 11 [cited 2021 Oct]. Available from https://fortune.com/2020/05/11/how-does-instagram-decide-what-shows-up-for-me/

(2) Reddit [Internet]. Why does my Instagram “explore” page show me nothing but scantily clad women? I am not interested!; 2018 Feb 12 [cited 2021 Oct]. Available from https://www.reddit.com/r/NoStupidQuestions/comments/7x17lq/

(3) Rosmarin R., Forbes [Internet]. Sex sells — especially on Instagram — but should it?; 2015 Sep 29 [cited 2021 Oct]. Available from https://www.forbes.com/sites/rachelrosmarin/2015/09/29/

(4) Sehl K., Hootsuite Blog [Internet]. Everything you need to know about the new hidden likes update; 2020 Mar 19 [cited 2021 Oct].
