The Dark Side Of Social Media: How Instagram’s User Experience (UX) Algorithm Fosters Negative Effects On Mental Health In Young Adults
Instagram is a multi-faceted social media app that offers ways to connect with friends, share photos and videos, engage with content, market a business, and follow accounts that reflect personal interests. Its 1.2 billion users worldwide open the app and engage with hundreds of posts and reels for an average of 53 minutes each day.
Instagram’s algorithm in 2022 is tailored to its individual users. Much like Meta (formerly Facebook), Instagram monitors a user’s activity, mining data on how the user engages with the app. Key factors include: What type of posts are they viewing, photo or video? Who made the post: an influencer, a company, or a friend? Does the user mainly watch videos or look at photos while on the app? Do they like or comment often?

Based on the data gathered from these questions, Instagram calculates which content to direct to your Explore page or main feed, often suggesting posts or accounts to follow based on your likes, past activity, and saves, or serving targeted ads from company pages you have visited. The same data determines the order in which posts appear in your feed, ranked by your likelihood of interacting with them. The algorithm lets Instagram create a more tailored, individualized experience while you use the app, and it can promote longer use times: because you are shown more content that reflects your personal interests, you are more likely to engage with it through likes, shares, or comments. Understanding the basis of the algorithm provides insight into the negative aspects of Instagram that foster an environment harmful to users’ mental health, particularly that of young adults.
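To make this ranking logic concrete, the sketch below shows how an engagement-based feed ranker could work in principle. The signals, weights, and data structures are illustrative assumptions for this article, not Instagram’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str          # influencer, company, or friend
    media_type: str      # "photo" or "video"
    topic: str           # e.g. "fitness", "travel", "beauty"

@dataclass
class UserProfile:
    # Hypothetical engagement history mined from past activity
    topic_affinity: dict      # topic -> how often the user engaged with it
    media_preference: dict    # media type -> share of the user's views
    follows: set              # accounts the user follows

def predicted_engagement(user: UserProfile, post: Post) -> float:
    """Score a post by how likely this user is to like, comment on, or share it.
    The weights here are made up for illustration."""
    score = 0.0
    score += 3.0 * user.topic_affinity.get(post.topic, 0.0)
    score += 1.5 * user.media_preference.get(post.media_type, 0.0)
    if post.author in user.follows:
        score += 2.0
    return score

def rank_feed(user: UserProfile, candidates: list) -> list:
    # Posts the user is most likely to interact with appear first
    return sorted(candidates, key=lambda p: predicted_engagement(user, p), reverse=True)

# Example: a user who mostly watches fitness videos sees that content float to the top
user = UserProfile(
    topic_affinity={"fitness": 0.8, "travel": 0.2},
    media_preference={"video": 0.7, "photo": 0.3},
    follows={"friend_anna"},
)
feed = rank_feed(user, [
    Post(author="friend_anna", media_type="photo", topic="travel"),
    Post(author="fit_influencer", media_type="video", topic="fitness"),
])
```

The feedback loop is visible even in this toy version: whatever the user already engages with is scored higher, shown sooner, and therefore engaged with again, which is exactly the mechanism that extends time in the app.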
A main, unintended implication of the algorithm is that it may direct sensitive posts to users based solely on their past activity. For example, engagement with workout and exercise posts may prompt content promoting unhealthy body image, diet culture, drastic weight loss, ‘tummy teas’, liposuction, extreme body-building, and waist training. As a second example, clicking on edited influencer posts may lead Instagram to recommend further posts featuring unrealistic beauty standards, cosmetic surgery, photoshopping, beauty filters, and editing apps.
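A minimal sketch of how this drift can occur under a similarity-based recommender is shown below; the topic-adjacency map is a hypothetical illustration of the effect described above, not a real dataset.

```python
# Hypothetical map of topics a similarity-based recommender might treat as
# "related" to a benign interest. Purely illustrative.
RELATED_TOPICS = {
    "fitness": ["extreme-weight-loss", "detox-teas", "waist-training"],
    "beauty": ["cosmetic-surgery", "photo-editing-apps", "beauty-filters"],
}

def expand_recommendations(engaged_topics: list) -> list:
    """Return candidate topics to recommend, including adjacent ones.
    Shows how benign engagement can pull sensitive content into a feed."""
    candidates = []
    for topic in engaged_topics:
        candidates.append(topic)
        candidates.extend(RELATED_TOPICS.get(topic, []))
    return candidates

# A user who only engaged with workout posts may now also be shown
# extreme-weight-loss and detox-tea content.
print(expand_recommendations(["fitness"]))
```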
Young adults are a user group of particular concern, as they are the most likely to be directly affected by regular exposure to this content, which can contribute to mental health issues such as anxiety, depression, and body image disorders. The influx of these ‘recommended’ posts on their feed leads to over-consumption of content reflecting the unhealthy beauty and body standards put forth by social media celebrities and influencers, who are notorious for posting heavily edited, unnatural photos. Young adults are at a stage of development where body image, peer pressure, bullying, social life, high school, and dating are at the forefront of their concerns; social media can exacerbate all of these issues, leaving them feeling isolated and inadequate as a result of the content they see.
Future of social media? How can this issue be combated?
The future of social media should place heavy importance on combating the mental health issues that stem directly from social media platforms. This means a revised UX infrastructure in which algorithms are paired with AI integrations that tackle sensitive content by filtering and screening it, providing post warnings, and shielding posts that feature sensitive content, photoshopping, or inaccurate and false claims. Certain content should be blocked from view for users within a given age range, and warnings should flag visual inaccuracies such as unrealistic, edited beauty standards. Mental health resources should be promoted within these content warnings, so that young adults can seek local counselling or other supports that encourage positive mental health and body image. These integrations can provide a safer space for young adults to view and engage with content and use social media as they prefer, while reducing the potential for mental health and body image issues fostered by the app.
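One way such a screening layer could slot into the delivery pipeline is sketched below. The classifier labels, the age threshold, and the resource link are assumptions made for illustration; this is not an existing Instagram feature.

```python
from dataclasses import dataclass, field

# Labels a hypothetical sensitive-content classifier might emit (illustrative only)
SENSITIVE_LABELS = {"extreme-dieting", "cosmetic-surgery", "heavy-editing", "misinformation"}
RESOURCE_LINK = "https://example.org/mental-health-support"  # placeholder, not a real service

@dataclass
class Delivery:
    post_id: str
    blocked: bool = False                      # withheld entirely (age gate)
    shielded: bool = False                     # covered until the viewer opts in
    warnings: list = field(default_factory=list)
    resources: list = field(default_factory=list)

def screen_post(post_id: str, labels: set, viewer_age: int) -> Delivery:
    """Decide how a post is delivered, given classifier labels for that post."""
    delivery = Delivery(post_id=post_id)
    flagged = labels & SENSITIVE_LABELS
    if not flagged:
        return delivery                        # nothing sensitive: deliver normally
    if viewer_age < 18:
        delivery.blocked = True                # age-gate sensitive content for younger users
        return delivery
    delivery.shielded = True
    delivery.warnings.append(
        "This post may contain edited imagery or unrealistic body/beauty standards."
    )
    delivery.resources.append(RESOURCE_LINK)   # surface mental health resources with the warning
    return delivery
```

The point of the sketch is the placement, not the specific thresholds: screening happens between the ranking step and the user’s feed, so warnings, shields, and resources travel with the post rather than being left to the user to find.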