We teach the algorithm diet culture
But we can also unteach it
So I recently graduated with a degree in Dietetics and Nutrition. Ever since then, nearly all of my internet and social media ads have been for grad school. My phone seems to know what’s going on in my life without me Googling any Master’s degree programs or visiting any university’s website. It knows this because, according to all of my behaviors and other unrelated searches, I am someone who is likely to go to grad school — I just graduated, I’ve been on LinkedIn, and I’ve set job alerts.
My phone’s assumption that my next step is graduate school is based on data gathered from millions of internet users — their life events, their web searches, their purchases — who then either searched for, applied to, or attended grad school. Apparently the things going on in my life, my actions, and my web behavior are similar enough to theirs that the internet assumes the next thing I’m going to do is start looking at grad school. These ads are both trying to predict my behavior and influence my next move.
It’s scarily accurate — it’s showing me ads for nutrition, public health, and non-fiction writing degrees.
Over the past few weeks, and especially after last week’s Senate hearing, we’ve been hearing a lot about how social media, particularly the Facebook-owned Instagram, is dangerous for young women. Specifically, that the algorithm is showing women and girls harmful accounts and content dedicated to eating disorders and extreme weight loss.
At one point, Senator Klobuchar asked whether Facebook uses its algorithm to amplify content that promotes eating disorders.
I am not a tech person, but I do have a background in science and have built models that use past data to predict future events. I understand enough to know that the algorithm isn’t finding and promoting this Very Bad Stuff because it has some ulterior motive. The algorithm gives you what it thinks you want based on the content you’ve been looking at and interacting with — and what people similar to you have been looking at and interacting with.
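To make that mechanism concrete, here is a toy sketch of engagement-based recommendation. Everything in it is hypothetical — the names, the data, and the simple user-similarity approach — and real platforms use far more complex models. But the core logic is the same: score content by how much you and people like you have engaged with it, with no notion of whether the content is harmful.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two engagement vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(engagement, user, top_n=2):
    """Rank posts `user` hasn't seen by how much similar users engaged.

    `engagement[u][p]` is 1 if user u interacted with post p, else 0.
    The algorithm only maximizes the metric it was given; it has no
    idea what any post is actually about.
    """
    me = engagement[user]
    scores = {}
    for other, vec in engagement.items():
        if other == user:
            continue
        sim = cosine(me, vec)  # how alike our engagement histories are
        for post, hit in enumerate(vec):
            if hit and not me[post]:  # posts I haven't interacted with yet
                scores[post] = scores.get(post, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Run on a made-up engagement matrix, a user whose history closely matches another user’s will be recommended whatever that similar user engaged with next — which is exactly how content you never searched for ends up in your feed.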
The unfortunate truth is that we teach Instagram to promote this content. As Douglas Rushkoff explains:
“Researchers have found, for example, that the algorithms running social media platforms tend to show people pictures of their ex-lovers having fun. No, users don’t want to see such images. But, through trial and error, the algorithms have discovered that showing us pictures of our exes having fun increases our engagement… The algorithms don’t know why this works, and they don’t care. They’re only trying to maximize whichever metric we’ve instructed them to pursue.”
In a CNN article, Florida State University psychology professor Pamela Keel discusses confirmation bias and how it could lead someone who feels they need to be thin or lose weight to be drawn to people or accounts that confirm that belief. If you feel like you need to lose weight or want some kind of weight-loss motivation, you don’t have to look far on social media. The more you look for it and find it, the more it’s shown to you. And even if you’re not looking for it, the more people similar to you look for it, the more of it you’ll see.
An algorithm is a reflection of us — we train it with our data. So if Instagram is suggesting eating disorder content to teenage girls, it’s because we live in a world where we make women and girls believe that they have to lose weight at any cost. We teach the algorithm diet culture.
Instagram shouldn’t allow these accounts in the first place; in fact, accounts that promote disordered eating are not allowed on the app, but they slip through or, more likely, are ignored because they’re good for engagement. The best solution would be for Instagram to do its job: remove these accounts and protect users from the amplification of harmful ideals.
Fortunately, we can also be part of the solution.
The algorithm is built by Instagram, but is taught by all of us through society’s patterns and values. It uses our past actions to predict — and then influence — our future behavior. When we reward the algorithm by engaging with what it has shown us, it learns. It is a reflection of who we are. Unlike humans, artificial intelligence cannot have values or morals. That is where we come in.
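The reward loop described here — engage and the algorithm shows you more, ignore and it shows you less — can be sketched in a few lines. This is a deliberately simplified hypothetical, not Instagram’s actual system: one learned weight per topic, and engagement as the only signal.

```python
def run_feed(weights, clicks, rounds=5):
    """Simulate an engagement-trained feed.

    weights: topic -> learned score (higher means shown more often)
    clicks:  the set of topics this user actually engages with
    Each round the feed shows the highest-weighted topic; engagement
    raises that topic's weight, and being ignored lowers it.
    """
    history = []
    for _ in range(rounds):
        shown = max(weights, key=weights.get)  # rank purely by the metric
        history.append(shown)
        if shown in clicks:
            weights[shown] += 1.0   # engagement is the reward signal
        else:
            weights[shown] -= 0.5   # ignored content loses priority
    return history
```

In this toy model, a topic you stop engaging with falls in the ranking within a round or two, and whatever you do click on quickly dominates the feed — which is the point of the next paragraph: withholding engagement is itself training.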
We must bury this content. We must stop training the algorithm to show us accounts that glorify eating disorders and amplify body dysmorphia.
If we can create an online culture that no longer equates thinness with moral superiority, diet culture can become irrelevant on social media. When diet culture content no longer receives engagement, the algorithm will no longer prioritize it, even if that content still exists. We shouldn’t have to do this work to protect ourselves on social media, but when we collectively announce we’ve had enough, change can happen both offline and online.