When I think of the word “beauty,” I think of waterfalls and sunsets and all that, and of one person in particular, the most beautiful woman I’ve ever seen. It was a small party for Vanity Fair that, in a roundabout way, I managed to get into, and the ratio of celebrities to civilians was ridiculous. I just kept drinking champagne and trying to be cool. And then I wasn’t cool, because I had to stare. The most beautiful woman I’ve ever seen was standing out on the patio alone, carrying herself with celestial grace.
I am trying hard to write this without sounding creepy. “Beautiful” doesn’t even come close to describing her; she’s a walking moonbeam. Only a Lord Byron poem has the words, all “cloudless climes and starry skies.” (And mind you this was at an event attended by Kate Walsh, Natalie Portman, the Gossip Girl girls, and dozens and dozens of trophy wives.)
But when I search for the word “beauty” in Google images, I don’t see waterfalls or anyone who looks like Kerry Washington. The result is a sea of models with vague inoffensive features, skinny, cisgender, and white white white white.
The results are ridiculous to scroll through; a screenshot of them is now a meme on Tumblr (one that originated with a post on a Facebook fan page for bell hooks).
“Algorithms are more than a technological issue: they involve not only automated data analysis, but also decision-making,” Anna Jobin wrote in a post on bias in Google search results (inspired by a stunning UN Women ad series, “The Autocomplete Truth”). Jobin points out that search results like these perpetuate negative stereotypes.
As Jobin writes, “Where can we locate accountability? With the people who first asked stereotyping questions? With the people who asked next? Or with the people who accepted Google’s suggestion to search for the stereotyping questions instead of searching what they originally intended? What about Google itself?”
Sometimes the bias is baked into the model. Thirty years ago, St. George’s Hospital Medical School wrote an algorithm to sort through the first round of applications, modeling historic patterns of acceptance and rejection. Several years after it was implemented, two professors discovered the model had absorbed a bias against women and people with non-European-looking names. According to a British Medical Journal article, referenced in a piece last summer in The Guardian, the school worked to correct the error and even contacted people it had unfairly screened out to offer them a place.
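The mechanism is easy to reproduce. The toy sketch below is not St. George’s actual code; the data, field names, and scoring rule are all invented for illustration. It “trains” a screener by memorizing historical acceptance rates, and because the historical reviewers penalized applicants with non-European-looking names, the model does too:

```python
# Hypothetical illustration: a screening model fit to past human
# decisions inherits whatever bias those decisions contain.
# All data below is invented for the example.
from collections import defaultdict

# (grade, surname_looks_european, accepted) -- imagined historical
# decisions in which equally qualified applicants with
# non-European-looking names were systematically rejected.
history = [
    (90, True,  True),
    (90, False, False),
    (85, True,  True),
    (85, False, False),
    (60, True,  False),
    (60, False, False),
]

def fit(history):
    """'Train' by memorizing acceptance rates per (grade band, name) group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [accepted, total]
    for grade, euro, accepted in history:
        key = (grade >= 80, euro)
        counts[key][0] += int(accepted)
        counts[key][1] += 1
    return {k: a / n for k, (a, n) in counts.items()}

def screen(model, grade, euro):
    """Predicted acceptance probability for a new applicant."""
    return model.get((grade >= 80, euro), 0.0)

model = fit(history)
# Two applicants with identical grades now get different scores,
# purely because the training data treated their names differently.
print(screen(model, 88, True))   # high score
print(screen(model, 88, False))  # zero
```

No one at any point typed “reject non-European names”; the rule emerges from faithfully modeling the past, which is exactly what made the St. George’s case so instructive.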
Representation matters. Mae Jemison, the crazy-brilliant NASA astronaut (a doctor, fluent in Russian, Swahili, and Japanese, and trained in dance and choreography) who was the first black woman to travel to space, says going to space had been her dream ever since she watched Uhura on Star Trek. hooks, speaking recently at the New School, commented that the white media attention on Lupita Nyong’o (now on the cover of People magazine’s 50 Most Beautiful issue) wrongly suggests her beauty is exceptional among black women.
It might seem like popular culture is finally centering women of color — Nyong’o, Freida Pinto, Indira Varma, Rinko Kikuchi, Necar Zadegan, Sofia Vergara, Laverne Cox, Misty Copeland — but the Google algorithms are telling a different story. Casting for low-level beauty editorials — the files saved as “beauty.jpg” and posted to the internet — is undoubtedly subject to racial bias. So many hair models, so many nameless pretty faces photographed to sell eyeliner and lip gloss; and yet so few contest retrograde notions of the ideal woman’s appearance.
Come to think of it, I don’t know if there were any people of color at that Vanity Fair party besides Kerry Washington. The magazine has its own problems representing reality, from its all-white Hollywood issues to its recent photo spread of Nyong’o in which her skin appeared lightened.
Google executives have stated that the company’s goal is to mirror the world. Should it then rework its algorithms to correct this incomplete definition of “beauty,” or is fashion-industry discrimination worth mirroring? Since internet user bias is one factor, I think it is in the company’s best interest to interrupt this disgraceful feedback loop with some algorithm tweaks. Google already hand-codes censorship of some results — file-sharing sites, say, or terms related to adult content. Its algorithms aren’t Xerox machines of human interest. Structural issues might mean years before fashion and beauty photographers catch up, but there is no reason the internet can’t deliver “beauty.”
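That feedback loop is worth making concrete. The sketch below is a deliberately crude simulation, nothing like Google’s actual ranking system; the filenames, click shares, and numbers are all invented. It shows how a ranker that promotes whatever gets clicked turns a small initial skew in user behavior into near-total dominance:

```python
# Hypothetical sketch of a ranking feedback loop: a ranker that
# promotes whatever users click will amplify an initial bias.
# Filenames and numbers are invented for illustration.

def run_feedback_loop(clicks, rounds, top_boost=0.9):
    """Each round, 100 new clicks arrive; the top-ranked image
    captures `top_boost` of them because it is shown first."""
    clicks = dict(clicks)
    for _ in range(rounds):
        top = max(clicks, key=clicks.get)  # current #1 result
        for image in clicks:
            if image == top:
                share = top_boost
            else:
                share = (1 - top_boost) / (len(clicks) - 1)
            clicks[image] += 100 * share
    return clicks

# A tiny head start for the stereotypical image...
final = run_feedback_loop(
    {"thin_white_model.jpg": 51, "other.jpg": 49},
    rounds=10,
)
# ...compounds round after round until it dominates the ranking.
```

Under these toy assumptions, a 51-to-49 split ends up lopsided by a factor of six after ten rounds, which is the argument for tweaking the algorithm rather than waiting for users to change: left alone, the loop only deepens the skew it started with.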