Your Skin is No Longer Your Skin: Filters

Melissa Zhang
SI 410: Ethics and Information Technology
8 min read · Feb 22, 2022
Image source: https://www.businessinsider.com/how-to-search-for-filters-on-instagram

Open up practically any social media platform and one feature will be present: filters and editing software.

Filters did not become widely popular until the mid-2010s, with Snapchat, launched in 2011, being the first social media platform to popularize augmented reality (AR) filters and effects. Users were initially enamoured with fun filters that added cat ears or butterflies or beards to the face, but a subtler class of filters has since emerged: beauty filters. These filters blur the skin, shrink noses and slim chins, accentuate and enlarge eyes, and brighten and lighten skin colour. No longer are filters used just to play dress-up; they are now an essential part of many people’s sense of identity, even though the filtered image is often a false reflection of reality.

As Winner discusses in “Do Artifacts Have Politics?”, artifacts do not exist on their own and are not neutral subjects. I believe that AR filters are the result of the biases of those who create and design them, as well as a consequence of those who profit from gaining a competitive edge and thus develop popular features without considering the harm in adopting them.

The problem with AR filters can be viewed, on a broader scale, alongside the problems of AI as implemented for facial recognition and surveillance. As Ruha Benjamin discusses in “Race After Technology: Abolitionist Tools for the New Jim Code,” technology is commonly thought of as a techno-utopian tool to solve racism and bias in human judgement, lacking the emotions that so cloud our thought processes. However, it is important to realize that “the datasets and models used in these systems are not objective representations of reality. They are the culmination of particular tools, people, and power structures” (Benjamin 6). Particular people choose what to feed into the algorithm, using datasets of photos. If AR filters are created by teaching the technology what a face looks like, the datasets are photos of people. And if most of those faces are white and share certain features, the technology now carries bias and the ability to harm.
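The point about skewed datasets can be made concrete with a deliberately simplified sketch. Everything below is hypothetical (a one-number “face feature,” two made-up groups, a naive threshold detector; real face detection is far more complex), but it illustrates the pattern: a model fit to data dominated by one group works well for that group and fails for the underrepresented one.

```python
import random
import statistics

random.seed(0)

def sample(mean, n):
    # Toy "face feature" values for a group (e.g. average brightness).
    return [random.gauss(mean, 0.05) for _ in range(n)]

# Hypothetical skewed training set: 95 faces from group A, only 5 from group B.
train = sample(0.8, 95) + sample(0.4, 5)

# "Train" a naive detector: accept anything close to the training mean.
mu = statistics.mean(train)
sigma = statistics.pstdev(train)
lo, hi = mu - 2 * sigma, mu + 2 * sigma

def detects(x):
    return lo <= x <= hi

# Evaluate on balanced test sets for each group.
test_a = sample(0.8, 1000)
test_b = sample(0.4, 1000)
acc_a = sum(map(detects, test_a)) / len(test_a)
acc_b = sum(map(detects, test_b)) / len(test_b)

print(f"group A detection rate: {acc_a:.2f}")  # near 1.0
print(f"group B detection rate: {acc_b:.2f}")  # near 0.0
```

The detector never "chose" to exclude group B; its notion of what counts as a face was simply built from a dataset in which group B barely appears.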

Expanding on that problem leads to the amount of power that the few who design facial recognition technology, and technology in general, hold over us. Who are they, and what are their values? Do they recognize bias? What datasets are they using to test their designs? Furthermore, including a feature in an app is intentional; it does not appear out of thin air. I think it is easy not to think deeply about the technology we use because we take its efficient design for granted.

If we look at Snapchat, someone, or a group of people, decided it made sense to automatically apply a skin-lightening filter to the camera feature. I did not even realize it was happening to me until one day I tapped the camera button and held it down, and noticed that after about a second the screen adjusted so that my skin was its actual color; it was much lighter when I took a photo with a quick tap. This discrepancy between a lightened version of my face in a quick-tap photo and my normal shade in a press-and-hold video matters because it is one of the many ways in which we are quietly unlearning what our own faces look like, and normalizing harmful ideas of colorism. It did not register for me until I stumbled upon it by accident; in the past, I had assumed that if my skin looked a bit different, it was because of the lighting in my environment.

Colorism is another major component of the harm that filters do. Historically, colorism has operated through skin-lightening creams, positions of privilege granted to lighter-skinned people over darker-skinned people, and other methods of separating by tone. And it does not only uplift lighter skin as beautiful; it has consequences for others: Time magazine darkened the skin in O.J. Simpson’s mugshot to “evoke a more ‘dramatic’ tone,” playing into a long-standing pattern of associating darker skin with criminality. Even the camera itself was built with bias. For a long time, the available color films and chemicals could only really capture the skin tones of white people well. The Shirley card is a testament to what was considered “standard” and “normal” when calibrating photography for printing.

It is a similar case to how facial recognition software has been trained on datasets of white faces, making it difficult for it to recognize the faces of people of color, especially Black people. The harm is that it creates a standard for “normal” in which white faces are treated as inherently neutral. I am a relatively pale, lighter-skinned individual; even as a woman of color, I do not face much colorism. But I cannot help but worry about these effects on those whom colorism does harm, and about the increase in filters that alter parts of the face.

The fact that there are now terms for harmful trends related to filters is concerning in itself. For one, there is the idea of the “Instagram Face.” According to child psychologist Dr. Helen Egger, the features put on a pedestal are high cheekbones, poreless skin, cat-like eyes, and plump lips. “Snapchat dysmorphia” describes people bringing filtered photos into plastic surgery consultations, hoping to permanently achieve the look of a filter. I have noticed something similar: people imitating the look of filters with their makeup, even overlaying a filter as they apply it. The attachment many of us have to the way we look with these filters recycles the beauty phenomena we now criticize in celebrities and pop culture, but it is far more pervasive given how accessible and easy to use filters are.

The growth in the number of marketing filters on Instagram and Snapchat should also be a subject of concern, because much of the technology we cannot live without in our day-to-day lives is ultimately trying to turn a profit. Most filters nowadays blend and smooth the skin by default, and this is a problem because the effect is present no matter what the filter’s purpose is, even when that purpose is simply to promote a product. Branded filters, which on Instagram are labeled as “sponsored -[filter name]”, promote anything from new movie releases to luxury fashion brands to drinks. While this is all in good fun at face value, it also speaks to the pervasiveness of advertising and branding in every corner of the internet and technology.

Even more telling is the addition of beauty filters where there is no real driving need for them. What reason would there otherwise be to include face-altering filters on a platform like Zoom, whose functionality depends on scheduling and holding meetings, not “beautifying” participants? It makes sense to me to blur one’s background for privacy reasons; the ability to blur one’s face to smooth the skin does not.

All of these reasons lead to a muddled sense of self. The consequence of AR filters is that we lose our sense of self in favor of whatever companies believe they can profit from, which is the age-old game of social comparison and beauty. To a certain extent, I believe that the people in charge do hold these beliefs about what a “beautiful” or acceptable look should be. But above all else, I think they roll out these features because they can make a big profit and drive up engagement on their platforms. This is particularly where I look to Instagram, which is losing ground to TikTok. It is not that filters directly cause this, but rather that these social media oligarchies pass around the same few features, popularizing one another in pursuit of a competitive edge. Neither innovation nor a strong opinion on beauty prompts these companies to take advantage of their users.

In my opinion, the ease with which people have accepted these filters and practices is concerning for the ways we view ourselves and each other. Seeing so many skin-blurring, skin-lightening, and face-altering filters can make us want to achieve these looks, by whatever means possible.

One method is to create our own filters, and now almost anyone can. Instagram, for example, has its Spark AR Studio, and Snapchat has Lens Studio. That means anyone can alter their face to their own specific liking and needs, rather than picking from a set created by the company, which many would assume is trustworthy and deliberate in what it puts out. However, these companies are the ones that started the trend of reliance on Eurocentric, harmful beauty standards, and they are letting everyone believe in them.

I also think there is a difference between the harm of editing software, such as Photoshop or Facetune, and the harm of filters. The former requires active effort from the individual in a social media context, while the latter alters one’s face with far less deliberate effort. Again, the lack of clarity about what exactly some beauty filters change, and Snapchat’s automatic skin lightening in its camera feature, are cases of those with the power to create and disseminate this technology imposing their harmful ideas on others without their knowledge.

Strides have been made toward transparency, however, with platforms like Instagram and TikTok providing labels whenever a filter, beauty or otherwise, is used. This is especially important given how advanced the technology has become; it can be extremely difficult to pinpoint which parts of the face have been altered. And given the younger age range of these audiences, specifically TikTok’s, such labeling is necessary, as young users are impressionable and still developing their sense of identity.

References

Benjamin, Ruha. “Introduction: The New Jim Code.” Race after Technology: Abolitionist Tools for the New Jim Code, Polity, 2020.

Fong, Joss, and Dion Lee. “Snapchat Filters: The Engineering behind Augmented-Reality Selfies.” Vox, Vox, 28 June 2016, https://www.vox.com/2016/6/28/12046792/how-snapchat-filters-work.

Haines, Anna. “From ‘Instagram Face’ to ‘Snapchat Dysmorphia’: How Beauty Filters Are Changing the Way We See Ourselves.” Forbes, Forbes Magazine, 10 Dec. 2021, https://www.forbes.com/sites/annahaines/2021/04/27/from-instagram-face-to-snapchat-dysmorphia-how-beauty-filters-are-changing-the-way-we-see-ourselves/?sh=222126de4eff.

Spratt, Vicky. “‘I Feel Increasingly Weird about My Face’: Are Instagram Filters Causing Body Dysmorphia?” Refinery29, https://www.refinery29.com/en-gb/instagram-face-filters-dysmorphia.

Winner, Langdon. “Do Artifacts Have Politics?” Daedalus, 1st ed., vol. 109, The MIT Press, Cambridge, MA, 1980, pp. 121–136.
