Amazon’s Echo Look: We’re Going a Long Way Back, Baby

Harnessing the Power of Machine Learning for Dynamic Narcissism and Exploitation of Human Vulnerability

By S.A. Applin, April 27, 2017

Image from 10 Sexist Vintage Ads

Here we are in 2017, yet somehow back in the 1950s, where how women look and present themselves to the world is perceived as a problem so severe that they must sacrifice their privacy, security, and trust to Amazon’s algorithms in order to win societal acceptance for their fashion choices.

When I first heard about Amazon’s Echo Look, I was going to give it the benefit of the doubt. I know one of the designers, and she is a sensible sort of person, a fine academic with great credentials. I know her heart is in her work.

However, one person working for Amazon with those attributes doesn’t have the agency to control the outcomes of the larger Amazon machine. Even with the best intent, the Amazon Echo Look is a privacy, trust, ethics, security, Public Relations, cultural, and gender nightmare. Furthermore, this clearly shows Amazon throwing down a gauntlet to the retail industry, with the Amazon Echo Look as Trojan Horse, partially disguising the real threat of Amazon’s intent to ‘pwn’ retail sales by replacing retail sales associates, dressing rooms, and retail shopping with Amazon’s algorithms, software, and supply chains. This can succeed as long as consumers prefer to disengage from society and instead interact only with deliveries. This is the first chapter of Snow Crash, and Amazon’s central theme.

For those of you unfamiliar with this new offering, the Amazon Echo Look combines software and Alexa voice control for the Amazon Echo (a device that sits in the home and controls connected functions via voice commands) with a camera. Echo Look enables a person to check whether what they are wearing is stylish and aesthetically pleasing. This is done by capturing video and/or photos in combination with Echo Look’s Alexa voice control. To determine if one looks “okay” before leaving the house, one no longer has to ask a human being: Amazon offers a techno-solution, in which the primary relationship for a person’s vulnerability about their dress sense is entrusted to Amazon’s technology, which matches their photos against others online or provides feedback based on Machine Learning algorithms. This is all conveniently described in the Amazon Echo Look promotional video, where the Echo Look is shown located in someone’s bedroom or dressing area. If there isn’t a “good look,” Amazon helpfully suggests garments to order (through Amazon) to complete one’s outfits. Perhaps soon, garments will be custom made, just-in-time, with Amazon’s recently granted patent for ‘on-demand apparel manufacturing.’ However, what makes a “good look” is subjective, and includes cultural norms as well as aesthetics. The Amazon Echo Look algorithms would have to be mighty sophisticated to address all that a “good look” encompasses.

I could make the obvious points about the Echo Look’s utility if someone lives alone, doesn’t have any friends, or has a hard time dressing stylishly. However, the Amazon Echo Look seems to be the worst type of digital narcissistic pool possible to fall into. It’s dynamic, ongoing, and endless. The depths to which Amazon is willing to go to enable someone to be endlessly photographed, video recorded, and scrutinized by Machine Learning, under the guise of organizing their wardrobe for looking “okay,” cross thresholds of personal privacy, security, personhood and identity, and of ethics in general. What isn’t wrong with a big company visually invading our dressing space, our most private place, where we create who we are for the day? In our underwear?

Echo Look is embedded within the Amazon ecosystem (Echo/Eco?) that knows more about us than we can imagine, drawing on long-running data collected from Amazon.com shopping, Amazon Prime viewing, and, if we use a Kindle, what we read. Amazon knows what we watch (and how we watch it), what we read (and how we read it), what we purchase (and how we purchase it), and now what we look like; and not only that, but how we look at ourselves and how we assess our image and identity in our private living spaces.

I’m torn among the many directions I could go in to explain why this is just the wrong product. It could be the tracking, or our insecurity-driven decisions about our “presentation of self” being recorded (and saved forever); it could be the ethical issues within the entire Amazon framework of surveillance products masquerading as services, entertainment, and knowledge; or it could be the promotional video Amazon released to advertise the Echo Look. The video seems aimed at women: it features a female narrator and shows women looking in their closets and deciding on wardrobe while asking Amazon’s Echo Look for advice. In addition, the video contains two tiny clips briefly showing the back of a man facing a closet, where every other part of the promotional video shows women from the front, posing, being photographed, and interacting with Amazon by voice or through the software app. (At the end of the video, we see the man’s face briefly, but as part of the styling process.)

From a Public Relations perspective, the timing of this product release could not be worse. We are currently in the midst of ethics debates concerning the role of humans in autonomous vehicles, Machine Learning applications, and AI systems; women in the technology industry are fighting for equal representation and pay; there is global concern over the labor practices of cheap, disposable clothing and its unsustainability; and overall consumption patterns and economies are changing worldwide. Furthermore, the United States now has a President with a ‘retro’ attitude towards women’s roles. Whether it intends to or not, the Amazon Echo Look supports that attitude, dialing us back to those “do I look okay?” days of yore. For years, I’ve said that “voyeurism and narcissism sell software.” The Amazon Echo Look blatantly capitalizes on this.

Amazon CEO Jeff Bezos owns The Washington Post, whose reporters have continually covered the current U.S. President’s administration and its retro attitudes towards women. In this context, the introduction of the Amazon Echo Look must come as a surprise to many who viewed Amazon as an ally: in their hope for a President who would continue to promote equality, and in the dismantling of long-entrenched sexist norms that have shaped how women are viewed in the world. Perhaps even for Bezos, this hope for equality was too much to bear when pitted against the opportunity to collect “behavioral data” for Amazon’s growing empire and its ambition to disrupt the retail industry. However, in doing so, Amazon is disrupting us, using women as experimental lab rats in its beta test of just how far it can go to siphon our behavior, desires, insecurities, privacy, dreams, and hopes.

We are now in a place where we can ask Amazon’s algorithms if we look okay, as if Amazon’s ability to judge who we are and how we express ourselves were possible or even valid. The terrifying thing is that if we use Amazon, it is.