If you are a woman with what comedian Taylor Orci named “bitchy resting face,” you are commonly interrupted with requests to “smile.” This may arrive as a catcall or as friendly advice from someone you know. There is even a street art project by Tatyana Fazlalizadeh called Stop Telling Women to Smile that points out why this interaction is so unpleasant. The seemingly benign request “smile for me” can quickly escalate to vicious harassment, and in every instance it demands compliance.
In public-facing occupations, a person is paid to disguise “bitchy resting face” and engage with customers using interpersonal skills that are necessary for the job but nearly impossible to measure.
But what if a cafe clocked every time a barista got a customer to smile and rewarded its staff depending on their scores? A comedy club in Barcelona is testing a similar incentive program. It uses facial recognition to record audience responses and pays performers 0.30 euros per laugh. Recording physical responses rather than “like” button activity makes the data appear less open to interpretation: it is much harder to control our physical responses than to opt in with a mouse click. Projects like the Barcelona laugh tracker raise questions about user agency and how we value the emotional labor of workers who engage with the public. Why not trust a customer’s own story? Why should we believe what a smile tracker reports any more than the responses to a paper-and-pencil customer satisfaction survey?
Rosalind Picard coined the term “affective computing” to describe the computational simulation of empathy and research into emotions; that is, giving computers “the skills of intelligence that have to do with understanding human emotion.” This research breaks down the complex motivations and negotiations in human interactions as simple as a customer ordering coffee. Attempts to automate emotional labor can demonstrate just how much is expected of a worker who may be paid no more than minimum wage.
All the devices we now see recording bodily activity — heart rate, steps taken, hours of sleep — soon enough may try to read our thoughts and feelings. Dell recently announced its plans to release a “mood-sensing app” in 2017. Whether your knee-jerk response is to compare it to Minority Report or to a liquid crystal thermometer on a gold band, this is where technology is heading. Invasion of privacy is inevitable with these kinds of projects. It is fair to describe any and all innovation in this field as “creepy.” Yet some projects are proving useful for drawing attention to how much we expect from someone in a line of work that requires a welcoming attitude.
Here is a project imagined to offset “bitchy resting face” and make the life of a service worker easier. Tsukuba University’s Hirotaka Osawa designed a proof of concept called “AgencyGlass.” The goggles give the impression of smiling alert eyes. Osawa says the device is designed to “decrease emotional labor.”
AgencyGlass reveals the complexity and difficulty of professions like barista, nanny, flight attendant, community manager, or teacher. Women are most commonly employed in professions that require emotional labor and affinity with customers and coworkers, and consequently they are underpaid for their services. The appearance of enjoying the job is part of the job, so the job itself is mistaken for its own reward.
The “AgencyGlass” video ends with a shot of Osawa napping while he appears to be at work, his goggle-eyes fixed in a relaxed expression of concentration. In addition to making emotional labor visible, the device reveals our own expectations when we interact with someone whose job is to focus on our needs. Is it fair to expect kind eyes and a wide grin from a barista we tipped only 25 cents? A flustered waiter or stressed-out hairdresser might be having a rough day.
Frightened by the idea of emotion-spoofing/emotion-reading machines? Get in the habit of generously tipping service employees with “bitchy resting face.” There’s nothing more human.