A large portion of my projects relate to emotion. I believe that sensing and using emotion-related information is the next large step technology will take. This information can be used to create technology with a more subtle understanding of when to assert its functions and when to fade into the background, or even choose not to function at all. We will be able to filter memories by emotion, avoiding problems like Facebook resurfacing painful memories. Notifications from devices will choose intelligently when to be pushy and when not to, depending on whether we want to be bothered. The biggest question I have after this semester is still about values, and how to design ethically without overly imposing my own values on the user. Data collection is a difficult subject: I believe emotional data can be used to create better technology and services, but the designer must impose his or her own values when deciding how that data is collected and analyzed. These values skew the results and may hinder the data's ability to represent reality.
A beautiful, relaxing physical environment is conducive to both work and play. But what about the emotional environment…
The wallflower is one of my first projects on emotion. I wanted to make collective emotion visible by creating a visual indicator of a room's sentiment. The wallflower is a digital plant that, through speech recognition, listens to the conversations in a room: if the conversation is positive, it grows; if it is negative, it dies. People often don't realize the atmosphere their words create for others, and a single negative person can set off a vicious cycle of negativity. The plant tries to break that cycle by giving a clear visual reminder of what the more abstract emotional environment of a space looks like.
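The plant's behavior can be sketched as a simple feedback loop. This is a minimal, hypothetical version assuming a keyword-lexicon sentiment score; the word lists, names like `update_growth`, and the step size are illustrative assumptions, not the project's actual implementation:

```python
# Hypothetical sketch of the wallflower's core loop: score each
# transcribed utterance and nudge the plant's growth up or down.
# Word lists and parameters are illustrative assumptions.

POSITIVE = {"great", "love", "thanks", "happy", "nice"}
NEGATIVE = {"hate", "awful", "terrible", "annoying", "ugh"}

def sentiment_score(utterance: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = utterance.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def update_growth(growth: float, utterance: str, step: float = 0.1) -> float:
    """Grow on positive talk, wither on negative; clamp growth to [0, 1]."""
    growth += step * sentiment_score(utterance)
    return max(0.0, min(1.0, growth))

growth = 0.5
growth = update_growth(growth, "I love this idea, great work")  # plant grows
growth = update_growth(growth, "ugh this is terrible")          # plant withers
```

A real version would sit behind a continuous speech-recognition stream and drive the plant's rendering from `growth`, but the value-laden choice of which words count as "negative" is already visible in the two word lists above.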
A flaw in this project is the assumption that users will allow their conversations to be recorded. In reality, users may avoid the vicinity of the recording device to retain their privacy. The design embeds my value judgment that people will accept a loss of privacy in return for a reminder to stay positive. A second flaw is my analysis of the collected data. In the wallflower project, I assert the personal value that negative words are bad and should be avoided. This analysis of language leaves no room for humor or sarcasm, nor does it allow for negative conversations used as a cathartic release of stress.
As a reaction to my self-critique of the wallflower project, a team and I created a second project that takes emotional analysis too far.
The truth goggles replace the top half of the face with a virtual version. This virtual face changes expression…
The truth project is a pair of goggles that replaces half of the user's face and, using heart-rate data, displays the "true" emotions the user may be trying to hide. This project exposes the sinister side of the wallflower idea. While the goggles can help communicate emotions the user may not be aware of, or be unable to verbalise, the requirement of total emotional transparency highlights the pitfalls of taking away all of the user's privacy.
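The inference step can be imagined as a mapping from heart-rate readings to a displayed label. The thresholds, baseline, and labels below are illustrative assumptions for the sake of the sketch, not the goggles' actual model, and they show exactly where a designer's values leak in: someone has to decide which physiological signal means which emotion:

```python
# Hypothetical mapping from heart-rate samples to a displayed "true"
# emotion. Baseline, thresholds, and labels are illustrative assumptions.

def infer_emotion(bpm_samples: list[float], resting_bpm: float = 65.0) -> str:
    """Label arousal by how far the mean heart rate sits above a resting baseline."""
    mean_bpm = sum(bpm_samples) / len(bpm_samples)
    delta = mean_bpm - resting_bpm
    if delta > 25:
        return "highly aroused (stress or excitement)"
    if delta > 10:
        return "elevated (nervous, engaged)"
    return "calm"
```

Note that even this toy version cannot distinguish stress from excitement; the label the wearer is forced to display is a designer's interpretation of an ambiguous signal.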
In the end, there is unfortunately a delicate balance between understanding a user and invading the user's privacy. And as technology improves, this balance will only become more difficult to navigate.