DL
Nov 23, 2015 · 3 min read

A large portion of my projects relate to emotion. I believe that sensing and using emotional information is the next large step technology will take. This information can be used to create technology with a more subtle understanding of when to assert its functions and when to fade into the background, or even choose not to function at all. We will be able to filter memories by emotion, so problems like Facebook resurfacing painful memories can be avoided. Notifications from devices will smartly choose when to be pushy and when not to, depending on whether we want to be bothered. The biggest question I have after this semester is still about values, and how to design ethically without overly imposing my own values on the user. Data collection is a difficult subject: I believe emotional data can be used to create better technology and services, but the designer must impose his or her own values when deciding how to collect and analyze that data. These values skew the results and may hinder the data's ability to represent reality.

The wallflower is one of my first projects on emotion. I wanted to make collective emotion visible by creating a visual indicator of a room's sentiment. The wallflower is a digital plant that listens to conversations in a room through speech recognition: if the conversation is positive, it grows; if the conversation is negative, it dies. People often don't understand the atmosphere their words create for others, and one negative person can set off a vicious cycle of negativity. The plant tries to break the cycle by giving a clear visual reminder of the otherwise abstract emotional environment of a space.
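The core loop described above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code: the word lists, the health scale, and the simple word-counting heuristic are all assumptions standing in for the real speech-recognition and sentiment pipeline.

```python
# Hypothetical sketch of the wallflower's loop: score each utterance with
# a toy word-list sentiment heuristic (the real project's analysis is not
# described), then nudge the plant's health up or down accordingly.

POSITIVE = {"great", "love", "happy", "wonderful", "thanks"}
NEGATIVE = {"hate", "awful", "terrible", "annoying", "ugh"}

def sentiment(utterance: str) -> int:
    """Return +1 for net-positive wording, -1 for net-negative, 0 otherwise."""
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

class Wallflower:
    """Digital plant whose health tracks the room's conversational tone."""

    def __init__(self, health: int = 50):
        self.health = health  # 0 = dead, 100 = fully grown

    def hear(self, utterance: str) -> None:
        # Clamp health to the 0-100 range after each utterance.
        self.health = max(0, min(100, self.health + 5 * sentiment(utterance)))

plant = Wallflower()
plant.hear("great work, thanks")      # positive: health rises to 55
plant.hear("ugh, this is terrible")   # negative: health falls back to 50
```

Even this toy version makes the essay's later critique concrete: the designer's values are baked directly into the word lists, and sarcasm or cathartic venting scores exactly the same as genuine negativity.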

A flaw in this project is the assumption that users will allow their conversations to be recorded. In reality, users may avoid the vicinity of the recording device to protect their privacy. The design assumes my value judgment that people will accept a loss of privacy in return for a reminder to stay positive. A second flaw is in my analysis of the collected data. In the wallflower project, I assert my personal value that negative words are bad and should be avoided. This analysis of language leaves no room for humor or sarcasm, and it does not allow for negative conversations used as a cathartic release of stress.

As a reaction to this self-critique of the wallflower project, a team and I created a second project, one that deliberately takes emotional analysis too far.

The truth project is a pair of goggles that replaces half of the user's face and, using heart rate data, displays the "true" emotions the user may be trying to hide. This project exposes the sinister side of the wallflower idea. While the goggles can help communicate emotions the user may not be aware of, or be unable to verbalize, the requirement of total emotional transparency highlights the pitfalls of taking away all of the user's privacy.
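A minimal sketch of the goggles' mapping might look like the following. The essay does not describe how heart rate is actually analyzed, so the baseline value, the thresholds, and the emotion labels here are all invented assumptions for illustration.

```python
# Hypothetical sketch of the truth goggles' heart-rate mapping.
# Assumption: the live reading is compared to a resting baseline, and the
# deviation is mapped to a coarse "true emotion" label for display.

def true_emotion(heart_rate_bpm: float, baseline_bpm: float = 70.0) -> str:
    """Map deviation from a resting baseline to a coarse emotion label."""
    delta = heart_rate_bpm - baseline_bpm
    if delta > 20:
        return "agitated"
    if delta > 8:
        return "excited"
    if delta < -8:
        return "calm"
    return "neutral"

print(true_emotion(95))  # agitated
print(true_emotion(74))  # neutral
```

The sketch also shows why the design is sinister: the wearer has no way to contest the label, because the mapping, however crude, is presented on their face as the truth.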

In the end, there is unfortunately a delicate balance between understanding a user and invading their privacy, and as technology improves this balance will only become more difficult to navigate.

Interaction & Service Design Concepts: Principles, Perspectives & Practices

Graduate Seminar 1, Fall 2015, Carnegie Mellon School of Design, Collection of the Seminar’s Work
