How to Record Emotion

Our “data” are not so self-evident, their nature not at all to be taken for granted.

A further objective of this project is to create a system with which a person can record an emotion experience. I designed the system so that a user can adopt it in both analog and digital form. Most of the testing of the system was done through the Group Recording experiment and the Self Experiment. These experiments revealed the importance of recording daily events, which led to the inclusion of triggers and reactions as categories for daily data collection. The final system first uses the five dimensions to capture the emotion experience holistically, before the user moves on to reflect on the stimuli and their reactions.

Fig. 31 A snippet of the records from the year-long Self Experiment.

Though a user can adopt this system in an analog manner, I designed it to function together with the visual system in the digital application. Because of this, the structure itself is not visible; instead, a user is guided through the same sequence they would follow when filling out the structure on paper. The emotion history created through continual use of the application is stored in a database in the same arrangement as the original system.

Each feature of Fine. is meant to encourage reflection. Tracking an emotion experience is broken down into four phases: Naming (emotion labeling), Dimensions, Triggers, and Reactions. Because this takes more time than other, automated quantified-self applications, the linear user experience is designed to be fluid and quick to move through. Its simplicity acknowledges the need for as few barriers as possible between the user and entering their data consistently.

Each phase of tracking represents a phase of the emotion process. The phases are ordered not in the sequence in which we experience them, but beginning with the most important step in the reflection process: Naming.
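The four phases map naturally onto a simple record structure. As a sketch only, with field names that are my own assumptions rather than Fine.'s actual data model, one tracked experience could look like this:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Hypothetical sketch of one tracked emotion experience.
# Field names are illustrative assumptions, not the app's schema.
@dataclass
class EmotionRecord:
    name: str                          # Phase 1: the emotion label
    dimensions: Dict[str, float]       # Phase 2: one slider value per dimension
    trigger_category: str              # Phase 3: one of the five categories
    sub_trigger: Optional[str] = None  #   e.g. a specific person or place
    reactions: List[str] = field(default_factory=list)  # Phase 4
    notes: str = ""                    # added on the final overview screen

record = EmotionRecord(
    name="nostalgia",
    dimensions={"intensity": 0.7},     # the app records five such values
    trigger_category="memory",
    reactions=["wrote in journal"],
)
```

Stored in this arrangement, each record preserves the same sequence a user would follow when filling out the structure on paper.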


Naming the emotion through language, known as “affect labeling,” is widely accepted to be the first step toward emotional awareness, recognition, and understanding. In the app, naming directly reflects the “inferred cognition” part of the emotion process. A user is given the option either to type in the name of the emotion(s) they experienced or to ask for help determining a label that fits. Asking for help presents the user with Robert Plutchik’s eight basic emotions and, on a further tap, a list of their dyads.

Having the option to explore possible emotion labels addresses the needs of patients who have not yet developed a vocabulary for emotions. Using Plutchik’s basic emotions as a foundation helps users practice terminology also used by social scientists and emotion researchers. Through recollecting the emotion experience, its triggers, and their reactions, the user becomes better able to label an emotion. At any point while recording with Fine., a user can return to the first phase and relabel their emotion.
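Plutchik's eight basic emotions sit on a wheel, and his primary dyads are blends of adjacent pairs (joy and trust blend into love, for instance). A minimal sketch of the kind of lookup the labeling help could perform, with the helper name `suggest_label` as my own invention:

```python
# Plutchik's eight basic emotions in wheel order; adjacent pairs
# blend into the primary dyads (e.g. joy + trust = love).
PRIMARY_DYADS = {
    frozenset({"joy", "trust"}): "love",
    frozenset({"trust", "fear"}): "submission",
    frozenset({"fear", "surprise"}): "awe",
    frozenset({"surprise", "sadness"}): "disappointment",
    frozenset({"sadness", "disgust"}): "remorse",
    frozenset({"disgust", "anger"}): "contempt",
    frozenset({"anger", "anticipation"}): "aggressiveness",
    frozenset({"anticipation", "joy"}): "optimism",
}

def suggest_label(a: str, b: str) -> str:
    """Return the dyad name for two basic emotions, if one exists."""
    return PRIMARY_DYADS.get(frozenset({a, b}), f"{a} + {b}")
```

For example, `suggest_label("joy", "trust")` returns `"love"`, giving a user without a developed emotion vocabulary a concrete word to try on.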

Fig. 33 Screens from the emotion labeling phase in the application.


In the second phase, a user records psychological and physiological evaluations of their emotion experience. These are the five dimensions, each of which is recorded through a slider interaction guided by colloquial phrases. These phrases are based on how we express the dimensions when we talk about our emotions and perceived feelings (Figure 34). Many of them are drawn from the research of George Lakoff and Mark Johnson in Metaphors We Live By, as well as Zoltán Kövecses’s book Metaphor and Emotion: Language, Culture, and Body in Human Feeling. This is an effort to keep clinical language out of the experience, since many of us are unfamiliar with it.

When a user interacts with a slider, the visualization builds out in real time. As discussed in the last section, the visualization is complex in an effort to reflect the complexity of an emotion experience. It is imperative that the user sees how the visual representation is changed by an individual dimension. The more often the user sees those changes, the more familiar they become with the system, making it easier to memorize and recall when using the analysis feature.

Fig. 34 Screens from the dimension tracking phase in the application.


Recording a trigger represents the “stimulus” part of the emotion process. Triggers can range from external events, like natural phenomena and the behaviors of others, to internal events, like our own behaviors when we, for example, reflect on memories and feel nostalgia.

In the application, a user is able to choose between five categories when reflecting upon what caused the emotion they’re tracking. The triggers are visualized as the base of the form, supporting Freud’s iceberg metaphor.

We don’t always think first of what stimulated our emotion experience, especially when we aren’t conditioned to the trigger or expecting it. Instead, we’re more aware of our physiological response to it: how we feel (identified with the dimensions) and sometimes what we feel (identified through naming). Improved recognition of triggers increases the likelihood that we can adapt to them in the future. After choosing a trigger category, a user can go on to provide a more specific sub-trigger, like naming a person or an exact location. The more detailed the responses, the more thorough the dataset, and thorough datasets supply the user with more information to use in the analysis feature of the application.
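The value of detailed sub-triggers is easy to illustrate. A hedged sketch of the kind of summary an analysis feature could compute over accumulated records (the category names and record shape here are my own placeholders, not Fine.'s actual data):

```python
from collections import Counter

# Hypothetical records: each has a trigger category and a sub-trigger.
records = [
    {"trigger": "people", "sub_trigger": "coworker"},
    {"trigger": "people", "sub_trigger": "family"},
    {"trigger": "environment", "sub_trigger": "weather"},
    {"trigger": "people", "sub_trigger": "coworker"},
]

# Coarse view: which categories recur most often.
by_category = Counter(r["trigger"] for r in records)

# Finer view, only possible because sub-triggers were recorded.
by_detail = Counter((r["trigger"], r["sub_trigger"]) for r in records)
```

Here the coarse count shows that people trigger most experiences, but only the sub-trigger count reveals that one specific person recurs, which is exactly the kind of pattern a user could adapt to.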

Fig. 35 Screens from the trigger tracking phase in the application where the user is able to specify a trigger in detail.


The addition of triggers and reactions to the recording process helps anchor an emotion experience. Emotions don’t happen out of thin air. Being able to identify what caused them and what impulses we had in response directly builds emotional awareness.

The reaction to an emotion is both the last part of the emotion process and the last phase of tracking in the app. Often our reactions are the only part of the emotion process observable to a person not experiencing the emotion. The reaction is also our attempt at bringing ourselves back to emotional equilibrium. According to the emotion process, we may cycle through many reactions until we rebalance, which makes identifying reactions an important task in gaining emotional awareness. Reacting appropriately to stimuli or physiological changes helps bring us back to equilibrium much more quickly.

Recording concludes with a full-screen overview of the emotion experience. This allows the user to view the full visualization for the first time (rather than in sequence) and to add specific notes to the experience.

Fig. 36 Screens from the action recording phase in the application.

Direct Manipulation

In the conclusion chapter of this thesis, I highlight the need for further user testing of the application in order to continue iterating on these types of tools. There is one specific interaction style that I think would lend itself nicely to this application’s content: direct manipulation. Currently, the metaphors appearing in the tracking of dimensions are the link between the digital form and what would be a real manipulation of a physical object. Actually making the object bigger, for example by pinching and zooming to indicate a higher intensity, would bring the user even closer to the representation of the emotion. Direct manipulation gives a user a greater perceived sense of control and is itself an expressive act. A gesture-based build-up of the visualization could be one possible method for addressing emotion data’s flexibility.


An overview of this project and a link to the log book of my process can be found online here.