Goal
The goal of the activity was to create a self-tracking experience that includes physical, digital, data-visualization, and longevity (usage over time) components. I centered my design on creating engagement with a commonly tracked activity, sleeping, and on developing non-judgmental, abstracted representations of a user’s data for self-reflection.
The majority of sleep-tracking devices and apps compare sleep data to a goal, whether that is an average duration per night or a percentage consistency between nights. However, in my personal experience, sleep quality is subjective, shaped by how rested one feels and how much energy one has during the day. As a result, rather than focusing on the user’s deviation from a goal, this design highlights the variation and consistency within sleep-related data so the user can develop a sense of their own sleep pattern.
Process
I began by choosing a user whose data I knew personally and anecdotally: my partner, with whom I share a bed. I listed his behaviors and actions related to gaining or using energy, such as sleeping, eating, bathing, working, performing, biking, and drinking water. From each behavior I derived measurable parameters; for example, sleeping includes start time, end time, duration, and disturbance frequency (where disturbances are also measured in time).
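As a rough illustration of what those derived parameters could look like in practice, a single night of sleep might be recorded as in the sketch below. The field names are my own assumptions for illustration, not a finalized data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List

# Hypothetical schema for one night of sleep; field names are
# illustrative assumptions, not a finalized data model.
@dataclass
class SleepRecord:
    start: datetime                      # time the user fell asleep
    end: datetime                        # time the user woke up
    disturbances: List[timedelta] = field(default_factory=list)  # duration of each wake-up

    @property
    def duration(self) -> timedelta:
        """Total time asleep: time in bed minus time spent awake during disturbances."""
        return (self.end - self.start) - sum(self.disturbances, timedelta())

# Example: asleep 23:30 to 07:30 with two brief disturbances
night = SleepRecord(
    start=datetime(2022, 3, 1, 23, 30),
    end=datetime(2022, 3, 2, 7, 30),
    disturbances=[timedelta(minutes=10), timedelta(minutes=5)],
)
print(night.duration)  # 7:45:00
```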
From the list I organized the activities into three categories (sleep, body, and mood or state of mind), which served as the left column of a table noting which sensors or data-collection methods could capture each one. Some activities, such as phone usage and dreaming, were difficult to place.
With contextual knowledge of my user, I brainstormed affordances or personal objects that the physical experience could build on. My partner wears earrings that I know he never removes, and he may sometimes use a sleep mask, so I searched for research suggesting those wearables are plausible points of data collection. Research conducted by MIT showed that heart rate could be measured from the earlobe using photoplethysmography (PPG). Similarly, research from the University of Manchester suggested using a sensor-embedded eye mask to measure rapid eye movement (REM) via electrooculogram (EOG).
Outcomes
Defining the use case led to prototyping via sketching and wireframes. I explored ways the experience could engage the user. Considering my user’s context as a musician, I wondered if sonification of the data would generate creative inspiration. I also believed analogy and narrative would be rich sources of interest for users unfamiliar with their biometric data. Inspired by the story of The Princess and the Pea, could a visualization of layers of bedding represent the multiple factors keeping one awake at night? Would variability in sleep duration produce interesting line-graph shapes that could lend themselves to a landscape scene for the user to explore?
With the engagement methods above in mind, I began sketching wireframes for an accompanying app, including onboarding, energy visualization, sleep-quality visualization, user-feedback variations, sonification, and a narrative landscape.
The sketches helped uncover potential issues that would arise with real datasets. The next step was finding sleep datasets that could be explored, interpreted, and sonified to see whether the outcomes matched my ideas.
Sleep-duration visualizations were created using sleep datasets from Kaggle and data.world along with Figma’s chart-generating plugin, Datavizer. In Figma, point callouts were added to prototype the experience of a user exploring their own data and seeing the leading factor behind each night’s sleep at each data point.
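A comparable chart can also be sketched directly from one of these datasets in code. The example below is a minimal sketch, assuming a CSV with `date`, `sleep_duration`, and `sleep_quality` columns; the file name and column names are my assumptions, not those of a specific Kaggle or data.world dataset.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed columns: date, sleep_duration (hours), sleep_quality (e.g. a 1-10 score).
# Column names are illustrative; adjust to match the actual dataset.
df = pd.read_csv("sleep_data.csv", parse_dates=["date"])

fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(df["date"], df["sleep_duration"], marker="o", linewidth=1)

# Annotate each point, mimicking the point callouts added in Figma.
for _, row in df.iterrows():
    ax.annotate(f'quality {row["sleep_quality"]}',
                (row["date"], row["sleep_duration"]),
                textcoords="offset points", xytext=(0, 8), fontsize=7)

ax.set_xlabel("Night")
ax.set_ylabel("Sleep duration (hours)")
ax.set_title("Sleep duration per night")
plt.tight_layout()
plt.show()
```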
The narrative-landscape idea was difficult to prototype quickly; it would most likely require generative tools such as Grasshopper or other dataset-driven 3D modeling. Figma’s Vectary 3D Elements plugin provided an interesting ridged surface that could suggest the data landscape I had in mind. Using the same callouts from the wireframe, I wondered if the user could explore areas of the map to investigate their sleep habits. Could a mountain represent late-night eating habits, or a waterfall the effects of overhydration?
Using TwoTone, an open-source web app for data-driven music and data sonification from Sonify, sleep duration was sonified into audio tracks that play a note or arpeggiated scale for each row of data. For example, a sleep duration of 8 hours might result in the note C, while 6 hours might be the note A. I split the sleep-duration data into multiple tracks, filtering on the sleep-quality column and assigning each track a different instrument (piano, double bass, glockenspiel, etc.). The result was a unique, wandering song in which areas of sleep variability added an unpredictable element and periods of consistency felt repetitive. The song can be heard here.
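The kind of mapping TwoTone performs can be approximated in code. The sketch below is my own rough re-implementation of the idea rather than TwoTone’s actual algorithm: durations are linearly mapped onto a scale of notes, and rows are split into per-quality “tracks” that would each get their own instrument. The file name, column names, scale, and duration range are all illustrative assumptions.

```python
import csv
from collections import defaultdict

# A C-major scale to map values onto; TwoTone offers similar scale choices,
# but this particular mapping is an approximation, not its exact algorithm.
SCALE = ["C", "D", "E", "F", "G", "A", "B"]

def duration_to_note(hours: float, lo: float = 4.0, hi: float = 10.0) -> str:
    """Linearly map a sleep duration (in hours) onto a note of the scale."""
    hours = max(lo, min(hi, hours))                       # clamp to the expected range
    index = round((hours - lo) / (hi - lo) * (len(SCALE) - 1))
    return SCALE[index]

# Split rows into per-quality "tracks", echoing the filtering on the
# sleep-quality column in TwoTone (each track then gets its own instrument).
tracks: dict[str, list[str]] = defaultdict(list)
with open("sleep_data.csv", newline="") as f:             # assumed file and column names
    for row in csv.DictReader(f):
        note = duration_to_note(float(row["sleep_duration"]))
        tracks[row["sleep_quality"]].append(note)

for quality, notes in tracks.items():
    print(f"quality {quality}: {' '.join(notes)}")
```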
After further research I found existing apps that sonify data from smartphone sensors: Holon (free version) and Holonist (Pro version) from Holonic Systems. Holonist can output the data as MIDI or OSC for creative musical use.
While the design sprint ended with an unfinished product system, it yielded an exploration of abstract visualization techniques, a literature review of sleep-related wearable technology, and ideas to investigate further.
Learnings
The exercise allowed me to play with quantified-self data to create new experiences from the act of sleeping. Tuning the ideal data experience means understanding the user’s comfort with, and literacy in, data and its interpretation.