PokeCoach: A Multimodal Approach to Visualizing Personal Exercise Data

Exploring haptic and audio feedback designs for accessibility.

Jong Ho Lee
VisUMD
Dec 15, 2022


The main screen of the visualization prototype, PokeCoach. A line graph shows the heart rate data from the user’s run, which the user can explore using the slider below. Haptic and audio feedback options are also included.

The term “visualization” seems to imply that data should be represented in visual form, whether that’s a pie chart showing the percentage of a population that’s willing to believe in aliens, or a bar graph showing the steady decline in my social outings ever since the pandemic began. However, data visualizations designed exclusively for the human eye are often not accessible to people who are blind or have low vision. Visualizations that rely on dense text to explain what the graphs mean are also less accessible to people who have difficulty reading.

To tackle these problems, researchers have explored improving the accessibility of visualizations by encoding information in audio. This ranges from incorporating sonification (e.g., using higher pitches for higher y-values in line graphs) to make charts “audible,” to carefully crafting audio summaries (e.g., the computer reading aloud “there is an increasing COVID-19 infection trend from December”). While reading research articles on this topic, I became interested in exploring how we can “visualize” data using other senses such as hearing and touch. Being an avid runner myself, I wanted to ask: “how can we make running data graphs more accessible using different modalities?”

With the advent of widespread wearable technology, runners often use these devices to quantify various aspects of their runs. Smartwatches such as the Garmin Forerunner series and the Apple Watch can track the user’s heart rate and cadence (i.e., the number of strides a runner takes in a minute) during a run. Dedicated heart rate bands can provide more accurate heart rate readings, and GPS sensors in smartphones and smartwatches can track the distance and speed of a run. Dedicated runners religiously track these data to understand their condition and to build training plans. One particularly important metric is heart rate, which is often used to estimate the intensity of a workout. Thus, virtually all commercial running apps display this data in one way or another.

Screenshot of the Garmin Connect app showing the heart rate metric of a user’s run.

As shown in the screenshot above, running apps such as Garmin Connect display heart rate data over the course of a run; in this case, a density plot shows how the heart rate changed over the duration of the run.

Although heart rate data is important not just for runners but for other sports and exercises as well, visualizing it without accessibility in mind can exclude people. Thus, I wanted to explore how I could use a combination of haptics and audio to augment a visual heart rate graph of a run. I also wanted to use the smartphone as the platform, since it’s the medium I mainly use to record and view my runs. Drawing inspiration from prior work, I adopted the idea of using audio summaries for audio feedback. I also wanted to use the phone’s vibrations in place of sonification, since I thought sonification could interfere with the audio summaries while the user explores their data. With this vision in mind, I started to design.

My design process started with brainstorming various ways to represent heart rate data using vibrations. While designing, I set and followed this principle: “haptic feedback should clearly distinguish significantly different heart rate values.” After thinking it through, I came up with the following haptic designs:

  • Continuous Haptic Feedback: the intensity of the vibration represents the y-axis value, i.e., the heart rate. The vibration gets stronger when the value is higher and weaker when the value is lower.
  • Frequency-based Haptic Feedback: like Morse code, very short vibrations are produced based on the y-axis value, which represents the heart rate. Higher values produce a higher frequency of short vibrations, and lower values a lower frequency.
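
To make these two designs concrete, here is a minimal sketch of how they might be implemented with Apple’s Core Haptics framework. The class and method names (e.g., HeartRateHaptics, playContinuous) and the tick-rate range are assumptions for illustration, not the actual PokeCoach code:

```swift
import CoreHaptics
import Foundation

// A minimal sketch of the two haptic designs above using Core Haptics.
// Names and parameter ranges are illustrative assumptions.
final class HeartRateHaptics {
    private var engine: CHHapticEngine?

    init() {
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    /// Continuous design: vibration intensity tracks the normalized heart rate (0...1).
    func playContinuous(normalizedHeartRate: Float) throws {
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: normalizedHeartRate)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        try play([CHHapticEvent(eventType: .hapticContinuous,
                                parameters: [intensity, sharpness],
                                relativeTime: 0,
                                duration: 0.2)])
    }

    /// Frequency-based design: short transient "ticks" whose rate rises with
    /// the heart rate, a bit like Morse code.
    func playFrequencyBased(normalizedHeartRate: Float,
                            over duration: TimeInterval = 1.0) throws {
        // Map 0...1 to roughly 2...10 ticks per second (an assumed range).
        let ticksPerSecond = 2.0 + 8.0 * Double(normalizedHeartRate)
        var events: [CHHapticEvent] = []
        var time: TimeInterval = 0
        while time < duration {
            events.append(CHHapticEvent(eventType: .hapticTransient,
                                        parameters: [],
                                        relativeTime: time))
            time += 1.0 / ticksPerSecond
        }
        try play(events)
    }

    private func play(_ events: [CHHapticEvent]) throws {
        guard let engine else { return }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
    }
}
```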

After designing the haptic feedback, I started to work on the audio summaries. Many audio summaries used in accessible visualizations try to paint the whole picture of what a graph is trying to say. However, when runners reference heart rate data, they also look at particular segments. For example, if I did 8 sprint intervals in a run, I look at the heart rate for each segment where I sprinted and compare it with the overall heart rate. Thus, I designed the audio summary so that the user can get simple descriptive statistics for a segment of their choosing. The user sets the starting and ending points of the segment they want to listen to, and then plays an audio summary of that segment.
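
As a rough illustration of this idea, the sketch below computes a few descriptive statistics over a selected slice of heart rate samples and reads them aloud with AVSpeechSynthesizer. The exact statistics and wording are assumptions based on the description above, not the prototype’s actual summary:

```swift
import AVFoundation

// Hypothetical segment summarizer: speaks simple descriptive statistics
// for a user-selected slice of heart-rate samples (in bpm).
final class SegmentSummarizer {
    private let synthesizer = AVSpeechSynthesizer()  // kept alive while speaking
    private let samples: [Double]                    // full run, one sample per tick
    private let overallAverage: Double

    init(samples: [Double]) {
        self.samples = samples
        self.overallAverage = samples.isEmpty
            ? 0 : samples.reduce(0, +) / Double(samples.count)
    }

    /// Speak a short summary for the samples between two cursor indices.
    func speakSummary(from start: Int, to end: Int) {
        guard start <= end,
              samples.indices.contains(start),
              samples.indices.contains(end) else { return }
        let segment = samples[start...end]
        let average = segment.reduce(0, +) / Double(segment.count)
        let delta = average - overallAverage

        let text = String(
            format: "In this segment your heart rate averaged %.0f beats per minute, " +
                    "between %.0f and %.0f, which is %.0f beats %@ than your overall average.",
            average, segment.min()!, segment.max()!, abs(delta),
            delta >= 0 ? "higher" : "lower")

        synthesizer.speak(AVSpeechUtterance(string: text))
    }
}
```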

After designing the haptic and audio summaries, I developed a prototype using SwiftUI for iOS. I chose the iPhone because of its haptic engine, which lets app developers programmatically “play” vibrations and thereby customize haptic feedback. After developing the prototype, I named it “PokeCoach,” meant to suggest a running coach in your pocket. A video of the prototype in action can be seen below.

Demo of the iOS app prototype.

In the prototype, the user is greeted with a line graph of their run. The slider below moves the red cursor to show the heart rate at a specific time. Although not shown in the video (it’s rather hard to record vibrations in a video!), the user can switch on haptic feedback using the switch in the bottom right corner, and switch between the two types of haptic feedback on the settings page. Lastly, the user can use the button below to set the starting and ending points of a segment they’re interested in and listen to the audio summary.
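
For readers curious how such a screen might be wired up, here is a minimal SwiftUI sketch of the main view, assuming Swift Charts (iOS 16+) and the hypothetical haptic player from the earlier sketch. It illustrates the layout and interaction, not the actual PokeCoach source:

```swift
import SwiftUI
import Charts  // Swift Charts, iOS 16+

// Minimal sketch of the main screen: a line graph of heart rate, a slider that
// drives the red cursor, and a haptic feedback toggle. Names are illustrative.
struct RunGraphView: View {
    let heartRates: [Double]            // one heart-rate sample (bpm) per time step
    @State private var cursor = 0.0     // slider position, as a sample index
    @State private var hapticsOn = false

    var body: some View {
        VStack(spacing: 16) {
            Chart {
                ForEach(Array(heartRates.enumerated()), id: \.offset) { sample in
                    LineMark(x: .value("Time", sample.offset),
                             y: .value("Heart rate", sample.element))
                }
                // Red cursor at the current slider position.
                RuleMark(x: .value("Cursor", Int(cursor)))
                    .foregroundStyle(.red)
            }
            .frame(height: 240)

            Slider(value: $cursor, in: 0...Double(max(heartRates.count - 1, 1)))
                .onChange(of: cursor) { _ in
                    // This is where a haptic player (like the earlier sketch)
                    // could be triggered when haptics are switched on.
                }

            Toggle("Haptic feedback", isOn: $hapticsOn)
        }
        .padding()
    }
}
```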

After developing the prototype, I wanted to get feedback on its design. Since the main purpose of this project was to explore how haptic and audio feedback can augment visual graphs, I conducted informal evaluations with three of my colleagues at UMD. For each session, I first introduced the prototype, showed them how to use it, asked them to complete a couple of tasks with it, and then held a very short interview about their experience.

While gathering feedback on the prototype, I found three interesting points:

  • Continuous Haptic Feedback worked better than Frequency-based Haptic Feedback for detecting changes in the data. In other words, changes in intensity were better for “seeing” where the peaks are.
  • Frequency-based Haptic Feedback was more intuitive than Continuous Haptic Feedback for discerning the “distance” between two values. With the continuous design you can find where the peaks are, but it’s hard to feel how much higher they are compared to the low points.
  • It takes time to listen to the entire audio summary before getting the information you need.

Based on the feedback, it could be interesting to combine the two haptic feedback designs to “feel” the graph. For example, continuous feedback could play while the user navigates the data with the cursor, and frequency-based feedback (a “heart rate” pulse) could play when the user stops at a particular point. Also, making the audio summary customizable might give the user more control over this type of visualization. For example, the user might set the audio summary to play only the segment’s average and how it differs from the overall average, leaving out other information. This way, the user wouldn’t have to sit through the whole audio to get the information they’re interested in.
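
One hypothetical way to express that customization is an option set the user configures, which the summarizer sketched earlier could consult before speaking. The SummaryOptions type and its cases are my assumptions, not part of the current prototype:

```swift
// Hypothetical customizable summary: the user picks which statistics the
// spoken summary should include, skipping everything else.
struct SummaryOptions: OptionSet {
    let rawValue: Int
    static let average    = SummaryOptions(rawValue: 1 << 0)
    static let range      = SummaryOptions(rawValue: 1 << 1)
    static let comparison = SummaryOptions(rawValue: 1 << 2)  // vs. the overall average
    static let all: SummaryOptions = [.average, .range, .comparison]
}

// Usage idea: the summarizer could take these options and speak only the
// requested parts, e.g.
// summarizer.speakSummary(from: start, to: end, including: [.average, .comparison])
```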

In summary, I explored how a combination of haptic and audio feedback can augment a visual line graph of a person’s heart rate data from a run. Future work could adopt these ideas to build accessible time-series visualizations for people who are blind or have low vision. One direction I’d like to explore is how haptic feedback could encode information from other types of data beyond time series. For example, can we use haptic feedback for heatmaps? All in all, this was an interesting project I did for INST760 Data Visualization at UMD.
