Towards Multi-modal Interfaces

--

Introduction

Humans are multisensory beings and make sense of the world via multiple sensory modalities. However, most everyday technologies, including smartphones, tablets, and personal computers, fail to effectively engage the senses beyond vision. This imbalance can lead to sensory overload, fatigue, and irritation. Over-relying on the eyes, for example, can cause eye strain and discomfort.

Dr. Wilder Penfield and his colleagues created a compelling data visualization of the brain’s sensory and motor functions using homunculi. They created a sensory homunculus illustrating the areas of the brain that process sensory information from different parts of the body, and a motor homunculus illustrating the areas of the brain involved in motor processing for different parts of the body. Body parts with denser sensory and motor connections appear more prominent in the homunculus.

3D models of the sensory and motor homunculi. Source: Wikipedia

Following Penfield’s guide, I created my rendition of what a sensory homunculus of a smartphone user might look like, or how our smartphones might see us. I predict the eyes would be unmistakably large, almost popping out of the head. The ears would come next, since people courteously keep their phones on vibrate most of the time and might even forget what their ringtone sounds like. Speaking of vibration (and not phantom vibration), the hands — particularly the thumb and the pointer finger — would, I imagine, be huge. Smartphones have reduced our motor repertoire to primarily pointing, tapping, swiping, and pinching. The nose would be minute. And the tongue and the mouth? Would they even exist?

My rendition of the sensory homunculus of a smartphone user

In The Dialectics of Sketching, Gabriela Goldschmidt describes how visuals aid “certain kinds of reasoning” and reveal things that might not otherwise be explicit (Goldschmidt, 1991). I used this sketch to visualize my argument for a more effective sensory design approach for everyday technology and to establish a dialogue with what currently exists. When I shared this almost grotesque-looking visual with my peers, I found that, like me, many of them related it to how they felt after long hours of using their phones. Looking at the enormous thumbs and pointer fingers, they immediately mimed the pinch and swipe motions with their fingers.

Designing multimodal experiences

We live in a rich and diverse world. We sense and perceive it through multiple senses and rarely recall events and memories unimodally. Our sensory capacities are also subjective and highly varied. However, most everyday technologies and devices of today are designed for a narrow set of use cases and make many assumptions about the user, including their sensory capabilities. Other users are deemed “edge cases” and are addressed, if at all, as an afterthought (Meyer & Wachter-Boettcher, 2017). Currently, we must rely heavily on our eyes to interface with everyday technologies, and they are disproportionately stimulated compared to the rest of our senses. This hegemony of vision in design not only excludes people with visual impairments but also creates sensory overload and fatigue in those without them.

Through my thesis research, I aim to address this gap by not only hypothesizing effective and accessible multimodal interfaces for everyday technologies but also providing a means for other designers to engage in design across multiple senses. I intentionally study designing for touch, smell, hearing, and vision, and not taste, since taste often requires people to consume something, the effects of which can be hard to control for when designing for larger populations.

Our senses each have different affordances that can be leveraged for different applications (Lupton & Lipps, 2018). For example,

  • Touch is social and can communicate emotions like support, control, and fear
  • Smell is primal and strongly linked to memory. We tend to retain smell memories for longer periods of time than other sensory memories
  • Sound is very effective at signaling danger and at communication in general
  • Vision has a highly developed, rich vocabulary and can be very effective in amplifying other sensations

Mapping the affordances of touch, smell, sound, and vision based on Ellen Lupton’s The Senses: Design Beyond Vision

I aim to research sensory design through a series of studies exploring the affordances of each of these senses, hypothesizing and testing effective ways of interfacing with digital technologies that make for a rich, balanced, and inclusive experience honoring our multimodal capacities.

The pilot project

This pilot project captures the development of a wearable haptic device prototype. I worked on it with three team members in the Morphing Materials class, taught by Professor Lining Yao. We wanted to explore how we might add rich tactile interactions, beyond just vibrations, to a designer’s palette by establishing a vocabulary for haptics.

A haptic grammar for designers

Touch is highly social and is used to communicate emotion nonverbally. Control, love, support, intimacy, and stress can all be communicated effectively through touch. Studies have shown that tactile interactions play a significant role in easing stress and promoting a sense of well-being (Walker & McGlone, 2013). Beyond its social benefits, the rhythm and timing affordances of haptics can be leveraged to convey different kinds of information. Communicating urgency through repeated nudges, giving gentle reminders through taps, and alerting during emergencies via strong grabs are just a few of the intuitive, touch-based signals we use in everyday life.
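
To make this concrete, below is a minimal sketch of how such rhythm and timing parameters might be expressed in software. The pattern names, parameter values, and the TapPattern structure are my own illustrative assumptions, not an existing haptic API.

```python
# A minimal sketch, not from the project: how rhythm and timing might
# encode everyday touch signals as tap patterns. All names and values
# here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class TapPattern:
    taps: int          # number of taps in one burst
    interval_s: float  # pause between taps, in seconds
    strength: float    # normalized tap intensity, 0.0 to 1.0


# Hypothetical mapping: urgency as rapid repeated nudges, reminders as
# sparse gentle taps, emergencies as one strong sustained press.
PATTERNS = {
    "urgent_nudge": TapPattern(taps=5, interval_s=0.15, strength=0.8),
    "gentle_reminder": TapPattern(taps=2, interval_s=0.6, strength=0.3),
    "emergency_grab": TapPattern(taps=1, interval_s=0.0, strength=1.0),
}

if __name__ == "__main__":
    for name, p in PATTERNS.items():
        print(f"{name}: {p.taps} tap(s), {p.interval_s}s apart, strength {p.strength}")
```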

We began visualizing simple touch interactions through 3D modeling to explore the basic parameters that could generate different kinds of haptic sensations. While current haptic technologies, such as those in smartphones, produce different vibration patterns, we wanted to explore haptic sensations that simulate real-life touch.

3D explorations for haptic sensations

Prototyping

We created sketches for a wearable haptic device that would sit on a non-vulnerable body part without hindering that part’s function. For this purpose, we designed the device as a loop worn around the user’s arm.

The haptic sensation would be created by the displacement of a magnet, placed within a solenoid coil, that taps against the skin. Thirty-six such cells, each containing a solenoid coil and a magnet, would form a 3x12 grid.

Left: A single cell with a solenoid and a magnet. Middle: A 3x12 grid of cells. Right: The prototype as a wearable haptic device
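
As a thought experiment, the grid can also be modeled in software as an addressable array of cells. The Python sketch below does this under my own assumptions; the SolenoidGrid class and its actuate method are hypothetical stand-ins for real firmware, which might drive each coil at a duty cycle proportional to the requested strength.

```python
# A minimal software model of the prototype's grid, under my own
# assumptions: SolenoidGrid and actuate() are hypothetical stand-ins
# for real firmware, not the team's actual code.

ROWS, COLS = 3, 12  # 36 cells, as in the prototype


class SolenoidGrid:
    """Addressable model of the wearable's 3x12 grid of solenoid cells."""

    def actuate(self, row: int, col: int, strength: float, duration_s: float) -> None:
        # Real firmware would energize the coil at (row, col) for
        # duration_s, displacing the magnet so it taps the skin.
        assert 0 <= row < ROWS and 0 <= col < COLS
        print(f"cell({row},{col}) strength={strength:.2f} for {duration_s:.2f}s")


if __name__ == "__main__":
    grid = SolenoidGrid()
    grid.actuate(row=1, col=6, strength=0.8, duration_s=0.05)  # one tap mid-band
```

In the physical prototype, the felt strength of a tap would come from coil current and magnet travel rather than a software parameter; the model above only captures the addressing scheme.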

Real life haptic sensations

After exploring a few fundamental interactions through 3D modeling, we moved on to more complex haptic gestures. We tried to emulate real-life haptic references, tactile social interactions, and even some metaphoric gestures. Some of our haptic explorations are listed below, followed by a rough sketch of how such patterns might be encoded:

  • Simulating the real world. A water drop falling into a pool of water and creating a ripple effect. We imagine this being used to augment storytelling when accompanied by visual and audio effects
Left: Water drop creating a ripple (source: GIPHY). Right: Drop and ripple effect simulated in the prototype
  • Social tactile gestures. Gentle, calming strokes that might be used to ease stress and promote well-being (Pawling et al., 2017)
Left: Gentle stroking with the hand (source: GIPHY). Right: Calming stroking simulated in the prototype
  • Metaphoric tactile gestures. Finger tapping or drumming that might represent waiting, boredom, or frustration. We imagine this could also be used to haptically represent a device’s loading symbol
Left: Finger tapping (source: IMGUR). Middle: Loading dots (source: Tenor). Right: Finger tapping simulated in the prototype
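
As a rough illustration, these three gestures can be expressed as timed sequences of cell activations over the same hypothetical grid model sketched earlier. The specific patterns below are illustrative approximations rather than the prototype’s exact actuation sequences.

```python
# Illustrative approximations, not the prototype's actual sequences:
# each gesture becomes a timed series of "frames", where a frame is a
# set of grid cells fired together at a given strength.

import math
import time

ROWS, COLS = 3, 12


def ripple_frames(center=(1, 6), steps=5):
    """Drop and ripple: rings of cells expand from a center point,
    fading in strength as the ring grows, like a water drop."""
    cr, cc = center
    for radius in range(steps):
        ring = [(r, c) for r in range(ROWS) for c in range(COLS)
                if round(math.hypot(r - cr, c - cc)) == radius]
        yield ring, max(0.2, 1.0 - 0.2 * radius)  # (cells, strength)


def stroke_frames():
    """Calming stroke: a full column of cells sweeping slowly along the arm."""
    for col in range(COLS):
        yield [(r, col) for r in range(ROWS)], 0.4


def tap_frames(cell=(1, 3), taps=4):
    """Finger tapping: one cell pulsed repeatedly, like drumming."""
    for _ in range(taps):
        yield [cell], 0.7


if __name__ == "__main__":
    for name, frames in [("ripple", ripple_frames()),
                         ("stroke", stroke_frames()),
                         ("tap", tap_frames())]:
        print(f"-- {name} --")
        for cells, strength in frames:
            print(f"  {len(cells)} cell(s) at strength {strength:.1f}")
            time.sleep(0.05)  # frame pacing; real timing would be tuned by feel
```

Describing gestures as frames keeps the vocabulary extensible: a new gesture is simply another generator of (cells, strength) frames, which is one way a shared haptic grammar could grow.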

Conclusion

This project helped me gain a stronger understanding of the affordances of touch. Through literature research, I came to understand the social and psychological significance of touch-based interactions. I learned that touch is not only highly social but can also communicate information metaphorically and, potentially, through codification.

The process of doing this pilot project crystallized the importance of visualizing and sketching as means of establishing a dialogue: between the designer and the material, between different iterations of the design, and between oneself and others.

This project surfaced not only the limited haptic vocabulary we currently have but also the lack of prototyping tools available for haptic design. I hope that, in the future, our vocabulary around the senses beyond vision develops further, alongside a set of robust prototyping tools for designing for all the senses.

Next steps

Testing — We aim to test the prototype with people and gather subjective ratings of the pleasantness of the various haptic sensations and their likeness to real-life touch.

Expanding the haptic vocabulary — The current haptic vocabulary can be expanded further to create a rich variety of haptic sensations that could be useful to other designers.

Expert feedback — Professor Roberta Klatzky of the Department of Psychology has done extensive work in the field of haptics. Her feedback would be instrumental in developing the prototype further.

Creating a haptic prototyping toolkit — Once we test the prototype, a haptic prototyping toolkit could be created for other designers who wish to design haptic interactions. A robust system for building on the existing haptic vocabulary could help other designers contribute to a haptic language systematically.

References

Goldschmidt, G. (1991). The dialectics of sketching. Creativity Research Journal, 4(2), 123–143. https://doi.org/10.1080/10400419109534381

Lupton, E., & Lipps, A. (2018). The senses: Design beyond vision. New York, NY: Cooper Hewitt, Smithsonian Design Museum.

Meyer, E., & Wachter-Boettcher, S. (2017). Design for real life. New York, NY: A Book Apart.

Pawling, R., Cannon, P. R., McGlone, F. P., & Walker, S. C. (2017). C-tactile afferent stimulating touch carries a positive affective value. PLoS ONE, 12(3). https://doi.org/10.1371/journal.pone.0173457

Walker, S., & McGlone, F. (2013). The social brain: Neurobiological basis of affiliative behaviours and psychological well-being. Neuropeptides, 47(6), 379–393. https://doi.org/10.1016/j.npep.2013.10.008

--

Aadya Krishnaprasad
Research for/into/through design(ing)

Graduate student at the School of Design at CMU | Interaction designer