Multisensory UX: Harmonizing Future Experiences

Noman Siddiqui
Published in i-ux
Feb 28, 2024

Summary: In the ever-evolving landscape of Experience Design, Multisensory UX (MUX) will play a pivotal role in transforming our interactions into exceptionally holistic experiences. It will deepen user engagement and enhance Spatial Experience Design (SXD). In this month’s article, we will delve into the impact of Multisensory Learning and SXD, discussing the relevance, use cases, and fundamental design principles. We will conclude with a few key strategic insights on how it could revolutionize the future of customer experience in both physical and digital domains.

Industry Insights

According to a research report published by Markets and Markets, the Spatial Computing market is expected to grow from USD 97.9 billion in 2023 to USD 280.5 billion by 2028, at a compound annual growth rate (CAGR) of 23.4% during the forecast period. The availability of affordable hardware and advancements in real-time rendering engines will likely boost the adoption of spatial computing globally.

source: Markets and Markets

Moreover, research from Gartner reports that the market for spatial computing is expected to grow to $1.7 trillion by 2033 — up from $110 billion in 2023 — driven by growth in advanced location-based services.

Multisensory Learning 101

Let’s take a step back and start with some background information and context first. Multisensory learning is sometimes referred to as ‘whole brain’ learning. Learning experiences that stimulate more than one sensory system at a time (such as auditory, visual, and tactile) activate different parts of the brain to receive, process, and make sense of the information. Multisensory stimuli can be very useful in enhancing cognitive performance for both children and adults.

image by author and Midjourney

According to recent research, learning that engages multiple senses can make it easier for people to remember information. This approach is especially beneficial for people who learn and think differently. Those of us who have difficulties with visual or auditory processing, for instance, might find traditional methods of learning through reading or listening less effective. Engaging more than one sense can help people grasp and retain skills more successfully.

Scientists continue to examine the scope and challenges of technology in the study of multisensory integration, in a world increasingly characterized by a fusion of physical and virtual events. They discuss multisensory integration research through the lens of new technologies, which helps bring human-computer interaction research, experimental psychology, and neuroscience closer to each other.

An Ode to Sen-so-rama

One early example of multisensory technology worth crediting is the Sensorama. In the 1960s, Morton Heilig developed the Sensorama, one of the earliest examples of multisensory, immersive technology. It was a machine that provided stereoscopic visuals along with vibration, aromas (apparently you could taste what was in the air), stereo sound, and even wind to simulate a real-world experience.

Inspired by earlier science fiction, Heilig had an idea for a kind of theatre that would target all the primary senses of the audience. He shared his thoughts in a 1955 paper called “The Cinema of the Future.” A decade later, he built an immersive experience that simulated a motorcycle ride around Brooklyn. It combined visuals, sounds, smells, and motion to make users feel like they were really spending an afternoon in New York City. Unfortunately, Heilig’s Sensorama never got the funding it needed to grow.

source: Morton Heilig — Figure 5 of U.S. Patent #3050870

Current Implementations

Fast forward to 2024: technological advancements have made it possible to incorporate tactile feedback and spatial audio into user experiences, creating more immersive environments. Companies like Apple, Meta, Magic Leap, and Microsoft are leading the way in integrating haptic feedback, i.e. using touch as a form of interaction: physical sensations that let you know something is happening, or that make virtual experiences feel more real by mimicking real-world actions.
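
To make the idea of haptic feedback a little more concrete, here is a minimal sketch, assuming an Apple platform where Core Haptics is available; the function name and parameter values are illustrative, not taken from any specific product mentioned above.

```swift
import CoreHaptics

// Minimal sketch: play a single transient "tap" so the user physically
// feels that an interaction has registered. Values are illustrative.
func playConfirmationTap() throws {
    // Core Haptics requires supported hardware; callers should check
    // CHHapticEngine.capabilitiesForHardware().supportsHaptics first.
    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```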

The new landscape of Spatial Computing, e.g. Apple Vision Pro, necessitates that UX designers adapt to designing for 3D spaces and augmented realities. This shift underscores the significance of creating context-aware and situational interfaces, pointing towards a future where digital and physical realms merge, offering more comprehensive and immersive experiences.

Spatial Computing

Thanks to technological advancements in recent years, multisensory learning opens new doors to integrating immersive technology and Spatial Computing, which allows computers to understand and interact with the physical space around us. It combines the digital and physical worlds by recognizing how objects are arranged in space and how they move. This enables devices like augmented reality (AR) and mixed reality (MR) headsets to see, analyze, and interact with the environment in real time. Spatial computing makes these scenarios possible by blending virtual information with the real world, creating experiences where digital and physical spaces co-exist and interact seamlessly.
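
As a rough illustration of what “recognizing how objects are arranged in space” means in practice, here is a minimal sketch assuming an iOS device with ARKit and RealityKit (the function name is illustrative): the device is asked to track real surfaces, and a small virtual object is pinned to one of them.

```swift
import ARKit
import RealityKit

// Minimal sketch: track real-world surfaces and anchor virtual content to them.
func startSpatialSession(in arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]  // detect real surfaces
    configuration.environmentTexturing = .automatic          // pick up lighting cues
    arView.session.run(configuration)

    // Pin a small virtual box to the first detected horizontal plane,
    // so digital and physical space stay aligned as the user moves.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
    arView.scene.addAnchor(anchor)
}
```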

The future state of UX design will revolve largely around how to design with depth, scale, windows, and immersion, and how to apply best practices for creating comfortable, human-centered experiences that transform reality. Thanks to Apple Developer resources, we can use these spatial design principles for visionOS to extend existing apps or bring new ideas to life.

source: Apple Developer

Here is a high-level summary of the principles (to be explained in more detail in a future episode), with a minimal code sketch after the list:

  1. Familiar: Starting with familiar elements (sidebars, tabs, search fields) to help users navigate spatial apps.
  2. Human-Centered: Keeping main content within the user’s field of view and using landscape layouts for better visibility.
  3. Dimensional: Designing apps to be adaptable to any amount of physical space without being constrained by it.
  4. Immersive: Transitioning across the immersion spectrum, from windowed views to full-space experiences.
  5. Authentic: Focusing on key moments or features that can uniquely benefit from spatial design.
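
As a rough sketch of how these principles might surface in code, assuming visionOS with SwiftUI (the app, view names, and the space identifier below are illustrative): the app starts in a familiar window with conventional navigation, and offers a separate full-space scene only where immersion genuinely adds value.

```swift
import SwiftUI

@main
struct SpatialSketchApp: App {
    var body: some Scene {
        // Familiar + human-centered: start in a conventional window.
        WindowGroup {
            ContentView()
        }

        // Immersive: a separate full-space scene, entered only when it adds value.
        ImmersiveSpace(id: "gallery") {
            GalleryView()
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        // Familiar elements: sidebar navigation inside the window.
        NavigationSplitView {
            List(["Overview", "Gallery"], id: \.self) { Text($0) }
        } detail: {
            Button("Enter immersive gallery") {
                Task { _ = await openImmersiveSpace(id: "gallery") }
            }
        }
    }
}

struct GalleryView: View {
    var body: some View {
        // Placeholder; a real app would place RealityKit content here.
        Text("Immersive content goes here")
    }
}
```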

Striking the right “Immersive” balance

For businesses and designers, the shift towards multisensory and spatial UX represents both a challenge and an opportunity. The opportunity lies in the ability to create deeply engaging and memorable experiences that foster emotional connections with their communities. The challenge lies in the complexity of designing for multiple senses without overwhelming the user.

According to research shared in Frontiers in Neuroscience, prolonged immersion in a VR environment may cause anxiety and cybersickness because the brain receives conflicting signals about the user’s position relative to the movement observed in the virtual environment. The aspects of usability and accessibility in Spatial UX Design will therefore become increasingly important as it gains mainstream acceptance.

An Immersive Design Case Study

Speaking of anxiety, it is ironic that the technology we sometimes blame for stress can also be used to treat anxiety disorders. Below is an example of an Extended Reality (XR) project I had the privilege of leading over a year ago for a large company in the mental health space. It was primarily an envisioning project focused on VR-Assisted Cognitive Behavioural Therapy for Anxiety Disorders, with a particular emphasis on acrophobia (an intense fear of heights).

The approach my team and I took in the discovery and design phases was not overly technical. It began with facilitating a workshop to understand the goals and pain points of patients, therapists, and the business stakeholders. This was followed by creating a detailed Service Blueprint to evaluate how the front-stage experience (therapist and patient) would align with the back-stage technology and channels, leading up to a demonstration of a VR Acrophobia Therapy Module, supported by a functioning prototype and an introductory video trailer.

case study: AR / XR workshop board image by author

The project’s outcomes were applauded by key stakeholders, not only for their innovative value but also because they helped the company stand out from its competitors by showcasing a culture of experimentation and innovation.

What does this mean for Organizations?

Organizations can embrace multisensory spatial design technology by starting to think about how it will transform their on-screen experiences and how those experiences will translate to future spatial devices. Consider, for example, the effort needed to redesign and optimize content (text, images, videos, sensory sounds, micro-interactions) to create personalized, immersive experiences that engage senses beyond sight, which in turn creates stronger demand for spatial sound design.

Secondly, industry-specific omni-channel design would need to be re-imagined and to evolve faster, factoring in scalability and hyper-personalization across customer environments.

Lastly, new strategies for contextual, on-demand customer journey maps would be essential for easy interaction across both physical and virtual spaces. However, much depends on making spatial computing devices more affordable (and lighter, of course) for mainstream adoption; just think of the mobile phone’s evolution since the 90s.

This can revolutionize customer engagement, product design (think of the need to integrate new Smart Spatial Design Systems), and modern workplace collaboration tools, merging the digital and physical worlds in ways we have yet to imagine.

🎥 Video of the month

The era of spatial computing is here, where digital content blends seamlessly with our physical space. It will soon impact (especially with future device models) the way we interact with our hybrid virtual worlds. Here is a demo from UXDA showing how future AI-powered spatial banking might look on the Vision Pro.

Conclusion

To conclude, Multisensory Spatial UX design (MSUX) is set to mark a significant shift in interaction design, one that promises to enrich customer experiences and business transformation in profound ways. By harmonizing and innovating across all the senses, designers and businesses can unlock new dimensions of user engagement, setting the stage for a future where technology not only meets but anticipates and adapts to the full spectrum of human experience. It can enhance emotional and cognitive engagement, making storytelling richer, more contextual, and more impactful (we will delve deeper into Immersive and Contextual Storytelling in a future article).


Lastly, feel free to reach out to inquire about our Spatial UX Design Advisory Services. We would be pleased to discuss a high-level use-case workshop for your products or services as an initial step in our discovery process. Stay tuned for next month’s EXD article, where we aim to offer more insightful research and tips. Until then, stay curious and remember: there is no “I” in “UX”.

Noman Siddiqui is an Experience Design Leader, DesignOps Strategist, Adjunct Professor of Design and a self-professed Usability Geek. He serves as the Experience Design & Strategy Director at Nomans Land Creative Inc.
