Into the Multisensory Metaverse:
The Future of the Food-Product Lifecycle
By Alpana Dubey & Alex Kass — Accenture Labs
Food products are an important component of the economy and a central aspect of day-to-day experience, but they are not typically part of today's discussion of extended reality and the metaverse. We think that may well begin to change soon. As the metaverse becomes a more comprehensive, multisensory extension of physical reality, we expect it to change the way everything is created and experienced, including the things we eat and drink.
One way we describe the metaverse broadly is as "an evolution of the internet that enables a user to move beyond browsing, to inhabiting rich, often shared, experiences." This evolution will rely on several component technologies, all of which continue to progress rapidly. Three broad kinds of technologies are involved:
- Cameras, microphones, and other sensors used to digitalize the physical world. Digitalization — the creation of digital twins — is becoming increasingly broad and comprehensive. Today, small sensors — often even small enough to be worn — support creation of increasingly high-fidelity digital twins of people and their behavior, and of the objects and environments that surround them.
- AI used to interpret what those sensors detect, identifying and modeling higher-level aspects of the sensed environment to support increasingly sophisticated simulation of physical properties and behaviors in the digital realm.
- Rendering and actuation technologies needed to translate digital models into physical and perceptual activity.
As these underlying technologies of the metaverse mature, we will increasingly be inhabiting a physio-digital (or, as we sometimes call it, phygital) space that will span the spectrum from digitally augmented physical experiences to fully digital experiences taking place in virtual reality. The boundaries between the physical and the digital will feel increasingly porous, with events in a physical space having effects that play out in a virtual one, and vice versa. As the fidelity and richness that extended reality (XR) technologies provide go up, so will the sense of immersion and the range of applications. The implications will be felt across many sectors of the economy, transforming how products and services are designed, tested, marketed, and used.
Enriching the metaverse with more senses
How rich and immersive can the phygital experiences in the metaverse become? While current mainstream commercial XR platforms create a metaverse that is primarily limited to the audiovisual realm (with a bit of haptic feedback), we are already seeing emerging platforms with both sensing and synthesis capabilities that begin to extend the metaverse to embrace the full range of human senses, including fine-grained touch, temperature, olfaction, and even taste. For example:
FeelReal is a multisensory XR mask that can be programmed to release aromas matched to the environment or scene the user is experiencing through the mask. Technologies like Aromashooter can augment any experience, whether XR or conventional 2D, with an immersive olfactory layer.
The technology depicted below creates an illusion of wetness, using a combination of thermal stimuli and moving phantom vibrotactile stimuli. This can produce a convincing immersive experience of rain, or of taking a shower.
Haptic gloves are getting better at emulating a rich sense of touch. HaptX, a commercial vendor, has developed gloves (pictured below) whose 133 points of tactile feedback can provide a convincing illusion of real surfaces. Meta has also announced that it is working on high-resolution haptic gloves.
Adding senses beyond sight and sound may be thought of as a nice-to-have option for some virtual experiences, but one should not underestimate their importance. While it's true that the information density of the audio and visual channels is high, and that they are core to most forms of human communication, the other senses, such as touch and smell, can play a key role in making an experience more visceral, emotional, and even memorable. Aroma, in particular, is known to magnify the emotional response of participants, which in turn can make an experience more memorable (see, for instance, this article). Sight and sound often dominate our conscious minds, but other senses play big roles of which we are not always aware.
Food in the metaverse
Like any metaverse applications, those involving food products will have to rely on sensing, modeling, and rendering/synthesis. But for food products, the sensing, modeling, and synthesis of aroma, texture, and eventually taste are more central than the audio and visual modes that currently dominate extended reality. Getting to metaverse experiences that are truly immersive involves a combination of science and engineering for each sensory mode being modeled. For the audiovisual metaverse, this means modeling acoustics, optics, and force-and-motion physics. Replicating and simulating aroma- and taste-based metaverse experiences will involve more chemistry (and chemical engineering) and psychology. When we consider a future in which the sensations of texture, smell, and taste can be created and manipulated digitally, we believe that the entire lifecycle of food products will be transformed.
Transforming the four phases of the food-product lifecycle
Let's consider four distinct phases of the food-product lifecycle, and how a Multi-Sensory Metaverse (MSM) might transform each: from accelerating development and testing, to enhancing marketing, and even enriching the experience of consumption.
We’ll examine each phase briefly, to talk about how we envision it being transformed, and will also touch on the R&D work we are doing to bring the vision to life:
1. Designing food products
Inventing new food recipes was long thought of as more art than science; both amateur and professional recipe designers have traditionally relied on intuition, implicitly informed by their own flavor preferences. However, to improve their success rates, Food and Beverage (F&B) companies have been working to adopt more data-driven approaches to understanding consumers' flavor preferences. They analyze the food items and spices bought by target consumers to infer those preferences. In other words, their food-product recipes may be guided by digital twins of food products and of consumer food preferences. In the digital-physical blend of the metaverse, the scope and level of detail represented in these digital twins will become a crucial factor in understanding consumers' preferences, including flavor, texture, appearance, and nutritional constraints, even down to the molecular composition of the ingredients of food products.
For instance, citral is a flavor molecule dominant in citrus fruits such as lemon. If an analysis of the digital twins of food products purchased by target consumers shows citral frequently appearing as a dominant molecule, one could predict that those consumers will like lemon-flavored products, even if none of the purchased products contain lemon; they may instead contain other ingredients in which citral is a main molecule. Such analysis requires changes in how machine learning (ML) models are trained to understand consumers. Early commercial examples of AI-driven adaptation of food products include IntelligentX, which uses individualized consumer feedback, delivered through an app, to build an AI model of each consumer's preferences, so that it can customize recipes to meet those preferences.
Going forward, as the digital representations of food products get more sophisticated, ML algorithms will consider not only ingredients and process steps as input data but also the molecules found in those ingredients, enabling more accurate estimates of flavor profiles. Such deeper digital representations will aid not only the algorithms we use for consumer understanding but also the process of discovering new recipes. When inventing new food products, F&B firms create variations on recipes by changing ingredients and/or process steps; these changes may be guided by AI algorithms that recommend ingredients and process steps based on consumer preferences.
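To make the citral example concrete, here is a minimal Python sketch of molecule-level preference inference. The purchase history, the molecule lists, and the scoring rule are all hypothetical simplifications for illustration, not a description of any particular commercial system:

```python
from collections import Counter

# Hypothetical molecule-level digital twins: each purchased product
# maps to the flavor molecules that dominate its aroma/taste profile.
PURCHASE_HISTORY = {
    "lemongrass tea":  ["citral", "geraniol"],
    "ginger biscuits": ["gingerol", "citral"],
    "verbena syrup":   ["citral", "limonene"],
    "plain crackers":  ["maltol"],
}

def molecule_profile(purchases):
    """Count how often each dominant molecule appears across a consumer's purchases."""
    counts = Counter()
    for molecules in purchases.values():
        counts.update(molecules)
    return counts

def predicted_affinity(purchases, molecule):
    """Fraction of purchased products whose dominant molecules include `molecule`."""
    hits = sum(1 for mols in purchases.values() if molecule in mols)
    return hits / len(purchases)

profile = molecule_profile(PURCHASE_HISTORY)
print(profile.most_common(1))                          # citral is the most frequent molecule
print(predicted_affinity(PURCHASE_HISTORY, "citral"))  # 0.75
```

Even though no purchased product in this toy history contains lemon, the high citral affinity suggests this consumer would likely enjoy lemon-flavored products.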
At Accenture Labs, we have developed prototype solutions that equip AI systems with rich digital-twin representations of consumers and food products. Our flavor-profiling system understands the relative importance of flavor molecules in recipes to arrive at flavor profiles of consumers as well as of food items. Our AI-based computational-creativity tools can guide F&B firms through the recipe-ideation process by suggesting ingredients that are novel, surprising, or unexpected. Moreover, AI can help generate recipes automatically.
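As a sketch of the ingredient-ideation idea, one simple heuristic is classic food pairing: suggest candidate ingredients that share a flavor molecule with the current recipe (plausible) but rarely co-occur with its ingredients in a recipe corpus (surprising). The tiny molecule table and recipe corpus below are invented for illustration and do not reflect our actual tooling:

```python
# Hypothetical flavor-molecule table and recipe corpus, invented for illustration.
MOLECULES = {
    "strawberry": {"furaneol", "linalool"},
    "basil":      {"linalool", "eugenol"},
    "chocolate":  {"furaneol", "pyrazine"},
    "parmesan":   {"butyric acid"},
}

RECIPES = [
    {"strawberry", "chocolate"},
    {"strawberry", "chocolate", "hazelnut"},
    {"strawberry", "basil"},
]

def suggest_ingredients(current, recipes, molecules):
    """Rank candidates that share a molecule with the recipe (plausible)
    but rarely co-occur with its ingredients in the corpus (surprising)."""
    current = set(current)
    suggestions = []
    for candidate in set(molecules) - current:
        shares_molecule = any(molecules[candidate] & molecules[i] for i in current)
        if not shares_molecule:
            continue  # no flavor link: likely an implausible pairing
        co_occurrences = sum(1 for r in recipes if candidate in r and current & r)
        suggestions.append((co_occurrences, candidate))
    suggestions.sort()  # rarest (most surprising) pairings first
    return [candidate for _, candidate in suggestions]

print(suggest_ingredients({"strawberry"}, RECIPES, MOLECULES))
# ['basil', 'chocolate']: basil is the more surprising, still-plausible suggestion
```

Parmesan is filtered out because it shares no flavor molecule with strawberry; basil ranks ahead of chocolate because the basil-strawberry pairing is rarer in the corpus.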
2. User testing food products
Although AI is a great tool for generating recipes, it has limitations when it comes to predicting human responses to new ones. For instance, it has been shown that the presence of certain volatiles (also known as flavor molecules) makes ‘Matina’, a variety of tomato, taste sweeter than another variety, ‘Yellow Jelly Bean’, even though Matina contains half as much glucose [research article]. An AI model may mispredict the flavor profile of a recipe containing Matina or Yellow Jelly Bean tomatoes if it has not been trained on samples of those varieties. This makes human evaluation necessary. However, human evaluation has its own limitations: it requires the availability of human testers, and it needs physical samples. To evaluate human response today, F&B firms hire fleets of experienced food testers to test physical samples of food in their labs.
A multi-sensory metaverse can dramatically shift the way food testing happens. Imagine testers working in a Multi-Sensory Metaverse where they can see 3D models of a food product, feel its texture, smell it, and taste it from anywhere in the world. With this capability, firms would not need to bring testers from a particular culture and geography to a central location to test a food product; instead, they could create a metaverse testing lab, with virtual food samples and a set of protocols that testers follow. While physical testing will still likely be important in the later stages, multi-sensory testing can help evaluate a much larger number of samples in dramatically less time.
3. Marketing / selling food products
The concept of the metaverse is going to impact the retail experience too. We are already witnessing an increased focus on 3D commerce, which gives e-commerce firms an edge by providing shoppers a differentiated experience with 3D views of products (see the End of 2D Commerce at CES). A multi-sensory metaverse will extend that edge further. Adding extra senses has already been shown to increase consumer engagement and sales in physical retail; for example, restaurants and physical stores diffuse fragrances to drive sales and increase consumers' cravings. We therefore expect a multi-sensory metaverse to drive similarly more engaging experiences.
In the MSM, consumers will be able to visualize food products in 3D, feel them, and smell them virtually before making a purchase decision and taking them out of the package. For instance, imagine an immersive cookie-unwrapping experience in a virtual-reality environment, with the cookie's fragrance growing stronger as you unwrap it. This is not just a more engaging experience; it also reduces the waste associated with buying a food product only to discover later that you don't like the way it tastes or smells. It may also accelerate sales of newly launched and otherwise unfamiliar foods.
At Accenture Labs, we have developed an intelligent multi-sensory immersive kiosk, IMOSK, which retail customers (online or offline) can use to browse food products and simulate them in a 3D virtual environment along with olfactory simulation. To assess the usefulness of such an experience, we conducted a set of experiments with a small group of subjects. The subjects were put in an MSM experience of a supermarket where they could browse 3D product visuals along with an olfactive experience. The main aim of our study was to evaluate how often the purchase decisions made through the multi-sensory experience conformed to the subjects' actual likings. Our results showed that 85% of the time the purchase decisions were in line with their preferences. A multi-sensory experience may also open new possibilities for differently abled people; imagine a voice-based olfactory display that helps a visually impaired person understand products better.
4. Consuming food products
In the metaverse we will continue, of course, to consume real, physical food. However, the experience may be augmented by, for example, smart utensils that digitally stimulate the tongue to augment flavor. Taste-simulation technologies, in early-stage development in research labs around the world, would make this possible. For instance, Ranasinghe et al. developed a pair of chopsticks and a miso soup bowl (figure below) that add virtual seasoning to your food when your tongue touches their electrodes; other taste synthesizers, such as the Vocktail, simulate taste virtually. Such augmentation could help people with dietary restrictions, such as diabetes, as well as people who have limited food options due to unusual circumstances, such as astronauts and military personnel in the field.
Smart utensils will be able to stimulate our taste buds digitally. Think of these utensils not only augmenting taste but also storing information about our preferences and adapting on the fly to what we eat, either automatically or through apps. For instance, someone with a strong preference for sweeter coffee could have sweetness stimulated while drinking coffee with less sugar. Such taste augmentation has great potential to shift the population toward healthy and sustainable eating habits. As the demand for healthier and more sustainable food alternatives grows rapidly, it will be equally important to make those alternatives taste as good as the less healthy or less sustainable options they replace, if they are to be acceptable at population scale. Multi-sensory digital augmentation of taste can potentially address this problem.
Applying the metaverse to the food-products industry, which is so closely tied to the senses of taste, smell, and touch, may seem like a stretch to those whose conception of the metaverse is anchored solidly in today's audiovisual XR. The technologies needed to enable a multi-sensory metaverse experience are only beginning to mature out of the lab and into the market. As a result, we have only begun to explore the opportunity space they will open, industry by industry. But as virtual and augmented haptic, olfactory, and even taste technologies mature, we see the potential for a very exciting transformation of the entire food-product lifecycle, extending from initial design through to consumption.
For more thoughts about how the metaverse is ushering in the next wave of digital change and is providing forward-looking companies with an opportunity to act today and be ready for the future, check out Accenture’s Technology Vision 2022.