The Haptification Manifesto

Co-authored with Alex Penman

An exploration by Alex Penman, Julius Ingeman Breitenstein and Rina Shumylo during the Immersive Experiences workshop at Copenhagen Institute of Interaction Design (CIID), facilitated by instructors James Tichenor and Joshua Walton.

On Touch

For the last few decades, the brightest minds of our generation have been working on moving interfaces from physical to digital. We moved nearly every object of daily use into the digital world. To make things compact, comfortable, lighter, better… everything was put into a box with a flat screen. We glorify clicks, taps, and swipes. Interaction became screen by default. Welcome to the Era of Glowing Rectangles.

In a visually driven world, the screen became king. Apps became our best friends. But humans pay a high price for it. Our fingers are trapped on a tiny, perfectly polished flat surface that feels like… nothing. It has become all too easy to isolate ourselves from physical space and ignore our other senses.

What about virtual and augmented reality (VR/AR)? Interfaces there are still mainly visual, and controllers still feel numb. Tactile richness is sacrificed for flashy visual payoffs. We have immersed ourselves in a tactile void.

However, this status quo is flawed. The human system of touch receptors can distinguish minute variations in the external environment, linking our skin, nerves and brain. Touch is the first sense we experience in this world, and the most emotionally central one.

So why do we work so hard to change it? We cannot change human nature, but we can craft technology.

The Haptification Manifesto

The paradigm of Haptification was born out of the suffocating limitations placed on the sense of touch by modern technology. Our fingers are bored and tired, looking for new sensations across these glass surfaces. We want to feel. We want to experience. We want to touch.
Interaction is never “screen by default”.
Think about how life was with touch. Question those who have removed touch from your senses. Think about how life can be with touch.
We proclaim kinesthetic communication as the new medium of the human-device experience.
We are devoted to creating natural interactions to enrich human experience with every reality.

The Creation of The Haptification Manifesto and M.A.N.G.O. Labs

We explored the idea of bringing touch to VR through the lens of a fictional research group and its accompanying manifesto for a touch research project. By creating M.A.N.G.O. Labs and The Haptification Manifesto, we were able to explore touch in VR through a critical, yet lighthearted, lens. On one hand, the manifesto distills all our ideas into one clear statement we can convey to others. It is meant to be an extreme viewpoint through which we can experiment with the medium and push the boundaries of current implementations of VR, written in a way that makes it obvious where the intentions of the project lie. On the other hand, M.A.N.G.O. becomes a less “serious” packaging for the manifesto, allowing us to deliver it in an approachable way: a fictional group making playful interactions under the influence of a strongly worded, serious mission statement. The two pull on each other and meet in the middle, creating an engaging project and letting us have some fun along the way.

The Manifesto

When first starting the project, we knew we wanted to explore ways VR could be enhanced through the sense of touch but didn’t have a clear direction for it. After talking to Josh, he saw that we were all aligned on the idea and encouraged us to explore it from an extreme view, possibly by writing a manifesto. This sparked our interest and we decided to try it. Writing the manifesto helped us clarify as a group what we felt was missing from current experiences and what we wanted people to feel with the project we were about to make. It allowed us to take a critical approach to the medium while exploring what we could add to it. Our main ideas are found within two points of the manifesto:

Interaction is never “screen by default”

Screens are not, and should not be considered, the default interface. Relying only on sight ignores the other senses so tightly linked to the way we experience life, and places limitations on those with impaired sight.

Question those who have removed touch from your senses

This is a call to think about how and why our sense of touch has been dulled to experiencing only glass screens. We should think not only about the tactile feedback other media could provide, but also about how we got into this situation in the first place. Through this, the manifesto becomes about how each individual experiences technology, rather than the perspective of the few people who authored it.

Most people will only ever experience this project through our documentation on the internet. When the most important part of the experience is touch, the manifesto gives them more context into how we operated.

M.A.N.G.O. Labs

M.A.N.G.O. Labs was a way for us to pull the manifesto in the opposite direction and make the content more digestible. Where the manifesto’s impact lies in questioning the status quo, the Lab draws people into the experience by opening the topic up to fun experimentation. The fictional staff at M.A.N.G.O. would consider these serious lab tests, but in real life things play out in the VR world much like games such as Job Simulator. This playful aspect is what draws people in. By introducing it this way, people let their guard down and open up to the world you’ve created. Participants quickly become immersed in this new reality but stay linked to the physical realm through touch.

Because the manifesto is worded to get people to question why there is a lack of touch in technology, M.A.N.G.O. Labs doesn’t push its physical implementations of these apps as the one true way to enhance VR. It is more an invitation to reflect on why technology is currently missing these feelings than a prescription to correct current mistakes.

Even if people don’t find an interaction successful, we found that they will still reflect on why it wasn’t successful and are quick to offer solutions that would create a better experience for them. In this regard, the manifesto not only works for us as a way to explore the topic, but also extends out to anyone who tries out the experience as well.

The 3 Interfaces

We created 3 stations in a room where people would interact with our interfaces. The space was modeled in Unity, and objects were placed so that navigating the room felt natural both physically and virtually. Our first prototype was the Tinder interaction. The viewer sees a 3D model of a person with a heart and a trash can on either side, recalling Tinder’s signature swipe interaction. They are invited to reach out, and their hand is greeted by a physical slider. Moving the slider causes the virtual model to slide to either end of the spectrum. For Instagram, a gear sits next to a towering feed of pictures in VR. A crank attached to the table in physical space acts as the control for the virtual gear. By rotating the physical crank, the participant scrolls the Instagram feed down towards themselves. The third app, a weather app, uses a rotating disk to spin between different fictional cities. Detailed models of each scene are presented in VR, while sensations are produced in the physical space to match the weather of the currently chosen city.


VR as a blindfold

One concept we wanted to explore was the way VR can be used as a blindfold. During our ideation process, this was the initial idea that got us interested in touch as a sense for VR. We wanted to explore ways we could augment VR with touch and how sight affects the way people experience touch. Once the headset goes on, you become immersed in a new world but disconnected from the physical one. This pushed us to explore new ways touch could bridge the divide between the two realities.

It also allowed us to prototype very quickly. Because VR acts as a blindfold to the real world, the fidelity of the physical objects doesn’t really matter as long as they feel convincing enough. We created a working version of our Instagram interaction within an hour and a half by modeling the Instagram feed in Unity and fastening a standard crank bolt wrench to a table. By pressing the arrow keys on a keyboard, we could control the up and down movement of the virtual scrolling and sync it with how fast the crank was being rotated. This proved so convincing that we never got rid of the manual controls. Rather than spending time adding sensors to the crank, we moved on and focused on the experience. Since our project was about experimentation and learning, the low-fidelity prototype worked well enough for us.
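Had we instrumented the crank, the speed-to-scroll mapping itself would only be a few lines. A minimal sketch in Python of the idea, with hypothetical names (`scroll_step`, `pixels_per_rev`); our actual prototype was driven manually from the keyboard, not by code like this:

```python
def scroll_step(offset, crank_rpm, dt, pixels_per_rev=300.0):
    """Advance the feed's scroll offset based on how fast the crank spins.

    offset: current scroll position of the virtual feed, in pixels
    crank_rpm: measured crank speed, in revolutions per minute
    dt: time since the last frame, in seconds
    pixels_per_rev: how far one full crank turn scrolls the feed
    """
    revolutions_per_second = crank_rpm / 60.0
    return offset + revolutions_per_second * pixels_per_rev * dt

# Cranking at 60 rpm for one second (ten 0.1 s frames) scrolls exactly
# one revolution's worth of feed:
offset = 0.0
for _ in range(10):
    offset = scroll_step(offset, crank_rpm=60.0, dt=0.1)
```

Calling this every frame keeps the virtual feed locked to the physical crank, which is the effect we were faking by hand.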


Physical to Virtual Interaction Mapping

Through the Tinder app interaction, we found out how empowering it is to move something in physical space, no matter how small the movement, and have a matching object follow the motion in virtual reality. The tighter the coupling between the two objects felt, the more convincing the interaction was. Most current VR applications use a controller to point at and grab objects in space. This interaction feels natural, but there is a disconnect between the pressure of the trigger on your fingers and the distance at which you see the object in front of you. Grasping a physical object and manually moving it while a virtual object follows creates a strong connection between the two realities. Your motion in physical space makes you feel as though you have the agency to control the virtual world in a more natural way than a controller allows.
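Under the hood, that tight coupling is just a linear interpolation: the slider’s normalized reading is mapped onto the virtual model’s position between its two anchors. A hypothetical sketch in Python (the anchor coordinates and names are illustrative, not taken from our Unity project):

```python
def map_slider(t, trash_x=-1.0, heart_x=1.0):
    """Map a normalized slider reading t in [0, 1] to the virtual
    model's x position, spanning from the trash can to the heart."""
    t = min(max(t, 0.0), 1.0)  # clamp noisy or out-of-range sensor readings
    return trash_x + t * (heart_x - trash_x)

# Mid-slider leaves the model centered between the two icons:
center = map_slider(0.5)
```

The same one-to-one pattern generalizes to the crank and the weather disk: read the physical control, normalize it, and drive the virtual counterpart every frame.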

Acclimating to weird senses of touch

We placed a mango formed into the shape of a face on top of the Tinder controller to see how people reacted to touching an exaggerated, fleshy texture in this environment. This came from our early discussions wondering how the experience would change when it actually felt like you were grabbing someone and swiping them away in one direction or another. After a few rounds of testing, the interesting finding was not the closeness of connection participants felt with the person they were swiping, but how quickly they acclimated to the physical interface.

A few people would immediately draw their hand back when they felt the slimy mango, and almost all admitted it felt uncomfortable. But everyone went through with the experience, and their attention moved from the strange physical sensation to what they were doing in the virtual space. You could see their hands relax as they settled into the virtual world and accepted this as the new interface, whether intentionally or not.


Using Instagram as an example, we took the very relatable task of scrolling and Haptified it. Participants are immediately faced with a towering wall of images, which they can scroll through using a crank attached to gear wheels. By moving the images off a phone screen and into a space with more depth, the person gets a real feel for how much content they have to go through. While the scale of the wall can feel daunting, it also sparks curiosity in the user to explore the interaction.

The physical force of cranking

Rather than making it easier to scroll, we used the crank to bring a sense of physicality to the virtual task being performed. The user can no longer mindlessly scroll through content. There is a mental awareness that comes with this action of fully rotating the crank. Even if you don’t fully focus on the content scrolling past, you start to feel the physical energy required to keep the motion going.

The sound of the crank

We found through testing that the most convincing part of this interaction was the sound the gear in the crank made as it rotated. The loud mechanical sound lining up with each physical click of the gear’s teeth is deeply satisfying. The clicking carries over into the digital world, and the senses blend together.


The weather app’s purpose in this project was to bring in the sense of touch in a more passive way. The previous two apps required force input from the participant to make the interaction more physical. Since we had found the visual connection between physical and virtual objects so powerful, the person still must turn the Lazy Susan in the real world to move the virtual interface. In this experiment, however, we tried delivering sensations to the top of the person’s hand.

A lack of vocabulary for senses

The desert heat, the wind and the mist each give a unique sensation to the top of the hand. Through user feedback we found that the desert heat and the wind were often considered the most effective, while the mist was a little too strong a sensation. Reflecting on this, we found there was a lack of vocabulary to describe the feeling of a heat lamp shining on the back of your hand. It’s much more complex than just feeling “hot” and becomes hard to articulate to others. Paired with VR, it becomes even trickier: a full atmosphere is created around just your hand, and it totally changes based on your actions in a virtual world. We can easily describe things we see; our brains are trained to quickly form mental images from verbal descriptions. This is not the case with the other senses. Maybe it’s just because we don’t put enough emphasis on them in our daily lives.

Towards the Future

People use a rich vocabulary to talk about visual things, but tactile sensation is limited in words. Hot, cold, hard, soft, smooth: is that enough to cover the whole spectrum of tactility?

In The Haptification Manifesto we merged different technologies and dove deep into Mixed Reality to explore kinesthetic communication. During the experiment we pushed the boundaries of common interaction and explored multidimensional experiences with technology. We intentionally took a partly ridiculous, partly serious approach to what future interfaces could be. Our research journey showed that touch matters in any reality and enriches the human experience.

The seeds to shape the future of interaction have been planted. Our vision of this future is to free the senses from their cage and celebrate the sensation of touch.