The other methods of using Kinaesthetics outside of the typical computer can be broken down into three different types: Touch, Augmented Reality and Virtual Reality. In this article I will cover a few aspects of each, while also discussing the difficulties these new types of media can cause for UX Design.
Touch-based inputs now give the user more direct control over elements on the screen, along with new methods of navigating spaces. Some of these methods use finger motions to carry out functions, such as pinching in on the screen to zoom out and pulling the fingers apart to zoom in.
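To make the pinch gesture concrete, here is a minimal sketch of the underlying maths, assuming we are handed two touch points per frame with x/y coordinates. The names (`TouchPoint`, `pinchScale`) are illustrative, not any real platform's API:

```typescript
// One finger's position on the screen, in pixels.
interface TouchPoint {
  x: number;
  y: number;
}

// Distance between the two fingers.
function pinchDistance(a: TouchPoint, b: TouchPoint): number {
  return Math.hypot(b.x - a.x, b.y - a.y);
}

// Scale factor for the gesture: > 1 means the fingers moved apart
// (zoom in), < 1 means they pinched together (zoom out).
function pinchScale(
  start: [TouchPoint, TouchPoint],
  current: [TouchPoint, TouchPoint]
): number {
  return (
    pinchDistance(current[0], current[1]) /
    pinchDistance(start[0], start[1])
  );
}

// Fingers start 100px apart and spread to 200px apart.
const start: [TouchPoint, TouchPoint] = [{ x: 0, y: 0 }, { x: 100, y: 0 }];
const spread: [TouchPoint, TouchPoint] = [{ x: -50, y: 0 }, { x: 150, y: 0 }];
console.log(pinchScale(start, spread)); // 2 — zoom in to double size
```

The same ratio run in reverse (fingers moving together) yields a value below 1, which the app would apply as a zoom-out.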
Examples of touch-based platforms: controllers, phones, simulation controls.
One of the biggest challenges designers face with touch-based interactions is clearly showing what can be interacted with, as opposed to what is purely decorative. One method of fixing this is to have the interactive parts maintain a uniform style throughout the product, so that users can tie an affordance to that style.
While products such as Google Glass are still not available to a wide market, AR offers us some pervasive forms of interacting with the user to try to elicit emotions and actions from them, the main one being convenience. Being able to use a computer anywhere with simple hand movements, rather than being pulled down to look at a screen, can greatly change how we interact with the world around us.
Hand based interaction:
The unique problem raised by AR is the lack of tactile interaction (unless peripherals are being used), which requires us to emphasise the feedback given by the UI in order to keep the user clear on whether the device is working.
The other problem we encounter with this pervasive design is how to preserve the user's clarity of vision while using the device. We don't want to suddenly obscure the user's view with pop-ups and messages that cause displeasure, or leave them unable to pay attention to where they are walking.
But how can we build this idea of Kinaesthetics into AR?
The example I will use is this: imagine a clothing store had a sensor that could detect when someone with an AR device was in range and, much like a QR code, could send data to that device. Now let's say it can read gender/age; as you walk past the store, an icon appears asking: "Latest Sales, Click to see." You touch the virtual button with a hand motion and it opens a catalogue tailored to the data it has on you.
A few things happen here: we have curiosity, the convenience of not having to go out of our way to look at the items (although it still requires attention), and we can help stimulate people's impulse-buying nature by tailoring their searches to items they would find desirable.
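The tailoring step of this hypothetical scenario can be sketched in a few lines. Everything here is an assumption invented for illustration (`ShopperProfile`, `Item`, `tailorCatalogue`); no real AR SDK or store system is implied:

```typescript
// What the imagined sensor reads as the shopper walks past.
interface ShopperProfile {
  ageGroup: "teen" | "adult" | "senior";
  gender: string;
}

// One product in the store's stock list.
interface Item {
  name: string;
  targetAgeGroups: string[];
  onSale: boolean;
}

// When the shopper taps the virtual "Latest Sales" button, narrow the
// stock down to sale items aimed at the profile the sensor read.
function tailorCatalogue(profile: ShopperProfile, stock: Item[]): Item[] {
  return stock.filter(
    (item) => item.onSale && item.targetAgeGroups.includes(profile.ageGroup)
  );
}

const stock: Item[] = [
  { name: "Skinny jeans", targetAgeGroups: ["teen", "adult"], onSale: true },
  { name: "Winter coat", targetAgeGroups: ["senior"], onSale: true },
  { name: "Plain tee", targetAgeGroups: ["adult"], onSale: false },
];

// An adult shopper only sees the on-sale item targeted at adults.
console.log(tailorCatalogue({ ageGroup: "adult", gender: "f" }, stock));
```

The point of the sketch is the shape of the interaction, not the filtering itself: the device, not the user, initiates the exchange, and the hand motion is the only input required.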
VR on the other hand offers us a completely immersive method of interaction that can be used for a myriad of purposes, including: Gambling, Gaming, Learning, Social Interactions and more.
But the problem facing developers working on products like the Oculus Rift is: how will it control? With an increased level of immersion come new forms of input that we will have to pay attention to.
Visual Inputs becoming motion based rather than touch based:
Before, when navigating a space, whether with a mouse, controller or touch, we received visual feedback on the actions we were taking, but we were always removed from the system by having to look at a screen and use a device to interact. Now, what happens when you remove the requirement of a touch-based input to look around the room, and can do it just by turning our heads? For most people this will be fine, until we add movement.
This is where kinaesthesia comes into play: the feeling we get when moving through space, which isn't one of the base five senses. When creating an immersive experience, you need to take into account that the body will know it is not moving if the user is sat down. This can cause a disconnect between the user and the experience if the character begins moving while the user remains stationary, which can at times cause major motion sickness.
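The "look around just by turning our heads" idea above boils down to mapping the headset's orientation straight onto the camera. A minimal sketch, using standard degrees-to-vector maths; the function names are mine, not a headset API:

```typescript
const DEG = Math.PI / 180;

// Convert head yaw (left/right) and pitch (up/down), in degrees, into a
// unit "look" vector. Yaw 0 / pitch 0 faces straight ahead along +z.
function lookVector(
  yawDeg: number,
  pitchDeg: number
): [number, number, number] {
  const yaw = yawDeg * DEG;
  const pitch = pitchDeg * DEG;
  return [
    Math.sin(yaw) * Math.cos(pitch), // x: left/right
    Math.sin(pitch),                 // y: up/down
    Math.cos(yaw) * Math.cos(pitch), // z: forward
  ];
}

// Facing forward:
console.log(lookVector(0, 0)); // [0, 0, 1]
// Head turned 90° to the right: now looking along +x (up to rounding).
console.log(lookVector(90, 0));
```

Every frame, the engine reads the headset's yaw/pitch and points the camera along this vector, with no button press involved; that frictionless coupling is exactly what makes a mismatch with the body's stillness so jarring once artificial locomotion is added.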
To combat this issue, designers have been creating various devices that allow continuous movement within a small area, such as omnidirectional ("Omni") treadmills. Below I have included a video on how designers are using these technologies in the context of games.
Motion Controls in VR:
Another important change when designing for VR is how the user will interface with the environment other than by looking at it, and what kinds of devices we need to get across the feeling of the person actually being there.
While designers are making leaps in the realms of different forms of interactive media, we still have a few years before they can be commercialised. Hopefully, one day the convenience and immersion they offer will be widely available.
Now onto the Final Article, how do we test UX design?
Superbunnyhop — Is VR the future of gaming? https://www.youtube.com/watch?v=Y2myY_pQEMk
“Human Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications, Third Edition” (2012) by Julie A. Jacko