The UX of the Future: The Future Kind of Has a UI
I recently had the pleasure of hearing Golden Krishna talk about his new book ‘The Best Interface is No Interface’ at a Designers + Geeks event in San Francisco. Golden is well known for his dry and witty stabs at an industry that believes ‘if you like it, you better put a screen on it’ (apologies to Beyoncé). It was Golden’s thoughtfulness about more intelligent interaction with our devices that really got me thinking: what would happen if we didn’t always think screen first? More importantly, what can the tech industry do to prepare for this shift in how we interact with devices? Can we avoid panic when new devices arrive?
A UX-based thought process can help designers, marketers, salespeople, and users interact with the new generation of devices. When a disruptive new technology hits the market, we can learn from the mistakes of devices past.
I believe the future does have a UI, but it will also have much more than that. If we more closely examine how we design interactions with new types of devices, we can isolate the user needs that actually drive these products, not the form. I am going to talk about how to use UX design to conceptualize products whose hardware lives symbiotically with the software inside it.
There is plenty of healthy (and unhealthy) debate about the future of wearables. I believe we are past the peak and trough of Gartner’s hype cycle, and now we can have productive conversations about what the future of wearables looks like.
We can get over the giddiness of making pedometers in every color, shape, and price point, and really start to design products that work for and with us. As a UX and product designer, my question is this: how do we design those products when the paradigms of design are shifting so rapidly? The obvious answer: apply sound UX design principles.
Design only what’s needed
Don’t design features users don’t need (no matter how beautiful the design is)
We have heard a lot about the supposed downfall of Google Glass. In reality, the implementation of Glass has pivoted to focus on workplace integration. Workplace-focused wearables present a particularly difficult UX challenge. Should the device change the way a worker works, or should it be used for tracking, learning, and teaching? In the case of Pristine’s Google Glass implementation, the answer is, “get out of the worker’s way and let them work.” Kyle Samani, Pristine’s CEO, believes that the best experience for Pristine’s Google Glass user is an experience that is barely there. This is a brilliant design decision, and often the hardest one to make: designs that no one will ever see. We let our designs do what they are supposed to do without making users constantly aware of them.
Some screens are necessary.
Design with them wisely and consistently.
It may be too early to tell if the iWatch really is going to change everything. What does seem obvious is that the line between “wearables” and everything else is going to blur. I don’t think this blurring has to come from smaller, higher-resolution screens. Companies such as Fitbit and Misfit are paying close attention to the consistency of the user experience when it comes to their wearable products.
Designers at both companies spoke constantly about the UX of the physical design: the weight, feel, and interaction of the device. Both Misfit and Fitbit are designing the minimal amount of screen their devices require. Harnessing massive amounts of data to subtly inform users of their behavior is incredibly powerful. Just because we can change someone’s behavior with a 1080i HD screen on-wrist doesn’t mean we can’t do the same thing more efficiently with a simple e-ink screen. A wealth of information and experience can live on a larger, more appropriate screen, like the user’s laptop.
The UI can’t be forgotten
Even if screens aren’t the focus of a product.
Products like Coin don’t use a screen for their primary application. Coin’s designer Kim Ho discussed how all the touchpoints of a product should be consistent. Coin is not a mobile-UI-focused product, but it would struggle to exist without one. In the same way that a beautiful mobile interface can’t save a clunky, useless wearable, an ineffective UI can sink a thoughtful and elegant device.
All customer touchpoints are important to the overall user experience. Cooper Design’s Lauren Ruiz writes in her Service Design 101 post that communicating effectively across all channels is crucial to a cohesive experience. This continues to hold true even as we add more and more channels.
Service-oriented architecture design
Control the experiences you can, and glean all you can from the rest!
In the integrated tech world, designers and users must be able to plug and play. A lot of software products are beginning to integrate with third-party physical sensors. Whether that is a third-party integration or a “necessary-evil” sensor, some products rely almost exclusively on the data they receive externally. Designers at Strava have to deal with data coming from numerous different devices and sources.
So how does Strava integrate all of this into a cohesive experience? They are data-source agnostic. Since their influence over other devices is limited, Strava uses the data itself to create the experience. They don’t reinvent Google Glass or the iWatch; they optimize for the functionality that is there.
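Strava hasn’t published its internal architecture, but the data-source-agnostic idea can be sketched in a few lines: every device gets a small adapter that translates its raw payload into one shared model, and the experience layer only ever sees that model. All the names and payload shapes below are hypothetical, for illustration only.

```python
from dataclasses import dataclass

# The normalized record the experience is designed against.
# (Hypothetical model; not Strava's actual code.)
@dataclass
class Activity:
    distance_m: float
    duration_s: float

def from_gps_watch(raw: dict) -> Activity:
    # Hypothetical watch payload: kilometers and minutes.
    return Activity(distance_m=raw["dist_km"] * 1000,
                    duration_s=raw["mins"] * 60)

def from_phone_app(raw: dict) -> Activity:
    # Hypothetical phone payload: already meters and seconds.
    return Activity(distance_m=raw["meters"],
                    duration_s=raw["seconds"])

# One adapter per source; everything downstream is source-agnostic.
ADAPTERS = {"gps_watch": from_gps_watch, "phone_app": from_phone_app}

def ingest(source: str, raw: dict) -> Activity:
    return ADAPTERS[source](raw)
```

Supporting a new device then means writing one new adapter, not redesigning the experience: `ingest("gps_watch", {"dist_km": 5.0, "mins": 25})` and a phone upload both come out as the same kind of `Activity`.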
By focusing on the needs and behaviors of our users, we can identify real needs before we decide which tool to address them with. Whether we are designing, marketing, selling, or buying these new devices, it is important that we think of them in the context of the human user. Attaching increasingly high-resolution screens to everyday items does not constitute revolutionary design, nor does it represent good UX design.
Philip Likens speaks often about how we can design for the behavior of humans instead of designing for the possibilities of current technology. Philip thinks “the definition of UI will change. Potentially giving rise to the emergence of VX (Voice as Experience).” Instead of thinking about wearables as new ways to alert us to notifications, we can think of them as new ways to interact with the software we build. Instead of using fingers and screens to log information, new sensors and interactions can do it for us. UX designers of the future will have to consider not just taps, clicks, and swipes, but also oohs, aahs, waves, and blinks. Whether we create disruptive technologies or new paradigms, the future will be designed by the users, not the screens.