iUX: Convergence of virtual and physical worlds

Noman Siddiqui
Published in i-ux · 6 min read · Jan 24, 2017


In a fast-paced global economy, where a large number of organizations are struggling to keep up with increased competition and deal with uncertainty, augmenting the user experience (UX) of products and services has become paramount. Research shows that 94% of a user’s first impressions of an interaction are related to functionality and design. It is therefore not surprising that over the past decade, the perception of UX/UI design has transformed from an aesthetic strategy into an essential need (and a way of thinking) that drives success for almost every organization and institution.

Indeed, technology has woven some undesirable consequences into the social fabric of our society by keeping us overly engaged with our digital touchpoints. However, it has also helped us stay connected with people close to us (globally), keep track of what is happening in the world, and quickly decide which places to go and how to get there. It has clearly amplified our understanding of what the world is, the way we continue to experience it, and how we choose to experience the technology around us.

Although many professionals think of UX design as a contemporary solution, UX is not as novel as it is perceived to be. In fact, it has been practiced and delivered for a long time by various industries, including automobiles, film, video games and theme parks (image below), to name a few.

The difference lies in the medium: a variable that has enabled more human-computer interaction than ever before.

Disney Magic Band. Image Credit: Kent Phillips/Disney Parks

Due to the depth of its foundations and impact, UX thinking can now be applied to both our virtual and physical worlds, and the confluence of the two is inevitable. In fact, it is already taking place as you read this, and with the emergence of newer and more complex mediums, these two worlds will not just collide but gradually integrate with each other.

The goal of the iUX series is to inspire and guide established and future UX/UI designers by connecting best practices of UX and interaction design from our virtual and physical worlds.

The question is whether we are ready to design future experiences for this age of convergence.

Don Norman defines UX in a way that strongly aligns with my narrative:
“User-experience is everything. It’s the way you experience the world, it’s the way you experience your life, that’s the way you experience the service, or an app or a computer system — but it’s a system that’s everything.”

As a result, we can say that the new information era will redefine how we interact with the next generation of systems, including mobile devices, tablets, products and, most intriguing of all, AI.

By referring to AI, I don’t mean to plant an idea that is synonymous exclusively with AI robots (like Honda’s ASIMO or the “insanely cute” home robot Kuri, by Mayfield Robotics).

Kuri, by Mayfield Robotics (left) | Honda’s ASIMO (right)

AI encompasses various other mediums, including touch screens, wearables, intelligent apps and, possibly, holograms (like Azuma Hikari or HoloLens) in the future. In recent news, Google Translate has reportedly started inventing its own language with its new engine, the Google Neural Machine Translation system (GNMT), to help it translate more effectively for us. You can read more about it in Gil Fewster’s article on Google’s AI announcement.

Illustration by Pablo Delcan, from The New York Times

Here are some questions to think about:

  1. How will the user experience evolve for future applications?
  2. How will this change the current skill-set of UX designers?
  3. Which new skills will UX/UI designers need to learn to design for both virtual and physical worlds?
  4. Who will design the future of human-AI interaction?
  5. What if other forms of AI take over redundant design jobs (possibly by automating the assembly of designs from reusable components and pattern libraries)? In that case, how will we use our time?

These are some of the questions the iUX series aims to answer. Below is a clip from the film A.I. Artificial Intelligence. It shows just a glimpse of how Rick Carter (production designer), Steven Spielberg (director) and Ian Watson (writer) envisioned the UX/UI of future search engines integrated with AI, over 20 years ago.

How will this vision evolve in the coming years? We can already see it happening at a grassroots level as we interact with our own evolving, intelligent (and witty) personal assistants, Siri and Google Assistant, available on mobile, tablets and laptops (without being charged for every question we ask).

Lynne Parker, Division Director at the National Science Foundation, writes: “AI could open us up to the ability to be creative and to really think broadly because it can relieve us of some meaningless jobs. I think there’s a potential there if we seize the opportunity to be relieved from everyday tedious things to do things that are more impactful and really more human, more intelligent, more creative. Whether or not we will seize the day, as they say, is a question to be answered.”

It is very likely that physicists and engineers will take care of developing future mediums in the preliminary stages. However, because of the emotional design thinking and creative problem-solving involved, UX designers, cognitive psychologists and behavioural scientists (see video below) will be relied on heavily to do these jobs in the long run.

Susan Weinschenk is a Behavioral Scientist, Author, Speaker, Consultant and Mentor

Interestingly, we can see the ripples of this integration in our physical world too, perhaps most vividly in our virtual personas on social media. Think about the last time you unconsciously tried to superimpose your physical personality onto your virtual personality (or vice versa). Most of us do this because we like to have a seamless experience with those we interact with, online and offline.

In other words, our virtual persona tends to become an idealistic extension of our physical personality. The virtual persona, however, tends to travel far beyond the physical one, breaking up into components of information, details and usable big data, which can end up defining us more than our physical presence does.

What happens when you try to design with both physical and virtual personas in mind? Below is an example of MirrorFugue, a set of interfaces for the piano that visualize the gestures of a performance. It is interesting to consider how a medium like this will evolve in the future. Would it make life easier or more complex by allowing us to share a familiar emotional experience with the ones we have lost? Or, perhaps, by letting us share an experience with our inspirations and influences (e.g. by playing alongside a virtual celebrity)?

The iUX series aims to inspire, prepare and guide established (and upcoming) designers by connecting best practices of UX/UI and interaction design from our virtual and physical worlds.

Another goal of this narrative is to see how this cohesive user experience can play a role beyond improving products and apps. If the convergence of these worlds (virtual and physical) is certain, then why not prepare for it and benefit from it in our waking lives? In particular, to make ourselves more efficient as the next generation of designers and more purposeful as evolving human beings.

If you liked this article, please give it a clap ♥︎
Thank You.

Go To Next Episode >
iUX: Guidelines for present and future UX Designers — 1 of 2

Noman Siddiqui is a Canadian UX/UI/Interaction Design Consultant.
