Designing for New Realities: How AR Changes Everything

Sutherland Labs · Sep 19, 2019

By The Labs Team

Some technologies transform our everyday behaviors, habits and social rituals, and the new wave of mass-market immersive visual technology (IVT), including augmented reality (AR), virtual reality (VR), mixed reality (MR) and augmented virtuality (AV), is another behavioral game changer. As designers, this is what makes AR, VR, MR and AV such fascinating new technologies: they might just define a new normal, and that hasn’t happened for a while.

Image by Jamie Blackett

The last new normal game changer: The smartphone

It’s easy to forget that just fifteen years ago people took pictures mostly on cameras. The first commercial mobile phone with a camera appeared in 2000, and it took years for camera phones to challenge traditional cameras and camcorders as our go-to image capture choice. The convergence of cameras and mobile telephones, along with the growth of radio bandwidth for Bluetooth and data transmission, Wi-Fi, new operating systems like iOS and Android, and open-source databases and APIs, transformed our daily lives, disrupted business empires that had stood for most of the 20th century, and created new global giants that connected us in new ways.

Mobile devices were a game changer. Within a decade, we had selfies, duck face, the Kardashians and Instagram. Beyond that, chat had become something we did mostly with our thumbs via apps, rather than with our voices via phone calls. We shop, watch, listen, play, chat, work and do pretty much everything else on the mobile computers we nostalgically call phones. Beyond mobility, the smartphone developed in tandem with tablets and laptops, and with innovations like touchscreens and cloud-based software that made everything omni-channel. We went from a world of print, pens and paper to a world of screens in the space of a few decades.

Breaking down the screens

For all the huge benefits we’ve enjoyed in our screen-heavy world, from watching Netflix on the commute to video conferencing with offices around the world, replacing tons of paper with digital documents and becoming 24/7 contactable via email and messaging, there’s also a downside to the screen: it’s a physical barrier between us and other people.

Dilbert cartoons are set in a Kafkaesque bureaucracy where everyone works in cubicles. That style of office was bad for employee morale, and there has been a shift towards more open-plan designs, which is great, except you soon realize that we’re not looking at each other, we’re looking at screens. It’s a virtual cubicle. Similarly, on the train, all those people eyes-down on their devices are not interacting socially; they are in a screen-enabled mental cubicle. You’ve probably seen people walking down the road, thumbing their phones, oblivious to the world around them, again mentally walled off from their environment by a screen.

None of that is inherently bad, we all need privacy sometimes, but it’s important to remember that work is also a social interaction, as is sharing a public space. Humans collaborate, connect and engage with other humans, and too often the screen can get in the way.

Image by Jamie Blackett

The future is looking up, and looking deeper

The power of looking up, and seeing all that screen-based information in context, applied to the world around you, is part of the promise of AR, VR and MR. These technologies also offer the ability to make our use of screens much more effective, giving them greater utility. What both looking up and looking deeper mean for people is a whole new experience of daily life, and new normal behaviors. So let’s consider where these tools might take us… sooner than you think.

#1: Looking deeper: Transforming retail habits

The transformative potential of AR, VR and MR is most easily seen in their ability to visualize environments. As we’ve discussed previously, apps like Ikea Place or Ace & Tate’s virtual eyewear try-on tool will become more sophisticated and widespread.

We should expect MR apps everywhere that bring much more context to digital shopping experiences, for example:

  • Apps that let you see yourself in virtual clothes. We have this now, but it will become more sophisticated and realistic as the tech develops, and you’ll be able to see it in AR-enabled mirrors too.
  • More integrated MR experiences in-store. Last year Coty launched a magic mirror that lets users visualise their make-up in multiple shades and styles, virtually: just apply one tester, then augment your image to explore the whole range. Various high-tech stores also have mirrors that use cameras to show you your rear view (does my bum look big in this?). That kind of tech is a game changer for expensive designer items, and it also offers the ability to 3D-scan yourself and re-use that data in both online and offline retail contexts.
  • Apps that overlay virtual clothes on your photos so you can see how you would have looked in different contexts (weddings, parties, work, etc.), helping you choose new outfits for occasions based on your actual experiences, social network and cultural identity.
  • Apps that let you change fabrics, lighting, background and so on, exploring your clothing options in greater depth and making choices based on on-screen visualizations rather than your imagination.
  • Imagine similar functionality for furniture: take pictures of your current home and then swap items around, change wall colours and flooring, reconfigure layouts, and so on.
  • Gift apps where you can take a picture of a friend and select clothing gifts that you know will fit, or find a perfect vase for your mum’s dining table and know how it will look before you place it there.

#2: Looking up: Contextual information and better human connections

Right now we rely almost entirely on written words and sounds to comprehend data from the dozens of systems we take for granted (like email, messaging, social media, system notifications, deliveries and news alerts), but humans process simple, non-verbal visual data much faster and more intuitively (which is why warning lights on dashboards work so effectively). The obvious power of immersive visual technology means visual overlays could become the new normal in many contexts.

  • The AR-enhanced home and workplace could become an important IoT (Internet of Things) interface for interacting with automated objects and optimizing energy efficiency, safety and process design. It also offers contextual, real-world help as opposed to manuals and screen-based training tools, giving us hands-free support with visual guides.
  • There will also be an AR-enhanced view of the office. Taking current screen-based project management and productivity tools into an AR, VR and MR space could transform human efficiency. Look up and see who is free, who is busy, and who is working in different teams or projects. Look around the room and see who you need to talk to, which rooms are free, who is leading a project, or who knows a client. In that office, you could share files by handing over virtual items, and collaborate on shared documents as if working at a whiteboard.
  • AR, VR and MR enable people to connect in person but also remotely, so two people in different locations can have a sit-down chat. Immersive tools offer us a world beyond Skype or conference calls, because freeing us from looking at a screen (with AR eyewear, for example) potentially means meetings could happen anywhere, anytime, and involve as many or as few people as you need. They could become more spontaneous, last as long or as short as you need, and you could pass documents or tasks around like handing out biscuits. In your pyjamas, on the beach…

Finally, changing the place also changes pace

We had email before we had smartphones; in fact, we had it before most people had mobile phones at all. The tool itself didn’t change from one platform to another, but having email in your pocket transformed our relationship with it and created new working behaviours as we collaborated and communicated more effectively at home, in the workplace and while travelling. It’s that change of context that means AR, VR and MR will transform our normal habits: taking familiar tools and transposing them from a screen into glasses, or taking data and overlaying it on the images and videos we capture on our mobile devices.

Perhaps we’re not that far off looking back and laughing at GPS sat navs that you had to look down at, taking your eyes off the road ahead; or buying clothes from a picture of a catalogue model rather than seeing them on yourself, from multiple angles; or assembling flat-pack furniture with an AR guide and laughing about incomprehensible Ikea instructions; or remembering the bad old days of conference calls where nobody spoke, emails cc’d to everyone in the office, and chatting with someone a couple of desks away on Slack rather than actually speaking to them.

What all these examples share is using the tech to reduce user friction, specifically the friction caused by having to look down at a screen or monitor when our natural human inclination is to be social, and mobile.

Watch this space? Yes. Literally.
