Designed Intelligence: Enhancing the human experience

Fjord
Published in Design Voices · Sep 11, 2020 · 6 min read

How designing with AI can enhance the human experience.

By Connor Upton, Group Design Director, Fjord at The Dock, and James O’Neill, Service and Systems Design Lead, Fjord at The Dock.

“Any sufficiently advanced technology is indistinguishable from magic.” — Arthur C. Clarke

We often hear about how AI can automate mundane tasks and free us up to do other things, but we rarely talk about what those other things are. Consider an alternative perspective: how might AI help us do the things we love, but better? AI can extend our perceptual and cognitive abilities and change how we interact with the world around us. It could even give us new capabilities beyond what we can achieve on our own. Enhancing the human experience is one of the pillars of Designed Intelligence, our approach to designing for, and designing with, AI at Fjord. It lets us think about experience design in a new way: not just as a means of delivering an existing service, but as a way of opening up new modes of customer interaction and generating new business models.

Artistic inspiration

The world of art is often a source of inspiration for technologists and designers. Autonomous vehicles were described in a short story by Asimov in 1953, and voice interfaces by Philip K. Dick in 1968. Technology also informs the work of artists, and AI has now become a creative medium in its own right. When Google released DeepDream, a computer vision application, in 2015, many creators experimented with its ability to generate psychedelic patterns or to transfer artistic styles onto their images. But beyond its ability to mimic style, artists are exploring how AI can be used in more novel ways. Gene Kogan’s work on deepfakes, in which he controls the faces of famous politicians, opens up questions around the trustworthiness of media. Memo Akten uses neural nets to create new forms of participatory art. In his piece “Learning to See: Gloomy Sunday”, he invites people to move mundane household objects around under a camera to generate stunning interactive scenes from nature, including seascapes, clouds and fire. These “live reality filters” are a new form of experience, and they feel magical. Like all good art, they help us see our reality through different eyes. So, how are these works of art and AI influencing the world of interaction design?

AI as interface

AI, like all new technologies, needs new modes of interaction. The early web brought drop-down selection and digital forms into the mainstream; mobile brought us the hamburger menu, pull-to-refresh and infinite scrolling. Each new type of interaction is a response to the goals people want to achieve and the constraints or abilities of the technology. While AI may seem new, it is already driving many of our interactions. Everyday experiences, like autocomplete in your search bar and song recommendations on your streaming service, are so common that we forget they are powered by machine learning. Even more recent AI advances like computer vision are becoming important interaction techniques. The first time you pointed your camera at your credit card rather than typing in all those numbers, it felt amazing. This “perceptual accelerator” got the job done faster and more accurately using technology that was already built into the device.
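Part of why camera-based card capture feels so accurate is that card numbers carry a built-in checksum, so a recognizer can quietly reject misreads before asking you to confirm. As an illustrative sketch (the Luhn check is a long-standing industry standard, though this article doesn’t describe any particular implementation):

```python
def luhn_valid(card_number: str) -> bool:
    """Validate a candidate card number with the Luhn checksum.

    A camera-capture flow can use a check like this to discard
    OCR misreads before showing the number to the user.
    """
    digits = [int(c) for c in card_number if c.isdigit()]
    if len(digits) < 12:  # too short to be a plausible card number
        return False
    checksum = 0
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0
```

A failed check can trigger a silent re-scan instead of an error message, which is what makes the interaction feel effortless.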

We’ve been experimenting with how computer vision can be used in this way in different domains. For example, when you buy a box of medication it comes with a wad of paper describing ingredients, dosage, side-effects and other important information. Most of us never read it. But what if you could just point your camera at the box and let it identify the drug and cross-reference it against your conditions, drug regimen, allergies and other factors? Native digital formats make it easier to highlight the parts that are most relevant to you, ensuring that you don’t miss any important information. We developed a prototype that showed this approach is feasible. What impact might this have on patient experience and safety?
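Once the camera has identified the drug, the cross-referencing step is straightforward to sketch. The drug names, interaction table and profile fields below are hypothetical stand-ins; a real service would query a curated pharmacological database rather than a hard-coded dictionary:

```python
from dataclasses import dataclass, field

# Hypothetical interaction table for illustration only.
INTERACTIONS = {
    ("ibuprofen", "warfarin"): "increased bleeding risk",
}

@dataclass
class PatientProfile:
    allergies: set = field(default_factory=set)
    current_drugs: set = field(default_factory=set)

def check_medication(drug: str, profile: PatientProfile) -> list:
    """Return the warnings relevant to this patient for a scanned drug."""
    warnings = []
    if drug in profile.allergies:
        warnings.append(f"Allergy alert: you are allergic to {drug}.")
    for other in profile.current_drugs:
        # Store pairs in sorted order so lookup is symmetric.
        key = tuple(sorted((drug, other)))
        if key in INTERACTIONS:
            warnings.append(f"Interaction with {other}: {INTERACTIONS[key]}.")
    return warnings
```

The point of the design is the filtering: instead of a leaflet listing every possible warning, the patient sees only the ones that apply to them.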

Using similar technology, AI can redefine how we navigate the physical and digital world. Working with grocery retailer Whole Foods, our Austin studio developed a proof of concept that demonstrates how computer vision, NLP and recommendation systems can be blended into the shopper experience. Enhancing the experience with these technologies allowed shoppers to navigate the store more easily, find products more quickly, and discover new recipes and ingredients in real time.
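The article doesn’t describe the internals of the Whole Foods proof of concept, but the recipe-discovery piece can be sketched as a toy recommender: rank recipes by how much of each one the shopper’s basket already covers (the recipe data here is invented):

```python
# Hypothetical recipe catalogue; a real system would draw on a
# product database and the shopper's purchase history.
RECIPES = {
    "tomato soup": {"tomato", "onion", "basil", "stock"},
    "guacamole": {"avocado", "lime", "onion", "cilantro"},
    "caprese salad": {"tomato", "mozzarella", "basil"},
}

def recommend_recipes(basket: set, top_n: int = 2) -> list:
    """Rank recipes by the fraction of ingredients already in the basket."""
    scored = []
    for name, ingredients in RECIPES.items():
        overlap = len(basket & ingredients) / len(ingredients)
        scored.append((overlap, name))
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_n] if score > 0]
```

In the in-store experience, computer vision would populate the basket set automatically as items are picked up, so suggestions update in real time.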

AI and AR for grocery shopping.

The role of computer vision in user interaction will accelerate as the technology becomes more accessible. Companies like Matterport are already providing a platform that allows people to capture, edit and share 3D models of physical spaces and objects. Originally targeted at professional users, their new mobile app puts the capability into everybody’s hands. This makes it possible for designers to experiment with creating digital twins, simulations, and AR experiences.

Extending human capabilities

As well as accelerating our interactions with services, AI is allowing us to tackle entirely new interaction challenges.

Dr Peter Scott-Morgan is a Cambridge academic who suffers from motor neuron disease, a degenerative condition that attacks nerves in the brain and spinal cord. Peter has set out on a mission to become the world’s first true cyborg, undergoing a series of physical and technological augmentations to help him continue to live and work. The personality retention project is a collaborative research program that uses emerging technologies to support this mission. Working with a team of partners, Fjord helped design a new eye-tracking keyboard that integrates with text-to-speech technology. This solution enables Peter to continue to write and communicate even as he gradually loses his natural physical and sensory abilities.
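One building block of an eye-tracking keyboard is word prediction, which cuts the number of gaze selections needed per word. A simplified sketch, with an invented frequency-ranked vocabulary standing in for the language model a real system would use:

```python
# Hypothetical vocabulary with usage frequencies; a real eye-tracking
# keyboard would use a model adapted to the user's own writing.
VOCAB = {"the": 1000, "this": 800, "think": 120, "thought": 90, "hello": 50}

def suggest(prefix: str, limit: int = 3) -> list:
    """Offer the most frequent words matching the typed prefix, so a
    whole word can be selected with a single gaze dwell."""
    matches = [w for w in VOCAB if w.startswith(prefix)]
    matches.sort(key=lambda w: VOCAB[w], reverse=True)
    return matches[:limit]
```

Every accepted suggestion saves several dwell selections, which matters enormously when each keystroke costs deliberate physical effort.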

Concept design of new visual keyboard from personality retention project.

We’re also using technology to give people skills they’ve never had before. For example, with VELUX, the roof window specialist, we designed an app that lets people see how additional light could transform their living space. Combining computer vision and augmented reality with a seamless interface, the app lets people scan a room and then place virtual windows into the space. This gives the average homeowner the ability to explore and imagine 3D space with the vision of an architect.

By enhancing our experience, AI can also help us see our blind spots. Microsoft PowerPoint’s Presenter Coach is a feature that does just that. It uses speech recognition to analyse the user’s presentation delivery and flag areas for improvement. It can help you realise when you’re saying “ummm” too much, or even make you aware when you use unnecessary gendered pronouns. In this case, the AI truly helps you see the things you might usually miss.
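The article doesn’t detail Presenter Coach’s pipeline, but the filler-word part of such a coach reduces to counting flagged words in a speech transcript. A minimal sketch, with an invented filler list (a production tool would use a larger, locale-aware set plus acoustic cues):

```python
import re
from collections import Counter

# Hypothetical filler-word list for illustration.
FILLERS = {"um", "umm", "ummm", "uh", "like", "basically"}

def filler_report(transcript: str) -> dict:
    """Count filler words in a transcript, ignoring case and punctuation."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w in FILLERS)
    return dict(counts)
```

Surfacing the counts after a rehearsal, rather than interrupting mid-sentence, is what makes the feedback feel like coaching rather than correction.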

Prototyping the future

These examples show that if we design it right, AI can be a powerful tool. At Fjord, we are working with our colleagues at Accenture to research, ideate and prototype new forms of human interaction with AI: from virtual agents, to robotic assistance, to intelligent operations. Taking a human-centred approach results in solutions that genuinely answer people’s needs and ultimately enhance their lived experience.


Design and Innovation from Accenture Interactive