(AR)range

Vikas Yadav
7 min read · Feb 5, 2017

(AR)range is an ecosystem in which people can tag physical objects with digital information in a desired context. Built on Internet of Things and Augmented Reality platforms, arrange manages information and communication flow in a family setting. The system consists of two components: a mobile application for tagging objects with new information, and physical objects that reveal that information upon physical interaction.

Client: Philips

Duration: 6 weeks

Team: Allison Huang, Irene Alvarado, Mackenzie Cherban, Vikas Yadav

My role: Design research, physical prototyping, system mapping, video prototyping

Tools Used: Sketch, After-effects, Unity

PROMPT

We were asked to investigate sleep in a multi-user context. Most digital sleep solutions focus on the sleep quality of a single individual, but in many cases, such as a partner who snores or a family with a newborn, the total sleep quality of the household may be much more relevant to consider. The open-ended nature of the brief allowed us to explore numerous scenarios around sleep, health, and families, and encouraged us to explore emerging technologies in that context.

CHALLENGES

We started by brainstorming crucial questions around the brief in order to define a scope for moving ahead. Raising these questions helped us think through different scenarios. Each of us created storyboards to observe and assess the need for, and appropriateness of, technology in personal and familial health and sleep. Storyboarding yielded insights that we distilled into an array of specific questions for our user interviews.

USER INTERVIEWS

We interviewed five people of varying ages, professions, and family structures. Almost all had dramatic stories about how health-related issues affected their family’s sleep. A few spoke of respiratory issues their children suffered during sleep. Some also talked about how seemingly innocuous sicknesses affected both their own sleep and their children’s: coughing can keep people awake, and stomach bugs are common sleep disruptors for everyone involved. Four of our five participants mentioned that they or a family member had experienced a potential sleep disorder. One participant spoke of a potential sleep disorder that went overlooked, even after his wife raised it with him, until he saw his own sleep data from a sensor he wore while sleeping.

DESIGN PRINCIPLES

We synthesized our findings from the interviews into “How Might We” questions, which we then transformed into design principles to abide by. These principles guided us as we delved further into the topic and enabled us to envision strategies that could encompass all of them. We asked: How might we…

• Improve the sleep quality of patients and family members with sleeping disorders?

• Help people be aware of their own sleep patterns and those of the people they live with?

• Collect data, both quantitative and qualitative, that would aid in the diagnosis and treatment of sleep disorders in a way that is intuitive and non-disruptive?

• Reduce the stress caused by health-related sleep disruptions?

• Leverage the intuitiveness of the physical world along with the richness of the digital world?

GENERATIVE PHASE

PROTOTYPING

Moving forward, we researched what currently exists: both widely adopted and novel technologies. We often referred to work coming out of the MIT Media Lab, particularly Hiroshi Ishii’s musicBottles and Sublimate and Valentin Heun’s Reality Editor. These projects bring tangible objects to life in a digital world, connecting the physicality of our environments to information that is usually hidden behind two-dimensional screens.

Our team grappled with a major question: how do we prototype when designing for AR? It should come as no surprise that we started much like any other design process: by picking tools that would let us prototype cheaply and quickly. Paper, scissors, tape, markers, and other simple objects were good enough for our early explorations. We wanted to achieve two things during this process: a) gain an understanding of the affordances and possibilities of AR, and b) test which of our ideas merited further exploration through higher-fidelity prototyping. The former was important because none of us had worked with AR before, and apart from games such as Pokémon Go, there weren’t any AR experiences in our daily lives we could point to for inspiration. The latter was important because our research gave us a long list of needs and pain points to possibly address, but very little intuition for which problems to prioritize, or which ones we could reliably address given our project’s time constraints. We kept coming back to the design brief, our research, and our design principles, asking questions like:

“What would actually be helpful for families as a unit? What could be used to improve sleep quality, for the user themselves or for the user’s family members? What would fade nicely into the background or be integrated effortlessly into people’s lives?”

We created five video prototypes to explore early ideas of how physical interactions might reveal different pieces of data in augmented reality. Three of them explored our idea of using an aura as a signifier, which ended up carrying all the way through to our final solution. We also created one prototype that augmented a meaningful object and leveraged a pre-existing affordance of that particular object: another idea that made it all the way through. Finally, the fifth video prototype showed gestural interactions for moving a hologram around, which did not make it into the final design; after creating both video and working prototypes, purely gestural interactions no longer seemed compelling.

REFINEMENT PHASE

SYSTEM OVERVIEW

SYSTEM MAP OF ARRANGE

The prototyping process led us to create arrange: a holistic digital platform for families that places embedded information at people’s fingertips, connecting the information we find most meaningful to the objects that surround us. It combines the intuitiveness of the physical world with the richness of the digital world. The arrange system consists of two components: a mobile application and pre-existing physical objects tagged with information in AR.

Component 1 [Mobile Tagging System]: The mobile app enables users to photograph objects and tag them with personalized AR information. The basic interaction involves taking a picture of an object, then going through a series of steps to customize the AR content to combine with it. Users can choose from a variety of categories, ranging from sleep and health to messaging and games. arrange also lets users assign tags to specific family members and preview and confirm holograms. Once an object has been tagged, it is cataloged within the app: users can see the objects they’ve tagged most recently as well as edit and reactivate older tags.
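As a rough illustration of the data model such a tagging app might keep, here is a minimal sketch in Python. Every name here (`ARTag`, `TagCatalog`, the field names) is hypothetical, chosen for the example, and not taken from the actual arrange implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ARTag:
    """One tagged object, as cataloged by the app (hypothetical model)."""
    object_photo: str        # photo used to recognize the physical object
    category: str            # e.g. "sleep", "health", "messaging", "games"
    content: str             # the AR content combined with the object
    assigned_to: List[str]   # family members who can see the hologram
    created_at: datetime = field(default_factory=datetime.now)
    active: bool = True

class TagCatalog:
    """Keeps every tag so users can review, edit, and reactivate them."""
    def __init__(self) -> None:
        self.tags: List[ARTag] = []

    def add(self, tag: ARTag) -> ARTag:
        self.tags.append(tag)
        return tag

    def recent(self, n: int = 5) -> List[ARTag]:
        # most recently tagged objects first
        return sorted(self.tags, key=lambda t: t.created_at, reverse=True)[:n]

    def reactivate(self, tag: ARTag) -> None:
        tag.active = True
```

The catalog view in the app would simply render `recent()`, while editing an older tag and calling `reactivate()` covers the “edit and activate older tags” interaction described above.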

MOBILE APPLICATION FLOW

We also created an interactive prototype in InVision to test interaction flows and demonstrate the app’s intended use.

Component 2 [Interactions with AR Content through Physical Objects]: Once an object has been tagged, it emits an aura in AR, signaling to users that embedded information is present in that object. The aura’s glow is a subtle, peripheral signifier that conveys the first level of information in our communication hierarchy: that there is an active tag nearby. The viewer can use the intuitive affordances of touch to reveal the full hologram in AR, showing data relevant to their family and home or messages that family members leave for each other. Objects with active tags can also be combined to reveal new, pertinent information. For example, if one object has been assigned the sleep quality of a loved one and another has been assigned to monitor the room’s temperature, a user can bring the two objects together to combine the AR and reveal the loved one’s body temperature. After the AR has been viewed, the hologram either times out, as specified by the user who set up the tag, or can be double-tapped to turn it off.

1. Tap to activate when aura is present, 2. Holograms reveal separate information, 3. Combined holograms reveal another level of information
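The combination behavior in step 3 above could be expressed as a simple lookup keyed on an unordered pair of tag categories. This is a sketch under assumed names; the category strings, the `COMBINATIONS` table, and the `combine` function are all hypothetical, not part of arrange itself.

```python
from typing import Optional

# Hypothetical rule table: an unordered pair of tag categories maps to the
# combined hologram that bringing the two objects together would reveal.
COMBINATIONS = {
    frozenset({"sleep_quality", "room_temperature"}): "body_temperature",
}

def combine(category_a: str, active_a: bool,
            category_b: str, active_b: bool) -> Optional[str]:
    """Return the combined hologram for two tags, or None when either
    tag is inactive or no combination rule exists for the pair."""
    if not (active_a and active_b):
        return None
    return COMBINATIONS.get(frozenset({category_a, category_b}))
```

Using a `frozenset` key makes the rule order-independent: it does not matter which of the two tagged objects the user picks up first.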
