Context-aware designs

Monthly Designs · Dec 29, 2015

There has never been a more appropriate time for context-aware designs. Computers can do more today than they ever could, thanks to a variety of sensors and advancements in artificial intelligence. Armed with a thin slab of glass and a decent internet connection, one can access any information, anywhere, at any time. This is a great privilege to have as a smartphone user. But what does it really mean to use a smartphone in 2016? We are living in an information age, a time when data from different sources can drive experiences, alleviate pain points, and even act as a foundation for new technology. Smartphones should not just make access to such information easy; they should bring information to us when we need it, based on time, location, mood, user behavior, actions, and more. What if we could provide tiny bits of glanceable information that are useful to the average smartphone user?

The current landscape

Android has Google Now, iOS has Siri, and Microsoft has Cortana. The digital space is moving quickly towards context-aware systems: Google Now tells you about your next bus home while you are sitting at Starbucks lost in your work, and iOS offers quick access to Spotify on the lock-screen as you plug in your earphones. But these digital assistants still exist as features and require deliberate, detailed user interaction. How can we do more by creating systems that actively serve the user with useful information in a concise, glanceable fashion? Below, we explore a few scenarios and propose ideas that leverage current technology to do just that.

You are using Google Maps and driving 23 miles straight on a highway

You are driving to a different city and, quite unsure about the route, you decide to use the map on your smartphone. It provides directions and instructions through voice to avoid the need to look at the screen. You hit the interstate and have to drive 23 miles straight. No diversions, no turns, no exits; just a straight road ahead for the next 23 miles. The current maps application would keep the screen turned on, showing your way on the map. But are we really looking at the map when we are driving a long, straight stretch?

The maps application senses that the driver will be going straight for the next 23 miles and turns off the screen, saving battery life.
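As a rough illustration of the logic involved, here is a minimal Kotlin sketch. The `NavigationState`, `ScreenController`, and the 5-mile threshold are all hypothetical stand-ins, not real Google Maps or Android APIs; the point is simply that a single rule over the distance to the next maneuver is enough to decide when the screen can rest.

```kotlin
// Illustrative sketch only: the interfaces below are hypothetical stand-ins,
// not real Android or Google Maps APIs.

// Distance (in miles) to the next maneuver, as reported by the routing engine.
data class NavigationState(val milesToNextManeuver: Double, val isNavigating: Boolean)

// Hypothetical screen controller abstraction.
interface ScreenController {
    fun dim()
    fun wake()
}

class LongStretchScreenSaver(
    private val screen: ScreenController,
    private val thresholdMiles: Double = 5.0 // assumed: "long stretch" means > 5 miles without a maneuver
) {
    fun onNavigationUpdate(state: NavigationState) {
        if (!state.isNavigating) return
        if (state.milesToNextManeuver > thresholdMiles) {
            // No turns or exits coming up soon: dim the screen to save battery.
            screen.dim()
        } else {
            // A maneuver is approaching: bring the map back.
            screen.wake()
        }
    }
}

fun main() {
    val screen = object : ScreenController {
        override fun dim() = println("Screen dimmed")
        override fun wake() = println("Screen awake")
    }
    val saver = LongStretchScreenSaver(screen)
    saver.onNavigationUpdate(NavigationState(milesToNextManeuver = 23.0, isNavigating = true)) // dims
    saver.onNavigationUpdate(NavigationState(milesToNextManeuver = 0.8, isNavigating = true))  // wakes
}
```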

You just finished your workout and want to check out all the awesome stats

It is not uncommon to check how many calories were burnt, how many steps were taken, and so on after a workout session. Why not cater to this behavior and serve all the important stats on the lock-screen after the workout? Imagine getting a quick summary of your workout stats on the lock-screen right after you step off that elliptical.

The operating system can infer that you are at the gym based on location and data from an application such as Fitbit, Google Fit, or Apple Watch; the stats are then displayed on the lock-screen based on the user's preference.
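To make the idea concrete, here is a small Kotlin sketch of the decision. `WorkoutSession`, the gym-location flag, and the 10-minute window are assumptions for illustration; none of this reflects the actual Fitbit, Google Fit, or Apple Watch APIs.

```kotlin
// Illustrative sketch only: WorkoutSession, the location check and the
// lock-screen card are hypothetical, not real fitness-tracker APIs.

import java.time.Duration
import java.time.Instant

data class WorkoutSession(val endedAt: Instant, val calories: Int, val steps: Int)

// Assumed rule: if the user is still at the gym and a workout ended within
// the last few minutes, surface a summary card on the lock-screen.
fun lockScreenSummary(
    session: WorkoutSession?,
    isAtGym: Boolean,
    now: Instant = Instant.now()
): String? {
    if (session == null || !isAtGym) return null
    val minutesSinceEnd = Duration.between(session.endedAt, now).toMinutes()
    if (minutesSinceEnd > 10) return null // too old to count as "right after the workout"
    return "Nice work! ${session.calories} kcal burnt, ${session.steps} steps."
}

fun main() {
    val session = WorkoutSession(Instant.now().minusSeconds(120), calories = 320, steps = 4100)
    println(lockScreenSummary(session, isAtGym = true))  // summary card shown
    println(lockScreenSummary(session, isAtGym = false)) // null: nothing shown
}
```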

You want to make sure that you carry your umbrella with you on a rainy day

Often we step out of the house while it is sunny and then find ourselves caught in the rain on our way back home. Or think about the time you step out of the house, realise that it is drizzling, and go back in to get the umbrella. We do check the weather app every now and then, apart from looking at the summarised weather notifications in the notification center. But considering the scenario above, what if our smartphone subtly reminded us to carry the umbrella just before we left the house? This can be done by sensing that the user is up and moving, while also taking the day's weather forecast into account.

From left to right: the user is sleeping, the user wakes up and sees a contextual notification reminding him to take the umbrella, the user takes the umbrella on the way out.
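A minimal sketch of the reminder rule might look like the Kotlin below. The wake-up signal, the rain-probability input, and the 50% threshold are assumed placeholders for whatever the OS and a weather service would actually provide.

```kotlin
// Illustrative sketch only: the motion and weather inputs are hypothetical
// placeholders, not real OS or weather-service APIs.

import java.time.LocalTime

data class MorningContext(
    val userJustWokeUp: Boolean,      // e.g. first unlock after a long still period
    val rainProbabilityToday: Double, // 0.0 .. 1.0 from the day's forecast
    val currentTime: LocalTime
)

// Assumed rule: remind about the umbrella only early in the day, when the
// user is up and moving and rain is reasonably likely.
fun umbrellaReminder(ctx: MorningContext, rainThreshold: Double = 0.5): String? {
    val isMorning = ctx.currentTime.isBefore(LocalTime.NOON)
    return if (ctx.userJustWokeUp && isMorning && ctx.rainProbabilityToday >= rainThreshold) {
        "Rain is likely today. Don't forget your umbrella!"
    } else {
        null
    }
}

fun main() {
    val ctx = MorningContext(userJustWokeUp = true, rainProbabilityToday = 0.7, currentTime = LocalTime.of(7, 30))
    println(umbrellaReminder(ctx)) // prints the reminder
}
```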

You are chatting away with your friend and need to note down that important meeting time you decided on

With instant messaging being the go-to way to communicate these days, people have a lot of important conversations inside these apps. One common scenario is deciding on a meeting time in the conversation. What if our smartphone acted like a smart assistant and allowed us to add the meeting time to our calendar in one tap?

The user's phone can read the message and smartly flag existing conflicts when an event is being planned. When the user responds with an affirmation, the system provides a nifty shortcut to create the event with just one tap.
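Here is a toy Kotlin sketch of the detection step, assuming a naive regular expression for times like "at 3:30 pm" and a small list of affirmative replies; a real assistant would use far richer language understanding and would also check the calendar for conflicts.

```kotlin
// Illustrative sketch only: the pattern matching below is a toy heuristic,
// not how any real messaging app or assistant parses conversations.

// Looks for a simple "at 3:30 pm" style time in a message and, if the reply
// sounds like an affirmation, offers a one-tap "add to calendar" shortcut.
private val timePattern = Regex("""\bat\s+(\d{1,2}(:\d{2})?\s*(am|pm))""", RegexOption.IGNORE_CASE)
private val affirmations = setOf("yes", "sure", "sounds good", "ok", "okay", "works for me")

fun proposedMeetingTime(message: String): String? =
    timePattern.find(message)?.groupValues?.get(1)

fun shouldOfferCalendarShortcut(message: String, reply: String): Boolean =
    proposedMeetingTime(message) != null &&
        affirmations.any { reply.trim().lowercase().startsWith(it) }

fun main() {
    val message = "Shall we meet at 3:30 pm tomorrow?"
    val reply = "Sounds good, see you then!"
    if (shouldOfferCalendarShortcut(message, reply)) {
        println("Add \"Meeting at ${proposedMeetingTime(message)}\" to calendar? [Tap to add]")
    }
}
```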

Monthly Designs is a design blog written and curated by Adhithya & Shankar, HCI/d graduate students at Indiana University. Follow Monthly Designs on Twitter to stay updated on future posts.

If you like this post, don’t forget to hit the ‘Recommend’ button. Thanks!
