Augmented Assisted Living or “How we create a future vision for the elderly”

Susann Maßlau
Published in Ergosign · 6 min read · Oct 21, 2016

At Ergosign Labs we are currently working on a concept for an assistance system for the elderly that combines Augmented Reality smart glasses with a smart bracelet. A prototype that demonstrates a scenario for an interactive drink reminder is already running.

The future of our society

Ergosign Labs focuses on UX research for future technologies. The goal of this study is to design a system that reminds people to drink water, using two wearable devices. Based on the insights and learnings, we want to define general design guidelines for Augmented Reality projects.

Our society is aging and shrinking, and therefore the demand for assistive technology in elderly care will increase in the near future. There are already many different technological concepts and devices on the market for this target group (Ambient Assisted Living), but most of them still face problems such as a lack of standards and low social acceptance. Wearables like smart watches and smart glasses still struggle with technical constraints (such as battery life and overheating) as well as inconvenient interaction and poor user experience.

So our task as UX designers is to come up with innovative solutions that address users’ needs while taking all aspects of new technologies and devices into account.

Our vision is deliberately futuristic and does not consider all present technological constraints. But you can imagine that AR glasses, which are still rather chunky at the moment, could soon look like normal glasses or even become almost invisible in the form of contact lenses.

The combination of Augmented Reality glasses and a smart bracelet is what puts Ambient Assisted Living on the next level — that is why we call it “Augmented Assisted Living”.

Controlling AR via a smart bracelet

With the help of a smart bracelet and AR glasses combined, we want to create an assistance system for the elderly that supports their daily lives in a subtle and meaningful way.

This target group has special needs concerning the functionality and user experience of technological devices. Declining fine motor skills and cognitive abilities are restrictions that designers have to take into consideration. But of course, these restrictions can vary widely, and we cannot think of “the elderly” as a homogeneous group.

This is why we did some user research and created two personas: an 89-year-old woman with moderate physical restrictions and a 65-year-old man in good shape.

In this early stage of the project, we are focusing on a scenario in which the two wearables work together as a reminder for regular fluid intake by elderly persons. The reasons for insufficient hydration are complex, and we are certainly not able to tackle all of them. But in those cases where subtle reminders and digital rewards can do the job, our future vision comes in. There are of course plenty of other conceivable scenarios, such as assistance with navigation, orientation, and communication, or further reminders for medication intake or appointments.

Further use case examples where both devices work together: warnings, orientation and navigation, individual (projected) UIs, and digital rewards

Augmented Reality should be meaningful, not distracting

Augmented Reality can be a great tool to enrich physical objects in everyday situations by displaying subtle visual overlays. Furthermore, it offers the opportunity to compensate for decreasing eyesight and forgetfulness in the target group. But care has to be taken with non-contextual information – e.g. random reminders displayed in your view – since this can quickly become annoying. Furthermore, touch interaction or gestures in a projected, three-dimensional user interface can feel unnatural because there is no haptic feedback.

In our project, smart glasses are used to highlight real-world objects in a meaningful, non-distracting way. We believe that displaying text in notifications should be avoided wherever possible and that overlays should never be too striking. The additional information should feel natural and must not be perceived as a disturbing layer between the user and the real world.

A smart bracelet for discreet interactions

This is where the smart bracelet comes into play. Haptic feedback? No problem. Non-contextual reminders? It is less distracting to be notified via a vibration alert on your wrist than by an unsolicited overlay in your view. And it is much more unobtrusive and less stigmatizing to control your AR interface by touching your bracelet than by gesticulating in the air.

In our concept, the bracelet displays more detailed information: notifications worded as motivating prompts, hydration status information, and digital rewards once the intended behavior (fluid intake) has been successfully accomplished.

The bracelet serves both to control its own UI and to interact with the AR overlays or dialogs in an unobtrusive way.

Additional Devices

The concept can be extended with an external app for larger screens, used for advanced configuration and data analysis, and aimed at additional users such as doctors or family members.

Example Scenario

In our example scenario, the system detects a lack of fluid intake and sends subtle alerts that gradually become more intense. If a vessel is within the user’s view, it is highlighted to attract attention and a subtle status indicator is shown nearby. At the same time, this information is also displayed on the bracelet.

If there is no glass or water bottle in sight, the user receives only a vibration alert on the bracelet, which also shows the notification message and status information.
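
To make this decision logic a bit more tangible, here is a minimal sketch in Swift. All type and function names (ReminderRouter, ARGlasses, SmartBracelet, and so on) are hypothetical and not taken from our prototype; the sketch only illustrates the routing between the AR highlight and the bracelet described above.

```swift
import Foundation

// How urgent the reminder currently is; it escalates while no fluid intake is tracked.
enum ReminderLevel: Int {
    case subtle = 0, noticeable, insistent

    var escalated: ReminderLevel {
        ReminderLevel(rawValue: rawValue + 1) ?? .insistent
    }
}

// Hypothetical interfaces for the two wearables.
protocol ARGlasses {
    // True if a glass or bottle is currently recognized in the field of view.
    func vesselInView() -> Bool
    func highlightVessel(level: ReminderLevel)
    func showStatusNearVessel(hydration: Double)
}

protocol SmartBracelet {
    func vibrate(level: ReminderLevel)
    func showNotification(message: String, hydration: Double)
}

struct ReminderRouter {
    let glasses: ARGlasses
    let bracelet: SmartBracelet

    // Called periodically while the tracked fluid intake stays too low.
    func remind(hydration: Double, level: ReminderLevel) {
        if glasses.vesselInView() {
            // A vessel is visible: draw attention to the real object
            // instead of pushing a textual notification into the view.
            glasses.highlightVessel(level: level)
            glasses.showStatusNearVessel(hydration: hydration)
        } else {
            // Nothing to highlight: fall back to a haptic alert on the wrist.
            bracelet.vibrate(level: level)
        }
        // The bracelet always mirrors the prompt and the current status.
        bracelet.showNotification(message: "Time for a glass of water?",
                                  hydration: hydration)
    }
}
```

The key idea is that the glasses only ever highlight a real object, while textual prompts and status details stay on the bracelet.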

Prototyping with Samsung Gear VR and the Apple Watch

As a proof of concept, Ergosign Labs developed a prototype that simulates the communication between the two devices as well as the notification system, and makes the user interactions tangible. The use case may look a bit rough here, because our current focus is not on tracking fluid intake but on testing the user experience.

We used a Samsung Gear VR as a stand-in for the smart glasses (running the Vuforia SDK) and an Apple Watch as a substitute for the smart bracelet. An additional iPhone app allows further configuration. The communication between the two devices is handled via a web service.
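
As a rough idea of what the communication via the web service could look like, here is a hedged sketch of the watch side fetching the hydration status and reporting a confirmed drink. The endpoint URL, the JSON fields and the type names are assumptions for illustration only, not the actual prototype API.

```swift
import Foundation

// Hypothetical payload exchanged via the web service; field names are assumptions.
struct HydrationStatus: Codable {
    let hydrationLevel: Double   // 0.0 ... 1.0
    let message: String          // motivating prompt shown on the bracelet
}

final class HydrationService {
    // Placeholder endpoint for illustration; not the prototype's real URL.
    private let baseURL = URL(string: "https://example.com/hydration")!
    private let session = URLSession.shared

    // Fetch the current status to update the watch UI.
    func fetchStatus(completion: @escaping (HydrationStatus?) -> Void) {
        let url = baseURL.appendingPathComponent("status")
        session.dataTask(with: url) { data, _, _ in
            let status = data.flatMap { try? JSONDecoder().decode(HydrationStatus.self, from: $0) }
            completion(status)
        }.resume()
    }

    // Report that the user has confirmed drinking, so the glasses can dismiss the overlay.
    func reportDrink() {
        var request = URLRequest(url: baseURL.appendingPathComponent("drink"))
        request.httpMethod = "POST"
        session.dataTask(with: request).resume()
    }
}
```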

What's Next?

We are currently refining the concept and prototype, with a focus on the user experience. We are also investigating new devices like the HoloLens, which will probably be more suitable for our use case. After consolidation, we will conduct usability tests to validate the concept.

Takeaways

In an aging society, where more and more people at an advanced age want or have to live in their own homes with little or even no human support, assistive devices can fill this gap at least partially.

Technology has the power to help people maintain their autonomy and to compensate for mild physical and cognitive restrictions, as well as for a future shortage of geriatric nurses.

But we can no longer look at devices with blinders on, regarding them as standalone tools that run separately from each other. Talking about the Internet of Things means combining devices and interpreting data from various sources. Every technology has its specific strengths and weaknesses. By keeping the big picture in mind, we can compensate for the weak points that inevitably come with any single tool and benefit from synergy effects in a useful and innovative design approach.

Wearables are changing the way we use technology, and Augmented Reality in particular will be a real game changer, since it offers a wide range of opportunities to assist users in their daily lives by providing meaningful, context-sensitive information. This is why it is a very promising technology for creating visions of the future.
