Accessible UWP apps — Chapter 1: Gaze control

Niels Laute
6 min read · Feb 18, 2019

--

This is the first part of a series of articles that can help you create more inclusive UWP apps, and make you look cool while doing it!

#InclusiveDesign in Windows 10

Inclusive Design Toolkit by Microsoft Design

With version 10, Windows is more accessible than ever. Narrator, high contrast, color filters and Cortana are just a few examples of how the Windows platform empowers a wide range of users with unique capabilities and conditions to achieve more. This inclusiveness is not only great for people with certain conditions, but also for the ever-changing context of your user. With new interaction paradigms, the platform and its apps can adapt to the user’s situation — and not the other way around.

I personally never put a lot of effort into making my apps more accessible, until I got an email from a user:

I’m a disabled person and this app is one of the best apps I own, now I can turn on and off my lights with my voice and don’t need assistance.

Thank you, Niels :)

Joe

And then it hit me: because of the Cortana voice commands in my app (Huetro, a smart lighting app) I’m having an impact on someone’s life — and that’s pretty awesome!

Keyboard, mouse, speech, pen, dial, touch, gaze, gamepad, adaptive controller — all supported out of the box with UWP!

Because of the many input methods Windows 10 supports out of the box, I thought it might be a good idea to explain how you can easily add these technologies (or a combination of them) to your app.
To illustrate the different implementations, we’ll be creating an app called HomeHub — it allows you to control your home and get in contact with people (you’ll find the source code at the bottom of each article).

Chapter 1: Gaze control

In this first chapter we will explore gaze control. Eye Control and the gaze tracking APIs were added in the Windows 10 April 2018 Update and allow developers to use eye tracking hardware (like the excellent Tobii Eye Tracker 4C) as an interaction method in their UWP apps:

“Gaze input is a powerful way to interact and use Windows and UWP applications that is especially useful as an assistive technology for users with neuro-muscular diseases (such as ALS) and other disabilities involving impaired muscle or nerve functions.

In addition, gaze input offers equally compelling opportunities for both gaming (including target acquisition and tracking) and traditional productivity applications, kiosks, and other interactive scenarios where traditional input devices (keyboard, mouse, touch) are not available, or where it might be useful/helpful to free up the user’s hands for other tasks (such as holding shopping bags).” (Source)

In this example I’m using a Tobii 4C that connects over USB. Install the Tobii software and run the calibration so the tracker is optimized for your eyes.

Tobii 4C

Gaze principles

Gaze interaction has some specific principles. Time is the key ingredient: the system detects when the user’s gaze ‘enters’ a control, and by setting the dwell time you decide how long the user should look at the control before it gets activated (the “click” event).

Source: Gaze docs
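The dwell timing can also be tuned per control. A minimal sketch, assuming the Windows Community Toolkit’s GazeInput attached properties (ThresholdDuration and DwellDuration — check the exact names against your toolkit version):

```xml
<!-- gaze: maps to using:Microsoft.Toolkit.Uwp.Input.GazeInteraction -->
<Button Content="Lights on"
        gaze:GazeInput.ThresholdDuration="0:0:0.15"
        gaze:GazeInput.DwellDuration="0:0:0.4" />
```

A shorter dwell feels snappier but makes accidental activations more likely, so it’s worth testing values with real users.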

Code

Let’s get started. Create a new UWP project (minimum SDK: Windows 10 April 2018 Update, version 1803, build 17134) and make sure to add the gazeInput device capability in the package manifest.
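The capability goes into Package.appxmanifest; a minimal sketch (the rest of the manifest is omitted):

```xml
<!-- Package.appxmanifest -->
<Capabilities>
  <DeviceCapability Name="gazeInput" />
</Capabilities>
```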

First, we’ll enable gaze interaction across the entire XAML page. We can do this by using the Windows Community Toolkit’s gaze extension:
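With the Microsoft.Toolkit.Uwp.Input.GazeInteraction NuGet package referenced, it looks roughly like this (the page class name is just an example):

```xml
<Page
    x:Class="HomeHub.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:gaze="using:Microsoft.Toolkit.Uwp.Input.GazeInteraction"
    gaze:GazeInput.Interaction="Enabled">
    <!-- Every control on this page now responds to gaze -->
</Page>
```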

This one line of code enables gaze interaction across the entire XAML page and includes all controls: a Button will be clicked when your gaze dwells on it, a ScrollViewer will scroll down if you look at the bottom, and a Pivot control will switch tabs based on the tab you’re looking at. Pretty awesome that you get all of this out of the box. You can also enable this on individual controls instead.

So, let’s add a new DetailsPage, and on the MainPage a GridView with a couple of rooms. We can simply leverage the ItemClick event of the GridView to navigate to the DetailsPage, which is pretty convenient — no code adaptation is needed to support gaze!
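A sketch of that wiring, assuming an items collection called Rooms and a DetailsPage (both names are just this example app’s):

```xml
<GridView ItemsSource="{x:Bind Rooms}"
          IsItemClickEnabled="True"
          ItemClick="RoomsGridView_ItemClick" />
```

```csharp
private void RoomsGridView_ItemClick(object sender, ItemClickEventArgs e)
{
    // Gaze dwell raises ItemClick exactly like mouse, touch or keyboard would
    Frame.Navigate(typeof(DetailsPage), e.ClickedItem);
}
```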

Changing the gaze feedback and cursor

To give feedback to the user, the default gaze visualization consists of a red and a green rectangle:

You can override this by adding a Loaded event handler to the page, where we can change some GazeInput properties to adjust the default visualization. Let’s start by making the feedback stroke a bit thicker and overriding the default colors:
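A sketch of that handler, assuming the toolkit’s static GazeInput feedback properties (names may differ slightly between toolkit versions):

```csharp
using Microsoft.Toolkit.Uwp.Input.GazeInteraction;
using Windows.UI;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Media;

private void Page_Loaded(object sender, RoutedEventArgs e)
{
    // Thicker feedback rectangle
    GazeInput.DwellStrokeThickness = 5;

    // Dark gray while dwelling, white once the dwell completes
    GazeInput.DwellFeedbackEnterBrush = new SolidColorBrush(Colors.DarkGray);
    GazeInput.DwellFeedbackProgressBrush = new SolidColorBrush(Colors.DarkGray);
    GazeInput.DwellFeedbackCompleteBrush = new SolidColorBrush(Colors.White);
}
```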

What happens now is that once the user’s gaze hits the control, the border color will be dark gray, ending up white once the dwell time has passed (the “click” event). This results in the following:

We can also adapt the cursor radius by calling GazeInput.SetCursorRadius(this, 32).

Taking it a step further..

Some eye trackers, like the one I’m using, can measure even more than just gaze position — head position and rotation are also measured. Unfortunately the GazeInput APIs do not support this yet; however, it is possible to get the distance between your eyes and the tracker.
Having the distance between my eyes and the screen is interesting. Wouldn’t it be cool to adapt the interface based on how close the user is to the screen? Let’s try that.

So what happens is that with the GazeHidPositionsParser API we’re able to get back the X, Y and Z positions of each eye. We then use that information to check whether the user is near the screen (< 50mm) or further away (> 50mm). Based on that, we’ll change the ItemTemplate to make the tiles smaller or larger.
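A sketch of how that could look, assuming the toolkit’s GazeHidPositionsParser and the Windows.Devices.Input.Preview gaze events; NearThreshold, SmallTileTemplate and LargeTileTemplate are hypothetical names for this example:

```csharp
using Windows.Devices.Input.Preview;
using Microsoft.Toolkit.Uwp.Input.GazeInteraction.GazeHidParsers;

private GazeHidPositionsParser _parser;
private const long NearThreshold = 50; // hypothetical threshold, in the tracker's units

private void OnGazeMoved(GazeInputSourcePreview sender, GazeMovedPreviewEventArgs args)
{
    var point = args.CurrentPoint;
    if (point.HidInputReport == null) return;

    _parser = _parser ?? new GazeHidPositionsParser(point.SourceDevice);
    var positions = _parser.GetGazeHidPositions(point.HidInputReport);
    var leftEye = positions?.LeftEyePosition;
    if (leftEye == null) return;

    // Z is the distance from the tracker to the eye:
    // small tiles up close, large tiles further away
    RoomsGridView.ItemTemplate =
        leftEye.Z < NearThreshold ? SmallTileTemplate : LargeTileTemplate;
}
```

In practice you’d want to smooth the Z value over a few samples before switching templates, so the layout doesn’t flicker when the user hovers around the threshold.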

The result:

Final thoughts

In this article we explored gaze control as a way to enhance your app’s accessibility. However, there are many other interesting, almost sci-fi use cases to explore where gaze control can serve as a primary or secondary interaction method...

and with Windows 10, it’s as easy as adding one line of code :).

Source code on GitHub


Niels Laute

UX designer by day, Windows developer by night. Talks about Fluent, XAML and UWP a lot.