Working with Gestures

Designers who took part in this study: Russell Jamison, Landon Call, Colten Whatcott, Krista Bice, and Neal Wandja

This case study is a project that I and several other designers worked on to discover needed or hidden gestures, and then find ways to better utilize them to help create a simpler, "hands-off" user experience.


Waze is a community-driven mapping app with over 100 million users. Because of this community involvement, Waze is able to maintain the most current traffic conditions, locations of hazards and police officers, as well as the quickest routes for your drive. In fact, Waze only recently launched a facet of its app specifically for motorcycles, dedicated to giving motorcyclists the best and quickest routes (which sometimes vary from those of cars). "Waze is all about contributing to the 'common good' out there on the road. By connecting drivers to one another, we help people create local driving communities that work together to improve the quality of everyone's daily driving. That might mean helping them avoid the frustration of sitting in traffic, cluing them in to a police trap or shaving five minutes off of their regular commute by showing them new routes they never even knew about." "Waze." Free Community-Based GPS, Maps & Traffic Navigation App.

Environment: Waze has three main environments where it is most often used: in the car navigating to a location, on the bus or train navigating the rail system, and on foot navigating around the city. However, the app also supports some user stories outside these three, such as a driver wishing to be warned of impending hazards or hidden police.

Gestures: Gesture controls, while very intuitive and effective when one is not in motion, can become cumbersome and difficult when one is trying to navigate and drive. While Waze does provide voice control, gestures are still required for tasks such as starting a route or reporting police and hazards. In recent years, Waze has studied and tried to implement gesture controls that allow for a less distracted driving experience, and in forums one can find users raising these issues and seeking answers.

Target Audience: The target audience we anticipate is young, college-age users (18–30). This is due to several factors; chief among them, college-age users are more likely to seek out a mapping app that is not native to their phone.


Our research focuses specifically on two areas. First, are the current app's gesture controls driver-friendly — in other words, are they optimized for safe driving? Second, we would like to see how the gestures differ by mode of transportation: driving, riding, and walking. We chose these areas, first, to see how the app will be used and how it will affect each specific mode of transportation, and second, because we were curious how the UX designers tackled the issue of designing for the specific use case of a driver. Are the gestures safe and easy enough to use without becoming distracting? Do the respective gestures make the driver safer or more dangerous on the road? If not, in what ways could gesture controls be optimized within the developer guidelines?

Journey Map

We began by creating a journey map that an average user would go through, from the time they open the app to the time they reach their destination and close it. We did this to gain perspective on how a user would use the app and which steps and features are involved at each stage.


After analyzing the app itself, we needed to create a persona modeled after an individual who would use this app to get around. Staying within our target audience parameters, you can view our persona, Heidi, to the left. Both the persona and the journey map are key in the EMPATHIZING stage of the design process. This allows us not only to get into the mindset of Heidi but also to understand the context in which the app will be used.

Initial Testing

Usually we would create a prototype and then conduct user testing, but because the app already exists, we decided to perform testing beforehand to see how users interact with it, discover pain points, and get a better idea of how to simplify the app's functionality. BELOW ARE OUR INITIAL USER TESTING RESULTS.

After conducting the first round of testing, we noticed that few, if any, of our users used the voice feature within the app. This was exactly what we were looking to see: whether users would even use the voice feature. After each test we asked the users if they knew the voice feature existed; most stated that they were unaware of it. Below are the steps a user would take in order to "turn on," or activate, the voice feature.

Keeping in mind the problem we are trying to solve (creating a SAFE user experience), we felt that the voice feature needs to be the most important and most strongly suggested way to use the app. Our user tests showed that having the user look back and forth to tap the screen — whether reporting an accident or a police sighting — is unsafe.

Having said that, I chose to create a new process within the app that lets users choose their mode of transportation, which in turn activates and deactivates certain features. Obviously, you can never force a user to use the "driving features" while driving, but offering the option is our effort toward creating a safe, seamless experience. Below is the initial onboarding sequence.

And here are the two screens I added to the onboarding process to promote the voice features — though, again, you can only direct a user so far within the app. (I would love feedback on how to get the voice features used more.)

Added screens to promote voice feature up front.

To reiterate and conclude this case study:

With these screens I hope to create more awareness of the voice features within the app and, essentially, promote them. When testing the reporting aspect of Waze, I noticed that I was unable to make reports by voice — this really was the main issue with the app from the beginning. There is not much to show visually as a prototype for that, but I WOULD DEFINITELY ENHANCE THE ABILITY TO REPORT USING THE VOICE FEATURE.

Above all else, that would be the biggest change to the app. Reports are what take the user's eyes off the road and onto the phone, since making a report requires tapping the screen. We would also suggest that Waze remind users to keep their eyes on the road when alerting them about a reported event.