A Reimagination of Interface Design: Quick, Draw!

Gary Tse
Andrew Gray’s HCI Work
7 min read · Mar 31, 2019

Andrew Gray, Matt Harmon, Nigesh Prajapati, Gary Tse

For those who want access to our git repository, our code for this project can be found here: https://gitlab.bucknell.edu/np021/design-for-expression-team-1

Background

This project was developed under the guidance of Professor Evan Peck in his computer science elective course Human-Computer Interaction held at Bucknell University. The purpose of this project is to create a more fun user experience through Leap Motion.

Leap Motion is a small rectangular USB device with cameras built into it, meant to sit on the desk in front of the user with its cameras facing upward.

Leap Motion is designed to track the user's hand gestures. Within the scope of our project, we hope to remap traditional keyboard/mouse inputs to a 3D user interface using Leap Motion. This change will let users interact with their old games in a novel and, we hope, more enjoyable way.

Quick, Draw!

Learning the Technology

We looked at several ways this technology has previously been used so that we could understand the benefits and limitations it provides. Some of the more interesting projects we looked into used Leap Motion to interact with Google Earth and with an Exoplanet App.

Leap Google Earth

The controls used to interact with Google Earth are wide, open-palm gestures. From the demonstration alone, it looks as if the user's hand is modeling a plane. Pushing the hand forward zooms the map toward the user; left, right, and backward hand movements are mapped by Leap Motion in an equally intuitive manner.

Leap Exoplanet App

The controls used to interact with the Exoplanet App are very similar to those in Leap Google Earth, except that users can also point with their fingers for more fine-tuned interaction with the app: pointing a finger forward tracks and zooms into a specific point.

In addition to the two apps listed above, we also tried a few games that came with the Leap Motion package to get a better understanding of the constraints of the technology. We found that Leap Motion did really well at tracking hand motion and detecting gestures, but was not very accurate at grabbing items. This became our primary concern when picking a game.

Brainstorming

At this step of the process, we brainstormed a list of games that we had previously played and knew were fun. In no particular order, the games we thought would be fun for users to interact with were: Kick the Buddy; Quick, Draw!; Burrito Bison Revenge; and Clicker Heroes.

Kick the Buddy

Kick the Buddy is a carefully calibrated rag-doll physics simulation game. The violent stress-release app lets players assault a rag doll.

Quick, Draw!

The user is prompted to draw a specific object. As the user draws, the computer guesses what is being drawn, much like charades. Google uses the game to collect data for training drawing recognition.

Burrito Bison Revenge

A game similar to Angry Birds: you fling the character across the map, and interactions include moving the character up and down as it travels.

Clicker Heroes

A clicker game where the user earns more and more points by clicking on the screen; these points can be spent on upgrades that help amass even larger numbers of points.

Picking the Game

We decided on Quick, Draw! because of its uniqueness and freedom: users can doodle whatever they like on a blank canvas by gesturing with their hand in the air. We thought that although this game would be a little challenging, it would be the most entertaining of all, which is, after all, the main goal of the project.

Formative Testing

This game has four main screens: the login screen, the question prompt screen, the drawing screen, and the end screen.

Login Screen

The login screen lets the user start the game by clicking the "Let's Draw!" button, which moves you on to the next screen, the question prompt.

Question Prompt Screen

The question prompt tells you what you are supposed to draw on the next screen; you click Next to move on to the drawing screen.

Drawing Screen

The drawing screen lets you draw a picture while a neural network attempts to interpret your drawing. You have 20 seconds, or until the neural network correctly guesses your drawing, before you move on to the next question prompt screen. There are six question prompts before the game ends on the end screen.

End Screen

The end screen lets you look back at your previous drawings, what the neural network guessed each drawing to be, and other people's drawings of the same prompts.

From interacting with the game Quick, Draw! we learned that we had to map the input of the mouse onto new instructions using Leap Motion. The two actions of the mouse that we had to map were:

  1. Dragging the cursor across the screen
  2. Clicking the cursor on specific options

By remapping just these two mouse actions, we can interact with every screen in Quick, Draw!
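The two mappings above can be sketched as a single function that turns a Leap Motion frame into either a cursor move or a click. This is a simplified sketch: the field names mirror the LeapJS frame shape (`hands`, `indexFinger.tipPosition`, `pinchStrength`), but the frames here are plain mock objects rather than live device data, and the 0.8 pinch threshold is our own illustrative choice.

```javascript
// Sketch: translate a (simplified) Leap Motion frame into the two mouse
// actions the game needs. Frames are mocked plain objects, not device data.
function frameToPointerEvent(frame) {
  if (!frame.hands || frame.hands.length === 0) return null; // no hand in view
  const hand = frame.hands[0];
  const [x, y] = hand.indexFinger.tipPosition; // millimetres, device-relative
  // pinchStrength runs from 0 (open hand) to 1 (full pinch);
  // treat a strong pinch as a mouse click, anything else as a drag/move.
  const clicked = hand.pinchStrength > 0.8;
  return { type: clicked ? "click" : "move", x, y };
}

// Example with a mocked frame: a hand dragging the cursor (no pinch).
const dragFrame = {
  hands: [{ indexFinger: { tipPosition: [20, 150, 10] }, pinchStrength: 0.1 }],
};
console.log(frameToPointerEvent(dragFrame)); // { type: "move", x: 20, y: 150 }
```

In the real app this function would run once per frame inside the LeapJS event loop, with its output forwarded to the game's canvas.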

Wizard-of-Oz

Following the recommendation in the project document, we wanted to run a "Wizard-of-Oz" study of our idea with other students. One difficulty we faced was that Quick, Draw! involves tracking a hand in free space: when a participant interacted with the game, it was extremely challenging for the human controller to replicate or mimic those actions. Our team therefore decided that, rather than focusing extensively on Wizard-of-Oz, we would prototype different gestures directly in code to check how well they performed.

Testing Different Gestures

Implementing the first mouse action (dragging the cursor across the screen) was very straightforward: we track the tip of the index finger. The challenging part was identifying the best gesture for clicking. We came up with several candidate gestures and, for each one, tested with users to see how intuitive it was. The article "Introduction to Motion Control" helped us better understand what it means for a design to be intuitive and how to build intuitive interfaces. Based on the article, we looked for three main properties to deem a design intuitive:

  1. Learnable
  2. Understandable
  3. Habitual

We used this definition with our users during our testing as well.

Go here for more information on how we used the interaction box of the Leap Motion.
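The idea behind the interaction box can be sketched as follows. Leap Motion reports fingertip positions in millimetres inside a bounded region; normalizing that region to [0, 1] on each axis and scaling by the canvas size yields pixel coordinates. The box dimensions below are illustrative placeholders, not calibrated values from the SDK (LeapJS exposes the real mapping via `frame.interactionBox.normalizePoint`).

```javascript
// Sketch: map a raw Leap fingertip position (mm) onto canvas pixels.
// The box dimensions are illustrative, not calibrated device values.
const BOX = { centerX: 0, centerY: 200, width: 230, height: 230 }; // mm

function leapToCanvas(tip, canvasWidth, canvasHeight) {
  // Normalize each axis into [0, 1], clamping points outside the box.
  const clamp = (v) => Math.min(1, Math.max(0, v));
  const nx = clamp((tip[0] - BOX.centerX) / BOX.width + 0.5);
  const ny = clamp((tip[1] - BOX.centerY) / BOX.height + 0.5);
  // Leap's y axis points up; the canvas's y axis points down, so flip it.
  return { x: nx * canvasWidth, y: (1 - ny) * canvasHeight };
}

console.log(leapToCanvas([0, 200, 0], 800, 600)); // box centre -> { x: 400, y: 300 }
```

Clamping matters in practice: without it, a hand drifting past the edge of the tracked volume would fling the cursor off-screen.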

After gaining valuable feedback from our users, we found the pinch gesture to be the most effective for clicking, although it did have some problems. Users felt that pinching was more intuitive than the alternatives.

User Feedback

We conducted user testing with three question prompts for the users to answer: "I like," "I wish," and "What if?" "I like" captures a design choice or feature the user enjoyed during testing. "I wish" captures something, possibly infeasible, that the user would like changed, whether in the design approach or the game selection. "What if" captures a feasible change the user believes should be implemented in the given game or controls.

I like, I wish, What if?

The users loved the new ability to draw with their fingers, which felt more physically intuitive than using a mouse. What they did not like was how glitchy the Leap Motion device seemed: it sometimes struggled to keep track of their hand positions, so their drawings would jump around on the screen.
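One mitigation we could apply to the jumpy tracking is smoothing: an exponential moving average blends each new fingertip sample with the previous smoothed value, trading a little lag for a much steadier line. This is a sketch of the general technique, not code from our project; `alpha` is a tuning knob of our own (higher means more responsive, lower means smoother).

```javascript
// Sketch: exponential moving average over 2D fingertip samples.
// Each output point is alpha * newSample + (1 - alpha) * previousOutput.
function makeSmoother(alpha = 0.3) {
  let prev = null; // last smoothed point, or null before the first sample
  return (point) => {
    if (prev === null) {
      prev = { ...point }; // first sample passes through unchanged
    } else {
      prev = {
        x: alpha * point.x + (1 - alpha) * prev.x,
        y: alpha * point.y + (1 - alpha) * prev.y,
      };
    }
    return prev;
  };
}

const smooth = makeSmoother(0.5);
smooth({ x: 0, y: 0 });
console.log(smooth({ x: 10, y: 10 })); // { x: 5, y: 5 }
```

Smoothing would not fix dropped frames where the device loses the hand entirely, but it would damp the high-frequency jitter users complained about.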

Conclusion

We learned that while our approach to the game was intuitive, the technology required to implement our design fell short. While testing our application, we found that Leap Motion was unable to reliably capture the motion of the hand while drawing. The game would lose track of where the finger was in relation to the screen, so drawing became sloppy or incredibly slow. Given better technology for tracking hand gestures, we believe drawing with the hands would be much more fun for users.
