Behavioral Prototype: Angry Birds

Wizard of Oz is a prototyping technique used to present or elicit an experience that is hard to simulate on paper, such as gestural or voice interaction.

For this assignment, I teamed up with three other peers, and we chose to perform a Wizard of Oz experience for a gesture recognition platform, described below.

Gesture recognition platform: a gestural user interface for an Apple TV or similar system that allows interaction through physical motions. An example prototype would be controlling basic video function controls (play, pause, stop, fast forward, rewind, etc.). The gestural UI can be via a 2D (tablet touch) or a 3D (camera sensor, like Kinect) system.

To scope the project efficiently, we settled on its main topic as a team while keeping several questions in mind:

  • How can the user effectively control the interface using hand gestures?
  • What are the most intuitive gestures for this application?
  • What level of accuracy is required in this gesture recognition technology?

We decided to go with playing Angry Birds because one teammate had the app installed on her laptop. The user would test the gestures ‘grab’, ‘pull’, and ‘release’ using the hand sensor prototype we made: a mitten with velcro strips. The strips were sewn onto the mitten to ensure they would not fall off during testing.

Hand Motion Sensor

The wizard sat in another room with a wireless Bluetooth mouse, watching the player and mimicking her actions, adjusting the angle of each shot based on her gestural motions. Meanwhile, the facilitator helped ‘replay’ the game, the scribe took notes, and the documentarian recorded the test with a camera or phone. These were the tools we used, along with a laptop.
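To make the wizard’s job concrete: a real (non-wizard) version of this prototype would need to translate the ‘grab’, ‘pull’, and ‘release’ gestures into the same mouse events the wizard performed by hand. Below is a minimal sketch of that mapping as a small state machine; the class and method names are hypothetical and were not part of our actual study.

```python
# Hypothetical sketch of a gesture-to-mouse mapping for the slingshot.
# In the study, a human wizard performed this translation manually.
from dataclasses import dataclass, field


@dataclass
class SlingshotController:
    """Maps gesture events to abstract mouse commands for Angry Birds."""
    commands: list = field(default_factory=list)
    grabbed: bool = False

    def on_gesture(self, gesture: str, dx: float = 0.0, dy: float = 0.0):
        if gesture == "grab":
            # Closing the hand corresponds to pressing the mouse button.
            self.grabbed = True
            self.commands.append(("mouse_down", 0.0, 0.0))
        elif gesture == "pull" and self.grabbed:
            # Dragging opposite to the launch direction, like pulling
            # back a slingshot; ignored if nothing has been grabbed.
            self.commands.append(("mouse_move", dx, dy))
        elif gesture == "release" and self.grabbed:
            # Opening the hand releases the button and fires the bird.
            self.grabbed = False
            self.commands.append(("mouse_up", 0.0, 0.0))


controller = SlingshotController()
controller.on_gesture("pull", -5, 3)    # ignored: nothing grabbed yet
controller.on_gesture("grab")
controller.on_gesture("pull", -40, 25)  # pull the bird down and back
controller.on_gesture("release")        # fire!
print(controller.commands)
```

In practice the `dx`/`dy` values would come from a camera or touch sensor tracking the mitten, which is exactly the part the wizard faked by watching the participant and moving the mouse.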

Initially, we planned to connect our laptop to the big screens in the study rooms in the common area of one teammate’s apartment. However, after trying three different screens without success, we ended up testing on the laptop itself.

One of the potential testing screens
Testing site

The participant was unaware of the wizard when she walked in and fully believed in the experience, even offering suggestions to improve the system. When we revealed the truth at the end, she was surprised; she had not suspected anything during the session and had simply assumed the sensors were not working very well.

We concluded that the design was fairly effective because the participant successfully launched more than three angry birds, though it was hard for the wizard to match the participant’s actual hand gestures with the mouse without lag. During the in-class critique, everyone responded positively and liked our video. Next time, we would make sure all the devices work well before the actual testing, including connections to screens. Additionally, I would be interested in exploring other platforms for interactive behavioral prototyping, including chatbots or voice interactions.
