PowerPoint: Wizard of Oz Prototype
Purpose
The purpose of this assignment was to practice planning, executing, and filming a behavioral prototype. A behavioral prototype evaluates the interactions between a product and the user, making it possible to test a product and study the user's experience without having a fully developed, functioning product.
I worked with two other students (Karina and Stephen) on this project, and we decided to create a gesture-controlled system for PowerPoint presentations.
Brainstorming
We first thought about what the current PowerPoint presentation experience is like. We identified some key interactions as well as some pain points with the presentation controls we were familiar with. We noticed that controlling a presentation from the computer or from a remote can be limiting when the presenter wants to do more than move to the next or previous slide. Playing and pausing videos, as well as jumping to another slide in a potentially very long presentation, are functions that aren't easily accessible, especially when using a remote control.
We came up with gestures to match the key features we wanted our product to support: swiping left and right felt natural for moving between slides, pointing would work for controlling a video, and swiping up, then left or right, and then clicking made sense for the more complicated task of jumping to a different slide.

When thinking through these gestures in the presentation context, we realized that people often gesture with their hands while presenting, so the system would need to recognize only the motions the presenter intended as controls. Stephen pointed out that he had used a product that uses a camera and white dots to track a golf club. We decided to mock up that setup for our system, since we could convince the user that existing golf technology already uses this approach.

We still needed to make sure the white dot was only recognized when the presenter wanted it to be, so we modified the gestures: before each control gesture, the presenter would point at the camera to activate the tracking, and then proceed with the control they wanted. We thought this was the best solution because it would let the user walk around a stage while presenting rather than being limited to, say, a gesture-sensing pad they had to swipe over. It also seemed better than a sensor worn on the hand because, again, some people present with their hands, so gestures might be accidentally recognized, and a wearable might feel less natural to interact with.
Planning
The next step was planning how we would set up the Wizard of Oz prototype. We realized that a standard computer shows a hovering mouse cursor when clicking around to control the slides, which would ruin the believability of the product and experience. We therefore decided to rent a touchscreen computer, since clicks could then be made without displaying a cursor on the screen.

We wanted the user to practice giving a presentation so that we would get relevant feedback about the prototype being used in the presentation context. However, we got feedback that the presenter might be too nervous to present something in front of us, which could hinder the results and feedback we received about the product. Stephen also mentioned that the user he recruited was a fairly timid and shy friend who might have a hard time presenting to a group of people he didn't know. We therefore wanted the user to be able to see the slides and practice the gestures without having to formally present.

We couldn't figure out how to get the slides on two computers in addition to the main presentation screen: in every setup scenario we came up with, either the mouse cursor would show when controlling the slides, or a FaceTime/Skype setup wouldn't have let the user control the slides on the computer, ruining the believability of the experience. Given those constraints and challenges, we decided it was more important for the user to believe in the product and experience than to test it as if they were presenting, so the user would face the screen while using the gestures and the wizard would sit behind them with a remote control and the touchscreen computer.
The Setup
We reserved a conference room with a large TV display that we could connect the computer to. The screen was at the front of the room and our wizard (Stephen) sat behind the user at the table so that the user wouldn’t be able to see him controlling the presentation. We had the user place a white sticker on their pointing finger and we used a GoPro as the camera recognizing the gestures. Below is a picture that captures the setup and the key components are labelled.
We tested the controls by having Karina pretend to be the user so that Stephen could practice being the wizard and to ensure we could capture everything and know what to say in various scenarios.
Testing
Stephen walked his friend into the room. When he entered, Stephen and Karina explained what we wanted him to test and what the test would involve. Stephen described the golfing technology he was familiar with and how we had a similar setup. We told the user that we had built software that allowed the GoPro to capture the gestures and translate them into PowerPoint controls using some of the same methodology as the golfing technology. Karina briefly explained the gestures and acted out the motions so that the user would have a visual understanding before trying them out on his own. Once we felt he was prepared, we pulled up the slides and let him practice using whatever gestures he wanted. Giving him the freedom to control the slides however and whenever he wanted helped him believe the system really worked. I took notes throughout the user test to keep track of key interactions, comments, and observations. Below is a video that shows the testing experience as well as some of the feedback we got when asking follow-up questions.
Key Findings
We found that the user picked up the gestures quickly and had a very easy time remembering which gestures to use. He seemed impressed by the system's responsiveness when moving between slides. However, we had trouble executing the video play/pause feature because the touchscreen would sometimes interpret Stephen's tap as a play/pause control and sometimes as a move-to-the-next-slide control. The user thought it was an error on his part; he explained that maybe his gesture wasn't precise enough or was registering as a swipe instead of a click. We explained that we thought it was an error with PowerPoint and that there were still a couple of bugs in our software, which was why we were testing it. The video did end up working, but not as consistently as the other features. Despite these difficulties with video playback, he was able to successfully complete all of the actions that the gestures corresponded to.
Our user said that he purposefully exaggerated the gestures, and that this was the main reason they didn't feel natural. However, a classmate noted that he may have felt he had to exaggerate because the screen in front of him was so large, making him feel like the swiping motion had to span the whole screen. She suggested it could be interesting to test with a smaller screen in front of the user to see whether that reduced the exaggeration. The user mentioned that he thought the technology was cool and that he would use it, though he doesn't give many presentations and so doesn't have much presentation experience.
Reflection
I think we got good experience testing a behavioral prototype because we realized how difficult the setup is. There were a lot of small details, and the available resources and time were difficult constraints, since we had to include all of the features while still making the setup and experience believable. I think we focused more on the product's believability than on the context and environment of the test. The test felt more like a test of the technology than of the experience, so it would be interesting to run another test of the product in an actual presentation environment. I also think we could have used more camera angles when filming so that the video captured what was happening on the screen as the user performed the various gestures. This would be easy to fix in a presentation-environment setup: the user would act as if they were facing an audience, so the camera could capture both them and the screen behind them at the same time. That would require the participant to create their own slides, or some way for them to view the slides in front of them while presenting without being able to see the mouse or us controlling the slides.