Stage 5: Testing Prototypes with Students

Hannah Rosenfeld
Published in TheRealWorld · 6 min read · May 7, 2017

With some more refined prototypes developed, we returned to Rosedale for a final round of user testing with their students. In this round of evaluative research, we built a "paper" prototype of our AR UI (using an acrylic sheet and transparent UI elements) and created a 3D animation of our AR UI to play in the Hololens. While the animation was not interactive, it at least gave students an idea of what the UI might look like and a basic understanding of gaze and gesture in the Hololens.

“Paper” Prototyping for AR

Paper prototyping has been an incredibly helpful tool for us throughout this project, and our AR "paper" prototyping session was particularly valuable: it approximated the experience of seeing in AR in an accessible, low-fi way, and it stimulated fruitful conversation around missing elements and opportunities for future work.

Replicating Gaze

To replicate the experience of gaze, we mounted a laser pointer onto basic safety glasses and asked students to use it to select objects within our UI. This, in conjunction with the Hololens' "learn gestures" tutorial, gave the students a sound understanding of what it means to use gaze as a selection tool (something that is difficult to convey without prior experience or an effective proxy).

Key Takeaways

This exercise gave us insight into both the content and the UI of our system. Key takeaways from the testing are outlined below.

General

  • While the instructors we interviewed were excited about the educational potential of our system, students were looking toward the future and considering how Otto might support them in their professional careers. For us, this really hammered home the need to introduce technology early in a person's career, ideally during their education, and to enable the tool to grow and develop with them throughout their professional life.

Content

  • While the students liked the list of resources we'd included, they also wanted a place to access specifications for the vehicle they were working on, such as torque and fluid specifications.
  • The idea of an IoT scanner tool was exciting to these students, particularly for saving data in a visible place for later use, but they remarked that for small measurements they would still want to get into the components with a handheld multimeter or other tool.

Interaction

  • During the activity, we prompted students to imagine how they might interact with this UI. Voice, gaze, and gesture seemed to be the most popular. When pushed on voice (whether noise in the garage would prevent that form of input), students assured us that while working on a car they were always far enough from other people to allow for voice interaction.

User Interface

  • When asked what selecting the profile picture would do, students said they thought it would open a vehicle-specific contact list. Suggested contacts included the owner, the insurance company, and the garage manager. When pushed to consider this within an educational context, they suggested their professor and team members. This represented a very different model from the one we had imagined, in which the profile picture was how a user would access their own individual system. The students, however, were much more interested in how the system could connect them with others.
  • Customization of the user dashboard emerged as a significant value for students. While some would want the menu to disappear completely, others wanted to arrange it nicely on the side for easy access.

Future Work

  • The students were incredibly generative and offered a lot of ideas about future applications of this technology. Because, as mentioned above, they were most excited about the professional potential of such a tool, their suggestions focused on other professional applications. Racing came up as a key application area: they suggested ways the system could connect drivers with pit crews, or mechanics within a pit crew, for efficient repairs. Firefighting, another high-pressure situation where interaction is limited and speed is essential, also came up as a possible future application of our tool.

Testing with the Hololens

As we learned after first introducing the Hololens to our research participants, getting someone into AR enables a completely different level of feedback than simply talking about the technology. While our “paper” prototyping was useful for soliciting feedback on content and UI elements, the Hololens prototyping helped us get more nuanced feedback on the experience of working with AR.

Because the Hololens is an individual experience and we were working with eight students, we used the video prototypes from our previous round of prototyping to familiarize students with the types of interactions they were going to see in the Hololens. In addition to keeping students busy while they waited for their turn, this proved valuable in soliciting feedback: because our Hololens prototype was relatively low-fi, the video prototypes gave more of a sense of the look and feel of our system.

Once in the Hololens, we walked users through both the "learn gestures" tutorial and the 3D animation we'd prototyped in Cinema 4D. Both were useful exercises, and the students loved getting to experience the Hololens.

Key Takeaways

While our "paper" prototyping activity proved more generative, yielding fruitful feedback on content and future work, our Hololens prototype helped us gather specific feedback to refine our interactions.

Interactions

  • Students who tried the Hololens "learn gestures" tutorial liked gaze as a primary means of selection. However, they expressed concern that selection was based on head movement, citing limited space and the awkward square field of view as limiting factors. Interactions based on eye tracking seem like they might be a better option for gaze in this context (a sketch of head-gaze selection follows this list for contrast).
  • The bloom and air tap gestures didn't seem to be a problem for these students. While we had been throwing around the idea of a foot-based interaction (using a pressure sensor in students' shoes to interact with the system), they didn't find it attractive. While gaze and gesture felt intuitive, foot-based control didn't. Additionally, because they are on their feet all day, they worried that gestures like this could cause foot cramps.
  • When prompted about additional interactions, students said they would also be happy touching the AR glasses for some interactions, though this was not the preference for most of them.
  • Gaze and voice interactions were the top preference, but students said they wanted some sort of feedback that they were being heard: either an icon that animates to show the system is listening, or a text-based feature that converts their speech to text as they talk (see the second sketch after this list).
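
Head-gaze selection like the Hololens' is typically implemented as a ray cast from the head-tracked camera through the center of the view; eye tracking would swap that ray's direction for a measured gaze vector. As a rough illustration only (our own Hololens prototype was a non-interactive animation, not code), here is a minimal head-gaze sketch in TypeScript using three.js:

```typescript
// Minimal head-gaze selection sketch (illustrative; not our actual prototype).
// In AR, the camera follows the head pose, so a ray cast through the center
// of the view approximates the head-based gaze the students found limiting.
import * as THREE from "three";

const raycaster = new THREE.Raycaster();
const centerOfView = new THREE.Vector2(0, 0); // normalized device coords: view center

function gazeTarget(camera: THREE.Camera, selectables: THREE.Object3D[]): THREE.Object3D | null {
  // Cast a ray from the head-tracked camera through the center of the view.
  raycaster.setFromCamera(centerOfView, camera);
  const hits = raycaster.intersectObjects(selectables, true);
  // The nearest hit is the current gaze target; drawing a cursor or highlight
  // on it is what tells the user what they are about to select.
  return hits.length > 0 ? hits[0].object : null;
}
```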
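One way to reassure users that they are being heard is to stream interim speech-to-text results back while toggling a listening indicator. The sketch below uses the browser's Web Speech API in TypeScript purely as a stand-in (our prototype had no working voice input, and the UI helper functions are hypothetical):

```typescript
// Hypothetical "the system is listening" feedback via the Web Speech API.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
const recognition = new SpeechRecognitionImpl();
recognition.continuous = true;
recognition.interimResults = true; // stream partial transcripts as the user speaks

recognition.onstart = () => showListeningIcon(); // animate a "listening" icon
recognition.onend = () => hideListeningIcon();
recognition.onresult = (event: any) => {
  // Show the partial transcript so users can see they are being heard.
  const latest = event.results[event.results.length - 1];
  renderTranscript(latest[0].transcript, latest.isFinal);
};
recognition.start();

// Hypothetical UI helpers, stubbed for the sketch.
function showListeningIcon(): void { console.log("listening…"); }
function hideListeningIcon(): void { console.log("stopped listening"); }
function renderTranscript(text: string, isFinal: boolean): void {
  console.log(isFinal ? `heard: ${text}` : `hearing: ${text}`);
}
```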

User Interface

  • Students appreciated the place-based mapping of objects for easy reference when looking around the garage. Once again, the idea of a customizable dashboard was appealing here.
  • Students liked the ability to scale objects in and out and wondered if they could do the same with menu icons. This speaks to yet another desire for dashboard customization and even opens up opportunities to think about customization for accessibility within AR (a minimal scaling sketch follows this list).
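
As a small illustration of what scalable menu icons might involve, the sketch below applies a clamped uniform scale in TypeScript with three.js; the scale limits are assumptions, chosen so icons can shrink out of the way without becoming illegible:

```typescript
// Hypothetical clamped scaling for a dashboard menu icon.
import * as THREE from "three";

const MIN_SCALE = 0.5; // assumed floor so icons stay legible (accessibility)
const MAX_SCALE = 3.0; // assumed ceiling so icons don't dominate the view

function scaleMenuIcon(icon: THREE.Object3D, factor: number): void {
  const next = THREE.MathUtils.clamp(icon.scale.x * factor, MIN_SCALE, MAX_SCALE);
  icon.scale.setScalar(next); // uniform scale preserves the icon's proportions
}
```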
