Prototyping approaches for Augmented Reality
Tackling prototyping for the Lumilo project has been a perfect example of the importance of tailoring methods to the particularities of a problem space. We ran into limitations with more traditional approaches and quickly iterated on the structure of our prototyping sessions after meeting with teachers.
Stephanie Houde and Charles Hill’s model from “What do prototypes prototype?” has been a helpful framework for thinking about our journey this semester. Their three-dimensional model helps visualize the focus of a prototype’s exploration: role, look and feel, and implementation. We tackled all three dimensions in parallel this semester.
Houde and Hill refer to the “role” prototype as one that helps designers understand what function an artifact should serve in a user’s life and how it will be useful to them. Through synthesizing affinity notes, we generated the top-level pain points of math teachers using ITSs that we wanted to address in the Lumilo project, and created storyboards to validate them:
- I want to know who is struggling (and prevent students from remaining stuck or wasting time).
- I want to know who is off task (and not making use of the tutoring system).
- I want to see how students are approaching problems (and their thought process).
- I want to know whether to intervene on the individual or group level.
- I want to trust the system (and the information it’s conveying about my students/class).
Ultimately, these have to do with two central themes of our project: How might we support teachers’ abilities in a classroom? How might we help teachers better manage how their attention is distributed?
A key note on this framing is that the focus is on providing information in a format for teachers to then make decisions on how they want to use it. In other words, we’re not yet at the stage where the system could intelligently prioritize for the teacher. This is an important technical limitation to recognize. For example, if the ITS cannot yet provide a reliable additional layer of analysis about who to help first, it may be best to show the teacher all the students who need help, which would allow them to make decisions with additional contextual information.
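As an illustration of this fallback, it could be expressed as a simple rule: surface every student flagged as needing help whenever the system cannot prioritize reliably, and only rank students once prioritization is trustworthy. This is a hypothetical sketch with made-up names and thresholds, not Lumilo’s actual logic:

```python
# Hypothetical sketch: surface all struggling students unless the ITS can
# prioritize reliably. Field names and the threshold are illustrative only.

def students_to_display(students, prioritizer_confidence, threshold=0.9):
    """Return the students whose help-indicators should be shown."""
    needs_help = [s for s in students if s["needs_help"]]
    if prioritizer_confidence < threshold:
        # Prioritization is not yet trustworthy: show everyone who needs
        # help and let the teacher decide, using their classroom context.
        return needs_help
    # Otherwise, let the system rank by its estimated urgency.
    return sorted(needs_help, key=lambda s: s["urgency"], reverse=True)

roster = [
    {"name": "A", "needs_help": True, "urgency": 0.4},
    {"name": "B", "needs_help": False, "urgency": 0.1},
    {"name": "C", "needs_help": True, "urgency": 0.9},
]
shown = students_to_display(roster, prioritizer_confidence=0.5)
```

The point of the rule is that the system degrades gracefully: when its own analysis is weak, it hands the prioritization decision back to the teacher rather than presenting an unreliable ranking as authoritative.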
We found that we were able to quickly validate the needs through storyboard testing, and used screen designs in Photoshop to run a “choose your own adventure” storyboard session to see what flow the teacher would take given an initial set of information (e.g., struggling, off-task, or quickly progressing students). This “choose your own adventure” approach simulated a more interactive prototype.
One of the key aspects of this project is that the augmented reality tool should be supporting the teacher in real time. While we asked what the teacher would do with the indicators/information in the moment, we knew that the actual behavior in a dynamic classroom would be quite different.
A classroom is one of the most dynamic environments, with multiple students with different needs at varying skill levels. Experience prototypes (Buchenau and Suri) are especially valuable for projects in this upper-right quadrant because the context plays such an important role in how the user may interact with the product.
We know that what users say and what they actually do are often worlds apart. While prototyping this semester, we knew that much of the feedback we were getting from teachers might change once we simulated the chaotic environment of the classroom during mid-to-high fidelity prototyping. For example, we asked teachers what they do while students are working with an ITS, and many said they walk around the classroom. One teacher told us that he actually bought a pedometer to see how much more he was walking around the classroom now that his class was using an ITS. If we want to know the role this AR tool would have in changing teachers’ behavior in the classroom (with a focus on real-time information), it would be really difficult to know without an experiential prototype.
Look and Feel Prototype
The Look and Feel prototype is designed to answer questions about the concrete sensory experience of using a product. Many design projects don’t progress until the role prototype has been thoroughly validated, but for the Lumilo project, the role of a concept that provides spatial information about a dynamically changing class is very closely tied to the look and feel. We found that it was difficult for teachers to evaluate the role a tool like this could play without ever having experienced an augmented reality tool.
As designers, we also felt that viewing our screen designs in a specific space through the HoloLens created a completely different experience. For example, we found that an amount of information that at first seemed overwhelming, when displayed in a two-dimensional Photoshop prototype, was not necessarily overwhelming when experienced in three dimensions, in the context of a large classroom. Moreover, in three dimensions, it’s possible to control when you move closer to a screen, to engage at a more detailed level with certain information. So, in an augmented reality experience, detailed information can be constantly present and available, without being constantly salient, unlike in our screen design mockups.
We moved away from using the Photoshop screen designs for the “choose your own adventure” and transitioned to showing teachers similar designs, but in the HoloLens. Buchenau and Suri have described the value of experience prototypes as making information more vivid and engaging such that it resonates with personal experience and can actually generate more design ideas. We definitely saw teachers light up and become more engaged once we started showing the screen designs in the HoloLens. While there’s definitely a novelty element at play, showing teachers that this tool could be a near reality was powerful in getting more engagement during the sessions.
“The tools we use to design, such as prototypes, influence the way we think. Solutions, and probably even imagination, are inspired and limited by the prototyping tools we have at our disposal.”
While directly using the HoloLens made the experience closer to the real look and feel, we also faced some challenges from trying to conduct a prototyping session with it:
- During our first session with the HoloLens, we realized we couldn’t see what the teachers were viewing. To address this, Ken set up live streaming of the HoloLens view to his laptop, so that we could both view and record from the teacher’s view. This helped us give better cues (e.g., physically gesturing towards particular holograms) and reduced miscommunication overall.
- There are not many prototyping tools for the HoloLens yet. Ken found HoloSketch, an app that lets you quickly mock up holographic apps in a real physical space, using live, cloud-hosted Photoshop assets. It takes Ken around 15 minutes to set up a classroom using HoloSketch. Unfortunately, while HoloSketch does allow teachers to manipulate static holograms (e.g., by manipulating a hologram’s size or position), it doesn’t yet support the creation of interactive prototypes. So, for example, teachers could not actually click on a student-level indicator in order to see additional information about that student. To work around this, we set up a dedicated “gallery” of possible information that could pop up to elaborate student-level indicators, and only brought teachers to this gallery when they were ready. In the front of the classroom, we placed an additional “gallery”, showing possible alternatives for class-level summary displays. This “class-level info” gallery was visible to teachers at any time, if they looked to the front of their classrooms.
During these sessions, we also conducted think-alouds of the individual student-level indicators. We set up the different indicators shown below in the classroom space and asked teachers to think aloud about how they interpreted those indicators. I also tried to sit below an indicator to give teachers a sense of how they would be positioned above the student. The goal of the think-aloud wasn’t necessarily to see how accurately teachers read the intention of our icon designs, but to stimulate thinking about what kind of information the system could be communicating about a student. We transitioned from talking about the existing icons to asking them what they thought might be missing. When we asked teachers which issue they would address first, many said the “Alert” icon, because it communicated severity through its color and visual design. We were hoping to learn which issues teachers would prioritize, and hearing how they would be influenced by the visual design of the icons definitely made us think about how we wanted to move forward with refining the look and feel prototype for the next mid-to-high fidelity prototype.
These indicators were really placeholders in the sense that we do not expect the visual design of them to remain as they are for the mid-to-high fidelity prototypes. We met with Austin Lee from the School of Design and as a designer with prior experience with HoloLens apps, he had many suggestions on directions we could explore for the Look and Feel prototype, especially when it came to the visual elements of the experience. Some of his comments/questions:
- What other solutions would allow the teacher to keep eye contact with students, given that the HoloLens currently restricts this eye-to-eye gaze?
- Are there ways to use the teacher’s gaze rather than gestures? For example, gazing for a longer time at a particular student would make their information pop up.
- “Semantic zoom is critical for the HoloLens because the field of view is so narrow.” Austin pushed us to think about what would be the input for “zooming”.
- If you need to use gestures, are there other tools that could be used, like a wand or stylus?
We’re planning to address these questions as we move forward towards a more mid-fidelity prototype.
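To make Austin’s gaze-over-gestures suggestion concrete, one common technique is dwell-time selection: track how long the teacher’s gaze rests on a student, and pop up that student’s details once a dwell threshold is crossed. The sketch below uses invented names and thresholds and is not HoloLens SDK code:

```python
# Illustrative dwell-time gaze selection. The class name, threshold, and
# frame-loop convention are invented for this sketch, not part of any SDK.

class GazeDwellSelector:
    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self._target = None    # student currently under gaze
        self._elapsed = 0.0    # seconds gaze has stayed on that target

    def update(self, gazed_student, dt):
        """Call once per frame with the gazed-at student (or None) and the
        frame time dt in seconds. Returns the student whose detail panel
        should pop up, or None if no dwell threshold has been crossed."""
        if gazed_student != self._target:
            # Gaze moved to a new target: restart the dwell timer.
            self._target = gazed_student
            self._elapsed = 0.0
            return None
        if gazed_student is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_seconds:
            return gazed_student
        return None

# Usage: feed the selector from the render loop, one call per frame.
selector = GazeDwellSelector(dwell_seconds=1.0)
popped = None
for _ in range(6):  # six frames at 0.25 s each, all gazing at "S1"
    popped = popped or selector.update("S1", 0.25)
```

The appeal of dwell selection here is exactly the trade-off Austin raised: it frees the teacher’s hands, but the threshold has to be tuned so that briefly glancing across the room doesn’t trigger accidental pop-ups.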
To deploy the Cognitive Tutor Authoring Tools (CTAT)-built intelligent tutoring systems (ITSs) we’re currently testing with, we use the TutorShop learning management system. Although ITSs typically rely on a set of standard real-time analytics, such as Bayesian Knowledge Tracing parameter estimates, to drive their “intelligence” or adaptive behavior, when we started, CTAT did not support the authoring of custom real-time analytics for intelligent tutoring systems.
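For readers unfamiliar with these standard analytics, the core of Bayesian Knowledge Tracing is a small per-observation Bayesian update of the probability that a student knows a skill. This is the textbook formulation; the parameter values below are illustrative defaults, not CTAT’s:

```python
# Standard Bayesian Knowledge Tracing update (textbook formulation).
# The slip/guess/transit values here are illustrative, not fitted.

def bkt_update(p_know, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One BKT step: update the estimated probability that the student
    knows the skill, given one correct/incorrect observation."""
    if correct:
        # Bayes rule: a correct answer may reflect knowledge or a guess.
        cond = (p_know * (1 - p_slip)) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        # An incorrect answer may be a slip despite knowing the skill.
        cond = (p_know * p_slip) / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # Account for the chance of learning the skill at this opportunity.
    return cond + (1 - cond) * p_transit

# Example: trace a student's estimate across a few practice attempts.
p = 0.3
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
```

Part of the mismatch we describe below is that an estimate like `p` is a per-skill mastery probability, whereas teachers often want more situated, behavior-level information.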
We knew from some of our early design work that the kinds of real-time analytics generated by standard ITSs do not always strongly align with the types of information human teachers actually want to see about their students in real time. So, it was clear that we would need to extend CTAT in order to support the Lumilo project! Ken and Jonathan Sewall extended CTAT to allow for analytics authoring in ITSs, and created a public repository for future learning analytics designer-developers to share the analytics they create.
In addition, in order to stream real-time analytics about student behavior and performance to learning analytics dashboards such as Lumilo, we found that we needed to extend TutorShop to facilitate dashboard development, deployment, and management. Zac Yu, Octav Popescu, and Ken worked on this piece over the past few months, and Zac designed several APIs to help future dashboard developers get started more quickly.
An interesting design challenge while working on both the CTAT and TutorShop extensions was to create a suite of tools that could work for future learning analytics designer-developers, across a broad range of possible projects (not just ours). Since some of our colleagues, Franceska Xhakaj and Anouschka van Leeuwen, are also gearing up to develop teacher dashboard projects, we viewed them as our first ‘external clients’ when designing this new tool suite.
Another major design goal was to make sure the tools allow designer-developers to test and iterate upon analytics and dashboards as quickly as possible, so that technical challenges do not get in the way of the designer-client dialogue during mid-to-high fidelity prototyping stages!
All in all, it was a long road to get to a first implementation prototype of Lumilo! But finally, in April, Zac and Ken developed a tablet-based dashboard interface, in order to begin getting a better sense of what it’s like to use a real-time dashboard, while it’s being updated by real (replayed) student data. This early testing had the advantage of bringing challenges related to temporal information design to the forefront early on in the prototyping process. For example, these tests revealed that it was frustrating to have alerts about a given student behavior simply disappear when the student stops exhibiting that behavior. On the other hand, teachers have noted in our prototyping sessions that displaying ‘historical’ alerts by default, in addition to ‘current’ behavior alerts, seems like information overload. So, we’re beginning to think of ways to strike a good balance!
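One way to think about the balance we’re after is as a simple alert lifecycle: rather than vanishing outright or piling up as history, an ended alert could linger briefly in a less salient state. The state names and timing below are purely illustrative, not Lumilo’s design:

```python
# Illustrative alert lifecycle: states and timing are invented for this
# sketch. An alert stays fully salient while the behavior is current, then
# lingers in a dimmed "recent" state instead of disappearing abruptly.

RECENT_WINDOW = 60.0  # seconds an ended alert stays dimly visible

def alert_state(active, seconds_since_ended):
    """Map an alert's status to a display state."""
    if active:
        return "current"   # behavior is happening now: full salience
    if seconds_since_ended <= RECENT_WINDOW:
        return "recent"    # just ended: dimmed, still glanceable
    return "archived"      # older history: hidden unless requested
```

The middle “recent” state is the compromise: the teacher can still catch an alert they missed in the moment, but older history doesn’t accumulate on screen by default.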
Ken is currently building off of the tablet-based implementation prototype in order to develop the first fully-functioning HoloLens implementation prototype. This summer, we also plan to further prototype and develop tablet-based interfaces for Lumilo, in addition to head-up displays. This is especially important because if Lumilo is to be used by teachers in the near future (at least, outside of research studies) they probably won’t have access to a device like the HoloLens!
As we move forward, we’ll be increasingly integrating all three dimensions and moving towards building experience prototypes incorporating real-time data.