Teaching Augmented Reality: Making More Than Novelty
How do you get a diverse group of students excited about emerging technology? Over the past few years, I’ve been teaching and mentoring university students on how to create experiences using augmented and virtual reality, as well as other types of interactive digital technology. My University of Maryland colleague in Immersive Media Design (IMD), Daniel Brown, and I designed a studio course to give students from across campus the opportunity to work in teams to design and build augmented reality (AR) experiences in Unity. We called it “Augmented Reality Design for Creatives and Coders.”
Our students learned how to leverage AR to create unique, meaningful, and compelling experiences, and Daniel and I learned a ton about the huge upsides and the real challenges of teaching emerging technology. The seven takeaways below are excerpted from a longer version of this article on my website.
Takeaway #1: Collaborative Teaching for a Collaborative Medium
The initial concept for the class was to experiment with a teaching and learning environment for spatial technology that leveraged the power of teams. Daniel and I both recognized that almost all compelling experiences in augmented and virtual reality are created by teams that draw on mixed skills, experience, and backgrounds. These teams work in a highly iterative manner, constantly testing concepts and approaches and seeing what works and what fails.
We co-taught the class, modeling the kind of cross-disciplinary collaboration that we wanted our students to engage in. Daniel led discussions on development and prototyping, I led discussions on ideation and project management, and we both led conversations on the design process and documentation.
We organized the class as a cross between a laboratory and a design studio, a space where both the instructors and the students could test different methodologies for creating AR experiences and see what could be built.
In some ways, learning the design and technical skills necessary to create AR experiences was in service to teaching core collaboration skills (rather than the other way around). By putting together students with diverse backgrounds and training them to communicate and build something challenging and novel together, we were able to nurture fundamental “soft skills” that would serve them throughout their careers, no matter where they decided to work.
To model that collaboration ourselves, Daniel and I spent a good deal of time before launching the class building our own projects together. We shared these demo projects with the students, unpacking our process and what our goal was with each prototype or proof of concept that we created.
Takeaway #2: Immersive Storytelling Principles to Go Beyond Novelty
I believe that augmented reality’s potential has gone largely untapped, especially when compared to the content being created for virtual reality. AR has found utility in a few areas, like novel marketing gimmicks, tools for picking furniture, and heads-up displays for factory workers and first responders, but its potential to incorporate story and narrative has been largely unexplored. Why is this? How could we approach making experiences with AR differently?
In the first week, we introduced the students to some core principles of narrative design and immersive storytelling. We wanted our students to approach the creation of their apps as experience designers, prioritizing the user experience (UX) and user journey over technical sophistication and novelty alone. We certainly welcomed apps that would provide utility and solve real-world problems, but with this course we wanted to challenge the students to make more than tools.
With each project idea that the students put forward, I would ask them how it created a compelling connection between digital content and something in the real world, be it a person, object, or environment. How would their experience encourage the user to see, hear, feel, and experience the real world around them in new ways? What kind of constraints or particular design and technical considerations would this require? In short, why did their project need AR?
I broke narrative design elements down into a series of conceptual categories that aimed to help the students make the connection between the content that they would be creating in their apps and more traditional narrative-driven mediums, like film, and especially immersive performance. The categories included story and narrative, character, audience, agency, accessibility, attention, liveness, presence, immersion, interactivity, and environment.
Takeaway #3: Concept vs. Execution: Balancing Conceptual and Technical Skills Instruction
One of the most challenging aspects of creating the class was deciding how much class time to devote to basic technical skills (like navigating Unity for the first time) versus using class time to focus on topics that could not be taught as easily through a YouTube tutorial (like discussions on ethics or accessibility).
We decided to devote most of our roughly five hours a week of class time to introducing the students to design and project management principles.
Beginning with concept boards, the students learned how design documents are a valuable step towards translating their ideas into content proposals with specific aesthetics, beats of action, interaction, and mechanics. Borrowing mainly from video game and animation production workflows, the students created mood boards, style guides, storyboards, user journeys, and technical design documents. In this way, before writing a line of code or generating 3D assets or animation, the students had crafted a cohesive set of blueprints for what they wanted to create.
In the first year of teaching the course, we badly misjudged the support and time that our students needed to work with the core software, in this case Unity and Adobe Aero. Software like Unity is complex and intimidating for new users; just getting it installed and running correctly on some of our students’ computers took weeks. Frequent updates to the software created version discrepancies, and out-of-date documentation was also a frequent issue.
Promising AR-specific software like Aero was still in beta and came with instability issues and limited support across operating systems and mobile platforms. As a result, teams leaned heavily on the one or two students who already knew how to code in C# and were familiar with Unity to build prototypes and proofs of concept, rather than everyone on the team feeling confident enough with Unity to contribute directly.
I took in all of this feedback and, like a good XR creator, went back to the drawing board to redesign the course for the next year.
The second time around, we spent much more class time with the students at the beginning of the semester to make sure that they had a basic understanding of Unity and how to build AR applications with it. By using class time to ensure that each student could build a basic AR scene in Unity on their own computer and deploy it to a mobile device, we confirmed that everyone had a working grasp of the primary technical workflow. Students in the second year were far more likely to share programming duties and to work in areas that were outside their comfort zones.
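For context on what that baseline assignment looked like, here is a minimal tap-to-place sketch of the kind of scene we asked each student to build and deploy. It is only a sketch, assuming Unity’s AR Foundation packages (an AR Session, an AR Session Origin with an ARRaycastManager, and a prefab assigned in the Inspector); the class and field names are illustrative, not our actual course materials.

```csharp
// Minimal tap-to-place sketch using Unity's AR Foundation packages.
// Assumes the script sits on an object with an ARRaycastManager
// (typically the AR Session Origin) and that a prefab is assigned
// in the Inspector. Names here are illustrative only.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;   // the 3D asset to place in the scene

    ARRaycastManager raycastManager;
    GameObject spawnedObject;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch point against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;

            // Place the object on the first tap, move it on later taps.
            if (spawnedObject == null)
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```

Even a sketch this small walks a student through the full workflow, installing the AR packages, configuring an iOS or Android build target, and deploying to a device, which is exactly where most of the first-year roadblocks appeared.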
Takeaway #4: Paper Prototyping for AR
In the spirit of treating the course as a lab space to test different approaches to teaching XR, I took a day during the second year to do paper prototyping. I was curious to see how thinking in a tactile way would change the students’ perceptions and approaches to implementing AR.
Sometimes working with different materials outside of software can allow us to prototype faster or think about 3D digital content in new ways. It gives us a lot of quick and useful information about the content and what’s compelling about it. Paper prototyping can be a much faster method (for most students) to translate their ideas into spatial content. It can help the students zero in on the affordances and constraints in how they intend to use AR before they spend time building. It also points out parts of their proposed experiences that might be better realized through an approach other than AR.
In general, I found it to be an incredibly useful exercise, but it has its limitations. On the plus side, it puts the narrative and UX aspects of the experience front and center. The students started to think more specifically and spatially about what their experiences would be and how the user would engage with the content.
On the downside, the students often created 2D cutouts or cards to stand in for 3D objects; this undercut some of their exploration and understanding of how a digital object’s volume would interact with the physical space. They also ended up focusing too much on what would happen in AR (rather than utilizing non-AR content like physical installation elements or 2D screen content). Ironically, working with physical materials to stand in for digital ones in their paper prototypes sometimes de-emphasized their thinking about how their experiences would engage with the physical world: people, objects, environments.
To get some additional information from the students about what they gained from the exercise, I had them fill out a two-question survey at the end of class:
1) What worked.
2) What didn’t.
The results were interestingly mixed. For some students, in particular those with more of an art background, paper prototyping catapulted their projects from fuzzy ideas to fully fleshed out experiences. They created experiential road maps and discovered exciting ways to communicate the core theme of their content. It brought groups together around a clear and specific vision for the project.
Other students found the exercise less helpful, especially those who had already done a good deal of planning and design for their experience, as well as teams that included students with a high degree of programming experience. These students felt it was a waste of time that they could have spent building a digital prototype. Still, even for them, the exercise pointed out gaps in the user experience that they needed to flesh out before investing time inside of Unity.
Takeaway #5: Classroom as Design Studio: A Model for Experiential Learning
In keeping with our desire to connect the work we did in class with the professional world of XR, we brought in guest speakers like Daniel Plemmons, a UX lead at Adobe, and Ashley Crowder, the CEO and founder of Vntana, a B2B 3D model platform, to share their work and best practices with the students.
For the students’ midterm group project, I paired each team with a guest professional director or producer working in XR. The guest presented the students with a project brief or outline, and the students worked together as mini-design studios to translate the concept into design documents and a prototype.
In the second year of the course, I reached out to the developer relations team at Niantic, who were just rolling out Lightship, an AR development platform that works with Unity to create sophisticated AR experiences. It hands over to outside developers much of the underlying multiplayer and contextual-awareness technology that made their game, Pokémon GO, such a huge success. I scheduled workshops with Niantic to teach Lightship to the students, and the Niantic development team made themselves available to support any student teams that wanted to use it to build their final projects.
It was incredibly exciting for both the students and me to work with Niantic’s developer relations team on a new and potentially game-changing toolset for creating AR-enabled experiences.
The workshops embodied so much of what I wanted for the class: introducing the students to companies and professionals working in XR, testing cutting-edge technology, and offering them specialized training that they could not get elsewhere.
Of course, it also came with the challenges of working with a brand new software solution: incomplete documentation as well as frequent updates to the software that would break existing projects or change established workflows. And unlike more established AR solutions, there was not yet a large community of developers to go to with questions or best practice tips. (In the year-plus since Niantic worked with us, their team has greatly improved Lightship’s documentation and tutorials and they have cultivated an engaged and growing developer/creator community.)
I think our work with Niantic was a great way for the students and the IMD program to build relationships with a potential employer for our students and a leader in the AR experience space. We were also able to give quick and useful feedback about the documentation materials and tools, which the Niantic team appreciated as they worked to improve the Lightship platform. Since neither my TA that year, Christopher Maxey, nor I had used the platform, we built a Lightship prototype alongside our students. We were so taken with the platform that we built the beginnings of a multiplayer AR block-building game. It won best use of multiplayer in Lightship’s summer 2022 competition.
Takeaway #6: Balancing Process and Product
It was an ambitious goal: a class that would teach students with little exposure to augmented reality and the primary software used to build it (namely Unity) both the technical and conceptual skills necessary to make fully realized AR experiences. We would focus on process, but by the end of the semester, expect student teams to produce working demos and present them to an invited audience. It was the equivalent of bringing in a group of students, introducing them to this thing called “motion pictures,” teaching them the most rudimentary basics of how to shoot content, and then expecting them to make short films by the end of the semester. And all of this before the advent of modern workflows and software that streamline the creation, capture, editing, and distribution of content; in other words, motion picture creation circa 1910.
For their final project, the student teams chose their own subject matter and created working demos of their projects that they then presented to faculty and industry guests at the end of the semester.
Here are a few examples of projects created by the student teams in the first year of the class at the University of Maryland (UMD) and the second year at UMD and California State University Northridge’s Emerging Media program.
“NavigateAR” is an indoor/outdoor campus navigation application that allows users to traverse and discover the University of Maryland in a more efficient and immersive way while also introducing them to campus resources and general points of interest. The AR experience provides highlighted directions that augment the user’s path to their desired destination, along with narrated and text directions. It also encourages exploration of buildings like the Adele H. Stamp Student Union through an AR scavenger hunt led by the UMD mascot, Testudo.
“ART!” is an AR scavenger hunt that turns the Cal State University Northridge campus into an AR student art gallery. The experience uses LiDAR scans of the campus and GPS to place content indoors and outdoors.
“Chesapeek” is an AR experience that allows users to engage with and learn about wildlife native to the Chesapeake Bay watershed. The experience begins as a site-specific, geo-located walking tour around Lake Artemesia.
“Lone Light” is an AR installation that casts the audience as a paranormal detective sent to investigate a strange murder.
“EchoStAR” is an interactive projection installation that aims to encourage social interaction and connections between strangers. It uses strategically placed microphones in the room to capture the sounds of the audience as well as motion sensors to capture their movement. It translates these inputs into patterns of abstract color and form. As the audience moves through the space or changes the sounds they make, the installation dynamically responds.
Takeaway #7: Hardware Headaches: The Biggest Hurdle for Teaching AR
The biggest challenge with teaching the class, and, I feel, with teaching emerging media in general, is the frustrating and time-consuming set of issues that students face getting the core software to work properly on their devices. Yes, the struggle provided a useful learning opportunity in problem-solving, but for some students it generated weeks of roadblocks that kept them from completing basic technical assignments. It also required many hours of additional technical support outside of class from my co-instructor and myself.
I’m not sure what a realistic solution to this challenge is in the short term. Perhaps the department could supply the students with standardized hardware (i.e., development computers with preconfigured and updated versions of Unity, along with AR-enabled tablets). My hope is that as the tools to create AR (and VR) experiences continue to mature, they will become more stable and simpler to use.
Looking Ahead
My students have shown me how AR can be a powerful tool to rethink our relationship to the physical spaces around us: whether through an experience that lets a community plant a digital garden in a disused parking lot to collectively imagine its potential, or a networked AR graffiti app that sparks unexpected collaborations in unexpected locales.
Their work has also reinforced my belief that AR in its current form (mostly deployed via handheld mobile devices) is often most effective as a supportive design element rather than as the sole focus of an experience.
It has also further confirmed my opinion that, like all new technologies, we need to think about how storytelling and narrative can frame the technical possibilities (and ethical considerations) of AR in a way that is relevant and meaningful for the widest possible audience. To me, this includes thinking beyond AR in an entertainment or art context and thinking about how storytelling and narrative apply to AR’s application in contexts like health care, education, manufacturing, and social justice.
These are areas where AR is already making an impact and where many people first interact with the technology in a sophisticated way. Creating engaging and compelling user experiences in these contexts is critical to shaping the UX/UI patterns that will carry AR toward mass adoption on future wearable devices.
Speaking of the future, I hope this article sparks conversations amongst educators and XR professionals about how to attract, support, and educate new designers and developers. Like many others using XR in the classroom, my understanding of the medium and the tools is constantly evolving. I’m always seeking out resources to refine and reorient my approach to teaching XR design. If you have ideas about XR education and training or thoughts about this article and the “Augmented Reality Design for Creatives and Coders” course, reach out to me on LinkedIn.