Google Assistant for Classroom

How might we design an experience to help an educator match faces to names?

STEP 1: User Research

Study problems before jumping to solutions


I shadowed the first few sessions of Professor Shawn Van Every’s “Live Web” class at NYU.


I also interviewed four educators who taught at different levels in different cities to help me understand their experiences with names and faces of students.


I categorized their experiences into three themes:


Both Nancy and Jessica struggle to remember and pronounce unfamiliar names and wish there were a tool that could read those names to them correctly.


Teachers do not seem to struggle to remember students who are talkative or attention-seeking.


Both Jessica and Joey want to call on the quiet students but do not know or remember their names.

STEP 2: Brainstorming

Diverge on ideas, then converge on the one that solves the problem best.

During the class observation, I noticed that Shawn had printed out a list of the students’ headshots from the school system to help him match their faces with their names. At first glance, the printout seemed like a powerful aid. However, the pain point of this experience was that when a student was speaking, Shawn sometimes could not find that student’s name on the list quickly enough. This struggle arises for two main reasons. One, the student might look different from the photo, so Shawn could not recognize them. Two, the photo might sit toward the end of the list, and Shawn did not reach it before the student finished talking.

I had recently read a news article about how speaker-recognition technology might present opportunities to assist teachers in the classroom. These observations led me to craft the problem statement as follows:

“How might we create a digital student list that automatically identifies the student who is speaking through voice recognition?”
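One common approach behind this kind of feature is speaker identification: each student’s voice is enrolled as a numeric “voiceprint” embedding, and a new utterance is matched against the enrolled set by similarity. The sketch below illustrates the matching step only, with made-up three-dimensional embeddings and a hypothetical threshold; real systems derive much higher-dimensional embeddings from audio.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify_speaker(utterance_embedding, enrolled, threshold=0.7):
    """Return the best-matching enrolled name, or None if no match
    clears the (hypothetical) similarity threshold."""
    best_name, best_score = None, threshold
    for name, embedding in enrolled.items():
        score = cosine_similarity(utterance_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy voiceprints for two students (illustrative values only).
enrolled = {
    "Nancy": [0.9, 0.1, 0.2],
    "Joey": [0.1, 0.8, 0.3],
}
print(identify_speaker([0.88, 0.12, 0.2], enrolled))  # Nancy
```

An unenrolled voice that is dissimilar to every voiceprint falls below the threshold and returns `None`, which matters in a classroom where guests or overlapping speech may occur.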

STEP 3: Rapid Prototyping

Build a low-fidelity prototype to learn from the users and involve them in the design process as early as possible.

Story 1: Before the class

Story 2: During the class

Story 3: After the class

STEP 4: Measure and Learn

I tested the stories with the teachers and learned what works and what doesn’t work.

“What works…”

1. Memorization happens by categories in sequence

Talkative students and attention seekers are easily remembered in the first few days, but the quiet ones are harder to remember.

2. 3D > 2D

Pre-recorded videos really help with remembering names and with pronunciations.

3. “Wow” moment

All of them were wowed when they found out about the quiet-student list function.

4. Action reinforces memory

Calling students from the quiet list helps teachers remember them.

“What doesn’t work…”

1. No phones

Teachers don’t like to look at their phones in class.

2. Microphone concern

Some worried that their smartphone microphones are not sensitive enough to pick up students’ voices from a distance.

STEP 5: Pivot and Iterate

Keep what works and iterate on what doesn’t

Stories 1 and 3 work, but Story 2 doesn’t: teachers do not want to use their phones during class and are concerned about the pickup range of their phones’ microphones. Story 2 needs iteration. Now we face a design challenge:

“How might we document class participation with high enough accuracy without phones?”

Emerging Technology

Google Home has two far-field microphones, which theoretically allow it to pick up voices from across a room, and it is also able to recognize different voices. Moreover, unlike using a phone or wearing Google Glass, Google Home can work passively in the background, which minimizes distractions to the teachers and students. What if we could use Google Home in the classroom to assist the teacher in documenting class participation? But how can documenting class participation help teachers remember names? From the research we learned:

Key User Insight

Teachers have no problem remembering attention seekers (trouble-makers) and active students (talkative students). It is the names of the quiet students that teachers find most difficult to learn.

Problem statement 2.0

“How might we automatically identify the quiet students and help teachers with their names?”

Design Principles

I gained other insights from the experiment. I summarized those insights into three design principles: “Minimize class interruption from technology,” “Memorizing names is easier in sequence by categories,” and “Watching students talk is valuable to the learning process.” I am going to design the experience based on these principles.

Story Board 2.0

Story 1: Before the class

Story 2: During the class

Story 3: After the class

STEP 6: High-fidelity mockups

I am going to play around with the visuals and interactions to bring delight to the users.

I designed the interfaces in Sketch for Mac.

Story 1: Before the class

Students upload a self-introduction video to the platform.

Story 2: During the class

Using the voice data from the student videos, Google Home acts as a Class Assistant that monitors class participation through voice recognition.
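Once each utterance in class is attributed to a student, monitoring participation reduces to tallying speaking time per name. This minimal sketch assumes a hypothetical log of `(student_name, duration_seconds)` pairs emitted by the speaker-recognition step:

```python
from collections import defaultdict

def tally_participation(utterances):
    """Sum total speaking time per identified student.

    utterances: list of (student_name, duration_seconds) pairs,
    assumed to come from an upstream speaker-recognition step."""
    totals = defaultdict(float)
    for name, seconds in utterances:
        totals[name] += seconds
    return dict(totals)

# Hypothetical class log.
log = [("Nancy", 12.5), ("Joey", 4.0), ("Nancy", 8.0)]
print(tally_participation(log))  # {'Nancy': 20.5, 'Joey': 4.0}
```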

Story 3: After the class

Teachers use the app to identify quiet students and watch the self-intro video to help remember their faces and names.

To help the teachers remember names category by category, I sorted the students into “quiet students,” “attention seekers,” and “active students,” and color coded them. The last two categories are easy to remember from class time, so teachers can focus on remembering the “quiet students” using the videos outside of class.
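The sorting described above could be driven directly by the participation data. The sketch below buckets students using two hypothetical signals and thresholds of my own choosing: total speaking time separates quiet students from the rest, and an interruption count separates attention seekers from active students. The names and cutoffs are illustrative, not part of the original design.

```python
def categorize(participation, quiet_max=30, interrupt_min=5):
    """Bucket students into the three categories used in the app.

    participation: {name: (seconds_spoken, times_interrupted_others)}
    quiet_max / interrupt_min: hypothetical thresholds."""
    buckets = {"quiet students": [], "attention seekers": [], "active students": []}
    for name, (seconds, interruptions) in participation.items():
        if seconds < quiet_max:
            buckets["quiet students"].append(name)
        elif interruptions >= interrupt_min:
            buckets["attention seekers"].append(name)
        else:
            buckets["active students"].append(name)
    return buckets

# Hypothetical participation records.
data = {"Ana": (10, 0), "Ben": (200, 8), "Cleo": (150, 1)}
print(categorize(data))
```

Thresholds like these would likely need tuning per class size and session length, or replacing with relative cutoffs (e.g., the bottom quartile of speaking time counts as quiet).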

Teachers can tap the speaker icons next to the names to learn how to pronounce them and tap the photos to watch the students introduce themselves.

And then I designed animations and built a clickable prototype in Principle.

Finally, I did post production using Adobe Premiere. Here is a demo video for the experience:


STEP 7: User Feedback & Next Steps

Design is never done; it is never finished.

After playing with the clickable prototype, the teachers were really excited about the app. They kept coming up with ideas for how the app could do more to solve their problems in class. (When this happens, it is paradise for a UX designer.) For many teachers, matching names with faces alone is not a big enough problem to spend extra effort solving, but the Assistant has the potential to address other underlying pain points in the classroom at the same time. Here are some interesting learnings:

Recommendation to Call
It surprised me that the teachers perceived the “Recommendation to Call” column as a big help (I hadn’t put much thought into it when I first designed it). I learned that teachers tend to call on the better-looking and smarter students more frequently. The app helped them become fairer by randomizing whom they call.
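One way such a recommendation could work, sketched under my own assumptions rather than from the actual design, is a weighted random draw over the participation log, where students who have spoken less are more likely to be picked. Randomness keeps the selection fair while the weighting nudges attention toward quiet students:

```python
import random

def recommend_to_call(talk_seconds, k=3, seed=None):
    """Recommend k distinct students to call on, weighting quieter
    students more heavily.

    talk_seconds: {name: total seconds spoken}, assumed to come from
    the participation log; seed allows reproducible draws."""
    rng = random.Random(seed)
    max_time = max(talk_seconds.values())
    pool = list(talk_seconds)
    picks = []
    while pool and len(picks) < k:
        # Quieter students get larger weights; "+1" keeps every
        # weight positive so the top talker can still be drawn.
        weights = [max_time - talk_seconds[name] + 1 for name in pool]
        choice = rng.choices(pool, weights=weights)[0]
        picks.append(choice)
        pool.remove(choice)
    return picks

# Hypothetical speaking totals for one class session.
totals = {"Ana": 5, "Ben": 300, "Cleo": 40, "Dev": 12}
print(recommend_to_call(totals, k=2, seed=1))
```

Drawing without replacement (removing each pick from the pool) guarantees the column never recommends the same student twice in one list.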

More than self-introduction
The teachers love the self-introduction videos and think they are very helpful for remembering the students’ names. They want more information about their students than the short introduction videos cover. Information such as what students are good at and what they are struggling with can help teachers better assist them.

Machine Learning
The teachers are interested not only in who is talking in class, but also in what they are talking about. The Assistant has the potential to record and even understand what a student is saying, and to provide meaningful assistance and even suggestions to the teacher through the mobile app.

Mark as remembered
One thing that would be helpful to build next is a “Mark as remembered” function, so teachers can set aside the students they already remember and focus on the unfamiliar ones.

“Start with a good design, and then a better one, and then a great one.” — Melora Zaner