Mastery Orientation: Motivational Design in Lynnette

Teaching students to focus not simply on doing problems, but on how they choose which problems to do.

Lynnette is an adaptive math tutor built at Carnegie Mellon University. However, it hardly feels like a tutor: there's little present beyond more and more math problems to solve. This semester I worked with Professor Vincent Aleven and Yanjin Long (PhD candidate) on the non-mathematical part of the experience. Specifically, our project centered on problem selection: encouraging a mastery orientation rather than a performance mindset, with the end goal of conducting a research study on the efficacy of different design interventions.

The first welcome screen of the interactive prototype
One of the tutorial screens that calls out a specific interface element.
The home screen
Problem selection becomes an interactive item with explanatory feedback.

Problem Scope

In the words of O’Keefe et al., “A mastery goal orientation refers to a focus on developing competence. With a performance goal orientation, the focus is on demonstrating competence” (51). That is to say: challenge is healthy, and effective learning comes from growth, not from what you already know.

We want to design structures that break students out of the continuous rhythm of grinding through math problems and instead prompt them to ask, “Which problems should I be doing?” However, we don’t just want to heavily scaffold students’ decisions; we want to affirmatively teach students how to make their own decisions and build transferable selection strategies that apply across domains.


To kick off our design process, Yanjin and I brainstormed potential features and solutions that could help students make good problem selections, which I turned into 18 storyboards, ranging from different types of feedback to social mechanisms to avatar customization. We used these storyboards to run speed dating with 12 middle school students, quickly walking them through each idea to see how the students reacted.

This storyboard focused on negative feedback when students select an already mastered level.
Several of our storyboards focused on goal setting and helping students plan their way forward.

Middle school students are fascinating to interview. Some were timid; others were so sharp they made me feel stupid. What struck me the most, however, was how enthusiastic they were—generally they were in favor of our ideas, though some ideas more than others. The key insights we took away from storyboarding were:

Collectibility: Students like collecting things and being able to see a visual representation of their progress. Achievements, badges, and collectible avatar items were all well received.

Mastery is hard: As a concept, mastery is not well defined in the minds of middle school students. Some students had very high capabilities for planning their education, but most simply tried to do what was asked of them without taking a proactive look at what was best for their learning. Student feedback solidified that this metacognitive domain is worth exploring.

Social works for some, but not all: We spent a lot of time trying to think of what motivations we could latch on to and use to foster a mastery orientation, one of which is the idea of social competition. Some students loved this idea, but the more timid students were decidedly unenthused by the competition-centric storyboards. In retrospect, competition is largely focused on demonstration of competence rather than development of competence, making it hard to fit into the mastery scheme.

Interactive Prototype + Paper Mockups

After storyboarding, we moved to paper prototyping and an HTML/JavaScript prototype, and tested with another group of 10 middle school students. This allowed us to test both the usability and the interactivity of our ideas and get a good look at whether or not our features were teaching students the right concepts.

The “HCI” approach that breaks students out of a sequential level model. However, this would be too much scaffolding and not help students transfer their knowledge afterwards.

Human-Centered Design vs. Research Design: One of the key issues I kept bumping up against was that the design had to be practical from a research standpoint: every detail had to be considered with a specific hypothesis in mind, and every feature had to be linked to a cognitive theory and motivation. Most importantly, the desired outcome was that students would make good problem selections outside of the tutor; it was not sufficient to simply steer them toward good selections within the tutor. Instead, we needed to focus on instruction and goals that would have a lasting impact.

Each interaction is designed to reinforce a mental model of mastery.

Mental Models: The success of our design lives and dies on teaching students a correct conception of “mastery” and how mastery relates to making good problem selections. In our testing, however, this proved a tricky concept to impart. We tried to take every opportunity available to surface this concept and make it an inescapable part of the instruction. What became immediately clear was the need for an explicit tutorial that defines the concepts, explains how mastery is calculated by the tutoring system, and shows how each interface component is tied to mastery.
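To give a sense of what “how mastery is calculated” means here, intelligent tutors in this tradition commonly estimate a per-skill probability of mastery with Bayesian Knowledge Tracing. The sketch below is illustrative only—the function names, parameter values, and mastery threshold are my assumptions, not Lynnette’s actual implementation:

```javascript
// Illustrative Bayesian Knowledge Tracing (BKT) update -- a common way
// tutors estimate mastery. All parameter values here are hypothetical.
const params = {
  pLearn: 0.2,   // chance the student learns the skill on each attempt
  pGuess: 0.25,  // chance of a correct answer without knowing the skill
  pSlip: 0.1,    // chance of an incorrect answer despite knowing it
};

// Update P(mastered) for one skill after one observed answer.
function updateMastery(pMastered, correct, { pLearn, pGuess, pSlip }) {
  // Posterior probability of mastery given the evidence of this answer.
  const evidence = correct
    ? (pMastered * (1 - pSlip)) /
      (pMastered * (1 - pSlip) + (1 - pMastered) * pGuess)
    : (pMastered * pSlip) /
      (pMastered * pSlip + (1 - pMastered) * (1 - pGuess));
  // Then account for the chance the student learned the skill just now.
  return evidence + (1 - evidence) * pLearn;
}

// A skill is typically flagged as "mastered" past a fixed threshold.
const isMastered = (p) => p >= 0.95;

// Example: one correct answer raises the estimate from its prior.
let p = 0.3;
p = updateMastery(p, true, params);
```

An explicit tutorial can then point at exactly this quantity: the progress bars and level indicators in the interface are visualizations of these per-skill estimates, not raw problem counts.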

Rewards are given not for problem completion, but for making good selections.

Explicit Reflection: Yanjin had previously observed that students had a tendency to quickly and automatically go through problem selection without any pause for reflection and metacognition. Our design focuses on making moments of reflection an integral part of instruction, allowing students to receive explicit feedback on their strategy and forcing them to pay attention to an often overlooked aspect of learning how to solve new types of problems.
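The explanatory feedback described above can be sketched as a simple rule over the tutor’s mastery estimate for the selected level. The rules, thresholds, and message wording below are hypothetical stand-ins, not taken from the actual prototype:

```javascript
// Hypothetical feedback rules for a problem selection, keyed off the
// tutor's mastery estimate for the selected level (a value from 0 to 1).
function selectionFeedback(pMastered) {
  if (pMastered >= 0.95) {
    // Re-selecting mastered material demonstrates, but doesn't develop,
    // competence -- so it is flagged as a poor selection.
    return {
      good: false,
      message: "You've already mastered this level. " +
               "Pick a level you haven't mastered to keep growing.",
    };
  }
  if (pMastered < 0.3) {
    return {
      good: false,
      message: "This level may be too big a jump right now. " +
               "Try one closer to what you're working on.",
    };
  }
  return {
    good: true,
    message: "Good choice: this level is challenging but within reach.",
  };
}
```

Pausing to show this message before the first problem loads is one way to force the moment of reflection: the student sees why the selection was good or bad, in terms of the mastery model, before resuming.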


This project was very much an exercise in my own mastery orientation. I had never done extensive user testing before and I had only basic design skills. My main strength, web development, was pushed beyond my comfort zone by my desire to learn the entire MEAN.js stack when I only had a rudimentary understanding of Angular. Working beyond my current masteries, though, has been immensely rewarding: I now feel comfortable with running user tests with students at different stages during the design process, my ability to pump out mockups has grown significantly (though my visual design ability is still far from perfect), and I feel competent and comfortable tackling future projects using MEAN.js. I am always surprised by how useful it is to learn about learning—learning how students learn best is also learning how I learn best, and the idea of a mastery orientation is one I plan on keeping in my head.

Metacognition is difficult to design for: it’s not immediately visible and useful to students, and is often taught implicitly rather than explicitly. However, getting students to take a step back and think about how they are learning is enormously fruitful, and pays educational dividends over time. The fact that these middle school students do not have well developed metacognitive abilities is perhaps not surprising; what is slightly more startling is that I sometimes see the same thinking in my peers in graduate school. When you see students in their mid-twenties still in the mode of “perform, perform, perform” without the ability to take stock and look at their own learning strategies, you have to wonder, and you have to hope that metacognitive skills become an integral part of future educational design.

This piece is based on my independent study with Associate Professor Vincent Aleven and PhD Candidate Yanjin Long for the Fall 2015 semester.

Source code for the prototype can be viewed on GitHub here: