QuizBox: designing an experience that helps students learn better

Carolina Li
CS449/649 F20 — UWaterloo
13 min read · Dec 12, 2020

by Grain Obtainers

Designed by pch.vector / Freepik

As students, we know that exams are a big part of the educational experience. Preparing for tests can be stressful and overwhelming, especially in university, so it’s important to figure out a study strategy that’s efficient and effective.

We also realized that something’s missing from our university experience — a lack of creativity, autonomy, and collaboration in the studying process. Students often study in silos, using material provided by the school, such as course notes and past exams. However, research has shown that people who teach others learn more and retain information better.

Motivated by the need for a stronger student community and the fact that teaching the content leads to better understanding, our team of five decided to create QuizBox, a platform that allows students to share and enhance their learning with others. Currently, we do this through creating and sharing quizzes.

This benefits both the quiz-makers and the students studying from the quiz. Students who make quizzes have to think critically about what information is important, leading to a deeper understanding of the course material. On the other hand, students who study from the quiz benefit from having an extra source of practice, made by someone going through the exact same process as they are. Furthermore, our stretch goal is that QuizBox can help create a more tight-knit university community.

Keep reading to learn about our design process, what we learned, and what we’d do next!

Empathizing with students

To begin, we needed to understand who would want to use our service. A good understanding of our target audience would help guide development decisions, such as which features to offer and what priority each should be given. We leveraged personas and empathy maps to represent our target users. A persona is a description of a fictional person, including details such as their background, goals, and fears. Empathy maps help us relate to personas by capturing why they feel a certain way and what their daily activities are. For this project, we created three distinct target users.

User 1: Stefan

Stefan’s a university student who struggles academically. He’s normally alone in his courses. He relies heavily on course resources such as lecture notes and office hours. Stefan lacks confidence in his abilities. He wants to do well but without peer support, he lacks motivation.

Stefan: Persona
Stefan: Empathy Map

User 2: Bam

Bam’s a stellar student. He’s very popular among his peers and always helps others with assignments. He’s aiming to one day become a professor. While helping others, he often finds himself repeating the same information. He wishes there were an easier and less time-consuming way to share resources with his peers.

Bam: Persona
Bam: Empathy Map

User 3: Jean

Jean’s a teaching assistant who wants all her students to succeed. She’s frustrated by the limited options for sharing resources with her students, but she is open to trying out new teaching methods.

Jean: Persona
Jean: Empathy Map

Now that we had our personas, we wanted to validate our hypotheses about user needs and desires with real participants. We also wanted to see what resources students currently study with and where they turn when they need more. We came up with a series of open-ended questions and a questionnaire to guide our interviews.

Interview script

From the interviews, we uncovered that students rarely think to study from self-made quizzes. In fact, most interviewees had never used quiz-building tools before. When studying with peers, they only discuss ideas around course content and work through course-provided exercises together.

Additionally, we discovered that interviewees only trust resources provided by professors or teaching assistants. This was a crucial discovery: credibility would be the most important factor in enticing participants to use a quiz service like QuizBox.

Defining the problem

With the interview data, we extracted all the points we thought were important to both the interviewees and our application. This set us up to use affinity diagrams. Affinity diagramming is a technique that involves grouping related points together and identifying a main theme across each group, which can surface interesting patterns within your data. Through affinity diagrams, we discovered three main themes.

1. Time Efficiency

Our interviewees were all students, and they stressed that they were already spending too much time on school work. As such, they really value efficiency in their studying. This was an important discovery, as it meant our app needed to make quiz building and searching efficient for users to consider it worthwhile.

Affinity Diagram: Time Efficiency

2. Trust

Students were also skeptical of quizzes made by strangers; they would much rather study from material that has proven useful. This again raised the question of how we could design for credibility.

Affinity Diagram: Trust

3. Motives

Our interviewees had a multitude of motives, ranging from simply passing a course to landing a well-paying job. This opened up the idea that our platform could be leveraged by more people than just students: anybody who wants to learn from additional resources.

Affinity Diagram: Motives

With our new and improved understanding of user wants, we devised some common user tasks that would be performed on our application. The most important tasks for a quiz-related application are quiz building and quiz studying. With the insights gained from the interviews, we knew the interface needed to be minimal to enable efficient studying.

Since trust and ease of use were important, we decided to add groups, which organize related quizzes into sections. Groups help students access desired material, and users can join groups they feel provide good content, such as a group with an active teaching assistant.

This stage of the design process helped us better understand our users’ needs. We leveraged this data throughout the later ideating and refinement stages.

Ideating

Having gained a strong understanding of who our users were and what features they desired from QuizBox, we shifted our focus to solidifying the main use cases of our application. We knew our application needed to solve a real pain point if individuals were to adopt our platform.

Many interviewees seemed hesitant about studying with quizzes, so our goal was to show them the benefits of using a quiz-sharing platform.

First, through user stories and storyboards, we created and refined features to solve real user needs.

We brainstormed as many feature ideas as we could, such as student profiles, voting, and messaging. Next, we held a voting session, where we put stars next to ideas we liked. We quickly noticed many groups of similar features, so we consolidated them to ensure our selected features didn’t overlap. We favoured ideas that offered unique value in our app while best reflecting our users’ pain points. We ended up with five feature epics and corresponding user stories. Here are the stories for the epics we eventually chose to prototype.

Two of the final feature epics

Next, we created storyboards for each feature. We envisioned a specific user, a problem they’re facing, and how the feature would fit into their life. We thought about the setting, sequence of actions, and results, which we represented through visualizations.

Every detail aimed to reflect a real user and their actions. Research from user interviews and exercises like the affinity diagram helped us understand user problems and needs. To create the storyboards themselves, we used a variety of methods, including hand-sketching, digital sketching, and third-party comic-building software. Here are two of the storyboards from our manga-drawing group members:

With our stories representing our product’s main use cases and our storyboards outlining specific user scenarios, we wanted an even stronger vision of what the final product might look like. So we leveraged sketches and user flows to put our ideas into practice, developing detailed layouts of these main features.

Using Crazy 8s, we sketched out eight potential layouts for our selected features in quick succession, allowing only 40 seconds per sketch. Here’s an example for the Quiz Tagging feature:

The activity was helpful, as each of us was able to individually hone our assigned feature and develop multiple versions of a potential layout. We then decided which layout we preferred for each feature.

We used the storyboards and Crazy 8s as starting points to create more fleshed-out screens. As we decided on the screen layouts, we considered design principles such as affordance and discoverability.

For example, we gave clickable components stronger shadows to create a raised, pressable look. We also took inspiration from other successful mobile apps, such as Reddit and Twitter. We figured there was no need to reinvent the wheel; these companies had already gone through the design process and optimized their UIs. However, we kept in mind that what works for other companies may not work for us, so we were critical about which design patterns to borrow.

After sketching out the screens, we connected them by creating user flows. We thought about how each persona would use each feature. We created diagrams showing how interactive components let users progress to different screens, or showed how components react to user input. The user flows show how users would use the app to complete their tasks.

User flows for the quiz tagging feature

Overall, the ideation process was incredibly useful for our team. Coming into this process, we had lots of ideas about what our application could do and what it would look like. Through the many activities, we were able to design detailed mocks with proper user flows for the application. This helped us flesh out our ideas and provided a solid foundation to build on in future steps. At this point in the design process, we were more confident that QuizBox could be of use to individuals looking for additional study resources.

Prototyping and Testing

After designing our mockups, we were ready to move onto building a simple prototype. At this point in our design process we were highly focused on iteration; we wanted as much feedback as possible in order to incrementally improve our product. As such, our design went through different stages of evolution.

First, we used Miro to collaboratively create a low-fidelity (or paper) prototype. We purposely left out a lot of the colour and shifted our focus primarily to the layout.

Our paper prototype for the group screen, where users can search for quizzes

Following the creation of the paper prototypes, we engaged in three user evaluations: one mock evaluation with our partner team and two with real users from outside the class. In these evaluations, we presented users with tasks such as, “How would you find a quiz that’s TA-approved?”. The users would then indicate where they would click on the paper prototype, and we would progress to the next relevant screen. Overall, these evaluations identified some early points of confusion and helped us iterate on our design.

Some common points of confusion our users identified were:

  • Quiz tagging was vague. We iterated on this feature by introducing a one-time popup to explain tags, updating the tag icons, and making certain tags more explicit (e.g. “LA” became “Long Ans”).
  • Some filtering options were vague: “Popular” gave no indication of what was being measured, and “All Questions” did not communicate that it related to question types. We renamed the filters to better explain what was being filtered (e.g. “Popular” became “Highest Rating” and “All Questions” became “Question Types”).
  • Our multiple-choice editor originally had the second answer pre-selected. We intended this as a placeholder; the user could mark any number of answers as correct. However, it confused some users, who thought they had to write the correct answer into the second slot. We removed the pre-selection and simply required users to mark at least one answer as correct before proceeding.
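The pre-selection fix in the last bullet boils down to a simple validation rule. Here’s a minimal sketch in Python of what that rule could look like (the class and field names are hypothetical, not taken from our actual prototype):

```python
# Hypothetical sketch of the validation rule described above:
# a multiple-choice question may only be saved once the quiz-maker
# has marked at least one answer as correct.

from dataclasses import dataclass, field

@dataclass
class Answer:
    text: str
    correct: bool = False  # nothing is pre-selected anymore

@dataclass
class Question:
    prompt: str
    answers: list = field(default_factory=list)

    def can_save(self) -> bool:
        # Require a prompt, at least two answers, and one marked correct.
        return (bool(self.prompt.strip())
                and len(self.answers) >= 2
                and any(a.correct for a in self.answers))

q = Question("What is 2 + 2?", [Answer("3"), Answer("4")])
assert not q.can_save()      # blocked: no answer marked correct yet
q.answers[1].correct = True
assert q.can_save()          # now the quiz-maker can proceed
```

Starting every answer as unselected and gating “save” on the rule above avoids the placeholder ambiguity our testers ran into.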

After the low-fidelity testing, we used Figma to generate a high-fidelity prototype, this time complete with interactive buttons and a colour scheme. We created this high-fidelity prototype for three features: the group page, taking a quiz, and creating a quiz. We also created a launch screen complete with an icon.

The new, colourful group screen!
Our new, fancy launch screen

Play with the Figma prototype here

After we created the shiny new prototypes, we re-evaluated the designs with users. We conducted both heuristic and cognitive evaluations.

Heuristic evaluations

In a heuristic evaluation, a user steps through the application with certain overarching principles (heuristics) in mind and evaluates how well the application adheres to them. After a heuristic evaluation with another member of the class, we identified several areas of potential improvement:

  • Visibility of system status: Here, we could add a progress indicator on quizzes so users know which quizzes they have yet to complete.
  • User control and freedom: We could add additional options for users to save a quiz as a draft or confirm a quiz deletion in case a user presses “Back” accidentally.
  • Recognition rather than recall: We should find a way to create folders of quizzes or save favourite quizzes for easy access.
  • Aesthetic and minimalist design: We could combine filtering and sorting into one control to reduce clutter.
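The last heuristic suggestion, together with the earlier filter renames (“Highest Rating”, “Question Types”), could be served by a single browse function that both filters and sorts. A rough Python sketch under assumed data (the quiz records and field names here are illustrative, not from our prototype):

```python
# Illustrative sketch of a combined filter-and-sort control:
# one function applies the question-type filter and the
# "Highest Rating" sort in a single pass. Sample data is made up.

quizzes = [
    {"title": "Linear Algebra Week 3", "types": {"MC", "Long Ans"}, "rating": 4.6},
    {"title": "Calculus Midterm Review", "types": {"MC"}, "rating": 4.9},
    {"title": "Stats Practice", "types": {"Long Ans"}, "rating": 3.8},
]

def browse(quizzes, question_type=None, sort_by_rating=True):
    """Filter by question type (if given), then sort by rating, highest first."""
    results = [q for q in quizzes
               if question_type is None or question_type in q["types"]]
    if sort_by_rating:
        results.sort(key=lambda q: q["rating"], reverse=True)
    return results

top = browse(quizzes, question_type="MC")
# The highest-rated multiple-choice quiz comes first.
assert top[0]["title"] == "Calculus Midterm Review"
```

Merging the two controls like this would reduce on-screen clutter while keeping both behaviours reachable from one place.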

Cognitive evaluations

In a cognitive evaluation, the participant freely explores the application to accomplish certain tasks. These are useful for seeing how an average user would learn to navigate the application and accomplish their goals. We conducted a cognitive evaluation with a third-year Mathematics student and gave them three tasks.

  • Finding a quiz: While there was a little confusion over tags and filters, overall the participant was able to quickly find a specific quiz.
  • Creating a quiz: There was confusion over the “add answer” button, where the user thought it was only for adding correct answers. They also expressed frustration that the back button immediately deleted the quiz.
  • Editing a quiz: There were no problems.

Overall, the prototyping and testing process was an important part of our final design. Conducting various user trials and evaluations helped us pinpoint parts of our design that were inefficient or confusing, and allowed us to continually iterate on our project based on user feedback.

Conclusion

Now, at the end of the term, after creating personas, conducting user interviews, and learning about gestalt principles, each of us has gained a better understanding of how to design products for real users. Along the way, the weekly reading list exposed us to how the design process works in the industry, and let us explore topics such as research and affordances in more detail. To sum it up, the past 12 weeks gave us a holistic overview of what it takes to design an app from scratch.

If we could redo the whole process, we’d place a greater emphasis on data, to make sure that every decision we made was backed by evidence that it was the best choice for the user. And, if this weren’t a course with weekly deadlines, we’d use the flexibility to give ourselves more room to explore and possibly pivot our idea.

For example, at the end of the course, we realized that much of our prototype depends on having existing quizzes and the assumption that students are willing to put in the effort to make them. But from our research, we learned that persuading students to create quizzes in the first place would be a challenge. Going forward, we’d focus more on validating that our design solves this root problem, since the app’s success depends on it.

CS 449 Human-Computer Interaction gave us a great opportunity to gain experience with the design process and how to think about our product from the point of view of the user. We’re looking forward to applying what we learned to our future work, whether that’s a personal project or in our careers!
