Case Study: ClassRanked

Daria
9 min read · Mar 19, 2023


“Your timeline for this project will be two weeks to develop the digital product for the ClassRanked company, culminating in a high-fidelity prototype: research, audit, iterations, and testing.”

Per the stakeholders’ instructions, we had only two weeks to complete the project, which required us to work efficiently and manage our time well while conducting research, user testing, and prototype building.

ClassRanked is building a community where students provide feedback on their professors and classes. We needed to develop an interface that entices students to use the app and participate by completing the surveys.

Objective: Encourage students to complete class surveys and raise participation.

Scope: Design an interface that entices students to complete the surveys, and identify incentives that will attract them.

My partner and I scheduled an interview with the stakeholder and asked additional questions so we could establish our research baseline and conduct our “How Might We” exercise.

How Might We

For our research, we wrote survey questions, posted the survey, and received 13 responses. The crucial finding was that, unfortunately, people rarely participate in surveys; among the reasons are that surveys are too long, lead to no change, and offer respondents no benefit. People also said they would like other students to be able to see their feedback about a course and professor, and that their primary motivation to complete surveys would be seeing teaching actually improve. Additionally, 69.2% believe that class surveys are an efficient way to improve classes.

Survey Results

We interviewed 7 people, current students or recent graduates ages 24–35. We learned that some students believe surveys are helpful, but only if they serve the student in their current course. Several students reported that they lacked the motivation to complete end-of-term surveys because the class was already over and no longer affected them. The negative pattern students observed is that the same professor with a bad reputation keeps teaching the same class, leaving students feeling that the university does not care.

Interview Results

We completed market research and identified two leading platforms as potential competitors. Based on our analysis of Qualtrics and SmartEvals, we drew several key takeaways that can inform the development of an evaluation software product for our stakeholders:

  1. Focus on user experience: Qualtrics and SmartEvals are known for their user-friendly interfaces and customizable survey templates. It’s crucial to prioritize the user experience in developing our product to ensure that it’s easy for customers to create and distribute surveys.
  2. Offer advanced analytics capabilities: Qualtrics is known for its advanced analytics features, including predictive intelligence and text analytics. Including similar capabilities in our product can differentiate it from competitors and give customers more profound insights into their survey data.
  3. Consider specialization: SmartEvals has a more specialized focus on the education market, offering solutions for course evaluation management, accreditation management, and faculty activity reporting. Targeting a specific niche or industry with our product can differentiate it from broader survey and evaluation software.
  4. Emphasize the importance of data-driven decision-making: The growth of the survey software market is driven by the increasing need for data-driven decisions. Emphasizing the importance of using survey data to inform business decisions in our marketing messaging can help position our product as a valuable tool for organizations.
  5. Embrace mobile technology: Incorporating mobile survey technology into our product can help meet growing demand and appeal to customers who prioritize accessibility and convenience.
Market Research

The main outputs of our empathy map were pains and gains. The pains: universities don’t seem to care, students never see the changes they recommend in surveys, and they get no benefit from completing one. The gains: students hope to improve the course, want their voice to be heard, and want to see feedback lead to improvement.

Empathy Map

We used two user personas to understand our users better, since our interviews surfaced two categories of people: those who were not excited about surveys and those who found surveys valuable. Two personas helped us produce more comprehensive research, improve decision-making, and build empathy, leading to a more successful product.

Persona 1
Persona 2

We analyzed the user journey of a person who believes in surveys and feedback but witnesses no change in her professor’s teaching methods. This persona has tried various approaches to enhance her learning experience but, at the end of the day, left the class dissatisfied with the outcome. We focused on understanding this persona’s needs and pain points. By creating a user journey map and gathering feedback from similar users, we aimed to identify key insights and make recommendations for improving the user experience. Our findings provide valuable information for ClassRanked as they strive to engage more students in taking surveys and to give users a more effective and satisfying learning experience.

User Journey

The user’s path can make or break the experience of navigating a digital product, and user flows are vital to ensuring that journey is smooth and successful. User flows visually represent the user’s experience, making it easier for teams to cooperate and communicate effectively. By mapping out each step of a user’s journey toward completing a specific task, such as purchasing a product or registering for a service, we could identify pain points and areas for improvement. We presented 3 user flows:

  1. An optional mini survey where students can share their thoughts on the course as it progresses; this way, a student can know whether their voice was heard.
  2. The End of Semester survey, where students can share their experience with the course and leave an open review for others to see.
  3. A flow demonstrating how individuals can redeem their incentives and search for a professor to check out the reviews.

Check User Flows closer on Figma!

Three User Flows

To build our low-fi prototypes, we started by sketching rough ideas on paper and in digital tools. We focused on quickly iterating on and refining these initial designs based on user research and our observations. The low-fi prototypes allowed us to test and refine ideas.

Low-Fi’s

Once we had sketched out the low-fi prototypes, we created mid-fi prototypes. These allowed us to test more complex interactions and better understand how users would engage with the app. We iterated quickly on the designs based on user feedback and refined our ideas before moving on to high-fidelity development.

Mid-Fi’s

My partner thought our app should have green as the brand/primary color, while I considered blue the better choice, so we researched both colors, with interesting results.

Green is associated with nature and the environment, which is particularly relevant for apps focused on topics such as sustainability, ecology, or outdoor education; it can be an excellent option for apps promoting growth, balance, and calm. The specific shade of green used also affects its effectiveness in an educational context.

The research on blue showed that an app with a blue theme had a lower bounce rate, indicating that users were more likely to explore and engage with the content. Users also found the blue theme more calming and professional than the green one, which aligns with color psychology research suggesting that blue is associated with trust, intelligence, and reliability. Overall, blue in educational apps can have a positive impact on engagement and retention by creating a calming, professional environment that helps users stay focused.

Based on our research, we chose blues as our brand colors: lighter shades read as more ‘friendly,’ while darker ones read as more somber. We also used aqua and reds for accents and some greens as vibrant colors.

Color System
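As a sketch of how a palette like this could be captured in code, here is a minimal design-token module in TypeScript. The structure mirrors the description above (a blue primary with lighter and darker shades, aqua and red accents, a vibrant green), but every hex value here is a hypothetical stand-in, not ClassRanked’s actual brand color:

```typescript
// Minimal design-token sketch for the palette described above.
// All hex values are hypothetical placeholders, not the real brand colors.
type HexColor = `#${string}`;

interface ColorTokens {
  primary: HexColor;      // main brand blue
  primaryLight: HexColor; // lighter, "friendlier" shade
  primaryDark: HexColor;  // darker, more somber shade
  accentAqua: HexColor;   // aqua accent
  accentRed: HexColor;    // red accent
  vibrantGreen: HexColor; // green used for vibrant highlights
}

const colors: ColorTokens = {
  primary: "#1565c0",
  primaryLight: "#64b5f6",
  primaryDark: "#0d47a1",
  accentAqua: "#00bcd4",
  accentRed: "#e53935",
  vibrantGreen: "#43a047",
};
```

Centralizing colors in one typed object keeps the design system consistent: components reference `colors.primary` rather than hard-coding hex strings, so a rebrand touches one file.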

Our stakeholders gave us options for the typography, and we carefully considered several factors when choosing type for the app:

  1. Readability: The primary goal of our app is to provide information to users, so it was essential to choose a typeface that is easy to read. We opted for the Inter typeface, which is legible on small screens and has a consistent weight across font sizes.
  2. Branding: Our app has a unique brand identity, and we had to ensure that the typography aligned with our brand guidelines.
  3. Accessibility: We wanted the app to be accessible to all users, including those with visual impairments.
Typography
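Accessibility claims like the one above can be checked mechanically. Below is a small TypeScript sketch of the WCAG 2.1 contrast-ratio formula, which can verify that a text/background color pair meets the AA threshold of 4.5:1 for body text (the sample colors in the comments are illustrative, not the actual palette):

```typescript
// WCAG 2.1 contrast ratio between two sRGB colors given as "#rrggbb".
// Relative luminance: L = 0.2126 R + 0.7152 G + 0.0722 B,
// where each channel is first linearized from its 0-255 value.
function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // sRGB linearization per the WCAG 2.1 definition
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white is the maximum possible contrast, 21:1.
// A mid blue such as #1565c0 on white comes out around 5.7:1,
// clearing the WCAG AA threshold of 4.5:1 for body text.
```

Running a check like this over every text/background pairing in the design system is a cheap way to catch combinations that would be hard to read for low-vision users.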

For our design system, I created a unique menu shape, and my partner designed icons to bring some fun to students going through the app and taking surveys. We followed the company’s guidelines and branding, including blue as our primary color. We also took a responsive design approach so that the app’s layout and style adapt seamlessly to different screen sizes and devices. I worked on adding animation to our design system: progress bars, statistics, and so on.

Design System

After creating a prototype, it’s essential to test it to ensure it meets the desired specifications and functions as intended. This is where user testing comes into play. My partner and I uploaded our prototypes to UseBerry to run user testing and see whether the prototype needed additional work and changes. Based on the low-fi iterations and our high-fi design system, we created prototypes representing three flows: Mini-Surveys, End of Semester Surveys, and Redeeming & Open Reviews.

Mini-Surveys, End of Semester Survey, Redeeming & Open Reviews

During the user testing phase of our project, our team received mixed results for the prototype. While some users found the interface intuitive and easy to use, others struggled with specific aspects of the design. In particular, we received feedback that certain buttons and menu options were not clearly labeled, leading to confusion and frustration among some users. Additionally, several users reported difficulty navigating between screens and understanding the overall flow of the application.

User Testing

As a result of these findings, I had to continue on my own and make several significant revisions to the prototype, including relabeling buttons and menu options, simplifying the overall navigation structure, and adding guidance and explanatory text throughout the application. I then conducted a second round of user testing with these changes implemented, which produced significantly more positive feedback.

Overall, while the initial user testing phase highlighted several critical areas for improvement in our prototype, I was able to use this feedback to make substantial changes that ultimately led to a more user-friendly and practical design.

Check final prototype on Figma!

Prototype

Thank you!
