Testing, testing — how we created an assessment tool for Babbel users

Carina de Magalhães
Published in Babbel Design
Sep 12, 2023

When I joined Babbel in 2021 as a Product Designer, I didn’t know much about language learning. I only knew that I was really interested in the topic, and that Babbel was at the forefront of language ed-tech. So, I was very excited and curious about what I would be working on.

As a learning company, Babbel has learning at the core of its culture. People around you have in-depth knowledge of everything from didactics and learning design to data analytics and engineering. I’m always learning something new from my Babbel colleagues.

When I joined, I was the first Product Designer on the B2B team, and I worked there for 8 months. It was a great experience that gave me different perspectives on the company and the product. During that time, I was also part of a new cross-team project, and because of it, I ended up moving to a new team that soon became focused on the Assessment topic.

The 1st challenge

This is the story of how we built a Language Assessment tool from scratch through cross-team collaboration. Initially it was built for our B2B learners, but its value soon grew to benefit our B2C learners too.

One key aspect of learning a new language is building the confidence to apply it in real life. Our main goal was to develop a tool that empowers learners to practice effectively and gain confidence in their language skills.

User research consistently tells us that our learners want to be tested and have their skills challenged. We also discovered that this is something that motivates them to keep learning.

The first challenge we faced was getting the team and all stakeholders onboarded and aligned on why assessment matters and why it should be part of the learners’ journey.

As a metaphor for the process, think about the learning process we all experienced in school: we had classes, homework, and, at some point, exams to test our knowledge. From a didactic perspective, assessment is a very important part of any learning process.

The challenge is that Babbel users are not in school anymore, so we had to be very careful about how we communicated this, because we didn’t want to make them feel like they were back in a classroom.

A simplified visualisation of the learning process

So we kept asking ourselves:

“How might we create an Assessment tool that can track progress and provide feedback?”

Before starting the design, we needed to address several key questions, and here are just a few examples:

  1. What are the different types of assessments available?
  2. What is a CEFR assessment?
  3. What approaches do our competitors take in their assessments?
  4. Should Babbel Assessment be considered an official certificate, similar to those from Oxford or TOEFL?

Having answers to these questions was essential before starting the design process. I won’t answer all of them here, but I can say that after many meetings involving many stakeholders, we decided that the Babbel assessment would be a low-stakes test that doesn’t aim to be an official certification. It is meant to be a tool that helps you assess your knowledge of the language you are learning. The important decision was to use the CEFR (Common European Framework of Reference for Languages) so we could tell learners their level.

More challenges: A new layer of stakeholders

Unlike other companies I’ve worked at, Babbel adds the complexity of learning content to everything new we build. This brings an exciting challenge: aligning user needs with learning experts’ advice. Sometimes, in order to test and validate ideas, we have to ship things that might be too minimal, but in the end, we are always trying to balance what’s best for the users.

MVP approach

To ship the product faster, we first released a version that tested only grammar and vocabulary skills. After collecting some data, we saw that our users were interested in taking assessments, so we felt confident enough to invest in creating listening questions as well.

As this was a complex project, it was very important to collect real feedback from users as soon as possible. We needed to validate the logic of the test, its length, and, more than that, how to provide good feedback and results to our users. The big question was: would they actually use it?

After an extensive process to create the content of the test (kudos to our excellent learning experts), we started the design itself.

In the beginning, Yoyo, our brilliant Principal Designer, covered the B2C side, and I supported from the B2B side. At this point, it was very important to bring B2B needs and perspectives into the project.

For the first prototype, we created an onboarding screen to set expectations about the test. It was important to mention the duration of the test and which skills would be tested. Carl Krause, Content Design Lead, worked closely with the product designers to ensure the UX copy and content architecture were optimal.

We also had to create a different look and feel from regular lessons, but still following the Babbel visual style.

In regular lessons, users get immediate feedback on whether they answered a question correctly, but here we had to change that behavior because it’s a test.

User testing

Our research team ran eight unmoderated tests with non-Babbel users. Check the prototype here.

Insights after testing

💡They felt the results screen was confusing and couldn’t distinguish grammar and vocabulary skills.

💡They felt the information about the test was irrelevant.

💡They wanted to be tested on listening skills.

💡They wanted to check their answers and see correct and incorrect answers.

Solutions after first iteration

After all this feedback, I worked closely with engineer Miguel Guelbenzu and Senior Content Designer Lisa van Aswegen to solve these issues.

We made it possible to review all answers in a single list instead of one by one.

Before/After user testing

Introduction of a new type of skill: Listening comprehension

This feature was very important to our learners because it helps them evaluate their listening skills, which are crucial when learning a language.

Results

~1,900 users completed the Assessment in the first 2 weeks (organic)

~80% average completion rate

~90% of users checked their answers at the end of the test

Continuous improvement

As the MVP showed good results, we decided to dig deeper into the topic of assessment and feedback. Our world-class user insights team did extensive research on the topic and provided incredible material for me to work with as a designer.

For future iterations we are planning the release of other types of assessments as well, such as placement tests and small recap experiences in between lessons.

Conclusion

Designing for language learning is an iterative process that needs constant evolution. At Babbel, we strive to strike the right balance between user needs and the best practices recommended by experts. By incorporating user feedback and data-driven insights, we refine our designs to better serve learners, making their language journey more engaging, personalized and effective.

My time at Babbel has taught me invaluable lessons about designing for language learning. By placing the learner at the center, embracing the significance of learning, collaborating with diverse stakeholders, and maintaining an iterative approach, we can create meaningful and impactful experiences.

If you want to try it yourself, the Assessment feature is available to all learners studying English. You can find it in the Profile section of your Babbel app.

I would like to thank all the support from the Babbel Product Design, Research team, Learning content team and a very special thanks to the Leo team, you rock!
