Findings: How I am A/B testing my Master’s students to help them succeed — Part #2

Finally, the results!

After spending weeks A/B testing my Master’s students (as outlined in Part 1 of this article), the results are in!

To quickly recap: Teaching regularly in various business schools, I wondered whether digital education tools had an impact on both the students’ results and their perception of the class. To find out, I decided to practice what I preach and apply a method I often use as a Product Manager: A/B testing.

Group A could only raise their hands, both to ask questions during class and to answer the quizzes at the end of each class. Group B could both raise their hands and use a mobile service (Mentimeter), both to ask questions during class and to answer the end-of-class quizzes.


Results

November 21st, 2016 marked the last day of class with my beloved students. After 36 hours spent passionately teaching Product Management (18 hours with each cohort), and just before saying goodbye, I asked them all to answer a multiple-choice questionnaire. Together with the running tally of in-class participation I had kept throughout the course, this final piece of assessment would let me conclude the A/B test.

/ Student performance

Did either cohort appear to retain more information? No.

The final course-end quiz, given to both cohorts, contained 25 different questions focused on content seen in class. Some questions were simple, others trickier, including details the students were not specifically asked to remember (e.g. WeChat MAUs in April 2014). Both groups performed nearly equally well, averaging 16 correct answers out of 25 (Figure A).

/ Student engagement

Did either cohort feel they had participated more? Yes.

The final course-end quiz also asked students to self-assess their perceived participation. When asked “Have you ever asked a question in class?”, Group A answered “Yes” only 37% of the time, versus Group B’s 46% (Figure B).

Did either cohort interact more? Yes.

The running tally of questions asked in class that I kept showed Group B consistently outperforming Group A in every session. The digital education tool increased engagement, with limited cannibalization of “organic” hand-raised participation (Figure C).

/ Student satisfaction

Did either cohort perceive the course as more interactive, fun, and educational? Yes.

The final course-end quiz also gathered four points of subjective data: perceived interactivity, fun, and learning, as well as an NPS. On all statements, Group B over-indexed versus their non-digital Group A counterparts, especially on interactivity and fun (Figure D).

In terms of overall satisfaction, I asked the students the classic NPS question: “How likely is it that you would recommend this class to a friend?” Here again, the results were clear (Figure E).
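For readers unfamiliar with the metric, NPS is derived from 0-10 answers to that question: respondents scoring 9-10 count as promoters, 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. Here is a minimal Python sketch of that calculation, using made-up scores rather than the actual survey data:

def nps(scores):
    # Net Promoter Score: % promoters (9-10) minus % detractors (0-6)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses, for illustration only
group_a_scores = [7, 8, 6, 9, 5, 8, 7, 6, 9, 7]
group_b_scores = [9, 10, 8, 9, 7, 10, 9, 8, 9, 10]
print(f"Group A NPS: {nps(group_a_scores):.0f}")
print(f"Group B NPS: {nps(group_b_scores):.0f}")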


Concluding thoughts

While the limited sample size makes this A/B test unlike any I’ve run as a Product Manager, and leaves the findings qualitative rather than statistically conclusive, I found the experience very illuminating nonetheless.
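For instance, with cohorts of around thirty students (a hypothetical size; the exact headcounts aren’t stated here), a self-reported participation gap like the one in Figure B would not come close to statistical significance. A quick two-proportion z-test sketch in Python illustrates this:

from math import sqrt

# Hypothetical cohort sizes; the real headcounts are not stated above
n_a, n_b = 30, 30
yes_a, yes_b = 11, 14  # roughly the 37% and 46% "Yes" rates from Figure B

p_a, p_b = yes_a / n_a, yes_b / n_b
p_pool = (yes_a + yes_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
print(f"z = {z:.2f}")  # about 0.79, far below the 1.96 needed for significance at 5%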

I’ve enjoyed teaching at Audencia most notably for the school’s strong focus on student satisfaction — an element I haven’t found equally emphasized at every school. While I was surprised to find that the interactivity of the digital tools didn’t appear to lift student performance on the short quiz at the end of the course, I’m just as committed as ever to integrating this practice until it does.

I’m committed because the experiment demonstrated what I believe to be true: it is in our best interests as educators to meet the evolving expectations of students (expectations molded by an ever-more personalized and engaging world) because when we do, they will meet us right back — more interested, involved and excited than ever.

About:

By Emilien Nizon, Product Manager at Evaneos (ex. Deezer, Vente-Privée) and regular lecturer at various Business Schools.

Big up to the students who have been part of this test.
