How I am A/B testing my Master’s courses to make them succeed — Part #1
Regularly teaching at various business schools, I often use digital education tools such as Socrative and Mentimeter, both to lead a more interactive, dynamic class and to track my students’ understanding of the content presented in class.
It seemed to me that the students both learnt more and enjoyed the class more. But I asked myself: are digital education tools really effective, and can I produce demonstrable evidence to back up my personal impressions?
Are digital education tools really effective?
To find out, starting on Monday the 26th of September, I will apply a technique I rely on regularly as a Product Manager to two cohorts of students (18 hours of class each): A/B testing.
Beforehand, it is important to state that this paper is not a formal research study and that I am not an academic researcher.
My hypothesis: using digital education tools improves both student satisfaction and grades. I will measure this with three metrics:
1) Course-end Quiz Grades (Multiple Choice Questionnaire)
2) Course-end Student Satisfaction Questionnaire
3) Number of questions asked in class (tallied throughout the course)
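Once the course-end quiz grades are in, the most direct comparison between the two groups is a two-sample t-test on the scores. Below is a minimal sketch in Python; the scores, means, and spreads are entirely made up for illustration, since the real data will only exist after the course:

```python
import random
from scipy import stats

random.seed(42)

# Hypothetical quiz scores out of 20 for each cohort (invented numbers).
group_a = [random.gauss(13.5, 2.5) for _ in range(49)]  # taught with digital tools
group_b = [random.gauss(12.5, 2.5) for _ in range(50)]  # taught without

# Welch's t-test: does not assume the two groups have equal variance.
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

The satisfaction questionnaire scores could be compared the same way, while the tally of questions asked per session is a count and would be better treated as such (e.g. compared per class hour).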
The test will take place at Audencia Business School in Nantes, France, a school where I’ve taught for the past 3 years.
The students in the course are Master’s students (Programme Grande Ecole) and form a diverse mix of French and international students. The cohorts are divided nearly equally:
- Group A (49 students)
- Group B (50 students)
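As a side note, group sizes of about 50 limit how small an effect such a test can reliably detect. A rough check, using a standard normal approximation to the power of a two-sided two-sample t-test (the effect sizes below are assumptions, not measurements):

```python
from scipy.stats import norm

def approx_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample t-test
    with equal group sizes, via the normal approximation.
    d is Cohen's d (standardized mean difference)."""
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5  # noncentrality parameter
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

# With ~49 students per group, a medium effect (d = 0.5) yields
# roughly 70% power; a large effect (d = 0.8) is detected almost surely.
print(approx_power(0.5, 49))
print(approx_power(0.8, 49))
```

In other words, a subtle improvement in grades may go undetected with these cohort sizes, which is worth keeping in mind when reading the December results.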
The course taught, Digital Growth, is 18 hours long. Each cohort attends either the morning or the afternoon session. Consequently, I will see both groups each day and teach the same program twice, which helps avoid bias.
I will strictly apply the following protocol:
What other tools should I use to teach?
What do you think about the given protocol?
The next article, including the results of this study, will be published in December.
By Emilien Nizon, Product Manager at Vente-Privée (formerly at Deezer) and regular teacher at various business schools.