Improving AARRR pirate metrics

The tribal stage of a startup is all about running the right experiments and quickly making changes based on the results. In addition to the experiment with daily knowledge channels (which aims to raise retention), I’ve looked at the Eduson.tv customer funnel (acquisition -> activation -> retention -> virality) and designed 3 tests which we plan to run.

1. ACTIVATION. Will video tests engage employees more than traditional text-based tests?

Our new onboarding will put employees straight into an online assessment of their knowledge. We want to find out their strengths and weaknesses and, based on them, assign individual learning paths. The 10 new interns who joined the company are all different, and so should their training be.

But if the test is too boring, users will quit. We currently have around 60% of users finishing the general test. To raise engagement we’ve decided to try video tests, which should create the illusion of talking to a person. Such interactivity has worked well with our video courses.

To launch this experiment we’ll need to: 1) write the script for the actors, 2) shoot the video, 3) randomly split at least 200–300 professionals into two groups, have them take the same test (its text version vs. its video version), and compare the results (test completion percentage, number of diplomas awarded after 4 weeks).
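A minimal sketch of how that comparison could look once the data is in (the counts and names below are placeholders, not real results):

```python
# Two-proportion z-test: did the video version of the test get completed
# more often than the text version? The counts here are placeholders.
from statsmodels.stats.proportion import proportions_ztest

completed = [78, 96]    # users who finished the test: [text, video]
assigned  = [130, 130]  # users randomly assigned to each variant

z_stat, p_value = proportions_ztest(count=completed, nobs=assigned)
print(f"text: {completed[0]/assigned[0]:.0%}, "
      f"video: {completed[1]/assigned[1]:.0%}, p = {p_value:.3f}")
# With only 200-300 participants the confidence intervals will be wide,
# so only a sizeable lift in completion rate will reach significance.
```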

2. ACTIVATION. Which learning process engages most?

In our experience of 100+ enterprise integrations, the process is as important for success as the content. At one transportation company, the HR manager assigned learning paths built for sales beginners to 25 regional heads of sales. None of them even started learning. Since then we have offered many ready-made learning tracks to ensure that the best and most relevant content is chosen.

Should we set deadlines or not? Assign learning tracks or not? Pull all employees into an online assessment and auto-assign courses or not? A proper experiment would require splitting employees inside the same company into groups, but we don’t have that luxury. So we’ll compare course completion ratios and the average number of diplomas received in 4 weeks across 5 groups of users (a rough sketch of the comparison follows the list):

1. individual customers who bought courses

2. employees with obligatory learning (who were assigned sets of courses with deadlines)

3. employees with voluntary learning, whose learning paths were recommended by HR

4. employees with voluntary self-paced learning, who studied anything they liked from the catalogue

5. professionals who took the online assessment and automatically received course recommendations
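Here is that rough sketch, assuming we can export one row per user with a cohort label, courses started and completed, and diplomas earned in the 4-week window (the file and column names are illustrative):

```python
import pandas as pd

# users.csv is assumed to contain one row per user with illustrative columns:
#   cohort             "paid_individual", "obligatory", "hr_recommended",
#                      "self_paced" or "assessment_auto"
#   courses_started    number of courses the user opened
#   courses_completed  number of courses the user finished
#   diplomas_4w        diplomas received within 4 weeks of getting access
users = pd.read_csv("users.csv")

summary = users.groupby("cohort").agg(
    users=("cohort", "size"),
    started=("courses_started", "sum"),
    completed=("courses_completed", "sum"),
    avg_diplomas_4w=("diplomas_4w", "mean"),
)
summary["completion_ratio"] = summary["completed"] / summary["started"]
print(summary.sort_values("completion_ratio", ascending=False))
```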

Ratio (1) should be high, since these people paid their own money, in contrast to employees getting the service for free from their company. (2) should also be high, as employees are generally afraid to disobey HR, but it is never 100%, even when KPIs and bonuses are linked to course completion, as one of our metallurgy clients has done.

It seems like (4) should be relatively low, but all of our most successful projects at companies with 50,000+ employees used this design. Why? My guess is that what people want to learn themselves differs from what their HR thinks they want or need to learn (3). A blue-collar worker from a steel plant once thanked us for a leadership course (which makes a lot of sense if he plans to become head of his steel-melting department; not that this was in HR’s plans for him). Finally, as we shift to online assessment, we would like (5) to be the highest, which would mean that a) users buy the idea that assessment should be linked to training, and b) our course recommendations provide relevant matches.

3. RETENTION. We want to know which course formats are most engaging and focus our production team’s efforts on them.

Eduson.tv has 5 course formats:

· video + text tests (the classic ‘talking head’, which was our first format)

· video + interactive tests (imitates speaking to the lecturer)

· animated presentations

· dialogue simulator (a tool for sales reps to practice scripts, pitches, and objection handling)

· business case (a simulation of situations that require management decisions)

My hypothesis is that the more interactive the learning is (requiring actions: answering questions, writing an email) and the closer it is to the working environment (imitating Skype calls, gathering information, sometimes intentionally false, from your colleagues, etc.), the higher the engagement will be. My assumption is that the ‘talking head’ is dead.

We don’t need to run an experiment to find the leading formats; it’s enough to analyze historic course completion. With 90,000 diplomas awarded this year, we have enough data points.
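A minimal sketch of that analysis, assuming a per-enrollment export with the course format and a completion flag (the file and column names are illustrative):

```python
import pandas as pd

# enrollments.csv is assumed to contain one row per (user, course) pair with
# illustrative columns: course_format (one of the five formats above) and
# completed (1 if a diploma was awarded, 0 otherwise).
enrollments = pd.read_csv("enrollments.csv")

by_format = (
    enrollments.groupby("course_format")["completed"]
    .agg(enrollments="size", completion_rate="mean")
    .sort_values("completion_rate", ascending=False)
)
print(by_format)
# If the hypothesis holds, simulators and business cases should sit at the
# top of this ranking and the classic 'talking head' at the bottom.
```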

These experiments (and the culture of testing hypotheses instead of relying on guesses or on what a random client said) should help us improve Eduson.tv’s value proposition and onboard more customers who will love the service (like the CEO of a logistics company who completed 176 courses in 4 weeks) and use it again.
