What works for them may not work for you — Dangers of drawing parallels.

Soumya Jain
4 min read · Feb 5, 2018


Fact: There are tens of thousands of great products out there that have invested mind and money in solving complex problems for their users through numerous iterations and A/B tests. Hence, most of the problems a new product comes across in its early stages have already been solved by someone else.

Logic: Given a similar problem, what works for them should work for us! 🤔

Truth: NOPE (mostly) 😵

At Vedantu, we had built an MVP believing in the vision that every child should have access to the best teachers across the globe. The flow was simple: show students a list of our handpicked awesome teachers, let them choose and study with the one they feel comfortable with. Just like any marketplace does.

The MVP: ‘Marketplace of teachers’

The thing about MVPs is that you generally run with the first idea that comes to mind, because you want to put it out in the market ASAP. It’s the effort you put in after building the MVP that makes or breaks a product.

Within just a month of launching, we started to see something fishy…

The problem

A lower sign-up rate than expected, even though the product feedback was great. 😧

Analysing the data, we soon realised that our onboarding process of throwing a list of teachers at students to choose from was not working out. Although ours was a marketplace model, interestingly, drawing parallels from what works in the booming e-commerce market didn’t prove right.

So what went wrong? Pondering this question, we soon realised that unlike physical/tangible products, humans (teachers) are not a commodity. They most certainly can’t be judged and compared the way we compare products on an e-commerce website. Each teacher is equally important… heck, each of them is a product in themselves.

Valid point. But how do we bring this knowledge to our product? This led us to ask further questions:

Do students really want choices?

Why is a huge list of teachers needed?

What are students signing up for?

How do they see the value of the product in the current flow?

We spoke to some of our users to understand the answers to these questions. It required some deep digging and reading between the lines, but the conclusion was eye-opening!

It turned out that a list of teachers was not what students/parents were interested in; in fact, the list was triggering unwanted and unfair comparisons in their minds. What they really wanted was assurance that their requirements were understood and that they would get a great teacher accordingly.

Great! Let’s test it out. 👊

How?

Obviously, release the proposed product flow to 10% of our users to see if it works, then refine it and release it to everyone else.
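(For context, a rollout like that is usually done with deterministic bucketing, e.g. hashing the user id so each user consistently lands in one variant. A minimal sketch, purely illustrative and not our actual infrastructure; the user_id field is a hypothetical identifier:)

```python
import hashlib

def in_new_flow(user_id: str, percent: int = 10) -> bool:
    """Deterministically put ~percent% of users into the new onboarding flow."""
    # Hashing the user id keeps each user in the same bucket across sessions.
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100
    return bucket < percent
```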

But that will require a complete product building cycle.

Instead, we went back to the very basics and tested this flow just by calling. We called up a set of users directly, understood their requirements, and pitched them only one teacher, carefully selected by our academics team. (Talk about an MVP!)

The data showed that students who were given no choice of teacher converted more quickly and were even more content with our service! 💃💃💃

Voila! We had hit gold!

That triggered a product and design sprint over the next week or so to build a new onboarding flow, which we called “Find me a Teacher”, or FMAT for short.

The Solution

We asked students their requirements in a series of 4–5 simple objective questions and then ran an algorithm that matched those requirements against our pool of teachers and recommended the best-suited teacher. To further reduce friction, we offered to book a free trial session with that teacher immediately.
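The matching itself doesn’t have to be fancy. As an illustration only (the attribute names, weights, and scoring below are hypothetical, not our production algorithm), a simple weighted match over the questionnaire answers could look like this:

```python
from dataclasses import dataclass

@dataclass
class Teacher:
    name: str
    subjects: set
    grades: set
    languages: set
    hourly_rate: int
    rating: float  # 0-5, from past session feedback

def score(teacher: Teacher, req: dict) -> float:
    """Higher score = better fit for the student's stated requirements."""
    s = 0.0
    if req["subject"] in teacher.subjects:
        s += 3.0                      # subject match is the hard requirement
    if req["grade"] in teacher.grades:
        s += 2.0
    if req["language"] in teacher.languages:
        s += 1.0
    if teacher.hourly_rate <= req["budget"]:
        s += 1.0
    return s + teacher.rating / 5.0   # break ties with past feedback

def recommend(teachers: list, req: dict, k: int = 1) -> list:
    """Return the top-k teachers for a student (k=1 in the original FMAT flow)."""
    return sorted(teachers, key=lambda t: score(t, req), reverse=True)[:k]
```

The same recommend call with k set to 3–5 is essentially the later experiment of offering a small choice of best-matching teachers.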

Wireframing & Ideation
‘Atom’ asks for your requirements and shows you the best-matching teacher

The Result

  • Drastic increase in sign-up rates in a short span of time (from 2.5% to 9%). 😬
  • Fewer drop-offs between sign-up and the first session. 👍
  • Students loved the recommended teacher ~90% of the time. 🙂

Further experiments: We then tried to fine-tune the flow by running experiments such as:

  • Showing more than one teacher, i.e. a choice among the 3–5 best-matching teachers. This further increased the sign-up rate.
  • Letting students book a trial slot immediately.
  • Showing the teacher’s reviews along with other information.

Learning 🤔

Drawing parallels from other products needs careful consideration. Maybe they were trying to solve a similar problem, but is their target audience the same? What is the mood of the users when they browse your product versus the one you are comparing it with? Is the end goal of both products the same?

Think about the flow your users are comfortable with in the real world. If your flow is predictable, there is a smaller learning curve and less chance of users feeling lost.
