What is the first AHA-moment of my users? — a practical guide

Jakub Tutaj
6 min read · Feb 24, 2023


Preface

Are you losing the majority of your trial users? Are some users churning quickly after paying for your product? If so, it’s possible that they didn’t fully understand or appreciate the value of your product in the time they invested in it. This missing “aha moment” is a strong indicator that changes need to be made. Let’s explore ways to address this issue, starting with the basics.

Table of contents:
> First AHA-moment
> Finding Aha-moment — listen to the people
> Finding Aha-moment — hunting for the data
> You can make it complex, but please don’t for now
> Get started

First AHA-moment

Aha moment — a moment of sudden realization, inspiration, insight, recognition, or comprehension

https://www.merriam-webster.com/dictionary/aha%20moment

The “AHA moment” is a crucial point in the customer journey: the moment when customers understand how your product can benefit them. This moment can occur multiple times as customers discover additional value in your more advanced functionality.

For example, with an automation app such as Automator, a user’s first “AHA moment” might be discovering that Automator can trigger an email every time a form is filled out. If they can’t reach this initial “AHA moment” and validate that your product can meet their needs, they will likely leave. To avoid this, it’s important to focus on identifying and addressing the customer’s first “AHA moment”.

Finding Aha-moment — listen to the people

I wanted to share 3 qualitative research actions I took to get a more holistic sense of what users were looking for in the product I worked on:

  1. Ask a single question at the right time: What are you hoping to accomplish with <product>?
  2. Run loosely structured, 30-minute live onboarding sessions with new customers
  3. Analyze all available feedback sources of potential and new customers (support tickets, community, voice of the customer records, etc.)

The output of each action had to be the same, so it was later easy to combine the information I captured. Here’s a simplistic output example I had:
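As a stand-in illustration, here is a minimal sketch (in Python) of what a consistent signal record could look like. The field names and example values are hypothetical; any format works, as long as every research action produces the same one.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str    # where it came from: "in-app question", "onboarding call", "support ticket"
    customer: str  # anonymized customer or user identifier
    quote: str     # what the customer actually said or wrote
    struggle: str  # the underlying struggle you read out of the quote
    category: str  # broader area, e.g. "automation", "collaboration"

signals = [
    Signal("in-app question", "user-104",
           "I want an email sent every time a form is filled out",
           "connecting forms to email", "automation"),
    Signal("onboarding call", "user-233",
           "I couldn't find where to invite my teammates",
           "inviting collaborators", "collaboration"),
]
```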

What are you hoping to accomplish with <product>?

I found this one very helpful. Showing this question when users landed inside the product for the first time, and following up with an email if there was no response, worked miracles.

Output to aim for: 50–100 signals
Time to capture: depends on new user volume.
Performance: achieved a 10% chat message reply rate, 40% follow-up email open rate, and 3% email reply rate.
Time to summarize research output: 2.5–5 hours (assuming 3 minutes per signal)

30-minute onboarding sessions with new customers

New customers are fairly eager to jump on an onboarding (not sales) call, and typically they are not yet onboarded to your product at all. You can also try recruiting users who are not customers yet, but it’s an uphill battle; don’t start with it in the beginning.

Each session was loosely structured and run by our Customer Success Manager (CSM). It was a video session where we asked customers a little bit about their background, (again) what they hoped to accomplish, and what their current obstacles were.

Once we heard their initial responses, the CSM provided answers where applicable and then ran a quick demo of the basic functionality. Many times this sparked additional “oooh, that’s what I was looking for” reactions for things the product didn’t make easy for customers to find. Great signals to capture.

Then, after each session, we reviewed the recording and wrote down signals.

Output to aim for: 7 video sessions; each content-rich session can yield around 50 signals
Time to capture: 1–2 weeks for recruiting, 2–3 weeks for the scheduled sessions. 7 x 30-minute sessions = 3.5 hours
Time to summarize research output: 8–12 hours (at least double the session time)

Analyzing all available feedback sources of potential and new customers (support tickets, community, voice of the customer records, etc.)

We dove deep into the issues of potential and new customers. Ideally, filter out long-term customers and look only at information from potential and new customers.

Output to aim for: 50–150 signals
Time to capture: depends on your support ticket volume.
Time to summarize research output: 4–12 hours, assuming 5 minutes per support ticket to understand it and capture signals.

Summarizing what I’ve captured

I now had 150–300 structured signals, grouped into struggles, categories, areas, etc.
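To illustrate the summarizing step, here is a minimal sketch that counts how often each struggle and category appears across the captured signals; the tuples below are hypothetical stand-ins for the real records.

```python
from collections import Counter

# Each captured signal reduced to (struggle, category); in practice these come
# from the structured records gathered during the three research actions above.
signals = [
    ("connecting forms to email", "automation"),
    ("inviting collaborators", "collaboration"),
    ("connecting forms to email", "automation"),
]

struggle_counts = Counter(struggle for struggle, _ in signals)
category_counts = Counter(category for _, category in signals)

# The most frequent struggles are the strongest candidates for the first AHA-moment.
for struggle, count in struggle_counts.most_common(10):
    print(f"{count:3d}  {struggle}")
```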

I had also just traded around 5 days of research work for at least a quarter’s worth of insight to make my product better. Believe me, it was worth it. I kept reusing what I’d learned in dozens of documents and hundreds of discussions.

Next stop — product data.

Finding Aha-moment — hunting for the data

I already knew which struggles and areas had shown up in the “listen to the people” phase. I also had a sense of which of them were most strongly tied to customers understanding the product’s value.

Questions I asked myself:

  • How do these identified struggles map to actions that can or should be done in my product?
  • Are there specific related metrics that come to my mind?

I picked the 5–7 top candidate actions that I believed were helping customers onboard themselves and stick with my product.

Here’s the analysis our team did:

Retention analysis

We picked a cohort of customers that started paying e.g. 90–180 days ago, and analysed their retention.

Here’s an output example of this analysis (the table itself, comparing retention by action performed, is not reproduced here):

# of customers: 3000

The above example assumes there is one customer segment. Some numbers are there to make a point rather than being an exact calculation.
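To make the mechanics concrete, here is a minimal sketch of how such a retention comparison might be run. The file name, column names, and 0/1 flags are assumptions for illustration, not the exact analysis we ran.

```python
import pandas as pd

# Hypothetical export: one row per customer who started paying 90–180 days ago.
# did_action_1 ... did_action_5 are 0/1 flags for whether the customer ever
# performed that candidate action; retained is 1 if they are still paying today.
cohort = pd.read_csv("paying_cohort.csv")

candidate_actions = ["did_action_1", "did_action_2", "did_action_3",
                     "did_action_4", "did_action_5"]

for action in candidate_actions:
    did = cohort[cohort[action] == 1]
    did_not = cohort[cohort[action] == 0]
    print(f"{action}: retention {did['retained'].mean():.0%} (n={len(did)}) "
          f"vs {did_not['retained'].mean():.0%} without (n={len(did_not)})")
```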

From this short analysis, we had strong candidates to form the first AHA-moment set of actions: Actions 1, 2, and 5.

Payment correlation analysis

Next, we checked which of these 3 selected actions also indicated a good trial-to-paid conversion rate.

We picked a cohort of customers that started a trial e.g. 30–60 days ago.

Here’s an output example of this analysis (again, the table itself is not reproduced here):

# of trial users: 10000
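A similar sketch for the trial cohort; again, the file name and columns are hypothetical, purely for illustration.

```python
import pandas as pd

# Hypothetical export: one row per user who started a trial 30–60 days ago.
# Action columns are 0/1 flags; converted is 1 if the trial became a paying customer.
trials = pd.read_csv("trial_cohort.csv")

for action in ["did_action_1", "did_action_2", "did_action_5"]:
    performed = trials[trials[action] == 1]
    skipped = trials[trials[action] == 0]
    print(f"{action}: {len(performed)} users performed it, "
          f"conversion {performed['converted'].mean():.0%} "
          f"vs {skipped['converted'].mean():.0%} for those who didn't")
```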

Based on the above example, it seems we could find 2 great insights:

  1. Both the retention and the payment correlation analyses suggest that Actions 1 and 5 are great candidates for the set of actions required to reach the first AHA-moment.
  2. Something amazing happens when Action 2 is performed, but the product has some big friction in the onboarding process that you should tackle and remove, then analyze again in the future. Perhaps it’s a key action to be performed as well.

Don’t forget

We also had to ask ourselves: were the actions that customers should take to get onboarded even possible in our product right now?

This can be a huge contributor to finding the right AHA-moment, one that data analysis of existing actions alone won’t reveal.

Don’t omit this part.

You can make it complex, but please don’t for now

You can also further analyze e.g. combinations of actions.

You can also analyze less obvious actions that didn’t show up in customer signals.

You can also perform similar research starting earlier — with your website, in collaboration with your marketing team.

If you’re just starting, and your focus is product, save your hours.

The actions above are already a huge undertaking to get started with.

Is it very scientific? No.

Is it bulletproof? No.

Will it dramatically change your sense of the customer needs and increase your revenue? Yes 🙂

Get started.

Get started

I hope the above tips will help you better understand how to approach finding the first AHA-moment, form hypotheses, and build onboarding based on captured signals, not hunches.

Once you have your new onboarding, head on to tracking initial onboarding success.

Thank you

If you found this article interesting, share it with your colleagues who focus on business and product growth.

You can find me on LinkedIn.


Jakub Tutaj

Senior Product Manager at Softr. Passionate about all things product growth. Helping bring more MRR.