How eye tracking will take your onboarding experience to the next level

Need to turn an early-stage funnel into a consolidated, user-centric product? Involving users and stakeholders is the way.

Carolina Modica
Telepass Digital
8 min read · Jul 4, 2023


Written by Nicola Vagniluca and Carolina Modica.

We are about to share not only a step-by-step methodology, but also an approach that treats design as part of a larger system built around customer needs. We will focus on three main points:

  • working together without relying on personal opinions;
  • user research as the primary method for observing users’ interactions with the system;
  • exploring the possibilities of eye tracking.

But we don’t want to give away too much right now… enjoy the read!

MVP funnel onboarding: the very beginning

We had to improve the user experience of our in-app onboarding funnel in order to increase the perceived quality of the product, achieve some strategic company KPIs, and integrate a new commercial offer (the activation of a Telepass device via smartphone using NFC technology).

We produced an MVP, aka a Minimum Viable Product, with an optimized design and a major rework in Figma, so that it would be scalable for future evolutions.

As a result, a task force was formed: a cross-functional team composed of the following individuals:

All members of the task force.

We used the Scrum methodology to design and develop the MVP, with two-week sprints and recurring events (Daily, Refinement, Planning, and Retrospective).

The task force’s primary goal was to deliver an MVP within a challenging time-to-market. To do so, we held co-design sessions with the business team and created a draft flow tree.

Part of the high-level flow designed with the business department.

Then we finalized the flow design and engaged the devs to ensure the technical feasibility of the funnel views. Finally, we defined analytics tracking with the data team.
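To give an idea of what defining analytics tracking can look like in practice, here is a minimal, hypothetical sketch of a funnel event taxonomy in Python; the event names and properties are illustrative and are not the actual schema agreed with the data team.

```python
# Hypothetical tracking plan for the onboarding funnel: one event per
# funnel milestone, each with a required set of properties.
TRACKING_PLAN = {
    "onboarding_step_viewed": ["step_name", "offer_type", "platform"],
    "onboarding_step_completed": ["step_name", "offer_type", "platform"],
    "onboarding_error_shown": ["step_name", "error_code", "platform"],
    "onboarding_funnel_completed": ["offer_type", "platform", "duration_s"],
}

def validate_event(name: str, properties: dict) -> None:
    """Reject events that are not in the plan or are missing required properties."""
    required = TRACKING_PLAN.get(name)
    if required is None:
        raise ValueError(f"Unknown event: {name}")
    missing = [p for p in required if p not in properties]
    if missing:
        raise ValueError(f"Event '{name}' is missing properties: {missing}")

# Example usage with invented values:
validate_event(
    "onboarding_step_viewed",
    {"step_name": "offer_selection", "offer_type": "nfc_activation", "platform": "ios"},
)
```

A shared plan like this helps keep the native iOS and Android implementations aligned on what gets tracked.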

MVP development then began on the native iOS and Android platforms!

Part of MVP user flow with two different offer funnels.

Data, data everywhere

At this point, the task force had evolved into a stable product team, working iteratively on the product. So we came up with a strategy:

  • monitoring analytics (quantitative data) on a daily basis: this is critical to understanding the drop-off points in the flow where we can intervene to increase the conversion rate. In fact, the analytics data allowed us to see the drops between consecutive steps of the funnel (see the sketch after this list);
  • collecting business objectives: we specifically asked which services/offers needed to be made more visible to users and how many transactions (as a percentage compared to the previous year) we hoped to obtain.
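To make the first point concrete, here is a minimal sketch in Python, with purely invented step names and volumes, of how the drop-off between consecutive funnel steps can be computed from daily analytics counts:

```python
# Hypothetical daily funnel counts pulled from the analytics tool;
# the step names and numbers are illustrative, not real Telepass data.
funnel = [
    ("offer_selection", 10_000),
    ("personal_data", 7_200),
    ("document_upload", 4_900),
    ("contract_signature", 4_100),
    ("activation", 3_800),
]

# Drop-off between each pair of consecutive steps, plus the overall conversion rate.
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    drop = 1 - next_users / users
    print(f"{step} -> {next_step}: {drop:.1%} drop-off")

overall_cr = funnel[-1][1] / funnel[0][1]
print(f"Overall conversion rate: {overall_cr:.1%}")
```

The steps with the largest drop-off are the first candidates for intervention.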

In summary, our macro-goals for this modernization were the following:

  • make it easier for the user to sign a contract by providing a simplified, clear, and transparent user experience;
  • increase the overall conversion rate by making the subscription funnel more intuitive for our customers.

A live session of qualitative user testing was also scheduled at the same time. In fact, in addition to quantitative data, it is critical to observe how users interact with the system by analyzing their actual behavior in order to identify strengths and areas for improvement.

User test with eye tracking in action.

A 360-degree data-driven approach requires the correlation of quantitative and qualitative data.

Finally, the customer care team was crucial for our success. It is the department that directly gathers and tries to address the needs, problems, and roadblocks of customers. They know what works and what doesn’t thanks to phone calls, e-mails, and reviews from customers, and their involvement is always a valuable source of information.

Their weekly “top reasons for contacting us” reports gave us a boost: they were genuine, raw, and straight from the customer’s mouth.

Eye-tracking research: our bestie

Thanks to the quantitative analysis, we had identified the steps with drop-offs, and we knew the main reasons for contacting support.

But now it’s time to move on to user testing:

  • we recruited 12 prospects (non-Telepass users) who matched the characteristics of the Telepass customer base;
  • the Thinking Aloud methodology was used while participants interacted with the system, to observe any blocks as well as doubts, perplexities, and unclear elements on the pages;
  • we used an eye tracker, a tool that records users’ eye movements.

The eye tracker detects the reflection of near-infrared light on the cornea (near-infrared is invisible and completely non-invasive to the human eye) and records the direction of gaze using high-resolution cameras.

Advanced algorithms can then calculate the position of the gaze relative to the screen with a minimal margin of error and reconstruct a visual pattern for each individual page visited (Carter, B. T., & Luke, S. G. (2020). Best practices in eye tracking research. International Journal of Psychophysiology, 155, 49–62).
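To make that output a little more concrete, here is a minimal sketch of how raw gaze samples can be aggregated into dwell time per area of interest (AOI). The coordinates, AOIs, and sampling rate below are purely illustrative and do not come from the actual eye-tracker software.

```python
# Hypothetical gaze samples: (x, y) screen coordinates recorded at 60 Hz.
SAMPLE_PERIOD_S = 1 / 60
gaze_samples = [(120, 80), (130, 85), (410, 300), (415, 310), (420, 305)]

# Areas of interest on the page, as (x_min, y_min, x_max, y_max) rectangles.
aois = {
    "offer_title": (100, 50, 300, 120),
    "price_box": (380, 280, 500, 360),
}

# Accumulate how long the gaze stayed inside each AOI.
dwell_time = {name: 0.0 for name in aois}
for x, y in gaze_samples:
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            dwell_time[name] += SAMPLE_PERIOD_S

for name, seconds in dwell_time.items():
    print(f"{name}: {seconds * 1000:.0f} ms of attention")
```

Dwell times like these, combined with the users’ comments, are what end up in outputs such as the one shown below.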

Example of eye tracking output with user feedback.

The tests were designed to last up to an hour (to avoid cognitive overload) and were carried out with a prototype that faithfully replicated the subscription flow in the application; the user’s task was to choose between the offers and complete the subscription.

The moderator was asked not to intervene unless the user was extremely frustrated.

By using eye tracking, observing the interaction with the system, and asking detailed open questions, it was possible to identify the main problems of the analyzed flow.

Here are some of the major issues:

  • lack of knowledge, at the start of the flow, of the documents required to complete the signature;
  • difficulty comparing offers on mobile devices.

User insights were analyzed, expanded upon, and correlated with quantitative data. The quotes from users are very useful and interesting, and when contextualized correctly, they add a lot of value to the final storytelling (see image above).

Yes, but where do we start?

We had a lot of ideas about how to improve the onboarding funnel at this point, so we decided to hold a workshop to define high-level interventions and prioritize them before getting our “hands dirty.” The Product Team, Product Manager, Marketing Team, and Customer Care were all involved in this phase. The workshop was held fully remotely and was divided into two parts.

In the first part, we discussed the pain points that emerged (specifically, the main drops — analytics data — combined with the most relevant user testing insights), and then we had a brainstorming session. Finally, we grouped and voted on these ideas. Following that, we prioritized the most popular ideas.

We were split into two groups, and each had to score the interventions on a scale of 1 to 5: the product team, together with customer care, voted on user value versus development effort (the latter estimated by the developers at the table), while the product manager and the marketing team voted on the business value each intervention would bring.

The voting results populated a value-versus-effort matrix, allowing us to define an ordered list: the interventions with the best ratio of UX/business value to development effort received the highest priority.

Example of Value vs. Effort matrix.
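A minimal sketch of the prioritization logic, with made-up interventions and scores, could look like this:

```python
# Hypothetical workshop scores on a 1-to-5 scale: value as voted by the
# product, customer care, PM, and marketing groups; effort as estimated
# by the developers at the table.
interventions = [
    {"name": "document summary view", "value": 5, "effort": 2},
    {"name": "highlight promotions", "value": 4, "effort": 2},
    {"name": "activation status copy", "value": 3, "effort": 1},
    {"name": "full flow redesign", "value": 5, "effort": 5},
]

# Highest priority goes to the best value-to-effort ratio.
ranked = sorted(interventions, key=lambda i: i["value"] / i["effort"], reverse=True)
for rank, item in enumerate(ranked, start=1):
    print(f"{rank}. {item['name']} (value {item['value']}, effort {item['effort']})")
```

In practice the discussion around the matrix matters as much as the ranking itself, but an ordered list is what the backlog needs.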

Priority interventions include:

  • inclusion of a summary view of the required documents, which helps the user have everything ready to sign the contract;
  • highlighting promotions during the offer selection step and adding a direct commercial support channel to aid the understanding of the offers;
  • copy changes in each activation status step, with the option to click through to solve any problems.

Everyone’s valuable and diverse contributions enabled us to have a broader vision and anticipate potential future problems.

Let’s iterate!

The product managers created high-level requirements to formalize the workshop’s outcome, and the product owner was in charge of breaking them down into detailed tasks in the product team’s backlog. At this point we began working on the tasks, taking an incremental approach with several design waves to complete each requirement while continuing to gather insights (analytics data).

We used an iterative process divided into several stages: UX/UI planning was managed by a product designer and a UX writer, who used the previous insights and collaborated with the technical side to identify the best fully feasible solution. Customer care’s ongoing involvement in this iterative cycle was critical for gathering and sharing insights.

Below are some of the outcomes we’ve achieved as a result of our iterative and incremental approach (both from a user experience and a technical standpoint):

  • the dynamization of content in the offer selection step, which made content updates much faster to roll out and allows us to communicate our entire promotional offering to users in a more punctual and timely manner (see the sketch below the image);
  • advanced error handling in the activation status step, allowing us to warn the user of the reason for a block related to contract opening, with the option of a dedicated action;
  • the infrastructure change in the document loading step, which resulted in an improved UX and a reduction in approval times during contract opening from 5 days to around 2 days.
Offer selection step with new dynamic contents.
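As an illustration of the first outcome, a purely hypothetical client-side sketch of server-driven content with a bundled fallback might look like this; the endpoint, payload shape, and field names are invented for the example.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

# Hypothetical endpoint serving the offer-selection content.
REMOTE_CONTENT_URL = "https://example.com/onboarding/offer-selection.json"

# Content bundled with the app, used whenever the remote fetch fails.
BUNDLED_FALLBACK = {
    "headline": "Choose your Telepass offer",
    "promo_banner": None,
}

def load_offer_selection_content() -> dict:
    """Fetch the latest offer-selection copy, falling back to the bundled version."""
    try:
        with urlopen(REMOTE_CONTENT_URL, timeout=2) as response:
            return json.load(response)
    except (URLError, ValueError):
        return BUNDLED_FALLBACK

content = load_offer_selection_content()
print(content["headline"])
```

The real funnel runs on native iOS and Android, so this Python sketch only illustrates the idea of updating copy and promotions server-side instead of shipping them with every app release.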

End of the story

This case had some limitations: the initial flow was pre-defined, had never been tested with users, and was based on technical-business logic rather than user-centered logic.

At the end of the intervention, we observed a +7% increase in the overall conversion rate (CR) of the funnel compared to the previous year, with monthly peaks exceeding expectations. All of this led us to believe that the outcome was very encouraging.
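As a quick illustration of the metric (with invented numbers, and without assuming whether the +7% is relative or in percentage points), the comparison works like this:

```python
# Invented funnel volumes for illustration only.
entries_prev, completed_prev = 100_000, 20_000   # previous year
entries_curr, completed_curr = 100_000, 21_400   # after the interventions

cr_prev = completed_prev / entries_prev
cr_curr = completed_curr / entries_curr

print(f"CR previous year: {cr_prev:.1%}")                         # 20.0%
print(f"CR current year:  {cr_curr:.1%}")                         # 21.4%
print(f"Relative change:  {cr_curr / cr_prev - 1:+.1%}")          # +7.0%
print(f"Absolute change:  {(cr_curr - cr_prev) * 100:+.1f} pp")   # +1.4 pp
```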

We will continue to iterate in order to improve our method in the coming sprints, but we can already draw some lessons in retrospect:

  • synergy between different business groups (product manager, customer service, developers, marketing, and the research/design team) produced unprecedented new ideas. Bringing different knowledge and know-how to the same table created inestimable value for this project;
  • research with real users: observing, listening to, and talking with the system’s end users allowed us to shift our perspective from internal (company-centric) to external (user-centric). Of course, mediating between user insights and business objectives is always necessary.

This article was written by Nicola Vagniluca, Product Designer, and Carolina Modica, UX researcher, and edited by Marta Milasi and Gaetano Matonti, respectively UX Content Lead and Managerial Software Engineer at Telepass. Interested in joining our team? Check out our open roles!
