
Usability testing our onboarding journey

How we went about user testing the Indie onboarding experience, our process and what we learned.

11 min read · Jun 25, 2020
The mobile testing setup.

About Indie

Sanlam Indie is a financial services company with a mission to design better financial products and services that are more user-friendly, easier to understand and accessible to more people. The kind of financial services that people actually want to use.

Using some pretty smart actuarial thinking and tech, led by a user-centred design approach, customers can buy fully underwritten life insurance online in a matter of minutes.

Why did we decide to do user tests?

When Indie first launched to the market, the product was nowhere near what it is today. Like most startups, Indie’s plan was to launch quickly, with the best possible version at the time, and to learn and iterate on the product going forward. The launch would have been delayed by several months had we decided to perfect the UI with endless tweaks, build our own custom product admin system, design more product features, run further pre-launch testing, and so on...

After Indie launched and more people started using the website and buying insurance online, we realised the time had come to reassess the design and customer experience to better optimise the product. We wanted to better understand the thought processes, methods and context in which people were buying insurance, particularly Indie insurance.

User testing — what, how and where

Together with Chris (from UX design company How Might We), we conducted a series of in-house user testing and research sessions with people matching the persona profiles of Indie’s customers.

Each testing session lasted about 45–60 minutes, which allowed us enough time to cover the following:

  • A brief, casual one-on-one interview with each recruit. This allowed us to understand a little more about them — their family context, their understanding of life insurance, their personal motives for needing life insurance, and how they would go about buying it, especially online.
  • The actual test, where the recruit was asked to complete a set of tasks on the Indie website, usually focused on the onboarding process a client goes through when buying insurance. (We’d frame the scenario with prompts like, “How might you buy life insurance from Indie, given you landed here from a Google search result looking for life insurance?”)
  • If we had enough time, we’d also share some early prototypes of other designs we had been working on and wanted feedback on. These might include customer dashboard designs, iconography, visual designs, or product and feature names.

The lead-up to these testing sessions, and recruiting the actual people who would be involved, meant we needed to understand who was likely to buy insurance online and who Indie’s target audience was.

1. Defining personas

Before getting started with any testing, we first needed to identify a few Indie customer profiles to make sure we looked for testers who best matched our actual customers. The Marketing team had already gathered a lot of research on Indie’s target audience and demographics, which helped us understand who we should be looking for. We followed this up by running a few internal workshops with people from Marketing, Product and Customer Services to help us unpack, validate and better define these personas.

We identified and created three different persona profiles to help us visualise and understand who we were designing for. Once these were created, we had a much clearer idea of who would make ideal test users. We actually printed the personas out and stuck them on the office wall as a reminder of who our customers were. It was really interesting watching other people from the office gather around the personas and discuss each one. It helped galvanise the team around the people we were actually designing for, and provided a name and face for people to relate to. Eventually, you start referring to them by name instead of as “users”.

2. Conducting the tests

A screenshot from one of the recordings.

All the testing sessions were conducted at the Indie offices, which gave us a lot of control over the setup and process, ensuring each person felt relaxed and comfortable throughout.

We prepared one of the rooms in the office as the test room, from which we ran all the tests. The basic setup consisted of a table and chairs, a laptop for the test user, and a microphone and camera to record the candidate as they completed the test. For cases where we tested on a mobile device, we’d use a Mr Tappy rig, making it much easier to record the experience from the user’s perspective.

What is a Mr Tappy?
Basically it’s a mobile testing device that allows you to record a test user, from their own perspective, as they interact with your site or app on a mobile device.
www.mrtappy.com

We tried to make the room feel as friendly and welcoming as possible (given we had only recently moved into the new office and some areas were still a little bare), and kitted it out with some pot plants, snacks and water. During the session, the room was occupied only by the interviewer and the test user. The interviewer led the session by asking the candidate to perform certain tasks, taking notes, and asking questions throughout to better understand the tester’s thinking as they completed each task. Beyond that, the interviewer was there only to observe, never to assist.

The testing room setup.

We prepped the main boardroom as the viewing room, rigged up with a large monitor which live-streamed the test sessions as they happened. We did this through private YouTube streams and recorded each session too, which was useful for going back to watch specific snippets later and for making reference video clips of anything interesting.

The viewing room was occupied by the designers, watching and taking notes on the sessions as they happened. I’m talking notes on just about anything and everything — UI issues, interesting thoughts, observations, the user’s reactions, broken patterns, and anything else of interest that came to mind. We also invited other people from Indie to drop in and watch the sessions.

Overall, we conducted about 3–4 sessions per day (depending on availability). At the end of each day, we’d sit down together to sort through and categorise all the sticky notes taken that day. You end up with a ton of notes, which we then grouped by session and journey point/screen, along with a timestamp so we could go back and watch the video again if need be.

Example:
Jane — Onboarding/Personal Details Screen — Didn’t have an email address — 00:36
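
If you’re capturing these digitally rather than on sticky notes, each note maps neatly onto a small structured record. A minimal sketch in TypeScript (the field names are my own illustration, not what we actually used):

```typescript
// Illustrative shape for a single testing note. Field names are
// hypothetical; we captured these by hand on sticky notes.
interface TestNote {
  participant: string; // e.g. "Jane"
  screen: string;      // journey point/screen, e.g. "Onboarding/Personal Details"
  observation: string; // what happened, e.g. "Didn't have an email address"
  timestamp: string;   // where to find it in the session video, e.g. "00:36"
}

const note: TestNote = {
  participant: "Jane",
  screen: "Onboarding/Personal Details",
  observation: "Didn't have an email address",
  timestamp: "00:36",
};
```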

3. Prioritising insights

After a few days of testing, once all the tests were completed, we then sorted each note based on its severity/level of impact:

  1. Positives — any positive or pleasant experiences that the user remarked on or that we noticed.
  2. Paper cuts — issues that cause slight confusion or resistance, although nothing too serious on their own. Stack a few of these together though and you’ve got ‘death by a thousand paper cuts’.
  3. Speed bumps — issues that create a lot more confusion and uncertainty, taking the user significantly more effort and patience to get through.
  4. Roadblocks — exactly that. Issues that on their own are enough to make test users abandon the experience entirely, leaving them with little confidence and no desire to give it another go.
Sorting through the notes after testing.

The sorting process helped us define which issues we needed to work on immediately, what we should keep or do more of to delight the user, and which ‘quick-fix’ improvements could be made to eliminate some of the paper cuts and confusion.
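
Expressed in code, the triage is essentially a grouping step. A rough TypeScript sketch, assuming a hypothetical note shape with an added severity field:

```typescript
// Severity levels, from least to most impactful.
type Severity = "positive" | "paper-cut" | "speed-bump" | "roadblock";

// Minimal note shape for this sketch; field names are illustrative.
interface TriagedNote {
  participant: string;
  observation: string;
  severity: Severity;
}

// Group notes by severity so roadblocks surface first.
function groupBySeverity(notes: TriagedNote[]): Map<Severity, TriagedNote[]> {
  const groups = new Map<Severity, TriagedNote[]>();
  for (const n of notes) {
    const bucket = groups.get(n.severity) ?? [];
    bucket.push(n);
    groups.set(n.severity, bucket);
  }
  return groups;
}
```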

4. Preparing Pitch Docs

After a round of tests and sorting, we’d then prepare what Indie calls Pitch Docs to motivate for design updates and improvements to the product. At Indie, anyone can write a Pitch Doc, and we use these docs as the starting point of a more detailed outline that defines new product features and updates for product owners and the business.

A Pitch Doc sets out to identify:

  • a problem we feel needs to be worked on and solved for our users
  • why we must spend time on it
  • what evidence we have to validate the issue (e.g. user tests, analytics, customer service)
  • any early exploration on the solution (basic wireframes, docs, etc)
  • the estimated scope of the work
  • which people might need to be involved from a skills/technical perspective (e.g. Design, Engineering, Actuarial, etc)
  • any limitations or blockers that we know of
We used a Trello board to keep track of all design cards for the team to work through.

These docs are all gathered on a dedicated Trello board for review by the product owners and other functional leads. Then, based on prior business requirements and product features on the roadmap, the selected pitches are further defined and scheduled into cycles, worked on by teams assigned according to the skills needed and the scope of the work.

Insights

Collecting feedback from all the sessions.

While there were loads of observations that came out of these user tests, here are a few interesting ones that I feel are worth mentioning.

Consider the copy
Copy that’s written in a friendly ‘write-it-as-you-say-it’ tone goes a long way in creating a confident and friendly user experience, especially in an industry filled with confusing financial jargon and technical terms. Some of the test users even had a little smile or laugh at some of the wording — hat tip to the copywriters.

With this in mind though, there’s a time to be casual and a time to be serious — especially when dealing with life insurance. So be cautious about when and where you decide to lighten the tone.

Let’s call it ‘Bounty’… Everyone will understand that.
Some people may understand a specific word to mean something totally different from what was intended, often based on their prior associations with the word.

A good example of this was the word ‘Bounty’, which referred to the Indie investment that comes included with every Indie policy. Some people thought Bounty meant a referral, as if they had referred someone to Indie, or even worse, a bounty on someone else’s life should they die! I’m being totally genuine.

With good reason, it wasn’t long before we changed the word ‘Bounty’ to ‘Wealth Bonus’, which is now a heck of a lot clearer for people to understand, with much less confusion and explanation required.

Consider all user flows within a journey
In the Indie buying flow, depending on their level of education or income, a person may not qualify for a particular product (i.e. fully underwritten insurance). If that was the case, they would still qualify for other products which required no underwriting. The problem: if this happened during onboarding, the journey would still ask them all the underwriting questions as if they did qualify, rather than proceeding with the shorter, non-underwritten process. Once we noticed this, we updated the user flow to allow for a much smoother process, for both fully underwritten and non-underwritten cover. Something that in hindsight seems so obvious.
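
To illustrate the shape of the fix (the real qualification rules and question sets are Indie’s own, so everything here is a hypothetical sketch):

```typescript
// Hypothetical sketch of the branching fix. The real qualification
// rules are Indie's own; thresholds and names here are placeholders.
type Product = "fully-underwritten" | "non-underwritten";

interface Applicant {
  educationLevel: number; // e.g. highest qualification, coded 0-5
  monthlyIncome: number;  // in rand
}

// Decide which product the applicant qualifies for.
function qualifyingProduct(a: Applicant): Product {
  const qualifies = a.educationLevel >= 3 && a.monthlyIncome >= 10_000;
  return qualifies ? "fully-underwritten" : "non-underwritten";
}

// Route to the right question set, instead of sending everyone
// through the full underwriting questions.
function nextScreens(a: Applicant): string[] {
  return qualifyingProduct(a) === "fully-underwritten"
    ? ["underwriting-health", "underwriting-lifestyle", "quote"]
    : ["quote"]; // shorter, non-underwritten journey
}
```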

Not all customers have an email address
At the start of the onboarding journey, a user is required to enter their name and email address before continuing. There are several reasons for this, but importantly it’s so that Indie can contact you about your quote and onboarding journey should you save your progress — sounds simple enough, right?

Well, not really… while it’s less of an issue in more developed countries, not everyone in South Africa has an email address, which meant we were turning away a large portion of potential customers by making an email address a mandatory requirement.

This issue was quickly prioritised and a solution designed so that an email address was no longer mandatory. Now anyone can buy insurance with either an email address or a mobile number.
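
A minimal sketch of that rule, requiring at least one valid contact method rather than making email mandatory (the actual validation logic is Indie’s; this is purely illustrative):

```typescript
// Hypothetical sketch: accept either an email address or a mobile
// number, instead of making email mandatory.
interface ContactDetails {
  name: string;
  email?: string;
  mobile?: string;
}

function canContinue(d: ContactDetails): boolean {
  const hasEmail = !!d.email && d.email.includes("@");
  const hasMobile = !!d.mobile && /^\+?\d{10,}$/.test(d.mobile);
  return d.name.trim().length > 0 && (hasEmail || hasMobile);
}

canContinue({ name: "Jane", mobile: "+27821234567" });    // true
canContinue({ name: "Jane", email: "jane@example.com" }); // true
canContinue({ name: "Jane" });                            // false
```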

Design debt increases over time
We noticed over time, as more updates and features had been added to the website and app, that things had started to feel a little inconsistent in terms of styling, UI elements and patterns.

Not everyone knows their height
Another interesting one: not everyone knows their height and weight. Part of the underwriting process is calculating your health risk, which takes your BMI into account, and your height and weight are needed to work that out.
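
The formula itself is simple, weight in kilograms divided by height in metres squared; the catch is that it needs two numbers many people don’t know offhand:

```typescript
// Standard BMI formula: weight (kg) divided by height (m) squared.
function bmi(weightKg: number, heightM: number): number {
  return weightKg / (heightM * heightM);
}

bmi(75, 1.8); // ≈ 23.1
```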

Inconsistent patterns
The onboarding journey is a series of questions, each presented on a separate screen, with the user clicking ‘Continue’ after completing each one. After a few questions, you become pretty used to this pattern. But on certain screens, the pattern would change: if a question required only one answer, from a radio select input for example, the app would assume the form was complete as soon as an option was selected and automatically skip to the next screen as if ‘Continue’ had been pressed. This meant that if a user selected a radio button by accident, they had to click ‘Back’ to return to the previous question. It became frustrating because it broke the pattern you’d come to expect.
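
One way to keep the pattern consistent is to record the selection but always wait for an explicit ‘Continue’. A hypothetical sketch contrasting the two behaviours (none of this is Indie’s actual code):

```typescript
// Hypothetical sketch. In the inconsistent version, a single-answer
// question advances the moment a radio option is selected; in the
// consistent version, navigation always waits for 'Continue'.
interface QuestionState {
  answer?: string;
}

function goToNextScreen(): void {
  /* navigation stub */
}

function onRadioSelect(state: QuestionState, value: string, autoAdvance: boolean): void {
  state.answer = value;
  if (autoAdvance) {
    // Breaks the expected pattern: a mis-click skips ahead and
    // forces the user to click 'Back' to correct it.
    goToNextScreen();
  }
  // Consistent behaviour: do nothing here and let the 'Continue'
  // button drive navigation on every screen.
}

function onContinueClick(state: QuestionState): void {
  if (state.answer !== undefined) {
    goToNextScreen();
  }
}
```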

Conclusion

Overall we gathered plenty of useful insights, feedback and learnings, which helped guide our design decisions for future iterations.

While in-person testing is really valuable, it does require a fair amount of preparation, effort and time (especially if you’re a small design team).

If you’re pressed for time or looking for quicker and cheaper methods, there are plenty of alternatives that can offer great user feedback too: online (remote) testing tools, asking family and friends, or simply asking people within your own company if they wouldn’t mind being testers.

However insightful they are, be careful not to rely solely on the feedback from these in-person tests. Instead, consider the feedback in conjunction with other analytics, data and tests (e.g. Google Analytics, Hotjar, FullStory and other tracking and testing tools). This will help validate your decisions going forward and highlight areas of importance.

We’re always looking to improve our processes and are interested in hearing what methods you’re using for user feedback. Feel free to drop us a comment below.

Big shoutout to Chris and the Indie designers — Shawn, Leigh and Christine.


Written by Adrian Myburgh

Head of Design at Indie — Helping design better financial products and services that are more accessible, easy to understand and simpler to use.
