How we design our A/B Test experiments at Syfe

Aditya Singh
10 min read · Jan 20, 2022


TL;DR: the A/B test experiment failed 👀 but in the process we learned a lot of interesting concepts. Detailed insights and learnings revealed at the end.

Team illustration from DrawKit

Investing has become the ‘New Normal’. In the 2000s, owning and using a simple air conditioner (AC) was a luxury; now it has become a necessity.
In the same way, being investment-savvy was a luxury a couple of years ago. But after the Covid-19 pandemic hit, everyone realised how important it is to invest and save; it is no longer a luxury but a necessity if you want to grow your wealth over time. And that’s exactly what Syfe, a digital investment platform, helps its users with.

Syfe is a digital wealth manager based in Singapore built for investors who expect more — greater transparency, smarter portfolios, and better investment outcomes.

Syfe: Investing & Cash Management For All

At Syfe, customer experience is one of the key focuses of the business, and the Product and Design teams are responsible for delivering a large portion of it. This case study is about an A/B test that we recently ran on one of our popular products, and what we learned along the way.

Addressing the problem 📌

We noticed that on one of our products, of all the people who visited the landing page, only a very small percentage entered our account creation funnel. We therefore wanted to increase the percentage of people entering this funnel and, eventually, the CVR for this product.

Context

We have many portfolios (products) at Syfe. People can choose between these investment portfolios based on their risk preferences, investment objectives and financial goals. This case study is about improving the conversion rate (CVR) of one such investment portfolio. For privacy reasons, we cannot name this portfolio, so let’s just refer to it as portfolio “Sigma”.

Now, Syfe’s ‘account creation funnel’ looks something like this.

Step 1: User lands on the “Sigma” portfolio page

Step 2: User clicks on the “Get Started” button and enters into the ‘account creation funnel’

Step 3: User answers some questions about their risk preference, investment goals, the amount they can invest upfront etc, which finally leads to the “create account” page

Step 4: User is required to fill in some details for their account, and after successfully doing so, user lands on the Syfe platform

Syfe customer journey

We wanted to increase the percentage of people entering the account creation funnel and eventually increase the CVR for our “Sigma” portfolio product.

Research, User Interviews and more 🧐

With the problem clearly defined, the Product and Design teams conducted user interviews to understand which issues users felt created friction in their decision to go ahead with the “Sigma” portfolio product.

Research@Syfe

The thing I like best about working at Syfe is the company’s research culture. Research is central to every project, and we conduct it regularly to understand our users better and enhance the product experience. It is a collaborative effort between designers and product managers to ensure we build empathy first-hand.
Even as a junior designer at Syfe, you can still be part of the ‘n’ interviews we conduct with our users in Singapore and Hong Kong.

Based on these interviews and our secondary research (going through Syfe’s forums and communities), we designed Syfe’s ‘Customer Theory’.

Customer Theory

Customer Theory is the study of how our customers make decisions: what their preferences are, what factors motivate them, and what things confuse them. In short, it is what we believe about our customers. Building a ‘Customer Theory’ is an iterative and continuous process.

Syfe’s Customer Theory

User Research Findings 📃

Based on the user interviews and our secondary research, we identified three key mental barriers that stopped people from moving ahead after landing on the product page.

  • First, people aren’t sure whether this is a popular product, i.e. are a lot of people investing in it right now? Is it a popular choice?
  • Second, compared to the different products they have seen in the past (and in the competition), is this really a better product? Is it more sophisticated or intelligent than the others?
  • Third, people are happy with the status quo; they might move forward with the product, but because there is no sense of urgency, they end up procrastinating on their decision

The Solution 🎯

Based on the research findings, we decided to address these three mental barriers with three different solutions, so that we could test which of them has the strongest effect on end users. At the same time, we also wanted to compare their performance against the original landing page. Hence, we decided to run an A/B test experiment.

A/B Testing

A/B testing, also known as split testing, is an experiment where two or more variants of an ad or a web page are shown to users, and statistical methods are then used to determine which variant drives more conversions.

A/B Testing

In A/B testing, we design and set up our control and challenger versions. The unaltered, unchanged version of your product is called the ‘control’, and a modified version of the control is called a ‘challenger’.

A metric is then chosen to measure the level of engagement from users, for example, the CTR (click-through rate) of a webpage.
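To make the "which variant drives more conversions" step concrete, here is a minimal sketch of a two-proportion z-test, one common way to decide whether a challenger's rate differs significantly from the control's. This is not Syfe's actual tooling, and the counts below are hypothetical:

```python
import math

def conversion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 120/5000 control vs 160/5000 challenger conversions
z, p = conversion_z_test(120, 5000, 160, 5000)
```

With these made-up numbers the p-value comes out below 0.05, so the difference would conventionally be called significant; with smaller samples the same rates might not be.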

Designing our A/B Test 🎨

How you design an A/B test is crucial for your end result and your business. A poorly designed A/B test can give you false winners: variants that look good in terms of performance but won’t have a major impact on the actual business goal.

For us, there were four variants of the “Sigma” portfolio landing page that we wanted to test in the A/B experiment. All of them are explained below.

Original

This variation is our ‘control’. Nothing was changed in this variation, i.e. everything remains unaltered, as it currently exists.

Control Variation

Popular

This is our ‘first challenger’. To address the first mental barrier, this variant was designed to make the product look like a popular choice by adding social proof, expert opinions, customer reviews, etc.

Challenger 1

Sophisticated

This is our ‘second challenger’. To address the second mental barrier, this variant was designed in a more sophisticated and intelligent way, presenting the USPs upfront, adding statistical and technical information, and so on.

Challenger 2

FOMO

This is our ‘third challenger’. To address the third mental barrier, this variant was designed for a FOMO (fear of missing out) effect, making the user feel this is something they need to act on sooner rather than later or they might miss out, playing with the body copy, etc.

Challenger 3

Other parameters of the A/B test are mentioned below. Some of the information has not been disclosed for privacy reasons.

Duration of the test

14 days

Devices

Mobile and Desktop

Traffic allocation

50% of the total incoming traffic distributed evenly among the 4 variants

Primary objective

Users creating an account (the actual goal)

Secondary objective

  • People entering the ‘account creation funnel’ i.e. clicks on “Get Started”
  • Increase scroll percentage
  • Reduce bounce rate

Results 🌈

The results were not what we were expecting (as usual). There was no clear winner: each variant had small wins on our secondary objectives, but to our surprise none of the three ‘challengers’ could actually be considered a winner against the ‘control’, since the control still had the highest number of account creations, which was our primary objective for the experiment.

So, does this mean that our experiment failed? I don’t think so.

  • For all three ‘challenger’ variants, the number of users who clicked the “Get Started” button and entered the funnel was about 2X that of the ‘original’ variant, which means all three ‘challengers’ created enough motivation to nudge users to the next step
  • For all three ‘challenger’ variants, the number of users who reached the bottom of the page was about 3X that of the ‘original’ variant, which means the new variants were more engaging and kept users glued to the page for longer
Performance with respect to Secondary Objectives
  • On the primary objective, i.e. account creation, the ‘original’ variant performed about 0.5X (50%) better than the three ‘challenger’ variants, and we are not sure why 🤔
  • The bounce rate was nearly the same across all the variants
Performance with respect to Primary Objective

That’s why I mentioned earlier how crucial it is to design your A/B test properly. Had we kept a simpler objective for the A/B test, say, recording clicks on the ‘Get Started’ button, the results would have been different and all three ‘challenger’ variants would be our winners.
But when the primary objective is considered, the ‘original’ variation, even though it had fewer ‘Get Started’ clicks, still had the highest percentage of account creations compared to all three ‘challenger’ variants.
So, “when is a winner not a winner?”
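This "winner that isn't a winner" situation is easy to reproduce with made-up funnel numbers (illustrative only, not Syfe's data): a challenger can double the entry rate into the funnel yet still lose on end-to-end conversion.

```python
# Illustrative numbers only, not Syfe's actual data.
funnels = {
    "control":    {"visitors": 10_000, "get_started": 400, "accounts": 60},
    "challenger": {"visitors": 10_000, "get_started": 800, "accounts": 40},
}

for name, f in funnels.items():
    entry_rate = f["get_started"] / f["visitors"]  # secondary objective
    cvr = f["accounts"] / f["visitors"]            # primary objective (CVR)
    print(f"{name}: entry rate {entry_rate:.1%}, CVR {cvr:.2%}")

# The challenger doubles funnel entries (8.0% vs 4.0%) but converts a
# smaller share of them, so the control still wins the primary objective.
```

This is exactly why the primary objective should sit at the end of the funnel, at the real business goal, rather than at an intermediate click.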

Learnings 📝

The A/B test did provide us with a lot of meaningful insights. Some of the learnings we gained from the experiment are shared below:

Insight 1: Funnel velocity

The story on the landing page should be powerful. This is what keeps the user glued to the page for a longer duration. It should also give the user enough funnel velocity to carry them easily to the end of the funnel.

‘Funnel velocity’ is the momentum with which someone enters a funnel, which should be enough to take them all the way through.

Which version would get you motivated, and why? Let me know in the comments section.

For example, consider two landing pages: one has the heading “Follow the next steps and get 100 dollars”, and the other says “Try it out”.
It’s quite clear that the first option will generate higher motivational intent, and the user will enter and move through the funnel with higher velocity. Hence, the motivational intent generated on the landing page should be enough to take the user through the funnel.

Insight 2: Component placement matters

The placement of components at different positions on the landing page does affect the percentage of users moving to the next step
(in this case, starting the ‘account creation funnel’)

Insight 3: Nature of the component matters

The nature of a component decides its interaction rate: some components, even if placed further down the page, will get more interaction than those placed above them, which means people are more keen to view these components. So the ordering of components on the landing page really matters.

Next steps 🎈

The next steps are to find answers to the questions that emerged from this A/B test, and then likely run another experiment based on the information we gathered from the performance of the three ‘challengers’. Stay tuned for the next phase of the experiment.

Wrapping up ✨

In the end, I would like to thank everyone who was involved in the project: the product and design teams, the dev team, Snehal Samant (Head of Product), Claire Yong (Head of Brand and Communications), Selwyn Lim (Head of Legal and Compliance), Pandora Leong (Senior HR Manager), and a huge shoutout to Lakshyya Mahalwal (Head of Design) for giving me the opportunity to pen down this case study.

Special thanks to Aakash Suri (Lead Product Designer) for helping me refine the whole case study.

About the author 🙋‍♂️

My name is Aditya Singh, and I’m working as a Product Designer here at Syfe. It’s a great company with wonderful people, and my experience here has been pivotal to my design career.

Product Design Team at Syfe

I have spent around 4 years working in the design industry and I absolutely love every bit of it. Being a designer is one of the most exciting professions I can think of. We craft the experience that a lot of people get to interact with every day. It is fun, challenging and rewarding at the same time. In the end, it’s all about creating a positive impact on the lives of end users.

Thank you for reading this case study. This is my first attempt at writing, so I would absolutely love your feedback.

Also, currently, we are hiring for multiple design roles at Syfe, so if you are looking for great opportunities, visit our careers page for more details.

Arrivederci!
