How Our A/B Tests Increased Cross-Sales by 41%

Dyninno Group
Published in Dyninno · 8 min read · Apr 8, 2024

Nikolajs Mirosnicenko, Senior Product Designer, Dyninno Group

What a simple UX tool — A/B testing — revealed about consumer behavior, and how we used it to our advantage to increase cross-sales by 41%.

For many companies, the biggest profit comes not from the product they sell directly but from cross-sales and upsells. No matter where we buy online, we will be offered something extra: a longer warranty, accessories, upgrades, and so on. In a cut-throat online marketplace where price is paramount and the customer is king, cross-sales are a safer bet for staying in the green, so companies offer their customers whatever they can. But how do you do it well while providing true value to your clients? This is Dyninno Group's story.

The Power of Cross-Sales: Understanding the Product

We are currently running A/B tests for Trevolution Group's (part of Dyninno Group) Oojo.com, an online flight booking website designed to bridge the world through innovative technology and personalized service. Trevolution also owns the Skylux Travel and ASAP Tickets brands, but A/B testing is harder to implement there because of the way those brands interact with their customers.

We offer two cross-sale products: the Travel Care service (TCS) and Price Drop Assurance (PDA). Price Drop Assurance does what it says: it guarantees a partial refund to the customer if the price of their flight decreases for one reason or another. Our focus in this article, however, is the TCS. It currently has three tiers: Minimum, Premium, and All-included, which differ in the number of benefits they cover. Keeping the tier names short matters: it is good practice in unified design (the concept of developing a design that delivers a consistent user experience across various platforms and devices) because shorter names work better on a cellphone's smaller form factor.

Uncovering Consumer Behavior with A/B Testing

To begin A/B testing, we drew up a list of things we'd like to test. It was a long list, which we prioritized via a subsequent client survey. The first actual A/B test was pricing, using the popular dynamic pricing method. After evaluating the results, we determined which price generated the most profit or the most sales, and used this data to select the optimal price. Of course, dynamic pricing requires constant monitoring and adaptation to changes in the market and in consumer behavior. After that, we got down to user experience (UX), which turned out to be the biggest game-changer.
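As a rough illustration of that evaluation step, here is a minimal sketch of picking the price point that maximizes revenue per visitor. The variant prices and per-visitor revenue figures are entirely made up for the example; they are not Oojo's data.

```python
# Hypothetical price-variant results: variant -> per-visitor revenue,
# 0.0 when the visitor did not buy. All numbers are illustrative only.
variants = {
    "9.99":  [9.99, 0.0, 9.99, 0.0, 0.0, 9.99],
    "12.99": [12.99, 0.0, 0.0, 12.99, 0.0, 0.0],
    "14.99": [14.99, 0.0, 0.0, 0.0, 0.0, 0.0],
}

def summarize(revenues):
    """Return conversion rate and revenue per visitor for one variant."""
    visitors = len(revenues)
    sales = sum(1 for r in revenues if r > 0)
    return {
        "conversion": sales / visitors,
        "revenue_per_visitor": sum(revenues) / visitors,
    }

# Pick the price that maximizes revenue per visitor (a reasonable proxy
# for profit when unit costs are flat).
best = max(variants, key=lambda v: summarize(variants[v])["revenue_per_visitor"])
```

With these toy numbers the cheapest price wins on revenue per visitor because its much higher conversion outweighs the lower price, which is exactly the kind of trade-off the evaluation has to surface.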

The initial TCS window design had the "No Travel Care" option selected by default. We then started displaying the TCS price per passenger rather than the total for the whole trip. This approach didn't perform well enough, so we dropped it.

We then switched to a design where none of the TCS options was preselected, so the customer had to make a conscious choice about TCS. All airlines try to upsell something to their customers, and people have become immune to these offers, so we wanted to accentuate the need for this choice: the page would display an error message if nothing was selected. This worked well, and cross-sales went up.
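The "no default, explicit choice" mechanic boils down to a simple validation rule. The sketch below is hypothetical (the option identifiers and error text are invented for illustration), not Oojo's actual front-end code:

```python
# Valid explicit choices; note that declining TCS ("no_tcs") is itself
# a choice the customer must actively make. Identifiers are hypothetical.
VALID_CHOICES = {"minimum", "premium", "all_included", "no_tcs"}

def validate_tcs_choice(selection):
    """Return an error message if no explicit TCS choice was made,
    or None when the form may proceed to checkout."""
    if selection not in VALID_CHOICES:
        return "Please select a Travel Care option to continue."
    return None
```

The key design point is that `None` (nothing selected) is rejected while `"no_tcs"` is accepted: the customer is never blocked from declining, only from skipping the decision.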

Ups and Downs of Customer Preference

We then started testing a version where choosing the "No TCS" option triggered a warning detailing the risks this choice poses to the trip and to the passengers themselves. This worked tremendously well, so we integrated it with the previous changes.

Then we tested a version where each part of the TCS featured explanatory text both in the tooltip and underneath each heading. This made the information richer but the page longer, especially in the mobile version. It didn't bring the desired results, so we reverted to the old version. It was visually obvious that we were trying to cram too much information into the window, making it harder to perceive. From a UX point of view this is undesirable because it creates cognitive overload for users.

We also ran several corridor tests within our company: for instance, on how best to display the amount of money the customer saves by choosing TCS, and even on how to write the numbers themselves, e.g., 1000 USD or 1k USD. The corridor tests favored the 1k version, which made sense since it's a common abbreviation in the US, where most of our clients are from. We started an A/B test but surprisingly saw a drop. This clearly signaled that we were putting too much cognitive pressure on our customers, and that simple checkmarks are easier for them to perceive than various sums.

Keep It Simple

The last version we ended up testing put the TCS features the customers cared about most at the top of the window, clearly signaling the differences between the plans from the very beginning. Before that, it was the other way around: the top of the window was filled with the similarities between the TCS plans, making their differences harder to perceive.

How did we find out which TCS features were most relevant? We used Hotjar's survey option: questions popped up while customers used the page, asking about the product price and specific parts of the TCS offer. The results clearly showed our customers favored "Cancel for any reason," "Change for any reason," and "Agency fees waived." Since our audience mostly reads left to right and top to bottom (an important distinction to make, trust me), we put the most popular features at the top, accentuating what people really need and truly want for themselves.

When researching our competitors, we saw that no one was using this specific approach. Our A/B test therefore broke the proverbial mold and again increased sales. Although I can't disclose the individual percentages of the respective changes we implemented, all in all they represent a combined 41.33% increase in TCS sales compared to the beginning of A/B testing.
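One way to read a "total combined" figure like this: when changes ship one after another, their individual lifts compound multiplicatively rather than simply adding up. The sketch below uses purely hypothetical per-change lifts (the real individual figures are not disclosed) just to show the arithmetic:

```python
# Hypothetical per-change lifts, e.g. +10%, +12%, +15%.
# These are NOT the real undisclosed figures; they only illustrate
# how sequential improvements compound.
lifts = [0.10, 0.12, 0.15]

total = 1.0
for lift in lifts:
    total *= 1 + lift  # each change multiplies the baseline reached so far

combined_pct = (total - 1) * 100  # combined lift over the original baseline
```

Note the combined result (about 41.7% here) is larger than the naive sum of 37%, which is why individually modest changes can add up to a headline number like 41.33%.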

Why A/B Test at All?

The answer by now should be obvious: it's better business. But coming from design, I find A/B testing tremendously interesting in its own right. After all, what's the point of drawing pretty pictures if they don't change anything for the company or the customer? New designers often concentrate on the user interface (UI) and the way it looks, not on how it functions. But when it comes to finding the right way to talk to your customers, experimenting with UX brings more meaningful results. Take Amazon, for instance: it certainly doesn't have a visually flawless UI, but it's the UX that ensures sales conversion.

The logistics of Oojo's A/B tests are straightforward. We begin with an idea, which is then approved, followed by the design phase. After that, our developer implements it in the website's codebase. Customers are then split evenly, with 50% directed to the existing sales page and the other 50% to the potential new version. This allows us to analyze, compare, and measure the data and any differences between the two.
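A common way to implement an even, sticky 50/50 split like this is deterministic hash bucketing. The sketch below assumes a stable user identifier and an experiment name; it is illustrative only, not Oojo's actual implementation:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "tcs_layout_v2") -> str:
    """Deterministically assign a user to 'A' (control) or 'B' (new design).

    Hashing the user id together with the experiment name keeps each user
    in the same bucket across visits, while different experiments get
    independent splits. (Hypothetical experiment name for illustration.)
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the assignment is a pure function of the id, no per-user state needs to be stored, and over many users the SHA-256 output's parity splits traffic very close to 50/50.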

This way of testing is not technically complicated, so we can launch our tests quickly: it takes a week at most to go from an idea to a running test. The main thing is that we can see what works and what doesn't. A/B tests help us offer our products in a way that delivers better value to our customers.

Keep Your Friends Close, and Your Competitors Closer

It is also important to keep an eye on what the competition is doing so as not to miss something new. We can sometimes observe our competitors doing seemingly interesting or even strange things on their platforms, which might indicate that they are running tests of their own.

Since our customers browse several pages looking for the best deal, it is unwise to go for something revolutionary with weird layouts and avant-garde design: the page would lose its trustworthiness, and it would become harder for the customer to communicate with it. So you often see designs repeated across the aisle, because what matters most to the customer is the price and how easy it is to perceive.

What’s Next for A/B Testing?

In the future, we plan to test not only TCS but other products as well, and not just with US customers but also in Europe, Latin America, and elsewhere. Each target audience and each product differs, and this is where A/B tests come in, helping us figure out which tailored combination will work best.

We plan to continue experimenting and might increase the number of cross-sale categories from two to four or five. We'll keep experimenting with illustrations and with UX and UI elements, so to learn more, watch this space and follow our channel for more product design insights.

Of course, the layouts will keep changing. But the underlying principle of saving space and pixels and making information as easy as possible to perceive and interact with will remain the same. Dyninno Group is adamant about providing true value to our clients through the ecosystem of our products, so it makes sense to keep finding out what matters most to them and to keep our offers relevant.

Trevolution Group, which incorporates the travel businesses of the Dyninno Group of companies, operates International Travel Network, ASAP Tickets, Skylux Travel, Dreamport, Oojo, and other travel brands, and has established itself as a market leader in the travel business, specializing in the visiting friends and relatives (VFR) segment. Companies under the Trevolution Group brand sold over 840,000 unique airline tickets and vacation packages in 2023, making it the fourth-largest travel consolidator in the US.

Dyninno is a group of companies providing products and services in the travel, finance, entertainment, and technology sectors in 50+ countries.