Does A/B Testing Actually Work?

A/B testing has become widely used over the past three years; according to a recent study by Econsultancy, 67% of businesses use A/B testing to optimise their conversion rates.

However, 90% of A/B testers say that they’re unsatisfied with the results!

Does that mean that A/B testing just doesn’t work? Yes, except for the 10% of people it’s working for!

Of course, A/B testing isn’t a magic solution that waves its little wand the moment you take out a subscription with AB Tasty or Optimizely.

Take the example of Laurence, who works in the Marketing department of an online shoe shop; he discovered A/B testing in 2014:

One Monday in September, I made an important decision: to introduce A/B tests on my business’s website!
After some Google research on A/B testing services available, I made up a shortlist comprised of: AB Tasty, Kameleoon, Optimizely and VWO.
In the end I went for AB Tasty because our office is based in France; supporting the industry and all that!
The starter pack wasn’t expensive: only €29 for testing 5,000 visitors. Seemed pretty reasonable for a first test…
After a quick sign-up process, I arrived on a large, all-white page with just one visible button: “Create a campaign”. There was also a video and tutorial section to the right of the screen, but the explanations were a bit too “technical” for me at that point!
I was asked “What do you want to test on your site?”
Most of the online articles I’d read talked about how simply changing the colour of a Call to Action button can increase conversions by 300%, so I thought I’d start with something simple like that.
After just 3 hours, my credit of 5,000 visitors had run out and the AB Tasty interface was showing a very low rate of reliability, which meant of course I would need to do further testing to have a more reliable outcome.
But how many more tests would be sufficient? What volume of visitors did I need? It was all a bit vague, so I upgraded to the €99 pack to try and get some satisfying results from this first test.
The next day, the reliability level had gone up to ‘Green’, showing an improvement rate of 27%! I was elated and went straight to my Manager to give them the good news: a 27% increase in revenue!
Except what I hadn’t realised was that this 27% increase wasn’t visible in the end-of-month revenue report; in fact, it barely showed in the figures at all, and my Manager wasn’t at all convinced that any increase was down to my A/B testing…
In the end, my first experience with A/B testing wasn’t particularly conclusive. It made me realise that signing up to an A/B testing solution probably isn’t the way to go unless you know exactly what you want to test and how to manage the test correctly.

Laurence’s experience is a classic one: A/B testing solutions don’t offer any kind of magic fix for your conversion rates; they only offer a tool with which you can ultimately reach your goal.
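Behind a tool’s “reliability” indicator sits an ordinary statistical significance test on the two conversion rates, and that is why Laurence’s 5,000-visitor credit ran dry before the result meant anything. As a rough illustration, here is a minimal sketch of a two-proportion z-test in Python; the visitor and conversion counts are invented for the example, not taken from any real campaign or from AB Tasty’s actual method.

```python
# Hedged sketch: a two-sided two-proportion z-test, the kind of check
# an A/B tool's "reliability" indicator is based on. All numbers below
# are made up for illustration.
from math import sqrt, erfc

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return erfc(abs(z) / sqrt(2))                      # two-sided p-value

# 2,500 visitors per variant (a 5,000-visitor starter credit split in two):
# even a 20% relative lift in conversions can easily fail to reach significance.
print(ab_significance(50, 2500, 60, 2500))  # roughly 0.33, far above the usual 0.05 bar
```

With a baseline conversion rate of a few percent, detecting a modest uplift reliably typically takes tens of thousands of visitors per variant, which is why an early “Green” reading on a small sample should be treated with caution.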

Want to know the 5 most important things about A/B testing?
Read full post >>