How we got a 0% increase in conversions from an A/B test, and why we still considered it a win

Campaign Monitor
5 min read · Jan 26, 2016


Do you ever A/B test your email campaigns?

If so, you’ve probably had some good gains in the past. We certainly have.

But you’ll also know that not every test you run produces a big increase in opens or click-throughs.

And that’s OK, because even if there isn’t a significant increase in conversions, you’ve still learned something that will make your email marketing better in the future.

In this post, we’ll share with you the story of how we tested taking a more personal approach in our email campaigns and got a 0% increase in conversions, and why we still considered it a win.

The inspiration and hypothesis for the test

Whenever we write and publish a post on the Campaign Monitor blog, we send an email to our blog subscribers letting them know a new post is published.

The email looks a bit like this:

Over time we’ve done a lot of A/B testing on this email (including the template, copy, buttons, and more), so it’s well-optimized for conversions.

However, recently we’ve noticed a trend amongst other bloggers and content marketers towards taking a more personal approach to their blog notification emails.

This email campaign from Groove is a great example:

While we send a summary of the post with a big call-to-action button, Groove CEO Alex Turnbull sends a personal email to his subscribers telling them a new post is out and that he’d love them to check it out.

Groove’s email contains very little information about the post and instead focuses on creating a more personal dialogue between Alex and the subscribers.

This got us thinking: were we taking the wrong approach with our email? Were we missing out on a significant number of conversions by sticking with the ‘summary’ approach to our blog notification email instead of the more personal approach?

See, the thing is: even when you’re doing a lot of A/B testing and optimizing your campaigns, it’s easy to hit a local maximum.

If you’re continually making small changes to your campaigns (like button copy, colors, etc.) but still using the same template and methodology, then you’ll likely reach a point where you’ve gotten the best results possible from that approach (the local maximum).

But there is always a possibility that by taking a radically different approach, you can find yourself on another path that could lead to results much greater than the previous one (the true maximum).

Afraid that we might have optimized ourselves to a local maximum, we decided to test the personal approach against our summary approach to see if it would help us find a new maximum.
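To make the local maximum idea concrete, here’s a small illustrative sketch (a toy model, not anything from our actual testing pipeline). The made-up ‘conversion rate’ curve below has two peaks; a greedy loop of small A/B-style tweaks climbs to whichever peak is nearest, so only a radically different starting point reaches the higher one:

```python
import math
import random

# A made-up "conversion rate" curve over a single design parameter
# (say, how personal the email copy is). It has two peaks: a lower
# local maximum near x=2 and a higher one near x=8.
def conversion_rate(x):
    return 0.10 * math.exp(-(x - 2) ** 2) + 0.15 * math.exp(-(x - 8) ** 2)

def hill_climb(start, step=0.1, iterations=500):
    """Greedy optimization: keep a small tweak only if it wins the test."""
    x = start
    for _ in range(iterations):
        candidate = x + random.choice([-step, step])
        if conversion_rate(candidate) > conversion_rate(x):
            x = candidate
    return x, conversion_rate(x)

random.seed(1)
# Incremental tests starting near the first peak get stuck on it...
print(hill_climb(start=1.0))  # ends near x=2: the local maximum
# ...while a radically different starting point reaches the higher peak.
print(hill_climb(start=7.0))  # ends near x=8: the true maximum
```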

The two emails we tested

After publishing our recent post on the 5 elements of an effective marketing offer email, we created two different versions of the notification email for our blog subscribers.

The summary approach

For version A of the campaign, we crafted the email the same way we always do. We used the title of the post as the main heading, included a summary of the post as the body copy, and added a benefit-focused call-to-action button that links through to the post.

It looked a bit like this:

The personal approach

For the B version of the campaign, we adopted Groove’s approach and crafted a personal email to our subscribers. We spoke to them in a friendly tone, telling them about the new post we’d just published and asking them to click through and have a read.

It looked a bit like this:

The result

As mentioned earlier, the two versions of the campaign performed almost identically, and there was no noticeable uplift from the personal approach compared to our normal summary approach.
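We haven’t shared the raw numbers here, but if you want to sanity-check a ‘no difference’ result like this on your own campaigns, a two-proportion z-test on the click counts of each version is a simple way to do it. This is a generic statistical sketch, and the counts below are hypothetical placeholders, not our actual campaign data:

```python
import math

def two_proportion_z_test(clicks_a, sent_a, clicks_b, sent_b):
    """Two-proportion z-test: is the gap in click-through rates
    between version A and version B bigger than chance would explain?"""
    p_a = clicks_a / sent_a
    p_b = clicks_b / sent_b
    # Pooled click-through rate under the null hypothesis (no difference).
    p = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts, NOT our real campaign data: two variants with
# near-identical click-through rates.
p_a, p_b, z, p_value = two_proportion_z_test(
    clicks_a=412, sent_a=10000,  # summary approach
    clicks_b=405, sent_b=10000,  # personal approach
)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z={z:.2f}  p={p_value:.2f}")
# A large p-value (here well above 0.05) means the observed gap is
# within what chance alone would produce: no winner, as in our test.
```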

Why we still considered this test a win

Given that there was no increase in click-throughs from this test, you’re probably wondering why we’re taking the time to tell you about it.

The thing with A/B testing is that not every test you run will have a clear winner. Sometimes the version you were testing will lose, and other times there will be no change whatsoever.

But what you do get from it is insight into what works for your unique audience.

We learned that even though the personal approach works for Groove (and many others), our audience is completely different from theirs and evidently responds in a different way.

And with the knowledge that the personal approach doesn’t yield huge gains, we can continue using our existing summary approach for our blog notification emails, confident that we aren’t stuck at a local maximum that a more personal style would have surpassed, and that we’re getting the best results possible from our campaigns.

And for us, gaining that knowledge is a win.

In conclusion

Even though we didn’t get a noticeable increase in conversions from this A/B test, we learned that the summary approach to our blog notification emails isn’t necessarily the wrong one, and that knowledge helps us run more educated, more successful A/B tests in the future.

So when running A/B tests on your email campaigns in the future, don’t be afraid to occasionally test something a bit more radical than just your subject lines or button copy.

By testing completely different templates or approaches, you could not only get an instant increase in conversions but also find yourself on a new path that leads to a sustained increase over time.


This post was originally published on the Campaign Monitor blog
