A/B tests: what if they’re all bad?
I’ve worked in digital marketing in one shape or another for over 10 years, and it’s always blown me away how the vast majority of people I’ve met would die to protect the sanctity of the good ol’ fashioned A/B test.
Don’t believe me? How about a few links to back me up:
A really phenomenal white paper titled “Most Winning A/B Test Results Are Illusory,” published by Qubit (via blog.kissmetrics.com)
A 2013 study by eConsultancy & RedEye, surveying almost 1,000 client-side and agency marketers (via conversionxl.com)
Wired’s coverage of how A/B testing has had, and continues to have, a significant impact on Silicon Valley and beyond (www.wired.com)
But you know what? A/B tests suck. There, I said it.
It’s not the framework’s fault, though; it’s the way marketers conduct A/B tests: finite and static datasets, variations with little to no differentiation, and a deep lack of understanding of what drives a customer or user to engage.
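To make the “finite and static datasets” complaint concrete, here is a minimal sketch (my own illustration, not anything from the sources above) of one way tests go wrong: an A/A test where both arms convert at exactly the same rate, so any declared “winner” is a false positive. Checking a simple two-proportion z-test repeatedly as data trickles in (“peeking”) inflates the false-positive rate well beyond the nominal 5%. The conversion rate, sample sizes, and number of peeks are all arbitrary choices for the demo.

```python
import math
import random

def z_test_p(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a two-proportion z-test."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    # P(|Z| > z) for a standard normal, via the complementary error function
    return math.erfc(abs(z) / math.sqrt(2))

def run_aa_test(rng, rate, n_per_arm, peeks):
    """Simulate one A/A test; return True if ANY peek shows p < 0.05."""
    a = b = seen = 0
    checkpoints = [n_per_arm * (i + 1) // peeks for i in range(peeks)]
    for n in checkpoints:
        while seen < n:
            a += rng.random() < rate  # arm A conversion
            b += rng.random() < rate  # arm B: identical true rate
            seen += 1
        if z_test_p(a, seen, b, seen) < 0.05:
            return True  # a false "winner" was declared
    return False

rng = random.Random(42)
trials = 1000
# Both arms convert at exactly 3%, so every significant result is illusory.
fp_single = sum(run_aa_test(rng, 0.03, 2000, peeks=1) for _ in range(trials)) / trials
fp_peeking = sum(run_aa_test(rng, 0.03, 2000, peeks=20) for _ in range(trials)) / trials
print(f"false-positive rate, one look at the end: {fp_single:.1%}")
print(f"false-positive rate, peeking 20 times:    {fp_peeking:.1%}")
```

A single look at a pre-committed sample size lands near the advertised 5%; peeking 20 times typically multiplies that several-fold, which is exactly the kind of “illusory winner” the Qubit paper warns about.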
I don’t want to just whinge about it and cry about the lack of solutions out there, though. Instead, we’re going to build a platform using the latest technologies, statistical modelling, UI/UX techniques and more to help everyone increase their overall conversions and retention, for real.