Eliminating Biases and Re-Meeting Your Customer

Andrew Maibach
Enova Studios
3 min read · Apr 6, 2018


It’s not difficult to spot when a front-end user experience is outdated, buggy, or fails to create an ecosystem of self-service. But identifying a problematic experience is not the same as identifying the problem, and it certainly does not always come with an obvious solution in which we, the creators, already know the perfect way to optimize the user experience.

Updating a front-end experience can be a grove of low-hanging fruit, but only if we acknowledge that we, as the creators, do not have all the answers. Our customers, rather, are the farmers who will yield the plentiful harvest, if we allow them to show us which solutions they prefer.

Conventional approaches to solving front-end experience problems usually encompass a variety of classic exercises in which we leverage the tribal knowledge of our most veteran colleagues: yesterday’s greatest successes and worst mistakes. Through their lenses, we can very quickly create what we believe to be optimized experiences and workflows. This approach has a quick turnaround and uses few resources, which in turn does wonders to appease stakeholders during the planning phase.

Making the Shift

At Enova, we’re starting to push back on this conventional approach in favor of an ecosystem of A/B testing. Our focus has shifted away from our experts, and the spotlight now shines directly on our users as we await the results of their organic decision making. Rather than creating a mockup to dictate our development, we’re using mockups as templates for our own hypotheses. This sets the stage for our testing grounds in production, where users choose the solution for themselves.
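The post doesn’t describe Enova’s tooling, but a minimal sketch of the kind of variant assignment such an ecosystem relies on might look like the following. The experiment name, variant labels, and `bucket_user` helper are hypothetical illustrations, not Enova’s actual code:

```python
import hashlib

# Hypothetical sketch: deterministically bucket a user into a variant so that
# repeat visits always see the same experience. Names are illustrative only.
VARIANTS = ["control", "simplified_registration"]

def bucket_user(user_id: str, experiment: str, variants=VARIANTS) -> str:
    """Hash the user and experiment name into a stable variant assignment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same arm of a given experiment.
print(bucket_user("user-12345", "registration_form_v2"))
```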

In using this approach, our methodology is continuously validated as we analyze the results of each individual change. Ideas and hypotheses our teams have held, backed by research and conventional wisdom, are being challenged and turned upside down, and we are deepening our understanding of which changes our users prefer. We know our customers want a fast and simple registration process. And while we think we know how to deliver that to them, our bias has never been clearer.

Accepting New Truths

We have learned, for example, that while we thought adding more information could only help users complete some challenging fields, our instincts provided too much information: the proposal overwhelmed users and caused more confusion. We have learned that while research dictates that users strongly prefer simple information up front before creating an account, this change provided only minimal value for our users. And what we thought would be a simple housekeeping change with marginal returns (removing our driver’s license field) was far and away the most impactful change we’ve made for our users; the field had been a significant pain point that impeded users from moving forward with their application.

These simple changes have shifted our registration completion rate by more than 10 percent, and we are not even halfway through our current brainstorm of ideas worth testing. We now have a framework and workflow for A/B testing toward optimal results, and in doing so we are ensuring that what we think are our best ideas do not yield the sour grapes of under-optimized, underperforming solutions.
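For illustration only, here is one common way to check whether a change actually moved a completion rate. The visitor and completion counts are made-up placeholders, not Enova’s data, and the two-proportion z-test is a standard technique rather than necessarily the analysis Enova uses:

```python
from math import sqrt
from statistics import NormalDist

# Made-up placeholder counts: (completions, visitors) for each arm.
control = (480, 5000)
variant = (545, 5000)

def two_proportion_z_test(a, b):
    """Two-sided z-test comparing two completion rates."""
    (x1, n1), (x2, n2) = a, b
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (x2 / n2 - x1 / n1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

z, p = two_proportion_z_test(control, variant)
lift = variant[0] / variant[1] - control[0] / control[1]
print(f"absolute lift: {lift:.3f}, z = {z:.2f}, p = {p:.4f}")
```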

This approach has been challenging up front because it requires longer timelines and an investment of resources. We spent nearly a month building an A/B testing infrastructure before any monetary or customer value was realized. But this foundation lets us test any and all of our wildest ideas against our users’ behavior, so we can confidently move toward a win-win world of harvest all around, driving business results by focusing on the only opinions that matter: our customers’.
