Mapping Your Digital Experience Optimization (DXO) Strategy: Part 2 — A Case Study

Keith Aric Hall
Published in fraqtl · Jul 9, 2020 · 5 min read

The first article in this 2-part series discussed the framework for the DXO Toolkit I created. In this second article, I will walk through a case study illustrating how I used the framework on a project.

Download the DXO Toolkit Now

Recall that the toolkit includes two canvases: the Optimization Strategy Map and the Experimentation Plan Canvas. The strategy map is a high-level look at the experiences we want to improve and the tools, data, and research methods available to our team. The experimentation plan is a more granular look at how we can use those tools, data, and research methods to optimize a specific experience.

The Project

At the beginning of the year, our senior leadership team refocused on our browser extension. They set an aggressive goal: increase installs by 150% by the end of the year. This was ambitious considering our past efforts. I led design for the project.

Optimization Strategy Map

Sometimes knowing where to start is the hardest part, especially when tasked with such an ambitious goal. The Optimization Strategy Map makes starting easy by focusing on the experiences we want to improve and the resources we have at our disposal.


Defining the Experience

As discussed above, our focus was on the browser extension. Beyond the extension itself, several elements make up the experience. On-site placements promote the extension. Our landing page communicates its value, explains how it works, and links to the Chrome and Firefox web stores. The post-install page guides users on getting started and encourages account creation.

The Strategy Map helped us identify the collective experiences and the research methods we had at our disposal. It served as a launchpad for understanding which levers we could pull to affect the experience.

Identifying Measurements

We track several indicators to determine the health and performance of our experiences. Since the number of installs was the target metric identified by our executive leadership team, it was the first one on the list and our primary measure of success. Our data science team provided us with several insights and projections to help us get a clearer picture of how the extension was being used and how well it was performing against our key performance indicators (KPIs).
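To make this concrete, here's a minimal sketch of how a primary metric like daily installs can be tallied from a raw event log. The events below are hypothetical placeholders; in practice, these numbers came from our data science team's pipelines, not a script like this.

```python
from collections import Counter
from datetime import date

# Hypothetical event log; real install events came from our analytics pipeline.
events = [
    {"type": "install", "day": date(2020, 3, 1)},
    {"type": "install", "day": date(2020, 3, 1)},
    {"type": "visit",   "day": date(2020, 3, 1)},
    {"type": "install", "day": date(2020, 3, 2)},
]

# Tally the primary metric: installs per day.
daily_installs = Counter(e["day"] for e in events if e["type"] == "install")

for day in sorted(daily_installs):
    print(day, daily_installs[day])
```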

Qualitative Research

Usability testing was our primary source of qualitative research. We kicked off the project with user research around extensions and the people who use them. The resulting insights were unsurprising yet illuminating: we learned that low awareness, difficult discovery, and unclear messaging were big impediments to user growth.

Experimentation Plan Canvas


Energized by our findings, we set our sights on increasing installs. A limited marketing budget led us to prioritize on-site promotion to raise awareness of the extension. Examining the available data and research, we began generating hypotheses and potential solutions to the problems we had identified.

The Execution Plan

Once we had a list of solutions to try, we needed to understand the impact and effort of each option. Doing so empowered us to be scrappy considering our limited time and resources. Once again, our data science team came to the rescue. They ran projections estimating the total number of installs each method would drive by the end of the year.
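As an illustration of that exercise, here's a minimal sketch of ranking options by projected impact per unit of effort. The solutions named are from this project, but the install projections and effort estimates below are hypothetical placeholders, not our data science team's actual figures.

```python
# (solution, projected installs by year end, effort in person-weeks)
# The numbers are hypothetical placeholders for illustration only.
candidates = [
    ("Exit modal", 40_000, 2),
    ("Post-click offer placement", 25_000, 3),
    ("On-site banner copy", 10_000, 1),
]

# Rank by projected installs per person-week of effort.
ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)

for name, installs, effort in ranked:
    print(f"{name}: ~{installs:,} installs / {effort} wk = {installs / effort:,.0f} per wk")
```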

Outcomes

Have you ever been on a scavenger hunt where everyone searches for the hidden treasure to win the grand prize? They search high and low, looking in every nook and cranny, certain it must be well hidden. Then, to everyone’s surprise, the treasure was right in front of their faces the entire time. Our exit modal solution hid in plain sight!

The previous team in charge of the extension had run an A/B test in which the exit modal was one variant. Though it performed well, they pursued other, more subtle approaches. We were confident in the original test results, so we recreated the exit modal. The results were significant, increasing daily installs by 2.5X and outpacing our original projections!
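For anyone curious what "significant" means in practice, here's a minimal sketch of a two-proportion z-test, one common way to check whether a variant's install rate genuinely beats the control. The visitor and install counts below are hypothetical (chosen so the variant converts at 2.5X the control rate), not our actual experiment data.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test comparing the install rates of two variants."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts: control converts 400 of 50,000 visitors,
# the exit-modal variant 1,000 of 50,000 (a 2.5X install rate).
z, p = two_proportion_ztest(x1=400, n1=50_000, x2=1_000, n2=50_000)
print(f"z = {z:.1f}, p = {p:.2g}")
```

With samples that size, the p-value is effectively zero, meaning the lift is a real effect rather than noise.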

Feeling good, we kept moving forward with our plan, next testing a promotional placement for the extension in our post-click offer modal. Once again, our projections proved valuable: the test confirmed our hypothesis, doubling the original daily install rate.

Next, we set out to improve the messaging in our on-site banner, an issue we discovered in our initial research study. Focusing on the copy, we tested several variants. There was one clear winner, which we rolled into production. The final tally of all our experimentation yielded 4X the daily installs we started with.

Next Steps

We are very pleased with the results, but we still have a lot of work to do. The DXO Toolkit helped us define quantifiable optimization targets. We still have several hypotheses to test, and we will continue to improve the implementations that succeeded.

It’s important to remember that the DXO Toolkit is not a disposable resource. It’s not immutable. We can go back to it, updating and revising it when we get new data and research or when business objectives change. That’s what makes it so great! It’s a resilient tool that helps orient and align product teams. So what are you waiting for? Go ahead, try it now.

Download the DXO Toolkit Now
