UX Research and Outcome-Driven Innovation: A Case Study

The research team within Liberty Mutual’s UX department recently concluded a hefty study, and researcher Fallon Parker has a story to tell.

Fallon Parker
Digital @ Liberty Mutual
4 min read · Jun 12, 2019


Here in Liberty Mutual's Digital department, we know that being a responsible partner to our customers means deeply understanding their needs and goals. In pursuit of that mission, we have put a number of design-thinking methodologies into practice. During 2017 and 2018, dozens of agile teams went through a month-long transformation focused on customer-centricity and adherence to agile practices.

Design thinking is the gift that keeps on giving, though, and we knew there was an opportunity to pursue other customer-centric innovation processes. I'm excited to share more about our recent experience piloting the Outcome-Driven Innovation (ODI) framework.

The Outcome-Driven Innovation process, created by Anthony Ulwick of Strategyn, is appealing to a metrics-driven organization such as ours for a number of reasons. Some benefits of the framework include:

  • A method for leveraging qualitative and quantitative user data
  • A prescriptive form allowing for direct comparison across user needs
  • A direction for innovation strategy

Most promising of all, ODI boasts an 86% success rate for solutions derived from the process. Who wouldn't be jazzed?

Working across different areas within the company, I led a workshop teaching ODI. The workshop highlighted the popular Jobs-to-be-Done (JTBD) theory and a case study of its practical application that I experienced during rotations in Liberty Mutual's innovation incubator.

Fallon leading a workshop with her UX colleagues.

We first presented this workshop to the Boston Innovation Chapter, a local community of Boston-based Liberty Mutual employees interested in innovation and skill-building. The hands-on workshop was enthusiastically received: 100% of attendees agreed or strongly agreed that the training was relevant to them and said they would recommend it to a colleague.

Encouraged by the success of the workshops, I approached a Senior UX Design Director and a Senior Director of Product within the Digital Group to pitch the process and, more importantly, a pilot. With their support, I partnered with one of our squads focused on homeowners’ insurance — and they were eager to get started with the ODI framework.

The ODI process first calls for identifying the customer's jobs-to-be-done. (Anthony Ulwick's interpretation of a JTBD differs slightly from other definitions; in this context, a JTBD is "a task, goal or objective a person is trying to accomplish or a problem they are trying to resolve.") I worked with key stakeholders to determine the appropriate JTBD for this scope of work: get the right homeowners' insurance for me.

Once the JTBD is uncovered, it can be broken down into eight process steps: define, locate, prepare, confirm, execute, monitor, modify, and conclude.

Our process steps became:

  1. Determine that I need a policy.
  2. Receive an online homeowners’ insurance quote.
  3. Examine the details of the quote.
  4. Decide which quote to select.
  5. Verify that the information provided is correct.
  6. Adjust any information, if necessary.
  7. Initiate the purchase process.
  8. Finalize purchase and store documents.

For each process step above, we determined how customers define success at that stage. This is where the data from the recent benchmark study came in.

The Senior UX Researcher for the squad meticulously captured all customer verbatims from the benchmark study and entered them into a spreadsheet where she categorized them by theme.

From each pain point the customers articulated, we crafted outcome statements, also known as "customer-defined success metrics." For each process step of the JTBD, we wanted to know how a customer could be sure they had successfully completed that step.

Some of our design artifacts!

Squad members translated the individual customer verbatims (pictured above on sticky notes) into outcome statements. This ensured we kept the customer at the center of the process.

Once we had these outcome statements written on post-its, we affinitized them, grouping each statement under its appropriate process step of the customer JTBD.

Quantitative research

Up until this point, our research had been largely qualitative. To capture which outcome statements matter most to customers, we turned to a survey. Along with another researcher and a designer, I helped create a Qualtrics survey measuring each customer-defined outcome statement on two factors: importance and satisfaction. Let's just say I'm blessed that there is so much Excel expertise around me.

With this data, we were able to plot each customer-defined outcome statement on an opportunity matrix. This visual representation of the data allowed us to craft our innovation strategy. For example, one strategy is to focus on the outcome statements that survey respondents rated as highly important but were still dissatisfied with.
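To illustrate the matrix idea, here is a minimal sketch of how outcome statements could be bucketed by importance and satisfaction. The statements, scores, and cutoff below are made up for illustration; they are not our actual study data.

```python
# Sketch: bucketing outcome statements into quadrants of an importance vs.
# satisfaction matrix. Survey scores are assumed to be means on a 1-10 scale;
# the statements and numbers below are illustrative, not actual study data.

def quadrant(importance, satisfaction, cutoff=5.0):
    """Classify one outcome statement by its position on the matrix."""
    if importance >= cutoff and satisfaction < cutoff:
        return "underserved"    # important but unsatisfying: innovate here
    if importance >= cutoff:
        return "table stakes"   # important and already well served
    if satisfaction >= cutoff:
        return "overserved"     # well served but unimportant
    return "low priority"       # neither important nor well served

outcomes = {
    "Verify the information I provided is correct": (8.6, 4.1),
    "Examine the details of the quote": (9.2, 3.8),
    "Store my policy documents": (7.4, 6.9),
}

for statement, (importance, satisfaction) in outcomes.items():
    print(f"{quadrant(importance, satisfaction):12}  {statement}")
```

The "underserved" quadrant (high importance, low satisfaction) is exactly the region the strategy above targets.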

Another lens for evaluating the outcomes is the Opportunity Score of each outcome statement.

Opportunity Score = Importance Score + (Importance Score - Satisfaction Score)

Outcome statements with an opportunity score greater than 10 represent the greatest opportunity for innovation success. We had 31 outcome statements scoring above 10; of those, 15 scored above 11 and two scored above 12.
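The score calculation can be sketched in a few lines of Python. The statements and numbers here are hypothetical, not our survey data. (Ulwick's formula is often written with the satisfaction gap clamped at zero, i.e. importance + max(importance - satisfaction, 0); that matches the form above whenever importance is at least satisfaction.)

```python
# Sketch of the Opportunity Score calculation, using illustrative data.

def opportunity_score(importance, satisfaction):
    # As given above: importance plus the gap between importance and satisfaction.
    return importance + (importance - satisfaction)

# Hypothetical (importance, satisfaction) survey means on a 0-10 scale.
survey = {
    "Verify my information is correct": (9.1, 5.3),
    "Understand what the quote includes": (8.7, 6.2),
    "Store my policy documents": (6.4, 6.0),
}

scores = {stmt: opportunity_score(imp, sat) for stmt, (imp, sat) in survey.items()}

# Statements clearing the >10 threshold are the prime innovation candidates.
high_opportunity = [stmt for stmt, score in scores.items() if score > 10]

for stmt, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:4.1f}  {stmt}")
```

Sorting by score gives a ranked backlog of innovation candidates at a glance.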

Next steps for this team include scheduling and planning a Google Design Sprint around one or two of the highest-scoring outcome statements. And from there, of course, we'll iterate.

We hope to continue using this framework and grow in our understanding and application of it. Thanks for reading!
