14th Century Wall, Inca Roca Palace, Peru

A UX Research Portfolio — Problems Solved!

What follows are examples of work I have conducted, led, and been a part of over the past few years as a UX researcher, designer, and practitioner. In keeping with my belief that everything should be efficient, simple, and easy, this portfolio is a minimum viable product (MVP): one that covers the essentials of a good approach to research, design, strategy, and UX maturity.

(Please note: To maintain the confidentiality and privacy of my clients, I have altered work where necessary.)

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Case Study #1

Search Engine Marketing Campaign — Metric-Driven ROI

The following case study details a metric-driven approach to increasing new sales leads via an online lead-generation marketing form.

Initial Problem (As outlined by the stakeholder, a.k.a. the client)

An online marketing campaign was failing to generate the desired number of new customer leads, both by phone and via an online submission form. Users of the online form misunderstood its purpose and were using it more often to ask general questions rather than inquire about new products and services, as the design intended.

Problem Assessment

From a UX perspective, my first step was to review the current campaign, understand the most glaring problems and form a hypothesis which could be validated later through testing.

A heuristic review made it immediately clear that the problems with the online form stemmed from a lack of clear messaging and a weak call to action. The design also fell short on basic usability and visual design criteria, such as contrast and adequate levels of scale, which help the user quickly determine the most important elements and where to look first. My hypothesis, therefore, was that the content in the current design was being ignored or quickly skimmed, so the new design would have to focus on content first and foremost.

Here is an abridged version of the current design:

Original Design Sample (Edited for client privacy)

Following my assessment, and avoiding solution discussions, my next step was to speak with the client to understand their goals for improving the campaign from a quantitative perspective.

“Not having a measurement strategy is what keeps UX as a ‘nice to have’ as opposed to being an influential driving force within an organization.”

Measuring Our Success

Steps taken to align design goals with proven business success.

Step 1: Deciding what to measure
Quantitative first and foremost: I found it imperative to gather, through client and SME discussions, the right metrics, ones the client would love, and to deliver those in a way the client would be excited to share with her boss and up the chain. Good usability? That’s a given! It is what we do! My main focus, however, was proving to those fluent in business language that our work can provide that kind of value as well. It didn’t matter how we got there, only that we did. It also opened the door for us to truly lead in terms of what we wanted to design and how we wanted to test it.

Here are the metrics we landed on to measure during the project and subsequent launch:

  1. Double the number of users completing the form
  2. Increase good leads via the web form
  3. Increase good leads via phone
  4. Decrease bad leads caused by misunderstanding the intent of the campaign
  5. Increase the number of leads who become new customers
  6. Increase income as a result of the improved campaign

Step 2: Designing the test approach
We chose to perform an A/B (split) test on the final designs because that would allow us to gather data in a live, real-world setting. We agreed to create two versions of the design: Design A, with the form positioned vertically in the right-hand column, and Design B, with the form displayed horizontally at the bottom of the page, but well above the fold.
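The case study doesn’t detail how live traffic was split between variants, but a common approach is deterministic, hash-based assignment so that a returning visitor always sees the same design. Here is a minimal sketch in Python; the function name and variant labels are illustrative, not the client’s actual tooling:

```python
import hashlib

def assign_variant(user_id: str, variants=("baseline", "design_a")) -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing the visitor ID (rather than choosing at random on every page
    load) means a returning visitor always sees the same design, which
    keeps the A/B data clean.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Route a visitor; log the variant alongside any conversion event.
print(assign_variant("visitor-12345"))  # same output every time for this ID
```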

Step 3: Determining current baseline
Determining the current baseline was fairly easy because the client had analytics data from the current usage of the online form. This gave us something to work with and to compete against. Our directive was to double those numbers, which were tracked as conversion percentages.

Step 4: Measuring the baseline against new design(s)
When our designs were completed, we launched Design A alongside the baseline design and allowed for a random sampling of user engagement. Once a valid sample was collected, Design B was launched in a similar fashion and tested once more against the baseline.
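The portfolio doesn’t specify what counted as a valid sample, but a standard way to size each test round is a two-proportion power calculation. The sketch below assumes a purely hypothetical 2% baseline conversion rate and a goal of doubling it to 4%; the real figures are confidential:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed per variant to detect a change in conversion rate
    from p1 to p2 with a two-sided two-proportion test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = (z_a * sqrt(2 * p_bar * (1 - p_bar))
         + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical: doubling a 2% baseline conversion rate to 4% needs
# roughly 1,100+ visitors per variant before the result is trustworthy.
print(sample_size_per_variant(0.02, 0.04))  # -> 1141
```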

Design Approach

With these key metric-driven criteria and plans in place, we started on the design phase. Initially, my review of the marketing campaign showed a poor approach to experience design: content was missing a clear call to action, eye focus was weak across the page, and the lead-generating submission form sat well “below the fold,” meaning the user could not see it without scrolling.

Next, I researched the demographics of the online form in terms of who was currently clicking on the campaign. To clarify, these were people who found the online form via a Google search for either the company itself or terms related to the product. The link appeared at the very top of the Google search results as a paid advertisement, or “Ad,” similar to the example below:

This meant that those clicking on the link might not be the typical web-savvy user. To understand this better, I did some research and found that user demographics for “ad”-related clicks are fairly widespread:

http://blog.hubspot.com

While this data was a bit inconclusive, I did additional research to understand the kinds of people who would be interested enough in the product to click on such a link. After interviewing product SMEs to confirm, it became clear that we would be directing our design and call to action at users 55 and older, as well as people who may not be very web savvy but who were actively looking for a similar product and were more likely to click on the first item returned, regardless of whether it was an “ad” or not.

With our metrics determined and design direction agreed upon, we kicked off with a focus on content first, creating a simple but clear call to action that provided better instruction. For example:

Next, I cleaned up the form to include only the most important and necessary fields. I also moved the form as high on the page as possible.

Side note: There were no plans for a standalone mobile version, so the design was focused on website viewing but was nevertheless responsive to mobile devices.

Next, I began sketching a design and passed it along to the visual designer, who put the final skin on it, as seen in the following image:

Design B

One problem that arose was the discovery that the dev team was coding within a tightly controlled framework. This posed a challenge to the accuracy of our design across browsers, as well as to how well it would respond on mobile devices. With the dev team taking an “it’s close enough” approach, I worked closely with them to maintain the integrity of the intended design and to ensure that we captured the most accurate data during A/B testing.

Ultimately, we created two versions of the new design: one with the form extending top to bottom in the right-hand column (Design A) and one with the form at the bottom of the page, but well above the fold (Design B).

Results

Less than a month after launching the first test, we found that Design A increased click conversion (the number of people contacting the company for the intended purpose) by 103%, increased web leads (people using the online form to contact the company about the product) by 111%, and increased phone leads (people using the phone to contact the company about the product) by 86%. Design B returned similar numbers.

Metrics captured from A/B test of baseline vs new designs
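For readers who want to check the arithmetic behind lifts like 103%: a relative increase is (new − baseline) / baseline. The counts below are hypothetical stand-ins, since the client’s real traffic numbers are private:

```python
def percent_lift(baseline_rate: float, new_rate: float) -> float:
    """Relative lift of a new conversion rate over a baseline, in percent."""
    return (new_rate - baseline_rate) / baseline_rate * 100

# Hypothetical counts for illustration; the client's real numbers are private.
baseline = 40 / 2000   # 40 conversions from 2,000 baseline visitors -> 2.00%
design_a = 81 / 1993   # 81 conversions from 1,993 Design A visitors -> ~4.06%
print(round(percent_lift(baseline, design_a)))  # -> 103, i.e. a 103% lift
```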

In the end, we chose Design A as the new lead generation campaign. With this metric-driven approach, we delivered a more clearly understood user experience as well as clearly measurable improvements to the business. The client’s feedback following the project was that they had never seen such a dramatic increase in conversion rates with any of their previous campaigns.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Case Study #2

Agent Email Communication Improvements — In-Depth User and Business Research

A research project to understand and improve an internal messaging system used to forward customer-related emails to field agents.

Initial Problem (As outlined by the stakeholder, a.k.a. the client)

An internal messaging system used to forward customer-related emails to field agents had been neglected for some time. Messages were still being sent, but a review showed that the folders receiving the emails were filled with hundreds of unread messages. These email folders were assigned to field agent managers across the country, whose job it was to review the emails and forward them to the agents within their region. I was asked to review this problem and make recommendations for possible solutions.

Problem Assessment

Understanding this problem began with a review of the current state of the email messaging process and the creation of a process map similar to the one in the following image, in which I mapped each step of an agent email, from the time it was generated to when it found its way into a field agent’s inbox.

Creating this map required a number of meetings with those most familiar with the email process.

Research Approach

Using a screener, I found a good sampling of SMEs with regard to age range and technical proficiency, from tech savvy to admittedly not. In all, I interviewed 10 internal SMEs from across the company, 9 external field managers from across the country, and 4 sales agents in the field, asking them a number of questions related to their familiarity with this email process, their awareness of their full email folders, their process for sorting and delivering the emails to their agents, and the value these emails provided. I also asked SMEs to help me understand more clearly the interface used to read, sort, and deliver these emails to agents. The following screens depict how this was done:

Based on my interviews, I then documented which scenarios provided the most satisfaction down to the least, comparing scenarios as they should exist with how they currently existed. Next, I documented and prioritized the various email types, from most important to least, along with the potential opportunities for agents who received these emails and the risk of hurting customer relationships if these emails continued to be ignored.

From those interviews I delivered a current state assessment, including general information about how many work- or customer-related events trigger an email, the number of emails generated monthly, the number of “undeliverable” emails due to the recipient agent having no known email address or having been terminated or retired, how often emails go out daily, and so on. I then provided a good-versus-bad assessment of the current state, showing how the bad currently outweighed the good.

I then compiled all interview responses and collected the most telling quotes from all SME interviews, which became voice of customer (VoC) responses, further informing my research, my deliverable and the final recommendations.

Results

Following my conclusions, in which I identified additional gaps found during my research, I delivered a 26-page presentation detailing my findings, including next steps and a road map to get us there. I presented my findings and awaited further funding and portfolio prioritization in order to move forward.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Case Study #3

Internal Field Agent Intranet Redesign

A redesign of an internal news and information intranet site that provided field agents with access to the company’s portfolio of products and services, along with timely information about their clients’ accounts.

Initial Problem (As outlined by the stakeholder, a.k.a. the client)

The clients responsible for the intranet site came to UX saying that the “news” section on the current site was not large enough and that, as a result, users were not engaging with it on a regular basis.

Problem Assessment

With such a vague explanation and request for UX assistance, my first response was, “Why do you want the news section to be bigger?” The answer was simply that they felt it was not big enough on the page and believed it was a value-add for agents to know what was happening in the field as well as in the industry. Looking at the current design, it was clear why they wanted the news to be bigger. As you can see in this blurry (due to privacy concerns for the client) image of the current state design, the news section was not very large and not well contrasted with the rest of the site. They were also not calling the section “News,” but had created another term to identify it, based on a previous printed version of the news under the same name.

Original Design (content blurred to protect client privacy)

Beyond understanding why the news was so important, it was also clear that the news section did not immediately stand out, and that the overall design of the site was cluttered, providing no specific starting point or main area of focus for the eye.

Measuring Our Success

With little to go on from the client, I began asking the important questions: What value are we really trying to add with a new design? What are some of the major pain points users are currently facing? What is the benefit of a larger news section? Do we know whether people care about the news in the first place, and what will we measure to prove our success?

Research Methods

My first step was to determine what about the news was valuable to the client and to the end user, and what strategic or tactical goal would be served by making it more prominent. I created a survey to send out to the majority of agents who had access to the intranet site. Using the survey, we were able to ask pointed questions about overall site usage, the value of the site to the agent, the most important information users wanted from the site, and what value they might get from reading industry or company news.

We also learned through our research that the current site was believed to reduce, by at least a third, both the time it took to create a new customer application and the amount of time agents had to spend in front of their computer or mobile device. It also provided financial guidance and lead generation, enabled smarter sales decisions, and much more.

Step 1: Deciding what to measure
The goal for deciding what to measure focused on determining what value, if any, the more prominent news section would have for users, as well as any business impact. It was also important to understand whether we were focusing on the right areas for improvement. I also chose to validate whether the tools and information already available on the site truly decreased the time to create new customer applications, improved ease of use, and improved sales decisions.

Step 2: Designing the test approach
After conducting user interviews, we planned to test a new design with remote users. We would also conduct a card sort to determine which sections of the current site resonated most, in order to understand the site’s current IA and look for opportunities for improvement.
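Card-sort results are typically analyzed by counting how often participants group the same two cards together; strong pairs suggest sections that belong near each other in the IA. Here is a minimal sketch of that co-occurrence count, using made-up cards and groupings rather than the actual study data:

```python
from collections import defaultdict
from itertools import combinations

def co_occurrence(sorts):
    """Count how often each pair of cards is grouped together across
    participants; high counts suggest sections that belong together."""
    counts = defaultdict(int)
    for participant_groups in sorts:
        for group in participant_groups:
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    return counts

# Made-up groupings from three participants (not the actual study data).
sorts = [
    [{"News", "Events"}, {"Sales Stats", "Leads"}],
    [{"News", "Events", "Leads"}, {"Sales Stats"}],
    [{"News", "Events"}, {"Sales Stats", "Leads"}],
]
for pair, count in sorted(co_occurrence(sorts).items(), key=lambda kv: -kv[1]):
    print(pair, count)  # ('Events', 'News') 3 -> strongest pairing
```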

Step 3: Determining current baseline
For the baseline, we captured the number of clicks to the current news section, the types of news stories clicked most often, and overall clicks.

Design Approach

Reviewing the current design, it was clear that agents were navigating a very information-heavy interface with no clear starting point, focal point, or clear IA. In addition, the current design lacked the variation in scale and contrast needed to differentiate between sections.

After reviewing the results of our user interviews, I wireframed some simple sketches using Balsamiq to create a layout that provided greater focus on the areas we determined were most valuable to agents.

These were: quick access to the most important tools, which could be customized per user; quick access to their sales stats (My Progress); and a prominent area for news and upcoming events.

New design (content blurred to protect client privacy)

Lastly, we worked as a team on improving the overall site IA, creating a strategy for long-term updatability and ease of maintenance.

Step 4: Measuring the baseline against new design(s)
The new design was turned into a working HTML prototype and remote tested with 24 participants. Each participant was shown the current version followed by the new design to gather overall impressions, and was then walked through common tasks to determine ease of use. Improvements were made the same day and shown to subsequent test participants.

Results

The result of our work provided the client with ample data to compare against baseline and to track after the site went live. We maintained a follow up schedule with the client as well to ensure that the metrics were monitored so that any changes could be made along the way if need be. The client was pleased with the results and used the design as a showcase project at an executive conference.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Case Study #4

Large-Scale Application — Research and Design

A large-scale application portal enabling businesses large and small to become PCI compliant, a requirement for anyone that stores, processes or transmits credit card data.

Initial Problem (As outlined by the stakeholder, a.k.a. the client)

The client wanted to create an online wizard application for walking customers through the complex PCI compliance process, a standard security requirement for any business that accepts credit cards.

Problem Assessment

This project posed a number of challenges, such as understanding the complex PCI process and translating it to an online environment. Think TurboTax, but for credit card compliance. The next challenge was that I would be working as UX researcher and designer, and testing the application with users, while working alongside an agile development team on a tight delivery schedule. At the time, I was also new to agile team collaboration.

Research

Understanding PCI compliance required a lot of reading, learning, and talking to SMEs and customers alike. After about three weeks of learning, I met with a number of customers to talk about PCI, get their thoughts on the current compliance process, and learn what could make the process easier, quicker, and more streamlined. Next, I investigated competitor products, but since none existed, we would be designing a totally new concept.

Metrics

My goal for this project in terms of metrics was to examine the current PCI process, gauge an average time to completion, and work backwards from there to improve those numbers. Since PCI was traditionally a paper-based process that employers or their assistants completed on their own, my objective was to stay true to the process while creating an approachable interface that users could get through in a relatively pain-free way.

Design Process

I began the design process with a lengthy journey map to identify the steps users take in the process, as well as what happens on the back end. From there, we did a similar mapping of the IA to build a foundation for future improvements and to identify potential distractions causing users to abandon midway through the process.

Following this, we began wireframing and sharing the wireframes across the company to test basic functionality. I performed tests following the R.I.T.E. (Rapid Iterative Testing and Evaluation) methodology, which allows for faster, more improvised testing, with updates and re-tests happening more quickly than in traditional lab-based studies.

Results

To view a video of the application, follow this link:
https://www.youtube.com/watch?v=FEVgDppb5oM