Design the right thing: the science of little videos & lightweight metrics (Part 2)

Aaron Powers
athenahealth design
4 min read · Feb 12, 2021


athenahealth provides network-enabled services for healthcare and point-of-care mobile apps to drive clinical and financial results for its hospital and ambulatory clients.

In Part 1 of this blog series, we shared the backstory behind Concept Validation, a new research method that helps teams engage more with customers during the usually silent period early in the design phase, before usability testing. In this second segment, we'll focus on the science: how we decide whether a concept is a big enough improvement to build.

In the words of one of athenahealth’s key customers, we all need to know more about “the little videos.”

Concept Validation helps us validate strongly informed, loosely held potential solutions to problems. What’s the secret? These tiny videos & lightweight metrics solve many problems early in the design process, which we explored in Part 1.

Lightweight metrics from users provide a data point early in the design process. The purpose is to check whether you're heading in the right direction: do our users value it? This isn't about polishing the design; it's about basic utility for our customers.

This isn’t just about gut feeling or subjective validation — this can be a science, predicting future usage from an early stage design artifact.

The science behind Concept Validation

It's easy, and thus tempting, for each team in a large organization to design its own unique survey. A huge advantage of standardizing these surveys is that you can build on industry best practices, turn the measurement into a science, and compare features to each other; these comparisons can make difficult choices much easier. Concept Validation has benefited from this approach: since athenahealth has run over 230 Concept Validation surveys, all using the same key metrics, we can compare designs to each other.

Concept videos tend to score high: new designs usually look better than what exists today, and a concept video can't capture many of the weaknesses that come into play once you implement something, since implementation is always imperfect compared to the vision. Our lowest average score for a new concept was 3.1, slightly above the midpoint of the scale. A 3.1 out of 5 sounds like it's above average, but just above average means many users rated the feature below average.

We’ve run >230 Concept Validations, so each dot represents one concept. While a 3.1 out of 5 may sound like a good score since it’s above the midpoint of the scale, it’s one of the lower scores for a new concept. The comparison is intended to inspire another iteration.
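As a rough sketch of how such a comparison works (the scores below are invented for illustration, not athenahealth's data), you can rank a new concept's average score against the history of past concepts:

```python
# Hypothetical average scores (1-5 scale) from past concept surveys; illustrative only.
past_scores = [3.1, 3.4, 3.6, 3.7, 3.8, 3.9, 4.0, 4.1, 4.2, 4.4]

def percentile_rank(score, history):
    """Fraction of past concepts that scored below this concept."""
    return sum(s < score for s in history) / len(history)

new_score = 3.5
print(f"A {new_score} average beats only {percentile_rank(new_score, past_scores):.0%} of past concepts")
```

With this framing, an "above the midpoint" score of 3.5 still sits near the bottom of the historical distribution, which is exactly the kind of signal that prompts another iteration.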

Before and after showing the video, we also ask a satisfaction question. This lets us identify concepts where satisfaction is not increasing as much as it could. (We found that our best concepts increased satisfaction by more than 1 point on average, but every company will see different results depending on where they start and on the new concepts themselves.)

We compare satisfaction before and after seeing the concept video. While negative scores imply the concept is worse than it was before, a small increase may also imply the design won’t be seen as a large change for users, and the comparison helps us aim for a higher potential improvement before moving to the next stage of resource investment.
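A minimal sketch of that before/after comparison, using made-up respondent ratings (any survey tool's export would work the same way):

```python
# Hypothetical per-respondent satisfaction ratings (1-5 scale),
# collected before and after showing the concept video.
before = [2, 3, 2, 3, 2]
after = [4, 4, 3, 5, 4]

# Average lift in satisfaction; a lift near zero suggests another design iteration.
lift = sum(after) / len(after) - sum(before) / len(before)
print(f"Average satisfaction lift: {lift:+.1f}")
```

In practice you would pair each respondent's before and after answers and look at the distribution of lifts, not just the mean, but the aggregate delta is the headline number teams compare against their own historical benchmark.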

Each question in Concept Validation is based on industry research: all metrics in our survey are based on questions designed outside athenahealth that apply well to our industry and customers. We've used 3 questions and the overall approach from the Technology Acceptance Model, which has been shown to predict actual user adoption of new technologies. Other questions are taken from UMUX-Lite, Jobs To Be Done's Outcome Driven Innovation, and Fidelity's open-ended question.

Each survey includes only a few relevant questions — the shorter the survey, the higher the response rate.

To design your own Concept Validation survey that works for your users and for your business goals, select just a few (e.g. <5) of the best survey questions that are used in your industry. You’ll be focused on balancing key factors for your product and business goals. Include something like overall satisfaction — you’ll ask that question both before and after showing the video. At the start, you might include more questions and plan to reduce the number of questions in later surveys — after we ran 50 concepts, we used several statistical techniques to optimize the survey questions and we dropped a few questions.
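One way to keep the survey standardized across teams is to define it once as data. The template below is only a sketch; the question wording and field names are invented for illustration, not athenahealth's actual survey:

```python
# Hypothetical standardized survey template; wording is illustrative only.
SURVEY = [
    {"id": "sat_before", "when": "before_video",
     "text": "How satisfied are you with how this works today? (1-5)"},
    {"id": "useful", "when": "after_video",  # Technology Acceptance Model style
     "text": "This concept would be useful in my work. (1-5)"},
    {"id": "easy", "when": "after_video",    # UMUX-Lite style
     "text": "This concept looks easy to use. (1-5)"},
    {"id": "sat_after", "when": "after_video",
     "text": "How satisfied would you be if this concept were built? (1-5)"},
    {"id": "open", "when": "after_video",
     "text": "What would you change about this concept?"},
]

before_questions = [q["text"] for q in SURVEY if q["when"] == "before_video"]
print(f"{len(SURVEY)} questions total, {len(before_questions)} asked before the video")
```

Defining the survey as shared data rather than letting each team write its own makes the cross-concept comparisons in the charts above possible.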

For each concept, create a short, friendly, ~2 minute video describing the essence of the change you’re planning to make — it doesn’t have to get into the details, the idea here is that you’re validating the concept before you invest too much into it — for more rigorous testing, you’d use another research method like usability testing. Plug that video into the survey and share it with your customers. Last but not least, reuse the same survey over and over again across features. If you keep the survey similar enough that you can trust comparisons, you can truly understand what scores are good for your company and set aspirational goals.

If you liked this article, you may also find our related articles in this series valuable on your journey to measuring the user experience: Design the right thing: how Concept Validation’s little videos & lightweight metrics help athenahealth validate concepts earlier in design, Why your company needs standardized metrics to accelerate development of successful features, Getting questions about the ROI of UX? 3 ways to refocus the conversation to opportunities, and Applying Machine Learning To User Research: 6 Machine Learning Methods To Yield User Experience Insights.
