In user-experience driven businesses, it’s the pulse that matters.

Michael Schofield
Mar 11

In Austin last week I taught a workshop on practical, low-fi, low-overhead strategic design and research techniques for understanding a service as an ecosystem and the fluid relationship between its systems and the user experience.

It doesn’t need to be said that even while we’re able to make the case among ourselves to invest resources in service and user experience design (aren’t those the same thing?), it’s easy from the outside looking in to see UX as an organizational luxury. You need the people, the time, and the money to stare at a whiteboard, perform research, and drink LaCroix, and, okay, even if robust discovery reduces the total cost of a project, it’s still largely hypothetical monopoly money to the folks paying for the project until you can prove it to their accountant.

I make the case that the only surefire way to improve the user-centricity of a company is to bake user experience design practice into the infrastructure, to make doing UX the way people work rather than a thing someone puts on the to-do list. This is also the most cost-effective approach. Yes, you need buy-in from the leaders who decide how people work, but once that’s earned, the low investment, the low overhead, and the value of the resulting dataset during decision making speak for themselves.

Emma Boulton, who has done really neat work on the logistics of improving UX maturity, explained the research funnel in so doing, and it gave me the vocabulary I needed to better introduce this thinking.

Operational user research is the kind of research I’m describing when I advocate for “baking UX into the system.”

If jobs-to-be-done interviews and market analysis describe exploratory research designed to define the problem, and if service blueprinting or customer feature/satisfaction analysis and user interviews define strategic and tactical research used to fine-tune the roadmap, then operational research can largely be defined by a dataset organizations probably already have: emails, NPS scores, feedback from users through various channels, complaints, logs or even anecdotes from customer support, and web analytics.

What’s usually lacking is the consensus to stash all of that somewhere where it can be used to observe user patterns and behavioral cues.
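
To make that concrete, here’s a minimal sketch of what one entry in that shared stash might look like. The field names are illustrative assumptions, not a prescription, and a row in a plain spreadsheet works just as well as code.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class FeedbackEntry:
    """One observation in the operational research catalog (illustrative fields only)."""
    logged_on: date       # when it was captured
    channel: str          # "email", "service desk", "tweet", "support ticket", "anecdote", ...
    verbatim: str         # what the user actually said or did
    tags: List[str] = field(default_factory=list)  # e.g. ["renewal", "findability"]
    logged_by: str = ""   # who captured it, in case there are follow-up questions

# A colleague overhears a hallway complaint and logs it:
entry = FeedbackEntry(
    logged_on=date.today(),
    channel="anecdote",
    verbatim="I could never figure out where to renew my card online.",
    tags=["renewal", "findability"],
    logged_by="service desk",
)
```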

The path from exploratory to operational research is a spectrum from synchronous to asynchronous research, which is the difference between observing the pulse and having your finger on it.

The research funnel describes the linear path of the discovery process from broad exploratory research on down, but my takeaway is that it also describes a spectrum: from synchronous, time-intensive, expensive research to asynchronous, time-unblocking, even automated, cheap research.

Imagine a journey mapping workshop, the sticky-note wallpapered conference room, and the team confined therein. Now, imagine folks at the service desk, folks behind the phone, folks monitoring social, in customer support, headscratchers looking at analytics dashboards, your colleague walking across campus and just overhearing a conversation, who then tag or log that interaction, that tweet, that email, that observation, that anecdote in a way that lets decision makers later sort it.

The error is to assume that the simplicity and curated expertise of the former are more valuable than the infrastructure and amateurishness of the latter. The former produces some insight once. The latter produces a consistent feedback loop that becomes exponentially more valuable as this nascent “user research catalog” grows over months and years.

To make the workshop happen you need people to conceive it, to facilitate it, and to convince someone it’s worth it; it’s an entirely different thing to then do something with that insight.

To seed this operational research infrastructure you need a spreadsheet, consensus to log this stuff, and a few minutes each week to groom it: weed out the not-so-useful feedback (“I love your site!”). Much of it is probably already being logged by your inboxes or your tools (like Zendesk), which means you have to figure out what criteria matter and then how to copy that into the spreadsheet; Zapier is a good fit for that last step.
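
For the curious, here’s a rough sketch of that filter-and-copy step as a script. The inbound records, field names, and grooming criteria are all hypothetical, and in practice a no-code tool like Zapier does the same job without any code.

```python
import csv
from datetime import date

# Hypothetical inbound feedback, e.g. forwarded from an inbox or exported from a tool like Zendesk.
incoming = [
    {"channel": "support ticket", "body": "I love your site!"},
    {"channel": "email", "body": "The renewal form errors out on my phone."},
]

def worth_keeping(item: dict) -> bool:
    """Grooming criteria: drop empty praise, keep anything that describes a behavior or a problem."""
    body = item["body"].strip().lower()
    pure_praise = body in {"i love your site!", "great job!", "thanks!"}
    too_short = len(body.split()) < 4
    return not (pure_praise or too_short)

# Append the keepers to the shared research catalog (a plain CSV standing in for the spreadsheet).
with open("research_catalog.csv", "a", newline="") as catalog:
    writer = csv.writer(catalog)
    for item in incoming:
        if worth_keeping(item):
            # Last column is left blank for tags, added during the weekly grooming pass.
            writer.writerow([date.today(), item["channel"], item["body"], ""])
```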

These aren’t mutually exclusive, of course. At the start of a new thing, exploratory research is your only option, but operational research makes returning to exploratory research more valuable. And when yours is a team of one (or might as well be, given the makeup of your workplace), the time you spend figuring out how to start this catalog will pay dividends.

In user-experience driven businesses, it’s the pulse that matters. As soon as the journey mappers leave the conference room, the finger is no longer on the pulse.


Clapping (👏) is a super way to brighten my day. Check out my podcast Metric: The User Experience Design Podcast (right here on Medium), and consider subscribing to my newsletter Doing User Experience Work. ❤ It goes a long way if you’re able to support this kind of thinking on Patreon.
