Usability Donuts: How and Why Software Engineers Should Speak With Customers

Michael Fleetwood
Published in CA ABU Life · Mar 28, 2016 · 8 min read

About two years ago, we instituted a program dubbed “Know Your Customer,” which requires everyone in the engineering organization to have at least two customer contacts per year. At the time, I thought we were aiming low. “Just two customer contacts?” I complained. As we instituted the program over the next six months, a couple of things became abundantly clear: two customer contacts was going to be a high bar for many of our software engineers, and requiring (aka “forcing”) them to meet with customers wasn’t going to get us there. We scrapped the program and went back to the drawing board. We had learned that getting to “know your customer” can’t be a forced activity. We needed to create a culture where everyone in our organization wants to interact with customers, where speaking with customers is seen as a privilege and an opportunity. This post explains a little more about where we are in that journey and how we got here.

Why should engineers get to know customers?

It's extremely beneficial for our minds to focus, get into the flow, and crank on our work. At least, that's the retort I often get from Product Owners when I encourage our engineers to meet with customers: POs want to see their teams in "the flow" and cranking out code, and customer discussions are a distraction from that. Of course, the downside of staying perpetually focused is that we lose perspective on the big picture and the reason for our work. The work becomes a task without meaning, which leads to a worker without inspiration.

Engaging directly with our end users and customers, for as little as an hour, can instantly reconnect us with what’s important to our users and how they perceive and use our products. We gain immediate clarity in our priorities, feel a sense of empathy, and develop a personal drive to help alleviate a user’s pain. And, we remind ourselves that our end users are often very different from us, with different organizational constraints, beliefs, and processes.

Research by Jared Spool shows that when the entire team (product managers, product owners, developers, testers, UX) regularly participates in first-person observation of users interacting with their product, it directly correlates with an improvement in the product's user experience. The same research found that you do not get this result from just reading a report of usability-testing findings or from having only the UX person observe; everyone has to be involved. (Note: Spool's recommendation is two hours of direct user exposure at least every six weeks for each role on the team. A lofty goal, indeed.)

Another benefit of engaging with customers is that engineers develop an informed opinion, and a voice, about what they should be working on and in what order. They can participate more fully in design and solution conversations, which lets them make smarter day-to-day decisions about interaction, functionality, user experience, and quality.

So, how do we get to know our customers?

As a user experience researcher, I focus on four key enablement activities: creating opportunities, inviting input, providing tools, and offering bribes.

Create opportunities.

I spoke with a few engineers who claimed to like hearing from customers but didn't do it often. Their biggest impediment was simply a lack of opportunities to do so. There's a straightforward fix: I set up customer interviews and invite team members to attend. Scheduling interviews takes time (always longer than I think), and it's not something I would expect our engineers to do themselves.

Focus your invites.

I’ve found that my invites get a lot more “acceptances” if the topic is directly related to what a team is working on, which is why I keep my campaigns small and focused. In the past I announced interviews and opportunities for customer feedback on a broad stage, such as in company-wide meetings, and found this strategy to be mostly ineffectual. Even if the topic is related, the sense of a direct connection between the interview topic and a team’s work is diluted. Now when I set up interviews, I show up to a team’s morning standup and announce the interviews I have for the day or week, giving a quick summary of what I will be covering and why it may be of interest. Then I send everyone on the team a personal calendar invite. I’ve found that being very explicit about how each interview relates directly to each individual’s work goes a long way toward piquing interest.

Invite input.

When a new interview topic comes up, my work begins with formulating hypotheses about the feature set I'll be researching. Typically I spend time with the UX Designer or Product Owner who's been working on the idea. They've usually had to make some assumptions in their design process, and one of my goals is to help them evaluate those assumptions. I ask them to phrase their assumptions according to four categories:

Positive Assumptions

Template: I think [this feature or functionality] is going to be [discovered easily | understood intuitively | found to be useful | liked] by participants because [reason].

Negative Assumptions

Template: I’m worried that [this feature or functionality] is going to be [overlooked | problematic | misunderstood | not found useful | disliked] by the participant because [reason].

Demand Assumptions

Template: I think that [this missing feature or functionality] is going to be requested by the participant because [reason].

General Question or Curiosity

Template: I want to know if the participant [does or thinks something] so that [reason].

Once we flesh out a decent set of hypotheses, I share the document with the team that has worked, or will work, on the functionality. I ask them to "+1" the hypotheses they agree with, "-1" the ones they don't, and add any hypotheses I'm missing. The engineers are often the best source of hypotheses, since they get to know the intimate details of a feature as they implement it. These hypotheses form the basis of my testing script: I go through each one and do my best to come up with an activity that will validate (or invalidate) it. So, by asking for their input, each team member has an opportunity to indirectly voice an opinion on the feature and directly influence the interview script, and they now have a personal reason to listen in on the interviews: to see how their assumptions play out.
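For engineers who prefer to think in code, here's a minimal sketch of the same idea. To be clear, in practice this all lives in a shared document, not a program; the sketch is purely illustrative, and every name in it (Category, Hypothesis, the example statements) is hypothetical rather than anything we actually use.

# Purely illustrative: in reality this is a shared document, not code. The sketch
# just shows how the four hypothesis templates and the team's +1/-1 votes could be
# captured as simple structured data. All names and examples here are hypothetical.
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    POSITIVE = "positive"    # "I think X is going to be discovered easily / found useful / liked ..."
    NEGATIVE = "negative"    # "I'm worried X is going to be overlooked / misunderstood / disliked ..."
    DEMAND = "demand"        # "I think X is going to be requested by the participant ..."
    CURIOSITY = "curiosity"  # "I want to know if the participant does or thinks X ..."

@dataclass
class Hypothesis:
    category: Category
    statement: str            # the filled-in template text
    reason: str               # the "because [reason]" part
    votes: int = 0            # running tally of +1 / -1 from the team
    test_activity: str = ""   # the interview activity meant to validate (or invalidate) it

    def vote(self, delta: int) -> None:
        # Record a teammate's +1 or -1.
        self.votes += delta

# Two hypothetical examples:
hypotheses = [
    Hypothesis(Category.POSITIVE, "The new filter panel will be discovered easily",
               "it sits where the old search box used to be"),
    Hypothesis(Category.NEGATIVE, "The bulk-edit action will be overlooked",
               "it only appears after multi-selecting rows"),
]

hypotheses[0].vote(+1)   # a teammate agrees
hypotheses[1].vote(-1)   # a teammate disagrees

# Surface the hypotheses the team feels most strongly about when drafting the script.
for h in sorted(hypotheses, key=lambda h: h.votes, reverse=True):
    print(f"[{h.category.value}] {h.statement} (votes: {h.votes})")

Sorting by votes is just one way you might decide which hypotheses deserve the most interview time; the point is that the four templates, the reasons behind them, and the team's votes are all simple, structured inputs to the script.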

Provide tools.

The second major barrier to speaking with customers is insecurity about how to do it. I don't expect the engineers who attend my customer interviews to lead the discussion, but I want them to feel comfortable participating and to walk away with some nuggets of value. To do that, I provide copies of the hypotheses and testing scripts in advance (so they can formulate their opinions and questions ahead of time), along with some simple, structured formats for taking notes during the interview.

And I moderate a debriefing activity immediately after the interview. This is where bribes come in.

Offer bribes (usability donuts).

Full disclosure for any CA engineers reading this: for the last two years I've been conducting experiments on you. I've been systematically varying the incentive I offer for attending a customer interview. We've gone through cookies, candy, pizza, chocolate bars, beer, coffee, gift cards, and the most consistently effective bribe of the bunch: donuts. I call them usability donuts.

Usability donuts. Not a subtle bribe, but an effective one.

Not only do the donuts provide an incentive for attending the interview, they provide a nice way for me to lay out my expectations of the interview observers. We spend time discussing and affinity mapping their observations immediately after the interview, while we enjoy any remaining donuts.

Of course, once all the interviews are completed and the results tabulated, I share my findings with the team. Typically, we have a discussion about what we learned and what actions we can take going forward. (That topic is probably worthy of another blog post.) It's important not to share this information with only the product owner or manager: for the engineers, hearing the findings closes the loop on the entire process, directly linking feedback from the customer interviews to the user stories or defects they'll be tackling in the future.

Outcome: Are we winning?

I started this post describing the failures of our now-defunct "Know Your Customer" program from a couple of years back. I wish I'd been keeping careful records of every engineer (or otherwise) who's attended a customer interview over the last two years; that would have given me a nice quantitative number to point to, but alas, I don't have one. (Ask me again two years from now!)

I do have a lot of qualitative evidence that the voice of our customers is becoming embedded across our engineering organization. In addition to recording how many attendees come to a customer interview, I’m now also listening for mentions of customer feedback in our team demos (“we’ve been hearing from customers…”, “a top customer request has been…”, etc.), for requests for customer feedback on hackathon projects, and for engineers who come to me asking for some customer input on their team’s work. For those teams working directly on customer-facing features, I work hard to get them to attend at least one customer discussion every quarter.

In sum, it’s a disparate set of qualitative evidence to work with, but it all suggests that our engineers increasingly care about what our customers say and that they want to hear it directly from the customers themselves. The benefit of this is an increased focus and drive in their work and a better product for our users. And of course, it’s always nice to hear from the engineers too:

“Direct customer feedback helps drive discussion between our developers, PO, and UX beyond the one-way feedback form available on the user story detail page.”

- Steven Boles, Technical Lead

“Live phone and video usability reviews with customers really helped us validate our vision for the Visual Studio plugin. We talked with developers and figured out what we got right and where we needed to make changes.”

- David Smith, Software Developer

Originally published at www.rallydev.com.
