Lessons from a UX researcher on a 0–1 product in a new team.

Michael O'Sullivan
UXR @ Microsoft
7 min read · Nov 14, 2023

Over the past year or so, I’ve been the lead (and sole) UX researcher for Demand Planning — a new product within Microsoft D365 Supply Chain Management. The app was released to Public Preview just over a week ago (at the time of writing) and is already being used by several large companies, including Domino’s Pizza. For me, Microsoft was a totally new company, Demand Planning was a totally new space, and I was much more comfortable in the physical, B2C world than the digital B2B one. I was thrown into the deep end and had to learn quickly. I was successful in some areas and failed in others. However, this experience has been invaluable and extremely enjoyable, so I wanted to write this article to reflect on my journey and to share 3 key insights on what worked well and what I would do differently if I were to join a new team or work on another product from scratch (which, in a way, we’re all doing as we begin to reimagine our products in this world of AI).

1. Move fast, but not too fast.

When I joined Demand Planning, a team of designers, engineers and PMs had already been working on the app for a couple of months. They had even carried out 20+ hours of interviews with potential users to get a sense of their requirements and expectations. In one of my first weeks, I travelled to the office in Copenhagen for a week-long workshop to review the research findings and agree on a development plan. Here, I was introduced to a FigJam file with hundreds of quotes from the interviews, organised into pain points, opportunities and jobs-to-be-done (JTBD) — a framework that was entirely new to me. They had identified two key personas: The Demand Planner and the Demand Forecaster, each with 8 distinct JTBD (key responsibilities that our software would have to support them with), pain points and opportunities. The team set out to develop separate experiences for each of these two personas.

The team had done an incredible amount of collaborative work and used a very structured approach to get to this point, and I was excited to be on a team with such a drive to be user-centred. Unfortunately, I was so unfamiliar with the concept of JTBD and the technical space of Demand Planning that I was hardly in a position to critique their work or provide much guidance. Instead, I spent that week simply trying to learn as much as possible about the product space, the team and the research requirements.

Seeing the aggressive development timelines and feeling the pressure to impress in my new role, I was eager to start carrying out research studies and publishing results, though I still felt like I didn’t know enough about the space to even be able to interview users. I looked at work done by other researchers in Microsoft and found that the next ideal step in the persona development process (after identifying JTBD, pain points and opportunities) was to carry out a persona segmentation study. These surveys segment the personas according to which jobs they spend the most time on, how satisfied they are with those jobs, and a variety of other attributes. Following that, the next step would be to carry out more interviews to deep dive into the biggest opportunity areas.

Fortunately, I had been given a decent research budget and was able to outsource this segmentation study to a third-party vendor. In fact, I even arranged to outsource the follow-up deep-dive interviews before receiving the results of the segmentation study. Outsourcing these studies would allow me to have some results to share within a month or two, and in the meantime I could continue doing online research to learn about the space.

When the results came in, I presented them to the team and they were delighted. I was able to say some impressive stuff along the lines of “There are 3 types of Demand Planner and 3 types of Demand Forecaster. The type of Demand Planner that works in medium-large companies (the ones we were targeting) spends most of their time doing XYZ, has X level of technical proficiency, and their biggest opportunity areas (the differences between importance and satisfaction) are A&B.” I could say the same about Demand Forecasters. Then, I received the insights from the follow-up deep-dive interviews and was able to present these, saying “We also carried out some deep-dive interviews into opportunity areas A&B, and here’s what users had to say…” The team was pumped and I was starting to feel more comfortable.
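As a quick aside on how those opportunity areas are calculated: an opportunity score here is simply the gap between how important a job is to the persona and how satisfied they are with how they currently do it. The sketch below illustrates the idea with made-up job names and ratings; it is not the vendor’s actual analysis.

```python
# Illustrative only: ranks jobs-to-be-done by "opportunity score",
# i.e. the gap between average importance and average satisfaction.
# Job names and ratings are invented for the example.
jobs = {
    # job: (avg importance, avg satisfaction), both on a 1-5 scale
    "Review forecast accuracy": (4.6, 2.8),
    "Adjust statistical forecasts": (4.2, 3.9),
    "Align with sales on demand signals": (3.8, 2.5),
}

opportunity = {
    job: round(importance - satisfaction, 2)
    for job, (importance, satisfaction) in jobs.items()
}

# The biggest gaps are the biggest opportunity areas.
for job, score in sorted(opportunity.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:+.2f}  {job}")
```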

The problem was that, by this time, I had started doing my own, tangential research. I was beginning to test terminology, concepts and other items for the team. Through interviews and usability tests, I was quickly getting the sense that Demand Planners and Demand Forecasters were actually the same persona, and that some of the JTBD that had been defined earlier in the project were actually just tasks within larger JTBD. This meant that, while much of the outsourced work was still valuable, several of the JTBD that had been tested, and the persona types that emerged from them, didn’t really exist.

Upon realising this, I decided to merge the personas into one, refine the JTBD, and tie the data points from the outsourced studies back to them as best I could. I advised the team to merge the two separate Planner and Forecaster experiences into one, and thankfully they did. If I could go back, I would have done this exercise first, before outsourcing any work. Sure, it would still have taken time for me to validate and refine the team’s work, and so they would still have continued development under some incorrect assumptions, but it would have allowed the outsourced work to be more correct and useful down the line. I want to reiterate that this was a mistake on my part, and that the research efforts carried out by the team were fantastic — they just needed to be fine-tuned by a researcher.

I think as researchers, we often worry that the product team will move ahead without us. We feel like we’re playing catch-up, and so we rush to get insights that will guide the team, rather than reporting later that they got it wrong — or, arguably even worse, reporting that they got it right and feeling that our insights were redundant. Of course we should move fast and try to get ahead of the team where possible, but only if we are confident that the foundations are right.

2. Use quantitative scores as motivators.

While qualitative research is hugely important, and quotes and video clips of users being confused and complaining about a product or prototype are very impactful, I have found that nothing motivates a team more than a low score. Combining the two is sure to light a fire under the product team and get them to take UX seriously, rather than just worrying about new features.

At Microsoft, we typically carry out a benchmark usability test once the product reaches general availability (GA) and then once per year after that, with the goal of moving at least one task from red to green (i.e., improving its success rate and/or satisfaction) and ultimately increasing the overall product score. I chose to carry out a ‘lite’ version of one of these for Demand Planning during Private Preview (almost 9 months before GA), testing it with only 5 participants rather than 10 and excluding some metrics like time-on-task. Thanks to this streamlining, within 2 weeks, I was able to provide the team with an overall product score (SUS), success and satisfaction rates for each of the key tasks/journeys, and insights on where they could be improved.
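For context on what that overall score is: SUS (the System Usability Scale) is a standard 10-item questionnaire scored out of 100. The sketch below shows the standard SUS scoring formula with invented responses; it isn’t data from the actual benchmark.

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items, each rated 1-5.

    Odd-numbered items contribute (rating - 1), even-numbered items
    contribute (5 - rating); the total is scaled by 2.5 to give 0-100.
    """
    assert len(responses) == 10
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5


# One participant's (invented) answers to the 10 SUS items.
participant = [4, 2, 5, 1, 4, 2, 4, 2, 5, 1]
print(sus_score(participant))  # 85.0

# The overall product score is typically the mean SUS across participants.
all_responses = [participant, [3, 3, 4, 2, 3, 2, 4, 3, 4, 2]]
print(sum(sus_score(r) for r in all_responses) / len(all_responses))  # 75.0
```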

The results got the team so fired up that it’s only been 4 months and they’re already asking for another one to see how the score has changed, now that they’ve implemented all of the insights from the first one. None of these insights related to new features; they were purely usability-related. Again, qualitative insights are crucial for helping the team to understand the problems and how to fix them, but quantitative scores can help to turn the development process into a game, encouraging teams to constantly strive for a new high score.

3. Make research presentations engaging.

We all know that there tend to be a lot of unnecessary meetings and presentations in companies, especially larger ones where people are working remotely. This makes it all the more important to make them engaging, which can be a tall order for research. By nature, research can often be a little dull, especially when discussing methods or presenting quantitative data. However, if you do manage to make your presentations engaging, you’ll find it much easier to grow your audience, have constructive debate during the sessions and, ultimately, see your insights incorporated into the product.

I recommend acting as if no other researchers are present, even if they are. You don’t want to get bogged down in discussions about research methodology or data analysis. You want to give just enough context as to why and how the study was carried out, while spending most of the presentation going over insights, making suggestions for how these should be incorporated into the product and encouraging discussion among the team. Use visuals where possible, especially to express things like time-on-task or pain point severity (play with colour, shape sizes, emojis, etc.). Use quotes and audio/video clips of participants to keep things interesting and make it feel more human. Summarise and prioritise the insights again at the end, providing recommendations on next steps and future studies. Be bold and confident in your statements and suggestions, even if they go against the grain. The team can’t argue with evidence in the form of user data.

Make your presentations engaging and memorable and it will make your life so much easier. The team will begin to ask for, and even wait for, research before developing new features and experiences, rather than finding out down the line, after something has already been built, how it could have been done better. Then, finally, research can start to guide design, rather than playing catch-up.

I hope you found these insights useful and consider incorporating them into your own role, especially if you’re starting on a new product or joining a new team.
