How an Atypical Client Helped Us (Re)Learn Design Process Fundamentals

Josh Lucas-Falk
grandstudio
6 min read · Nov 14, 2018


At Grand Studio, we’ve worked as design consultants for years. Over that time, we’ve developed a pretty clear sense of our typical client. The people we work with are usually some flavor of product manager employed by large, matrixed organizations. Those organizations typically have well-established digital product management and product design practices, and they generally come to us for a little bit of outside, expert support. Our client organizations are typically in the business of providing a service (like financial services or healthcare services) to individuals or other companies. We have deeply developed muscles for working successfully with these types of clients.

It turns out, though, that while we’d been working out these muscles, we may have neglected a few others. We came face-to-face with this reality during a year-long project that we recently wrapped, with a client that is highly atypical for us in pretty much every way possible. The end result of our project was work that I’m really proud of (though can’t say much about at the moment). However, it took some pain and effort that could have been avoided if we had done a better job of understanding how this client and their organization were different from our typical client and had adjusted our approach accordingly.

I’m going to skip the list of all the things we (I) messed up on the project and jump right to the things we should have done differently.

Map the client organization, understand roles, and explicitly define client team engagement

Photo by Thomas Drouault on Unsplash

Our team of ~4 researchers and designers from Grand Studio was working with a core team of about 6 people on the client side. These people included designers and researchers, but also engineers and a mathematician. I’m embarrassed to admit that we never explicitly asked this group questions like: How much time do you have to give to working with us? Which decisions do you want to be a part of, and which do you trust us to make on our own? What information do you need from us to, for instance, report on project progress to your bosses and others in the organization? If you’ve worked on projects like this before, what worked well? What didn’t?

We (eventually) figured this all out, but it would have been a lot easier for everyone if we had just had an open conversation about it at the beginning of the project. There are certainly reasons why we didn’t, mostly having to do with pressure to get started on the “real work” of research and design, but, looking back, skipping that conversation wasn’t so much a shortcut as a painful detour that required later course correction.

Define and re-define the value of qualitative research

Photo by Daria Nepriakhina on Unsplash

The population of end-users of our product is tiny: probably only a few thousand people in North America, spread across a few hundred very large, very insular companies. It took us several months to identify likely potential users at target companies and set up interviews with them. There is simply no practical way to conduct a quantitative study within this set of target users. However, the company that hired us is fundamentally an engineering company, so they wanted results that gave them mathematical certainty that a particular solution was the right one.

We do qualitative research, either generative or evaluative, on pretty much every project. And we have often worked with quant researchers (especially market researchers) at our clients to build qual studies that draw on existing quant intelligence. Up until this client, though, we’d never worked on a project where there was as much pressure to derive quantitative certainty from our qualitative research as there was on this one.

Stepping back, the root cause of this is pretty obvious: our client sought certainty and confidence that the product we created with them was solving the right problems in the right ways. We eventually got to a point where we were able to assuage this concern and provide some confidence in the decisions we were making. However, we could have gotten there more quickly by more clearly communicating and re-communicating the special things we’re able to do through qualitative methods, like understanding user workflows and pain points in fine detail and discovering new product opportunities. Critically, we should have been clearer and more open in describing and re-describing what qualitative research does not do, which is “prove” that a given solution is the right one.

Articulate, re-articulate, and then re-re-articulate the design process

Photo by Alice Achterhof on Unsplash

Our problem space for this project was highly complex and highly technical. We had a few concrete workflows that we knew we would have to redesign to make more efficient and less time-consuming, but our general project goal was to create something that promoted highly qualitative values like “trust” for users of our product. We knew that there were some unknown unknowns we needed to define and understand when we started the project. We do something like this on basically all of our projects, and the basic cycle of define/discover -> concept/test -> design/document is second nature to us as we work through this kind of ambiguity.

Of course, that doesn’t mean it’s second nature to our clients, especially clients who have never gone through this process from start to finish before. While we were working through it on this project, we struggled to communicate its purpose and value, and the purpose and value of basic design tools like sketch concepts and wireframes. This is entirely our (my) fault.

It’s obvious to us that we start as rough as possible so that we can answer the big, general questions before we start trying to answer the detailed ones. We use low-fidelity tools like sketches and wireframes specifically because they’re not resolved and not precious: we can easily modify and even throw away these ideas because we deliberately haven’t invested much time in them. Our client had a tendency to view even the roughest concepts and sketchiest wireframes as literal design suggestions. Even worse, there were points where members of the client team thought we were unintentionally making bad design, rather than deliberately making unresolved design that we wanted their feedback on before resolving it.

This was a pretty basic mistake on our part: we failed to make sure that every member of the project team understood the project plan in general, understood how the specific thing we were working on at any moment contributed to the overall process, and understood the utility and value of that work.

The Conclusion

The nature of our work is such that there’s rarely a successful path through a project that doesn’t involve our clients meaningfully and substantially in the design process. This recently completed project is an excellent reminder that we don’t get that kind of involvement for free: if our client is unfamiliar with the way we work, the first job is to teach them the how and the why of what we do. This lesson is pretty fresh for me and the team, and I’m looking forward to using it as an example within the company of why it’s worth investing the time in this kind of client education and engagement.



I’m the CEO of Grand Studio, a Chicago-based digital product design consultancy that used to be called Moment.