Measuring design team health

Chris Collingridge
Auto Trader Workshop
Aug 1, 2022

Amid all the busy activities and initiatives that help us improve, it’s often difficult to get an overall sense of where a team is and whether things are getting better for them. With this in mind, we’ve developed a way to regularly sense-check how things are going for our people in Auto Trader design & research.

An arm with a blood pressure monitor
Photo by Mockup Graphics on Unsplash

An initial diagnosis

At the back end of last year, we were reflecting on where we were at as a design & research team at Auto Trader, and where we needed to focus most for improvement.

We’d been through a couple of years with lots of change: we’d hired many new people; transitioned to home working during the COVID lockdowns and recently started a hybrid approach; gone through an evolution of our organisational structure; and introduced some new roles and levels, among other things.

We had lots of good work & development conversations going on between people and their managers, regular retrospectives (both in cross-discipline teams and with design & research groups), and a list of prioritised operational improvements we were making. What we felt we were lacking was a “pulse” on how people were feeling, and on whether things were getting better (or worse) for us as a team.

Finding the equipment

Auto Trader runs a regular series of employee engagement surveys that are filled out by the whole organisation, and these cover lots of topics that are as relevant to designers & researchers as everyone else — things like their relationship with their manager, their personal and professional development, the support they get from their teams, and so on.

Those surveys aren’t lip-service; after each one we come together, review the results, and identify actions we need to take to make Auto Trader an even better place to work.

We didn’t want to duplicate any of the main Auto Trader surveys, but rather extend them with questions more specific to the satisfaction of designers & researchers.

Luckily, I was pointed in the direction of an excellent article on this very subject, by Erin Casali. It’s full of excellent practical advice.

Taking the pulse

Based on Erin’s article, we ran a survey — just using Microsoft Forms — following almost all of her advice. We adapted it slightly in a few ways:

  • Running it completely anonymously (while encouraging people to speak to named individuals if there was anything they wanted a response to)
  • Removing questions we were already asking elsewhere (such as in the all-company surveys)
  • Changing the red / yellow / green scoring to happy / neutral / sad emojis
  • Tweaking the free-text questions to “What’s the #1 thing we could do to enable you to be more successful in your role?” and “What’s the #1 thing we could do to make the design & research team better?”

We kept the approach of the analysis spreadsheet largely intact: using the heatmap to spot patterns, averaging scores both per question and per person, and highlighting and counting any ‘reds’ (sad faces).

A spreadsheet showing part of the results from the survey, with mainly green cells, a few orange, and one red
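For anyone who’d rather script the analysis than use a spreadsheet, the same calculations can be sketched in a few lines of Python. This is an illustrative sketch only — the respondent names, questions, and data below are hypothetical, not our real survey results:

```python
# Hypothetical emoji responses: one row per respondent, one column per question.
responses = {
    "respondent_1": ["happy", "happy", "neutral"],
    "respondent_2": ["happy", "neutral", "sad"],
    "respondent_3": ["neutral", "happy", "happy"],
}
questions = ["Clarity of vision", "Team support", "Tools & environment"]

# Map the happy / neutral / sad scale to numeric scores.
SCORES = {"happy": 3, "neutral": 2, "sad": 1}

def question_averages(responses, questions):
    """Average score per question (a column average in the heatmap)."""
    return {
        q: sum(SCORES[row[i]] for row in responses.values()) / len(responses)
        for i, q in enumerate(questions)
    }

def person_averages(responses):
    """Average score per respondent (a row average in the heatmap)."""
    return {
        name: sum(SCORES[r] for r in row) / len(row)
        for name, row in responses.items()
    }

def count_reds(responses):
    """Count the 'sad' answers that get highlighted for follow-up."""
    return sum(r == "sad" for row in responses.values() for r in row)
```

The column averages show which topics need attention, the row averages show how individuals are doing overall, and the red count flags specific answers worth following up on.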

Reviewing the experiment

The first run worked well in several ways:
  • It was simple enough for people to fill out reasonably quickly
  • We got an overall score that we could track over time
  • We could see the variation between the most satisfied members of the team (who gave exceptionally high scores) and the least (who gave moderately high scores)
  • We could see the patterns easily around where we were strongest (environment and support within the team) and where we could most improve (clarity of vision in product teams)
  • The free text comments yielded a few themes that we could focus on improving

Moving to regular check-ups

We first ran the survey in January 2022, and we’ve just run it again this July. Our overall score stayed exactly the same between the two iterations (despite some changes to the people in the team over that period), but there was interesting variation between some of the individual questions.

On this second occasion, we added some additional questions that we’d like to track over time — around people’s awareness of and use of our design principles, our design process, and their ability to attend critiques regularly.

While this doesn’t in any way replace the regular, qualitative inspection and adaption that we’re doing in our various teams — or indeed the more strategic initiatives we’re taking to continue to grow the impact of design & research at Auto Trader — it does give us a good temperature check of where the team is at. It’s easy to run (and fill out) and is now forming part of our overall balanced scorecard for how we operate and contribute.

