Disrupting price comparison using design sprints

Rich Clark
12 min read · Jun 27, 2017


This story accompanies a talk delivered at the UX Cubed event in Liverpool organised by Shop Direct with NUX and Forward Role on 16th June 2017. You can find the slides on Speakerdeck (https://speakerdeck.com/richclark/disrupting-price-comparison-using-design-sprints) to follow along with this write-up.

I’m going to explain how we’ve been using design sprints at MoneySuperMarket, where I’m currently leading the UX and Design team, to help improve our product development process.

Rather than focusing on what we’ve worked on in sprints, I’m going to discuss the process we’ve used and how we’ve evolved it to fit our needs. I’ll also talk about what we’ve learnt along the way, what’s worked well and what hasn’t worked. I’ll then explain why I think design sprints on the whole are beneficial.

Note: I’ll use ‘design sprint’ and ‘sprint’ interchangeably, but please consider any mention of a sprint to be in relation to a design sprint.

Design Sprints

Before we get started, I’ve got a few questions. Hands up if you’ve:

  1. Heard of a design sprint?
  2. Taken part in one?
  3. Taken part in more than 1?
  4. Facilitated a sprint?

For those of you who haven’t heard of a design sprint, I’ll quickly explain the process.

It’s a process that’s been around for a number of years, having been initially used by companies such as IDEO before being refined and popularised by Google’s venture capital arm, Google Ventures (GV). It helped GV gain more confidence in which companies would be worth investing in.

The Sprint Process

Design sprints are a five-day agile product process built around structured brainstorming. They help solve critical business or customer problems through designing, prototyping and testing ideas with users.

Sprints shortcut the debate cycle and compress months of time into a single week. They allow sprint teams to work on big problems and gain confidence and learnings from testing a prototype with real customers. The process helps the teams decide whether or not an idea is worth spending more time on.

In short, it gives us a superpower to build and test nearly any idea in just 40 hours.

Roughly, the days break down as shown below. Day 1 is all about unpacking the insight and mapping out the problem; on day 2 you sketch potential solutions; on day 3 you decide on your solution and design it; day 4 is reserved for prototyping; and day 5 is where you go out and test your prototype with real customers.

The design sprint process. Image from the Zapier Blog (https://zapier.com/blog/google-ventures-design-sprint/).

I won’t go into much more about sprints here, as there’s a wealth of information available about them, starting with the Google Ventures Sprint site (http://www.gv.com/sprint/).

Sprints at MoneySuperMarket

We’ve talked about design sprints in general but what about sprints at MoneySuperMarket?

Towards the end of 2015 we started learning and reading more about sprints. From a company perspective we were in the process of migrating to a new tech platform, which meant that within the product team we had some additional time to experiment and try out new processes.

In the first week of 2016 we ran our first design sprint. The sprint team was mostly UX people, along with a product manager, brought together to test out the process. Could we take a big problem and prove it out with customers within a week?

Lou, one of our researchers, taking part in our first sprint. Photo by Taz Hussain.

We undertook a challenge on our credit cards channel and, being honest, the brief was a bit woolly. The week went well and, whilst we didn’t follow the process to the letter, we muddled our way through and did manage to take a prototype to test with customers at the end of the week.

In fact, because we don’t like to do things by halves, we tested two versions of an HTML/CSS high-fidelity prototype using mock data with 12 participants across a couple of labs. We scored the prototypes at the end of the sessions and one of them scored markedly higher than the other, which, along with the verbatim feedback, gave us some great learnings.

Our first key learning, though, was that during a sprint you tend to run into a bunch of additional questions, problems to solve and unintended hypotheses and ideas. You connect the dots, or trigger some divergent thinking, which gives you even more actionable work to fold into your product development process. We’ve had this on pretty much every sprint that we’ve run.

17 sprints in 2016

The sprint was deemed a success. Following a retrospective and more research into the sprint process (including reading the sprint book) we iterated it slightly and ran more sprints. We also brought in more people from across the business and, in some cases, suppliers, agencies and providers.

In 2016 we ran approximately 17 sprints. I say approximately because sprints seemed to become MoneySuperMarket parlance. Everything became a sprint, whether it was actually a sprint, a workshop or, yes, in some cases a meeting. That’s a downside when everyone thinks all their problems can be solved by a sprint. Quite often, though, a problem doesn’t need a sprint; it simply needs some dedicated time or people.

MoneySuperMarket and agency colleagues in a sprint.

The product team at MoneySuperMarket are a group function, so we ran sprints for other areas of the business such as finance, HR and TravelSuperMarket. We also ran a couple of mini-sprints during recruitment days for the UX team, which helped candidates understand a bit about our process and allowed us to observe them in a setting more akin to a working day.

Looking back now, we did too many sprints and put everyone under a bit too much pressure. Credit to the team, they rose to the challenge and delivered with great success. The volume of sprints definitely raised the team’s profile and helped to explain what UX is and why it’s valuable throughout the business.

User research

We learnt a lot. We tried to solve some big problems and attempted to transform some of our biggest channels. For some, customers didn’t like the idea, so we killed it. For others, testing validated the idea and the proposition has either been built or is being built.

Successful sprint ideas tended to move on from the qualitative testing of the sprint and, following some iteration, were taken into an A/B test. In fact, for some sprints we built the prototype as an A/B test directly into our testing platform so we could launch a quantitative test straight after the sprint.
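
To make that concrete, here is a rough, hypothetical sketch (not our actual testing platform or code) of the kind of deterministic bucketing that lets a sprint prototype run as one arm of an A/B test as soon as the sprint finishes:

```typescript
// Hypothetical sketch only: bucket visitors into "control" or the sprint
// prototype variant so a quantitative test can start straight after the sprint.
import { createHash } from "crypto";

type Variant = "control" | "sprint-prototype";

// Hash the visitor id so the same person always sees the same experience.
function assignVariant(visitorId: string): Variant {
  const digest = createHash("sha256").update(visitorId).digest();
  // Use the first byte of the hash for a stable 50/50 split.
  return digest[0] < 128 ? "control" : "sprint-prototype";
}

// Example: decide which experience to render for a visitor.
console.log(assignVariant("visitor-12345"));
```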

A sprint taking place with an agency in their offices.

Another big learning was that we were killing our prototypers (perhaps before they killed us). We’re exceptionally lucky to have some very skilled people in the team, but asking them to turn around working HTML/CSS prototypes in a day or less just wasn’t sustainable over the longer term.

However, it’s also true that, by the very nature of what we do, customers want to see results personal to them. They need to see “real prices”, so in many cases paper or static prototypes don’t provide us with enough learnings.

2016 Reflections

At the end of 2016 we sat back and reflected as a team on what we’d achieved over the year and ran another retro. We came to a number of conclusions and questions, listed below:

  • not everything is a sprint
  • there are different types of sprints
  • not all sprints need to be tested in the same way
  • retro required on all sprints
  • pre-Sprint kick-offs are invaluable
  • how might we coach more facilitators?
  • how might we improve the sprint environment?

Some are obvious and, whilst I’m not going to go through them all in detail, in essence it boiled down to the fact that not everything is a sprint, we tried to do too much, and we needed some flexibility in how we approach sprints and testing.

We also talked about how we could improve the environment for sprints and how we could help others facilitate sprints to take some of the pressure off the UX team.

Design Sprints in 2017

This year so far we’ve run eight sprints, but each of these has been one of three different types based on our learnings from 2016.

Types of sprints we run at MoneySuperMarket

We’ve got our imaginatively named Type A sprint, which largely follows the standard Google Ventures process but strips back the need for a high-fidelity HTML/CSS prototype when we can get away without one. We’re also flexible in how we test with customers. Sometimes we’ll take statics or a low-fi InVision prototype to qual test with customers, but other times we’ll survey them using a tool such as UserZoom.

Our Type B sprint isn’t dissimilar to Type A, but here we use an HTML prototype. Generally these prototypes involve re-using a lot of code and pre-built components. We do run a normal qual test with customers in this version.
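
As a rough illustration (a hypothetical sketch, not our actual codebase; the names and figures below are invented), a Type B prototype page might re-use an existing results-card component and feed it hard-coded mock data, so participants see realistic-looking prices without the prototype touching any live services:

```typescript
// Hypothetical sketch of a Type B prototype assembled from mock data.
interface Quote {
  provider: string;     // name shown on the results card
  apr: number;          // representative APR for the participant to compare
  monthlyCost: number;  // illustrative monthly repayment in GBP
}

// Mock results stand in for the live comparison engine during the test.
const mockQuotes: Quote[] = [
  { provider: "Example Bank", apr: 18.9, monthlyCost: 42.5 },
  { provider: "Sample Cards", apr: 21.9, monthlyCost: 45.1 },
  { provider: "Demo Finance", apr: 24.9, monthlyCost: 47.8 },
];

// Re-used "component": renders one quote as the markup an existing
// results-card component would normally produce.
function renderQuoteCard(quote: Quote): string {
  return `<article class="quote-card">
  <h2>${quote.provider}</h2>
  <p>${quote.apr.toFixed(1)}% APR representative</p>
  <p>£${quote.monthlyCost.toFixed(2)} per month</p>
</article>`;
}

// Stitch the cards together into the results list for the prototype page.
const resultsHtml = mockQuotes.map(renderQuoteCard).join("\n");
console.log(resultsHtml);
```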

Type C process, the 10-day sprint

Type C is 10 days in length, and we get double the insight from both a survey and a qual test.

Looking at the process for Type C in more detail (below), you can see that it appears to be longer than the 10 days I just mentioned.

Something we’ve started doing recently is ensuring that, about a week before the sprint is due to start, we have a pre-sprint kick-off session. This introduces everyone to the customer problem and brief. We amend the brief if it’s not right, and the session helps to guide what insight attendees should bring with them for that all-important day 1 of unpacking the problem. Be that product knowledge, data, user research, competitor research or market analysis: basically everything you need to design and build great products.

Process breakdown for our Type C sprints.

In week one we follow a pretty standard sprint process, the difference being that we tend to get two days focussing on the UX and design because we’ve not got the prototyping day in this week. We also start to recruit participants for the qual testing that happens at the end of week two. Before that, though, at the end of this first week we tend to send out a quick survey to a few hundred people so we can get results over the weekend. This might be to provide confidence in an idea, or to help choose a route if we’re struggling to decide between two creative executions. Sometimes the prototypers can get a head start here too, depending on the solution that’s been designed.

We then come in fresh from a weekend break, recap week one, review the survey results and iterate the designs based on those results. The prototypers then get to work. We hold regular showcases to see what progress is being made whilst our researchers write up their test scripts ready for qual testing on the Friday.

During these two weeks the team flexes up and down after the first few days depending on who is required; we like to keep people on hand in case we need them for something, but don’t want to take everyone out of their day jobs for a full two weeks. We have one or two daily check-ins, though, for progress updates.

Finally, we encourage everyone to come along and observe the testing, but if that isn’t possible (depending on location) we try to stream the sessions so people can observe remotely and capture questions in Slack.

Then, hopefully on the Monday, sometimes the Tuesday, we get everyone back together to gain a shared understanding of what customers thought of the solution, good, bad or indifferent, and run a retro on the sprint itself to see where the process can be improved. This continuous learning has really helped us to develop our sprint process and refine it to ensure we add maximum value by running them.

As I mentioned earlier, there are a number of things that can then happen post-sprint: we might iterate and test again (rarely in another sprint); we might schedule some A/B tests; we might start to productionise the work; or, finally, we might kill the idea. Whatever we do with it, we’ve got those valuable learnings, which all get recorded and stored to come back to if we face a similar problem elsewhere across the site.

Collaboration, focus and colleague engagement

Over the last 18 months or so that we’ve been running sprints we’ve probably had around 100–120 colleagues from across the business taking part. We’ve had colleagues from commercial, customer services, tech, product, marketing, content, CRM, data and more take part.

Collaboration between teams has been one of the biggest benefits of sprints. The very fact of getting people together to collaborate and, secondly, spending dedicated time focussing on a problem is invaluable. It doesn’t really matter what you call it, but that’s the part which is really key to everything I’ve talked about: collaboration and focus.

Reasons to sprint

To summarise, when I think about what sprints have really helped us with, it pretty much boils down to these seven things, the first couple of which I’ve already mentioned.

  • Collaboration
  • Focus
  • Shared understanding of problems
  • Energised team
  • Unexpected outcomes
  • Increased customer exposure
  • Continuous learning

Getting a shared understanding of problems is invaluable. It helps everyone understand each other’s (and, more importantly, the customers’) point of view.

Running parts of a sprint, such as sketching Crazy 8s, really gets the team engaged and energised. They get the opportunity to contribute and have their say, so it’s not left to the designers to come up with all the ideas. Sometimes having competing teams come up with sketches and approaches within a sprint can be good too.

I talked briefly about unexpected outcomes and happy accidents before, but quite often we find that something which came out of a sprint ends up solving a problem for a channel we’re not directly working on. These ideas are then dropped into our UX & MVT backlog to be proved out at a later date.

It’s been a great way for us to increase our colleagues’ exposure to customers using our products, and it’s helped the business gain more empathy with our customers and put them at the heart of what we do.

And finally, I’ve mentioned it before, but it means we’re continuously learning.

Takeaways

As a takeaway, I’d like to share a few additional tips to complement the learnings I’ve discussed earlier in the story.

Firstly, make sure you’re focussing on customer problems, not business problems. This might not always be the case depending on what you’re trying to do, but it kind of goes without saying in our line of work. I find myself turning down a number of sprint requests because there isn’t a customer problem at their heart.

We’ve had sprints where, after the first couple of days, team members disappear back to their office or wherever, and it just doesn’t work; keep the team together for the duration.

In those longer two-week sprints, where sometimes not everyone is required all the time, you must make sure the decider is always available should a decision need to be made.

As I’ve hopefully shown by sharing our story of sprints, flex the process to fit your needs. Don’t feel that you have to stick rigidly to the process. Each business or organisation will have their own culture so find a process that works with that.

Get the environment right. This is actually more important than you’d think. Recently we changed our normal sprint room from a typical meeting-room layout to more of a workshop room. There’s no meeting table, just a few chairs, which means sprint participants can’t hide away in a corner; they have to get involved. Having enough wall space to stick things up is crucial too, as is getting people on their feet, running warm-up and icebreaker exercises and, of course, snacks!

Finally, for anyone who’s facilitated any kind of workshop before, it’s obvious, but the bigger the team, the harder it is to facilitate and keep everyone on track. I think the sprint book recommends a team of seven; I find once you get over 10 it becomes unmanageable. That’s when I’d start to look at only bringing additional experts in for certain parts of day one, for example.

For those of you who’ve not run design sprints before, I encourage you to try them out, and for those of you who have, I hope I’ve given you some alternative ideas to consider.

I’m keen to hear your sprint stories, so please give me a nudge on Twitter (http://twitter.com/rich_clark) and we’ll continue the conversation there.
