We’re a digital design agency in Melbourne, Australia, and our client, Momentum Energy, is a growing energy provider. A little while ago we agreed to work on a Customer Research project together, which we scoped to include some initial customer research, three consecutive GV Design Sprints, and a final showcase.
Here’s how we got on …
Before we started sprinting, we spent a couple of weeks learning more about the business. We conducted twelve one-on-one customer interviews, with a plan to identify and assess the gap between our assumptions about what customers are looking for and first-hand accounts of customer needs. The research, as brief as it was, proved incredibly worthwhile: it ended up guiding us throughout the subsequent sprints.
Let the sprints begin
On Monday morning, our first task was to define our goal: ‘Users will have a complete understanding of our product and will feel confident in why they chose Momentum’. We created a ‘war room’ and covered the walls with our goal, our questions, our research, the challenge map, and our HMWs. While it may have felt slightly overwhelming to start with, by the second and then third sprints, Mondays became easier for the team — we found ourselves being able to fairly quickly and confidently identify how we’d spend the sprint because we were continually revisiting the war room.
Each sprint sought to answer a key HMW question, and the questions we landed on were:
- “How might we help people understand where their energy comes from?”
- “How might we do something wildly different (and memorable)?”
- “How might we help users find the right Momentum product using lifestyle indicators?”
Working together as one team
We’d worked on a few small BAU projects for Momentum in the past, but nothing like this — this was the first GV Design Sprint we would run together. It was a great opportunity to really get to know each other, and for the Momentum team to, at least temporarily, remove themselves from the corporate environment and give themselves the space to focus on one thing for the week.
Some of us had experience participating in other Design Sprint teams but, for the Momentum team in particular, this was going to be a new experience. We suggested everyone read or re-read Jake Knapp’s Sprint book, which is what we’d be basing our sprints on, prior to starting.
The sprint team was made up of MASS and Momentum staff. MASS provided the creative resources and facilitated the sprints and the user testing sessions, while Momentum provided business, product and marketing expertise. Not everyone from Momentum could dedicate their entire week each sprint, but we were lucky enough to have two team members working on site with us for most of the time.
We held daily stand-ups and encouraged everyone from the extended team to join us. We had executives join us for the really important sessions like voting and deciding, and we used Lookback for all our user testing, which meant all team members could easily log in, observe and take notes as the sessions were in progress. We also used Slack throughout the project for comms.
Prototyping in a day
We used a combination of Figma, Marvel and Principle for creating our prototypes. All prototypes were built for testing on desktop and included limited functionality. Since we wanted to keep our client’s identity unknown during the user testing, we used a fictitious company name and brand, and we tried to avoid using content that might give them away.
The first sprint’s prototype was relatively straightforward; it explored how we might ‘help people understand where their energy comes from’. We focused on content, and broke the experience down into mini chapters. We used Figma for design and Marvel for the prototyping. The final prototype involved fairly long scrolling pages with an emphasis on content.
During the second sprint we challenged ourselves to ‘Do something wildly different’. We came up with lots of great UI / interaction ideas and decided to spend our Thursday generating four different prototypes (three based on our solution sketches, and one control). With the benefit of hindsight, we can now say this was a fairly ambitious undertaking! The prototypes were intended to feel like a single-page app and relied heavily on testing interaction patterns, so while we did the design work in Figma, this time we built them in Principle. Principle is ordinarily used to demonstrate and explore micro-interactions, so it was an obvious choice. The downside of Principle is that it’s quite a laborious method of prototyping (multiple artboards are required to prototype seemingly small differences between pages/elements), but we made it work. It also forced us to refine our knowledge of Principle (learning more about components), which will make the process easier and faster in the future.
During the last sprint we went back to basics and focused on one prototype that explored how we might ‘help users find the right Momentum product using lifestyle indicators’. We used Figma for the design and Marvel for the prototype, as we did in the first sprint. The interface was relatively simple, with a layout that let users drill deeper into the content without refreshing the page. We would have liked to make this really fluid, but basic screen-to-screen interactivity was the most achievable.
Learning from each sprint
We learned a lot by the end of each sprint, even though our testing sample group was pretty small — we only scheduled four one-on-ones each Friday (although we did do some additional testing later). Despite the small group, trends emerged quickly around:
- What influences a customer’s choice of brand
- What kind of content a customer wants to see on a website
- What will help influence a customer’s purchase, and
- What might help keep them as customers
… and we also discovered a few surprises:
We found out how open customers were to non-standard interaction patterns. We underestimated both their appetite for alternative layouts and their ability to recognise them instantly. To be honest, very few test users even commented on our wilder UI ideas, instead expressing an expectation that an energy provider would be innovative. This was a nice reminder that users’ expectations are rising and their ability to recognise different interaction patterns is growing, likely due to the sheer variety of user interfaces daily life now exposes us to. It’s important to remember that we tested with a pretty small sample group; it would be interesting to see whether we’d achieve the same result had we cast a wider net.
I like the analogy that Mel DeStefano shares in this article; ‘… your first sprint is like throwing a dart in a dark room. You have no idea where you’re aiming, and there’s a really good chance you’ll miss. But after you’ve thrown, the light comes on, and you get to try again. You have a much better chance of hitting the target now’. We garnered some great insights from our three sprints and now we have a clearer and stronger starting point for future work.
We completed the project — the initial research and three subsequent sprints — and decided to create a book to document our observations and commemorate the process. While it was a last minute decision, it felt like an important thing to do.
At first, we were reluctant to invest additional time in creating a book — it didn’t feel like a ‘lean’ thing to do. My personal opinion is that one of the key advantages of a sprint is that it reduces the need for documentation, that the very nature of the process is such that people experience and synthesise their observations and findings together, so something like a book just isn’t required. But who doesn’t like the smell of a freshly printed book?
In the end, I’m glad we decided to do it.
‘The Customer Research Book is fantastic because it so clearly captures everything we went through together. But it will also become a reference for us moving forward and demonstrate the importance of customer-centricity to others who weren’t part of the sprint’.
Dean Cartwright, Senior Digital Strategy Manager, Momentum
Dean’s comment was unexpected, to be honest, but it serves as a great reminder of the value of a thing. A thing that can be shared, discovered, flipped through idly while you’re waiting for the kettle to boil. These are the moments that can be the beginnings of culture change, and for that reason we know taking the time to create this for Momentum was well worth it.
What worked / what didn’t
We were super impressed with how smoothly the project went, and there were some things that worked especially well:
- Working collaboratively meant connecting with the team really quickly
- Recruiting subjects using Facebook, and pre-qualifying them online
- Intensive, time-boxed tasks forced us to move quickly
- The freedom to work on more speculative ideas
- Knowing that even a bad result was a good result (fail fast)
- Using Lookback to capture our user tests and observations
- Discussing our observations as a team at the end of the testing
- Photographing and documenting everything throughout the sprint
That said, there were still things that could be improved. Some of the things we found challenging were:
- Everyone embracing the crazy eights (a bit intimidating for some)
- Feeling overwhelmed by the process, mostly in the beginning
- Getting enough people sketching to make the process of voting valuable
- Being too optimistic about how much prototyping we could do
- Underestimating how much test users fixated on content
We wouldn’t say we’ve completely mastered the art of sprinting, but by the end of the project we definitely felt a lot more comfortable with the process.
What we did differently
We pretty much kept to the recommended process with only a few minor differences.
We facilitated a requirements gathering workshop with the Momentum team and conducted some attitudinal focus group sessions with customers before we started. I think this might even be recommended — either way, it definitely helped.
We tried to stick to the schedule pretty closely during the sprints, but I’d be lying if I said we stuck to all the suggested times, every time. Once we got more familiar with the sprint, we even stole time from some sessions to give ourselves more room to prototype.
Monday mornings for sprints two and three were much easier, having gone through the process before; those sessions ended up being more about sorting the ‘How Might We’s’, reviewing, voting and discussing. Because we had the same, consistent goal for all three sprints, we didn’t need as much time defining the target. This certainly helped us improve as we went or, at the very least, made it abundantly clear what the group thought was working well.
After testing the prototypes from sprints two and three, we decided to revisit them at the end of the project. We felt that both sprints warranted more investigation and needed more work on the content to get a more accurate test result.
We didn’t have the recommended number of full-time team members throughout each sprint, but we worked well with the core team and ‘ring-ins’ at critical moments, like the voting exercises. This, in combination with daily stand-ups and showcases at the end of the week, meant anyone who wasn’t able to attend all of the week’s exercises still had an opportunity to be involved.
Sprinting into the future
We all had a super positive experience. While we were hoping the project would be a success, we were thrilled that the Momentum team wholeheartedly embraced a new way of working. We’ve now applied some of the sprint techniques to other projects for Momentum, and we’re hoping to do more sprints together in the near future.