4 Lessons from Running My First Design Sprint

It’s interesting to see all the different opinions on the topic of Design Sprints, even more so when you pitch one to executives. There is no shortage of naysayers, though it may be their loss if they aren’t willing to try.

Dan Stumph
Lean Startup Circle
8 min read · Feb 17, 2017

I felt fairly confident preparing for the first Design Sprint at Vendasta; I had read the Sprint Book, done my research, and even had a one-on-one phone call with Jay Malone from New Haircut. Vendasta’s culture is built on continuous innovation and keen to experiment, so we had great buy-in from leadership. This may have been the best-case scenario for someone running their first sprint. The room was ready with whiteboards, sticky notes, Sharpies, and paper. The team started filing in — everyone ripe with anticipation — ready to tackle the five days ahead of us. Would it be a success? Here are 4 lessons I personally learned from running my first Design Sprint.

1 | Do more research beforehand

There is always room for more information at the beginning of a Design Sprint, but I naively assumed that going in with little upfront research would produce a more organic result. Now, I obviously didn’t go in completely blind, since I was the strategist on the project. However, I didn’t do as much competitor analysis as I should have. Unfortunately, I assumed the time together with experts would fill the gap. I was wrong. Had I known the landscape for the chosen problem more intimately, I would have been able to contribute a greater variety of thoughts to the team. There is a strong temptation to rely on crowdsourcing information from the experts. If you and your team don’t take the time to prepare — you’re going to lose a ton of value.

“Had I known the landscape for the chosen problem more intimately, I would have been able to contribute more thoughts to the team.”

You should know the problem you plan on testing in the sprint weeks before day one. Take advantage of this time: look through competitors, understand your users, and try to feel out the behaviours that are driving the need. Additionally, if you have an existing product, you should understand its current state, in both usability and technology, as best you can. This is another reason why having a variety of expertise is important: each expert can shed some light on the state of affairs in their own area (more on this later). All this to say that I may have been prepared for the process of a design sprint, but I wasn’t as prepared as I could have been for the context. Lesson learned.

2 | Be very selective with customers and cognizant of their schedules

Vendasta does not have the luxury of one-size-fits-all for our customers or users. We have customers who may or may not be users, who in turn have customers who may or may not be users. We also have organizations of various sizes that use the system very differently. All that being said, when selecting users to test with at the end of the Design Sprint, be very aware of the target audience your prototype will be speaking to, and choose your test customers based on that diagnosis.

We ran into some schedule issues the week we ran our sprint. Being in Canada (and because of some last-minute date selection), it didn’t dawn on us that we had scheduled the sprint for the week of American Thanksgiving. Because most of our customers are in the US, we couldn’t test on Friday (or at least not to the extent we would have liked). As if that wasn’t enough, we had two different workflows to prototype. Prototyping Thursday turned into Finalizing Friday, which merged with Thanksgiving Testing Torture. Nonetheless, it ended up working out. We wrapped up testing on Monday and were able to partially validate the work that came out of our sprint. I’m sending my future self an email to look at the calendar a little closer.

3 | Variety in your team is mandatory

Because this was an experiment, there was some hesitation about who to include. We were unsure what the outcome would look like, and pulling a diverse group away from their week felt dicey. So we improvised and tried it out with the entire strategy team plus some additional members. This ended up being an Executive, three Designers (including myself as the facilitator), two Product Managers, a Developer, and, for a user reference, one of our internal Success Reps, who joined every day for about an hour. Aside from our full-time team, we had more executive presence for the expert sessions and more internal users. There were definitely some wins and losses with this type of team. It was great that the whole strategy team was able to learn from the experience, but the feedback was that the group should have been more diverse. We could have included more users and, obviously, the respective Product Manager and Designer who would be working with the development teams afterward.

Here’s why the variety of people is so critical to the success of design sprints. I found that the most insight, and potentially the best ideas, came from those who don’t think through these kinds of problems every day. The strategy and product management group is accustomed to seeing a need and trying to fill it with solutions. The problem is we get so caught up in system and iterative challenges that we can forget to let loose and shoot from the hip. “What if we did this” or “Wouldn’t it be amazing if we could…” are sayings we need to unleash in these sprints. We can at least wireframe what those pie-in-the-sky ideas might be and then narrow down scope when we select the direction for the prototype. If we handcuff our ideas to what the system can currently do today, we will lose out on innovation and growth. The people outside of regular product strategy had the edge — users are especially invaluable in this process if you are lucky enough to have them around. The other incredible source of information, one I found so, so, so important, was the developer. Their insight into how the system is orchestrated behind the scenes gives the team much more clarity about what is possible in the short term and what might increase scope.

“If we handcuff our ideas to what the system can currently do today, we will lose out on innovation and growth.”

4 | Your final deliverable is not the prototype

Often when we drive this sort of design agenda, it’s easy to be misled about what our deliverable will actually be. More commonly, the design process delivers an artifact that is ready for development. What we learned here is that the feedback from testing with customers is the most valuable deliverable. It seems so simple thinking about it now, but it exposed something to me that has started a journey toward continuous feedback.

“Rapid wireframing in silos allowed each of us to get our unbiased ideas out in the open.”

It was the week after the sprint, and I felt mixed emotions about the outcome. We had finished testing and everything seemed to point in the right direction — with a few minor changes. The team involved had expressed their thoughts, and generally the sentiment was positive, with some valid takeaways. One afternoon I had a meeting with a few leaders to go over the results. I spent a good portion of the time trying to validate the value of the sprint by showing all the work we did. The map, the sticky notes, the whiteboard mayhem — and such wireframes! I was framing the sprint itself as the final deliverable. We looked at the prototype and I was thinking, “So many people contributed to this and it’s nearly on point!” Well, it was nearly on point, and the reason it wasn’t completely on point was emphasized in the next discussion. We continued looking through the documentation and got to the testing feedback. One of the leaders sat up in his chair and said — I’m paraphrasing — “Look at the point so-and-so made! That’s gold!” I was embarrassed yet intrigued. The important part is that we didn’t change the prototype after each test: you need to test the same prototype on different users in order to validate the feedback. The cumulative user tests shed enormous light on what was wrong, but beyond that, they pointed directly to the answer we were looking for. This has got to be the drive behind every Design Sprint — early feedback is the golden ticket!

“Finally, testing the prototype showed us where we were right, but more importantly, where we were wrong.”

We could have spent months strategizing, high-fidelity prototyping, tweaking, and revising focus. In my experience — so far — this has the tendency to blow scope out of proportion. From what we learned in our design sprint in five days, we’ve been able to narrow down the desired outcome for the next release of our platform. By including the experts in discussions early on we validated the journey we were going to tackle. Rapid wireframing in silos allowed each of us to get our unbiased ideas out in the open. Coming together and agreeing on a direction to prototype was challenging, but it showed us how many possible ways we could solve this problem. Finally, testing the prototype showed us where we were right, but more importantly, where we were wrong. I feel very confident that our effort to try a Design Sprint was successful. Was it perfect? Not at all. But we now have an idea of the clarity and collaboration it can bring to our product development efforts. Like anything, with practice it will keep getting better.

The consensus I keep reading about design sprints that end in invalidated ideas is that you “only wasted five days.” I’d agree with that from a cost-analysis standpoint, but having now been through one, I disagree with the word “waste.” Failing to prototype the right solution is perfectly acceptable, as long as you understand what needs to change. I’ll leave you with this thought: which is more expensive — spending five days on the wrong prototype, or three months building the wrong solution?

Have you had the same experience or maybe some feedback on my findings? Leave a comment, I’d love to hear about it!
