THE DESIGN LOOP

Brett Whitham
Jul 25, 2017 · 8 min read

Design a better experience in just a few steps.

Design and aesthetics are often confused for one another. Though aesthetics play a part in the process, there is much more that goes into designing an engaging experience. Here are a few steps I like to follow before diving into a new design project.


I like to believe that few projects are ever complete. Our environment, our technology, the processes we follow, and countless other things are constantly shifting and evolving. Those evolutions impact us and are the basis for the concept of an iterative design loop.

Let’s just assume before diving in that you have some guideline for moving forward: “Let’s build a better toothbrush” or “We’re going to revolutionize the chair industry.”


1. Empathize with your user
It’s difficult (impossible, maybe?) to be an expert on every problem put in front of you. Looking back, I know I certainly had zero knowledge of things like currency exchange risk, what makes a döner kebab authentic, or how to fundraise for an AIDS organization. What’s important is the ability to identify your users and subject matter experts and do everything you can to understand their problems. Hold a focus group, gather usage data, visit your clients, hit the pavement and talk to the public, or call your mother for once if it’ll help. Do whatever you can to get to a place where you can see through the eyes of your users and empathize with the challenges they’re facing.

2. Hypothesize
You’ve gained a good understanding of your user and the subject matter. Next you’ll need to capture your thoughts to establish a base profile built on three key questions:

  • Who is the user (or users)?
  • What is their core problem?
  • What path will they follow to solve it?

Here you’ll want to leverage the practice of developing use cases and user workflows. It’s good practice to loop back with your users to verify those workflows. Better yet, include a user or two while whiteboarding or sketching them out. You may end up with a large list of use cases you’re trying to solve; this step will help you examine those cases and begin to lay the foundation of your minimum viable product (“MVP”).

I could touch on the concept of an MVP in a future post, but a quick Google search will uncover plenty that’s already been written. In short, it’s the smallest set of features you can build, with the least amount of effort, that overcomes the user’s core problem. The goal is to get something into the market so you can see how it’s used, gather feedback, and make improvements in future versions.
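As a rough illustration of carving an MVP out of a larger list, you could tag each candidate use case with whether it addresses the core problem and a rough effort score, then keep only the core cases and build the cheapest first. The use cases, scores, and field names below are invented for the sketch, not part of any real project:

```python
# Sketch: pick an MVP as the core-problem use cases, cheapest first.
# All names and scores here are hypothetical examples.
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    core: bool    # does it address the user's core problem?
    effort: int   # rough build cost, e.g. story points

use_cases = [
    UseCase("exchange currency", core=True, effort=5),
    UseCase("view rate history", core=False, effort=3),
    UseCase("set rate alerts", core=True, effort=2),
    UseCase("share rates on social", core=False, effort=8),
]

# MVP: every core case, ordered cheapest-first so the least effort ships first.
mvp = sorted((u for u in use_cases if u.core), key=lambda u: u.effort)
print([u.name for u in mvp])
```

Everything that isn’t core (“view rate history”, “share rates on social”) stays on the backlog for a later version.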

3. Conceptualize and define the MVP
Grab a whiteboard and start big. You now have a solid idea of who you’re designing for and the problems they’re facing, so the next task is to figure out the individual pieces that will make up the experience.

It’s important to keep this step completely open, with all ideas welcome (obviously, use your best judgment on anything that might be offensive or in poor taste). You want to imagine the grandest view possible, one that serves all the needs of the user in the awesomest way possible. From this Taj Mahal view of the problem you can begin to prune and prioritize the concepts that solve your core use cases, thus defining your MVP.

4. Prototype
This step gives tangibility to your ideas and gets you closer to the validation process. Your prototype may be a low-fidelity wireframe of a website, paper mockups, something 3D printed, or any number of other things. It’s really up to you and your team to decide what’s needed here. Ideally you’re keeping this simple, fast, and efficient.

5. Validate
It’s time to validate those ideas you’ve turned into prototypes. You should find yourself looping between the previous conceptual and prototyping steps as you work through the verification process. It’s up to you to decide how much verification needs to be done, but the following steps have been very useful for me.

  1. Pick your users. I found a lot of truth in a Nielsen Norman Group study on testing with five users (in practice it translates to about 15 users across several rounds). The idea is that you don’t need a massive group of users to uncover issues in your designs. After about the third user you test, you’ll begin seeing similarities, and by about the fifth you’ll be hearing the same feedback the previous users gave.
    In trying to stay true to rapid iterative design, I like to have multiple test groups of three users. Five may be better in certain situations, but three is an easy-to-manage number for both you and your users and tends to uncover a great deal of information. My preference is a total of at least nine users across three groups. It’s also important for each user group to share a similar profile: if user #1 in group #1 is in their 30s, college educated, tech savvy, and from a rural area, then user #1 in groups #2 and #3 should be someone similar.
  2. Write your tests. There may be situations where you simply want to watch a user use your product with no restrictions, but in general I like to have a strict set of things I gather feedback on. These tests should be inspired by the earlier steps where you defined your MVP (and its associated core set of use cases). Think small, and remember you can step through small sets of items… “Is the process of clicking {these three things} to get {to this place} intuitive enough?” Keep it controlled, and don’t write anything so complicated that a user has to spend more than a few minutes on the test.
  3. Test and Observe. Here you’ll work through your groups of three one at a time. It’s ideal to be in-person while giving the tests, but I’ve also found success with remote sessions using products like WebEx. Record the sessions, if possible, to refer back to later. Since you’re not going to be over the shoulder of every user when your product is in the market, it’s important that you remain hands-off and don’t lead the user during the tests. You should also ask your users to verbalize any thoughts they have while working through the tests.
    Run your tests with each user in the group back to back: pick one day during the week where you can schedule the sessions as close together as possible. If you’re doing this right, you can run each test within 15 minutes; even on a larger project you should aim to keep it under 30.
  4. Review. After you’ve finished the day’s test sessions, take some time to gather your thoughts. Within a few hours you can regroup to review notes, videos, and anything else you gathered from the sessions. Any issues you’ve observed should be prioritized against your definition of the MVP.
  5. Iterate and repeat. Next you’ll take whatever issues you’ve agreed to fix in the prototype and fix them. Try to make your changes within a few days so that you’re prepared to run tests with your second group of users. Hopefully each new test group doesn’t hit the same issues the previous group saw. It’s also ok to introduce new tests after a few rounds, but be careful not to introduce too much too quickly, or you could easily lose control over the results.
    As you work through this you should notice that the problems group #1 had are less apparent for group #2 and no longer present for group #3. This is where building groups of users with similar profiles pays off: the tech-savvy user #1 in group #1 may have had problems that you no longer see with user #1 in the other two groups.
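The five-user heuristic above has a simple model behind it: Nielsen and Landauer estimate the share of usability problems found by n testers as 1 − (1 − L)^n, where L is the probability that a single user hits a given problem (about 31% in their studies, but an assumption you’d calibrate for your own product). A quick sketch of the curve:

```python
# Nielsen/Landauer model: fraction of usability problems found by n test users.
# L = 0.31 is the value from their studies; treat it as an assumption here.

def problems_found(n_users, L=0.31):
    """Expected fraction of problems uncovered by n_users testers."""
    return 1 - (1 - L) ** n_users

for n in (3, 5, 9, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems uncovered")
```

Three users already catch roughly two-thirds of the problems, five catch around 85%, and the curve flattens out near 15, which is why several small matched groups beat one big one for rapid iteration.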

In most cases it’s not necessary to validate the entire experience, only enough that you can start feeding pieces into your production timeline. The prototyping and validation process can continue on its own timeline, independent of the actual build timeline.

6. Build and Deliver
Not a ton to say here. It’s time to finalize and put the polish on the pieces you’re going to deliver. Whatever space you’re working in, you’ll want to stay close to production to offer direction and verify things are being built as designed.

7. Analyze
As mentioned earlier, few projects are simply “complete”. Even after you’ve gone to market you should continue to iterate and refine, especially if you’ve gone the MVP route, which assumes the first few versions you release won’t quite fulfill that Taj Mahal vision you originally crafted. This analysis step may include usage data, user surveys, and even more test sessions.

There’s obviously going to be some downtime where users just need to use the product and you need to move on to other projects. But even after you reach what feels like the finish line, feed your product right back into this loop and see what you uncover.
