How to Run a Design Jam

This face-to-face mini-sprint kickstarted our remote team’s design project

Katie Jones


We’re jammin’.

For many of us in UC Irvine’s Master of Human-Computer Interaction & Design program, a centerpiece of the experience is the capstone project, a group design project for a real-world client. My team has been working with a Fortune Global 500 consulting firm on innovation in the wine space, with about 10 weeks to research and 10 weeks to design. MHCID is a low-residency program, so our capstone collaboration all happens via video scrum meetings and lots of chatting on Slack.

After 10 weeks of research, the team was ready to come up with ideas to design. But we had a problem: ideation is one phase of the process that really benefits from face-to-face time. Our team rocks at getting stuff done remotely and often asynchronously, but there is no replacing the generative energy that comes from being in a room together.

Enter the design jam, an in-person weekend structured to help us crank out our best ideas and validate them with real users — all in two days. Two weeks ago, our team holed up in a room in Orange County with some healthy snacks (and donuts), a whiteboard, and some optimism for a weekend of mostly in-person work.

The team. L-to-R: Felicia Wang, Tim George, David Nguyen [our mentor], Aaron Soto, Steve Ngo. Not pictured: Katie Jones.

There’s still plenty of work to be done, but our team agrees that a jam was a great way to kickstart the design phase of our project. What follows is our play-by-play and a few lessons learned. May it be useful for other remote teams in need of a creative shot in the arm.

The Team

Tim George (jam leader), Katie Jones, Steve Ngo, Aaron Soto, and Felicia Wang, with David Nguyen (our mentor and guru)


We had three goals for our design jam:

  1. Generate design ideas rooted in our research.
  2. Validate some of those ideas through user testing.
  3. Identify at least one idea to build out and iterate on in the remaining weeks of the project.

At the suggestion of our mentor David, our team drew inspiration from Jake Knapp’s sprint process to plan the jam, but with only a weekend, we knew we could not commit to a full 5-day sprint cycle. Our modified sprint included two rounds of “flare and focus” ideation to generate and home in on the strongest concepts, followed by a day of very rapid prototyping and user testing.


Flare and Focus 1: How Might We?

Flare: We started with a “How Might We” exercise, an ideation strategy that got us thinking about ways to address the user problems we had identified in research. Our previous research had identified six Customer Jobs that people hire wine to accomplish, so we attacked the jobs one at a time. We dedicated 15 minutes to each job: 5 minutes to remind ourselves of our research findings, and 10 minutes of quiet, heads-down brainstorming in the form of “how might we…?” questions. People in the room wrote theirs on sticky notes; our remote teammate dropped hers in Slack. At the end of each 10-minute writing session, we read out our ideas and added them all to a board.

“How Might We” exercise — before and after.

Focus: The How Might We exercise left us with dozens of sticky notes of questions and half-formed ideas. The next step was to identify the strongest questions that deserved more attention. First, we grouped the related How Might We questions by theme. Then, we each used dot stickers to silently vote for our top three; our out-of-town teammate called out her votes so that hers were included. In a nod to the “Sprint” doctrine, a predetermined decider made the final call after the rest of the team had voted. The focus exercise left us with three How Might We questions for each of our Customer Jobs.

Having a remote teammate didn’t hold us back, but it took work to keep the process inclusive.

Flare and Focus 2: 2x2 Decision Matrix

Flare: We then put our pens to sticky notes again to quietly sketch possible answers to the “How Might We” questions we had chosen, completing 3 successive sketching sessions of 12 minutes each. (That is more exhausting than it sounds.) At the end of the flare, we had another stack of ideas. We went around the table and screen, reporting out what we had come up with.

Focus: Once we knew the idea landscape, we placed each idea on a 2x2 matrix: difficult-to-easy to prototype on one axis, and boring-to-delightful on the other. We turned our attention to the quadrant of ideas that we considered easy to prototype and delightful. The same voting ritual followed: each participant cast 3 votes, then the decider chose 5 winning ideas.

The top right quadrant was the sweet spot: not too difficult to prototype, not too boring.

Eight hours later, we had chosen 5 ideas to prototype, and it was time to call it a day.


Rapid Prototyping

We started our morning by splitting up and quickly prototyping our 5 solutions. Steve created one product prototype in InVision for a screen experience, and Aaron worked up a quick script for a chatbot; the rest were just paper. Our goal here was not to test interaction design but just to validate whether our ideas resonated with wine drinkers. We worked up interview scripts to understand users’ current ways of solving the problem and their feelings about our solutions.

User Testing

With prototypes in hand, it was time to bring in real people. We set up a lab for our interviewer and users, and a viewing room for the rest of us to watch and listen. Just using some laptops and a recorded Google Hangout, Tim rigged up two cameras, one to watch our interviewee and the other to watch them use the prototypes. Each of our four subjects sat with the interviewer for about 45 minutes, working through and talking about all 5 of our prototyped ideas. They left with gift cards, and we left with lots of data.

Taking notes in the viewing room

Lessons Learned

Validating ideas through user testing is really hard.

How do you figure out whether a stranger would actually want — and choose — your product? We gave ourselves two hours to rapidly prototype 5 ideas, not enough time to think deeply about each one. As users showed up, we ran them through all 5 prototypes, asking just enough questions to figure out whether each product solved a problem they actually had. We found that it’s easy enough to quickly figure out whether a problem resonates (Q: “Have you ever had trouble finding the right gift for someone?” A: “OMG yes. My dad. Impossible to get presents for. He has everything.”). It’s much harder to feel out whether a solution would actually appeal to a user in the real world. Participants can’t really know whether they would use a solution, and most are too polite to be brutally honest. Their comments about the prototypes drew out useful data about aspects of our products, but we did not leave the jam confident that the market was ripe for our ideas.

In-person ideating made the whole thing worth it.

Ideas gain momentum when given a little attention. They snowball and spin off and evolve. That momentum is where surprising ideas come from, and it almost requires the thinkers to be in a room together (preferably with phones on airplane mode). While we had trouble validating our ideas in the 2-day sprint, we collectively came up with dozens of ideas together that would not have come out without focused attention in the pressure cooker of a room. Now that we are designing out our ideas with more thought and fidelity, we are glad to have a big slate of ideas to draw from. Some of the ideas that felt exciting in the jam are not holding up as we flesh them out. Others that we had ignored are resurfacing based on feedback we receive.

Team momentum matters. A jam can help.

Working on projects remotely can be a slog. We discovered that a design jam was the mid-project kick we needed to recommit to another ten weeks of collaboration that looks more like this:

Wish us luck!


