4 ways we improved our product design peer review process

Natalie Gilmor
Industry Dive Design
Aug 15, 2022 · 5 min read

For many years, Industry Dive product designers were responsible for most of the product development cycle up to the point of engineering — from research and discovery to UI/UX design to stakeholder approval.

This process changed in 2021 when the company began to hire product managers. The addition of these new team members radically changed the role of Industry Dive’s product designers and led to new sets of challenges.

In particular, we quickly discovered that we needed to rethink how we approached design peer reviews.

Below are four issues related to design peer reviews that we identified after we began to work with PMs:

  1. Lack of incremental feedback — Designers were waiting until they felt 100% ready to share their work before scheduling a review. This led to a lack of visibility into their design process and lost time when different solutions or directions were uncovered in review.
  2. Progress was often blocked — Once the designer was ready to share, it could be difficult to find time with the team in the immediate future. The designer would then be blocked from moving forward until they could meet for peer review.
  3. Minimal UX and analysis discussion — Only UI work was receiving peer feedback as the designers did not think to share operational or research deliverables with the team. This led to lower-quality UX and analysis work due to limited feedback and discussion.
  4. Poor decision documentation — With more frequent reviews happening over video call, the “why” behind design decisions was being lost. This made it difficult to then share with stakeholders how we arrived at certain solutions.

In order to address these challenges, we made four key changes:

  1. Scheduled a 45-minute weekly critique meeting
  2. Implemented a 30–60–90 framework to provide feedback earlier on in the design process
  3. Created new async communication guidelines
  4. Created new guidelines for feedback documentation

With these changes, we aimed to add a supportive structure to our peer review process and address the four issues listed above. Our other goals were to increase the quality of our UI and UX work, decrease the time spent on projects, and create greater transparency and knowledge sharing between designers.

Weekly structured meetings

To start, we scheduled a 45-minute weekly critique for the designers to share their work. I also provided guidance on how they should prepare for the review meeting, along with an agenda to ensure each project received the appropriate amount of attention.

A version of the review meeting guidance:

  1. What are you working on and why?
  2. What do you want to get out of the review?
  3. What are the challenges of the current design in production? (If applicable)
  4. What are the jobs to be done and the business value of these changes?
  5. Are there any important constraints, blockers or deadlines to consider when providing feedback?

With even minimal guidance on how to give high-quality feedback, our team's discussions quickly improved and became more valuable. We currently hold these review sessions once a week but may increase to twice a week in the future.

30–60–90 Framework

Next, we incorporated the 30–60–90 framework shared by Pavi Logeswaran in her 2022 Figma Config talk, How to Show and Tell. The crux of the framework is that you should show work at the 30%, 60% and 90% progress points. Each stage has a unique set of goals. At 30%, you should aim for feedback on the high-level direction. At 60%, you should focus on visuals and more expanded concepts. At 90%, the final stage, it is all about fine-tuning the details.

Previously, our team only shared their work for peer critique at the 90% point. Normalizing the sharing of early-stage work greatly improved the value of peer feedback. It also ensured that little time was spent on designs that would later be significantly changed or discarded.

Async guidelines

While the weekly structured meetings provided an opportunity for discussion and critique, a designer sometimes needed quick feedback in order to keep moving forward. Because we are an entirely remote team, I developed guidelines for asynchronous communication. First, we defined which channels were appropriate for specific types of async feedback: Slack is great for quick design approvals or gut-checking ideas, while email is better suited to formal or less time-sensitive feedback.

I then created an outline, similar to the one for structured peer reviews, to ensure the feedback designers received was on-topic and helpful. These questions focused on the status of the project, the goals for the feedback and any applicable deadlines.

Designers were also encouraged to form their own opinions before asking for feedback. If they shared multiple options asynchronously, they should explain which one they were leaning toward and why. This provided further context and insight into their thinking and accelerated consensus-building.

Documenting feedback

Finally, to ensure feedback from peer reviews was properly documented, we brainstormed how we could be more accountable for documenting these discussions and design decisions.

First, we agreed to be intentional in how we iterated on ongoing designs. Rather than deleting nixed designs, each iteration would be saved and labeled with the review date (ex: June 11, design review 30%). This would make it easier to go back and look at past iterations in case we needed to walk stakeholders through how we arrived at our final design.

For the reviewed designs, notes would be added directly to the design file explaining the feedback and key decisions. These notes would include a timestamp as well as which individuals were involved in the feedback process.

For larger project decisions or final approval, the designer would document the decision and share it with the product manager and stakeholders to ensure transparency across teams. This would ensure everyone was on the same page and there would be no surprises when the designs were reviewed or pushed to production.

Conclusion

The four operational changes described in this post significantly improved our peer review process. The team now has greater ownership of their work and knows exactly what they should do at each stage of a project. Our next goal is to better incorporate the product team into design discovery conversations to improve cross-team visibility of our work.


I'm the product design director at Industry Dive. I help my team solve UX & UI challenges in the B2B media space.