How Agoda Incorporated Design Reviews to Ship Quality Designs
by Yuiko Majima and Arash Bal
Introduction
When handing off designs to development, it’s essential that we deliver the best possible version. These are the designs that eventually get built, and as the frontline of this process, it’s our responsibility to prevent unintentional errors or suboptimal solutions from being passed along.
At Agoda Design, we’ve established a checkpoint called Design Reviews to serve as the final checks and balances before handoff. These asynchronous sessions evaluate designs against our Experience Principles, Agoda Design System (ADS) language, and accessibility standards. Conducted through Figma’s native comment system and ADS Lint, our in-house Figma plugin, these reviews ensure a consistent and high-quality user experience by:
- Gathering diverse feedback: Collecting insights from peers, domain experts, and page owners to refine and improve designs.
- Ensuring alignment: Ensuring designs are aligned with stakeholders impacted by them.
- Identifying gaps and mitigating risks: Detecting any missing elements or inconsistencies before handoff to avoid potential issues later.
This article explores how we developed Design Reviews to improve design quality without significantly impacting efficiency.
The state of our feedback culture
Before incorporating design reviews, designers were already sharing their designs. There was a regular cadence of weekly design critiques, and each team had a process that suited them. However, we saw opportunities to improve our collaboration and our design quality.
Key issues we uncovered through interviewing designers were:
- Sharing is often siloed: Designers primarily shared their work with their immediate team and product stakeholders, with optional sharing across other vertical design teams. Depending on team size and velocity, designers used various formats such as critique sessions, asynchronous commenting in Figma, and 1:1 sessions. Weekly critique sessions were the most common. However, only 21% of designers at Agoda reported that sharing designs in critiques was mandatory, while 50% said it was not. Teams also shared designs in smaller Slack groups, often leaving their goals misaligned with other teams impacted by the design changes.
- Final designs are not always shared: Designers share their work at varying stages. 44% always share before handoff, 20% share during design iterations, and 24% share after the first design concept. However, not all designers share their final designs prior to shipping, leaving potential gaps in quality assurance.
- Lack of unified quality checks: There was no shared platform or process for ensuring consistent quality checks across teams. Design reviews were optional, leaving room for human errors to slip into production. Without a shared understanding of quality, feedback often became subjective and inconsistent.
To address these challenges, we needed a more structured, unified approach to design sharing, quality checks, and cross-team collaboration to ensure alignment, reduce errors, and maintain consistent standards.
How we implemented design reviews
Re-evaluating design reviews was key to standardizing quality across the design organization and driving adoption of the Experience Principles. We refined the process through multiple testing rounds, measuring changes in quality and efficiency before rolling it out to the entire design org.
1. Pilot testing: Three design teams within our design org submitted designs for review. Design Leads and Managers reviewed the designs against the Experience Principles, tracking the results in a simple Excel sheet.
2. Beta testing: The same three design teams requested reviews again; this time, however, the reviews were conducted using our in-house Figma plugin, ADS Lint, developed by Lars Jarlvik (a simplified sketch of this kind of automated check follows this list). The design leads again evaluated the work against the Experience Principles.
3. Rollout to the entire design organization: We then formally opened the process to all design teams.
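ADS Lint itself is internal to Agoda, so we can’t share its source here. As a rough illustration of what an automated design-language check in a Figma plugin can look like, here is a minimal sketch built on the public Figma Plugin API. The two rules shown (fills linked to library color styles, text layers using shared text styles) are illustrative assumptions, not ADS Lint’s actual rule set.

```typescript
// Minimal sketch of a design-language lint pass in a Figma plugin.
// The rules below are illustrative; ADS Lint's real checks are internal.

interface LintIssue {
  nodeId: string;
  nodeName: string;
  message: string;
}

function lintNode(node: SceneNode, issues: LintIssue[]): void {
  // Rule 1 (assumed): fills should come from a shared paint style in
  // the design-system library, not from hard-coded colors.
  if ('fills' in node && 'fillStyleId' in node) {
    const fills = node.fills;
    if (fills !== figma.mixed && fills.length > 0 && node.fillStyleId === '') {
      issues.push({
        nodeId: node.id,
        nodeName: node.name,
        message: 'Fill is not linked to a design-system color style.',
      });
    }
  }

  // Rule 2 (assumed): text layers should use shared text styles.
  if (node.type === 'TEXT' && node.textStyleId === '') {
    issues.push({
      nodeId: node.id,
      nodeName: node.name,
      message: 'Text layer does not use a design-system text style.',
    });
  }

  // Recurse into container nodes (frames, groups, components, ...).
  if ('children' in node) {
    for (const child of node.children) lintNode(child, issues);
  }
}

// Lint whatever the designer has selected on the current page.
const issues: LintIssue[] = [];
for (const node of figma.currentPage.selection) {
  lintNode(node, issues);
}

figma.notify(
  issues.length === 0
    ? 'No design-language issues found'
    : `${issues.length} issue(s) found; see the developer console for details.`
);
console.log(issues);
figma.closePlugin();
```

A production plugin would typically surface these issues in a panel UI rather than the console, and re-run them when the designer submits the review, which is how the final verification step described in the next section can be automated.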
How we conduct formal reviews today
1. Designer initiates review: The designer selects the areas for review in ADS Lint and adds the corresponding Jira ticket. Using the plugin, designers can self-assess their use of design language.
2. Slack for transparency: To improve transparency, we moved everyone into a single Slack channel where designers initiate a review using a bot that tags the designers affected by the changes (a minimal sketch of such a notification appears after this list).
3. Reviewer evaluates design: Reviewers assess the design against the Experience Principles and provide feedback using Figma’s comment system.
4. Resolving comments: The designer addresses the feedback, makes necessary changes, and marks comments as resolved.
5. Submitting the review: Once all issues are resolved, the designer completes the design review in ADS Lint, which re-runs automated checks for final verification.
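The Slack announcement itself can be as simple as a bot posting to the shared channel. The sketch below uses the official @slack/web-api client; the channel name, message wording, and the idea of passing affected designers’ IDs in directly are assumptions for illustration, not the behavior of Agoda’s actual bot.

```typescript
import { WebClient } from '@slack/web-api';

// Hypothetical setup: a bot token in the environment and one shared
// review channel, per the transparency goal described above.
const slack = new WebClient(process.env.SLACK_BOT_TOKEN);
const REVIEW_CHANNEL = '#design-reviews'; // assumed channel name

// Announce a review request and tag the designers affected by it.
async function announceReview(
  jiraKey: string,
  figmaLink: string,
  affectedDesignerIds: string[]
): Promise<void> {
  const mentions = affectedDesignerIds.map((id) => `<@${id}>`).join(' ');
  await slack.chat.postMessage({
    channel: REVIEW_CHANNEL,
    text:
      `Design review requested for ${jiraKey}: ${figmaLink}\n` +
      `Affected designers, please take a look: ${mentions}`,
  });
}

// Example usage with made-up identifiers:
announceReview(
  'PROJ-123',
  'https://www.figma.com/file/example',
  ['U012AB3CD']
).catch(console.error);
```

Posting every request to one shared channel, rather than to per-team groups, is what provides the transparency described above: each review and the designers it affects are visible to the whole org.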
Key learnings
1. Experience Principles standardize how we assess user experience:
73% of Agoda designers agreed that Experience Principles help evaluate user experience consistently, and 82% felt they standardized the review criteria.
2. Design Reviews improve quality:
83% of designers felt that Design Reviews improved the quality of their designs. Measured quality (the percentage of reviews with no issues at handoff) improved by 50% during the pilot and by 171% during the beta phase. Remaining issues at handoff were primarily due to technical constraints, out-of-scope backlog items, or items handed over as-is to maintain development velocity.
3. Process and tooling changes help reviews fit designers’ workflows:
75% of designers felt that conducting Design Reviews through ADS Lint integrated well with their day-to-day workflow. While reviews can be submitted at any point, most were conducted in the final stages of design.
4. Time invested in reviews leads to better quality, but minimizing inefficiencies remains important:
While the total additional time introduced by Design Reviews is not yet fully quantified, time spent on reviews is now formally tracked: 81% of reviews were completed within one hour, consistent with the pilot phase. 90% of designers reported no negative impact on development velocity, although time spent updating designs has slightly increased. Part of this increase is attributable to tooling inefficiencies, which we are addressing to further streamline the process.
Conclusion
By integrating design reviews into our workflow, we’ve improved the quality and consistency of our design decisions. This process has led to noticeable improvements in how we work:
- Improved UX quality: We can refine designs to better meet user needs by catching potential issues early.
- Increased confidence: Regular reviews build confidence in our design outputs, making them more reliable and user-centric.
- Stronger collaboration: Design reviews encourage collaboration across teams, bringing diverse perspectives to the table.
At the same time, additional process adds to the day-to-day burden. Once internal indicators and customer satisfaction surveys show we have reached the design quality we are aiming for, we plan to re-evaluate the entire review process.