The Art of Peer Review in Analytics

Our journey in implementing a peer review framework in the Atlassian Analytics & Data Science team

Marie
Data at Atlassian
6 min read · Sep 30, 2022


A long-standing tradition in engineering teams, peer review should be a non-negotiable part of every analytics and data science team. Yet many analytics teams struggle to follow formal peer review processes. Urgent stakeholder requests, little peer review experience, or non-overlapping work can keep teams from doing more peer reviews. Teams may also struggle to find online resources about peer review for analytics and data science organizations. The benefits may be obvious, but the implementation is less clear.

These were some of the challenges we faced when creating an analytics peer review framework at Atlassian. Our team, the Data Quality and Enablement team, spearheaded this initiative because we wanted to enable teams to do quality work that included checks before results were reported to a stakeholder or deployed in a production pipeline. Our vision was to encourage Atlassian analytics teams to actively engage in peer review and find value in it. This blog recaps our approach and findings.

Defining peer review for Atlassian Analytics

We started our work by aligning on a definition.

Peer review maintains quality standards, improves performance, and provides credibility. It is the evaluation of work by one or more people with competencies similar to those of the work’s producers, functioning as a form of self-regulation by qualified members of a profession within the relevant field to ensure the correctness of results and catch costly errors.

– Adapted from Wikipedia and Shay Palachy’s article

When we first published this definition of peer review, we received a complaint: it didn’t describe how to do peer review. But that was intentional. We wanted to equip teams with tools and resources while leaving them enough flexibility to decide what would work best for them. So we preferred a definition that focused on the benefits and general process rather than a prescribed checklist.

Aligning on the goals of peer review

Next, we focused on our goals. Change management can be very hard in organizations, so it’s imperative to define clear objectives. We often had to refer to these goals when convincing teams to set up peer review processes. The more your goals resonate with teams, the better your chances of convincing them to start peer reviewing.

Here are our peer review goals:

Reduce human error by

  • catching mistakes
  • validating approaches
  • ensuring recommendations make sense

Create a culture of constant improvement by

  • seeing how more senior peers approach projects
  • learning about new methods and tooling
  • sharing knowledge

Share best practices by

  • broadening knowledge of the company
  • seeing what has worked in other areas

Scale by

  • using consistent approaches
  • developing templates

Challenges to implementing peer review

We then assessed what obstacles to peer review existed. When we looked into teams’ processes, we found that many teams weren’t doing any formal peer review because of

  1. the time commitment required
  2. siloed work

Peer review takes time…but it’s worth it

In a growing company, there are never enough hours to finish all the possible work, so we must be thoughtful and discerning about where to spend our time. This is not unique to analytics peer review. One of our software engineers, Martien Verbruggen, articulated a great response in an internal company blog about peer review.

“But, I hear you say, if we have to do all this, we will be terribly slow.

Well, you’ll be appropriately slow. Quality does not necessarily come for free.

But there are ways in which you can become faster while meeting this bar….almost all teams who spend some time on engineering health do become faster in the long run.”

We recognize it can be slow to ramp up. But remind teams of the quality goals peer review seeks to achieve and how work speeds up when fewer hours are spent debugging.

Siloed work can make some types of peer review more difficult but not all

As for the other hurdle, many Atlassian analytics teams have only one analyst or data scientist per stakeholder group. If you are the only analyst working on Community, is it okay for a reviewer not to understand all the nuances of the subject matter? Will they be familiar with the same datasets? We can’t spend time fully onboarding team members onto each subject area for the sake of nuanced peer review.

Independent projects can make peer review more challenging, but the hurdle is not insurmountable. We instead suggest:

That value can be added even by those unfamiliar with the domain.

  • Reviews don’t have to be only code reviews; learning how to approach a problem doesn’t depend on the subject matter area.
  • You should be able to explain your work at a high level to anyone. This is true whether you’re in your project’s early or final stages. Reviewers can provide feedback on your message.

To look for peer review partners outside your team.

  • You can request anyone in the org to provide you with feedback. This could prove especially helpful for more technically advanced work that your team doesn’t have much experience with.
  • Knowledge sharing across different subject matter areas might provide new ways of looking at the problem.

So how do we do peer review?

Finally, we began crafting peer review frameworks that could work for our teams. What framework could reap the benefits of peer review while mitigating the issues that prevented many teams from trying it in the first place?

We decided to provide multiple peer review options and the autonomy to determine what works for each team. We also provided support as they embarked on this journey and feedback on their progress.

Here is an abbreviated list of possible peer review activities adopted by teams.

  • Sparring: Our most popular form of peer review. Generally, teams will have a regular meeting, and someone will sign up to have their work reviewed. We advocate for work to be peer-reviewed early on — better to catch errors sooner rather than later! Reviewees give context and share their progress, while reviewers ask questions and provide feedback.
  • Code Review: Code reviews can be difficult if your subject matter area and associated tables are siloed. One Atlassian team posts a review signup sheet on Confluence, and reviewers volunteer to peer review the code in depth. For productionized pipeline work, we create and review pull requests.
  • Slack Rooms: A few teams have created dedicated peer review Slack rooms. One has a more casual vibe where the group may jump into a thread to help, or a review can be taken to a call. Another team has a more formal Slack setup, including bots that count how often people volunteer to help (a rough sketch of such a bot follows this list).
  • Office Hours: One of our product analytics teams has set up weekly office hours where two analysts/data scientists are assigned to be the reviewers of the day, and people can bring their work into the office hours for advice.
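
For illustration, here is a minimal sketch of the kind of volunteer-counting bot mentioned in the Slack Rooms bullet above. It assumes review requests are posted as top-level messages in a dedicated channel and that helpers reply in the thread; the channel ID, token variable, and one-week window are hypothetical placeholders, not a description of the team’s actual bot.

import os
import time
from collections import Counter

from slack_sdk import WebClient  # pip install slack_sdk; the bot token needs the channels:history scope

client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
CHANNEL_ID = "C0123456789"                       # hypothetical peer review channel
ONE_WEEK_AGO = str(time.time() - 7 * 24 * 3600)  # look back one week

volunteer_counts = Counter()

# Walk the top-level messages (review requests) posted in the last week.
history = client.conversations_history(channel=CHANNEL_ID, oldest=ONE_WEEK_AGO)
for message in history["messages"]:
    if message.get("reply_count", 0) == 0:
        continue  # nobody has volunteered on this request yet
    # Fetch the thread and credit everyone who replied, except the requester.
    replies = client.conversations_replies(channel=CHANNEL_ID, ts=message["ts"])
    requester = message.get("user")
    reviewers = {m.get("user") for m in replies["messages"][1:]} - {requester, None}
    volunteer_counts.update(reviewers)

for user_id, count in volunteer_counts.most_common():
    print(f"{user_id}: volunteered on {count} review thread(s)")

Counting thread replies keeps the bot read-only and easy to run on a weekly schedule; a team could just as easily count emoji reactions or explicit sign-ups instead.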

And finally, we discovered that it is essential for the reviewee to explicitly ask for the type of feedback they are looking for. Where should reviewers focus? On your message? On your analytical methods? Stating this up front helps ensure you get useful input from the exchange.

Try to measure your progress and continue improving

At this point, over 90% of our analytics teams have engaged with at least one form of peer review. This is a good starting point, and we are working towards improving our practices to net greater gains in the future. Happy reviewing!

Are you interested in joining our team? Check out our careers page.
