A harsh engineer’s guide to delivering impactful yet benevolent tech reviews

Christopher Anciaux · Doctolib · Mar 3, 2023 · 10 min read

I joined Doctolib as a junior software engineer in September 2020. Before that, I was working at a company where processes weren’t as rigorous as at Doctolib, and I had yet to review a single Pull Request or any technical design document (tech scopings, as we call them at Doctolib).

Since then, I have reviewed more than 700 Pull Requests and about 30 tech scopings. My teammates often welcome my reviews because they find them challenging, yet objective and benevolent. I learnt all the review tips I know from other developers at Doctolib and by scouring the internet for guidelines on the subject.

Here is the guide I follow when reviewing a PR or a tech document. Please, feel free to review it and share your own tips ;)

This illustration and the others you’ll find in this article were kindly designed by Vanko Yutian Zhou 💙

TL;DR:

Delivering impactful reviews

  • Understand the original issue, challenge it if needed
  • First read: understand each piece of the work, search for small issues and inconsistencies
  • Second read: understand the whole flow, how pieces interact with each other and with the rest of the project
  • Be extra-challenging: think of edge cases, ask “stupid” questions
  • Check that the document proves it’s solving the issue (tests, KPIs…).

Delivering benevolent reviews

  • Separate the artist from the art, review the work as if it was done by a junior even if it was done by your manager or tech lead
  • Be objective in your reviews: reviewing is not about sharing your preferences, but making sure we’re complying with the project standards
  • Prove you’re objective in your reviews with documentation, links, etc.
  • Bring solutions, not criticism
  • You’re known to be a bit harsh? Check out the last part of this article for a few tips on how to soften your style ;)

First things first: why review your peers’ work?

“Why bother with reviews?” might sound like a stupid question, but it is worth asking: I personally dislike doing something “because we have to” or just because “it’s the process”. If you know why you are doing something, it is much more likely you’ll do it well.

Firstly, reviews are great to spot small inconsistencies or a major dysfunction in a peer’s work, no matter the level of experience of your colleague. As I tend to say, “two heads are better than one”.

Reviewing your beloved peers’ work might also be the moment to discuss the code and the product more generally (“could this class be simplified?”, “could we make the whole process more intuitive instead of building something new into it?”…), to spot some edge cases or to identify security issues, as an external pair of eyes.

Finally (we could find tons of other reasons for reviewing peers’ work), reviews are an excellent opportunity to learn, both for your colleague and particularly for you! Trying to be very challenging in your reviews will force you to ask questions you wouldn’t have proactively asked about your own work.

“I didn’t know the module_function util — Today I Learned 😄”
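In case you have never met it either: module_function is a small Ruby utility that makes the methods that follow it callable directly on the module, while also keeping private instance-method copies for classes that include it. A quick illustrative sketch, not code from an actual review:

    # module_function exposes the following methods on the module itself.
    module PriceFormatter
      module_function

      def format(cents)
        "#{cents / 100.0} €"
      end
    end

    PriceFormatter.format(1250) # => "12.5 €"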

Rather than seeing the review process as a waste of time, seeing it as a moment dedicated to learning (for you and your peers) can make it very motivating! Some people do think that certain work wouldn’t benefit from being reviewed at all; that’s a debatable and interesting point, about which you can find some thoughts in this article from Jeroen Moels: No, your Pull Request does not need a review.

Delivering impactful reviews: my personal guide

Before we dig into how to deliver benevolent reviews (which is just as important as delivering impactful reviews), let’s see how reviews can have an impact on your peers’ skills and project knowledge.

First, understand what issue your peer is trying to solve

Let’s not put the cart before the horse. Before reviewing any code or any solution to a problem, you need to understand what the problem is. This might sound obvious, but it’s easy to forget.

Understanding the problem isn’t only about thinking of “How could it be solved?” but also about “Is it really a problem?” and “Should we really solve it?”. Sometimes, challenging a proposition also means challenging the reason for that proposition. A bit of pragmatism and common sense can save you many hours of reviewing, coding and discussion.

Try understanding the work piece by piece, reading it like a robot

Right. You’re convinced the work is useful and the issue is a real issue. Now let’s get our hands dirty and actually start the review.

Before going deep into the review, I first like to skim over the work, trying to find its core parts and doing some “robot inspection” to spot:

  • Small issues (method and class naming, file locations…)
  • Typos
  • Inconsistencies within the document, but also with the rest of the codebase / documentation or with the company’s guidelines
  • Obvious security weaknesses (SQL injection, XSS… you name it; see the sketch right after this list).
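To make the security part concrete, here is the classic pattern I look for during this first “robot” pass. The model and parameters are purely illustrative (Rails-flavoured Ruby), not code from an actual PR:

    # A classic smell to flag during the "robot inspection": raw string interpolation in a query.
    User.where("email = '#{params[:email]}'") # vulnerable to SQL injection

    # What I'd suggest instead: let the ORM handle the escaping.
    User.where(email: params[:email])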

With these first ideas and things to spot in mind, you could already have a great impact with your review, particularly when it comes to security.

Time to dig deeper.

Now that you have the main pieces in mind, understand the whole mechanism and make sure it solves the original problem

The second read is more about understanding how the pieces interact both with each other and with the rest of the project, and checking that we are really solving the original issue.

To do so, especially when I’m reviewing a code Pull Request, I don’t browse the work chunk by chunk or file by file as displayed in GitHub. Instead, I start from the entry point of the PR and follow the code flow, so I can almost run the algorithm in my head.

For example, if I were about to review the code of a feature adding some endpoints to an MVC (Model-View-Controller) application, I would start by reviewing the controller, as it’s the user’s entry point. Then I’d review the services used by this controller and the models it might rely on, and finally the views.
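To make that reading order concrete, here is a hypothetical “cancel appointment” endpoint annotated with the order in which I would read it. The names are invented for illustration, not Doctolib code:

    # 1. Entry point: the controller action the user hits.
    class AppointmentsController < ApplicationController
      def cancel
        appointment = Appointment.find(params[:id])   # 3. then the model it relies on
        AppointmentCancellation.new(appointment).call # 2. then the service it delegates to
        redirect_to appointments_path                 # 4. and finally the view / redirect target
      end
    end

    # 2. The service object, reviewed right after the controller.
    class AppointmentCancellation
      def initialize(appointment)
        @appointment = appointment
      end

      def call
        @appointment.update!(status: "cancelled")
      end
    end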

This way, I make sure that all of the code in the PR and in the existing codebase interacts consistently. It also helps me make sure that every added component is used (don’t hesitate to use the “Viewed” checkmark to keep track of the files you’ve already browsed!).

This second read is also a good moment to check if concerns are correctly separated, if each part of a system is doing what it’s supposed to do — not more, not less.

Working like this allows me to double check that the “happy path” of the solution is solving the original problem.

Think of some edge cases your peer wouldn’t have thought of — try to find what is wrong with your peer’s work

This part is about being extra challenging and making your review relevant not only from the point of view of the code or the scope of the document: giving insights and criticizing the work in light of its impact on the wider project is at least equally important.

I like to ask myself: “Did my peer check that his/her work actually fixed the issue? Is my peer proving it?”. Whether you’re reviewing a Pull Request, a product specification or a technical document: make sure that what you are reviewing proves that it’s actually solving the general issue.

For a PR, this can be a unit / integration / manual test, depending on the existing test strategies of your company. At Doctolib, we’re in love with integration tests and other funny things.
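For instance, a PR fixing a cancellation bug could carry a spec like the following as its proof. This is a minimal RSpec-style sketch with hypothetical names, reusing the imaginary AppointmentCancellation service from earlier:

    require "rails_helper"

    RSpec.describe AppointmentCancellation do
      it "marks the appointment as cancelled (the behaviour reported in the original issue)" do
        appointment = Appointment.create!(status: "confirmed")

        AppointmentCancellation.new(appointment).call

        expect(appointment.reload.status).to eq("cancelled")
      end
    end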

For a product specification, proposing a Key Performance Indicator (KPI) to measure the success (or failure, that’s okay) of the project brings a lot of value: it gives us a concrete way to validate the specification and to check that it actually helps solve the original issue.

You got it: it is not our job as reviewers to test the workflow and prove that it works. It is up to the submitter to provide enough proof to make reviewers confident enough to approve.

Finally, this last read is a good moment to nitpick (in a benevolent way ;)) and to ask some “stupid” questions about performance, security, disaster scenarios, etc. (“What happens if the DB gets disconnected?”, “If this variable is empty, will it crash?”…).
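Here is what such a “stupid” question can look like once translated into plain Ruby. The helper is hypothetical; the point is the empty-list case:

    Appointment = Struct.new(:starts_at)

    # "If this list is empty, will it crash?"
    def next_appointment_date(appointments)
      appointments.min_by(&:starts_at).starts_at # NoMethodError when the list is empty
    end

    # The safer version the review discussion might lead to:
    def safe_next_appointment_date(appointments)
      appointments.min_by(&:starts_at)&.starts_at # returns nil instead of crashing
    end

    puts safe_next_appointment_date([]).inspect # => nil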

Delivering impactful reviews is good, delivering impactful and benevolent reviews is much better

We all know it’s not easy to have our work criticized. We naturally tend to take the comments personally instead of thinking that the reviewers are separating the artist from the art.

To avoid offending my peers with my reviews (I have already offended some), I now have some simple rules and reminders I try to keep in mind so my colleagues don’t think I am a horrible person.

Performing benevolent reviews is way easier than performing impactful ones: the tricks for benevolent reviews are easily applicable to every comment. But an impactful review with an unkind tone will probably just get ignored or lead to unconstructive discussions, significantly reducing the real impact of your review.

Be as objective as possible

This statement might sound obvious, but a reminder doesn’t hurt. Reviewing your peers’ work is not about reviewing your peers. It’s easy to be more lenient with colleagues that you know well and trust, and more severe with those that you’ve worked with less.

Being more lenient with the work of a peer who is “more qualified” than you, “more comfortable with the technologies”, or [insert some other good excuse] doesn’t help you, your peer or the project you’re working on. By being more lenient than you should, in the best case you’ll miss an opportunity to propose an improvement; in the worst, you may miss an important security vulnerability.

I have one rule of thumb: be as severe with a trusted colleague as you are with a new joiner. It can feel confusing to speak of severity while we’re trying to find ways to deliver benevolent reviews, but it’s also about consistency across your reviews. If you want your reviews to be impactful and valued by your peers, you’d better make sure they look — and are — objective.

Being objective is one thing, but proving your review is objective is another. To do so, here are some simple tips I keep in mind:

  • Support your reviews with concrete arguments and examples, to make sure they will be clearly understood and accepted
  • Source your comments with external elements proving your point: the official documentation of libraries, some StackOverflow answers — always use a reliable source, not one that just defends your point of view
  • Even better: prove your points using facts — if you can share a code fiddle demonstrating a point, don’t hesitate to do so (see the sketch right after this list). Not to say “Hey, I’m right! Here is the proof:”, but to share the knowledge with a concrete example, hoping your peer will get better from it
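Here is the kind of tiny fiddle I mean. It is a hypothetical example showing that ||= silently overrides a legitimate false value:

    settings = { notifications_enabled: false }

    settings[:notifications_enabled] ||= true
    puts settings[:notifications_enabled] # => true: the user's explicit "false" was lost

    # The alternative suggested in the review comment:
    settings = { notifications_enabled: false }
    settings[:notifications_enabled] = true unless settings.key?(:notifications_enabled)
    puts settings[:notifications_enabled] # => false: the explicit choice is preserved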

Bring solutions rather than criticism when reviewing your peers’ work

More generally, reviewing code isn’t only about criticizing the work of your colleagues. Remember: you and your colleagues are in the same boat. If they’re struggling with a PR, you’ll probably struggle with them at some point.

So, why not take the opportunity to give solutions when you’re reviewing some work? If you point out an issue, make sure to also point to a way of handling it — or at least, admit you don’t have the solution and that you’re open to discussing it.

Sharing your potential solutions (a small piece of code, a rough proposition — not a rewrite of the whole PR) will steer your peer toward the best-fitting solution, avoiding back-and-forth discussions in the reviews and leading to solutions acceptable to both the reviewer and the submitter.
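For example, instead of just commenting “this method does too many things”, I might attach a rough sketch of the direction I have in mind. The names are hypothetical and it is deliberately not a finished rewrite:

    # "What do you think about extracting the eligibility rules into their own predicate?"
    def send_reminder(patient)
      return unless reminder_eligible?(patient)

      ReminderMailer.upcoming_appointment(patient).deliver_later
    end

    def reminder_eligible?(patient)
      patient.notifications_enabled? && patient.upcoming_appointments.any?
    end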

I’m often told I’m being harsh — let’s use some simple tricks not to be that harsh in reviews

Before submitting a review, I also spend a few seconds rereading my comments, to make sure the tone I’m using is not too harsh. I have a few tricks to soften my approach:

  • Ask directed questions instead of giving orders (remember, a review is a discussion)
  • Use emojis (sometimes I also use GIFs, I love them)
  • Ask the opinion of the submitter about a change you’re proposing, to stress that you’re not trying to be superior to the submitter but just discussing their work
  • Highlight the good ideas and the things you learnt from your peers’ work (“nice optimization in this method!”, “I didn’t know this function, thanks!” …)

Finally, I rarely use the “Request changes” feature of GitHub, GitLab, etc. unless I’m really convinced a PR will cause an issue if merged. I often feel it is a bit harsh to “request changes”, so if I’m not sufficiently convinced by a PR, I prefer to “conditionally” approve it (warning the submitter that my approval applies only if my comments are handled) or just leave a comment (neither approving nor requesting changes).

Well, that’s pretty much the advice I keep in mind when reviewing a PR. You might have noticed: we didn’t dig into technical or product aspects in this article. Delivering impactful reviews is more about structuring them well, while bringing some benevolence into them is more about how you communicate.

Everybody is capable of delivering impactful and benevolent reviews. It’s just about taking the time to do it. Don’t worry if spotting your peers’ mistakes or areas for improvement doesn’t come naturally — you’ll learn and improve as you go along!

While looking for inspiration for this article, I was a bit disappointed not to find as many articles on the subject as I expected, but here are two interesting ones I did find, which you might like too:

Many thanks to Sam Lukes for reading and correcting my English mistakes, to Arnaud Bellizzi for finding a catchy name and to Vanko Yutian Zhou for the wonderful illustrations of this article 💙
