Code reviews: A plausible approach

Jeevan D C · Published in The Startup · 5 min read · May 9, 2020

Disclaimer: This article may or may not convey something new. Stating the obvious helps.

Goals of code review:

Give and take feedback; code review is a means of collaboration!

Takeaways:

  • Highlighting common problems.
  • Benefits of fostering a good code review process.
  • Best practices for reviewers.
  • Best practices for authors.
  • Checklist for code review.

The main code review pitfalls are:

  • Not getting useful feedback.
  • Not having enough time to do code reviews.
  • Code reviews taking too long causing long waiting times for PR closure.
  • The review process is not constructive enough.

Here it comes…

To the reviewer…

https://twitter.com/heyjulesfern/status/1233175503680364549?s=20
  • Ask yourself, “How can I add value to this PR?”
  • Tone is key: stick to the facts, and explain.

Refrain from phrasing opinions as facts.

Instead of saying: Rename this variable to ...
Try this: For X reason, this variable doesn’t convey intent entirely. Can we make it more descriptive?

  • Phrase suggestions as questions

Indirection that evokes thought leads to more learning than blunt direction does.

Instead of saying: Use camel case instead of kebab case
Try this: Do you think it would be better to use camel case to maintain consistency across the codebase?

  • Avoid sarcasm

So-so: Why did you use threads here when there’s obviously no benefit to be gained from concurrency?
Better:

The concurrency model here is adding complexity to the system without any actual performance benefit that I can see. Do you think it’s best for this code to be single-threaded instead of using multiple threads?

As a rule of thumb, it’s usually better to give recommendations with reasoning than it is to just give point blank direction telling others what they should have done in the first place.

  • Avoid nitpicking on old code/ previous tech debt
    It’s not that the author knowingly left it unimproved; the PR has a clear goal and time frame associated with it, and the author might simply not have noticed.
  • Time-box the review
    Attention spans vary from person to person. For me, reviewing a single PR for more than 40 minutes in one go is too much.
    Maybe the PR should be broken down? Or, if its size was inevitable, review it over multiple sessions? # You decide
  • Appreciation of nice solutions

It doesn’t hurt to comment on the lines….

Learned a new approach from you. This will reduce code duplication by a large extent. Thanks!

  • Faster responses:

Be proactive in responding once the author has completed the changes or addressed a suggestion.

  • Empathy fuels connection
Timezone, culture, and language differences affect the way we communicate.
  • Always explain why you rejected a change
    Let’s be honest! No author enjoys a code change rejection. So, it is important that you are thoughtful and explain your rejection in a polite, constructive way.
  • Integrate code review into your daily routine
    Before standup/ after standup/ after lunch etc.
It works both ways: review others’ code promptly so that your own code gets reviewed promptly.
  • Comment on code. Not on the Person
    Imagine you are reviewing a piece of code that handles a socket connection:
    You are not closing the socket connection
    . . . . . . . . . vs. . . . . . . . . .
    This code doesn’t close the socket connection

    This subtle difference can create huge impacts.

To the reviewee/ authors…

  • Defining “Why” of the change
    Describe the purpose and motivation of the change.
    This practice improves code review participation rates immensely.
  • Practice lightweight PRs
    Smaller changes avoid TL;DR (Too Long; Didn’t Read) syndrome.
  • Be open-minded
    Make the reviewers feel you value and consider their feedback. That doesn’t mean all feedback has to be accepted. Let them know the why behind your decisions.
  • Pushing back on comments/ suggestions

Explain to the reviewer why the change is required if they may be missing context, or why this is the best approach after weighing the various pros and cons.

Remember: the reviewer didn’t understand, hence the comment. The reviewer’s perspective helps.

  • Link PRs to functional stories/ epics
    Helps to build the context.
  • Rotate the reviewers pool
    - Diverse perspectives.
    - Distribution of workload.
  • Notify people who benefit from this review
    e.g., for a code change in a common module, notifying the previous author(s) from another functional squad helps keep them updated on the change.

For both reviewer and author…

  • Save time

F2F > Screen share > Call > Slack/ text

  • Technical facts and data overrule opinions and personal preferences.
  • If no other rule applies, then the reviewer may ask the author to be consistent with what is in the current codebase.
  • A key point here is that there is no such thing as perfect code — there is only better code.
  • Reviews should happen irrespective of the seniority of the member.
  • One can’t be a good reviewer if one isn’t a good developer. The converse may not be true.
  • Ensure traceability for decisions
    Even though conversations are less traceable, it adds value to document discussions held over calls, code reviews, etc. via Jira tickets or Confluence.
  • Code reviews are a means of learning for both
    One dimension: as an author, can I learn (better approaches, conventions, missing edge cases, etc.) from the reviewers’ feedback?
    Another dimension: as a reviewer, can I learn (alternate approaches, business logic, etc.) from the author while doing the review?

Checklist

  • Can I understand it easily?
  • Does it stick to our coding guidelines?
  • Are there duplications in the code?
  • How easily can I debug this code?
  • Is there too much going on in one function/class?
  • Does it follow DRY, KISS, SOLID, and so on?
  • Can there be an alternate approach/ algorithm that increases readability/ performance with the least overhead?
  • Is the correctness (which can mean many things :P) of the code maintained?
  • Is there adequate unit test coverage?
  • Are functionality changes accounted for?
  • Reusability: can code move to common modules, or use an existing utility from common modules?
  • Does the design pattern usage fit the architecture?
  • Does it handle errors well?
  • more…
The struggle is to get it right.
