How to reduce visual regressions with quick peer based UI reviews

KD Singh Arneja
Published in Idyllic Geek
3 min read · Sep 15, 2015

Chances are your engineering team is Agile and builds UIs iteratively. God bless your heart if it is not. The argument that UI development cannot be done iteratively has long been settled. But after connecting with ex-colleagues, networking with counterparts at other firms, and in my own travels, I have come to see the gaps that surface every now and then in different companies. One of them is how to handle User Interface regressions during Agile development, especially when you are releasing every few weeks or days.

You say, well, a good set of unit tests and automation tests should have you covered. I say, not so fast. Yes, they will help (if written well) with changes in functionality, but what about the changes that are outside the purview of those tests? An obvious example is a sweeping CSS change after you have refactored a few screens in the name of a consistent look, feel, and color palette. An even more specific one: your team spends a spike/sprint making web pages responsive using flexbox. More often than not, such spikes introduce visual regressions in places that were not even targeted. Chances are you overwrote your own and your teammates' changes.

Now, unless you are one of the companies that can afford to invest in automated tests based on screenshot comparison, visual regressions are a moving target. Even with those tests in place, cross-browser visual bugs are harder still to catch. And guess who hates (or in some cases loves) these visual regressions? Your own QA team! Next thing you know there is a flood of tickets for small B***S*** issues: paddings and margins messed up, widths stuck at 100% in places, and it all looks bad. Developers usually overlook the little fact that there is a phantom cost attached to opening these tickets. The small test-fix-retest feedback cycles that could otherwise have been avoided now add up. A "just fix it" attitude is not acceptable here.
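As an aside, for teams that do go the screenshot-comparison route, a minimal sketch of such a test with a modern runner like Playwright Test might look like this; the staging URL, route, and snapshot name below are placeholders, not from any real project:

```typescript
// visual.spec.ts (run with: npx playwright test)
import { test, expect } from '@playwright/test';

test('settings page has no visual regression', async ({ page }) => {
  // Placeholder staging URL and route.
  await page.goto('https://staging.example.com/settings');

  // Compares the page against a stored baseline image and fails if the
  // pixel diff exceeds the threshold, flagging the kind of unintended
  // CSS fallout described above.
  await expect(page).toHaveScreenshot('settings.png', { maxDiffPixelRatio: 0.01 });
});
```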

Well, how to mitigate this? The simplest thing that has worked wonders for me is what I call the pair-with-a-peer UI review. It is a mouthful, I know, but it is nothing more than UI pair programming tailored towards visual inspection. The "strategy": when you are done making changes, invite another UI engineer to do a manual visual review with you. He or she should approach the exercise not just as a second set of eyes, but as an actual user doing quick, focused visual testing. It is best if that engineer does not know anything about the code you changed, but is aware of the functionality.

As a visual inspector:

1. Talk through the workflows, if applicable, and test them. If mocks are available, they help in walking through the workflows.

2. Do an A/B comparison of production vs. the new and related screens (a rough capture-script sketch follows this checklist) by:

  • looking at all visual aspects: paddings, alignments, consistency, etc.
  • checking colors and fonts
  • testing responsiveness on targeted devices
  • reviewing wordings and messages

Be on the lookout for changes to screens that were not supposed to be updated.
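To make that A/B pass quicker, it helps to have matching screenshots of production and the new build sitting side by side. Here is a rough capture-script sketch using Playwright; the routes, viewport sizes, and server URLs are illustrative placeholders only:

```typescript
// capture-pair.ts (run with: npx ts-node capture-pair.ts; needs: npm i playwright)
import { chromium } from 'playwright';

// Placeholder screens and device sizes; substitute your own.
const routes = ['/dashboard', '/settings'];
const viewports = [
  { name: 'phone', width: 375, height: 667 },
  { name: 'laptop', width: 1366, height: 768 },
];
const envs = {
  production: 'https://app.example.com', // live screens
  review: 'http://localhost:3000',       // author's local or pre-staging server
};

(async () => {
  const browser = await chromium.launch();
  for (const [envName, baseUrl] of Object.entries(envs)) {
    for (const vp of viewports) {
      const page = await browser.newPage({
        viewport: { width: vp.width, height: vp.height },
      });
      for (const route of routes) {
        await page.goto(baseUrl + route);
        // Full-page shots land next to each other on disk so the reviewer
        // can eyeball production vs. the new build per screen and device.
        const screen = route.replace('/', '') || 'home';
        await page.screenshot({
          path: `shots/${screen}-${vp.name}-${envName}.png`,
          fullPage: true,
        });
      }
      await page.close();
    }
  }
  await browser.close();
})();
```

The review itself stays human; a script like this only saves the reviewer from juggling tabs and resizing windows.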

Also, I am not implying that you grab hold of someone else's Git branch/codebase, build it, and then test. No. There are simple, practical ways to do this testing on a pre-staging server or even the author's local server. The idea is not to spend more than an hour on this. The task should be fast-tracked, and the speed comes once the exercise becomes part of the practice/culture rather than a process. Having done this myself, I am amazed at how many chicken-$hit issues my colleagues and I have found in each other's work before QA gets dibs on it.

The key takeaway is not to replace QA, but to do a quick UI bashing to find as many trivial bugs as possible before QA finds them. This, in turn, helps the QA team focus on finding serious bugs. We are far more likely to fix the trivial ones at this point. If we don't, then even in an Agile environment these issues become tickets, and those get triaged, sub-triaged, and scrutinized into bigger things than they were to begin with.

Happy coding!

KD Singh Arneja
Idyllic Geek

Senior Director, User Experience at IMOHealth. Founder and Principal @ Neza.