The Overstated Importance of Code Consistency

Eric Anderson · Published in The Startup · Aug 13, 2020

Style guides generally start as a few well-intentioned rules, but over time grow into a huge document micro-managing every line of code. What started as a strategy to save time by avoiding bike-shedding arguments now consumes time as we work to constantly keep the linters happy, even though the original code may be perfectly fine.

Avoid Errors

The only style rules that really matter are the ones that help you avoid errors. The number of rules that fall into this category is MUCH smaller than most style guides and linters suggest. How many depends a bit on the language: certain languages make it easy to do bad things, while other languages tend to be safe by default.

An example of a rule that helps avoid errors is consistent indentation. With inconsistent indentation (different widths, mixing tabs and spaces) you can misunderstand the control flow of a program.
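
A contrived sketch of how that bites (hypothetical code, not from the original article):

# Misleading indentation: the fee line is indented as if it were inside
# the `if`, but Ruby closes the conditional at `end`, so it always runs.
def process_account(overdrawn)
  if overdrawn
    puts "warning sent"
  end
    puts "fee charged" # looks conditional; executes unconditionally
end

process_account(false) # prints "fee charged" anyway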

We can argue about what rule is best (2 space characters obviously 😀), but the real key here is that everybody working on that code base is using the same settings. A width of 8 looks awful IMHO. But as long as everybody is using the same settings it helps avoid errors.
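
One common way to pin everybody to the same settings is an EditorConfig file checked into the repo. A minimal sketch (the two-space width is just my preference from above):

# .editorconfig — supporting editors pick these settings up automatically
root = true

[*.rb]
indent_style = space
indent_size = 2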

But style guides and linters don’t limit themselves to rules like indentation. They get into all sorts of opinions where being inconsistent doesn’t really matter, or where forcing consistency is even harmful.

The Problem with Excess Rules

Multiple forms usually exist because, in different situations, one may hold a slight readability advantage over another. For example, Ruby has two ways to express a hash literal. The first is the more general form:

{ :cat => 'meow', :dog => 'bark', :cow => 'moo' }

Later versions of Ruby (1.9 and up) introduced a more succinct form for the case where all the keys are symbols (common in Ruby):

{ cat: 'meow', dog: 'bark', cow: 'moo' }

In my opinion either is readable and neither leads to more errors. Both are acceptable.

Should we just pick one to use and ignore the other? Or maybe have a complex rule that requires different forms in different scenarios?

Let’s consider the harm in not defining a rule. You might end up with one piece of code using style 1 while another is using style 2. <…crickets…> Anything else? Anything bad? Does anybody really have trouble reading either style? Does either style lead to errors?

Now let’s see the harm in defining a rule.

Most likely there is a reason there is more than one way to do it, which means a blanket rule rarely works. You will probably end up with a general guideline plus a bunch of exceptions. An advanced linter will try to compensate by taking those exceptions into account, but it will eventually produce false positives, because it is just a machine and doesn’t really understand the code. The solution offered by the linters? Put some ugly comment before and after the code telling the linter to ignore that bit. Now we are writing code for the linter and not for other people. We have made our code LESS readable due to a linter.
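
With RuboCop, the most common Ruby linter, that escape hatch looks like this (Style/HashSyntax is the cop that polices hash literal style; the hash is our earlier example):

# rubocop:disable Style/HashSyntax
SOUNDS = { :cat => 'meow', :dog => 'bark', :cow => 'moo' }
# rubocop:enable Style/HashSyntax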

Or maybe you decide to bite your tongue and conform to the linter. Use something that you think is less readable for the situation just to make the linter happy. Again we end up with code LESS readable due to the linter.

Or maybe you don’t care and you feel either way is just as readable so you decide to conform to the linter. Now we have another commit, more CI runs, more squashing commits, more code reviews, etc. All just to change the code from one method that is perfectly readable to another method that is perfectly readable.

The harm in excess rules? They can reduce readability and create unnecessary work.

It’s OK to be Consistent

Should you expect anarchy when looking at my code? Hopefully not. I don’t want to sound like I’m against all consistency. I generally have a style that I follow. Back to our example: if I am writing a hash in Ruby and the keys are all symbols, I tend to use the newer notation. I don’t flip depending on my mood. Barring some reason not to, I prefer consistency.

It’s OK to be Inconsistent

Despite my desire for consistency, I also sometimes find myself purposely being inconsistent. Back to my hash example. When I am defining a Ruby rake task, the DSL looks like:

desc "Some task"
task :my_task => :some_dependency do
...code for task...
end

Note that I am passing a hash to task where the keys are symbols. So why was I inconsistent? Why did I use the old syntax? Why doesn’t it look like this:

desc "Some task"
task my_task: :some_dependency do
...code for task...
end

While both are valid, I find the => a bit nicer because it points at something; to me that signals the connection between the task and its dependency. Now, if I come across code that doesn’t use the arrow, am I suddenly confused? Do I suddenly no longer understand what is going on? Of course not.

Simplify the Tools

As I said, I’m not against linters and not against code consistency. I use static analysis on my projects to help me catch errors, security issues, code complexity issues, etc. But if a tool flags something that I don’t think reduces errors, and I’m perfectly fine with my way of doing it, then rather than conform to the linter I disable that rule. The rule doesn’t add value and I don’t want it bugging me in the future. Over time you tune the linters down to just what is needed, which lets you focus on the important things and not on pointless syntax changes.
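
With RuboCop, for example, that tuning lives in .rubocop.yml. A minimal sketch, assuming you have decided the hash literal cop adds no value:

# .rubocop.yml — drop the purely stylistic cop, keep the rest
Style/HashSyntax:
  Enabled: false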
