Stop Writing Unnecessary Tests

Adam Edwards
Pandera Labs
May 11, 2017

I have been a developer for about four years, and I am no stranger to a deplorable lack of test coverage in applications and microservices. I am a champion of writing unit tests myself. Wherever I am, I encourage everyone I work with to aim for 100% code coverage on any project they touch.

I have found, however, that people (developers/managers/product owners/etc.) aren’t fans of writing tests. I once had a manager tell me that I was spending too much time writing tests, and that most teams in the company only aim for about 20% coverage at any given time.

I was devastated. I mean, how could a manager want fewer tests? Did they want all of this code to break? It was like they were condemning me to have to fix something later on, even though I could invest the time and address these edge cases now.

Shortly after leaving that team, I started working on the front end of a project. I'm traditionally more of a Java guy, but I started doing some JavaScript in AngularJS. I was shocked to see that there were so many tests! Most modules had over 96% code coverage. I later found out why the coverage was so high. Tests were being written like:

expect(true).to.equal(true);

Or my favorite:

expect(true).to.not.equal(false);

What was going on? Why did one half of the engineering team abhor tests and write the bare minimum to keep things moving, while the other half wrote horrible tests that asserted nothing?

I fell into a pit of test-driven-despair…

I started writing tests in ways that were illegible and essentially pointless. It was so bad that I was writing PowerMock definitions for classes with static methods from the java.util package… But I was doing it so that I could get my application to 100%! I was doing the right thing! Even though my method was only one line long, I had written a test about 36 lines in length that essentially asserted:

expect(true).to.equal(true);

I'm not proud of that phase of my life, but a co-worker taught me a very valuable lesson about TDD (Test-Driven Development) and the practice of writing quality unit tests:

You should focus on addressing those lines of code where you are changing something, or where you are reacting to something. You don’t have to create a mock for a service if all you are doing is pumping a value into it. Hopefully that service has enough quality tests of its own that your little value isn’t going to break it.

After I learned that lesson, I realized that there were two kinds of tests that we as developers typically write: “reports” and “tests”.

If you are writing a test so you can get a gold-star at the end of a sprint/release cycle, then you are writing a report. If you are writing a test so that you can explain what your code is doing for the next developer, you are writing a report.

If you are writing a test to make sure that some conditional is handled appropriately, you are writing a real test. If you are writing a test to address an edge case appropriately and to ensure that your code handles it the same way each time that edge case is encountered, then you are writing a real test.

Report tests do no one any favors.

Writing tests in order to achieve 100% coverage without regard for the value of the test is a waste of your time, and everyone else's. "Reports" only illustrate how difficult your code is to understand. So if you find yourself writing a test that is 36 lines long, do yourself a favor: stop. If you have to write a novel of setup to reach the specific code you're testing, you should seriously consider refactoring to see whether you can isolate that piece better.

It's like writing documentation for your code. We all agree that writing comments that merely explain your code is bad. Your code should have good variable and function names and be concise enough that someone can understand, at a glance, what it is doing.

Well guess what:

Writing tests that explain your code is also bad.

So the next time you are writing a test, ask yourself these things:

  1. Is this test easy to read and understand?
  2. Does this test assert something that hasn’t already been tested? (For example, are you writing a test for a service you are depending on that has already been tested?)
  3. Do I have too much mock behavior defined in my test?

If you have answered yes to any of these questions, take a good, hard look at both your tests and your source code again. See if there is a way you can clean up or remove some pieces from either.

Chances are you could probably pull out some functions and isolate their inputs and outputs well enough to make something easier to test, and easier to read.

Don’t be afraid to refactor. Challenge yourself to have source code that you are proud of.
