Better Error Messages in Tests

Tihomir Tonov
Donatix
Mar 8, 2019

We have all encountered error messages in our test suites like Failed asserting that null is not null or Failed asserting that false is true, and we have no clue what the reason is or what we have broken.

Of course, under the error message there is the file name and the line number of the failed assertion, so we open the file and here is the content:

$this->assertTrue($user->is_active);

Aha, now we know what the real error is and, hopefully, how to fix it. And if we think this is an important path in our code and a common place for errors, we might add an additional message to the assertion, to be displayed when the assertion fails, like this:

$this->assertTrue($user->is_active, 'The user should be active');

So we made the test a bit better, but I would question whether this is the best bang for the buck, and I don’t encourage anyone to submit a big PR fixing all the non-descriptive errors in their project. The goal here is to know about the tool and use it when you need it.

Now let’s see some more complicated examples:

Enhancing array assertions

We all know about assertCount(), assertContains() and assertNotContains(), but when it comes to asserting that some DB record is (or is not) present in the returned response, it gets a bit tricky. One way to do it is to pluck the ids and assert on them, but when such a test fails, it only says Failed asserting that [1,2,3] not contains 2, which tells us nothing.

So what can we do?

One way is to use some string column on the model as the key to pluck and assert on its contents. And to make it clear why the test fails, we fill that column with the reason the test could fail. Here is an example:
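A sketch of such a test, assuming a Product model with name and price columns and a query that should exclude expensive products (the model, columns, and threshold here are illustrative):

```php
public function test_expensive_products_are_excluded()
{
    // The names describe why the test would fail if they showed up
    // (or went missing) in the result.
    factory(Product::class)->create(['name' => 'Cheap enough', 'price' => 10]);
    factory(Product::class)->create(['name' => 'Too expensive', 'price' => 1000]);

    $names = Product::where('price', '<', 100)->pluck('name');

    $this->assertContains('Cheap enough', $names);
    $this->assertNotContains('Too expensive', $names);
}
```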

This way, if the test fails, we will see from its output that the result contains the product named Too expensive, which should not be the case.

This trick can also be used for ordering: name the items something like first, middle and last, and when the test fails we will see an error like Failed asserting that [first, middle, last] is equal to [last, middle, first], which means we broke the ordering.
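The ordering variant of the trick might look like this (the Product model and views column are, again, illustrative):

```php
// Created deliberately out of order, so only the sorting can make the
// names line up as first, middle, last.
factory(Product::class)->create(['name' => 'last',   'views' => 10]);
factory(Product::class)->create(['name' => 'first',  'views' => 300]);
factory(Product::class)->create(['name' => 'middle', 'views' => 200]);

$names = Product::orderByDesc('views')->pluck('name')->all();

// On failure, PHPUnit prints a diff of the two arrays, so the wrong
// position is visible at a glance.
$this->assertEquals(['first', 'middle', 'last'], $names);
```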

Enhancing callback based assertions

Laravel allows us to mock its facades and check whether a given action was performed. Even further, we can pass an additional callback to perform some more checks on the data with which the action was performed. Too abstract? Let’s see an example:
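A minimal sketch of such a callback-based assertion, assuming a CheckNotification class with a public $domains array (the class name, the $domains property, and the domain values are assumptions for illustration):

```php
use Illuminate\Support\Facades\Notification;

Notification::fake();

// ... run the code under test that should send the notification ...

Notification::assertSentTo(
    $user,
    CheckNotification::class,
    function (CheckNotification $notification) {
        // The callback must return a boolean: true means "this is the
        // notification we expected".
        return in_array('domain1.com', $notification->domains)
            && ! in_array('domain2.com', $notification->domains);
    }
);
```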

This is a really powerful and useful concept, but with one downside: when it fails, it says The expected [CheckNotification] was not sent. Failed asserting that false is true. and we have no idea what the error is.

Let’s think about the possible errors:

  • We sent the notification to the wrong notifiable
  • We sent the wrong notification
  • The notification data misses domain1.com
  • The notification data includes domain2.com

And even though we cannot do much about the first two on this list, they are also much less likely to be the problem.

On the other hand, the error message for the last two cases can be changed easily. We write the assertions for those cases inside the callback and return true at the end. If any of the assertions fails, an exception will be thrown by PHPUnit and a better error will be presented to us. Here is the example from above rewritten in such a way:
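Under the same assumptions (a hypothetical CheckNotification with a public $domains array), the boolean checks become real assertions:

```php
Notification::assertSentTo(
    $user,
    CheckNotification::class,
    function (CheckNotification $notification) {
        // Each assertion fails with its own descriptive message instead
        // of collapsing everything into "false is not true".
        $this->assertContains('domain1.com', $notification->domains);
        $this->assertNotContains('domain2.com', $notification->domains);

        return true;
    }
);
```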

Now, if the test fails, we will see an error like this: Failed asserting that ['problem.com', 'normal.com'] not contains 'normal.com', and we will immediately know what failed the test case, without debugging the contents of the variables passed to the callback.

This approach can also be used for assertions on the parameters with which some mocked method was called. Abstract again? Let’s see an example:
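A sketch of the same idea with Mockery’s withArgs, using a hypothetical PaymentGateway::charge method (the class, method, and expected values are all illustrative):

```php
$gateway = Mockery::mock(PaymentGateway::class);

$gateway->shouldReceive('charge')
    ->once()
    ->withArgs(function ($amount, $currency) {
        // Plain PHPUnit assertions produce descriptive failures here
        // too, instead of "no matching handler found".
        $this->assertEquals(1000, $amount);
        $this->assertEquals('USD', $currency);

        return true;
    });
```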

There is one thing to keep in mind if you are trying to use withArgs in a Laravel project: you have to upgrade to a newer version of Mockery, because Laravel ships with an older one in which the withArgs method on mocks is not available. In my case, the upgrade was painless and went without any problems.

With that being said, I hope you liked the “tricks” I presented, that you will try them in your own projects, and that you will get better errors when your tests fail.
