Mutation testing

Gayan Perera
7 min read · Oct 6, 2019


What is testing?

There are several ways you can test your software components.
- Unit testing
- Integration testing
- E2E testing

Today we are going to focus on unit testing with Java.

What is unit testing?

Unit testing verifies the smallest pieces of your code, typically individual classes or methods, in isolation. There is a plethora of libraries you can use for unit testing; to name a few: JUnit, TestNG, etc.

How do you measure the quality of your tests?

Well, for this you use test coverage. Test coverage gives you an idea of how much of the production code was exercised by your unit tests. A typical coverage report shows, line by line, which parts of the code your tests executed.

Such a coverage report will tell you that your unit tests cover all of the production code you have written.

Well, that is not always true. Trust me, I have seen many times in my career that developers write unit tests with 100% coverage that still fail in production. The question is: how?

Let's take the following example:

package org.gap.medium.mutation;

public class Foo {
    public int fooSum(int x, boolean increaseX, int y, boolean increaseY) {
        if (increaseX) {
            x++;
        } else {
            x -= y % 2;
        }

        if (increaseY) {
            y++;
        } else {
            y -= x % 2;
        }
        return x + y;
    }
}

Pretty simple function, right? Now let's see some unit tests for this code.

package org.gap.medium.mutation;

import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;

import static org.assertj.core.api.Assertions.assertThat;

public class FooTest {
    private Foo foo;

    @BeforeTest
    void before() {
        foo = new Foo();
    }

    @Test
    void fooSum_Scenario_1() {
        assertThat(foo.fooSum(1, true, 4, true)).isEqualTo(7);
    }

    @Test
    void fooSum_Scenario_2() {
        assertThat(foo.fooSum(3, false, 5, false)).isEqualTo(7);
    }
}
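To see why both assertions expect 7: in scenario 1, x goes from 1 to 2 and y from 4 to 5, so the sum is 7. In scenario 2, increaseX is false, so x becomes 3 - 5 % 2 = 2, and increaseY is false, so y becomes 5 - 2 % 2 = 5, which again sums to 7.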

Pretty good, right? Let's see the coverage for these tests.

100%? Yes, the tests we just wrote actually give 100% coverage from the coverage tool's perspective. There are different tools you can use for test coverage reports; here I use the built-in coverage runner in IntelliJ IDEA, but even with JaCoCo the result is the same.
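As a side note, if you want the same kind of report from your Maven build rather than from the IDE, a minimal JaCoCo setup looks roughly like the following sketch (the plugin version here is only an example; pick whatever release is current):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.4</version>
  <executions>
    <execution>
      <!-- attaches the JaCoCo agent to the test JVM -->
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <execution>
      <!-- writes the HTML report after the tests run -->
      <id>report</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>

After mvn test, the HTML report should end up under target/site/jacoco.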

Even though the coverage report says 100%, how can we be sure? What if the code was changed in such a way that the test inputs and expected results of the unit tests are still satisfied?

Let's make a small change to the code above in such a way that the unit tests will still pass.

Here is the modified code:

package org.gap.medium.mutation;

public class Foo {
    public int fooSum(int x, boolean increaseX, int y, boolean increaseY) {
        if (increaseX) {
            x++;
        } else {
            x -= y % 2;
        }

        if (increaseY) {
            y++;
        } else {
            y += x % 2; // modified -= to +=
        }
        return x + y;
    }
}

Here are the test results with the same tests: they still pass.
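Walking through scenario 2 against the modified code: x still becomes 3 - 5 % 2 = 2, and because x % 2 is 0, y stays at 5 whether we add or subtract it. The method still returns 7, so both tests stay green even though the logic changed.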

So how can we test our tests to make sure they are really testing the code? For this we use something called mutation testing.

What is mutation testing?

Basically, it changes your production code using bytecode engineering and runs your unit tests against the mutated code to measure how many test cases fail. If you have written the right set of unit tests, they should fail for, and thereby kill, each mutant.
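To make this concrete, here is roughly what a single mutant of our fooSum method could look like. PIT mutates bytecode, so you never see a mutant as source; the snippet below is only an illustration of a negate-conditionals style mutation:

// mutant: the first condition has been negated
if (!increaseX) {
    x++;
} else {
    x -= y % 2;
}

A good test suite should detect this behavioural change and fail, which kills the mutant; if every test still passes, the mutant survives and points at a gap in your tests.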

How can I add support for mutation testing?

Well, there are several libraries out there for mutation testing in different languages, such as pitest, Stryker, etc.

In this story we will use pitest to perform mutation testing on the sample code we just looked at.

http://pitest.org

I suggest you go through http://pitest.org/quickstart/maven/ to set up your project for mutation testing. The following is the configuration I added to my sample project.

<dependencies>
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>6.14.3</version>
  </dependency>
  <dependency>
    <groupId>org.assertj</groupId>
    <artifactId>assertj-core</artifactId>
    <version>3.13.2</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.pitest</groupId>
      <artifactId>pitest-maven</artifactId>
      <version>1.4.10</version>
      <configuration>
        <targetClasses>
          <param>org.gap.medium.mutation*Foo*</param>
        </targetClasses>
        <targetTests>
          <param>org.gap.medium.mutation*Foo*</param>
        </targetTests>
        <testPlugin>testng</testPlugin>
        <mutators>STRONGER</mutators>
      </configuration>
    </plugin>
  </plugins>
</build>

Note: At the time of writing this story, the latest pitest version (1.4.10) only supports TestNG 6.x releases.

You can control the production classes you want to mutate with the <targetClasses> parameter, and <targetTests> controls which tests are executed as part of the mutation analysis.
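With the plugin configured, you can run the analysis with the mutationCoverage goal, as described in the pitest Maven quickstart:

mvn org.pitest:pitest-maven:mutationCoverage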

Now let's see the report for our sample project.

================================================================================
- Mutators
================================================================================
> org.pitest.mutationtest.engine.gregor.mutators.IncrementsMutator
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.RemoveConditionalMutator_EQUAL_ELSE
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.ReturnValsMutator
>> Generated 1 Killed 1 (100%)
> KILLED 1 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.MathMutator
>> Generated 5 Killed 4 (80%)
> KILLED 4 SURVIVED 1 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.NegateConditionalsMutator
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
================================================================================
- Timings
================================================================================
> scan classpath : < 1 second
> coverage and dependency analysis : < 1 second
> build mutation tests : < 1 second
> run mutation analysis : < 1 second
--------------------------------------------------------------------------------
> Total : 1 seconds
--------------------------------------------------------------------------------
================================================================================
- Statistics
================================================================================
>> Generated 12 mutations Killed 11 (92%)
>> Ran 36 tests (3 tests per mutation)

You can clearly see that the mutation coverage is not 100%. To investigate where mutants survived in the production code, you can open the HTML report at {build-dir}/pit-reports/{timestamp}/index.html. The following is some of the information from that report.

The HTML report highlights the lines where mutants survived (shown in red) and tells you which mutators produced them. Mutators are the operators that actually mutate your code; PIT ships with a number of them, and you can find more details at http://pitest.org/quickstart/mutators/
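In our report the only survivor comes from the MathMutator, and given our test inputs it is effectively the same -= to += change we made by hand earlier:

y += x % 2; // surviving mutant of: y -= x % 2;

With the scenario 2 inputs, x ends up as 2, so x % 2 is 0 and the original and mutated lines both leave y at 5. The tests cannot tell the two apart, so the mutant survives.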

Now let's try changing the tests. In this particular scenario we only need to change the input parameters, but in some cases we might need to add more tests.

package org.gap.medium.mutation;

import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;

import static org.assertj.core.api.Assertions.assertThat;

public class FooTest {
    private Foo foo;

    @BeforeTest
    void before() {
        foo = new Foo();
    }

    @Test
    void fooSum_Scenario_1() {
        assertThat(foo.fooSum(1, true, 4, true)).isEqualTo(7);
    }

    @Test
    void fooSum_Scenario_2() {
        // I changed the first input from 3 to 4
        assertThat(foo.fooSum(4, false, 5, false)).isEqualTo(7);
    }
}
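Why does this small change help? With x = 4 and increaseX = false, x becomes 4 - 5 % 2 = 3, so x % 2 is now 1. The original code computes y = 5 - 1 = 4 and returns 3 + 4 = 7, while the += mutant computes y = 5 + 1 = 6 and returns 9. The assertion now fails against the mutant, so it gets killed.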

Let's run the mutation coverage again.

================================================================================
- Mutators
================================================================================
> org.pitest.mutationtest.engine.gregor.mutators.IncrementsMutator
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.RemoveConditionalMutator_EQUAL_ELSE
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.ReturnValsMutator
>> Generated 1 Killed 1 (100%)
> KILLED 1 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.MathMutator
>> Generated 5 Killed 5 (100%)
> KILLED 5 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
> org.pitest.mutationtest.engine.gregor.mutators.NegateConditionalsMutator
>> Generated 2 Killed 2 (100%)
> KILLED 2 SURVIVED 0 TIMED_OUT 0 NON_VIABLE 0
> MEMORY_ERROR 0 NOT_STARTED 0 STARTED 0 RUN_ERROR 0
> NO_COVERAGE 0
--------------------------------------------------------------------------------
================================================================================
- Timings
================================================================================
> scan classpath : < 1 second
> coverage and dependency analysis : < 1 second
> build mutation tests : < 1 second
> run mutation analysis : 1 seconds
--------------------------------------------------------------------------------
> Total : 1 seconds
--------------------------------------------------------------------------------
================================================================================
- Statistics
================================================================================
>> Generated 12 mutations Killed 12 (100%)
>> Ran 36 tests (3 tests per mutation)

Now you can see you have 100% mutation test coverage.

Summary

In summary, we looked at writing unit tests for a simple function and learned that sometimes, even though the coverage tools report 100% coverage, in reality we don't have it. As an extra step to secure the quality of our unit tests, we used mutation testing. It helped us improve our unit tests, in our case by improving the input test data so that our tests exercise the code more thoroughly.

Happy unit testing, devs!
