Testing of Annotation-Processors Reloaded

Tobias Stamann
Holisticon Consultants
7 min read · Jan 30, 2024

This article demonstrates how black-box and unit tests of annotation-processor-related code can be written using the CUTE framework.

Testing is an essential part of software development, ensuring the quality of the software you are building.

You probably know the testing pyramid. It describes the different categories of testing, from unit tests through integration tests up to end-to-end tests.

This works well for "normal" Java development. But when it comes to testing annotation processors, in most cases only black-box tests are used: annotation processors are tested as a whole by checking the outcome of compiling test source files that are processed by the annotation processor under test.

It would be great to also be able to write unit tests and to test processor-related code at a much smaller scale, but unfortunately this is very hard to do, for several reasons:

The first reason is that annotation processors work with the Java compile-time model, which is really hard to mock.

Second, there is no out-of-the-box support for writing unit tests in the most commonly used annotation processor test frameworks.

That's why I would like to introduce the Compile-Time Unit-Testing framework (CUTE for short) to you.

Here's a short overview of its capabilities:

  • It allows both black-box and unit tests of annotation processors.
  • It is based on the Java Compiler API (an in-process compiler), so annotation processors can be debugged during test execution.
  • It is compatible with all testing and mocking frameworks (JUnit 4&5, TestNG, Mockito, …).
  • It provides out-of-the-box checks for the compilation outcome, such as generated source, resource, and class files and compiler messages (equals/contains checks).
  • It provides extended debug output if a test fails (e.g. generated files are written to disk, all compiler messages are logged, …).
  • It has just a single, uncommon dependency, so version clashes are unlikely.

You only need to add the following dependency to your project to be able to use the framework:

<!-- maven style -->
<dependency>
    <groupId>io.toolisticon.cute</groupId>
    <artifactId>cute</artifactId>
    <version>1.7.0</version>
    <scope>test</scope>
</dependency>

The source code can be found on GitHub: https://github.com/toolisticon/cute

Black-Box Testing

For black-box testing, test cases are defined by source files to which your annotation processor is applied during compilation. The compilation outcome can then be validated once compilation has finished.

// Provides an immutable fluent API,
// so it's possible to share a common
// setup between tests
CuteApi.BlackBoxTestSourceFilesInterface compileTestBuilder;

@Before
public void init() {
    compileTestBuilder = Cute
            .blackBoxTest()
            .given()
            .processors(YourProcessor.class);
}

// a successful compilation
@Test
public void testYourProcessor() {
    compileTestBuilder
            .andSourceFiles("testcases/yourTestSourceFile1.java")
            .whenCompiled().thenExpectThat()
            .compilationSucceeds()
            .andThat().generatedSourceFile("your.test.package.ExpectedGeneratedSourceFile").exists()
            .executeTest();
}

// a failing compilation that must contain a specific compiler error message
@Test
public void test_invalid_usage_without_noarg_constructor() {
    compileTestBuilder
            .andSourceFiles("testcases/yourTestSourceFile2.java")
            .whenCompiled().thenExpectThat()
            .compilationFails()
            .andThat().compilerMessage()
            .ofKindError()
            .contains("SNIPPET FROM ERROR MESSAGE LIKE AN ERROR CODE")
            .executeTest();
}

Source files must be located in the "src/test/resources" folder of your project and must have either a ".java" or a ".java.ct" suffix. (Some IDEs like Eclipse compile "*.java" files in resource folders, which can lead to errors; in this case ".java.ct" should be used for best compatibility.)

Furthermore, they must contain at least one annotation processed by your annotation processor, otherwise the test will fail (to prevent false positives in case compilation is expected to succeed).
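
For illustration, a minimal test source file could look like the following sketch. The annotation @YourAnnotation and the class name are placeholders for whatever your processor actually handles, so adapt them to your own setup:

// -------------------------------------------------------------
// located in file /src/test/resources/testcases/yourTestSourceFile1.java
// -------------------------------------------------------------

// @YourAnnotation stands for the annotation
// handled by the annotation processor under test
@YourAnnotation
public class YourTestClass {
}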

Additionally, you can add some criteria to check the outcome of the compilation.

The fluent API allows you to check:

  • whether the compilation succeeded or not.
  • whether specific source, resource, or class files have been created and whether their content is as expected. There are some predefined matchers to check file contents, but it's also possible to use custom matchers.
  • whether generated classes behave correctly. Generated classes can be tested as well, but there are some limitations; we will come back to this a bit later.
  • whether certain compiler messages have been written; info, warning, and error messages can be checked for equality with or containment of an expected string. Additionally, it's possible to check their exact location.

All instances returned by the fluent API are immutable, so it's possible to share a basic setup (e.g. defining the processor to use) between tests.

Unit Tests

Unit tests can be written easily as well; the unit test code is defined via the fluent API. For example:

@Test
public void test_yourTestMethod() {

    Cute
            .unitTest()
            .when()
            .unitTestWithoutPassIn(processingEnvironment -> {

                // Add your test code and assertions here.
                // You can look up Elements by using the passed-in processingEnvironment parameter.
                // Additionally, you can use all kinds of test and mock frameworks for testing.

                YourToolingClassUnderTest unit = new YourToolingClassUnderTest();
                MockedClass mock = Mockito.mock(MockedClass.class);
                MatcherAssert.assertThat(unit.doSomething(mock), Matchers.is("OK"));

            })
            .executeTest();

}

In addition to your unit test code, it's possible to add checks for the compilation outcome, generated source, resource, and class files, expected exceptions, and compiler messages via the fluent API.

In most cases your unit test code needs some kind of context. Usually this is either a specific Element or an initialized annotation processor instance. The CUTE framework offers different ways to pass in such contextual information, helping you reduce the initialization code needed for testing.

The context can be provided either by a precompiled class or by a source file that is compiled during the unit test.

Passing in Elements via Pre-Compiled Classes

A static inner class next to your unit test method can be used to pass in the context. The class must contain exactly one element annotated with the PassIn annotation, which will then be passed into your unit test.

// -------------------------------------------------------------
// located in unit test class
// -------------------------------------------------------------

static class PassInClass {

    @PassIn
    public void passedInMethod() {
    }

}

@Test
public void test_testMethodWithPassIn() {

    Cute
            .unitTest()
            .when()
            .passInElement().<ExecutableElement>fromClass(PassInClass.class)
            .intoUnitTest((processingEnvironment, element) -> {

                // add your unit test code and assertions

            })
            .executeTest();

}

Providing context via precompiled classes has a major drawback: annotations without runtime retention are no longer available, so you can't use this approach if the code under test processes any kind of non-runtime annotations.
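
For example, an annotation declared with SOURCE retention, like the hypothetical one sketched below, is discarded during compilation and therefore can't be found on a precompiled class; in such cases the context should be passed in via a source file instead:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation with SOURCE retention: the compiler discards it,
// so it never ends up in the class file and cannot be looked up on
// elements of a precompiled class in a unit test.
@Retention(RetentionPolicy.SOURCE)
@Target(ElementType.METHOD)
@interface SourceOnlyAnnotation {
}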

Passing in Context via Source Files

It's also possible to pass in the context via a source file. In this case, the element to pass in is marked with the PassIn annotation; if your source file contains more than one candidate element, the PassIn annotation must be added to exactly one of them.

// -------------------------------------------------------------
// located in file /src/test/resources/testcase/PassInClass.java
// -------------------------------------------------------------
class PassInClass {

    @PassIn
    public void passedInMethod() {
    }

}

// -------------------------------------------------------------
// located in unit test class
// -------------------------------------------------------------
@Test
public void test_testMethodWithPassIn() {

    Cute
            .unitTest()
            .when()
            .passInElement()
            .<ExecutableElement>fromSourceFile("/testcase/PassInClass.java")
            .intoUnitTest((processingEnvironment, element) -> {

                // add your unit test code and assertions

            })
            .executeTest();

}

Alternatively, if you need more than one source file in your test, you can define them in the unit test's given section, with one limitation: across the given source files there must be exactly one element annotated with the PassIn annotation!

@Test
public void test_testMethodWithPassIn() {

    Cute
            .unitTest()
            .given().useSourceFiles(
                    "/testcase/PassInClass.java",
                    "/testcase/OtherClass.java")
            .when()
            .passInElement().<ExecutableElement>fromGivenSourceFiles()
            .intoUnitTest((processingEnvironment, element) -> {

                // add your unit test code and assertions

            })
            .executeTest();

}

Passing in an Initialized Annotation Processor Instance

If you want to test an annotation processor directly, it’s nice to have an initialized processor instance passed into your test.

@Test
public void test_testMethodWithPassIn() {

    Cute
            .unitTest()
            .given().useSourceFiles("/testcase/PassInClass.java")
            .when()
            .passInProcessor(YourProcessor.class)
            .intoUnitTest((processor, processingEnvironment) -> {

                // add your unit test code and assertions

            })
            .executeTest();

}

Supported Checks

Let's take a look at the supported check types. Most checks can be used in both black-box and unit tests:

@Test
public void testYourProcessor() {
    compileTestBuilder
            .andSourceFiles("testcases/yourTestSourceFile1.java")
            .whenCompiled().thenExpectThat()
            // compilation outcome
            .compilationSucceeds()
            // generated file checks
            .andThat()
            .generatedSourceFile("your.testpackage.ExpectedGeneratedSourceFile")
            .exists()
            .andThat()
            .generatedResourceFile("your.resourcepackage", "file.xml")
            .matches(
                    CoreGeneratedFileObjectMatchers
                            .createIsWellFormedXmlMatcher(),
                    CoreGeneratedFileObjectMatchers
                            .createContainsSubstringsMatcher("YourSubstring"),
                    CoreGeneratedFileObjectMatchers
                            .createBinaryMatcher(
                                    JavaFileObjectUtils.readFromResource("expectedFile.xml")),
                    (fileObject) -> {
                        // do your custom checks
                        return true; // or false
                    }
            )
            // compiler message
            .andThat().compilerMessage().ofKindError()
            .atLine(12)
            .atColumn(5)
            .atSource("/your/testpackage/ExpectedGeneratedSourceFile")
            .equals("abc") // or .contains()
            // test generated class
            .andThat().generatedClassesTestedSuccessfullyBy(
                    (cuteClassLoader) -> {
                        AnImplementedInterface unit = (AnImplementedInterface) cuteClassLoader
                                .getClass("your.testpackage.ExpectedGeneratedSourceFile")
                                .getConstructor().newInstance();

                        MatcherAssert.assertThat(unit.doSomething(), Matchers.is("OK"));
                    }
            )
            .executeTest();
}

Testing generated classes with CUTE only makes sense if the generated class implements a precompiled interface, because otherwise only the Java reflection API can be used for testing, which is very cumbersome.

Prefer an extra integration test submodule if no such interface is present.
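
For reference, the generated-class check shown above assumes a precompiled interface roughly like the following sketch; AnImplementedInterface and doSomething() are taken from the example, but the exact shape of your interface will of course differ:

// Precompiled interface implemented by the generated class, so the test
// can cast the created instance and call it without using reflection.
public interface AnImplementedInterface {

    String doSomething();
}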

Testing Annotation Processors based on APTK

The Annotation Processor Toolkit (APTK) is a framework that helps you build annotation processors more efficiently by providing a lot of useful tools. Unfortunately, these tools need to be initialized, which is usually done by the APTK processor base class.

APTK provides some unit test base classes so that this initialization doesn't have to be done manually over and over again in unit tests.

You need to add the following dependency:

<dependency>
    <groupId>io.toolisticon.aptk</groupId>
    <artifactId>aptk-parent</artifactId>
    <version>0.24.0</version>
    <scope>test</scope>
</dependency>

Unit test code then looks like this:

@Test
public void test_AnnotationToWrap_getSimpleName() {
    Cute.unitTest()
            .when()
            .unitTestWithoutPassIn(new APTKUnitTestProcessorWithoutPassIn() {
                @Override
                public void aptkUnitTest(ProcessingEnvironment processingEnvironment) {

                    // Add your unit test code and assertions here

                }
            })
            .compilationShouldSucceed()
            .executeTest();
}

There are four such base classes for tests, one for each combination of passed-in Element and Processor instance.

Give it a try

With CUTE it's easy to write both black-box and unit tests for annotation-processor-related code.

You'll find the CUTE code on GitHub: https://github.com/toolisticon/cute

Give it a try — I’m looking forward to your feedback on it!


Tobias Stamann
Holisticon Consultants

I am a freelance IT consultant from Hamburg, Germany. My main focus is JVM-based development, process automation, and DevOps.