Best Practices for Integration Testing in Java

Nir Shafrir
Published in Javarevisited
Apr 23, 2024 · 10 min read

Recently, I was looking to introduce robust integration testing to my company. Specifically working in Java with an SQL environment, I started exploring frameworks, tools, and approaches for integration tests and I wanted to share my journey around Java Integration testing.

So I started researching the features and characteristics that would help me efficiently build out a scalable integration testing environment, and I quickly realized that the answer depends on what kind of integration tests I want to run.

In Java, there are narrow integration tests, which are essentially unit tests that exercise the classes communicating with the application's I/O in some way. For these tests, I use Testcontainers combined with the normal unit test framework, along the lines of the sketch below.
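
For illustration, here is a minimal sketch of such a narrow integration test using Testcontainers with JUnit 5 and PostgreSQL. The PersonRepository and Person classes are hypothetical stand-ins for your own data-access code; only the Testcontainers and JUnit pieces are real APIs.

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

@Testcontainers
class PersonRepositoryIT {

    // A throwaway PostgreSQL instance, started once for this test class
    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16-alpine");

    @Test
    void savedPersonCanBeReadBack() {
        // PersonRepository is a hypothetical data-access class pointed at the container
        var repository = new PersonRepository(
                postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword());

        var id = repository.save(new Person("John", "Smith"));

        Assertions.assertEquals("John", repository.findById(id).firstName());
    }
}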

The other, more complicated, approach is wide integration testing: multiple systems passing calls and messages to each other in some sort of end-to-end setup, though not necessarily the entire solution. What is appropriate here depends heavily on the specific system.

If I have a lot of services/systems I’d also question whether wide integration testing is the appropriate way to go. These kinds of tests require a lot of maintenance and are generally only used for the most critical of flows. What is more common, especially in a microservice environment, is the combination of narrow integration tests, system tests, and contract tests — and the combination of these three levels of testing provides you with a high degree of certainty when making changes.

For unit, narrow integration, and system tests, I usually containerize everything and make it all executable during build, both on local machines and in pipelines. The shorter the feedback loop, the better (also, debugging is easier when you can run it locally).

If you're aiming for readability beyond the development team, consider using frameworks like Cucumber or FitNesse. These tools provide a high-level overview of test functionality, making it accessible to non-developers as well. If the tests are meant for developers, consider using the regular unit test framework so that developers can write tests the way they are used to.
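
As a rough sketch of the Cucumber style (the PersonClient, PersonRequestDto, and scenario wording here are hypothetical, not a prescribed setup), the plain-language scenario lives in a feature file while Java step definitions bind it to real calls:

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.micronaut.http.HttpResponse;
import io.micronaut.http.HttpStatus;
import org.junit.jupiter.api.Assertions;

// Feature file (readable by non-developers), kept under src/test/resources:
//   Scenario: Creating a person
//     Given a valid person request
//     When the person is created
//     Then the response status is CREATED
public class PersonSteps {

    private PersonClient client;          // hypothetical HTTP client, wired up by the test runner
    private PersonRequestDto request;
    private HttpResponse<?> response;

    @Given("a valid person request")
    public void aValidPersonRequest() {
        request = new PersonRequestDto("John", "Smith", "jsmith@codaholic.com");
    }

    @When("the person is created")
    public void thePersonIsCreated() {
        response = client.create(request);
    }

    @Then("the response status is CREATED")
    public void theResponseStatusIsCreated() {
        Assertions.assertEquals(HttpStatus.CREATED, response.getStatus());
    }
}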

As far as I know, there's no silver-bullet framework and no single right approach; so much depends on the specific system at hand. But there are some useful tools and habits you can adopt to succeed at integration testing.

So, what is Integration Testing in simple words?

Integration is a very loaded term. By definition, it’s how two independent things work together.

Unit testing exercises code in isolation, meaning every API call, database call, and call to a different module is mocked out. Integration testing runs that code without the mocks to see how the pieces work together.

Integration testing is testing how two service classes interact.

Integration testing is testing how a service interacts with a datastore.

Integration testing is testing how the UI responds to the back end.

There are several more examples; just know that it's about how two things work together. Essentially, any test that's not a unit test is an integration test by pure definition. We use other terms to specify what type of integration test it is, like contract test, API test, acceptance test, etc.

You can check out this talk by Naresh Jain. He talks about every layer in the test pyramid (including integration) and gives examples.

I’ve timestamped it for when he talks specifically about integration:

https://youtu.be/ApQgaqafdR8

So, as you can see, integration testing is a broad term, and what it covers will depend on the context of your project. I really like the test pyramid article on Martin Fowler's site, which includes different types of integration test scenarios. You folks probably know it: https://martinfowler.com/articles/practical-test-pyramid.html

When planning Integration Testing, consider:

  • When should the test run? During build? Pipeline? In its own environment? All of the above?
  • Should developers be able to run these tests locally?
  • Does the business need to understand the tests?
  • What will the developers be comfortable writing? Will they accept something like Cucumber or are they more prone to embrace something they already know?

Best Practices for Integration Testing in Java

  1. Separation of concerns
  2. Use Insightful Names for Test Packages
  3. Use Insightful Names for Test Cases
  4. Expected vs Actual
  5. Write Simple Test Cases
  6. Use Appropriate Assertions
  7. Test Production Scenarios
  8. Avoid Code Redundancy
  9. Make Use of Annotations
  10. Make Use of Continuous Feedback
  11. Test behavior and not implementation

1. Separation of concerns

It is good practice to keep the test classes separate from the main source code. This way, tests can be written, executed, and managed independently of the code that runs in production.

Additionally, keeping your test and source code separate helps prevent test code from ever being executed in the production environment.

Build tools like Maven and Gradle follow this convention by default: they look for test sources in the src/test/java directory, alongside the production sources in src/main/java.

2. Use Insightful Names for Test Packages

It is important to use an insightful package-naming strategy under the src/test/java directory for test classes, to enhance the test code's readability and maintainability.

In other words, the names used for the test class package should align with the source class’s package that it will be testing.

If the Todo class is located in the com.digma.task package, then the TodoTest class should also be in the com.digma.task package, within the src/test/java directory structure.
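
For illustration, the resulting layout in a standard Maven or Gradle project would look roughly like this:

src/
├── main/java/com/digma/task/Todo.java        (production class)
└── test/java/com/digma/task/TodoTest.java    (test class in the matching package)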

3. Use Insightful Names for Test Cases

Test names should give anyone reading them clear insight into the test's behavior and expected outcome at a glance.

For instance, an integration test named testCreate is vague as it provides little insight into the test scenario or expectation.

Hence, it is important to name tests after the action and the expected outcome, like
testPersonWithCreationRequestThatReturnsCreated and
testPersonWithUpdateRequestThatReturnsUpdated.

You can improve these names further; for better readability, name test cases using the given_when_then convention:

givenValidPersonRequest_whenCreatePerson_thenPersonIsCreated()
givenValidPersonRequest_whenUpdatePerson_thenPersonIsUpdated()

import io.micronaut.http.HttpStatus;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;

@MicronautTest
public class IntegrationTests {

    @Inject
    private PersonClient client; // HTTP client for the application under test (defined elsewhere)

    @Test
    public void givenValidPersonRequest_whenCreatePerson_thenPersonIsCreated() {
        // given
        var requestDto = new PersonRequestDto("John", "Smith", "jsmith@codaholic.com");

        // when
        var response = client.create(requestDto);
        var person = response.body();

        // then
        Assertions.assertEquals(HttpStatus.CREATED, response.getStatus());
        Assertions.assertNotNull(person);
    }
}

In the snippet above, the test body is described in the Given, When, Then format, which divides the test into three segments: input, action, and output.

The code in the given section initializes the test objects, prepares the data, and organizes the input.

The code in the when section performs the specific action or behavior under test.

Finally, the then section captures the code's results and compares them with the expected outcome using assertions.

4. Expected vs Actual

A test case needs a statement that compares the expected value with the actual value.

To see the expected-versus-actual convention in action, we can look at the signature of the assertEquals method in JUnit 5's Assertions class.

public static void assertEquals(Object expected, Object actual)

Let's use this assertion in one of our test cases.

@MicronautTest
public class IntegrationTests {

    @Inject
    private PersonClient client;

    @Test
    public void testCreate() {
        var requestDto = new PersonRequestDto("John", "Smith", "jsmith@codaholic.com");
        var response = client.create(requestDto);
        var person = response.body();
        Assertions.assertEquals(HttpStatus.CREATED, response.getStatus());
    }
}

5. Write Simple Test Cases

Write test cases that are very transparent and straightforward. They need to be precise and succinct.

In every software project, the goal is to deliver functionality that aligns with customer needs; your test cases should mirror that, being easy to read and simple to follow. As a software engineer, you must consider the perspective of the end user when writing test cases.

It is necessary to write a simple test scenario that validates an expected value against the actual one.

While it may be necessary to include some logic in a test case, it's important not to go overboard. Above all, avoid copying production logic into a test case just to make the assertions pass; the sketch below contrasts the two styles.
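
As a small illustrative sketch (the Person class and its getFullName method are assumptions, not part of the article's code), the first test duplicates the production logic in its assertion, while the second simply states the expected value:

// Discouraged: the expected value is rebuilt with the same logic the production
// code uses, so a bug in that logic is mirrored in the test and goes unnoticed.
@Test
void fullNameIsFormatted_duplicatesProductionLogic() {
    var person = new Person("John", "Smith");
    Assertions.assertEquals(person.getFirstName() + " " + person.getLastName(),
            person.getFullName());
}

// Preferred: the expected value is stated as a plain literal.
@Test
void fullNameIsFormatted() {
    var person = new Person("John", "Smith");
    Assertions.assertEquals("John Smith", person.getFullName());
}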

6. Use Appropriate Assertions

Make sure to always use appropriate assertions to confirm that the actual results match the expected results. Use the various methods provided in JUnit 5's Assertions class, or comparable libraries like AssertJ.

For example, we have already used the Assertions.assertEquals method to assert values. In the same way, you can use assertNotEquals to verify that the expected and actual values are not the same.

Other methods like assertNotNull, assertTrue, and assertNotSame come in handy for other kinds of checks, as in the sketch below.
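
For example, a few of these assertions applied to the person-creation response from the earlier snippet (the getEmail and getFirstName accessors on the response body are assumptions):

@Test
public void givenValidPersonRequest_whenCreatePerson_thenResponseLooksConsistent() {
    var requestDto = new PersonRequestDto("John", "Smith", "jsmith@codaholic.com");
    var response = client.create(requestDto);
    var person = response.body();

    Assertions.assertEquals(HttpStatus.CREATED, response.getStatus()); // exact match
    Assertions.assertNotNull(person);                                  // body was returned
    Assertions.assertTrue(person.getEmail().contains("@"));            // boolean condition
    Assertions.assertNotEquals("", person.getFirstName());             // values must differ
}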

7. Test Production Scenarios

Writing tests with real-world scenarios in mind makes your testing far more rewarding.

Mainly, it assists in making your integration tests easier to understand. Additionally, it is crucial for comprehending the code’s behavior in specific production scenarios.

One major way to test production scenarios is to test behavior rather than implementation. This particular concept is discussed at length in point 11 below.

8. Avoid Code Redundancy

Develop helper functions that produce frequently used objects and that simulate data or external services, and reuse them across similar tests; a sketch follows below.

Similar to other best practices, this improves the readability and manageability of the test code.
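
A minimal sketch of such a helper, reusing the hypothetical PersonRequestDto from the earlier snippets; each test calls the factory method instead of repeating the constructor arguments:

public final class PersonTestData {

    private PersonTestData() {
    }

    // One place to build a valid request; change the defaults here, not in every test
    public static PersonRequestDto validPersonRequest() {
        return new PersonRequestDto("John", "Smith", "jsmith@codaholic.com");
    }

    // Variant for tests that only care about the email field
    public static PersonRequestDto personRequestWithEmail(String email) {
        return new PersonRequestDto("John", "Smith", email);
    }
}

A test then reads client.create(PersonTestData.validPersonRequest()), which keeps the intent visible and the setup noise out of the test body.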

9. Make Use of Annotations

Testing frameworks offer annotations for purposes such as setting up fixtures, running code before a test, and cleaning up after a test has been executed.

Annotations like JUnit 5's @BeforeEach, @BeforeAll, @AfterEach, @AfterAll, and @Order, as well as their equivalents in other testing frameworks such as TestNG, can be used.

Use these annotations to prepare the system for testing by generating data and arranging objects, and to delete everything after each test in order to maintain isolation between test cases, as in the sketch below.
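
A short JUnit 5 sketch of that lifecycle; the seeding and cleanup bodies are placeholders for whatever your own tests need:

import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

class PersonLifecycleIT {

    @BeforeAll
    static void startSharedDependencies() {
        // runs once before all tests, e.g. start a shared test container
    }

    @BeforeEach
    void seedTestData() {
        // runs before every test: insert the rows this test relies on
    }

    @AfterEach
    void cleanUp() {
        // runs after every test: delete test data so the next test starts clean
    }

    @Test
    void createdPersonIsPersisted() {
        // test body works against the data prepared in seedTestData()
    }
}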

10. Make Use of Continuous Feedback

Sadly, test passes can be deceptive. When we only focus on whether tests pass or fail, we disregard the complete insights they provide about our code and the system being tested.

One tool that takes you beyond the pass/fail results of your tests is Digma. It is a free tool that analyzes code by examining its runtime observability data. The IDE plugin runs on your machine, handles all the complexities of OpenTelemetry (OTEL), and gathers and analyzes metrics and traces. You can set it up by installing it from the IDE's plugin marketplace.

Now, when you execute your integration or end-to-end tests within the IDE, Digma will detect this automatically to enhance observability for every test scenario. If the Observability Toggle is activated, Digma will enclose every test in a tracing mechanism and start evaluating the assets that were accessed and their performance.

If you want to dig deeper, the tracing integration allows you to drill into the anatomy of the requests simulated by the tests, to understand exactly what is going on and what could be wrong.

Take a typical test that passes: when you examine the test result analysis, you may find that it has already uncovered some problems in the code.

Running your integration tests on their own will not reveal potential SQL query issues. Running them with Digma, however, gives you a score for the performance impact of every query, which lets you weigh your SQL optimization choices not only by the seriousness of the issue but also by each query's usage and impact on your system.

Continuous Feedback does not collect new data; it simply monitors the observability data stream (generated from your integration tests) to identify any critical information that should be noticed early.

One of Digma's strengths is that it establishes a two-way connection for every test asset, linking it with a database query, endpoint, code location, or consumer. Already at the testing stage, you can pinpoint problems and analyze system data within the testing environment.

11. Test behavior and not implementation

In practice, testing behavior is preferred over testing implementation. For what reason? The key is to ensure that your tests remain robust, flexible, and maintainable. Focusing on the “what” rather than the “how” ensures that these tests stay unchanged even when the code is altered, serving as a reliable safety measure.

Tests written against behavior are also simpler to understand: they serve as code samples showcasing the various ways your class's methods can be used, so even someone unfamiliar with the implementation can learn how to use the class by reading through the tests.

Testing behavior will also help you consider how to hide implementation details and only reveal necessary information for users to utilize your services efficiently. This enables you to develop robust contracts and tight interfaces.

Some ways to ensure that you are testing behavior and not implementation:

  • Test the public API of a package, not classes, methods, or technical details.
  • Make sure that your test is focused on use cases or user stories.
  • When refactoring an implementation, use tests to ensure the expected results are still produced. Tests guarantee the anticipated outcome or behavior.

There are cases in which testing implementation specifics is necessary (such as verifying that your code reads from a cache and not a datastore), but keep this to a minimum: ideally, tests should not rely on implementation details. The sketch below contrasts the two styles.
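
As a hedged sketch of the difference (the PersonService, its repository, and the Mockito verification are illustrative assumptions set up elsewhere), the first test pins down an internal call, while the second checks only what callers of the public API can observe:

// Implementation-focused: breaks as soon as the service stops calling save(),
// even if its observable behavior is unchanged.
@Test
void createPerson_callsRepositorySave() {
    service.createPerson(PersonTestData.validPersonRequest());
    Mockito.verify(repository).save(Mockito.any(Person.class));
}

// Behavior-focused: survives refactoring because it only checks the outcome.
@Test
void createdPersonCanBeFetchedAgain() {
    var id = service.createPerson(PersonTestData.validPersonRequest());
    Assertions.assertEquals("John", service.getPerson(id).getFirstName());
}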

Final Thought

Integration testing in Java is a broad term, and what it means will depend on the context of your project. It is not really about frameworks; it is more about processes, approaches, and good habits.

Get Digma: Here

Nir Shafrir
Javarevisited

Mountain biker! Co-founder of Digma.ai, an IDE plugin that analyzes code as it runs, providing actionable insights into errors, performance, and usage.