Downsides of Black Box Testing with TestContainers

Sulyz Andrey
Published in skyro-tech
5 min read · Apr 29, 2024

I have already written about the advantages and disadvantages of the Black Box Testing approach for developers. In this article, I would like to describe the challenges you may encounter in practice when using this approach. Although most of these problems are solvable, they can still be unpleasant for developers.

Remote Debugging

In some teams, there is an agreement that each service should be deployable locally on a developer’s machine. This is due to the need for thorough debugging during application development. Many developers prefer to run the application through an IDE in debug mode to identify and fix problematic parts.

However, there is another approach where development is driven by testing. This doesn’t necessarily mean TDD; it means exercising the application through test code rather than running it locally. I am a fan of this approach because even a minimal check of the application prompts developers to write at least basic happy-path tests, which can later serve as regression tests. This approach does not exclude debugging the application either. However, when you test your application through a container, you lose the ability to simply run tests in debug mode and put a breakpoint in the application code.

But it’s not all bad! To debug an application running in a container, you can use JDWP, the Java Debug Wire Protocol. It allows you to connect to a remote JVM and debug the application. If you’re using IntelliJ IDEA, it’s very easy to do: you just need to set up a Remote JVM Debug run configuration and start it once your application is running.

To activate the protocol, you’ll need to pass a JVM option when starting the application:

-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005

It’s clear that enabling remote debugging in a production environment is not an option, so you won’t be able to use the same Docker image both for running tests and for production. In this case, you can use the Dockerfile DSL provided by TestContainers to build a test-only image:

return new ImageFromDockerfile(moduleName, true)
        .withFileFromPath("app.jar", Path.of(System.getProperty("user.dir") + "/build/libs/app.jar"))
        .withDockerfileFromBuilder(builder ->
                builder.from("openjdk:21-jdk-slim")
                        .copy("app.jar", "/app/app.jar")
                        .workDir("app")
                        .expose(5005, 8080)
                        .cmd("java", "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005", "-jar", "app.jar")
                        .build()
        );
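
Such an image can then be handed to a GenericContainer just like any pre-built one. Below is a minimal sketch, not the article’s actual setup: the class name and the wait strategy are assumptions.

import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.wait.strategy.Wait;
import org.testcontainers.images.builder.ImageFromDockerfile;

final class DebugAppContainer {

    // Wraps the debug-enabled image in a GenericContainer, exposing both the
    // application port and the JDWP port so that they are mapped to the host.
    static GenericContainer<?> create(ImageFromDockerfile debugImage) {
        return new GenericContainer<>(debugImage)
                .withExposedPorts(8080, 5005)
                .waitingFor(Wait.forListeningPort());
    }
}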

Another important point is that you’ll need to expose the JDWP port (5005). You probably know that when running TestContainers, container ports are mapped to random host ports. However, you can pin specific ports to static host ports (the format is hostPort:containerPort):

Containers.application.setPortBindings(List.of("3268:5005"))

After that, you can put breakpoints in your tests and in the application code, run the Remote Debugger, and resume your tests. If this sounds complicated, yes, it is. But after the initial setup, you will be able to do it quickly and easily.

Logging

Despite being able to use a Remote Debugger with the Black Box approach, in most cases you will rely on logs from your application. Some developers don’t like this. Most just want to write code, check the logic by running the application or writing a couple of simple unit or integration tests, and leave it at that.

However, testing the application through containers can reveal interesting aspects. For example, you may find that your logs are insufficient to identify the issue at hand. Many developers don’t think about this during development, but when a bug happens in the production environment and the logs are either empty or provide useless descriptions of the problem, developers spend a lot of time trying to reproduce the situation locally.

To enable logging for the container, all you need to do is specify a LogConsumer:

withLogConsumer(new Slf4jLogConsumer(LoggerFactory.getLogger(AppContainer.class)))
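
In a fuller form, the consumer is attached when the container is defined, and a prefix helps tell containers apart once several of them write to the same output. A minimal sketch, with an illustrative class name and prefix:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.output.Slf4jLogConsumer;

final class AppContainerLogs {

    private static final Logger LOG = LoggerFactory.getLogger(AppContainerLogs.class);

    // Forwards the container's stdout/stderr to the test's SLF4J logger,
    // prefixing each line so it is easy to attribute in mixed output.
    static void attach(GenericContainer<?> application) {
        application.withLogConsumer(new Slf4jLogConsumer(LOG).withPrefix("app"));
    }
}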

However, if you are running your tests in parallel, relying on logs will be very difficult, if not impossible. In that case, you can test your application locally in a single thread and run parallel tests in your CI. Parallel tests are useful because they also help uncover issues in your application. However, there are no specific best practices here: testing in parallel with TestContainers can be very challenging and can sometimes consume a lot of time spent on the test code itself.

Data preparation

This point, in my opinion, is the most contentious and challenging, and it needs to be discussed with your team. It’s clear that in any integration testing, the first thing you’ll need to do is set up the environment. But when we talk about integration testing through, for example, Spring Test, you can use Mockito to mock some interactions with external services or repositories. However, when using Black Box tests, you’ll need to fully prepare data for all external services.

In terms of interacting with other web applications, you can almost always use MockServer or GripMock (for gRPC). The real challenges arise when working with databases, especially when testing complex SQL queries that require many interactions with multiple tables.
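
For HTTP dependencies, the MockServer module for TestContainers covers most cases. A minimal sketch, assuming an illustrative image tag, path, and response body:

import org.mockserver.client.MockServerClient;
import org.testcontainers.containers.MockServerContainer;
import org.testcontainers.utility.DockerImageName;

import static org.mockserver.model.HttpRequest.request;
import static org.mockserver.model.HttpResponse.response;

final class ExternalServiceStub {

    // Starts a MockServer container and stubs a single endpoint of an external service.
    static MockServerContainer startStubbedService() {
        MockServerContainer mockServer =
                new MockServerContainer(DockerImageName.parse("mockserver/mockserver:5.15.0"));
        mockServer.start();

        new MockServerClient(mockServer.getHost(), mockServer.getServerPort())
                .when(request().withMethod("GET").withPath("/api/clients/42"))
                .respond(response().withStatusCode(200).withBody("{\"id\": 42}"));

        return mockServer;
    }
}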

Preparing data for the database can be a complex or a simple task. It all depends on how well you understand your data schema and how easily you can prepare the data for the required test scenario. We’ve tried various approaches to preparing test data: for example, generating data for the database directly in the tests via Java code, or keeping a common migration file with all the necessary data.

In the end, we decided to prepare separate PL/SQL scripts for each test scenario, containing the data that scenario requires in the database. These scripts populate the tables properly. They don’t necessarily have to be different SQL procedures: sometimes we pass the necessary parameters to them from the code, and they then generate different data. The main task of these scripts is to generate data that is completely independent of other scenarios’ data, so that tests can run in parallel.

Each directory is named the same as the test in the code and contains all the necessary files for creating test data

Before each test, we call a PL/SQL procedure that prepares the data for that test. This approach works well with parameterized tests because, in most cases, we only need to change the test case name:

@Test
public void testCreatedTasksForSucceed() {
    String testCaseName = "createdTasksForSucceed";

    Map<String, Object> parameters = Map.of("idPrefix", testCaseName);
    TestDbHelper.callStatementBlock(
            Path.of(testCasesPath.toString(), testCaseName, "/query.sql"),
            parameters
    );

    // Test implementation...
}
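
TestDbHelper itself is not shown here, so below is only a rough sketch of what such a helper could do under the hood: read the PL/SQL block from disk, substitute the parameters, and execute it over JDBC. The JDBC URL property and the naive placeholder substitution are assumptions made for the sketch.

import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Map;

// Hypothetical helper in the spirit of TestDbHelper.callStatementBlock:
// loads an anonymous PL/SQL block, injects test parameters, and runs it
// against the database container. Test-only code, hence the plain substitution.
final class TestDataLoader {

    static void callStatementBlock(Path script, Map<String, Object> parameters) throws Exception {
        String block = Files.readString(script);
        for (Map.Entry<String, Object> parameter : parameters.entrySet()) {
            block = block.replace(":" + parameter.getKey(), "'" + parameter.getValue() + "'");
        }
        try (Connection connection =
                     DriverManager.getConnection(System.getProperty("test.db.url"));
             Statement statement = connection.createStatement()) {
            statement.execute(block);
        }
    }
}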

With this approach, we understand that changes to the database schema can easily break tests. However, in our case the database schema doesn’t change as frequently or as drastically as the application’s business logic.

Conclusion

While testing your application through containers offers some advantages, it also presents challenges that necessitate thorough discussion within the team. Development with this approach can be considerably more complex compared to using standard frameworks like Spring Test. Therefore, it’s essential to carefully consider these factors when deciding on this solution.
