Unlocking the Hidden Potential of Integration Testing Using Containers

Kevin Wittek · Published in 97 Things · Feb 23, 2020

Most Java developers have probably encountered the testing pyramid at some point in their career, whether as part of a computer science curriculum or in conference talks, articles, or blog posts. There is a multitude of origin stories and variations of this metaphor (a deep dive into those would be worthy of an article of its own), but in general it boils down to having a sizeable foundation of unit tests, a smaller chunk of integration tests on top of that, and an even smaller tip of end-to-end UI tests.

This shape is proposed as the ideal ratio of the different kinds of tests. However, as with everything in software and computers, these guidelines have to be assessed in context, and the usual context assumes integration tests to be slow and brittle. That assumption probably holds if integration tests have to run against a shared testing environment or require an extensive setup of local dependencies. But would the ideal shape still be a pyramid if we challenge these assumptions?

With ever more powerful machines, we can either use virtual machines (VMs) to contain the complete development environment or use them to manage and run the external dependencies necessary for integration testing (such as databases or message brokers). But since most VM implementations are not overhead-free, this adds considerable load and resource consumption to the developer workstation. In addition, VM creation and start-up times are too high for an ad hoc setup of the required environment as part of test execution.

The advent of user-friendly container technology, on the other hand, allows new testing paradigms to emerge. Low-overhead container implementations (a container being essentially an isolated process with its own self-contained file system) make it possible to create and instrument the required services on demand and to use uniform tooling for doing so. Still, this instrumentation has mostly been done manually and laboriously outside of the actual test execution, slowing the onboarding of new developers and introducing the potential for clerical mistakes.

In my opinion, the goal we as a community should strive for is to make the setup and instrumentation of the test environment an integral part of the test execution and even of the test code itself. In the case of Java, this means that executing a JUnit test suite, whether done by the IDE or the build tool, would implicitly lead to the creation and configuration of a set of containers necessary for the tests. And this goal is achievable with today’s technology!
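For illustration, here is a minimal sketch of what this can look like with JUnit 5 and the Testcontainers library (the class name, the postgres:13 image, and the connectivity check are illustrative choices, not prescriptions): the annotated container field is started before the tests run and cleaned up afterwards, so the database the test needs comes into existence as a side effect of simply running the test.

```java
// A minimal sketch, assuming JUnit 5 and the Testcontainers library are on
// the classpath and a container engine is available locally.
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import java.sql.Connection;
import java.sql.DriverManager;

import static org.junit.jupiter.api.Assertions.assertTrue;

@Testcontainers
class CustomerRepositoryIT {

    // Started before the tests run and disposed of afterwards; no manually
    // prepared database is required on the developer workstation or CI agent.
    @Container
    private static final PostgreSQLContainer<?> postgres =
            new PostgreSQLContainer<>("postgres:13");

    @Test
    void canConnectToDatabaseProvisionedByTheTest() throws Exception {
        // The container exposes a random host port; the library hands us the
        // matching JDBC URL and credentials at runtime.
        try (Connection connection = DriverManager.getConnection(
                postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword())) {
            assertTrue(connection.isValid(2));
        }
    }
}
```

Running this class from the IDE or from the build tool is enough; no database has to be installed or started by hand beforehand.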

We can interact directly with the container engine using existing APIs or CLI tools, thereby writing our own "container driver". Note, however, the distinction between merely starting a container and the service inside that container actually being ready for testing. Alternatively, we can explore the Java ecosystem for existing projects that deliver this functionality at a higher level of abstraction. Either way, it's time to unleash the power of good integration tests and to emancipate them from the shackles of their past!
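To make the first option concrete, the following sketch hand-rolls such a driver on top of the Docker CLI (the Redis image, the fixed host port 16379, and the class name are assumptions for the example). It also makes the started-versus-ready distinction explicit: docker run returns as soon as the container exists, while a separate polling loop waits until the service inside it actually accepts connections.

```java
// A hand-rolled "container driver" sketch, assuming only a local Docker CLI
// on the PATH. It starts a Redis container and then waits for the service
// inside it to become ready before handing control to the tests.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.time.Duration;
import java.time.Instant;

public class NaiveContainerDriver {

    public static void main(String[] args) throws Exception {
        // Start the container detached and publish its port to a fixed host port
        // (a more robust driver would pick a free port and query the engine for it).
        String containerId = run("docker", "run", "-d", "-p", "16379:6379", "redis:6-alpine");
        System.out.println("Container started: " + containerId);
        try {
            // The container is "up" immediately, but the service inside it is not
            // necessarily accepting connections yet, so poll until it is.
            waitUntilListening("localhost", 16379, Duration.ofSeconds(30));
            System.out.println("Service is ready for tests");
        } finally {
            run("docker", "rm", "-f", containerId);
        }
    }

    private static String run(String... command) throws Exception {
        Process process = new ProcessBuilder(command).start();
        try (BufferedReader stdout =
                     new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            // For `docker run -d`, the first line of stdout is the container ID.
            String firstLine = stdout.readLine();
            if (process.waitFor() != 0) {
                throw new IllegalStateException("Command failed: " + String.join(" ", command));
            }
            return firstLine;
        }
    }

    private static void waitUntilListening(String host, int port, Duration timeout)
            throws InterruptedException {
        Instant deadline = Instant.now().plus(timeout);
        while (Instant.now().isBefore(deadline)) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 1000);
                return; // the port accepts connections, treat the service as ready
            } catch (Exception notReadyYet) {
                Thread.sleep(250);
            }
        }
        throw new IllegalStateException("Service did not become ready within " + timeout);
    }
}
```

A plain TCP check like this is a fairly weak readiness signal; higher-level libraries typically offer richer wait strategies (log messages, health checks, protocol-level probes), which is exactly the kind of abstraction hinted at above.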

Kevin Wittek: Engineer turned PhD student, head of blockchain research @_ifis, @testcontainers co-maintainer, @groundbreakers, guitar @KobraKrusaders, @swk_ruhr orga team.