The Subtle Art of Test Automation Using Docker Compose
Leverage docker-compose to execute test automation
Many of us, over the years, have executed test automation against production-like test environments.
By production-like, I mean test environments whose setup mirrors a production environment exactly, even if their configuration does not match production configuration exactly.
However, when it comes to executing test automation against these test environments, engineers always face a certain degree of challenge (although solvable).
Classic examples being:
- A QA engineer is performing manual/exploratory testing on the test environment while, at the same time, a new commit triggers the CI pipeline, which deploys the current build snapshot and runs automated tests that can disturb the test data setup already in place.
- Some of this can be mitigated through the design of the test suites and the test framework, but parallel execution is still not possible for specific use cases.
- Overall cost associated with a dedicated full-fledged production-like test environment.
These are some of the challenges that can be solved by executing test automation using docker-compose. Here are the classic problems we were able to solve with this approach:
- Running tests against Docker images instead of a full-fledged test environment is always debatable. But at the end of the day, the purpose of test automation is to certify the functionality baked into these Docker images, not the infrastructure setup. There can certainly be subsets of tests, a.k.a. smoke-test suites, that run against the real production-like environment (preferably in system integration or development environments), while a docker-compose-based setup is used to run the time-consuming regression suite.
- Multiple isolated environments can be created, possibly on the same host, so test suites can run in isolation. The biggest advantage here is that, instead of being dependent on automation-framework-level parallelization, we can leverage the parallelization offered by the underlying infrastructure, i.e. the isolated environments created by docker-compose in this case.
- Test data setup and teardown can be done at the test-case or test-suite level without worrying much about the impact when multiple test suites run in parallel. The reason: when parallelization is done at the automation-framework level, execution still points to a single environment, but when parallelization is done by executing tests against isolated environments, the responsibility shifts to the underlying infrastructure, bringing in a certain degree of separation of concerns.
- If a hotfix has to be deployed and regression executed against it, spinning up an independent environment using docker-compose and certifying the hotfix there is far more convenient than deploying the hotfix snapshot to the shared testing environment and putting ongoing sprint testing and deliverables on hold.
- For backward-compatibility tests that require deploying current production snapshots, you can spin up an independent environment using docker-compose, create the test data in the state currently in production, then upgrade to the current-release snapshot and verify that the new changes do not break anything. This is far more effective than doing so in a shared environment.
- Data required for future reference, automated test execution reports in our case, can be copied to the host by leveraging Docker volumes.
In this article, we will see an example of how to run a REST API test automation suite against a microservice using docker-compose.
1. Create a Docker image of the microservice and push it to Docker Hub.
2. Create a Docker image of the REST API automation and push it to Docker Hub.
3. Create a docker-compose file that:
   a. Pulls the microservice Docker image from Docker Hub and brings the service up.
   b. Pulls the REST API automation Docker image from Docker Hub and brings it up so that the automated test suite gets executed against the microservice in point 3a.
4. Copy the report of the test automation execution to a volume on the host where the Docker containers are in action.
Now, let us have a look at each of these steps in detail.
Microservice repository and Dockerfile
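As a concrete illustration, a minimal Dockerfile for a JVM-based microservice might look like the sketch below. The base image, JAR name, and port are assumptions for illustration, not taken from the actual repository:

```dockerfile
# Hypothetical Dockerfile for the microservice
# (base image, JAR name, and port are illustrative assumptions)
FROM openjdk:11-jre-slim
WORKDIR /app
COPY target/microservice.jar microservice.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "microservice.jar"]
```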
REST API test framework repository and Dockerfile
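Similarly, a minimal sketch of a Dockerfile for a Maven-based REST API test suite, assuming the reports land under /tests/target so they can later be mounted to the host (all names here are illustrative assumptions):

```dockerfile
# Hypothetical Dockerfile for the REST API test framework
# (Maven-based suite; all names are illustrative assumptions)
FROM maven:3.8-openjdk-11-slim
WORKDIR /tests
COPY pom.xml .
COPY src ./src
# Run the suite at container start so that reports are written
# under /tests/target, which the compose file can mount to the host
ENTRYPOINT ["mvn", "test"]
```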
Step 1. Build Docker images for the microservice repository and REST API test framework repository.
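The build step can be sketched with the following commands, run from the root of each repository; the image names and tags are placeholders:

```shell
# Build the microservice image (from the microservice repository root)
docker build -t <dockerhub-user>/microservice:latest .

# Build the test framework image (from the test framework repository root)
docker build -t <dockerhub-user>/rest-api-tests:latest .
```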
Step 2. Run docker images; this lists the images created as a result of the above docker build command.
Step 3. Push the created Docker Images to Docker Hub.
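The push step can be sketched as follows; the image names are the same placeholders as before:

```shell
# Log in once, then push both images to Docker Hub
docker login
docker push <dockerhub-user>/microservice:latest
docker push <dockerhub-user>/rest-api-tests:latest
```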
Step 4. Verify that the pushed images are listed in Docker Hub.
Step 5. Add a local directory to be mounted into the Docker containers.
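Creating the host directory that will receive the generated test reports is a one-liner; the path is an illustrative assumption:

```shell
# Create a host directory to hold the generated test reports
# (the path is an illustrative assumption)
mkdir -p "$HOME/docker-test-reports"
```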
Docker compose YAML.
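A minimal docker-compose.yml implementing the flow above might look like this sketch; the service names, image names, ports, environment variable, and paths are assumptions for illustration:

```yaml
version: "3.8"
services:
  microservice:
    image: <dockerhub-user>/microservice:latest
    ports:
      - "8080:8080"
  rest-api-tests:
    image: <dockerhub-user>/rest-api-tests:latest
    depends_on:
      - microservice
    environment:
      # Tests reach the service over the compose network, not localhost
      - BASE_URL=http://microservice:8080
    volumes:
      # Surface the test reports on the host via a bind mount
      - ~/docker-test-reports:/tests/target
```

Note that depends_on only orders container startup; it does not wait for the service to be ready, so the test image may still need a readiness check (e.g. a wait-for-it style script) before hitting the API.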
Step 6. Inspect the configuration of the docker-compose file.
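Inspecting the configuration can be done with the compose CLI itself, which validates the file and prints the effective configuration:

```shell
# Validate the compose file and print the resolved configuration
docker-compose config
```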
Step 7. Run docker-compose up to bring up the microservice and execute the test suite against it.
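Bringing the stack up can be sketched as below; the service name is the same placeholder used in the compose sketch, and the flags make the run CI-friendly by stopping everything once the tests finish and propagating the test container's exit code:

```shell
# Bring up the service and the test container; stop all containers when
# the test container exits, and use its exit code as the command's result
docker-compose up --abort-on-container-exit --exit-code-from rest-api-tests
```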
Step 8. Resulting files generated in the mounted macOS directory.
Step 9. Detailed HTML report from the mounted macOS directory.
Video tutorial for the steps:
Achieving the steps above requires strong coordination between Devs, QA, and Ops; without the support of Ops, it will be relatively tricky to reach this goal.
Hence, it is very important to onboard every role: Devs, QA, and Ops.