Service tests with Docker

Konstantinos Konstantinidis
Published in Ardanis
5 min read · Nov 12, 2020

In this blog post, we will investigate service tests and discuss why they are useful. We’ll explore a sample solution and go through the tooling needed to run them efficiently. Additionally, we’ll see why this improves the overall code quality, velocity and delivery pipeline of a software team.

Docker

Docker is a containerization platform that allows us to run lightweight, isolated containers. We will be using its capabilities to spin up dependencies such as databases locally.

Definition and usefulness

Testing takes place at various levels, and one of those levels is service tests. Sometimes called integration tests or service-level acceptance tests, they usually take place after unit tests. Service tests ensure that business requirements are fulfilled within a service’s bounded context. A bounded context is a slice of the business domain; a crude example would be the Subscriptions area in Netflix.

Unit tests exercise a given method or the smallest piece of code, while service tests exercise a given logical service end to end, treating it as a black box and expecting it to fulfil the specified acceptance criteria. When investing in this way of testing, you get proof that everything works and test reports demonstrating coverage for a given area. This is especially useful when auditing and regulatory requirements around code quality are high.

Development teams should create service tests when delivering features; they should be a criterion for code completeness. Code covered by both unit tests and service tests minimizes the chance of bugs. You should have fewer service tests than unit tests.

Test Pyramid

Historically, it has been quite difficult for developers to run service tests without first installing many dependencies locally, at the very least a database. Nowadays, you may also depend on cloud resources such as Azure Storage. This increases the resources needed to run the tests and introduces a requirement for an internet connection. Ever tried coding while on an airplane? Similarly, the CI requirements increase if you want to run them there.

Fortunately, with the advent of Docker and containers, it’s now possible to get away from all that.

Expenses App

For this blog post, a sample repository has been created. Certain areas of the code have been implemented simplistically on purpose.

The use case included in the code describes a businessman trying to submit his business expense along with a receipt and then trying to retrieve it. For this demo, we have introduced dependencies on Postgres and Azure Blob Storage.

Coding features in a human-readable way

Leveraging tooling that allows you to write the business requirements in a way that maximizes readability for your stakeholders is recommended. Cucumber’s Gherkin language is an immensely popular approach. In the code examples of this post, we use SpecFlow which supports Gherkin out of the box.

When using Gherkin, you can specify your features in separate files. Each feature includes a set of scenarios. Scenarios are broken into steps that you need to implement in Step files.
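As a sketch, a SpecFlow step file binding such scenario steps could look like the following. Note that `ExpensesApiClient` and `Expense` are hypothetical helper types for illustration, not names taken from the post’s repository:

```csharp
using System.Threading.Tasks;
using TechTalk.SpecFlow;
using Xunit;

[Binding]
public class SubmitExpenseSteps
{
    // Hypothetical helper that calls the WebApi over HTTP
    private readonly ExpensesApiClient _client = new ExpensesApiClient();
    private Expense _expense;
    private Expense _retrieved;

    [Given(@"I have an expense of (.*) EUR with a receipt attached")]
    public void GivenIHaveAnExpense(decimal amount)
        => _expense = new Expense(amount, receiptFileName: "receipt.png");

    [When(@"I submit the expense")]
    public async Task WhenISubmitTheExpense()
        => await _client.SubmitAsync(_expense);

    [Then(@"I can retrieve the expense along with its receipt")]
    public async Task ThenICanRetrieveTheExpense()
    {
        _retrieved = await _client.GetExpenseAsync(_expense.Id);
        Assert.Equal(_expense.Amount, _retrieved.Amount);
        Assert.NotNull(_retrieved.ReceiptUrl);
    }
}
```

Because the steps go through the service’s public API, the service stays a black box: the bindings only submit and retrieve, never reaching into its database directly.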

Here’s the Gherkin file that describes the feature and a simple scenario.

Feature file for the submit expense feature
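The original feature file is embedded as an image; a minimal sketch of what it might contain (the exact scenario wording and values are illustrative) is:

```gherkin
Feature: Submit expense
  As a businessman
  I want to submit a business expense with a receipt
  So that I can retrieve it later

  Scenario: Submitting an expense with a receipt
    Given I have an expense of 42.50 EUR with a receipt attached
    When I submit the expense
    Then I can retrieve the expense along with its receipt
```

Each line of the scenario maps to exactly one step binding, which keeps the feature file readable by non-technical stakeholders.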

The docker side

Docker allows us to create docker-compose files which describe how to bring up the entire service and its dependencies locally without anything other than Docker Desktop installed. Here’s the docker-compose file of this example.

Docker compose file to bring everything together
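The compose file itself is shown as an image; a sketch of its likely shape is below. The service names `db`, `azure-blob-storage` and the two build contexts follow the `docker-compose up` commands used later in the post, while the image tags, ports and paths are assumptions:

```yaml
version: "3.8"
services:
  db:
    image: postgres:13-alpine
    environment:
      POSTGRES_USER: expenses
      POSTGRES_PASSWORD: expenses
    ports:
      - "5432:5432"

  azure-blob-storage:
    # Azurite emulates Azure Blob Storage locally
    image: mcr.microsoft.com/azure-storage/azurite
    ports:
      - "10000:10000"

  webapi:
    build:
      context: .
      dockerfile: src/WebApi/Dockerfile
    depends_on:
      - db
      - azure-blob-storage
    ports:
      - "5000:80"

  service-tests:
    build:
      context: .
      dockerfile: tests/ServiceTests/Dockerfile
    depends_on:
      - webapi
```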

Our docker-compose specifies how to fetch or create containers for

  • Postgres
  • Azure blob storage
  • The WebApi service
  • The service tests

The Postgres and Azure Blob Storage container images are fetched on the fly. We’ll need to tell Docker how to build the WebApi and service test containers. This is achieved with Dockerfiles.

Dockerfile to build the WebApi
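The WebApi Dockerfile is embedded as an image in the original post; a multi-stage sketch along the same lines (image tags and project paths are assumptions) is:

```dockerfile
# Build stage: the heavier SDK image compiles and publishes the solution
FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish src/WebApi/WebApi.csproj -c Release -o /app/publish

# Runtime stage: the lighter ASP.NET image only runs the published artifacts
FROM mcr.microsoft.com/dotnet/aspnet:5.0
WORKDIR /app
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "WebApi.dll"]
```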

The WebApi Dockerfile uses a multi-stage build (the builder pattern) to first build the solution in a heavier SDK container and then pull the published artifacts into a lighter container capable of running them.

Dockerfile to build and run the service tests
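This Dockerfile is also embedded as an image; a sketch of the single-stage approach described below (tags and paths assumed) is:

```dockerfile
# No multi-stage build here: the SDK image both builds and runs the tests
FROM mcr.microsoft.com/dotnet/sdk:5.0
WORKDIR /src
COPY . .
ENTRYPOINT ["dotnet", "test", "tests/ServiceTests/ServiceTests.csproj"]
```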

For the sake of simplicity, we are building and running the service tests using the dotnet sdk and thus we are sticking with the heavier container.

With all of the above, the following options are now available:

  1. Run ‘docker-compose up azure-blob-storage db’ to bring up all the dependencies needed to run service tests locally from your editor of choice (in this case, Visual Studio).
  2. Run ‘docker-compose up --abort-on-container-exit’ to bring everything up and automatically run the service tests.

Option 1 allows you to keep running your service tests while you are developing something new.

Option 2 allows you to run the service tests before you create your pull request, that way you have maximum confidence as a developer that you didn’t break anything.

Using them in Continuous Integration

At this point, you can write and run service tests locally with ease. They run fast, so they provide a short feedback loop. As a next step, we can run them as part of our pull request (PR) checks! It’s recommended to keep the build and run of the service tests quick; if they are slow, you’ll need to move them out of the PR check.

Part of the benefit of keeping the feedback loop short and the Docker containers light and quick to build is the ability to run them as pull request checks. That way, most of the time there are no surprises when landing on your UAT environment. Most of the functionality of the service will have been verified by the service tests, which should reduce the number of bugs encountered by that point. Plus, you may not need to run any functional tests on these environments.

Here’s a sample azure-pipelines file to get you started:

Azure-pipelines to enable this to run as part of your CI
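The pipeline file is shown as an image; a minimal sketch, assuming the compose file sits at the repository root and the tests run in a service named `service-tests`, is:

```yaml
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  # --abort-on-container-exit stops everything once the tests finish;
  # --exit-code-from fails the pipeline step if the tests fail
  - script: docker-compose up --build --abort-on-container-exit --exit-code-from service-tests
    displayName: Run service tests in Docker
```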
Azure devops now runs the service tests as part of CI / PR checks

Conclusion

We have seen why service tests are useful, why they have been a headache to develop and run locally in the past, and how Docker allows us to improve our delivery pipeline by giving us the tools to run them locally and in CI with ease. Most of the gotchas usually found in a deployed environment by QA are easily identifiable when running service tests before the code gets merged and deployed.

Next Steps

  • Use the builder pattern for service tests and run them with a console-based test runner.
  • Generate HTML test reports that can be shared with the stakeholders.
  • Add unit tests as part of your WebApi Dockerfile build, turning the Dockerfile into a mini-ci.


Co-founder at Ardanis. Simplifying software development.