Containerized integration testing for ASP.NET Core

Dennis Bappert
Apr 23, 2019 · 4 min read

This is part 1 of a three-part series on “Containerized testing”:

  • Part 1: Containerized integration testing for ASP.NET Core
  • Part 2: Containerized UI testing with Selenium for ASP.NET Core
  • Part 3: Executing Part 1 and 2 in Azure DevOps as part of a build pipeline

Integration tests rely on external resources like SQL Server, Blob Storage, S3, etc., which need to be in a clean state before the test suite runs. This implies a series of issues (or challenges) that you will face during development and in your continuous integration / deployment pipelines:

  • How do you ensure that every developer can run all tests before committing their changes? I have come across a variety of projects in my career where I had to step through 50 pages of documentation just to set up my development environment 🤔.
  • How do you prevent the “works on my machine” syndrome, e.g. a developer updated their local SQL database manually, so the tests pass locally, but there is no migration available?
  • Integration tests should mimic the production environment as closely as possible, but how can you ensure this on a developer’s machine?

Most of the time there is only one valid answer to the questions above: execute integration tests only during build verification in your CI pipeline.

Personally I think a good testing suite should be executable everywhere — this really makes a difference for the development team!

Fortunately we have Docker these days, which runs on all major operating systems (yes, even on Windows, and it runs well!). So how can we leverage Docker to solve our problem? Well, that’s obvious, isn’t it?

Let’s consider a microservice that handles avatar uploads from users: it stores the images in Azure Blob Storage, resizes them to a suitable resolution, crops them, and keeps some metadata in a SQL database. Our integration tests require a real SQL Server and Blob Storage to be “100% valid” (well, kind of). Let’s start with the code first before digging into the tests.

This is just an example, and there are probably several better ways to solve this particular problem, but it works well for explaining the approach. In a real project you would probably choose Azure Queue Storage and an Azure Function for reliability and flexibility.

Startup.cs: Registration of EntityFramework and the Blob Storage Account
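Roughly, that registration looks like the following sketch, assuming a DbContext called AvatarsContext and connection-string keys named Sql and Storage (the names are illustrative; the real code lives in the repository):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // EntityFramework Core backed by SQL Server
    services.AddDbContext<AvatarsContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("Sql")));

    // CloudStorageAccount (from the WindowsAzure.Storage package),
    // parsed once and shared across the application
    services.AddSingleton(CloudStorageAccount.Parse(
        Configuration.GetConnectionString("Storage")));

    services.AddMvc();
}
```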
Startup.cs: Registration of the required Middlewares (Important: we want to apply our EntityFramework migrations when the application starts)
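And the middleware setup, with the migrations applied before the first request is served (again a sketch with the same assumed names):

```csharp
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    // Apply pending EF Core migrations at startup so the schema is always
    // up to date -- including inside the integration tests later on.
    // CreateScope/GetRequiredService come from Microsoft.Extensions.DependencyInjection.
    using (var scope = app.ApplicationServices.CreateScope())
    {
        scope.ServiceProvider.GetRequiredService<AvatarsContext>()
            .Database.Migrate();
    }

    app.UseMvc();
}
```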
appsettings.json: EntityFramework and the Blob Storage Client are configured to use LocalDB and the Visual Studio Storage Emulator for easy local development
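The local configuration could look roughly like this (key names assumed to match the registration above); UseDevelopmentStorage=true is the well-known shortcut connection string for the local storage emulator:

```json
{
  "ConnectionStrings": {
    "Sql": "Server=(localdb)\\MSSQLLocalDB;Database=Avatars;Trusted_Connection=True;",
    "Storage": "UseDevelopmentStorage=true"
  }
}
```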
AvatarController.cs: This is our “main API endpoint” which we want to test from an end-to-end perspective
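A trimmed-down sketch of the endpoint (route, parameter and type names are illustrative):

```csharp
using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[Route("api/avatars")]
public class AvatarController : Controller
{
    [HttpPost("{userId}")]
    public IActionResult Upload(Guid userId, IFormFile file)
    {
        // ... resize and crop the image, upload it to Blob Storage,
        // store the metadata via the DbContext ...
        return NoContent();
    }
}
```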

The snippets above only highlight the most important parts of the project; the entire code is available on GitHub.

How can we test the code above?

There are a few possibilities to ensure the code above works as expected:

  • Write unit tests to ensure that the resizing logic works with mocked SQL and Blob Storage classes
  • Execute load and performance tests to ensure the code is scalable and fast
  • Use a testing tool like SoapUI, Postman, etc.
  • Execute integration tests 🤔

We want to focus on integration tests (you should have unit / performance tests anyway) and the challenges we need to solve in order to make them reproducible, executable everywhere and expressive. This means we need to work with a real SQL database and Blob Storage to verify the expected result. Let’s start with a new test project using xUnit (my favorite testing framework).

The solution contains two projects: the actual API and our IntegrationTests project
AvatarService.API.IntegrationTests.csproj: We are using the Microsoft.AspNetCore.TestHost library and xUnit. It is important to set the TargetFramework to netcoreapp2.2, otherwise you’re not able to reference the API project.
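The project file might look roughly like this (the exact package versions are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <!-- must match the API project, otherwise it cannot be referenced -->
    <TargetFramework>netcoreapp2.2</TargetFramework>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.TestHost" Version="2.2.0" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.0.1" />
    <PackageReference Include="xunit" Version="2.4.1" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.4.1" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\AvatarService.API\AvatarService.API.csproj" />
  </ItemGroup>

</Project>
```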

Now the fun part begins: we want a clean SQL database and Storage Account on each test run. There are two possibilities to solve this problem:

  • we can provision these resources on Azure just for testing and destroy them afterwards — possible.
  • we can use the power of Docker to provision a SQL database and a Storage Emulator on each test execution — yes!
AvatarControllerTests.cs: we are using Fixtures to host our application and all required services
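xUnit creates a class fixture once, shares it across all tests in the class and disposes it afterwards. In sketch form:

```csharp
// xUnit creates a single TestServerFixture for the whole class and
// injects it into the constructor of every test.
public class AvatarControllerTests : IClassFixture<TestServerFixture>
{
    private readonly TestServerFixture _fixture;

    public AvatarControllerTests(TestServerFixture fixture)
    {
        _fixture = fixture;
    }

    // ... tests go here ...
}
```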
TestServerFixture.cs: In our fixture we connect to the local Docker agent and create the resources we need. Wait, what?
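A condensed sketch of such a fixture, assuming the Docker.DotNet client package and a hypothetical StorageEmulatorContainer next to the SqlServerContainer shown below; the endpoint URI differs per OS (npipe://./pipe/docker_engine on Windows, unix:///var/run/docker.sock on Linux/macOS):

```csharp
using System;
using System.Net.Http;
using Docker.DotNet;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.TestHost;

public class TestServerFixture : IDisposable
{
    private readonly DockerClient _dockerClient;

    public TestServerFixture()
    {
        // connect to the local Docker agent
        _dockerClient = new DockerClientConfiguration(
            new Uri("unix:///var/run/docker.sock")).CreateClient();

        // spin up the external dependencies before booting the application
        new SqlServerContainer().StartAsync(_dockerClient).GetAwaiter().GetResult();
        new StorageEmulatorContainer().StartAsync(_dockerClient).GetAwaiter().GetResult();

        Server = new TestServer(new WebHostBuilder().UseStartup<Startup>());
        Client = Server.CreateClient();
    }

    public TestServer Server { get; }
    public HttpClient Client { get; }

    public void Dispose()
    {
        Client.Dispose();
        Server.Dispose();
        _dockerClient.Dispose();
    }
}
```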
SqlServerContainer.cs: Yes, why not just create the required resources on the fly? I implemented a little abstraction layer, DockerContainerBase, which makes it easy to declare containers. Thanks to this post.
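Something along these lines, using Docker.DotNet’s model types (the exact base-class contract, image tag and credentials are illustrative):

```csharp
using System.Collections.Generic;
using Docker.DotNet.Models;

public class SqlServerContainer : DockerContainerBase
{
    // DockerContainerBase is assumed to pull the image, create and start the
    // container from these parameters and wait until SQL Server accepts logins.
    protected override CreateContainerParameters GetCreateParameters() =>
        new CreateContainerParameters
        {
            Image = "mcr.microsoft.com/mssql/server:2017-latest",
            Name = "integration-tests-sqlserver",
            Env = new List<string> { "ACCEPT_EULA=Y", "SA_PASSWORD=yourStrong(!)Password" },
            HostConfig = new HostConfig
            {
                PortBindings = new Dictionary<string, IList<PortBinding>>
                {
                    ["1433/tcp"] = new List<PortBinding>
                    {
                        new PortBinding { HostPort = "1433" }
                    }
                }
            }
        };
}
```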
TestServerFixture.cs: Now we just need to modify the connection strings accordingly and we are good to start with our first test 🙂
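In the fixture’s constructor this can be done by overriding the configuration in memory when building the test host. The key names must match whatever Startup reads, the SA password matches the container declaration above, and the emulator container is assumed to map the default ports so UseDevelopmentStorage=true still resolves:

```csharp
Server = new TestServer(new WebHostBuilder()
    .UseStartup<Startup>()
    .ConfigureAppConfiguration((context, config) =>
    {
        // point the application at the containers instead of LocalDB
        // and the locally installed storage emulator
        config.AddInMemoryCollection(new Dictionary<string, string>
        {
            ["ConnectionStrings:Sql"] =
                "Server=localhost,1433;Database=Avatars;User Id=sa;Password=yourStrong(!)Password;",
            ["ConnectionStrings:Storage"] = "UseDevelopmentStorage=true"
        });
    }));
```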
AvatarControllerTests.cs: We can use the TestHost framework to call the APIs we defined in our application. In this case I upload a new avatar and expect the result to be 204 NoContent.
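In sketch form (route and form-field names mirror the controller sketch above, and the sample image is a hypothetical test asset):

```csharp
[Fact]
public async Task UploadAvatar_ReturnsNoContent()
{
    var userId = Guid.NewGuid();

    using (var content = new MultipartFormDataContent())
    {
        // attach a sample image as the "file" form field
        content.Add(new ByteArrayContent(File.ReadAllBytes("Resources/avatar.jpg")),
            "file", "avatar.jpg");

        var response = await _fixture.Client.PostAsync($"api/avatars/{userId}", content);

        Assert.Equal(HttpStatusCode.NoContent, response.StatusCode);
    }
}
```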
AvatarControllerTests.cs: Luckily the tests and the API itself run in the same execution context, so we can just initialize the DbContext to verify the API did what we expected.
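Roughly like this, continuing inside the same test method (entity and property names are illustrative):

```csharp
// resolve the application's own DbContext from the test server and
// assert that the upload actually produced a metadata row
using (var scope = _fixture.Server.Host.Services.CreateScope())
{
    var context = scope.ServiceProvider.GetRequiredService<AvatarsContext>();
    var avatar = await context.Avatars.SingleOrDefaultAsync(x => x.UserId == userId);

    Assert.NotNull(avatar);
}
```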

This example shows a simple yet effective way to test the functionality end-to-end. The entire code is available on GitHub. Thanks for reading.
