Docker for Selenium test automation?
YES, please! #AgileTD
The DaWanda QA team visited the Agile Testing Days 2017 in Potsdam, Germany, and held a workshop about using Docker for test automation. I will explain the basics of dockerized test automation below, and you can find the complete presentation here.
But first, why should you consider (yet) another testing environment?
Let’s summarize the three current ways of automation environments and their respective issues:
- Local execution has the problem that you cannot scale the number of browsers you run in parallel; furthermore, only browsers supported by your operating system are available.
- The other possibility is to set up your own grid. You can execute tests in parallel on multiple machines, but maintaining the drivers and browsers will cost you a lot of time and effort.
- Cloud services take care of this, but they are very costly (around 3000€ per month for 25 parallel sessions), and the location of the cloud provider's servers impacts the execution speed of your tests.
We, at DaWanda, decided to use Docker for our test automation. A new technology never has only benefits, so before we look at the details, let me first list some pros and cons of this setup:
Pros:
- Efficient, lightweight, self-contained images and containers
- Very easy to scale the number of nodes
- Local and remote execution of tests
- Executable on any system where Docker is installed (https://www.docker.com/get-docker)
- No setting up and configuring of development environments, just develop (no download of drivers, browsers, or any other dependencies is necessary)
- Selenium Hub and Node images are provided by Selenium
Cons:
- Only Firefox and Chrome browsers are currently supported
- Other browsers have to be added through cloud services (Browserstack, Saucelabs) or a classic grid; such a hybrid setup will at least save money
Basically, we have four components: our own automation code, the Selenium Hub, and the Firefox and Chrome node images. The good news is: we can download the images for the Hub and the Firefox and Chrome browsers directly from Selenium! We only need to take care of our own code, and we can scale the number of running containers.
We have to create three things to dockerize an automation project: a Dockerfile, a docker-compose file, and a local cache image for our code dependencies. If we are not doing so already, we also need to switch from a local WebDriver to a RemoteWebDriver so our tests can talk to the grid.
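The WebDriver switch is the only code change needed. Here is a minimal sketch of what it can look like in Java; the `HUB_HOST` variable name and the factory class are assumptions for illustration, not part of our actual project:

```java
import java.net.URL;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class DriverFactory {

    /**
     * Instead of `new ChromeDriver()`, point the driver at the grid.
     * The hub host comes from an environment variable so the same code
     * works locally (localhost) and inside a container (service name "hub").
     */
    public static WebDriver create() throws Exception {
        String hub = System.getenv("HUB_HOST");
        if (hub == null) {
            hub = "localhost"; // assumption: grid running locally by default
        }
        return new RemoteWebDriver(
                new URL("http://" + hub + ":4444/wd/hub"),
                DesiredCapabilities.chrome());
    }
}
```

Note that this snippet needs the Selenium Java bindings on the classpath and a running grid to actually open a session.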
A Dockerfile is a ‘text document that contains all the commands a user could call on the command line to assemble an image.’¹ It contains all the instructions needed to build an image automatically (we use this in Jenkins).
Docker Compose is a ‘tool for defining and running multi-container Docker applications’¹, configured through a YAML file. Just ‘with a single command, you create and start all the services from your configuration.’¹
Just do it!
The first step is to create our cache image:
In this example we are using Java. Our project builds on the openjdk image (version 8), which contains everything that we need, including the operating system and Java. We define “/app” as our working directory. The next step is to copy the important folders - like gradle (our build tool) - into this path. We also have to copy some code into the image, but since we only want to cache its dependencies, we remove the compiled sources afterwards (rm -rf src/). We turn this file into an image with docker build -f Dockerfile-base . and then give it a useful name like qa-test-base with docker tag <image_id> qa-test-base.
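Put together, a Dockerfile-base along these lines does the job. This is a minimal sketch; the exact folder names and the Gradle invocation depend on your project layout:

```dockerfile
# Dockerfile-base — caches our code dependencies in an image.
FROM openjdk:8

# Use /app as the root directory of our project inside the image.
WORKDIR /app

# Copy the Gradle wrapper and build configuration, then the sources.
COPY gradlew build.gradle settings.gradle ./
COPY gradle gradle
COPY src src

# Resolve and download all dependencies once (skipping the tests),
# then remove the compiled code — only the dependency cache should remain.
RUN ./gradlew build -x test
RUN rm -rf src/
```

Build and name it with `docker build -f Dockerfile-base .` followed by `docker tag <image_id> qa-test-base`.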
The second step is our Dockerfile, which is used by CI systems like Jenkins:
We start from our newly created cache image qa-test-base and copy our written code into the root path. In this example, we also create a few environment variables to be a bit more flexible in the execution. The command at the end starts our tests. That’s it for the remote execution. The remaining problem is that we don’t have any Hub or browser setup running for executing our tests on a local machine.
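A sketch of such a Dockerfile could look like this; the environment variable names are illustrative assumptions:

```dockerfile
# Dockerfile — built and run by the CI system (Jenkins in our case).
FROM qa-test-base

WORKDIR /app
# Copy the current test code on top of the cached dependencies.
COPY src src

# Defaults that can be overridden at run time (docker run -e HUB_HOST=...).
ENV HUB_HOST=hub
ENV BROWSER=chrome

# Start the test suite when the container runs.
CMD ["./gradlew", "test"]
```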
The third step is to create a docker-compose file to have an easy way of executing our tests locally:
I know this looks quite complicated at first glance, but let me show you why it isn’t! 😉
We are using version 2 in this example, but you can already use version 3 if you like. We need four services: one for our code (test_local), one for the hub (hub), and one each for the Firefox and Chrome browsers. Our code service (test_local) depends on the other services (depends_on). Furthermore, we set some environment variables and mount our local code path to get the results locally (this step is not strictly necessary). The other services are very easy to set up: we only have to define the images and versions we want to use, as well as which port should be exposed. That’s it! You can execute the dockerized example with docker-compose run test_local.
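The four services described above can be sketched in a docker-compose.yml like this. The image tags and environment variable names are illustrative assumptions; check the selenium image documentation for the tags current at your time of reading:

```yaml
version: "2"
services:
  test_local:
    build: .                  # our Dockerfile from the previous step
    depends_on:
      - hub
      - firefox
      - chrome
    environment:
      - HUB_HOST=hub
      - BROWSER=chrome
    volumes:
      # Mount the local path so test reports land on the host (optional).
      - ./build:/app/build
  hub:
    image: selenium/hub:3.7.1
    ports:
      - "4444:4444"
  firefox:
    image: selenium/node-firefox:3.7.1
    environment:
      - HUB_HOST=hub
      - HUB_PORT=4444
  chrome:
    image: selenium/node-chrome:3.7.1
    environment:
      - HUB_HOST=hub
      - HUB_PORT=4444
```

With this file in place, `docker-compose run test_local` starts the hub and both browser nodes before running the tests.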
For more information, have a look at the finished dockerized code example here:
Have a look at the official Docker and Selenium documentation:
Docker makes my daily life at DaWanda easier, and you should give it a try. Before, I was frustrated by the maintenance of my local grid setup, and I have made my own experience with cloud providers: the service is simply expensive, and it slowed down our test execution.
¹docs.docker.com (23 Oct. 2017)