How Did We Measure How Much Automation Testing Covered Our Application?

Automation test code coverage process

Seyma Yilmaz
Trendyol Tech
8 min read · Nov 16, 2023


While we care so much about how much of our project unit tests cover, why don't we also talk about how much of it our automation code covers?

As the Trendyol team, we carry out our work as a whole, end-to-end. For a feature to be released to the production environment, we need to make sure that every step works correctly. At this point, automation tests step in as an important part of the process.

As a team, our common goal was to eliminate manual processes and complete the entire flow with automation tests.
Since we did not know how much of the project the automation covered, we could not be sure whether we had missed anything, and this made us question the reliability of the automation.

As the Recommendation team, we wanted to measure how much of our application our automation tests covered; to achieve this, we made the additions and changes described below.

Let’s discuss our journey and explain why and how we did it.

Why do we want to measure automation code coverage?

  • Reducing uncovered code blocks
  • Making sure the automation covers the important parts of the project
  • Improving code quality
  • Catching and fixing errors
  • Integrating coverage into CI/CD processes for traceability

Advantages and Disadvantages

Advantages

  • Easy to add to the pipeline
  • More confidence while writing automation
  • Confident and safe deployments

Disadvantages

  • Running tests locally is difficult
  • Agents are available only for Java and Go
  • Generated coverage reports are hard to read

So, how did we proceed once we started?

To be able to measure code coverage, we need to make changes both in the main project and in the automation project.

Code coverage calculations can vary based on the programming language used in the main projects. The tools used for each language differ, and coverage is measured through these tools.

Since most of our projects are written in Java and Go, we implemented this approach in the APIs written in those languages, using JaCoCo as the Java agent and goc as the Go agent.

Our Implementation

We need to examine the implementation in two parts: Go and Java.

high-level automation architecture

GoLang

  • Update Dockerfile to be able to build API image with automation test coverage calculation ability.

Example Dockerfile:

    ...

ARG ARG_MODE
ARG GITLAB_API_TOKEN
...

RUN if [ "$ARG_MODE" = "automation_test" ] ; then \
        # fetch code coverage agent
        wget --header "PRIVATE-TOKEN: $GITLAB_API_TOKEN" "https://gitlab.loremipsum/api/v4/projects/4443/repository/files/goc-v1.4.4-linux-amd64.tar.gz/raw?ref=master" -O goc.tar.gz ; \
        tar -xvf goc.tar.gz ; \
        cp goc goc_bin/ ; \
        # build go binary with coverage agent
        ./goc build --center=http://127.0.0.1:7777 --output /app/lorem-ipsum-api --agentport=:8182 ; \
    else \
        # if ARG_MODE is not automation_test, build for production
        go build -ldflags="-w -s" -o /app/lorem-ipsum-api ; \
    fi
...

We fetch the required agent from [one of our repositories] using the GitLab REST API.

The actual source of the coverage agent is on GitHub. If you encounter any version-related errors, you can fetch a newer agent from GitHub.

  • Add required script files to the repository

automation-test.sh: this script builds the API container and runs it in the background, then starts the automation tests.

# build application container with automation mode
docker build -t branch-lorem-ipsum-api:1.0 . --build-arg ARG_MODE=automation_test --build-arg GITLAB_API_TOKEN="$GITLAB_API_TOKEN"

# run previously built docker image in background
nohup docker run -p 1010:1010 -p 7777:7777 -e CLUSTER_NAME=qa -e GO_ENV=stage -e ARG_MODE=automation_test branch-lorem-ipsum-api:1.0 /app/automation_run.sh &> automation_test.out &

# wait 10 seconds
sleep 10

# run automation test image with required parameters
docker run --net=host --env test_param="-DTestEnv=qa -Dsurefire.suiteXmlFiles=TestNG/LoremIpsum.xml" \
"${GITLAB_REGISTRY_HOST}"/lorem/ipsum/automation-tests/lorem-ipsum-automation-tests:latest
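The fixed sleep above can race with a slow container startup. A small retry helper makes the wait deterministic; this is a sketch (the helper name and the health endpoint in the comment are illustrative, not part of our scripts):

```shell
# wait_for <tries> <command...>: retry once per second until the
# command succeeds or the attempts run out
wait_for() {
  tries=$1; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@" > /dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# in automation-test.sh this could replace the fixed sleep, e.g.:
#   wait_for 30 curl -sf http://localhost:1010/healthcheck || exit 1
wait_for 5 true && echo "API is ready"
```

This keeps the pipeline fast when the API starts quickly and avoids flaky failures when it starts slowly.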

automation_run.sh: the main purpose of this script is to run the code coverage agent.

#!/bin/sh

# run coverage calculation agent in background
nohup /app/goc server &

/app/lorem-ipsum-api
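
After the suite finishes, the merged profile can be pulled from the goc server (for example with `goc profile -o coverage.cov`; flags per goc's documentation) and summarized. A minimal sketch of the summary step; the profile lines below are hand-written samples, since a real profile comes from the instrumented binary:

```shell
# in the real flow this file would be fetched from the coverage server, e.g.:
#   ./goc profile --center=http://127.0.0.1:7777 -o coverage.cov
# the two sample blocks below are illustrative only
cat > coverage.cov <<'EOF'
mode: count
lorem/handler.go:10.2,14.3 3 5
lorem/handler.go:16.2,18.3 2 0
EOF

# a block counts as covered when its hit count (column 3) is > 0;
# weight each block by its statement count (column 2)
awk 'NR > 1 { total += $2; if ($3 > 0) covered += $2 }
     END { printf "statement coverage: %.1f%%\n", 100 * covered / total }' coverage.cov
```

With the sample profile this prints `statement coverage: 60.0%` (3 of 5 statements hit); `go tool cover -func` computes a similar per-function summary.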
  • Add a gitlab-ci stage to run the automation in the pipeline.
Automation Test:
  image: ${GITLAB_REGISTRY_HOST}/dolor/sit/image/docker:23.0.1
  tags:
    - environment=stage
  stage: Automation Test
  artifacts:
    when: always
    paths:
      - automation_test.out
  only:
    refs:
      - branches
  except:
    refs:
      - develop
  services:
    - name: ${GITLAB_REGISTRY_HOST}/dolor/sit/image/docker:23.0.1-dind
      command: [ "--tls=false" ]
      alias: docker
  variables:
    DOCKER_HOST: "tcp://docker:2375"
    DOCKER_TLS_CERTDIR: ""
    DOCKER_DRIVER: overlay2
    DOCKER_BUILDKIT: 1
  before_script:
    - docker login $GITLAB_REGISTRY_HOST -u $GITLAB_REGISTRY_USER -p $GITLAB_REGISTRY_PASS
  script:
    - chmod +x .deploy/automation-test.sh
    - .deploy/automation-test.sh
  allow_failure: false

Code Coverage detail report on pipeline

the pipeline view
the detailed report in the pipeline

The agents do the fundamental work in these projects, but the coverage calculation itself happens in the helper classes we add to our automation. You can pull these classes from the example repositories, use them, and adapt them to your own projects.

  • The XML file below ensures that all the tests in the automation suite run in the pipeline. The helper classes invoked from the AfterSuite class contain the code that calculates code coverage; this way, the LoremIpsum.xml file we add to the pipeline can trigger the coverage calculation.
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >

<suite name="Recommendation-Team" verbose="1" time-out="100">
    <test name="LoremIpsumTest">
        <parameter name="env" value="qa"/>
        <classes>
            <class name="loremIpsum.HealthCheckTest"/>
            <class name="loremIpsum.Dolor"/>
            <class name="loremIpsum.Sit"/>
            <class name="aftersuite.AfterSuiteClass"/>
        </classes>
    </test>
</suite>

Java

  • JaCoCo is a tool for measuring test coverage of Java code. You can follow the steps below to measure test coverage using the JaCoCo agent.
  • We fetch the required agent from [one of our repositories] using the GitLab REST API. You can create a new repository and fetch it from your own.
  • The actual source of the coverage agent is the jacocoagent.jar file. If you encounter any version-related errors, you can fetch a newer agent from JaCoCo.
  • Update the Dockerfile so that the API image can be built with automation test coverage calculation enabled.

Example Dockerfile:

    ...

ARG ARG_MODE
ARG GITLAB_API_TOKEN
...
RUN mkdir -p jacoco_bin/codecoverage

# Get dependencies
COPY pom.xml .
RUN mvn clean package -Dmaven.main.skip -Dmaven.test.skip -Dmaven.repo.local=.m2/repository

# Package the application
COPY . .
RUN mvn clean package -Dmaven.test.skip -Dmaven.repo.local=.m2/repository

RUN if [ "$ARG_MODE" = "automation_test" ] ; then \
        # fetch code coverage agent
        wget --header "PRIVATE-TOKEN: $GITLAB_API_TOKEN" "https://gitlab.lorem.ipsum/api/v4/projects/1114/repository/files/jacocoagent.jar/raw?ref=master" -O ./jacoco_bin/jacocoagent.jar ; \
        cp -R target/classes jacoco_bin/codecoverage/ ; \
        cp -R src jacoco_bin/codecoverage/ ; \
    fi
...
COPY --from=builder ./lorem-ipsum-java-api/jacoco_bin/ .
...
EXPOSE 36320
ENTRYPOINT ["./entrypoint.sh"]

Explanation of the Docker script:

1- mkdir -p jacoco_bin/codecoverage:
This command creates a directory named jacoco_bin/codecoverage inside the container. This directory will contain the JaCoCo-related files.

2- if [ "$ARG_MODE" = "automation_test" ] ; then ... fi:
This line starts a conditional statement to perform specific actions based on the value of the $ARG_MODE variable. If the value of $ARG_MODE is "automation_test", the commands within the condition will be executed.

3- wget ... -O ./jacoco_bin/jacocoagent.jar:
If $ARG_MODE is "automation_test", this command downloads the Jacoco agent's JAR file from the specified URL and saves it as ./jacoco_bin/jacocoagent.jar. This file is used by Jacoco to integrate with the main project and track test coverage.

4- cp -R target/classes jacoco_bin/codecoverage/:
This command copies the target/classes directory into jacoco_bin/codecoverage/, moving the compiled Java classes to a location accessible by JaCoCo.

5- cp -R src jacoco_bin/codecoverage/:
This command copies the src directory into jacoco_bin/codecoverage/, moving the source code files to a location accessible by JaCoCo.

6- COPY --from=builder ./lorem-ipsum-java-api/jacoco_bin/ .:
This command copies the lorem-ipsum-java-api/jacoco_bin/ directory from the previous build stage named "builder" into the working directory of the final image, so that the prepared JaCoCo files are included in it.

7- EXPOSE 36320:
This command specifies a port that the container will listen on for communication with the outside world. However, this is merely a port declaration; relevant configurations within the application need to be made to actually listen on this port.

8- ENTRYPOINT ["./entrypoint.sh"]:
Entrypoint details are below.

This Dockerfile is designed to prepare a container with the necessary files and configuration to measure code coverage using JaCoCo.

  • Add required script files to the repository

automation-test.sh: this script builds the API container and runs it in the background, then starts the automation tests.

# build application container with automation mode
docker build -t branch-java-api:1.0 . --build-arg ARG_MODE=automation_test --build-arg GITLAB_API_TOKEN="$GITLAB_API_TOKEN"

# creates a persistent data storage location (Docker volume) named "codecoverage" for sharing data between containers and the host machine.
docker volume create --name codecoverage

# run previously built docker image in background
nohup docker run -p 1012:1012 -p 36320:36320 -e QUARKUS_PROFILE=stage -e ARG_MODE=automation_test -v codecoverage:/app/codecoverage branch-java-api:1.0 &> automation_test.out &

# wait 10 seconds
sleep 10

# run automation test image with required parameters
docker run --net=host --env test_param="-DTestEnv=qa -Dsurefire.suiteXmlFiles=TestNG/JavaApi.xml -DCC_CLASSES=/app/codecoverage/classes -DCC_SRC=/app/codecoverage/src" -v codecoverage:/app/codecoverage \
"${GITLAB_REGISTRY_HOST}"/lorem/ipsum/automation-tests/java-api-automation-tests:latest

entrypoint.sh: update the entrypoint so that the required agent is added when "$ARG_MODE" = "automation_test"

#!/bin/sh
# default jvm parameters
JAVA_OPTS=""

if [ "$ARG_MODE" = "automation_test" ]; then
    printf "Code Coverage Setup \n"
    JAVA_OPTS="$JAVA_OPTS -javaagent:/app/jacocoagent.jar=address=*,port=36320,destfile=jacoco-it.exec,output=tcpserver"
fi

# run artifact
exec java ${JAVA_OPTS} -jar /app/${ARTIFACT_NAME}
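
Once jacoco-it.exec has been turned into a report (JaCoCo's CLI and Maven plugin can emit CSV alongside HTML), the numbers are easy to summarize in the pipeline. A sketch; the data row here is hand-written for illustration, and the column positions follow JaCoCo's CSV layout, where LINE_MISSED and LINE_COVERED are the 8th and 9th fields:

```shell
# a real jacoco.csv comes out of the JaCoCo report step;
# the data row below is illustrative only
cat > jacoco.csv <<'EOF'
GROUP,PACKAGE,CLASS,INSTRUCTION_MISSED,INSTRUCTION_COVERED,BRANCH_MISSED,BRANCH_COVERED,LINE_MISSED,LINE_COVERED,COMPLEXITY_MISSED,COMPLEXITY_COVERED,METHOD_MISSED,METHOD_COVERED
lorem,loremIpsum,HealthCheck,10,90,2,6,4,16,3,9,1,7
EOF

# sum LINE_MISSED ($8) and LINE_COVERED ($9) over all classes
awk -F, 'NR > 1 { missed += $8; covered += $9 }
         END { printf "line coverage: %.1f%%\n", 100 * covered / (covered + missed) }' jacoco.csv
```

With the sample row this prints `line coverage: 80.0%`; a threshold check on this number makes a natural gate in a stage with allow_failure: false.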
  • Add a gitlab-ci stage to run the automation in the pipeline
Automation Test:
  image: ${GITLAB_REGISTRY_HOST}/lorem-ipsum/base/image/docker:23.0.1
  tags:
    - environment=stage
  stage: Automation Test
  artifacts:
    when: always
    paths:
      - automation_test.out
  only:
    refs:
      - branches
  except:
    refs:
      - develop
  services:
    - name: ${GITLAB_REGISTRY_HOST}/lorem-ipsum/base/image/docker:23.0.1-dind
      command: [ "--tls=false" ]
      alias: docker
  variables:
    DOCKER_HOST: "tcp://docker:2375"
    DOCKER_TLS_CERTDIR: ""
    DOCKER_DRIVER: overlay2
    DOCKER_BUILDKIT: 1
  before_script:
    - docker login $GITLAB_REGISTRY_HOST -u $GITLAB_REGISTRY_USER -p $GITLAB_REGISTRY_PASS
  script:
    - chmod +x .deploy/automation-test.sh
    - .deploy/automation-test.sh
  allow_failure: false

Conclusion

In this article, I tried to explain how we measured how much of our project our automation covers. Since then, several other teams have integrated this work into their own application projects and pipelines, and test automation is now written more reliably.

We would like to thank Hakan Moray for his greatest support in our success 💛

And again, big thanks to Abdulkadir Karakoç for his help.

Hopefully, see you later in another story. 👋

Thank you for reading! ✨

If you want to give me feedback feel free to reach me via LinkedIn. 👩‍💻

Are you eager to be part of a dynamic team that loves exploring new technologies and embraces fresh challenges every single day?
Join us and let’s shape the future together! 💛 💻
