How to monitor End User Response Time of your Web Applications using JMeter + Webdriver, Containers, and Azure Pipelines

Responsiveness of a web application is one of the most important aspects which affects how happy users are with the application. For a modern web application, end-user response time is a good indication of responsiveness, which at a high level is the sum of the client side response time (time to bind elements, UI execution etc) and the backend response time (which includes backend server request processing time, network latency etc).

Generally, before application changes are promoted to production, the performance testing stage tests the backend response time, verifying that the application’s backend APIs respond as quickly as expected.

Typical JMeter tests against backend APIs

This testing validates that the response times of the key backend API calls meet the response time requirements of the application.

In many cases it is also important to validate that the changes made to the application have not substantially increased the client side response time, thereby increasing the end user response time. This can be done in the performance testing stage itself. JMeter is often the tool of choice for testing the performance of application backend APIs; in this post we will look at how we can use JMeter with the Webdriver plugin to validate that end user response times are at acceptable levels as well.

JMeter test with webdriver plugin configured

We will then see how we can get this configuration to work inside a Docker container, and how the Docker configuration can be extended to run this setup in Azure Pipelines using Azure Container Instances.

The code used in this post is available at this GitHub repository

How the JMeter tests with Webdriver plugin work

With typical JMeter execution only API response times are checked; the client side scripts are not executed by JMeter. Once the Webdriver plugin is configured we can use the JMeter Webdriver sampler to simulate user interaction with the browser (loading a page, entering a value in a text box, clicking a button etc.) using Selenium libraries along with a scripting language of your choice (JavaScript, Groovy etc.). The plugin also allows us to configure different browsers like Firefox, Chrome etc.

The BlazeMeter blog post jmeter-webdriver-sampler is an excellent resource; it explains how to configure the Webdriver plugin on your machine, along with sample JavaScript code.

For this post we are modifying the code a bit, mainly to enable logging messages to Stdout, so that these messages are available in container logs.

Webdriver script snippet

The above script does the following using Selenium:

  • First loads the DuckDuckGo home page
  • In the search text box enters “jmeter”
  • Clicks the search button. Our objective in this test is to measure the response time from the moment the search button click takes place to when the user can see the first search result appear. For this reason we invoke the Webdriver sampler's sampleStart before the button click and sampleEnd after we have located the first result link. The code block below highlights these key lines of the file:
var searchField = WDS.browser.findElement(pkg.By.id('search_form_input_homepage')); //saves search field into searchField
searchField.click(); //clicks search field
searchField.sendKeys(['jmeter']); //types word "jmeter" in field
lang.System.out.println("------------->Entered jmeter in Search Box");
var button = WDS.browser.findElement(pkg.By.id('search_button_homepage')); //Find Search button
WDS.sampleResult.sampleStart(); //captures sampler's start time
WDS.sampleResult.getLatency();
lang.System.out.println("------------->Webdriver sampler timer started");
button.click(); //Click Search Button
lang.System.out.println("------------->Clicked on the search button");
var link = WDS.browser.findElement(pkg.By.cssSelector('#r1-0 > div > h2 > a.result__a > b')); //locate the first search result for JMeter
lang.System.out.println("------------->Got Search link (Apache JMeter Page)");
WDS.sampleResult.sampleEnd();
  • Next the first search result is clicked and the JMeter home page is loaded

The entire JMeter jmx file, along with the embedded Webdriver sampler code, can be found at https://github.com/maniSbindra/jmeter-benchmark-end-user-response-time/blob/main/jmx/jmeter-webdriver.jmx.

Local execution on your machine

To execute this JMeter jmx file locally from your machine you can perform the following steps:

  • Clone the repo, cd into the repo folder and create a folder for JMeter results
$ git clone https://github.com/maniSbindra/jmeter-benchmark-end-user-response-time.git
$ cd jmeter-benchmark-end-user-response-time
$ mkdir results
  • Install the Webdriver plugin, either using the plugin manager or by downloading the plugin files as described at https://jmeter-plugins.org/?search=jpgc-webdriver
  • Download geckodriver, which enables the JMeter Webdriver plugin to communicate with the Firefox browser. Make a note of where geckodriver is stored on your system; we will need it in the next step.
  • You will also need Firefox on your machine
  • After this we execute JMeter in non-GUI mode with the following parameters.
$ jmeter -Dwebdriver.gecko.driver=/Users/mani/tmpdir/geckodriver -n -t jmx/jmeter-webdriver.jmx -l results/scriptresults.jtl -e -o results/jmeter-reports

We are asking JMeter to store the results in a jtl file, and also to publish the JMeter reports under the results folder.
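As a quick sanity check on that jtl output, the elapsed times can be summarized from the command line. This is a minimal sketch of my own, assuming the default CSV jtl format (where the second column is the elapsed time in milliseconds) and the results path used above:

```shell
#!/usr/bin/env bash
# Summarize end-user response times from the jtl file.
# Assumes the default CSV jtl format: column 2 is elapsed time (ms).
JTL="results/scriptresults.jtl"

if [ -f "$JTL" ]; then
  # Skip the header row, then accumulate count, sum and max of elapsed times.
  awk -F',' 'NR > 1 { sum += $2; n++; if ($2 > max) max = $2 }
             END { if (n) printf "samples=%d avg=%dms max=%dms\n", n, sum / n, max }' "$JTL"
else
  echo "no jtl yet: run the jmeter command above first"
fi
```

The standard HTML report gives far richer statistics; this is only handy for eyeballing a run without opening the report.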

The GIF below shows the local execution in action

Local JMeter execution in action

As we can see from the GIF above, on executing the JMeter script the Firefox browser comes up and the user actions are mimicked there. We see console logs with the messages we added, and a results folder with the jtl file and the standard JMeter reports is also available to us.

Console messages are as follows:

------------->Sample started
.
.
------------->Clicked on the search button
.
.
summary = 1 in 00:00:15 = 0.1/s Avg: 1153 Min: 1153 Max: 1153 Err: 0 (0.00%)
.
.

You can browse to results/jmeter-reports/index.html to see the output of the execution, including the end user response time (approx. 1.1 seconds from my local machine) for the search results page to load after the button click.

JMeter reports result page
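If you also want a run to fail automatically when the measured end user response time regresses, the jtl can be gated against a budget. A hedged sketch, assuming the same CSV jtl format and a hypothetical 2000 ms budget (the budget is my illustration, not part of the original setup):

```shell
#!/usr/bin/env bash
# Hypothetical gate: exit non-zero when the average elapsed time in the
# jtl exceeds a response time budget (2000 ms here; adjust to your SLA).
BUDGET_MS=2000
JTL="results/scriptresults.jtl"

if [ -f "$JTL" ]; then
  # Average the elapsed column (skipping the header row).
  AVG=$(awk -F',' 'NR > 1 { sum += $2; n++ } END { if (n) printf "%d", sum / n }' "$JTL")
  echo "average end user response time: ${AVG:-0} ms (budget ${BUDGET_MS} ms)"
  if [ "${AVG:-0}" -gt "$BUDGET_MS" ]; then
    echo "response time budget exceeded" >&2
    exit 1
  fi
fi
```

JMeter can also enforce this inside the test plan itself via a Duration Assertion; a shell gate like this is just convenient when you want the check in CI scripts.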

Next let us look at how we set up JMeter to execute with the Webdriver plugin and geckodriver inside a Docker container.

Local execution on machine using Linux docker container

This section assumes that you have Docker configured to run containers locally. When executing inside a container we still need JMeter, the Webdriver plugin, geckodriver and the Firefox browser within the container. Additionally, since the container has no display hardware, we will use the X virtual framebuffer (Xvfb), which enables Selenium to emulate user actions in a browser within the container.

Let us look at the key sections of the sample Dockerfile (the Dockerfile and Dockerfile-run.sh for this section can be found at https://github.com/maniSbindra/jmeter-benchmark-end-user-response-time/tree/main/Docker):

FROM ubuntu:18.04

# Install packages including xvfb
RUN apt-get update && apt-get install default-jdk -y && apt-get install unzip xvfb libxi6 libgconf-2-4 -y

RUN mkdir /opt/jmeter && cd /opt/jmeter && apt install wget -y
# Get JMeter
RUN wget https://downloads.apache.org//jmeter/binaries/apache-jmeter-5.3.zip && unzip apache-jmeter-5.3.zip
# Install Webdriver plugin
RUN cd apache-jmeter-5.3 && wget https://jmeter-plugins.org/files/packages/jpgc-webdriver-3.2.zip && unzip jpgc-webdriver-3.2.zip
# Install geckodriver
RUN cd /opt/jmeter && mkdir bin && cd bin && wget https://github.com/mozilla/geckodriver/releases/download/v0.27.0/geckodriver-v0.27.0-linux32.tar.gz && tar -xzf geckodriver-v0.27.0-linux32.tar.gz
# Install Firefox
RUN apt install firefox -y
# Add JMeter and geckodriver folders to PATH
ENV PATH="/apache-jmeter-5.3/bin:/opt/jmeter/bin:${PATH}"
# Set xvfb Screen Number
ENV DISPLAY=:99

RUN chmod +x /opt/jmeter/bin/geckodriver
RUN chown root:root /opt/jmeter/bin/geckodriver

Here we install the required components JMeter, Webdriver plugin, geckodriver, xvfb and Firefox. We also add JMeter and geckodriver to PATH.

The versions of JMeter (5.3), the Webdriver plugin and geckodriver are those which I had installed locally on my machine. These can be modified as needed.

Note: the Dockerfile above is not optimized. If you are planning to use this for production workloads it is recommended that you optimize it (with minimal packages installed and a slimmer base image).

Let us now run this within Docker container:

  • cd into the Docker folder and build the container image
$ cd Docker
$ docker build -t jmeter-webdriver:0.5 .
  • Run the Docker container: to do this we attach volumes containing the JMX file/s (jmx/), and the results directory (results/) where we want JMeter to publish the jtl file and the reports. We run the container in interactive mode with bash as the shell.
# cd back to repository root if required
$ cd ..
$ docker run -it \
-v $(pwd)/results:/opt/jmeter/results \
-v $(pwd)/jmx:/opt/jmeter/jmx \
jmeter-webdriver:0.5 bash
  • The above command brings us to the bash prompt within the container. The next set of commands shown below are executed within the container. We start the virtual framebuffer in the background and associate it with virtual screen 99, then cd into the /opt/jmeter folder, and finally execute the jmeter command, which produces the console summary and publishes the reports to the results folder.
$ Xvfb :99 &
$ cd /opt/jmeter
$ jmeter -n -t jmx/jmeter-webdriver.jmx -l results/scriptresults.jtl -e -o results/jmeter-reports
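The three in-container commands above can also be wrapped into one script so that Xvfb is always cleaned up, even if JMeter fails. This is a sketch of my own (not a script the repo ships), assuming the paths baked into the image above; the guard makes it a harmless no-op anywhere else:

```shell
#!/usr/bin/env bash
# Hypothetical wrapper for the in-container steps: start Xvfb on screen 99,
# run the JMeter test, and always stop Xvfb on exit via the trap.
if command -v Xvfb >/dev/null && command -v jmeter >/dev/null; then
  export DISPLAY=:99
  Xvfb :99 &
  XV_PID=$!
  # Ensure the framebuffer process is stopped however the script exits.
  trap 'kill "$XV_PID" 2>/dev/null || true' EXIT

  cd /opt/jmeter
  jmeter -n -t jmx/jmeter-webdriver.jmx -l results/scriptresults.jtl -e -o results/jmeter-reports
else
  echo "Xvfb/jmeter not found; run this inside the container built above"
fi
```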

The console output is similar to what we saw earlier:

------------->Sample started
.
.
------------->Clicked on the search button
.
.
summary = ...
.

Let us look at the GIF below, which shows the whole process in action.

Local execution in docker container

Now that we know how this execution can happen using a Docker container, let us look at how we can create an Azure Pipeline to execute this JMeter + Webdriver test using an Azure Container Instance.

Execution from an Azure Pipeline using Azure Container Instance

The strategy used here will be similar to the one described in the post “Load testing private apps/apis in Azure Pipelines using Azure Container Instances”; the only difference is that since we do not need to test private endpoints here, we will not create the Azure Container Instance in a specific subnet.

The yaml of the entire pipeline can be found at https://github.com/maniSbindra/jmeter-benchmark-end-user-response-time/blob/main/azure-pipeline-aci-jmeter-selenium-headless.yml

The pipeline consists of 3 stages. Before we look at the details of the 3 stages, let us look at the key pipeline variables:

  1. ACI_RESOURCE_GROUP_NAME: Name of the Azure resource group where the Azure Container Instance will be created. This resource group needs to be created prior to pipeline execution.
  2. AzDO_POOL: Name of the Azure DevOps agent pool to which the Azure Container Instance will register as an agent. The default value of this in the pipeline is “jmeter-webdriver”. For details of how you can create an agent pool you can refer to this docs link.
  3. AzDO_AGENT_IMAGE: This refers to the container base image used for the Azure Container Instance.
  4. AzDO_TOKEN: In our example this value is passed while executing the pipeline, using a pipeline secret variable. This value should be fetched from Azure Key Vault for production scenarios. This is the Azure DevOps PAT token, which is used by the Azure Container Instance to register as an Azure DevOps agent. The token needs to have permissions to read and manage agent pools, as mentioned in this link.
  5. AzDO_ORGANIZATION: This is the name of your Azure DevOps organization. This value is currently passed while executing the pipeline.
Variables injected at runtime

Let us now take a detailed look at the stages of the pipeline

Stage 1:

Our objective is to minimize the cost of the performance test runs, so we will dynamically provision the Azure Container Instance to execute the tests, and then destroy it after the tests.

The first stage will be executed on a Microsoft Hosted Agent. It will provision an Azure Container Instance, install the software required to execute the JMeter + Webdriver tests (as discussed in the previous section), and then register this Azure Container Instance as a self hosted Azure DevOps agent in the “jmeter-webdriver” Azure DevOps agent pool.

Let us look at the key sections of yaml for this stage:

- stage: initialize_benchmark_testing_infrastructure_in_rg
  jobs:
  - job: initialize_benchmark_testing_infrastructure_in_rg
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - task: AzureCLI@1
      displayName: "create load test infra in resource group"
      inputs:
        azureSubscription: 'jmeter-webdriver-rg'
        scriptLocation: 'inlineScript'
        inlineScript: |
          CURRENT_ACI_COUNT=$(az container list -o table | grep $ACI_INSTANCE_NAME | grep $ACI_RESOURCE_GROUP_NAME | wc -l)
          if [ $CURRENT_ACI_COUNT -gt 0 ];
          then
            echo "ACI instance for the release already exists";
          else
            echo "ACI instance does not exist. Creating .......";
            az container create \
              --name $(ACI_INSTANCE_NAME) \
              --resource-group $(ACI_RESOURCE_GROUP_NAME) \
              --cpu $(NUMBER_OF_CPUS) \
              --memory $(MEMORY_GB) \
              --command-line "/bin/bash -c 'apt-get update && apt-get install -y default-jdk libxi6 libgconf-2-4 firefox && mkdir /opt/jmeter && cd /opt/jmeter && wget https://downloads.apache.org//jmeter/binaries/apache-jmeter-5.3.zip && unzip apache-jmeter-5.3.zip && cd apache-jmeter-5.3 && wget https://jmeter-plugins.org/files/packages/jpgc-webdriver-3.2.zip && unzip jpgc-webdriver-3.2.zip && mkdir -p /usr/share/gecko/bin && cd /usr/share/gecko/bin && wget https://github.com/mozilla/geckodriver/releases/download/v0.27.0/geckodriver-v0.27.0-linux32.tar.gz && tar -xzf geckodriver-v0.27.0-linux32.tar.gz && export DISPLAY=:99 && mkdir -p /jmx && mkdir -p /results && /vsts/start.sh'" \
              --image $(AzDO_AGENT_IMAGE) -e VSTS_TOKEN=$(AzDO_TOKEN) VSTS_ACCOUNT=$(AzDO_ORGANIZATION) VSTS_POOL=$(AzDO_POOL) VSTS_AGENT=$(ACI_INSTANCE_NAME)
          fi

We use the Azure CLI task to provision the Azure Container Instance. The azureSubscription input is the name of the Azure service connection in Azure DevOps.

In the inline script we first check if the required Azure Container Instance is already running; if not, we create it using the az container create command.

To create the container we pass in parameters like the CPUs, memory and the Docker container image name. One option in a production scenario is to create our own container image, push it to an Azure Container Registry and pass the name of that image. Here, to keep things simple, we use one of the standard Azure DevOps container images (mcr.microsoft.com/azure-pipelines/vsts-agent:ubuntu-16.04-docker-18.06.1-ce-standard) as the base image, and install the required software (JMeter, the Webdriver plugin, geckodriver and Firefox) by overriding the --command-line parameter. If we look at the --command-line parameter values more closely, we will see they are similar to the Dockerfile which we saw in the “Local execution on machine using Linux docker container” section of this post.

At the end of this stage, the Azure Container Instance is up with all required software to execute the JMeter + Webdriver tests, and has registered as an Azure DevOps agent under the jmeter-webdriver agent pool.

Agent Online in Azure DevOps Agent Pool

Note: since the base container image for the Azure DevOps agent is large, it might take several minutes for the agent to come online.
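While waiting for the agent to appear in the pool, you can poll the container state from a local shell. This is a hypothetical helper of my own, assuming you are logged in with the az CLI and have exported the same ACI_INSTANCE_NAME / ACI_RESOURCE_GROUP_NAME values the pipeline uses; otherwise it does nothing:

```shell
#!/usr/bin/env bash
# Hypothetical wait loop: poll the ACI state until it reports Running.
# Assumes az login has been done and the two variables below are exported.
if [ -z "${ACI_INSTANCE_NAME:-}" ] || [ -z "${ACI_RESOURCE_GROUP_NAME:-}" ] || ! command -v az >/dev/null; then
  echo "need the az CLI plus ACI_INSTANCE_NAME and ACI_RESOURCE_GROUP_NAME exported"
else
  for attempt in $(seq 1 40); do          # bounded: roughly 10 minutes max
    STATE=$(az container show \
      --name "$ACI_INSTANCE_NAME" \
      --resource-group "$ACI_RESOURCE_GROUP_NAME" \
      --query "instanceView.state" -o tsv 2>/dev/null || true)
    echo "attempt $attempt: ACI state: ${STATE:-unknown}"
    [ "$STATE" = "Running" ] && break
    sleep 15
  done
fi
```

Note that the container reaching Running state is necessary but not sufficient: the agent still has to finish its own registration before it shows as online in the pool.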

Stage 2:

This stage executes on the agent created in the previous stage (jmeter-webdriver pool). Let us look at the key sections of yaml for this stage.

This stage has 4 different steps:

- Step 1: In this step we do the following:

We add the paths of the JMeter and geckodriver binaries to PATH

export PATH="/opt/jmeter/apache-jmeter-5.3/bin:/usr/share/gecko/bin:$PATH"

We run Xvfb in the background, and note its process ID

Xvfb :99 &
XV_PID=$!

After this we create the results folder, then execute the JMeter command, passing in the paths where we want the reports and jtl file to be published by JMeter.

mkdir -p $(System.DefaultWorkingDirectory)/results
jmeter -n -t $(System.DefaultWorkingDirectory)/jmx/jmeter-webdriver.jmx -l $(System.DefaultWorkingDirectory)/results/scriptresults.jtl -e -o $(System.DefaultWorkingDirectory)/results/jmeter-reports

Finally we kill the Xvfb process after the JMeter tests are complete

kill -9 $XV_PID

- Step 2: In this step we convert the JMeter results file to JUnit format, so that a subsequent step can publish the load test report and make it available in the Azure DevOps Tests tab

python $(System.DefaultWorkingDirectory)/junit-onverter.py $(System.DefaultWorkingDirectory)/results/scriptresults.jtl $(System.DefaultWorkingDirectory)/results/junit-result.xml
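To make the jtl-to-JUnit mapping concrete, here is a toy illustration of my own (not the actual Azure-Samples converter script downloaded above) that turns the first jtl sample row into a minimal JUnit testcase element:

```shell
#!/usr/bin/env bash
# Toy illustration only: emit a minimal JUnit XML testcase for the first
# jtl sample. Assumes the default CSV jtl format (col 2 elapsed ms, col 3 label).
JTL="results/scriptresults.jtl"

if [ -f "$JTL" ]; then
  awk -F',' 'NR == 2 {
    printf "<testsuite tests=\"1\">\n"
    printf "  <testcase name=\"%s\" time=\"%.3f\"/>\n", $3, $2 / 1000
    printf "</testsuite>\n"
  }' "$JTL"
else
  echo "no jtl yet: run the JMeter step first"
fi
```

The real converter handles multiple samples, failures and error messages; this only shows why JUnit format is useful, since the Tests tab understands it natively.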

- Step 3: In this step we publish the JUnit format report and make it available in the Azure DevOps Tests tab

- Step 4: In this step we publish the results folder, which contains the result jtl file as well as the jmeter-reports folder containing the JMeter generated reports. After this step the results folder contents are available as artifacts for the pipeline execution.

The full YAML for this stage is as follows:

- stage: execute_Jmeter_Webdriver_test_and_publish_report
  jobs:
  - job: execute_JMeter_Webdriver_tests
    pool: $(AzDO_POOL)
    steps:
    - script: |
        set -x
        export DISPLAY=:99
        export PATH="/opt/jmeter/apache-jmeter-5.3/bin:/usr/share/gecko/bin:$PATH"
        Xvfb :99 &
        XV_PID=$!
        echo $(System.DefaultWorkingDirectory)
        ls -al $(System.DefaultWorkingDirectory)
        ls -al /
        mkdir -p $(System.DefaultWorkingDirectory)/results
        jmeter -n -t $(System.DefaultWorkingDirectory)/jmx/jmeter-webdriver.jmx -l $(System.DefaultWorkingDirectory)/results/scriptresults.jtl -e -o $(System.DefaultWorkingDirectory)/results/jmeter-reports
        kill -9 $XV_PID
      displayName: 'Execute JMeter Webdriver Tests'

    - script: |
        echo "Convert JMeter Report to JUNIT format"
        cd $(System.DefaultWorkingDirectory)/results
        wget https://raw.githubusercontent.com/Azure-Samples/jmeter-aci-terraform/main/scripts/jtl_junit_converter.py -O $(System.DefaultWorkingDirectory)/junit-onverter.py
        python $(System.DefaultWorkingDirectory)/junit-onverter.py $(System.DefaultWorkingDirectory)/results/scriptresults.jtl $(System.DefaultWorkingDirectory)/results/junit-result.xml
      displayName: 'Convert Report to JUnit format'

    - task: PublishTestResults@2
      displayName: 'Publish Test Results'
      inputs:
        testResultsFormat: JUnit
        testResultsFiles: $(System.DefaultWorkingDirectory)/results/junit-result.xml
        failTaskOnFailedTests: false

    - task: PublishBuildArtifacts@1
      displayName: "Publish Build Artifacts"
      inputs:
        pathToPublish: '$(System.DefaultWorkingDirectory)/results'
        artifactName: 'published-results'

Stage 3:

This stage executes on the Microsoft hosted agent. If the DELETE_TEST_INFRA variable is set during pipeline execution, then this stage will delete the Azure Container Instance. The yaml for this stage is:

- stage: clean_up_test_infrastructure
  jobs:
  - job: delete_aci_instance
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - task: AzureCLI@1
      displayName: "clean up load test infra (delete azure container instance)"
      inputs:
        azureSubscription: 'jmeter-webdriver-rg'
        scriptLocation: 'inlineScript'
        inlineScript: |
          if [ $(DELETE_TEST_INFRA) == "TRUE" ];
          then
            echo "Deleting ACI Instance ......";
            az container delete --name $(ACI_INSTANCE_NAME) --resource-group $(ACI_RESOURCE_GROUP_NAME) --yes
          else
            echo "Not deleting ACI Instance as per pipeline configuration .......";
          fi

Let us look at some screenshots of the pipeline execution:

The images below show one of the executions of the pipeline, the Tests tab and the published Artifacts

Azure Pipelines stage execution overview
Azure Pipelines tests tab
Published Artifacts

Let us also inspect the JMeter + Webdriver test execution logs; as expected these are very similar to the earlier two executions (local, and local within the Docker container)

JMeter+Webdriver test console execution logs

Bonus: If you want to publish the JMeter reports to an Azure Pipelines tab, you can use this awesome extension by Lakshay Kaushik PublishHTMLReports.

Thanks for reading this post. I hope you liked it. Please feel free to write your comments and views about the same over here or at @manisbindra

Microsoft Azure

Any language. Any platform. Our team is focused on making the world more amazing for developers and IT operations communities with the best that Microsoft Azure can provide. If you want to contribute in this journey with us, contact us at medium@microsoft.com

Written by Maninderjit (Mani) Bindra

Cloud, Containers, K8s, DevOps | CKA | LFCS | Principal Software Engineer @ Microsoft
