AWS Lambda load testing with JMeter

Srinivas Nali
Published in River Island Tech
7 min read · Oct 8, 2023

As part of River Island's robust QA practice, we've embarked on an exciting journey to optimise the performance of AWS Lambda functions within our event-driven architecture. How? By harnessing the power of JMeter, a versatile open-source tool, together with its awsmeter plugin, to push data messages into AWS SQS and so trigger our Lambdas into action.

Why Performance Testing Matters:

As businesses increasingly rely on microservices and serverless computing, the efficiency of AWS Lambda functions and the reliability of AWS SQS queues become pivotal.

Imagine this: your application encounters an unexpected surge in traffic — a peak load scenario. Can your Lambdas rise to the occasion swiftly and efficiently? Is there room for improvement in your setup? These are the questions that inspired our exploration, and today, we’re here to guide you through the very steps we’ve taken.

Load Test Scenario:

Let’s dive into a practical load test scenario to understand the performance of our AWS Lambda setup. Imagine our services are configured as follows:

  • We have an AWS Lambda function configured to be triggered by an SQS queue.
  • Whenever messages are added to the queue, the Lambda function automatically executes to process them (a minimal handler sketch follows below).
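
To make the trigger concrete, here is a minimal sketch of what such a handler might look like. It assumes a Python runtime and the standard SQS batch event that Lambda passes to the function; the processing step is only a placeholder.

import json

def lambda_handler(event, context):
    # Lambda delivers SQS messages in batches under event["Records"]
    for record in event["Records"]:
        body = json.loads(record["body"])
        # Placeholder: replace with your real processing / downstream call
        print(f"Processing message {body.get('id')}")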

Now, we want to answer two critical questions:

  1. Processing Time: How much time will the service take to process the peak load of messages?
  2. API Call Rate: What rate of API calls will be required to forward the processed messages downstream?

Let’s break down each of these questions:

Processing Time:

We want to determine how long it takes for our AWS Lambda function to process a substantial load of messages. In this scenario, we are simulating a continuous stream of messages being added to the SQS queue.

To calculate the processing time, we need to consider factors such as the Lambda’s execution time, any potential bottlenecks, and how efficiently it processes messages under load.
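
As a rough, back-of-the-envelope illustration (every number below is an assumption, not a measurement), the peak-load processing time can be estimated from the message volume, the SQS batch size, the average invocation duration, and the concurrency Lambda reaches:

peak_messages = 100_000          # assumed peak load
batch_size = 10                  # assumed messages per invocation (SQS trigger batch size)
avg_duration_s = 0.5             # assumed average Lambda execution time
concurrency = 50                 # assumed concurrent Lambda executions

invocations = peak_messages / batch_size
total_seconds = invocations * avg_duration_s / concurrency
print(f"Estimated processing time: {total_seconds:.0f} s")  # ~100 s with these numbers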

API Call Rate:

To send the processed messages forward, we need to understand the rate at which our AWS Lambda function generates API calls or invokes other services. This rate is crucial for scaling our infrastructure and ensuring timely message delivery.
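
Continuing the same illustrative assumptions, the outbound API call rate follows directly from the sustained message throughput and the number of calls made per message:

batch_size = 10                  # assumed messages per invocation
avg_duration_s = 0.5             # assumed average Lambda execution time
concurrency = 50                 # assumed concurrent Lambda executions
api_calls_per_message = 1        # assumed downstream calls per processed message

throughput = concurrency * batch_size / avg_duration_s       # messages processed per second
print(f"Downstream API call rate: {throughput * api_calls_per_message:.0f} calls/s")  # ~1000/s here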

In the next steps, we’ll set up a load test using JMeter to simulate this scenario and gather valuable insights into the performance of our AWS Lambda service.

Why JMeter and AWSMeter:

In our quest to optimise AWS Lambda performance, the choice of load testing tools was a critical decision. We opted for JMeter and its AWSMeter plugin for several compelling reasons:

  1. Open Source Power: JMeter is a versatile open-source tool that offers extensive capabilities for performance testing. It’s widely used, well-documented, and supported by a robust community.
  2. AWS Integration: AWSMeter, a JMeter plugin, specifically caters to testing AWS services, making it a natural fit for our AWS Lambda and SQS setup. This integration streamlines the testing process and ensures accurate results.
  3. Flexibility and Customisation: JMeter allows us to create highly customised load test scenarios, making it possible to mimic real-world usage patterns effectively. This flexibility is essential when dealing with complex serverless architectures.
  4. Cost-Efficiency: Being open source, JMeter is a cost-effective choice for load testing. It aligns with the efficiency principles of serverless computing, allowing us to optimise without breaking the bank.

Load Test Configuration

With JMeter and the AWSMeter plugin ready, we can configure and run the load test:

Check out my guide on Installing JMeter with AWSMeter Plugin for step-by-step instructions. In that guide, I also delve into the intricacies of thread properties and provide sample load scenarios to help you get started.

Java Request for SQS
  1. Create a New Test Plan: Under “Test Plan,” create a new “Thread Group” and a “Sampler.” Save the test plan with a descriptive name. (Thread properties should be configured according to your load requirements.)
  2. Choose Java Request: Within the sampler, select the Java Request option. Depending on your SQS type, choose the appropriate Java request (e.g., org.apache.jmeter.protocol.aws.sqs.SQSProducerStandardQueue for standard SQS).
  3. Fill in AWS Details: Configure the Java Request by providing your AWS access key ID, secret access key, session token (if required), SQS queue name, and message body (e.g., {"type": "", "id": "5432"}).
  4. Save the Test: Save the test plan with a meaningful name (e.g., test.jmx).
  5. Run the Test: Execute the test from the command line using the command jmeter -n -t test.jmx.
  6. Observe Message Flow: During the test execution, closely observe the flow of messages into the SQS queue and how the Lambda function processes them (a simple way to watch the queue depth is sketched below).
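
In practice it is also worth writing the results to a file and generating JMeter's HTML dashboard, for example jmeter -n -t test.jmx -l results.jtl -e -o report.

One lightweight way to watch the queue itself during the run is to poll its approximate depth with the AWS SDK. Here is a minimal sketch using boto3, where the region and queue name (my-test-queue) are placeholders and credentials are assumed to be configured already:

import boto3

sqs = boto3.client("sqs", region_name="eu-west-2")
queue_url = sqs.get_queue_url(QueueName="my-test-queue")["QueueUrl"]

attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=[
        "ApproximateNumberOfMessages",           # visible (waiting) messages
        "ApproximateNumberOfMessagesNotVisible"  # messages currently being processed
    ],
)
print(attrs["Attributes"])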

Data Parameterisation for Realistic Scenarios

In a real-world load test, you often need to simulate dynamic data and scenarios. JMeter provides powerful data parameterisation capabilities to achieve this. Let’s explore how to use variables, user-defined variables, random values, and CSV data feeds:

User-Defined Variables:

User-defined variables allow you to create and use custom variables with specific values in your test plan. Here’s how to set them up:

  1. Navigate to Test Plan > Add > Config Element > User Defined Variables.
  2. Add a name and value for your variable. For example, you can define a variable like user_define_variable_name with a value of Production.

Now, you can refer to this variable within the above JSON SQS message body using the ${user_define_variable_name} syntax. For example:

{
"type": "",
"id": "5432",
"Env": "${user_define_variable_name}"
}

This allows you to inject dynamic values into your test messages.

Random Values:

To introduce randomness into your test, you can generate random variables with specific properties. Here’s how:

  1. Go to Test Plan > Add > Config Element > Random Variables.
  2. Provide a name, minimum and maximum values, and a seed for your random variable.

Now, you can refer to this variable within the above JSON SQS message body using the ${random_variable_name} syntax. For example:

{
"type": "",
"id": "${random_variable_name}"
}

This is particularly useful when you need to simulate varying data, such as different user IDs or timestamps.

CSV Data Feeds:

For more complex data sets, you can feed data from a CSV file into your test plan. Here’s how:

  1. Navigate to Test Plan > Add > Config Element > CSV Data Set Config.
  2. Provide the path to your CSV file and specify variable names matching the column headings (comma-delimited) in your CSV file.

Now, you can refer to this variable within the above JSON SQS message body using the ${column_name} syntax. For example:

{
"type": "",
"id": "${column_name}"
}

Using CSV data feeds is beneficial when you need to simulate real-world data sources or scenarios, such as different product names or customer IDs.
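
For instance, a hypothetical CSV file named ids.csv with a header row of id,type (both the file name and the values are made up for illustration) could look like this:

id,type
5432,order_created
5433,order_created
5434,order_cancelled

With the CSV Data Set Config pointing at this file and the variable names set to match the headings, each JMeter iteration reads the next row, so ${id} and ${type} inject a different value into every message sent to the queue.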

By mastering data parameterisation techniques, you can create load tests that closely mimic real-world conditions and provide more accurate performance insights for your AWS Lambda function.

Post-Testing Actions for AWS Lambda Optimisation

Now that you’ve gathered valuable performance insights, it’s time to take actionable steps to optimise your serverless architecture. Here’s what you can do:

1. Analyse Metrics:

  • Dive deep into the performance metrics collected during testing.
  • Pay attention to response times, error rates, and resource utilisation.

2. Identify Bottlenecks:

  • Use the metrics to pinpoint bottlenecks and areas where improvements are needed.

3. Adjust Configuration:

  • Based on your findings, consider adjusting AWS Lambda configurations.
  • For example, tune concurrency limits, memory allocation, or the SQS trigger batch size.

4. Code Refactoring:

  • Refactor your Lambda function code to process data more efficiently.
  • Optimise algorithms and eliminate unnecessary processing steps.

5. Scaling Strategies:

  • Develop scaling strategies for your Lambda functions to handle spikes in demand gracefully.

6. Implement Monitoring:

  • Implement continuous monitoring and alerting to proactively detect and address performance issues in real time (a minimal alarm sketch follows below).
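
As one concrete illustration of this step (by no means the only option), a CloudWatch alarm on Lambda errors can be created with the AWS SDK. In this sketch the region, function name, and SNS topic ARN are all placeholders:

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

cloudwatch.put_metric_alarm(
    AlarmName="order-processor-errors",                                  # placeholder alarm name
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "order-processor"}],   # placeholder function
    Statistic="Sum",
    Period=60,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:eu-west-2:123456789012:alerts"],          # placeholder SNS topic
)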

By taking these post-testing actions, you’ll be well on your way to optimising your AWS Lambda functions for peak performance, ensuring your serverless architecture can thrive under any load.

With these steps, you not only identify performance bottlenecks but also actively work towards enhancing your AWS Lambda functions to deliver top-notch performance and reliability.

Conclusion: Optimising AWS Lambda Performance

In this journey of load testing AWS Lambda with JMeter and exploring various strategies for simulating real-world scenarios, we’ve gained valuable insights into optimising our serverless architecture. Remember, performance testing is not just about identifying bottlenecks; it’s about crafting a resilient and responsive system that can handle the demands of your users and applications.

As you embark on your own performance testing adventures, keep these key takeaways in mind:

  • Precise Configuration: Configure your load test scenarios with precision, aligning them with your real-world usage patterns and business objectives.
  • Data Parameterisation: Harness the power of data parameterisation to create dynamic and realistic tests, allowing you to validate your system’s performance under varying conditions.
  • Continuous Improvement: Performance testing is an iterative process. Regularly revisit and refine your test scenarios as your application evolves.
  • Monitoring and Analysis: Post-testing analysis is as crucial as the testing itself. Leverage the data gathered during tests to identify areas for improvement and make informed decisions.

With these principles in your arsenal, you’re well-equipped to optimise your AWS Lambda functions and ensure they deliver top-notch performance when it matters most.

So, as you venture into the world of load testing and AWS optimisation, remember to stay curious, keep testing, and most importantly, strive for excellence in your serverless endeavours.

Happy Testing.


In addition to practicing software testing, I have a strong passion for Test Automation and reading.