Integration Test For AWS Lambdas — Easier than you might think (+TestContainers!).

Leonardo Galani · Published in assert(QA) · Mar 15, 2022
I think this reference only works for Brazilians :D

Whenever you search for the title of this article, you will find some AWS articles saying how essential integration tests are and suggesting that you either run them in the AWS console (which is a manual operation) or use something like LocalStack.

As you know, there is no "manual" step in a CI/CD pipeline, which meant I had to find another way to get those automated tests into my pipeline. Still, I didn't want to add extra dependencies like LocalStack to the projects I'm working on.

TL;DR — the example and solutions I describe in this short article are for when you have a Node.js lambda project and want to exercise its integrations with dependencies like an SQS queue or a database. You can still use them as inspiration if you work with a different programming language.

TL;DR 2 — If you want a more detailed explanation, you can follow the official documentation here → https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-test-and-debug.html

---

AWS SAM

If you are building AWS lambdas, you are almost certainly using SAM, and you probably know a thing or two about the SAM CLI's capabilities.

The thing you might not know is that SAM runs your lambda function inside a container itself, meaning that if you want to test things locally, you need to create a network bridge between SAM and the containers running your dependencies.

docker network create <your_network_name> 

Things get more manageable if your local dependencies already run with docker-compose, because you can reuse the same network in the docker-compose file.

# docker-compose.yml
networks:
  <your_network_name>:
    name: <your_network_name>
    external: true
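
For reference, a fuller docker-compose.yml could look like the sketch below. The sqs service name and the ElasticMQ image are assumptions on my side (the 9324 port used later matches ElasticMQ's default); any SQS-compatible container works, as long as it joins the external network.

# docker-compose.yml (sketch — service name and image are assumptions)
version: "3.8"
services:
  sqs:
    image: softwaremill/elasticmq-native # SQS-compatible stand-in, listens on 9324
    ports:
      - "9324:9324"
    networks:
      - <your_network_name>
networks:
  <your_network_name>:
    name: <your_network_name>
    external: true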

Now you can update your .env file (or however you store the URLs of your dependencies) and point them at the service names from your docker-compose file.

export SQS_QUEUE_URL=http://sqs:9324/queue/yourQueue

And yes, with this setup you don't need to use host.docker.internal as the host in those URLs anymore.

---

Now that you have your network and your dependencies up and running, you can invoke your local lambda with SAM:

sam local invoke Function_Name --event <path_to_json_file> --docker-network <your_network_name>
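
The event file is simply the JSON payload your lambda would receive in production. Below is a minimal SQS event sketch (the body and ARN are made-up values); if you prefer, the SAM CLI can generate a template for you with sam local generate-event sqs receive-message.

{
  "Records": [
    {
      "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d",
      "body": "{\"orderId\": 123}",
      "eventSource": "aws:sqs",
      "eventSourceARN": "arn:aws:sqs:eu-central-1:123456789012:yourQueue",
      "awsRegion": "eu-central-1"
    }
  ]
}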

See how easy that was? The good thing is that this also works in CI pipelines on GitHub Actions (and probably on GitLab too).

"But how to write tests to this?"

I came up with two simple (and admittedly dumb) solutions for writing those tests.

The first one is pretty simple, but the execution time can get high once you have multiple scenarios.

When you call sam local invoke, it returns whatever your lambda returns, so you can assert on the result with something like this (sample code in JS + Node.js + Jest):

import { execSync } from "child_process";

// fullPath points to the JSON event file for this scenario
const script = `sam local invoke MyFunction --event ${fullPath} --docker-network MyDockerNetwork`;
const rawResult = execSync(script).toString();
const result = JSON.parse(rawResult);

expect(result.statusCode).toBe(200);

As you can see, if you have multiple test cases, you will be spinning up the SAM container, executing the lambda, and tearing it down for every single one of them.
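
To make that cost concrete, here is a sketch of what a small suite looks like with this approach (the function name, event files, and expected status codes are placeholders for your own scenarios): every it() shells out to sam local invoke, so every case pays for a fresh container.

import { execSync } from "child_process";

// sam local invoke is slow, so give Jest a generous timeout
jest.setTimeout(120000);

// Each call spins up a fresh SAM container and tears it down again
const invoke = (eventFile) =>
  JSON.parse(
    execSync(
      `sam local invoke MyFunction --event ${eventFile} --docker-network MyDockerNetwork`
    ).toString()
  );

describe("MyFunction — one sam local invoke per scenario", () => {
  it("accepts a valid message", () => {
    expect(invoke("events/valid.json").statusCode).toBe(200);
  });

  it("rejects a malformed message", () => {
    expect(invoke("events/malformed.json").statusCode).toBe(400);
  });
});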

To avoid this kind of problem, you can use another command from the SAM CLI:

sam local start-lambda --docker-network <your_network_name>

Once you have your local server up, you can create your lambda client once and invoke it as many times as you want, without setting everything up and tearing it down for every scenario:

import { Lambda } from "aws-sdk";

const lambda = new Lambda({
  endpoint: 'http://127.0.0.1:3001',
  sslEnabled: false,
  region: 'eu-central-1',
  accessKeyId: 'any',
  secretAccessKey: 'any'
});

const eventParams = {
  FunctionName: 'MyFunction',
  InvocationType: 'RequestResponse',
  Payload: JSON.stringify(event)
};

const result = await lambda.invoke(eventParams).promise();

expect(result.StatusCode).toBe(200);

In the example above, you can use dummy values to create the Lambda client because it connects to your local SAM endpoint and not to your account on AWS.

Also, even if you are working with events and not API Gateway, you still need to use InvocationType: 'RequestResponse', because you want to wait for the response.
If you use the 'Event' invocation type, the invoke call will not wait for the response and just returns a 202 without a body.
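
Just to illustrate the difference, a small sketch reusing the lambda client and event payload from the example above:

// Fire-and-forget: SAM accepts the invocation and returns right away
const asyncResult = await lambda
  .invoke({
    FunctionName: 'MyFunction',
    InvocationType: 'Event',
    Payload: JSON.stringify(event)
  })
  .promise();

expect(asyncResult.StatusCode).toBe(202); // no payload to assert on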

The payload can be the same one you were using before with local invoke.

---

Wrapping things together (UPDATED!)

Now it's just a matter of how you create the pipeline to run those tests.

To make things easier, I'm using Testcontainers, which can use your docker-compose.yml to start all the dependencies.

> https://github.com/testcontainers/testcontainers-node

I also created a helper to spin up and tear down SAM's local lambda server:

// helper.js
import { exec } from "child_process";

function startLambda() {
  exec(`sam local start-lambda --docker-network <your_network_name> &> /dev/null & disown`);
}

async function stopLambda() {
  exec(`kill $(ps aux | grep '[s]am local start-lambda' | awk '{print $2}')`);
}

export { startLambda, stopLambda };

Now you can do something like this in your test:

import {
  DockerComposeEnvironment,
  StartedDockerComposeEnvironment,
  Wait
} from "testcontainers";
import { startLambda, stopLambda } from "./helper";

// ... your code + previous examples

describe(..., () => {
  let environment: StartedDockerComposeEnvironment;

  beforeAll(async () => {
    startLambda();
    environment = await new DockerComposeEnvironment(process.cwd(), "docker-compose.yml")
      .withWaitStrategy("sqs_1", Wait.forLogMessage("some string"))
      .up();
  });

  afterAll(async () => {
    stopLambda();
    await environment.down();
  });

  // (...)
});

In the code above, you can see that it starts the lambda server and then does the compose up with a wait strategy that looks for a specific string in the service log.
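
Inside that describe block, each test case then reuses the lambda client from the earlier example. A sketch of a single case (the client is assumed to be created in scope, and validSqsEvent is a placeholder payload):

it("processes a valid SQS message", async () => {
  const result = await lambda
    .invoke({
      FunctionName: 'MyFunction',
      InvocationType: 'RequestResponse',
      Payload: JSON.stringify(validSqsEvent)
    })
    .promise();

  expect(result.StatusCode).toBe(200);
});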

You can find the full documentation in their README file :)
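
And since all of this is just Docker plus the SAM CLI, wiring it into GitHub Actions is straightforward. A minimal workflow sketch (the action versions, Node version, and npm scripts are assumptions about your setup):

# .github/workflows/integration-tests.yml (sketch)
name: integration-tests
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest # Docker is already available on this runner
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - uses: aws-actions/setup-sam@v2
      - run: docker network create <your_network_name>
      - run: npm ci
      - run: sam build
      - run: npm test # the Jest suite starts docker-compose via Testcontainers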

---


If you're wondering about the cover image of this article, wonder no more!

"LAMBaDA", the forbidden dance :D
