Set up custom AWS Lambda (λ) function dependencies using Docker containers

George Bakas · Published in Innovation-res · 5 min read · Jun 22, 2022

Let’s face it… When it comes to running serverless applications, AWS Lambda (λ) is one of the easiest services to develop for and deploy to. With AWS λ you can create functions, self-contained applications written in one of the supported languages and runtimes, and upload them to AWS Lambda, which executes them in an efficient and flexible manner.

Although this flexibility is helpful, your applications will sometimes have many dependencies. To address this, AWS introduced layers [1]. However, working with layers can be somewhat difficult, and there are several limitations regarding both size and flexibility.

[1] https://aws.amazon.com/blogs/compute/working-with-aws-lambd

Once again, containers come to the rescue [link1, link2]. AWS offers the ability to deploy Lambda functions using a container image tailor-made for the AWS Lambda service. The image can be found in the public Amazon ECR gallery (public.ecr.aws/lambda/python).

Let’s get our hands dirty and deploy a Lambda function using the AWS Lambda (λ) container image. Our problem is as follows: we will be using a custom KeyBERT implementation (repo) that extracts keyphrases from any given input. In our case, this is the title and abstract of a given paper.

├── app.py
├── Dockerfile
├── requirements.txt
├── utils.py
└── weights/

The structure of the project is given above.

The weights directory includes files related to KeyBERT inference. In this tutorial we will not go into details on how this model works.

The Dockerfile is provided:

FROM public.ecr.aws/lambda/python:3.8

ENV S3_BUCKET=#####
ENV S3_KEY=######
ENV S3_SECRET=#######

# Install Python dependencies for function
COPY requirements.txt .
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

COPY . ${LAMBDA_TASK_ROOT}

CMD [ "app.handler" ]

As you can see, this is a fairly easy Dockerfile to understand and deploy. It uses the Lambda (λ) image provided on public ECR as its base. We set the environment variables needed by our application and install the required packages into the ${LAMBDA_TASK_ROOT} directory.

The Dockerfile expects that a file named app.py exists, containing a handler function that acts as the entry point (the "main" function) for our scope.
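For reference, here is a minimal sketch of what such a handler could look like. The get_key_phrases() helper is described below; the event field names and the response format are assumptions made for illustration.

import json
from utils import get_key_phrases  # assumed helper, described below

def handler(event, context):
    # When invoked via API Gateway, the request payload arrives under a "body" key
    body = json.loads(event.get("body", "{}"))
    # The "title" and "abstract" field names are assumptions for illustration
    phrases = get_key_phrases(f"{body['title']}. {body['abstract']}")
    return {
        "statusCode": 200,
        "body": json.dumps({"keyphrases": phrases}),
    }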

requirements.txt

boto3
keybert

We include app.py, the main Python script that runs and handles requests, and utils.py, the file that includes get_key_phrases(), which uses KeyBERT to extract phrases from a given text. Pretrained weights can be found here.
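For illustration, a minimal sketch of what get_key_phrases() might look like with the keybert package follows; the n-gram range and the number of returned phrases are assumptions.

from keybert import KeyBERT

def get_key_phrases(text, top_n=10):
    # Instantiate the model (in our setup this is backed by the files in weights/)
    kw_model = KeyBERT()
    # extract_keywords returns (phrase, score) tuples; we keep only the phrases
    keywords = kw_model.extract_keywords(
        text, keyphrase_ngram_range=(1, 3), top_n=top_n
    )
    return [phrase for phrase, score in keywords]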

Once you have included all files in the directory, you are now capable of building and uploading your container to Amazon ECR. A very complete guide on how to push your container to ECR can be found here:

https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html
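In short, the flow looks roughly like this; the region, account ID and the repository name keybert-lambda are placeholders and assumptions:

# Authenticate Docker against your private ECR registry
aws ecr get-login-password --region <region> | docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com

# Build, tag and push the image (repository name is hypothetical)
docker build -t keybert-lambda .
docker tag keybert-lambda:latest <account-id>.dkr.ecr.<region>.amazonaws.com/keybert-lambda:latest
docker push <account-id>.dkr.ecr.<region>.amazonaws.com/keybert-lambda:latest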

We have finished the 1st step! Now we need to deploy the Lambda function.

Log in to your AWS account → Lambda (λ) → Create Function → Container Image

Select Container Image and click on Browse Images

Find the image you uploaded to ECR using the Browse Images button

Select the repository and tag

Once configured, click on Create Function
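If you prefer the command line, the same can roughly be achieved with the AWS CLI; the function name, image URI and execution role below are hypothetical placeholders:

aws lambda create-function \
  --function-name keybert-lambda \
  --package-type Image \
  --code ImageUri=<account-id>.dkr.ecr.<region>.amazonaws.com/keybert-lambda:latest \
  --role arn:aws:iam::<account-id>:role/<your-lambda-execution-role>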

To trigger the lambda function you developed, there are two methods one can follow:

  1. Using boto3

For this method, you can find a tutorial for invoking your Lambda function here.

With this method, you may need to change app.py, as the payload you send to your function does not contain a ‘body’ key by default. On the other hand, you are free to structure the payload as you like.
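A minimal boto3 invocation could look like the following; the function name and the payload fields are assumptions:

import json
import boto3

client = boto3.client("lambda", region_name="<region>")

# Plain payload, without the "body" wrapper that API Gateway adds
payload = {"title": "xxxxxxx", "abstract": "xxxxxxxx"}

response = client.invoke(
    FunctionName="keybert-lambda",     # hypothetical function name
    InvocationType="RequestResponse",  # synchronous invocation
    Payload=json.dumps(payload),
)
result = json.loads(response["Payload"].read())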

2. Using AWS API Gateway

Lambda function overview

Click on your function and click on Add trigger

API Configuration

We are using a simple HTTP API configuration. REST APIs support more features than HTTP APIs, while HTTP APIs are designed with minimal features so that they can be offered at a lower price. If you need a more sophisticated API with more utilities, AWS gives you the ability to create a REST API of your own (link here).

Once you have created your API, it is time to invoke it! Let’s make a quick test. Go to the API Gateway service, where you will find the API you just created. Click on it.

In the stages of the API you have developed, you will find a URL. This is the invoke URL. Copy it!

We will be using Postman to test invoking our API.

Postman is an API platform for building and using APIs. Postman simplifies each step of the API lifecycle and streamlines collaboration so you can create better APIs — faster.

Postman will show you the status of your output.

Typically, it should be a JSON of your liking, as you have defined it in app.py. In our case it is something like this…

Lambda Function API response
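Purely as an illustration, and assuming the handler returns a keyphrases list as sketched earlier, the response body would look something like:

{
    "keyphrases": ["keyphrase one", "keyphrase two", "keyphrase three"]
}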

Once your test is complete, you may also want to know how to invoke the API using Python. This is easy; we provide the code in the following lines…

import requests

# Placeholder values; replace them with your own bucket, abstract and title
lambda_dict = {
    "s3Bucket": "xxxx-xxx-xx-xx",
    "abstract": "xxxxxxxx",
    "title": "xxxxxxx",
}

response = requests.post(
    "https://<api-gateway-id>.execute-api.<region>.amazonaws.com/default/<your-lambda-function-name>",
    json=lambda_dict,
)
print(response.status_code, response.json())

This is it! You have successfully created a custom, tailor-made Lambda (λ) function and an API Gateway to invoke it!

Feel free to communicate with me for more info or questions.


I am a proficient DevOps Engineer with a PhD in High Energy Physics, interested in creating CI/CD pipelines, infrastructure as code and orchestration.