AWS Lambda layer in Java

Asaf Adar
Jan 22 · 6 min read
AWS lambda layers

What are lambda layers?

AWS introduced Lambda Layers back in 2018 as a way to share code and data between functions within and across different accounts.

Lambda layers allow us to share common code and dependencies between multiple lambda functions.

Using lambda layers to manage and govern lambda functions will result in a flexible, scalable and solid base for your lambda-based architecture.

Why use lambda layers?

The main benefits of using lambda layers are:

  • Small resulting JAR — Each deployed function stays small and contains only the code related to the action it is intended to perform.
  • Single packaging for shared dependencies — No need to package shared dependencies or code with each function.
  • Flexible code updates — Updating code or a dependency happens in only one place instead of in each function.

In short, lambda layers are here to help us develop larger-scale, more complex lambda functions using flexible, scalable and simple management tools.

When to use lambda layers?

The use of lambda layers can sometimes create more problems than solutions, especially when coding in a statically compiled language like Java, which requires all the code and dependencies during the compilation process.

From my point of view, lambda layers should be used in the following cases:

  • Large dependencies that are frequently updated.
  • Custom lambda runtimes.
  • Lambda function custom wrapper.

Lambda layers in Java:

As we have seen so far, having your lambda layers written in Java can be a real pain. I would not recommend it at all. Using Python or NodeJS will be much easier and faster.

But in some cases, the cost of migrating existing services written in Java into lambda functions in a different language will be even more painful than working through this tutorial and deploying lambda layers in Java.

Lambda Basic Wrapper:

The first thing we will do is create the function trigger handler. This trigger handler will serve as the first and most basic lambda layer for our functions. We will be using Maven and OpenJDK 8.

So, our initial pom.xml should contain the following dependencies:
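The original snippet is not reproduced here; as a minimal sketch, the dependencies section could look like this, assuming the standard AWS Lambda Java libraries (the version numbers are illustrative, not necessarily the ones from the original setup):

<dependencies>
    <!-- Lambda handler interfaces (RequestHandler, Context) -->
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-core</artifactId>
        <version>1.2.1</version>
    </dependency>
    <!-- Event types such as SQSEvent -->
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-events</artifactId>
        <version>3.11.0</version>
    </dependency>
</dependencies>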

And also the shade plugin:
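Again as a sketch, a typical shade plugin configuration along the lines of the AWS documentation (plugin and transformer versions are assumptions):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.2</version>
    <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
        <transformers>
            <!-- Merges the log4j2 plugin cache files of all shaded dependencies -->
            <transformer implementation="com.github.edwgiz.mavenShadePlugin.log4j2CacheTransformer.PluginsCacheFileTransformer"/>
        </transformers>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
    <dependencies>
        <!-- Required for the log4j2 cache file transformer above -->
        <dependency>
            <groupId>com.github.edwgiz</groupId>
            <artifactId>maven-shade-plugin.log4j2-cachefile-transformer</artifactId>
            <version>2.13.0</version>
        </dependency>
    </dependencies>
</plugin>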

  • Notice the dependencies section of the shade plugin, which registers a transformer for the log4j2 logging framework. In case you want to use the log4j2 logger in your app as well, please also add the following dependencies (see the sketch below):
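A sketch of those logging dependencies, following the usual AWS log4j2 setup (versions are illustrative):

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-log4j2</artifactId>
    <version>1.5.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.17.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.17.1</version>
</dependency>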

Right now we are coding our first functional layer — one with actual functionality and not just dependencies. I would recommend keeping this layer as minimal as possible. Any other dependencies required by your implementation are welcome; just bear in mind that this layer should be generic and light.

On top of that, depending on the version of the AWS SDK you are using, some version conflicts may arise and will have to be dealt with. Please make sure to follow the AWS documentation to complete your pom.xml.

The Lambda handler class:

The next step is to create the handler that will be triggered by AWS Lambda when needed.

This can be done using the RequestHandler<I, O> interface, which will require us to implement the handleRequest method.

In my example, I chose to implement this interface using an abstract class. This allows me to create a solid infrastructure shared between all of my lambda functions and the AWS Lambda interface.

My example describes an abstract SQS message lambda trigger.

Please notice the comments inside the class for further explanations.
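The original class is embedded as a gist and not reproduced here. A minimal sketch of such an abstract SQS wrapper could look like the following — the class name LambdaAbstractWrapper and the processMessage method match the names used later in this article, everything else is an assumption:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

// Basic layer: a generic wrapper between AWS Lambda and our functions for SQS triggers.
// O is the return type each concrete function chooses.
public abstract class LambdaAbstractWrapper<O> implements RequestHandler<SQSEvent, O> {

    @Override
    public O handleRequest(SQSEvent event, Context context) {
        O result = null;
        // An SQS trigger may deliver a batch of records; hand them to the concrete function one by one.
        for (SQSEvent.SQSMessage message : event.getRecords()) {
            result = processMessage(message, context);
        }
        return result;
    }

    // Implemented by each lambda function with its own business logic.
    protected abstract O processMessage(SQSEvent.SQSMessage message, Context context);
}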

Local installation and use:

Now, for us to use our new lambda layer, we will need to install it and import it.

  1. Layer installation:
    This is a very simple task: we will use Maven to create our fat JAR using the shade plugin: mvn clean install
    The result will be a new JAR installed in our local .m2 repository. We will use this JAR as part of our layer deployment (explained later on), but also as the basic layer dependency for our lambda functions.
  2. Layer import:
    Before going into the technical part, let’s understand what exactly we want to do and why it is not that trivial.
    We eventually want to create a JAR that Maven compiles properly using the description in our pom.xml, but we don’t want to package this JAR’s dependencies with it.

    Why?
    Well, doing that would make the use of layers pointless; we don’t want to package every function with all of its dependencies. We want to compile just the code that we need and pull the dependencies at runtime from a different location.
    How?
    For this setup to work, we need all the dependencies available in our local .m2 repository, so we import them into our local environment. We need them at compile time so our code can be built against them. Then, to tell Maven not to package those dependencies alongside our code, we use the provided scope in the dependency definition.

Add the provided scope for each dependency you want to resolve from a lambda layer at runtime. Just make sure that the versions and JAR names match.
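For example, if the basic layer JAR from the previous section is installed locally, the function’s pom.xml might declare it like this (the groupId, artifactId and version here are assumptions):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>lambda-basic-layer</artifactId>
    <version>1.0.0</version>
    <!-- Available at compile time from the local .m2 repository,
         but supplied at runtime by the lambda layer, so it is not packaged. -->
    <scope>provided</scope>
</dependency>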

Now all that’s left for you is to create a class extending the LambdaAbstractWrapper class you created in the basic layer, set the return type of the function and implement the processMessage abstract method.
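As a sketch, and assuming the LambdaAbstractWrapper from above, such a function could look like this (class name and logic are purely illustrative):

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

// The only code packaged in this function's JAR; the wrapper and shared dependencies come from the layer.
public class ExampleMessageHandler extends LambdaAbstractWrapper<String> {

    @Override
    protected String processMessage(SQSEvent.SQSMessage message, Context context) {
        context.getLogger().log("Received message: " + message.getBody());
        // Business logic for this specific function goes here.
        return message.getMessageId();
    }
}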

Layer packaging for runtime:

As we mentioned earlier, our lambda layer is added to the lambda function runtime environment once the function is invoked. So, how can we tell the Lambda runtime which layers we want, and where they are located, so our app will be able to use them?

First, let us create a lambda layer:

  1. ZIP all desired JAR files into one zipped file — LambdaLayer.zip. For the Java runtime, the JARs should sit under a java/lib/ folder inside the ZIP so that Lambda puts them on the classpath (see the sketch after this list).
  2. Upload this ZIP file into the desired S3 bucket.
    aws s3 cp LambdaLayer.zip s3://<BUCKET_NAME>
  3. Create a new lambda layer using aws CLI and save the response.
    aws lambda publish-layer-version --layer-name <LAYER_NAME> \
    --compatible-runtimes java8 \
    --content S3Bucket=<BUCKET_NAME>,S3Key=LambdaLayer.zip \
    >> LambdaLayerResult.json
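A sketch of step 1, assuming the layer JARs were built into target/ (paths and JAR names are placeholders) — the Java runtime picks up layer JARs from java/lib:

mkdir -p java/lib
cp target/<LAYER_JAR>.jar java/lib/
zip -r LambdaLayer.zip java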

By now, you should have a lambda layer compatible with Java lambda functions. You can verify it by logging into your AWS account, navigating to Lambda and clicking on “Layers” in the left side menu. The result of the CLI call was saved into a file called LambdaLayerResult.json. From this file, take the LayerVersionArn; we will need it to attach the layer to the function when we define it.

Create and deploy the Lambda function:

Now, we will create our lambda execution role, deploy our Java lambda function and attach it to the lambda layer we created in the last part.

  1. Create a lambda execution role (an example can be found in the AWS documentation; a sketch is shown after the command below) and save the new role ARN for later use.
  2. Build our lambda function JAR using Maven — mvn clean install
  3. Upload the lambda function using the AWS CLI:

aws lambda create-function --function-name <FUNCTION_NAME> \
--runtime java8 --role <ROLE_ARN> --handler <PATH_TO_HANDLER_CLASS> \
--memory-size <MEMORY_SIZE_IN_MB> --region <REGION> --layers <LAYER_ARN> \
--zip-file fileb://<PATH_TO_JAR> --publish
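For step 1, a rough sketch of creating such an execution role with the AWS CLI — the role name is a placeholder, and AWSLambdaBasicExecutionRole only covers CloudWatch logging, so an SQS-triggered function will also need queue permissions:

aws iam create-role --role-name <ROLE_NAME> \
--assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"lambda.amazonaws.com"},"Action":"sts:AssumeRole"}]}'

aws iam attach-role-policy --role-name <ROLE_NAME> \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole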

Now, we have created the lambda execution role, created the lambda function and attached the lambda layer to it. You can verify it by logging into your AWS account, navigating to Lambda and searching for your new function under Functions.

That’s it!

Lambda functions, in general, are a powerful tool in a serverless architecture. The ability to manage lambda layers in Java will allow you to maintain a simple, strong and flexible serverless environment.

Having trouble accessing the internet from a lambda function inside a VPC? I have a solution for you!

Read my Medium story about the problem and the solution!

Stay tuned for more tutorials about AWS serverless infrastructure.
