Automating Rust and NodeJS deployment on AWS Lambda using Lambda Layers

Part of our stack at Clevy is written in Rust, and we use Neon to ease the bindings with other parts of the stack written in NodeJS.

Recently, we needed to deploy this stack on AWS Lambda, which runs a very specific NodeJS runtime that is not binary-compatible with the artifacts we were already building.

Since we struggled a little bit with getting Lambda/Rust/Node to play nicely together, I figured I would post a short how-to of what we found worked well for us. You can of course take this as a base and change it to your liking!

1. The Setup

The first thing you need to know is that AWS Lambda runs on either Amazon Linux 1 or 2, depending on the version of NodeJS that you plan to use. So your build pipeline needs to reflect that. Luckily, Amazon provides Docker images for both: amazonlinux:1 or amazonlinux:2. In our case, we want to use node v10.x, so:

FROM amazonlinux:2

Then of course, you need Rust and NodeJS to be installed onto the amazonlinux image.

# Install rust
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y --default-toolchain stable && \
    PATH="/root/.cargo/bin:$PATH" rustup install stable
ENV PATH $PATH:/root/.cargo/bin

# Install node
RUN curl -sL https://rpm.nodesource.com/setup_10.x | bash - && \
    yum install -y nodejs && yum clean all

Then, Neon requires a few system dependencies, which you can adjust based on what you actually need. In our case, we had to add quite a few packages on top of what the docs state, especially all the *-devel packages, which were definitely not straightforward to figure out.

# Install dependencies
RUN yum install -y make gcc gcc-c++ libgcc openssl-devel readline-devel sqlite-devel && yum clean all

Finally, install neon-cli and you are all set.

RUN npm i -g neon-cli

Save this base image somewhere and use it for your AWS Lambda-compatible builds later!

docker build -t lambdabuildbase .

2. The Build

There are several ways to use this image, so let me share my script, which you can customize to your liking. The goal with Neon is to create a NodeJS addon that you can then require elsewhere like any other node module, but precompiled for the environment it runs on.

Let’s put our sources into a src/ folder, and inside, consider the following package.json:

{
  "name": "@clevy/lambda-build-demo",
  "version": "1.0.0",
  "description": "AWS Lambda demo",
  "main": "lib/index.js",
  "scripts": {
    "build": "neon build --release && mv native/index.node lib/addon.node"
  }
}

The reason why we mv native/index.node lib/addon.node is that we don't need the whole native directory after the build. It is quite huge (over 700 MB in our case) compared to what we really need: the compiled addon itself, which is only a few MB. Of course, you can leave it as is if you are happy with your final build size. Just note that an AWS Lambda function, including all of its layers together, can never exceed 250 MB unzipped.

The main lib/index.js contains:

module.exports = require("./addon.node");

And of course, native/ contains all our Rust code.
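For reference, here is a sketch of the source layout the build above assumes. The file names besides package.json and lib/index.js are the defaults that `neon new` generates for the native/ crate; adjust to your own project:

```shell
# Sketch: recreate the source layout assumed by the build above.
# native/ is the Rust crate skeleton that `neon new` generates.
mkdir -p src/lib src/native/src
touch src/package.json                 # contains the "build" script shown above
touch src/lib/index.js                 # re-exports the compiled addon
touch src/native/Cargo.toml src/native/build.rs src/native/src/lib.rs
find src -type f | sort
```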

Let’s create a second Dockerfile that looks like the following:

FROM lambdabuildbase
WORKDIR /dist
COPY src .
RUN npm install && npm run build
# remove the now-useless native/ directory
RUN rm -rf native

To extract the built files from the image, one easy way is the following bash script:

#!/bin/bash
image=lambdabuildpkg

docker build -t $image .
id=$(docker create $image)
docker cp $id:dist - | tar x
docker rm -v $id

This will build the image (which will in turn build the node module with the FFI bindings), copy the resulting built node module from inside the docker image into the dist/ folder on your host machine, then cleanup.

3. The Deployment

Lambda requires Node layers to be prepared in a very specific way. First, everything needs to live inside a directory called exactly nodejs. Then, if you are preparing a layer that contains a node module, it needs to follow the usual node_modules/namespace/package_name tree, so in our case nodejs/node_modules/@clevy/lambda-build-demo.

path=node_modules/@clevy/lambda-build-demo

# remove any existing data
rm -rf nodejs
mkdir -p nodejs/$path
mv dist/* nodejs/$path/
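Before zipping, it can be worth sanity-checking the tree: Lambda mounts each layer at /opt, and its Node runtime resolves modules from /opt/nodejs/node_modules, which is why the package has to sit at exactly this path. A sketch, using empty stand-in files instead of the real dist/ output:

```shell
# Sketch: verify the layer tree Lambda expects. The files here are empty
# stand-ins for the real dist/ output; paths follow the demo package name.
path=node_modules/@clevy/lambda-build-demo
mkdir -p nodejs/$path/lib
touch nodejs/$path/package.json nodejs/$path/lib/index.js
# Lambda will expose these under /opt/nodejs/node_modules at runtime:
find nodejs -type f | sort
```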

Then, Lambda requires layers to be zipped before we upload them. Fine:

# remove any previous archive, then zip
rm -f nodejs.zip
zip -r nodejs.zip nodejs -q
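Since the 250 MB unzipped limit mentioned earlier covers the function plus all of its layers, it can save a round trip to fail the build early when the layer directory gets too big. A minimal sketch (check_layer_size is just an illustrative helper name, not part of any tool):

```shell
# Sketch: fail early if the unpacked layer would blow Lambda's 250 MB
# (function + all layers, unzipped) limit. Helper name is illustrative.
check_layer_size() {
  local limit_kb=$((250 * 1024))
  local size_kb
  size_kb=$(du -sk "$1" | cut -f1)
  if [ "$size_kb" -gt "$limit_kb" ]; then
    echo "layer $1 too big: ${size_kb} KB (limit ${limit_kb} KB)"
    return 1
  fi
  echo "layer $1 size ok: ${size_kb} KB"
}

# demo on a tiny stand-in directory
mkdir -p nodejs
echo '{}' > nodejs/package.json
check_layer_size nodejs
```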

Then, proceed to deploy onto AWS Lambda using the AWS CLI (add your credentials in whatever way you prefer). If the package is too big to upload directly, send it to S3 first and publish the layer version from the bucket:

aws s3 cp nodejs.zip s3://my-bucket/nodejs.zip
aws lambda publish-layer-version \
    --layer-name "my-lambda-layer" \
    --content "S3Bucket=my-bucket,S3Key=nodejs.zip" \
    --compatible-runtimes "nodejs10.x"

Voilà, you have your Rust-powered NodeJS-compatible AWS Lambda Layer ready to use in your Lambda functions!

4. The Usage

This is the easiest step. Create a Lambda function in whichever way you want, select nodejs10.x as the runtime, and attach the newly published layer by clicking "Layers" just below the function name in the console.
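The same attachment can also be scripted with the AWS CLI. A sketch, where the function name and the layer version ARN are placeholders for your own values:

```shell
# Sketch: attach the published layer to an existing function from the CLI.
# Function name, region, account id and layer version are placeholders.
aws lambda update-function-configuration \
    --function-name "my-function" \
    --layers "arn:aws:lambda:eu-west-1:123456789012:layer:my-lambda-layer:1"
```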

Then go back to the Lambda function's code and simply import your module as usual with require("@clevy/lambda-build-demo").

Notice that there are no node_modules in this Lambda function? That’s because the node_modules are inherited from the underlying layer.
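As a sketch, a handler for such a function might look like the following (written out via a heredoc here; the require path comes from the demo package above, and what you actually call on the addon depends on what your Neon code exports):

```shell
# Sketch: a minimal handler file that pulls the module from the layer.
# The require path matches the demo package name; the addon's API is yours.
cat > index.js <<'EOF'
const addon = require("@clevy/lambda-build-demo");

exports.handler = async (event) => {
  // delegate to the Rust-powered addon like any other node module
  return { statusCode: 200, body: JSON.stringify(Object.keys(addon)) };
};
EOF
cat index.js
```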

Of course, you can stack layers (up to 5 per function) and you can also bundle your own node modules directly, but layers are a very simple way to use unusual runtimes on AWS Lambda, as well as to share common code.

Hope this tutorial helps you as much as it would have helped me to find it in the first place!

Clevy.io

Clevy is an AI company based in Station F, Paris, France.

Francois Falala-Sechet

Written by

CTO @ clevy.io
