Effortless Guide to Setting Up AWS Lambda with Rust

Jed Lechner
7 min read · Jan 21, 2024


Introduction

This guide walks you through setting up a Rust project using Cargo workspaces. We’ll cover creating a couple of HTTP APIs and a shared library, and deploying these components on AWS Lambda. I must admit, my motivation for creating this guide stems from my preference for developing on a Mac, coupled with a reluctance to set up dedicated developer desktops for code deployment. This guide should also work on Linux distributions; however, it may not translate directly for Windows users.

Prerequisites

Before diving in, ensure you have the following installed:

  • The Rust toolchain (rustup and cargo)
  • Cargo Lambda
  • Docker, which we’ll use later for a reproducible release build
  • AWS credentials configured on your machine (only needed for the optional deploy step)

About Cargo Workspaces

Cargo workspaces enable the management of multiple related Rust packages within a single project. This structure is particularly beneficial for large AWS Lambda projects, where functions often have distinct responsibilities and different dependency sets. In this setup, each Lambda function is treated as a separate package within the workspace, while shared code is organized in a common package also located within the same workspace.

Some key benefits include:

  • Modularization: There’s a clear separation of concerns, allowing each Lambda function to evolve independently.
  • Independent Build: Functions can be built and deployed independently, which can lead to reduced build times.
  • Dependency Management: Dependencies are managed on a per-function basis, effectively reducing code bloat.

Setting Up Your Workspace

  1. Create a new workspace
mkdir setup_aws_lambda_workspace_with_rust_guide
cd setup_aws_lambda_workspace_with_rust_guide

2. Create a Cargo.toml as the root of the workspace

[workspace]
resolver = "2"
members = [
    "common",
    "http_lambda_1",
    "http_lambda_2",
]

3. Add a shared library and a couple of packages where we will add the Lambda code.

cargo new --lib common
cargo new http_lambda_1
cargo new http_lambda_2
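
At this point the workspace should look roughly like this (a sketch; Cargo will also generate a Cargo.lock file and a target/ directory once you build):

```text
setup_aws_lambda_workspace_with_rust_guide/
├── Cargo.toml          # workspace root
├── common/
│   ├── Cargo.toml
│   └── src/lib.rs
├── http_lambda_1/
│   ├── Cargo.toml
│   └── src/main.rs
└── http_lambda_2/
    ├── Cargo.toml
    └── src/main.rs
```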

4. In the [dependencies] section of the Cargo.toml file for each of the Lambdas, add the shared library dependency.

[dependencies]
common = { path = "../common" }

5. Let’s add all of the dependencies needed to run Rust on AWS Lambda. You can use the following script to help with that.

Rust compiles to native code, so you don’t need a dedicated language runtime to run Rust on Lambda. We’ll use the Rust runtime client to build the project locally, then deploy it to Lambda on the provided.al2023 or provided.al2 runtime. With provided.al2023 or provided.al2, Lambda automatically keeps the operating system up to date with the latest patches. You don’t need to worry about this, though, as we will abstract these complexities away.

#!/bin/bash

set -xe

# List of packages to add dependencies to
PACKAGES=(
    "http_lambda_1"
    "http_lambda_2"
)

# Define the dependencies
DEPENDENCIES=(
    "lambda_http"
    "lambda_runtime"
    "serde"
    "serde_json"
    "tokio --features full"
    "tracing"
    "tracing-subscriber --features json"
)

# Function to add dependencies to a package
add_dependencies_to_package() {
    local package=$1
    echo "Adding dependencies to package: $package"

    for dep in "${DEPENDENCIES[@]}"; do
        echo "Adding dependency: $dep to $package"
        # $dep is intentionally unquoted so that entries like
        # "tokio --features full" expand into separate arguments.
        cargo add $dep --package "$package"
    done
}

# Loop through each specified package and add dependencies
for package in "${PACKAGES[@]}"; do
    add_dependencies_to_package "$package"
done

# Add dependencies to the common library
cargo add tracing --package common
cargo add tracing-subscriber --features json --package common

Setting Up the Common Library

We’ll use the common library to share code across all the Lambda functions in our project. For now we’ll only set up tracing, which offers structured logging with context. For specific details, refer to the AWS Lambda Rust Logging Documentation.

In common/src/lib.rs, add the following tracing function. We will call it from each Lambda.

pub fn init_tracing() {
    tracing_subscriber::fmt().json()
        .with_max_level(tracing::Level::INFO)
        // This needs to be set to remove duplicated information in the log.
        .with_current_span(false)
        // This needs to be set to false, otherwise ANSI color codes will
        // show up in a confusing manner in CloudWatch logs.
        .with_ansi(false)
        // Disabling time is handy because CloudWatch will add the ingestion time.
        .without_time()
        // Remove the name of the function from every log entry.
        .with_target(false)
        .init();
}

Setting up the HTTP Lambdas

I don’t want to put too much emphasis on the code in this section, because this guide is more focused on creating a reproducible setup for Rust projects on AWS Lambda. However, let’s add some code to the HTTP Lambdas so we can invoke them later.

In both http_lambda_1/src/main.rs and http_lambda_2/src/main.rs, add the following code:

use lambda_http::{
    http::{Response, StatusCode},
    run, service_fn, Error, IntoResponse, Request, RequestPayloadExt,
};
use serde::{Deserialize, Serialize};
use serde_json::json;
use std::sync::Arc;
use tracing::info;

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Initialize tracing for structured logging.
    common::init_tracing();

    // AppState is wrapped in an Arc for thread-safe reference counting.
    let state = Arc::new(AppState {});

    // `run` starts the Lambda runtime listener.
    // `service_fn` maps incoming requests to the handle_request function.
    run(service_fn(|event: Request| async {
        handle_request(&state, event).await
    }))
    .await
}

// The handler function for incoming Lambda requests.
pub async fn handle_request(_: &Arc<AppState>, event: Request) -> Result<impl IntoResponse, Error> {
    info!("Received request: {:?}", event);

    // Parse the JSON payload from the request body.
    let body = event.payload::<MyPayload>()?;

    // Construct an HTTP response.
    let response = Response::builder()
        .status(StatusCode::OK)
        .header("Content-Type", "application/json")
        .body(
            json!({
                "message": "Hello World",
                "payload": body,
            })
            .to_string(),
        )
        .map_err(Box::new)?;

    // Return the response.
    Ok(response)
}

// AppState can hold shared state or dependencies (like database connections).
pub struct AppState {}

// Define a structure for the expected payload in the HTTP request.
#[derive(Deserialize, Serialize, Debug, Clone)]
pub struct MyPayload {
    pub prop1: String,
    pub prop2: String,
}

Testing the Lambdas

One of the major benefits of using Cargo Lambda is the cargo lambda watch command. The watch subcommand emulates the AWS Lambda control plane API, so we can run a local instance of each of our Lambdas and test them without deploying.

Since both Lambdas do the same thing, let’s just test http_lambda_1.

  1. Change the directory and run the watch command
cd http_lambda_1
cargo lambda watch

2. From a different terminal run the following curl command

curl --location 'http://localhost:9000/lambda-url/http_lambda_1/' \
    --header 'Content-Type: application/json' \
    --data '{
        "prop1": "You are awesome!",
        "prop2": "LGTM"
    }'
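
Assuming the handler from earlier is unchanged, the response should echo the payload back, along the lines of:

```json
{"message":"Hello World","payload":{"prop1":"You are awesome!","prop2":"LGTM"}}
```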

Creating a Release Build for AWS Lambda

Most engineers who have started using Rust know that they can create a release build of their package by running cargo build --release. However, the process isn't quite as straightforward when working with AWS Lambda.

Remember, we previously discussed using the AWS Lambda Rust Runtime and installing Cargo Lambda. Cargo Lambda builds your functions for x86_64 by default, the default architecture for AWS Lambda. If you're developing on a Mac or encountering issues on other operating systems, you might face difficulties building your function with cargo lambda build --release. But don't fret: this is precisely why we introduced Docker earlier.

  1. Start by creating the following Dockerfile. We will use the ARM (arm64) architecture.
# Use the cargo-lambda image
FROM ghcr.io/cargo-lambda/cargo-lambda:latest

# Create a directory for your application
WORKDIR /usr/src/app

# Copy your source code into the container
COPY . .

# Build the project using cargo lambda
RUN cargo lambda build --release --arm64

2. Create the following script, release_build.sh, in the root of the project. It builds the artifacts with cargo lambda inside a Docker container and copies them back to the host machine.

#!/bin/bash

# '-e' option causes the script to exit immediately if any command exits with a non-zero status.
# '-x' option causes the script to print each command to standard output (with a few exceptions) before executing it, useful for debugging.
set -xe

# This command removes any previously built artifacts in your Rust project. This ensures that your Docker build starts with a clean state.
cargo clean

# This command builds a Docker image from the Dockerfile in the current directory ('.').
# The '-t' flag tags the resulting image with the name 'rust-lambda'.
docker build -t rust-lambda .

# This command creates a new Docker container named 'temp-container' from the 'rust-lambda' image.
# However, it does not start the container. It's used to prepare the container for copying files out of it.
docker create --name temp-container rust-lambda

# This command copies the '/usr/src/app/target/' directory from the 'temp-container' container into the current directory on the host machine.
# This is typically used to retrieve build artifacts from a build container.
docker cp temp-container:/usr/src/app/target/ ./

# This command removes the 'temp-container' container.
# It's a cleanup step to ensure that the temporary container does not persist after the script is done.
docker rm temp-container

3. Make the script executable and run it.

chmod +x release_build.sh
./release_build.sh

At this point we have successfully built our Lambdas and are ready to deploy to AWS.

Deploying the Lambdas

This part is entirely optional, but if you want to deploy the functions you can use cargo lambda deploy.

  1. Ensure you authenticate with your AWS Account
  2. Run the following commands:
./release_build.sh
cargo lambda deploy http_lambda_1
cargo lambda deploy http_lambda_2

3. Test out the function

cargo lambda invoke --remote \
    --data-ascii '{"prop1": "You are awesome", "prop2": "LGTM"}' \
    --output-format json \
    http_lambda_1

Summary

This guide serves as a resource for Rust developers looking to set up and deploy AWS Lambda functions. It covers everything from initial setup and local testing to building for release and optional deployment. Feel free to follow me for more articles on similar topics.

🔗 Reference Code on Github

🌐 Connect with me on LinkedIn


Jed Lechner

🚀 Software Engineer at AWS | Hiker | Amateur Photographer | Tech Content Creator