How to develop your Lambda Functions like a rockstar

Joel Barna
7 min read · Feb 18, 2020


Here at Stavvy, we are using a Serverless approach to building our SaaS product and utilizing the Serverless Framework to develop our cloud functions. After successfully deploying hundreds of Lambda functions that respond to API Gateway, Queues, S3 buckets, and much more, we’ve developed a rock-solid strategy for development and testing that we want to share with you.

This post is for me if:

  • I want to see best practices for developing Serverless Architecture.
  • I am researching Serverless Applications and the best tools available.
  • I want to make my development time more efficient when building against AWS Lambda.
  • I am looking for real-world development experience and not “hello world” examples.
  • I want to become an AWS Lambda ninja.

First Up: Pick your toolset

Flying solo with AWS Lambda is a bit tricky: you can only execute your Lambda code in the cloud, and you have to configure permissions and environment variables manually through the console. Fortunately, there are Serverless Application Frameworks that ease development and let us test our code on our local machine.

Serverless Framework: The Serverless Framework is a FaaS-agnostic, YAML-based framework that orchestrates cloud function deployment and testing. It supports multiple providers such as AWS, Google Cloud, and Azure, although, similar to Terraform, most of what you write with it will be provider-specific. It also has a Plugin Ecosystem, where you can find plugins that ease common pain points of development and deployment.

Serverless Application Model: SAM is also a YAML-based framework that orchestrates cloud function deployment and testing, but it is maintained by AWS and only supports AWS infrastructure. It is the backbone of services such as Stackery, which abstract SAM into a visual interface and ease development by providing a simpler CLI.

Spoiler Alert: We chose the Serverless Framework out of the gate due to some quick judgements and it has performed very well for our needs. That is not to say SAM or Stackery would not perform well, but our experience with Serverless has been solid enough that we have not considered switching.

Next: Write your function and Test Locally

In order to develop at a reasonable speed, we will need the ability to (1) run our Lambda functions on our local machine and (2) run against test data that is representative of what our function will receive in our cloud environment.

The Serverless Framework makes the first part easy by allowing you to execute a function with a one line command:

serverless invoke local --function get_user

This will run the get_user function as defined in your serverless.yml file, setting both the common environment variables that AWS sets in the cloud and any variables you defined in the environment section of serverless.yml.

If you are using Python, Node.js, Ruby, or Java, the function will also execute quickly, as Serverless avoids using Docker for those runtimes.
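
To make that concrete, here is a minimal, hypothetical handler that reads one of those variables; STAGE is an assumed name that you would define under the environment section of serverless.yml:

import os

def get_user(event, context):
    # Variables from the environment block in serverless.yml show up in
    # os.environ during `serverless invoke local`, just as they do in the cloud.
    stage = os.environ.get('STAGE', 'dev')
    return {'statusCode': 200, 'body': f'get_user running in stage {stage}'}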

Introduce event JSON into the mix

In most cases, you have information such as path parameters, body payloads, stage variables, authorizer variables, and other application specific data coming through the Lambda event object that your function relies on.

It’s important to provide those application specific values, but it’s equally important that the structure of your test data represents what your function will receive when deployed. Otherwise, you will run into errors because your function was developed expecting one format but receives another.

The best way to get properly structured sample JSON for your functions is by visiting our aforementioned friend SAM. The SAM CLI has a generate-event utility (source-code cheat sheet) that creates event JSON for all of the invoking sources for Lambda, whether that is API Gateway, SQS, DynamoDB, etc.

> sam local generate-event apigateway aws-proxy
{
  "body": "eyJ0ZXN0IjoiYm9keSJ9",
  "resource": "/{proxy+}",
  "path": "/path/to/resource",
  ...
}

Once you have the default structure, the next thing is to populate your specific application values. Make sure that you are paying attention to the data types in your JSON.

Path parameters are string values in the event JSON, not numeric. So if you have a URL such as users/1234, your JSON should have { "user_id": "1234" }. Additionally, if you are using an authorizer in front of your functions and creating context, AWS will stringify whatever you put there, even numeric or boolean values.
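
As a small, hypothetical sketch of handling that inside the function (field names are illustrative), cast the strings yourself before using them:

def get_user(event, context):
    # pathParameters and authorizer context values arrive as strings,
    # so convert them explicitly before doing numeric or boolean work.
    user_id = int(event['pathParameters']['user_id'])
    is_admin = event['requestContext']['authorizer'].get('is_admin') == 'true'
    return {'statusCode': 200, 'body': f'user {user_id}, admin: {is_admin}'}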

Locally Invoking with your Event Data

Okay so we have the JSON that our get_user function needs, let’s slap it into a get_user.json file and run it, right? You can, but there’s a better way.

While Serverless does support loading JSON from file by specifying --path get_user.json, we recommend putting your JSON into a script that spits out the JSON to stdout and piping that into the Serverless command:

python get_user_test_data.py | sls invoke local -f get_user

The reason we have found this most effective is that AWS Lambda events frequently have stringified JSON in a subfield.

For example, if you have a POST method in API Gateway that accepts JSON, your function will receive a stringified JSON object in the body field of the event. It’s a pain to write and edit stringified JSON continuously, not to mention the difficulty of debugging a JSON file when you forget to escape one \” out of the 50 in your POST request.

So as simple as it sounds, we have a bunch of lightweight Python scripts that do something along the following lines:

import json

body = json.dumps({
    'hello': 'world',
    'some_other': 'data',
    'is_awesome': True
})
event = json.dumps({
    'body': body,  # insert stringified JSON body here
    'requestContext': {
        'authorizer': {
            'user_id': '1234',
        }
    },
    'pathParameters': {
        'some_path_param': '1234'
    }
})
print(event)

And that is much more manageable than trying to build and maintain a .json file with stringified JSON inside of it. We’ve experienced this challenge with API Gateway and SQS/SNS events, and there are most likely more situations where it occurs.

Debugging the local execution of your function

If your Lambda isn’t running as expected and you want a closer look, you can always spin up a debugger on your local functions as well. We use Python for all of our functions and have adapted this SAM article using ptvsd to debug our functions in VS Code. If you use Node.js, check out this post on how to link Node’s default debugger with VS Code.
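
For reference, the core of that setup is only a few lines added to the handler module while debugging; this is a sketch that assumes ptvsd is installed and that port 5890 matches the port in your VS Code launch configuration:

import ptvsd

# Open a debug server and block until VS Code attaches, then continue as normal.
ptvsd.enable_attach(address=('0.0.0.0', 5890))
print('Waiting for the VS Code debugger to attach on port 5890...')
ptvsd.wait_for_attach()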

Deploy your function & Double Check

Great! The function executes as expected locally and it’s time to deploy it into the cloud. A quick serverless deploy will do the trick, but there are a few more things we should do to ensure the health of our function in the cloud.

Run the same JSON against our deployed function

First, run the same command to test your function, except this time drop the local keyword: sls invoke -f get_user. If you run into issues that you didn’t see locally, ask the following questions:

  1. What IAM Permissions does your function have?

When you run Serverless locally, the function runs with the credentials in your ~/.aws/credentials file. In most cases, those have much more access than your function will have when assigned to its IAM Role. Double check to make sure that your function was given proper permissions to read from that S3 bucket, decrypt that secret, and access the database.
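
A quick way to confirm exactly which identity your code is running under in each environment (your local credentials versus the function’s IAM Role) is a one-off check with boto3; this is just an illustrative sketch:

import boto3

# Prints the ARN of the caller: your local profile during `invoke local`,
# the function's IAM Role once it is deployed.
print(boto3.client('sts').get_caller_identity()['Arn'])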

  2. Does the function have all of its dependency code?

Oftentimes, libraries that our function relies on are installed on our local machine and readily available when we test locally. We need to ensure that this dependency code is uploaded to Lambda when we deploy our function. You can use Lambda Layers for dependency management, but if you don’t, then a quick trick is to unzip the deployment package in .serverless and see if the required libraries are there. If not, you need to add them to the Serverless package block, or, as in our case, use the serverless-python-requirements plugin to package the libraries we depend on.
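
If you would rather script that check than unzip by hand, a small sketch like the following works; the zip name depends on your service, so the path here is illustrative:

import zipfile

# List the packaged files and confirm a dependency (here, `requests`) made it in.
with zipfile.ZipFile('.serverless/my-service.zip') as pkg:
    matches = [name for name in pkg.namelist() if name.startswith('requests/')]
    print('\n'.join(matches) or 'requests not found in the package!')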

Run it in the wild

Assuming everything has worked so far, the last thing to do is invoke the Lambda function from its normal AWS resource. So if this Lambda function is serving API Gateway, hit the subscribed endpoint with cURL to ensure the function is operating as expected.
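
cURL is perfect for this; if you prefer to keep the smoke test in Python next to your other scripts, an equivalent sketch looks like the following, where the URL is a placeholder for your deployed API Gateway endpoint:

import requests

# Hit the deployed endpoint end to end: API Gateway -> Lambda -> response.
resp = requests.get(
    'https://<api-id>.execute-api.us-east-1.amazonaws.com/dev/users/1234',
    timeout=10,
)
print(resp.status_code, resp.text)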

Odd reasons your function may not be working

In our experience, we’ve found a few situations that can be difficult to debug because AWS CloudWatch has almost no application logs, yet the Lambda fails.

The first is network timeouts that masquerade as function timeouts. For example, if your Lambda function attempts to connect to your database and the connection hangs, exceeding the timeout period of the Lambda function (6 second default), you will be scratching your head wondering why your function isn’t running fast enough. Our best practice has been to add 3 second timeouts to all network requests so that our errors in this situation are more direct.
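
Concretely, that means passing an explicit timeout to every network client you create; here is a sketch assuming requests and boto3:

import boto3
import requests
from botocore.config import Config

# Fail fast on a hung connection instead of silently burning the Lambda timeout.
resp = requests.get('https://example.com/health', timeout=3)

dynamodb = boto3.client(
    'dynamodb',
    config=Config(connect_timeout=3, read_timeout=3, retries={'max_attempts': 1}),
)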

The second is when the specified handler does not exist in the code that you uploaded. Most of the time there isn’t a specific error message; the function just fails silently. Double check that the handler specified for the function matches the code uploaded.

That’s it! 👏

I hope you have smooth sailing with your Serverless experience, and if you believe in Serverless architecture and want to get your hands dirty, we’re always hiring @ Stavvy.
