A step-by-step guide to using Serverless to host a Python function on AWS and call it with an HTTP endpoint

Matt Solomon
Aug 1

There are various reasons you may want to use cloud providers such as Amazon Web Services (AWS), Google Cloud, or Microsoft Azure to host your code. This piece will take you step-by-step through the process of using Serverless to host a Python function on AWS and call it with an HTTP endpoint.

If this doesn’t exactly match what you are looking for, don’t worry! If you want a different cloud provider, Serverless supports a number of them. Other languages, as well as triggers aside from HTTP requests, are also supported, and the options available for these depend on your specific provider. After completing this tutorial, you should feel comfortable modifying the steps to support these additional options.

Some prerequisites for this article:

  • Node.js and npm installed (used to install the Serverless framework)
  • Python 3 and pip installed
  • An AWS account

Step 1: Setting Up Serverless

If you aren’t familiar with Serverless, it’s a framework that helps you easily develop, deploy, and test serverless applications. I’ve always found AWS to be a complex (but powerful) service, and Serverless makes using AWS much easier by abstracting away some of the complexities.

First, install Serverless on your machine with npm install -g serverless.

Now we can create our serverless project using the command serverless create --template aws-python3 --path aws-serverless-test. Here we’ve used the create command to create a new service in the current working directory. Notice that with the --template flag we’ve specified AWS as the provider and Python 3 as the runtime, and with the --path flag we’ve named our project aws-serverless-test.

If you want to use a different provider or different language, make sure to use a different template. You can find a list of popular templates here, or run serverless create --help to see a list of all templates. (Note that anytime we use the serverless command, you can type sls instead for brevity.)

Our new project will have three files (if using a different provider or language, these files may vary a bit):

  • .gitignore — This is generated based on the template we selected, and we can update this as needed
  • handler.py — This file is where we specify the functions that should be run for each endpoint
  • serverless.yml — Here we can configure settings for Serverless deployment to AWS

Step 2: Creating Our Function

With our template set up, let’s create our function and prepare it for deployment.

First, we’ll create a virtual environment. Make sure to switch your terminal to the project’s folder, then run python3 -m venv venv to create a virtual environment and activate it with source ./venv/bin/activate. Make sure to add venv to your .gitignore so you don’t accidentally commit it. At this point, feel free to use pip to install any linters or formatters you prefer.

For this tutorial, we’ll now create a file in the root directory called functions.py, which will contain our function to deploy. If you don’t have your own function to test this with, here’s a simple Python one to use:
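
Something like the following is enough; the function name matches the endpoint we’ll create later, and the greeting text itself is just a placeholder:

# functions.py
def get_custom_greeting(first_name):
    # Build a short greeting string from the supplied name
    return f"Hi {first_name}, your serverless function is working!"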

With our function complete, let’s configure the API to return this message. In handler.py, you’ll see a function called hello that takes two inputs: event is a dict containing all incoming request data, such as headers and query string parameters, while context “provides methods and properties that provide information about the invocation, function, and execution environment.”

We won’t need everything below the return statement, so let’s remove that portion. Update the message in the body to return our custom message defined above (or whatever else you want to return). Our handler.py function should now look like this:
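
Here is one way that might look. Treat it as a sketch: the fallback name and the exact shape of the response body are placeholder choices.

# handler.py
import json

from functions import get_custom_greeting


def hello(event, context):
    # Read the name from the URL query string, e.g. ?first_name=Matt
    params = event.get("queryStringParameters") or {}
    first_name = params.get("first_name", "friend")

    body = {
        "message": get_custom_greeting(first_name),
        "input": event,  # echo the incoming event; handy while debugging
    }

    return {
        "statusCode": 200,
        "body": json.dumps(body),
    }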

Notice that we also added support for URL query string parameters, so when we make a GET request we can provide a name to this function by appending ?first_name=Matt to our query. The sample code below goes into more detail about how to parse POST and GET parameters:
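
For illustration, a small helper like this (the name parse_parameters is hypothetical) shows where each kind of parameter lives in the event dict:

import json


def parse_parameters(event):
    # GET: query string parameters arrive as a dict, or None when absent
    query_params = event.get("queryStringParameters") or {}

    # POST: the request body arrives as a raw JSON string, or None when absent
    body_params = json.loads(event["body"]) if event.get("body") else {}

    return query_params, body_params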


Step 3: Updating Serverless Settings

Now let’s get ready for deployment. Open up serverless.yml and update it so it looks like the code snippet below. Feel free to leave in all the default comments for future reference. In this snippet, I’ve removed those comments and replaced them with others for clarity on what you are changing and why.
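
Something along these lines should work. Treat it as a sketch rather than a drop-in file: the runtime version, region, and stage below are assumptions you should match to your own setup.

service: aws-serverless-test

provider:
  name: aws
  runtime: python3.7   # match the Python version you developed against
  region: us-east-1    # the AWS region to deploy to
  stage: dev           # the default stage; it appears in the endpoint URL

functions:
  get-custom-greeting:
    handler: handler.hello   # <file name>.<function name>
    events:
      - http:
          path: get-custom-greeting
          method: get
          # enable CORS so this endpoint can be called from other origins
          cors: true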

The last two lines of this file talk about Cross-Origin Resource Sharing (CORS). If you’re unfamiliar with CORS, just know that “a web application executes a cross-origin HTTP request when it requests a resource that has a different origin (domain, protocol, or port) than its own origin.” Since we will not be hosting our function on the same domain as our web app, or wherever this function will be called from, we need to enable CORS.

The Serverless team has a great blog post about handling CORS, where they recommend middy for JavaScript functions and lambda-decorators for Python functions. So we’ll now also follow the lambda-decorators instructions to ensure our function provides CORS support.

First, we install serverless-python-requirements. This plugin extends the capabilities of Serverless to make it easy to include the dependencies of our requirements.txt file upon deployment. This is installed by running sls plugin install -n serverless-python-requirements. You’ll notice this creates a package.json file and updates your serverless.yml file for you. If you’re interested in learning more about Serverless plugins, see here.
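
For reference, after running the install command your serverless.yml should contain a plugins section along these lines:

plugins:
  - serverless-python-requirements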

Afterward, we can install the lambda-decorators package with pip install lambda_decorators. To use it, we’ll update our handler.py file to include from lambda_decorators import cors_headers, and decorate our function with the cors_headers decorator we just imported. The updated file looks like this:
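
Building on the earlier sketch, the decorated handler might look like this (again, the fallback name and message body are placeholder choices):

# handler.py
import json

from functions import get_custom_greeting
from lambda_decorators import cors_headers


@cors_headers
def hello(event, context):
    # Read the name from the URL query string, e.g. ?first_name=Matt
    params = event.get("queryStringParameters") or {}
    first_name = params.get("first_name", "friend")

    body = {
        "message": get_custom_greeting(first_name),
        "input": event,  # echoed for debugging; remove later if you don't need it
    }

    return {
        "statusCode": 200,
        "body": json.dumps(body),
    }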


Step 4: Packaging Updates

Since you likely aren’t using the same operating system as AWS Lambda, your locally installed dependencies won’t necessarily run on its servers. To work around this, we use Docker. If you are unfamiliar with Docker, that’s OK: just follow these installation instructions and make sure Docker is running on your computer.

We’ve already installed the serverless-python-requirements package, which includes a dockerizePip option to facilitate this. To use it, simply add the following to the bottom of serverless.yml:
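
That addition is a small custom block; dockerizePip: true always builds dependencies in Docker, while the value non-linux restricts it to macOS and Windows hosts:

custom:
  pythonRequirements:
    # build and package dependencies inside a Lambda-compatible Docker image
    dockerizePip: true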

For more details on this option, you can see here and here.


Step 5: Deploy Our Function

We are now ready to deploy our function! First, we need to create our requirements.txt file by running pip freeze > requirements.txt.

Next, let’s configure our AWS credentials. You can do this by following the steps here. When you reach the Using AWS Access Keys section, follow the subsection called Using AWS Profiles, and create a profile called aws-serverless-tutorial. If needed, you can directly edit the ~/.aws/credentials file on your machine to configure your profile. A sample profile may look like this:

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
region = us-east-1

[aws-serverless-tutorial]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
region = us-east-1

We can now deploy our function using sls deploy -v --aws-profile aws-serverless-tutorial. In this command, the -v flag is optional and is used to specify the verbose mode, i.e. additional logging to the terminal. Notice that we also specified which AWS profile we want to deploy with. It is good practice to have different profiles for your different projects.

Be patient, as this will take a few minutes. This command must be run every time you change your serverless.yml file. If you only edited the function you are deploying, you can speed up deployment by running sls deploy function -f get-custom-greeting --aws-profile aws-serverless-tutorial. (Of course, make sure to replace get-custom-greeting with your function name.)


Step 6: Test our Function

Once the deployment is complete, look for the Stack Outputs section of the console and notice the ServiceEndpoint value. This should be something like https://kr3c39wj4o.execute-api.us-east-1.amazonaws.com/dev. This is the root URL of our function. Your exact URL will be different.

If you look a little higher up, in the Service Information section, you’ll see the specific endpoint we just created, similar to the snippet below:

endpoints:
GET - https://kr3c39wj4o.execute-api.us-east-1.amazonaws.com/dev/get-custom-greeting

Remember that we require a name parameter with this function, so to see it working in practice, append ?first_name=yourName to the URL above (e.g. https://kr3c39wj4o.execute-api.us-east-1.amazonaws.com/dev/get-custom-greeting?first_name=Matt) and open that URL in your browser. Your function response should now display!
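
Assuming the handler sketch above, the response body looks roughly like this (the input field contains the full API Gateway event, so it is much longer in practice):

{
  "message": "Hi Matt, your serverless function is working!",
  "input": { ... }
}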

But it may not…


Step 7: Troubleshooting

If, to your disappointment, you don’t see your result and instead see {"message": "Internal server error"}, don’t worry; we just have to do some quick troubleshooting.

This error message is pretty unhelpful. To get a more useful one, follow the steps below:

  1. Login to your AWS account
  2. Navigate to Services > CloudWatch
  3. In the navigation sidebar on the left, click Logs
  4. Select the appropriate log group, e.g. /aws/lambda/aws-serverless-test-dev-get-custom-greeting
  5. Select a specific log stream from that group to view the error message

As an alternative, you can directly view the logs with Serverless as described here and here.

Most likely you’ll see this error:

DistributionNotFound: The 'jsonschema' distribution was not found and is required by the application

This is due to an incompatibility between the AWS Lambda environment and the version of the jsonschema package listed in requirements.txt. To fix this, we’ll pin the jsonschema package to 2.6.0 using the steps below:

  1. Open requirements.txt and update the appropriate line to read jsonschema==2.6.0
  2. Re-install our dependencies based on this file using pip install -r requirements.txt
  3. Ensure nothing else has changed by running pip freeze > requirements.txt. It is good practice to run this before every deployment.
  4. Re-deploy our service using the command above. Since we may have multiple functions in this project, we’ll deploy the full service for simplicity. However, note that dependencies are also updated when you re-deploy just a single function.

Refresh the browser page, and this time you should see your message!

In handler.py, we left the input — our event parameter — as an output of the response. Consequently, the response you see might look a bit messy. If you don’t need this output, feel free to remove it and then re-deploy your function.

At this point, you are done, and you can check out the Serverless docs if you are ready to dive deeper!
