Uploading a file to S3 using AWS Lambda (python) & API Gateway

Jaivardhan Lal
4 min read · Aug 12, 2020

Learn how to upload a file to AWS S3 using Lambda & API Gateway

Summary:
The process works as follows:
1) Send a POST request that includes the file name to an API
2) Receive a pre-signed URL for an S3 bucket
3) Send the file as multipart/form-data over a PUT request to the pre-signed URL received in (2)

Let’s dive in

I. Creating an S3 Bucket

  1. Create an S3 Bucket where the files will be stored.
  2. Go to the S3 management console and click on the created bucket. Now go to Permissions and select CORS configuration. The configuration should look like the following:

If it doesn’t look like this, make the necessary changes. These changes allow the GET and PUT requests needed to interact with this bucket from your browser (a boto3 sketch for applying the same rules from code follows the configuration below).

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>GET</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
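
If you prefer to apply this configuration from code rather than the console, here is a minimal boto3 sketch. The bucket name is a placeholder, and it assumes your AWS credentials are already configured:

import boto3

s3_client = boto3.client('s3')

# Apply the same CORS rules shown in the XML above.
# "<bucket name>" is a placeholder; replace it with your bucket's name.
s3_client.put_bucket_cors(
    Bucket='<bucket name>',
    CORSConfiguration={
        'CORSRules': [
            {
                'AllowedOrigins': ['*'],
                'AllowedMethods': ['PUT', 'GET'],
                'AllowedHeaders': ['*'],
                'MaxAgeSeconds': 3000
            }
        ]
    }
)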

II. Creating a Lambda function (& Associated IAM Role)

  1. Create a new Lambda function using Python 3.6.
  2. Under the Permissions header, select: Create a new role with basic Lambda permissions. Give it a name and then go ahead and create the function.
  3. Once created, go to the Permissions tab and click on the role hyperlink to open the IAM dashboard.

  4. Add the AmazonS3FullAccess policy to this role to allow the Lambda function to access S3.

Lambda Function

1. Go to the Configuration tab in the Lambda function. You’ll need to add boto3 to run the code successfully. How to install a boto3 layer for use across all your Lambda functions is explained in the following short article: <Link>

Click on Layers and then Add a layer.

AWS Lambda: adding a boto3 layer [Python]

In the code console (lambda_function.py), copy and paste the following code:

Replace "<bucket name>" with your actual bucket name.

import json
import boto3

def lambda_handler(event, context):
    s3_client = boto3.client('s3')

    # The request body arrives from API Gateway as a JSON string; parse it
    # to get the name (key) of the file being uploaded
    event_body = json.loads(event['body'])
    key = event_body['key']

    # Replace "<bucket name>" with your actual bucket name
    bucket_name = "<bucket name>"

    # Generate a pre-signed URL that allows a PUT of this key for one hour
    response = s3_client.generate_presigned_url(
        'put_object',
        Params={'Bucket': bucket_name, 'Key': key},
        ExpiresIn=3600
    )

    return {
        'statusCode': 200,
        'body': json.dumps(response)
    }

The code uses the generate_presigned_url() function, which is defined as follows:

generate_presigned_url() function definition
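
For reference, here is a sketch of the call based on the boto3 documentation; the parameter names are boto3’s and the values match the handler above:

# ClientMethod: the client method the URL should invoke, e.g. 'put_object'
# Params:       parameters forwarded to that method, e.g. Bucket and Key
# ExpiresIn:    lifetime of the URL in seconds (defaults to 3600)
# HttpMethod:   HTTP verb to allow; inferred from ClientMethod if omitted
url = s3_client.generate_presigned_url(
    ClientMethod='put_object',
    Params={'Bucket': bucket_name, 'Key': key},
    ExpiresIn=3600,
    HttpMethod='PUT'
)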

III. API Gateway Trigger

Go to the Designer section at the top of the Lambda function page.

Click on Add trigger. Select API Gateway and create a new API.

Now, to get the API endpoint, click on API Gateway in the Designer section and the endpoint will be listed there.

IV. Testing

To start testing, you can use Postman.
Make a POST request to this API with the file name as the body of the request.

Body:
{
    "key": "abc.pdf"
}

You will receive a pre-signed URL based on the file name as the response. To send the file to S3, make a PUT request to the pre-signed URL with the appropriate header (e.g., "Content-Type": "multipart/form-data") and include the file in the body of this PUT request in multipart/form-data format.
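
If you would rather script the test than click through Postman, here is a minimal sketch using the requests library. The API URL and file name are placeholders, and this sketch PUTs the raw file bytes in the request body (one common way to use a put_object pre-signed URL) instead of a Postman form-data body:

import requests

# Placeholders: replace with your API Gateway endpoint and your file
api_url = "https://<api-id>.execute-api.<region>.amazonaws.com/default/<function-name>"
file_name = "abc.pdf"

# Step 1: POST the file name to the API and read back the pre-signed URL
resp = requests.post(api_url, json={"key": file_name})
presigned_url = resp.json()  # the Lambda returns the URL as a JSON-encoded string

# Step 2: PUT the file to the pre-signed URL
with open(file_name, "rb") as f:
    upload = requests.put(presigned_url, data=f)

print(upload.status_code)  # 200 means the file landed in the bucket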

If you have any queries, please drop a comment. I will soon be writing another post on how to retrieve this file from S3 using a similar flow.
