Transfer Files to S3 Using REST API
My Use Case
I sometimes get asked: “Which method do you prefer to transfer files to S3?” My answer is: “The one that works for you.” The definition of “works” varies from case to case. The simplest way would be to make the bucket public. If this is not desired, then create a user with programmatic access to the S3 bucket, or with enough permissions to assume a role that allows access to it. The issue is that I may end up with many user identities. I can implement identity federation using Cognito to manage this better. Needless to say, there are different ways to request temporary security credentials, each with its own use case.
What if you really want to avoid using any user credentials or identities, or if the objects in the bucket require a bit of processing? The API Gateway-Lambda-S3 serverless pattern is here to help.
My pattern is based on a REST API that integrates with a Lambda function. The function can perform any number of tasks, such as issuing pre-signed URLs to upload or download objects to a designated S3 bucket, sending notifications to an SNS topic, and processing the uploaded objects (not done in this article). This is a relatively secure method to transfer files because the pre-signed URL can be set to expire quickly, if necessary. The SNS topic can inform the system's stakeholders of the requested operation. This cloud stack can be further extended by using S3 event notifications as triggers for other Lambda functions.
The usability of this cloud stack is increased by adding automation: an application or script that acts as a proxy between the user and the cloud. I prefer PowerShell for this because of the ease of embedding documentation within the script.
I do not intend to go into the details of PowerShell programming or of integrating API Gateway and Lambda responses; the latter I have already demonstrated step by step, code included. Instead, I will focus on the deployment and usage aspects.
Deployment and Usage
To deploy the solution, you can clone my repository. The code you require is located by following the link in “readme.md”, under the same heading as the title of this article. Without going into a lot of detail, you will need to:
- Prepare the S3 bucket hosting the code
- Create the CloudFormation stack: the most important outputs of the stack are the REST API Prod Endpoint Url and the Lambda Arn, which are used in the next two steps
- Allow the API to invoke Lambda: by executing the relevant CLI commands
- Prepare the PowerShell script: by setting the correct URI variable
All these steps are documented in detail in the “readme.md” located in the CFN-S3PreSignedURL folder.
Before invoking the script, I suggest reading its documentation: in a PowerShell console, type help ./Transfer-Files.ps1 -Example. I recommend PowerShell Core, but any of the latest flavors will do. The script supports renaming the object by simply specifying a different name for the target than for the source. If the S3 key is omitted for an upload operation, an automated prefix based on the UTC date-time is added to the object key. If you have confirmed the subscription to the notification topic, expect an email in your inbox each time you invoke the script.
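The key-defaulting behaviour can be sketched in a few lines. The actual proxy is a PowerShell script; this Python version only illustrates the logic, and the exact prefix format is an assumption for illustration:

```python
# Sketch of the client-side key defaulting: when no S3 key is given for an
# upload, derive one from the source file name with a UTC date-time prefix.
# The prefix format below is an assumption, not the script's exact format.
import os
from datetime import datetime, timezone

def default_object_key(source_path, key=None, now=None):
    """Return the explicit key, or prefix the file name with the UTC date-time."""
    if key:                                      # renaming: target name differs from source
        return key
    now = now or datetime.now(timezone.utc)
    return f"{now.strftime('%Y%m%dT%H%M%SZ')}/{os.path.basename(source_path)}"
```

The upload itself is then a plain HTTP PUT of the file body against the pre-signed URL returned by the API, so any HTTP client works once the key is settled.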
I hope you found this article useful and had a bit of fun dissecting the various components. In my case, the most time-consuming part was making sure that the PowerShell script was properly documented. If you did not have any issues using the script, it means I did it correctly.
Please feel free to send me your remarks…