Upload Files To AWS S3 From a React App — Using AWS Amplify

Anjan Biswas
12 min read · Dec 8, 2018


Amazon has made significant progress in creating tools, frameworks and libraries that finally let developers focus on the code rather than being bogged down with infrastructure concerns. One notable library that has made great progress is AWS Amplify.

What existed as one-off solutions for developers to integrate their mobile and web apps with AWS services (via Mobile Hub) is now a more cohesive solution in the form of AWS Amplify, and the introduction of the Amplify Console at AWS re:Invent 2018 is just icing on the cake.

In this post though, we are going to look at how to use AWS Amplify to let users upload files to S3 buckets from a React app. The use case is simple — authenticate users of your React app from the browser through Cognito and allow them to upload files to an S3 bucket. I’ll be honest, there’s quite a bit of configuration that is involved, but nothing that can’t be done, so let’s get to it.

User authentication & S3 operations

Authentication Using Cognito (a pre-requisite)

You can skip this section if you already have a Cognito User Pool and Identity Pool set-up.

Before we get into the details of implementing Amplify with React, it’s important to understand what role Cognito plays in this entire process. If you didn’t already know, Cognito makes it easy for you to let your users sign up/sign in to your app and enables you to manage their access to your AWS services (such as, in this case, S3). For the sake of this article it’s important to understand the two components of Cognito:

  1. Cognito User Pool: User pools allow you to set up how your users are going to sign up and sign in to your app (i.e. using email, username, phone number, or all of them). It lets you define a password policy (e.g. minimum 8 characters), define custom user attributes, enable MFA (multi-factor authentication), and so on. You can check out this webinar on how to create a user pool (or there is a plethora of articles online that will help you get started). Once a user pool is created, it should provide you a “Cognito User Pool ID” (e.g. us-east-1_18spw01t) and an “App client ID” (e.g. 3v6a23ov3g1rhj23e27nm4v59si). We will need both of these later.
  2. Cognito Identity Pool: Technically, a User Pool alone is enough to set up a basic authentication service with Cognito for your app. However, if you want your users to have fine-grained access to other AWS services, or perhaps want to integrate 3rd-party authentication providers such as Google, Facebook, Twitter, SAML, etc., you will need to set up a “Federated Identity” using a Cognito Identity Pool (CIP). A CIP lets you assign IAM roles at the authenticated and unauthenticated levels, which basically dictate which services (or parts of services) a user can access when authenticated vs. unauthenticated. Just like a user pool, a CIP will have its own ID (e.g. us-east-1:16e03s22-ce44-4cf5-jhg8-f11245xfcB15). We will keep a note of this as we will need it later.

Once we have Cognito ready to go we can move on to the next section.

Configure S3 permissions in Cognito

Once we have our User Pool and Identity Pool ready, we need a way to add permissions to the Identity Pool which will give our users the ability to perform S3 operations (like PUT, GET, LIST etc.). When a user authenticates using the Cognito Identity Pool, their identity would “assume” the IAM role that we assigned to the identity pool and they can then perform the allowed operations on S3.

Identity Pool allows you to add two types of roles (IAM Roles).

  1. Authenticated Role: for when the user is in an authenticated state.
  2. Unauthenticated Role: for when the user is in an unauthenticated state.

We will need to assign an individual IAM Role to each, even though we are only going to use the Authenticated Role, since we want our users to be able to upload files only when they are authenticated.

Head over to the Cognito console and click on “Manage Identity Pool” from the homepage. This should take you to the identity pool manager. On the next page, select your identity pool and then click the “Edit identity pool” link in the top right-hand corner.

Cognito Identity Pool

When you create an Identity Pool, two roles will be created for you by Cognito. One for “Unauthenticated Role” and another for “Authenticated Role” as shown in the image above. We will need to modify these roles using IAM console to add S3 permissions. Head over to IAM to edit the roles.

We will edit the roles using the policy editor and apply the policy JSON below. We will assume our S3 bucket name is “my-test-bucket-amplify”.
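The original post embeds the policy as a gist. A policy along these lines, modeled on the standard Amplify storage policy, grants the authenticated role access to the bucket’s prefixes; verify it against your own requirements before using it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::my-test-bucket-amplify/public/*",
        "arn:aws:s3:::my-test-bucket-amplify/protected/${cognito-identity.amazonaws.com:sub}/*",
        "arn:aws:s3:::my-test-bucket-amplify/private/${cognito-identity.amazonaws.com:sub}/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::my-test-bucket-amplify/protected/*"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-test-bucket-amplify"],
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "public/",
            "public/*",
            "protected/",
            "protected/*",
            "private/${cognito-identity.amazonaws.com:sub}/",
            "private/${cognito-identity.amazonaws.com:sub}/*"
          ]
        }
      }
    }
  ]
}
```

The `${cognito-identity.amazonaws.com:sub}` policy variable resolves to the calling user’s Identity ID, which is what scopes the protected/ and private/ writes to the individual user.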

We will repeat this step for the unauth role; however, for the unauth role make sure you remove the DeleteObject and PutObject permissions. Everything else remains the same as the policy JSON above. We will discuss the public/, protected/, and private/ prefixes, as seen in the JSON, a little further down.

We will also need to make sure that the “Trust Relationships” for both the roles are set correctly. Click on the “Trust Relationship” tab and then “Edit trust relationship”.

Ensure Trust Relationship is set correctly

IMPORTANT: Make sure that the cognito-identity.amazonaws.com:aud value is your Cognito identity pool ID, and that cognito-identity.amazonaws.com:amr is “authenticated” for the authenticated role and “unauthenticated” for the unauthenticated role.
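For reference, the trust policy for the authenticated role follows this shape (the identity pool ID below is a placeholder; use your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Federated": "cognito-identity.amazonaws.com" },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "cognito-identity.amazonaws.com:aud": "us-east-1:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
        },
        "ForAnyValue:StringLike": {
          "cognito-identity.amazonaws.com:amr": "authenticated"
        }
      }
    }
  ]
}
```

The unauthenticated role’s trust policy is identical except the amr condition value is "unauthenticated".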

Once both the IAM roles are edited as shown above, they will enable authenticated users to perform S3 operations.

Setting Up Amplify

We will install AWS Amplify’s JavaScript SDK (amplify-js). Here’s the GitHub repo for the SDK, and here’s the documentation.

NOTE: Amplify provides some boilerplate React components (HOCs) which can be generated using the Amplify CLI. We will not be using the Amplify CLI, but will instead do our own implementation using the modules within amplify-js.

We will use a create-react-app boilerplate to get started. Let’s create a CRA project and name it amplify-s3. We will also install amplify-js into our project.

$ create-react-app amplify-s3
$ cd amplify-s3
$ npm install aws-amplify --save

Now that we have amplify-js installed, we need to make use of the Storage sub-module for S3 related operations (and Auth sub-module for Cognito authentication related operations).

Let’s configure Amplify with the Cognito user pool, identity pool information, and the S3 bucket information. In order to do so, we will create a services.js file in the root of the project and put all of the Amplify Auth/S3-related operations there so that we can re-use them throughout our app. In services.js, the first thing to do is to initialize and configure Amplify —
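The original post embeds this as a gist; a sketch of the initialization looks like the following (the REACT_APP_* variable names here are my own choice, so make sure they match your .env file):

```javascript
// services.js
import Amplify from "aws-amplify";

// Initializes Amplify's Auth (Cognito) and Storage (S3) modules
// from environment variables defined in the project's .env file.
export function configureAmplify() {
  Amplify.configure({
    Auth: {
      identityPoolId: process.env.REACT_APP_IDENTITY_POOL_ID,
      region: process.env.REACT_APP_REGION,
      userPoolId: process.env.REACT_APP_USER_POOL_ID,
      userPoolWebClientId: process.env.REACT_APP_CLIENT_ID
    },
    Storage: {
      bucket: process.env.REACT_APP_BUCKET_NAME,
      region: process.env.REACT_APP_REGION,
      identityPoolId: process.env.REACT_APP_IDENTITY_POOL_ID
    }
  });
}
```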

All of the configuration values above are mandatory in order to use the Storage sub-module for S3 and authentication (Auth) operations. Note the process.env.REACT_APP_* variables. These are environment variables declared inside a .env file in the root of your project. CRA has built-in support for dotenv, so we can take advantage of that without having to type in the values again and again. The .env file’s contents will look like this, and we covered each of these values in the Cognito section.

.env file in React project
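As an illustration, a minimal .env might look like this (the variable names must match whatever your services.js reads from process.env, and the IDs below are the example values from the Cognito section; CRA only exposes variables prefixed with REACT_APP_):

```
REACT_APP_REGION=us-east-1
REACT_APP_USER_POOL_ID=us-east-1_18spw01t
REACT_APP_CLIENT_ID=3v6a23ov3g1rhj23e27nm4v59si
REACT_APP_IDENTITY_POOL_ID=us-east-1:16e03s22-ce44-4cf5-jhg8-f11245xfcB15
REACT_APP_BUCKET_NAME=my-test-bucket-amplify
```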

We need to call Amplify.configure when the app is initialized, so we will import the configureAmplify() function into the index.js file of our project using a named import, so it would look like this —
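A sketch of what index.js might look like in a standard CRA project, assuming services.js exports configureAmplify as a named export:

```javascript
// index.js
import React from "react";
import ReactDOM from "react-dom";
import App from "./App";
import { configureAmplify } from "./services";

// Configure Amplify before the App component mounts
configureAmplify();

ReactDOM.render(<App />, document.getElementById("root"));
```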

It is important to call the configureAmplify() function before your App mounts. At this point we are good to start using Storage and Auth.

For the sake of this article, we will assume you already have code in place to sign your users in to your app using the Auth sub-module and its built-in signIn() method, i.e. Auth.signIn() and so on. Take a look at the documentation, which explains Auth sub-module usage.
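For context, a minimal sign-in call with the Auth sub-module might look like this (a sketch only, with error handling kept to a bare minimum; see the Auth documentation for MFA and challenge flows):

```javascript
import { Auth } from "aws-amplify";

// Signs a user in against the configured Cognito user pool.
// Resolves with the authenticated user object on success.
export async function signIn(username, password) {
  try {
    const user = await Auth.signIn(username, password);
    console.log("Signed in as:", user.username);
    return user;
  } catch (err) {
    console.error("Sign-in failed:", err);
    throw err;
  }
}
```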

Using the Storage sub-module

We can now use the Storage sub-module in Amplify and start performing S3 operations. But before we do, there is one important shortcoming to discuss. As of this writing, Amplify.configure() supports initializing only a single S3 bucket. This may not suit every use case; for example, you may want user images to go to one bucket and videos to another. This could become a particularly challenging scenario with the way we configured Amplify above, and unfortunately this is the only way to configure Amplify right now. But not all is lost. Let’s take a look at how we can use multiple S3 buckets in your app.

In order to be able to use different buckets, we need to perform some Storage-specific configuration; specifically, set the bucket name, level, region and Identity Pool ID every time you perform an S3 operation. Although this may not be an ideal way to do it, it is the only way to do it. We will add a new function in our services.js file to configure Storage with all of these parameters. We will first use a modular import to import Storage from Amplify and then perform Storage.configure() —
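A sketch of such a function, again assuming the REACT_APP_* variable names used earlier in services.js:

```javascript
import { Storage } from "aws-amplify";

// Re-points the Storage sub-module at a given bucket and
// file access level before an S3 operation is performed.
export function SetS3Config(bucket, level) {
  Storage.configure({
    bucket: bucket,
    level: level,
    region: process.env.REACT_APP_REGION,
    identityPoolId: process.env.REACT_APP_IDENTITY_POOL_ID
  });
}
```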

Let’s discuss what we did here, and what level means. We simply passed an object to the built-in configure() method for Storage. Most of the values are familiar, like bucket — the bucket’s name, region — the AWS region, and identityPoolId — the CIP ID. But what does level mean?

The Storage module can manage files with three different access levels; public, protected and private.

  1. public : Files with public access level can be accessed by all users who are using your app. In S3, they are stored under the public/ path in your S3 bucket. Amplify will automatically prepend the public/ prefix to the S3 object key (file name) when the level is set to public.
  2. protected: Files with protected access level are readable by all users but writable only by the creating user. In S3, they are stored under protected/{user_identity_id}/ where the user_identity_id corresponds to a unique Amazon Cognito Identity ID for that user. IMPORTANT: The Cognito user pool will automatically assign your users a “User ID” (which looks like: 09be2c02-8a21-4f51-g778-05448a6afbb5) and the Identity Pool assigns users an “Identity ID” (which looks like: us-east-1:09be2c02-3d0a-400d-a718-4e284dca6de4), and they are different. The user_identity_id signifies the “Identity ID”. Amplify will automatically prepend the protected/ prefix to the S3 object key (file name) when the level is set to protected, and use the user_identity_id for the user that is logged into your app and has a valid session. So, given a file name of my_profile_pic.jpg, a bucket name of my_profile_pics_bucket, and an authenticated user Identity ID of us-east-1:09be2c02-3d0a-400d-a718-4e284dca6de4, the S3 path will look like s3://my_profile_pics_bucket/protected/us-east-1:09be2c02-3d0a-400d-a718-4e284dca6de4/my_profile_pic.jpg
  3. private: Files with private access level are accessible only by the specific authenticated user. In S3, they are stored under private/{user_identity_id}/ where the user_identity_id corresponds to a unique Amazon Cognito Identity ID for that user.

In our case we want users to be able to store their files under the protected file access level in the S3 bucket. This means that Amplify will always perform the operations under s3://your_bucket_name/protected/{user_identity_id}/.

We can now use our reusable function in services.js file to configure the S3 Storage submodule by simply passing the bucket name and the file access level like so —

SetS3Config("my_other_bucket","private");

Setting up the S3 bucket

We will now set up our S3 bucket so that it works with Amplify’s Storage sub-module. The Storage sub-module is a convenient wrapper around S3’s API endpoints (which are also used by the AWS CLI internally) and abstracts away many of the implementation details. This means that you would actually be making API calls to the S3 endpoints from your React/JavaScript code. With that in mind, let us quickly create an S3 bucket.

Head over to AWS S3 console and click “Create Bucket”. We will name our bucket “my-test-bucket-amplify”.

New my-test-bucket-amplify S3 bucket

We now have to enable CORS (Cross-Origin Resource Sharing) for this bucket. We need to perform this step since Amplify will interact with this bucket through the S3 REST API endpoints. In order to do that, click on the bucket name, go to the “Permissions” tab, and then click the “CORS Configuration” button.

S3 bucket CORS Configuration

In the CORS configuration editor, enter the XML below —

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <ExposeHeader>x-amz-server-side-encryption</ExposeHeader>
    <ExposeHeader>x-amz-request-id</ExposeHeader>
    <ExposeHeader>x-amz-id-2</ExposeHeader>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>

This XML configures CORS for the S3 bucket’s API endpoint. You can allow all domains with * in AllowedOrigin, or you can list a specific domain, for example <AllowedOrigin>yourdomain.com</AllowedOrigin>, to restrict calls to only your domain. You can also select which REST methods to allow by listing them (or not listing them) with <AllowedMethod>. Learn more about S3 CORS configuration here. That’s about it for S3; note that the bucket remains private by default, and we have not added any bucket policy explicitly since we do not need one for this example. Now it’s time to write some React code to see our upload operation in action.

Uploading from React App

We’ve come a long way, so now it’s time to put some code into our CRA project’s index.js file and build a component that can actually perform an upload. Here’s what the code looks like —
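The original post embeds the component as a gist. Since it isn’t reproduced here, below is a sketch that follows the steps described next; the component name (S3ImageUpload) and the exact markup are illustrative:

```javascript
import React from "react";
import { Storage } from "aws-amplify";
import { configureAmplify, SetS3Config } from "./services";

configureAmplify();

class S3ImageUpload extends React.Component {
  state = { image: null, imageName: "", response: "" };

  // Save the selected file and its name in state (only one file allowed)
  handleFileChange = e => {
    const file = e.target.files[0];
    if (!file) return;
    this.setState({ image: file, imageName: file.name });
  };

  uploadImage = () => {
    // Point Storage at our bucket with the "protected" access level
    SetS3Config("my-test-bucket-amplify", "protected");
    // Prefix the object key with userimages/ and pass the contentType
    Storage.put(`userimages/${this.state.imageName}`, this.state.image, {
      contentType: this.state.image.type
    })
      .then(result => this.setState({ response: `Uploaded: ${result.key}` }))
      .catch(err => this.setState({ response: `Upload failed: ${err.message}` }));
  };

  render() {
    return (
      <div>
        {/* The real file input is hidden; the visible button triggers it */}
        <input
          type="file"
          accept="image/png, image/jpeg"
          style={{ display: "none" }}
          ref={input => (this.fileInput = input)}
          onChange={this.handleFileChange}
        />
        <button onClick={() => this.fileInput.click()}>Pick a file</button>
        <span>{this.state.imageName || "No file selected"}</span>
        <button onClick={this.uploadImage} disabled={!this.state.image}>
          Upload
        </button>
        <p>{this.state.response}</p>
      </div>
    );
  }
}

export default S3ImageUpload;
```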

Pretty straightforward. Let’s take a look at what we did there.

  1. We imported configureAmplify() and SetS3Config() from our services.js file. These two functions will configure Amplify to initialize the services and help us set the bucket configurations respectively.
  2. We then wrote a component with basic JSX: an upload field with <input type="file"> and a couple of buttons — one to select a file from our computer and another to upload the selected file. NOTE: the code restricts the file types to png and jpeg using accept="image/png, image/jpeg", but that’s optional.
  3. We save the image file and the image file name from HTML5’s File API files[] array (an array because you may select multiple files, though in our case only one file is allowed) in the state using setState on the onChange event of the input element. NOTE: The actual file input element is hidden, and we render a regular button in its place purely for styling purposes.
  4. Once we have the file and file name in the state we invoke the uploadImage function from the button’s onClick event.
  5. The uploadImage function initializes the S3 bucket configuration using the SetS3Config function, setting the bucket name (“my-test-bucket-amplify”) and the file access level (“protected”).
  6. Finally, we call the Storage.put() function, which takes the object key (i.e. the file name) with an S3 object prefix (in this case userimages/), the file itself, and a metadata object where we passed the contentType. Check out Storage.put() examples and documentation here.

Here’s what it looks like —

Upload from React App to S3 Bucket

Conclusion

Considering you are using Cognito and Amplify, this is a rather simple implementation of a file uploader for your app, but it also leaves quite a bit to be desired. For example, multipart uploads are non-existent out of the box, and so is a file upload progress subscriber indicating how much of the file has been uploaded (in order to show some sort of progress bar to the user while the file uploads). However, this is a step in the right direction, with Amplify simplifying a lot along the way of providing simple upload functionality in your web (or mobile) apps for your users.


Anjan Biswas

Special Solutions Architect @AWS, AI and Machine Learning Engineer, generative AI specialist — opinions are my own