Creating an Image thumbnail generator using Vue, AWS, and Serverless (Part 2) — The Upload

Ramsay Lanier
5 min read · Dec 7, 2017

In part 1 we created a default Vue application and deployed it to an S3 bucket configured for static hosting. In this part, we'll gut the default Vue application and create a basic UI that uploads a file to a different S3 bucket. In order to do this, we need to:

  • Create a new S3 bucket for uploads
  • Create the UI components
  • Configure the AWS JavaScript SDK with our user credentials
  • Wire up the UI components to work with S3

Creating A New S3 Bucket

Let's modify the existing serverless.yml file so we can create a new bucket. The new template looks like this:
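
Something along these lines will do; the service name, runtime, and bucket name below are placeholders you'd swap for your own, and the static-hosting bucket from part 1 would sit alongside the new one under Resources:

# serverless.yml (sketch; the service, runtime, and bucket names are placeholders)
service: vue-thumbnail-generator

provider:
  name: aws
  runtime: nodejs6.10

custom:
  # The bucket our client will upload images into
  uploadBucket: my-thumbnail-uploads

resources:
  Resources:
    # ...the static-hosting bucket from part 1 stays here...
    UploadBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:custom.uploadBucket}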

In the custom block we add a new uploadBucket variable, and under resources we add the bucket itself. Save the file and then run sls deploy from your terminal. Voila! You now have a new bucket to upload stuff to.

The last thing we have to do is enable CORS for this bucket, which unfortunately at the time of this writing is not possible to configure from Serverless, so we’ll have to do it from the S3 console. We have to do this because the web application that will be uploading images is from a different origin than where the S3 bucket lives. This is a common security issue, so for now we’ll get around it by adding a very loose CORSRule.

In the S3 console, click on your new S3 upload bucket and then select the Permissions tab. You’ll see three buttons below the tabs — click on CORS Configuration. In the text area, add the following on line 9:

<CORSRule>
  <AllowedOrigin>*</AllowedOrigin>
  <AllowedMethod>PUT</AllowedMethod>
  <MaxAgeSeconds>3000</MaxAgeSeconds>
  <AllowedHeader>*</AllowedHeader>
</CORSRule>

The end result should look like this:

For you security experts, this might look alarming. We're basically saying that PUT requests can be made to your bucket from any origin. In production, the allowed origin will be the URL of the client-side application hosted in the S3 bucket we created in part 1. For now, we're just testing locally, so we'll leave it at *.

Create the UI components

We're going to gut the default application and replace some stuff. First, let's start with App.vue, which gets cleaned up a bit to look like this:
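
A stripped-down App.vue along these lines is all we need:

<template>
  <div id="app">
    <!-- Vue Router renders the matched route's component here -->
    <router-view/>
  </div>
</template>

<script>
export default {
  name: 'app'
}
</script>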

This is pretty close to what we're given in the default, just with some clutter removed. The important thing to see here is the <router-view/> component, which Vue Router gives us.

Let's take a look at the routes, which are very simple for now.
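
The router (at /client/src/router/index.js in the default vue-cli scaffold) ends up looking something like this; the component path is an assumption based on where we put Home.vue:

import Vue from 'vue'
import Router from 'vue-router'
// Assumes Home.vue lives in /client/src/components
import Home from '@/components/Home'

Vue.use(Router)

export default new Router({
  routes: [
    {
      path: '/',
      name: 'Home',
      component: Home
    }
  ]
})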

Pretty simple: there is just one route, which renders the Home component.
For now, the Home component is also pretty simple. More will be added to it later, but for now it just renders another component, the upload form.
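
A minimal Home component might look like this, assuming the upload form lives next to it in UploadForm.vue:

<template>
  <div class="home">
    <!-- For now, Home just renders the upload form -->
    <upload-form/>
  </div>
</template>

<script>
import UploadForm from './UploadForm'

export default {
  name: 'Home',
  components: { UploadForm }
}
</script>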

UploadForm is where all the good stuff will happen.
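
Here's a sketch of the component; it assumes the aws.js module we write below sits one directory up, at /client/src/aws.js:

<template>
  <form @submit.prevent="handleSubmit">
    <!-- Selecting files stores them in the component's files array -->
    <input type="file" multiple @change="handleFileChange">

    <!-- Show the user the names of the files they've selected -->
    <ul>
      <li v-for="file in files" :key="file.name">{{ file.name }}</li>
    </ul>

    <input type="submit" value="Upload">
  </form>
</template>

<script>
import { uploadFiles } from '../aws'

export default {
  name: 'UploadForm',
  data () {
    return {
      files: []
    }
  },
  methods: {
    handleFileChange (event) {
      // FileList is array-like, so convert it to a real array
      this.files = Array.from(event.target.files)
    },
    handleSubmit () {
      // uploadFiles returns a promise that resolves when every upload is done
      uploadFiles(this.files).then(() => {
        this.files = []
      })
    }
  }
}
</script>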

There's a lot going on here, so we'll break it down into pieces. First, we have a form that, when submitted, calls a handleSubmit method. In the component's methods you can see that handleSubmit simply calls an uploadFiles function (that we'll define shortly).

Inside the form is a file input that, when changed, calls the handleFileChange method, which takes the list of files we've selected and sets our component state called files. This is just an array of files.

After the input, we loop through the files we've selected to upload and show the user their names. Lastly, there is a submit input that, when clicked, submits the form.

The last piece of this puzzle is the uploadFiles function, which you can see is being imported from a file that we will create in the next step.

Configuring the AWS SDK and Wiring It Up to S3

First, we'll need to add two npm packages: aws-sdk and config. In your terminal:

npm install --save aws-sdk config

The config package lets us apply different configuration files based on the environment in which the application is running. When running locally, our NODE_ENV (node environment) variable is set to “development”. When we bundle the application by running npm run build, NODE_ENV is set to “production”. So we'll create two new files in the config directory: development.json and production.json. The config package will determine which file to use based on NODE_ENV.

It's important to use configuration files because we'll be storing some private access keys given to us by AWS, and we don't want to hard-code them in the source code that gets committed. Instead, the configuration files will be added to our .gitignore file so anybody cloning our repo won't get our access keys.

We’ll need to get our public and private access keys from AWS. If you don’t know how to do this, read the documentation! Now, we can create the two config files. For now, they will have the exact same contents:

/client/config/development.json
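
The key names below are just what the aws.js sketch further down expects; swap the placeholder values for your own keys, region, and bucket name:

{
  "aws": {
    "accessKeyId": "YOUR_ACCESS_KEY_ID",
    "secretAccessKey": "YOUR_SECRET_ACCESS_KEY",
    "region": "us-east-1",
    "uploadBucket": "my-thumbnail-uploads"
  }
}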

Now that we have the configuration files, let's configure AWS and create that uploadFiles function.

/client/src/aws.js
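
Here's a sketch of the module, assuming the config keys shown in development.json above:

// /client/src/aws.js (sketch; assumes the config keys from development.json)
import AWS from 'aws-sdk'
import config from 'config'

const aws = config.get('aws')

// Configure the SDK with the credentials and region from the config file
AWS.config.update({
  accessKeyId: aws.accessKeyId,
  secretAccessKey: aws.secretAccessKey,
  region: aws.region
})

// Use native promises so callers can react when uploads finish
AWS.config.setPromisesDependency(Promise)

// Point the S3 client at the upload bucket we created with Serverless
const s3 = new AWS.S3({ params: { Bucket: aws.uploadBucket } })

// Upload every selected file; resolves when all of them are done
export function uploadFiles (files) {
  const uploads = files.map(file =>
    s3.putObject({
      Key: file.name,
      Body: file,
      ContentType: file.type
    }).promise()
  )
  return Promise.all(uploads)
}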

First, we import the aws object from our config file, which contains the bucket name and access keys, and configure the AWS SDK with those values. We also tell AWS that we'd like to use promises. We want promises because we'd like the UI to update after the uploads are done. Then we create a new S3 object, giving it the name of the upload bucket we created earlier. Lastly, we create the uploadFiles function that gets called when the upload form is submitted. We map across all the files and upload each one. This function returns a promise that resolves when all the files are done uploading.

That’s it!

I’d be remiss if I didn't mention that we should probably be doing some file type and size checking on the files that are uploaded. For the purposes of this application, I decided not to get into it — but in production this is something that would be very important!

Next Up

In the next part, we'll create a Lambda function that gets called whenever an object is put into our upload bucket. The Lambda function will create some thumbnail files from the uploaded image and put them into yet another bucket.

Ramsay is a full-stack JavaScript developer for a data analytics company in Springfield, VA. You can follow him on twitter or github. He likes when you do that.
