How to Use Amazon S3 Effectively

Anurag Agarwal
7 min read · Jun 4, 2024

As a junior developer, I’ve worked on various projects where I needed to upload images to storage. One of the best places to store and easily fetch general data is Amazon S3 (Simple Storage Service). While there are many platforms offering similar services, S3 stands out due to its extensive functionalities. In this blog, I’ll share how to use S3 effectively. Let’s explore three different methods to upload files to cloud platforms and conclude with the most optimized solution.

Methods to Upload Files to S3

  1. Directly Post to Cloud Platform from Frontend
  2. Use a Backend Package like Multer
  3. Optimized Method: Frontend Upload with Backend Permission

Let’s dive deeper into each method and how to implement it, ending with the optimized solution.

Directly Post to Cloud Platform from Frontend

You can install the cloud provider’s SDK and call its upload functions directly from the browser. We’ll use AWS S3 as an example. Follow these steps to post an image directly to S3 from the frontend.

Step-by-Step Guide

  1. Install AWS SDK:
npm install @aws-sdk/client-s3

2. Create a React Component:

Create a new file named Upload.tsx or any name you prefer.

import React, { useRef } from 'react';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const S3Uploader: React.FC = () => {
  const fileInputRef = useRef<HTMLInputElement>(null);

  // AWS configuration — note that anything bundled into the client is visible to users
  const secretAccessKey = process.env.REACT_APP_S3_SECRET_KEY as string;
  const accessKeyId = process.env.REACT_APP_S3_ACCESS_KEY as string;
  const bucket = process.env.REACT_APP_S3_BUCKET_NAME as string;
  const region = process.env.REACT_APP_S3_REGION as string;

  const client = new S3Client({
    region,
    credentials: {
      secretAccessKey,
      accessKeyId,
    },
  });

  const onImageUploaded = async (event: React.ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0];
    if (!file) return;

    const command = new PutObjectCommand({
      Bucket: bucket,
      Key: file.name,
      Body: file,
    });

    try {
      const response = await client.send(command);
      console.log('Upload success:', response);
      // Access link
      console.log(`https://${bucket}.s3.${region}.amazonaws.com/${file.name}`);
    } catch (err) {
      console.error('Upload error:', err);
    }
  };

  return (
    <div className="block-container">
      <button onClick={() => fileInputRef.current?.click()} className="show-more">
        Upload Image
      </button>
      <input
        type="file"
        onChange={onImageUploaded}
        className="hidden"
        ref={fileInputRef}
        accept="image/*"
        style={{ display: 'none' }}
      />
    </div>
  );
};

export default S3Uploader;

This creates a simple React component that uploads files to S3 from the frontend. Before using it, you need to create and configure an S3 bucket in the AWS console.
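One prerequisite applies both here and to the pre-signed URL method later: the bucket’s CORS configuration must allow requests from your frontend’s origin, or the browser will block the upload. Here’s a minimal sketch that applies such a configuration with the SDK — the AWS_* environment variables and the localhost origin are assumptions, so substitute your own values.

// set-cors.js — minimal sketch: allow browser uploads to the bucket.
// Assumes AWS_REGION and S3_BUCKET_NAME in a .env file, plus credentials
// resolved from the environment; adjust AllowedOrigins to your frontend URL.
const { S3Client, PutBucketCorsCommand } = require('@aws-sdk/client-s3');
require('dotenv').config();

const s3Client = new S3Client({ region: process.env.AWS_REGION });

s3Client.send(new PutBucketCorsCommand({
  Bucket: process.env.S3_BUCKET_NAME,
  CORSConfiguration: {
    CORSRules: [
      {
        AllowedOrigins: ['http://localhost:3000'], // placeholder frontend origin
        AllowedMethods: ['GET', 'PUT', 'POST'],
        AllowedHeaders: ['*'],
      },
    ],
  },
}))
  .then(() => console.log('CORS configuration applied'))
  .catch(console.error);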

Disadvantages

  • Security: This method does not securely manage AWS credentials. Variables prefixed with REACT_APP_ are baked into the JavaScript bundle at build time, so anyone who inspects the shipped code can extract your keys.
  • Complexity: Managing secrets on the client side is error-prone; it’s usually better to let the backend handle them.

Using Backend as an Intermediary

Here, we set up a backend service to handle file uploads and store the files in Amazon S3 using Node.js, Express, and the AWS SDK.

Step-by-Step Guide

  1. Install Required Packages:
npm install express multer @aws-sdk/client-s3 dotenv

2. Create Backend Service:

Create a file named server.js and set up an Express server with Multer for handling file uploads.

const express = require('express');
const multer = require('multer');
const fs = require('fs');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
require('dotenv').config();

const app = express();
const upload = multer({ dest: 'uploads/' });

const s3Client = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
});

app.post('/upload', upload.single('file'), async (req, res) => {
  const file = req.file;
  const fileStream = fs.createReadStream(file.path);

  const uploadParams = {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: file.originalname,
    Body: fileStream,
    ContentType: file.mimetype, // preserve the original MIME type
  };

  try {
    const command = new PutObjectCommand(uploadParams);
    await s3Client.send(command);
    res.status(200).send({ message: 'File uploaded successfully' });
  } catch (err) {
    console.error(err);
    res.status(500).send({ error: 'Error uploading file' });
  } finally {
    // Remove the temporary file Multer wrote to disk
    fs.unlink(file.path, () => {});
  }
});

const PORT = process.env.PORT || 5000;
app.listen(PORT, () => {
  console.log(`Server is listening on port ${PORT}`);
});
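Since the server now sits in the upload path, it’s worth limiting what it accepts before anything is forwarded to S3. Here is a stricter variant of the upload object above — a sketch only, where the 5 MB cap and the image-only filter are illustrative choices, not requirements:

// Illustrative: reject oversized or non-image uploads at the server edge.
const upload = multer({
  dest: 'uploads/',
  limits: { fileSize: 5 * 1024 * 1024 }, // 5 MB cap (example value)
  fileFilter: (req, file, cb) => {
    // Accept only image MIME types; silently reject everything else
    cb(null, file.mimetype.startsWith('image/'));
  },
});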

3. Environment Variables:

Create a .env file in the project root; dotenv loads these values at startup.

AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_REGION=your-region
S3_BUCKET_NAME=your-bucket-name
PORT=5000

4. Frontend Setup:

Create a React component to handle the file upload process.

import React, { useRef } from 'react';
import axios from 'axios';

const FileUpload: React.FC = () => {
  const fileInputRef = useRef<HTMLInputElement>(null);

  const handleFileChange = async (event: React.ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0];
    if (!file) return;

    const formData = new FormData();
    formData.append('file', file);

    try {
      const response = await axios.post('http://localhost:5000/upload', formData, {
        headers: {
          'Content-Type': 'multipart/form-data',
        },
      });
      console.log('File uploaded successfully:', response.data);
    } catch (error) {
      console.error('Error uploading file:', error);
    }
  };

  const handleClick = () => {
    fileInputRef.current?.click();
  };

  return (
    <div className="block-container">
      <button onClick={handleClick} className="show-more">
        Upload File
      </button>
      <input
        type="file"
        onChange={handleFileChange}
        className="hidden"
        ref={fileInputRef}
        style={{ display: 'none' }}
      />
    </div>
  );
};

export default FileUpload;

This setup allows you to upload files from your frontend to your backend, and then from the backend to S3, ensuring that sensitive AWS credentials are not exposed on the client side.

Advantages of this method:

  • Secret keys stay on the backend, so nothing sensitive ships in the client bundle.
  • Environment variables are managed in one place on the server, which simplifies configuration.

Disadvantages of this method:

  • Storage: files are written to the server’s disk before being forwarded to S3, which can exhaust server storage over time.
  • Bandwidth: suppose a file is 20 MB. You first send 20 MB to your backend, and the backend then sends the same 20 MB on to S3. Every upload crosses the network twice, adding server overhead and cost.
  • Latency: the server response is noticeably slow for big files, since the client only hears back after both hops complete.

Using Pre-Signed URLs to Send Files to S3 Directly

Let’s look at the architecture behind this approach and what it achieves. It minimizes all the problems mentioned above: the server no longer stores files or forwards them to cloud storage.

The main idea is that the client asks the server for a URL to which it can send data directly to cloud storage. That URL is generated for a specific user and carries short-lived, single-use credentials that only permit uploads — and optionally only uploads of specific types of data.

This approach adds an extra API call (to fetch the URL), but the heavy payload now crosses the network only once, from the client straight to S3.

Steps to Create this Approach

  1. Backend Setup to Generate Pre-Signed URLs:
import 'dotenv/config';
import express from 'express';
import { S3Client } from '@aws-sdk/client-s3';
import { createPresignedPost } from '@aws-sdk/s3-presigned-post';
// Assumed: an auth middleware that verifies the user and sets req.userId
import { userAuthMiddleware } from './middleware';

const app = express();

const s3Client = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
});

export async function createPresignedPostRequest(key) {
  // Returns { url, fields }; the fields must accompany the file in the POST form
  return createPresignedPost(s3Client, {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: key,
    Expires: 60, // URL expiration time in seconds
  });
}

app.get('/generate-presigned-url', userAuthMiddleware, async (req, res) => {
  try {
    // Prefix the key with the user ID so uploads from different users don't collide
    const key = `${req.userId}/${req.query.filename}`;
    const presigned = await createPresignedPostRequest(key);
    res.status(200).json(presigned); // { url, fields }
  } catch (err) {
    console.error(err);
    res.status(500).send({ error: 'Error generating pre-signed URL' });
  }
});

const PORT = process.env.PORT || 5000;
app.listen(PORT, () => {
  console.log(`Server is listening on port ${PORT}`);
});
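As mentioned above, the pre-signed URL can be restricted to specific kinds of data. createPresignedPost accepts a Conditions array for exactly this. Here is a hedged variant of the function body above — the 5 MB cap is an example value, not a requirement:

// Illustrative: cap the size of what this pre-signed POST will accept.
return createPresignedPost(s3Client, {
  Bucket: process.env.S3_BUCKET_NAME,
  Key: key,
  Expires: 60, // URL expiration time in seconds
  Conditions: [
    ['content-length-range', 0, 5 * 1024 * 1024], // reject anything over 5 MB
    // A ['starts-with', '$Content-Type', 'image/'] condition is also possible,
    // provided the client includes a matching Content-Type field in the form.
  ],
});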

2. Frontend Setup to Use Pre-Signed URLs:

import React, { useRef } from 'react';
import axios from 'axios';

interface PresignedPost {
  url: string;
  fields: Record<string, string>;
}

export async function postToS3(file: File, signed: PresignedPost) {
  const formData = new FormData();
  // Copy every field from the pre-signed response; S3 rejects the request
  // if any of them (key, Policy, X-Amz-Signature, ...) is missing.
  Object.entries(signed.fields).forEach(([name, value]) => {
    formData.set(name, value);
  });
  // The file must come after the policy fields
  formData.append('file', file);
  return axios.post(signed.url, formData);
}

const FileUpload: React.FC = () => {
  const fileInputRef = useRef<HTMLInputElement>(null);

  const handleFileChange = async (event: React.ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0];
    if (!file) return;

    try {
      const response = await axios.get<PresignedPost>(
        'http://localhost:5000/generate-presigned-url',
        { params: { filename: file.name } }
      );

      await postToS3(file, response.data);

      console.log('File uploaded successfully');
    } catch (error) {
      console.error('Error uploading file:', error);
    }
  };

  const handleClick = () => {
    fileInputRef.current?.click();
  };

  return (
    <div className="block-container">
      <button onClick={handleClick} className="show-more">
        Upload File
      </button>
      <input
        type="file"
        onChange={handleFileChange}
        className="hidden"
        ref={fileInputRef}
        style={{ display: 'none' }}
      />
    </div>
  );
};

export default FileUpload;
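After a successful POST, the object’s access link follows the same pattern as in the first example. A sketch, assuming the bucket and region values are known to the frontend (or returned by the backend alongside the pre-signed data):

// Illustrative: derive the access link from the pre-signed response.
const objectUrl = `https://${bucket}.s3.${region}.amazonaws.com/${signed.fields.key}`;
console.log(objectUrl);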

I know this was a long process, but it’s complete now. Through this, I’ve learned how to upload files and images to storage in a better, more efficient way. The code hasn’t been fully tested yet, so please review it and point out any mistakes I may have made.

Conclusion

Using pre-signed URLs is an optimized approach for uploading files directly to S3. It minimizes server storage issues, reduces bandwidth usage, and speeds up the upload process for large files. By generating pre-signed URLs on the backend and using them on the frontend, you can securely and efficiently upload files to S3. This method leverages the strengths of both frontend and backend while keeping your AWS credentials secure and minimizing server overhead.
