Published in Game Tech Tutorial
How to Update Your Static Website on AWS S3?

AWS S3 is an object storage service for saving files such as images, audio, videos, and HTML.

But you might not know that S3 can also cut the cost of hosting a static website, securely, without maintaining your own HTTP server on AWS EC2.

In this post, we will learn how to set up an S3 bucket and write a deployment script, so you can update a static website automatically and speed up your workflow.

Step 1: Create an S3 bucket

  • Log in to the AWS Console → Amazon S3 → Create bucket
  • Remember to select ACLs enabled under Object Ownership
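If you prefer the command line, the same bucket can be created with the AWS CLI — a sketch, where the bucket name `my-static-site-bucket` and the `ap-northeast-1` region are placeholders to replace with your own; `--object-ownership ObjectWriter` is what enables ACLs on a new bucket:

```shell
# Create the bucket with ACLs enabled (bucket name and region are placeholders)
aws s3api create-bucket \
    --bucket my-static-site-bucket \
    --region ap-northeast-1 \
    --create-bucket-configuration LocationConstraint=ap-northeast-1 \
    --object-ownership ObjectWriter
```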

Step 2: Set up an IP whitelist (Optional)

If you want to restrict access to your static website, you need to set up the Bucket Policy below.

  • Go to your S3 bucket → Permissions → Bucket policy → click Edit
  • Create the policy below, replacing <Your Bucket Name> and <Allowed IP> with your own values, to restrict access to your S3 bucket
"Version": "2012-10-17",
"Id": "Policy1592300398519",
"Statement": [
"Sid": "Stmt1592300396008",
"Effect": "Deny",
"Principal": "*",
"Action": "s3:*",
"Resource": "arn:aws:s3:::<Your Bucket Name>/*",
"Condition": {
"NotIpAddress": {
"aws:SourceIp": [
<Allowed IP>
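Once the policy JSON is saved to a local file, it can also be validated and applied from the command line instead of the console — a sketch, where `policy.json` and the bucket name are placeholders:

```shell
# Sanity-check the JSON locally before uploading it
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"

# Attach the policy to the bucket (bucket name is a placeholder)
aws s3api put-bucket-policy \
    --bucket my-static-site-bucket \
    --policy file://policy.json
```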

Step 3: Set up IAM

  • Create an IAM user

In this step, you will create an IAM user and obtain an Access Key ID and Secret Access Key.

  • Create the following policy and attach it to the user
  • Replace <Your S3 Bucket Name> below
"Version": "2012-10-17",
"Statement": [
"Action": [
"Effect": "Allow",
"Resource": "arn:aws:s3:::*"
"Sid": "VisualEditor1",
"Effect": "Allow",
"Action": "s3:*",
"Resource": [
"arn:aws:s3:::<Your S3 Bucket Name>",
"arn:aws:s3:::<Your S3 Bucket Name>/*"

Step 4: Create scripts to upload files to S3

Since we want to automate deployment, we need to create scripts to upload files to S3. Then we can easily integrate these scripts with any CI/CD tools like Jenkins.

4.1 Set up AWS CLI credentials

Install the AWS CLI on your Linux machine and run:

aws configure --profile <Your Account>
AWS Access Key ID [****************]: <Input Your Access Key>
AWS Secret Access Key [****************]: <Input Your Secret Access Key>
Default region name [ap-northeast-1]: <Input Region>
Default output format [json]: json
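`aws configure` stores the profile in two plain-text files under `~/.aws/`; checking them is a quick way to confirm the profile name you will pass to the deploy script later. The values shown in the comments are placeholders:

```shell
# The credentials file holds one section per profile (values are placeholders)
cat ~/.aws/credentials
# [<Your Account>]
# aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
# aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# Region and output format live in ~/.aws/config
cat ~/.aws/config
# [profile <Your Account>]
# region = ap-northeast-1
# output = json
```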

4.2 Setup s3cmd

Install s3cmd

On CentOS/RHEL and Fedora:

sudo dnf install s3cmd

On Ubuntu/Debian:

sudo apt-get install s3cmd

Set up s3cmd and enter your IAM Access Key ID and Secret Access Key when prompted

s3cmd --configure


  • In 4.1 and 4.2, you need to set up the IAM profile, Access Key ID, and Secret Access Key on the Linux machine where your CI tool (such as Jenkins) runs.

4.3 Create a script to upload files

  • Create the script below with vim
  • Replace <Your Bucket Name> and <Your Profile Name> with your own values
#!/bin/bash
if [ -z "$1" ]; then
    echo "Input web folder path parameter is empty"
    exit 1
fi

WEB_FOLDER="$1"
S3_BUCKET="s3://<Your Bucket Name>"
AWS_PROFILE="<Your Profile Name>"
AWS_DEFAULT_REGION="<Your Region Name>"

## Upload files to S3
aws s3 cp --recursive ${WEB_FOLDER} --profile ${AWS_PROFILE} --region ${AWS_DEFAULT_REGION} ${S3_BUCKET}/html

# Setup Cache Clear header
s3cmd --recursive modify --add-header="Cache-Control:max-age=0, no-cache, no-store, must-revalidate" ${S3_BUCKET}/html

# Make it public
s3cmd setacl ${S3_BUCKET}/html --acl-public --recursive
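The empty-argument guard at the top of the script can be exercised locally without touching AWS. A minimal, self-contained sketch of the same logic — the function name is made up for illustration:

```shell
#!/bin/bash
# Reproduce the script's guard as a function so it can be tested locally
check_web_folder() {
    if [ -z "$1" ]; then
        echo "Input web folder path parameter is empty"
        return 1
    fi
    echo "Deploying folder: $1"
}

check_web_folder ""          # prints the error message and returns 1
check_web_folder "./public"  # prints "Deploying folder: ./public"
```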


Why do we need to set up the Cache Clear header?

# Setup Cache Clear header
s3cmd --recursive modify --add-header="Cache-Control:max-age=0, no-cache, no-store, must-revalidate" ${S3_BUCKET}/html

Because this header prevents the client’s browser from caching the files, visitors always see the latest version without manually clearing the cache in the browser.
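After a deploy, you can confirm the header is actually being served — a sketch using the S3 static-website endpoint, where the bucket name and region in the URL are placeholders:

```shell
# Check the Cache-Control header on a deployed file (URL is a placeholder)
curl -sI http://my-static-site-bucket.s3-website-ap-northeast-1.amazonaws.com/html/index.html \
    | grep -i '^cache-control'
```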

  • Making files public is necessary because uploaded files are private on S3 by default.
# Make it public
s3cmd setacl ${S3_BUCKET}/html --acl-public --recursive
  • To improve performance, you can run a single command that syncs only changed files instead of uploading everything. The following command does the same work as the previous three commands.
# Sync files to S3, make them public, and set the cache clear header
aws s3 sync ${WEB_FOLDER} --profile ${AWS_PROFILE} --region ${AWS_DEFAULT_REGION} ${S3_BUCKET}/html --cache-control max-age=0 --acl public-read

Finally, we can simply run the command below to upload any web folder to S3.

./ <Your WebFolder>
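In a CI tool such as Jenkins, this call becomes a single build step. A sketch of a Jenkins “Execute shell” step — `deploy.sh` is a hypothetical name for the script above, and `${WORKSPACE}/dist` stands in for wherever your build outputs the site:

```shell
# Jenkins "Execute shell" build step (script and folder names are hypothetical)
set -e                      # fail the build if any command below fails
chmod +x ./deploy.sh
./deploy.sh "${WORKSPACE}/dist"
```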


In this post, you have learned how to create an AWS S3 bucket and write scripts to upload files to S3.

You can utilize these scripts in any CI/CD tool to automate deployment of a static website and improve your work efficiency.

Thanks for reading.

Eric Wei

Senior Full Stack Engineer & Solution Architecture | AWS, GCP | Cloud, Unity Game Development, SDK, DevOps, and more.