Build and deploy your website seamlessly using GitHub Actions, AWS S3, CloudFront and Lambda

This article will help you automate the deployment of your frontend work using GitHub and AWS.

Before we dive deep into achieving our goal, let’s first define what each service mentioned in the title will be used for.

Services

  1. GitHub Actions to build a pipeline that automatically uploads your latest build code to a defined S3 bucket on every push
  2. S3 to store your build code and host your static website
  3. CloudFront, a CDN (Content Delivery Network) service, to serve your website and improve its availability across the world
  4. Lambda, a serverless component that is triggered whenever new code is uploaded, ensuring that CloudFront always serves the latest version

Implementation

Website repository

The next step is to create the S3 bucket that will store your website code. We have named our sample bucket simple-website-ci.

S3 bucket

Next, we will define a GitHub Actions workflow so that on every push to the master branch of our repository, the code is automatically uploaded to S3. To do this, we first need to store the AWS credentials (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) as GitHub secrets. We will also store the name of the S3 bucket we are uploading to as AWS_S3_BUCKET. You can do that from your code repository homepage under Settings -> Secrets.

AWS Credentials stored securely in GitHub Secrets

GitHub Actions is the service that will help us automate the deployment of our website. You can reach it by clicking Actions on your repo page, then clicking set up a workflow yourself. You will see a generated template main.yml file under the .github/workflows directory, which we will modify to suit our needs. The modified file (main.yml) looks like this:

name: website - deploy to S3 bucket

on:
  push:
    branches:
      - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v1
        with:
          ref: refs/heads/master
          fetch-depth: 1
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: "ap-east-1"
      - name: Push files to S3
        run: aws s3 sync . s3://${{ secrets.AWS_S3_BUCKET }} --size-only --exclude ".github/*"

The workflow file essentially uploads the code files corresponding to the latest commit on the master branch to S3. It uses the AWS credentials we defined on the Secrets page, accessed as secrets.<NAME>. It excludes the .github workflow files, which are not needed for the website itself.

Make sure that the IAM user whose credentials you stored has sufficient permissions to upload to the S3 bucket of your choice
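As a rough sketch, an IAM policy along these lines (using the sample bucket name simple-website-ci) is enough for aws s3 sync to list the bucket and upload objects:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::simple-website-ci"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::simple-website-ci/*"
    }
  ]
}
```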

Once you commit the main.yml file, your workflow will start and can be viewed under the Actions page.

Workflow visualisation

Once the workflow has successfully finished, the uploaded files appear in your S3 bucket as shown:

Uploaded code files in S3

Now that our website code is in S3, we need to configure the bucket to host a static website. In the S3 top menu bar, go to Properties, click on Static website hosting, and point the Index document to index.html
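If you prefer to script this step, a minimal sketch with boto3 could look like the following (the bucket name is the article's sample; adjust to yours):

```python
def website_config(index_doc="index.html", error_doc=None):
    """Build the WebsiteConfiguration payload for S3 static hosting."""
    config = {"IndexDocument": {"Suffix": index_doc}}
    if error_doc:
        config["ErrorDocument"] = {"Key": error_doc}
    return config


def enable_static_hosting(bucket="simple-website-ci"):
    # boto3 is imported lazily so the pure helper above can be used
    # without AWS credentials or the boto3 dependency installed.
    import boto3
    s3 = boto3.client("s3")
    s3.put_bucket_website(Bucket=bucket, WebsiteConfiguration=website_config())

# Usage (requires AWS credentials with s3:PutBucketWebsite permission):
#   enable_static_hosting()
```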

Static website on S3

The website is now accessible at the endpoint generated under Static website hosting.

Ensure that the bucket objects are publicly readable in order for the website to be publicly available
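For reference, the website endpoint follows a predictable pattern; here is a small sketch. Note that this is the dotted form used by newer regions such as ap-east-1, while some older regions use s3-website-<region> with a dash instead of a dot:

```python
def website_endpoint(bucket, region="ap-east-1"):
    """Return the S3 static-website endpoint for a bucket.

    Uses the dotted form (newer regions, e.g. ap-east-1); some older
    regions use the dashed form s3-website-<region> instead.
    """
    return f"http://{bucket}.s3-website.{region}.amazonaws.com"

print(website_endpoint("simple-website-ci"))
```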

To improve website availability and latency across the world, we will put a CloudFront distribution (AWS's CDN) in front of the S3 bucket hosting the website. CloudFront also provides a default SSL certificate and can redirect HTTP traffic to HTTPS, securing your website.

In the AWS console, go to CloudFront -> Create Distribution and select Web Distribution. Copy your S3 website endpoint and paste it into the Origin Domain Name field, and choose the option to Redirect HTTP to HTTPS.

Then click Create Distribution. It takes around 15–30 minutes to deploy the distribution. Once the status is Deployed, click on the distribution to find the HTTPS CloudFront link under Domain Name, as shown:

CloudFront Distribution

If you have your website domain ready you can define it under the Alternate Domain Names (CNAMEs) field

Now that our CloudFront distribution is ready, there is one catch: a new code deployment to the S3 bucket will not be reflected automatically in the distribution, because CloudFront caches content at its edge locations for a certain period (normally 24 hours). We need to invalidate the CloudFront cache in order to serve the most up-to-date content. One way to do this is to create a Lambda function that invalidates the cache automatically whenever the S3 bucket changes.

Lambda functions are AWS serverless components that enable us to run code without provisioning any compute resources. This makes them useful for carrying out asynchronous tasks triggered by various events. In our case, the Lambda function will be executed when there is a change in the S3 bucket hosting the website.

In the AWS console, go to Lambda -> Create Function. Choose a Python 3.x runtime and create a new execution role.

Lambda function

The function code to invalidate CloudFront cache is as follows:

import time

import boto3


def lambda_handler(event, context):
    # The S3 event tells us which object changed and in which bucket.
    path = "/" + event["Records"][0]["s3"]["object"]["key"]
    bucket_name = event["Records"][0]["s3"]["bucket"]["name"]

    print("Path: " + path + " Bucket name: " + bucket_name)

    # Look up the CloudFront distribution id stored as a tag on the bucket.
    client = boto3.client("s3")
    tags = client.get_bucket_tagging(Bucket=bucket_name)
    distribution_id = None
    for tag in tags["TagSet"]:
        if tag["Key"] == "distribution_id":
            distribution_id = tag["Value"]
            print("CDN id: " + distribution_id)
            break

    # Invalidate the cached copy of the changed object at the edge locations.
    client = boto3.client("cloudfront")
    invalidation = client.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": [path]},
            "CallerReference": str(time.time()),
        },
    )

On your Lambda function page, under Designer, click on Add trigger and choose S3 as your trigger. Fill in your bucket details with the Event type set to All object create events.

Now we need to define the distribution_id tag that the Lambda function code uses to identify our CloudFront distribution. First, copy the distribution ID from the CloudFront section of the AWS console, then go to your S3 bucket -> Properties and define the distribution_id tag under Tags.
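This step can also be scripted; here is a minimal sketch with boto3 (the distribution ID below is a placeholder):

```python
def distribution_tag_set(distribution_id):
    """Build the TagSet the Lambda function expects on the bucket."""
    return [{"Key": "distribution_id", "Value": distribution_id}]


def tag_bucket(bucket, distribution_id):
    # Lazy import keeps the pure helper above usable without boto3/AWS.
    import boto3
    s3 = boto3.client("s3")
    # Note: put_bucket_tagging replaces the bucket's entire tag set,
    # so merge in any existing tags you want to keep.
    s3.put_bucket_tagging(
        Bucket=bucket,
        Tagging={"TagSet": distribution_tag_set(distribution_id)},
    )

# Usage (requires AWS credentials; the distribution ID is a placeholder):
#   tag_bucket("simple-website-ci", "E1XXXXXXXXXXXX")
```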

CloudFront Distribution ID added as a tag in S3

Now we want to make sure that the Lambda function has the appropriate IAM permissions to access the S3 bucket and the CloudFront distribution. Go to your Lambda function, click on the Permissions tab, and then the Role name.

Lambda IAM Role

Ensure that the Lambda function's role has S3 read and cloudfront:CreateInvalidation permissions. Once the permissions are in place, head over to your Lambda function to test it.
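As a sketch, the role's policy needs statements along these lines (in addition to the basic Lambda logging permissions); the function above only reads bucket tags, so s3:GetBucketTagging is the specific S3 action required:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetBucketTagging"],
      "Resource": "arn:aws:s3:::simple-website-ci"
    },
    {
      "Effect": "Allow",
      "Action": ["cloudfront:CreateInvalidation"],
      "Resource": "*"
    }
  ]
}
```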

Create a test event using the s3-put template and replace the sample bucket name with your bucket name and sample bucket ARN with your bucket ARN.
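The relevant shape of the s3-put test event, and the fields the handler reads from it, can be sketched as follows (real events carry many more fields, which the handler ignores):

```python
# Trimmed-down s3-put test event; only the fields the handler reads.
SAMPLE_EVENT = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "simple-website-ci"},
                "object": {"key": "index.html"},
            }
        }
    ]
}


def extract_invalidation_path(event):
    """Mirror the handler's first lines: pull the path and bucket name."""
    record = event["Records"][0]["s3"]
    return "/" + record["object"]["key"], record["bucket"]["name"]

print(extract_invalidation_path(SAMPLE_EVENT))
```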

Testing the Lambda Function

Once your test is executed, head over to your CloudFront distribution; under the Invalidations tab you will see the ongoing invalidation you just triggered:

Ongoing Cache Invalidation

Finally, let's make a small commit to our GitHub repository, changing the website heading from “Simple Website” to “Simple CI Website”, to demonstrate the power of continuous integration using all the tools described above.

Changes reflected automatically
