Automated deployment to AWS Elastic Beanstalk using GitHub Actions

Andrey, Seamless Cloud
Jul 25, 2020

Welcome to the second part of the series of posts where I guide you through the steps of creating a modern web application in Python and deploying it to the cloud using Elastic Beanstalk. By the end of this article, you will have automated deployment of your application to AWS Elastic Beanstalk using GitHub Actions.

Prerequisites:

  • Basic Python knowledge.
  • A Flask application deployed to AWS Elastic Beanstalk (covered in the previous post in this series).


Let’s jump right into it!

Putting your code into GitHub

Hopefully, you’ve already heard of GitHub and all the benefits you get by storing your code in a version control system. Let’s create a repo:

Then, you need to clone the repo and add the code you have written in the previous article:

After you commit your changes, the repo should look something like this:

Working with GitHub Actions

GitHub Actions is a continuous integration/continuous deployment (CI/CD) service that was still quite new at the time of writing. GitHub arrived late to the CI/CD game, but the tool works like a charm. Given that it is natively integrated with GitHub, where we already store our code, it is an excellent choice for software workflow automation.

In GitHub Actions, to describe a workflow, you need to create a .yml file in the folder .github/workflows/. Let's create one and name it deploy-to-eb.yml. Then, put the following code inside:

name: deploy-to-eb

on:
  push:
    branches: [ master ]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

There are three top-level sections here. The name is obviously the name of this workflow. on defines triggers. In our case, we will trigger our deployment on any push to the master branch. The third section is jobs, and it contains the actual logic of the workflow. We have only one job, and it is called deploy.

The job has a runs-on key that specifies the environment; in our case, the script will execute in an Ubuntu environment. The job consists of steps. We have only one step for now, defined as uses: actions/checkout@v2. This is a reference to one of the "actions" in the GitHub library of actions. It checks out the code we've pushed to master and makes it available to the workflow. You can read more about it here.

As you’ve already figured out, our workflow does not deploy anything yet. We’ve just laid out the foundation for it. Now we need to implement the actual deployment flow. Here we go.

Creating an S3 bucket

We need to create an S3 bucket in AWS. If you don’t know what S3 is, you can read about it here. Why do we need it? Our deployment flow looks like this: GitHub Actions zips the code, uploads the archive to an S3 bucket, and then tells Elastic Beanstalk to deploy it from there.

This indirection is necessary because you cannot upload your code directly to Elastic Beanstalk. We already have two of the three components in place, so let’s go ahead and create the S3 bucket. Go to your AWS console and search for the S3 service, just as we did for Elastic Beanstalk in the previous article.

You will see that one bucket already exists; Elastic Beanstalk created it automatically. Let’s click the Create bucket button.

Choose a unique bucket name (bucket names are global across all AWS accounts), keep all other settings at their defaults, and create the bucket.
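Bucket names have documented constraints: 3 to 63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit. Here is a rough local check (a sketch of the main rules, not an exhaustive validator) that can save you a failed attempt in the console:

```python
import re

# Rough S3 bucket-name check: 3-63 chars of lowercase letters, digits,
# dots, and hyphens; must start and end with a letter or digit.
# A sketch of the main documented rules, not an exhaustive validator.
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("my-awesome-app-deploy-andrey"))  # True
print(is_valid_bucket_name("MyBucket"))                      # False: uppercase
```

Uniqueness, of course, can only be verified by AWS itself when you create the bucket.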

Putting your AWS credentials into GitHub

We need AWS credentials to make API requests to our AWS account from GitHub Actions. For safety reasons, it’s better to create a new user in AWS who will be dedicated to GitHub Actions. Open your AWS console, choose IAM in services, then click on Users in the navigation panel on the left.

Then click Add user. Let's name our user github_actions and give it programmatic access to our account.

When you click Next, AWS suggests you add the user to a group. We need a group that has access to S3 Buckets and Elastic Beanstalk. Let's click Create group.

I named this group the same as my user and checked two checkboxes granting full access to S3 and Elastic Beanstalk. Strictly speaking, we don’t need full access, but we will leave it like this to keep the tutorial simple. Let’s keep the other settings at their defaults and click “next” until we have successfully created the user. On the last screen you’ll see this:

We need the Access key ID and Secret access key. Those are the credentials we were looking for. Secrets are no joke, and they need to live in a secure place. Fortunately, GitHub has Secrets. You can find this feature in the "Settings" of your repo:

Let’s add values we’ve just copied from AWS as Secrets. Let’s name them ACCESS_KEY_ID and SECRET_ACCESS_KEY.
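For context on how these secrets reach the deployment steps: the configure-aws-credentials action (used below) exports them as the standard AWS environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY), which is where the aws CLI and SDKs look for them. A minimal sketch of that lookup, with placeholder values:

```python
import os

def load_aws_credentials() -> dict:
    """Read AWS credentials the way the CLI and SDKs do: from the
    standard environment variables. Raises if one is missing."""
    creds = {}
    for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"):
        value = os.environ.get(var)
        if not value:
            raise RuntimeError(f"{var} is not set")
        creds[var] = value
    return creds

# Simulate what the configure-aws-credentials step does for later steps.
# These are placeholders, not real keys.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIA_EXAMPLE"
os.environ["AWS_SECRET_ACCESS_KEY"] = "secret_example"
print(load_aws_credentials()["AWS_ACCESS_KEY_ID"])  # AKIA_EXAMPLE
```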

Finishing the GitHub Actions workflow

Now let’s put a bunch of code into our workflow definition. Watch out for indentation: YAML is sensitive to it, and you will get syntax errors if you’re not careful.

name: deploy-to-eb

on:
  push:
    branches: [ master ]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: Create ZIP deployment package
        run: zip -r deploy_package.zip ./

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.SECRET_ACCESS_KEY }}
          aws-region: "us-east-1"

      - name: Upload package to S3 bucket
        run: aws s3 cp deploy_package.zip s3://my-awesome-app-deploy-andrey/

      - name: Create new ElasticBeanstalk Application Version
        run: |
          aws elasticbeanstalk create-application-version \
          --application-name MyAwesomeApp \
          --source-bundle S3Bucket="my-awesome-app-deploy-andrey",S3Key="deploy_package.zip" \
          --version-label "ver-${{ github.sha }}" \
          --description "commit-sha-${{ github.sha }}"

      - name: Deploy new ElasticBeanstalk Application Version
        run: aws elasticbeanstalk update-environment --environment-name Myawesomeapp-env-1 --version-label "ver-${{ github.sha }}"

Don’t worry. I’ll explain it section by section.

Create ZIP deployment package

This section puts all our code into the archive named deploy_package.zip. As simple as that.
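For illustration, the same packaging step can be sketched in Python with the standard zipfile module. The key detail is that files are archived with paths relative to the project root, since Elastic Beanstalk expects application.py at the top level of the archive:

```python
import os
import zipfile

def create_deploy_package(source_dir: str, archive_path: str) -> list:
    """Zip every file under source_dir, preserving relative paths
    (like `zip -r` run from inside the project folder), and return
    the list of archived names."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, source_dir))
    with zipfile.ZipFile(archive_path) as zf:
        return zf.namelist()
```

This is why the workflow zips the current directory (./) rather than zipping the project folder itself, which would nest everything one level too deep.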

Configure AWS Credentials

Here we use another action from the library of actions generated by the GitHub community. This is the place where we get our secret values and put them into the environment of our workflow. We also need to define aws-region. You can check your region in the top right corner of AWS console:

Upload package to S3 bucket

In this section, we upload our archive with the code to the S3 bucket. Make sure to replace the bucket name with the one you created in the “Creating an S3 bucket” section of this post.

Create new ElasticBeanstalk Application Version

This is a long command, let’s take it piece by piece:

  • --source-bundle S3Bucket="my-awesome-app-deploy-andrey",S3Key="deploy_package.zip" This is where we let Elastic Beanstalk know where to find the application code for this app version.
  • --version-label "ver-${{ github.sha }}" Versions should be unique. It makes sense to link them with GitHub commits, so it's easier to debug issues in the future.
  • --description "commit-sha-${{ github.sha }}" For even more clarity - let's also put commit version into the description.
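The version label and description are just strings built from the commit SHA that GitHub Actions exposes as github.sha. The same construction in Python, with a made-up SHA, looks like this:

```python
def eb_version_metadata(commit_sha: str) -> dict:
    """Build the --version-label and --description values used in the
    create-application-version step. Labels must be unique within an
    application, and a full 40-character commit SHA guarantees that."""
    return {
        "version_label": f"ver-{commit_sha}",
        "description": f"commit-sha-{commit_sha}",
    }

meta = eb_version_metadata("3f786850e387550fdab836ed7e6dc881de23001b")
print(meta["version_label"])  # ver-3f786850e387550fdab836ed7e6dc881de23001b
```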

Deploy a new ElasticBeanstalk Application Version

In this section, we simply tell Elastic Beanstalk to update the application to the latest version. Please make sure to replace Myawesomeapp-env-1 with the name of your Elastic Beanstalk environment.

Deploying a new application version with GitHub Actions

To check that everything worked out well, let’s change something in our application. There is not much to change right now except for the “Hello World!” line, so let’s write “Hello GitHub Actions World!”.

from flask import Flask

application = Flask(__name__)

@application.route('/')
def hello_world():
    return 'Hello GitHub Actions World!'
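Before pushing, you can sanity-check the change locally with Flask's built-in test client, which hits the route without running a server. The app is redefined here so the snippet is self-contained; in your repo you would import it from application.py instead:

```python
from flask import Flask

application = Flask(__name__)

@application.route('/')
def hello_world():
    return 'Hello GitHub Actions World!'

# Flask's test client exercises the route in-process.
with application.test_client() as client:
    response = client.get('/')
    assert response.status_code == 200
    assert response.get_data(as_text=True) == 'Hello GitHub Actions World!'
    print("local check passed")
```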

Let’s commit everything to master in our repo. You can see our GitHub Action in action (ha-ha) in the Actions tab in the repo.

Now, let’s check our Elastic Beanstalk environment. It can take some time for the application to deploy after the workflow finishes. We can check the application status in the AWS console:

If you see this green “OK” sign, let’s click on the URL of our application.

Voila! Now you just commit your code to master, and the application will be magically updated. However, now you know the magic behind it.

If you have any issues, please shoot me an email, and I’ll try to help you. The full code is available here.

In “Part 3: AWS Elastic Beanstalk infrastructure in code with Terraform.” (writing is in progress) we will put our infrastructure in the code. Exciting stuff.

You can read more of our blog here.

Originally published at https://blog.seamlesscloud.io on July 25, 2020.
