Terraform Use Case with AWS

Today I am going to discuss one use case of Terraform with AWS. I will show a Terraform script I created that performs the following tasks.

Task 1 - Create a key pair and a security group that allows port 80.
Task 2 - Launch an EC2 instance.
Task 3 - In this EC2 instance, use the key and security group created in step 1.
Task 4 - Launch one EBS volume and mount it on /var/www/html.
Task 5 - The developer has uploaded the code to a GitHub repo, which also contains some images. Copy the GitHub repo code into /var/www/html.
Task 6 - Create an S3 bucket, copy/deploy the images from the GitHub repo into the S3 bucket, and change their permission to public readable.
Task 7 - Create a CloudFront distribution using the S3 bucket (which contains the images) and use the CloudFront URL to update the code in /var/www/html.

Before performing the above tasks, I want to discuss the prerequisites.

  • You have Terraform installed on your system.
  • You have an AWS account configured with an Access Key & Secret Key.
  • Configure the AWS CLI on your system with "aws configure --profile <name>".
  • You must have a key pair, either created on AWS or generated with the "ssh-keygen" command.

Let's continue with our tasks step by step.

Task 1 - Create a key pair and a security group that allows port 80.

Before we create the key & security group, we create and initialize our workspace, which Terraform uses to maintain its state. You can list your workspaces with "terraform workspace list".

Create workspace

mkdir -p terra/task1/

cd terra/task1

Now create a .tf file.

vim ec2.tf

First, mention which provider you are going to use (AWS, Azure, etc.).

provider
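
A minimal sketch of what the provider block can look like (the region and profile name below are placeholders, not necessarily the values used in the original script):

provider "aws" {
  region  = "ap-south-1"
  profile = "myprofile"   # the profile created with "aws configure --profile <name>"
}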

Now create a key pair & security group. For the key, you can either create it via Terraform on AWS, reference an existing key pair, or provide a public key generated on your system. I opted for the third option.

key
Security Group
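
A rough sketch of both resources, assuming the public key sits at ~/.ssh/id_rsa.pub and that port 22 is also opened for the SSH provisioners used later (resource names and CIDR ranges are placeholders):

resource "aws_key_pair" "task_key" {
  key_name   = "task1-key"
  public_key = file(pathexpand("~/.ssh/id_rsa.pub"))   # public key generated with ssh-keygen
}

resource "aws_security_group" "allow_http" {
  name        = "allow_http"
  description = "Allow HTTP and SSH inbound traffic"

  ingress {
    description = "HTTP"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    description = "SSH"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}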

Task 2 - Launch an EC2 instance.
Task 3 - In this EC2 instance, use the key and security group which we created in step 1.

Now we perform tasks 2 & 3: create the EC2 instance and attach the key and security group we created.

Also, I have installed httpd, git & php for future use.

EC2
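
A sketch of the instance resource; the AMI ID and instance type are placeholders (an Amazon Linux 2 AMI in ap-south-1 is assumed), and the remote-exec provisioner installs httpd, git & php as described above:

resource "aws_instance" "web" {
  ami                    = "ami-0447a12f28fddb066"   # placeholder Amazon Linux 2 AMI
  instance_type          = "t2.micro"
  key_name               = aws_key_pair.task_key.key_name
  vpc_security_group_ids = [aws_security_group.allow_http.id]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = file(pathexpand("~/.ssh/id_rsa"))
    host        = self.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd git php -y",
      "sudo systemctl start httpd",
      "sudo systemctl enable httpd",
    ]
  }

  tags = {
    Name = "task1-web"
  }
}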

Task 4 - Launch one EBS volume and mount it on /var/www/html.

Task 5 - The developer has uploaded the code to a GitHub repo, which also contains some images. Copy the GitHub repo code into /var/www/html.

Launch the EBS volume in the same availability zone as the EC2 instance, and attach it to the EC2 instance so it can be mounted on "/var/www/html".

First, create the EBS volume.

Now attach it to the EC2 instance.
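
A sketch of the volume and its attachment; the size and device name are assumptions:

resource "aws_ebs_volume" "web_vol" {
  availability_zone = aws_instance.web.availability_zone   # same AZ as the instance
  size              = 1

  tags = {
    Name = "task1-ebs"
  }
}

resource "aws_volume_attachment" "web_vol_attach" {
  device_name  = "/dev/sdh"
  volume_id    = aws_ebs_volume.web_vol.id
  instance_id  = aws_instance.web.id
  force_detach = true
}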

Using a null resource, SSH into the instance with the key we created, format and mount the volume, and copy the code from the Git repo into /var/www/html/.

I have done this to make our data persistent, since EC2 instance storage is ephemeral in nature.

EBS Attach to /var/www/html
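
A sketch of the null resource that formats the volume, mounts it on /var/www/html and clones the repo; the device name and repo URL are placeholders:

resource "null_resource" "mount_and_clone" {
  depends_on = [aws_volume_attachment.web_vol_attach]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = file(pathexpand("~/.ssh/id_rsa"))
    host        = aws_instance.web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mkfs.ext4 /dev/xvdh",             # /dev/sdh appears as /dev/xvdh inside the instance
      "sudo mount /dev/xvdh /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/<user>/<repo>.git /var/www/html/",   # placeholder repo URL
    ]
  }
}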

Task 6 - Create an S3 bucket, copy/deploy the images from the GitHub repo into the S3 bucket, and change their permission to public readable.

I have created an S3 bucket and copied into it the image that came with the repo cloned in the previous task. To send data to S3, I first had to configure the AWS CLI. Also, this task should run after the tasks above, so I use "depends_on" to run the code in sequence.

First, we create an S3 bucket with public read access.

Create S3
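
A sketch of the bucket resource, assuming an AWS provider version (v2/v3) where the acl argument is still set directly on the bucket; the bucket name is a placeholder:

resource "aws_s3_bucket" "image_bucket" {
  bucket = "task1-image-bucket-0001"   # placeholder; S3 bucket names must be globally unique
  acl    = "public-read"

  tags = {
    Name = "task1-image-bucket"
  }
}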

Now copy the image into the S3 bucket with public access permission, and update the image URL in the code to the S3 URL. The image you see now comes from S3.

From the output we get all the info about the S3 bucket.
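
The article pushes the image with the AWS CLI; as an alternative, a Terraform-native sketch of the same upload and the output could use aws_s3_bucket_object (the object key and local source path are assumptions):

resource "aws_s3_bucket_object" "web_image" {
  depends_on = [aws_s3_bucket.image_bucket, null_resource.mount_and_clone]

  bucket = aws_s3_bucket.image_bucket.bucket
  key    = "image.png"    # placeholder object key
  source = "image.png"    # placeholder local copy of the image from the repo
  acl    = "public-read"
}

output "s3_bucket_info" {
  value = aws_s3_bucket.image_bucket
}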

Task 7 - Create a CloudFront distribution using EC2 and use the CloudFront URL.

Here I have created a CloudFront distribution using the EC2 public DNS as the origin.

CloudFront
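
A sketch of the distribution with the EC2 public DNS as a custom origin; the cache behavior and restriction settings below are minimal defaults, not necessarily the original configuration:

resource "aws_cloudfront_distribution" "web_cdn" {
  enabled = true

  origin {
    domain_name = aws_instance.web.public_dns
    origin_id   = "web-ec2-origin"

    custom_origin_config {
      http_port              = 80
      https_port             = 443
      origin_protocol_policy = "http-only"
      origin_ssl_protocols   = ["TLSv1", "TLSv1.1", "TLSv1.2"]
    }
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "web-ec2-origin"
    viewer_protocol_policy = "allow-all"

    forwarded_values {
      query_string = false

      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}

output "cloudfront_url" {
  value = aws_cloudfront_distribution.web_cdn.domain_name
}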

Now your script is ready. Run it and everything will be set up automatically for you.

terraform init

terraform apply -auto-approve

To delete everything, run:

terraform destroy -auto-approve

I will make further enhancements by integrating it with Jenkins. Stay tuned for that, and please give a clap if you found this article useful.

GitHub : Terraform code

Thanks For Reading!!!

Vikas Goswami

DevOps Engineer