
Getting Started with AWS CLI, Windows PowerShell & JSON Parser

Let's learn how to work with AWS using the AWS CLI, along with Windows PowerShell scripting and JSON parsing.

Raktim Midya | Oct 13, 2020 | 13 min read


This article will help you learn the basics of the AWS Command Line Interface. We will also learn how to work with Windows PowerShell and a JSON parser. Then we will integrate these pieces into one automation script that provisions some resources on AWS. Let's start one by one…

AWS CLI :


The AWS Command Line Interface (CLI) is a unified tool to manage AWS services. With just one tool to download and configure, we can control multiple AWS services from the command line and automate them through scripts.

  • I don't want to waste your time explaining what the AWS CLI is in detail, because once we start the practical you will understand it easily. Simply put, it's a program that gives us one command called "aws". Using this command, we as AWS users can communicate with our AWS account and control nearly everything in it. The great thing about this program is that you can run it on any OS.
  • To know more about AWS CLI : https://aws.amazon.com/cli/

Installing AWS CLI:

The installation of the AWS CLI is very simple. Just download the installer from the AWS CLI page linked above and, like any other application, run it and keep clicking through the wizard until it is installed.

Once you are done with the installation, open your Command Prompt or terminal and run the command below to check whether it's working…
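For example, printing the version confirms the CLI is installed and on your PATH (this may not be the exact command from the original screenshot, but it serves the same check):

aws --version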

Configuring AWS CLI :

Before starting, we need an AWS access key and secret key for the configuration. Humans authenticate with a username and password, but to authenticate a program we use an access key and a secret key.

  • To get those, go to your AWS account from the AWS Web Console. Then go to the "IAM" service => click on "Users" => click on "Add User" and create one user.
  • Then select "Programmatic access"; if you read the description of this access type, you can see it gives us an "access key" and a "secret key". For reference, follow the screenshot below…
AWS IAM Screenshot
  • Next, go to "Permissions" and click on "Attach Existing Policies". If you know IAM, you can select your desired policy, but to keep things simple I am giving "Administrator Access". For reference, see the screenshot below…
AWS IAM Screenshot
  • Next, click on "Tags" and give any tag you like. Then click on "Review" and then "Create User". It will then give you the option to see your Access Key and Secret Key. Don't forget to download the credentials by clicking the download button, for future reference.

Now it's time to authenticate our AWS CLI with our AWS account. For that, go to the command line and type the command below.

aws configure
  • Next, provide your "access key" and "secret key", and then provide the region you want to work in. Lastly, leave "Output format" empty.
  • Next, to make sure your AWS CLI is able to communicate with your AWS account, run the command below; if no error comes back, we have successfully set up the AWS CLI…
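For example, a harmless read-only call such as listing EC2 instances works as a connectivity check (the original command isn't shown here, but the describe-instances call referenced in the next section fits the purpose):

aws ec2 describe-instances

On a fresh account this simply returns an empty result, which is exactly what the next section refers to.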

Basics of AWS CLI :

  • Usually, before running any command with the AWS CLI, you should have some knowledge of AWS and know which service you want to work on.
  • To find the basic command structure you can run the "aws help" command. Frankly speaking, the AWS CLI documentation is very good, so try to make the most of it through help.
  • After running help, keep pressing the space bar to scroll and press "q" to quit. Now my requirement is to check something in the "EC2" service, and if you read the help a little you will see there is a subcommand called "ec2".
  • Now let's see the help for "ec2". For that, run "aws ec2 help"; reading this output, we can see that the "aws ec2" command again has subcommands. One of them is "describe-instances", which shows the details of our instances. Now go back to the verification command we just ran: at that time I had no instance in my account, so the output showed nothing. (The commands are recapped just after this list.)
  • One thing to notice here: by default the AWS CLI gives us output in JSON format. So, to filter data from those outputs we need to pass them through a JSON parser.
  • Next, I am going to talk about the JSON parser, because once we learn it, the actual practical of provisioning resources with the AWS CLI becomes much easier to understand.
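To recap, these are the exploration commands used above; none of them changes anything in your account:

aws help
aws ec2 help
aws ec2 describe-instances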

JSON Parser (JQ) :


jq is a program for JSON parsing, i.e. fetching data out of JSON. jq is like sed for JSON data: you can use it to slice, filter, map and transform structured data with the same ease that sed, awk, grep and friends let you play with text. jq is written in portable C, and it has zero runtime dependencies.

Installation of JQ :

The installation of jq is very simple. I suggest following the below-mentioned YouTube link to install the jq program.
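If you already use the Chocolatey package manager on Windows, installing it from the command line is another option (assuming the package is published under the name jq, which is how it appears on Chocolatey):

choco install jq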

Basics of JQ :

Here also I don't want to talk much about JSON parsing, because I think once we start writing the automation script you will easily understand it. But if you don't know what JSON parsing is or how to work with jq, I suggest watching the below-mentioned YouTube video.
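As a tiny warm-up you can try in PowerShell, pipe a JSON string into jq and pull out a single field (the JSON here is just made-up sample data):

'{"Name": "Raktim", "Topic": "AWS CLI"}' | jq ".Name"

This prints "Raktim", because the filter .Name fetches the value stored under that key.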

Windows PowerShell :


PowerShell is an object-oriented automation engine and scripting language with an interactive command-line shell that Microsoft developed to help IT professionals configure systems and automate administrative tasks. PowerShell, built on the .NET framework, works with objects, whereas most command-line shells are based on text.

  • Frankly speaking, PowerShell is a huge topic to learn because it has lots of capabilities. But for our practical we only need the basic variable concept of PowerShell. It's nearly the same as using the "Bash Shell" on Linux: to store a value, start with the "$" sign, then the variable name, then an equals sign (=) and the value we want to store in that variable. For example, see the screenshot below…
PowerShell Screenshot
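In case the screenshot is hard to read, the idea boils down to this (the variable name and value are just an example):

$key_name = "MyWebKey"
Write-Output $key_name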

Now, instead of explaining more concepts, let's start building the automation script; once I explain every line of that script, you will understand these PowerShell and jq concepts very easily.

Let's see the Problem Statement :

  1. Create a key pair.
  2. Create a security group.
  3. Launch an instance using the above created key pair and security group.
  4. Create an EBS volume of 1 GB.
  5. The final step is to attach the above created EBS volume to the instance you created in the previous steps.

Pre-requisite :

  • In this article, I will not talk about these AWS resources themselves. My main focus is on how we can provision these resources using the AWS CLI, so I am assuming you already have knowledge of them.
  • Next, I have named my file "aws-cli.ps1". You can give it any name you want, but the extension should be "ps1". Also, I am using VSCode as my text editor, but you can use any editor you want.

As always, we will go through each portion of the script, and at the end I will provide the GitHub link from where you can download the entire script.

Setting up the Variables :

  • As you can see, I am storing some variables that we will later pass to the AWS commands (the full variable block is sketched just after this list). Here, I have set my key name to "MyWebKey" and the security group name to "WebSG".
  • Next, I selected the Amazon Linux 2 AMI ID and stored it in a variable called "image_id". Then I set my instance type to "t2.micro" and the instance count to 1.
  • Next, I selected the subnet in which I want to launch my instance and the Availability Zone where I want to provision my extra 1GB EBS volume. One very important thing to note here: make sure your subnet belongs to the same Availability Zone that you select for the extra EBS volume, because EBS is a zonal service.
  • Lastly, I set the volume size to 1, meaning 1GB, and stored it in a variable called "volume_size", and I set the volume type to "gp2" in another variable.
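The variable block itself is not shown above, so the sketch below shows how those variables can be declared. The AMI ID, subnet ID and Availability Zone are placeholders (they are account and region specific), so replace them with your own values:

$key_name = "MyWebKey"
$sg_name = "WebSG"
$image_id = "ami-xxxxxxxxxxxxxxxxx"   # placeholder: Amazon Linux 2 AMI ID for your region
$instance_type = "t2.micro"
$instance_count = 1
$subnet_id = "subnet-xxxxxxxx"        # placeholder: a subnet that lives in the same AZ as $az
$az = "us-east-1a"                    # placeholder: Availability Zone for the extra EBS volume
$volume_size = 1
$volume_type = "gp2"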

Creating AWS Key pair :

aws ec2 create-key-pair --key-name "$key_name" --query 'KeyMaterial' --output text | out-file -encoding ascii -filepath "$key_name.pem"
  • To create the AWS Key-pair I am using this above-mentioned command. Now let's understand how to write this command.
  • First, you already know that the key pair belongs to EC2; that's why we are using "aws ec2" as the main command, and under it we run one subcommand called "create-key-pair". This creates one AWS key pair.
  • Now this command has one argument called "--key-name". Using this argument we pass the key name that we want to set. Right after it we reference our variable, and since this is shell scripting, the variable is replaced by its value when the script runs.
  • Next, we have an option called "--query". This option works on the output of the aws command and fetches the value of whatever field we ask for. As I mentioned at the beginning, aws commands give output in JSON format, which stores data as key-value pairs.
  • One of the keys in the output of the create-key-pair command is "KeyMaterial". We fetch its value and, using "--output", convert it into text format data.
  • Next, we pass this text data, which contains our AWS private key, through the pipe symbol (|) to the "out-file" cmdlet; its "-encoding ascii" option encodes it, and with "-filepath" we store the key as a pem file, using the variable again so the file gets the same name we gave the key at creation time. (An optional check follows this list.)
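As a quick optional check, which is not part of the original script, you can confirm the key pair now exists in your account:

aws ec2 describe-key-pairs --key-name "$key_name"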

Creating Security Group :

$sg_id = aws ec2 create-security-group --group-name "$sg_name" --description "Security group allowing SSH" | jq ".GroupId"
aws ec2 authorize-security-group-ingress --group-id "$sg_id" --protocol tcp --port 22 --cidr 0.0.0.0/0
  • From here on we actually use JSON parsing. Every time we run an aws command it gives us output in JSON format, and using the pipe (|) symbol we pass that output to the next command; in our case, we pass the output of the aws command into the "jq" command.
  • Now let's understand the 1st line. Here we use the command "aws ec2 create-security-group" and pass parameters like the name and description; these are the simple parameters needed to create a security group. Then, from the output of this command, we filter out the "GroupId" using JSON parsing and finally store it in a variable called "sg_id".

I know it's a bit tricky, but I will explain this same concept once again while creating the instance. So, don't worry…

  • Next, using the "aws ec2 authorize-security-group-ingress" command, we add rules to the security group we just created. Notice that we fetched the id of the newly created security group on the fly and pass it into this command through the variable. Since I later want to SSH in from any location on the Internet, TCP port no. 22 with CIDR 0.0.0.0/0 is allowed. (An optional check follows this list.)
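If you want to double-check the new rule, describing the group by its id is an optional step that is not in the original script:

aws ec2 describe-security-groups --group-ids "$sg_id"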

Launching Instance :

$instance_id = aws ec2 run-instances --image-id "$image_id" --instance-type "$instance_type" --count "$instance_count" --subnet-id "$subnet_id" --security-group-ids "$sg_id" --key-name "$key_name" | jq ".Instances[0].InstanceId"
  • Now, using the "aws ec2 run-instances" command, we launch one instance. Here the image id, instance type, count, subnet id and key name are fetched from the variables that were set previously.
  • Next, the "security group id" is fetched from the variable we created on the fly while creating the security group.
  • Finally, we again use a pipe symbol to pass the output of the "aws ec2 run-instances" command to the "jq" command, and with the ".Instances[0]" filter we keep only the details of the 1st instance.
  • After fetching the details of the 1st instance we append ".InstanceId" to filter out the instance id. Lastly, we store it in a variable called "instance_id" for future reference. (An optional status check follows this list.)
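If you are curious whether the instance has reached the running state yet, this optional query, which is not part of the original script, reuses the same jq style:

aws ec2 describe-instances --instance-ids "$instance_id" | jq ".Reservations[0].Instances[0].State.Name"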

Creating EBS :

$volume_id = aws ec2 create-volume --availability-zone "$az" --size "$volume_size" --volume-type "$volume_type" | jq ".VolumeId"
  • Next, using "aws ec2 create-volume", we create one EBS volume of 1GB. Here we pass the availability zone, volume size and volume type from the variables we set previously.
  • Next, we pass the output to "jq", parse out the Volume Id and store it in a variable called "volume_id".

For your knowledge, the filter we pass to "jq" depends entirely on the output of the previous command. For the instance launch we had to pick the first instance out of an array before fetching its id, but here we can fetch the Volume Id directly.
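You can see why the filter is shorter here by feeding jq some made-up sample data shaped like create-volume's output: the Volume Id sits at the top level rather than inside an array:

'{"VolumeId": "vol-0abc12345", "Size": 1, "VolumeType": "gp2"}' | jq ".VolumeId"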

Attaching Volume :

Start-Sleep 20
aws ec2 attach-volume --volume-id "$volume_id" --instance-id "$instance_id" --device /dev/xvdh
  • Here I used "Start-Sleep" to pause the shell for 20 seconds, because after we provision the instance it usually takes a few seconds to start running, and until it runs we can't attach the EBS volume. That's why I wait 20 seconds and only then run the command to attach the EBS volume. (A more robust alternative to the fixed sleep is sketched after this list.)
  • Next, I used the "aws ec2 attach-volume" command and passed the volume id and instance id from the variables we stored while creating them. Lastly, we need to mention a device name, and I chose "/dev/xvdh".
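Instead of guessing a sleep duration, the AWS CLI ships a built-in waiter that blocks until the instance is running; this is an alternative to what the original script does, not a change to it:

aws ec2 wait instance-running --instance-ids "$instance_id"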

Let's run :

You can find the entire code from the below mentioned GitHub link…

  • Running any PowerShell script is simple. Just open Windows PowerShell, navigate to the workspace where your "ps1" file is, type the complete file name along with the extension, press "enter", and it will run (the exact commands are shown after this list).
  • As I am using VSCode, I can simply run the script with the "F5" key.
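For example, from a PowerShell prompt in the script's folder; the Set-ExecutionPolicy line is only needed if script execution is blocked on your machine:

Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
.\aws-cli.ps1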

Let's see the output…

VS Code Screenshot
  • In the left terminal, you can see I ran the script, and finally we can see the output of our last command, the volume attach. That means all our previous commands ran successfully.
  • In the right terminal we SSH into the instance, and you can see that the SSH succeeds. Finally, to make sure that the volume has been attached, run the Linux commands below inside the EC2 instance.
sudo su - root
fdisk -l
VS Code Screenshot

Congratulations… We have successfully completed our practical and also learned how to work with the AWS CLI, Windows PowerShell scripting and the jq parser.

Final Words :

  • This was a quick introduction to these tools. The possibilities are unlimited: we can provision more resources by adding more aws commands, and we can write more advanced shell scripts, because PowerShell is also a programming language in which we can do nearly every operation that other programming languages offer.
  • jq is a very useful tool that we end up using many times every day, because in networking we constantly need to work with JSON.
  • I tried to make this as simple as possible. Hope you learned something here. Feel free to check out my LinkedIn profile mentioned below, and of course feel free to comment. I write blogs on DevOps, Cloud Computing, Machine Learning etc., so feel free to follow me on Medium.

Thanks Everyone for reading. That’s all… Signing Off… 😊
