A Script for Listing All of Your AWS Instances
If you work in tech, it’s likely that you’re familiar with Amazon Web Services, more commonly known as AWS.
Teams often run EC2 instances in many regions, for lower costs and convenience. The downside is that things can easily get out of control when instances are spread across many regions and several AWS profiles.
If you work at a company that supports cloud solutions for clients, it's very likely that you have multiple AWS profiles. Even if you collect them under one organisation's billing, it can be hard to get an overview of your EC2 instances: which regions and profiles they live in, or what size each instance is.
Bearing this in mind, a script that lists all of them in one database or CSV file can be very helpful. The good news: when I spotted this problem I decided to invest some time in it, and I wrote a script for exactly this listing, which I'd like to share with you.
Why do all of this through a database?
Can't we just generate a CSV file directly?
To me, loading everything into a database is the better option: we can enforce uniqueness on the instance_id, and we gain control over the updates, for example by counting them.
Go to the public repo:
So, in order to make it happen here is what you need to do:
1. Make sure you have the preconditions:
- AWS-CLI, so you can fetch the data from the AWS API
- jq, for parsing the JSON we get from AWS-CLI
- PostgreSQL 9.5 or newer, for the psql tool and the ON CONFLICT upsert syntax the script relies on
2. Create the database (e.g. createdb aws_instances):
3. Then, create the table from the script in the sql folder:
psql aws_instances -f sql/aws_ec2_instances.sql
# You can also use pg_restore if you prefer
This will create the aws_instances table, with fields matching the variable $SAVED_FIELDS. You can of course change the naming directly in the sql/aws_instances.sql file.
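Before running anything, it can help to sanity-check the preconditions from step 1. This small helper is my own sketch, not part of the repo:

```shell
#!/usr/bin/env bash
# Hypothetical helper: verify the tools the scripts depend on are installed.
check_prereqs() {
  local missing=0
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "missing: $tool" >&2
      missing=1
    fi
  done
  return "$missing"
}

check_prereqs aws jq psql || echo "Install the missing tools before continuing."
```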
There are two scripts you can run. The first one is:
# works also for sh
This will save you time going through all of the regions to check where particular instances are. It also gives you insight into how many of them are running, so you don't forget about a large server that could generate unwanted costs.
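The repo has the real implementation; a minimal sketch of the idea, assuming the standard AWS CLI commands (aws ec2 describe-regions / describe-instances) and a field list I picked purely for illustration, could look like this:

```shell
#!/usr/bin/env bash
# Sketch only: loop over every region and print basic instance details.
# The queried fields and output format are illustrative, not the repo's exact ones.
list_instances_in() {
  local region="$1"
  aws ec2 describe-instances --region "$region" \
    --query 'Reservations[].Instances[].[InstanceId,InstanceType,State.Name]' \
    --output text
}

scan_all_regions() {
  local region
  for region in $(aws ec2 describe-regions --query 'Regions[].RegionName' --output text); do
    echo "== $region =="
    list_instances_in "$region"
  done
}

# Only run the scan when the AWS CLI is actually available.
if command -v aws >/dev/null 2>&1; then
  scan_all_regions
fi
```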
# works also for sh
This saves the instances in the database and generates a CSV file for all of the profiles you list in the profiles array.
To run it properly, you need to complete the list of AWS profiles. You only need to create an array like this:
declare -a profiles=("profile_1" "profile_2")
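The array then drives the per-profile fetch. Roughly, with the loop body being my own illustration rather than the repo's code:

```shell
#!/usr/bin/env bash
# Illustration: iterate over all declared AWS profiles.
declare -a profiles=("profile_1" "profile_2")

for profile in "${profiles[@]}"; do
  echo "Fetching instances for profile: $profile"
  # Here the real script would call something like:
  #   aws ec2 describe-instances --profile "$profile" ...
done
```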
To get the 'wow effect', I also added the open command at the end, but that's obviously just for show.
The worst part of the code?
Well, look at this example instance that I INSERT INTO the database:
Here I was hoping to reuse SAVED_FIELDS for the DO UPDATE SET clause, but without success. This means that whenever the fields change, two places need to be updated. Can you help me with it? It's all open source!
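One possible fix, sketched here with assumed column names (the real ones live in the repo's $SAVED_FIELDS), is to generate the DO UPDATE SET clause from the same field list, so there is only one place to edit:

```shell
#!/usr/bin/env bash
# Assumed field list for illustration; the repo defines the real $SAVED_FIELDS.
SAVED_FIELDS="instance_id, profile, region, instance_type, state"

# Build "col = EXCLUDED.col" pairs for every field except the conflict key.
set_clause=$(printf '%s' "$SAVED_FIELDS" \
  | tr -d ' ' | tr ',' '\n' \
  | grep -v '^instance_id$' \
  | sed 's/.*/& = EXCLUDED.&/' \
  | tr '\n' ',' | sed 's/,$//')

cat <<SQL
INSERT INTO aws_instances ($SAVED_FIELDS)
VALUES (...)
ON CONFLICT (instance_id) DO UPDATE SET $set_clause;
SQL
```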
We can now save time finding all of these instances and keep control over them, turning them off when they are not needed. Yay!
I was hoping for a universal tool and wanted to check whether we really need all of those ORM layers. I can see now that the code is not as universal as it could be yet: I unexpectedly had to dig deeper into PostgreSQL, and the same code would not work on MySQL or another database.
More details available in the readme file.
Troubleshooting section included! Happy using!
What do you think? If you enjoyed this post, please let me know!