A Script for Listing All of Your AWS Instances

If you work in tech, it’s likely that you’re familiar with Amazon Web Services, more commonly known as AWS.

It is common to run EC2 instances in many regions, for reasons of cost and convenience. The downside is that things can easily get out of control when there are too many instances spread across regions and across different AWS profiles.

A screenshot of a script which saves all of the instances from multiple AWS profiles in several regions to the database

If you work in a company that supports cloud solutions for clients, it is very likely that you have multiple AWS profiles. Even if you collect them under one organisation's billing, it can be really difficult to get an overview of all these EC2 instances. For example: which regions and profiles do they live in, and what are the size details of each instance?

Bearing this in mind, a script that lists all of them in one database or CSV file can be very helpful. Well, there is good news! When I spotted this problem I decided to invest some time in it, and I wrote a script for exactly this listing, which I want to share with you.

On the technical side, inspired by the idea of minimal dependencies, I decided not to use any ORM or backend at all. Zero npm packages (npm being the package manager for the Node.js platform), no gems, no other libraries: just the plain command line and built-in PostgreSQL tooling. The downside is that users of other databases will have to adjust the script, but as it is open source, feel free to add your own version. Thanks to this approach, all you need to run the script is the AWS Command Line Interface (AWS CLI), jq (a command-line JSON processor) and PostgreSQL, each installed and available as a shell command.

Why do all of this through a database?
Can’t we just generate a CSV file directly?

To me, putting everything into a database first is a much better option: we can enforce the uniqueness of instance_id, and we gain control over updates, since we can count them.
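That uniqueness check boils down to a PostgreSQL upsert. Here is a minimal sketch of what it can look like in plain psql; the table and column names are illustrative, not necessarily the exact ones used in the repository:

```shell
# A hedged sketch (not the repository's exact SQL): instance_id is
# assumed to carry a UNIQUE constraint, so re-running is idempotent.
# ON CONFLICT ... DO UPDATE needs PostgreSQL 9.5 or newer.
UPSERT_SQL="INSERT INTO aws_instances (instance_id, instance_type, region)
VALUES ('i-0abc12345', 't3.micro', 'eu-west-1')
ON CONFLICT (instance_id)
DO UPDATE SET instance_type = EXCLUDED.instance_type,
              region        = EXCLUDED.region;"

# psql aws_instances -c "$UPSERT_SQL"   # commented out: needs the database
```

Running the same statement twice updates the existing row instead of failing, which is exactly the control over updates mentioned above.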

Go to the public repo:


So, in order to make it happen here is what you need to do:

  1. Make sure you have the preconditions.

That is:

  • AWS CLI, so you can fetch the data from the AWS API
  • jq, for parsing the JSON we get from the AWS CLI
  • PostgreSQL 9.5 or newer, so you have the PostgreSQL tooling that we will use

2. Create the database:

createdb aws_instances

3. Then, create a table from the script in SQL folder:

psql aws_instances -f sql/aws_ec2_instances.sql
# You can also use pg_restore if you prefer

This will create an aws_instances table with the fields listed in the $SAVED_FIELDS variable. You can of course change the naming directly in the sql/aws_ec2_instances.sql file.

There are two scripts you can run. The first one is:

# works also for sh
zsh aws_ec2_instances_from_all_regions_to_db.zsh

This will save you time when going through all of the regions to check where particular instances are. It also gives you insight into how many of them are running, so you don't forget about any large server that could generate unwanted costs.
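Under the hood, the per-region pass can be sketched roughly like this. The variable names and the exact jq filter are illustrative, not copied from the repository, and the aws calls are commented out because they need real credentials:

```shell
# Flatten every instance into one CSV row: id, type, state.
JQ_FILTER='.Reservations[].Instances[]
  | [.InstanceId, .InstanceType, .State.Name] | @csv'

# regions=$(aws ec2 describe-regions --query "Regions[].RegionName" --output text)
# for region in $regions; do
#   aws ec2 describe-instances --region "$region" | jq -r "$JQ_FILTER"
# done
```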

# works also for sh

This one saves the instances in the database and generates a CSV file for all of the profiles you list in the profiles array.
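For the CSV side, psql's \copy can dump the whole table in one line. This is only a sketch; the file name and column selection here are assumptions, not the script's exact output:

```shell
# \copy runs client-side, so the CSV lands in the current directory.
EXPORT_SQL="\\copy (SELECT * FROM aws_instances ORDER BY region) TO 'aws_instances.csv' WITH CSV HEADER"

# psql aws_instances -c "$EXPORT_SQL"   # commented out: needs the database
```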

To run it properly, you need to complete the list of AWS profiles.

You only need to create an array like this:

declare -a profiles=("profile_1" "profile_2")
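Each profile then becomes one more pass of the same collection loop. A sketch, with the aws call elided because it needs real credentials:

```shell
declare -a profiles=("profile_1" "profile_2")   # replace with your profile names

for profile in "${profiles[@]}"; do
  echo "collecting instances for $profile"
  # aws ec2 describe-instances --profile "$profile" ...
done
```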

To get the ‘wow effect’, I also added the open command at the end, for fun; it's obviously just a firework.

The worst part of the code?
Well, look at this example instance that I INSERT INTO the database:

Here I was hoping to reuse $SAVED_FIELDS for the DO UPDATE SET clause, but without success. This means that when the fields change, two places need to be updated. Can you help me with it? It's all open source!
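One possible way out, sketched here under the assumption that $SAVED_FIELDS is a plain comma-separated string, is to derive the DO UPDATE SET pairs from that very same variable, so the column names live in exactly one place:

```shell
# Hypothetical: turn "a, b, c" into "a = EXCLUDED.a,b = EXCLUDED.b,...".
SAVED_FIELDS="instance_id, name, instance_type, region"

UPDATE_SET=$(printf '%s' "$SAVED_FIELDS" | tr -d ' ' | tr ',' '\n' \
  | sed 's/.*/& = EXCLUDED.&/' | paste -sd ',' -)

echo "$UPDATE_SET"
```

The resulting string can then be spliced into the ON CONFLICT clause, instead of repeating the column list by hand.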



We can now save time finding all of these instances, keep control over them, and turn them off when they are not needed. Yay!


I was hoping for a universal tool, and wanted to check whether we really need all of the ORM layers. I can see now that the code is not as universal as it could be yet: I had to dig deeper into PostgreSQL than expected, and the same code would not work on MySQL or another database.

More details available in the readme file.

A troubleshooting section is included! Happy scripting!

What do you think? If you enjoyed this post, please let me know!




Venture builder. Having fun building scalable businesses. www.appnroll.com

Piotr Zientara


#CEO at Xfaang, leader of the WarsawJS Community
