AWS Asset Inventory Dashboard with CloudQuery and Grafana
To manage a cloud architecture effectively, a solution architect needs an overall picture of the assets deployed across the organization. A dashboard that can break those assets down along fine-grained dimensions (service, region, account, and so on) enables better decision making and management for the organization.
Architecture Components
- CloudQuery — an open-source tool that extracts your assets from cloud and SaaS providers, transforms them, and loads them into PostgreSQL.
- Grafana — an open-source analytics platform for querying, visualizing, monitoring, and alerting on data, with strong support for time-series graphs and dashboards.
Steps to Configure the Dashboard
Step 1: Create an EC2 instance
Launch an EC2 instance with the Amazon Linux 2 AMI.
In the instance's security group, open SSH (`22`) and the default Grafana port (`3000`) to the internet (`0.0.0.0/0`).
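If you prefer the CLI over the console, the security group rules above can be sketched as follows (assumes the AWS CLI is configured; the security group ID is a placeholder):

```shell
# Open SSH (22) and Grafana (3000) to the internet.
# sg-0123456789abcdef0 is a placeholder -- use your instance's security group ID.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 22 --cidr 0.0.0.0/0

aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 3000 --cidr 0.0.0.0/0
```

For SSH in particular, restricting the CIDR to your own IP instead of `0.0.0.0/0` is the safer choice.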
Step 2: Install CloudQuery on the EC2
- SSH into your EC2 Instance.
- Run the following commands to install CloudQuery on the EC2 Linux machine:
curl -L https://github.com/cloudquery/cloudquery/releases/latest/download/cloudquery_linux_x86_64 -o cloudquery
# Give executable permissions to the downloaded cloudquery file
chmod a+x cloudquery
sudo cp ./cloudquery /bin
Step 3: Create an EC2 Role giving CloudQuery access to the assets in your AWS account
Amazon has created an IAM Managed Policy named ReadOnlyAccess, which grants read-only access to active resources on most AWS services.
The biggest difference is that we want our read-only roles to be able to see the architecture of our AWS systems and what resources are active, but we would prefer that the role not be able to read sensitive data from DynamoDB, S3, Kinesis, SQS queue messages, CloudFormation template parameters, and the like.
To better protect our data when creating read-only roles, we not only attach the ReadOnlyAccess managed policy from Amazon, but we also attach our own DenyData managed policy that uses Deny statements to take away a number of the previously allowed permissions.
So, we will attach 2 policies to our Role:
- AWS managed ReadOnlyAccess Policy
- Customer managed cloudquery-deny-data-read Policy
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyData",
      "Effect": "Deny",
      "Action": [
        "cloudformation:GetTemplate",
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "ec2:GetConsoleOutput",
        "ec2:GetConsoleScreenshot",
        "ecr:BatchGetImage",
        "ecr:GetAuthorizationToken",
        "ecr:GetDownloadUrlForLayer",
        "kinesis:Get*",
        "lambda:GetFunction",
        "logs:GetLogEvents",
        "s3:GetObject",
        "sdb:Select*",
        "sqs:ReceiveMessage"
      ],
      "Resource": "*"
    }
  ]
}
After creating this role, attach it to the EC2 instance so CloudQuery can access your AWS resources.
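The role setup can also be done from the CLI; a sketch follows, assuming the deny policy JSON above is saved as `deny-data.json`, a standard EC2 trust policy is saved as `trust.json`, and the account ID, role name, and instance ID are placeholders:

```shell
# Create the customer managed deny policy from the JSON above
aws iam create-policy --policy-name cloudquery-deny-data-read \
  --policy-document file://deny-data.json

# Create a role that EC2 can assume (trust.json is the standard
# EC2 assume-role trust policy)
aws iam create-role --role-name cloudquery-role \
  --assume-role-policy-document file://trust.json

# Attach the AWS managed ReadOnlyAccess policy and our deny policy
aws iam attach-role-policy --role-name cloudquery-role \
  --policy-arn arn:aws:iam::aws:policy/ReadOnlyAccess
aws iam attach-role-policy --role-name cloudquery-role \
  --policy-arn arn:aws:iam::123456789012:policy/cloudquery-deny-data-read

# Wrap the role in an instance profile and attach it to the instance
aws iam create-instance-profile --instance-profile-name cloudquery-profile
aws iam add-role-to-instance-profile \
  --instance-profile-name cloudquery-profile --role-name cloudquery-role
aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=cloudquery-profile
```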
Step 4: Set Up CloudQuery
- After installing CloudQuery, you need to generate a `cloudquery.yml` file that describes which cloud provider you want to use and which resources you want CloudQuery to ETL:
cloudquery init aws
- By default, cloudquery will try to connect to the database `postgres` on `localhost:5432` with username `postgres` and password `pass`. After installing Docker, you can create such a local postgres instance with:
# Docker installation commands for Amazon Linux 2
sudo amazon-linux-extras install docker -y
sudo systemctl start docker
sudo systemctl enable docker
# Allow ec2-user to run docker without sudo (re-login for it to take effect)
sudo usermod -aG docker ec2-user
docker version
# Docker Command to create a postgres instance
docker run --name cloudquery_postgres -p 5432:5432 -e POSTGRES_PASSWORD=pass -d postgres
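To confirm the container is running and accepting connections, you can, for example, run psql inside it:

```shell
# Check that the container is up
docker ps --filter name=cloudquery_postgres

# Open a psql session inside the container and list the databases
docker exec -it cloudquery_postgres psql -U postgres -c '\l'
```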
- If you are running postgres at a different location or with different credentials, you need to edit the `connection` section of `cloudquery.yml`.
For Example:
cloudquery:
  ...
  ...
  connection:
    type: postgres
    username: postgres
    password: pass
    host: localhost
    port: 5432
    database: postgres
    sslmode: disable
Once `cloudquery.yml` is generated and you are authenticated with AWS, run the following command to fetch the resources:
# --no-telemetry flag for not sending any telemetry data to CQ
cloudquery fetch --no-telemetry
After this command has run, your postgres database will be populated with the data fetched from your AWS accounts, segregated into tables according to the services.
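You can inspect the fetched data directly with psql. Table names come from the CloudQuery AWS provider schema and may differ between provider versions; `aws_ec2_instances` is used here as an example:

```shell
# List the tables CloudQuery created (inside the Docker postgres container)
docker exec -it cloudquery_postgres psql -U postgres -c '\dt'

# Example: count the fetched EC2 instances
docker exec -it cloudquery_postgres psql -U postgres \
  -c 'SELECT count(*) FROM aws_ec2_instances;'
```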
Step 5: Grafana Installation
Add a new YUM repository so the operating system knows where to download Grafana from. The command below uses `nano`.
sudo nano /etc/yum.repos.d/grafana.repo
Add the lines below to `grafana.repo`. This configuration installs the open-source version of Grafana.
[grafana]
name=grafana
baseurl=https://packages.grafana.com/oss/rpm
repo_gpgcheck=1
enabled=1
gpgcheck=1
gpgkey=https://packages.grafana.com/gpg.key
sslverify=1
sslcacert=/etc/pki/tls/certs/ca-bundle.crt
Installation and Configuration commands
sudo yum install grafana
sudo systemctl daemon-reload
sudo systemctl start grafana-server
sudo systemctl status grafana-server
sudo systemctl enable grafana-server.service
Visit the newly installed Grafana server at the public IP of the EC2 instance on port `3000`. The default username and password are both `admin`. Change the password to a strong one after logging in.
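The admin password can also be reset from the command line on the EC2 instance using grafana-cli (the new password below is a placeholder):

```shell
# Reset the Grafana admin password non-interactively
# (NewStrongPassword is a placeholder -- choose your own)
sudo grafana-cli admin reset-admin-password NewStrongPassword
```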
Step 6: Adding a Data Source to Grafana
Go to the Configuration -> Data Sources section in Grafana, click Add data source, select PostgreSQL, and configure it with the connection details used earlier (host `localhost:5432`, database `postgres`, user `postgres`, password `pass`, SSL mode `disable`).
After this is done, click the Save & Test button to check connectivity between postgres and Grafana.
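As an alternative to the UI, Grafana can pick up the data source from a provisioning file at startup. A minimal sketch, assuming the local postgres credentials used earlier (the exact field layout varies slightly across Grafana versions):

```shell
# Write a data source provisioning file and restart Grafana
sudo tee /etc/grafana/provisioning/datasources/cloudquery.yaml > /dev/null <<'EOF'
apiVersion: 1
datasources:
  - name: CloudQuery-Postgres
    type: postgres
    url: localhost:5432
    user: postgres
    secureJsonData:
      password: pass
    jsonData:
      database: postgres
      sslmode: disable
EOF
sudo systemctl restart grafana-server
```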
Step 7: Importing the Grafana Dashboard
- Execute this query in PostgreSQL to add the `aws_resources` view.
- Download the JSON for the AWS Asset Inventory Grafana Dashboard.
- To import the dashboard, click Import under the Dashboards icon in the side menu.
After these three steps you should be able to see the asset inventory dashboard:
Step 8: Customizing the Dashboard
To further customize the asset inventory dashboard, you can make use of the CloudQuery AWS provider schema and write SQL queries relevant to your business.
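For example, a panel query grouping EC2 instances by region could look like this (table and column names come from the CloudQuery AWS provider schema and may differ between provider versions):

```shell
# Count EC2 instances per region, most populated region first
docker exec -it cloudquery_postgres psql -U postgres -c \
  "SELECT region, count(*) AS instances
     FROM aws_ec2_instances
    GROUP BY region
    ORDER BY instances DESC;"
```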
Conclusion
In this post, we discussed setting up Grafana and CloudQuery, and then configured an open-source cloud asset inventory for an AWS account.
If you need help with DevOps practices, AWS, or Kubernetes at your company, feel free to reach out to us at Opsnetic.
Contributed By: Raj Shah