Attach An IAM Role To EC2 Instance To Access DynamoDB

Ray Sylverne
Published in Nerd For Tech
8 min read · Nov 26, 2022

MISSION

  1. Create a DynamoDB table for something of your choosing
    (e.g. movies, food, games)
  2. Add 10 or more items to the table
  3. Create a t2.micro EC2 instance via the AWS CLI
  4. Using an IAM role and the principle of least privilege, grant the EC2 instance read access to DynamoDB.
  5. Use the AWS CLI in the EC2 instance to scan the DynamoDB table
  6. Use the AWS CLI in the EC2 instance to validate you cannot write an item to the DynamoDB table

ADMIN & LOGISTICS

  • AWS DynamoDB Fundamentals
  • AWS S3 Fundamentals
  • AWS CLI Fundamentals
  • Utilize Reference Section for AWS Documentation Assistance

EXECUTION

While trying to decide what theme to build my table around, I came across ChampsOrChumps.com, which is home to some interesting stats about sports teams and their respective cities. For instance, it tracks the NFL teams with the best regular-season records by winning percentage and notes whether each team went on to win the Super Bowl that year. My chosen theme reflects my passion as a die-hard Miami Dolphins fan. Even though we’ve had it rough the past couple of years, I’m clinging to our glory days. The Dolphins are the only team in NFL history to finish an entire season undefeated; their 1972 season ended without a loss and culminated in a victory in Super Bowl VII. Now that we know why I chose this table, let’s return to our lab.

The table contains fifty items with four attributes per item (Team, Record, Result, and Year). Manually adding each item and its attributes to DynamoDB is feasible but not an ideal use of time. Instead, I will take advantage of a recently released AWS feature (August 2022) that imports data from Amazon S3 into a new DynamoDB table.

Before this feature was released, options for bulk importing data from S3 into DynamoDB were limited. A popular workaround involved writing Python code in a Lambda function, and, like most ad hoc fixes, results varied. Bulk importing can also require a custom data loader, which takes resources to build and operate. DynamoDB import from S3 lets you bulk import terabytes of data from Amazon S3 into a new DynamoDB table with no code or servers required.

STEP 1: IMPORT DATA & CREATE TABLE

Begin by signing in to the AWS Management Console and opening DynamoDB at https://console.aws.amazon.com/dynamodb/. In the console's navigation pane on the left side, choose Import from S3.

On the page that appears, select Import from S3 again, and you will be brought to the request import wizard. If the bucket is owned by your account, you can find it by using the Browse S3 button. This is by far the simplest option.

After selecting the Browse S3 button, you can import an entire bucket or an individual object within that bucket. Before starting this lab, I downloaded the data from ChampsOrChumps.com and converted it into a CSV file I uploaded to S3.
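The CSV conversion step can be sketched locally. The snippet below is a minimal illustration (the sample records and the `to_import_csv` helper are mine, not from the lab): it writes a header row whose column names match the table's attribute names exactly, which is what the import wizard expects when the CSV header option is enabled.

```python
import csv
import io

# Hypothetical sample of the scraped records; the column names must
# match the table's attribute names exactly (they are case-sensitive).
records = [
    {"Team": "Miami Dolphins", "Year": 1972, "Record": "14-0", "Result": "Won SB"},
    {"Team": "Chicago Bears", "Year": 1985, "Record": "15-1", "Result": "Won SB"},
]

def to_import_csv(rows):
    """Serialize rows to a CSV string with a header row, the shape
    expected when 'CSV header' is enabled in the S3 import wizard."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["Team", "Year", "Record", "Result"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_import_csv(records))
```

The resulting string is what gets uploaded to S3 as the import source object.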

Alternatively, you can enter the URL of the bucket in the s3://bucket/prefix format. This is a great option if you are importing data from a different AWS account than the one you’re currently logged into.

Next, you must specify if you are the S3 bucket owner. If the source bucket is owned by a different account, select A different AWS account. You will need to provide the account ID of the bucket owner. Select the appropriate Import file format. The options are DynamoDB JSON, Amazon Ion, or CSV. Choosing the wrong format will cause the import process to fail. If you select CSV, you will have two additional options: CSV header and delimiter character.

On the next screen, you will decide the parameters that will be associated with the new table that will be created to store your data. As you can see below, I’ve named my table and set the partition and sort key.

We had to use a sort key because some teams appear on this list more than once. When a partition key and sort key are used together, they are referred to as a composite primary key. In a table with a composite primary key, multiple items can have the same partition key value; however, those items must have different sort key values. 📝 Note: The partition key and sort key names must match the attributes in your file, or the import will fail. Attribute names are case-sensitive.
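The composite-primary-key rule can be illustrated with a toy model (this dict-based table is purely illustrative, not how DynamoDB stores data): items live under (partition key, sort key) pairs, so two Dolphins seasons coexist, while writing the same pair twice overwrites rather than duplicates.

```python
# A toy model of a composite primary key: items are stored under
# (partition key, sort key) tuples, so two items may share a Team
# (partition key) as long as their Year (sort key) differs.
table = {}

def put(item):
    key = (item["Team"], item["Year"])  # composite primary key
    table[key] = item                   # same key overwrites, never duplicates

put({"Team": "Miami Dolphins", "Year": 1972, "Record": "14-0"})
put({"Team": "Miami Dolphins", "Year": 1984, "Record": "14-2"})
put({"Team": "Miami Dolphins", "Year": 1972, "Record": "overwritten"})

print(len(table))  # → 2: the second 1972 write replaced the first
```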

Select Next to review your import options, then click Import to begin the import task. Your new table appears in the Tables list with the status Creating. At this point, the table is not yet accessible.

Our data was successfully imported from S3, and a DynamoDB table has been created for us.

STEP 2: CREATE IAM ROLE WITH READ-ONLY ACCESS

The IAM service supports only one type of resource-based policy called a role trust policy, which is attached to an IAM role. Trust policies define which principal entities can assume the role. I am using Visual Studio Code to create my .json file. A template can be seen below.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
The AWS CLI command to create the role and attach the trust policy

aws iam create-role --role-name <Name_Your_Role> --assume-role-policy-document file://<filelocation.json>
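A malformed trust-policy file (a stray comma, a missing closing brace) only surfaces when `create-role` rejects it. A quick local sanity check catches that earlier; the `check_trust_policy` helper below is my own illustration, not part of the AWS tooling.

```python
import json

# The same trust policy as above, embedded as a string for the sketch.
TRUST_POLICY = """
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {"Service": "ec2.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }
  ]
}
"""

def check_trust_policy(text):
    """Parse the document and confirm EC2 can assume the role.
    Broken JSON raises here instead of failing later in
    `aws iam create-role`."""
    doc = json.loads(text)
    stmt = doc["Statement"][0]
    assert stmt["Principal"]["Service"] == "ec2.amazonaws.com"
    assert stmt["Action"] == "sts:AssumeRole"
    return True

print(check_trust_policy(TRUST_POLICY))  # → True
```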

Attach the permissions policy to your role. Initially, I wanted to use an AWS-managed policy for my newly created role. The only requirement would be procuring the ARN for the AWS-managed AmazonDynamoDBReadOnlyAccess policy. However, upon reviewing that policy, I noticed that it grants access to other resources that our mission does not call for. I will need to create another .json file with permissions that meet our requirements while adhering to the principle of least privilege.

The following permissions policy grants only the GetItem, BatchGetItem, Scan, Query, and ConditionCheckItem DynamoDB actions, which amounts to read-only access on the BestNflRegularSeasonRecord table. Paste your table’s ARN into the Resource element of the policy to specify which table you’re granting access to.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyAPIActionsOnBooks",
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetItem",
        "dynamodb:BatchGetItem",
        "dynamodb:Scan",
        "dynamodb:Query",
        "dynamodb:ConditionCheckItem"
      ],
      "Resource": "arn:aws:dynamodb:us-west-2:123456789012:table/BestNflRegularSeasonRecord"
    }
  ]
}
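If you scope the same policy to several tables, generating it beats hand-editing. The sketch below (my own helper, not an AWS API) builds the least-privilege document for a given table ARN; note that write actions such as PutItem are simply absent.

```python
import json

# The five read-only actions our mission allows.
READ_ONLY_ACTIONS = [
    "dynamodb:GetItem",
    "dynamodb:BatchGetItem",
    "dynamodb:Scan",
    "dynamodb:Query",
    "dynamodb:ConditionCheckItem",
]

def read_only_policy(table_arn):
    """Build a least-privilege, read-only policy scoped to one table ARN."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "ReadOnlyDynamoDBTable",
            "Effect": "Allow",
            "Action": READ_ONLY_ACTIONS,
            "Resource": table_arn,
        }],
    }, indent=2)

# Placeholder region and account ID for illustration.
print(read_only_policy(
    "arn:aws:dynamodb:us-east-1:123456789012:table/BestNflRegularSeasonRecord"))
```

Write the output to a file and pass it to `put-role-policy` as shown below.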

The command below embeds the inline policy in the role to specify what it can do.

$ aws iam put-role-policy --role-name Test-Role-for-EC2 --policy-name Permissions-Policy-For-Ec2 --policy-document file://C:\policies\permissionspolicyforec2.json

Create the instance profile required by EC2 to contain the role. You will need to provide a unique name for your instance profile.

$ aws iam create-instance-profile --instance-profile-name <TestEc2Profile>

Lastly, add the role to the instance profile

$ aws iam add-role-to-instance-profile --instance-profile-name TestEc2Profile --role-name Test-Role-for-EC2

STEP 3: CREATE EC2 INSTANCE

Run the command below to create your EC2 instance. You will need to provide a value for every parameter except --associate-public-ip-address.

aws ec2 run-instances \
--image-id ami-0b0dcb5067f052a63 \
--instance-type t2.micro \
--subnet-id subnet-01b370d32254f5d0b \
--security-group-ids sg-025ec04872b1c4e46 \
--associate-public-ip-address \
--key-name cli-dev-proj \
--tag-specifications "ResourceType=instance,Tags=[{Key=Name,Value=EC2-Wk8}]" \
--iam-instance-profile Name=TestEc2Profile

Upon successfully creating your instance, you will need its public IP address to SSH into the system. You can obtain it by running the following command. If you did not attach the instance profile at launch, associate it now with aws ec2 associate-iam-instance-profile.

aws ec2 describe-instances \
--query "Reservations[*].Instances[*].{PublicIP:PublicIpAddress,Name:Tags[?Key=='Name']|[0].Value,Status:State.Name}" --output table
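The --query flag applies a JMESPath expression to the API response. The sketch below does the same extraction in plain Python against a trimmed, made-up describe-instances response (the instance IP and tags are invented for illustration), which helps show what the expression is actually selecting.

```python
# A local sketch of what the --query expression extracts, run against a
# trimmed, made-up describe-instances response (IDs and IPs are fake).
sample_response = {
    "Reservations": [{
        "Instances": [{
            "PublicIpAddress": "203.0.113.10",
            "State": {"Name": "running"},
            "Tags": [{"Key": "Name", "Value": "EC2-Wk8"}],
        }]
    }]
}

def summarize(response):
    """Flatten Reservations[*].Instances[*] into {PublicIP, Name, Status} rows."""
    rows = []
    for res in response["Reservations"]:
        for inst in res["Instances"]:
            name = next((t["Value"] for t in inst.get("Tags", [])
                         if t["Key"] == "Name"), None)
            rows.append({
                "PublicIP": inst.get("PublicIpAddress"),
                "Name": name,
                "Status": inst["State"]["Name"],
            })
    return rows

print(summarize(sample_response))
# → [{'PublicIP': '203.0.113.10', 'Name': 'EC2-Wk8', 'Status': 'running'}]
```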

STEP 4: QUALITY ASSURANCE CHECK

Now it’s time to see whether everything functions as expected, or whether we need to go back to the drawing board. Use the AWS CLI in the EC2 instance to read from the DynamoDB table.

The following AWS CLI example queries items from the BestNflRegularSeasonRecord table. You can do this through the DynamoDB API or PartiQL, a SQL-compatible query language for DynamoDB.

# DynamoDB API
aws dynamodb query \
--table-name BestNflRegularSeasonRecord \
--key-condition-expression "Team = :name" \
--expression-attribute-values '{":name":{"S":"Miami Dolphins"}}' \
--region us-east-1

The output from the command shows that the 70s and 80s were the Miami Dolphins' glory years, and I wasn’t even born yet.
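The query returns items in DynamoDB's typed JSON format, where every value is wrapped in a type descriptor such as {"S": ...} or {"N": ...}. The minimal unmarshaller below (my own sketch, handling only the string and number types this table uses; the sample item is made up in the shape the query would return) converts that into plain values.

```python
# DynamoDB's wire format wraps every value in a type descriptor
# ({"S": ...} for strings, {"N": ...} for numbers). A minimal
# unmarshaller for the types used in this table:
def unmarshal(item):
    out = {}
    for attr, typed in item.items():
        (dtype, value), = typed.items()  # each attribute has one type key
        if dtype == "S":
            out[attr] = value
        elif dtype == "N":
            # DynamoDB sends numbers as strings; convert them back.
            out[attr] = float(value) if "." in value else int(value)
        else:
            raise ValueError(f"type {dtype} not handled in this sketch")
    return out

# A made-up item in the shape the query above would return:
raw = {"Team": {"S": "Miami Dolphins"}, "Year": {"N": "1972"},
       "Record": {"S": "14-0"}, "Result": {"S": "Won SB"}}
print(unmarshal(raw))
# → {'Team': 'Miami Dolphins', 'Year': 1972, 'Record': '14-0', 'Result': 'Won SB'}
```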

The final step is to make sure our role cannot write:

aws dynamodb put-item \
--table-name BestNflRegularSeasonRecord --region us-east-1 \
--item \
'{"Team": {"S": "San Diego Chargers"}, "Year": {"N": "2006"}, "Record": {"S": "14-0"}, "Result": {"S": "Lost SB"}}'

The put-item call fails with an AccessDeniedException, confirming that our EC2 instance cannot write to the table.

TOUCHDOWN

I had to reference many different resources to figure out the steps and develop the syntax needed for the different phases within this one lab. I’m posting the references below to help anyone working on a similar problem. The references are in a similar order to the steps of the lab.

References & Resources:

  1. Import Amazon S3 data into Amazon DynamoDB
  2. Cheat sheet for DynamoDB
  3. IAM policy to grant read-only permissions on items in a DynamoDB table
  4. Creating a role for a service (AWS CLI)
  5. Create an EC2 Instance via AWS CLI
  6. Query data in a table
  7. Write data in a table
