Optimizing AWS Costs: Your Handbook to Financial Efficiency — Part 3

TechStoryLines
5 min read · Nov 7, 2023


Welcome, dear readers, to the next exciting chapter of our AWS cost optimization journey! In Part 1 and Part 2, we’ve already uncovered a wealth of strategies to help you harness the power of AWS while maintaining a watchful eye on your costs. But today, as we venture into Part 3, we’re about to embark on an adventure into the world of Amazon DynamoDB. So, let’s dive right in and uncover the secrets to managing your AWS DynamoDB costs.

Amazon DynamoDB

Amazon DynamoDB, a fully managed NoSQL database service provided by AWS, stands as a real-time data powerhouse offering a wide range of uses. In the realms of e-commerce, gaming, IoT, content management, and ad-tech, it delivers rapid response times and resilient data solutions.

Let’s discuss some cost optimization techniques for DynamoDB:

1. Choosing Capacity Modes: On-Demand vs. Provisioned

On-Demand tables may be enticing for their flexibility, but it’s essential to be aware that they come at a higher cost, around 5 to 6 times more expensive per request compared to Provisioned tables.

For workloads with steady and predictable utilization, opting for the Provisioned mode with autoscaling enabled can be a cost-effective choice, ensuring you have the required capacity without breaking the bank. Carefully evaluate your workload needs to make the right choice.
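If you decide the Provisioned mode fits your workload, the switch can be made in place via the `UpdateTable` API. Here is a minimal sketch using boto3 — the table name and capacity figures are hypothetical placeholders:

```python
def switch_to_provisioned(table_name: str, rcus: int, wcus: int) -> None:
    """Move a table from On-Demand to Provisioned billing.

    Requires boto3 and AWS credentials; `table_name` is a placeholder.
    Note: DynamoDB allows a billing-mode switch at most once per 24 hours.
    """
    import boto3  # imported here so the module loads without AWS access

    boto3.client("dynamodb").update_table(
        TableName=table_name,
        BillingMode="PROVISIONED",
        ProvisionedThroughput={
            "ReadCapacityUnits": rcus,
            "WriteCapacityUnits": wcus,
        },
    )
```

Pair the switch with an Application Auto Scaling policy so the provisioned capacity tracks actual traffic instead of being sized for the peak.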

Price comparison: On-Demand vs. Provisioned

Scenario: Let’s say an Ad tech company needs to store 10 TB of data in DynamoDB, with an average item size of 1 KB. The company expects to have 1 million write requests per day and 5 million read requests per day.

For On-Demand capacity in the N. Virginia region using the Standard table class,

Data stored = 10 TB ===> Monthly cost = 10240 GB * $0.25 = $2560

Writes = 30 million per month ===> Monthly cost = 30 * $1.25 = $37.50

Reads = 150 million per month ===> Monthly cost = 150 * $0.25 = $37.50

Total monthly cost = $2635

Annual cost = $31620

For Provisioned capacity in the N. Virginia region,

Data stored = 10 TB ===> Monthly cost = 10240 GB * $0.25 = $2560

(Convert the daily requests into per-second capacity units: 1 million writes/day ≈ 12 writes/second, and 5 million reads/day ≈ 58 reads/second, which is 29 RCUs with eventually consistent reads, since each RCU serves two such reads per second.)

Write Capacity Units (WCUs) = 12 ===> Monthly cost = $5.69

Read Capacity Units (RCUs) = 29 ===> Monthly cost = $2.75

Total monthly cost = $2568.44

Annual cost = $30821.28

SAVINGS = $31620 - $30821.28 = $798.72 ==> ~2.5%
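The arithmetic above can be reproduced in a few lines of Python. The rates are the published us-east-1 prices at the time of writing and may change:

```python
import math

# us-east-1 Standard table class rates (subject to change)
STORAGE_GB_MONTH = 0.25        # $/GB-month
OD_WRITE_PER_MILLION = 1.25    # On-Demand write request units
OD_READ_PER_MILLION = 0.25     # On-Demand read request units
WCU_HOUR, RCU_HOUR = 0.00065, 0.00013
HOURS_PER_MONTH = 730

storage = 10 * 1024 * STORAGE_GB_MONTH  # 10 TB -> $2560
on_demand = storage + 30 * OD_WRITE_PER_MILLION + 150 * OD_READ_PER_MILLION

# 1M writes/day ~ 12/s -> 12 WCUs; 5M reads/day ~ 58/s -> 29 RCUs
# (eventually consistent reads: each RCU covers two reads per second)
wcus = math.ceil(1_000_000 / 86_400)
rcus = math.ceil(5_000_000 / 86_400 / 2)
provisioned = storage + (wcus * WCU_HOUR + rcus * RCU_HOUR) * HOURS_PER_MONTH

print(f"On-Demand:   ${on_demand:,.2f}/month")    # $2,635.00
print(f"Provisioned: ${provisioned:,.2f}/month")  # ~$2,568.45
```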

2. Harness the Benefits of Reserved Capacity

If you’re operating in the provisioned capacity mode and your capacity exceeds 100 units, it’s worth exploring the option of investing in reserved capacity.

With a 3-year commitment plan, the reserved capacity offers a substantial 76% discount, while a 1-year commitment still provides a noteworthy 53% cost reduction in comparison to the standard provisioned throughput capacity.

3. Selecting the Appropriate Table Class

Amazon DynamoDB provides two distinct table classes to empower you in your cost optimization efforts:

(i) Standard Table Class:

  • This serves as the default table class, striking a balance between storage and read/write costs.
  • It proves to be the most economical choice for tables where throughput costs take precedence.

(ii) Standard-Infrequent Access Table Class:

  • Offering a remarkable reduction of up to 60% in storage expenses, it comes with 25% higher read/write costs compared to the Standard table class.
  • This table class becomes the most cost-efficient option for tables where storage expenses dominate the budget.

Considering the previous scenario:

For On-Demand capacity in the N. Virginia region using the Standard table class,

for the 10 TB of data stored we get a total monthly cost = $2635

Annual cost = $31620

By using the Standard-Infrequent Access table class,

Data stored = 10 TB ===> Monthly cost = 10240 GB * $0.10 = $1024

Writes = 30 million per month ===> Monthly cost = 30 * $1.56 = $46.80

Reads = 150 million per month ===> Monthly cost = 150 * $0.31 = $46.50

Total monthly cost = $1117.30

Annual cost = $13407.60

SAVINGS = $31620 - $13407.60 = $18212.40

With this we can achieve savings of about 58% annually.
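Switching an existing table's class is a single `UpdateTable` call. A sketch with boto3 — the table name is hypothetical, and the class can also be set at creation time:

```python
def use_infrequent_access(table_name: str) -> None:
    """Switch an existing table to the Standard-IA class.

    Requires boto3 and AWS credentials; `table_name` is a placeholder.
    """
    import boto3  # imported here so the module loads without AWS access

    boto3.client("dynamodb").update_table(
        TableName=table_name,
        TableClass="STANDARD_INFREQUENT_ACCESS",
    )

# Sanity-check the savings claim with the scenario's numbers:
standard = 10 * 1024 * 0.25 + 30 * 1.25 + 150 * 0.25    # $2635.00/month
infrequent = 10 * 1024 * 0.10 + 30 * 1.56 + 150 * 0.31  # $1117.30/month
savings_pct = (1 - infrequent / standard) * 100          # ~58%
```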

4. Efficient Handling of Large Data

Storing large values or images in DynamoDB can escalate costs quickly, because read and write capacity consumption scales with item size. DynamoDB also imposes a hard limit of 400 KB per item, which makes it unsuitable for images or any data exceeding that size.

However, there’s a solution at hand: leverage Amazon S3 for large objects.

Write the large object to an S3 bucket and store only its identifier in the DynamoDB item (for instance, a pointer to the object’s S3 URL). This strategy keeps items small and makes managing large data cost-effective.
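The pattern looks roughly like this with boto3 — bucket, table, and attribute names are all hypothetical:

```python
MAX_ITEM_BYTES = 400 * 1024  # DynamoDB's hard per-item limit

def pointer_item(item_id: str, bucket: str, key: str) -> dict:
    """The small DynamoDB item that stands in for the large object."""
    return {"id": item_id, "s3_uri": f"s3://{bucket}/{key}"}

def store_large_object(item_id: str, payload: bytes, bucket: str, table: str) -> dict:
    """Put the blob in S3 and keep only a pointer in DynamoDB.

    Requires boto3 and AWS credentials; names are placeholders.
    """
    import boto3  # imported here so the module loads without AWS access

    key = f"objects/{item_id}"
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=payload)
    item = pointer_item(item_id, bucket, key)
    boto3.resource("dynamodb").Table(table).put_item(Item=item)
    return item
```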

5. TTL for Cost-Effective Data Cleanup

To cut storage expenses, it’s essential to regularly remove unnecessary data. If your business doesn’t require events older than a certain number of days, make use of DynamoDB’s Time-to-Live (TTL) feature.

This feature automatically deletes outdated or unnecessary items from your tables. Implementing regular TTL-based data cleanup will significantly reduce your storage costs.
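Enabling TTL is a one-time table setting; each item then carries an epoch-seconds expiry attribute. A sketch where the table and attribute names are assumptions:

```python
import time

def expiry_epoch(days: int, now=None) -> int:
    """Epoch-seconds timestamp `days` from now, for the TTL attribute."""
    base = time.time() if now is None else now
    return int(base + days * 86_400)

def enable_ttl(table_name: str, attribute: str = "expires_at") -> None:
    """One-time setup; requires boto3 and AWS credentials."""
    import boto3  # imported here so the module loads without AWS access

    boto3.client("dynamodb").update_time_to_live(
        TableName=table_name,
        TimeToLiveSpecification={"Enabled": True, "AttributeName": attribute},
    )

# An event written today that DynamoDB deletes (at no charge) after 90 days:
event = {"event_id": "e-123", "expires_at": expiry_epoch(90)}
```

TTL deletions consume no write capacity, which is what makes this cheaper than sweeping old items yourself.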

6. Optimize Data Retrieval with Queries

When retrieving data from DynamoDB, you have the option to use either queries or scans. However, it’s important to note that a Scan reads the entire table, so you are charged read capacity for every item examined, even the items your filter expression later discards.

Queries, on the other hand, efficiently locate data by utilizing partition and sort keys, thereby saving you from unnecessary expenses. It’s a smart choice for cost-effective data retrieval.
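In boto3 terms, the difference is simply which operation you call. A sketch with hypothetical table and key names:

```python
def fetch_user_orders(table_name: str, user_id: str) -> list:
    """Query reads only the items under one partition key.

    Requires boto3 and AWS credentials; names are placeholders.
    """
    import boto3  # imported here so the module loads without AWS access
    from boto3.dynamodb.conditions import Key

    table = boto3.resource("dynamodb").Table(table_name)
    return table.query(KeyConditionExpression=Key("user_id").eq(user_id))["Items"]

# Anti-pattern: table.scan(FilterExpression=...) -- the filter is applied
# *after* the full-table read, so every item scanned still consumes RCUs.
```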

As we conclude Part 3 of our “Cost Optimization in DynamoDB”, you’ve uncovered some techniques for optimizing this essential AWS service. Get ready for Part 4, where we’ll broaden the horizon and explore cost optimization strategies for a range of other AWS services. Your journey to financial efficiency in the AWS cloud is far from over, and the next chapter promises even more insights and success stories. Stay tuned!

We appreciate your readership and support. For more insightful updates and tips, don’t forget to follow us and stay connected on our journey through the ever-evolving world of cloud computing.
