Mastering Serverless (Part I): Enhancing DynamoDB Interactions with Document Client

The SaaS Enthusiast
10 min read · Jan 8, 2024


[Cover image: a developer’s workspace with multiple screens displaying code, flowcharts, and cloud infrastructure diagrams.]

Introduction

Have you ever felt like you’re doing the same thing over and over again in your serverless projects, especially when dealing with databases? I certainly have. This is the story of how I turned my repetitive tasks into a streamlined process using the DynamoDB Document Client. Initially, I created a basic file for CRUD operations, but as my needs evolved, so did the file — it now includes helpful tools for Batch Write and Batch Get operations. Let’s delve into how this can transform your serverless project approach and make your interactions with DynamoDB more efficient and less repetitive.

DynamoDB Document Client

The DynamoDB Document Client is a game-changer for anyone using AWS’s DynamoDB in serverless applications. It’s essentially a wrapper that simplifies the interaction with DynamoDB, allowing you to work with JavaScript objects directly. This means no more cumbersome manual marshalling of JSON data! For instance, when you retrieve data using the standard DynamoDB client, you receive the response in a DynamoDB-specific format, which requires parsing. In contrast, the Document Client automatically converts this data into a more readable, standard JavaScript object format.
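To make the contrast concrete, here is a toy sketch of what unmarshalling does: it strips DynamoDB’s type descriptors (`S`, `N`, `BOOL`, …) and returns plain JavaScript values. This is illustrative only, not the SDK’s actual implementation, which handles many more types (sets, binary data, and so on):

```javascript
// Toy unmarshaller: converts DynamoDB's typed attribute format into
// plain JavaScript values. Illustrative only; the real SDK's converter
// supports many more types (sets, binary, etc.).
const toyUnmarshall = (item) => {
  const convert = (attr) => {
    if ('S' in attr) return attr.S;                // string
    if ('N' in attr) return Number(attr.N);        // number (stored as a string)
    if ('BOOL' in attr) return attr.BOOL;          // boolean
    if ('NULL' in attr) return null;               // null
    if ('L' in attr) return attr.L.map(convert);   // list
    if ('M' in attr) return toyUnmarshall(attr.M); // nested map
    throw new Error(`Unsupported type descriptor: ${Object.keys(attr)}`);
  };
  return Object.fromEntries(
    Object.entries(item).map(([key, attr]) => [key, convert(attr)])
  );
};

// The raw shape DynamoDB returns...
const raw = { id: { S: 'user-1' }, age: { N: '42' }, active: { BOOL: true } };
// ...and the plain object the Document Client hands you instead:
console.log(toyUnmarshall(raw)); // { id: 'user-1', age: 42, active: true }
```

The Document Client performs this translation (and the reverse, for requests) automatically on every call.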

Using Standard DynamoDB Client with Unmarshalling

When interacting directly with DynamoDB using the standard AWS SDK, not only do you need to marshal your request data, but you also have to unmarshal the response data to convert it from DynamoDB’s format to a standard JSON format. Here’s an example of both marshalling a request and unmarshalling a response:

Standard DynamoDB Client:

const AWS = require('aws-sdk');
const dynamoDB = new AWS.DynamoDB();

const params = {
  TableName: 'YourTable',
  Key: {
    'YourPrimaryKey': { S: 'SomeValue' }
  }
};

dynamoDB.getItem(params, function (err, data) {
  if (err) {
    console.log(err, err.stack);
  } else {
    // Unmarshalling the raw response
    const item = AWS.DynamoDB.Converter.unmarshall(data.Item);
    console.log(item); // Standard JSON format
  }
});

In this example, after receiving the raw response from DynamoDB, we use the AWS.DynamoDB.Converter.unmarshall method to convert the data item into a standard JSON object.

Contrast with Using DynamoDB Document Client

Now, let’s see how the same operation looks when using the DynamoDB Document Client. It abstracts away the need for manual marshalling and unmarshalling.

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: 'YourTable',
  Key: { 'YourPrimaryKey': 'SomeValue' }
};

docClient.get(params, function (err, data) {
  if (err) {
    console.log(err, err.stack);
  } else {
    console.log(data.Item); // Already in standard JSON format
  }
});

With the Document Client, the Key is specified in a straightforward JSON format ({ 'YourPrimaryKey': 'SomeValue' }), and the response is automatically in a readable, standard JSON format. This greatly simplifies the code, making it more readable and less prone to errors related to data formatting.

CRUD + Batch Write + Batch Get

Centralizing your database interactions into a single, well-defined location in your codebase is a strategic decision that offers several long-term benefits, particularly when working with a service like DynamoDB. Let’s delve into the reasons why this approach is advantageous:

  1. Facilitates Cloud Provider Transition: If you ever decide to switch cloud providers or migrate to a different database system, having a centralized interaction point simplifies the process. You only need to update your database interaction logic in one place, rather than hunting down and modifying every direct call scattered across your codebase.
  2. Reusable Across Projects: By centralizing your database operations, you create a reusable component that can be easily imported into other projects. This not only saves time but also ensures consistency in how you interact with DynamoDB across different applications.
  3. Ease of Upgrade and Maintenance: When AWS releases new versions of their SDK (such as the move from V2 to V3), you only need to make adjustments in one place. This localized update reduces the risk of introducing bugs into your system and makes the upgrade process more manageable.
  4. Consistent Service Calls: When multiple functions or components of your application are calling the same database service, a centralized module ensures that all these calls are consistent and standardized. This uniformity is crucial for maintaining data integrity and application stability.
  5. Streamlined Testing and Reliability: Testing becomes much more straightforward when all database interactions are contained within a single module. You can rigorously test this module to ensure its reliability, and once it’s proven, you can trust it to perform correctly across your application. Any changes or bug fixes made in this module will be reflected wherever it’s used, improving the overall quality and maintainability of your code.
  6. Centralized Improvements and Enhancements: When you want to enhance your database interaction — say, by implementing more efficient query patterns or adding new features — you only need to modify one central module. This change will automatically propagate to every part of your application that uses the module, ensuring that improvements are uniformly applied.

In summary, centralizing your DynamoDB interactions in a single module not only makes your code cleaner and more organized but also significantly enhances the maintainability, scalability, and flexibility of your application. This approach lays a solid foundation for future development, testing, and potential migration or scaling scenarios.
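Point 1 above (facilitating a provider transition) can be sketched as a thin facade: callers depend on a tiny interface, and the backend is chosen in exactly one place. The names below (`createStore`, `inMemoryBackend`) are hypothetical, not part of this article’s module, and the methods are synchronous for brevity where the real helpers are async:

```javascript
// A minimal data-access facade: callers use get/put, and the backend
// (the DynamoDB helpers, an in-memory map for tests, another database)
// is swapped in one place. Names are illustrative.
const createStore = (backend) => ({
  get: (table, key) => backend.get(table, key),
  put: (table, item) => backend.put(table, item),
});

// An in-memory backend, handy for unit tests or local development.
const inMemoryBackend = () => {
  const tables = new Map();
  return {
    get: (table, key) => (tables.get(table) ?? new Map()).get(JSON.stringify(key)),
    put: (table, item) => {
      if (!tables.has(table)) tables.set(table, new Map());
      tables.get(table).set(JSON.stringify({ id: item.id }), item);
    },
  };
};

// Swapping to DynamoDB would mean passing a backend that delegates to
// dynamodbGetHelper / dynamodbPutHelper; no caller changes required.
const store = createStore(inMemoryBackend());
store.put('Users', { id: '123', name: 'Ada' });
console.log(store.get('Users', { id: '123' })); // { id: '123', name: 'Ada' }
```

This is the same idea the centralized helper module implements at a larger scale: every consumer talks to one seam, and only that seam knows about DynamoDB.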

The code

import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import {
  DynamoDBDocumentClient,
  GetCommand,
  PutCommand,
  UpdateCommand,
  QueryCommand,
  DeleteCommand,
  ScanCommand,
  BatchGetCommand,
  BatchWriteCommand
} from "@aws-sdk/lib-dynamodb";
import { MsaLogger as logger } from "../logger.js";

const client = new DynamoDBClient({});
const docClient = DynamoDBDocumentClient.from(client);

/**
 * Get an item from a table.
 * @param TableName - The name of the table.
 * @param Key - The key of the item to retrieve. {id: value}
 * @return {Promise<GetCommandOutput>}
 */
export const dynamodbGetHelper = async (TableName, Key) => {
  const params = {
    TableName,
    Key
  };
  logger.info(`dynamodbGetHelper: ${JSON.stringify(params)}`);
  return await docClient.send(new GetCommand(params));
};

/**
 * Scan a table.
 * @param TableName - The name of the table.
 * @return {Promise<ScanCommandOutput>}
 */
export const dynamodbScanHelper = async (TableName) => {
  const params = {
    TableName,
  };
  logger.info(`dynamodbScanHelper: ${JSON.stringify(params)}`);
  return await docClient.send(new ScanCommand(params));
};

/**
 * Update an item in a table.
 * @param TableName - The name of the table to update the item in.
 * @param Key - The key of the item to update. {id: value}
 * @param UpdateExpression - The update expression to use. "SET #attribute1 = :value1, #attribute2 = :value2, ..."
 * @param ExpressionAttributeNames - The expression attribute names to use. {"#attribute1": "attribute1", "#attribute2": "attribute2", ...}
 * @param ExpressionAttributeValues - The expression attribute values to use. {":value1": value1, ":value2": value2, ...}
 * @return {Promise<UpdateCommandOutput>}
 */
export const dynamodbUpdateItemHelper = async (TableName, Key, UpdateExpression, ExpressionAttributeNames, ExpressionAttributeValues) => {
  if (!(Key && UpdateExpression && ExpressionAttributeNames && ExpressionAttributeValues)) {
    const errorMessage = "Key, UpdateExpression, ExpressionAttributeNames, and ExpressionAttributeValues are required";
    logger.error(errorMessage);
    throw new Error(errorMessage);
  }
  const params = {
    TableName,
    Key,
    UpdateExpression,
    ExpressionAttributeNames,
    ExpressionAttributeValues,
  };
  logger.info(`dynamodbUpdateItemHelper: ${JSON.stringify(params)}`);
  // UpdateCommand (from lib-dynamodb) accepts plain JavaScript values;
  // the low-level UpdateItemCommand would require marshalled attributes.
  return await docClient.send(new UpdateCommand(params));
};

/**
 * Get an item from a table with a projection.
 * @param TableName - The name of the table.
 * @param Key - The key of the item to retrieve. {id: value}
 * @param ProjectionExpression - Comma-separated attributes to return. "attribute1, attribute2"
 * @return {Promise<GetCommandOutput>}
 */
export const dynamodbGetHelperWithProjection = async (TableName, Key, ProjectionExpression) => {
  const params = {
    TableName,
    Key,
    ProjectionExpression
  };
  logger.info(`dynamodbGetHelperWithProjection: ${JSON.stringify(params)}`);
  return await docClient.send(new GetCommand(params));
};

/**
 * Put an item in a table.
 * @param TableName - The name of the table.
 * @param Item - The item to write. {id: value, attribute1: value1, ...}
 * @return {Promise<PutCommandOutput>}
 */
export const dynamodbPutHelper = async (TableName, Item) => {
  const params = {
    TableName,
    Item
  };
  logger.info(`dynamodbPutHelper: ${JSON.stringify(params)}`);
  return await docClient.send(new PutCommand(params));
};

/**
 * Query a table, optionally through a secondary index.
 * @param TableName - The name of the table.
 * @param KeyConditionExpression - The key condition. "#id = :idVal"
 * @param ExpressionAttributeNames - {"#id": "id", ...}
 * @param ExpressionAttributeValues - {":idVal": value, ...}
 * @param IndexName - Optional secondary index to query.
 * @return {Promise<QueryCommandOutput>}
 */
export const dynamodbQueryHelper = async (TableName, KeyConditionExpression, ExpressionAttributeNames, ExpressionAttributeValues, IndexName) => {
  if (!(KeyConditionExpression && ExpressionAttributeValues && ExpressionAttributeNames)) {
    const errorMessage = "KeyConditionExpression, ExpressionAttributeNames, and ExpressionAttributeValues are required";
    logger.error(errorMessage);
    throw new Error(errorMessage);
  }

  const params = {
    TableName,
    KeyConditionExpression,
    ExpressionAttributeValues,
    ExpressionAttributeNames
  };
  if (IndexName) {
    params.IndexName = IndexName;
  }
  logger.info(`dynamodbQueryHelper: ${JSON.stringify(params)}`);
  return await docClient.send(new QueryCommand(params));
};

/**
 * Delete an item from a table.
 * @param TableName - The name of the table.
 * @param Key - The key of the item to delete. {id: value}
 * @return {Promise<DeleteCommandOutput>}
 */
export const dynamodbDeleteHelper = async (TableName, Key) => {
  const params = {
    TableName,
    Key
  };
  logger.info(`dynamodbDeleteHelper: ${JSON.stringify(params)}`);
  return await docClient.send(new DeleteCommand(params));
};

/**
 * Return an array of items from a table given a list of keys.
 * @param TableName - The name of the table to retrieve the items from.
 * @param Keys - The list of keys to retrieve the items for. [{id: value1}, {id: value2}, ...]
 * @param AttributesToGet - The list of attributes to retrieve for each item. ['attribute1', 'attribute2', ...]
 * @return [{id, attribute1, attribute2, ...}]
 */
export const dynamodbBatchGetHelper = async (TableName, Keys, AttributesToGet) => {
  // BatchGetItem accepts up to 100 keys per request; 25 keeps each
  // response comfortably under the 16 MB limit.
  const MAX_ITEMS_PER_BATCH = 25;
  const batches = [];
  for (let i = 0; i < Keys.length; i += MAX_ITEMS_PER_BATCH) {
    batches.push(Keys.slice(i, i + MAX_ITEMS_PER_BATCH));
  }

  // Alias every attribute with '#' so reserved words never collide
  // with the expression grammar. Same names for every batch, so build once.
  const ExpressionAttributeNames = AttributesToGet.reduce((acc, attr) => {
    acc[`#${attr}`] = attr;
    return acc;
  }, {});

  const responses = [];
  for (const batch of batches) {
    const params = {
      RequestItems: {
        [TableName]: {
          Keys: batch,
          ProjectionExpression: Object.keys(ExpressionAttributeNames).join(", "),
          ExpressionAttributeNames
        }
      }
    };
    logger.info(`dynamodbBatchGetHelper: ${JSON.stringify(params)}`);

    try {
      const response = await docClient.send(new BatchGetCommand(params));
      responses.push(...response.Responses[TableName]);
    } catch (error) {
      logger.error(`Error in dynamodbBatchGetHelper: ${error}`);
      throw error;
    }
  }

  return responses;
};

/**
 * Write a batch of items to a table.
 * Note: items reported back in UnprocessedItems are not retried here.
 * @param TableName - The name of the table to write the items to.
 * @param Items - The list of items to write. [{id: value1, attr: value2}, ...]
 */
export const dynamodbBatchWriteHelper = async (TableName, Items) => {
  logger.info(`dynamodbBatchWriteHelper: ${TableName}`);
  // BatchWriteItem accepts at most 25 request items per call.
  const MAX_ITEMS_PER_BATCH = 25;
  const writeBatches = [];

  for (let i = 0; i < Items.length; i += MAX_ITEMS_PER_BATCH) {
    const batch = Items.slice(i, i + MAX_ITEMS_PER_BATCH);
    writeBatches.push(batch.map(item => ({ PutRequest: { Item: item } })));
  }

  for (const batch of writeBatches) {
    const params = {
      RequestItems: {
        [TableName]: batch
      }
    };
    try {
      await docClient.send(new BatchWriteCommand(params));
    } catch (error) {
      logger.error(`Error in dynamodbBatchWriteHelper: ${error}`);
      throw error;
    }
  }
  logger.info(`Successfully written ${Items.length} items to ${TableName}`);
};

CRUD Interactions with the Helper Layer

Get Item: The dynamodbGetHelper function simplifies retrieving an item from a table. It takes the table name and key as parameters, reducing the need for manual data formatting.

// Usage of dynamodbGetHelper
const result = await dynamodbGetHelper('YourTable', { YourPrimaryKey: 'SomeValue' });
console.log(result.Item); // the retrieved item, as a plain JavaScript object

Scan Table: The dynamodbScanHelper is used for scanning a table. While scanning can be resource-intensive, this helper makes the process straightforward.

// Usage of dynamodbScanHelper
const scanResult = await dynamodbScanHelper('YourTable');
console.log(scanResult.Items); // array of items

Update Item: The dynamodbUpdateItemHelper facilitates updating items in a table, allowing for complex updates using expressions without dealing with the lower-level syntax.

// Usage of dynamodbUpdateItemHelper
const updateResponse = await dynamodbUpdateItemHelper('YourTable', { id: '123' }, 'SET #attr1 = :value1', { '#attr1': 'attribute1' }, { ':value1': 'newValue' });
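Writing `UpdateExpression`, `ExpressionAttributeNames`, and `ExpressionAttributeValues` by hand gets repetitive. As a sketch (hypothetical, not part of the helper module above), all three can be derived from a plain object of changes:

```javascript
// Hypothetical convenience: derive the three update parameters that
// dynamodbUpdateItemHelper expects from a plain object of changes.
const buildUpdateParams = (changes) => {
  const names = {};
  const values = {};
  const sets = [];
  for (const [attr, value] of Object.entries(changes)) {
    names[`#${attr}`] = attr;    // alias dodges reserved words
    values[`:${attr}`] = value;
    sets.push(`#${attr} = :${attr}`);
  }
  return {
    UpdateExpression: `SET ${sets.join(', ')}`,
    ExpressionAttributeNames: names,
    ExpressionAttributeValues: values,
  };
};

console.log(buildUpdateParams({ attribute1: 'newValue', count: 5 }));
// { UpdateExpression: 'SET #attribute1 = :attribute1, #count = :count',
//   ExpressionAttributeNames: { '#attribute1': 'attribute1', '#count': 'count' },
//   ExpressionAttributeValues: { ':attribute1': 'newValue', ':count': 5 } }
```

A wrapper like this only covers simple `SET` updates; anything involving `REMOVE`, `ADD`, or list operations still warrants a hand-written expression.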

Get Item with Projection: The dynamodbGetHelperWithProjection function retrieves specific attributes of an item, optimizing data retrieval and reducing read costs.

// Usage of dynamodbGetHelperWithProjection
const item = await dynamodbGetHelperWithProjection('YourTable', { id: '123' }, 'attribute1, attribute2');

Put Item: The dynamodbPutHelper makes adding a new item or replacing an existing one in a table an effortless task.

// Usage of dynamodbPutHelper
const putResult = await dynamodbPutHelper('YourTable', { id: '123', attribute1: 'value1' });

Query Table: The dynamodbQueryHelper is particularly useful for fetching items based on a specific condition, especially in tables with secondary indexes.

// Usage of dynamodbQueryHelper (the helper requires ExpressionAttributeNames)
const queryResults = await dynamodbQueryHelper('YourTable', '#id = :idVal', { '#id': 'id' }, { ':idVal': '123' }, 'YourIndexName');

Delete Item: With the dynamodbDeleteHelper, you can easily remove an item from a table by specifying its key.

// Usage of dynamodbDeleteHelper
const deleteResponse = await dynamodbDeleteHelper('YourTable', { id: '123' });

Batch Get: The dynamodbBatchGetHelper allows for retrieving multiple items in batches, optimizing the read operation.

// Usage of dynamodbBatchGetHelper
const batchGetResults = await dynamodbBatchGetHelper('YourTable', [{ id: '123' }, { id: '456' }], ['attribute1', 'attribute2']);
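The `reduce` inside `dynamodbBatchGetHelper` is worth isolating: it aliases every requested attribute with `#` so that reserved words like `name` or `status` never collide with DynamoDB’s expression grammar. Extracted as a standalone (illustrative) function:

```javascript
// Build a ProjectionExpression plus its ExpressionAttributeNames from a
// list of attributes, mirroring the reduce in dynamodbBatchGetHelper.
const buildProjection = (attributes) => {
  const ExpressionAttributeNames = attributes.reduce((acc, attr) => {
    acc[`#${attr}`] = attr; // '#name' -> 'name', safe even for reserved words
    return acc;
  }, {});
  return {
    ProjectionExpression: Object.keys(ExpressionAttributeNames).join(', '),
    ExpressionAttributeNames,
  };
};

console.log(buildProjection(['id', 'name', 'status']));
// { ProjectionExpression: '#id, #name, #status',
//   ExpressionAttributeNames: { '#id': 'id', '#name': 'name', '#status': 'status' } }
```

Without the aliases, a projection containing `name` or `status` would be rejected, since both are DynamoDB reserved words.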

Batch Write: Finally, the dynamodbBatchWriteHelper enables batch writing of items to a table, significantly reducing the number of network calls.

// Usage of dynamodbBatchWriteHelper
await dynamodbBatchWriteHelper('YourTable', [{ id: '123', attr: 'value' }, { id: '456', attr: 'value2' }]);
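Both batch helpers rely on the same slicing pattern, shown here as a standalone function. BatchWriteItem accepts at most 25 request items per call, which is why larger arrays are split before sending:

```javascript
// The chunking used by the batch helpers: split an array into
// fixed-size slices, one per BatchWriteItem / BatchGetItem call.
const chunk = (items, size) => {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
};

// 60 items become three calls: two full batches and one remainder.
const items = Array.from({ length: 60 }, (_, i) => ({ id: String(i) }));
console.log(chunk(items, 25).map((b) => b.length)); // [ 25, 25, 10 ]
```

Each slice is then wrapped in `PutRequest` objects and sent sequentially, exactly as `dynamodbBatchWriteHelper` does.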

These helper functions encapsulate the complexity of interacting with DynamoDB, allowing you to focus more on the business logic rather than the intricacies of the database operations.

Conclusion

As we wrap up our journey through the intricacies of DynamoDB and the efficiencies of the Document Client, let’s pause and reflect on the bigger picture. Embracing best practices in our coding endeavors isn’t just about writing code that works; it’s about crafting code that endures. By adopting approaches like centralized database interaction, we’re not just streamlining our current projects — we’re laying down blueprints for future innovation.

High-quality code is more than just functional; it’s a testament to our commitment to excellence and sustainability in our digital creations. Every time we choose long-lasting solutions over quick fixes, we’re not just solving a problem — we’re setting a standard. This mindset is essential for any developer aiming to leave a mark in the tech world.

I invite you to join me in this ongoing pursuit of excellence. Whether it’s exploring the depths of serverless architecture, mastering the nuances of AWS services, or any other area of technology, let’s continue this journey together. Check out my other posts for more insights and strategies to elevate your coding skills and project quality. Together, let’s build robust, scalable, and efficient systems that stand the test of time. Remember, in the world of coding, we’re not just writing lines of code; we’re architecting the future.

Other Articles You May Be Interested In:

New Projects or Consultancy

Advanced Serverless Techniques

Mastering Serverless Series
