Automating an E-Commerce Platform Deployment and Monitoring on AWS, Part 1: Set Up Data Storage

Rachel Murphy
8 min read · Oct 25, 2023


Welcome to Part 1 of our Automating an E-Commerce Platform Deployment and Monitoring on AWS series! In the main article, we walked through the prerequisites and the list of steps. In this part, we configure and implement the database using AWS DynamoDB and store and link product images in AWS S3.

While it would be impractical to detail every piece of code here, you can find the complete project in the GitHub repository.

Prerequisites:

As you navigate through the series, it’s essential to have the foundational elements we discussed in the main article already in place. Make sure you’ve covered these prerequisites for a smooth journey:

  • AWS Account: Make sure you have an AWS account with valid access and secret keys for the AWS CLI.
  • Node.js and npm: Install the latest versions of Node.js and npm from the official Node.js website.
  • AWS CLI: Get the AWS Command Line Interface installed and configured.
  • Terraform: Have Terraform ready to manage infrastructure as code.
  • Docker: Set up Docker to harness the power of containerization.
  • Project Initialization: Create and initialize a new project using npm and Git.
  • Backend Dependencies: Add the following dependencies to your project’s package.json file in the root directory and run npm install.
"dependencies": {
"@aws-sdk/client-dynamodb": "^3.423.0",
"@aws-sdk/client-s3": "^3.431.0",
"@aws-sdk/lib-dynamodb": "latest",
"aws-sdk": "^2.1396.0",
"axios": "^1.4.0",
"cors": "^2.8.5",
"express": "^4.18.2"
}

Steps:

  1. Set Up config.js for AWS credentials.
  2. Initialize DynamoDB Connection.
  3. Set Up initial data (initialData.json) to upload to S3.
  4. Set Up dynamoDbUtils.js for DynamoDB and S3 Utilities.
    This module provides low-level functions for CRUD operations with DynamoDB and a utility for updating a local JSON with S3 URLs.
  5. Set Up ‘Product-Specific’ Operations in productDb.js.
    This is a higher-level module focused on the “product” entity. It uses the utilities from dynamoDbUtils.js to provide functionality specific to the "products" domain in the application.
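
By the end of this part, the project layout will look roughly like this (the root folder name is arbitrary; the structure reflects the files created in the steps below):

ecom-app/
├── config/
│   └── config.js
├── data/
│   ├── initialData.json
│   └── populateData.js
├── db/
│   ├── index.js
│   ├── dynamoDbUtils.js
│   └── productDb.js
├── e_com_products/   # product images
└── package.json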

Step 1: Set Up config.js for AWS credentials.

Within a config folder, create and configure config.js as follows:

module.exports = {
  // DynamoDB table name
  aws_table_name: "ecom-app-products-table",

  // Local DynamoDB configuration for development
  aws_local_config: {
    region: "local",
    endpoint: "http://localhost:8000",
  },

  // AWS remote service configuration
  aws_remote_config: {
    accessKeyId: process.env.AWS_ACCESS_KEY, // AWS Access Key
    secretAccessKey: process.env.AWS_SECRET_KEY, // AWS Secret Key
    region: "us-east-1", // AWS region
  },
};

I have hardcoded the table name in the config because the table is created using Terraform, which we’ll cover in the next article.
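
Note that aws_local_config isn’t used by the code in this part; it’s there in case you want to develop against DynamoDB Local. One hypothetical way to switch between the two (not how the repository wires it) would be:

// Hypothetical: pick the local endpoint during development
const config = require("./config/config");

const dbConfig =
  process.env.NODE_ENV === "development"
    ? config.aws_local_config
    : config.aws_remote_config;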

Step 2: Initialize DynamoDB Connection.

Before setting up our dataset and working with S3, we need to initialize our connection to AWS DynamoDB.

  1. DynamoDB Client Configuration: We’ll use the AWS SDK to create a DynamoDB client, configured using the remote AWS settings we defined earlier in the config.js file.
  2. Create db/index.js: Create a folder named db in your project root. Inside, add an index.js file with the following content:
// db/index.js

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const config = require("../config/config");

// Initialize the DynamoDB client with our remote AWS configuration
const dynamoDb = new DynamoDBClient(config.aws_remote_config);

module.exports = dynamoDb;

Now, with this in place, we can interact with DynamoDB in subsequent steps.
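
Before moving on, you can optionally sanity-check the connection. The snippet below is a hypothetical helper, not part of the repository; it assumes your AWS_ACCESS_KEY and AWS_SECRET_KEY environment variables are set and simply lists your tables:

// sanityCheck.js (hypothetical, for verification only)
const { ListTablesCommand } = require("@aws-sdk/client-dynamodb");
const dynamoDb = require("./db");

dynamoDb
  .send(new ListTablesCommand({}))
  .then((data) => console.log("Connected. Tables:", data.TableNames))
  .catch((error) => console.error("DynamoDB connection failed:", error));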

Step 3: Set Up initial data (initialData.json) to upload to S3.

Next, let’s create a dataset and add associated images.

Creating the Image Folder:

Create a folder named e_com_products. Inside this folder, add images of the products you want to upload to Amazon S3. Name each image file after its product name (e.g., SoapVarietyBundle.jpg), since the population script matches S3 object keys to product names.

Setting Up Product Data:

Create a JSON file to represent the products you want to add to the products table. Each product entry should include fields like productId, name, price, description, and imageUrl. Here's a sample structure for initialData.json:

[
  ...
  {
    "productId": "11",
    "name": "SoapVarietyBundle",
    "price": 59.99,
    "description": "A variety of soaps made with all-natural oils and bases"
  },
  ...
]

Note: We will cover how to upload this data as multiple files into S3 using Terraform in the next article.

Populating the DynamoDB Table:

Script Creation: We use populateData.js to load data from initialData.json.

DynamoDB and S3 Setup:

  • We connect to DynamoDB and our S3 bucket.
  • The script fetches image URLs for products from S3.

Data Handling:

  • Each product gets its corresponding image URL.
  • Products without images are flagged.
  • Data is transformed for batch-writing into DynamoDB.

Chunking and Writing:

  • We split products into chunks due to DynamoDB’s 25-item limit.
  • Each chunk is logged and written to DynamoDB, with errors handled and logged.
// data/populateData.js

const dynamoDb = require("../db/index");
const { S3Client, ListObjectsCommand } = require("@aws-sdk/client-s3");
const { BatchWriteItemCommand } = require("@aws-sdk/client-dynamodb");
const products = require("./initialData.json");

const tableName = "ecom-app-products-table";
const s3Client = new S3Client({ region: "us-east-1" });
const bucketName = "ecom-app-products-s3-bucket";

// Utility function to break an array into chunks of a specified size
const chunk = (array, size) => {
  return Array.from({ length: Math.ceil(array.length / size) }, (v, i) =>
    array.slice(i * size, i * size + size)
  );
};

// Function to retrieve image URLs from the S3 bucket based on object keys
const getS3ImageUrls = async () => {
  const command = new ListObjectsCommand({ Bucket: bucketName });
  const data = await s3Client.send(command);

  const urls = {};
  if (data.Contents && data.Contents.length > 0) {
    data.Contents.forEach((content) => {
      const productName = content.Key.split(".")[0];
      urls[productName] = `https://${bucketName}.s3.amazonaws.com/${content.Key}`;
    });
  }

  return urls;
};

// Main function to populate product data in DynamoDB
const populateData = async () => {
  const s3ImageUrls = await getS3ImageUrls(); // Fetch the image URLs from S3

  // Associate each product with its respective image URL
  const productsWithUrls = products.map((product) => ({
    ...product,
    imageUrl: s3ImageUrls[product.name] || null,
  }));

  // Log warnings for products without associated images
  productsWithUrls.forEach((product) => {
    if (!product.imageUrl) {
      console.warn(`Image URL not found for product: ${product.name}`);
    }
  });

  // Convert the product data into the format required for a DynamoDB batch write
  const writeRequests = productsWithUrls.map((product) => ({
    PutRequest: {
      Item: {
        productId: { S: product.productId },
        name: { S: product.name },
        price: { N: product.price.toString() },
        description: { S: product.description },
        imageUrl: { S: product.imageUrl || "default-image-url" },
      },
    },
  }));

  // Split the write requests into batches (DynamoDB batch write has a max limit of 25 items)
  const chunks = chunk(writeRequests, 25);

  // Process each batch of write requests
  for (let batch of chunks) {
    const params = {
      RequestItems: {
        [tableName]: batch,
      },
    };

    // Log each batch before sending to DynamoDB
    console.log("Batch data to be sent to DynamoDB:", JSON.stringify(params));
    const command = new BatchWriteItemCommand(params);
    try {
      await dynamoDb.send(command);
      console.log("Batch populated successfully");
    } catch (error) {
      console.error("Error populating batch:", error);
      throw error;
    }
  }
};

populateData().catch((error) => {
  console.error("Error occurred while populating data:", error);
  process.exit(1);
});

Batch writes ensure efficient data loading into DynamoDB.
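
One caveat: BatchWriteItemCommand can return UnprocessedItems when DynamoDB throttles part of a batch, and the script above treats each successful send as final. A minimal retry sketch, reusing the names from the script (a hypothetical helper, not in the repository; production code would add exponential backoff and a retry cap):

// Hypothetical helper: resend any items DynamoDB reports as unprocessed
const sendBatchWithRetry = async (params) => {
  let request = params;
  while (true) {
    const result = await dynamoDb.send(new BatchWriteItemCommand(request));
    const unprocessed = result.UnprocessedItems || {};
    if (Object.keys(unprocessed).length === 0) break;
    request = { RequestItems: unprocessed }; // retry only the leftovers
  }
};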

Step 4: Set Up dynamoDbUtils.js for DynamoDB and S3 Utilities.

In this step, we’ll establish a utility module named dynamoDbUtils.js. The module serves two main purposes:

  1. DynamoDB Interactions: The functions putItem, getItem, deleteItem, updateItem, queryItem, and scanItems handle the basic CRUD operations for DynamoDB. Each function accepts parameters specific to its operation, manages the interaction with DynamoDB, and handles potential errors.
  2. S3 URL Management: The storeS3Urls function stores S3 URLs and has the following flow:
    1. Fetches a list of objects from a designated S3 bucket.
    2. Maps through these objects to generate a list of URLs.
    3. Reads a local JSON file (containing product data) and updates the product entries with the corresponding S3 URLs.
    4. Writes the updated product data back to the JSON file.
// db/dynamoDbUtils.js

const fs = require("fs");
const dynamoDb = require("./index");
const {
  DynamoDBDocumentClient,
  PutCommand,
  GetCommand,
} = require("@aws-sdk/lib-dynamodb");
const { S3Client, ListObjectsV2Command } = require("@aws-sdk/client-s3");

// Wrap the base client so we can read and write plain JavaScript objects
const docClient = DynamoDBDocumentClient.from(dynamoDb);

const bucketName = "ecom-app-products-s3-bucket";
const jsonFilePath = "../data/initialData.json";

// Get S3 objects from the bucket and write their URLs into the local JSON file
const storeS3Urls = async (bucketName, jsonFilePath) => {
  const s3 = new S3Client({ region: "us-east-1" });
  const command = new ListObjectsV2Command({ Bucket: bucketName });
  const objects = await s3.send(command);
  const urls = objects.Contents.map((object) => ({
    url: `https://${bucketName}.s3.amazonaws.com/${object.Key}`,
    productId: object.Key.split(".")[0],
  }));

  // Read the JSON file and parse its contents
  const data = fs.readFileSync(jsonFilePath, "utf8");
  const products = JSON.parse(data);

  // Add the S3 URLs to the matching product objects
  products.forEach((product) => {
    const matchingUrl = urls.find((url) => url.productId === product.productId);
    if (matchingUrl) {
      product.imageUrl = matchingUrl.url;
    }
  });

  // Write the updated JSON file
  const updatedData = JSON.stringify(products, null, 2);
  fs.writeFileSync(jsonFilePath, updatedData, "utf8");

  console.log(`S3 URLs added to ${jsonFilePath}`);
};

// Call the function to add S3 URLs into the JSON file
storeS3Urls(bucketName, jsonFilePath).catch((error) =>
  console.error("Error occurred while storing S3 URLs:", error)
);

// Insert an item into the Product table
const putItem = async (params, errorMessage) => {
  try {
    return await docClient.send(new PutCommand(params));
  } catch (error) {
    console.error(`${errorMessage}: ${error}`);
    throw error;
  }
};

// Get an item from the Product table
const getItem = async (params, errorMessage) => {
  try {
    return await docClient.send(new GetCommand(params));
  } catch (error) {
    console.error(`${errorMessage}: ${error}`);
    throw error;
  }
};

// Delete items from the Product table
...
};
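
The remaining helpers (deleteItem, updateItem, queryItem, and scanItems) are elided above and follow the same pattern as putItem and getItem. As a sketch (assuming DeleteCommand is imported from @aws-sdk/lib-dynamodb alongside the other commands; see the repository for the actual code), deleteItem would look like:

// Sketch only: mirrors putItem/getItem using the same document client
const deleteItem = async (params, errorMessage) => {
  try {
    return await docClient.send(new DeleteCommand(params));
  } catch (error) {
    console.error(`${errorMessage}: ${error}`);
    throw error;
  }
};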

With this setup, we bridge our application to AWS DynamoDB and S3, keeping all low-level interactions in one place.

Step 5: Set Up ‘Product-Specific’ Operations in productDb.js

This module focuses on the business logic specific to our application. It leverages the utilities from dynamoDbUtils.js to provide functionality strictly related to the "products" domain — these include addProduct, getProduct, deleteProduct, updateProduct, queryProduct, and scanProducts.

// db/productDb.js

const {
  putItem,
  getItem,
  deleteItem,
  updateItem,
  queryItem,
  scanItems,
} = require("./dynamoDbUtils");

// Add an item to the Product Table
const addProduct = async (productId, name, price, description, imageUrl) => {
  const params = {
    TableName: "ecom-app-products-table",
    Item: {
      productId: productId,
      name: name,
      price: price,
      description: description,
      imageUrl: imageUrl,
    },
  };
  return putItem(
    params,
    "Error occurred while inserting an item into products"
  );
};

// Get an item from the Product Table
const getProduct = async (productId) => {
  const params = {
    TableName: "ecom-app-products-table",
    Key: {
      productId: productId,
    },
  };
  return getItem(params, "Error occurred while getting an item from products");
};

// Delete an item from the Product Table
const deleteProduct = async (productId) => {
  const params = {
    TableName: "ecom-app-products-table",
    Key: {
      productId: productId,
    },
  };
  return deleteItem(
    params,
    "Error occurred while deleting an item from products"
  );
};

// Update an item in the Product Table
...
};

After this step, we’ll have a dedicated module for product operations within DynamoDB.
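
To preview how this module gets consumed, here is an illustrative Express route built on getProduct (hypothetical wiring that assumes productDb.js exports its helpers; the real backend routes come in the next article):

// Illustrative only: exposing getProduct through an Express endpoint
const express = require("express");
const { getProduct } = require("./db/productDb");

const app = express();

app.get("/products/:id", async (req, res) => {
  try {
    const result = await getProduct(req.params.id);
    if (!result.Item) {
      return res.status(404).json({ error: "Product not found" });
    }
    res.json(result.Item);
  } catch (error) {
    res.status(500).json({ error: "Failed to fetch product" });
  }
});

app.listen(3000, () => console.log("API listening on port 3000"));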

In our next article, we’ll develop the e-commerce platform using Node.js for the backend and React.js for the frontend.
