File Uploader CLI with Node.js, TypeScript, and AWS S3 [Part 1/2]
An interactive CLI application that helps upload files and folders to AWS S3
Introduction
Hi folks, let's write some code today!
Sometimes, at the end of a working day, we want to upload our working documents to cloud storage such as Google Drive or OneDrive, and doing so is as easy as dragging and dropping files or folders onto those platforms.
What about AWS S3? AWS already provides an excellent CLI package, aws-cli, for uploading files to S3, so why do we still need this application?
aws s3 sync <local_folder> s3://<bucket_name>/<remote_folder> [--options]
The above command requires the bucket and folder to already exist before we upload; otherwise we have to create them beforehand, which is annoying when you are already late for dinner.
On top of that, we have to remember the AWS CLI commands to create the bucket and upload the files. Developers do not like memorizing everything, so a fully INTERACTIVE CLI application is a nicer solution for this situation.
Here is a sneak peek of the finished application. If you find this interesting, let’s build it.
Getting Started
1. Project initialization
First, let's initialize the Node.js application with TypeScript.
- Create a new directory for your project and navigate to it in your terminal, then run
npm init -y
(feel free to add additional information if you want to).
- Install all the dependency packages that we need with the commands
npm install aws-sdk dotenv inquirer chalk commander
npm install --save-dev typescript ts-node @types/inquirer @types/node
- Add a start script to package.json and set "type" to "module", as we will use import statements inside Node.js
// package.json
{
...
"scripts": {
"start": "ts-node --esm src/index.ts"
},
"type": "module",
...
}
- Create a tsconfig.json file with basic settings.
{
"compilerOptions": {
"target": "es6",
"module": "NodeNext",
"outDir": "dist",
"strict": true,
"esModuleInterop": true
},
"include": ["src/**/*"]
}
- Create a src directory in the root of your project and add a TypeScript file src/index.ts to it.
// index.ts
const message: string = "Hello world!";
console.log(message);
- Try to run the project with npm start from the command line; if the console outputs Hello world! without any errors, you are good to go to the next step.
2. Prepare AWS IAM
It would be too lengthy to cover creating an AWS IAM user here, so I will attach a link to the official AWS documentation on how to create one. Make sure you grant enough permissions to list buckets, create buckets, list objects, and put objects.
After creating the IAM user, create a new env file named .env in the code repository. Remember to add this file to .gitignore if you intend to push the project to any public source code management tool such as GitHub, GitLab, etc.
AWS_ACCESS_KEY=<Your-Access-Key>
AWS_SECRET_KEY=<Your-Secret-Key>
AWS_REGION=<Your-Intended-Region>
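The article's code does not validate these variables, so a typo in .env only surfaces later as an opaque AWS authentication error. A small, hypothetical helper (not part of the article's code; the name `requireEnv` is mine) can fail fast instead:

```typescript
// Hypothetical helper: throw early if a required variable is missing,
// rather than failing later inside an AWS call.
type Env = Record<string, string | undefined>;

function requireEnv(env: Env, key: string): string {
  const value = env[key];
  if (!value) {
    throw new Error(`Missing required environment variable: ${key}`);
  }
  return value;
}

// Example usage right after dotenv.config():
// const accessKey = requireEnv(process.env, "AWS_ACCESS_KEY");
```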
3. Setup AWS interface and helper functions
Create a separate file named aws.ts inside the src folder, as below, to configure AWS with the credentials and export all the functions that we need.
import fs from "fs";
import dotenv from "dotenv";
import AWS, { AWSError } from "aws-sdk";
import chalk from "chalk";
dotenv.config();
AWS.config.update({
accessKeyId: process.env.AWS_ACCESS_KEY,
secretAccessKey: process.env.AWS_SECRET_KEY,
region: process.env.AWS_REGION,
});
const s3 = new AWS.S3();
// List all buckets in S3
export const listBucket = () => {
return new Promise<string[]>((resolve, reject) => {
s3.listBuckets((err: AWSError, data: AWS.S3.ListBucketsOutput) => {
if (err || !data.Buckets) {
return resolve([]);
}
const buckets: string[] = [];
data.Buckets.forEach((bucket) => {
if (bucket.Name) {
buckets.push(bucket.Name);
}
});
resolve(buckets);
});
});
};
// List all folders of a bucket in S3
export const listFoldersOfBucket = (
bucketName: string,
prefix: string
): Promise<string[]> => {
return new Promise<string[]>((resolve, reject) => {
s3.listObjectsV2(
{ Bucket: bucketName, Prefix: prefix, Delimiter: "/" },
(err: AWSError, data: AWS.S3.ListObjectsV2Output) => {
if (err || !data.CommonPrefixes) {
return resolve([]);
}
const folders: string[] = [];
data.CommonPrefixes.forEach((folder) => {
if (folder.Prefix) {
folders.push(folder.Prefix);
}
});
resolve(folders);
}
);
});
};
// Upload a file to S3
export const putObject = (
bucketName: string,
fileName: string,
s3UploadingPath: string
): Promise<{ error: number | string }> => {
let awsFilePath = s3UploadingPath + fileName;
if (fileName.includes("/")) {
awsFilePath =
s3UploadingPath + fileName.slice(fileName.lastIndexOf("/") + 1);
}
return new Promise<{ error: number | string }>((resolve) => {
// Read the file content from disk
let fileContent;
try {
fileContent = fs.readFileSync(fileName);
} catch (error) {
console.log(chalk.red("Error reading file: " + fileName));
// Resolve with an error instead of returning early, so the promise
// never hangs unresolved when the file cannot be read
return resolve({ error: "Error reading file: " + fileName });
}
// Uploading logic
s3.putObject(
{ Bucket: bucketName, Key: awsFilePath, Body: fileContent },
(err: AWSError) => {
if (err) {
console.log(chalk.red(`Unsuccessfully uploaded ${fileName}`));
return resolve({ error: err.message });
} else {
console.log(
chalk.green(`Successfully uploaded ${fileName} to ${awsFilePath}`)
);
return resolve({ error: 0 });
}
}
);
});
};
The exported functions:
- listBucket: lists all the buckets on S3
- listFoldersOfBucket: lists all folders of a bucket in S3
- putObject: uploads a file to S3
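One detail worth calling out: putObject does not upload under the local path verbatim; it strips the local directory part and keeps only the file's base name, prefixed by the S3 folder path. Here is that key derivation isolated as a standalone sketch (the helper name `deriveS3Key` is mine, not in aws.ts):

```typescript
// Sketch of the key derivation inside putObject: keep only the file's
// base name and prepend the S3 folder prefix.
function deriveS3Key(localPath: string, s3Prefix: string): string {
  const baseName = localPath.includes("/")
    ? localPath.slice(localPath.lastIndexOf("/") + 1)
    : localPath;
  return s3Prefix + baseName;
}

console.log(deriveS3Key("./docs/report.pdf", "work/2024/")); // work/2024/report.pdf
```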
Replace all content in the index.ts file with the code block below and run npm start to check that the program works so far.
import { listBucket } from "./aws.js";
listBucket().then(console.log);
Troubleshoot:
- A TypeScript type error may prevent you from using the import statement; make sure you followed section 1, where we set up package.json and tsconfig.json to use module imports.
- You may face an insufficient-permission error with the IAM user you just created; make sure you give that user enough permissions.
4. Prepare the prompt utility with inquirer and chalk to make the CLI more interactive
In the src directory, create another file named inquirer.ts that contains all the logic for prompting the user to select buckets and folders.
import inquirer from "inquirer";
import chalk from "chalk";
export const NEW_BUCKET_CHOICE = "New bucket";
export const generateBucketQuestion = (choices: string[]) => [
{
type: "list",
name: "bucket",
message: chalk.green("Please select which bucket?"),
choices: [...choices, NEW_BUCKET_CHOICE],
},
];
export const CURRENT_FOLDER_CHOICE = "Current folder";
export const NEW_FOLDER_CHOICE = "New folder";
export const generateFolderQuestion = (choices: string[]) => [
{
type: "list",
name: "folder",
message: chalk.green("Please select which folder?"),
choices: [CURRENT_FOLDER_CHOICE, ...choices, NEW_FOLDER_CHOICE],
},
];
// Prompt asking for bucket choice
export const promptBucketChoiceQuestion = <T>(
choices: string[]
): Promise<T> => {
return new Promise<T>((resolve) => {
inquirer.prompt(generateBucketQuestion(choices)).then(resolve);
});
};
// Prompt asking for folder choice
export const promptFolderChoiceQuestion = (
choices: string[]
): Promise<{ folder: string }> => {
return new Promise<{ folder: string }>((resolve) => {
inquirer.prompt(generateFolderQuestion(choices)).then(resolve);
});
};
- promptBucketChoiceQuestion: prompts the user to select which bucket to upload to.
- promptFolderChoiceQuestion: prompts the user to select which folder to upload to.
5. Prepare the logic for command line argument parsing
This part lets us pass arguments from the command line to indicate which file we want to upload to S3, with the help of the npm package commander (https://www.npmjs.com/package/commander).
Replace all content in the index.ts file with the content below.
// index.ts
import { program } from "commander";
function parseArgumentsCLI() {
program
.argument("<filepath>", "Path to the file to process")
.parse(process.argv);
const filepath = program.args[0];
return { filepath };
}
async function main() {
const { filepath } = parseArgumentsCLI();
console.log("File path " + filepath);
}
main();
Try to run the CLI application with npm start -- ./hello-world.txt. The console will output File path ./hello-world.txt
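Under the hood, commander reads the positional argument from process.argv, whose first two entries are the node binary and the script path. A minimal, dependency-free sketch of the same parsing (the function name `extractFilepath` is mine, not commander's API):

```typescript
// Minimal sketch of what parseArgumentsCLI extracts: the first
// positional argument after [node binary, script path] in process.argv.
function extractFilepath(argv: string[]): string {
  const args = argv.slice(2); // drop the node binary and the script path
  if (args.length === 0) {
    throw new Error("missing required argument 'filepath'");
  }
  return args[0];
}

console.log(extractFilepath(["node", "src/index.ts", "./hello-world.txt"])); // ./hello-world.txt
```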
6. Integrate the core logic to upload a file to AWS S3
Modify the content of index.ts
as below.
import chalk from "chalk";
import { listBucket, listFoldersOfBucket, putObject } from "./aws.js";
import {
promptBucketChoiceQuestion,
promptFolderChoiceQuestion,
NEW_BUCKET_CHOICE,
CURRENT_FOLDER_CHOICE,
NEW_FOLDER_CHOICE,
} from "./inquirer.js";
import { program } from "commander";
function parseArgumentsCLI() {
program
.argument("<filepath>", "Path to the file to process")
.parse(process.argv);
const filepath = program.args[0];
return { filepath };
}
async function main() {
let selectedBucket: string = "";
let selectedFolder: string = "";
let prefix = "";
const { filepath } = parseArgumentsCLI();
const allBuckets = await listBucket();
const { bucket } = await promptBucketChoiceQuestion<{ bucket: string }>(
allBuckets
);
if (bucket === NEW_BUCKET_CHOICE) {
// TODO: Create new bucket
} else {
selectedBucket = bucket;
}
// The user might drill down into sub-folders, so keep prompting in a loop
while (true) {
const subFolderList: string[] = await listFoldersOfBucket(selectedBucket, prefix);
const { folder } = await promptFolderChoiceQuestion(
subFolderList
.map((folder) => (!!prefix ? folder.replace(prefix, "") : folder))
.map((folder) => folder.replace("/", ""))
);
if (folder === CURRENT_FOLDER_CHOICE) {
selectedFolder = prefix;
break;
} else if (folder === NEW_FOLDER_CHOICE) {
// TODO: Prompt to ask new folder name
break;
} else {
prefix += folder + "/";
}
}
// putObject derives the final S3 key by appending the file's base name
// to the folder prefix, so we pass the selected folder (not a full key);
// it logs success or failure itself.
await putObject(selectedBucket, filepath, selectedFolder);
}
main();
Here is a walkthrough of the logic inside the main function:
- First, it gets the filepath from the command line parser commander.
- Then, it fetches all the buckets that exist in AWS S3 and passes them as the bucket choices for the user to select from.
- After a bucket is selected (we will work on creating a new bucket in Part 2), it fetches all the folders inside the bucket and displays them for selection.
- We can select the current folder to upload to, drill down into a sub-folder, or even create a new folder on the fly while uploading (we will work on this in Part 2).
- Selecting Current folder confirms that this is the place we want to upload to, and the upload to S3 is performed.
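The folder loop relies on S3 returning CommonPrefixes such as photos/2024/; before they are shown to the user, the current prefix and the trailing slash are stripped. That transformation, isolated as a pure function (the name `toDisplayNames` is mine, not in the article's code):

```typescript
// Sketch of the display-name mapping used in the folder loop:
// "photos/2024/" with prefix "photos/" becomes "2024".
function toDisplayNames(commonPrefixes: string[], prefix: string): string[] {
  return commonPrefixes
    .map((p) => (prefix ? p.replace(prefix, "") : p)) // strip current prefix
    .map((p) => p.replace("/", "")); // strip the trailing slash
}

console.log(toDisplayNames(["photos/2024/", "photos/raw/"], "photos/")); // [ '2024', 'raw' ]
```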
Wow, if you have reached this part, congratulations, you have done a great job today. It has been a long road to finally being able to test part of this application.
We still have a few things left to work on, such as uploading a folder instead of a single file, creating a new folder while uploading, and even creating a new bucket and uploading files/folders to the newly created bucket.
If you find my blog interesting, please consider following it for more updates and insights at Tech with Harry.