Navigating the Cloud Resume Challenge

Yonathan Amare
4 min read · Dec 6, 2023


Cloud Resume Architecture Diagram

Hi everyone. I’m Yonathan Amare, currently a Senior Information Systems and Business Analytics student at Loyola Marymount University in Los Angeles! This is my Cloud Resume story. I’ll do a brief overview of the challenge, explore the different services I used, and highlight the obstacles I overcame.

My Cloud Resume: yonathanamare.com

What is this challenge?

The Cloud Resume Challenge is a comprehensive project for demonstrating practical cloud skills. The goal is to build your own website that serves as a personalized resume and showcases your abilities. You construct the site with HTML, CSS, and JavaScript, then publish it on the internet using services from Amazon Web Services (AWS). Along the way, you learn to secure the site and optimize it for speed by leveraging AWS services such as S3, CloudFront, DynamoDB, and Lambda. It’s a complex yet enriching project that deepens your knowledge of building websites and using cloud tools for professional purposes. Now let’s walk through each step I took to get my website up and securely running.

HTML and CSS

The first step was writing the frontend code in a text editor. I used Visual Studio Code to write my HTML and CSS files. This laid the foundation for the rest of the project, since the HTML would later hook into other parts of the stack (for example, the JavaScript visitor-counter function).

Deploying With Amazon S3

S3 stands for “Simple Storage Service”. This AWS service let me store objects (files) in buckets and serve them as a static website. I learned how to create a bucket, upload files into it, organize them into folders, and enable public static website hosting with a bucket policy built using the AWS policy generator. This worked, but the website needed to reach users faster.
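The S3 setup described above can be sketched with boto3. This is a minimal sketch, not my exact configuration: the bucket name is a placeholder, and the policy mirrors what the AWS policy generator produces for public static website hosting.

```python
import json


def public_read_policy(bucket_name):
    """Bucket policy allowing anyone to GET objects (public static hosting)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
        }],
    }


def enable_static_hosting(s3_client, bucket_name):
    """Attach the public-read policy and turn on website hosting."""
    s3_client.put_bucket_policy(
        Bucket=bucket_name,
        Policy=json.dumps(public_read_policy(bucket_name)),
    )
    s3_client.put_bucket_website(
        Bucket=bucket_name,
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "ErrorDocument": {"Key": "error.html"},
        },
    )
```

The same policy can be pasted into the bucket’s Permissions tab in the console, which is what the policy generator workflow amounts to.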

CloudFront

CloudFront is a content delivery network (CDN) that speeds up delivery of my website by caching it at edge locations worldwide, reducing latency for users in different geographical locations. Every time I uploaded an updated file to S3, I would invalidate the distribution cache on CloudFront so visitors received the latest version, keeping the site both fast and current.
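Here is one way that invalidation step can look in boto3; the distribution ID is a placeholder, and `"/*"` invalidates every cached path, which is the simplest option after re-uploading the whole site.

```python
import time


def invalidation_batch(paths):
    """Build the request body for a CloudFront invalidation."""
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        # CallerReference must be unique per request; a timestamp works.
        "CallerReference": str(time.time()),
    }


def invalidate_cache(cloudfront_client, distribution_id, paths=("/*",)):
    """Ask CloudFront to drop the cached copies of the given paths."""
    return cloudfront_client.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=invalidation_batch(paths),
    )
```

Invalidating `"/*"` is convenient but counts every path against the monthly free invalidation quota, so listing only the changed files is the thriftier choice on a busy site.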

HTTPS and AWS Certificate Manager

I used HTTPS to secure my website. In short, traffic to and from the site is encrypted, protecting it from interception and tampering. To use HTTPS, I obtained an SSL/TLS certificate from AWS Certificate Manager. With the certificate in place, I configured CloudFront to serve the site over HTTPS, encrypting the data in transit between visitors and my website.

DNS

For seamless user access, I had to link my domain to the CloudFront-hosted website. First, in Namecheap, I configured the DNS settings, setting custom nameservers to point at AWS Route 53. Next, in Route 53, I created a hosted zone for my domain and added an A record (an alias) pointing to the CloudFront distribution. With that in place, users who typed my domain were directed to the CloudFront-hosted website.
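The Route 53 step can be sketched as a change batch for `change_resource_record_sets`. The domain and distribution hostname below are placeholders; `Z2FDTNDATAQYW2` is the fixed hosted-zone ID AWS assigns to all CloudFront alias targets.

```python
# Fixed hosted-zone ID used for every CloudFront alias target.
CLOUDFRONT_ZONE_ID = "Z2FDTNDATAQYW2"


def alias_change_batch(domain, cloudfront_domain):
    """Build an UPSERT change batch aliasing the domain to CloudFront."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": domain,
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": CLOUDFRONT_ZONE_ID,
                    "DNSName": cloudfront_domain,
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    }
```

An alias A record behaves like a CNAME but is allowed at the zone apex, which is why it is the standard way to point a bare domain at a CloudFront distribution.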

Visitor Counter

To implement a live visitor counter on my resume webpage, I used JavaScript to call an AWS Lambda-based API, which in turn talked to an Amazon DynamoDB NoSQL database. The JavaScript code made API requests to update and retrieve the visitor count, keeping it current without ever communicating with the database directly. The API, a Python function running on AWS Lambda behind API Gateway, handled secure communication between the web app and the database.
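A minimal sketch of such a Lambda handler is below. The table, key, and attribute names are assumptions rather than my exact schema, and the table object is passed in so the logic can be exercised without AWS. DynamoDB’s atomic `ADD` update is what keeps concurrent visits from losing increments.

```python
import json


def handler(event, context, table):
    """Atomically increment the visit counter and return the new total."""
    result = table.update_item(
        Key={"id": "visits"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visit_count"])
    return {
        "statusCode": 200,
        # CORS header so the browser accepts the response from the site.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

In the real function, `table` would come from `boto3.resource("dynamodb").Table(...)` at module scope; injecting it as a parameter is just a testability convenience.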

CI/CD

I used GitHub for version control of my backend Python code, keeping changes structured and organized. Building on that, Continuous Integration/Continuous Deployment (CI/CD) practices automated updates to both the backend API and the frontend website: pushing a change triggers a pipeline that redeploys the affected pieces.
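One way such a pipeline can look, sketched as a GitHub Actions workflow (the branch, directory, bucket, and secret names are all placeholders, not my actual configuration):

```yaml
# .github/workflows/deploy-frontend.yml -- sync the site to S3, bust the cache
name: deploy-frontend
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - run: aws s3 sync ./website s3://my-resume-bucket --delete
      - run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CF_DISTRIBUTION_ID }}
          --paths "/*"
```

A second workflow covering the Lambda code would follow the same shape, deploying on pushes to the backend repository.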

Challenges

S3, Lambda, and API Gateway were each challenging in their own way. Starting with S3, understanding its nuances presented a learning curve; setting up the bucket, configuring policies, and enabling static website hosting seemed straightforward at first, yet intricacies emerged when connecting it with CloudFront.

Moving on to AWS Lambda and API Gateway introduced a whole new level of complexity. Developing the Lambda function in Python to interact with DynamoDB was particularly difficult. Handling permissions, configuring the API Gateway to invoke the Lambda function, and debugging the asynchronous nature of the API calls were like solving a puzzle without a clear picture.

One memorable hurdle was troubleshooting the visitor counter on my website. Utilizing the browser’s developer tools became a frequent routine. I delved into the console section to catch any errors arising from the JavaScript function responsible for updating the visitor count. The challenge was not only debugging the Lambda function but also ensuring seamless integration between the frontend and backend, overcoming errors that hindered the display of the visitor count.

I Learned How To Learn, Fast

Throughout this project, I discovered the art of rapid learning. The journey was less about finding ready-made solutions and more about navigating the labyrinth of website-hosting complexities. I learned to Google effectively, leaned on ChatGPT, and embraced trial and error as an essential tool. Each step felt like being thrown into the deep end, yet that pressure taught me to stay afloat. The experience underscored the importance of adaptability and flexible learning in today’s ever-evolving tech landscape.
