The Cloud Résumé Challenge with Azure — My Experience

So, we are roughly two weeks into the new year, but I’ll be a tad dramatic and emphatically tell you that this was the most difficult task I’ve had to encounter this year. Okay, I’ll drop the drama and say “it was challenging” instead because not only does that sound encouraging, but it also sheds a positive light on this task.

This article is a journal of my experience taking on the popular Cloud Résumé Challenge, where I highlight what I enjoyed, what I found most difficult, and my general approach. The nitty-gritty technical details will be explored in upcoming articles, so stay tuned; they'll provide in-depth tips to aid your journey, should you decide to embark on this challenge.

One main aim of The Cloud Résumé Challenge is to assist you in building cloud-centric skills backed by hands-on projects for your portfolio, which in turn increases your chances of being hired as an entry-level Cloud/DevOps Engineer. I look forward to taking on similar challenges and enriching my portfolio as time progresses.

Now, as you may have rightly guessed, this was inspired by Gwyn, whom I recently followed on Twitter in a bid to get my bearings and some guidance as I voyage into the world of cloud computing. She had partaken in the challenge and shared some information here on how to go about it.

Requirements to succeed at this challenge

I needed to have the following for a pretty smooth sail:

  • An Azure account: I signed up for a free Azure account to get the ball rolling.
  • A free GitHub account: I opened a GitHub account late last year so that came in handy.
  • A domain name: I had registered a domain name last year too, it was almost as if I knew I’d be needing it for this challenge.
  • A code editor: I already had VS Code. I would recommend VS Code to anyone looking to build their cloud résumé on Azure, as the editor comes in handy if you choose to write your Azure Functions locally.
  • The tenacity of a thousand hungry hyenas, and I am not even joking when I say this.
Tenacious hyenas encircling a grown lion.
See what I mean?

How did I approach this challenge?

I'll admit it, I wouldn’t applaud my initial approach and would totally have done most things differently if I knew any better. But isn’t that what learning is all about?

I started off by writing the actual résumé from scratch in HTML and CSS but quickly resorted to editing the same template as Gwyn used. I do not particularly fancy ‘frontend business’ and couldn’t wait to move on to the actual cloud stuff I was eager to get my hands dirty with.

Next, I needed to write JavaScript code that would increment and display my visitor count each time my page is visited or gets loaded. This meant more frontend business for me and required a ton of googling and research on the DOM, event listeners, etc. At some point, I got caught in a web of unnecessary information, and the only thing that kept counting was time as I was stuck, making no meaningful headway. Luckily, I stumbled on this article by Bolaji and this one by Rishab and soon had a better understanding of what was actually needed to make progress.

With this new knowledge, I adjusted my approach:

  • I proceeded to set up my Azure Cosmos DB database, which would store my visitor count as a JSON document.
  • I wrote an API with Azure Functions that would let my JavaScript code interact with the JSON document storing my count. I had to read a lot about APIs, Azure Cosmos DB triggers & bindings, and NoSQL databases to fully understand what I needed to do.
Screenshot of newly created Cosmos DB account.
Newly created Cosmos DB account.

To write my serverless function locally, I installed the Azure Account and Azure Functions extensions in VS Code. It was interesting figuring out how to create HTTP trigger functions directly from the code editor.

Creating functions directly from VS Code.

After successfully writing and testing my function (major thanks again to the articles by Rishab and Bolaji);

  • I updated my JavaScript code to call my function API whenever my page gets loaded and display the count on my homepage.
  • Next, I created a blob storage account and uploaded my HTML, CSS and JS files. I then enabled static website hosting on the storage account so that I could serve my files as a website.
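The frontend side of the first bullet can be sketched roughly like this. The function URL and the element id are placeholders I made up for illustration, not the ones from my actual site:

```javascript
// counter.js — hypothetical script loaded by the résumé page.
// API_URL and the "visitor-count" element id are placeholder assumptions.
const API_URL = "https://myresumefunc.azurewebsites.net/api/GetVisitorCount";

// Pure rendering helper, kept separate so it can be tested without a browser.
function renderCount(element, data) {
  element.textContent = `Visitors: ${data.count}`;
  return element.textContent;
}

// Fetch the count from the function API and render it into the page.
async function updateVisitorCount(doc = document) {
  const response = await fetch(API_URL);
  const data = await response.json(); // e.g. { count: 42 }
  renderCount(doc.getElementById("visitor-count"), data);
}
```

On the page itself, something like `window.addEventListener("DOMContentLoaded", () => updateVisitorCount())` would kick this off on every load.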

As I mentioned earlier, we’ll take a deeper dive into all of these processes in upcoming publications.

Screenshot of enabled static website.
Enabled static website on the blob.

Interestingly, and also annoyingly, I noticed I couldn't easily upload folders to my blob storage directly, just files. Eventually, setting up GitHub Actions for CI/CD ensured I wouldn't need to worry about my folder structure.

Next, I created a CDN endpoint and pointed my custom domain name to the website. This was the easiest part of the challenge for me, mostly because I was navigating pretty familiar territory. I soon noticed, however, that my visitor count wasn't working anymore, or so I thought. This was because of something called the CORS policy: browsers block cross-origin calls to my Azure Function from origins, like my newly created CDN endpoint and custom domain, that the Function App hasn't explicitly allowed. This was sorted out by specifying allowed origins in my Function App's CORS settings.
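For the deployed app I did this in the portal, but if you're testing your function locally with Azure Functions Core Tools, the equivalent allowance can be sketched in local.settings.json. The connection values below are placeholders for local development, not my real settings:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  },
  "Host": {
    "CORS": "https://resume.benny.com.ng",
    "CORSCredentials": false
  }
}
```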

Allowed origin points.

At this stage, I had my cloud résumé set up, visitor count was working, it was accessible via my custom domain and I could smell blood.

If you sss-meeeeeeeeelllllll
Me smelling blood.

Lastly, I needed to be able to modify my webpage without editing the files directly or uploading new files to the blob storage from the portal. This is where CI/CD with GitHub Actions comes in. This was surprisingly another tedious affair, as I mostly kept getting build errors when executing workflow runs. I had to refer to Microsoft documentation and troubleshoot errors for a couple of hours before deciphering the right commands to use for access to my Azure blob storage. And after sixteen failed workflow runs, I saw green on the seventeenth and screamed like a child, for victory was finally mine.
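A workflow along these lines is what eventually worked for me; the one below is a hedged sketch rather than my exact file. It assumes an `AZURE_CREDENTIALS` service principal secret with blob data access on the storage account, and `<storage-account-name>` is a placeholder:

```yaml
name: Deploy resume to blob storage

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Authenticate the Azure CLI using a service principal stored as a secret.
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      # upload-batch handles whole folders, sidestepping the
      # "files only" limitation of uploading through the portal.
      - name: Upload site files to the $web container
        run: |
          az storage blob upload-batch \
            --account-name <storage-account-name> \
            --auth-mode login \
            --destination '$web' \
            --source . \
            --overwrite
```

The key discovery for me was `az storage blob upload-batch`, which uploads an entire directory tree in one command.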

GitHub Actions
My troubleshooting journey.

This pretty much summarizes what I experienced taking on this challenge. I'll definitely be doing this on AWS next and then on GCP afterwards.

My goal this year is to become a proficient cloud engineer and get very familiar with cloud computing technologies. Thanks to this fantastic Learn to Cloud guide compiled by Gwyn and Rishab, my goal seems pretty attainable.

For some weird reason, this experience somewhat triggered an interest in web development, particularly for the backend. I will definitely be building stuff on the side to acquire adequate knowledge and hands-on experience writing code and then combine both skills for future projects.

You can find my stunning résumé here: resume.benny.com.ng.

And thanks to la Pantera Hermosa for the emotional support rendered to me during this challenge. I owe you a truckload of crackers and ice cream.

--

Budding Cloud and DevOps Engineer.
