On a journey with cloud

Madhura Sawant
Developer Students Club, VJTI
8 min read · Feb 23, 2023

I was introduced to cloud computing in my 2nd year through Google Developers' 30 Days of Google Cloud program, which I was a part of in October. A poster sent across GDSC VJTI's WhatsApp groups got me intrigued, and through the link provided I navigated to the program's website and read about what it offered, the syllabus and so on. In short, it was a 30-day program where Google offered free credits and a subscription to Google Cloud Platform (GCP) so that we could perform hands-on labs on the Google Cloud Skills Boost platform. I found the syllabus very intimidating; honestly, I didn't understand most of it. All I could infer was that there were 2 tracks, cloud computing and machine learning, each with a total of 6 quests to be completed within a month. After looking at the prize section that offered skill badges and Google swag, I was motivated to complete the program and decided to invest my time in the cloud computing track.

When I started with the program I didn't even know what the cloud meant, so I had to begin with the basic ABCs of cloud computing. Watching YouTube videos and reading blogs helped me get started. After a few tutorials, I realized that the cloud is nothing but someone else's computer!! LOL. But this makes sense, because it's just a group of servers located geographically away from your system, running applications or databases that you can access over the internet. And if you just look around, you can see that we are already using cloud computing for so many things, not just Google Drive or Gmail. You must be binge-watching a series on Netflix, but did you know that it uses cloud computing to manage all the downstream traffic, recommendation engines, data analysis, storage, and so much more! We have all used HackerRank or LeetCode to solve coding challenges, and they too use cloud computing. How? The code that we submit is sent to a server through an API for further processing. The server compiles and runs our code and sends back a comparison between the obtained and the desired output.

Starting with the program

The program began with a tour lab that made me familiar with the Google Cloud console. Basically, the cloud console is an interface or dashboard through which you can access and manage the resources and services offered by Google Cloud, view analytics for your applications and much more. Performing the hands-on labs felt like a real learning experience. To make it easier, the GDSC cloud computing team helped us by providing resources to study and by conducting study jams and quizzes.

Google Cloud Console

The quests started with the basics of the cloud console and Cloud Shell (the cloud CLI), slowly introducing new technologies. After completing the basics, the first thing I learnt was creating a virtual machine in the cloud. This is done using Compute Engine, which lets us create and run virtual machines on Google's infrastructure. Creating a virtual machine was a simple task to perform using either the cloud console or the cloud shell.
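For anyone curious what that looks like in practice, here is a rough sketch of creating and connecting to a VM from Cloud Shell; the instance name, zone and machine type below are just placeholder values, not the exact ones from the lab.

# Create a small VM (name, zone and machine type are example values)
gcloud compute instances create my-first-vm --zone=us-central1-a --machine-type=e2-medium

# SSH into the new VM once it is up
gcloud compute ssh my-first-vm --zone=us-central1-a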

Creating a VM in Compute Engine is an example of using Infrastructure-as-a-Service (IaaS). This is one of the 3 service models used in cloud computing, the other 2 being Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS).

In IaaS, you don't have to be concerned about the physical infrastructure that helps your application run, i.e. the actual servers, networking and storage; those are handled by the cloud platform. You simply rent a virtual machine and manage the OS and the software running on it yourself.

But what we as developers often do is take our application code along with the data and deploy it to platforms like Heroku or Render. This is Platform-as-a-Service (PaaS). Here we don't take the trouble of managing the runtime environment and OS; these, along with the VMs, storage and networking, are managed by the vendor (or cloud service provider). As student developers, we mostly make use of PaaS so that we don't have to be bothered about configuring virtual machines or servers.

Learning about Containers and Docker

After learning about VMs and creating and interacting with them on Google Cloud, I learned about another exciting tech called Docker. While working on group projects or even sharing assignments, I often noticed that my code worked fine on my machine, but when I shared the same code with a friend, it either gave errors or the interface wasn't the same as it was on my machine. Then we had to sit together and solve those bugs, which was a real pain🫠. And I am sure that I am not the only one who has faced this!

But what if we could make our code portable? Just put all things necessary to run it in a box and ship it to others — that’s what a container does for us! During the program, I learned that Docker is a tool that helps create and run these containers.

The Docker codelab showed how to create Docker containers using the Docker CLI. However, Docker and containerisation were completely new concepts to me, so reading the small intro at the beginning of the lab and then jumping straight into the commands was a little difficult to digest. I had to refer to tutorials again, specifically for Docker, because the other cloud computing tutorials hardly gave any insights on it. The Docker docs and some YouTube tutorials were really useful resources for proceeding with this codelab.
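As a taste of what those first CLI steps look like (this is a generic sketch of getting started with Docker, not the exact lab steps):

# Pull and run a tiny test image to check that Docker works
docker run hello-world

# List running containers, and all containers including stopped ones
docker ps
docker ps -a

# List the images downloaded to your machine
docker images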

Moving ahead in this lab, the next task was building a Docker image from a Dockerfile. Basically, a Dockerfile is a simple text file that gives Docker the instructions for building an image, and the codelab walked through writing one and then building an image from it. Here's an example of a Dockerfile:

Dockerfile example
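A minimal Dockerfile for a small Node.js app might look something like this (the base image, file names and port are placeholder choices for illustration, not the exact file from the lab):

# Start from a small Node.js base image
FROM node:18-alpine

# Declare an environment variable
ENV NODE_ENV=production

# Set the working directory inside the image
WORKDIR /app

# Copy the app code into the image and install its dependencies
COPY . .
RUN npm install

# The container will listen on port 8080 at runtime
EXPOSE 8080

# Default command to run when the container starts
CMD ["node", "app.js"]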
  • In a Dockerfile, FROM is the most important instruction; it specifies the base (parent) image. (You can search Docker Hub for a base image to use.)
  • You can also declare environment variables using ENV.
  • RUN executes the command you specify while the image is being built.
  • COPY copies files from the source path on your machine into the destination path inside the image.
  • EXPOSE specifies the port that the Docker container will listen on at runtime.
  • CMD specifies the default command that runs when the container starts.

Once the Dockerfile is written, it is used to build a Docker image, which is a packaged, ready-to-run template of your application. Running that image is what creates a Docker container. The codelab also showed how to push your image to a Docker registry.
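Putting those pieces together, the build, run and push steps look roughly like this; the image tag node-app:0.1 and the PROJECT_ID in the registry path are placeholders, and Google Container Registry is used here only as one example of a registry:

# Build an image from the Dockerfile in the current directory
docker build -t node-app:0.1 .

# Run a container from that image, mapping port 8080 on the host to 8080 in the container
docker run -d -p 8080:8080 --name my-app node-app:0.1

# Tag the image for a registry and push it (PROJECT_ID is a placeholder)
docker tag node-app:0.1 gcr.io/PROJECT_ID/node-app:0.1
docker push gcr.io/PROJECT_ID/node-app:0.1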

Why Kubernetes?

So using Docker you create an application in some dev environment, and thanks to containers you can use the same environment in production too. The benefit of using containers is so profound that Netflix, Google, Amazon, and almost every other tech company uses them in development and in production. However, as the number of containers grows, it becomes difficult to manage all of them, the networking between them, their security, and many other concerns. To manage an enormous number of containers, Google created Kubernetes and open-sourced it in 2014.

Kubernetes is an orchestration tool: it groups the containers that make up an application into logical units, allowing you to run and manage your container-based workloads, with additions like scheduling, networking across nodes, load balancing and self-healing. But managing Kubernetes on your own is very hard, so instead of self-managing it, many organizations prefer managed Kubernetes services like Google Kubernetes Engine (GKE) or Amazon Elastic Kubernetes Service (EKS).
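To give an idea of what "managed" feels like in practice, a typical GKE workflow looks roughly like this; the cluster name, image path and ports are placeholder values, not the ones from the labs:

# Create a small GKE cluster (Google provisions and manages the nodes for you)
gcloud container clusters create my-cluster --zone=us-central1-a --num-nodes=2

# Deploy a container image to the cluster
kubectl create deployment hello-app --image=gcr.io/PROJECT_ID/node-app:0.1

# Expose the deployment to the internet through a load balancer
kubectl expose deployment hello-app --type=LoadBalancer --port=80 --target-port=8080

# Scale it out and check the pods Kubernetes schedules for you
kubectl scale deployment hello-app --replicas=3
kubectl get pods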

The codelabs on Kubernetes were mostly based on GKE. There were a couple of labs on Kubernetes, most of them at an intermediate level, so completing them was quite challenging for me.

After completing all the labs of a quest, there was a challenge lab, and on completing it we were awarded the skill badge for that quest. I started slow at the beginning, and because of this I had to rush through the syllabus when there were only a few days left before the program deadline. I remember it was the last day and I was still left with the challenge lab of the last quest, which was about Kubernetes :) I had tried that lab almost 3 times and was about to exhaust my quota for it. This really made me anxious and I had seriously given up, and that's when a message popped up on the group that the deadline had been extended by 2 days🥳. Thank god!! I was able to complete all the quests of the cloud computing track.

The end is the beginning!

Towards the end, I had all the skill badges of the cloud computing track and the Google swag to my name, plus I had explored a completely new domain in just 30 days.

The successful end of this program has only encouraged me to explore more about the cloud. A few months after the program, I learned more about the heavier terminology in cloud computing through the 'a-z cloud' project built by the Cloud Computing team at GDSC. And becoming the Cloud Computing Lead of GDSC VJTI has actually kickstarted my journey in the cloud.

Finally, while concluding this blog, I would like to express my gratitude to my seniors Pankaj Khushalani (former Cloud Computing Lead, GDSC VJTI) and Sarah Tisekar (former Cloud Facilitator, GDSC VJTI), and to our GDSC Lead Alisha Kamat, for always having my back and constantly guiding me throughout this journey.

Thank you!!
