Google Cloud Associate Certified Engineer — 100daysoflearning (Part 12)

Saiyam Pathak
Published in 100daysoflearning · 4 min read · Apr 24, 2019

Day 90–91

Continuing the course for the Google Cloud Associate Certified Engineer, let's move on to the next set of learning topics.

If you have been following along with my previous posts, then by this point you will have created a VPC, public/private buckets, a Pub/Sub topic, a Bigtable instance, a BigQuery dataset, a Kubernetes cluster where the products API is deployed, a Spanner instance, and a compute instance created via Deployment Manager where the ads API is deployed. The next topic in deploying the full application is Cloud Functions.
When a message is published to the Pub/Sub topic, it triggers a Cloud Function that runs code to deserialize the uploaded image from the message and send it to Cloud Vision for further processing. Once the processing is done, the Cloud Function takes the result and stores it in Bigtable. This is the data that the products API will query via BigQuery.
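Here is a minimal sketch of what such a function could look like in Python. This is not the code from the original post: the function name, environment variables, and Bigtable column family are placeholders, and it assumes the Pub/Sub message carries the image bytes directly.

# Background Cloud Function triggered by a Pub/Sub message (1st gen signature).
import base64
import os

from google.cloud import bigtable, vision


def process_image(event, context):
    # Pub/Sub delivers the payload base64-encoded in event['data'].
    image_bytes = base64.b64decode(event['data'])

    # Send the image to Cloud Vision for label detection.
    vision_client = vision.ImageAnnotatorClient()
    response = vision_client.label_detection(image=vision.Image(content=image_bytes))
    labels = [label.description for label in response.label_annotations]

    # Store the result in Bigtable so the products API can pick it up later.
    bt_client = bigtable.Client(project=os.environ['GCP_PROJECT'])
    instance = bt_client.instance(os.environ['BIGTABLE_INSTANCE'])  # placeholder env var
    table = instance.table(os.environ['BIGTABLE_TABLE'])            # placeholder env var
    row = table.direct_row(context.event_id.encode('utf-8'))
    # 'labels' column family is assumed to already exist in the table.
    row.set_cell('labels', 'descriptions', ','.join(labels).encode('utf-8'))
    row.commit()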

Now that you understand the flow, let's see how to create a Google Cloud Function. In this article I am doing things a little differently: I will explain how to create a Cloud Function first, and cover its definition and advantages after that.

[Screenshot: deploying the function]

As you can see above, deploying a function is quite simple: the gcloud beta functions deploy command is used with a set of required parameters. These include the entry point, which is the name of the function as defined in the code; the source, where the application code lives; the stage bucket, where the code will be uploaded; the trigger resource and trigger event (there are other trigger events as well, but for this use case we are going with a Pub/Sub trigger); the project name; the region; and the environment variables that hold the Bigtable instance and database names and the public bucket where user-uploaded images will be stored.
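For reference, a deploy command along these lines covers the parameters described above; the function name, bucket names, runtime, region, and environment variable values are placeholders rather than the exact values from the post.

gcloud beta functions deploy process-image \
  --entry-point=process_image \
  --source=./function \
  --stage-bucket=my-staging-bucket \
  --runtime=python37 \
  --trigger-resource=myTopic \
  --trigger-event=google.pubsub.topic.publish \
  --project=my-project \
  --region=us-central1 \
  --set-env-vars=BIGTABLE_INSTANCE=products-instance,BIGTABLE_TABLE=products,UPLOAD_BUCKET=my-public-bucket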

[Screenshot: other Cloud Functions triggers]

Once you deploy the function, you can see that it has been created, both with the gcloud command and in the UI.

[Screenshot: Google Cloud Functions in the console]
[Screenshot: function code in the private bucket]
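For example, you can confirm the deployment from the CLI like this (the function name and region are placeholders):

gcloud functions list
gcloud functions describe process-image --region=us-central1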

For now there is nothing to see in the UI for the Cloud Function, because it only triggers and does its work once the whole application is deployed and messages are pushed to the Pub/Sub topic. But just to see the graph, you can trigger the function manually, either by publishing a message to the Pub/Sub topic or by calling the function directly:

gcloud functions call {function name} --data '{"name": "Keyboard Cat"}'
gcloud pubsub topics publish myTopic --message '{"name": "Keyboard Cat"}'

You will be able to see that the function actually got triggered.
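If you want to confirm it from the CLI as well, the function's logs should show the invocation (the function name here is a placeholder):

gcloud functions logs read process-image --limit=10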

Google Cloud Functions is a serverless offering, which means you don't have to worry about where and how your code will run: the infrastructure and resources needed to run your function code are managed and maintained by Google. Serverless is the present and the future, as it puts more focus on development rather than the infrastructure side of things. It is cost-effective, managed by Google, and easy to use as well. Once a function gets triggered, the infrastructure and resources used to run it are provisioned automatically, and Google also manages and scales that infrastructure itself. Google keeps adding more and more triggers for Cloud Functions, which makes it even more useful.

That was it about Google Cloud Functions, Google's serverless compute offering.

Happy Learning & Happy Coding
Saiyam Pathak
https://www.linkedin.com/in/saiyam-pathak-97685a64/
https://twitter.com/SaiyamPathak
