New GCP Essentials video — Top 3 ways to run your Containers on Google Cloud
If you don’t like spoilers on what those 3 ways are, then watch the video:
If you do like spoilers, or simply prefer reading, here they are:
First of all, we live in a world where Docker, Kubernetes, and an entire ecosystem of products, tools, and best practices have emerged in the last few years, thus enabling many different kinds of applications to be containerized.
Just as not every application runs in the cloud, many workloads aren’t containerized yet; those that are can be executed in a predictable way across different environments. On Google Cloud, the various solutions for running containers differ essentially in how much of the underlying infrastructure is exposed.
Google Kubernetes Engine — GKE
As the inventor of Kubernetes, Google quite naturally offers a fully-managed Kubernetes service that takes care of scheduling and scaling your containers while monitoring their health and state.
GKE clusters are secure-by-default, highly-available, monitored, and they run on Google Cloud’s high speed network. They can also be fine-tuned for zonal or regional locations and offer auto-scaling, auto-repair of failing nodes, and auto-upgrade to the latest stable Kubernetes version.
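As a sketch of how those features come together, here is what creating a regional cluster with autoscaling, auto-repair, and auto-upgrade might look like with the gcloud CLI. The cluster name, region, and node counts are illustrative placeholders, not values from the video:

```shell
# Create a regional GKE cluster with the auto-features described above.
# "my-cluster" and "us-central1" are placeholder values.
gcloud container clusters create my-cluster \
    --region us-central1 \
    --num-nodes 1 \
    --enable-autoscaling --min-nodes 1 --max-nodes 5 \
    --enable-autorepair \
    --enable-autoupgrade

# Fetch credentials so kubectl can talk to the new cluster.
gcloud container clusters get-credentials my-cluster --region us-central1
```

With `--region` (rather than `--zone`), nodes are spread across the region’s zones for higher availability.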
GKE is also a key component of Anthos, Google Cloud’s enterprise hybrid and multi-cloud platform. With Anthos you can also migrate existing VMs directly into containers and move your workloads freely between on-premises and cloud environments such as GCP.
Cloud Run
Cloud Run gives you the benefits of both containers and serverless. There is no cluster or infrastructure to provision or manage, as Cloud Run automatically scales any of your stateless containers.
Creating a Cloud Run service with your container only requires selecting a location, giving it a name, and setting authentication requirements. It supports multiple requests per container, and works with any language, any library, any binary, and even any base image!
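The steps above map to a single gcloud command. The service name, region, and image path below are placeholders for your own values:

```shell
# Deploy a stateless container as a Cloud Run service.
# "my-service" and "gcr.io/my-project/my-app" are placeholders.
gcloud run deploy my-service \
    --image gcr.io/my-project/my-app \
    --region us-central1 \
    --platform managed \
    --allow-unauthenticated
```

Drop `--allow-unauthenticated` if you want the service to require authenticated requests instead.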
The result is truly serverless: “pay for usage” pricing, scale-to-zero, and full out-of-the-box monitoring, logging, and error reporting.
Because Cloud Run is built on the Knative open source project, a serverless abstraction on top of Kubernetes, you can have your own private hosting environment and deploy the exact same container workload on Cloud Run for Anthos, whether in GCP or on-prem.
Google Compute Engine — GCE
Yes, you can leverage the familiar GCE environment to run your containers. When creating a virtual machine, the “Container” section lets you specify the image you’d like to use, along with a few other important options. In the “Boot disk” section, it’s recommended to use “Container-Optimized OS”, an operating system optimized for running Docker containers and maintained by Google.
This operating system image comes with all the necessary runtimes pre-installed, thus enabling you to bring up your Docker container at the same time you create your VM. But it also lacks most of what you expect to find in a typical Linux distribution, such as a package manager and many other binaries. This is to offer a locked-down environment and ensure a smaller attack surface, keeping your container runtime as safe as possible.
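As a minimal sketch, a single gcloud command creates a VM on Container-Optimized OS that starts your container at boot. The VM name, zone, and image path are hypothetical placeholders:

```shell
# Create a VM that boots Container-Optimized OS and runs the given
# container on startup. Names and zone are placeholders.
gcloud compute instances create-with-container my-container-vm \
    --zone us-central1-a \
    --container-image gcr.io/my-project/my-app
```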
The great thing about running your containers on Compute Engine is that you can still create scalable services using managed instance groups (MIGs) as they offer autoscaling, autohealing, rolling updates, multi-zone deployments, and load balancing for the compute instances.
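To sketch that MIG pattern: you first capture the container configuration in an instance template, then build an autoscaled managed instance group from it. All names and thresholds below are illustrative assumptions:

```shell
# An instance template that runs a container (placeholder names throughout).
gcloud compute instance-templates create-with-container my-template \
    --container-image gcr.io/my-project/my-app

# A managed instance group built from that template.
gcloud compute instance-groups managed create my-mig \
    --template my-template \
    --size 2 \
    --zone us-central1-a

# Autoscale the group based on CPU utilization.
gcloud compute instance-groups managed set-autoscaling my-mig \
    --zone us-central1-a \
    --min-num-replicas 2 \
    --max-num-replicas 10 \
    --target-cpu-utilization 0.6
```

From there, the group can be placed behind a load balancer and rolled out with zero-downtime updates.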
Google Container Registry — GCR
Google Container Registry is a place to store all of your container images, version them, and restrict access to them.
GCR is a private-by-default container registry that runs on GCP with consistent uptime across multiple regions. You can push, pull, and manage images in GCR from any system, VM instance, or your own hardware, and maintain control over who can access, view, or download images.
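A typical push-and-inspect flow looks like the sketch below; the project and image names are placeholders for your own:

```shell
# Let Docker authenticate to gcr.io using your gcloud credentials.
gcloud auth configure-docker

# Tag a locally built image for GCR and push it (placeholder names).
docker tag my-app gcr.io/my-project/my-app:v1
docker push gcr.io/my-project/my-app:v1

# List the tags stored for that image in the registry.
gcloud container images list-tags gcr.io/my-project/my-app
```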
GCR also features Container Analysis, a container image scanner for known vulnerabilities that keeps you informed so that you can review and address issues before deployment.
Use what works best for you!
Google Cloud offers you (at least) three solid solutions to run your containers, ranging from a fully-managed Kubernetes environment to a truly serverless platform. Pick the solution that works best for you and start deploying your containerized workloads today!