Kubernetes: More than just containers
During my journey as an ARTH learner in the "ARTH 2020" program, under the guidance of the World Record Holder Mr. Vimal Daga Sir, I got to explore a lot, and here I'm sharing some of my learnings to give you insight into Kubernetes.
Kubernetes, also known as K8s, is an open-source system for automating deployment, scaling, and management of containerized applications.
It groups containers that make up an application into logical units for easy management and discovery. It has a large, rapidly growing ecosystem. Kubernetes services, support, and tools are widely available.
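To make the idea of "grouping containers into logical units" concrete, here is a minimal sketch of a Kubernetes Deployment manifest. The name `web-app` and the image `nginx:1.25` are illustrative assumptions, not taken from any specific application:

```yaml
# A minimal Deployment (illustrative sketch): groups identical Pods
# under one label and declares how many should be running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app            # hypothetical application name
spec:
  replicas: 3              # desired state: keep three Pods running
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app       # label that groups these Pods into one logical unit
    spec:
      containers:
      - name: web
        image: nginx:1.25  # any OCI-compliant container image works here
        ports:
        - containerPort: 80
```

Applying a manifest like this with `kubectl apply -f deployment.yaml` asks Kubernetes to create the Pods and keep them at the declared replica count.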
Modern applications are increasingly built using containers, which package microservices together with their dependencies and configurations. Kubernetes is open-source software for deploying and managing those containers at scale; it is also the Greek word for the helmsman of a ship, or pilot.
Kubernetes is a highly flexible and extensible platform. You can consume its built-in functionality or replace pieces of it with your own solutions. You can also integrate Kubernetes into your environment and add extra capabilities on top of it.
Kubernetes is taking the app-development world by storm: analysts have predicted that by 2022, more than 75% of global organizations will be running containerized applications in production.
Keeping containerized apps up and running can be complex because they often involve many containers deployed across different machines. Kubernetes provides a way to schedule and deploy those containers, scale them to the desired state, and manage their life cycles. Use Kubernetes to implement your container-based applications in a portable, scalable, and extensible way.
Kubernetes eases the burden of configuring, deploying, managing, and monitoring even the largest-scale containerized applications. It also helps IT pros manage container lifecycles and related application lifecycles, and issues including high availability and load balancing.
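As a sketch of how Kubernetes provides load balancing and high availability, the fragment below defines a Service that spreads traffic across all Pods carrying a given label. The names `web-app-svc` and `app: web-app` are hypothetical, chosen only for illustration:

```yaml
# A ClusterIP Service (illustrative sketch): a stable virtual IP that
# load-balances requests across every healthy Pod matching the selector.
apiVersion: v1
kind: Service
metadata:
  name: web-app-svc        # hypothetical Service name
spec:
  selector:
    app: web-app           # routes to any Pod carrying this label
  ports:
  - port: 80               # port the Service exposes inside the cluster
    targetPort: 80         # port the containers listen on
  type: ClusterIP          # in-cluster load-balanced endpoint
```

Because the Service targets a label rather than individual Pods, containers can be replaced or rescheduled on other machines without clients noticing, which is what makes the high availability described above possible.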
Make workloads portable
Because container apps are decoupled from their infrastructure, they become portable when you run them on Kubernetes. You can move them from local machines to production, across on-premises, hybrid, and multi-cloud environments, while maintaining consistency across environments.
Scale containers easily
You can define even complex containerized applications and deploy them across a cluster of servers with Kubernetes. As it scales an application toward the desired state, Kubernetes automatically monitors and maintains container health.
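Scaling toward a desired state can also be automated. The sketch below shows a HorizontalPodAutoscaler that adjusts replica counts based on CPU usage; the target Deployment name `web-app` and the thresholds are assumptions made for illustration:

```yaml
# HorizontalPodAutoscaler (illustrative sketch): lets Kubernetes grow or
# shrink a Deployment's replica count based on observed CPU utilization.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-app-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-app          # hypothetical Deployment to scale
  minReplicas: 2           # never drop below two Pods
  maxReplicas: 10          # cap growth at ten Pods
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70   # add Pods when average CPU exceeds 70%
```

For one-off adjustments, an imperative command such as `kubectl scale deployment web-app --replicas=5` achieves the same desired-state change by hand.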
Build more extensible apps
A large open-source community of developers and companies actively builds extensions and plugins that add capabilities such as security, monitoring and management to Kubernetes.
Kubernetes & Docker
While in practice Kubernetes is most often used with Docker, the most popular containerization platform, it can also work with any container system that conforms to the Open Container Initiative (OCI) standards for container image formats and runtimes.
And because Kubernetes is open source, with relatively few restrictions on how it can be used, anyone who wants to run containers can use it freely, almost anywhere they want to run them: on-premises, in the public cloud, or both.
Kubernetes doesn’t replace Docker, but augments it. However, Kubernetes does replace some of the higher-level technologies that have emerged around Docker.
Kubernetes Use Cases
Spotify’s Kubernetes Story
Spotify’s Golden Path to Kubernetes Adoption
The company started small, experimenting with a few services on Kubernetes clusters then moving up to more complex workloads and self-service migration.
Spotify is well known worldwide for its music service. Less well known is that its path to Kubernetes deployment has been a road with many twists and turns.
What may also surprise many is that Spotify is a veteran user of Kubernetes and owes much of its product-delivery capability to agile DevOps. Indeed, Spotify increasingly relies on a container and microservices infrastructure and cloud-native deployments, which allow its DevOps teams to continually improve the overall streaming experience for millions of subscribers.
Spotify open sourced its in-house container orchestration service, Helios, in 2014. After several years of use, Spotify decided to make the switch from Helios to Kubernetes. Backed by thousands of developers, Kubernetes comes with a huge ecosystem behind it, and trying to reach feature parity with an in-house system not widely adopted by other enterprises is difficult, even for a business as large as Spotify.
It became clear that Spotify needed a managed solution rather than operating clusters from scratch. By moving to Kubernetes, Spotify would benefit from the many features it provides.
How Spotify did it, and is still doing it
Spotify decided to start small, experimenting with running one service on one Kubernetes cluster and then moving up to three services on a shared cluster for a few days.
Spotify has continued to expand its use of Kubernetes month by month since adopting it a few years ago. It had already begun shifting its operations to a containerized infrastructure before it considered the potential benefits Kubernetes might offer.
Although a container early adopter, Spotify began to shift to Kubernetes in earnest about two years ago. Kubernetes has since played a key role in Spotify's DevOps in two key ways: first, the platform has helped reduce toil; second, adopting a cloud-native infrastructure has enabled the music-streaming giant to add a number of new tools and platforms that improve its production pipeline and operations.
During the past year, Spotify has been expanding the number of services it runs on Kubernetes while taking advantage of its highly distributed architecture. For example, Spotify has moved data pipelines and machine learning to Kubernetes and "relies on it to build ephemeral environments." Moving forward, the company still has to tackle a few challenges, including cluster management, multi-cluster operations in each region, and building up support for data jobs, machine-learning workloads, and GPU workloads.
Moving step by step with steadily increasing goals, instead of attempting a single, monolithic migration, allowed Spotify to steadily increase scope and complexity and handle unknown factors at a manageable pace, keeping developer morale up.
In one interview, Jai Chakrabarti stated:
“We saw the amazing community that’s grown up around Kubernetes, and we wanted to be part of that. We wanted to benefit from added velocity and reduced cost, and also align with the rest of the industry on best practices and tools.”
— JAI CHAKRABARTI, DIRECTOR OF ENGINEERING, INFRASTRUCTURE AND OPERATIONS AT SPOTIFY
Kubernetes is a great tool for orchestrating containerized applications. It automates the very complex task of dynamically scaling an application in real time. Kubernetes enables enterprises to solve common dev and ops problems easily, quickly, and safely. It also provides other benefits, such as enabling a seamless multi-cloud or hybrid-cloud strategy, saving infrastructure costs, and speeding time to market.
Now, that was a whole lot to cover but if you read it, Kudos!
Stay tuned for further reading: a deep dive into Kubernetes and some more Kubernetes case studies, which will be just as insightful as this article and will help you understand the power of container technology.
Stay Safe! Keep Learning!
#vimaldaga #righteducation #educationredefine #rightmentor #worldrecordholder #linuxworld #makingindiafutureready #arthbylw #ansiblegalaxy #rightansibleconcepts #usecase #kubernetes #microservices