Containerization Technology: What, When, Where, and Why?

Evans Stepanov
4 min read · Nov 18, 2017


Docker has grown immensely in popularity over the past four years because it has simplified container technology for developers. Containers are an abstraction over a host operating system: an application living in a container perceives itself to be in an isolated environment. Today, almost any Google product you use will be served to you from a container.

What is it?

Unlike virtual machines, which replicate hardware and software virtually, containers house an application in its own environment while sharing the host's operating system. That means every container running on a computer is using that computer's operating system kernel. If a different computer were hosting multiple virtual machines, each virtual machine would have its own operating system as well as virtualized hardware.

image sourced from Docker

This graphic helps us visualize the difference between virtual machines and containers. On the left, each VM hosts a single application along with its own operating system. On the right, a container is composed only of the application it hosts and that application's specific dependencies. Containers run on an engine that "shares" the host operating system with the application, and containers can also have isolated memory and disk space. With virtual machines, the host operating system that runs the hypervisor is not shared with the virtual machines the hypervisor serves. This means virtual machines consume extra resources that containers would otherwise share.

Note that in the virtual machine diagram, two apps (App A and App A′) both use the same libraries and operating system. Because of the nature of virtual machines, their operating systems, virtual hardware, and application dependencies are sometimes unnecessarily duplicated. The same two applications running in containers can be set up to share non-virtualized hardware and operating system kernel resources from a single host.

When Are Containers Used?

Containers are well suited to a multitude of tasks and situations. For companies such as Amazon, Microsoft, and Google that run millions of virtual machines, layering containers on top of those VMs lets them serve more applications, and allocate resources to those applications more efficiently, with less hardware. Software developers use containers to ensure that an application's development and testing environments are identical to its production environment. Docker was designed with developers in mind, and as such it is well suited to automated deployment. Developers also no longer need to install a series of language environments (such as Node, Ruby, Java, Python, etc.) on their machines; they can set up Docker, have the required language environment installed in a container, and run their applications through Docker.
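For example, a developer could run a Node application entirely inside a container without installing Node on the host. This is a minimal sketch, assuming the project lives in the current directory and has an `index.js` entry point (the image tag and file name are illustrative, not from the article):

```shell
# Run a Node.js script without installing Node on the host machine.
# The official "node" image supplies the language environment; the
# tag (18) and entry point (index.js) are illustrative assumptions.

# Mount the current project directory into the container at /app,
# make /app the working directory, and run the script:
docker run --rm -v "$(pwd)":/app -w /app node:18 node index.js
```

The `--rm` flag removes the container when the script exits, so the host is left with nothing installed but the image itself.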

Containers are great for running a series of applications that can share a single operating system kernel. That means you can run more of these applications on a single containerized host than you could on a single virtual machine host. As a general rule of thumb, a host can serve roughly two to three times as many containerized applications as it could if those same applications ran in virtual machines.

Where are they used?

Developers, cloud computing providers, and large organizations that maintain their own hardware are all picking up container technology. Amazon Web Services (AWS), Microsoft Azure, DigitalOcean, and Google Cloud are already powered by virtualization technology, and today all of those providers support Docker and containerization because it simplifies the development and deployment process.

Why Containers?

Let's break down the benefits of containers:

  1. Developers can easily run their application in a production-like setting on their local machines.
  2. Developers no longer need to install coding environments on their main systems.
  3. Multiple developers working together on a project can all run their application in a standardized environment, meaning code that works on one developer's machine will work on another's.
  4. Developers and organizations can more easily automate deployment with continuous integration.
  5. Containers have less overhead (and are potentially less expensive) than virtual machines.
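As a concrete sketch of the standardized-environment benefit, a small Dockerfile pins the language runtime and dependency installation steps so every developer, and the CI server, builds the same environment. This is a hypothetical example for a Node.js service; the base image tag, file names, and port are illustrative assumptions:

```dockerfile
# Hypothetical Dockerfile for a Node.js service; image tag, file
# names, and port are illustrative, not taken from the article.
FROM node:18-alpine

WORKDIR /app

# Copy dependency manifests first so Docker can cache this layer
COPY package.json package-lock.json ./
RUN npm ci

# Copy the rest of the application source
COPY . .

EXPOSE 3000
CMD ["node", "index.js"]
```

Every machine that builds this image gets the same runtime version and dependency tree, which is what makes "works on my machine" problems far less common.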

Containers are not a new idea, but only recently have they become so popular. Today containers are ubiquitous because they are portable, agile, and now standardized.

In 2015, Docker donated its code base related to the container format and runtime, along with their specifications, to the Linux Foundation's Open Container Initiative (originally announced as the Open Container Project). The initiative's members also include Amazon Web Services, Cisco, CoreOS, EMC, Google, HP, IBM, Microsoft, Red Hat, VMware, and many others. With these supporters all contributing to the technology and agreeing to promote a set of common open standards with a minimal scope, containers are all but guaranteed to keep growing in popularity.


Evans Stepanov

Software engineer, computer hardware enthusiast, cybersecurity devotee, passionate gamer, and lifelong student.