Docker and Virtual Machines: Advancements in Virtual Environments

Dolamu Oludare
Towards Data Engineering
5 min read · Sep 29, 2023
Photo by Guillaume Bolduc on Unsplash

Virtualization

One of the best innovations in the world of computing has to be the emergence of virtualization and containers. Virtualization is a computing technology that allows you to create instances of your system's physical components, such as storage, servers, and networking, abstracted within a single physical hardware system. These instances are commonly referred to as Virtual Machines or Virtual Environments.

For instance, imagine you are a chef and you have a very big building with 10 rooms that you can use as kitchens to make dishes for a dinner party. Let's call the building the central kitchen. We need to make 9 different dishes for the dinner party; instead of cooking all the dishes in the central kitchen, we can assign different chefs and cooking materials to each of the rooms in the building to make the cooking more effective and structured.

Photo by roam in color on Unsplash

Each room where the dishes are made is an instance of the central kitchen. Though we can see the rooms in this case, in virtualization these instances are not visible like physical rooms; they are stationed on the central computer and work just like it. Each kitchen room in the building is similar to a virtual machine or virtual environment running on a physical computer, which is the central kitchen. Just as each room works with different cooking materials, each virtual machine instance runs its own operating system.

Docker and Virtual Machines

One of the reasons the need for technologies like Docker and virtual machines arose was the problem of software package mismatches and differing configuration settings. For example, my system runs on a Windows operating system and I need to run a program that is only compatible with a Linux operating system.

Docker and virtual machines both run on the technology of virtualization. A virtual machine is an abstraction of a physical computer: an instance of a running computer with its own storage, processing unit, networking, and operating system, segregated from those of the central or physical computer. Virtual machines use a hypervisor to connect the software components of each instance to the hardware components, allowing multiple virtual machines to run on a single machine. The beauty of virtual machines is that they enable you to carry out a separate task on one instance without it disturbing or conflicting with other virtual instances or the central computer itself.

Image by Author.

Docker is a tool designed for building, running, and shipping applications in a consistent manner using the technology of containers. A container is an isolated virtual environment that allows a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. So if your application works on your local machine, it will work on any other machine without issues.

Docker Architecture

Image from docs.docker.com

Docker uses a client-server architecture in which the Docker client uses a REST API to communicate with the Docker server, also known as the Docker daemon. The Docker daemon is the main engine of the architecture; it is the backend that runs Docker containers. Docker commands can be run either through the terminal or through a Graphical User Interface (GUI), Docker Desktop, which is available for all major operating systems.
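As a quick illustration, you can see this split on your own machine with the docker version command; its output is divided into a Client section (the CLI you type commands into) and a Server section (the Docker Engine daemon that does the actual work). If the daemon is not running, the client details still print, but the server section is replaced by a connection error.

docker version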

Docker Development Workflow

Image by Author

The development workflow in Docker is quite simple, but before we talk about the sequence of operations in Docker, let's talk about the components of the Docker workflow.

  1. Dockerfile: a Dockerfile is simply a text file that contains the lines of instructions required to build a Docker image (see the sample Dockerfile after this list).
  2. Docker Image: a Docker image is a set of instructions used to build a Docker container. Docker images are like classes in object-oriented programming: they are snapshots of a particular program or application and the starting point for running a Docker container. Docker images for different programs are available on Docker Hub, a registry of Docker images.
  3. Docker Containers: Docker containers are lightweight, standalone, executable packages of software that include everything needed to run an application. They are instances of a Docker image.
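As an illustration, here is a minimal Dockerfile for a hypothetical Python application; the file names app.py and requirements.txt are assumed for the example and are not tied to any particular project:

# Start from an official Python base image hosted on Docker Hub
FROM python:3.11-slim

# Set the working directory inside the container
WORKDIR /app

# Install the application's dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the rest of the application code into the image
COPY . .

# Command the container runs when it starts
CMD ["python", "app.py"]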

The sequence of working in Docker is initiated by writing a Dockerfile, which is used to build a Docker image of the software. The image is then run to initiate a Docker container instance of that particular software. The three major commands used to initiate a Docker container are:

  1. docker build: this command is used to build a Docker image from a Dockerfile. The command has to be run in the directory where the Dockerfile is present.
docker build -t <image_name> <path_to_build_image>
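For example, to build an image from the sample Dockerfile shown earlier (the image name my-python-app is just an illustrative choice, and the trailing dot means the current directory is the build context):

docker build -t my-python-app .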

2. docker run: docker run is used to run a container from a Docker image. This command is carried out after the image has been built.

docker run --name <name_of_container> <image_name>
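For example, assuming the my-python-app image built above:

docker run --name my-python-container my-python-app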

3. docker pull: the docker pull command is used to pull ready-made images of applications from Docker Hub, a registry of Docker images.

docker pull <image_name>
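For example, to pull an official Python image from Docker Hub (a specific version can be requested by adding a tag after a colon):

docker pull python:3.11-slim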

Benefits of Docker Containers

  1. Docker containers allow multiple applications to run in isolation.
  2. Docker containers are lightweight.
  3. Docker containers use the operating system of the host system, unlike virtual machines, each of which runs its own operating system.
  4. Docker containers start up quickly.
  5. Docker containers do not require excessive hardware resources; you can run tens or even hundreds of containers side by side.

Installing Docker

You can read through this article by Aman Rajan Verma to properly install Docker and docker-compose on your local machine. Once Docker is installed on your system, you can run any application or package without installing it directly on your local computer. In subsequent articles, we will be using Docker to set up our infrastructure for programming languages, databases, and so on.
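To confirm that both the client and the daemon are working after installation, you can check the version and run the official hello-world test image from Docker Hub, which simply prints a confirmation message:

docker --version
docker run hello-world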

Conclusion

We have finally come to the end of this article. In my next article, we will learn basic Docker commands by doing a walkthrough project: running a mini Python application with Docker. Docker is an unavoidable technology in the Data, DevOps, and Cloud space for application deployment, and understanding it is essential for anyone looking to break into the DevOps or Cloud space.

Thank you for reading!
