Docker and Microservices — Why do they make us better computer science engineers? — Part 3

ANIRBAN ROY DAS
8 min read · Oct 2, 2016


If you haven’t already read Part 1 of this post, please go ahead and read it here. Part 1 explains what Docker is and what microservices are, covers the history of containers (not just Docker), discusses why monoliths fall short, and walks through the technical considerations of a microservices architecture, such as API gateways, service discovery, service registration, and database transactions.

Part 2 talks about how Docker and microservices help each other co-exist. It also covers how setting up microservices with and without Docker differs, and how the Docker movement drives the microservices movement. Here is the link to Part 2 of the series.

How do Docker and Microservices make us better computer science engineers?

As developers, we often spend most of our time at work writing code, in whatever programming language we like. Generally, we are either given a whole new project to start, or asked to add some Feature X to a Project Y. So we gather the requirements for the feature, look into the functionality it will change or break, and figure out what new technology we would have to integrate: for example, whether the feature needs any special consideration or a new data store, like a graph database, or perhaps the addition of a message queue.

After gathering the requirements and sketching a high-level design of the feature, we move on to the thing we love most: coding.

After we code, we need a minimum set of tests to check that the new feature doesn’t break any other functionality of the project: unit tests, integration tests, and end-to-end tests. These tests give us confidence that our code did not break anything, and if it did, the tests make that clear so that we can go ahead and fix it.

Most of the time, that’s all. That’s about it. As developers, we are happy to release our feature as fast as we can. Once we are done with the tests on our local machine, we push to the code repository, where a CI server like Jenkins or Travis takes things to the next step by starting a build and running the tests again, be it unit, integration, or end-to-end. After the tests finish, a new deployment is kickstarted, and we happily ship our new feature with much agility.

What we miss in all of the above is how the entire system actually works. We code on our development machines and everything is done. We don’t care about anything else: not the network, not the persistence layer. We just care about making the application work on our local machine, and our CI servers take care of the rest. It is, of course, great not having to worry about anything except the code on our local machine.

But does that make us good computer science engineers? As computer science engineers we ought to know a lot of things, yet with so much automation already provided to us (thanks to our good DevOps engineers), we don’t have to care about anything else.

IMHO, a computer science engineer should know more than just some data structures and algorithms. Today’s developers have gone down the fast track of shipping features quickly; that has become their only bread and butter. Let’s see how Docker and Microservices change all that and, by force or by goodwill, require us to know more than just writing code or knowing algorithms and data structures.

Docker Enlightenment


Docker is an isolated process space. So you have to understand what an isolated process space is, and how user and group permissions inside a container differ from those outside it. That inspires you to read up on the topic, and before long you know more about users, groups, permissions, and process spaces than you ever did before you started using Docker.
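For instance, here is a quick experiment (assuming a local Docker install, the public alpine image, and a host uid of 1000; output abridged) that shows how the user inside a container differs from your user on the host:

```bash
# On the host you are an unprivileged user...
$ id
uid=1000(dev) gid=1000(dev)

# ...but inside a container you are root by default,
# in the container's own isolated process space.
$ docker run --rm alpine id
uid=0(root) gid=0(root)

# You can drop privileges explicitly with --user.
$ docker run --rm --user 1000:1000 alpine id
uid=1000 gid=1000
```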

When you start using Docker Compose, you need to write docker-compose.yml files, which have their own rules. Even without Compose, when you use Docker to run containers in slightly more complex scenarios, you come across terms such as volume mounts, host volume mounts, attaching containers to a network, and linking different containers. You may already know these concepts of networks and volumes, but you never got a chance to work with them in your daily development routine. You just wrote code, unit tested it, and let the DevOps folks do the rest of the job for you, with CI servers automating most of the process and deploying software faster. But by using Docker you open yourself up to all these concepts once again, and now you actually know what is going on. Even if not in great detail, you feel the urge to brush up on those topics, and in that quest you read up a lot and come out like a monk who now knows how networks work, what volumes are, how to mount them, how to back them up, how network interfaces work, and what all that verbose ifconfig output you never understood actually means. You slowly begin to understand, or at least have a fair idea of, all of this.
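As a small illustration, here is a minimal docker-compose.yml sketch (service names, images, and paths are hypothetical) that exercises several of those concepts at once: a host volume mount, a named volume, and a user-defined network shared by two containers:

```yaml
version: "3"
services:
  web:
    build: .
    ports:
      - "8000:8000"          # publish the app on the host
    volumes:
      - ./src:/app           # host volume mount: host directory into the container
    networks:
      - backend              # attach to a user-defined network
    depends_on:
      - db
  db:
    image: postgres:9.6
    volumes:
      - db-data:/var/lib/postgresql/data   # named volume managed by Docker
    networks:
      - backend
volumes:
  db-data:
networks:
  backend:
```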

Before using Docker, you never cared about network interfaces, but now you do. You know that for containers to talk to each other by name, they need to be on the same user-defined network, where Docker’s built-in DNS can resolve them. Oh yes, DNS, you didn’t care about that either. Not unless you were handling the ops side of the systems, or you were a CTO, or working at an early-stage startup where everybody is everybody.
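You can see that DNS-based resolution in action with a couple of commands (the container and network names here are hypothetical):

```bash
# Create a user-defined network; containers on it can resolve
# each other by name through Docker's embedded DNS server.
$ docker network create backend
$ docker run -d --name redis --network backend redis
$ docker run --rm --network backend alpine ping -c 1 redis
```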

You seldom looked at /etc/resolv.conf, and now you know what it does. You have now learnt about overlay networks, you know how a VPN works, and you have a good idea of what virtual IPs are. All of this was completely foreign to you, and you never cared to know about it because it didn’t matter to you or your work. You could release features without knowing any of it. But Docker opened the doors for you to learn all of this. Well, you can still use Docker and get away without knowing any of the above, but you are a good developer, you want to know how all the Docker magic happens, so you start learning about this stuff, and by the end of it, you are enlightened.
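Peeking at /etc/resolv.conf inside a container on a user-defined network (reusing the hypothetical backend network from above) shows Docker’s embedded DNS server at work:

```bash
# Docker rewrites the container's resolv.conf to point at its
# embedded DNS server, which resolves other container names.
$ docker run --rm --network backend alpine cat /etc/resolv.conf
nameserver 127.0.0.11
```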

A computer science engineer should not just code. A proper engineer knows the ins and outs of the actual engineering that is happening. Well, again, this is purely IMHO.

Microservices — Your crash course into Distributed Systems


Distributed systems are hard. They are interesting, but really, really hard. Your idealism takes a back seat when you work with distributed systems. Well, until now you didn’t care about them anyway. But microservices are distributed systems. So by ditching monoliths you entered the wormhole of distributed systems problems, because a microservices architecture is a distributed systems architecture with a nicely ornamented name to allure you. Heard of Medusa?

While working in a monolithic architecture, all we cared about was how function calls happen, how the stack can overflow, what the heap is and how garbage collection works, and how to profile a program when we hit a performance bottleneck. (Sometimes we don’t even know or do all of this; we just use our algorithms and data structures, write some code, and feel complacent enough to call ourselves computer science engineers.)

Again, IMHO, an engineer ought to know more, and give back to the community with much more competent services.

So, microservices made you think beyond function calls; you started caring much more about REST APIs. Well, you did before too, but only for a small part of your entire monolithic elephant of an application. Now you have to treat them as first-class citizens of your application, because your microservice is of no use if it cannot talk to other microservices. REST is your bread and butter now. In the process, you started caring about how you structure your REST APIs. Before, it didn’t matter much, because your REST APIs were part of just one project, one system; however badly you structured them, you only had to expose that contract to the clients who consumed those APIs and nobody else. But now your microservice may be consumed by several other microservices, handled by different teams with different ideologies and different programming languages. So, to avoid breaking other services (your API consumers), you have to follow a strict, universally accepted standard so that all the other microservices can trust your REST APIs and nothing funny happens. See how you became a pro at REST APIs now.
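As a small illustration, here is a minimal sketch (using Flask; the service, routes, and resource names are hypothetical) of the kind of versioned, resource-oriented contract other teams can rely on:

```python
# A minimal sketch of a versioned REST contract between microservices.
from flask import Flask, jsonify

app = Flask(__name__)

# In-memory stand-in for a real data store.
ORDERS = {1: {"id": 1, "status": "shipped"}}

# Version the API so consumers on other teams aren't broken by changes.
@app.route("/v1/orders", methods=["GET"])
def list_orders():
    return jsonify(list(ORDERS.values()))

@app.route("/v1/orders/<int:order_id>", methods=["GET"])
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        # Use standard status codes so every consumer
        # interprets errors the same way.
        return jsonify({"error": "order not found"}), 404
    return jsonify(order)

if __name__ == "__main__":
    app.run(port=8000)
```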

Not just REST, you can use other RPC mechanisms such as Thrift and gRPC, and you started focusing on those too, in greater detail and with increased importance. You started thinking more about latency, because with microservices calling each other, a chain of requests builds up and latency grows, compared to a monolith, where there were only function calls, which are many times faster than calling another microservice over the network.
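With gRPC, for example, the contract between services is written down explicitly in a .proto file; here is a minimal sketch (the service and message names are hypothetical):

```protobuf
// A minimal sketch of a gRPC service contract.
syntax = "proto3";

package orders;

service OrderService {
  // One network hop; each extra hop in a call chain adds its own latency.
  rpc GetOrder (GetOrderRequest) returns (Order);
}

message GetOrderRequest {
  int64 id = 1;
}

message Order {
  int64 id = 1;
  string status = 2;
}
```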

In a distributed system, you have to know the Fallacies of Distributed Computing (the network is reliable, latency is zero, bandwidth is infinite, and so on) and make your program bulletproof against them. So microservices make you an even better programmer than you were before. They make you think about exceptions and network effects more than you ever thought about them.

You even find yourself reading about the CAP theorem, and you start believing that things like network partitions are real. Your idealistic view of applications gets gutted, and you feel reborn from the ashes into a new but difficult world. You become a strict programmer. You start caring about things like giving timeouts to all your requests and handling weird network errors.
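Here is a minimal sketch (using the requests library; the URL and limits are hypothetical) of the defensive habits this forces on you: every call to another service gets a timeout, and transient network errors are retried a bounded number of times:

```python
import time
import requests

def fetch_order(order_id, retries=3, timeout=2.0):
    url = f"http://orders-service/v1/orders/{order_id}"
    for attempt in range(retries):
        try:
            # Never call another service without a timeout: a hung
            # socket would otherwise stall this service too.
            resp = requests.get(url, timeout=timeout)
            resp.raise_for_status()
            return resp.json()
        except requests.exceptions.RequestException:
            if attempt == retries - 1:
                raise
            # Back off before retrying so we don't hammer a struggling peer.
            time.sleep(2 ** attempt)

if __name__ == "__main__":
    print(fetch_order(1))
```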

You start reading about distributed transactions, two-phase commit, and eventual consistency, and about how etcd, Consul, and the like are implemented, because you start using them as service discovery tools.
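As a taste of that, here is a minimal sketch of service discovery against Consul’s HTTP health API (the Consul address and service name are hypothetical; etcd exposes a similar key-value interface):

```python
import requests

CONSUL = "http://localhost:8500"

def discover(service_name):
    # /v1/health/service/<name>?passing=true returns only healthy instances.
    resp = requests.get(
        f"{CONSUL}/v1/health/service/{service_name}",
        params={"passing": "true"},
        timeout=2.0,
    )
    resp.raise_for_status()
    # Each entry describes one registered instance of the service.
    return [
        (entry["Service"]["Address"], entry["Service"]["Port"])
        for entry in resp.json()
    ]

if __name__ == "__main__":
    print(discover("orders"))
```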

All that research certainly makes you a better computer science engineer. You now have a wider view of the world, and you cease to just stay in that cozy comfort of your code editor. You start respecting network engineers and devops engineers.

To gain further confidence in the quality of your product, you start investing more in tests. In return, you become a better test engineer too.

So now you are no longer just a developer: you are a network engineer, an operations engineer, a QA engineer, you are becoming the real DevOps engineer. You are a better computer science engineer because you know about a lot more things now.

Docker and Microservices

These two really changed how we do things. They also gave us a completely different view of so many other things. We are now doing distributed systems, we are doing devops, we are doing everything. We were good computer science engineers, but now we are better computer science engineers.

