Docker Has Turned Us All into SysAdmins
Docker has been one of my favorite software stories of the last couple of years. On the face of it, it should be pretty boring. Containerization isn’t, after all, as revolutionary as most of the hype around Docker would have you believe. What’s actually happened is that Docker has refined the concept, and found a really clear way of communicating the idea. Deploying applications and managing your infrastructure doesn’t sound immediately ‘sexy’. It was the data scientist that was proclaimed the sexiest job of the twenty-first century; sysadmins hardly got an honorable mention. But Docker has, amazingly, changed all that. It’s started to make sysadmins sexy…
And why should we be surprised? If a SysAdmin’s role is all about delivering software, managing infrastructure, maintaining it and making sure it performs for the people using it, then it’s vital (if not obviously sexy). A decade ago, when software architectures were apparently immutable and much more rigid, the idea of administration wasn’t quite so central. But now, in a world of mobile and cloud, where technology is about mobility as much as it is about stability (in the past, tech glued us to desktops; now it’s encouraging us to work in the park), the task of system administration is crucial. Tools like Docker are vital for this. By letting us isolate and package applications into their component pieces, we can start using software in a way that is far more agile and efficient. Where once the focus was on making sure software was simply ‘there,’ waiting for us to use it, it’s now something that actively invites invention, reconfiguration and exploration.
Docker’s importance to the ‘API economy’ (which you’re going to be hearing a lot more about in 2016) only serves to underline its significance to modern software. Not only does it provide ‘a convenient way to package API-provisioning applications,’ but it also ‘makes the composition of API-providing applications more programmatic’, as this article on InfoWorld has it. Essentially, it’s a tool that unlocks and spreads value.
Can we, then, say the same about the humble sysadmin? Well yes: it’s clear that administering systems is no longer simply a matter of organization and robust management; it’s a business-critical role that can be the difference between success and failure. However, what this shift really means is that we’ve all become SysAdmins. Whatever role we’re working in, we’re deeply conscious of the importance of delivery and collaboration. It’s not something we expect other people to do; it’s something that we know is crucial. And it’s for that reason that I love Docker: it’s being used across the tech world, a gravitational pull bringing together disparate job roles in a way that’s going to become more and more prominent over the next 12 months.
Let’s take a look at just two of the areas in which Docker is going to have a huge impact.
Web development is one field where Docker has already taken hold. It’s changing the typical web development workflow, arguably making web developers more productive. If you build in a single container on your PC, that container can then be deployed and managed anywhere. It also gives you options: you can build different services in different containers, or you can build a full-stack application in a single container (although Docker purists might say you shouldn’t). In a nutshell, it’s this ability to separate an application into its component services that underlines why microservices are fundamental to the API economy. It means the different ‘bits’ (the services) can be used and shared between different organizations.
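To make the ‘build once on your PC, deploy anywhere’ idea concrete, here’s a minimal sketch of a Dockerfile for a small Python web app. The file names (app.py, requirements.txt) and port are illustrative assumptions, not from the original article:

```dockerfile
# Hypothetical example: containerizing a small Python web app.
FROM python:3.9-slim

WORKDIR /app

# Install pinned dependencies first, so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image
COPY . .

EXPOSE 5000
CMD ["python", "app.py"]
```

You’d build this once with `docker build -t my-web-app .` and then run the same image on your laptop, a test server, or production with `docker run -p 5000:5000 my-web-app`; the container carries its own dependencies, so the environment travels with it.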
Fundamentally though, Docker bridges the difficult gap between development and deployment. Instead of having to worry about what happens to your application once it has been deployed, when you build inside a container you can be confident it’s going to work wherever you deploy it. With Docker, delivering your product is easier (essentially, it helps developers manage the ‘ops’ part of DevOps in a simpler way than tackling the methodology in full), which means you can focus on development itself and on optimizing your products.
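The multi-container option mentioned earlier, with different services in different containers, can be sketched with Docker Compose. This is a hypothetical two-service setup (the service names, images and password are placeholders, not anything from the article):

```yaml
# Hypothetical docker-compose.yml: a web service plus a database,
# each in its own container, wired together by Compose.
version: "2"
services:
  web:
    build: .            # built from the Dockerfile in the current directory
    ports:
      - "5000:5000"     # expose the web app on the host
    depends_on:
      - db              # start the database container first
  db:
    image: postgres:9.5 # an off-the-shelf database image
    environment:
      POSTGRES_PASSWORD: example
```

Running `docker-compose up` brings up both containers together, which is one way Docker smooths the ‘ops’ side for developers: the whole stack is described in a file rather than configured by hand on each machine.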
Docker’s place within data science isn’t quite as clearly defined or fully realized as it is in web development. But it’s easy to see why it would be so useful to anyone working with data. What I like is that with Docker you really get back to the ‘science’ of data science: it’s the software version of working in a sterile, controlled environment. This post gives a great insight into just how useful Docker is for data work; admittedly it wasn’t something I had thought much about, but once you do, it’s clear how simple the idea is. As the author puts it: ‘You can package up a model in a Docker container, go have that run on some data and return some results — quickly. If you change the model, you can know that other people will be able to replicate the results because of the containerization of the model.’
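A minimal sketch of that ‘sterile and controlled environment’ idea might look like the Dockerfile below: pin the base image and library versions so anyone who runs the container gets the same environment, and therefore reproducible results on the same data. The script name and library versions here are assumptions for illustration:

```dockerfile
# Hypothetical example: a reproducible environment for running a model.
# Pinning the base image and exact library versions means everyone who
# runs this container works against the same environment.
FROM python:3.9-slim

RUN pip install --no-cache-dir \
    numpy==1.21.0 \
    pandas==1.3.0 \
    scikit-learn==0.24.2

WORKDIR /work
COPY train_model.py .

# The container runs the model end-to-end when started
CMD ["python", "train_model.py"]
```

Colleagues could then run the same model on their own data by mounting it into the container, e.g. `docker run -v "$(pwd)/data:/work/data" my-model`, without ever installing the libraries locally.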
Wherever Docker rears its head, it’s clearly a tool that can be used by everyone. However you identify — web developer, data scientist, or anything else for that matter — it’s worth exploring and learning how to apply Docker to your problems and projects. Indeed, the huge range of Docker use cases is possibly one of the main reasons that Docker is such an impressive story — the fact that there are thousands of other stories all circulating around it. Maybe it’s time to try it and find out what it can do for you?
To dive into Docker visit Packt Publishing’s dedicated Docker page, featuring our top titles and video courses, as well as some of our quick and accessible free Docker tutorials.
Originally published at www.packtpub.com.