Container Sensemaking: What Your Peers Are Up To
You get a container! And YOU get a container! You get a container! And YOU! EVERYBODY GETS A CONTAINER!!!
Like an Oprah season-premiere giveaway, IT organizations across industries are handing out containers to every workload in the room. These containers are filled with hopes and dreams of increased speed, agility and workload density. But is this the wrong approach and the wrong expectation? Not necessarily… but the reality takes just a bit of the rose-coloring off those glasses.
The reality shaking out as more and more businesses test container approaches and implementations is that containerization will help some workloads more than others. And if you use containers in the wrong way, or for the wrong workload, you'll end up wasting more time and resources than you save.
In a recent webinar, Forrester stated that it sees between 30% and 35% of businesses testing containerization, with roughly 10% running containers in production. That number jumps when you poll DevOps-focused and digitally transformed businesses, but cutting across all industries and surveys, overall market adoption is around 10% today and growing rapidly. Most surveys expect that number to grow to 50% or more in the next two years.
Those surveys match our one-on-one conversations with clients as well. As more businesses shift to defining their core value in the software they deliver to internal and external customers, stable, innovative deployments become key. Several of our customers that previously defined their core value broadly across multiple services are shifting toward a software-development focus in IT as their business transforms and their customers look to consume those services in an application model.
Nearly every business we’ve talked to is looking to spend less time on the operational part of their IT business and put that time back into innovation and development efforts, which drives their business forward. Containers, while not the only answer to that problem, are certainly an attractive tool to help shift that work from one bucket to the next.
The market is in a state of exploration today as businesses try to figure out exactly what this supposed panacea can truly cure to achieve their business goals. So, what are your peers learning that you should know?
Market Understanding #1 — Containers are not VMs, and VMs are not Containers.
Just as cloud-washing was common ten years ago, container-washing can be seen in many places today. The common market understanding is that containers and VMs are fundamentally different in their design, use cases and purpose.
Docker Technical Evangelist Mike Coleman describes the difference this way: "With containers, you share the underlying resources of the Docker host and you build an image that is exactly what you need to run your application. You start with the basics and you add what you need. VMs are built in the opposite direction. You are going to start with a full operating system and, depending on your application, might strip out the things you don't want."
Image from SDXcentral.com: https://www.sdxcentral.com/cloud/containers/definitions/containers-vs-vms/
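Coleman's "start with the basics and add what you need" model is easiest to see in a Dockerfile. As a hedged illustration only (the base image, app file and port below are hypothetical, not from any specific deployment), a minimal service image might look like:

```dockerfile
# Start from a slim base image, not a full operating system install
FROM python:3.11-slim

WORKDIR /app

# Add only what this application actually needs, layer by layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .

# The container shares the host's kernel; only this process runs inside it
EXPOSE 8080
CMD ["python", "app.py"]
```

The equivalent VM workflow runs in the opposite direction Coleman describes: you would start from a full guest OS template and then strip out or ignore whatever the application doesn't need.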
Market Understanding #2 — Containers and VMs are not mortal enemies.
Containers and VMs are not oil and water. They mix well together, and in fact we expect most future datacenters will run both, each with its ideal use case and workload fit. In today's world, however, many companies are learning containers within their existing environments, which means deploying containers inside a VM.
As containers continue to mature, many analysts and experts recommend this in-VM deployment model to increase the stability and security of the containers. While this is not necessarily the ideal long-term model, it is a generally agreed-upon starting approach for businesses that already have VM expertise in-house, or that have established IaaS hosting partnerships, because it lets your team learn the tool without taking on the new operational overhead of shifting to a bare-metal provider.
Containers require a new way of thinking, servicing and approaching application deployment and development, so deploying containers in VMs to start can be a nice set of training wheels to gain value without introducing too much risk.
Market Understanding #3 — There’s no one right place to run containers.
Some equate containers with cloud, but that's just not the case. Containers can run in hyper-scale clouds, with managed cloud hosting providers, in private clouds and on bare metal. Nothing about containers requires hosting in one type of environment over another.
Rather than letting the architecture of the application dictate where it is hosted, the business needs of the application should drive that decision. This is one area that doesn't fundamentally change from the past. If your application's business needs make it a best fit for the cloud, it can stay in the cloud under its new architecture model. If it was a best fit on-prem previously, that doesn't have to change simply because the architecture changes in the future.
Market Understanding #4 — "Mode 2", or cloud-native, apps are a better fit than your incumbent/legacy "Mode 1" apps.
In an ideal future, every application would leverage the latest and greatest technology. But in reality, that's likely never going to happen. Just as many companies never achieved 100% virtualization, resource constraints mean most companies will not be able to re-architect all of their applications; it's just not feasible.
In the near term, "Mode 2" and cloud-native apps are going to be the best fit for adopting containers, often in concert with microservices. By the nature of where these applications are today, it's going to be easier to take advantage of the benefits of containerization and DevOps practices with them. Re-architecting "Mode 1" applications requires time, money and overhead that often can't be spared when weighed against everything else on the team's priority list.
Transformation doesn’t happen in a day, and this is going to be a lengthy learning process for everyone in the market. As the technology evolves, we’re going to learn more and gain efficiencies in the process. What should be comforting to know, however, is the key to success with containers doesn’t change from the past. Take small steps, keep learning, ask questions and share what you’ve learned, when you can.
— Diana Nolting is a Product Analyst at Bluelock, an industry-leading Disaster Recovery-as-a-Service provider for complex environments and sensitive data. With five years' experience in the cloud service provider market, Nolting pairs her insatiable appetite for consuming information with user interviews, industry articles and analyst reports. She transitioned to the tech world from the media and sports entertainment industry and holds a Bachelor of Journalism from Indiana University Bloomington. Connect with Diana on Twitter at @diananolting.