How Software Teams Accelerated Average Release Frequency from Three Weeks to Three Minutes

A Quick Guide to DevOps for Non-Engineers

DataKitchen
data-ops
4 min read · Mar 1, 2017


In a recent blog post we discussed how software development teams accelerated average release frequency from 12 months to three weeks using Agile Development. Today, we will talk about taking the next gigantic step — accelerating from three weeks to three minutes.

Before the advent of on-demand cloud services, the various groups involved in software development (design, development, test, quality, support, …) had to set up their own infrastructure. Whatever components were needed (physical servers, networks, storage, software, …) had to be ordered, installed, configured and managed by the IT department. Servers would be ordered at different times and from different vendors, each slightly different from the others. Depending on the task at hand, different machines could run different sets of software applications, with each application's revisions continuously being updated. With all this variability, it was quite common for individuals within the same company to be running code in different environments. Outside the four walls of the company, customers could be running in yet another environment. This situation presented challenges.

If, for example, a customer reported a problem, it might not be reproducible in the support, test or development groups because of differences in the hardware and software environments being run. This lack of alignment fostered misunderstandings and delays, and it often eroded trust and communication between the various stakeholders.

About a decade ago, Amazon Web Services (AWS) and other cloud providers began offering computing, storage and other IT resources as an on-demand service. No more waiting weeks or months for the IT department to fulfill a request for servers. Cloud providers now allow you to order computing services and pay only for what you use, whether that is one processor for an hour or thousands of processors for months. These cloud services have enabled developers to write code that provisions processing resources, with strictly specified environments, on demand and in just a few minutes. This capability is called Infrastructure as Code (IaC). IaC has made it possible for everyone in the software development pipeline, all the different groups mentioned above, to use an identical environment tailored to the application's requirements. With IaC, design, test, QA and support can easily get on the same page. This leads to much better collaboration between the groups and breaks down the barriers that prevented open communication. In other words, no more finger pointing.
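
To make this concrete, here is a minimal sketch of what Infrastructure as Code can look like in practice. It uses Python with the AWS boto3 library; the region, machine image and instance type shown are illustrative placeholders rather than recommendations, and running it would require an AWS account and credentials.

```python
# Infrastructure as Code in miniature: instead of filing a ticket with IT,
# a few lines of code request a server with an exactly specified environment.
# The region, image ID and instance type below are placeholders for illustration.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # a specific, versioned machine image
    InstanceType="t3.micro",          # an exact hardware profile
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "environment", "Value": "test"}],
    }],
)

print("Provisioned instance:", response["Instances"][0]["InstanceId"])
```

Because the environment is described in code, the exact same definition can be run by development, test, QA and support, which is what puts everyone on the same page.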

With IT infrastructure defined by code, the hard division between IT operations and software development begins to blur. This merger of development and operations is where the term DevOps comes from.

With the automated provisioning of resources, DevOps paved the way for a fully automated test and release process. Deploying code, which used to take weeks, can now be completed in minutes. Major organizations including Amazon, Facebook and Netflix now operate this way. At a recent conference, Amazon disclosed that their AWS team performs 50,000,000 code releases per year. That is more than one per second! This methodology of rapid releases is called continuous delivery, or continuous deployment when new features (and fixes) are not only delivered internally but deployed all the way to customers.
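
For readers who want a feel for what "fully automated test and release" means, here is a deliberately simplified sketch, in Python, of the logic at the heart of a continuous delivery pipeline: every change is built, tested, and released only if the tests pass, with no manual hand-offs. The commands and the deploy step are hypothetical placeholders; in practice this logic usually lives in tools such as Jenkins, GitLab CI or AWS CodePipeline.

```python
# A deliberately simplified continuous delivery loop: build, test and release
# a change with no human hand-offs. The commands and the "release" step are
# illustrative placeholders, not a real pipeline definition.
import subprocess
import sys

def run(step_name, command):
    """Run one pipeline step; stop the pipeline if it fails."""
    print(f"=== {step_name} ===")
    result = subprocess.run(command, shell=True)
    if result.returncode != 0:
        print(f"{step_name} failed; the change is NOT released.")
        sys.exit(result.returncode)

run("build", "python -m pip install -e .")        # build/install the code
run("test", "python -m pytest")                   # run the automated test suite
run("release", "echo deploying to production")    # placeholder deploy step

print("Change released in minutes, not weeks.")
```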

DevOps improves collaboration between employees from the planning phase through the deployment phase of software development. It seeks to reduce time to deployment, decrease time to market, minimize defects, and shorten the time required to fix problems.

The impact of DevOps on development organizations was shown in a 2014 survey, “The 2014 State of DevOps Report” by Puppet Labs, IT Revolution Press and ThoughtWorks, based on 9,200 survey responses from technical professionals. The survey found that IT organizations implementing DevOps were deploying code 30 times more frequently and with 50 percent fewer failures. Further, companies with these higher performing IT organizations tended to have stronger business performance, greater productivity, higher profitability and larger market share. In other words, DevOps is not just something that engineers are doing off in a dark corner. It is a core competency that helps good companies become great.

A data analytics team transforms raw data into actionable information that improves decision making and provides market insight. Imagine an organization with the best data analytics in the industry. That organization would have a tremendous advantage over its competitors. That could be you.

The lessons learned in DevOps can be applied to data analytics. At DataKitchen, we call this DataOps. To explore DataOps further, it helps to understand Agile Development and a testing methodology called statistical process control (SPC), which is a foundation of lean manufacturing. We'll cover SPC in our next blog post.
