Cloud computing is everywhere. It has become a term the general public recognizes, and within our industry it is revolutionizing the way we operate and what is possible. Surprisingly, even with that level of recognition in the tech industry, I continually get questions from industry peers about whether the cloud is right for their situation, especially from a financial perspective. This article addresses some of the most common and basic factors a company needs to consider when determining whether a cloud solution will lower its costs compared to traditional alternatives.
There are numerous options when it comes to leveraging the cloud, especially in how you elect to utilize it and pay for it. All of the leading cloud providers offer a variety of pricing models, which, while beneficial, can be confusing and ultimately lead to inefficient usage and wasted spending. The central benefit to customers is the ability to “rent” infrastructure by the hour and pay only for what they use. This differs from traditional models where companies buy infrastructure up front, incurring cost regardless of how much they utilize that infrastructure.
The most common pricing model is one where the customer pays a set rate for every unit of usage, the unit most often being an hour. This model was the first one offered by cloud providers and is still the most heavily used. The unit price for a piece of infrastructure varies depending upon three factors.
- Resource Specifications: The physical specifications of the resource will impact the hourly cost. These can range from the CPU, memory, and operating system of a server to the level of I/O for a disk, the amount of provisioned throughput for a managed database service, and so on.
- Physical Location: All the major cloud providers have multiple regions around the world where they offer their services. Prices for the same service vary slightly from region to region, so location affects the hourly cost.
- Point in Time: Last but not least, prices will vary over time. While in theory this rate can increase or decrease, historically cloud providers have lowered the cost of many of their services since their inception as they attempt to remain price competitive with each other.
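To make the three factors above concrete, here is a minimal sketch of usage-rate pricing in Python. The rates, instance sizes, and region names are all invented for illustration; real prices come from each provider's published rate card and change over time.

```python
# Hypothetical hourly rates: these numbers are invented for illustration,
# not taken from any real provider's price list.
HOURLY_RATES = {
    # (instance_size, region) -> USD per hour
    ("small", "us-east"): 0.023,
    ("small", "eu-west"): 0.026,
    ("large", "us-east"): 0.384,
    ("large", "eu-west"): 0.432,
}

def monthly_cost(size: str, region: str, hours: float) -> float:
    """Usage-rate pricing: unit price times hours actually consumed."""
    return HOURLY_RATES[(size, region)] * hours

# A large instance running around the clock for a 30-day month:
print(f"${monthly_cost('large', 'us-east', 30 * 24):.2f}")  # $276.48
```

The same lookup captures all three factors: the specification and region select the unit price, and the price table itself would be updated as providers adjust rates over time.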
This usage rate pricing option has existed since the inception of the public cloud, and for many companies it is the only option they ever exercise. If a company compares this model to its traditional options, it is a fairly straightforward comparison of how much it would cost to purchase and manage infrastructure on their own versus how much it would cost to rent that infrastructure from a cloud provider.
So when does it make sense to leverage a public cloud provider? There is no universal answer; it greatly depends upon the details and requirements of each possible scenario. That being said, there are several situations where it makes financial sense for a company to utilize usage rate pricing.
The first area a company should consider when comparing cloud versus its traditional model is the total cost of ownership: what is the total cost of purchasing the infrastructure and maintaining it over a set time period? The relatively easy part of this evaluation is determining how much the infrastructure itself would cost in a cloud and a non-cloud model. The harder calculation is determining the true cost of the people needed to fully manage that infrastructure on an ongoing basis. In many cases, a company might not have that expertise in-house and has to deal with the inefficiencies of not being able to adequately administer the infrastructure. That inefficiency could leave infrastructure unreliable, insecure, or unable to handle the desired use cases; for smaller companies, that inefficiency alone might justify partnering with a public cloud provider. In scenarios where companies do have the expertise in-house, the question is how much it costs to manage the infrastructure themselves and whether they are comfortable carrying that ongoing cost, as compared to allowing the cloud provider to manage the infrastructure for them. This last calculation can be complicated for a lot of companies. Issues such as administrators having broader responsibilities than just the cloud infrastructure in question, comfort levels around a first foray into the cloud, and fear of the unknown will all influence this financial comparison.
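A back-of-the-envelope version of this total cost of ownership comparison can be sketched as follows. Every figure here (hardware price, administrative overhead, hourly rate) is an assumption chosen for illustration, not a real quote; the point is only the shape of the comparison.

```python
def on_prem_tco(hardware_cost: float, annual_admin_cost: float, years: int) -> float:
    """Buy the infrastructure up front, then pay people to run it each year."""
    return hardware_cost + annual_admin_cost * years

def cloud_tco(hourly_rate: float, hours_per_year: float, years: int) -> float:
    """Rent by the hour; the provider manages the underlying hardware."""
    return hourly_rate * hours_per_year * years

# Illustrative numbers only: a $20k server plus a fraction of an
# administrator's time, versus a comparable instance at $0.50/hour
# running continuously for three years.
print(on_prem_tco(20_000, 15_000, 3))  # 65000
print(cloud_tco(0.50, 365 * 24, 3))    # 13140.0
```

In practice the hard part is the `annual_admin_cost` input: as discussed above, administrators rarely work on one system exclusively, so attributing their cost to a specific piece of infrastructure is itself an estimate.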
Another important consideration that impacts a cloud decision is the usage level of the infrastructure combined with the ability to pay only for what is consumed. A clear understanding of the planned use of the infrastructure is necessary for this evaluation. Historically, companies have acquired infrastructure, paid for it up front, amortized it over several years, and run it constantly over that time period. With this model, companies have sized infrastructure based upon their peak estimated usage, essentially purchasing enough infrastructure to support that worst-case usage scenario.
With a public cloud provider, there is potential to save money compared to the traditional model described above by only using what you need when you need it. This means turning off infrastructure when it's not in use and only adding new infrastructure when there is demand for it. To figure out if a customer can benefit from this approach, they need to consider the following questions:
- Will the infrastructure run continuously, or will it only run for specific durations? It is very common for infrastructure to only be needed for a finite time period. An example would be a server that runs calculations once per month and sits idle the rest of the time, or development and test servers that are heavily used Monday through Friday during regular business hours but sit silent on nights and weekends. Every company has its own usage patterns; with public cloud, the task is to closely evaluate whether periods of non-use offer potential cost savings.
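The dev/test server example above lends itself to simple arithmetic. This sketch assumes a made-up hourly rate and a Monday-to-Friday, ten-hours-a-day usage window; the exact savings depend entirely on a company's real schedule.

```python
ALWAYS_ON_HOURS = 7 * 24  # 168 hours in a week
BUSINESS_HOURS = 5 * 10   # Mon-Fri, ten hours a day

def weekly_cost(hourly_rate: float, hours: int) -> float:
    """Usage-rate pricing over one week: pay only for the hours running."""
    return hourly_rate * hours

rate = 0.10  # assumed hourly rate for a small dev/test server
always_on = weekly_cost(rate, ALWAYS_ON_HOURS)
office_hours_only = weekly_cost(rate, BUSINESS_HOURS)
print(f"weekly savings: {1 - office_hours_only / always_on:.0%}")  # weekly savings: 70%
```

With traditional up-front purchasing the off-hours cost the same as the busy hours, so this roughly 70% reduction is savings that only the pay-per-use model can capture.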
- What levels of activity will occur during periods of usage? A company must evaluate whether the infrastructure will operate at a constant level of usage or whether usage will vary. In the example of a website, historically a company would purchase enough web servers to support the peak load of that website. That peak load could occur daily, weekly, monthly, or even yearly; regardless of how often it was experienced, the number of servers required to service it had to be purchased. This results in wasted capacity, with servers running underutilized for significant periods of time. With a cloud provider, a company can elastically scale the number of servers required to support the ongoing load: during busy times more servers can be brought into service, and during slower times excess servers can be turned off, lowering costs.
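The peak-provisioning waste in the website example above can be quantified with a toy demand profile. The hourly server counts here are invented purely to illustrate a daily peak; a real evaluation would use measured traffic.

```python
# Invented hourly demand profile for one day: how many web servers
# are needed in each of the 24 hours, with a mid-day peak of 10.
demand = [2] * 7 + [6] * 4 + [10] * 3 + [6] * 6 + [2] * 4

# Traditional: buy enough servers for the peak and run them all day.
fixed_server_hours = max(demand) * len(demand)

# Elastic: scale the fleet hour by hour to match actual demand.
elastic_server_hours = sum(demand)

print(fixed_server_hours, elastic_server_hours)  # 240 112
```

Under usage-rate pricing the bill is proportional to server-hours, so in this invented profile the elastic fleet costs less than half of the peak-sized one, even though both serve every request.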
This provides some insight into the most common and basic ways to evaluate whether a cloud-based solution will help a company lower costs. There are additional methods for lowering costs further with a cloud solution, such as buying capacity in advance or buying excess capacity from cloud providers, topics I hope to cover in a future article. The key point to remember when evaluating the cloud from a financial viewpoint is that you only pay for what you consume. If you keep that thought in mind and look at your systems' usage from a different perspective, you could dramatically lower your costs compared to your existing models.