Rethinking Revenue and Demand Forecasts with Deep Learning


With so much buzz around the new uses of artificial intelligence and machine learning, it’s easy to miss how these technologies can vastly improve basic activities we’ve been doing for years. Revenue and demand forecasting is one of those areas.

Deep learning is a rapidly evolving technology that is revolutionizing many business processes, including forecasting. Several of its advantages make it particularly well suited to revenue forecasting. Deep learning models:

· Produce more accurate forecasts than traditional methods

· Recognize complex and less obvious interacting patterns

· Make many related forecasts, such as revenue by profit center or sales by product

· Use a robust, highly scrutinized methodology that doesn’t live in a spreadsheet

· Enable rapid, less expensive prototyping on cloud platforms

Model Accuracy

Simply put, deep learning-based forecasting methods produce some of the most accurate models in use. A recent study of 27 different forecasting techniques concluded that the four most accurate models were all based on deep learning. Compared with basic moving-average models, deep learning methods were as much as twice as accurate. Compared with other advanced statistical models such as ARX, or machine learning models such as XGBoost, deep learning techniques offer an improvement of 10% to 20%.
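To make such comparisons concrete, accuracy is typically scored with an error metric such as MAPE (mean absolute percentage error) on held-out data. Here is a minimal Python sketch of that measurement; the revenue figures and both forecasts below are invented for illustration and are not taken from the study:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error: lower is better."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

# Illustrative monthly revenue (in $k) and two hypothetical forecasts
actual        = [120, 135, 128, 150, 162, 158]
moving_avg    = [118, 121, 128, 130, 138, 150]   # naive moving-average forecast
deep_learning = [121, 133, 130, 148, 160, 159]   # deep learning forecast

print(f"Moving average MAPE: {mape(actual, moving_avg):.1f}%")
print(f"Deep learning MAPE:  {mape(actual, deep_learning):.1f}%")
```

Improvements like the 10% to 20% figure above come from comparing error metrics of this kind, computed on the same held-out data, across competing models.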

Recognizing Complex Patterns

Today’s accurate and detailed revenue forecasts must take into account quarterly, monthly, seasonal, and perhaps even daily or hourly patterns present in the data. Modeling these effects and their interactions by hand quickly becomes a daunting task.

In a machine learning approach such as deep learning, we don’t tell the model what these patterns are; rather, we ask it to learn them from historical data. Deep learning uses many historical observations to decompose complexity into its underlying components, which can then be modeled.

Figure: The DeepAR algorithm detecting and decomposing a time series (blue) into a set of component parts. Source: https://docs.aws.amazon.com/sagemaker/latest/dg/deepar_how-it-works.html
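DeepAR learns these components internally, but the idea of decomposition is easy to demonstrate with a classical tool. The sketch below, plain statsmodels rather than deep learning and run on a synthetic series, splits monthly revenue into the trend, seasonal, and residual components a forecasting model must capture:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly revenue: upward trend + yearly seasonality + noise
months = pd.date_range("2015-01-01", periods=48, freq="MS")
rng = np.random.default_rng(42)
revenue = pd.Series(
    100 + 0.8 * np.arange(48)                       # trend
    + 10 * np.sin(2 * np.pi * np.arange(48) / 12)   # yearly seasonal pattern
    + rng.normal(0, 2, 48),                         # noise
    index=months,
)

# Decompose into the underlying components
parts = seasonal_decompose(revenue, model="additive", period=12)
print(parts.trend.dropna().head())
print(parts.seasonal.head(12))  # the recovered 12-month pattern
```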

This approach transfers much of the complexity from Excel spreadsheets and human brains to powerful algorithms and hardware running efficiently in the cloud.

Making Many Related Forecasts

As the business cycle continues to accelerate, there is demand for more frequent, more granular forecasts. With traditional statistical models such as ARIMA, a separate model must be built for each individual product or business-unit forecast (each of these is called a “series,” as in “time series forecasting”). This makes each model much more susceptible to noise, because each one sees only a small subset of the data. Additionally, the large number of resulting models can be cumbersome to manage.
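As a rough sketch of that traditional workflow, here is one independently fitted ARIMA model per product, using statsmodels on synthetic demand data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Synthetic monthly demand history for three products (three separate series)
products = {
    name: pd.Series(rng.poisson(lam, 36),
                    index=pd.date_range("2016-01-01", periods=36, freq="MS"))
    for name, lam in [("widget", 50), ("gadget", 80), ("gizmo", 20)]
}

# Traditional approach: fit a separate model for every series
forecasts = {}
for name, series in products.items():
    fitted = ARIMA(series, order=(1, 1, 1)).fit()
    forecasts[name] = fitted.forecast(steps=3)   # 3-month-ahead forecast

for name, fc in forecasts.items():
    print(name, fc.round(1).tolist())
```

Each of the three models sees only its own 36 observations, so noise in any one series flows straight into that series’ forecast.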

Deep learning models take a different approach to this problem. A single model is fit across all series and seeks to maximize accuracy across the whole data set. This approach produces a single model capable of generating many related forecasts, or even producing forecasts for previously unseen data. For example, only a couple of data points are required to forecast a new product: patterns learned from long-standing products are applied to new ones where years of historical data are not available.
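The following PyTorch sketch is a deliberately simplified illustration of the global-model idea, not DeepAR itself: a single network, with a learned embedding per series, trained on windows pooled from every series so that patterns from data-rich products can inform data-poor ones. The shapes and training data here are placeholders:

```python
import torch
import torch.nn as nn

class GlobalForecaster(nn.Module):
    """One model shared across all series, with a per-series embedding."""
    def __init__(self, n_series, embed_dim=8, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(n_series, embed_dim)
        self.lstm = nn.LSTM(1 + embed_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, window, series_id):
        # window: (batch, seq_len, 1); series_id: (batch,)
        emb = self.embed(series_id)                           # (batch, embed_dim)
        emb = emb.unsqueeze(1).expand(-1, window.size(1), -1)
        out, _ = self.lstm(torch.cat([window, emb], dim=-1))
        return self.head(out[:, -1])                          # next-step prediction

# Toy training loop over windows pooled from every series
model = GlobalForecaster(n_series=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

windows = torch.randn(64, 12, 1)      # 64 windows of 12 steps (placeholder data)
ids = torch.randint(0, 3, (64,))      # which series each window came from
targets = torch.randn(64, 1)          # the next value to predict

for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(model(windows, ids), targets)
    loss.backward()
    opt.step()
```

Because the temporal dynamics live in the shared LSTM weights, forecasting a brand-new product mostly means learning its small embedding, which is why only a handful of observations can suffice.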

Armed with deep learning, companies are increasingly moving to more granular forecasts, such as forecasting demand for individual SKUs or profit centers, rather than divisions or business units.

Robust Methodology

Adopting a deep learning approach presents a great opportunity to move away from spreadsheets and user-written code to a more robust, controlled environment. Since deep learning models are essentially pieces of software, they can take advantage of your existing source control system; proper version control ensures consistency and repeatability of models and forecasts.

Second, because deep learning models are built with common software libraries used across many organizations, you can have peace of mind that the code underpinning your models has been heavily scrutinized.

More Accessible than Ever

For several years, cloud platforms have been providing access to the infrastructure necessary to build deep learning models. Hardware that traditionally cost tens or hundreds of thousands of dollars is now available on demand for a few dollars an hour. These platforms enable experienced data science teams to quickly develop custom, flexible, and accurate deep learning models with a very low barrier to entry.

Further, within the last twelve months, several products have emerged that offer deep learning-based forecasts as a managed service. These products offer a simple “just bring your data” model that hides much of the complexity from the user while allowing analysts and citizen data scientists to leverage cutting-edge algorithms. AWS released its Amazon Forecast product at its re:Invent conference in November 2018, allowing largely non-technical users to employ the same demand forecasting models in use today at Amazon.com. Additionally, leaps forward in automated machine learning tools such as DataRobot mean that forecasts can now be produced without writing a single line of code.

Final Thoughts

Deep learning is one of the hottest topics in data science today. While adopting such cutting-edge technology can sound daunting, new products and recent developments in cloud platforms have removed the largest barriers to entry. Companies that employ these techniques will enjoy the ability to make more informed decisions from data they already have, and the agility to keep pace with an accelerating marketplace.