Identifying Cross-Country Inflation Trends with an Autoencoder in TensorFlow

ODSC - Open Data Science
Published in ODSCJournal · 7 min read · Jun 8, 2022

Editor’s Note: Isaiah Hull is a speaker for ODSC Europe 2022 this June 15th-16th. Be sure to check out his talk, “Machine Learning for Economics and Finance in TensorFlow 2,” there!

The consumer price index (CPI) measures the cost of a fixed basket of goods and services that the average household purchases over the course of a year. An increase in the price of a single good or service is referred to as a relative price increase, whereas a rise in the general level of prices (that is, an increase in the total cost of the basket) is called inflation. Inflation implies that the value of money has fallen relative to the value of goods and services.

Empirically, CPI inflation exhibits strong comovement across countries. Inflation rose sharply in many high-income countries at the same time in the 1960s and 1970s, but then remained low and stable in those same countries during the Great Moderation, which started in the mid-1980s.

There are several reasons why we might expect cross-country correlation in inflation rates. One reason is that the prices of many goods in the CPI basket are determined in global markets. An increase in commodity prices, for instance, will tend to increase inflationary pressure in all countries. Another reason is that similarity in policy regimes across central banks will tend to produce correlated inflation outcomes across countries.


Are the Drivers of Inflation Global or Local?

How can we determine whether inflation is being driven by global or local factors at a given point in time? And how can we determine what those factors are? We can, of course, examine individual components of the CPI and compare them across countries. Food and energy prices, for instance, are volatile and often comove strongly across countries. Another common approach is to use Principal Components Analysis (PCA), a dimensionality reduction technique.

PCA takes a set of features (in this case, CPI inflation time series for different countries) and maps them to a smaller set of "principal components," which are uncorrelated and ordered by the share of variance that they explain. The first principal component, for instance, explains the largest share of the variance. In our example, it is likely to capture the most important global driver of inflation.
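For intuition, a linear decomposition of this kind can be computed in a few lines with scikit-learn. The sketch below is illustrative only: it assumes that inflation is a pandas DataFrame with one column of CPI inflation per country, as we will construct later in this post.

# Illustrative PCA sketch (assumes `inflation` is a pandas DataFrame
# with one column of CPI inflation per country).
from sklearn.decomposition import PCA
# Extract the first five principal components.
pca = PCA(n_components = 5)
components = pca.fit_transform(inflation)
# Print the share of variance explained by each component.
print(pca.explained_variance_ratio_)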

If we expect the relationships in CPI inflation across countries to be linear, then PCA will typically be the best choice for dimensionality reduction. If we want to allow for nonlinearities, we may instead want to use an autoencoder. The general architecture of an autoencoder consists of a neural network that performs encoding, joined to a network that performs decoding. The encoder and decoder are connected by a latent state. Analogous to principal components, the latent state provides us with a compact summary of the most important global drivers of inflation.

Preparing the Dataset

We must first collect a suitable dataset. Fortunately, the Bank for International Settlements (BIS) provides a database of inflation series, which are available for download here: https://www.bis.org/statistics/cp.htm. We will use 57 of the available countries, discarding those that had a hyperinflation event and those with missing observations over the sample period of January 1996 to November 2021. We will load this data into Python and then convert it to a TensorFlow constant object.

# Import modules.
import tensorflow as tf
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
# Load the data (data_path is the directory that contains the BIS CSV).
inflation = pd.read_csv(data_path + 'cpi_inflation.csv', index_col = 'Date')
# Store the country names before converting to a tensor.
cNames = inflation.columns
# Convert data to constant object.
inflation = tf.constant(np.array(inflation), tf.float32)
# Count number of countries in dataset.
nCountries = inflation.shape[1]

Defining and Training the TensorFlow Model

An autoencoder is trained to predict its own inputs. This is sometimes referred to as minimizing the "reconstruction loss," a function of the difference between the inputs and the outputs. Normally, predicting inputs would be a trivial task, but the architecture of an autoencoder intentionally makes this difficult.
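With mean absolute error as the loss function, for example, the reconstruction loss is simply the average absolute gap between the inputs and the model's reconstructions. As a one-line illustration, where x and x_hat are placeholder names for the inputs and the reconstructions:

# Illustrative MAE reconstruction loss (x and x_hat are placeholder names).
reconstruction_loss = tf.reduce_mean(tf.abs(x - x_hat))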

In an autoencoder, the input data is first compressed as it is passed through the “encoder” network. This will necessarily mean that some information is lost. It then passes through a bottleneck layer and arrives at the decoder, where it is upsampled to have the same dimensions as the input layer. During the training process, the model learns what information must be retained and what information is inessential. The information that is critical for reconstructing the inputs is embodied in the latent state.

In the code below, we define the encoder network and the latent state. We use a functional model in tf.keras, where we pass each successive layer as an argument to the layer that follows. Notice that we have a single hidden layer with 32 nodes and a latent layer with five nodes. This means that the latent state will consist of five features. We can loosen or tighten that bottleneck, depending on how many features we want to recover.

# Set number of hidden nodes.
hiddenNodes = 32
# Set number of nodes in latent state.
latentNodes = 5
# Define input layer for encoder.
encoderInput = tf.keras.layers.Input(shape = (nCountries,))
# Define hidden layer for the encoder.
encoderHidden = tf.keras.layers.Dense(hiddenNodes,
    activation = 'sigmoid')(encoderInput)
# Define the bottleneck layer.
latent = tf.keras.layers.Input(shape = (latentNodes,))

The next block defines the decoder network, which inverts the architecture of the encoder and takes the latent state as its input. The decoder network is also defined as a functional model. Notice that the output layer of the decoder network, decoded, uses a linear activation function, since each output node predicts a continuous variable: the CPI inflation rate for a country.

# Define output layer for encoder.
encoded = tf.keras.layers.Dense(latentNodes,
    activation = 'sigmoid')(encoderHidden)
# Define hidden layer for decoder.
decoderHidden = tf.keras.layers.Dense(hiddenNodes,
    activation = 'sigmoid')(latent)
# Define output layer for decoder.
decoded = tf.keras.layers.Dense(nCountries,
    activation = 'linear')(decoderHidden)

Finally, we complete the functional model definitions for the encoder and decoder networks. We then combine them into an autoencoder, which we will use to perform training. For simplicity, we will avoid tuning the model; however, this step could be modified to improve model performance. We will then use the .fit() method to train the model for 1000 epochs. Notice that both the features and target are inflation.

# Define separate models for encoder and decoder.
encoder = tf.keras.Model(encoderInput, encoded)
decoder = tf.keras.Model(latent, decoded)
# Define functional model for autoencoder.
autoencoder = tf.keras.Model(encoderInput, decoder(encoded))
# Compile the autoencoder.
autoencoder.compile(loss = 'mae', optimizer = 'adam')
# Train the autoencoder.
autoencoder.fit(inflation, inflation, epochs = 1000)

Interpreting the Results

We have now trained an autoencoder and can use it to examine our original question — namely, how can we determine whether inflation for a given country and period is driven by global or local factors? Let’s start by making use of the .predict() method to examine the model outputs. We will first recover the autoencoder outputs for each period. These are predicted rates of inflation. We then compute the absolute deviations of these predictions from their true values, as shown in the code block. We also apply the same method to the encoder to recover the latent states.

# Predict inflation series.
inflation_predicted = autoencoder.predict(inflation)
# Compute absolute deviations.
reconstructionLoss = pd.DataFrame(np.abs(inflation - inflation_predicted),
    columns = cNames)
# Generate latent state for inflation time series.
latentState = pd.DataFrame(encoder.predict(inflation))

The plot below shows this absolute prediction error series for Japan. We can see that there are substantial deviations during the Great Recession and again around 2013–2015. This suggests that inflation in Japan was not well-explained by the global factors that the model identified during these periods. This could either be a consequence of poor model performance or the presence of local drivers of inflation.
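As a sketch, a plot like this can be produced with matplotlib from the reconstructionLoss DataFrame computed above. The column label 'JP' is an assumption; replace it with whatever label the BIS file uses for Japan.

# Plot the absolute prediction error series for Japan
# ('JP' is an assumed column label).
reconstructionLoss['JP'].plot()
plt.xlabel('Month')
plt.ylabel('Absolute prediction error')
plt.show()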

In addition to looking at deviations from the model for individual countries, we can also examine the latent states themselves. We plot them in the figure below. Notice that each is bounded within the [0, 1] interval, a consequence of the sigmoid activation used in the latent layer. If we wanted to allow for negative values, we could have used a different activation function.
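A minimal sketch of that figure, plotting each of the five latent series recovered by the encoder in its own panel:

# Plot the five latent state series in separate panels.
latentState.plot(subplots = True, layout = (5, 1), figsize = (8, 10))
plt.show()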

If we wanted to complete the analysis, our next step would be to determine how to make use of the latent states. One possibility would be to compute the correlations with the time series for individual countries. We might expect that large economies, such as China and the United States, are drivers of some of the global components of inflation and, thus, have a strong association with certain latent factors. Another possibility would be to look at correlations between the latent states and prices for specific goods, such as oil and wheat.
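A sketch of the first approach is shown below. It rebuilds a DataFrame of country series from the tensor and correlates each country's inflation with each latent state; cNames and latentNodes are the objects defined earlier in this post.

# Rebuild a DataFrame of country inflation series from the tensor.
inflationDF = pd.DataFrame(inflation.numpy(), columns = cNames)
# Compute correlations between latent states and country series.
correlations = pd.concat([latentState, inflationDF], axis = 1).corr()
# Keep the latent-state rows against the country columns.
print(correlations.iloc[:latentNodes, latentNodes:])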

Conclusions

We showed that an autoencoder could, in principle, be used to decompose inflation into global and local components. Additionally, we discussed what steps could be taken next to complete and interpret that decomposition.

Original post here.

Read more data science articles on OpenDataScience.com, including tutorials and guides from beginner to advanced levels! Subscribe to our weekly newsletter here and receive the latest news every Thursday. You can also get data science training on-demand wherever you are with our Ai+ Training platform. Subscribe to our fast-growing Medium Publication too, the ODSC Journal, and inquire about becoming a writer.
