A Complete Introduction To Time Series Analysis (with R): Stationary processes

Hair Parra
May 8, 2020 · 4 min read
[Figure: A zero-mean stationary time series]

In the last article, we discussed two important models with structure: the trend decomposition model and seasonal variation. We said that one of the most important steps in the general strategy for time series analysis was to remove the signal, i.e., remove either the estimated trend or seasonality or both. In this section, we will talk about stationary processes, and what it means to be stationary. Let’s dive right into it!

Stationary Processes


Let’s consider some time series process Xt. Informally, it is said to be stationary if it behaves roughly the same over time, i.e., its statistical properties do not change as the series evolves. For example, in the graph at the beginning of the article, we see that although there is some fluctuation, the points seem to wander around zero; such a series is called a zero-mean process. We will define more precisely what this means next. For this purpose, we will focus on studying the first and second moments, i.e., the mean, variance, and covariance of a given process.
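As a quick illustration, here is a minimal R sketch (my own toy example, not the code behind the opening figure) that simulates such a zero-mean stationary series using plain Gaussian white noise:

```r
# Sketch: simulate and plot a zero-mean stationary series (Gaussian white noise),
# similar in spirit to the figure at the top of the article.
set.seed(42)                      # for reproducibility
n  <- 200                         # number of time steps (arbitrary choice)
Xt <- rnorm(n, mean = 0, sd = 1)  # i.i.d. N(0, 1) noise: a zero-mean stationary process

plot(Xt, type = "l",
     main = "A zero-mean stationary time series",
     xlab = "t", ylab = expression(X[t]))
abline(h = 0, lty = 2)            # the series fluctuates around zero
```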

There are two kinds of stationarity: weak stationarity and strong stationarity. Both of these are defined below. If you need a refresher on expectation, covariance, variance, and distributions, make sure to check these notes from Stanford’s CS229 machine learning course.

Let’s now check these definitions.

Weak stationarity

Definition (weak stationarity). A process {Xt} is weakly stationary if:

  1. the mean function E[Xt] = μ is the same for every t, and
  2. the autocovariance function γ(t + h, t) = Cov(Xt+h, Xt) does not depend on t, i.e., it can be written as a function γ(h) of the lag h alone.

The first point says that the mean function, whatever it might be, is independent of t; it does not depend on any particular time step.
In the second point, we define the autocovariance function, denoted by the letter gamma, of two observations at times t and s. For the process to be weakly stationary, we require that the covariance between Xt and Xt+h is also independent of t. That is, the autocovariance depends only on the lag h = 0, 1, 2, …, not on where in the series we measure it.
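To make the two conditions concrete, here is a small R sketch (again just an illustration, reusing the simulated white-noise series Xt from above) that estimates the mean and the autocovariances at a few lags:

```r
# Sketch: empirically checking the two weak-stationarity conditions on the
# simulated white-noise series Xt from the snippet above.
mean(Xt)  # sample mean; for a weakly stationary series this estimates a constant mu

# Sample autocovariances gamma(h) for lags h = 0, 1, ..., 20.
# By stationarity they depend only on the lag h, not on the time t.
acf(Xt, lag.max = 20, type = "covariance",
    main = "Sample autocovariance function")
```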


Strong stationarity

This time, we require in addition that the joint distribution of any finite collection of observations (Xt1, …, Xtn) is the same as that of the collection shifted by a lag s, (Xt1+s, …, Xtn+s), for every s. This is indeed much stronger than the previous definition.

Ok, so what do we do with this? It turns out that we would much rather work with stationary series than non-stationary ones, and in fact, most of the theorems and propositions that we will use are based on these assumptions. If a series is not stationary, the idea is to transform it into one (we will see how in a later article).
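As a small preview (the details come in a later article, and differencing is just one common trick), a hypothetical R sketch of turning a trending series into a roughly stationary one could look like this:

```r
# Sketch: differencing as one common way to remove a trend (preview only).
set.seed(123)
time_idx     <- 1:200
trend_series <- 0.5 * time_idx + rnorm(200, sd = 5)  # linear trend + noise: not stationary
diff_series  <- diff(trend_series)                   # first differences X_t - X_{t-1}

par(mfrow = c(1, 2))                                 # two plots side by side
plot(trend_series, type = "l", main = "Non-stationary (trend)")
plot(diff_series,  type = "l", main = "After differencing")
par(mfrow = c(1, 1))
```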

Autocorrelation function

The autocovariance function might seem a little funky at first, but together with the autocorrelation function (ACF), it is in fact an essential tool for time series analysis. Let’s recall the definition of the correlation between two random variables:

Corr(X, Y) = Cov(X, Y) / sqrt(Var(X) · Var(Y))

This tells us how strongly related, or “similar”, two variables are. The ACF of a time series uses the exact same concept, except that it uses the autocovariance function we saw before; that is,

ρ(h) = γ(h) / γ(0) = Corr(Xt+h, Xt),

where γ(h) is the autocovariance at lag h and γ(0) is simply the variance of the series.

So we see that this is no more than the good old correlation. As you may recall, it ranges between -1 (indicating perfect negative correlation) and 1 (indicating perfect positive correlation). The main idea is that, if we plot the ACF, a stationary series will show most or all of the points within certain confidence bounds, without any predictable pattern, like in the following plot:

[ACF plot: the points at all lags lie within the confidence bounds, with no predictable pattern]
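Here is a quick R sketch (a preview of the How to R material; it reuses the white-noise series Xt from earlier) of how such an ACF plot with confidence bounds can be produced:

```r
# Sketch: sample ACF of the simulated stationary series Xt from above.
# acf() draws dashed confidence bounds; for a stationary, uncorrelated series,
# most spikes at lags h > 0 stay inside them, with no predictable pattern.
acf(Xt, lag.max = 30, main = "Sample ACF of a stationary series")

# Note: the default type = "correlation" gives rho(h) = gamma(h) / gamma(0),
# so the spike at lag 0 always equals 1.
```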

Next time

Next time, we will illustrate these concepts with various processes that are and are not stationary, along with the How to R sections to produce these plots. Stay tuned!

Previous article

Models with Structure

Main page

Follow me at

  1. https://blog.jairparraml.com/
  2. https://www.linkedin.com/in/hair-parra-526ba19b/
  3. https://github.com/JairParra
  4. https://medium.com/@hair.parra

Written by Hair Parra

Data Scientist & Data Engineer at Cisco, Canada. McGill University CS, Stats & Linguistics graduate. Polyglot.

Analytics Vidhya

Analytics Vidhya is a community of Analytics and Data Science professionals. We are building the next-gen data science ecosystem https://www.analyticsvidhya.com
