Introduction to Markov Chains

David Mavrodiev
3 min read · Apr 29, 2024

In this article, I will define what Markov chains are, explore their properties, and discuss how they behave.

Definition

A Markov chain is a mathematical system that experiences transitions from one state to another according to certain probabilistic rules. Essentially, it is a graph where the nodes represent states and the edges represent the probabilities of transitioning from one state to another. Here’s a simple example to illustrate a Markov chain and its associated transition matrix:

[Figure: Markov chain example]
[Figure: Transition matrix of the Markov chain]

For example, the first row, [0.3, 0.2, 0.5], describes the transition probabilities from State 0 as follows:

  • The probability of remaining in State 0 is 0.3.
  • The probability of transitioning from State 0 to State 1 is 0.2.
  • The probability of transitioning from State 0 to State 2 is 0.5.

The probabilities in each row always sum to 1, ensuring that the chain’s entire probability mass is distributed among the possible next states.
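To make this concrete, here is a minimal Python sketch using NumPy. Only the first row of the matrix appears in the example above; the other two rows are illustrative assumptions.

```python
import numpy as np

# Transition matrix: row i holds the probabilities of leaving state i.
# The first row matches the example above; the other rows are assumed
# values for illustration.
P = np.array([
    [0.3, 0.2, 0.5],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Every row must sum to 1: the chain always moves somewhere.
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate a single transition out of State 0.
rng = np.random.default_rng(seed=42)
current_state = 0
next_state = rng.choice(len(P), p=P[current_state])
print(f"Moved from state {current_state} to state {next_state}")
```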

A key assumption of Markov chains is the Markov property, which states that the next state depends only on the current state and not on the sequence of events that preceded it. The conditional probabilities below capture this for a first-order Markov chain, where the transition depends solely on the current state. There are also higher-order Markov chains, where the transition depends on a fixed number of past states, as in the second-order form shown.

Markov property for a first-order Markov chain:
P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

Markov property for a second-order Markov chain:
P(X_{n+1} = x | X_n = x_n, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1})
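You can also observe the Markov property empirically. The rough sketch below, reusing the illustrative matrix from the previous snippet, simulates a long run and estimates the probability of moving from State 0 to State 2, once conditioning only on the current state and once also on the previous state. Both estimates come out close to the first-row entry 0.5, so knowing the earlier history adds nothing.

```python
import numpy as np

# Same illustrative matrix as in the previous snippet.
P = np.array([[0.3, 0.2, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

rng = np.random.default_rng(seed=0)
states = [0]
for _ in range(100_000):
    states.append(rng.choice(len(P), p=P[states[-1]]))

# Transitions out of State 0, with and without knowing the previous state.
after_0 = [states[i + 1] for i in range(1, len(states) - 1)
           if states[i] == 0]
after_1_then_0 = [states[i + 1] for i in range(1, len(states) - 1)
                  if states[i] == 0 and states[i - 1] == 1]

# Both estimates of P(next = 2 | current = 0) should be close to 0.5.
print(sum(s == 2 for s in after_0) / len(after_0))
print(sum(s == 2 for s in after_1_then_0) / len(after_1_then_0))
```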

Properties of Markov Chains

Markov chains exhibit several interesting properties that are not immediately obvious but are crucial for analyzing their behavior:

Recurrent Class of States

This is a set of states in which, starting from any state, the chain returns to that state with probability 1. Moreover, these states form a closed set: once the chain enters the class, no transitions lead out of it.

Transient States

These are states from which the process can move to a state from which it cannot return, so there is a nonzero probability of never coming back. Over time, the probability of finding the chain in a transient state approaches zero.

Reducibility

A Markov chain is reducible if its states can be partitioned so that no transitions are possible from one subset of states to another. If every state can reach every other state, the chain is irreducible.
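For a finite chain, recurrence, transience, and reducibility can all be read off the reachability structure of the transition graph. The rough sketch below, again using the illustrative matrix from earlier, applies the standard facts that a state is recurrent exactly when every state reachable from it can reach it back, and that the chain is irreducible exactly when all states are mutually reachable.

```python
import numpy as np

# Same illustrative matrix as in the earlier snippets.
P = np.array([[0.3, 0.2, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
n = len(P)

# Transitive closure of the transition graph (Warshall's algorithm):
# reach[i, j] is True if state j is reachable from state i.
reach = np.eye(n, dtype=bool) | (P > 0)
for k in range(n):
    reach |= reach[:, k:k + 1] & reach[k:k + 1, :]

# A state of a finite chain is recurrent iff every state it can reach
# can also reach it back; otherwise it is transient.
for i in range(n):
    recurrent = all(reach[j, i] for j in range(n) if reach[i, j])
    print(f"State {i}: {'recurrent' if recurrent else 'transient'}")

# Irreducible iff every state can reach every other state.
print("irreducible" if reach.all() else "reducible")
```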

Periodicity

The period of a state is the greatest common divisor of all step counts at which a return to that state is possible; returns can only occur in multiples of this number. A state with period 1 can be revisited at irregular times, and a chain in which every state has period 1 is aperiodic.
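Here is a rough way to estimate a state’s period numerically with the same illustrative matrix: take the gcd of all step counts at which a return has positive probability, up to a cutoff (the cutoff is a simplifying assumption of this sketch).

```python
import math
import numpy as np

# Same illustrative matrix as above.
P = np.array([[0.3, 0.2, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def period(P, state, max_steps=50):
    """gcd of all n <= max_steps with a positive probability of
    returning to `state` in exactly n steps (0 if no return is seen)."""
    g = 0
    Pn = np.eye(len(P))
    for n in range(1, max_steps + 1):
        Pn = Pn @ P                     # Pn now equals P raised to n
        if Pn[state, state] > 0:
            g = math.gcd(g, n)
    return g

# A self-loop (P[0, 0] = 0.3 > 0) forces period 1, so this chain
# is aperiodic.
print(period(P, 0))
```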

Emerging Behaviours in Markov Chains

When simulating a Markov chain by running a series of transitions, interesting long-term behaviors emerge. If the chain is both irreducible and aperiodic, the probabilities of being in each state eventually stabilize, regardless of the starting state. These limiting probabilities form the stationary distribution.
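As a sketch of how that steady state can be computed, the snippet below repeatedly applies the illustrative transition matrix to an arbitrary starting distribution. For an irreducible, aperiodic chain this power iteration converges to the stationary distribution pi, the fixed point satisfying pi = pi P.

```python
import numpy as np

# Same illustrative matrix as above.
P = np.array([[0.3, 0.2, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Start from any distribution and keep applying the transition matrix.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

print(pi)                        # the stationary distribution
print(np.allclose(pi, pi @ P))   # True: pi is a fixed point of P
```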

You can explore a JavaScript simulation of an irreducible and aperiodic Markov chain here: https://david-mavrodiev.github.io/markov-chains-and-models/markov-chain-simulation.html

[Figure: JavaScript simulation of a Markov chain]

This introduction to Markov chains provides the groundwork for understanding more complex models and their applications in various technological and scientific fields.

Follow-Up

In the follow-up article “Introduction to Hidden Markov Models”, I explain what HMMs are and how they emerge from Markov chains. Go ahead and check it out!
