Markov Chains : A General Introduction
Markov Chain
A Markov chain is a sequence of events in which the probability of the next event depends only on the present one, not on the past. By human analogy: your future depends on your present, not on your past.
The image above shows what a basic Markov chain looks like. Let's break it down:
The probability of staying in state A is 0.6.
The probability of moving from state A to state B is 0.4.
The probability of staying in state B is 0.2.
The probability of moving from state B to state A is 0.8.
In the transition matrix P shown in the image, P(AA) is the probability of staying in state A, P(AB) is the probability of moving from state A to state B, and so on.
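The two-state chain described above can be sketched in a few lines of Python. This is a minimal simulation, assuming the transition probabilities from the example (the function and variable names are illustrative, not from the original post):

```python
import random

# Transition probabilities from the two-state example above
P = {
    "A": {"A": 0.6, "B": 0.4},
    "B": {"A": 0.8, "B": 0.2},
}

def step(state):
    """Pick the next state according to the current state's probabilities."""
    return random.choices(list(P[state]), weights=list(P[state].values()))[0]

def simulate(start, n_steps):
    """Return the sequence of states visited over n_steps transitions."""
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

print(simulate("A", 10))  # e.g. a random walk like ['A', 'A', 'B', 'A', ...]
```

Because the next state is drawn only from the current state's row of probabilities, the code makes the "memoryless" property concrete: nothing about earlier states is consulted.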
Transition Matrix
A transition matrix is a square (n x n) matrix that contains the transition probabilities between the n states; each row sums to 1.
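As a sketch, the example chain's transition matrix can be written as a NumPy array, with rows as the current state and columns as the next state (the A/B ordering is an assumption for illustration):

```python
import numpy as np

# Transition matrix for the two-state chain above (order: A, B)
P = np.array([
    [0.6, 0.4],   # P(AA), P(AB)
    [0.8, 0.2],   # P(BA), P(BB)
])

# Every row of a valid transition matrix must sum to 1
assert np.allclose(P.sum(axis=1), 1.0)

# Starting in state A, multiplying by P twice gives the
# distribution over states after two steps
start = np.array([1.0, 0.0])
print(start @ P @ P)  # -> [0.68 0.32]
```

Repeated multiplication by P is how multi-step probabilities are computed, which is where the matrix form pays off.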
From the discussion above, you should now have an idea of what a Markov chain is and how to build a transition matrix from one.
In the next part, we will look at its more formal mathematical definition, followed by a couple of real-world examples.