What is Hidden in the Hidden Markov Model?

Basics of Hidden Markov Models for Data Science Interviews

Vimarsh Karbhari
Acing AI
4 min read · Nov 13, 2018


Hidden Markov Models, or HMMs, are among the most common models for dealing with temporal data. They also come up frequently, in different guises, in data science interviews, usually without being labeled as HMMs. It is therefore important to be able to recognize a problem as an HMM problem from the characteristics of HMMs.

With a Hidden Markov Model, we construct an inference model based on the assumptions of a Markov process.

The Markov process assumption is that the “future is independent of the past given that we know the present”.

This means that the future state depends only on the immediately previous state, not on any states before that. Models with this property are first-order HMMs.
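The first-order assumption can be sketched in a few lines. The weather states and transition probabilities below are illustrative assumptions, not values from the article:

```python
# A first-order Markov chain over two weather states.
# The transition probabilities here are made up for illustration.
transition = {
    "rain": {"rain": 0.7, "dry": 0.3},
    "dry":  {"rain": 0.3, "dry": 0.7},
}

def tomorrow_distribution(today):
    """The distribution over tomorrow's state depends only on today's
    state, never on earlier history: the first-order Markov assumption."""
    return transition[today]

# The prediction is the same regardless of what happened before today.
print(tomorrow_distribution("rain"))  # {'rain': 0.7, 'dry': 0.3}
```

Note that the function takes only today's state as input; a second-order model would also need yesterday's.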

What is Hidden?

With HMMs, we cannot directly observe which state the system is in; instead, each state produces an output with some probability. We observe the outputs over time to infer the sequence of hidden states.

Example: if you are staying indoors, you will be dressed a certain way. Let's say you want to step outside. Depending on the weather, your clothing will change. Over time, as you become familiar with the area and its climate, you observe the weather and make better judgements about what to wear. In an HMM, we observe the outputs over time and infer the sequence of states based on how likely each state was to produce those outputs.

HMMs — Adapted from Russell and Norvig, Chapter 15.

Let us consider the situation where you have no view of the outside world while you are in a building. The only way for you to know whether it is raining outside is to see someone carrying an umbrella when they come in. Here the evidence variable is Umbrella, while the hidden variable is Rain. See the probabilities in the diagram above.

HMM representation

Since this is a Markov model, R(t) depends only on R(t-1).
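This dependency is exactly what the filtering (forward) recursion exploits: each day's belief is computed from the previous day's belief plus the new observation. A minimal sketch follows, using the probabilities commonly attached to the Russell and Norvig umbrella example (P(R_t|R_{t-1}) = 0.7, P(Umbrella|Rain) = 0.9, P(Umbrella|no Rain) = 0.2, uniform prior); treat them as assumptions and check them against the diagram:

```python
# Forward (filtering) recursion for the umbrella HMM.
# Parameters follow the classic umbrella example; verify against your source.
P_RAIN_GIVEN_RAIN = 0.7   # P(R_t = true | R_{t-1} = true)
P_RAIN_GIVEN_DRY = 0.3    # P(R_t = true | R_{t-1} = false)
P_UMB_GIVEN_RAIN = 0.9    # P(Umbrella | Rain)
P_UMB_GIVEN_DRY = 0.2     # P(Umbrella | no Rain)

def filter_rain(observations, prior_rain=0.5):
    """Return P(rain today | umbrella observations so far)."""
    belief = prior_rain
    for saw_umbrella in observations:
        # Predict: push yesterday's belief through the transition model.
        predicted = belief * P_RAIN_GIVEN_RAIN + (1 - belief) * P_RAIN_GIVEN_DRY
        # Update: weight each state by how well it explains the observation.
        if saw_umbrella:
            rain = predicted * P_UMB_GIVEN_RAIN
            dry = (1 - predicted) * P_UMB_GIVEN_DRY
        else:
            rain = predicted * (1 - P_UMB_GIVEN_RAIN)
            dry = (1 - predicted) * (1 - P_UMB_GIVEN_DRY)
        belief = rain / (rain + dry)  # normalize
    return belief

# Three umbrella days in a row: rain today is very likely.
print(round(filter_rain([True, True, True]), 3))  # ~0.895
```

Note that each pass through the loop uses only the current belief and the new observation, which is the Markov property at work.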

A number of related tasks ask about the probability of one or more of the latent variables, given the model's parameters and a sequence of observations, which in our scenario is the sequence of umbrella observations. Some tasks related to this example are also similar to those asked in a data science interview (see questions here):

  1. If I have seen someone carrying an umbrella for the last three days, what is the probability that it is raining today? (Inference type: Filtering)
  2. If I have seen someone carrying an umbrella for the last three days, what is the probability it will rain the day after tomorrow? (Prediction type)
  3. If I have seen someone carrying an umbrella for the last three days, what is the probability that it rained yesterday? (Hindsight type: Smoothing)
  4. If I have seen someone carrying an umbrella for the last three days, what was the weather most likely like over those three days? (Sequence type: Most likely explanation)

It is worth spending time learning HMMs in detail. Above you will see the matrix-based representations of the HMM for the same umbrella problem we talked about. In Python, the hmmlearn package (HMM support was originally part of scikit-learn and was later split out into this separate project) provides a framework for working with HMMs.

Conclusion:

HMMs allow us to model processes with a hidden state based on observable outputs. The main problems solved with HMMs include determining how likely it is that a sequence of observations came from a particular model, and determining the most likely sequence of hidden states. This makes them a valuable tool in temporal pattern recognition, where HMMs find applications in speech, handwriting, and gesture recognition, musical score following, and sonar detection.

The next discussion around this topic is the Chinese Room Argument, which I have talked about here.

Sources (refer to these to learn about HMMs in more detail):

Hidden Markov Models (Sean R Eddy)

Hidden Markov Models — JHU Computer Science Paper

Subscribe to our newsletter here. We are building a new course to help people ace data science interviews. Sign up below to join the wait-list!
