The Must-Know Bayesian Inference Algorithm: Metropolis-Hastings
1. Introduction
The posterior distribution is the heart of Bayesian statistics: it reflects our updated beliefs about the parameters, obtained by combining the prior distribution with the sampling distribution. Bayesian inference relies on the posterior for making inferences and predictions, but in many situations it is challenging to compute the posterior in closed form.
Markov chain Monte Carlo (MCMC) provides a way to estimate the posterior without having to calculate it analytically. By simulating a Markov chain that converges to the posterior distribution, we can generate samples to approximate the posterior and estimate its properties.
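To make the idea above concrete, here is a minimal sketch of this kind of sampler in Python. It uses a random-walk Metropolis-Hastings chain (previewing the algorithm introduced below); the target density, an unnormalized standard normal, and all function names are illustrative assumptions, not something taken from this article:

```python
import math
import random

def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler (1-D illustration).

    log_target: log of the (possibly unnormalized) target density.
    Returns the full chain of sampled states.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)  # symmetric Gaussian proposal
        # With a symmetric proposal the Hastings correction cancels, so we
        # accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Hypothetical target: an unnormalized standard normal, exp(-x^2 / 2).
samples = metropolis_hastings(lambda x: -0.5 * x * x, n_samples=20000)
burned = samples[5000:]  # discard burn-in before estimating moments
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

Note that the sampler only ever evaluates ratios of the target density, which is exactly why the normalizing constant of the posterior is never needed.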
In this article, I will introduce a fundamental MCMC algorithm that is widely used in practice: Metropolis-Hastings. I will first explain the concept behind Metropolis-Hastings, then demonstrate how to use the algorithm to generate samples from an unnormalized distribution. Lastly, I will discuss the advantages and disadvantages of Metropolis-Hastings.
2. Overview of Metropolis-Hastings
Metropolis-Hastings is one of the most popular MCMC methods to draw samples from…