[Bayesian DL] 1. Properties of Gaussian Distribution and Prior(Posterior) Predictive Distribution
In later chapters, I’m going to write articles about the Gaussian process and Bayesian deep learning. What is written below covers the fundamental concepts that help us understand the Gaussian process.
1. Properties of the Gaussian Distribution
The probability density function (PDF) of the Gaussian distribution is given below.
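For reference, the univariate Gaussian PDF with mean $\mu$ and variance $\sigma^2$ is:

```latex
p(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
```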
The Gaussian distribution has several properties, but I’m going to deal only with the important ones related to the Gaussian process.
- Sum of Gaussians
- Scaling a Gaussian
- Correlated Gaussians
- The conditional density of a Gaussian is also Gaussian
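A quick empirical check of the first two properties, a sum of independent Gaussians and a scaled Gaussian; the concrete means, variances, and the scale factor are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X ~ N(1, 2^2) and Y ~ N(-3, 1.5^2), independent
x = rng.normal(1.0, 2.0, n)
y = rng.normal(-3.0, 1.5, n)

# Sum of Gaussians: X + Y ~ N(1 + (-3), 2^2 + 1.5^2) = N(-2, 6.25)
s = x + y

# Scaling a Gaussian: a*X ~ N(a*1, (a*2)^2) = N(3, 36) with a = 3
a = 3.0
z = a * x

sum_mean, sum_var = s.mean(), s.var()
scaled_mean, scaled_var = z.mean(), z.var()
```

The sample means and variances of `s` and `z` land on the closed-form values above (up to Monte Carlo noise), which is exactly what "closed under addition and scaling" buys us later in the Gaussian process setting.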
Apart from the above properties, we also need to keep in mind the one shown in the figure below. It shows the joint distribution of two random variables, ‘weight/kg’ and ‘height/m’, and their marginal distributions, respectively.
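A small sketch of this setup; the mean and covariance values are assumptions chosen only to give weight and height a strong positive correlation, not values from the figure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed joint Gaussian over [weight/kg, height/m]
mean = np.array([70.0, 1.75])
cov = np.array([[100.0, 0.9],   # corr = 0.9 / (10 * 0.1) = 0.9
                [0.9, 0.01]])
samples = rng.multivariate_normal(mean, cov, size=500_000)

# Marginalizing a joint Gaussian is just dropping the other coordinate:
weight = samples[:, 0]   # ~ N(70, 100)
height = samples[:, 1]   # ~ N(1.75, 0.01)
corr = np.corrcoef(weight, height)[0, 1]
```

Even though weight and height are strongly correlated in the joint, each marginal on its own is an ordinary one-dimensional Gaussian, which is what the side panels of such a figure show.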
2. Prior and Posterior predictive distribution
The word ‘predictive’ in predictive distribution refers to predicting observations. The difference between the prior (posterior) distribution and the prior (posterior) predictive distribution is that the former is a distribution over the parameters (or weights) theta, whereas the latter is a distribution over the observations (y, also called the target values).
- Prior predictive distribution
As shown above, the prior predictive distribution can be interpreted as the distribution of y averaged (weighted by the prior) over all possible values of theta.
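Written out, this averaging is the marginalization of the likelihood over the prior:

```latex
p(y) = \int p(y \mid \theta)\, p(\theta)\, d\theta
```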
- Posterior predictive distribution
The posterior predictive distribution is structured in the same way as the prior predictive distribution derived above, except that the former weights the likelihood with the posterior (our updated knowledge of theta after seeing the observations), while the latter weights it with the prior: p(y* | y) = ∫ p(y* | theta) p(theta | y) d theta.
Any corrections, suggestions, and comments are welcome.