
Statistics: Conditional Probability Models

A gentle introduction to conditional probability and related theories

Yuki Shizuya

Introduction

In many applications of probability, it is impossible to observe the experiment’s outcome directly; instead, we observe an event related to the outcome. Conditional probability models are therefore essential for incorporating the information gained from the observed event. Moreover, conditional probability models are closely related to Bayes’ theorem. In this blog, I will introduce the basics of conditional probability models and work through some quizzes from reference [1] to get used to the concepts.

Table of Contents

  1. Conditional Probability
  2. Conditional Expected Value and Conditional Variance
  3. Exercises to recap conditional probability

1. Conditional Probability

Conditional probability sounds intimidating, but it simply refers to the probability of an event A given that another event B has occurred, i.e., the proportion of B’s probability that also belongs to A. In theory, when we have a sample space S and events A, B ⊆ S with P(B) > 0, the conditional probability P(A|B) is the probability that A occurs given that B has already occurred.
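As a quick illustration (a toy example of my own, not from [1]), suppose we roll a fair six-sided die, A is the event “the roll is even,” and B is the event “the roll is at least 4.” A minimal Python sketch that checks the defining formula P(A|B) = P(A ∩ B) / P(B) by enumeration:

    from fractions import Fraction

    # One roll of a fair six-sided die with equally likely outcomes 1..6.
    outcomes = set(range(1, 7))
    A = {x for x in outcomes if x % 2 == 0}   # event A: the roll is even
    B = {x for x in outcomes if x >= 4}       # event B: the roll is at least 4

    def prob(event):
        # Uniform probability: |event| / |outcomes|
        return Fraction(len(event), len(outcomes))

    # Conditional probability P(A|B) = P(A and B) / P(B)
    p_A_given_B = prob(A & B) / prob(B)
    print(p_A_given_B)  # 2/3

Of the three equally likely outcomes in B = {4, 5, 6}, two ({4, 6}) also belong to A, so P(A|B) = 2/3, whereas the unconditional probability P(A) is only 1/2.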

The conditional probability visualization

How does the conditional probability change when we change the marginal probability? The visualization is in the following figure. As you can see, the larger the joint probability P(A ∩ B) is, the larger the share of B that A accounts for, and hence the larger P(A|B) becomes.

How does the conditional probability change when changing the marginal probability?

Conditional probability applies to both types of random variables: discrete and continuous. When we define the sample space as S and an event B ⊂ S with P(B) > 0, the conditional probability models for discrete and continuous random variables are described as:

The formulas of the conditional probability for two random variables
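Since the original figure is not reproduced here, the standard form of these models is: for a discrete random variable X, P_X|B(x) = P_X(x) / P(B) when x ∈ B and 0 otherwise; for a continuous random variable X, f_X|B(x) = f_X(x) / P(B) when x ∈ B and 0 otherwise. A minimal Python sketch of the discrete case, using a made-up PMF:

    # Conditional PMF of a discrete X given an event B:
    # P_X|B(x) = P_X(x) / P(B) for x in B, and 0 otherwise.
    pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}   # hypothetical PMF of X
    B = {3, 4}                               # condition on the event {X in B}

    p_B = sum(pmf[x] for x in B)             # P(B) = 0.7
    cond_pmf = {x: (pmf[x] / p_B if x in B else 0.0) for x in pmf}
    print(cond_pmf)                          # {1: 0.0, 2: 0.0, 3: 0.43..., 4: 0.57...}

Notice that the conditional PMF is just the original PMF restricted to B and renormalized so that it sums to one.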

From the above equations, we can derive the formulas of the marginal probability.

The formulas of the marginal probability
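Written out (my transcription of the standard result, usually called the law of total probability), for a partition B₁, …, Bₙ of S we have P(A) = Σᵢ P(A ∩ Bᵢ) = Σᵢ P(A|Bᵢ)·P(Bᵢ). A quick numeric check in Python with made-up numbers:

    # Law of total probability: P(A) = sum_i P(A|B_i) * P(B_i) over a partition of S.
    p_B = [0.5, 0.3, 0.2]          # P(B_i): made-up partition probabilities (sum to 1)
    p_A_given_B = [0.9, 0.4, 0.1]  # P(A|B_i): made-up conditional probabilities

    p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
    print(p_A)  # 0.59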

Marginal probability is the probability of a single event occurring on its own. The above formula means that we can recover the probability of an event if we know a partition of related events, each partition event’s probability, and the proportion of each partition event that the event of interest accounts for (its conditional probability). Next, let’s solve the following questions to check our understanding of conditional probability.

Quiz 1.1
Quiz 1.1 Answer
Quiz 1.2
Quiz 1.2 Answer

2. Conditional Expected Value and Conditional Variance

Conditional expected value and conditional variance are also important concepts. Computing these statistics under specific conditions can help us understand the given data more deeply. In this section, we dive into the conditional expected value and the conditional variance to understand them mathematically.

2.1 Conditional Expected Value

The conditional expected value is defined in the same way as the ordinary (unconditional) expected value, except that the unconditional probability in the definition is replaced with the conditional probability.

The theorem of conditional expected value
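For reference (transcribing the standard results, since the figure itself is not shown here): the conditional expected value is E[X|B] = Σₓ x·P_X|B(x) for a discrete X and E[X|B] = ∫ x·f_X|B(x) dx for a continuous X, and for a partition B₁, …, Bₙ of S, E[X] = Σᵢ E[X|Bᵢ]·P(Bᵢ).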

Given the conditional probability models for a partition Bᵢ, we can compute the expected value E[X] in terms of the conditional expected values E[X|Bᵢ].

The above formula is analogous to the relationship between conditional and marginal probability when the sample space is partitioned. If we sum the conditional expected values of the partitioned events, each weighted by its event’s probability, we obtain the expected value. We can derive this formula as follows:

The derivation of the relationship between conditional and marginal probability
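As a sanity check (a toy example of my own, not from [1]), the following Python snippet verifies E[X] = Σᵢ E[X|Bᵢ]·P(Bᵢ) for a fair die partitioned into even and odd outcomes:

    # Law of total expectation: E[X] = sum_i E[X|B_i] * P(B_i).
    # X is one roll of a fair six-sided die; "even" and "odd" partition S.
    pmf = {x: 1 / 6 for x in range(1, 7)}
    partition = [{2, 4, 6}, {1, 3, 5}]

    def expectation(p):
        return sum(x * px for x, px in p.items())

    total = 0.0
    for B in partition:
        p_B = sum(pmf[x] for x in B)
        cond_pmf = {x: pmf[x] / p_B for x in B}   # P_X|B(x) = P_X(x) / P(B)
        total += expectation(cond_pmf) * p_B      # E[X|B_i] * P(B_i)

    print(expectation(pmf), total)  # both 3.5

Here E[X|even] = 4 and E[X|odd] = 3, each partition event has probability 1/2, and the weighted sum recovers the unconditional expected value 3.5.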

2.2 Conditional Variance

The conditional variance is likewise similar to the variance for the unconditional probability. The conditional variance (and hence the conditional standard deviation, its square root) can be described as:

The theorem of the conditional variance
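In its usual form (again transcribing the standard result), Var[X|B] = E[(X − E[X|B])² | B], which expands to E[X²|B] − (E[X|B])².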

You can derive the conditional variance formula as:

The derivation of the conditional variance
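Here is a small Python check (toy numbers of my own) of Var[X|B] = E[X²|B] − (E[X|B])², conditioning the same fair die on B = “the roll is at least 4”:

    # Conditional variance: Var[X|B] = E[X^2|B] - (E[X|B])^2.
    pmf = {x: 1 / 6 for x in range(1, 7)}
    B = {4, 5, 6}

    p_B = sum(pmf[x] for x in B)
    cond_pmf = {x: pmf[x] / p_B for x in B}   # P_X|B(x) = P_X(x) / P(B)

    e_X = sum(x * p for x, p in cond_pmf.items())      # E[X|B] = 5
    e_X2 = sum(x**2 * p for x, p in cond_pmf.items())  # E[X^2|B] = 77/3
    print(e_X2 - e_X**2)                               # Var[X|B] = 2/3 (about 0.667)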

To get used to the concepts, let’s solve the questions below.

Quiz 2.1
Quiz 2.1 Answer
Quiz 2.2
Quiz 2.2 Answer

3. Exercises to recap conditional probability

In this last section, I will use some quizzes from the book [1] to deepen our understanding of conditional probability and the related theorems.

Quiz 3.1
Quiz 3.1 Answer 1
Quiz 3.1 Answer 2
Quiz 3.2
Quiz 3.2 Answer 1
Quiz 3.2 Answer 2
Quiz 3.3
Quiz 3.3 Answer 1
Quiz 3.3 Answer 2

In this blog, we have gone through the conditional probability theorems. This concept is essential for understanding Bayes’ theorem and applications of machine learning such as natural language processing. Thank you for reading!

Reference

[1] Roy D. Yates and David J. Goodman, Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers, 3rd edition, Wiley.
