Necessary Probability Concepts for Deep Learning: Part 1

The minimum probability concepts a DL beginner should know

Sunil Yadav
5 min readFeb 25, 2022

In this blog, we will discuss a couple of probability/statistics concepts that are necessary for Deep Learning (DL).

Probabilities (Joint and Conditional)

Probability measures the likelihood of an event of a random variable. Let x be a random variable; then the probability that x takes the value X can be written P(x=X). For example, a coin flip can be represented by a random variable x, and the probability of the event of getting a head is P(x=head) = 0.5.
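As a quick sanity check, we can estimate this probability by simulation (a minimal sketch using only the Python standard library; the variable names are illustrative, not from any particular library):

```python
import random

# Estimate P(x = head) for a fair coin by simulation.
# "head" is one of two equally likely outcomes, so the true
# probability is 0.5; the empirical frequency approaches it as
# the number of flips grows (law of large numbers).
random.seed(0)
n_flips = 100_000
heads = sum(random.choice(["head", "tail"]) == "head" for _ in range(n_flips))
p_head = heads / n_flips
print(f"P(x=head) is approximately {p_head:.3f}")
```

With 100,000 flips the estimate lands very close to 0.5, as expected for a fair coin.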

Probability computation is simple with a single random variable. However, it gets more involved with multiple random variables. Moreover, this complexity depends on the interactions (dependencies) between these random variables.

Let's consider two random variables x and y. Furthermore, we pick two events X and Y from these random variables to define the following probabilities:

  1. Conditional probability is the probability that event X occurs given that event Y has occurred, and is denoted by P(X|Y).
  2. Joint probability is the probability that both events occur together (x=X and y=Y), and it can be expressed in terms of conditional probability via the product rule:

P(x=X, y=Y) = P(x=X | y=Y) · P(y=Y)
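The two definitions above can be checked on a toy dataset by counting. The (weather, activity) pairs below are hypothetical, made up purely to illustrate the relationship between joint, conditional, and marginal probabilities:

```python
from collections import Counter

# Hypothetical observations of two random variables:
# x = weather, y = activity.
data = [
    ("rain", "indoor"), ("rain", "indoor"), ("rain", "outdoor"),
    ("sun", "outdoor"), ("sun", "outdoor"), ("sun", "indoor"),
    ("sun", "outdoor"), ("rain", "indoor"),
]
n = len(data)
joint_counts = Counter(data)                 # counts of (x, y) pairs
y_counts = Counter(y for _, y in data)       # counts of y alone

# Joint probability P(x=rain, y=indoor): fraction of all samples
p_joint = joint_counts[("rain", "indoor")] / n
# Conditional probability P(x=rain | y=indoor): fraction of
# "indoor" samples in which it also rained
p_cond = joint_counts[("rain", "indoor")] / y_counts["indoor"]
# Marginal probability P(y=indoor)
p_y = y_counts["indoor"] / n

# Product rule: P(x, y) = P(x | y) * P(y)
print(p_joint, p_cond * p_y)  # both equal 0.375
```

Here P(rain, indoor) = 3/8 = 0.375, P(rain | indoor) = 3/4, and P(indoor) = 4/8, and the product rule holds exactly: 0.75 × 0.5 = 0.375.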

From a deep learning (DL) point of view, these two concepts show up in almost every problem because, in DL, we are either…
