Bernoulli Distribution: A brief introduction

Karthik Katragadda
2 min read · Aug 7, 2018


The Bernoulli distribution is probably the simplest distribution in statistics. It is the probability distribution of a discrete random variable that can take only two values: 0 and 1. Although a Bernoulli variable has only two possible outcomes, the two outcomes need not be equally likely: each has its own probability.

Let’s take an example to understand this concept. Consider the following random variable X.

X tells us whether it will rain tomorrow or not. X = 1 denotes that it will rain and X = 0 denotes that it will not rain. If the probability that it rains tomorrow is p, then the probability that it will not rain is 1-p.

So, the probability mass function for this distribution is

P(X = 1) = p
P(X = 0) = 1-p
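The rain example can be simulated with a short sketch using only the standard library. Here p = 0.3 is an assumed value chosen purely for illustration; over many simulated trials, the fraction of 1s should come out close to p.

```python
import random

def bernoulli_trial(p):
    """Return 1 with probability p, and 0 with probability 1 - p."""
    return 1 if random.random() < p else 0

random.seed(42)
p = 0.3  # assumed probability of rain tomorrow (illustrative value)
samples = [bernoulli_trial(p) for _ in range(100_000)]

# The observed fraction of 1s should be close to p.
print(sum(samples) / len(samples))
```

Each call to `bernoulli_trial` is one independent trial; the estimate of p simply counts how often the outcome 1 occurred.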

A few examples of Bernoulli variables are:

i.) The outcome of a coin toss.

ii.) The outcome of a tennis match.

iii.) Whether a bank will give us a loan or not.

We can look around us and see several other examples of Bernoulli variables.

An important point to keep in mind is that a Bernoulli variable describes exactly one trial. The outcomes of multiple coin tosses cannot be packed into a single Bernoulli variable; each toss gets its own variable.

A Bernoulli variable can be described by one parameter, p.

The probability mass function can be written compactly as

P(X = x) = p^x (1-p)^(1-x), for x in {0, 1}.

The expected value of a Bernoulli distribution is p.

Deriving the Expected Value

E[X] = 1·p + 0·(1-p) = p

The variance of a Bernoulli distribution is p(1-p).

Deriving the variance

Var(X) = E[X²] - (E[X])² = (1²·p + 0²·(1-p)) - p² = p - p² = p(1-p)
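Both the expected value and the variance can be checked numerically from the probability mass function. This is a minimal sketch, with p = 0.3 as an arbitrary example value of the parameter:

```python
def pmf(x, p):
    """Bernoulli probability mass function for x in {0, 1}."""
    return p if x == 1 else 1 - p

p = 0.3  # example value of the parameter

# E[X] = sum over x of x * P(X = x)
mean = sum(x * pmf(x, p) for x in (0, 1))

# Var(X) = sum over x of (x - E[X])^2 * P(X = x)
variance = sum((x - mean) ** 2 * pmf(x, p) for x in (0, 1))

print(mean)      # equals p
print(variance)  # equals p * (1 - p)
```

Running this reproduces the two results above: the mean comes out to p and the variance to p(1-p).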

Finally, the important equations related to the Bernoulli distribution are as follows.

P(X = x) = p^x (1-p)^(1-x), x in {0, 1}
E[X] = p
Var(X) = p(1-p)
