HAPPY FACES!!

How I sent my friend to sleep

Or, how I explained the expected value of a Geometric Random Variable to a friend.

Thai Pangsakulyanont
6 min read · Oct 10, 2013

--

My friend (I’ll call her Aime from now on) and I are taking a probability course. Last night she couldn’t sleep, so she chatted with me on Facebook:

Aime: I can’t sleep! My sister brought a friend over to my house and I had to sleep alone.

Thai: So what are you doing?

Aime: Attempting to sleep. I’m already sleepy, but it just won’t let me sleep. It’s going to be terrible…

Thai: Well, if you can’t sleep, maybe you’d like to do a probability exercise!

Suppose that you, Aime, are trying hard to sleep, so you decided that you’d count sheep to fall asleep, and the probability that you’d fall asleep whenever you count a single sheep is 0.5%. Now, find:

(a) The probability that you’d fall asleep when you count the Nth sheep.
(b) The expected value of N, the number of sheep that you need to count in order to fall asleep.
(c) Supposing it also takes you 2.5 seconds to count one sheep, how long should it take you to fall asleep?

Aime: For (a), it is 0.005 × 0.995^(N-1).

Thai: Ok, now do (b) and (c).

Aime: For question (b), is it the summation of n × 0.005 × 0.995^(n-1) from 1 to N?

Thai: You can’t sum from 1 to N!

Aime: But didn’t I fall asleep when I counted the Nth sheep?

Thai: Right, but you don’t know N just yet. In fact, the expected value of N itself is what I wanted you to find, so you can’t include N here. Think of it like this. If you counted so many sheep, but you’re still awake, and you’re determined to count until…

Aime: From 1 to Infinity?

Thai: That’s right! So what’s the expected value of N?

Aime: (1 × 0.005) + (2 × 0.005 × 0.995) + (3 × 0.005 × 0.995²) + … =_=

Thai: Yeah, but what’s the answer? Do you remember which kind of random variable it is?

Aime: A Geometric Random Variable.

Thai: And did you remember what its expected value is?

Aime: Noooo! TOT

Thai: Ok, so, first, think about this: Suppose that you counted sheep until you fall asleep. When you fall asleep, how many times did you fall asleep?

Aime: 1 time.

Thai: That’s right! Now suppose that you wanted to count N sheep, and the probability that you’ll fall asleep whenever you count a sheep is p (let’s just use a variable for now), how many times, then, do you expect to fall asleep?

Aime: Shouldn’t it be just 1 time? Because when I fall asleep, I can’t continue counting.

Thai: Well, let’s say that you’re so determined to count up to the Nth sheep that even after you fall asleep, when you wake up, you just continue counting until the Nth sheep. So, with that in mind, how many times do you expect to fall asleep?

Aime: ……

Thai: If you still don’t get it, take another example. When you roll a die, the probability that the uppermost face shows 1 is (1/6). Suppose that you rolled the die 600 times; how many times do you expect 1 to be the value of a throw?

Aime: (1/6) × 600 = 100?

Thai: Yes! So can you answer the previous question now?

Aime: (1/p) × N?

Thai: No! Remember that 0 ≤ p ≤ 1. So if 0 < p < 1, then (1/p) will be greater than 1, and thus (1/p)N will be greater than N! If you counted just N sheep, how can you fall asleep more than N times?

Aime: So, p is the probability of falling asleep, and N is the number of times that I fell asleep…

Thai: No! N is the number of sheep that you counted.

Aime: I pity you for having to put up with me… TT_TT. Is it p × N?

Thai: Yes, that’s correct! Another example: suppose that the chance of winning a lottery is 0.01. If you bought 5,000 lottery tickets, you’d expect about 50 of them to win, right?

Thai: So, to summarize, if N is the number of attempts (counting a sheep), and p is the probability that an event we’re interested in occurs (falling asleep when you count), then Np is the expected number of occurrences of the event we’re interested in, given that we made N attempts.
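Thai’s summary is easy to check with a quick simulation. This sketch is my own addition (not part of the original chat), using the sheep numbers from the problem and the “stubborn counting” rule from above:

```python
import random

random.seed(42)

# Stubborn sheep counting: even after falling asleep, you wake up
# and keep counting until all N sheep have been counted.
N = 10_000   # sheep counted (attempts)
p = 0.005    # chance of falling asleep on each count

falls_asleep = sum(1 for _ in range(N) if random.random() < p)
print(falls_asleep)  # should be close to N * p = 50
```

Each count is an independent attempt, so the number of times you nod off comes out near Np.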

Aime: Ah! Got it!

Thai: So, let’s get back to the Geometric Random Variable. You remember that a Geometric Random Variable represents the number of trials until the event we’re interested in occurs, and the event only needs to occur once, right?

Aime: Yeah.

Thai: So, you’d expect that Np = 1. What would you expect the value of N, the number of sheep counted, to be?

Aime: N = (1/p). Right?

Thai: Right! And I believe that’s why the expected value of a Geometric Random Variable, E[N], is (1/p). Now, can you answer (b)?

Aime: (1/0.005) = 200.

Thai: Correct! Now answer (c).

Aime: Ok. I think I remember the Poisson Random Variable. Is it αˣ times…

Thai: It has nothing to do with the Poisson Random Variable!

Aime: But it has something to do with time!

Thai: Yeah, but this is just a simple linear transformation.

Aime: Oh, so that’s 2.5 × 200?

Thai: I asked for the number of minutes.

Aime: 2.5 × 200 / 60 = 8.3333333333333333333333333333333333…

Thai: Yeah!!

Aime: Yeah!!!! Give me more problems! I don’t think I can sleep just yet…

Thai: GO TO BED AND COUNT SHEEP!

Aime: TOT

Afterthoughts

In class, the instructor (an excellent teacher, by the way) told us that we didn’t have to remember the equations if we understood them.

He explained why the expected value of a Geometric Random Variable is equal to (1/p) by proving it: he wrote out the summation E[N] = Σ n × p × (1 − p)^(n−1), then worked through another large block of equations to show why that summation results in (1/p).
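For reference, that derivation goes roughly like this (my reconstruction of the usual textbook proof, not a copy of his blackboard work):

```latex
\begin{aligned}
E[N] &= \sum_{n=1}^{\infty} n \, p \, (1-p)^{n-1}
      = p \sum_{n=1}^{\infty} n \, (1-p)^{n-1} \\
     &= p \cdot \frac{1}{\bigl(1 - (1-p)\bigr)^{2}}
      = p \cdot \frac{1}{p^{2}}
      = \frac{1}{p},
\end{aligned}
```

using the identity Σ n xⁿ⁻¹ = 1/(1 − x)² for |x| < 1, which comes from differentiating the geometric series term by term.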

That way, you don’t have to remember the equation: if you forget it, you can always derive it again. However, that takes a lot of time, and it’s quite error-prone as well. I remember proving in the exam room that dV = ρ² sin ϕ dρ dϕ dθ, and my exam time quickly ran out.

When I gave Aime this problem, I pondered about it a little bit further, and an intuition came to my mind:

For a Geometric Random Variable, we expect 1 occurrence from N trials. Each trial has the same probability p, so we expect that Np = 1, and it should follow that we can expect that N = (1/p).
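That intuition is easy to sanity-check numerically. Here is a small simulation sketch (mine, not from the original exchange), using the sheep problem’s p = 0.005:

```python
import random

random.seed(1)

def sheep_until_asleep(p=0.005):
    """Count sheep until falling asleep; return how many were counted."""
    n = 0
    while True:
        n += 1
        if random.random() < p:
            return n

# Average over many sleepless nights; should be close to 1/p = 200.
nights = [sheep_until_asleep() for _ in range(20_000)]
average = sum(nights) / len(nights)
print(average)
```

Every run of `sheep_until_asleep` is one draw of the geometric random variable N, and the sample mean lands near 1/p.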

This comes just from an instinctive feeling; no proof is given (someone else already proved it anyway), and it might not even be the real reason why E[N] = (1/p). However, I’d say that understanding it intuitively makes the equation much easier to remember!

I then applied the same intuition to the Binomial Random Variable and the Pascal Random Variable, so now I can remember the expected values of all of them.
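For the Pascal case (trials until the r-th success), the same argument gives Np = r, so E[N] = r/p. Here’s a quick simulation sketch to sanity-check that; the numbers r = 3 and p = 0.25 are made up for illustration, not from the course:

```python
import random

random.seed(7)

def trials_until_rth_success(r, p):
    """Count Bernoulli(p) trials until the r-th success occurs."""
    n = 0
    successes = 0
    while successes < r:
        n += 1
        if random.random() < p:
            successes += 1
    return n

r, p = 3, 0.25
samples = [trials_until_rth_success(r, p) for _ in range(50_000)]
average = sum(samples) / len(samples)
print(average)  # should be close to r/p = 12
```

With r = 1 this reduces to the geometric case, and the average falls back to 1/p.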

P.S. In the end, she slept at 5 A.M. Sheep and probability did not help.

