
Econ 325

Notes on Bernoulli Random Variable and Binomial Distribution [1]

By Hiro Kasahara

Bernoulli Random Variable


Consider a random variable $X$ that takes a value of zero or one with probability $1 - p$ and $p$, respectively. That is,

$$X = \begin{cases} 0 & \text{with prob. } 1-p \\ 1 & \text{with prob. } p \end{cases} \qquad (1)$$

The probability mass function is written as $f(x) = p^x (1-p)^{1-x}$ and we say that $X$ has a Bernoulli distribution.

The expected value of $X$ is

$$E(X) = \sum_{x=0,1} x\, p^x (1-p)^{1-x} = (0)(1-p) + (1)(p) = p$$

and the variance of $X$ is

$$\begin{aligned}
\mathrm{Var}(X) &= \sum_{x=0,1} (x-p)^2\, p^x (1-p)^{1-x} \\
&= (0-p)^2 (1-p) + (1-p)^2 p \\
&= p^2 (1-p) + (1-p)^2 p = (p + (1-p)) \times p(1-p) = p(1-p).
\end{aligned}$$

The standard deviation of $X$ is $\sqrt{\mathrm{Var}(X)} = \sqrt{p(1-p)}$.
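The two-point formulas above are easy to check numerically; the sketch below (plain Python, helper names are my own) sums over the support $\{0, 1\}$ directly:

```python
# Verify E(X) = p and Var(X) = p(1 - p) by summing over the support {0, 1}.

def bernoulli_pmf(x, p):
    """f(x) = p^x (1 - p)^(1 - x) for x in {0, 1}."""
    return p**x * (1 - p)**(1 - x)

def bernoulli_moments(p):
    """Mean and variance of a Bernoulli(p) variable by direct summation."""
    mean = sum(x * bernoulli_pmf(x, p) for x in (0, 1))
    var = sum((x - mean) ** 2 * bernoulli_pmf(x, p) for x in (0, 1))
    return mean, var

mean, var = bernoulli_moments(0.3)
print(mean, var)  # ≈ 0.3 and ≈ 0.21 = 0.3 * 0.7
```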

Binomial Distribution
Let $X_1, X_2, \ldots, X_n$ be a sequence of $n$ independent Bernoulli random variables, each of which has the probability of success equal to $p$, given by (1). Define a random variable $Y$ by

$$Y = \sum_{i=1}^{n} X_i,$$

i.e., $Y$ is the number of successes in $n$ Bernoulli trials. The number of ways of selecting $y$ positions for the $y$ successes in the $n$ trials is

$$\binom{n}{y} = \frac{n!}{y!\,(n-y)!}.$$

Then, the probability mass function of $Y$ is given by

$$f(y) = \binom{n}{y} p^y (1-p)^{n-y}. \qquad (2)$$
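The pmf in (2) translates directly into code; a sketch in plain Python, using `math.comb` for the binomial coefficient (the function name is my own):

```python
from math import comb

def binomial_pmf(y, n, p):
    """Pr(Y = y): exactly y successes in n independent trials, eq. (2)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# Sanity check: the probabilities over y = 0, ..., n sum to one.
total = sum(binomial_pmf(y, 10, 0.3) for y in range(11))
print(total)  # ≈ 1.0
```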
[1] © Hiroyuki Kasahara. Not to be copied, used, revised, or distributed without explicit permission of copyright owner.

The expected value of $Y = \sum_{i=1}^{n} X_i$ is

$$E(Y) = E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i) = np,$$

where the last equality follows because $E(X_i) = p$, which is a constant, and $\sum_{i=1}^{n} c = nc$ for any constant $c$.
Let $Z_i = X_i - p$. Then, the variance of $Y = \sum_{i=1}^{n} X_i$ is

$$\begin{aligned}
\mathrm{Var}(Y) &= E\Big[\Big\{\Big(\sum_{i=1}^{n} X_i\Big) - np\Big\}^2\Big] \\
&= E\Big[\Big\{\sum_{i=1}^{n} (X_i - p)\Big\}^2\Big] \\
&= E\Big[\Big\{\sum_{i=1}^{n} Z_i\Big\}^2\Big] \qquad \text{(Define } Z_i = X_i - p \text{ for } i = 1, \ldots, n\text{)} \\
&= E[Z_1^2 + Z_2^2 + \cdots + Z_n^2 + 2Z_1Z_2 + 2Z_1Z_3 + \cdots + 2Z_1Z_n + \cdots + 2Z_{n-1}Z_n] \\
&= \sum_{i=1}^{n} E[Z_i^2] + 2\sum_{i=1}^{n}\sum_{j=i+1}^{n} E[Z_iZ_j] \\
&= \sum_{i=1}^{n} p(1-p) + 2 \times 0 \qquad \text{(Because } E[Z_i^2] = \mathrm{Var}(X_i) = p(1-p) \text{ and } E[Z_iZ_j] = 0 \text{ if } i \neq j\text{)} \\
&= np(1-p),
\end{aligned}$$

where $E[Z_iZ_j] = 0$ if $i \neq j$ because $X_i$ and $X_j$ are independent if $i \neq j$.
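For small $n$, the identity $\mathrm{Var}(Y) = np(1-p)$ can be confirmed by brute force, enumerating every sequence of trial outcomes and weighting it by its probability (a sketch; the function name is my own):

```python
from itertools import product

def moments_by_enumeration(n, p):
    """E[Y] and Var(Y) computed exactly over all 2^n outcome sequences."""
    ey = ey2 = 0.0
    for outcome in product((0, 1), repeat=n):
        # Probability of this particular sequence of successes/failures.
        prob = 1.0
        for x in outcome:
            prob *= p if x == 1 else 1 - p
        y = sum(outcome)  # number of successes in this sequence
        ey += prob * y
        ey2 += prob * y * y
    return ey, ey2 - ey**2

mean, var = moments_by_enumeration(5, 0.3)
print(mean, var)  # ≈ 1.5 = np and ≈ 1.05 = np(1 - p)
```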

The Sample Mean of Bernoulli Random Variables


The binomial distribution is closely related to the distribution of the sample mean of Bernoulli random variables. Define

$$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i,$$

where $X_1, X_2, \ldots, X_n$ is a sequence of $n$ independent Bernoulli random variables. Then, the possible values $\bar{X}$ can take are $\{0, 1/n, 2/n, \ldots, (n-1)/n, 1\}$. Further, $\bar{X} = (1/n)Y$ so that $Y = n\bar{X}$. Therefore, by letting $y = n\bar{x}$ in (2), the probability mass function of $\bar{X}$ is given by

$$\Pr(\bar{X} = \bar{x}) = \binom{n}{n\bar{x}} p^{n\bar{x}} (1-p)^{n - n\bar{x}}.$$
This is the exact probability mass function of $\bar{X}$ when the $X_i$'s are independent Bernoulli random variables. Later, we will discuss how the distribution of $\bar{X}$ can be approximated by the normal distribution when $n$ is large. We may also compute the expected value and the variance of $\bar{X}$ from those of $Y = \sum_{i=1}^{n} X_i$. In fact,

$$E[\bar{X}] = E[(1/n)Y] = (1/n)E[Y] = (1/n)np = p,$$

and

$$\mathrm{Var}(\bar{X}) = \mathrm{Var}((1/n)Y) = (1/n)^2\,\mathrm{Var}(Y) = (1/n)^2\, np(1-p) = \frac{p(1-p)}{n}.$$
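The pmf of $\bar{X}$ and its moments can be computed by reusing the binomial pmf with $y = n\bar{x}$ (a sketch; helper names are my own):

```python
from math import comb

def xbar_pmf(xbar, n, p):
    """Pr(Xbar = xbar), where xbar must be a multiple of 1/n."""
    y = round(n * xbar)  # y = n * xbar is the number of successes
    return comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 8, 0.4
support = [k / n for k in range(n + 1)]  # {0, 1/n, ..., 1}
mean = sum(x * xbar_pmf(x, n, p) for x in support)
var = sum((x - mean) ** 2 * xbar_pmf(x, n, p) for x in support)
print(mean, var)  # ≈ p = 0.4 and ≈ p(1 - p)/n = 0.03
```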

Examples
1. Flip a coin four times and let Y be the number of heads. What is the probability that
the number of heads Y is equal to 2?
Answer: We have $n = 4$ and $p = 0.5$. The probability that $Y = 2$ is equal to

$$\binom{4}{2}(0.5)^2 (1 - 0.5)^{4-2} = 6(0.5)^4 = \frac{3}{8}.$$
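The same number drops out of a one-line computation with the pmf from (2) (plain Python):

```python
from math import comb

# Pr(Y = 2) with n = 4 trials and p = 0.5
p_two_heads = comb(4, 2) * 0.5**2 * 0.5**2
print(p_two_heads)  # 0.375 = 3/8
```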

2. Exercise 4.40: The Minnesota Twins are to play a series of 5 games against the Red Sox. For any one game it is estimated that the probability of a Twins' win is 0.5. The outcomes of the 5 games are independent of one another. (a) What is the probability that the Twins will win all 5 games? (b) What is the probability that the Twins will win a majority of the 5 games? (c) If the Twins win the first game, what is the probability that they will win a majority of the five games?
Answer: We have $n = 5$ and $p = 0.5$. For (a), the probability that $Y = 5$ is equal to

$$\Pr(Y = 5) = \binom{5}{5}(0.5)^5 (1 - 0.5)^{5-5} = 1 \times (0.5)^5 \times 1 = \frac{1}{32}.$$

For (b), the probability that $Y = 3$ and that of $Y = 4$ are

$$\Pr(Y = 3) = \binom{5}{3}(0.5)^3 (1 - 0.5)^{5-3} = 10 \times (0.5)^5 = \frac{10}{32},$$

$$\Pr(Y = 4) = \binom{5}{4}(0.5)^4 (1 - 0.5)^{5-4} = 5 \times (0.5)^5 = \frac{5}{32}.$$

Therefore, the probability that the Twins will win a majority of the 5 games, namely $Y \ge 3$, is

$$\Pr(Y \ge 3) = \Pr(Y = 3) + \Pr(Y = 4) + \Pr(Y = 5) = \frac{10}{32} + \frac{5}{32} + \frac{1}{32} = \frac{1}{2}.$$
For (c), when the Twins win the first game, the Twins need to win at least 2 of the remaining four games to win a majority. Because the outcomes of the 5 games are independent of one another, we may define a new binomial random variable $W = \sum_{i=1}^{n} X_i$ with $n = 4$ and $p = 0.5$, which represents the outcomes of the 4 games after the first game. Then

$$\Pr(W \ge 2) = \Pr(W = 2) + \Pr(W = 3) + \Pr(W = 4) = \frac{6}{16} + \frac{4}{16} + \frac{1}{16} = \frac{11}{16},$$

where, for example, $\Pr(W = 2) = \binom{4}{2}(0.5)^2 (1 - 0.5)^{4-2} = 6(0.5)^4 = 6/16$. Therefore, the probability for the Twins to win a majority after winning the first game is 11/16.
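All three answers can be reproduced with a small pmf helper; using `fractions.Fraction` keeps the arithmetic exact (a sketch; the helper name is my own):

```python
from fractions import Fraction
from math import comb

def pmf(y, n, p):
    """Binomial pmf, exact when p is a Fraction."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

half = Fraction(1, 2)
print(pmf(5, 5, half))                          # (a) 1/32
print(sum(pmf(y, 5, half) for y in (3, 4, 5)))  # (b) 1/2
print(sum(pmf(w, 4, half) for w in (2, 3, 4)))  # (c) 11/16
```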

3. Let $X_1$ and $X_2$ be two Bernoulli random variables with the probability of success $p$, where $X_1$ and $X_2$ are independent, and $X_i = 0$ with probability $1 - p$ and $X_i = 1$ with probability $p$ for $i = 1, 2$. Define a random variable $Y = X_1 + X_2$. Then, $Y$ follows the binomial distribution with $n = 2$ trials.

(a) Find the mean and the variance of $Y$.
Answer: Note that $E[X_1] = E[X_2] = p$ and $\mathrm{Var}(X_1) = \mathrm{Var}(X_2) = p(1-p)$, because $E[X_i] = 0 \times (1-p) + 1 \times p = p$ and $\mathrm{Var}(X_i) = E[X_i^2] - \{E[X_i]\}^2 = p - p^2 = p(1-p)$, where $E[X_i^2] = p$ follows from $E[X_i^2] = E[X_i] = p$ since $X_i^2 = X_i$. Furthermore, because $X_1$ and $X_2$ are independent, $\mathrm{Cov}(X_1, X_2) = 0$. Therefore, $E[Y] = E[X_1 + X_2] = E[X_1] + E[X_2] = p + p = 2p$ and $\mathrm{Var}(Y) = \mathrm{Var}(X_1 + X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + 2\,\mathrm{Cov}(X_1, X_2) = p(1-p) + p(1-p) + 0 = 2p(1-p)$.
(b) What is $E[Y \mid X_1 = 1]$?
Answer: $E[Y \mid X_1 = 1] = E[X_1 + X_2 \mid X_1 = 1] = E[X_1 \mid X_1 = 1] + E[X_2 \mid X_1 = 1] = 1 + E[X_2] = 1 + p$, where $E[X_1 \mid X_1 = 1] = 1$ holds because $X_1 = 1$ with probability one when we condition on the event that $X_1 = 1$, and $E[X_2 \mid X_1 = 1] = E[X_2]$ because $X_1$ and $X_2$ are independent.
(c) What is $E[X_1 \mid Y = 1]$?
Answer: First,

$$\Pr(X_1 = 1 \mid Y = 1) = \Pr(X_1 = 1 \mid X_1 + X_2 = 1) = \frac{\Pr(X_1 = 1, X_1 + X_2 = 1)}{\Pr(X_1 + X_2 = 1)}.$$

Note that there are four possible outcomes for $(X_1, X_2)$: $(0, 0)$, $(1, 0)$, $(0, 1)$, and $(1, 1)$. Now, $\Pr(X_1 = 1, X_1 + X_2 = 1) = \Pr(X_1 = 1, X_2 = 0) = p(1-p)$ and $\Pr(X_1 + X_2 = 1) = \Pr(X_1 = 0, X_2 = 1) + \Pr(X_1 = 1, X_2 = 0) = 2p(1-p)$. Therefore,

$$\Pr(X_1 = 1 \mid Y = 1) = \frac{p(1-p)}{2p(1-p)} = \frac{1}{2}.$$

Similarly, we may prove that $\Pr(X_1 = 0 \mid Y = 1) = 1/2$. Finally, $E[X_1 \mid Y = 1] = E[X_1 \mid X_1 + X_2 = 1] = \sum_{x_1 \in \{0,1\}} x_1 \Pr(X_1 = x_1 \mid X_1 + X_2 = 1) = 0 \times (1/2) + 1 \times (1/2) = 1/2$.
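Part (c) can be double-checked by enumerating the four outcomes of $(X_1, X_2)$ and conditioning on the event $X_1 + X_2 = 1$ (a sketch; the names are my own, and any $p$ in $(0, 1)$ gives the same answer):

```python
from fractions import Fraction
from itertools import product

p = Fraction(3, 10)  # an arbitrary success probability; the result does not depend on it

def joint_prob(x1, x2):
    """Pr(X1 = x1, X2 = x2) under independence."""
    return (p if x1 else 1 - p) * (p if x2 else 1 - p)

# Outcomes consistent with the conditioning event Y = X1 + X2 = 1.
event = [(x1, x2) for x1, x2 in product((0, 1), repeat=2) if x1 + x2 == 1]
pr_event = sum(joint_prob(x1, x2) for x1, x2 in event)  # 2p(1 - p)
e_x1_given = sum(x1 * joint_prob(x1, x2) for x1, x2 in event) / pr_event
print(e_x1_given)  # 1/2
```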
