Probability-1
Discrete Random Variables
Prof Prashant Gupta
Outline
• Random Variables
• Discrete Random Variables
• Expected Value
• Expectation of a Function
• Variance
• Bernoulli and Binomial Random Variables
• Poisson Random Variables
• Geometric Random Variables
Random Variables
Frequently, we are mainly interested in some function of the outcome rather than
the outcome itself. These real-valued functions defined on the sample space are
known as random variables (RV’s).
Example: Toss three fair coins. Let Y denote the number of heads that appear.
Then Y is a RV taking one of the values 0, 1, 2, 3 with respective probabilities:
$$P(Y = 0) = P(\{TTT\}) = \frac{1}{8}$$
$$P(Y = 1) = P(\{TTH, THT, HTT\}) = \frac{3}{8}$$
$$P(Y = 2) = P(\{HHT, HTH, THH\}) = \frac{3}{8}$$
$$P(Y = 3) = P(\{HHH\}) = \frac{1}{8}$$
and
$$\sum_{i=0}^{3} P(Y = i) = 1.$$
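A quick way to sanity-check these probabilities is to simulate the experiment. Below is a minimal sketch in plain Python (the trial count of 100,000 is an arbitrary choice) that estimates the PMF of Y empirically:

```python
import random
from collections import Counter

def toss_three_coins():
    """Toss three fair coins; return the number of heads."""
    return sum(random.random() < 0.5 for _ in range(3))

trials = 100_000
counts = Counter(toss_three_coins() for _ in range(trials))

for y in range(4):
    print(f"P(Y = {y}) ~ {counts[y] / trials:.3f}")  # expect 1/8, 3/8, 3/8, 1/8
```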
Random Variables
Example: A coin that comes up heads with probability p is flipped repeatedly, in independent trials, until a head occurs. Let X denote the number of times the coin is flipped. Then X is a RV taking values in {1, 2, 3, 4, . . .}.
We have
$$P(X = 1) = P(\{H\}) = p$$
$$P(X = 2) = P(\{TH\}) = (1-p)p$$
$$P(X = 3) = P(\{TTH\}) = (1-p)^2 p$$
$$\vdots$$
$$P(X = n) = P(\{TT\cdots TH\}) = (1-p)^{n-1} p$$
and
$$\sum_{n=1}^{\infty} P(X = n) = \sum_{n=1}^{\infty} (1-p)^{n-1} p = \frac{p}{1-(1-p)} = 1.$$
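As an illustration (a minimal sketch; p = 0.3 is an arbitrary choice), the following simulation estimates P(X = n) for small n and compares it to the formula $(1-p)^{n-1} p$:

```python
import random
from collections import Counter

def flips_until_head(p):
    """Flip a p-coin until the first head; return the number of flips."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

p, trials = 0.3, 100_000
counts = Counter(flips_until_head(p) for _ in range(trials))

for n in range(1, 6):
    exact = (1 - p) ** (n - 1) * p
    print(f"P(X = {n}): simulated {counts[n] / trials:.4f}, exact {exact:.4f}")
```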
Discrete Random Variables
A RV that can take on at most a countable number of possible values is said to
be discrete.
For a discrete RV X, the probability mass function (PMF) of X, denoted by $p_X(a)$ or simply p(a), is defined as
$$p(a) = P(X = a).$$
Example: If X must assume one of the values $x_1, x_2, x_3, \ldots$, then we have $p(x_i) \geq 0$ for $i = 1, 2, \ldots$ and $p(x) = 0$ for all the other values of x, and
$$\sum_{i=1}^{\infty} p(x_i) = 1.$$
Example
Example: Consider a RV Y with
$$Y = \begin{cases} 0 & \text{with probability } 1/8 \\ 1 & \text{with probability } 3/8 \\ 2 & \text{with probability } 3/8 \\ 4 & \text{with probability } 1/8 \end{cases}$$
The PMF of Y can be displayed in the following diagram:
[Figure: bar plot of the PMF p(y), with bars of height 1/8 at y = 0 and y = 4, and height 3/8 at y = 1 and y = 2.]
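In code, a PMF of a discrete RV can be represented as a simple mapping from values to probabilities. A minimal sketch encoding the PMF of Y above and checking the two defining properties:

```python
from fractions import Fraction

# PMF of Y as a mapping from values to probabilities
pmf_Y = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 4: Fraction(1, 8)}

# Defining properties of a PMF: nonnegative masses that sum to 1
assert all(prob >= 0 for prob in pmf_Y.values())
assert sum(pmf_Y.values()) == 1
```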
Example
Problem: The PMF of a RV X is given by
$$p(x) = \frac{c\,\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$
where $\lambda$ is some positive number, and p(x) = 0 for other values. 1) Find the value of c; 2) Find P(X = 0) and P(X > 2).
Solution: 1) Since p(x) is a PMF, we have
$$1 = \sum_{x=0}^{\infty} p(x) = \sum_{x=0}^{\infty} \frac{c\,\lambda^x}{x!} = c \sum_{x=0}^{\infty} \frac{\lambda^x}{x!} = c e^{\lambda}$$
and therefore $c = e^{-\lambda}$.
2)
$$P(X = 0) = p(0) = \frac{e^{-\lambda} \lambda^0}{0!} = e^{-\lambda}$$
and
$$P(X > 2) = \sum_{x=3}^{\infty} p(x) = 1 - \sum_{x=0}^{2} p(x) = 1 - e^{-\lambda} - \lambda e^{-\lambda} - \frac{\lambda^2 e^{-\lambda}}{2}.$$
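As a numeric check (a minimal sketch; λ = 2 is an arbitrary choice), the following compares these closed forms against a truncated sum of the PMF:

```python
import math

lam = 2.0  # arbitrary choice of lambda

def pmf(x, lam):
    """Poisson PMF with c = e^{-lambda}."""
    return math.exp(-lam) * lam**x / math.factorial(x)

# P(X = 0) = e^{-lambda}
print(pmf(0, lam), math.exp(-lam))

# P(X > 2) = 1 - p(0) - p(1) - p(2)
closed_form = 1 - math.exp(-lam) * (1 + lam + lam**2 / 2)
truncated = sum(pmf(x, lam) for x in range(3, 100))
print(closed_form, truncated)  # should agree closely
```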
Cumulative Distribution Function
The cumulative distribution function (CDF) of a discrete RV X, denoted by $F_X(a)$ or simply F(a), can be expressed in terms of the PMF p(x) by
$$F(a) = \sum_{x \leq a} p(x).$$
Later we will generalize the CDF to continuous RV's. Indeed, note that a PMF exists only for discrete RV's, but a CDF can be defined for any RV.
If Y is a discrete RV, then its CDF is a step function. For example, if Y has
PMF given by
$$p(0) = \frac{1}{8}, \quad p(1) = \frac{3}{8}, \quad p(2) = \frac{3}{8}, \quad p(4) = \frac{1}{8}$$
then its CDF is given by
$$F(a) = \begin{cases} 0 & a < 0 \\ \frac{1}{8} & 0 \leq a < 1 \\ \frac{1}{2} & 1 \leq a < 2 \\ \frac{7}{8} & 2 \leq a < 4 \\ 1 & a \geq 4 \end{cases}$$
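A discrete CDF is easy to compute directly from the PMF. A minimal sketch building F(a) for the Y above and evaluating it at a few points:

```python
from fractions import Fraction

pmf_Y = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 4: Fraction(1, 8)}

def cdf(a, pmf):
    """F(a) = sum of p(x) over all x <= a."""
    return sum(prob for x, prob in pmf.items() if x <= a)

for a in [-1, 0, 0.5, 1, 2, 3, 4, 10]:
    print(f"F({a}) = {cdf(a, pmf_Y)}")  # 0, 1/8, 1/8, 1/2, 7/8, 7/8, 1, 1
```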
Cumulative Distribution Function
[Figure: the PMF p(y) (left) and its step-function CDF F(y) (right); F jumps by p(y) at each of y = 0, 1, 2, 4.]
F(y) is a right-continuous but not left-continuous function.
Expected Value
If X is a discrete RV having a PMF p(x), then the expectation or expected value of X is
$$E[X] = \sum_{x:\, p(x) > 0} x \cdot p(x)$$
For example, if the PMF of X is $p(0) = p(1) = \frac{1}{2}$, then
$$E[X] = 0 \cdot \frac{1}{2} + 1 \cdot \frac{1}{2} = \frac{1}{2}.$$
If the PMF of X is $p(0) = \frac{1}{3}$, $p(1) = \frac{2}{3}$, then
$$E[X] = 0 \cdot \frac{1}{3} + 1 \cdot \frac{2}{3} = \frac{2}{3}.$$
The concept of expectation is analogous to the physical concept of the center of mass: if mass p(x) is placed at each point x of the support, then E[X] is the point at which the system balances.
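In code, this expectation is just a weighted sum over the support. A minimal sketch, checked against the two examples above:

```python
from fractions import Fraction

def expectation(pmf):
    """E[X] = sum of x * p(x) over the support of the PMF."""
    return sum(x * prob for x, prob in pmf.items())

print(expectation({0: Fraction(1, 2), 1: Fraction(1, 2)}))  # 1/2
print(expectation({0: Fraction(1, 3), 1: Fraction(2, 3)}))  # 2/3
```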
Example
Suppose X is the outcome when we roll a fair die. Then the expectation of X is
$$E[X] = \sum_{i=1}^{6} i \cdot p(i) = \frac{1}{6} \sum_{i=1}^{6} i = \frac{7}{2}$$
Let an indicator variable for the event A be defined as
$$I_A = \begin{cases} 1 & \text{if } A \text{ occurs} \\ 0 & \text{if } A^c \text{ occurs} \end{cases}$$
Then the expectation of $I_A$ is
$$E[I_A] = 1 \cdot p(1) + 0 \cdot p(0) = P(A)$$
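A Monte Carlo estimate illustrates both results (a minimal sketch; the event A = {outcome is even} and the trial count are arbitrary choices):

```python
import random

trials = 100_000
rolls = [random.randint(1, 6) for _ in range(trials)]

# Sample mean of the die approaches E[X] = 7/2 = 3.5
print(sum(rolls) / trials)

# Sample mean of the indicator I_A approaches P(A) = 1/2 for A = {even outcome}
print(sum(1 for r in rolls if r % 2 == 0) / trials)
```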
Expectation of a Function
Given a RV X with PMF p(x), we can also calculate the expectation of some
function g of X. In particular, we have
$$E[g(X)] = \sum_{x:\, p(x) > 0} g(x) \cdot p(x)$$
Corollary: If a, b are constants and X is a RV, then E[aX + b] = aE[X] + b.
Example: Suppose X has PMF given by $p(0) = p(1) = \frac{1}{2}$, and $g(x) = x^2$. Then
$$E[g(X)] = g(0) \cdot p(0) + g(1) \cdot p(1) = \frac{1}{2}$$
In general, the expectation
$$E[X^n] = \sum_{x:\, p(x) > 0} x^n \cdot p(x)$$
is called the n-th moment of X. In particular, the expectation of X is also called the first moment, or the mean of X.
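Extending the earlier expectation helper, E[g(X)] simply applies g inside the weighted sum. The sketch below also spot-checks the corollary E[aX + b] = aE[X] + b (the constants a, b are arbitrary choices):

```python
from fractions import Fraction

pmf = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def expect(g, pmf):
    """E[g(X)] = sum of g(x) * p(x) over the support."""
    return sum(g(x) * prob for x, prob in pmf.items())

print(expect(lambda x: x**2, pmf))       # E[X^2] = 1/2
a, b = 3, 5
print(expect(lambda x: a * x + b, pmf))  # E[aX + b]
print(a * expect(lambda x: x, pmf) + b)  # aE[X] + b -- same value
```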
Variance
Two RV's can have the same expectation, but behave (quite) differently in their distributions. For example, consider RV's Y and Z, where the PMF for Y is $p_Y(1) = p_Y(-1) = 0.5$ and the PMF for Z is $p_Z(100) = p_Z(-100) = 0.5$. Both Y and Z have the same expectation, i.e. E[Y] = E[Z] = 0, but obviously their distributions have very different spreads.
To measure the spread of the distribution of X, we will consider the average squared deviation of X from its mean E[X] = µ, and call this measure the variance of X, i.e.,
$$\mathrm{Var}(X) = E[(X - \mu)^2].$$
The square root of Var(X) is called the standard deviation of X, i.e.,
$$\mathrm{SD}(X) = \sqrt{\mathrm{Var}(X)}.$$
Fact: Note that we have
$$\mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2 + \mu^2 - 2X\mu] = E[X^2] + \mu^2 - 2\mu^2 = E[X^2] - \mu^2.$$
Also, $\mathrm{Var}(aX + b) = a^2 \mathrm{Var}(X)$ because
$$\mathrm{Var}(aX + b) = E[(aX + b - E[aX + b])^2] = E[a^2(X - E[X])^2] = a^2 \mathrm{Var}(X).$$
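The sketch below checks both identities numerically on a small PMF (the PMF and the constants a, b are arbitrary choices):

```python
from fractions import Fraction

pmf = {-1: Fraction(1, 2), 1: Fraction(1, 2)}

def expect(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

mu = expect(lambda x: x, pmf)

# Two equivalent expressions for the variance
var_def = expect(lambda x: (x - mu) ** 2, pmf)  # E[(X - mu)^2]
var_alt = expect(lambda x: x**2, pmf) - mu**2   # E[X^2] - mu^2
print(var_def, var_alt)  # equal: both give 1

# Var(aX + b) = a^2 Var(X): the shift b drops out, the scale a enters squared
a, b = 3, 5
pmf_shifted = {a * x + b: p for x, p in pmf.items()}
mu2 = expect(lambda x: x, pmf_shifted)
print(expect(lambda x: (x - mu2) ** 2, pmf_shifted), a**2 * var_def)
```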
Example
For the previous example of Y and Z, we have
$$\mathrm{Var}(Y) = E[Y^2] - E[Y]^2 = E[Y^2] = (-1)^2 \cdot \frac{1}{2} + 1^2 \cdot \frac{1}{2} = 1$$
and
$$\mathrm{Var}(Z) = E[Z^2] - E[Z]^2 = E[Z^2] = (-100)^2 \cdot \frac{1}{2} + 100^2 \cdot \frac{1}{2} = 10000.$$
Another example: If X denotes the outcome of rolling a fair die, then $E[X] = \frac{7}{2}$ and
$$E[X^2] = \sum_{i=1}^{6} i^2 p(i) = \frac{91}{6},$$
and therefore
$$\mathrm{Var}(X) = E[X^2] - E[X]^2 = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12}$$
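A quick exact check of the die computation, reusing the expectation-as-weighted-sum idea from before:

```python
from fractions import Fraction

pmf_die = {i: Fraction(1, 6) for i in range(1, 7)}

mean = sum(x * p for x, p in pmf_die.items())
second_moment = sum(x**2 * p for x, p in pmf_die.items())

print(mean)                     # 7/2
print(second_moment)            # 91/6
print(second_moment - mean**2)  # 35/12
```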
Bernoulli and Binomial RV
Bernoulli RV (success and failure):
$$X = \begin{cases} 1 & \text{w.p. } p \\ 0 & \text{w.p. } 1-p \end{cases}$$
PMF: $p(1) = p$ and $p(0) = 1 - p$.
Expectation: $E[X] = p$. Second Moment: $E[X^2] = p$.
Variance: $\mathrm{Var}(X) = E[X^2] - E[X]^2 = p - p^2$.
Binomial RV: Now suppose that we do n independent trials (each being a success w.p. p and a failure w.p. 1 − p). Let X represent the number of successes in the n trials. Then X is said to be a binomial RV with parameters (n, p), denoted by $X \sim \mathrm{Binomial}(n, p)$; in particular, the case of (1, p) is Bernoulli.
PMF: $p(i) = \binom{n}{i} p^i (1-p)^{n-i}$ where $0 \leq i \leq n$; the shape of the PMF is interesting.
Expectation: $E[X] = np$. Variance: $\mathrm{Var}(X) = np(1-p)$.
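A minimal sketch implementing this PMF with math.comb and verifying normalization, the mean, and the variance (n = 10 and p = 0.3 are arbitrary choices):

```python
import math

n, p = 10, 0.3  # arbitrary parameters

def binom_pmf(i, n, p):
    """P(X = i) = C(n, i) p^i (1-p)^(n-i)."""
    return math.comb(n, i) * p**i * (1 - p) ** (n - i)

probs = [binom_pmf(i, n, p) for i in range(n + 1)]
mean = sum(i * q for i, q in enumerate(probs))
var = sum(i**2 * q for i, q in enumerate(probs)) - mean**2

print(sum(probs))            # 1.0
print(mean, n * p)           # both 3.0
print(var, n * p * (1 - p))  # both 2.1
```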
Poisson and Geometric RV
Poisson RV: A RV X taking on one of the values 0, 1, 2, . . . is said to be a Poisson RV with parameter $\lambda > 0$ if its PMF is given by
$$p(i) = e^{-\lambda} \frac{\lambda^i}{i!}, \quad i = 0, 1, 2, \ldots$$
The parameter $\lambda$ affects the shape of the PMF for $X \sim \mathrm{Poisson}(\lambda)$.
Expectation: $E[X] = \lambda$. Second moment: $E[X^2] = \lambda^2 + \lambda$.
Variance: $\mathrm{Var}(X) = \lambda$.
Geometric RV: Perform independent trials, each succeeding w.p. p, until a success occurs. The number of trials performed is a Geometric RV.
PMF: $p(i) = p(1-p)^{i-1}$, $i = 1, 2, \ldots$
Expectation: $E[X] = \frac{1}{p}$. Variance: $\mathrm{Var}(X) = \frac{1-p}{p^2}$.
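Finally, both sets of formulas can be checked with truncated sums (a minimal sketch; λ = 4 and p = 0.3 are arbitrary choices, and the truncation points are large enough for the tails to be negligible):

```python
import math

# Poisson(lambda): mean and variance both equal lambda
lam = 4.0
poisson = [math.exp(-lam) * lam**i / math.factorial(i) for i in range(100)]
p_mean = sum(i * q for i, q in enumerate(poisson))
p_var = sum(i**2 * q for i, q in enumerate(poisson)) - p_mean**2
print(p_mean, p_var)  # both ~ 4.0

# Geometric(p): mean 1/p, variance (1-p)/p^2
p = 0.3
geom = [(i, p * (1 - p) ** (i - 1)) for i in range(1, 500)]
g_mean = sum(i * q for i, q in geom)
g_var = sum(i**2 * q for i, q in geom) - g_mean**2
print(g_mean, 1 / p)          # ~ 3.333
print(g_var, (1 - p) / p**2)  # ~ 7.778
```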