Lecture 5_3
Definition: $E(X^r)$ is called the $r$th moment of the random variable $X$ about the origin.
Moment Generating Functions (mgfs) are used to generate moments of random variables about
the origin.
Definition: The Moment Generating Function of the random variable $X$ is defined as
$$M_X(t) = E(e^{tX}) = \begin{cases} \displaystyle\sum_{\forall x} e^{tx}\, p(x), & \text{for } X \text{ discrete} \\[2mm] \displaystyle\int_{-\infty}^{\infty} e^{tx} f(x)\, dx, & \text{for } X \text{ continuous.} \end{cases}$$
$$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots = \sum_{k=0}^{\infty} \frac{x^k}{k!}$$
Hence
$$e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \cdots = \sum_{k=0}^{\infty} \frac{t^k x^k}{k!}$$
$$\begin{aligned}
M_X(t) &= E(e^{tX}) \\
&= E\left(1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots\right) \\
&= 1 + tE(X) + \frac{t^2}{2}E(X^2) + \frac{t^3}{6}E(X^3) + \cdots
\end{aligned}$$
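As a quick check, the expansion above can be reproduced symbolically. The following is a minimal sketch using Python's sympy library (our own choice of tool, not part of the lecture):

```python
# Symbolic check of the e^{tx} expansion used above (sympy assumed available).
import sympy as sp

t, x = sp.symbols('t x')

# Taylor-expand e^{tx} about t = 0 up to the t^3 term.
print(sp.series(sp.exp(t * x), t, 0, 4))
# Output: 1 + t*x + t**2*x**2/2 + t**3*x**3/6 + O(t**4)
```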
Let $M'_X(t)$ denote the first derivative of $M_X(t)$ with respect to $t$. Differentiating the series term by term,
$$M'_X(t) = E(X) + tE(X^2) + \frac{t^2}{2}E(X^3) + \text{terms with higher powers of } t, \tag{1}$$
so that, on setting $t = 0$,
$$M'_X(0) = E(X). \tag{2}$$
Similarly, the second derivative is
$$M''_X(t) = E(X^2) + tE(X^3) + \text{terms with higher powers of } t, \tag{3}$$
so that
$$M''_X(0) = E(X^2). \tag{4}$$
In general, finding the $n$th derivative of $M_X(t)$ with respect to $t$ and letting $t = 0$ gives the $n$th moment of the random variable $X$ about the origin, i.e.
$$M_X^{(n)}(0) = E(X^n). \tag{5}$$
Hence, when the moment generating function of a random variable exists, we can use it to find the mean and variance of the random variable, since
$$E(X) = M'_X(0) \qquad \text{and} \qquad \operatorname{Var}(X) = E(X^2) - [E(X)]^2 = M''_X(0) - [M'_X(0)]^2.$$
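The rule in (5) can be illustrated concretely. The sketch below builds $M_X(t)$ from the definition for a small hypothetical pmf (the values and probabilities are made up purely for illustration) and checks that derivatives at $t = 0$ match the directly computed moments:

```python
import sympy as sp

t = sp.symbols('t')

# Hypothetical pmf, invented for this illustration only.
pmf = {0: sp.Rational(1, 4), 1: sp.Rational(1, 2), 2: sp.Rational(1, 4)}

# M_X(t) = sum over x of e^{tx} p(x), straight from the definition.
M = sum(sp.exp(t * x) * p for x, p in pmf.items())

for n in (1, 2, 3):
    from_mgf = sp.diff(M, t, n).subs(t, 0)          # M_X^{(n)}(0)
    direct = sum(x**n * p for x, p in pmf.items())  # E(X^n) from the pmf
    print(n, from_mgf, direct)                      # the two columns agree
```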
Example 1
Let $X \sim \operatorname{Binomial}(n, p)$, so that $p(x) = \binom{n}{x} p^x q^{n-x}$ for $x = 0, 1, \ldots, n$, where $q = 1 - p$. Determine the moment generating function of $X$. Hence determine $E(X)$ and $\operatorname{Var}(X)$.
Solution
$$\begin{aligned}
M_X(t) &= E(e^{tX}) \\
&= \sum_{\forall x} e^{tx}\, p(x) \\
&= \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x} \\
&= \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x} \\
&= (pe^t + q)^n, \qquad \text{by the binomial theorem.}
\end{aligned}$$
$$M'_X(t) = npe^t(pe^t + q)^{n-1}$$
$$M''_X(t) = n(n-1)(pe^t)^2(pe^t + q)^{n-2} + npe^t(pe^t + q)^{n-1}$$
$$E(X) = M'_X(0) = np(p + q)^{n-1} = np$$
$$\operatorname{Var}(X) = M''_X(0) - [M'_X(0)]^2 = n(n-1)p^2 + np - (np)^2 = np(1 - p)$$
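The differentiation in this example can also be verified symbolically. A minimal sympy sketch, writing $q = 1 - p$ throughout:

```python
# Check Example 1: differentiate M_X(t) = (p e^t + q)^n and recover np and npq.
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)
q = 1 - p
M = (p * sp.exp(t) + q) ** n

mean = sp.diff(M, t).subs(t, 0)        # M'_X(0)
second = sp.diff(M, t, 2).subs(t, 0)   # M''_X(0)
var = sp.simplify(second - mean ** 2)  # M''_X(0) - [M'_X(0)]^2

print(sp.simplify(mean))  # n*p
print(var)                # n*p*(1 - p) up to rearrangement, i.e. npq
```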
Theorem 1
Theorem 2
The moment generating function of the sum of a number of independent random variables is equal to the product of their respective moment generating functions. Symbolically, if $X_1, X_2, \ldots, X_n$ are independent random variables, then the moment generating function of their sum $X_1 + X_2 + \cdots + X_n$ is given by
$$M_{X_1 + X_2 + \cdots + X_n}(t) = M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t).$$
Proof. By definition,
$$\begin{aligned}
M_{X_1 + \cdots + X_n}(t) &= E\!\left(e^{t(X_1 + X_2 + \cdots + X_n)}\right) \\
&= E\!\left(e^{tX_1} e^{tX_2} \cdots e^{tX_n}\right) \\
&= E\!\left(e^{tX_1}\right) E\!\left(e^{tX_2}\right) \cdots E\!\left(e^{tX_n}\right) \qquad \text{(by independence)} \\
&= M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t).
\end{aligned}$$
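Theorem 2 admits a quick numerical sanity check. The sketch below uses numpy and two independent binomial samples (the parameters are arbitrary choices for illustration) to compare a Monte Carlo estimate of $M_{X+Y}(t)$ with the product $M_X(t)\,M_Y(t)$:

```python
# Monte Carlo sanity check of Theorem 2 (a sketch, not a proof).
import numpy as np

rng = np.random.default_rng(0)
x = rng.binomial(n=10, p=0.3, size=1_000_000)  # X ~ Bin(10, 0.3), arbitrary
y = rng.binomial(n=5, p=0.6, size=1_000_000)   # Y ~ Bin(5, 0.6), independent of X

t = 0.2
lhs = np.mean(np.exp(t * (x + y)))             # estimate of M_{X+Y}(t)
rhs = np.mean(np.exp(t * x)) * np.mean(np.exp(t * y))  # M_X(t) * M_Y(t)
print(lhs, rhs)  # the two estimates agree to within sampling error
```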
Theorem 3
Effect of a change of origin and scale on the mgf. Let us transform $X$ to the new variable $U$ by changing both the origin and the scale of $X$ as follows:
$$U = \frac{X - a}{h}.$$
Then
$$\begin{aligned}
M_U(t) &= E\!\left(e^{tU}\right) \\
&= E\!\left(e^{t\left(\frac{X - a}{h}\right)}\right) \\
&= E\!\left(e^{\frac{tX}{h}} \cdot e^{-\frac{at}{h}}\right) \\
&= e^{-\frac{at}{h}}\, E\!\left(e^{\frac{tX}{h}}\right) \\
&= e^{-\frac{at}{h}}\, M_X\!\left(\frac{t}{h}\right).
\end{aligned}$$
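The identity in Theorem 3 can be confirmed on any concrete distribution. The sketch below uses sympy and a small made-up pmf, comparing $E(e^{tU})$ computed directly from $U = (X - a)/h$ with $e^{-at/h} M_X(t/h)$:

```python
# Symbolic check of Theorem 3 on a small pmf invented for illustration.
import sympy as sp

t, a, h = sp.symbols('t a h', positive=True)
pmf = {0: sp.Rational(1, 3), 1: sp.Rational(1, 3), 2: sp.Rational(1, 3)}

M_X = sum(sp.exp(t * x) * px for x, px in pmf.items())               # M_X(t)
M_U_direct = sum(sp.exp(t * (x - a) / h) * px for x, px in pmf.items())  # E(e^{tU})
M_U_theorem = sp.exp(-a * t / h) * M_X.subs(t, t / h)                # e^{-at/h} M_X(t/h)

print(sp.simplify(M_U_direct - M_U_theorem))  # 0: the two expressions coincide
```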
Theorem 4
The moment generating function of a distribution, if it exists, uniquely determines the distribution. This implies that corresponding to a given probability distribution there is only one moment generating function (provided it exists), and corresponding to a given moment generating function there is only one probability distribution. Hence $M_X(t) = M_Y(t)$ implies that $X$ and $Y$ are identically distributed.
2. A random variable $X$ can have a moment generating function and some (or all) moments, yet the moment generating function does not generate the moments.
3. A random variable $X$ can have all or some moments, but the moment generating function does not exist except perhaps at one point.
Class Exercise
1. Define the moment generating function of a random variable. Hence or otherwise find the moment generating function of
a) $Y = aX + b$
b) $Y = \dfrac{X - m}{\sigma}$
2. The random variable $X$ takes the value $n$ with probability $\dfrac{1}{2^n}$, $n = 1, 2, 3, \ldots$. Find the moment generating function of $X$ and hence find the mean and variance of $X$.
4. Show that the moment generating function of the random variable $X$ having the probability density function
$$f(x) = \begin{cases} \frac{1}{3}, & -1 < x < 2 \\ 0, & \text{elsewhere} \end{cases}$$
is
$$M_X(t) = \begin{cases} \dfrac{e^{2t} - e^{-t}}{3t}, & t \neq 0 \\ 1, & t = 0. \end{cases}$$
5. $X$ is a random variable with $p(x) = ab^x$, where $a$ and $b$ are positive, $a + b = 1$, and $x$ takes the values $0, 1, 2, \ldots$. Find the moment generating function of $X$. Hence show that $m_2 = m_1(2m_1 + 1)$, $m_1$ and $m_2$ being the first two moments.