Lecture 5_3

Moment generating functions (MGFs) are defined as E(e^{tX}) and are used to generate the moments of a random variable about the origin. These notes explain how to derive the moments and variance of a random variable using MGFs, state several theorems on their properties and limitations, and include examples and exercises illustrating the application of MGFs in probability and statistics.


Moment Generating Functions

Definition: E(X^r) defines the rth moment of the random variable X about the origin.
Moment generating functions (MGFs) are used to generate moments of random variables about the origin.
Definition: The Moment Generating Function of the random variable X is defined as

$$M_X(t) = E(e^{tX}) = \begin{cases} \sum_{\forall x} e^{tx}\, p(x), & \text{for } X \text{ discrete} \\[4pt] \int_{-\infty}^{\infty} e^{tx} f(x)\, dx, & \text{for } X \text{ continuous} \end{cases}$$

where t is a real number, if it exists.


NB: Moment generating functions do not exist for all random variables.
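
As an illustrative aside (not part of the original notes), the discrete case of this definition is easy to evaluate numerically. The sketch below, assuming Python with NumPy and using a fair six-sided die as my own choice of example, computes M_X(t) at a few values of t:

```python
import numpy as np

# Fair six-sided die: X takes the values 1, ..., 6, each with probability 1/6.
x = np.arange(1, 7)
p = np.full(6, 1 / 6)

def mgf(t):
    """Discrete case of the definition: M_X(t) = sum over x of e^(tx) p(x)."""
    return float(np.sum(np.exp(t * x) * p))

for t in (0.0, 0.1, 0.5):
    print(f"M_X({t}) = {mgf(t):.6f}")
# M_X(0) = 1 always, since E(e^0) = E(1) = 1.
```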
Recall: (from the special results provided earlier)

$$e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \ldots = \sum_{k=0}^{\infty} \frac{x^k}{k!}$$

Hence

$$e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \ldots = \sum_{k=0}^{\infty} \frac{t^k x^k}{k!}$$

$$\begin{aligned} M_X(t) &= E(e^{tX}) \\ &= E\!\left(1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \ldots\right) \\ &= 1 + tE(X) + \frac{t^2}{2!}E(X^2) + \frac{t^3}{3!}E(X^3) + \ldots \end{aligned}$$

Let us define M'_X(t) as the first derivative of M_X(t) with respect to t. Then

$$M'_X(t) = E(X) + tE(X^2) + \frac{t^2}{2}E(X^3) + \text{terms with higher powers of } t \tag{1}$$

Let t = 0 in Equation (1)


$$M'_X(0) = E(X) \tag{2}$$
Let us define M''_X(t) as the second derivative of M_X(t) with respect to t. We get this by differentiating Equation (1) with respect to t:

$$M''_X(t) = E(X^2) + tE(X^3) + \text{terms with higher powers of } t \tag{3}$$

Let t = 0 in Equation (3)


$$M''_X(0) = E(X^2) \tag{4}$$

In general, finding the nth derivative of M_X(t) with respect to t and letting t = 0 gives the nth moment of the random variable X about the origin, i.e.

$$M_X^{(n)}(0) = E(X^n) \tag{5}$$

Hence, whenever the moment generating function of a random variable exists, we can use it to find the mean and variance of the random variable, since

$$E(X) = M'_X(0)$$

$$\begin{aligned} \text{Var}(X) &= E(X^2) - [E(X)]^2 \\ &= M''_X(0) - [M'_X(0)]^2 \end{aligned}$$
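
To make this derivative recipe concrete, here is a small sketch of mine (assuming Python with SymPy) that recovers the mean and variance of a Poisson(λ) random variable. The Poisson MGF, M_X(t) = exp(λ(e^t − 1)), is a standard fact assumed here rather than derived in these notes:

```python
import sympy as sp

t = sp.symbols("t", real=True)
lam = sp.symbols("lam", positive=True)

# Assumed fact (not derived in these notes): the Poisson(lam) MGF is
# M_X(t) = exp(lam * (e^t - 1)).
M = sp.exp(lam * (sp.exp(t) - 1))

# Equation (5): the nth moment about the origin is the nth derivative at t = 0.
m1 = sp.diff(M, t, 1).subs(t, 0)   # E(X)
m2 = sp.diff(M, t, 2).subs(t, 0)   # E(X^2)

print(sp.simplify(m1))             # lam
print(sp.simplify(m2 - m1**2))     # lam, i.e. Var(X) = E(X^2) - [E(X)]^2
```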

Example 1

The probability mass function of a random variable X is given by




$$p(x) = \begin{cases} \dbinom{n}{x} p^x q^{n-x}, & x = 0, 1, 2, \ldots, n, \text{ where } p + q = 1 \\[4pt] 0, & \text{otherwise} \end{cases}$$

Determine the moment generating function of X. Hence determine E(X) and Var(X).

Solution

$$\begin{aligned} M_X(t) &= E(e^{tX}) \\ &= \sum_{\forall x} e^{tx}\, p(x) \\ &= \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x} \\ &= \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x} \\ &= (pe^t + q)^n \end{aligned}$$

Differentiating with respect to t,

$$M'_X(t) = npe^t(pe^t + q)^{n-1}$$

$$M''_X(t) = npe^t(pe^t + q)^{n-1} + n(n-1)(pe^t)^2(pe^t + q)^{n-2}$$

where the second derivative has two terms by the product rule.

$$E(X) = M'_X(0) = np(p + q)^{n-1} = np$$

$$\begin{aligned} \text{Var}(X) &= M''_X(0) - [M'_X(0)]^2 \\ &= np + n(n-1)p^2 - (np)^2 \\ &= np(1 - p) = npq \end{aligned}$$
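
As a quick sanity check (my addition, not part of the lecture), SymPy reproduces this derivation symbolically, including the product-rule term in the second derivative that is easy to drop by hand:

```python
import sympy as sp

t = sp.symbols("t", real=True)
n = sp.symbols("n", positive=True, integer=True)
p = sp.symbols("p", positive=True)
q = 1 - p

# The MGF derived above: M_X(t) = (p*e^t + q)^n.
M = (p * sp.exp(t) + q) ** n

m1 = sp.diff(M, t, 1).subs(t, 0)   # E(X)   = np
m2 = sp.diff(M, t, 2).subs(t, 0)   # E(X^2) = np + n(n-1)p^2

print(sp.simplify(m1))             # n*p
print(sp.simplify(m2 - m1**2))     # n*p - n*p**2, i.e. np(1 - p)
```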

Theorems on Moment Generating Functions

Theorem 1

M_{cX}(t) = M_X(ct), c being a constant.

Proof. By definition:

$$\text{L.H.S.} = M_{cX}(t) = E(e^{t(cX)})$$
$$\text{R.H.S.} = M_X(ct) = E(e^{(ct)X}) = \text{L.H.S.}$$

Theorem 2

The moment generating function of the sum of a number of independent random variables is equal to the product of their respective moment generating functions. Symbolically, if X_1, X_2, \ldots, X_n are independent random variables, then the moment generating function of their sum X_1 + X_2 + \ldots + X_n is given by

$$M_{X_1 + X_2 + \ldots + X_n}(t) = M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t)$$

Proof. By definition:

$$\begin{aligned} M_{X_1 + X_2 + \ldots + X_n}(t) &= E\!\left[e^{t(X_1 + X_2 + \ldots + X_n)}\right] \\ &= E\!\left[e^{tX_1} e^{tX_2} \cdots e^{tX_n}\right] \\ &= E\!\left[e^{tX_1}\right] E\!\left[e^{tX_2}\right] \cdots E\!\left[e^{tX_n}\right] \quad \text{because the } X_i\text{'s are independent} \\ &= M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t) \end{aligned}$$

Hence the theorem.
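
For instance (a sketch of mine, again assuming SymPy, and reusing the binomial MGF from Example 1), the MGFs of two independent binomial variables with the same p multiply to the binomial MGF with the combined number of trials:

```python
import sympy as sp

t = sp.symbols("t", real=True)
p = sp.symbols("p", positive=True)
n, m = sp.symbols("n m", positive=True, integer=True)
q = 1 - p

def binom_mgf(k):
    # From Example 1: the Binomial(k, p) MGF is (p*e^t + q)^k.
    return (p * sp.exp(t) + q) ** k

# Theorem 2: for independent X ~ Bin(n, p) and Y ~ Bin(m, p),
# M_{X+Y}(t) = M_X(t) * M_Y(t), which equals the Binomial(n + m, p) MGF.
product = binom_mgf(n) * binom_mgf(m)
combined = binom_mgf(n + m)

print(sp.simplify(product - combined))  # 0, so X + Y ~ Binomial(n + m, p)
```

By Theorem 4 below, this identity of MGFs is enough to conclude that the sum is itself binomial.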

Theorem 3

Effect of change of origin and scale on the MGF. Let us transform X to the new variable U by changing both the origin and scale in X as follows:

$$U = \frac{X - a}{h}$$

where a and h are constants.


The moment generating function of U (about origin) is given by

$$\begin{aligned} M_U(t) &= E(e^{tU}) \\ &= E\!\left(e^{t\left(\frac{X-a}{h}\right)}\right) \\ &= E\!\left(e^{\frac{tX}{h}} \times e^{-\frac{at}{h}}\right) \\ &= e^{-\frac{at}{h}}\, E\!\left(e^{\frac{tX}{h}}\right) \\ &= e^{-\frac{at}{h}}\, M_X\!\left(\frac{t}{h}\right) \end{aligned}$$

where M_X(t) is the moment generating function of X about the origin.
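
As a concrete illustration (my addition, assuming the standard fact that X ~ N(μ, σ²) has MGF M_X(t) = exp(μt + σ²t²/2), which is not derived in these notes), applying Theorem 3 with a = μ and h = σ standardizes a normal variable, and SymPy recovers the standard-normal MGF:

```python
import sympy as sp

t, mu = sp.symbols("t mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# Assumed fact: for X ~ N(mu, sigma^2), M_X(t) = exp(mu*t + sigma^2*t^2/2).
M_X = sp.exp(mu * t + sigma**2 * t**2 / 2)

# Theorem 3 with a = mu and h = sigma, i.e. U = (X - mu)/sigma:
# M_U(t) = e^{-at/h} * M_X(t/h).
M_U = sp.exp(-mu * t / sigma) * M_X.subs(t, t / sigma)

print(sp.simplify(M_U))  # exp(t**2/2), the MGF of the standard normal
```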

Theorem 4

The moment generating function of a distribution, if it exists, uniquely determines the distribution. This implies that corresponding to a given probability distribution, there is only one moment generating function (provided it exists) and corresponding to a given moment generating function, there is only one probability distribution. Hence M_X(t) = M_Y(t) ⇒ X and Y are identically distributed.

Limitations of the Moment Generating Function


1. A random variable X may have no moments although its moment generating function exists.

2. A random variable X can have a moment generating function and some (or all) moments, yet the moment generating function does not generate the moments.

3. A random variable X can have all or some moments, but the moment generating function does not exist except perhaps at one point. (A classical example is the lognormal distribution: all its moments exist, yet its MGF diverges for every t > 0.)

Class Exercise
1. Define the moment generating function of a random variable. Hence or otherwise find the
moment generating function of

a) Y = aX + b
b) Y = \frac{X - m}{\sigma}
2. The random variable X takes the value n with probability \frac{1}{2^n}, n = 1, 2, 3, \ldots. Find the moment generating function of X and hence find the mean and variance of X.

3. Show that if X̄ is the mean of n independent random variables, each having moment generating function M_X(t), then

$$M_{\bar{X}}(t) = \left[M_X\!\left(\frac{t}{n}\right)\right]^n$$

4. Show that the moment generating function of the random variable X having the probability density function

$$f(x) = \begin{cases} \frac{1}{3}, & -1 < x < 2 \\[4pt] 0, & \text{elsewhere} \end{cases}$$

is

$$M_X(t) = \begin{cases} \dfrac{e^{2t} - e^{-t}}{3t}, & t \neq 0 \\[4pt] 1, & t = 0 \end{cases}$$

5. X is a random variable and p(x) = ab^x, where a and b are positive, a + b = 1, with x taking the values 0, 1, 2, \ldots. Find the moment generating function of X. Hence show that m_2 = m_1(2m_1 + 1), m_1 and m_2 being the first two moments.

References

1. DeGroot, Morris H. & Schervish, Mark J. (2012). Probability and Statistics, 4th Edition. Pearson Education, Inc.

2. Hogg, Robert V., Tanis, Elliot A. & Zimmerman, Dale L. (2015). Probability & Statistical Inference, 9th Edition. Pearson Education, Inc.
