Chapter 4
Functions of Random Variables
Let's recall functions of one random variable.
Example 1:
Solution:
Theorem 1
Example 2:
Solution:
Ex:
4.1 Distribution of the Sum and Difference of Two Continuous Random Variables
Theorem 2
Proof:
Let X and Y be two continuous random variables with joint probability density function $f_{X,Y}(x, y)$.
Let $Z = X + Y$. Then the cumulative distribution function of $Z$ is
$$F_Z(z) = P(Z \le z) = P(X + Y \le z) = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{z-x} f_{X,Y}(x, y)\, dy \right) dx$$
Differentiating with respect to $z$:
$$f_Z(z) = \frac{d}{dz} \left[ \int_{-\infty}^{\infty} \left( \int_{-\infty}^{z-x} f_{X,Y}(x, y)\, dy \right) dx \right] = \int_{-\infty}^{\infty} \frac{d}{dz} \left( \int_{-\infty}^{z-x} f_{X,Y}(x, y)\, dy \right) dx$$
Now apply Leibniz's rule:
$$f_Z(z) = \int_{-\infty}^{\infty} f_{X,Y}(x, z - x)\, dx$$
Similarly:
$$f_Z(z) = \int_{-\infty}^{\infty} f_{X,Y}(z - y, y)\, dy$$
Corollary 4.1.1: Let X and Y be two continuous and independent random variables, and let $Z = X + Y$. Then
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx$$
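As a numerical sanity check (an addition to these notes, not part of the original), the Python sketch below evaluates the convolution integral of Corollary 4.1.1 for two independent $\mathrm{Exp}(\lambda)$ densities and compares it with the Gamma$(2, \lambda)$ density $\lambda^2 z e^{-\lambda z}$ derived in Example 4 below. The rate $\lambda = 1.5$ and the integration grid are illustrative choices.

```python
import numpy as np

# Corollary 4.1.1 for independent X, Y ~ Exp(lam):
# f_Z(z) = integral of f_X(x) f_Y(z - x) dx, which should equal
# the Gamma(2, lam) density lam^2 * z * exp(-lam * z).
lam = 1.5                              # illustrative rate parameter
x = np.linspace(0.0, 30.0, 300_001)    # integration grid on the support of X
dx = x[1] - x[0]

def f_exp(t):
    """Exp(lam) density; zero for negative arguments."""
    return np.where(t >= 0, lam * np.exp(-lam * np.clip(t, 0, None)), 0.0)

for z in [0.5, 1.0, 2.0, 4.0]:
    numeric = np.sum(f_exp(x) * f_exp(z - x)) * dx  # Riemann sum of the convolution
    exact = lam**2 * z * np.exp(-lam * z)
    print(f"z={z}: numeric={numeric:.6f}  exact={exact:.6f}")
```

The two columns should agree up to the accuracy of the Riemann sum.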
Example 3:
Let $f(x, y) = 2$ for $x \ge 0$, $y \ge 0$, $x + y \le 1$.
Find the pdf of $W = X + Y$.
Solution:
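The worked solution is not legible in this copy; the following is a sketch of the standard computation. By the convolution formula, for $0 \le w \le 1$,
$$f_W(w) = \int_{-\infty}^{\infty} f_{X,Y}(x, w - x)\, dx = \int_{0}^{w} 2\, dx = 2w,$$
since the integrand equals 2 exactly when $x \ge 0$ and $w - x \ge 0$, i.e. $0 \le x \le w$. Hence $f_W(w) = 2w$ for $0 \le w \le 1$ and $0$ otherwise.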
Example 4:
Let X and Y be two independent continuous random variables, each exponentially distributed with parameter $\lambda$.
Find the probability density function of the random variable $W = X + Y$.
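No worked solution appears in this copy; a sketch using Corollary 4.1.1: for $w > 0$,
$$f_W(w) = \int_{0}^{w} \lambda e^{-\lambda x} \cdot \lambda e^{-\lambda (w - x)}\, dx = \lambda^2 e^{-\lambda w} \int_{0}^{w} dx = \lambda^2 w\, e^{-\lambda w},$$
which is the Gamma$(2, \lambda)$ density, in agreement with the numerical check above.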
Let X and Y be two continuous random variables with joint probability density function $f_{X,Y}(x, y)$. Let $V = X - Y$. Then the cumulative distribution function of $V$ is
$$F_V(v) = P(V \le v) = P(X - Y \le v) = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{v+y} f_{X,Y}(x, y)\, dx \right) dy$$
Density function:
$$f_V(v) = \frac{dF_V(v)}{dv} = \frac{d}{dv} \left[ \int_{-\infty}^{\infty} \left( \int_{-\infty}^{v+y} f_{X,Y}(x, y)\, dx \right) dy \right] = \int_{-\infty}^{\infty} f_{X,Y}(v + y, y)\, dy$$
Similarly:
$$f_V(v) = \int_{-\infty}^{\infty} f_{X,Y}(x, x - v)\, dx$$
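A quick worked instance (an addition to the notes): if X and Y are independent $\mathrm{Exp}(\lambda)$, then for $v \ge 0$
$$f_V(v) = \int_{0}^{\infty} \lambda e^{-\lambda (v + y)} \cdot \lambda e^{-\lambda y}\, dy = \lambda^2 e^{-\lambda v} \cdot \frac{1}{2\lambda} = \frac{\lambda}{2} e^{-\lambda v},$$
and by the symmetry of $X - Y$, $f_V(v) = \frac{\lambda}{2} e^{-\lambda |v|}$ for all $v$: a Laplace density.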
4.2 Distribution of the Product and Quotient of Two Continuous Random Variables
Let X and Y be two continuous and independent random variables with joint probability density function $f_{X,Y}(x, y)$. Then the product $Z = XY$ has cumulative distribution function given by:
$$F_Z(z) = P(Z \le z) = P(XY \le z) = \int_{-\infty}^{0} \left( \int_{z/x}^{\infty} f_{X,Y}(x, y)\, dy \right) dx + \int_{0}^{\infty} \left( \int_{-\infty}^{z/x} f_{X,Y}(x, y)\, dy \right) dx$$
Now apply Leibniz's rule:
$$f_Z(z) = \frac{d}{dz} \left[ \int_{-\infty}^{0} \left( \int_{z/x}^{\infty} f_{X,Y}(x, y)\, dy \right) dx + \int_{0}^{\infty} \left( \int_{-\infty}^{z/x} f_{X,Y}(x, y)\, dy \right) dx \right]$$
$$= \int_{-\infty}^{0} \frac{d}{dz} \left( \int_{z/x}^{\infty} f_{X,Y}(x, y)\, dy \right) dx + \int_{0}^{\infty} \frac{d}{dz} \left( \int_{-\infty}^{z/x} f_{X,Y}(x, y)\, dy \right) dx$$
$$= \int_{-\infty}^{0} -\frac{1}{x}\, f_{X,Y}\!\left(x, \frac{z}{x}\right) dx + \int_{0}^{\infty} \frac{1}{x}\, f_{X,Y}\!\left(x, \frac{z}{x}\right) dx$$
Density function:
$$f_Z(z) = \int_{-\infty}^{\infty} \frac{1}{|x|}\, f_{X,Y}\!\left(x, \frac{z}{x}\right) dx$$
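As a quick check of this formula (an addition to the notes), take X and Y independent $U(0,1)$. Then $f_{X,Y}(x, z/x) = 1$ exactly when $z < x < 1$, so $f_Z(z) = \int_z^1 \frac{1}{x}\, dx = -\ln z$ for $0 < z < 1$. The sketch below compares this with a Monte Carlo density estimate; the sample size, seed, and window width are illustrative choices.

```python
import numpy as np

# Product of two independent U(0,1) variables: the density formula
# gives f_Z(z) = -ln(z) on (0, 1); estimate the density empirically.
rng = np.random.default_rng(0)
z = rng.uniform(size=1_000_000) * rng.uniform(size=1_000_000)  # samples of XY

for z0 in [0.1, 0.3, 0.5]:
    h = 0.01  # half-width of a small window around z0
    mc_density = np.mean(np.abs(z - z0) < h) / (2 * h)  # fraction per unit length
    print(f"z={z0}: Monte Carlo≈{mc_density:.3f}, formula -ln(z)={-np.log(z0):.3f}")
```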
Let X and Y be two continuous and independent random variables with joint probability density function $f_{X,Y}(x, y)$. Then the quotient $U = X/Y$ has the cumulative distribution function:
$$F_U(u) = P(U \le u) = P\!\left[\frac{X}{Y} \le u\right] = \int_{-\infty}^{0} \left( \int_{uy}^{\infty} f_{X,Y}(x, y)\, dx \right) dy + \int_{0}^{\infty} \left( \int_{-\infty}^{uy} f_{X,Y}(x, y)\, dx \right) dy$$
Density function (differentiating under the integral sign, as above):
$$f_U(u) = \frac{dF_U(u)}{du} = \int_{-\infty}^{\infty} |y|\, f_{X,Y}(uy, y)\, dy$$
Corollary 4.4.1: Let X and Y be two continuous and independent random variables, and let $U = X/Y$. Then
$$f_U(u) = \int_{-\infty}^{\infty} |y|\, f_X(uy)\, f_Y(y)\, dy$$
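A numerical sanity check (added, not in the original notes): for independent $X, Y \sim N(0,1)$ the corollary should reproduce the standard Cauchy density $\frac{1}{\pi (1 + u^2)}$. The integration grid below is an illustrative choice.

```python
import numpy as np

# Corollary 4.4.1 for independent X, Y ~ N(0,1): the integral
# f_U(u) = ∫ |y| φ(uy) φ(y) dy should equal 1 / (π (1 + u²)).
def phi(t):
    """Standard normal density."""
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

y = np.linspace(-10.0, 10.0, 400_001)
dy = y[1] - y[0]
for u in [0.0, 1.0, 2.5]:
    numeric = np.sum(np.abs(y) * phi(u * y) * phi(y)) * dy  # Riemann sum
    exact = 1 / (np.pi * (1 + u**2))
    print(f"u={u}: numeric={numeric:.6f}, Cauchy={exact:.6f}")
```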
Order Statistics
Example:
Suppose a random sample of five rats yields the following weights (in grams):
Theorem 3
Let $X_1, X_2, \dots, X_n$ be a set of independent random variables, each of which has cumulative distribution function $F(\cdot)$.
Let $Y_1 = \min\{X_1, X_2, \dots, X_n\}$ and $Y_n = \max\{X_1, X_2, \dots, X_n\}$.
The cumulative distribution functions of $Y_n$ and $Y_1$ are:
$$F_{Y_n}(y) = P[Y_n \le y] = P[X_1 \le y, X_2 \le y, \dots, X_n \le y] = \{F(y)\}^n$$
and
$$F_{Y_1}(y) = P[Y_1 \le y] = 1 - P[Y_1 > y] = 1 - P[X_1 > y, X_2 > y, \dots, X_n > y] = 1 - \{1 - F(y)\}^n$$
Example 5:
Let $X_1, X_2, \dots, X_n$ be independent $U(0,1)$ random variables. Find the distribution of the minimum and the maximum.
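A sketch via Theorem 3 (added for completeness): here $F(y) = y$ for $0 \le y \le 1$, so
$$F_{Y_n}(y) = y^n, \quad f_{Y_n}(y) = n y^{n-1} \qquad \text{and} \qquad F_{Y_1}(y) = 1 - (1 - y)^n, \quad f_{Y_1}(y) = n (1 - y)^{n-1}$$
for $0 \le y \le 1$.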
Example 6:
A string of 10 light bulbs is connected in series, which means that the entire string will not light up if any one of the light bulbs fails. Assume that the lifetimes of the bulbs, $\tau_1, \tau_2, \dots, \tau_{10}$, are independent random variables that are exponentially distributed with mean 2. Find the distribution of the life length of this string of light bulbs.
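A sketch (added): the string fails as soon as the first bulb fails, so its life length is $Y_1 = \min\{\tau_1, \dots, \tau_{10}\}$. Each $\tau_i$ has $F(y) = 1 - e^{-y/2}$ for $y > 0$, so by Theorem 3
$$F_{Y_1}(y) = 1 - \{1 - F(y)\}^{10} = 1 - e^{-5y}, \qquad y > 0,$$
i.e. the life length of the string is exponential with mean $1/5$.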
Ref: Ramachandran, K. M., & Tsokos, C. P. Mathematical Statistics with Applications.
Recall:
Let X and Y be two random variables with joint density function $f(x, y)$. A real-valued function $m : \mathbb{R}^2 \to \mathbb{R}$ defined by
$$m(s, t) = E\left(e^{sX + tY}\right)$$
is called the joint moment generating function of X and Y if this expected value exists for all $s$ in some interval $-h < s < h$ and for all $t$ in some interval $-k < t < k$, for some positive $h$ and $k$.
If $Y_i = g_i(X_1, X_2, \dots, X_n)$ for $i = 1, 2, \dots, k$, then the joint moment generating function $m_{Y_1, Y_2, \dots, Y_k}(t_1, t_2, \dots, t_k)$ of $Y_1, Y_2, \dots, Y_k$ is defined by
$$m_{Y_1, Y_2, \dots, Y_k}(t_1, t_2, \dots, t_k) = E\left(e^{\sum_{i=1}^{k} t_i Y_i}\right) = E\left(e^{\sum_{i=1}^{k} t_i g_i(X_1, X_2, \dots, X_n)}\right)$$
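For instance (an added illustration), if X and Y are independent then $m(s, t) = E(e^{sX})\, E(e^{tY}) = m_X(s)\, m_Y(t)$; for independent standard normals this gives $m(s, t) = e^{(s^2 + t^2)/2}$.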
Theorem 4
Let $X_1, X_2, \dots, X_n$ be $n$ independent random variables, and suppose that the moment generating function of each exists for all $-h < t < h$ for some $h > 0$.
Let $Y = \sum_{i=1}^{n} X_i$. Then
$$m_Y(t) = E\left(e^{t \sum_{i=1}^{n} X_i}\right) =
\begin{cases}
\displaystyle \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} e^{t \sum_{i=1}^{n} x_i}\, f_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) \prod_{i=1}^{n} dx_i, & \text{if continuous} \\[2ex]
\displaystyle \sum_{x_1} \cdots \sum_{x_n} e^{t \sum_{i=1}^{n} x_i}\, P[X_1 = x_1, X_2 = x_2, \dots, X_n = x_n], & \text{if discrete}
\end{cases}$$
By independence, the joint density (or joint probability) factorizes, so
$$m_Y(t) =
\begin{cases}
\displaystyle \prod_{i=1}^{n} \left\{ \int_{-\infty}^{\infty} e^{t x_i}\, f_{X_i}(x_i)\, dx_i \right\}, & \text{if continuous} \\[2ex]
\displaystyle \prod_{i=1}^{n} \left\{ \sum_{x_i} e^{t x_i}\, P[X_i = x_i] \right\}, & \text{if discrete}
\end{cases}
\;=\; \prod_{i=1}^{n} m_{X_i}(t).$$
Example 7:
Let $X_1$ and $X_2$ be independent random variables with uniform distributions on $\{1, 2, 3, 4\}$.
Let $Y = X_1 + X_2$. For example, Y could equal the sum when two fair four-sided dice are rolled.
Solution:
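The worked solution is missing from this copy. By Theorem 4, $m_Y(t) = m_{X_1}(t)\, m_{X_2}(t) = \left(\tfrac{1}{4}(e^t + e^{2t} + e^{3t} + e^{4t})\right)^2$, and the coefficient of $e^{yt}$ in the expansion equals $P[Y = y]$. The sketch below (an addition to the notes) recovers those coefficients; multiplying exponential polynomials convolves their coefficient arrays.

```python
import numpy as np

# m_Y(t) = m_X1(t) * m_X2(t); multiplying the MGFs multiplies the
# exponential-polynomial coefficients, i.e. convolves the two pmfs.
pmf_x = np.array([0.25, 0.25, 0.25, 0.25])  # P[X = 1], ..., P[X = 4]
pmf_y = np.convolve(pmf_x, pmf_x)           # P[Y = 2], ..., P[Y = 8]

for y, p in enumerate(pmf_y, start=2):
    print(f"P[Y = {y}] = {p:.4f}")  # 1/16, 2/16, 3/16, 4/16, 3/16, 2/16, 1/16
```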
Example 8:
4.5 Transformations
Recall:
Assume that
(i) $y = g(x)$ is a one-to-one transformation of the support of $X$ onto the range $\mathcal{R}$ of $g$;
(ii) the derivative of $x = g^{-1}(y)$ with respect to $y$ is continuous and nonzero for $y \in \mathcal{R}$.
Then $Y = g(X)$ is a continuous random variable with probability density function
$$f_Y(y) = f_X\!\left(g^{-1}(y)\right) \left| \frac{dg^{-1}(y)}{dy} \right|, \qquad y \in \mathcal{R}.$$
Proof:
Case (i): when $g(x)$ is a monotone increasing function of $x$:
$$F_Y(y) = P\{Y \le y\} = P\{g(X) \le y\} = P\{X \le g^{-1}(y)\} = F_X\!\left(g^{-1}(y)\right)$$
$$f_Y(y) = \frac{dF_Y(y)}{dy} = f_X\!\left(g^{-1}(y)\right) \frac{dg^{-1}(y)}{dy}$$
Case (ii): when $g(x)$ is a monotone decreasing function of $x$:
$$F_Y(y) = P\{Y \le y\} = P\{g(X) \le y\} = P\{X \ge g^{-1}(y)\} = 1 - F_X\!\left(g^{-1}(y)\right)$$
Thus the density function:
$$f_Y(y) = \frac{dF_Y(y)}{dy} = -f_X\!\left(g^{-1}(y)\right) \frac{dg^{-1}(y)}{dy}$$
Thus we have
$$f_Y(y) = \left| \frac{dg^{-1}(y)}{dy} \right| f_X\!\left(g^{-1}(y)\right)$$
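A quick worked instance of this formula (an addition to the notes): let $X \sim U(0,1)$ and $Y = g(X) = -\ln X$. Then $g^{-1}(y) = e^{-y}$ and $\frac{dg^{-1}(y)}{dy} = -e^{-y}$, so
$$f_Y(y) = f_X(e^{-y}) \left| -e^{-y} \right| = e^{-y}, \qquad y > 0,$$
i.e. $Y \sim \mathrm{Exp}(1)$.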
Theorem 5
Let $X_1$ and $X_2$ be jointly continuous random variables with joint probability density function $f_{X_1, X_2}(x_1, x_2)$.
Let $\mathcal{X} = \{(x_1, x_2) : f_{X_1, X_2}(x_1, x_2) > 0\}$.
Assume that
(i) $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ define a one-to-one transformation of $\mathcal{X}$ onto $\mathcal{Y}$;
(ii) the first partial derivatives of $x_1 = g_1^{-1}(y_1, y_2)$ and $x_2 = g_2^{-1}(y_1, y_2)$ are continuous over $\mathcal{Y}$;
(iii) the Jacobian $J$ of the transformation is nonzero for $(y_1, y_2) \in \mathcal{Y}$.
Then the joint probability density function of $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ is given by
$$f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}\!\left(g_1^{-1}(y_1, y_2),\, g_2^{-1}(y_1, y_2)\right) |J|, \qquad (y_1, y_2) \in \mathcal{Y}.$$
Note:
$$J = \begin{vmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[1.5ex] \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2} \end{vmatrix}$$
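A short worked instance (added): let $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$. Then $x_1 = (y_1 + y_2)/2$, $x_2 = (y_1 - y_2)/2$, and
$$J = \begin{vmatrix} 1/2 & 1/2 \\ 1/2 & -1/2 \end{vmatrix} = -\frac{1}{2},$$
so by Theorem 5, $f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{2}\, f_{X_1, X_2}\!\left(\frac{y_1 + y_2}{2}, \frac{y_1 - y_2}{2}\right)$.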
Example 9:
Example 10:
Show that
Solution: