7 Multiple Random Variables
Faculty of Engineering
Electrical and Electronics Engineering Department
Salma Elkawafi
[email protected]
For n discrete random variables X_1, X_2, ..., X_n, the joint PMF is defined as
$$p(x_1, x_2, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n).$$
For n jointly continuous random variables X_1, X_2, ..., X_n, the joint PDF is defined to be the function f(x_1, x_2, ..., x_n) such that the probability of any set A ⊆ R^n is given by the integral of the PDF over the set A. In particular, for a set A ⊆ R^n, we can write:
$$P\big((X_1, X_2, \ldots, X_n) \in A\big) = \int \cdots \int_{A} f(x_1, x_2, \ldots, x_n)\, dx_1 \cdots dx_n.$$
The marginal PDF of X_i can be obtained by integrating out all the other X_j's. For example,
$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n)\, dx_2 \cdots dx_n.$$
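As a quick illustration of marginalization, here is a minimal sketch, assuming sympy is available; it integrates out one variable of a joint PDF symbolically, using the joint PDF f(x, y) = (3/2)x^2 + y from the worked example at the end of this section.

```python
# Sketch: obtaining marginal PDFs by integrating out the other variable (sympy assumed).
# Joint PDF taken from the example later in this section: f(x, y) = 3/2*x**2 + y on 0 < x, y < 1.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_xy = sp.Rational(3, 2) * x**2 + y          # joint PDF on the unit square

f_X = sp.integrate(f_xy, (y, 0, 1))          # marginal of X: 3*x**2/2 + 1/2
f_Y = sp.integrate(f_xy, (x, 0, 1))          # marginal of Y: y + 1/2
print(f_X, f_Y)

# sanity check: each marginal integrates to 1 over its support
print(sp.integrate(f_X, (x, 0, 1)), sp.integrate(f_Y, (y, 0, 1)))
```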
Expected Value
If X_1, X_2, ..., X_n are independent random variables, then
$$E[X_1 X_2 \cdots X_n] = E[X_1]\, E[X_2] \cdots E[X_n].$$
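A small Monte Carlo sketch of this product rule, assuming numpy is available; the particular distributions below are arbitrary illustrations and not part of the lecture.

```python
# Sketch: check that E[X1*X2*X3] ~ E[X1]*E[X2]*E[X3] for independent random variables.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.uniform(0, 1, n)          # independent draws from illustrative distributions
x2 = rng.uniform(0, 2, n)
x3 = rng.exponential(1.0, n)

print(np.mean(x1 * x2 * x3))                       # ~ 0.5 * 1.0 * 1.0 = 0.5
print(np.mean(x1) * np.mean(x2) * np.mean(x3))     # product of the individual means
```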
More Than Two Random Variables
Independent and identically distributed (i.i.d.) Random Variables: random variables X_1, X_2, ..., X_n are said to be i.i.d. if they are independent and all have the same distribution.
Variance:
If Y = X_1 + X_2 + ... + X_n, then
$$\sigma_Y^2 = \mathrm{Var}(Y) = \mathrm{Var}\!\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \mathrm{Var}(X_i) + 2\sum_{i<j} \mathrm{Cov}(X_i, X_j).$$
For example,
$$\mathrm{Var}(X_1 + X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + 2\,\mathrm{Cov}(X_1, X_2).$$
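Below is a small simulation sketch of the n = 2 case, assuming numpy is available; the covariance matrix used is an arbitrary illustrative choice.

```python
# Sketch: numerical check of Var(X1 + X2) = Var(X1) + Var(X2) + 2*Cov(X1, X2)
# using correlated Gaussian samples (illustrative covariance values).
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[2.0, 0.7],
                [0.7, 1.5]])               # chosen covariance matrix
samples = rng.multivariate_normal(mean=[0, 0], cov=cov, size=1_000_000)
x1, x2 = samples[:, 0], samples[:, 1]

lhs = np.var(x1 + x2)
rhs = np.var(x1) + np.var(x2) + 2 * np.cov(x1, x2)[0, 1]
print(lhs, rhs)                            # both ~ 2.0 + 1.5 + 2*0.7 = 4.9
```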
For a random vector X, a constant matrix A, and a constant vector b, let Y = AX + b. Then we have
$$E[\mathbf{Y}] = A\, E[\mathbf{X}] + \mathbf{b}.$$
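A short simulation sketch of this affine rule, assuming numpy; the matrix A, vector b, and the distribution of X are arbitrary choices made only for illustration.

```python
# Sketch: check E[Y] = A @ E[X] + b for Y = A X + b with simulated X.
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[1.0, 2.0],
              [0.0, -1.0],
              [3.0, 1.0]])                 # illustrative 3x2 matrix
b = np.array([1.0, 2.0, 3.0])

X = rng.multivariate_normal(mean=[1.0, -0.5], cov=[[1.0, 0.3], [0.3, 2.0]], size=500_000)
Y = X @ A.T + b                            # each row is A @ x + b

print(Y.mean(axis=0))                      # ~ A @ [1.0, -0.5] + b = [1.0, 2.5, 5.5]
print(A @ np.array([1.0, -0.5]) + b)
```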
Random Vector
Examples: Cont.
Let X and Y be two jointly continuous random variables with joint PDF
$$f(x, y) = \frac{3}{2}x^2 + y, \quad \text{for } 0 < x, y < 1, \text{ and } 0 \text{ otherwise},$$
and let the random vector U be defined as
$$\mathbf{U} = \begin{bmatrix} X \\ Y \end{bmatrix}.$$
Find the correlation and covariance matrices of U.
Solution
$$f_X(x) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dy = \frac{3}{2}x^2 + \frac{1}{2}, \quad \text{for } 0 < x < 1,$$
$$f_Y(y) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dx = y + \frac{1}{2}, \quad \text{for } 0 < y < 1.$$
From these, we obtain $E[X] = \frac{5}{8}$, $E[X^2] = \frac{7}{15}$, $E[Y] = \frac{7}{12}$, and $E[Y^2] = \frac{5}{12}$. We also need $E[XY]$.
$$E[XY] = \int_0^1 \int_0^1 xy\left(\frac{3}{2}x^2 + y\right) dx\, dy = \int_0^1 \left(\frac{3}{8}y + \frac{1}{2}y^2\right) dy = \frac{17}{48}.$$
From this, we also obtain
$$\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \frac{17}{48} - \frac{5}{8}\cdot\frac{7}{12} = -\frac{1}{96}.$$
$$\mathbf{R}_U = E[\mathbf{U}\mathbf{U}^T] = \begin{bmatrix} E[X^2] & E[XY] \\ E[YX] & E[Y^2] \end{bmatrix} = \begin{bmatrix} \dfrac{7}{15} & \dfrac{17}{48} \\[6pt] \dfrac{17}{48} & \dfrac{5}{12} \end{bmatrix}, \qquad \mathbf{C}_U = \begin{bmatrix} \mathrm{Var}(X) & \mathrm{Cov}(X, Y) \\ \mathrm{Cov}(Y, X) & \mathrm{Var}(Y) \end{bmatrix} = \begin{bmatrix} \dfrac{73}{960} & -\dfrac{1}{96} \\[6pt] -\dfrac{1}{96} & \dfrac{11}{144} \end{bmatrix}.$$
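The fractions above can be reproduced symbolically. Here is a minimal sketch assuming sympy is available; the helper moment() is only a convenience introduced for this illustration.

```python
# Sketch: reproducing the correlation and covariance matrices of U = [X, Y]^T
# for the joint PDF f(x, y) = 3/2*x**2 + y on the unit square.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.Rational(3, 2) * x**2 + y

def moment(g):
    """E[g(X, Y)] over the unit square, computed by double integration."""
    return sp.integrate(sp.integrate(g * f, (x, 0, 1)), (y, 0, 1))

EX, EY = moment(x), moment(y)                                # 5/8, 7/12
EX2, EY2, EXY = moment(x**2), moment(y**2), moment(x * y)    # 7/15, 5/12, 17/48

R_U = sp.Matrix([[EX2, EXY], [EXY, EY2]])
C_U = sp.Matrix([[EX2 - EX**2, EXY - EX * EY],
                 [EXY - EX * EY, EY2 - EY**2]])
print(R_U)   # Matrix([[7/15, 17/48], [17/48, 5/12]])
print(C_U)   # Matrix([[73/960, -1/96], [-1/96, 11/144]])
```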