7 Multiple Random Variables

University of Benghazi

Faculty of Engineering
Electrical and Electronics Engineering Department

Probability and Random Process Course


EE277

Salma Elkawafi
[email protected]



Goals
Understand the following:
More Than Two Random Variables
• Joint Distributions
• Independence
• Independent and identically distributed (i.i.d.) RVs
• Sums of Random Variables
Random Vectors
• Expectation
• Correlation
• Covariance
More Than Two Random Variables
Joint Distributions
For three or more random variables, the joint PDF, joint PMF, and joint CDF are defined in a similar way to what we have already seen for the case of two random variables. Let $X_1, X_2, \ldots, X_n$ be $n$ discrete random variables. The joint PMF of $X_1, X_2, \ldots, X_n$ is defined as

$$p(x_1, x_2, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n).$$

For $n$ jointly continuous random variables $X_1, X_2, \ldots, X_n$, the joint PDF is defined to be the function $f(x_1, x_2, \ldots, x_n)$ such that the probability of any set $A \subset \mathbb{R}^n$ is given by the integral of the PDF over the set $A$. In particular, for a set $A \subset \mathbb{R}^n$, we can write:

$$P\big((X_1, X_2, \ldots, X_n) \in A\big) = \int \cdots \int_A f(x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n.$$
More Than Two Random Variables
Example:
Let $X$, $Y$, and $Z$ be three jointly continuous random variables with joint PDF

• Find the constant c.


• Find the marginal PDF of X
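The joint PDF itself does not appear in this copy of the example, so the SymPy sketch below uses an assumed density $f(x, y, z) = c\,(x + 2y + 3z)$ on the unit cube purely as a stand-in; the density is hypothetical, but the two steps (find the normalizing constant, then integrate out the other variables) are exactly what the example asks for.

```python
# A minimal SymPy sketch of the two steps above, using an ASSUMED
# joint PDF f(x, y, z) = c*(x + 2y + 3z) on 0 < x, y, z < 1
# (the original example's PDF is not shown in this copy).
import sympy as sp

x, y, z, c = sp.symbols('x y z c', positive=True)
f = c * (x + 2*y + 3*z)

# Step 1: find c so that the PDF integrates to 1 over the unit cube.
total = sp.integrate(f, (x, 0, 1), (y, 0, 1), (z, 0, 1))
c_val = sp.solve(sp.Eq(total, 1), c)[0]          # -> 1/3

# Step 2: marginal PDF of X, obtained by integrating out y and z.
f_X = sp.integrate(f.subs(c, c_val), (y, 0, 1), (z, 0, 1))
print(c_val, sp.simplify(f_X))                   # -> 1/3, x/3 + 5/6
```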
More Than Two Random Variables
Joint Distributions

The marginal PDF of $X_i$ can be obtained by integrating out all the other $X_j$'s. For example,

$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n)\, dx_2 \cdots dx_n.$$

The joint CDF of $n$ random variables $X_1, X_2, \ldots, X_n$ is defined as

$$F(x_1, x_2, \ldots, x_n) = P(X_1 \leq x_1, X_2 \leq x_2, \ldots, X_n \leq x_n).$$
More Than Two Random Variables
Independence:
The idea of independence is exactly the same as what we have seen before. We restate it here in terms of the joint PMF, joint PDF, and joint CDF. Random variables $X_1, X_2, \ldots, X_n$ are independent if, for all $(x_1, x_2, \ldots, x_n) \in \mathbb{R}^n$,

$$F(x_1, x_2, \ldots, x_n) = F_{X_1}(x_1)\, F_{X_2}(x_2) \cdots F_{X_n}(x_n).$$

If $X_1, X_2, \ldots, X_n$ are discrete and independent, then we have

$$p(x_1, x_2, \ldots, x_n) = p_{X_1}(x_1)\, p_{X_2}(x_2) \cdots p_{X_n}(x_n).$$

If $X_1, X_2, \ldots, X_n$ are continuous and independent, we have

$$f(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1)\, f_{X_2}(x_2) \cdots f_{X_n}(x_n).$$

Expected value: for independent random variables, the expectation of the product factors,

$$E[X_1 X_2 \cdots X_n] = E[X_1]\, E[X_2] \cdots E[X_n].$$
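As a quick sanity check (a sketch, not part of the original slides), a minimal NumPy simulation: for independent random variables, the expectation of the product should match the product of the expectations.

```python
# Monte Carlo check: for independent RVs, E[X1*X2*X3] factors.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.exponential(scale=2.0, size=n)   # three independent draws
x2 = rng.uniform(0.0, 1.0, size=n)
x3 = rng.normal(loc=3.0, scale=1.0, size=n)

lhs = np.mean(x1 * x2 * x3)               # E[X1 X2 X3]
rhs = x1.mean() * x2.mean() * x3.mean()   # E[X1] E[X2] E[X3]
print(lhs, rhs)                           # both close to 2 * 0.5 * 3 = 3
```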
More Than Two Random Variables
Independent and identically distributed (i.i.d.) Random Variables:

Random variables $X_1, X_2, \ldots, X_n$ are said to be independent and identically distributed (i.i.d.) if they are independent and they have the same marginal distributions:

$$F_{X_1}(x) = F_{X_2}(x) = \cdots = F_{X_n}(x), \quad \text{for all } x \in \mathbb{R}.$$

Then

$$E[X_1 X_2 \cdots X_n] = E[X_1]\, E[X_2] \cdots E[X_n] = \big(E[X_1]\big)^n.$$
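A minimal sketch along the same lines: draw i.i.d. samples and check that $E[X_1 X_2 \cdots X_n]$ matches $(E[X_1])^n$.

```python
# Sketch: for i.i.d. RVs, E[X1*X2*...*Xn] = E[X1]**n.
import numpy as np

rng = np.random.default_rng(1)
n_vars, n_samples = 4, 1_000_000
# Each row is one realization of (X1, ..., X4), i.i.d. Uniform(0, 2).
samples = rng.uniform(0.0, 2.0, size=(n_samples, n_vars))

print(samples.prod(axis=1).mean())       # ~ E[X1]**4 = 1**4 = 1
print(samples[:, 0].mean() ** n_vars)    # (E[X1])**4, should match
```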
More Than Two Random Variables
Sums of Random Variables
In many applications, we need to work with a sum of several random variables.
In particular, we might need to study a random variable $Y$ given by

$$Y = X_1 + X_2 + \cdots + X_n.$$

Then the expectation is

$$E[Y] = E[X_1] + E[X_2] + \cdots + E[X_n],$$

and the variance is

$$\sigma_Y^2 = \mathrm{Var}(Y) = \mathrm{Var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \mathrm{Var}(X_i) + 2 \sum_{i<j} \mathrm{Cov}(X_i, X_j).$$

For example,

$$\mathrm{Var}(X_1 + X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + 2\,\mathrm{Cov}(X_1, X_2).$$

Note: If the $X_i$'s are independent, then $\mathrm{Cov}(X_i, X_j) = 0$ for $i \neq j$, so the variance of the sum is simply the sum of the variances.


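A short NumPy sketch of the variance formula: for correlated components, the variance of the sum must include the covariance terms, which the simulation below checks against the closed form.

```python
# Sketch: Var(sum X_i) = sum Var(X_i) + 2 * sum_{i<j} Cov(X_i, X_j),
# checked on correlated Gaussian samples with a known covariance.
import numpy as np

rng = np.random.default_rng(2)
cov = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])
xs = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=500_000)

var_sum = xs.sum(axis=1).var()
formula = cov.trace() + 2 * (cov[0, 1] + cov[0, 2] + cov[1, 2])
print(var_sum, formula)   # both close to 4.5 + 2.0 = 6.5
```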
More Than Two Random Variables
Sums of Random Variables
Example:
𝑁 people sit around a round table, where 𝑁 > 5. Each person tosses a coin. Anyone whose
outcome is different from his/her two neighbors will receive a present. Let 𝑋 be the number of
people who receive presents. Find $E[X]$.
Solution:
Number the $N$ people from 1 to $N$. Let $X_i$ be the indicator random variable for the $i$th person, that is, $X_i = 1$ if the $i$th person receives a present and $X_i = 0$ otherwise. Then

$$X = X_1 + X_2 + \cdots + X_N,$$
$$E[X] = E[X_1] + E[X_2] + \cdots + E[X_N] = N \cdot E[X_1] \quad \text{(by symmetry)}.$$

Person 1 receives a present exactly when their outcome differs from both neighbors, i.e., when the triple (left neighbor, person 1, right neighbor) is HTH or THT, each of which has probability $\frac{1}{8}$:

$$E[X_1] = P(X_1 = 1) = \frac{1}{8} + \frac{1}{8} = \frac{1}{4},$$

so

$$E[X] = \frac{N}{4}.$$
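A Monte Carlo check of this result (a sketch, with $N = 8$ chosen arbitrarily): simulate the table many times, count presents, and compare the average with $N/4 = 2$.

```python
# Monte Carlo check of the round-table example: with N = 8 people,
# the expected number of presents should be N/4 = 2.
import numpy as np

rng = np.random.default_rng(3)
N, trials = 8, 200_000
coins = rng.integers(0, 2, size=(trials, N))   # 0 = T, 1 = H
left = np.roll(coins, 1, axis=1)               # left neighbor's coin
right = np.roll(coins, -1, axis=1)             # right neighbor's coin
gets_present = (coins != left) & (coins != right)
print(gets_present.sum(axis=1).mean())         # ~ 2.0
```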
Random Vector
When dealing with multiple random variables, it is sometimes useful to use vector and matrix notation. When we have $n$ random variables $X_1, X_2, \ldots, X_n$, we can put them in a (column) vector $\mathbf{X}$:

$$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}, \quad \mathbf{X}^T = [X_1, X_2, \ldots, X_n].$$

We call $\mathbf{X}$ a random vector.
The CDF of the random vector $\mathbf{X}$ is defined as

$$F_{\mathbf{X}}(\mathbf{x}) = P(X_1 \leq x_1, X_2 \leq x_2, \ldots, X_n \leq x_n).$$

The PDF of the random vector $\mathbf{X}$ is defined as

$$f_{\mathbf{X}}(\mathbf{x}) = \frac{\partial^n}{\partial x_1\, \partial x_2 \cdots \partial x_n} F_{\mathbf{X}}(\mathbf{x}).$$
Random Vector
Expectation:
The expected value vector, or mean vector, of the random vector $\mathbf{X}$ is defined as

$$E[\mathbf{X}] = \begin{bmatrix} E[X_1] \\ E[X_2] \\ \vdots \\ E[X_n] \end{bmatrix}.$$

Similarly, we can have an $m$ by $n$ random matrix $\mathbf{M}$:

$$\mathbf{M} = \begin{bmatrix} X_{11} & X_{12} & \cdots & X_{1n} \\ X_{21} & X_{22} & \cdots & X_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ X_{m1} & X_{m2} & \cdots & X_{mn} \end{bmatrix}.$$

The mean matrix of $\mathbf{M}$ is given by

$$E[\mathbf{M}] = \begin{bmatrix} E[X_{11}] & E[X_{12}] & \cdots & E[X_{1n}] \\ E[X_{21}] & E[X_{22}] & \cdots & E[X_{2n}] \\ \vdots & \vdots & \ddots & \vdots \\ E[X_{m1}] & E[X_{m2}] & \cdots & E[X_{mn}] \end{bmatrix}.$$
Random Vector
Expectation:
Linearity of expectation is also valid for random vectors and matrices. In particular, let 𝑿
be an n-dimensional random vector and the random vector Y be defined as
$$\mathbf{Y} = A\mathbf{X} + b \quad \text{(linear transformation)},$$

where $A$ is a fixed (non-random) $m$ by $n$ matrix and $b$ is a fixed $m$-dimensional vector. Then we have

$$E[\mathbf{Y}] = A\,E[\mathbf{X}] + b.$$

Also, if $\mathbf{X}_1, \mathbf{X}_2, \ldots, \mathbf{X}_k$ are $n$-dimensional random vectors, then we have

$$E[\mathbf{X}_1 + \mathbf{X}_2 + \cdots + \mathbf{X}_k] = E[\mathbf{X}_1] + E[\mathbf{X}_2] + \cdots + E[\mathbf{X}_k].$$
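A minimal NumPy sketch of $E[\mathbf{Y}] = A\,E[\mathbf{X}] + b$, using an arbitrary fixed $A$ and $b$ (chosen here for illustration only).

```python
# Sketch: linearity of expectation for random vectors, E[AX + b] = A E[X] + b.
import numpy as np

rng = np.random.default_rng(4)
A = np.array([[1.0, 2.0, 0.0],
              [0.5, -1.0, 3.0]])          # fixed 2x3 matrix
b = np.array([1.0, -2.0])                 # fixed 2-vector

X = rng.normal(loc=[1.0, 2.0, 3.0], scale=1.0, size=(100_000, 3))
Y = X @ A.T + b                           # apply Y = AX + b to each sample

print(Y.mean(axis=0))                     # empirical E[Y]
print(A @ X.mean(axis=0) + b)             # A E[X] + b -- should match
```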
Random Vector
Correlation and Covariance Matrix
For a random vector $\mathbf{X}$, we define the correlation matrix $R_{\mathbf{X}}$ as

$$R_{\mathbf{X}} = E[\mathbf{X}\mathbf{X}^T] = E\begin{bmatrix} X_1^2 & X_1 X_2 & \cdots & X_1 X_n \\ X_2 X_1 & X_2^2 & \cdots & X_2 X_n \\ \vdots & \vdots & \ddots & \vdots \\ X_n X_1 & X_n X_2 & \cdots & X_n^2 \end{bmatrix} = \begin{bmatrix} E[X_1^2] & E[X_1 X_2] & \cdots & E[X_1 X_n] \\ E[X_2 X_1] & E[X_2^2] & \cdots & E[X_2 X_n] \\ \vdots & \vdots & \ddots & \vdots \\ E[X_n X_1] & E[X_n X_2] & \cdots & E[X_n^2] \end{bmatrix}.$$

The covariance matrix $C_{\mathbf{X}}$ is defined as

$$C_{\mathbf{X}} = E\big[(\mathbf{X} - E[\mathbf{X}])(\mathbf{X} - E[\mathbf{X}])^T\big] = \begin{bmatrix} \mathrm{Var}(X_1) & \mathrm{Cov}(X_1, X_2) & \cdots & \mathrm{Cov}(X_1, X_n) \\ \mathrm{Cov}(X_2, X_1) & \mathrm{Var}(X_2) & \cdots & \mathrm{Cov}(X_2, X_n) \\ \vdots & \vdots & \ddots & \vdots \\ \mathrm{Cov}(X_n, X_1) & \mathrm{Cov}(X_n, X_2) & \cdots & \mathrm{Var}(X_n) \end{bmatrix}.$$

If $\mathbf{Y}$ is defined by the linear transformation $\mathbf{Y} = A\mathbf{X} + b$, then

$$C_{\mathbf{Y}} = A\, C_{\mathbf{X}}\, A^T.$$

If we have two random vectors $\mathbf{X}$ and $\mathbf{Y}$, we can define the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ as

$$R_{\mathbf{XY}} = E[\mathbf{X}\mathbf{Y}^T].$$

Also, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is

$$C_{\mathbf{XY}} = E\big[(\mathbf{X} - E[\mathbf{X}])(\mathbf{Y} - E[\mathbf{Y}])^T\big].$$
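A short NumPy sketch of $C_{\mathbf{Y}} = A\, C_{\mathbf{X}}\, A^T$: draw correlated samples with a known covariance, apply a fixed linear map, and compare the empirical covariance of $\mathbf{Y}$ with the closed form.

```python
# Sketch: the covariance matrix transforms as C_Y = A C_X A^T under Y = AX + b.
import numpy as np

rng = np.random.default_rng(5)
C_X = np.array([[2.0, 0.4],
                [0.4, 1.0]])              # chosen covariance of X
A = np.array([[1.0, 1.0],
              [2.0, -1.0]])               # fixed linear map
b = np.array([3.0, -1.0])                 # shift; does not affect covariance

X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C_X, size=500_000)
Y = X @ A.T + b

print(np.cov(Y, rowvar=False))            # empirical C_Y
print(A @ C_X @ A.T)                      # theoretical A C_X A^T -- should match
```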
Random Vector
Examples
Let $X$ and $Y$ be two jointly continuous random variables with joint PDF

$$f(x, y) = \frac{3}{2}x^2 + y, \quad \text{for } 0 < x, y < 1, \text{ and } 0 \text{ otherwise},$$

and let the random vector $\mathbf{U}$ be defined as

$$\mathbf{U} = \begin{bmatrix} X \\ Y \end{bmatrix}.$$

Find the correlation and covariance matrices of $\mathbf{U}$.
Solution

$$R_{\mathbf{U}} = E[\mathbf{U}\mathbf{U}^T] = \begin{bmatrix} E[X^2] & E[XY] \\ E[YX] & E[Y^2] \end{bmatrix}, \quad C_{\mathbf{U}} = \begin{bmatrix} \mathrm{Var}(X) & \mathrm{Cov}(X, Y) \\ \mathrm{Cov}(Y, X) & \mathrm{Var}(Y) \end{bmatrix}.$$

The marginal PDFs are

$$f_X(x) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dy = \frac{3}{2}x^2 + \frac{1}{2}, \quad \text{for } 0 < x < 1,$$

$$f_Y(y) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dx = y + \frac{1}{2}, \quad \text{for } 0 < y < 1.$$
Random Vector
Examples: Cont.
Let $X$ and $Y$ be two jointly continuous random variables with joint PDF $f(x, y) = \frac{3}{2}x^2 + y$ for $0 < x, y < 1$ and $0$ otherwise, and let the random vector $\mathbf{U}$ be defined as $\mathbf{U} = [X, Y]^T$. Find the correlation and covariance matrices of $\mathbf{U}$.
Solution

$$f_X(x) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dy = \frac{3}{2}x^2 + \frac{1}{2} \;\; (0 < x < 1), \quad f_Y(y) = \int_0^1 \left(\frac{3}{2}x^2 + y\right) dx = y + \frac{1}{2} \;\; (0 < y < 1).$$

From these, we obtain $E[X] = \frac{5}{8}$, $E[X^2] = \frac{7}{15}$, $E[Y] = \frac{7}{12}$, and $E[Y^2] = \frac{5}{12}$. We also need $E[XY]$:

$$E[XY] = \int_0^1 \int_0^1 xy\left(\frac{3}{2}x^2 + y\right) dx\, dy = \int_0^1 \left(\frac{3}{8}y + \frac{1}{2}y^2\right) dy = \frac{17}{48}.$$

From this, we also obtain

$$\mathrm{Cov}(X, Y) = E[XY] - E[X]E[Y] = \frac{17}{48} - \frac{5}{8}\cdot\frac{7}{12} = -\frac{1}{96}.$$

$$R_{\mathbf{U}} = E[\mathbf{U}\mathbf{U}^T] = \begin{bmatrix} E[X^2] & E[XY] \\ E[YX] & E[Y^2] \end{bmatrix} = \begin{bmatrix} \frac{7}{15} & \frac{17}{48} \\ \frac{17}{48} & \frac{5}{12} \end{bmatrix}, \quad C_{\mathbf{U}} = \begin{bmatrix} \mathrm{Var}(X) & \mathrm{Cov}(X, Y) \\ \mathrm{Cov}(Y, X) & \mathrm{Var}(Y) \end{bmatrix} = \begin{bmatrix} \frac{73}{960} & -\frac{1}{96} \\ -\frac{1}{96} & \frac{11}{144} \end{bmatrix}.$$
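The moments above can also be verified symbolically; a minimal SymPy sketch:

```python
# Sketch: verifying the worked example's moments with SymPy.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.Rational(3, 2) * x**2 + y          # joint PDF on 0 < x, y < 1

def E(g):
    """Expectation of g(X, Y) under the joint PDF above."""
    return sp.integrate(g * f, (x, 0, 1), (y, 0, 1))

EX, EY, EXY = E(x), E(y), E(x * y)
R_U = sp.Matrix([[E(x**2), EXY], [EXY, E(y**2)]])
C_U = sp.Matrix([[E(x**2) - EX**2, EXY - EX * EY],
                 [EXY - EX * EY, E(y**2) - EY**2]])
print(R_U)   # Matrix([[7/15, 17/48], [17/48, 5/12]])
print(C_U)   # Matrix([[73/960, -1/96], [-1/96, 11/144]])
```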
