LectSlides#3
Chapter#5
(c) Appears to be concentrated near the origin, but not confined to an enclosed region
(d) Appears to be concentrated near the origin, with a roughly linear relationship between x and y:
large values of x tend to be accompanied by proportionally large values of y
Representation of a Pair of RVs
To determine the probability that the pair X = (X, Y ) is in some
region B in the plane, find the equivalent event A, then:
P [X ∈ B] = P [A] = P [{ζ : (X(ζ), Y (ζ)) ∈ B}]
The probability of any event B is the sum of the pmf over the
outcomes in B:
Example 5.6
A random experiment consists of tossing two "loaded" (biased) dice and
noting the pair of numbers (X, Y) facing up. The joint pmf p_{X,Y}(j, k) for
j = 1, . . . , 6 and k = 1, . . . , 6 is given by the two-dimensional table shown
in Figure 5.6 (textbook, page 252). Find P[min(X, Y) = 3].
Marginal Probability Mass Function
The probability that X = j is found by summing over the values in column j:
$$p_X(j) = P[X = j] = \sum_{k=1}^{6} p_{X,Y}(j, k)$$
Similarly, the probability that Y = k is found by summing over the values in row k.
As a result, P[X = j] = 1/6 for j = 1, . . . , 6, and P[Y = k] = 1/6 for k = 1, . . . , 6. In isolation,
each die appears to be fair since each face is equiprobable; from the
marginal pmf's alone we therefore cannot determine that the dice are loaded.
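The result of Example 5.6 and the marginals above can be checked numerically. A minimal sketch in Python, assuming the Figure 5.6 pmf assigns 2/42 to each "doubles" outcome and 1/42 to every other pair (hypothetical values, chosen only to be consistent with the 1/6 marginals quoted above):

```python
import numpy as np

# Assumed joint pmf (not reproduced from Figure 5.6): 2/42 on the
# diagonal, 1/42 elsewhere, so every row and column sums to 1/6.
pmf = np.full((6, 6), 1 / 42)
np.fill_diagonal(pmf, 2 / 42)

# P[min(X, Y) = 3]: both faces >= 3 and at least one face equals 3.
# Index 2 corresponds to face 3.
j, k = np.indices(pmf.shape)
print(pmf[np.minimum(j, k) == 2].sum())   # 8/42 ~ 0.1905

# Marginals: column sums give P[X = j], row sums give P[Y = k].
print(pmf.sum(axis=0))                    # each entry 1/6
print(pmf.sum(axis=1))                    # each entry 1/6
```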
5.3 The Joint cdf of X and Y
The joint cumulative distribution function of X and Y is defined as
the probability of the event {X ≤ x1} ∩ {Y ≤ y1}:
Solution
$$F_X(x) = \lim_{y\to\infty} F_{X,Y}(x, y) = 1 - e^{-\alpha x}, \qquad x \ge 0$$
$$F_Y(y) = \lim_{x\to\infty} F_{X,Y}(x, y) = 1 - e^{-\beta y}, \qquad y \ge 0$$
$$A = \{X + Y \le 1\}, \qquad B = \{X^2 + Y^2 \le 1\}$$
Joint pdf
If the cdf is sufficiently smooth, the probability of each
rectangle can be expressed in terms of a density
function:
$$P[x < X \le x + \Delta x,\; y < Y \le y + \Delta y] \approx f_{X,Y}(x, y)\,\Delta x\,\Delta y \quad \text{as } \Delta x \text{ and } \Delta y \to 0$$
Joint pdf
Probability of a rectangular region:
$$P[a_1 < X \le b_1,\; a_2 < Y \le b_2] = \int_{a_2}^{b_2}\!\!\int_{a_1}^{b_1} f_{X,Y}(x, y)\,dx\,dy$$
The marginal pdf's are obtained by taking the derivative of the corresponding marginal cdf's:
$$f_X(x) = \frac{d}{dx} F_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y')\,dy'$$
Example 5.16
Find the normalization constant c and the marginal pdf's for
$$f_{X,Y}(x, y) = c\, e^{-x} e^{-y}, \qquad 0 \le y \le x < \infty$$
Region of Support (ROS): the wedge 0 ≤ y ≤ x.
Example 5.17
Find P[X + Y ≤ 1] for the pdf of Example 5.16.
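Examples 5.16-5.17 can be checked by numerical integration. A minimal sketch, assuming the pdf f_{X,Y}(x, y) = 2e^{-x}e^{-y} on 0 ≤ y ≤ x stated above:

```python
import numpy as np
from scipy.integrate import dblquad

# dblquad integrates func(inner, outer); here inner = x, outer = y.
f = lambda x, y: 2 * np.exp(-x) * np.exp(-y)

# Normalization over the region 0 <= y <= x < inf: should print ~1.0.
total, _ = dblquad(f, 0, np.inf, lambda y: y, lambda y: np.inf)
print(total)

# P[X + Y <= 1]: y runs over (0, 1/2), x over (y, 1 - y).
p, _ = dblquad(f, 0, 0.5, lambda y: y, lambda y: 1 - y)
print(p)   # ~0.2642, i.e. 1 - 2/e
```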
Example#5.18, Jointly Gaussian R.V
The joint pdf of X and Y is given below.
The marginal pdf of X is found by integrating the joint pdf over y.
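A worked sketch of that integration, assuming the zero-mean, unit-variance jointly Gaussian form with correlation coefficient ρ (the standard form of this example; the exact constants in the slide's figure are not reproduced here):
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\!\left( -\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)} \right)$$
Completing the square, $x^2 - 2\rho xy + y^2 = (y - \rho x)^2 + (1-\rho^2)x^2$, so
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy = \frac{e^{-x^2/2}}{2\pi\sqrt{1-\rho^2}} \int_{-\infty}^{\infty} e^{-(y-\rho x)^2/2(1-\rho^2)}\,dy = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2},$$
since the remaining integral is that of a Gaussian with variance $1-\rho^2$ and equals $\sqrt{2\pi(1-\rho^2)}$. The marginal of X is therefore N(0, 1).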
5.5 Independence of Two RVs
Consider the probability of the event A = A_X ∩ A_Y, where A_X = {X = x_j}
and A_Y = {Y = y_k}.
X and Y are independent random variables if any event A_X defined in
terms of X is independent of any event A_Y defined in terms of Y:
$$P[X \in A_X,\, Y \in A_Y] = P[X \in A_X]\, P[Y \in A_Y]$$
If X and Y are independent discrete RVs, then the joint pmf is equal to
the product of the marginal pmf's:
$$p_{X,Y}(x_j, y_k) = p_X(x_j)\, p_Y(y_k)$$
If X and Y are independent RVs, then the RVs defined by any pair of
functions g(X) and h(Y) are also independent.
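The factorization can be checked numerically on a discrete joint pmf. A minimal sketch, using a hypothetical pmf constructed as a product of marginals so that X and Y are independent by construction:

```python
import numpy as np

# Hypothetical marginals (rows of the joint pmf index x, columns index y).
px = np.array([0.2, 0.5, 0.3])
py = np.array([0.6, 0.4])
joint = np.outer(px, py)          # joint pmf of an independent pair

# Recover the marginals and test p(x, y) = pX(x) * pY(y).
px_hat = joint.sum(axis=1)
py_hat = joint.sum(axis=0)
print(np.allclose(joint, np.outer(px_hat, py_hat)))   # True -> independent
```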
If (X − m_X) and (Y − m_Y) tend to have the same sign for some outcomes and
opposite signs for others, the expected value of their product,
Cov(X, Y) = E[(X − m_X)(Y − m_Y)], will be close to zero.
Covariance
Correlation Coefficient
Definition: The value of the covariance is influenced by the scales (magnitudes) of
the random variables, but the correlation coefficient of X and Y is
not:
$$\rho_{X,Y} = \frac{\mathrm{COV}(X, Y)}{\sigma_X \sigma_Y} = \frac{E[XY] - m_X m_Y}{\sigma_X \sigma_Y}$$
Properties
1. −1 ≤ ρ_{X,Y} ≤ 1
2. If X and Y are linearly related, Y = aX + b, then
ρ_{X,Y} = 1 if a > 0 and ρ_{X,Y} = −1 if a < 0.
3. X and Y are said to be uncorrelated if ρ_{X,Y} = 0.
4. If X and Y are independent, then ρ_{X,Y} = 0 (but the converse does not hold in general).
Correlation Coefficient
$$Y = aX + b, \qquad E[Y] = aE[X] + b, \qquad \mathrm{Var}[Y] = a^2\,\mathrm{Var}[X]$$
$$E[XY] = E[X(aX + b)] = aE[X^2] + bE[X]$$
$$\rho_{X,Y} = \frac{aE[X^2] + bE[X] - a(E[X])^2 - bE[X]}{\sqrt{\mathrm{Var}[X]\,\mathrm{Var}[Y]}} = \frac{a\,\mathrm{Var}[X]}{|a|\,\mathrm{Var}[X]} = \frac{a}{|a|} = \mathrm{sign}(a),$$
confirming Property 2: ρ = +1 when a > 0 and ρ = −1 when a < 0.
Simulation#1
[Scattergram: points colored by sign[(x_i − x̄)(y_i − ȳ)]; blue: positive, red: negative]
Covariance = 37.0478, Correlation = 0.9616
Simulation#2
$$X \sim U(0, 20), \qquad Y = 0.1X + \varepsilon, \qquad \varepsilon \sim N(0, \sigma = 2)$$
[Scattergram: points colored by sign[(x_i − x̄)(y_i − ȳ)]; blue: positive, red: negative]
Covariance = 3.1558, Correlation = 0.2702
Simulation#3
$$X \sim U(0, 20), \qquad Y = -2X + \varepsilon, \qquad \varepsilon \sim N(0, \sigma = 2)$$
[Scattergram: points colored by sign[(x_i − x̄)(y_i − ȳ)]; blue: positive, red: negative]
Covariance = −62.5934, Correlation = −0.9851
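The three scattergrams can be reproduced, up to sampling variation, with a short simulation. A minimal sketch, assuming a sample size of n = 1000 (the slides do not state the actual value):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000   # assumed sample size

def simulate(a):
    """Sample X ~ U(0, 20) and Y = a*X + eps, eps ~ N(0, sigma=2), and
    return the sample covariance and correlation coefficient."""
    x = rng.uniform(0, 20, n)
    y = a * x + rng.normal(0, 2, n)
    cov = np.cov(x, y)[0, 1]
    rho = np.corrcoef(x, y)[0, 1]
    return cov, rho

# Slopes of Simulation#2 and Simulation#3; compare with the quoted
# Covariance/Correlation values (theory: Cov = a * Var[X] = a * 400/12).
for a in (0.1, -2.0):
    print(a, simulate(a))
```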
Example 5.28
Let X and Y be the random variables of Example 5.16. Then:
c = 2
E[X] = 3/2, VAR[X] = 5/4
E[Y] = 1/2, VAR[Y] = 1/4
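A worked sketch of the correlation coefficient for this pair, assuming the Example 5.16 pdf $f_{X,Y}(x, y) = 2e^{-x}e^{-y}$, $0 \le y \le x$, quoted earlier:
$$E[XY] = \int_0^{\infty}\!\!\int_y^{\infty} xy\, 2e^{-x}e^{-y}\,dx\,dy = 2\int_0^{\infty} (y^2 + y)\, e^{-2y}\,dy = 2\left(\tfrac{1}{4} + \tfrac{1}{4}\right) = 1,$$
using $\int_y^{\infty} x e^{-x}\,dx = (y+1)e^{-y}$. Hence
$$\mathrm{COV}(X, Y) = E[XY] - E[X]E[Y] = 1 - \tfrac{3}{2}\cdot\tfrac{1}{2} = \tfrac{1}{4}, \qquad \rho_{X,Y} = \frac{1/4}{\sqrt{(5/4)(1/4)}} = \frac{1}{\sqrt{5}} \approx 0.447.$$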
Example: Discrete R.V
5.7.1 Conditional Probability
Real-world experiments are not usually independent, so
analyzing the statistical dependence between two random variables is
important for describing their joint behavior.
$$P[Y \in A \mid X = x] = \frac{P[Y \in A,\, X = x]}{P[X = x]}, \qquad \text{for } P[X = x] > 0$$
For discrete RVs, the conditional pmf of Y given X = x is
$$p_Y(y \mid x) = P[Y = y \mid X = x] = \frac{P[Y = y,\, X = x]}{P[X = x]} = \frac{p_{X,Y}(x, y)}{p_X(x)}$$
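The conditional pmf formula is easy to apply numerically. A minimal sketch with a hypothetical 2×2 joint pmf:

```python
import numpy as np

# Hypothetical joint pmf: rows index the values of X, columns the values of Y.
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.40]])

p_x = p_xy.sum(axis=1)              # marginal pmf of X

# pY(y | x) = pXY(x, y) / pX(x): one conditional pmf per row (per x value).
p_y_given_x = p_xy / p_x[:, None]
print(p_y_given_x)                  # each row sums to 1
```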
Let Z = X + Y. Then
$$F_Z(z) = P[X + Y \le z] = \int_{y=-\infty}^{\infty} \int_{x=-\infty}^{z-y} f_{XY}(x, y)\,dx\,dy$$
[Fig. 5.1: the integration region x ≤ z − y in the (x, y) plane]
We can find f_Z(z) by differentiating F_Z(z) directly. In this
context, it is useful to recall the differentiation rule due to
Leibnitz. Suppose
$$H(z) = \int_{a(z)}^{b(z)} h(x, z)\,dx.$$
Then
$$\frac{dH(z)}{dz} = \frac{db(z)}{dz}\, h(b(z), z) - \frac{da(z)}{dz}\, h(a(z), z) + \int_{a(z)}^{b(z)} \frac{\partial h(x, z)}{\partial z}\,dx.$$
$$f_Z(z) = \int_{-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{-\infty}^{z-y} f_{XY}(x, y)\,dx \right) dy = \int_{-\infty}^{\infty} \big( f_{XY}(z - y, y) - 0 \big)\,dy = \int_{-\infty}^{\infty} f_{XY}(z - y, y)\,dy.$$
Alternatively, integrating over y first,
$$f_Z(z) = \frac{dF_Z(z)}{dz} = \int_{-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{-\infty}^{z-x} f_{XY}(x, y)\,dy \right) dx = \int_{-\infty}^{\infty} f_{XY}(x, z - x)\,dx.$$
If X and Y are independent, $f_{XY}(x, y) = f_X(x)\, f_Y(y)$, and
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\,dy = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\,dx,$$
i.e., the pdf of the sum of two independent RVs is the convolution of their individual pdf's.
Example#5.4.1
Let T1 and T2 be the lifetimes of two electronic components; the total system lifetime
is T = T1 + T2, where we assume T1 and T2 are independent. The pdf of T needs to be calculated.
Assume T1 and T2 are exponentially distributed with the same parameter λ; the convolution formula then gives
$$f_T(t) = \int_0^t \lambda e^{-\lambda s}\, \lambda e^{-\lambda (t-s)}\,ds = \lambda^2\, t\, e^{-\lambda t}, \qquad t \ge 0.$$
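The closed form can be checked against a direct numerical convolution. A minimal sketch, taking λ = 1 as the rate parameter of both lifetimes (an assumption; the slide's "mean (λ)" wording is ambiguous):

```python
import numpy as np

lam, dt = 1.0, 0.001
t = np.arange(0, 20, dt)
f = lam * np.exp(-lam * t)                 # Exp(lam) pdf of T1 and T2

# pdf of T = T1 + T2 as a Riemann-sum approximation of the convolution.
f_T = np.convolve(f, f)[: len(t)] * dt

# Closed form from the convolution integral: lam^2 * t * exp(-lam * t).
f_exact = lam**2 * t * np.exp(-lam * t)
print(np.max(np.abs(f_T - f_exact)))       # small discretization error
```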
Example: Let Z = X − Y. Determine its pdf f_Z(z).
Solution:
$$F_Z(z) = P[X - Y \le z] = \int_{y=-\infty}^{\infty} \int_{x=-\infty}^{z+y} f_{XY}(x, y)\,dx\,dy$$
and hence
$$f_Z(z) = \frac{dF_Z(z)}{dz} = \int_{-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{-\infty}^{z+y} f_{XY}(x, y)\,dx \right) dy = \int_{-\infty}^{\infty} f_{XY}(y + z, y)\,dy.$$
[Figure: the integration region x ≤ y + z in the (x, y) plane]
Two Functions of Two Random Variables
Suppose X and Y are two random variables with joint pdf $f_{XY}(x, y)$.
Given two functions g(x, y) and h(x, y), define the new random variables
$$Z = g(X, Y), \qquad W = h(X, Y).$$
How does one determine their joint pdf $f_{ZW}(z, w)$?
[Figure: the region $D_{z,w}$ in the (x, y) plane]
If $(x_i, y_i)$ are the solutions of $g(x, y) = z$, $h(x, y) = w$, and $x = g_1(z, w)$, $y = h_1(z, w)$ denotes the inverse transformation, then
$$f_{ZW}(z, w) = \sum_i |J(z, w)|\, f_{XY}(x_i, y_i) = \sum_i \frac{1}{|J(x_i, y_i)|}\, f_{XY}(x_i, y_i),$$
where
$$J(z, w) = \det \begin{pmatrix} \dfrac{\partial g_1}{\partial z} & \dfrac{\partial g_1}{\partial w} \\[6pt] \dfrac{\partial h_1}{\partial z} & \dfrac{\partial h_1}{\partial w} \end{pmatrix}, \qquad |J(z, w)| = \frac{1}{|J(x_i, y_i)|}.$$
$$f_{XY}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-(x^2 + y^2)/2\sigma^2}.$$
Since $z = g(x, y) = \sqrt{x^2 + y^2}$ and $w = h(x, y) = \tan^{-1}(y/x)$, $|w| \le \pi/2$, the inverse transformation is $x = z\cos w$, $y = z\sin w$, so that $|J(z, w)| = z$.
We can also compute J(x, y) directly:
$$J(x, y) = \det \begin{pmatrix} \dfrac{x}{\sqrt{x^2+y^2}} & \dfrac{y}{\sqrt{x^2+y^2}} \\[6pt] \dfrac{-y}{x^2+y^2} & \dfrac{x}{x^2+y^2} \end{pmatrix} = \frac{1}{\sqrt{x^2+y^2}} = \frac{1}{z}.$$
Notice that $|J(z, w)| = 1/|J(x_i, y_i)|$. The two solutions $\pm(x, y)$ contribute equally, so
$$f_{ZW}(z, w) = z\,\big[ f_{XY}(x_1, y_1) + f_{XY}(x_2, y_2) \big] = \frac{z}{\pi\sigma^2}\, e^{-z^2/2\sigma^2}, \qquad 0 \le z < \infty,\; |w| \le \pi/2.$$
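The Rayleigh form of the magnitude can be verified by Monte Carlo. A minimal sketch, assuming σ = 1; integrating f_ZW(z, w) over |w| ≤ π/2 gives the marginal f_Z(z) = (z/σ²) e^{−z²/2σ²}:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n = 1.0, 100_000

# Independent zero-mean Gaussians and the magnitude Z = sqrt(X^2 + Y^2).
x = rng.normal(0, sigma, n)
y = rng.normal(0, sigma, n)
z = np.hypot(x, y)

# Compare a normalized histogram of Z with the Rayleigh density.
hist, edges = np.histogram(z, bins=50, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
rayleigh = mids / sigma**2 * np.exp(-mids**2 / (2 * sigma**2))
print(np.max(np.abs(hist - rayleigh)))     # small for large n
```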
Thus, for independent unit-rate exponential RVs X and Y with $f_{XY}(x, y) = e^{-(x+y)}$ and the transformation $U = X + Y$, $V = X - Y$ (so that $e^{-(x+y)} = e^{-u}$ and $|J| = 2$), the same procedure gives
$$f_{UV}(u, v) = \frac{1}{2}\, e^{-u}, \qquad 0 \le |v| \le u < \infty.$$
Example#5.45
Let X and Y be jointly Gaussian random variables, and let V and W be obtained from
X and Y by a linear transformation. Find the joint pdf of V and W.
Inverse transformation:
$$J(x, y) = \det \begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix} = \frac{1}{2}.$$
Pairs of Jointly Gaussian R.V
The random variables X and Y are said to be jointly Gaussian if
their joint pdf has the form
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\!\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-m_1)^2}{\sigma_1^2} - \frac{2\rho(x-m_1)(y-m_2)}{\sigma_1\sigma_2} + \frac{(y-m_2)^2}{\sigma_2^2} \right] \right\},$$
where $\rho = \sigma_{XY}/(\sigma_1\sigma_2)$.
The pdf is centered at the point $(m_1, m_2)$ and has a bell shape
that depends on the values of $\sigma_1$, $\sigma_2$, and $\sigma_{XY}$.
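For concreteness, the pdf can be evaluated with scipy.stats.multivariate_normal. A minimal sketch with hypothetical values of m₁, m₂, σ₁, σ₂, σ_XY:

```python
from scipy.stats import multivariate_normal

# Hypothetical parameters; the covariance matrix must be positive definite.
m1, m2 = 1.0, -1.0
s1, s2, s_xy = 2.0, 1.0, 1.2

rv = multivariate_normal(mean=[m1, m2],
                         cov=[[s1**2, s_xy], [s_xy, s2**2]])

# The bell shape peaks at the centre (m1, m2) and falls off away from it;
# sigma1, sigma2, sigma_XY control the orientation and spread.
print(rv.pdf([m1, m2]))           # maximum of the pdf
print(rv.pdf([m1 + 1, m2 + 1]))   # smaller value away from the centre
```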