
LectureSlides#3

Chapter#5

Pairs of Random Variables


Overview
• Two Random Variables
• Pairs of Discrete Random Variables (Joint pmf)
• The Joint cdf of X and Y
• The Joint pdf of Two Continuous Random Variables
• Independence of Two Random Variables
• Joint Moments and Expected Values of a Function of Two Random Variables
• Conditional Probability (Discrete and Continuous)
• Functions of Two Random Variables
• One Function of Two Random Variables
• Two Functions of Two Random Variables (Closed-Form Formula)
5.1 Two Random Variables
Notion: a mapping from the outcomes of a random experiment to pairs of random
variables; the range becomes R².

Note that the bold capital letter X represents a random “vector”.

Example 5.1 Let a random experiment consist of randomly
selecting a student’s name. Let ζ denote the outcome of this
experiment, and define the following two functions:
H(ζ) = height of student ζ in centimeters
W(ζ) = weight of student ζ in kilograms
Then (H(ζ), W(ζ)) assigns a pair of numbers to each ζ in S. The
event B = {H ≤ 183, W ≤ 82} represents the students satisfying these
conditions and is a subset of the sample space.
5.1 Two Random Variables
Example 5.2
Web page advertisements:
ζ: pattern of user arrivals in T seconds.
N1(ζ): number of Web page requests, each bringing 0.001 cent.
N2(ζ): number of times the ad is chosen, each bringing 1 cent.
(N1(ζ), N2(ζ)): assignment of a pair of nonnegative integers to each ζ ∈ S.
Find the event “revenue in T seconds is less than 100 dollars.”

Since 100 dollars = 10,000 cents,

B = {0.001 N1 + N2 < 10,000}


Example#5.4
Representation of a Pair of RVs
The events involving a pair of random variables (X, Y) are specified
by conditions, represented by regions in the 2-D plane.
Figure 5.2 shows the following events:
{X + Y ≤ 10}
{min(X, Y) ≤ 5} = {X ≤ 5} ∪ {Y ≤ 5}
{X² + Y² ≤ 100}
Representation of a Pair of RVs
Scattergram
• A scattergram can be used to deduce the joint behavior of two random variables.
• A scattergram simply places a dot at every observation pair (x, y) that results
from performing the experiment that generates (X, Y).
• Figure 5.3 shows the scattergrams for 200 observations of four different pairs of
random variables:

(a) Appears to be uniformly distributed in the unit square.

(b) Appears to be confined to a disc of unit radius and concentrated near the unit-radius boundary.
Scattergram

(c) Appears to be concentrated near the origin, but not confined to an enclosed region.
(d) Appears to be concentrated near the origin, with a linear relationship between x and y:
large values of x tend to be accompanied by proportionally larger values of y. (One plausible way to generate such data is sketched below.)
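
As a sketch of how such scattergrams can be produced, the Python snippet below generates four synthetic datasets matching the qualitative descriptions in (a)-(d). The specific generators (uniform square, noisy unit circle, isotropic Gaussian, correlated Gaussian) are illustrative assumptions, not the actual data behind Figure 5.3.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 200

# (a) uniformly distributed in the unit square
xa, ya = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
# (b) concentrated near the circle of unit radius
theta = rng.uniform(0, 2 * np.pi, n)
r = 1 + 0.05 * rng.standard_normal(n)
xb, yb = r * np.cos(theta), r * np.sin(theta)
# (c) concentrated near the origin, not confined to a bounded region
xc, yc = rng.standard_normal(n), rng.standard_normal(n)
# (d) linearly related: large x tends to come with large y
xd = rng.standard_normal(n)
yd = xd + 0.5 * rng.standard_normal(n)

fig, axes = plt.subplots(2, 2, figsize=(8, 8))
datasets = [(xa, ya, "(a)"), (xb, yb, "(b)"), (xc, yc, "(c)"), (xd, yd, "(d)")]
for ax, (xs, ys, title) in zip(axes.flat, datasets):
    ax.scatter(xs, ys, s=8)          # one dot per observation pair (x, y)
    ax.set_title(title)
plt.show()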
Representation of a Pair of RVs
To determine the probability that the pair X = (X, Y) is in some
region B of the plane, find the equivalent event A in the underlying sample space; then

P[X ∈ B] = P[A] = P[{ζ : (X(ζ), Y(ζ)) ∈ B}]

The joint pmf, cdf, and pdf are all specified through PRODUCT-FORM events,
i.e., intersections of one-dimensional events:

B = {X ∈ A1} ∩ {Y ∈ A2}
P[B] = P[{X ∈ A1} ∩ {Y ∈ A2}] ≜ P[X ∈ A1, Y ∈ A2]

Figure 5.4 shows some two-dimensional PRODUCT-FORM events.


Representation of a Pair of RVs

{X < −x1} ∩ {y1 ≤ Y ≤ y2}


5.2 Pairs of Discrete Random Variables
Let X(ζ) = (X(ζ), Y(ζ)) assume values from some countable set
SX,Y = {(xj, yk), j = 1, 2, . . . , k = 1, 2, . . .}. The joint probability mass function
of X specifies the probabilities of the event {X = x} ∩ {Y = y}:

pX,Y(x, y) = P[{X = x} ∩ {Y = y}] = P[X = x, Y = y]

The probability of any event B is the sum of the pmf over the
outcomes in B:

P[X ∈ B] = Σ((xj, yk) ∈ B) pX,Y(xj, yk)

When the event B is the entire sample space SX,Y, we obtain the condition
for a valid joint pmf:

Σj Σk pX,Y(xj, yk) = 1
Example 5.6
A random experiment consists of tossing two “loaded” (biased) dice and
noting the pair of numbers (X, Y) facing up. The joint pmf pX,Y(j, k) for
j = 1, . . . , 6 and k = 1, . . . , 6 is given by the two-dimensional table shown
in Figure 5.6 (textbook, page 252): the diagonal entries equal 2/42 and the
off-diagonal entries equal 1/42. Find P[min(X, Y) = 3].
Marginal Probability Mass Function

The marginal probability pX(xj) can be obtained from the relative
frequency of the event {X = xj}, which is found by adding the relative
frequencies of all outcome pairs in which xj appears:

pX(xj) = Σk pX,Y(xj, yk),  pY(yk) = Σj pX,Y(xj, yk)

It is impossible to recover the joint pmf, pX,Y(x, y), from the marginal
pmfs, pX(x) and pY(y), alone, unless X and Y are independent.
Example 5.8
Find the marginal pmfs in the loaded dice experiment of Example 5.6.

The probability that X = j is found by summing over the values in column j. For example,

P[X = 1] = 2/42 + 1/42 + 1/42 + 1/42 + 1/42 + 1/42 = 7/42 = 1/6

Similarly, the probability that Y = k is found by summing over the values in row k.
As a result, P[X = j] = 1/6 for j = 1, . . . , 6, and P[Y = k] = 1/6 for k = 1, . . . , 6.
When isolated, each die appears to be fair, since each face is equiprobable; from these
marginal pmfs we cannot determine that the dice are loaded. (A numerical check is sketched below.)
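
As a check, the joint pmf described above (2/42 on the diagonal of the table, 1/42 elsewhere) can be tabulated numerically; the sketch below verifies the marginals and also answers Example 5.6.

import numpy as np

# Joint pmf of the loaded dice: 2/42 on the diagonal, 1/42 off-diagonal
pmf = np.full((6, 6), 1 / 42)
np.fill_diagonal(pmf, 2 / 42)
assert np.isclose(pmf.sum(), 1.0)           # condition for a valid joint pmf

print(pmf.sum(axis=1))                      # marginal P[X = j] = 1/6 for all j
print(pmf.sum(axis=0))                      # marginal P[Y = k] = 1/6 for all k

# Example 5.6: P[min(X, Y) = 3]; array index 2 corresponds to face value 3
j, k = np.indices(pmf.shape)
print(pmf[np.minimum(j, k) == 2].sum())     # 8/42 ≈ 0.1905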
5.3 The Joint cdf of X and Y
The joint cumulative distribution function of X and Y is defined as
the probability of the event {X ≤ x1} ∩ {Y ≤ y1}:

FX,Y(x1, y1) = P[X ≤ x1, Y ≤ y1]

Probability of a rectangle-shaped region:

P[x1 < X ≤ x2, y1 < Y ≤ y2] = FX,Y(x2, y2) − FX,Y(x1, y2) − FX,Y(x2, y1) + FX,Y(x1, y1)


5.3 The Marginal cdf of X and Y

FX(x) = FX,Y(x, ∞) = P[X ≤ x],  FY(y) = FX,Y(∞, y) = P[Y ≤ y]

5.3 The Joint cdf of X and Y

The rectangle formula also yields strip probabilities. With y1 unbounded:

P[x1 < X ≤ x2, Y ≤ y2] = FX,Y(x2, y2) − FX,Y(x1, y2)

Using the rectangle formula with x2 → ∞:

P[X > x1, y1 < Y ≤ y2] = FX,Y(∞, y2) − FX,Y(∞, y1) − FX,Y(x1, y2) + FX,Y(x1, y1)
                       = FY(y2) − FY(y1) − FX,Y(x1, y2) + FX,Y(x1, y1)


Example#5.11
Example 5.12
The joint cdf for the pair of random variables X = (X, Y) is given by

FX,Y(x, y) = (1 − e^(−αx))(1 − e^(−βy)),  x ≥ 0, y ≥ 0

(and 0 otherwise). Find the marginal cdfs.

Solution

FX(x) = lim(y→∞) FX,Y(x, y) = 1 − e^(−αx),  x ≥ 0
FY(y) = lim(x→∞) FX,Y(x, y) = 1 − e^(−βy),  y ≥ 0

X and Y individually have exponential distributions with parameters α and
β, respectively.
Example 5.13
For the cdf of Example 5.12, find the probability of the events
A = {X ≤ 1, Y ≤ 1}
B = {X > x, Y > y}, where x > 0 and y > 0
D = {1 < X ≤ 2, 2 < Y ≤ 5}
A worked sketch follows.
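
A worked sketch, applying the marginal and rectangle formulas of Section 5.3 to the cdf of Example 5.12:

\begin{align*}
P[A] &= F_{X,Y}(1,1) = (1 - e^{-\alpha})(1 - e^{-\beta}) \\
P[B] &= 1 - F_X(x) - F_Y(y) + F_{X,Y}(x,y) = e^{-\alpha x} e^{-\beta y} \\
P[D] &= F_{X,Y}(2,5) - F_{X,Y}(1,5) - F_{X,Y}(2,2) + F_{X,Y}(1,2) \\
     &= (e^{-\alpha} - e^{-2\alpha})(e^{-2\beta} - e^{-5\beta})
\end{align*}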
5.4 Joint pdf of Two Continuous RVs
The joint cdf allows us to compute the probabilities of events that correspond
to “rectangular” shapes in the plane.
The probability of an event corresponding to a non-rectangular region is
approximated by a sum of disjoint rectangles of infinitesimal width.
Figure 5.12 shows two such events, approximated by rectangles of infinitesimal width:

A = {X + Y ≤ 1},  B = {X² + Y² ≤ 1}
Joint pdf
• If the cdf is sufficiently smooth, the probability of each
rectangle can be expressed in terms of a density function:

P[x < X ≤ x + Δx, y < Y ≤ y + Δy] ≈ fX,Y(x, y) Δx Δy,  as Δx and Δy → 0

where fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y.
Joint pdf
Probability of a rectangular region:

P[x1 < X ≤ x2, y1 < Y ≤ y2] = ∫(x1 to x2) ∫(y1 to y2) fX,Y(x′, y′) dy′ dx′

Condition for a valid pdf:

fX,Y(x, y) ≥ 0, and the integral of fX,Y over the entire plane equals 1.

Marginal pdfs are obtained by taking the derivative of the corresponding marginal cdfs:

fX(x) = d/dx FX(x) = ∫ fX,Y(x, y′) dy′,  fY(y) = d/dy FY(y) = ∫ fX,Y(x′, y) dx′
Example 5.16
Find the normalization constant c and the marginal pdfs for

fX,Y(x, y) = c e^(−x) e^(−y),  0 ≤ y ≤ x < ∞  (0 elsewhere)

Region-of-Support (ROS): the wedge 0 ≤ y ≤ x.
Example 5.16 (continued)
Example 5.17
Find P[X + Y ≤ 1] in Example 5.16. A numerical check follows.
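
A minimal numerical sketch using SciPy, assuming the pdf stated above; it confirms c = 2 and evaluates P[X + Y ≤ 1].

import numpy as np
from scipy.integrate import dblquad

# dblquad integrates f(y, x) for x in [a, b] and y in [g(x), h(x)]
f = lambda y, x: 2.0 * np.exp(-x) * np.exp(-y)   # joint pdf on 0 <= y <= x

# Total probability over the region of support must equal 1 (confirms c = 2)
total, _ = dblquad(f, 0, np.inf, 0, lambda x: x)
print(total)                                     # ~1.0

# P[X + Y <= 1]: inside the wedge, x + y <= 1 means y <= min(x, 1 - x)
p, _ = dblquad(f, 0, 1, 0, lambda x: np.minimum(x, 1 - x))
print(p)                                         # ~0.2642, i.e. 1 - 2/e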
Example#5.18, Jointly Gaussian R.V
The joint pdf of X and Y is given below (the zero-mean, unit-variance case with correlation coefficient ρ):

fX,Y(x, y) = (1 / (2π √(1 − ρ²))) exp{ −(x² − 2ρxy + y²) / (2(1 − ρ²)) }

Example#5.18, Jointly Gaussian R.V
The marginal pdf of X is found by integrating the joint pdf over y; completing
the square in the exponent shows that X is Gaussian with zero mean and unit variance.
5.5 Independence of Two RVs
Consider the probability of the event A = AX ∩ AY, where AX = {X = xj}
and AY = {Y = yk}.
X and Y are independent random variables if any event AX defined in
terms of X is independent of any event AY defined in terms of Y:

P[AX ∩ AY] = P[AX] P[AY]

which implies that in the discrete case, for all xj and all yk,

pX,Y(xj, yk) = P[X = xj, Y = yk] = P[X = xj] P[Y = yk] = pX(xj) pY(yk)

If X and Y are independent discrete RVs, then the joint pmf is equal to
the product of the marginal pmfs.

More precisely: discrete RVs X and Y are independent if and only if
the joint pmf is equal to the product of the marginal pmfs.
Independence of Continuous RVs
The RVs X and Y are independent if and only if the joint cdf is equal
to the product of the marginal cdfs:

FX,Y(x, y) = FX(x) FY(y)

If X and Y are jointly continuous, then X and Y are independent if and
only if the joint pdf is equal to the product of the marginal pdfs:

fX,Y(x, y) = fX(x) fY(y)

If X and Y are independent RVs, then the RVs defined by any pair of
functions g(X) and h(Y) are also independent. Letting AX = {x : g(x) ∈ Ag(X)}
and AY = {y : h(y) ∈ Ah(Y)},

P[g(X) ∈ Ag(X), h(Y) ∈ Ah(Y)] = P[X ∈ AX, Y ∈ AY]
                              = P[X ∈ AX] P[Y ∈ AY]
                              = P[g(X) ∈ Ag(X)] P[h(Y) ∈ Ah(Y)]
Examples
5.19 Does the pmf in Example 5.6 correspond to independent tosses of two
fair dice?
No. For the two tosses to be independent, every entry of the joint pmf
would have to equal (1/6)(1/6) = 1/36. However, Figure 5.6 shows that the
diagonal entries are 2/42 and the off-diagonal entries are 1/42.

5.21 Are the random variables X and Y in Example 5.16 independent?

No. The joint pdf appears to be factorized, but because of the region of
support 0 ≤ y ≤ x it is not the product of the marginal pdfs.

5.23 Are the random variables X and Y in Example 5.12 independent?

Yes:

FX(x) FY(y) = (1 − e^(−αx))(1 − e^(−βy)) = FX,Y(x, y)
5.6.1 Expected Values of a Function of Two RVs
Key issues:
1. How do X and Y vary together?
2. Are the variations of X and Y correlated?
3. If X increases, does Y tend to increase or decrease, or neither?
4. The joint moments of X and Y provide this information.

Expected value of a function of two random variables: the expected value of
Z = g(X, Y) can be found by

E[Z] = ∫∫ g(x, y) fX,Y(x, y) dx dy   (X, Y jointly continuous)
E[Z] = Σj Σk g(xj, yk) pX,Y(xj, yk)   (X, Y discrete)
Examples
5.24 Sum of Random Variables: let Z = X + Y. Find E[Z].

E[Z] = E[X + Y] = E[X] + E[Y]   (whether or not X and Y are independent)

5.25 Product of Functions of “Independent” Random Variables: suppose that X
and Y are independent RVs, and let g(X, Y) = g1(X) g2(Y). Find E[g(X, Y)].

E[g(X, Y)] = E[g1(X) g2(Y)] = E[g1(X)] E[g2(Y)]
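
A minimal Monte Carlo sketch of both identities; the particular distributions of X and Y (exponential and uniform) are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.exponential(scale=2.0, size=n)   # E[X] = 2
y = rng.uniform(0.0, 1.0, size=n)        # E[Y] = 0.5, generated independently of X

# Example 5.24: E[X + Y] = E[X] + E[Y]
print((x + y).mean(), x.mean() + y.mean())

# Example 5.25: E[g1(X) g2(Y)] = E[g1(X)] E[g2(Y)] for independent X, Y
g1, g2 = np.sqrt(x), np.cos(y)
print((g1 * g2).mean(), g1.mean() * g2.mean())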


5.6.2 Joint Moments
The “joint” moments summarize the “joint” behavior.
The (j, k)-th joint moment of X and Y is defined by

E[X^j Y^k] = ∫∫ x^j y^k fX,Y(x, y) dx dy   (jointly continuous case)

Supplement: in general, the shape of the probability distribution of a
random variable can be described by its statistical moments. Some
of the important statistical moments of a scalar random variable:
1. 1st-order moment: mX = E[X] (mean)
2. 2nd-order central moment: σX² = E[(X − mX)²] (variance)
3. 3rd-order central moment: E[(X − mX)³] (related to skewness)
4. 4th-order central moment: E[(X − mX)⁴] (related to kurtosis, i.e., peakedness)
Correlation and Covariance
Correlation: E[XY], the joint moment of X and Y with j = k = 1. If E[XY] = 0,
X and Y are said to be orthogonal.
Covariance: the central joint moment of X and Y with j = k = 1:

COV(X, Y) = E[(X − mX)(Y − mY)]
          = E[XY − mY X − mX Y + mX mY]
          = E[XY] − mX mY

If X and Y are independent random variables, then

E[XY] = E[X] E[Y]
COV(X, Y) = 0
Covariance
• Covariance can be used to measure the correlation between X and Y.
• The covariance measures the joint deviation from mX = E[X] and mY = E[Y].
• If a positive value of (X − mX) tends to be accompanied by a positive
value of (Y − mY) (and likewise for negative values), then (X − mX)(Y − mY)
tends to be positive, and its expected value COV(X, Y) will be positive.
• The scattergram then shows observed points clustering
along a line of positive slope.
Covariance
• If (X − mX) and (Y − mY) tend to have opposite signs, then COV(X, Y)
will be negative.
• The scattergram in this case shows observation points clustering
along a line of negative slope.

• If (X − mX) and (Y − mY) sometimes have the same sign and sometimes
opposite signs, then COV(X, Y) will be close to zero.
Covariance
Correlation Coefficient
Definition: the value of the covariance is influenced by the scales (magnitudes) of
the random variables, but the correlation coefficient of X and Y is not:

ρX,Y = COV(X, Y) / (σX σY) = (E[XY] − mX mY) / (σX σY)

Properties:
1. −1 ≤ ρX,Y ≤ 1
2. If Y = aX + b (linearly related), then
ρX,Y = 1 if a > 0 and ρX,Y = −1 if a < 0.
3. X and Y are said to be uncorrelated if ρX,Y = 0.
4. If X and Y are independent, then ρX,Y = 0.
Correlation Coefficient
Y = aX + b,  E[Y] = aE[X] + b,  Var(Y) = a² Var(X)

E[XY] = E[X(aX + b)] = aE[X²] + bE[X]

COV(X, Y) = aE[X²] + bE[X] − a(E[X])² − bE[X] = a Var(X)

ρX,Y = COV(X, Y) / √(Var(X) Var(Y)) = a Var(X) / √(a² (Var(X))²) = a / |a|

so ρX,Y = 1 if a > 0 and ρX,Y = −1 if a < 0.
Correlation Coefficient
• The closer the correlation coefficient is to one, the stronger the positive correlation.
• The closer the correlation coefficient is to minus one, the stronger the negative
correlation.
• The closer the correlation coefficient is to zero, the weaker the correlation.

Correlation measures the degree of linearity between X and Y.

Matlab Simulation
Covariance estimate:

cov(x, y) = (1/n) Σ(i = 1 to n) (xi − X̄)(yi − Ȳ)

Mean estimate (Matlab function: mean(x) or mean(y)):

X̄ = (1/n) Σ(i = 1 to n) xi

Variance estimate (Matlab function: var(x) or var(y)):

var(X) = (1/n) Σ(i = 1 to n) (xi − X̄)²

Correlation coefficient (Matlab function: corr(x,y)).
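
For readers without Matlab, an equivalent NumPy sketch of these estimators; the sums are written out explicitly to match the slides' 1/n convention (NumPy's np.cov defaults to the 1/(n−1) convention instead).

import numpy as np

def mean_est(x):
    return x.sum() / len(x)                         # sample mean

def var_est(x):
    return ((x - mean_est(x)) ** 2).sum() / len(x)  # 1/n variance estimate

def cov_est(x, y):
    return ((x - mean_est(x)) * (y - mean_est(y))).sum() / len(x)

def corr_est(x, y):
    return cov_est(x, y) / np.sqrt(var_est(x) * var_est(y))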
Simulation#1
X ~ U(0, 20),  Y = 1.2X + ε,  ε ~ N(0, σ = 2)

Scattergram, colored by sign[(xi − X̄)(yi − Ȳ)] (blue: positive, red: negative).
Covariance = 37.0478, Correlation = 0.9616
Simulation#2
X ~ U(0, 20),  Y = 0.1X + ε,  ε ~ N(0, σ = 2)

Scattergram, colored by sign[(xi − X̄)(yi − Ȳ)] (blue: positive, red: negative).
Covariance = 3.1558, Correlation = 0.2702
Simulation#3
X ~ U(0, 20),  Y = −2X + ε,  ε ~ N(0, σ = 2)

Scattergram, colored by sign[(xi − X̄)(yi − Ȳ)] (blue: positive, red: negative).
Covariance = −62.5934, Correlation = −0.9851
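
A sketch reproducing the three simulations; the sample size is an assumption (the slides do not state it), so the estimates will differ slightly from the numbers quoted above.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
n = 1000                                     # assumed sample size

for slope in (1.2, 0.1, -2.0):               # Simulations 1, 2, 3
    x = rng.uniform(0, 20, n)                # X ~ U(0, 20)
    y = slope * x + rng.normal(0, 2, n)      # Y = slope*X + eps, eps ~ N(0, sigma=2)
    c = ((x - x.mean()) * (y - y.mean())).mean()    # 1/n covariance estimate
    r = c / (x.std() * y.std())                     # correlation coefficient estimate
    s = np.sign((x - x.mean()) * (y - y.mean()))
    plt.figure()
    plt.scatter(x[s > 0], y[s > 0], s=8, c="blue")  # positive deviation products
    plt.scatter(x[s < 0], y[s < 0], s=8, c="red")   # negative deviation products
    plt.title(f"slope {slope}: cov = {c:.4f}, corr = {r:.4f}")
plt.show()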
Example 5.28
Let X and Y be the random variables in Example 5.16 (with c = 2).

Find E[XY], COV(X, Y), and ρX,Y, given

E[X] = 3/2, VAR[X] = 5/4
E[Y] = 1/2, VAR[Y] = 1/4

A numerical evaluation is sketched below.
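
A numerical sketch, assuming the Example 5.16 pdf fX,Y(x, y) = 2e^(−x−y) on 0 ≤ y ≤ x; the hand calculation gives E[XY] = 1, COV(X, Y) = 1/4, and ρX,Y = 1/√5 ≈ 0.447.

import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: 2.0 * np.exp(-x - y)            # joint pdf on 0 <= y <= x

# E[XY] by integrating x*y*f over the region of support
exy, _ = dblquad(lambda y, x: x * y * f(y, x), 0, np.inf, 0, lambda x: x)
cov = exy - (3 / 2) * (1 / 2)                    # COV(X,Y) = E[XY] - E[X]E[Y]
rho = cov / np.sqrt((5 / 4) * (1 / 4))           # rho = COV / (sigma_X sigma_Y)
print(exy, cov, rho)                             # ~1.0, ~0.25, ~0.447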
Example: Discrete R.V
5.7.1 Conditional Probability
Experiments in the real world are usually not independent, so
analyzing the statistical dependency between two random variables is
important for describing their joint behavior.

Definition of conditional probability: the probability that Y is in A,
given that we know that X = x, is

P[Y ∈ A | X = x] = P[Y ∈ A, X = x] / P[X = x],  for P[X = x] > 0

Case 1: discrete RVs X, Y. The conditional pmf of Y given X = x is

pY(y|x) = P[Y = y | X = x] = P[Y = y, X = x] / P[X = x] = pX,Y(x, y) / pX(x)

Equivalently,

pX,Y(x, y) = pY(y|x) pX(x) = pX(x|y) pY(y)


Properties of Conditional pmf
1. If P[X = x] = 0, we define pY(y|x) = 0, since there is no such event.
2. The conditional pmf also satisfies all the properties of a pmf:

pY(yk|x) ≥ 0 and Σk pY(yk|x) = 1

3. If X and Y are independent, then

pY(y|x) = P[Y = y, X = x] / P[X = x] = P[Y = y] P[X = x] / P[X = x] = P[Y = y] = pY(y)

Verbally, knowing that X = xk does not affect the probability of events
involving Y.

4. The probability of an event A given X = xk is found by

P[Y ∈ A | X = xk] = Σ(y ∈ A) pY(y|xk)


Example 5.29, Loaded Dice Experiment

Marginal pmfs: P[X = 1] = P[X = 2] = · · · = P[X = 6] = 1/6.

Joint pmf: as in Figure 5.6 (2/42 on the diagonal, 1/42 elsewhere).

pY(5|5) = pX,Y(5, 5) / pX(5) = (2/42) / (1/6) = 2/7
Case 2: Conditional pdf of a Continuous RV
If X is continuous, P[X = x] = 0, so we first derive the conditional cdf of Y
given X = x. For a very short interval h,

FY(y|x) = lim(h→0) P[Y ≤ y | x < X ≤ x + h]

and differentiating with respect to y gives the conditional pdf

fY(y|x) = fX,Y(x, y) / fX(x)
Properties of Conditional pdf
1. It satisfies the multiplication (Bayes) rule:

fX,Y(x, y) = fY(y|x) fX(x) = fX(x|y) fY(y)

2. If X and Y are independent, then

fX,Y(x, y) = fX(x) fY(y) ⇔ fY(y|x) = fY(y) ⇔ fX(x|y) = fX(x)

FX,Y(x, y) = FX(x) FY(y) ⇔ FY(y|x) = FY(y) ⇔ FX(x|y) = FX(x)

3. The probability of an event A given X = x is obtained by integrating
the conditional pdf:

P[Y ∈ A | X = x] = ∫(A) fY(y|x) dy
Example 5.32
Let X and Y be the RVs in Example 5.16.
Find fX(x|y) and fY(y|x). A worked sketch follows.
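
A worked sketch, assuming the Example 5.16 pdf f_{X,Y}(x,y) = 2e^{-x}e^{-y} on 0 ≤ y ≤ x, whose marginals are f_X(x) = 2e^{-x}(1 - e^{-x}) and f_Y(y) = 2e^{-2y}:

\begin{align*}
f_X(x \mid y) &= \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{2e^{-x}e^{-y}}{2e^{-2y}} = e^{-(x-y)}, \quad x \ge y \\
f_Y(y \mid x) &= \frac{f_{X,Y}(x,y)}{f_X(x)} = \frac{2e^{-x}e^{-y}}{2e^{-x}(1-e^{-x})} = \frac{e^{-y}}{1-e^{-x}}, \quad 0 \le y \le x
\end{align*}

Given Y = y, the excess X − y is exponential with rate 1; given X = x, Y is an exponential truncated to [0, x].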
5.8.1 Functions of Two RVs
The cdf of Z = g(X, Y) is found from the probability of the event
Rz = {Z ≤ z} = {(x, y) : g(x, y) ≤ z}:

FZ(z) = P[(X, Y) ∈ Rz] = ∫∫(Rz) fX,Y(x, y) dx dy

If X and Y are independent, fX,Y(x, y) factors as fX(x) fY(y) inside this integral, as used below.


Example: Z = X + Y. Find fZ(z).

FZ(z) = P[X + Y ≤ z] = ∫(y = −∞ to ∞) ∫(x = −∞ to z − y) fX,Y(x, y) dx dy,

since the region of the xy-plane where x + y ≤ z is the shaded area in
Fig. 5.1, to the left of the line x + y = z.
Integrating over a horizontal strip along the x-axis first
(inner integral), then sliding that strip along the y-axis
from −∞ to +∞ (outer integral), we cover the entire shaded area.

[Fig. 5.1: the half-plane x + y ≤ z, bounded on the right by the line x + y = z]
We can find fZ(z) by differentiating FZ(z) directly. In this
context, it is useful to recall the differentiation rule due to
Leibniz. Suppose

H(z) = ∫(a(z) to b(z)) h(x, z) dx.

Then

dH(z)/dz = h(b(z), z) db(z)/dz − h(a(z), z) da(z)/dz + ∫(a(z) to b(z)) ∂h(x, z)/∂z dx.

Applying this to the inner integral, with b(z) = z − y, a(z) = −∞, and ∂fX,Y(x, y)/∂z = 0:

fZ(z) = ∫(−∞ to ∞) [ d/dz ∫(−∞ to z − y) fX,Y(x, y) dx ] dy = ∫(−∞ to ∞) fX,Y(z − y, y) dy.
Alternatively, the integration can be carried out first along
the y-axis and then along the x-axis. In that case

FZ(z) = ∫(x = −∞ to ∞) ∫(y = −∞ to z − x) fX,Y(x, y) dy dx,

and differentiation gives

fZ(z) = dFZ(z)/dz = ∫(−∞ to ∞) [ d/dz ∫(−∞ to z − x) fX,Y(x, y) dy ] dx = ∫(−∞ to ∞) fX,Y(x, z − x) dx.

If X and Y are independent, then fX,Y(x, y) = fX(x) fY(y), and

fZ(z) = ∫(−∞ to ∞) fX(z − y) fY(y) dy = ∫(−∞ to ∞) fX(x) fY(z − x) dx,

i.e., the pdf of the sum of independent RVs is the convolution of their pdfs.
Example#5.4.1

Let T1 and T2 be the lifetimes of two electronic components; the total system lifetime
is T = T1 + T2, where we assume T1 and T2 are independent. The pdf of T needs to be calculated.
Assume T1 and T2 are exponentially distributed lifetimes with the same mean (λ). A numerical sketch follows.
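
A minimal numerical sketch (the mean λ = 2 is an arbitrary choice): discretizing the convolution formula should reproduce the Erlang-2 density t e^(−t/λ)/λ².

import numpy as np

lam = 2.0                                  # assumed common mean of T1 and T2
t = np.linspace(0, 20, 401)
dx = t[1] - t[0]
f = np.exp(-t / lam) / lam                 # exponential pdf with mean lam

# fT(t) = integral of f(x) f(t - x) dx, approximated on the grid
fT = np.convolve(f, f)[: len(t)] * dx

# Compare against the closed-form Erlang-2 density t e^{-t/lam} / lam^2
err = np.max(np.abs(fT - t * np.exp(-t / lam) / lam**2))
print(err)                                 # small; shrinks as the grid is refined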
Example: Let Z = X − Y. Determine its pdf fZ(z).
Solution:

FZ(z) = P[X − Y ≤ z] = ∫(y = −∞ to ∞) ∫(x = −∞ to z + y) fX,Y(x, y) dx dy,

and hence

fZ(z) = dFZ(z)/dz = ∫(−∞ to ∞) [ d/dz ∫(−∞ to z + y) fX,Y(x, y) dx ] dy = ∫(−∞ to ∞) fX,Y(y + z, y) dy.

If X and Y are independent, then the above formula reduces to

fZ(z) = ∫(−∞ to ∞) fX(z + y) fY(y) dy,

the convolution of fX(z) with fY(−z).

[Figure: the region x − y ≤ z lies to the left of the line x = y + z]
Two Functions of Two Random Variables
Suppose X and Y are two random variables with joint pdf fX,Y(x, y).
Given two functions g(x, y) and h(x, y), define the new random variables

Z = g(X, Y)
W = h(X, Y).

How does one determine their joint pdf fZ,W(z, w)?

Obviously, with fZ,W(z, w) in hand, the marginal pdfs fZ(z) and fW(w)
can be easily determined.

FZ,W(z, w) = P[Z(ζ) ≤ z, W(ζ) ≤ w] = P[g(X, Y) ≤ z, h(X, Y) ≤ w]
           = P[(X, Y) ∈ Dz,w] = ∫∫((x, y) ∈ Dz,w) fX,Y(x, y) dx dy,

where Dz,w is the region of the xy-plane in which the inequalities
g(x, y) ≤ z and h(x, y) ≤ w are simultaneously satisfied.

[Figure: the region Dz,w in the xy-plane]
Solving g(x, y) = z, h(x, y) = w for the solution pairs (xi, yi), the joint pdf is

fZ,W(z, w) = Σi |J(z, w)| fX,Y(xi, yi) = Σi fX,Y(xi, yi) / |J(xi, yi)|,

where, writing the inverse transformation as x = g1(z, w) and y = h1(z, w),

J(z, w) = det [ ∂g1/∂z  ∂g1/∂w ]
              [ ∂h1/∂z  ∂h1/∂w ]

and

|J(z, w)| = 1 / |J(xi, yi)|,

where J(xi, yi) represents the Jacobian of the original transformation, given by

J(x, y) = det [ ∂g/∂x  ∂g/∂y ]
              [ ∂h/∂x  ∂h/∂y ]   evaluated at (x, y) = (xi, yi).
Example: Suppose X and Y are zero-mean independent
Gaussian RVs with common variance σ².
Define Z = √(X² + Y²) and W = tan⁻¹(Y/X), where |w| ≤ π/2.
Obtain fZ,W(z, w).

fX,Y(x, y) = (1 / (2πσ²)) e^(−(x² + y²)/(2σ²)).

Since z = g(x, y) = √(x² + y²) and w = h(x, y) = tan⁻¹(y/x) with |w| ≤ π/2,
if (x1, y1) is a solution pair, so is (−x1, −y1).

y/x = tan w, or y = x tan w.

Substituting this into z, we get

z = √(x² + y²) = |x| √(1 + tan² w) = |x| sec w, or x = ±z cos w,

and

y = x tan w = ±z sin w.

Thus there are two solution sets:

x1 = z cos w, y1 = z sin w;  x2 = −z cos w, y2 = −z sin w.

To obtain J(z, w):

J(z, w) = det [ ∂x/∂z  ∂x/∂w ]  = det [ cos w  −z sin w ]  = z cos²w + z sin²w = z,
              [ ∂y/∂z  ∂y/∂w ]        [ sin w   z cos w ]

so that |J(z, w)| = z.
We can also compute J(x, y):

J(x, y) = det [ x/√(x² + y²)   y/√(x² + y²) ]  = x²/(x² + y²)^(3/2) + y²/(x² + y²)^(3/2) = 1/√(x² + y²) = 1/z.
              [ −y/(x² + y²)   x/(x² + y²)  ]

Notice that |J(z, w)| = 1/|J(xi, yi)|. Thus

fZ,W(z, w) = z [ fX,Y(x1, y1) + fX,Y(x2, y2) ] = (z/(πσ²)) e^(−z²/(2σ²)),  0 ≤ z < ∞, |w| < π/2,

which shows that Z, with marginal fZ(z) = (z/σ²) e^(−z²/(2σ²)), represents a Rayleigh RV with
parameter σ², and that W is uniform on (−π/2, π/2), independent of Z. A simulation sketch follows.
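
A Monte Carlo sketch of this result, with σ = 1 as an assumed value: sample X and Y, form Z and W, and compare the histogram of Z against the Rayleigh density derived above.

import numpy as np
import matplotlib.pyplot as plt

sigma = 1.0
rng = np.random.default_rng(3)
x = rng.normal(0, sigma, 200_000)
y = rng.normal(0, sigma, 200_000)

z = np.hypot(x, y)                          # Z = sqrt(X^2 + Y^2)
w = np.arctan(y / x)                        # W in (-pi/2, pi/2)

grid = np.linspace(0, 4 * sigma, 200)
plt.hist(z, bins=100, density=True, alpha=0.5, label="Z samples")
plt.plot(grid, (grid / sigma**2) * np.exp(-grid**2 / (2 * sigma**2)),
         label="Rayleigh pdf")              # marginal f_Z derived above
plt.legend()
plt.show()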


Example: Let X and Y be independent exponential random
variables with common parameter λ. Define U = X + Y, V = X − Y.
Find the joint and marginal pdfs of U and V.
Solution: It is given that

fX,Y(x, y) = (1/λ²) e^(−(x + y)/λ),  x ≥ 0, y ≥ 0.

Now, since u = x + y and v = x − y, we always have |v| ≤ u, and there is only
one solution, given by

x = (u + v)/2,  y = (u − v)/2.

Moreover, the Jacobian of the transformation is given by

J(u, v) = det [ 1   1 ]  = −1 − 1 = −2,  so |J(u, v)| = 2,
              [ 1  −1 ]

and therefore

fU,V(u, v) = fX,Y((u + v)/2, (u − v)/2) / |J(u, v)| = (1/(2λ²)) e^(−u/λ),  0 ≤ |v| ≤ u < ∞.

Integrating out v and u in turn gives the marginals:

fU(u) = ∫(−u to u) fU,V(u, v) dv = (u/λ²) e^(−u/λ),  u ≥ 0   (an Erlang-2 density)
fV(v) = ∫(|v| to ∞) fU,V(u, v) du = (1/(2λ)) e^(−|v|/λ)      (a Laplace density)

(A Monte Carlo check follows.)
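
A quick Monte Carlo check (λ = 1.5 is an arbitrary choice) that U behaves like an Erlang-2 variable and V like a Laplace variable.

import numpy as np

lam = 1.5
rng = np.random.default_rng(5)
x = rng.exponential(lam, 500_000)
y = rng.exponential(lam, 500_000)
u, v = x + y, x - y

assert np.all(np.abs(v) <= u)              # |V| <= U always holds
print(u.mean(), 2 * lam)                   # Erlang-2 mean is 2*lambda
print(np.abs(v).mean(), lam)               # E|V| = lambda for this Laplace pdf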
Example#5.45

Assume X and Y are jointly Gaussian random variables, and let V and W be obtained from
X and Y by a linear transformation. Find the joint pdf of V and W.

For the inverse transformation, the Jacobian is

J(x, y) = det [ 1/2  −1/2 ]  = 1/4 + 1/4 = 1/2.
              [ 1/2   1/2 ]
Pairs of jointly Gaussian R.V
The random variables X and Y are said to be jointly Gaussian if
their joint pdf has the form

fX,Y(x, y) = (1 / (2π σ1 σ2 √(1 − ρ²))) exp{ −[ ((x − m1)/σ1)² − 2ρ((x − m1)/σ1)((y − m2)/σ2) + ((y − m2)/σ2)² ] / (2(1 − ρ²)) }

where ρ = σXY/(σ1 σ2). The pdf is centered at the point (m1, m2) and has a bell shape
that depends on the values of σ1, σ2, and σXY.
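
As a sketch of how such pairs can be generated in practice, the standard Cholesky construction is shown below; all parameter values are illustrative assumptions.

import numpy as np

m1, m2 = 1.0, -2.0                          # assumed means
s1, s2, rho = 2.0, 0.5, 0.8                 # assumed sigma_1, sigma_2, rho

cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])    # covariance matrix
L = np.linalg.cholesky(cov)                 # cov = L @ L.T

rng = np.random.default_rng(9)
zs = rng.standard_normal((2, 100_000))      # independent N(0, 1) pairs
xy = L @ zs + np.array([[m1], [m2]])        # jointly Gaussian samples

print(np.corrcoef(xy)[0, 1])                # ~0.8, the prescribed rho
print(xy.mean(axis=1))                      # ~(m1, m2), the center of the pdf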
