Unit 2B
BA/BSc II Semester
(STB 251)
by
Dr. Haseeb Athar, Department of Statistics & O.R., A.M.U., Aligarh, India
Two-Dimensional Random Variables
Definition
Let E be an experiment and S a sample space associated with E. Let X = X(s) and Y = Y(s)
be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is a
two-dimensional random variable.

[Diagram: each outcome s ∈ S is mapped to the real numbers X(s) and Y(s).]
Two-Dimensional Discrete Random Variable
If the possible values of (X, Y) are finite or countably infinite, then (X, Y) is called a
two-dimensional discrete random variable, with values
(X, Y) = (xᵢ, yⱼ), i = 1, 2, ..., n; j = 1, 2, ..., m
For example, suppose two unbiased dice are rolled together. If X is the number on the first
die and Y the number on the second die, then the set of possible values of (X, Y) is
(X, Y) ∈ {(1,1), (1,2), ..., (6,1), ..., (6,6)}
Two-Dimensional Continuous Random Variable
If (X, Y) can assume all values in some non-countable subset of the two-dimensional plane,
then (X, Y) is called a two-dimensional continuous random variable. For example, if (X, Y)
assumes all values in the rectangle {(x, y) | a ≤ x ≤ b, c ≤ y ≤ d}, then (X, Y) is a
two-dimensional continuous r.v.
Joint Probability Function or Joint Probability Mass Function
Let (X, Y) be a two-dimensional discrete r.v. Then for each pair of possible values
(xᵢ, yⱼ), i = 1, 2, ..., n; j = 1, 2, ..., m, there is associated a real number p(xᵢ, yⱼ)
representing P(X = xᵢ, Y = yⱼ), called the joint probability function of (X, Y), which
satisfies the following conditions:
i) p(xᵢ, yⱼ) ≥ 0 for every (xᵢ, yⱼ)
ii) ∑ᵢ ∑ⱼ p(xᵢ, yⱼ) = 1, the sum running over all i and j
The joint probability function of X and Y can be represented in tabular form as below:

   X \ Y |  y₁    y₂   ...  yⱼ   ...  yₘ   | Total
  -------+---------------------------------+------
    x₁   |  p₁₁   p₁₂  ...  p₁ⱼ  ...  p₁ₘ  |  p₁•
    ...  |  ...   ...       ...       ...  |  ...
    xₙ   |  pₙ₁   pₙ₂  ...  pₙⱼ  ...  pₙₘ  |  pₙ•
  -------+---------------------------------+------
  Total  |  p•₁   p•₂  ...  p•ⱼ  ...  p•ₘ  |   1

Here the row totals pᵢ• and the column totals p•ⱼ give the marginal probability functions
of X and Y respectively.
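As a minimal sketch, the fair-dice example above can be encoded as a joint pmf and checked against conditions (i) and (ii); the dictionary layout below is an arbitrary implementation choice, not notation from the notes.

```python
from fractions import Fraction

# Joint pmf of (X, Y) for two fair dice: p(i, j) = 1/36 for i, j = 1..6.
p = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Condition (i): nonnegativity; condition (ii): the entries sum to 1.
assert all(v >= 0 for v in p.values())
assert sum(p.values()) == 1

# Row and column totals give the marginal pmfs (the p_i• and p_•j entries).
p_x = {i: sum(p[i, j] for j in range(1, 7)) for i in range(1, 7)}
p_y = {j: sum(p[i, j] for i in range(1, 7)) for j in range(1, 7)}
assert p_x[1] == Fraction(1, 6) and p_y[6] == Fraction(1, 6)
```

Exact rational arithmetic avoids the rounding noise a floating-point sum of 36 terms would introduce.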
Marginal Probability Distributions
Continuous Case:
Let f(·, ·) be the joint pdf of the two-dimensional continuous random variable (X, Y). We
define g(·) and h(·), the marginal probability density functions of X and Y respectively, as
g(x) = ∫_{−∞}^{∞} f(x, y) dy
and h(y) = ∫_{−∞}^{∞} f(x, y) dx
Conditional Probability Function
Discrete Case: The conditional probability function of X given Y = yⱼ is defined as
P(X = xᵢ | Y = yⱼ) = P(X = xᵢ ∩ Y = yⱼ) / P(Y = yⱼ)
or P(X = xᵢ | Y = yⱼ) = P(X = xᵢ, Y = yⱼ) / P(Y = yⱼ) = p(xᵢ, yⱼ) / p(yⱼ)
Continuous Case
Let (X, Y) be a two-dimensional continuous r.v. with joint pdf f(x, y). Then the conditional
pdf of X given Y = y is given by
g(x | y) = f(x, y) / h(y), h(y) > 0,
and the conditional pdf of Y given X = x is
h(y | x) = f(x, y) / g(x), g(x) > 0.
Note that
∫_{−∞}^{∞} g(x | y) dx = (1/h(y)) ∫_{−∞}^{∞} f(x, y) dx = h(y)/h(y) = 1
Similarly, ∫_{−∞}^{∞} h(y | x) dy = (1/g(x)) ∫_{−∞}^{∞} f(x, y) dy = g(x)/g(x) = 1
Joint Distribution Function
The joint distribution function of (X, Y) is
F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du : Continuous Case
The marginal distribution function of X is given by
F(x) = lim_{y→∞} F(x, y) = P(X ≤ x, Y < ∞)
and the marginal distribution function of Y is given by
F(y) = lim_{x→∞} F(x, y) = P(X < ∞, Y ≤ y)
Independent Random Variables
Discrete Case: Let (X, Y) be a two-dimensional discrete r.v. Then X and Y are said to be
independent random variables if and only if
i) p(xᵢ, yⱼ) = p₁(xᵢ) p₂(yⱼ) for every i, j
or equivalently
ii) p₁(xᵢ | yⱼ) = p₁(xᵢ)
and p₂(yⱼ | xᵢ) = p₂(yⱼ) for every i, j
Proof: Since p₁(xᵢ | yⱼ) = p(xᵢ, yⱼ) / p₂(yⱼ),
p(xᵢ, yⱼ) = p₁(xᵢ | yⱼ) p₂(yⱼ)
Now p₁(xᵢ) = ∑ⱼ p(xᵢ, yⱼ)
= ∑ⱼ p₁(xᵢ | yⱼ) p₂(yⱼ)
= p₁(xᵢ | yⱼ) ∑ⱼ p₂(yⱼ)    [under independence, p₁(xᵢ | yⱼ) does not depend on j]
⇒ p₁(xᵢ) = p₁(xᵢ | yⱼ) for every i, j    [since ∑ⱼ p₂(yⱼ) = 1]
Similarly,
p₂(yⱼ) = p₂(yⱼ | xᵢ) for every i, j
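The factorisation condition (i) is easy to check mechanically. Below is a hedged sketch using a hypothetical two-point table constructed to be independent; the helper name `independent` and the sample values are our own, not from the notes.

```python
from fractions import Fraction
from itertools import product

# A hypothetical independent table: p(x, y) = p1(x) * p2(y) by construction.
p1 = {0: Fraction(1, 4), 1: Fraction(3, 4)}
p2 = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p = {(x, y): p1[x] * p2[y] for x, y in product(p1, p2)}

def independent(p, p1, p2):
    """True iff condition (i) holds: p factorises at every support point."""
    return all(p[x, y] == p1[x] * p2[y] for x, y in p)

assert independent(p, p1, p2)
assert sum(p.values()) == 1
```

Condition (ii) follows automatically here: for example, p(1, 0)/p2(0) equals p1(1).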
Continuous Case
Let (X, Y) be a two-dimensional continuous r.v. Then X and Y are said to be independent
random variables if and only if
i) f(x, y) = g(x) · h(y) ∀ x, y
or equivalently
ii) g(x | y) = g(x)
and h(y | x) = h(y) ∀ x, y
Proof: We have
g(x | y) = f(x, y) / h(y)
⇒ f(x, y) = g(x | y) h(y)
Now g(x) = ∫_{−∞}^{∞} f(x, y) dy
= ∫_{−∞}^{∞} g(x | y) h(y) dy
= g(x | y) ∫_{−∞}^{∞} h(y) dy    [under independence, g(x | y) does not depend on y]
⇒ g(x) = g(x | y) ∀ x, y
Similarly,
h(y) = h(y | x) ∀ x, y
Example 2.27: The joint probability distribution of (X, Y) is given in the following table:

   Y \ X |  0    1    2
  -------+--------------
     0   | 3k    k   2k
     1   |  k   2k    k

iv) Find the conditional distribution of Y given X = 1.
v) Are X and Y independent?
iv) The entries of the table sum to 10k, so k = 1/10 and P(X = 1) = k + 2k = 0.3.
Y = 0 : P(Y = 0 | X = 1) = P(X = 1, Y = 0) / P(X = 1) = 0.1 / 0.3 = 1/3
Y = 1 : P(Y = 1 | X = 1) = P(X = 1, Y = 1) / P(X = 1) = 0.2 / 0.3 = 2/3
Thus,
        Y          :   0     1
  P(Y = y | X = 1) :  1/3   2/3
v) From the table we can observe that p(x, y) ≠ p₁(x) p₂(y) for some x, y; for example,
p(0, 0) = 0.3 while p₁(0) p₂(0) = 0.4 × 0.6 = 0.24. Hence X and Y are not independent.
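The calculations of Example 2.27 can be verified with exact rational arithmetic; this is a verification sketch, with k = 1/10 coming from the requirement that the ten k's in the table sum to one.

```python
from fractions import Fraction

k = Fraction(1, 10)
# Table of Example 2.27, keyed as (x, y).
p = {(0, 0): 3*k, (1, 0): k, (2, 0): 2*k,
     (0, 1): k,   (1, 1): 2*k, (2, 1): k}
assert sum(p.values()) == 1

p_x1 = p[1, 0] + p[1, 1]                    # P(X = 1) = 3k = 0.3
cond = {y: p[1, y] / p_x1 for y in (0, 1)}  # P(Y = y | X = 1)
assert cond[0] == Fraction(1, 3) and cond[1] == Fraction(2, 3)

# Independence fails: p(0,0) = 3/10 but p1(0) p2(0) = (4/10)(6/10) = 24/100.
p1_0 = p[0, 0] + p[0, 1]
p2_0 = p[0, 0] + p[1, 0] + p[2, 0]
assert p[0, 0] != p1_0 * p2_0
```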
Practice Problem 2.11: A two-dimensional r.v. (X, Y) has the bivariate distribution given by
p(x, y) = k(x² + y); x = 0, 1, 2, 3; y = 0, 1
Determine the value of k. Hence find the marginal distributions of X and Y.
Practice Problem 2.12: A two-dimensional r.v. (X, Y) has joint pmf given by
P(X = x, Y = y) = (1/27)(2x + y); x = 0, 1, 2; y = 0, 1, 2
Find the conditional distribution of Y given X = x.
Example 2.28: The joint pdf of (X, Y) is f(x, y) = kxy, 0 ≤ x ≤ 2; 0 ≤ y ≤ 2. Find the
value of k, and hence find (i) P(X ≤ 1 ∩ Y ≤ 1), (ii) P(X ≤ 1 | Y ≤ 1), (iii) the marginal
pdfs of X and Y, (iv) the conditional pdf of X given Y, (v) whether X and Y are independent.
Solution: Since the total probability is 1,
k ∫_{0}^{2} ∫_{0}^{2} xy dy dx = 1
or 4k = 1
⇒ k = 1/4
Hence
f(x, y) = (1/4) xy, 0 ≤ x ≤ 2; 0 ≤ y ≤ 2
(i) P(X ≤ 1 ∩ Y ≤ 1) = P(X ≤ 1, Y ≤ 1)
= ∫_{0}^{1} ∫_{0}^{1} f(x, y) dy dx
= (1/4) ∫_{0}^{1} x {∫_{0}^{1} y dy} dx
= (1/8) ∫_{0}^{1} x dx = 1/16
(ii) P(X ≤ 1 | Y ≤ 1) = P(X ≤ 1, Y ≤ 1) / P(Y ≤ 1)
Now
P(Y ≤ 1) = ∫_{0}^{2} ∫_{0}^{1} f(x, y) dy dx
= (1/4) ∫_{0}^{2} x {∫_{0}^{1} y dy} dx
= (1/8) ∫_{0}^{2} x dx = 1/4
Therefore,
P(X ≤ 1 | Y ≤ 1) = P(X ≤ 1, Y ≤ 1) / P(Y ≤ 1) = (1/16) / (1/4) = 1/4.
(iii) The marginal pdf of X is
g(x) = ∫_{0}^{2} f(x, y) dy = (x/4) ∫_{0}^{2} y dy = x/2, 0 ≤ x ≤ 2
The marginal pdf of Y is
h(y) = ∫_{0}^{2} f(x, y) dx = (y/4) ∫_{0}^{2} x dx = y/2, 0 ≤ y ≤ 2
(iv) The conditional pdf of X given Y is
g(x | y) = f(x, y) / h(y) = (xy/4) / (y/2) = x/2, 0 ≤ x ≤ 2.
(v) Here it can be observed that
f(x, y) = g(x) h(y) ∀ x, y.
Also g(x | y) = g(x) = x/2, x ∈ (0, 2).
Hence X and Y are independent.
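Example 2.28's probabilities can be cross-checked numerically with a midpoint Riemann sum; the helper `integrate2d` and the grid size n are arbitrary sketch choices, not part of the notes.

```python
def integrate2d(f, ax, bx, ay, by, n=400):
    """Midpoint-rule approximation of a double integral over [ax,bx]x[ay,by]."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    return sum(f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

f = lambda x, y: x * y / 4
p_11 = integrate2d(f, 0, 1, 0, 1)     # P(X <= 1, Y <= 1)
p_y1 = integrate2d(f, 0, 2, 0, 1)     # P(Y <= 1)
assert abs(p_11 - 1/16) < 1e-6
assert abs(p_y1 - 1/4) < 1e-6
assert abs(p_11 / p_y1 - 1/4) < 1e-6  # P(X <= 1 | Y <= 1)
```

Since f is linear in each variable separately, the midpoint rule is essentially exact here.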
Example 2.29: If X and Y are two random variables having joint density function
f(x, y) = (1/8)(6 − x − y); 0 ≤ x < 2, 2 ≤ y < 4
Find (i) P(X < 1 ∩ Y < 3), (ii) P(X + Y < 3), (iii) P(X < 1 | Y < 3).
Solution:
(i) P(X < 1 ∩ Y < 3) = P(X < 1, Y < 3)
= (1/8) ∫_{0}^{1} ∫_{2}^{3} (6 − x − y) dy dx
= (1/8) ∫_{0}^{1} (7/2 − x) dx
= 3/8
(ii) P(X + Y < 3) = (1/8) ∫_{0}^{1} ∫_{2}^{3−x} (6 − x − y) dy dx
[The region of integration is the part of the rectangle lying below the line X + Y = 3.]
= (1/8) ∫_{0}^{1} [(6 − x)(1 − x) − ((3 − x)² − 4)/2] dx
= (1/8) ∫_{0}^{1} (7/2 − 4x + x²/2) dx
= 5/24
(iii) P(Y < 3) = (1/8) ∫_{0}^{2} ∫_{2}^{3} (6 − x − y) dy dx = 5/8
Therefore,
P(X < 1 | Y < 3) = P(X < 1 ∩ Y < 3) / P(Y < 3) = (3/8) / (5/8) = 3/5.
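Example 2.29 can be cross-checked numerically with a midpoint rule that allows x-dependent y-limits; the helper `integrate2` and the grid size n are arbitrary sketch choices.

```python
def integrate2(f, ax, bx, ylo, yhi, n=400):
    """Midpoint-rule double integral over a strip with y-limits depending on x."""
    hx = (bx - ax) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        lo, hi = ylo(x), yhi(x)
        hy = (hi - lo) / n
        total += sum(f(x, lo + (j + 0.5) * hy) for j in range(n)) * hy * hx
    return total

f = lambda x, y: (6 - x - y) / 8
p_i  = integrate2(f, 0, 1, lambda x: 2.0, lambda x: 3.0)      # P(X < 1, Y < 3)
p_ii = integrate2(f, 0, 1, lambda x: 2.0, lambda x: 3.0 - x)  # P(X + Y < 3)
p_y3 = integrate2(f, 0, 2, lambda x: 2.0, lambda x: 3.0)      # P(Y < 3)
assert abs(p_i - 3/8) < 1e-5
assert abs(p_ii - 5/24) < 1e-5
assert abs(p_i / p_y3 - 3/5) < 1e-5   # P(X < 1 | Y < 3)
```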
Example 2.30: Let the two-dimensional random variable (X, Y) have joint pdf
f(x, y) = 6x²y, 0 < x < 1; 0 < y < 1
= 0, elsewhere
Find (i) P(0 < X < 3/4, 1/3 < Y < 2), (ii) P(X + Y < 1), (iii) P(X > Y),
(iv) P(X < 1 | Y < 2).
Solution:
(i) P(0 < X < 3/4, 1/3 < Y < 2) = ∫_{0}^{3/4} ∫_{1/3}^{1} f(x, y) dy dx + ∫_{0}^{3/4} ∫_{1}^{2} 0 dy dx
= ∫_{0}^{3/4} ∫_{1/3}^{1} 6x²y dy dx
= (8/3) ∫_{0}^{3/4} x² dx = 3/8
(ii) P(X + Y < 1) = ∫_{0}^{1} ∫_{0}^{1−x} 6x²y dy dx
= ∫_{0}^{1} 6x² [y²/2]_{0}^{1−x} dx
= 3 ∫_{0}^{1} x²(1 − x)² dx = 1/10
(iii) P(X > Y) = ∫_{0}^{1} ∫_{0}^{x} 6x²y dy dx
= 6 ∫_{0}^{1} x² {∫_{0}^{x} y dy} dx
= 6 ∫_{0}^{1} x² [y²/2]_{0}^{x} dx
= 3 ∫_{0}^{1} x⁴ dx = 3/5
(iv) P(X < 1 ∩ Y < 2) = ∫_{0}^{1} ∫_{0}^{1} 6x²y dy dx + ∫_{0}^{1} ∫_{1}^{2} 0 dy dx = 1
P(Y < 2) = ∫_{0}^{1} ∫_{0}^{2} f(x, y) dy dx
= ∫_{0}^{1} ∫_{0}^{1} 6x²y dy dx + ∫_{0}^{1} ∫_{1}^{2} 0 dy dx = 1
Thus,
P(X < 1 | Y < 2) = P(X < 1 ∩ Y < 2) / P(Y < 2) = 1/1 = 1.
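The three non-trivial probabilities of Example 2.30 can likewise be cross-checked with a midpoint rule over y-varying strips; the helper and the grid size are arbitrary sketch choices.

```python
def integrate2(f, ax, bx, ylo, yhi, n=400):
    """Midpoint-rule double integral over a strip with y-limits depending on x."""
    hx = (bx - ax) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        lo, hi = ylo(x), yhi(x)
        hy = (hi - lo) / n
        total += sum(f(x, lo + (j + 0.5) * hy) for j in range(n)) * hy * hx
    return total

f = lambda x, y: 6 * x * x * y
p_i   = integrate2(f, 0, 0.75, lambda x: 1/3, lambda x: 1.0)  # part (i)
p_ii  = integrate2(f, 0, 1, lambda x: 0.0, lambda x: 1 - x)   # part (ii)
p_iii = integrate2(f, 0, 1, lambda x: 0.0, lambda x: x)       # part (iii)
assert abs(p_i - 3/8) < 1e-4
assert abs(p_ii - 1/10) < 1e-4
assert abs(p_iii - 3/5) < 1e-4
```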
Example 2.31: Let the two-dimensional continuous random variable (X, Y) have joint pdf
f(x, y) = 2, 0 < x < y < 1
Find the marginal density functions of X and Y.
Solution: We know that
g(x) = ∫_{x}^{1} f(x, y) dy = ∫_{x}^{1} 2 dy = 2(1 − x), 0 < x < 1
Similarly,
h(y) = ∫_{0}^{y} 2 dx = 2y, 0 < y < 1
Example 2.32: Let the two-dimensional continuous random variable (X, Y) have joint pdf
f(x, y) = x e^{−x(y+1)}, x > 0; y > 0
Find the marginal and conditional pdfs.
Solution: We have
g(x) = ∫_{0}^{∞} f(x, y) dy = x e^{−x} ∫_{0}^{∞} e^{−xy} dy = e^{−x}, x > 0,
and h(y) = ∫_{0}^{∞} x e^{−x(y+1)} dx.
Putting z = x(y + 1) and using ∫_{0}^{∞} e^{−z} z^{p−1} dz = Γ(p),
h(y) = (1/(y + 1)²) ∫_{0}^{∞} z e^{−z} dz = Γ(2)/(y + 1)²
⇒ h(y) = 1/(y + 1)², y > 0.
Now
g(x | y) = f(x, y)/h(y) = (y + 1)² x e^{−x(y+1)}, x > 0,
and h(y | x) = f(x, y)/g(x) = x e^{−xy}, y > 0.
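The gamma-function evaluation of h(y) in Example 2.32 can be sanity-checked by truncated numerical integration; the cutoff 40 and the grid size are arbitrary numerical choices.

```python
import math

# h(y) = ∫_0^∞ x e^{-x(y+1)} dx should equal 1/(y+1)^2.
def h_numeric(y, cutoff=40.0, n=4000):
    step = cutoff / n
    xs = ((i + 0.5) * step for i in range(n))   # midpoints of the grid cells
    return sum(x * math.exp(-x * (y + 1)) for x in xs) * step

for y in (0.0, 1.0, 3.0):
    assert abs(h_numeric(y) - 1 / (y + 1) ** 2) < 1e-4
```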
Example 2.33: Let the conditional distribution of X given Y be
g(x | y) = e^{−y} y^x / x!, x = 0, 1, 2, ...
which is a Poisson distribution with parameter y, and let Y itself have the exponential pdf
h(y) = e^{−y}, y > 0. Then the joint density is
f(x, y) = g(x | y) h(y) = e^{−2y} y^x / x!; x = 0, 1, 2, ...; y > 0
Therefore, the marginal distribution of X is given by
g(x) = ∫_{0}^{∞} f(x, y) dy
= ∫_{0}^{∞} e^{−2y} (y^x / x!) dy
= (1/x!) ∫_{0}^{∞} e^{−2y} y^x dy
= Γ(x + 1) / (x! 2^{x+1})
= 2^{−(x+1)}, x = 0, 1, 2, ...
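The closed form g(x) = 2^−(x+1) of Example 2.33 can be checked numerically for the first few values of x; the cutoff and grid size below are arbitrary numerical choices.

```python
import math

# g(x) = ∫_0^∞ e^{-2y} y^x / x! dy, expected to equal 2^-(x+1).
def g_numeric(x, cutoff=30.0, n=6000):
    step = cutoff / n
    ys = ((i + 0.5) * step for i in range(n))   # midpoints of the grid cells
    return sum(math.exp(-2 * y) * y ** x / math.factorial(x) for y in ys) * step

for x in range(4):
    assert abs(g_numeric(x) - 2.0 ** -(x + 1)) < 1e-4
```

Note that 2^−(x+1), x = 0, 1, 2, ..., is a geometric distribution, so the marginal probabilities also sum to 1.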
Example 2.34: Let (X, Y) be distributed with constant density inside a square R of side b.
Find f(x, y), F(x, y), and the marginal pdfs.
Solution: Taking R = {(x, y) | 0 ≤ x ≤ b, 0 ≤ y ≤ b}, the joint pdf of X and Y is
f(x, y) = (1/b)(1/b) = 1/b², (x, y) ∈ R
= 0, otherwise
The joint cdf is given by
F(x, y) = 0, x < 0 or y < 0
= ∫_{0}^{x} ∫_{0}^{y} (1/b²) dt₂ dt₁ = xy/b², 0 ≤ x < b; 0 ≤ y < b
= by/b² = y/b, x ≥ b; 0 ≤ y < b
= xb/b² = x/b, 0 ≤ x < b; y ≥ b
= 1, x ≥ b; y ≥ b
The marginal pdf of X is
g(x) = ∫_{0}^{b} (1/b²) dy = 1/b, x ∈ (0, b)
and the marginal pdf of Y is
h(y) = ∫_{0}^{b} (1/b²) dx = 1/b, y ∈ (0, b)
Example 2.35: The joint pdf of a two-dimensional random variable (X, Y) is given by
f(x, y) = 2, 0 < x < 1; 0 < y < x
= 0, elsewhere
(i) Find the marginal density functions of X and Y.
(ii) Find the conditional distributions of Y given X and of X given Y.
(iii) Check the independence of X and Y.
Solution: (i) The marginal density functions of X and Y respectively are given by
g(x) = ∫_{0}^{x} 2 dy = 2x, 0 < x < 1
= 0, elsewhere
and
h(y) = ∫_{y}^{1} 2 dx = 2(1 − y), 0 < y < 1
= 0, elsewhere
(ii) The conditional density function of Y given X is
h(y | x) = f(x, y)/g(x) = 2/(2x) = 1/x, 0 < y < x.
The conditional density function of X given Y is
g(x | y) = f(x, y)/h(y) = 2/(2(1 − y)) = 1/(1 − y), y < x < 1.
(iii) We have
f(x, y) = 2, 0 < x < 1; 0 < y < x,
while g(x) h(y) = 4x(1 − y).
Since f(x, y) ≠ g(x) h(y), X and Y are not independent.
Example 2.36: Let the two-dimensional random variable (X, Y) have joint pdf
f(x, y) = 4xy e^{−(x² + y²)}; x ≥ 0, y ≥ 0
Test whether X and Y are independent. Also find the conditional distribution of X given
Y = y.
Solution: The marginal distribution of X is given by
g(x) = ∫_{0}^{∞} f(x, y) dy
= ∫_{0}^{∞} 4xy e^{−(x² + y²)} dy
= 4x e^{−x²} ∫_{0}^{∞} y e^{−y²} dy
= 2x e^{−x²} ∫_{0}^{∞} e^{−t} dt    [putting y² = t, 2y dy = dt]
⇒ g(x) = 2x e^{−x²}; x ≥ 0.
Similarly,
h(y) = 2y e^{−y²}; y ≥ 0.
Since f(x, y) = g(x) h(y),
X and Y are independent.
The conditional distribution of X given Y is
g(x | y) = f(x, y)/h(y) = 2x e^{−x²}, x ≥ 0.
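A quick numerical sanity check of this example: the marginal 2x e^(−x²) should integrate to 1 over (0, ∞), and f should factorise pointwise. The truncation cutoff and grid size are arbitrary numerical choices.

```python
import math

def f(x, y):
    # Joint pdf of Example 2.36.
    return 4 * x * y * math.exp(-(x * x + y * y))

def g(x):
    # Marginal pdf derived above; h has the same functional form.
    return 2 * x * math.exp(-x * x)

n, cutoff = 4000, 10.0
step = cutoff / n
total = sum(g((i + 0.5) * step) for i in range(n)) * step
assert abs(total - 1.0) < 1e-5          # g integrates to 1

for x, y in [(0.3, 1.2), (2.0, 0.5)]:
    assert abs(f(x, y) - g(x) * g(y)) < 1e-12   # f(x, y) = g(x) h(y)
```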
Example 2.37: If the joint distribution function of X and Y is given by
F(x, y) = 1 − e^{−x} − e^{−y} + e^{−(x+y)}; x ≥ 0, y ≥ 0
= 0, elsewhere
(i) Find the marginal densities of X and Y.
(ii) Are X and Y independent?
(iii) Find P(X + Y ≥ 1).
Solution: We have
F(x, y) = 1 − e^{−x} − e^{−y} + e^{−(x+y)}; x ≥ 0, y ≥ 0.
Therefore,
f(x, y) = ∂²F(x, y)/∂x∂y
= ∂/∂x [∂/∂y (1 − e^{−x} − e^{−y} + e^{−(x+y)})]
= ∂/∂x [e^{−y} − e^{−(x+y)}]
⇒ f(x, y) = e^{−(x+y)}, x ≥ 0; y ≥ 0
= 0, elsewhere
(i) & (ii) The joint density of X and Y can be expressed as
f(x, y) = e^{−x} · e^{−y} = g(x) h(y),
where g(x) = e^{−x}, x ≥ 0,
and h(y) = e^{−y}, y ≥ 0.
Thus, X and Y are independent.
(iii)
P(X + Y ≥ 1) = 1 − P(X + Y ≤ 1)
Now
P(X + Y ≤ 1) = ∫_{0}^{1} ∫_{0}^{1−x} f(x, y) dy dx
= ∫_{0}^{1} e^{−x} {∫_{0}^{1−x} e^{−y} dy} dx
= ∫_{0}^{1} e^{−x} {1 − e^{−(1−x)}} dx
= 1 − 2e^{−1}.
Thus,
P(X + Y ≥ 1) = 2e^{−1} = 0.7358.
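Part (iii) can be cross-checked by integrating f(x, y) = e^−(x+y) over the triangle x + y ≤ 1 with a midpoint rule; the grid size n is an arbitrary numerical choice.

```python
import math

n = 500
h = 1.0 / n
p_le_1 = 0.0
for i in range(n):
    x = (i + 0.5) * h              # midpoint in x
    hy = (1 - x) / n               # y runs from 0 to 1 - x
    p_le_1 += sum(math.exp(-(x + (j + 0.5) * hy)) for j in range(n)) * hy * h

assert abs(p_le_1 - (1 - 2 / math.e)) < 1e-5   # P(X + Y <= 1) = 1 - 2/e
assert abs((1 - p_le_1) - 2 / math.e) < 1e-5   # P(X + Y >= 1) = 2/e
```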
= ∫ x dx = y − y³/4, 0 < y < 2
Since f(x, y) ≠ g(x) h(y),
X and Y are dependent.
Example 2.39: Let X and Y be two random variables having joint density function
f(x, y) = e^{−(x+y)}, x > 0; y > 0.
Are X and Y independent?
Find (i) P(X > 1), (ii) P(X < Y | X < 2Y), (iii) P(1 < X + Y < 2).
Solution: The joint pdf of X and Y can be expressed as
f(x, y) = e^{−x} · e^{−y} = g(x) h(y),
where g(x) = e^{−x}, x > 0,
and h(y) = e^{−y}, y > 0.
Thus, X and Y are independent.
(i) P(X > 1) = ∫_{1}^{∞} g(x) dx = ∫_{1}^{∞} e^{−x} dx = 1/e
(ii) First,
P(X < Y) = ∫_{0}^{∞} {∫_{0}^{y} f(x, y) dx} dy
= ∫_{0}^{∞} e^{−y} {∫_{0}^{y} e^{−x} dx} dy
= ∫_{0}^{∞} e^{−y} (1 − e^{−y}) dy    [put 1 − e^{−y} = t, e^{−y} dy = dt]
= ∫_{0}^{1} t dt
= 1/2
Similarly,
P(X < 2Y) = ∫_{0}^{∞} {∫_{0}^{2y} f(x, y) dx} dy
= ∫_{0}^{∞} e^{−y} (1 − e^{−2y}) dy
= 2/3
Since the event {X < Y} is contained in {X < 2Y}, therefore
P(X < Y | X < 2Y) = P(X < Y) / P(X < 2Y) = (1/2) / (2/3) = 3/4
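The two probabilities in part (ii) are instances of P(X < aY) = ∫₀^∞ e^−y (1 − e^−ay) dy = a/(1 + a), which can be checked numerically; the cutoff and grid size below are arbitrary sketch choices.

```python
import math

def p_x_less_ay(a, cutoff=40.0, n=4000):
    """Truncated midpoint-rule approximation of P(X < aY) for unit exponentials."""
    step = cutoff / n
    ys = ((i + 0.5) * step for i in range(n))
    return sum(math.exp(-y) * (1 - math.exp(-a * y)) for y in ys) * step

p_xy, p_x2y = p_x_less_ay(1), p_x_less_ay(2)
assert abs(p_xy - 1/2) < 1e-4
assert abs(p_x2y - 2/3) < 1e-4
# {X < Y} is contained in {X < 2Y}, so the conditional probability is a ratio.
assert abs(p_xy / p_x2y - 3/4) < 1e-4
```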
(iii)
P(1 < X + Y < 2) = ∫∫_I f(x, y) dy dx + ∫∫_II f(x, y) dy dx
= ∫_{0}^{1} ∫_{1−x}^{2−x} f(x, y) dy dx + ∫_{1}^{2} ∫_{0}^{2−x} f(x, y) dy dx
= ∫_{0}^{1} e^{−x} {∫_{1−x}^{2−x} e^{−y} dy} dx + ∫_{1}^{2} e^{−x} {∫_{0}^{2−x} e^{−y} dy} dx
= 2/e − 3/e²
= 0.3298
Example 2.40: Let X and Y be jointly distributed with pdf
f(x, y) = (1/4)(1 + xy), |x| < 1, |y| < 1
= 0, elsewhere
Show that X and Y are not independent but X² and Y² are independent.
Solution: The marginal pdf of X is given as
g(x) = ∫_{−1}^{1} f(x, y) dy = (1/4) ∫_{−1}^{1} (1 + xy) dy = 1/2, −1 < x < 1
Similarly, the marginal pdf of Y is
h(y) = ∫_{−1}^{1} f(x, y) dx = 1/2, −1 < y < 1
Since f(x, y) ≠ g(x) h(y), therefore X and Y are not independent.
Now, for 0 ≤ x ≤ 1,
P(X² ≤ x) = P(|X| ≤ √x) = ∫_{−√x}^{√x} g(u) du = √x
Similarly,
P(Y² ≤ y) = P(|Y| ≤ √y) = √y
Also,
P(X² ≤ x ∩ Y² ≤ y) = P(|X| ≤ √x, |Y| ≤ √y)
= ∫_{−√x}^{√x} ∫_{−√y}^{√y} f(u, v) dv du
= (1/4) ∫_{−√x}^{√x} ∫_{−√y}^{√y} (1 + uv) dv du
= √x √y    [the uv term integrates to zero by symmetry]
This implies
P(X² ≤ x, Y² ≤ y) = P(X² ≤ x) P(Y² ≤ y),
and hence X² and Y² are independent.
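Example 2.40's factorisation can be checked numerically: integrating f(u, v) = (1 + uv)/4 over the symmetric rectangle should give √x·√y, with the uv cross term cancelling. The grid size is an arbitrary sketch choice.

```python
import math

def p_joint(x, y, n=500):
    """Midpoint-rule value of P(X^2 <= x, Y^2 <= y) for Example 2.40."""
    a, b = math.sqrt(x), math.sqrt(y)
    hu, hv = 2 * a / n, 2 * b / n
    return sum((1 + (-a + (i + 0.5) * hu) * (-b + (j + 0.5) * hv)) / 4
               for i in range(n) for j in range(n)) * hu * hv

for x, y in [(0.25, 0.81), (0.5, 0.5)]:
    # Matches P(X^2 <= x) P(Y^2 <= y) = sqrt(x) sqrt(y).
    assert abs(p_joint(x, y) - math.sqrt(x * y)) < 1e-9
```

Because the integrand is linear in each variable, the midpoint rule is exact here up to floating-point roundoff.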