
Probability and Statistics (MATH F113)

Pradeep Boggarapu

Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

March 10, 2025

Joint Distributions

1 Joint pmf and pdf

2 Marginal pmf, pdf, Joint CDF and Expectation

3 Independent Random Variables

4 Examples

Joint distributions

In an educational institution, we may be interested in recording the
height and weight of every student. To describe such experiments
mathematically, we introduce the idea of a two-dimensional random
variable.
In many statistical investigations, one is frequently interested in
studying the relationship between two or more random variables, such
as the relationship between annual income and yearly savings per
family or the relationship between occupation and hypertension.
For example, we might measure the amount of precipitate P and
volume V of gas released from a controlled chemical experiment,
giving rise to a two-dimensional sample space consisting of the
outcomes (p, v ), or we might be interested in the hardness H and
tensile strength T of cold-drawn copper resulting in the outcomes
(h, t).

Joint Distributions-Discrete Case

Often, experiments are conducted where two or more random variables are
observed simultaneously in order to determine not only their individual
behaviour but also the degree of relationship between them.

Let X and Y be two discrete random variables, and suppose we are given the
function
p(x, y) = P(X = x, Y = y);
then we say that the random variables X and Y are jointly distributed with
joint pmf p(x, y).

Example 1
Table: Joint Probability Mass Function of X and Y

          Y = 1    Y = 2
X = 1      3/24     5/24
X = 2      1/24     7/24
X = 3      6/24     2/24

Example 2
Two dice are thrown. Let X and Y denote the numbers shown by the first
and second die, respectively. The joint pmf is given by

          Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6
X = 1      1/36    1/36    1/36    1/36    1/36    1/36
X = 2      1/36    1/36    1/36    1/36    1/36    1/36
X = 3      1/36    1/36    1/36    1/36    1/36    1/36
X = 4      1/36    1/36    1/36    1/36    1/36    1/36
X = 5      1/36    1/36    1/36    1/36    1/36    1/36
X = 6      1/36    1/36    1/36    1/36    1/36    1/36

General Case-Discrete
Suppose that we are concerned with k discrete random variables
X1, X2, ..., Xk. Let x1 be a possible value for the first random variable X1,
x2 be a possible value for the second random variable X2, and so on, with
xk a possible value for the kth random variable Xk. Then the probabilities

P(X1 = x1, X2 = x2, ..., Xk = xk) = p(x1, x2, ..., xk)

need to be specified. We refer to the function p and the corresponding
k-tuples of possible values (x1, x2, ..., xk) as the joint probability
distribution of these discrete random variables, or joint pmf.

Joint PMF

Necessary and sufficient conditions for a function p(x, y) to be a joint pmf
(discrete case):

p(x, y) ≥ 0 for all x, y;
Σ_{all x} Σ_{all y} p(x, y) = 1.

Let X and Y be discrete random variables with joint pmf p. For any
suitable B ⊂ R², we have that

P((X, Y) ∈ B) = Σ_{(x,y) ∈ B} p(x, y).
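A minimal Python sketch of these two conditions and of P((X, Y) ∈ B), using the Example 1 table stored as a dictionary (the dictionary layout is just one convenient choice, not anything prescribed by the slides):

from fractions import Fraction as F

p = {(1, 1): F(3, 24), (1, 2): F(5, 24),
     (2, 1): F(1, 24), (2, 2): F(7, 24),
     (3, 1): F(6, 24), (3, 2): F(2, 24)}

# Condition 1: every probability is nonnegative.
assert all(v >= 0 for v in p.values())
# Condition 2: the probabilities sum to 1 over all (x, y).
assert sum(p.values()) == 1

# P((X, Y) in B) is the sum of p(x, y) over the pairs in B,
# here with B = {(x, y) : x + y <= 3}.
print(sum(v for (x, y), v in p.items() if x + y <= 3))   # 3/8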

Example

Example 3
Table: Joint Probability Mass Function of X and Y

          Y = 1    Y = 2
X = 1      a/24     4/24
X = 2      1/24     7/24
X = 3      6/24     2/24

Find the value of a. Also find P(X + Y ≤ 3).

Joint Distributions-Continuous Case

There are many situations in which we describe an outcome by giving the


values of several continuous random variables. For instance, we may
measure the weight and the hardness of a rock; the volume, pressure, and
temperature of a gas; or the thickness, compressive strength, and
potassium content of a piece of glass.

If X and Y are two continuous random variables together with a nonnegative
function f(x, y) such that

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy

for all a < b and c < d, then X and Y are jointly distributed with the
joint pdf f(x, y).

Example 4
X and Y are continuous random variables with joint pdf

f(x, y) = 6e^(−2x−3y)   for x > 0, y > 0,
        = 0             elsewhere.

Example 5
X and Y are continuous random variables with joint pdf

f(x, y) = 2   for 0 ≤ x ≤ y ≤ 1,
        = 0   otherwise.

General Case-Continuous
If X1, X2, ..., Xk are k continuous random variables, we shall refer to
f(x1, x2, ..., xk) as the joint probability density of these random variables,
if the probability that a1 ≤ X1 ≤ b1, a2 ≤ X2 ≤ b2, ..., and ak ≤ Xk ≤ bk
is given by the multiple integral

P(a1 ≤ X1 ≤ b1; a2 ≤ X2 ≤ b2; ...; ak ≤ Xk ≤ bk)
   = ∫_{ak}^{bk} · · · ∫_{a2}^{b2} ∫_{a1}^{b1} f(x1, x2, ..., xk) dx1 dx2 ... dxk.

Joint PDF

Necessary and sufficient conditions for a function f(x, y) to be a joint pdf
(continuous case):

f(x, y) ≥ 0 for all x, y;
∫∫_{R²} f(x, y) dx dy = 1.

Let X and Y be continuous random variables with joint pdf f. For any
suitable B ⊂ R², we have that

P((X, Y) ∈ B) = ∫∫_{(x,y) ∈ B} f(x, y) dx dy.
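The same two conditions can be checked symbolically; a short SymPy sketch (assuming SymPy is available) for the density of Example 4:

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 6 * sp.exp(-2*x - 3*y)

# Condition 2: the density integrates to 1 over the support x > 0, y > 0.
print(sp.integrate(f, (x, 0, sp.oo), (y, 0, sp.oo)))   # 1

# P((X, Y) in B) as a double integral, e.g. B = [0, 1] x [0, 1].
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))           # equals (1 - exp(-2))*(1 - exp(-3))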

Example

Example 6
The joint probability density function of X and Y is given by

f(x, y) = c(y² − x²)e^(−y)   for −y ≤ x ≤ y, 0 < y < ∞,
        = 0                  otherwise.

Then find c.
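One way to find c, as a SymPy sketch (symbol names are illustrative): integrate (y² − x²)e^(−y) over the support and solve c · (integral) = 1.

import sympy as sp

x = sp.symbols('x', real=True)
y, c = sp.symbols('y c', positive=True)

integral = sp.integrate((y**2 - x**2) * sp.exp(-y), (x, -y, y), (y, 0, sp.oo))
print(integral)                              # 8
print(sp.solve(sp.Eq(c * integral, 1), c))   # [1/8]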

Marginal pmf, Joint CDF and Expectations-Discrete Case
Marginal pmf
If X and Y are discrete random variables with joint pmf p(x, y), the
marginal pmf of X is given by

pX(x) = Σ_{all possible y} p(x, y)

and the marginal pmf of Y is given by

pY(y) = Σ_{all possible x} p(x, y).

General Case: Let f be the joint pmf of the discrete random variables
X1, X2, ..., Xk; then the marginal pmf of the random variable Xi is given by

pXi(xi) = Σ_{x1, ..., x(i−1), x(i+1), ..., xk} f(x1, x2, ..., xk).
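A small Python sketch of these sums, again for the Example 1 table; pX and pY are built by summing out the other variable:

from collections import defaultdict
from fractions import Fraction as F

p = {(1, 1): F(3, 24), (1, 2): F(5, 24),
     (2, 1): F(1, 24), (2, 2): F(7, 24),
     (3, 1): F(6, 24), (3, 2): F(2, 24)}

pX, pY = defaultdict(F), defaultdict(F)
for (x, y), v in p.items():
    pX[x] += v   # sum over all possible y
    pY[y] += v   # sum over all possible x

print(dict(pX))   # each value 8/24 = 1/3
print(dict(pY))   # 10/24 = 5/12 and 14/24 = 7/12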
Joint CDF
The joint cdf of X and Y is defined by

F(x, y) = Σ_{a ≤ x, b ≤ y} p(a, b).

General Case: The joint cdf of X1, X2, ..., Xk is given by

F(x1, x2, ..., xk) = Σ_{a1 ≤ x1, a2 ≤ x2, ..., ak ≤ xk} p(a1, a2, ..., ak).

Expectation
Let g be a function of X and Y; then the expectation of g(X, Y) is
defined by

E(g(X, Y)) = Σ_{all possible x, y} g(x, y) p(x, y),

provided the right-hand side exists.

General case: The expectation of g(X1, X2, ..., Xk) is given by

E(g(X1, X2, ..., Xk)) = Σ_{all possible x1, ..., xk} g(x1, x2, ..., xk) p(x1, x2, ..., xk),

provided the right-hand side exists.
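A short Python sketch of this weighted sum for the Example 1 table; the helper name `expectation` is illustrative, and g(x, y) = x²y anticipates the quantity E(X²Y) computed later:

from fractions import Fraction as F

p = {(1, 1): F(3, 24), (1, 2): F(5, 24),
     (2, 1): F(1, 24), (2, 2): F(7, 24),
     (3, 1): F(6, 24), (3, 2): F(2, 24)}

def expectation(g, pmf):
    # weighted sum of g(x, y) over all possible (x, y)
    return sum(g(x, y) * v for (x, y), v in pmf.items())

print(expectation(lambda x, y: x**2 * y, p))   # 163/24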

Marginal pdf, Joint CDF and Expectations-Continuous

Marginal pdf
Let X and Y be two continuous random variables with joint pdf f. The
marginal pdf of X is given by

fX(x) = ∫_{−∞}^{∞} f(x, y) dy

and the marginal pdf of Y is given by

fY(y) = ∫_{−∞}^{∞} f(x, y) dx.

Let f be the joint pdf of the continuous random variables
X1, X2, ..., Xk; then the marginal pdf of the random variable Xi is given by

fXi(xi) = ∫ · · · ∫ f(x1, x2, ..., xk) dx1 · · · dx(i−1) dx(i+1) · · · dxk,

where the integration is over all variables except xi.
Joint CDF
The joint cdf of X and Y is given by

F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(s, t) dt ds.

General case: The joint cdf of X1, X2, ..., Xk is given by

F(x1, x2, ..., xk) = ∫_{−∞}^{x1} ∫_{−∞}^{x2} · · · ∫_{−∞}^{xk} f(s1, s2, ..., sk) dsk · · · ds2 ds1.

Expectation
The expectation of g(X, Y) is defined by

E(g(X, Y)) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy,

provided the right-hand side exists.

General Case: The expectation of g(X1, X2, ..., Xk) is defined by

E(g(X1, X2, ..., Xk))
   = ∫_{−∞}^{∞} ∫_{−∞}^{∞} · · · ∫_{−∞}^{∞} g(x1, x2, ..., xk) f(x1, x2, ..., xk) dx1 dx2 ... dxk,

provided the right-hand side exists.
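A SymPy sketch of this double integral for the Example 4 density with g(x, y) = xy:

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 6 * sp.exp(-2*x - 3*y)
print(sp.integrate(x * y * f, (x, 0, sp.oo), (y, 0, sp.oo)))   # 1/6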

Independent Random Variables

Let p be the joint pmf (or f be the joint pdf) of the
discrete (or continuous) random variables X and Y. The random
variables X and Y are independent if and only if

p(x, y) = pX(x) pY(y), when X and Y are discrete,

or

f(x, y) = fX(x) fY(y), when X and Y are continuous,

for all possible ordered pairs (x, y), where pX and pY (or fX and fY) are
the marginal pmfs (or pdfs) of X and Y, respectively.
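A Python sketch of this criterion applied to the two-dice pmf of Example 2, checking the product condition pair by pair:

from fractions import Fraction as F

p = {(x, y): F(1, 36) for x in range(1, 7) for y in range(1, 7)}
pX = {x: sum(v for (a, b), v in p.items() if a == x) for x in range(1, 7)}
pY = {y: sum(v for (a, b), v in p.items() if b == y) for y in range(1, 7)}

print(all(p[(x, y)] == pX[x] * pY[y] for (x, y) in p))   # True: 1/36 = (1/6)*(1/6)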

Independent Random Variables

General Case: Let f be the joint pmf (or pdf) of the discrete (or
continuous) random variables X1, X2, ..., Xk. The random variables
X1, X2, ..., Xk are independent if and only if

f(x1, x2, ..., xk) = fX1(x1) · · · fXk(xk)

for all possible k-tuples (x1, x2, ..., xk), where fXi is the marginal pmf (or
pdf) of Xi.

Recall Example 1:
Example 7
Table: Joint Probability Mass Function of X and Y

          Y = 1    Y = 2
X = 1      3/24     5/24
X = 2      1/24     7/24
X = 3      6/24     2/24

1 Find marginal pmfs and determine whether X and Y are independent.


2 Find P(X + Y ≤ 3) and E(X²Y).

Solution:
Example 7
          Y = 1    Y = 2    pX(x)
X = 1      3/24     5/24     8/24
X = 2      1/24     7/24     8/24
X = 3      6/24     2/24     8/24
pY(y)     10/24    14/24    Total = 1

1 The random variables X and Y are not independent, since p(1, 1) = 3/24
while pX(1) pY(1) = (8/24)(10/24) = 5/36, which is clearly not equal to p(1, 1).

2 P(X + Y ≤ 3) = p(1, 1) + p(1, 2) + p(2, 1) = 3/24 + 5/24 + 1/24 = 9/24, and

E(X²Y) = Σ_{all possible x, y} x²y p(x, y) = (3 + 10 + 4 + 56 + 54 + 36)/24 = 163/24.

Theorem 1.
If X and Y are independent random variables then

E (g (X )h(Y )) = E (g (X )) · E (h(Y )).

General case: If X1, X2, ..., Xk are independent random variables with
a joint distribution f, then

E(g1(X1) g2(X2) · · · gk(Xk)) = E(g1(X1)) E(g2(X2)) · · · E(gk(Xk)).

Problem 2.
If X and Y are independent random variables, show that

MX +Y (t) = MX (t) · MY (t),

where MZ denotes the moment generating function of random variable Z


for Z = X , Y or X + Y .
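One way to see this: applying Theorem 1 with g(X) = e^(tX) and h(Y) = e^(tY) gives

MX+Y(t) = E(e^(t(X+Y))) = E(e^(tX) e^(tY)) = E(e^(tX)) · E(e^(tY)) = MX(t) · MY(t),

where the third equality uses Theorem 1 (independence of X and Y).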

Recall Example 2:
Example 8
Two dice are thrown. Let X and Y denote the numbers shown by the first
and second die, respectively. The joint pmf is given by

          Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6
X = 1      1/36    1/36    1/36    1/36    1/36    1/36
X = 2      1/36    1/36    1/36    1/36    1/36    1/36
X = 3      1/36    1/36    1/36    1/36    1/36    1/36
X = 4      1/36    1/36    1/36    1/36    1/36    1/36
X = 5      1/36    1/36    1/36    1/36    1/36    1/36
X = 6      1/36    1/36    1/36    1/36    1/36    1/36

1 Find marginal pmfs and determine whether X and Y are independent.
2 Find P(X + Y ≤ 3) and E(X²Y).

Solution
          Y = 1   Y = 2   Y = 3   Y = 4   Y = 5   Y = 6   pX(x)
X = 1      1/36    1/36    1/36    1/36    1/36    1/36    1/6
X = 2      1/36    1/36    1/36    1/36    1/36    1/36    1/6
X = 3      1/36    1/36    1/36    1/36    1/36    1/36    1/6
X = 4      1/36    1/36    1/36    1/36    1/36    1/36    1/6
X = 5      1/36    1/36    1/36    1/36    1/36    1/36    1/6
X = 6      1/36    1/36    1/36    1/36    1/36    1/36    1/6
pY(y)      1/6     1/6     1/6     1/6     1/6     1/6    Total = 1

1 X and Y are independent, because p(x, y) = pX(x) · pY(y) for every pair (x, y).

2 P(X + Y ≤ 3) = p(1, 1) + p(1, 2) + p(2, 1) = 3/36, and

E(X²Y) = E(X²) · E(Y) = (91/6) · (21/6) = 637/12.

Recall Example 4.
Example 9
X and Y are continuous random variables with joint pdf

f(x, y) = 6e^(−2x−3y)   for x > 0, y > 0,
        = 0             elsewhere.

1 Find marginal pdfs and determine whether X and Y are independent.

2 Find P(X + Y ≤ 3) and E(XY).

Solution
The marginal pdf of X is given by

fX(x) = ∫_{−∞}^{∞} f(x, y) dy = 6 ∫_{0}^{∞} e^(−2x−3y) dy = (6/3) e^(−2x) [−e^(−3y)]_{0}^{∞} = 2e^(−2x),

for all x > 0. Similarly we get

fY(y) = 3e^(−3y)

for all y > 0. Clearly we have that f(x, y) = fX(x) fY(y) for all (x, y).
Therefore X and Y are independent.

P(X + Y ≤ 3) = 6 ∫_{0}^{3} ∫_{0}^{3−y} e^(−2x−3y) dx dy
             = 6 ∫_{0}^{3} e^(−3y) [e^(−2x)/(−2)]_{0}^{3−y} dy
             = 3 ∫_{0}^{3} (e^(−3y) − e^(−y−6)) dy
             = 1 − 3e^(−6) + 2e^(−9) ≈ 0.9928.

E(XY) = E(X) · E(Y) = (1/2) · (1/3) = 1/6.
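A SymPy sketch that reproduces this probability by direct integration (the closing value above can be checked this way):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 6 * sp.exp(-2*x - 3*y)
prob = sp.integrate(f, (x, 0, 3 - y), (y, 0, 3))
print(sp.simplify(prob))   # equals 1 - 3*exp(-6) + 2*exp(-9)
print(float(prob))         # approximately 0.9928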
Recall Example 5.
Example 10
X and Y are continuous random variables with joint pdf

f(x, y) = 2   for 0 ≤ x ≤ y ≤ 1,
        = 0   otherwise.

1 Find marginal pdfs and determine whether X and Y are independent.

2 Find P(X + Y ≤ 1/2) and E(X²Y).
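For part 1, a SymPy sketch that integrates out the other variable to obtain the marginal densities; comparing fX(x) · fY(y) with f(x, y) then settles the independence question:

import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = 2
fX = sp.integrate(f, (y, x, 1))   # marginal of X on 0 <= x <= 1, equals 2*(1 - x)
fY = sp.integrate(f, (x, 0, y))   # marginal of Y on 0 <= y <= 1, equals 2*y
print(fX, fY)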

Examples

Example 3.
Let X and Y be discrete random variables with joint probability mass
function

pXY(x, y) = (1/21)(x + y)   if x = 1, 2; y = 1, 2, 3,
          = 0               otherwise.

What are the marginal probability mass functions of X and Y?

Example 4.
For what value of the constant k is the function given by

pXY(x, y) = kxy   if x = 1, 2, 3; y = 1, 2, 3,
          = 0     otherwise,

a joint probability mass function of some random variables X and Y?
ANS: 1/36.

Examples

Example 5.
A privately owned business operates both a drive-in facility and a walk-in
facility. On a randomly selected day, let X and Y, respectively, be the
proportions of the time that the drive-in and the walk-in facilities are in
use, and suppose that the joint density function of these random variables
is

fXY(x, y) = k(2x + 3y)   for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
          = 0            elsewhere.

Find the value of k.

Find P[(X, Y) ∈ A], where A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
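A SymPy sketch of one way to work this example (symbol names are illustrative): solve the normalization condition for k, then integrate the density over A.

import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
f = k * (2*x + 3*y)

# Normalization over the unit square gives 5k/2 = 1.
ksol = sp.solve(sp.Eq(sp.integrate(f, (x, 0, 1), (y, 0, 1)), 1), k)[0]

prob = sp.integrate(f.subs(k, ksol),
                    (x, 0, sp.Rational(1, 2)),
                    (y, sp.Rational(1, 4), sp.Rational(1, 2)))
print(ksol, prob)   # 2/5 and 13/160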

Examples

Example 6.
For a healthy individual aged 20 to 29 years, the calcium level in the blood,
X , is usually between 8.5 and 10.5 milligrams per deciliter (mg/dl) and
the cholesterol level, Y , is usually between 120 and 240 mg/dl. Assume
that for a healthy individual in this age group the random variable (X , Y )
is uniformly distributed over the rectangle whose corners are
(8.5, 120), (8.5, 240), (10.5, 120), (10.5, 240). That is, assume that the
joint density for (X , Y ) is

fXY (x, y ) = c, 8.5 ≤ x ≤ 10.5, 120 ≤ y ≤ 240.

Find c and find P[9 ≤ X ≤ 10 and 125 ≤ Y ≤ 140].


Find fX and fY .

Examples

Example 7.
The joint density of X and Y is given by

fXY(x, y) = xe^(−(x+y))   for x > 0, y > 0,
          = 0             elsewhere.

Are X and Y independent? Justify your answer. What if f(x, y) were given by

fXY(x, y) = 2   for 0 < x < y, 0 < y < 1,
          = 0   elsewhere?

Examples

Example 8.
Let the joint density function of X and Y be

fXY(x, y) = kxy   for 0 < x < 1, 0 < y < 1, 0 ≤ x + y ≤ 1,
          = 0     elsewhere;

then find
1 k,
2 E[X] and E[Y],
3 P[X − Y ≤ 1].

Examples

Example 9.
Three points X1, X2, X3 are selected at random on a line L. What is the
probability that X2 lies between X1 and X3?

Example 10.
Two points are selected randomly on a line of length L so as to be on
opposite sides of the midpoint of the line. [In other words, the two points
X and Y are independent random variables such that X is uniformly
distributed over (0, L/2) and Y is uniformly distributed over (L/2, L).]
Find the probability that the distance between the two points is greater
than L/3.
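A Monte Carlo sketch (NumPy) that estimates this probability, taking L = 1; the sample size and seed are arbitrary choices. The estimate should land near the value obtained analytically, 7/9 ≈ 0.778.

import numpy as np

rng = np.random.default_rng(0)
L, n = 1.0, 1_000_000
X = rng.uniform(0, L/2, n)      # uniform on (0, L/2)
Y = rng.uniform(L/2, L, n)      # uniform on (L/2, L), independent of X
print(np.mean(Y - X > L/3))     # estimate of P(distance > L/3)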

Thank you for your attention

