HW 2 Solution

This document contains Georgi Dinolov's homework assignments from AMS 203 that are due on 11/12/2012. It lists several problems from the textbook DeGroot and Schervish regarding probability distributions, random variables, and conditional probabilities. The problems involve determining joint and marginal probability distributions, whether random variables are independent, and conditional probability density functions.

Georgi Dinolov

AMS 203
Homework #2
11/12/2012

DeGroot and Schervish: 3.5.10, 3.6.10, 3.6.12, 3.7.2, 3.7.8, 3.8.14


3.5.10 Suppose that a point $(X, Y)$ is chosen at random from the circle $S$ defined as follows:
$$S = \{(x, y) : x^2 + y^2 \le 1\}.$$
a. Determine the joint p.d.f. of $X$ and $Y$, the marginal p.d.f. of $X$, and the marginal p.d.f. of $Y$.
b. Are $X$ and $Y$ independent?
a. The points $(x, y)$ are chosen at random, meaning that they are all equally likely, and that $f(x, y)$ is uniform on the unit circle. Thus,
$$f(x, y) = \frac{1}{\pi},$$
since the radius of $S$ is unity and the area of $S$ is therefore $\pi$. The marginal distribution for a given $Y$ is found by integrating along all possible values of $x$, namely,
$$f(y) = \int_{-\sqrt{1-y^2}}^{\sqrt{1-y^2}} \frac{1}{\pi}\, dx = \frac{2\sqrt{1-y^2}}{\pi}, \qquad -1 \le y \le 1.$$
Similarly for $X$,
$$f(x) = \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi}\, dy = \frac{2\sqrt{1-x^2}}{\pi}, \qquad -1 \le x \le 1.$$

b. If $X$ and $Y$ were independent, we would have $f(x)f(y) = f(x, y)$. However,
$$f(x)f(y) = \frac{4}{\pi^2}\sqrt{1-x^2}\,\sqrt{1-y^2} \ne \frac{1}{\pi}$$
in general, so $X$ and $Y$ are not independent.
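As a quick numerical sanity check on these densities (a Python sketch; the grid size is an arbitrary choice), the marginal should integrate to 1 while the product of marginals differs from the joint density:

```python
import math

# Joint density of (X, Y) uniform on the unit disk: f(x, y) = 1/pi.
def joint_pdf(x, y):
    return 1.0 / math.pi if x * x + y * y <= 1.0 else 0.0

# Marginal density of X (and, by symmetry, of Y): 2*sqrt(1 - x^2)/pi.
def marginal_pdf(x):
    return 2.0 * math.sqrt(1.0 - x * x) / math.pi if abs(x) <= 1.0 else 0.0

# Midpoint-rule check that the marginal integrates to 1 over [-1, 1].
n = 200_000
total = sum(marginal_pdf(-1.0 + 2.0 * (i + 0.5) / n) * (2.0 / n) for i in range(n))
print(round(total, 3))  # close to 1.0

# At the origin, f(x)f(y) = 4/pi^2 while f(x, y) = 1/pi, so independence fails.
print(round(marginal_pdf(0.0) ** 2, 3), round(joint_pdf(0.0, 0.0), 3))  # 0.405 0.318
```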

3.6.10 In a large collection of coins, the probability $X$ that a head will be obtained when a coin is tossed varies from one coin to another, and the distribution of $X$ in the collection is specified by the following p.d.f.:
$$f_1(x) = \begin{cases} 6x(1-x) & \text{for } 0 < x < 1, \\ 0 & \text{otherwise.} \end{cases}$$
Suppose that a coin is selected at random from the collection and tossed once, and that a head is obtained. Determine the conditional p.d.f. of $X$ for this coin.
In this problem we use Bayes' theorem. Let $T_1 \in \{h, t\}$ be the outcome of tossing the selected coin for the first time, with the two possible outcomes head and tail. We seek $f(x \mid T_1 = h)$. We have
$$f(x \mid T_1 = h) = \frac{f(T_1 = h \mid x)\, f_1(x)}{\int_0^1 f(T_1 = h \mid x)\, f_1(x)\, dx} = \frac{x\,[6x(1-x)]}{\int_0^1 x\,[6x(1-x)]\, dx} = \frac{6x^2(1-x)}{1/2} = 12x^2(1-x)$$
for $0 < x < 1$.
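A short numerical check of the Bayes computation (a Python sketch; the grid size and the test point $x = 0.7$ are arbitrary choices):

```python
# Prior p.d.f. of the coin's heads-probability: f1(x) = 6x(1 - x) on (0, 1).
def prior(x):
    return 6.0 * x * (1.0 - x)

# Posterior derived above after observing one head: 12x^2(1 - x).
def posterior(x):
    return 12.0 * x * x * (1.0 - x)

# The normalizing constant int_0^1 x f1(x) dx should be 1/2 (midpoint rule).
n = 100_000
norm = sum(((i + 0.5) / n) * prior((i + 0.5) / n) / n for i in range(n))
print(round(norm, 3))  # close to 0.5

# Bayes' theorem: x * prior(x) / norm should reproduce posterior(x).
x = 0.7
print(round(x * prior(x) / norm, 3), round(posterior(x), 3))  # both near 1.764
```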


3.6.12 Let $Y$ be the rate (calls per hour) at which calls arrive at a switchboard. Let $X$ be the number of calls during a two-hour period. Suppose that the marginal p.d.f. of $Y$ is
$$f_2(y) = \begin{cases} e^{-y} & \text{if } y > 0, \\ 0 & \text{otherwise,} \end{cases}$$
and that the conditional p.f. of $X$ given $Y = y$ is
$$g_1(x \mid y) = \begin{cases} \dfrac{(2y)^x}{x!}\, e^{-2y} & \text{if } x = 0, 1, \ldots, \\ 0 & \text{otherwise.} \end{cases}$$
a. Find the marginal p.f. of $X$. (You may use the formula $\int_0^\infty y^k e^{-y}\, dy = k!$.)
b. Find the conditional p.d.f. $g_2(y \mid 0)$ of $Y$ given $X = 0$.
c. Find the conditional p.d.f. $g_2(y \mid 1)$ of $Y$ given $X = 1$.
d. For what values of $y$ is $g_2(y \mid 1) > g_2(y \mid 0)$? Does this agree with the intuition that the more calls you see, the higher you should think the rate is?
a. The marginal p.f. of $X$ is found by integrating the product $g_1(x \mid y) f_2(y)$ over all admissible values of $y$:
$$f(x) = \int_0^\infty g_1(x \mid y)\, f_2(y)\, dy = \int_0^\infty \frac{(2y)^x}{x!}\, e^{-2y}\, e^{-y}\, dy = \frac{2^x}{3^x\, x!} \int_0^\infty (3y)^x e^{-3y}\, dy.$$
Substituting $m = 3y$, so that $dy = dm/3$,
$$f(x) = \frac{2^x}{3^{x+1}\, x!} \int_0^\infty m^x e^{-m}\, dm = \frac{2^x}{3^{x+1}\, x!}\, x! = \frac{1}{3}\left(\frac{2}{3}\right)^x, \qquad x = 0, 1, \ldots$$

b. We have $g_2(y \mid 0) = g_1(0 \mid y)\, f_2(y) / f(0)$, so that for $y > 0$,
$$g_2(y \mid 0) = \frac{e^{-2y}\, e^{-y}}{1/3} = 3e^{-3y}.$$

c. We have $g_2(y \mid 1) = g_1(1 \mid y)\, f_2(y) / f(1)$, so that for $y > 0$,
$$g_2(y \mid 1) = \frac{2y\, e^{-2y}\, e^{-y}}{2/9} = 9y\, e^{-3y}.$$

d. We seek $y$ such that
$$9y\, e^{-3y} > 3e^{-3y}, \quad \text{i.e.,} \quad y > \frac{1}{3}.$$
This agrees with the intuition that the more calls you see, the higher you should think the rate is.
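The two conditional densities can be checked numerically (a Python sketch; the integration range $[0, 20]$ truncates a negligible tail, and the test points are arbitrary):

```python
import math

# Conditional p.d.f.s of the rate Y derived above, given X = 0 and X = 1.
def g2_given_0(y):
    return 3.0 * math.exp(-3.0 * y)

def g2_given_1(y):
    return 9.0 * y * math.exp(-3.0 * y)

# Both should integrate to 1 (midpoint rule over [0, 20]).
n = 200_000
h = 20.0 / n
i0 = sum(g2_given_0((k + 0.5) * h) * h for k in range(n))
i1 = sum(g2_given_1((k + 0.5) * h) * h for k in range(n))
print(round(i0, 3), round(i1, 3))  # both close to 1.0

# g2(y|1) exceeds g2(y|0) exactly when y > 1/3.
print(g2_given_1(0.5) > g2_given_0(0.5))  # True  (0.5 > 1/3)
print(g2_given_1(0.2) > g2_given_0(0.2))  # False (0.2 < 1/3)
```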

3.7.2 Suppose that three random variables $X_1$, $X_2$, and $X_3$ have a mixed joint distribution with p.f./p.d.f.:
$$f(x_1, x_2, x_3) = \begin{cases} c\, x_1^{1+x_2+x_3} (1-x_1)^{3-x_2-x_3} & \text{if } 0 < x_1 < 1 \text{ and } x_2, x_3 \in \{0, 1\}, \\ 0 & \text{otherwise.} \end{cases}$$
(Notice that $X_1$ has a continuous distribution and $X_2$ and $X_3$ have discrete distributions.) Determine
a. the value of the constant $c$;
b. the marginal joint p.f. of $X_2$ and $X_3$;
c. the conditional p.d.f. of $X_1$ given $X_2 = 1$ and $X_3 = 1$.
a. We need $c$ such that the total probability is 1. Summing over $x_2, x_3 \in \{0, 1\}$ and integrating over $x_1$,
$$\sum_{x_2=0}^{1} \sum_{x_3=0}^{1} \int_0^1 f(x_1, x_2, x_3)\, dx_1 = c \left[ \int_0^1 x_1 (1-x_1)^3\, dx_1 + 2 \int_0^1 x_1^2 (1-x_1)^2\, dx_1 + \int_0^1 x_1^3 (1-x_1)\, dx_1 \right]$$
$$= c \left( \frac{1}{20} + \frac{2}{30} + \frac{1}{20} \right) = \frac{c}{6} = 1,$$
so
$$c = 6.$$
b. The marginal joint p.f. of $X_2$ and $X_3$ is given by
$$f_2(x_2, x_3) = \int_0^1 f(x_1, x_2, x_3)\, dx_1 = 6 \int_0^1 x_1^{1+x_2+x_3} (1-x_1)^{3-x_2-x_3}\, dx_1.$$
Since $x_2$ and $x_3$ each take only the values 0 and 1, we evaluate this integral case by case, reusing the integrals from part a:
$$f_2(0, 0) = 6 \int_0^1 x_1 (1-x_1)^3\, dx_1 = 6 \cdot \frac{1}{20} = 0.3,$$
$$f_2(0, 1) = 6 \int_0^1 x_1^2 (1-x_1)^2\, dx_1 = 6 \cdot \frac{1}{30} = 0.2,$$
$$f_2(1, 0) = 6 \int_0^1 x_1^2 (1-x_1)^2\, dx_1 = 6 \cdot \frac{1}{30} = 0.2,$$
$$f_2(1, 1) = 6 \int_0^1 x_1^3 (1-x_1)\, dx_1 = 6 \cdot \frac{1}{20} = 0.3.$$
Thus,
$$f_2(x_2, x_3) = \begin{cases} 0.2 & \text{if } (x_2, x_3) \in \{(0, 1), (1, 0)\}, \\ 0.3 & \text{if } (x_2, x_3) \in \{(0, 0), (1, 1)\}. \end{cases}$$

c. For $x_1 \in (0, 1)$,
$$f_3(x_1 \mid x_2 = 1, x_3 = 1) = \frac{f(x_1, x_2 = 1, x_3 = 1)}{f_2(1, 1)} = \frac{6\, x_1^3 (1-x_1)}{3/10} = 20\, x_1^3 (1-x_1).$$
Otherwise, $f_3(x_1 \mid x_2 = 1, x_3 = 1) = 0$.
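The constant $c = 6$ and the marginal p.f. above can be verified numerically (a Python sketch; the grid size is an arbitrary choice):

```python
# Joint p.f./p.d.f. with c = 6: f(x1, x2, x3) = 6 x1^(1+x2+x3) (1-x1)^(3-x2-x3).
def f(x1, x2, x3):
    return 6.0 * x1 ** (1 + x2 + x3) * (1.0 - x1) ** (3 - x2 - x3)

# Marginal p.f. of (X2, X3): midpoint-rule integration over x1 in (0, 1).
n = 100_000
marg = {}
for x2 in (0, 1):
    for x3 in (0, 1):
        marg[(x2, x3)] = sum(f((i + 0.5) / n, x2, x3) / n for i in range(n))
        print((x2, x3), round(marg[(x2, x3)], 3))  # 0.3, 0.2, 0.2, 0.3

# The four entries should sum to 1, confirming c = 6.
print(round(sum(marg.values()), 3))  # close to 1.0
```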


3.7.8 Suppose that the p.d.f. of a random variable $X$ is as follows:
$$f(x) = \begin{cases} \dfrac{1}{n!}\, x^n e^{-x} & \text{for } x > 0, \\ 0 & \text{otherwise.} \end{cases}$$
Suppose also that for any given value $X = x$ ($x > 0$), the $n$ random variables $Y_1, \ldots, Y_n$ are i.i.d. and the conditional p.d.f. $g$ of each of them is as follows:
$$g(y \mid x) = \begin{cases} \dfrac{1}{x} & \text{for } 0 < y < x, \\ 0 & \text{otherwise.} \end{cases}$$
Determine
a. the marginal joint p.d.f. of $Y_1, \ldots, Y_n$ and
b. the conditional p.d.f. of $X$ for any given values of $Y_1, \ldots, Y_n$.
For any given value of $X$, the random variables $Y_1, \ldots, Y_n$ are i.i.d., each with p.d.f. $g(y \mid x)$. Therefore, the conditional joint p.d.f. of $Y_1, \ldots, Y_n$ given that $X = x$ is
$$h(y_1, \ldots, y_n \mid x) = g(y_1 \mid x) \cdots g(y_n \mid x) = \begin{cases} \dfrac{1}{x^n} & \text{for } 0 < y_i < x,\ i = 1, \ldots, n, \\ 0 & \text{otherwise.} \end{cases}$$
The joint p.d.f. is positive if and only if each $y_i > 0$ and $x$ is greater than every $y_i$, that is,
$$x > m = \max\{y_1, \ldots, y_n\}.$$
a. For $y_i > 0$ ($i = 1, \ldots, n$), the marginal p.d.f. of $Y_1, \ldots, Y_n$ is
$$g_0(y_1, \ldots, y_n) = \int_0^\infty f(x)\, h(y_1, \ldots, y_n \mid x)\, dx = \frac{1}{n!} \int_m^\infty e^{-x}\, dx = \frac{e^{-m}}{n!}.$$

b. For $y_i > 0$ ($i = 1, \ldots, n$), the conditional p.d.f. of $X$ given that $Y_i = y_i$ ($i = 1, \ldots, n$) is
$$g_1(x \mid y_1, \ldots, y_n) = \frac{f(x)\, h(y_1, \ldots, y_n \mid x)}{g_0(y_1, \ldots, y_n)} = \begin{cases} e^{-(x-m)} & \text{for } x > m, \\ 0 & \text{otherwise.} \end{cases}$$
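Part a can be sanity-checked numerically for a specific case (a Python sketch; $n = 3$ and $m = 1.7$ are arbitrary illustrative choices, and the upper integration limit 40 truncates a negligible tail):

```python
import math

n = 3    # number of Y_i's (arbitrary choice for the check)
m = 1.7  # max of the hypothetical observed values y_1, ..., y_n

# Marginal p.d.f. of X: x^n e^{-x} / n! for x > 0.
def f(x):
    return x ** n * math.exp(-x) / math.factorial(n)

# Conditional joint p.d.f. of Y_1, ..., Y_n given X = x: 1/x^n for x > m.
def h(x):
    return 1.0 / x ** n if x > m else 0.0

# g0 = int f(x) h(x) dx over (m, infinity) should equal e^{-m} / n!.
steps, upper = 200_000, 40.0
dx = (upper - m) / steps
g0 = sum(f(m + (k + 0.5) * dx) * h(m + (k + 0.5) * dx) * dx for k in range(steps))
print(round(g0, 5), round(math.exp(-m) / math.factorial(n), 5))  # both 0.03045
```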


3.8.14 Let $X$ have the uniform distribution on the interval $[a, b]$, and let $c > 0$. Prove that $cX + d$ has the uniform distribution on the interval $[ca + d, cb + d]$.

Let $Y = cX + d$. The inverse transformation is $x = (y - d)/c$, whose derivative is $1/c$. Assume that $c > 0$. The p.d.f. of $Y$ is
$$g(y) = \frac{1}{c}\, f\!\left(\frac{y - d}{c}\right) = \frac{1}{c(b - a)}, \qquad \text{for } a \le \frac{y - d}{c} \le b.$$
It is easy to see that $a \le (y - d)/c \le b$ if and only if $ca + d \le y \le cb + d$, so $g$ is the p.d.f. of the uniform distribution on the interval $[ca + d, cb + d]$. If $c < 0$, the distribution of $Y$ would be uniform on the interval $[cb + d, ca + d]$. If $c = 0$, the distribution of $Y$ is degenerate at the value $d$, i.e., $P(Y = d) = 1$.
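A quick Monte Carlo illustration of the result (a Python sketch; the values of $a$, $b$, $c$, $d$ and the seed are arbitrary choices):

```python
import random

random.seed(0)
a, b, c, d = -1.0, 2.0, 3.0, 5.0  # arbitrary parameters with c > 0

# Y = cX + d for X ~ Uniform[a, b] should be Uniform[ca + d, cb + d] = [2, 11].
ys = [c * random.uniform(a, b) + d for _ in range(200_000)]
lo, hi = c * a + d, c * b + d
print(round(min(ys), 2), round(max(ys), 2))  # near 2.0 and 11.0

# For a uniform law, the empirical CDF at the midpoint should be near 1/2.
mid = (lo + hi) / 2.0
print(round(sum(y <= mid for y in ys) / len(ys), 2))  # near 0.5
```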

