Bayesian Prob Exercises
BAYESIAN INFERENCE
Progic 2005
Thomas Bayes (1702-1761): English Presbyterian minister and mathematician
Bayes’ rule

P(A | B) = P(A)P(B | A) / P(B)

where
P(A | B) = posterior probability of A
P(A) = prior probability of A
P(B) = P(A)P(B | A) + P(¬A)P(B | ¬A) = prior or unconditional probability of B
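As a quick numeric sketch (the helper below and its name are mine, not from the slides), the rule for a binary hypothesis A vs. ¬A can be evaluated directly:

```python
def posterior(prior_a, lik_b_given_a, lik_b_given_not_a):
    """P(A | B) = P(A)P(B | A) / [P(A)P(B | A) + P(not A)P(B | not A)]."""
    num = prior_a * lik_b_given_a
    return num / (num + (1 - prior_a) * lik_b_given_not_a)

print(posterior(0.01, 0.9, 0.1))  # 0.0833... = 1/12 (the disease example below)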
Example
1% of pop. has disease (D); rest is healthy (H)
90% of diseased persons test positive (+)
90% of healthy persons test negative (−)
A randomly selected person tests positive
Probability that the person has the disease is:

P(D | +) = P(D)P(+ | D) / [P(D)P(+ | D) + P(H)P(+ | H)]
= (0.01×0.9) / (0.01×0.9 + 0.99×0.1)
= 0.009/0.108
= 1/12
Hypothetical population: 1000
D: 10            H: 990
D+: 9   D−: 1    H+: 99   H−: 891
So P(D | +) = 9/(9 + 99) = 9/108 = 1/12
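A minimal Monte Carlo sketch of the same setup (names and trial count are mine); the frequency of disease among simulated positives should land near 1/12:

```python
import random

def simulate(trials=100_000, seed=1):
    """Estimate P(D | +) by sampling the population on the slide."""
    rng = random.Random(seed)
    pos = diseased_pos = 0
    for _ in range(trials):
        has_d = rng.random() < 0.01                     # 1% base rate
        tests_pos = rng.random() < (0.9 if has_d else 0.1)
        if tests_pos:
            pos += 1
            diseased_pos += has_d
    return diseased_pos / pos

print(simulate())  # close to 1/12 = 0.0833...
```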
Two boys: BB
(a) At least one GP is a boy: A = {BB, BG, GB}
p = P(BB | A) = 1/3
Problem....
Suppose: P(BB) = 1/6, P(BG) = 1/3,
P(GB) = 1/3, P(GG) = 1/6
Is p still 1/3?

Via Bayes’ rule, with the uniform prior:
p = P(BB)P(A | BB)/P(A) = ((1/4)×1)/(3/4) = 1/3, as before
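A sketch that enumerates the four sibling patterns exactly (function and variable names are mine; the 1/5 under the skewed prior is my own calculation, not stated on the visible slides):

```python
from fractions import Fraction as F

def p_bb_given(event, prior):
    """Exact P(BB | event) by enumeration over the four sibling patterns."""
    num = prior["BB"] if "BB" in event else F(0)
    return num / sum(prior[s] for s in event)

uniform = {s: F(1, 4) for s in ("BB", "BG", "GB", "GG")}
skewed  = {"BB": F(1, 6), "BG": F(1, 3), "GB": F(1, 3), "GG": F(1, 6)}

A = {"BB", "BG", "GB"}         # at least one GP is a boy
print(p_bb_given(A, uniform))  # 1/3, as on the slides
print(p_bb_given(A, skewed))   # 1/5 under the skewed prior
```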
(b) Older GP is a boy: B* = {BB, BG}
Both are boys: BB
So P(BB | B*) = 1/2
Or....
p = P(BB | B*) = P(BB)P(B* | BB)/P(B*) = ((1/4)×1)/(1/2) = 1/2
NB: If P(BB) = 1/6, etc, then p = ((1/6)×1)/(1/2) = 1/3
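Reusing p_bb_given from the sketch above, case (b) and the NB check out:

```python
B_star = {"BB", "BG"}               # older GP is a boy
print(p_bb_given(B_star, uniform))  # 1/2, as on the slide
print(p_bb_given(B_star, skewed))   # 1/3, matching the NB
```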
(c) Let T = “Ann tells you her older GP is a boy”
(P(BB) = P(BG) = P(GB) = P(GG) = 1/4)
Then
P(BB | T) = P(BB)P(T | BB) / P(T)
where
P(T) = P(BB)P(T | BB) + P(BG)P(T | BG) + P(GB)P(T | GB) + P(GG)P(T | GG)
Since P(T | GB) = P(T | GG) = 0 and the priors are equal:
P(BB | T) = P(T | BB) / (P(T | BB) + P(T | BG))
= 1/2 if P(T | BB) = P(T | BG)
Eg, suppose that BB is worth BIG $’s
Then maybe
P(T | BB) = 0.9
P(T | BG) = 0.2
In that case
P(BB | T) = 0.9/(0.9 + 0.2) = 9/11
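A minimal sketch of this telling model (name is mine), assuming as the slide's denominator implies that Ann cannot make the claim under GB or GG, so the uniform 1/4 priors cancel:

```python
def p_bb_given_told(t_bb, t_bg):
    """P(BB | T) with uniform priors; T is impossible under GB and GG,
    so only the BB and BG terms survive in P(T)."""
    return t_bb / (t_bb + t_bg)

print(p_bb_given_told(0.5, 0.5))  # 0.5: equally likely to tell either way
print(p_bb_given_told(0.9, 0.2))  # 0.8181... = 9/11, as on the slide
```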
(d) Let R = “A GP was picked randomly and found to be a boy”
Then
p = P(BB | R) = P(BB)P(R | BB)/P(R) = ((1/4)×1)/(2/4) = 1/2
But we should first ask: how likely was R under each hypothesis?
Eg:
P(BB | R) = P(R | BB) / (P(R | BB) + P(R | BG) + P(R | GB) + P(R | GG))
= 0.8/(0.8 + 0.5 + 0.5) = 4/9
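The same cancellation works here (a sketch with my own names; the uniform 1/4 priors divide out of numerator and denominator):

```python
def p_bb_given_r(r_bb, r_bg, r_gb, r_gg):
    """P(BB | R) with uniform 1/4 priors, which cancel out."""
    return r_bb / (r_bb + r_bg + r_gb + r_gg)

print(p_bb_given_r(1.0, 0.5, 0.5, 0.0))  # 0.5: the idealized random pick
print(p_bb_given_r(0.8, 0.5, 0.5, 0.0))  # 0.444... = 4/9, as on the slide
```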
Moral
Just because something happened
(eg, Ann told you her oldest is a boy,
or a GP was picked randomly,
or even that you met Ann, etc),
it is not enough to condition on what happened:
you must also model how it came to happen
The Monty Hall problem: you pick door No. 1,
and the host opens No. 3, revealing a goat
Standard argument: switching wins exactly when your first pick hides a goat, so
p = (1/3)×0 + (2/3)×1 = 2/3
But this is wrong
Let q = pr. that the host opens No. 3 when he has a choice,
and CO = the observed outcome (you picked No. 1, the host opened No. 3)
Then p = P(2 | CO, q) = ((1/3)×1) / ((1/3)×q + (1/3)×1) = 1/(1 + q)
If M2 is your host:
q = 1/2 & p = 1/(1 + 1/2) = 2/3
But this is wrong
If given a choice, M2 was more likely to open No. 3 than M1
Using P(CO | q) = (1 + q)/9:
P(CO) = (1/2)×(1 + 0)/9 + (1/2)×(1 + 1/2)/9 = 5/36
So now the pr. that M2 is your host equals:
P(q=1/2 | CO) = P(q=1/2)P(CO | q=1/2) / P(CO)
= ((1/2)(1 + 1/2)/9) / (5/36) = 3/5
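A quick exact check of the two numbers above in Fraction arithmetic (names are mine):

```python
from fractions import Fraction as F

def p_co(q):
    """P(CO | q) = (1 + q)/9."""
    return (1 + q) / F(9)

prior = F(1, 2)                        # pr. 1/2 for each host
total = prior * p_co(F(0)) + prior * p_co(F(1, 2))
print(total)                           # 5/36
print(prior * p_co(F(1, 2)) / total)   # 3/5
```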
So: P(p=2/3 | CO) = 3/5 (M2: q = 1/2)
and P(p=1 | CO) = 2/5 (M1: q = 0)
Hypothetical population: 18,000 games
9000 hosted by M1
– opens the door with the lowest number (q = 0)
9000 hosted by M2
– mentally flips a coin (q = 1/2)
M2: 9000 games (1000 per pick×car cell; car behind No. 1: 3000)
Pick 1: C1 → O2 (500) / O3 (500); C2 → O3 (1000); C3 → O2 (1000)
Pick 2: C1 → O3 (1000); C2 → O1 (500) / O3 (500); C3 → O1 (1000)
Pick 3: C1 → O2 (1000); C2 → O1 (1000); C3 → O1 (500) / O2 (500)

M1: 9000 games (1000 per pick×car cell)
Pick 1: C1 → O2 (1000); C2 → O3 (1000); C3 → O2 (1000)
Pick 2: C1 → O3 (1000); C2 → O1 (1000); C3 → O1 (1000)
Pick 3: C1 → O2 (1000); C2 → O1 (1000); C3 → O1 (1000)
We find:
#(CO) = 1000 + 500 + 1000 = 2500 (you pick No. 1, host opens No. 3)
#(2CO) = 1000 + 1000 = 2000 (of those, car behind No. 2)
So P(2|CO) = P(2CO)/P(CO)
= #(2CO)/#(CO)
= 2000/2500
= 4/5, as before
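A sketch that rebuilds the 18,000-game table and recounts; the encoding of the hosts' rules as "q = pr. of opening the higher-numbered door when there is a choice" is mine, but it reproduces both M1 (lowest number) and M2 (coin flip):

```python
from fractions import Fraction as F

def open_probs(pick, car, q):
    """Distribution over the door the host opens, given your pick and the car."""
    options = [d for d in (1, 2, 3) if d != pick and d != car]
    if len(options) == 1:
        return {options[0]: F(1)}
    lo, hi = options                 # host has a choice: car is behind your pick
    return {lo: 1 - q, hi: q}        # q = pr. of opening the higher-numbered door

counts = {}                          # (host, pick, car, opened) -> game count
for host, q in (("M1", F(0)), ("M2", F(1, 2))):
    for pick in (1, 2, 3):
        for car in (1, 2, 3):
            for door, pr in open_probs(pick, car, q).items():
                counts[(host, pick, car, door)] = 1000 * pr

co  = sum(n for (h, p, c, d), n in counts.items() if p == 1 and d == 3)
two = sum(n for (h, p, c, d), n in counts.items() if p == 1 and d == 3 and c == 2)
print(two, co, two / co)             # 2000 2500 4/5
```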
Some statements & their meaning
P(p=2/3) = 1/2
(ie, the prior pr. is 1/2 that your host is M2, for whom p = 2/3)
In the absence of a prior, the required pr. is
p = P(2|CO,q) = 1/(1 + q)
Equate u = EU
Get 1 = (1 + q)/9
Solution is q = 8
Closest possible value of q is 1
Corresponding value of p is 1/(1 + 1) = 1/2
Another problem
Suppose q ~ U(0,1) (a priori ignorance)
Then
P(CO) = ∫₀¹ P(CO | q) f(q) dq = ∫₀¹ ((1 + q)/9)×1 dq = 1/6
f(q | CO) = f(q)P(CO | q)/P(CO) = (1×(1 + q)/9)/(1/6) = (2/3)(1 + q)
E(p | CO) = ∫₀¹ p f(q | CO) dq = ∫₀¹ (1/(1 + q))×(2/3)(1 + q) dq = 2/3
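A numeric check of these integrals (a sketch assuming scipy is available; names are mine):

```python
from scipy import integrate

def lik(q):                      # P(CO | q) = (1 + q)/9
    return (1 + q) / 9

p_co, _ = integrate.quad(lik, 0, 1)
print(p_co)                      # 0.1666... = 1/6

def posterior_density(q):        # f(q | CO) = (2/3)(1 + q)
    return lik(q) / p_co

e_p, _ = integrate.quad(lambda q: posterior_density(q) / (1 + q), 0, 1)
print(e_p)                       # 0.6666... = 2/3
```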
In a 1992 paper on the Monty Hall problem:
THANK YOU