
Homework Week2 Solutions-1

The document presents a series of exercises related to Markov chains, including proofs of properties such as the Markov property and conditional independence. It explores specific cases and computations involving transition probabilities and conditional expectations. The exercises also illustrate the limitations of the Markov property in certain contexts.


Exercise 1: Try to prove Theorem 1.1.2 of [N'97] (Markov property) in the specific case |I| = 3, for example I = {1, 2, 3}, m = 2 and n = 2.

Solution: We want to show that, for any A ∈ F_2 and i, j, k, w ∈ I,

\[
P(\{X_2 = i, X_3 = j, X_4 = k\} \cap A \mid X_2 = w) = \delta_{wi}\, p_{ij}\, p_{jk}\, P(A \mid X_2 = w)\,.
\]

The relation trivially holds when w ≠ i, so in the following we assume i = w (in which case the event {X_2 = i} is implied by the conditioning on X_2 = w and can be dropped). Since A ∈ F_2, we have

\[
A = \sum_{(u,v,z) \in A_{I^3}} A_{u,v,z}\,,
\]

where A_{I^3} ⊂ I^3 and A_{u,v,z} = {ω ∈ Ω : X_0(ω) = u, X_1(ω) = v, X_2(ω) = z}. The symbol Σ stresses the fact that the union is over disjoint sets, that is, A_{u,v,z} ∩ A_{u',v',z'} = ∅ if (u, v, z) ≠ (u', v', z'). It follows that

\begin{align*}
P(\{X_3 = j, &X_4 = k\} \cap A \mid X_2 = w) \\
&= \frac{P(\{X_2 = w, X_3 = j, X_4 = k\} \cap \sum_{(u,v,z) \in A_{I^3}} A_{u,v,z})}{P(X_2 = w)} \\
&= \sum_{(u,v,z) \in A_{I^3}} \frac{P(\{X_2 = w, X_3 = j, X_4 = k\} \cap A_{u,v,z})}{P(X_2 = w)} \\
&= \sum_{(u,v,z) \in A_{I^3}} \frac{P(X_0 = u, X_1 = v, X_2 = z, X_2 = w, X_3 = j, X_4 = k)}{P(X_2 = w)} \\
&= \sum_{(u,v) \in A^w_{I^2}} P(X_3 = j, X_4 = k \mid X_2 = w, X_1 = v, X_0 = u)\, \frac{P(A_{u,v,w} \cap \{X_2 = w\})}{P(X_2 = w)} \\
&\overset{\text{M.P.}}{=} \sum_{(u,v) \in A^w_{I^2}} P(X_3 = j, X_4 = k \mid X_2 = w)\, \frac{P(A_{u,v,w} \cap \{X_2 = w\})}{P(X_2 = w)} \\
&= p_{wj}\, p_{jk} \sum_{(u,v) \in A^w_{I^2}} \frac{P(A_{u,v,w} \cap \{X_2 = w\})}{P(X_2 = w)} \\
&= p_{wj}\, p_{jk}\, \frac{P\bigl(\sum_{(u,v) \in A^w_{I^2}} A_{u,v,w} \cap \{X_2 = w\}\bigr)}{P(X_2 = w)} \\
&= p_{wj}\, p_{jk}\, \frac{P\bigl(\sum_{(u,v,z) \in A_{I^3}} A_{u,v,z} \cap \{X_2 = w\}\bigr)}{P(X_2 = w)} \\
&= p_{wj}\, p_{jk}\, P(A \mid X_2 = w) = \delta_{wi}\, p_{ij}\, p_{jk}\, P(A \mid X_2 = w)\,,
\end{align*}

where we denoted by A^w_{I^2} the set {(u, v) : (u, v, w) ∈ A_{I^3}}; in the fourth equality only the terms with z = w survive, since {X_2 = z} ∩ {X_2 = w} = ∅ for z ≠ w.
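As a numerical sanity check, the identity just proved can be verified by brute-force enumeration of all trajectories of a small chain. The Python sketch below uses a hypothetical 3-state chain: the initial law `nu`, the matrix `P` and the event `A` are arbitrary illustrative choices, not taken from the exercise.

```python
from itertools import product

# Hypothetical 3-state chain (states 0, 1, 2); nu, P and the event A are
# arbitrary illustrative choices.
nu = [0.2, 0.5, 0.3]
P = [[0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2],
     [0.5, 0.25, 0.25]]
# A is an event in F_2: it is decided by (X0, X1, X2).
A = {(0, 1, 1), (2, 0, 1), (1, 1, 2)}

def path_prob(path):
    """P(X0 = path[0], ..., X4 = path[4]) for the chain (nu, P)."""
    pr = nu[path[0]]
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

def prob(event):
    """Total probability of the trajectories (x0, ..., x4) in `event`."""
    return sum(path_prob(p) for p in product(range(3), repeat=5) if event(p))

max_err = 0.0
for i, j, k, w in product(range(3), repeat=4):
    pw = prob(lambda p: p[2] == w)
    # P({X2 = i, X3 = j, X4 = k} ∩ A | X2 = w)
    lhs = prob(lambda p: p[2] == w and p[2:] == (i, j, k) and p[:3] in A) / pw
    # delta_{wi} p_{ij} p_{jk} P(A | X2 = w)
    cond_A = prob(lambda p: p[2] == w and p[:3] in A) / pw
    rhs = (w == i) * P[i][j] * P[j][k] * cond_A
    max_err = max(max_err, abs(lhs - rhs))

print(max_err)  # numerically zero
```

Both sides agree for every choice of (i, j, k, w), including the cases w ≠ i, where both vanish.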

Exercise 2: Let (X_n)_{n≥0}, with X_i ∈ {0, 1} for i = 1, 2, ..., be a sequence of random variables that are conditionally independent given each of the events A and B, subsets of Ω, where

1. A ∩ B = ∅, A ∪ B = Ω and P(A) = P(B) = 1/2;

2. P(X_i = 1 | A) = P(X_i = 1 | B) = p for i = 1, 2, ..., with p ∈ (0, 1).

Compute P(X_j = 1 | X_k = 1), where k, j ∈ {1, 2, ...} and k ≠ j.

Solution: Writing P_A(·) = P(· | A) and P_B(·) = P(· | B), we have

\begin{align*}
P(X_j = 1 \mid X_k = 1)
&= \frac{P(X_j = 1, X_k = 1)}{P(X_k = 1)} \\
&= \frac{P_A(X_j = 1, X_k = 1)\, P(A) + P_B(X_j = 1, X_k = 1)\, P(B)}{P_A(X_k = 1)\, P(A) + P_B(X_k = 1)\, P(B)} \\
&= \frac{P_A(X_j = 1)\, P_A(X_k = 1)\, P(A) + P_B(X_j = 1)\, P_B(X_k = 1)\, P(B)}{P_A(X_k = 1)\, P(A) + P_B(X_k = 1)\, P(B)} \\
&= \frac{p^2}{p} = p\,,
\end{align*}

where the third equality uses the conditional independence of X_j and X_k given A and given B.
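The mixture computation above is simple enough to check numerically; a minimal sketch, where the value p = 0.3 is an arbitrary choice:

```python
# Direct check of the mixture computation above; p = 0.3 is an arbitrary
# choice, P(A) = P(B) = 1/2 as in the exercise.
p = 0.3
pA = pB = 0.5  # P(A), P(B)

joint = (p * p) * pA + (p * p) * pB   # P(Xj = 1, Xk = 1), by cond. indep.
marginal = p * pA + p * pB            # P(Xk = 1)
cond = joint / marginal               # P(Xj = 1 | Xk = 1)
print(cond)  # equals p up to float rounding
```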

Exercise 3: Let (X_n)_{n≥0} be a homogeneous Markov chain. Consider the disjoint events A = {X_0 = i_0^A, ..., X_{n-1} = i_{n-1}^A} and B = {X_0 = i_0^B, ..., X_{n-1} = i_{n-1}^B}. Prove that for any n ∈ N it holds that

\[
P(X_{n+1} = j \mid \{X_n = i, A\} \cup \{X_n = i, B\}) = P(X_{n+1} = j \mid X_n = i)\,.
\]

Solution: We use Bayes' rule to write:

\begin{align*}
P(X_{n+1} = j &\mid \{X_n = i, A\} \cup \{X_n = i, B\}) \\
&= \frac{P(\{X_n = i, A\} \cup \{X_n = i, B\} \mid X_{n+1} = j)\, P(X_{n+1} = j)}{P(\{X_n = i, A\} \cup \{X_n = i, B\})} \\
&= \frac{\bigl[P(\{X_n = i, A\} \mid X_{n+1} = j) + P(\{X_n = i, B\} \mid X_{n+1} = j)\bigr]\, P(X_{n+1} = j)}{P(\{X_n = i, A\} \cup \{X_n = i, B\})} \\
&= \dots
\end{align*}

where the second equality uses that {X_n = i, A} and {X_n = i, B} are disjoint.

Using Bayes' rule again, together with the Markov property, for the two terms in the numerator, we obtain

\[
P(\{X_n = i, A\} \mid X_{n+1} = j) = P(X_{n+1} = j \mid X_n = i)\, \frac{P(\{X_n = i, A\})}{P(X_{n+1} = j)}
\]

and

\[
P(\{X_n = i, B\} \mid X_{n+1} = j) = P(X_{n+1} = j \mid X_n = i)\, \frac{P(\{X_n = i, B\})}{P(X_{n+1} = j)}\,.
\]

Now, substituting into the previous formula, we have

\begin{align*}
\dots &= \frac{P(X_{n+1} = j \mid X_n = i)\, \frac{P(\{X_n = i, A\}) + P(\{X_n = i, B\})}{P(X_{n+1} = j)}\, P(X_{n+1} = j)}{P(\{X_n = i, A\} \cup \{X_n = i, B\})} \\
&= \frac{P(X_{n+1} = j \mid X_n = i)\, \bigl[P(\{X_n = i, A\}) + P(\{X_n = i, B\})\bigr]}{P(\{X_n = i, A\} \cup \{X_n = i, B\})} \\
&= P(X_{n+1} = j \mid X_n = i)\,,
\end{align*}

since, again by disjointness, the denominator equals P({X_n = i, A}) + P({X_n = i, B}).

Exercise 4: Let (X_n)_{n≥0} be a homogeneous Markov chain with values in I. Prove that for any n ∈ N and j_2, j_1, i_n, ..., i_0 ∈ I the following equality holds:

\[
P(X_{n+2} = j_2, X_{n+1} = j_1 \mid X_n = i_n, \dots, X_0 = i_0) = P(X_{n+2} = j_2, X_{n+1} = j_1 \mid X_n = i_n)\,.
\]

Solution: From the very definition of conditional probability and the Markov property, we have

\begin{align*}
P(X_{n+2} = j_2, &X_{n+1} = j_1 \mid X_n = i_n, \dots, X_0 = i_0) \\
&= \frac{P(X_{n+2} = j_2, X_{n+1} = j_1, X_n = i_n, \dots, X_0 = i_0)}{P(X_n = i_n, \dots, X_0 = i_0)} \\
&= \frac{P(X_{n+2} = j_2 \mid X_{n+1} = j_1)\, P(X_{n+1} = j_1 \mid X_n = i_n)\, P(X_n = i_n, \dots, X_0 = i_0)}{P(X_n = i_n, \dots, X_0 = i_0)} \\
&= P(X_{n+2} = j_2 \mid X_{n+1} = j_1)\, P(X_{n+1} = j_1 \mid X_n = i_n) \\
&= P(X_{n+2} = j_2 \mid X_{n+1} = j_1, X_n = i_n)\, P(X_{n+1} = j_1 \mid X_n = i_n) \\
&= \frac{P(X_{n+2} = j_2, X_{n+1} = j_1, X_n = i_n)}{P(X_{n+1} = j_1, X_n = i_n)}\, \frac{P(X_{n+1} = j_1, X_n = i_n)}{P(X_n = i_n)} \\
&= P(X_{n+2} = j_2, X_{n+1} = j_1 \mid X_n = i_n)\,,
\end{align*}

where the second equality uses the Markov property (twice, conditioning first on X_{n+1} and then on X_n) and the fourth uses it again in the opposite direction.
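The two-step identity can be sanity-checked by enumeration on a small chain. In the Python sketch below, the 3-state chain (`nu`, `P`) is an arbitrary illustrative choice, and we take n = 2 so that the full past is (X_0, X_1, X_2).

```python
from itertools import product

# Hypothetical 3-state chain; nu and P are arbitrary illustrative choices.
nu = [0.2, 0.5, 0.3]
P = [[0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2],
     [0.5, 0.25, 0.25]]

def path_prob(path):
    """Probability of the trajectory (X0, ..., X_m) = path."""
    pr = nu[path[0]]
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

max_err = 0.0
for i0, i1, i2, j1, j2 in product(range(3), repeat=5):
    # P(X4 = j2, X3 = j1 | X2 = i2, X1 = i1, X0 = i0): condition on the
    # whole past ...
    lhs = path_prob((i0, i1, i2, j1, j2)) / path_prob((i0, i1, i2))
    # ... versus p_{i2 j1} p_{j1 j2} = P(X4 = j2, X3 = j1 | X2 = i2).
    rhs = P[i2][j1] * P[j1][j2]
    max_err = max(max_err, abs(lhs - rhs))

print(max_err)  # numerically zero
```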

Exercise 5: [Ex. 2.1.3, Brémaud] The Markov property DOES NOT imply that the past and the future are independent given ANY information on the present. Build an example of a Markov chain (X_n)_{n≥0} with values in I = {1, 2, ..., 6} such that

\[
P(X_2 = 6 \mid X_1 \in \{3, 4\}, X_0 = 2) \neq P(X_2 = 6 \mid X_1 \in \{3, 4\})\,.
\]


Solution: Let (X_n)_{n≥0} be a Markov chain taking values in E = {1, 2, 3, 4, 5, 6} with initial distribution ν = (1/6, 1/6, 1/6, 1/6, 1/6, 1/6) and transition matrix

\[
P = \begin{pmatrix}
0 & 1/2 & 0 & 1/2 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}. \tag{1}
\]

On the one hand, we compute

\begin{align*}
P(X_2 = 6 \mid X_1 \in \{3, 4\})
&= \frac{P(X_2 = 6, X_1 \in \{3, 4\})}{P(X_1 \in \{3, 4\})} \\
&= \frac{P(X_2 = 6, X_1 = 3) + P(X_2 = 6, X_1 = 4)}{P(X_1 = 3) + P(X_1 = 4)} \\
&= \frac{\sum_{i=1}^{6} P(X_2 = 6, X_1 = 3, X_0 = i) + P(X_2 = 6, X_1 = 4, X_0 = i)}{\sum_{i=1}^{6} P(X_1 = 3, X_0 = i) + P(X_1 = 4, X_0 = i)} \\
&= \frac{\sum_{i=1}^{6} \nu(i)\, p_{i3}\, p_{36} + \nu(i)\, p_{i4}\, p_{46}}{\sum_{i=1}^{6} \nu(i)\, p_{i3} + \nu(i)\, p_{i4}} \\
&= \frac{\sum_{i=1}^{6} p_{i3}}{\sum_{i=1}^{6} p_{i3} + p_{i4}} \\
&= \frac{1}{1 + 1/2} = \frac{2}{3}\,,
\end{align*}

where in the fifth equality we used that ν is uniform and that p_{36} = 1 and p_{46} = 0.
On the other hand, proceeding as above and recalling that p_{23} = 1, p_{36} = 1 and p_{24} = 0, we have

\[
P(X_2 = 6 \mid X_1 \in \{3, 4\}, X_0 = 2) = P(X_2 = 6 \mid X_1 = 3, X_0 = 2) = P(X_2 = 6 \mid X_1 = 3) = 1\,.
\]
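Both conditional probabilities can be computed exactly from ν and P. A short Python check of the two values above (states 1..6 are stored as 0-based indices 0..5, so state 6 is index 5):

```python
# Exact computation of the two conditional probabilities, with the uniform
# initial law nu and the transition matrix P of the solution; states 1..6
# are stored as 0-based indices 0..5.
nu = [1 / 6] * 6
P = [[0, 1/2, 0, 1/2, 0, 0],
     [0, 0, 1, 0, 0, 0],
     [0, 0, 0, 0, 0, 1],
     [0, 0, 0, 0, 1, 0],
     [0, 0, 0, 0, 1, 0],
     [0, 0, 0, 0, 0, 1]]

mid = (2, 3)  # indices of states 3 and 4

# P(X2 = 6 | X1 in {3, 4}): condition on the present only.
num = sum(nu[i] * P[i][j] * P[j][5] for i in range(6) for j in mid)
den = sum(nu[i] * P[i][j] for i in range(6) for j in mid)
print(num / den)  # ≈ 2/3

# P(X2 = 6 | X1 in {3, 4}, X0 = 2): add information on the past (X0 = 2).
num2 = sum(nu[1] * P[1][j] * P[j][5] for j in mid)
den2 = sum(nu[1] * P[1][j] for j in mid)
print(num2 / den2)  # 1.0
```

The two printed values differ, confirming the counterexample.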

Exercise 6: Let I be a countable state space and f : I × F → I a function, where F is a given set. Let X_0 ∈ I, and define the stochastic process {X_n}_{n≥0} by the recursive equation

\[
X_{n+1} = f(X_n, Z_{n+1})\,, \qquad \text{for } n \ge 0,
\]

where, for any n ≥ 0, Z_{n+1} ∈ F is conditionally independent of Z_1, ..., Z_n, X_0, ..., X_{n-1} given X_n, with distribution not depending on n. Show that {X_n}_{n≥0} is a Markov chain with transition probabilities

\[
p_{ij} = P(f(i, Z_1) = j \mid X_0 = i)\,, \qquad i, j \in I\,.
\]

Solution: To show that the Markov property holds, we compute

\begin{align*}
P(X_{n+1} = j &\mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) \\
&= P(f(X_n, Z_{n+1}) = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) \\
&= \frac{P(f(i, Z_{n+1}) = j, X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0)}{P(X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0)} \\
&= \frac{P(f(i, Z_{n+1}) = j, X_{n-1} = i_{n-1}, \dots, X_0 = i_0 \mid X_n = i)\, P(X_n = i)}{P(X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0)} \\
&\overset{\perp}{=} \frac{P(f(i, Z_{n+1}) = j \mid X_n = i)\, P(X_{n-1} = i_{n-1}, \dots, X_0 = i_0 \mid X_n = i)\, P(X_n = i)}{P(X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0)} \\
&= P(f(i, Z_{n+1}) = j \mid X_n = i) \\
&= P(f(i, Z_1) = j \mid X_0 = i) =: p_{ij}\,,
\end{align*}

where ⊥ marks the use of the conditional independence of Z_{n+1} from Z_1, ..., Z_n, X_0, ..., X_{n-1} given X_n, and the last equality uses that the conditional distribution of Z_{n+1} given X_n does not depend on n.
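A concrete instance of this construction can be simulated. The Python sketch below takes a hypothetical random walk on Z/5Z, i.e. f(x, z) = (x + z) mod 5 with i.i.d. steps Z_n uniform on {−1, +1} (this choice of f and of the law of Z is purely illustrative, not part of the exercise), and compares the empirical transition frequencies of the simulated chain with p_ij = P(f(i, Z_1) = j), which here equals 1/2 for j = i ± 1 mod 5 and 0 otherwise.

```python
import random

random.seed(0)  # reproducibility of the illustrative run

def f(x, z):
    """The recursion map: one step of a random walk on Z/5Z."""
    return (x + z) % 5

# Simulate X_{n+1} = f(X_n, Z_{n+1}) and count observed transitions.
N = 200_000
counts = [[0] * 5 for _ in range(5)]
x = 0
for _ in range(N):
    y = f(x, random.choice((-1, 1)))  # Z_{n+1}, i.i.d. uniform on {-1, +1}
    counts[x][y] += 1
    x = y

# Empirical transition frequencies vs the theoretical p_ij = 1/2 on the
# two neighbours of i (and 0 elsewhere).
visits = [sum(row) for row in counts]
emp = [[counts[i][j] / visits[i] for j in range(5)] for i in range(5)]
max_err = max(abs(emp[i][(i + d) % 5] - 0.5) for i in range(5) for d in (-1, 1))
print(max_err)  # small; shrinks like 1/sqrt(N)
```

The simulation only checks the stated transition probabilities, not the Markov property itself, which is what the proof above establishes.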
