Homework_Week3_solutions
Exercise 2: Consider an HMC (Xn)n≥0 with values in I = {0, 1, 2} and transition matrix

        | 0.2  0.5  0.3 |
    P = | 0.1  0.1  0.8 | .    (1)
        | 0.5  0.2  0.3 |

Draw the transition graph and compute P(X3 = 1 | X0 = 1) and P(X7 = 2 | X4 = 0).
Solution: Since P has no null entries, the transition graph is the complete graph: any pair of states is connected by an edge of the graph. In other words, it is possible to reach any state, from any starting point, in one step.
Moreover, we have

    P(X3 = 1 | X0 = 1) = Σ_{i,j=0}^{2} p1i pij pj1 = (P^3)11 = 0.307;
    P(X7 = 2 | X4 = 0) = P(X3 = 2 | X0 = 0) = Σ_{i,j=0}^{2} p0i pij pj2 = (P^3)02 = 0.405,

where the first equality in the second line uses the homogeneity of the chain.
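As a quick numerical check (a sketch in plain Python, with no external libraries), the two three-step probabilities can be read off the entries of P^3:

```python
# Verify the three-step probabilities of Exercise 2 by direct
# matrix multiplication of the 3x3 transition matrix.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.2, 0.5, 0.3],
     [0.1, 0.1, 0.8],
     [0.5, 0.2, 0.3]]

P3 = matmul(matmul(P, P), P)  # three-step transition matrix P^3

print(round(P3[1][1], 3))  # P(X3 = 1 | X0 = 1) -> 0.307
print(round(P3[0][2], 3))  # P(X3 = 2 | X0 = 0) = P(X7 = 2 | X4 = 0) -> 0.405
```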
Exercise 3: Consider N numbered balls. Some of them are contained in urn A and the others in urn B. Let Xn be the number of balls in A at time n. The number of balls in A at time n + 1 (i.e. Xn+1) is decided as follows. Choose at random a ball out of the N balls. Put the chosen ball in A with probability p and in B with probability 1 − p. Prove that (Xn)n≥0 is an HMC and compute the transition matrix of the process.
Solution: Write Xn+1 = Xn + Zn+1, where Zn+1 ∈ {−1, 0, +1} is the displacement produced by the chosen ball. Given Xn = i, the chosen ball lies in B with probability (N − i)/N and in A with probability i/N, and its destination urn is drawn independently of the past; hence the law of Xn+1 given X0, . . . , Xn depends only on Xn, i.e. (Xn)n≥0 is an HMC. Its non-zero transition probabilities are

    p_{i,i+1} = P(Zn+1 = +1 | Xn = i) = ((N − i)/N) p;
    p_{i,i−1} = P(Zn+1 = −1 | Xn = i) = (i/N)(1 − p);
    p_{i,i}   = (i/N) p + ((N − i)/N)(1 − p).
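A small sanity check of these transition probabilities (a sketch with the arbitrary illustrative values N = 4 and p = 1/3): build the matrix with exact rational arithmetic and verify it is stochastic.

```python
from fractions import Fraction

# Urn chain of Exercise 3 on states {0, ..., N} = number of balls in A.
N, p = 4, Fraction(1, 3)   # illustrative values, chosen arbitrarily

P = [[Fraction(0)] * (N + 1) for _ in range(N + 1)]
for i in range(N + 1):
    up = Fraction(N - i, N) * p          # ball chosen in B, placed in A
    down = Fraction(i, N) * (1 - p)      # ball chosen in A, placed in B
    if i < N:
        P[i][i + 1] = up
    if i > 0:
        P[i][i - 1] = down
    P[i][i] = 1 - up - down              # otherwise the count |A| is unchanged

assert all(sum(row) == 1 for row in P)   # P is a stochastic matrix
print(P[1])                               # row of transition probs from i = 1
```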
Exercise 4: Let (Zn)n≥1 be a sequence of i.i.d. geometric r.v.'s with parameter p ∈ (0, 1): that is, for k ≥ 0 we have P(Zn = k) = (1 − p)^k p. Let us consider the stochastic process (Xn)n≥0, where X0 ∈ N is a r.v. independent of the sequence (Zn)n≥1, and define Xn = max{X0, Z1, . . . , Zn} for n ≥ 1. Show that (Xn)n≥0 is a Markov chain and compute its transition matrix.
Solution: To prove that (Xn)n≥0 is an HMC it suffices to observe that Xn+1 = max{Xn, Zn+1}, where Zn+1 is independent of (X0, Z1, . . . , Zn) and hence of (X0, . . . , Xn). Therefore, for n ≥ 0 and any states i0, . . . , in−1, i, j,

    P(Xn+1 = j | X0 = i0, . . . , Xn−1 = in−1, Xn = i) = P(max{i, Zn+1} = j) =: pij,

which depends only on (i, j) and not on n or on the past, so (Xn)n≥0 is an HMC. Its transition probabilities are

    pij = 0 for j < i;
    pii = P(Zn+1 ≤ i) = 1 − (1 − p)^{i+1};
    pij = P(Zn+1 = j) = (1 − p)^j p for j > i.
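For the chain Xn = max{X0, Z1, . . . , Zn} with Zn geometric(p), the transition probabilities work out to pij = 0 for j < i, pii = P(Z ≤ i) = 1 − (1 − p)^(i+1), and pij = (1 − p)^j p for j > i. A numerical check (a sketch with the arbitrary value p = 0.4) that each row of this infinite matrix sums to 1, up to a truncated geometric tail:

```python
# Check of the transition probabilities of the max-chain of Exercise 4.
p = 0.4   # illustrative parameter, chosen arbitrarily

def p_ij(i, j):
    """Transition probability of Xn+1 = max(Xn, Zn+1) from i to j."""
    if j < i:
        return 0.0
    if j == i:
        return 1.0 - (1.0 - p) ** (i + 1)   # P(Z <= i)
    return (1.0 - p) ** j * p               # P(Z = j), j > i

# Each row sums to 1 minus the geometric tail beyond the truncation M.
M = 60
for i in range(5):
    row = sum(p_ij(i, j) for j in range(M + 1))
    assert abs(row - (1.0 - (1.0 - p) ** (M + 1))) < 1e-12
```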
Exercise 5: A mouse moves in a maze. The rooms of the maze are numbered from 1 to 5. In room 5 there is the cheese, whereas in room 3 there is a cat. The cat is lazy and remains all the time in room 3; however, if the mouse enters the room inhabited by the cat, the cat will eat the mouse. The position of the mouse at time n ≥ 0 is a Markov chain with transition matrix:

        |  0   1/2   0   1/2   0  |
        | 1/2   0   1/2   0    0  |
    P = |  0    0    1    0    0  | .    (3)
        | 1/3   0   1/3   0   1/3 |
        |  0    0    0    0    1  |
1. Draw the transition graph associated to P .
2. Is P irreducible?
3. Compute the probability that the mouse, starting from room 1, gets to the cheese without being eaten by the cat.
[Transition graph of P: 1 ↔ 2 (prob. 1/2 each way), 1 → 4 (1/2), 2 → 3 (1/2), 4 → 1, 4 → 3, 4 → 5 (1/3 each); states 3 and 5 are absorbing.]
The chain is NOT irreducible since we have 3 communicating classes: {1, 2, 4}, {3} and {5}.
Define ui = Pi(T5 < T3), where Ti = min{n ≥ 0 : Xn = i} is the first hitting time of state i ∈ {1, 2, 3, 4, 5}. Conditioning on the first step, we have

    2u1 = u2 + u4
    2u2 = u1 + u3
    u3 = 0
    3u4 = u1 + u3 + u5
    u5 = 1

The solution is u = (u1, . . . , u5) = (2/7, 1/7, 0, 3/7, 1). Therefore P1(T5 < T3) = u1 = 2/7.
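The same answer can be recovered numerically (a sketch, no external libraries) by value iteration: iterate u ← Pu while pinning the absorbing values u3 = 0 and u5 = 1.

```python
# Maze chain of Exercise 5; states 1..5 stored as indices 0..4.
P = [[0, 1/2, 0, 1/2, 0],
     [1/2, 0, 1/2, 0, 0],
     [0, 0, 1, 0, 0],
     [1/3, 0, 1/3, 0, 1/3],
     [0, 0, 0, 0, 1]]

u = [0.0, 0.0, 0.0, 0.0, 1.0]            # start from the boundary data
for _ in range(300):                     # converges geometrically fast
    u = [sum(P[i][j] * u[j] for j in range(5)) for i in range(5)]
    u[2], u[4] = 0.0, 1.0                # pin u3 = 0 and u5 = 1

print(u[0])  # -> approx 2/7 = 0.2857...
```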
Exercise 6: A cat is hunting a rat. Rat and cat move between two rooms using different paths. Their motions are independent and described by the transition matrices

    Pcat = | 0.7  0.3 | ,   Prat = | 0.4  0.6 | .    (4)
           | 0.3  0.7 |            | 0.6  0.4 |

If the cat and the rat are ever in the same room, the cat will eat the rat and the hunting comes to an end.
1. Describe the time evolution of the hunting as a Markov chain, with one of the states representing the end of the hunting (i.e. the cat and the rat are in the same room and the cat eats the rat).
2. Assume the cat is starting from room 2 and the rat from room 1. Compute the average
duration of the hunting.
Solution:
1. Let (Xn)n≥0 be the Markov chain describing the dynamics of the hunting. We set Xn ∈ {0, 1, 2}, n ≥ 0, where 0 means the hunting is over (cat and rat in the same room), 1 means the rat is in room 1 and the cat in room 2, and 2 means the rat is in room 2 and the cat in room 1.
Let us compute the transition matrix P = (pij)i,j∈{0,1,2} associated to the chain. Notice that, since 0 is a closed (absorbing) state, it holds p00 = 1 and p01 = p02 = 0.
Then, let A be the event "the rat remains in room 1 whereas the cat moves from room 2 to room 1" and B the event "the cat remains in room 2 whereas the rat moves from room 1 to room 2". Since the cat moves independently of the rat, and A and B are disjoint, we have

    p10 = P(A ∪ B) = P(A) + P(B) = 0.4 × 0.3 + 0.7 × 0.6 = 0.54.
In the same way we can compute p11 = 0.4 × 0.7 and p12 = 0.3 × 0.6, which are, respectively, the probability that the rat remains in room 1 and the cat in room 2, and the probability that the cat moves from room 2 to room 1 while, at the same time, the rat moves from room 1 to room 2.
It follows from the same computations that p20 = 0.4 × 0.3 + 0.7 × 0.6, p21 = 0.3 × 0.6
and p22 = 0.4 × 0.7. In conclusion, we have
        |           1               0         0     |
    P = | 0.4×0.3 + 0.7×0.6     0.4×0.7   0.3×0.6   | .    (5)
        | 0.4×0.3 + 0.7×0.6     0.3×0.6   0.4×0.7   |
2. Put T = inf{n ≥ 0 | Xn = 0} and E(i) = E(T | X0 = i) for i = 0, 1, 2. Using first-step analysis, and the boundary condition E(0) = 0, we obtain the system

    E(1) = p11 E(1) + p12 E(2) + 1;
    E(2) = p21 E(1) + p22 E(2) + 1.    (6)

Since p11 = p22 = 0.28 and p12 = p21 = 0.18, by symmetry E(1) = E(2) = 1/(1 − 0.28 − 0.18) = 1/0.54 = 50/27 ≈ 1.85. With the cat in room 2 and the rat in room 1 we have X0 = 1, so the average duration of the hunting is E(1) = 50/27.
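System (6) is a 2×2 linear system, so it can also be solved mechanically by Cramer's rule (a sketch, pure Python):

```python
# Mean duration of the hunting (Exercise 6): solve
#   (1 - p11) E1 - p12 E2 = 1
#   -p21 E1 + (1 - p22) E2 = 1
p11, p12 = 0.4 * 0.7, 0.3 * 0.6
p21, p22 = p12, p11                      # matrix (5) is symmetric in states 1, 2

a, b = 1.0 - p11, -p12
c, d = -p21, 1.0 - p22
det = a * d - b * c                      # determinant of the 2x2 system
E1 = (1 * d - b * 1) / det               # Cramer's rule
E2 = (a * 1 - 1 * c) / det

print(E1, E2)  # both equal 1/0.54 = 50/27 by symmetry
```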
Exercise 7: Let (Xn)n≥0 be an HMC taking values in I = {1, 2, 3, 4} with transition matrix

        | 0.2  0.3  0.5   0  |
    P = |  0   0.2  0.3  0.5 | .    (7)
        | 0.5   0   0.2  0.3 |
        | 0.3  0.5   0   0.2 |

Assume X0 = 1. Compute the probability that the chain visits state 3 before state 4.
Solution: Put H = inf{n ≥ 0 | Xn ∈ {3, 4}}. We have to compute u(1) := P(XH = 3 | X0 = 1). In general, we set u(i) := P(XH = 3 | X0 = i), for i = 1, 2, 3, 4. Notice that u(3) = 1 and u(4) = 0. Then, conditioning on the first step, we obtain the following system for u(1) and u(2):

    u(1) = p11 u(1) + p12 u(2) + p13;
    u(2) = p22 u(2) + p23   (since p21 = 0);

with solution u(1) = 49/64 ≃ 0.766 and u(2) = 0.375.
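The triangular structure of the system (p21 = 0) makes back-substitution immediate; a quick numerical check:

```python
# Exercise 7: states 1..4 stored as indices 0..3.
P = [[0.2, 0.3, 0.5, 0.0],
     [0.0, 0.2, 0.3, 0.5],
     [0.5, 0.0, 0.2, 0.3],
     [0.3, 0.5, 0.0, 0.2]]

u2 = P[1][2] / (1.0 - P[1][1])                    # (1 - p22) u(2) = p23
u1 = (P[0][1] * u2 + P[0][2]) / (1.0 - P[0][0])   # (1 - p11) u(1) = p12 u(2) + p13

print(u1, u2)  # 0.765625 and 0.375
```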
Exercise 8: The vector of mean hitting times of the set A ⊂ I, k^A = (k_i^A : i ∈ I), is the minimal non-negative solution to the system of linear equations

    k_i^A = 0                                for i ∈ A
    k_i^A = 1 + Σ_{j∉A} pij k_j^A            for i ∉ A
We now check that k^A is the minimal solution. Let us consider another solution {xi}i∈I with x ≥ 0. If i ∈ A, then xi = 0 = k_i^A. If i ∉ A, we have

    xi = 1 + Σ_{j1∉A} pij1 xj1,

and substituting the same identity for xj1,

    xi = Pi(TA ≥ 1) + Pi(TA ≥ 2) + Σ_{j1∉A} Σ_{j2∉A} pij1 pj1j2 xj2.

Iterating n times and using x ≥ 0 gives xi ≥ Pi(TA ≥ 1) + · · · + Pi(TA ≥ n); letting n → ∞ we conclude xi ≥ Σ_{n≥1} Pi(TA ≥ n) = Ei[TA] = k_i^A.
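The minimality can be seen concretely by iteration: starting from x = 0 and repeatedly applying the map x ← (0 on A; 1 + Σ_{j∉A} pij xj off A) produces a non-decreasing sequence converging to k^A from below. A sketch on a small toy chain (the chain below is an assumption for illustration, not part of the exercise):

```python
# Toy 3-state chain with A = {0}; the iterates increase monotonically
# to the minimal solution k^A of the mean-hitting-time system.
P = [[1.0, 0.0, 0.0],                    # state 0 absorbing, A = {0}
     [0.4, 0.3, 0.3],                    # illustrative transition rows
     [0.2, 0.5, 0.3]]
A = {0}

x = [0.0, 0.0, 0.0]
for _ in range(400):
    new = [0.0 if i in A else
           1.0 + sum(P[i][j] * x[j] for j in range(3) if j not in A)
           for i in range(3)]
    assert all(n >= o - 1e-12 for n, o in zip(new, x))   # monotone in n
    x = new

# The limit solves k1 = 1 + 0.3 k1 + 0.3 k2, k2 = 1 + 0.5 k1 + 0.3 k2.
print(x[1], x[2])
```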
Solution: We know that the mean hitting time ki of state 0 has to solve the system of equations

    k0 = 0;
    ki = 1 + (1/2) ki−1 + (1/2) ki+1   for i > 0.

Setting ∆ki := ki+1 − ki, the second equation rearranges to

    ∆ki = ∆ki−1 − 2,

with ∆k0 = k1 − k0 = k1. It follows that ∆ki = k1 − 2i. Since ki+1 = k0 + Σ_{n=0}^{i} ∆kn, we get

    ki+1 = (i + 1) k1 − 2 Σ_{n=1}^{i} n = (i + 1) k1 − i(i + 1) = (i + 1)(k1 − i).
Since we look for the minimal solution among the non-negative ones, and (i + 1)(k1 − i) < 0 for i > k1, no finite value of k1 yields a non-negative solution. Therefore, ki = ∞ for all i > 0.
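The divergence can be made concrete by truncating the walk to {0, . . . , N} with a reflecting barrier at N (an assumed modification, used only for illustration). One can check that ki = i(2N − i) solves the truncated system, so k1 = 2N − 1 grows without bound as the truncation level N increases, consistent with ki = ∞ on the full state space:

```python
# Truncated symmetric walk on {0, ..., N}, reflecting at N (assumed
# modification): verify the closed form ki = i(2N - i) and watch k1 grow.
def k(i, N):
    return i * (2 * N - i)               # candidate closed form

for N in (5, 10, 50):
    assert k(0, N) == 0                                      # boundary at 0
    assert k(N, N) == 1 + k(N - 1, N)                        # reflection at N
    for i in range(1, N):
        assert k(i, N) == 1 + (k(i - 1, N) + k(i + 1, N)) / 2

print([k(1, N) for N in (5, 10, 50)])  # [9, 19, 99]: k1 -> infinity with N
```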