Questions IE621
10. Three telephone companies A, B and C compete for customers. Each year A loses 5% of its
customers to B and 20% to C; B loses 15% to A and 20% to C; C loses 5% to A and 10% to B.
Write the transition matrix for this model. What is the limiting market share for each of these
companies?
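A small numerical sketch of this problem (assuming each company retains whatever share it does not lose, so the diagonal entries are one minus the stated losses): power iteration approximates the limiting shares.

```python
# Sketch for problem 10: power-iterate the yearly transition matrix.
# Assumption: diagonal entries are the complements of the stated losses.
P = [
    [0.75, 0.05, 0.20],  # A keeps 75%, loses 5% to B, 20% to C
    [0.15, 0.65, 0.20],  # B keeps 65%
    [0.05, 0.10, 0.85],  # C keeps 85%
]

def step(v, P):
    """One step of the chain: v -> vP."""
    n = len(v)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

v = [1 / 3, 1 / 3, 1 / 3]   # any starting split works
for _ in range(1000):
    v = step(v, P)

print([round(x, 4) for x in v])
```

The iterates should settle near (13/56, 11/56, 4/7) ≈ (0.232, 0.196, 0.571), which solves πP = π exactly.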
11. A basketball player makes a shot with the following probabilities: 1/2 if he has missed the last
two times; 2/3 if he has hit exactly one of his last two shots; 3/4 if he has hit both of his last
two shots. Formulate a Markov chain model for his shooting, and compute the limiting fraction
of time he hits a shot.
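One possible formulation (an assumption — other state spaces also work) tracks the last two shots, M = miss, H = hit, as a four-state chain; power iteration then gives the limiting hit fraction.

```python
# Sketch for problem 11: state = (second-to-last shot, last shot).
# Hit probabilities per state follow the problem statement.
hit = {"MM": 1/2, "MH": 2/3, "HM": 2/3, "HH": 3/4}
states = list(hit)

def step(pi):
    nxt = {s: 0.0 for s in states}
    for s, p in pi.items():
        last = s[1]
        nxt[last + "H"] += p * hit[s]        # he hits: new state (last, H)
        nxt[last + "M"] += p * (1 - hit[s])  # he misses: new state (last, M)
    return nxt

pi = {s: 1/4 for s in states}
for _ in range(1000):
    pi = step(pi)

limiting_hit_fraction = sum(pi[s] * hit[s] for s in states)
print(round(limiting_hit_fraction, 4))
```

Solving πP = π by hand gives π = (1/8, 3/16, 3/16, 1/2) in the order above, so the limiting hit fraction should come out to 11/16.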
12. (Landscape dynamics) To make a crude model of a forest we might introduce states 0 =
grass, 1 = bushes, 2 = small trees, 3 = large trees, and write down a transition matrix like
the following:
        0      1      2      3
  0 |  1/2    1/2     0      0
  1 |  1/24   7/8    1/12    0
  2 |  1/36    0     8/9    1/12
  3 |  1/8     0      0     7/8
The idea behind this matrix is that if left undisturbed a grass area will see bushes grow, then
small trees, which of course grow into large trees. However, disturbances such as tree falls or
fires can reset the system to state 0. Find the limiting fraction of land in each of the states.
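As a numerical sketch, power-iterating the matrix above approximates the limiting fraction of land in each state:

```python
# Sketch for problem 12: power-iterate the landscape matrix.
P = [
    [1/2,  1/2,  0,    0   ],  # 0 = grass
    [1/24, 7/8,  1/12, 0   ],  # 1 = bushes
    [1/36, 0,    8/9,  1/12],  # 2 = small trees
    [1/8,  0,    0,    7/8 ],  # 3 = large trees
]

def step(v):
    return [sum(v[i] * P[i][j] for i in range(4)) for j in range(4)]

v = [0.25] * 4   # any starting distribution works
for _ in range(5000):
    v = step(v)

print([round(x, 4) for x in v])
```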
13. A warehouse has a capacity to hold four items. If the warehouse is neither full nor empty,
the number of items in the warehouse changes whenever a new item is produced or an item
is sold. Suppose that (no matter when we look) the probability that the next event is "a new
item is produced" is 2/3 and that it is "a sale" is 1/3. If there is currently one item in
the warehouse, what is the probability that the warehouse will become full before it becomes
empty?
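This is a gambler's-ruin-type absorption question. A minimal numerical sketch (states 0–4, up-probability 2/3) solves the first-step equations h(i) = (2/3)h(i+1) + (1/3)h(i−1) by value iteration:

```python
# Sketch for problem 13: h(i) = P(reach 4 before 0 | currently i items).
# Boundary conditions: h(0) = 0 (empty), h(4) = 1 (full).
h = [0.0, 0.0, 0.0, 0.0, 1.0]
for _ in range(10000):
    h = [0.0] + [(2/3) * h[i + 1] + (1/3) * h[i - 1] for i in range(1, 4)] + [1.0]

print(round(h[1], 6))
```

The gambler's-ruin formula with ratio q/p = 1/2 gives h(1) = (1 − 1/2)/(1 − 1/16) = 8/15, which the iteration should reproduce.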
14. Consider a single facility where the probability that there are j arrivals in a time interval is
pj . The maximum rate of service is k per time period.
(a) Let Xn represent the number of people in the queue at the end of a period. If the initial
queue is empty, what is the Markov chain {Xn }? (i.e. what are the states and the
transition probabilities?)
(b) Give conditions on pj so that the queue is stable in the long run (Define an appropriate
notion of stability).
(c) The number in the system is announced at the beginning of a period. If there are
more than M people then, each incoming person decides whether to join or not with
probability p. Is the resulting system a Markov process?
(d) Which quantities of the queueing system would you be interested in over the long
run?
15. The (s, S) inventory system.
Two parameters are selected: S, the order-up-to level, and s, the reorder level. Demand is a
discrete random variable {Zk } in period k. If the inventory at the end of a period is greater
than s, then nothing is done. If it falls below s, an order is placed, bringing the inventory level
back up to S (after fulfilling any backlogged demand). Replenishment happens instantly. Let
{Xn } be the Markov chain representing the state of the inventory at the end of period n.
(a) Write the equation representing the evolution of the Markov chain
(i.e. {Xn+1 } in terms of {Xn } and other quantities).
(b) Write the states of the system and the transition probabilities.
(c) Explain how you would estimate (i) The average order size and (ii) The probability of
there being backlogs in the system.
16. Let a dart be thrown at a very large wall. What are the possible outcomes and what is the
sample space?
17. Let a point be picked at random in the disk of radius 1. Find the probability that it lies in
the angular sector from 0 to π/4 radians.
18. The following events refer to the infinite coin tossing model. Determine which of these events
are finite dimensional and calculate their probabilities.
20. Suppose three identical and perfectly balanced coins are tossed once. Let Ai be the event
that the ith coin lands heads. Show that the events A1 , A2 and A3 are mutually independent.
21. Suppose A, B and C are mutually independent events and P (A ∩ B) ≠ 0. Show that
P (C|A ∩ B) = P (C).
22. A machine consists of 4 components linked in parallel, so that the machine fails only if all
four components fail. Assume component failures are independent of each other. If the
components have probabilities 0.1, 0.2, 0.3, and 0.4 of failing when the machine is turned on,
what is the probability that the machine will function when turned on?
23. Suppose a point is picked at random in the unit square. If it is known that the point is in the
rectangle bounded by y = 0, y = 1, x = 0 and x = 1/2, what is the probability that the point
is in the triangle bounded by y = 0, x = 1/2 and x + y = 1?
24. In the experiment of infinite coin tossing, let Sn denote the number of occurrences of heads
in n trials, i.e., Sn = X1 + X2 + · · · + Xn . Show that

P (Sn = k) = \binom{n}{k} 2^{-n} .

Also find ESn .
26. Consider the following probability model: Ω = [0, 1]2 , P (A) = Area of A. Thus ω = (x, y),
where x, y ∈ [0, 1]. Define X(ω) = x and Y (ω) = y. Show that X and Y are independent
random variables.
27. Consider the experiment of infinite coin tossing. Let Xn be the random number obtained at
the nth toss. Show that each Xn is a random variable.
28. Let X be a real valued random variable with a p.d.f. f (x) and let Y be a non-negative
integer valued random variable with the distribution P (Y = k) = pk , k ≥ 0. Find the p.d.f. of
Z = X + Y when X and Y are independent.
29. Any point in the interval [0, 1) can be represented by its decimal expansion 0.x1 x2 x3 · · · .
Suppose a point is chosen at random from the interval. Let X be the first digit in the decimal
expansion representing the point. Compute the density of X.
30. Let Ω = [0, 1] and P be the Lebesgue measure. Define An = [0, 1/n]. Verify that An , n =
1, 2, · · · , are not independent and \sum_{n=1}^{∞} P (An ) = ∞, but P (An occurs i.o.) = P ({0}) = 0.
31. What values of x, y, z will make the following matrices transition probabilities?

         .5  .1   x              x  .1  .7
    P =   y  .2  .4        Q =  .2  .3   y
         .3   z  .1             .6   z  .2
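A small sketch: since each row of a transition matrix must sum to 1 (and all entries must be non-negative), the unknowns are forced.

```python
# Sketch for problem 31: fill each unknown with 1 minus the rest of its row.
P = [[0.5, 0.1, None], [None, 0.2, 0.4], [0.3, None, 0.1]]
Q = [[None, 0.1, 0.7], [0.2, 0.3, None], [0.6, None, 0.2]]

def fill(M):
    return [[1 - sum(e for e in row if e is not None) if e is None else e
             for e in row] for row in M]

P, Q = fill(P), fill(Q)
print(P)  # x = 0.4, y = 0.4, z = 0.6
print(Q)  # x = 0.2, y = 0.5, z = 0.2
```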
32. A red urn contains 2 red marbles and 3 blue marbles. A blue urn contains 1 red marble and 4
blue marbles. A marble is selected from an urn, the marble is returned to the urn from which
it was drawn and the next marble is drawn from the urn with the color that was drawn. Is this
a Markov chain and if so what is the transition probability matrix for this chain? Suppose
the first marble is drawn from the red urn. What is the probability that the third one will be
drawn from the blue urn?
33. A person is flipping a coin repeatedly. Let Xn be the outcome of the two previous coin flips
at time n; for example the state might be HT to indicate that the last flip was T and the
one before that was H. Compute the transition probability for this chain and the two-step
transition probability matrix.
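A sketch of the chain and its two-step matrix: one fair flip moves (a, b) to (b, H) or (b, T) with probability 1/2 each, and two fresh flips determine the new state outright, so every entry of P² should be 1/4.

```python
# Sketch for problem 33: states are the last two flips.
states = ["HH", "HT", "TH", "TT"]

# (a, b) -> (b, c) with prob 1/2 for each new flip c.
P = [[0.5 if t[0] == s[1] else 0.0 for t in states] for s in states]

# Two-step matrix P^2.
P2 = [[sum(P[i][k] * P[k][j] for k in range(4)) for j in range(4)]
      for i in range(4)]
print(P2)  # every entry 0.25
```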
34. A local rail system has just started operating. In the first month of operation, it was found
that 25% of commuters are using the system while 75% are travelling by bus. Suppose that
each month 10% of rail users go back to the bus, while 30% of bus users switch to the rail system.
Compute the three-step transition probability. What will be the fraction using the rail system
in the fourth month? What happens in the long run?
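A numerical sketch with states (rail, bus), so P = [[0.9, 0.1], [0.3, 0.7]] and the initial split is (0.25, 0.75):

```python
# Sketch for problem 34: three monthly transitions, then the long run.
P = [[0.9, 0.1], [0.3, 0.7]]   # rows/cols: rail, bus

def step(v):
    return [v[0] * P[0][0] + v[1] * P[1][0],
            v[0] * P[0][1] + v[1] * P[1][1]]

v = [0.25, 0.75]
for _ in range(3):             # months 2, 3, 4
    v = step(v)
month4_rail = v[0]
print(month4_rail)             # ≈ 0.642

for _ in range(1000):          # keep iterating to approximate the long run
    v = step(v)
long_run_rail = v[0]
print(long_run_rail)           # ≈ 0.75, the stationary rail share
```

The long-run answer can be checked by hand: πP = π forces 0.1 π_rail = 0.3 π_bus, i.e. π = (0.75, 0.25).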
35. Consider the Markov chain {Xn , n ≥ 0} with state space {0, 1, 2}, whose transition probability
matrix is
          0   1/2  1/2
    P =  1/2  1/2   0
          1    0    0
[Figure: a lattice grid with corners (−N, N) and (N, N) and origin (0, 0).]
39. Consider the variant of the above problem in which he never chooses the same direction as in
the previous step (if he took north in the previous step, he will not take north in the current
step). Formulate this stochastic process as a Markov chain.
40. In the case of problem 1, will the drunkard return to the starting point? Give a justification,
either intuitive or rigorous. Compute the expected number of steps to return to the starting
point and the probability of returning. (To ease the computation, you can take N = 1 or 2.)
41. Model the movement of a knight in chess as a Markov chain. Show that, starting from any
initial state, the knight can visit all other states before returning to the initial state with
positive probability. (Note that computing this probability may be an extremely tedious job.)
42. Suppose a virus can exist in two different strains α, β and in each generation either stays the
same, or with probability p ≪ 1/2 mutates to the other strain. Suppose the virus is in strain
α initially, what is the probability that it is in the same strain after n generations?
Find the communicating classes and check if any of them is closed. Classify the states as
absorbing, transient, and recurrent.
44. Let {Xn , n ≥ 0} be a MC with a finite state space with m states. For states i, j, if i −→ j, then
show that j can be reached from i in m steps or less (in other words, show that there exists
some r ≤ m such that p_{ij}^{(r)} > 0).
45. Suppose a communications network transmits numbers in a binary system, that is, 0s and
1s. At each stage of the network, there is probability q > 0 that the number will be passed
incorrectly to the next stage. Let Y0 be the number entered into the system and Yn be the
number recorded at stage n. What conditions are needed for Yn to be a Markov chain? What
is P{Y10 = 1 | Y0 = 1, Y8 = 0} when Yn is a Markov chain?
46. Let Xn and Yn be two independent Markov chains with same transition probability matrix
P and starting in states i and j respectively.
48. Let Xn be a Markov chain on the state space {−1, 0, 1} and suppose that pij > 0 for all i, j.
When will the absolute values of the Markov chain, i.e., |Xn |, be a Markov chain?
49. Let the n × n transition probability matrix P be symmetric. Show that π(i) = 1/n is a
stationary distribution.
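A numerical spot-check of the claim on one symmetric 3 × 3 matrix (chosen arbitrarily for illustration): a symmetric stochastic matrix is doubly stochastic, so its columns also sum to 1 and the uniform vector is stationary.

```python
# Sketch for problem 49: verify pi P = pi for the uniform pi.
P = [[0.2, 0.5, 0.3],
     [0.5, 0.4, 0.1],
     [0.3, 0.1, 0.6]]          # symmetric, rows sum to 1
n = len(P)
pi = [1 / n] * n
pi_P = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
print(pi_P)  # each entry ≈ 1/3
```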
52. Show that any sequence of independent random variables taking values in a countable set S
is a Markov chain. Under what conditions will it be homogeneous?
53. Let N (t) be a renewal process. Answer whether the following statements are true or not.
Justify your answer.
(a) N (t) < n if and only if Tn > t.
(b) N (t) ≤ n if and only if Tn ≥ t.
(c) N (t) > n if and only if Tn < t.
54. Suppose there is an urn with N balls. Each ball is coloured either red or green. In each time
period, one ball is chosen at random from the urn and with probability 1/2 is replaced with a
ball of the other color. Otherwise, the ball is returned to the urn. Let Xn denote the number of
red balls after n picks. Then
(c) How will the red and green balls be distributed in the long run?
(d) Show that the invariant probability is given by the binomial distribution

π(j) = \binom{N}{j} 2^{-N} .
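A numerical check of part (d) for a small urn, N = 4 (chosen only to keep the matrix small): pick a ball uniformly and, with probability 1/2, swap its color.

```python
# Sketch for problem 54(d): verify that the binomial pi satisfies pi P = pi.
from math import comb   # Python 3.8+

N = 4

def P(i, j):
    """Transition probability from i red balls to j red balls."""
    if j == i - 1: return (i / N) * 0.5          # a red ball turns green
    if j == i + 1: return ((N - i) / N) * 0.5    # a green ball turns red
    if j == i:     return 0.5                    # ball returned unchanged
    return 0.0

pi = [comb(N, j) * 2 ** (-N) for j in range(N + 1)]
pi_P = [sum(pi[i] * P(i, j) for i in range(N + 1)) for j in range(N + 1)]
print(pi_P)  # should equal pi
```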
55. Consider a simple random walk on the graph below. (Recall that simple random walk on a
graph is the Markov chain which at each time moves to an adjacent vertex, each adjacent
vertex having the same probability).
[Figure 2: a graph on the five vertices A, B (top row), C (middle), and D, E (bottom row).]
56. The complete graph on {1, 2, · · · , N } is the simple graph with these vertices such that any
pair of distinct points is adjacent. Let Xn denote the simple random walk on this graph and
T be the first time that the walk reaches the state 1.
(a) Give the distribution of T assuming X0 = 1. Verify that E(T ) = 1/π(1).
(b) What is E[T | X0 = 2]?
(c) Find the expected number of steps needed until every point has been visited at least once.
59. When is a continuous time stochastic process {Xt , t ≥ 0} called a continuous time Markov
chain?
Let τi denote the amount of time that the process stays in the state i before making a
transition to a different state. Show that τi is exponentially distributed.
60. Suppose A and B have N rupees each. Let a fair coin be tossed. If a head falls, Player A
receives 1 rupee from B; otherwise, Player B receives 1 rupee from A. This procedure
is continued until one of them has no money. Model this as a Markov chain and write the
transition probability.
63. Consider the stochastic process {Xn }n≥0 taking values in W = {0, 1, · · · } defined in the
following way: Let Un , n ≥ 1 be iid random variables taking their values in W∗ = {1, 2, · · · }.
Let X0 be a random variable with values in W and independent of Un , n ≥ 1. The process
{Xn } decreases by 1 every unit of time, except when it reaches the state 0. Then, if it is the
kth time it has reached 0, it jumps to the value Uk − 1.
Show that {Xn } is an irreducible Markov chain and give its transition matrix. Show that it
admits a unique stationary distribution π if and only if EU1 < ∞, and give an expression for π
in terms of the distribution of U1 . In this case what is the reversed chain?
64. Let {Xn } be a Markov chain with transition matrix P . Let {Yn } be defined by the sequence
of new values of {Xn } i.e.,
Y0 = X0 ,
Y1 = Xk where Xk ≠ X0 and Xj = X0 for all j = 0, 1, · · · , k − 1
…
Show that {Yn } is a Markov chain and find its transition matrix. Show that {Xn } is irreducible
and recurrent if and only if {Yn } is irreducible and recurrent. In the irreducible recurrent
case, express the invariant measure of {Yn } in terms of P and the invariant measure of {Xn }.
65. Consider a hesitant gambler: at each time, she flips a coin with probability p of success. If it
comes up heads, she places a fair one dollar bet. If tails, she does nothing that round, and her
fortune stays the same. If her fortune ever reaches 0 or n, she stops playing. Model this as a
Markov chain and find the transition matrix. Assuming that her initial fortune is k, find the
expected value of the time required for her fortune to arrive at either endpoint in terms
of n, k, and p.
66. Consider a Markov chain {Xn } with state space {1, 2, · · · , N } and suppose that whenever the
state is i, a reward g(i) is obtained. Let Rn be the total reward up to time n. Give an
expression for Rn . For every state i, let
mn (i) = E[Rn |X0 = i]
and
76. Let the times between the events of a renewal process N (t) be uniformly distributed on (0, 1).
Find the mean and variance of N (t) for 0 ≤ t ≤ 1.
77. Let N (t) be a Poisson process. Calculate E[N (t)N (t + s)] for s, t > 0.
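A sketch for this one: splitting N(t+s) into N(t) plus an independent increment gives E[N(t)N(t+s)] = λt + (λt)² + λ²ts. The code below checks this formula against a Monte Carlo estimate at λ = t = s = 1 (the simulation setup is an illustration, not part of the problem):

```python
# Sketch for problem 77: E[N(t)N(t+s)] via independent increments.
import random

def second_moment_product(lam, t, s):
    # E[N(t)^2] + E[N(t)] * E[N(t+s) - N(t)]
    return lam * t + (lam * t) ** 2 + lam * t * lam * s

# Monte Carlo sanity check at lam = 1, t = 1, s = 1 (formula gives 3).
random.seed(0)

def count_arrivals(lam, horizon):
    """Number of arrivals of a rate-lam Poisson process in [0, horizon]."""
    n, clock = 0, random.expovariate(lam)
    while clock <= horizon:
        n += 1
        clock += random.expovariate(lam)
    return n

samples = 200_000
acc = 0.0
for _ in range(samples):
    n_t = count_arrivals(1.0, 1.0)
    n_extra = count_arrivals(1.0, 1.0)   # independent increment on (t, t+s]
    acc += n_t * (n_t + n_extra)
print(acc / samples)  # ≈ 3
```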
78. Show that the Poisson process satisfies the Markov property. Does a renewal process satisfy
the Markov property? Explain.
79. In an airport, cabs arrive as a Poisson process with rate 2 per minute while buses arrive as an
independent Poisson process with rate 0.1 per minute. What is the chance that ten cabs
arrive before one bus?
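A sketch: for two independent Poisson processes, each successive arrival is a cab with probability 2/(2 + 0.1), independently of the past, so ten cabs before the first bus has probability (20/21)¹⁰.

```python
# Sketch for problem 79: competition between independent Poisson processes.
rate_cab, rate_bus = 2.0, 0.1
p_cab_first = rate_cab / (rate_cab + rate_bus)   # 20/21 per arrival
p_ten_cabs_before_bus = p_cab_first ** 10
print(round(p_ten_cabs_before_bus, 4))  # ≈ 0.6139
```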