Markov Analysis

Markov analysis is a probabilistic technique used to predict future states based on current probabilities of transitioning between states. It involves defining states, transition probabilities between states, and calculating steady state probabilities to determine long-term probabilities of being in each state. The document provides examples of how Markov analysis can model marketing strategies, maintenance decisions, stock markets, and more. It defines key terms like states, transition probabilities, and steady state probabilities and outlines the properties and steps of Markov analysis, demonstrating its application through decision trees and transition matrices.


Markov Analysis

In the Brand-Switching Problem
Markov Analysis

+ Like decision analysis, Markov analysis is a probabilistic technique. However, Markov analysis differs in that it does not provide a recommended decision. Instead, it provides probabilistic information about a decision situation that can aid the decision maker in making a decision.
+ It is not an optimization technique but a descriptive technique that results in probabilistic information.
Markov Analysis

+ Developed by Andrey Markov, a Russian mathematician, in 1907 to describe the movement of gas particles in a closed container.
+ In the 1950s, management scientists began to recognize that the technique could be adapted to decision situations that fit the pattern Markov described.
Markov Analysis

+ In the 1960s, the procedure came to be used to describe
+ Marketing Strategies
+ Plant Maintenance Decisions
+ Stock Market Movements
+ Accounts Receivable Projections
Markov Analysis

+ Markov analysis is used to analyze the current state and movement of a variable in order to predict the variable's future occurrences and movement, using presently known probabilities.
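To make the idea concrete, here is a minimal sketch in Python/NumPy. The two-brand transition matrix and the current market shares are purely hypothetical numbers chosen for illustration, not figures from the slides.

```python
import numpy as np

# Hypothetical two-brand switching problem (Brand A, Brand B).
# Row i holds the probabilities of moving from brand i to each brand
# in one month; every row of a transition matrix sums to 1.
P = np.array([
    [0.7, 0.3],   # Brand A customers: 70% stay with A, 30% switch to B
    [0.2, 0.8],   # Brand B customers: 20% switch to A, 80% stay with B
])

# Current state probabilities (this month's market shares, assumed).
current = np.array([0.6, 0.4])

# Next month's state probabilities: state vector times transition matrix.
next_month = current @ P
print(next_month)   # [0.5 0.5] -> predicted shares one month ahead
```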
Terms

+ Market Share – the fraction of a population that shops at a particular store or market.
+ When expressed as fractions, market shares can be used in place of state probabilities.
Terms

+ State Probability – the probability of an event occurring at a point in time, e.g., the probability that a person will purchase a product at a given store during a given month.
Terms

+ Transition Probability – the probability of moving from one state to another during one time period.
+ Transition Matrix – contains the transition probabilities for every possible state of the system.
Terms

+ State of the System – where the system is at a point in time.
+ Steady State Probabilities – the average, constant probabilities that the system will be in a state in the future.
Markov Properties

1. The transition probabilities for a given beginning state of the system sum to one.
2. The probabilities apply to all participants in the system.
3. The transition probabilities are constant over time.
4. The states are independent over time.
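A quick check of the first property, again with a hypothetical matrix: the transition probabilities out of each beginning state must sum to one.

```python
import numpy as np

# Hypothetical two-state transition matrix; rows are beginning states.
P = np.array([
    [0.7, 0.3],
    [0.2, 0.8],
])

# Property 1: the probabilities for each beginning state sum to one.
assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to one"
```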
Markov Analysis Using Decision Trees
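The decision-tree slides themselves are figures and are not reproduced here. As a sketch of the idea, the code below enumerates the branches of a two-period tree for the same hypothetical two-brand problem: each branch probability is the product of the transition probabilities along its path, and branches ending in the same state are summed.

```python
# Hypothetical one-month switching probabilities (same numbers as above).
P = {
    ("A", "A"): 0.7, ("A", "B"): 0.3,
    ("B", "A"): 0.2, ("B", "B"): 0.8,
}

start = "A"                        # assume the customer buys Brand A this month
prob_after_two = {"A": 0.0, "B": 0.0}

for first in ("A", "B"):           # brand purchased next month
    for second in ("A", "B"):      # brand purchased the month after
        branch = P[(start, first)] * P[(first, second)]
        prob_after_two[second] += branch

print(prob_after_two)              # {'A': 0.55, 'B': 0.45}
```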
Markov Analysis Using a Transition Matrix
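The transition-matrix slides are likewise figures. The sketch below (hypothetical numbers again) carries the same computation forward several periods by repeated multiplication; the state probabilities settle toward constant values, which previews the steady state.

```python
import numpy as np

# Same hypothetical two-brand transition matrix.
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])

state = np.array([1.0, 0.0])   # assume the customer starts with Brand A

# Multiplying by the transition matrix once per period gives the state
# probabilities n periods ahead; they converge toward the steady state.
for month in range(1, 9):
    state = state @ P
    print(month, state.round(4))
```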
Steady State Probabilities

+ Steady State Probabilities – are the average, constant probabilities that the system will be in a state in the future.
+ They can be computed by developing a set of equations using matrix operations and solving them simultaneously, as in the sketch below.
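A minimal sketch of that computation, assuming the same hypothetical two-brand matrix: the steady-state condition pi = pi P is rewritten as a set of simultaneous linear equations, one of which is replaced by the requirement that the probabilities sum to one.

```python
import numpy as np

# Hypothetical transition matrix; the same approach works for any size.
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])
n = P.shape[0]

# The steady state pi satisfies pi @ P = pi together with sum(pi) = 1.
# Rearranged: pi @ (P - I) = 0; replace one redundant equation with the
# normalization condition and solve the linear system.
A = (P - np.eye(n)).T
A[-1, :] = 1.0            # last equation: pi_1 + ... + pi_n = 1
b = np.zeros(n)
b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi)                 # [0.4 0.6] for this hypothetical matrix
```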
Algebraic Determination of Steady State Probabilities
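The worked algebra on this slide is shown as a figure and is not reproduced here. In general notation (assumed for illustration), for a two-state system with transition probabilities p_ij, the steady-state probabilities solve:

```latex
% Steady-state condition \pi = \pi P for a two-state transition matrix
% P = \begin{pmatrix} p_{11} & p_{12} \\ p_{21} & p_{22} \end{pmatrix},
% together with the normalization condition.
\begin{aligned}
\pi_1 &= \pi_1 p_{11} + \pi_2 p_{21} \\
\pi_2 &= \pi_1 p_{12} + \pi_2 p_{22} \\
\pi_1 + \pi_2 &= 1
\end{aligned}
% Only one of the first two equations is independent; solving it with
% the normalization gives
% \pi_1 = \frac{p_{21}}{p_{12} + p_{21}}, \qquad
% \pi_2 = \frac{p_{12}}{p_{12} + p_{21}}.
```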
Application of Steady State Probabilities

+ The steady-state probabilities indicate not only the probability of a customer trading at a particular service station in the long-term future, but also the percentage of customers who will trade at a service station during any given month in the long run. For example, if there are 3,000 customers in the community who purchase gasoline, then in the long run the following expected number will purchase gasoline at each station on a monthly basis.
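The expected numbers on the slide come from the worked example's steady-state probabilities, which are not reproduced here. As an illustration with assumed values (two stations with steady-state probabilities 0.4 and 0.6), the monthly expectation is simply each probability times the 3,000 customers:

```python
# Assumed steady-state probabilities, for illustration only.
steady_state = {"Station 1": 0.4, "Station 2": 0.6}
customers = 3000

for station, prob in steady_state.items():
    # Expected customers per month in the long run.
    print(station, int(prob * customers))   # 1200 and 1800
```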
Algebraic Determination of Steady State Probabilities – Machine Breakdown
Assignment
+ Consider the Carry-All Rental Truck Firm, which serves three states – Virginia, North Carolina, and Maryland. Trucks are rented on a daily basis and can be rented and returned in any of the three states. The transition matrix for this problem is shown below. Find the steady-state probabilities.
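A sketch of how the assignment could be set up, reusing the solver from the earlier steady-state sketch. The matrix below is only a placeholder, since the actual transition matrix appears in the slide figure; replace its rows with the given values before solving.

```python
import numpy as np

# Placeholder transition matrix over (Virginia, North Carolina, Maryland);
# substitute the values given in the assignment before solving.
P = np.array([
    [0.6, 0.2, 0.2],
    [0.3, 0.5, 0.2],
    [0.4, 0.1, 0.5],
])
n = P.shape[0]

# Solve pi @ P = pi together with sum(pi) = 1.
A = (P - np.eye(n)).T
A[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0

states = ["Virginia", "North Carolina", "Maryland"]
for name, prob in zip(states, np.linalg.solve(A, b)):
    print(f"{name}: {prob:.3f}")
```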
