Lecture 2: Probability Review
END 322E System Simulation Mehmet Ali Ergün, Ph.D.
1
Sample Space
❖ Elementary events: elements of some set
❖ Think of it as set of possible outcomes of an experiment
❖ E.g., for flipping a coin once
❖ Elementary events are H and T
❖ This set is called the sample space
❖ One more example:
❖ Suppose we throw 3 dice in sequence
❖ What are the possible outcomes? (This defines the sample space)
❖ How many possible outcomes are there?
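A minimal sketch in Python that enumerates this sample space and counts the outcomes:

```python
# Sketch: the sample space of throwing 3 dice in sequence.
from itertools import product

sample_space = list(product(range(1, 7), repeat=3))  # all ordered outcomes (d1, d2, d3)
print(len(sample_space))   # 6**3 = 216 possible outcomes
print(sample_space[:3])    # (1, 1, 1), (1, 1, 2), (1, 1, 3), ...
```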
2
Events
❖ Events are subsets of the sample space of elementary events
❖ The set of possible events must be a σ-field, i.e.:
❖ Entire space is an event
❖ Complement of any event is an event
❖ Union and intersection of two events are events
❖ Union of a countable number of events is an event
❖ Example event: all outcomes in throwing of 3 dice in which the total shown is at least 17
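A small sketch of this example event as a subset of the 3-dice sample space:

```python
# Sketch: the event "total shown by the 3 dice is at least 17".
from itertools import product

sample_space = list(product(range(1, 7), repeat=3))
event = [w for w in sample_space if sum(w) >= 17]      # subset of elementary events
print(event)                                           # (5,6,6), (6,5,6), (6,6,5), (6,6,6)
print(len(event), "of", len(sample_space), "outcomes") # 4 of 216
```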
3
Probability
❖ To each event E, we assign a real number called its probability, P(E)
❖ Axioms of probability:
❖ For each event E: 0 ≤ P(E) ≤ 1, and P(S) = 1 for the entire sample space S
❖ For a countable sequence of mutually exclusive events E₁, E₂, …, we have: P(⋃ᵢ Eᵢ) = Σᵢ P(Eᵢ)
❖ Examples:
❖ Cast of a single die: P(each face)=1/6
❖ Same die, a different event, e.g. P(face is even)?
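As a small illustration (not from the slides), the die example in Python, using additivity over mutually exclusive elementary events:

```python
# Sketch: probabilities for one cast of a fair die.
faces = range(1, 7)
P = {f: 1 / 6 for f in faces}            # P(each face) = 1/6

even = {2, 4, 6}                          # another event on the same die
print(sum(P[f] for f in even))            # mutually exclusive outcomes add up: 0.5
print(sum(P.values()))                    # P(entire sample space) = 1.0
```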
4
Conditional Probability
❖ Consider an experiment that consists of flipping a coin twice
❖ Flipping a coin twice: the sample space is {HH, HT, TH, TT}
❖ If the first flip lands on heads (H), then what is the probability that both flips land on heads?
❖ Let E be the event that both flips are H and F the event that the first flip is H; then the
probability of E conditional on F is denoted by P(E | F)
5
Conditional Probability and Independence
❖ In general: P(E | F) = P(E ∩ F) / P(F)
❖ In the previous example: P(E | F) = (1/4)/(1/2) = 1/2 ≠ P(E), because events E and F are not independent
❖ In order for E to happen, F must happen
❖ Events E and F are independent if the following holds: P(E ∩ F) = P(E) P(F)
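A quick Monte Carlo check of this example (a sketch, not from the slides):

```python
# Sketch: estimate P(both flips H | first flip H) by simulating two coin flips.
import random

random.seed(1)
N = 100_000
first_H = both_H = 0
for _ in range(N):
    f1, f2 = random.choice("HT"), random.choice("HT")
    if f1 == "H":
        first_H += 1
        if f2 == "H":
            both_H += 1

print(both_H / first_H)   # ≈ 0.50 = P(E | F)
print(both_H / N)         # ≈ 0.25 = P(E ∩ F) (= P(E), since E is contained in F)
```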
6
Random Variables (RV)
❖ When we perform experiments, we are sometimes primarily interested in a numerical quantity.
❖ Essentially a function defined on the sample space
❖ e.g., the sum of the 3 dice tosses
❖ We call such quantities random variables
❖ We need random variables for simulations, e.g.
❖ Inter-arrival times for customers/orders/vehicles
❖ Service times for customers
❖ Repair times for machines
❖ Assume their distributions, then generate them on demand
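In a simulation these inputs are sampled from assumed distributions; a minimal sketch with NumPy, where the specific distributions and parameter values are purely illustrative assumptions:

```python
# Sketch: generating simulation inputs on demand (illustrative parameter choices).
import numpy as np

rng = np.random.default_rng(seed=42)

interarrival_times = rng.exponential(scale=5.0, size=10)            # assumed mean 5 min between customers
service_times      = rng.uniform(low=2.0, high=6.0, size=10)        # assumed 2-6 min per customer
repair_times       = rng.triangular(left=1, mode=3, right=10, size=10)  # assumed expert-estimate shape

print(interarrival_times.round(2))
```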
7
Discrete RVs
❖ A random variable that can take on a finite or, at most, countably infinite number of possible values
is said to be a discrete random variable.
❖ Number of customers arriving at a shop in a day
❖ Probability mass function: p(x) = P(X = x)
❖ Since X must take on one of the values xᵢ: Σᵢ p(xᵢ) = 1
❖ Cumulative distribution function: F(x) = P(X ≤ x) = Σ_{xᵢ ≤ x} p(xᵢ)
8
Discrete RVs (Example)
❖ The Industrial Engineering Department has a lab with six computers
reserved for its students.
❖ Let X denote the number of these computers that are in use at a
particular time of day.
❖ Suppose that the probability distribution of X is as given in the
following table; the first row of the table lists the possible X values and
the second row gives the probability of each such value.
9
Discrete RVs (Example)
❖ We can now use elementary probability properties to calculate other
probabilities of interest. For example, the probability that at most 2
computers are in use: P(X ≤ 2) = p(0) + p(1) + p(2)
10
Discrete RVs (Example)
❖ The probability that at least 3 computers are in use?
❖ The probability that between 2 and 5 computers inclusive are in use?
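The pmf table itself is not reproduced in these notes; the sketch below assumes illustrative pmf values (chosen to be consistent with the mean of 3.3 computed later) and evaluates the three probabilities:

```python
# Sketch: P(X <= 2), P(X >= 3), P(2 <= X <= 5) for the lab-computer example.
# The pmf values below are ASSUMED for illustration (they reproduce the stated mean of 3.3).
pmf = {0: 0.05, 1: 0.10, 2: 0.15, 3: 0.25, 4: 0.20, 5: 0.15, 6: 0.10}

p_at_most_2   = sum(p for x, p in pmf.items() if x <= 2)        # P(X <= 2)
p_at_least_3  = 1 - p_at_most_2                                  # P(X >= 3)
p_between_2_5 = sum(p for x, p in pmf.items() if 2 <= x <= 5)    # P(2 <= X <= 5)

print(p_at_most_2, p_at_least_3, p_between_2_5)   # 0.30, 0.70, 0.75 with these values
```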
11
Continuous RVs
❖ Continuous Random Variables
❖ X = time waited at a bus station
❖ Range of X = [0, ∞)
❖ Probability density function, f(x): f(x) ≥ 0 and P(a ≤ X ≤ b) = ∫_a^b f(x) dx
❖ Cumulative distribution function: F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt
12
Continuous RVs
❖ The probability that X takes on a value in the interval [a, b] is the area above this interval and
under the graph of the density function
❖ The graph of f (x) is often referred to as the density curve.
❖ P(X = c) = 0 for any constant c
13
Random Variables (Example)
❖ The life of a laser-ray device used to inspect cracks in aircraft wings is
a continuous random variable with a pdf given by
❖ What is the probability that the life of the laser device is between 2 and
3 years?
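The pdf is not reproduced in these notes; as an assumption for illustration, the sketch below takes f(x) = ½ e^(−x/2) for x ≥ 0 (an exponential density with mean 2 years) and evaluates P(2 ≤ X ≤ 3) numerically:

```python
# Sketch: P(2 <= X <= 3) for an ASSUMED pdf f(x) = 0.5 * exp(-x/2), x >= 0.
from math import exp
from scipy.integrate import quad

f = lambda x: 0.5 * exp(-x / 2)

prob, _ = quad(f, 2, 3)              # integrate the density over [2, 3]
print(prob)                          # ≈ 0.145
print(exp(-1) - exp(-1.5))           # closed form under this assumption: e^(-1) - e^(-3/2) ≈ 0.145
```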
14
Expectation and Variance
❖ The expectation, or mean, is a measure of the central tendency of a random variable.
E(X) is given by:
❖ Discrete rv: E(X) = Σᵢ xᵢ p(xᵢ)
❖ Continuous rv: E(X) = ∫ x f(x) dx
❖ The variance V(X) is a measure of the spread of the random variable around the mean E(X),
given by:
V(X) = E[(X − E(X))²] = E(X²) − [E(X)]²
15
Expectation (Examples)
❖ The mean number of computers in use in the IE department at a given time:
E(X) = Σᵢ xᵢ p(xᵢ) = 3.3 computers
❖ The mean life (in years) of the laser-ray device is E(X) = ∫₀^∞ x f(x) dx
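Under the same illustrative assumptions used earlier (the assumed pmf table and the assumed exponential pdf), these expectations can be checked numerically:

```python
# Sketch: E(X) for the discrete lab-computer example and the continuous laser-device example.
from math import exp, inf
from scipy.integrate import quad

pmf = {0: 0.05, 1: 0.10, 2: 0.15, 3: 0.25, 4: 0.20, 5: 0.15, 6: 0.10}  # assumed values
print(sum(x * p for x, p in pmf.items()))          # 3.3 computers

f = lambda x: 0.5 * exp(-x / 2)                    # assumed laser-device pdf
mean_life, _ = quad(lambda x: x * f(x), 0, inf)
print(mean_life)                                   # 2.0 years under this assumption
```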
16
Bernoulli Distribution
❖ The simplest form of random variable.
Success/Failure
❖ Flip coin: Heads/Tails
❖ P(X = 1) = p
❖ P(X = 0) = 1 − p
❖ E[X] = p
❖ Var(X) = p(1 − p)
[Figure: pmf of a Bernoulli random variable, P(X = x) for x ∈ {0, 1}]
17
Binomial Distribution
❖ The number of successes in n Bernoulli trials.
Or the sum of n Bernoulli random variables.
❖ Number of heads/tails out of n flips
❖ P(X = x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, …, n, where C(n, x) = n!/(x!(n − x)!)
❖ E[X] = np
❖ Var(X) = np(1 − p)
[Figure: pmf of a binomial random variable, P(X = x) for x = 0, 1, …, 10]
18
Binomial Example
❖ An airline knows that 5 percent of the people making reservations on a certain flight
will not show up. Consequently their policy is to sell 52 tickets for a flight that can hold
only 50 passengers.
❖ Then p = P(a given passenger shows up) = .95,
X = the number of passengers who show up for the flight,
X ~ Bin(52,.95).
❖ What is the probability that there will be a seat for every passenger who shows up?
19
Binomial Example
❖ But that is too much calculation. Solution: use the axioms of probability and take the complement:
P(X ≤ 50) = 1 − P(X = 51) − P(X = 52)
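A quick check of this computation with SciPy (a sketch; the numbers follow from X ~ Bin(52, .95)):

```python
# Sketch: P(X <= 50) for X ~ Bin(52, 0.95) -- probability everyone who shows up gets a seat.
from scipy.stats import binom

n, p = 52, 0.95
direct     = binom.cdf(50, n, p)                             # the CDF does the summation for us
complement = 1 - binom.pmf(51, n, p) - binom.pmf(52, n, p)   # the slide's shortcut
print(direct, complement)                                    # both ≈ 0.741
```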
20
Geometric Distribution
❖ The number of Bernoulli trials required to get the first success.
P(X = x) = p(1 − p)^(x−1), x = 1, 2, …
E[X] = 1/p
Var(X) = (1 − p)/p²
21
Geometric Distribution - Example
❖ Starting at a fixed time, we observe the gender of each newborn child at a certain hospital until
a boy (B) is born. Let p = P(B), assume that successive births are independent, and define the RV X as the number
of births observed until the first boy is born.
❖ Assume p = P(a newborn is a boy) = .49, so
X = the number of births until the first boy,
X ~ Geom(.49).
❖ The probability that 3 births are observed until the first boy?
P(X = 3) = (.51)²(.49) ≈ .127
❖ Expected number of births until a boy is born: E[X] = 1/p = 1/.49 ≈ 2.04
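The same numbers via SciPy (a sketch; scipy.stats.geom counts trials up to and including the first success, matching the definition above):

```python
# Sketch: X ~ Geom(0.49), number of births up to and including the first boy.
from scipy.stats import geom

p = 0.49
print(geom.pmf(3, p))   # P(X = 3) = (0.51)**2 * 0.49 ≈ 0.127
print(geom.mean(p))     # E[X] = 1/p ≈ 2.04
```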
22
Poisson Distribution
❖ The number of random events occurring in a fixed interval of time
❖ Random batch sizes
❖ Number of defects on an area of material
❖ P(X = x) = e^(−μ) μ^x / x!, x = 0, 1, 2, …
❖ E[X] = μ, Var(X) = μ
[Figure: pmf of a Poisson random variable, P(X = x) for x = 0, 1, …, 10]
23
Poisson Example
❖ Let X denote the number of creatures of a particular type captured in a trap
during a given time period. Suppose that X has a Poisson distribution with μ = 4.5,
so on average traps will contain 4.5 creatures.
❖ The probability that a trap contains exactly five creatures is P(X = 5) = e^(−4.5)(4.5)^5 / 5! ≈ .1708
❖ The probability that a trap has at most five creatures is
P(X ≤ 5) = Σ_{x=0}^{5} e^(−4.5)(4.5)^x / x! = 0.7029
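A quick check with SciPy (a sketch of the same two calculations):

```python
# Sketch: trap counts X ~ Poisson(4.5).
from scipy.stats import poisson

mu = 4.5
print(poisson.pmf(5, mu))   # P(X = 5)  ≈ 0.1708
print(poisson.cdf(5, mu))   # P(X <= 5) ≈ 0.7029
```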
24
Exponential Distribution
❖ Model times between events
❖ Times between arrivals, failures
❖ Times to repair, service times
❖ Probability density: f(x) = λ e^(−λx), x ≥ 0
❖ Cumulative distribution: F(x) = 1 − e^(−λx), x ≥ 0
❖ E[X] = 1/λ, Var(X) = 1/λ²
❖ Memoryless: P(X > x + y | X > y) = P(X > x) = 1 − F(x)
[Figure: exponential density curve f(x), 0 ≤ x ≤ 10]
25
Exponential Example
❖ Suppose that calls are received at a 24-hour “suicide hotline” according
to a Poisson process with rate λ = 0.5 call per day.
❖ Then the number of days X between successive calls has an exponential
distribution with parameter value 0.5, so the probability that more than
2 days elapse between calls is
P(X > 2) = 1 − P(X ≤ 2) = 1 − (1 − e^(−(0.5)(2))) = e^(−1) ≈ 0.368
❖ The expected time between calls is E[X] = 1/λ = 1/0.5 = 2 days
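The same result with SciPy (a sketch; SciPy parameterizes the exponential by scale = 1/λ):

```python
# Sketch: X ~ Exponential(lambda = 0.5 calls per day).
from scipy.stats import expon

lam = 0.5
print(expon.sf(2, scale=1/lam))   # P(X > 2) = e^(-1) ≈ 0.368
print(expon.mean(scale=1/lam))    # E[X] = 2 days
```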
26
Normal Distribution
❖ The distribution of the average of iid random variables is approximately normal for large samples (Central Limit Theorem)
❖ Distribution of heights
❖ f(x) = (1/√(2πσ²)) e^(−(x−μ)²/(2σ²)), −∞ < x < ∞
❖ E[X] = μ
❖ Var(X) = σ²
[Figure: normal density curve f(x)]
27
Normal Distribution
❖ Left-tail probability: P(X ≤ x) = P(Z ≤ (x − μ)/σ) = φ((x − μ)/σ)
❖ Check φ(·) from the standard normal table
❖ Central Limit Theorem: if X₁, …, Xₙ are independent and identically distributed random variables with mean μ and variance σ²,
then (X̄ − μ)/(σ/√n) is approximately N(0, 1) for large n
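A small simulation illustrating the CLT (a sketch; averaging Uniform(0, 1) variables is just one convenient choice):

```python
# Sketch: averages of n iid Uniform(0,1) variables look approximately normal.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 30, 100_000
means = rng.random((reps, n)).mean(axis=1)       # 100,000 sample averages

mu, sigma = 0.5, np.sqrt(1 / 12)                 # mean and std of Uniform(0, 1)
z = (means - mu) / (sigma / np.sqrt(n))          # standardized averages
print(z.mean(), z.std())                         # ≈ 0 and ≈ 1
print(np.mean(np.abs(z) <= 1.96))                # ≈ 0.95, as for a standard normal
```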
28
Normal Distribution Example
❖ The time that it takes a driver to react to the brake lights on a decelerating
vehicle is critical in helping to avoid rear-end collisions.
❖ The article “Fast-Rise Brake Lamp as a Collision-Prevention Device”
(Ergonomics, 1993: 391–395) suggests
that reaction time for an in-traffic response to a brake signal from standard
brake lights can be modeled with a normal distribution having mean value
1.25 sec and standard
deviation of .46 sec.
29
Normal Distribution Example
❖ What is the probability that reaction time is between 1.00 sec and 1.75 sec? If
we let X denote reaction time, then standardizing gives
P(1.00 ≤ X ≤ 1.75) = P((1.00 − 1.25)/.46 ≤ Z ≤ (1.75 − 1.25)/.46)
= P(−.54 ≤ Z ≤ 1.09) = φ(1.09) − φ(−.54)
= .8621 − .2946 = .5675
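The same probability without the standard normal table, using SciPy (a sketch):

```python
# Sketch: reaction time X ~ N(1.25, 0.46**2); P(1.00 <= X <= 1.75).
from scipy.stats import norm

mu, sigma = 1.25, 0.46
print(norm.cdf(1.75, mu, sigma) - norm.cdf(1.00, mu, sigma))   # ≈ 0.568 (table value .5675)
```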
30
Triangular Distribution
❖ Used in situations where there is little or no data.
❖ Just requires the minimum (a), maximum (b), and most likely (m) value.
f(x) = 2(x − a)/((m − a)(b − a)),  a ≤ x ≤ m
     = 2(b − x)/((b − m)(b − a)),  m < x ≤ b
     = 0,                          otherwise
E[X] = (a + m + b)/3
Var(X) = (a² + m² + b² − am − ab − mb)/18
[Figure: triangular density curve f(x)]
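A sketch that samples a triangular input and checks the sample moments against these formulas (the parameter values a = 2, m = 4, b = 9 are illustrative assumptions):

```python
# Sketch: sample a Triangular(a=2, m=4, b=9) input and compare with (a+m+b)/3.
import numpy as np

a, m, b = 2.0, 4.0, 9.0                      # assumed min, mode, max
rng = np.random.default_rng(0)
x = rng.triangular(a, m, b, size=100_000)

print(x.mean(), (a + m + b) / 3)                                 # both ≈ 5.0
print(x.var(), (a**2 + m**2 + b**2 - a*m - a*b - m*b) / 18)      # both ≈ 2.17
```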
31