
Basic Probability

Why Probability

The following situations provide examples of the role of uncertainty in our lives and in a business context:
- Investment counselors cannot be sure which of two stocks will deliver the better growth over the coming year.
- Engineers try to reduce the likelihood that a machine will break down.
- Marketers may be uncertain as to the effectiveness of an ad campaign or the eventual success of a new product.
Why Probability
- Product manufacturers and system designers need testing methods that will assess various aspects of reliability. When products have long lifetimes, such testing is time consuming, so we need "accelerated" testing methods.
- Inventory management
Basic Concepts
A random experiment is a process leading to at least two possible outcomes, with uncertainty as to which will occur.
- A coin is tossed.
- A consumer is asked which of two products he or she prefers.
Sample Spaces
The sample space is the collection of all possible outcomes.
e.g.: Examine three fuses in sequence and note the result of each examination (N = non-defective, D = defective); an outcome for the entire experiment is then any sequence of Ns and Ds of length 3.
Sample space S = {NNN, NND, NDN, NDD, DNN, DND, DDN, DDD}
Events and Sample Spaces
An event is any collection (subset) of outcomes contained in the sample space S. An event is said to be simple if it consists of exactly one outcome and compound if it consists of more than one outcome.
Joint event: two events occurring simultaneously, e.g. male and age over 20.
Intersection: A and B, written A ∩ B
Union: A or B, written A ∪ B
Unions and Intersections
Compound events are made of two or more other events.
- Union (A ∪ B): either A or B, or both, occur.
- Intersection (A ∩ B): both A and B occur.
Event Properties
Mutually exclusive: two events that cannot occur at the same time.
E.g. on a single coin flip, head and tail cannot both occur.
Collectively exhaustive: at least one of the events in the sample space must occur.
E.g. male or female.
Special Events
Null event: an event that cannot occur, e.g. a club and a diamond on one card draw.
Complement of an event: for event A, all outcomes not in A, written A' or Ā.
What is Probability?
- Probability focuses on a systematic study of randomness and uncertainty.
- It provides methods for quantifying the chances, or likelihoods, associated with the various outcomes.
- Probability is a numerical measure of the likelihood that an event (simple or compound) will occur.
- It lies between 0 and 1 on a scale from 0 (impossible) through .5 to 1 (certain): 0 ≤ pi ≤ 1 for each simple event.
- The probabilities of all the simple events sum to 1: Σ pi = 1.
Concept of Probability
A priori (classical) probability: the probability of success is based on prior knowledge of the process involved.
e.g. the chance of picking a black card from a deck of cards.
Empirical (classical) probability: the outcomes are based on observed data, not on prior knowledge of a process.
e.g. the chance that an individual selected at random from an employee survey is satisfied with his or her job.
Concept of Probability
Subjective probability: the chance of occurrence assigned to an event by a particular individual, based on his/her experience, personal opinion and analysis of a particular situation.
e.g. the chance that a newly designed style of mobile phone will be successful in the market.
Computing Probabilities
The probability of an event E:

P(E) = (number of event outcomes) / (total number of possible outcomes in the sample space) = X / T

assuming each of the outcomes in the sample space is equally likely to occur.
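A minimal Python sketch of this equally-likely counting rule, applied to the three-fuse sample space listed earlier (the event chosen and the helper name prob are illustrative, not from the slides):

    # Equally-likely counting rule: P(E) = |E| / |S|
    from itertools import product

    sample_space = {''.join(seq) for seq in product('ND', repeat=3)}
    # {'NNN', 'NND', 'NDN', 'NDD', 'DNN', 'DND', 'DDN', 'DDD'}

    def prob(event):
        """P(E) = (number of outcomes in E) / (total outcomes in S)."""
        return len(event & sample_space) / len(sample_space)

    exactly_one_defective = {'NND', 'NDN', 'DNN'}
    print(prob(exactly_one_defective))   # 0.375 (= 3/8)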
Presenting Probability & Sample Space
1. Listing, e.g. S = {Head, Tail}
2. Venn diagram
3. Tree diagram
4. Contingency table
Visualizing Events
Contingency table for a full deck of 52 cards:

          Ace   Not Ace   Total
Black      2      24        26
Red        2      24        26
Total      4      48        52

Tree diagram: starting from the full deck of 52 cards, branch first on colour (Black or Red), then on whether the card is an ace; the leaves (2, 24, 2, 24) enumerate the sample space.
Joint Probability Using Contingency Table

Event     B1            B2            Total
A1        P(A1 ∩ B1)    P(A1 ∩ B2)    P(A1)
A2        P(A2 ∩ B1)    P(A2 ∩ B2)    P(A2)
Total     P(B1)         P(B2)         1

The cell entries are joint probabilities; the row and column totals are marginal (simple) probabilities.
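A short Python sketch of this layout, using the ace/colour counts from the contingency table above; the dictionary structure is just one illustrative way to hold the cells:

    # Joint probabilities from a table of counts, plus marginal probabilities.
    counts = {
        ('Black', 'Ace'): 2, ('Black', 'Not Ace'): 24,
        ('Red',   'Ace'): 2, ('Red',   'Not Ace'): 24,
    }
    total = sum(counts.values())                                  # 52
    joint = {cell: n / total for cell, n in counts.items()}       # P(colour and face)

    p_black = sum(p for (colour, _), p in joint.items() if colour == 'Black')
    p_ace   = sum(p for (_, face), p in joint.items() if face == 'Ace')
    print(round(joint[('Black', 'Ace')], 4), round(p_black, 4), round(p_ace, 4))
    # 0.0385 0.5 0.0769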
Visualizing Events
Venn diagrams: let A = aces and B = red cards.
- A ∩ B = ace and red
- A ∪ B = ace or red
Compound Probability Addition Rule
1. Used to get compound probabilities for the union of events.
2. P(A or B) = P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
3. For mutually exclusive events: P(A or B) = P(A ∪ B) = P(A) + P(B)
4. Probability of the complement: P(A) + P(Ā) = 1, so P(Ā) = 1 - P(A)
Addition Rule: Example
A hamburger chain found that 75% of all customers use mustard, 80% use ketchup, and 65% use both. What is the probability that a particular customer will use at least one of these?
A = customer uses mustard
B = customer uses ketchup
A ∪ B = customer uses at least one of these
Given P(A) = .75, P(B) = .80, and P(A ∩ B) = .65:
P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = .75 + .80 - .65 = .90
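A quick Python check of this calculation (variable names are illustrative):

    # Addition rule and complement rule for the mustard/ketchup example.
    p_a, p_b, p_a_and_b = 0.75, 0.80, 0.65

    p_a_or_b = p_a + p_b - p_a_and_b       # P(A or B) = P(A) + P(B) - P(A and B)
    p_not_a = 1 - p_a                      # P(A') = 1 - P(A)
    print(round(p_a_or_b, 2), p_not_a)     # 0.9 0.25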
Computing Conditional Probabilities
A conditional probability is the probability of one event, given that another event has occurred:

P(A | B) = P(A and B) / P(B)   (the conditional probability of A given that B has occurred)
P(B | A) = P(A and B) / P(A)   (the conditional probability of B given that A has occurred)

where P(A and B) = joint probability of A and B, P(A) = marginal probability of A, and P(B) = marginal probability of B.
Conditional Probability Example
Of the cars on a used car lot, 70% have air conditioning (AC) and 40% have a CD player (CD). 20% of the cars have both.
What is the probability that a car has a CD player, given that it has AC?
i.e., we want to find P(CD | AC).
Conditional Probability Example (continued)
Of the cars on a used car lot, 70% have air conditioning (AC) and 40% have a CD player (CD). 20% of the cars have both.

          CD    No CD   Total
AC        0.2    0.5     0.7
No AC     0.2    0.1     0.3
Total     0.4    0.6     1.0

P(CD | AC) = P(CD and AC) / P(AC) = 0.2 / 0.7 ≈ 0.2857
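A short Python sketch of the same calculation, reading the two needed probabilities straight from the table above:

    # Conditional probability: P(CD | AC) = P(CD and AC) / P(AC)
    p_ac = 0.7             # marginal probability P(AC)
    p_cd_and_ac = 0.2      # joint probability P(CD and AC)

    p_cd_given_ac = p_cd_and_ac / p_ac
    print(round(p_cd_given_ac, 4))   # 0.2857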
Multiplication Rule
1. Used to get joint probabilities for the intersection of events (joint events).
2. P(A and B) = P(A ∩ B) = P(A) P(B|A) = P(B) P(A|B)
3. For independent events: P(A and B) = P(A ∩ B) = P(A) P(B)
Bayes' Theorem
1. Permits revising old probabilities based on new information.
2. An application of conditional probability.
3. Requires mutually exclusive (and exhaustive) events.
(Flow: prior probability → new information → apply Bayes' theorem → revised probability)
Bayes' Theorem Formula
The computation of a posterior probability P(Bi | A) from given prior probabilities P(Bi) and conditional probabilities P(A | Bi).
Given k mutually exclusive and exhaustive events B1, B2, ..., Bk, and an observed event A, then

P(Bi | A) = P(Bi ∩ A) / P(A)
          = P(Bi) P(A | Bi) / [P(B1) P(A | B1) + P(B2) P(A | B2) + ... + P(Bk) P(A | Bk)]
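A minimal Python sketch of this formula for k events; the function name bayes_posteriors and the two-event check values are illustrative only:

    # Bayes' theorem for mutually exclusive and exhaustive events B1..Bk.
    def bayes_posteriors(priors, likelihoods):
        """priors[i] = P(Bi), likelihoods[i] = P(A | Bi); returns P(Bi | A) for each i."""
        joints = [p * l for p, l in zip(priors, likelihoods)]   # P(Bi) P(A | Bi)
        p_a = sum(joints)                                       # total probability of A
        return [j / p_a for j in joints]

    # Two-event check: priors 0.5/0.5, likelihoods 0.4/0.1 -> posteriors 0.8/0.2
    print([round(p, 3) for p in bayes_posteriors([0.5, 0.5], [0.4, 0.1])])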
Bayes' Theorem Example
- A drilling company has estimated a 40% chance of striking oil for their new well.
- A detailed test has been scheduled for more information. Historically, 60% of successful wells have had detailed tests, and 20% of unsuccessful wells have had detailed tests.
- Given that this well has been scheduled for a detailed test, what is the probability that the well will be successful?
Bayes' Theorem Example (continued)
- Let S = successful well, U = unsuccessful well
- P(S) = 0.4, P(U) = 0.6 (prior probabilities)
- Define the detailed test event as D
- Conditional probabilities: P(D|S) = 0.6, P(D|U) = 0.2
- Goal is to find P(S|D)
Bayes' Theorem Example (continued)
Apply Bayes' Theorem:

P(S | D) = P(D | S) P(S) / [P(D | S) P(S) + P(D | U) P(U)]
         = (0.6)(0.4) / [(0.6)(0.4) + (0.2)(0.6)]
         = 0.24 / (0.24 + 0.12)
         = 0.667

So the revised probability of success, given that this well has been scheduled for a detailed test, is 0.667.
Bayes' Theorem Example (continued)
Given the detailed test, the revised probability of a successful well has risen to 0.667 from the original estimate of 0.4.

Event              Prior Prob.   Conditional Prob.   Joint Prob.          Revised Prob.
S (successful)        0.4             0.6            (0.4)(0.6) = 0.24    0.24/0.36 = 0.667
U (unsuccessful)      0.6             0.2            (0.6)(0.2) = 0.12    0.12/0.36 = 0.333
                                                     Sum = 0.36
Bayes' Theorem Example
Fifty percent of borrowers repaid their loans. Of those who repaid, 40% had a college degree. Ten percent of those who defaulted had a college degree. What is the probability that a randomly selected borrower who has a college degree will repay the loan?
B1 = repay, B2 = default, A = college degree
P(B1) = .5, P(A|B1) = .4, P(A|B2) = .1, P(B1|A) = ?

P(B1 | A) = P(A | B1) P(B1) / [P(A | B1) P(B1) + P(A | B2) P(B2)]
          = (.4)(.5) / [(.4)(.5) + (.1)(.5)]
          = .2 / .25 = .8
Bayes' Theorem Example: Table Solution

Event Bi        Prior P(Bi)   Cond. P(A|Bi)   Joint P(Bi ∩ A)   Posterior P(Bi|A)
B1 (repay)         .5             .4          .5 × .4 = .20     .20/.25 = .8
B2 (default)       .5             .1          .5 × .1 = .05     .05/.25 = .2
Total             1.0                         P(A) = .25         1.0
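A small Python sketch reproducing the table solution above column by column (the dictionary keys are illustrative labels):

    # Prior, conditional, joint and posterior probabilities for the loan example.
    priors      = {'repay': 0.5, 'default': 0.5}   # P(Bi)
    likelihoods = {'repay': 0.4, 'default': 0.1}   # P(A | Bi), A = college degree

    joints = {b: priors[b] * likelihoods[b] for b in priors}   # P(Bi and A)
    p_a = sum(joints.values())                                 # P(A) = 0.25
    posteriors = {b: round(joints[b] / p_a, 2) for b in joints}

    print(round(p_a, 2), posteriors)   # 0.25 {'repay': 0.8, 'default': 0.2}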


Permutation and Combination
Counting Rule 1
If any one of n different mutually exclusive and collectively exhaustive events can occur on each of r trials, the number of possible outcomes is equal to n · n · ... · n = n^r.
Counting Rule 2
The number of ways that all n objects can be arranged in order is n(n - 1)(n - 2)···(2)(1) = n!, where n! is called n factorial and 0! is defined as 1.
Permutation and Combination
Counting Rule 3: Permutation
Example: What is the number of ways of arranging 3 books selected from 5 books in order? (5)(4)(3) = 60
The number of ways of arranging r objects selected from n objects in order is:

nPr = n! / (n - r)!
Permutation and Combination
Counting Rule 4: Combination
Example: The number of combinations of 3 books selected from 5 books is (5)(4)(3)/[(3)(2)(1)] = 10
The number of ways of selecting r objects from n objects, irrespective of order, is:

nCr = (n choose r) = n! / [r!(n - r)!]
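The counting formulas above are available directly in Python's standard math module (math.perm and math.comb need Python 3.8+); a quick sketch with the book examples:

    # Counting rules 2-4 with the standard library.
    import math

    print(math.factorial(5))   # Rule 2: 5! = 120 ordered arrangements of 5 objects
    print(math.perm(5, 3))     # Rule 3: 5P3 = 5!/(5-3)! = 60
    print(math.comb(5, 3))     # Rule 4: 5C3 = 5!/(3!(5-3)!) = 10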
Random Variable
- A random variable is a variable that assumes numerical values associated with the random outcome of an experiment, where one (and only one) numerical value is assigned to each sample point.
- A discrete random variable can assume a countable number of values (obtained by counting); it can take on only certain values along an interval, with the possible values having gaps between them.
- e.g. the number of steps to the top of a tower.
Random Variable
- A continuous random variable can assume any value along a given interval of a number line.
- e.g. the time a tourist stays at the top once s/he gets there, or the exact temperature outside.
Discrete Probability Distribution Example
Event: Toss 2 coins. Count the number of tails.

Probability distribution:
Value    Probability
0        1/4 = .25
1        2/4 = .50
2        1/4 = .25
Discrete Probability Distribution Example
Six batches of components are ready to be shipped by a supplier. The number of defective components in each batch is as follows:

Batch                  #1  #2  #3  #4  #5  #6
Number of defectives    0   2   0   1   2   0

P(0) = P(batch 1, 3, or 6) = 3/6 = 0.500
P(1) = P(batch 4) = 1/6 = 0.167
P(2) = P(batch 2 or 5) = 2/6 = 0.333
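A short Python sketch deriving the same distribution from the batch data (variable names are illustrative):

    # Empirical probability distribution of the number of defectives per batch.
    from collections import Counter

    defectives = [0, 2, 0, 1, 2, 0]          # batches #1..#6
    counts = Counter(defectives)
    pmf = {x: round(counts[x] / len(defectives), 3) for x in sorted(counts)}
    print(pmf)   # {0: 0.5, 1: 0.167, 2: 0.333}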
Probability Distributions for Discrete Random Variables
- The probability distribution (probability mass function) of a discrete random variable is a graph, table or formula that specifies the probability associated with each possible outcome the random variable can assume.
- It is a list of all possible [Xj, p(Xj)] pairs, where Xj = value of the random variable and p(Xj) = probability associated with that value.
- p(x) ≥ 0 for all values of x, and the p(x) sum to 1.
Probability Distributions for Discrete Random Variables
Say a random variable x follows this pattern: p(x) = (.3)(.7)^(x-1) for x = 1, 2, 3, ...
This table gives the probabilities (rounded to two digits) for x between 1 and 10:

x      1    2    3    4    5    6    7    8    9    10
p(x)  .30  .21  .15  .10  .07  .05  .04  .02  .02  .01
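A quick Python sketch reproducing this table from the formula:

    # p(x) = (.3)(.7)**(x - 1) for x = 1..10, rounded to two digits.
    for x in range(1, 11):
        print(x, round(0.3 * 0.7 ** (x - 1), 2))
    # 1 0.3, 2 0.21, 3 0.15, 4 0.1, ..., 10 0.01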
Expected Values of Discrete Random Variables
The mean, or expected value, of a discrete random variable x is

μ = E(x) = Σ x p(x)
Expected Values of Discrete Random Variables
The variance of a discrete random variable x is

σ² = E[(x - μ)²] = Σ (x - μ)² p(x)

The standard deviation of a discrete random variable x is

σ = √(E[(x - μ)²]) = √(Σ (x - μ)² p(x))
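A minimal Python sketch of these formulas, using the two-coin "number of tails" distribution from earlier as the example:

    # Mean, variance and standard deviation of a discrete random variable.
    import math

    pmf = {0: 0.25, 1: 0.50, 2: 0.25}   # x -> p(x)

    mu = sum(x * p for x, p in pmf.items())                  # mu = E(x)
    var = sum((x - mu) ** 2 * p for x, p in pmf.items())     # sigma squared
    sigma = math.sqrt(var)
    print(mu, var, round(sigma, 4))                          # 1.0 0.5 0.7071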
Important Discrete Probability Distributions
Discrete probability distributions covered here: Binomial, Hypergeometric, Poisson.
The Binomial Distribution
A binomial random variable:
- n identical trials
- Two outcomes: Success or Failure
- P(S) = p; P(F) = q = 1 - p
- Trials are independent
- x is the number of Successes in n trials
The Binomial Distribution
A binomial random variable, illustrated by flipping a coin 3 times:
- n identical trials: flip the coin 3 times
- Two outcomes, Success or Failure: outcomes are Heads or Tails
- P(S) = p; P(F) = q = 1 - p: P(H) = .5; P(T) = 1 - .5 = .5
- Trials are independent: a head on flip i doesn't change P(H) on flip i + 1
- x is the number of Successes in the n trials
Possible Binomial Distribution Settings
- A manufacturing plant labels items as either defective or acceptable.
- A firm bidding for contracts will either get a contract or not.
- A marketing research firm receives survey responses of "yes I will buy" or "no I will not".
- New job applicants either accept the offer or reject it.
The Binomial Distribution
The binomial probability distribution:
- p = P(S) on a single trial
- q = 1 - p
- n = number of trials
- x = number of successes

P(x) = (n choose x) p^x q^(n-x) = [n! / (x!(n - x)!)] p^x (1 - p)^(n-x)
The Binomial Distribution
The binomial probability distribution:

P(x) = (n choose x) p^x q^(n-x)

where (n choose x) is the number of ways of getting the desired results, p^x is the probability of getting the required number of successes, and q^(n-x) is the probability of getting the required number of failures.
Example: Calculating a Binomial Probability
What is the probability of one success in five observations if the probability of success is .1?
x = 1, n = 5, and p = 0.1

P(x = 1) = [n! / (x!(n - x)!)] p^x (1 - p)^(n-x)
         = [5! / (1!(5 - 1)!)] (0.1)^1 (1 - 0.1)^(5-1)
         = (5)(0.1)(0.9)^4
         = 0.32805
The Binomial Distribution
A binomial random variable has
Mean: μ = np
Variance: σ² = npq
Standard deviation: σ = √(npq)
The Poisson Distribution
Evaluates the probability of a (usually small) number of occurrences out of many opportunities in a ...
- Period of time
- Area
- Volume
- Weight
- Distance
- Other units of measurement
The Poisson Distribution

P(x) = λ^x e^(-λ) / x!

- λ = mean number of occurrences in the given unit of time, area, volume, etc.
- e = 2.71828...
- μ = λ and σ² = λ
- x = number of successes per unit
The Poisson Distribution
Say in a given stream there are an average of 3 striped trout per 100 yards. What is the probability of seeing 5 striped trout in the next 100 yards, assuming a Poisson distribution?

P(x = 5) = λ^5 e^(-λ) / 5! = 3^5 e^(-3) / 5! ≈ .1008
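The same Poisson calculation as a short Python sketch (the function name poisson_pmf is illustrative):

    # P(x) = lam**x * exp(-lam) / x!
    import math

    def poisson_pmf(x, lam):
        return lam ** x * math.exp(-lam) / math.factorial(x)

    print(round(poisson_pmf(5, 3), 4))   # 0.1008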
The Hypergeometric Distribution
- In the binomial situation, each trial was independent, e.g. drawing cards from a deck and replacing the drawn card each time.
- If the card is not replaced, each trial depends on the previous trial(s).
- The hypergeometric distribution can be used in this case.
The Hypergeometric Distribution
- Randomly draw n elements from a set of N elements, without replacement. Assume there are r successes and N - r failures in the N elements.
- The hypergeometric random variable is the number of successes, x, drawn from the r available in the n selections.
The Hypergeometric Distribution

P(x) = (r choose x)(N - r choose n - x) / (N choose n)

where
N = the total number of elements (population size)
r = number of successes in the N elements (successes in the population)
n = number of elements drawn (sample size)
x = the number of successes in the n elements drawn (successes in the sample)
The Hypergeometric Distribution

P(x) = (r choose x)(N - r choose n - x) / (N choose n)

μ = nr / N
σ² = r(N - r) n(N - n) / [N²(N - 1)]
Hypergeometric Distribution Function
e.g.: Three light bulbs were selected from ten. Of the ten, four were defective. What is the probability that two of the three selected are defective?
N = 10, n = 3, r = 4, x = 2

P(2) = (4 choose 2)(6 choose 1) / (10 choose 3) = (6)(6)/120 = .30
