
Data Science for Managerial Decisions

What are the chances that sales will decrease if we increase price?

What is the likelihood that a new assembly method will increase productivity?

What are the odds that a new investment will be profitable?
Probability

◼ Experiment
◼ Sample Space
◼ Sample Points
◼ Event
Experiment, Sample Space, Sample Point, Event

An experiment is any process that generates well-defined outcomes.

The sample space for an experiment is the set of all experimental outcomes.

An experimental outcome is called a sample point.

An event is a collection of sample points.


Experiment, Sample Space, Sample Point, Event

◼ In statistics, the notion of an experiment differs somewhat from that of an experiment in the physical sciences.
◼ Even though the experiment is repeated in exactly the same way, an
entirely different outcome may occur.
◼ For this reason, statistical experiments are sometimes called random
experiments.
Experiment and its Sample Space

Experiment               Experiment Outcomes
Toss a coin              Head, tail
Inspection of a part     Defective, non-defective
Conduct a sales call     Purchase, no purchase
Roll a die               1, 2, 3, 4, 5, 6
Play a football game     Win, lose, tie
Experiment and its Sample Space
Example: Bradley Investments
Bradley has invested in two stocks, Markley Oil and Collins Mining.
Bradley has determined that the possible outcomes of these
investments three months from now are as follows.

Investment Gain or Loss


in 3 Months (in $000)
Markley Oil Collins Mining
10 8
5 -2
0
-20
Experiment and its Sample Space
Example: Bradley Investments

Tree diagram: each Markley Oil outcome (+10, +5, 0, -20) pairs with each Collins Mining outcome (+8, -2), giving eight experimental outcomes:

(10, 8), (10, -2), (5, 8), (5, -2), (0, 8), (0, -2), (-20, 8), (-20, -2)
Experiment and its Sample Space
Example: Bradley Investments

Tree diagram outcomes with the corresponding total gain or loss:

(10, 8)    Gain $18,000
(10, -2)   Gain $8,000
(5, 8)     Gain $13,000
(5, -2)    Gain $3,000
(0, 8)     Gain $8,000
(0, -2)    Lose $2,000
(-20, 8)   Lose $12,000
(-20, -2)  Lose $22,000
Events
◼ Example: Bradley Investments

Event M = Markley Oil Profitable
M = {(10, 8), (10, -2), (5, 8), (5, -2)}

Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (-20, 8)}

An event is a collection of sample points.
Assigning Probabilities

Assigning Probabilities – Basic Requirements
Basic Requirements for Assigning Probabilities

1. The probability assigned to each experimental outcome must be between 0 and 1, inclusively.

0 ≤ P(Ei) ≤ 1 for all i

where:
Ei is the ith experimental outcome and P(Ei) is its probability
Assigning Probabilities – Basic Requirements
Basic Requirements for Assigning Probabilities

2. The sum of the probabilities for all experimental outcomes must equal 1.

P(E1) + P(E2) + . . . + P(En) = 1

where:
n is the number of experimental outcomes
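As a quick illustration, both requirements can be checked mechanically. The outcome labels and probability values below are hypothetical, not from the slides.

```python
# Hypothetical probability assignment over three experimental outcomes.
probs = {"E1": 0.25, "E2": 0.50, "E3": 0.25}

# Requirement 1: each probability lies between 0 and 1, inclusive.
assert all(0 <= p <= 1 for p in probs.values())

# Requirement 2: the probabilities sum to 1.
assert abs(sum(probs.values()) - 1.0) < 1e-9
```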
Assigning Probabilities - Methods
Classical Method
Assigning probabilities based on the assumption
of equally likely outcomes

Relative Frequency Method
Assigning probabilities based on experimentation or historical data

Subjective Method
Assigning probabilities based on judgment
Assigning Probabilities - Classical Method
◼ Example: Rolling a Die
◼ If an experiment has n possible outcomes, the classical method would
assign a probability of 1/n to each outcome.
◼ Experiment: Rolling a die
◼ Sample Space: S = {1, 2, 3, 4, 5, 6}
◼ Probabilities: Each sample point has a 1/6 chance of occurring
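A minimal sketch of the classical method: with n equally likely outcomes, each sample point gets probability 1/n.

```python
# Classical method: assign 1/n to each of n equally likely outcomes.
sample_space = [1, 2, 3, 4, 5, 6]  # rolling a die
n = len(sample_space)
probabilities = {outcome: 1 / n for outcome in sample_space}
# Each sample point has probability 1/6; the probabilities sum to 1.
```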

Assigning Probabilities - Relative Frequency Method
Example: Kentucky Power & Light Company (KP&L)

KP&L is starting a project designed to increase the generating capacity of one of its plants in northern Kentucky. The project is divided into two sequential stages or steps: stage 1 (design) and stage 2 (construction).
Even though each stage will be scheduled and controlled as closely as possible,
management cannot predict beforehand the exact time required to complete
each stage of the project.
An analysis of similar construction projects revealed possible completion
times for the design stage of 2, 3, or 4 months and possible completion times
for the construction stage of 6, 7, or 8 months.
Assigning Probabilities - Relative Frequency Method
Example: Floor Polisher Rentals

A rental business recorded, for each of 40 days, the number of floor polishers rented.

Number of Polishers Rented    Number of Days
0                              4
1                              6
2                             18
3                             10
4                              2
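The relative frequency method divides each observed count by the total number of observations; a sketch using the rental data from the table above:

```python
# Relative frequency method: probability = observed days / total days.
days = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}  # polishers rented -> days observed
total_days = sum(days.values())           # 40 days in all
rel_freq = {k: v / total_days for k, v in days.items()}
# e.g. the probability that 2 polishers are rented is 18/40 = 0.45
```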

Assigning Probabilities - Subjective Method
◼ When economic conditions and a company’s circumstances change
rapidly it might be inappropriate to assign probabilities based solely on
historical data.
◼ We can use any data available as well as our experience and intuition,
but ultimately a probability value should express our degree of belief
that the experimental outcome will occur.
◼ The best probability estimates often are obtained by combining the
estimates from the classical or relative frequency approach with the
subjective estimate.

Assigning Probabilities - Subjective Method
Example: Bradley Investments
Bradley has invested in two stocks, Markley Oil and Collins Mining.
Bradley has determined that the possible outcomes of these
investments three months from now are as follows.

Investment Gain or Loss in 3 Months (in $000)

Markley Oil    Collins Mining
10             8
5              -2
0
-20
Assigning Probabilities - Subjective Method
◼ Example: Bradley Investments
◼ An analyst made the following probability estimates.

Tree diagram outcomes with the analyst's probability estimates:

Experimental Outcome    Probability
(10, 8)                 0.20
(10, -2)                0.08
(5, 8)                  0.16
(5, -2)                 0.26
(0, 8)                  0.10
(0, -2)                 0.12
(-20, 8)                0.02
(-20, -2)               0.06
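Event probabilities follow by summing sample-point probabilities; a sketch using the estimates above and the events M and C defined earlier:

```python
# Sum sample-point probabilities to get event probabilities.
p = {(10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
     (0, 8): .10, (0, -2): .12, (-20, 8): .02, (-20, -2): .06}

# M = Markley Oil profitable (first coordinate positive): P(M) = .70
p_m = sum(v for (m, c), v in p.items() if m > 0)
# C = Collins Mining profitable (second coordinate positive): P(C) = .48
p_c = sum(v for (m, c), v in p.items() if c > 0)
```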
Basic Relationships of Probability
There are some basic probability relationships that can be used to compute
the probability of an event without knowledge of all the sample point
probabilities.

Complement of an Event

Union of Two Events

Intersection of Two Events

Mutually Exclusive Events

John Venn
(4 August 1834 – 4 April 1923)

1881 work: Symbolic Logic


Complement of an Event

The complement of event A is defined to be the event consisting of all sample points that are not in A.

The complement of A is denoted by Ac.

Venn diagram: event A and its complement Ac together fill the sample space S.

P(A) + P(Ac) = 1
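A one-line sketch with the Bradley figures (summing the analyst's estimates for the Markley-profitable outcomes gives P(M) = .70):

```python
# Complement: P(Mc) = 1 - P(M).
p_m = 0.70                # Markley Oil profitable
p_m_complement = 1 - p_m  # probability Markley Oil is not profitable, 0.30
```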
Union of Two Events
The union of events A and B is the event containing all sample points that are in A or B or both.

The union of events A and B is denoted by A ∪ B.

Venn diagram: events A and B within sample space S.
Intersection of Two Events

The intersection of events A and B is the set of all sample points that are in both A and B.

The intersection of events A and B is denoted by A ∩ B.

Venn diagram: the overlapping region of events A and B within sample space S is the intersection of A and B.
Addition Law

The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.

The law is written as:

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
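A sketch with the Bradley figures (P(M) = .70 and P(C) = .48 from the analyst's estimates; the two sample points where both stocks are profitable give P(M ∩ C) = .36):

```python
# Addition law: P(M or C) = P(M) + P(C) - P(M and C).
p_m, p_c, p_m_and_c = 0.70, 0.48, 0.36
p_m_or_c = p_m + p_c - p_m_and_c  # 0.82
```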


Mutually Exclusive Events

Two events are said to be mutually exclusive if the events have no sample points in common.

Two events are mutually exclusive if, when one event occurs, the other cannot occur.

Venn diagram: events A and B do not overlap within sample space S.
Mutually Exclusive Events

If events A and B are mutually exclusive, P(A ∩ B) = 0.

The addition law for mutually exclusive events is:

P(A ∪ B) = P(A) + P(B)

There is no need to include “- P(A ∩ B)”.
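For instance, rolling a 1 and rolling a 2 on a single die are mutually exclusive, so the intersection term vanishes:

```python
# Mutually exclusive events: P(one or two) = P(one) + P(two).
p_one, p_two = 1 / 6, 1 / 6
p_one_or_two = p_one + p_two  # 1/3
```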
Conditional Probability

The probability of an event given that another event has occurred is called a conditional probability.

The conditional probability of A given B is denoted by P(A|B).

A conditional probability is computed as follows:

P(A|B) = P(A ∩ B) / P(B)
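A sketch with the Bradley figures (P(M) = .70 and P(M ∩ C) = .36, both derived earlier from the analyst's estimates):

```python
# Conditional probability: P(C|M) = P(M and C) / P(M).
p_m, p_m_and_c = 0.70, 0.36
p_c_given_m = p_m_and_c / p_m  # about 0.5143
```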
Multiplication Law

The multiplication law provides a way to compute the probability of the intersection of two events.

The law is written as:

P(A ∩ B) = P(B)P(A|B)
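Continuing the Bradley sketch, the multiplication law recovers the intersection probability from P(M) and P(C|M):

```python
# Multiplication law: P(M and C) = P(M) * P(C|M).
p_m = 0.70
p_c_given_m = 0.36 / 0.70      # from the conditional-probability formula
p_m_and_c = p_m * p_c_given_m  # back to 0.36
```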
Independent Events

If the probability of event A is not changed by the existence of event B, we would say that events A and B are independent.

Two events A and B are independent if:

P(A|B) = P(A) or P(B|A) = P(B)


Multiplication Law for Independent Events

The multiplication law also can be used as a test to see if two events are independent.

The law is written as:

P(A ∩ B) = P(A)P(B)
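Applying the test to the Bradley figures shows that M and C are not independent:

```python
# Independence test: does P(M and C) equal P(M) * P(C)?
p_m, p_c, p_m_and_c = 0.70, 0.48, 0.36
independent = abs(p_m * p_c - p_m_and_c) < 1e-9
# 0.70 * 0.48 = 0.336, not 0.36, so M and C are dependent.
```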
Mutual Exclusiveness and Independence

Do not confuse the notion of mutually exclusive events with that of independent events.

Two events with nonzero probabilities cannot be both mutually exclusive and independent.

If one mutually exclusive event is known to occur, the other cannot occur; thus, the probability of the other event occurring is reduced to zero (and they are therefore dependent).

Two events that are not mutually exclusive might or might not be independent.
Bayes’ Theorem
◼ Often we begin probability analysis with initial or prior probabilities.
◼ Then, from a sample, special report, or a product test we obtain some
additional information.
◼ Given this information, we calculate revised or posterior probabilities.
◼ Bayes’ theorem provides the means for revising the prior probabilities.

Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities
Bayes’ Theorem
Example: Suppliers
◼ Consider a manufacturing firm that receives shipments of parts from two
different suppliers.
Bayes’ Theorem
Example: Suppliers
Let:
A1 = Event that a part is from supplier 1
A2 = Event that a part is from supplier 2

Currently, 65% of the parts purchased by the company are from supplier 1
and the remaining 35% are from supplier 2. Hence, if a part is selected at
random, we would assign the prior probabilities as follows:

P(A1) = .65, P(A2) = .35


Bayes’ Theorem
Example: Suppliers
◼ The quality of the purchased parts varies with the source of supply.
◼ Historical data suggest that the quality ratings of the two suppliers are as
shown in below Table.

             % Good Parts    % Bad Parts
Supplier 1        98              2
Supplier 2        95              5

◼ If we let G denote the event that a part is good and B denote the event that a
part is bad, the information in above Table provides the following conditional
probability values.

P(G|A1) = .98 P(G|A2) = .95

P(B|A1) = .02 P(B|A2) = .05


Posterior Probabilities
Example: Suppliers
Suppose now that the parts from the two suppliers are used in the firm’s
manufacturing process and that a machine breaks down because it attempts to
process a bad part. Given the information that the part is bad, what is the
probability that it came from supplier 1 and what is the probability that it
came from supplier 2?

P(A1|B) = P(A1)P(B|A1) / [P(A1)P(B|A1) + P(A2)P(B|A2)]
        = (.65)(.02) / [(.65)(.02) + (.35)(.05)]
        = .0130 / .0305
        = .4262
Posterior Probabilities
Example: Suppliers
Suppose now that the parts from the two suppliers are used in the firm’s
manufacturing process and that a machine breaks down because it attempts to
process a bad part. Given the information that the part is bad, what is the
probability that it came from supplier 1 and what is the probability that it
came from supplier 2?

P(A2|B) = P(A2)P(B|A2) / [P(A1)P(B|A1) + P(A2)P(B|A2)]
        = (.35)(.05) / [(.65)(.02) + (.35)(.05)]
        = .0175 / .0305
        = .5738
Tree Diagram
Example: Suppliers

Supplier       Condition        Experimental Outcomes
P(A1) = .65    P(G|A1) = .98    P(A1 ∩ G) = .6370
               P(B|A1) = .02    P(A1 ∩ B) = .0130
P(A2) = .35    P(G|A2) = .95    P(A2 ∩ G) = .3325
               P(B|A2) = .05    P(A2 ∩ B) = .0175
Bayes’ Theorem
To find the posterior probability that event Ai will occur given that event B
has occurred, we apply Bayes’ theorem.

P(Ai|B) = P(Ai)P(B|Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + ... + P(An)P(B|An)]

Bayes’ theorem is applicable when the events for which we want to compute
posterior probabilities are mutually exclusive and their union is the entire
sample space.
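The supplier calculation can be sketched directly from the theorem; the event labels below mirror the slides:

```python
# Bayes' theorem for the supplier example: posterior probabilities that a
# bad part (event B) came from supplier 1 or supplier 2.
prior = {"A1": 0.65, "A2": 0.35}      # P(Ai)
bad_given = {"A1": 0.02, "A2": 0.05}  # P(B|Ai)

p_bad = sum(prior[a] * bad_given[a] for a in prior)  # P(B) = .0305
posterior = {a: prior[a] * bad_given[a] / p_bad for a in prior}
# posterior["A1"] is about .4262 and posterior["A2"] about .5738
```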
Bayes’ Theorem: Tabular Approach

Prepare the following three columns:

Step 1:
Column 1 - The mutually exclusive events for which posterior
probabilities are desired.
Column 2 - The prior probabilities for the events.
Column 3 - The conditional probabilities of the new information given
each event.
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 1:

(1)       (2)             (3)              (4)   (5)
Events    Prior           Conditional
Ai        Probabilities   Probabilities
          P(Ai)           P(B|Ai)

A1        .65             .02
A2        .35             .05
          1.00
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 2:
Prepare the fourth column:

Column 4
Compute the joint probabilities for each event and the new
information B by using the multiplication law.

Multiply the prior probabilities in column 2 by the corresponding conditional probabilities in column 3. That is, P(Ai ∩ B) = P(Ai)P(B|Ai).
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 2:

(1)       (2)             (3)              (4)              (5)
Events    Prior           Conditional      Joint
Ai        Probabilities   Probabilities    Probabilities
          P(Ai)           P(B|Ai)          P(Ai ∩ B)

A1        .65             .02              .0130  (= .65 × .02)
A2        .35             .05              .0175
          1.00
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 3: Sum the joint probabilities in column 4 to obtain P(B), the probability of the new information.

(1)       (2)             (3)              (4)              (5)
Events    Prior           Conditional      Joint
Ai        Probabilities   Probabilities    Probabilities
          P(Ai)           P(B|Ai)          P(Ai ∩ B)

A1        .65             .02              .0130
A2        .35             .05              .0175
          1.00                             P(B) = .0305
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 4: Prepare the fifth column

Compute the posterior probabilities using the basic relationship of conditional probability:

P(Ai|B) = P(Ai ∩ B) / P(B)

The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.
Bayes’ Theorem: Tabular Approach

Example: Suppliers
Step 4: Prepare the fifth column

(1)       (2)             (3)              (4)              (5)
Events    Prior           Conditional      Joint            Posterior
Ai        Probabilities   Probabilities    Probabilities    Probabilities
          P(Ai)           P(B|Ai)          P(Ai ∩ B)        P(Ai|B)

A1        .65             .02              .0130            .4262  (= .0130/.0305)
A2        .35             .05              .0175            .5738
          1.00                             P(B) = .0305     1.0000
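The tabular procedure generalizes to any number of mutually exclusive events whose union is the sample space; a sketch (the function name is illustrative, not from the slides):

```python
# Tabular approach: build column 4 (joint probabilities), P(B), and
# column 5 (posterior probabilities) from the priors and conditionals.
def bayes_table(priors, conditionals):
    joints = [p * c for p, c in zip(priors, conditionals)]  # column 4
    p_b = sum(joints)                                       # sum of column 4
    posteriors = [j / p_b for j in joints]                  # column 5
    return joints, p_b, posteriors

# Supplier example: priors .65/.35, conditionals P(B|Ai) = .02/.05.
joints, p_b, posteriors = bayes_table([0.65, 0.35], [0.02, 0.05])
```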
Thank You !!!
