End321 01 Introduction
Week 1
Copyright (c) 2004 Brooks/Cole, a division of Thomson Learning, Inc.
Uncertainty
Realistic models of real-world phenomena should take into
account the presence of uncertainty (or randomness).
Basic Rules of Probability
• Definition: Any situation where the outcome is
uncertain is called an experiment.
• Definition: For any experiment, the sample space S
of the experiment consists of all possible outcomes
for the experiment.
• Definition: An event E consists of any collection of
points (set of outcomes) in the sample space.
• Definition: A collection of events E1, E2,…,En is said
to be a mutually exclusive collection of events if, for
i ≠ j (i = 1,2,…,n and j = 1,2,…,n), Ei and Ej have no
points in common.
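These definitions translate directly into sets. A minimal sketch in Python (the experiment here, a single roll of a six-sided die, is my own choice for illustration, not one from the slides):

```python
# Sketch: a sample space and events represented as Python sets.
# The experiment (one roll of a six-sided die) is chosen only for illustration.

S = {1, 2, 3, 4, 5, 6}      # sample space: all possible outcomes

E_even = {2, 4, 6}          # event: the roll is even
E_low = {1, 2}              # event: the roll is at most 2
E_high = {5, 6}             # event: the roll is at least 5

def mutually_exclusive(*events):
    """True if no two of the given events share an outcome."""
    events = list(events)
    return all(events[i].isdisjoint(events[j])
               for i in range(len(events))
               for j in range(i + 1, len(events)))

print(mutually_exclusive(E_low, E_high))   # True: no points in common
print(mutually_exclusive(E_even, E_low))   # False: both contain the outcome 2
```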
Basic Rules of Probability
• With each event E, we associate its complement Ē, which
consists of the points in the sample space that are not
in E.
• With each event E, we also associate a
number P(E), which is the probability that
event E will occur when we perform the
experiment.
Basic Rules of Probability
Some examples of sample spaces, along with further examples, were given as figures on the slides (not reproduced here).
Basic Rules of Probability
In each of the examples on the previous slides, we may also define many events.
If E1, E2, …, En, … is a collection of events, then E1 ∪ E2 ∪ … ∪ En ∪ … is the
event that at least one of E1, E2, …, En, … occurs. Conversely,
E1 ∩ E2 ∩ … ∩ En ∩ … is the event that all of the events occur.
Kolmogorov Axioms on Probability
– Rule 1 For any event E, P(E) ≥ 0.
– Rule 2 If E=S (that is, if E contains all points in the
sample space), then P(E) = 1.
– Rule 3 If E1, E2,…,En is a mutually exclusive
collection of events, then
P(E1 ∪ E2 ∪ … ∪ En) = P(E1) + P(E2) + … + P(En)
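For equally likely outcomes, the axioms are easy to verify by counting. A small sketch, again using a single die roll as an assumed example:

```python
from fractions import Fraction

# Sketch: checking the three axioms for a fair six-sided die,
# where each outcome carries probability 1/6.

S = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event when all outcomes are equally likely."""
    return Fraction(len(event), len(S))

assert P({2, 4}) >= 0                 # Rule 1: P(E) >= 0 for any event E
assert P(S) == 1                      # Rule 2: P(S) = 1

E1, E2, E3 = {1}, {2, 3}, {6}         # a mutually exclusive collection
assert P(E1 | E2 | E3) == P(E1) + P(E2) + P(E3)   # Rule 3: additivity
print("all three axioms check out on this example")
```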
Basic Rules of Probability
A fair coin is one that lands heads or tails with equal probability. For a
biased coin, this is not the case.
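One quick way to see the difference is simulation. A sketch, where the bias value 0.7 is an arbitrary assumption of mine:

```python
import random

# Sketch: simulating a fair coin (p = 0.5) and a biased coin.
# The bias 0.7 is an arbitrary assumption, not a value from the slides.

def toss(p_heads):
    """One coin toss that lands heads with probability p_heads."""
    return "H" if random.random() < p_heads else "T"

n = 100_000
fair = sum(toss(0.5) == "H" for _ in range(n)) / n
biased = sum(toss(0.7) == "H" for _ in range(n)) / n

print(f"fair coin:   fraction of heads ~ {fair:.3f}")    # close to 0.5
print(f"biased coin: fraction of heads ~ {biased:.3f}")  # close to 0.7
```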
Union of Two Events
The union of two events E and F is denoted by E ∪ F. The probability of this new
event should account for all outcomes that lie in either E or F. Then, what
should the formula for the probability of E ∪ F be?
Union of Two Events
The sample space (given on the slide) consists of four equally likely outcomes,
so the probability of each outcome is ¼. Working through the events E and F
defined on the slide leads to the inclusion-exclusion identity
P(E ∪ F) = P(E) + P(F) − P(E ∩ F).
We will not use this inclusion-exclusion identity very often in this course.
Nevertheless, it is a useful formula that you may wish to keep for your records.
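Since the slide's events are not reproduced above, the sketch below checks the identity on two coin tosses with events of my own choosing:

```python
from itertools import product

# Sketch: verifying P(E ∪ F) = P(E) + P(F) - P(E ∩ F) on two coin tosses.
# The events E and F below are my own picks, not the ones from the slide.

S = set(product("HT", repeat=2))   # four outcomes, e.g. ('H', 'T')

def P(event):
    return len(event) / len(S)     # each outcome has probability 1/4

E = {o for o in S if o[0] == "H"}  # heads on the first toss
F = {o for o in S if o[1] == "H"}  # heads on the second toss

lhs = P(E | F)
rhs = P(E) + P(F) - P(E & F)
print(lhs, rhs)                    # 0.75 0.75
assert lhs == rhs
```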
Conditional Probabilities
Conditional probability is one of the most important concepts (maybe the most
important) in probability theory when it comes to modeling stochastic systems.
It is used to model the impact of incoming information on our probability
assessments.
Consider the following example of rolling dice. If we roll the same die twice,
there are 36 possible outcomes. If the die is fair, we expect that the
probability of each outcome in this sample space is 1/36.
Now, assume that someone tells you the outcome of the first roll (which is 4).
Then, what is the probability that the outcome of the original
experiment is (3, 5)?
Zero!
Conditional Probabilities
How about the probability that the outcome of the original experiment
is (4, 6)?
1/6
This simple example shows once more that incoming information may change your
probability assessments about the experiment you are performing. In general,
let E be the event whose probability you are after, and let F be the event you
can observe before E occurs. Naturally, you might be interested in the
conditional probability that E will occur given that F occurs. The standard
notation for this probability is P(E | F).
How can the information that F occurs change our assessment of the
probability of event E?
Conditional Probabilities
Well, in one simple way: the sample space of the experiment may be affected by
the occurrence of the event F. That is what happened in the previous example. The
moment you learn that 4 is indeed the outcome of the first roll, you immediately
disregard outcomes such as (3,5), (2,1), (6,6), etc. In fact, a total of 30 possible
outcomes are eliminated this way. What is left in the sample space then?
The outcomes (4,1), (4,2), …, (4,6). Since all are equally likely, you conclude
that the probability of (4,6) is 1/6.
The natural question one might ask is whether there is a formula for P(E | F). You
must have seen one in your probability and statistics course. Does anyone
remember? It is
P(E | F) = P(E ∩ F) / P(F), defined whenever P(F) > 0.
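Both views of conditioning, shrinking the sample space and applying the formula, can be checked by enumerating the dice example:

```python
from itertools import product
from fractions import Fraction

# Sketch: the two-roll dice example, computed both by restricting the
# sample space to F and by the formula P(E | F) = P(E ∩ F) / P(F).

S = set(product(range(1, 7), repeat=2))    # 36 equally likely outcomes

def P(event):
    return Fraction(len(event), len(S))

F = {o for o in S if o[0] == 4}            # first roll is 4
E = {(4, 6)}                               # the outcome is (4, 6)

print(Fraction(len(E & F), len(F)))        # view 1: restricted sample space -> 1/6
print(P(E & F) / P(F))                     # view 2: the formula -> 1/6
```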
Conditional Probabilities
Let’s consider some examples now:
Conditional Probabilities
We did not use the formula given in the earlier slides, though. If we wish to solve
this problem using the formula, then E denotes the event that the card drawn is
a 10, and F is the event that it is at least 5. Then P(E | F) = P(E ∩ F) / P(F).
Solution:
Conditional Probabilities
Another example is as follows:
Solution:
Conditional Probabilities
One more similar example is:
Let F denote the event that the first ball is black, and E denote the event that the
second ball is black. What is the event in question here?
Solution:
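The composition of the urn is given on the slide and not reproduced here, so the sketch below assumes a hypothetical urn with 6 black and 4 white balls; the event in question is presumably that both balls are black, i.e. E ∩ F, with P(E ∩ F) = P(F) · P(E | F):

```python
from fractions import Fraction

# Sketch: probability that both balls drawn without replacement are black,
# i.e. P(E ∩ F) = P(F) · P(E | F). The urn composition (6 black, 4 white)
# is a hypothetical assumption; the slide's actual numbers are not shown here.

black, white = 6, 4
total = black + white

P_F = Fraction(black, total)                   # first ball black
P_E_given_F = Fraction(black - 1, total - 1)   # second black, given first was
P_EF = P_F * P_E_given_F                       # multiplication rule

print(P_EF)   # 1/3 with these assumed numbers
```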
Independence of Two Events
In many uncertain situations, we may face the question of assessing the
probability of two events that have no relevance to each other. Namely, for two
events E and F, knowing that F has occurred gives us no reason to update our
probability assessment of E. In this case, we say that the events E and F are
independent.
A formal definition of independence is as follows: two events E and F are said to
be independent if:
P(E ∩ F) = P(E) · P(F)
or, equivalently:
P(E | F) = P(E).
Independence of Two Events
Let's consider an example. Let E1 be the event that the sum of two rolls of a fair
die equals 6, and let F be the event that the first roll is 4. The sum of two rolls
is 6 when the outcome is (1,5), (2,4), (3,3), (4,2), or (5,1). Hence P(E1) = 5/36.
Independence of Two Events
How about P(E1 ∩ F)? Which outcomes of the two consecutive rolls give 1) a
sum of six and 2) a 4 on the first roll? Simple! Only one outcome, namely (4,2).
What is the probability of getting (4,2)? 1/36. Then P(E1 ∩ F) = 1/36 as well.
One final step before we can tell whether E1 and F are independent: let's
compute P(E1) · P(F) = (5/36) · (1/6) = 5/216, which differs from 1/36. Hence
these two events are not independent.
Now, consider another event E2: the event that the sum equals 7 (not 6).
Following the same sequence of steps, let's check whether E2 and F are
independent. First, using parallel arguments, the sum of the two rolls is 7 if we
observe (1,6), (2,5), (3,4), (4,3), (5,2), or (6,1). Then P(E2) = 6/36 = 1/6.
Moreover, the only outcome in E2 ∩ F is (4,3), so P(E2 ∩ F) = 1/36, while
P(E2) · P(F) = (1/6) · (1/6) = 1/36. Since these two numbers are equal, E2 and F
are independent.
Independence of Two Events
Is it not interesting to see in this example that E2 and F are
independent while E1 and F are NOT? Can you think of an intuitive
explanation for this?
Hint: Think along the lines of how learning about F could change your probability
assessment about the other event.
If F occurs, then E1 will occur only if you get a 2 on the second roll. Hence, once
you learn that F has occurred, your probability for E1 becomes 1/6, not the 5/36
we initially assigned. For E2, though, both probabilities are 1/6; as far as E2 is
concerned, F provides no extra information.
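Both checks can be reproduced by brute-force enumeration; a short sketch:

```python
from itertools import product
from fractions import Fraction

# Sketch: checking independence of F (first roll is 4) against
# E1 (sum equals 6) and E2 (sum equals 7) by full enumeration.

S = set(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(S))

F = {o for o in S if o[0] == 4}
E1 = {o for o in S if sum(o) == 6}
E2 = {o for o in S if sum(o) == 7}

print(P(E1 & F) == P(E1) * P(F))   # False: 1/36 differs from (5/36)(1/6)
print(P(E2 & F) == P(E2) * P(F))   # True:  1/36 equals (1/6)(1/6)
```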
Bayes Formula
As we mentioned in an earlier example, conditioning is sometimes the key to
computing probabilities. This is partly because of the following set-theoretic
identity for two events E and F:
E = (E ∩ F) ∪ (E ∩ F̄), so that P(E) = P(E | F) P(F) + P(E | F̄) P(F̄).
Bayes Formula
Consider the following example (the details are on the slide): a coin is tossed,
and if it comes up heads, a ball is selected from the first urn; otherwise, a ball
is selected from the second urn. Any suggestions on the solution?
Let W be the event that a white ball is selected, and H be the event that the
outcome of the coin toss is heads. What is the probability that we are
interested in?
Bayes Formula
Solution:
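The actual urn compositions are on the slide and not reproduced here, so as a sketch assume urn 1 holds 2 white and 3 black balls and urn 2 holds 1 white and 4 black, and compute both P(W) and P(H | W):

```python
from fractions import Fraction

# Sketch of the coin-and-urns example. The urn compositions below are
# hypothetical assumptions; the slide's actual numbers are not shown here.

P_H = Fraction(1, 2)             # fair coin: heads means urn 1 is used
P_W_given_H = Fraction(2, 5)     # assumed: urn 1 holds 2 white of 5 balls
P_W_given_Hc = Fraction(1, 5)    # assumed: urn 2 holds 1 white of 5 balls

# Total probability: P(W) = P(W | H) P(H) + P(W | H̄) P(H̄)
P_W = P_W_given_H * P_H + P_W_given_Hc * (1 - P_H)
print(P_W)                       # 3/10 under these assumptions

# Bayes: P(H | W) = P(W | H) P(H) / P(W)
print(P_W_given_H * P_H / P_W)   # 2/3 under these assumptions
```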
Bayes Formula
Another example concerns the result of a lab test (the details are on the slide).
What events are of interest here? The last sentence of the problem gives the
answer: D, the event that the person is actually sick, and E, the event that the
test result is positive. Hence, we need P(D | E).
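The test's actual numbers are on the slide, so the sketch below plugs in hypothetical values (prevalence P(D) = 1%, sensitivity P(E | D) = 95%, false-positive rate P(E | D̄) = 2%):

```python
from fractions import Fraction

# Sketch of the lab-test example with hypothetical numbers:
# prevalence P(D) = 1%, sensitivity P(E | D) = 95%,
# false-positive rate P(E | D̄) = 2%. The slide's actual values
# are not reproduced here.

P_D = Fraction(1, 100)
P_E_given_D = Fraction(95, 100)
P_E_given_Dc = Fraction(2, 100)

# Total probability, then Bayes:
P_E = P_E_given_D * P_D + P_E_given_Dc * (1 - P_D)
P_D_given_E = P_E_given_D * P_D / P_E

print(P_D_given_E, float(P_D_given_E))   # 95/293, about 0.324
```

Note how small the posterior is even for a fairly accurate test: with a rare disease, most positive results come from the large healthy population.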
Bayes Formula
We need a generalization of the following form. Suppose F1, F2, …, Fn are mutually
exclusive events such that F1 ∪ F2 ∪ … ∪ Fn = S. Then we should have:
E = (E ∩ F1) ∪ (E ∩ F2) ∪ … ∪ (E ∩ Fn).
Since the Fi's are mutually exclusive, the events E ∩ Fi are mutually exclusive as
well. Then it follows that:
P(E) = P(E ∩ F1) + … + P(E ∩ Fn) = P(E | F1) P(F1) + … + P(E | Fn) P(Fn).
Bayes Formula
The last expression on the previous slide leads us to the Bayes formula:
P(Fi | E) = P(E ∩ Fi) / P(E) = P(E | Fi) P(Fi) / [P(E | F1) P(F1) + … + P(E | Fn) P(Fn)].
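The formula translates directly into code; a minimal sketch, reusing the hypothetical lab-test numbers from above:

```python
# Sketch: Bayes' formula for a mutually exclusive collection F1, ..., Fn
# covering the sample space. priors[i] = P(Fi); likelihoods[i] = P(E | Fi).

def bayes(priors, likelihoods, i):
    """Return P(Fi | E) = P(E | Fi) P(Fi) / sum_j P(E | Fj) P(Fj)."""
    total = sum(p * l for p, l in zip(priors, likelihoods))   # = P(E)
    return likelihoods[i] * priors[i] / total

# Reusing the hypothetical lab-test numbers: F1 = D, F2 = D̄.
print(bayes([0.01, 0.99], [0.95, 0.02], 0))   # about 0.324
```

The same function handles the coin-and-urns example by passing the urn priors and the white-ball likelihoods.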