Statistics (Part 3)

The document provides an overview of probability concepts, including definitions of probability, events, and key properties. It covers rules of probability, such as the addition and multiplication rules, as well as concepts like conditional probability and Bayes' theorem. Additionally, it introduces discrete random variables, their distributions, and specific distributions like binomial and Poisson distributions.


4. Probability Concepts

4.1 Probability basics


 Probability for Equally Likely Outcomes (f/N Rule)
 Suppose an experiment has N possible outcomes, all
equally likely. An event that can occur in f ways has
probability f/N of occurring (see the sketch below).
 Experiment: an action whose outcome cannot be
predicted with certainty.
 Event: some specified result that may or may not occur
when an experiment is performed.
 The Meaning of Probability
 Probability model: a mathematical description of the
experiment based on certain primary aspects and
assumptions.
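A minimal sketch of the f/N rule in code; the fair die and the "even number" event are assumed examples, not taken from the slides.

```python
# f/N rule sketch: for equally likely outcomes, P(E) = f / N.
# The die and the "even number" event are assumed examples.

sample_space = [1, 2, 3, 4, 5, 6]                 # N = 6 equally likely outcomes
event = [x for x in sample_space if x % 2 == 0]   # the event can occur in f = 3 ways

N = len(sample_space)
f = len(event)
print(f"P(even) = {f}/{N} = {f / N:.3f}")         # P(even) = 3/6 = 0.500
```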
1
4. Probability Concepts
(cont.)
 Basic Properties of Probabilities
Property 1: The probability of an event is always
between 0 and 1, inclusive.
Property 2: The probability of an event that cannot
occur is 0. (An event that cannot occur is called an
impossible event.)
Property 3: The probability of an event that must occur
is 1. (An event that must occur is called a certain
event.)

2
4. Probability Concepts
(cont.)
4.2 Events
 Sample space: The collection of all possible
outcomes for an experiment.
 Event: A collection of outcomes for the
experiment, that is, any subset of the sample
space.
 An event occurs if and only if the outcome of the
experiment is a member of the event.
 Notation and Graphical Displays for Events:

3
4. Probability Concepts
(cont.)
 Relationships Among Events
 (not E ): The event “E does not occur”
 (A & B ): The event “both A and B occur”
 (A or B ): The event “either A or B or both occur”

4
4. Probability Concepts
(cont.)
 Mutually Exclusive Events: Two or more events are
mutually exclusive events if no two of them have
outcomes in common.

5
4. Probability Concepts
(cont.)
4.3 Some rules of Probability
 Probability Notation:
 If E is an event, then P(E) represents the probability
that event E occurs. It is read “the probability of E.”
 The Special Addition Rule
 If event A and event B are mutually exclusive, then
P(A or B) = P(A) + P(B).
 More generally, if events A, B, C, ... are mutually
exclusive, then P(A or B or C or ···) = P(A) + P(B) +
P(C) +··· .

6
4. Probability Concepts
(cont.)
 The Complementation Rule
 For any event E, P(E) = 1 − P(not E).

 The General Addition Rule


 If A and B are any two events, then
P(A or B) = P(A) + P(B) − P(A & B).
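A short sketch of the general addition rule under an assumed setup (drawing one card from a standard 52-card deck; the events "queen" and "heart" are illustrative, not from the slides). It checks that P(A) + P(B) − P(A & B) agrees with P(A or B) computed directly from the f/N rule.

```python
# General addition rule sketch: P(A or B) = P(A) + P(B) - P(A & B).
# The one-card draw from a 52-card deck is an assumed example.
import itertools

ranks = list(range(1, 14))                           # 1 = ace, ..., 13 = king
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = set(itertools.product(ranks, suits))          # 52 equally likely outcomes

A = {card for card in deck if card[0] == 12}         # event A: draw a queen
B = {card for card in deck if card[1] == "hearts"}   # event B: draw a heart

def p(event):
    return len(event) / len(deck)

print(p(A) + p(B) - p(A & B))                        # 0.3077 (general addition rule)
print(p(A | B))                                      # 0.3077 (direct f/N computation)
```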

7
4. Probability Concepts
(cont.)
4.4 Contingency Table
 Data from one variable of a population are called
univariate data
 Data from two variables of a population are
called bivariate data
 A frequency distribution for bivariate data is
called a contingency table or two-way table

8
9
4. Probability Concepts
(cont.)
 Joint and Marginal Probabilities

10
4. Probability Concepts
(cont.)
 P(A1), P(A2)…and P(R1), P(R2)…: marginal
probabilities
 Marginal probabilities correspond to events
represented in the margin of the contingency
table
 P(A1&R1), P(A1&R2)…: joint probabilities
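A minimal sketch of how joint and marginal probabilities come out of a contingency table; the categories A1, A2, R1, R2 and the cell counts are assumed purely for illustration.

```python
# Joint and marginal probabilities from a contingency table (assumed counts).
counts = {
    ("A1", "R1"): 20, ("A1", "R2"): 30,
    ("A2", "R1"): 15, ("A2", "R2"): 35,
}
total = sum(counts.values())                          # 100 observations in all

p_joint_A1_R1 = counts[("A1", "R1")] / total          # joint probability P(A1 & R1)
p_marginal_A1 = sum(v for (a, _), v in counts.items() if a == "A1") / total  # marginal P(A1)

print(p_joint_A1_R1, p_marginal_A1)                   # 0.2 0.5
```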

11
4. Probability Concepts
(cont.)

12
4. Probability Concepts
(cont.)
4.5 Conditional Probability
 The probability that event B occurs given that
event A occurs is called a conditional probability.
 It is denoted P (B |A), which is read “the
probability of B given A.” We call A the given
event.
 Example
 The Conditional Probability Rule
 If A and B are any two events with P(A) > 0, then
P(B|A) = P(A & B) / P(A).
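A numeric sketch of the conditional probability rule, reusing the assumed probabilities from the contingency-table example above.

```python
# Conditional probability rule: P(B|A) = P(A & B) / P(A).
# Values reuse the assumed contingency-table example: A = A1, B = R1.
p_A = 0.5          # P(A1)
p_A_and_B = 0.2    # P(A1 & R1)

p_B_given_A = p_A_and_B / p_A
print(p_B_given_A)   # 0.4
```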

13
4. Probability Concepts
(cont.)
4.6 The Multiplication Rule
 If A and B are any two events, then
P(A & B) = P(A) · P(B|A)
 Independent Events
 Event B is said to be independent of event A if P(B | A) =
P(B).
 The Special Multiplication Rule
 If A and B are independent events, then
P(A & B) = P(A) · P(B),
 and conversely, if P(A & B) = P(A) · P(B), then A and B are
independent events.
 If events A, B, C, ... are independent, then
P(A & B & C & ···) = P(A) · P(B) · P(C) ··· .
 Mutually Exclusive Versus Independent Events
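A small sketch showing the special multiplication rule used as an independence check, and why mutually exclusive events with positive probabilities are never independent; all probabilities are assumed values.

```python
# Independence check via the special multiplication rule:
# A and B are independent exactly when P(A & B) = P(A) * P(B).
import math

p_A, p_B = 0.5, 0.4

p_A_and_B = 0.20                             # assumed joint probability
print(math.isclose(p_A_and_B, p_A * p_B))    # True  -> A and B are independent

p_A_and_B = 0.0                              # mutually exclusive: A and B cannot both occur
print(math.isclose(p_A_and_B, p_A * p_B))    # False -> these events are dependent
```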
4. Probability Concepts
(cont.)
4.7 Bayes's Rule
 The Rule of Total Probability
 Suppose that events A1, A2,..., Ak are mutually
exclusive and exhaustive, that is, exactly one of the
events must occur. Then for any event B,
P(B) = P(A1) · P(B|A1) + P(A2) · P(B|A2) + ··· + P(Ak) · P(B|Ak).
 It is also referred to as the stratified sampling
theorem because of its importance in stratified
sampling.
15
4. Probability Concepts
(cont.)
 Bayes's Rule
 Suppose that events A1, A2,..., Ak are mutually
exclusive and exhaustive. Then for any event B with P(B) > 0,
P(Ai|B) = P(Ai) · P(B|Ai) / [P(A1) · P(B|A1) + P(A2) · P(B|A2) + ··· + P(Ak) · P(B|Ak)],
 where Ai can be any one of events A1, A2,..., Ak.
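A short sketch combining the rule of total probability and Bayes's rule; the prior probabilities P(Ai) and the conditional probabilities P(B|Ai) are assumed numbers, not from the slides.

```python
# Rule of total probability and Bayes's rule with assumed numbers.
# A1, A2, A3 are mutually exclusive and exhaustive events.
priors = {"A1": 0.2, "A2": 0.5, "A3": 0.3}            # P(Ai)
likelihoods = {"A1": 0.10, "A2": 0.05, "A3": 0.20}    # P(B | Ai)

# Rule of total probability: P(B) = sum of P(Ai) * P(B|Ai)
p_B = sum(priors[a] * likelihoods[a] for a in priors)

# Bayes's rule: P(Ai | B) = P(Ai) * P(B|Ai) / P(B)
posteriors = {a: priors[a] * likelihoods[a] / p_B for a in priors}

print(p_B)          # 0.105
print(posteriors)   # P(A1|B) ≈ 0.19, P(A2|B) ≈ 0.24, P(A3|B) ≈ 0.57
```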

16
4. Probability Concepts
(cont.)
4.8 Counting Rule
 The Basic Counting Rule (BCR)
 Suppose that r actions are to be performed in a
definite order. Further suppose that there are m1
possibilities for the first action and that
corresponding to each of these possibilities are m2
possibilities for the second action, and so on. Then
there are m1· m2 ···mr possibilities altogether for the r
actions.
 Factorials
 k! = k(k − 1) ··· 2 · 1.
 We also define 0! = 1.
17
4. Probability Concepts
(cont.)
 A permutation of r objects from a collection of m
objects is any ordered arrangement of r of the m
objects.
 The number of possible permutations of r objects
that can be formed from a collection of m objects
is denoted mPr (read “m permute r”).
 The Permutations Rule
 The number of possible permutations of r objects
from a collection of m objects is given by the formula
mPr = m! / (m − r)!.

18
4. Probability Concepts
(cont.)
 The Special Permutations Rule
 The number of possible permutations of m objects
among themselves is m!.
 Combinations
 A combination of r objects from a collection of m
objects is any unordered arrangement of r of the m
objects—in other words, any subset of r objects from
the collection of m objects.
 The Combinations Rule
 The number of possible combinations of r objects
from a collection of m objects is given by the formula
mCr = m! / (r! (m − r)!).
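The counting rules map directly onto Python's standard library (math.factorial, math.perm, and math.comb, available in Python 3.8+); the values of m, r and the action counts below are assumed examples.

```python
# Counting rules with assumed m = 5 objects taken r = 3 at a time.
import math

m, r = 5, 3
print(math.factorial(m))     # m! = 120 (special permutations rule)
print(math.perm(m, r))       # mPr = m!/(m - r)! = 60 (permutations rule)
print(math.comb(m, r))       # mCr = m!/(r!(m - r)!) = 10 (combinations rule)

# Basic counting rule: r = 3 actions with assumed m1, m2, m3 possibilities each
m1, m2, m3 = 2, 3, 4
print(m1 * m2 * m3)          # 24 possibilities altogether
```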

19
4. Probability Concepts
(cont.)
 Number of Possible Samples
 The number of possible samples of size n from a
population of size N is NCn.
 Applications to Probability
 Examples

20
Review

1. use and understand the formulas in this chapter.


2. compute probabilities for experiments having equally likely
outcomes.
3. interpret probabilities, using the frequentist interpretation of
probability.
4. state and understand the basic properties of probability.
5. construct and interpret Venn diagrams.
6. find and describe (not E), (A & B), and(A or B).
7. determine whether two or more events are mutually exclusive.
8. understand and use probability notation.
9. state and apply the special addition rule.
10. state and apply the complementation rule.
11. state and apply the general addition rule.
21
12.* read and interpret contingency tables.
13. construct a joint probability distribution.
14. compute conditional probabilities both directly and by using
the conditional probability rule.
15. state and apply the general multiplication rule.
16. state and apply the special multiplication rule.
17. determine whether two events are independent.
18. understand the difference between mutually exclusive events
and independent events.
19. determine whether two or more events are exhaustive.
20. state and apply the rule of total probability.
21. state and apply Bayes’s rule.
22. state and apply the basic counting rule (BCR).
23. state and apply the permutations and combinations rules.
24. apply counting rules to solve probability problems where
appropriate.
Ex.4.1-4.26
22
5. Discrete Random
Variables
 Random Variable: a quantitative variable whose value
depends on chance.
 Discrete Random Variable: Its possible values can be
listed.
 Random-Variable Notation: we usually use uppercase letters.
 E.g.: the event {X = 2} and the probability of that event as P(X = 2)
5.1 Probability distribution
 A listing of the possible values and corresponding
probabilities of a discrete random variable, or a formula
for the probabilities.
 Probability histogram:
 A graph of the probability distribution that displays the
possible values of a discrete random variable on the
horizontal axis and the probabilities of those values on
the vertical axis.
 The probability of each value is represented by a vertical
bar whose height equals the probability.
23
5. Discrete Random
Variables (cont.)

24
5. Discrete Random
Variables (cont.)
 Sum of the Probabilities of a Discrete Random
Variable
 For any discrete random variable X, we have
Σ P(X = x) = 1, where the sum is over all possible values x of X.
 Interpretation of a Probability Distribution
 In a large number of independent observations of a
random variable X, the proportion of times each
possible value occurs will approximate the
probability distribution of X; or, equivalently, the
proportion histogram will approximate the
probability histogram for X.
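A simulation sketch of this interpretation: with many independent observations, the observed proportions approach the probability distribution. The distribution used here is an assumed example.

```python
# Simulating independent observations of a discrete random variable X and
# comparing observed proportions with the assumed probability distribution.
import random
from collections import Counter

dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}     # assumed probability distribution of X
n = 100_000                                  # number of independent observations

observations = random.choices(list(dist), weights=list(dist.values()), k=n)
proportions = {x: count / n for x, count in sorted(Counter(observations).items())}

print(proportions)   # close to {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}
```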

25
5. Discrete Random
Variables (cont.)
5.2 Mean and Standard Deviation of a Discrete
Random Variable
 Mean of a Discrete Random Variable
 Denoted as μX or simply μ. It is defined by
μ = Σ x · P(X = x).
 The terms expected value and expectation are
commonly used in place of the term mean.
 Interpretation of the Mean of a Random Variable
 In a large number of independent observations of a
random variable X, the average value of those
observations will approximately equal the mean, μ, of
X. The larger the number of observations, the closer
the average tends to be to μ.
26
5. Discrete Random
Variables (cont.)
 The standard deviation of a discrete random
variable X is denoted σX or simply σ. It is defined
as
σ = √[ Σ (x − μ)² · P(X = x) ].
 The standard deviation of a discrete random
variable can also be obtained from the computing
formula
σ = √[ Σ x² · P(X = x) − μ² ].

 What Does It Mean?
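In practice, μ and σ can be computed directly from the distribution; the short sketch below uses the assumed distribution from the simulation example and applies both the defining formula and the computing formula.

```python
# Mean and standard deviation of a discrete random variable
# for an assumed distribution {x: P(X = x)}.
from math import sqrt

dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

mu = sum(x * p for x, p in dist.items())                               # mean
sigma_def = sqrt(sum((x - mu) ** 2 * p for x, p in dist.items()))      # defining formula
sigma_comp = sqrt(sum(x ** 2 * p for x, p in dist.items()) - mu ** 2)  # computing formula

print(mu, sigma_def, sigma_comp)   # 1.7 0.9 0.9 (both formulas agree)
```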


27
5. Discrete Random
Variables (cont.)
5.3 Binomial Distribution
 To analyze repeated trials of an experiment that
has two possible outcomes, we require knowledge
of factorials, binomial coefficients, Bernoulli trials,
and the binomial distribution
 Binomial Coefficients
 If n is a positive integer and x is a nonnegative
integer less than or equal to n, then the binomial
coefficient is defined as
nCx = n! / (x! (n − x)!).

28
5. Discrete Random
Variables (cont.)
 Bernoulli Trials
 Repeated trials of an experiment are called Bernoulli
trials if the following three conditions are satisfied:
1. The experiment (each trial) has two possible
outcomes, denoted generically s, for success, and f, for
failure.
2. The trials are independent.
3. The probability of a success, called the success
probability and denoted p, remains the same from trial
to trial.
 The binomial distribution is the probability
distribution for the number of successes in a
sequence of Bernoulli trials
29
5. Discrete Random
Variables (cont.)
 Number of Outcomes Containing a Specified Number
of Successes
 In n Bernoulli trials, the number of outcomes that contain
exactly x successes equals the binomial coefficient nCx.
 Binomial Probability Formula
 Let X denote the total number of successes in n Bernoulli
trials with success probability p. Then the probability
distribution of the random variable X is given by
P(X = x) = nCx · p^x · (1 − p)^(n−x),   x = 0, 1, 2, ..., n.
 The random variable X is called a binomial random
variable and is said to have the binomial distribution with
parameters n and p.
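A minimal sketch of the binomial probability formula with assumed parameters n = 10 and p = 0.3, including a check that the probabilities sum to 1.

```python
# Binomial probability formula: P(X = x) = nCx * p**x * (1 - p)**(n - x).
# n and p are assumed example parameters.
from math import comb

n, p = 10, 0.3

def binom_pmf(x: int) -> float:
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

print(binom_pmf(3))                              # P(X = 3) ≈ 0.2668
print(sum(binom_pmf(x) for x in range(n + 1)))   # ≈ 1.0 (sum of the probabilities)
```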

30
5. Discrete Random
Variables (cont.)
 Procedure To Find a Binomial Probability Formula
 Assumptions:
1. n trials are to be performed.
2. Two outcomes, success or failure, are possible for each
trial.
3. The trials are independent.
4. The success probability, p, remains the same from trial to
trial.
Step 1 Identify a success.
Step 2 Determine p, the success probability.
Step 3 Determine n, the number of trials.
Step 4 The binomial probability formula for the number of
successes, X, is P(X = x) = nCx · p^x · (1 − p)^(n−x).
31
5. Discrete Random
Variables (cont.)
 Binomial Probability Tables
 Eliminate most of the computations required in
working with the binomial distribution.
 Such tables are of limited usefulness, because they
contain only a relatively small number of different
values of n and p.
 Shape of a Binomial Distribution

32
5. Discrete Random
Variables (cont.)
 Mean and Standard Deviation of a Binomial Random
Variable
μ = np and σ = √(np(1 − p)).
 Binomial Approximation to the Hypergeometric
Distribution
 In reality, sampling is ordinarily done without replacement,
so the sampling process does not constitute Bernoulli trials
and the success probability varies from trial to trial.
 The resulting distribution is referred to as a
hypergeometric distribution.
 If the sample size does not exceed 5% of the population
size, there is little difference between sampling with and
without replacement.

33
5. Discrete Random
Variables (cont.)
 Sampling and the Binomial Distribution
 Suppose that a simple random sample of size n is
taken from a finite population in which the
proportion of members that have a specified
attribute is p. Then the number of members sampled
that have the specified attribute:
 Has exactly a binomial distribution with parameters
n and p if the sampling is done with replacement and
 Has approximately a binomial distribution with
parameters n and p if the sampling is done without
replacement and the sample size does not exceed
5% of the population size.

34
5. Discrete Random
Variables (cont.)
5.4 The Poisson Distribution
 often used to model the frequency with which a specified
event occurs during a particular period of time
 In addition, used to describe the probability distribution
of the number of misprints in a book, or the number of
bacterial colonies appearing on a petri dish smeared with
a bacterial suspension.
 Poisson Probability Formula
 Probabilities for a random variable X that has a Poisson
distribution are given by the formula
P(X = x) = e^(−λ) · λ^x / x!,   x = 0, 1, 2, ...,
 where λ is a positive real number and e ≈ 2.718.
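A minimal sketch of the Poisson probability formula with an assumed rate λ = 2, including a check that the probabilities sum to approximately 1.

```python
# Poisson probability formula: P(X = x) = exp(-lam) * lam**x / x!.
# lam (λ) is an assumed example rate.
from math import exp, factorial

lam = 2.0

def poisson_pmf(x: int) -> float:
    return exp(-lam) * lam ** x / factorial(x)

print(poisson_pmf(0))                             # ≈ 0.1353
print(poisson_pmf(3))                             # ≈ 0.1804
print(sum(poisson_pmf(x) for x in range(50)))     # ≈ 1.0
```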


35
5. Discrete Random
Variables (cont.)
 Shape of a Poisson Distribution
 all Poisson distributions are right skewed
 Mean and Standard Deviation of a Poisson Random
Variable
μ = λ and σ = √λ.
 Poisson Approximation to the Binomial Distribution
Step 1 Find n, the number of trials, and p, the success
probability.
Step 2 Continue only if n ≥ 100 and np ≤ 10.
Step 3 Approximate the binomial probabilities by using
the Poisson probability formula with λ = np.
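A sketch comparing an exact binomial probability with its Poisson approximation using λ = np; the values n = 200, p = 0.02, and x = 5 are assumed and satisfy the conditions in Steps 1 and 2.

```python
# Poisson approximation to the binomial distribution (assumed n, p, x).
from math import comb, exp, factorial

n, p = 200, 0.02               # n >= 100 and np = 4 <= 10, so the approximation applies
lam = n * p
x = 5

exact = comb(n, x) * p ** x * (1 - p) ** (n - x)      # exact binomial probability
approx = exp(-lam) * lam ** x / factorial(x)          # Poisson approximation

print(exact, approx)           # ≈ 0.158 vs ≈ 0.156
```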
36
Review

1. use and understand the formulas in this chapter.


2. determine the probability distribution of a discrete
random variable.
3. construct a probability histogram.
4. describe events using random-variable notation,
when appropriate.
5. use the frequentist interpretation of probability to
understand the meaning of the probability distribution
of a random variable.
6. find and interpret the mean and standard deviation
of a discrete random variable.
7. compute factorials and binomial coefficients.
37
8. define and apply the concept of Bernoulli trials.
9. assign probabilities to the outcomes in a sequence of
Bernoulli trials.
10. obtain binomial probabilities.
11. compute the mean and standard deviation of a
binomial random variable.
12. obtain Poisson probabilities.
13. compute the mean and standard deviation of a
Poisson random variable.
14. use the Poisson distribution to approximate binomial
probabilities, when appropriate.
Ex.
38
