Chapter 15 Notes

Chapter 15 covers the fundamentals of probability, including definitions, types of events, and various probability concepts such as classical, empirical, and conditional probability. It explains random experiments, events, and the rules for calculating probabilities, including addition and multiplication rules. The chapter also introduces random variables, expected values, and variance, providing examples and properties related to these concepts.


Chapter 15

Probability
Summary chart of the chapter
 Probability: Origin and Meaning; Random Experiment; Event and its types (various concepts); Definitions of Probability
 Definitions of Probability: Statistical or Empirical or A posteriori; Mathematical or Classical or A priori
 Related tools: Venn Diagrams; Rules of Addition and Multiplication
 Random Variable: Probability Distribution; Expected Value
Introduction
 Originally a branch of Mathematics
 A probability of 1 means a 100% ‘chance’ in our practical day to day
language
 The relative frequency of past data resembles the concept of probability
 There are two types of Probability: Subjective (based on individual judgement
and experience) and Objective (a scientific and objective method where the
possible outcomes are known in advance but their occurrence is random) –
We will study only Objective probability.
 Originally it was used to win gambling games and later theories were
developed by many people:
 Abraham De Moivre and Pierre-Simon de Laplace of France
 Reverend Thomas Bayes and R A Fisher of England
 Chebyshev, Markov, Khinchin, Kolmogorov of Russia
Random Experiment
 Experiment can be described as a performance that produces certain
results
 A random experiment is an experiment whose result depends upon
chance alone. For example, tossing a coin, rolling a die, or drawing a
card from a well shuffled pack of cards.
 Results or outcomes of Random experiment are known as events.
 A simple event is one which can not be decomposed into further
events. For example, getting a Head or a Tail when a coin is tossed
once.
 A composite event is one which can be split further into subset events
with fewer elements. For example, getting exactly one head when a
coin is tossed twice: {HT, TH} is a composite event and can be
decomposed into {HT} and {TH}.
Various concepts or types of events
 Primary or Elementary events e.g. {H, T}: Primary events are simple events which can
not be decomposed further.
 Equally likely or Equiprobable events P(A) = P(B): Elementary events are all equally
likely. Composite events can also be equally likely, for example odd numbers and
even numbers on a die, i.e. {1, 3, 5} and {2, 4, 6}.
 Union event 𝐀 ∪ 𝐁: Either A or B occurs.
 Intersection event 𝐀 ∩ 𝐁: Both A and B occur simultaneously.
 Mutually exclusive events A ∩ B = ∅ and P(A ∩ B) = 0: A and B can never occur
together. All elementary events are mutually exclusive.
 Exhaustive events 𝐀 ∪ 𝐁 = S (Universal Set) and P( 𝐀 ∪ 𝐁) = 𝟏 : There is no other
outcome possible outside events A and B. All elementary events are exhaustive.
 Sure event A = S and P(A) = 1: A always occurs.
 Impossible event A = { } and P(A) = 0: A never occurs.
 Complementary event A’ or Ac: A doesn’t occur.
 Difference event A – B = A ∩ B’: A occurs but B doesn’t occur.
 Independent events: the occurrence of one event does not affect the probability of
the other, i.e. P(A ∩ B) = P(A).P(B). Two mutually exclusive events (with non-zero
probabilities) can not be independent.
Classical or A priori definition of
Probability (Bernoulli and Laplace)
 Consider a random experiment with n equally likely elementary events, of
which m are favourable to event A. Then
P(A) = m/n = (number of favourable elementary events)/(total number of
equally likely elementary events)
 Applicable only when the total number of events is finite
 Applicable only when the events are equally likely
 Only a limited field of application, like tossing of a coin, rolling of a die,
drawing of a card, etc., where the possible outcomes are known in
advance
Properties of Classical Probability

 The probability of event A is always between 0 and 1, both inclusive:
0 ≤ P(A) ≤ 1.
When P(A) = 0, A is an impossible event and when P(A) = 1, A is a sure
event.
 Events A and A’ are mutually exclusive and exhaustive, so P(A) +
P(A’) = 1.
 If P(A) = p/(p+q) and hence P(A’) = q/(p+q), then the odds in favour of
event A are p : q and the odds against event A are q : p.
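 For example (a worked illustration): when one card is drawn from a well
shuffled pack of 52 cards, P(Ace) = 4/52 = 1/13, so the odds in favour of
drawing an Ace are 1 : 12 and the odds against are 12 : 1.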
All possible outcomes of various
experiments (Examples)
 Tossing of a coin once: S = {H, T} and n(S) = 2¹ = 2
 Tossing of a coin twice or tossing two coins simultaneously: S = {HH, HT,
TH, TT} and n(S) = 2² = 4
 Tossing of a coin thrice or tossing three coins simultaneously: S = {HHH,
HHT, HTH, HTT, THH, THT, TTH, TTT} and n(S) = 2³ = 8
 Rolling of a die once: S = {1, 2, 3, 4, 5, 6} and n(S) = 6¹ = 6
 Rolling of a die twice or rolling two dice simultaneously: S = {(1, 1), (1,
2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6), (3, 1),
(3, 2), (3, 3), (3, 4), (3, 5), (3, 6), (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6), (5,
1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6), (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
and n(S) = 6² = 36
 A pack of 52 cards has 13 Hearts, 13 Diamonds, 13 Clubs, 13 Spades, 4
Aces, 4 Jacks, 4 Queens and 4 Kings.
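A quick sketch in Python (assumed code, not part of the original notes) that builds the sample spaces listed above and confirms their sizes:

# Build the sample spaces for repeated coin tosses and dice rolls
from itertools import product

coins_twice = list(product("HT", repeat=2))     # HH, HT, TH, TT
coins_thrice = list(product("HT", repeat=3))    # 8 outcomes
two_dice = list(product(range(1, 7), repeat=2)) # 36 ordered pairs
print(len(coins_twice), len(coins_thrice), len(two_dice))  # 4 8 36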
Relative frequency definition –
Statistical or A posteriori – British
 If an experiment is repeated a very large number of times n under
identical sets of conditions, and event A occurs m times, then the ratio or
relative frequency m/n approaches the probability of event A.
 P(A) = lim (n→∞) m/n
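An illustrative simulation (assumed code, not from the notes) of this definition: as the number of tosses n grows, the relative frequency m/n of heads settles near 0.5.

# Estimate P(Head) for a fair coin by relative frequency
import random

random.seed(0)
for n in (100, 10_000, 1_000_000):
    m = sum(random.random() < 0.5 for _ in range(n))  # m = number of heads in n tosses
    print(n, m / n)  # m/n approaches 0.5 as n grows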
Operations on events – Set
theoretic approach to probability
 A subset of S

 Union event A ∪ B and Intersection event A ∩ B

 Difference events A – B and B – A

 Complementary event A’
Axiomatic or Modern definition
 Probability is a function P defined on the events of a sample space S
satisfying: P(A) ≥ 0 for every event A, P(S) = 1, and P(A ∪ B) = P(A) + P(B)
for mutually exclusive events A and B.
Addition rule of Probability
 For two events
 P(A ∪ B) = P(A) + P(B) – P(A ∩ B) = P(A or B) = P(A+B)
 P(A ∩ B) = P(A) + P(B) – P(A ∪ B)
 For two Mutually exclusive events
 P(A ∪ B) = P(A) + P(B)
 For two Exhaustive events
 1 = P(A) + P(B) – P(A ∩ B)
 For two Mutually exclusive and Exhaustive events
 1 = P(A) + P(B)
 For three events
 P(A ∪ B ∪ C) = P(A) + P(B) + P(C) – P(A ∩ B) – P(B ∩ C) – P(C ∩ A) + P(A ∩ B ∩ C)
 For three Mutually exclusive and Exhaustive events
 1 = P(A) + P(B) + P(C)
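 For example (a worked illustration): in a single roll of a die, let A = {2, 4, 6}
(an even number) and B = {1, 2} (a number less than 3). Then P(A) = 3/6,
P(B) = 2/6 and P(A ∩ B) = P({2}) = 1/6, so P(A ∪ B) = 3/6 + 2/6 – 1/6 = 4/6 = 2/3.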
Conditional probability
 P(A) can be different if A is dependent on another event B and if it is
known that B has already occurred. In this case, A is dependent event
on event B and Probability of A given that B has occurred is written as
P(A/B). Similarly if B is dependent on event A then the probability of
event B if it is known that A has already occurred is written as P(B/A).
 P(A/B) = P(A ∩ B)/P(B) and P(B/A) = P(A ∩ B)/P(A)
 In both the above formulas, the denominator can not be zero.


Therefore, in case of P(A/B), B can not be impossible event so B’ can not
be sure event. Same way, in case of P(B/A), A can not be impossible
event and hence A’ can not be sure event.
 Same way, the formulas can be applied for P(A/B’) or P(B/A’) or P(A’/B’)
or P(B’/A’)
Example of Conditional
probabilities
               B (Married)   B’ (Unmarried)   Total
A (Males)           50              10          60
A’ (Females)        30              10          40
Total               80              20         100
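From the table (assuming one person is selected at random from the 100): P(A ∩ B) = 50/100 and P(B) = 80/100, so P(A/B) = (50/100)/(80/100) = 50/80 = 5/8, the probability that the person is a male given that the person is married. Similarly, P(B/A) = 50/60 = 5/6 and P(A/B’) = 10/20 = 1/2.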
Compound Probability theorem or Rule
of Multiplication of Probability or Joint
Probability
 For two independent events A and B, P(A ∩ B) = P(A).P(B) and therefore, P(A ∪ B) =
P(A) + P(B) – P(A).P(B)
 Since two mutually exclusive events have P(A ∩ B) = 0, they can not be independent.
Also, since they can not occur together, the occurrence of one tells us that the other
has not occurred, so they are dependent on each other.
 Two different students attempting a question of a QA paper and answering it correctly
are independent events, but the same student answering two different questions
correctly may be dependent events.
 If A and B are independent their Complementary events are also independent. A and
B’ are also independent, B and A’ are also independent and A’ and B’ are also
independent, and therefore, to each of those pairs, the formula of multiplication will
apply in the same manner.
 Same way, for three independent events A, B and C, P(A ∩ B ∩ C) = P(A).P(B).P(C)
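 As an illustration (the numbers here are assumed, not taken from the notes): if two
students attempt a question independently with P(A) = 0.6 and P(B) = 0.5 of answering
correctly, then P(both correct) = P(A ∩ B) = 0.6 × 0.5 = 0.30 and P(at least one correct) =
P(A ∪ B) = 0.6 + 0.5 – 0.30 = 0.80.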
De Morgan’s Law and Difference
events
As per De Morgan,
 P((A ∩ B)’) = P(A’ ∪ B’)
 P((A ∪ B)’) = P(A’ ∩ B’)
 Same way for three events,
P((A ∩ B ∩ C)’) = P(A’ ∪ B’ ∪ C’) and P((A ∪ B ∪ C)’) = P(A’ ∩ B’ ∩ C’)
 P(A ∩ B’) = P(A) – P(A ∩ B)
 P(B ∩ A’) = P(B) – P(A ∩ B)
 If A ⊂ B, then P(A) = P(A ∩ B)
 If B ⊂ A, then P(B) = P(A ∩ B)
 P(A) ≥ P(A ∩ B)
 P(B) ≥ P(A ∩ B)
Summary of P(A ∩ B)

 P(A ∩ B) = P(A and B) = P(A.B)


 P(A ∩ B) = P(B ∩ A)
 P(A ∩ B) = P(A) + P(B) – P(A ∪ B)
 P(A ∩ B) = P(A) – P(A ∩ B′)
 P(A ∩ B) = P(B) – P(B ∩ A′)
 P(A ∩ B) = 0 for Mutually Exclusive events
 P(A ∩ B) = P(A).P(B) for Independent events
 P(A ∩ B) = P(A).P(B/A) for B dependent on A
 P(A ∩ B) = P(B).P(A/B) for A dependent on B
 P(A ∩ B) ≤ P(A)
 P(A ∩ B) ≤ P(B)
 P(A ∩ B) ≤ P(A ∪ B)
Random Variable
 A random variable or a stochastic variable is a function defined on the
sample space associated with a random experiment, assuming values
from the Real Number set R and assigning a real number to each and
every sample point of the random experiment. A random variable is
denoted by a capital letter such as X.
 For example, if a coin is tossed three times and X is the number of times
Heads occurring, then X is the random variable.
 In the above example, the sample space is {HHH, HHT, HTH, HTT, THH, THT,
TTH, TTT}.
 For these outcomes, the corresponding real numbers are respectively 3, 2,
2, 1, 2, 1, 1, 0. In the set builder approach we don’t write the same element
twice, so the set of values of X is {0, 1, 2, 3}, with the values occurring 1, 3,
3, 1 times respectively.
 In the above example, 1, 3, 3, 1 are the frequencies of the values 0, 1, 2, 3
among the 8 equally likely outcomes. Converting them into relative
frequencies (1/8, 3/8, 3/8, 1/8) gives the relative frequency distribution or
probability distribution of X.
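A small sketch in Python (assumed code, not part of the original notes) that tabulates this distribution by listing the 8 equally likely outcomes:

# Probability distribution of X = number of heads in three tosses of a fair coin
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=3))          # 8 equally likely outcomes
counts = Counter(o.count("H") for o in outcomes)  # frequencies 1, 3, 3, 1
distribution = {x: counts[x] / len(outcomes) for x in sorted(counts)}
print(distribution)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}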
Probability distribution (Example of
tossing a coin thrice and X = number
of Heads)
xi      pi
0       1/8
1       3/8
2       3/8
3       1/8
Expected Value and Variance

 Expected Value of X = Mean of the Probability Distribution of X:
μ = E(X) = Σ pᵢ xᵢ
 E(X²) = Σ pᵢ xᵢ²
 Variance of X = Var(X) = σ² = E[(X − μ)²]
= E(X²) − μ² = Σ pᵢ xᵢ² − [E(X)]²
= Σ pᵢ xᵢ² − (Σ pᵢ xᵢ)²
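A hedged sketch in Python (the helper names are my own, not from the notes) computing E(X), E(X²) and Var(X) from a distribution given as {value: probability} pairs:

# Expected value and variance of a discrete probability distribution
def expected_value(dist):
    return sum(p * x for x, p in dist.items())

def variance(dist):
    e_x2 = sum(p * x * x for x, p in dist.items())   # E(X^2)
    return e_x2 - expected_value(dist) ** 2          # E(X^2) - [E(X)]^2

dist = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}  # coin tossed thrice, X = number of heads
print(expected_value(dist), variance(dist))  # 1.5 0.75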
Probability distribution (Example of
tossing a coin thrice and X = number
of Heads)
xi      pi      pi xi        pi xi²
0       1/8     0            0
1       3/8     3/8          3/8
2       3/8     6/8          12/8
3       1/8     3/8          9/8
Total           12/8 = 1.5   24/8 = 3
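From the totals in the table, E(X) = Σ pᵢ xᵢ = 12/8 = 1.5 and E(X²) = Σ pᵢ xᵢ² = 24/8 = 3, so Var(X) = 3 − (1.5)² = 0.75 and the standard deviation is √0.75 ≈ 0.87.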


Properties of Random Variable and
Expected Value
 Expected value of a constant k is the constant itself. E(k) = k
 Expected value of sum of two random variables is the sum of the
expected values of the two variables. E(X+Y) = E(X) + E(Y)
 Expected value of product of constant and a random variable is
the product of constant and the expected value of the random
variable. E(kX) = k.E(X)
 Expected value of the product of two independent random
variables is the product of the expected values of both random
variables. E(XY) = E(X).E(Y)
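 For the coin-toss variable above, for instance, E(2X + 3) = 2E(X) + 3 =
2(1.5) + 3 = 6, combining the first three properties (a worked illustration,
not from the notes).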
