Module 1

The document outlines the course BECE207L on Random Processes, focusing on modeling and analyzing probabilistic systems, with applications in digital communication and various follow-up fields. It details course objectives, outcomes, and evaluation criteria, as well as key concepts related to random variables, probability distributions, and specific examples of random processes. The syllabus includes topics on continuous and discrete random variables, probability mass functions, and binomial experiments.

BECE207L Random Process

Why to Study this Course


Tools for Modeling and Analyzing
Probabilistic Systems
Example: Digital Communication Link
What is the Probabilistic Model used for?
Follow Up Course
• Signal processing
• Communications
• Machine learning
• Computer Vision
• Image and video processing
- It also provides a good background for other areas (e.g., noise in devices, circuits, and biological systems).
Course Objective
1. To familiarize the students with two- and multi-random-variable theory.
2. To enable the students to process random signals in the time and frequency domains.
3. To make the students understand noise concepts and design a matched filter to increase the Signal-to-Noise Ratio (SNR).
Course Outcome
1. Compute the probability density functions for multiple
random variables.
2. Perform transformation on multiple random variables
and complex random variables.
3. Interpret the random processes in terms of stationarity,
statistical independence, and correlation.
4. Compute the power spectral density of the random
signals.
5. Interpret the effect of random signals on LTI systems
output both in the time and frequency domain.
6. Design the Optimum linear systems for extracting
signals in the presence of noise.
Syllabus
Syllabus (Contd.,)
Syllabus (Contd.,)
Books
Grading policy

Evaluation Criteria Marks Weightage


CAT-I 50 15
CAT-II 50 15
Assessment 30 30
FAT 100 40
Module 1 Continuous and Discrete Multiple Random Variables
Introduction to Random Variables – Vector Random Variables- Joint
Distribution and its Properties-Joint Density and its Properties-Joint
Probability Mass Function – Conditional Distribution and Density-
Statistical Independence –Distribution and Density of Function of
Random Variables – Central Limit Theorem
Random Variable (Terminologies)
Experiment: any process of observation or measurement; it may or may not produce a numerical value.
e.g.,
– checking a switch (on/off)
– imperfections in cloth
– mass of electrons
– PCB testing
– rainfall
Experiments can be deterministic or random:
• Deterministic – the outcome can be predicted, e.g., the current through a known potential difference and resistance
• Random experiment – the outcome cannot be predicted, e.g., the number shown when a six-faced die (cube) is rolled
Random Variable (Terminologies)
Outcome: the result obtained from an experiment
- yes/no
- values obtained through extensive calculation
Sample space: the set of all possible outcomes of an experiment, usually denoted by the letter 'S'.
Each outcome in the sample space is called an element of the sample space.
e.g., flipping a coin:
S = {H, T}
Random Variable
Random – the observed value depends on which of the possible outcomes occurs
Variable – takes different numerical values
Random Variable – a rule that assigns a numerical value to each possible outcome of an experiment.
Notation
- Always use upper-case letters for random variables (X, Y, ...)
- Always use lower-case letters for values of random variables: X = x means that the random variable X takes on the value x
Example 1: Machine Breakdown
- Sample space: S = {electrical, mechanical, misuse}
– Each of these failures may be associated with a repair cost
– State space: {50, 200, 350}
– Cost is a random variable taking the values 50, 200, and 350
Range space: the set of all values taken by the random variable X, here RX = {50, 200, 350}
Random Variables
• Why Random variables?
- Random variables can represent the gain or loss in a
random experiment e.g., stock market
- Random variable can represent a measurement over
a random experiment, e.g., noise voltage on a resistor
• In most applications we care more about
these costs/measurements than the
underlying probability space
• Very often we work directly with random
variables without knowing (or caring to know)
the underlying probability space
Types of Random Variables
We classify r.v.s as:
Discrete: X can assume only one of a countable number of
values. Such r.v. can be specified by a Probability Mass Function
(pmf).
Examples: number of scratches on a surface, proportion of defective parts among 1,000, number of transmitted bits in error
Continuous: X can assume one of a continuum of values and the
probability of each value is 0. Such r.v. can be specified by a
Probability Density Function (pdf).
Examples electrical current, length, pressure, time,
voltage
Mixed: X is neither discrete nor continuous. Such r.v. (as well as
discrete and continuous r.v.s) can be specified by a Cumulative
Distribution Function (cdf) Example: Vehicle wait time
Examples of Random Variables
1. Flip a coin n times. Here S={H,T}n. Define the random variable X∈{0,1,2,...,n} to be the
number of heads
2. Roll a 4-sided die twice
(a)Define the random variable X as the maximum of the two rolls
(b)Define the random variable Y to be the sum of the outcomes of the two rolls
(c)Define the random variable Z to be 0 if the sum of the two rolls is odd and 1 if it is
even
3. Flip coin until first heads shows up. Define the random variable X∈{1,2....}to be the
number of flips until the first heads
4. Let S = ℝ. Define the random variable
Y = +1 for ω ≥ 0
Y = −1 otherwise
5. N packets arrive at a node in a communication network. Here S is the set of arrival
time sequences(t1,t2,...,tn)∈(0,∞)n
(a) Define the random variable N to be the number of packets arriving in the interval
(0,1]
(b)Define the random variable T to be the first interarrival time
Probability
• Used to quantify the likelihood or chance that particular values occur.
- The likelihood is quantified by assigning a number from the
interval [0, 1] to the set of values (or a percentage from 0 to
100%)
- Higher numbers indicate that the set of values is more likely

•Used to represent risk or uncertainty in engineering


applications
•Can be interpreted as our degree of belief or
relative frequency
Example: Your team has won 9 games from a total of 12 games played:
the frequency of winning is 9; the relative frequency of winning is 9/12 = 75%
Probability

• A probability is usually expressed in terms of a random variable.
• Let X denote the part length; the probability statement can then be written in either of the following forms:

P(10.8 ≤ X ≤ 11.2) = 0.25 or P(X ∈ [10.8, 11.2]) = 0.25

• Both equations state that the probability that the random variable X assumes a value in [10.8, 11.2] is 0.25.
Probability

Complement of an Event
• Given a set E, the complement of E is the set of
elements that are not in E. The complement is
denoted as E’.

Mutually Exclusive Events


•The sets E1 , E2 ,...,Ek are mutually exclusive if
the intersection of any pair is empty. That is, each
element is in one and only one of the sets E1 , E2
,...,Ek .
Probability
Events
•A measured value is not always obtained from an
experiment.
• Sometimes, the result is only classified (into one of
several possible categories).
• These categories are often referred to as events.
Illustrations
• a current measurement might only be recorded as low, medium, or high;
• a manufactured electronic component might be classified only as defective or not;
• either a message is sent through a network or not.
Probability

Probability Properties
Probability functions

• A probability function maps the possible values of


x against their respective probabilities of
occurrence, p(x)
• p(x) is a number from 0 to 1.0.
• The total probability (the sum, or area, under a probability function) is always 1.
Practice Problem

Which of the following are probability functions?

a. f(x)=.25 for x=9,10,11,12

b. f(x)= (3-x)/2 for x=1,2,3,4

c. f(x) = (x² + x + 1)/25 for x = 0, 1, 2, 3


Answer (a)

a. f(x)=.25 for x=9,10,11,12


x    f(x)
9    .25
10   .25
11   .25
12   .25
sum: 1.0
Yes, it is a probability function!
Answer (b)

b. f(x)= (3-x)/2 for x=1,2,3,4


x    f(x)
1    (3 − 1)/2 = 1.0
2    (3 − 2)/2 = .5
3    (3 − 3)/2 = 0
4    (3 − 4)/2 = −.5
Though this sums to 1, you can't have a negative probability; therefore, it's not a probability function.
Answer (c)
c. f(x) = (x² + x + 1)/25 for x = 0, 1, 2, 3
x    f(x)
0    1/25
1    3/25
2    7/25
3    13/25
sum: 24/25
Doesn't sum to 1. Thus, it's not a probability function.
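The three answers can be verified with a short script that applies the two pmf conditions (non-negativity and summing to 1 over the stated domain); this is a minimal sketch, and the helper name is my own:

```python
# Check which candidate functions are valid pmfs: every value must be
# non-negative and the values must sum to 1 over the whole domain.

def is_pmf(values):
    """True iff the listed probabilities form a valid pmf."""
    return all(v >= 0 for v in values) and abs(sum(values) - 1.0) < 1e-9

a = [0.25 for x in (9, 10, 11, 12)]              # f(x) = .25
b = [(3 - x) / 2 for x in (1, 2, 3, 4)]          # f(x) = (3 - x)/2
c = [(x**2 + x + 1) / 25 for x in (0, 1, 2, 3)]  # f(x) = (x² + x + 1)/25

print(is_pmf(a), is_pmf(b), is_pmf(c))  # True False False
```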
Probability distribution
Specifying a random variable means being able to
determine the probability
• The probability distribution of X says how the total
probability of 1 is distributed among (allocated to) the
various possible X values.
e.g., 4 Printers purchased, number of printers to be serviced
within warranty period. Possible X values are then 0, 1, 2, 3, and
4.
- how much probability is associated with the X value 0, P(X=0)=
p(0)
- how much is apportioned to the X value 1, P(X=1)= p(1)
and so on
p(x) – probability assigned to value x
Discrete Random Variable- Probability mass function (pmf)

The probability mass function (pmf) describes how the total probability mass of 1 is distributed at various points along the axis of possible values of a r.v.:

pX(x) = P(X = x) for all x

• A function can serve as the probability distribution of a discrete random variable X if and only if its values pX(x) satisfy the conditions:
1. pX(x) ≥ 0 for each value within its domain
2. Σx pX(x) = 1, where the summation extends over all the values within its domain
Widely used Probability Mass Function (pmf)

Binomial:
The binomial r.v. represents, for example, the number of
heads in n independent coin flips
Poisson
-The Poisson r.v. often represents the number of random
events
e.g., arrivals of packets, photons, customers, etc.in some time
interval, e.g., (0,1)
Geometric
This r.v. represents, for example, the number of coin flips
until the first heads shows up (assuming independent
coin flips)
Binomial Experiments
A binomial experiment is a probability experiment that
satisfies the following conditions:

- The experiment is repeated for a fixed number of


trials, where each trial is independent of other
trials.
- There are only two possible outcomes of interest
for each trial. The outcomes can be classified as a
success (S) or as a failure (F).
- The probability of a success P(S) is the same for
each trial.
-The random variable x counts the number
of successful trials
Binomial Probability
Binomial: X ∼ B(n, p) for integer n > 0 and 0 ≤ p ≤ 1 has pmf

P(x) = nCx p^x q^(n−x) = [n! / ((n − x)! x!)] p^x q^(n−x)

where x = 0, 1, 2, …, n
Symbol      Description
n           The number of times a trial is repeated.
p = P(S)    The probability of success in a single trial.
q = P(F)    The probability of failure in a single trial. (q = 1 – p)
x           The random variable, a count of the number of successes in n trials: x = 0, 1, 2, 3, …, n.
Binomial Probability
Example:
A bag contains 10 chips. 3 of the chips are red, 5 of the chips are
white, and 2 of the chips are blue. Three chips are selected, with
replacement. Find the probability that you select exactly one red
chip.
p = the probability of selecting a red chip = 3/10 = 0.3
q = 1 – p = 0.7
n = 3, x = 1

P(1) = 3C1 (0.3)¹(0.7)² = 3(0.3)(0.49) = 0.441
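As a quick numerical check of this example, the binomial pmf can be evaluated directly; a sketch using only the Python standard library (the function name is my own):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = nCx * p**x * (1 - p)**(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Exactly one red chip in 3 draws with replacement, p = 3/10
print(round(binomial_pmf(1, 3, 0.3), 3))  # 0.441
```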
Binomial Experiments
Example:
Decide whether the experiment is a binomial experiment. If it is,
specify the values of n, p, and q, and list the possible values of the
random variable x. If it is not a binomial experiment, explain why.

• You randomly select a card from a deck of cards, and note


if the card is an Ace. You then put the card back and repeat
this process 8 times.

This is a binomial experiment. Each of the 8 selections


represent an independent trial because the card is replaced
before the next one is drawn. There are only two possible
outcomes: either the card is an Ace or not.

n = 8, p = 4/52 = 1/13, q = 1 − 1/13 = 12/13, x = 0, 1, 2, 3, 4, 5, 6, 7, 8
Binomial Experiments
Example:
Decide whether the experiment is a binomial experiment.
If it is, specify the values of n, p, and q, and list the
possible values of the random variable x. If it is not a
binomial experiment, explain why.
• You roll a die 10 times and note the number the die lands
on.

This is not a binomial experiment. While each trial


(roll) is independent, there are more than two
possible outcomes: 1, 2, 3, 4, 5, and 6.
Binomial Probability Distribution
Example:
A bag contains 10 components. 3 of it are resistors, 5 capacitors, and
2 inductors. Four components are selected, with replacement. Create
a probability distribution for the number of resistors selected.

p = the probability of selecting a resistor = 3/10 = 0.3
q = 1 – p = 0.7
n = 4, x = 0, 1, 2, 3, 4

x    P(x)
0    0.240
1    0.412
2    0.265
3    0.076
4    0.008

The binomial probability formula is used to find each probability.
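The table can be reproduced by evaluating the binomial formula at each x; a short sketch (values match the table after rounding to three decimals):

```python
from math import comb

# n = 4 draws with replacement, p = 0.3 of drawing a resistor
n, p = 4, 0.3
dist = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

for x, prob in dist.items():
    print(x, round(prob, 3))         # 0.24, 0.412, 0.265, 0.076, 0.008

print(round(sum(dist.values()), 6))  # 1.0 — the probabilities sum to 1
```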
Finding Probabilities
Example:
The following probability distribution represents the probability of selecting 0, 1, 2, 3, or 4 red
chips when 4 chips are selected.

x    P(x)
0    0.24
1    0.412
2    0.265
3    0.076
4    0.008

a.) Find the probability of selecting no more than 3 red chips.
b.) Find the probability of selecting at least 1 red chip.

a.) P(no more than 3) = P(x ≤ 3) = P(0) + P(1) + P(2) + P(3)
    = 0.24 + 0.412 + 0.265 + 0.076 = 0.993
b.) P(at least 1) = P(x ≥ 1) = 1 – P(0) = 1 – 0.24 = 0.76 (complement)
Graphing Binomial Probabilities
Example:
The following probability distribution represents the probability of
selecting 0, 1, 2, 3, or 4 red chips when 4 chips are selected. Graph
the distribution using a histogram.
x    P(x)
0    0.24
1    0.412
2    0.265
3    0.076
4    0.008

(Histogram "Selecting Red Chips": probability P(x) on the vertical axis, number of red chips x = 0, 1, 2, 3, 4 on the horizontal axis.)
Poisson Distribution
The Poisson distribution is a discrete probability distribution
of a random variable x that satisfies the following conditions.
- The experiment consists of counting the number of
times an event, x, occurs in a given interval. The
interval can be an interval of time, area, or volume.
- The probability of the event occurring is the same for
each interval.
- The number of occurrences in one interval is
independent of the number of occurrences in other
intervals.
The probability of exactly x occurrences in an interval is:

P(X = x) = μ^x e^(−μ) / x!

where e ≈ 2.71828 and μ is the mean number of occurrences per interval.
Poisson Distribution
The probability of an event is the number of favorable outcomes
divided by the total number of outcomes possible.

For example:

Rolling a 3 on a die, the number of events is 1 (there's only a


single 3 on each die), and the number of outcomes is 6.
Poisson Distribution
Message arrive at a switchboard in a Poisson manner at an
average rate of six per hour. Find the probability for each of the
following events.

a) Exactly two messages arrive within one hour.


b) No message arrives within one hour.
c) At least three messages arrive within one hour.
a) μ = 6, x = 2
P(x = 2) = 6² e⁻⁶ / 2! ≈ 0.0446

b) μ = 6, x = 0
P(x = 0) = 6⁰ e⁻⁶ / 0! ≈ 0.0025

c) μ = 6
P(x ≥ 3) = 1 − P(x < 3) = 1 − [P(0) + P(1) + P(2)]
P(x ≥ 3) = 1 − [6⁰e⁻⁶/0! + 6¹e⁻⁶/1! + 6²e⁻⁶/2!]
P(x ≥ 3) ≈ 0.9380
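A short script reproduces all three answers from the Poisson pmf (note that part (b), e⁻⁶ = 0.002479…, rounds to 0.0025 at four decimal places); the function name is my own:

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """P(X = x) = mu**x * e**(-mu) / x!."""
    return mu**x * exp(-mu) / factorial(x)

mu = 6                                   # six messages per hour on average
p2 = poisson_pmf(2, mu)                  # a) exactly two messages
p0 = poisson_pmf(0, mu)                  # b) no message
p_ge3 = 1 - sum(poisson_pmf(k, mu) for k in range(3))  # c) at least three

print(round(p2, 4), round(p0, 4), round(p_ge3, 4))  # 0.0446 0.0025 0.938
```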
Poisson Distribution
Example:
The mean number of power outages in a city is 4 per year. Find the
probability that in a given year,
a) there are exactly 3 outages,
b) there are more than 3 outages

a)  = 4, x = 3 b) P (more than 3)
= 1 − P (x  3)
3 -4
P (3) = 4 (2.71828)
3! = 1 − [P (3) + P (2) + P (1) + P (0)]

 0.195 = 1 − (0.195 + 0.147 + 0.073 + 0.018)


 0.567
Geometric Distribution
A geometric distribution is a discrete probability
distribution of a random variable x that satisfies the
following conditions.
- A trial is repeated until a success occurs.
- The repeated trials are independent of each
other.
- The probability of a success p is constant for
each trial
The probability that the first success will occur on
trial x is
- Geometric: X ∼ Geom(p) for 0 ≤ p ≤ 1 has pmf
P (x) = p(q)x – 1, where q = 1 – p.
Geometric Distribution
Example:
A fast food chain puts a winning game piece on every fifth package of French fries. Find
the probability that you will win a prize,
a.) with your third purchase of French fries,
b.) with your third or fourth purchase of French fries.
p = 0.20, q = 0.80

a.) x = 3
P(3) = (0.2)(0.8)^(3 – 1) = (0.2)(0.8)² = (0.2)(0.64) = 0.128

b.) x = 3, 4
P(3 or 4) = P(3) + P(4) ≈ 0.128 + 0.102 ≈ 0.230
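Both parts can be verified from the geometric pmf; a minimal sketch (the function name is my own):

```python
def geometric_pmf(x, p):
    """P(first success on trial x) = p * (1 - p)**(x - 1)."""
    return p * (1 - p)**(x - 1)

p = 0.20                            # winning piece on every fifth package
p3 = geometric_pmf(3, p)            # a) win on the third purchase
p3_or_4 = p3 + geometric_pmf(4, p)  # b) win on the third or fourth

print(round(p3, 3), round(p3_or_4, 4))  # 0.128 0.2304
```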
Geometric Distribution
A bag contains six blue balls and four red balls. Balls are
randomly drawn from the bag, one at a time, until a red ball is
obtained. If we assume that each drawn ball is replaced before
the next one is drawn, what is the probability that the
experiment stops after exactly five balls have been drawn?
Success (drawing a red ball) has probability p = 4/10 = 0.4.
The probability that the first success occurs on trial x is P(x) = p(q)^(x – 1), where q = 1 – p; here x = 5, so
P(5) = (0.4)(0.6)⁴ ≈ 0.0518
• Binomial distribution
• The number of trials is fixed, and the random
variable counts the number of successes in
those trials.
• Geometric distribution
• The number of trials is not fixed, and the
random variable counts the number of trials
required to get the first success.
Continuous Random Variable
Suppose a r.v. X can take on a continuum of values, and the probability of taking any specific value is 0.
Examples:
- Pick a number between 0 and 1
- Measure the voltage across a heated resistor
- Measure the phase of a random sinusoid. . .
• How do we describe probabilities of interesting events?
Idea:
• For discrete r.v., we sum a pmf over points in a set to find
its probability
• For continuous r.v., integrate a probability density over a
set to find its probability—analogous to mass density in
physics (integrate mass density to get the mass)
Continuous Random Variable-Probability Density
Function
A continuous r.v. X can be specified by a probability density function fX(x) (pdf) such that, for any event A,

P(X ∈ A) = ∫_A fX(x) dx

For example, for A = (a, b], the probability can be computed as

P(a < X ≤ b) = ∫_a^b fX(x) dx

The pdf is used to specify the probability of the random variable falling within a particular range of values.
Properties of fX(x):
1. fX(x) ≥ 0
2. ∫_{−∞}^{∞} fX(x) dx = 1
Important note: fX(x) should not be interpreted as the probability that X = x; in fact it is not a probability measure, e.g., it can be > 1.
Notation: X ~ fX(x) means that X has pdf fX(x)
Continuous Random Variables
Example: Metal Cylinder Production
Random variable X is the diameter of a randomly chosen
cylinder manufactured by the company. Random variable can
take any value between 49.5 and 50.5
Suppose that the diameter of a metal cylinder has the p.d.f.

f(x) = 1.5 − 6(x − 50)² for 49.5 ≤ x ≤ 50.5
f(x) = 0 elsewhere

i. Check whether it is a valid pdf
ii. Determine the probability that the metal cylinder has a diameter between 49.8 and 50.1 mm
Probability Density Function
• The probability that a metal cylinder has a diameter between 49.8 and 50.1 mm can be calculated to be

∫_{49.8}^{50.1} (1.5 − 6(x − 50.0)²) dx = [1.5x − 2(x − 50.0)³]_{49.8}^{50.1}
= [1.5 × 50.1 − 2(50.1 − 50.0)³] − [1.5 × 49.8 − 2(49.8 − 50.0)³]
= 75.148 − 74.716 = 0.432
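Both parts of this example can be checked numerically: the total area under f should be 1 (a valid pdf), and P(49.8 < X < 50.1) should match the analytic value 0.432. A minimal sketch using a simple midpoint rule; the helper names are my own:

```python
def f(x):
    """pdf of the cylinder diameter."""
    return 1.5 - 6 * (x - 50.0)**2 if 49.5 <= x <= 50.5 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

print(round(integrate(f, 49.5, 50.5), 3))  # 1.0 -> valid pdf
print(round(integrate(f, 49.8, 50.1), 3))  # 0.432
```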


Problem
Famous Probability Density Function (pdf)
Uniform
Uniform r.v. is commonly used to model quantization
noise and finite precision computation error (roundoff
error)
Exponential
Exponential r.v. is commonly used to model interarrival
time in a queue,
i.e., time between two consecutive packet or customer arrivals,
service time in a queue, and lifetime of a particle, etc.
Gaussian
Gaussian r.v.s are frequently encountered in nature,
e.g., thermal and shot noise in electronic devices are Gaussian,
and very frequently used in modelling various social, biological,
and other phenomena
Uniform Distribution
Exponential Distribution
Gaussian Distribution
Cumulative Distribution Function (cdf)
• For discrete r.v.s we use pmf’s, for continuous r.v.s we use pdf’s
• Many real-world r.v.s are mixed, that is, have both discrete
and continuous components
Example: A packet arrives at a router in a communication
network. If the input buffer is empty (happens with
probability p), the packet is serviced immediately. Otherwise
the packet must wait for a random amount of time as
characterized by a pdf
Define the r.v. X to be the packet service time. X is neither
discrete nor continuous
• There is a third probability function that characterizes
all random variable types —discrete, continuous, and
mixed. The cumulative distribution function or cdf FX(x) of
a random variable is defined by
FX(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du for −∞ < x < ∞
Properties of Distribution Function or CDF
There are six properties
1. 𝐹𝑋 −∞ = 0
2. 𝐹𝑋 ∞ = 1
3. 0 ≤ 𝐹𝑋 𝑥 ≤ 1
4. 𝐹𝑋 𝑥1 ≤ 𝐹𝑋 𝑥2 if x1 < x2
5. 𝑃 𝑎 < 𝑋 ≤ 𝑏 = 𝑃 𝑋 ≤ 𝑏 − 𝑃 𝑋 ≤ 𝑎
=𝐹𝑋 𝑏 - 𝐹𝑋 𝑎
6. FX(x⁺) = FX(x) (continuity from the right)
How to determine whether some function GX(x) is a valid distribution function?
Ans: If properties 1, 2, 4, and 6 are satisfied, then the function GX(x) is a distribution function or CDF.
Properties of Prob. Density function
1. fX(x) ≥ 0 for all x
2. ∫_{−∞}^{∞} fX(x) dx = 1
3. FX(x) = ∫_{−∞}^{x} fX(u) du
4. P(x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} fX(x) dx
Examples

Cdf of a Gaussian r.v.:


Cumulative Distribution function (cdf)

1. What's the probability that you roll a 3 or less (fair six-faced die)? P(X ≤ 3) = 3/6 = 1/2

2. What's the probability that you roll a 5 or higher?
P(X ≥ 5) = 1 – P(X ≤ 4) = 1 – 4/6 = 1/3
Cumulative Distribution Function
Problems
• Find the values of real constants a and b such
that the following function is a valid
distribution function.
GX(x) = [1 – a·e^(−x/b)] u(x)
Ans:
Test 1: Property 1 — GX(−∞) = 0, since u(−∞) = 0
Test 2: Property 2 — GX(∞) = 1, provided b > 0
Test 3: Property 4 — GX(x) is a non-decreasing function of x, provided a > 0 and b > 0
Test 4: Property 6 — GX(x⁺) = GX(x): the function is continuous from the right for every x
Hence GX(x) is a valid distribution function for b > 0 and 0 < a ≤ 1 (the condition a ≤ 1 keeps GX(0) = 1 − a ≥ 0).
Probability Density Function
The PDF is used to define the probability of the random variable falling within a distinct range of values, as opposed to taking on any one value.
• The probability density function, denoted fX(x), is defined as the derivative of the distribution function:

fX(x) = dFX(x)/dx

• fX(x) is the density function of the random variable X.
Problems
If

f(x) = x·e^(−x²/2) for x ≥ 0
f(x) = 0 for x < 0

i) Show that f(x) is a pdf
ii) Find its distribution function
• The diameter of an electric cable X is a continuous random variable with pdf

f(x) = k·x(1 − x) for 0 ≤ x ≤ 1
f(x) = 0 elsewhere

Find:
i) the value of k
ii) the cdf of X
iii) a number b such that P(X < b) = P(X > b)
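A sketch of the solution steps for this pdf, with the analytic reasoning carried in the comments: normalization fixes k, integrating gives the cdf, and P(X < b) = P(X > b) makes b the median.

```python
# Normalization for f(x) = k*x*(1 - x) on [0, 1]:
# ∫0..1 x(1 - x) dx = 1/2 - 1/3 = 1/6, so k = 6.
# Integrating 6x(1 - x) gives the cdf F(x) = 3x^2 - 2x^3, and
# P(X < b) = P(X > b) means F(b) = 1/2, i.e. b is the median;
# by the symmetry of f about x = 0.5, b = 0.5.
k = 1 / (1 / 2 - 1 / 3)

def F(x):
    """cdf obtained by integrating k*x*(1 - x) from 0 to x."""
    return 3 * x**2 - 2 * x**3

print(round(k), F(0.5))  # 6 0.5
```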
Problems

A random variable X is known to have the distribution function

FX(x) = u(x)[1 − e^(−x²/b)], where b > 0.

Find the density function.
Multiple random variables
• Vector Random Variable

• Let X and Y be random variables defined on a sample space S, where x and y are specific values of X and Y respectively.
• Any ordered pair of numbers (x, y) is considered a random point in the xy plane, and the point may take a specific value of a vector random variable.
• The pair (X, Y) maps S into a new sample space, the joint sample space (SJ).
Multiple random variables
• The joint event for two random variables is denoted by {X ≤ x, Y ≤ y}.
• When two random variables X and Y are defined on a sample space S, they are the components of a two-dimensional random variable.
• When N random variables X1, X2, …, XN are defined on a sample space S, they are the components of an N-dimensional random vector (N-dimensional random variable).

• The joint sample space is N dimensional.


Joint Probability Mass Function
Used for discrete random variable
• Probability Mass Function for a random variable X is denoted by
𝑃𝑋 𝑥 = 𝑃(𝑋 = 𝑥)
• Joint Probability Mass function:
𝑃𝑋𝑌 𝑥, 𝑦 = 𝑃 𝑋 = 𝑥, 𝑌 = 𝑦
= 𝑃( 𝑋 = 𝑥 𝑎𝑛𝑑 (𝑌 = 𝑦))
Properties of Joint pmf
• 0 ≤ p(x, y) ≤ 1
• Σx Σy p(x, y) = 1
• P((X, Y) ∈ A) = Σ_{(x,y)∈A} p(x, y)


Joint Probability Mass Function
• Marginal Probability

pX(x) = Σj p(x, yj) → fix the value of X and sum over the possible values of Y
pY(y) = Σi p(xi, y) → fix the value of Y and sum over the possible values of X
Conditional Probability Mass Function
Joint Probability Mass Function
• Discrete random variables X1, X2, …, Xn are independent if the joint pmf is equal to the product of the marginal pmfs:
p(x1, x2, …, xn) = pX1(x1)·pX2(x2)…pXn(xn)
• If X and Y are independent random variables, then E[XY] = E[X]E[Y].
• Suppose that X and Y are jointly distributed discrete random variables with joint pmf p(x, y). If g(X, Y) is a function of these two random variables, then its expected value is given by

E[g(X, Y)] = Σx Σy g(x, y) p(x, y)
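A small numerical illustration of marginals, independence, and E[g(X, Y)]. The joint pmf below is a made-up example, not from the slides: it is built from independent marginals, so p(x, y) = pX(x)·pY(y) holds by construction and E[XY] = E[X]E[Y] should come out exactly.

```python
# Made-up marginals (assumption for illustration only)
pX = {0: 0.4, 1: 0.6}
pY = {0: 0.5, 1: 0.3, 2: 0.2}
p = {(x, y): pX[x] * pY[y] for x in pX for y in pY}  # joint pmf

# Marginal pmfs recovered by summing out the other variable
mX = {x: sum(p[(x, y)] for y in pY) for x in pX}
mY = {y: sum(p[(x, y)] for x in pX) for y in pY}

def expect(g):
    """E[g(X, Y)] = sum over x and y of g(x, y) * p(x, y)."""
    return sum(g(x, y) * q for (x, y), q in p.items())

E_X = expect(lambda x, y: x)
E_Y = expect(lambda x, y: y)
E_XY = expect(lambda x, y: x * y)
print(abs(E_XY - E_X * E_Y) < 1e-12)  # True: E[XY] = E[X]E[Y] under independence
```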
Joint Distribution Function
• For 1-D: FX(x) = P{X ≤ x}, where x is a real number ranging from −∞ to ∞.
• For 2-D: FX,Y(x, y) = P{X ≤ x, Y ≤ y}
P{X ≤ x, Y ≤ y} = P(A ∩ B), where A ∩ B is the joint event defined on S.

Joint Density Function
• For two random variables X and Y, the joint probability density function (or joint density function), denoted fX,Y(x, y), is defined as the second derivative of the joint distribution function wherever it exists:
fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x∂y

Properties
The joint density function for two random variables X and Y has the following six properties:
1. fX,Y(x, y) ≥ 0
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dx dy = 1
3. FX,Y(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} fX,Y(ξ1, ξ2) dξ1 dξ2
4. FX(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fX,Y(ξ1, ξ2) dξ2 dξ1
   FY(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} fX,Y(ξ1, ξ2) dξ1 dξ2
5. P{x1 < X ≤ x2, y1 < Y ≤ y2} = ∫_{y1}^{y2} ∫_{x1}^{x2} fX,Y(x, y) dx dy
6. fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy and fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx
Properties 1 and 2 are sufficient conditions to determine whether a function is a valid density function.

The joint distribution function for two random variables X and Y has the following six properties:
1. FX,Y(−∞, −∞) = 0, FX,Y(−∞, y) = 0, FX,Y(x, −∞) = 0
2. FX,Y(∞, ∞) = 1
3. 0 ≤ FX,Y(x, y) ≤ 1
4. FX,Y(x, y) is a non-decreasing function of both x and y
5. FX,Y(x2, y2) + FX,Y(x1, y1) − FX,Y(x1, y2) − FX,Y(x2, y1) = P{x1 < X ≤ x2, y1 < Y ≤ y2} ≥ 0
6. FX,Y(x, ∞) = FX(x), FX,Y(∞, y) = FY(y)
Properties 1, 2, and 5 must be satisfied for a valid joint distribution function.
Marginal Distribution and Density Functions
• Property 6 of the joint distribution states that the distribution function of one random variable can be obtained by setting the value of the other variable to infinity in FX,Y(x, y). The functions FX(x) and FY(y) obtained in this manner are called marginal distribution functions:
FX(x) = lim(y→∞) FX,Y(x, y) = FX,Y(x, ∞)
FY(y) = lim(x→∞) FX,Y(x, y) = FX,Y(∞, y)
• The functions fX(x) and fY(y) of property 6 of the joint density are called marginal probability density functions (or marginal density functions). They are the density functions of the single variables X and Y, defined as the derivatives of the marginal distribution functions:
fX(x) = dFX(x)/dx
fY(y) = dFY(y)/dy
Conditional Distribution and Density Function
Conditional probability and conditional expectation are useful concepts in probability theory, for two reasons:
1. We are often interested in calculating probabilities and expectations when partial information is available, so the desired probability is a conditioned one.
2. In calculating a desired probability or expectation, it is often useful to first condition on some appropriate random variable.
• The conditional distribution function of a random variable X, given some event B with non-zero probability, is
FX(x|B) = P{X ≤ x | B} = P{X ≤ x ∩ B} / P(B)
• The conditional density function is obtained by taking the derivative:
fX(x|B) = dFX(x|B)/dx
Point Conditioning
• If the distribution function of one random variable X is conditioned on the fact that a second random variable Y has some specific value y, this is called point conditioning.
• For continuous random variables:
fX(x|Y = y) = fX,Y(x, y) / fY(y)
fY(y|X = x) = fX,Y(x, y) / fX(x)
• For discrete random variables:
FX(x|Y = yk) = Σ(i=1 to N) [P(xi, yk) / P(yk)] u(x − xi)
and the density function after differentiation is
fX(x|Y = yk) = Σ(i=1 to N) [P(xi, yk) / P(yk)] δ(x − xi)
Statistical Independence
• Two events A and B are statistically independent if and only if
P(A ∩ B) = P(A)·P(B)
• Two random variables X and Y are said to be statistically independent if and only if
P{X ≤ x, Y ≤ y} = P{X ≤ x}·P{Y ≤ y}
• For the distribution function: FX,Y(x, y) = FX(x)·FY(y)
• For the density function: fX,Y(x, y) = fX(x)·fY(y)
• Conditional distribution functions for independent X and Y:
FX(x|Y ≤ y) = FX(x), FY(y|X ≤ x) = FY(y)
• Conditional density functions for independent X and Y:
fX(x|Y ≤ y) = fX(x), fY(y|X ≤ x) = fY(y)
Problems
1.1

d) Conditional pmf of X given Y and Y given X


e) Find P(X+Y>2)
Problems

1.2

c. Find the marginal pdfs of X and Y
d. Are X and Y independent?
e. Find fY|X(y|x) and fX|Y(x|y)
problems
1.3 The cumulative distribution function of continuous random variables X and Y is:

FXY(x, y) = (1/2)x²y + (1/2)xy³ for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
(with FXY(x, y) = 1 for x ≥ 1 and y ≥ 1, and FXY(x, y) = 0 for x < 0 or y < 0)

Find:
i) FXY(2, 3)
ii) P(2 < X ≤ 7, 4 < Y ≤ 6)
iii) the density function
Problems
1.4 The cumulative distribution function of two random variables X and Y is:
FXY(x, y) = u(x)u(y)[1 − e^(−ax) − e^(−ay) + e^(−a(x+y))]
where u(·) is the unit step function; assume a = 0.5. Find:
i) P(X ≤ 1, Y ≤ 2)
ii) P(0.5 < X ≤ 1.5)
iii) P(−1.5 < X ≤ 2, 1 < Y ≤ 3)
iv) the marginal distributions FX(x) and FY(y)
v) the marginal densities fX(x) and fY(y)
vi) the joint probability density function fXY(x, y)
vii) Are X and Y statistically independent?
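A numerical sketch for parts (i), (ii), and (vii) of this problem. It exploits the fact that the given cdf factors, 1 − e^(−ax) − e^(−ay) + e^(−a(x+y)) = (1 − e^(−ax))(1 − e^(−ay)), which already answers part (vii): X and Y are independent.

```python
from math import exp

a = 0.5

def F(x, y):
    """Joint cdf of Problem 1.4; the u(x)u(y) factor restricts support to x, y >= 0."""
    if x < 0 or y < 0:
        return 0.0
    return 1 - exp(-a * x) - exp(-a * y) + exp(-a * (x + y))

# Marginal cdf FX(x) = F(x, ∞) = 1 - e**(-a*x); FY has the identical form
FX = lambda x: 1 - exp(-a * x) if x >= 0 else 0.0

p_i = F(1, 2)                  # i)  P(X <= 1, Y <= 2)
p_ii = FX(1.5) - FX(0.5)       # ii) P(0.5 < X <= 1.5)
factorizes = abs(F(1, 2) - FX(1) * FX(2)) < 1e-12  # vii) independence check

print(round(p_i, 4), round(p_ii, 4), factorizes)  # 0.2487 0.3064 True
```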
Problems
1.5 Discrete random variables X and Y have the joint distribution function
FXY(x, y) = 0.1·u(x − 1)u(y − 1) + 0.35·u(x − 2)u(y) + 0.05·u(x − 3)u(y − 3) + 0.5·u(x − 4)u(y)
Find
i) P(X ≤ 2.5, Y ≤ 6.5)
ii) P(X ≤ 3)
Problems
1.6 If the joint pdf of X and Y is given by

fX,Y(x, y) = 24xy for 0 < x < 1, 0 < y < 1, x + y < 1
fX,Y(x, y) = 0 elsewhere

i. Find the CDF
ii. Find the marginal cdfs of X and Y
iii. Find the marginal pdfs of X and Y
iv. Find P(X + Y < 1/2)
Problems
1.7 Calculate the probability of the event {Y ≤ 2 | X = 1} for the function
fX,Y(x, y) = x·e^(−x(y+1)) u(x) u(y)
Problems
1.8. fXY(x, y) = k(x + y) for 0 < x < 2, 0 < y < 2
     fXY(x, y) = 0 elsewhere

i) Find k
ii) Find the marginal pdfs of X and Y
iii) Are X and Y independent?
iv) Find fY|X(y|x) and fX|Y(x|y)
v) Find P(0 < Y < 1/2 | X = 1)
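A numerical sketch for parts (i), (iii), and (v) of Problem 1.8, with the analytic values carried in the comments (k = 1/8 from normalization, marginal fX(x) = (x + 1)/4); the helper names are my own.

```python
# f(x, y) = k(x + y) on 0 < x, y < 2. Analytically:
# ∫0..2 ∫0..2 (x + y) dx dy = 8, so k = 1/8, and the marginal pdf is
# fX(x) = ∫0..2 k(x + y) dy = (x + 1)/4 (fY has the same form).

def double_integral(g, n=400):
    """Midpoint-rule integration of g over the square (0, 2) x (0, 2)."""
    h = 2.0 / n
    return sum(g((i + 0.5) * h, (j + 0.5) * h)
               for i in range(n) for j in range(n)) * h * h

k = 1 / double_integral(lambda x, y: x + y)   # i) normalization: k = 1/8
fX = lambda x: (x + 1) / 4                    # marginal pdf of X

# iii) X and Y are NOT independent: f(x, y) != fX(x)*fY(y) in general,
# e.g. at (0.5, 0.5): k(0.5 + 0.5) = 0.125 but fX(0.5)**2 = 0.140625
not_independent = abs(k * (0.5 + 0.5) - fX(0.5) ** 2) > 1e-3

# v) P(0 < Y < 1/2 | X = 1): f(y | x = 1) = k(1 + y) / fX(1) = (1 + y)/4
h = 0.5 / 1000
p_v = sum((1 + (i + 0.5) * h) / 4 for i in range(1000)) * h

print(round(k, 4), not_independent, round(p_v, 5))  # 0.125 True 0.15625
```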
