UNIT-III
STOCHASTIC PROCESS-TEMPORAL CHARACTERISTICS
INTRODUCTION
A random process is an ensemble (collection) of time-varying functions. Equivalently, it is a random variable with time added: a random variable is a real-valued function that assigns numerical values to the outcomes of a physical experiment, and when a time parameter is attached to it the result is called a random process.
Random processes are used to describe the time-varying nature of random quantities. They describe the statistical behavior of various real-time signals like speech, noise, atmospheric disturbances, etc.
Random processes are denoted by X(t, s) or X(t). If time is fixed, i.e., if a specific time instant is taken, the random process becomes a random variable.
CLASSIFICATION OF RANDOM PROCESS
Based on the nature of the sample space of the random variable X and of the time parameter t, random processes are classified as
Continuous Random Process
Discrete Random Process
Continuous Random Sequence
Discrete Random Sequence
Continuous Random Process
A random process is said to be continuous if both the random variable X and time t are continuous, i.e., each can take any value from a continuous range.
Discrete Random Process
A random process is said to be discrete if the random variable X is discrete and time t is continuous.
Continuous Random Sequence
A random sequence is said to be continuous if the random variable X is continuous and time t is discrete.
Discrete Random Sequence
A random sequence is said to be discrete if both the random variable X and time t are discrete.
DETERMINISTIC & NON-DETERMINISTIC RANDOM PROCESSES
A random process is said to be deterministic if its future values can be predicted
from observed past values.
Ex: X(t) = A cos(ω₀t + θ)
A random process is said to be non-deterministic if its future values cannot be predicted from observed past values.
Ex: Random noise signal
Distribution and Density Functions of Random Process
It is known that a random process becomes a random variable at a specific time instant. Hence all the statistical properties of random variables are applicable to random processes. Based on the number of random variables involved, various distribution and density functions are defined.
First order Distribution and Density Function
The first order distribution function of a random process is defined as
𝐹𝑋 (𝑥1; 𝑡1 ) = P{𝑋(𝑡1) ≤ 𝑥1 }
Similarly the first order density function of random process is
f_X(x₁; t₁) = dF_X(x₁; t₁) / dx₁
Second order Distribution and Density Function
For two random variables X(t₁) = X₁ and X(t₂) = X₂ taken at time instants t₁ and t₂, the second order (joint) distribution function of a random process is defined as
𝐹𝑋 (𝑥1, 𝑥2 ; 𝑡1, 𝑡2) = P{𝑋(𝑡1 ) ≤ 𝑥1 ; 𝑋(𝑡2 ) ≤ 𝑥2 }
The second order probability density function of random process is
f_X(x₁, x₂; t₁, t₂) = ∂²F_X(x₁, x₂; t₁, t₂) / (∂x₁ ∂x₂)
nth order Distribution and Density Function
In general, for n random variables the nth order joint distribution function is
𝐹𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 , 𝑡2 … … 𝑡𝑛 ) = P{𝑋 (𝑡1 ) ≤ 𝑥1 ; 𝑋(𝑡2 ) ≤ 𝑥2 … … 𝑋(𝑡𝑛 ) ≤ 𝑥𝑛 }
The nth order probability density function of random process is
f_X(x₁, x₂, …, xₙ; t₁, t₂, …, tₙ) = ∂ⁿF_X(x₁, x₂, …, xₙ; t₁, t₂, …, tₙ) / (∂x₁ ∂x₂ … ∂xₙ)
Stationary and Statistical Independence
A random process is generally characterized by two properties:
Whether or not it is stationary
Whether or not the random variables involved (and the processes themselves) are statistically independent
First order Stationary Process
A random process X(t) is said to be first order stationary if its first order density
function does not change with time.
𝑓𝑋 (𝑥1; 𝑡1 ) = 𝑓𝑋 (𝑥1 ; 𝑡1 + Δ)
where Δ is any shift (increment or decrement) of the time origin.
Whenever a random process is first order stationary then its average value or mean
is constant over time.
E[X(t₁)] = E[X(t₂)] = E[X(t₁ + Δ)] = constant
i.e., E[X(t)] = E[X(t + Δ)] = constant
Second order Stationary Process
A random process X(t) is said to be second order stationary if its second order
density function does not change with time.
𝑓𝑋 (𝑥1, 𝑥2 ; 𝑡1, 𝑡2 ) = 𝑓𝑋 (𝑥1, 𝑥2 ; 𝑡1 + Δ, 𝑡2 + Δ)
Let E[X(t₁) X(t₂)] denote the correlation between the two random variables X₁ = X(t₁) and X₂ = X(t₂). With t₁ = t and t₂ = t + τ, this correlation is written
R_XX(t, t + τ) = E[X(t) X(t + τ)]
where R_XX is the autocorrelation function of the random process X(t).
If this autocorrelation function depends only on the time difference τ, i.e., R_XX(t, t + τ) = R_XX(τ) independently of the absolute time t, such a random process is called a second order stationary process.
Wide Sense Stationary Process
A random process is said to be wide sense stationary process if
𝐸 [𝑋 (𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
R_XX(τ) = E[X(t) X(t + τ)] is a function of τ only (independent of absolute time t)
nth order Stationary Process
A random process X(t) is said to be nth order stationary if its nth order density
function does not change with time.
𝑓𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 , 𝑡2 … … 𝑡𝑛 ) = 𝑓𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 + Δ, 𝑡2 + Δ … … 𝑡𝑛 + Δ)
A random process that is stationary to all orders n is also called strict sense stationary.
TIME AVERAGES:
A random process is also characterized by time average functions in addition to statistical (ensemble) averages. A statistical average of a random process is computed across all sample functions at a given time instant, whereas a time average is computed along a single sample function. The time average operator is defined as
A[∎] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} [∎] dt
Here A is used to denote time average in a manner analogous to E for the statistical
average.
The time average of a random process 𝑥(𝑡) is given as
x̄ = A[x(t)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) dt
Similarly, the time average of x(t) x(t + τ) is called the time autocorrelation function and is given by
ℜ_xx(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) x(t + τ) dt
The time autocorrelation function measures the similarity between values of a single sample function separated by a lag τ. The time cross correlation function measures the similarity between two different random processes:
ℜ_xy(τ) = A[x(t) y(t + τ)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) y(t + τ) dt
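For illustration, here is a minimal Python sketch (assuming an arbitrary sample function x(t) = cos(πt + θ) and approximating the limit T → ∞ by a long finite record) that estimates the time average and the time autocorrelation numerically:

import numpy as np

dt = 0.01
t = np.arange(0.0, 1000.0, dt)                     # long record approximating T -> infinity
theta = np.random.default_rng(0).uniform(0, 2 * np.pi)
x = np.cos(np.pi * t + theta)                      # one sample function of the process

x_bar = x.mean()                                   # time average A[x(t)]

def time_autocorr(x, num_lags):
    # R_xx(k*dt) ~ average of x(t) x(t + k*dt) over the finite record
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(num_lags)])

Rxx = time_autocorr(x, 200)
print(x_bar)    # close to 0
print(Rxx[0])   # close to 0.5, the average power of a unit-amplitude cosine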
Ergodic Theorem
The ergodic theorem states that the time averages x̄ and ℜ_xx(τ) of a random process are equal to the corresponding statistical averages X̄ and R_XX(τ). A random process is said to be ergodic if it satisfies the ergodic theorem:
E[X(t)] = A[x(t)], i.e., X̄ = x̄
E[X(t) X(t + τ)] = A[x(t) x(t + τ)], i.e., R_XX(τ) = ℜ_xx(τ)
Mean Ergodic Random Process
A random process is said to be mean ergodic (or) ergodic in mean if the time
average of 𝑥 (𝑡) is equal to statistical average of X(t)
𝐸 [𝑋(𝑡)] = 𝐴[𝑥(𝑡)]
𝑋̅ = 𝑥̅
Auto Correlation Ergodic Random Process
A random process is said to be ergodic in auto correlation if the time auto
correlation function is equal to statistical auto correlation function.
𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)] = 𝐴[𝑥 (𝑡) 𝑥(𝑡 + 𝜏)]
𝑅𝑋𝑋 (𝜏) = ℜ𝑥𝑥 (𝜏)
Cross Correlation Ergodic Random Process
A random process is said to be ergodic in cross correlation if the time cross
correlation function is equal to statistical cross correlation function.
𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)] = 𝐴[𝑥 (𝑡) 𝑦(𝑡 + 𝜏)]
𝑅𝑋𝑌 (𝜏) = ℜ𝑥𝑦 (𝜏)
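A minimal Python sketch of this idea, assuming the random-phase cosine process X(t) = cos(ω₀t + Θ) with Θ uniform on (0, 2π) (shown later in the solved problems to be WSS): it compares the ensemble mean, taken over many realizations at one fixed time, with the time average of a single realization. For a mean-ergodic process the two estimates agree.

import numpy as np

rng = np.random.default_rng(1)
w0 = np.pi
dt = 0.01
t = np.arange(0.0, 500.0, dt)

# Ensemble (statistical) mean: many realizations, one fixed time instant t0
thetas = rng.uniform(0, 2 * np.pi, size=100000)
t0 = 3.7
ensemble_mean = np.mean(np.cos(w0 * t0 + thetas))

# Time average: a single realization observed over a long interval
theta = rng.uniform(0, 2 * np.pi)
time_mean = np.mean(np.cos(w0 * t + theta))

print(ensemble_mean, time_mean)   # both close to 0, as mean ergodicity predicts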
Auto Correlation Function
It is a measure of similarity between two random variables for a given random
process. It is defined as the expected value of 𝑥 (𝑡) 𝑥(𝑡 + 𝜏)
𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Properties of auto correlation function
1. 𝑅𝑋𝑋 (𝜏) cannot have an arbitrary shape.
2. The value of the autocorrelation function at the origin, i.e. τ = 0, equals the mean square value of the process (its average power):
R_XX(0) = E[X²(t)]
Proof:
We know that
R_XX(τ) = E[X(t) X(t + τ)]
Let τ = 0:
R_XX(0) = E[X(t) X(t)] = E[X²(t)], the mean square value (average power) of the process.
3. The maximum value of auto correlation function occurs at origin
|𝑅𝑋𝑋 (𝜏) | ≤ 𝑅𝑋𝑋 (0)
Proof: Let X(t) be a wide sense stationary process with X(t₁) = X₁ and X(t₂) = X₂, and consider the non-negative quantity
[𝑋(𝑡1) ± 𝑋(𝑡2 )]2 ≥ 0
Apply expectation on both sides then
𝐸 [𝑋 (𝑡1) ± 𝑋(𝑡2 )]2 ≥ 0
𝐸 [𝑋2 (𝑡1) + 𝑋2 (𝑡2 ) ± 2 𝑋(𝑡1) 𝑋(𝑡2)] ≥ 0
𝐸 [𝑋 2 (𝑡1 )] + 𝐸 [𝑋2 (𝑡2 )] ± 2 𝐸 [𝑋(𝑡1) 𝑋(𝑡2 )] ≥ 0
Let 𝑡1 = 𝑡 and 𝑡2 = 𝑡1 + 𝜏 = 𝑡 + 𝜏
𝐸 [𝑋2 (𝑡)] + 𝐸 [𝑋2 (𝑡 + 𝜏)] ± 2 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)] ≥ 0
Since the statistical properties do not change with time,
R_XX(0) + R_XX(0) ± 2 R_XX(τ) ≥ 0
2 R_XX(0) ± 2 R_XX(τ) ≥ 0, i.e., R_XX(0) ≥ ∓R_XX(τ)
Hence |R_XX(τ)| ≤ R_XX(0)
4. Autocorrelation function is an even function
𝑅𝑋𝑋 (−𝜏) = 𝑅𝑋𝑋 (𝜏)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Replace τ with −τ:
𝑅𝑋𝑋 (−𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 − 𝜏)]
𝑡−𝜏=𝑢
𝑡 = 𝑢+𝜏
𝑅𝑋𝑋 (−𝜏) = 𝐸 [𝑋 (𝑢 + 𝜏) 𝑋(𝑢)]
𝑅𝑋𝑋 (−𝜏) = 𝑅𝑋𝑋 (𝜏)
5. When a random process X(t) is periodic with period T then the
Autocorrelation function is also periodic.
𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝑅𝑋𝑋 (𝜏)
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏 + 𝑇)]
As X(t) is periodic 𝑋(𝑡 + 𝜏 + 𝑇) = 𝑋(𝑡 + 𝜏)
𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝑅𝑋𝑋 (𝜏)
6. If 𝐸 [𝑋 (𝑡)] = 𝑋̅ ≠ 0 and X(t) is Ergodic with no periodic components, then
the auto correlation function is given as
lim_{|τ|→∞} R_XX(τ) = X̄²
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Since the process has no periodic components, as |τ| → ∞ the random variables X(t) and X(t + τ) become independent:
lim_{|τ|→∞} E[X(t) X(t + τ)] = E[X(t)] E[X(t + τ)]
Since the given random process is ergodic (and stationary), E[X(t + τ)] = E[X(t)] = X̄, so
lim_{|τ|→∞} R_XX(τ) = X̄²
7. If X(t) is Ergodic, zero mean, and has no periodic components, then the
auto correlation function is given as
lim_{|τ|→∞} R_XX(τ) = 0
Proof: From the above property,
lim_{|τ|→∞} R_XX(τ) = X̄²
Since the process has zero mean, X̄ = 0, and therefore
lim_{|τ|→∞} R_XX(τ) = 0
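A quick numerical illustration of properties 6 and 7, using white Gaussian noise plus a constant mean m as a stand-in for an ergodic process with no periodic components (an assumed model, not from the proofs above): the estimated autocorrelation approaches m² at large lags, and would approach 0 if m were 0.

import numpy as np

rng = np.random.default_rng(2)
m, sigma = 2.0, 1.0
x = m + rng.normal(0.0, sigma, size=200000)   # ergodic, no periodic component

def Rhat(x, k):
    # time-average estimate of R_XX at a lag of k samples
    return np.mean(x[:len(x) - k] * x[k:])

print(Rhat(x, 0))      # ~ m**2 + sigma**2 = 5  (mean square value, property 2)
print(Rhat(x, 1000))   # ~ m**2 = 4             (property 6; would be ~0 if m = 0, property 7)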
8. Let W(t) = X(t) + Y(t) be the sum of two random processes. Then the autocorrelation function of the sum is
𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Given W(t) = X(t) + Y(t),
𝑅𝑊𝑊 (𝜏) = 𝐸 [𝑊 (𝑡) 𝑊(𝑡 + 𝜏)]
= E[(X(t) + Y(t)) (X(t + τ) + Y(t + τ))]
= E[X(t) X(t + τ)] + E[X(t) Y(t + τ)] + E[Y(t) X(t + τ)] + E[Y(t) Y(t + τ)]
R_WW(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)
Cross Correlation Function
It is a measure of similarity between two random processes X(t) and Y(t).
𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)]
Consider two random processes X(t) and Y(t) that are at least wide sense
stationary.
𝑅𝑋𝑌 (𝜏) = 𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)]
Properties of cross correlation function
1. The cross correlation function satisfies the symmetry relation
𝑅𝑋𝑌 (−𝜏) = 𝑅𝑌𝑋 (𝜏)
Proof:
We know that
𝑅𝑋𝑌 (𝜏) = 𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)]
Replace τ with −τ:
𝑅𝑋𝑌 (−𝜏) = 𝐸 [𝑋(𝑡) 𝑌(𝑡 − 𝜏)]
𝑡−𝜏=𝑢
𝑡 = 𝑢+𝜏
𝑅𝑋𝑌 (−𝜏) = 𝐸 [𝑋 (𝑢 + 𝜏) 𝑌(𝑢)]
𝑅𝑋𝑌 (−𝜏) = 𝐸 [𝑌 (𝑢) 𝑋(𝑢 + 𝜏) ]
𝑅𝑋𝑌 (−𝜏) = 𝑅𝑌𝑋 (𝜏)
2. The magnitude of the cross correlation function of two random processes is always less than or equal to the geometric mean of the individual autocorrelation functions at the origin:
|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
Proof:
Let X(t) and Y(t) be two random processes such that
[𝑌 (𝑡 + 𝜏) ± 𝛼 𝑋(𝑡)]2 ≥ 0
Apply expectation on both sides then
𝐸 [𝑌 (𝑡 + 𝜏) ± 𝛼 𝑋(𝑡)]2 ≥ 0
𝐸 [𝑌 2 (𝑡 + 𝜏) + 𝛼 2 𝑋2 (𝑡) ± 2 𝛼𝑋(𝑡) 𝑌 (𝑡 + 𝜏)] ≥ 0
𝐸 [𝑌 2 (𝑡 + 𝜏)] + 𝛼 2𝐸 [𝑋 2 (𝑡)] ± 2 𝛼 𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)] ≥ 0
Since the given processes are stationary,
α² E[X²(t)] + E[Y²(t)] ± 2α E[X(t) Y(t + τ)] ≥ 0
R_XX(0) α² ± 2α R_XY(τ) + R_YY(0) ≥ 0
The left-hand side is a quadratic in α of the form aα² + bα + c with a = R_XX(0) ≥ 0. Since this quadratic is non-negative for every real α, it cannot have two distinct real roots, so its discriminant b² − 4ac must satisfy
4 R²_XY(τ) − 4 R_XX(0) R_YY(0) ≤ 0
R²_XY(τ) ≤ R_XX(0) R_YY(0)
|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
3. The magnitude of the cross correlation function of two random processes is always less than or equal to the arithmetic mean of the individual autocorrelation functions at the origin:
|R_XY(τ)| ≤ [R_XX(0) + R_YY(0)] / 2
Proof:
We know that R_XY(τ) = E[X(t) Y(t + τ)] and, from property 2,
|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))
The geometric mean of two non-negative numbers is always less than or equal to their arithmetic mean:
√(R_XX(0) R_YY(0)) ≤ [R_XX(0) + R_YY(0)] / 2
Hence
|R_XY(τ)| ≤ [R_XX(0) + R_YY(0)] / 2
4. For two jointly wide sense stationary random processes X(t) and Y(t) that have non-zero means and are statistically independent,
𝑅𝑋𝑌 (𝜏) = 𝑋̅ 𝑌̅
Proof:
We know that
𝑅𝑋𝑌 (𝜏) = 𝐸 [𝑋(𝑡) 𝑌(𝑡 + 𝜏)]
Since X(t) and Y(t) are statistically independent,
R_XY(τ) = E[X(t)] E[Y(t + τ)]
Since X(t) and Y(t) are WSS, E[Y(t + τ)] = E[Y(t)], so
𝑅𝑋𝑌 (𝜏) = 𝐸 [𝑋(𝑡) ] 𝐸 [ 𝑌(𝑡)]
𝑅𝑋𝑌 (𝜏) = 𝑋̅ 𝑌̅
COVARIANCE FUNCTION
The Covariance function is a measure of interdependence between two random
variables of the random process X(t).
Auto Covariance function
C_XX(t, t + τ) = E[{X(t) − E[X(t)]} {X(t + τ) − E[X(t + τ)]}]
𝐶𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) − 𝐸 [𝑋(𝑡)] 𝐸 [ 𝑋(𝑡 + 𝜏)]
Cross-Covariance function
C_XY(t, t + τ) = E[{X(t) − E[X(t)]} {Y(t + τ) − E[Y(t + τ)]}]
𝐶𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) − 𝐸 [𝑋 (𝑡)] 𝐸 [ 𝑌(𝑡 + 𝜏)]
NOTE:
1. If X(t) is at least wide sense stationary random process then
𝐶𝑋𝑋 (𝜏) = 𝑅𝑋𝑋 (𝜏) − (𝑋̅)2
2. At 𝜏 = 0
𝐶𝑋𝑋 (0) = 𝑅𝑋𝑋 (0) − (𝑋̅)2 = 𝜎𝑋 2
3. If X(t) and Y(t) is at least jointly wide sense stationary random process then
𝐶𝑋𝑌 (𝜏) = 𝑅𝑋𝑌 (𝜏) − 𝑋̅ 𝑌̅
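A quick numerical check of notes 1 and 2, assuming an arbitrary WSS sequence with mean 1.5 and standard deviation 0.8: the estimate of C_XX(0) = R_XX(0) − X̄² matches the variance σ_X².

import numpy as np

rng = np.random.default_rng(3)
m, sigma = 1.5, 0.8
x = m + sigma * rng.normal(size=100000)   # samples of a WSS process

x_bar = x.mean()
R0 = np.mean(x * x)                       # estimate of R_XX(0)
C0 = R0 - x_bar**2                        # autocovariance at tau = 0

print(C0, sigma**2)                       # both close to 0.64 = sigma_X**2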
DESCRIPTIVE QUESTIONS
1. Explain numerous categories of random processes with examples.
2. Explain stationarity of random processes.
3. Interpret about ergodic random processes.
4. Interpret the significance of time averages and ergodicity.
5. Choose necessary expressions to verify the properties of Auto correlation
function.
6. Choose relevant expressions to verify the properties of cross correlation
function.
7. Interpret the concepts of covariance with relevance to random processes.
PROBLEMS
1. A random process is described by X(t) = A, where A is a continuous random
variable and is uniformly distributed on (0,1). Show that X(t) is wide sense
stationary.
2. Verify whether the sine wave process X(t) = B sin(ω₀t), where B is a uniform random variable on (-1,1), is wide sense stationary or not.
3. A random process is given as X(t) = A cos(ω₀t + θ), where A and ω₀ are constants and θ is a uniformly distributed random variable on the interval (0, 2π). Verify whether the given random process is wide sense stationary or not.
4. Two random process X(t) & Y(t) are defined as
X(t) = A cos(ω₀t) + B sin(ω₀t) and Y(t) = B cos(ω₀t) − A sin(ω₀t), where A, B are uncorrelated, zero mean random variables with the same variance and ω₀ is a constant. Verify whether X(t) and Y(t) are jointly wide sense stationary or not.
5. A random process is defined as X(t) = A Cos(ω0t), where ω0 is a constant
and A is a random variable uniformly distributed over (0,1). Estimate the
autocorrelation function.
6. A random process is given as X(t) = A cos(ω₀t) + B sin(ω₀t), where A & B are uncorrelated, zero mean random variables having the same variance σ²; appraise whether X(t) is wide sense stationary or not.
7. A random process Y(t) = X(t) − X(t+τ) is defined in terms of X(t), which is at least wide sense stationary.
(i) Deduce the mean value of Y(t) if E[X(t)] ≠ 0.
(ii) Justify that the variance σ_Y² = 2[R_XX(0) − R_XX(τ)].
(iii) If Y(t) = X(t) + X(t+τ), estimate E[Y(t)] and σ_Y².
8. Two statistically independent zero mean random processes X(t), Y(t) have autocorrelation functions R_XX(τ) = exp(−|τ|) and R_YY(τ) = cos(2πτ) respectively. Evaluate the
(i) Autocorrelation of the Sum W1(t) = X(t)+Y(t)
(ii) Autocorrelation of the Difference W2(t) = X(t) − Y(t)
(iii) Cross correlation of W1(t) & W2(t)
9. Given X̄ = 6 and R_XX(t, t + τ) = 36 + 25 exp(−|τ|) for a random process X(t).
Indicate which of the following statements are true and give the reason.
(i) Is first order stationary?
(ii) Has total average power of 61W
(iii) Is wide sense stationary?
(iv) Has a periodic component
(v) Has an AC power of 36W
10. Show that X(t) & Y(t) are jointly WSS, if the random processes are X(t) = A cos(ω₁t + θ), Y(t) = B cos(ω₂t + Φ), where A, B, ω₁ & ω₂ are constants, while Φ, θ are statistically independent uniform random variables on (0, 2π).
11. If X(t) = A cos(ω₀t + θ), where A, ω₀ are constants, and θ is a uniform random variable on (−π, π). A new random process is defined by Y(t) = X²(t).
(i) Obtain the Mean and Auto Correlation Function of X(t).
(ii) Obtain the Mean and Auto Correlation Function of Y(t).
(iii) Find the Cross Correlation Function of X(t) & Y(t).
(iv) Are X(t) and Y(t) WSS?
(v) Are X(t) & Y(t) jointly WSS?
1. A random process is described by X(t) = A, where A is a
continuous random variable and is uniformly distributed on
(0,1). Show that X(t) is wide sense stationary.
Sol:
A random process is said to be wide sense stationary process if
𝑚𝑒𝑎𝑛 𝑣𝑎𝑙𝑢𝑒 𝐸 [𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
correlation R_XX(t, t + τ) = E[X(t) X(t + τ)] is independent of absolute time t
E[X(t)] = ∫_{−∞}^{∞} x f_X(x) dx
Given A is uniformly distributed random variable in the interval (0, 1).
E[X(t)] = E[A] = ∫_{−∞}^{∞} A f_A(A) dA
The density function of a uniformly distributed random variable is
f_X(x) = 1/(b − a),  a ≤ x ≤ b
so here f_A(A) = 1/(1 − 0) = 1,  0 ≤ A ≤ 1
E[X(t)] = ∫₀¹ A dA = [A²/2]₀¹ = 1/2 = constant
𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
= ∫_{−∞}^{∞} x(t) x(t + τ) f_A(A) dA
= ∫₀¹ A · A dA = ∫₀¹ A² dA = [A³/3]₀¹ = 1/3, which is independent of time.
Hence the given random process is WSS.
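A quick Monte Carlo check of this result (assuming nothing beyond the problem statement: A uniform on (0, 1)): the sample estimates of E[X(t)] and R_XX come out near 1/2 and 1/3.

import numpy as np

A = np.random.default_rng(4).uniform(0.0, 1.0, size=1_000_000)
print(A.mean())         # ~ 0.5  = E[X(t)]
print(np.mean(A * A))   # ~ 1/3  = R_XX, independent of t and tau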
2. A random process is given as X(t) = A cos(ω₀t + θ), where A and ω₀ are constants and θ is a uniformly distributed random variable on the interval (0, 2π). Verify whether the given random process is wide sense stationary or not.
Sol:
A random process is said to be wide sense stationary process if
𝐸 [𝑋 (𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
R_XX(τ) = E[X(t) X(t + τ)] is independent of absolute time t
E[X(t)] = ∫_{−∞}^{∞} x(t) f_X(x) dx
Since X(t) depends on the random variable θ,
E[X(t)] = ∫_{−∞}^{∞} x(t) f_θ(θ) dθ
Given that θ is a uniformly distributed random variable on the interval (0, 2π),
f_X(x) = 1/(b − a),  a ≤ x ≤ b
f_θ(θ) = 1/(2π − 0) = 1/(2π),  0 ≤ θ ≤ 2π
E[X(t)] = (1/(2π)) ∫₀^{2π} A cos(ω₀t + θ) dθ
= (A/(2π)) [sin(ω₀t + θ)]₀^{2π}
= (A/(2π)) [sin(ω₀t + 2π) − sin(ω₀t + 0)]
= (A/(2π)) [sin(ω₀t) − sin(ω₀t)] = 0 = constant
𝑅𝑋𝑋 (𝜏) = 𝐸 [𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
= ∫_{−∞}^{∞} x(t) x(t + τ) f_θ(θ) dθ
= (1/(2π)) ∫₀^{2π} A cos(ω₀t + θ) · A cos(ω₀(t + τ) + θ) dθ
= (1/(2π)) ∫₀^{2π} A² cos(ω₀t + θ) cos(ω₀t + θ + ω₀τ) dθ
Using 2 cos A cos B = cos(A + B) + cos(A − B),
= (A²/(4π)) ∫₀^{2π} [cos(2ω₀t + 2θ + ω₀τ) + cos(ω₀τ)] dθ
= (A²/(4π)) { [sin(2ω₀t + 2θ + ω₀τ)/2]₀^{2π} + cos(ω₀τ) [θ]₀^{2π} }
= (A²/(4π)) [0 + 2π cos(ω₀τ)]
= (A²/2) cos(ω₀τ), which is independent of time.
Hence the given random process is WSS.
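A quick Monte Carlo check of this result, with the arbitrary choices A = 2 and ω₀ = 3: averaging over θ at two different absolute times t gives nearly the same value, close to (A²/2) cos(ω₀τ).

import numpy as np

rng = np.random.default_rng(5)
A, w0 = 2.0, 3.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)

def Rxx(t, tau):
    # ensemble average of X(t) X(t + tau) over the random phase theta
    return np.mean(A * np.cos(w0 * t + theta) * A * np.cos(w0 * (t + tau) + theta))

tau = 0.4
print(Rxx(1.0, tau), Rxx(7.3, tau))     # nearly equal -> independent of t
print((A**2 / 2) * np.cos(w0 * tau))    # analytical value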
3. Verify whether the sine wave process X(t) = B sin(ω₀t), where B is a uniform random variable on (-1,1), is wide sense stationary or not.
Sol:
A random process is said to be wide sense stationary if
E[x(t)] = constant
R_XX(τ) = E[x(t) x(t + τ)] is independent of absolute time t
E[x(t)] = ∫_{−∞}^{∞} x(t) f_X(x) dx
E[x(t) x(t + τ)] = ∫_{−∞}^{∞} x(t) x(t + τ) f_X(x) dx
B is a uniform random variable on (-1,1) with the density function
f_B(B) = 1/(b − a) = 1/(1 − (−1)) = 1/2,  −1 ≤ B ≤ 1
E[x(t)] = ∫_{−1}^{1} (B sin(ω₀t)) (1/2) dB
= (sin(ω₀t)/2) ∫_{−1}^{1} B dB
= (sin(ω₀t)/2) [B²/2]_{−1}^{1}
= (sin(ω₀t)/2) [1/2 − 1/2] = 0 = constant
𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸 [𝑥 (𝑡)𝑥 (𝑡 + 𝜏)]
= ∫_{−1}^{1} (B sin(ω₀t)) (B sin(ω₀(t + τ))) (1/2) dB
= (sin(ω₀t) sin(ω₀(t + τ)) / 2) ∫_{−1}^{1} B² dB
Using sin A sin B = [cos(A − B) − cos(A + B)] / 2,
= ([cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 4) [B³/3]_{−1}^{1}
= ([cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 4) (2/3)
= [cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 6
which still depends on the absolute time t.
Hence the given random process is not wide sense stationary
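A quick Monte Carlo check of this conclusion, with the arbitrary choices ω₀ = 2 and τ = 0.5: the estimated correlation E[X(t) X(t + τ)] clearly changes with the absolute time t.

import numpy as np

rng = np.random.default_rng(6)
B = rng.uniform(-1.0, 1.0, size=1_000_000)
w0, tau = 2.0, 0.5

def Rxx(t):
    # ensemble average of X(t) X(t + tau) over the random amplitude B
    return np.mean(B * np.sin(w0 * t) * B * np.sin(w0 * (t + tau)))

print(Rxx(0.3), Rxx(1.1))   # noticeably different values -> depends on t, not WSS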
4. A random process 𝑌 (𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏) is defined in terms of 𝑋(𝑡)
that is at least wide sense stationary.
(𝑖) Deduce the mean value of 𝑌 (𝑡) if 𝐸 [𝑋(𝑡)] ≠ 0
(𝑖𝑖) Justify that the variance 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) − 𝑅𝑋𝑋 (𝜏)]
(iii) If 𝑌(𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏), estimate 𝐸 [𝑌 (𝑡)] and 𝜎𝑌 2
Sol: Given,
𝑌(𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏)
Given 𝑋(𝑡) is wide sense stationary and 𝐸 [𝑋 (𝑡)] ≠ 0
(i) Mean of 𝑌(𝑡):
𝐸 [𝑌 (𝑡)] = 𝐸 [𝑋(𝑡) − 𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋 (𝑡)] − 𝐸 [𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋(𝑡)] − 𝐸 [𝑋(𝑡)][∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]
∴ 𝐸 [𝑌 (𝑡)] = 0
(ii) Variance of 𝑌(𝑡):
𝜎𝑌 2 = 𝐸 [𝑌 2 (𝑡)] − (𝐸 [𝑌 (𝑡)])2
= E[(X(t) − X(t + τ))²] − 0   [∵ E[Y(t)] = 0]
= 𝐸 [𝑋 2 (𝑡) + 𝑋2 (𝑡 + 𝜏) − 2𝑋 (𝑡)𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋2 (𝑡)] + 𝐸 [𝑋2 (𝑡 + 𝜏)] − 2𝐸 [𝑋(𝑡)𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋2 (𝑡)] + 𝐸 [𝑋 2 (𝑡)] − 2𝑅𝑋𝑋 (𝜏)[∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]
= R_XX(0) + R_XX(0) − 2R_XX(τ)   [∵ R_XX(0) = E[X²(t)]]
= 2𝑅𝑋𝑋 (0) − 2𝑅𝑋𝑋 (𝜏)
∴ 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) − 𝑅𝑋𝑋 (𝜏)]
(iii) Given,
𝑌(𝑡) = 𝑋(𝑡) + 𝑋(𝑡 + 𝜏)
Now,
𝐸 [𝑌 (𝑡)] = 𝐸 [𝑋(𝑡) + 𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋 (𝑡)] + 𝐸 [𝑋(𝑡 + 𝜏)]
= 𝐸 [𝑋(𝑡)] + 𝐸 [𝑋(𝑡)][∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]
= 2 𝐸 [𝑋 (𝑡)]
∴ 𝐸 [𝑌 (𝑡)] = 2 𝐸 [𝑋 (𝑡)]
Now,
𝜎𝑌 2 = 𝐸 [𝑌 2 (𝑡)] − (𝐸 [𝑌 (𝑡)])2
= E[(X(t) + X(t + τ))²] − (2 E[X(t)])²   [∵ E[Y(t)] = 2 E[X(t)]]
= 𝐸 [𝑋 2 (𝑡) + 𝑋2 (𝑡 + 𝜏) + 2𝑋 (𝑡)𝑋(𝑡 + 𝜏)] − 4(𝐸 [𝑋 (𝑡)])2
= 𝐸 [𝑋2 (𝑡)] + 𝐸 [𝑋2 (𝑡 + 𝜏)] + 2𝐸 [𝑋(𝑡)𝑋(𝑡 + 𝜏)] − 4(𝐸 [𝑋(𝑡)])2
= 𝐸 [𝑋2 (𝑡)] + 𝐸 [𝑋 2 (𝑡)] + 2𝑅𝑋𝑋 (𝜏) − 4(𝐸 [𝑋 (𝑡)])2 [∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]
= R_XX(0) + R_XX(0) + 2R_XX(τ) − 4(E[X(t)])²   [∵ R_XX(0) = E[X²(t)]]
= 2𝑅𝑋𝑋 (0) + 2𝑅𝑋𝑋 (𝜏) − 4𝑋̅ 2
= 2[𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (𝜏) − 2𝑋̅ 2 ]
∴ 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (𝜏) − 2𝑋̅ 2 ]
5. A random process is given as X(t) = A Cos(ω₀t) + B Sin(ω₀t), where A & B are uncorrelated, zero mean random variables having the same variance σ²; appraise whether X(t) is wide sense stationary or not.
Sol:
𝐺𝑖𝑣𝑒𝑛,
𝑋(𝑡) = 𝐴 cos(𝜔0 𝑡) + 𝐵 sin(𝜔0 𝑡)
𝐼𝑓 𝐴 & 𝐵 𝑎𝑟𝑒 𝑢𝑛𝑐𝑜𝑟𝑟𝑒𝑙𝑎𝑡𝑒𝑑
𝐸 (𝐴𝐵 ) = 𝐸 (𝐴) 𝐸 (𝐵 )
𝐵𝑢𝑡,
𝐸 (𝐴) = 𝐸 (𝐵 ) = 0
𝐸 (𝐴𝐵 ) = 0
𝑁𝑜𝑤,
𝜎𝐴2 = 𝐸 [𝐴2] − (𝐸 [𝐴])2
= 𝐸 [ 𝐴2 ]
𝜎𝐵2 = 𝐸 [𝐵 2 ] − (𝐸 [𝐵 ])2
= 𝐸 [𝐵 2 ]
𝐸[𝐴2] = 𝜎 2 𝑎𝑛𝑑 𝐸 [𝐵 2 ] = 𝜎 2
𝐴 𝑟𝑎𝑛𝑑𝑜𝑚 𝑝𝑟𝑜𝑐𝑒𝑠𝑠 𝑖𝑠 𝑠𝑎𝑖𝑑 𝑡𝑜 𝑏𝑒 𝑤𝑖𝑑𝑒 𝑠𝑒𝑛𝑠𝑒 𝑠𝑡𝑎𝑡𝑖𝑜𝑛𝑎𝑟𝑦 𝑝𝑟𝑜𝑐𝑒𝑠𝑠 𝑖𝑓
𝐸[𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
R_XX(t, t + τ) = E[X(t) X(t + τ)] is independent of time
𝑁𝑜𝑤,
𝐸[𝑋(𝑡)] = 𝐸[𝐴 cos(𝜔0 𝑡) + 𝐵 𝑆𝑖𝑛(𝜔0 𝑡)]
= 𝐸[𝐴] 𝐶𝑜𝑠(𝜔0 𝑡) + 𝐸[𝐵] 𝑆𝑖𝑛(𝜔0 𝑡)
= 0 (𝑆𝑖𝑛𝑐𝑒 𝐸 [𝐴] = 𝐸 [𝐵 ] = 0)
𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸 [𝑋(𝑡)𝑋(𝑡 + 𝜏)]
= 𝑬[ (𝑨 𝐜𝐨𝐬(𝝎𝟎 𝒕) + 𝑩 𝐬𝐢𝐧(𝝎𝟎 𝒕)) ( 𝑨 𝐜𝐨𝐬(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑩 𝐬𝐢𝐧(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉))]
= E[A² cos(ω₀t) cos(ω₀t + ω₀τ) + AB cos(ω₀t) sin(ω₀t + ω₀τ) + AB sin(ω₀t) cos(ω₀t + ω₀τ) + B² sin(ω₀t) sin(ω₀t + ω₀τ)]
= 𝐸 [𝐴2] cos(𝜔0 𝑡) cos(𝜔0𝑡 + 𝜔0 𝜏) + 𝐸 [𝐴𝐵 ] cos(𝜔0𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)
+ 𝐸 [𝐴𝐵 ] sin(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐸 [𝐵 2] sin(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)
= 𝜎 2 cos(𝜔0 𝑡) cos(𝜔0𝑡 + 𝜔0 𝜏) + 0 + 0 + 𝜎 2 sin(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)
(𝑺𝒊𝒏𝒄𝒆 𝑬[𝑨𝟐 ] = 𝑬[𝑩𝟐 ] = 𝝈𝟐 𝑬[𝑨𝑩] = 𝟎)
= 𝜎 2 [ cos(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + sin(𝜔0𝑡) sin(𝜔0 𝑡 + 𝜔0𝜏)]
[𝑆𝑖𝑛𝑐𝑒 𝐶𝑜𝑠𝐴 𝐶𝑜𝑠𝐵 + 𝑆𝑖𝑛𝐴 𝑆𝑖𝑛𝐵 = 𝐶𝑜𝑠(𝐴 − 𝐵)]
= 𝜎 2 cos[𝜔0 𝑡 − (𝜔0 𝑡 + 𝜔0 𝜏)]
= 𝜎 2 cos(𝜔0 𝜏)
= 𝑅𝑋𝑋 (𝜏)
𝐼𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑓 𝑡𝑖𝑚𝑒
Hence the given random process is wide sense stationary.
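A quick Monte Carlo check of this result, assuming Gaussian A and B with σ = 1.5 and ω₀ = 2 (arbitrary choices satisfying the stated conditions): the estimate matches σ² cos(ω₀τ) for any choice of t.

import numpy as np

rng = np.random.default_rng(7)
sigma, w0, N = 1.5, 2.0, 1_000_000
A = rng.normal(0.0, sigma, size=N)    # zero mean, variance sigma**2
B = rng.normal(0.0, sigma, size=N)    # independent of A, hence uncorrelated

def X(t):
    return A * np.cos(w0 * t) + B * np.sin(w0 * t)

t, tau = 0.9, 0.6
print(np.mean(X(t) * X(t + tau)))     # ~ sigma**2 * cos(w0 * tau), same for any t
print(sigma**2 * np.cos(w0 * tau))    # analytical value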