

UNIT-III
STOCHASTIC PROCESS-TEMPORAL CHARACTERISTICS

INTRODUCTION

A random process is an ensemble (collection) of time-varying functions. Equivalently, it is a random variable to which time has been added as a parameter.

A random variable is a real-valued function that assigns numerical values to the outcomes of a physical experiment. If time is added as a parameter to a random variable, it is called a random process.

Random processes describe the time-varying nature of random variables. They model the statistical behaviour of real-world signals such as speech, noise, atmospheric data, etc.

Random processes are denoted by X(t, s) or X(t). If time is fixed, i.e., a specific time instant is taken, the random process becomes a random variable.


CLASSIFICATION OF RANDOM PROCESS


Based on the nature of the sample space of the random variable X and of the time parameter t, random processes are classified as

 Continuous Random Process
 Discrete Random Process
 Continuous Random Sequence
 Discrete Random Sequence

Continuous Random Process

A random process is said to be continuous if both the random variable X and the time t are continuous, i.e., each can take any value in a continuous range.


Discrete Random Process

A random process is said to be discrete, if random variable X is discrete and time t


is continuous.

Continuous Random Sequence

A random sequence is said to be continuous, if random variable X is continuous


and time t is discrete.

Discrete Random Sequence

A random sequence is said to be discrete if both the random variable X and the time t are discrete.

DETERMINISTIC RANDOM PROCESSES & NON DETERMINISTIC PROCESSES

A random process is said to be deterministic if its future values can be predicted


from observed past values.

Ex: X(t) = A Cos(ω₀t + θ)

A random process is said to be non-deterministic if its future values cannot be predicted from observed past values.

Ex: Random noise signal



Distribution and Density Functions of Random Process

It is known that a random process becomes a random variable at specific time


instant. Hence all the statistical properties of random variables are applicable to
random processes. Based on the number of random variables various distribution
and density functions are defined.

First order Distribution and Density Function

The first order distribution function of a random process is defined as

𝐹𝑋 (𝑥1 ; 𝑡1 ) = P{𝑋(𝑡1 ) ≤ 𝑥1 }
Similarly the first order density function of random process is

f_X(x₁; t₁) = dF_X(x₁; t₁) / dx₁
Second order Distribution and Density Function

For two random variables at time instant 𝑡1 𝑎𝑛𝑑 𝑡2 𝑋(𝑡1 ) = 𝑋1 𝑎𝑛𝑑 𝑋(𝑡2 ) = 𝑋2 ,
the second order distribution (joint distribution) function of a random process is
defined as

𝐹𝑋 (𝑥1, 𝑥2 ; 𝑡1 , 𝑡2 ) = P{𝑋(𝑡1 ) ≤ 𝑥1 ; 𝑋(𝑡2 ) ≤ 𝑥2 }


The second order probability density function of random process is

f_X(x₁, x₂; t₁, t₂) = ∂²F_X(x₁, x₂; t₁, t₂) / ∂x₁ ∂x₂

nth order Distribution and Density Function

In general, for n random variables the nth order joint distribution function is

𝐹𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 , 𝑡2 … … 𝑡𝑛 ) = P{𝑋(𝑡1 ) ≤ 𝑥1 ; 𝑋(𝑡2 ) ≤ 𝑥2 … … 𝑋(𝑡𝑛 ) ≤ 𝑥𝑛 }


The nth order probability density function of random process is

f_X(x₁, x₂ … x_n; t₁, t₂ … t_n) = ∂ⁿF_X(x₁, x₂ … x_n; t₁, t₂ … t_n) / ∂x₁ ∂x₂ … ∂x_n


Stationary and Statistical Independence

A random process is generally characterized by two properties:

 Whether it is stationary or not
 Whether the random variables involved (or the random processes themselves) are statistically independent or not
First order Stationary Process

A random process X(t) is said to be first order stationary if its first order density
function does not change with time.

𝑓𝑋 (𝑥1 ; 𝑡1 ) = 𝑓𝑋 (𝑥1 ; 𝑡1 + Δ)

where Δ is any small shift (increment or decrement) of time.

Whenever a random process is first order stationary then its average value or mean
is constant over time.

𝐸[𝑋(𝑡1 )] = 𝐸[𝑋(𝑡2 )]

= 𝐸[𝑋(𝑡1 + Δ)]

= 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

𝐸[𝑋(𝑡)] = 𝐸[𝑋(𝑡 + Δ)]

= 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
Second order Stationary Process

A random process X(t) is said to be second order stationary if its second order
density function does not change with time.

𝑓𝑋 (𝑥1, 𝑥2 ; 𝑡1 , 𝑡2 ) = 𝑓𝑋 (𝑥1, 𝑥2 ; 𝑡1 + Δ, 𝑡2 + Δ)

Let E[X(t₁) X(t₂)] denote the correlation between the two random variables X₁ = X(t₁) and X₂ = X(t₂) taken at time instants t₁ and t₂ = t₁ + τ. For a second order stationary process this correlation depends only on the time difference τ:

R_XX(t, t + τ) = E[X(t) X(t + τ)] = R_XX(τ)


where R_XX(τ) is the auto correlation function of the random process X(t).

If this auto correlation function depends only on τ, i.e., is independent of the absolute time t, the random process is called a second order stationary process.

Wide Sense Stationary Process

A random process is said to be wide sense stationary process if

𝐸[𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

R_XX(τ) = E[X(t) X(t + τ)] depends only on τ (independent of absolute time t)
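These two conditions can be illustrated numerically. The sketch below is not part of the original notes; the amplitude, frequency and lag are assumed values chosen only for illustration. It draws a large ensemble of sample functions of X(t) = A cos(ω₀t + Θ), with Θ uniform on (0, 2π), and checks that the ensemble mean stays constant and the correlation depends only on τ, whatever absolute time t is used.

```python
import numpy as np

rng = np.random.default_rng(0)
A, w0 = 2.0, 2 * np.pi                       # assumed amplitude and angular frequency
theta = rng.uniform(0, 2 * np.pi, 200_000)   # one Theta per sample function

def X(t):
    # ensemble of sample-function values at time t
    return A * np.cos(w0 * t + theta)

tau = 0.3
for t in (0.0, 0.7, 1.9):                    # several absolute time instants
    mean_t = X(t).mean()                     # statistical (ensemble) mean at time t
    corr_t = (X(t) * X(t + tau)).mean()      # ensemble correlation at lag tau
    print(f"t={t:3.1f}  E[X(t)] ~ {mean_t:+.3f}   E[X(t)X(t+tau)] ~ {corr_t:+.3f}")

# theory: mean = 0 (constant) and correlation = (A^2/2)*cos(w0*tau) for every t
print("theory:", round(A**2 / 2 * np.cos(w0 * tau), 3))
```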

nth order Stationary Process

A random process X(t) is said to be nth order stationary if its nth order density
function does not change with time.

𝑓𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 , 𝑡2 … … 𝑡𝑛 ) = 𝑓𝑋 (𝑥1, 𝑥2 … … 𝑥𝑛 ; 𝑡1 + Δ, 𝑡2 + Δ … … 𝑡𝑛 + Δ)

A random process that is stationary to order n for every n is also called a strict sense stationary process.

TIME AVERAGES:

A random process is also characterized by time-average functions, in addition to statistical (ensemble) averages. A statistical average of a random process is calculated by considering all sample functions at a given time instant.

Time averages, on the other hand, are calculated along a single sample function. The time average operator is defined as

A[∎] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [∎] dt

Here A is used to denote time average in a manner analogous to E for the statistical
average.

The time average of a random process 𝑥(𝑡) is given as


x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt


Similarly, the time average of x(t) x(t + τ) is called the time auto correlation function and is given by

ℜ_xx(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t + τ) dt

The time auto correlation function is used to calculate the similarity between two
random variables within a single random process.

The time cross correlation function measures the similarity between two different
random processes.

ℜ_xy(τ) = A[x(t) y(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) y(t + τ) dt
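As a rough numerical illustration (an assumed sketch, not from the notes), the time average and time autocorrelation of one sample function x(t) = cos(ω₀t + θ₀) can be approximated on a long but finite record [−T, T]:

```python
import numpy as np

w0, theta0, T, dt = 2 * np.pi, 0.4, 500.0, 0.001   # assumed illustrative values
t = np.arange(-T, T, dt)
x = np.cos(w0 * t + theta0)                 # one fixed sample function x(t)

x_bar = x.mean()                            # approximates (1/2T) * integral of x(t) dt

tau = 0.1
shift = int(round(tau / dt))
# approximates (1/2T) * integral of x(t) x(t+tau) dt (record edge truncated)
R_time = np.mean(x[:-shift] * x[shift:])

print("time average       ~", round(x_bar, 4))              # theory: 0
print("time autocorr(tau) ~", round(R_time, 4))             # theory: cos(w0*tau)/2
print("cos(w0*tau)/2      =", round(np.cos(w0 * tau) / 2, 4))
```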

Ergodic Theorem
This theorem states that the time averages x̄ and ℜ_xx(τ) of a random process are equal to the corresponding statistical averages X̄ and R_XX(τ).

A random process is said to be ergodic if it satisfies the ergodic theorem.

𝐸[𝑋(𝑡)] = 𝐴[𝑥(𝑡)]

𝑋̅ = 𝑥̅

E[X(t) X(t + τ)] = A[x(t) x(t + τ)]

𝑅𝑋𝑋 (𝜏) = ℜ𝑥𝑥 (𝜏)

Mean Ergodic Random Process

A random process is said to be mean ergodic (or) ergodic in mean if the time
average of 𝑥(𝑡) is equal to statistical average of X(t)

𝐸[𝑋(𝑡)] = 𝐴[𝑥(𝑡)]

𝑋̅ = 𝑥̅


Auto Correlation Ergodic Random Process

A random process is said to be ergodic in auto correlation if the time auto


correlation function is equal to statistical auto correlation function.

𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)] = 𝐴[𝑥(𝑡) 𝑥(𝑡 + 𝜏)]

𝑅𝑋𝑋 (𝜏) = ℜ𝑥𝑥 (𝜏)

Cross Correlation Ergodic Random Process

A random process is said to be ergodic in cross correlation if the time cross


correlation function is equal to statistical cross correlation function.

𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)] = 𝐴[𝑥(𝑡) 𝑦(𝑡 + 𝜏)]

𝑅𝑋𝑌 (𝜏) = ℜ𝑥𝑦 (𝜏)

Auto Correlation Function

It is a measure of the similarity between two random variables of a single random process X(t), taken at time instants t and t + τ. It is defined as the expected value of the product X(t) X(t + τ):

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]

Properties of auto correlation function

1. 𝑅𝑋𝑋 (𝜏) cannot have an arbitrary shape.


2. The value of auto correlation function at origin 𝑖. 𝑒; 𝜏 = 0 is equal to mean
square value of the process (Average Power)
R_XX(0) = E[X²(t)]

Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
𝑙𝑒𝑡 𝜏 = 0
𝑅𝑋𝑋 (0) = 𝐸[𝑋(𝑡) 𝑋(𝑡)]
= E[X²(t)]

∴ R_XX(0) = E[X²(t)], the mean square value of the process


3. The maximum value of auto correlation function occurs at origin

|𝑅𝑋𝑋 (𝜏) | ≤ 𝑅𝑋𝑋 (0)

Proof: Let X(t) be a wide sense stationary process with X(t₁) = X₁ and X(t₂) = X₂, and consider the non-negative quantity

[𝑋(𝑡1 ) ± 𝑋(𝑡2 )]2 ≥ 0

Apply expectation on both sides then

𝐸[𝑋(𝑡1 ) ± 𝑋(𝑡2 )]2 ≥ 0

𝐸[𝑋 2 (𝑡1 ) + 𝑋 2 (𝑡2 ) ± 2 𝑋(𝑡1 ) 𝑋(𝑡2 )] ≥ 0

𝐸[𝑋 2 (𝑡1 )] + 𝐸[𝑋 2 (𝑡2 )] ± 2 𝐸[𝑋(𝑡1 ) 𝑋(𝑡2 )] ≥ 0

Let 𝑡1 = 𝑡 and 𝑡2 = 𝑡1 + 𝜏 = 𝑡 + 𝜏

𝐸[𝑋 2 (𝑡)] + 𝐸[𝑋 2 (𝑡 + 𝜏)] ± 2 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)] ≥ 0

As the statistical properties do not change with time,

𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (0) ± 2 𝑅𝑋𝑋 (𝜏) ≥ 0

2 𝑅𝑋𝑋 (0) ± 2 𝑅𝑋𝑋 (𝜏) ≥ 0

⟹ |R_XX(τ)| ≤ R_XX(0)

4. Autocorrelation function is an even function


𝑅𝑋𝑋 (−𝜏) = 𝑅𝑋𝑋 (𝜏)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Let 𝜏 = −𝜏
𝑅𝑋𝑋 (−𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 − 𝜏)]
𝑡−𝜏 =𝑢
𝑡 =𝑢+𝜏
𝑅𝑋𝑋 (−𝜏) = 𝐸[𝑋(𝑢 + 𝜏) 𝑋(𝑢)]
𝑅𝑋𝑋 (−𝜏) = 𝑅𝑋𝑋 (𝜏)


5. When a random process X(t) is periodic with period T then the


Autocorrelation function is also periodic.

𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝑅𝑋𝑋 (𝜏)

We know that

𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]

𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏 + 𝑇)]

As X(t) is periodic 𝑋(𝑡 + 𝜏 + 𝑇) = 𝑋(𝑡 + 𝜏)

𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]

𝑅𝑋𝑋 (𝜏 + 𝑇) = 𝑅𝑋𝑋 (𝜏)

6. If 𝐸[𝑋(𝑡)] = 𝑋̅ ≠ 0 and X(t) is Ergodic with no periodic components, then


the auto correlation function is given as

lim_{|τ|→∞} R_XX(τ) = X̄²

Proof:

We know that

𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]

Since the process has no periodic components, as |τ| → ∞ the random variables X(t) and X(t + τ) become independent, so

lim_{|τ|→∞} E[X(t) X(t + τ)] = E[X(t)] E[X(t + τ)]

Because the process is stationary (its mean does not change with time), E[X(t + τ)] = E[X(t)] = X̄, and therefore

lim_{|τ|→∞} R_XX(τ) = X̄²


7. If X(t) is ergodic, has zero mean, and has no periodic components, then

lim_{|τ|→∞} R_XX(τ) = 0

Proof: From the above property, lim_{|τ|→∞} R_XX(τ) = X̄². Since the process has zero mean, X̄ = 0, hence

lim_{|τ|→∞} R_XX(τ) = 0

8. Let there be a random process w(t) such that 𝑤(𝑡) = 𝑋(𝑡) + 𝑌(𝑡). Then the
auto correlation function of sum of random process is equal to
𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Given 𝑤(𝑡) = 𝑋(𝑡) + 𝑌(𝑡)
𝑅𝑊𝑊 (𝜏) = 𝐸[𝑊(𝑡) 𝑊(𝑡 + 𝜏)]
= 𝐸[(𝑋(𝑡) + 𝑌(𝑡)) (𝑋(𝑡 + 𝜏) + 𝑌(𝑡 + 𝜏))]
= 𝐸[(𝑋(𝑡)) (𝑋(𝑡 + 𝜏))] + 𝐸[(𝑌(𝑡)) (𝑌(𝑡 + 𝜏))] + 𝐸[(𝑋(𝑡)) (𝑌(𝑡 + 𝜏))] + 𝐸[(𝑌(𝑡)) (𝑋(𝑡 + 𝜏))]

𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
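A quick numerical sanity check of this property (an assumed sketch using discrete-time sequences and a discrete lag) is shown below; the identity holds term by term for the sample estimates as well.

```python
import numpy as np

rng = np.random.default_rng(1)
n, lag = 200_000, 3                          # sample size and a discrete lag
x = rng.normal(size=n + lag)                 # one realisation of X
y = rng.normal(size=n + lag)                 # one realisation of Y
w = x + y                                    # W(t) = X(t) + Y(t)

def corr(a, b):
    # sample estimate of E[a(t) b(t + lag)]
    return np.mean(a[:n] * b[lag:lag + n])

lhs = corr(w, w)
rhs = corr(x, x) + corr(x, y) + corr(y, x) + corr(y, y)
print(round(lhs, 4), "=", round(rhs, 4))     # identical, as the property predicts
```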

Cross Correlation Function

It is a measure of similarity between two random processes X(t) and Y(t).

𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)]

Consider two random processes X(t) and Y(t) that are at least wide sense
stationary.

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)]



Properties of cross correlation function

1. The cross correlation function satisfies the symmetry relation

R_XY(−τ) = R_YX(τ)

Proof:

We know that

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)]

Let 𝜏 = −𝜏

𝑅𝑋𝑌 (−𝜏) = 𝐸[𝑋(𝑡) 𝑌(𝑡 − 𝜏)]

𝑡−𝜏 =𝑢

𝑡 =𝑢+𝜏

𝑅𝑋𝑌 (−𝜏) = 𝐸[𝑋(𝑢 + 𝜏) 𝑌(𝑢)]

𝑅𝑋𝑌 (−𝜏) = 𝐸[𝑌(𝑢)𝑋(𝑢 + 𝜏) ]

𝑅𝑋𝑌 (−𝜏) = 𝑅𝑌𝑋 (𝜏)

2. The magnitude of the cross correlation function is always less than or equal to the geometric mean of the individual auto correlation functions at the origin:

|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))

Proof:

Let X(t) and Y(t) be two random processes such that

[𝑌(𝑡 + 𝜏) ± 𝛼 𝑋(𝑡)]2 ≥ 0

Apply expectation on both sides then

𝐸[𝑌(𝑡 + 𝜏) ± 𝛼 𝑋(𝑡)]2 ≥ 0

𝐸[𝑌 2 (𝑡 + 𝜏) + 𝛼 2 𝑋 2 (𝑡) ± 2 𝛼𝑋(𝑡) 𝑌(𝑡 + 𝜏)] ≥ 0

𝐸[𝑌 2 (𝑡 + 𝜏)] + 𝛼 2 𝐸[𝑋 2 (𝑡)] ± 2 𝛼 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)] ≥ 0


Given processes are stationary, hence

𝛼 2 𝐸[𝑋 2 (𝑡)] + 𝐸[𝑌 2 (𝑡)] ± 2𝛼 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)] ≥ 0

𝑅𝑋𝑋 (0) 𝛼 2 ± 2𝛼 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑌 (0) ≥ 0

The left-hand side is a quadratic in α of the form aα² + bα + c with a = R_XX(0) ≥ 0. For this quadratic to be non-negative for every real α, it cannot have two distinct real roots, so its discriminant b² − 4ac must satisfy

4 R_XY²(τ) − 4 R_XX(0) R_YY(0) ≤ 0

R_XY²(τ) ≤ R_XX(0) R_YY(0)

|R_XY(τ)| ≤ √(R_XX(0) R_YY(0))

3. The cross correlation function is always less than or equal to the arithmetic mean of the individual auto correlation functions at the origin:

R_XY(τ) ≤ [R_XX(0) + R_YY(0)] / 2
Proof:
From the previous property, R_XY(τ) ≤ √(R_XX(0) R_YY(0)).

The geometric mean of two non-negative numbers is always less than or equal to their arithmetic mean:

√(R_XX(0) R_YY(0)) ≤ [R_XX(0) + R_YY(0)] / 2

Hence R_XY(τ) ≤ [R_XX(0) + R_YY(0)] / 2

4. For two random processes X(t) and Y(t) that have non-zero means and are statistically independent,

R_XY(τ) = X̄ Ȳ

Proof:

We know that

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡) 𝑌(𝑡 + 𝜏)]

As X(t) and Y(t) have non-zero means and are statistically independent,

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡) ] 𝐸[ 𝑌(𝑡 + 𝜏)]

𝑎𝑠 X(t) and Y(t) 𝑎𝑟𝑒 𝑊𝑆𝑆 𝑡ℎ𝑒𝑛 𝐸[ 𝑌(𝑡 + 𝜏)] = 𝐸[ 𝑌(𝑡)]

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡) ] 𝐸[ 𝑌(𝑡)]

𝑅𝑋𝑌 (𝜏) = 𝑋̅ 𝑌̅
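A short numerical illustration of this property (assumed sketch; the means 3 and −2 and the lag are arbitrary choices) is given below: for independent sequences the cross-correlation estimate settles near the product of the means.

```python
import numpy as np

rng = np.random.default_rng(2)
n, lag = 200_000, 5
x = 3.0 + rng.normal(size=n + lag)           # X(t): mean 3, independent samples
y = -2.0 + rng.normal(size=n + lag)          # Y(t): mean -2, independent of X(t)

Rxy = np.mean(x[:n] * y[lag:lag + n])        # estimate of E[X(t) Y(t + tau)]
print(round(Rxy, 3), "~", 3.0 * (-2.0))      # should be close to X_bar * Y_bar = -6
```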

COVARIANCE FUNCTION

The Covariance function is a measure of interdependence between two random


variables of the random process X(t).
Auto Covariance function
C_XX(t, t + τ) = E[{X(t) − E[X(t)]} {X(t + τ) − E[X(t + τ)]}]
C_XX(t, t + τ) = R_XX(t, t + τ) − E[X(t)] E[X(t + τ)]
Cross-Covariance function
C_XY(t, t + τ) = E[{X(t) − E[X(t)]} {Y(t + τ) − E[Y(t + τ)]}]
C_XY(t, t + τ) = R_XY(t, t + τ) − E[X(t)] E[Y(t + τ)]
NOTE:
1. If X(t) is at least wide sense stationary random process then
𝐶𝑋𝑋 (𝜏) = 𝑅𝑋𝑋 (𝜏) − (𝑋̅)2
2. At 𝜏 = 0
𝐶𝑋𝑋 (0) = 𝑅𝑋𝑋 (0) − (𝑋̅)2 = 𝜎𝑋 2
3. If X(t) and Y(t) is at least jointly wide sense stationary random process then
𝐶𝑋𝑌 (𝜏) = 𝑅𝑋𝑌 (𝜏) − 𝑋̅ 𝑌̅
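The note C_XX(0) = R_XX(0) − (X̄)² = σ_X² can be checked numerically with the small sketch below (assumed example with an i.i.d. sequence of mean 5 and variance 4):

```python
import numpy as np

rng = np.random.default_rng(3)
x = 5.0 + 2.0 * rng.normal(size=100_000)     # i.i.d. samples: mean 5, variance 4

R0 = np.mean(x * x)                          # R_XX(0) = E[X^2]
C0 = R0 - x.mean() ** 2                      # C_XX(0) = R_XX(0) - (X_bar)^2
print(round(C0, 3), "~", round(x.var(), 3))  # both should be close to 4
```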


DESCRIPTIVE QUESTIONS
1. Explain numerous categories of random processes with examples.
2. Explain stationarity of random processes.
3. Interpret about ergodic random processes.
4. Interpret the significance of time averages and ergodicity.
5. Choose necessary expressions to verify the properties of Auto correlation
function.
6. Choose relevant expressions to verify the properties of cross correlation
function.
7. Interpret the concepts of covariance with relevance to random processes.
PROBLEMS
1. A random process is described by X(t) = A, where A is a continuous random
variable and is uniformly distributed on (0,1). Show that X(t) is wide sense
stationary.

2. Verify whether the sine wave process X(t) = B Sin(ω₀t), where B is a uniform random variable on (−1, 1), is wide sense stationary or not.

3. A random process is given as X(t) = A Cos(ω₀t + θ), where A and ω₀ are constants and θ is a uniformly distributed random variable on the interval (0, 2π). Verify whether the given random process is wide sense stationary or not.
4. Two random processes X(t) and Y(t) are defined as X(t) = A Cos(ω₀t) + B Sin(ω₀t) and Y(t) = B Cos(ω₀t) − A Sin(ω₀t), where A, B are uncorrelated, zero mean random variables with the same variance and ω₀ is a constant. Verify whether X(t), Y(t) are jointly wide sense stationary or not.

5. A random process is defined as X(t) = A Cos(ω0t), where ω0 is a constant


and A is a random variable uniformly distributed over (0,1).Estimate the
autocorrelation function.

6. A random process is given as X(t) = A Cos(ω₀t) + B Sin(ω₀t), where A and B are uncorrelated, zero mean random variables having the same variance σ²; appraise whether X(t) is wide sense stationary or not.


7. A random process Y(t) = X(t) − X(t+τ) is defined in terms of X(t), which is at least wide sense stationary.
(i) Deduce the mean value of Y(t) if E[X(t)] ≠ 0.
(ii) Justify that the variance σ_Y² = 2[R_XX(0) − R_XX(τ)].
(iii) If Y(t) = X(t) + X(t+τ), estimate E[Y(t)] and σ_Y².
8. Two statistically independent zero mean random processes X(t), Y(t) have auto correlation functions R_XX(τ) = exp(−|τ|), R_YY(τ) = cos(2πτ) respectively.
Evaluate the
(i) Autocorrelation of the Sum W1(t) = X(t)+Y(t)
(ii) Autocorrelation of the Difference W2(t) = X(t)-Y(t)
(iii) Cross correlation of W1(t) & W2(t)

9. Given X̄ = 6 and R_XX(t, t + τ) = 36 + 25 exp(−τ) for a random process X(t).


Indicate which of the following statements are true and give the reason.
(i) Is first order stationary?
(ii) Has total average power of 61W
(iii) Is wide sense stationary?
(iv) Has a periodic component
(v) Has an AC power of 36W

10. Show that X(t) and Y(t) are jointly WSS, if the random processes are X(t) = A Cos(ω₁t + θ), Y(t) = B Cos(ω₂t + Φ), where A, B, ω₁ and ω₂ are constants, while Φ, θ are statistically independent uniform random variables on (0, 2π).

11. If X(t) = A Cos(ω₀t + θ), where A, ω₀ are constants, and θ is a uniform random variable on (−π, π), a new random process is defined by Y(t) = X²(t).
(i) Obtain the mean and auto correlation function of X(t).
(ii) Obtain the mean and auto correlation function of Y(t).
(iii) Find the cross correlation function of X(t) and Y(t).
(iv) Are X(t) and Y(t) WSS?
(v) Are X(t) and Y(t) jointly WSS?


1. A random process is described by X(t) = A, where A is a


continuous random variable and is uniformly distributed on
(0,1). Show that X(t) is wide sense stationary.

Sol:

A random process is said to be wide sense stationary process if

𝑚𝑒𝑎𝑛 𝑣𝑎𝑙𝑢𝑒 𝐸[𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

𝑐𝑜𝑟𝑟𝑒𝑙𝑎𝑡𝑖𝑜𝑛 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)] = 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒



𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡). 𝑓𝑋 (𝑥) 𝑑𝑥
−∞
Given A is uniformly distributed random variable in the interval (0, 1).

𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝐴 (𝐴) 𝑑𝐴
−∞
Density function of uniformly distributed random variable is
1
𝑓𝑋 (𝑥) = 𝑎≤𝑋≤𝑏
𝑏−𝑎
1
𝑓𝐴 (𝐴) = = 1
1−0
E[X(t)] = ∫₀¹ A dA = [A²/2]₀¹ = 1/2 = constant
𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]

= ∫ 𝑥(𝑡) 𝑥(𝑡 + 𝜏) 𝑓𝐴 (𝐴) 𝑑𝐴
−∞


= ∫₀¹ A · A dA = ∫₀¹ A² dA = [A³/3]₀¹ = 1/3, which is independent of time

Hence given RP is WSS.

2. A random process is given as X(t) = A Cos(ω₀t + θ), where A and ω₀ are constants and θ is a uniformly distributed random variable on the interval (0, 2π). Verify whether the given random process is wide sense stationary or not.

Sol:

A random process is said to be wide sense stationary process if

𝐸[𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)] = 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒



𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡). 𝑓𝑋 (𝑥) 𝑑𝑥
−∞

𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝜃 (𝜃) 𝑑𝜃
−∞

Given θ is a uniformly distributed random variable on the interval (0, 2π).

1
𝑓𝑋 (𝑥) = 𝑎≤𝑋≤𝑏
𝑏−𝑎
1 1
𝑓𝜃 (𝜃) = =
2𝜋 − 0 2𝜋
2𝜋
1
𝐸[𝑋(𝑡)] = ∫ A cos(𝑤0 𝑡 + 𝜃) 𝑑𝜃
0 2𝜋


2𝜋
sin(𝑤0 𝑡 + 𝜃)
=𝐴 [ ]
2𝜋 0

A
= [ sin(𝑤0 𝑡 + 2𝜋) − sin(𝑤0 𝑡 + 0)]
2𝜋
A
= [ sin(𝑤0 𝑡) − sin(𝑤0 𝑡)] = 0
2𝜋
= 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]



= ∫ 𝑥(𝑡) 𝑥(𝑡 + 𝜏) 𝑓𝜃 (𝜃) 𝑑𝜃
−∞

1 2𝜋
= ∫ (𝐴 cos(𝑤0 𝑡 + 𝜃) A cos(𝑤0 (𝑡 + 𝜏) + 𝜃) ) 𝑑𝜃
2𝜋 0

1 2𝜋
= ∫ ( 𝐴 cos(𝑤0 𝑡 + 𝜃) A cos(𝑤0 𝑡 + 𝜃 + 𝑤0 𝜏) ) 𝑑𝜃
2𝜋 0

2 cosA cosB = cos(A+B) + cos(A−B)

= (A²/4π) ∫₀^{2π} [ cos(2w₀t + 2θ + w₀τ) + cos(w₀τ) ] dθ

= (A²/4π) [ [sin(2w₀t + 2θ + w₀τ)/2]₀^{2π} + cos(w₀τ) [θ]₀^{2π} ]

= (A²/4π) [ 0 + 2π cos(w₀τ) ]

= (A²/2) cos(w₀τ)    Independent of time
2
Hence given RP is WSS.
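The closed-form result above can be cross-checked by Monte Carlo averaging over Θ (an assumed sketch; the values of A, ω₀, t and τ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
A, w0, t, tau = 3.0, 5.0, 1.2, 0.4           # arbitrary assumed values
theta = rng.uniform(0, 2 * np.pi, 500_000)

mean_est = np.mean(A * np.cos(w0 * t + theta))
Rxx_est = np.mean(A * np.cos(w0 * t + theta) * A * np.cos(w0 * (t + tau) + theta))

print("mean ~", round(mean_est, 3), "(theory 0)")
print("Rxx  ~", round(Rxx_est, 3),
      "(theory", round(A**2 / 2 * np.cos(w0 * tau), 3), ")")
```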


3. Verify whether the sine wave process X(t) = B Sin(ω₀t), where B is a uniform random variable on (−1, 1), is wide sense stationary or not.
Sol:
A random variable is said to be wide sense stationary process if
𝐸[𝑥(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
𝑅𝑋𝑋 (𝜏) = 𝐸[𝑥(𝑡)𝑥(𝑡 + 𝜏)] = 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒

𝐸[𝑥(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝑥 (𝑥)𝑑𝑥
−∞

𝐸[𝑥(𝑡)𝑥(𝑡, 𝑡 + 𝜏)] = ∫ 𝑥(𝑡)𝑥(𝑡 + 𝜏) 𝑓𝑥 (𝑥)𝑑𝑥
−∞

B is a uniform random variable on (-1,1) with the density function


1
𝑓𝐵 (𝐵) =
𝑏−𝑎
1 1
𝑓𝐵 (𝐵) = =
1 − (−1) 2
1
1
𝐸[𝑥(𝑡)] = ∫ (𝐵𝑠𝑖𝑛𝜔0 𝑡) 𝑑𝐵
−1 2

𝑠𝑖𝑛𝜔0 𝑡 1
= ∫ 𝐵 𝑑𝐵
2 −1

𝑠𝑖𝑛𝜔0 𝑡 𝐵2 1
= [ ]
2 2 −1

𝑠𝑖𝑛𝜔0 𝑡 1 1
= [ − ] = 0 = 𝐶𝑜𝑛𝑠𝑡𝑎𝑛𝑡
2 2 2

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑥(𝑡)𝑥(𝑡 + 𝜏)]


1
1
= ∫ (𝐵𝑠𝑖𝑛(𝜔0 𝑡))(𝐵𝑠𝑖𝑛(𝜔0 (𝑡 + 𝜏)) 𝑑𝐵
−1 2


= [sin(ω₀t) sin(ω₀(t + τ)) / 2] ∫_{−1}^{1} B² dB

Using sinA sinB = [cos(A − B) − cos(A + B)] / 2,

= {[cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 4} [B³/3]_{−1}^{1}

= {[cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 4} (2/3)

= [cos(ω₀τ) − cos(2ω₀t + ω₀τ)] / 6

∴ R_XX(t, t + τ) depends on the time t.
Hence the given random process is not wide sense stationary
4. A random process 𝑌(𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏) is defined in terms of 𝑋(𝑡)
that is at least wide sense stationary.
(𝑖) Deduce the mean value of 𝑌(𝑡) if 𝐸[𝑋(𝑡)] ≠ 0

(𝑖𝑖) Justify that the variance 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) − 𝑅𝑋𝑋 (𝜏)]

(iii) If 𝑌(𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏), estimate 𝐸[𝑌(𝑡)] and 𝜎𝑌 2

Sol: Given,
𝑌(𝑡) = 𝑋(𝑡) − 𝑋(𝑡 + 𝜏)
Given 𝑋(𝑡) is wide sense stationary and 𝐸[𝑋(𝑡)] ≠ 0
(i) Mean of 𝑌(𝑡):
𝐸[𝑌(𝑡)] = 𝐸[𝑋(𝑡) − 𝑋(𝑡 + 𝜏)]

= 𝐸[𝑋(𝑡)] − 𝐸[𝑋(𝑡 + 𝜏)]


= 𝐸[𝑋(𝑡)] − 𝐸[𝑋(𝑡)][∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]

∴ 𝐸[𝑌(𝑡)] = 0

(ii) Variance of 𝑌(𝑡):

𝜎𝑌 2 = 𝐸[𝑌 2 (𝑡)] − (𝐸[𝑌(𝑡)])2

= E[(X(t) − X(t + τ))²] − 0    [∵ E[Y(t)] = 0]

= 𝐸[𝑋 2 (𝑡) + 𝑋 2 (𝑡 + 𝜏) − 2𝑋(𝑡)𝑋(𝑡 + 𝜏)]

= 𝐸[𝑋 2 (𝑡)] + 𝐸[𝑋 2 (𝑡 + 𝜏)] − 2𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]

= 𝐸[𝑋 2 (𝑡)] + 𝐸[𝑋 2 (𝑡)] − 2𝑅𝑋𝑋 (𝜏)[∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]

̅̅̅̅̅̅̅
= 𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (0) − 2𝑅𝑋𝑋 (𝜏)[∵ 𝑅𝑋𝑋 (0) = 𝑋 2 (𝑡)]

= 2𝑅𝑋𝑋 (0) − 2𝑅𝑋𝑋 (𝜏)

∴ 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) − 𝑅𝑋𝑋 (𝜏)]

(iii) Given,
𝑌(𝑡) = 𝑋(𝑡) + 𝑋(𝑡 + 𝜏)
Now,
𝐸[𝑌(𝑡)] = 𝐸[𝑋(𝑡) + 𝑋(𝑡 + 𝜏)]
= 𝐸[𝑋(𝑡)] + 𝐸[𝑋(𝑡 + 𝜏)]
= 𝐸[𝑋(𝑡)] + 𝐸[𝑋(𝑡)][∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]
= 2 𝐸[𝑋(𝑡)]
∴ 𝐸[𝑌(𝑡)] = 2 𝐸[𝑋(𝑡)]
Now,
𝜎𝑌 2 = 𝐸[𝑌 2 (𝑡)] − (𝐸[𝑌(𝑡)])2


= E[(X(t) + X(t + τ))²] − (2 E[X(t)])²    [∵ E[Y(t)] = 2 E[X(t)]]

= 𝐸[𝑋 2 (𝑡) + 𝑋 2 (𝑡 + 𝜏) + 2𝑋(𝑡)𝑋(𝑡 + 𝜏)] − 4(𝐸[𝑋(𝑡)])2

= 𝐸[𝑋 2 (𝑡)] + 𝐸[𝑋 2 (𝑡 + 𝜏)] + 2𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] − 4(𝐸[𝑋(𝑡)])2

= 𝐸[𝑋 2 (𝑡)] + 𝐸[𝑋 2 (𝑡)] + 2𝑅𝑋𝑋 (𝜏) − 4(𝐸[𝑋(𝑡)])2 [∵ 𝑋(𝑡) 𝑖𝑠 𝑊𝑆𝑆 ]

̅̅̅̅̅̅̅
= 𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (0) + 2𝑅𝑋𝑋 (𝜏) − 4(𝐸[𝑋(𝑡)])2 [∵ 𝑅𝑋𝑋 (0) = 𝑋 2 (𝑡)]

= 2𝑅𝑋𝑋 (0) + 2𝑅𝑋𝑋 (𝜏) − 4𝑋̅ 2

= 2[𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (𝜏) − 2𝑋̅ 2 ]

∴ 𝜎𝑌 2 = 2[𝑅𝑋𝑋 (0) + 𝑅𝑋𝑋 (𝜏) − 2𝑋̅ 2 ]

5. A random process is given as X(t) = A Cos(ω₀t) + B Sin(ω₀t), where A and B are uncorrelated, zero mean random variables having the same variance σ². Appraise whether X(t) is wide sense stationary or not.
Sol:

𝐺𝑖𝑣𝑒𝑛,

𝑋(𝑡) = 𝐴 cos(𝜔0 𝑡) + 𝐵 sin(𝜔0 𝑡)

𝐼𝑓 𝐴 & 𝐵 𝑎𝑟𝑒 𝑢𝑛𝑐𝑜𝑟𝑟𝑒𝑙𝑎𝑡𝑒𝑑

𝐸(𝐴𝐵) = 𝐸(𝐴) 𝐸(𝐵)

𝐵𝑢𝑡,

𝐸(𝐴) = 𝐸(𝐵) = 0

𝐸(𝐴𝐵) = 0

𝑁𝑜𝑤,

𝜎𝐴2 = 𝐸[𝐴2 ] − (𝐸[𝐴])2


= 𝐸[𝐴2 ]

𝜎𝐵2 = 𝐸[𝐵2 ] − (𝐸[𝐵])2

= 𝐸[𝐵2 ]

 𝐸[𝐴2 ] = 𝜎 2 𝑎𝑛𝑑 𝐸[𝐵2 ] = 𝜎 2

𝐴 𝑟𝑎𝑛𝑑𝑜𝑚 𝑝𝑟𝑜𝑐𝑒𝑠𝑠 𝑖𝑠 𝑠𝑎𝑖𝑑 𝑡𝑜 𝑏𝑒 𝑤𝑖𝑑𝑒 𝑠𝑒𝑛𝑠𝑒 𝑠𝑡𝑎𝑡𝑖𝑜𝑛𝑎𝑟𝑦 𝑝𝑟𝑜𝑐𝑒𝑠𝑠 𝑖𝑓

𝐸[𝑋(𝑡)] = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏) = 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑓 𝑡𝑖𝑚𝑒

𝑁𝑜𝑤,

𝐸[𝑋(𝑡)] = 𝐸[𝐴 cos(𝜔0 𝑡) + 𝐵 𝑆𝑖𝑛(𝜔0 𝑡)]

= 𝐸[𝐴] 𝐶𝑜𝑠(𝜔0 𝑡) + 𝐸[𝐵] 𝑆𝑖𝑛(𝜔0 𝑡)

= 0 (𝑆𝑖𝑛𝑐𝑒 𝐸[𝐴] = 𝐸[𝐵] = 0)

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]

= 𝑬[ (𝑨 𝐜𝐨𝐬(𝝎𝟎 𝒕) + 𝑩 𝐬𝐢𝐧(𝝎𝟎 𝒕)) ( 𝑨 𝐜𝐨𝐬(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑩 𝐬𝐢𝐧(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉))]

= E[ A² cos(ω₀t) cos(ω₀t + ω₀τ) + AB cos(ω₀t) sin(ω₀t + ω₀τ) + AB sin(ω₀t) cos(ω₀t + ω₀τ) + B² sin(ω₀t) sin(ω₀t + ω₀τ) ]

= 𝐸[𝐴2 ] cos(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐸[𝐴𝐵] cos(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)

+ 𝐸[𝐴𝐵] sin(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐸[𝐵2 ] sin(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)

= 𝜎 2 cos(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + 0 + 0 + 𝜎 2 sin(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)

(𝑺𝒊𝒏𝒄𝒆 𝑬[𝑨𝟐 ] = 𝑬[𝑩𝟐 ] = 𝝈𝟐 𝑬[𝑨𝑩] = 𝟎)

= 𝜎 2 [ cos(𝜔0 𝑡) cos(𝜔0 𝑡 + 𝜔0 𝜏) + sin(𝜔0 𝑡) sin(𝜔0 𝑡 + 𝜔0 𝜏)]


[𝑆𝑖𝑛𝑐𝑒 𝐶𝑜𝑠𝐴 𝐶𝑜𝑠𝐵 + 𝑆𝑖𝑛𝐴 𝑆𝑖𝑛𝐵 = 𝐶𝑜𝑠(𝐴 − 𝐵)]

= 𝜎 2 cos[𝜔0 𝑡 − (𝜔0 𝑡 + 𝜔0 𝜏)]

= 𝜎 2 cos(𝜔0 𝜏)

= 𝑅𝑋𝑋 (𝜏)

𝐼𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑓 𝑡𝑖𝑚𝑒

Hence the given random process is Wide Sense Stationary.
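A quick Monte Carlo cross-check of this result (assumed sketch; σ, ω₀, t and τ are arbitrary values, and A, B are drawn as independent Gaussians so that they are uncorrelated and zero mean):

```python
import numpy as np

rng = np.random.default_rng(7)
sigma, w0, t, tau = 1.5, 4.0, 0.9, 0.2       # arbitrary assumed values
A = rng.normal(0, sigma, 400_000)            # zero mean, variance sigma^2
B = rng.normal(0, sigma, 400_000)            # independent of A, hence uncorrelated

X = lambda s: A * np.cos(w0 * s) + B * np.sin(w0 * s)
Rxx_est = np.mean(X(t) * X(t + tau))
print(round(Rxx_est, 3), "~", round(sigma**2 * np.cos(w0 * tau), 3))
```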

6. Two random processes 𝑋(𝑡)& 𝑌(𝑡) are defined as

𝑋(𝑡) = 𝐴 𝐶𝑜𝑠(𝜔0 𝑡) + 𝐵 𝑆𝑖𝑛(𝜔0 𝑡) and 𝑌(𝑡) = 𝐵 𝐶𝑜𝑠(𝜔0 𝑡) − 𝐴 𝑆𝑖𝑛(𝜔0 𝑡),


where 𝐴, 𝐵 are uncorrelated, zero mean random variables with same
variances and 𝜔0 is constant. Verify whether𝑋(𝑡), 𝑌(𝑡) are Jointly wide sense
stationary or not.

Sol:Given,

𝑋(𝑡) = 𝐴 𝑐𝑜𝑠(𝜔0 𝑡) + 𝐵 𝑠𝑖𝑛(𝜔0 𝑡)

𝑌(𝑡) = 𝐵 𝑐𝑜𝑠(𝜔0 𝑡) − 𝐴 𝑠𝑖𝑛(𝜔0 𝑡)

Conditions to be satisfied to be Jointly WSS:

Mean of X(t): X̄(t) = constant
Mean of Y(t): Ȳ(t) = constant

Auto Correlation of X(t)&Y(t): 𝑅𝑋𝑋 𝑎𝑛𝑑 𝑅𝑌𝑌 𝑠ℎ𝑜𝑢𝑙𝑑 𝑏𝑒 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒

Cross Correlation of X(t) & Y(t):𝑅𝑋𝑌 𝑜𝑟 𝑅𝑌𝑋 𝑠ℎ𝑜𝑢𝑙𝑑 𝑏𝑒 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒


Given that,

𝐸[𝐴] = 𝐸[𝐵] = 0 and 𝐴, 𝐵 are uncorrelated.

∴ 𝐸[𝐴𝐵] = 𝐸[𝐴]𝐸[𝐵] = 0


Now, 𝜎𝐴 2 = 𝐸[𝐴2 ] − (𝐸[𝐴])2

= 𝐸[𝐴2 ] − 0

∴ 𝜎𝐴 2 = 𝐸[𝐴2 ]

Let 𝜎𝐴 2 = 𝐸[𝐴2 ] = 𝜎 2

Then, 𝜎𝐵 2 = 𝐸[𝐵2 ] − (𝐸[𝐵])2

= 𝐸[𝐵2 ] − 0

= 𝐸[𝐵2 ]

∴ 𝜎𝐵 2 = 𝐸[𝐵2 ] = 𝜎 2

[∵ Given 𝐴 and 𝐵 have same variance]

To verify Jointly WSS or not:

Calculation of Mean:

𝐸[𝑋(𝑡)] = 𝐸[𝐴 𝑐𝑜𝑠(𝜔0 𝑡) + 𝐵 𝑠𝑖𝑛(𝜔0 𝑡)]

= 𝐸[𝐴]𝑐𝑜𝑠(𝜔0 𝑡) + 𝐸[𝐵]𝑠𝑖𝑛(𝜔0 𝑡)[∵ 𝐸[𝐴] = 𝐸[𝐵] = 0 ]

∴ ̅̅̅̅̅̅̅
𝑿(𝒕) = 𝟎 ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕

𝐸[𝑌(𝑡)] = 𝐸[𝐵 𝑐𝑜𝑠(𝜔0 𝑡) − 𝐴 𝑠𝑖𝑛(𝜔0 𝑡)]

= 𝐸[𝐵]𝑐𝑜𝑠(𝜔0 𝑡) − 𝐸[𝐴]𝑠𝑖𝑛(𝜔0 𝑡)[∵ 𝐸[𝐴] = 𝐸[𝐵] = 0 ]

∴ ̅̅̅̅̅̅
𝒀(𝒕) = 𝟎 ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕

Calculation of Auto Correlation Functions:

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]

= 𝐸[(𝐴 𝑐𝑜𝑠(𝜔0 𝑡) + 𝐵 𝑠𝑖𝑛(𝜔0 𝑡))(𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐵 𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏))]


= 𝑬[𝑨𝟐 𝒄𝒐𝒔(𝝎𝟎 𝒕)𝒄𝒐𝒔(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑨𝑩 𝒄𝒐𝒔(𝝎𝟎 𝒕)𝒔𝒊𝒏(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉)


+ 𝑨𝑩 𝒔𝒊𝒏(𝝎𝟎 𝒕)𝒄𝒐𝒔(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑩𝟐 𝒔𝒊𝒏(𝝎𝟎 𝒕)𝒔𝒊𝒏(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉)]

= 𝑬[𝑨𝟐 ]𝒄𝒐𝒔(𝝎𝟎 𝒕)𝒄𝒐𝒔(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑬[𝑨𝑩]𝒄𝒐𝒔(𝝎𝟎 𝒕)𝒔𝒊𝒏(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉)


+ 𝑬[𝑨𝑩]𝒔𝒊𝒏(𝝎𝟎 𝒕)𝒄𝒐𝒔(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉) + 𝑬[𝑩𝟐 ]𝒔𝒊𝒏(𝝎𝟎 𝒕)𝒔𝒊𝒏(𝝎𝟎 𝒕 + 𝝎𝟎 𝝉)

= 𝜎 2 𝑐𝑜𝑠(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) + 0 + 0 + 𝜎 2 𝑠𝑖𝑛(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)

= 𝜎 2 [𝑐𝑜𝑠(𝜔0 𝑡 − (𝜔0 𝑡 + 𝜔0 𝜏))][∵ 𝐶𝑜𝑠(𝐴 − 𝐵) = 𝐶𝑜𝑠𝐴𝐶𝑜𝑠𝐵 + 𝑆𝑖𝑛𝐴𝑆𝑖𝑛𝐵]

= 𝜎 2 𝑐𝑜𝑠(𝜔0 𝜏)

∴ 𝑹𝑿𝑿 (𝒕, 𝒕 + 𝝉) = 𝑹𝑿𝑿 (𝝉) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆

𝑅𝑌𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[(𝐵 𝑐𝑜𝑠(𝜔0 𝑡) − 𝐴 𝑠𝑖𝑛(𝜔0 𝑡))(𝐵 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐴 𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏))]

= 𝐸[𝐵2 𝑐𝑜𝑠(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐴𝐵 𝑐𝑜𝑠(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)


− 𝐴𝐵 𝑠𝑖𝑛(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐴2 𝑠𝑖𝑛(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)]

= 𝐸 [𝐵2 ]𝑐𝑜𝑠(𝜔0 𝑡 )𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐸 [𝐴𝐵]𝑐𝑜𝑠(𝜔0 𝑡 )𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)


− 𝐸 [𝐴𝐵 ]𝑠𝑖𝑛(𝜔0 𝑡 )𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) + 𝐸 [𝐴2 ]𝑠𝑖𝑛(𝜔0 𝑡 )𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)

= 𝜎 2 𝑐𝑜𝑠(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 0 − 0 + 𝜎 2 𝑠𝑖𝑛(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)

= 𝜎 2 [𝑐𝑜𝑠(𝜔0 𝑡 − (𝜔0 𝑡 + 𝜔0 𝜏))][∵ 𝐶𝑜𝑠(𝐴 − 𝐵) = 𝐶𝑜𝑠𝐴𝐶𝑜𝑠𝐵 + 𝑆𝑖𝑛𝐴𝑆𝑖𝑛𝐵]

= 𝜎 2 𝑐𝑜𝑠(𝜔0 𝜏)

∴ 𝑹𝒀𝒀 (𝒕, 𝒕 + 𝝉) = 𝑹𝒀𝒀 (𝝉) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆

Calculation of Cross Correlation Function:

𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[(𝐴 𝑐𝑜𝑠(𝜔0 𝑡) + 𝐵 𝑠𝑖𝑛(𝜔0 𝑡))(𝐵 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐴 𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏))]


= 𝐸[𝐴𝐵 𝑐𝑜𝑠(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐴2 𝑐𝑜𝑠(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)


+ 𝐵2 𝑠𝑖𝑛(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐴𝐵 𝑠𝑖𝑛(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)]

= 𝐸 [𝐴𝐵]𝑐𝑜𝑠(𝜔0 𝑡 )𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐸 [𝐴2 ]𝑐𝑜𝑠(𝜔0 𝑡 )𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)


+ 𝐸 [𝐵 2 ]𝑠𝑖𝑛(𝜔0 𝑡 )𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 𝐸 [𝐴𝐵]𝑠𝑖𝑛(𝜔0 𝑡 )𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏)

= 0 − 𝜎 2 𝑐𝑜𝑠(𝜔0 𝑡)𝑠𝑖𝑛(𝜔0 𝑡 + 𝜔0 𝜏) + 𝜎 2 𝑠𝑖𝑛(𝜔0 𝑡)𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏) − 0

= σ²[sin(ω₀t − (ω₀t + ω₀τ))]    [∵ Sin(A − B) = SinA CosB − CosA SinB]

= 𝜎 2 [𝑠𝑖𝑛(−𝜔0 𝜏)]

= −𝜎 2 𝑠𝑖𝑛(𝜔0 𝜏)

∴ 𝑹𝑿𝒀 (𝒕, 𝒕 + 𝝉) = 𝑹𝑿𝒀 (𝝉) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆


Hence, 𝑋(𝑡)and𝑌(𝑡) are Jointly wide sense stationary.

7. A random process is defined as 𝑋(𝑡) = 𝐴 𝐶𝑜𝑠(𝜔0 𝑡), where 𝜔0 is a


constant and 𝐴 is a random variable uniformly distributed over (0,1).
Estimate the autocorrelation function.

Sol: Given,

𝑋(𝑡) = 𝐴 𝑐𝑜𝑠(𝜔0 𝑡), where 𝜔0 is a constant.

Auto Correlation Function: 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]



Now, 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] = ∫−∞ 𝑥(𝑡) . 𝑥(𝑡 + 𝜏)𝑓𝑋 (𝑥) 𝑑𝑥

Given 𝐴 is a uniformly distributed random variable in the interval (0,1).


𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] = ∫ 𝑥(𝑡). 𝑥(𝑡 + 𝜏)𝑓𝐴 (𝐴) 𝑑𝐴


−∞

Density function of a uniformly distributed random variable is


1
𝑓𝑋 (𝑥) = ; 𝑎≤𝑋≤𝑏
𝑏−𝑎
1
𝑓𝐴 (𝐴) = =1
1−0
⇒ E[X(t)X(t + τ)] = ∫₀¹ A cos(ω₀t) A cos(ω₀t + ω₀τ) (1) dA

= cos(ω₀t) cos(ω₀t + ω₀τ) ∫₀¹ A² dA

= (1/2)[cos(2ω₀t + ω₀τ) + cos(ω₀τ)] [A³/3]₀¹    [∵ 2CosA CosB = Cos(A + B) + Cos(A − B)]

= (1/2)[cos(2ω₀t + ω₀τ) + cos(ω₀τ)] (1/3)

∴ E[X(t)X(t + τ)] = (1/6)[cos(2ω₀t + ω₀τ) + cos(ω₀τ)]

∴ R_XX(t, t + τ) = (1/6)[cos(2ω₀t + ω₀τ) + cos(ω₀τ)]  ⇒  Dependent on time
8. Two statistically independent zero mean random processes 𝑋(𝑡), 𝑌(𝑡)
have autocorrelation functions 𝑅𝑋𝑋 (𝜏) = 𝑒𝑥𝑝(−|𝜏|), 𝑅𝑌𝑌 (𝜏) = cos(2𝜋𝜏)
respectively. Evaluate the
(i) Autocorrelation of the Sum𝑊1 (𝑡) = 𝑋(𝑡) + 𝑌(𝑡)
(ii) Autocorrelation of the Difference 𝑊2 (𝑡) = 𝑋(𝑡) − 𝑌(𝑡)
(iii) Crosscorrelation of 𝑊1 (𝑡) and 𝑊2 (𝑡)

Sol: Given,

𝐸[𝑋(𝑡)] = 0 and 𝐸[𝑌(𝑡)] = 0


𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] = 𝑒𝑥𝑝(−|𝜏|)

𝑅𝑌𝑌 (𝜏) = 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)] = cos(2𝜋𝜏)

Now, cross correlation of 𝑋(𝑡) and 𝑌(𝑡) is

𝑅𝑋𝑌 (𝜏) = 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[𝑋(𝑡)]𝐸[𝑌(𝑡 + 𝜏)][∵ 𝑋(𝑡) 𝑎𝑛𝑑 𝑌(𝑡) 𝑎𝑟𝑒 𝑠𝑡𝑎𝑡𝑖𝑠𝑡𝑖𝑐𝑎𝑙𝑙𝑦 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 ]

= (0)(𝐸[𝑌(𝑡 + 𝜏)])

∴ 𝑅𝑋𝑌 (𝜏) = 0&𝑅𝑌𝑋 (𝜏) = 0

(i) Auto correlation of 𝑊1 (𝑡):

Given, 𝑊1 (𝑡) = 𝑋(𝑡) + 𝑌(𝑡)

Now, 𝑅𝑊1𝑊1 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑊1 (𝑡)𝑊1 (𝑡 + 𝜏)]

= 𝐸[(𝑋(𝑡) + 𝑌(𝑡))(𝑋(𝑡 + 𝜏) + 𝑌(𝑡 + 𝜏))]

= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏) + 𝑋(𝑡)𝑌(𝑡 + 𝜏) + 𝑌(𝑡)𝑋(𝑡 + 𝜏) + 𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] + 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)] + 𝐸[𝑌(𝑡)𝑋(𝑡 + 𝜏)] + 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)

= 𝑒𝑥𝑝(−|𝜏|) + 0 + 0 + 𝑐𝑜𝑠(2𝜋𝜏)

∴ 𝑹𝑾𝟏𝑾𝟏 (𝒕, 𝒕 + 𝝉) = 𝒆𝒙𝒑(−|𝝉|) + 𝒄𝒐𝒔(𝟐𝝅𝝉)

(ii)Auto correlation of 𝑊2 (𝑡):

Given, 𝑊2 (𝑡) = 𝑋(𝑡) − 𝑌(𝑡)

Now,

𝑅𝑊2𝑊2 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑊2 (𝑡)𝑊2 (𝑡 + 𝜏)]

= 𝐸[(𝑋(𝑡) − 𝑌(𝑡))(𝑋(𝑡 + 𝜏) − 𝑌(𝑡 + 𝜏))]


= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏) − 𝑋(𝑡)𝑌(𝑡 + 𝜏) − 𝑌(𝑡)𝑋(𝑡 + 𝜏) + 𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] − 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)] − 𝐸[𝑌(𝑡)𝑋(𝑡 + 𝜏)] + 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝑅𝑋𝑋 (𝜏) − 𝑅𝑋𝑌 (𝜏) − 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)

= 𝑒𝑥𝑝(−|𝜏|) − 0 − 0 + 𝑐𝑜𝑠(2𝜋𝜏)

∴ 𝑹𝑾𝟐𝑾𝟐 (𝒕, 𝒕 + 𝝉) = 𝒆𝒙𝒑(−|𝝉|) + 𝒄𝒐𝒔(𝟐𝝅𝝉)

(iii)Cross correlation of 𝑊1 (𝑡) and 𝑊2 (𝑡):

𝑅𝑊1𝑊2 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑊1 (𝑡)𝑊2 (𝑡 + 𝜏)]

= 𝐸[(𝑋(𝑡) + 𝑌(𝑡))(𝑋(𝑡 + 𝜏) − 𝑌(𝑡 + 𝜏))]

= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏) − 𝑋(𝑡)𝑌(𝑡 + 𝜏) + 𝑌(𝑡)𝑋(𝑡 + 𝜏) − 𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)] − 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)] + 𝐸[𝑌(𝑡)𝑋(𝑡 + 𝜏)] − 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]

= 𝑅𝑋𝑋 (𝜏) − 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) − 𝑅𝑌𝑌 (𝜏)

= 𝑒𝑥𝑝(−|𝜏|) − 0 + 0 − 𝑐𝑜𝑠(2𝜋𝜏)

∴ 𝑹𝑾𝟏𝑾𝟐 (𝒕, 𝒕 + 𝝉) = 𝒆𝒙𝒑(−|𝝉|) − 𝒄𝒐𝒔(𝟐𝝅𝝉)
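These results can be checked numerically. The sketch below is an assumed construction (not from the notes): X(t) is approximated by a discretized first-order Gauss-Markov process whose autocorrelation is approximately exp(−|τ|), and Y(t) = √2 cos(2πt + Θ) has autocorrelation cos(2πτ); the time-averaged autocorrelation of W₁ = X + Y is then compared with exp(−|τ|) + cos(2πτ).

```python
import numpy as np

rng = np.random.default_rng(5)
dt, n = 0.01, 200_000
a = np.exp(-dt)                              # gives R_XX(k*dt) ~ exp(-k*dt)
x = np.zeros(n)
drive = rng.normal(size=n) * np.sqrt(1 - a**2)   # keeps the variance at 1
for k in range(1, n):
    x[k] = a * x[k - 1] + drive[k]           # discretized Gauss-Markov process

t = np.arange(n) * dt
y = np.sqrt(2) * np.cos(2 * np.pi * t + rng.uniform(0, 2 * np.pi))
w1 = x + y                                   # W1(t) = X(t) + Y(t)

tau = 0.5
lag = int(tau / dt)
R_w1 = np.mean(w1[:-lag] * w1[lag:])         # time-average estimate of R_W1W1(tau)
print(round(R_w1, 3), "~", round(np.exp(-tau) + np.cos(2 * np.pi * tau), 3))
```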

9. Given 𝑋̅ = 6 and 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 36+25 𝑒𝑥𝑝(−𝜏) for a random


process 𝑋(𝑡). Indicate which of the following statements are true and
give the reason.
(i) Is first order stationary?
(ii) Has total average power of 61W
(iii) Is wide sense stationary?
(iv) Has a periodic component
(v) Has an AC power of 36W

Sol: Given,

𝑋̅ = 6 ⇒ 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡


(i) First order stationary:

It is first order stationary because the mean value [𝑋̅ = 6] is constant.

Hence, the given statement is true.

(ii) Total Average Power:

The average power is the mean square value of the process𝑋(𝑡).

Average Power: E[X²(t)] = R_XX(0)

Given,

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 36+25 𝑒𝑥𝑝(−𝜏)

⇒ 𝑅𝑋𝑋 (𝜏) = 36+25 𝑒𝑥𝑝(−𝜏)

Now,

𝑅𝑋𝑋 (0) = 36+25 𝑒𝑥𝑝(−0) = 36 + 25(1) = 61

∴ Average Power = 61W

Hence, the given statement is true.

(iii) Wide Sense Stationary:

Here,

Mean: 𝑋̅ = 6 ⇒ 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡

Auto Correlation: 𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 36+25 𝑒𝑥𝑝(−𝜏) ⇒


𝐼𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒

∴ 𝑋(𝑡) is wide sense stationary random process.

Hence, the given statement is true.

(iv) Periodic component:

If the given RP 𝑋(𝑡) has no periodic components then, it satisfies the


condition


̅ )𝟐
𝐥𝐢𝐦 𝑹𝑿𝑿 (𝝉) = (𝑿
|𝝉|→∞

Now,

lim 𝑅𝑋𝑋 (𝜏) = lim [36 + 25 𝑒𝑥𝑝(−𝜏)]


𝜏→∞ 𝜏→∞

= 36 + 25 𝑒𝑥𝑝(−∞)

= 36 + 25(0)

= 36

= 6²
̅ )𝟐
∴ 𝒍𝒊𝒎 𝑹𝑿𝑿 (𝝉) = (𝑿
𝝉→∞

Hence, the given RP 𝑋(𝑡) has no periodic components.

(v) AC Power:

The value of AC Power is given by the variance of the RP 𝑋(𝑡).

AC Power: 𝝈𝟐 𝑿(𝒕) = 𝑬[𝑿𝟐 (𝒕)] − (𝑬[𝑿(𝒕)])𝟐

⇒ 𝜎 2𝑋(𝑡) = 𝑅𝑋𝑋 (0) − (𝑋̅)2

= 61 − 6²

= 61 − 36

∴ 𝜎 2𝑋(𝑡) = 25

∴ AC Power = 25W

Hence, the given statement is not true.
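The power bookkeeping used in parts (ii) and (v) can be summarised in a few lines of arithmetic (assumed sketch, repeating the given X̄ = 6 and R_XX(τ) = 36 + 25 exp(−τ)):

```python
import numpy as np

R = lambda tau: 36 + 25 * np.exp(-tau)       # given autocorrelation function
total_power = R(0)                           # E[X^2] = R_XX(0)
dc_power = 6 ** 2                            # (X_bar)^2 = lim R_XX(tau), tau -> inf
ac_power = total_power - dc_power            # variance of the process
print(total_power, dc_power, ac_power)       # 61 W, 36 W, 25 W
```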


10. Show that X(t) and Y(t) are jointly WSS, if the random processes are X(t) = A Cos(ω₁t + θ) and Y(t) = B Cos(ω₂t + Φ), where A, B, ω₁ and ω₂ are constants, while Φ and θ are statistically independent uniform random variables on (0, 2π).

Sol: Given,

𝑋(𝑡) = 𝐴 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜃)

𝑌(𝑡) = 𝐵 𝑐𝑜𝑠(𝜔2 𝑡 + 𝛷)

Conditions to be satisfied to be Jointly WSS:

Mean of X(t): ̅̅̅̅̅̅


𝑋(𝑡) = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
̅̅̅̅̅̅ = 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡
Mean of Y(t):𝑌(𝑡)

Auto Correlation of X(t) &Y(t): 𝑅𝑋𝑋 𝑎𝑛𝑑 𝑅𝑌𝑌 𝑠ℎ𝑜𝑢𝑙𝑑 𝑏𝑒 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒

Cross Correlation of X(t) & Y(t):𝑅𝑋𝑌 𝑜𝑟 𝑅𝑌𝑋 𝑠ℎ𝑜𝑢𝑙𝑑 𝑏𝑒 𝑖𝑛𝑑𝑒𝑝𝑒𝑛𝑑𝑒𝑛𝑡 𝑜𝑛 𝑡𝑖𝑚𝑒

Mean of𝑋(𝑡):

𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝑋 (𝑥) 𝑑𝑥


−∞

Given, 𝜃 is a uniformly distributed random variable on the interval (0,2𝜋).


𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝜃 (𝜃) 𝑑𝜃


−∞
Density function of a uniformly distributed random variable is
1
𝑓𝑋 (𝑥) = ; 𝑎≤𝑋≤𝑏
𝑏−𝑎
1 1
𝑓𝜃 (𝜃) = =
2𝜋 − 0 2𝜋


2𝜋

⇒ 𝐸[𝑋(𝑡)] = ∫ 𝐴 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜃) 𝑓𝜃 (𝜃) 𝑑𝜃


0

2𝜋
1
= 𝐴 ∫ 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜃) ( ) 𝑑𝜃
2𝜋
0

2𝜋
𝑠𝑖𝑛(𝜔1 𝑡 + 𝜃)
= 𝐴[ ]
2𝜋 0

𝐴
= [𝑠𝑖𝑛(2𝜋 + 𝜔1 𝑡) − 𝑠𝑖𝑛(𝜔1 𝑡)]
2𝜋
𝐴
= [𝑠𝑖𝑛(𝜔1 𝑡) − 𝑠𝑖𝑛(𝜔1 𝑡)]
2𝜋
𝐴
= [0]
2𝜋
∴ 𝑬[𝑿(𝒕)] = 𝟎 ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕

Auto Correlation of𝑋(𝑡):

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]


= ∫ 𝑥(𝑡) 𝑥(𝑡 + 𝜏)𝑓𝑋 (𝑥) 𝑑𝑥


−∞

2𝜋

= ∫ 𝐴 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜃)𝐴 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜔1 𝜏 + 𝜃)𝑓𝜃 (𝜃) 𝑑𝜃


0

2𝜋
1
= 𝐴2 ∫ 𝑐𝑜𝑠(𝜔1 𝑡 + 𝜃)𝑐𝑜𝑠(𝜔1 𝑡 + 𝜔1 𝜏 + 𝜃) ( ) 𝑑𝜃
2𝜋
0

2𝜋
𝐴2
= ∫ [𝑐𝑜𝑠(2𝜔1 𝑡 + 𝜔1 𝜏 + 2𝜃) + 𝑐𝑜𝑠(𝜔1 𝜏)] 𝑑𝜃
4𝜋
0


[∵ 2𝐶𝑜𝑠𝐴𝐶𝑜𝑠𝐵 = 𝐶𝑜𝑠(𝐴 + 𝐵) + 𝐶𝑜𝑠(𝐴 − 𝐵)]


2𝜋
𝐴2 𝑠𝑖𝑛(2𝜔1 𝑡 + 𝜔1 𝜏 + 2𝜃)
= [ + 𝑐𝑜𝑠(𝜔1 𝜏)[𝜃]]
4𝜋 2
0

𝐴2 𝑠𝑖𝑛(2𝜋 + 2𝜔1 𝑡 + 𝜔1 𝜏) 𝑠𝑖𝑛(2𝜔1 𝑡 + 𝜔1 𝜏)


= [ + 2𝜋𝑐𝑜𝑠(𝜔1 𝜏) − ( + 0)]
4𝜋 2 2

𝐴2 𝑠𝑖𝑛(2𝜔1 𝑡 + 𝜔1 𝜏) 𝑠𝑖𝑛(2𝜔1 𝑡 + 𝜔1 𝜏)
= [ + 2𝜋𝑐𝑜𝑠(𝜔1 𝜏) − ]
4𝜋 2 2

𝐴2
= [2𝜋𝑐𝑜𝑠(𝜔1 𝜏)]
4𝜋

𝑨𝟐
∴ 𝑹𝑿𝑿 (𝒕, 𝒕 + 𝝉) = ( 𝒄𝒐𝒔(𝝎𝟏 𝝉)) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆
𝟐
Mean of 𝑌(𝑡):

𝐸[𝑌(𝑡)] = ∫ 𝑦(𝑡) 𝑓𝑌 (𝑦) 𝑑𝑦


−∞

Given, 𝛷 is a uniformly distributed random variable on the interval (0,2𝜋).


𝐸[𝑌(𝑡)] = ∫ 𝑦(𝑡) 𝑓𝛷 (𝛷) 𝑑𝛷


−∞

Density function of a uniformly distributed random variable is


1
𝑓𝑌 (𝑦) = ; 𝑎≤𝑌≤𝑏
𝑏−𝑎
1 1
𝑓𝛷 (𝛷) = =
2𝜋 − 0 2𝜋


2𝜋

⇒ 𝐸[𝑌(𝑡)] = ∫ 𝐵 𝑐𝑜𝑠(𝜔2 𝑡 + 𝛷) 𝑓𝛷 (𝛷) 𝑑𝛷


0

2𝜋
1
= 𝐵 ∫ 𝑐𝑜𝑠(𝜔2 𝑡 + 𝛷) ( ) 𝑑𝛷
2𝜋
0

2𝜋
𝑠𝑖𝑛(𝜔2 𝑡 + 𝛷)
= 𝐵[ ]
2𝜋 0

𝐵
= [𝑠𝑖𝑛(2𝜋 + 𝜔2 𝑡) − 𝑠𝑖𝑛(𝜔2 𝑡)]
2𝜋
𝐵
= [𝑠𝑖𝑛(𝜔2 𝑡) − 𝑠𝑖𝑛(𝜔2 𝑡)]
2𝜋
𝐵
= [0]
2𝜋
∴ 𝑬[𝒀(𝒕)] = 𝟎 ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕

Auto Correlation of 𝑌(𝑡):

𝑅𝑌𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]


= ∫ 𝑦(𝑡) 𝑦(𝑡 + 𝜏)𝑓𝑌 (𝑦) 𝑑𝑦


−∞

2𝜋

= ∫ 𝐵 𝑐𝑜𝑠(𝜔2 𝑡 + 𝛷)𝐵 𝑐𝑜𝑠(𝜔2 𝑡 + 𝜔2 𝜏 + 𝛷)𝑓𝛷 (𝛷) 𝑑𝛷


0

2𝜋
1
= 𝐵2 ∫ 𝑐𝑜𝑠(𝜔2 𝑡 + 𝛷)𝑐𝑜𝑠(𝜔2 𝑡 + 𝜔2 𝜏 + 𝛷) ( ) 𝑑𝛷
2𝜋
0

2𝜋
𝐵2
= ∫ [𝑐𝑜𝑠(2𝜔2 𝑡 + 𝜔2 𝜏 + 2𝛷) + 𝑐𝑜𝑠(𝜔2 𝜏)] 𝑑𝛷
4𝜋
0


[∵ 2𝐶𝑜𝑠𝐴𝐶𝑜𝑠𝐵 = 𝐶𝑜𝑠(𝐴 + 𝐵) + 𝐶𝑜𝑠(𝐴 − 𝐵)]


2𝜋
𝐵2 𝑠𝑖𝑛(2𝜔2 𝑡 + 𝜔2 𝜏 + 2𝛷)
= [ + 𝑐𝑜𝑠(𝜔2 𝜏)[𝛷]]
4𝜋 2
0

𝐵2 𝑠𝑖𝑛(2𝜋 + 2𝜔2 𝑡 + 𝜔2 𝜏)
= [ + 2𝜋𝑐𝑜𝑠(𝜔2 𝜏)
4𝜋 2
𝑠𝑖𝑛(2𝜔2 𝑡 + 𝜔2 𝜏)
−( + 0)]
2

𝐵2 𝑠𝑖𝑛(2𝜔2 𝑡 + 𝜔2 𝜏) 𝑠𝑖𝑛(2𝜔2 𝑡 + 𝜔2 𝜏)
= [ + 2𝜋𝑐𝑜𝑠(𝜔2 𝜏) − ]
4𝜋 2 2

𝐵2
= [2𝜋𝑐𝑜𝑠(𝜔2 𝜏)]
4𝜋
𝑩𝟐
∴ 𝑹𝒀𝒀 (𝒕, 𝒕 + 𝝉) = ( 𝒄𝒐𝒔(𝝎𝟐 𝝉)) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆
𝟐
Cross Correlation of𝑋(𝑡)and𝑌(𝑡):

𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)]

Given 𝜃 and 𝛷 are statistically independent random variables then,

the random processes 𝑋(𝑡) and 𝑌(𝑡) also would be statistically


independent.

⇒ 𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)] 𝐸[𝑌(𝑡 + 𝜏)]

= (0) 𝐸[𝑌(𝑡 + 𝜏)][ ∵ 𝐸[𝑋(𝑡)] = 0]

∴ 𝑹𝑿𝒀 (𝒕, 𝒕 + 𝝉) = 𝟎 ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆

Hence, 𝑋(𝑡) and 𝑌(𝑡) are Jointly wide sense stationary.


11. If 𝑋(𝑡) = 𝐴 𝐶𝑜𝑠(𝜔0 𝑡 + 𝜃), where 𝐴, 𝜔0 are constants and 𝜃 is a uniform


random variable on (−𝜋, 𝜋). A new random process is defined by 𝑌(𝑡) =
𝑋 2 (𝑡).

(i) Obtain the Mean and Auto Correlation Function of 𝑋(𝑡).


(ii) Obtain the Mean and Auto Correlation Function of 𝑌(𝑡).
(iii) Find the Cross Correlation Function of 𝑋(𝑡) and 𝑌(𝑡).
(iv)Are 𝑋(𝑡) and 𝑌(𝑡) are WSS ?
(v) Are 𝑋(𝑡) and 𝑌(𝑡) are Jointly WSS ?

Sol: Given,

𝑋(𝑡) = 𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃), where 𝐴, 𝜔0 are constants.

(i) Mean of𝑋(𝑡):


𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝑋 (𝑥) 𝑑𝑥


−∞

Given, 𝜃 is a uniformly distributed random variable on the interval (−𝜋, 𝜋).


𝐸[𝑋(𝑡)] = ∫ 𝑥(𝑡) 𝑓𝜃 (𝜃) 𝑑𝜃


−∞

Density function of a uniformly distributed random variable is


1
𝑓𝑋 (𝑥) = ; 𝑎≤𝑋≤𝑏
𝑏−𝑎
1 1
𝑓𝜃 (𝜃) = =
𝜋 − (−𝜋) 2𝜋
𝜋

⇒ 𝐸[𝑋(𝑡)] = ∫ 𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃)𝑓𝜃 (𝜃) 𝑑𝜃


−𝜋
𝜋
1
= 𝐴 ∫ 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃) ( ) 𝑑𝜃
2𝜋
−𝜋


𝜋
𝑠𝑖𝑛(𝜔0 𝑡 + 𝜃)
= 𝐴[ ]
2𝜋 −𝜋

𝐴
= [𝑠𝑖𝑛(𝜋 + 𝜔0 𝑡) − 𝑠𝑖𝑛(−𝜋 + 𝜔0 𝑡)]
2𝜋
𝐴
= [−𝑠𝑖𝑛𝜔0 𝑡 + 𝑠𝑖𝑛𝜔0 𝑡]
2𝜋
∴ 𝑬[𝑿(𝒕)] = 𝟎 ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕

Auto Correlation Function of𝑋(𝑡):

𝑅𝑋𝑋 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑋(𝑡 + 𝜏)]


= ∫ 𝑥(𝑡)𝑥(𝑡 + 𝜏)𝑓𝑋 (𝑥) 𝑑𝑥


−∞

= ∫ 𝑥(𝑡)𝑥(𝑡 + 𝜏)𝑓𝜃 (𝜃) 𝑑𝜃


−∞
𝜋
1
= ∫ 𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃) 𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜔0 𝜏 + 𝜃) ( ) 𝑑𝜃
2𝜋
−𝜋
𝜋
𝐴2
= ∫[𝑐𝑜𝑠(2𝜔0 𝑡 + 𝜔0 𝜏 + 2𝜃) + 𝑐𝑜𝑠(𝜔0 𝜏)] 𝑑𝜃
4𝜋
−𝜋

[∵ 2𝐶𝑜𝑠𝐴𝐶𝑜𝑠𝐵 = 𝐶𝑜𝑠(𝐴 + 𝐵) + 𝐶𝑜𝑠(𝐴 − 𝐵)]


𝜋
𝐴2 𝑠𝑖𝑛(2𝜔0 𝑡 + 𝜔0 𝜏 + 2𝜃)
= [ + 𝑐𝑜𝑠(𝜔0 𝜏)[𝜃]]
4𝜋 2
−𝜋
= (A²/4π) [ sin(2π + 2ω₀t + ω₀τ)/2 + π cos(ω₀τ) − ( sin(−2π + 2ω₀t + ω₀τ)/2 − π cos(ω₀τ) ) ]

𝐴2 𝑠𝑖𝑛(2𝜔0 𝑡 + 𝜔0 𝜏) 𝑠𝑖𝑛(2𝜔0 𝑡 + 𝜔0 𝜏)
= [ + 𝜋𝑐𝑜𝑠(𝜔0 𝜏) − ( − 𝜋𝑐𝑜𝑠(𝜔0 𝜏))]
4𝜋 2 2


𝐴2
= [𝜋𝑐𝑜𝑠(𝜔0 𝜏) + 𝜋𝑐𝑜𝑠(𝜔0 𝜏)]
4𝜋
𝐴2
= [2𝜋𝑐𝑜𝑠(𝜔0 𝜏)]
4𝜋
𝑨𝟐
∴ 𝑹𝑿𝑿 (𝒕, 𝒕 + 𝝉) = 𝑹𝑿𝑿 (𝝉) = 𝒄𝒐𝒔(𝝎𝟎 𝝉) ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆
𝟐
∴ The random process 𝑋(𝑡) is Wide Sense Stationary.

(ii) Mean of𝑌(𝑡):

Given,

𝑌(𝑡) = 𝑋 2 (𝑡)

⇒ 𝐸[𝑌(𝑡)] = 𝐸[𝑋 2 (𝑡)]


̅̅̅̅̅̅̅
= 𝑅𝑋𝑋 (0)[∵ 𝑅𝑋𝑋 (0) = 𝑋 2 (𝑡)]

𝐴2
= 𝑐𝑜𝑠(𝜔0 (0))
2
𝐴2
= (1)
2
𝑨𝟐
∴ 𝑬[𝒀(𝒕)] = ⇒ 𝒄𝒐𝒏𝒔𝒕𝒂𝒏𝒕
𝟐
Auto Correlation Function of𝑌(𝑡):

𝑅𝑌𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑌(𝑡)𝑌(𝑡 + 𝜏)]


= ∫ 𝑦(𝑡)𝑦(𝑡 + 𝜏)𝑓𝑌 (𝑦) 𝑑𝑦


−∞

= ∫ 𝑥 2 (𝑡)𝑥 2 (𝑡 + 𝜏) 𝑓𝜃 (𝜃) 𝑑𝜃
−∞


𝜋
1
= ∫[𝐴2 𝑐𝑜𝑠 2 (𝜔0 𝑡 + 𝜃)𝐴2 𝑐𝑜𝑠 2 (𝜔0 𝑡 + 𝜔0 𝜏 + 𝜃)] ( ) 𝑑𝜃
2𝜋
−𝜋
𝜋
𝐴4 1 + 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜃) 1 + 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)
= ∫( )( ) 𝑑𝜃
2𝜋 2 2
−𝜋
𝜋 𝜋 𝜋
𝐴4 𝐴4 𝐴4
= ∫ 1 𝑑𝜃 + ∫ 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜃)𝑑𝜃 + ∫ 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)𝑑𝜃
8𝜋 8𝜋 8𝜋
−𝜋 −𝜋 −𝜋
𝜋
𝐴4
+ ∫ 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜃)𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)𝑑𝜃
8𝜋
−𝜋
𝜋 𝜋
𝐴4 𝜋
𝐴4 𝑠𝑖𝑛(2𝜔0 𝑡 + 2𝜃) 𝐴4 𝑠𝑖𝑛(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)
= [𝜃] + [ ] + [ ]
8𝜋 −𝜋 8𝜋 2 −𝜋
8𝜋 2 −𝜋
𝜋
𝐴4
+ ∫[𝑐𝑜𝑠(4𝜔0 𝑡 + 2𝜔0 𝜏 + 4𝜃) + 𝑐𝑜𝑠(2𝜔0 𝜏)] 𝑑𝜃
16𝜋
−𝜋

𝐴4 𝐴4
= [𝜋 − (−𝜋)] + [𝑠𝑖𝑛(2𝜋 + 2𝜔0 𝑡) − 𝑠𝑖𝑛(−2𝜋 + 2𝜔0 𝑡)] +
8𝜋 16𝜋
𝐴4
[𝑠𝑖𝑛(2𝜋 + 2𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(−2𝜋 + 2𝜔0 𝑡 + 2𝜔0 𝜏)] +
16𝜋
𝜋
𝐴4 𝑠𝑖𝑛(4𝜔0 𝑡 + 2𝜔0 𝜏 + 4𝜃)
[[ ] + 𝑐𝑜𝑠(2𝜔0 𝜏)[𝜃]𝜋−𝜋 ]
16𝜋 4 −𝜋

𝐴4 𝐴4 𝐴4
= [2𝜋] + [𝑠𝑖𝑛(2𝜔0 𝑡) − 𝑠𝑖𝑛(2𝜔0 𝑡)] + [𝑠𝑖𝑛(2𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(2𝜔0 𝑡 + 2𝜔0 𝜏)]
8𝜋 16𝜋 16𝜋

𝐴4 𝑠𝑖𝑛(4𝜋 + 4𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(−4𝜋 + 4𝜔0 𝑡 + 2𝜔0 𝜏)


+ [ + 𝑐𝑜𝑠(2𝜔0 𝜏)[𝜋 − (−𝜋)]]
16𝜋 4


𝐴4 𝐴4 𝐴4
= + [0] + [0]
4 16𝜋 16𝜋
𝐴4 𝑠𝑖𝑛(4𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(4𝜔0 𝑡 + 2𝜔0 𝜏)
+ [ + 𝑐𝑜𝑠(2𝜔0 𝜏)[2𝜋]]
16𝜋 4

𝐴4 𝐴4
= +0+0+ [0 + 2𝜋 𝑐𝑜𝑠(2𝜔0 𝜏)]
4 16𝜋
𝐴4 𝐴4
= + (𝑐𝑜𝑠(2𝜔0 𝜏))
4 8
𝐴4
= [2 + 𝑐𝑜𝑠(2𝜔0 𝜏)]
8
𝑨𝟒
∴ 𝑹𝒀𝒀 (𝒕, 𝒕 + 𝝉) = 𝑹𝒀𝒀 (𝝉) = [𝟐 + 𝒄𝒐𝒔(𝟐𝝎𝟎 𝝉)] ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆
𝟖
∴ The new random process 𝑌(𝑡) is also Wide Sense Stationary.

(iii) Cross Correlation Function of𝑋(𝑡)and𝑌(𝑡):

𝑅𝑋𝑌 (𝑡, 𝑡 + 𝜏) = 𝐸[𝑋(𝑡)𝑌(𝑡 + 𝜏)]


= ∫ 𝑥(𝑡)𝑦(𝑡 + 𝜏)𝑓𝜃 (𝜃) 𝑑𝜃


−∞
𝜋
1
= ∫[𝐴 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃)𝐴2 𝑐𝑜𝑠 2 (𝜔0 𝑡 + 𝜔0 𝜏 + 𝜃)] ( ) 𝑑𝜃
2𝜋
−𝜋
𝜋
𝐴3 1 + 𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)
= ∫(𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃)) ( ) 𝑑𝜃
2𝜋 2
−𝜋
𝜋
𝐴3
= ∫[𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃) + 𝑐𝑜𝑠(𝜔0 𝑡 + 𝜃)𝑐𝑜𝑠(2𝜔0 𝑡 + 2𝜔0 𝜏 + 2𝜃)] 𝑑𝜃
4𝜋
−𝜋


= (A³/4π) ∫_{−π}^{π} [ cos(ω₀t + θ) + (1/2)( cos(3ω₀t + 2ω₀τ + 3θ) + cos(ω₀t + 2ω₀τ + θ) ) ] dθ

[∵ 2CosA CosB = Cos(A + B) + Cos(A − B)]

= (A³/4π) [sin(ω₀t + θ)]_{−π}^{π} + (A³/8π) [sin(3ω₀t + 2ω₀τ + 3θ)/3]_{−π}^{π} + (A³/8π) [sin(ω₀t + 2ω₀τ + θ)]_{−π}^{π}

𝐴3
= [𝑠𝑖𝑛(𝜋 + 𝜔0 𝑡) − 𝑠𝑖𝑛(−𝜋 + 𝜔0 𝑡)]
4𝜋
𝐴3
+ [𝑠𝑖𝑛(3𝜋 + 3𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(−3𝜋 + 3𝜔0 𝑡 + 2𝜔0 𝜏)]
24𝜋
𝐴3
+ [𝑠𝑖𝑛(𝜋 + 𝜔0 𝑡 + 2𝜔0 𝜏) − 𝑠𝑖𝑛(−𝜋 + 𝜔0 𝑡 + 2𝜔0 𝜏)]
8𝜋
𝐴3 𝐴3
= [−𝑠𝑖𝑛𝜔0 𝑡 + 𝑠𝑖𝑛𝜔0 𝑡] + [−𝑠𝑖𝑛(3𝜔0 𝑡 + 2𝜔0 𝜏) + 𝑠𝑖𝑛(3𝜔0 𝑡 + 2𝜔0 𝜏)]
4𝜋 24𝜋
𝐴3
+ [−𝑠𝑖𝑛(𝜔0 𝑡 + 2𝜔0 𝜏) + 𝑠𝑖𝑛(𝜔0 𝑡 + 2𝜔0 𝜏)]
8𝜋
𝐴3 𝐴3 𝐴3
= [0] + [0] + [0]
4𝜋 24𝜋 8𝜋
=0

∴ 𝑹𝑿𝒀 (𝒕, 𝒕 + 𝝉) = 𝟎 ⇒ 𝑰𝒏𝒅𝒆𝒑𝒆𝒏𝒅𝒆𝒏𝒕 𝒐𝒏 𝒕𝒊𝒎𝒆

(iv) Yes, 𝑋(𝑡)and𝑌(𝑡)are Wide Sense Stationary.

(v) Yes, 𝑋(𝑡) and 𝑌(𝑡) are Jointly wide sense stationary.
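A Monte Carlo cross-check of the main results of this problem (assumed sketch; A, ω₀, t and τ are arbitrary values) is given below; the sample averages over Θ should approach R_XX(τ) = (A²/2)cos(ω₀τ), R_YY(τ) = (A⁴/8)[2 + cos(2ω₀τ)] and R_XY = 0.

```python
import numpy as np

rng = np.random.default_rng(6)
A, w0, t, tau = 2.0, 3.0, 0.7, 0.35          # arbitrary assumed values
theta = rng.uniform(-np.pi, np.pi, 500_000)

X = lambda s: A * np.cos(w0 * s + theta)     # ensemble of X(s) over Theta
Y = lambda s: X(s) ** 2                      # Y(s) = X(s)^2

print("R_XX ~", round(np.mean(X(t) * X(t + tau)), 3),
      " theory:", round(A**2 / 2 * np.cos(w0 * tau), 3))
print("R_YY ~", round(np.mean(Y(t) * Y(t + tau)), 3),
      " theory:", round(A**4 / 8 * (2 + np.cos(2 * w0 * tau)), 3))
print("R_XY ~", round(np.mean(X(t) * Y(t + tau)), 3), " theory: 0")
```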
