Random Variable and Stochastic Process
UNIT-III
STOCHASTIC PROCESS-TEMPORAL CHARACTERISTICS
INTRODUCTION
A random variable is a real-valued function that assigns a numerical value to each outcome of a physical experiment. When time is added to a random variable, the result is called a random process.
Random processes are used to describe the time-varying nature of random variables. They describe the statistical behaviour of various real-time signals such as speech, noise, atmospheric signals, etc.
Random processes are denoted by X(t, s) or X(t). If time is fixed, i.e., if any specific time instant t1 is taken, the random process becomes a random variable.
First order Distribution and Density Function
The first order distribution function of a random process is defined as
F_X(x1; t1) = P{X(t1) ≤ x1}
Similarly, the first order density function of the random process is
f_X(x1; t1) = dF_X(x1; t1) / dx1
Second order Distribution and Density Function
For two random variables X(t1) = X1 and X(t2) = X2 taken at time instants t1 and t2, the second order (joint) distribution function of a random process is defined as
F_X(x1, x2; t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}
and the second order (joint) density function is
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2) / ∂x1 ∂x2
In general, the nth order density function is
f_X(x1, x2, …, xn; t1, t2, …, tn) = ∂ⁿF_X(x1, x2, …, xn; t1, t2, …, tn) / ∂x1 ∂x2 … ∂xn
First order Stationary Process
A random process X(t) is said to be first order stationary if its first order density function does not change with a shift in the time origin:
f_X(x1; t1) = f_X(x1; t1 + Δ)
Whenever a random process is first order stationary, its average value (mean) is constant over time:
E[X(t1)] = E[X(t2)] = E[X(t1 + Δ)] = constant
Second order Stationary Process
A random process X(t) is said to be second order stationary if its second order density function does not change with a shift in the time origin:
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ)
Let R_XX(t1, t2) = E[X(t1) X(t2)] denote the correlation between the two random variables X1 and X2 taken at time instants t1 and t2. For a second order stationary process this autocorrelation function depends only on the time difference τ = t2 − t1, not on absolute time:
R_XX(t1, t2) = R_XX(τ)
A process whose mean is constant, E[X(t)] = constant, and whose autocorrelation depends only on τ is called wide sense stationary; every second order stationary process satisfies both conditions.
nth order Stationary Process
A random process X(t) is said to be nth order stationary if its nth order density function does not change with a shift in the time origin:
f_X(x1, x2, …, xn; t1, t2, …, tn) = f_X(x1, x2, …, xn; t1 + Δ, t2 + Δ, …, tn + Δ)
A random process that is nth order stationary for every n is also called strict sense stationary.
TIME AVERAGES:
A random process is also characterized by time-average functions along with statistical averages. The statistical average of a random process is calculated by considering all sample functions at a given time, whereas a time average is calculated along a single sample function. The time-average operator is defined as
A[·] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [·] dt
Here A is used to denote the time average in a manner analogous to E for the statistical average.
The time autocorrelation function measures the similarity of a sample function with a shifted version of itself within a single random process:
ℜ_xx(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t + τ) dt
The time cross-correlation function measures the similarity between sample functions of two different random processes:
ℜ_xy(τ) = A[x(t) y(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) y(t + τ) dt
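As a numerical illustration (a minimal sketch, not from the notes: the sample function, sampling step, and helper names are our own choices), the time average and time autocorrelation can be estimated from one discretely sampled sample function by replacing the limiting integral with a long discrete mean:

```python
import numpy as np

def time_average(x):
    """Discrete estimate of A[x(t)] over one long sample function."""
    return np.mean(x)

def time_autocorrelation(x, lag):
    """Discrete estimate of the time autocorrelation at a given sample lag."""
    if lag == 0:
        return np.mean(x * x)
    return np.mean(x[:-lag] * x[lag:])

# Example sample function: x(t) = cos(2t), sampled at dt = 0.01 over a long window.
dt = 0.01
t = np.arange(0.0, 2000.0, dt)
x = np.cos(2.0 * t)

print(time_average(x))               # ≈ 0
print(time_autocorrelation(x, 100))  # ≈ 0.5 cos(2 · 1.0), since τ = 100 · dt = 1.0
```

For this deterministic cosine the time average tends to 0 and the time autocorrelation to (1/2) cos(2τ), matching what the limiting integrals give analytically.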
Ergodic Theorem
This theorem states that all time averages x̄ and ℜ_xx(τ) of an ergodic random process are equal to the corresponding statistical averages X̄ and R_XX(τ):
E[X(t)] = A[x(t)], i.e., X̄ = x̄
A random process is said to be mean ergodic (or ergodic in the mean) if the time average of a sample function x(t) equals the statistical average of X(t):
E[X(t)] = A[x(t)] ⇒ X̄ = x̄
Proof:
We know that
R_XX(τ) = E[X(t) X(t + τ)]
Let τ = 0:
R_XX(0) = E[X(t) X(t)] = E[X²(t)]
∴ R_XX(0) equals the mean-square value of the process.
Proof: Let X(t) be a wide sense stationary process with X(t1) = X1 and X(t2) = X2, and let t1 = t and t2 = t1 + τ = t + τ.
We know that
7. If X(t) is ergodic, zero mean, and has no periodic components, then its autocorrelation function satisfies
lim_{|τ|→∞} R_XX(τ) = 0
Proof:
Since the process has no periodic components, the random variables X(t) and X(t + τ) become independent as |τ| → ∞, so
lim_{|τ|→∞} R_XX(τ) = E[X(t)] E[X(t + τ)] = (X̄)² = 0 [∵ zero mean]
8. Let W(t) be a random process such that W(t) = X(t) + Y(t). Then the autocorrelation function of the sum of the random processes is
R_WW(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)
Proof:
We know that
𝑅𝑋𝑋 (𝜏) = 𝐸[𝑋(𝑡) 𝑋(𝑡 + 𝜏)]
Given 𝑤(𝑡) = 𝑋(𝑡) + 𝑌(𝑡)
𝑅𝑊𝑊 (𝜏) = 𝐸[𝑊(𝑡) 𝑊(𝑡 + 𝜏)]
= 𝐸[(𝑋(𝑡) + 𝑌(𝑡)) (𝑋(𝑡 + 𝜏) + 𝑌(𝑡 + 𝜏))]
= E[X(t) X(t + τ)] + E[X(t) Y(t + τ)] + E[Y(t) X(t + τ)] + E[Y(t) Y(t + τ)]
𝑅𝑊𝑊 (𝜏) = 𝑅𝑋𝑋 (𝜏) + 𝑅𝑋𝑌 (𝜏) + 𝑅𝑌𝑋 (𝜏) + 𝑅𝑌𝑌 (𝜏)
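Because expectation is linear, this decomposition can also be checked on sample data (a sketch with assumed white-noise sequences standing in for X(t) and Y(t); the `corr` helper is our own):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example signals: two independent noise sequences for X(t) and Y(t).
n, lag = 100_000, 5
x = rng.normal(size=n)
y = rng.normal(size=n)
w = x + y  # W(t) = X(t) + Y(t)

def corr(a, b, lag):
    """Sample estimate of E[a(t) b(t + τ)] at a discrete lag."""
    return np.mean(a[:-lag] * b[lag:])

lhs = corr(w, w, lag)
rhs = corr(x, x, lag) + corr(x, y, lag) + corr(y, x, lag) + corr(y, y, lag)
print(lhs - rhs)  # 0 up to floating-point error: the identity is exact for the estimator too
```

The identity holds exactly even for finite-sample estimates, since the same linearity argument applies to the sample mean.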
Consider two random processes X(t) and Y(t) that are at least wide sense
stationary.
Proof:
We know that
R_XY(τ) = E[X(t) Y(t + τ)]
Replacing τ by −τ:
R_XY(−τ) = E[X(t) Y(t − τ)]
Substituting t − τ = u, so that t = u + τ:
R_XY(−τ) = E[X(u + τ) Y(u)] = E[Y(u) X(u + τ)] = R_YX(τ)
Proof:
Consider [Y(t + τ) ± α X(t)]² ≥ 0, so that
E[Y(t + τ) ± α X(t)]² ≥ 0
Expanding gives R_YY(0) ± 2α R_XY(τ) + α² R_XX(0) ≥ 0, a quadratic in α of the form aα² + bα + c with a = R_XX(0), b = ±2R_XY(τ), c = R_YY(0), whose roots are (−b ± √(b² − 4ac)) / 2a. Since the quadratic never becomes negative, it cannot have two distinct real roots, so b² − 4ac ≤ 0, which gives R_XY²(τ) ≤ R_XX(0) R_YY(0).
4. For two random processes X(t) and Y(t) having non-zero means and which are statistically independent,
R_XY(τ) = X̄ Ȳ
Proof:
We know that
R_XY(τ) = E[X(t) Y(t + τ)]
As X(t) and Y(t) are statistically independent, the expectation factors:
R_XY(τ) = E[X(t)] E[Y(t + τ)] = X̄ Ȳ
COVARIANCE FUNCTION
The auto-covariance function of a random process X(t) is defined as
C_XX(t, t + τ) = R_XX(t, t + τ) − E[X(t)] E[X(t + τ)]
Similarly, the cross-covariance function of two random processes X(t) and Y(t) is
C_XY(t, t + τ) = R_XY(t, t + τ) − E[X(t)] E[Y(t + τ)]
DESCRIPTIVE QUESTIONS
1. Explain numerous categories of random processes with examples.
2. Explain stationarity of random processes.
3. Interpret ergodic random processes.
4. Interpret the significance of time averages and ergodicity.
5. Choose necessary expressions to verify the properties of Auto correlation
function.
6. Choose relevant expressions to verify the properties of cross correlation
function.
7. Interpret the concepts of covariance with relevance to random processes.
PROBLEMS
1. A random process is described by X(t) = A, where A is a continuous random
variable and is uniformly distributed on (0,1). Show that X(t) is wide sense
stationary.
2. Verify whether the sine wave process X(t) = B sin(ω₀t), where B is a uniform random variable on (−1, 1), is wide sense stationary or not.
10. Show that X(t) and Y(t) are jointly WSS, if the random processes are X(t) = A cos(ω₁t + θ) and Y(t) = B cos(ω₂t + Φ), where A, B, ω₁ and ω₂ are constants, while θ and Φ are statistically independent uniform random variables on (0, 2π).
Sol:
E[X(t)] = ∫₀¹ A f_A(A) dA = ∫₀¹ A dA = 1/2 = constant
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= ∫_{−∞}^{∞} x(t) x(t + τ) f_A(A) dA
= ∫₀¹ A · A dA = ∫₀¹ A² dA
= [A³/3]₀¹ = 1/3 ⇒ independent of time
Hence X(t) is wide sense stationary.
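A quick simulation supports this (a sketch; the sample count and seed are arbitrary choices of ours): each realization of the process is the constant A, so ensemble averages reduce to averages over draws of A.

```python
import numpy as np

rng = np.random.default_rng(1)

# X(t) = A for all t, with A ~ Uniform(0, 1): draw many realizations of A.
A = rng.uniform(0.0, 1.0, size=500_000)

mean = A.mean()            # E[X(t)] → 1/2, the same for every t
autocorr = np.mean(A * A)  # R_XX(t, t+τ) = E[A²] → 1/3, for every t and τ
print(mean, autocorr)
```

Neither estimate involves t or τ at all, which is exactly why this process is wide sense stationary.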
Sol: Given X(t) = A cos(ω₀t + θ), where θ is a uniform random variable on (0, 2π).
A random process is wide sense stationary if E[X(t)] = constant and R_XX(t, t + τ) is independent of t.
From f_X(x) = 1/(b − a), a ≤ X ≤ b:
f_θ(θ) = 1/(2π − 0) = 1/2π
E[X(t)] = ∫₀^{2π} A cos(ω₀t + θ) (1/2π) dθ
= (A/2π) [sin(ω₀t + θ)]₀^{2π}
= (A/2π) [sin(ω₀t + 2π) − sin(ω₀t + 0)]
= (A/2π) [sin(ω₀t) − sin(ω₀t)] = 0 = constant
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= (1/2π) ∫₀^{2π} A cos(ω₀t + θ) · A cos(ω₀(t + τ) + θ) dθ
= (1/2π) ∫₀^{2π} A cos(ω₀t + θ) · A cos(ω₀t + θ + ω₀τ) dθ
= (A²/4π) ∫₀^{2π} [cos(2ω₀t + 2θ + ω₀τ) + cos(ω₀τ)] dθ
= (A²/4π) [ [sin(2ω₀t + 2θ + ω₀τ)/2]₀^{2π} + cos(ω₀τ) [θ]₀^{2π} ]
= (A²/4π) [0 + 2π cos(ω₀τ)]
= (A²/2) cos(ω₀τ) ⇒ independent of time
Hence the given random process is WSS.
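The same conclusion can be checked by ensemble simulation (a sketch; the values of A, ω₀, τ and the two test times are arbitrary choices of ours): for a phase uniform on (0, 2π), the estimates match 0 and (A²/2) cos(ω₀τ) at any absolute time t.

```python
import numpy as np

rng = np.random.default_rng(2)
A, w0, tau = 2.0, 3.0, 0.7
theta = rng.uniform(0.0, 2.0 * np.pi, size=400_000)  # random phase ensemble

for t in (0.0, 1.3):  # two different absolute times
    x_t = A * np.cos(w0 * t + theta)
    x_t_tau = A * np.cos(w0 * (t + tau) + theta)
    # Mean ≈ 0 and autocorrelation ≈ (A²/2) cos(w0 τ), independent of t
    print(np.mean(x_t), np.mean(x_t * x_t_tau))
```

Both printed pairs agree (up to sampling noise) despite the different values of t, which is the WSS property in action.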
3. Verify whether the sine wave process X(t) = B sin(ω₀t), where B is a uniform random variable on (−1, 1), is wide sense stationary or not.
Sol:
A random process is wide sense stationary if
E[x(t)] = constant
R_XX(τ) = E[x(t) x(t + τ)] is independent of time.
Here B is uniform on (−1, 1), so f_B(B) = 1/2.
E[x(t)] = ∫_{−∞}^{∞} x(t) f_B(B) dB
= (sin ω₀t / 2) ∫_{−1}^{1} B dB
= (sin ω₀t / 2) [B²/2]_{−1}^{1}
= (sin ω₀t / 2) [1/2 − 1/2] = 0 = constant
E[x(t) x(t + τ)] = ∫_{−∞}^{∞} x(t) x(t + τ) f_B(B) dB
= sin(ω₀t) sin(ω₀(t + τ)) · (1/2) ∫_{−1}^{1} B² dB
Using sinA sinB = (cos(A − B) − cos(A + B))/2:
= ((cos(ω₀τ) − cos(2ω₀t + ω₀τ))/4) [B³/3]_{−1}^{1}
= (1/6) (cos(ω₀τ) − cos(2ω₀t + ω₀τ)) ⇒ dependent on time
Hence X(t) is not wide sense stationary.
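Numerically, the time dependence is easy to see (a sketch; ω₀, τ and the test times are arbitrary choices of ours): the ensemble estimate of E[X(t) X(t+τ)] tracks (1/3) sin(ω₀t) sin(ω₀(t+τ)) and changes with t.

```python
import numpy as np

rng = np.random.default_rng(3)
w0, tau = 2.0, 0.5
B = rng.uniform(-1.0, 1.0, size=400_000)  # amplitude ensemble

def R(t):
    """Ensemble estimate of E[X(t) X(t + τ)] for X(t) = B sin(w0 t)."""
    return np.mean(B * np.sin(w0 * t) * B * np.sin(w0 * (t + tau)))

# Theory: (1/3) sin(w0 t) sin(w0 (t + τ)) — changes with absolute time t
print(R(0.2), R(1.0))
```

The two printed values differ, so the autocorrelation depends on absolute time and the process is not WSS.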
Sol: Given,
Y(t) = X(t) − X(t + τ)
where X(t) is wide sense stationary and E[X(t)] ≠ 0.
(i) Mean of Y(t):
E[Y(t)] = E[X(t) − X(t + τ)] = E[X(t)] − E[X(t + τ)] = X̄ − X̄ [∵ X(t) is WSS]
∴ E[Y(t)] = 0
(ii) Variance of Y(t):
σ_Y² = E[Y²(t)] − (E[Y(t)])²
= E[(X(t) − X(t + τ))²] − 0 [∵ E[Y(t)] = 0]
= E[X²(t)] + E[X²(t + τ)] − 2E[X(t) X(t + τ)]
= R_XX(0) + R_XX(0) − 2R_XX(τ) [∵ R_XX(0) = E[X²(t)]]
∴ σ_Y² = 2[R_XX(0) − R_XX(τ)]
(iii) Given,
Y(t) = X(t) + X(t + τ)
Now,
E[Y(t)] = E[X(t) + X(t + τ)]
= E[X(t)] + E[X(t + τ)]
= E[X(t)] + E[X(t)] [∵ X(t) is WSS]
∴ E[Y(t)] = 2 E[X(t)]
Now,
σ_Y² = E[Y²(t)] − (E[Y(t)])²
= E[(X(t) + X(t + τ))²] − (2 E[X(t)])² [∵ E[Y(t)] = 2 E[X(t)]]
= R_XX(0) + R_XX(0) + 2R_XX(τ) − 4(E[X(t)])² [∵ R_XX(0) = E[X²(t)]]
∴ σ_Y² = 2[R_XX(0) + R_XX(τ)] − 4(E[X(t)])²
Given,
E[A] = E[B] = 0 and E[AB] = 0
Now,
σ_A² = E[A²] and σ_B² = E[B²]
E[X(t)] = constant
Now,
R_XX(t, t + τ) = σ² cos(ω₀τ) = R_XX(τ) ⇒ independent of time
Sol: Given,
Since A and B are statistically independent,
∴ E[AB] = E[A] E[B] = 0
σ_A² = E[A²] − (E[A])² = E[A²] − 0 = E[A²]
Let σ_A² = E[A²] = σ²
σ_B² = E[B²] − 0 = E[B²]
∴ σ_B² = E[B²] = σ²
Calculation of Mean:
∴ E[X(t)] = 0 ⇒ constant
∴ E[Y(t)] = 0 ⇒ constant
R_XX(τ) = σ² cos(ω₀τ)
R_YY(τ) = σ² cos(ω₀τ)
R_XY(τ) = σ² sin(−ω₀τ) = −σ² sin(ω₀τ)
Sol: Given,
X(t) = A cos(ω₀t), where A is uniform on (0, 1).
From f_X(x) = 1/(b − a), a ≤ X ≤ b:
f_A(A) = 1/(1 − 0) = 1
E[X(t) X(t + τ)] = cos(ω₀t) cos(ω₀t + ω₀τ) ∫₀¹ A² dA
= (1/2)(cos(2ω₀t + ω₀τ) + cos(ω₀τ)) [A³/3]₀¹ [∵ 2cosA cosB = cos(A + B) + cos(A − B)]
= (1/2)(cos(2ω₀t + ω₀τ) + cos(ω₀τ)) (1/3)
∴ E[X(t) X(t + τ)] = (1/6)(cos(2ω₀t + ω₀τ) + cos(ω₀τ))
∴ R_XX(t, t + τ) = (1/6)(cos(2ω₀t + ω₀τ) + cos(ω₀τ)) ⇒ dependent on time
Hence X(t) is not wide sense stationary.
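A simulation confirms the time dependence (a sketch; ω₀, τ and the test times are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(4)
w0, tau = 2.0, 0.5
A = rng.uniform(0.0, 1.0, size=400_000)  # amplitude ensemble

def R(t):
    """Ensemble estimate of E[X(t) X(t + τ)] for X(t) = A cos(w0 t)."""
    return np.mean(A * np.cos(w0 * t) * A * np.cos(w0 * (t + tau)))

# Theory: (1/6)(cos(2 w0 t + w0 τ) + cos(w0 τ)) — depends on absolute time t
for t in (0.0, 0.8):
    print(R(t), (np.cos(2 * w0 * t + w0 * tau) + np.cos(w0 * tau)) / 6.0)
```

Each printed pair agrees internally, but the value changes between the two times t, matching the analytic result above.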
8. Two statistically independent zero mean random processes 𝑋(𝑡), 𝑌(𝑡)
have autocorrelation functions 𝑅𝑋𝑋 (𝜏) = 𝑒𝑥𝑝(−|𝜏|), 𝑅𝑌𝑌 (𝜏) = cos(2𝜋𝜏)
respectively. Evaluate the
(i) Autocorrelation of the Sum𝑊1 (𝑡) = 𝑋(𝑡) + 𝑌(𝑡)
(ii) Autocorrelation of the Difference 𝑊2 (𝑡) = 𝑋(𝑡) − 𝑌(𝑡)
(iii) Crosscorrelation of 𝑊1 (𝑡) and 𝑊2 (𝑡)
Sol: Given, X(t) and Y(t) are statistically independent and zero mean, so
R_XY(τ) = E[X(t)] E[Y(t + τ)] = (0)(E[Y(t + τ)]) = 0 = R_YX(τ)
(i) R_W1W1(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ) = exp(−|τ|) + 0 + 0 + cos(2πτ)
(ii) R_W2W2(τ) = R_XX(τ) − R_XY(τ) − R_YX(τ) + R_YY(τ) = exp(−|τ|) − 0 − 0 + cos(2πτ)
(iii) R_W1W2(τ) = R_XX(τ) − R_XY(τ) + R_YX(τ) − R_YY(τ) = exp(−|τ|) − 0 + 0 − cos(2πτ)
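With the given autocorrelations, the three results can be assembled directly on a grid of τ values (a sketch; the grid itself is an arbitrary choice of ours):

```python
import numpy as np

tau = np.linspace(-2.0, 2.0, 401)
Rxx = np.exp(-np.abs(tau))
Ryy = np.cos(2.0 * np.pi * tau)
Rxy = np.zeros_like(tau)  # zero-mean, independent processes ⇒ R_XY = R_YX = 0

Rw1w1 = Rxx + Rxy + Rxy + Ryy   # (i)  autocorrelation of W1 = X + Y
Rw2w2 = Rxx - Rxy - Rxy + Ryy   # (ii) autocorrelation of W2 = X − Y
Rw1w2 = Rxx - Rxy + Rxy - Ryy   # (iii) cross-correlation of W1 and W2
print(Rw1w1[200], Rw2w2[200], Rw1w2[200])  # values at τ = 0: 2.0 2.0 0.0
```

At τ = 0 the two autocorrelations give the total powers of W1 and W2 (both 2), while the cross-correlation vanishes there.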
Sol: Given,
X̄ = 6 ⇒ constant
Mean: X̄ = 6 ⇒ constant, verified using the property
lim_{|τ|→∞} R_XX(τ) = (X̄)²
Now,
lim_{|τ|→∞} R_XX(τ) = 36 + 25 exp(−∞) = 36 + 25(0) = 36 = 6²
∴ lim_{|τ|→∞} R_XX(τ) = (X̄)² = 6²
(v) AC Power:
σ²_X(t) = R_XX(0) − (X̄)² = 61 − 6² = 61 − 36 = 25
∴ AC Power = 25 W
Sol: Given,
X(t) = A cos(ω₁t + θ)
Y(t) = B cos(ω₂t + Φ)
For X(t) and Y(t) to be jointly WSS, each process must be WSS and the cross-correlation R_XY(t, t + τ) must be independent of time.
Mean of X(t):
E[X(t)] = ∫_{−∞}^{∞} x(t) f_θ(θ) dθ
= A ∫₀^{2π} cos(ω₁t + θ) (1/2π) dθ
= (A/2π) [sin(ω₁t + θ)]₀^{2π}
= (A/2π) [sin(2π + ω₁t) − sin(ω₁t)]
= (A/2π) [sin(ω₁t) − sin(ω₁t)]
= (A/2π) (0)
∴ E[X(t)] = 0 ⇒ constant
Autocorrelation of X(t):
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= A² ∫₀^{2π} cos(ω₁t + θ) cos(ω₁t + ω₁τ + θ) (1/2π) dθ
= (A²/4π) ∫₀^{2π} [cos(2ω₁t + ω₁τ + 2θ) + cos(ω₁τ)] dθ
= (A²/4π) [sin(2ω₁t + ω₁τ)/2 + 2π cos(ω₁τ) − sin(2ω₁t + ω₁τ)/2]
= (A²/4π) [2π cos(ω₁τ)]
∴ R_XX(t, t + τ) = (A²/2) cos(ω₁τ) ⇒ independent of time
Mean of Y(t):
E[Y(t)] = ∫_{−∞}^{∞} y(t) f_Φ(Φ) dΦ
= B ∫₀^{2π} cos(ω₂t + Φ) (1/2π) dΦ
= (B/2π) [sin(ω₂t + Φ)]₀^{2π}
= (B/2π) [sin(2π + ω₂t) − sin(ω₂t)]
= (B/2π) [sin(ω₂t) − sin(ω₂t)]
= (B/2π) (0)
∴ E[Y(t)] = 0 ⇒ constant
Autocorrelation of Y(t):
R_YY(t, t + τ) = E[Y(t) Y(t + τ)]
= B² ∫₀^{2π} cos(ω₂t + Φ) cos(ω₂t + ω₂τ + Φ) (1/2π) dΦ
= (B²/4π) ∫₀^{2π} [cos(2ω₂t + ω₂τ + 2Φ) + cos(ω₂τ)] dΦ
= (B²/4π) [sin(2π + 2ω₂t + ω₂τ)/2 + 2π cos(ω₂τ) − (sin(2ω₂t + ω₂τ)/2 + 0)]
= (B²/4π) [sin(2ω₂t + ω₂τ)/2 + 2π cos(ω₂τ) − sin(2ω₂t + ω₂τ)/2]
= (B²/4π) [2π cos(ω₂τ)]
∴ R_YY(t, t + τ) = (B²/2) cos(ω₂τ) ⇒ independent of time
Cross Correlation of X(t) and Y(t):
R_XY(t, t + τ) = E[X(t) Y(t + τ)]
= E[A cos(ω₁t + θ) · B cos(ω₂(t + τ) + Φ)]
= AB E[cos(ω₁t + θ)] E[cos(ω₂(t + τ) + Φ)] [∵ θ and Φ are statistically independent]
= AB (0)(0) = 0 ⇒ independent of time
∴ X(t) and Y(t) are jointly wide sense stationary.
Sol: Given,
X(t) = A cos(ω₀t + θ), where θ is a uniform random variable on (−π, π), so f_θ(θ) = 1/2π.
E[X(t)] = (A/2π) [sin(ω₀t + θ)]_{−π}^{π}
= (A/2π) [sin(π + ω₀t) − sin(−π + ω₀t)]
= (A/2π) [−sin(ω₀t) + sin(ω₀t)]
∴ E[X(t)] = 0 ⇒ constant
R_XX(t, t + τ) = (A²/4π) [sin(2ω₀t + ω₀τ)/2 + π cos(ω₀τ) − (sin(2ω₀t + ω₀τ)/2 − π cos(ω₀τ))]
= (A²/4π) [π cos(ω₀τ) + π cos(ω₀τ)]
= (A²/4π) [2π cos(ω₀τ)]
∴ R_XX(t, t + τ) = R_XX(τ) = (A²/2) cos(ω₀τ) ⇒ independent of time
∴ The random process X(t) is Wide Sense Stationary.
Given,
Y(t) = X²(t)
E[Y(t)] = E[X²(t)] = R_XX(0)
= (A²/2) cos(ω₀(0))
= (A²/2)(1)
∴ E[Y(t)] = A²/2 ⇒ constant
Auto Correlation Function of Y(t):
R_YY(t, t + τ) = E[Y(t) Y(t + τ)] = ∫_{−∞}^{∞} x²(t) x²(t + τ) f_θ(θ) dθ
= ∫_{−π}^{π} [A² cos²(ω₀t + θ) · A² cos²(ω₀t + ω₀τ + θ)] (1/2π) dθ
= (A⁴/2π) ∫_{−π}^{π} ((1 + cos(2ω₀t + 2θ))/2) ((1 + cos(2ω₀t + 2ω₀τ + 2θ))/2) dθ
= (A⁴/8π) ∫_{−π}^{π} 1 dθ + (A⁴/8π) ∫_{−π}^{π} cos(2ω₀t + 2θ) dθ + (A⁴/8π) ∫_{−π}^{π} cos(2ω₀t + 2ω₀τ + 2θ) dθ + (A⁴/8π) ∫_{−π}^{π} cos(2ω₀t + 2θ) cos(2ω₀t + 2ω₀τ + 2θ) dθ
= (A⁴/8π) [θ]_{−π}^{π} + (A⁴/8π) [sin(2ω₀t + 2θ)/2]_{−π}^{π} + (A⁴/8π) [sin(2ω₀t + 2ω₀τ + 2θ)/2]_{−π}^{π} + (A⁴/16π) ∫_{−π}^{π} [cos(4ω₀t + 2ω₀τ + 4θ) + cos(2ω₀τ)] dθ
= (A⁴/8π) [π − (−π)] + (A⁴/16π) [sin(2π + 2ω₀t) − sin(−2π + 2ω₀t)] + (A⁴/16π) [sin(2π + 2ω₀t + 2ω₀τ) − sin(−2π + 2ω₀t + 2ω₀τ)] + (A⁴/16π) [ [sin(4ω₀t + 2ω₀τ + 4θ)/4]_{−π}^{π} + cos(2ω₀τ) [θ]_{−π}^{π} ]
= (A⁴/8π)(2π) + (A⁴/16π)(0) + (A⁴/16π)(0) + (A⁴/16π) [0 + 2π cos(2ω₀τ)]
= A⁴/4 + (A⁴/8) cos(2ω₀τ)
= (A⁴/8) [2 + cos(2ω₀τ)]
∴ R_YY(t, t + τ) = R_YY(τ) = (A⁴/8) [2 + cos(2ω₀τ)] ⇒ independent of time
∴ The new random process 𝑌(𝑡) is also Wide Sense Stationary.
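This can be confirmed by simulation (a sketch; the values of A, ω₀, τ and the test time t are arbitrary choices of ours): the ensemble estimate of E[Y(t) Y(t+τ)] matches (A⁴/8)(2 + cos 2ω₀τ) at any t.

```python
import numpy as np

rng = np.random.default_rng(5)
A, w0, tau, t = 1.5, 2.0, 0.6, 0.9                # arbitrary test values
theta = rng.uniform(-np.pi, np.pi, size=400_000)  # phase uniform on (−π, π)

y_t = (A * np.cos(w0 * t + theta)) ** 2           # Y(t) = X²(t)
y_t_tau = (A * np.cos(w0 * (t + tau) + theta)) ** 2

est = np.mean(y_t * y_t_tau)
theory = (A**4 / 8.0) * (2.0 + np.cos(2.0 * w0 * tau))
print(est, theory)  # the two agree, for any choice of t
```

Changing t leaves both the estimated mean (≈ A²/2) and the estimated autocorrelation unchanged, which is exactly the WSS claim for Y(t).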
Cross Correlation of X(t) and Y(t):
R_XY(t, t + τ) = E[X(t) Y(t + τ)]
= (A³/4π) ∫_{−π}^{π} [cos(ω₀t + θ) + (1/2)(cos(3ω₀t + 2ω₀τ + 3θ) + cos(ω₀t + 2ω₀τ + θ))] dθ
= (A³/4π) [sin(π + ω₀t) − sin(−π + ω₀t)] + (A³/24π) [sin(3π + 3ω₀t + 2ω₀τ) − sin(−3π + 3ω₀t + 2ω₀τ)] + (A³/8π) [sin(π + ω₀t + 2ω₀τ) − sin(−π + ω₀t + 2ω₀τ)]
= (A³/4π) [−sin ω₀t + sin ω₀t] + (A³/24π) [−sin(3ω₀t + 2ω₀τ) + sin(3ω₀t + 2ω₀τ)] + (A³/8π) [−sin(ω₀t + 2ω₀τ) + sin(ω₀t + 2ω₀τ)]
= 0 ⇒ independent of time
(v) Yes, 𝑋(𝑡) and 𝑌(𝑡) are Jointly wide sense stationary.