
Discrete-time Random Signals

• Until now, we have assumed that signals are deterministic, i.e., each value of a sequence is uniquely determined.
• In many situations, the processes that generate signals are so complex as to make a precise description of a signal extremely difficult or undesirable.
• A random (or stochastic) signal is instead characterized by a set of probability density functions.
Stochastic Processes

• Random (or stochastic) process (or signal):
  • A random process is an indexed family of random variables characterized by a set of probability distribution functions.
  • Consider a sequence x[n], -\infty < n < \infty. Each individual sample x[n] is assumed to be an outcome of some underlying random variable X_n.
  • The difference between a single random variable and a random process is that for a random variable the outcome of a random-sampling experiment is mapped into a number, whereas for a random process the outcome is mapped into a sequence.
Stochastic Processes (continue)

• Probability density function of x[n]:
  p_{x_n}(x_n, n)
• Joint distribution of x[n] and x[m]:
  p_{x_n, x_m}(x_n, n, x_m, m)
• E.g., x_1[n] = A_n \cos(\omega n + \phi_n), where A_n and \phi_n are random variables for all -\infty < n < \infty; then x_1[n] is a random process (see the sketch below).
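
To make the ensemble view concrete, here is a minimal NumPy sketch (not from the original slides) that generates realizations of such a process; the uniform distributions chosen for the amplitude and phase are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
w = 0.2 * np.pi                 # assumed fixed frequency for the example
n = np.arange(64)               # time indices of each sample sequence

def sample_sequence():
    """One outcome of the experiment maps to a whole sequence x1[n]."""
    A = rng.uniform(0.0, 1.0)           # assumed amplitude distribution
    phi = rng.uniform(0.0, 2 * np.pi)   # assumed phase distribution
    return A * np.cos(w * n + phi)

ensemble = np.array([sample_sequence() for _ in range(1000)])
print(ensemble.shape)           # (1000, 64): 1000 realizations of the process
```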
Independence and Stationarity

• x[n] and x[m] are independent iff
  p_{x_n, x_m}(x_n, n, x_m, m) = p_{x_n}(x_n, n)\, p_{x_m}(x_m, m)
• x is a stationary process iff
  p_{x_{n+k}, x_{m+k}}(x_{n+k}, n+k, x_{m+k}, m+k) = p_{x_n, x_m}(x_n, n, x_m, m)
  for all k.
• That is, the joint distribution of x[n] and x[m] depends only on the time difference m - n.


Stationarity (continue)

• In particular, when m = n for a stationary process:
  p_{x_{n+k}}(x_{n+k}, n+k) = p_{x_n}(x_n, n)
• This implies that the (first-order) distribution of x[n] is shift invariant.


Stochastic Processes vs. Deterministic Signals

• In many applications of discrete-time signal processing, random processes serve as models for signals, in the sense that a particular signal can be considered a sample sequence of a random process.
• Although such signals are unpredictable, so that a deterministic approach to signal representation is inappropriate, certain average properties of the ensemble can be determined, given the probability law of the process.
Expectation

• Mean (or average):
  m_{x_n} = E\{x_n\} = \int_{-\infty}^{\infty} x_n\, p_{x_n}(x_n, n)\, dx_n
  where E\{\cdot\} denotes the expectation operator.
• More generally, for a function g of the random variable,
  E\{g(x_n)\} = \int_{-\infty}^{\infty} g(x_n)\, p_{x_n}(x_n, n)\, dx_n
• For independent random variables (illustrated below),
  E\{x_n y_m\} = E\{x_n\}\, E\{y_m\}
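
A quick Monte Carlo sketch of the last property; the particular distributions are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

x = rng.normal(2.0, 1.0, N)    # assumed: Gaussian, E{x} = 2
y = rng.uniform(0.0, 1.0, N)   # assumed: uniform, E{y} = 0.5, independent of x

# For independent random variables the expectation factors:
print(np.mean(x * y))           # ~ 1.0
print(np.mean(x) * np.mean(y))  # ~ 1.0
```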
Mean Square Value and Variance

• Mean squared value:
  E\{|x_n|^2\} = \int_{-\infty}^{\infty} |x_n|^2\, p_{x_n}(x_n, n)\, dx_n
• Variance:
  \mathrm{var}\{x_n\} = E\{|x_n - m_{x_n}|^2\}
Autocorrelation and Autocovariance

• Autocorrelation (estimated numerically in the sketch below):
  \varphi_{xx}[n, m] = E\{x_n x_m^*\} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_n x_m^*\, p_{x_n, x_m}(x_n, n, x_m, m)\, dx_n\, dx_m
• Autocovariance:
  \gamma_{xx}[n, m] = E\{(x_n - m_{x_n})(x_m - m_{x_m})^*\} = \varphi_{xx}[n, m] - m_{x_n} m_{x_m}^*

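A minimal sketch estimating \varphi_{xx}[n, m] and \gamma_{xx}[n, m] by averaging over an ensemble of realizations, assuming the real-valued random-phase cosine from the earlier example (so the conjugates drop out):

```python
import numpy as np

rng = np.random.default_rng(2)
n = np.arange(32)
phi = rng.uniform(0.0, 2 * np.pi, (5000, 1))   # one random phase per realization
X = np.cos(0.2 * np.pi * n + phi)              # ensemble: 5000 realizations x 32 samples

m_hat = X.mean(axis=0)                         # mean m_x[n], one value per time index
phi_xx = (X.T @ X) / X.shape[0]                # phi_xx[n, m] = E{x_n x_m} (real process)
gamma_xx = phi_xx - np.outer(m_hat, m_hat)     # gamma_xx[n, m] = phi_xx - m_n m_m
print(phi_xx[0, 0], gamma_xx[0, 0])            # both ~ 0.5 for this process
```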

Stationary Process

• For a stationary process, the autocorrelation depends only on the time difference m - n.
• Thus, for a stationary process, we can write
  m_x = m_{x_n} = E\{x_n\}
  \sigma_x^2 = E\{|x_n - m_x|^2\}
• If we denote the time difference by k, we have
  \varphi_{xx}[n+k, n] = \varphi_{xx}[k] = E\{x_{n+k}\, x_n^*\}
Wide-sense Stationary

• In many instances, we encounter random processes that are not stationary in the strict sense.
• If the following equations hold, we call the process wide-sense stationary (w.s.s.):
  m_x = m_{x_n} = E\{x_n\}
  \sigma_x^2 = E\{|x_n - m_x|^2\}
  \varphi_{xx}[n+k, n] = \varphi_{xx}[k] = E\{x_{n+k}\, x_n^*\}
Time Averages

• For any single sample sequence x[n], define its time average to be
  \langle x_n \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x_n
• Similarly, the time-average autocorrelation is
  \langle x_{n+m}\, x_n^* \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x_{n+m}\, x_n^*
Ergodic Process

• A stationary random process for which time averages equal ensemble averages is called an ergodic process:
  \langle x_n \rangle = m_x
  \langle x_{n+m}\, x_n^* \rangle = \varphi_{xx}[m]



Ergodic Process (continue)

• It is common to assume that a given sequence is a sample sequence of an ergodic random process, so that averages can be computed from a single sequence.
• In practice, we cannot compute the limits; instead we compute the finite-length quantities
  \hat{m}_x = \frac{1}{L} \sum_{n=0}^{L-1} x[n]
  \hat{\sigma}_x^2 = \frac{1}{L} \sum_{n=0}^{L-1} |x[n] - \hat{m}_x|^2
  \langle x[n+m]\, x^*[n] \rangle_L = \frac{1}{L} \sum_{n=0}^{L-1} x[n+m]\, x^*[n]
• Such quantities are often computed as estimates of the mean, variance, and autocorrelation (see the sketch below).
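
A minimal sketch of these finite-length estimators applied to a single sample sequence; the biased (divide-by-L) autocorrelation estimate is one common convention, assumed here:

```python
import numpy as np

rng = np.random.default_rng(3)
L = 10_000
x = 1.5 + rng.normal(0.0, 2.0, L)     # assumed sequence: mean 1.5, variance 4

m_hat = x.mean()                      # estimate of m_x
var_hat = np.mean((x - m_hat) ** 2)   # estimate of sigma_x^2

def acorr_hat(x, m):
    """Biased estimate of <x[n+m] x[n]> from a single real sequence."""
    L = len(x)
    return np.dot(x[m:], x[:L - m]) / L

print(m_hat, var_hat)                 # ~ 1.5, ~ 4.0
print(acorr_hat(x, 0))                # ~ 6.25 = variance + mean^2
print(acorr_hat(x, 5))                # ~ 2.25 = mean^2 (samples uncorrelated)
```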
Properties of Correlation and Covariance Sequences

• For (jointly) wide-sense stationary processes x and y, define
  \varphi_{xx}[m] = E\{x[n+m]\, x^*[n]\}
  \gamma_{xx}[m] = E\{(x[n+m] - m_x)(x[n] - m_x)^*\}
  \varphi_{xy}[m] = E\{x[n+m]\, y^*[n]\}
  \gamma_{xy}[m] = E\{(x[n+m] - m_x)(y[n] - m_y)^*\}
• Property 1:
  \gamma_{xx}[m] = \varphi_{xx}[m] - |m_x|^2
  \gamma_{xy}[m] = \varphi_{xy}[m] - m_x m_y^*
Properties of Correlation and Covariance Sequences (continue)

• Property 2:
  \varphi_{xx}[0] = E\{|x[n]|^2\}  (mean squared value)
  \gamma_{xx}[0] = \sigma_x^2  (variance)
• Property 3 (symmetry):
  \varphi_{xx}[-m] = \varphi_{xx}^*[m],  \gamma_{xx}[-m] = \gamma_{xx}^*[m]
  \varphi_{xy}[-m] = \varphi_{yx}^*[m],  \gamma_{xy}[-m] = \gamma_{yx}^*[m]
  and for real processes,
  \varphi_{xx}[-m] = \varphi_{xx}[m],  \gamma_{xx}[-m] = \gamma_{xx}[m]
Properties of Correlation and Covariance Sequences (continue)

• Property 4:
  |\varphi_{xy}[m]|^2 \le \varphi_{xx}[0]\, \varphi_{yy}[0],  |\gamma_{xy}[m]|^2 \le \gamma_{xx}[0]\, \gamma_{yy}[0]
  |\varphi_{xx}[m]| \le \varphi_{xx}[0],  |\gamma_{xx}[m]| \le \gamma_{xx}[0]
Properties of Correlation and Covariance Sequences (continue)

• Property 5: if y[n] = x[n - n_0], then
  \varphi_{yy}[m] = \varphi_{xx}[m],  \gamma_{yy}[m] = \gamma_{xx}[m]
Fourier Transform Representation of Random Signals

• The autocorrelation and autocovariance sequences are (aperiodic) one-dimensional sequences, so for the processes considered here their Fourier transforms exist and are bounded for |\omega| \le \pi.
• Let the Fourier transforms of the autocorrelation and autocovariance sequences be
  \varphi_{xx}[m] \leftrightarrow \Phi_{xx}(e^{j\omega}),  \varphi_{xy}[m] \leftrightarrow \Phi_{xy}(e^{j\omega})
  \gamma_{xx}[m] \leftrightarrow \Gamma_{xx}(e^{j\omega}),  \gamma_{xy}[m] \leftrightarrow \Gamma_{xy}(e^{j\omega})
Fourier Transform Representation of Random Signals (continue)

• Consider the inverse Fourier transforms:
  \varphi_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega
  \gamma_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega
Fourier Transform Representation of Random Signals (continue)

• Consequently, setting m = 0,
  E\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega
  \sigma_x^2 = \gamma_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, d\omega
• Denote
  P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})
  to be the power density spectrum (or power spectrum) of the random process x.
Power Density Spectrum

• E\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega
• The total area under the power density spectrum over [-\pi, \pi] is proportional to the total average power of the signal.
• P_{xx}(\omega) is always real-valued, since \varphi_{xx}[m] is conjugate symmetric.
• For real-valued random processes, P_{xx}(\omega) = \Phi_{xx}(e^{j\omega}) is both real and even.
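
The slides do not prescribe a spectrum estimator; one common choice, sketched here under the assumption of an ergodic process, is to average periodograms over segments:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, 64 * 256)    # assumed: unit-variance white noise

segs = x.reshape(256, 64)             # 256 segments of length 64
P = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / 64  # averaged periodogram

# (1/2pi) * integral of P over [-pi, pi] becomes the mean over the DFT grid:
print(P.mean())                       # ~ 1.0 = E{|x[n]|^2}
```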
Mean and Linear Systems

• Consider an LTI system with impulse response h[n]. If x[n] is a stationary random signal with mean m_x, then the output y[n] is also a stationary random signal with mean
  m_y[n] = E\{y[n]\} = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[n-k]\} = \sum_{k=-\infty}^{\infty} h[k]\, m_x[n-k]
• Since the input is stationary, m_x[n-k] = m_x, and consequently (checked numerically below)
  m_y = m_x \sum_{k=-\infty}^{\infty} h[k] = H(e^{j0})\, m_x
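
A quick numerical check of m_y = H(e^{j0}) m_x, using an assumed FIR impulse response:

```python
import numpy as np

rng = np.random.default_rng(5)
h = np.array([0.5, 1.0, 0.25])           # assumed FIR impulse response
x = 3.0 + rng.normal(0.0, 1.0, 100_000)  # stationary input with m_x = 3

y = np.convolve(x, h, mode="valid")      # output of the LTI system
H0 = h.sum()                             # H(e^{j0}) = sum_k h[k]
print(y.mean(), H0 * 3.0)                # both ~ 5.25
```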
Stationarity and Linear Systems

• If x[n] is a real and stationary random signal, the autocorrelation function of the output process is
  \varphi_{yy}[n, n+m] = E\{y[n]\, y[n+m]\}
    = E\left\{ \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, x[n-k]\, x[n+m-r] \right\}
    = \sum_{k=-\infty}^{\infty} h[k] \sum_{r=-\infty}^{\infty} h[r]\, E\{x[n-k]\, x[n+m-r]\}
• Since x[n] is stationary, E\{x[n-k]\, x[n+m-r]\} depends only on the time difference m + k - r.
Stationarity and Linear Systems (continue)

• Therefore,
  \varphi_{yy}[n, n+m] = \sum_{k=-\infty}^{\infty} h[k] \sum_{r=-\infty}^{\infty} h[r]\, \varphi_{xx}[m+k-r] = \varphi_{yy}[m]
  so the output autocorrelation also depends only on the time difference m.
• Generally, for an LTI system having a wide-sense stationary input, the output is also wide-sense stationary.
Power Density Spectrum and Linear Systems

• By substituting l = r - k,
  \varphi_{yy}[m] = \sum_{l=-\infty}^{\infty} \varphi_{xx}[m-l] \sum_{k=-\infty}^{\infty} h[k]\, h[l+k] = \sum_{l=-\infty}^{\infty} \varphi_{xx}[m-l]\, c_{hh}[l]
  where
  c_{hh}[l] = \sum_{k=-\infty}^{\infty} h[k]\, h[l+k]
• A sequence of the form of c_{hh}[l] is called a deterministic autocorrelation sequence (computed directly in the sketch below).
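
For a finite-length real h, c_{hh}[l] can be computed directly; a minimal sketch with an assumed impulse response:

```python
import numpy as np

h = np.array([1.0, 0.5, 0.25])          # assumed real impulse response
c_hh = np.correlate(h, h, mode="full")  # c_hh[l] = sum_k h[k] h[l+k], l = -2..2
print(c_hh)                             # [0.25, 0.625, 1.3125, 0.625, 0.25]
```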
Power Density Spectrum and Linear Systems (continue)

• Taking the Fourier transform of the convolution above,
  \Phi_{yy}(e^{j\omega}) = C_{hh}(e^{j\omega})\, \Phi_{xx}(e^{j\omega})
  where C_{hh}(e^{j\omega}) is the Fourier transform of
  c_{hh}[l] = h[l] * h[-l]
• For real h,
  C_{hh}(e^{j\omega}) = H(e^{j\omega})\, H(e^{-j\omega})
• Thus
  C_{hh}(e^{j\omega}) = |H(e^{j\omega})|^2
Power Density Spectrum and Linear Systems (continue)

• We therefore have the following relation between the input and output power spectra (verified numerically below):
  \Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})
• Moreover,
  E\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = total average power of the input
  E\{|y[n]|^2\} = \varphi_{yy}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega = total average power of the output


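A sketch verifying the output-power relation: filter unit-variance white noise (so \Phi_{xx}(e^{j\omega}) = 1) with an assumed FIR filter and compare the measured \varphi_{yy}[0] with the integral of |H(e^{j\omega})|^2:

```python
import numpy as np

rng = np.random.default_rng(6)
h = np.array([1.0, -0.5])               # assumed FIR filter
x = rng.normal(0.0, 1.0, 1_000_000)     # white input: Phi_xx(e^{jw}) = 1

y = np.convolve(x, h, mode="valid")
measured = np.mean(y ** 2)              # phi_yy[0], the output average power

# Predicted: (1/2pi) * integral of |H(e^{jw})|^2 over [-pi, pi]
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
H = h[0] + h[1] * np.exp(-1j * w)
predicted = np.mean(np.abs(H) ** 2)     # Riemann approximation of the integral
print(measured, predicted)              # both ~ 1.25
```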
Power Density Property

• Key property: the area under P_{xx}(\omega) over a band of frequencies, \omega_a < |\omega| < \omega_b, is proportional to the power in the signal in that band.
• To show this, consider an ideal band-pass filter. Let H(e^{j\omega}) be the frequency response of the ideal band-pass filter for the band \omega_a < |\omega| < \omega_b.
• Note that |H(e^{j\omega})|^2 and \Phi_{xx}(e^{j\omega}) are both even functions. Hence,
  \varphi_{yy}[0] = average power in output
    = \frac{1}{2\pi} \int_{-\omega_b}^{-\omega_a} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega + \frac{1}{2\pi} \int_{\omega_a}^{\omega_b} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega
    = \frac{1}{\pi} \int_{\omega_a}^{\omega_b} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega
White Noise (or White Gaussian Noise)

• A white-noise signal is a signal for which
  \varphi_{xx}[m] = \sigma_x^2\, \delta[m]
• Hence, its samples at different instants of time are uncorrelated.
• The power spectrum of a white-noise signal is a constant:
  \Phi_{xx}(e^{j\omega}) = \sigma_x^2
• The concept of white noise is very useful in quantization-error analysis.
White Noise (continue)

• The average power of a white-noise signal is therefore
  \varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} \sigma_x^2\, d\omega = \sigma_x^2
• White noise is also useful in the representation of random signals whose power spectra are not constant with frequency (see the sketch below):
  • A random signal y[n] with power spectrum \Phi_{yy}(e^{j\omega}) can be modeled as the output of a linear time-invariant system with a white-noise input:
    \Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \sigma_x^2
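
A sketch of this modeling idea under assumed choices of \sigma and shaping filter: the averaged periodogram of the filtered noise approximately matches |H(e^{j\omega})|^2 \sigma_x^2, up to estimation error and segment-edge effects:

```python
import numpy as np

rng = np.random.default_rng(8)
sigma = 2.0
v = rng.normal(0.0, sigma, 64 * 512)         # white noise, sigma_x^2 = 4

h = np.array([1.0, 0.8, 0.64, 0.512])        # assumed shaping filter
y = np.convolve(v, h, mode="same")           # colored signal = filtered white noise

segs = y.reshape(512, 64)                    # averaged periodogram of y
P_yy = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / 64

w = 2 * np.pi * np.arange(64) / 64           # matching DFT frequency grid
H = sum(hk * np.exp(-1j * w * k) for k, hk in enumerate(h))
print(np.round(P_yy[:4], 1))                 # estimated spectrum at low frequencies
print(np.round((np.abs(H) ** 2 * sigma ** 2)[:4], 1))  # |H|^2 sigma^2, ~ the same
```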
Cross-correlation

• The cross-correlation between the input and the output of an LTI system is
  \varphi_{xy}[m] = E\{x[n]\, y[n+m]\}
    = E\left\{ x[n] \sum_{k=-\infty}^{\infty} h[k]\, x[n+m-k] \right\}
    = \sum_{k=-\infty}^{\infty} h[k]\, \varphi_{xx}[m-k]
• That is, the cross-correlation between the input and the output is the convolution of the impulse response with the input autocorrelation sequence.
Cross-correlation (continue)

• Taking the Fourier transform of both sides of the above equation, we have
  \Phi_{xy}(e^{j\omega}) = H(e^{j\omega})\, \Phi_{xx}(e^{j\omega})
• This result has a useful application when the input is white noise with variance \sigma_x^2:
  \varphi_{xy}[m] = \sigma_x^2\, h[m],  \Phi_{xy}(e^{j\omega}) = \sigma_x^2\, H(e^{j\omega})
• These equations serve as the basis for estimating the impulse or frequency response of an LTI system, if it is possible to observe the output of the system in response to a white-noise input (see the sketch below).
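
A sketch of this identification idea: drive an assumed "unknown" FIR system with unit-variance white noise and recover h[m] from the cross-correlation estimate, since \varphi_{xy}[m] = \sigma_x^2 h[m]:

```python
import numpy as np

rng = np.random.default_rng(7)
h_true = np.array([0.2, 1.0, -0.4, 0.1])   # assumed "unknown" system
N = 500_000

x = rng.normal(0.0, 1.0, N)                # white-noise probe, sigma_x^2 = 1
y = np.convolve(x, h_true)[:N]             # observed output

# phi_xy[m] = E{x[n] y[n+m]} = sigma_x^2 h[m], so the cross-correlation
# estimate directly recovers the impulse response:
h_est = np.array([np.dot(x[:N - m], y[m:N]) / (N - m) for m in range(6)])
print(np.round(h_est, 3))                  # ~ [0.2, 1.0, -0.4, 0.1, 0.0, 0.0]
```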
Remaining Materials Not Included

The remaining material from Chap. 4 will be taught in class without slides.
