m1- Random Variables and Processes

The document provides an overview of the principles of communication systems, focusing on random variables and processes. It discusses the statistical characterization of random signals, probability theory, and the definitions and properties of random variables, including their expected values and moments. Additionally, it introduces concepts of random processes, highlighting their significance in analyzing communication systems and the differences between random variables and random processes.


BANGALORE INSTITUTE OF TECHNOLOGY

Principles of Communication
Systems
BEC 402

DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING



Module 1 (PCS)

RANDOM VARIABLES AND PROCESSES


Dr. Anupama H
Assistant Professor
Department of ECE, BIT


INTRODUCTION
• This module deals with the statistical characterization of random signals.
• A signal is "random" if it is not possible to predict its precise value in advance.
• In a radio communication system, the received signal consists of an information-bearing signal component, a random interference component, and receiver noise.
• The information-bearing signal component may represent a voice signal that typically consists of randomly spaced bursts of energy of random duration.
• The interference component represents spurious electromagnetic waves produced by other communication systems operating in the vicinity of the radio receiver.
• A major source of receiver noise is thermal noise, caused by the random motion of electrons in conductors and devices at the front end of the receiver.
• Thus the received signal is completely random in nature.
• It is therefore described in terms of its statistical properties, such as the average power in the random signal or the average spectral distribution of that power.
• The mathematical discipline that deals with the statistical characterization of random signals is probability theory.
• A random variable is obtained by observing a random process at a fixed instant of time.

PROBABILITY
• Probability theory deals with phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance.
• If the experiment is repeated, the outcome can differ because of the influence of an underlying random phenomenon or chance mechanism. Such an experiment is referred to as a random experiment.


Basic Definitions

• Think of an experiment and its possible outcomes as defining a space and its points. If an experiment has K possible outcomes, then for the kth possible outcome there is a point called the sample point, denoted by sk.
• The set of all possible outcomes of the experiment is called the sample space, denoted by S.
• An event corresponds to either a single sample point or a set of sample points in the space S.
• A single sample point is called an elementary event.
• The entire sample space S is called the sure event, and the null set ∅ is called the null or impossible event.
• Two events are mutually exclusive if the occurrence of one event precludes the occurrence of the
other event.


Properties
• A probability measure P is a function that assigns a non-negative number to an event A in the sample space S and satisfies the following three properties (axioms):
1. 0 ≤ P[A] ≤ 1 for any event A
2. P[S] = 1 for the sure event S
3. If A and B are two mutually exclusive events, then P[A ∪ B] = P[A] + P[B]


Relationship between
sample space, events, and probability.


• The sample space S is mapped to events via the random experiment.


• The events may be elementary outcomes of the sample space or
larger subsets of the sample space.
• The probability function assigns a value between 0 and 1 to each of
these events.
• The probability value is not unique to the event; mutually exclusive
events may be assigned the same probability.
• However, the probability of the union of all events is always unity.


The three axioms and their relationship to the relative-frequency approach are illustrated by the Venn diagram.


CONDITIONAL PROBABILITY
• Consider an experiment that involves a pair of events A and B. Let P[B|A] denote the probability of event B, given that event A has occurred. The probability P[B|A] is called the conditional probability of B given A. Assuming that A has nonzero probability, the conditional probability P[B|A] is defined by

P[B|A] = P[A ∩ B] / P[A]          (5.7)

• where P[A ∩ B] is the joint probability of A and B. Eq. (5.7) may be written as

P[A ∩ B] = P[B|A] P[A]

• we may also write

P[A ∩ B] = P[A|B] P[B]

• The joint probability of two events may thus be expressed as the product of the conditional probability of one event given the other and the elementary probability of the other.
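The defining relation P[B|A] = P[A ∩ B] / P[A] can be sanity-checked by exhaustive enumeration. A minimal sketch for two fair dice, where the events A ("first die is even") and B ("sum exceeds 7") are illustrative choices, not from the slides:

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of rolling two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {o for o in outcomes if o[0] % 2 == 0}        # first die is even
B = {o for o in outcomes if o[0] + o[1] > 7}      # sum exceeds 7

P_A = Fraction(len(A), len(outcomes))
P_A_and_B = Fraction(len(A & B), len(outcomes))

# Conditional probability from the definition P[B|A] = P[A ∩ B] / P[A]
P_B_given_A = P_A_and_B / P_A
```

Here P[A] = 18/36 and P[A ∩ B] = 9/36, so the definition gives P[B|A] = 1/2.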

• Provided P[A] ≠ 0, Bayes' rule states

P[A|B] = P[B|A] P[A] / P[B]


RANDOM VARIABLES
• A random variable assigns a number to the outcome of a random experiment (e.g., 1 for heads and 0 for tails in a coin-tossing experiment).
• A function whose domain is a sample space and whose range is a set of real numbers is called a random variable of the experiment.
• For events in a set E, a random variable assigns a subset of the real line.
• If the outcome of the experiment is s, the random variable is denoted as X(s), or just X.
• In the figure, subsets of the sample space are mapped directly to subsets of the real line.
• Random variables may be discrete or continuous.
• A discrete random variable takes only a finite number of values, as in the coin-tossing experiment.
• A continuous random variable takes a range of real values; for example, the amplitude of a noise voltage at a particular instant in time may take on any value between minus and plus infinity.

• The probabilistic description of random variables works equally well for both discrete and continuous random variables.
• Consider the random variable X and the probability of the event (X ≤ x), denoted by P[X ≤ x]. The function FX(x), called the cumulative distribution function (cdf) of the random variable X, is given by

FX(x) = P[X ≤ x]

• For any point x, the distribution function FX(x) expresses a probability.

• CDF properties:
1. The distribution function FX(x) is bounded between zero and one.
2. The distribution function FX(x) is a monotone-nondecreasing function of x.
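Both CDF properties can be observed on an empirical distribution function built from data. A minimal sketch, assuming standard-Gaussian samples purely for illustration:

```python
import random

random.seed(0)
samples = sorted(random.gauss(0.0, 1.0) for _ in range(1000))

def ecdf(x):
    """Empirical CDF: the fraction of samples with value <= x."""
    return sum(s <= x for s in samples) / len(samples)

grid = [-3.0 + 0.1 * k for k in range(61)]
values = [ecdf(x) for x in grid]

bounded = all(0.0 <= v <= 1.0 for v in values)          # property 1
nondecreasing = all(b >= a for a, b in zip(values, values[1:]))  # property 2
```

The empirical curve stays in [0, 1] and never decreases, mirroring the two properties above.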

Probability density function (pdf)

• The pdf is the derivative of the distribution function,

fX(x) = d FX(x) / dx

• The name density function arises from the fact that the probability of the event (x1 ≤ X ≤ x2) equals

P[x1 ≤ X ≤ x2] = ∫ from x1 to x2 of fX(x) dx          (5.18)

• The probability of an interval is therefore the area under the probability density function in that interval. Putting x1 = -∞ in Eq. (5.18) and changing the notation, the distribution function is expressed in terms of the probability density function as follows:

FX(x) = ∫ from -∞ to x of fX(ξ) dξ

• Since FX(∞) = 1, corresponding to the probability of a certain event, and FX(-∞) = 0, corresponding to the probability of an impossible event, from Eq. (5.18) we may write

∫ from -∞ to ∞ of fX(x) dx = 1

• The probability density function must always be a nonnegative function with a total area of one.
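The unit-area requirement can be verified numerically for any valid pdf. A sketch using an exponential density fX(x) = λ exp(-λx) for x ≥ 0 (an illustrative choice; λ = 2 is arbitrary), approximating the integral by a Riemann sum:

```python
import math

lam = 2.0   # illustrative rate for the exponential pdf f_X(x) = lam * exp(-lam * x), x >= 0

def exp_pdf(x):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Left Riemann sum over [0, 10]; the tail beyond x = 10 contributes only ~e^(-20).
dx = 0.0005
area = sum(exp_pdf(k * dx) * dx for k in range(int(10 / dx)))

# Nonnegativity check on a grid straddling zero.
nonnegative = all(exp_pdf(-1.0 + 0.01 * k) >= 0.0 for k in range(400))
```

The computed area comes out within a fraction of a percent of 1, and the density is nonnegative everywhere on the grid.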


STATISTICAL AVERAGES
• We are often interested in the average behavior of the outcomes arising in random experiments.
• The expected value or mean of a random variable X is defined by

μX = E[X] = ∫ from -∞ to ∞ of x fX(x) dx

• E denotes the statistical expectation operator.

• The mean μX locates the center of gravity of the area under the probability density curve of the random variable X.


FUNCTION OF A RANDOM VARIABLE


• Let X denote a random variable, and let g(X) denote a real-valued function defined on the real line. Let Y = g(X); the expected value of the random variable Y is given by

E[Y] = ∫ from -∞ to ∞ of y fY(y) dy

It can also be written as

E[g(X)] = ∫ from -∞ to ∞ of g(x) fX(x) dx

• This generalizes the concept of expected value to an arbitrary function g(X) of a random variable X.
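For a discrete random variable the integral becomes a sum over the pmf, so E[g(X)] can be evaluated exactly. A minimal sketch for a fair die (an illustrative choice) with g(X) = X²:

```python
from fractions import Fraction

# Fair die (illustrative): X takes values 1..6, each with probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def expect(g):
    """Discrete analogue of E[g(X)]: the sum of g(x) * P[X = x] over all x."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x)              # E[X]
mean_square = expect(lambda x: x * x)   # E[g(X)] with g(x) = x^2
```

The same `expect` operator handles any g: here it yields E[X] = 7/2 and E[X²] = 91/6.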

MOMENTS
• The nth moment of the probability distribution of the random variable X is

E[X^n] = ∫ from -∞ to ∞ of x^n fX(x) dx

• Putting n = 1 in the above equation gives the mean of the random variable, E[X] = μX.

• Putting n = 2 gives the mean-square value of X, E[X^2].

• Central moments are the moments of the difference between a random variable X and its mean. Thus, the nth central moment is

E[(X - μX)^n] = ∫ from -∞ to ∞ of (x - μX)^n fX(x) dx


• For n = 1, the central moment is zero, whereas for n = 2 the second central moment is referred to as the variance of the random variable X, written as

Var[X] = σX^2 = E[(X - μX)^2]

• The variance of a random variable X is commonly denoted as σX^2. The square root of the variance, σX, is called the standard deviation of the random variable X.
• The variance of a random variable X is a measure of the variable's "randomness."
• If the mean is zero, then the variance and the mean-square value E[X^2] of the random variable X are equal.
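The identity Var[X] = E[X²] - μX² can be confirmed with exact arithmetic. A sketch for a fair die (an illustrative choice), using fractions to avoid rounding:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair-die pmf (illustrative)

mean = sum(x * p for x, p in pmf.items())                     # first moment E[X]
mean_square = sum(x * x * p for x, p in pmf.items())          # second moment E[X^2]
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())   # second central moment
```

For the die, the direct central-moment sum gives Var[X] = 91/6 - (7/2)² = 35/12, matching the identity term by term.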


CHARACTERISTIC FUNCTION

• The characteristic function of a random variable X is defined as

φX(v) = E[exp(jvX)] = ∫ from -∞ to ∞ of fX(x) exp(jvx) dx

• where v is real and j = sqrt(-1).

• This relation forms a Fourier-transform pair with the density, and may be inverted to evaluate the probability density function of the random variable X from its characteristic function.

• The characteristic function may be used to evaluate the higher-order moments of the Gaussian random variable X.


• Differentiating both sides of the equation with respect to v a total of n times and then setting v = 0, we get the result

E[X^n] = (1 / j^n) (d^n φX(v) / dv^n) evaluated at v = 0
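This moment-from-derivative relation can be checked numerically for any distribution, not only the Gaussian: approximate the derivatives of φX(v) = E[exp(jvX)] at v = 0 by finite differences and divide by the appropriate power of j. A sketch for a fair die (an illustrative choice; the step h is arbitrary):

```python
import cmath

xs = [1, 2, 3, 4, 5, 6]   # fair-die values (illustrative)
p = 1.0 / 6.0

def phi(v):
    """Characteristic function phi_X(v) = E[exp(j v X)] of the die."""
    return sum(p * cmath.exp(1j * v * x) for x in xs)

h = 1e-4
# E[X]   = (1/j)   * phi'(0):  central difference for the first derivative.
first_moment = ((phi(h) - phi(-h)) / (2 * h) / 1j).real
# E[X^2] = (1/j^2) * phi''(0) = -phi''(0): central difference for the second derivative.
second_moment = (-(phi(h) - 2 * phi(0) + phi(-h)) / h ** 2).real
```

The estimates land on the die's true moments, E[X] = 7/2 and E[X²] = 91/6, to within the finite-difference error.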


JOINT MOMENTS

• The statistical average of a pair of random variables X and Y is the joint moment

E[X^i Y^k] = ∫∫ x^i y^k fX,Y(x, y) dx dy          (5.51)

• The correlation is defined by E[XY], which corresponds to i = k = 1 in Eq. (5.51).

• The correlation of the centered random variables X - E[X] and Y - E[Y] is the joint moment called the covariance of X and Y:

cov[XY] = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]

• The correlation coefficient of X and Y is

ρ = cov[XY] / (σX σY)

• The two random variables X and Y are uncorrelated if and only if their covariance is zero, that is, if and only if
cov[XY] = 0
• They are orthogonal if and only if their correlation is zero, that is, if and only if
E[XY] = 0
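Correlation, covariance, and the correlation coefficient can all be computed directly for a small set of equally likely joint outcomes. A sketch with the illustrative pairs below (y = 2x, so ρ should come out to exactly 1):

```python
import math

# Equally likely joint outcomes (x_k, y_k) with y = 2x: a perfectly correlated pair.
pairs = [(-1.0, -2.0), (0.0, 0.0), (1.0, 2.0)]
n = len(pairs)

mean_x = sum(x for x, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n

correlation = sum(x * y for x, y in pairs) / n                # E[XY]
cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs) / n  # cov[XY]
rho = cov / math.sqrt(
    (sum((x - mean_x) ** 2 for x, _ in pairs) / n)
    * (sum((y - mean_y) ** 2 for _, y in pairs) / n)
)
```

Since both means are zero here, correlation and covariance coincide (E[XY] = cov[XY] = 4/3), and normalizing by σX σY gives ρ = 1.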


Moments of a Bernoulli Random Variable


RANDOM PROCESSES
• The statistical analysis of communication systems requires the characterization of random signals such as voice signals, television signals, computer data, and electrical noise.
• For random signals, each sample point in the sample space is a function of time. The sample space or ensemble comprised of functions of time is called a random or stochastic process.
• Consider then a random experiment specified by the outcomes s from some sample space S, by the events defined on the sample space S, and by the probabilities of these events.
• Suppose that each sample point s is assigned a function of time in accordance with the rule:

X(t, s),   -T ≤ t ≤ T

where 2T is the total observation interval. For a fixed sample point sj, the graph of the function X(t, sj) versus time t is called a realization or sample function of the random process, written as

xj(t) = X(t, sj),   -T ≤ t ≤ T

• For each fixed observation time tk, the set of numbers {X(tk, s)} across the ensemble constitutes a random variable. The family of random variables {X(t, s)} is called a random process. To simplify the notation, we suppress the s and simply use X(t) to denote a random process.
• A random process X(t) is defined as an ensemble of time functions together with a probability rule that assigns a probability to any meaningful event associated with an observation of one of the sample functions of the random process.
• Difference between a random variable and a random process:
1. For a random variable, the outcome of a random experiment is mapped into a number.
2. For a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.
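This difference can be made concrete: each experimental outcome selects an entire waveform, and fixing the time index across the ensemble recovers a random variable. A minimal sketch, using short random-walk waveforms as an illustrative ensemble:

```python
import random

random.seed(7)
num_sample_functions, num_time_steps = 4, 5

# Each outcome s selects an entire waveform X(t, s): here, a short random walk.
ensemble = []
for _ in range(num_sample_functions):
    level, waveform = 0.0, []
    for _ in range(num_time_steps):
        level += random.choice([-1.0, 1.0])
        waveform.append(level)
    ensemble.append(waveform)

# Fixing the observation time (index 2) across the ensemble yields a random variable.
X_at_t2 = [w[2] for w in ensemble]
```

Each entry of `ensemble` is a full time function (the random-process view); `X_at_t2` is a single list of numbers, one per outcome (the random-variable view).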


MEAN, CORRELATION, AND COVARIANCE FUNCTIONS

• A random process is said to be stationary to first order if the distribution function (and therefore the density function) of X(t) does not vary with time. That is, the density functions for the random variables X(t1) and X(t2) satisfy

fX(t1)(x) = fX(t2)(x)   for all t1 and t2

• The mean of the random process is a constant for a process that is stationary to first order:

μX(t) = E[X(t)] = μX

• The autocorrelation function of the process X(t) is the expectation of the product of the two random variables X(t1) and X(t2), obtained by observing X(t) at times t1 and t2, respectively:

RX(t1, t2) = E[X(t1) X(t2)]

• A random process X(t) is stationary to second order if the joint distribution function FX(t1),X(t2)(x1, x2) depends only on the difference between the observation times t1 and t2.

• The autocovariance function of a stationary random process X(t) is

CX(t1, t2) = E[(X(t1) - μX)(X(t2) - μX)] = RX(t2 - t1) - μX^2


PROPERTIES OF THE AUTOCORRELATION FUNCTION

• For a stationary process X(t), the autocorrelation function may be redefined as

RX(τ) = E[X(t + τ) X(t)]   for all t

1. The mean-square value of the process may be obtained from RX(τ) simply by putting τ = 0 in the above equation:

RX(0) = E[X^2(t)]

2. The autocorrelation function RX(τ) is an even function of τ:

RX(τ) = RX(-τ)

3. The autocorrelation function RX(τ) has its maximum magnitude at τ = 0:

|RX(τ)| ≤ RX(0)

Proof: Consider the nonnegative quantity

E[(X(t + τ) ± X(t))^2] ≥ 0

Expanding and using the definition of RX(τ),

2RX(0) ± 2RX(τ) ≥ 0

Equivalently,

-RX(0) ≤ RX(τ) ≤ RX(0)


Physical significance of the autocorrelation function

• It provides a means of describing the "interdependence" of two random variables obtained by observing a random process X(t) at times τ seconds apart.
• The more rapidly the random process X(t) changes with time, the more rapidly the autocorrelation function RX(τ) decreases from its maximum RX(0) as τ increases.
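The three properties above can be observed on the classic random-phase sinusoid X(t) = A cos(ωt + Θ) with Θ uniform on [0, 2π), whose true autocorrelation is RX(τ) = (A²/2) cos(ωτ). A Monte Carlo sketch over the ensemble (the parameter values are illustrative):

```python
import math
import random

random.seed(1)
A, omega = 1.0, 2.0 * math.pi      # illustrative amplitude and angular frequency
thetas = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]

def R(tau, t=0.3):
    """Ensemble-average estimate of R_X(tau) = E[X(t + tau) X(t)]."""
    return sum(
        A * math.cos(omega * (t + tau) + th) * A * math.cos(omega * t + th)
        for th in thetas
    ) / len(thetas)
```

Up to Monte Carlo noise, the estimate shows RX(0) ≈ A²/2 (property 1), RX(τ) ≈ RX(-τ) (property 2), and |RX(τ)| ≤ RX(0) (property 3); a higher ω would make R drop from its peak more quickly, as the slide notes.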


CROSS-CORRELATION FUNCTIONS
• Consider two random processes X(t) and Y(t) with autocorrelation functions RX(t, u) and RY(t, u), respectively. The cross-correlation function of X(t) and Y(t) is defined by

RXY(t, u) = E[X(t) Y(u)]

• If the random processes X(t) and Y(t) are each wide-sense stationary and jointly wide-sense stationary, the cross-correlation may be written as

RXY(τ) = E[X(t + τ) Y(t)]

• Symmetry relationship:

RXY(τ) = RYX(-τ)

GAUSSIAN PROCESS
• The random variable Y has a Gaussian distribution if its probability density function has the form

fY(y) = (1 / (sqrt(2π) σY)) exp(-(y - μY)^2 / (2 σY^2))

• The special case in which the Gaussian random variable Y is normalized to have a mean of zero and a variance of one is written as N(0, 1):

fY(y) = (1 / sqrt(2π)) exp(-y^2 / 2)
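The Gaussian density can be evaluated directly and its unit area checked by a Riemann sum. A small sketch (the integration range [-8, 8] is an illustrative truncation; the tails beyond it contribute negligibly):

```python
import math

def normal_pdf(y, mu=0.0, sigma=1.0):
    """Gaussian density f_Y(y) = exp(-(y - mu)^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)."""
    return math.exp(-((y - mu) ** 2) / (2.0 * sigma ** 2)) / (math.sqrt(2.0 * math.pi) * sigma)

peak = normal_pdf(0.0)     # the N(0, 1) density peaks at 1/sqrt(2*pi)
dx = 0.001
area = sum(normal_pdf(-8.0 + k * dx) * dx for k in range(int(16 / dx)))
```

With the default arguments the function is exactly the N(0, 1) special case; its total area comes out as 1 to within the discretization error.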


CENTRAL LIMIT THEOREM


• The central limit theorem provides the mathematical justification for using a Gaussian process as
a model for a large number of different physical phenomena in which the observed random
variable, at a particular instant of time, is the result of a large number of individual random
events.
• Formulation:
Let Xi, i = 1, 2, ..., N, be a set of random variables that satisfies the following requirements:
1. The Xi are statistically independent.
2. The Xi have the same probability distribution with mean μX and variance σX^2.

The Xi are said to constitute a set of independent and identically distributed (i.i.d.) random variables. Let these random variables be normalized as follows:

Yi = (Xi - μX) / σX,   i = 1, 2, ..., N


• so that we have E[Yi] = 0 and Var[Yi] = 1.

• Define the random variable

VN = (1 / sqrt(N)) × (Y1 + Y2 + ... + YN)

• The central limit theorem states that the probability distribution of VN approaches a normalized Gaussian distribution N(0, 1) in the limit as N approaches infinity. That is, regardless of the distribution of the Yi, the sum VN approaches a Gaussian distribution.
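The theorem is easy to watch numerically: normalize i.i.d. uniform variables (an illustrative choice of starting distribution), form VN, and compare the resulting sample statistics with N(0, 1). A Monte Carlo sketch with N = 50:

```python
import math
import random
import statistics

random.seed(42)
N, trials = 50, 20_000

# X_i ~ Uniform(0, 1): mean 1/2, variance 1/12. Normalize to Y_i with E[Y_i] = 0, Var[Y_i] = 1.
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)

def V_N():
    """One realization of V_N = (1/sqrt(N)) * sum of N normalized i.i.d. variables."""
    return sum((random.random() - mu) / sigma for _ in range(N)) / math.sqrt(N)

vs = [V_N() for _ in range(trials)]
sample_mean = statistics.fmean(vs)
sample_var = statistics.pvariance(vs)
frac_within_1 = sum(abs(v) <= 1.0 for v in vs) / trials   # ~0.683 for N(0, 1)
```

Even though each Xi is uniform, the sample mean of VN lands near 0, the sample variance near 1, and about 68.3% of the realizations fall within one standard deviation, as N(0, 1) predicts.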


PROPERTIES OF A GAUSSIAN PROCESS


1. If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) developed at the output of the filter is also Gaussian.
2. Consider the set of random variables or samples X(t1), X(t2), ..., X(tn), obtained by observing a random process X(t) at times t1, t2, ..., tn. If the process X(t) is Gaussian, then this set of random variables is jointly Gaussian for any n, with their n-fold joint probability density function being completely determined by specifying the set of means and the set of covariances of these random variables.


3. If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict
sense.

4. If the random variables X(t1), X(t2), ..., X(tn), obtained by sampling a Gaussian process X(t) at times t1, t2, ..., tn, are uncorrelated, that is, their covariances are all zero,

then these random variables are statistically independent.


Proof of Property 1
Statement: If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) developed at the output of the filter is also Gaussian.
Proof:
• Consider a linear time-invariant filter of impulse response h(t), with the random process X(t) as input and the random process Y(t) as output. Assume that X(t) is a Gaussian process. The random processes Y(t) and X(t) are related by the convolution integral

Y(t) = ∫ h(τ) X(t - τ) dτ

• Assume that the impulse response h(t) is such that the mean-square value of the output random process Y(t) is finite for all t in the range 0 ≤ t < ∞ for which Y(t) is defined.

• Let us define a random variable

Z = ∫ gY(t) Y(t) dt

which must be a Gaussian random variable for every function gY(t) such that the mean-square value of Z is finite.

• Interchanging the order of integration in the expression for Z gives

Z = ∫ X(u) g(u) du          (2)

• where

g(u) = ∫ gY(t) h(t - u) dt

• Since X(t) is a Gaussian process by hypothesis, it follows from Eq. (2) that Z must be a Gaussian random variable. Thus we have shown that if the input X(t) to a linear filter is a Gaussian process, then the output Y(t) is also a Gaussian process.
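The discrete-time analogue of this result can be simulated: pass white Gaussian samples through an FIR filter (the taps below are illustrative) and check a second-order statistic of the filtered output. For unit-variance white input, the output variance should equal the sum of the squared taps, Σ h_k².

```python
import random
import statistics

random.seed(3)
h = [0.5, 0.3, 0.2]                                   # illustrative FIR impulse response
x = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # white Gaussian input, variance 1

# Discrete convolution y[n] = sum_k h[k] * x[n - k] (valid region only).
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h) - 1, len(x))]

out_var = statistics.pvariance(y)        # should approach sum(h_k^2) for white input
expected_var = sum(hk ** 2 for hk in h)
```

The output remains zero-mean with the predicted variance; the Gaussianity of the output itself is what Property 1 guarantees and is not re-proved by this check.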
