
Communications Techniques 2

TIC 419
Dr Adel Khaled
23-1-2024
Course Outline
• Random variables
• Origin of noise and modeling
• Baseband transmission of analog signals
• Performance of amplitude and phase modulations
• Digitization, sampling, and quantization noise
• Calculation of power spectral densities
• Determination of bit error rate (BER)
• Discrete modulations
Textbook
• Introduction to Analog and Digital Communications, Second Edition,
Simon Haykin.
Lec 01
Random Variables
Random Variables
➢ The term “random” is used to describe erratic and apparently
unpredictable variations of an observed signal.
➢ Random signals in one form or another are encountered in
every practical communication system.
➢ Consider voice communication, in which voice is often
converted to an electrical signal by means of a microphone
before processing for transmission.
➢ If this electrical signal is displayed on an oscilloscope, we might
be tempted on first sight to say the signal appears to be quite
random; that is, it would be difficult to predict or reproduce.
Random Variables
➢ Similarly, with digital communications, if we consider the stream
of 0s and 1s that are transported over the Internet, they appear
quite random—they are always 0 or 1, but their order and location
are quite unpredictable.
➢ This randomness or unpredictability is a fundamental property of
information.
➢ If the information were predictable, there would be no need to
communicate, because the other end could predict the information
before receiving it.
Probability and Random Variables
Probability theory is rooted in situations that involve performing an experiment with
an outcome that is subject to chance.

For example, the experiment may be the observation of the result of the tossing of a fair
coin. In this experiment, the possible outcomes of a trial are “heads” and “tails.”
Probability and Random Variables

If the experiment is repeated n times and the event A occurs N_n(A) of those times, the relative frequency N_n(A)/n settles toward a limit as n becomes large. We define this limit,

P[A] = lim_{n→∞} N_n(A)/n,

as the probability of event A.

Sample space
Consider, for example, an experiment that involves the throw of a die. In
this experiment there are six possible outcomes: the showing of one, two,
three, four, five, or six dots on the upper face of the die. By assigning a
sample point to each of these possible outcomes, we have a sample space
that consists of six sample points.

Sample point: each possible outcome of the experiment.

Sample space: all possible outcomes of the experiment.

An event corresponds to either a single sample point or a set of sample points.

The elementary event describing the statement “a six shows” corresponds to


the sample point On the other hand, the event describing the statement “an
even number of dots shows” corresponds to the subset of the sample space.
Note that the term “event’ is used interchangeably to describe the subset or
the statement.
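The die example translates directly into set notation; here is a small illustrative sketch in Python (my own, not from the slides):

```python
# Die-throw sample space and events modeled as Python sets
S = {1, 2, 3, 4, 5, 6}      # sample space: all possible outcomes
six_shows = {6}             # elementary event "a six shows"
even_shows = {2, 4, 6}      # event "an even number of dots shows"

# Every event is a subset of the sample space
print(six_shows <= S, even_shows <= S)
# The elementary event is also contained in the larger event
print(six_shows <= even_shows)
```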
Sample space
We are now ready to make a formal definition of probability. A probability system consists of the triple:

1. A sample space S of elementary events (outcomes).
2. A class of events that are subsets of S.
3. A probability measure P[A] assigned to each event A in the class, satisfying the axioms:
(i) P[S] = 1;
(ii) 0 ≤ P[A] ≤ 1;
(iii) if A and B are two disjoint events, then P[A + B] = P[A] + P[B].
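A minimal Python sketch (my own illustration, not from the slides) checking these axioms for a fair die, where the probability measure is assumed to be P[A] = |A|/|S|:

```python
from fractions import Fraction

# Fair die: equally likely outcomes, so P[A] = |A| / |S|
S = {1, 2, 3, 4, 5, 6}

def P(A):
    """Probability measure for a fair die (exact rational arithmetic)."""
    return Fraction(len(A), len(S))

A = {2, 4, 6}   # "even number of dots shows"
B = {1, 3}      # disjoint from A

print(P(S))                    # axiom (i): P[S] = 1
print(P(A))                    # axiom (ii): between 0 and 1
print(P(A | B), P(A) + P(B))   # axiom (iii): additivity for disjoint events
```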


RANDOM VARIABLES
We use the expression random variable to describe the process of
assigning a number to the outcome of a random experiment.

➢ If the outcome of the experiment is s, we denote the random variable as X(s), or just X.
➢ Note that X is a function.
➢ We denote a particular value of the random variable by x; that is, X(s_k) = x.

• Random variables may be discrete and take only a finite number of values, such as in the coin-tossing experiment.
• Alternatively, random variables may be continuous and take a range of real values.
RANDOM VARIABLES

• The probability mass function describes the probability of each possible value of a discrete-valued random variable.
• For the coin-tossing experiment, if it is a fair coin, the probability mass function of the associated random variable may be written as

P[X = 0] = P[X = 1] = 1/2,

where the two values of X correspond to the outcomes “tails” and “heads.”
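A simulation sketch (Python, my own illustration; the course's examples use MATLAB): estimating this probability mass function from repeated tosses, with 0 encoding tails and 1 encoding heads:

```python
import random

random.seed(1)  # reproducible illustration

# Fair coin: P[X = 0] = P[X = 1] = 1/2
n = 100_000
tosses = [random.randint(0, 1) for _ in range(n)]

# Empirical probability mass function: relative frequency of each value
pmf = {x: tosses.count(x) / n for x in (0, 1)}
print(pmf)  # both entries close to 0.5
```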
DISTRIBUTION FUNCTIONS
The distribution function is the probability that the random variable X takes any value less than or equal to x; it is written as F_X(x), so

F_X(x) = P[X ≤ x].

The distribution function has two basic properties:

1. F_X(x) is bounded between zero and one.
2. F_X(x) is a monotone nondecreasing function of x.
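These two properties can be observed in an empirical distribution function built from simulated data (an illustrative Python sketch, my own):

```python
import random

random.seed(2)
samples = sorted(random.gauss(0.0, 1.0) for _ in range(10_000))

def F(x):
    """Empirical distribution function: fraction of samples <= x."""
    # a binary search would be faster; a linear scan keeps the sketch simple
    return sum(s <= x for s in samples) / len(samples)

# Property 1: bounded between 0 and 1, approaching those limits at +/- infinity
print(F(-10.0), F(10.0))
# Property 2: monotone nondecreasing
print(F(-1.0) <= F(0.0) <= F(1.0))
```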


Probability density function
If X is a continuous-valued random variable whose distribution function is differentiable with respect to x, then a third commonly used function is the probability density function, denoted by f_X(x), where

f_X(x) = dF_X(x)/dx.
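A numerical sketch (Python, my own illustration) confirming that differentiating the distribution function recovers the density, here for the standard Gaussian:

```python
import math

def Phi(x):
    """Standard normal distribution function, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    """Standard normal density f_X(x)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

# Central-difference derivative of the CDF matches the density pointwise
h = 1e-5
for x in (-1.0, 0.0, 0.5):
    numeric = (Phi(x + h) - Phi(x - h)) / (2.0 * h)
    print(x, numeric, phi(x))
```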
Expectation

For a discrete random variable, the expected value is the probability-weighted sum E[X] = Σ_x x P[X = x]. For a continuous random variable with a density function f_X(x), the analogous definition of the expected value is

E[X] = ∫_{−∞}^{∞} x f_X(x) dx.
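Both forms can be checked numerically (an illustrative Python sketch, my own; the quadrature grid is an arbitrary choice):

```python
import math

# Discrete: expected value of a fair die, E[X] = sum of x * P[X = x]
E_die = sum(x * (1.0 / 6.0) for x in range(1, 7))
print(E_die)  # 3.5

# Continuous: E[X] = integral of x * f_X(x) dx for a Gaussian with mean 2,
# approximated by a Riemann sum over a wide interval
mu, sigma = 2.0, 1.0

def f(x):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

dx = 0.001
E_x = sum(x * f(x) * dx for x in (-8 + i * dx for i in range(int(20 / dx))))
print(E_x)  # close to the mean, 2.0
```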
VARIANCE
The variance measures the spread of a random variable X about its mean μ = E[X]:

Var(X) = E[(X − μ)²] = E[X²] − μ².
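A quick numerical check of the two equivalent forms of the variance, Var(X) = E[(X − μ)²] = E[X²] − μ², using the fair-die example (Python sketch, my own):

```python
# Fair-die variance computed two ways
p = 1.0 / 6.0
xs = range(1, 7)

mu = sum(x * p for x in xs)                      # mean, 3.5
var_def = sum((x - mu) ** 2 * p for x in xs)     # E[(X - mu)^2]
var_alt = sum(x * x * p for x in xs) - mu ** 2   # E[X^2] - mu^2
print(var_def, var_alt)  # both equal 35/12
```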
COVARIANCE
For two random variables X and Y with means μ_X and μ_Y, the covariance is

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E[XY] − μ_X μ_Y.

Evaluating this expression, we find that the covariance of independent random variables is zero. It should be noted, however, that the opposite is not always true: zero covariance does not, in general, imply independence.
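The second remark can be illustrated with a deliberately dependent pair (my own example, not from the slides): with X uniform on {−1, 0, 1} and Y = X², Y is completely determined by X, yet Cov(X, Y) = E[X³] = 0.

```python
import random

random.seed(3)

# X uniform on {-1, 0, 1}; Y = X^2 depends entirely on X,
# but Cov(X, Y) = E[XY] - E[X]E[Y] = E[X^3] = 0
xs = [random.choice((-1, 0, 1)) for _ in range(200_000)]
ys = [x * x for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
print(cov)  # near zero despite the strong dependence
```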
Gaussian Random Variables
A Gaussian random variable is a continuous random variable with a density function given by

f_X(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)),

where μ is the mean and σ² is the variance.
Gaussian Random Variables

% Set the parameters for the Gaussian distribution
mu = 0;       % Mean
sigma = 1;    % Standard deviation

% Generate x values for the PDF plot
x = linspace(-5, 5, 1000);

% Calculate the PDF values for each x
% (with the Statistics Toolbox: f = normpdf(x, mu, sigma);)
f = (1/(sigma*sqrt(2*pi))) * exp(-((x - mu).^2)/(2*sigma^2));

% Plot the PDF
figure;
plot(x, f, 'LineWidth', 2);
title('Probability Density Function (PDF) of Gaussian Distribution');
xlabel('X');
ylabel('Probability Density');
grid on;

% Display mean and standard deviation on the plot
text(mu, max(f), sprintf('Mean = %g\nStd Dev = %g', mu, sigma), ...
    'VerticalAlignment', 'top', 'HorizontalAlignment', 'left', ...
    'FontSize', 10, 'Color', 'red');

% Show the legend
legend('PDF');
Gaussian Random Variables
Random Processes
▪ In a radio communication system, the received signal usually consists of an information bearing signal component,
a random interference component, and channel noise.
▪ The information-bearing signal may represent, for example, a voice signal that, typically, consists of randomly
spaced bursts of energy of random duration.
▪ The interference component may represent spurious electromagnetic waves produced by other communication
systems operating in the vicinity of the radio receiver.
▪ A major source of channel noise is thermal noise, which is caused by the random motion of the electrons in
conductors and devices at the front end of the receiver.
▪ We thus find that the received time-varying signal is random in nature. In this section, we combine the concepts of
time variation and random variables to introduce the concept of random processes.
▪ Although it is not possible to predict the exact value of the random signal or process in advance, it is possible to
describe the signal in terms of statistical parameters such as average power and power spectral density.
Random Processes
random processes have the following properties:
1. Random processes are functions of time.
2. Random processes are random in the sense that it is not possible to predict exactly what waveform will be
observed in the future.
Random Processes
• With a random variable, the outcome of a random experiment is mapped to a real number.
• With a random process, the outcome of a random experiment is mapped to a waveform that is a function of time.

The family of all such random variables, indexed by the time variable t, forms the random process.

The range of possible random processes is quite large. To restrict this range to random processes that are both (i) typical of real-world situations and (ii) mathematically tractable, we need two technical conditions: stationarity and ergodicity.
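A sketch of this mapping, using the standard random-phase sinusoid example (my own illustration in Python, not taken from the slides):

```python
import math
import random

random.seed(4)

# Random process: X(t) = cos(2*pi*t + Theta), where Theta (the outcome of
# the experiment) is uniform on [0, 2*pi). Each outcome theta selects one
# waveform; fixing a time t0 turns the process into a random variable.
def sample_function(theta, times):
    return [math.cos(2 * math.pi * t + theta) for t in times]

times = [i / 100.0 for i in range(100)]
ensemble = [sample_function(random.uniform(0, 2 * math.pi), times)
            for _ in range(5000)]

# The random variable X(t0) at the fixed time t0 = 0.25:
values_at_t0 = [wave[25] for wave in ensemble]
mean_at_t0 = sum(values_at_t0) / len(values_at_t0)
print(mean_at_t0)  # near 0: the ensemble mean of a random-phase sinusoid
```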
STATIONARY RANDOM PROCESSES
With real-world random processes, we often find that the statistical characterization of a process
is independent of the time at which the observations occur.
That is, if a random process is divided into a number of time intervals, the various sections of the
process exhibit essentially the same statistical properties. Such a process is said to be stationary.
Otherwise, it is said to be nonstationary.
ERGODICITY
For an ergodic process, time averages computed from a single sample function of the process equal the corresponding ensemble averages, so statistical parameters such as the mean can be estimated from one long observation.
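A minimal illustration of ergodicity (Python, my own sketch), assuming i.i.d. Gaussian samples as a stationary, ergodic discrete-time process:

```python
import random

random.seed(5)

# One long sample function of a zero-mean, unit-variance Gaussian process
n = 200_000
sample_function = [random.gauss(0.0, 1.0) for _ in range(n)]

# Time average along this single waveform...
time_average = sum(sample_function) / n
# ...approaches the known ensemble average (the process mean, 0)
print(time_average)
```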
White Noise
The power spectral density of white noise is independent of frequency. We denote the power spectral density of a white noise process as

S_W(f) = N_0/2,

where the factor 1/2 indicates that half the power is associated with positive frequencies and half with negative frequencies.
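As an illustrative Python sketch (my own; the course uses MATLAB), the flat spectrum of discrete-time white noise shows up in the time domain as an autocorrelation that is essentially zero at every nonzero lag:

```python
import random

random.seed(6)

# Discrete-time white noise: uncorrelated zero-mean Gaussian samples.
# Its estimated autocorrelation is large at lag 0 (the noise power) and
# near zero elsewhere -- the time-domain counterpart of a flat PSD.
n = 100_000
w = [random.gauss(0.0, 1.0) for _ in range(n)]

def autocorr(lag):
    return sum(w[i] * w[i + lag] for i in range(n - lag)) / (n - lag)

print(autocorr(0))   # close to the noise power, 1.0
print(autocorr(5))   # close to 0
```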

Narrowband Noise
