1. Digital Modulation
Digital Communications
Engr. Rose Anne Reano, PUP-STB
Objectives
• Ease of processing
• Ease of multiplexing
• Noise immunity
Digital Communications
• Digital communications covers two broad areas: digital transmission and digital radio.
Forms of Digital Modulation:
1. ASK (Amplitude-Shift Keying)
2. FSK (Frequency-Shift Keying)
3. PSK (Phase-Shift Keying)
4. QAM (Quadrature Amplitude Modulation)
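The first three keying schemes above can be sketched for a short bit stream. This is a minimal illustration, not from the source; the carrier frequency, sample rate, and samples-per-bit values are arbitrary example choices (QAM, which varies amplitude and phase together, is omitted for brevity):

```python
import math

def modulate(bits, scheme, fc=4.0, fs=100.0, samples_per_bit=100):
    """Illustrative sketch: return samples of a carrier keyed by a bit list.

    fc (carrier Hz), fs (sample rate Hz), and samples_per_bit are
    assumed example parameters, not values from the lecture.
    """
    out = []
    for bit in bits:
        for _ in range(samples_per_bit):
            t = len(out) / fs
            if scheme == "ASK":    # carrier on for 1, off for 0
                out.append(bit * math.sin(2 * math.pi * fc * t))
            elif scheme == "FSK":  # bit selects one of two carrier frequencies
                f = fc * 1.5 if bit else fc
                out.append(math.sin(2 * math.pi * f * t))
            elif scheme == "PSK":  # bit flips the carrier phase by 180 degrees
                out.append(math.sin(2 * math.pi * fc * t + math.pi * bit))
            else:
                raise ValueError(f"unknown scheme: {scheme}")
    return out

ask = modulate([1, 0, 1, 1], "ASK")
```

During the 0 bit the ASK output is silent, which is the defining feature of amplitude-shift (on-off) keying.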
Applications of Digital Modulation
• The transmission medium can be a metallic cable, optical fiber cable, Earth’s atmosphere, or a combination of two or more types of transmission systems.
Digital Radio Systems
• In the receiver, the incoming signals are filtered, amplified, and then applied to the demodulator and decoder circuits, which extract the original source information from the modulated carrier.
Digital Radio Systems
• The clock and carrier recovery circuits recover the analog carrier and digital timing (clock) signals from the incoming modulated wave, since both are necessary to perform the demodulation process.
Information Capacity, Bits, Bit Rate, Baud, and M-ary Encoding
Digital Modulation
Information Capacity
• Information theory is a highly theoretical study of the efficient use of bandwidth to propagate information through electronic communications systems.
• Information theory can be used to determine the information capacity of a data communications system.
• Information capacity is a measure of how much information can be propagated through a communications system and is a function of bandwidth and transmission time.
• Information capacity represents the number of independent symbols that can be carried through a system in a given unit of time.
• The most basic digital symbol used to represent information is the binary digit, or bit.
I ∝ B × t

where
I = information capacity (bits per second)
B = bandwidth (hertz)
t = transmission time (seconds)
Hartley’s Law
• If either the bandwidth or the transmission time changes, a directly proportional change occurs in the information capacity.
Hartley’s Law
• The relationship between time, information capacity, and channel bandwidth is given by Hartley’s law:

I = k × t × B

where
I = amount of information to be sent
k = a constant of proportionality for the system (not Boltzmann’s constant)
t = time available (seconds)
B = channel bandwidth (hertz)
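Hartley’s law can be sketched numerically. This is an illustrative check of the proportionality only; k = 1 is an assumed placeholder, since the constant’s actual value is system dependent and not given here:

```python
# Hartley's law: I = k * t * B -- information is proportional to
# bandwidth times transmission time.
def hartley_information(bandwidth_hz, time_s, k=1.0):
    """k is a system-dependent constant of proportionality (assumed 1 here)."""
    return k * time_s * bandwidth_hz

# Doubling either bandwidth or time doubles the information capacity.
base = hartley_information(2700, 1)          # 2.7 kHz channel for 1 second
double_bw = hartley_information(5400, 1)
double_time = hartley_information(2700, 2)
```

The directly proportional relationship stated above means `double_bw` and `double_time` both come out to exactly twice `base`.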
Shannon Limit
• In 1948, mathematician Claude E. Shannon (also of Bell Telephone Laboratories) published a paper in the Bell System Technical Journal relating the information capacity of a communications channel to bandwidth and signal-to-noise ratio.
• The higher the signal-to-noise ratio, the better the performance and the higher the information capacity.
• The Shannon limit for information capacity is

I = B log2(1 + S/N)

where
I = information capacity (bits per second)
B = bandwidth (hertz)
S/N = signal-to-noise power ratio (unitless)
Example
• For a standard telephone circuit with a signal-to-noise power ratio of 1000 and a bandwidth of 2.7 kHz, what is the Shannon limit for information capacity?
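The example above can be worked numerically from the Shannon formula I = B log2(1 + S/N); a short sketch:

```python
import math

# Shannon limit: I = B * log2(1 + S/N)
def shannon_limit(bandwidth_hz, snr):
    """Information capacity in bits per second for a given bandwidth and
    signal-to-noise power ratio (snr is a unitless power ratio)."""
    return bandwidth_hz * math.log2(1 + snr)

# Standard telephone circuit: S/N = 1000, B = 2.7 kHz
capacity = shannon_limit(2700, 1000)   # roughly 26.9 kbps
```

With S/N = 1000, log2(1001) is just under 10, so the capacity is a little under 10 bits per second per hertz of bandwidth.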
M-ary Encoding
• M-ary is a term derived from the word binary; M represents the number of conditions, levels, or combinations possible for a given number of binary variables.

N = log2 M

where
N = number of bits necessary
M = number of conditions, levels, or combinations possible with N bits
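The relation N = log2 M can be checked directly; a minimal sketch:

```python
import math

# N = log2(M): bits needed to represent one of M conditions or levels
def bits_per_symbol(M):
    return math.log2(M)

# Binary (M = 2) carries 1 bit per symbol; a 4-level scheme carries 2;
# a 16-level scheme (e.g. 16-QAM) carries 4.
n_binary = bits_per_symbol(2)
n_quaternary = bits_per_symbol(4)
n_16 = bits_per_symbol(16)
```

Each doubling of the number of levels M adds exactly one bit per transmitted symbol.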