SC 11
σ_d² = (1/N) Σ_{n=1}^{N} (x_n − y_n)²

SNR = σ_x² / σ_d²

SNR (in dB) = 10 log₁₀ (σ_x² / σ_d²)

PSNR (in dB) = 10 log₁₀ (x_peak² / σ_d²)
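A minimal sketch of these fidelity measures in Python, using hypothetical 4-bit sample values (the arrays `x` and `y` are illustrative, not from the slides):

```python
import math

def mse(x, y):
    """Mean squared error sigma_d^2 between source x and reconstruction y."""
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)

def snr_db(x, y):
    """SNR in dB: average source power over average squared-error power."""
    sigma_x2 = sum(a * a for a in x) / len(x)
    return 10 * math.log10(sigma_x2 / mse(x, y))

def psnr_db(x, y, x_peak):
    """PSNR in dB: peak signal value squared over the MSE."""
    return 10 * math.log10(x_peak ** 2 / mse(x, y))

x = [12, 7, 3, 15, 9, 0, 6, 11]   # hypothetical 4-bit source samples
y = [12, 6, 2, 14, 8, 0, 6, 10]   # a noisy reconstruction
print(mse(x, y))             # 0.625
print(snr_db(x, y))          # ~21.24 dB
print(psnr_db(x, y, 15))     # ~25.56 dB
```

Note that PSNR exceeds SNR whenever the peak value squared exceeds the average signal power, which is always the case for a non-constant signal.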
Example
• Let X = {0, 1, 2, …, 15} (Source Alphabet)
• Compress by dropping the LSB
• Reconstruct by inserting 0 as the LSB
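The example above can be sketched directly: dropping the LSB is a right shift (16 symbols → 8 codewords), and inserting 0 as the LSB is a left shift, so every odd sample is reconstructed one level too low.

```python
# 4-bit source alphabet {0, ..., 15}
source = list(range(16))

compressed = [x >> 1 for x in source]         # drop LSB: 3-bit codewords
reconstructed = [c << 1 for c in compressed]  # insert 0 as LSB: even values only

# Even samples are exact; every odd sample is off by exactly 1.
errors = [x - y for x, y in zip(source, reconstructed)]
mse = sum(e * e for e in errors) / len(errors)
print(reconstructed)  # each even value appears twice
print(mse)            # 0.5
```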
Review of:
Conditional Entropy
Mutual Information
• Let X be a random variable that takes values
from the source alphabet
X = {x0, x1, …, xN−1}
• Let Y be a random variable that takes on values
from the reconstruction alphabet
Y = {y0, y1, …, yM−1}
• A measure of the relationship between two
random variables is the conditional entropy (the
average value of the conditional self-information).
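A minimal sketch of conditional entropy and mutual information computed from a joint pmf; the toy distribution `joint` is an illustrative assumption, not from the slides:

```python
import math

def conditional_entropy(joint):
    """H(X|Y) = -sum_{x,y} P(x,y) log2 P(x|y), the average value of the
    conditional self-information, from a joint pmf {(x, y): P(x, y)}."""
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / p_y[y])   # P(x|y) = P(x,y) / P(y)
    return h

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Toy binary pmf: X and Y agree with probability 0.8
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {0: 0.5, 1: 0.5}
print(conditional_entropy(joint))                  # ~0.722 bits, H(X|Y)
print(entropy(p_x) - conditional_entropy(joint))   # ~0.278 bits, I(X;Y)
```

Knowing Y reduces the uncertainty about X from 1 bit to about 0.722 bits; the reduction is the mutual information I(X;Y) = H(X) − H(X|Y).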
Previous example
Our uncertainty about the source output is 1 bit.
Continuous Amplitude Sources
• Differential Entropy
• The entropy of a continuous random variable X with probability density function (pdf) fX(x) is the differential entropy
h(X) = −∫ fX(x) log fX(x) dx
• While the random variable X cannot generally
take on a particular value with nonzero
probability, it can take on a value in an interval
with nonzero probability. Therefore, let us
divide the range of the random variable into
intervals of size Δ. Then, by the mean value
theorem, in each interval [(i − 1)Δ, iΔ), there
exists a number xi, such that
fX(xi) Δ = ∫ from (i−1)Δ to iΔ of fX(x) dx
Let us define the random variable Yd in a
manner similar to the random variable Xd,
as the discretized version of a continuous
valued random variable Y
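The standard derivation this discretization leads to can be sketched as follows (a reconstruction of the usual argument, under the assumption that Xd takes the value xi whenever X falls in the i-th interval):

```latex
H(X_d) = -\sum_i f_X(x_i)\Delta \,\log\bigl(f_X(x_i)\Delta\bigr)
       = -\sum_i f_X(x_i)\Delta \,\log f_X(x_i) \;-\; \log \Delta
```

As Δ → 0 the first sum tends to the differential entropy h(X) = −∫ fX(x) log fX(x) dx, while −log Δ grows without bound; so the absolute entropy of a continuous source is infinite, and h(X) captures only the part that depends on the pdf.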
• H(X ⊕ Y) is Hb(D), since X ⊕ Y is a binary random variable that takes the value 1 with probability D
• Hence R(D) = Hb(p) − Hb(D) for D ≤ min(p, 1 − p), and R(D) = 0 otherwise
R-D Curve for the Binary Source
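A minimal sketch of this curve in Python, assuming Hamming distortion and P(X = 1) = p:

```python
import math

def hb(p):
    """Binary entropy function H_b(p) in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_rd(p, d):
    """Rate distortion function of a binary source under Hamming
    distortion: R(D) = H_b(p) - H_b(D) for D <= min(p, 1-p), else 0."""
    if d >= min(p, 1 - p):
        return 0.0
    return hb(p) - hb(d)

print(binary_rd(0.5, 0.0))   # 1.0 bit: lossless coding of a fair coin
print(binary_rd(0.5, 0.1))   # ~0.531 bits
print(binary_rd(0.5, 0.5))   # 0.0: guessing meets the distortion target
```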
Rate distortion function for the
Gaussian source
In order to minimize the right-hand side of the above equation, we have to
maximize the second term subject to the constraint given by Equation (7.67).
This term is maximized if X - Y is Gaussian, and the constraint can be
satisfied if
E [(X - Y)2] = D.
Therefore, h(X − Y) is the differential entropy of a Gaussian random variable with
variance D, and the lower bound becomes
I(X;Y) ≥ (1/2) log (σ²/D)
If D = σ², I(X;Y) = 0.
Hence
R(D) = (1/2) log (σ²/D) for D ≤ σ²
R(D) = 0 for D > σ²
[Block diagram: the reconstruction Y is subtracted from the source X to form the difference X − Y]
D(R) = min over {p(yj|xi) : I(X;Y) ≤ R} of D

D(R) = σ² 2^(−2R) for R > 0
D(R) = σ² for R = 0
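A short sketch showing that the Gaussian R(D) and D(R) functions above are inverses of each other (σ² = 4 is an illustrative value):

```python
import math

def gaussian_rd(sigma2, d):
    """R(D) = 0.5 * log2(sigma^2 / D) for D < sigma^2, else 0."""
    return 0.5 * math.log2(sigma2 / d) if d < sigma2 else 0.0

def gaussian_dr(sigma2, r):
    """Distortion-rate function D(R) = sigma^2 * 2^(-2R)."""
    return sigma2 * 2 ** (-2 * r)

sigma2 = 4.0
r = gaussian_rd(sigma2, 1.0)    # rate needed for distortion D = 1
print(r)                        # 1.0 bit per sample
print(gaussian_dr(sigma2, r))   # 1.0: D(R) recovers the target distortion
```

The 2^(−2R) factor means each additional bit of rate cuts the distortion by a factor of 4, i.e. a 6 dB gain in SNR per bit.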
Properties of Rate Distortion Function
• R(D) is non-negative since mutual information I(X;Y) is
non-negative.
• R(D) is non-increasing in D.
• For each D ∈ [0, Dmax] there is one and only one relative
minimum of average mutual information and this
minimum value is R(D) and it always occurs at a point
where the average distortion is D.
A typical rate distortion curve for a discrete memoryless
source and single letter distortion measure is shown in
Figure 1.
[Figure 1: typical rate distortion curve. R(D) falls from H at D = 0 to zero at D = Dmax.]
Probability models used in Lossy Compression