ITC I MID TERM

This document appears to be an exam for a course on Information Theory and Coding. It contains 3 parts - Part A with 5 short answer questions, Part B with 4 longer questions, and Part C with 2 essay questions. The questions cover topics like calculating source entropy, average information content, properties of information, channel models, and mutual information. The exam tests students' understanding of key information theory concepts and their ability to apply equations and solve problems related to entropy, channels, and coding.

Uploaded by

Yadvendra Bedi

RTU ROLL NO…………………….

GLOBAL INSTITUTE OF TECHNOLOGY


FIRST MID TERM EXAMINATION (V Sem.) 2021-22
5CS3-01: Information Theory & Coding
22-10-2021/Friday

Max. Time: 2 Hrs Max. Marks: 100


NOTE:- Attempt all questions.
PART A (5 × 4 = 20 Marks)

(1) Consider a Source X that produces five symbols with probabilities 1/2, 1/4, 1/8, 1/16, 1/16.
Determine the source entropy H(X). (CO1)
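A quick numerical check of Q1 (not part of the original paper; a minimal Python sketch):

```python
from math import log2

# Symbol probabilities given in Part A, Q1
probs = [1/2, 1/4, 1/8, 1/16, 1/16]
assert abs(sum(probs) - 1.0) < 1e-12  # valid probability distribution

# Source entropy: H(X) = -sum(p_i * log2(p_i))
H = -sum(p * log2(p) for p in probs)
print(H)  # 1.875 bits/symbol
```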
(2) Calculate the average information content in the English language, assuming that each of the 26
characters in the alphabet occurs with equal probability. (CO1)
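For Q2, with 26 equiprobable characters the average information per character is simply log2 26; a one-line check (illustrative, not part of the paper):

```python
from math import log2

# 26 equiprobable characters -> H = log2(26) bits per character
H = log2(26)
print(round(H, 2))  # approximately 4.7 bits/character
```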
(3) Verify the additivity property of information, I(xi xj) = I(xi) + I(xj), for statistically independent symbols xi and xj. (CO3)
(4) Differentiate between lossless, deterministic, and noiseless channels. (CO2)
(5) What is meant by entropy? (CO1)

PART B (4 × 10 = 40 Marks)

(1) Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 s, and the dash duration is 3 times the dot duration. The probability of a dot occurring is twice that of a dash, and the time between symbols is 0.2 s. Calculate the information rate of the telegraph source. (CO3)
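Part B, Q1 can be cross-checked numerically. The sketch below (illustrative, not part of the paper) uses the stated values: dot duration 0.2 s, dash duration 0.6 s, inter-symbol gap 0.2 s, and dots twice as likely as dashes:

```python
from math import log2

p_dot, p_dash = 2/3, 1/3                  # dots twice as likely as dashes
t_dot, t_dash, t_gap = 0.2, 0.6, 0.2      # durations in seconds (dash = 3 x dot)

# Entropy per symbol, in bits
H = -(p_dot * log2(p_dot) + p_dash * log2(p_dash))

# Average time per symbol, including the gap after it
T = p_dot * (t_dot + t_gap) + p_dash * (t_dash + t_gap)

R = H / T  # information rate in bits/second
print(H, T, R)  # roughly 0.918 bits/symbol, 0.533 s, 1.72 bits/s
```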
(2) A high-resolution black-and-white TV picture consists of 2 × 10^6 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All picture elements are assumed to be independent, and all levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by the TV picture source. (CO3)
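A numerical check of Part B, Q2 (illustrative, not part of the paper): with equiprobable levels, each element carries log2 16 = 4 bits, so the rate is elements × bits/element × frames/second:

```python
from math import log2

pixels = 2 * 10**6        # picture elements per frame
levels = 16               # equiprobable brightness levels
frames_per_sec = 32       # pictures repeated per second

bits_per_pixel = log2(levels)                 # 4 bits per element
R = pixels * bits_per_pixel * frames_per_sec  # bits per second
print(R)  # 256000000.0, i.e. 2.56e8 bits/s
```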
(3) Consider the binary channel shown in the figure.

(a) Find the channel matrix of the channel.


(b) Find P(y1) and P(y2) when P(x1)=P(x2)=0.5

(c) Find the joint probabilities P(x1, y2) and P(x2, y1) when P(x1) = P(x2) = 0.5. (CO3)
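The channel figure is not reproduced here, so the transition probabilities below (0.9/0.1 and 0.2/0.8) are assumed placeholder values, not the exam's actual figure; this sketch only illustrates the method for parts (a)-(c):

```python
# ASSUMED channel matrix P[i][j] = P(y_j | x_i); the real values are in the
# exam figure, which is not available here.
P = [[0.9, 0.1],
     [0.2, 0.8]]
px = [0.5, 0.5]  # P(x1) = P(x2) = 0.5, as given in parts (b) and (c)

# (b) Output probabilities: P(y_j) = sum_i P(x_i) * P(y_j | x_i)
py = [sum(px[i] * P[i][j] for i in range(2)) for j in range(2)]

# (c) Joint probabilities: P(x_i, y_j) = P(x_i) * P(y_j | x_i)
p_x1_y2 = px[0] * P[0][1]
p_x2_y1 = px[1] * P[1][0]
print(py, p_x1_y2, p_x2_y1)  # [0.55, 0.45], 0.05, 0.1 for the assumed matrix
```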
(4) Consider a noiseless channel with m input symbols and m output symbols. Show that H(X) = H(Y) and H(Y|X) = 0. (CO4)
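A numerical sanity check of Part B, Q4 (illustrative; the input distribution below is an arbitrary assumption): for a noiseless channel the channel matrix is an identity, so each input maps to exactly one output, giving H(X) = H(Y) and H(Y|X) = 0:

```python
from math import log2

m = 3
px = [0.5, 0.3, 0.2]  # assumed input distribution for the check
# Noiseless channel: identity channel matrix, P(y_j | x_i) = 1 iff i == j
P = [[1.0 if i == j else 0.0 for j in range(m)] for i in range(m)]

py = [sum(px[i] * P[i][j] for i in range(m)) for j in range(m)]
Hx = -sum(p * log2(p) for p in px)
Hy = -sum(p * log2(p) for p in py)
# Conditional entropy H(Y|X); terms with P = 0 contribute nothing
Hyx = -sum(px[i] * P[i][j] * log2(P[i][j])
           for i in range(m) for j in range(m) if P[i][j] > 0)
print(Hx, Hy, Hyx)  # Hx == Hy, Hyx == 0
```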
PART C (2 × 20 = 40 Marks)
(1) A binary source emits two symbols, 0 and 1, with probabilities P and (1-P). Find the entropy and the maximum entropy. (CO4)
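A quick check of Part C, Q1 (illustrative, not part of the paper): sweeping the binary entropy function shows it peaks at P = 0.5 with a maximum of 1 bit:

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits; 0 at p = 0 or 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Sweep p over a grid and find where entropy is largest
values = [(p / 100, binary_entropy(p / 100)) for p in range(101)]
p_max, H_max = max(values, key=lambda t: t[1])
print(p_max, H_max)  # 0.5 1.0 -- maximum entropy of 1 bit at P = 0.5
```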

(2) Consider a BSC with P(x1) = α and crossover (error) probability p.

(a) Show that the mutual information I(X;Y) is given by

I(X;Y) = H(Y) + p log2 p + (1-p) log2 (1-p)

(b) Calculate I(X;Y) for α = 0.5 and p = 0.1. (CO4)
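A numerical check of part (b) (illustrative, not part of the paper), using the formula from part (a): with α = 0.5 the BSC output is equiprobable, so H(Y) = 1 bit, and I(X;Y) = H(Y) - H(p):

```python
from math import log2

def binary_entropy(p):
    """H(p) in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

alpha, p = 0.5, 0.1
# BSC output distribution: P(y1) = alpha*(1-p) + (1-alpha)*p
py1 = alpha * (1 - p) + (1 - alpha) * p
Hy = binary_entropy(py1)

# I(X;Y) = H(Y) + p*log2(p) + (1-p)*log2(1-p)  (i.e. H(Y) - H(p))
I = Hy + p * log2(p) + (1 - p) * log2(1 - p)
print(py1, I)  # 0.5, roughly 0.531 bits
```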
