
Information Theory Assignment

By

Talib Abbas (TC-005)

Noiseless Channel

A noiseless channel has either the same number of output symbols as input symbols or more, and there is no noise, ambiguity, or uncertainty about which input caused a given output. The channel matrix of such a channel has only one nonzero value in each column. Since the amount of information sent through the channel is the same as the amount received, H(X|Y) = 0 and the mutual information is

I(X;Y) = H(X)
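To make this concrete, here is a minimal Python sketch (not part of the original assignment) that checks the identity numerically; the channel matrix P, the input distribution p_x, and the helper functions below are hypothetical choices made only for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P):
    """I(X;Y) in bits for input distribution p_x and channel matrix P,
    where P[i, j] = P(y_j | x_i)."""
    p_xy = p_x[:, None] * P           # joint distribution P(x, y)
    p_y = p_xy.sum(axis=0)            # output distribution P(y)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask]))

# Toy noiseless channel: 2 inputs, 3 outputs, one nonzero entry per column,
# so the output always identifies the input (hypothetical example matrix).
P = np.array([[0.7, 0.3, 0.0],
              [0.0, 0.0, 1.0]])
p_x = np.array([0.4, 0.6])

print(mutual_information(p_x, P))   # ≈ 0.971 bits
print(entropy(p_x))                 # ≈ 0.971 bits, i.e. I(X;Y) = H(X)
```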

BSC and BEC Channel Capacity

The capacity of the BSC is C = 1 - H(p), where p is the crossover (error) probability and H(p) = -p log2(p) - (1 - p) log2(1 - p) is the binary entropy function. The capacity of the BEC is C = 1 - α, where α is the erasure probability.
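As an illustration, the following short Python sketch evaluates both formulas; the probability values used (p = 0.1 and α = 0.1) are hypothetical and chosen only as examples.

```python
import numpy as np

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(alpha):
    """Capacity of a binary erasure channel with erasure probability alpha."""
    return 1.0 - alpha

print(bsc_capacity(0.1))   # ≈ 0.531 bits per channel use
print(bec_capacity(0.1))   # 0.9 bits per channel use
```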

Hamming Distance

The Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different. Put another way, it measures the minimum number of substitutions required to change one string into the other, or the minimum number of errors that could have transformed one string into the other.
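A straightforward Python sketch of this definition; the function name and the example strings are chosen only for illustration.

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of positions at which the corresponding symbols differ."""
    if len(a) != len(b):
        raise ValueError("Hamming distance is defined only for equal-length strings")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("karolin", "kathrin"))   # 3
print(hamming_distance("1011101", "1001001"))   # 2
```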

Deterministic Channel

A deterministic channel has either the same number of input symbols as output symbols or more, and we can determine exactly which output symbol will be received when a particular input symbol is transmitted. The channel matrix of such a channel has a single nonzero value (equal to 1) in each row, so H(Y|X) = 0. Since the information conveyed by the channel is the same as the information produced at the channel output, the mutual information is

I(X;Y) = H(Y)
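For comparison with the noiseless case, a similar sketch (again using a hypothetical channel matrix and input distribution chosen only for illustration) verifies that a channel matrix with a single 1 in each row gives I(X;Y) = H(Y).

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy deterministic channel: 3 inputs, 2 outputs, a single 1 in each row,
# so the output is fully determined by the input (hypothetical example matrix).
P = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
p_x = np.array([0.2, 0.3, 0.5])

p_xy = p_x[:, None] * P            # joint distribution P(x, y)
p_y = p_xy.sum(axis=0)             # output distribution P(y) = [0.5, 0.5]
mask = p_xy > 0
I = np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask]))

print(I)             # 1.0 bit
print(entropy(p_y))  # 1.0 bit, i.e. I(X;Y) = H(Y)
```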

Maximum Channel Capacity

The maximum average mutual information, I(X;Y), in any single use of the channel defines the channel capacity. Mathematically, the channel capacity C is defined as

C = max_{p(x)} I(X;Y)

where the maximum is taken over all possible input probability distributions {p(x_j), j = 1, 2, ..., |X|}, and |X| is the number of input symbols.
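The following sketch illustrates this definition by brute-force searching over input distributions for a BSC with a hypothetical crossover probability p = 0.1; the maximum found agrees with the closed-form capacity 1 - H(p), attained at the uniform input.

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X;Y) in bits for input distribution p_x and channel matrix P[i, j] = P(y_j | x_i)."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask]))

# BSC with crossover probability p = 0.1 (hypothetical value for illustration)
p = 0.1
P = np.array([[1 - p, p],
              [p, 1 - p]])

# Brute-force the maximization over input distributions [q, 1 - q]
qs = np.linspace(0.001, 0.999, 999)
capacities = [mutual_information(np.array([q, 1 - q]), P) for q in qs]
C = max(capacities)
q_best = qs[int(np.argmax(capacities))]

print(C)        # ≈ 0.531, i.e. 1 - H(0.1)
print(q_best)   # ≈ 0.5: the uniform input achieves capacity for the BSC
```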

Citation

Roberto Togneri and Christopher J. S. deSilva, Fundamentals of Information Theory and Coding Design, ISBN 1-58488-310-3.
Wikipedia, "Hamming distance".
EngrMicroLectures YouTube channel, "Hamming Distance".
