Information Theory Assignment: Talib Abbas (TC-005)
Noiseless Channel
A noiseless channel has either the same number of output symbols as input symbols or more, and there is no noise, ambiguity, or uncertainty about which input caused a given output. The channel matrix of such a channel has exactly one non-zero entry in each column, so the input is known with certainty once the output is observed and H(X|Y) = 0. Since the amount of information sent through the channel is the same as the amount received, the mutual information is I(X;Y) = H(X).
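To make this concrete, here is a minimal Python sketch (the channel matrix, input distribution, and helper names entropy and mutual_information are illustrative assumptions, not taken from the assignment) that checks numerically that I(X;Y) equals H(X) for a noiseless channel:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P):
    """I(X;Y) = H(Y) - H(Y|X) for input distribution p_x and channel matrix P(y|x)."""
    p_y = p_x @ P                                   # output distribution
    h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in P]))
    return entropy(p_y) - h_y_given_x

# Noiseless channel: 2 inputs, 3 outputs, exactly one non-zero entry per column.
P = np.array([[0.7, 0.3, 0.0],   # x1 can only produce y1 or y2
              [0.0, 0.0, 1.0]])  # x2 can only produce y3
p_x = np.array([0.4, 0.6])

print(mutual_information(p_x, P))  # ~0.971 bits
print(entropy(p_x))                # H(X) is the same ~0.971 bits
```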
BSC and BEC Channel Capacity
The capacity of the binary symmetric channel (BSC) is C = 1 − H_b(p), where H_b(p) is the binary entropy function and p is the crossover probability. The capacity of the binary erasure channel (BEC) is C = 1 − p, where p is the erasure probability.
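As a quick numerical check, a minimal Python sketch of both formulas (the function names and the example probabilities are illustrative):

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(eps):
    """Capacity of a binary erasure channel with erasure probability eps."""
    return 1.0 - eps

print(bsc_capacity(0.11))  # ~0.5 bits per channel use
print(bec_capacity(0.5))   # 0.5 bits per channel use
```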
Hamming Distance
The Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different. Put another way, it measures the minimum number of substitutions required to change one string into the other, or the number of errors that transformed one string into the other.
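A short Python sketch of this definition (the function name is illustrative; "karolin"/"kathrin" is a commonly used example pair):

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must be of equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("karolin", "kathrin"))  # 3
print(hamming_distance("1011101", "1001001"))  # 2
```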
Deterministic Channel
A deterministic channel has either the same number of input symbols as output symbols or more, and we can determine which output symbol will be received when a particular input symbol is transmitted. The channel matrix of such a channel has exactly one non-zero entry, equal to 1, in each row, so the output is known with certainty once the input is given and H(Y|X) = 0. Since the amount of information provided by the channel is the same as the information produced by the channel output, the mutual information is I(X;Y) = H(Y).
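In the same spirit as the noiseless-channel sketch above, a small Python example (the channel matrix and input distribution are illustrative) checking that I(X;Y) = H(Y) for a deterministic channel:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Deterministic channel: 3 inputs, 2 outputs, each row has a single 1.
P = np.array([[1.0, 0.0],   # x1 always produces y1
              [1.0, 0.0],   # x2 always produces y1
              [0.0, 1.0]])  # x3 always produces y2
p_x = np.array([0.2, 0.3, 0.5])

p_y = p_x @ P            # output distribution: [0.5, 0.5]
h_y_given_x = 0.0        # every row is deterministic, so H(Y|X) = 0
print(entropy(p_y) - h_y_given_x)  # I(X;Y) = H(Y) = 1 bit
```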
Citation
Roberto Togneri and Christopher J. S. deSilva, Fundamentals of Information Theory and Coding Design, ISBN 1-58488-310-3.
Wikipedia, "Hamming distance".
Maximum Channel Capacity
The maximum average mutual information, I(X;Y), in any single use of a channel defines the channel capacity C. Mathematically, the channel capacity is defined as:

C = max I(X;Y)

where the maximum is taken over all input probability distributions {p(x_j)}, i.e. subject to p(x_j) ≥ 0 and Σ_j p(x_j) = 1.
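As a rough illustration of this maximization, the sketch below performs a simple grid search over input distributions of a two-input channel (the function names and grid resolution are arbitrary choices, not part of the assignment); for a BSC with crossover probability 0.1 it should recover C = 1 − H_b(0.1) ≈ 0.531 bits per channel use:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_x, P):
    """I(X;Y) = H(Y) - H(Y|X) for input distribution p_x and channel matrix P(y|x)."""
    p_y = p_x @ P
    h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in P]))
    return entropy(p_y) - h_y_given_x

def capacity_by_search(P, steps=10001):
    """Approximate C = max over p(x) of I(X;Y) for a 2-input channel via grid search on q = P(x1)."""
    return max(mutual_information(np.array([q, 1.0 - q]), P)
               for q in np.linspace(0.0, 1.0, steps))

# BSC with crossover probability 0.1: capacity is 1 - H_b(0.1), about 0.531 bits.
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])
print(capacity_by_search(bsc))
```

For the BSC the maximum occurs at the uniform input distribution (q = 0.5), which is why the grid search and the closed-form expression 1 − H_b(p) agree.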