Information Theory and Coding
Information theory: information and entropy, information rate, classification of codes, Kraft-McMillan inequality, source coding theorem, Shannon-Fano coding, Huffman coding, extended Huffman coding, joint and conditional entropies, mutual information, discrete memoryless channels (BSC, BEC), channel capacity, Shannon limit.
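Several of the listed topics (entropy, Huffman coding, the source coding theorem) can be illustrated together with a small sketch. The snippet below is an illustrative example, not part of the syllabus: it computes the Shannon entropy of a symbol distribution, builds a binary Huffman code, and checks that the average codeword length L satisfies the source coding bound H(X) <= L < H(X) + 1. The sample string "abracadabra" and the helper names are chosen here for demonstration only.

```python
import heapq
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(freqs):
    """Build a binary Huffman code from a {symbol: weight} mapping.

    Heap entries are (weight, tiebreak, {symbol: code}); at each step the
    two lightest subtrees are merged, prefixing their codes with 0 and 1.
    """
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)  # tiebreak so tuples never compare dicts
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"          # example source sequence (assumed here)
freqs = Counter(text)
n = len(text)
code = huffman_code(freqs)

H = entropy([w / n for w in freqs.values()])
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
print(f"H(X) = {H:.3f} bits, average Huffman length = {avg_len:.3f} bits")
```

For this example H(X) is about 2.04 bits/symbol and the Huffman average length is 23/11 ≈ 2.09 bits/symbol, consistent with the source coding theorem; the resulting code is prefix-free, so it also satisfies the Kraft-McMillan inequality with equality.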