Information Theory and Coding
Sources - memoryless and Markov; Information; Entropy; Extended sources; Shannon's noiseless coding theorem; Source coding; Mutual information; Channel capacity; BSC and other channels; Shannon's channel capacity theorem; Continuous channels; Comparison of communication systems based on Information Theory; Channel coding - block and convolutional codes; Majority logic decoding of block codes; Viterbi decoding algorithm; Coding gains and performance.
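Two of the quantities listed above, the entropy of a memoryless source and the capacity of the binary symmetric channel, can be illustrated in a few lines. The sketch below uses the standard definitions H(X) = -Σ p·log₂(p) and C_BSC = 1 - H(p); the function names are illustrative, not part of the syllabus.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits/symbol of a memoryless source
    with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - entropy([p, 1 - p])

# An equiprobable binary source attains the maximum entropy of 1 bit/symbol.
print(entropy([0.5, 0.5]))    # 1.0
# A noiseless BSC (p = 0) carries 1 bit/use; at p = 0.5 the capacity is 0.
print(bsc_capacity(0.0))      # 1.0
print(bsc_capacity(0.5))      # 0.0
```

These closed-form expressions anticipate the two Shannon theorems in the list: the noiseless coding theorem bounds compression by the source entropy, and the channel capacity theorem bounds reliable transmission by the channel capacity.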
