Information Theory and Inference

Core topics:

* Entropy, KL divergence, and mutual information
* Asymptotic equipartition property and typical sets
* Data compression: prefix-free codes, the Shannon code, the Huffman code
* Universal codes
* Channel capacity and the channel coding theorem
* Feedback capacity
* Source-channel separation theorem
* Differential entropy and the continuous asymptotic equipartition property
* The Gaussian channel and parallel Gaussian channels

The following topics may be covered to varying degrees, depending on time and student interest:

* Image and video compression
* Rate-distortion theory
* Fisher information and information geometry
* Kolmogorov complexity
* Network information theory
* Portfolio theory
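As an illustration of how the core topics fit together, the sketch below (not part of the official program; a minimal Python example) computes the Shannon entropy of a source and the expected length of an optimal Huffman code, showing the source coding theorem's bound H(X) ≤ L < H(X) + 1 in action:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) prefix-free code."""
    # Heap entries: (probability, unique tiebreaker id, symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge adds one bit to these symbols' codewords
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]   # a dyadic distribution
H = entropy(probs)
L = sum(p, l in ()) if False else sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)  # → 1.75 1.75 (for dyadic probabilities, Huffman achieves L = H exactly)
```

For non-dyadic distributions L exceeds H, but the source coding theorem guarantees it stays strictly below H + 1.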


Reference:
Thomas M. Cover and Joy A. Thomas. 2006. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing). Wiley-Interscience, USA.

* Standard program; the instructor may adjust topics and emphasis as needed.