INF563 Introduction to Information Theory
- January 10, Entropy, typical sequences (Lecture 0, Lecture 1)
- January 17, Memoryless source coding
- January 24, Memoryless source coding, Huffman code, Shannon-Fano-Elias code, Shannon code and arithmetic coding
- January 31, Adaptive Huffman coding, universal coding of a source
- February 7, Stationary source, typical sequences, AEP
- February 14, Channel coding, capacity, Shannon's second theorem
- February 27, Linear codes, Hamming and Reed-Solomon codes, decoding, concatenated codes
- March 6, Polar codes
- March 12, Other applications of information theory: distributed
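As a small illustration of the first lecture's central quantity, here is a minimal sketch of Shannon entropy in Python (the function name and example distributions are illustrative, not course material):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))
```

Entropy is maximal for the uniform distribution and drops as the distribution concentrates, which is why the biased coin prints a value below 1.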
- TD1 Exercises on entropy (solutions).
- TD2 Huffman coding.
- TD3 Arithmetic coding.
- TD4 Lempel Ziv coding.
- TD5 Entropy of English and automatic generation of English.
- TD6 A first example of an error-correcting code: the Nordstrom-Robinson code.
- TD7 A second example of an error-correcting code: a concatenated code.
- TD8 Polar codes.
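The Huffman construction covered in TD2 can be sketched in a few lines of Python; this compact variant merges codeword dictionaries instead of building an explicit tree, and is a sketch rather than the reference implementation used in the course:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bit string) from symbol frequencies."""
    # Each heap entry: (frequency, tiebreak index, partial codeword table).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix the codewords of the two merged subtrees with 0 and 1.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

print(huffman_code(Counter("abracadabra")))
```

The greedy merge of the two least-frequent subtrees at each step yields a prefix-free code of minimal expected length; for "abracadabra" the frequent symbol `a` receives the shortest codeword.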
The list of projects can be found here (HTML version) and here (PDF version).
- T. Cover, J. Thomas, "Elements of Information Theory", Wiley Series in Telecommunications, 1991.
- S. Roman, "Coding and Information Theory", Graduate Texts in Mathematics, Springer-Verlag, New York - Berlin, 1992.