INF563 Introduction to Information Theory
Lectures
 January 10, Entropy, typical sequences
Lecture 0, Lecture 1.
 January 17, Memoryless source coding
Lecture 2.
 January 24, Memoryless source coding: Huffman code, Shannon-Fano-Elias code, Shannon code, and arithmetic coding
Lecture 3.
 January 31, Adaptive Huffman coding, universal source coding
Lecture 4.
 February 7, Stationary sources, typical sequences, AEP
Lecture 5.
 February 14, Channel coding, capacity, Shannon's second theorem
Lecture 6.
 February 27, Linear codes, Hamming and Reed-Solomon codes, decoding, concatenated codes
Lecture 7.
 March 6, Polar codes
Lecture 8.
 March 12, Other applications of information theory: distributed data storage
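As a taste of the opening lectures, here is a minimal illustrative sketch (in Python, not part of the official course materials) computing the entropy of a memoryless source, H(X) = -sum p log2 p:

```python
import math

def entropy(probs):
    """Shannon entropy of a probability distribution, in bits.

    Terms with p = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy;
# a biased coin carries strictly less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))
```

The second value, about 0.47 bits, is the kind of quantity that lower-bounds the average code length in the source-coding results covered in the first lectures.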
TD
 TD1 Exercises on entropy.
Solutions.
 TD2 Huffman coding.
 TD3 Arithmetic coding.
 TD4 Lempel-Ziv coding.
 TD5 Entropy of English and automatic generation of English.
 TD6 A first example of an error-correcting code: the Nordstrom-Robinson code.
 TD7 A second example of an error-correcting code: a concatenated code.
 TD8 Polar codes.
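For TD2, a compact sketch of Huffman's algorithm (again illustrative Python, not the official TD code) using a min-heap of subtrees; the integer counter is only a tie-breaker so that heap comparisons never reach the dict:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a dict {symbol: frequency}.

    Assumes at least two symbols. Returns {symbol: codeword}.
    """
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, prefixing 0/1.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Dyadic probabilities: codeword lengths match -log2 p exactly.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)
```

For this dyadic source the code lengths are 1, 2, 3, 3, and the Kraft sum equals 1, so the average length attains the entropy.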
Project
The list of projects can be found here (HTML version) and here (PDF version).
Bibliography
 T. Cover and J. Thomas, "Elements of Information Theory", Wiley Series in Telecommunications, 1991.
 S. Roman, "Coding and Information Theory", Graduate Texts in Mathematics, Springer-Verlag, New York, 1992.