
Information Theory
Spring 2007


  • Class Evaluation: The class evaluation will be online between June 18 and June 29. All students are strongly encouraged to participate. Thanks!
  • Final Grade: You will receive the final grade of this course together with your results from the final exam (the final grade will be shown next to the grade of the final exam). The final exams will be returned and discussed on June 25 during normal class hours. If you cannot make it, contact the Teaching Assistant, Lin Ding-Jie.
  • Final Exam: The final exam will take place on
    • Wednesday, June 20, 15:40-18:30 (Note that this is one hour longer than usual!)
    • open book: any book is allowed
    • not allowed: telecommunication devices such as mobile phones, and any outside help (from a "friend" or otherwise)
    • covered material: everything we covered in class
  • Mid-Term Exam: The mid-term exam will take place on
    • Wednesday, April 25, 15:40-18:30 (Note that this is one hour longer than usual!)
    • open book: any book is allowed
    • not allowed: telecommunication devices such as mobile phones, and any outside help (from a "friend" or otherwise)
    • covered material: everything covered in class until (and including) April 9, i.e., everything up to the definition of information capacity, and strongly and weakly symmetric DMCs


Stefan M. Moser
Engineering Building IV, Room 727
phone: 03-571 21 21 ext. 54548

Teaching Assistant

In case you would like to discuss some questions in Chinese, you may contact my TA, Lin Ding-Jie:

Room: Engineering Building IV, Room 711 (ED711)
Phone: 03-571 21 21 ext. 54628

Time and Place

The course is scheduled for 4 hours per week:

  • Monday, 15:40-17:30, Engineering Building IV, Room 021 (ED021)
  • Wednesday, 15:40-17:30, Engineering Building IV, Room 021 (ED021)

On Wednesday, the second hour is reserved for exercises. The mid-term and final exams will be 3 hours.

Course Objectives

This course is an introduction to information theory. We will cover the most important results concerning data compression and reliable communication over a noisy channel. The course will roughly follow this schedule:

  • Introduction and basic definitions:
    • entropy
    • mutual information
    • relative entropy
  • Source coding: how to compress data efficiently?
    • Kraft inequality
    • source coding theorem for a single random variable
    • Shannon-Fano codes
    • Huffman codes
    • source coding theorem for a discrete memoryless source
    • Tunstall codes
    • universal codes
  • Channel coding: how to transmit data reliably?
    • Fano's inequality and data processing lemma
    • converse to channel coding theorem for discrete memoryless channel
    • AEP and typical sequences
    • channel coding theorem
    • continuous random variables and entropy
    • channel coding theorem for the AWGN channel

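To give a small taste of the source-coding part, the following sketch computes the entropy of a discrete distribution and the average length of a binary Huffman code for it. By the source coding theorem for a single random variable, the average codeword length L satisfies H ≤ L < H + 1, and the Huffman lengths satisfy the Kraft inequality with equality. (This is only an illustrative sketch, not part of the official course material; the example distribution is made up.)

```python
import heapq
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap entries: (probability, tie-breaker, symbol indices in this subtree).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    counter = len(p)
    while len(heap) > 1:
        # Merge the two least likely subtrees; every symbol inside them
        # moves one level deeper, i.e., its codeword grows by one bit.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

p = [0.4, 0.3, 0.2, 0.1]           # hypothetical example distribution
H = entropy(p)
lengths = huffman_lengths(p)
L = sum(pi * li for pi, li in zip(p, lengths))
assert H <= L < H + 1              # source coding theorem
```

Here H ≈ 1.846 bits and the Huffman code achieves L = 1.9 bits per symbol, comfortably inside the guaranteed interval.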
We hope that a student who finishes the course will be able to understand the basic principles underlying any communication or data storage system.
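Similarly, for the channel-coding part: the binary symmetric channel, a standard example of a symmetric DMC covered in class, has information capacity C = 1 − H_b(ε), where H_b is the binary entropy function and ε the crossover probability. A minimal sketch (illustrative only, not course material):

```python
import math

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps."""
    return 1.0 - binary_entropy(eps)
```

A noiseless channel (ε = 0) has capacity 1 bit per use, while at ε = 0.5 the output is independent of the input and the capacity drops to 0.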


Prerequisites

The following lectures/topics are recommended:

  • Probability
  • once more Probability
  • Principles of Communication Engineering I and II (preferably)
  • joy in math and engineering


Textbook

The course will be mainly based on

  • Thomas M. Cover and Joy A. Thomas: "Elements of Information Theory," second edition, Wiley, 2006.

An electronic version of the book is available online. Occasionally, additional lecture notes will be distributed.

Further references and recommended readings:

  • Robert G. Gallager: "Information Theory and Reliable Communication," Wiley, 1968.
  • Raymond W. Yeung: "A First Course in Information Theory," Kluwer Academic Publishers, 2005.
  • James L. Massey: "Applied Digital Information Theory I and II," lecture notes, Swiss Federal Institute of Technology (ETH), Zurich.


Exercises

Every week there will be an exercise sheet with a few problems to be solved at home. Solving these exercises is highly recommended, both for understanding the course and as preparation for the mid-term and final exams. Since the material of this course is rather demanding by itself, we have decided not to burden the students with additional tasks such as paper presentations. We hope that the time saved will instead be spent solving the exercises and reading the textbook!

Grading

The final grade will be a weighted average of

  • the homework (15%),
  • the midterm exam (35%), and
  • the final exam (50%).

The homework grade will be based not on the correctness of the answers, but on the effort the student shows in trying to solve them. To pass the course, there is the additional condition that at least 10 exercise sheets must be handed in.

This course is worth 3 credits.

Time Table

Date Topic Handouts Exercise (due on) Solutions Comments
26 Feb. Introduction, measure of information, IT-inequality, entropy Syllabus (Version 2) Exercise 1 (5 Mar.)   Chapter 2; Problem 6 of Exercise 1 is canceled
28 Feb. No lecture        
5 Mar. Conditional entropy, relative entropy, mutual information, Jensen's inequality, data compression Handout 1 Exercise 2 (12 Mar.)   Chapter 2
7 Mar. Rooted trees, leaf-counting/leaf-depth lemma, Kraft inequality Handout 2   Solutions 1 Handout 2
12 Mar. Path length lemma, leaf-entropy theorem, coding theorem for a single RV, Shannon-Fano codes, types of codes, McMillan   Exercise 3 (19 Mar.)   Handout 2, Chapter 5
14 Mar. Huffman Codes, coding of an information source Handout 3   Solutions 2 Handout 2, Chapter 5
19 Mar. Coding of an information source, block-to-variable-length coding, variable-length-to-block coding, Tunstall coding Handout 4 Exercise 4 (26 Mar.)   Handout 4
21 Mar. Stochastic processes, stationarity, Markov processes, entropy rate     Solutions 3 Chapter 4
26 Mar. Entropy rate, Cesaro mean, coding of sources with memory Handout 5 Exercise 5 (2 Apr.)   Chapter 4, Handout 5
28 Mar. Elias-Willems universal coding scheme     Solutions 4 Handout 5
2 Apr. AEP, typical sets, transmission over a noisy channel   Exercise 6 (9 Apr.)   Chapter 3
4 Apr. No lecture     -----  
9 Apr. Discrete memoryless channel (DMC), information capacity   Exercise 7 (18 Apr.) Solutions 5 Chapter 7
11 Apr. Coding for a DMC, Fano's lemma, data processing inequality, converse of coding theorem for a DMC     Solutions 6 Chapter 7
16 Apr. No lecture   -----    
18 Apr. Jointly typical sequences, coding theorem for a DMC   Exercise 8 (30 Apr.) Solutions 7 Chapter 7
23 Apr. Coding Theorem for a DMC, convex optimization   -----   Chapter 7, R. Gallager's book
25 Apr. Midterm Exam     ----- ATTENTION: This is a 3 hours exam: 15:40-18:30
30 Apr. Finding channel capacity: convex optimization, Kuhn-Tucker conditions   Exercise 9 (7 May)   R. Gallager's book
2 May Kuhn-Tucker conditions, joint source channel coding theorem     Solutions 8 R. Gallager's book, Chapter 7.13
7 May Joint source channel coding theorem, differential entropy   Exercise 10 (14 May)   Chapter 7.13, Chapter 8
9 May Differential entropy, continuous-alphabet AEP     Solutions 9 Chapter 8
14 May Additive Gaussian noise channel   Exercise 11 (21 May)   Chapter 9
16 May Sampling Theorem     Solutions 10 Chapter 9
21 May Bandlimited Gaussian channel, parallel Gaussian channels   Exercise 12 (28 May)   Chapter 9
23 May Parallel Gaussian channels with colored noise     Solutions 11 Chapter 9
28 May Convolutional codes Handout 6 Exercise 13 (4 June)   Handout 6
30 May Quality of convolutional codes: signal flow graphs     Solutions 12 Handout 6
4 June Quality of convolutional codes: signal flow graphs, Bhattacharyya bound   Exercise 14 (11 June)   Handout 6
6 June Quality of convolutional codes     Solutions 13 Handout 6
11 June Exercises, questions   Exercise 15 (13 June)   Note: the due date of Exercise 14 has been moved to 13 June
13 June Discussion Exercises 14 and 15, questions, general information about final exam     Solutions 14, Solutions 15
18 June No lecture   -----   Class evaluation online until June 29!
20 June Final Exam       ATTENTION: This is a 3 hours exam: 15:40-18:30
25 June Discussion of final exam   -----    
27 June No lecture     -----  

Special Remarks

The lecture will be held in English.

Stefan M. Moser
Senior Researcher & Lecturer, ETH Zurich, Switzerland
Adj. Professor, National Chiao Tung University (NCTU), Taiwan

Last modified: Thu Feb 19 08:17:30 UTC+8 2009