
Information Theory
Fall 2007/2008


News

  • Final Exam: The final exam will take place on
    • Monday, 7 January, 15:40-18:30 (Note that this is one hour longer than usual!)
    Regulations:
    • open book: any book is allowed
    • not allowed: laptops, any telecommunication devices such as mobile phones, and any "friend" or other help from outside...
    • covered material: everything covered in class during the whole semester
  • Mid-Term Exam: The mid-term exam will take place on
    • Monday, 12 November, 15:40-18:30 (Note that this is one hour longer than usual!)
    Regulations:
    • open book: any book is allowed
    • not allowed: any telecommunication devices such as mobile phones, and any "friend" or other help from outside...
    • covered material: everything covered in class until (and including) 29 October, i.e., everything up to weakly and strongly symmetric DMCs
  • Wrong Timetable: Please note that the timetable as previously listed here was wrong. The class takes place on MONDAY and THURSDAY (not Tuesday/Friday)!!
  • Registration: Please note that in the printed version of the course timetable, this course is listed only as a graduate course. However, as usual, this course is open to undergraduate (4th-year), graduate, and Ph.D. students!
  • Attention: This year the Information Theory class is held in autumn and NOT in spring!

Instructor

Stefan M. Moser
Engineering Building IV, Room 727
phone: 03-571 21 21 ext. 54548
e-mail:

Teaching Assistant

If you would like to discuss questions in Chinese, you may contact my TA:

  • Hsuan-Yin Lin
    e-mail:
    Room: Engineering Building IV, Room 711 (ED711)
    Phone: 03-571 21 21 ext. 54628

Time and Place

The course is scheduled for 4 hours per week:

  • Monday, 15:40-17:30, Engineering Building IV, Room 103 (ED103)
  • Thursday, 10:10-12:00, Engineering Building IV, Room 103 (ED103)

On Thursday, the second hour is reserved for exercises. The mid-term and final exams will each last 3 hours.

Office Hours

NCTU requests that every teacher offer two hours per week during which students may come to ask questions:

  • Tuesday, 13:30-15:30, Engineering Building IV, Room 727

However, I would like to encourage you to show up in my office or in the teaching assistant's office at any time if you have questions about the class or related subjects. Moreover, I am always available during and after classes, and particularly during the second hour on Thursday (the "exercise" lecture).

Course Objectives

This course is an introduction to Information Theory. We will cover the most important results concerning data compression and reliable communication over a communication channel. The course will follow approximately the following schedule:

  • Introduction and basic definitions:
    • entropy
    • mutual information
    • relative entropy
  • Source coding: how to compress data efficiently?
    • Kraft inequality
    • source coding theorem for a single random variable
    • Shannon-Fano codes
    • Huffman codes
    • source coding theorem for a discrete memoryless source
    • Tunstall codes
    • universal codes
  • Channel coding: how to transmit data reliably?
    • Fano's inequality and data processing lemma
    • converse to channel coding theorem for discrete memoryless channel
    • AEP and typical sequences
    • channel coding theorem
    • continuous random variables and entropy
    • channel coding theorem for the AWGN channel
    • band-limited AWGN channel
    • convolutional codes on DMCs

It is hoped that a student who finishes the course will be able to understand the basic principles underlying any communication or data storage system.
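To give a small, concrete taste of the source-coding part of the course, the following sketch (an illustration only, not course material) computes the entropy of a simple discrete source and the codeword lengths of a binary Huffman code for it. For this dyadic example the average codeword length E[L] meets the entropy bound H(X) with equality, and the lengths satisfy the Kraft inequality sum 2^(-l_i) <= 1. The function names are my own choices for this sketch.

```python
import heapq
from math import log2

def entropy(p):
    """Entropy H(X) in bits of a probability distribution p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of an optimal binary (D = 2) Huffman code for p."""
    # Heap items: (subtree probability, tie-breaker, symbols in subtree).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    tie = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # the two least likely subtrees ...
        p2, _, s2 = heapq.heappop(heap)  # ... are merged in each step
        for i in s1 + s2:                # every merged symbol moves one level deeper
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

p = [0.5, 0.25, 0.125, 0.125]           # a dyadic source
L = huffman_lengths(p)
avg_len = sum(pi * li for pi, li in zip(p, L))
kraft = sum(2.0 ** -li for li in L)
print(f"H(X) = {entropy(p)} bits, E[L] = {avg_len} bits, Kraft sum = {kraft}")
# prints: H(X) = 1.75 bits, E[L] = 1.75 bits, Kraft sum = 1.0
```

For a non-dyadic source the same code shows the general bound H(X) <= E[L] < H(X) + 1 from the source coding theorem for a single random variable.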

Prerequisites

The following lectures/topics are recommended:

  • Probability
  • once more Probability
  • Principles of Communication Engineering I and II (helpful, but not necessary)
  • joy in math and engineering

Textbook

The course will be based mainly on

  • Thomas M. Cover and Joy A. Thomas: "Elements of Information Theory," second edition, Wiley, 2006.

A link to an electronic version of the book is available here. For certain topics, additional lecture notes will be handed out during class.

Further references and recommended readings:

  • James L. Massey: "Applied Digital Information Theory I and II," lecture notes, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland. (Link to electronic version)
  • Robert G. Gallager: "Information Theory and Reliable Communication," Wiley, 1968.
  • Po-Ning Chen and Fady Alajaji: "Lecture Notes in Information Theory," Volume I, National Chiao Tung University (NCTU), Hsinchu, Taiwan. (Link to electronic version)
  • Po-Ning Chen and Fady Alajaji: "Lecture Notes in Information Theory," Volume II, National Chiao Tung University (NCTU), Hsinchu, Taiwan. (Link to electronic version)
  • Raymond W. Yeung: "A First Course in Information Theory," Kluwer Academic Publishers, 2005.

Grading

Every week there will be an exercise consisting of a couple of problems to be solved in the exercise hour and at home. I believe the exercises are extremely important and crucial to the understanding of the course. They also serve as preparation for the mid-term and final exams, and I therefore highly recommend solving them! Since the material of this course is rather demanding by itself, I have decided not to further challenge the students with additional tasks such as presenting a paper. I hope that the saved time will instead be used for solving the exercises and reading the textbook!

The grade will be an average of

  • the homework (15%),
  • the midterm exam (35%), and
  • the final exam (50%).

The grade for the homework will be based not on the correctness of the answers, but rather on the effort the student shows in trying to solve them. To pass the course, there is the additional condition that at least 10 exercises must be handed in.

This course is worth 3 credits.

Time Table

W Date Topic Handouts Exercise (due on) Solutions Comments
1 10 Sep. Introduction, measure of information, IT-inequality, entropy Syllabus (Version 5) Exercise 1 (17 Sep.)   Chapter 2 (2.1-2.6)
  13 Sep. Conditional entropy Handout 1   ----- Chapter 2 (2.1-2.6)
2 17 Sep. Chain rule, relative entropy, mutual information, Jensen's inequality, data compression: efficient coding of a single RV Handout 2 Exercise 2 (1 Oct.)   Chapter 2 (2.1-2.6), Handout 2, Chapter 5
  20 Sep. Leaf-counting, leaf-depths, Kraft inequality, path length lemma     Solutions 1 Handout 2, Chapter 5
3 24 Sep. No lecture (Holiday)   -----    
  27 Sep. No lecture (Conference)     -----  
4 1 Oct. Leaf-entropy theorem, coding theorem for a single RV, Shannon-Fano coding   Exercise 3 (8 Oct.)   Handout 2, Chapter 5
  4 Oct. Types of codes, McMillan's Theorem, binary Huffman codes     Solutions 2 Handout 2, Chapter 5
5 8 Oct. D-ary Huffman codes, coding of an information source, block-to-variable length coding, variable-length-to-block coding Handout 3, Handout 4 Exercise 4 (15 Oct.)   Handout 2 & 4, Chapter 5
  11 Oct. Tunstall coding     Solutions 3 Handout 4, Chapter 5
6 15 Oct. Stochastic processes, stationarity, Markovity, entropy rate, Cesàro mean   Exercise 5 (22 Oct.)   Chapter 4
  18 Oct. Coding of sources with memory, Elias-Willems universal coding scheme Handout 5   Solutions 4 Handout 5, Chapter 5
7 22 Oct. AEP, typical sets, data compression revisited   Exercise 6 (29 Oct.)   Chapter 3
  25 Oct. Transmission over a noisy channel, DMC, information capacity     Solutions 5 Chapter 7
8 29 Oct. Strongly & weakly symmetric DMCs   Exercise 7 (5 Nov.)   Chapter 7
  1 Nov. Coding for a DMC, Fano's lemma, data processing inequality, converse coding theorem     Solutions 6 Chapter 2 & 7
9 5 Nov. -----   Exercise 8 (19 Nov.) Solutions 7 2 hours of exercises
  8 Nov. Jointly typical sequences, Channel Coding Theorem     ----- Chapter 7
10 12 Nov. Midterm Exam   -----   ATTENTION: This is a 3-hour exam: 15:40-18:30
  15 Nov. Discussion of Midterm Exam, Channel Coding Theorem     ----- Chapter 7
11 19 Nov. Channel Coding Theorem, finding channel capacity revisited, convex and concave functions Handout 6 Exercise 9 (26 Nov.)   Chapter 7, Handout 6
  22 Nov. Convex and concave functions, Kuhn-Tucker conditions     Solutions 8 Handout 6
12 26 Nov. Kuhn-Tucker conditions, Joint Source Channel Coding Theorem   Exercise 10 (3 Dec.)   Handout 6, Chapter 7.13
  29 Nov. Joint Source Channel Coding Theorem, differential entropy     Solutions 9 Chapter 7.13, 8
13 3 Dec. Differential entropy, AEP for continuous RVs   Exercise 11 (10 Dec.)   Chapter 8
  6 Dec. Gaussian channel     Solutions 10 Chapter 9
14 10 Dec. Gaussian channel   Exercise 12 (17 Dec.)   Chapter 9
  13 Dec. Sampling theorem     Solutions 11 Chapter 9
15 17 Dec. Sampling theorem, bandlimited Gaussian channel, convolutional codes, trellis, Viterbi decoding Handout 7 Exercise 13 (24 Dec.)   Chapter 9, Handout 7
  20 Dec. Viterbi decoding, quality of convolutional codes     Solutions 12 Handout 7
16 24 Dec. Quality of convolutional codes: signal flow graph, transmission gain, Bhattacharyya bound   Exercise 14 (31 Dec.)   Handout 7
  27 Dec. Quality of convolutional codes: Bhattacharyya bound     Solutions 13 Handout 7
17 31 Dec. Parallel Gaussian channels   Exercise 15 (3 Jan.)   Chapter 9; Class evaluation online until January 11
  3 Jan. Parallel Gaussian channels, questions     Solutions 14 (corrected version), Solutions 15 Chapter 9
18 7 Jan. Final Exam   -----   ATTENTION: This is a 3-hour exam: 15:40-18:30
  10 Jan. Discussion of Final Exam     -----  

Special Remarks

The lecture will be held in English.


-||-   _|_ _|_     /    __|__   Stefan M. Moser
[-]     --__|__   /__\    /__   Senior Researcher & Lecturer, ETH Zurich, Switzerland
_|_     -- --|-    _     /  /   Adj. Professor, National Chiao Tung University (NCTU), Taiwan
/ \     []  \|    |_|   / \/    Web: http://moser-isi.ethz.ch/


Last modified: Thu Feb 19 08:20:45 UTC+8 2009