Information Theory Spring 2006
News
- Final Grade: The results of the final exam and the final grade can now be collected at Joe's office. In case you are not able to go there in person, you may write an email to me or to Joe.
- Final Exam: The final exam will take place on
- Wednesday, June 21, 15:40-18:35 (Note that this is one hour longer than usual!)
Regulations:
- open book: any book is allowed
- not allowed are: any telecommunication devices like mobile phones, any "friend" or other help from outside...
- covered material: everything we covered in class
- Class Evaluation: A class evaluation will be available online from June 12 to June 23. I would very much like to encourage all students to participate! Thanks!
- Mid-Term Exam: The mid-term exam will take place on
- Wednesday, April 26, 15:40-18:35 (Note that this is one hour longer than usual!)
Regulations:
- open book: any book is allowed
- not allowed are: any telecommunication devices like mobile phones, any "friend" or other help from outside...
- covered material: everything up to the definition of information capacity and strongly and weakly symmetric DMCs (lecture of April 10)
Instructor
Stefan M. Moser
Engineering Building IV, Room 727
phone: 03-571 21 21 ext. 54548
e-mail:
Teaching Assistant
In case you would like to discuss questions in Chinese, you may contact my TA:
- Joe Chen
e-mail: <joechen.cm88g at nctu.edu.tw>
Time: Monday, 15:40-16:30
Room: Engineering Building IV, Room 811 (ED811)
Phone: 03-571 21 21 ext. 54571
Time and Place
The course is scheduled for 3 hours per week:
- Monday, 16:40-17:30, Engineering Building IV, Room 303 (ED303)
- Wednesday, 15:40-17:30, Engineering Building IV, Room 303 (ED303)
Course Objectives
This course is an introduction to Information Theory. We will cover
the most important results concerning data compression and reliable
communication over a noisy channel. The course will approximately
follow this outline (a short illustrative Python sketch of the basic
quantities is given after the list):
- Introduction and basic definitions:
- entropy
- mutual information
- relative entropy
- Source coding: how to compress data efficiently?
- Kraft inequality
- source coding theorem for a single random variable
- Shannon-Fano codes
- Huffman codes
- source coding theorem for a discrete memoryless source
- Tunstall codes
- universal codes
- Channel coding: how to transmit data reliably?
- Fano's inequality and data processing lemma
- converse to channel coding theorem for discrete memoryless channel
- AEP and typical sequences
- channel coding theorem
- continuous random variables and entropy
- channel coding theorem for the AWGN channel
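To give a flavour of the basic quantities listed above, here is a minimal Python sketch (not part of the course material); the function names and example numbers are purely illustrative, and all quantities are in bits, i.e., logarithms are taken to base 2.

    import math

    def entropy(p):
        """Entropy H(X) in bits of a probability vector p."""
        return -sum(p_i * math.log2(p_i) for p_i in p if p_i > 0)

    def relative_entropy(p, q):
        """Relative entropy (Kullback-Leibler divergence) D(p||q) in bits."""
        return sum(p_i * math.log2(p_i / q_i) for p_i, q_i in zip(p, q) if p_i > 0)

    def mutual_information(p_xy):
        """Mutual information I(X;Y) in bits from a joint pmf given as a nested list."""
        p_x = [sum(row) for row in p_xy]        # marginal of X
        p_y = [sum(col) for col in zip(*p_xy)]  # marginal of Y
        return sum(
            p_xy[i][j] * math.log2(p_xy[i][j] / (p_x[i] * p_y[j]))
            for i in range(len(p_x))
            for j in range(len(p_y))
            if p_xy[i][j] > 0
        )

    # A fair coin carries 1 bit of uncertainty; a biased coin carries less.
    print(entropy([0.5, 0.5]))                        # 1.0
    print(entropy([0.9, 0.1]))                        # ~0.47

    # D(p||q) >= 0, with equality if and only if p == q.
    print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # ~0.74

    # X and Y are dependent under this joint pmf, so I(X;Y) > 0.
    print(mutual_information([[0.4, 0.1],
                              [0.1, 0.4]]))           # ~0.28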
We hope that a student who finishes the course will be able to understand
the basic principles underlying any communication or data storage
system.
Prerequisites
The following lectures/topics are recommended:
- Probability
- once more Probability
- Principles of Communication Engineering I and II (preferably)
- joy in math and engineering
Textbook
The course will be mainly based on
- Thomas M. Cover and Joy A. Thomas: "Elements of Information Theory," Wiley, 1991.
A link to an electronic version of the book can be found here.
Further references and recommended readings:
- Robert G. Gallager: "Information Theory and Reliable Communication," Wiley, 1968.
- Raymond W. Yeung: "A First Course in Information Theory," Kluwer Academic Publishers, 2005.
- James L. Massey: "Applied Digital Information Theory I and II," lecture notes, Swiss Federal Institute of Technology (ETH), Zurich. (Link to electronic version)
Grading
Every week there will be an exercise consisting of a couple of
problems to be solved at home. For the understanding of the course and
as a preparation for the mid-term and final exam, we highly recommend
solving the exercises! Since the material of this course is rather
demanding by itself, we have decided not to further challenge the
students with additional tasks such as the presentation of a paper. We
hope that the saved time will instead be used for solving the exercises
and reading the textbook!
The grade will be a weighted average of the following components (a small numerical sketch follows at the end of this section):
- the homework (15%),
- the midterm exam (35%), and
- the final exam (50%).
The grade of the homework will not be based on the
correctness of the answers, but rather on the effort the student shows
in trying to solve them. To pass the course, there is the additional
condition that at least 10 exercises have to be handed in.
This course is worth 3 credits.
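As a small numerical illustration of the weighting above, here is a minimal Python sketch; the example scores are made up and purely illustrative.

    # Weights from the grading scheme above: homework 15%, midterm 35%, final 50%.
    WEIGHTS = {"homework": 0.15, "midterm": 0.35, "final": 0.50}

    def final_grade(scores):
        """Weighted average of the three components (all scores on the same scale, e.g. 0-100)."""
        return sum(WEIGHTS[part] * scores[part] for part in WEIGHTS)

    # Purely hypothetical scores, for illustration only:
    print(final_grade({"homework": 90, "midterm": 75, "final": 82}))  # 80.75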
Time Table
Date | Topic | Handouts | Exercise (due on) | Solutions | Comments
20 Feb. | Introduction, measure of information | Syllabus | Exercise 1 (27 Feb.) | | Chapter 2
22 Feb. | Entropy, mutual information, relative entropy, IT-inequality, Jensen's inequality | Handout 1 | | | Chapter 2
27 Feb. | Data compression: efficient coding of a single random variable: leaf-counting, leaf-depths, Kraft's inequality | | Exercise 2 (6 Mar.) | | Chapter 5, Handout 3
1 Mar. | Kraft's inequality, path-length lemma, leaf-entropy theorem, converse of coding theorem for single RV, McMillan's theorem | Handout 2, Handout 3 | | Solutions 1 | Handout 2 and 3, Chapter 5
6 Mar. | McMillan's theorem, Huffman codes | | Exercise 3 (13 Mar.) | | Handout 3, Chapter 5
8 Mar. | Huffman codes, block-to-variable-length coding of a DMS, variable-length-to-block coding of a DMS, general converse | Handout 4 | | Solutions 2 | Handout 3 and 4, Chapter 5
13 Mar. | Variable-length-to-block coding of a DMS: Tunstall, stationary stochastic processes | | Exercise 4 (20 Mar.) | | Handout 4, Chapter 4
15 Mar. | Time-invariant Markov processes, entropy rates, entropy rates of stationary processes | | | Solutions 3 | Chapter 4
20 Mar. | Coding of sources with memory: coding theorem, universal source compression (Elias-Willems coding) | Handout 5 | Exercise 5 (27 Mar.) | | Handout 5
22 Mar. | Elias-Willems coding, Asymptotic Equipartition Property (AEP) | | | Solutions 4 | Handout 5, Chapter 3
27 Mar. | Typical sets, data compression using typical sets, high-probability sets | | Exercise 6 (10 Apr.) | | Chapter 3. BEWARE: Over the weekend (24-26 Mar.) a faulty version of Exercise 6 was online. If you have downloaded it, please delete it and download it again!
29 Mar. | Transmission over a noisy digital channel, DMC, information capacity, strongly symmetric DMCs | | | Solutions 5 | Chapter 8
3 Apr. | No lecture | | ----- | |
5 Apr. | No lecture | | | ----- |
10 Apr. | Weakly symmetric DMCs | | Exercise 7 (17 Apr.) | | Chapter 8
12 Apr. | Definition of a channel code, operational capacity, Fano's lemma, data processing inequality, converse to the Channel Coding Theorem | | | Solutions 6 | Chapter 8
17 Apr. | Jointly typical sequences | | Exercise 8 (1 May) | | Chapter 8
19 Apr. | Channel coding theorem | | | Solutions 7 | Chapter 8
24 Apr. | Finding channel capacity: convex optimization, Kuhn-Tucker conditions | | ----- | | R. Gallager's book
26 Apr. | Midterm Exam (3 hours) | | | ----- | ATTENTION: This is a 3-hour exam: 15:40-18:35
1 May | Discussion of Exam | | Exercise 9 (8 May) | Solutions Midterm Exam |
3 May | Kuhn-Tucker conditions | | | Solutions 8 | R. Gallager's book
8 May | Joint source channel coding theorem | | Exercise 10 (15 May) | | Chapter 8
10 May | Differential entropy | | | Solutions 9 | Chapter 9
15 May | Differential entropy, Gaussian channel | | Exercise 11 (22 May) | | Chapter 9, 10
17 May | Gaussian channel | | | Solutions 10 | Chapter 10
22 May | Sampling theorem | | Exercise 12 (29 May) | | Chapter 10
24 May | Band-limited Gaussian channels, parallel Gaussian channels | | | Solutions 11 | Chapter 10
29 May | Convolutional codes | Handout 6 | Exercise 13 (5 June) | | Handout 6
31 May | No lecture | | | ----- |
5 June | Signal flowgraph, transmission gain | | Exercise 14 (12 June) | Solutions 12 | Handout 6
7 June | Bhattacharyya bound, signal flowgraph bound on bit error probability | | | Solutions 13 | Handout 6
12 June | Bit error probability of convolutional encoders, signal flowgraphs | | Exercise 15 (14 June) | | Handout 6. Class evaluation online until June 23
14 June | Repetition, exercises, questions | | | Solutions 14, Solutions 15 |
19 June | Time for questions... | | ----- | |
21 June | Final Exam (3 hours) | | | | ATTENTION: This is a 3-hour exam: 15:40-18:35
Special Remarks
The lecture will be held in English.
Stefan M. Moser
Senior Scientist, ETH Zurich, Switzerland
Adjunct Professor, National Yang Ming Chiao Tung University, Taiwan
Web: https://moser-isi.ethz.ch/
Last modified: Thu Feb 19 08:16:46 UTC+8 2009