Information Theory Fall 2011/2012
⇒ to time table and download of class material
News
- New (hopefully stable) version of Lecture Notes: Version 1.9 of the lecture notes is available online. Chapter 18 has been worked over. Hopefully, this will now remain stable for a while. But I'm still very much interested in any type of feedback, typos, comments, etc.! Please send me an email!
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.8) is available online. Chapter 16 and the Appendices have been worked over; moreover, Chapter 15 has been improved further based on feedback.
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.7) is available online. Chapters 14, 15, and 17 have been worked over.
- Final Exam: The final exam will take place on
- Tuesday, 10 January, 12:30–15:20 (Note that this is ONE HOUR EARLIER than usual!)
Location: Engineering Building IV, Room 102 (ED102)
Regulations:
- open book: any book is allowed
- not allowed are: computers, any telecommunication devices like mobile phones, any "friend" or other help from outside...
- covered material: everything covered in class during the whole semester apart from Chapter 18, i.e., Chapters 1–16
- style: similar to mid-term exam
- recommendation: bring along a pocket calculator for simple quick calculations
- Class Evaluation: The class evaluation will be online between 20 December and 8 January. I would very much appreciate your feedback, so please spend a couple of minutes to fill out the online form! Thanks!
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.6) is available online. Chapters 12 and 13 have been worked over.
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.5) is available online. Chapter 11 has been improved and some typos in Chapter 12 corrected.
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.4) is available online. Chapter 10 has been improved.
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.3) is available online. Chapters 8 and 9 have been improved.
- Mid-Term Exam: The mid-term exam will take place on
- Tuesday, 8 November, 12:30–15:20 (Note that we start ONE HOUR EARLIER than usual!)
Location: Engineering Building IV, Room 102 (ED102)
Regulations:
- open-book: any books or notes are allowed
- not allowed are: computers, any telecommunication devices like mobile phones, any "friend" or other help from outside...
- covered material: Chapters 1 to 7
- style: similar to exercises, only easier... :-)
- recommendation: bring along a pocket calculator for simple quick calculations
- New version of Lecture Notes: The next corrected version of the lecture notes (version 1.2) is available online. Typos in Chapters 1 to 7 have been corrected, and Section 4.3 has been expanded.
- Double lectures: On Thursday, 20 October, and Thursday, 27 October, we will have a two-hour lecture to make up for the classes that will be canceled in December.
- Change of room: Due to the really annoying air-conditioner we have now changed our room for good: the lecture will be held on both days in Room 102 (ED102).
- Change of room: Note that the class room has changed. The lecture will be held on both days in Room 103 (ED103).
- New Lecture Notes: Starting from this fall, we will be using my own lecture notes instead of the textbook by Cover and Thomas. The notes will be distributed free of charge to all registered students.
Course Description
This course is an introduction to information theory. We will cover the most important results concerning data compression and reliable communication over noisy channels. The course will cover the following topics (a small numerical illustration of the most basic quantities is given after the list):
- Introduction and basic definitions:
- entropy
- mutual information
- relative entropy
- Source coding: how to compress data efficiently?
- Kraft inequality
- source coding theorem for a single random variable
- Shannon-type codes
- Shannon code
- Fano code
- Huffman code
- source coding theorem for a discrete memoryless source
- arithmetic coding
- Tunstall code
- source coding theorem for sources with memory
- adaptive Huffman coding
- universal codes: Elias-Willems coding, Lempel-Ziv coding
- Tools in information theory:
- Asymptotic Equipartition Property (AEP)
- Karush-Kuhn-Tucker conditions
- gambling and horse betting
- Channel coding: how to transmit data reliably?
- Fano's inequality and data processing lemma
- channel coding theorem for a discrete memoryless channel
- computing capacity
- joint source and channel coding
- Continuous random variables and differential entropy
- Gaussian Channel:
- channel coding theorem for the Gaussian channel
- bandlimited AWGN channel
- parallel Gaussian channels
- Brief overview of cryptography
For more detail, see the time table below.
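As a small illustration of the most basic quantities listed above (entropy, mutual information, relative entropy) and of the Kraft inequality, here is a short Python sketch. It is not part of the official class material; the joint distribution and the codeword lengths are made up purely for this example.

    # Small sketch: basic information-theoretic quantities for a hypothetical
    # joint distribution P(X, Y), plus a Kraft-inequality check for a
    # hypothetical binary prefix-free code.
    from math import log2

    # Hypothetical joint distribution P(X, Y) over X in {0, 1}, Y in {0, 1}.
    p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

    # Marginal distributions P(X) and P(Y).
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p

    def entropy(dist):
        """Entropy in bits: H = -sum p log2 p (terms with p = 0 are skipped)."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # Mutual information I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ),
    # i.e., the relative entropy D( P(X,Y) || P(X) P(Y) ).
    mi = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

    print("H(X)   =", entropy(p_x))    # about 0.8113 bits
    print("H(Y)   =", entropy(p_y))    # about 0.9544 bits
    print("H(X,Y) =", entropy(p_xy))   # 1.75 bits
    print("I(X;Y) =", mi)              # equals H(X) + H(Y) - H(X,Y)

    # Kraft inequality for hypothetical binary codeword lengths 1, 2, 3, 3:
    # sum over i of 2^(-l_i) must be at most 1 for a prefix-free code.
    lengths = [1, 2, 3, 3]
    print("Kraft sum =", sum(2 ** (-l) for l in lengths))  # 1.0, so admissible

Note that I(X;Y) = H(X) + H(Y) - H(X,Y), which you can verify from the printed values, and that a Kraft sum of at most 1 confirms that the chosen codeword lengths can indeed be realized by a binary prefix-free code.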
We hope that a student who finishes the course will be able to understand the basic principles underlying any communication or data storage system.
Prerequisites
The following lectures/topics are recommended:
- Probability
- once more Probability
- Principles of Communication Engineering I and II (preferable, but not necessary)
- joy in math and engineering
Instructor
Prof. Stefan M. Moser
Engineering Building IV, Office 727
phone: 03-571 21 21 ext. 54548
e-mail:
Teaching Assistants
In case you would like to discuss some questions in Chinese, you may contact the TA of this class:
- Chou Yu-Hsing
e-mail:
Office: Engineering Building IV, Lab 716A (ED716A)
Phone: 03-571 21 21 ext. 54630
To make our and your life easier, let's agree on the following rule: you may contact or visit the TAs at any time, even outside of office hours. However, if you haven't made an appointment in advance, they have the right to tell you that they don't have time at that moment.
Time and Place
The course is scheduled for 4 hours per week:
- Tuesday, 13:30–15:20 (EF), Engineering Building IV, Room 103 (ED103)
- Thursday, 13:30–15:20 (EF), Engineering Building IV, Room 103 (ED103)
On Thursday, the second hour is reserved for exercises. The mid-term and final exams will be 3 hours.
The course starts on Tuesday, 13 September 2011, and finishes on Thursday, 12 January 2012.
Office Hours
NCTU requests that every teacher offer two hours per week during which students may come to ask questions. I will, of course, do so as well:
- Tuesday, 15:40–17:30, Engineering Building IV, Office 727
However, we would like to encourage you to show up in the teacher's or teaching assistant's office whenever you have questions about the class or related subjects. Moreover, we are always available during and after classes and particularly in the second hour on Thursday (the "exercise" hour).
Textbook
The course will follow my own lecture notes. They will be distributed free of charge to all registered students of the course during the first week of the semester.
Further references and recommended readings:
- Claude E. Shannon: “A mathematical theory of communication,” Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, July and October 1948. (PDF) (Some remarks about this electronic version.)
- James L. Massey: “Applied Digital Information Theory I and II,” lecture notes, Swiss Federal Institute of Technology (ETH), Zurich, Switzerland. (Link to electronic version)
- Thomas M. Cover and Joy A. Thomas: “Elements of Information Theory,” second edition, Wiley, 2006.
- Robert G. Gallager: “Information Theory and Reliable Communication,” Wiley, 1968.
- Po-Ning Chen and Fady Alajaji: “Lecture Notes in Information Theory,” Volume I, National Chiao Tung University (NCTU), Hsinchu, Taiwan. (PDF)
- Po-Ning Chen and Fady Alajaji: “Lecture Notes in Information Theory,” Volume II, National Chiao Tung University (NCTU), Hsinchu, Taiwan. (PDF)
- Raymond W. Yeung: “A First Course in Information Theory,” Kluwer Academic Publishers, 2005.
Exercises
Every week, an exercise will be distributed in class and also made available online for download. This exercise will consist of several problems that need to be solved at home and handed in during the class of the following week. A model solution will be distributed and made available online afterwards.
We believe the exercises to be extremely important and crucial to the understanding of the course. They also serve as preparation for the mid-term and final exams, and we therefore highly recommend solving them. To pass the course you need to hand in at least 10 exercises.
Exams
There will be one mid-term and one final exam. Both exams will last three hours and will be open book. Details about the covered material will be published in due time.
Grading
The grade will be a weighted average of the following (a short worked example is given after the list):
- the homework and class participation (15%),
- the midterm exam (35%), and
- the final exam (50%).
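For illustration, take purely hypothetical scores of 80 for the homework and class participation, 70 for the midterm, and 90 for the final: the resulting grade would be 0.15·80 + 0.35·70 + 0.50·90 = 12 + 24.5 + 45 = 81.5.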
The homework grade will not be based on the correctness of the answers, but rather on the effort the student shows in trying to solve the problems. Moreover, I will try to reward students who participate actively in the course.
This course is worth 3 credits.
Special Remarks
The lecture will be held in English.
Time Table
Note that some details of this program might change in the course of the semester.
Note that some linked documents in this table can only be downloaded within NCTU and NTHU!
W | Date | Topic | Handouts | Exercise (due on) | Solutions | Comments
1 | 13 Sep. | Introduction, motivation, entropy, IT Inequality, conditional entropy | Syllabus (Version 4) | Exercise 1 (20 Sep.) | Lecture Notes | Chapter 1
  | 15 Sep. | Conditional entropy, chain rule, relative entropy, mutual information | | | | Chapter 1
2 | 20 Sep. | Mutual information, Jensen's inequality; data compression: efficient coding of a single RV, trees, leaf-counting and leaf-depth lemma, Kraft inequality, trees with probabilities, path length lemma | | Exercise 2 (27 Sep.) | | Chapter 1 & 3
  | 22 Sep. | Leaf-entropy theorem, converse part of coding theorem for a single random message, Shannon-type codes, Shannon codes | | | Solutions 1 | Chapter 3
3 | 27 Sep. | Fano codes, binary Huffman codes, D-ary Huffman codes, types of codes: McMillan's theorem | | Exercise 3 (4 Oct.) | | Chapter 3
  | 29 Sep. | Coding of an information source: block–to–variable-length coding, arithmetic coding | | | Solutions 2 | Chapter 4
4 | 4 Oct. | Arithmetic coding & decoding; variable-length–to–block coding | | Exercise 4 (11 Oct.) | | Chapter 4; daryexp.m, invdaryexp.m
  | 6 Oct. | Tunstall coding | | | Solutions 3 | Chapter 4
5 | 11 Oct. | Performance bounds on Tunstall coding, stochastic processes, stationarity, Markovity, entropy rate, Cesáro mean | | Exercise 5 (18 Oct.) | | Chapter 4 & 5
  | 13 Oct. | Entropy rate | | | Solutions 4 | Chapter 5
6 | 18 Oct. | Coding of sources with memory, adaptive Huffman coding, converse, Elias codes for integers, Elias-Willems universal coding scheme, Lempel-Ziv universal coding scheme | | Exercise 6 (25 Oct.) | | Chapter 6
  | 20 Oct. | Analysis of Lempel-Ziv universal coding scheme, maximum entropy distribution | | | Solutions 5 | Attention: 2 hours lecture; Chapter 6
7 | 25 Oct. | Analysis of Lempel-Ziv universal coding scheme, convex and concave functions, Kuhn-Tucker conditions | | Exercise 7 (1 Nov.) | | Chapter 6 & 7
  | 27 Oct. | Gambling and horse betting | | | Solutions 6 | Attention: 2 hours lecture; Chapter 8
8 | 1 Nov. | Gambling and horse betting, transmission over a noisy channel: introduction | | Exercise 8 (15 Nov.) | | Chapter 8 & 9
  | 3 Nov. | Transmission over a noisy channel: DMC | | | Solutions 7 | Chapter 9
9 | 8 Nov. | Midterm Exam (bring calculator!) | Exam statistics | | | ATTENTION: This is a 3 hours exam: 12:30–15:20
  | 10 Nov. | Discussion midterm exam; transmission over a noisy channel: DMC | | | | Chapter 9
10 | 15 Nov. | Transmission over a noisy channel: capacity, Fano's lemma, data processing inequality, converse of coding theorem; random coding | | Exercise 9 (22 Nov.) | | Chapter 9
   | 17 Nov. | Transmission over a noisy channel: random coding, coding theorem | | | Solutions 8 | Chapter 9
11 | 22 Nov. | Transmission over a noisy channel: random coding, coding theorem; computing capacity: strongly & weakly symmetric DMCs, Kuhn-Tucker conditions | | Exercise 10 (29 Nov.) | | Chapter 9 & 10
   | 24 Nov. | Computing capacity: Kuhn-Tucker conditions | | | Solutions 9 | Chapter 10
12 | 29 Nov. | Convolutional codes | | Exercise 11 (6 Dec.) | | Chapter 11
   | 1 Dec. | Convolutional codes; joint source and channel coding | | | Solutions 10 | Chapter 11 & 12
13 | 6 Dec. | Joint source and channel coding, transmission above capacity | | Exercise 12 (13 Dec.) | | Chapter 12
   | 8 Dec. | Transmission above capacity; continuous RVs: differential entropy | | | Solutions 11 | Chapter 12 & 13
14 | 13 Dec. | Differential entropy, mutual information, Gaussian channel | | Exercise 13 (27 Dec.), Exercise 14 (27 Dec.) | | Chapter 13 & 14
   | 15 Dec. | No lecture | | | Solutions 12 |
15 | 20 Dec. | No lecture | | | |
   | 22 Dec. | Gaussian channel | | | | Class evaluation online until 8 January! Chapter 14
16 | 27 Dec. | Gaussian channel, AWGN channel | | Exercise 15 (3 Jan.) | | Chapter 14 & 15
   | 29 Dec. | Parallel Gaussian channels | | | Solutions 13, Solutions 14 | Chapter 16
17 | 3 Jan. | Introduction to cryptography (not part of exam) | | Exercise 16 (5 Jan.) | | Chapter 18
   | 5 Jan. | Introduction to cryptography (not part of exam), questions | | | Solutions 15, Solutions 16 | Chapter 18
18 | 10 Jan. | Final Exam | Exam statistics | | | ATTENTION: This is a 3 hours exam: 12:30–15:20
   | 12 Jan. | Discussion Final Exam, coffee time | | | |
Stefan M. Moser
Senior Scientist, ETH Zurich, Switzerland
Adjunct Professor, National Yang Ming Chiao Tung University, Taiwan
Web: https://moser-isi.ethz.ch/
Last modified: Wed Jan 11 23:47:07 UTC+8 2012