
Information Theory II
Spring 2019


⇒ to the time table and class material below

Aim

This course builds on Information Theory I. It introduces additional topics in single-user communication, connections between Information Theory and Statistics, and Network Information Theory.

The course has two objectives: to introduce the students to the key information theoretic results that underlie the design of communication systems and to equip the students with the tools that are needed to conduct research in Information Theory.

Contents

  • Differential entropy, including maximum entropy principle
  • Gaussian channel
  • The multiple-access channel (MAC)
  • The broadcast channel (BC)
  • Channels with states
  • Method of types, Sanov's theorem, Stein's lemma
  • Guessing and Rényi entropy
  • Redundancy and Context-Tree Weighting Algorithm
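
To give a flavor of the first two topics, here are two standard results that the course derives in full (quoted only as a preview; see the lecture notes for the precise statements):

  % Differential entropy of a Gaussian and the capacity of the Gaussian channel (preview).
  \[
    h(X) = \tfrac{1}{2}\log_2\!\bigl(2\pi e\,\sigma^2\bigr) \text{ bits}
    \qquad \text{for } X \sim \mathcal{N}(\mu,\sigma^2),
  \]
  and the Gaussian maximizes differential entropy among all random variables of
  variance at most $\sigma^2$. For the channel $Y = X + Z$ with
  $Z \sim \mathcal{N}(0,N)$ and average-power constraint $\mathsf{E}[X^2] \le P$,
  the capacity is
  \[
    C = \tfrac{1}{2}\log_2\!\Bigl(1 + \frac{P}{N}\Bigr) \text{ bits per channel use.}
  \]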

Prerequisites

  • Undergraduate studies
  • Solid foundation in probability and calculus
  • Enjoyment of mathematics
  • Information Theory I

Instructors

Prof. Amos Lapidoth
Office: ETF E107
Phone: 044 632 51 92
E-mail:

Prof. Stefan M. Moser
Office: ETF E104
Phone: 044 632 36 24
E-mail:

Teaching Assistant

Robert Graczyk
Office: ETF E104
Phone: 044 632 28 99
E-mail:

Time and Place

  • Lecture/Exercise: Thursday, 13:15–17:00, ETZ E9

Note that the lecture usually takes place from 3 to 5 p.m. and the exercise session from 1 to 3 p.m.

Textbook

We use the book by Thomas M. Cover and Joy A. Thomas:
T.M. Cover, J.A. Thomas, Elements of Information Theory, 2nd Edition, John Wiley & Sons, Inc., New York, NY, USA.
(An electronic copy is available online.)

Supplementary Notes

Stefan M. Moser: Information Theory (Lecture Notes), 6th edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Institute of Communications Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2018.

Stefan M. Moser: Advanced Topics in Information Theory (Lecture Notes), 3rd edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Institute of Communications Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2018.

A. El Gamal, Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.

G. Kramer, Topics in Multi-User Information Theory (available online).

Exercises

Printed copies of the exercises will be handed out in class. See time table below.

Examination

Oral exam (30 min.).

Testat requirements

None.

Special Remarks

The lecture is held in English.

Time Table

W | Date | Topic | Handouts | Exercise | Solutions | Lecturer
1 | 21 Feb. | Differential entropy, max-entropy principle, relative entropy and mutual information for continuous RVs | Infos, Contents | – | – | Stefan M. Moser
2 | 28 Feb. | AEP for continuous RVs: volume, properties of (jointly) typical sets; setup of Gaussian channel: average power constraint, information capacity of Gaussian channel | – | Exercise 1 | Solutions 1 | Stefan M. Moser
3 | 7 Mar. | Gaussian channel, coding theorem: intuition, achievability, and converse | – | Exercise 2 | Solutions 2 | Stefan M. Moser
4 | 14 Mar. | Multiple-Access Channel: channel model; achievable rate region, intuition, time-sharing; coding theorem: achievability | – | Exercise 3 | Solutions 3 | Amos Lapidoth
5 | 21 Mar. | Multiple-Access Channel: introduction of the Q-region and equivalence to time-sharing (no redundant pentagon constraints, Carathéodory); coding theorem: converse | – | Exercise 4 | Solutions 4 | Stefan M. Moser
6 | 28 Mar. | Broadcast Channel: channel model; (physically and stochastically) degraded BCs; capacity region depends on marginals only; achievability via superposition coding; converse for degraded BCs | Handout 1 | Exercise 5 | Solutions 5 | Stefan M. Moser
7 | 4 Apr. | Channels with States: introduction, examples; Gel'fand-Pinsker problem: achievability | – | Exercise 6 | Solutions 6 | Stefan M. Moser
8 | 11 Apr. | Gel'fand-Pinsker problem: converse; examples: Writing on Dirty Paper, Channels that Sometimes Get Stuck | – | Exercise 7 | Solutions 7 | Stefan M. Moser
9 | 18 Apr. | Method of types: introduction, number and size of type classes, probability of observing a type; large deviation theory: Sanov's theorem; universal source coding | – | Exercise 8 | Solutions 8 | Amos Lapidoth
– | 25 Apr. | holidays | – | – | – | –
10 | 2 May | Minimizing relative entropy subject to equality constraints; Stein's Lemma; Chernoff Information | – | Exercise 9 | Solutions 9 | Amos Lapidoth
11 | 9 May | Capacity-Redundancy Theorem; Properties of Bayesian Mixture Codes; Normalized Maximum-Likelihood Codes Minimize Worst-Case Individual-Sequence Redundancy | – | Exercise 10 | Solutions 10 | Amos Lapidoth
12 | 16 May | (d/2) log(n) redundancy; Laplace method of integration; Context-Tree Weighting algorithm | Handout 2 | Exercise 11 | Solutions 11 | Amos Lapidoth
13 | 23 May | Guessing and Rényi entropy | Handout 3 | Exercise 12 | Solutions 12 | Robert Graczyk
14 | 30 May | holiday | – | – | – | –
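
As a small illustration of the week-13 topic, here is a minimal Python sketch (not part of the official class material) that computes the Rényi entropy of order 1/2 of a finite distribution and the expected number of guesses E[G(X)] under the optimal strategy of guessing values in order of decreasing probability:

  # Illustration of "Guessing and Rényi entropy" (not part of the class material).
  # Computes H_{1/2}(X) and E[G(X)] for a finite distribution, where G(X) is the
  # number of guesses needed when values are guessed in order of decreasing probability.
  import math

  def renyi_entropy(p, alpha):
      """Rényi entropy of order alpha (in bits) of a probability vector p."""
      assert alpha > 0 and alpha != 1
      return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

  def expected_guesses(p):
      """E[G(X)] for the optimal guessing order (decreasing probability)."""
      return sum((k + 1) * pk for k, pk in enumerate(sorted(p, reverse=True)))

  if __name__ == "__main__":
      p = [0.5, 0.25, 0.125, 0.125]   # example distribution
      print("H_{1/2}(X) =", renyi_entropy(p, 0.5), "bits")
      print("E[G(X)]    =", expected_guesses(p))

For i.i.d. sequences, Arikan's bounds show that E[G(X^n)] grows exponentially with rate H_{1/2}(X), which is why a Rényi entropy of order below one, rather than the Shannon entropy, governs guessing.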

-||-   _|_ _|_     /    __|__   Stefan M. Moser
[-]     --__|__   /__\    /__   Senior Researcher & Lecturer, ETH Zurich, Switzerland
_|_     -- --|-    _     /  /   Adj. Professor, National Yang Ming Chiao Tung University (NCTU), Taiwan
/ \     []  \|    |_|   / \/    Web: https://moser-isi.ethz.ch/


Last modified: Thu Jul 14 06:08:52 CEST 2022