
Information Theory II
Spring 2017



Aim

This course builds on Information Theory I. It introduces additional topics in single-user communication, connections between Information Theory and Statistics, and Network Information Theory.

The course has two objectives: to introduce the students to the key information-theoretic results that underlie the design of communication systems, and to equip the students with the tools needed to conduct research in Information Theory.

Contents

  • Differential entropy, including the maximum entropy principle
  • The Gaussian channel (a short illustrative sketch follows this list)
  • The multiple-access channel (MAC)
  • The broadcast channel (BC)
  • Channels with states
  • Method of types, Sanov's theorem, Stein's lemma
  • Guessing and Rényi entropy
  • MAC with feedback
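
As a quick, unofficial illustration of the first two items above (not part of the course material; the function names and numerical values are our own), here is a minimal Python sketch of two standard closed-form expressions treated in the first weeks: the differential entropy of a zero-mean Gaussian and the capacity of the average-power-constrained Gaussian channel.

    import math

    def gaussian_differential_entropy(sigma2):
        # Differential entropy of a N(0, sigma2) random variable in bits:
        # h(X) = 0.5 * log2(2 * pi * e * sigma2).
        return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

    def awgn_capacity(P, N):
        # Capacity of the discrete-time Gaussian channel with average-power
        # constraint P and noise variance N, in bits per channel use:
        # C = 0.5 * log2(1 + P / N).
        return 0.5 * math.log2(1.0 + P / N)

    print(gaussian_differential_entropy(1.0))  # approx. 2.05 bits
    print(awgn_capacity(P=1.0, N=1.0))         # 0.5 bits per channel use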

Prerequisites

  • Undergraduate studies
  • Solid foundation in probability and calculus
  • Enjoyment of mathematics
  • Information Theory I

Instructors

Prof. Amos Lapidoth
Office: ETF E107
Phone: 044 632 51 92
E-mail:

Prof. Stefan M. Moser
Office: ETF E104
Phone: 044 632 36 24
E-mail:

Teaching Assistant

Christoph Pfister
Office: ETF D108
Phone: 044 632 65 59
E-mail:

Time and Place

  • Lecture/Exercise: Thursday, 13:15–17:00, ETZ E6

Note that Prof. Lapidoth and Prof. Moser share this course. In the weeks when Prof. Lapidoth teaches, the lecture runs from 3 to 5 p.m. and the exercise session from 1 to 3 p.m. In the weeks when Prof. Moser teaches, the lecture runs from 1 to 3 p.m. and the exercise session from 3 to 5 p.m.

Textbook

We use the book by Thomas M. Cover and Joy A. Thomas:
T.M. Cover, J.A. Thomas, Elements of Information Theory, 2nd Edition, John Wiley & Sons, Inc., New York, NY, USA, 2006.

Supplementary Notes

Stefan M. Moser: Information Theory (Lecture Notes), 5th edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Department of Electrical & Computer Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2017.

Stefan M. Moser: Advanced Topics in Information Theory (Lecture Notes), 2nd edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Department of Electrical & Computer Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2013.

A. El Gamal, Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.

G. Kramer, Topics in Multi-User Information Theory, available online.

Exercises

Printed copies of the exercises will be handed out in class. See time table below.

Examination

Oral exam (30 min.).

Testat requirements

None.

Special Remarks

The lecture is in English.

Time Table

For each week, the list below gives the date, the lecturer, the topics covered, and the handouts, exercise, and solutions distributed.

Week 1 (23 Feb., Stefan M. Moser): Differential entropy: definition, shifting, scaling, conditioning, chain rule; relative entropy; mutual information; data processing inequality; differential entropy of multivariate Gaussians; maximum differential entropy principle. Handouts: Infos, Contents. Exercise 1 / Solutions 1.
Week 2 (2 Mar., Stefan M. Moser): Gaussians maximize differential entropy for a given covariance matrix; weak typicality and joint weak typicality; Gaussian channel: average-power constraint, capacity, achievability with joint weak typicality decoding. Exercise 2 / Solutions 2.
Week 3 (9 Mar., Stefan M. Moser): Gaussian channel: converse, source-channel separation, bit-error converse; bandlimited Gaussian channel. Exercise 3 / Solutions 3.
Week 4 (16 Mar., Stefan M. Moser): Parallel Gaussian channel: independent case, waterfilling solution (an illustrative sketch follows this table), dependent case. Exercise 4 / Solutions 4.
Week 5 (23 Mar., Amos Lapidoth): Multiple-access channel (MAC): definitions, pentagon regions, time-sharing, capacity region, achievability of pentagon regions with joint weak typicality decoding. Exercise 5 / Solutions 5.
Week 6 (30 Mar., Amos Lapidoth): Multiple-access channel (MAC): alternative characterization of the capacity region, converse. Handout 1. Exercise 6 / Solutions 6.
Week 7 (6 Apr., Stefan M. Moser): Broadcast channel: physically degraded channel, stochastically degraded channel, superposition coding with joint weak typicality decoding, proof sketch that superposition coding is optimal for degraded channels. Exercise 7 / Solutions 7.
Week 8 (13 Apr., Amos Lapidoth): Channels with IID states: Shannon strategies, achievability and converse for causal state information, defective memories, Gelfand–Pinsker capacity, writing on dirty paper. Exercise 8 / Solutions 8.
20 Apr.: holidays (no class).
Week 9 (27 Apr., Amos Lapidoth): Gelfand–Pinsker (noncausal state information): converse and direct part. Exercise 9 / Solutions 9.
Week 10 (4 May, Stefan M. Moser): Method of types (4 type theorems); universal source compression. Exercise 10 / Solutions 10.
Week 11 (11 May, Stefan M. Moser): Sanov's theorem; Stein's lemma. Exercise 11 / Solutions 11.
Week 12 (18 May, Amos Lapidoth): Guessing and Rényi entropy. Exercise 12 / Solutions 12.
Week 13 (25 May): holiday (no class).
Week 14 (1 Jun., Amos Lapidoth): Multiple-access channel with feedback: Gaarder–Wolf, Cover–Leung. Handout 2. Exercise 13 / Solutions 13.
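
As a small, unofficial illustration of the waterfilling solution covered in week 4 (not part of the class material; the function name, the bisection approach, and the numerical values are our own), the following Python sketch allocates a total power budget over independent parallel Gaussian channels with noise variances N_i by choosing the water level nu so that the allocated powers max(0, nu - N_i) sum to the budget.

    import numpy as np

    def waterfilling(noise, total_power, iters=200):
        # Power allocation P_i = max(0, nu - N_i) for independent parallel
        # Gaussian channels; the water level nu is found by bisection so that
        # the allocated powers sum to total_power.
        noise = np.asarray(noise, dtype=float)
        lo, hi = noise.min(), noise.max() + total_power
        for _ in range(iters):
            nu = 0.5 * (lo + hi)
            if np.maximum(nu - noise, 0.0).sum() > total_power:
                hi = nu   # water level too high: too much power allocated
            else:
                lo = nu   # water level too low (or exact): too little allocated
        return np.maximum(0.5 * (lo + hi) - noise, 0.0)

    N = np.array([1.0, 2.0, 4.0])
    P = waterfilling(N, total_power=3.0)
    print(P)                                   # approx. [2, 1, 0]: quieter channels get more power
    print(np.sum(0.5 * np.log2(1.0 + P / N)))  # achieved rate in bits per channel use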
