Information Theory II
| ||Prof. Amos Lapidoth||Prof. Stefan M. Moser|
|Office:||ETF E107||ETF E104|
|Phone:||044 632 51 92||044 632 36 24|
Office: ETF D108
Phone: 044 632 65 59
Note that Prof. Lapidoth and Prof. Moser teach this course jointly. In the weeks when Prof. Lapidoth lectures, the lecture runs from 3 to 5 and the exercise session from 1 to 3; in the weeks when Prof. Moser lectures, the lecture runs from 1 to 3 and the exercise session from 3 to 5.
We use the textbook by Thomas M. Cover and Joy A. Thomas:
T.M. Cover, J.A. Thomas, Elements of Information Theory, 2nd Edition, John Wiley & Sons, Inc., New York, NY, USA, 2006.
Stefan M. Moser: Information Theory (Lecture Notes), 5th edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Department of Electrical & Computer Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2017.
Stefan M. Moser: Advanced Topics in Information Theory (Lecture Notes), 2nd edition, Signal and Information Processing Laboratory, ETH Zürich, Switzerland, and Department of Electrical & Computer Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan, 2013.
A. El Gamal, Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.
G. Kramer, Topics in Multi-User Information Theory.
Printed copies of the exercises will be handed out in class. See the timetable below.
Oral exam (30 min.).
The lecture is in English.
|Week||Date||Topics||Handouts||Exercises||Solutions||Lecturer|
|1||23 Feb.||Differential entropy: definition, shifting, scaling, conditioning, chain rule; relative entropy; mutual information; data processing inequality; differential entropy of multivariate Gaussians; maximum differential entropy principle||Infos, Contents||Exercise 1||Solutions 1||Stefan M. Moser|
|2||2 Mar.||Gaussians maximize differential entropy for a given covariance matrix; weak typicality and joint weak typicality; Gaussian channel: average-power constraint, capacity, achievability with joint weak typicality decoding||-||Exercise 2||Solutions 2||Stefan M. Moser|
|3||9 Mar.||Gaussian channel: converse, source-channel separation, bit-error converse; bandlimited Gaussian channel||-||Exercise 3||Solutions 3||Stefan M. Moser|
|4||16 Mar.||Parallel Gaussian channel: independent case, waterfilling solution, dependent case||-||Exercise 4||Solutions 4||Stefan M. Moser|
|5||23 Mar.||Multiple-access channel (MAC): definitions, pentagon regions, time-sharing, capacity region, achievability of pentagon regions with joint weak typicality decoding||-||Exercise 5||Solutions 5||Amos Lapidoth|
|6||30 Mar.||Multiple-access channel (MAC): alternative characterization of the capacity region, converse||Handout 1||Exercise 6||Solutions 6||Amos Lapidoth|
|7||6 Apr.||Broadcast channel: physically degraded channel, stochastically degraded channel, superposition coding with joint weak typicality decoding, proof sketch that superposition coding is optimal for degraded channels||-||Exercise 7||Solutions 7||Stefan M. Moser|
|8||13 Apr.||Channels with IID states: Shannon strategies, achievability and converse for causal state information, defective memories, Gelfand–Pinsker capacity, writing on dirty paper||-||Exercise 8||Solutions 8||Amos Lapidoth|
|9||27 Apr.||Gelfand–Pinsker (noncausal state information): converse and direct part||-||Exercise 9||Solutions 9||Amos Lapidoth|
|10||4 May||Method of types (4 type theorems); universal source compression||-||Exercise 10||Solutions 10||Stefan M. Moser|
|11||11 May||Sanov's theorem; Stein's lemma||-||Exercise 11||Solutions 11||Stefan M. Moser|
|12||18 May||Guessing and Rényi entropy||-||Exercise 12||Solutions 12||Amos Lapidoth|
|14||1 Jun.||Multiple-access channel with feedback: Gaarder–Wolf, Cover–Leung||Handout 2||Exercise 13||Solutions 13||Amos Lapidoth|
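As a small numerical companion to the week-1 topic (not part of the official course material), the following Python sketch checks the closed-form differential entropy of a Gaussian, h(X) = ½ ln(2πeσ²) nats, against a Monte-Carlo estimate of −E[ln f(X)]; all names here are illustrative, not from the lecture notes.

```python
import math
import random

def gaussian_diff_entropy(sigma):
    """Closed form h(X) = 0.5 * ln(2*pi*e*sigma^2) in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def monte_carlo_entropy(sigma, n=200_000, seed=0):
    """Estimate h(X) = -E[ln f(X)] by sampling X ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        # log-density of N(0, sigma^2) at x
        log_f = -0.5 * math.log(2 * math.pi * sigma**2) - x**2 / (2 * sigma**2)
        total += -log_f
    return total / n

sigma = 2.0
print(gaussian_diff_entropy(sigma))   # ≈ 2.112 nats
print(monte_carlo_entropy(sigma))     # close to the closed form
```

The same sampling check works for any density with a known closed form, which makes it a handy sanity test when working through the week-1 exercises.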
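For the week-4 topic, here is a short Python sketch (again illustrative, not course code) of the waterfilling power allocation for independent parallel Gaussian channels: the water level ν is found by bisection, each sub-channel gets P_i = max(0, ν − N_i), and the capacity is the sum of the per-channel Gaussian capacities.

```python
import math

def waterfill(noise, total_power, tol=1e-12):
    """Water-filling over parallel Gaussian channels.

    noise[i] is the noise variance of sub-channel i. Returns the
    allocation P_i = max(0, nu - noise[i]) with sum(P_i) = total_power,
    where the water level nu is found by bisection.
    """
    lo, hi = min(noise), max(noise) + total_power
    while hi - lo > tol:
        nu = (lo + hi) / 2
        used = sum(max(0.0, nu - n) for n in noise)
        if used > total_power:
            hi = nu
        else:
            lo = nu
    nu = (lo + hi) / 2
    return [max(0.0, nu - n) for n in noise]

def capacity(noise, powers):
    """C = sum_i 0.5 * log2(1 + P_i / N_i) bits per channel use."""
    return sum(0.5 * math.log2(1 + p / n) for p, n in zip(powers, noise))

noise = [1.0, 2.0, 4.0]
P = waterfill(noise, total_power=3.0)
print([round(p, 6) for p in P])      # [2.0, 1.0, 0.0]
print(round(capacity(noise, P), 4))  # ≈ 1.085
```

In this example the water level settles at ν = 3, so the noisiest sub-channel receives no power, which is exactly the behavior the waterfilling picture predicts.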
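The week-10 material on the method of types can likewise be checked numerically. The sketch below (illustrative only; the sequence and function names are made up) computes the exact size of a type class as a multinomial coefficient and verifies the standard type-counting bounds 2^{nH(P)}/(n+1)^{|X|} ≤ |T(P)| ≤ 2^{nH(P)}.

```python
import math
from collections import Counter

def type_class_size(seq):
    """Exact number of sequences sharing the type (empirical
    distribution) of seq: the multinomial coefficient n!/prod(counts!)."""
    size = math.factorial(len(seq))
    for c in Counter(seq).values():
        size //= math.factorial(c)
    return size

def empirical_entropy(seq):
    """H(P_seq) in bits for the empirical distribution of seq."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

seq = "aabbbcc" * 3          # n = 21, alphabet {a, b, c}
n = len(seq)
H = empirical_entropy(seq)
size = type_class_size(seq)

# Type-counting bounds: 2^{nH}/(n+1)^{|X|} <= |T(P)| <= 2^{nH}
alphabet = len(set(seq))
assert 2 ** (n * H) / (n + 1) ** alphabet <= size <= 2 ** (n * H)
print(size)
```

Even for this tiny alphabet the upper bound 2^{nH} is loose only by a polynomial factor in n, which is the point exploited throughout the type theorems.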
Stefan M. Moser
Senior Researcher & Lecturer, ETH Zurich, Switzerland
Adj. Professor, National Chiao Tung University (NCTU), Taiwan
Web: http://moser-isi.ethz.ch/
Last modified: Mon Jun 5 15:52:25 CEST 2017