Graduate School and Research Center in Digital Sciences

Information theory

Technical Teaching


  • Since 1948, the year of publication of Shannon's landmark paper "A Mathematical Theory of Communication", information theory has laid the groundwork for the most important developments of today's information and communication world.
  • Information theory studies the ultimate theoretical limits of source coding and data compression, and of channel coding and reliable communication over noisy channels, and provides guidelines for the development of practical signal-processing and coding algorithms.
  • This course covers information theory at an introductory level.
  • The practical implications of the theoretical results are illustrated through examples and case studies.

  • Entropy, divergence and mutual information: definitions, elementary relations, inequalities.
  • Lossless source coding: source coding theorems, Huffman codes, universal data compression and Lempel-Ziv coding.
  • Channel coding: channel coding theorems, reliability function and error exponents.
  • Gaussian channels: capacity of discrete-time Gaussian channels, correlated noise, intersymbol interference.
  • Rate-distortion theory: compression of Gaussian sources, vector quantisation.
  • Topics in multiterminal information theory: the max-flow min-cut outer bound on the capacity region of a general multiterminal network, the multiple-access channel, the broadcast channel.
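
As a brief illustration of the first topic above (an addition to this outline, not part of the official syllabus), the Shannon entropy of a discrete distribution can be computed in a few lines of Python:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    Zero-probability outcomes are skipped, following the
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty;
# a biased coin carries less.
print(entropy([0.5, 0.5]))  # → 1.0
print(entropy([0.9, 0.1]))  # ≈ 0.469 bits
```

This quantity is the fundamental limit studied in the lossless source coding topic: no uniquely decodable code can compress a memoryless source below its entropy, in bits per symbol, on average.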
Nb hours: 42.00
Nb hours per week: 3.00
Control form: written exam