Statistical signal processing


The proper treatment of modern communication systems requires modelling signals as random processes. The signal description often involves a number of parameters, such as carrier frequency, timing, channel impulse response, noise variance, or interference spectrum. The values of these parameters are unknown and must be estimated before the receiver can proceed.

Parameters also arise in other random models, such as the analysis of communication networks, or in descriptions of sounds, images, and other data (e.g. geolocation data). This course provides an introduction to the basic techniques for estimating a finite set of parameters, a signal spectrum, or a complete signal on the basis of a correlated signal (optimal filtering: Wiener and Kalman filtering). The techniques introduced in this course have a proven track record spanning many decades. They are complementary to the techniques introduced in the EURECOM course Stat, and are useful in other application areas such as machine learning, covered in the EURECOM courses MALIS and ASI.

Teaching and Learning Methods: Lectures, homework, exercise sessions and a lab session (groups of 1-2 students depending on class size).

Course Policies: Attendance of the lab session is mandatory (15% of final grade).


[1] H.L. Van Trees. Detection, Estimation and Modulation Theory, volume 1. Wiley, New York, 1968.

[2] L. Scharf. Statistical Signal Processing. Addison-Wesley, Reading, MA, 1991.

[3] S.M. Kay. Fundamentals of Statistical Signal Processing: Estimation Theory. Prentice Hall, 1993.

[4] B. Porat. Digital Processing of Random Signals: Theory and Methods. Prentice Hall, 1994.

[5] C.W. Therrien. Discrete Random Signals and Statistical Signal Processing. Prentice Hall, 1992.

[6] M.H. Hayes. Statistical Digital Signal Processing and Modeling. Wiley, 1996. PDF available online.

[7] S. Kay. Modern Spectral Estimation: Theory and Application. Prentice Hall, 1988.

[8] P. Stoica, R. Moses. Spectral Analysis of Signals. Prentice Hall, 2005.

[9] T. Kailath. Lectures on Wiener and Kalman Filtering. Springer-Verlag, Wien – New York, 1981.

[10] T. Kailath, A.H. Sayed, B. Hassibi. Linear Estimation. Prentice Hall, 2000.

[11] A.H. Sayed. Adaptive Filters. Wiley-IEEE Press, 2008.

The slides used in class are available. A preliminary (not yet finalized) set of lecture notes also exists.


Basics of probability and random variables/processes; basics of (digital) signal processing (linear systems, Fourier transform, z-transform); basics of linear algebra (vectors, matrices, solving linear equations); basic notions of optimization. Useful prerequisite course at EURECOM: MathEng.

  • Parameter estimation: Random parameters, Bayesian estimation: minimum mean squared error estimation, orthogonality principle, maximum a posteriori estimation, performance bounds, linear estimation, the linear model. Deterministic unknown parameters: minimum variance estimation, bias, efficiency, consistency, Cramer-Rao lower bound, maximum likelihood estimation, EM algorithm, least-squares and BLUE methods, method of moments, the linear model.
  • Spectrum estimation: Non-parametric techniques: periodogram, windowing, spectral leakage and resolution. Parametric techniques: autoregressive processes, linear prediction, maximum entropy, Levinson and Schur algorithms, lattice filters. Time and frequency domain localization, short-time Fourier transform, wavelet transform, QMF, subbands, perfect reconstruction filter banks.
  • Optimal filtering: Wiener filtering (non-causal, causal and FIR), application to channel equalization. Kalman filtering: time-varying and time-invariant state-space models, application to channel tracking.
  • Adaptive filtering: Some elements of optimization theory, the steepest-descent algorithm. The LMS (Least Mean Squares) and RLS (Recursive Least-Squares) algorithms, performance analysis. Tracking of time-varying parameters, applications.
  • Sinusoids in noise: Maximum likelihood estimation, Cramer-Rao bounds, IQML algorithm and variations, subspace techniques, moment matching, MVDR filtering, Prony and Pisarenko techniques, Capon method, adaptive notch filtering.
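The parameter-estimation topic can be illustrated with a minimal Python/NumPy sketch. Under an assumed toy model (a constant observed in white Gaussian noise), the maximum likelihood estimator is the sample mean; a Monte Carlo check shows it is unbiased and that its variance is close to the Cramer-Rao lower bound sigma^2/N:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 3.0, 1.0        # assumed true parameter and noise std
N, trials = 100, 2000          # samples per experiment, Monte Carlo runs

# ML estimator of a constant in white Gaussian noise: the sample mean
estimates = np.array([np.mean(theta + sigma * rng.standard_normal(N))
                      for _ in range(trials)])

crlb = sigma ** 2 / N          # Cramer-Rao lower bound for this model
emp_var = estimates.var()      # empirical variance of the estimator

print(f"empirical variance {emp_var:.5f} vs CRLB {crlb:.5f}")
```

For this Gaussian model the sample mean is efficient, so the two printed numbers should agree up to Monte Carlo fluctuation.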
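Non-parametric spectrum estimation can be sketched in a few lines (assumed test signal: one sinusoid in white noise). A windowed periodogram locates the sinusoid frequency; the Hamming window trades some resolution for reduced spectral leakage:

```python
import numpy as np

rng = np.random.default_rng(1)
N, f0 = 1024, 0.2                      # samples, normalized frequency (assumed)
n = np.arange(N)
x = np.cos(2 * np.pi * f0 * n) + 0.5 * rng.standard_normal(N)

w = np.hamming(N)                      # window to reduce spectral leakage
X = np.fft.rfft(x * w)
pxx = np.abs(X) ** 2 / np.sum(w ** 2)  # windowed (modified) periodogram

freqs = np.arange(len(pxx)) / N        # frequency grid, cycles per sample
f_hat = freqs[np.argmax(pxx)]
print(f"estimated frequency: {f_hat:.4f}")
```

The estimate is quantized to the FFT bin spacing 1/N, which illustrates the resolution limit of the raw periodogram.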
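The FIR Wiener filter solves the Wiener-Hopf (normal) equations R w = p. A minimal Python sketch under an assumed model (an AR(1) desired signal observed in additive white noise), with correlations replaced by sample estimates, shows the filtered estimate beating the raw observation in mean squared error:

```python
import numpy as np

rng = np.random.default_rng(2)
N, L = 20000, 8                         # samples, FIR filter length

# Assumed model: AR(1) desired signal d, observation x = d + white noise
d = np.zeros(N)
for n in range(1, N):
    d[n] = 0.9 * d[n - 1] + rng.standard_normal()
x = d + rng.standard_normal(N)

# Sample autocorrelation of x and cross-correlation E[d(n) x(n-k)]
r = np.array([np.mean(x[k:] * x[:N - k]) for k in range(L)])
p = np.array([np.mean(d[k:] * x[:N - k]) for k in range(L)])

R = np.array([[r[abs(i - j)] for j in range(L)] for i in range(L)])
w = np.linalg.solve(R, p)               # Wiener-Hopf equations R w = p

d_hat = np.convolve(x, w)[:N]           # filtered estimate of d
mse_filtered = np.mean((d - d_hat)[L:] ** 2)
mse_raw = np.mean((d - x) ** 2)         # unfiltered MSE (about the noise variance)
print(f"MSE raw {mse_raw:.3f} -> Wiener {mse_filtered:.3f}")
```

Replacing the Toeplitz solve with an order-recursive solution is exactly what the Levinson algorithm from the spectrum-estimation topic provides.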
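The LMS algorithm from the adaptive-filtering topic can be sketched as follows (assumed setup: identifying a short hypothetical FIR channel from input/output data); each iteration takes a stochastic-gradient step along the instantaneous squared error:

```python
import numpy as np

rng = np.random.default_rng(3)
h = np.array([0.5, -0.3, 0.1])           # hypothetical unknown FIR channel
N, mu = 5000, 0.01                        # iterations, step size

x = rng.standard_normal(N)
d = np.convolve(x, h)[:N] + 0.01 * rng.standard_normal(N)

w = np.zeros(len(h))                      # adaptive filter taps
for n in range(len(h), N):
    u = x[n:n - len(h):-1]                # regressor: newest sample first
    e = d[n] - w @ u                      # a priori output error
    w = w + mu * e * u                    # LMS stochastic-gradient update

print(f"estimated taps: {np.round(w, 3)}")
```

The step size mu controls the trade-off, analyzed in the course, between convergence speed and steady-state misadjustment.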
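For the sinusoids-in-noise topic, a minimal Pisarenko sketch (assumed signal: one real sinusoid in white noise): the noise eigenvector of the 3x3 autocorrelation matrix defines a polynomial whose roots lie at e^{+-j 2 pi f0}, giving the frequency without any spectral search:

```python
import numpy as np

rng = np.random.default_rng(4)
N, f0 = 16384, 0.1                        # samples, true frequency (assumed)
n = np.arange(N)
x = np.cos(2 * np.pi * f0 * n + 1.0) + 0.3 * rng.standard_normal(N)

# 3x3 sample autocorrelation matrix (one real sinusoid spans 2 signal dims)
r = np.array([np.mean(x[k:] * x[:N - k]) for k in range(3)])
R = np.array([[r[abs(i - j)] for j in range(3)] for i in range(3)])

# Noise eigenvector: the eigenvector of the smallest eigenvalue
vals, vecs = np.linalg.eigh(R)
v = vecs[:, 0]

# Its polynomial v0 z^2 + v1 z + v2 has roots at e^{+-j 2 pi f0}
roots = np.roots(v)
f_hat = abs(np.angle(roots[0])) / (2 * np.pi)
print(f"estimated frequency: {f_hat:.4f}")
```

This is the simplest subspace technique covered in the course; MUSIC-type methods generalize it to larger correlation matrices and multiple sinusoids.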

Learning Outcomes:

Upon successful completion of the course, students will be able to understand and apply:

  • Bayesian and deterministic parameter estimation methods
  • Non-parametric spectrum estimation, linear prediction and auto-regressive modeling
  • Wiener filtering
  • Adaptive filtering (in particular the LMS algorithm, the main stochastic gradient technique)

Number of hours: 42, including 6 hours of exercise sessions and a 3-hour lab session.

Grading Policy: Homework (15%), lab report (15%), final exam (70%). The final exam is a 2-hour written exam; all documents authorized.