Graduate School and Research Center in Digital Sciences

Advanced Statistical Inference

[ASI]
Technical Teaching


Abstract

This course focuses on the principles of learning from data and the quantification of uncertainty, complementing and enriching the Introduction to Statistical Learning course. The course is divided into two main parts corresponding to the supervised and unsupervised learning paradigms. The presentation of the material follows a common thread based on the probabilistic approach to data modeling, so that many classical algorithms, such as least squares and k-means, can be seen as special cases of inference problems for more general probabilistic models. The probabilistic view also makes it possible to derive inference algorithms for a class of nonparametric models with close connections to neural networks and support vector machines. As in the Introduction to Statistical Learning course, the focus is not on the algorithmic background of the methods but on their mathematical and statistical foundations. The lectures are complemented by lab sessions that guide students through the design and validation of the methods developed during the lectures.
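As a concrete illustration of the probabilistic thread described above, the following minimal sketch (Python/NumPy; not part of the course material, with purely illustrative data and variable names) shows that ordinary least squares coincides with maximum-likelihood inference under a linear model with Gaussian noise.

# Minimal sketch: least squares as maximum likelihood under the assumed model
# y = X w + eps, with eps ~ N(0, sigma^2 I). Maximising the Gaussian
# log-likelihood in w gives exactly the ordinary least-squares solution.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data generated from the assumed linear-Gaussian model.
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.3 * rng.normal(size=n)

# Maximum-likelihood estimate: argmax_w sum_i log N(y_i | x_i^T w, sigma^2)
# does not depend on sigma^2 and coincides with the least-squares solution.
w_ml, *_ = np.linalg.lstsq(X, y, rcond=None)

# Closed-form normal equations give the same answer: w = (X^T X)^{-1} X^T y.
w_ls = np.linalg.solve(X.T @ X, X.T @ y)

print(np.allclose(w_ml, w_ls))  # True: Gaussian MLE == least squares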

Teaching and Learning Methods: Lectures and Lab sessions (preferably one student per group)

Course Policies: Attendance at Lab sessions is mandatory

Bibliography

·         Pattern Recognition and Machine Learning (2006). C. M. Bishop, Springer.

·         A First Course in Machine Learning (2011). S. Rogers, M. Girolami, CRC Press.

Requirements

Basic knowledge of linear algebra and calculus

Description

The course will cover a selection of the following topics:

·         Introduction

o   Recap on linear algebra and calculus

o   Overview of probability theory

·         Supervised learning

o   Linear regression

o   Linear classification

o   Bayesian classification

o   Kernel methods for nonlinear regression and classification

·         Unsupervised learning

o   K-means and Kernel K-means (see the probabilistic sketch after this list)

o   Gaussian mixture models

o   Principal component analysis (PCA) and Kernel PCA

·         Advanced topics

o   Dirichlet Processes

o   Gaussian Processes

o   Variational inference

o   Markov chain Monte Carlo
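The same probabilistic thread links the K-means and Gaussian mixture model topics above: k-means can be recovered as a limiting case of expectation-maximization (EM) for a spherical Gaussian mixture with a shared variance shrinking to zero. The sketch below (Python/NumPy; purely illustrative, with all names and parameter choices being assumptions rather than course code) demonstrates this limit on toy data.

# Minimal sketch: EM for a spherical Gaussian mixture with fixed, shared
# variance sigma^2 and uniform mixing weights. As sigma^2 -> 0 the
# responsibilities become hard nearest-mean assignments, i.e. k-means.
import numpy as np

def em_spherical_gmm(X, K, sigma2, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(len(X), size=K, replace=False)]  # initial means
    for _ in range(n_iter):
        # E-step: r[i, k] proportional to exp(-||x_i - mu_k||^2 / (2 sigma^2)).
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        logits = -d2 / (2.0 * sigma2)
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(logits)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: means are responsibility-weighted averages of the data.
        # (Small constant guards against a component with zero responsibility.)
        mu = (r.T @ X) / (r.sum(axis=0)[:, None] + 1e-12)
    return mu, r

# Toy data: two well-separated clusters in two dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3, 1, size=(100, 2)),
               rng.normal(3, 1, size=(100, 2))])

mu_soft, r_soft = em_spherical_gmm(X, K=2, sigma2=1.0)    # soft EM assignments
mu_hard, r_hard = em_spherical_gmm(X, K=2, sigma2=1e-6)   # ~k-means: r is 0/1
print(np.round(r_hard[:5], 3))  # responsibilities collapse to hard assignments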

Learning outcomes:

·         Be able to identify the key elements composing a given probabilistic model

·         Be able to recognize the suitability of different probabilistic models given a machine learning problem

·         Be able to use the appropriate techniques to derive probabilistic machine learning algorithms

·         Be able to develop proof-of-concept software to analyze data with probabilistic machine learning algorithms

Nb hours: 42.00, with at least 4 Lab sessions (12 hours)

Grading Policy: assessed exercise (25%), final written exam (75%)

Nb hours per week: 3.00