Advanced Statistical Inference

ASI
Abstract

This course focuses on the principles of learning from data and the quantification of uncertainty in the context of machine learning, complementing and enriching the “Machine Learning and Intelligent Systems” course. The presentation of the material follows a common thread based on the probabilistic data modeling approach, so that many classical learning models and algorithms can be seen as special cases of inference problems for more general probabilistic models.

We will start by drawing connections between loss optimization and probabilistic inference (e.g., maximum likelihood estimation, maximum a posteriori estimation). We will then introduce the concept of Bayesian inference, and discuss how to perform inference in complex and intractable models using approximate methods (e.g., variational inference, Markov chain Monte Carlo, the Laplace approximation). We will then focus on prediction and the evaluation of predictive models, starting with simple predictive models (e.g., linear regression, logistic regression) and moving on to more complex ones (e.g., Gaussian processes, neural networks). We will discuss all these models in the context of probabilistic inference, putting into practice the probabilistic methods introduced in the first half of the course. While mostly focused on supervised learning, we will also take a look at unsupervised learning and probabilistic generative models. Finally, the course will be complemented by several practical sessions, in which students will implement and experiment with the methods discussed in class (using Python).
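
To give a flavor of the first of these connections: for linear regression with Gaussian observation noise, maximum likelihood estimation coincides with squared-loss minimization,

    \hat{\mathbf{w}}_{\mathrm{MLE}} = \arg\max_{\mathbf{w}} \sum_{n=1}^{N} \log \mathcal{N}(y_n \mid \mathbf{w}^\top \mathbf{x}_n, \sigma^2) = \arg\min_{\mathbf{w}} \sum_{n=1}^{N} \bigl(y_n - \mathbf{w}^\top \mathbf{x}_n\bigr)^2,

and placing a Gaussian prior \mathcal{N}(\mathbf{0}, \lambda^{-1}\mathbf{I}) on the weights turns the maximum a posteriori estimate into ridge regression (squared loss plus an L2 penalty).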

Teaching and Learning Methods: Lectures and Lab sessions (preferably one student per group).

Course Policies: Attendance at the Lab sessions is mandatory.

Bibliography

  • Book: BISHOP C. M. Pattern Recognition and Machine Learning. Springer-Verlag, 2006, 768p.
  • Book: MURPHY K. P. Probabilistic Machine Learning: An Introduction. MIT Press, 2022.
  • Book: MURPHY K. P. Probabilistic Machine Learning: Advanced Topics. MIT Press, 2023.

Requirements

Prerequisites

  • Probability theory and statistics.
  • Linear algebra and calculus.
  • Basic programming skills (Python).
  • Basic knowledge of machine learning.

Description

The course will cover a selection of the following topics:

Introduction

  • Recap on linear algebra and calculus
  • Overview of probability theory

Bayesian inference

  • Definition of likelihood, prior and posterior (see the short sketch after this list)
  • Maximum likelihood estimation
  • Posterior estimation
  • Model selection
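
As a taste of how these quantities interact, here is a minimal Python sketch, assuming a Beta-Bernoulli (coin-flipping) model; the model, prior pseudo-counts and sample size are illustrative choices, not part of the syllabus:

    # Likelihood, prior, posterior and MLE in a conjugate Beta-Bernoulli model.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    flips = rng.binomial(1, 0.7, size=20)           # 20 coin flips, true bias 0.7
    heads = int(flips.sum())
    tails = len(flips) - heads

    theta_mle = heads / len(flips)                  # maximum likelihood estimate

    # Conjugate update: Beta(a, b) prior -> Beta(a + heads, b + tails) posterior.
    a, b = 2.0, 2.0                                 # prior pseudo-counts
    posterior = stats.beta(a + heads, b + tails)

    print("MLE:", theta_mle)
    print("Posterior mean:", posterior.mean())
    print("95% credible interval:", posterior.interval(0.95))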

Approximate inference:

  • Variational inference
  • Laplace approximation
  • Markov chain Monte Carlo (see the sketch after this list)
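
As an example of the sampling-based side of this block, here is a minimal random-walk Metropolis sketch in Python; the target density, step size and chain length are arbitrary illustrative choices:

    # Random-walk Metropolis sampler for an unnormalised 1-D target density.
    import numpy as np

    def log_target(theta):
        # Unnormalised log-density of an equal mixture of Gaussians at -2 and +2.
        return np.logaddexp(-0.5 * (theta - 2.0) ** 2, -0.5 * (theta + 2.0) ** 2)

    rng = np.random.default_rng(0)
    samples = np.empty(5000)
    theta = 0.0
    for i in range(len(samples)):
        proposal = theta + rng.normal(scale=1.0)              # symmetric proposal
        log_alpha = log_target(proposal) - log_target(theta)  # log acceptance ratio
        if np.log(rng.uniform()) < log_alpha:
            theta = proposal
        samples[i] = theta

    print("Estimated mean of the target:", samples[1000:].mean())  # drop burn-in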

Supervised learning:

  • Linear regression (a Bayesian treatment is sketched after this list)
  • Linear classification
  • Kernel methods for nonlinear regression and classification (Gaussian Processes)
  • Neural networks
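
As an illustration of the Bayesian treatment of the first item, linear regression with a Gaussian prior on the weights has a closed-form posterior and predictive distribution; a minimal NumPy sketch, where the noise precision beta and prior precision alpha are illustrative choices:

    # Bayesian linear regression: closed-form posterior and predictive distribution.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=50)
    y = 1.5 * x - 0.3 + 0.1 * rng.normal(size=50)     # noisy linear data

    Phi = np.column_stack([np.ones_like(x), x])       # design matrix with bias term
    alpha, beta = 1.0, 100.0                          # prior precision, noise precision

    # Posterior over weights: N(m, S) with S^-1 = alpha*I + beta*Phi^T Phi, m = beta*S*Phi^T y.
    S = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
    m = beta * S @ Phi.T @ y

    # Predictive mean and variance at a new input x* = 0.5.
    phi_star = np.array([1.0, 0.5])
    pred_mean = phi_star @ m
    pred_var = 1.0 / beta + phi_star @ S @ phi_star

    print("Posterior mean weights:", m)
    print("Predictive mean and std at x=0.5:", pred_mean, np.sqrt(pred_var))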

Generative models:

  • Gaussian mixture models (see the EM sketch after this list)
  • Variational autoencoders
  • Advanced deep generative models
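
To illustrate the first item, here is a minimal expectation-maximization sketch for a two-component 1-D Gaussian mixture in Python; the data, initialization and number of iterations are illustrative choices:

    # Expectation-Maximization for a two-component 1-D Gaussian mixture model.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 0.5, 100)])

    pi = np.array([0.5, 0.5])                          # mixing weights
    mu = np.array([-1.0, 1.0])                         # component means
    sigma = np.array([1.0, 1.0])                       # component standard deviations
    for _ in range(50):
        # E-step: responsibility of each component for each data point.
        dens = pi * stats.norm.pdf(x[:, None], mu, sigma)      # shape (N, 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations.
        Nk = resp.sum(axis=0)
        pi = Nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / Nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)

    print("weights:", pi, "means:", mu, "stds:", sigma)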

Learning outcomes:

  • To identify the key elements composing a given probabilistic model
  • To recognize the suitability of different probabilistic models given a machine-learning problem
  • To use the appropriate techniques to derive probabilistic machine-learning algorithms
  • To develop code to set up analyses of data using probabilistic machine-learning algorithms

 

Nb hours: 42.00, including at least 4 Lab sessions (12 hours)

Evaluation: 

  • Assessed exercise (25% of the final grade)
  • Final written exam (75% of the final grade)