DS Talk - Kurt CUTAJAR : "Scalable Gaussian processes with a twist of probabilistic numerics"

Kurt CUTAJAR - PhD student
Data Science

Date: October 5th 2017
Location: Eurecom

Speaker: Kurt Cutajar, PhD student in the Data Science department.

Abstract: Developing scalable learning models without compromising performance is at the forefront of machine learning research. The scalability of such models is predominantly hindered by linear algebraic operations with large computational complexity, among which is the solution of linear systems involving kernel matrices. A common way to tackle this scalability issue is the conjugate gradient algorithm, but this technique is not without its own issues: the conditioning of kernel matrices is often such that conjugate gradients converge poorly in practice. Preconditioning is a common approach to alleviating this issue. With particular emphasis on Gaussian processes, this talk shall outline how preconditioning can be effectively exploited to develop a scalable approach to both solving kernel machines and learning their hyperparameters [1]. Inspired by recent developments in the field of probabilistic numerics, the talk shall also cover ongoing work on characterising the computational uncertainty introduced by such algebraic approximations. This ties in with our work on casting the computation of the log determinant of a matrix as an estimation problem [2].

References:
[1] K. Cutajar, J. P. Cunningham, M. Osborne, M. Filippone. Preconditioning Kernel Matrices. ICML 2016.
[2] J. Fitzsimons, K. Cutajar, M. Osborne, S. Roberts, M. Filippone. Bayesian Inference of Log Determinants. UAI 2017.

Bio: Kurt is a second-year PhD student supervised by Dr. Maurizio Filippone. His primary research interests involve developing scalable approximations to Gaussian process inference without compromising on precision and performance. Of particular interest is whether the penalties associated with such approximations can be tuned to a given computational budget. Bridging the gap between Gaussian processes and deep learning techniques is another ongoing research goal. This work has resulted in multiple collaborations with external institutions, most recently culminating in a three-month visiting period at the University of Oxford.
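The core operation discussed in the abstract, solving a kernel system (K + σ²I)x = y with conjugate gradients, can be sketched as follows. The Nyström-style preconditioner, the RBF kernel, and all sizes and constants below are illustrative assumptions for this announcement, not the exact construction from the ICML 2016 paper [1].

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
n, m, sigma2 = 500, 50, 1e-2          # illustrative sizes and noise variance

X = rng.standard_normal((n, 2))
y = rng.standard_normal(n)

def rbf(A, B, lengthscale=1.0):
    # Squared-exponential (RBF) kernel between two sets of points.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

K = rbf(X, X) + sigma2 * np.eye(n)    # kernel matrix with noise jitter

# Nystrom-style preconditioner: approximate K by
#   P = Knm Kmm^{-1} Kmn + sigma^2 I
# and apply P^{-1} cheaply via the Woodbury identity (matvecs only).
idx = rng.choice(n, m, replace=False)
Knm = rbf(X, X[idx])
Kmm = rbf(X[idx], X[idx]) + 1e-6 * np.eye(m)
inner = Kmm + Knm.T @ Knm / sigma2

def apply_Pinv(v):
    # Woodbury: (sigma^2 I + Knm Kmm^{-1} Kmn)^{-1} v
    w = Knm.T @ v / sigma2
    return v / sigma2 - Knm @ np.linalg.solve(inner, w) / sigma2

P = LinearOperator((n, n), matvec=apply_Pinv)

def run_cg(M=None):
    # Solve K x = y, counting iterations via the callback.
    iters = {"count": 0}
    x, info = cg(K, y, M=M, maxiter=5000,
                 callback=lambda xk: iters.__setitem__("count", iters["count"] + 1))
    return x, info, iters["count"]

x_plain, info0, it_plain = run_cg()       # plain CG
x_pre, info1, it_pre = run_cg(P)          # preconditioned CG
```

With a preconditioner that captures the dominant structure of K, the preconditioned run typically needs far fewer iterations than plain CG, which is the scalability lever the talk focuses on.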
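The idea of casting the log determinant as an estimation problem [2] can be illustrated with a generic stochastic estimator: Hutchinson trace probes combined with a truncated Taylor series of log(I − B), using only matrix-vector products. This is a minimal sketch under assumed normalisation and truncation choices, not the Bayesian inference scheme of the UAI 2017 paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
A = rng.standard_normal((n, n))
K = A @ A.T / n + np.eye(n)            # SPD matrix standing in for a kernel

alpha = 1.1 * np.linalg.norm(K, 2)     # upper bound on the largest eigenvalue
B = np.eye(n) - K / alpha              # spectral radius of B is below 1

def logdet_estimate(num_probes=30, order=30):
    # log det K = n log(alpha) + tr(log(K / alpha))
    #           = n log(alpha) - sum_{k>=1} tr(B^k) / k,
    # with tr(B^k) estimated by Rademacher probes z: tr(B^k) ~ E[z^T B^k z].
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        v = z.copy()
        acc = 0.0
        for k in range(1, order + 1):
            v = B @ v                  # v = B^k z, one matvec per term
            acc -= z @ v / k
        total += acc
    return n * np.log(alpha) + total / num_probes

est = logdet_estimate()
exact = np.linalg.slogdet(K)[1]        # dense reference for comparison
```

Because the estimator touches K only through matvecs, it scales to settings where forming a Cholesky factorisation is infeasible; the probabilistic-numerics angle of the talk is about quantifying the uncertainty such randomised approximations introduce.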

Permalink: https://www.eurecom.fr/seminar/63447