Broadening the scope of Gaussian processes for large-scale learning

Cutajar, Kurt
Thesis

The renewed importance of decision making under uncertainty calls for a re-evaluation of Bayesian inference techniques targeting this goal in the big data regime. Gaussian processes (GPs) are a fundamental building block of many probabilistic kernel methods; however, the computational and storage complexity of GPs hinders their scaling to large modern datasets. The contributions presented in this thesis are two-fold. We first investigate the effectiveness of exact GP inference on a computational budget by proposing a novel scheme for accelerating regression and classification by way of preconditioning. In the spirit of probabilistic numerics, we also show how the numerical uncertainty introduced by approximate linear algebra should be adequately evaluated and incorporated. Bridging the gap between GPs and deep learning techniques remains a pertinent research goal, and the second broad contribution of this thesis is to establish and reinforce the role of GPs, and their deep counterparts (DGPs), in this setting. Whereas GPs and DGPs were once deemed unfit to compete with state-of-the-art deep learning methods, we demonstrate how such models can also be adapted to the large-scale and complex tasks to which machine learning is now being applied.
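The first contribution mentioned above concerns accelerating exact GP inference through preconditioning. As a rough, hypothetical illustration of the underlying idea only (not the specific scheme developed in the thesis), the sketch below solves the GP regression system (K + noise*I) alpha = y with preconditioned conjugate gradients, using a deliberately simple Jacobi (diagonal) preconditioner; all function names, parameters, and data are illustrative assumptions.

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between two input sets.
        sq = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
              - 2.0 * X1 @ X2.T)
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    def pcg_solve(A, b, precond, tol=1e-6, max_iter=200):
        # Preconditioned conjugate gradients for the SPD system A x = b.
        x = np.zeros_like(b)
        r = b - A @ x
        z = precond(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            step = rz / (p @ Ap)
            x += step * p
            r -= step * Ap
            if np.linalg.norm(r) < tol:
                break
            z = precond(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Toy regression problem (illustrative data only).
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(500, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

    noise = 0.1**2
    K = rbf_kernel(X, X) + noise * np.eye(len(X))

    # Jacobi (diagonal) preconditioner: a simple stand-in for the
    # more sophisticated preconditioners studied in the thesis.
    diag = np.diag(K)
    alpha = pcg_solve(K, y, precond=lambda r: r / diag)

    # Predictive mean at a few test inputs.
    X_star = np.linspace(-3, 3, 5)[:, None]
    print(rbf_kernel(X_star, X) @ alpha)

The point of preconditioning in this setting is that each CG iteration costs only matrix-vector products, so a preconditioner that reduces the number of iterations directly reduces the cost of exact GP inference on a fixed computational budget.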


HAL
Type:
Thesis
Date:
2019-04-24
Department:
Data Science
Eurecom Ref:
5852
Copyright:
© EURECOM. Personal use of this material is permitted. The definitive version of this thesis is available at the permalink below.
PERMALINK : https://www.eurecom.fr/publication/5852