Sparse Gaussian processes revisited: Bayesian approaches to inducing-variable approximations

Rossi, Simone; Heinonen, Markus; Bonilla, Edwin V.; Shen, Zheyang; Filippone, Maurizio
AISTATS 2021, 24th International Conference on Artificial
Intelligence and Statistics, 13-15 April 2021, San Diego,
California, USA (Virtual Conference)

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models. Besides enabling scalability, one of their main advantages over sparse approximations using direct marginal likelihood maximization is that they provide a robust alternative for point estimation of the inducing inputs, i.e., the locations of the inducing variables. In this work we challenge the common wisdom that optimizing the inducing inputs in the variational framework yields optimal performance. We show that, by revisiting old model approximations such as the fully independent training conditional (FITC), endowed with powerful sampling-based inference methods, treating both inducing locations and GP hyper-parameters in a Bayesian way can improve performance significantly. Based on stochastic gradient Hamiltonian Monte Carlo (SGHMC), we develop a fully Bayesian approach to scalable GP and deep GP models, and we demonstrate its state-of-the-art performance through an extensive experimental campaign across several regression and classification problems.

© 2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.