Graduate School and Research Center in Digital Sciences

Rethinking sparse Gaussian processes: Bayesian approaches to inducing-variable approximations

Rossi, Simone; Heinonen, Markus; Bonilla, Edwin; Shen, Zheyang; Filippone, Maurizio

Submitted to ArXiv, 9 March 2020

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models. Most previous works treat the locations of the inducing variables, i.e., the inducing inputs, as variational hyperparameters, and these are then optimized together with GP covariance hyperparameters. While some approaches have pointed to the benefits of a Bayesian treatment of GP hyperparameters, this has been largely overlooked for the inducing inputs. In this work, we show that treating both inducing locations and GP hyperparameters in a Bayesian way, by inferring their full posterior, yields further significant performance improvements. Building on stochastic gradient Hamiltonian Monte Carlo, we develop a fully Bayesian approach to scalable GP and deep GP models, and demonstrate its competitive performance through an extensive experimental campaign across several regression and classification problems.


Title: Rethinking sparse Gaussian processes: Bayesian approaches to inducing-variable approximations
Department: Data Science
Eurecom ref: 6203
Copyright: © EURECOM. Personal use of this material is permitted. The definitive version of this paper was published in Submitted to ArXiv, 9 March 2020 and is available at:
Bibtex:
@inproceedings{EURECOM+6203,
  year      = {2020},
  title     = {{R}ethinking sparse {G}aussian processes: {B}ayesian approaches to inducing-variable approximations},
  author    = {{R}ossi, {S}imone and {H}einonen, {M}arkus and {B}onilla, {E}dwin and {S}hen, {Z}heyang and {F}ilippone, {M}aurizio},
  booktitle = {{S}ubmitted to {A}r{X}iv, 9 {M}arch 2020},
  address   = {},
  month     = {03},
  url       = {}
}