MCMC for variationally sparse Gaussian processes

Hensman, James; Matthews, Alexander G. de G; Filippone, Maurizio; Ghahramani, Zoubin
NIPS 2015, 29th Annual Conference on Neural Information Processing Systems, December 7-12, 2015, Montreal, Quebec, Canada

Gaussian process (GP) models form a core part of probabilistic machine learning.
Considerable research effort has been made into attacking three issues with GP
models: how to compute efficiently when the number of data is large; how to approximate
the posterior when the likelihood is not Gaussian; and how to estimate
covariance function parameter posteriors. This paper simultaneously addresses
these, using a variational approximation to the posterior which is sparse in support
of the function but otherwise free-form. The result is a Hybrid Monte-Carlo
sampling scheme which allows for a non-Gaussian approximation over the function
values and covariance parameters simultaneously, with efficient computations
based on inducing-point sparse GPs. Code to replicate each experiment in this paper
is available at github.com/sparseMCMC.
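As an illustrative sketch only (not the authors' implementation, which is linked above), the Hybrid Monte Carlo machinery the abstract refers to can be demonstrated on a toy Gaussian target standing in for the posterior over inducing-point function values; the precision matrix, step size, and trajectory length below are arbitrary choices for the example:

```python
import numpy as np

def log_prob_and_grad(u, prec):
    """Toy Gaussian target N(0, prec^{-1}): log density (up to a constant)
    and its gradient. In the paper, the target would instead be the
    free-form posterior over inducing values and covariance parameters."""
    g = -prec @ u
    return 0.5 * u @ g, g

def hmc_step(u, prec, rng, step_size=0.1, n_leapfrog=20):
    """One HMC transition: resample momentum, leapfrog integrate,
    then Metropolis accept/reject."""
    p = rng.standard_normal(u.shape)           # fresh momentum
    logp, grad = log_prob_and_grad(u, prec)
    h0 = logp - 0.5 * p @ p                    # negated initial Hamiltonian
    u_new = u.copy()
    p_new = p + 0.5 * step_size * grad         # initial half step (momentum)
    for i in range(n_leapfrog):
        u_new = u_new + step_size * p_new      # full step (position)
        logp, grad = log_prob_and_grad(u_new, prec)
        if i < n_leapfrog - 1:
            p_new = p_new + step_size * grad   # full step (momentum)
    p_new = p_new + 0.5 * step_size * grad     # final half step (momentum)
    h1 = logp - 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h1 - h0:        # Metropolis correction
        return u_new, True
    return u, False

rng = np.random.default_rng(0)
prec = np.array([[2.0, 0.5], [0.5, 1.0]])      # toy precision matrix
u = np.zeros(2)
samples = []
for _ in range(2000):
    u, _ = hmc_step(u, prec, rng)
    samples.append(u)
samples = np.array(samples[500:])              # discard burn-in
print(samples.mean(axis=0))                    # should lie near the target mean [0, 0]
```

The paper's contribution is to make this kind of sampler tractable for GPs by targeting a variational posterior that is sparse in its support of the function (via inducing points), so the gradients above cost far less than full-GP computations.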

Type:
Conference
City:
Montreal
Date:
2015-12-07
Department:
Data Science
Eurecom Ref:
4710

PERMALINK : https://www.eurecom.fr/publication/4710