
MCMC for variationally sparse Gaussian processes

Hensman, James; Matthews, Alexander G. de G.; Filippone, Maurizio; Ghahramani, Zoubin

NIPS 2015, 29th Annual Conference on Neural Information Processing Systems, December 7-12, 2015, Montreal, Quebec, Canada. Also available on arXiv.

Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has gone into attacking three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate covariance function parameter posteriors. This paper addresses all three simultaneously, using a variational approximation to the posterior which is sparse in the support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme that allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.
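To make the scheme concrete, below is a minimal, self-contained Python sketch of the core idea: Hybrid Monte Carlo run jointly over whitened inducing-point values and a covariance hyperparameter, with a non-Gaussian (Bernoulli) likelihood. This is an illustration only, not the authors' implementation (see github.com/sparseMCMC for that); in particular it approximates the expectation E_{p(f|u)}[log p(y|f)] used in the paper by plugging in the conditional mean K_xz K_zz^{-1} u, and the data, inducing inputs, and HMC settings are all toy assumptions.

# Illustrative sketch only (NOT the authors' code; see github.com/sparseMCMC).
# HMC jointly over whitened inducing values v (u = L v, L = chol(Kzz)) and a
# log-lengthscale, for a sparse GP classifier with Bernoulli-logit likelihood.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D classification data and inducing inputs Z (assumptions for the sketch).
X = np.linspace(-3, 3, 60)[:, None]
y = (np.sin(2 * X[:, 0]) > 0).astype(float)
Z = np.linspace(-3, 3, 8)[:, None]

def rbf(A, B, log_ell):
    # Squared-exponential kernel with unit variance and lengthscale exp(log_ell).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / np.exp(2 * log_ell))

def log_post(theta):
    """Unnormalised log target over (v, log_ell): standard normal prior on the
    whitened v, N(0,1) prior on log_ell, and the Bernoulli-logit likelihood
    evaluated at the conditional mean (a simplification of the paper's
    expected log-likelihood term)."""
    v, log_ell = theta[:-1], theta[-1]
    Kzz = rbf(Z, Z, log_ell) + 1e-6 * np.eye(len(Z))
    L = np.linalg.cholesky(Kzz)
    u = L @ v                                          # de-whiten
    mean_f = rbf(X, Z, log_ell) @ np.linalg.solve(Kzz, u)
    log_lik = np.sum(y * mean_f - np.logaddexp(0.0, mean_f))
    return log_lik - 0.5 * v @ v - 0.5 * log_ell ** 2

def grad(theta, eps=1e-5):
    # Finite-difference gradient; a real implementation would use autodiff.
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (log_post(theta + e) - log_post(theta - e)) / (2 * eps)
    return g

def hmc(theta, n_samples=200, step=0.05, n_leap=10):
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(theta.shape)
        th, p0 = theta.copy(), p.copy()
        p = p + 0.5 * step * grad(th)                  # leapfrog integration
        for _ in range(n_leap - 1):
            th = th + step * p
            p = p + step * grad(th)
        th = th + step * p
        p = p + 0.5 * step * grad(th)
        # Metropolis accept/reject on the joint (potential + kinetic) energy.
        log_a = (log_post(th) - 0.5 * p @ p) - (log_post(theta) - 0.5 * p0 @ p0)
        if np.log(rng.uniform()) < log_a:
            theta = th
        samples.append(theta.copy())
    return np.array(samples)

samples = hmc(np.zeros(len(Z) + 1))
print("posterior mean log-lengthscale:", samples[100:, -1].mean())

The whitened parameterisation (sampling v with a fixed standard-normal prior, rather than u whose prior covariance depends on the hyperparameters) is what lets a single chain move jointly over function values and covariance parameters without the two becoming pathologically coupled.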


Title: MCMC for variationally sparse Gaussian processes
Type: Conference
Language: English
City: Montreal
Country: Canada
Date: December 2015
Department: Data Science
Eurecom ref: 4710
Bibtex:
@inproceedings{EURECOM+4710,
  author    = {Hensman, James and Matthews, Alexander G. de G. and Filippone, Maurizio and Ghahramani, Zoubin},
  title     = {{MCMC} for variationally sparse {G}aussian processes},
  booktitle = {NIPS 2015, 29th Annual Conference on Neural Information Processing Systems},
  address   = {Montreal, Canada},
  month     = {12},
  year      = {2015},
  url       = {http://www.eurecom.fr/publication/4710}
}