Graduate School and Research Center in Digital Sciences

Efficient approximate inference with Walsh-Hadamard variational inference

Rossi, Simone; Marmin, Sebastien; Filippone, Maurizio

NIPS 2019, 33rd Conference on Neural Information Processing Systems, Workshop on Bayesian Deep Learning, 8-14 December 2019, Vancouver, Canada


Variational inference offers scalable and flexible tools to tackle intractable Bayesian inference in modern statistical models such as Bayesian neural networks and Gaussian processes. For heavily over-parameterized models, however, the over-regularization property of the variational objective makes applying variational inference challenging. Inspired by the literature on kernel methods, and in particular on structured approximations of distributions of random matrices, this paper proposes Walsh-Hadamard Variational Inference, which uses Walsh-Hadamard-based factorization strategies to reduce model parameterization, accelerate computations, and increase the expressiveness of the approximate posterior beyond a fully factorized one.
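The Walsh-Hadamard factorization mentioned in the abstract can be illustrated with a minimal sketch. WHVI parameterizes a weight matrix as a structured product of fixed diagonal sign matrices, Hadamard transforms, and a learned diagonal, so that a matrix-vector product costs O(D log D) via the fast Walsh-Hadamard transform rather than O(D^2). The sketch below (function names and the NumPy setup are illustrative, not from the paper) applies such a structured map to a vector:

```python
import numpy as np

def fwht(x):
    """In-place-style fast Walsh-Hadamard transform of a length-2^k vector.

    Uses the unnormalized Hadamard matrix H, so fwht(fwht(x)) == n * x.
    Runs in O(n log n) instead of forming the n x n matrix explicitly.
    """
    x = np.asarray(x, dtype=float).copy()
    n = x.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def whvi_matvec(v, s1, s2, g):
    """Structured matrix-vector product W v with W = S1 H diag(g) H S2.

    s1, s2 are fixed +/-1 sign vectors (diagonal matrices) and g is the
    only learned/random vector -- in WHVI, g would be a sample from the
    variational posterior. This name and setup are illustrative.
    """
    return s1 * fwht(g * fwht(s2 * v))

rng = np.random.default_rng(0)
d = 8                              # dimension must be a power of two
s1 = rng.choice([-1.0, 1.0], size=d)
s2 = rng.choice([-1.0, 1.0], size=d)
g = rng.normal(size=d)             # stand-in for a variational sample
v = rng.normal(size=d)
out = whvi_matvec(v, s1, s2, g)    # O(d log d) structured product
```

Because the full D x D weight matrix is never materialized, the number of variational parameters drops from O(D^2) to O(D), which is the source of the parameter reduction and speed-up claimed in the abstract.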


Title: Efficient approximate inference with Walsh-Hadamard variational inference
Type: Conference
Language: English
City: Vancouver
Country: Canada
Date:
Department: Data Science
Eurecom ref: 6034
Bibtex:
@inproceedings{EURECOM+6034,
  year      = {2019},
  title     = {{E}fficient approximate inference with {W}alsh-{H}adamard variational inference},
  author    = {{R}ossi, {S}imone and {M}armin, {S}ebastien and {F}ilippone, {M}aurizio},
  booktitle = {{NIPS} 2019, 33rd {C}onference on {N}eural {I}nformation {P}rocessing {S}ystems, {W}orkshop on {B}ayesian {D}eep {L}earning, 8-14 {D}ecember 2019, {V}ancouver, {C}anada},
  address   = {{V}ancouver, {CANADA}},
  month     = {12},
  url       = {http://www.eurecom.fr/publication/6034}
}