Good initializations of variational Bayes for deep models

Rossi, Simone; Michiardi, Pietro; Filippone, Maurizio
ICML 2019, 36th International Conference on Machine Learning, 9-15 June 2019, Long Beach, CA, USA / Also published in PMLR, Vol. 97, 2019

Stochastic variational inference is an established way to carry out approximate Bayesian inference for deep models. While there have been effective proposals for good initializations for loss minimization in deep learning, far less attention has been devoted to initializing stochastic variational inference. We address this by proposing a novel layer-wise initialization strategy based on Bayesian linear models. The proposed method is extensively validated on regression and classification tasks with Bayesian DeepNets and ConvNets, showing faster convergence compared to alternatives inspired by the literature on initializations for loss minimization.
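
The abstract does not spell out the procedure; the following is a minimal sketch of the underlying idea only, assuming a mean-field Gaussian variational posterior over one layer's weights that is initialized from the closed-form posterior of a Bayesian linear model. The prior precision alpha, the noise precision beta, the helper names (bayesian_linear_posterior, init_variational_params), and the NumPy implementation are illustrative assumptions, not the authors' code.

import numpy as np

def bayesian_linear_posterior(X, y, alpha=1.0, beta=1.0):
    """Closed-form posterior of a Bayesian linear model.

    Prior:      w ~ N(0, alpha^{-1} I)
    Likelihood: y | X, w ~ N(X w, beta^{-1} I)
    Returns the posterior mean m_N and covariance S_N of the weights.
    """
    d = X.shape[1]
    S_N_inv = alpha * np.eye(d) + beta * X.T @ X
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ X.T @ y
    return m_N, S_N

def init_variational_params(X, y, alpha=1.0, beta=1.0):
    """Initialize a mean-field Gaussian q(w) = N(mu, diag(sigma2)) for one
    layer from the Bayesian linear posterior computed on (X, y).
    sigma2 is one variance per input dimension, shared across output units."""
    m_N, S_N = bayesian_linear_posterior(X, y, alpha, beta)
    mu = m_N                 # variational means
    sigma2 = np.diag(S_N)    # variational variances from the posterior diagonal
    return mu, sigma2

# Toy usage: n examples, 10 input features, 3 output units for the layer.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=(10, 3)) + 0.1 * rng.normal(size=(200, 3))
mu, sigma2 = init_variational_params(X, y)
print(mu.shape, sigma2.shape)  # (10, 3) (10,)

In a layer-wise scheme of this kind, X would be the activations feeding a given layer and y a target signal for that layer's outputs, with the layers of a DeepNet or ConvNet initialized one after another before stochastic variational inference starts.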


Type: Conference
City: Long Beach
Date: 2019-06-09
Department: Data Science
Eurecom Ref: 5725
Copyright:
© 2019 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PERMALINK : https://www.eurecom.fr/publication/5725