
Good initializations of variational Bayes for deep models

Rossi, Simone; Michiardi, Pietro; Filippone, Maurizio

ICML 2019, International Conference on Machine Learning, 9-15 June 2019, Long Beach, California, USA / Also published in PMLR, Vol. 97, 2019

Stochastic variational inference is an established way to carry out approximate Bayesian inference for deep models. While there have been effective proposals for good initializations for loss minimization in deep learning, far less attention has been devoted to the issue of initialization of stochastic variational inference. We address this by proposing a novel layer-wise initialization strategy based on Bayesian linear models. The proposed method is extensively validated on regression and classification tasks, including Bayesian DeepNets and ConvNets, showing faster convergence compared to alternatives inspired by the literature on initializations for loss minimization. 
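To make the idea of the abstract concrete, below is a minimal sketch (not the authors' exact algorithm) of how a layer's Gaussian variational parameters could be initialized from a Bayesian linear model fitted to that layer's inputs and targets. The function names (`bayesian_linear_posterior`, `init_layer_variational_params`), the prior/noise precisions `alpha` and `beta`, and the use of random toy data are illustrative assumptions, not details taken from the paper.

import numpy as np

def bayesian_linear_posterior(X, y, alpha=1.0, beta=1.0):
    """Posterior over w for y ~ N(Xw, beta^-1 I) with prior w ~ N(0, alpha^-1 I)."""
    d = X.shape[1]
    precision = alpha * np.eye(d) + beta * X.T @ X   # posterior precision S^-1
    cov = np.linalg.inv(precision)                   # posterior covariance S
    mean = beta * cov @ X.T @ y                      # posterior mean m
    return mean, cov

def init_layer_variational_params(layer_inputs, layer_targets, alpha=1.0, beta=1.0):
    """Initialize mean and diagonal variance of q(W) for one layer, one output unit at a time."""
    means, variances = [], []
    for j in range(layer_targets.shape[1]):          # each output unit is a separate linear model
        m, S = bayesian_linear_posterior(layer_inputs, layer_targets[:, j], alpha, beta)
        means.append(m)
        variances.append(np.diag(S))                 # keep only the diagonal for a mean-field q
    return np.stack(means, axis=1), np.stack(variances, axis=1)

# Toy usage: initialize a 4 -> 3 layer from synthetic "inputs" and "targets".
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
T = rng.normal(size=(100, 3))
mu_init, var_init = init_layer_variational_params(X, T)
print(mu_init.shape, var_init.shape)                 # (4, 3) (4, 3)

In a layer-wise scheme of this kind, the procedure would be repeated layer by layer, using the activations produced by the already-initialized layers as the inputs to the next fit.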


Title: Good initializations of variational Bayes for deep models
Type: Conference
Language: English
City: Long Beach
Country: UNITED STATES
Date: June 2019
Department: Data Science
Eurecom ref: 5725
Bibtex: @inproceedings{EURECOM+5725,
  year = {2019},
  title = {{G}ood initializations of variational {B}ayes for deep models},
  author = {{R}ossi, {S}imone and {M}ichiardi, {P}ietro and {F}ilippone, {M}aurizio},
  booktitle = {{ICML} 2019, {I}nternational {C}onference on {M}achine {L}earning, 9-15 {J}une 2019, {L}ong {B}each, {C}alifornia, {USA} / {A}lso published in {PMLR}, {V}ol. 97, 2019},
  address = {{L}ong {B}each, {U}nited {S}tates},
  month = {06},
  url = {http://www.eurecom.fr/publication/5725}
}