Graduate School and Research Center in Digital Sciences

Isotropic SGD: a practical approach to Bayesian posterior sampling

Franzese, Giulio; Candela, Rosa; Milios, Dimitrios; Filippone, Maurizio; Michiardi, Pietro

Submitted to arXiv on 9 June 2020

In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise on the behavior of stochastic gradient Markov chain Monte Carlo (SGMCMC) sampling algorithms. Our formulation unlocks the design of a novel, practical approach to posterior sampling, which makes the SG noise isotropic using a fixed learning rate that we determine analytically, and that requires weaker assumptions than existing algorithms. In contrast, the common trait of existing SGMCMC algorithms is to approximate the isotropy condition either by drowning the gradients in additive noise (annealing the learning rate) or by making restrictive assumptions on the SG noise covariance and the geometry of the loss landscape. Extensive experimental validations indicate that our proposal is competitive with the state of the art in SGMCMC, while being much more practical to use.
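This record does not reproduce the Isotropic SGD update rule, so the following is only background context: a minimal NumPy sketch of a standard SGLD sampler, the kind of SGMCMC baseline the abstract contrasts with, which injects additive Gaussian noise and anneals the learning rate, whereas Isotropic SGD keeps a fixed, analytically determined rate. The toy Bayesian linear-regression model, the step-size schedule, and all variable names below are illustrative assumptions, not the paper's method.

# Illustrative sketch only: a standard SGLD sampler (Welling & Teh, 2011), i.e. the
# "additive noise + annealed learning rate" SGMCMC approach the abstract contrasts
# with. The paper's Isotropic SGD instead uses a fixed, analytically determined
# learning rate; that rule is not reproduced here. The toy model is hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Toy data and model: Bayesian linear regression with a standard normal prior on w.
X = rng.normal(size=(1000, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=1000)
N = X.shape[0]

def grad_log_posterior(w, xb, yb):
    """Minibatch estimate of the gradient of the log posterior."""
    grad_lik = (N / len(yb)) * xb.T @ (yb - xb @ w)   # rescaled likelihood term
    grad_prior = -w                                    # N(0, I) prior
    return grad_lik + grad_prior

w = np.zeros(5)
samples = []
for t in range(1, 5001):
    idx = rng.choice(N, size=32, replace=False)
    eps = 1e-4 / t ** 0.55                             # annealed step size
    g = grad_log_posterior(w, X[idx], y[idx])
    # Langevin update: gradient step plus isotropic Gaussian noise of variance eps.
    w = w + 0.5 * eps * g + np.sqrt(eps) * rng.normal(size=w.shape)
    samples.append(w.copy())

print("posterior mean estimate:", np.mean(samples[1000:], axis=0))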


Title: Isotropic SGD: a practical approach to Bayesian posterior sampling
Type: Conference
Language: English
Department: Data Science
Eurecom ref: 6292
Copyright: © EURECOM. Personal use of this material is permitted. The definitive version of this paper was submitted to arXiv on 9 June 2020.
Bibtex:
@inproceedings{EURECOM+6292,
  year = {2020},
  title = {{I}sotropic {SGD}: a practical approach to {B}ayesian posterior sampling},
  author = {{F}ranzese, {G}iulio and {C}andela, {R}osa and {M}ilios, {D}imitrios and {F}ilippone, {M}aurizio and {M}ichiardi, {P}ietro},
  booktitle = {{S}ubmitted on {A}r{X}i{V}, 9 {J}une 2020},
  address = {},
  month = {06},
  url = {http://www.eurecom.fr/publication/6292}
}