Revisiting the effects of stochasticity for Hamiltonian samplers

Franzese, Giulio; Milios, Dimitrios; Filippone, Maurizio; Michiardi, Pietro
ICML 2022, 39th International Conference on Machine Learning, 17-23 July, 2022, Baltimore, USA

We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling. Our main result is a novel analysis of the effect of mini-batches through the lens of differential operator splitting, revising previous results from the literature. The stochastic component of a Hamiltonian SDE is decoupled from the gradient noise, for which we make no normality assumptions. This leads to the identification of a convergence bottleneck: when considering mini-batches, the best achievable error rate is O(η²), with η being the integrator step size. Our theoretical results are supported by an empirical study on a variety of regression and classification tasks for Bayesian neural networks.
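To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of simulating a Hamiltonian SDE with a symmetric operator-splitting integrator, where the exact gradient is replaced by a noisy mini-batch-style estimate. The target, step size η, friction γ, and noise scale are illustrative choices; the stochastic (Ornstein-Uhlenbeck) component is integrated exactly and separately from the gradient noise, echoing the decoupling described in the abstract.

```python
import numpy as np

# Illustrative sketch: BAOAB-style splitting for the Hamiltonian SDE
#   dtheta = p dt,   dp = -grad U(theta) dt - gamma * p dt + sqrt(2*gamma) dW,
# targeting U(theta) = theta^2 / 2 (a standard Gaussian posterior).
# All parameter values below are assumptions for illustration only.

rng = np.random.default_rng(0)

def noisy_grad(theta, noise_scale=0.5):
    """Mini-batch-style gradient estimate: the true gradient plus zero-mean
    noise (Gaussian here only for convenience; the paper's analysis makes
    no normality assumption on the gradient noise)."""
    return theta + noise_scale * rng.standard_normal()

def hamiltonian_sde_chain(eta=0.05, gamma=1.0, n_steps=200_000):
    theta, p = 0.0, 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        # B: half-step momentum update with the noisy gradient
        p -= 0.5 * eta * noisy_grad(theta)
        # A: half-step position update
        theta += 0.5 * eta * p
        # O: exact integration of the friction + noise (Ornstein-Uhlenbeck) part,
        # decoupled from the gradient-noise component
        c = np.exp(-gamma * eta)
        p = c * p + np.sqrt(1.0 - c**2) * rng.standard_normal()
        # A: half-step position update
        theta += 0.5 * eta * p
        # B: half-step momentum update with the noisy gradient
        p -= 0.5 * eta * noisy_grad(theta)
        samples[i] = theta
    return samples

samples = hamiltonian_sde_chain()
print("mean:", np.mean(samples), "var:", np.var(samples))
```

Shrinking η reduces the discretization error, but the mini-batch gradient noise injected at each step limits how closely the chain can match the target, which is the convergence bottleneck the abstract refers to.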


Type: Conference
City: Baltimore
Date: 2022-07-17
Department: Data Science
Eurecom Ref: 6740
Copyright: © 2022 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PERMALINK: https://www.eurecom.fr/publication/6740