Sampling from posterior distributions using Markov chain Monte Carlo (MCMC) methods can require a prohibitively large number of iterations to ensure full exploration. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler on multi-modal posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables. On the extended space, the modes of the posterior become connected, which allows the MCMC sampler to move easily between well-separated posterior modes. We demonstrate that the pseudo-extended approach delivers improved sampling over standard Hamiltonian Monte Carlo (HMC) on multi-modal posteriors, including Boltzmann machines and models with sparsity-inducing priors.
Pseudo-extended Markov chain Monte Carlo
NeurIPS 2019, 33rd Conference on Neural Information Processing Systems, 8-14 December 2019, Vancouver, Canada
© EURECOM. Personal use of this material is permitted. The definitive version of this paper was published in NeurIPS 2019, 33rd Conference on Neural Information Processing Systems, 8-14 December 2019, Vancouver, Canada, and is available at:
PERMALINK: https://www.eurecom.fr/publication/6035
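The extended-target construction summarized in the abstract can be sketched on a toy problem. Below is a minimal, illustrative sketch (not the paper's implementation): it targets a hypothetical 1-D mixture of two well-separated Gaussians, builds the pseudo-extended target over N pseudo-samples using a fixed, broad instrumental density q (the paper instead considers tempered choices, and samples with HMC rather than the random-walk Metropolis used here for brevity). All function names and parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D bimodal target: mixture of two well-separated Gaussians
# (unnormalized log-density is sufficient for Metropolis-Hastings).
def log_target(x):
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

# Broad instrumental density q (Normal(0, 6^2)) covering both modes;
# a simple stand-in for the paper's more refined choices.
def log_q(x):
    return -0.5 * (x / 6.0) ** 2 - np.log(6.0 * np.sqrt(2.0 * np.pi))

# Pseudo-extended target over pseudo-samples x_1..x_N:
#   pi_N(x_1..x_N)  proportional to  prod_j q(x_j) * (1/N) sum_i pi(x_i)/q(x_i),
# computed in log-space for numerical stability. On this space the modes
# of pi are connected through regions where q has mass.
def log_ext(xs):
    log_w = log_target(xs) - log_q(xs)
    return np.sum(log_q(xs)) + np.logaddexp.reduce(log_w) - np.log(len(xs))

# Random-walk Metropolis on the extended space (a simple stand-in for HMC).
def rwm(n_iter=2000, N=5, step=1.0):
    xs = rng.normal(0.0, 6.0, size=N)
    lp = log_ext(xs)
    chain = np.empty((n_iter, N))
    for t in range(n_iter):
        prop = xs + step * rng.normal(size=N)
        lp_prop = log_ext(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            xs, lp = prop, lp_prop
        chain[t] = xs
    return chain

chain = rwm()

# Recovering a draw from the original target: resample one pseudo-sample
# with weights proportional to pi(x_i)/q(x_i).
last = chain[-1]
log_w = log_target(last) - log_q(last)
weights = np.exp(log_w - np.logaddexp.reduce(log_w))
marginal_draw = rng.choice(last, p=weights)
```

Because the extended density mixes each pseudo-sample's target weight with the broad q, individual pseudo-samples can traverse the low-probability region between the modes while the others hold the chain's mass, which is the mechanism the abstract refers to when it says the modes "become connected".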