Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require a prohibitively large number of iterations to fully explore the posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler on complex posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables; on the extended space, the MCMC sampler is able to move easily between the well-separated modes of the posterior. We apply the pseudo-extended method within a Hamiltonian Monte Carlo sampler and show that, by using the No-U-Turn algorithm (Hoffman and Gelman, 2014), our proposed sampler is completely tuning-free. We compare the pseudo-extended method against well-known tempered MCMC algorithms and show the advantages of the new sampler on a number of challenging examples from the statistics literature.
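The augmentation described above can be illustrated with a minimal, hypothetical sketch. Here we assume an extended target of the form π̃(x₁,…,x_N) ∝ [Σⱼ γ(xⱼ)/q(xⱼ)] ∏ᵢ q(xᵢ), where γ is the unnormalized posterior and q is an instrumental density; the toy bimodal target, the Gaussian choice of q, and the random-walk Metropolis moves (standing in for the Hamiltonian Monte Carlo sampler used in the paper) are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_gamma(x):
    # unnormalized bimodal target: equal mixture of N(-3, 1) and N(3, 1)
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_q(x):
    # instrumental density q = N(0, 4^2), broad enough to cover both modes
    return -0.5 * (x / 4.0) ** 2 - np.log(4.0 * np.sqrt(2.0 * np.pi))

def log_extended(xs):
    # log of the assumed pseudo-extended target (up to a constant):
    # logsumexp_j [log gamma(x_j) - log q(x_j)] + sum_i log q(x_i)
    lw = log_gamma(xs) - log_q(xs)
    return np.logaddexp.reduce(lw) + np.sum(log_q(xs))

N = 5                                  # number of pseudo-samples
xs = rng.normal(0.0, 4.0, size=N)      # initial extended state
samples = []
for _ in range(20000):
    # random-walk Metropolis move on the whole extended state
    prop = xs + 0.8 * rng.normal(size=N)
    if np.log(rng.uniform()) < log_extended(prop) - log_extended(xs):
        xs = prop
    # pick one pseudo-sample with weight proportional to gamma/q
    # to recover a draw whose marginal is the original target
    lw = log_gamma(xs) - log_q(xs)
    w = np.exp(lw - np.logaddexp.reduce(lw))
    samples.append(rng.choice(xs, p=w))
samples = np.array(samples)
```

Because each pseudo-sample moves under the broad instrumental density, the chain traverses the region of low posterior density between the two modes instead of getting trapped in one of them.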
Pseudo-extended Markov chain Monte Carlo
Submitted to arXiv, August 18th, 2017
© EURECOM. Personal use of this material is permitted. The definitive version of this paper was submitted to arXiv on August 18th, 2017 and is available at:
PERMALINK : https://www.eurecom.fr/publication/5362