In recent years, it has become increasingly easy to obtain approximate posterior samples via efficient computational algorithms, such as those in variational Bayes. On the other hand, concerns remain about the accuracy of the resulting uncertainty estimates, which makes it tempting to exploit the approximate samples within canonical Markov chain Monte Carlo algorithms. A major technical barrier is that an approximate sample, when used as a proposal in a Metropolis-Hastings step, tends to have a low acceptance rate as the dimension increases. In this article, we propose a simple yet general solution named "graph-accelerated Markov chain Monte Carlo". We first build a graph with each node assigned the location of an approximate sample, and then run Markov chain Monte Carlo via random walks over the graph. In the first stage, we optimize the choice of graph edges to enforce small differences in posterior density/probability between neighboring nodes, while encouraging edges that correspond to large distances in the parameter space. This optimized graph allows us to accelerate a canonical Markov transition kernel by mixing it with a large-jump Metropolis-Hastings step when collecting Markov chain samples in the second stage. Owing to its simplicity, this acceleration can be applied to most existing Markov chain Monte Carlo algorithms. We theoretically quantify the acceptance rate as the dimension increases and show the resulting improvement in mixing time. We demonstrate our approach through improved mixing performance on challenging sampling problems, such as those involving multiple modes, non-convex density contours, or high-dimensional latent variables.
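To make the two-stage construction concrete, the following is a minimal, hypothetical Python sketch rather than the paper's algorithm: the edge-selection score (parameter-space distance minus a penalty `lam` on log-density differences) and the node-level jump kernel are illustrative assumptions standing in for the optimized graph and the mixed transition kernel described above; `nodes` holds the approximate samples and `logp` their log posterior densities.

```python
import numpy as np

def build_graph(nodes, logp, k=5, lam=1.0):
    """Stage 1 (illustrative criterion): connect each node to k others that are
    far apart in parameter space yet close in log posterior density, then
    symmetrize the edge set so the jump step below is reversible."""
    n = len(nodes)
    nbrs = []
    for i in range(n):
        dist = np.linalg.norm(nodes - nodes[i], axis=1)   # distances to node i
        score = dist - lam * np.abs(logp - logp[i])        # favor far but density-matched nodes
        score[i] = -np.inf                                 # exclude self-loops
        nbrs.append(set(np.argsort(score)[-k:]))           # keep k best-scoring neighbors
    return [sorted(nbrs[i] | {j for j in range(n) if i in nbrs[j]}) for i in range(n)]

def graph_jump(i, adj, logp, rng):
    """Stage 2 (one component): Metropolis-Hastings jump to a uniformly chosen
    graph neighbor of the current node, accepted with the usual ratio that
    accounts for unequal node degrees."""
    j = rng.choice(adj[i])
    log_alpha = logp[j] - logp[i] + np.log(len(adj[i])) - np.log(len(adj[j]))
    return j if np.log(rng.uniform()) < log_alpha else i
```

In the method itself, such a large-jump step would be interleaved (mixed) with a canonical local kernel, e.g. random-walk Metropolis or Hamiltonian Monte Carlo, targeting the continuous posterior between jumps.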