Markov chain Monte Carlo (MCMC) algorithms have played a significant role in statistics, physics, machine learning, and other fields, and they are the only known general and efficient approach for some high-dimensional problems. The Metropolis-Hastings (MH) algorithm, as the most classical MCMC algorithm, has had a great influence on the development and practice of science and engineering. The behavior of the MH algorithm in high-dimensional problems is typically investigated through a weak convergence result for diffusion processes. In this paper, we introduce Mosco convergence of Dirichlet forms to analyze the MH algorithm on large graphs, where the target distribution is a Gibbs measure, a class that includes any probability measure satisfying a Markov property. The abstract and powerful theory of Dirichlet forms allows us to work directly and naturally on the infinite-dimensional space, and our notion of Mosco convergence allows the Dirichlet forms associated with the MH Markov chains to live on changing Hilbert spaces. Through the optimal scaling problem, we demonstrate the impressive strengths of the Dirichlet form approach over the standard diffusion approach.
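As a minimal illustration of the setup discussed above, the sketch below runs a random-walk Metropolis-Hastings sampler targeting a simple Gibbs measure on a cycle graph (a discrete Gaussian free field), with the proposal standard deviation scaled as in the optimal scaling literature. The energy function, the cycle graph, and the tuning constant `ell` are illustrative assumptions for this sketch, not the specific model or scaling analyzed in the paper.

```python
import numpy as np

def gff_energy(x):
    """Energy H(x) of a discrete Gaussian free field on a cycle graph:
    nearest-neighbour interaction plus a confining quadratic term.
    (Illustrative choice of Gibbs energy, not the paper's model.)"""
    return 0.5 * np.sum((x - np.roll(x, 1)) ** 2) + 0.5 * np.sum(x ** 2)

def random_walk_metropolis(n=100, n_steps=20000, ell=2.4, seed=0):
    """Random-walk Metropolis for the Gibbs measure pi(x) proportional to exp(-H(x)).

    The proposal standard deviation is scaled as ell / sqrt(n), the regime
    studied in the optimal scaling problem; ell is a hypothetical tuning
    constant chosen near the classical optimal value.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    energy = gff_energy(x)
    accepted = 0
    for _ in range(n_steps):
        proposal = x + (ell / np.sqrt(n)) * rng.standard_normal(n)
        prop_energy = gff_energy(proposal)
        # MH acceptance probability min(1, pi(proposal)/pi(x)), on the log scale
        if np.log(rng.uniform()) < energy - prop_energy:
            x, energy = proposal, prop_energy
            accepted += 1
    return x, accepted / n_steps

if __name__ == "__main__":
    _, acc_rate = random_walk_metropolis()
    print(f"average acceptance rate: {acc_rate:.3f}")
```

Monitoring the average acceptance rate as the graph size n grows is the empirical counterpart of the optimal scaling question mentioned in the abstract.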