We consider MCMC methods for learning equivalence classes of sparse Gaussian DAG models when $p = e^{o(n)}$. The main contribution of this work is a rapid mixing result for a random walk Metropolis-Hastings algorithm, which we prove using a canonical path method. This result shows that the complexity of Bayesian learning of sparse equivalence classes grows only polynomially in $n$ and $p$, under some common high-dimensional assumptions. Further, the path method yields a series of high-dimensional consistency results, including the strong selection consistency of an empirical Bayes model for structure learning and the consistency of a greedy local search on the restricted search space. Rapid mixing and slow mixing results for other structure-learning MCMC methods are also derived. Our path method and mixing time results yield crucial insights into the computational aspects of high-dimensional structure learning, which may be used to develop more efficient MCMC algorithms.