Structure learning via MCMC sampling is known to be very challenging because of the enormous search space and the existence of Markov-equivalent DAGs. Theoretical results on the mixing behavior of such samplers are lacking. In this work, we prove rapid mixing of a random walk Metropolis-Hastings algorithm, which reveals that the complexity of Bayesian learning of sparse equivalence classes grows only polynomially in $n$ and $p$ under some high-dimensional assumptions. A series of high-dimensional consistency results is obtained, including the strong selection consistency of an empirical Bayes model for structure learning. Our proof is based on two new results. First, we derive a general mixing time bound on finite state spaces, which can be applied to local MCMC schemes for other model selection problems. Second, we construct high-probability search paths on the space of equivalence classes with node degree constraints by proving a combinatorial property of DAG comparisons. Simulation studies of the proposed MCMC sampler are conducted to illustrate the main theoretical findings.
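To make the random-walk Metropolis-Hastings idea concrete, here is a minimal sketch on a toy finite model space (subsets of variables rather than the paper's space of equivalence classes of DAGs, and a hypothetical unnormalized posterior `log_post` in place of the empirical Bayes score). Each state proposes a uniformly chosen neighbor obtained by flipping one variable in or out; since every state has exactly `p` neighbors, the proposal is symmetric and the acceptance ratio reduces to the posterior ratio.

```python
import math
import random

def neighbors(state, p):
    """All subsets differing from `state` by adding or removing one variable."""
    out = []
    for j in range(p):
        s = set(state)
        if j in s:
            s.remove(j)
        else:
            s.add(j)
        out.append(frozenset(s))
    return out

def rw_metropolis_hastings(log_post, p, n_iter, seed=0):
    """Random-walk MH on a finite space of variable subsets (toy example).

    Because every state has exactly p neighbors, the uniform-neighbor
    proposal is symmetric, and the MH acceptance probability is
    min(1, pi(proposal) / pi(current)).
    """
    rng = random.Random(seed)
    state = frozenset()
    visits = {}
    for _ in range(n_iter):
        prop = rng.choice(neighbors(state, p))
        log_ratio = log_post(prop) - log_post(state)
        if math.log(rng.random()) < log_ratio:
            state = prop
        visits[state] = visits.get(state, 0) + 1
    return visits

# Hypothetical toy "posterior" that strongly favors the model {0, 1}.
def log_post(state):
    return 8.0 if state == frozenset({0, 1}) else 0.0

visits = rw_metropolis_hastings(log_post, p=4, n_iter=20000, seed=1)
mode = max(visits, key=visits.get)
print(sorted(mode))  # the chain concentrates on the high-posterior model
```

The rapid-mixing question studied in the paper asks how many such iterations are needed before the chain's distribution is close to the posterior; the result is that, under sparsity and degree constraints, this number grows only polynomially in $n$ and $p$.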