Structure learning via MCMC sampling is known to be very challenging because of the enormous search space and the existence of Markov equivalent DAGs. Theoretical results on the mixing behavior are lacking. In this work, we prove the rapid mixing of a random walk Metropolis-Hastings algorithm, which reveals that the complexity of Bayesian learning of sparse equivalence classes grows only polynomially in $n$ and $p$, under some high-dimensional assumptions. A series of high-dimensional consistency results is obtained, including the strong selection consistency of an empirical Bayes model for structure learning. Our proof is based on two new results. First, we derive a general mixing time bound on finite state spaces, which can be applied to various local MCMC schemes for other model selection problems. Second, we construct greedy search paths on the space of equivalence classes with node degree constraints by proving a combinatorial property of the comparison between two DAGs. Simulation studies on the proposed MCMC sampler are conducted to illustrate the main theoretical findings.
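The random walk Metropolis-Hastings scheme analyzed above can be illustrated with a minimal, generic sketch on an abstract finite state space. Everything below is an illustrative assumption: the toy state space, the neighborhood function, and the unnormalized target stand in for the paper's actual space of equivalence classes and its local moves, which are not reproduced here.

```python
import math
import random


def metropolis_hastings(log_prob, neighbors, start, n_steps, seed=0):
    """Generic random walk Metropolis-Hastings on a finite state space.

    log_prob:  unnormalized log posterior of a state (illustrative stand-in
               for a structure-learning posterior score).
    neighbors: function mapping a state to its list of neighbors, i.e. the
               states reachable by one local move.
    """
    rng = random.Random(seed)
    state = start
    samples = []
    for _ in range(n_steps):
        nbrs = neighbors(state)
        proposal = rng.choice(nbrs)  # uniform proposal over neighbors
        # Hastings correction: because the proposal is uniform over the
        # neighborhood, the proposal ratio is |N(state)| / |N(proposal)|.
        log_alpha = (log_prob(proposal) - log_prob(state)
                     + math.log(len(nbrs)) - math.log(len(neighbors(proposal))))
        if math.log(rng.random()) < min(0.0, log_alpha):
            state = proposal  # accept; otherwise stay at the current state
        samples.append(state)
    return samples


# Toy example: states 0..9 on a path graph, with a target that favors
# larger states (purely for illustration).
def log_prob(s):
    return 0.5 * s


def neighbors(s):
    return [t for t in (s - 1, s + 1) if 0 <= t <= 9]


chain = metropolis_hastings(log_prob, neighbors, start=0, n_steps=5000)
```

Any local sampler of this form fits the general mixing time bound mentioned in the abstract: the bound depends only on the finite state space, the neighborhood structure, and the posterior ratios along moves, not on this particular toy target.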