Gaussian graphical models are widely used tools for learning the conditional independence structure among variables. In this class of models, Bayesian structure learning is typically carried out by search algorithms over the space of graphs. The conjugate prior for the precision matrix satisfying the graphical constraints is the well-known G-Wishart distribution. Under this prior, the transition probabilities in the search algorithms require evaluating ratios of G-Wishart prior normalizing constants. In moderate to high dimensions, this ratio is usually approximated with sampling-based methods, making each update of the search algorithm computationally expensive; evaluating it has so far been a major computational bottleneck. We overcome this issue by presenting a search algorithm in which the ratio of normalizing constants is computed by an explicit closed-form approximation. Using this approximation within our search algorithm yields a substantial improvement in the scalability of structure learning without sacrificing structure learning accuracy. We study the conditions under which the approximation is valid and evaluate the efficacy of our method with simulation studies, showing that the new search algorithm with our approximation outperforms state-of-the-art methods in both computational efficiency and accuracy. The implementation of our work is available in the R package BDgraph.
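As a minimal usage sketch, the workflow described above can be run through the BDgraph package; the data dimensions, iteration count, and graph type below are illustrative choices, not values from the paper.

# Load the package implementing the graph search algorithm
library(BDgraph)

# Simulate data from a sparse Gaussian graphical model (illustrative sizes)
set.seed(1)
sim <- bdgraph.sim(n = 100, p = 10, graph = "random", type = "Gaussian")

# Run the search algorithm over graph space (birth-death MCMC)
fit <- bdgraph(data = sim$data, method = "ggm", algorithm = "bdmcmc", iter = 5000)

# Posterior edge-inclusion probabilities and a summary of the selected graph
plinks(fit)
summary(fit)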