Recently, directed acyclic graph (DAG) structure learning has been formulated as a constrained continuous optimization problem with a continuous acyclicity constraint, solved iteratively through subproblem optimization. To further improve efficiency, we propose a novel learning framework that models and learns the weighted adjacency matrices in the DAG space directly. Specifically, we first show that the set of weighted adjacency matrices of DAGs is equivalent to the set of weighted gradients of graph potential functions, so structure learning can be performed by searching in this equivalent set of DAGs. To instantiate this idea, we propose a new algorithm, DAG-NoCurl, which solves the optimization problem efficiently with a two-step procedure: 1) first we find an initial cyclic solution to the optimization problem, and 2) then we employ the Hodge decomposition of graphs and learn an acyclic graph by projecting the cyclic graph onto the gradient of a potential function. Experimental studies on benchmark datasets demonstrate that, on both linear and generalized structural equation models, our method achieves accuracy comparable to baseline DAG structure learning methods with better efficiency, often by more than one order of magnitude.
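The projection in step 2 can be sketched numerically; the following is an illustrative reconstruction under simplifying assumptions, not the paper's exact procedure. Assuming a complete graph on d nodes, the least-squares potential p of the antisymmetrized edge weights solves a graph-Laplacian system, and keeping only the edges that run "uphill" in p necessarily yields an acyclic graph, since any directed path then strictly increases p:

```python
import numpy as np

def project_to_dag(W):
    """Hedged sketch of a Hodge-style projection onto an acyclic graph.

    Given a possibly cyclic weighted adjacency matrix W, compute a node
    potential p as the least-squares fit of grad(p)_ij = p[j] - p[i] to
    the antisymmetric edge flow W - W^T, then keep only edges i -> j
    with p[j] > p[i]. Pure-cycle (curl) components are projected away.
    """
    d = W.shape[0]
    # Antisymmetric edge flow: positive when net weight runs i -> j.
    F = W - W.T
    # Divergence of the flow at each node.
    div = F.sum(axis=1)
    # Laplacian of the complete graph on d nodes: L = d*I - 1 1^T.
    L = d * np.eye(d) - np.ones((d, d))
    # Normal equations L p = -div; L is singular, so the pseudoinverse
    # picks the minimum-norm potential (unique up to a constant shift).
    p = np.linalg.pinv(L) @ (-div)
    # Keep only edges consistent with the potential ordering: no
    # directed cycle can survive, because p strictly increases along
    # every remaining edge.
    mask = (p[None, :] - p[:, None]) > 0
    return W * mask
```

For example, a 3-cycle with unequal weights (0→1 and 1→2 of weight 2, 2→0 of weight 1) projects to the acyclic graph keeping the two heavier edges and dropping the back edge, since the fitted potential orders the nodes 0 < 1 < 2.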