Learning the underlying Bayesian networks (BNs), represented by directed acyclic graphs (DAGs), of the events of interest from purely observational data is a crucial part of evidential reasoning. This task remains challenging due to the large and discrete search space. A recent flurry of developments following NOTEARS [1] recast this combinatorial problem as a continuous optimization problem by leveraging an algebraic characterization of acyclicity. However, these continuous optimization methods tend to yield non-sparse graphs after numerical optimization, leaving no principled way to rule out potentially cycle-inducing edges or falsely discovered edges with small weights. To address this issue, in this paper we develop a fully data-driven DAG structure learning method that requires no predefined threshold for pruning small edge weights. We name our method NOTEARS with adaptive Lasso (NOTEARS-AL); it applies an adaptive penalty to ensure the sparsity of the estimated DAG. Moreover, we show that NOTEARS-AL inherits the oracle properties under certain conditions. Extensive experiments on both synthetic and a real-world dataset demonstrate that our method consistently outperforms NOTEARS.
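To make the two ingredients concrete, the following is a minimal sketch (not the paper's implementation) of the NOTEARS acyclicity function h(W) = tr(e^{W∘W}) − d and an adaptive-Lasso-penalized least-squares score; the names `W_init`, `lam`, and `gamma` are illustrative assumptions, with `W_init` standing for an initial consistent estimate (e.g., from a plain NOTEARS pass) used to build the adaptive weights.

```python
import numpy as np
from scipy.linalg import expm

def notears_acyclicity(W):
    """h(W) = tr(e^{W ∘ W}) - d; h(W) = 0 iff the weighted graph W is acyclic."""
    d = W.shape[0]
    return np.trace(expm(W * W)) - d  # W * W is the elementwise (Hadamard) square

def adaptive_lasso_penalty(W, W_init, gamma=1.0, eps=1e-8):
    """Adaptive Lasso: weights 1/|w_ij^init|^gamma penalize small (likely spurious)
    edges more heavily, so they shrink exactly to zero without post-thresholding.
    W_init is an assumed pilot estimate; eps guards against division by zero."""
    weights = 1.0 / (np.abs(W_init) ** gamma + eps)
    return np.sum(weights * np.abs(W))

def penalized_score(W, X, W_init, lam=0.1):
    """Least-squares fit of the linear SEM X ≈ XW plus the adaptive penalty.
    The acyclicity constraint h(W) = 0 would be enforced separately, e.g. via
    the augmented-Lagrangian scheme used by NOTEARS (a sketch, not shown here)."""
    n = X.shape[0]
    loss = 0.5 / n * np.linalg.norm(X - X @ W, "fro") ** 2
    return loss + lam * adaptive_lasso_penalty(W, W_init)
```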