Learning the underlying Bayesian Networks (BNs), represented by directed acyclic graphs (DAGs), of the concerned events from purely observational data is a crucial part of evidential reasoning. This task remains challenging due to the large and discrete search space. A recent flurry of developments following NOTEARS [1] recasts this combinatorial problem as a continuous optimization problem by leveraging an algebraic equality characterization of acyclicity. However, these continuous optimization methods tend to produce non-sparse graphs after numerical optimization, which makes it difficult to rule out potentially cycle-inducing edges or falsely discovered edges with small weights. To address this issue, in this paper we develop a completely data-driven DAG structure learning method that requires no predefined threshold for pruning small edge weights. We name our method NOTEARS with adaptive Lasso (NOTEARS-AL); it applies an adaptive penalty to ensure the sparsity of the estimated DAG. Moreover, we show that NOTEARS-AL inherits the oracle properties under some specific conditions. Extensive experiments on both synthetic datasets and a real-world dataset verify the efficacy of the proposed method.
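For concreteness, the following is a minimal sketch of the kind of estimator the abstract describes, combining the standard NOTEARS constrained program with an adaptive-Lasso penalty. The least-squares loss, the pilot estimate \tilde{W}, and the tuning parameters \lambda and \gamma are standard choices assumed here for illustration, not details stated in this abstract.

% Sketch: NOTEARS objective with an adaptive-Lasso penalty.
% \tilde{W} is a pilot estimate (e.g., an unpenalized NOTEARS solution);
% \lambda > 0 and \gamma > 0 are tuning parameters -- assumptions made
% here, not specifics taken from the abstract.
\min_{W \in \mathbb{R}^{d \times d}} \;
  \frac{1}{2n} \left\| X - X W \right\|_F^2
  \; + \; \lambda \sum_{i,j} \frac{|W_{ij}|}{|\tilde{W}_{ij}|^{\gamma}}
\quad \text{s.t.} \quad
  h(W) \;=\; \operatorname{tr}\!\left( e^{W \circ W} \right) - d \;=\; 0,

where X \in \mathbb{R}^{n \times d} is the data matrix, W \circ W is the Hadamard product, and h(W) = 0 is the algebraic acyclicity characterization of NOTEARS [1]. The adaptive weights penalize edges with small pilot estimates more heavily, so such edges are shrunk exactly to zero during optimization; this is what removes the need for a fixed post-hoc threshold and, in the adaptive-Lasso literature, is also the mechanism behind the oracle properties.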