We describe NTS-NOTEARS, a score-based structure learning method for time-series data that learns dynamic Bayesian networks (DBNs) capturing nonlinear, lagged (inter-slice) and instantaneous (intra-slice) relations among variables. NTS-NOTEARS utilizes 1D convolutional neural networks (CNNs) to model the dependence of child variables on their parents; the 1D CNN is a neural function-approximation model well-suited to sequential data. DBN-CNN structure learning is formulated as a continuous optimization problem with an acyclicity constraint, following the NOTEARS DAG learning approach. We show how prior knowledge of dependencies (e.g., forbidden and required edges) can be included as additional optimization constraints. Empirical evaluation on simulated and benchmark data shows that NTS-NOTEARS achieves state-of-the-art DAG structure quality compared to both parametric and nonparametric baseline methods, with improvements of 10-20% in F1-score. We also evaluate NTS-NOTEARS on complex real-world data acquired from professional ice hockey games that contain a mixture of continuous and discrete variables.
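The continuous acyclicity constraint mentioned above follows the original NOTEARS formulation, where a weighted adjacency matrix W is acyclic iff h(W) = tr(e^{W∘W}) − d = 0 (d is the number of variables, ∘ the elementwise product). A minimal sketch of that penalty, using SciPy's matrix exponential (not the authors' implementation):

```python
import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    """NOTEARS acyclicity measure h(W) = tr(exp(W * W)) - d.

    Returns 0 exactly when the weighted graph W encodes a DAG;
    strictly positive values indicate directed cycles.
    """
    d = W.shape[0]
    # Elementwise square makes the penalty smooth and nonnegative,
    # so h can be driven to zero by a continuous optimizer.
    return float(np.trace(expm(W * W)) - d)
```

For instance, an upper-triangular W (a DAG by construction) gives h(W) = 0, while a two-cycle such as W = [[0, 1], [1, 0]] yields h(W) > 0, which the optimizer penalizes.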