Dynamic Bayesian networks have been well explored in the literature as discrete-time models; their continuous-time extensions, however, have received comparatively little attention. In this paper, we propose the first constraint-based algorithm for learning the structure of continuous-time Bayesian networks. We discuss the different statistical tests and the underlying hypotheses used by our algorithm to establish conditional independence. Furthermore, we analyze the best-case and worst-case computational complexity of the proposed algorithm. Finally, we validate its performance using synthetic data, and we discuss its strengths and limitations by comparing it with the score-based structure learning algorithm of Nodelman et al. (2003). We find the latter to be more accurate in learning networks with binary variables, while our constraint-based approach is more accurate for variables taking more than two values. Numerical experiments confirm that the score-based and constraint-based algorithms are comparable in terms of computation time.