Probabilistic sentential decision diagrams are a class of structured-decomposable probabilistic circuits especially designed to embed logical constraints. To adapt the classical LearnSPN scheme to learn the structure of these models, we propose a new scheme based on a partial closed-world assumption: the data implicitly provide the logical base of the circuit. Sum nodes are thus learned by recursively clustering batches of the initial database, while the partitioning of the variables obeys a given input vtree. Preliminary experiments show that the proposed approach can properly fit training data and generalize well to test data, provided that these remain consistent with the underlying logical base, which is a relaxation of the training database.
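The scheme above alternates clustering of data rows (producing sum nodes) with vtree-driven variable splits (producing product/decision nodes), in the spirit of LearnSPN. The following is a minimal, hypothetical sketch of that recursion; all names (`learn_node`, `cluster_rows`, the dict-based vtree encoding) and the crude agreement-based clustering heuristic are illustrative assumptions, not the paper's actual algorithm or API.

```python
# Hypothetical sketch of a LearnSPN-style recursion guided by a vtree:
# sum nodes cluster the rows, product nodes split variables per the vtree.

def vleaf(v):
    """Vtree leaf over a single variable index."""
    return {"var": v, "left": None, "right": None}

def vnode(left, right):
    """Internal vtree node with two children."""
    return {"var": None, "left": left, "right": right}

def scope(vtree):
    """Set of variable indices covered by a vtree node."""
    if vtree["var"] is not None:
        return {vtree["var"]}
    return scope(vtree["left"]) | scope(vtree["right"])

def cluster_rows(data, vars_):
    """Crude 2-way clustering: group rows by agreement with the first row
    on the given variables (a stand-in for the clustering used in LearnSPN)."""
    ref = data[0]
    a = [r for r in data if all(r[v] == ref[v] for v in vars_)]
    b = [r for r in data if not all(r[v] == ref[v] for v in vars_)]
    return [c for c in (a, b) if c]

def learn_node(data, vtree, min_rows=4):
    """Recursively learn a circuit node over binary rows (tuples of 0/1)."""
    if vtree["var"] is not None:
        v = vtree["var"]
        # Leaf: Laplace-smoothed Bernoulli estimate for a single variable.
        p = (sum(r[v] for r in data) + 1) / (len(data) + 2)
        return ("leaf", v, p)
    clusters = cluster_rows(data, scope(vtree["left"])) if len(data) >= min_rows else [data]
    if len(clusters) == 1:
        # No useful row split: emit a product node along the vtree partition.
        return ("prod",
                learn_node(data, vtree["left"], min_rows),
                learn_node(data, vtree["right"], min_rows))
    # Sum node: one child per row cluster, weighted by cluster proportion.
    children = [(len(c) / len(data),
                 ("prod",
                  learn_node(c, vtree["left"], min_rows),
                  learn_node(c, vtree["right"], min_rows)))
                for c in clusters]
    return ("sum", children)
```

On a toy dataset, the root typically becomes a sum node whose mixture weights are the cluster proportions, with each child decomposing the variables as dictated by the vtree.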