Neural-symbolic AI (NeSy) allows neural networks to exploit symbolic background knowledge in the form of logic. It has been shown to aid learning in the limited data regime and to facilitate inference on out-of-distribution data. Probabilistic NeSy focuses on integrating neural networks with both logic and probability theory, which additionally allows learning under uncertainty. A major limitation of current probabilistic NeSy systems, such as DeepProbLog, is their restriction to finite probability distributions, i.e., discrete random variables. In contrast, deep probabilistic programming (DPP) excels in modelling and optimising continuous probability distributions. Hence, we introduce DeepSeaProbLog, a neural probabilistic logic programming language that incorporates DPP techniques into NeSy. Doing so results in the support of inference and learning of both discrete and continuous probability distributions under logical constraints. Our main contributions are 1) the semantics of DeepSeaProbLog and its corresponding inference algorithm, 2) a proven asymptotically unbiased learning algorithm, and 3) a series of experiments that illustrate the versatility of our approach.