Recent advances in neural-symbolic learning, such as DeepProbLog, extend probabilistic logic programs with neural predicates. Like graphical models, these probabilistic logic programs define a probability distribution over possible worlds, for which inference is computationally hard. We propose DeepStochLog, an alternative neural-symbolic framework based on stochastic definite clause grammars, a type of stochastic logic program, which defines a probability distribution over possible derivations. More specifically, we introduce neural grammar rules into stochastic definite clause grammars to create a framework that can be trained end-to-end. We show that inference and learning in neural stochastic logic programming scale much better than they do for neural probabilistic logic programs. Furthermore, the experimental evaluation shows that DeepStochLog achieves state-of-the-art results on challenging neural-symbolic learning tasks.
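To make the semantic contrast concrete, the following is a minimal sketch based on the standard semantics of stochastic logic programs (the notation is ours, not taken from the paper). A stochastic definite clause grammar attaches a probability \(p_r\) to each rule \(r\), with the probabilities of the rules sharing the same nonterminal summing to one; a derivation \(d\) is then scored by the product of the probabilities of the rule applications it contains, and the probability of a goal \(G\) sums over its derivations:

\[
P(d) \;=\; \prod_{r \in d} p_r,
\qquad
P(G) \;=\; \sum_{d \,\vdash\, G} P(d).
\]

In a neural grammar rule, \(p_r\) is produced by a neural network conditioned on the rule's arguments. Because derivation probabilities factorize over rule applications, goal probabilities can be computed with parsing-style dynamic programming that reuses shared sub-derivations, whereas the possible-world semantics of probabilistic logic programs requires weighted model counting, which is #P-hard in general; this is the intuition behind the scaling advantage stated above.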