In recent years, there has been growing interest in Neural-Symbolic Integration frameworks, i.e., hybrid systems that integrate connectionist and symbolic approaches to obtain the best of both worlds. In this work, we focus on a specific method, KENN (Knowledge Enhanced Neural Networks), a Neural-Symbolic architecture that injects prior logical knowledge into a neural network by adding, on top of it, a residual layer that modifies the initial predictions according to the knowledge. Among the advantages of this strategy is the inclusion of clause weights, learnable parameters that represent the strength of the clauses: the model can learn the impact of each rule on the final predictions. As a special case, if the training data contradicts a constraint, KENN learns to ignore it, making the system robust to the presence of wrong knowledge. In this paper, we propose an extension of KENN to relational data. One of the main advantages of KENN resides in its scalability, thanks to a flexible treatment of the dependencies between rules, obtained by stacking multiple logical layers. We show experimentally the efficacy of this strategy: KENN improves the performance of the underlying neural network, obtaining better or comparable accuracy with respect to two other related methods that combine learning with logic, while requiring significantly less time for learning.
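To make the residual mechanism described above concrete, the following is a minimal sketch (not the authors' implementation) of a clause enhancement step for a single clause. The function name `clause_enhancer` and the `signs`/`clause_weight` parameters are illustrative assumptions: given the network's pre-activations for the predicates of a clause such as A(x) ∨ ¬B(x), the layer adds a boost, scaled by the learnable clause weight, toward the literal that is already closest to satisfying the clause; a clause weight of zero corresponds to KENN having learned to ignore the rule.

```python
import numpy as np

def clause_enhancer(z, signs, clause_weight):
    """Illustrative KENN-style residual boost for one clause.

    z            : pre-activations of the predicates in the clause
    signs        : +1 for positive literals, -1 for negated ones
    clause_weight: learnable, non-negative strength of the clause
    """
    # Truth-oriented view of each literal: a negated predicate is
    # "more satisfied" when its pre-activation is low.
    literals = signs * z
    # Boost distributed by softmax: the literal closest to satisfying
    # the clause receives most of the increase.
    softmax = np.exp(literals) / np.sum(np.exp(literals))
    delta = clause_weight * softmax
    # The residual change is mapped back to predicate space.
    return z + signs * delta

# Clause A(x) v ~B(x): A is pushed up, B is pushed down.
z = np.array([-1.0, 2.0])
signs = np.array([1.0, -1.0])
z_new = clause_enhancer(z, signs, clause_weight=1.0)
```

Note that with `clause_weight = 0.0` the predictions pass through unchanged, which is how a rule contradicted by the training data can be effectively switched off during learning.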