Reasoning is a fundamental problem for computers and is deeply studied in Artificial Intelligence. In this paper, we focus on answering multi-hop logical queries over Knowledge Graphs (KGs). This is a challenging task because, in real-world scenarios, the graphs tend to be large and incomplete. Most previous works cannot handle full First-Order Logic (FOL) queries, which include negation, and support only a limited set of query structures. Additionally, most methods define logical operators that can only perform the single operation they were designed for. We introduce a set of models that use neural networks to compute one-point vector embeddings for answering queries. The versatility of neural networks allows the framework to handle FOL queries with conjunction ($\wedge$), disjunction ($\vee$), and negation ($\neg$) operators. We demonstrate the performance of our models through extensive experiments on well-known benchmark datasets. Besides providing more versatile operators, the models achieve a 10\% relative improvement over the best-performing state-of-the-art method and more than 30\% over the original method based on single-point vector embeddings.
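To make the idea of neural logical operators over one-point embeddings concrete, the sketch below shows one possible way such operators could be parameterized. It is a minimal illustration under our own assumptions (PyTorch, a fixed embedding dimension, and the hypothetical module name NeuralLogicOperators), not the paper's actual architecture: each operator is a small feed-forward network that maps one or two query embeddings to a new point embedding.

```python
# Minimal, illustrative sketch (not the paper's implementation) of neural
# logical operators acting on one-point query embeddings. PyTorch and the
# module/method names used here are assumptions for illustration only.
import torch
import torch.nn as nn


class NeuralLogicOperators(nn.Module):
    def __init__(self, dim: int = 128):
        super().__init__()
        # Conjunction: merge two query embeddings into a single point embedding.
        self.conjunction = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        # Disjunction: same shape as conjunction, separate parameters.
        self.disjunction = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        # Negation: map a query embedding to an embedding of its complement.
        self.negation = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )

    def and_(self, q1: torch.Tensor, q2: torch.Tensor) -> torch.Tensor:
        return self.conjunction(torch.cat([q1, q2], dim=-1))

    def or_(self, q1: torch.Tensor, q2: torch.Tensor) -> torch.Tensor:
        return self.disjunction(torch.cat([q1, q2], dim=-1))

    def not_(self, q: torch.Tensor) -> torch.Tensor:
        return self.negation(q)


# Usage: embed a query such as "not (q1 and q2)"; candidate answers would then
# be ranked by the distance between this embedding and the entity embeddings.
ops = NeuralLogicOperators(dim=128)
q1, q2 = torch.randn(1, 128), torch.randn(1, 128)
query_embedding = ops.not_(ops.and_(q1, q2))
```

Because each operator is a learned network rather than a hand-crafted geometric operation, the same framework can in principle compose $\wedge$, $\vee$, and $\neg$ in arbitrary query structures.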