Reasoning about the relationships between entities from input facts (e.g., whether Ari is a grandparent of Charlie) generally requires explicit consideration of other entities that are not mentioned in the query (e.g., the parents of Charlie). In this paper, we present an approach for learning to solve problems of this kind in large, real-world domains, using sparse and local hypergraph neural networks (SpaLoc). SpaLoc is motivated by two observations from traditional logic-based reasoning: relational inferences usually apply locally (i.e., involve only a small number of individuals), and relations are usually sparse (i.e., they hold for only a small fraction of the tuples in a domain). We exploit these properties to make learning and inference efficient in very large domains by (1) using a sparse tensor representation for hypergraph neural networks, (2) applying a sparsification loss during training to encourage sparse representations, and (3) subsampling the input graph during training according to a novel information-sufficiency criterion. SpaLoc achieves state-of-the-art performance on several real-world, large-scale knowledge graph reasoning benchmarks, and is the first framework to apply hypergraph neural networks to real-world knowledge graphs with more than 10k nodes.
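As a rough illustration of the first two ingredients, the sketch below stores a binary relation as a sparse COO tensor (only tuples for which the relation holds are materialized) and applies a simple L1-style penalty to predicted relation scores. The tensor library, the variable names, and the exact form of the penalty are assumptions made for illustration; they are not taken from the paper's implementation.

```python
# Hypothetical sketch (not the authors' code): a sparse representation of a
# relation over a large domain, plus a simple L1-style sparsification penalty
# of the kind the abstract alludes to. All names and shapes are assumptions.
import torch

num_entities = 10_000

# Store only the tuples for which parent_of holds (sparse COO format),
# instead of a dense num_entities x num_entities tensor.
parent_pairs = torch.tensor([[3, 3, 7],      # child indices
                             [12, 45, 90]])  # parent indices
values = torch.ones(parent_pairs.shape[1])
parent_of = torch.sparse_coo_tensor(
    parent_pairs, values, size=(num_entities, num_entities)
).coalesce()

def sparsification_loss(relation_scores: torch.Tensor,
                        weight: float = 1e-3) -> torch.Tensor:
    """L1 penalty pushing predicted relation scores toward zero, so that
    learned intermediate relations stay sparse (a stand-in for SpaLoc's
    sparsification loss, whose exact form is not given in the abstract)."""
    return weight * relation_scores.abs().mean()

# Example: penalize dense intermediate predictions from a (hypothetical) model.
scores = torch.rand(128, 128, requires_grad=True)   # predicted pairwise scores
loss = sparsification_loss(scores)
loss.backward()
```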