We propose Dynamically Pruned Message Passing Networks (DPMPN) for large-scale knowledge graph reasoning. In contrast to existing models, whether embedding-based or path-based, we learn an input-dependent subgraph that explicitly models a sequential reasoning process. Each subgraph is dynamically constructed, expanding itself selectively under a flow-style attention mechanism. In this way, we can not only construct graphical explanations to interpret predictions, but also prune message passing in Graph Neural Networks (GNNs) so that computation scales with the size of the graph. Taking inspiration from the consciousness prior proposed by Bengio, we design a two-GNN framework in which one GNN encodes a global, input-invariant graph-structured representation and the other learns a local, input-dependent one, with the two coordinated by an attention module. Experiments demonstrate the reasoning capability of our model: it provides clear graphical explanations while predicting accurately, outperforming most state-of-the-art methods on knowledge base completion tasks.
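To make the core idea concrete, the following is a minimal toy sketch (not the paper's actual architecture) of attention-guided subgraph expansion with top-k pruning: at each step, neighbors of the current frontier are scored by a stand-in attention function, and only the highest-scoring candidates are admitted into the subgraph, so message passing stays confined to a small, input-dependent region. The dot-product attention, `topk` parameter, and random graph here are illustrative assumptions, not the method as published.

```python
import numpy as np


def expand_subgraph(adj, node_feats, frontier, visited, topk=3):
    """One step of attention-guided subgraph expansion (illustrative only).

    adj        : dense adjacency matrix of the full graph (N x N)
    node_feats : node feature matrix (N x d)
    frontier   : list of node indices whose neighbors are candidates
    visited    : set of node indices already in the subgraph
    topk       : number of candidates kept per step (the "pruning")
    """
    # Gather candidate neighbors of the current frontier.
    candidates = set()
    for u in frontier:
        candidates.update(np.nonzero(adj[u])[0].tolist())
    candidates -= visited
    if not candidates:
        return [], visited
    cand = np.array(sorted(candidates))

    # Toy attention: score each candidate against the mean frontier
    # representation, then softmax-normalize. The real model uses a
    # learned, flow-style attention; this is only a placeholder.
    query = node_feats[frontier].mean(axis=0)
    scores = node_feats[cand] @ query
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()

    # Keep only the top-k highest-attention candidates: this is the
    # dynamic pruning that keeps message passing local and cheap.
    keep = cand[np.argsort(-probs)[:topk]]
    visited |= set(keep.tolist())
    return keep.tolist(), visited


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, d = 20, 8
    adj = (rng.random((N, N)) < 0.2).astype(int)
    np.fill_diagonal(adj, 0)
    feats = rng.standard_normal((N, d))

    # Start from a single query node and expand for a few steps.
    frontier, visited = [0], {0}
    for step in range(3):
        frontier, visited = expand_subgraph(adj, feats, frontier, visited)
        print(f"step {step}: frontier={frontier}, subgraph size={len(visited)}")
        if not frontier:
            break
```

The retained nodes across steps form the input-dependent subgraph; because messages are passed only among these nodes rather than over the full graph, the per-query cost is bounded by the subgraph size rather than the size of the knowledge graph.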