Knowledge graph completion (KGC) has become a focus of attention across the deep learning community owing to its contribution to numerous downstream tasks. Although recent years have witnessed a surge of work on KGC, existing methods are still insufficient to accurately capture complex relations, since they adopt single, static representations. In this work, we propose a novel Disentangled Knowledge Graph Attention Network (DisenKGAT) for KGC, which leverages both micro-disentanglement and macro-disentanglement to exploit the representations behind knowledge graphs (KGs). To achieve micro-disentanglement, we put forward a novel relation-aware aggregation to learn diverse component representations. For macro-disentanglement, we leverage mutual information as a regularization to enhance independence among components. With the assistance of disentanglement, our model is able to generate adaptive representations for the given scenario. Besides, our method is robust and flexible enough to adapt to various score functions. Extensive experiments on public benchmark datasets validate the superiority of DisenKGAT over existing methods in terms of both accuracy and explainability.
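The two disentanglement mechanisms above can be illustrated with a minimal sketch. The code below is a toy illustration, not the paper's implementation: each entity holds K component embeddings (micro-disentanglement), a relation attends over those components to build an adaptive, relation-specific message, and a simple pairwise-similarity penalty stands in for the mutual-information regularizer (macro-disentanglement). All names, shapes, and the softmax attention form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 4, 8                    # K latent components, each of dimension d (toy sizes)
num_ent, num_rel = 5, 3

# Micro-disentanglement: each entity keeps K separate component embeddings.
ent = rng.normal(size=(num_ent, K, d))
rel = rng.normal(size=(num_rel, d))

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def relation_aware_message(h, r):
    """Weight entity h's K components by their affinity to relation r,
    so the message is adaptive to the relational context (illustrative)."""
    scores = ent[h] @ rel[r]               # (K,) affinity of each component to r
    att = softmax(scores)                  # attention distribution over components
    return (att[:, None] * ent[h]).sum(0)  # (d,) relation-specific representation

def independence_penalty(e):
    """Toy stand-in for the mutual-information regularizer: penalize pairwise
    cosine similarity between components to push them toward independence."""
    comp = e / np.linalg.norm(e, axis=-1, keepdims=True)
    sim = comp @ comp.T                    # (K, K) pairwise cosine similarities
    off_diag = sim - np.diag(np.diag(sim))
    return (off_diag ** 2).sum() / (K * (K - 1))

msg = relation_aware_message(h=0, r=1)
penalty = independence_penalty(ent[0])
print(msg.shape, penalty >= 0)
```

The same entity thus yields different representations under different relations, which is the sense in which the model is "adaptive to the given scenario"; the paper's actual regularizer is mutual-information based rather than this cosine proxy.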