Large knowledge graphs often grow to store temporal facts that model the dynamic relations or interactions of entities along the timeline. Since such temporal knowledge graphs often suffer from incompleteness, it is important to develop time-aware representation learning models that help to infer the missing temporal facts. While temporal facts are typically evolving, it is observed that many facts often show a repeated pattern along the timeline, such as economic crises and diplomatic activities. This observation indicates that a model could potentially learn much from the known facts that appeared in history. To this end, we propose a new representation learning model for temporal knowledge graphs, namely CyGNet, based on a novel time-aware copy-generation mechanism. CyGNet is not only able to predict future facts from the whole entity vocabulary, but is also capable of identifying facts with repetition and accordingly predicting such future facts with reference to the known facts in the past. We evaluate the proposed method on the knowledge graph completion task using five benchmark datasets. Extensive experiments demonstrate the effectiveness of CyGNet for predicting future facts with repetition as well as de novo fact prediction.
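To make the copy-generation idea concrete, the sketch below shows one way a scorer could mix a "copy" distribution, restricted to objects that already co-occurred with a query (subject, relation) at earlier timestamps, with a "generation" distribution over the whole entity vocabulary. It is a minimal illustration only: the query encoding, the simple linear scorers, the history mask, and the fixed mixing weight alpha are all assumptions for exposition, not the actual CyGNet architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyGenerationScorer(nn.Module):
    """Minimal sketch of a time-aware copy-generation scorer (illustrative only)."""

    def __init__(self, num_entities: int, hidden_dim: int, alpha: float = 0.5):
        super().__init__()
        # Generation mode: score every entity in the whole vocabulary.
        self.generation = nn.Linear(hidden_dim, num_entities)
        # Copy mode: score entities, then restrict to those seen in history.
        self.copy = nn.Linear(hidden_dim, num_entities)
        # Fixed trade-off between the two modes (an assumption; the balance
        # between copy and generation could also be learned).
        self.alpha = alpha

    def forward(self, query: torch.Tensor, history_mask: torch.Tensor) -> torch.Tensor:
        # query:        (batch, hidden_dim) embedding of a (subject, relation, time) query
        # history_mask: (batch, num_entities), 1 for candidate objects that already
        #               appeared with this (subject, relation) at earlier timestamps
        p_gen = F.softmax(self.generation(query), dim=-1)
        copy_logits = self.copy(query).masked_fill(history_mask == 0, -1e9)
        p_copy = F.softmax(copy_logits, dim=-1)
        # Final distribution mixes repeating a historical fact with predicting de novo.
        return self.alpha * p_copy + (1 - self.alpha) * p_gen
```

Under these assumptions, queries whose answer repeats a historical fact gain probability mass through the masked copy term, while entirely new facts can still be predicted through the generation term over the full vocabulary.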