In recent years a number of large-scale triple-oriented knowledge graphs have been generated and various models have been proposed to perform learning in those graphs. Most knowledge graphs are static and reflect the world in its current state. In reality, of course, the state of the world is changing: a healthy person is diagnosed with a disease and a new president is inaugurated. In this paper, we extend models for static knowledge graphs to temporal knowledge graphs. This enables us to store episodic data and to generalize to new facts (inductive learning). We generalize leading learning models for static knowledge graphs (i.e., Tucker, RESCAL, HolE, ComplEx, DistMult) to temporal knowledge graphs. In particular, we introduce a new tensor model, ConT, with superior generalization performance. The performances of all proposed models are analyzed on two different datasets: the Global Database of Events, Language, and Tone (GDELT) and the Integrated Crisis Early Warning System (ICEWS). We argue that temporal knowledge graph embeddings might also serve as models for cognitive episodic memory (facts we remember and can recollect) and that a semantic memory (current facts we know) can be generated from episodic memory by a marginalization operation. We validate this episodic-to-semantic projection hypothesis with the ICEWS dataset.
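The two core ideas of the abstract can be illustrated with a minimal sketch: a static bilinear factorization (here a DistMult-style score, chosen for brevity; the paper's own models such as ConT differ) is extended with a time embedding to score quadruples (subject, predicate, object, time), and a semantic "current facts" score is obtained by marginalizing the episodic score over time. All names, dimensions, and the specific factorization below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, n_times, rank = 5, 3, 4, 8

# Illustrative embeddings: one vector per entity, relation, and time step.
E = rng.normal(size=(n_entities, rank))   # entity embeddings
R = rng.normal(size=(n_relations, rank))  # relation embeddings
T = rng.normal(size=(n_times, rank))      # time embeddings (the temporal extension)

def episodic_score(s, p, o, t):
    """DistMult-style score of the quadruple (s, p, o, t):
    an element-wise product of all four embeddings, summed over the rank."""
    return float(np.sum(E[s] * R[p] * E[o] * T[t]))

def semantic_score(s, p, o):
    """Episodic-to-semantic projection: marginalize the episodic
    score over all time steps to obtain a static (s, p, o) score."""
    return float(sum(episodic_score(s, p, o, t) for t in range(n_times)))
```

Because the score is linear in the time embedding, marginalizing over time here is equivalent to scoring against the sum of all time embeddings, which is one simple way to read the paper's claim that semantic memory can be generated from episodic memory.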