Most real-world knowledge graphs are characterized by a long-tail relation frequency distribution, in which a significant fraction of relations occurs only a handful of times. This observation has given rise to recent interest in low-shot learning methods that are able to generalize from only a few examples. The existing approaches, however, are tailored to static knowledge graphs and do not easily generalize to temporal settings, where data scarcity poses even greater problems, e.g., due to the occurrence of new, previously unseen relations. We address this shortcoming by proposing a one-shot learning framework for link prediction in temporal knowledge graphs. Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities, and a network to compute a similarity score between a given query and a (one-shot) example. Our experiments show that the proposed algorithm outperforms state-of-the-art baselines on two well-studied benchmarks while achieving significantly better performance on sparse relations.
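To make the two components named in the abstract concrete, the following is a minimal sketch, not the authors' implementation: (1) a self-attention encoder over an entity's timestamped neighbor interactions and (2) a similarity network that scores a query pair against the single (one-shot) reference pair of a relation. The use of PyTorch, all dimensions, the learned time projection, and the concatenation-based matching head are illustrative assumptions.

```python
import torch
import torch.nn as nn


class TemporalNeighborhoodEncoder(nn.Module):
    """Encodes an entity from its timestamped neighbor interactions via self-attention."""

    def __init__(self, emb_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(emb_dim, num_heads, batch_first=True)
        self.time_proj = nn.Linear(1, emb_dim)  # simple learned time encoding (assumption)

    def forward(self, neighbor_emb: torch.Tensor, timestamps: torch.Tensor) -> torch.Tensor:
        # neighbor_emb: (batch, num_neighbors, emb_dim); timestamps: (batch, num_neighbors, 1)
        x = neighbor_emb + self.time_proj(timestamps)  # inject temporal information
        h, _ = self.attn(x, x, x)                      # self-attention over the neighborhood
        return h.mean(dim=1)                           # pooled entity representation


class SimilarityNetwork(nn.Module):
    """Scores how well a query entity pair matches the one-shot reference pair."""

    def __init__(self, emb_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(4 * emb_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, query_pair: torch.Tensor, ref_pair: torch.Tensor) -> torch.Tensor:
        # query_pair / ref_pair: (batch, 2 * emb_dim) = concatenated head/tail encodings
        return self.mlp(torch.cat([query_pair, ref_pair], dim=-1)).squeeze(-1)


# Usage sketch: encode head/tail entities of the reference and of each candidate
# query, then rank candidate tails by their similarity score to the reference.
enc, sim = TemporalNeighborhoodEncoder(), SimilarityNetwork()
nbrs, ts = torch.randn(2, 5, 64), torch.rand(2, 5, 1)     # toy neighborhoods and timestamps
pair = torch.cat([enc(nbrs, ts), enc(nbrs, ts)], dim=-1)  # (batch, 2 * emb_dim)
score = sim(pair, pair)                                   # higher score = better match
```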