Temporal Knowledge Graph (TKG) representation learning embeds entities and event types into a continuous low-dimensional vector space by integrating temporal information, which is essential for downstream tasks such as event prediction and question answering. Existing methods stack multiple graph convolution layers to model the influence of distant entities, leading to the over-smoothing problem. To alleviate this problem, recent studies introduce reinforcement learning to obtain paths that contribute to modeling the influence of distant entities. However, due to the limited number of hops, these studies fail to capture the correlation between entities that are far apart or even unreachable. To this end, we propose GTRL, an entity Group-aware Temporal knowledge graph Representation Learning method. GTRL is the first work that incorporates entity group modeling to capture the correlation between entities while stacking only a finite number of layers. Specifically, an entity group mapper is proposed to generate entity groups from entities in a learned way. Based on entity groups, an implicit correlation encoder is introduced to capture implicit correlations between any pairwise entity groups. In addition, hierarchical GCNs are exploited to accomplish message aggregation and representation updating on the entity group graph and the entity graph. Finally, GRUs are employed to capture the temporal dependency in TKGs. Extensive experiments on three real-world datasets demonstrate that GTRL achieves state-of-the-art performance on the event prediction task, outperforming the best baseline by an average of 13.44%, 9.65%, 12.15%, and 15.12% in MRR, Hits@1, Hits@3, and Hits@10, respectively.
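To make the pipeline described above concrete, the following is a minimal sketch, not the authors' implementation, assuming PyTorch and collapsing each component into its simplest single-layer form: a soft entity-to-group assignment for the entity group mapper, attention over group embeddings for the implicit correlation encoder, linear message passing in place of the hierarchical GCNs, and a GRU over the resulting snapshot sequence. All class, parameter, and variable names here are hypothetical.

```python
# Hedged sketch of a GTRL-style forward pass over a sequence of TKG snapshots.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GTRLSketch(nn.Module):
    def __init__(self, num_entities, num_groups, dim):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        # Entity group mapper: learns a soft assignment of entities to groups.
        self.group_mapper = nn.Linear(dim, num_groups)
        # Implicit correlation encoder: attention over pairwise group embeddings.
        self.corr_attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        # Hierarchical GCNs reduced to simple linear message passing for brevity.
        self.entity_gcn = nn.Linear(dim, dim)
        self.group_gcn = nn.Linear(dim, dim)
        # GRU captures the temporal dependency across snapshots.
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, adj_snapshots):
        # adj_snapshots: list of (num_entities, num_entities) adjacency matrices,
        # one per timestamp of the TKG.
        h = self.entity_emb.weight                                  # (E, d)
        states = []
        for adj in adj_snapshots:
            # Message aggregation on the entity graph.
            h_ent = F.relu(self.entity_gcn(adj @ h))                # (E, d)
            # Soft entity-to-group assignment (entity group mapper).
            assign = F.softmax(self.group_mapper(h_ent), dim=-1)    # (E, G)
            h_grp = assign.t() @ h_ent                              # (G, d)
            # Implicit correlations between any pair of groups via attention.
            h_grp, _ = self.corr_attn(h_grp.unsqueeze(0),
                                      h_grp.unsqueeze(0),
                                      h_grp.unsqueeze(0))
            h_grp = F.relu(self.group_gcn(h_grp.squeeze(0)))        # (G, d)
            # Broadcast group-level information back to entities and fuse.
            h = h_ent + assign @ h_grp                              # (E, d)
            states.append(h)
        seq = torch.stack(states, dim=1)                            # (E, T, d)
        out, _ = self.gru(seq)                                      # temporal dependency
        return out[:, -1]                                           # final entity representations
```

Under these assumptions, the returned entity representations would be scored against candidate objects (e.g., with a dot product or a decoder such as ConvTransE) for the event prediction task; the choice of decoder is outside this sketch.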