Graph Neural Networks (GNNs) have recently become increasingly popular due to their ability to learn complex systems of relations or interactions arising in a broad spectrum of problems ranging from biology and particle physics to social networks and recommendation systems. Despite the plethora of different models for deep learning on graphs, few approaches have been proposed thus far for dealing with graphs that present some sort of dynamic nature (e.g. evolving features or connectivity over time). In this paper, we present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events. Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while also being more computationally efficient. We furthermore show that several previous models for learning on dynamic graphs can be cast as specific instances of our framework. We perform a detailed ablation study of the different components of our framework and devise the best configuration, which achieves state-of-the-art performance on several transductive and inductive prediction tasks for dynamic graphs.
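To make the "memory modules over sequences of timed events" idea concrete, here is a minimal toy sketch: each node keeps a memory vector, and every timed interaction event produces messages (built from the other endpoint's pre-event memory and the elapsed time) that update both endpoints' memories. All names and the random update weights are hypothetical stand-ins for the learned message and GRU-style update functions described in the paper; this is not the authors' implementation.

```python
import numpy as np

class TGNMemorySketch:
    """Toy sketch of a TGN-style node memory (hypothetical, simplified)."""

    def __init__(self, num_nodes, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.mem = np.zeros((num_nodes, dim))    # one memory vector per node
        self.last_update = np.zeros(num_nodes)   # last event time per node
        # random weights standing in for a learned message/update function
        self.W = rng.normal(scale=0.1, size=(2 * dim + 1, dim))

    def process_event(self, src, dst, t):
        """Update both endpoints' memories from one timed interaction event."""
        pre = self.mem[[src, dst]].copy()  # messages use pre-event memory
        for node, other_mem in ((src, pre[1]), (dst, pre[0])):
            dt = t - self.last_update[node]
            # message = other endpoint's memory plus a time-delta feature
            msg = np.concatenate([other_mem, [dt]])
            h = np.tanh(np.concatenate([self.mem[node], msg]) @ self.W)
            # fixed gated blend standing in for a learned GRU cell
            self.mem[node] = 0.5 * self.mem[node] + 0.5 * h
        self.last_update[src] = self.last_update[dst] = t

# process a short sequence of timed events (src, dst, timestamp)
events = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.5)]
tgn = TGNMemorySketch(num_nodes=3, dim=4)
for s, d, t in events:
    tgn.process_event(s, d, t)
```

In the full framework, these per-node memories are then fed into graph-based embedding operators (e.g. temporal graph attention over recent neighbors) to produce node embeddings for downstream prediction; the sketch above only illustrates the event-driven memory update.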