Temporal graphs capture dynamic interactions between nodes over continuous time, and their topologies evolve as time elapses. The whole temporal neighborhood of a node reveals its varying preferences over time. However, previous works usually generate dynamic node representations from only a limited set of neighbors for simplicity, which results in both inferior performance and high online-inference latency. Therefore, in this paper, we propose a novel method of temporal graph convolution over the whole neighborhood, namely Temporal Aggregation and Propagation Graph Neural Networks (TAP-GNN). Specifically, we first analyze the computational complexity of the dynamic representation problem by unfolding the temporal graph under the message-passing paradigm. This prohibitive complexity motivates us to design the AP (aggregation and propagation) block, which significantly reduces repeated computation over historical neighbors. The final TAP-GNN supports online inference in the graph-stream scenario: in addition to several AP blocks, it incorporates temporal information into node embeddings with a temporal activation function and a projection layer. Experimental results on various real-life temporal networks show that our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online-inference latency. Our code is available at \url{https://github.com/doujiang-zheng/TAP-GNN}.
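To make the AP-block idea concrete, the following is a minimal, hypothetical sketch, not the paper's exact formulation: it assumes each node keeps an incremental cache of aggregated historical neighbor messages, so that a batch of newly arrived temporal edges only adds its own messages to the cache before a propagation layer mixes the cached aggregate with the current node features. All names (APBlockSketch, hist_agg, propagate) and the simple sum aggregation are illustrative assumptions; the actual AP block and temporal activation function are defined in the paper and the linked repository.

\begin{verbatim}
import torch
import torch.nn as nn


class APBlockSketch(nn.Module):
    """Illustrative aggregation-and-propagation block (sketch only).

    A running aggregate of historical neighbor messages is cached per node,
    so each new batch of temporal edges is folded into the cache instead of
    re-aggregating the whole interaction history.
    """

    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        # Cached aggregate of all past neighbor messages for each node.
        self.register_buffer("hist_agg", torch.zeros(num_nodes, dim))
        # Propagation step: mix a node's own features with its cached aggregate.
        self.propagate = nn.Linear(2 * dim, dim)

    def forward(self, node_feat, src, dst):
        # Messages carried by the *new* temporal edges only
        # (here simply the source-node features, for illustration).
        msg = node_feat[src]
        # Incremental aggregation: add the new messages to the cache;
        # the cache is maintained outside autograd, as in streaming inference.
        with torch.no_grad():
            self.hist_agg.index_add_(0, dst, msg)
        # Propagation over the full (cached) temporal neighborhood.
        out = self.propagate(torch.cat([node_feat, self.hist_agg], dim=-1))
        return torch.relu(out)


if __name__ == "__main__":
    num_nodes, dim = 5, 8
    block = APBlockSketch(num_nodes, dim)
    node_feat = torch.randn(num_nodes, dim)
    # A small batch of newly arrived temporal edges (src -> dst).
    src = torch.tensor([0, 1, 2])
    dst = torch.tensor([1, 2, 0])
    print(block(node_feat, src, dst).shape)  # torch.Size([5, 8])
\end{verbatim}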