Static graph neural networks have been widely used for modeling and representation learning on graph-structured data. However, many real-world problems, such as social networks, financial transactions, and recommendation systems, are dynamic: nodes and edges are added or deleted over time. Dynamic graph neural networks have therefore received growing attention from researchers in recent years. In this work, we propose Efficient-Dyn, a novel dynamic graph neural network. It adaptively encodes temporal information into a sequence of patches, each containing an equal amount of temporal-topological structure. This avoids the information loss caused by snapshot-based discretization while achieving a time granularity close to what continuous-time methods provide. In addition, we design a lightweight module, the Sparse Temporal Transformer, which computes node representations from both structural neighborhoods and temporal dynamics. Because the fully-connected attention computation is sparsified, its cost is far lower than that of current state-of-the-art models. We conduct link prediction experiments on both continuous and discrete graph datasets. Compared with several state-of-the-art graph embedding baselines, Efficient-Dyn achieves faster inference while delivering competitive performance.
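To make the patching idea concrete, below is a minimal sketch of equal-event patching, assuming each patch holds the same number of interaction events rather than a fixed time window; the names `split_into_patches` and `events_per_patch` are hypothetical illustrations, not the paper's actual API.

```python
import numpy as np

def split_into_patches(timestamps, events_per_patch):
    """Partition a timestamped event stream into patches that each
    contain the same number of interactions (an equal amount of
    temporal-topological structure), instead of fixed-width snapshots."""
    order = np.argsort(timestamps)  # process events in time order
    n_patches = int(np.ceil(len(order) / events_per_patch))
    return [order[i * events_per_patch:(i + 1) * events_per_patch]
            for i in range(n_patches)]

# Toy usage: a bursty edge stream. Equal-event patches give the burst
# finer time granularity than equal-width snapshot windows would.
ts = np.array([0.10, 0.20, 0.25, 0.30, 0.31, 5.00, 9.70])
for i, patch in enumerate(split_into_patches(ts, events_per_patch=3)):
    print(f"patch {i}: events {patch.tolist()}, "
          f"span [{ts[patch].min():.2f}, {ts[patch].max():.2f}]")
```

Under this assumption, dense bursts of activity are covered by many short patches while quiet periods collapse into a few long ones, which is one way to approach the fine granularity of continuous-time models without fixed snapshots.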