Dynamic graph representation learning has emerged as a crucial research area, driven by the growing need to analyze time-evolving graph data in real-world applications. While recent approaches that combine recurrent neural networks (RNNs) and graph neural networks (GNNs) have shown promise, they often fail to adequately capture the impact of temporal edge states on inter-node relationships, and consequently overlook the dynamic changes in node features induced by these evolving relationships. Furthermore, these methods inherit GNNs' over-smoothing problem, which hinders the extraction of global structural features. To address these challenges, we introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning. RSGT first applies a heuristic method that explicitly models edge temporal states, assigning distinct edge types and weights based on the differences between consecutive snapshots and thereby encoding the varying temporal states of edges directly into the graph's topological structure. We then propose a structure-reinforced graph transformer that, through a recurrent learning paradigm, captures temporal node representations encoding both graph topology and evolving dynamics, enabling the extraction of both local and global structural features. Comprehensive experiments on four real-world datasets demonstrate RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
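To make the snapshot-difference heuristic concrete, the sketch below shows one plausible way to derive edge types and weights by diffing consecutive snapshots. The state names (`persistent`, `emerging`, `disappearing`), the decay factor, and the function `label_edge_states` are illustrative assumptions for exposition, not the paper's exact scheme.

```python
# Hypothetical sketch of the snapshot-difference heuristic: tag each edge
# with a temporal state (type) and a weight by comparing the edge sets of
# consecutive snapshots G_{t-1} and G_t. State names, the weight rule, and
# the decay factor are assumed for illustration.

def label_edge_states(prev_edges: set, curr_edges: set, decay: float = 0.5):
    """Return {edge: (state, weight)} for the current snapshot.

    prev_edges / curr_edges are sets of (u, v) tuples taken from the
    snapshots G_{t-1} and G_t.
    """
    labeled = {}
    for e in curr_edges:
        if e in prev_edges:
            # Edge persists across snapshots: a stable relationship.
            labeled[e] = ("persistent", 1.0)
        else:
            # Edge appears for the first time in this snapshot.
            labeled[e] = ("emerging", 1.0)
    for e in prev_edges - curr_edges:
        # Edge vanished; keep it with a decayed weight so the downstream
        # model can still see the fading relationship (an assumed choice).
        labeled[e] = ("disappearing", decay)
    return labeled

# Example: edge (0, 1) persists, (1, 2) disappears, (0, 2) is new.
prev = {(0, 1), (1, 2)}
curr = {(0, 1), (0, 2)}
print(label_edge_states(prev, curr))
# {(0, 1): ('persistent', 1.0), (0, 2): ('emerging', 1.0),
#  (1, 2): ('disappearing', 0.5)}
```

Under this reading, the labeled, weighted edges become part of the graph's topology, so the structure-reinforced transformer can attend over relationship dynamics rather than over a static adjacency alone.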