Graph representation learning has resurged as a trending research subject owing to the widespread use of deep learning on Euclidean data, which has inspired various creative designs of neural networks in the non-Euclidean domain, particularly for graphs. With the success of these graph neural networks (GNNs) in the static setting, we turn to more practical scenarios where the graph dynamically evolves. Existing approaches typically resort to node embeddings and use a recurrent neural network (RNN, broadly speaking) to regulate the embeddings and learn the temporal dynamics. These methods require knowledge of a node over the full time span (including both training and testing) and are less applicable to frequent changes of the node set. In some extreme scenarios, the node sets at different time steps may differ completely. To resolve this challenge, we propose EvolveGCN, which adapts the graph convolutional network (GCN) model along the temporal dimension without resorting to node embeddings. The proposed approach captures the dynamism of the graph sequence by using an RNN to evolve the GCN parameters. Two architectures are considered for the parameter evolution. We evaluate the proposed approach on tasks including link prediction, edge classification, and node classification. The experimental results indicate a generally higher performance of EvolveGCN compared with related approaches. The code is available at \url{https://github.com/IBM/EvolveGCN}.
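The core idea of evolving GCN parameters with an RNN can be sketched as follows. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: it mimics the variant in which a GRU treats the GCN weight matrix itself as its hidden state, and all dimensions, initializations, and the use of an identity-normalized adjacency are illustrative choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EvolveGCNLayer:
    """Sketch of one EvolveGCN-style layer: a GRU evolves the GCN weight
    matrix across time steps, so no per-node embedding is carried over.
    Hyperparameters and initialization here are assumptions for illustration.
    """

    def __init__(self, in_dim, out_dim, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        s = 1.0 / np.sqrt(out_dim)
        # GRU parameters: update gate z, reset gate r, candidate state h.
        self.Wz = rng.uniform(-s, s, (out_dim, out_dim))
        self.Uz = rng.uniform(-s, s, (out_dim, out_dim))
        self.Wr = rng.uniform(-s, s, (out_dim, out_dim))
        self.Ur = rng.uniform(-s, s, (out_dim, out_dim))
        self.Wh = rng.uniform(-s, s, (out_dim, out_dim))
        self.Uh = rng.uniform(-s, s, (out_dim, out_dim))
        # Initial GCN weights, evolved (not retrained) at each time step.
        self.W0 = rng.uniform(-s, s, (in_dim, out_dim))

    def evolve(self, W):
        # One GRU step where the GCN weight matrix serves as both the
        # input and the hidden state (the "-O" style of parameter evolution).
        z = sigmoid(W @ self.Wz + W @ self.Uz)
        r = sigmoid(W @ self.Wr + W @ self.Ur)
        h = np.tanh(W @ self.Wh + (r * W) @ self.Uh)
        return (1.0 - z) * W + z * h

    def forward(self, A_hat, X, W_prev=None):
        # Evolve the weights first, then apply standard GCN propagation
        # H_t = ReLU(A_hat X_t W_t) with the evolved weights.
        W = self.evolve(self.W0 if W_prev is None else W_prev)
        return np.maximum(A_hat @ X @ W, 0.0), W
```

A usage sketch over a graph sequence: call `forward` on each snapshot's normalized adjacency `A_hat` and feature matrix `X`, threading the returned weights into the next step, so the GCN parameters (not node embeddings) carry the temporal state.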