Despite the recent success of Graph Neural Networks (GNNs), it remains challenging to train a GNN on large graphs with millions of nodes and billions of edges, which are prevalent in many graph-based applications. Traditional sampling-based methods accelerate GNN training by dropping edges and nodes, which impairs graph integrity and model performance. In contrast, distributed GNN algorithms accelerate GNN training by utilizing multiple computing devices and can be classified into two types: "partition-based" methods enjoy low communication costs but suffer from information loss due to dropped edges, while "propagation-based" methods avoid information loss but suffer from prohibitive communication overhead caused by neighbor explosion. To jointly address these problems, this paper proposes DIGEST (DIstributed Graph reprEsentation SynchronizaTion), a novel distributed GNN training framework that synergizes the complementary strengths of both categories of existing methods. We propose to allow each device to utilize the stale representations of its neighbors in other subgraphs during subgraph-parallel training. This way, our method preserves global graph information from neighbors, avoiding information loss while keeping communication costs low. Our convergence analysis demonstrates that DIGEST enjoys a state-of-the-art convergence rate. Extensive experimental evaluation on large, real-world graph datasets shows that DIGEST achieves up to a 21.82x speedup without compromising performance compared to state-of-the-art distributed GNN training frameworks.
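To make the core idea concrete, the following is a minimal sketch of how one device might combine local neighbor aggregation with cached, stale representations of remote neighbors held by other devices. It assumes a PyTorch implementation; the names StaleEmbeddingCache, SubgraphGCNLayer, and remote_msgs are hypothetical illustrations, not DIGEST's actual interface.

```python
import torch
import torch.nn as nn

class StaleEmbeddingCache:
    """Holds the last-synchronized representations of boundary neighbors
    that live in other devices' subgraphs (hypothetical component; the
    paper's actual synchronization protocol may differ)."""
    def __init__(self, num_remote_nodes: int, dim: int):
        self.table = torch.zeros(num_remote_nodes, dim)

    def read(self, remote_ids: torch.Tensor) -> torch.Tensor:
        # Return whatever the owning device last pushed; may be stale.
        return self.table[remote_ids]

    def update(self, remote_ids: torch.Tensor, fresh: torch.Tensor) -> None:
        # Called during periodic synchronization rounds.
        self.table[remote_ids] = fresh.detach()

class SubgraphGCNLayer(nn.Module):
    """One mean-aggregation GNN layer over a local subgraph, augmented
    with stale representations arriving over cut edges."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, h_local, local_adj, remote_msgs):
        # h_local:     [n_local, d] local node representations
        # local_adj:   [n_local, n_local] dense adjacency of the subgraph
        # remote_msgs: [n_local, d] pre-aggregated stale neighbor features
        deg = local_adj.sum(dim=1, keepdim=True).clamp(min=1)
        agg = (local_adj @ h_local) / deg   # mean over local neighbors
        agg = agg + remote_msgs             # inject stale cross-device info
        return torch.relu(self.lin(torch.cat([h_local, agg], dim=1)))

# Toy usage: 4 local nodes, 2 remote boundary neighbors, feature dim 8.
cache = StaleEmbeddingCache(num_remote_nodes=2, dim=8)
layer = SubgraphGCNLayer(8, 8)
h = torch.randn(4, 8)
adj = (torch.rand(4, 4) > 0.5).float()
remote = torch.zeros(4, 8)
# Suppose local nodes 0 and 3 each have one cut edge to a remote neighbor.
remote[[0, 3]] = cache.read(torch.tensor([0, 1]))
out = layer(h, adj, remote)
```

The key design point this sketch illustrates is that cross-device information enters the forward pass as a local table read rather than a blocking communication, so each device can train on its subgraph without waiting for fresh neighbor representations.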