Graph neural networks (GNNs) are among the most powerful tools in deep learning. They routinely solve complex problems on unstructured networks, such as node classification, graph classification, and link prediction, with high accuracy. However, both inference and training of GNNs are complex, and they uniquely combine irregular graph processing with dense and regular computations. This complexity makes it very challenging to execute GNNs efficiently on modern massively parallel architectures. To alleviate this, we first design a taxonomy of parallelism in GNNs, considering data and model parallelism as well as different forms of pipelining. We then use this taxonomy to investigate the amount of parallelism in numerous GNN models, GNN-driven machine learning tasks, software frameworks, and hardware accelerators. We use the work-depth model, and we also assess communication volume and synchronization. We specifically focus on the sparsity/density of the associated tensors to understand how to effectively apply techniques such as vectorization. We also formally analyze GNN pipelining, and we generalize the established Message-Passing class of GNN models to cover arbitrary pipeline depths, facilitating future optimizations. Finally, we investigate different forms of asynchronicity, charting a path toward future asynchronous parallel GNN pipelines. The outcomes of our analysis are synthesized into a set of insights that help maximize GNN performance, and a comprehensive list of challenges and opportunities for further research into efficient GNN computations. Our work will help advance the design of future GNNs.
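For context, the established message-passing layer that our generalization builds on can be written in the standard form below; this is an illustrative sketch in common MPNN notation, not the exact formulation used later in the text:

h_v^{(l+1)} = \phi\left( h_v^{(l)}, \; \bigoplus_{u \in N(v)} \psi\left( h_v^{(l)}, h_u^{(l)}, e_{uv} \right) \right),

where h_v^{(l)} is the feature vector of vertex v after layer l, e_{uv} denotes the features of edge (u, v), \psi and \phi are learnable message and update functions, and \bigoplus is a permutation-invariant aggregator such as sum, mean, or max.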