Graph Convolutional Networks (GCNs) have received significant attention from various research fields due to their excellent performance in learning graph representations. Although GCNs perform well compared with other methods, they still face challenges. Training a GCN model on large-scale graphs in the conventional way incurs high computation and memory costs. Therefore, motivated by the urgent need for efficiency and scalability in training GCNs, sampling methods have been proposed and have achieved significant effects. In this paper, we categorize sampling methods based on their sampling mechanisms and provide a comprehensive survey of sampling methods for efficient training of GCNs. To highlight the characteristics and differences of the sampling methods, we present a detailed comparison within each category and further give an overall comparative analysis of the sampling methods across all categories. Finally, we discuss some challenges and future research directions for sampling methods.
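As an illustration of the basic idea (not drawn from any specific method surveyed here), the sketch below shows node-wise neighbor sampling in the style of GraphSAGE, assuming a simple adjacency-list graph; the function name and the `fanout` parameter are illustrative. By keeping at most a fixed number of neighbors per node, the per-layer aggregation cost is bounded instead of growing with node degree.

```python
import numpy as np

def sample_neighbors(adj_list, batch_nodes, fanout, rng=None):
    """Node-wise neighbor sampling (illustrative sketch):
    for each node in the mini-batch, keep at most `fanout` randomly
    chosen neighbors instead of all of them, bounding the per-layer
    computation and memory cost of aggregation."""
    rng = rng or np.random.default_rng()
    sampled = {}
    for v in batch_nodes:
        neighbors = adj_list[v]
        if len(neighbors) > fanout:
            # Sample without replacement when the degree exceeds the fanout.
            sampled[v] = list(rng.choice(neighbors, size=fanout, replace=False))
        else:
            sampled[v] = list(neighbors)
    return sampled

# Toy graph: node id -> list of neighbor ids.
adj_list = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2], 4: [0]}
print(sample_neighbors(adj_list, batch_nodes=[0, 2], fanout=2))
```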