Graph Convolutional Networks (GCNs) have received significant attention across research fields due to their excellent performance in learning graph representations. Although GCNs perform well compared with other methods, they still face challenges: training a GCN model on large-scale graphs in the conventional way incurs high computation and storage costs. Motivated by the urgent need for efficiency and scalability in GCN training, sampling methods have been proposed and have achieved significant results. In this paper, we categorize sampling methods according to their sampling mechanisms and provide a comprehensive survey of sampling methods for efficient GCN training. To highlight the characteristics and differences of these methods, we present a detailed comparison within each category and further give an overall comparative analysis across all categories. Finally, we discuss challenges and future research directions for sampling methods.
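To make the cost argument concrete, the sketch below is a minimal, illustrative example of node-wise neighbor sampling (in the style of GraphSAGE), one common sampling mechanism: rather than aggregating over every neighbor of every node at each layer, a fixed-size random subset of neighbors is drawn per mini-batch node, which bounds the per-batch computation and memory. It is not the method of any particular paper in this survey; the function and parameter names (`sample_neighbors`, `aggregate`, `fanout`) are hypothetical.

```python
import numpy as np

def sample_neighbors(adj_list, batch_nodes, fanout, rng):
    """For each mini-batch node, draw at most `fanout` neighbors uniformly
    at random; if a node has `fanout` or fewer neighbors, keep them all."""
    sampled = {}
    for v in batch_nodes:
        neighbors = adj_list[v]
        if len(neighbors) == 0:
            sampled[v] = np.array([v])            # fall back to a self-loop
        elif len(neighbors) <= fanout:
            sampled[v] = np.asarray(neighbors)
        else:
            sampled[v] = rng.choice(neighbors, size=fanout, replace=False)
    return sampled

def aggregate(features, sampled, batch_nodes):
    """Mean-aggregate the sampled neighbors' features for each batch node,
    the core per-layer operation a sampled GCN performs on a mini-batch."""
    return np.stack([features[sampled[v]].mean(axis=0) for v in batch_nodes])

# Toy usage: a 5-node graph with 3-dimensional features and a fanout of 2.
rng = np.random.default_rng(0)
adj_list = {0: [1, 2, 3], 1: [0, 4], 2: [0], 3: [0, 4], 4: [1, 3]}
features = rng.normal(size=(5, 3))
batch = [0, 4]
sampled = sample_neighbors(adj_list, batch, fanout=2, rng=rng)
h_batch = aggregate(features, sampled, batch)     # shape: (len(batch), 3)
print(h_batch.shape)
```

With a fanout of `k` per layer, the receptive field of an `L`-layer model grows as at most `k^L` per batch node instead of covering the full multi-hop neighborhood, which is the source of the efficiency gains discussed throughout the survey.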