Graph neural networks (GNNs) are processing architectures that exploit graph structural information to learn representations from network data. Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data, a problem referred to as over-fitting. This paper proposes the Topology Adaptive Edge Dropping (TADropEdge) method, an adaptive data augmentation technique for improving generalization performance and learning robust GNN models. We start by explicitly analyzing how random edge dropping increases data diversity during training, while showing that i.i.d. edge dropping does not account for graph structural information and can produce noisy augmented data that degrades performance. To overcome this issue, we consider graph connectivity as the key property that captures graph topology. TADropEdge incorporates this factor into random edge dropping so that the edge-dropped subgraphs maintain a topology similar to that of the underlying graph, yielding more satisfactory data augmentation. In particular, TADropEdge first leverages the graph spectrum to assign proper weights to graph edges, representing how critical each edge is for establishing graph connectivity. It then normalizes the edge weights and drops graph edges adaptively based on their normalized weights. Besides improving generalization performance, TADropEdge reduces variance for efficient training and can be applied as a generic module compatible with different GNN models. Extensive experiments on real-life and synthetic datasets corroborate the theory and verify the effectiveness of the proposed method.
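The weighting-normalization-dropping pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not specify the exact spectral weighting, so this sketch uses effective resistance (computed from the pseudoinverse of the graph Laplacian) as an assumed stand-in measure of how critical each edge is for connectivity, and the function names `spectral_edge_weights` and `tad_drop_edges` are hypothetical.

```python
import numpy as np

def spectral_edge_weights(edges, n):
    """Assumed spectral weighting: effective resistance of each edge.
    Bridges score 1.0; edges with many alternative paths score lower."""
    L = np.zeros((n, n))
    for u, v in edges:                 # build the combinatorial Laplacian
        L[u, u] += 1.0
        L[v, v] += 1.0
        L[u, v] -= 1.0
        L[v, u] -= 1.0
    Lp = np.linalg.pinv(L)             # pseudoinverse handles the zero eigenvalue
    w = np.empty(len(edges))
    for i, (u, v) in enumerate(edges):
        x = np.zeros(n)
        x[u], x[v] = 1.0, -1.0
        w[i] = x @ Lp @ x              # effective resistance between u and v
    return w

def tad_drop_edges(edges, n, drop_rate=0.3, seed=0):
    """Drop edges at random, but protect edges critical for connectivity:
    a maximally critical edge (normalized weight 1) is never dropped."""
    rng = np.random.default_rng(seed)
    w = spectral_edge_weights(edges, n)
    w_norm = w / w.max()               # normalize criticality to (0, 1]
    keep_prob = 1.0 - drop_rate * (1.0 - w_norm)
    mask = rng.random(len(edges)) < keep_prob
    return [e for e, keep in zip(edges, mask) if keep]
```

On a triangle with a pendant node, e.g. edges `[(0, 1), (1, 2), (2, 0), (2, 3)]`, the bridge `(2, 3)` has effective resistance 1 and is always kept, while the triangle edges (resistance 2/3) are dropped at close to the nominal rate, so the sampled subgraphs stay connected in the same way as the original graph.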