In recent years, Graph Neural Networks (GNNs) have shown superior performance on diverse applications over real-world datasets. To improve model capacity and alleviate the over-smoothing problem, several methods have been proposed that incorporate intermediate layers via layer-wise connections. However, because graph types are highly diverse, the performance of existing methods varies across graphs, creating a need for data-specific layer-wise connection methods. To address this problem, we propose LLC (Learn Layer-wise Connections), a novel framework based on neural architecture search (NAS) that learns adaptive connections among the intermediate layers of GNNs. LLC comprises a novel search space consisting of three types of blocks and learnable connections, together with a differentiable search algorithm that enables an efficient search process. Extensive experiments on five real-world datasets show that the searched layer-wise connections not only improve performance but also alleviate the over-smoothing problem.
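To make the idea of learnable layer-wise connections concrete, here is a minimal NumPy sketch of one common continuous relaxation (in the style of differentiable NAS): each layer's input is a softmax-weighted mixture of all earlier layers' outputs, where the mixing logits are free parameters that gradient descent can optimize. The class name `LayerwiseConnectionMixer` and this particular relaxation are illustrative assumptions, not the paper's exact LLC design.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D logit vector
    e = np.exp(x - x.max())
    return e / e.sum()

class LayerwiseConnectionMixer:
    """Hypothetical sketch: continuously relaxed layer-wise connections.

    One logit per candidate incoming connection; softmax turns the
    logits into mixing weights, so the connection pattern is
    differentiable and can be searched by gradient descent.
    """

    def __init__(self, num_prev_layers):
        # start from uniform connections (all logits equal)
        self.logits = np.zeros(num_prev_layers)

    def mix(self, prev_outputs):
        # prev_outputs: list of same-shaped arrays, one per earlier layer
        weights = softmax(self.logits)
        return sum(w * h for w, h in zip(weights, prev_outputs))

mixer = LayerwiseConnectionMixer(3)
h = mixer.mix([np.ones(4), 2 * np.ones(4), 3 * np.ones(4)])
# equal logits -> uniform weights -> elementwise average of the inputs
```

After the search converges, a discrete architecture is typically obtained by keeping only the connections with the largest weights, which is the standard discretization step in differentiable NAS.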