Designing spectral convolutional networks is a challenging problem in graph learning. ChebNet, one of the early attempts, approximates the spectral graph convolutions using Chebyshev polynomials. GCN simplifies ChebNet by utilizing only the first two Chebyshev polynomials while still outperforming it on real-world datasets. GPR-GNN and BernNet demonstrate that the Monomial and Bernstein bases also outperform the Chebyshev basis in terms of learning the spectral graph convolutions. Such conclusions are counter-intuitive in the field of approximation theory, where it is established that the Chebyshev polynomial achieves the optimal convergence rate for approximating a function. In this paper, we revisit the problem of approximating the spectral graph convolutions with Chebyshev polynomials. We show that ChebNet's inferior performance is primarily due to illegal coefficients learned by ChebNet when approximating analytic filter functions, which leads to over-fitting. We then propose ChebNetII, a new GNN model based on Chebyshev interpolation, which enhances the original Chebyshev polynomial approximation while reducing the Runge phenomenon. We conduct an extensive experimental study to demonstrate that ChebNetII can learn arbitrary graph convolutions and achieve superior performance in both full- and semi-supervised node classification tasks. Most notably, we scale ChebNetII to the billion-scale graph ogbn-papers100M, showing that spectral-based GNNs can achieve superior performance. Our code is available at https://github.com/ivam-he/ChebNetII.
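The Runge phenomenon mentioned above can be demonstrated with a small numerical sketch (not taken from the paper; the filter function, degree, and node counts below are illustrative assumptions): fitting a polynomial through equispaced samples of Runge's function causes large oscillations near the interval endpoints, whereas interpolating at Chebyshev nodes, the idea underlying Chebyshev interpolation in ChebNetII, keeps the error small.

```python
import numpy as np

def runge(x):
    # Runge's function: the classic example where polynomial
    # interpolation at equispaced nodes diverges near +-1.
    return 1.0 / (1.0 + 25.0 * x ** 2)

def interp_max_error(nodes, func, eval_pts):
    # Fit a degree-(len(nodes)-1) polynomial through (nodes, func(nodes))
    # and return its maximum absolute error on eval_pts.
    coeffs = np.polyfit(nodes, func(nodes), deg=len(nodes) - 1)
    return np.max(np.abs(np.polyval(coeffs, eval_pts) - func(eval_pts)))

K = 12                              # polynomial degree (illustrative choice)
xs = np.linspace(-1.0, 1.0, 1000)   # dense evaluation grid

# Equispaced nodes: the interpolant oscillates wildly near the endpoints.
equi_nodes = np.linspace(-1.0, 1.0, K + 1)
err_equi = interp_max_error(equi_nodes, runge, xs)

# Chebyshev nodes cluster near +-1 and suppress the oscillation.
cheb_nodes = np.cos((2 * np.arange(K + 1) + 1) * np.pi / (2 * (K + 1)))
err_cheb = interp_max_error(cheb_nodes, runge, xs)

print(f"equispaced max error: {err_equi:.3f}")
print(f"Chebyshev  max error: {err_cheb:.3f}")
```

Running this, the equispaced error is orders of magnitude larger than the Chebyshev-node error, which is the approximation-theoretic reason ChebNetII parameterizes the filter by its values at Chebyshev nodes rather than by raw Chebyshev coefficients.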