Graph Neural Networks have recently become a prevailing paradigm for various high-impact graph learning tasks. Existing efforts can be mainly categorized into spectral-based and spatial-based methods. The major challenge for the former is to find an appropriate graph filter that distills discriminative information from input signals for learning. Recently, attempts such as the Graph Convolutional Network (GCN) have leveraged Chebyshev polynomial truncation to approximate graph filters, bridging these two families of methods. Recent studies have shown that GCN and its variants essentially employ fixed low-pass filters to perform information denoising. Thus their learning capability is rather limited, and they may over-smooth node representations at deeper layers. To tackle these problems, we develop AdaGNN, a novel graph neural network framework with a well-designed adaptive frequency-response filter. At its core, AdaGNN leverages a simple but elegant trainable filter that spans multiple layers to capture the varying importance of different frequency components for node representation learning. The filter also captures the inherent differences among feature channels. As such, it endows AdaGNN with stronger expressiveness and naturally alleviates the over-smoothing problem. We empirically validate the effectiveness of the proposed framework on various benchmark datasets. Theoretical analysis is also provided to show the superiority of the proposed AdaGNN. The implementation of AdaGNN is available at \url{https://github.com/yushundong/AdaGNN}.
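The abstract describes the core idea as a trainable filter that assigns each feature channel its own smoothing strength, instead of GCN's fixed low-pass filter. A minimal sketch of one such adaptive filtering layer is below, assuming an update of the form X ← X − L·X·diag(φ), where L is the symmetric normalized graph Laplacian and φ is a learnable per-channel parameter; the function names and the exact update rule are illustrative, not taken verbatim from the paper.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt

def adaptive_filter_layer(X, L, phi):
    """One adaptive filtering step: feature channel j is smoothed over the
    graph with its own (learnable) strength phi[j]. phi close to 0 leaves
    the channel untouched, so aggressive smoothing is not forced on every
    channel at every layer -- the intuition behind mitigating over-smoothing."""
    return X - L @ X @ np.diag(phi)

# Toy example: a 3-node path graph 0-1-2 with 2 feature channels.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 0.]])
L = normalized_laplacian(A)
phi = np.array([0.5, 0.0])  # channel 0 is smoothed, channel 1 passes through
X_out = adaptive_filter_layer(X, L, phi)
```

In a full model, `phi` would be a trainable parameter per layer (optimized by gradient descent along with the feature transformation weights), which is what lets the learned frequency response differ across channels and layers.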


