Notable Graph Neural Network Papers at ICML 2020

August 4, 2020 · 图与推荐

ICML 2020 wrapped up a few days ago, and graph neural networks were again one of the year's hot topics. Here is a round-up of the GNN papers, with links and brief overviews as a reference. ICML does not appear to tag papers by topic, so this list only covers the GNN papers I came across myself; plenty are surely missing, and additions are welcome.

ICML also hosted a graph representation learning workshop this year with quite a few interesting papers; the link is below. Workshop papers are not repeated in the list that follows.

https://grlplus.github.io

This article first appeared in the author's Zhihu column, 图神经网络实战 (Graph Neural Networks in Practice), which collects a variety of GNN-related articles; feel free to follow it.

https://zhuanlan.zhihu.com/graph-neural-networks

GNN Theory:

Generalization and Representational Limits of Graph Neural Networks

We address two fundamental questions about graph neural networks (GNNs).

Fast Learning of Graph Neural Networks with Guaranteed Generalizability: One-hidden-layer Case:

In this paper, we provide a theoretically-grounded generalizability analysis of GNNs with one hidden layer for both regression and binary classification problems.

Constant Curvature Graph Convolutional Networks:

Here, we bridge this gap by proposing mathematically grounded generalizations of graph convolutional networks (GCN) to (products of) constant curvature spaces.

Training Better GNNs:

Robust Graph Representation Learning via Neural Sparsification:

In this paper, we present NeuralSparse, a supervised graph sparsification technique that improves generalization power by learning to remove potentially task-irrelevant edges from input graphs.
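
As a rough illustration of the idea (not the authors' code), the sketch below keeps roughly k neighbors per node using a learned edge scorer relaxed with Gumbel-Softmax, on a dense adjacency matrix for simplicity. The scorer `edge_mlp`, the hidden size, and k are placeholder choices of mine.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnedSparsifier(nn.Module):
    """Toy NeuralSparse-style sparsifier: keep ~k neighbors per node,
    chosen by a learned scorer and relaxed with Gumbel-Softmax."""
    def __init__(self, feat_dim, hidden=64, k=5, tau=0.5):
        super().__init__()
        self.k, self.tau = k, tau
        self.edge_mlp = nn.Sequential(              # scores a (u, v) pair
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, adj):                       # x: [N, F], adj: dense [N, N] in {0, 1}
        n = x.size(0)
        pairs = torch.cat([x.unsqueeze(1).expand(n, n, -1),
                           x.unsqueeze(0).expand(n, n, -1)], dim=-1)
        logits = self.edge_mlp(pairs).squeeze(-1)    # [N, N] edge scores
        logits = logits.masked_fill(adj == 0, -1e9)  # only existing edges are candidates
        # draw k relaxed one-hot samples per node and take the element-wise max,
        # approximating "pick k neighbors without replacement"
        masks = [F.gumbel_softmax(logits, tau=self.tau, hard=False, dim=-1)
                 for _ in range(self.k)]
        keep = torch.stack(masks).max(dim=0).values
        return adj * keep                            # soft-sparsified adjacency for a downstream GNN
```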

Bayesian Graph Neural Networks with Adaptive Connection Sampling:

We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs) that generalizes existing stochastic regularization methods for training GNNs. (An upgraded take on dropout and DropEdge.)
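
For context, the simpler stochastic regularizers that this framework generalizes are easy to state in code. Below is a minimal DropEdge-style helper (random edge masking at training time); it is not the paper's adaptive sampling, which learns the drop rates rather than fixing them.

```python
import torch

def drop_edge(edge_index, p=0.2, training=True):
    """DropEdge-style regularization: randomly remove a fraction p of edges.
    edge_index: [2, E] tensor of (source, target) pairs, as in PyTorch Geometric."""
    if not training or p <= 0:
        return edge_index
    keep = torch.rand(edge_index.size(1), device=edge_index.device) >= p
    return edge_index[:, keep]
```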

Continuous Graph Neural Networks:

We propose continuous graph neural networks (CGNN), which generalise existing graph neural networks with discrete dynamics in that they can be viewed as a specific discretisation scheme.
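
A rough way to picture "continuous dynamics" is to integrate a propagation ODE instead of stacking discrete layers. The Euler sketch below uses one plausible form, dH/dt = (Â − I)·H(t) + H(0); the exact ODE, the normalization of Â, and the step size are my assumptions here, not necessarily the paper's formulation.

```python
import torch

def continuous_propagation(h0, adj_norm, t_end=1.0, steps=20):
    """Toy continuous propagation: Euler-integrate dH/dt = (A_hat - I) H(t) + H(0).
    h0: [N, F] initial node features; adj_norm: [N, N] normalized adjacency."""
    h, dt = h0.clone(), t_end / steps
    for _ in range(steps):
        dh = adj_norm @ h - h + h0   # restart term keeps the initial features in play
        h = h + dt * dh
    return h
```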

Simple and Deep Graph Convolutional Networks:

In this paper, we study the problem of designing and analyzing deep graph convolutional networks.
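
The two tricks studied here, initial residual connections and identity mapping (the GCNII layer), are compact enough to sketch. The layer below follows my reading of the update H(l+1) = σ(((1 − α)·Â·H(l) + α·H(0)) · ((1 − β_l)·I + β_l·W(l))); the values of alpha and beta are placeholders (in the paper, beta typically decays with depth).

```python
import torch
import torch.nn as nn

class GCNIILayer(nn.Module):
    """One GCNII-style layer: initial residual (alpha) + identity mapping (beta)."""
    def __init__(self, dim, alpha=0.1, beta=0.5):
        super().__init__()
        self.alpha, self.beta = alpha, beta
        self.weight = nn.Linear(dim, dim, bias=False)

    def forward(self, h, h0, adj_norm):
        # mix propagated features with the initial representation h0
        support = (1 - self.alpha) * (adj_norm @ h) + self.alpha * h0
        # identity mapping: (1 - beta) * support + beta * (support @ W)
        out = (1 - self.beta) * support + self.beta * self.weight(support)
        return torch.relu(out)
```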

Graph Homomorphism Convolution:

In this paper, we study the graph classification problem from the graph homomorphism perspective.

When Does Self-Supervision Help Graph Convolutional Networks?

In this study, we report the first systematic exploration and assessment of incorporating self-supervision into GCNs.
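
To make "incorporating self-supervision" concrete, one of the simplest setups is a multi-task loss that adds a pretext objective (here, feature completion: mask some node features and reconstruct them) to the supervised loss. This is only an illustrative sketch with hypothetical encoder/head signatures, not necessarily the paper's exact configuration.

```python
import torch
import torch.nn.functional as F

def ssl_multitask_loss(encoder, cls_head, recon_head,
                       x, adj, labels, train_mask, mask_rate=0.15, lam=0.5):
    """Toy multi-task objective: supervised node classification plus a
    feature-completion pretext task (mask node features, then reconstruct them)."""
    masked = torch.rand(x.size(0), device=x.device) < mask_rate
    x_in = x.clone()
    x_in[masked] = 0.0                                    # hide features of some nodes
    h = encoder(x_in, adj)                                # shared GNN encoder (hypothetical signature)
    sup = F.cross_entropy(cls_head(h)[train_mask], labels[train_mask])
    recon = F.mse_loss(recon_head(h)[masked], x[masked])  # self-supervised pretext loss
    return sup + lam * recon
```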

GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation:

This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM).
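
The core mechanism is that the target node of each edge predicts an affine (FiLM) modulation of the message computed from the source node. A minimal single-relation sketch, with made-up layer sizes, might look like this.

```python
import torch
import torch.nn as nn

class FiLMConv(nn.Module):
    """Toy single-relation GNN-FiLM layer: the receiving node produces
    scale/shift (gamma, beta) that modulate incoming messages."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim, bias=False)  # transforms the sender
        self.film = nn.Linear(in_dim, 2 * out_dim)         # gamma, beta from the receiver

    def forward(self, x, edge_index):
        src, dst = edge_index                               # [E] sender / receiver indices
        gamma, beta = self.film(x[dst]).chunk(2, dim=-1)
        m = gamma * self.msg(x[src]) + beta                 # feature-wise modulated message
        out = torch.zeros(x.size(0), m.size(-1), device=x.device)
        out.index_add_(0, dst, m)                           # sum messages at receivers
        return torch.relu(out)
```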

Contrastive Multi-View Representation Learning on Graphs:

We introduce a self-supervised approach for learning node and graph level representations by contrasting structural views of graphs.
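
Loosely in the spirit of this paper, one can contrast node embeddings from one structural view against the graph-level summary of another view with a bilinear discriminator. The two views (e.g., adjacency vs. a diffusion matrix), the encoders, and the use of feature-shuffled graphs as negatives are assumptions of this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiViewContrast(nn.Module):
    """Toy cross-view contrast: node embeddings from one view are scored
    against the graph summary of the other view; shuffled features give negatives."""
    def __init__(self, enc_a, enc_b, dim):
        super().__init__()
        self.enc_a, self.enc_b = enc_a, enc_b      # e.g. GCNs on adjacency / diffusion views
        self.disc = nn.Bilinear(dim, dim, 1)       # agreement score between node and summary

    def forward(self, x, view_a, view_b):
        ha, hb = self.enc_a(x, view_a), self.enc_b(x, view_b)  # [N, dim] per view
        sa, sb = ha.mean(0), hb.mean(0)                         # mean-pooling graph summaries
        x_neg = x[torch.randperm(x.size(0))]                    # corrupted graph: shuffled features
        na, nb = self.enc_a(x_neg, view_a), self.enc_b(x_neg, view_b)
        pos = self.disc(ha, sb.expand_as(ha)) + self.disc(hb, sa.expand_as(hb))
        neg = self.disc(na, sb.expand_as(na)) + self.disc(nb, sa.expand_as(nb))
        labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
        return F.binary_cross_entropy_with_logits(torch.cat([pos, neg]), labels)
```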

Poisson Learning: Graph Based Semi-Supervised Learning At Very Low Label Rates:

We propose a new framework, called Poisson learning, for graph based semi-supervised learning at very low label rates.
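
In place of standard label propagation, Poisson learning solves a graph Poisson equation whose sources are the centered labels placed at the labeled nodes. A small numpy sketch of the fixed-point iteration, as I understand it (connected graph assumed, solver details simplified):

```python
import numpy as np

def poisson_learning(W, labeled_idx, y_onehot, iters=500):
    """Toy Poisson learning: solve L u = b, where L = D - W and b places the
    centered one-hot labels at the labeled nodes, via a Jacobi-style iteration.
    W: [N, N] symmetric weight matrix; y_onehot: [m, C] labels for labeled_idx."""
    n, c = W.shape[0], y_onehot.shape[1]
    d = W.sum(axis=1)
    b = np.zeros((n, c))
    b[labeled_idx] = y_onehot - y_onehot.mean(axis=0)   # sources sum to zero
    u = np.zeros((n, c))
    for _ in range(iters):
        u = u + (b - (d[:, None] * u - W @ u)) / d[:, None]
    return u.argmax(axis=1)                             # predicted class per node
```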

Interferometric Graph Transform: a Deep Unsupervised Graph Representation:

We propose the Interferometric Graph Transform (IGT), which is a new class of deep unsupervised graph convolutional neural network for building graph representations.

Spectral Clustering with Graph Neural Networks for Graph Pooling:

In this paper, we propose a graph clustering approach that addresses these limitations of SC.
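
The pooling idea here is that an MLP predicts a soft cluster assignment for every node, the graph is coarsened through that assignment, and two unsupervised losses keep the clusters close to a relaxed min-cut and roughly orthogonal. A condensed sketch (dense adjacency, hypothetical sizes):

```python
import torch
import torch.nn as nn

class MinCutPool(nn.Module):
    """Toy spectral-clustering pooling: soft assignments S coarsen X and A,
    regularized by a min-cut term and an orthogonality term."""
    def __init__(self, in_dim, n_clusters):
        super().__init__()
        self.assign = nn.Linear(in_dim, n_clusters)

    def forward(self, x, adj):                      # x: [N, F], adj: dense [N, N]
        s = torch.softmax(self.assign(x), dim=-1)   # [N, K] soft cluster memberships
        x_pool = s.t() @ x                          # [K, F] pooled features
        a_pool = s.t() @ adj @ s                    # [K, K] pooled adjacency
        deg = torch.diag(adj.sum(dim=1))
        cut_loss = -torch.trace(s.t() @ adj @ s) / (torch.trace(s.t() @ deg @ s) + 1e-9)
        ss = s.t() @ s
        k = s.size(1)
        ortho_loss = torch.norm(ss / ss.norm() - torch.eye(k, device=x.device) / k ** 0.5)
        return x_pool, a_pool, cut_loss + ortho_loss
```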

Graph Generation:

Scalable Deep Generative Modeling for Sparse Graphs:

Based on this, we develop a novel autoregressive model, named BiGG, that utilizes this sparsity to avoid generating the full adjacency matrix, and importantly reduces the graph generation time complexity to O((n + m) log n). (A very elegant approach.)

A Graph to Graphs Framework for Retrosynthesis Prediction:

In this paper, we propose a novel template-free approach called G2Gs by transforming a target molecular graph into a set of reactant molecular graphs.

Hierarchical Generation of Molecular Graphs using Structural Motifs:

In this paper, we propose a new hierarchical graph encoder-decoder that employs significantly larger and more flexible graph motifs as basic building blocks.

GNNs and Physics:

Learning to Simulate Complex Physics with Graph Networks:

Here we present a general framework for learning simulation, and provide a single model implementation that yields state-of-the-art performance across a variety of challenging physical domains, involving fluids, rigid solids, and deformable materials interacting with one another.
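
At a high level the learned simulator builds a proximity graph over particles, runs message passing ("encode-process-decode") to predict per-particle accelerations, and integrates them forward. The skeleton below only shows that control flow; the radius, network sizes, single processing round, and integration scheme are placeholder choices, not the paper's implementation.

```python
import torch
import torch.nn as nn

class ToySimulatorStep(nn.Module):
    """Skeleton of a learned particle simulator step: proximity graph ->
    message passing -> predicted accelerations -> explicit Euler update."""
    def __init__(self, dim=2, hidden=64, radius=0.1):
        super().__init__()
        self.radius = radius
        self.edge_net = nn.Sequential(nn.Linear(2 * dim, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))
        self.node_net = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                      nn.Linear(hidden, dim))        # outputs acceleration

    def forward(self, pos, vel, dt=0.01):
        dist = torch.cdist(pos, pos)                                  # pairwise distances
        src, dst = (dist < self.radius).nonzero(as_tuple=True)        # proximity graph edges
        msgs = self.edge_net(torch.cat([pos[src] - pos[dst], vel[src]], dim=-1))
        agg = torch.zeros(pos.size(0), msgs.size(-1), device=pos.device)
        agg.index_add_(0, dst, msgs)                                  # sum messages per particle
        acc = self.node_net(agg)
        vel_new = vel + dt * acc                                      # semi-implicit Euler
        return pos + dt * vel_new, vel_new
```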

Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction:

In this work, we develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.

Learning Algebraic Multigrid Using Graph Neural Networks:

Here we propose a framework for learning AMG prolongation operators for linear systems with sparse symmetric positive (semi-) definite matrices.

GraphOpt: Learning Optimization Models of Graph Formation:

In this work, we propose GraphOpt, an end-to-end framework that jointly learns an implicit model of graph structure formation and discovers an underlying optimization mechanism in the form of a latent objective function.

Automated Programming:

Graph-based, Self-Supervised Program Repair from Diagnostic Feedback:

Program repair is challenging for two reasons: First, it requires reasoning and tracking symbols across source code and diagnostic feedback. Second, labeled datasets available for program repair are relatively small. In this work, we propose novel solutions to these two challenges.

Others:

Inductive Relation Prediction by Subgraph Reasoning:

Here, we propose a graph neural network based relation prediction framework, GraIL, that reasons over local subgraph structures and has a strong inductive bias to learn entity-independent relational semantics.

Deep Graph Random Process for Relational-Thinking-Based Speech Recognition:

We present a framework that models a percept as weak relations between a current utterance and its history.

Graph Convolutional Network for Recommendation with Low-pass Collaborative Filters:

To address this gap, we leverage the original graph convolution in GCN and propose a Low-pass Collaborative Filter (LCF) to make it applicable to the large graph.

Few-shot Relation Extraction via Bayesian Meta-learning on Relation Graphs:

We propose a novel Bayesian meta-learning approach to effectively learn the posterior distributions of the prototype vectors of tasks, where the initial prior of the prototype vectors is parameterized with a graph neural network on the global task graph.


