Most state-of-the-art Graph Neural Networks (GNNs) can be defined as a form of graph convolution, which can be realized by message passing between direct neighbors or beyond. To scale such GNNs to large graphs, various neighbor-, layer-, or subgraph-sampling techniques have been proposed to alleviate the "neighbor explosion" problem by considering only a small subset of the messages passed to the nodes in a mini-batch. However, sampling-based methods are difficult to apply to GNNs that utilize many-hops-away or global context at each layer, show unstable performance across tasks and datasets, and do not speed up model inference. We propose a principled and fundamentally different approach, VQ-GNN, a universal framework to scale up any convolution-based GNN using Vector Quantization (VQ) without compromising performance. In contrast to sampling-based techniques, our approach can effectively preserve all the messages passed to a mini-batch of nodes by learning and updating a small number of quantized reference vectors of global node representations, using VQ within each GNN layer. Our framework avoids the "neighbor explosion" problem by combining these quantized representations with a low-rank version of the graph convolution matrix. We show, both theoretically and experimentally, that such a compact low-rank version of the gigantic convolution matrix is sufficient. Together with VQ, we design a novel approximated message passing algorithm and a nontrivial back-propagation rule for our framework. Experiments on various types of GNN backbones demonstrate the scalability and competitive performance of our framework on large-graph node classification and link prediction benchmarks.
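A minimal illustrative sketch of the core idea is given below, assuming a PyTorch-style layer: messages from out-of-batch neighbors are approximated by their nearest codewords from a small learned codebook, so a mini-batch only needs the in-batch features plus the codebook to preserve all incoming messages. The class and argument names (`VQMessagePassingLayer`, `num_codewords`, `adj_to_codes`) are hypothetical and not taken from the paper's released code.

```python
# Hypothetical sketch of VQ-based message passing (not the authors' implementation).
import torch
import torch.nn as nn


class VQMessagePassingLayer(nn.Module):
    def __init__(self, in_dim, out_dim, num_codewords=256):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        # Small codebook of quantized reference vectors of global node representations.
        self.codebook = nn.Parameter(torch.randn(num_codewords, in_dim))

    def assign(self, h):
        # Nearest-codeword assignment for each node representation.
        dists = torch.cdist(h, self.codebook)  # [N, K] pairwise distances
        return dists.argmin(dim=1)             # [N] codeword indices

    def forward(self, h_batch, adj_in_batch, adj_to_codes):
        # h_batch:      [B, in_dim] features of the mini-batch nodes
        # adj_in_batch: [B, B]      convolution weights among in-batch nodes (exact part)
        # adj_to_codes: [B, K]      convolution weights from in-batch nodes to each
        #                           codeword, aggregated over the out-of-batch
        #                           neighbors assigned to that codeword (low-rank part)
        msg_exact = adj_in_batch @ h_batch         # messages from in-batch neighbors
        msg_approx = adj_to_codes @ self.codebook  # approximated out-of-batch messages
        return torch.relu(self.lin(msg_exact + msg_approx))
```

In the actual framework, the codebook is learned and updated with VQ during training, and gradients through the quantized messages require the dedicated back-propagation rule described in the paper; this sketch omits both for brevity.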