Knowledge distillation (KD) transfers knowledge from a teacher network to a student by forcing the student to mimic the outputs of the pretrained teacher on the training data. However, the training samples are often inaccessible due to large data sizes, privacy, or confidentiality. Many efforts have addressed this problem for convolutional neural networks (CNNs), whose inputs lie on a grid in a continuous space (e.g., images and videos), but they largely overlook graph neural networks (GNNs), which handle non-grid data with varying topology structures in a discrete space. These inherent differences between their inputs make the CNN-based approaches inapplicable to GNNs. In this paper, we propose, to the best of our knowledge, the first dedicated approach to distilling knowledge from a GNN without graph data. The proposed graph-free KD (GFKD) learns graph topology structures for knowledge transfer by modeling them with a multivariate Bernoulli distribution. We then introduce a gradient estimator to optimize this framework. Essentially, the gradients w.r.t. graph structures are obtained using only GNN forward propagation, without back-propagation, which makes GFKD compatible with modern GNN libraries such as DGL and Geometric. Moreover, we provide strategies for handling different types of prior knowledge in the graph data or the GNNs. Extensive experiments demonstrate that GFKD achieves state-of-the-art performance for distilling knowledge from GNNs without training data.
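To illustrate the core idea, the following is a minimal, hypothetical sketch of learning a graph structure through a Bernoulli parameterization with a forward-only gradient estimate. It is not GFKD's actual estimator: `teacher_loss` is a stand-in for a frozen teacher GNN's forward pass, and the update uses a generic score-function (REINFORCE-style) gradient, which likewise needs only forward evaluations of the network.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher_loss(adj, feats):
    # Stand-in for a frozen teacher GNN forward pass (hypothetical):
    # one round of mean-neighbor aggregation, reduced to a scalar loss.
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    h = (adj @ feats) / deg
    return float(np.abs(h).mean())

n = 6                              # nodes in the synthetic graph
theta = np.full((n, n), 0.5)       # Bernoulli edge probabilities to learn
feats = rng.normal(size=(n, 4))    # fixed node features for the sketch
lr = 0.1

for step in range(50):
    # Sample a discrete adjacency matrix A ~ Bernoulli(theta).
    adj = (rng.random((n, n)) < theta).astype(float)
    loss = teacher_loss(adj, feats)            # forward pass only
    # Score-function gradient of E[loss] w.r.t. theta:
    # loss * d/dtheta log p(A | theta) = loss * (A - theta) / (theta * (1 - theta))
    score = (adj - theta) / (theta * (1 - theta) + 1e-8)
    theta = np.clip(theta - lr * loss * score, 0.01, 0.99)
```

Because the gradient signal comes only from sampled forward evaluations, nothing here requires back-propagating through the discrete adjacency matrix, which is the property that makes this style of estimator compatible with standard GNN libraries.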