Graph few-shot learning is of great importance among various graph learning tasks. In the few-shot scenario, models are required to perform classification given only a limited number of labeled samples. Existing graph few-shot learning methods typically leverage Graph Neural Networks (GNNs) and perform classification across a series of meta-tasks. Nevertheless, these methods generally rely on the original graph (i.e., the graph from which the meta-task is sampled) to learn node representations. Consequently, the graph structure used in each meta-task is identical. Since the class sets differ across meta-tasks, node representations should be learned in a task-specific manner to promote classification performance. Therefore, to adaptively learn node representations across meta-tasks, we propose a novel framework that learns a task-specific structure for each meta-task. To handle the variety of nodes across meta-tasks, we extract relevant nodes and learn task-specific structures based on node influence and mutual information. In this way, node representations are learned with a task-specific structure tailored to each meta-task. We further conduct extensive experiments on five node classification datasets under both single- and multiple-graph settings to validate the superiority of our framework over state-of-the-art baselines. Our code is provided at https://github.com/SongW-SW/GLITTER.
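To make the meta-task setup concrete, the following is a minimal, hypothetical sketch of how an N-way K-shot meta-task (support and query sets) can be sampled from labeled nodes in few-shot node classification. It is not the paper's implementation; the function name sample_meta_task and its parameters are illustrative assumptions.

```python
import random
from collections import defaultdict

def sample_meta_task(labels, n_way=3, k_shot=2, q_query=2, seed=0):
    """Sample one N-way K-shot meta-task from node labels (illustrative sketch).

    labels: dict mapping node id -> class label.
    Returns (support, query), each a list of (node, class) pairs.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for node, c in labels.items():
        by_class[c].append(node)
    # Keep only classes with enough labeled nodes for both support and query.
    eligible = [c for c, nodes in by_class.items() if len(nodes) >= k_shot + q_query]
    classes = rng.sample(eligible, n_way)
    support, query = [], []
    for c in classes:
        nodes = rng.sample(by_class[c], k_shot + q_query)
        support += [(n, c) for n in nodes[:k_shot]]   # K labeled examples per class
        query += [(n, c) for n in nodes[k_shot:]]     # held-out nodes to classify
    return support, query

# Example: 20 nodes over 4 classes; sample a 3-way 2-shot task with 2 queries per class.
labels = {i: i % 4 for i in range(20)}
support, query = sample_meta_task(labels, n_way=3, k_shot=2, q_query=2)
print("support:", support)
print("query:", query)
```

In the framework described above, each such meta-task would additionally induce its own task-specific structure over the extracted relevant nodes, rather than reusing the original graph structure unchanged.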