Graph neural networks (GNNs) have been used to tackle the few-shot learning (FSL) problem and have shown great potential under the transductive setting. However, under the inductive setting, existing GNN-based methods are less competitive. This is because they use an instance GNN as a label propagation/classification module, which is jointly meta-learned with a feature embedding network. This design is problematic because the classifier needs to adapt quickly to new tasks while the embedding does not. To overcome this problem, in this paper we propose a novel hybrid GNN (HGNN) model consisting of two GNNs, an instance GNN and a prototype GNN. Instead of performing label propagation, they act as feature embedding adaptation modules that quickly adapt the meta-learned feature embedding to new tasks. Importantly, they are designed to deal with a fundamental yet often neglected challenge in FSL: with only a handful of shots per class, any few-shot classifier is sensitive to badly sampled shots, which are either outliers or cause inter-class distribution overlap. Our two GNNs are designed to address these two types of poorly sampled shots respectively, and their complementarity is exploited in the hybrid GNN model. Extensive experiments show that our HGNN achieves new state-of-the-art results on three FSL benchmarks.
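The following is a minimal sketch, not the authors' implementation, of the architecture outlined above: a meta-learned backbone produces embeddings, an instance GNN adapts the embeddings of all support and query instances to the current episode, a prototype GNN refines the class prototypes, and queries are classified by nearest prototype. The layer design, the similarity-based dense adjacency, and all dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGNNLayer(nn.Module):
    """One message-passing layer over a dense, similarity-based adjacency (assumption)."""

    def __init__(self, dim):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x):
        # Affinity between nodes from normalized embeddings, row-softmaxed into edge weights.
        adj = F.softmax(F.normalize(x, dim=-1) @ F.normalize(x, dim=-1).t(), dim=-1)
        agg = adj @ x                                # neighbourhood aggregation
        return F.relu(self.update(torch.cat([x, agg], dim=-1)))


class HybridGNNSketch(nn.Module):
    """Instance GNN adapts per-instance embeddings; prototype GNN adapts class prototypes."""

    def __init__(self, dim=64):
        super().__init__()
        self.instance_gnn = SimpleGNNLayer(dim)
        self.prototype_gnn = SimpleGNNLayer(dim)

    def forward(self, support, support_labels, query, n_way):
        # support: (n_support, dim), query: (n_query, dim), output of a meta-learned backbone.
        all_feats = torch.cat([support, query], dim=0)
        adapted = self.instance_gnn(all_feats)       # task-adapted instance embeddings
        sup_adapted = adapted[: len(support)]
        qry_adapted = adapted[len(support):]

        # Class prototypes from adapted support features, refined by the prototype GNN.
        protos = torch.stack(
            [sup_adapted[support_labels == c].mean(0) for c in range(n_way)]
        )
        protos = self.prototype_gnn(protos)

        # Nearest-prototype classification of the queries.
        return -torch.cdist(qry_adapted, protos)


if __name__ == "__main__":
    # Toy 5-way 1-shot episode with random features (sizes are assumptions).
    model = HybridGNNSketch(dim=64)
    support, labels, query = torch.randn(5, 64), torch.arange(5), torch.randn(15, 64)
    print(model(support, labels, query, n_way=5).shape)  # torch.Size([15, 5])
```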