Few-shot learning (FSL) aims to learn models that generalize to novel classes from limited training samples. Recent works extend FSL to a scenario where unlabeled examples are also available, giving rise to semi-supervised FSL methods. Another line of work additionally cares about performance on the base classes besides the novel ones, establishing the incremental FSL scenario. In this paper, we generalize these two settings into a more realistic yet complex one, termed Semi-Supervised Incremental Few-Shot Learning (S²I-FSL). To tackle this task, we propose a novel paradigm consisting of two parts: (1) a well-designed meta-training algorithm that mitigates the ambiguity between base and novel classes caused by unreliable pseudo labels, and (2) a model adaptation mechanism that learns discriminative features for novel classes while preserving base-class knowledge, using the few labeled samples together with all the unlabeled data. Extensive experiments on standard FSL, semi-supervised FSL, incremental FSL, and the newly constructed S²I-FSL benchmarks demonstrate the effectiveness of our proposed method.