New classes arise frequently in our ever-changing world, e.g., emerging topics in social media and new types of products in e-commerce. A model should recognize new classes while maintaining discriminability over old classes. In challenging scenarios, only a limited number of novel instances are available to incrementally update the model. The task of recognizing few-shot new classes without forgetting old classes is called few-shot class-incremental learning (FSCIL). In this work, we propose a new meta-learning paradigm for FSCIL, LearnIng Multi-phase Incremental Tasks (LIMIT), which synthesizes fake FSCIL tasks from the base dataset. The data format of the fake tasks is consistent with that of the `real' incremental tasks, so meta-learning over them builds a feature space that generalizes to unseen tasks. In addition, LIMIT constructs a transformer-based calibration module, which calibrates the old-class classifiers and new-class prototypes to the same scale and bridges the semantic gap between them. The calibration module also adaptively contextualizes instance-specific embeddings with a set-to-set function. LIMIT efficiently adapts to new classes while resisting forgetting of old classes. Experiments on three benchmark datasets (CIFAR100, miniImageNet, and CUB200) and a large-scale dataset, i.e., ImageNet ILSVRC2012, validate that LIMIT achieves state-of-the-art performance.
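To make the fake-task synthesis concrete, below is a minimal Python sketch of how pseudo-incremental FSCIL tasks could be sampled from the base dataset for meta-learning. All names here (`sample_fake_task`, `base_data`, the way/shot parameters) are hypothetical illustrations under our reading of the abstract, not the authors' released code.

```python
# A minimal sketch, assuming base_data maps each base-class label to a list
# of instances. A fake FSCIL task splits the base classes into a pseudo-base
# part ("old" classes) and an N-way K-shot pseudo-incremental session
# ("new" classes), mimicking the data format of a real incremental task.
import random

def sample_fake_task(base_data, n_pseudo_base=60, n_way=5, k_shot=5):
    classes = list(base_data.keys())
    random.shuffle(classes)

    # Pretend the first chunk of classes are the already-learned "old" classes.
    pseudo_base = classes[:n_pseudo_base]

    # Pretend a few held-out base classes are the unseen "new" classes.
    pseudo_new = random.sample(classes[n_pseudo_base:], n_way)

    # Few-shot support set for the pseudo-new session; a query set for
    # meta-training would be sampled from the remaining instances similarly.
    support = {c: random.sample(base_data[c], k_shot) for c in pseudo_new}
    return pseudo_base, support
```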
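Likewise, the following is a minimal PyTorch sketch of a transformer-based set-to-set calibration in the spirit described above: old-class classifier weights and few-shot new-class prototypes are encoded jointly so that self-attention can place both groups on a comparable scale. The module name, dimensions, and single-layer configuration are assumptions for illustration, not the paper's exact architecture.

```python
# A minimal sketch, assuming d-dimensional classifier weights and prototypes.
import torch
import torch.nn as nn

class SetCalibration(nn.Module):
    def __init__(self, dim=64, n_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, old_classifiers, new_prototypes):
        # Treat the union of old classifiers and new prototypes as one set
        # of shape (1, n_old + n_new, dim); self-attention lets every
        # element adapt to the whole set, calibrating both groups jointly.
        joint = torch.cat([old_classifiers, new_prototypes], dim=0).unsqueeze(0)
        calibrated = self.encoder(joint).squeeze(0)
        n_old = old_classifiers.size(0)
        return calibrated[:n_old], calibrated[n_old:]

# Usage: 60 old-class classifiers plus 5 new-class prototypes, dim 64.
old_w = torch.randn(60, 64)
new_p = torch.randn(5, 64)
w_cal, p_cal = SetCalibration(dim=64)(old_w, new_p)
```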