Few-shot Class-Incremental Learning (FSCIL) aims to learn new concepts continually from only a few samples, and is therefore prone to catastrophic forgetting and overfitting. The inaccessibility of old classes and the scarcity of novel samples make it difficult to balance retaining old knowledge against learning novel concepts. Motivated by the observation that different models memorize different knowledge when learning novel concepts, we propose a Memorizing Complementation Network (MCNet) that ensembles multiple models whose memorized knowledge complements one another on novel tasks. In addition, to update the model with only a few novel samples, we develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes novel samples away not only from each other in the current task but also from the old distribution. Extensive experiments on three benchmark datasets, i.e., CIFAR100, miniImageNet and CUB200, demonstrate the superiority of the proposed method.
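To make the PSHT idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a PyTorch setting and a simplified reading in which each novel sample is pulled toward its own class prototype and pushed away from both the hardest other novel sample and the hardest old-class prototype. All function and variable names here are hypothetical.

```python
# Illustrative sketch only (assumption): a triplet-style loss with hard negative
# mining over both current-task samples and old-class prototypes, standing in
# for the paper's PSHT loss.
import torch
import torch.nn.functional as F


def psht_like_loss(feats, labels, old_prototypes, margin=0.5):
    """feats: (N, D) embeddings of novel-task samples
    labels: (N,) class ids of the novel samples
    old_prototypes: (C_old, D) class means of previously learned classes
    """
    feats = F.normalize(feats, dim=1)
    old_prototypes = F.normalize(old_prototypes, dim=1)

    # Per-class means of the current novel classes (a simple stand-in for the
    # paper's "smoothed" prototypes).
    classes = labels.unique()
    protos = F.normalize(
        torch.stack([feats[labels == c].mean(0) for c in classes]), dim=1
    )
    label_to_idx = {int(c): i for i, c in enumerate(classes)}

    losses = []
    for i in range(feats.size(0)):
        anchor = feats[i]
        pos = protos[label_to_idx[int(labels[i])]]   # positive: own class prototype
        d_pos = (anchor - pos).pow(2).sum()

        # Negatives: novel samples from other classes plus old-class prototypes,
        # i.e. the sample is pushed away from the current task and the old distribution.
        diff_cls = labels != labels[i]
        cand = [feats[diff_cls]] if diff_cls.any() else []
        cand.append(old_prototypes)
        negs = torch.cat(cand, dim=0)
        d_neg = (anchor.unsqueeze(0) - negs).pow(2).sum(1).min()  # hardest negative

        losses.append(F.relu(d_pos - d_neg + margin))
    return torch.stack(losses).mean()
```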