We propose a novel class incremental learning approach that incorporates a feature augmentation technique motivated by adversarial attacks. We employ a classifier learned in the past to complement training examples, rather than merely having it serve as a teacher for knowledge distillation towards subsequent models. The proposed approach offers a unique perspective on utilizing previous knowledge in class incremental learning, since it augments features of arbitrary target classes using examples from other classes via adversarial attacks on a previously learned classifier. By allowing cross-class feature augmentation, each class in the old tasks is conveniently populated with samples in the feature space, which alleviates the collapse of decision boundaries caused by sample deficiency in previous tasks, especially when the number of stored exemplars is small. This idea can be easily incorporated into existing class incremental learning algorithms without any architectural modification. Extensive experiments on standard benchmarks show that our method consistently outperforms existing class incremental learning methods by significant margins in various scenarios, especially under an extremely limited memory budget.
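To make the core idea concrete, below is a minimal sketch of cross-class feature augmentation via a gradient-based attack on a frozen classifier from a previous task. It is an illustration under assumptions, not the paper's exact procedure: the function name `adversarial_feature_augmentation`, the PGD-style update, and the hyperparameters `step_size` and `num_steps` are all hypothetical choices for exposition.

```python
import torch
import torch.nn.functional as F


def adversarial_feature_augmentation(features, target_labels, old_classifier,
                                     step_size=0.1, num_steps=10):
    """Sketch: perturb features of existing examples toward arbitrary target
    classes by attacking a frozen classifier learned on previous tasks.

    features      -- (N, D) feature vectors of available examples
    target_labels -- (N,) indices of old classes to populate
    old_classifier-- frozen linear/MLP head from the previous task
    """
    aug = features.clone().detach().requires_grad_(True)
    for _ in range(num_steps):
        logits = old_classifier(aug)
        # Minimize cross-entropy with respect to the *target* class so the
        # perturbed feature drifts into that class's decision region.
        loss = F.cross_entropy(logits, target_labels)
        grad, = torch.autograd.grad(loss, aug)
        aug = (aug - step_size * grad.sign()).detach().requires_grad_(True)
    return aug.detach()


# Hypothetical usage: features of current-task samples become extra samples
# for an old class c, replenishing its region of the feature space.
# feats = feature_extractor(images)
# targets = torch.full((feats.size(0),), c, dtype=torch.long)
# extra_old_class_feats = adversarial_feature_augmentation(feats, targets, old_classifier)
```

Because the perturbation is applied in feature space against the previously learned classifier, the augmented samples can be fed directly into the incremental training loss of an existing method without changing the network architecture.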