The challenge of Class Incremental Learning (CIL) lies in the difficulty of discerning the old classes' data from the new when no previous data is preserved: the representation distributions of different phases overlap with each other. In this paper, to alleviate this representation overlapping for both memory-based and memory-free methods, we propose a new CIL framework, Contrastive Class Concentration for CIL (C4IL). Our framework leverages the class concentration effect of contrastive representation learning, yielding a representation distribution with better intra-class compactness and inter-class separability. Quantitative experiments show that our framework is effective in both the memory-based and memory-free cases: it outperforms the baseline methods of both cases by 5% in terms of average and top-1 accuracy in 10-phase and 20-phase CIL. Qualitative results also demonstrate that our method generates a more compact representation distribution that alleviates the overlapping problem.
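The abstract does not spell out the contrastive objective, so the sketch below is only an illustration of the class concentration effect it refers to, using a standard supervised contrastive loss (Khosla et al., 2020) in PyTorch: same-class embeddings are pulled together (intra-class compactness) while different-class embeddings are pushed apart (inter-class separability). The function name, temperature, and batch setup are our own illustrative choices, not the paper's C4IL loss.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(z, labels, temperature=0.1):
    """z: (N, D) embeddings; labels: (N,) class ids.

    A hedged stand-in for the class-concentration objective: for each
    anchor, same-class samples in the batch act as positives and all
    other samples as negatives.
    """
    z = F.normalize(z, dim=1)                         # cosine-similarity geometry
    sim = z @ z.T / temperature                       # pairwise similarity logits
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))   # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_counts = pos_mask.sum(1).clamp(min=1)
    # Average log-probability over each anchor's positives; zero out
    # non-positive entries first so the -inf diagonal cannot leak in.
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(1) / pos_counts
    return loss[pos_mask.any(1)].mean()               # anchors with >=1 positive

# Usage: a batch mixing data (or stored/generated exemplars) of old and
# new classes; minimizing this loss concentrates each class's
# representations, which is the effect that reduces cross-phase overlap.
z = torch.randn(32, 128)
labels = torch.randint(0, 5, (32,))
print(supervised_contrastive_loss(z, labels))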