We consider the class incremental learning (CIL) problem, in which a learning agent continuously learns new classes from incrementally arriving training-data batches and aims to predict well on all classes learned so far. The main challenge of the problem is catastrophic forgetting, and for exemplar-memory based CIL methods, it is generally known that forgetting is caused by a classification score bias that is injected due to the data imbalance between the new classes and the old classes (in the exemplar memory). While several methods have been proposed to correct such score bias with additional post-processing, e.g., score re-scaling or balanced fine-tuning, no systematic analysis of the root cause of the bias has been done. To that end, we analyze that computing the softmax probabilities by combining the output scores for all old and new classes could be the main cause of the bias. We then propose a new method, dubbed Separated Softmax for Incremental Learning (SS-IL), which consists of a separated softmax (SS) output layer combined with task-wise knowledge distillation (TKD) to resolve such bias. Through extensive experiments on several large-scale CIL benchmark datasets, we show that SS-IL achieves strong state-of-the-art accuracy by attaining much more balanced prediction scores across old and new classes, without any additional post-processing.
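To make the two components named above concrete, the following is a minimal PyTorch sketch of a separated-softmax cross-entropy and a task-wise distillation term. The function names (`separated_softmax_ce`, `task_wise_kd`), the index-tensor interface, and the temperature default are our own illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def separated_softmax_ce(logits, labels, old_classes, new_classes):
    """Cross-entropy with a separated softmax (illustrative sketch).

    New-class samples are normalized only over new-class logits and
    old-class (exemplar) samples only over old-class logits, so the two
    groups never compete for probability mass and no score bias is
    injected by the imbalanced joint softmax.

    logits: (B, C) outputs over all C classes seen so far
    labels: (B,) integer class labels
    old_classes / new_classes: sorted index tensors partitioning [0, C)
    """
    is_new = torch.isin(labels, new_classes)
    loss = logits.new_zeros(())

    if is_new.any():
        # Softmax restricted to the new-class block; remap labels
        # to positions within that block.
        new_logits = logits[is_new][:, new_classes]
        new_targets = torch.searchsorted(new_classes, labels[is_new])
        loss = loss + F.cross_entropy(new_logits, new_targets, reduction="sum")

    if (~is_new).any():
        # Same treatment for exemplar samples over the old-class block.
        old_logits = logits[~is_new][:, old_classes]
        old_targets = torch.searchsorted(old_classes, labels[~is_new])
        loss = loss + F.cross_entropy(old_logits, old_targets, reduction="sum")

    return loss / logits.size(0)

def task_wise_kd(logits, prev_logits, task_splits, T=2.0):
    """Task-wise knowledge distillation (illustrative sketch).

    Distills the previous model's outputs with a separate softmax per
    past task, rather than one softmax over all old classes jointly.

    task_splits: list of index tensors, one per previous task
    """
    loss = logits.new_zeros(())
    for idx in task_splits:
        p_prev = F.softmax(prev_logits[:, idx] / T, dim=1)
        log_p_cur = F.log_softmax(logits[:, idx] / T, dim=1)
        loss = loss + F.kl_div(log_p_cur, p_prev, reduction="batchmean") * (T ** 2)
    return loss / max(len(task_splits), 1)
```

The total training loss under this sketch would be the sum of the two terms; the key design choice both share is that probabilities are never normalized jointly across old- and new-class scores, which is precisely the operation the analysis above identifies as the source of the bias.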