Exemplar-free class-incremental learning is very challenging due to the negative effect of catastrophic forgetting. A balance between the stability and the plasticity of the incremental process is needed in order to obtain good accuracy for past as well as new classes. Existing exemplar-free class-incremental methods focus either on successive fine-tuning of the model, thus favoring plasticity, or on using a feature extractor fixed after the initial incremental state, thus favoring stability. We introduce a method which combines a fixed feature extractor and a pseudo-feature generator to improve the stability-plasticity balance. The generator uses a simple yet effective geometric translation of new class features to create representations of past classes, made of pseudo-features. The translation of features only requires the storage of the centroid representations of past classes to produce their pseudo-features. Actual features of new classes and pseudo-features of past classes are fed into a linear classifier which is trained incrementally to discriminate between all classes. The incremental process is much faster with the proposed method compared to mainstream ones which update the entire deep model. Experiments are performed on three challenging datasets with different incremental settings. A comparison with ten existing methods shows that our method outperforms the others in most cases.
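The geometric translation at the heart of the generator can be sketched as follows. This is a minimal illustration under the assumption that pseudo-features are obtained by re-centering the feature cloud of a new class onto a stored past-class centroid; all variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features of a new class, as extracted by the frozen backbone
# (16 samples, 8-dimensional features).
new_feats = rng.normal(loc=2.0, scale=0.5, size=(16, 8))

# Stored centroid of a past class -- the only information kept for it --
# and the centroid of the new class, computed from its actual features.
past_centroid = rng.normal(loc=-1.0, scale=0.5, size=8)
new_centroid = new_feats.mean(axis=0)

# Geometric translation: shift each new-class feature so that the whole
# cloud is re-centered on the past class's stored centroid, preserving
# the intra-class geometry of the new class.
pseudo_feats = new_feats - new_centroid + past_centroid

# The pseudo-feature cloud now has the past class's centroid as its mean,
# while keeping the spread of the new class's features.
print(np.allclose(pseudo_feats.mean(axis=0), past_centroid))
```

These pseudo-features, together with the actual features of new classes, would then be used to train the linear classifier incrementally; only one centroid per past class needs to be stored, which explains the low memory footprint and speed of the approach.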