Class-incremental learning has attracted much attention, but most existing works still continually fine-tune the entire representation model, inevitably resulting in severe catastrophic forgetting. Instead of struggling to fight such forgetting through replay or distillation like most existing methods, we adopt a novel pre-train-and-prompt-tune paradigm to sequentially learn new visual concepts on top of a fixed, semantically rich pre-trained representation model. Specifically, we incrementally prompt-tune category prototypes for classification and example prototypes to compensate for semantic drift, a problem caused by learning bias across different phases. Extensive experiments on mainstream incremental learning benchmarks demonstrate that our method outperforms other state-of-the-art methods.
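To make the paradigm concrete, here is a minimal sketch of prototype-based classification over a frozen pre-trained encoder, assuming cosine similarity between features and learnable per-class prototypes. All names (PrototypeHead, add_classes, the toy encoder) are hypothetical illustrations, not the paper's actual architecture, and the example-prototype mechanism for drift compensation is omitted.

```python
# Sketch only: frozen backbone + incrementally grown category prototypes.
# The real method uses a semantic-rich pre-trained model (e.g., a ViT);
# a tiny random encoder stands in here to keep the example self-contained.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeHead(nn.Module):
    """Learnable category prototypes over a fixed feature space."""
    def __init__(self, feature_dim: int):
        super().__init__()
        self.feature_dim = feature_dim
        self.prototypes = nn.ParameterList()  # one prototype per seen class

    def add_classes(self, num_new: int):
        # At each incremental phase, only newly added prototypes train;
        # the backbone stays frozen, so old features do not drift from it.
        for _ in range(num_new):
            self.prototypes.append(nn.Parameter(torch.randn(self.feature_dim)))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        protos = torch.stack(list(self.prototypes))  # (num_classes, dim)
        # Logits are cosine similarities between features and prototypes.
        return F.normalize(features, dim=-1) @ F.normalize(protos, dim=-1).T

# Stand-in for the fixed pre-trained representation model (frozen).
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256))
for p in encoder.parameters():
    p.requires_grad = False

head = PrototypeHead(feature_dim=256)
head.add_classes(num_new=10)           # phase 1: first 10 classes

x = torch.randn(4, 3, 32, 32)          # a dummy image batch
logits = head(encoder(x))              # shape (4, 10) similarity scores
print(logits.shape)
```

Because the encoder is never fine-tuned, adding classes in later phases (further add_classes calls) leaves earlier prototypes and their feature space untouched, which is the intuition behind avoiding catastrophic forgetting in this paradigm.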