Current class-incremental learning research mainly focuses on single-label classification tasks, while multi-label class-incremental learning (MLCIL), which arises in more practical application scenarios, is rarely studied. Although many anti-forgetting methods have been proposed to address catastrophic forgetting in class-incremental learning, they struggle with the MLCIL problem due to label absence and information dilution. In this paper, we propose a knowledge restore and transfer (KRT) framework for MLCIL, which includes a dynamic pseudo-label (DPL) module to restore the old class knowledge and an incremental cross-attention (ICA) module to store session-specific knowledge and transfer old class knowledge to the new model sufficiently. In addition, we propose a token loss to jointly optimize the incremental cross-attention module. Experimental results on the MS-COCO and PASCAL VOC datasets demonstrate the effectiveness of our method in improving recognition performance and mitigating forgetting on multi-label class-incremental learning tasks.
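To make the label-absence problem concrete, the following is a minimal sketch of the pseudo-labeling idea behind the DPL module: images in the current session are annotated only for new classes, so a frozen copy of the previous session's model supplies labels for the absent old classes. The function name, the sigmoid multi-label head, and the fixed confidence threshold are illustrative assumptions, not the paper's exact dynamic rule.

```python
import torch

def dpl_targets(old_model, images, new_labels, threshold=0.5):
    """Sketch of dynamic pseudo-labeling for MLCIL (hypothetical API).

    old_model:  frozen model from the previous session, outputs (B, C_old) logits
    images:     current-session batch, (B, 3, H, W)
    new_labels: ground-truth multi-hot labels for the new classes only, (B, C_new)
    threshold:  illustrative confidence cut-off; the paper's DPL uses a dynamic rule
    """
    with torch.no_grad():
        old_probs = torch.sigmoid(old_model(images))   # old-class probabilities, (B, C_old)
    pseudo = (old_probs > threshold).float()           # keep confident old-class predictions
    # Pseudo-labels fill the old-class slots; ground truth fills the new-class slots.
    return torch.cat([pseudo, new_labels], dim=1)      # training targets, (B, C_old + C_new)
```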