Continual zero-shot learning (CZSL) is a new setting in which a model must sequentially classify objects it has never seen during training. It is better suited than zero-shot or continual learning alone to real-world scenarios where data arrive continually, with only attributes available for some classes and both attributes and visual features for others. Continual learning (CL) suffers from catastrophic forgetting, while zero-shot learning (ZSL) models cannot classify objects as well as state-of-the-art supervised classifiers because no real data (or features) for the unseen classes are available during training. This paper proposes a novel continual zero-shot learning model (DVGR-CZSL) that grows in size with each task and uses generative replay of previously learned classes to avoid forgetting. We demonstrate that our hybrid model (DVGR-CZSL) outperforms the baselines and is effective on several datasets, i.e., CUB, AWA1, AWA2, and aPY. We show that our method is superior in sequential task learning combined with zero-shot learning (ZSL). We also discuss our results on the SUN dataset.
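The generative-replay idea the abstract refers to can be illustrated with a toy sketch: when a new task arrives, a generator trained on earlier tasks synthesizes samples of old classes, which are mixed into the new task's training data so the classifier does not forget them. This is only an illustration of the general mechanism, not the paper's DVGR-CZSL model; the Gaussian generator and nearest-mean classifier below are simplified stand-ins for the actual generative model and classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

class GaussianReplayGenerator:
    """Toy stand-in for a replay generator (e.g. a VAE): stores per-class
    feature statistics and samples synthetic features for seen classes."""
    def __init__(self):
        self.stats = {}  # class label -> (mean, std)

    def fit_task(self, X, y):
        for c in np.unique(y):
            Xc = X[y == c]
            self.stats[c] = (Xc.mean(axis=0), Xc.std(axis=0) + 1e-6)

    def replay(self, n_per_class):
        Xs, ys = [], []
        for c, (mu, sd) in self.stats.items():
            Xs.append(rng.normal(mu, sd, size=(n_per_class, mu.shape[0])))
            ys.append(np.full(n_per_class, c))
        return np.concatenate(Xs), np.concatenate(ys)

class NearestMeanClassifier:
    """Minimal classifier: predicts the class with the closest mean."""
    def fit(self, X, y):
        self.means = {c: X[y == c].mean(axis=0) for c in np.unique(y)}

    def predict(self, X):
        labels = list(self.means)
        M = np.stack([self.means[c] for c in labels])
        d = ((X[:, None, :] - M[None]) ** 2).sum(-1)
        return np.array(labels)[d.argmin(1)]

def make_task(classes, n=50, dim=8):
    # Synthetic features: each class clustered around its label value.
    X = np.concatenate([rng.normal(c, 0.3, size=(n, dim)) for c in classes])
    return X, np.repeat(classes, n)

gen = GaussianReplayGenerator()
clf = NearestMeanClassifier()

# Task 1: real data for classes 0 and 1; update the generator.
X1, y1 = make_task([0, 1])
gen.fit_task(X1, y1)
clf.fit(X1, y1)

# Task 2: real data only for classes 2 and 3; classes 0/1 come from replay,
# so retraining the classifier does not erase them.
X2, y2 = make_task([2, 3])
Xr, yr = gen.replay(50)
clf.fit(np.concatenate([X2, Xr]), np.concatenate([y2, yr]))
gen.fit_task(X2, y2)

# Accuracy on task-1 classes after learning task 2 (forgetting check).
acc_old = (clf.predict(X1) == y1).mean()
```

Without the replayed samples, refitting on task 2 alone would drop classes 0 and 1 entirely; with replay, `acc_old` stays high, which is the forgetting-avoidance behavior the abstract describes.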