Few-Shot Class-Incremental Learning (FSCIL) aims to incrementally learn novel classes from a few labeled samples while simultaneously avoiding overfitting and catastrophic forgetting. The current FSCIL protocol is built by mimicking the general class-incremental learning setting, which is not entirely appropriate because the data configuration differs: all novel classes arrive in the limited-data regime. In this paper, we rethink the configuration of FSCIL under an open-set hypothesis, reserving room in the first session for incoming categories. To endow the model with better performance on both closed-set and open-set recognition, a Hyperbolic Reciprocal Point Learning module (Hyper-RPL) is built upon Reciprocal Point Learning (RPL) with hyperbolic neural networks. Furthermore, to learn novel categories from limited labeled data, we incorporate a hyperbolic metric learning (Hyper-Metric) module into the distillation-based framework, which alleviates the overfitting issue and better handles the trade-off between preserving old knowledge and acquiring new knowledge. Comprehensive evaluations of the proposed configuration and modules on three benchmark datasets validate their effectiveness with respect to three evaluation indicators.
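As a point of reference for the hyperbolic components (Hyper-RPL and Hyper-Metric), the following is a minimal sketch of the standard Poincaré-ball distance that commonly underlies hyperbolic neural networks and hyperbolic metric learning; the exact formulation used in the paper (e.g., a curvature parameter $c$ or Möbius operations) may differ:
\[
d_{\mathbb{D}}(\mathbf{x}, \mathbf{y})
  = \operatorname{arccosh}\!\left(
      1 + 2\,\frac{\lVert \mathbf{x} - \mathbf{y} \rVert^{2}}
                 {\bigl(1 - \lVert \mathbf{x} \rVert^{2}\bigr)\bigl(1 - \lVert \mathbf{y} \rVert^{2}\bigr)}
    \right),
  \qquad \mathbf{x}, \mathbf{y} \in \mathbb{D}^{n} = \{\mathbf{z} \in \mathbb{R}^{n} : \lVert \mathbf{z} \rVert < 1\}.
\]
Distances in the Poincaré ball grow rapidly as points approach the boundary, which is the usual motivation for using hyperbolic embeddings to separate fine-grained or hierarchically related categories.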