Few-shot class-incremental learning (FSCIL) is a challenging problem, as only a few training samples are accessible for each novel class in the new sessions. Finetuning the backbone or adjusting the classifier prototypes trained in the prior sessions inevitably misaligns the features and classifier prototypes of old classes, which explains the well-known catastrophic forgetting problem. In this paper, we deal with this misalignment dilemma in FSCIL, inspired by the recently discovered phenomenon named neural collapse, which reveals that the last-layer features of the same class collapse into a single vertex, and the vertices of all classes are aligned with the classifier prototypes, together forming a simplex equiangular tight frame (ETF). This corresponds to an optimal geometric structure for classification because it maximizes the Fisher discriminant ratio. We propose a neural-collapse-inspired framework for FSCIL. A group of classifier prototypes is pre-assigned as a simplex ETF for the whole label space, including the base session and all incremental sessions. During training, the classifier prototypes are fixed rather than learnable, and we adopt a novel loss function that drives the features toward their corresponding prototypes. Theoretical analysis shows that our method preserves the neural collapse optimality and does not break the feature-classifier alignment across incremental sessions. Experiments on the miniImageNet, CUB-200, and CIFAR-100 datasets demonstrate that our framework outperforms state-of-the-art methods. Code: https://github.com/NeuralCollapseApplications/FSCIL
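To make the construction concrete, below is a minimal PyTorch sketch (PyTorch is assumed here, matching the linked repository's ecosystem) of how a simplex-ETF classifier covering the full label space can be generated and held fixed, together with a simple dot-product regression loss that pulls normalized features toward their pre-assigned prototypes. The function names, dimensions, and the exact loss form are illustrative; the paper's objective may differ in detail.

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Columns form a simplex ETF: unit norm, pairwise inner product
    -1/(num_classes - 1), i.e., maximally separated directions.
    Requires feat_dim >= num_classes - 1."""
    assert feat_dim >= num_classes - 1
    # Partial orthogonal matrix U (feat_dim x num_classes) with U^T U = I.
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    # M = sqrt(K / (K - 1)) * U (I_K - (1/K) 1 1^T)
    k = num_classes
    center = torch.eye(k) - torch.ones(k, k) / k
    return (k / (k - 1)) ** 0.5 * u @ center

# Pre-assign prototypes for the WHOLE label space (base + all incremental
# classes) and keep them frozen; only the backbone is trained.
# Sizes below are illustrative.
num_total_classes, feat_dim = 100, 512
prototypes = simplex_etf(num_total_classes, feat_dim)  # not an nn.Parameter

def prototype_alignment_loss(features: torch.Tensor,
                             labels: torch.Tensor) -> torch.Tensor:
    """Drive each L2-normalized feature toward its fixed class prototype by
    regressing their dot product to 1 (a dot-regression-style loss; an
    assumed form, the paper's exact objective may differ)."""
    feats = F.normalize(features, dim=1)   # (B, feat_dim), unit norm
    target = prototypes[:, labels].t()     # (B, feat_dim), fixed targets
    return 0.5 * ((feats * target).sum(dim=1) - 1.0).pow(2).mean()

# Usage with random features standing in for backbone outputs:
feats = torch.randn(8, feat_dim, requires_grad=True)
labels = torch.randint(0, num_total_classes, (8,))
loss = prototype_alignment_loss(feats, labels)
loss.backward()  # gradients reach the backbone only; prototypes stay fixed
```

Because the prototypes are fixed from the start for every class that will ever appear, incremental sessions never move the classifier, which is what prevents the feature-classifier misalignment described above.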