Few-shot learning methods offer pre-training techniques optimized for easier subsequent adaptation of the model to new classes (unseen during training) using only one or a few examples. This adaptivity to unseen classes is especially important for many practical applications where the pre-trained label space cannot remain fixed for effective use and the model needs to be "specialized" to support new categories on the fly. One particularly interesting scenario, essentially overlooked by the few-shot literature, is Coarse-to-Fine Few-Shot (C2FS), where the training classes (e.g., animals) are of much "coarser granularity" than the target (test) classes (e.g., breeds). A very practical example of C2FS is when the target classes are sub-classes of the training classes. Intuitively, this setting is especially challenging because supervised pre-training (both regular and few-shot) tends to learn to ignore intra-class variability, which is essential for separating the sub-classes. In this paper, we introduce a novel "Angular normalization" module that effectively combines supervised and self-supervised contrastive pre-training to approach the proposed C2FS task, demonstrating significant gains in a broad study over multiple baselines and datasets. We hope that this work will help pave the way for future research on this new, challenging, and very practical topic of C2FS classification.
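The abstract does not detail the "Angular normalization" module itself, so the following is only a minimal illustrative sketch of the general idea it alludes to: combining a supervised (coarse-label) loss with a self-supervised contrastive term, under the assumption that "angular" means L2-normalizing both features and classifier weights so that all similarities are cosine quantities. All function names, shapes, temperatures, and the weighting factor are hypothetical and not taken from the paper.

```python
# Illustrative sketch only (assumed formulation, not the paper's actual module):
# supervised cosine-softmax loss on coarse labels + InfoNCE contrastive loss
# between two augmented views of the same images.
import torch
import torch.nn.functional as F


def angular_logits(features, class_weights, temperature=0.1):
    """Cosine-similarity logits between L2-normalized features and class weights."""
    f = F.normalize(features, dim=-1)        # (B, D)
    w = F.normalize(class_weights, dim=-1)   # (C, D)
    return f @ w.t() / temperature           # (B, C)


def info_nce(query, key, temperature=0.2):
    """Self-supervised contrastive loss: each sample's positive is its other view."""
    q = F.normalize(query, dim=-1)           # (B, D)
    k = F.normalize(key, dim=-1)             # (B, D)
    logits = q @ k.t() / temperature         # (B, B); diagonal entries are positives
    targets = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, targets)


def combined_loss(feat_view1, feat_view2, class_weights, coarse_labels, alpha=1.0):
    """Supervised angular loss on coarse labels plus a self-supervised contrastive term."""
    sup = F.cross_entropy(angular_logits(feat_view1, class_weights), coarse_labels)
    ssl = info_nce(feat_view1, feat_view2)
    return sup + alpha * ssl
```

In this reading, the supervised term groups features by coarse class while the contrastive term preserves instance-level (intra-class) variability, which is what the abstract argues is needed to later separate fine-grained sub-classes.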