Few-shot classification aims to classify new object categories using only a few labeled examples. Transferring feature representations from other models is a popular approach to this problem. In this work we perform a systematic study of various feature representations for few-shot classification, including representations learned from MAML, supervised classification, and several common self-supervised tasks. We find that learning from more complex tasks tends to give better representations for few-shot classification, and we therefore propose using representations learned from multiple tasks. Coupled with new tricks for feature selection and voting to handle the issue of small sample size, our direct transfer learning method offers performance comparable to the state-of-the-art on several benchmark datasets.