Few-shot image classification aims to classify unseen classes with limited labeled samples. Recent works benefit from the meta-learning process with episodic tasks and can quickly adapt from training classes to novel classes at test time. Due to the limited number of samples for each task, the initial embedding network for meta-learning is an essential component and largely affects performance in practice. To this end, many pre-training methods have been proposed, but most of them are trained in a supervised way and thus have limited transfer ability to unseen classes. In this paper, we propose to train a more generalized embedding network with self-supervised learning (SSL), which learns from the data itself and provides slow and robust representations for downstream tasks. We evaluate our work through extensive comparisons with previous baseline methods on two few-shot classification datasets ({\em i.e.,} MiniImageNet and CUB). Based on the evaluation results, the proposed method achieves significantly better performance, improving 1-shot and 5-shot accuracy by nearly \textbf{3\%} and \textbf{4\%} on MiniImageNet, and by nearly \textbf{9\%} and \textbf{3\%} on CUB. Moreover, the proposed method gains further improvements of (\textbf{15\%}, \textbf{13\%}) on MiniImageNet and (\textbf{15\%}, \textbf{8\%}) on CUB by pre-training with more unlabeled data. Our code will be available at \url{https://github.com/phecy/SSL-FEW-SHOT}.