Learning a generalizable feature representation is critical for few-shot image classification. While recent works exploit task-specific feature embeddings learned from meta-tasks for few-shot learning, they remain limited on many challenging tasks because they are distracted by excursive features such as the background, domain, and style of the image samples. In this work, we propose a novel Disentangled Feature Representation framework, dubbed DFR, for few-shot learning applications. DFR adaptively decouples the discriminative features, which are modeled by its classification branch, from the class-irrelevant component captured by its variation branch. In general, most popular deep few-shot learning methods can be plugged in as the classification branch, so DFR can boost their performance on various few-shot tasks. Furthermore, we propose a novel FS-DomainNet dataset, derived from DomainNet, for benchmarking few-shot domain generalization. We conducted extensive experiments to evaluate the proposed DFR on general and fine-grained few-shot classification, as well as few-shot domain generalization, using four corresponding benchmarks, i.e., mini-ImageNet, tiered-ImageNet, CUB, and the proposed FS-DomainNet. Thanks to the effective feature disentangling, the DFR-based few-shot classifiers achieved state-of-the-art results on all four datasets.
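To make the two-branch design concrete, below is a minimal PyTorch sketch of the pluggable structure described above. This is a sketch under our own assumptions, not the authors' implementation: the module names (`DFR`, `cls_branch`, `var_branch`), the toy backbone, and the variation-branch architecture are all hypothetical, and the disentangling losses (e.g., reconstruction and adversarial terms that would keep the variation branch class-irrelevant) are omitted.

```python
import torch
import torch.nn as nn


class DFR(nn.Module):
    """Hypothetical two-branch sketch: the classification branch models
    discriminative features; the variation branch absorbs class-irrelevant
    factors such as background, domain, and style."""

    def __init__(self, classification_branch: nn.Module, feat_dim: int = 64):
        super().__init__()
        # Any few-shot feature extractor can be plugged in here.
        self.cls_branch = classification_branch
        # Lightweight encoder standing in for the variation branch.
        self.var_branch = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x: torch.Tensor):
        z_cls = self.cls_branch(x)  # class-relevant (discriminative) features
        z_var = self.var_branch(x)  # class-irrelevant features
        return z_cls, z_var


# Usage with a toy backbone in place of a real few-shot classifier:
backbone = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
model = DFR(backbone)
z_cls, z_var = model(torch.randn(4, 3, 84, 84))  # batch of 84x84 images
```

In a full training setup, only `z_cls` would feed the few-shot classifier, while auxiliary objectives would push the nuisance factors into `z_var`, which is the sense in which the framework "boosts" an off-the-shelf classification branch.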