Few-shot learning aims to learn a classifier from only a few labelled instances per class. Metric-learning approaches for few-shot learning embed instances into a high-dimensional space and conduct classification based on distances among instance embeddings. However, such instance embeddings are usually shared across all episodes and thus lack the discriminative power to adapt classifiers to episode-specific features. In this paper, we propose a novel approach, the \emph{Episode Adaptive Embedding Network} (EAEN), to learn episode-specific embeddings of instances. By leveraging the probability distributions of all instances in an episode at each channel-pixel embedding dimension, EAEN can not only alleviate the overfitting issue encountered in few-shot learning tasks, but also capture discriminative features specific to an episode. To empirically verify the effectiveness and robustness of EAEN, we conduct extensive experiments on three widely used benchmark datasets, under various combinations of generic embedding backbones and classifiers. The results show that EAEN significantly improves classification accuracy by about $10\%$ to $20\%$ over state-of-the-art methods across different settings.
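For concreteness, the following is a minimal sketch of the general idea behind episode-adaptive embeddings: episode-level statistics (here, the per-dimension mean and variance of the backbone embeddings of all instances in an episode) parameterize a per-dimension scale and shift applied to every instance embedding. The module name \texttt{EpisodeAdaptiveEmbedding}, the two-layer adapter, and its hidden size are illustrative assumptions for this sketch, not the exact EAEN architecture.

\begin{verbatim}
# A minimal sketch (assumed form, not the authors' exact EAEN) of adapting
# instance embeddings with episode-level statistics per embedding dimension.
import torch
import torch.nn as nn


class EpisodeAdaptiveEmbedding(nn.Module):
    def __init__(self, embed_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Small network mapping episode statistics to a per-dimension
        # scale and shift applied to every instance embedding (assumption).
        self.adapter = nn.Sequential(
            nn.Linear(2 * embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2 * embed_dim),
        )

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (num_instances_in_episode, embed_dim), where embed_dim
        # flattens the channel-pixel dimensions of a generic backbone.
        mean = embeddings.mean(dim=0)                 # episode mean per dimension
        var = embeddings.var(dim=0, unbiased=False)   # episode variance per dimension
        stats = torch.cat([mean, var], dim=0)         # episode-level descriptor
        gamma, beta = self.adapter(stats).chunk(2, dim=0)
        # Episode-specific scale and shift of every instance embedding.
        return embeddings * (1.0 + gamma) + beta


# Usage: adapt embeddings of a 5-way 1-shot episode with 15 queries per class,
# assuming 640-dimensional backbone features (e.g. a ResNet-12 output).
if __name__ == "__main__":
    episode = torch.randn(5 * 1 + 5 * 15, 640)
    eaen = EpisodeAdaptiveEmbedding(embed_dim=640)
    adapted = eaen(episode)
    print(adapted.shape)  # torch.Size([80, 640])
\end{verbatim}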