Few-shot learning (FSL) aims to classify images under low-data regimes, where the conventional pooled global feature is likely to lose useful local characteristics. Recent works have achieved promising performance by using deep descriptors, but they generally take all deep descriptors from the neural network into consideration while ignoring that some of them are useless for classification due to their limited receptive fields: task-irrelevant descriptors can be misleading, and multiple aggregative descriptors from background clutter can even overwhelm the object's presence. In this paper, we argue that a Mutual Nearest Neighbor (MNN) relation should be established to explicitly select the query descriptors that are most relevant to each task and discard less relevant ones from aggregative clutters in FSL. Specifically, we propose the Discriminative Mutual Nearest Neighbor Neural Network (DMN4) for FSL. Extensive experiments demonstrate that our method outperforms the existing state-of-the-art on both fine-grained and generalized datasets.
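To make the selection rule concrete, the sketch below illustrates a mutual-nearest-neighbor criterion over deep descriptors under cosine similarity. It is a minimal NumPy illustration of the general idea only, not the authors' DMN4 implementation; the function name, the choice of cosine similarity, and the array shapes are assumptions for demonstration.

import numpy as np

def mutual_nearest_neighbor_select(query_desc, support_desc):
    """Keep only the query descriptors that form a mutual nearest-neighbor
    pair with some support descriptor (illustrative sketch, not DMN4 itself).

    query_desc:   (Nq, D) deep descriptors of one query image
    support_desc: (Ns, D) deep descriptors pooled from the task's support set
    Returns the indices of the selected query descriptors.
    """
    # L2-normalize so the inner product equals cosine similarity
    q = query_desc / np.linalg.norm(query_desc, axis=1, keepdims=True)
    s = support_desc / np.linalg.norm(support_desc, axis=1, keepdims=True)

    sim = q @ s.T                       # (Nq, Ns) similarity matrix
    nn_of_query = sim.argmax(axis=1)    # nearest support descriptor for each query descriptor
    nn_of_support = sim.argmax(axis=0)  # nearest query descriptor for each support descriptor

    # Query descriptor i is kept only if it is also the nearest neighbor of
    # its own nearest support descriptor (the mutual condition); descriptors
    # from background clutter rarely satisfy this and are discarded.
    selected = [i for i in range(q.shape[0]) if nn_of_support[nn_of_query[i]] == i]
    return np.array(selected)

# Hypothetical usage with random descriptors (shapes chosen for illustration):
rng = np.random.default_rng(0)
q = rng.standard_normal((441, 64))    # e.g. 21x21 spatial positions, 64-dim descriptors
s = rng.standard_normal((2205, 64))   # descriptors aggregated over a 5-way support set
kept = mutual_nearest_neighbor_select(q, s)

The point of the mutual condition, as opposed to a plain one-directional nearest-neighbor match, is that a query descriptor is retained only when a support descriptor also points back to it, which filters out task-irrelevant or clutter-dominated descriptors before classification.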