This paper introduces a new few-shot learning pipeline that casts relevance ranking for image retrieval as binary ranking relation classification. In comparison to image classification, ranking relation classification is sample-efficient and domain-agnostic. Moreover, it provides a new perspective on few-shot learning and is complementary to state-of-the-art methods. The core component of our deep neural network is a simple MLP, which takes as input an image triplet encoded as the difference between two vector Kronecker products and outputs a binary relevance ranking order. The proposed RankMLP can be built on top of any state-of-the-art feature extractor, and the entire deep neural network is called the ranking deep neural network, or RankDNN. In addition, RankDNN can be flexibly fused with other post-processing methods. During meta-testing, RankDNN ranks support images according to their similarity to the query samples, and each query sample is assigned the class label of its nearest neighbor. Experiments demonstrate that RankDNN effectively improves the performance of its baselines across a variety of backbones, and that it outperforms previous state-of-the-art algorithms on multiple few-shot learning benchmarks, including miniImageNet, tieredImageNet, Caltech-UCSD Birds, and CIFAR-FS. Furthermore, experiments on the cross-domain challenge demonstrate the superior transferability of RankDNN. The code is available at: https://github.com/guoqianyu-alberta/RankDNN.
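For concreteness, the following is a minimal PyTorch sketch of the triplet encoding described above: the difference between two vector Kronecker products fed to a small MLP that outputs a binary ranking decision. The feature dimension, layer sizes, and names (`RankMLP`, `feat_dim`, `hidden_dim`) are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class RankMLP(nn.Module):
    """Sketch of a ranking MLP over Kronecker-product triplet encodings.

    Layer sizes here are assumed for illustration; the features f_q, f_a,
    f_b are taken from any frozen backbone feature extractor.
    """
    def __init__(self, feat_dim: int = 64, hidden_dim: int = 512):
        super().__init__()
        # The Kronecker product of two feat_dim vectors has
        # dimensionality feat_dim ** 2.
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim ** 2, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),  # single binary ranking logit
        )

    def forward(self, f_q, f_a, f_b):
        # Batched Kronecker product of vectors via outer product + flatten:
        # kron(x, y) equals outer(x, y) reshaped to a vector.
        kron_qa = torch.einsum("bi,bj->bij", f_q, f_a).flatten(1)
        kron_qb = torch.einsum("bi,bj->bij", f_q, f_b).flatten(1)
        # Triplet encoding: difference of the two Kronecker products.
        logit = self.mlp(kron_qa - kron_qb)
        # Positive logit => query q is ranked closer to support a than b.
        return logit.squeeze(-1)

# Usage with dummy 64-d features for a batch of 8 triplets.
f_q, f_a, f_b = (torch.randn(8, 64) for _ in range(3))
ranker = RankMLP(feat_dim=64)
prefers_a = ranker(f_q, f_a, f_b) > 0  # boolean ranking decisions
```

At meta-test time, such pairwise ranking decisions over the support images would induce a similarity ranking per query, whose top-ranked support image supplies the predicted class label via nearest-neighbor assignment.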