The main challenge for fine-grained few-shot image classification is to learn feature representations with higher inter-class and lower intra-class variations from only a few labelled samples. Conventional few-shot learning methods, however, cannot be naively adopted for this fine-grained setting -- a quick pilot study reveals that they in fact push for the opposite (i.e., lower inter-class and higher intra-class variations). To alleviate this problem, prior works predominantly use the support set to reconstruct the query image and then utilize metric learning to determine its category. Upon careful inspection, we further reveal that such unidirectional reconstruction methods only help to increase inter-class variations and are not effective in reducing intra-class variations. In this paper, we for the first time introduce a bi-reconstruction mechanism that can simultaneously accommodate inter-class and intra-class variations. In addition to using the support set to reconstruct the query set for increasing inter-class variations, we further use the query set to reconstruct the support set for reducing intra-class variations. This design effectively helps the model explore more subtle and discriminative features, which is key for the fine-grained problem at hand. Furthermore, we also construct a self-reconstruction module to work alongside the bi-directional module to make the features even more discriminative. Experimental results on three widely used fine-grained image classification datasets consistently show considerable improvements compared with other methods. Code is available at: https://github.com/PRIS-CV/Bi-FRN.
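To make the bi-reconstruction idea concrete, the following is a minimal sketch (not the authors' implementation) assuming a closed-form ridge-regression reconstruction head, as used in reconstruction-based few-shot classifiers: support features reconstruct the query features, query features reconstruct the support features, and the two reconstruction errors are summed into a class score. All names, shapes, and the regularization value are illustrative assumptions.

```python
import torch

def reconstruct(target, source, lam=0.1):
    """Reconstruct `target` features from `source` features via closed-form
    ridge regression: W = T S^T (S S^T + lam*I)^{-1}, T_hat = W S.
    target: [m, d] (e.g., flattened query feature-map positions)
    source: [n, d] (e.g., flattened support feature-map positions of one class)
    Returns the reconstruction T_hat with the same shape as `target`.
    """
    n, _ = source.shape
    gram = source @ source.t()                                  # [n, n]
    inv = torch.linalg.inv(gram + lam * torch.eye(n, device=source.device))
    weights = target @ source.t() @ inv                         # [m, n]
    return weights @ source                                     # [m, d]

def bi_reconstruction_distance(query, support, lam=0.1):
    """Symmetric distance used in this sketch: support->query reconstruction
    error plus query->support reconstruction error."""
    q_hat = reconstruct(query, support, lam)    # support reconstructs query
    s_hat = reconstruct(support, query, lam)    # query reconstructs support
    return ((query - q_hat) ** 2).mean() + ((support - s_hat) ** 2).mean()

if __name__ == "__main__":
    # Toy 5-way 5-shot episode: score one query against each class's support pool.
    d = 64                                       # channel dimension (assumed)
    query = torch.randn(49, d)                   # 7x7 feature map, flattened
    support_per_class = [torch.randn(5 * 49, d) for _ in range(5)]
    scores = torch.stack([-bi_reconstruction_distance(query, s)
                          for s in support_per_class])
    print("predicted class:", scores.argmax().item())
```

In this sketch the classification decision is the class whose support pool and the query reconstruct each other with the smallest total error; the paper's self-reconstruction module and learned feature extractor are omitted for brevity.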