The few-shot learning problem focuses on recognizing unseen classes given only a few labeled images. Recent efforts have paid more attention to fine-grained feature embedding while ignoring the relationships among different distance metrics. In this paper, for the first time, we investigate the contributions of different distance metrics and propose an adaptive fusion scheme that brings significant improvements in few-shot classification. We start from a naive baseline of confidence summation and demonstrate the necessity of exploiting the complementary properties of different distance metrics. Having identified the competition problem among them, we build upon this baseline and propose an Adaptive Metrics Module (AMM) that decouples metric fusion into metric-prediction fusion and metric-loss fusion: the former encourages mutual complementarity, while the latter alleviates metric competition via multi-task collaborative learning. Based on the AMM, we design a few-shot classification framework, AMTNet, which combines the AMM with a Global Adaptive Loss (GAL) to jointly optimize the few-shot task and an auxiliary self-supervised task, making the embedded features more robust. In experiments, the proposed AMM achieves 2% higher accuracy than the naive metric fusion module, and AMTNet outperforms state-of-the-art methods on multiple benchmark datasets.
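The confidence-summation baseline mentioned above can be illustrated with a minimal sketch. This is an assumption-laden toy implementation (prototype-based classification with hypothetical Euclidean and cosine metrics, fused by summing softmax confidences), not the authors' actual code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over logits.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def euclidean_logits(queries, prototypes):
    # Negative squared Euclidean distance to each class prototype as logits.
    d = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return -d

def cosine_logits(queries, prototypes, scale=10.0):
    # Scaled cosine similarity to each class prototype as logits.
    # The scale factor is a common hyperparameter, chosen here arbitrarily.
    q = queries / np.linalg.norm(queries, axis=-1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=-1, keepdims=True)
    return scale * (q @ p.T)

def fused_predictions(queries, prototypes):
    # Naive baseline: sum the per-metric softmax confidences, then argmax.
    conf = (softmax(euclidean_logits(queries, prototypes))
            + softmax(cosine_logits(queries, prototypes)))
    return conf.argmax(axis=1)
```

Because each metric contributes its own probability distribution, summation lets a metric that is confident on a given query dominate the fused decision; the competition problem arises when the metrics are trained jointly and pull the shared embedding in conflicting directions, which is what AMM's decoupled fusion is designed to alleviate.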