Over the last few years, few-shot learning (FSL) has attracted considerable attention as a way to minimize the dependency on labeled training examples. An inherent difficulty in FSL is handling the ambiguities that result from having too few training samples per class. To tackle this fundamental challenge, we aim to train meta-learner models that can leverage prior semantic knowledge about novel classes to guide the classifier synthesis process. In particular, we propose semantically-conditioned feature attention and sample attention mechanisms that estimate the importance of representation dimensions and training instances. We also study the problem of sample noise in FSL, towards the use of meta-learners in more realistic and imperfect settings. Our experimental results demonstrate the effectiveness of the proposed semantic FSL model both with and without sample noise.
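The two attention mechanisms can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the dimensions, the random projection matrices, and the sigmoid/softmax parameterizations are illustrative assumptions. It shows the general idea of a semantic embedding gating feature dimensions (feature attention) and weighting support samples (sample attention) before a class prototype is synthesized.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: semantic embedding of dim 8 (e.g. a word vector),
# visual features of dim 16, and 5 support samples for one novel class.
sem_dim, feat_dim, n_support = 8, 16, 5

# Hypothetical projection matrices; in practice these would be meta-trained.
W_feat = rng.normal(scale=0.1, size=(sem_dim, feat_dim))  # feature attention
W_samp = rng.normal(scale=0.1, size=(sem_dim, feat_dim))  # sample attention

class_semantic = rng.normal(size=sem_dim)                # prior knowledge
support_feats = rng.normal(size=(n_support, feat_dim))   # few labeled samples

# Feature attention: per-dimension importance conditioned on the semantics.
feat_gate = sigmoid(class_semantic @ W_feat)             # shape (feat_dim,)

# Sample attention: per-sample importance via semantic-feature compatibility,
# normalized with a softmax over the support set.
logits = support_feats @ (class_semantic @ W_samp)       # shape (n_support,)
sample_weights = np.exp(logits - logits.max())
sample_weights /= sample_weights.sum()

# Synthesized class prototype: weighted mean of the gated support features.
# Down-weighting low-scoring samples is also a natural handle on sample noise.
prototype = sample_weights @ (support_feats * feat_gate)  # shape (feat_dim,)
```

Under this sketch, a noisy support sample would receive a low `sample_weights` entry and contribute little to the prototype, while uninformative feature dimensions are suppressed by `feat_gate`.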