The key issue in few-shot learning is learning to generalize. In this paper, we propose a large margin principle to improve the generalization capacity of metric-based methods for few-shot learning. To realize this principle, we develop a unified framework that learns a more discriminative metric space by augmenting the softmax classification loss with a large-margin distance loss during training. Extensive experiments on two state-of-the-art few-shot learning models, graph neural networks and prototypical networks, show that our method substantially improves the performance of existing models with very little computational overhead, demonstrating the effectiveness of the large margin principle and the potential of our method.
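To make the framework concrete, the sketch below shows one plausible form of the combined objective: a prototypical-network-style softmax cross-entropy over negative squared distances to class prototypes, plus a hinge-style large-margin distance term weighted by a coefficient `lam`. The specific hinge formulation, weighting, and function names here are illustrative assumptions, not the paper's exact loss.

```python
import math

def softmax_cross_entropy(logits, label):
    """Standard softmax classification loss (negative log-likelihood)."""
    m = max(logits)  # subtract the max for numerical stability
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[label]

def large_margin_loss(dists, label, margin=1.0):
    """Hinge penalty (an assumed form): the distance to the true class
    prototype should be smaller than the distance to every other
    prototype by at least `margin`."""
    d_pos = dists[label]
    return sum(max(0.0, margin + d_pos - d)
               for i, d in enumerate(dists) if i != label)

def combined_loss(embedding, prototypes, label, lam=0.1, margin=1.0):
    """Softmax loss over negative squared distances to the prototypes,
    augmented with the weighted large-margin distance loss."""
    dists = [sum((p_j - e_j) ** 2 for p_j, e_j in zip(p, embedding))
             for p in prototypes]
    ce = softmax_cross_entropy([-d for d in dists], label)
    return ce + lam * large_margin_loss(dists, label, margin)
```

An embedding that sits close to its own prototype and at least `margin` closer to it than to any other prototype incurs no margin penalty, so the added term only pushes on examples near class boundaries, which is what sharpens the learned metric space.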