This paper introduces and studies zero-base generalized few-shot learning (zero-base GFSL), an extreme yet practical variant of the few-shot learning problem. Motivated by cases where base data is unavailable due to privacy or ethical issues, the goal of zero-base GFSL is to incorporate the knowledge of a few samples of novel classes into a pretrained model without any samples of base classes. Our analysis reveals that neither the mean nor the variance of the weight distribution of novel classes is properly established, compared to those of base classes. Existing GFSL methods attempt to balance the weight norms, which we find helps only the variance part, while they overlook the importance of the mean of the weights, particularly for novel classes, leading to limited performance on the GFSL problem even when base data is available. In this paper, we overcome this limitation by proposing a simple yet effective normalization method that controls both the mean and the variance of the weight distribution of novel classes without using any base samples, thereby achieving satisfactory performance on both novel and base classes. Somewhat surprisingly, our experimental results show that the proposed zero-base GFSL method, which does not utilize any base samples, even outperforms existing GFSL methods that make the best use of base data.
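The abstract only names the normalization idea, so the following is a minimal sketch of what matching both the mean and the variance of novel-class classifier weights could look like. It assumes a NumPy setting in which the frozen base-class weights of the pretrained classifier head remain available (only base *samples* are unavailable); the function name `normalize_novel_weights` and the choice of elementwise weight statistics are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def normalize_novel_weights(w_novel: np.ndarray, w_base: np.ndarray) -> np.ndarray:
    """Hypothetical sketch: shift and rescale each novel-class weight vector so
    that its elementwise mean and standard deviation match the average
    statistics of the pretrained base-class weights. No base samples are
    touched; only the weights already stored in the classifier head are used.

    w_novel: (num_novel_classes, dim) weight matrix for novel classes
    w_base:  (num_base_classes, dim) frozen weight matrix for base classes
    """
    # Target statistics computed once from the frozen base-class weights.
    target_mean = w_base.mean()
    target_std = w_base.std(axis=1).mean()

    # Standardize each novel weight vector (zero mean, unit variance per row),
    # then map it onto the base-class target statistics.
    mu = w_novel.mean(axis=1, keepdims=True)
    sigma = w_novel.std(axis=1, keepdims=True) + 1e-8  # avoid division by zero
    return (w_novel - mu) / sigma * target_std + target_mean
```

Under this reading, norm-balancing methods correct only the scale (variance) of novel-class weights, whereas a transform like the one above also aligns their mean, which is the gap the paper's analysis points to.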