We propose an adaptation of the curriculum training framework, applicable to state-of-the-art meta-learning techniques for few-shot classification. Curriculum-based training commonly attempts to mimic human learning by progressively increasing the training complexity to enable incremental concept learning. Since the meta-learner's goal is to learn how to learn from as few samples as possible, the exact number of those samples (i.e., the size of the support set) arises as a natural proxy for a given task's difficulty. We define a simple yet novel curriculum schedule that begins with a larger support size and progressively reduces it throughout training until it eventually matches the desired shot size of the test setup. The proposed method boosts both learning efficiency and generalization capability. Our experiments with the MAML algorithm on two few-shot image classification tasks show significant gains from the curriculum training framework. Ablation studies corroborate the independence of our proposed method from the model architecture as well as the meta-learning hyperparameters.
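The support-size schedule described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name, the linear decay, and the parameter values (`start_shots`, `target_shots`) are all assumptions made here for concreteness.

```python
def support_size(step, total_steps, start_shots=10, target_shots=5):
    """Hypothetical curriculum schedule for the support-set size.

    Begins at a larger support size (`start_shots`) and decays
    linearly to `target_shots`, so that late meta-training matches
    the desired test-time shot size.
    """
    # Fraction of training completed, clipped to [0, 1].
    frac = min(step / max(total_steps - 1, 1), 1.0)
    # Linear interpolation from start_shots down to target_shots.
    shots = round(start_shots - frac * (start_shots - target_shots))
    # Never go below the target shot size.
    return max(shots, target_shots)
```

At each meta-training iteration, the returned value would determine how many support examples per class are sampled for the episode; the query set and the rest of the MAML inner/outer loop are unchanged.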