Gradient-based meta-learning techniques are both widely applicable and proficient at solving challenging few-shot learning and fast adaptation problems. However, they have practical difficulties when operating on high-dimensional parameter spaces in extreme low-data regimes. We show that it is possible to bypass these limitations by learning a low-dimensional latent generative representation of model parameters and performing gradient-based meta-learning in this space with latent embedding optimization (LEO), effectively decoupling the gradient-based adaptation procedure from the underlying high-dimensional space of model parameters. Our evaluation shows that LEO can achieve state-of-the-art performance on the competitive 5-way 1-shot miniImageNet classification task.
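To make the core idea concrete, the following is a minimal sketch of adaptation in a low-dimensional latent space that is decoded into classifier parameters, rather than the paper's full architecture. The encoder, decoder, dimensions, learning rate, and number of inner steps are all illustrative assumptions, not the published configuration.

```python
# Minimal sketch of a LEO-style inner loop (assumptions: linear encoder/decoder,
# a 5-way 1-shot episode with pre-extracted 64-d features, illustrative hyperparameters).
import torch
import torch.nn.functional as F

feat_dim, n_way, latent_dim = 64, 5, 16

# Hypothetical encoder/decoder: map support features to a latent code z,
# and decode z into per-class classifier weights (n_way * feat_dim parameters).
encoder = torch.nn.Linear(feat_dim, latent_dim)
decoder = torch.nn.Linear(latent_dim, n_way * feat_dim)

# Dummy 1-shot support set: one feature vector per class, labels 0..4.
support_x = torch.randn(n_way, feat_dim)
support_y = torch.arange(n_way)

# Encode the support set into a low-dimensional latent code (mean over examples).
z = encoder(support_x).mean(dim=0)

inner_lr, inner_steps = 1.0, 5
for _ in range(inner_steps):
    # Decode the latent code into classifier weights and compute the support loss.
    w = decoder(z).view(n_way, feat_dim)
    loss = F.cross_entropy(support_x @ w.t(), support_y)
    # Gradient-based adaptation happens in the latent space, not in weight space.
    (grad_z,) = torch.autograd.grad(loss, z, create_graph=True)
    z = z - inner_lr * grad_z

# Task-specific classifier weights obtained after latent adaptation.
adapted_w = decoder(z).view(n_way, feat_dim)
```

Because the adaptation steps are taken on `z` rather than on the classifier weights themselves, the inner loop operates in a space whose dimensionality is independent of the model's parameter count, which is the decoupling described above.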