Meta-learning optimizes the hyperparameters of a training procedure, such as its initialization, kernel, or learning rate, based on data sampled from a number of auxiliary tasks. A key underlying assumption is that the auxiliary tasks, known as meta-training tasks, share the same generating distribution as the tasks to be encountered at deployment time, known as meta-test tasks. This may, however, not be the case when the test environment differs from the meta-training conditions. To address shifts in the task-generating distribution between the meta-training and meta-testing phases, this paper introduces weighted free energy minimization (WFEM) for transfer meta-learning. We instantiate the proposed approach for non-parametric Bayesian regression and classification via Gaussian Processes (GPs). The method is validated on a toy sinusoidal regression problem, as well as on classification using the miniImagenet and CUB data sets, through comparison with standard meta-learning of GP priors as implemented by PACOH.
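As context for the sinusoidal regression experiment mentioned above, the following is a minimal sketch of standard GP posterior inference on a toy sinusoid. The kernel form and hyperparameter values here are illustrative placeholders only, not the priors meta-learned by WFEM or PACOH:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Exact GP posterior mean and covariance at x_test.

    Uses the standard Cholesky-based solve; hyperparameters are fixed
    by hand here, whereas meta-learning would adapt them from tasks.
    """
    K = rbf_kernel(x_train, x_train) + noise**2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha              # posterior predictive mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v              # posterior predictive covariance
    return mean, cov

# Toy sinusoidal task: few noisy observations of sin(x).
rng = np.random.default_rng(0)
x_tr = rng.uniform(-3, 3, 8)
y_tr = np.sin(x_tr) + 0.1 * rng.standard_normal(8)
x_te = np.linspace(-3, 3, 50)
mean, cov = gp_posterior(x_tr, y_tr, x_te)
```

In a meta-learning setup, the kernel hyperparameters (and hence the GP prior) would instead be fitted across many such sampled tasks; WFEM additionally reweights the meta-training tasks to account for the shift toward the meta-test task distribution.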