Meta-learning methods aim to build learning algorithms capable of quickly adapting to new tasks in the low-data regime. One of the main benchmarks for such algorithms is the few-shot learning problem. In this paper we investigate a modification of the standard meta-learning pipeline that takes a multi-task approach during training. The proposed method simultaneously utilizes information from several meta-training tasks in a common loss function. The impact of each of these tasks on the loss function is controlled by a corresponding weight. Proper optimization of these weights can strongly influence the training of the entire model and might improve quality on test-time tasks. In this work we propose and investigate the use of methods from the family of simultaneous perturbation stochastic approximation (SPSA) approaches for optimizing the meta-training task weights. We also compare the proposed algorithms with gradient-based methods and find that stochastic approximation demonstrates the largest quality boost at test time. The proposed multi-task modification can be applied to almost all methods that use the meta-learning pipeline. In this paper we study applications of this modification to the Prototypical Networks and Model-Agnostic Meta-Learning algorithms on the CIFAR-FS, FC100, tieredImageNet, and miniImageNet few-shot learning benchmarks. In these experiments, the multi-task modification demonstrated improvement over the original methods. The proposed SPSA-Tracking algorithm shows the largest accuracy boost and is competitive with state-of-the-art meta-learning methods. Our code is available online.
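To make the weighting scheme concrete, the following is a minimal sketch, not the paper's exact algorithm, of how plain SPSA can update the task weights of a weighted multi-task meta-loss using only two loss evaluations per step. The gain values `a` and `c`, the helper `spsa_step`, and the toy per-task losses are illustrative assumptions; the SPSA-Tracking variant, gain schedules, and any weight normalization used in the actual method are not reproduced here.

```python
import numpy as np

def spsa_step(loss_fn, w, a=0.1, c=0.1):
    """One SPSA update of the task weights w.

    loss_fn: maps a weight vector to the scalar multi-task meta-loss
             (a hypothetical stand-in for the combined meta-training loss).
    a, c:    step-size and perturbation-size gains (illustrative constants).
    """
    # Rademacher perturbation: all coordinates perturbed simultaneously
    delta = np.random.choice([-1.0, 1.0], size=w.shape)
    # Two-sided loss evaluations under the simultaneous perturbation
    loss_plus = loss_fn(w + c * delta)
    loss_minus = loss_fn(w - c * delta)
    # SPSA gradient estimate: one scalar difference shared by all coordinates
    g_hat = (loss_plus - loss_minus) / (2.0 * c * delta)
    return w - a * g_hat

# Toy usage: weights over 4 meta-training tasks, fixed per-task losses
task_losses = np.array([0.9, 0.5, 1.2, 0.7])   # hypothetical per-task meta-losses
combined = lambda w: np.dot(w, task_losses)    # weighted multi-task loss
w = np.full(4, 0.25)                           # initial uniform task weights
w = spsa_step(combined, w)
```

In the meta-learning setting, `loss_fn` would re-evaluate the weighted sum of the sampled meta-training task losses, so each weight update costs two forward passes regardless of the number of tasks, which is the practical appeal of SPSA over per-weight finite differences.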