Neural networks require large amounts of annotated data to learn. Meta-learning algorithms offer a way to reduce the number of training samples to only a few. One of the most prominent optimization-based meta-learning algorithms is Model-Agnostic Meta-Learning (MAML). However, its key procedure, adaptation to new tasks, is quite slow. In this work we propose an improvement to the MAML meta-learning algorithm. We introduce Lambda patterns, which restrict which weights in the network are updated during the adaptation phase. This makes it possible to skip certain gradient computations. The fastest pattern is selected subject to an allowed quality-degradation threshold; in certain cases, careful pattern selection can even improve quality. Our experiments show that Lambda adaptation-pattern selection significantly improves MAML in two respects: adaptation time is reduced by a factor of 3 with minimal accuracy loss, and accuracy for one-step adaptation is substantially improved.
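To make the idea concrete, here is a minimal sketch, not the authors' implementation, of a MAML inner loop in which a boolean per-layer pattern decides which weights participate in adaptation, so autograd skips gradient computation for the frozen layers. The names `adapt`, `lambda_pattern`, and the toy model are illustrative assumptions in a PyTorch setting.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def adapt(model: nn.Sequential, lambda_pattern, support_x, support_y,
          inner_lr=0.01, steps=1):
    """Adapt only the layers selected by lambda_pattern (one bool per layer).

    Illustrative sketch: a real MAML implementation would adapt a copy of
    the meta-learned weights rather than modify the model in place.
    """
    # Freeze layers excluded by the pattern so their gradients
    # are never computed during the adaptation steps.
    for layer, active in zip(model, lambda_pattern):
        for p in layer.parameters():
            p.requires_grad_(active)
    params = [p for p in model.parameters() if p.requires_grad]
    for _ in range(steps):
        loss = F.cross_entropy(model(support_x), support_y)
        grads = torch.autograd.grad(loss, params)
        with torch.no_grad():
            for p, g in zip(params, grads):
                p -= inner_lr * g  # plain SGD inner-loop update
    return model

# Toy few-shot setup: pattern updates only the final classifier layer.
model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 5))
pattern = [False, False, True]
x, y = torch.randn(25, 64), torch.randint(0, 5, (25,))
adapted = adapt(model, pattern, x, y)
```

Under this sketch, the speed/quality trade-off described above amounts to searching over such patterns and keeping the cheapest one whose accuracy drop stays within the allowed threshold.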