Meta learning automates the search for learning algorithms. At the same time, it creates a dependency on human engineering at the meta-level, where meta learning algorithms still need to be designed by hand. In this paper, we investigate self-referential meta learning systems that modify themselves without the need for explicit meta optimization. We discuss the relationship of such systems to in-context and memory-based meta learning and show that self-referential neural networks require functionality to be reused in the form of parameter sharing. Finally, we propose fitness monotonic execution (FME), a simple approach that avoids explicit meta optimization. A neural network self-modifies to solve bandit and classic control tasks, improves its self-modifications, and learns how to learn, purely by assigning more computational resources to better-performing solutions.
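To make the core mechanism of FME concrete, the following is a minimal sketch under simplifying assumptions: a population of self-modifying candidate solutions with scalar fitness, where each compute step is simply handed to the currently best-performing candidate rather than driven by an explicit meta optimizer. The functions `evaluate` and `self_modify` and all hyperparameters are illustrative placeholders, not the paper's implementation.

```python
import random

def evaluate(params):
    # Hypothetical scalar fitness (stands in for bandit / control return);
    # higher is better.
    return -sum((p - 0.5) ** 2 for p in params)

def self_modify(params):
    # A solution rewrites its own parameters; a random perturbation stands
    # in for a learned self-modification here.
    return [p + random.gauss(0.0, 0.1) for p in params]

def fme(pop_size=8, dim=4, steps=200):
    # Population of candidate solutions and their fitness values.
    population = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    fitness = [evaluate(p) for p in population]
    for _ in range(steps):
        # Assign the next compute step to the best solution so far:
        # no meta gradient, only resource allocation by fitness.
        best = max(range(pop_size), key=lambda j: fitness[j])
        child = self_modify(population[best])
        child_fitness = evaluate(child)
        # Replace the worst solution only on improvement, keeping the
        # population's best fitness monotonically non-decreasing.
        worst = min(range(pop_size), key=lambda j: fitness[j])
        if child_fitness > fitness[worst]:
            population[worst], fitness[worst] = child, child_fitness
    return max(fitness)

if __name__ == "__main__":
    print(fme())
```

In this toy setting, improvement comes solely from running and retaining better-performing self-modifying solutions, mirroring the abstract's claim that learning to learn can emerge without a separate meta optimization loop.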