Rapidly learning from ongoing experiences and remembering past events with a flexible memory system are two core capacities of biological intelligence. While the underlying neural mechanisms are not fully understood, multiple lines of evidence support a critical role for synaptic plasticity in memory formation and fast learning. Inspired by these results, we equip Recurrent Neural Networks (RNNs) with plasticity rules that enable them to adapt their parameters according to ongoing experiences. In addition to traditional local Hebbian plasticity, we propose a global, gradient-based plasticity rule that allows the model to evolve towards a self-determined target. Our models show promising results on sequential and associative memory tasks, demonstrating their ability to robustly form and retain memories. At the same time, these models can cope with many challenging few-shot learning problems. Comparing different plasticity rules under the same framework shows that Hebbian plasticity is well-suited for several memory and associative learning tasks, but it is outperformed by gradient-based plasticity on few-shot regression tasks that require the model to infer the underlying mapping. Code is available at https://github.com/yuvenduan/PlasticRNNs.
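To make the idea of plastic recurrent weights concrete, the sketch below shows an RNN cell with a local Hebbian plasticity rule in the general style of differentiable plasticity: the recurrent weights combine a slow, gradient-trained component with a fast Hebbian trace that is updated online from pre- and post-synaptic activity. The class name `HebbianPlasticRNNCell`, the decay rate `eta`, and the per-synapse gate `alpha` are illustrative assumptions for this sketch, not the exact formulation used in the paper or in the released code.

```python
import torch
import torch.nn as nn


class HebbianPlasticRNNCell(nn.Module):
    """Minimal sketch of an RNN cell with a local Hebbian plasticity rule.

    The effective recurrent weights are the sum of a slow, gradient-trained
    matrix `W_rec` and a fast Hebbian trace `hebb` that changes within a
    sequence, gated per synapse by `alpha`. This is a generic illustration,
    not the paper's exact rule.
    """

    def __init__(self, input_size: int, hidden_size: int, eta: float = 0.1):
        super().__init__()
        self.hidden_size = hidden_size
        self.eta = eta  # plasticity (fast-learning) rate, assumed fixed here
        self.W_in = nn.Linear(input_size, hidden_size)
        self.W_rec = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        self.alpha = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))

    def forward(self, x, h, hebb):
        # Effective recurrent weights: slow weights plus gated fast Hebbian trace.
        W_eff = self.W_rec + self.alpha * hebb              # (batch, hidden, hidden)
        rec = torch.einsum("bij,bj->bi", W_eff, h)          # recurrent drive
        h_new = torch.tanh(self.W_in(x) + rec)
        # Local Hebbian update: decaying trace of post- x pre-synaptic activity.
        hebb = (1 - self.eta) * hebb + self.eta * torch.einsum("bi,bj->bij", h_new, h)
        return h_new, hebb


if __name__ == "__main__":
    # Toy usage: the Hebbian trace (fast weights) evolves within a single sequence,
    # while W_in, W_rec, and alpha would be trained by gradient descent across sequences.
    cell = HebbianPlasticRNNCell(input_size=16, hidden_size=64)
    x_seq = torch.randn(10, 8, 16)                          # (time, batch, input)
    h = torch.zeros(8, 64)
    hebb = torch.zeros(8, 64, 64)
    for x_t in x_seq:
        h, hebb = cell(x_t, h, hebb)
```

The proposed gradient-based rule replaces the Hebbian outer-product update with a global, gradient-driven update towards a target that the model determines itself; the sketch above only illustrates the local Hebbian case.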