Biological agents do not have infinite resources to learn new things. For this reason, a central aspect of human learning is the ability to recycle previously acquired knowledge in a way that allows for faster, less resource-intensive acquisition of new skills. Despite this, how neural networks in the brain leverage existing knowledge to learn new computations is not well understood. In this work, we study this question in artificial recurrent neural networks (RNNs) trained on a corpus of commonly used neuroscience tasks. Combining brain-inspired inductive biases that we call functional and structural, we propose a system that learns new tasks by building on top of pre-trained latent dynamics organised into separate recurrent modules. These modules, which act as prior knowledge acquired through evolution or development, are pre-trained on the statistics of the full corpus of tasks so as to be independent and maximally informative. The resulting model, which we call a Modular Latent Primitives (MoLaP) network, allows for learning multiple tasks while keeping the number of parameters and parameter updates low. We also show that the skills acquired with our approach are more robust to a broad range of perturbations than those acquired with other multi-task learning strategies, and that generalisation to new tasks is facilitated. This work offers a new perspective on achieving efficient multi-task learning in the brain, illustrating the benefits of leveraging pre-trained latent dynamical primitives.
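To make the idea of building on frozen pre-trained recurrent modules concrete, the sketch below shows one possible reading of such an architecture: several recurrent "primitive" modules are pre-trained and then frozen, and only a small task-specific readout over their latent trajectories is trained for a new task. This is a minimal illustrative sketch, not the paper's exact model; all class names, sizes, and the concatenation-plus-linear-readout scheme are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class LatentPrimitiveModule(nn.Module):
    """One pre-trained recurrent module; its weights stay frozen when a new task is learned.
    (Illustrative stand-in for a latent dynamical primitive, not the authors' exact module.)"""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        h, _ = self.rnn(x)  # latent trajectory produced by this primitive
        return h

class ModularLatentPrimitivesNet(nn.Module):
    """Sketch of a MoLaP-style network: frozen recurrent primitives combined by
    a small trainable readout that is fitted per task."""
    def __init__(self, input_size, hidden_size, n_modules, output_size):
        super().__init__()
        self.primitives = nn.ModuleList(
            LatentPrimitiveModule(input_size, hidden_size) for _ in range(n_modules)
        )
        # Freeze the pre-trained primitives: only the readout receives gradient updates.
        for module in self.primitives:
            for p in module.parameters():
                p.requires_grad = False
        self.readout = nn.Linear(n_modules * hidden_size, output_size)

    def forward(self, x):
        latents = [m(x) for m in self.primitives]   # each: (batch, time, hidden)
        combined = torch.cat(latents, dim=-1)        # pool latent trajectories across modules
        return self.readout(combined)                # task-specific output

# Toy usage: only the readout parameters are trainable, keeping per-task updates small.
net = ModularLatentPrimitivesNet(input_size=4, hidden_size=32, n_modules=3, output_size=2)
trainable = [p for p in net.parameters() if p.requires_grad]
out = net(torch.randn(8, 50, 4))                     # batch of 8 sequences, 50 time steps
print(out.shape, sum(p.numel() for p in trainable))  # torch.Size([8, 50, 2]), small parameter count
```

Under these assumptions, learning a new task amounts to fitting the readout over fixed latent dynamics, which is what keeps the parameter and update counts low in the description above.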