Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) and neuroscience. Prior theoretical work has focused on RNNs with additive interactions. However, gating (i.e., multiplicative) interactions are ubiquitous in real neurons and are also the central feature of the best-performing RNNs in ML. Here, we show that gating offers flexible control of two salient features of the collective dynamics: i) timescales and ii) dimensionality. The gate controlling timescales leads to a novel, marginally stable state, in which the network functions as a flexible integrator. Unlike previous approaches, gating permits this important function without parameter fine-tuning or special symmetries. Gates also provide a flexible, context-dependent mechanism to reset the memory trace, thus complementing the memory function. The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs. At this transition, unlike in additive RNNs, the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity). The rich dynamics are summarized in phase diagrams, thus providing ML practitioners with a map for principled parameter-initialization choices.
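To make the gating mechanism concrete, the following is a minimal sketch (not the paper's exact model) of how a multiplicative update gate can control a network's effective timescale: the gate `z` multiplies the state change elementwise, so `z ≈ 0` freezes a unit (long effective timescale) while `z ≈ 1` recovers the standard additive update. The function and parameter names (`gated_rnn_step`, `J`, `Wz`, gain `g`) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_rnn_step(h, J, Wz):
    """One update of a toy gated RNN (illustrative, not the paper's model).

    z in (0, 1) sets the per-unit effective timescale:
    h_new = (1 - z) * h + z * tanh(J h) is a convex combination of
    the old state and the new additive drive.
    """
    z = sigmoid(Wz @ h)                       # timescale (update) gate
    return (1.0 - z) * h + z * np.tanh(J @ h)

# Toy simulation: N units, random Gaussian couplings scaled by 1/sqrt(N),
# with gain g controlling the distance from the chaotic transition.
rng = np.random.default_rng(0)
N, g = 200, 1.5
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent couplings
Wz = rng.standard_normal((N, N)) / np.sqrt(N)      # gate couplings

h = rng.standard_normal(N)
for _ in range(500):
    h = gated_rnn_step(h, J, Wz)
```

Because each step is a convex combination of the previous state and a `tanh`-bounded drive, the trajectory stays bounded; sweeping the gain `g` or biasing the gate toward 0 or 1 is one way to explore the kinds of timescale and stability regimes the phase diagrams describe.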