We study how the connectivity within a recurrent neural network determines, and is determined by, the multistable solutions of network activity. To gain analytic tractability, we let the neural activation be a non-smooth Heaviside step function. This nonlinearity partitions the phase space into regions with distinct, yet linear, dynamics. In each region, either a stable equilibrium state exists or network activity flows out of the region. The stable states are identified by the semipositivity constraints they impose on the synaptic weight matrix. These constraints can be separated according to their effects on the signs or on the strengths of the connections. Exact results on network topology, sign stability, weight matrix factorization, pattern completion, and pattern coupling are derived and proven. Our work may lay the foundation for analyzing multistability in more complex recurrent neural networks.
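To make the region-wise picture concrete, the following is a minimal sketch (not the paper's exact formulation) of a recurrent network with Heaviside activation, assuming dynamics of the form tau * dx/dt = -x + W H(x) - theta. Inside the region where H(x) equals a fixed binary pattern s, the dynamics are linear and relax toward x* = W s - theta, so s is a stable state exactly when x* lies in its own region, i.e. H(W s - theta) = s. The weight matrix W, thresholds theta, and candidate patterns below are illustrative choices, not values from the paper.

```python
import numpy as np

def heaviside(x):
    """Non-smooth step activation: 1 where x > 0, else 0."""
    return (x > 0).astype(float)

def is_stable_state(s, W, theta):
    """Self-consistency check for pattern s: the linear equilibrium of the
    region H(x) = s must itself lie in that region."""
    x_star = W @ s - theta
    return np.array_equal(heaviside(x_star), s)

def simulate(x0, W, theta, dt=0.01, steps=2000):
    """Forward-Euler integration of dx/dt = -x + W @ H(x) - theta (tau = 1)."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-x + W @ heaviside(x) - theta)
    return x

# Illustrative 3-neuron network: two mutually inhibiting cell assemblies.
W = np.array([[ 2.0,  2.0, -3.0],
              [ 2.0,  2.0, -3.0],
              [-3.0, -3.0,  4.0]])
theta = np.array([1.0, 1.0, 1.0])

# Two patterns satisfy the self-consistency condition (multistability);
# the all-active pattern does not.
for s in (np.array([1., 1., 0.]), np.array([0., 0., 1.]), np.array([1., 1., 1.])):
    print(s, "stable:", is_stable_state(s, W, theta))

# Starting near one assembly, activity settles into that assembly's stable state.
x_final = simulate(np.array([0.4, 0.3, -0.2]), W, theta)
print("final pattern:", heaviside(x_final))
```

Under these assumed dynamics, the self-consistency test is the computational counterpart of the semipositivity constraints mentioned above: each stable binary pattern imposes sign conditions on the rows of W relative to the thresholds.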