Biological synaptic plasticity exhibits nonlinearities that are not accounted for by classic Hebbian learning rules. Here, we introduce a simple family of generalized nonlinear Hebbian learning rules. We study the computations implemented by their dynamics in the simple setting of a neuron receiving feedforward inputs. These nonlinear Hebbian rules allow a neuron to learn tensor decompositions of its higher-order input correlations. The particular input correlation decomposed and the form of the decomposition depend on the location of nonlinearities in the plasticity rule. For simple, biologically motivated parameters, the neuron learns eigenvectors of higher-order input correlation tensors. We prove that tensor eigenvectors are attractors and determine their basins of attraction. We calculate the volume of those basins, showing that the dominant eigenvector has the largest basin of attraction. We then study arbitrary learning rules and find that any learning rule that admits a finite Taylor expansion in the neural input and output also has stable equilibria at generalized eigenvectors of higher-order input correlation tensors. Nonlinearities in synaptic plasticity thus allow a neuron to encode higher-order input correlations in a simple fashion.