Memory is a key component of biological neural systems that enables the retention of information over a wide range of temporal scales, from hundreds of milliseconds up to years. While Hebbian plasticity is believed to play a pivotal role in biological memory, it has so far been analyzed mostly in the context of pattern completion and unsupervised learning. Here, we propose that Hebbian plasticity is fundamental for computation in biological neural systems. We introduce a novel spiking neural network architecture that is enriched by Hebbian synaptic plasticity. We show that this Hebbian enrichment renders spiking neural networks surprisingly versatile in terms of their computational as well as learning capabilities: it improves their abilities for out-of-distribution generalization, one-shot learning, cross-modal generative association, language processing, and reward-based learning. As spiking neural networks are the basis for energy-efficient neuromorphic hardware, this also suggests that powerful cognitive neuromorphic systems can be built on this principle.
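For orientation, the textbook form of a Hebbian update (not necessarily the exact rule used in the proposed architecture) strengthens a synapse in proportion to the correlation of pre- and postsynaptic activity:

\Delta w_{ij} = \eta \, x_i \, y_j

where w_{ij} is the weight from presynaptic neuron i to postsynaptic neuron j, \eta is a learning rate, and x_i and y_j denote the pre- and postsynaptic activities; in a spiking implementation these would typically be low-pass-filtered spike traces.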