A common view in the neuroscience community is that memory is encoded in the connection strengths between neurons. This view has led artificial neural network models to treat connection weights as the key variables modulated during learning. In this paper, we present a prototype weightless spiking neural network that can perform a simple classification task. Memory in this network is stored in the timing between neurons rather than in connection strengths, and the network is trained using a Hebbian spike-timing-dependent plasticity (STDP) rule that modulates connection delays.
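The core idea, learning by adjusting synaptic delays rather than weights, can be sketched minimally as follows. This is an illustrative assumption, not the paper's exact rule: each delay is nudged so the presynaptic spike's arrival time moves toward the postsynaptic spike time, in the spirit of a Hebbian STDP update; the learning rate, bounds, and linear update form are all placeholders.

```python
import numpy as np

def delay_stdp_update(t_pre, t_post, delays, lr=0.1, d_min=0.1, d_max=10.0):
    """Hedged sketch of a delay-learning rule inspired by Hebbian STDP.

    Instead of changing weights, each synaptic delay is nudged so that the
    presynaptic spike (emitted at t_pre, arriving at t_pre + delay) lines up
    with the postsynaptic spike at t_post. The linear update and the bounds
    are illustrative assumptions, not the paper's published rule.
    """
    arrival = t_pre + delays          # when each spike reaches the neuron
    error = t_post - arrival          # > 0 means the spike arrived too early
    new_delays = delays + lr * error  # move the arrival time toward t_post
    return np.clip(new_delays, d_min, d_max)

# Toy example: three synapses converging on one postsynaptic neuron that
# fires at t = 5.0; repeated updates pull every delay toward 5.0.
delays = np.array([1.0, 4.0, 7.0])
for _ in range(50):
    delays = delay_stdp_update(t_pre=np.zeros(3), t_post=5.0, delays=delays)
print(delays)
```

Under this sketch, the stored "memory" is the final delay vector: inputs whose spike times match the learned delays produce coincident arrivals and drive the neuron to fire, which is what enables classification by timing alone.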