Spiking Neural Networks (SNNs) have emerged as a hardware-efficient architecture for classification tasks. The penalty of spike-based encoding has been the lack of a universal training mechanism performed entirely with spikes. There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs): (1) SNNs can be trained by externally computed numerical gradients. (2) A major advancement toward native spike-based learning has been the use of approximate backpropagation using spike-timing-dependent plasticity (STDP) with phased forward/backward passes. However, the transfer of information between such phases requires external memory and computational access, which is a challenge for neuromorphic hardware implementations. In this paper, we propose a stochastic SNN-based backpropagation (SSNN-BP) algorithm that uses a composite neuron to compute the forward-pass activations and backward-pass gradients simultaneously and explicitly with spikes. Although signed gradient values are a challenge for spike-based representation, we tackle this by splitting the gradient signal into positive and negative streams. The composite neuron encodes information as stochastic spike trains and converts backpropagation weight updates into temporally and spatially local, discrete, STDP-like spike-coincidence updates compatible with hardware-friendly Resistive Processing Units (RPUs). Furthermore, our method approaches the BP ANN baseline with sufficiently long spike trains. Finally, we show that the softmax cross-entropy loss function can be implemented through inhibitory lateral connections enforcing a Winner-Take-All (WTA) rule. Our SNN shows excellent generalization, achieving performance comparable to ANNs on the MNIST, Fashion-MNIST and Extended MNIST datasets. Thus, SSNN-BP enables BP compatible with purely spike-based neuromorphic hardware.
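The two ideas at the core of the abstract — representing a signed gradient as a pair of positive/negative stochastic spike trains, and reducing a weight update to counting spike coincidences — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names, the Bernoulli rate coding, and the learning-rate form are illustrative assumptions.

```python
import numpy as np


def encode_signed(value, T, rng):
    """Encode a signed value in [-1, 1] as two stochastic spike trains.

    Illustrative assumption: rate coding, where each of the T time steps
    emits a spike with probability equal to the (rectified) magnitude.
    The positive stream carries max(value, 0), the negative stream
    carries max(-value, 0); at most one stream is active.
    """
    pos = rng.random(T) < max(value, 0.0)
    neg = rng.random(T) < max(-value, 0.0)
    return pos, neg


def decode(pos, neg):
    """Recover the signed value as the difference of firing rates."""
    return pos.mean() - neg.mean()


def coincidence_update(pre, grad_pos, grad_neg, lr=0.1):
    """STDP-like local weight update from spike coincidences.

    Instead of multiplying real-valued activation and gradient, count
    time steps where the presynaptic activity train and each gradient
    stream spike together; the signed update is the difference of the
    two coincidence counts, normalized by the train length.
    """
    T = len(pre)
    return lr * (np.sum(pre & grad_pos) - np.sum(pre & grad_neg)) / T


# Usage: with sufficiently long trains the decoded value converges
# to the encoded one, mirroring the abstract's claim that SSNN-BP
# approaches the real-valued BP baseline as train length grows.
rng = np.random.default_rng(0)
pos, neg = encode_signed(0.3, T=100_000, rng=rng)
print(decode(pos, neg))  # close to 0.3
```

Because `pre`, `grad_pos`, and `grad_neg` are binary vectors, the update needs only ANDs and counters per synapse — exactly the kind of local, discrete operation a resistive crossbar (RPU) can perform in place.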