Neuromorphic systems achieve high energy efficiency by computing with spikes, in a brain-inspired way. However, finding spike-based learning algorithms that can be implemented within the local constraints of neuromorphic systems, while achieving high accuracy, remains a formidable challenge. Equilibrium Propagation is a hardware-friendly counterpart of backpropagation which involves only spatially local computations and applies to recurrent neural networks with static inputs. So far, hardware-oriented studies of Equilibrium Propagation have focused on rate-based networks. In this work, we develop a spiking neural network algorithm called EqSpike, compatible with neuromorphic systems, which learns by Equilibrium Propagation. Through simulations, we obtain a test recognition accuracy of 96.9% on MNIST, similar to rate-based Equilibrium Propagation, which compares favourably with alternative learning techniques for spiking neural networks. We show that EqSpike implemented in silicon neuromorphic technology could reduce the energy consumption of inference by up to three orders of magnitude, and that of training by up to two orders of magnitude, compared to GPUs. Finally, we show that during learning, EqSpike weight updates exhibit a form of Spike Timing Dependent Plasticity, highlighting a possible connection with biology.
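To make the Equilibrium Propagation learning scheme referenced above concrete, the sketch below illustrates its generic two-phase structure on a toy rate-based recurrent network (not the EqSpike spiking algorithm itself): a free phase relaxes the network to a fixed point, a nudged phase weakly clamps units toward a target, and the weight update is the contrastive difference of local pre/post activity products. All sizes, constants, and the choice to nudge every unit are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(s):
    # hard-sigmoid activation, a common choice in Equilibrium Propagation
    return np.clip(s, 0.0, 1.0)

def relax(s, W, x, y=None, beta=0.0, steps=200, dt=0.1):
    """Iterate the network dynamics toward a fixed point.
    When beta > 0, units are weakly nudged toward the target y."""
    for _ in range(steps):
        ds = -s + rho(s) @ W + x          # leak + recurrent drive + static input
        if beta > 0.0:
            ds += beta * (y - s)          # weak clamping (nudged phase)
        s = s + dt * ds
    return s

# toy setup: 4 fully recurrent units with symmetric weights (illustrative)
n = 4
W = rng.normal(scale=0.1, size=(n, n))
W = 0.5 * (W + W.T)
np.fill_diagonal(W, 0.0)                  # symmetric, no self-connections
x = rng.normal(size=n)                    # static input
y = np.array([0.0, 0.0, 1.0, 0.0])        # target used in the nudged phase

beta, lr = 0.5, 0.05
s_free = relax(np.zeros(n), W, x)                  # phase 1: free relaxation
s_nudge = relax(s_free, W, x, y=y, beta=beta)      # phase 2: nudged relaxation

# contrastive, spatially local update: each synapse only needs the
# activities of its own pre- and post-synaptic units in the two phases
dW = (np.outer(rho(s_nudge), rho(s_nudge))
      - np.outer(rho(s_free), rho(s_free))) / beta
W += lr * 0.5 * (dW + dW.T)               # keep weights symmetric
```

Because the update at each synapse depends only on the two activities it already sees, this rule needs no global error transport, which is what makes Equilibrium Propagation attractive for neuromorphic hardware.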