Spiking neural networks (SNNs) in neuromorphic systems are more energy efficient than deep-learning-based methods, but no clearly competitive learning algorithm exists for training such SNNs. Eligibility propagation (e-prop) offers an efficient and biologically plausible way to train competitive recurrent SNNs on low-power neuromorphic hardware. In this report, previously reported e-prop performance on a speech classification task is reproduced, and the effects of including STDP-like behavior are analyzed. Adding STDP to the ALIF neuron model improves classification performance, but this is not the case for the Izhikevich e-prop neuron. Finally, e-prop implemented in a single-layer recurrent SNN is found to consistently outperform a multi-layer variant.