Spiking neural networks (SNNs) are receiving increasing attention due to their low power consumption and strong bio-plausibility, but optimizing SNNs remains challenging. The two main approaches, artificial neural network (ANN)-to-SNN conversion and spike-based backpropagation (BP), each have advantages and limitations. ANN-to-SNN conversion requires a long inference time (many time steps) to approach the accuracy of the source ANN, which diminishes the benefits of SNNs. With spike-based BP, training high-precision SNNs typically consumes dozens of times more computational resources and time than training their ANN counterparts. In this paper, we propose a novel SNN training approach that combines the benefits of the two methods. We first train a single-step SNN, approximating the distribution of the neural membrane potential with random noise, and then losslessly convert the single-step SNN into a multi-step SNN. The introduction of Gaussian-distributed noise yields a significant accuracy gain after conversion. Our results show that the method considerably reduces both the training and inference times of SNNs while maintaining high accuracy. Compared with the two previous methods, ours reduces training time by 65%-75% and achieves more than 100 times faster inference. We also argue that the noise-augmented neuron model is more bio-plausible.
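The core mechanism described above, perturbing a neuron's membrane potential with Gaussian noise before thresholding in a single-step SNN, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation; the function name, threshold, noise scale, and soft-reset rule are all assumptions for illustration.

```python
import numpy as np

def noisy_if_step(v, x, threshold=1.0, noise_std=0.1, rng=None):
    """One update of a single-step integrate-and-fire neuron whose
    membrane potential is perturbed by Gaussian noise before
    thresholding (hypothetical sketch of a noise-augmented neuron).

    v: membrane potential carried over from the previous step
    x: weighted input current for this step
    """
    rng = np.random.default_rng() if rng is None else rng
    # Integrate input and inject Gaussian noise into the potential,
    # smoothing the effective potential distribution seen at threshold.
    v = v + x + rng.normal(0.0, noise_std, size=np.shape(v))
    # Fire a binary spike wherever the noisy potential crosses threshold.
    spikes = (v >= threshold).astype(float)
    # Soft reset by subtraction, so residual potential is preserved.
    v = v - spikes * threshold
    return spikes, v

# Example: one noisy single-step update over a small batch of inputs.
x = np.array([0.2, 0.9, 1.5])
spikes, v = noisy_if_step(np.zeros(3), x, rng=np.random.default_rng(0))
```

During the noise injection, sub-threshold inputs occasionally fire and near-threshold inputs occasionally stay silent, which is what lets the single-step network's spike statistics approximate those of the multi-step network it is later converted to.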