Spiking neural networks (SNNs), as brain-inspired, energy-efficient networks, have attracted the interest of researchers, but their training remains an open problem. One effective approach is to map the weights of a trained ANN onto an SNN to obtain strong inference performance. However, the converted SNN often suffers from performance degradation and considerable time delay. To speed up inference and obtain higher accuracy, we theoretically analyze the errors in the conversion process from three perspectives: the difference between the IF and ReLU activations, the time dimension, and the pooling operation. We propose a neuron model that releases burst spikes, a cheap but highly efficient mechanism for transmitting the residual information that would otherwise be lost. In addition, Lateral Inhibition Pooling (LIPooling) is proposed to solve the inaccuracy caused by MaxPooling during conversion. Experimental results on CIFAR and ImageNet demonstrate that our algorithm is efficient and accurate. For example, our method achieves nearly lossless conversion while using only about 1/10 of the simulation time (fewer than 100 timesteps) and 0.693$\times$ the energy consumption of the typical method. Our code is available at https://github.com/Brain-Inspired-Cognitive-Engine/Conversion_Burst.
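The burst mechanism can be summarized as a soft-reset IF neuron that is allowed to fire several spikes within a single timestep, releasing membrane potential that exceeds the threshold by more than one multiple instead of carrying it over as residual information. Below is a minimal PyTorch sketch of this idea; the names `burst_if_step`, `theta`, and `max_burst` are illustrative assumptions, not the repository's actual API, and the exact update rule in the released code may differ.

```python
import torch

def burst_if_step(v, x, theta=1.0, max_burst=5):
    """One timestep of a soft-reset IF neuron that may emit a burst.

    v: membrane potential carried over from the previous timestep
    x: weighted input current at this timestep
    Returns the updated potential and the number of spikes fired
    (0..max_burst), so potential several thresholds above zero is
    released immediately rather than accumulating across timesteps.
    """
    v = v + x
    spikes = torch.clamp(torch.floor(v / theta), min=0, max=max_burst)
    v = v - spikes * theta  # soft reset: subtract, keep the remainder
    return v, spikes
```

With `max_burst=1` this reduces to the standard soft-reset IF neuron used in typical ANN-to-SNN conversion, which is why allowing bursts shortens the simulation time needed to transmit the same activation.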
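LIPooling replaces MaxPooling with lateral inhibition so that the pooled firing rate tracks the maximum input rate rather than the sum of rates in the window. The sketch below is one plausible winner-take-all realization of that idea, assuming non-overlapping 2$\times$2 windows and a running spike count as the inhibition signal; it illustrates the principle rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def li_pool_step(spikes, acc, kernel=2, stride=2):
    """One timestep of spiking 'max' pooling via lateral inhibition.

    spikes: 0/1 spike tensor of shape (N, C, H, W) at this timestep
    acc:    running spike counts with the same shape (init to zeros)
    Only units holding the highest accumulated rate in their window
    pass their spike, so the pooled rate approximates the max input
    rate instead of summing every unit's spikes.
    """
    acc = acc + spikes
    # per-window running maximum, broadcast back to input resolution
    win = F.max_pool2d(acc, kernel, stride)
    mask = (acc == F.interpolate(win, scale_factor=stride)).float()
    out = F.max_pool2d(spikes * mask, kernel, stride)
    return out, acc
```

In contrast, naively applying MaxPooling to spike trains lets every transiently active unit in the window contribute, which overestimates the maximum rate; the inhibition mask above is what suppresses the non-maximal units.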