Spiking neural networks (SNNs) compute and communicate through discrete binary events. On emerging neuromorphic hardware, they are considered more biologically plausible and more energy-efficient than artificial neural networks (ANNs). However, because spikes are discontinuous and non-differentiable, training SNNs directly is challenging. Recent work has made substantial progress toward high accuracy by converting trained ANNs to SNNs. Owing to the difference in information processing, however, converted deep SNNs usually suffer severe performance loss and large time delays. In this paper, we analyze the causes of the performance loss and propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by phase lead and phase lag. In addition, when ResNet-style ANNs are converted, the information arriving at output neurons is incomplete because signals propagate faster along the shortcut path; we design synchronous neurons (SN) to remedy this efficiently and improve performance. Experimental results show that the proposed method needs only 1/4 to 1/10 of the time steps of previous work to achieve nearly lossless conversion. We demonstrate state-of-the-art ANN-SNN conversion for VGG16, ResNet20, and ResNet34 on challenging datasets, including CIFAR-10 (95.16% top-1), CIFAR-100 (78.12% top-1), and ImageNet (72.64% top-1).
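For context, the sketch below illustrates the standard rate-coding principle that ANN-SNN conversion relies on: over enough time steps, the firing rate of an integrate-and-fire (IF) neuron with soft reset approximates the ReLU activation of the corresponding ANN neuron. This is a minimal illustrative example, not the paper's BSNN or SN model; the function name `if_neuron_rate` and its parameters are hypothetical.

```python
def if_neuron_rate(weighted_input, T=256, v_thresh=1.0):
    """Illustrative IF neuron with soft reset (not the paper's BSNN).

    Over T time steps, the firing rate approximates the ReLU
    activation max(0, weighted_input) of the source ANN neuron.
    """
    v = 0.0            # membrane potential
    spikes = 0
    for _ in range(T):
        v += weighted_input      # integrate a constant input current
        if v >= v_thresh:
            spikes += 1
            v -= v_thresh        # soft reset: subtract the threshold
    return spikes / T            # firing rate ~ max(0, weighted_input)

# The approximation error shrinks as T grows; shortening T (as the
# paper aims to, using 1/4-1/10 of the time steps of prior work)
# makes residual charge left in v after the last step matter more.
print(if_neuron_rate(0.37))    # ~0.37
print(if_neuron_rate(-0.2))    # 0.0, like ReLU
```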