Spiking Neural Networks (SNNs) have attracted great attention due to their distinctive properties of low power consumption and fast inference on neuromorphic hardware. As the most effective way to obtain deep SNNs, ANN-SNN conversion has achieved performance comparable to ANNs on large-scale datasets. Nevertheless, it requires long time-steps to match the firing rates of SNNs to the activations of ANNs, and the converted SNN suffers severe performance degradation at short time-steps, which hampers the practical application of SNNs. In this paper, we theoretically analyze the ANN-SNN conversion error and derive the estimated activation function of SNNs. We then propose the quantization clip-floor-shift activation function to replace the ReLU activation function in source ANNs, which better approximates the activation function of SNNs. We prove that the expected conversion error between SNNs and ANNs is zero, enabling us to achieve high-accuracy and ultra-low-latency SNNs. We evaluate our method on the CIFAR-10/100 and ImageNet datasets and show that it outperforms state-of-the-art ANN-SNN conversion and directly trained SNNs in both accuracy and time-steps. To the best of our knowledge, this is the first work to explore high-performance ANN-SNN conversion with ultra-low latency (4 time-steps). Code is available at https://github.com/putshua/SNN\_conversion\_QCFS
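The quantization clip-floor-shift activation described above replaces ReLU in the source ANN with a step function whose discrete levels mirror the firing rates an SNN can express in a fixed number of time-steps. A minimal NumPy sketch is given below; the parameter names (`L` for the number of quantization steps, `lam` for the threshold) are chosen here for illustration and do not necessarily match the released code:

```python
import numpy as np

def qcfs(x, L=4, lam=1.0):
    """Quantization clip-floor-shift activation (sketch).

    x   : pre-activation input (array)
    L   : number of quantization steps (corresponds to SNN time-steps)
    lam : threshold scaling the output range to [0, lam]
    """
    # Quantize with a half-step shift (the "shift" term), floor to the
    # nearest discrete level, then clip the result into [0, 1] before
    # rescaling by the threshold lam.
    return lam * np.clip(np.floor(x * L / lam + 0.5) / L, 0.0, 1.0)
```

With `L = 4` and `lam = 1.0`, an input of 0.3 quantizes to 0.25, negative inputs clip to 0, and inputs above the threshold saturate at `lam`, mimicking the bounded, discrete firing rates of an SNN run for 4 time-steps.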