Spiking neural networks (SNNs) have attracted great attention due to their high biological plausibility and low energy consumption on neuromorphic hardware. As an efficient way to obtain deep SNNs, the ANN-to-SNN conversion method has exhibited high performance on various large-scale datasets. However, it typically suffers from severe performance degradation and long inference delays. In particular, most previous work focuses on simple classification tasks while ignoring the precise approximation of the ANN output. In this paper, we first analyze the conversion errors theoretically and derive the harmful effects of time-varying extremes on synaptic currents. We propose Spike Calibration (SpiCalib) to eliminate the damage that discrete spikes cause to the output distribution, and modify LIPooling so that arbitrary MaxPooling layers can be converted losslessly. Moreover, we propose Bayesian optimization of the normalization parameters to avoid empirical settings. The experimental results demonstrate state-of-the-art performance on classification, object detection, and segmentation tasks. To the best of our knowledge, this is the first time an SNN comparable to its ANN counterpart has been obtained on all of these tasks simultaneously. Moreover, we need only 1/50 of the inference time of previous work on the detection task and achieve the same performance at 0.492$\times$ the energy consumption of the ANN on the segmentation task.
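To make the normalization-parameter search concrete, the following is a minimal sketch (not the authors' code) of tuning the activation percentile used for layer-wise normalization with a Bayesian-style optimizer, here Optuna's TPE sampler as a stand-in. The helper `convert_and_evaluate` and its toy objective are hypothetical placeholders for converting the ANN with a given percentile and scoring the resulting SNN on a calibration set.

```python
import numpy as np
import optuna


def convert_and_evaluate(percentile: float) -> float:
    """Hypothetical stand-in: normalize each layer by the chosen activation
    percentile, convert the ANN to an SNN, and return calibration accuracy."""
    activations = np.random.rand(10_000)      # placeholder activation samples
    scale = np.percentile(activations, percentile)
    return float(1.0 - abs(scale - 0.95))     # toy score; a real run would return SNN accuracy


def objective(trial: optuna.Trial) -> float:
    # Search the percentile in a plausible range instead of fixing it empirically.
    percentile = trial.suggest_float("percentile", 90.0, 100.0)
    return convert_and_evaluate(percentile)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print(study.best_params)  # e.g. {'percentile': 99.1}
```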