Spiking Neural Networks (SNNs) have gained considerable attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherently sparse activations. Recently, SNNs trained with backpropagation through time (BPTT) have achieved higher accuracy on image recognition tasks than SNNs trained with other algorithms. Despite this algorithmic success, prior works have neglected to evaluate the hardware energy overhead of BPTT, owing to the lack of a hardware evaluation platform for this SNN training algorithm. Moreover, although SNNs have long been regarded as an energy-efficient counterpart to ANNs, a quantitative comparison between the training costs of SNNs and ANNs is missing. To address these issues, in this work we introduce SATA (Sparsity-Aware Training Accelerator), a BPTT-based training accelerator for SNNs. SATA provides a simple and re-configurable systolic-array-based accelerator architecture, which makes it easy to analyze the training energy of BPTT-based SNN training algorithms. By exploiting sparsity, SATA improves its computation energy efficiency by $5.58\times$ compared to the same architecture without sparsity support. Based on SATA, we present a quantitative analysis of the energy efficiency of SNN training and compare the training costs of SNNs and ANNs. The results show that, on an Eyeriss-like systolic-array architecture, SNNs consume $1.27\times$ more total energy than ANNs even when sparsity (spikes, gradients of the firing function, and gradients of the membrane potential) is taken into account. We find that this high training energy cost stems from time-repetitive convolution operations and data movements during backpropagation. Finally, to guide future SNN training algorithm design, we provide several observations on energy efficiency with respect to different SNN-specific training parameters.
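To make the "time-repetitive" cost concrete, the following is a minimal sketch of BPTT for a single fully-connected layer of leaky integrate-and-fire (LIF) neurons. All values here (timestep count `T`, layer sizes, leak factor, threshold, the piecewise-linear surrogate gradient, and the detached reset path) are illustrative assumptions, not the paper's specific configuration; the point is only that the same weight operation and its gradient accumulation repeat at every one of the $T$ timesteps, in both the forward and backward passes.

```python
import numpy as np

# Hypothetical parameters for illustration only.
T, N_in, N_out = 4, 8, 6          # timesteps, input/output sizes
leak, v_th = 0.9, 1.0             # membrane leak factor and firing threshold
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(N_out, N_in))
x = (rng.random((T, N_in)) < 0.3).astype(float)   # sparse binary input spikes

def surrogate_grad(v):
    # Piecewise-linear surrogate for the non-differentiable firing function
    # (one common choice; the paper may use a different shape).
    return np.clip(1.0 - np.abs(v - v_th), 0.0, 1.0)

# Forward pass: the same weight matrix (a convolution in a real network)
# is applied once per timestep as the membrane potential integrates input.
v = np.zeros(N_out)
spikes, potentials = [], []
for t in range(T):
    v = leak * v + W @ x[t]           # weight op repeated T times
    s = (v >= v_th).astype(float)     # fire when threshold is crossed
    spikes.append(s)
    potentials.append(v.copy())       # pre-reset potential, for the backward
    v = v * (1.0 - s)                 # hard reset after a spike

# Backward pass (BPTT): gradients of the firing function and of the membrane
# potential flow through all T timesteps, so the weight-gradient accumulation
# and the associated data movement are also repeated T times.
dL_ds = np.ones((T, N_out))           # dummy upstream gradient
dW, dv_next = np.zeros_like(W), np.zeros(N_out)
for t in reversed(range(T)):
    # Gradient w.r.t. the membrane potential: the surrogate firing gradient
    # plus the leak path from the next timestep (reset path detached,
    # a common simplification).
    dv = dL_ds[t] * surrogate_grad(potentials[t]) + leak * dv_next
    dW += np.outer(dv, x[t])          # T accumulations into the same dW
    dv_next = dv

print("input spike sparsity:", 1.0 - x.mean())
print("||dW|| accumulated over", T, "timesteps:", np.linalg.norm(dW))
```

Note how the input spikes `x[t]` are binary and sparse: a sparsity-aware datapath can skip the zero entries in both `W @ x[t]` and `np.outer(dv, x[t])`, which is the kind of saving SATA's computation-energy figure reflects, while the per-timestep repetition itself remains.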