Spiking Neural Networks (SNNs) have attracted considerable attention as a potentially energy-efficient alternative to conventional Artificial Neural Networks (ANNs), owing to their inherently sparse activations. Recently, SNNs trained with backpropagation through time (BPTT) have achieved higher accuracy on image recognition tasks than SNNs trained with other algorithms. Despite this algorithmic success, prior work has neglected to evaluate the hardware energy overhead of BPTT, largely because no hardware evaluation platform exists for this SNN training algorithm. Moreover, although SNNs have long been viewed as an energy-efficient counterpart of ANNs, a quantitative comparison of the training costs of SNNs and ANNs is missing. To address these issues, in this work we introduce SATA (Sparsity-Aware Training Accelerator), a BPTT-based training accelerator for SNNs. SATA provides a simple, reconfigurable systolic-array architecture that makes it easy to analyze the training energy of BPTT-based SNN training algorithms. By exploiting sparsity, SATA improves its computation energy efficiency by $5.58\times$ over a variant that does not exploit sparsity. Based on SATA, we present quantitative analyses of SNN training energy efficiency and compare the training costs of SNNs and ANNs. The results show that, on an Eyeriss-like systolic architecture, SNN training consumes $1.27\times$ more total energy than ANN training, even when sparsity is exploited. We find that this high training energy cost stems from convolution operations and data movements that are repeated across timesteps during backpropagation. Moreover, to guide future SNN training algorithm design, we provide several observations on how SNN-specific training parameters affect energy efficiency and propose an energy estimation framework for SNN training. Code for our framework is publicly available.
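To make the BPTT training setting concrete, the following is a minimal sketch, not SATA's implementation, of surrogate-gradient BPTT for an SNN with leaky integrate-and-fire (LIF) neurons in PyTorch. All names and hyperparameters here (V_TH, LIF_DECAY, T_STEPS, SurrogateSpike, TinySNN) are illustrative assumptions; the key points it demonstrates are the unrolling of the network over timesteps (the source of the time-repetitive operations discussed above) and the binary, typically sparse spike activations that SATA exploits.

```python
# Minimal sketch of BPTT-based SNN training with a LIF neuron and a
# surrogate gradient. Hypothetical names/values; not SATA's actual API.
import torch
import torch.nn as nn

V_TH = 1.0       # firing threshold (assumed)
LIF_DECAY = 0.5  # membrane leak factor (assumed)
T_STEPS = 4      # timesteps unrolled for BPTT (assumed)

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; rectangular surrogate backward."""
    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem >= V_TH).float()

    @staticmethod
    def backward(ctx, grad_out):
        (mem,) = ctx.saved_tensors
        # Pass gradients only for membrane potentials near the threshold.
        surrogate = (torch.abs(mem - V_TH) < 0.5).float()
        return grad_out * surrogate

class TinySNN(nn.Module):
    def __init__(self, n_in=784, n_hidden=128, n_out=10):
        super().__init__()
        self.fc1 = nn.Linear(n_in, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_out)

    def forward(self, x):
        mem = torch.zeros(x.size(0), self.fc1.out_features, device=x.device)
        out_sum = 0.0
        for _ in range(T_STEPS):            # unroll over time: autograd does BPTT
            cur = self.fc1(x)               # input current
            mem = LIF_DECAY * mem + cur     # leaky integration
            spk = SurrogateSpike.apply(mem) # binary spikes: the sparse activations
            mem = mem - spk * V_TH          # soft reset after firing
            out_sum = out_sum + self.fc2(spk)
        return out_sum / T_STEPS

model = TinySNN()
x = torch.rand(8, 784)                       # toy batch of flattened images
target = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), target)
loss.backward()  # gradients flow back through all T_STEPS timesteps
```

Note that both the forward and backward passes repeat the layer computations T_STEPS times, which is consistent with the observation above that time-repetitive convolutions and data movements dominate SNN training energy relative to ANNs.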