Spiking Neural Networks (SNNs) have recently emerged as a low-power alternative to Artificial Neural Networks (ANNs) owing to their asynchronous, sparse, and binary information processing. To improve energy efficiency and throughput, SNNs can be implemented on memristive crossbars, where Multiply-and-Accumulate (MAC) operations are realized in the analog domain using emerging Non-Volatile Memory (NVM) devices. Despite the compatibility of SNNs with memristive crossbars, little attention has been paid to the effect of intrinsic crossbar non-idealities and stochasticity on the performance of SNNs. In this paper, we conduct a comprehensive analysis of the robustness of SNNs on non-ideal crossbars. We examine SNNs trained via learning algorithms such as surrogate gradient descent and ANN-SNN conversion. Our results show that repetitive crossbar computations across multiple time-steps induce error accumulation, resulting in a large performance drop during SNN inference. We further show that SNNs trained with a smaller number of time-steps achieve better accuracy when deployed on memristive crossbars.