Spiking Neural Networks (SNNs) are promising energy-efficient models for neuromorphic computing. For training these non-differentiable models, backpropagation through time (BPTT) with surrogate gradients (SG) has achieved high performance. However, BPTT incurs considerable memory cost and long training time. In this paper, we propose the Spatial Learning Through Time (SLTT) method, which achieves high performance while greatly improving training efficiency compared with BPTT. First, we show that the backpropagation of SNNs through the temporal domain contributes little to the final calculated gradients. Thus, we propose to ignore these unimportant routes in the computational graph during backpropagation. The proposed method reduces the number of scalar multiplications and achieves a small memory occupation that is independent of the total number of time steps. Furthermore, we propose a variant of SLTT, called SLTT-K, that allows backpropagation only at K time steps, so that the required number of scalar multiplications is further reduced and is also independent of the total number of time steps. Experiments on both static and neuromorphic datasets demonstrate the superior training efficiency and performance of SLTT. In particular, our method achieves state-of-the-art accuracy on ImageNet, while the memory cost and training time are reduced by more than 70% and 50%, respectively, compared with BPTT.
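The following is a minimal sketch (not the authors' code) of the key idea described above: at each time step, the membrane potential carried to the next step is detached, so backpropagation flows only through the spatial path of the current step and the stored graph does not grow with the number of time steps. All module and variable names (SimpleLIFNet, SurrogateSpike, tau, the threshold of 1.0, etc.) are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a box-shaped surrogate gradient."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        sg = (v_minus_thresh.abs() < 0.5).float()  # rectangular surrogate window
        return grad_output * sg


class SimpleLIFNet(nn.Module):
    """A tiny two-layer LIF network used only to illustrate the training loop."""

    def __init__(self, in_features=784, hidden=256, num_classes=10, tau=2.0):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)
        self.tau = tau

    def step(self, x, v):
        """One time step; v is the hidden-layer membrane potential."""
        cur = self.fc1(x)
        v = v + (cur - v) / self.tau            # leaky integration
        spike = SurrogateSpike.apply(v - 1.0)   # fire at threshold 1.0
        v = v * (1.0 - spike)                   # hard reset after a spike
        return self.fc2(spike), v


def train_step_sltt(model, optimizer, x, target, time_steps=6):
    """SLTT-style update: backprop is run per time step, and the state passed
    forward in time is detached, so no computational graph spans multiple steps."""
    criterion = nn.CrossEntropyLoss()
    v = torch.zeros(x.size(0), model.fc1.out_features)
    optimizer.zero_grad()
    for _ in range(time_steps):
        out, v = model.step(x, v)
        loss = criterion(out, target) / time_steps
        loss.backward()      # spatial-only backprop for this step; graph is freed
        v = v.detach()       # cut the temporal route before the next step
    optimizer.step()


# Usage example with random data.
model = SimpleLIFNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.rand(8, 784)
target = torch.randint(0, 10, (8,))
train_step_sltt(model, optimizer, x, target)
```

Under these assumptions, an SLTT-K-style variant would simply skip the per-step `loss.backward()` call on all but K of the time steps, which further reduces the number of backward-pass multiplications.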