Spiking Neural Networks (SNNs) have recently gained popularity as a biologically plausible substitute for traditional Artificial Neural Networks (ANNs). SNNs are cost-efficient and deployment-friendly because they process inputs in both the spatial and temporal dimensions using binary spikes. However, we observe that the information capacity of an SNN is affected by the number of timesteps, leading to an accuracy-efficiency tradeoff. In this work, we study a fine-grained adjustment of the number of timesteps in SNNs. Specifically, we treat the number of timesteps as a variable conditioned on each input sample, so that redundant timesteps can be skipped for certain data. We call our method Spiking Early-Exit Neural Networks (SEENNs). To determine the appropriate number of timesteps, we propose SEENN-I, which uses confidence-score thresholding to filter out uncertain predictions, and SEENN-II, which determines the number of timesteps with reinforcement learning. Moreover, we demonstrate that SEENN is compatible with both directly trained SNNs and SNNs obtained via ANN-SNN conversion. By dynamically adjusting the number of timesteps, our SEENN achieves a remarkable reduction in the average number of timesteps during inference. For example, our SEENN-II ResNet-19 can achieve 96.1% accuracy with an average of 1.08 timesteps on the CIFAR-10 test dataset.
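The abstract only sketches SEENN-I's confidence-score thresholding at a high level. Below is a minimal illustration of how such a per-sample early-exit loop over timesteps could look; the `snn_step` callable, the logit accumulation, and the threshold value are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def confidence_early_exit(snn_step, x, max_timesteps=4, threshold=0.9):
    """Sketch of confidence-based early exit over SNN timesteps (SEENN-I idea).

    `snn_step(x, t)` is a hypothetical callable returning the SNN's output
    logits for input `x` at timestep `t` (batch size 1 assumed here).
    """
    accumulated = None
    for t in range(1, max_timesteps + 1):
        logits = snn_step(x, t)                       # per-timestep output
        accumulated = logits if accumulated is None else accumulated + logits
        probs = F.softmax(accumulated / t, dim=-1)    # average logits over t steps
        confidence, prediction = probs.max(dim=-1)
        if confidence.item() >= threshold:            # confident enough: exit early
            return prediction.item(), t
    return prediction.item(), max_timesteps           # fall back to the full budget
```

Easy inputs would exit after one or two timesteps, while harder inputs run the full budget, which is how the average number of timesteps can fall close to 1 without sacrificing accuracy.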