As the scale of neural networks increases, techniques that enable them to run with low computational cost and high energy efficiency are required. To meet such demands, various efficient neural network paradigms, such as spiking neural networks (SNNs) and binary neural networks (BNNs), have been proposed. However, they suffer from serious drawbacks, such as degraded inference accuracy and increased latency. To solve these problems, we propose a single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision. The proposed S$^3$NN processes the information between hidden layers with spikes, as SNNs do. Nevertheless, like BNNs, it has no temporal dimension, so there is no latency in the training and inference phases. Thus, the proposed S$^3$NN has a lower computational cost than SNNs, which require time-series processing. However, S$^3$NN cannot adopt na\"{i}ve backpropagation algorithms due to the non-differentiable nature of spikes. We derive a suitable neuron model by reducing the surrogate gradient for multi-time-step SNNs to a single time step. We experimentally demonstrate that the obtained surrogate gradient allows S$^3$NN to be trained appropriately. We also show that the proposed S$^3$NN achieves accuracy comparable to full-precision networks while being highly energy-efficient.
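To illustrate the core idea, the following is a minimal sketch of a single-step spiking activation with a surrogate gradient. It is not the paper's exact formulation: the rectangular surrogate window, the threshold, and the `alpha` parameter are illustrative assumptions. The forward pass is a non-differentiable Heaviside step applied once (no temporal loop), and the surrogate function stands in for its derivative during backpropagation.

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    # Non-differentiable Heaviside step: emits a binary spike when the
    # membrane potential v reaches the threshold. A single application,
    # with no time dimension, is what makes the network "single-step".
    return (v >= threshold).astype(np.float64)

def surrogate_grad(v, threshold=1.0, alpha=2.0):
    # Hypothetical rectangular surrogate: pretends the step function has
    # constant slope alpha inside a window of width 1/alpha around the
    # threshold, so gradients can flow through the zero-derivative step.
    return alpha * (np.abs(v - threshold) < 0.5 / alpha).astype(np.float64)

# Single forward pass: binary activations and their surrogate gradients.
v = np.array([0.2, 0.9, 1.1, 1.8])
spikes = spike_forward(v)   # array([0., 0., 1., 1.])
grads = surrogate_grad(v)   # used in place of d(spike)/dv during backprop
```

In a training loop, `grads` would replace the true (almost-everywhere-zero) derivative of the spike function in the chain rule, which is what allows gradient descent to update weights through the spiking nonlinearity.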