Spiking neural networks (SNNs) can exploit spatio-temporal information and are inherently energy efficient, which makes them a promising alternative to deep neural networks (DNNs). Their event-driven information processing allows SNNs to avoid much of the expensive computation of DNNs and to save considerable energy. However, high training and inference latency limits the development of deeper SNNs: SNNs usually require tens or even hundreds of time steps during training and inference, which both increases latency and wastes energy. To overcome this problem, we propose a novel backpropagation (BP)-based training method for ultra-low-latency (1-2 time steps) SNNs with multiple thresholds. To increase the information capacity of each spike, we introduce a multi-threshold Leaky Integrate-and-Fire (LIF) model. In the proposed training method, we introduce three approximated derivatives of the spike activity to address the non-differentiability that makes direct BP-based training of SNNs difficult. Experimental results show that our method achieves average accuracies of 99.56%, 93.08%, and 87.90% on MNIST, FashionMNIST, and CIFAR10, respectively, with only 2 time steps. On CIFAR10, our method achieves a 1.12% accuracy improvement over previously reported directly trained SNNs while using fewer time steps.
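To make the idea concrete, the following is a minimal sketch of how a multi-threshold LIF neuron and a surrogate derivative for its spike activity might look. The exact update rule, reset scheme, threshold count, and surrogate shape are assumptions for illustration, not the paper's actual formulation; `multi_threshold_lif_step` and `surrogate_grad` are hypothetical helper names.

```python
import numpy as np

def multi_threshold_lif_step(v, x, tau=2.0, v_th=1.0, n_levels=4):
    """One forward step of an illustrative multi-threshold LIF neuron.

    Unlike a binary LIF neuron, the output spike carries an integer
    level (0..n_levels) indicating how many thresholds the membrane
    potential crossed, increasing the information per spike.
    """
    v = v / tau + x                                    # leaky integration of input x
    s = np.clip(np.floor(v / v_th), 0, n_levels)       # multi-level spike output
    v = v - s * v_th                                   # soft reset by emitted level
    return v, s

def surrogate_grad(v, v_th=1.0, width=0.5):
    """Rectangular approximation of ds/dv, nonzero only near the threshold.

    Stands in for the non-differentiable spike function during BP;
    the paper proposes three such approximated derivatives.
    """
    return (np.abs(v - v_th) < width).astype(float) / (2.0 * width)
```

For example, with `v=0.0` and input `x=2.5`, the potential reaches 2.5 and the neuron emits a level-2 spike, leaving a residual potential of 0.5. During backpropagation, `surrogate_grad` replaces the true (zero-almost-everywhere) derivative so gradients can flow through the spiking nonlinearity.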