Spiking Neural Networks (SNNs), which operate in an event-driven manner and employ binary spike representations, have recently emerged as promising candidates for energy-efficient computing. However, obtaining high-performance SNNs entails a cost bottleneck: training an SNN model requires a large number of time steps in addition to the usual learning iterations, which limits their energy efficiency. This paper proposes a general training framework that enhances feature learning and activation efficiency within a limited number of time steps, providing a new solution for more energy-efficient SNNs. Our framework allows SNN neurons to learn robust spike features from different receptive fields and to update neuron states using both the current stimuli and recurrence information transmitted from other neurons, continuously complementing information within a single time step. Additionally, we propose a projection function that merges these two stimuli to smoothly optimize the neuron weights (spike firing threshold and activation). We evaluate the proposal on both convolutional and recurrent models. Our experimental results demonstrate state-of-the-art performance on visual classification tasks, including CIFAR10, CIFAR100, and TinyImageNet. Our framework achieves 72.41% and 72.31% top-1 accuracy with only 1 time step on CIFAR100 for CNNs and RNNs, respectively. On CIFAR10, our method reduces energy consumption by 10x and 3x relative to a standard ANN and SNN, respectively, without additional time steps.
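The core mechanism described above, a single-time-step neuron update that merges the current stimulus with recurrent information from other neurons through a projection function under a learnable firing threshold, might look roughly like the following. This is a minimal PyTorch-style sketch under assumptions; the class and parameter names (`RecurrentSpikingNeuron`, `project`, `recurrent`) are hypothetical and do not reflect the authors' actual implementation.

```python
import torch
import torch.nn as nn


class RecurrentSpikingNeuron(nn.Module):
    """Hypothetical sketch: the membrane state is driven by both the
    current (feed-forward) stimulus and a recurrent stimulus from other
    neurons, merged by a learned projection before thresholding."""

    def __init__(self, dim: int, threshold: float = 1.0):
        super().__init__()
        self.recurrent = nn.Linear(dim, dim, bias=False)  # recurrence pathway (assumed linear form)
        self.project = nn.Linear(2 * dim, dim)            # projection merging the two stimuli
        self.threshold = nn.Parameter(torch.tensor(threshold))  # learnable spike firing threshold

    def forward(self, x, prev_spikes, membrane):
        # Merge the feed-forward stimulus with recurrent information,
        # complementing the information available within one time step.
        merged = self.project(torch.cat([x, self.recurrent(prev_spikes)], dim=-1))
        membrane = membrane + merged
        # Binary spike generation via hard thresholding (a surrogate
        # gradient would be used in training; omitted here for brevity).
        spikes = (membrane >= self.threshold).float()
        membrane = membrane * (1.0 - spikes)  # reset neurons that fired
        return spikes, membrane
```

A unit like this could drop into either a convolutional or a recurrent backbone, matching the paper's claim that the framework applies to both CNNs and RNNs; only the shape of `x` and the surrounding layers would change.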