We propose a novel backpropagation algorithm for training spiking neural networks (SNNs) that encodes information in the relative timing of multiple spikes from individual neurons, without single-spike restrictions. The proposed algorithm inherits the advantages of conventional timing-based methods in that it computes accurate gradients with respect to spike timing, which promotes ideal temporal coding. Unlike conventional methods, in which each neuron fires at most once, the proposed algorithm allows each neuron to fire multiple times. This extension naturally improves the computational capacity of SNNs. Our SNN model outperformed comparable SNN models and achieved accuracy on par with non-convolutional artificial neural networks. The spike count of our networks varied with the time constants of the postsynaptic current and the membrane potential. Moreover, we found that there exists an optimal time constant that maximizes test accuracy, a property not observed in conventional SNNs with single-spike restrictions based on time-to-first-spike (TTFS) coding. These results demonstrate the computational properties of SNNs that, like biological neurons, encode information in the timing of multiple spikes from individual neurons. Our code will be made publicly available.
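To illustrate the dynamics the abstract refers to, the following is a minimal sketch (not the paper's actual model) of a current-based leaky integrate-and-fire neuron with an exponentially decaying postsynaptic current. It shows how a neuron free of single-spike restrictions can fire multiple times, and how the spike count depends on the synaptic and membrane time constants. All function names, parameter names, and values here are illustrative assumptions, not taken from the paper.

```python
def simulate_lif(input_spike_times, tau_syn=5.0, tau_mem=10.0,
                 w=1.5, v_th=1.0, dt=0.1, t_max=50.0):
    """Sketch of a current-based LIF neuron (Euler integration).

    The neuron may fire multiple times (no single-spike restriction);
    returns the list of output spike times. All parameters are
    illustrative assumptions.
    """
    input_steps = {int(round(t / dt)) for t in input_spike_times}
    i_syn, v = 0.0, 0.0
    spike_times = []
    for k in range(int(t_max / dt)):
        if k in input_steps:
            i_syn += w                       # instantaneous PSC jump on input spike
        i_syn -= dt * i_syn / tau_syn        # PSC decays with time constant tau_syn
        v += dt * (-v / tau_mem + i_syn)     # membrane leaks with time constant tau_mem
        if v >= v_th:
            spike_times.append(k * dt)       # record output spike time
            v = 0.0                          # reset after each spike, then keep integrating
    return spike_times
```

Under these assumed parameters, lengthening `tau_syn` lets the postsynaptic current persist longer, so the same input spike train drives more output spikes; this is the kind of time-constant dependence of spike count that the abstract reports, reduced to a toy single-neuron setting.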