Spiking Neural Networks (SNNs) have been studied for decades for their biological plausibility and promising energy efficiency. Among existing SNNs, the leaky integrate-and-fire (LIF) model is commonly adopted to formulate the spiking neuron and has evolved into numerous variants with different biological features. However, most LIF-based neurons support only a single biological feature per neuronal behavior, limiting their expressiveness and the diversity of their neuronal dynamics. In this paper, we propose GLIF, a unified spiking neuron that fuses different bio-features within each neuronal behavior, enlarging the representation space of spiking neurons. In GLIF, the gating factors that determine the proportions of the fused bio-features are learnable during training. Combined with all the learnable membrane-related parameters, our method makes spiking neurons heterogeneous and adaptive: neurons can differ from one another and keep changing over the course of training. Extensive experiments on a variety of datasets demonstrate that our method obtains superior performance over other SNNs simply by switching their neuronal formulations to GLIF. In particular, we train a spiking ResNet-19 with GLIF and achieve $77.35\%$ top-1 accuracy with six time steps on CIFAR-100, advancing the state of the art. Code is available at \url{https://github.com/Ikarosy/Gated-LIF}.
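For intuition, the sketch below shows a minimal gated-LIF neuron in the spirit of GLIF: sigmoid-constrained gates blend two variants of each neuronal behavior (leakage, integration, reset), and both the gates and the membrane time constant are trained alongside the network weights. The gate names (`alpha`, `beta`, `gamma`), the particular pairs of behaviors being blended, and the omission of a surrogate gradient are illustrative assumptions, not the paper's exact formulation; see the linked repository for the authors' implementation.

```python
import torch
import torch.nn as nn

class GatedLIF(nn.Module):
    """Minimal GLIF-style neuron sketch: learnable gates fuse two
    variants of each neuronal behavior (hypothetical formulation)."""

    def __init__(self, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        # Gating logits (assumed names); sigmoid keeps each gate in (0, 1).
        self.alpha = nn.Parameter(torch.zeros(1))  # leakage gate
        self.beta = nn.Parameter(torch.zeros(1))   # integration gate
        self.gamma = nn.Parameter(torch.zeros(1))  # reset gate
        # Learnable membrane-related parameter.
        self.tau = nn.Parameter(torch.tensor(tau))
        self.v_th = v_th

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (T, batch, features) input current over T time steps.
        a = torch.sigmoid(self.alpha)
        b = torch.sigmoid(self.beta)
        g = torch.sigmoid(self.gamma)
        decay = 1.0 - 1.0 / self.tau
        v = torch.zeros_like(x[0])
        spikes = []
        for t in range(x.shape[0]):
            # Leakage: blend a leaky membrane with a non-leaky one.
            v = (a * decay + (1.0 - a)) * v
            # Integration: blend tau-scaled vs. raw input current.
            v = v + (b / self.tau + (1.0 - b)) * x[t]
            # Heaviside spike (a surrogate gradient would be needed
            # to train through this step; omitted here for brevity).
            s = (v >= self.v_th).float()
            # Reset: blend soft (subtract threshold) vs. hard (to zero).
            v = g * (v - s * self.v_th) + (1.0 - g) * v * (1.0 - s)
            spikes.append(s)
        return torch.stack(spikes)

# Usage: six time steps of input yield a binary spike train of equal shape.
out = GatedLIF()(torch.randn(6, 32, 128))  # -> (6, 32, 128)
```

Because the gates are per-neuron-model parameters rather than fixed hyperparameters, different layers (or channels, in a finer-grained variant) can settle on different mixtures of behaviors, which is the source of the heterogeneity the abstract describes.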