Brain-inspired spiking neural networks (SNNs) have recently drawn increasing attention due to their event-driven and energy-efficient characteristics. The integrated storage-and-computation paradigm of neuromorphic hardware makes SNNs quite different from Deep Neural Networks (DNNs). In this paper, we argue that on some hardware SNNs may not benefit from the weight-sharing mechanism, which effectively reduces parameters and improves inference efficiency in DNNs, and we hypothesize that an SNN with unshared convolution kernels could perform better. Motivated by this assumption, a training-inference decoupling method for SNNs named Real Spike is proposed, which not only enjoys both unshared convolution kernels and binary spikes at inference time but also maintains both shared convolution kernels and real-valued spikes during training. This decoupling mechanism is realized by a re-parameterization technique. Furthermore, based on the training-inference-decoupled idea, a series of different forms implementing Real Spike at different levels is presented; these forms also enjoy shared convolutions at inference and are friendly to both neuromorphic and non-neuromorphic hardware platforms. A theoretical proof is given to show that a Real Spike-based SNN is superior to its vanilla counterpart. Experimental results show that all the Real Spike variants consistently improve SNN performance. Moreover, the proposed method outperforms state-of-the-art models on both static (non-spiking) and neuromorphic datasets.
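The re-parameterization idea behind Real Spike can be illustrated with a minimal PyTorch sketch. The code below is not the paper's released implementation; all names (RealSpikeIF, fold_real_spike) are hypothetical, and it assumes a simplified integrate-and-fire neuron without surrogate gradients. It shows a channel-level variant: during training each channel scales its binary spike by a learnable real value, and before deployment that value is folded into the next convolution's kernels so that inference uses binary spikes only. Note that in this channel-level form the folded convolution stays shared, which is the hardware-friendly case mentioned in the abstract; an element-level form would instead yield unshared kernels.

```python
import torch
import torch.nn as nn

class RealSpikeIF(nn.Module):
    """Simplified integrate-and-fire step with channel-level real-valued spikes.

    Training: emits a * s, where s is a binary spike and a is a learnable
    per-channel coefficient. Inference (after folding): emits plain binary s.
    """

    def __init__(self, channels: int, v_threshold: float = 1.0):
        super().__init__()
        # One learnable real-valued spike coefficient per channel.
        self.a = nn.Parameter(torch.ones(channels))
        self.v_threshold = v_threshold
        self.folded = False  # set True after re-parameterization

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        # Hard threshold on the membrane potential v of shape (N, C, H, W).
        # A full implementation would use a surrogate gradient here.
        spike = (v >= self.v_threshold).float()
        if self.folded:
            return spike  # inference path: binary spikes only
        # Training path: real-valued spikes, broadcast a over (N, C, H, W).
        return spike * self.a.view(1, -1, 1, 1)

def fold_real_spike(neuron: RealSpikeIF, next_conv: nn.Conv2d) -> None:
    """Re-parameterize: absorb the per-channel spike values into the next
    convolution's kernels. Valid because convolution is linear and `a` is
    constant over spatial positions, so conv(W)(a * s) == conv(W * a)(s)."""
    with torch.no_grad():
        # next_conv.weight has shape (C_out, C_in, kH, kW); scale per C_in.
        next_conv.weight.mul_(neuron.a.view(1, -1, 1, 1))
    neuron.folded = True

# Usage sketch: the two paths agree up to floating-point error.
neuron = RealSpikeIF(channels=64)
conv = nn.Conv2d(64, 128, 3, padding=1, bias=False)
v = torch.rand(2, 64, 32, 32)
out_train = conv(neuron(v))      # training: real-valued spikes
fold_real_spike(neuron, conv)    # one-time re-parameterization
out_infer = conv(neuron(v))      # inference: binary spikes
torch.testing.assert_close(out_train, out_infer)
```

Under these assumptions the fold is exact, so accuracy gained from training with real-valued spikes carries over to the binary-spike inference model at no runtime cost.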