Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips. As these chips are usually resource-constrained, compressing SNNs is crucial for their practical deployment. Most existing methods directly apply pruning approaches designed for artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs and thus limiting the performance of the pruned SNNs. Besides, these methods are only suitable for shallow SNNs. In this paper, inspired by synaptogenesis and synapse elimination in the neural system, we propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weights for SNNs that enables us to seamlessly optimize network structure without retraining. Our key innovation is to redefine the gradient with respect to a new synaptic parameter, allowing better exploration of network structures by taking full advantage of the competition between pruning and regrowth of connections. The experimental results show that the proposed method achieves the minimal performance loss among pruned SNNs on the MNIST and CIFAR-10 datasets to date. Moreover, it reaches a $\sim$3.5% accuracy loss under an unprecedented 0.73% connectivity, revealing a remarkable structure-refining capability in SNNs. Our work suggests that there exists extremely high redundancy in deep SNNs. Our code is available at https://github.com/Yanqi-Chen/Gradient-Rewiring.
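To make the "redefined gradient" idea concrete, the following is a minimal PyTorch sketch of the gradient-rewiring intuition, not the authors' reference code (see the linked repository for that). All names (`RewiredLinear`, `theta`, `alpha`) and the toy task are illustrative assumptions: each weight is re-parameterized as a fixed sign times `relu(theta)`, so a synapse with `theta < 0` is exactly zero (pruned), and the gradient is routed to `theta` through the sign alone, letting pruned synapses keep receiving gradient and regrow.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RewiredLinear(nn.Module):
    """Linear layer re-parameterized as w = sign * relu(theta).

    A connection is pruned while theta < 0 (its weight is exactly zero) and can
    regrow if later updates push theta back above zero, so pruning and regrowth
    compete during training instead of requiring a separate retraining stage.
    """

    def __init__(self, in_features, out_features):
        super().__init__()
        init = torch.randn(out_features, in_features) * 0.1
        self.theta = nn.Parameter(init.abs())           # hidden synaptic parameter
        self.register_buffer("sign", torch.sign(init))  # fixed sign of each synapse
        self.bias = nn.Parameter(torch.zeros(out_features))
        self._w = None                                   # cached weight for the rewiring step

    def forward(self, x):
        self._w = self.sign * torch.relu(self.theta)     # pruned weights are exactly 0
        self._w.retain_grad()                            # keep dL/dw after backward()
        return F.linear(x, self._w, self.bias)

    @torch.no_grad()
    def rewiring_step(self, lr=1e-2, alpha=1e-4):
        """Update theta with the redefined gradient sign * dL/dw, i.e. without the
        relu mask, so pruned synapses still receive gradient and may regrow.
        alpha is a crude illustrative stand-in for a sparsity prior: it steadily
        pushes theta toward the pruned region (theta < 0)."""
        self.theta -= lr * (self.sign * self._w.grad + alpha)
        self.bias -= lr * self.bias.grad
        self.theta.grad = None
        self.bias.grad = None

    def connectivity(self):
        """Fraction of synapses that are currently unpruned."""
        return (self.theta > 0).float().mean().item()


if __name__ == "__main__":
    torch.manual_seed(0)
    layer = RewiredLinear(20, 2)
    x, y = torch.randn(64, 20), torch.randint(0, 2, (64,))
    for step in range(200):
        loss = F.cross_entropy(layer(x), y)
        loss.backward()
        layer.rewiring_step()
    print(f"loss={loss.item():.3f}, connectivity={layer.connectivity():.2%}")
```

Because the update to `theta` bypasses the `relu` mask, pruned and active synapses are treated symmetrically by the gradient, which is what allows connectivity and weights to be learned jointly rather than pruning being a one-way, post-hoc operation.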