Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips. As these chips are usually resource-constrained, compressing SNNs is crucial for their practical deployment. Most existing methods directly apply pruning approaches developed for artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs and thus limiting the performance of the pruned SNNs. Besides, these methods are only suitable for shallow SNNs. In this paper, inspired by synaptogenesis and synapse elimination in the nervous system, we propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs, which enables us to seamlessly optimize network structure without retraining. Our key innovation is to redefine the gradient with respect to a new synaptic parameter, allowing better exploration of network structures by taking full advantage of the competition between pruning and regrowth of connections. The experimental results show that the proposed method achieves the smallest loss of SNN performance on the MNIST and CIFAR-10 datasets to date. Moreover, it reaches a $\sim$3.5% accuracy loss at an unprecedented 0.73% connectivity, which reveals a remarkable structure-refining capability in SNNs. Our work suggests that there exists extremely high redundancy in deep SNNs. Our code is available at \url{https://github.com/Yanqi-Chen/Gradient-Rewiring}.
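To make the key idea concrete, below is a minimal PyTorch-style sketch, not the paper's implementation (see the repository above for that). It assumes a sign-preserving re-parameterization $w = s \cdot \mathrm{ReLU}(\theta)$ of each weight by a hidden synaptic parameter $\theta$ and a fixed sign $s$, with the gradient with respect to $\theta$ redefined as $s \cdot \partial L / \partial w$ so that pruned connections ($\theta \le 0$) still receive gradients and can regrow; the names `RewiredLinear` and `_RewireFn` are illustrative, and any prior or regularization term used by Grad R is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class _RewireFn(torch.autograd.Function):
    """Forward: effective weight w = sign * relu(theta), so connections with
    theta <= 0 are pruned (contribute zero). Backward: the gradient w.r.t.
    theta is redefined as sign * dL/dw, i.e. the ReLU mask is ignored, so
    pruned connections still receive task gradients and may regrow."""

    @staticmethod
    def forward(ctx, theta, sign):
        ctx.save_for_backward(sign)
        return sign * F.relu(theta)

    @staticmethod
    def backward(ctx, grad_w):
        (sign,) = ctx.saved_tensors
        # Redefined gradient: bypass the ReLU derivative; no gradient for the fixed signs.
        return sign * grad_w, None


class RewiredLinear(nn.Module):
    """Fully-connected layer with jointly learned connectivity and weights (illustrative)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Hidden synaptic parameter: its sign encodes whether the connection exists.
        self.theta = nn.Parameter(torch.empty(out_features, in_features).normal_(0, 0.05))
        # Fixed synaptic signs, drawn once at initialization.
        self.register_buffer("sign", torch.randn(out_features, in_features).sign())
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        weight = _RewireFn.apply(self.theta, self.sign)
        return F.linear(x, weight, self.bias)

    def connectivity(self):
        # Fraction of currently active (non-pruned) connections.
        return (self.theta > 0).float().mean().item()
```

Under these assumptions, the layer can be dropped into an SNN in place of a standard linear layer and trained with an ordinary optimizer; pruning and regrowth then emerge from the sign changes of `theta` during training rather than from a separate prune-and-retrain stage.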