Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips. As these chips are usually resource-constrained, compressing SNNs is crucial for their practical deployment. Most existing methods directly apply pruning approaches developed for artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs and thus limiting the performance of the pruned SNNs. Moreover, these methods are only suitable for shallow SNNs. In this paper, inspired by synaptogenesis and synapse elimination in the nervous system, we propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs that enables us to seamlessly optimize network structure without retraining. Our key innovation is to redefine the gradient with respect to a new synaptic parameter, allowing better exploration of network structures by taking full advantage of the competition between pruning and regrowth of connections. The experimental results show that the proposed method achieves the smallest performance loss of pruned SNNs on the MNIST and CIFAR-10 datasets to date. Moreover, it reaches a $\sim$3.5% accuracy loss under an unprecedented 0.73% connectivity, which reveals a remarkable structure-refining capability in SNNs. Our work suggests that there exists extremely high redundancy in deep SNNs. Our code is available at https://github.com/Yanqi-Chen/Gradient-Rewiring .
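The core idea described above, parameterizing each connection with a hidden synaptic parameter and routing gradients to it even while the connection is pruned, can be illustrated with a minimal sketch. This is not the authors' implementation: the class name `RewiredLinear`, the fixed-sign parameterization `w = sign * relu(theta)`, the straight-through-style surrogate gradient, and the initialization are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

class RewiredLinear(torch.nn.Module):
    """Toy layer sketching gradient-based rewiring: weight = sign * relu(theta),
    so connections with theta <= 0 are pruned, yet theta keeps receiving
    gradients and can regrow (hypothetical, simplified parameterization)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        init = torch.empty(out_features, in_features).normal_(std=0.1)
        # Fixed sign of each connection (not trained).
        self.register_buffer(
            "sign", torch.where(init >= 0, torch.ones_like(init), -torch.ones_like(init))
        )
        # Hidden synaptic parameter: magnitude of the weight when positive.
        self.theta = torch.nn.Parameter(init.abs())

    def forward(self, x):
        # Straight-through-style surrogate: the forward pass uses relu(theta),
        # so pruned connections contribute zero, while the backward pass treats
        # the weight as sign * theta, so pruned synapses still receive gradients
        # and can regrow once theta is pushed back above zero.
        theta = self.theta
        weight = self.sign * ((F.relu(theta) - theta).detach() + theta)
        return F.linear(x, weight)

    @torch.no_grad()
    def connectivity(self):
        # Fraction of connections currently active (theta > 0).
        return (self.theta > 0).float().mean().item()
```

In this sketch, sparsity would typically be encouraged by an additional penalty or prior pushing `theta` negative; the balance between that pressure and task gradients realizes the competition between pruning and regrowth that the abstract describes.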