Spiking neural networks (SNNs) exhibit superb characteristics in sensory information recognition tasks due to their biological plausibility. However, the performance of many current spiking models is limited by their structures: both fully connected and overly deep architectures introduce substantial redundancy. This redundancy, arising from both connections and neurons, is one of the key factors hindering the practical application of SNNs. Although some pruning methods have been proposed to tackle this problem, they usually ignore the fact that the neural topology in the human brain can be adjusted dynamically. Inspired by this, this paper proposes an evolutionary structure construction method for building more reasonable SNNs. By integrating knowledge distillation with connection pruning, the synaptic connections in an SNN are optimized dynamically toward an optimal state. As a result, the SNN structure can not only absorb knowledge from a teacher model but also search for a deep yet sparse network topology. Experimental results on CIFAR100 and DVS-Gesture show that the proposed structure learning method achieves strong performance while reducing connection redundancy. The proposed method explores a novel dynamic way of learning SNN structure from scratch, which could help close the gap between deep learning and bio-inspired neural dynamics.
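To make the two ingredients concrete, the following is a minimal NumPy sketch of a standard knowledge-distillation loss (KL divergence between temperature-softened teacher and student outputs) and magnitude-based connection pruning. This is an illustration of the generic techniques only, not the paper's actual method: the function names, the temperature value, and the use of global magnitude thresholding are assumptions for the sake of the example.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Standard KD loss: KL(teacher || student) on softened
    distributions, scaled by T^2 (illustrative, not the paper's exact loss)."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean() * T * T)

def prune_by_magnitude(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of synaptic weights;
    returns the pruned weights and the binary connection mask."""
    k = int(np.ceil(sparsity * w.size))
    thresh = np.sort(np.abs(w), axis=None)[k - 1] if k > 0 else -np.inf
    mask = np.abs(w) > thresh
    return w * mask, mask
```

In a dynamic structure-learning loop, the mask would be recomputed (and possibly relaxed) between training phases, so that pruned connections can reappear as the topology evolves; the sketch above shows only a single pruning step.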