Spiking neural networks (SNNs) have gained attention as a promising alternative to traditional artificial neural networks (ANNs) due to their potential for energy efficiency and their ability to model spiking behavior in biological systems. However, training SNNs remains a challenging problem, and new techniques are needed to improve their performance. In this paper, we study the impact of skip connections on SNNs and propose a hyperparameter optimization technique that adapts models from ANNs to SNNs. We demonstrate that optimizing the position, type, and number of skip connections can significantly improve the accuracy and efficiency of SNNs by enabling faster convergence and increasing information flow through the network. Our results show an average +8% accuracy increase when adapting multiple state-of-the-art models on the CIFAR-10-DVS and DVS128 Gesture datasets.
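To make the idea concrete, the sketch below shows an additive skip connection around a spiking layer built from leaky integrate-and-fire (LIF) neurons. This is a minimal illustrative example, not the paper's actual architecture or optimization procedure; the function names, the fixed decay constant `tau`, and the hard-reset rule are all assumptions for the sake of the sketch.

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One leaky integrate-and-fire step: decay the membrane potential,
    integrate the input current, spike where the threshold is crossed,
    then hard-reset the spiking neurons. (Illustrative parameters.)"""
    v = v / tau + x
    spikes = (v >= v_th).astype(np.float32)
    v = v * (1.0 - spikes)  # hard reset after a spike
    return v, spikes

def residual_block(x_seq, w, v=None):
    """Spiking block with an additive skip connection: the input spike
    train bypasses the LIF layer and is added to its output at every
    time step, letting information flow around the nonlinearity."""
    t_steps, dim = x_seq.shape
    if v is None:
        v = np.zeros(dim, dtype=np.float32)
    out = np.empty_like(x_seq)
    for t in range(t_steps):
        v, s = lif_step(v, x_seq[t] @ w)
        out[t] = s + x_seq[t]  # identity (skip) path
    return out

# Example: 4 time steps of 8-dimensional binary spike inputs
rng = np.random.default_rng(0)
x = (rng.random((4, 8)) > 0.7).astype(np.float32)
w = rng.normal(scale=0.5, size=(8, 8)).astype(np.float32)
y = residual_block(x, w)
```

The position of the skip (which layers it bypasses), its type (additive vs. concatenative), and how many such connections appear are exactly the hyperparameters the abstract describes optimizing during ANN-to-SNN adaptation.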