Spiking Neural Networks (SNNs) have gained considerable attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherently sparse activations. However, most prior SNN methods reuse ANN-like architectures (e.g., VGG-Net or ResNet), which can yield sub-optimal performance for the temporal processing of binary spike information in SNNs. To address this, we introduce a novel Neural Architecture Search (NAS) approach for finding better SNN architectures. Inspired by recent NAS approaches that identify strong architectures from activation patterns at initialization, we select, without any training, the architecture that can represent diverse spike activation patterns across different data samples. Furthermore, to exploit the temporal correlation among spikes, we search not only for feedforward connections but also for backward connections (i.e., temporal feedback connections) between layers. Interestingly, SNASNet, the architecture found by our search algorithm, achieves higher performance with backward connections, demonstrating the importance of designing SNN architectures that suitably exploit temporal information. We conduct extensive experiments on three image recognition benchmarks and show that SNASNet achieves state-of-the-art performance with significantly fewer timesteps (5 timesteps).
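The training-free selection criterion described above can be illustrated with a NASWOT-style kernel score adapted to binary spike patterns. The sketch below is an illustrative assumption rather than the exact SNASNet scoring rule: it treats a candidate architecture's spike activations at initialization on a small minibatch as binary codes, builds a kernel that counts per-position agreements between samples (the complement of the Hamming distance), and ranks candidates by the kernel's log-determinant, which rewards architectures whose spike patterns differ across samples. The function name sparsity_diversity_score and the jitter constant are hypothetical.

import numpy as np

def sparsity_diversity_score(spike_codes):
    # spike_codes: (N, D) binary matrix; row i is the flattened 0/1 spike
    # activation pattern produced (at initialization) for the i-th sample
    # of a small minibatch. Sketch only; not the authors' exact metric.
    spike_codes = spike_codes.astype(np.float64)
    n, d = spike_codes.shape
    # K[i, j] = number of positions where samples i and j agree
    # (matching active units + matching inactive units = d - Hamming distance).
    active_match = spike_codes @ spike_codes.T
    inactive_match = (1.0 - spike_codes) @ (1.0 - spike_codes).T
    kernel = active_match + inactive_match
    # log|K| is high when activation patterns are diverse (nearly orthogonal)
    # and collapses when all samples elicit the same spike pattern.
    sign, logdet = np.linalg.slogdet(kernel + 1e-6 * np.eye(n))
    return logdet if sign > 0 else -np.inf

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical spike patterns from two candidate architectures on 8 samples.
    diverse = rng.integers(0, 2, size=(8, 512))                     # varied per sample
    collapsed = np.tile(rng.integers(0, 2, size=(1, 512)), (8, 1))  # identical per sample
    print(sparsity_diversity_score(diverse))    # higher score
    print(sparsity_diversity_score(collapsed))  # much lower (degenerate) score

Under this sketch, each candidate drawn from the search space (including its feedforward and backward connections) would be scored on one minibatch, and the highest-scoring architecture would then be trained from scratch.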