Spiking Neural Networks (SNNs) have gained considerable attention as a potentially energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherently sparse activations. However, most prior SNN methods use ANN-like architectures (e.g., VGG-Net or ResNet), which can be sub-optimal for the temporal processing of binary spike sequences in SNNs. To address this, in this paper, we introduce a novel Neural Architecture Search (NAS) approach for finding better SNN architectures. Inspired by recent NAS approaches that identify the optimal architecture from activation patterns at initialization, we select the architecture that can represent diverse spike activation patterns across different data samples, without any training. Moreover, to further leverage the temporal information among the spikes, we search for backward connections (i.e., temporal feedback connections) between layers as well as feedforward connections. Interestingly, SNASNet, the architecture found by our search algorithm, achieves higher performance with backward connections, demonstrating the importance of designing SNN architectures that suitably exploit temporal information. We conduct extensive experiments on three image recognition benchmarks, where we show that SNASNet achieves state-of-the-art performance with significantly fewer timesteps (5 timesteps). Code is available on GitHub.
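The training-free selection criterion sketched in the abstract (scoring a candidate architecture by how diversely its untrained network's spike patterns separate different inputs) can be illustrated as follows. This is a minimal sketch in the spirit of NASWOT-style log-determinant scoring, not the paper's actual implementation; the function name and kernel construction are illustrative assumptions.

```python
import numpy as np

def spike_diversity_score(spike_patterns):
    """Illustrative training-free fitness from binary spike patterns.

    spike_patterns: (N, D) binary matrix, one row per input sample and
    one column per neuron; an entry is 1 if that neuron fired when the
    sample was passed through the *untrained* candidate network.

    Builds a Hamming-similarity kernel K, where K[i, j] counts positions
    at which samples i and j agree, and returns its log-determinant:
    architectures mapping distinct samples to distinct spike patterns
    yield a better-conditioned kernel and thus a higher score.
    """
    p = np.asarray(spike_patterns, dtype=float)
    n, d = p.shape
    hamming = np.abs(p[:, None, :] - p[None, :, :]).sum(axis=-1)
    K = d - hamming  # similarity: d minus Hamming distance
    sign, logdet = np.linalg.slogdet(K)
    # A singular kernel (e.g., all samples share one pattern) gets the
    # worst possible score.
    return logdet if sign > 0 else -np.inf
```

Candidate architectures would then be ranked by this score on a mini-batch, and the top-scoring one selected with no gradient updates, consistent with the "without training" claim in the abstract.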