Spiking neural networks (SNNs) have received substantial attention in recent years because their sparse and asynchronous communication allows them to be deployed on neuromorphic hardware with extremely high energy efficiency. However, SNNs currently struggle to match the performance of artificial neural networks (ANNs) because their limited scalability prevents large-scale networks. This is especially true for the Transformer, an ANN model that has achieved remarkable performance across a variety of machine learning tasks: implementing it in SNNs with conventional methods requires a very large number of neurons, notably in the self-attention module. Inspired by mechanisms in the nervous system, we propose an efficient spiking Transformer (EST) framework enabled by partial information to address this problem. In this model, we not only implement the self-attention module with a reasonable number of neurons, but also introduce partial-information self-attention (PSA), which utilizes only partial input signals and thus further reduces computational cost compared to conventional methods. Experimental results show that our EST outperforms state-of-the-art SNN models in terms of both accuracy and number of time steps on the CIFAR-10/100 and ImageNet datasets. In particular, the proposed EST achieves 78.48% top-1 accuracy on ImageNet with only 16 time steps. In addition, the proposed PSA reduces FLOPs by 49.8% with negligible performance loss compared to a self-attention module with full information.
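The abstract does not specify how PSA chooses which partial input signals to use. The following PyTorch sketch is purely illustrative: it assumes the savings come from restricting the key/value side of attention to a kept fraction of tokens, selected here by spike activity. The class name `PartialSelfAttention`, the `keep_ratio` parameter, and the top-k activity selection rule are all our assumptions, not the paper's actual design.

```python
# Minimal sketch of a partial-information self-attention block. Assumes PSA
# attends over only a kept fraction of tokens; the selection rule (top-k by
# spike activity) and `keep_ratio` are illustrative assumptions.
import torch
import torch.nn as nn


class PartialSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8, keep_ratio: float = 0.5):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.keep_ratio = keep_ratio
        self.q_proj = nn.Linear(dim, dim)
        self.kv_proj = nn.Linear(dim, 2 * dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. spike counts accumulated per time step
        b, n, d = x.shape
        k_keep = max(1, int(n * self.keep_ratio))

        # Keep only the most active tokens on the key/value side; dropping
        # mostly-silent tokens is where this sketch's FLOPs savings come from.
        activity = x.abs().sum(dim=-1)                  # (b, n)
        idx = activity.topk(k_keep, dim=1).indices      # (b, k_keep)
        x_part = torch.gather(x, 1, idx.unsqueeze(-1).expand(-1, -1, d))

        q = self.q_proj(x)                              # queries: all tokens
        k, v = self.kv_proj(x_part).chunk(2, dim=-1)    # keys/values: subset

        def split(t: torch.Tensor) -> torch.Tensor:
            # (b, t, d) -> (b, heads, t, d/heads)
            return t.view(b, -1, self.num_heads, d // self.num_heads).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        attn = (q @ k.transpose(-2, -1)) * self.scale   # (b, heads, n, k_keep)
        out = attn.softmax(dim=-1) @ v                  # (b, heads, n, d/heads)
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.out_proj(out)
```

With `keep_ratio = 0.5`, the attention matrix and the key/value projections shrink by roughly half, which is consistent in spirit with the reported ~49.8% FLOPs reduction, though the paper's exact mechanism may differ.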