Inspired by more detailed modeling of biological neurons, spiking neural networks (SNNs) have been investigated both as more biologically plausible and potentially more powerful models of neural computation, and with the aim of capturing the energy efficiency of biological neurons; the performance of such networks has, however, lagged behind that of classical artificial neural networks (ANNs). Here, we demonstrate how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs on challenging time-domain benchmarks such as speech and gesture recognition. This performance also exceeds that of standard classical recurrent neural networks (RNNs) and approaches that of the best modern ANNs. Because these SNNs exhibit sparse spiking, we show that they are theoretically one to three orders of magnitude more computationally efficient than RNNs of comparable performance. Together, this positions SNNs as an attractive solution for AI hardware implementations.
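To make the two key ingredients concrete, the sketch below shows one common way to combine a surrogate gradient with an adaptive spiking neuron that has trainable time constants. This is an illustrative assumption written in PyTorch, not the paper's exact model: the class names, the fast-sigmoid surrogate, and all constants (threshold coupling, baseline threshold, time-constant initializations) are placeholders chosen for clarity.

```python
# Minimal sketch (assumed, not the authors' specification): an adaptive
# leaky integrate-and-fire (ALIF) cell with learnable time constants and
# a generic surrogate gradient for the non-differentiable spike.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thr):
        ctx.save_for_backward(v_minus_thr)
        return (v_minus_thr > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thr,) = ctx.saved_tensors
        # Fast-sigmoid-style surrogate derivative (one common choice).
        surrogate = 1.0 / (1.0 + 10.0 * v_minus_thr.abs()) ** 2
        return grad_output * surrogate


class AdaptiveLIFCell(nn.Module):
    """One recurrent layer of adaptive spiking neurons with tunable time constants."""

    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.w_in = nn.Linear(n_in, n_hidden)
        self.w_rec = nn.Linear(n_hidden, n_hidden, bias=False)
        # Per-neuron membrane and adaptation time constants, trained jointly.
        self.tau_m = nn.Parameter(torch.full((n_hidden,), 20.0))
        self.tau_adp = nn.Parameter(torch.full((n_hidden,), 100.0))
        self.beta = 1.8   # coupling of adaptation to the threshold (assumed)
        self.b0 = 1.0     # baseline firing threshold (assumed)

    def forward(self, x_t, v, b, spk, dt=1.0):
        alpha = torch.exp(-dt / self.tau_m)    # membrane decay
        rho = torch.exp(-dt / self.tau_adp)    # adaptation decay
        b = rho * b + (1.0 - rho) * spk        # threshold adaptation driven by spikes
        thr = self.b0 + self.beta * b
        i_t = self.w_in(x_t) + self.w_rec(spk)
        v = alpha * v + (1.0 - alpha) * i_t - spk * thr  # leaky integration, soft reset
        spk = SurrogateSpike.apply(v - thr)
        return v, b, spk
```

Unrolled over the input time steps and trained with backpropagation through time, such a cell lets gradients flow through the spike via the surrogate while the time constants adapt to the task; the sparse binary spikes are what underlie the efficiency argument in the abstract.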