Compared to conventional artificial neurons, which produce dense, real-valued responses, biologically inspired spiking neurons transmit sparse, binary information, which can also lead to energy-efficient implementations. Recent research has shown that spiking neural networks can be trained like standard recurrent neural networks using the surrogate gradient method, and they have shown promising results on speech command recognition tasks. Using the same technique, we show that they scale to large-vocabulary continuous speech recognition, where they can replace LSTMs in the encoder with only a minor loss of performance. This suggests that they may be applicable to more involved sequence-to-sequence tasks. Moreover, in contrast to their non-spiking recurrent counterparts, they are robust to exploding gradients without the need for gates.
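To illustrate the two ideas the abstract relies on, here is a minimal NumPy sketch (not the paper's implementation; the neuron parameters and the fast-sigmoid surrogate are illustrative assumptions): a leaky integrate-and-fire neuron emits binary spikes through a non-differentiable threshold, and training replaces the derivative of that threshold with a smooth surrogate on the backward pass.

```python
import numpy as np

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One leaky integrate-and-fire step: decay the membrane potential,
    add the input current, emit a binary spike when the threshold is
    crossed, then reset by subtracting the threshold."""
    v = beta * v + x
    spike = (v >= threshold).astype(v.dtype)  # non-differentiable step
    v = v - spike * threshold
    return v, spike

def surrogate_grad(v, threshold=1.0, slope=10.0):
    """Fast-sigmoid surrogate for d(spike)/d(v): a smooth stand-in
    used only on the backward pass, so the spike itself stays binary."""
    return 1.0 / (slope * np.abs(v - threshold) + 1.0) ** 2

# Drive one neuron with a constant sub-threshold current: the membrane
# integrates for two steps and fires on the third.
v = np.zeros(1)
for x in [0.4, 0.4, 0.4]:
    v, s = lif_step(v, np.full(1, x))
```

Because the surrogate is only used for gradients, the forward pass keeps the sparse, binary spike trains that make these networks cheap at inference time.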