Spiking neural networks (SNNs) are a class of artificial network inspired by the brain's use of action potentials. There is growing interest in emulating these networks on neuromorphic computers due to their improved energy consumption and speed, which are the main scaling issues of their counterpart, the artificial neural network (ANN). Significant progress has been made in directly training SNNs to perform on par with ANNs in terms of accuracy. These methods are, however, slow due to their sequential nature, leading to long training times. We propose a new technique for directly training single-spike-per-neuron SNNs that eliminates all sequential computation and relies exclusively on vectorised operations. We demonstrate over a $\times 10$ speedup in training, with robust classification performance, on real datasets of low to medium spatio-temporal complexity (Fashion-MNIST and Neuromorphic-MNIST). Our proposed solution solves certain tasks with over a $95.68\%$ reduction in spike counts relative to a conventionally trained SNN, which could significantly reduce energy requirements when deployed on neuromorphic computers.