While backpropagation (BP) has been applied to spiking neural networks (SNNs) with encouraging results, a key challenge is to backpropagate a continuous-valued loss over layers of spiking neurons exhibiting discontinuous all-or-none firing activities. Existing methods deal with this difficulty by introducing compromises that come with their own limitations, leading to potential performance degradation. We propose a novel BP-like method, called neighborhood aggregation (NA), which computes accurate error gradients guiding weight updates that may lead to discontinuous modifications of firing activities. NA achieves this goal by aggregating finite differences of the loss over multiple perturbed membrane potential waveforms in the neighborhood of the present membrane potential of each neuron, while utilizing a new membrane potential distance function. Our experiments show that the proposed NA algorithm delivers state-of-the-art performance for SNN training on several datasets.
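The core idea of aggregating finite differences of the loss over perturbed membrane potential waveforms can be sketched as follows. This is an illustrative, hypothetical implementation only: the perturbation scheme, the neighborhood sampling, and the membrane potential distance function here are stand-in assumptions, not the paper's exact definitions.

```python
import numpy as np

def na_gradient_sketch(loss_fn, u, n_perturb=500, eps=0.01, seed=0):
    """Hypothetical sketch of neighborhood aggregation (NA).

    Estimates the gradient of `loss_fn` with respect to a membrane
    potential trace `u` by averaging finite differences of the loss
    over randomly perturbed traces in the neighborhood of `u`.
    Each neighbor's contribution is weighted by a stand-in inverse
    distance, so closer waveforms count more.
    """
    rng = np.random.default_rng(seed)
    base = loss_fn(u)
    grad = np.zeros_like(u)
    total_w = 0.0
    for _ in range(n_perturb):
        d = rng.standard_normal(u.shape)   # random perturbation direction
        v = u + eps * d                    # neighboring potential waveform
        dist = np.linalg.norm(v - u)       # stand-in distance function
        w = 1.0 / (dist + 1e-12)           # closer neighbors weigh more
        # finite difference of the loss along direction d
        grad += w * ((loss_fn(v) - base) / eps) * d
        total_w += w
    return grad / total_w
```

For a smooth loss this estimator points in roughly the same direction as the true gradient; the point of NA in the paper is that the same finite-difference view remains meaningful when firing activities change discontinuously, where an analytic derivative does not exist.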