We recently proposed S4NN, an adaptation of backpropagation to multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and a form of temporal coding known as time-to-first-spike coding. With this coding scheme, neurons fire at most once per stimulus, but the firing order carries information. Here, we introduce BS4NN, a modification of S4NN in which the synaptic weights are constrained to be binary (+1 or -1), in order to decrease the memory footprint (ideally, one bit per synapse) and the computation footprint. This is done using two sets of weights: real-valued weights, updated by gradient descent and used in the backward pass of backpropagation, and their signs, used in the forward pass. Similar strategies have been used to train (non-spiking) binarized neural networks. The main difference is that BS4NN operates in the time domain: spikes are propagated sequentially, and different neurons may reach their threshold at different times, which increases computational power. We validated BS4NN on two popular benchmarks, MNIST and Fashion-MNIST, and obtained reasonable accuracies for this sort of network (97.0% and 87.3%, respectively), with a negligible accuracy drop with respect to real-valued weights (0.4% and 0.7%, respectively). We also demonstrated that BS4NN outperforms a simple BNN with the same architecture on those two datasets (by 0.2% and 0.9%, respectively), presumably because it leverages the temporal dimension. The source code of BS4NN is publicly available at https://github.com/SRKH/BS4NN.
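The two-weight-set scheme described above can be sketched for a single dense layer as follows. This is a minimal illustration, not the paper's actual implementation: the names `w_real`, `forward`, and `update` are hypothetical, and the temporal (spike-timing) dynamics of BS4NN are omitted for brevity. Real-valued weights are kept for learning, while only their signs (+1 or -1) are used in the forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real-valued weights: updated by gradient descent, used only in the backward pass.
w_real = rng.normal(0.0, 0.5, size=(4, 3))

def binarize(w):
    # Keep only the sign of each weight; map exact zeros to +1
    # so that every synapse stays in {+1, -1} (one bit per synapse).
    w_bin = np.sign(w)
    w_bin[w_bin == 0] = 1.0
    return w_bin

def forward(x):
    # Forward pass uses the binary weights only.
    return x @ binarize(w_real)

def update(grad, lr=0.01):
    # Gradients (computed through the binary forward pass) update the
    # underlying real-valued weights, as in binarized-network training.
    global w_real
    w_real -= lr * grad

x = rng.normal(size=(1, 4))
y = forward(x)          # output computed with +1/-1 weights
update(np.ones_like(w_real))
```

After training, only the signs need to be stored, which is where the one-bit-per-synapse memory saving comes from.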