Spiking neural networks (SNNs) are energy efficient because of their spiking nature. However, as the spike firing rate of an SNN increases, so does its energy consumption, and the advantage of SNNs diminishes. Here, we tackle this problem by introducing a novel penalty term on the spiking activity into the objective function during training. Our method is designed to optimize the energy-consumption metric directly without modifying the network architecture. Therefore, the proposed method reduces energy consumption more than other methods while maintaining accuracy. We conducted experiments on image classification tasks, and the results indicate the effectiveness of the proposed method, which mitigates the energy--accuracy trade-off.
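As a minimal sketch of this idea (the notation below is an illustrative assumption, not the paper's exact formulation), the penalized training objective can be written as
\[
  \mathcal{L} \;=\; \mathcal{L}_{\mathrm{task}} \;+\; \lambda \sum_{l}\sum_{t} \bigl\lVert s_{l,t} \bigr\rVert_{1},
\]
where $\mathcal{L}_{\mathrm{task}}$ is the standard task loss (e.g., cross-entropy), $s_{l,t}$ denotes the spike outputs of layer $l$ at time step $t$, and $\lambda$ controls how strongly spiking activity, and hence energy consumption, is penalized relative to accuracy.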