Event-based machine learning promises more energy-efficient AI on future neuromorphic hardware. Here, we investigate how the recently discovered Eventprop algorithm for gradient descent on exact gradients in spiking neural networks can be scaled up to challenging keyword recognition benchmarks. We implemented Eventprop in the GPU enhanced Neuronal Networks (GeNN) framework and used it to train recurrent spiking neural networks on the Spiking Heidelberg Digits and Spiking Speech Commands datasets. We found that learning depended strongly on the loss function and extended Eventprop to a wider class of loss functions to enable effective training. When combined with the right additional mechanisms from the machine learning toolbox, Eventprop networks achieved state-of-the-art performance on Spiking Heidelberg Digits and good accuracy on Spiking Speech Commands. This work is a significant step towards a low-power neuromorphic alternative to current machine learning paradigms.
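To make the role of the loss function concrete, the following is a minimal sketch, not the paper's implementation: it contrasts two common choices for classification losses on spiking-network readouts, a cross-entropy on time-averaged output voltages versus one on the per-class maximum voltage over time. The array shapes and function names are hypothetical and chosen only for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def ce_avg_loss(v, labels):
    """Cross-entropy on time-averaged output voltages.

    v: (batch, time, classes) hypothetical output-neuron membrane voltages.
    labels: (batch,) integer class labels.
    """
    p = softmax(v.mean(axis=1))  # average over time, then softmax over classes
    return -np.log(p[np.arange(len(labels)), labels]).mean()

def ce_max_loss(v, labels):
    """Cross-entropy on the per-class maximum voltage over time."""
    p = softmax(v.max(axis=1))  # max over time, then softmax over classes
    return -np.log(p[np.arange(len(labels)), labels]).mean()

# Toy usage with random data: 4 samples, 100 time steps, 20 classes.
rng = np.random.default_rng(0)
v = rng.normal(size=(4, 100, 20))
y = rng.integers(0, 20, size=4)
print(ce_avg_loss(v, y), ce_max_loss(v, y))
```

Because the two losses weight the output trajectory differently (uniformly over time versus only at its peak), they can produce very different gradients for the same network, which is one way a loss-function choice can make or break training.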