Neuromorphic perception with event-based sensors, asynchronous hardware and spiking neurons is showing promising results for real-time and energy-efficient inference in embedded systems. The next promise of brain-inspired computing is to enable adaptation to changes at the edge with online learning. However, the parallel and distributed architectures of neuromorphic hardware based on co-localized compute and memory impose locality constraints on the on-chip learning rules. In this work we propose the Event-Based Three-factor Local Plasticity (ETLP) rule, which uses (1) the pre-synaptic spike trace, (2) the post-synaptic membrane voltage and (3) a third factor in the form of projected labels with no error calculation, which also serve as update triggers. We apply ETLP with feedforward and recurrent spiking neural networks to visual and auditory event-based pattern recognition, and compare it to Back-Propagation Through Time (BPTT) and eProp. We show competitive accuracy with a clear advantage in computational complexity for ETLP. We also show that when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn spatio-temporal patterns with a rich temporal structure. Finally, we provide a proof-of-concept hardware implementation of ETLP on FPGA to highlight the simplicity of its computational primitives and how they can be mapped onto neuromorphic hardware for online learning with low energy consumption and real-time interaction.
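The three-factor structure described above can be illustrated with a minimal sketch. This is not the authors' exact ETLP implementation; the surrogate function, the ±1 sign convention for the projected label, and all parameter names are illustrative assumptions. The key properties from the abstract are retained: the update is a purely local product of a pre-synaptic trace, a function of the post-synaptic membrane voltage, and a projected-label third factor that triggers the update, with no global error computed.

```python
import numpy as np

def surrogate(v, v_th=1.0, gamma=0.3):
    # Triangular surrogate of the spike derivative around threshold
    # (one common choice; the exact shape is an assumption here).
    return gamma * np.maximum(0.0, 1.0 - np.abs(v - v_th))

def etlp_style_update(w, pre_trace, v_post, label, lr=0.01, v_th=1.0):
    """One event-triggered weight update of a three-factor local rule.

    Factors, per the abstract:
      (1) pre_trace -- low-pass-filtered pre-synaptic spike trace
      (2) v_post    -- post-synaptic membrane voltage
      (3) label     -- projected one-hot label, also the update trigger
    """
    # Third factor: +1 for the neuron matching the projected label,
    # -1 for the others (sign convention is an illustrative assumption).
    third = np.where(label > 0, 1.0, -1.0)
    # Local product of the three factors; no error is computed,
    # the label-projection event itself triggers the update.
    dw = lr * np.outer(third * surrogate(v_post, v_th), pre_trace)
    return w + dw

# Toy usage: 5 inputs, 3 output neurons, label projected onto neuron 1.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(3, 5))
pre_trace = rng.random(5)          # non-negative filtered spikes
v_post = rng.random(3)             # membrane voltages
label = np.array([0, 1, 0])        # projected one-hot label
w_new = etlp_style_update(w, pre_trace, v_post, label)
```

Because every quantity in the product is available at the synapse or its post-synaptic neuron, the rule respects the locality constraint of co-localized compute and memory that the abstract highlights.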