We demonstrate for the first time that a biologically plausible spiking neural network (SNN) equipped with Spike-Timing-Dependent Plasticity (STDP) learning can continuously learn, on the fly, to detect walking people from retina-inspired, event-based camera data. Our pipeline works as follows. First, a short sequence of event data (< 2 minutes), capturing a walking human from a flying drone, is shown to a convolutional SNN-STDP system which also receives teacher spiking signals from a convolutional readout (forming a semi-supervised system). Then, STDP adaptation is stopped and the learned system is assessed on testing sequences. We conduct several experiments to study the effect of key mechanisms in our system, and we compare our precision-recall performance to that of conventionally trained CNNs operating on either RGB or event-based camera frames.
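To make the STDP learning rule referenced above concrete, the following is a minimal, hypothetical sketch of a standard pair-based STDP weight update; the parameter names (`a_plus`, `a_minus`, `tau`) and values are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP update (illustrative sketch).

    w  -- current synaptic weight
    dt -- t_post - t_pre in ms; dt >= 0 means the pre-synaptic spike
          preceded the post-synaptic spike (causal -> potentiation),
          dt < 0 means the reverse (anti-causal -> depression)
    """
    if dt >= 0:
        dw = a_plus * np.exp(-dt / tau)   # potentiation, decays with |dt|
    else:
        dw = -a_minus * np.exp(dt / tau)  # depression, decays with |dt|
    return float(np.clip(w + dw, w_min, w_max))
```

In the semi-supervised setting described above, teacher spikes from the readout would shape the post-synaptic spike times, and hence which synapses are potentiated or depressed by updates of this form.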