Neuromorphic computing and spiking neural networks (SNNs) mimic the behavior of biological systems and have drawn interest for their potential to perform cognitive tasks with high energy efficiency. However, factors such as temporal dynamics and spike timing are critical for information processing yet are often ignored by existing works, limiting the performance and applications of neuromorphic computing. On one hand, the lack of effective SNN training algorithms makes it difficult to exploit temporal neural dynamics; many existing algorithms still treat neuron activations statistically. On the other hand, utilizing temporal neural dynamics also poses challenges to hardware design. Synapses exhibit temporal dynamics, serving as memory units that hold historical information, but they are often simplified to a connection with a single weight. Most current models integrate synaptic activations in some storage medium to represent the membrane potential and institute a hard reset of the membrane potential after the neuron emits a spike. This is done for its simplicity in hardware, requiring only a "clear" signal to wipe the storage medium, but it destroys the temporal information stored in the neuron. In this work, we derive an efficient training algorithm for Leaky Integrate-and-Fire neurons, which is capable of training an SNN to learn complex spatio-temporal patterns, and we achieve competitive accuracy on two complex datasets. We also demonstrate the advantage of our model on a novel temporal pattern association task. Co-designed with this algorithm, we have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses that retains critical neural dynamics with reduced complexity. Circuit simulations demonstrate the neuron model's ability to react to temporal spike patterns with an adaptive threshold.