Developmental plasticity plays a vital role in shaping the brain's structure during ongoing learning in response to dynamically changing environments. However, existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms, limiting their ability to learn efficiently, rapidly, and accurately. This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, inspired by the adaptive developmental pruning of dendritic spines, synapses, and neurons according to the "use it or lose it, gradually decay" principle. The proposed DPAP model incorporates multiple biologically realistic mechanisms (such as dendritic spine dynamic plasticity, activity-dependent neural spiking traces, and local synaptic plasticity) together with an adaptive pruning strategy, so that the network structure can be dynamically optimized during learning without any pre-training or retraining. We demonstrate that the proposed DPAP method, applied to deep ANNs and SNNs, can learn efficient network architectures that retain only the relevant, important connections and neurons. Extensive comparative experiments show consistent and remarkable performance and speed boosts with extremely compressed networks on a diverse set of benchmark tasks, especially neuromorphic datasets for SNNs. This work explores how developmental plasticity enables complex deep networks to gradually evolve into brain-like, efficient, and compact structures, eventually achieving state-of-the-art (SOTA) performance for biologically realistic SNNs.
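The core pruning idea can be illustrated with a minimal, hypothetical sketch: each neuron keeps an exponentially decaying spiking trace, and neurons whose trace decays below a threshold are pruned. Note that the decay factor, threshold, warm-up period, and firing rates below are illustrative assumptions for the sketch, not the paper's DPAP hyperparameters.

```python
import numpy as np

# Hypothetical sketch of the "use it or lose it, gradually decay" principle:
# a spike refreshes a neuron's activity trace ("use it"), the trace otherwise
# decays exponentially ("gradually decay"), and neurons whose trace falls
# below a threshold are masked out ("lose it"). All constants are assumed.

rng = np.random.default_rng(0)
n_neurons = 8
decay = 0.9        # per-step trace decay factor (assumed)
threshold = 0.5    # pruning threshold on the trace (assumed)
warmup = 20        # steps before pruning begins (assumed)

trace = np.zeros(n_neurons)
mask = np.ones(n_neurons, dtype=bool)   # True = neuron still in the network

# Simulated activity: neurons 0-3 fire frequently, 4-7 almost never.
fire_prob = np.where(np.arange(n_neurons) < 4, 0.9, 0.02)

for step in range(100):
    spikes = (rng.random(n_neurons) < fire_prob) & mask
    # "Use it": spikes refresh the trace; otherwise it gradually decays.
    trace = decay * trace + spikes
    if step >= warmup:
        # "Lose it": prune neurons whose trace decayed below threshold.
        mask &= trace >= threshold

print("kept neurons:", np.flatnonzero(mask))
```

After the loop, the frequently firing neurons survive while the mostly silent ones are pruned, without any pre-training or retraining pass, which mirrors the abstract's claim that the structure is optimized during learning itself.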