Two main routes of learning methods exist at present: error-driven global learning and neuroscience-oriented local learning. Integrating them into one network may provide complementary learning capabilities for versatile learning scenarios. At the same time, neuromorphic computing holds great promise but still needs plenty of useful algorithms and algorithm-hardware co-designs to fully exploit its advantages. Here, we report a neuromorphic hybrid learning model that introduces a brain-inspired meta-learning paradigm and a differentiable spiking model incorporating neuronal dynamics and synaptic plasticity. The model can meta-learn local plasticity and receive top-down supervision signals for multiscale synergic learning. We demonstrate the advantages of this model on multiple tasks, including few-shot learning, continual learning, and fault-tolerance learning with neuromorphic vision sensors. It achieves significantly higher performance than single-learning methods and shows promise for empowering a revolution in neuromorphic applications. We further implemented the hybrid model on the Tianjic neuromorphic platform by exploiting algorithm-hardware co-designs and demonstrated that the model can fully utilize the neuromorphic many-core architecture to develop a hybrid computation paradigm.
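To make the global-local idea concrete, the following is a minimal, illustrative sketch of how an error-driven (global) weight update can be combined with a meta-scaled local plasticity term in a single spiking layer. It is not the authors' implementation: the LIF dynamics, the eligibility trace, the rate-based gradient proxy, and all names and dimensions (W, eta_local, tau_mem, run_layer, etc.) are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; chosen only for illustration).
n_in, n_out, T = 20, 5, 50
W = 0.1 * rng.standard_normal((n_out, n_in))
eta_local = 0.01 * np.ones((n_out, n_in))  # per-synapse local learning rates (would be meta-learned)
tau_mem, v_th = 10.0, 1.0                  # membrane time constant and firing threshold


def run_layer(spikes_in, W):
    """Simulate LIF dynamics for T steps; return output spikes and a local eligibility trace."""
    v = np.zeros(n_out)
    trace_in = np.zeros(n_in)          # decaying presynaptic activity trace
    elig = np.zeros_like(W)            # Hebbian-style pre/post coincidence accumulator
    spikes_out = np.zeros((T, n_out))
    for t in range(T):
        trace_in = trace_in * np.exp(-1.0 / tau_mem) + spikes_in[t]
        v = v * np.exp(-1.0 / tau_mem) + W @ spikes_in[t]
        s = (v >= v_th).astype(float)  # spike when membrane potential crosses threshold
        v = v * (1.0 - s)              # reset membrane potential after a spike
        elig += np.outer(s, trace_in)  # local plasticity signal: post-spike times pre-trace
        spikes_out[t] = s
    return spikes_out, elig


# One hybrid update: a top-down (global) error term plus a meta-scaled local term.
spikes_in = (rng.random((T, n_in)) < 0.2).astype(float)   # Bernoulli spike-train input
target_rate = np.array([0.3, 0.0, 0.1, 0.0, 0.2])          # arbitrary supervision target

spikes_out, elig = run_layer(spikes_in, W)
rate = spikes_out.mean(axis=0)
err = rate - target_rate                                    # global, error-driven signal

lr_global = 0.05
grad_proxy = np.outer(err, spikes_in.mean(axis=0))          # crude rate-based stand-in for a gradient
W -= lr_global * grad_proxy                                 # global error-driven update
W += eta_local * (elig / T)                                 # local plasticity update, meta-scaled
```

In a full meta-learning loop, eta_local itself would be optimized across tasks (e.g., by differentiating through the inner plasticity updates); here it is held fixed to keep the sketch short.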