Gradient-based first-order adaptive optimization methods such as the Adam optimizer are prevalent in training artificial neural networks, achieving state-of-the-art results. This work asks whether it is viable for biological neural systems to adopt such optimization methods. To this end, we demonstrate a realization of the Adam optimizer using biologically plausible synaptic mechanisms. The proposed learning rule has a clear biological correspondence, runs continuously in time, and achieves performance comparable to Adam's. In addition, we present a new approach, inspired by the predisposition property of synapses observed in neuroscience, to circumvent the biological implausibility of the weight transport problem in backpropagation (BP). Using only local information and requiring no separate training phases, this method establishes and maintains weight symmetry between the forward and backward signaling paths, and is applicable to the proposed biologically plausible Adam learning rule. These mechanisms may shed light on how biological synaptic dynamics facilitate learning.
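For reference, the discrete-time Adam update (Kingma & Ba, 2015) that the proposed learning rule approximates can be sketched as follows; this is the textbook formulation, not the paper's continuous-time synaptic realization:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2, \\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t}, \\
\theta_t &= \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon},
\end{aligned}
$$

where $g_t$ is the gradient of the loss with respect to parameter $\theta$, $m_t$ and $v_t$ are exponential moving averages of the gradient and its elementwise square, and $\alpha$, $\beta_1$, $\beta_2$, $\epsilon$ are hyperparameters.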