The paper proposes a new algorithm called SymBa that aims to achieve more biologically plausible learning than Back-Propagation (BP). The algorithm builds on the Forward-Forward (FF) algorithm, a BP-free method for training neural networks. SymBa improves the FF algorithm's convergence behavior by addressing the problem of asymmetric gradients caused by conflicting convergence directions for positive and negative samples. The algorithm balances the positive and negative losses to improve performance and convergence speed. Furthermore, it modifies the FF algorithm by adding an Intrinsic Class Pattern (ICP) containing class information, preventing the loss of class information during training. The proposed algorithm has the potential to improve our understanding of how the brain learns and processes information and to support the development of more effective and efficient artificial intelligence systems. The paper presents experimental results demonstrating the effectiveness of the SymBa algorithm compared to the FF algorithm and BP.
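To illustrate the balancing idea described above, here is a minimal NumPy sketch contrasting the original FF objective (separate threshold terms for positive and negative goodness, which can produce asymmetric gradients) with a SymBa-style symmetric loss defined on the *difference* of the two goodness values. The function names, the scaling factor `alpha`, and the toy data are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def goodness(activations):
    # FF "goodness": sum of squared activations per sample
    return np.sum(activations ** 2, axis=-1)

def ff_loss(g_pos, g_neg, theta=2.0):
    # Original FF-style loss: two independent terms push positive
    # goodness above a threshold theta and negative goodness below it.
    # Because each term saturates separately, gradients for positive
    # and negative samples can become unbalanced.
    return (np.log1p(np.exp(-(g_pos - theta)))
            + np.log1p(np.exp(g_neg - theta))).mean()

def symba_loss(g_pos, g_neg, alpha=4.0):
    # SymBa-style loss (sketch): a single term on the goodness gap,
    # so positive and negative samples contribute symmetric gradients.
    # alpha is an assumed scaling hyperparameter.
    return np.log1p(np.exp(-alpha * (g_pos - g_neg))).mean()

rng = np.random.default_rng(0)
a_pos = rng.normal(0.5, 1.0, size=(8, 16))  # toy "positive" activations
a_neg = rng.normal(0.0, 1.0, size=(8, 16))  # toy "negative" activations
print(symba_loss(goodness(a_pos), goodness(a_neg)))
```

The key design point is that the symmetric loss depends only on `g_pos - g_neg`, so widening that gap reduces the loss for both sample types at the same rate.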