This paper introduces the application of Exponentially Averaged Momentum Particle Swarm Optimization (EM-PSO) as a derivative-free optimizer for neural networks. EM-PSO retains PSO's major advantages, such as broader search-space exploration and greater robustness to local minima, compared with gradient-descent optimizers such as Adam. Neural-network-based solvers equipped with gradient-based optimization are now widely used to approximate solutions to differential equations. Here, we demonstrate the novelty of EM-PSO in approximating gradients and leverage this property to solve the Schr\"odinger equation for the Particle-in-a-Box problem. We also provide an optimal set of hyper-parameters for our algorithm, supported by mathematical proofs.
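For illustration, a minimal sketch of the exponentially averaged momentum update typical of momentum-based PSO variants is given below. The notation is assumed rather than taken from the abstract: $x_t$ is a particle's position, $V_t$ its velocity, $M_t$ the exponentially averaged momentum, $p_i$ and $g$ the personal and global best positions, $\beta$ the averaging coefficient, $c_1, c_2$ the acceleration coefficients, and $r_1, r_2 \sim U(0,1)$ random draws; the authors' exact formulation may differ.
\begin{align}
  M_{t+1} &= \beta M_t + (1 - \beta)\, V_t, \\
  V_{t+1} &= M_{t+1} + c_1 r_1 \left(p_i - x_t\right) + c_2 r_2 \left(g - x_t\right), \\
  x_{t+1} &= x_t + V_{t+1}.
\end{align}
In this sketch, the exponential averaging plays a role analogous to the inertia term in standard PSO, smoothing the velocity across iterations rather than relying only on its most recent value.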