We propose a new gradient descent algorithm with added stochastic terms, referred to here as ``AdaVar'', for finding the global optimizers of nonconvex optimization problems. A key component of the algorithm is the adaptive tuning of the randomness based on the value of the objective function; in the language of simulated annealing, the temperature is state-dependent. With this, we prove global convergence of the algorithm with an algebraic rate, both in probability and in the parameter space. This is an improvement over the classical rate obtained with a simpler control of the noise term. The convergence proof is carried out for the actual discrete setup of the algorithm. We also present several numerical examples that demonstrate the efficiency and robustness of the algorithm on reasonably complex objective functions.
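To make the mechanism concrete, the following is a minimal Python sketch of gradient descent with a state-dependent stochastic term. The threshold schedule `m_k`, the two noise levels and their algebraic decay rates, and the toy objective are all illustrative assumptions for this sketch, not the parameter choices analyzed in the paper.

```python
import numpy as np

def adavar_sketch(f, grad_f, x0, n_iter=50_000, eta=1e-2, seed=0):
    """Sketch of gradient descent with a state-dependent noise term.

    The noise amplitude plays the role of a temperature that depends on the
    current state: it is kept large while the objective value sits above a
    slowly decreasing threshold m_k (to escape local minima) and small below
    it (to settle into the global minimum). All schedules below are assumed
    forms for illustration only.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        m_k = 1.0 / k**0.5         # decreasing threshold (assumed form)
        sig_hi = 1.0 / k**0.25     # large noise outside the sublevel set
        sig_lo = 0.01 / k**0.25    # small noise inside it
        sigma = sig_lo if f(x) <= m_k else sig_hi
        x = x - eta * grad_f(x) + np.sqrt(eta) * sigma * rng.standard_normal(x.shape)
    return x

# Toy nonconvex objective: global minimum near x = -1 (value ~0),
# spurious local minimum near x = +1 (value ~0.6).
def f(x):
    return float((x[0]**2 - 1.0)**2 + 0.3 * (x[0] + 1.0))

def grad_f(x):
    return np.array([4.0 * x[0] * (x[0]**2 - 1.0) + 0.3])

print(adavar_sketch(f, grad_f, x0=np.array([1.0])))  # typically ends near x = -1
```

Started at the spurious local minimum, the large noise level keeps the iterate exploring because the objective value there stays above the shrinking threshold, while near the global minimum the small noise level lets plain gradient descent take over; this is the state-dependent temperature idea in its simplest form.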