In this study, we propose a global optimization algorithm based on quantizing the energy levels of the objective function of an NP-hard problem. Under the white noise hypothesis, a quantization error with a dense and uniform distribution can be regarded as i.i.d. white noise. Stochastic analysis shows that the proposed algorithm converges weakly under a Lipschitz continuity condition alone, rather than requiring local convergence properties such as a Hessian constraint on the objective function. This shows that the proposed algorithm ensures global optimization via Laplace's condition. Numerical experiments show that the proposed algorithm outperforms conventional learning methods on NP-hard optimization problems such as the traveling salesman problem.
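As a minimal sketch of the white noise hypothesis mentioned above (not the paper's algorithm), the snippet below quantizes a toy Lipschitz-continuous objective with an assumed step `DELTA` and checks empirically that the quantization error is bounded by `DELTA/2` and has near-zero mean, as expected of uniform i.i.d. noise; the objective function and step size are illustrative choices, not from the paper.

```python
import random

random.seed(0)
DELTA = 0.01  # assumed quantization step (illustrative)

def objective(x):
    # Toy smooth (Lipschitz-continuous on a bounded domain) objective.
    return (x - 1.7) ** 2 + 0.3 * x

def quantize(v, delta=DELTA):
    # Round the objective value onto the quantized energy levels.
    return delta * round(v / delta)

# Sample the quantization error e = f(x) - Q(f(x)) over random inputs.
errors = []
for _ in range(10000):
    v = objective(random.uniform(-5.0, 5.0))
    errors.append(v - quantize(v))

mean_err = sum(errors) / len(errors)
max_err = max(abs(e) for e in errors)
print(f"mean error: {mean_err:.5f}, max |error|: {max_err:.5f}")
```

For a uniform error on [-DELTA/2, DELTA/2], the sample mean shrinks toward zero at rate 1/sqrt(n), which is consistent with treating the error as white noise in the stochastic analysis.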