In this paper, we propose a quantized learning equation with a monotonically increasing quantization resolution, together with a stochastic analysis of the proposed algorithm. Under the white noise hypothesis, which applies to quantization errors with a dense and uniform distribution, we can regard the quantization error as i.i.d.\ white noise. Based on this, we show that the learning equation with monotonically increasing quantization resolution converges weakly, i.e., in the sense of distributions. Our analysis shows that global optimization is attainable on a domain satisfying the Lipschitz condition, rather than requiring local convergence conditions such as a Hessian constraint on the objective function.
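To make the idea concrete, the following is a minimal sketch of gradient descent whose iterates are uniformly quantized on a grid that refines monotonically over time; the function names, the resolution schedule, and the test objective are illustrative assumptions, not the paper's exact formulation. As the grid step shrinks, the quantization error behaves, under the white noise hypothesis, like i.i.d.\ noise with decaying variance, producing an annealing-like effect.

```python
import numpy as np

def quantize(x, step):
    """Uniform quantization of x onto a grid with the given step size."""
    return step * np.round(x / step)

def quantized_gd(grad, w0, lr=0.1, steps=200, b=2.0):
    """Gradient descent with quantized updates whose resolution increases
    monotonically: the grid step shrinks as b**(-p(t)).

    The schedule p(t) = floor(log2(t + 1)) is an illustrative assumption.
    """
    w = w0
    for t in range(1, steps + 1):
        # Finer grid over time => smaller quantization error, which (under
        # the white noise hypothesis) acts as i.i.d. noise whose variance
        # decays with the iteration count.
        step = b ** (-np.floor(np.log2(t + 1)))
        w = quantize(w - lr * grad(w), step)
    return w

# Example: minimize f(w) = (w - 3)^2 from a distant starting point.
w_star = quantized_gd(lambda w: 2.0 * (w - 3.0), w0=-10.0)
print(w_star)  # approaches 3 as the quantization grid refines
```

In this sketch the early, coarse quantization injects relatively large perturbations that can carry the iterate out of poor regions, while the monotone refinement of the grid lets the iterate settle, which mirrors the weak-convergence behavior stated above.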