It is known that when statistical models are singular, i.e., when the Fisher information matrix at the true parameter is degenerate, the fixed step-size gradient descent algorithm takes a polynomial number of steps in the sample size $n$ to converge to a final statistical radius around the true parameter, which can be unsatisfactory in practice. To improve this computational complexity, we consider utilizing second-order information in the design of optimization algorithms. Specifically, we study the normalized gradient descent (NormGD) algorithm for parameter estimation in parametric statistical models, a variant of gradient descent in which the step size is scaled by the maximum eigenvalue of the Hessian matrix of the empirical loss function. When the population loss function, i.e., the limit of the empirical loss function as $n$ goes to infinity, is homogeneous in all directions, we demonstrate that the NormGD iterates reach a final statistical radius around the true parameter after a logarithmic number of iterations in $n$. Therefore, for fixed dimension $d$, the NormGD algorithm achieves the optimal overall computational complexity $\mathcal{O}(n)$ to reach the final statistical radius. This is cheaper than the complexity of the fixed step-size gradient descent algorithm, which is of order $\mathcal{O}(n^{\tau})$ for some $\tau > 1$, to reach the same statistical radius. We illustrate the general theory under two statistical models, generalized linear models and mixture models, and experimental results support our theoretical predictions.
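Writing $\mathcal{L}_n$ for the empirical loss, the NormGD update described above takes the form $\theta_{t+1} = \theta_t - \frac{\eta}{\lambda_{\max}(\nabla^2 \mathcal{L}_n(\theta_t))} \nabla \mathcal{L}_n(\theta_t)$, where $\lambda_{\max}(\cdot)$ denotes the maximum eigenvalue. Below is a minimal sketch of this update, not the paper's implementation: all function names are hypothetical, and the toy loss $L(\theta) = \|\theta\|^4$ is chosen only because it is homogeneous with a Hessian that degenerates at the true parameter $\theta^* = 0$, mimicking the singular setting.

```python
import numpy as np

# Toy homogeneous loss L(theta) = ||theta||^4 (hypothetical example,
# not from the paper); its Hessian vanishes at theta* = 0.

def grad(theta):
    # gradient of ||theta||^4 is 4 ||theta||^2 theta
    return 4.0 * np.dot(theta, theta) * theta

def hess(theta):
    # Hessian of ||theta||^4 is 4 ||theta||^2 I + 8 theta theta^T
    d = theta.shape[0]
    return 4.0 * np.dot(theta, theta) * np.eye(d) + 8.0 * np.outer(theta, theta)

def norm_gd(theta0, eta=1.0, n_iters=50, eps=1e-12):
    """Gradient step scaled by the maximum Hessian eigenvalue (NormGD sketch)."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_iters):
        lam_max = np.linalg.eigvalsh(hess(theta))[-1]  # ascending order; take max
        theta = theta - (eta / max(lam_max, eps)) * grad(theta)
    return theta

def fixed_gd(theta0, eta=0.1, n_iters=50):
    """Fixed step-size gradient descent, for comparison."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_iters):
        theta = theta - eta * grad(theta)
    return theta

theta0 = np.array([1.0, -0.5])
print("NormGD  :", np.linalg.norm(norm_gd(theta0)))   # contracts geometrically
print("Fixed GD:", np.linalg.norm(fixed_gd(theta0)))  # decays much more slowly
```

On this loss the Hessian normalization makes the NormGD step $\theta_{t+1} = (1 - \eta/3)\,\theta_t$, a geometric contraction independent of $\|\theta_t\|$, whereas the fixed step-size iterates contract by a factor that shrinks as $\|\theta_t\| \to 0$; this illustrates, in miniature, the logarithmic versus polynomial iteration counts contrasted in the abstract.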