Hamilton and Moitra (2021) showed that, in certain regimes, it is not possible to accelerate Riemannian gradient descent in the hyperbolic plane if we restrict ourselves to algorithms which make queries in a (large) bounded domain and which receive gradients and function values corrupted by a (small) amount of noise. We show that acceleration remains unachievable for any deterministic algorithm which receives exact gradient and function-value information (unbounded queries, no noise). Our results hold for the classes of strongly and nonstrongly geodesically convex functions, and for a large class of Hadamard manifolds including hyperbolic spaces and the symmetric space $\mathrm{SL}(n) / \mathrm{SO}(n)$ of positive definite $n \times n$ matrices of determinant one. This cements a surprising gap between the complexity of convex optimization and geodesically convex optimization: for hyperbolic spaces, Riemannian gradient descent is optimal on the class of smooth and strongly geodesically convex functions, in the regime where the condition number scales with the radius of the optimization domain. The key idea for proving the lower bound is to perturb the hard functions of Hamilton and Moitra (2021) with sums of bump functions chosen by a resisting oracle.