We provide a lower bound showing that the $O(1/k)$ convergence rate of the NoLips method (a.k.a. Bregman Gradient) is optimal for the class of functions satisfying the $h$-smoothness assumption. This assumption, also known as relative smoothness, appeared in recent developments around the Bregman Gradient method, where acceleration remains an open issue. Along the way, we show how to constructively obtain the corresponding worst-case functions by extending the computer-assisted performance estimation framework of Drori and Teboulle (Mathematical Programming, 2014) to Bregman first-order methods and to the classes of differentiable and strictly convex functions.
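To make the object of study concrete, here is a minimal sketch of one NoLips (Bregman gradient) iteration with the Boltzmann–Shannon entropy $h(x) = \sum_i x_i \log x_i$ as reference function, for which the mirror update has a closed form. The objective below (a Kullback–Leibler divergence to a fixed positive vector `a`) is our own illustrative choice, not an example from the paper; it is $1$-smooth relative to the entropy kernel since $\nabla^2 f = \nabla^2 h = \mathrm{diag}(1/x_i)$.

```python
import numpy as np

def nolips_step(x, grad, lam):
    """One NoLips (Bregman gradient) step with the entropy kernel
    h(x) = sum_i x_i log x_i.  The mirror update
        x+ = (grad h)^{-1}( grad h(x) - lam * grad f(x) )
    reduces to the multiplicative closed form below."""
    return x * np.exp(-lam * grad)

# Illustrative objective (our choice, not from the paper): the KL
# divergence f(x) = sum_i [ x_i log(x_i / a_i) - x_i + a_i ], minimized
# at x = a, and 1-smooth relative to the entropy reference function.
a = np.array([2.0, 0.5])
f = lambda x: np.sum(x * np.log(x / a) - x + a)
grad_f = lambda x: np.log(x / a)

x = np.ones(2)
for _ in range(30):
    x = nolips_step(x, grad_f(x), lam=0.5)

print(np.round(x, 6))  # iterates approach the minimizer a
```

With step size $\lambda = 1/L = 1$ this particular objective is solved in a single step ($x^+ = x \cdot (a/x) = a$); the loop above uses $\lambda = 0.5$ so that the geometric approach to the minimizer is visible. The paper's lower bound concerns the worst case over the whole relatively smooth class, where only the $O(1/k)$ rate is attainable.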