Projection-free conditional gradient (CG) methods are the algorithms of choice for constrained optimization setups in which projections are often computationally prohibitive but linear optimization over the constraint set remains computationally feasible. Unlike in projection-based methods, globally accelerated convergence rates are in general unattainable for CG. However, a very recent work on Locally accelerated CG (LaCG) has demonstrated that local acceleration for CG is possible for many settings of interest. The main downside of LaCG is that it requires knowledge of the smoothness and strong convexity parameters of the objective function. We remove this limitation by introducing a novel, Parameter-Free Locally accelerated CG (PF-LaCG) algorithm, for which we provide rigorous convergence guarantees. Our theoretical results are complemented by numerical experiments, which demonstrate local acceleration and showcase the practical improvements of PF-LaCG over non-accelerated algorithms, both in terms of iteration count and wall-clock time.
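To make the abstract's premise concrete, here is a minimal sketch of a basic conditional gradient (Frank-Wolfe) step over the probability simplex, where the linear minimization oracle reduces to an argmin over gradient coordinates and no projection is ever needed. The quadratic objective and step-size rule below are illustrative assumptions, not the PF-LaCG method itself.

```python
import numpy as np

# Illustrative problem: minimize 0.5*||Ax - b||^2 over the probability
# simplex. The LMO over the simplex picks the vertex (coordinate) with
# the smallest gradient entry -- far cheaper than projecting onto the set.
rng = np.random.default_rng(0)
n = 10
A = rng.standard_normal((20, n))
b = rng.standard_normal(20)

def grad(x):
    # Gradient of 0.5*||Ax - b||^2
    return A.T @ (A @ x - b)

x = np.ones(n) / n  # start at the simplex barycenter
for t in range(200):
    g = grad(x)
    s = np.zeros(n)
    s[np.argmin(g)] = 1.0        # LMO: best simplex vertex for linear model
    gamma = 2.0 / (t + 2.0)      # standard O(1/t) open-loop step size
    x = (1 - gamma) * x + gamma * s  # convex combination keeps x feasible
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained by construction; this is the projection-free property that CG methods, including LaCG and PF-LaCG, build on.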