The Frank-Wolfe algorithm (FW) is a popular projection-free alternative for solving large-scale constrained optimization problems. However, the FW algorithm suffers from a sublinear convergence rate when minimizing a smooth convex function over a compact convex set. Thus, exploring techniques that yield a faster convergence rate becomes crucial. A classic approach to obtaining faster rates is to combine previous iterates to form the next iterate. In this work, we extend this approach to the FW setting and show that the optimal way to combine the past iterates is through a set of orthogonal Jacobi polynomials. We also propose a polynomial-based acceleration technique, referred to as Jacobi polynomial accelerated FW, which combines the current iterate with the past iterate using combining weights related to the Jacobi recursion. By carefully choosing the parameters of the Jacobi polynomials, we obtain a faster sublinear convergence rate. We provide numerical experiments on real datasets to demonstrate the efficacy of the proposed algorithm.
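The general structure described above can be illustrated with a minimal sketch: a vanilla Frank-Wolfe loop whose output is mixed with the previous iterate via a convex combination, so the update stays projection-free. The combining weight `beta` below is a hypothetical placeholder, not the paper's actual Jacobi-recursion weights, and the simplex linear minimization oracle is chosen only as a concrete example of a compact convex constraint set.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    returns the vertex s minimizing <grad, s>."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def combined_fw(grad_f, x0, n_iters=200):
    """Frank-Wolfe with a past-iterate combination step (sketch).

    gamma is the classic FW step size 2/(t+2); beta is a hypothetical
    combining weight standing in for the Jacobi-recursion weights of the
    paper. Because the combination is convex, every iterate remains in
    the feasible set -- no projection is needed.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for t in range(n_iters):
        s = lmo_simplex(grad_f(x))
        gamma = 2.0 / (t + 2.0)            # standard FW step size
        x_fw = x + gamma * (s - x)         # vanilla FW update (stays feasible)
        beta = 1.0 / (t + 3.0)             # placeholder combining weight
        x_next = (1.0 - beta) * x_fw + beta * x_prev  # mix with past iterate
        x_prev, x = x, x_next
    return x

# Example: minimize ||x - c||^2 over the simplex, c an interior point.
c = np.array([0.2, 0.3, 0.5])
x_star = combined_fw(lambda x: 2.0 * (x - c), np.array([1.0, 0.0, 0.0]))
```

The convex combination `(1 - beta) * x_fw + beta * x_prev` is what keeps the scheme projection-free: any convex mixing of feasible points is feasible, so the only structural assumption is access to the linear minimization oracle.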