First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization over the last 20 years. The rapid development of this important class of algorithms is motivated by success stories reported in various applications, most importantly machine learning, signal processing, imaging, and control theory. First-order methods can deliver low-accuracy solutions at low computational cost, which makes them an attractive set of tools for large-scale optimization problems. In this survey we cover a number of key developments in gradient-based optimization methods. This includes non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally, we survey recent developments within the class of projection-free methods, as well as proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several optimization algorithms.
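For concreteness, a standard statement of the proximal gradient update for the composite model $\min_x f(x) + g(x)$, with $f$ smooth and $g$ admitting a tractable proximal operator, is sketched below; the step size $\gamma > 0$ and the splitting into $f + g$ follow the usual conventions and are assumptions of this sketch rather than notation fixed by the abstract:

$$x_{k+1} = \operatorname{prox}_{\gamma g}\bigl(x_k - \gamma \nabla f(x_k)\bigr), \qquad \operatorname{prox}_{\gamma g}(y) = \operatorname*{arg\,min}_{x} \Bigl\{\, g(x) + \tfrac{1}{2\gamma}\|x - y\|_2^2 \,\Bigr\}.$$

When $g$ is the indicator function of a convex set, this step reduces to projected gradient descent; the non-Euclidean extensions mentioned above replace the squared Euclidean norm in the proximal operator by a Bregman distance.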