We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low-cost iteration complexity enjoyed by first-order algorithms renders them particularly relevant for applications in machine learning and large-scale data analysis. Relying on sum-of-squares (SOS) optimization, we introduce a hierarchy of semidefinite programs that give increasingly better convergence bounds for higher levels of the hierarchy. Illustrating the power of the SOS hierarchy, we show that the (dual of the) first level corresponds to the Performance Estimation Problem (PEP) introduced by Drori and Teboulle [Math. Program., 145(1):451--482, 2014], a powerful framework for determining convergence rates of first-order optimization algorithms. Consequently, many results obtained within the PEP framework can be reinterpreted as degree-1 SOS proofs, and thus the SOS framework provides a promising new approach for certifying improved rates of convergence by means of higher-order SOS certificates. To determine analytical rate bounds, in this work we use the first level of the SOS hierarchy and derive new results for noisy gradient descent with inexact line search methods (Armijo, Wolfe, and Goldstein).
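To give a concrete sense of the kind of semidefinite program that arises at the first level of the hierarchy (equivalently, the PEP of Drori and Teboulle), the sketch below sets up a worst-case performance SDP for plain gradient descent with step size 1/L on L-smooth convex functions, starting from ||x_0 - x*|| <= R. This is a minimal illustrative sketch, not the formulation used in the paper: the Gram-matrix lifting, the smooth-convex interpolation conditions, the cvxpy modeling layer, and the SCS solver are all assumptions chosen for exposition, and the example treats exact, noiseless gradient steps rather than the noisy gradient descent with inexact line searches analyzed in the paper.

# PEP-style worst-case SDP sketch (illustrative only, not the paper's formulation):
# maximize f(x_N) - f(x*) over L-smooth convex functions and N steps of gradient
# descent with step 1/L, subject to ||x_0 - x*|| <= R.  Requires cvxpy (with SCS).
import numpy as np
import cvxpy as cp

L, R, N = 1.0, 1.0, 1          # smoothness constant, initial radius, number of steps
gamma = 1.0 / L                # constant gradient step size

star = N + 1                   # index of the minimizer x*; iterates are indexed 0..N
n_pts = N + 2
dim = 1 + (N + 1)              # basis vectors: (x_0 - x*, g_0, ..., g_N), with g_* = 0

def x_coef(i):
    """Coefficients of x_i - x* in the basis (x_0 - x*, g_0, ..., g_N)."""
    c = np.zeros(dim)
    if i == star:
        return c
    c[0] = 1.0
    for k in range(i):         # x_i = x_0 - gamma * (g_0 + ... + g_{i-1})
        c[1 + k] -= gamma
    return c

def g_coef(i):
    """Coefficients of the gradient g_i in the same basis (g_* = 0)."""
    c = np.zeros(dim)
    if i != star:
        c[1 + i] = 1.0
    return c

G = cp.Variable((dim, dim), PSD=True)   # Gram matrix of the basis vectors
F = cp.Variable(n_pts)                  # F[i] = f(x_i) - f(x*)

constraints = [F[star] == 0,
               x_coef(0) @ G @ x_coef(0) <= R ** 2]

# Interpolation conditions for L-smooth convex functions, for all ordered pairs i != j:
# f_i >= f_j + <g_j, x_i - x_j> + (1 / (2L)) * ||g_i - g_j||^2
for i in range(n_pts):
    for j in range(n_pts):
        if i == j:
            continue
        dx = x_coef(i) - x_coef(j)
        dg = g_coef(i) - g_coef(j)
        constraints.append(
            F[i] >= F[j] + g_coef(j) @ G @ dx + (1.0 / (2 * L)) * (dg @ G @ dg)
        )

# The worst-case performance after N steps is the optimal value of this linear SDP.
prob = cp.Problem(cp.Maximize(F[N]), constraints)
prob.solve(solver=cp.SCS)
print(prob.value)   # close to L * R**2 / (4 * N + 2), i.e. about 0.1667 for N = 1

The decision variables here are the Gram matrix of iterates and gradients together with the function values, so the problem is a linear SDP; in the language of the abstract, its dual corresponds to a degree-1 SOS certificate, and higher levels of the hierarchy would replace it with higher-degree certificates.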