We introduce and analyze a new family of algorithms that generalizes and unifies both the mirror descent and the dual averaging algorithms. The unified analysis rests on a generalized Bregman divergence that uses subgradients in place of gradients. Our approach is general enough to encompass classical settings in convex optimization, online learning, and variational inequalities such as saddle-point problems.
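For concreteness, here is one standard way to write the objects the abstract refers to; the notation below ($f$, $h$, $\mathcal{X}$, $\gamma_t$) is ours and need not match the paper's. For a convex function $h$ and a chosen subgradient $g_y \in \partial h(y)$, the subgradient-based Bregman divergence is
\[
D_h(x, y; g_y) \;=\; h(x) - h(y) - \langle g_y,\, x - y \rangle,
\]
which reduces to the classical Bregman divergence when $h$ is differentiable at $y$. Writing $g_t \in \partial f(x_t)$ for a subgradient of the objective, $\gamma_t > 0$ for the step size, and $g_{x_t} \in \partial h(x_t)$ for the subgradient retained at the current iterate, the two algorithms being unified take their familiar forms
\[
x_{t+1} \in \operatorname*{arg\,min}_{x \in \mathcal{X}} \big\{ \gamma_t \langle g_t, x \rangle + D_h(x, x_t; g_{x_t}) \big\} \quad \text{(mirror descent)},
\]
\[
x_{t+1} \in \operatorname*{arg\,min}_{x \in \mathcal{X}} \Big\{ \Big\langle \textstyle\sum_{s=1}^{t} \gamma_s g_s,\, x \Big\rangle + h(x) \Big\} \quad \text{(dual averaging)}.
\]
When $h$ is nondifferentiable (for instance, when the indicator of $\mathcal{X}$ is folded into it), $\partial h$ may contain many subgradients, and the choice of $g_{x_t}$ carried from one iteration to the next matters; this freedom is what allows a single update rule to recover both algorithms as special cases.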