In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial derivatives, or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and an accelerated random derivative-free method and, using our framework, provide versions of these methods for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
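To make the inexact-oracle setting concrete, here is a minimal illustrative sketch (not the paper's accelerated method): plain randomized coordinate descent on a smooth convex quadratic, where the oracle returns each partial derivative corrupted by bounded noise of magnitude `delta`. All names (`inexact_coord_grad`, `rcd_inexact`, `delta`) are hypothetical and chosen for illustration only.

```python
import numpy as np

# Illustrative sketch, NOT the paper's accelerated scheme: randomized
# coordinate descent on f(x) = 0.5 * x^T A x - b^T x, with an inexact
# partial-derivative oracle (true derivative plus bounded noise).

def inexact_coord_grad(A, b, x, i, delta, rng):
    """i-th partial derivative of f, corrupted by noise bounded by delta."""
    return A[i] @ x - b[i] + delta * rng.uniform(-1.0, 1.0)

def rcd_inexact(A, b, x0, n_iters=5000, delta=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.copy()
    L = np.diag(A)                    # coordinate Lipschitz constants: L_i = A_ii
    for _ in range(n_iters):
        i = rng.integers(len(b))      # uniform coordinate (block) sampling
        x[i] -= inexact_coord_grad(A, b, x, i, delta, rng) / L[i]
    return x

# Small well-conditioned SPD test problem.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5.0 * np.eye(5)
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)
x_hat = rcd_inexact(A, b, np.zeros(5))
err = np.linalg.norm(x_hat - x_star)
```

With small oracle noise, the iterates approach the minimizer up to an error floor proportional to `delta`; the accelerated versions studied in the paper improve the iteration-complexity dependence on the problem's condition number.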