We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function - as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods - like UnixGrad or AcceleGrad - is not possible, especially in the presence of randomness and uncertainty. The proposed method - which we call adaptive mirror descent (AdaMir) - aims to close this gap by concurrently achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
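For readers unfamiliar with the relative setting, the following standard definition from the NoLips literature (supplied here as background for illustration, not a restatement of the paper's own notation) makes precise what "smooth relative to a reference Bregman function" means: a differentiable convex function $f$ is $L$-smooth relative to a convex reference function $h$ if, for all $x, y$ in the domain,
\[
f(x) \;\le\; f(y) + \langle \nabla f(y),\, x - y \rangle + L\, D_h(x, y),
\qquad\text{where}\qquad
D_h(x, y) := h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle
\]
is the Bregman divergence induced by $h$. Taking $h = \tfrac{1}{2}\|\cdot\|_2^2$ recovers ordinary Euclidean $L$-smoothness, whereas singular objectives (e.g., Poisson inverse problems) can satisfy the condition relative to non-Euclidean choices such as Burg's entropy $h(x) = -\sum_i \log x_i$, even when no global Lipschitz smoothness constant exists.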