Goldstein's 1977 idealized iteration for minimizing a Lipschitz objective fixes a distance (the step size) and relies on a certain approximate subgradient. That "Goldstein subgradient" is the shortest convex combination of objective gradients at points within that distance of the current iterate. A recent implementable Goldstein-style algorithm admits a remarkable complexity analysis (Zhang et al., 2020), and a more sophisticated variant (Davis and Jiang, 2022) leverages typical objective geometry to force near-linear convergence. To explore such methods, we introduce a new modulus, based on Goldstein subgradients, that robustly measures the slope of a Lipschitz function. We relate near-linear convergence of Goldstein-style methods to linear growth of this modulus at minimizers. We illustrate the idea computationally with a simple heuristic for Lipschitz minimization.
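To make the mechanism concrete, here is a minimal sketch of one Goldstein-style step, not the algorithm of Zhang et al. or of Davis and Jiang: sample gradients at random points within the step-size distance delta of the current iterate, approximate the shortest convex combination of those gradients, and move a step of length delta against it. The sampling scheme, the sample count, the Frank-Wolfe simplex solver, and the test function are all illustrative assumptions.

```python
import numpy as np

def min_norm_convex_combination(G, iters=500):
    """Approximate the shortest convex combination of the rows of G
    by Frank-Wolfe: minimize ||G.T @ w||^2 over w >= 0, sum(w) = 1."""
    m = G.shape[0]
    w = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = 2.0 * G @ (G.T @ w)     # gradient in w of the squared norm
        j = int(np.argmin(grad))       # best simplex vertex
        gamma = 2.0 / (k + 2.0)        # standard Frank-Wolfe step size
        w *= 1.0 - gamma
        w[j] += gamma
    return G.T @ w                     # approximate Goldstein subgradient

def goldstein_step(grad_f, x, delta, n_samples=30, rng=None):
    """One heuristic Goldstein-style step: sample gradients in the
    delta-ball around x, form the approximate shortest convex
    combination g, and move distance delta along -g (if g is nonzero)."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    # points drawn uniformly from the ball of radius delta around x
    U = rng.standard_normal((n_samples, n))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    radii = delta * rng.random(n_samples) ** (1.0 / n)
    pts = np.vstack([x, x + radii[:, None] * U])
    G = np.array([grad_f(p) for p in pts])
    g = min_norm_convex_combination(G)
    ng = np.linalg.norm(g)
    return x if ng < 1e-12 else x - delta * g / ng

# Demo on a simple nonsmooth Lipschitz function, f(x) = |x1| + 2|x2|,
# whose gradient exists almost everywhere.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad_f = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
for _ in range(80):
    x = goldstein_step(grad_f, x, delta=0.05, rng=rng)
print(x, f(x))   # the iterate stalls within roughly delta of the minimizer 0
```

Near the minimizer the sampled gradients include all four sign patterns (±1, ±2), whose convex hull contains zero, so the min-norm combination collapses and the iteration stalls: exactly the Goldstein-stationarity behavior the step size delta is meant to certify. The min-norm subproblem is solved crudely here by Frank-Wolfe; a quadratic-programming solver over the simplex would be the natural choice in practice.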