We present NCGD, a method for constrained nonsmooth convex optimization. In each iteration, NCGD finds the best norm-constrained descent direction by considering the worst-case bound over all local subgradients. We prove several global convergence rates for NCGD. For well-behaved nonsmooth functions (weakly smooth and Lipschitz continuous), NCGD converges in $O(\epsilon^{-1})$ iterations, where $\epsilon$ is the desired optimality gap. For strongly convex, weakly smooth functions, NCGD converges in $O(\epsilon^{-0.5})$ iterations. Furthermore, if the function is strongly convex and smooth, then NCGD achieves linear convergence, i.e., $O(\log(1/\epsilon))$ iterations. The overall efficiency of NCGD depends on how efficiently the per-iteration minimax problem over the subdifferential of the objective function can be solved.
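As a rough illustration of the per-iteration minimax step described above, the sketch below computes a norm-constrained steepest-descent direction when the local subdifferential is approximated by a finite sample of subgradients. It uses the standard duality fact that $\min_{\|d\|\le 1}\max_{g\in\mathrm{conv}(G)}\langle g,d\rangle$ is attained at $d=-g^\ast/\|g^\ast\|$, where $g^\ast$ is the minimum-norm element of $\mathrm{conv}(G)$; the finite sampling, the function name, and the Frank-Wolfe solver are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def steepest_descent_direction(subgradients, iters=1000, tol=1e-10):
    """Illustrative sketch (not the paper's method): given a finite sample G of
    local subgradients, approximately solve
        min_{||d|| <= 1}  max_{g in conv(G)}  <g, d>
    by finding the minimum-norm element g* of conv(G) and returning -g*/||g*||."""
    G = np.asarray(subgradients, dtype=float)   # shape (m, n): m subgradients in R^n
    m = G.shape[0]
    lam = np.full(m, 1.0 / m)                   # convex weights on the simplex

    # Frank-Wolfe on f(lam) = 0.5 * ||G^T lam||^2 over the probability simplex.
    for k in range(iters):
        g_star = G.T @ lam                      # current point in conv(G)
        grad = G @ g_star                       # gradient of f w.r.t. lam
        i = np.argmin(grad)                     # best simplex vertex (linear oracle)
        step = 2.0 / (k + 2.0)                  # standard Frank-Wolfe step size
        lam *= (1.0 - step)
        lam[i] += step

    g_star = G.T @ lam
    norm = np.linalg.norm(g_star)
    if norm < tol:
        return np.zeros_like(g_star)            # 0 in conv(G): approximately stationary
    return -g_star / norm


if __name__ == "__main__":
    # Example: subgradients of f(x) = |x_1| + |x_2| near the point (1, -0.5).
    G = np.array([[1.0, -1.0]])
    print(steepest_descent_direction(G))        # points opposite the (unique) subgradient
```

In this toy setting the direction is simply the normalized negative subgradient; with several sampled subgradients the same routine returns the direction that minimizes the worst-case inner product over their convex hull.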