Subgradient methods are the natural extension of classical gradient descent from smooth to non-smooth convex optimization problems. In general, however, they exhibit slow convergence rates and require decreasing step-sizes to converge. In this paper we propose a subgradient method with constant step-size for composite convex objectives with $\ell_1$-regularization. If the smooth term is strongly convex, we establish a linear convergence rate for the function values. This result relies on a careful choice of the element of the subdifferential used for the update, and on suitable corrective actions when regions of non-differentiability are crossed. We then propose an accelerated version of the algorithm, based on conservative inertial dynamics and on an adaptive restart strategy. Finally, we test the performance of our algorithms on some strongly and non-strongly convex examples.
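To make the setting concrete, below is a minimal Python sketch of a constant step-size subgradient iteration for an $\ell_1$-regularized least-squares objective, together with a heavy-ball-style inertial variant with a function-value adaptive restart. The selection rule for the subgradient at zero coordinates, the sign-flip handling of crossings of the non-differentiability region, and the restart test are illustrative assumptions, not the exact rules analyzed in the paper.

```python
import numpy as np

def composite_subgradient(x, grad_g, lam):
    """Subgradient of g(x) + lam * ||x||_1 at x.

    Away from zero, d|x_i| = sign(x_i). At x_i == 0 the subdifferential
    is [-1, 1]; here we pick the element of minimal norm (an illustrative
    selection rule, not necessarily the paper's choice).
    """
    s = np.sign(x)
    zero = x == 0
    s[zero] = np.clip(-grad_g[zero] / lam, -1.0, 1.0)
    return grad_g + lam * s

def subgradient_method(A, b, lam, step, iters):
    """Constant step-size subgradient method for 0.5*||Ax-b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = composite_subgradient(x, A.T @ (A @ x - b), lam)
        x_new = x - step * g
        # Illustrative handling of crossings of the non-differentiability
        # region: coordinates whose sign flips are snapped to zero.
        x_new[np.sign(x_new) * np.sign(x) < 0] = 0.0
        x = x_new
    return x

def inertial_subgradient_method(A, b, lam, step, beta, iters):
    """Heavy-ball-style inertial variant with a function-value restart
    (a common adaptive-restart heuristic; the paper's conservative
    inertial dynamics are not reproduced here)."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
    x = x_prev = np.zeros(A.shape[1])
    for _ in range(iters):
        y = x + beta * (x - x_prev)                  # inertial extrapolation
        g = composite_subgradient(y, A.T @ (A @ y - b), lam)
        x_prev, x = x, y - step * g
        if f(x) > f(x_prev):                         # adaptive restart:
            x_prev = x                               # reset the momentum
    return x
```

For instance, with `A, b = np.random.default_rng(0).standard_normal((50, 20)), np.random.default_rng(1).standard_normal(50)`, `lam = 0.1`, `step = 1e-3`, and `beta = 0.9`, a few thousand iterations of either routine drive the objective down; a full-column-rank `A` makes the smooth term strongly convex, which is the regime where the paper's linear rate applies.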