Finding a solution to the linear system $Ax = b$ with various minimization properties arises in many engineering and computer science applications, including compressed sensing, image processing, and machine learning. In the age of big data, stochastic optimization algorithms have become increasingly important because of their scalability to problems of unprecedented size. This paper focuses on the problem of minimizing a strongly convex objective function subject to linear constraints. We consider the dual formulation of this problem and adopt stochastic coordinate descent to solve it. The proposed algorithmic framework, called fast stochastic dual coordinate descent, utilizes an adaptive variation of Polyak's heavy ball momentum and user-defined distributions for sampling. Our adaptive heavy ball momentum technique efficiently updates the parameters using iterative information, overcoming the limitation of the standard heavy ball momentum method, which requires prior knowledge of certain parameters, such as the singular values of a matrix. We prove that, under the strong admissibility assumption on the objective function, the proposed method converges linearly in expectation. By varying the sampling matrix, we recover a comprehensive array of well-known algorithms as special cases, including the randomized sparse Kaczmarz method, the randomized regularized Kaczmarz method, the linearized Bregman iteration, and a variant of the conjugate gradient (CG) method. Numerical experiments are provided to confirm our results.
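As a concrete illustration of one special case mentioned above, the sketch below implements a randomized Kaczmarz iteration augmented with a heavy ball momentum term. This is a minimal sketch under assumptions of our own, not the paper's algorithm: it assumes a consistent system, samples rows with probability proportional to their squared norms, and uses a fixed momentum parameter `beta` in place of the adaptive rule proposed in the paper; the function name `kaczmarz_heavy_ball` is hypothetical.

```python
import numpy as np

def kaczmarz_heavy_ball(A, b, beta=0.4, iters=5000, seed=0):
    """Randomized Kaczmarz with a fixed heavy-ball momentum term.

    Illustrative sketch only: rows are sampled with probability
    proportional to their squared norms, and each step combines a
    projection onto the sampled hyperplane with a momentum correction.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)      # ||a_i||^2 for each row
    probs = row_norms / row_norms.sum()          # user-defined sampling distribution
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        residual = A[i] @ x - b[i]
        step = (residual / row_norms[i]) * A[i]  # Kaczmarz projection step
        x_next = x - step + beta * (x - x_prev)  # heavy-ball update
        x_prev, x = x, x_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = rng.standard_normal(50)
    b = A @ x_true                               # consistent right-hand side
    x_hat = kaczmarz_heavy_ball(A, b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

With a constant `beta` the iteration already converges linearly in expectation on consistent systems; the adaptive momentum described in the abstract replaces this hand-tuned constant with a parameter computed from iterative information.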