In machine learning and data science, we often consider the efficiency of solving problems. In sparse estimation, such as the fused lasso and convex clustering, we apply either the proximal gradient method or the alternating direction method of multipliers (ADMM) to solve the problem. The latter is time-consuming because it involves matrix inversion, whereas efficient methods such as FISTA (the fast iterative shrinkage-thresholding algorithm) have been developed for the former. This paper proposes a general method for converting an ADMM solution to the proximal gradient method, assuming that the constraints and objectives are strongly convex. We then apply it to sparse estimation problems, such as sparse convex clustering and trend filtering, and show by numerical experiments that the conversion yields a significant improvement in efficiency.
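As background for the comparison above, the following is a minimal sketch of the proximal gradient method (ISTA) applied to the lasso, the simplest sparse-estimation problem; each iteration needs only a gradient step and an elementwise soft-thresholding, with no matrix inversion. The function names and the fixed step-size rule are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=2000):
    # ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    x = np.zeros(A.shape[1])
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient.
    eta = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - eta * grad, eta * lam)  # proximal step
    return x
```

FISTA adds a momentum term to these iterations, improving the convergence rate from O(1/k) to O(1/k^2) without changing the per-iteration cost.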