Convex-composite optimization, which minimizes an objective function given by the sum of a differentiable function and a convex one, is widely used in machine learning and signal/image processing. The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) is a typical method for solving this problem and has a global convergence rate of $O(1 / k^2)$. Recently, FISTA has been extended to multi-objective optimization, together with a proof of the $O(1 / k^2)$ global convergence rate. However, its momentum factor is the classical one, and the convergence of its iterates has not been proven. In this work, by introducing additional hyperparameters $(a, b)$, we propose another accelerated proximal gradient method with a general momentum factor, which is new even in the single-objective case. We show that our proposed method also attains the global convergence rate of $O(1/k^2)$ for any $(a,b)$, and further that the generated sequence of iterates converges to a weak Pareto solution when $a$ is positive, an essential property for finite-time manifold identification. Moreover, we report numerical results with various choices of $(a,b)$, showing that some of these choices give better results than the classical momentum factors.
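To make the setting concrete, the following is a minimal sketch of classical single-objective FISTA with the Beck–Teboulle momentum factor, applied to a lasso problem as an illustrative instance of convex-composite optimization. The generalized $(a,b)$-parameterized momentum factor proposed in this work is not reproduced here; the objective, step sizes, and problem data below are illustrative assumptions only.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (the convex, nonsmooth part).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista_lasso(A, b, lam, iters=500):
    # Classical FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    # i.e. a differentiable term plus a convex term.
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                         # extrapolated point
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)                         # gradient of smooth part
        x_next = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x) # classical momentum step
        x, t = x_next, t_next
    return x
```

The momentum coefficient $(t_k - 1)/t_{k+1}$ is the "classical" factor referred to above; the proposed method replaces it with a family parameterized by $(a, b)$.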