We consider a class of structured fractional minimization problems in which the numerator of the objective is the sum of a differentiable convex function and a convex nonsmooth function, while the denominator is a concave or convex function. This problem is difficult to solve because it is nonconvex. By exploiting the structure of the problem, we propose two Coordinate Descent (CD) methods: one is applied to the original fractional function, while the other is based on the associated parametric problem. The proposed methods iteratively solve a one-dimensional subproblem \textit{globally} and are guaranteed to converge to coordinate-wise stationary points. In the case of a convex denominator, we prove that the proposed CD methods, which use sequential nonconvex approximation, find stronger stationary points than existing methods; under suitable conditions and with an appropriate initialization, they converge linearly to the optimal point (which is also a coordinate-wise stationary point). In the case of a concave denominator, we show that the resulting problem is quasi-convex and that any critical point is a global minimum, and we prove that the algorithms converge to the global optimal solution at a sublinear rate. We demonstrate the applicability of the proposed methods to several machine learning and signal processing models. Our experiments on real-world data show that our methods significantly and consistently outperform existing methods in terms of accuracy.
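As a minimal sketch of the problem class described above (the symbols $f$, $h$, $g$, $F$, and $\alpha$ are introduced here only for illustration and are not taken from the text), the structured fractional program and a standard Dinkelbach-type parametric reformulation can be written as
\[
\min_{\mathbf{x} \in \mathbb{R}^n} \; F(\mathbf{x}) \triangleq \frac{f(\mathbf{x}) + h(\mathbf{x})}{g(\mathbf{x})},
\]
where $f$ is differentiable and convex, $h$ is convex and possibly nonsmooth, and $g$ is concave or convex with $g(\mathbf{x}) > 0$ on the feasible set. The associated parametric problem for a given scalar $\alpha$ then takes the form
\[
\min_{\mathbf{x} \in \mathbb{R}^n} \; f(\mathbf{x}) + h(\mathbf{x}) - \alpha\, g(\mathbf{x}),
\]
with $\alpha$ updated from the current objective value $F(\mathbf{x})$; this is one common way to set up the parametric scheme and is stated here only as an assumed illustration of the structure the abstract refers to.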