Nonsmooth composite optimization with orthogonality constraints is crucial in statistical learning and data science, but it is challenging: the objective is nonsmooth and the constraints are nonconvex and computationally expensive. In this paper, we propose a new approach called \textbf{OBCD}, which leverages Block Coordinate Descent (BCD) to address these challenges. \textbf{OBCD} is a feasible method with a small computational footprint. In each iteration, it updates $k$ rows of the solution matrix, where $k \geq 2$, by globally solving a small nonsmooth optimization problem under orthogonality constraints. We prove that \textbf{OBCD} converges to block-$k$ stationary points, which offer stronger optimality guarantees than standard critical points. Notably, \textbf{OBCD} is the first greedy descent method with monotonicity for this problem class. Under the Kurdyka-Łojasiewicz (KL) inequality, we further establish strong limit-point convergence. We also extend \textbf{OBCD} with breakpoint-searching methods for solving the subproblems and greedy strategies for working-set selection. Comprehensive experiments demonstrate the superior performance of our approach across various tasks.
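To illustrate why row-block updates preserve feasibility, below is a minimal NumPy sketch of one OBCD-style iteration under stated assumptions: the working set is chosen at random, and the small $k \times k$ subproblem is approximated by sampling random orthogonal matrices, whereas the paper solves it globally (e.g., via breakpoint searching) and also provides greedy working-set selection. The names `obcd_iteration` and `F` are illustrative, not from the paper.

```python
import numpy as np

def obcd_iteration(X, F, k=2, n_trials=200, seed=None):
    """One OBCD-style iteration (illustrative sketch, not the paper's solver).

    X : (n, r) array with orthonormal columns (X.T @ X = I_r).
    F : callable mapping such an array to a scalar objective value.
    k : size of the working set of rows, k >= 2.

    The update X[B] <- Q @ X[B] with Q orthogonal keeps X feasible, since
    X_new.T @ X_new = X[B].T @ (Q.T @ Q) @ X[B] + (Gram of untouched rows)
                    = X.T @ X = I_r.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    B = rng.choice(n, size=k, replace=False)  # random working set of k rows
    best_Q, best_val = np.eye(k), F(X)        # Q = I means "no move"
    for _ in range(n_trials):
        # Draw a random k x k orthogonal matrix via QR of a Gaussian.
        Q, R = np.linalg.qr(rng.standard_normal((k, k)))
        Q = Q * np.sign(np.diag(R))           # sign fix for a uniform draw
        X_trial = X.copy()
        X_trial[B] = Q @ X[B]                 # feasibility-preserving row update
        val = F(X_trial)
        if val < best_val:                    # keep only improving moves (monotone)
            best_Q, best_val = Q, val
    X_new = X.copy()
    X_new[B] = best_Q @ X[B]
    return X_new, best_val
```

As a usage example, with a sparse-PCA-type objective such as `F = lambda X: -np.trace(X.T @ A @ X) + lam * np.abs(X).sum()` for a symmetric matrix `A` (both hypothetical here), repeated calls yield a monotonically non-increasing sequence of feasible iterates, mirroring the monotone descent property stated in the abstract.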