We focus on a class of non-smooth optimization problems over the Stiefel manifold in the decentralized setting, where a connected network of $n$ agents cooperatively minimizes a finite-sum objective function, each component of which is weakly convex in the ambient Euclidean space. Such optimization problems, albeit frequently encountered in applications, are quite challenging due to their non-smoothness and non-convexity. To tackle them, we propose an iterative method called the decentralized Riemannian subgradient method (DRSM). The global convergence and an iteration complexity of $\mathcal{O}(\varepsilon^{-2} \log^2(\varepsilon^{-1}))$ for driving a natural stationarity measure below $\varepsilon$ are established via the powerful tool of proximal smoothness from variational analysis, which could be of independent interest. Moreover, we show the local linear convergence of the DRSM using geometrically diminishing stepsizes when the problem at hand further possesses a sharpness property. Numerical experiments are conducted to corroborate our theoretical findings.
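To make the iteration concrete, the following is a minimal sketch of one DRSM round, not the authors' implementation: each agent mixes its neighbors' iterates in the ambient Euclidean space using a mixing matrix, takes a step along a Riemannian subgradient (the Euclidean subgradient projected onto the tangent space of the Stiefel manifold), and maps back to the manifold via the polar retraction. The mixing matrix `W`, the stepsize, and the subgradient inputs are illustrative placeholders supplied by the caller.

```python
import numpy as np

def proj_tangent(X, G):
    # Project an ambient (Euclidean) subgradient G onto the tangent space of the
    # Stiefel manifold St(d, r) at X: P_X(G) = G - X * sym(X^T G),
    # where sym(A) = (A + A^T) / 2.
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2)

def retract(Y):
    # Polar retraction: map a d-by-r matrix Y back onto the Stiefel manifold
    # by taking the orthogonal polar factor of its thin SVD.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def drsm_step(Xs, subgrads, W, stepsize):
    # One round of the (sketched) decentralized Riemannian subgradient method.
    # Xs[i]: agent i's current iterate on St(d, r);
    # subgrads[i]: a Euclidean subgradient of f_i at Xs[i] (placeholder input);
    # W: doubly stochastic mixing matrix matching the network topology.
    n = len(Xs)
    new_Xs = []
    for i in range(n):
        # Consensus mixing in the ambient Euclidean space.
        mixed = sum(W[i, j] * Xs[j] for j in range(n))
        # Riemannian subgradient direction at agent i's iterate.
        step = proj_tangent(Xs[i], subgrads[i])
        # Descent step followed by retraction back to the manifold.
        new_Xs.append(retract(mixed - stepsize * step))
    return new_Xs
```

Because the retraction is applied after mixing and stepping, every agent's iterate stays exactly on the Stiefel manifold at each round, which is what the stationarity measure in the analysis is defined against.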