Decentralized optimization with orthogonality constraints arises widely in scientific computing and data science. Since the orthogonality constraints are nonconvex, designing efficient algorithms is quite challenging. Existing approaches leverage geometric tools from Riemannian optimization to solve this problem, at the cost of high sample and communication complexities. To alleviate this difficulty, based on two novel techniques that waive the orthogonality constraints, we propose a variance-reduced stochastic gradient tracking (VRSGT) algorithm with a convergence rate of $O(1/k)$ to a stationary point. To the best of our knowledge, VRSGT is the first algorithm for decentralized optimization with orthogonality constraints that reduces sampling and communication complexities simultaneously. In numerical experiments, VRSGT delivers promising performance in a real-world autonomous driving application.
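For context, the problem class referred to above is typically posed as decentralized minimization over the Stiefel manifold; the notation below ($d$ agents, local objectives $f_i$, shared variable $X$) is our own shorthand and is not taken from the abstract:
\[
  \min_{X \in \mathbb{R}^{n \times p}} \; \frac{1}{d} \sum_{i=1}^{d} f_i(X)
  \quad \text{subject to} \quad X^{\top} X = I_p ,
\]
where each agent $i$ holds its own $f_i$ and the constraint $X^{\top} X = I_p$ encodes the (nonconvex) orthogonality requirement.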