Geometric median (\textsc{Gm}) is a classical method in statistics for robustly estimating the uncorrupted data; under gross corruption, it achieves the optimal breakdown point of 0.5. However, its high computational cost makes it infeasible for robustifying stochastic gradient descent (SGD) in high-dimensional optimization problems. In this paper, we show that by applying \textsc{Gm} to only a judiciously chosen block of coordinates at a time and using a memory mechanism, one can retain the breakdown point of 0.5 for smooth non-convex problems, with non-asymptotic convergence rates comparable to those of SGD with full \textsc{Gm}.
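As a rough illustration of the idea, the following sketch computes the geometric median via Weiszfeld's algorithm and applies it to a single coordinate block of the workers' gradients, carrying over a remembered estimate on the remaining coordinates. This is a minimal, hypothetical sketch of the block-wise scheme the abstract describes, not the paper's actual algorithm; all function names and the block-selection interface are assumptions.

```python
import numpy as np

def geometric_median(points, iters=100, eps=1e-8):
    """Weiszfeld's algorithm: iteratively re-weighted mean converging to
    the point minimizing the sum of Euclidean distances to `points`."""
    y = points.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(points - y, axis=1)
        d = np.maximum(d, eps)  # avoid division by zero at a data point
        w = 1.0 / d
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < eps:
            break
        y = y_new
    return y

def block_gm_aggregate(grads, memory, block):
    """Aggregate worker gradients (rows of `grads`) by applying GM only to
    the chosen coordinate `block`; other coordinates keep the remembered
    estimate (a loose stand-in for the paper's memory mechanism)."""
    agg = memory.copy()
    agg[block] = geometric_median(grads[:, block])
    return agg
```

Because Weiszfeld iterations on a block of size $b$ cost $O(nb)$ per step rather than $O(nd)$ for the full dimension $d$, cycling through blocks amortizes the cost of \textsc{Gm} across SGD steps.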