Min-max problems have broad applications in machine learning, including learning with non-decomposable loss and learning with robustness to data distribution. The convex-concave min-max problem is an active topic of research, with efficient algorithms and sound theoretical foundations already developed. However, it remains a challenge to design provably efficient algorithms for non-convex min-max problems, with or without smoothness. In this paper, we study a family of non-convex min-max problems whose objective function is weakly convex in the variables of minimization and concave in the variables of maximization. We propose a proximally guided stochastic subgradient method and a proximally guided stochastic variance-reduced method for the non-smooth and smooth instances of this family, respectively. We analyze the time complexities of the proposed methods for finding a nearly stationary point of the outer minimization problem corresponding to the min-max problem.
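For concreteness, a minimal sketch of the setting follows in LaTeX; the symbols $f$, $\phi$, $\rho$, $\lambda$, and the domain $\mathcal{Y}$ are illustrative notation introduced here rather than taken from the abstract, and measuring near-stationarity of the outer problem through the Moreau envelope is a standard convention for weakly convex objectives that we assume for illustration.

% Illustrative weakly-convex-concave min-max formulation (assumed notation).
\begin{align*}
  \min_{x \in \mathbb{R}^d} \; \max_{y \in \mathcal{Y}} \; f(x, y),
  \qquad
  \phi(x) := \max_{y \in \mathcal{Y}} f(x, y),
\end{align*}
where $f(\cdot, y)$ is $\rho$-weakly convex for every $y$ (i.e., $f(\cdot, y) + \tfrac{\rho}{2}\|\cdot\|^2$ is convex) and $f(x, \cdot)$ is concave for every $x$. The outer minimization problem is $\min_x \phi(x)$; since $\phi$ is weakly convex and possibly non-smooth, near-stationarity is commonly measured through the Moreau envelope
\begin{align*}
  \phi_{\lambda}(x) := \min_{z} \Big\{ \phi(z) + \tfrac{1}{2\lambda}\,\|z - x\|^{2} \Big\},
  \qquad 0 < \lambda < \rho^{-1},
\end{align*}
a point $x$ with small $\|\nabla \phi_{\lambda}(x)\|$ being close to a nearly stationary point of $\phi$.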