This paper studies bilinear saddle point problems $\min_{\bf{x}} \max_{\bf{y}} g(\bf{x}) + \bf{x}^{\top} \bf{A} \bf{y} - h(\bf{y})$, where the functions $g, h$ are smooth and strongly convex. When gradient and proximal oracles for $g$ and $h$ are accessible, optimal algorithms have already been developed in the literature \cite{chambolle2011first, palaniappan2016stochastic}. However, the proximal operator is not always easy to compute, especially in constrained zero-sum matrix games \cite{zhang2020sparsified}. This work proposes a new algorithm that requires access only to the gradients of $g$ and $h$. Our algorithm achieves a complexity upper bound $\tilde{\mathcal{O}}\left( \frac{\|\bf{A}\|_2}{\sqrt{\mu_x \mu_y}} + \sqrt[4]{\kappa_x \kappa_y (\kappa_x + \kappa_y)} \right)$, which has optimal dependency on the coupling condition number $\frac{\|\bf{A}\|_2}{\sqrt{\mu_x \mu_y}}$ up to logarithmic factors.