Recent results suggest that quantum computers have the potential to speed up nonconvex optimization. A crucial factor for the implementation of quantum optimization algorithms, however, is their robustness against experimental and statistical noise. In this paper, we systematically study quantum algorithms for finding an $\epsilon$-approximate second-order stationary point ($\epsilon$-SOSP) of a $d$-dimensional nonconvex function, a fundamental problem in nonconvex optimization, with noisy zeroth- or first-order oracles as inputs. We first prove that, up to noise of $O(\epsilon^{10}/d^5)$, accelerated perturbed gradient descent with quantum gradient estimation takes $O(\log d/\epsilon^{1.75})$ quantum queries to find an $\epsilon$-SOSP. We then prove that perturbed gradient descent is robust to noise of $O(\epsilon^6/d^4)$ and $O(\epsilon/d^{0.5+\zeta})$ for any $\zeta>0$ on the zeroth- and first-order oracles, respectively, which yields a quantum algorithm with poly-logarithmic query complexity. We further propose a stochastic gradient descent algorithm using quantum mean estimation on the Gaussian smoothing of noisy oracles, which is robust to $O(\epsilon^{1.5}/d)$ and $O(\epsilon/\sqrt{d})$ noise on the zeroth- and first-order oracles, respectively. This quantum algorithm takes $O(d^{2.5}/\epsilon^{3.5})$ and $O(d^2/\epsilon^3)$ queries to the two oracles, respectively, giving a polynomial speedup over the classical counterparts. Moreover, we characterize the domains where quantum algorithms can find an $\epsilon$-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in $d$, or where the problem is information-theoretically unsolvable even with an infinite number of queries. In addition, we prove an $\Omega(\epsilon^{-12/7})$ lower bound in $\epsilon$ for any randomized classical or quantum algorithm to find an $\epsilon$-SOSP using either noisy zeroth- or first-order oracles.
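For intuition, the sketch below is a minimal classical analogue (not the paper's quantum algorithm) of the two ingredients named above: a gradient estimator built from Gaussian smoothing of a noisy zeroth-order oracle, and perturbed gradient descent that injects a small random perturbation near stationary points to escape strict saddles. In the standard convention, an $\epsilon$-SOSP of a $\rho$-Hessian-Lipschitz $f$ is a point $x$ with $\|\nabla f(x)\|\le\epsilon$ and $\lambda_{\min}(\nabla^2 f(x))\ge-\sqrt{\rho\epsilon}$. The quantum algorithms replace the Monte Carlo average below with quantum mean estimation (and, for the first result, the gradient step with quantum gradient estimation); all step sizes, sample counts, and the test function here are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def smoothed_gradient(noisy_f, x, sigma=1e-2, num_samples=500, rng=None):
    """Estimate the gradient of the Gaussian smoothing
    f_sigma(x) = E_{xi ~ N(0, I)}[f(x + sigma * xi)] from a noisy
    zeroth-order oracle, via the standard identity
    grad f_sigma(x) = E[xi * (f(x + sigma * xi) - f(x))] / sigma."""
    rng = rng or np.random.default_rng()
    xi = rng.standard_normal((num_samples, x.shape[0]))
    deltas = np.array([noisy_f(x + sigma * z) - noisy_f(x) for z in xi])
    return (xi * deltas[:, None]).mean(axis=0) / sigma

def perturbed_gd(noisy_f, x0, eta=0.05, eps=1e-2, radius=1e-2,
                 max_iters=500, rng=None):
    """Perturbed gradient descent: take gradient steps, and inject a small
    uniform perturbation whenever the estimated gradient is small, so the
    iterates can escape strict saddle points and settle near an
    approximate second-order stationary point."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iters):
        g = smoothed_gradient(noisy_f, x, rng=rng)
        if np.linalg.norm(g) <= eps:
            u = rng.standard_normal(x.shape[0])
            x = x + radius * u / np.linalg.norm(u)  # random kick near stationarity
        else:
            x = x - eta * g
    return x

if __name__ == "__main__":
    nu = 1e-4  # noise level of the zeroth-order oracle (illustrative)
    rng = np.random.default_rng(0)

    def noisy_f(x):
        # (x^2 - 1)^2 + y^2: strict saddle at the origin, minima at (+/-1, 0).
        return (x[0]**2 - 1.0)**2 + x[1]**2 + nu * rng.uniform(-1.0, 1.0)

    x = perturbed_gd(noisy_f, [0.0, 0.3], rng=rng)
    print("final iterate:", x)  # expected to land near (+1, 0) or (-1, 0)
```

Plain gradient descent started on the unstable manifold $x=0$ of this test function could stall at the saddle; the smoothing noise plus the explicit perturbation step is what pushes the iterate off it, mirroring the role the perturbation plays in the noise-robustness analysis above.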