Finding efficient optimization methods plays an important role in quantum optimization and quantum machine learning on near-term quantum computers. While backpropagation on classical computers is computationally efficient, obtaining gradients on quantum computers is not, because the computational complexity usually scales with the number of parameters and measurements. In this paper, we connect Koopman operator theory, which has been successful in predicting nonlinear dynamics, with natural gradient methods in quantum optimization. We propose a data-driven approach that uses Koopman operator learning to accelerate quantum optimization and quantum machine learning. We develop two new families of methods, the sliding-window dynamic mode decomposition (DMD) and the neural DMD, for efficiently updating parameters on quantum computers. We show that our methods can predict gradient dynamics on quantum computers and accelerate the variational quantum eigensolver used in quantum optimization, as well as quantum machine learning. We further implement our Koopman operator learning algorithm on a real IBM quantum computer and demonstrate its practical effectiveness.
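To make the idea concrete, the sketch below (not the paper's implementation) shows how a sliding-window DMD can extrapolate a parameter trajectory: a best-fit linear operator is learned from a short window of recent optimizer snapshots and then rolled forward to predict future parameter updates without additional quantum measurements. The function and argument names (`dmd_extrapolate`, `window`, `n_predict`) are illustrative assumptions.

```python
# A minimal sliding-window DMD sketch, assuming we already have a short
# history of parameter vectors from a few measured optimizer steps
# (e.g., gradient descent on a variational quantum circuit).
import numpy as np

def dmd_extrapolate(param_history, n_predict, rank=None):
    """Fit a linear operator K with Y ~= K X on a window of parameter
    snapshots and roll it forward n_predict steps."""
    # Columns are snapshots: theta_0, theta_1, ..., theta_{m-1}
    snapshots = np.asarray(param_history).T          # shape (dim, m)
    X, Y = snapshots[:, :-1], snapshots[:, 1:]

    # Truncated SVD of X gives an (optionally low-rank) pseudo-inverse.
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    K = Y @ Vh.conj().T @ np.diag(1.0 / s) @ U.conj().T   # best-fit linear map

    # Predict future parameters by repeatedly applying K.
    theta = snapshots[:, -1]
    preds = []
    for _ in range(n_predict):
        theta = K @ theta
        preds.append(theta.copy())
    return np.array(preds)

# Usage sketch: alternate a window of measured optimizer steps with
# DMD-predicted steps to reduce the number of quantum gradient evaluations.
# window = [theta_t for the most recent optimizer iterations]
# future = dmd_extrapolate(window, n_predict=10, rank=4)
```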