We define \emph{laziness} to describe a large suppression of variational parameter updates for neural networks, classical or quantum. In the quantum case, the suppression is exponential in the number of qubits for randomized variational quantum circuits. We discuss the difference between laziness and the \emph{barren plateau} phenomenon in quantum machine learning, introduced by quantum physicists in \cite{mcclean2018barren} to describe the flatness of the loss function landscape during gradient descent. We present a novel theoretical understanding of both phenomena in light of the theory of neural tangent kernels. For noiseless quantum circuits, without measurement noise, the loss function landscape is complicated in the overparametrized regime with a large number of trainable variational angles. Instead, around a random initialization, there is a large number of local minima that are good enough to minimize the mean squared loss function, where we still have quantum laziness but no barren plateaus. However, this complicated landscape is not visible within a limited number of iterations, or at low precision in quantum control and quantum sensing. Moreover, we study the effect of noise during optimization under intuitive noise models, and show that variational quantum algorithms are noise-resilient in the overparametrized regime. Our work precisely reformulates the quantum barren plateau statement as a statement about precision, justifies it under certain noise models, injects new hope for near-term variational quantum algorithms, and provides theoretical connections to classical machine learning. Our paper offers conceptual perspectives on quantum barren plateaus, complemented by the discussion of gradient descent dynamics in \cite{together}.