Parameterized quantum circuits can be used as quantum neural networks and have the potential to outperform their classical counterparts when trained to address learning problems. To date, most results on their performance on practical problems have been heuristic in nature. In particular, the convergence rate for the training of quantum neural networks is not fully understood. Here, we analyze the dynamics of gradient descent for the training error of a class of variational quantum machine learning models. We define wide quantum neural networks as parameterized quantum circuits in the limit of a large number of qubits and variational parameters. We then derive a simple analytic formula that captures the average behavior of their loss function and discuss the consequences of our findings. For example, for random quantum circuits, we predict and characterize an exponential decay of the residual training error as a function of the parameters of the system. Finally, we validate our analytic results with numerical experiments.
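The exponential decay of the residual training error mentioned above has a familiar classical analogue: in a linearized ("lazy") training regime, gradient descent on a quadratic loss converges exponentially, with rates set by the spectrum of an effective kernel. The sketch below is purely illustrative and is not the paper's model; the matrix `K`, the target `y`, and all dimensions are hypothetical stand-ins chosen for the demonstration.

```python
import numpy as np

# Illustrative sketch (assumed toy model, not the paper's circuit dynamics):
# in a linearized training regime, gradient descent on the quadratic loss
#     L(theta) = 0.5 * ||K @ theta - y||^2
# drives the residual error down exponentially on average, with decay rates
# set by the eigenvalues of K.T @ K.
rng = np.random.default_rng(0)
n_params = 50
K = rng.normal(size=(n_params, n_params)) / np.sqrt(n_params)  # hypothetical kernel
y = rng.normal(size=n_params)                                  # hypothetical targets

theta = np.zeros(n_params)
eta = 0.05  # learning rate, small enough for stable descent
errors = []
for step in range(2000):
    residual = K @ theta - y
    errors.append(0.5 * residual @ residual)
    theta -= eta * K.T @ residual  # gradient descent update

# For a quadratic loss with a stable step size, the error decreases
# monotonically; each eigenmode of K.T @ K decays geometrically.
```

In this toy setting the loss decomposes over eigenmodes of `K.T @ K`, each shrinking by a factor `(1 - eta * lambda_i)**2` per step, which is the sense in which the residual error decays exponentially.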