The prospect of achieving quantum advantage with Quantum Neural Networks (QNNs) is exciting. Understanding how QNN properties (e.g., the number of parameters $M$) affect the loss landscape is crucial to the design of scalable QNN architectures. Here, we rigorously analyze the overparametrization phenomenon in QNNs with periodic structure. We define overparametrization as the regime where the QNN has more than a critical number of parameters $M_c$ that allows it to explore all relevant directions in state space. Our main results show that the dimension of the Lie algebra obtained from the generators of the QNN is an upper bound for $M_c$, and for the maximal rank that the quantum Fisher information and Hessian matrices can reach. Underparametrized QNNs have spurious local minima in the loss landscape that start disappearing when $M\geq M_c$. Thus, the overparametrization onset corresponds to a computational phase transition where the QNN trainability is greatly improved by a more favorable landscape. We then connect the notion of overparametrization to the QNN capacity, so that when a QNN is overparametrized, its capacity achieves its maximum possible value. We run numerical simulations for eigensolver, compilation, and autoencoding applications to showcase the overparametrization computational phase transition. We note that our results also apply to variational quantum algorithms and quantum optimal control.
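The key quantity in these results, the dimension of the dynamical Lie algebra generated by the QNN's generators, can be computed numerically for small systems by closing the generator set under commutators. The sketch below (a toy illustration, not the paper's code; the function name `lie_closure_dim` and the single-qubit example are ours) estimates this dimension for Pauli generators $X$ and $Z$, which generate all of $\mathfrak{su}(2)$:

```python
import numpy as np

def real_vec(op):
    """Vectorize an operator as a real vector (the Lie algebra is a real vector space)."""
    v = op.reshape(-1)
    return np.concatenate([v.real, v.imag])

def lie_closure_dim(generators, tol=1e-9):
    """Dimension of the real Lie algebra spanned by {iG} for Hermitian
    generators G, obtained by adding commutators until the span stops growing."""
    basis = []  # linearly independent elements, as real vectors

    def independent(op):
        M = np.array(basis + [real_vec(op)])
        return np.linalg.matrix_rank(M, tol=tol) > len(basis)

    ops = []
    for g in generators:          # seed with anti-Hermitian elements i*G
        e = 1j * g
        if independent(e):
            basis.append(real_vec(e))
            ops.append(e)

    i = 0                         # breadth-first closure under commutators
    while i < len(ops):
        for j in range(len(ops)):
            c = ops[i] @ ops[j] - ops[j] @ ops[i]
            if np.linalg.norm(c) > tol and independent(c):
                basis.append(real_vec(c))
                ops.append(c)
        i += 1
    return len(basis)

# Toy example: Pauli X and Z on one qubit
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
print(lie_closure_dim([X, Z]))  # prints 3, the dimension of su(2)
```

In this picture, a single-qubit QNN alternating $X$ and $Z$ rotations would have $M_c \leq 3$: three well-chosen parameters already suffice to explore every relevant direction, and adding more pushes the circuit into the overparametrized regime described above.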