Overparametrization is one of the most surprising and notorious phenomena in machine learning. Recently, there have been several efforts to study if, and how, Quantum Neural Networks (QNNs) acting in the absence of hardware noise can be overparametrized. In particular, it has been proposed that a QNN can be defined as overparametrized if it has enough parameters to explore all available directions in state space, that is, if the rank of the Quantum Fisher Information Matrix (QFIM) for the QNN's output state is saturated. Here, we explore how the presence of noise affects the overparametrization phenomenon. Our results show that noise can "turn on" previously zero eigenvalues of the QFIM. This enables the parametrized state to explore directions that were otherwise inaccessible, thus potentially turning an overparametrized QNN into an underparametrized one. For small noise levels, the QNN is quasi-overparametrized, as large eigenvalues coexist with small ones. Then, we prove that as the magnitude of noise increases, all the eigenvalues of the QFIM become exponentially suppressed, indicating that the state becomes insensitive to any change in the parameters. As such, there is a pull-and-tug effect where noise can enable new directions, but also suppress the sensitivity to parameter updates. Finally, our results imply that current QNN capacity measures are ill-defined when hardware noise is present.
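To make the QFIM-rank criterion concrete, below is a minimal NumPy sketch (not taken from the paper; the two-qubit layered ansatz, the gate ordering, and the finite-difference derivatives are illustrative assumptions). It computes the pure-state QFIM, F_ij = 4 Re[<d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi>], for a toy circuit and reports its rank as the number of parameters grows; once the rank stops increasing, the circuit is overparametrized in the sense used above. The noisy setting discussed in the abstract would instead require the mixed-state QFIM, which this sketch does not cover.

```python
import numpy as np

# Single-qubit Paulis and a 2-qubit CNOT (control = first qubit).
I2 = np.eye(2, dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def rot(P, theta):
    """exp(-i * theta * P / 2) for a single-qubit Pauli P."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * P

def ansatz_state(theta, layers):
    """Toy 2-qubit layered ansatz: per layer, RY and RZ on each qubit, then CNOT."""
    psi = np.zeros(4, dtype=complex)
    psi[0] = 1.0  # start in |00>
    for a0, a1, b0, b1 in theta.reshape(layers, 4):
        psi = np.kron(rot(Y, a0), rot(Y, a1)) @ psi
        psi = np.kron(rot(Z, b0), rot(Z, b1)) @ psi
        psi = CNOT @ psi
    return psi

def pure_state_qfim(theta, layers, eps=1e-5):
    """Pure-state QFIM via central finite differences:
    F_ij = 4 Re[<d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi>]."""
    n = theta.size
    psi = ansatz_state(theta, layers)
    dpsi = []
    for k in range(n):
        tp, tm = theta.copy(), theta.copy()
        tp[k] += eps
        tm[k] -= eps
        dpsi.append((ansatz_state(tp, layers) - ansatz_state(tm, layers)) / (2 * eps))
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            val = np.vdot(dpsi[i], dpsi[j]) - np.vdot(dpsi[i], psi) * np.vdot(psi, dpsi[j])
            F[i, j] = 4 * val.real
    return F

rng = np.random.default_rng(0)
for layers in (1, 2, 4, 8):                      # more layers -> more parameters
    theta = rng.uniform(0, 2 * np.pi, size=4 * layers)
    evals = np.linalg.eigvalsh(pure_state_qfim(theta, layers))
    rank = int(np.sum(evals > 1e-6))
    print(f"layers={layers:2d}  parameters={theta.size:3d}  QFIM rank={rank}")
# For 2 qubits the rank is bounded by 2*(2**2 - 1) = 6; once it stops growing
# with the parameter count, adding parameters opens no new directions in state
# space, i.e. the circuit is overparametrized in the sense described above.
```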