Variational quantum machine learning is an extensively studied application of near-term quantum computers. The success of variational quantum learning models crucially depends on finding a suitable parametrization of the model that encodes an inductive bias relevant to the learning task. However, precious little is known about guiding principles for the construction of suitable parametrizations. In this work, we holistically explore when and how symmetries of the learning problem can be exploited to construct quantum learning models with outcomes invariant under the symmetry of the learning task. Building on tools from representation theory, we show how a standard gateset can be transformed into an equivariant gateset that respects the symmetries of the problem at hand through a process of gate symmetrization. We benchmark the proposed methods on two toy problems that feature a non-trivial symmetry and observe a substantial increase in generalization performance. As our tools can also be applied in a straightforward way to other variational problems with symmetric structure, we show how equivariant gatesets can be used in variational quantum eigensolvers.
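The gate-symmetrization step mentioned above can be sketched as a twirl: averaging a gate generator over the unitary representation of the symmetry group, which yields a generator that commutes with every symmetry operation, so the resulting gate is equivariant. The following is a minimal illustration (the two-qubit SWAP symmetry and all variable names here are our own toy example, not taken from the paper):

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

# SWAP gate exchanging the two qubits
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)

# Symmetry group S = {identity, SWAP}: qubit-permutation symmetry
group = [np.eye(4), SWAP]

def twirl(generator, reps):
    """Symmetrize a gate generator by averaging it over the group representation."""
    return sum(U @ generator @ U.conj().T for U in reps) / len(reps)

G = np.kron(Z, I2)       # Z on qubit 0 only: not SWAP-symmetric
G_sym = twirl(G, group)  # twirled generator: (Z⊗I + I⊗Z)/2

# The twirled generator commutes with every symmetry unitary,
# so the gate exp(-i * theta * G_sym) respects the symmetry.
print(np.allclose(G_sym @ SWAP, SWAP @ G_sym))  # True
print(np.allclose(G @ SWAP, SWAP @ G))          # False
```

Any parametrized gate built from `G_sym` then produces outcomes invariant under relabeling of the two qubits, which is the kind of inductive bias the abstract describes.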