With near-term quantum devices available and the race for fault-tolerant quantum computers in full swing, researchers have become interested in what happens if we replace a supervised machine learning model with a quantum circuit. While such "quantum models" are sometimes called "quantum neural networks", it has been repeatedly noted that their mathematical structure is much more closely related to kernel methods: they analyse data in high-dimensional Hilbert spaces to which we only have access through inner products revealed by measurements. This technical manuscript summarises and extends the idea of systematically rephrasing supervised quantum models as kernel methods. Under this view, many near-term and fault-tolerant quantum models can be replaced by a general support vector machine whose kernel computes distances between data-encoding quantum states. Kernel-based training is then guaranteed to find models that are at least as good as those found by variational circuit training. Overall, the kernel perspective of quantum machine learning tells us that the way data is encoded into quantum states is the main ingredient that can potentially set quantum models apart from classical machine learning models.
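To make the central idea concrete, the following is a minimal sketch (not taken from the manuscript) of a "quantum kernel": a toy single-qubit angle encoding plays the role of the data-encoding quantum state, and the kernel is the squared overlap between two encoded states, i.e. the quantity an overlap test would estimate on hardware. The resulting Gram matrix is symmetric and positive semi-definite, so it can be handed to any standard support vector machine. The feature map chosen here is purely illustrative.

```python
import numpy as np

def encode(x):
    # Hypothetical minimal feature map (angle encoding):
    # map a scalar x to the single-qubit state
    # |phi(x)> = cos(x/2)|0> + sin(x/2)|1>
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x1, x2):
    # Fidelity kernel: squared overlap |<phi(x1)|phi(x2)>|^2,
    # a "distance" between the two data-encoding quantum states
    return np.abs(encode(x1) @ encode(x2)) ** 2

# Build the Gram matrix for a few sample points
X = np.array([0.1, 0.9, 2.3, 3.0])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])

# Sanity checks: K is a valid kernel matrix (symmetric, PSD, unit diagonal),
# so it could be passed to e.g. an SVM with a precomputed kernel
assert np.allclose(K, K.T)
assert np.all(np.linalg.eigvalsh(K) > -1e-9)
assert np.allclose(np.diag(K), 1.0)
```

For this particular encoding the kernel has the closed form k(x, x') = cos^2((x - x') / 2); on real hardware one would instead estimate each entry of K from measurement statistics.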