Quantum kernel methods are considered a promising avenue for applying quantum computers to machine learning problems. Given the central role hyperparameters play in determining the performance of classical machine learning methods, identifying the hyperparameters that control the inductive bias of quantum machine learning models is expected to be crucial. In this work we introduce a hyperparameter controlling the bandwidth of a quantum kernel and show that it governs the expressivity of the resulting model. Through extensive numerical experiments with multiple quantum kernels and classical datasets, we demonstrate a consistent change in model behavior from underfitting (bandwidth too large) to overfitting (bandwidth too small), with optimal generalization in between. We draw a connection between the bandwidths of classical and quantum kernels and show analogous behavior in both cases. Furthermore, we show that optimizing the bandwidth can help mitigate the exponential decay of kernel values with qubit count, which underlies recent observations that the performance of quantum kernel methods degrades as the qubit count grows. We reproduce these negative results and show that if the kernel bandwidth is optimized, the performance instead improves with growing qubit count and becomes competitive with the best classical methods.
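As a rough illustration of the idea (a minimal sketch, not the construction used in this work), the code below implements a product-state fidelity kernel with single-qubit RY angle encoding, where a scaling factor `c` applied to the inputs plays the role of an inverse bandwidth: small `c` yields a slowly varying kernel (large bandwidth), while large `c` yields a sharply peaked one. The encoding choice, the parameter name `c`, and the data ranges are assumptions made for illustration only.

```python
import numpy as np

def quantum_kernel(x, y, c=1.0):
    """Fidelity kernel for a product-state RY angle encoding (one qubit per feature).

    Illustrative assumption: each feature x_j is encoded as RY(c * x_j)|0>,
    so the state overlap factorizes and the fidelity has the closed form
    prod_j cos^2(c * (x_j - y_j) / 2).
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.prod(np.cos(c * (x - y) / 2.0) ** 2))

rng = np.random.default_rng(0)
for n_qubits in (2, 8, 32):
    x = rng.uniform(0.0, np.pi, n_qubits)
    y = rng.uniform(0.0, np.pi, n_qubits)
    # For fixed c, the kernel is a product of per-qubit factors below one,
    # so typical kernel values shrink exponentially with qubit count;
    # decreasing c (increasing the bandwidth) counteracts this concentration.
    print(n_qubits, quantum_kernel(x, y, c=1.0), quantum_kernel(x, y, c=0.1))
```

The loop at the end makes the decay mechanism from the abstract concrete: at `c=1.0` the kernel values collapse toward zero as qubits are added, while at `c=0.1` they remain well separated. This mirrors the classical RBF kernel exp(-γ||x - y||²), where γ plays the analogous inverse-bandwidth role and too large a γ likewise leads to overfitting.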