The Gaussian kernel is a popular kernel function used in many machine-learning algorithms, especially in support vector machines (SVMs). For nonlinear training instances, it often outperforms polynomial kernels in model accuracy, and it is used extensively in formulating nonlinear classical SVMs. In recent research, P. Rebentrost et al. describe an elegant quantum version of the least-squares support vector machine based on a quantum polynomial kernel, which is exponentially faster than its classical counterpart. In this paper, we demonstrate a quantum version of the Gaussian kernel and analyze its complexity in the context of quantum SVM. Our analysis shows that the computational complexity of the quantum Gaussian kernel is $O(\epsilon^{-1}\log N)$ for $N$-dimensional instances with accuracy $\epsilon$, together with a Taylor remainder error term $|R_m(\epsilon^{-1}\log N)|$.
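For reference (a standard textbook definition, not specific to this paper's quantum construction), the classical Gaussian kernel between two instances $\mathbf{x}_i, \mathbf{x}_j \in \mathbb{R}^N$ is
\[
K(\mathbf{x}_i, \mathbf{x}_j) = \exp\!\left(-\frac{\lVert \mathbf{x}_i - \mathbf{x}_j \rVert^2}{2\sigma^2}\right),
\]
where $\sigma$ is a width hyperparameter, in contrast to polynomial kernels of the form $(\mathbf{x}_i \cdot \mathbf{x}_j + c)^d$.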