Random features are a central technique for scalable learning algorithms based on kernel methods. Recent work has shown that an algorithm for machine learning on a quantum computer, i.e., quantum machine learning (QML), can exponentially speed up the sampling of optimized random features, even without imposing the restrictive assumptions on sparsity and low-rankness of matrices that have limited the applicability of conventional QML algorithms; this QML algorithm makes it possible to significantly reduce, and provably minimize, the number of features required for regression tasks. However, a major question in the field of QML is how widely the advantages of quantum computation can be exploited beyond regression. Here we construct a QML algorithm for a classification task accelerated by optimized random features. We prove that the QML algorithm for sampling optimized random features, combined with stochastic gradient descent (SGD), achieves the state-of-the-art exponential convergence speed in reducing classification error under a low-noise condition; at the same time, our algorithm can exploit the significant reduction in the required number of features afforded by the optimized random features, accelerating both each SGD iteration and the evaluation of the resulting classifier. These results reveal a promising application of QML: a significant acceleration of the leading kernel-based classification algorithm, without compromising its applicability to a practical class of data sets or its exponential error-convergence speed.
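To make the random-features-plus-SGD pipeline described above concrete, the following is a minimal classical sketch: it uses standard (data-independent) random Fourier features for the Gaussian kernel in place of the quantum-sampled optimized features, and plain SGD on the logistic loss. All function names and parameters here are illustrative assumptions, not the paper's method or API.

```python
# Minimal classical sketch of random-features classification with SGD.
# Assumptions: Gaussian (RBF) kernel, standard random Fourier features
# (the paper's quantum algorithm instead samples features from a
# data-optimized distribution to reduce the required number M), and
# logistic loss with labels in {-1, +1}.
import numpy as np

rng = np.random.default_rng(0)

def sample_random_features(d, M, gamma):
    """Sample M random Fourier features for the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2); frequencies W ~ N(0, 2*gamma*I)."""
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(M, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=M)
    return W, b

def featurize(X, W, b):
    """Map inputs to the random-feature space: phi(x) in R^M."""
    M = W.shape[0]
    return np.sqrt(2.0 / M) * np.cos(X @ W.T + b)

def sgd_logistic(Phi, y, lr=0.5, epochs=20):
    """Plain SGD on the logistic loss of the linear model w . phi(x)."""
    n, M = Phi.shape
    w = np.zeros(M)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (Phi[i] @ w)
            # gradient step for log(1 + exp(-margin)) w.r.t. w
            w += lr * y[i] * Phi[i] / (1.0 + np.exp(margin))
    return w

# Toy usage: two Gaussian blobs with labels -1/+1.
n, d, M = 400, 2, 200
X = np.vstack([rng.normal(-1.0, 0.7, (n // 2, d)),
               rng.normal(+1.0, 0.7, (n // 2, d))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])
W, b = sample_random_features(d, M, gamma=0.5)
Phi = featurize(X, W, b)
w = sgd_logistic(Phi, y)
print("training accuracy:", (np.sign(Phi @ w) == y).mean())
```

In this sketch both the per-iteration cost of SGD and the cost of evaluating the trained classifier scale linearly in M, which is why reducing the required number of features, as the paper's optimized sampling does, directly accelerates training and evaluation.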