Kernel methods provide a powerful framework for nonparametric learning. They are based on kernel functions and allow learning in a rich functional space while applying linear statistical learning tools, such as Ridge Regression or Support Vector Machines. However, standard kernel methods suffer from quadratic time and memory complexity in the number of data points, which limits their applicability to large-scale learning. In this paper, we propose Snacks, a new large-scale solver for Kernel Support Vector Machines. Specifically, Snacks relies on a Nystr\"om approximation of the kernel matrix and an accelerated variant of the stochastic subgradient method. We demonstrate through a detailed empirical evaluation that it competes with other SVM solvers on a variety of benchmark datasets.
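The first of the two ingredients named above, the Nyström approximation, replaces the full n x n kernel matrix with low-dimensional features built from a subset of landmark points. The sketch below is illustrative only: the RBF kernel, uniform landmark sampling, and the jitter constant are assumptions for the example, not necessarily the choices made in Snacks.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_features(X, m, gamma=1.0, seed=0):
    """Map n points to m-dimensional features Z such that Z @ Z.T
    approximates the full n x n kernel matrix."""
    rng = np.random.default_rng(seed)
    # Uniformly sampled landmarks (an illustrative sampling scheme)
    landmarks = X[rng.choice(len(X), size=m, replace=False)]
    K_mm = rbf_kernel(landmarks, landmarks, gamma)  # m x m
    K_nm = rbf_kernel(X, landmarks, gamma)          # n x m
    # Symmetric inverse square root of K_mm, with a small jitter for stability
    w, V = np.linalg.eigh(K_mm + 1e-8 * np.eye(m))
    K_mm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return K_nm @ K_mm_inv_sqrt                     # n x m feature matrix

X = np.random.default_rng(1).normal(size=(200, 5))
Z = nystrom_features(X, m=100)
# Training a linear model (e.g. a linear SVM) on Z then costs O(n * m)
# per pass instead of the O(n^2) of a full kernel method.
approx_err = np.abs(Z @ Z.T - rbf_kernel(X, X)).max()
```

Any linear solver, such as the accelerated stochastic subgradient method the abstract mentions, can then be run on the features `Z` in place of the exact kernel matrix.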