We present a dual subspace ascent algorithm for support vector machine training that respects a budget constraint limiting the number of support vectors. Budget methods are effective for reducing the training time of kernel SVMs while retaining high accuracy. To date, budget training is available only for primal (SGD-based) solvers. Dual subspace ascent methods like sequential minimal optimization are attractive for their good adaptation to the problem structure, their fast convergence rate, and their practical speed. By incorporating a budget constraint into a dual algorithm, our method enjoys the best of both worlds. We demonstrate considerable speed-ups over primal budget training methods.
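The abstract names the two ingredients (dual subspace ascent and a budget on the number of support vectors) without giving the procedure. The following is only a generic sketch for intuition, not the paper's algorithm: plain dual coordinate ascent on the kernel SVM dual objective, paired with a naive removal-based budget step. The RBF kernel, the function names, and the smallest-coefficient removal heuristic are illustrative assumptions.

```python
# Illustrative sketch only (assumptions, not the paper's method): kernel SVM
# dual coordinate ascent with a crude budget step that zeroes out the
# support vector with the smallest coefficient whenever the budget is exceeded.
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise RBF kernel matrix between rows of X and rows of Z.
    d = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * d)

def budgeted_dual_ascent(X, y, C=1.0, budget=50, gamma=1.0, epochs=10, seed=0):
    # Maximizes sum(alpha) - 0.5 * alpha^T Q alpha subject to 0 <= alpha_i <= C,
    # with Q_ij = y_i y_j k(x_i, x_j), while keeping at most `budget` nonzero alphas.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)           # full kernel matrix (fine for a sketch)
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(n)
    f = np.zeros(n)                       # f = Q @ alpha, maintained incrementally
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Coordinate-wise Newton step on the dual, clipped to the box [0, C].
            g = 1.0 - f[i]                # partial derivative of the dual w.r.t. alpha_i
            new_ai = np.clip(alpha[i] + g / Q[i, i], 0.0, C)
            delta = new_ai - alpha[i]
            if delta != 0.0:
                alpha[i] = new_ai
                f += delta * Q[:, i]      # keep f = Q @ alpha consistent
            # Budget maintenance: if too many support vectors, drop the one with
            # the smallest coefficient (a simple removal heuristic, not the
            # merging/projection strategies discussed in the budget SVM literature).
            sv = np.flatnonzero(alpha)
            if sv.size > budget:
                j = sv[np.argmin(alpha[sv])]
                f -= alpha[j] * Q[:, j]
                alpha[j] = 0.0
    return alpha
```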