Recently, Nystr\"{o}m method has proved its prominence empirically and theoretically in speeding up the training of kernel machines while retaining satisfactory performances and accuracy. So far, there are several different approaches proposed to exploit Nystr\"{o}m method in scaling up kernel machines. However, there is no comparative study over these approaches, and they were individually analyzed for specific types of kernel machines. Therefore, it remains a question that the philosophy of which approach is more promising when it extends to other kernel machines. In this work, motivated by the column inclusion property of Gram matrices, we develop a new approach with a clear geometric interpretation for running Nystr\"{o}m-based kernel machines. We show that the other two well-studied approaches can be equivalently transformed to be our proposed one. Consequently, analysis established for the proposed approach also works for these two. Particularly, our proposed approach makes it possible to develop approximation errors in a general setting. Besides, our analysis also manifests the relations among the aforementioned two approaches and another naive one. First, the analytical forms of the corresponding approximate solutions are only at odds with one term. Second, the naive approach can be implemented efficiently by sharing the same training procedure with others. These analytical results lead to the conjecture that the naive approach can provide more accurate approximate solutions than the other two sophisticated approaches. Since our analysis also offers ways for computing the accuracy of these approximate solutions, we run experiments with classification tasks to confirm our conjecture.