Asymmetric kernels arise naturally in practice, e.g., from conditional probabilities and directed graphs. However, most existing kernel-based learning methods require kernels to be symmetric, which precludes the use of asymmetric kernels. This paper addresses asymmetric kernel-based learning in the framework of the least squares support vector machine, named AsK-LS, yielding the first classification method that can utilize asymmetric kernels directly. We show that AsK-LS can learn with asymmetric features, namely source and target features, while the kernel trick remains applicable, i.e., the source and target features exist but need not be known explicitly. Moreover, the computational cost of AsK-LS is as low as that of dealing with symmetric kernels. Experimental results on the Corel database, directed graphs, and the UCI database show that, when asymmetric information is crucial, the proposed AsK-LS can learn with asymmetric kernels and performs much better than existing kernel methods that must symmetrize the kernel to accommodate asymmetry.
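To make the symmetry issue concrete, here is a minimal sketch (not the kernel or method from the paper) of an asymmetric similarity function: an RBF whose bandwidth depends only on its first (source) argument, so K(x, z) ≠ K(z, x) in general. It also shows the common symmetrization (K + Kᵀ)/2 that existing symmetric-kernel methods would apply, which discards the directional information.

```python
import numpy as np

def asym_kernel(x, z):
    # Illustrative asymmetric kernel: the bandwidth is tied to the
    # source point x, so swapping arguments changes the value.
    # (A hypothetical choice for demonstration, not AsK-LS itself.)
    s = 1.0 + np.linalg.norm(x)
    return np.exp(-np.linalg.norm(x - z) ** 2 / (2 * s ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
K = np.array([[asym_kernel(x, z) for z in X] for x in X])

# The Gram matrix is asymmetric: direction (source -> target) matters.
assert not np.allclose(K, K.T)

# Symmetrization makes it usable by standard kernel machines,
# but the source/target distinction is lost.
K_sym = 0.5 * (K + K.T)
assert np.allclose(K_sym, K_sym.T)
```

Under this view, rows of K index source features and columns index target features; AsK-LS is described as working with K directly rather than with K_sym.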