This paper presents novel generalization bounds for the multi-kernel learning problem. Motivated by applications in sensor networks, we assume that the dataset is mixed, with each sample drawn from a finite pool of Markov chains. Our bounds for learning kernels admit an $O(\sqrt{\log m})$ dependency on the number of base kernels and an $O(1/\sqrt{n})$ dependency on the number of training samples. However, compared with existing generalization bounds for multi-kernel learning with i.i.d. datasets, additional $O(1/\sqrt{n})$ terms appear to compensate for the dependency among samples.