Vector-valued learning, where the output space admits a vector-valued structure, is an important problem covering a broad family of domains, e.g., multi-label learning and multi-class classification. Using local Rademacher complexity and unlabeled data, we derive novel data-dependent excess risk bounds for learning vector-valued functions in both kernel and linear spaces. The derived bounds are much sharper than existing ones: convergence rates are improved from $\mathcal{O}(1/\sqrt{n})$ to $\mathcal{O}(1/\sqrt{n+u})$, and to $\mathcal{O}(1/n)$ in special cases. Motivated by our theoretical analysis, we propose a unified framework for learning vector-valued functions that incorporates both local Rademacher complexity and Laplacian regularization. Empirical results on a wide range of benchmark datasets show that the proposed algorithm significantly outperforms baseline methods, which coincides with our theoretical findings.
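To illustrate how Laplacian regularization lets unlabeled data enter a vector-valued learner, the following is a minimal sketch (not the paper's exact algorithm) of semi-supervised linear multi-output regression: a graph Laplacian built over both labeled and unlabeled inputs penalizes predictions that vary sharply between similar points, alongside a standard ridge penalty. The synthetic data, the Gaussian-similarity graph, and the hyperparameter values are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n labeled and u unlabeled points, d features, k outputs.
n, u, d, k = 40, 60, 5, 3
X_l = rng.standard_normal((n, d))          # labeled inputs
X_u = rng.standard_normal((u, d))          # unlabeled inputs
Y = rng.standard_normal((n, k))            # vector-valued targets
X = np.vstack([X_l, X_u])                  # all n + u inputs

# Graph Laplacian L = D - S from a Gaussian similarity matrix S
# (bandwidth set to the median squared distance, an arbitrary choice).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
S = np.exp(-sq / (2 * np.median(sq)))
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(axis=1)) - S

lam, gam = 1e-2, 1e-2                      # illustrative hyperparameters

# Closed-form minimizer of
#   ||X_l W - Y||_F^2 + lam * ||W||_F^2 + gam * tr(W^T X^T L X W),
# where the Laplacian term uses all n + u points.
A = X_l.T @ X_l + lam * np.eye(d) + gam * (X.T @ L @ X)
W = np.linalg.solve(A, X_l.T @ Y)
Y_hat = X @ W                              # predictions on labeled + unlabeled points
```

The Laplacian term is the only place the unlabeled inputs appear, which is why enlarging the unlabeled pool can tighten the effective sample size in bounds of the form $\mathcal{O}(1/\sqrt{n+u})$.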