We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional outputs. We derive learning bounds for our method, and study the settings in which its statistical performance improves on that of the full-rank method. Our analysis extends the interest of reduced-rank regression beyond the standard low-rank setting to more general output regularity assumptions. We illustrate our theoretical insights on synthetic least-squares problems. Then, we propose a surrogate structured prediction method derived from this reduced-rank method. We assess its benefits on three different problems: image reconstruction, multi-label classification, and metabolite identification.
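As a point of reference for the abstract, the sketch below shows the classical finite-dimensional variant of reduced-rank least-squares regression (ridge-regularized OLS followed by projection of the fitted outputs onto their top singular directions) on a synthetic low-rank problem. This is not the paper's infinite-dimensional or surrogate structured prediction method; the function name, the regularization parameter `reg`, and the synthetic data setup are illustrative assumptions.

```python
import numpy as np


def reduced_rank_regression(X, Y, rank, reg=1e-6):
    """Classical reduced-rank least squares: fit ridge-regularized OLS,
    then project the fitted outputs onto their top `rank` principal directions."""
    d = X.shape[1]
    # Ridge-regularized OLS coefficients, shape (d, p)
    B_ols = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ Y)
    # Top-`rank` right singular vectors of the fitted outputs X @ B_ols
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:rank].T  # (p, rank)
    # Rank-constrained coefficient matrix
    return B_ols @ V_r @ V_r.T


# Synthetic least-squares problem with a low-rank ground-truth map
rng = np.random.default_rng(0)
n, d, p, true_rank = 200, 20, 30, 3
X = rng.standard_normal((n, d))
B_true = rng.standard_normal((d, true_rank)) @ rng.standard_normal((true_rank, p))
Y = X @ B_true + 0.1 * rng.standard_normal((n, p))

B_hat = reduced_rank_regression(X, Y, rank=3)
print("relative error:", np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))
```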