The theory of spectral filtering is a remarkable tool for understanding the statistical properties of learning with kernels. For least squares, it allows one to derive various regularization schemes that yield faster convergence rates for the excess risk than Tikhonov regularization. This is typically achieved by leveraging classical assumptions called source and capacity conditions, which characterize the difficulty of the learning task. To understand estimators derived from other loss functions, Marteau-Ferey et al. extended the theory of Tikhonov regularization to generalized self-concordant (GSC) loss functions, which include, e.g., the logistic loss. In this paper, we go a step further and show that fast and optimal rates can be achieved for GSC losses by using the iterated Tikhonov regularization scheme, which is intrinsically related to the proximal point method in optimization and overcomes the limitations of classical Tikhonov regularization.
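The iterated Tikhonov scheme referred to above can be sketched as follows; this is the standard textbook formulation, and the symbols $\hat f_t$, $\lambda$, and $\widehat{R}$ are illustrative notation, not taken from the paper:

```latex
\[
\hat f_0 = 0, \qquad
\hat f_t \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}}\;
\widehat{R}(f) \;+\; \lambda \,\lVert f - \hat f_{t-1} \rVert_{\mathcal{H}}^{2},
\qquad t = 1, \dots, T,
\]
where $\widehat{R}(f) = \tfrac{1}{n}\sum_{i=1}^{n} \ell\bigl(y_i, f(x_i)\bigr)$
is the empirical risk over a reproducing kernel Hilbert space $\mathcal{H}$.
```

Each step is exactly one proximal point iteration applied to $\widehat{R}$, which is the connection to optimization mentioned above. For the square loss, $T$ such steps correspond to a spectral filter whose qualification grows with $T$, whereas ordinary Tikhonov ($T = 1$) saturates at a fixed qualification; this is the limitation that iterating is known to overcome in the least-squares setting.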