Iterative regularization is a classic idea in regularization theory that has recently become popular in machine learning. On the one hand, it makes it possible to design efficient algorithms that control numerical and statistical accuracy simultaneously. On the other hand, it helps shed light on the learning curves observed while training neural networks. In this paper, we focus on iterative regularization in the context of classification. After contrasting this setting with those of regression and inverse problems, we develop an iterative regularization approach based on the hinge loss function. More precisely, we consider a diagonal approach for a family of algorithms for which we prove convergence as well as rates of convergence. Our approach compares favorably with other alternatives, as confirmed by numerical simulations.
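The core mechanism can be illustrated with a minimal sketch, which is not the paper's diagonal algorithm: early-stopped subgradient descent on the hinge loss for a linear classifier, where the iteration count plays the role of the regularization parameter. All function names, the step size, and the toy data below are illustrative assumptions.

```python
import numpy as np

def hinge_subgrad_descent(X, y, n_iter, step=0.1):
    """Run n_iter subgradient steps on the average hinge loss.

    Early stopping (choosing n_iter) acts as implicit regularization:
    few iterations give a heavily regularized solution, many iterations
    an unregularized one.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # Subgradient of the mean hinge loss: average of -y_i x_i over
        # the examples whose margin is below 1.
        active = margins < 1
        g = -(y[active, None] * X[active]).sum(axis=0) / n
        w -= step * g
    return w

# Toy, roughly linearly separable two-class data (for illustration only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2.0, 1.0, (50, 2)), rng.normal(-2.0, 1.0, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])

w = hinge_subgrad_descent(X, y, n_iter=200)
acc = np.mean(np.sign(X @ w) == y)
```

In this sketch the statistical accuracy is controlled by when the iteration is stopped, while the numerical cost is exactly the number of subgradient steps, mirroring the trade-off the abstract refers to.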