This article presents a different approach to the theory of regularized learning for generalized data, including representer theorems and convergence theorems. The generalized data consist of linear functionals and real scalars that represent the discrete information of local models. Extending classical machine learning, the empirical risks are computed from the generalized data and the loss functions. Following regularization techniques, the global solutions are approximated by minimizing the regularized empirical risks over Banach spaces. The Banach spaces are chosen adaptively to endow the generalized input data with compactness, so that the existence and convergence of the approximate solutions are guaranteed in the weak* topology.
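To make the abstract's setup concrete, the following is a minimal sketch (not taken from the paper) of regularized empirical risk minimization with generalized data pairs (λᵢ, yᵢ). The choices here are all illustrative assumptions: the linear functionals λᵢ are point evaluations λᵢ(f) = f(xᵢ), the hypothesis space is a finite polynomial basis rather than an infinite-dimensional Banach space, the loss is squared error, and the regularizer is a Tikhonov penalty.

```python
import numpy as np

# Illustrative assumption: generalized data (lambda_i, y_i) with
# point-evaluation functionals lambda_i(f) = f(x_i) on noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=30)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(30)

# Finite-dimensional hypothesis space: polynomials of a fixed degree.
degree, lam = 8, 1e-3
Phi = np.vander(x, degree + 1)  # row i = lambda_i applied to each basis function
n = len(y)

# Minimize the regularized empirical risk
#   (1/n) * sum_i (lambda_i(f) - y_i)^2 + lam * ||c||^2
# over coefficient vectors c; the minimizer solves the normal equations.
c = np.linalg.solve(Phi.T @ Phi / n + lam * np.eye(degree + 1), Phi.T @ y / n)

# The approximate global solution as a callable function.
def f(t):
    return np.vander(np.atleast_1d(t), degree + 1) @ c

print(float(f(0.5)))
```

The regularization parameter `lam` plays the role the abstract assigns to the penalty term: it trades data fit against the norm of the solution, which is what yields the compactness needed for existence and convergence arguments.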