In this article we propose a new general mathematical framework, based on a non-probabilistic supervised learning approach, for solving non-linear convex variational problems on reflexive Banach spaces. The variational problems covered by this methodology include the standard ones. The training sets considered in this work, called radial dictionaries, include, among others, tensors in Tucker format with bounded rank and neural networks with fixed architecture and bounded parameters. The training set is used to construct, via an iterative algorithm defined by a multivalued map, a sequence called progressive learning by dictionary optimization. We prove that this sequence converges to the solution of the variational problem. Furthermore, we obtain the same rate of convergence as the Method of Steepest Descent implemented in a reflexive Banach space, $O(m^{-1}).$