We study the convergence of random iterative sequences generated by a family of operators on infinite-dimensional Hilbert spaces, inspired by the Stochastic Gradient Descent (SGD) algorithm in the case of noiseless regression, as studied in [1]. We show that the polynomial convergence rate depends on the initial state, while the randomness affects only the choice of the best constant factor, and we close the gap between the upper and lower bounds.
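For orientation, the following is a hedged sketch of how such a random operator iteration arises from SGD for noiseless least-squares regression on a Hilbert space H; the data model (x_n, y_n), the step sizes \gamma_n, and the rank-one operators below are illustrative assumptions and need not match the exact family studied in [1]. With noiseless responses y_n = \langle w_*, x_n \rangle, the SGD update
\[
  w_{n+1} = w_n - \gamma_n \bigl( \langle w_n, x_n \rangle - y_n \bigr) x_n
\]
yields the error recursion
\[
  w_{n+1} - w_* = \bigl( I - \gamma_n \, x_n \otimes x_n \bigr) (w_n - w_*),
\]
so the error at step n is obtained by applying a random product of the operators I - \gamma_k \, x_k \otimes x_k, k < n, to the initial error w_0 - w_*.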