We consider least squares estimation in a general nonparametric regression model. The rate of convergence of the least squares estimator (LSE) of the unknown regression function is well studied when the errors are sub-Gaussian. We derive upper bounds on the rate of convergence of the LSE when the errors have uniformly bounded conditional variance and only finitely many moments. We show that the interplay between the moment assumptions on the errors, the metric entropy of the function class involved, and the "local" structure of the function class around the truth drives the rate of convergence of the LSE. We give sufficient conditions on the errors under which the LSE attains the same rate as under sub-Gaussian errors. Our results are finite-sample and allow for heteroscedastic and heavy-tailed errors.
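For concreteness, here is a minimal sketch of the setting in standard notation; the symbols below ($f_0$, $\mathcal{F}$, $\xi_i$) are our notational assumptions rather than taken from the paper, and the display assumes amsmath:

% Nonparametric regression: observe (X_i, Y_i) with unknown regression
% function f_0 in a class \mathcal{F}; the errors \xi_i have
% E[\xi_i | X_i] = 0, uniformly bounded conditional variance, and only
% finitely many moments (no sub-Gaussian assumption).
\[
  Y_i = f_0(X_i) + \xi_i, \qquad i = 1, \dots, n,
\]
% The least squares estimator (LSE) minimizes the empirical squared
% error over the class \mathcal{F}:
\[
  \hat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}} \,
  \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2 .
\]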