The problem of linear prediction has been studied extensively for the past century under fairly general frameworks. Recent advances in the robust statistics literature allow us to analyze robust versions of classical linear models through the lens of Median of Means (MoM). Combining these approaches in a piecemeal way may lead to ad-hoc procedures, and the restricted theoretical conclusions that underpin each individual contribution may no longer be valid. To meet these challenges coherently, in this study we offer a unified robust framework that includes a broad variety of linear prediction problems on a Hilbert space, coupled with a generic class of loss functions. Notably, we require no assumptions on the distribution of the outlying data points ($\mathcal{O}$), nor compactness of the support of the inlying ones ($\mathcal{I}$). Under mild conditions on the dual norm, we show that for misspecification level $\epsilon$, these estimators achieve an error rate of $O(\max\left\{|\mathcal{O}|^{1/2}n^{-1/2}, |\mathcal{I}|^{1/2}n^{-1} \right\}+\epsilon)$, matching the best-known rates in the literature. This rate is slightly slower than the classical rate of $O(n^{-1/2})$, indicating that we must pay a price in terms of error rates to obtain robust estimates. Additionally, we show that this rate can be improved to achieve so-called ``fast rates'' under additional assumptions.
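To make the MoM idea concrete, the following is a minimal sketch (not the paper's estimator, which applies MoM to linear prediction on a Hilbert space) of the basic Median-of-Means estimator for a scalar mean: partition the sample into blocks, average within each block, and take the median of the block means. A minority of corrupted blocks cannot move the median far, which is the mechanism behind robustness to outliers in $\mathcal{O}$.

```python
import numpy as np

def median_of_means(x, n_blocks, seed=0):
    """Median-of-Means estimate of the mean of x.

    Randomly permutes the sample, splits it into n_blocks roughly
    equal blocks, averages each block, and returns the median of
    the block means. Robust as long as fewer than half the blocks
    contain outliers.
    """
    x = np.asarray(x, dtype=float)
    perm = np.random.default_rng(seed).permutation(len(x))
    blocks = np.array_split(x[perm], n_blocks)
    return float(np.median([b.mean() for b in blocks]))
```

For example, with a few gross outliers injected into a Gaussian sample, the empirical mean is dragged far from zero while the MoM estimate stays close, since at most as many blocks as there are outliers can be corrupted.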