It is a common phenomenon that for high-dimensional and nonparametric statistical models, rate-optimal estimators balance squared bias and variance. Although this balancing is widely observed, little is known about whether methods exist that could avoid the trade-off between bias and variance. We propose a general strategy to obtain lower bounds on the variance of any estimator whose bias is smaller than a prespecified bound. This shows to what extent the bias-variance trade-off is unavoidable and makes it possible to quantify the loss of performance for methods that do not obey it. The approach is based on a number of abstract lower bounds for the variance that involve the change of expectation with respect to different probability measures as well as information measures such as the Kullback-Leibler or $\chi^2$-divergence. In the second part of the article, the abstract lower bounds are applied to several statistical models, including the Gaussian white noise model, a boundary estimation problem, the Gaussian sequence model and the high-dimensional linear regression model. For these specific statistical applications, different types of bias-variance trade-offs occur, and they vary considerably in strength. For the trade-off between integrated squared bias and integrated variance in the Gaussian white noise model, we propose to combine the general lower-bound strategy with a reduction technique. This allows us to reduce the original problem to a lower bound on the bias-variance trade-off for estimators with additional symmetry properties in a simpler statistical model. In the Gaussian sequence model, the bias-variance trade-off exhibits different phase transitions. Although there is a non-trivial interplay between bias and variance, the rates of the squared bias and the variance do not have to be balanced in order to achieve the minimax estimation rate.
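To convey the flavor of these abstract bounds, here is a minimal sketch (an illustration under simplifying assumptions, not the article's exact statement): let $P$ and $Q$ be probability measures with $Q \ll P$ and $\chi^2(Q,P) = \mathbb{E}_P[(\frac{dQ}{dP}-1)^2] < \infty$. Since $\mathbb{E}_P[\frac{dQ}{dP}-1] = 0$, centering and the Cauchy-Schwarz inequality give, for any estimator $\hat\theta$,
\[
\big|\mathbb{E}_Q[\hat\theta] - \mathbb{E}_P[\hat\theta]\big|
= \Big|\mathbb{E}_P\Big[\big(\hat\theta - \mathbb{E}_P[\hat\theta]\big)\Big(\frac{dQ}{dP} - 1\Big)\Big]\Big|
\le \sqrt{\operatorname{Var}_P(\hat\theta)\,\chi^2(Q,P)}.
\]
If $\hat\theta$ estimates a functional $\theta(\cdot)$ with bias at most $B$ in absolute value under both $P$ and $Q$, the left-hand side is at least $|\theta(Q)-\theta(P)| - 2B$, so
\[
\operatorname{Var}_P(\hat\theta) \;\ge\; \frac{\big(|\theta(Q)-\theta(P)| - 2B\big)_+^2}{\chi^2(Q,P)},
\]
and the smaller the admissible bias $B$, the larger the variance is forced to be.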