In high-dimensional regression, where the number of covariates is of the same order as the number of observations, ridge penalization is often used as a remedy against overfitting. Unfortunately, for correlated covariates such regularisation in generalized linear models typically induces not only shrinkage of the estimated parameter vector, but also an unwanted \emph{rotation} relative to the true vector. We show analytically how this problem can be removed by using a generalization of ridge penalization, and we analyse the asymptotic properties of the corresponding estimators in the high-dimensional regime using the cavity method. Our results also provide a quantitative rationale for tuning the parameter that controls the amount of shrinkage. We compare our theoretical predictions with simulated data and find excellent agreement.