Envelope methods perform dimension reduction of predictors or responses in multivariate regression, exploiting the relationship between them to improve estimation efficiency. While most research on envelopes has focused on their estimation properties, certain envelope estimators have been shown to excel at prediction in both low and high dimensions. In this paper, we propose to further improve prediction through envelope-guided regularization (EgReg), a novel method which uses envelope-derived information to guide shrinkage along the principal components (PCs) of the predictor matrix. We situate EgReg among other PC-based regression methods and envelope methods to motivate its development. We show that EgReg delivers lower prediction risk than a closely related non-shrinkage envelope estimator when the number of predictors $p$ and observations $n$ are fixed and in any alignment. In an asymptotic regime where the true intrinsic dimension of the predictors and $n$ diverge proportionally, we find that the limiting prediction risk of the non-shrinkage envelope estimator exhibits a double descent phenomenon and is consistently larger than the limiting risk for EgReg. We compare the prediction performance of EgReg with envelope methods and other PC-based prediction methods in simulations and a real data example, observing generally improved prediction performance relative to these alternatives.
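To fix ideas, the notion of "shrinkage along the principal components of the predictor matrix" can be illustrated with a minimal sketch. The example below applies a ridge-type shrinkage factor $d_j^2/(d_j^2 + \lambda)$ to each PC direction of a centered predictor matrix; this is a generic PC-shrinkage construction for illustration only, not the EgReg estimator itself, whose shrinkage weights are envelope-derived and are not reproduced here. All variable names and the synthetic data are assumptions of this sketch.

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Center, then take the thin SVD of the predictor matrix: Xc = U diag(d) Vt.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

# Shrink each PC direction by a ridge-type factor d_j^2 / (d_j^2 + lam).
# EgReg would instead use envelope-guided weights along these directions.
lam = 1.0
shrink = d**2 / (d**2 + lam)
beta_hat = Vt.T @ ((shrink / d) * (U.T @ yc))
```

As `lam -> 0` the shrinkage factors tend to one and the estimator recovers the least-squares fit on the centered data; large `lam` damps the low-variance PC directions most heavily.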