Consider $\ell_{\alpha}$-regularized linear regression, also termed Bridge regression. For $\alpha\in (0,1)$, Bridge regression enjoys several statistical properties of interest, such as sparsity and near-unbiasedness of the estimates (Fan and Li, 2001). The main difficulty, however, lies in the non-convexity of the penalty for these values of $\alpha$, which makes optimization challenging: typically only a local optimum can be found. To address this issue, Polson et al. (2013) took a sampling-based, fully Bayesian approach, exploiting the correspondence between the Bridge penalty and a power exponential prior on the regression coefficients. Their sampling procedure, however, relies on Markov chain Monte Carlo (MCMC) techniques, which are inherently sequential and do not scale to large problem dimensions; cross-validation approaches are similarly computation-intensive. Our contribution is a novel \emph{non-iterative} method for fitting a Bridge regression model. Its core is an explicit formula for Stein's unbiased estimate of the out-of-sample prediction risk of Bridge regression, which can then be optimized to select the tuning parameters, allowing us to bypass MCMC entirely as well as computation-intensive cross-validation. Our procedure yields results in a fraction of the computational time of iterative schemes, without any appreciable loss in statistical performance. An R implementation is publicly available online at: https://github.com/loriaJ/Sure-tuned_BridgeRegression .
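To fix notation, the objects involved can be sketched in their generic forms (these are the standard definitions, not the paper's specific derivation, which is developed later). The Bridge estimator solves
\[
\hat{\beta}_{\lambda,\alpha} \in \operatorname*{arg\,min}_{\beta\in\mathbb{R}^p} \; \tfrac{1}{2}\,\lVert y - X\beta\rVert_2^2 \;+\; \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert^{\alpha},
\]
and, for an almost-differentiable fit $\hat{y}(y)$ under the Gaussian model $y \sim \mathcal{N}(X\beta, \sigma^2 I_n)$, Stein's identity gives the generic unbiased risk estimate
\[
\widehat{\mathrm{SURE}}(\lambda,\alpha) \;=\; \lVert y - \hat{y}\rVert_2^2 \;-\; n\sigma^2 \;+\; 2\sigma^2 \sum_{i=1}^{n} \frac{\partial \hat{y}_i}{\partial y_i},
\]
which satisfies $\mathbb{E}\,\widehat{\mathrm{SURE}}(\lambda,\alpha) = \mathbb{E}\,\lVert X\beta - \hat{y}\rVert_2^2$. Minimizing $\widehat{\mathrm{SURE}}$ over $(\lambda,\alpha)$ then selects the tuning parameters without resampling.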