Composite quantile regression (CQR) was introduced by Zou and Yuan [Ann. Statist. 36 (2008) 1108--1126] as a robust regression method for linear models with heavy-tailed errors that simultaneously achieves high efficiency. Its penalized counterpart for high-dimensional sparse models was recently studied in Gu and Zou [IEEE Trans. Inf. Theory 66 (2020) 7132--7154], along with a specialized optimization algorithm based on the alternating direction method of multipliers (ADMM). Compared with the various first-order algorithms for penalized least squares, ADMM-based algorithms are not well suited to large-scale problems. To overcome this computational hurdle, in this paper we apply a convolution smoothing technique to CQR, complemented with iteratively reweighted $\ell_1$-regularization. The smoothed composite loss function is convex, twice continuously differentiable, and locally strongly convex with high probability. We propose a gradient-based algorithm for penalized smoothed CQR via a variant of the majorization-minimization principle, which gains substantial computational efficiency over ADMM. Theoretically, we show that the iteratively reweighted $\ell_1$-penalized smoothed CQR estimator achieves a near-minimax optimal convergence rate under heavy-tailed errors without any moment constraint, and further achieves a near-oracle convergence rate under a weaker minimum signal strength condition than needed in Gu and Zou (2020). Numerical studies demonstrate that the proposed method exhibits significant computational advantages without compromising statistical performance, compared to two state-of-the-art methods that achieve robustness and high efficiency simultaneously.
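To illustrate the smoothing step described above, the following is a minimal sketch (not the paper's implementation) of a Gaussian-kernel convolution-smoothed composite quantile loss and its gradient. The closed form used below, $\ell_h(u) = u\,(\tau - \Phi(-u/h)) + h\,\phi(u/h)$, is the convolution of the check loss $\rho_\tau(u) = u(\tau - \mathbf{1}\{u<0\})$ with a Gaussian kernel of bandwidth $h$; the function names, the equally weighted average over quantile levels, and the fixed intercepts are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm


def smoothed_cqr_loss(beta, intercepts, X, y, taus, h):
    """Convolution-smoothed composite quantile loss (Gaussian kernel).

    Sketch only: averages the smoothed check loss over the quantile
    levels `taus`, each with its own intercept. The closed form
    u*(tau - Phi(-u/h)) + h*phi(u/h) comes from convolving the check
    loss with a N(0, h^2) density, so the objective is smooth in beta.
    """
    total = 0.0
    for tau, b in zip(taus, intercepts):
        u = y - b - X @ beta  # residuals at quantile level tau
        total += np.mean(u * (tau - norm.cdf(-u / h)) + h * norm.pdf(u / h))
    return total / len(taus)


def smoothed_cqr_grad(beta, intercepts, X, y, taus, h):
    """Gradient w.r.t. beta.

    Each smoothed check loss has derivative tau - Phi(-u/h), which is
    itself differentiable -- the twice continuous differentiability
    that makes gradient-based algorithms applicable.
    """
    n, p = X.shape
    g = np.zeros(p)
    for tau, b in zip(taus, intercepts):
        u = y - b - X @ beta
        g += -X.T @ (tau - norm.cdf(-u / h)) / n
    return g / len(taus)
```

Because the check loss is convex, Jensen's inequality implies the smoothed loss dominates the unsmoothed one pointwise, and as $h \to 0$ the two coincide; in the full method this objective would be combined with an iteratively reweighted $\ell_1$ penalty.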