Quantile regression is a powerful tool for learning the relationship between a response variable and a multivariate predictor while exploring heterogeneous effects. In this paper, we consider statistical inference for quantile regression with large-scale data in the "increasing dimension" regime. We provide a comprehensive and in-depth analysis of a convolution-type smoothing approach that achieves adequate approximation to computation and inference for quantile regression. This method, which we refer to as {\it{conquer}}, turns the non-differentiable quantile loss function into a twice-differentiable, convex, and locally strongly convex surrogate, which admits a fast and scalable Barzilai-Borwein gradient-based algorithm for optimization and a multiplier bootstrap for statistical inference. Theoretically, we establish explicit non-asymptotic bounds on both the estimation and Bahadur-Kiefer linearization errors, from which we show that the asymptotic normality of the conquer estimator holds under a weaker requirement on the number of regressors than needed for conventional quantile regression. Moreover, we prove the validity of the multiplier bootstrap confidence constructions. Our numerical studies confirm the conquer estimator as a practical and reliable approach to large-scale inference for quantile regression. Software implementing the methodology is available in the \texttt{R} package \texttt{conquer}.