In this paper, we develop a novel high-dimensional coefficient estimation procedure based on high-frequency data. Unlike usual high-dimensional regression procedures such as the LASSO, we additionally handle the heavy-tailedness of high-frequency observations as well as the time variation of the coefficient processes. Specifically, we employ the Huber loss and a truncation scheme to handle heavy-tailed observations, while $\ell_{1}$-regularization is adopted to overcome the curse of dimensionality under a sparse coefficient structure. To account for the time-varying coefficients, we first estimate local high-dimensional coefficients, which are biased due to the $\ell_{1}$-regularization. Thus, when estimating the integrated coefficients, we propose a debiasing scheme to enjoy the law of large numbers property and employ a thresholding scheme to further accommodate the sparsity of the coefficients. We call this the Robust thrEsholding Debiased LASSO (RED-LASSO) estimator. We show that the RED-LASSO estimator can achieve a near-optimal convergence rate with only a finite $\gamma$th moment for any $\gamma>2$. In the empirical study, we apply the RED-LASSO procedure to high-dimensional integrated coefficient estimation using high-frequency trading data.
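For concreteness, a schematic form of the two-step procedure described above is given below; the notation ($I_{k}$, $\rho_{\tau}$, $\lambda$) is hypothetical, as the abstract does not fix the symbols, and the display is an illustrative sketch rather than the paper's exact definition. The local coefficient on the $k$th window may be written as
\[
\widehat{\boldsymbol{\beta}}_{k} \in \operatorname*{arg\,min}_{\boldsymbol{\beta}} \; \sum_{i \in I_{k}} \rho_{\tau}\!\left( \widehat{Y}_{i} - \widehat{\mathbf{X}}_{i}^{\top} \boldsymbol{\beta} \right) + \lambda \left\| \boldsymbol{\beta} \right\|_{1},
\]
where $\rho_{\tau}$ is the Huber loss with robustification level $\tau$, the observations $\widehat{Y}_{i}$ and $\widehat{\mathbf{X}}_{i}$ are truncated high-frequency increments over the $k$th local window $I_{k}$, and $\lambda$ is the $\ell_{1}$-regularization parameter. The integrated coefficient estimator is then obtained by debiasing each $\widehat{\boldsymbol{\beta}}_{k}$, aggregating the debiased local estimates over the windows, and applying an elementwise thresholding to recover sparsity.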